Curing GUI Phobias

For some time now I’ve been a happy openSUSE camper, though I still frequent the main Fedora IRC channel. Truth be told, it was tough to decide between those two distributions, as both are extremely solid and bug-free. My third choice would fall to one of the Ubuntu spins (Xubuntu or Lubuntu, most likely). Eventually, I realized I’m less and less inclined to put in the extra time to set up Arch Linux or Gentoo purely for self-indulgence. I know I could, but why should I? I’m familiar enough with Linux to roll any distribution. It seems my impressions go hand in hand with those of Roger from Dedoimedo (http://www.dedoimedo.com/computers/why-not-arch-linux.html). I’m sure he too would be able to install Arch Linux on any PC of his choosing, but again, why should he?

Many modern Linux distributions are solid, finished products. You can install them from a bootable USB stick or DVD on almost anything and just get things done. That’s precisely the point. It took years, buckets of sweat and units of blood to reach that state, but now we’re here! What’s more, we’ve beaten Windows, because Windows is still as much of a pain to install as it was a decade ago. Therefore, instead of repeating the rite of passage (read: installing Gentoo or Linux From Scratch on a laptop), we should move forward. There is a misplaced elitism in some of us in the Linux and Unix communities (mea maxima culpa!): if you don’t run a hardcore distribution on your shoddy PC, you’re not nerdy enough. Nothing could be further from the truth! Nerdy stuff can be done on virtually any of the mainstream distributions. You can set up servers running Ubuntu Server (duh!). You can build datacentre-grade boxes with openSUSE Leap. A file server on a Raspberry Pi? Yup. And so on. There is no need to spend hours hacking the Linux kernel to squeeze out that extra 0.000000001% performance gain, thinking that alone makes us Computer Wizards. The important thing to bear in mind about mainstream distributions is that someone poured hours of their free time into assembling the Linux kernel, utilities, a desktop environment, repositories, etc., all so that we wouldn’t have to do it ourselves. Isn’t that golden? Truly, we should build on that rather than shun it.

I understand this stands in stark contrast to my former preaching. I, like many others, escaped from Windows because it was overburdened with black-box utilities and hidden system services, and it was a pain to fix without bricking it entirely. However, it’s actually nice to have a pretty GUI and helper tools that simplify system maintenance. The main difference is that Linux-based operating systems are highly malleable, well documented and can be adjusted to our liking. I realized that’s probably the main reason I switched.

AI, Lisp and Why Languages Die

In my exploration of things arcane and mythical, I stumbled upon a somewhat forgotten book by Peter Norvig – Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp. My IT colleague was delighted to see it and highly recommended studying Common Lisp as a sort of meta-language. As programming languages interest me greatly and my understanding of functional programming is rather lacking (I loathe Python lambdas), I decided to give it a go. What I discovered was a language with an unusual syntax (every structure is written as a list, code and data alike), yet with real potential for writing useful tools efficiently. Then I learned that various Lisp dialects were heavily used during the computer boom of the 1960s–80s, when the US government poured billions of dollars into military and NASA projects (machine learning, AI algorithms, behavioral simulations, etc.). The trend died down in the early 1990s, and with it the Lisps either gave rise to modern languages like Clojure (also from the Lisp family) or simply disappeared. Of the old generation, Scheme and Common Lisp are still in use, though less and less by the day.

Artificial Intelligence has always been an extremely vital (and interesting) field of computer science. In fact, now more than ever, as the rapid growth of the Internet forces us to develop smart tools that sift through the wild abundance of information in real time. No wonder projects like Alexa (Amazon) or Cortana (Microsoft) are on the rise. In my opinion there are two crucial aspects of AI that garner the most interest – human language interfaces (how do we make programs understand us humans in all, or most, of our natural languages?) and intelligent filtering algorithms (how do we make programs increasingly aware of our human needs and able to remember them?). The second aspect involves machine learning, which delves into data extrapolation, approximation and the progressive nature of filtering algorithms. It all boils down to making computers more human and having them do some (most?) of our work for us. There are many quite realistic pitfalls, of course, like algorithms deciding that humans are the limiting factor in making our (human) lives easier. If we consider emotions a complete contradiction to reason, this makes perfect sense: unpredictable humans are the weakest link in an approach that relies on predictable values.

Going back to Lisp and its dialects: after its inception in 1959, it quickly became the language of choice for writing mathematical algorithms, especially in the field of AI. The S-expression syntax made Lisp code easy to read, and the language itself has a strong propensity for evolution. Even in more modern times (1990–2000) there are plenty of success stories about Lisp saving the day. Finally, Lisps pioneered crucial concepts like recursion in practice, concurrency and interactive programming (the famous REPL, or read-eval-print loop, nowadays a common feature of Haskell, Python and other languages). Taking all of this into consideration, it is quite difficult to understand why Common Lisp (the standardized Lisp effort) stopped being the hot stuff. Some of the sources I found mention that Lisps were pushed aside for political reasons. Budget cuts left a lot of NASA projects struggling for survival or facing a swift demise. Also, new cool languages (*cough* *cough* Perl) came along, and Lisps were supposedly too arcane to be picked up and used easily. To me, however, Common Lisp seems far less verbose (obfuscated?) than, say, Java, and far more orderly than said Perl. Its performance is also supposedly on par with Java, which might interest people who would like to write useful tools quickly (as quickly as in Python, for instance), yet avoid the memory management details of vanilla C or C++.

The truth is that no language is really dead until it becomes naturally obsolete, even if it suddenly loses enterprise backing. While Lisps have some viable descendants, one would be hard-pressed to find a language that directly supersedes them. There are of course multiple functional languages that share the Lisps’ strengths, yet they typically sport a vastly less approachable syntax, devoid of easily readable S-expressions. Therefore, I believe Scheme, Common Lisp and the other modern Lisps deserve not only attention, but also proper appreciation.

Skype for Linux Woes

First and foremost, I would like to thank Microsoft and contributors for considering our measly 1+% of desktop market share and beginning work on the Skype for Linux desktop app. Frankly, the previous Skype 4.2 and 4.3 releases felt like rotten meat scraps thrown at a dog (or penguin) and, for lack of security updates, were as dangerous to Microsoft as to Linux users. However, one should stay positive, as there is light in the form of Skype for Linux Beta. Overall, the app is quite polished and doesn’t have too many graphical glitches. Alas, there are still some things to consider:

  • Only .deb and .rpm packages are available. This strange practice is quite common and I still don’t understand why a tarball with the binary + libraries is not offered alongside. Many Linux distributions use other packaging formats. Are vendors afraid that some hacker might reverse engineer the binaries? On top of this, Debian .deb archives can be quite different from Ubuntu .deb archives, and the same goes for Fedora and openSUSE .rpm packages. Users on other distributions end up unpacking these archives by hand (see the sketch right after this list).
  • The website states “Video call your contacts.” What it doesn’t mention is that this option is not available just yet. We learn that the hard way when calling our Windows-using parents. No talking heads for now, unfortunately.
  • Close to no debugging capabilities. This one pains me the most. The app is clearly marked as a beta version, so one might assume that each user is in fact a beta tester. Any and all feedback would clearly be valuable to Microsoft. Yet the app prints no diagnostic information to the terminal by default, and there is no “something went wrong, please send us this pre-formatted report” feature either. I mean, really.
  • Bad software design is bad software design. I accidentally kept typing in the wrong account name, forgetting that the domain for users is @hotmail, not @microsoft. Obviously, the account didn’t exist, but instead of telling me this, the app would show a glitched error screen with a “Sorry… an unexpected error occurred”. This is so Windows 98, honestly. No traceback, no nothing. Also, the “Sorry” and a cryptic error reference ID don’t really help anyone. What’s up with this attitude?
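
Regarding the packaging gripe above, here is a minimal sketch of how one typically unpacks such vendor packages by hand on a distribution that uses neither .deb nor .rpm. The package file names are assumptions for illustration only:

# .rpm route: rpm2cpio turns the package payload into a cpio archive.
mkdir -p skype-unpacked
rpm2cpio skypeforlinux-64.rpm | ( cd skype-unpacked && cpio -idmv )

# .deb route: a .deb is an ar archive; the actual files live in data.tar.xz (or .gz).
ar x skypeforlinux-64.deb
tar -xf data.tar.* -C skype-unpacked

# The result is, more or less, the tarball the vendor never shipped.
tar -czf skypeforlinux.tar.gz -C skype-unpacked .

Of course, dependencies and post-install scripts are lost along the way, which is exactly why an official tarball would be so much nicer.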

They’re on the right track, but plenty of polishing is needed. Truth be told, no piece of software shines from the get-go. Uncut diamonds need cutting. That’s just the way the world works.


Fedora 26 – RTL8188EU the Hard Way!

Following my former entry preaching the greatness of Fedora 26, I decided to share some wisdom regarding USB wireless adapters (aka dongles) with the Realtek RTL8188EU chip. These and many other Realtek-based (usually RTL8188EU and RTL8192CU) adapters are affordable and extremely common. Companies like Hama, Digitus, TP-LINK and Belkin fit them into the cheapest 150N and 300N dongles, claiming that they’re compatible with Linux. In principle, they are. In practice, the kernel moves so fast that these companies have trouble keeping up with driver updates. As a result, poor-quality drivers linger in the staging kernel tree. Some Linux distributions like Debian and Ubuntu ship them, but Fedora doesn’t (for good reasons!), so Fedora users have to jump through quite a few hoops to get these dongles working…

The standard approach is to clone the git repository of the stand-alone RTL8188EU driver, compile it against our kernel + headers (provided by the Linux distribution of choice) and load it with modprobe if possible. Alas, since the stand-alone driver isn’t really in sync with the kernel, it often requires manual patching and is in general quite flaky. An alternative, more Fedorian approach is to build a custom kernel with the driver included. The rundown is covered by the Building a custom kernel article on the Fedora Wiki. All configuration options are listed in the various kernel-*.config files (standard kernel .config files prepped for Fedora), where “*” denotes the processor architecture. Fortunately, we don’t have to mess with the kernel .configs too much – we merely add the correct CONFIG_* lines to the “kernel-local” text file and fedpkg will fold them in prior to building the kernel. The lines I had to add for the RTL8188EU chip:

# ‘m’ means ‘build as module’, ‘y’ means ‘build into the kernel’
CONFIG_R8188EU=m
CONFIG_88EU_AP_MODE=y

These lines will of course differ depending on the Realtek chip in question; if a required option is missing, the build fails and points out which line in the kernel .config should have been enabled. Finally, if you do not intend to debug the kernel later on, make sure to build only the regular kernel (skipping the debug variant), as building both takes quite some time.
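
For completeness, below is a rough sketch of what the fedpkg route looks like in practice. It follows the Building a custom kernel wiki article from memory, so treat the branch name and output paths as assumptions to double-check rather than gospel:

sudo dnf install fedpkg fedora-packager rpmdevtools   # packaging toolchain
fedpkg clone -a kernel && cd kernel                   # anonymous clone of the Fedora kernel dist-git
fedpkg switch-branch f26                              # work on the Fedora 26 branch
sudo dnf builddep kernel.spec                         # pull in the kernel build dependencies

# Append the driver options; fedpkg merges kernel-local into the generated .configs.
printf 'CONFIG_R8188EU=m\nCONFIG_88EU_AP_MODE=y\n' >> kernel-local

fedpkg local                                          # build the RPMs locally (grab a coffee or three)
sudo dnf install ./x86_64/kernel-*.rpm                # install the freshly built kernel (path may vary)

The wiki also describes how to disable the debug kernel build, which is well worth doing on older hardware.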


Fedora 26 Beta – the Killer Distro

Lately, I have been distro-hopping way too much, effectively lowering the output of my Java bytecode. However, that’s over, at least for now. I jumped from Lubuntu to Fedora to openSUSE and back to Lubuntu, and, long story short, I ended up with all of my computers (but not my family…) converted to Fedora 26 Beta. One might think it’s too soon, since Fedora 25 is not end-of-life just yet. Too soon for the faint of heart maybe, but not for a geeky daredevil such as myself!

[Image: a cute Fedora doggie, courtesy of the Internet]

I tested Fedora 26 Beta as an upgrade from Fedora 25 on a legacy Dell Latitude E5500 (32-bit Intel Pentium M 575 + Intel GMA graphics), an aging and equally legacy MacBook from 2008–2009 (64-bit Intel Core 2 Duo + nVidia Geforce 9400M GT) and yet another fossil of a PC – the Fujitsu-Siemens Celsius R650 (Intel Xeon E-series + nVidia Geforce 295 GTX). Each installation used the Fedora 25 LXDE spin as its base to keep things comparable. No issues whatsoever, even though I rely heavily on the RPM Fusion repositories for nVidia and Broadcom drivers. This stands in stark contrast to every attempt at updating Lubuntu or any other Ubuntu spin I have tried thus far. My apologies in advance, but my personal experience with Ubuntu and its children is lacking on all fronts here – upgrading to a new release (even an LTS!) is like bracing for a tsunami. It will hit, hard. The dnf system-upgrade plugin, on the other hand, seems to have been perfected and is ready for shipping.

Fresh installations of Fedora 26 Beta with LXQt were done on two PCs – an ASUS Vivobook S301LA (Intel Core i3 + Intel HD 4600 graphics) and an HP-Compaq Z200 workstation (Intel Xeon E-series + nVidia Quadro FX 1800). This time I used the Workstation flavor netinstall disc image as the base. Again, only positive surprises here. All of the core Qt apps worked as intended. I was especially curious about Qupzilla, since it would often crash on other distributions (as would the webkit-gtk based Midori, in fact), yet I managed to write this entry without a single crash. I believe that is a testament not only to the various Fedora teams, but also to the Qupzilla, Qt and LXQt people who keep pushing forward with awe-inspiring zeal. Props, kudos and cheers!
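
For reference, the whole upgrade ritual boils down to the dnf system-upgrade plugin. A minimal sketch of the commands I mean (with 26 as the target release, naturally):

sudo dnf upgrade --refresh                         # bring the current release fully up to date
sudo dnf install dnf-plugin-system-upgrade         # the plugin that does the heavy lifting
sudo dnf system-upgrade download --releasever=26   # fetch all packages for the target release
sudo dnf system-upgrade reboot                     # reboot into the offline upgrade step

The download step resolves and fetches everything (RPM Fusion packages included) before the reboot, so problems tend to surface early rather than mid-upgrade.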

Fedora 26 Beta is a great sign that Linux can into space. The experience is bug-free, solid and developer-ready, so I can return to taxing the OpenJDK JVM with peace of mind. As a matter of fact, I am beginning to like Qt as a GUI framework, and I am considering contributing to the Fedora Project more ardently. They continuously provide me with great tools; I want to give something in return. We all take, so we should all give.