The (Necessary?) GNU/Linux Fragmentation

I would like to share with you a story of my recent struggles with Debian. They’re partially my fault, but also partially due to the way Debian handles network management, which is quite different from how other GNU/Linux distributions do it.

The story begins with me being happy with a regular desktop install, powered by XFCE4, but then wanting to switch to the less distracting Openbox. I installed Openbox plus extras like the tint2 panel, nitrogen (a background/wallpaper setter) and other lightweight alternatives to XFCE4 components. While I was sweeping up XFCE4 leftovers, “apt autoremove” took far too many packages with it, including network-manager. I was instantly left with no network connection and, as I learned later, no means of restoring it. By default, network management on Debian is handled by the ifupdown scripts, which bring up the interfaces listed in /etc/network/interfaces and hand them to dhclient for a DHCP lease or assign them a static IP address. As far as I could tell, the ifupdown utilities on their own have no means of directing wireless interfaces to wpa_supplicant for WPA-encrypted networks. Nowadays this is handled by network-manager, which “Just Works”: it uses wpa_supplicant for WPA encryption (among many other things), while performing the rest of network management itself. This is quite different from running wpa_supplicant directly, which simply failed in my case due to a known regression.
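
For reference, this is roughly what ifupdown expects in /etc/network/interfaces; the interface name and addresses below are only examples, and the wired case is the one it handles comfortably:

    # Loopback, present on every Debian install
    auto lo
    iface lo inet loopback

    # A wired interface configured via DHCP (dhclient is invoked behind the scenes)
    auto eth0
    iface eth0 inet dhcp

    # Or the same interface with a static address instead:
    # iface eth0 inet static
    #     address 192.168.1.10
    #     netmask 255.255.255.0
    #     gateway 192.168.1.1

A plain “ifup eth0” then brings the interface up according to whichever stanza is active.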

It’s quite sad to see that Debian, despite moving from init scripts to systemd for boot and service management, still insists on configuring network interfaces via shell scripts (the aforementioned ifupdown tools), while a mainstream solution in the form of network-manager is available! Why is it recommended as a “Just Works” alternative yet not offered by default? On Red Hat-based distributions (Fedora, CentOS, etc.) the matter is really simple: you get network-manager out-of-the-box and you’re good to go. That stands to reason, though, as NetworkManager is a Red Hat project. Still, the “Just Works” approach baffles (and even disturbs) me greatly. “Just Works” sounds like a catch-phrase typical of commercial operating systems. Since those target mainly desktop and laptop computers, it’s enough if an ethernet interface and/or a wireless interface “just work”. What about servers with multiple NICs, routers, gateways, NATs and VMs, each with its own IP address or set thereof? What then? Oh, right, we can write systemd units for each interface and control them via systemctl. Or use the ip utilities for the same purpose. Or the deprecated ifconfig, which we shouldn’t use, but still can, because it remains in the repositories of many distributions. Alternatively, we can perform DHCP requests via one of several clients, such as dhclient or dhcpcd. We end up with a hodgepodge of programs that are best left to their own devices due to incomplete documentation and/or unclear configuration, with each GNU/Linux distribution maintaining its own set of base utilities.
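
To give a flavour of the manual route (the interface name and addresses here are purely illustrative), configuring a single interface by hand with the ip utilities boils down to:

    # bring the link up
    ip link set dev eth0 up

    # assign an address and a default route by hand...
    ip addr add 192.168.1.10/24 dev eth0
    ip route add default via 192.168.1.1

    # ...or let a DHCP client sort both out instead
    dhclient eth0

None of this persists across reboots, of course, which is exactly where the hodgepodge of competing persistence mechanisms begins.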

Personally, I feel that’s where the BSDs succeed. You get a clearly separated base system with well-documented, easily configurable tools that are maintained as a whole. Network interface configuration borders on trivial. What’s more, the installer handles wireless connections almost seamlessly. Why is it so difficult on GNU/Linux? At this point, I believe the GNU/Linux community would profit greatly from agreeing on a common “base system”. Red Hat’s systemd is the first step towards the unification of the ecosystem. While I am strongly opposed to systemd, because it gives merely the illusion of improved efficacy by simplifying configuration and obfuscating details, GNU/Linux should be a bit stronger on common standards, at least for system-level utilities.
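
As a point of comparison, here is a minimal sketch of what network setup looks like in FreeBSD’s /etc/rc.conf; the device names (em0, ath0) depend on the driver and are only examples, with the WPA credentials living in /etc/wpa_supplicant.conf:

    # wired interface via DHCP
    ifconfig_em0="DHCP"

    # wireless: clone the ath0 device into wlan0, then let wpa_supplicant and DHCP take over
    wlans_ath0="wlan0"
    ifconfig_wlan0="WPA DHCP"

Everything lives in one well-documented file, handled by one rc system maintained alongside the rest of the base.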

GNU/Linux and Its Users

I decided to devote this entry to a reflection I made recently while participating in discussions in the Facebook Linux group. I realized, as many have before me, that there is a strong correlation between the maturity and complexity of an operating system and the savviness of its users. The BSDs and the more demanding GNU/Linux distributions like CRUX, Arch and Gentoo attract experienced computer users, while Ubuntu and its derivatives mostly entice beginners. As few people express the need to learn the Unix Way, beginner-oriented operating systems (this includes Windows and MacOS X, of course) are far more popular. Consequently, they garner stronger commercial support from hardware and software companies, as they constitute a market for new products.

The truth is, we have all been beginners once. What’s more, unless we’re old enough to remember the ancestral iterations of the original UNIX operating system (I’m not!), we were MacOS X or Windows users long before switching to a modern Unix-like operating system. Alas, as such we have been tainted with a closed-source mindset, which encourages us to take no responsibility for our computers and to solve system-level problems with mundane trial-and-error hackery. Not only is such a mindset counter-productive, it also hampers technological progress. Computers are becoming increasingly crucial in our everyday lives, and a certain degree of computer literacy and awareness is simply mandatory. Open-source technologies encourage a switch to a more modern mindset, one that entails information sharing, discussion and learning various computer skills in general. The sooner we accustom ourselves to this mindset, the faster we can move on.

The current problem in the GNU/Linux community (much less so in non-Linux Unix communities) is that the entry barrier is being continuously lowered so as to yield a speedier influx of users. Unfortunately, many of these users are complete beginners not only in terms of Unices, but also in terms of using computers in general. With them the closed-source mentality is carried over, and we, the more experienced users, have to deal with it. Some experienced users provide help, while others are annoyed by the constant nagging. The responsibility lies with us to educate newbies and encourage them to embrace the open-source mindset explained above. However, many of them don’t want that. They want the instant gratification they got from Windows or MacOS X, because someone convinced them that GNU/Linux can be a drop-in replacement for their former commercial OS. They want tutorial-grade, easy-to-follow answers to unclear, badly formulated questions. Better yet, they want them now, served on a silver platter. We all love helping newbies, but we shouldn’t encourage them to remain lazy. Otherwise, we’ll eventually share the fate of Windows or MacOS X as just another mainstream platform. I cannot speak for everyone, but I would personally prefer GNU/Linux to continue its evolution as a tech-aware platform of the future.

Software Design Be Like…

I recently stumbled upon this very accurate article from the always-fantastic Dedoimedo. I don’t agree with the notion that replacing legacy software with new software is done only because the old software is considered bad. Oftentimes we glorify software that was never written properly and has, over the years, accumulated a load of even uglier cruft as a means of patching core functionality. At some point a rewrite is a given. However, I do fully agree with the observation that a lot of software nowadays is written with an unhealthy focus on the developer experience rather than the user experience. Also, continuity in design should be treated as sacred.

One of the things that ticks me off (much as it did the Dedoimedo author) is when developers emphasize how easy it is for prospective developers to continue working on their software. Not how useful it is to the average Joe, but how time-efficient it is to add code. That’s nice, but it should not be emphasized so heavily. Among the many characteristics a piece of software may have, I appreciate usefulness the most. Even in video games, mind you! Anything that makes the user spend less time on repetitive tasks is worth its weight in lines of code (or something of that sort). Far too often, features that developers consider useful turn out not to be useful to regular users. Also, too many features are a bad thing. Build a core program with essential features only and mix in a scripting language so that users can add the features they require on a per-user basis. See: Vim, Emacs, Chimera, PyMOL, etc. Success guaranteed!

Another matter is software continuity. Backward compatibility should be considered mandatory. It should be the number one priority of any software project, as it’s the very reason people keep coming back to our software. FreeBSD is strong on backward compatibility, and that’s why it’s such a rock-solid operating system. New frameworks are good. New frameworks that break or remove important functionality are bad. The user is king and should always be able to perform their work without obstructions or having to re-learn something. Forcing users to re-learn every now and then is a *great* way of thinning out the user base. One of the top positions on my never-do list.

Finally, an important aspect of software design is good preparation. Do we want our software to be extensible? Do we want it to run fast and be multi-threaded? Certain features should be considered exhaustively from the very beginning of the project. Otherwise, we end up adding lots of unsafe glue code or hackish solutions that will break every time the core engine is updated. Also, never underestimate documentation. It’s not only for us, but also for the team leader and all of the future developers who *will* eventually work on our project. Properly describing I/O makes for an easier job for everyone!

Just Linux Things

As a follow-up to my previous post, I’ve noticed that a lot of recent open-source technologies are extremely Linux-centric. It all started around the time GNOME3 and systemd were introduced. Both rely heavily on facilities present in Linux-based operating systems, making them difficult to port to other Unix platforms like the BSDs. These technologies are also strongly promoted to entice prospective developers. While I understand the need for platform-centric efficiency inherently tied to Linux-specific features (cgroups, etc.), it is also important not to ignore the rest of the IT ecosphere. Yes, I mean even Windows, which is normally not considered on equal terms with Unix, but is relevant when talking about C# or .NET applications.

A recent example of this trend is Docker. Containers are now the new pink and everyone wants a piece of the pie. Docker has barely reached release 1.x, yet some companies already make claims about “widespread adoption”. I thought industries preferred stable, tested-for-years solutions; I find this new craze odd, to say the least. As expected, Docker is a Linux thing. While the containers are indeed OS-level on GNU/Linux systems, much like LXC (Linux Containers), they’re not on Windows, nor on MacOS X. Oddly enough, the latter uses xhyve, a Unix virtual machine manager based on FreeBSD’s bhyve. Therefore, even though the developer-facing interfaces are similar or even identical, the engines running underneath carry a substantial overhead on non-Linux systems. At that point one might ask whether native, more established solutions aren’t already available and better suited for multi-container setups. On FreeBSD we have Jails and a ton of Jail managers to make one’s life easier. I have a feeling that Jails on FreeBSD would do a lot better than Docker containers on GNU/Linux. Not to mention that the FreeBSD base system is a lot slimmer than whatever one could consider a base system in the GNU/Linux world. Any “widespread adoption” seems to owe more to the fact that most of the world’s servers already run GNU/Linux than to Docker itself.
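
To make the comparison concrete (the jail name, path and address below are made up for illustration), the day-to-day commands on the two systems are not far apart:

    # GNU/Linux: run a throwaway Alpine container with Docker
    docker run --rm -it alpine /bin/sh

    # FreeBSD: start a comparable jail with nothing but the base system's jail(8),
    # assuming a FreeBSD userland has already been extracted into /jails/demo
    jail -c name=demo path=/jails/demo host.hostname=demo.example.org \
        ip4.addr=192.168.1.50 command=/bin/sh

The difference is that on FreeBSD this is all base-system tooling, whereas Docker pulls in its own daemon, image format and, outside GNU/Linux, a virtual machine underneath.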

Another weird trend I’ve noticed is identifying everything Linux-related with Ubuntu, as if Ubuntu were the only True Linux Distribution. I often read articles that claim to be about Linux but in reality discuss Ubuntu. A square is a rectangle, but not every rectangle is a square! That’s obvious, no? This “Linux = Ubuntu” assumption hurts the whole ecosystem quite a lot. People learn how to use Ubuntu and think they’ve mastered Linux. Then they’re dropped into a den full of Gentoos and CentOSes, and they end up suffering. Ouch! One fallout of this worrying trend is that people deploy Ubuntu in places where a more lightweight GNU/Linux distribution would be far more suitable. That’s one of the reasons why Docker eventually switched from Ubuntu to Alpine Linux. With the wealth and diversity of the GNU/Linux ecosystem, one doesn’t even need to look far.