Game Design Series – Jurassic Park (NES)

To prepare myself for my future game developer career I decided to play through some gaming classics for various Nintendo (Game Boy, NES) and SEGA (Saturn, Dreamcast, Genesis) consoles and analyze them thoroughly. The truth is that many amazing gameplay elements were invented way back in the 1970s–90s and have rarely resurfaced since. It's a real shame, because frankly speaking they were groundbreaking. In my analyses I will try to focus on game difficulty, graphics, interesting gameplay aspects and the overall appeal of the game. First off is Jurassic Park for the Nintendo Entertainment System.

Official Title: Jurassic Park
Release Year: 1993
Developer: Ocean Software


Game main menu – super scary!

Synopsis
The game follows the plot of the Steven Spielberg movie and the techno-thriller by Michael Crichton (which has an entirely different feel from the movie). You play as Dr. Alan Grant and your task is to escape from the now-wild Jurassic Park located on Isla Nublar. Along the way you have to save Tim and Lex (John Hammond's grandchildren) from being eaten alive by the legendary predator T-Rex or trampled by a Triceratops stampede.


Alan Grant in front of the Park gate

Graphics
Jurassic Park features an isometric view produced by sprites drawn at an angle from various sides. Interestingly, the collision box of some sprites is defined only by the sprite's base, allowing the game's protagonist or his enemies to vanish behind obstacles. The color palette is crisp, though it consists primarily of gray, red and different shades of green. It definitely looks better than early NES games. Projectiles are animated and so are the various dinosaurs infesting the Park. The main menu screen, featuring a vicious-looking T-Rex shown head-on with dripping saliva, is worth an extra mention. Unfortunately, the impressive visuals would occasionally tax the NES hardware, causing graphical glitches and other oddities.
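
The base-of-the-sprite collision idea is easy to sketch. Here is a minimal, hypothetical Python example (my own illustration, not code from the game): the hit box is restricted to a thin strip at the sprite's base, so a character whose feet are above an obstacle's footprint does not collide with it and, when sprites are drawn in order of their base position, simply disappears behind it.

```python
from dataclasses import dataclass

@dataclass
class Sprite:
    x: int            # left edge in screen coordinates
    y: int            # bottom edge (the "feet"); y grows towards the bottom of the screen
    width: int
    height: int
    base_height: int  # only this strip above the bottom edge is treated as solid

    def base_box(self):
        # Collision box limited to the sprite's base (its footprint).
        return (self.x, self.y - self.base_height, self.width, self.base_height)


def boxes_overlap(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


# A tall tree whose trunk (the base strip) is the only solid part.
tree = Sprite(x=100, y=200, width=32, height=96, base_height=16)
# Grant standing "behind" the tree: his feet sit above the trunk strip,
# so no collision is reported; drawing sprites in order of their base y
# (170 before 200) then makes him vanish behind the tree graphic.
grant = Sprite(x=104, y=170, width=16, height=32, base_height=8)

print(boxes_overlap(grant.base_box(), tree.base_box()))  # False
```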

Gameplay
In order to successfully escape from the Park, Alan needs to complete various tasks, ranging from saving Tim and Lex to unlocking computer terminals. A major part of the game is collecting turquoise-gray dinosaur eggs in order to reveal key cards, and collecting different types of ammo to combat the vicious dinos. There are several species of dinosaurs, each with a different behavior pattern. Compsognathus individuals are small and easy to kill, as they always trot in a straight line towards Grant. Velociraptors are much faster and can actually outrun the player when charging. They also do much more damage on contact. Somewhat sadly, all of the dinos drop only basic ammunition (swamp green). Bolas rounds (red), penetrating rounds (gray) and upgraded rounds (green) need to be collected from the ground in designated spots. An interesting feature is the mystery boxes marked with a question mark: they provide extra lives and health packs, or contain deadly booby traps. What I appreciate the most is the fact that the game does not follow the standard "stage(s) + boss fight" pattern. In fact, there are only two real boss fights, both against the T-Rex. The gameplay is well balanced, with a mix of regular collection stages, boss fights, puzzles and dynamic rescue missions. In total there are six levels, each with a clear briefing screen explaining its tasks.
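
To make the behavior patterns concrete, here is a tiny Python sketch (again my own illustration, with made-up numbers rather than the game's actual values): both species run straight at the player every frame, but the raptor's higher speed and contact damage make it by far the more dangerous one.

```python
import math

# Hypothetical per-species tuning, loosely matching the behavior described above:
# compys are slow and fragile, raptors outrun the player and hit harder on contact.
SPECIES = {
    "compsognathus": {"speed": 0.8, "hp": 1, "contact_damage": 1},
    "velociraptor":  {"speed": 1.6, "hp": 3, "contact_damage": 3},
}


def chase_step(dino_pos, player_pos, speed):
    """Advance a dinosaur one frame straight towards the player."""
    dx, dy = player_pos[0] - dino_pos[0], player_pos[1] - dino_pos[1]
    dist = math.hypot(dx, dy) or 1.0   # avoid division by zero on contact
    return (dino_pos[0] + speed * dx / dist, dino_pos[1] + speed * dy / dist)


player = (10.0, 5.0)
compy = chase_step((0.0, 0.0), player, SPECIES["compsognathus"]["speed"])
raptor = chase_step((0.0, 0.0), player, SPECIES["velociraptor"]["speed"])
print(compy, raptor)  # the raptor closes the distance twice as fast per frame
```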

Difficulty
Jurassic Park is one of those NES games which seem hard at first, but become increasingly easy as the player memorizes enemy attack patterns, locations of health packs, and so on. In addition, it is not as overwhelming as, for instance, Castlevania or Ninja Gaiden. Jurassic Park is definitely a beatable title, though admittedly the T-Rex levels can be quite annoying.

Closing Remarks
While the core of the game (collecting eggs and shooting dinos) is fairly standard among NES titles, the addition of rescue missions and unusual boss fights feels refreshing. I believe that even platformers would benefit from such gameplay mix-ins. Actually, they're often fun regardless of the genre.


Games Then and Now

Inspired by the talks from Brenda and John Romero, I decided to write a short piece on the evolution of gaming. I will not focus on specific time periods, however, as the industry progressed through its subsequent phases quite fluidly. Rather, I will try to draw a comparison between then (1980–1990) and now (201x). I was born at the end of the 1980s, so I still managed to get to know the amazing Nintendo Entertainment System (NES) first-hand. This will be the starting point of our journey, though I will mention other consoles and gaming systems when relevant.

To begin with, the NES and the original Nintendo Game Boy were amazing systems. Such a variety and richness of games had never been seen before. I didn't own a Game Boy myself, because they were quite expensive, but some of my childhood friends had them, so I would often borrow one to play a bit. Back then it was perfectly natural for kids to meet in small groups and game in turns. My favorites were Donkey Kong Land and Super Mario Land. Both were quite difficult, but the enjoyment was enormous regardless! I did have a Famiclone (a clone of the Japanese Famicom), as these were extremely popular in East-Central Europe. Of course, the cartridges were also Famicom imitations, and the system itself (branded Pegasus) would never run any of the original NES games without a special converter. I had no idea about that when I was young, since it was easy to get games from local flea markets anyway. I remember playing Contra and Rescue Rangers 2 for hours on end until I could perfectly memorize the entire play-through.

Many of the games back then were platformers, beat-'em-ups, racing games or sports games. Regardless of the genre, twitch reflexes were a must! Also, most games didn't have password-based checkpoints, so once dead, the player had to start from the very beginning. The replay value lay in a game's difficulty and the necessity of mastering it to complete it and beat the final boss. From today's perspective this sounds terribly tedious, but the motivation behind making games was also different. They were supposed to bring fun and excitement in their purest form. Beating a game was intended as the supreme reward for mastering it, and honestly, it really felt rewarding back then. DOS games were slightly different due to the lack of a proper controller pad. They weren't as fast-paced as NES games, but in some of them you could actually save the game state. Regardless, they still posed a considerable challenge.


One of the final bosses in Teenage Mutant Ninja Turtles 2 (NES)

Game design is an interesting topic when it comes to the NES, DOS, the Nintendo Game Boy and other platforms from that era. Since games had to fit on a single cartridge or diskette (or multiple diskettes, of course), they could not store complete scenes or large amounts of data; rather, they shipped compact assets together with a set of procedures to draw pixels in the correct positions at the correct times. As a result, programmers had to implement various hacks to define object boundaries or increase the number of available colors. This caused graphical glitches when the bitmaps were too big, or allowed the player to abuse the shape of an object to his/her advantage. Also, forget tutorial levels, help menus, maps, etc. Some games came packaged with a manual or booklet which introduced the game world or explained basic gameplay aspects, but very rarely would a game provide any in-game help at all. The player had to explore the game to understand it fully and complete it.
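
To give a rough feel for those limits, here is a back-of-the-envelope Python sketch. The numbers are the usual 8-bit-era values (8x8-pixel tiles, a 32x30-tile screen, 2 bits per pixel) and are purely illustrative, not measurements of any particular game, but they show why a screen was described as a small grid of tile indices plus drawing routines rather than stored as a full bitmap.

```python
# Rough size comparison: describing one screen as a grid of tile indices
# versus storing it as a raw bitmap. Tile and screen dimensions are the
# usual 8-bit-era values; they are illustrative, not taken from any game.
SCREEN_TILES_W, SCREEN_TILES_H = 32, 30   # 256 x 240 pixels in 8x8 tiles
BYTES_PER_TILE = 16                       # 8x8 pixels at 2 bits per pixel

tile_map_bytes = SCREEN_TILES_W * SCREEN_TILES_H   # one byte per tile index
tile_set_bytes = 256 * BYTES_PER_TILE              # a full set of 256 unique tiles

full_bitmap_bytes = 256 * 240                      # even at just 1 byte per pixel

print(tile_map_bytes + tile_set_bytes)  # 5056 bytes, and the tile set is shared across screens
print(full_bitmap_bytes)                # 61440 bytes for a single screen
```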


A screenshot from Final Fantasy XV (PS4)

Fast-forward several decades and games look and feel entirely different. Firstly, they are a lot more graphically appealing and realistic, so we are no longer expected to use our imagination to complete the mental image of a character. Almost everything is WYSIWYG (What You See Is What You Get). That helps with immersion a lot! On the downside, gore and violence are a lot more explicit and traumatizing (think the Dead Space franchise). Game mechanics haven't changed much, since even nowadays every game has a "core" which defines its gameplay. However, because games are no longer limited by diskettes or computer memory, developers often mix genres and implement novel gameplay aspects which were unknown in the past. In addition, the player is often gradually introduced to the game world so that he or she is not overwhelmed by the game from the very beginning. Finally, there is a major shift towards developing games as franchises or series to generate sustainable revenue, rather than as one-off hits. This, of course, puts pressure on developers and emphasizes the use of pre-purchase bonuses and advertising to make sure the game sells.

The differences between games then and now don't mean that games used to be better or worse than they are today. The evolution of games merely reflects the growth of the industry. Nowadays, gaming is more approachable, so that everyone can enjoy it. To us veterans of the early Nintendo and SEGA consoles, modern games might seem boring or too easy, though that is only our perspective. In addition, when I recently returned to Castlevania and Teenage Mutant Ninja Turtles II (both on the NES) I realized how unnecessarily frustrating games used to be due to technical limitations. In the end, to each their own. Since I have a lot less time nowadays, I prefer casual games over the challenging monsters of the past. However, I did find Dark Souls enjoyable, to be perfectly honest.

How I migrate(d) to OpenSUSE and Why

I'm a die-hard FreeBSD fan. I simply love it! It rubs me the right (UNIX) way. Through trials and tribulations I managed to make it do things it was possibly not designed to do. ZFS? Amazeballs. Cool factor over 9000! However, all of that came at a tremendous cost in energy and time. I have reached a point where I don't want to spend time manually configuring everything and inventing ways to automate things which should work out-of-the-box. Furthermore, most FreeBSD tools are not compatible with other operating systems, so learning FreeBSD (or any other BSD variant, for that matter) locks me into FreeBSD. Despite many incompatibilities, this is not the case with Linux. On a side note, the ZFS on Linux project was a great idea. The Linux ecosystem badly needed a mature storage-oriented filesystem such as ZFS; BTRFS, to me at least, "is not there yet". Other tools, such as containers, were reinvented in so many different ways that Linux has outpaced FreeBSD many times over. Importantly, Linux tools have been tested in many more real-life scenarios and are in general more streamlined. For automation, this is crucial. Again, I don't want to tinker with virtually every tool I intend to use. Neither do I want to read pages and pages of technical documents to get a simple container running. What's more, I shouldn't be forced to, since that's terribly unproductive. Finally, I like to run the same operating system on most of my computers (be it i386, x86_64 or ARM), and FreeBSD support for many desktop and laptop subsystems is spotty at best…

Enter OpenSUSE!


Cute lizard stock photo. Courtesy of the Interweb.

Seemingly, OpenSUSE addresses all of the above issues. True, ZFS support is not reliable and there are no plans to change that; the problem, as always, is licensing. BTRFS is still buggy enough to land a surprise blow where it hurts the most. Personally, I don't run RAID 5/6 setups, but that's BTRFS' biggest weakness right now. That, and the occasional "oh shit!" moments. Regardless, I think I'll need to get used to it. Lots of backups, coffee and prayer – the bread & butter of a sysadmin. On the upside, this is virtually the only concern I have regarding OpenSUSE.

The clear positives:

  • Centralized system management via YaST2 (printers, bootloader, kernel parameters, virtual machines, databases, network servers, etc.). A command-line interface is also available for headless appliances. This is absolutely indispensable.
  • Access to extra software packages via semi-official repositories. Every tool or framework I needed was easily found. This is a much more scalable approach than the Debian/Ubuntu way of downloading ready-made .deb packages from vendors and having to watch out for updates. Big plus.
  • Impressive versatility. OpenSUSE is theoretically a desktop-oriented platform, though thanks to the many frameworks it offers, it works equally well on servers. In addition, there is the developer-centric rolling-release flavor, Tumbleweed, which tries to follow upstream projects closely. Very important when relying on core libraries like pandas or numpy in Python.

So far, I’ve switched my main desktop machines over to OpenSUSE, but I’m also testing its capabilities as a KVM host and database server. Wish me luck!

Why Golang is not for me…

Recently, I decided the time has come to progress my not-yet-existent game developer career. I have always wanted to write games, and there are a lot of great old-school games which deserve reiterations using modern technologies. After some discussions with my wife (big kudos to her!) and getting properly inspired by DOS-era gems and jewels, I was ready to pick a language. I'm quite confident in my Python skills; however, for games I'd rather use one of the mid- to heavyweight contestants like Java, C#, C or C++. I have some experience in C, but pure C is too simplistic, heavily procedural and unfortunately doesn't provide enough tools to build rich graphical applications. Sure, I could try nuklear.h or similar single-header libraries for drawing shapes. That's sufficient for menus, though not for the entire project. Clearly, C is better suited for number-crunching subroutines. C++ is way too complex for me, though of course most games are written in C++, since rendering libraries and game engines are coded in C++. That makes perfect sense. Something easier, perhaps? C# is a Microsoft thing and I would like my game(s) to be easily accessible on all platforms. That left me with Java and a new contender – Go.


Funky gopher on a funky horse – courtesy of the Web

The Golang project officially began in 2009 and has managed to garner quite a bit of appeal throughout the years. It's not a Google toy anymore. For instance, CloudFlare uses it in their Railgun project (circa 4000 lines of code, last time I checked). Other notable examples include the entire TICK stack for time-series metrics (Telegraf, InfluxDB, Chronograf, Kapacitor) and Grafana (a visualization platform for various database back-ends like InfluxDB, MySQL, Elasticsearch, etc.). I even found a 3D game engine advertised as programmed in Go (~50% of it was written in C, though). Since it appeared that Go is here to stay and is slowly establishing its position as one of the mainstream languages, I decided to at least take a look at it. Sadly, the more I read about it, the less inclined I was to code in it. The emphasis on concurrency is both important and useful; however, I feel the language is severely lacking in many respects.


No time for classes

Thanks to my Python background I am well accustomed to object-oriented programming and I consider it important for writing DRY code. It's not always the best approach, though in most cases it provides the means of maintaining modular programs. We know that modular is good, because it allows us to exchange bits and pieces without breaking APIs. There was a bit of a switch from old-style classes in Python 2 to new-style classes (the only kind in Python 3), which seemed to be inspired by Java. However, Python goes one step further and supports multiple inheritance, purposely omitted from Java. As it tends to be quite confusing, I avoid it rather than abuse it. Pure C, the ancestor of many modern languages, lacks classes, and they were never introduced in subsequent revisions of the C standard. That stands to reason, as C++ came along in the 1980s and expanded the successful formula of C with multiple useful features, including object-oriented paradigms. Also, back in the day procedural programming was sufficient, and even nowadays it is perfectly adequate for system-level programming. Unfortunately, Go's design follows C rather than C++. As a result, it demonstrates a strong procedural focus, lacking classic means of data encapsulation. Forget classes, object hierarchies, clean polymorphism, operator overloading, etc. To me that's a step backwards, not forward. It means that Go will suffer from the very same general limitations as C.
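
For reference, this is the kind of class hierarchy and diamond-shaped multiple inheritance I have in mind, shown as a minimal Python sketch with purely illustrative names; Go offers struct embedding and interfaces instead of such hierarchies.

```python
class Entity:
    def describe(self):
        return "entity"

class Drawable(Entity):
    def describe(self):
        return "drawable " + super().describe()

class Collidable(Entity):
    def describe(self):
        return "collidable " + super().describe()

class Player(Drawable, Collidable):
    """Diamond-shaped multiple inheritance: Entity sits at the top of the diamond."""
    pass

p = Player()
print(p.describe())
# 'drawable collidable entity' -- cooperative super() walks the whole MRO
print([c.__name__ for c in Player.__mro__])
# ['Player', 'Drawable', 'Collidable', 'Entity', 'object']
```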


The emperor’s new clothes

One of the major aspects of a language is its syntax. Python wins against many more performant languages because it's simple, encourages the use of a clean and consistent coding style, and makes reading other people's code a breeze. In fact, so does C (in a way), if it's not abused. The reason why Java was successful upon its release was that it closely followed the syntax of C and C++. It was meant as a portable, cross-platform language with a familiar look, to encourage existing programmers to switch. One could code in C, C++ and Java, covering a multitude of use cases effortlessly. In addition, the Java Virtual Machine supports other languages like Scala, Clojure, Groovy and Jython for even more potent combinations. In contrast, Go was inspired by C, yet it completely overhauled the familiar C-like syntax for no apparent reason. This leads to confusion, the need to unlearn old but useful habits, and the need to invest resources in learning a completely foreign language. At this point I'm hardly motivated.


Simple == useful?

As I mentioned earlier, Go selectively omits many modern and potentially useful language features, classes among them. It was originally advertised as a simple-to-understand systems programming language meant to make life easier for people at Google. Yet it locks prospective programmers into a (one could even say dumbed-down) C-like syntax which is alien to other languages. It is true that C++ is a monster of a language due to its scope. However, it is perfectly viable to establish the use of subsets or dialects to make it easier to understand. What I mean is that it would be more useful for prospective programmers to learn a language with more features than to have to re-invent those features in a band-aid manner as they become more and more comfortable with the language.


Conclusions

While my general impression of Go is largely negative, I do not by any means consider it a useless language. Quite the opposite! It has provided the server space with a number of useful engines and applications for networking, data storage and visualization. Actually, in some cases these pieces of software are more robust than existing solutions written in C/C++. Personally, I find that quite impressive. However, I still believe the arguments against Go are valid. I would rather continue learning Java, or even go straight for C++, and I would recommend others do the same.

FreeBSD 11.1 on ASUS VivoBook S301LA

I decided it is time to write a piece on FreeBSD, since I now officially use it as my main operating system both at home (alongside OpenBSD) and at work. My mobile battle gear of choice is the ASUS VivoBook S301LA. It's a 4th-generation-Intel ultrabook-class laptop, one of the many released by ASUS every year. It has strong points, though also quite a few disadvantages. I would like to discuss it from the perspective of a FreeBSD enthusiast.


Photo courtesy of notebookcheck.com and the Interwebs

Hardware specifications:

  • Processor: Intel Haswell Core i3-4010U @ 1.70 GHz
  • Graphics: Intel HD 4400 integrated GPU with up to 768 MB shared RAM
  • Memory: 4 GB DDR3L 1600 MHz (soldered) + empty slot for another 4 GB
  • Hard drive: Western Digital Blue 500 GB 5400 rpm (replaceable)
  • Ethernet: Realtek 8169 Express Gigabit
  • Wireless: Mediatek MT7630e with Bluetooth built in (half-sized, replaceable)
  • Sound: Intel HD Audio (SonicMaster)
  • Webcam: Azurewave USB 2.0 UVC HD Webcam
  • Touchscreen: USB SiS Touch Controller
  • Battery: 4 hours
  • Microphone: yes, next to the Webcam
  • Keyboard: Generic AT keyboard
  • Touchpad: Generic touchpad with integrated click-fields
  • Additional ports:
    – left side: Ethernet, HDMI, USB 3.0, microphone/headphone jack
    – right side: Kensington lock, 2x USB 2.0, SD card slot


The good:

  • Extremely lightweight
  • Never overheats
  • Moderately fast after upgrades

The bad:

  • Paper-thin keyboard
  • Slippery touchpad
  • Highly reflective, mirror-like screen
  • Cheap, lower-end wireless card

Overall, this device is a fairly standard consumer-grade ultrabook. The crappy keyboard is something one can get used to rather quickly. I'm not a fan of touchpads, so I rely on PC mice for clicking and scrolling unless I'm on a plane or train. Nowadays, reflective screens are no longer an issue thanks to anti-glare screen protectors. The obvious downside is that anti-glare screens lack the sharpness typical of glossy screens. In general, the drawbacks can be easily mitigated with upgrades, which, however, turn the laptop into a moderate investment. The choice is down to the prospective user. Furthermore, the manufacturer (ASUS) made some choices which I am not entirely convinced by. Firstly, touchscreens are more useful on hybrid flip-laptops like the Lenovo Yoga; in this model the touchscreen is more of a nuisance when cleaning the plastic cover on the display, and it draws power needed elsewhere. Secondly, the wireless adapter is perhaps the worst of its generation, with a nominal bandwidth of 150 Mbps. It's even more of a travesty to see it in high-end ROG gaming models (yes, it's true…).


The FreeBSD perspective:

This might be somewhat disappointing. Depending on what one expects from a mobile device, the S301LA is either average or just plain broken. Not to sound rude, but I'm sure a ThinkPad or an IdeaPad would be a far superior choice. Haswell HD 4400 graphics chips have had proper (read: working) FreeBSD support only since release 11, and most other components are barely supported. The Azurewave USB webcam actually works (webcamd needs to be attached to USB device ugen0.2 by root or a member of the webcamd group), but no VoIP software is available on FreeBSD out-of-the-box. I guess one could get Windows Skype to run via WINE or force the alpha-quality Linux client into submission, but that's a lesson in futility, I think. Personally, I wouldn't be using this ultrabook at all if not for the fact that I finally managed to replace the trash wireless adapter with something half-decent (albeit from 10 years back) from Intel, namely the WiFi Link 5100. After adding another 4 GB of RAM and a Western Digital SSD, I would consider this ultrabook worth the money and time. However, as I mentioned earlier, there are far better choices on the market.

Show Me Your Code!

For the last couple of months I have joined and participated in discussions in multiple Facebook tech groups. As demographically diverse as Facebook is, I noticed a worrying trend. Most of the inquiries have the following features:

  1. They are incomplete and/or badly written, and fail to explain the problem at hand in an understandable fashion.
  2. They expect immediate answers and solutions.
  3. They demonstrate that the inquirer did not try to address the problem themselves beforehand.

Feature 1 can be explained by the fact that most of the Facebook group members are not native English speakers and struggle with forming comprehensible questions. Still, I find it odd that they invest so little effort. For instance, if an inquiry refers to issues with a specific operating system, it would be wise to provide the full specification of the computer running that operating system, or at least name the operating system, no? I would assume that this should be dictated by common sense, though perhaps education also plays a role here? After all, we are taught how to pose questions at school and at university. The consequence is that even if the question is answered, the inquirer may not understand the answer, because their language and/or technical skills are insufficient. It is a sad, but inescapable aspect of discussion groups.

Features 2 and 3 are interconnected and grind my gears the most. The phrases I often encounter are "suggest me", "give me solution", "give me program command", "give me/show me your code". All of them assume that the answering party is obliged to provide a solution as quickly as possible, while in reality the opposite is true. The answering party is not obliged to do anything! Rather, the inquirer should display humility in order to receive a reliable answer. What is even more insulting and disrespectful is the fact that some of these questions are phrased in such a way that they could easily be executed as a Google query. No additional help from a dedicated technical group is needed. Other questions expect a clear and detailed explanation of an entire framework, which usually takes a year, if not years, to build. For instance, the inquirer wants to know how to build a fingerprint system for monitoring/registering students at a local school. He/she anticipates a full outline of the entire system in a "ready-to-go" package, best described in layman's terms so that he/she can proceed with building it. In all honesty, endeavors like this typically require a team of experienced software engineers, not a ragtag group of volunteers.

In the end, it boils down to the issue of instant gratification, which plagues modern societies. Many Western business models are based on the premise that much can be achieved with minimal effort and that the evanescent everyman can become a hero instantly. A fantastically enticing end product is shown, together with a set of trivial instructions which need to be followed. People seek happiness, and obviously it's best if that happiness is achieved quickly. However, instant gratification is not lasting; it requires more units of the product, or a newer product, and that in turn drives the ever-increasing demand. Technology is no different. People are made to believe that coding is easy and that great programs can be written overnight. Also, everyone can instantly become an experienced hacker, because why not? Reality is different, though. Impressions are cheap, while actual experience is resource-intensive. Learning is a process which takes time. We can disagree, though that will not alter reality – merely our impression of it.

The Ubuntu Conundrum

Ubuntu is perhaps the most popular Linux-based operating system; however, for that very reason it has as many proponents as enemies. I myself use Ubuntu (Xubuntu 16.04 LTS, to be exact) at work, both as a development platform and to host services in libvirt/KVM virtual machines (Ubuntu Server 16.04 LTS there). It performs all right and so far hasn't let us down, though we haven't used it through more than two releases, so we're unable to gauge its reliability properly. On more personal grounds, I can say it works splendidly on my early-2011 MacBook Pro 15″ with faulty AMD graphics, and it has since the very beginning (out-of-the-box, as one might say). Individual package upgrades don't bring about all of the regressions people profess so fervently. However, I can understand where the hate is coming from, and I admit it is partially justified.

Product and popularity
For whatever reason, human psychology dictates that we equate quality with popularity. If something is extremely popular, it simply must be good, right? Wrong. Completely. A product is popular because someone with enough resources made it visible to as many consumers as possible. The product was made popular. Quality is a useful, but clearly secondary measure. A good anecdote is the long-gone rivalry between VHS and Betamax. We all remember VHS, though most of us do not remember Betamax, which was technically superior. However, it lost the popularity race and will forever be remembered as the second best, or not remembered at all. Now, this is not to say that Ubuntu is in any way inferior…

Ubuntu, the (non)universal operating system
The main issue with Ubuntu is that it succeeded as a more open alternative to Windows and macOS, yet it did not solve the underlying problem – computer literacy. Of course, not every computer user has to be a geek and hack the kernel. However, when I see Ubuntu users address their PC-related issues with the same shamanism and hocus-pocus as on Windows, my soul twists in convulsions. We did not flee from closed-source operating systems only to change the names of our favorite tools and the look of our graphical user interfaces, though observing current trends, I might be terribly wrong. The other problem is that Ubuntu's popularity has become self-perpetuating. It's popular because it's popular. Many tutorials online and in magazines assume that if one uses Linux, he or she surely runs Ubuntu on all of his or her computers. This is extremely hurtful to the Linux ecosystem as a whole, because neither Debian nor Ubuntu represents standard Linux. Both systems introduce a number of configuration changes to applications which are not defined in upstream documentation and are absent from other distributions (so-called Debianisms). Therefore, Ubuntu being a universal operating system is more of a publicity gimmick than a fact, especially considering that on servers SLES (SUSE Linux Enterprise Server), CentOS and Red Hat clearly dominate.

The solution?
I would say it's high time we started showing newcomers that there is an amazing world of Linux beyond Ubuntu. To that end, I have a couple of suggestions for specific needs and the distributions covering those needs. Related questions come up often in the Linux Facebook group and around the Internet, but they get answered superficially via click-bait articles listing the top 10 distributions of 2017/18. Not exactly useful. Anyhow, the list:

  • Software development:
    – Fedora (up-to-date packages and developer-centric tools like COPR)
    – Arch Linux (up-to-date with a wide range of packages via AUR and vanilla package configuration for simplicity)
    – openSUSE Tumbleweed (up-to-date with a rolling, snapshot-based release cycle, but sharing the high-quality Leap / SLES management tools like YaST2)
  • Servers:
    – openSUSE Leap (3-year long support life cycle, high-quality management tools like YaST2 and straightforward server + database + VM configuration)
    – CentOS (binary compatible with Red Hat Enterprise Linux)
    – FreeBSD (ZFS hard drive pool management + snapshots, reliable service/database separation via jails, rock solid base system)
  • Easy-to-use:
    – Manjaro Linux (based on Arch Linux, with lots of straightforward graphical configuration tools, multiple installable kernels, etc.)
    – Fedora (not only for developers!)
    – openSUSE Leap (for similar reasons as above + a streamlined, user-friendly installer)
  • For learning Linux:
    – Gentoo (painful at first, but extremely flexible with discrete software feature selection at compile-time via USE flags)
    – Arch Linux (Keep It Simple Stupid; no hand-holding, but with high-quality documentation to make the learning curve less steep)
    – CRUX (similar to Gentoo, but without the useful scripts; basically, vanilla Linux with a very simple package manager)
  • For learning BSDs:
    – FreeBSD (as mentioned above)
    – OpenBSD (strong emphasis on code-correctness, system engineering and network management)
    – DragonFly BSD (pioneering data storage and multi-processor systems)