@allquixotic · Created December 21, 2015
My Ubuntu Experience

Intro

I have been using some flavor of GNU/Linux (which excludes Android) since about 2002. The first distro I ever installed was Lindows, and while I often used Lindows as my "main" distro because it was so easy to use, I was very much a distro shopper in my early years. I'd find a spare USB hard drive or flash drive and throw Gentoo or Debian or openSUSE or Fedora on it. It wasn't until the release of Ubuntu 8.04, Hardy Heron, that I finally settled on a "preferred" distro: Ubuntu. And, unlike its commercial open source relative, Linspire, Ubuntu didn't run out of money and stop development. So that was a big plus.

Also unlike its other more distant relative, Fedora, Ubuntu had a more predictable release schedule and longer support for each release. This meant I could feel free to run a crusty old installation on a dedicated server and be confident that I'd continue to receive critical security updates for years to come. And that's exactly what I did: starting around 2006, I have had either a dedicated server or a beefy (for the time) VPS running more or less constantly.

Two Worlds

My history of using GNU/Linux has really been a tale of two worlds: my essentially constant usage of it on a dedicated server (plus a few small VPSes here and there), and my very sporadic usage of it on the desktop.

The Server

As hinted earlier, I've used GNU/Linux on my personal dedicated server (as well as on some work projects here and there) pretty much constantly since the early 2000s. But my server story is actually quite boring, and not why I'm writing this post. The server is something GNU/Linux has always been great at, because of the immense investment of companies like Intel and Red Hat, who polish and optimize the heck out of things so that the key components of the system (the mainboard, the CPU, and the NIC) "just work", and work well. But we all knew that.

The server hardware I rent has changed many times (as has my choice of vendor), and the distro also changed fairly often until the last four years or so, when I settled on Ubuntu LTS. I chose it for its long-term support, combined with the ease of installing complex server products thanks to the extensive guides and PPAs available. Access to the latest stable release of many hard-to-compile programs is a huge productivity booster.

Today I have Ubuntu Server 14.04.2 running happily on my OVH dedicated server -- the current box has 128 GiB of RAM, a Haswell Xeon E5, hybrid SSD+HDD storage with the SSDs serving as L2ARC for ZFS, and a software environment based on lxd. Aside from a bit of churn in lxd itself, the system has been really stable and does what I want: it gives me a convenient separation of concerns with minimal overhead and none of the hypervisors, nested kernels, or nested filesystems that make traditional VMs messy and slow. I even get rebootless kernel security updates thanks to KernelCare. Whenever I need to update userspace, in the worst case I just have to restart my containers; that takes 5 seconds, compared to the 10 minutes it takes this server-grade iron to reboot (server BIOS and UEFI implementations are not known for their speed).
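
For the curious, the day-to-day container workflow I'm describing looks roughly like this -- a minimal sketch, with a hypothetical container name rather than my actual setup:

    # launch a new container from the stock Ubuntu 14.04 image
    lxc launch ubuntu:trusty web01

    # get a shell inside it to poke around or update userspace
    lxc exec web01 -- /bin/bash

    # after updating userspace inside the container, a restart takes seconds
    lxc restart web01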

Tracking the progress of GNU/Linux on the server by running it constantly has kept me mostly in the loop about changes that have been happening since my last serious foray into desktop Linux in 2009. But I wasn't prepared for how far it had actually come.

The Desktop

From 2002 to 2008, I was a big proponent of suffering through the pain points in order to keep filing bugs and providing user feedback. Early influences from the Linspire/Lindows community taught me to be patient and work through specific issues so that they could get resolved and contribute to the greater good.

But I had a lot of very poorly supported hardware, and eventually I, too, gave up on desktop Linux. I was tired of the constant churn in the Xorg stack; the PulseAudio upheaval broke many apps; and drivers just weren't ready for prime time. Essentially, where most users would try something and give up if they couldn't get it working in a day, my patience spanned about six years.

But at some point in 2009, after what was probably a kernel panic or a PulseAudio volume control bug or a really tough graphics driver rendering issue, something clicked and I just stopped trying. I simply wanted a stable system that worked, whether it was free and liberating and exciting, or closed and restrictive and boring. So I went back to Windows, which was extremely boring at the time (Windows 7). But then Windows actually got a little exciting as releases started to accelerate (8 -> 8.1 -> 10 in the span of a few years), which further pushed me away from caring about desktop Linux.

From 2009 until a few days ago, I hardly used it on the desktop at all. I helped a relative get Mint installed and working fairly well on a PC that was effectively a web browser, and that ran for a few years in 2011-2013, but upgrade woes and graphics driver issues ultimately led us to put Windows back on that machine to reduce the maintenance burden on me. Even running Google Chrome and Flash games seemed to need frequent attention on desktop Linux: the upgrade was spurred on by various websites refusing to load because the installed version of Chrome was too old; then the newer version of Chrome didn't like the old graphics drivers; and so on.

Part of the reason I've rarely used it on the desktop since my exodus in '09 was the graphics driver situation. For years I was an AMD graphics card owner. Anyone who has tried to get a solid gaming experience on GNU/Linux with an AMD graphics processor knows that it's just an unnecessary pain. The proprietary driver is buggy and poorly integrated; the open source drivers are buggy and lack many important features and performance optimizations -- not to mention that the open source stack always runs about two years behind in properly supporting new chipsets, and even then only gets them running about as well as the previous generation.

That all changed when I picked up an Nvidia GTX 980 a few months ago. I didn't realize it at the time, but switching to the green team had set me up for being ready to finally make the switch to GNU/Linux more-or-less full-time. It's really quite astounding how functional, stable, and robust the Nvidia drivers are on desktop Linux -- and they've been that way for a long time. I always heard people say that, but I didn't believe it until I tried it.

The thing that spurred me on to actually install Ubuntu again was Kerbal Space Program. Ah, yes, that classic space simulator with some of the highest-quality mods ever created for any game. What's so great about KSP on GNU/Linux, you ask? Simple: KSP on Windows only runs reliably as a 32-bit executable, due to a Unity bug. There used to be a 64-bit KSP build for Windows, but it was pulled because of a persistent problem that only the Unity devs can fix. There's supposedly a community patch that works around it, but I don't know whether that gets you the latest version of KSP. If it's too unstable for Scott Manley to use for his YouTube videos, I'm not interested!

KSP on GNU/Linux, by contrast, runs 100% fine as a native 64-bit binary. You can use as much RAM as you have, and the only real limiting factors become how CPU-expensive the simulation is and how long you're willing to wait when it starts to lag. This is great for those of us who like to use a lot of mods (like the Realism Overhaul suite), since the 32-bit build of KSP tends to run out of memory and crash with too many mods installed.

Since 2009, there have been three huge things that have happened to desktop GNU/Linux that have made it a viable gaming platform:

  • Steam. Of course. I'm a huge PC gamer, and one of the things that keeps me running back to Windows is the dearth of games available on GNU/Linux. But things are really looking like they're finally starting to shape up for us, thanks in part to Valve providing a conduit for getting games onto Linux boxes.
  • Wine-stable. Seeing Wine finally start to stabilize and become fast and low-overhead has been of tremendous value for those who simply can't resist playing some Windows-only game or running some Windows-only program. Sure, you can't take any arbitrary Windows game and just run it under Wine with no problems, but many important, modern titles are working well at this point.
  • Android (and, by extension, engines like Unity that have gone full-on multi-platform). Android has driven the adoption of OpenGL for game development, which opens the road for fairly easy porting to non-Windows platforms, including, conveniently, GNU/Linux. Some games still use Direct3D, of course, but a significant fraction of those games have two renderer implementations and can swap over to OpenGL if the platform doesn't have Direct3D.

We've also seen an intensified focus by both Nvidia and AMD on optimizing and adding features to their GNU/Linux graphics drivers. AMDGPU and Nvidia's new kernel modesetting support are two examples of them going all-out in their support of desktop Linux. Honorable mention to Intel, who continue to advance the state of the open source graphics stack while working with Valve to optimize their driver for the Source engine (and vice versa).

Quite a few important thick-client desktop applications are also running well on desktop Linux these days. It's really quite impressive how much support there is out there.

Here's a short list of a few programs I've gotten running natively on Linux and am very happy about:

  • Genymotion -- an Android emulator -- lets me run Android-only games on GNU/Linux (yes, even games with native ARM code; there's an ARM Translation library you can install to enable that support). It's free for personal use, which is good enough for me!
  • Hearthstone -- which runs perfectly well under Codeweavers Crossover Linux. It's so good that you can't tell it isn't running natively. BTW guys, if you haven't bought into Crossover Linux, you really should. Support Codeweavers: they pay the salaries of the most important Wine developers. Without them, Wine would not make the progress that it has. Aside from that, it's much easier to install and run "supported" Windows apps under Crossover than it is to do the equivalent with plain Wine. There are also a few programs that work on Crossover but break on the latest version of Wine.
  • Star Wars: The Old Republic -- which runs pretty well under Wine. Unfortunately I couldn't get it running under Crossover, but it worked for me under Wine 1.8, so I suspect a relevant fix landed in that release.
  • Spotify, who have a native GNU/Linux client. Hooray for developers who like desktop Linux!
  • Skype, who have a native GNU/Linux client. Hooray for developers who like desktop Linux! Uh... hi, Microsoft? :S
  • UltraEdit, which has a native GNU/Linux port (and it's quite good). Full-fat text editor resembling Notepad++. I can't live without my Notepad++-or-equivalent!
  • Steam, mainly for Kerbal Space Program, which I'm sure will not be the only game I run on Steam/Linux, but it's certainly the most important, since it literally runs better on GNU/Linux than on Windows.
  • TeamSpeak 3, for communicating with friends. Awesome that they continue to support desktop Linux.

I'm not going to link to the bog-standard programs that we all know and love, like LibreOffice, Firefox, Chrome, etc., because we "expect" these to run well on GNU/Linux at this point.

Pain Points

I did hit a few pain points while bringing up my install:

  • Ubuntu still doesn't ship the latest Nvidia binary driver. I had to grab it from a PPA (see the sketch after this list). Not hard, just a minor inconvenience. It's always best to use the latest driver for performance, stability, and features; Nvidia does their homework and generally tests this stuff well enough on their own. I'm not sure what Canonical is waiting on or why they're sticking to the 352 series.
  • SWTOR is pretty hard to get running, and takes some substantial hackery.
  • Writing XDG desktop entries by hand to put shortcuts on my Unity launcher is slightly annoying (an example is sketched after this list).
  • I had to install ldmtool and manually mount my Windows volume in order to use one of my storage arrays configured with the Windows Logical Disk Manager ("Dynamic Disks"); the commands are also sketched below. It's set up this way because I have two separate, different-sized SSDs exposed from my RAID controller and I wanted them treated as a single JBOD volume. I had to mount it in Ubuntu to copy over game data, so I don't have to re-download 50+ GB of data ;-)
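
For reference, here's roughly what those workarounds looked like on my system. These are minimal sketches: the driver series, file paths, and volume names are illustrative examples, not something to copy verbatim.

Pulling a newer Nvidia driver from the community graphics-drivers PPA (install whatever series is current for you):

    sudo add-apt-repository ppa:graphics-drivers/ppa
    sudo apt-get update
    sudo apt-get install nvidia-358   # substitute the current driver series

A hand-written XDG desktop entry of the kind I mean, dropped into ~/.local/share/applications so the app shows up in the Dash and can be pinned to the launcher (paths here are made up for illustration):

    # example launcher for Genymotion; adjust Exec/Icon to your install location
    [Desktop Entry]
    Type=Application
    Name=Genymotion
    Exec=/home/me/genymotion/genymotion
    Icon=/home/me/genymotion/icons/icon.png
    Terminal=false
    Categories=Development;

And the ldmtool dance for the Dynamic Disks volume (the device-mapper name will be whatever ldmtool reports for your disk group):

    sudo apt-get install ldmtool
    # scan the disks and create device-mapper nodes for all LDM volumes found
    sudo ldmtool create all
    # then mount the resulting node; the name below is only an example
    sudo mount /dev/mapper/ldm_vol_MyDiskGroup_Volume1 /mnt/games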

My Setup & Experiences Installing

Hardware

  • Intel Core i7-3770K
  • Motherboard: Asus P8Z77-V
  • 32 GiB DDR3
  • Nvidia GTX 980
  • Storage: 4 x 4 TiB HGST 7200rpm HDD, and 1 x Samsung 850 Pro 1 TiB; 1 x Sandisk Ultra II 480 GiB; and 1 x Samsung 850 Pro 128 GiB. All of these are connected to my Adaptec 81605ZQ RAID controller with a flash cache/backup module. The HDDs are in RAID10, with as much as possible of the SSDs used as MaxCache 3.0 (basically a read and write cache to hide the IOPS limitations of HDDs). The remainder of the SSD space is formatted as a Windows Dynamic Disk (JBOD) as NTFS for storing games directly on the SSDs.
  • Cyberpower line-interactive pure sine wave 1500VA 900W UPS.
  • Blue Yeti USB condenser microphone.
  • Das Keyboard mechanical keyboard with Cherry Brown switches.
  • Steelseries Sensei mouse.
  • Networking: USB tethering to the Apple iPhone 6S Plus (yes, it works; there's a kernel driver that exposes the iPhone as an ethernet NIC when tethered; also, I have unlimited data ;p)
  • Sound: My Creative Sound Blaster Z doesn't work (Sound Blaster support under ALSA has never been great), but the motherboard's S/PDIF works just as well, so I plugged it into my Imperial BART 1 full-stack S/PDIF to Bluetooth 4.0/aptX converter box. Then I connect it to either my Samsung Level On headphones or my Sennheiser Momentum On-Ear 2.0 headphones. PulseAudio and ALSA drive the Realtek codec's S/PDIF (digital) port on my motherboard really well.

Disk Layout

  • Windows 10 as the main OS on the RAID array (boots in UEFI mode)
  • I non-destructively resized the main Windows partition to free up 375 GB of space and put Ubuntu Desktop 15.10 x86_64 there. For Ubuntu I'm just using ext4 as the filesystem for now.

Conclusion: Do I still need to dual boot?

Yes. Unfortunately. There are a few DirectX 10/11 games that are near and dear to my heart and simply don't work under Wine at this point, and I'm not done playing them. Most notably: Fallout 4 and Witcher 3. But I can certainly avoid playing them very often, and just have weekend gaming sessions where I boot back into Windows for them. If I decide I want to play KSP, though, back to Ubuntu I go!

I am so impressed by the array of software and games available on Ubuntu that I think I'm going to start using it as my primary OS again. I'll be booting back into Windows for the foreseeable future to play Direct3D 11 games like Witcher 3 and Fallout 4, but I only play those on the weekends, anyway (weekdays are too busy and I don't have enough time to get immersed). Everything is a lot easier and less painful than I remember it from the bad old days. I think the year of the Linux desktop is actually getting closer, and maybe 2016 will be it.

It would be, in my eyes, if only CD Projekt Red and Bethesda would make GNU/Linux ports of their AAA games. Keep at it, and maybe GNU/Linux adoption will reach a level where Fallout 5 and Witcher 4 (or whatever they're ultimately called) are supported on our favorite platform. One can hope!
