It's December Days time again. This year, I have decided that I'm going to talk about skills and applications thereof, if for no other reason than that I am prone both to the fixed mindset and to downplaying any skills I might have obtained as not "real" skills because they do not fit some form of ideal.
16: Badger

If you like, put on Badger Badger Badger. (Personally, I'm here for the Mosesondope.EXE Demoscene Edition, because of the visuals and the sound, but the original is also good and will serve.)
The key word for this entry comes from the Lucy A. Snyder piece Installing Linux on a Dead Badger: User's Notes, and the subsequent book, Installing Linux on a Dead Badger (and other oddities), which took the single article and created more from that world, packaged up into a small set of pages.
I'll admit that replacing the operating system on a computer is not a task for the faint of heart. This is also another one of those cases where having a spare machine, or even a spare hard drive to devote to the installation, makes things much more approachable than trying to make everything happen with one drive, one computer, and therefore one chance to get it right without additional help from recovery utilities. For a good amount of time, I had Linux on one drive and Windows on another. Actually, I still do; it's just that Linux now gets the faster and better drive instead of Windows.
To make some of those situations less frightening, there exist things like the Windows Subsystem for Linux, which you can install and enable and then download a compatible distribution to get terminal access to it. (It doesn't do GUI.) There have also been projects that set up an image inside the already-partitioned drive, so that there wasn't any need to repartition the drive and worry about what might happen to data as it gets shuffled around. And most Linux distributions have a live environment on their install images, so that someone can at least poke around a little and see what it might be like to use that particular distribution: how it manages software, what it includes by default, and which desktop environment it believes is the best one, and therefore puts forward as the default option.
Because just about every Linux distribution is Opinionated about these things, it can take a certain amount of trial-and-discard before you find one that you're willing to work with long enough to figure out whether you're truly compatible, or whether it still does things over time that will annoy you. By that point, you will have developed Opinions of your own about how you want your Linux to function, and you will be able to read distribution documentation and hype statements to find out whether their Opinions match yours. I started trying to run Linux in my undergraduate university days, and the experience was so rough that I jettisoned the idea entirely for my undergraduate period. By the time graduate school rolled around, and a Linux environment was the best option for some of my schoolwork, instead of trying to flatter Apple by buying into OS X, sufficient improvement had happened that installation and use were much more on the Just Works side than the problem-ridden situation I had run into before. It could also be that I selected very poorly in my first distribution choice. In any case, graduate school and my first few years of independent living were relatively smooth in terms of making it all work out, and I had a Linux machine with a TV tuner that I could use to watch most of the cable channels available to me in my apartment.
So, in the sense that you don't need large amounts of technical skill or configuration fiddling, and that most distributions contain an installation wizard to set specific environment variables, partition drives appropriately, and install the software, it's not that difficult to install Linux on a desktop machine. Laptops are a little fiddlier, and single-board computers, like the Fruit Pi lines, are the fiddliest of the lot. That said, the desktop installers work 99% of the time for laptops, and the Fruit Pis generally have their own installers and image-writer programs to ensure that everything gets put in the right place. Significant amounts of work go into the installers to make sure that they function well, cleanly, and without errors, so that someone can feel confident that the potentially most dangerous part of the process is simple enough, and that all of the options they need are presented, so their machine will come out the other side running optimally, with any and all of the tweaks or packages it needs already enabled.
The story that provides the keyword for the entry is much more like what it can be to try and port Linux to a new set of hardware, or to figure out how to get all the drivers in place and running smoothly, along with any adjustments that need to be made to the kernel to make it happen. All of which is arcane wizardry well beyond my current level of understanding. I am in user space, not in kernel-hacking space. (And there you can see why I think "Oh, I'm just running other people's software" is an appropriate deflection for any kind of praise for things that I'm doing with that software.) It's a lot easier than it has ever been to install Linux on a dead badger, or any other animal of your choice, and it will likely get easier as time goes on, as the installers and distributions are refined even further and Linux becomes available for a wider range of hardware and the components attached to that hardware. Because Linux people want us to adopt a distribution (and preferably theirs), they're trying to make it as simple as they can to get it done. So, having done it several times at this point, and changed distributions, and mostly just used the tools available to do it with, I don't consider having installed a Linux to be a particularly praiseworthy thing for me in most circumstances. (It's recipe usage. Just follow the directions and you'll be fine; pay no attention at all to how following a recipe almost always carries an underlying assumption that you already know all the techniques it's going to ask you to perform.)
George, the original-model Chromebook, is an exception to this. It was still recipe-following, but I had to be a proper information professional to find and extract the recipe from where it was being stored. Pulling off the same feat again with a different model of Chromebook is pretty impressive, since that still meant finding the appropriate spots on the circuit board to disable the write protection and doing whatever the recipe required to accomplish that disabling. (I think it was removing a screw, in this case, instead of using electrical tape to prevent a connection.)
Putting aftermarket operating systems on phones and tablets is still recipe-following, but in my case, for Android things, it requires operating from the terminal to achieve the desired results, as well as manipulating buttons or otherwise making sure the device is booted into the correct places to use those recipe tools. And while I've had success with every item I've attempted, there was one time where I straight-up botched the process by flashing the wrong thing to the device! While this would normally be a brick-the-device problem, on this specific device (an old Amazon Kindle Fire), after some digging through the information and reading more of the troubleshooting parts of the recipe, it turns out there's a pad on the circuit board that, if you create the right kind of short to it, forces the device into a firmware-upload acceptance state for a little bit of time. Which meant having the dexterity and care needed to disassemble the device to expose the pad, the right kind of wire on hand to create the short, and the terminal command typed out so that it only needed the carriage return, so that I could hit my window of opportunity and flash the correct item to the device. And then, after confirming that it had, in fact, taken the correct flash and could now function properly again, to reassemble the device. That was an adventure, and it'll teach me to read things more carefully the next time I get a wild hare in my bonnet about doing various things. (That said, this device was old, it was not mission-critical, and while it was much improved for having been put on this path, it still wasn't a very powerful device, and the version of Android built for it was several version numbers behind. So botching the flash was a question of whether I could do the recovery, not whether I had to. Much less pressure.)
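For the curious, the terminal side of that kind of recipe usually centers on the `adb` and `fastboot` tools from Android's platform-tools. This is a dry-run sketch, not any specific device's recipe: the helper only prints the commands instead of executing them, and the image filename is a placeholder I made up.

```shell
# Dry-run sketch of the usual aftermarket-flashing recipe for an Android
# device with an unlockable bootloader. run() prints each command instead
# of executing it, so nothing here touches a real device.
run() { echo "+ $*"; }

flash_recipe() {
    run adb reboot bootloader                        # or hold the right button combo at power-on
    run fastboot flashing unlock                     # older devices use: fastboot oem unlock
    run fastboot flash recovery custom-recovery.img  # placeholder filename, not a real image
    run fastboot reboot recovery                     # boot the new recovery, then sideload a ROM
}
flash_recipe
```

Each device family varies in the exact partition names and unlock commands, which is exactly why the per-device recipe (and its troubleshooting section) matters so much.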
[Diversion: There is at least one cellular device carrier who locks the bootloaders of all their devices and refuses to provide any means of unlocking them to their consumers. The problem is, unless the seller already knows, and/or has already installed an aftermarket operating system on the device, there's no way of knowing whether a used phone that you're interested in will be one you can put aftermarket software on, or one of these bootloader-locked devices from that carrier. It's remarkably hard to source older devices because of this, and there's enough confusion between carrier unlocking, where a device can be used on any of the carrier networks in a country, and bootloader unlocking, to install software, that a device proclaiming itself "unlocked" is often carrier-unlocked, with no word on whether it's bootloader-unlocked. I would happily source a device from the manufacturer to avoid such nonsense if I could, but buying direct from the manufacturer is often hella expensive, and needs to be paid all at once. (That, and they usually only carry their newest models in each niche, so trying to snag an older model from them is usually a no-go.) I'd rather test phones I'm getting from the used market for their suitability before buying, but that can be difficult to do over the Internet, unless, of course, the seller knows what I'm asking for and can run those tests themselves and show me the results.]
There are two exceptions that I know of to the idea of making Linux easy to install and then just use, so long as you agree with the opinions of the distribution creators. The first is Gentoo, which, having now read the Wikipedia article on the distribution, seems to offer a few more pre-built ways into a system than it used to, which then give way to the way a Gentoo system really wants to work, and was previously always installed: by compiling everything from source, according to preferences set by the user, to ensure that the software they use is exactly the way they want it to be (and optimized for their hardware and use cases, so that it goes faster and potentially uses less RAM than on systems that have not been optimized). Even I, supposed computer-toucher and polymath-in-training, have never attempted to stand up a Gentoo system according to the official instructions and handbook. Just from how it is described, I feel like it offers a jam-choice problem to everyone who doesn't already know all the answers to the questions it will ask, and who therefore can't just set the flags and switches in the manner they desire and leave the machine to compile everything. (Plus, updating a Gentoo system has to take significant amounts of time to do all the compiling, so I would hope that the performance improvements more than make up for the increased time spent building all the packages from source.)
I have, however, tangled with the second exception to the rule, and stood up several Arch Linux systems using their official methods. Arch's Opinion on Linux is that they actively try to avoid having one, past making sure that packages are built according to their specifications and that they do not particularly care for large amounts of abstraction. Past the basics to get a system up and running, they have no defaults, no recommended packages, and no application suite or desktop environment that they install by default. There are now a couple of ways to go about setting up an Arch Linux system: one with a guided installation and one that follows the official installation process on the Arch Linux wiki. The Arch Linux wiki is the reason that Arch Linux isn't a niche distribution that only the hardest of hardcore users take on. The documentation on the wiki is excellent, if sometimes esoteric, and it is frequently more helpful than the forum users, who often demonstrate the stereotypical attitudes people have come to associate with Linux users: generally unhelpful until you do the exact thing they're demanding, at which point they may give you a curt answer with no explanation to help you understand. So, for someone who believes in their ability to follow a recipe, having a nice detailed recipe to follow, and to refer to when things get a little squirrely, is just the thing.
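To give a sense of what that official recipe asks of you, here is the core of it boiled down to a dry-run sketch. The helper only echoes the commands, since these steps only make sense run as root from the Arch live ISO; the disk device and filesystem choices are illustrative, not prescriptive, and the real guide has more steps (locale, bootloader, users) inside the chroot.

```shell
# Dry-run sketch of the core Arch Linux installation steps from the
# official guide. run() prints commands instead of executing them;
# /dev/sda and ext4 are example choices, not recommendations.
run() { echo "+ $*"; }

arch_install() {
    run sgdisk --zap-all /dev/sda                     # partition the target disk (destructive!)
    run mkfs.ext4 /dev/sda2                           # make a root filesystem on a partition
    run mount /dev/sda2 /mnt                          # mount it at the staging point
    run pacstrap -K /mnt base linux linux-firmware    # install the base system into /mnt
    run genfstab -U /mnt                              # emit fstab entries (redirect into /mnt/etc/fstab)
    run arch-chroot /mnt                              # chroot in to configure locale, bootloader, users
}
arch_install
```

Everything past `arch-chroot` is where the "no defaults" philosophy shows: the wiki hands you choices, not answers.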
Thankfully, despite the flaws of its users, Arch is a distribution that fits with how I want to work with systems. On other distributions, the developers and maintainers make easy the pathways they want users to take, and make very difficult the pathways a user might want that aren't what the maintainers want. And the update schedule for many distributions is slower than I would like. Debian does release new versions, but there's enough changed in the interim that I wouldn't be surprised if they recommend reinstalling rather than upgrading in place. Ubuntu releases every six months, and Mint follows that schedule. Arch (and Gentoo) and their derivatives are, instead, rolling-release distributions, where updates to packages are available immediately, rather than at specified release intervals. By remembering to update the system regularly, at intervals of your own choosing, you can fit a rolling release to your own schedule, rather than having to set aside time when someone else wants to update. And because an Arch install from something other than the guided script has very few decisions made for you, it's perfect for cobbling together all of your favorite programs in one distribution, never mind how their appearances might clash with each other, based on which toolkits they use for graphical styling.
Valve Corporation, in creating SteamOS, the operating system for their Steam Deck devices, took Arch Linux as the base and provided significant support to the distribution to ensure it could continue serving as that base. And, as they have done with many other things, Valve created a very nice wrapper around Arch so that they could run Steam in Big Picture Mode on the device, while still allowing people to use the Deck as a desktop-style device, or to game at higher resolution and power by hooking it up to a dock. Arch's aggressive opinions about keeping decisions in the hands of the user make it a great base for Valve to build upon, since they can choose what they want to apply, and only what they want to apply.
All of the pure Arch installs I've done so far have eventually wound up with certain kinds of problems that necessitated a reinstall or my choice to move the machine to another system. Usually it had to do with storage problems. On the one that had enough storage, the problem was essentially that the discrete graphics card in the machine was no longer supported by its maker's proprietary drivers, so the desktop environment choices were very limited, and even then, the machine was starting to struggle with doing all that many things. These problems could have been defeated in various clever ways, but eventually it was clear that they were only going to keep creeping up on me, no matter how often I beat them back. So, admittedly, having done the thing, I eventually abandoned doing the thing for a more opinionated setup that still runs the Arch base, but goes from there to provide some better quality-of-life features, and which has a gaming edition, which is what I wanted in the first place. I'm not sorry that I did the Arch-from-scratch approach; it introduced me to some neat tools that I can use in the future if I ever need to stand things up or use console commands to achieve various things, like starting up wireless Internet. And doing the whole thing from the command line meant that I got a little more confident in my ability to do things from the command line. (And, subsequently, to understand why there are so many warnings on the Internet about downloading code you haven't looked at and piping it directly into your shell, even though I probably wouldn't be able to examine such scripts for maliciousness before executing them anyway.)
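Even without the expertise to audit an installer script, the curl-pipe-to-shell pattern has a safer cousin: download to a file first, at least skim it, and pin a checksum so that later runs execute the same bytes you looked at. A small sketch of that habit, using a locally created file as a stand-in for the downloaded script:

```shell
# Sketch of the download-inspect-verify alternative to `curl ... | sh`.
# A local file stands in for the downloaded installer script.
printf '#!/bin/sh\necho "installer ran"\n' > installer.sh

# Step 1: read it (or at least skim it) before anything executes.
cat installer.sh

# Step 2: record the checksum of the copy you inspected...
expected=$(sha256sum installer.sh | cut -d' ' -f1)

# Step 3: ...and verify before every run, so a silently changed
# download gets caught instead of executed.
actual=$(sha256sum installer.sh | cut -d' ' -f1)
if [ "$actual" = "$expected" ]; then
    sh installer.sh
else
    echo "checksum mismatch; refusing to run" >&2
fi
```

It doesn't make you a security auditor, but it does turn "the server sends whatever it likes straight into my shell" into "I run exactly the bytes I saw."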
I don't have that kind of specialized knowledge, I operate firmly in user space, and so I do a lot of trusting that the people providing this software aren't doing it for malicious reasons, and that they haven't had someone introduce malicious code into their project, whether by force or by social engineering to get themselves attached to the project and then push malicious changes. That it's worked out marvelously so far is a testament to what people can do when they cooperate with each other and are able to use tools at their disposal to sign their work and make sure that it's trusted.
Installing Arch isn't quite like installing Linux on a dead badger; that would have been if I'd managed to get the Linux experiment of my undergraduate days up and running and doing what I wanted without a lot of frustration and aggravation. But it is something that rightly demonstrates at least my abilities to follow a recipe, and to troubleshoot when something other than the expected result comes out of it, so as to get things back on track and working again. This happens regularly, and in the distributions that I'm running now, the updater script they provide tells me when there are things that may need my attention, like new configuration files that I might have to tweak to make work again. It's good, it's powerful, I like the aesthetic of it, and the machines that need to run it can. (And I'm still taking care of all the rest of the fleet of devices as well, with their specific purposes in mind. Update day is usually an all-day affair for all of my items, but the nice thing about package managers and update scripts is that they do most of the work for me; I just have to run the commands and make sure everything is in order.)
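On Arch-based systems specifically, those "new configuration files that may need attention" usually show up as `.pacnew` files, which pacman leaves next to a config you've edited when a package ships a new default. A sketch of the update-day check, with the update itself only echoed (it needs root and a real Arch system) and a throwaway demo directory standing in for /etc:

```shell
# Update-day sketch for an Arch-derived system. The update command is
# echoed rather than executed, since it needs root on a real Arch box.
echo "+ pacman -Syu"

# pacman leaves new default configs as *.pacnew beside ones you've
# edited; finding them is the chore. A throwaway directory stands in
# for /etc so this sketch runs harmlessly anywhere.
mkdir -p demo-etc
touch demo-etc/pacman.conf.pacnew
find demo-etc -name '*.pacnew'
```

In practice, `pacdiff` from the pacman-contrib package walks you through merging each `.pacnew` against your edited version, which is less error-prone than doing it by eye.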
So, if there aren't reasons you have to stay on a particular operating system, maybe give a Linux a spin for a bit and see if you like it? Or give several of the Linux-type items a spin, and see if any of them appeal. Each time Microsoft decides a Windows version gets no further security updates, or Apple decides that certain computers no longer get macOS updates, or phone manufacturers decide they're done providing updates to their devices, there's an opportunity for a Linux to step in and keep the device going, so long as you can install it, and so long as you trust the community of developers interested in keeping that device going past the end of official support from the manufacturer. It's worked out pretty well for me since I switched to Linux as my primary driver of things, and now I can say that a lot of gaming is actually coming along nicely on Linux systems, so that's pretty cool.