Tuesday, March 19, 2024

How to Avoid Breaking Ubuntu


Unless you’ve been granted magical powers, odds are you’ve broken your operating system installation at one point in your life. And despite Ubuntu’s stability, it’s entirely possible to break a fresh installation.

One of the things that always surprises me is how careless some folks can be when it comes to installing Ubuntu, or any distro for that matter. This carelessness shows up most often in newer users, but it trips up experienced users as well.

In this article, I’ll share my tips for keeping your Ubuntu installation from breaking – and why most people won’t heed my advice.

Prevention is best

It has been my experience that many people try to prevent a data-loss disaster simply by avoiding breaking their Ubuntu installation in the first place. Ubuntu, like any Linux distro, isn’t infallible. While you can often restore your system by fixing X, reverting to an older kernel, or simply uninstalling (purging) a problem program, sometimes an issue crops up that’s difficult to track down.

The best way to prevent breaking Ubuntu is to be proactive. For example, consider tracking your updates before you approve them – a habit plenty of Arch Linux users swear by. While it’s been a few years since Ubuntu created a major hassle for me with an update, it’s still sage advice to be aware of what you’re doing.

Updates – Keep track of new kernels and of software offered through PPAs (Personal Package Archives), and never install deb packages blindly from unchecked sources.

Don’t get me wrong, I love bleeding-edge software as much as the next user, but a user-supported software repository is something to use with caution. When you’re updating something, make a note of it. That way, if something goes wrong, you know which package to revert and/or file a bug against.
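For a concrete, minimal sketch of that tracking on Ubuntu (the kernel package below is only an example), you can list what an update will touch, read a package’s changelog before approving it, and lean on apt’s own history log afterward:

  apt list --upgradable                    # see exactly what is waiting to be upgraded
  apt-get changelog linux-image-generic    # read what changed in a package before upgrading it
  less /var/log/apt/history.log            # apt's record of installs, upgrades and removals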

Random bash scripts – I was guilty of this when I started using Linux. Years ago, I’d find random scripts and run them without a care in the world. Luckily I never ended up hosing my installation, but I did spot one years later that contained questionable commands that could have led to problems. Learn what a script does before blindly running it.
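One habit that covers most of this, assuming the script lives at some URL (the address and filename here are hypothetical): download it, read it, syntax-check it, and only then run it, rather than piping it straight into bash.

  wget -O setup.sh https://example.com/setup.sh   # hypothetical script location
  less setup.sh                                   # actually read what it is going to do
  bash -n setup.sh                                # syntax check only; nothing is executed
  bash setup.sh                                   # run it only once you're satisfied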

Losing your password – Don’t laugh, but losing your password happens entirely too often, usually with a shared or inherited computer. This one is difficult to recover from, especially with the adoption of encrypted file systems. If at all possible, keep a copy of your encryption passphrases in a safe place in case you ever need to recover data from an encrypted directory after a crash.
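As one example, if you used Ubuntu’s encrypted home option (ecryptfs), there is a recovery passphrase you can print out and store somewhere safe; for a LUKS-encrypted disk, a header backup serves a similar purpose. Both commands assume those specific setups, and the device name is a placeholder.

  ecryptfs-unwrap-passphrase ~/.ecryptfs/wrapped-passphrase   # prints the ecryptfs recovery passphrase
  sudo cryptsetup luksHeaderBackup /dev/sda3 --header-backup-file luks-header.img   # /dev/sda3 is an example device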

Not running a dedicated home partition – I’ve heard terrible advice recently suggesting that we don’t need a dedicated home partition any longer. Granted, it’s not a replacement for regular backups of your user data, but it’s a real time-saver when you need to reinstall Ubuntu. Yes, many distros can detect an existing home directory, but that doesn’t allow for much flexibility when distro hopping, as not all distros do a great job of home directory detection. The short version – this is a good idea, but it’s not mandatory.
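If you do go the dedicated route, it’s just one line in /etc/fstab pointing at the partition you set aside; the UUID and filesystem type below are placeholders you’d swap for your own (blkid will show them).

  sudo blkid                      # find the UUID of the partition you set aside for /home
  # example /etc/fstab entry; replace the UUID and filesystem type with your own:
  # UUID=1234abcd-0000-0000-0000-000000000000  /home  ext4  defaults  0  2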

Deleting your partitions accidentally – Luckily, partitioning software won’t let you touch a partition that’s still mounted, so a running Linux install is reasonably safe. Unfortunately, that doesn’t stop folks booting a GParted LiveCD from making some costly mistakes with their partitions.

Make sure you know which partition holds which system data. It’s very simple to spin up a local instance of GParted and take a screenshot (or a quick photo) of the various partitions and their mount points. Knowing this can spell the difference between wiping out your installation and successfully dual-booting with another operating system.
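Before touching anything, a couple of read-only commands will tell you which partition holds what; nothing here modifies the disk.

  lsblk -f        # devices, filesystems, labels and mount points at a glance
  findmnt /       # which partition the running system is mounted from
  findmnt /home   # and where home lives, if it is on its own partition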

Running out of drive space – I’ve had close calls here, not paying attention to how much data I had been accumulating. Granted, with a semi-modern PC this isn’t as big an issue as it once was, but it still happens, and it’s worth checking the free space on your system and home partitions once in a while.
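Checking takes seconds; both commands are read-only, and the home directory is just an example starting point.

  df -h                              # free space per filesystem, human-readable
  du -xh --max-depth=1 ~ | sort -h   # what is eating space under your home directory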

Video drivers – Usually an apt-get purge (or apt-get remove --purge) will cure any issues you run into with bad video drivers, even if it means starting over with Xorg and the rest of its comrades. With some proprietary drivers, though, I’ve found that no amount of purging will help in the rougher cases.
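For reference, the usual cleanup looks roughly like this on an NVIDIA system; the package pattern and PPA name are examples you’d adjust for your own hardware, and ppa-purge is worth knowing about since it reverts a PPA’s packages to the stock Ubuntu versions.

  sudo apt-get remove --purge '^nvidia-.*'             # example pattern for proprietary NVIDIA packages
  sudo apt-get install --reinstall xserver-xorg-core   # put the stock Xorg bits back
  sudo apt-get install ppa-purge
  sudo ppa-purge ppa:ubuntu-x-swat/x-updates           # example PPA; downgrades its packages to archive versions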

The most recent experience I had here was with proprietary drivers from the X-Swat PPA. Clearly, I made a mistake and installed a bad driver for my graphics card. Completely accepting the blame, I began a process I’ve done countless times in the past – purging all things Xorg and NVIDIA to get things back up and running.

You might be asking about the “bulletproof X” we’ve been told to use over the past year or so. In this situation, it wouldn’t have fixed anything. After hours of repeating X/video driver cures that have always worked in the past, I remembered having a similar issue years ago…with my video card.

Looking back, I suspect the driver had hooks into something I was overlooking, but it wasn’t worth the time to keep repeating “fixes” that clearly weren’t working. If you have a working video driver, leave it the heck alone! And unless you’re absolutely sure that video drivers from a PPA are going to fix something, avoid them!

Prepare for disaster

Despite all the tips I’ve provided above, I believe there is something to be said for properly backing up your data. It’s not merely a good idea; if you value your current installation and the information it contains, it’s mandatory.

Now, I should point out that these days I rarely bother backing up my applications. To be honest, there’s little reason to, since I see a reinstall as a great opportunity to purge unneeded apps. Those I do need can be reinstalled easily enough when the time comes.

What’s critical to save is the configuration data for those applications, most of which lives in your home directory. Simply saving the data in this directory will do more to get you back up and running than saving anything else. I compare it to keeping the media assets and database for a WordPress website – it’s that important.
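Most of that configuration lives in hidden files and directories at the top of your home directory. A quick way to see them, plus a hypothetical example of archiving a few of the usual suspects:

  ls -A ~ | grep '^\.'                                                 # the dotfiles and dot-directories holding your settings
  tar czf ~/dotfiles-backup.tar.gz -C ~ .config .local/share .bashrc   # example selection; adjust to taste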

So what’s the best way to protect these assets? Simple: find a means of backing things up that you’ll actually stick with. Personally, I recommend local backups of the entire home directory and more selective backups of anything destined for the cloud, my logic being both time and bandwidth. As for preferred services, there are countless ways to back up to Amazon’s cloud, Google, or even your own off-site servers.

Cloud options range from SpiderOak and JungleDisk to Ubuntu One (via Déjà Dup), CrashPlan (which handles local backup too), and perhaps the best of them all, Wuala. For good local backup, it’s really tough to beat rsync. If you’re looking for something easy, though, I’d point you to Déjà Dup.
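As a minimal rsync sketch (the destination is a placeholder for an external drive or second disk), this mirrors your home directory, including deletions, into the backup copy:

  rsync -a --delete ~/ /media/backup/home-backup/   # trailing slashes matter: the contents of ~ go into the target

Pair it with a cron job and you have a hands-off local backup.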

No matter how you look at it, though, avoiding disaster, and in turn a broken Ubuntu installation, begins and ends with you. Using the tips I’ve provided here, you’ll be in a good position to avoid major problems, and should something unexpected happen, I hope you’ve heeded my warning to back up today, not tomorrow.
