Patching Linux - Pain or Gain?

Linux Logo

For many of us, Linux has been a welcome presence in data centers large and small. What do you do, though, when you start getting into more than you think you can handle?


With the advent of ever-evolving technologies like SANs, virtualization and server consolidation, data centers are glowing with even more shimmering lights, and humming with the buzz of smaller form-factor stand-alone servers, farms of virtual machine servers and rows of blade centers. The ease of using server templates, cloning and automated installations has definitely had a great impact on the number of servers you end up managing today. What may have been a 10:1 server-to-technician ratio several years ago has changed, and enterprise-sized server farms of several hundred machines are now managed by just a handful of people. Simply put: if you can build it faster, better and cheaper, someone will take notice and expect more.

Along with the growing data center, you also have the rise of Linux as an enterprise-level operating system. As more tech houses use this constantly maturing operating system, they run into issues like support, hardware compatibility and finding ways to get more bang for their buck using open source components in their existing infrastructures. Now, aside from finding better ways to manage your hardware (tools, monitoring processes and other fun IT stuff), one of the biggest headaches IT has to face is keeping your machines up to date. Yes, we're talking about patching.

When it comes to patching, Microsoft has the edge by far. Regardless of the number of patches Microsoft puts out every year, being the popular operating system that it is, Windows gets pretty good support from the industry when it comes to facilitating patch management. Aside from using Microsoft Update to patch your machines, there are plenty of third-party tools that support the Windows operating system. For Linux, on the other hand, you'll only find a few third-party tools. You can use the built-in update processes that the OS has to offer, but they can be quite clumsy, especially if scheduling is required or if there are package dependencies to consider. The few third-party tools available can be rather limiting as well, since the majority only work with Red Hat. You also need to deal with a vast number of machines in your server farm. How can you manage large-scale patch deployments across thirty, sixty or even several hundred servers?
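For readers unfamiliar with the built-in mechanisms mentioned here, a minimal sketch of what they look like on the two big distribution families (these are the standard package-manager commands, run as root; both resolve package dependencies on their own, which is where manual patching used to hurt):

```shell
# Red Hat-family systems: apply all pending updates non-interactively,
# letting yum work out the package dependencies
yum -y update

# Debian-family systems: refresh the package lists, then upgrade everything
apt-get update && apt-get -y upgrade
```

Neither command addresses scheduling, staged rollouts or reporting on its own, which is where the headaches discussed below come in.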

In this article, I try to cover some of the basics of patching Linux using built-in mechanisms, what's available in the third-party tool market, and some of the obstacles I've run into when trying to manage a small to large data center full of Linux servers.

  • Darkk
    Very nice article. Patching is just a way of life for sysadmins everywhere, regardless of the flavor of server or desktop OS. At least the article explains in detail what to expect and the gotchas.

    Good job!

  • Anonymous
    Patching harder on Linux than Windows?!?
    Maybe I'm biased, but updating Debian or ArchLinux (more of a desktop distro) has been so easy I don't even think about it.
  • Anonymous
    You really don't know what you're talking about here, and readers should avoid this article.

    If you buy Red Hat Enterprise Linux with a Satellite subscription Red Hat does the patching.

    If you have Novell - ZenWorks will do the trick.

    If you're running a non-commercial, unsupported version, then sure, some of the options you mention might make sense, but a simple cron job with yum/apt will do it all with one command line.
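The one-line cron approach this commenter describes can be sketched like so; the 03:30 schedule and log path are arbitrary illustrations, not anything from the article:

```shell
# /etc/cron.d/nightly-update -- unattended nightly patching; keep the
# line matching your distro family and comment the other one out
30 3 * * * root /usr/bin/yum -y update >> /var/log/nightly-update.log 2>&1
# 30 3 * * * root /usr/bin/apt-get update && /usr/bin/apt-get -y upgrade >> /var/log/nightly-update.log 2>&1
```

Note that files in /etc/cron.d take a user field (here `root`) between the schedule and the command, unlike a per-user crontab.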
  • Anonymous
    Was this article written by Steve Ballmer? And interestingly, there is only a single line about the Debian-based distros? Why, Mr. Anderson, why didn't you mention the details about APT? Now Steve Ballmer, let me tell you something: my close friend is a sysadmin at my university (University of Toronto) and he doesn't even bother patching the systems because they are fully automated (over 200 machines). He also deploys 50 machines with brand-new OS installations with no more than 5 lines of commands.

    Sad Tom's.. sad.. you have been an amazing site once upon a time ...
  • Anonymous
    I'm a Windows guy most of the time but I enjoy playing with Linux from time to time.
    Actually, as a Linux beginner (some time ago) I had no problem patching my Linux.
    It was very easy...
    I do not remember reading or doing something special before patching it at that first time.
  • Anonymous
    Correction: Patch Quest by AdventNet was cited as patching only Red Hat, which is incorrect. It also patches Debian. In my experience, finding a patch solution for your particular OS has not been that terribly difficult. Finding one that has robust scheduling, push on demand, and can handle the multitude of necessary-evil apps, such as Adobe Reader, QuickTime, RealPlayer, instant messaging, etc., is the real challenge.
  • GoK
    The author of this article needs to go back through his information and edit this article. It is highly inaccurate! The fact that he gave Debian-based GNU/Linux flavors (i.e., Ubuntu and Gentoo) less time in the article than his praise for Microsoft's upstream ability for patches seems a bad sign for this article.

    Patching most GNU/Linux installs is a simple task, which is highly scalable, and which can be fully automated through the use of cron scheduling, etc. NO EXTRA SOFTWARE should be required to update/maintain ANY enterprise-level GNU/Linux server distro (also, if your server has a GUI on it, it's not running in an enterprise-level configuration).

    I find the mention of Windows Server strange in the article, since it can't run services like BIND9 (DNS); it only makes up roughly 38% of the current market share of net servers, and since it can't run BIND9, it runs NONE of the internet backbone (DNS routing servers).

    I am a huge fan of Tom's, but this article should never have been published.
  • nochternus
    While there are many Linux solutions, everybody will find what works best for them. I myself have become a fan of distributions like ArchLinux. I use it on my 3 servers at work and on my desktop and server at home. The package manager, pacman, is by far the best I've ever used. While it may not categorize some things into software groups, it does have things broken down into core, extra and then everything else. It is also extremely easy to configure and create wrappers or optional interfaces that utilize pacman (just like some of the others mentioned). There is also a package called the "Arch Build System" that allows you to create your own packages from source with simple modifications of a PKGBUILD file, making recompiling and rebuilding easy and efficient. My latest server was not fully supported by a vanilla or even a patched kernel, so a few quick modifications to the PKGBUILD and the kernel config and one command later, the package was compiled from source and installed without me sweating, swearing or crying.

    I don't want this to come off as a "YAY ARCH - EVERYBODY SWITCH" comment so much as a "do a little more research, or even a community probe, could get you better information" comment. The concept of the article wasn't bad, just slightly "mis-informative". Especially seeing as not everything that is open source and is an OS is Linux/Unix. Most are Linux-like or Unix-like (as is the nature of progression).

    As a note for the naysayers, I've used Windows Server, Debian, Gentoo, RedHat, SuSE, ubuntu, FreeBSD, OpenBSD, Solaris and many spin offs of some of those. All of them have their strengths and weaknesses (most notably the flaw of the Windows Server platform would be any machine that loads it - THAT is a biased opinion.)
  • malici0usc0de
    With Ubuntu you can also set it up to silently install them in the background; it just prompts for a password, then goes away. I don't know how long Ubuntu has had this, but I have been using it as my only OS at home for about 2 years now and have never had a problem with patches. I use XP at work, as almost everyone does, and I notice it operates almost exactly the same way, except it doesn't ask you for a password. So if it works for the less techie MS user base, then I don't see why so many problems are occurring with this same basic system running under Linux. sudo apt-get install brain
  • resistance
    The writer of this article has 0% knowledge of _present-day_ GNU/Linux, or this article was sponsored by a software monopolist.

    In Debian-based distros like *ubuntu, you can set up automatic daily updates without _any_ user intervention and without installing additional software.

    It's the first time I've seen such a badly written article on Tom's Hardware.
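For reference, the no-extra-software setup this commenter alludes to is typically APT's periodic/unattended-upgrades hooks on Debian/Ubuntu; a minimal sketch (the file name is the conventional one, the values are illustrative):

```
# /etc/apt/apt.conf.d/20auto-upgrades -- turn on daily automatic updates
APT::Periodic::Update-Package-Lists "1";   // refresh package lists daily
APT::Periodic::Unattended-Upgrade "1";     // install pending updates daily
```

With this in place, APT's daily job downloads fresh package lists and hands pending updates to the unattended-upgrade machinery, no third-party tool involved.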
  • Anonymous
    I have been using Fedora for years now, and the process of patching is really easy: either the system will patch and update everything for you if you have that enabled, or a couple of clicks or command lines will do the trick. Ubuntu is really easy too, and you don't have to restart if you don't want to; you can schedule restarts, disk management, clean-up of old files... everything. For me the process of patching GNU/Linux is by far easier than Windows: you don't have to spend days searching for updates of your non-Micro$$$ software by yourself, only to find that some app broke down. With GNU/Linux, you just update everything: system, kernel, software. Sometimes I even got updates for software that I had compiled myself.
  • Anonymous
    Hmm, what about Sun's UCE (Update Connection Enterprise)?

    What I find lacking in this article is the rollback possibility, which UCE has and I've used too. Disk space? It checks it before you do the test run? Yep, that's possible too. Oh, did I also say it uploads on the test run, so the final update doesn't depend on the network? Brilliant, if I may say so.

    Disclaimer: I am a Linux admin with small trips into Solaris land; I am not a Sun drone.
  • matobinder
    Windows/Microsloth Update is very nice for most users, but it just doesn't cover much else. What is great about yum and other tools is that they cover your compilers and even many games.

    However, both (L)inux and Windows updates get to be more of a pain for companies. Not just in downtime: you don't want to just drop in a new compiler, or anything for that matter. Care does need to be taken with those updates. That is one way Linux can be more of a pain. Companies generally use Windows for email, and probably Excel and Word, but, at least in my experience, all real work is done under some Unix/Linux distribution.
  • Anonymous
    I'm just as disappointed with this article as everyone else. To tell you the truth I expected more from a Toms article. This seems uninformed and appears that the author lacks a knowledge of the subject.
    Patching RHEL or SLES is as simple as using RHN/Satellite or ZENworks. The servers will very rarely require a reboot (unless it's a kernel update), unlike their Windows cousins.
    If we're talking about a production data-center network, as it seems here, and comparing like with like, it's only fair to compare enterprise Linux distributions with Windows Server. These distributions have been designed around the most stable code base, with supportability, such as stable patching and updating, in mind. This is why Red Hat releases periodic updates to their OS, much like MS releases service packs.
    It's not really fair to take a roll-your-own Linux distro and compare it to an OS like Windows Server, sold as an "enterprise operating system". The days of being on your own with package updates and having to manually recompile kernels are well and truly gone, unless you have some specific need or desire to do it.

    Unfortunately, it's articles like this, written by people who either have no Linux experience or have not taken a good look at enterprise Linux distributions for a long time, that get the eye of IT management and promote the misconception that Linux systems are somehow less stable and harder to administer than Windows systems.
  • Anonymous
    First off...thanks for reading the article. What started as a Windows vs. Linux piece actually morphed over time into something completely different and a little more focused.

    I thought I'd reply to some of the comments...

    I'd like to see how many machines people are managing, especially those whose cousin's-buddy's-cousin Homer manages. If you work in an enterprise-sized environment, then you'd probably appreciate this article as these are issues I run into quite often.

    To all you Linux haters and Windows haters: I never really understood why folks can be so one-sided. One OS in a large environment will never be the answer. There are so many factors that will determine what OS you end up using, so why not use them both (and throw in a mainframe and some Sun while you're at it)!?!

    Really, if Windows and Linux had a kid, poor little WinNix would never have any friends.

    The details on patching Debian distros may be scant, but I felt I had to mention Ubuntu because of its growing popularity. Sorry, I couldn't get too into it, but my bigger focus is with SuSE and Red Hat.

    I'm just trying to cover the basic issues and techniques used to patch Linux. If there was more time, I would have gotten more into ZENworks and RHEL. Either way, I love to see how people oversimplify patching servers without mentioning what they're managing. I find it hard to believe that these folks run more than a handful of machines and haven't run into any of these problems.

    Other than mentioning that there are a lot of patching applications out there to run in your Windows environment, a lot of the hassles you run into patching Linux apply to Microsoft as well. No system is really better than the other, because it's all about how YOU manage IT.

    One point of the article is that it's tougher to find something to help you manage your Linux patches. If you've got a nice-sized budget, then get ZENworks or buy a subscription. If you don't, then you'll need an alternative.

    Thanks for the mention of PatchQuest. I'll check my sources (still, with a user base as big as Novell/SuSE's, why would a vendor not support a marketed distro and support Debian instead? Yeah, loaded question. That's how I roll).

    ...and finally, a reboot is a reboot. Sure, it may not happen as much with Linux, but it still does, and in a lot of cases you still need to plan for it.

    Well, that's it for now. I hope you appreciate it for what it basically is and not what you think it should be. Keep it positive and throw in any advice that would benefit other readers' experience with Linux. I'm sure they'd appreciate it.

    Signed... not Steve Ballmer.
  • Anonymous
    I use Ubuntu and I use Windows. The update systems are so similar there is really no reason to write an article about the differences.

    Well... Only if you're going to talk about updating the Operating System.

    You see, if you run Ubuntu and use the SPM to install MySQL or PostgreSQL or OpenOffice or your music player or video player or email reader or yada yada yada, the update process can/will update all of those things for you, automatically.

    Microsoft only updates Microsoft. Ubuntu updates the World!

    And it's coming Stevie B.

    One odd animal inspired version at a time.

    It's in your city. Hell, it's on your street.

    Oh my God, Mr. Ballmer. We've traced the Ubuntu update request. It's coming from inside your house!
  • Anonymous
    I code PHP, and I see that in my company the servers are updated manually, especially PHP. A seemingly not-worth-my-time upgrade from PHP 5.1.x to 5.2.x can turn a happy client angry, or really pissed off. Production servers get 1k hits a day, so downtime does matter. Point is, there are updates you must micromanage if you are in a commercial/business environment.
    And is it really worth updating to every new software version as soon as it gets stable? Nope.
    If you notice, major hosting firms still have six-year-old PHP 4 at your disposal, probably still running kernel 2.4.x. Why? Because it works. And if it ain't broken, don't fix it.
    It's just business: if it's critical, we'll patch it. If a new version runs 2% faster, it's not worth it.
    Somewhere in the real world there is no space for automatic updates...
  • hergieburbur
    Wow, just wow. No way is patching in Linux, with the possible exception of scheduling, more difficult than in Windows. It generally takes me 2-3 clicks, with no annoying reminders to restart every 5 minutes, and is completely unintrusive. Nice try, but this article fails.
  • Garzan
    Okay, I read the entire article. My environment is almost totally RHEL4, with my New Year's goal of being ~totally~ RHEL4. My comments are Red Hat-centric, though they apply equally to CentOS or Scientific Linux.

    You spent a lot of ink talking about dependency problems, but you failed to make the point that using up2date, yum or apt solves those dependency problems for you. Using current tools resolves those past shortcomings.

    A reboot may be a reboot, but I think you oversimplified. Unless you're doing a kernel update, reboots are generally not required. And even with a kernel update, you reboot when you want to start using the new kernel. Until then, your box will happily continue with the currently loaded kernel for as long as needed. Updating processes only require restarting that process, not the entire box.
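The commenter's point can be made concrete: a pending kernel reboot is just a mismatch between the running kernel and the newest installed one. A minimal sketch, with versions hardcoded for illustration; on a live Red Hat-style box you would derive them from `uname -r` and `rpm -q --last kernel` instead:

```shell
# Decide whether a reboot is pending after a kernel update.
running="2.6.18-8.el5"      # on a live system: running=$(uname -r)
newest="2.6.18-53.el5"      # on a live system: parse `rpm -q --last kernel`
if [ "$running" != "$newest" ]; then
    echo "kernel updated: reboot when convenient to load $newest"
else
    echo "running kernel $running is current: no reboot needed"
fi
```

Until that reboot, the box keeps running the old kernel happily; only the eventual switch-over needs to be scheduled.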

    I think you make a good point about creating your own mirror. I have one box where I've run the "up2date-config" command to modify it from the default of erasing patches after install to keeping patches after install. On that box, I have a cron job that runs "createrepo" to 'repo-ize' the store/patch directory. All my other machines are set to update off that machine at LAN speed rather than WAN speed. Changing where up2date looks for patches is as simple as editing /etc/sysconfig/rhn/sources. Changing where yum looks is as simple as editing the baseurl line in /etc/yum.repos.d/<your>.repo.
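Garzan's local-mirror setup can be sketched in two pieces; the paths and the host name `patchhost` are hypothetical stand-ins for his actual environment:

```shell
#!/bin/sh
# /etc/cron.daily/repo-ize (hypothetical path) -- regenerate repository
# metadata over the kept patch directory so LAN clients can yum from it
createrepo /var/spool/up2date

# Each client then points yum at the mirror via /etc/yum.repos.d/local.repo:
#   [local-updates]
#   name=LAN patch mirror
#   baseurl=http://patchhost/up2date
#   gpgcheck=1
```

The payoff is exactly the one he describes: every machine after the first pulls patches at LAN speed instead of WAN speed.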

    And if you really want some nitty-gritty on patching, my biased opinion is to read "Linux Patch Management" by Michael Jang ISBN 0-13-236674-4. I don't agree with everything he says either, but he's got the luxury of using 260+ pages to supply much more depth.
  • Anonymous
    Hi all, it's been a really good and enjoyable moment reading the discussion above; I liked it and learned a lot. I have a query for all of you: do you guys really believe in auto-patching, or patching from a third-party source server? Don't be confused; I mean yum upgrade from a primary yum server.

    So here's the query: say my server is RHEL 5.5 and I upgrade all my packages, so everything is now at, say, 5.7. Suppose I have an application installed that is only compatible with MySQL 5.3; after the upgrade that changed to 5.7, so my application stopped working properly, and in that case I'd need to upgrade my application too. That's okay, I'll do that, no issue, but it's why I won't recommend auto-upgrade.

    The point is just that this case applies to both Windows and Linux. Patching (upgrading) is no big deal in the current IT market, but you need to analyze when it is recommended to automate and where it should stay manual. The writer of the above article has done a great job; I really salute him.

    > Shirish