It’s remarkable how the world changes. Technology has evolved from green-screen dumb terminals wired to mainframes, through powerful desktops, and back to a centralised model where data centres are king again.
It makes sense to have a data centre; you can store major servers at a co-location facility with unbelievable bandwidth, multi-tiered redundant power supplies and regulated cooling. You can house massive multi-disk storage arrays and expensive tape auto-loaders.
Of course, the further your users are from their data, the more painful applications and data management become. Whether it’s those damnable, pesky Access databases that seem to pop up everywhere, your payroll application or anything else, you don’t want to make users suffer the frustration of dragging data across WAN links.
At the same time, you know that if you have local file servers at each office, people will inevitably create situations where the same file shares absolutely must be used by people elsewhere.
Ah, IT: a never-ending circle of balancing simple usability against security, performance, reliability and storage. But I digress ...
It’s no wonder Citrix and Terminal Services and competing products make inroads. For IT, it’s a walk in the park to manage network drive mappings, to keep users, apps and data working together and to remotely shadow sessions and resolve problems.
You can even virtualise servers to run 32-bit-only or legacy applications while maintaining just the one physical 64-bit machine in the rack.
Yet the server side is only half the story. It’s how you set up the user end that can often make or break contentment with such a centralised system.
In the past, some IT departments have taken the view that you can give people any old junk machine, and if it breaks it’s not important because you can replace the machine easily enough.
While there is a measure of persuasiveness here for the bean counters in your organisation, it can be a false economy.
Giving a user a PC, no matter if it’s a clunky ‘286, still means you must set up and maintain some local operating system on it, be it Microsoft Windows, Linux or something else.
You still need to deal with support issues relating to that machine. You still need to deal with users storing data locally.
Depending on how you set up the machine, your users may be required to go through two sets of login windows before they can perform any legitimate work – authenticating once on the local desktop then again on the terminal server. Really, that’s not an endearing or optimal experience for those forced to endure it.
So, that’s where thin clients come in, and where Dell’s plan to ship Linux as the chosen platform for their new OptiPlex range becomes interesting.

Like the dumb terminals of yore, the thin client is a specialised machine that exists to work in conjunction with a remote server. At a basic level, a thin client offers little itself but the capability of sending keystrokes and mouse movements and receiving screen refreshes back.
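The division of labour described here can be illustrated with a toy sketch. This is purely illustrative: the class and method names are invented for the example, and real remote-display protocols such as RDP or ICA are vastly more sophisticated. The point is simply that all state lives on the server, while the client only forwards input and displays whatever screen it is sent.

```python
# Toy model of the thin-client idea: the client holds no application
# state; it forwards input events to the server and displays whatever
# rendered "screen" comes back. All names here are hypothetical.

class TerminalServer:
    """Holds all application state; renders a text 'screen' per session."""
    def __init__(self):
        self.buffers = {}  # session id -> text typed so far

    def handle_event(self, session, keystroke):
        # Update server-side state, then send a screen refresh back.
        self.buffers[session] = self.buffers.get(session, "") + keystroke
        return self.render(session)

    def render(self, session):
        return f"[{session}] {self.buffers.get(session, '')}"


class ThinClient:
    """Stateless endpoint: sends keystrokes, displays refreshes."""
    def __init__(self, session, server):
        self.session, self.server = session, server
        self.screen = ""  # nothing stored locally but the current display

    def type(self, text):
        for ch in text:  # one event per keystroke
            self.screen = self.server.handle_event(self.session, ch)


server = TerminalServer()
client = ThinClient("desk-42", server)
client.type("payroll")
print(client.screen)  # the client never stored "payroll" itself
```

Lose the client and you lose nothing: the session buffer survives on the server, which is exactly why local storage and local support headaches disappear.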
But this means going back to something ‘computer-like’ rather than dumb-terminal-like on the desktop. A smart thin client, if you will.
You can find thin clients that run an embedded version of Windows XP. Correspondingly, the licence fee for Windows XP Embedded is whacked on to the cost price.
Here’s where Dell are being innovative. They’ve signed with Novell to use SUSE Linux Enterprise on their new OptiPlex FX160 thin clients.
This means users don’t have issues with local storage or multiple logins because they’re using a true thin client device, not a desktop-cum-dumb-terminal. At the same time, users have an optimised experience because they’re using a smart thin client.
And the cost of the units is driven down through the adoption of Linux as the embedded operating system.
IDC Worldwide produced a thin client forecast and analysis for 2008-2012 in June last year. It’s $10,000, but for that you get 27 pages of reading. That’s $370.37 per page. You might need to take my word for this, but IDC predict Linux will reach 30.5% market share of thin client operating system deployments by 2011.
There’s no reason it shouldn’t be higher; after all, even if you’re completely a Microsoft shop, it doesn’t really matter what runs on the thin clients. You don’t join a thin client to your domain, and its management is simple and straightforward.
Linux makes perfect sense for many things, and not least as the platform powering thin client machines. Linux is cheaper, faster and consumes less memory than that competing operating system.
Dell have recognised that centralised client/server computing is going to become bigger, and they’re setting a standard already by selecting SUSE Linux as the base upon which their product rests.