Proper Planning of Virtual Servers (Part 2)

Consolidating server hardware is usually an attempt to reduce the investment in hardware that would otherwise be required in a data centre. Where you have several processes that don’t consume many resources in themselves, it is always tempting to combine them to reduce your hardware outlay. But this presents a number of problems:

A planned interruption to apply a patch to one service may require an interruption to all services if a reboot is required. Not only are unneeded interruptions to services obviously undesirable, but schedules for planned work instantly become much more complex.

Change and release management become much more complex when several services run on the same system and possibly share the same system components (for example, the Java Runtime Environment or the .NET Framework). What do you do when you have two components on the same system using the same framework, with one validated for, and needing, an upgrade and the other lagging behind on that approval?

It should be clear that all of these scenarios require at least some planning, with the first one needing the least and the final one the most – exactly how much depends on the kind of deployment you ultimately sketch out.

Some parts of the planning are pretty much settled for you from the start. With RAM, the more the better, based on what the host platform can physically support and what the guests sharing it require. With the obvious caveat that you probably don’t need to assign 4GB to a guest that only runs a dedicated DNS server, RAM is always a no-brainer.

CPUs, again, are pretty much settled for you based on what the host will physically support and how your virtualisation software allows you to allocate physical CPUs to guest machines. This too is a no-brainer, if only because your options are limited to a few.

I personally suggest using at least a dual-processor machine for your Virtual Server host, and working from utilisation figures somewhere between “average” and “worst case” when calculating CPU loading for each guest, in order to minimise contention between virtual machines sharing the same CPU. For tasks that can be processor-intensive, I suggest planning on one physical CPU per virtual machine.
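
To put rough numbers on that, here’s a minimal back-of-the-envelope sketch in Python. The host specification, guest names and utilisation estimates are all invented for illustration – substitute your own measurements before drawing any conclusions.

    # Rough capacity check for a virtual server host.
    # All host and guest figures below are invented for illustration only.

    HOST_CPUS = 2          # physical CPUs in the host
    HOST_RAM_MB = 8192     # physical RAM in the host

    # For each guest: (average CPU use, worst-case CPU use, RAM assigned in MB).
    # CPU figures are fractions of one physical CPU.
    guests = {
        "dns01":      (0.05, 0.15,  512),
        "intranet01": (0.20, 0.60, 1024),
        "sql-test01": (0.30, 0.90, 2048),
    }

    # Work on a figure somewhere between average and worst case, not just the average.
    planned_cpu = sum((avg + worst) / 2 for avg, worst, _ in guests.values())
    planned_ram = sum(ram for _, _, ram in guests.values())

    print(f"Planned CPU load: {planned_cpu:.2f} of {HOST_CPUS} physical CPUs")
    print(f"Planned RAM:      {planned_ram} MB of {HOST_RAM_MB} MB (host OS not included)")

    if planned_cpu > HOST_CPUS * 0.75:
        print("Warning: little headroom left for contention between guests")
    if planned_ram > HOST_RAM_MB * 0.85:
        print("Warning: the host OS and virtualisation overhead need RAM too")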

An area where planning can really make a difference is the virtual disks. On a poorly put-together system it isn’t uncommon to see virtual disks from several guest systems thrown onto a single hard disk with little or no thought. This causes immense performance problems as the systems grow in size and complexity, because a large number of reads and writes from different processes all end up hitting the same hard disk.

There is a perception, perpetuated by some SAN sales teams, that SANs are magic. Just buy a few disks and throw your storage and backup needs onto them as you will, goes the mantra, and all will be fine because of our magic caching system. The same thinking seems to pervade virtual machine disk planning, and it is equally false wherever you hear it.

Actually, I think good virtual disk deployment calls for the same sorts of tools and thinking as a good SAN deployment. In both cases, the physical location of the stored data is “abstracted” away from the server using it in some way, and in both cases we will tend to see several different servers using the same storage hardware.

So, take the “virtual” (or “SAN”) out of the equation for the moment. What matters is the kind of server application you’re trying to deploy and the use it is going to get (testing, production, disaster recovery simulation, and so on).

You’ve got some data and you’ve got some disks, and now we’re just doing standard server deployment planning: there are performance and reliability advantages in placing different parts of a server application on different disks. If it makes sense to put the OS, the transaction logs and the main database files onto different physical disk spindles on a physical SQL or Exchange server, it makes just as much sense on a virtual server being deployed to replace that physical server.

So the basic rules might be: (note that most of these answers are suffixed with a silent “unless you’re using a test system where performance is not an issue”).

  • NEVER store virtual disks on the same spindles as the host operating system.
  • NEVER use software RAID on a host server!
  • NEVER use software RAID on a ‘production’ guest server.
  • Run – don’t walk – away from any vendor who suggests throwing it all onto the largest hardware / SAN RAID 5 that you can afford and letting God sort it out. They’re either incompetent or actively out to screw you.
  • Think very carefully before you place virtual disks from different guests on the same spindle. I don’t want to say “Never” because the actual impact depends on your pattern of use for these virtual disks. Again: Test, don’t assume!
  • When deploying virtual disks, the same rules apply as they would for the same process on physical disks. For example, do not allow a database store to use the same disk spindles as its transaction logs.
  • Consider how you will back up each server. Be especially wary of “magic” backup processes that work at the host server level, and test to ensure that you can restore your guest servers to working condition.
  • If two or more servers are designed to provide redundancy for each other (e.g. domain controllers, primary and secondary DNS, etc) then NEVER place them on the same host machine!
  • If copying virtual machine images to “rapid deploy” servers, then be very careful about things like duplicating SIDs. I know it’s boring and adds to the deployment time, but I still suggest using sysprep for these kinds of images.
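
To show the sort of checking I have in mind, here is a minimal sketch in Python that tests a completely hypothetical plan against two of the rules above: keeping a database store off the same spindles as its transaction logs, and keeping redundant partners off the same host. Every guest, spindle and host name here is made up purely for illustration.

    # Sanity-check a hypothetical virtual server plan against two of the rules above.
    # All guest, spindle and host names are invented for illustration.

    # Which physical spindle (or array) each virtual disk lives on.
    virtual_disks = [
        # (guest, purpose, spindle)
        ("sql01", "os",       "array1"),
        ("sql01", "database", "array2"),
        ("sql01", "logs",     "array2"),   # breaks the database/logs rule
        ("dc01",  "os",       "array1"),
    ]

    # Which host each guest runs on, and which guests provide redundancy for each other.
    guest_host = {"dc01": "vhost1", "dc02": "vhost1", "sql01": "vhost2"}
    redundant_pairs = [("dc01", "dc02")]   # e.g. two domain controllers

    # Rule: don't let a database store share spindles with its transaction logs.
    for guest in {g for g, _, _ in virtual_disks}:
        purposes_by_spindle = {}
        for g, purpose, spindle in virtual_disks:
            if g == guest:
                purposes_by_spindle.setdefault(spindle, set()).add(purpose)
        for spindle, purposes in purposes_by_spindle.items():
            if {"database", "logs"} <= purposes:
                print(f"{guest}: database and logs share {spindle} - separate them")

    # Rule: never place redundant partners on the same host machine.
    for a, b in redundant_pairs:
        if guest_host.get(a) == guest_host.get(b):
            print(f"{a} and {b} provide redundancy for each other but share {guest_host[a]}")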

Proper Planning of Virtual Servers (Part 1)

One of the more common areas of confusion with Virtual Servers is how to deploy them properly, to ensure good performance and reliability. Some people are scared of Virtual Server technology and refuse to believe it can ever perform well enough to justify the investment. Others see Virtualisation as the magic bullet and end up throwing lots of money at technology they don’t really understand, with disappointing results.

While the Microsoft Virtual Server product is quite new, virtualisation in general is a mature technology, with Virtual PC being a long-established product and VMware offering a large range of virtualisation products to fit most needs and budgets.

When deciding whether or not to use virtualisation in a data centre, you should first of all formulate a list of aims you expect your virtualisation project to achieve. This should be painfully obvious, but I’ve seen plenty of IT projects implemented because “it looked cool”, and those are usually the ones that end badly and cost way over their original budget.

One common aim is the migration of old legacy servers onto new hardware. This frequently means old “line of business” apps running on NT4 or old versions of Linux that cannot easily be upgraded to more modern operating systems, and where maintenance of the current hardware platform has become an issue.

These deployments are usually pretty painless because the Virtualisation process is simply providing a compatibility layer to allow an old OS to run on new hardware for which it would not normally have drivers.

These old server applications typically will not stretch the abilities of modern hardware, so you can probably stick two or three applications of this kind together on one system without too much thought and get away with it. The use of the word “probably” is important here. Test, don’t assume!

Where users want to upgrade a server in place, but feel a clean install of components is either “required” or at least preferred, then virtualisation can make sense to reduce the hardware requirements for the temporary server used to hold the data while the “proper” hardware is being upgraded.

This hasn’t really come into play a lot so far, but right now we’re seeing people buy “64 bit ready” hardware and running 32 bit versions of operating systems and applications on it while they wait for 64 bit versions of those apps to appear and grow in maturity. This is a common scenario for people who are using SQL Server or Exchange Server on Windows at the moment… 64 bit versions of Exchange are still in beta at the time of writing, and 64 bit versions of SQL Server 2005 are available but very new, and all good DB admins are cautious about doing too many new things at once to data they actually care about.

Once the tipping point arrives for these systems, it will not be possible to upgrade in place, and setting up a temporary system on a virtual server to move the “live” install onto while the proper host system is being rebuilt makes a lot of sense.

In a way, I see the performance issues for this situation as being quite similar to those you would encounter when moving a legacy system. You need to do some planning, of course, but you can probably get away with “throwing” the system onto your virtual server host any way you can, because with luck it won’t be there for very long. You can also limit the number of systems virtualised at any one time to whatever capacity your virtual host can cope with.

Online Backup Before You Defrag Your Computer (Part 2)

It seems to be some kind of universal rule that the moment I describe a post here as “Part One”, intending it to start a series, something happens to drag my attention away or make the issue irrelevant *sigh*. Still, after a long wait, this is part two of my comments and thoughts about disk fragmentation and the defraggers that just love to tame it.

In part one, I talked about how you can’t easily measure how well defraggers work against one another. Claus Valca wrote a nice article about defraggers over on Grand Stream Dreams, where he talks about ways the process can still be valuable and asks about ways of evaluating the different products. I can think of one way of comparing different products, but while it’s honest and fair in terms of what it tests, it probably doesn’t answer all the questions you should be asking on the subject. Still, it’s a start, right?

Just to make things clear, I’m using the definitions of File System Fragmentation and Defragmentation found on Wikipedia. I’m not claiming those definitions are perfect but they’re at least as good as any I could write myself and they already exist.

Pick a ‘neutral’ program to measure the level of fragmentation. For the purposes of this test, the results of analysing the disk with this program will be the only results that are used to compare one product to another.

Decide if you want to do a file-system ‘performance’ test. If you do, I suggest copying lots of files of varying sizes onto the disk or partition you are using for your tests. Try to make sure that the source medium you grab the files from is fast (you don’t want it to be the slowest point of the test, because you’re not testing the speed of the source) and performs consistently (because you want a fair test, right?). Yes, this will be quite difficult to achieve.
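
If you do go down that route, the measurement itself can stay very simple. The sketch below (Python; the source and destination paths are placeholders) just times a bulk copy – the important thing is to use exactly the same source files and the same destination disk for every product you compare.

    # Time a bulk file copy onto the test disk - a crude file-system "performance" test.
    # SOURCE_DIR and DEST_DIR are placeholders: point them at your fast, consistent
    # source medium and at the disk or partition under test.
    import shutil
    import time
    from pathlib import Path

    SOURCE_DIR = Path(r"D:\testfiles")   # fast, consistent source medium
    DEST_DIR = Path(r"E:\copytest")      # the disk or partition being tested

    DEST_DIR.mkdir(parents=True, exist_ok=True)

    start = time.perf_counter()
    total_bytes = 0
    for src in SOURCE_DIR.rglob("*"):
        if src.is_file():
            target = DEST_DIR / src.relative_to(SOURCE_DIR)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, target)
            total_bytes += src.stat().st_size
    elapsed = time.perf_counter() - start

    print(f"Copied {total_bytes / 1024**2:.1f} MB in {elapsed:.1f} s "
          f"({total_bytes / 1024**2 / elapsed:.1f} MB/s)")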

Some drive imaging programs allow you to take a “raw” backup image of a disk partition or even of an entire hard disk. I’m linking to Acronis True Image’s FAQ here as an example of a product that I’m comfortable with and which offers the facility, though there are other choices out there.

Using such an imaging system, you can create a backup of a test computer’s hard disk after you’ve set it up in such a way as to leave it fragmented enough for your tests; restoring that image before each run means every product starts from an identical, equally fragmented disk.

If you’re performing this test on Windows Vista, disable the built-in defragmentation routines NOW, before you do anything else.

For such a test I’d suggest using a relatively small hard disk and creating fragmentation by installing lots of programs and copying large files (got a couple of Linux ISOs doing nothing in your archives?) around the disk, then uninstalling some of the programs and deleting some of the images. Considering that NTFS tries to avoid disk fragmentation in the first place, you may actually find this to be the most difficult part.
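
If you would rather script that churn than do it by hand, something along these lines will do the job. It is a rough Python sketch – the path, file count and size range are placeholders to adjust for your test disk – that writes files of assorted sizes and then deletes a random selection of them, which tends to leave the free space nicely chopped up.

    # Churn a test disk to encourage fragmentation: write files of varying sizes,
    # then delete a random selection so the free space is left full of holes.
    # TEST_DIR, the file count and the size range are placeholders - adjust to taste.
    import os
    import random
    from pathlib import Path

    TEST_DIR = Path(r"E:\frag-test")
    TEST_DIR.mkdir(parents=True, exist_ok=True)

    random.seed(42)  # a fixed seed makes repeat runs comparable
    files = []
    for i in range(400):
        size = random.randint(64 * 1024, 8 * 1024 * 1024)  # 64 KB to 8 MB
        path = TEST_DIR / f"junk_{i:04d}.bin"
        path.write_bytes(os.urandom(size))
        files.append(path)

    # Delete roughly half the files at random, leaving gaps of assorted sizes.
    for path in random.sample(files, len(files) // 2):
        path.unlink()

    print(f"{len(list(TEST_DIR.iterdir()))} files remain in {TEST_DIR}")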

Online Backup Before You Defrag Your Computer

I can’t understand the fascination of most otherwise perfectly normal Windows users with the Defrag option in Windows. It’s reviled as not being powerful enough, it’s credited with all kinds of performance improvements, and it’s even suspected of concealing superpowers that fix all kinds of problems.

I don’t understand why. I can’t think of any other community of users who are so obsessed with how ‘fragmented‘ their computer hard disks are. And the problem only seems to be coming to a boil once more with Windows Vista, where Microsoft have taken a few hard decisions in their redesign and update of this operating system background task.

Before you do anything to your computer, I strongly suggest you back everything up online first. You never know what can happen when you defrag, as the Carbonite online backup expert at R-Fate.com puts it, and I can attest to that, having run into problems of my own in the past. I recommend online backup because it protects everything on your computer: the operating system, the installed software, and all the files you have created and stored.

External hard drives are not enough, because if some sort of disaster strikes your home or business, all your files are still on the premises and nothing is spared. The best approach is to have everything stored and backed up in a remote location. If you still haven’t installed any online backup software, Carbonite’s home plan lets you back up unlimited files, and its business plan covers an unlimited number of business computers.

Now back to the topic of defragging. How do you know whether your disk is fragmented in the first place? Obviously, by asking your defragger of choice whether it considers your hard disk to be fragmented. But what if you don’t trust your defragger for some reason? Most people would go and get a life, but for those of you in the grip of Windows disk fragmentation paranoia, that isn’t enough. What you must do instead is purchase several commercial defraggers and use them to test each other.

Then you get very unhappy because you can’t seem to get your disk defragged properly. You run one product and let it do what it wants; when it’s finished you test the result with another product, which finds some fragments, so you run that second program and test it with the first one, and you go around in circles until you give up and ask for help. Why? The more cynical people in the audience (or those who use other operating systems and haven’t contracted this paranoia) may be asking themselves just how much of a problem a fragmented hard disk really is, and whether these products are a prime example of snake oil.

Frankly, I’m not sure that the products are snake oil, but some of the hysteria surrounding the way these utilities are sold to ‘home user’ types does make me wonder. In some special cases, workstation drive fragmentation can be a real issue that needs to be carefully addressed, and servers should have all aspects of drive health checked on a regular basis, but for most home users the built in tools provided by Windows will be more than enough to keep things running well.

Take Your Home Computer Security Seriously

There are a lot of reasons why you should take the security of your computers at home seriously. There are various scams targeting home users, and plenty of real-world examples of people just like you and me who got caught out.
It’s your credit rating, your internet connection, your bank account, your reputation. Microsoft, Apple, or whatever flavour of Linux you use has to accept some responsibility for their mistakes, but sooner or later we also have to take some responsibility ourselves: keep our home systems patched, use the (often free) security software that’s available, and make an effort. Or, if all you want to do is play games, just trade your computer in for something more appropriate.

But it doesn’t happen to people like me anyway!

Nonsense. I’ve got a mailbox full of spam, phish and scams right here. It wouldn’t happen if there weren’t people falling for it. And – if it helps – I’ll admit I’ve fallen for a phish email before now.

But I’m insured. Sure – checked the terms lately? Sure they include being ripped off on the Internet? Anything in there about taking due care and attention / appropriate precautions?

But I don’t know how to protect myself! OK, this can be a fair point. So ask around; there are lots of places that offer help with securing your system. If you’re a home user, you can get free antivirus software from the likes of Grisoft and Alwil. You can get free spyware scanners like Ad-Aware or Spybot Search & Destroy, and you can get help in lots of places (like Microsoft Communities) with putting these together and keeping your computer working well.

And you thought it was just Microsoft products that needed to be patched.

Apple has announced the release of update 10.4.8 for 10.4 users and Security Update 2006-006 for 10.3 users.

I don’t normally bother posting such notices, but a couple of things here caught my eye, and I have to say that any OSX users reading this need to update to the appropriate patch level as a matter of extreme urgency.

Ouch. Seems like a big Phisherman’s friend to me. After all, you don’t need to supply a cert and the few users who know about SSL will be happy just to see the SSL lock appear. Hmmm. Any Mac using readers think their bank website looked a little odd when you were replying to their latest email?

It’s not all about Apple, though. It seems Microsoft have a few interesting problems of their own, not to mention being accused of handling them in an interesting way. If you’re an Apple MS Office user, note that the issues behind those links apply to you too, and act accordingly. It takes a special kind of mistake to exist in five different versions of a piece of software across two totally different platforms.

I really hope the NIST story about Microsoft’s handling of this is untrue, by the way. If it’s true, it’s a slap in the face for the AV community, which was in place long before Microsoft decided to muscle in. It’s a slap in the face for Windows users – aka Microsoft’s entire customer base – punishing them for not buying AV of unknown (at best) pedigree from Microsoft. And, let’s not forget, it undermines the security and reputation of Microsoft’s own platform.

It makes no sense for it to be true (which of course didn’t stop WGA being invented…).

Therefore it’s either not true (or at least not the whole story), or it’s an absolute disgrace, making any claims of “trustworthy computing” a joke and placing Microsoft’s approach to security only a few steps above Sony’s reputation for trustworthy music CDs that don’t try to root your computer every time you listen to them. By the way, I’m still not buying Sony since that event. Neither should you.