A coworker was complaining about having upgraded to Vista to stay current with his software knowledge. The trouble (apart from upgrading to Vista) is that the rest of his home network is made up of computers running
- Windows XP
- Windows 2000
- Windows Server 2003
You might think that with such a diverse group of computing platforms, adding one more, a small evolution of the others no less, would be easy.
Not so. The solution he found (thanks to almighty Google) is pasted below. The long and short of it is to run a virtual machine with Windows XP inside Vista and then use that to Remote Desktop to the other computers.
In my case I had to use software provided by the client to connect; unfortunately, the software only works with Windows XP, not Vista. Rather than installing XP and going on a driver hunt, I decided to take the easy route and install XP in a virtual machine on top of Vista. So I created the virtual machine with a 10 GB virtual hard disk, 1 GB of RAM, and Network Address Translation (NAT).
Once XP was installed, I connected to the Internet, downloaded the appropriate service packs, and installed the client's communication package. No go, it wouldn't connect. After around six hours of poking around, reinstalling XP, and threatening the computer with physical harm, a light went on. Because I was using NAT, Vista's built-in firewall was blocking the traffic to and from the client's communication software; I could actually watch this happen.
The resolution turned out to be having the XP virtual machine share the host's network adapter rather than use NAT.
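(For anyone who would rather keep NAT, another option would be to punch a hole in Vista's firewall for the client software instead of switching the virtual machine to the shared adapter. The sketch below is a hypothetical config fragment, not the fix from the story: the rule name and the program path are placeholders you would replace with your own, and it has to run from an elevated command prompt.)

```shell
:: Hypothetical sketch: allow the client's communication software
:: through the Vista firewall instead of abandoning NAT.
:: "ClientComm" and the program path are placeholders.
netsh advfirewall firewall add rule name="ClientComm" dir=in action=allow program="C:\Program Files\Client\client.exe" enable=yes
netsh advfirewall firewall add rule name="ClientComm" dir=out action=allow program="C:\Program Files\Client\client.exe" enable=yes
```

Whether this works depends on how the NAT layer presents the VM's traffic to the host, so the shared-adapter route is the simpler bet.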
Is this true? Do you really need to run a virtual copy of Windows inside your native operating system just to perform the basics that other versions all agree on? This is absurd! If true, you might as well just use a Mac!