Category Archives: IT

Contains Everything IT Related

Getting Powershell Management Library for Hyper-V (PSHyper-V) up and running

One of the great capabilities of Powershell is the ability to extend its functionality through new libraries. One I am currently playing with is the Powershell Management Library for Hyper-V, or PSHyper-V. For Server Core and Hyper-V Server users, the cmdlets contained within this library add a new dimension of functionality, and enable admins to reduce their reliance on Hyper-V Manager to perform otherwise simple tasks.

Although you can load the library on demand, there is another method available whereby it may be preloaded as part of your Windows profile.

  1. Download the latest recommended release of PSHyper-V from
  2. Copy the contents of the ZIP archive onto your Hyper-V server. In my instance, I copied these to a folder called c:\powershell
  3. To get your profile path, type the following from within a Powershell prompt:

    $profile

    You will typically get something like:

    C:\Users\<username>\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1

  4. The file Microsoft.PowerShell_profile.ps1 is the Powershell script that is executed upon the startup of any Powershell prompt for your user account. In most circumstances, this script doesn’t exist, so you will need to create it (it is worth checking first). To create the script, enter the following into a Powershell prompt

    new-item $profile -itemtype file -force

  5. Now edit this file and add the path to PSHyper-V.ps1 into it. In my example, this is as follows:

    . c:\powershell\PSHyper-V.ps1

    Note the dot-space prior to the file path. This is required to execute the script.
    (Editing this file on Server Core or Hyper-V Server can be a bit of a trial. In the end, I did it remotely.)

  6. That’s it. To test it, start a new Powershell prompt. If all is well, a list of the loaded PSHyper-V cmdlets will be shown.
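Steps 4 and 5 above can be condensed into a short sketch run from a Powershell prompt (this assumes the c:\powershell folder from step 2; adjust the path to suit your own install):

```powershell
# Create the profile script if it does not already exist,
# then append the dot-source line that loads PSHyper-V
if (-not (Test-Path $profile)) { New-Item $profile -ItemType File -Force }
Add-Content $profile '. c:\powershell\PSHyper-V.ps1'
```

As before, the leading dot-space in the appended line is what causes the script to be executed in the current scope at startup.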

Thanks to the author of the Technet blog post from which I have sourced most of the information for this.

Virtual Server to Hyper-V: Parallel port driver service failed to start error

I’m currently migrating a substantial number of virtual machines (VMs) from Virtual Server to Hyper-V server.  Although this is a fairly painless process, a common error for Windows VMs relates to the parallel port service.  The system eventlog error goes something like:

The parallel port driver service failed to start due to the following error:  The service cannot be started, either because it was disabled or because it has no enabled devices associated with it.

The reason for this is quite simple:  Virtual Server supported parallel ports, whilst Hyper-V doesn’t.  Typically, any basic VM created through VS will incorporate a virtual parallel port, even once the Hyper-V integration components included in the new HAL are installed.

There are two options here: you can either manually hack your way through Windows removing all traces of the parallel port, or you can simply disable the parallel port service.  The latter is somewhat easier, and may be accomplished in a matter of minutes:

  1. Backup your registry
  2. Open Regedit
  3. Go to HKLM > System > CurrentControlSet > Services > ParPort
  4. Change the Start value to 4 (4 = Disabled).
  5. Restart your server – optional, but I do this to confirm this has worked.

That’s it.  Upon the next restart, you should not get any service alert for the parallel port.
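The same registry change can also be made from an elevated command prompt; a sketch, assuming the default Parport service key shown in step 3 (back up your registry first, as per step 1):

```batch
REM Start = 4 marks the parallel port driver service as Disabled
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Parport" /v Start /t REG_DWORD /d 4 /f
```

This is handy if you are scripting the change across a number of migrated VMs rather than editing each one by hand.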

Getting a list of network interfaces in Linux

For those of us used to Windows management tools, getting a comprehensive list of data regarding installed hardware on a Linux box can be a little daunting.  Whilst recently migrating a virtualised Ubuntu box, I needed to find out just what network hardware was in use.

Within a Linux shell, type the following:

lshw -class network

This will produce a full list of all installed network hardware.  For newbies, the field you are probably looking for is the logical name.
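If lshw happens not to be installed, a quick alternative is to read the kernel’s own list of network devices, which gives you just the logical names (without the hardware detail lshw reports):

```shell
# Each entry under /sys/class/net is a logical network interface name
ls /sys/class/net
```

On a typical box this prints entries such as lo and eth0, one name per directory entry.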

Manually Setting DNS Server Addresses in Ubuntu (Linux)

No matter how many times I have installed and configured Linux, I can never remember the name of the configuration file that stores the DNS/Nameserver details.  This really only applies if your Linux machine is using a static IP address.  In most scenarios, it does not apply to DHCP clients.

DNS server settings are stored in /etc/resolv.conf.  To edit this file, enter the following command from the shell:
sudo nano /etc/resolv.conf
(If you have installed XWindows/Gnome, you can use sudo gedit /etc/resolv.conf instead)

Add the entries for your DNS or nameservers as follows

nameserver <IP address of DNS server 1>
nameserver <IP address of DNS server 2>
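For example, a resolv.conf pointing at two hypothetical DNS servers on a local network would read:

```
nameserver 192.168.1.1
nameserver 192.168.1.2
```

Substitute the addresses of your own DNS or nameservers; the entries are queried in the order listed.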



I’ve been a fan of virtualization for a while and have played with and extensively evaluated a number of different solutions over the last couple of years, in both the desktop and server arenas.  A few early incarnations did leave a lot to be desired in terms of overall functionality and reliability.  For the most part, however, they have all now become viable solutions.

So what is Virtualization? Well, there are already reams and reams of articles, whitepapers and blog entries out there explaining this.  Inevitably some are very good, whilst others are biased towards one solution or another.  Some of the latter can descend almost into pure vitriol (VMWare & Microsoft blogs?).  The water is muddied somewhat further by the almost inevitable multiple usages and definitions of the term Virtualization.  Wikipedia has an entire page listing the varying types of Virtualization available.

From my perspective, the best way to describe and explain virtualization is by the typical end-product: the more efficient and manageable usage of IT resources.  There are lots of other advantages – hardware independence (aka separating runtime code from physical hardware), perceived increased resilience, snapshots, easier backups – but what normally makes the case is greater efficiency and utilisation of IT resources.  In other words, the dreaded Return on Investment (ROI).

Consider this:  one of the most common forms of Virtualization is Platform Virtualization, where the operating system is separated from the hardware upon which it is running. Instead of adopting the traditional route of installing an operating system directly onto the computer’s hard disk, it is installed into a software container known as a Virtual Machine (VM).  This virtual machine is hosted by a piece of software called a Hypervisor.  The Hypervisor sits between the VM and the physical hardware of the host computer.  Instead of allowing the VM direct access to the host computer’s hardware, it provides a virtual hardware infrastructure upon which the Virtual Machine runs.  You are not limited to one Virtual Machine per Hypervisor.  In Platform Virtualization, you may have several Virtual Machines all running concurrently.

So why is this of any use?

I’ve had this question quite a few times, and this is the best explanation I have come up with so far for Platform Virtualization.  It is a bit vanilla in nature, but I feel that it is a good general explanation:

Organisation A has three identical servers, none of which ever utilises more than 30% of its total resources.  In essence, 70% of the capacity of each server is wasted.  With Virtualization, all three servers could be converted to three Virtual Machines and then hosted on one physical server.  Doing this will save the organisation both money and space:  they will only be paying for the running costs of one physical server instead of three, and they will only require the space of one.

I use Platform Virtualization extensively.  Typically I have 3 or 4 virtual machines running at any one time, two of which run permanently. Whilst the majority are for testing, the latter two are crucial to my day-to-day operations as one is my Spam Filter and the other is a monitoring server.

As both a developer and a sysadmin, Virtualization has made the process of testing and evaluation a whole lot easier.  If I look back to the heady days of 2000/2001, the company I was then working for maintained an extensive suite of computers of varying vintages running a multitude of operating systems (side note:  I’m still trying to work out why we were testing a multimedia CD on a Sun Sparc).  When you figure in the space required, power usage and time required to manage and maintain such a setup, the costs do start to add up.  For me, the arrival of virtualization has all but nullified this requirement.  Instead of having a stack of PCs lying around, I now have a stack of Virtual Machines.

What do I use?

As I mentioned above, I have evaluated a substantial number of solutions, including, amongst others, VMWare Server, Microsoft Virtual Server and Sun VirtualBox.  After going through all of these, and taking into consideration my internal requirements – I’m not a datafarm, remember – I have chosen to use Microsoft Hyper-V Server.

My choice of Hyper-V is not because I am some sort of evangelical Microsoft user.  Truth be told, whilst I tend to use MS products most of the time, my operating decisions are based on the sound engineering principle of using the right tool for the job in hand.  Consequently, two of my production Virtual Machines are running Ubuntu Linux, not Microsoft Windows.  I would be lying if I didn’t say that cost – Hyper-V is free – was a factor, but at the end of the day I cannot justify the capital expenditure of VMWare’s equivalent offering for my own uses.

To date, I am very impressed with Hyper-V, even more so considering I am running the Release Candidate of Hyper-V R2.  It has been rock-solid in terms of reliability.  The only problems I have really experienced have been with Microsoft Hyper-V Manager, but these were more down to Windows’ security systems than anything else.  It’s not a perfect solution – there is no local GUI, so all management has to be done via a remote tool (Hyper-V Manager) or via the command line – but for the price and feature set, I’m not complaining.

As I expand my usage of Hyper-V, I will post further details of what happens especially with regards to Linux Virtual Machines and ongoing management.

Converted to Windows 7. Nearly.

Over the past decade I have played with any number of operating systems in various states of completeness.  Windows, Unix, Linux, Mac and even once Solaris (I was really bored).  In very few cases I was surprised.  In most cases – especially with Betas – I was left with the realisation that several hours of work had resulted in a useless PC.

One golden rule that I had enforced throughout was that I would never perform any of these bash’n’crash installations on my principal computer.  This is a rule that I followed solemnly until the 5th May 2009 when I installed Windows 7 RC on my stalwart Dell laptop.

This was not an off-the-cuff decision.  I had been running Windows 7 Beta on another machine for some time, and unlike Vista on my Dell, it hadn’t crashed no matter what I threw at it.  Furthermore, I had gotten increasingly frustrated with Windows Vista, an operating system I had previously liked.  Dire file copying performance, intermittent video driver issues and an overall lack of general responsiveness had left me with two options: reinstall Vista or do something else.  Do Something Else won.

Installation was a doddle, with all the drivers being automatically installed bar one.  It even found a video driver that was better than the nVidia one I had previously been using.

Two weeks in, and Windows 7 has behaved flawlessly.  It is fluid, fast and above all stable.  No video problems.  Files move to and from my servers without any issues.

Now I know that I am currently in the honeymoon period.  Sooner or later, the inevitable WinRot will set in (where Windows progressively gums itself up), but Windows 7 is currently ticking all the right boxes.  It has taken Microsoft far too long, but they appear to finally have produced a version of Windows for the 21st Century.

.Net Campaign to kill IE6

I came upon this little gem when I was trawling around for something completely unrelated – as you do.  .Net magazine has instigated an online campaign to kill Internet Explorer 6.

I’m always a little wary of online campaigns.  Perhaps I’m overly paranoid, but I am inherently suspicious of any campaign or venture claiming to be beneficial that looks a little too slick and well organised for its own good.  The campaign website is a custom product with slick, targeted design and content.  Then again, you would expect any website directly associated with .net to incorporate a high design element.

I approach IE6 with two different perspectives, and unusually, they nearly cancel each other out.

From my web developer’s perspective, IE6 has always been a complete pain.  Its complete indifference to standards and the arbitrary way in which it renders HTML/CSS have infuriated developers and had a significant impact on development schedules.  From a purely web perspective and with the benefit of hindsight, IE6 significantly stagnated web development.

From the other perspective, that of the IT Support Guy, it is simply not practical to kill off IE6.  Firstly, there is still a substantial user base out there.  A significant number of businesses still run Windows 2000 on a high proportion of their computers, and their IT departments are unlikely to wish to migrate across to a non-Microsoft browser.  This is certainly true at the SME end of the market, where computers tend not to be replaced as part of a defined replacement program, but as and when a computer fails beyond economic repair.

Is Microsoft likely to release a version of IE7 for Windows 2000?  Not likely.  In the eyes of Microsoft, Windows 2000 is a dead operating system.  It has been superseded not once, but twice.  The only option here is to upgrade or replace any Windows 2000 PC.  Not a practical or viable option in the SME arena; a computer works until it drops dead.

One suggestion has been that Windows 2000 users simply migrate to another browser like Mozilla Firefox or Opera.  For the casual home user, this is eminently doable, but for business users, things are not so clear cut.

There are some applications that require Internet Explorer to work, or utilise it within one of their own components.  I know of one Oracle/Java web application that would only work properly in Internet Explorer – kind of curious since it was Java based.  There is also a high market penetration of web-based applications that exploit the integrated security model within Internet Explorer, Sharepoint being an ideal example.

IE6 is also remotely and centrally manageable via Group Policies.  In any corporate network, this is a valuable management function and something that is still not available in IE’s competitors.  Its updates may also be centrally managed (WSUS) and monitored.  IE, whatever its incarnation and foibles, is an enterprise-level web browser.

Internet Explorer has also moved on.  We are now at version 8, which is a marked improvement over IE7 and an almost quantum leap up from IE6.  Sadly, though, for all its improvements IE8 maintains the Microsoft tradition of its own unique interpretation of W3C standards.

I am no fan of IE6, and in most circumstances I would like to see it gone, but the pragmatist in me says this is wishful thinking.  This campaign, whilst admirable in intent, will not get rid of it no matter who is involved or how many.  There are simply too many economic and operational constraints involved.  IE6 will only be dead and forgotten when Windows 2000 is, and given its longevity that may be for some time yet.

Internet Explorer 8

Internet Exploder Explorer 8 has finally arrived, and unlike the Beta version, it hasn’t crashed my computer.  Yet.

Even 5 years ago, the arrival of a new browser was a big event.  Nowadays, with the presence of Firefox, Safari and Chrome, its release is a little bit of an anticlimax.  One hopes that IE8 will be the browser equivalent of Windows 7: good software as it should have been in the first place.  I’ve had too many “interesting” experiences with IE7 over the last couple of years.

So why the fuss?  Well, I’m not going to repeat or perform an in-depth review of IE8.  There are plenty of other websites out there doing that, and I simply haven’t spent that much time with it yet.  I’m looking forward to IE8 as it finally promises a browser from Microsoft that is in some way consistent and compliant with various web standards.  As many web designers/developers out there will tell you, the CSS rendering in IE7 and earlier versions is either fundamentally broken, wrong or just plain inconsistent.  I have spent far too much time making websites IE friendly, with the resultant CSS stylesheets being a discordant mess.

This is not to say that IE is the only perpetrator out there.  My favoured browser, Mozilla Firefox, still doesn’t pass the ACID3 test.  Neither does Google Chrome.  I haven’t tested Safari, as my patience with Apple software evaporated many moons ago.  I can honestly say that I haven’t used Opera in many years.

Consistency has actually improved over the past decade though.  I can remember back to 2000/1 when the standard practice at the company I was then working for was to develop separate stylesheets for each browser.  Each website would programmatically detect the user’s browser and return the corresponding stylesheet.  Thankfully, things have moved on.  Back then, we only realistically worried about Internet Explorer and Netscape Navigator (remember that?).  Now we have a diverse range of browsers, and importantly, host operating systems and hardware platforms.

So, back to Internet Explorer 8.  I will continue to play with it over the next couple of weeks, and I will post an update with my various comments and opinions in the near future.  Will it ever replace Firefox?  No.  I am currently well and truly sold on the sheer extensibility of Firefox.  I would be completely lost without extensions like PDF Download and Download StatusBar.  If I’m honest, usage of IE8 will probably be restricted to Windows Update and the occasional download from Microsoft that requires a Windows Genuine Authentication check.

Toolkit: Virtual CloneDrive

As someone who uses ISO files on an almost daily basis, I find SlySoft Virtual CloneDrive a brilliant utility that allows the easy mounting of any ISO file as a local disk drive.  I’ve used several similar utilities over the years, but this one is the best yet.  It has been installed on a variety of PCs, including my venerable Thinkpad, and I have yet to experience any problems.

Virtual CloneDrive is a free download available through the SlySoft website at

As usual, you use Virtual CloneDrive at entirely your own risk.

SBS 2003 / Exchange 2003 Shutdown Script

Microsoft is quite clear about Microsoft Exchange and Active Directory being installed on the same server: don’t do it, it isn’t supported.  The only exception to this rule is Small Business Server, where out of necessity you will find both running side by side.

Unfortunately, Microsoft didn’t feel the need to address one of the problems this creates in SBS: the shutdown hang.  There is a good chance that if you go Start | Shutdown on an SBS server without any prior actions, the entire shut down process may take anything between 15 minutes and infinity to complete.

There is an easy way to prevent this from happening: shut down all Exchange services prior to running Start | Shut Down.  This is easy enough to do via a good old batch script, a copy of which is enclosed below.

To use, simply run the batch script prior to restarting or shutting down your SBS 2003 server.

ECHO Stopping Exchange Services
net stop MSExchangeES
net stop IMAP4Svc
net stop POP3Svc
net stop RESvc
net stop MSExchangeSRS
net stop MSExchangeMGMT
net stop MSExchangeMTA
net stop MSExchangeIS   /Y
net stop MSExchangeSA   /Y
ECHO Services Stopped