Thursday, April 17, 2008

Installing VMware Server on Ubuntu 7.10

If you're thinking about virtualizing your IT environment but you don't necessarily have a big budget to do it with, here are a few tips that may help you on your way.

Most of us already know that VMware offers VMware Server for free. This, coupled with some great open-source operating systems, can make for some pretty efficient hypervisors.

I recently had to install a few VMware Server hosts using Ubuntu 7.10 Gutsy as the base host system. We chose Ubuntu because the distribution CD is relatively small (~510 MB), it is simple to install, and it provides the basic functionality needed. Furthermore, it uses the Debian packaging system, which means you have access to hundreds of additional packages (such as webmin) - should you choose to install them.

The process is pretty simple:

Step 1: Install your new system using the Ubuntu Server CD.

Step 2: Answer the installer's normal set of questions.

The actual Ubuntu installation procedure is a bit beyond the scope of this post, but suffice it to say that you should provide a nice big partition where your VMs will reside. We like to put them in /home/vm.
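A quick sanity check after the install is to confirm the VM partition is mounted where you expect, with the space you expect (the path here is just our /home/vm choice):

# Show the size and mount point of the filesystem holding the VM store
df -h /home/vm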

When prompted to configure your server, select the options you want. We selected "OpenSSH server" and "Samba File Server" (we work in a mixed Linux/Windows environment here). Generally, it is a good idea to keep these selections to a minimum to save resources.

Once the install has finished, reboot.

Step 3: Next, install some packages that are necessary for VMware but that the base installer doesn't install by default, namely:

sudo apt-get install linux-headers-`uname -r` libx11-6 libx11-dev libxrender1 libxt6 libxtst6 libxext6 psmisc build-essential iceauth xinetd

Note that you need to run this as a user with sudo privileges.
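If you want a quick sanity check that the right headers went in, compare the running kernel against the installed headers package (a minimal check, nothing more):

# The two version strings should match
uname -r
dpkg -l linux-headers-`uname -r`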

(Incidentally, if you skip these packages, the VMware Server installer will probably fail with an error like this:)

The correct version of one or more libraries needed to run VMware Server may be
missing. This is the output of ldd /usr/bin/vmware:
linux-gate.so.1 => (0xffffe000)
libm.so.6 => /lib32/libm.so.6 (0xf7f93000)
libdl.so.2 => /lib32/libdl.so.2 (0xf7f8f000)
libpthread.so.0 => /lib32/libpthread.so.0 (0xf7f76000)
libX11.so.6 => not found
libXtst.so.6 => not found
libXext.so.6 => not found
libXt.so.6 => not found
libICE.so.6 => not found
libSM.so.6 => not found
libXrender.so.1 => not found
libz.so.1 => /usr/lib32/libz.so.1 (0xf7f60000)
libc.so.6 => /lib32/libc.so.6 (0xf7e16000)
/lib/ld-linux.so.2 (0xf7fc6000)

Now you're ready to install VMware.

Step 4: Obtain VMware Server.

Go to http://www.vmware.com/download/server/ and download the latest version of the VMware Server install package. The way I did this was to browse to the appropriate page on VMware's site using another computer (my workstation), register, and accept the license agreements; when it came time to actually download the file, I copied the link onto my clipboard and pasted it into my ssh session with my new server.

At the time of writing, this is what I executed:

wget http://download3.vmware.com/software/vmserver/VMware-server-1.0.5-80187.tar.gz

Similarly for the management interface and client packages:

wget http://download3.vmware.com/software/vmserver/VMware-mui-1.0.5-80187.tar.gz

wget http://download3.vmware.com/software/vmserver/VMware-server-linux-client-1.0.5-80187.zip


Once you have the files, untar/gunzip them using the following commands:

tar -xvzf ./VMware-server-1.0.x-xxxxx.tar.gz
tar -xvzf ./VMware-mui-1.0.x-xxxxx.tar.gz

where x and xxxxx represent the proper version numbers.

Two new directories called vmware-server-distrib and vmware-mui-distrib will be created.

Step 5: Install VMware.

Change directory to vmware-server-distrib and run the vmware-install.pl script. In our case, we accepted the default for most questions, except for where the VMs should reside. We entered "/home/vm" for this, but it is up to you where you want to put them on your system.
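In practice, that boils down to something like this (assuming you untarred the archive into your current directory as above):

cd vmware-server-distrib
sudo ./vmware-install.pl
# When asked for the virtual machine location, we gave /home/vm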

Step 6: Install the VMware Management UI.

The VMware Management UI is a good way to get an overall view of the VMware Server host. It presents you with helpful averages of CPU and memory usage per VM. We use it to gauge approximately how many VMs a given physical server can handle.

To install it, change to your vmware-mui-distrib directory and run the vmware-install.pl script. When you do this, you *may* get a message indicating that VMware is not installed. This message is in fact misleading: it usually just means you have some missing libraries, and/or that your "sh" points to something called "dash" instead of the more full-fledged "bash".

To resolve these problems, try any or all of the following:

Needed packages not installed:
sudo apt-get install libx11-6 libxtst6 libxt6 libxrender1 libxi6 libstdc++5 libssl0.9.7 libcrypto++5.2c2a

Incorrectly linked library:
sudo ln -s /usr/lib/libdb-4.3.so /usr/lib/libdb.so.3

Incorrectly pointed sh command:
Determine whether this is the case by running "ls -l /bin/sh"; if it shows that sh actually points to a program called "dash", remove the link and relink it to bash. (You may safely do this, then point it back to dash after running the install script.)

Then correct the problem by:

sudo rm /bin/sh
sudo ln -s /bin/bash /bin/sh
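And once the installer has finished, you can point sh back at dash the same way:

sudo rm /bin/sh
sudo ln -s /bin/dash /bin/sh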

One more note...
In our case, we decided to use a single user on the server host for all VMs. All VMs were then owned by, and writable by, this single user. Depending on your environment, this approach may or may not be feasible. The advantage is simplicity; the disadvantages are reduced privacy and flexibility. You decide.
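For illustration, with a hypothetical dedicated user called vmadmin, the setup amounts to something like this (the user name and path are just examples):

# Create the shared VM user and give it ownership of the VM store
sudo adduser vmadmin
sudo chown -R vmadmin:vmadmin /home/vm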

Conclusion and Verification
If you've done the above steps correctly, you now have a functional VMware Server host machine. To verify:

1. Try installing and connecting with a VMware Server Console client. You should be able to create new virtual machines.

2. Try connecting to the management console by pointing your browser to https://servername:8333.
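If the console page doesn't come up, one quick way to confirm the management interface is actually listening is to check from the server itself (just a sanity check):

sudo netstat -tlnp | grep 8333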

If you've found this article useful or you have any compliments, constructive criticisms etc., please feel free to leave a comment!

Thursday, April 10, 2008

Installing webmin on Ubuntu 7.10

Recently, I've discovered a very neat tool for administering some of my Linux servers. It is called webmin, and it may be one of the more complete web-based system administration tools I have seen for Linux.

Admittedly, I was a bit apprehensive at first about trusting a web interface to do administration. After all, I am not one to shy away from the command line. But the more I used the webmin console, the more I could really see the benefit of using such a tool. In particular, I found custom commands and scheduled monitoring to be two especially useful features.

I had been configuring small single-application virtual machines lately with Ubuntu, and found webmin to be a very robust tool in which to write custom commands to administer the various functions of each server. Webmin made each VM more like a virtual appliance -- something we just turn on and off -- rather than something we had to learn oodles of commands to maintain.

Now, installation of webmin on Ubuntu isn't quite as straightforward as just saying "apt-get install webmin", so I've written a small HOWTO guide on how to install it on Ubuntu. If you find this useful, please do leave a comment. It's nice to know what people find useful, and what things people don't.

Step 1: Get the latest webmin debian package from webmin's site. Since the bare-bones Ubuntu Server won't have a graphical browser, the easiest way to do this is to find the download link via another machine, then use wget to download it onto your server. For me, a nearby SourceForge mirror was the best place to obtain it, so I issued a command like this:

wget http://internap.dl.sourceforge.net/sourceforge/webadmin/webmin_1.410_all.deb

This was the response:

--12:57:12-- http://internap.dl.sourceforge.net/sourceforge/webadmin/webmin_1.410_all.deb
=> `webmin_1.410_all.deb'
Resolving internap.dl.sourceforge.net... 74.201.26.4
Connecting to internap.dl.sourceforge.net|74.201.26.4|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 13,140,062 (13M) [text/plain]

100%[====================================================================================================>] 13,140,062 417.00K/s ETA 00:00

12:57:44 (416.17 KB/s) - `webmin_1.410_all.deb' saved [13140062/13140062]

Step 2: Next, install the appropriate libraries (as a sudo user) to get webmin to run:

sudo apt-get install libnet-ssleay-perl libauthen-pam-perl libio-pty-perl libmd5-perl openssl
This was the response:
Reading package lists... Done
Building dependency tree
Reading state information... Done
Suggested packages:
ca-certificates
The following NEW packages will be installed:
libauthen-pam-perl libio-pty-perl libmd5-perl libnet-ssleay-perl openssl
0 upgraded, 5 newly installed, 0 to remove and 39 not upgraded.
Need to get 1138kB of archives.
After unpacking 3555kB of additional disk space will be used.
Get:1 http://ca.archive.ubuntu.com gutsy/universe libauthen-pam-perl 0.16-1 [32.2kB]
Get:2 http://ca.archive.ubuntu.com gutsy/universe libio-pty-perl 1:1.07-1 [42.3kB]
Get:3 http://ca.archive.ubuntu.com gutsy/universe libmd5-perl 2.03-1 [5680B]
Get:4 http://ca.archive.ubuntu.com gutsy/main libnet-ssleay-perl 1.30-1 [186kB]
Get:5 http://ca.archive.ubuntu.com gutsy-updates/main openssl 0.9.8e-5ubuntu3.1 [872kB]
Fetched 1138kB in 9s (117kB/s)
Selecting previously deselected package libauthen-pam-perl.
(Reading database ... 33264 files and directories currently installed.)
Unpacking libauthen-pam-perl (from .../libauthen-pam-perl_0.16-1_i386.deb) ...
Selecting previously deselected package libio-pty-perl.
Unpacking libio-pty-perl (from .../libio-pty-perl_1%3a1.07-1_i386.deb) ...
Selecting previously deselected package libmd5-perl.
Unpacking libmd5-perl (from .../libmd5-perl_2.03-1_all.deb) ...
Selecting previously deselected package libnet-ssleay-perl.
Unpacking libnet-ssleay-perl (from .../libnet-ssleay-perl_1.30-1_i386.deb) ...
Selecting previously deselected package openssl.
Unpacking openssl (from .../openssl_0.9.8e-5ubuntu3.1_i386.deb) ...
Creating directory /etc/ssl
Setting up libauthen-pam-perl (0.16-1) ...
Setting up libio-pty-perl (1:1.07-1) ...
Setting up libmd5-perl (2.03-1) ...
Setting up libnet-ssleay-perl (1.30-1) ...
Setting up openssl (0.9.8e-5ubuntu3.1) ...


Step 3: Install the webmin package as root using dpkg.

sudo dpkg -i webmin_1.410_all.deb
This was the response:

Selecting previously deselected package webmin.
(Reading database ... 33791 files and directories currently installed.)
Unpacking webmin (from webmin_1.410_all.deb) ...
Setting up webmin (1.410) ...
Webmin install complete. You can now login to https://myserver:10000/
as root with your root password, or as any user who can use sudo
to run commands as root.

Step 4: You're done! Log in to the server using your web browser, pointing it to the address indicated.
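If the page doesn't load, you can confirm from the server that webmin is listening on its default port (a quick sanity check):

# webmin's web server process is called miniserv.pl
sudo netstat -tlnp | grep miniserv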

Sunday, March 23, 2008

The best things in life are free

Ever had one of those weird moments when you check out what's stuck in your wallet and compare it with that of your buddies? I'm not talking about how much money you have, but all of the other things - receipts, driver's license, club cards, gym memberships, packs of sugar, ...? (Think George in that old episode of Seinfeld...)

Well, this post is all about what's stuck in your development environment. Perhaps one of the biggest lessons I have learned over the past few years is that some of the best development and I.T. tools out there don't have to cost you an arm and a leg. Once upon a time, we subscribed to a different model: the pay-as-you-go model, where every time we ran into a problem, we'd call up our favourite vendor and "discover" what next big purchase we would need to make to solve it.

No longer. Now, thanks to a plethora of good open-source or economical commercial software, one no longer has to pay big bucks to develop good software.

Here's what I have installed:
  • Eclipse Europa -- this is my main integrated development environment. Within Eclipse, I use a variety of plugins and environments. I develop using Java 6, unit test with JUnit, manage my configuration using Subversion and Subclipse, integrate using Ant, and task manage using Mylyn.
  • Mozilla Firefox -- this is my main browser. I enjoy using it immensely because it is cross-platform and, not being tied to my operating system, relatively secure.
  • GVIM - for all editing tasks I don't use Eclipse for. It is a trusty, albeit expert-friendly, editor (vi) ported over to Windows. It is free and even does colour syntax highlighting.
  • Beyond Compare - about the best diff/compare/merge tool out there for Windows. It is not free, but the $30 license fee isn't going to kill you either. I highly recommend this tool, as it does not only file comparisons but also wonderful directory comparisons, which is great for synchronizing directories.

Now for some of the back office stuff:
  • Bugzilla Bug Tracking System - tracks our defects and bugs and integrates nicely into Eclipse thanks to a nifty Mylyn connector plugin.
  • Subversion - handles all of our source control and configuration management
  • Bugzilla uses Perl and MySQL extensively, while both it and Subversion use the Apache Web Server, which Ubuntu Server (based on the very versatile Debian linux distribution) does a nice job of pre-installing for you.
  • Almost all of these servers run as virtual machines on VMware Server - a commercial but free product from VMware that allows you to run more than one virtual machine on a given server. Great for lab-testing products and hosting virtual servers.
  • The remaining stuff such as our Java Web Start deployment server runs on an instance of Apache Tomcat.
  • Apache ActiveMQ acts as our messaging server.

We have made similar use of free tools in everything from our networking to even the on-hold music on our telephone system. It's not that we don't run commercial products. But with the availability of open-source, community-driven, community-debugged programs, IT professionals and developers are offered a much greater choice. The focus shifts from being product-driven (trying to find which product fits the budget) to problem-solving, where real problems can be solved by expertise, experience and knowledge of how best to integrate the available open-source and commercial products into a working environment. Best of all, you can keep most of that big fat wallet of yours in your back pocket because you won't have to take it out very often. :-)

Those are my two bits. What about yours? What are the pros and cons of open source software in your environment? Hit me up in the comments.

Monday, March 17, 2008

Review of the Asus P5N-MX (LGA775) mainboard for use as a Linux File Server

I'm kind of an odd person. In my house, I have a closet that serves as the central hub for all my networking. It's sort of an odd hybrid of cheap IKEA utility shelves coupled with some rack mounted equipment.

I put "servers" in this closet too and serve files from this closet. Back in my university days, it was the "geek" thing to do: in fact it helped me tremendously during an advanced networking course as it meant I had my own in-house lab. Nowadays, it is more of a carry-over tradition. I have gone from a huge network of many servers to an array of embedded devices (WRT54GLs running OpenWRT) and one/two PCs running Linux.

Recently, one of the hard drives on one of these PCs decided to quit working. I had been mildly expecting it to happen (after all, hard drives nowadays tend to only last their warranty period...) so I had been diligently backing up to a mirror drive each day. When it finally died, I decided it was also time to upgrade the computer.

So I went looking for a decent, low-powered, fast computer solution and came across the Asus P5N-MX motherboard. Now, this isn't your gamer's motherboard, but what it does have is built-in LAN, video, sound and RAID functionality, and a price tag that fits the budget: perfect for what I wanted to do with it. I purchased it along with a Pentium Dual-Core CPU - about the best Intel dual-core CPU you can get on the market for under $100 that isn't a Celeron. I considered getting one of the higher-end CPUs (such as a Core 2 Duo or higher), but I was on a budget, and frankly it would have been a waste of money for the intended function of the machine.

Total price for case, CPU, motherboard, memory and new hard drive came to $300. Not bad considering I'm upgrading from a Pentium III 800 MHz!

Assembly and Regret

I bought the parts at a local computer shop, and two hours later, I had the thing assembled. I gave it its first boot, and all I heard was one long beeeeep! My heart sank. Out came the tools again, and soon, through a process of elimination, I discovered I had a stick of faulty RAM.

(Side note: The funny thing is that a couple of years ago, I had vowed never to buy a home-built computer again - not that I was afraid of them, but simply that they had outlived their usefulness. One used to get home-built computers because they were no worse than the stuff you got from Dell or HP, but nowadays, things are different. After working in IT for many years, I had convinced myself that purchasing name-brand was justified on warranty and heavy integration testing purposes alone. But, like many times before, I was lured by the price and convenience of a home-built computer. So when I discovered the faulty RAM, it reconfirmed my previous fears.)

Problem #1: Faulty RAM and/or lousy BIOS

After replacing the RAM (and kicking myself for not following my own advice and buying Dell), I got the machine working... sort of. Everything seemed to work except for a memory test failure every time I booted the computer. So I replaced the RAM yet again, and began to wonder whether it might be something else. After much searching, and at the advice of the technician at the local shop, I decided it might be a BIOS problem, so I upgraded the BIOS from version 01xx to 0402. Mysteriously, the RAM problem went away.

Problem #2: Lousy ASUS documentation

One of the things I was quick to discover was that though ASUS is a reputable motherboard manufacturer, they really lack good documentation. Nowhere in the BIOS readme file did it so much as mention that the upgrade solves memory problems. It brings to mind two questions:

-- Why did the motherboard not ship with a newer BIOS?
-- Why do they not document new BIOS features and fixes in detail?

Problem #3: False reports of an overheating CPU


I then focused my attention on another problem that had appeared in the meantime. The reported temperature of the CPU (according to the BIOS) was 71C! I knew this was erroneous because of the very cool heat sink sitting snugly attached to the CPU. It wouldn't really be that annoying, except that all of the fan speeds are tied to this temperature! So even though the CPU was cool, the fans spun at full speed.

I searched all over, but could only find vague references to the problem - and in almost every case where it was mentioned, the de facto answer was that the fan assembly was somehow to blame.

True as that may sound, it was not the case, and in the end, I concluded it must be yet another BIOS error. This time, however, I was out of "released" BIOSes to upgrade to, so only after reading a vague review on Newegg.com did I figure that upgrading to the latest "beta" BIOS (version 0601) might solve my problem.

I did upgrade, and it did solve my problem. I am finally happy with my new server, on which I have installed Ubuntu Server and created a giant file share using CIFS and NFS on my RAIDed hard drives.
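The sharing setup itself is the easy part. Here is a minimal sketch of the sort of thing involved (the package names are real Ubuntu packages, but the share path and network range are assumptions for illustration):

# NFS: install the server, export the directory, and reload the export table
sudo apt-get install nfs-kernel-server
echo "/home/share 192.168.1.0/24(rw,sync,no_subtree_check)" | sudo tee -a /etc/exports
sudo exportfs -ra

# CIFS: install Samba and add a share stanza to /etc/samba/smb.conf, e.g.
#   [share]
#   path = /home/share
#   read only = no
sudo apt-get install samba
sudo /etc/init.d/samba restart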

Conclusion

It has been several years since I last bought and assembled a home-built computer, and now, more than ever, I am convinced that it may not always be the best option. Until motherboard companies begin to act responsibly, provide adequate testing of their BIOSes, and document their upgrades in detail, it seems like a lot of work and frustration for not that much money saved.

Friday, March 14, 2008

Sysclean - a little known secret

One of the things our I.T. department deals with on occasion (much to our frustration) is virus-, malware-, spyware- and grayware-infected computers. Though we do have a layered defence in place, no system will ultimately prevent every type of malware out there all the time.

(We also get lots of questions from our users about their home machines. Though we don't officially support home machines, developing good I.T. practices is part of our mandate, so we often encourage and help out with this out of goodwill.)

In addition to telling them about some of the online scanners available (such as Trend Micro's Housecall or Symantec/Norton's equivalent), we also send them home with a rescue CD. On the CD is a little-known secret... and it's free.

First, the secret - then I will tell you why we do this in addition to online scanners.

Trend Micro offers an offline system cleaner called "sysclean". It isn't the most elegant of solutions, but it is thorough. It will detect most viruses and spyware, as well as other forms of malware, and it does a reasonable job of cleaning them up.

You can download the sysclean program here:

Once you have downloaded it, you will also need to download their latest pattern files. You can find those here:

You need both the virus pattern and the spyware pattern. Download the "new" pattern files titled SSAPIPTN.DA5. Put all the files into one directory, and unzip all pattern files. Then, run sysclean.com and let it scan away.


If you intend to put this on a CD, there are a couple of catches:

  1. On the target computer, you will need to copy the files off the CD into a local directory and make them NOT read-only. This is because sysclean.com will actually extract other programs and require write access.
  2. Note that the patterns change almost daily - so be sure to keep the CD up to date.

Why do we encourage this in addition to online scanners?

For system recovery, this solution works well - it does not require plugins or Java to be installed. In fact, it does not even require an internet connection. But most importantly, it is not as susceptible to browser hijacks. (If we assume that the browser on the target computer is already infected, what good is a scanner that requires that very browser?)

Thursday, March 13, 2008

Start a knowledge base

I work with a relatively small I.T. team. With each of us half developer and half I.T. administrator, human resources can get quite stretched when it comes to domain knowledge. When one person goes on vacation, their absence is felt.

One of the things we did a couple of years ago to mitigate this was to start a knowledge base. Knowledge bases used to be huge, complicated things that people had to manage. I still remember when I worked for a big consulting firm that had whole teams dedicated to maintaining and producing the knowledge base. But with the advent of "Web 2.0" and the popularity of blogs and wikis, this hurdle has been greatly overcome. The focus shifts to the content rather than the presentation. The fact that everyone can contribute makes the content that much more relevant and useful.

Here are some tips to starting a successful knowledge base:

1. Encourage and convince your team (and other stakeholders) that this is a good thing. Resistance can be hard to overcome. It may come in the form of people who have ingrained knowledge of systems and fear losing their jobs. But job replacement is not the goal of a knowledge system. By documenting knowledge, you not only spread the wealth of knowledge, you encourage reproduction of leadership, and you encourage those that have the knowledge to learn more. (Don't they say that the best student is often the teacher?)

2. Identify some areas that desperately need documentation. Keep a running list of topics.

3. Find a repository. In our team, we prefer to use a wiki. We find the collaborative nature of wikis suitable for something that changes constantly, like a knowledge base. As we find new problems and new solutions, we update the wiki. There are many wikis out there of varying size - find one that suits the size of your organization, and run with it. We decided to use JSPWiki - mostly because of its simplicity, and the fact that it uses the file system as a repository. This makes for simple backups as well as maintenance. Others will want to choose a more robust system such as TWiki or MediaWiki, where the UI is more full-featured. Again, the emphasis should be on content, not presentation.

Wikipedia (a wiki in and of itself) has a great comparison page on wiki engines.

4. Have a reasonable organizational structure in mind. I'm a little cautious about saying this, because sometimes finding the structure can be daunting. If this is the case, do step 5 first. But it is a good idea to have a general idea of a structure you may wish to use. It helps you to seed the wiki with initial ideas. Then you can refine the structure later.

In our team, we decided to use the OSI layers as an initial structure. Because many of the things we wanted to write about in our knowledge base fell under one of these layers, this was a logical point to start. Other teams may wish to use other forms of organization. Whatever you choose, do try to make sure that it ties into some sort of existing context that your users will understand.

5. Start writing! In our team, many of our initial pages were simply a description of what each page was intended to be. For instance, we might have a page called "ListOfServerIPs" that just contains the intention for that page. Then, as problems came up, we all made a commitment to find the relevant page and update it. In other words, there is not always a need to do a huge "dump" of initial information - just start using the knowledge base from this point forward, and the background will often fill itself in.

6. Repeat steps 2 through 5. Knowledge bases (like the people who hold the knowledge) are meant to be organic. A knowledge base is only as good as the last person who maintained it - so make it an iterative process. As your knowledge base grows, reevaluate whether there might be a better way of organizing the information, and write/rewrite pages accordingly. Of course, one nice thing about wikis is that they are also web pages with full hyperlink capabilities, so creating new "index" pages is easy. A lot of wikis nowadays also employ tag and search systems, allowing for greater flexibility in structure.

Wednesday, March 12, 2008

Blog resurrection

After a long hiatus of over a year, I am dusting off this blog, revamping it with a new look, and committing to updating it more regularly. Check back here soon and often for new tips on how to practice practical I.T.