Sunday, March 23, 2008

The best things in life are free

Ever had one of those weird moments when you check out what's stuck in your wallet and compare it with what's in your buddies'? I'm not talking about how much money you have, but all of the other things - receipts, driver's license, club cards, gym memberships, packs of sugar, ...? (Think George in that old episode of Seinfeld...)

Well, this post is all about what's stuck in your development environment. Perhaps one of the biggest lessons I have learned over the past few years is that some of the best development and I.T. tools out there don't have to cost you an arm and a leg. Once upon a time, we subscribed to a different model: the pay-as-you-go model, where every time we ran into a problem, we'd call up our favourite vendor and "discover" what big purchase we would need to make next to solve it.

No longer. Now, thanks to a plethora of good open-source or economical commercial software, one no longer has to pay big bucks to develop good software.

Here's what I have installed:
  • Eclipse Europa -- this is my main integrated development environment. Within Eclipse, I use a variety of plugins and environments. I develop using Java 6, unit test with JUnit, manage my configuration using Subversion and Subclipse, build using Ant, and manage tasks using Mylyn.
  • Mozilla Firefox -- this is my main browser. I enjoy using it immensely because it is cross-platform and, not being tied to my operating system, relatively secure.
  • GVIM -- for all the editing tasks I don't use Eclipse for. It is a trusty, albeit expert-friendly, editor (vi) ported over to Windows. It is free and even does colour syntax highlighting.
  • Beyond Compare -- about the best diff/compare/merge tool out there for Windows. It is not free, but the $30 license fee isn't going to kill you either. I highly recommend this tool: it handles not only file comparisons but also wonderful directory comparisons, which is great for synchronizing directories.

Now for some of the back office stuff:
  • Bugzilla Bug Tracking System - tracks our defects and bugs and integrates nicely into Eclipse thanks to a nifty Mylyn connector plugin.
  • Subversion - handles all of our source control and configuration management
  • Bugzilla uses Perl and MySQL extensively, while both Bugzilla and Subversion run on the Apache Web Server, which Ubuntu Server (based on the very versatile Debian Linux distribution) does a nice job of pre-installing for you.
  • Almost all of these servers run as virtual machines on VMware Server - a commercial but free product from VMware that allows you to run more than one virtual machine on a given physical server. Great for lab testing products and hosting virtual servers.
  • The remaining stuff such as our Java Web Start deployment server runs on an instance of Apache Tomcat.
  • Apache ActiveMQ acts as our messaging server.

We have made similar use of free software in everything from our networking to even the on-hold music on our telephone system. It's not that we don't run commercial products. But with the availability of open-source, community-driven, community-debugged programs, IT professionals and developers have a much greater choice. The focus shifts from being product-driven (trying to find which product fits the budget) to solving problems, where real problems are solved by expertise, experience, and knowledge of how best to integrate the available open-source and commercial products into a working environment. Best of all, you can keep most of that big fat wallet of yours in your back pocket, because you won't have to take it out very often. :-)



Those are my two bits. What about yours? What are the pros and cons of open source software in your environment? Hit me up in the comments.



Monday, March 17, 2008

Review of the Asus P5N-MX (LGA775) mainboard for use as a Linux File Server

I'm kind of an odd person. In my house, I have a closet that serves as the central hub for all my networking. It's an odd hybrid of cheap IKEA utility shelves coupled with some rack-mounted equipment.

I put "servers" in this closet too and serve files from it. Back in my university days, it was the "geek" thing to do: in fact, it helped me tremendously during an advanced networking course, as it meant I had my own in-house lab. Nowadays, it is more of a carry-over tradition. I have gone from a huge network of many servers to an array of embedded devices (WRT54GLs running OpenWRT) and one or two PCs running Linux.

Recently, one of the hard drives on one of these PCs decided to quit working. I had been mildly expecting it to happen (after all, hard drives nowadays tend to only last their warranty period...) so I had been diligently backing up to a mirror drive each day. When it finally died, I decided it was also time to upgrade the computer.

So, I went looking for a decent, low-powered, fast computer solution and came across the Asus P5N-MX motherboard. Now, this isn't your gamer's motherboard, but what it does have is built-in LAN, video, sound and RAID functionality, and a price tag that fits the budget: perfect for what I wanted to do with it. I purchased it along with a Pentium Dual-Core CPU - about the best Intel dual-core CPU you can get on the market for under $100 that isn't a Celeron. I considered getting one of the higher-end CPUs (such as a Core 2 Duo or better), but I was on a budget, and frankly it would have been wasted money for the intended function of the machine.

Total price for case, CPU, motherboard, memory and new hard drive came to $300. Not bad considering I'm upgrading from a Pentium III 800 MHz!

Assembly and Regret

I bought the parts at a local computer shop, and two hours later, I had the thing assembled. I gave it its first boot, and all I heard was one long beeeeep! My heart sank. Out came the tools again, and soon, through the process of elimination, I discovered I had a faulty stick of RAM.

(Side note: The funny thing is that a couple of years ago, I had vowed never to build a computer myself again - not because I was afraid of them, but simply because they had outlived their usefulness. One used to get home-built computers because they were no worse than the stuff you got from Dell or HP, but nowadays, things are different. After working in IT for many years, I had convinced myself that purchasing name-brand was justified on warranty and heavy integration testing alone. But, like many times before, I was lured by the price and convenience of a home-built computer. So when I discovered the faulty RAM, it reconfirmed my previous fears.)

Problem #1: Faulty RAM and/or lousy BIOS

After replacing the RAM (and kicking myself for not following my own advice and buying Dell), I got the machine working... sort of. Everything seemed to work, except for a memory test failure every time I booted the computer. So I replaced the RAM yet again, and began to wonder whether the cause might be something else. After much searching, and at the advice of the technician at the local shop, I decided it might be a BIOS problem, so I upgraded the BIOS from version 01xx to 0402. Mysteriously, the RAM problem went away.

Problem #2: Lousy ASUS documentation

One of the things I was quick to discover was that though ASUS is a reputable motherboard manufacturer, they really lack good documentation. Nowhere in the BIOS readme file did they so much as mention that the upgrade solves memory problems. It brings to mind two questions:

-- Why did the motherboard not ship with a newer BIOS?
-- Why do they not document new BIOS features and fixes in detail?

Problem #3: False reports of an overheating CPU


I then focused my attention on another problem that had appeared in the meantime. The reported temperature of the CPU (according to the BIOS) was 71C! I knew this was erroneous because the heat sink sitting snugly attached to the CPU was cool to the touch. It wouldn't really be that annoying, except that all of the fan speeds are tied to this temperature! So even though the CPU was cool, the fans spun at full speed.

I searched all over, but could only find vague references to the problem - and in almost every case where it was mentioned, the de facto answer was that the fan assembly was somehow to blame.

Plausible as it sounded, that was not the case, and in the end I concluded it must be yet another BIOS error. This time, however, I was out of "released" BIOSes to upgrade to, and only after reading a vague review on Newegg.com did I figure that upgrading to the latest "beta" BIOS (version 0601) might solve my problem.

I did upgrade, and it did solve my problem. I am finally happy with my new server, on which I have installed Ubuntu Server and created a giant file share over CIFS and NFS on my RAIDed hard drives.
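If you are setting up a similar share, the sharing side boils down to a couple of small fragments. This is only a minimal sketch, assuming Ubuntu with the samba and nfs-kernel-server packages installed; the share name, directory and subnet are hypothetical:

```
# /etc/samba/smb.conf -- the CIFS share for Windows clients
[files]
   path = /srv/files
   read only = no

# /etc/exports -- the same directory over NFS for Linux clients
/srv/files  192.168.1.0/24(rw,sync,no_subtree_check)
```

After editing, restart Samba and run "exportfs -ra" so the NFS export takes effect.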





Conclusion

It has been several years since I bought and assembled a home-built computer, but now, more than ever, I am convinced that it is not always the best option. Until motherboard companies begin to act responsibly, test their BIOSes adequately, and document their upgrades in detail, home-building seems like a lot of work and frustration for not that much money saved.

Friday, March 14, 2008

Sysclean - a little known secret

One of the things that our I.T. department deals with on occasion (much to our frustration) is virus-, malware-, spyware- and grayware-infected computers. Though we do have a layered system in place, no system will ultimately prevent every type of malware out there all of the time.

(We also get lots of questions from our users about their home machines. Though we don't officially support home machines, developing good I.T. practices is part of our mandate, so we often encourage and help out with this out of goodwill.)

In addition to telling them about some of the online scanners available (such as Trend Micro's Housecall or Symantec/Norton's equivalent) we also send them home with a rescue CD. On the CD is a little known secret... and it's free.

First, the secret - then I will tell you why we do this in addition to online scanners.

Trend Micro offers an offline system cleaner called "sysclean". It isn't the most elegant of solutions, but it is thorough. It will detect most viruses and spyware, as well as other forms of malware, and it does a reasonable job of cleaning them up.

You can download the sysclean program here:

Once you have downloaded it, you will also need to download their latest pattern files. You can find those here:

You need both the Virus pattern and the Spyware pattern. Download the "new" pattern files titled SSAPIPTN.DA5. Put all the files into one directory, and unzip all the pattern files. Then, run sysclean.com and let it scan away.


If you intend on putting this on a CD, there are several catches:

  1. On the target computer, you will need to copy the files off the CD into a local directory and make them NOT read-only. This is because sysclean.com will actually extract other programs and requires write access.
  2. Note that the patterns change almost daily - so be sure to keep the CD up to date.
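Catch 1 can be scripted for less technical users. Here is a minimal sketch in Python (any scripting language would do); the CD drive letter and working directory in the example comment are hypothetical. The script simply copies the tree and restores the write permission that files inherit from the CD:

```python
import os
import shutil
import stat

def copy_and_unprotect(cd_dir, work_dir):
    """Copy the sysclean files off the CD, then clear the read-only
    attribute so that sysclean.com can extract its helper programs."""
    shutil.copytree(cd_dir, work_dir)  # work_dir must not exist yet
    for root, _dirs, files in os.walk(work_dir):
        for name in files:
            path = os.path.join(root, name)
            # Files copied from a CD keep their read-only flag;
            # add the owner-write permission back.
            os.chmod(path, os.stat(path).st_mode | stat.S_IWRITE)

# e.g. copy_and_unprotect(r"D:\sysclean", r"C:\sysclean")
```

Burn the script onto the rescue CD alongside sysclean and the pattern files, and the whole "copy, unprotect, scan" routine becomes a double-click.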

Why do we encourage this in addition to online scanners?

For system recovery, this solution works well - it does not require plugins or Java to be installed. In fact, it does not even require an internet connection. But most importantly, it is not as susceptible to browser hijacks. (If we assume that the browser on the target computer is already infected, what good is a scanner that relies on that browser?)

Thursday, March 13, 2008

Start a knowledge base

I work with a relatively small I.T. team. Being half developer and half I.T. administrator, human resources can get quite stretched when it comes to domain knowledge. When one person goes on vacation, their absence is felt.

One of the things we did a couple of years ago to mitigate this was to start a knowledge base. Knowledge bases used to be huge, complicated things that people had to manage. I still remember the times when I worked for a big consulting firm that had whole teams dedicated to maintaining and producing the knowledge base. But with the advent of "Web 2.0" and the popularity of blogs and wikis, this hurdle has largely been overcome. The focus shifts to the content rather than the presentation. The fact that everyone can contribute makes the content that much more relevant and useful.

Here are some tips to starting a successful knowledge base:

1. Encourage and convince your team (and other stakeholders) that this is a good thing. Resistance can be hard to overcome. It may come from people who have ingrained knowledge of systems and fear losing their jobs. But job replacement is not the goal of a knowledge system. By documenting knowledge, you not only spread the wealth of knowledge, you encourage the growth of new leaders, and you encourage those who have the knowledge to learn more. (Don't they say that the best student is often the teacher?)

2. Identify some areas that desperately need documentation. Keep a running list of topics.

3. Find a repository. In our team, we prefer to use a wiki. We find the collaborative nature of wikis suitable for something that changes constantly, like a knowledge base. As we find new problems and new solutions, we update the wiki. There are many wiki engines out there of varying size - find one that suits the size of your organization, and run with it. We decided to use JSPWiki, mostly because of its simplicity and the fact that it uses the file system as its repository. This makes for simple backups as well as maintenance. Others will want to choose a more robust system such as TWiki or MediaWiki, where the UI is more full-featured. Again, the emphasis should be on content, not presentation.

Wikipedia (a wiki in and of itself) has a great comparison page on wiki engines.
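Because JSPWiki keeps each page as a plain file, those "simple backups" can literally be a short script run from cron. Here is a minimal sketch in Python; the page directory and backup location in the example comment are hypothetical:

```python
import tarfile
import time
from pathlib import Path

def backup_wiki(pages_dir, backup_dir):
    """Archive the JSPWiki page files into a dated tarball."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    archive = backup_dir / time.strftime("wiki-%Y%m%d.tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        # arcname keeps the archive rooted at a predictable name
        tar.add(pages_dir, arcname="wiki-pages")
    return archive

# e.g. backup_wiki("/var/jspwiki/pages", "/backup/wiki")
```

Point the backup location at a different disk (or another machine) and the knowledge base survives the kind of drive failure described in the previous post.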

4. Have a reasonable organizational structure in mind. I'm a little cautious about saying this, because finding the structure can sometimes be daunting. If that is the case, do step 5 first. But it is a good idea to have a general idea of the structure you may wish to use. It helps you seed the wiki with initial ideas. Then you can refine the structure later.

In our team, we decided to use the OSI layers as an initial structure. Because many of the things we wanted to write about in our knowledge base fell under one of these layers, this was a logical place to start. Other teams may wish to use other forms of organization. Whatever you choose, try to make sure that it ties into some existing context that your users will understand.

5. Start writing! In our team, many of our initial pages were simply a description of what that page was intended to be. For instance, we might have a page called "ListOfServerIPs" that just contained the intention for that page. Then, as problems came up, we all made a commitment to find the relevant page and update it. In other words, there is not always a need to do a huge "dump" of initial information - just start using the knowledge base from this point forward, and the background will often fill itself in.

6. Repeat steps 2 through 5. Knowledge bases (like the people who remember the knowledge) are meant to be organic. A knowledge base is only as good as the last person who maintained it - so make maintenance an iterative process. As your knowledge base grows, reevaluate whether there might be a better way of organizing the information, and write or rewrite pages accordingly. Of course, one nice thing about wikis is that they are also web pages with full hyperlink capabilities, so creating new "index" pages is easy. A lot of wikis nowadays also employ tag and search systems, allowing for greater flexibility in structure.

Wednesday, March 12, 2008

Blog resurrection

After a long hiatus of over a year, I am dusting off this blog, revamping it with a new look, and committing to updating it more regularly. Check back here soon and often for new tips on how to practice practical I.T.