
Basketball Video

This video pretty much sums up WVU men’s basketball since Coach Beilein came in 2002. The emphasis is mostly on last year’s run to the Elite Eight, and it doesn’t have anything from the current season. I had seen it before but came across it again today, so I’m putting it here for anyone else who might not have seen it. It’s a must-watch for any Mountaineer fan. Enjoy!

Tonight is the last home game of the 2005-2006 men’s basketball season. We’re playing Pitt. It’s also the last time Pittsnogle, Gansey, Herber, Collins, and Beilein will play a game at the Coliseum. People on the message boards are reporting that students were already waiting in line at 5:30am (tip-off is at 7:00pm). It should be a great game. Here’s a good story about the 5 graduating seniors:
http://msnsports.net/page.cfm?story=9015&cat=exclusives

One last thing…. Eat Shit Pitt!


Glad I don’t work here

http://www.theregister.co.uk/2006/02/10/employees_chipped/

This company in Cincinnati is requiring employees who need access to their data center to have an RFID chip injected into their biceps. I hope this doesn’t become a widespread practice. If WVU ever started this I’d be looking for a new job. I don’t mind having to carry a card with an RFID chip on it, but having one implanted in me takes it too far. If I ever left the company I could just give the card back, but a chip in my arm would be with me forever.

Scary!


New Web Host

Well, I did it. I’ve been telling support for months that if things didn’t change I was moving to a new host. They didn’t seem to believe me. So tonight I signed on with someone new. My current host has been nothing but trouble since I registered. The constant outages, high loads, and unannounced system changes pushed me over the edge.

I still have a couple of months left on my contract with my current host, so I’m going to try to get all of my content migrated before I change the DNS entries for thehaggertys.net. Regardless, there might be some time when my pages won’t load while the DNS changes make their way out to everyone on the web.

My new host gave me a free domain name. So I registered www.treah.com for my wife. It was the deal maker. She didn’t want me to spend the money until I told her she’d have her own domain name. So I’ll be migrating her blog over to it.

I’d just like to close with a strong word of warning for anyone who is considering using Web Hosting Buzz as a web host. The prices are cheap, but you get what you pay for. Stay away!


Load Monitoring

As many of you know, I’m a computer guy. If you’ve been following my blog it may come as a shock that I haven’t really posted anything super geeky. Well, that’s about to change. Please note that WordPress has word-wrapped the code blocks below, so if you’re going to use these scripts, copy and paste them into a text editor to get the line breaks where they should be.

My web host (the people who host thehaggertys.net) sucks at managing servers. It seems as though my sites and email are down at least once a day, and I would never know it until someone said they couldn’t get email or reach the blogs or the gallery. I had enough and asked my host to fix it. They refused to do anything about it, or even move me to one of their other 20+ servers. This all prompted me to write some scripts to monitor the load on my server. I know this has probably been done a million times before, but I wrote this solution and I’m proud of it. So here it is if anyone wants to use it on their web server.

The first step was to create an HTML file to hold the output from the scripts. I called mine index.html and threw it in its own directory. Here’s what it consists of (just a table with a header row; the scripts append the data rows):

<table border="1">
<tr>
<th>Time</th>
<th>One Minute Average</th>
<th>Five Minute Average</th>
<th>Fifteen Minute Average</th>
</tr>

The next step is to write a script to fill in the table with data. For this I wrote a small Korn shell script that runs the “uptime” command, parses out the time and the load averages, and fills in the table in the index.html file. I have a cron job running it every 15 minutes, which effectively gives me the load averages on the box every 15 minutes. I also told the script to print out “High Load” followed by the load averages if any average is higher than 15, and I set cron to email me any output to standard out when it runs. So now I’m emailed (and it’s sent to my BlackBerry!) whenever the load is greater than 15. A load of 15 means that even a machine with 15 processors would be using 100% of its processing power; since my host is using a 2-processor box, a load of 15 means it’s trying to run at 750% of capacity. Either way, 15 is way too high for a 2-processor server.
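For reference, here’s roughly what the crontab entry looks like. The script name and path are just placeholders for wherever you put your copy; the important part is that cron mails anything printed to stdout, which is how the High Load alerts get out:

```
# run the load check every 15 minutes; any stdout gets emailed by cron
0,15,30,45 * * * * /home/haggerty/scripts/loadcheck.ksh
```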

My shell script is as follows:

#!/bin/ksh

# uptime's word count varies depending on how long the box has been
# up, so figure out which fields hold the three load averages
if [[ `uptime | wc -w` -eq 12 ]]
then
one=`uptime | awk '{print $10}' | sed 's/,//g'`
five=`uptime | awk '{print $11}' | sed 's/,//g'`
fifteen=`uptime | awk '{print $12}'`
time=`uptime | awk '{print $1}'`
elif [[ `uptime | wc -w` -eq 13 ]]
then
one=`uptime | awk '{print $11}' | sed 's/,//g'`
five=`uptime | awk '{print $12}' | sed 's/,//g'`
fifteen=`uptime | awk '{print $13}'`
time=`uptime | awk '{print $1}'`
fi

# append a table row with the current readings to the load page
echo "<tr>\n<td>$time</td>\n<td>$one</td>\n<td>$five</td>\n<td>$fifteen</td>\n</tr>" >> /home/haggerty/public_html/load/index.html

# drop the decimal part so the averages can be compared as integers
newone=`echo $one | awk -F. '{print $1}'`
newfive=`echo $five | awk -F. '{print $1}'`
newfifteen=`echo $fifteen | awk -F. '{print $1}'`

# anything over 15 gets printed, and cron mails stdout to me
if [[ $newone -gt 15 || $newfive -gt 15 || $newfifteen -gt 15 ]]
then
echo "High Load:\n$one, $five, $fifteen"
fi
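As an aside: if you don’t want to count words, you can key off the “load average” text itself and skip the if/elif branching entirely. Something like this should work on most boxes (I haven’t swapped my script over; it’s just a sketch):

```shell
#!/bin/ksh
# grab everything after "load average:" and strip the commas;
# [s]* also matches systems that print "load averages:"
loads=`uptime | sed 's/.*load average[s]*: //' | sed 's/,//g'`
one=`echo $loads | awk '{print $1}'`
five=`echo $loads | awk '{print $2}'`
fifteen=`echo $loads | awk '{print $3}'`
echo "$one $five $fifteen"
```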

Now, since this script runs every 15 minutes around the clock, the page will have 96 entries per day. After a week you’d have a table with 672 entries, and after a month there would be 2688 to 2976 entries depending on the month. That’s a lot of data to be faced with.
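Those counts are easy to sanity-check from the shell:

```shell
# one row every 15 minutes
echo $((24 * 60 / 15))            # 96 rows per day
echo $((96 * 7))                  # 672 rows per week
echo $((96 * 28)) $((96 * 31))    # 2688 and 2976 for the shortest and longest months
```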

So I decided that I would write another script that will archive the data at the end of the day. What this script does is copy the content of index.html to a file labeled with the date with an extension of .html. For example the data from Feb 09, 2006 would be called 20060209.html. I then add an entry to a table of contents page that adds a link to the newly created archive. After creating the new file it then clears index.html and recreates the above table.

The archive script is run each night at 23:55:00 via cron. The script is as follows:


#!/bin/ksh

year=`date +%Y`
month=`date +%m`
day=`date +%d`

# start the archive page with a title and heading for the day
echo "<html>\n<head><title>$month/$day/$year Load</title></head>\n<body>\n<h1>$month/$day/$year Load</h1>" > /home/haggerty/public_html/load/archive/$year$month$day.html

# copy today's table into the archive page
cat /home/haggerty/public_html/load/index.html >> /home/haggerty/public_html/load/archive/$year$month$day.html

# add a link to the new archive on the table of contents page
echo "<li><a href=\"$year$month$day.html\">$month/$day/$year</a></li>" >> /home/haggerty/public_html/load/archive/index.html

# wipe index.html and recreate the empty table for tomorrow
echo "<html>\n<head><title>Today's Load</title></head>\n<body>\n<table border=\"1\">\n<tr>\n<th>Time</th>\n<th>One Minute Average</th>\n<th>Five Minute Average</th>\n<th>Fifteen Minute Average</th>\n</tr>" > /home/haggerty/public_html/load/index.html
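And the matching crontab line for the nightly run. Again, the path is just a placeholder for wherever the script lives:

```
# archive the day's table five minutes before midnight
55 23 * * * /home/haggerty/scripts/loadarchive.ksh
```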

I think that’s it. I’ve only had it up and running for about a day and a half now, and I’ve already seen loads ranging from 0 to 189. I definitely will not be staying with this company after my contract is up.

If you want to see the scripts in action, today’s loads are at http://www.thehaggertys.net/load and archived loads are at http://www.thehaggertys.net/load/archive.

