
Blog Spam

I know my blog isn’t read by a ton of people, as it has been up for quite a few months now and I just got my first comment on my last posting. It was a comment from Jason, a friend I work with and see all the time. Anyway, ever since I upgraded to WordPress 2 I have had Akismet turned on to catch spammers trying to advertise in the comments of my blog. It does a great job. I had not seen anything get through it in the past.

The interesting thing is that ever since Jason posted a comment I have had a ton of spammers trying to post comments. Normally I would log in and Akismet would have caught 10-20 comments that it thought were spam. This morning I logged in and it had 79 comments it thought were spam. There were also 19 more comments that made their way past Akismet. Luckily I have WordPress set up so that I have to approve comments before they are posted to my site. So these 19 never actually made it to my page; they were waiting for my approval.

I hate spammers. I think we should send them to an island somewhere with only an internal network connection. That way they can piss each other off and leave the rest of us alone. I think I’ll go send my idea to my representatives in Congress.


New Web Host

Well I did it. I’ve been telling support for months that if things didn’t change I was moving to a new host. They didn’t seem to believe me. So tonight I signed on with someone new. My current host has been nothing but trouble since I registered. The constant outages, high loads, and unannounced system changes pushed me over the edge.

I still have a couple of months left on my contract with my current host. So I am going to try to get all of my content migrated before I change the DNS entries for thehaggertys.net. Regardless, there might be some time when my pages won’t load while the DNS changes make their way out to everyone on the web.

My new host gave me a free domain name. So I registered www.treah.com for my wife. It was the deal maker. She didn’t want me to spend the money until I told her she’d have her own domain name. So I’ll be migrating her blog over to it.

I’d just like to close with a strong word of warning for anyone who is considering using Web Hosting Buzz as a web host. The prices are cheap, but you get what you pay for. Stay away!


Load Monitoring

As many of you know, I’m a computer guy. If you’ve been following my blog that may come as a shock, since I haven’t really posted anything super geeky. Well, that’s about to change. Please note that WordPress has word-wrapped the code blocks below. So if you’re going to use these scripts, copy and paste them into a text editor so the line breaks end up where they should be.

My web host (the people who host thehaggertys.net) sucks at managing servers. It seems as though my sites and email are down at least once a day. I would never know it until someone said they couldn’t get email or get to the blogs or the gallery. I had enough and asked my host to fix it. They refused to do anything about it, or even move me to one of their other 20+ servers. This all prompted me to write some scripts to monitor the load on my server. I know this has probably been done a million times in the past, but I wrote this solution and I’m proud of it. So here it is if anyone wants to use it on their web server.

The first step was to create an HTML file to hold the output from the scripts. I called mine index.html and threw it in its own directory. Here’s what it consists of:

<table>
<tr>
<th>Time</th>
<th>One Minute Average</th>
<th>Five Minute Average</th>
<th>Fifteen Minute Average</th>
</tr>

The next step is to write a script to fill in the table with data. For this I wrote a small Korn shell script that runs the “uptime” command, parses out the time and the load averages, and then fills in the table in the index.html file. I have a cron job running it every 15 minutes, which effectively gives me the load averages on the box every 15 minutes. I also told the script to print out “High Load” followed by the load averages if any average is higher than 15, and I set cron to email me any output to standard out when it runs. So now I am emailed (and it’s sent to my BlackBerry!) whenever the load is greater than 15. A load of 15 would only mean 100% of the processor power on a machine with 15 processors; since my host is using a 2-processor box, a load of 15 means it is trying to run at 750% of its processing power. I think I did my math right. Anyway, 15 is way too high for a 2-processor server.
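For reference, and purely as an illustration since the exact format varies by system, the two shapes of uptime output that the script below distinguishes look roughly like this (in the first the load averages are fields 10-12, in the second they shift to 11-13):

$ uptime
 21:05:07 up 13 days,  2:34,  3 users,  load average: 0.22, 0.18, 0.15
$ uptime
 21:05:07 up 13 days,  5 min,  3 users,  load average: 0.22, 0.18, 0.15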

My shell script is as follows:

#! /bin/ksh

if [[ `uptime |wc -w` -eq 12 ]]
then
one=`uptime |awk '{print $10}' |sed 's/,//g'`
five=`uptime |awk '{print $11}' |sed 's/,//g'`
fifteen=`uptime |awk '{print $12}'`
time=`uptime |awk '{print $1}'`
elif [[ `uptime|wc -w` -eq 13 ]]
then
one=`uptime |awk '{print $11}' |sed 's/,//g'`
five=`uptime |awk '{print $12}' |sed 's/,//g'`
fifteen=`uptime |awk '{print $13}'`
time=`uptime |awk '{print $1}'`
fi

echo "<tr>\n<td>$time</td>\n<td>$one</td>\n<td>$five</td>\n<td>$fifteen</td>\n</tr>" >> /home/haggerty/public_html/load/index.html

newone=`echo $one |awk -F. '{print $1}'`
newfive=`echo $five |awk -F. '{print $1}'`
newfifteen=`echo $fifteen |awk -F. '{print $1}'`

if [[ $newone -gt 15 || $newfive -gt 15 || $newfifteen -gt 15 ]]
then
echo "High Load:\n$one, $five, $fifteen"
fi
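As mentioned above, cron runs this script every 15 minutes and emails me anything it prints to standard out. A rough sketch of the crontab entry (the script name, path, and address here are just placeholders, not my real ones) looks like:

MAILTO=me@example.com
# run the load check every 15 minutes; cron mails anything the job
# prints to standard out, so the "High Load" line becomes the alert
0,15,30,45 * * * * /home/haggerty/scripts/checkload.ksh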

Now since this script is running every 15 minutes for 24 hours, the page will have 96 entries per day. After a week you would have a table with 672 entries. After a month there would be 2688 to 2976 entries depending on the month. That’s a lot of data to be faced with.

So I decided that I would write another script to archive the data at the end of the day. This script copies the contents of index.html to a file named for the date, with an .html extension; for example, the data from Feb 09, 2006 would be in a file called 20060209.html. It then adds a link to the newly created archive to a table of contents page. After creating the new file it clears index.html and recreates the table above.

The archive script runs each night at 23:55 via cron. The script is as follows:


#!/bin/ksh

year=`date +%Y`
month=`date +%m`
day=`date +%d`

echo "<html>\n<head>\n<title>$month/$day/$year Load</title>\n</head>\n<body>\n<h1>$month/$day/$year Load</h1>" > /home/haggerty/public_html/load/archive/$year$month$day.html

cat /home/haggerty/public_html/load/index.html >> /home/haggerty/public_html/load/archive/$year$month$day.html
echo "<li><a href=\"$year$month$day.html\">$month/$day/$year</a></li>" >> /home/haggerty/public_html/load/archive/index.html
echo "<h1>Today's Load on thehaggertys.net</h1>\n<table>\n<tr>\n<th>Time</th>\n<th>One Minute Average</th>\n<th>Five Minute Average</th>\n<th>Fifteen Minute Average</th>\n</tr>" > /home/haggerty/public_html/load/index.html
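The corresponding crontab entry for the nightly archive run (again, the script name and path are only placeholders) would be along these lines:

# archive the day's load table just before midnight
55 23 * * * /home/haggerty/scripts/archiveload.ksh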

I think that’s it. I’ve only had it up and running for about a day and a half now. I’ve already seen loads ranging from 0 to 189. I definitely will not be staying with this company after my contract is up.

If you want to see the scripts in action you can see today’s loads at http://www.thehaggertys.net/load, and archived loads are at http://www.thehaggertys.net/load/archive.


WordPress Upgrade

I am working on a WordPress upgrade. I don’t think it will take very long, but things may look a bit hairy until it is done.

***UPDATE***
The upgrade went flawlessly. I’m now running WordPress version 2.0. If you’re looking to upgrade, I followed this guide:
http://codex.wordpress.org/Upgrading_WordPress

It probably took about 10 minutes to upgrade, including making all of my backups.


Working on a theme

If I’m going to spend the time to have a blog, I might as well customize it. So you may notice things changing and looking different for the next couple of days until everything is worked out.

