
SEO for a Webhost?

I’m sure you’ve all read about how important it is for your customers to optimize their sites for search engines. But have you realized that, as a Webhost, you too play an important part in getting better search engine rankings for the sites you host? Your customers may be putting a lot of effort into optimizing their sites, but without your help, they won’t get the results they expect! Let’s take a look at how you can affect their rankings.

Server Uptime

Yes, an important one for obvious reasons. Search engine bots crawl the web at various points throughout the day. So if your server happens to be down at that time, your sites do not get crawled, and you will have to wait until the next time the bots are back. Hopefully you’ll have your server back online by then. If the bots are repeatedly unable to reach your sites, they could drop your server from their list, and it won’t be crawled again. Check out these posts on various monitoring software you could use to monitor your servers: Nagios, Icinga, Zenoss.
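Alongside a full monitoring suite, even a small script run from cron can alert you before the bots notice an outage. Here is a minimal sketch in Python using only the standard library; the URL and timeout values are illustrative, not a recommendation:

```python
import urllib.request
import urllib.error

def is_healthy(status):
    """Treat 2xx and 3xx responses as 'up'; anything else as 'down'."""
    return 200 <= status < 400

def check_url(url, timeout=10):
    """Return True if the URL answers within the timeout with a healthy status.

    Connection failures, DNS errors, and timeouts all count as 'down'.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return is_healthy(resp.status)
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    # Hypothetical hosted site -- replace with the sites on your server.
    print(check_url("http://www.example.com/"))
```

You would typically loop this over every site you host and send a mail or pager alert when a check fails repeatedly, rather than on a single miss.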

The Robots.txt file

The robots.txt file is used to manage the way search engine bots crawl the sites on your server. You can offer an optimized robots.txt file as part of your initial site setup. Using it, you can determine which bots to allow and which to block, and which folders a bot should crawl and which it shouldn’t. Read more about robots.txt here.
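As a quick illustration, here is what such a file might look like, verified with Python’s standard-library robots.txt parser. The bot name and paths are purely illustrative:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt a host might ship with new site setups.
# "BadBot" and the disallowed folders are hypothetical examples.
ROBOTS_TXT = """\
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "/index.html"))  # allowed by the "*" rules
print(rp.can_fetch("Googlebot", "/tmp/cache"))   # blocked folder
print(rp.can_fetch("BadBot", "/index.html"))     # blocked bot
```

A well-behaved crawler reads this file before fetching anything else, so a sensible default robots.txt keeps bots out of folders that should never appear in search results.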

Search Engine Blacklisting

Yes, search engines can blacklist you too. They can blacklist based on domain name, IP address, or both. If it’s a shared IP address, all sites on it will be affected. There are many practices that search engines consider a big “no-no”. If you are found to be performing any of these, they will blacklist you and stop crawling your sites. Ensure that none of the sites you host are involved in activities like:

  • Mirror Websites: Identical sites with different URLs
  • Doorway Pages: Pages optimized to rank highly in search engines, while visitors to the page are redirected elsewhere.
  • Submitting pages too often: Most search engines only allow sites to be submitted once in 30 days. Submitting a site more often than that can get it blacklisted.
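Mirror sites in particular are something a host can spot proactively. A rough sketch of the idea, assuming you can fetch each hosted site’s homepage HTML (the function names and sample domains here are hypothetical):

```python
import hashlib

def content_fingerprint(html):
    """Hash whitespace-normalized, lowercased content so byte-identical
    mirrors collapse to the same key."""
    normalized = " ".join(html.split()).lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_mirrors(pages):
    """Group hosted domains by page fingerprint. Groups containing more
    than one domain are candidate mirrors worth a manual look."""
    groups = {}
    for domain, html in pages.items():
        groups.setdefault(content_fingerprint(html), []).append(domain)
    return {fp: doms for fp, doms in groups.items() if len(doms) > 1}

# Illustrative data: two identical pages on different domains.
pages = {
    "a.example": "<h1>Hello</h1>",
    "b.example": "  <h1>HELLO</h1>  ",
    "c.example": "<h1>Other</h1>",
}
mirrors = find_mirrors(pages)
print(mirrors)
```

This only catches exact duplicates after trivial normalization; real mirror detection would need fuzzier matching, but it is enough to flag the obvious cases for review.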


About the Author:

Hamish works as a Senior Software Engineer at Bobcares. He joined Bobcares in July 2004, and is an expert in control panels and operating systems used in the Web Hosting industry. He is highly passionate about Linux and is a great evangelist of open source. When he is not on his Xbox, he is an avid movie lover and critic.


