In my first article, Primary Steps to Make Your Site Better Search Engine Friendly (SEF), we discussed the basic techniques to optimize a website using meta tags and keywords.
In this article, let us analyze some of the popular strategies to optimize your website URLs.
One of the best ways to build a good URL is to create proper categories and file names for the documents on your website. This keeps your site organized and leads to better crawling by search engines. It also produces simpler, more “user friendly” URLs. Extremely long, cryptic URLs that contain few recognizable words are best avoided.
Example: Article Listing website
This example concerns an article titled ‘Search Engine Optimization’ under the “seo” category. A conventional, auto-generated URL for it would look something like
http://www.site.com/folder3/category/4545624/xbd/00065a.html
http://www.site.com/view-article.php?title=yes&id=1&show=true&comment=7&rating=4
Complex URLs like these can be confusing and unfriendly. Some users assume that a portion of the URL is not required, especially when it contains unrecognizable parameters, and tend to cut off a part of it, breaking the link. A good search engine friendly link, by contrast, should look like
http://www.site.com/articles/seo/search-engine-optimization.html
Some users might link to your page using the URL of that particular page as the anchor text. So if your URL is poor, it will ultimately hurt your search listing.
Let us consider typical search keywords for this URL.
Search Key “seo articles” – this matches the /articles/seo/ portion of the URL.
Search Key “search engine optimization article” – this matches the file name, so the page will get listed.
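If the articles are actually served by a script like the view-article.php seen earlier, URL rewriting can expose the friendly address while the old script keeps working internally. Here is a minimal sketch using Apache mod_rewrite in an .htaccess file; the category and slug parameter names are assumptions for illustration, not part of the original example.

# Hypothetical .htaccess rules: map the friendly URL to the real script
RewriteEngine On
# /articles/seo/search-engine-optimization.html
#   -> /view-article.php?category=seo&slug=search-engine-optimization
RewriteRule ^articles/([^/]+)/([^/]+)\.html$ view-article.php?category=$1&slug=$2 [L,QSA]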
URL Structuring
A site’s URL structure should be as simple as possible. Organize your content so that URLs are constructed in a simple, manageable and logical manner. Try to avoid using long ID numbers.
Let me demonstrate intelligent URL structuring.
Case Study 1
You are searching for information about Search Engine Optimization. The keyword is “Search Engine Optimization”. Look at the following sample links.
- http://en.wikipedia.org/wiki/Search_engine_optimization
- http://www.example.com/index.php?id_sezione=360&sid=3a5ebc944f41daa6f849f730f1
The first URL helps users decide whether or not to click the link. The second URL is far less appealing.
Case Study 2
Let us examine the use of punctuation. In the example below, both URLs are short and well constructed.
- http://www.site.com/articles/seo/search-engine-optimization.html
- http://www.site.com/articles/seo/searchengineoptimization.html
The first URL is readable and much more appealing to visitors. The second is equally short and well formed, but not user-friendly, since the words run together; such URLs do not do well in SEO.
It is recommended to use hyphens (-) instead of underscores (_) in your URLs.
Good URL Design Practices
Let us look at some good practices to follow while creating website URLs.
- Do not fill your page with lists of keywords: This is very important. Almost all search engines use crawler algorithms that can recognize attempts to “cloak” pages or create “crawler only” pages, and this will hurt your search listing. In other words, if your site contains pages, links, or text that you don’t want your visitors to see, remove them; otherwise, Google will consider those links and pages deceptive, producing negative results.
- Do not use images to display important names, content, or links: Some search crawlers do not recognize text contained in graphics.
- Note: Use ALT attributes if the main content and keywords on your page cannot be formatted in regular HTML (see the markup sketch after this list).
- Do not create many copies of a page under different URLs: To ensure that the page you prefer gets included in the search results, block the duplicates from the search engine spiders.
- Avoid hidden text and links: Hiding text or links in your site content can cause your site to be flagged as malicious or spam, since it presents information to search engines differently than to users.
There are many ways to make text hidden:
- Using white text on a white background
- Including text behind an image
- Using CSS to hide text
- Setting the font size to 0
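To make these concrete, here are markup patterns (with made-up content) that search engines penalize, followed by the legitimate ALT-attribute alternative mentioned in the list above.

<!-- Patterns to AVOID: text hidden from visitors but visible to crawlers -->
<p style="color:#ffffff; background-color:#ffffff;">keyword keyword keyword</p>
<p style="display:none;">more hidden keywords</p>
<p style="font-size:0;">zero-size keywords</p>

<!-- The legitimate alternative: describe image content with an ALT attribute -->
<img src="seo-banner.png" alt="Search Engine Optimization article banner" />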
Dynamic URLs
Complex URLs containing multiple parameters can cause problems for crawlers because they create an unnecessarily high number of URLs pointing to the same content. As a result, search engines may be unable to index all your site’s content completely.
Some of the common issues that can occur are:
1. Dynamic generation of documents
Example
- Feature: Show article
  Link: http://www.site.com/view-article.php?title=yes&id=1
- Feature: Show article with control buttons
  Link: http://www.site.com/view-article.php?title=yes&id=1&show=true
- Feature: Show article with controls and comments
  Link: http://www.site.com/view-article.php?title=yes&id=1&show=true&comment=7&rating=4
In the above example, links are dynamically formed depending on the features.
2. Server parameters in the URL
Session IDs, language parameters and the like can create massive amounts of duplication and a greater number of URLs.
3. Sorting parameters
Some popular shopping sites and CMS sites provide multiple ways to sort the same items, resulting in a huge number of URLs.
Example: http://www.site.com/view-article.php?title=yes&id=1&show=true&comment=7&rating=4&search_query=tpb&search_sort=relevance&search_category=25
4. Irrelevant parameters
Irrelevant parameters in the URL, such as referral parameters, click IDs, and ad or affiliate tracking codes, make the URL complex.
Example:
http://www.site.com/view-article.php?title=yes&id=1&show=true&comment=7&rating=4&click=6EE2BF1AF6A3D705D5561B7C3564D9C2&clickPage=Search+Engine+Optimization&cat=79
Steps to resolve Dynamic URL problems
There are some common solutions to avoid problems with dynamic URL structures.
1. Using a robots.txt file
Use a “robots.txt” file to block Googlebot’s access to problematic URLs. Wildcard patterns in your robots.txt file make it easy to block large groups of URLs. We will discuss more on robots.txt later.
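A minimal sketch of such a robots.txt file; the paths and parameter names are assumptions based on the earlier examples.

# Hypothetical robots.txt: keep crawlers away from parameter-heavy URLs
User-agent: *
# '*' is a wildcard supported by major crawlers such as Googlebot
Disallow: /view-article.php?*comment=
Disallow: /*&search_sort=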
2. Avoid Server Parameters
Avoid the use of session IDs in URLs wherever possible. Almost all server-side platforms have integrated options for keeping such parameters out of URLs.
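For example, since the sample URLs in this article point to a PHP script, PHP can be told to keep session IDs in cookies rather than in URLs. A sketch of the relevant php.ini directives:

; Keep session IDs out of URLs by storing them in cookies only
session.use_only_cookies = 1
; Do not rewrite links to embed the session ID
session.use_trans_sid = 0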
3. Avoid unnecessary parameters
Try to shorten URLs by trimming the unnecessary parameters.
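Where some parameters cannot be removed, a related technique (not covered above) is the canonical link tag, which tells search engines which URL is the preferred version of a page. A sketch, assuming the friendly URL from earlier:

<!-- Placed in the <head> of every parameter-laden variant of the page -->
<link rel="canonical" href="http://www.site.com/articles/seo/search-engine-optimization.html" />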
4. Other possible solutions
- If your site has an infinite calendar, add a nofollow attribute to links to dynamically created future calendar pages (see the sketch after this list).
- Check your site for broken relative links.
- Do not send automated queries to any search engines.
- Do not publish links with any malicious behavior.
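As an example of the calendar case in the first point above, here is a hypothetical “next month” link marked so that crawlers do not chase endlessly generated pages.

<!-- rel="nofollow" stops crawlers from following this dynamically generated link -->
<a href="calendar.php?month=next" rel="nofollow">Next month</a>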
Common URL SEO Hacks
So far, we have discussed forming good URLs by restructuring existing complicated or dynamic links. There are also many bad practices (called SEO spamming) that can get your site blacklisted by all search engines.
Let us discuss three important practices that should be avoided.
Link Cloaking
Link cloaking is the practice of presenting different content or URLs to users and to search engines. Serving different results based on the user agent may get your site removed from the Google index.
Examples
- Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
- Serving different content to search engines than to users.
Even if your site contains rich media that crawlers cannot read, such as Flash files, JavaScript, or images, you should not provide cloaked content to search engines. Instead, use the solutions below.
Solution
- Provide alt text that describes images for visitors with screen readers or images turned off in their browsers.
- In the case of JavaScript, provide the textual content of the script inside a noscript tag.
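A brief sketch of both solutions together, with hypothetical content:

<!-- The ALT text keeps the image's meaning available to crawlers and screen readers -->
<img src="ranking-chart.png" alt="Chart showing improvement in search ranking" />

<!-- The noscript tag carries the textual content of the script for non-JS user agents -->
<script type="text/javascript">
  document.write("Top SEO tips");
</script>
<noscript>
  <p>Top SEO tips</p>
</noscript>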
JavaScript redirects (Link Masking)
Using JavaScript redirects is poor web practice. When a bot indexes a page containing JavaScript, it indexes that page without following or indexing the links hidden in the script. When a redirect is embedded in the JavaScript, the search engine indexes the original page rather than following the link, while users are taken to the redirect target anyway.
Example
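A minimal sketch of the kind of redirect described above, with hypothetical URLs, shown purely to illustrate what to avoid:

<!-- AVOID: the crawler indexes this page but never discovers the target -->
<script type="text/javascript">
  window.location.href = "http://www.site.com/actual-destination.html";
</script>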
Doorway pages
Doorway pages are huge sets of low quality pages, where each page is optimized for a specific keyword or phrase. In most cases, doorway pages are written to rank for a particular phrase and then to direct users to a single destination.
Google’s aim is to provide the most valuable and relevant search results to its users. Doorway pages manipulate search engines and cheat users by directing them to sites other than the ones they selected. Google takes action against sites that use these deceptive practices, including removing them from the Google index.
Concept of Backlinks (Inbound links)
Backlinks are incoming links to a web page, also known as inbound links, inlinks, or inward links. In SEO, the number of backlinks is an indication of the popularity or importance of a website.
Note :
A site with a lot of backlinks is one that many other sites have found worth linking to.
Why are inbound links so powerful?
Inbound links are important because the way search engines store a page’s data and process a search query is closely related to the number of inbound links. Every search engine maintains several levels of page indexes: one database may store the words found in link (anchor) texts, another the page’s meta title, tags and keywords, and a separate, larger one all the other words, headings and content on a page. Since anchor text from inbound links feeds directly into these indexes, the number of backlinks plays an important role in increasing page hits.
How do we increase inbound links and Page Rank?
There are a number of effective ways to achieve this. Let us look at some of the commonly accepted possibilities.
Directories
Directories usually provide one-way links to websites. Submitting to directories is time-consuming, but there are directory submission services that can increase your site traffic.
Forums
Join forums and place links to your site(s) in the signature area. Use your main search terms as the link text. There are a few things to note in this case:
- Make sure that the forum is spiderable
- Make sure that the URLs do not have session IDs.
- Also make sure that links are not hidden from spiders (avoid JavaScript redirection).
Link exchange centers
Join a free link exchange center like LinkPartners.com, where you will find a categorized directory of websites that want to exchange links. Do not join link farms: even though they are excellent for building up link popularity and PageRank, search engines disapprove of them.
Link exchange requests
Search on Google for your main search terms and find competing websites. Find the sites that link to them and request a link exchange. You can find a site’s inbound links on Google by searching for “link:www.competitor-domain.com”.
PageRank Explained
PageRank (PR) is a number that denotes the importance of a page on the Web; it is Google’s way of deciding a page’s importance. When one page links to another, a vote is cast for the latter. Google calculates a page’s importance from the total votes cast for it.
To calculate the PageRank for a page, all of its inbound links are considered. The equation used to calculate page rank is
PR(A) = (1-d) + d(PR(t1)/C(t1) + … + PR(tn)/C(tn))
t1 – tn: Pages linking to page A.
C: number of outbound links that a page has.
d: damping factor, usually set to 0.85.
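To see the formula in action, here is a minimal JavaScript sketch that iterates the equation over a made-up three-page link graph (the page names and links are assumptions for illustration).

// Hypothetical link graph: each page maps to the pages it links out to
var links = { A: ["B", "C"], B: ["C"], C: ["A"] };
var d = 0.85;                              // damping factor
var pages = Object.keys(links);
var pr = {};                               // start every page at PR = 1
pages.forEach(function (p) { pr[p] = 1.0; });

for (var i = 0; i < 50; i++) {             // iterate until the values settle
  var next = {};
  pages.forEach(function (page) {
    var sum = 0;
    pages.forEach(function (t) {
      if (links[t].indexOf(page) !== -1) {
        sum += pr[t] / links[t].length;    // PR(t) / C(t)
      }
    });
    next[page] = (1 - d) + d * sum;        // PR(A) = (1-d) + d(...)
  });
  pr = next;
}
console.log(pr);                           // approximate PageRank of A, B and C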
The maximum amount of PageRank in a site increases as the number of pages in the site increases.
URL Optimization Tools
1. Backlink checking Tool from Google
You can perform a Google search using the “link:” operator to find a sampling of links to any site. For instance, [link:www.google.com] will list web pages that have links pointing to the Google home page. Note that there can be no space between “link:” and the web page URL.
2. Backlink checker Webmaster Tools
Google Webmaster Tools has options to measure the links to your site. Here is how it works.
- On the Webmaster Tools Home page, click the site you want.
- Under Your site on the web, click Links to your site.
Relevant Links: http://www.google.com/webmasters/tools
3. Link Extractor
This tool helps you extract links from a specific web page along with the associated anchor text, HTML code, attributes and Google PageRank. Using this tool, the number of inbound links, outbound links, etc., can also be found.
4. Page Rank Checker
The results will show the PageRank of a particular page.
5. Link popularity
The backlink checker generates a complete site analysis, e.g., the number of backlinks, indexed pages and Google PageRank. It also determines whether your domain is listed in the DMOZ and Yahoo! directories.
About the Author :
Vipindas works as a Senior Software Engineer at Bobcares. He joined Bobcares in September 2006, and is an expert in developing web-based Java and PHP applications. He is highly passionate about software architectures and design patterns. On the cultural front, Vipin is a great singer and loves to tap his feet to any music on stage.