Sourced from imakenews.com

Whether your company is building a website from scratch or looking to gain visibility and boost traffic on its current site, it is important to evaluate how “search engine friendly” the site is. In other words, does the underlying structure of your site encourage search engine spidering, or does it hinder the process?

Below are key factors to consider for improving and maintaining the search engine friendliness of your site.

Robots.txt – Can you even be found?

One of the easiest ways to ensure that your site can be crawled is to check your “robots.txt” file. This file tells search engine robots (also called spiders) which parts of your site they may crawl and gather for indexing; spiders will crawl a site that has no robots.txt file, but a misconfigured one can shut them out entirely. The file is also helpful when you want the robots to stay away from specific files or directories on your site, making it an effective way to avoid being penalized for duplicate content.
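
For example, a minimal robots.txt file placed in the root directory of your site (the directory names below are hypothetical) might look like this:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/

The “User-agent: *” line applies the rules to every spider, and each “Disallow” line names a path the spiders should skip; a file containing only “User-agent: *” and an empty “Disallow:” line explicitly allows the whole site to be crawled.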

Duplicate Content – Are you spamming?

If a search engine spider finds duplicate content on your site, it may treat the duplicates as spam and penalize the site, or in extreme cases, ban it completely. It is important that every page on your site has focused, unique content, and that you block spiders from unavoidable duplicates, such as printer-friendly versions of pages, with the robots.txt file.
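
For instance, if printer-friendly duplicates of your articles live in their own directory (the path below is hypothetical), a robots.txt entry like this keeps spiders out of them:

    User-agent: *
    Disallow: /print/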

Cookie Implementation – Are you making them mandatory?

Cookie detection is used by many ecommerce sites to recognize users on a return visit. Having cookie detection activated on your site is not a bad thing in and of itself, but making cookies mandatory should be avoided: search engine spiders are unable to accept cookies and consequently will not be able to index content from your site.

Dynamic URLs – Do you have query strings (URLs) that contain %, ?, +, = or &?

Search engine spiders associate these characters with dynamic content. Many search engines are reluctant to index dynamic URLs because the same content can be delivered over and over under different query strings, dominating returned search results and making them inaccurate. There are multiple solutions for handling sites with dynamic content; one workaround is to make URLs appear static by using URL-rewriting software that hides these symbols. However, the best thing to do is to consult with your SEM vendor to find the solution that best fits your website.
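
As a sketch of the URL-rewriting approach, a site running Apache with mod_rewrite could add rules like the following to the .htaccess file in its document root (the script name catalog.php and its item parameter are hypothetical). A request for /products/42.html is then served by the dynamic script without the spider ever seeing a query string:

    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ catalog.php?item=$1 [L]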

404 Error Page – Are you trapping the spider?

Generic to all websites, a 404 error page simply states that the page cannot be found. After hitting this page, a spider will make no further attempt to crawl the site for the correct content, and the missing page will be dropped from the index along with any rankings associated with it. By implementing a custom 404 error page with text navigation, such as a link to a site map, the spider is prompted to continue crawling other areas of the site. Just make sure the custom page still returns a true 404 status code, so spiders do not index the error page itself.
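
As a minimal sketch, assuming an Apache server, a single configuration line (in httpd.conf or .htaccess) points the server at a custom error page; the file name not-found.html is hypothetical:

    ErrorDocument 404 /not-found.html

The error page itself can then carry plain text links for the spider to follow:

    <html>
    <body>
    <p>Sorry, the page you requested cannot be found.</p>
    <p><a href="/">Home</a> | <a href="/sitemap.html">Site Map</a></p>
    </body>
    </html>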

Site Maps – Do you provide a guided tour to your site?

Site maps act as an alternative way for search engine spiders to find content as they crawl through the site. The site map is a guide to the website, linking to all of the site's content from a single page.
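
A site map can be as simple as one HTML page of plain text links; the page names below are hypothetical:

    <h1>Site Map</h1>
    <ul>
      <li><a href="/index.html">Home</a></li>
      <li><a href="/products.html">Products</a></li>
      <li><a href="/support.html">Support</a></li>
      <li><a href="/contact.html">Contact Us</a></li>
    </ul>

Linking to this page from every page of the site (the footer is a common spot) gives the spider a route to every piece of content.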

Secure Pages (“https”) – Are you hiding valuable content?

Particularly for ecommerce sites, secure pages are crucial in making your visitors feel safe when conducting business on your site. However, many search engine spiders will not crawl secure pages, so it is critical not to place valuable content on them. Reserve “https” for the pages that truly need it, such as checkout and account pages.

Lack of Content – How relevant is your page?

You work with your SEM vendor to give each page on your site its own identity, but do you back up that identity with evidence? Search engine spiders look for a consistent theme when evaluating the importance of a page, and they give preference to pages with the most appropriate content for the identified theme.

Broken Links – Are you telling the spider to “stop”?

Search engine spiders may stop indexing a site when they encounter a broken link, leaving everything beyond it uncrawled. Make sure the links on every page of your site work.

Frames – Are you invisible?

Search engine spiders are not always able to navigate through a frameset's source code in the underlying HTML, so any content contained within the frames may be invisible to spiders. If possible, avoid using frames entirely; if you must use them, provide an alternative that spiders can read, as shown below.
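
The noframes element is the standard HTML fallback for this: it gives spiders (and visitors whose browsers cannot display frames) something to read. A minimal sketch, with hypothetical file names:

    <frameset cols="20%,80%">
      <frame src="menu.html">
      <frame src="content.html">
      <noframes>
        <body>
          <p>This site uses frames. Please visit the
          <a href="sitemap.html">site map</a> for a full list of pages.</p>
        </body>
      </noframes>
    </frameset>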

Flash – Bad for spiders

“Flash” graphics are often used to enhance the website’s visitor experience; however, most search engine spiders can neither read the content nor follow the links embedded in them. If possible, avoid using Flash for important content, or pair it with an HTML alternative, to ensure your site is search engine friendly.
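
One common workaround is to place indexable HTML text inside the object tag as alternative content; spiders and non-Flash browsers see the text, while Flash-capable browsers see the movie. The file name and copy below are hypothetical:

    <object type="application/x-shockwave-flash" data="intro.swf"
            width="400" height="300">
      <param name="movie" value="intro.swf">
      <p>Welcome to Example Co., makers of fine widgets.
      <a href="/products.html">Browse our product line</a>.</p>
    </object>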

Conclusion

Taking these factors into consideration when developing or updating a site will improve its ability to rank well in the major search properties and boost page traffic. No single recommendation will ensure visibility and traffic by itself, but attending to all of these factors will help ensure that the content of your site can be crawled and indexed by search engine spiders.