Search engine robots and accessibility

Search engines view content on the web using simple browsers called "robots". Robots crawl the web at a fantastic speed. They can do this because they ignore much of the content that you or I would see - for example, a robot can't understand an image, so it saves a lot of time by not downloading images at all.

Having some content ignored by robots isn't a problem if you also provide robot-friendly content, but sometimes a robot can't even navigate through a site. That is when you need to understand the limitations of robots.

The limitations of robots


JavaScript

Robots cannot follow JavaScript. This means that they can't follow links in script-driven drop-down menus or image maps. If you use drop-down menus, always make sure that your entire site can also be navigated using redundant text links.
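As a sketch (the page names here are made up), a script-driven menu alongside redundant text links might look like this:

```html
<!-- A JavaScript drop-down menu: a robot cannot follow these links -->
<select onchange="location = this.value;">
  <option value="products.html">Products</option>
  <option value="contact.html">Contact</option>
</select>

<!-- Redundant text links, e.g. in the page footer: robots can follow these -->
<p>
  <a href="products.html">Products</a> |
  <a href="contact.html">Contact</a>
</p>
```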


Images

The robot will not download pictures; it will only read text. The heading at the top of your page would probably look much nicer as a picture, but it will be much more useful for search engine ranking as heading text.

Robots can, however, read the alternative "alt" text for images. If you use images as links, include "alt" text that briefly describes the page you are linking to.
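For example (the file names are made up), an image link with descriptive "alt" text might look like this:

```html
<!-- The "alt" text tells the robot what the linked page is about -->
<a href="widgets.html">
  <img src="widgets-button.gif" alt="Our range of widgets">
</a>
```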


Flash

Flash is the same story as images: robots cannot read it. However, the accessibility options introduced with Flash MX go some way towards providing equivalent text content to feed robots.

Some sites have Flash detection pages that direct visitors without Flash to a page where they can download it. Robots will never choose to download Flash, so you effectively refuse them entry. Visitors without Flash will usually have deliberately chosen not to install it, so the redirection pages are annoying to them as well - they would far rather just view your site.


Cookies

Robots will reject cookies. Most web applications that serve cookies will still let in a visitor who rejects a cookie. Cookies are only a problem if they are compulsory.


URL variables

Many dynamic web sites append variables to the page address in the form: ?search=widgets.

Google can now follow links with up to 3 such variables, but won't follow links with 4 or more. If your pages have more than 3 variables, you should investigate URL rewriting (such as Apache's mod_rewrite), which changes the link into a static-looking link with no variables.
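As a sketch of the idea on an Apache server (the URL pattern, script name, and variable names below are purely hypothetical):

```apache
# Hypothetical .htaccess sketch using mod_rewrite: the static-looking
# address /widgets/blue.html is rewritten internally to a dynamic page
# with two query-string variables.
RewriteEngine On
RewriteRule ^([a-z]+)/([a-z]+)\.html$ search.php?category=$1&colour=$2 [L]
```

Robots see and index the clean address; the dynamic script still receives its variables.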

Validating a site for robot accessibility

The easiest way to check if a site is accessible to robots is to use a very basic browser with the same limitations as search engine robots.

There are a number of search engine emulators that are free to use. Alternatively, you can download a text browser and view your web pages without images or JavaScript - for example, the Lynx text browser for Windows.

If you can navigate the site with this browser then a search engine robot will also be able to navigate the site.
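To make the idea concrete, here is a minimal sketch in Python of a robot's-eye view of a page, using the standard html.parser module: it keeps the visible text and plain link hrefs, and ignores scripts and images. The page snippet and class name are purely illustrative - a real search engine robot is far more sophisticated.

```python
from html.parser import HTMLParser

class RobotView(HTMLParser):
    """Collects what a very simple robot could see: text and plain links."""

    def __init__(self):
        super().__init__()
        self.links = []   # hrefs a robot could follow
        self.text = []    # text a robot could index
        self._skip = 0    # depth inside <script>/<style> content

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        # <img> tags are simply ignored, like a robot ignoring pictures

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

page = """
<h1>Widgets</h1>
<script>openMenu('products.html');</script>
<img src="logo.gif">
<a href="products.html">Products</a>
"""

view = RobotView()
view.feed(page)
print(view.links)  # ['products.html'] - only the plain text link is found
print(view.text)   # ['Widgets', 'Products'] - headings and link text survive
```

If every page of a site is reachable through the links such a stripped-down parser reports, a robot should be able to navigate it too.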

VORD Web Design