Seekbot checks your web page.
Seekbot is a lightweight version of the Seekport Crawler. It was developed to help webmasters check and adapt their web pages so that they are built to a good standard for search engines. This helps them gain a good ranking on Seekport and other search engines.
Seekport, like other search engines, collects its information with a 'Spider', also known as a 'Bot' or 'Crawler'. We will refer to it as a 'Crawler'. Crawlers trawl the Internet gathering information on new resources. They work by moving from one link to the next and 'reading' the content of each web page, much like a normal web user does. However, there are major differences between how a user and a Crawler read that information. The main differences are detailed below.
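The page does not publish Seekbot's internals, so as an illustration only, here is a minimal sketch of the link-following idea described above: a parser reads a page's HTML the way a crawler would, collecting the visible text and the links it would follow next. The example page is invented for demonstration; Python's standard-library HTMLParser stands in for a real crawler's parser.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets and visible text, roughly as a crawler would."""

    def __init__(self):
        super().__init__()
        self.links = []       # URLs the crawler would queue for its next visits
        self.text_parts = []  # the page text as the crawler 'reads' it

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        stripped = data.strip()
        if stripped:
            self.text_parts.append(stripped)

# A made-up page standing in for a fetched document.
page = """<html><body>
<p>Welcome to our <a href="/products.html">products</a> page.</p>
<a href="http://example.com/partner">Partner site</a>
</body></html>"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)                  # the links a crawler would follow next
print(" ".join(parser.text_parts))  # the text it 'read' on the way
```

A real crawler would fetch each collected link in turn and repeat the process, which is how it moves across the web from page to page.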
How does it work?
It's very simple:
Seekbot analyses all elements within the source code of the selected web page that are important for search engines. This lets webmasters see their web page the way a search engine's Crawler reads it.
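Which elements Seekbot inspects is not specified here, so purely as a hypothetical sketch, the following shows the kind of check such a tool might perform: pulling the title, meta description, and headings out of a page's source, since these are commonly weighed by search engines. The sample page and the class name are invented for the example.

```python
from html.parser import HTMLParser

class SeoElements(HTMLParser):
    """Extracts a few tags commonly examined by search engines (illustrative only)."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.headings = []
        self._current = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "")
        elif tag in ("title", "h1", "h2"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in ("h1", "h2"):
            self.headings.append(data.strip())

# A made-up page to run the check against.
page = """<html><head>
<title>Acme Widgets</title>
<meta name="description" content="Widgets for every need.">
</head><body><h1>Welcome</h1></body></html>"""

checker = SeoElements()
checker.feed(page)
print(checker.title, checker.description, checker.headings)
```

A report built from fields like these would show a webmaster which of the page's search-relevant elements are present, empty, or missing.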
Seekbot is a free search-engine compatibility checker. Try it!
© 2006 Seekport Internet Technologies UK Ltd
Seekport Internet Technologies GmbH