
What are robots, spiders, and crawlers?

Robots, spiders, and crawlers are interchangeable terms for the automated programs that search engines run. These programs visit a website, follow its links, and build a summary of its content, enabling Google and other search engines to index it. As a result, when a person searches for a relevant keyword, that website has a chance to appear in the results.

A robots.txt file is not required for a website to be crawled; by default, Googlebot and other search engine spiders will crawl any page they can reach. The robots.txt file is a plain-text file placed at the root of the domain (not inside the HTML) that tells compliant bots which parts of the site they may and may not crawl. Pages blocked there will not be crawled by well-behaved bots.
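A minimal robots.txt sketch illustrating the convention described above (the `/private/` path and sitemap URL are hypothetical examples):

```txt
# Applies to all crawlers
User-agent: *
# Block one directory; everything else remains crawlable
Disallow: /private/

# Optionally point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` line (or no robots.txt at all) permits crawling of the entire site.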

Text Crawling vs. Images and Video

With so many websites competing for the same queries, exact text matching alone rarely decides rankings; search engines instead use complex algorithms to rank potential matches for a given keyword. Bots also read only the text on a page, not the pixels in media, so images and videos need supplementary keyword-infused HTML if you want them to rank for certain terms. Images require descriptive alt text and keyword-rich file names that indicate what the image shows, and you should do the same for video titles and descriptions before uploading them to YouTube and other video sites.
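For example, an image tag with a keyword-rich file name and descriptive alt text might look like this (the file name and alt text are hypothetical illustrations):

```html
<!-- File name and alt text both describe what the image shows -->
<img src="blue-widget-pro-side-view.jpg"
     alt="Blue Widget Pro shown from a side angle">
```

Crawlers index the `src` file name and `alt` attribute as text, which is what allows the image itself to rank in image search.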



