A crawler, also known as a web crawler or spider, is an automated program that systematically browses the web, visiting websites and analysing their content. Search engines such as Google use crawlers to discover new and updated content so that it can be indexed and presented in search results. Crawlers move from page to page by following links and collect information that is then used to index and rank those pages.
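To make the idea concrete, here is a minimal sketch of such a crawler using only the Python standard library. It visits pages breadth-first from a seed URL, stores each page's HTML as a stand-in for a real search index, and queues the links it finds for later visits. The seed URL, page limit, and the absence of politeness features such as robots.txt handling or rate limiting are simplifying assumptions for illustration, not how a production search-engine crawler works.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Visit pages breadth-first from seed_url, following links between them."""
    queue = deque([seed_url])
    visited = set()
    index = {}  # url -> raw HTML; stands in for a real search index

    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)

        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load

        index[url] = html  # "analyse" the content: here we simply store it

        # Extract links and queue absolute http(s) URLs for later visits.
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).scheme in ("http", "https"):
                queue.append(absolute)

    return index


if __name__ == "__main__":
    # Hypothetical starting point; any reachable URL would do.
    pages = crawl("https://example.com", max_pages=5)
    print(f"Crawled {len(pages)} page(s)")
```

The breadth-first queue mirrors how a crawler expands outward from known pages: each fetched page both contributes content to the index and supplies new links to explore.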