Web Crawler Features
Raptor’s Web Crawler launched at the beginning of 2019 and has undergone significant updates ever since. It now boasts an ever-growing list of features and benefits with a wide range of applications to meet the needs of our customers.
Our web crawler is designed for SEO but can be used in a range of other applications. Check out the features we offer to see how our software can serve you.
Advanced Web Crawling
Our web crawler can crawl almost any site, and it does so in several unique and powerful ways, each of which lends distinct benefits over the competition.
Cloud-Based Web Crawling
Using a cloud-based web crawler means that you can log in to Raptor from any device and any location. You are not limited by users or devices: you can start a crawl from your mobile phone and export the report on your laptop.
Accessing our tools through web browsers reduces the potential for compatibility issues, runtime errors and problems with OS (Operating System) updates. Because you don’t need to install anything, you can always switch to another device if your computer throws up a blue screen of death!
Lots of Lovely Data!
We scrape a lot of data from every accessible location of a website, including the robots.txt file, XML sitemaps, HTML pages, and other file types.
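As a rough illustration of what discovering a site's crawlable locations involves, the sketch below parses a robots.txt body with Python's standard library to pull out sitemap URLs and per-path crawl permissions. This is a minimal, generic example, not Raptor's implementation, and the URLs are made up.

```python
from urllib.robotparser import RobotFileParser

# An illustrative robots.txt body, as a crawler might fetch it from
# https://example.com/robots.txt (hypothetical site).
robots_txt = """\
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Sitemap URLs declared in robots.txt point the crawler at the site's page list.
print(rp.site_maps())  # ['https://example.com/sitemap.xml']

# Crawl permissions per path, for the wildcard user agent.
print(rp.can_fetch("*", "https://example.com/private/x"))  # False
print(rp.can_fetch("*", "https://example.com/public"))     # True
```

In practice a crawler would fetch robots.txt over HTTP first, then follow each sitemap to enumerate the pages and resources to visit.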
We Scrape Everything You Need
In terms of what you need to audit, evaluate and optimise a site from an SEO perspective, we scrape it all! Our software will provide you with all the data below and much more:
Google Analytics & Tag Manager
Social Media Tag Data
Page Speed Data
Our web crawler performs various checks and analyses to automate some of the more mundane manual tasks. For example, among a range of checks, we can tell you all of the following:
- If a page or resource is listed in a sitemap
- If a page is canonical or non-canonical
- If a page or resource is indexable or not
- If a page has metadata errors or issues
- If a page has nofollow links pointing to it
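To give a sense of how checks like these work under the hood, here is a minimal sketch using Python's standard-library HTML parser to pull a page's canonical URL, robots meta directives, and nofollow links from its markup. It is a generic illustration under simplified assumptions (one page, inline HTML), not Raptor's implementation.

```python
from html.parser import HTMLParser

class SeoTagParser(HTMLParser):
    """Collect the canonical link, robots meta directives and nofollow links."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = []
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            # e.g. content="noindex, follow" -> ["noindex", "follow"]
            self.robots += [d.strip() for d in attrs.get("content", "").split(",")]
        elif tag == "a" and "nofollow" in attrs.get("rel", ""):
            self.nofollow_links.append(attrs.get("href"))

# Hypothetical page markup for illustration.
html_doc = """
<html><head>
<link rel="canonical" href="https://example.com/page">
<meta name="robots" content="noindex, follow">
</head><body>
<a href="/promo" rel="nofollow">Promo</a>
</body></html>
"""
parser = SeoTagParser()
parser.feed(html_doc)
print(parser.canonical)            # https://example.com/page
print("noindex" in parser.robots)  # True -> page is not indexable
print(parser.nofollow_links)       # ['/promo']
```

A page whose robots directives include "noindex", or whose canonical URL points elsewhere, would be flagged as non-indexable or non-canonical respectively.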
For larger sites, scalability means that you can set a crawl going and forget about it; there is no need to keep your computer running or stay logged in. Because we take advantage of autoscaling groups, task queues and advanced back-end technology, we can crawl sites at a high rate.
Whether you are crawling a small 500-page site or a large 1-million-page site, our software can spool up servers to handle the load.
Another advantage of using cloud-based technology to deliver our SEO tools is that we efficiently store historical data, past crawls and old projects. All historical crawl data is accessible at the click of a button, allowing you to export that data or run audits on it at any time.
Compared to desktop web crawlers, where you need to store CSV files locally, there is no risk of the data being lost or difficult to find. So, if you need the crawl data from 8 months ago, before a website migration, you can just log in and download it.
We provide you with a range of reports and data exports of crawl data; you can see the details of these features below.
The SEO report is provided in an Excel (XLSX) format, which means that we can format the rows and columns as well as segment the data into multiple tabs / worksheets.
This allows the data to be more easily read, digested and analysed, saving you time when sending it on to web developers and clients.
The SEO report can be used for various processes such as technical site auditing or optimising a site for target keywords.
If you run this report on competitor sites, you can use the data to scrape competitor keywords and build an image of their site’s content and performance.
Broken Links Report
Running a broken links report is a very common practice in SEO but is also useful for PPC / SEM and other paid advertising to ensure that landing pages are accessible.
The broken links report that Raptor provides gives you the location of every broken link and the status code each link returns.
This data is used to find and repair broken links throughout a site, which is especially useful if done before Google finds them. Fixing broken links improves both the user experience and the accessibility of a site’s content.
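The core of any broken links report is pairing each link's HTTP status code with the pages it was found on. The sketch below shows that shape in generic Python; the `fetch_status` callable and all URLs are hypothetical stand-ins (in a real crawler it would issue HTTP requests), and this is not Raptor's implementation.

```python
def broken_link_report(links, fetch_status):
    """Return the links whose status code indicates an error (>= 400),
    along with every page location where each broken link was found.

    `links` maps a link URL to the set of pages it appears on;
    `fetch_status` is any callable returning an HTTP status code.
    """
    report = []
    for url, found_on in links.items():
        status = fetch_status(url)
        if status >= 400:
            report.append({"url": url, "status": status,
                           "found_on": sorted(found_on)})
    return report

# Stubbed statuses stand in for real HTTP responses.
statuses = {"https://example.com/ok": 200,
            "https://example.com/gone": 410,
            "https://example.com/missing": 404}
links = {"https://example.com/gone": {"/blog", "/about"},
         "https://example.com/ok": {"/"},
         "https://example.com/missing": {"/blog"}}

report = broken_link_report(links, statuses.get)
for row in sorted(report, key=lambda r: r["url"]):
    print(row["url"], row["status"], row["found_on"])
```

Recording the "found on" locations is what makes the report actionable: it tells you which pages to edit, not just which URLs are dead.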
This report comes in an Excel (XLSX) format, and as such we segment the data into multiple tabs to make it easier to read, assess and act on.
Custom CSV Exports
As mentioned, we provide a slew of data available for export. You can export this into a CSV file, which is a document without any formatting; as such it typically requires some re-jigging to be readable, but it is the simplest format for the data.
You can also choose which data to include in the file, meaning that you can limit it to just the data you need rather than exporting everything. Exactly like the SEO report, this data can be used for various SEO processes.
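Column-limited CSV export is straightforward to picture: pick the fields you want and write only those. Here is a minimal generic sketch using Python's standard `csv` module, with made-up crawl rows for illustration; it is not Raptor's export code.

```python
import csv
import io

def export_csv(rows, columns):
    """Write only the chosen columns of the crawl data to CSV text."""
    buf = io.StringIO()
    # extrasaction="ignore" silently drops any fields not in `columns`.
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical crawl data rows.
crawl_data = [
    {"url": "https://example.com/", "status": 200, "title": "Home", "word_count": 412},
    {"url": "https://example.com/about", "status": 200, "title": "About", "word_count": 180},
]

# Export just the URL and status columns, leaving the rest out.
print(export_csv(crawl_data, ["url", "status"]))
```

Limiting the export to the columns you need keeps the file small and spares you the clean-up step before sharing it.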
The image report is also an Excel file (XLSX) and contains:
- A list of all images found during the crawl of a site
- The location of those images
- The alt tag text used on every image (per location)
- The file types
- The file names
- Details of the XML sitemaps where images are listed
This report is useful if you want to optimise images for target keywords, as both filenames and alt tag text can be optimised.
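To show the kind of data that goes into an image report, the sketch below extracts each image's source, file name, file type and alt text from a page's markup using Python's standard-library HTML parser. The page URL and markup are hypothetical, and this is a generic illustration rather than Raptor's implementation.

```python
from html.parser import HTMLParser
import os

class ImageAuditParser(HTMLParser):
    """Record each image's location, src, file name, file type and alt text."""
    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "")
        name = os.path.basename(src)
        self.images.append({
            "location": self.page_url,   # the page the image was found on
            "src": src,
            "file_name": name,
            "file_type": os.path.splitext(name)[1].lstrip("."),
            "alt": attrs.get("alt", ""),  # an empty alt is itself a finding
        })

parser = ImageAuditParser("https://example.com/products")
parser.feed('<img src="/img/red-widget.png" alt="Red widget">'
            '<img src="/img/logo.svg">')
for img in parser.images:
    print(img["file_name"], img["file_type"], repr(img["alt"]))
```

Rows with an empty alt value are exactly the ones an image audit would flag for optimisation.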
Our web crawler tool provides various data, charts and visualisations to help perform audits of sites.
Technical / SEO Audit
The technical SEO auditing section allows you to see a range of data specific to typical SEO audits. We include a range of charts that show the distribution of pages by word count or page load time, with a quick and easy download function. This enables you to identify problem pages and export the data specific to those issues with ease.
All charts can also be downloaded in a range of image formats for use in reports.
Raptor displays a range of content information such as the distribution of pages by word count. We also filter this data to show you the distribution of canonical pages by word count. The images report and distribution charts for images work in the same way as word count.
This data provides a very clear picture of the content across a website and on each page of the site.
The competitor analysis section has multiple functions. Adding your competitors to a project allows us to compare your site to theirs, which is great for benchmarking and for identifying the strengths and weaknesses of your site in the competitive landscape.
This data is also very useful when pitching to clients or demonstrating the necessity of certain SEO work. Understanding where to direct your efforts, whether it’s content, page speed, or backlinks, is vital to delivering results.