SEO Site Monitoring

When fruit goes bad it rots, telling you something is wrong and needs throwing away. Your website is worked on by lots of different people; monitoring similarly lets you know when something is broken.

What is Site Monitoring and what is involved?

Modern enterprise websites often release code changes several times a day, preferring to tolerate the odd mistake rather than take a long time finding it. My site monitoring platform allows sites with millions of URLs to be monitored daily, and includes a helpful Looker report for stakeholders plus all the data required to replicate the issues it finds.
Crawling

Crawls are set up and run at an hourly, daily or weekly cadence to find and aggregate issues on the site.
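As a rough sketch of what a scheduled crawl involves, the Python below fetches every URL in a sitemap and records basic health signals to a dated file. The sitemap URL and the columns recorded are placeholders rather than the actual pipeline; in production a scheduler fires the crawl at the agreed cadence.

```python
import csv
import datetime
import re

import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder


def crawl_once() -> None:
    """Fetch every sitemap URL and record basic health signals."""
    sitemap = requests.get(SITEMAP_URL, timeout=30).text
    urls = re.findall(r"<loc>(.*?)</loc>", sitemap)

    stamp = datetime.date.today().isoformat()
    with open(f"crawl-{stamp}.csv", "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["url", "status", "has_noindex"])
        for url in urls:
            resp = requests.get(url, timeout=30)
            writer.writerow([url, resp.status_code,
                             "noindex" in resp.text.lower()])


if __name__ == "__main__":
    crawl_once()  # in production, a scheduler runs this hourly/daily/weekly
```
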
Reporting

Data from the crawl is aggregated and fed into a Looker report, which showcases both current versus previous crawl numbers and trends for the previous 30 days. Additionally, all of the data required to fix the flagged issues is available in a OneDrive folder which updates after every crawl.
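To give a flavour of the aggregation step, here is a minimal sketch that rolls the per-URL crawl file from the sketch above into one summary row per day; the file names and issue columns are assumptions, not the real report schema.

```python
import csv
import datetime


def summarise(crawl_csv: str, history_csv: str = "crawl-history.csv") -> None:
    """Roll per-URL crawl results up into one daily summary row."""
    with open(crawl_csv, newline="") as fh:
        pages = list(csv.DictReader(fh))

    row = {
        "date": datetime.date.today().isoformat(),
        "pages_crawled": len(pages),
        "errors_4xx_5xx": sum(1 for p in pages if int(p["status"]) >= 400),
        "noindex_pages": sum(1 for p in pages if p["has_noindex"] == "True"),
    }

    # Append to the running history file the report is built from.
    with open(history_csv, "a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=row.keys())
        if fh.tell() == 0:  # empty file: write the header first
            writer.writeheader()
        writer.writerow(row)
```
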
Data Gathering

A secondary but useful side effect of crawling your website so often is that I can also pull and aggregate data from across your site. Want a count of pages which credit ‘Staff Writer’ instead of a named author? This can be collected and provided daily.
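As a sketch of what such a custom data point might look like, the snippet below counts pages whose byline reads ‘Staff Writer’. The `.author-byline` selector is an assumption; in practice it would be adapted to your site’s markup.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4


def count_staff_writer_pages(urls: list[str]) -> int:
    """Count pages credited to 'Staff Writer' rather than a named author."""
    count = 0
    for url in urls:
        html = requests.get(url, timeout=30).text
        byline = BeautifulSoup(html, "html.parser").select_one(".author-byline")
        if byline and byline.get_text(strip=True) == "Staff Writer":
            count += 1
    return count
```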

How the Process Works

Step 1

Setup

Working with your engineering teams, I create a route for daily crawling, which typically involves whitelisting my IP, user agent and header token. Once whitelisted, I set up the crawls and run a series of tests to confirm we won’t be blocked.
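A minimal version of those pre-flight tests might look like this; the user agent string, header name and token are placeholders for whatever your engineering team actually whitelists.

```python
import requests

HEADERS = {
    "User-Agent": "MonitoringBot/1.0 (+https://www.example.com/bot)",  # placeholder
    "X-Crawl-Token": "shared-secret-from-engineering",  # placeholder token
}


def confirm_access(url: str) -> bool:
    """Check the whitelisted route returns a page rather than a block."""
    resp = requests.get(url, headers=HEADERS, timeout=30)
    blocked = resp.status_code in (403, 429)
    print(f"{url} -> {resp.status_code} ({'blocked' if blocked else 'ok'})")
    return not blocked
```
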
Step 2

Reporting and training

I connect the aggregated data collected from the crawls to a Looker template which outlines current versus previous crawl deltas and provides a 30-day lookback of crawl stats. In recognition of the fact that people ignore reporting they do not understand, I am happy to provide training.
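As an illustration of the delta logic, this sketch compares the two most recent rows of the history file from the earlier sketch; the metric names are the same assumptions as before.

```python
import csv


def crawl_deltas(history_csv: str = "crawl-history.csv") -> dict[str, int]:
    """Compare the latest crawl summary against the previous one."""
    with open(history_csv, newline="") as fh:
        rows = list(csv.DictReader(fh))  # assumes at least two crawls recorded
    previous, current = rows[-2], rows[-1]
    return {
        metric: int(current[metric]) - int(previous[metric])
        for metric in ("pages_crawled", "errors_4xx_5xx", "noindex_pages")
    }
```
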
Step 3

Alerting

Once the reporting is populated, we begin looking at alerting. I will work with you to set up thresholds and/or categories that must always trigger an alert, delivered by email or Slack message.
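In sketch form, alerting is just thresholds applied to those deltas and a message pushed out, here via a Slack incoming webhook; the webhook URL and threshold values are placeholders.

```python
import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder
THRESHOLDS = {"errors_4xx_5xx": 50, "noindex_pages": 0}  # example values


def send_alerts(deltas: dict[str, int]) -> None:
    """Post a Slack message for any metric that moved past its threshold."""
    for metric, limit in THRESHOLDS.items():
        if deltas.get(metric, 0) > limit:
            requests.post(SLACK_WEBHOOK, timeout=30, json={
                "text": f":rotating_light: {metric} rose by {deltas[metric]} "
                        f"since the last crawl (threshold: {limit})."
            })
```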

FAQs

Why are you so much cheaper than the bigger tools?
Simply put, my monitoring service is a very specific tool for a very specific task. It offers a simple, cost-effective way to see how your site is performing today and every day from now on. Keeping the scope this narrow lets me build exactly the infrastructure and scripts required, and pass the savings on to you.
Can’t I do this myself using Screaming Frog?
Well yes, technically speaking you 100% can; the tool is brilliant and lets you do so with minimal fuss. Where I add value beyond the tool itself is in providing the infrastructure to run the crawls and the effort of building and maintaining the pipeline. I have also created a handful of accompanying tests which are not available in Screaming Frog, and can build further tests specifically for your website.
My site has to be crawled from a certain market. Can you do that?
Yes, it’s a common request and something I do regularly. I can crawl from either a whitelisted static IP or via proxies in the preferred market, both at no additional cost to you.
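For the curious, routing requests through a market-specific proxy is a one-line change in most HTTP clients; the proxy address below is a placeholder.

```python
import requests

# Placeholder address for a proxy located in the required market.
PROXIES = {
    "http": "http://user:pass@de.proxy.example.com:8080",
    "https": "http://user:pass@de.proxy.example.com:8080",
}

resp = requests.get("https://www.example.com/", proxies=PROXIES, timeout=30)
print(resp.status_code)
```
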
Why do I need site monitoring?
The #1 reason I believe site owners need monitoring is to answer the question ‘We’ve seen a change in SEO performance, what could have caused it?’ If you crawl, aggregate and store data on your technical performance, you can tie improvements or drops to changes made to your website, saving hours otherwise wasted working out whether you need to rework the content or whether a competitor gained a new link.
How to Contact Me
Drop me an email
Visit my Contact Page