Welcome to a new AMA recap!
On Aug 18th, we spent an hour with Juliana asking her everything about conducting technical audits 💪🏽
I speak with the webmaster or client to ask if they have any specific issues to be aware of (e.g. a just-completed site migration or a change of CMS), then I:
Do a site: search in Google. Even if the count is not accurate, it gives me an idea of how big the site is and shows me whether there is any duplicate metadata in the SERPs.
Check the page speed - test a few pages.
Check Google Search Console - for errors and the number of indexable pages.
Run a crawl of the site - from the crawl, I can see the other issues to address.
Crawl the sitemap too - this checks for any pages we want indexed that are NOT in the sitemap, and for pages in the sitemap that we do not want indexed.
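The sitemap cross-check described above boils down to a set comparison. A minimal Python sketch, assuming you already have the set of indexable URLs from a crawl and the raw sitemap XML (the URLs here are illustrative):

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml: str) -> set[str]:
    """Extract all <loc> URLs from a sitemap XML document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")}

def sitemap_gaps(crawled_indexable: set[str], sitemap_xml: str):
    """Compare crawled indexable URLs against the sitemap.

    Returns (indexable pages missing from the sitemap,
             sitemap entries not found in the crawl).
    """
    in_sitemap = sitemap_urls(sitemap_xml)
    return crawled_indexable - in_sitemap, in_sitemap - crawled_indexable
```

Both output sets are worth reviewing: the first suggests pages to add to the sitemap, the second suggests stale or unwanted entries to remove.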
I aim to keep everything separate, but I will flag any duplicate, missing, or too-long/too-short metadata, which can fall under content. I do not go into the detail of the metadata. I use an audit checklist and update it as time goes on.
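The metadata flags mentioned above can be scripted over a crawl export. A rough Python sketch; the length thresholds are common rules of thumb for SERP display, not official limits:

```python
from collections import Counter

# Rough display limits (characters); assumptions, not official Google rules.
TITLE_MAX = 60
DESC_MIN, DESC_MAX = 70, 155

def metadata_issues(pages: dict[str, dict]) -> dict[str, list[str]]:
    """Flag missing, duplicate, and badly sized titles and descriptions.

    `pages` maps URL -> {"title": str, "description": str},
    e.g. built from a crawler's export.
    """
    issues: dict[str, list[str]] = {url: [] for url in pages}
    title_counts = Counter(p.get("title", "") for p in pages.values())
    for url, p in pages.items():
        title = p.get("title", "")
        desc = p.get("description", "")
        if not title:
            issues[url].append("missing title")
        elif title_counts[title] > 1:
            issues[url].append("duplicate title")
        elif len(title) > TITLE_MAX:
            issues[url].append("title too long")
        if not desc:
            issues[url].append("missing description")
        elif not (DESC_MIN <= len(desc) <= DESC_MAX):
            issues[url].append("description length outside typical range")
    return {url: found for url, found in issues.items() if found}
```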
I tend to start work with my clients with a technical audit, and I recommend always starting with one. From my experience, we tend to see an uplift in traffic, which makes most people very happy. After some new technical recommendations were implemented, pages were crawled more often, traffic went up, and we did see some more conversions. This does not happen all the time, so it depends on what changes you implement. With some e-commerce clients, I also looked at how many sales were lost when people dropped out of the basket; that is an easy way to put a revenue number against the work. The checkout process also falls under the technical audit.
I have found it is best to use a traffic light system: red for serious/high-impact issues, yellow for medium, and green for the least serious. Writing it that way, highlighting the main changes you suggest, and explaining it in plain English (no abbreviations) helps others understand it all. I tend to use Google Slides to show the findings, supported by Google Sheets that go into the details.
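The traffic light ordering is easy to automate when assembling the summary slide. A tiny Python sketch, assuming findings have already been tagged with a severity:

```python
# red = serious/high impact, yellow = medium, green = least serious
SEVERITY = {"red": 0, "yellow": 1, "green": 2}

def traffic_light_summary(findings: list[tuple[str, str]]) -> list[str]:
    """Order audit findings red -> yellow -> green for the summary slide.

    `findings` is a list of (severity, description) pairs.
    """
    ordered = sorted(findings, key=lambda f: SEVERITY[f[0]])
    return [f"[{sev.upper()}] {desc}" for sev, desc in ordered]
```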
Slow loading sites
robots.txt file is incorrectly blocking some pages
Pages we want indexed are not included in the sitemap and pages we do not want indexed are in the sitemap
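The robots.txt issue above can be checked with Python's standard library. A minimal sketch using `urllib.robotparser`; the rules and URLs are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt: str, urls: list[str],
                 user_agent: str = "Googlebot") -> list[str]:
    """Return the URLs that the given robots.txt would block for a user agent."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(user_agent, u)]
```

Running the crawl's URL list through this quickly surfaces pages that the robots.txt is incorrectly blocking.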
Check if there have been any changes/work on the site (eg changing CMS or changing the robots.txt file)
Check if the rankings have dropped
Look at the traffic for the other channels - have they seen a drop too or not?
Check if there have been any algorithm updates
Check if there have been any major changes in the environment - e.g. Covid boosted many sites' traffic across many channels
I present the key pages (such as the home page and some key category pages) and show the aspects they may care about. I run these through GTmetrix and PageSpeed Insights, and in Google Slides I show screengrabs of the score and what it is missing. I show the detail to the web developer, but I try to speak to the web developer first so that I am presenting what we can actually change.
I check Google Search Console
I check the robots.txt file
I check to see if some key pages have redirects (using a redirect checker)
I check to see if the page is excluded or not - using the robots exclusion checker
I check to see how the site looks when I disable JS with a Chrome extension
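A redirect checker typically reports the chain of hops for a URL. A rough Python sketch of the kinds of findings worth flagging, assuming the hops have already been collected as (status code, URL) pairs ending with the final response:

```python
def audit_redirect_chain(hops: list[tuple[int, str]]) -> list[str]:
    """Flag common problems in a redirect chain.

    `hops` is an ordered list of (status_code, url) pairs as reported by a
    redirect checker, ending with the final (non-redirect) response.
    """
    findings = []
    redirects = hops[:-1]  # everything before the final response redirected
    if any(status == 302 for status, _ in redirects):
        findings.append("temporary (302) redirect in chain - consider a 301 if permanent")
    if len(redirects) > 1:
        findings.append(f"chain of {len(redirects)} redirects - "
                        "point the first hop at the final URL")
    urls = [u for _, u in hops]
    if len(set(urls)) < len(urls):
        findings.append("redirect loop detected")
    return findings
```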
Hard to pick just one.
Companies having their staging site crawled and indexed.
Having duplicate content issues because of an issue with the CMS - it produced 3 versions of the same page all with different URLs.
Finding out thousands of pages are not indexable (that the team thought were indexable).
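The CMS duplicate-URL problem above can be surfaced by normalising trivially different URL variants and grouping them. A Python sketch; the list of tracking parameters to strip is an assumption:

```python
from collections import defaultdict
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Query parameters to ignore when comparing URLs; adjust per site (assumption).
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def normalise(url: str) -> str:
    """Normalise a URL so trivially different variants compare equal."""
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k not in TRACKING_PARAMS])
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       path, query, ""))

def duplicate_groups(urls: list[str]) -> list[list[str]]:
    """Group crawled URLs that normalise to the same canonical form."""
    groups = defaultdict(list)
    for u in urls:
        groups[normalise(u)].append(u)
    return [g for g in groups.values() if len(g) > 1]
```

Each returned group is a candidate for a canonical tag or a redirect to a single version.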
I have found it the other way around too. I do believe that tech and content are interlinked; they are separated out because they are two massive areas of a site. In a case where the content did not live up to E-A-T, I suggested creating an About Us page, adding Organization schema to the home page, and adding authors' names to the articles/blog posts.
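An Organization schema snippet like the one suggested above can be generated as JSON-LD for the home page. A minimal Python sketch; the company details are placeholders:

```python
import json

# Minimal Organization schema; every example.com detail is a placeholder.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [
        "https://www.linkedin.com/company/example-co",
        "https://twitter.com/exampleco",
    ],
}

# Embed in the home page <head> as a JSON-LD script tag.
snippet = ('<script type="application/ld+json">\n'
           + json.dumps(organization, indent=2)
           + "\n</script>")
print(snippet)
```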
I put the findings in a traffic light system report. The Google Slides highlight the issues and at the end of the presentation, there is a one page summary with:
Recommended next steps
The roadmap is something that is part of the proposal as I do a mini audit before putting the proposal together. This way I know how many issues there are.
I look at the size of the site and see what issues it has from spending around two hours on it. Most people will not pay the same amount of money/time as for audits done in agencies; for example, some agencies would take five working days. Many people I work with do not want to spend that, so I allocate between one and two days to carry out the audit.
I have come across this too. I identify this in the proposal stage. If they are not willing to make the changes and do not believe in them, I will most likely not work with them.
I still report on them. Google does fluctuate on that metric but other search engines do not. I know Google in many countries has a monopoly. It is still good practice to have calls to action in the meta descriptions and to not try and stuff too much in the meta titles.
Yes, I include that as well but also in the content audit. It is important that everyone can access your site. I try and stick to the issues on the site and use other sites to back up the reasons to fix those issues.
I really like Screaming Frog, and I use Sitebulb too. I have also used DeepCrawl and Oncrawl. I find Screaming Frog one of the easiest tools to use, but that may be because I have been using it since I started around 10 years ago.
The technical audit itself is one price; executing the fixes is another. I look at the site(s) before I do the proposal and, based on what happened in the past with similar sites I worked on, I put a time on how long it would take to do the audit and fix the issues (my time). I normally help with 301s, 404 issues, robots.txt, sitemap issues, and traffic and conversion reporting. I always state that I work WITH the development team and will guide them on what needs to be fixed. For example, I will not do 301 redirects in the .htaccess file, but I will do it in a CMS like WordPress. I never tell clients I am going to fix it all; it is all working together, as part of a team.
I have not seen one in a while. If I did see one, then I would tell the client and they can decide if they want to allocate budget to auditing that site.
I have found WordPress the easiest for me to fix things in, or to get things fixed. BUT that may be because I have spent more time in it than in Magento and Adobe. WordPress does have a lot of plugins, so sometimes we all get a bit excited adding more and more of them, which slows down the site; plugins that are not updated can also cause issues.
I do the audit at the beginning of the agreement with the client and I like to also revisit it after 6 months. This can be done to check on the progress made (although I report this monthly) as well as running another crawl to see if there are any new issues (plus checking GSC and traffic).
Thanks Juliana for a truly insightful AMA and thanks to everyone for asking questions. You can learn more about Juliana through her LinkedIn and Twitter. You can also check out her upcoming TurnDigi event taking place on September 16th.