Apply this SEO method to recover

The last core update was just announced on 11/17/2021!

Quick access: What to do after a core update? – Latest algo updates from Google

What to do after a core update?

To be clear, I use the same method for:

  • a website that has just dropped after a major Google update
  • a website whose SEO is doing well but whose owner wants to anticipate the next core updates

Specifically, you must:

  1. Run a full RM Tech audit with the (mandatory) Search Console and Analytics connections
  2. Follow the steps below to determine which situation you are in and which pages you need to study
  3. Conclude with a human analysis of the specific pages that have been identified

If you don’t have access to the full RM Tech audit, request an RM Tech pre-audit with zombie page analysis. It’s free (if you haven’t already used it)…

Once your report is ready, look at the first two graphs in the summary.

Where does the sample data come from?

For what follows, I could have listed once again the criteria that have the greatest impact with each core update, but that is often too general.

Instead, I prefer to describe what you should actually do. Thanks to my tools on My Ranking Metrics, the process is much simpler.

I will also rely on 3 examples taken from the 53 websites analyzed:

  • Site A, about 1,200 pages (e-commerce): it has many problems, has not reached its potential, and fell again with this update
  • Site B, about 15,000 pages (content site): it fixed a lot of things over a few months, but it wasn’t enough
  • Site C, about 1,700 pages (e-commerce): it is doing great on Google (after falling last year)

Below are screenshots of their RM Tech audit.

Let’s go!

QualityRisk helps predict SEO success

QualityRisk tells you whether your pages are well optimized from the start in terms of SEO (mainly technical). You should aim for a risk of 0.

If you have too many pages with an index greater than 20, you can’t complain: you’re not giving yourself a chance to succeed. This is the case for Site A, which is clearly far from having reached its organic traffic potential:

Poor QualityRisk
We can see immediately that this site has not yet exhausted its optimization potential: QualityRisk is far too high

Here is the histogram for Site B, which just fell in June even though it had recently reached these nice stats (pages with low QualityRisk):

QualityRisk pretty good
QualityRisk is pretty good, and yet the site crashed recently… which is explained by the joint QualityRisk + Zombies analysis

Compare with Site C, which is a clear winner! It was hit by the December 2020 core update and did a great job in early 2021:

Good QualityRisk
Almost all of this website’s pages are well optimized (low QualityRisk)

OK, now what? You will tell me that Site B’s QualityRisk index is extremely low and yet it was penalized by Google’s latest core update.

I think this is fairly new: having pages with a low QualityRisk is no longer enough on its own.

The zombie index helps identify pages that are too disappointing

That’s the point: you have to take care of the user who lands on your page from Google’s SERPs! It’s the formula I’ve been repeating for years, and I feel it has never been more relevant.

A page may seem well done (no technical problems and very complete content) and yet Google doesn’t seem to want it (anymore?).

There are a few possible reasons, and you already know them: the famous general advice you’ve been reading for years. The site has too many ads, is too slow, doesn’t have enough words (or has too many), has UX issues, etc. These are good tips, but how do you check them all across all your pages and take stock?

This is where the zombie index will save you a huge amount of time!

Whether you have hundreds, thousands or tens of thousands of pages, the tool will show you which pages are causing problems. You don’t have to spend hours looking for them.

Let’s start with the obvious case of Site A, which hasn’t worked hard enough yet and about which I’d be tempted to say “you can’t complain”:

High zombie indexes
Too many pages with high zombie indexes = the site performs poorly and often suffers in core updates

Here’s the zombie index overview for Site B, which just dropped, even though (as a reminder) its pages seemed well optimized:

Zombie indexes too high
Zombie indexes too high despite low QualityRisk: crossing the 2 indicators is valuable for identifying problem pages

Don’t tell me “yes, but it’s normal that it dropped, it’s full of zombie pages”. Nope!

  • First, because a zombie page is not simply a page without traffic: the index is based on many criteria.
  • Second, because the zombie index is calculated from the performance of the last 12 months. If the histogram is bad right after a core update, it was already bad before!

And here is what the zombies histogram looks like for Site C, whose organic traffic just skyrocketed:

Good zombie indexes
Almost no pages with a high zombie index: this site made a very big comeback with the June/July core updates

Quick aside: the person who manages the SEO of Site C contacted us, Fabien and me, in January 2021 because traffic had dropped by 40% in 1 month after the December 2020 core update. We therefore used exactly the method described here. When she looked at only the pages with a high zombie index, she was initially surprised, because these were pages that should have been at the top in terms of SEO optimization. But she figured out fairly quickly why these pages were disappointing users while other pages weren’t (even though their optimization was sometimes weaker). In summary, I would say that these pages did not deliver on the promise made to the user (in the title tag and elsewhere on the page). There was a kind of mismatch with search intent. That used to work, but not anymore with Google’s current algorithm… Without the zombie index, she would not have been able to find which pages were causing problems.

What to do with the data provided by RM Tech?

So, what exactly should you do with the audit report?

  1. Open the spreadsheet attached in the report’s conclusion and turn on an automatic filter. It’s under Data > Filter in Excel and Data > Create a filter in Google Sheets.
  2. Filter to keep only the pages whose zombie index is greater than 40.
  3. Also add a filter to keep only the pages whose QualityRisk index is low (e.g. less than 20).
  4. Analyze the remaining pages. Optionally sort by URL to see trends by page type (a small script that does the same filtering is sketched below).
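If you prefer to script this instead of filtering by hand, here is a minimal sketch in Python with pandas. The file name and the column names (url, zombie_index, quality_risk) are assumptions for illustration: rename them to match the actual headers of your own RM Tech export.

```python
# Minimal sketch of steps 1-4: filter the exported audit table with pandas.
# The file name and column names are assumptions -- adjust to your own export.
import pandas as pd

df = pd.read_csv("rmtech_audit.csv")  # hypothetical export of the attached table

# Keep pages with a high zombie index (> 40) but a low QualityRisk (< 20),
# i.e. pages that look well optimized yet still disappoint users.
suspects = df[(df["zombie_index"] > 40) & (df["quality_risk"] < 20)]

# Sort by URL to spot trends by page type (directory, template, etc.).
suspects = suspects.sort_values("url")

print(f"{len(suspects)} pages to review manually")
print(suspects[["url", "zombie_index", "quality_risk"]].to_string(index=False))
```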

To conclude…

Since 2018, every Google core update has highlighted the relationship between QualityRisk/Zombies and SEO impact.

So nothing really new for us with this update, except one point: it’s the first time we’ve seen such a sharp drop for pages with a high zombie index but a low QualityRisk.

Like Fabien, I’m confident in the power of our two algorithms (QualityRisk and Zombies).

That’s why we recommend checking what they reveal about your own website.

And if you don’t know our tool yet, we encourage you to try it for free.

What happened at Google in June and July 2021?

Official info

As it does several times a year, Google officially announced the rollout of a global update of its algorithm, called a (broad) core update. The peculiarity this time is that at the beginning of June, 2 consecutive, closely spaced core updates were announced.

  • The June core update started on June 2nd and ended on June 12th. It took a few days for the effects to become apparent.
  • The July core update started on July 1st and ended on July 12th. The effects were felt very quickly. For some websites we noticed a reversal of June’s impact, which happens regularly.

We are discussing this in the WebRankInfo forum: Core Update Google June/July 2021.

Please note that there have also been other official updates from Google:

  • June 15: Google began taking Core Web Vitals into account. The rollout continues throughout the summer until the end of August (source). As of July 17, no study had found any impact on visibility in Google. Additionally, Google has stated that the Page Experience signal will be a lightweight factor: think of it more as a tiebreaker between results, in the same way as HTTPS. I have also read that, according to Google, an impact can only be seen when pages leave the red (“poor”) zone and reach an acceptable level.
  • June 23-28: anti-spam updates. The impact seems to have been very small, even on sites using black hat SEO. Apparently this affects traditional web results as well as images, but not local SEO (source).

My opinion, based on the analysis of 53 sites

At first glance, it seems we always find the same things.

Although we can find counter-examples, in general we see gains for sites that have optimized the technical side very well and invested in quality editorial content. In competitive industries, backlinks naturally play a crucial role. And as is often the case, the effects are much stronger when the site is considered YMYL.

Frankly, if you know SEO, that sounds like platitudes!

So I dug a little deeper! I had a lot of exchanges: on social networks, where I talked about the article I was preparing, as well as with clients of My Ranking Metrics, my SEO platform. I did this with Fabien, my partner at My Ranking Metrics. We analyzed 53 sites with RM Tech:

  • 28 sites that fell hard
  • 12 sites that stayed stable
  • 13 sites that made progress

There were a few cases where we couldn’t explain why the site fell on Google. But we noticed a common pattern:

Most of the sites penalized by the June/July core update had been taking “risks” for several months by keeping too many pages that disappoint the user. Google’s algorithm seems less “tolerant” than before in such cases. This is particularly visible in the joint QualityRisk/Zombie pages analysis included in every RM Tech audit.

Olivier Duffez (WebRankInfo)

We have confirmed that the correlation between QR/zombies and the SEO effect of core updates is stronger the older the site is (more than 2 years) and the larger the site is (more than 1000 URLs). Our conclusions are the same for ecommerce sites and content sites (and some other types like classifieds).
