David McSweeney is our Chief Editor here at Seobility. He wrote and edited most of the recent guides and articles on our blog. The following article is his assessment of the state of Google Core Updates.
Google dropped an early Christmas present on the SEO community earlier this month, with their first (official) Core algorithm update in over 7 months.
And it’s safe to say, it was one of the biggest SERP shake-ups seen in recent times.
So what was this update all about? And has it actually improved the search results?
Well, before we delve into observations, conjecture, and perhaps a sprinkling of conspiracy, let’s start with the facts.
Table of Contents
- 1 Google’s December Core Update: A Quick Timeline
- 2 What is Google trying to achieve with their Core updates? And are they succeeding?
- 3 Why Google’s Core updates encourage more spam, and less investment in quality content
- 4 In conclusion (and our advice if you’ve been hit by a Core update)
Google’s December Core Update: A Quick Timeline
Most SEOs (myself included) weren’t expecting another Google Core update this year.
We had been anticipating one hitting around September/October. But when that didn’t arrive, we figured Google wouldn’t roll anything out before January. It was a reasonable assumption, as normally they don’t rock the boat too much in the run-up to the holidays.
We were dead wrong.
Because on the third of December, we got a couple of hours’ notice with a tweet, then it was time to ride the Core update rollercoaster.
The Semrush sensor (which tracks SERP volatility) started spiking on the evening of the third, and kept rising all the way up to a whopping 9.4 on the fourth.
And for those hit by the update, the impact was felt immediately.
You can see the sudden divergence in hourly search traffic on this chart from Google Analytics (3rd December vs 2nd December).
In case you didn’t guess by the big red arrow, the Core update hit at around 5pm (PST).
That particular site lost around 60% of its search traffic overnight. Ouch.
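If you want to check whether your own hourly traffic diverged in the same way, a quick day-over-day comparison does the job. Here’s a minimal sketch, assuming you’ve exported hourly organic sessions from Google Analytics to a CSV (the file name and column names below are placeholders, so adjust them to match your own export):

```python
# Minimal sketch: compare hourly organic sessions for two days and flag big drops.
# Assumes a CSV export from Google Analytics with columns: date, hour, sessions
# (the file name and column names are placeholders - adjust to your own export).
import pandas as pd

df = pd.read_csv("hourly_organic_sessions.csv")

baseline = df[df["date"] == "2020-12-02"].set_index("hour")["sessions"]
update_day = df[df["date"] == "2020-12-03"].set_index("hour")["sessions"]

# Hour-by-hour percentage change vs the baseline day
change = ((update_day - baseline) / baseline * 100).round(1).dropna()

for hour, pct in change.items():
    flag = "  <-- possible update impact" if pct <= -40 else ""
    print(f"{int(hour):02d}:00  {pct:+.1f}%{flag}")
```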
Naturally those who were hit with the update were in a panic.
There’s no good time to lose search traffic. But for anyone involved in the retail sector (and let’s face it, most sites are to some extent), getting crushed at the start of the Christmas shopping season is nothing short of devastating.
And we (us wise old SEOs again) figured that those who had lost traffic would need to sit tight, work on improving their sites, and wait for the next Core update for reassessment/recovery. We expected that — usual small fluctuations notwithstanding — it would be a few months before any big shake-up in the SERPs again.
We anticipated that those hit would stay hit until the next Core update. Frozen in time if you will.
And once again we were completely wrong.
Because on the 10th of December, Google cranked the dial again.
And while it wasn’t quite as big an update as the 3rd, many sites saw a reversal in fortunes.
The site in the analytics screenshots above, for example, regained most of its traffic (Dec 13th vs Dec 4th).
And SEO forums such as Webmaster World were full of stories from those who had their initial gains of the 3rd completely reversed on the 10th.
Google giveth, Google taketh away.
Other sites stayed the same (they remained winners or losers).
And many weren’t impacted by either update and probably wondered what all the fuss was about.
Like this retail directory site, whose Search Console Performance chart shows a relatively stable line (or lines) throughout the whole period.
Update? What update?
On the 16th of December, Google announced that the rollout of their latest Core update was complete.
And like the Grand Old Duke of York’s men, if you’re up you’re up, and if you’re down you’re down. At least until the next Core update.
Update: there’s increasing chatter of a further SERP shake-up today. So it seems the update might not be quite over yet!
Which means we can now take a look at the search results and make some assessments.
So let’s move on and ask a question. Or two questions if you want to be pedantic…
What is Google trying to achieve with their Core updates? And are they succeeding?
Google has a stock response for questions related to Core updates.
They’ll point you to this 2019 blog post by Danny Sullivan.
To save you a few minutes, we can summarize the 1,632 words into:
- Create quality content (or improve your existing content)
- Build your (and your site’s) credibility
Point 2 there is what’s generally referred to as E-A-T (expertise, authoritativeness, trustworthiness).
So Google wants to rank high quality content from trusted sites.
Cool. All good so far.
And we definitely recommend you follow that advice.
But here’s the problem: if you’re someone like me who spends hours (days?) clicking through search results and digging under the hood, you’ll find example after example of sites ranking for highly competitive keywords that tick neither box.
Now I’m not trying to out specific sites in this post. All’s fair in love, war, and SEO.
(Although it does present a wider problem, which I’ll get to)
But here are some blatant examples of where Google’s quality and trust objective is falling apart.
1. Repurposed domains continue to coin it in
It’s no big secret.
Repurposed, high authority aged domains continue to rank fast, and rank high, even if the domain had no previous connection to the current niche.
In case you don’t know, a repurposed domain is a new site built on an old, expired domain, which has a strong, aged backlink profile.
Here’s an example in the sports niche.
To summarize, the site above was:
- launched in September 2019
- quickly grew to an estimated 15k organic visits per month* (making it one of the biggest sites in this lucrative niche)
- took a bit of a dip with the May 2020 Core update
- came roaring back after the December Core update
*traffic estimates from tools are exactly that, estimates, and should be taken with a grain of salt. However, I happen to know the actual volumes/CTRs in this particular niche – and I expect actual organic traffic is closer to 50k/month.
So why is it ranking? And why did it benefit from the December Core update?
Well, here’s what the site doesn’t have:
- An about us page
- Any contact info
- Any author info
And here’s what it does have:
- A ton of powerful links picked up between 2005 and 2011 when the domain was home to a European cultural organization
- Lots of low quality (but reasonably lengthy) keyword focused articles
To be clear, none of the links are in any way relevant to the current niche/site.
And for extra lolz, a quick look at the domain on archive.org shows that it also spent a couple of years (2017-2018) happily living as a PBN site for a law firm.
It’s been around the block.
- Google still overly weights links in their algorithm, particularly when it comes to measuring trust.
- Despite what they might say publicly (see John Mueller’s quote below) there’s still no reset for links when an old domain is repurposed.
- And authority trumps relevance when it comes to links.
Are you sure about that, John?
Update 22nd December:
Matt Diggity, from our friends over at Diggity Marketing, sent over this example of a repurposed domain in the health niche, which Google has taken a particular shine to.
If you think of the health niche that generates the most spam (and that Google should be particularly sensitive about), you can probably guess which sub-niche it’s in.
And according to Matt, E-A-T is non-existent. It’s “all links”.
Google seems to be rewarding links. Especially high authority links. Repurposed domains are ranking left and right and they have nothing going for them except their links.
2. Google’s still got a big problem with cloaked redirects
If you read the comments on Barry Schwartz’s post, which covered the initial 4th December rollout, you might have seen a guy called Joseph ranting about the search results for “free classifieds”.
And the thing is… he’s completely right.
Here’s a screen recording showing what happens when you click on the result at position 9.
You can’t see it on the recording, but suffice to say, my anti-virus went nuts.
And “free classifieds” is not exactly a low competition keyword. It has a search volume of 4.6K, a difficulty score of 50, and a $2.50 CPC.
So it’s safe to assume that plenty of people are going to be clicking on that result. Some of them might even end up with a virus on their machine.
This isn’t a new problem. But it certainly hasn’t gone away with the latest Core update. In my post-update SERP digging I came across spammy redirects on numerous search results.
In case you’re not familiar with what’s going on here: the website is showing Google different content from what a user will actually see when they hit the page. It’s a technique called cloaking, it has been around for as long as Google has, and it really should have been dealt with a long time ago.
Needless to say, it’s a clear violation of Google’s webmaster guidelines.
Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected.
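If you want to check a suspicious result for this kind of trick yourself, one crude approach is to request the page twice, once with a normal browser user agent and once identifying as Googlebot, and compare what comes back. Here’s a minimal sketch (the URL is a placeholder, and bear in mind that more sophisticated cloakers verify Googlebot’s IP range rather than just the user agent, so this only catches the naive cases):

```python
# Rough cloaking check: fetch a URL as a browser and as Googlebot, then compare.
# Note: many cloakers verify crawler IP ranges, so this only catches naive
# user-agent based cloaking. The URL below is a placeholder.
import difflib
import requests

URL = "https://example.com/suspicious-page"

BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0 Safari/537.36")
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

as_browser = requests.get(URL, headers={"User-Agent": BROWSER_UA}, timeout=10)
as_googlebot = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)

similarity = difflib.SequenceMatcher(
    None, as_browser.text, as_googlebot.text).quick_ratio()

print(f"Browser response:   {len(as_browser.text)} chars, status {as_browser.status_code}")
print(f"Googlebot response: {len(as_googlebot.text)} chars, status {as_googlebot.status_code}")
print(f"Similarity: {similarity:.0%}  (a low figure suggests cloaking)")
```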
But it also highlights a wider issue with Google’s assessment of content quality.
Because we can use this tool to find out what Googlebot actually sees when it crawls the page.
Yep, a load of scraped, gobbledygook text.
Almost 5,000 words of scraped, gobbledygook text to be specific…
…with the phrase “free classifieds” repeated 54 times for a keyword density of 7%.
I can’t quite believe I’m talking about keyword density in 2020, but we are where we are.
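For anyone who, like me, hasn’t had to think about keyword density in a decade: it’s simply how often a keyword appears relative to the total word count. Tools differ in how they count multi-word phrases, so treat any exact percentage loosely. A rough sketch:

```python
# Rough keyword density calculation. SEO tools differ in how they count
# multi-word phrases, so exact percentages vary from tool to tool.
import re

def keyword_density(text, phrase):
    words = re.findall(r"\w+", text.lower())
    occurrences = len(re.findall(re.escape(phrase.lower()), text.lower()))
    # Count every word of each phrase occurrence against the total word count.
    density = 100 * occurrences * len(phrase.split()) / max(len(words), 1)
    return occurrences, density

sample = ("Free classifieds are great. Post free classifieds today "
          "on the best free classifieds site around.")
count, pct = keyword_density(sample, "free classifieds")
print(f"{count} occurrences, keyword density {pct:.1f}%")
```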
If I pick a random sentence from the scraped content I can find the original source.
Which also reveals that there are literally thousands of these scraper sites using exactly the same text, and they’re all indexed by Google.
- Google still hasn’t figured out how to identify cloaked URLs, at least not in a timely manner before they’re indexed and ranked
- Scraped text, combined from multiple sources can (and does) rank
- Google’s not as advanced at figuring out what is and isn’t quality content as they would like us to believe
- Keyword density is probably still a thing (ugh)
Note: the page in the screen recording was ranking for at least three days, but now appears to be gone. However, it’s since been replaced by a new cloaked page, which is ranking even higher.
It’s SERP whack-a-mole.
3. Can’t rank for a YMYL keyword? Throw up a doorway page on a Google site
You might have heard of parasite SEO before. It involves setting up a page (sometimes legitimately, sometimes not so legitimately) on a high authority domain, and ranking that single page for a competitive keyword based on the host domain’s strength.
Normally the page will be set up as a doorway page, which exists simply to link back to a money site.
Google doesn’t like doorway pages.
Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination.
But Google itself happens to host a LOT of them, on its own Google Sites platform.
In fact, I checked the organic keywords for “sites.google.com/site/” and discovered that the subdomain currently ranks on page one for over 13,000 “buy” keywords.
Now some of these might be legitimate sites.
But a big percentage of them look like this….
Which at the time of writing (after the Core update has fully rolled out) is still sitting pretty at #1 for its target keyword.
A keyword which gets 100 US searches a month and is very much YMYL.
That’s just one example. But there are THOUSANDS of similar doorway pages hosted on single page Google sites. They rank for everything from cheap flights, to credit card offers, to imported cigarettes, and every pill and potion under the sun.
And of course, it’s not just Google sites. There are plenty of other domains (unwittingly) hosting similar doorway pages.
Let’s not get started on Pinterest…
- Doorway pages on high authority parent domains still rank for many competitive (often YMYL) terms
Overall takeaway: Google’s going for the squirrels and completely disregarding the elephants
I highlighted the three tactics above — repurposed domains, cloaked pages, doorway pages — because they’re clear and obvious. They should be easy for a search engine as advanced as Google to spot and filter out.
They’re the elephants in the room, which Google seems either incapable of dealing with or unwilling to tackle.
Meanwhile, the squirrels — who may be a little mischievous, but generally play by the rules — get punished.
And that’s a big problem.
Why Google’s Core updates encourage more spam, and less investment in quality content
I’m going to kick off this section by quoting a post from Webmaster World. Because I really can’t put it any better myself.
“At least before you knew that provided you didn’t break Google’s webmaster guidelines and genuinely produced useful content, you were never going to be affected by an update reducing your traffic by -40%-80% just like that.
That stuff was supposed to be reserved for the spammers, scrappers and link scheme guys.
Now all these people outrank you with their expired domain 301 redirect and Fiverr articles just because of their link authority.
Now you suddenly ask why you shouldn’t start to spam yourself. Sure, you will get penalized eventually but so will you with your “white hat” site anyway at one point, and making a spam site is so much cheaper and simpler than a legit one. Why not just start making 10?
And now the web is a poorer place because you can’t find genuinely expert content anymore in narrow fields that aren’t covered by the big brands and mainstream sites. And full of even more spammers.”
Nail. On. Head.
Because it’s not just the fact that spam continues to rank that’s the problem. The spam has always been there.
It’s the fact that even if you’re playing by the rules, you might still take a big hit in a Core update.
Can’t happen to you? Well…
You might not be alright, Jack
There are plenty of SEOs that will tell you that they don’t worry about Google Core updates as they’re doing nothing wrong. In fact, I used to be one of them.
But they should.
Because if the New York Times can lose a whack of search visibility overnight, then so can anyone.
Note: this doesn’t imply that the New York Times lost 21.4% of their traffic in the update (it’s a bit more complex than that), but it’s reasonable to assume they took a decent sized hit.
It’s hard to argue that content quality and trust are the issue when we’re talking about The New York Times.
What’s good today might not be good tomorrow (but might be fine again in three months time)
I’ve been involved in SEO for over 20 years. I’ve been there in the trenches through every major update. I danced the Google dance back in the late 90s. I remember when “Florida” hit in 2003 — it was a clear improvement.
Penguin made sense. Panda made sense. Hummingbird made a lot of sense.
Medic kind of made sense.
But ever since, there’s been a seeming randomness to Google’s Core updates.
I’ve seen plenty of sites that were crushed in one, changed nothing, recovered completely, were hit in the next one, recovered again…
We’re talking about going from the top spots, to nowhere, to back in the top spots again.
Hundreds of articles have been written about how to recover from Core updates. I would wager that “do nothing and wait” is not advice you’ll see written in many of them.
Now I should point out here that sometimes (perhaps most of the time), it’s clear what’s up. There’s an obvious issue with E-A-T or content quality that can be addressed. Or there’s a technical issue holding the site back.
But other times, there really is no logical reason for a hit. Which chimes with this horrible paragraph from Google’s Core update advice post:
We know those with sites that experience drops will be looking for a fix, and we want to ensure they don’t try to fix the wrong things. Moreover, there might not be anything to fix at all.
“There’s nothing you can do about it” isn’t what you want to hear when you’ve just lost 90% of your revenue overnight.
You didn’t do anything wrong, but you’re not going to eat tomorrow. Wait three months. Soz.
With great power comes great responsibility
(apologies for the cliché)
Let me dial back a little at this stage and say that Google has an incredibly tough job.
According to this post, they successfully catch 25 billion newly discovered spam pages every single day.
That’s a mind blowing amount of content. The equivalent of every man, woman, and child on Earth churning out 3+ pages of spam every 24 hours.
And I’ve no reason to doubt their claim (in the same post) that 99% of visits from Google search results lead to spam-free experiences.
It’s also fair to say that Google can’t please everyone. In any niche there’s going to be multiple sites competing for the top spots. They all want to be number 1, but by definition, only one of them can be.
Ups and downs in search traffic are to be expected. If a competitor is working harder than you — creating better content, getting better reviews, earning better links — they should be rewarded. And vice versa.
But while a dip in traffic can be weathered, a sudden and total loss of visibility (for which there is no obvious cause) cannot.
Google has the power to close a business down with the flick of a switch. And while the percentage of “good” sites that get incorrectly lumped in with the “bad” may be small, that’s no consolation if one of those false positives happens to be your business.
Danny Sullivan, Google’s public search liaison, is a great guy. He deals with a lot of heat from webmasters with good grace, patience and humour. But I took issue with this tweet.
Business owners would like to succeed. With search, that typically means getting visitors that they hope to convert in some way, at no or low price and effort, especially since they have businesses to run. Or am I off the mark here?….
— Danny Sullivan (@dannysullivan) December 8, 2020
Because for a business doing things the right way, search traffic is certainly not free.
While that final click may be, it’s the result of previous investment — be that in money, or time — in creating the kind of quality content that Google wishes to surface in its search results.
It’s the result of hard work building relationships, getting mentioned on other sites in your niche or in the media, providing great customer service, creating a great product.
Or it’s the result of buying an aged expired domain with strong (but unrelated) links, and throwing up some cheap content from Fiverr.
The incentive and motivation to invest in high quality content decreases when there’s a risk (even if that risk is small) that you could lose it all overnight. And the temptation to gamble with spam tactics (with considerably lower capital investment risk) becomes more compelling — particularly when those tactics can be seen to be working for competitors.
25 billion spam pages a day becomes 50 billion, 100 billion, a googol…
And that’s not a good place for the web to be.
In conclusion (and our advice if you’ve been hit by a Core update)
Google’s Core updates are designed to improve search quality.
But there are still some HUGE loopholes that spammers are taking advantage of to rank. And the latest Core update doesn’t seem to have improved Google’s ability to detect and filter out these particular tactics.
- Repurposed domains are still riding high
- Cloaked redirects still litter the SERPs (and can be dangerous for users)
- Doorway pages (or parasite pages) rank for numerous YMYL queries
On top of that (at least in my opinion) Google’s Core updates can be too punitive on “white hat” sites which have not violated Google’s guidelines. Fluctuations for these sites should be expected, but not complete loss of visibility.
That being said, I remain optimistic that Google genuinely wishes to improve its search results, and has no desire to penalize sites that are playing by the rules, creating great content, and providing a good experience for their users.
I believe we’re looking at collateral damage from Google’s fight against spam. False positives perhaps, which would account for the ups and downs between Core updates when nothing changes in the interim.
But false positives or otherwise, these big drops for “white hat” sites are devastating. And even if it’s only happening to a small (but understandably vocal) minority, they deserve to be listened to.
To put the scale into context: with an index of roughly 600 billion pages, if one in a million web pages (0.0001%) is incorrectly flagged as low quality in a Core update, that’s still six hundred thousand web pages. That’s a lot of collateral damage.
Finally, if you’ve been hit by the recent Google Core update, our advice for now is to follow Google’s advice:
- Take an objective look at your content, and consider how it could be improved. Are you completely satisfying the search intent for your target keywords?
- Conduct a full SEO audit and ensure there are no technical issues holding your site back
- Work on your on-site E-A-T signals (be clear about who you are, your expertise, why users should trust you)
- Work on your site speed, and make sure your Core Web Vitals are up to scratch (there’s a quick way to check these sketched after this list)
- Work on earning high quality backlinks to boost your site’s authority
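On the Core Web Vitals point above, a quick way to see where you stand is the PageSpeed Insights API, which returns real-user (Chrome UX Report) field data where Google has enough of it. A minimal sketch, with the caveat that the response field names are as I understand the API at the time of writing, and that smaller sites often have no field data at all:

```python
# Quick Core Web Vitals check via the PageSpeed Insights API (v5).
# Field data (loadingExperience) is only returned when Google has enough
# real-user Chrome UX Report data for the URL; smaller sites may get nothing.
import requests

URL_TO_TEST = "https://www.example.com/"  # placeholder - use your own page
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(API, params={"url": URL_TO_TEST, "strategy": "mobile"}, timeout=60)
resp.raise_for_status()
data = resp.json()

metrics = data.get("loadingExperience", {}).get("metrics", {})
if not metrics:
    print("No field (CrUX) data available for this URL.")

for name in ("LARGEST_CONTENTFUL_PAINT_MS",
             "FIRST_INPUT_DELAY_MS",
             "CUMULATIVE_LAYOUT_SHIFT_SCORE"):
    m = metrics.get(name)
    if m:
        print(f"{name}: {m['percentile']} ({m['category']})")
```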
If you have any questions, feel free to drop us a comment below.
And hop on our mailing list for a slate of in-depth SEO tutorials and case studies coming your way in 2021.