The Secret to Reversing the HCU Algorithm Update – And How EEAT Actually Works

[Image: Garit Boothe and the HCU algorithm update]

2023 has been a brutal year in SEO for blogs and content sites. The Helpful Content Update (HCU) has been crushing everything in its path. 

But after literally traveling to the end of the earth in search of secret SEO knowledge, I now know EXACTLY how you can reverse the HCU algorithm penalty. 

Yes – it has to do with EEAT. But not in any way that you’ve ever been taught. In this article, I’ll show you the exact steps you need to take to build trust with Google. 

The great news is that this is also how you future-proof your website for SGE and Google’s new AI algorithm. Stick with me and I’ll explain it like you’ve never heard before.

How Google Knows If Your Site Is Trustworthy

Much has been said about building EEAT: experience, expertise, authoritativeness, and trustworthiness. I’ve read it all, and it is mostly wild conjecture and fluff.

[Image: Garit Boothe at the SEO conference in Chiang Mai]

Many of the recommendations are so safe that you never think to really question them. For example, sprucing up your authors’ biographies and adding a Privacy Policy page. Those are things you should be doing anyway, so no one questions them.

There’s one problem though: Google smashed thousands of websites with perfect author bios. Even sites where the finance writers are CPAs and health sites where the writers are doctors – the expertise is 100% valid! 

Longtime SEO guru Cyrus Shepard gave a very interesting talk recently about how Google views your site. He went undercover as a Google Quality Rater and shared how things work on the inside.

[Image: Cyrus Shepard’s SEO talk]

(I attended his talk in Chiang Mai, Thailand – quite literally on the other side of the earth from Utah!)

The first thing that Quality Raters look at is the reputation of the website. This is the “Trust” factor in EEAT. To find out what Google thinks, search for your brand name. Click on the three dots next to your URL.

 

Then the “About this result” panel pops up. Click on the “More about this page” button under the “Source” section.

[Image: the “More about this page” button]

The “About the source” page pops up. If you have an established brand, Google shows you what it sees. At the time of this writing, this is what you see for my site:

[Image: the “About the source” page for my site]

It’s information straight from my GBP listing, including the business name, address, phone number, intro section, and customer reviews.

Scroll below that and you’ll see two other sections: “Reviews from the web” and “Web results about the source”. 

For NerdWallet, this is what you get:

[Image: “Reviews from the web” for NerdWallet]

Click “Show more” to see an expansive list of “Web results”:

[Image: expanded list of “Web results”]

So far, so good. These are for websites that Google trusts. 

But if you have a site that Google doesn’t understand, you’ll see this:

[Image: “About the source” page for a site Google doesn’t understand]

This is for a finance website I picked at random.

This is a particularly interesting case: even with ten separate sites listed under “Web results about the source”, Google still can’t figure out what the site is about.

Here is a screenshot of their declining traffic from Ahrefs:

[Image: Ahrefs screenshot of the site’s declining traffic]

Show me a website that got hit by the HCU update, and I’ll show you a site that Google doesn’t trust. 

If your site’s traffic tanked due to a Google algorithm update, check out my SEO Audit Services to help you recover. 

Optimize for Trust

This tool is incredible because it’s a microscope into the mind of the Google search machine. It tells you exactly which backlinks and off-page SEO strategy you need to follow. 

Many advanced SEO people speculate that “EEAT is all about links”. 

But I’ve never heard them explain HOW to get backlinks that increase your trust with Google.

My penalized finance site has backlinks from Entrepreneur, Inc., MSN, Yahoo Finance, GOBankingRates, and a host of other DR 70+ websites. Obviously, it didn’t matter to the Helpful Content algorithm update.

So what does matter to Google? 

First, look up your competitors. Analyze their “About the Source” page. 

Here’s a competitor for my finance site:

[Image: competitor’s “About the source” page]

They didn’t get touched by the HCU update, even though they’re just a big affiliate site. And here’s the kicker: they’re based in Romania, but they only market in the US.

As you can see, they have a Google Business Profile listing. (For the record, anyone who says that GBP listings don’t help your affiliate site, see here for evidence to the contrary!)

Check out their “Reviews from the web” sources:

[Image: “Reviews from the web” sources]

These ones are particularly interesting because they aren’t important websites at all. Ahrefs shows that Monetize.info only gets around 500 organic search visits per month. And “ScamAdviser”, well…hardly the most reputable review site.

Here are the “Web results about the source”:

[Image: “Web results about the source”]

The “Web results about the source” show some interesting finds:

  • Crunchbase
  • A hosting company case study
  • A Muck Rack profile

Here is the source profile from another blog/content site:

[Image: Quora and Reddit in the source profile]

Yes, you saw that right: Quora and Reddit threads are used as Google web sources.

You’ll note that Google pulled in a Reddit snippet that says, “Adam is a scammer…”

And his “About the source” says this about him:

[Image: “About the source” panel]

Could this be one reason behind his site’s meteoric crash?

[Image: site traffic crashing]

Despite a domain rating of 79 and an epic backlink profile, his site got destroyed in March 2022, the month of a product review update. Then the December 2022 Helpful Content Update finished it off for good.

To put that in context, this is a 7-figure income affiliate site that went up in flames because it didn’t have the right trust markers with Google. 

His site has a host of backlinks from DR 90 sites, including Forbes, GoDaddy, HubSpot, Entrepreneur, HP, Yahoo, Calendly, and more, yet Google says that it “couldn’t find a good match”. 

So clearly, there is more to building trust with Google than “building links”. 

Interestingly, other sites I’ve commonly seen used as trusted “Web results” include:

  • YouTube
  • Podcast interviews from low DR websites
  • Yelp
  • ZoomInfo
  • GlassDoor

If your mind isn’t already set on fire with the possibilities here, let me spell it out for you: getting mentions and reviews from trusted sites is entirely doable. And I would add, somewhat easy. 

Exploiting the Trust Sources

The first step is to reverse-engineer the trusted web sources of your successful SEO competitors. The second step is to get reviews from each one of them.

To recap, we know that Google pulls trusted web sources from:

  • Google Business Profile
  • Crunchbase
  • GlassDoor
  • ZoomInfo

These are free business listings. You have no excuse not to go out right now and claim them, 100% for free.
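If you want to track those listings systematically, here’s a minimal Python sketch of a claim checklist. The profile URL patterns are illustrative assumptions, not documented formats; verify each platform’s real URL structure when you claim your profiles.

```python
# Minimal sketch of a "free trust listings" checklist.
# ASSUMPTION: the profile URL patterns below are illustrative guesses,
# not official formats -- check each platform before relying on them.

FREE_LISTINGS = {
    "Google Business Profile": "https://business.google.com/",
    "Crunchbase": "https://www.crunchbase.com/organization/{slug}",
    "Glassdoor": "https://www.glassdoor.com/Overview/{slug}",
    "ZoomInfo": "https://www.zoominfo.com/c/{slug}",
}

def listing_checklist(brand_slug, claimed):
    """Return (platform, url, status) rows for each free listing."""
    rows = []
    for platform, pattern in FREE_LISTINGS.items():
        url = pattern.format(slug=brand_slug)
        status = "claimed" if platform in claimed else "MISSING"
        rows.append((platform, url, status))
    return rows

# Example: only the GBP listing has been claimed so far.
for platform, url, status in listing_checklist("acme-finance", {"Google Business Profile"}):
    print(f"{status:8} {platform:24} {url}")
```

Run it for each brand you manage and work down the MISSING rows first.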

Then, hire someone on a low-cost gig site to create threads about your company on Quora and Reddit. Have an affiliate or friend in the business do a brand review about you on YouTube.

And there you have it: seven trusted seed sources before you can say, “Bob’s your uncle”.

If you’re in affiliate SEO or you have a blog, this is now the new barrier to entry. If you’re in local SEO, do this process to get into the Google Map pack. 

Getting business listings, also known in the business as “citations”, is a tried-and-true practice for getting local businesses into the coveted Google Map results. I’ve observed that most citation services (even from the most reputable SEO companies) are complete nonsense, filled with spammy sites and low-quality links. In one citation service that I ordered, 50% of my citation pages were never even indexed in Google. 

It makes sense that Google wouldn’t trust those sites, but I wasn’t sure what to do instead. Now you can skip the spammy citation services and build an off-page SEO presence you know will get you results. 

For anyone penalized by the HCU update, following the above advice should fill in the missing EEAT gap that you didn’t know about before. 

I also promised you that I would explain how to dominate Google’s AI, so I’ll explain how you can use this knowledge to game Google’s new, feared Search Generative Experience (SGE). 

Optimizing for SGE and the Google AI Machine

Google is trying to catch up to OpenAI with its new “Search Generative Experience”, officially still in beta in the United States. Here’s what it says about me:

[Image: what SGE says about me]

Google pulled this stuff straight from my Google Business Profile, the About page of my website, and my LinkedIn profile. 

In other words, entirely from sources that I gave it.

Google Knowledge Panel expert Jason Barnard recently gave a fascinating webinar about how SGE, Knowledge Panels, and EEAT all tie in together.

[Image: generative AI, Knowledge Graph, and web index]

SEO geeks are used to optimizing for the “Web Index/Universal” search results. The Knowledge Graph currently creates Knowledge Panels, and the new generative AI will dominate the top of the page in future SERPs.

Another way to look at this is that SGE will provide the assessment, the Knowledge Graph will provide the knowledge, and the traditional web results will provide the recommendations. 

[Image: assessment, knowledge, and recommendations]

From a business standpoint, these three components show what Google trusts, likes, and knows. 

[Image: know, like, and trust]

For years, Google has been trying to understand “entities”. In other words, it tries to understand the world using a computer. 

In July, it expanded its knowledge base by adding more people (particularly writers) than ever before. This is in preparation for a wider AI rollout where it will base rankings even more on trusted, known people and companies. 

For example, here is how Jason optimized the knowledge panel profile for one of his employees:

[Image: classification of writer]

He called her a “writer” because Google knows what writers are and likes to promote them. This last update confirmed that:

[Image: author vectors]

The late SEO patent researcher Bill Slawski noticed a large trend across many Google patents from the past 16 years. Namely, that “Google wants to index actual speakers and authors and websites treating each as an entity, understanding, and indexing each of those based upon the features which make them unique.”

[Image: spam AI algorithm]

Google wants to build a spam AI algorithm by emphasizing authors that it knows, likes, and trusts.

Google massively expanded its AI database of people this year, but next year it will focus on expanding its database for Corporations and Organizations. 

Does your company have a knowledge panel? Does the “About this result” section on Google for your website show that Google knows what your site is? Then you’re in a good position for next year.

But if you don’t, then you’re just one algorithm update away from disaster. 

Steps to Improve Your EEAT and Prepare For Future AI Updates

Other than what I outlined in this article, here are other useful, actionable steps you can take to educate Google about yourself and your website:

  1. Make sure that your company has an accurate knowledge panel
  2. Get knowledge panels for your writers
  3. Use schema markup liberally to describe your company, its products, and the people on your website
  4. Focus on getting good reviews
  5. Make sure that your PR strategy aligns with your SEO strategy
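For step 3, here is a minimal sketch of what that schema markup can look like in practice: a small Python helper that emits Organization JSON-LD for your site’s head section. Every name, URL, and sameAs profile below is a placeholder; swap in your brand’s real listings (GBP, Crunchbase, LinkedIn, and so on).

```python
import json

# Minimal sketch: build Organization JSON-LD to describe your company
# as an entity. All names and URLs here are placeholder examples.

def organization_jsonld(name, url, same_as):
    """Return a JSON-LD script tag describing the company as an entity."""
    data = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": name,
        "url": url,
        # sameAs points Google at the trust sources it can cross-check
        "sameAs": same_as,
    }
    return ('<script type="application/ld+json">'
            + json.dumps(data, indent=2)
            + "</script>")

print(organization_jsonld(
    "Example Agency",
    "https://example.com",
    ["https://www.crunchbase.com/organization/example-agency",
     "https://www.linkedin.com/company/example-agency"],
))
```

The same pattern extends to Person markup for your writers, which reinforces the author-entity signals discussed above.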

Today I focused mostly on the “T” of EEAT, because it seems to be the biggest area that SEO professionals and other digital marketers don’t understand. It’s your biggest risk factor. Google also confirmed that it’s the most important factor.

In subsequent articles, I’ll delve into actionable, concrete things you can do to boost your experience, expertise, and authority in a way that the Google algorithm understands and rewards. 
