Friday, September 23, 2022

Top 4 Things to Know About GA4 — Whiteboard Friday

In this week’s Whiteboard Friday, Dana brings you some details on the exciting new world of Google Analytics 4. Watch and learn how to talk about it when clients and coworkers are intimidated by the move.

whiteboard outlining four insights into GA4

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, my name is Dana DiTomaso. I'm President at Kick Point. And I am here today at MozCon 2022 to bring you some details on the exciting world of Google Analytics 4, which I know all of you are like, "Ugh, I don't want to learn about analytics," which is totally fair. I also did not want to learn about analytics.

And then I kind of learned about it whether I liked it or not. And you should, too, unfortunately. 

So I think the biggest thing about the move from Universal Analytics to GA4 is that people log in, everything looks different, they think, "I don't like it," and then they leave. And I agree the user interface in GA4 leaves a lot to be desired. I don't think there's necessarily been a lot of good education, especially for those of us who aren't analysts on a day-to-day basis.

We're not all data scientists. I'm not a data scientist. I do marketing. So what I'm hoping is I can tell you the things you should know about GA4 on just a basic sort of level, so that you have a better vocabulary to talk about it when people are horrified by the move to GA4, which is inevitable. It's going to happen. You've got to get it on your site starting basically immediately, if you don't already have it. So I started out with three things, and then I realized there was a fourth thing. So you get a bonus, exciting bonus, but we'll start with the first three things. 

1. It's different

So the first thing: it's different, which I know is obvious. Yes, of course, Dana, it's different. But it's different. Okay, so in Universal Analytics, there were different types of hits that could go into analytics, which is where "hits" came from originally as a metric that people talked about. So, for example, in Universal Analytics, you could have a pageview, or you could have a transaction, or you could have an event.

And those were all different types of hits. In GA4, everything is an event. There is a pageview event. There is a transaction event. There is, well, an event event. I mean, you name the events whatever you want. And because of that, it's actually a much better way to report on your data.
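To make "everything is an event" concrete, here's a minimal sketch of what GA4 hits look like under the hood, using the GA4 Measurement Protocol (normally gtag.js or Tag Manager sends these for you). The measurement ID, API secret, client ID, and event names below are placeholders, not values from this video.

```python
import requests

MEASUREMENT_ID = "G-XXXXXXX"    # placeholder GA4 measurement ID
API_SECRET = "your-api-secret"  # placeholder Measurement Protocol secret

endpoint = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)

payload = {
    "client_id": "555.1234567890",  # placeholder browser/device identifier
    "events": [
        # A pageview is just an event named page_view...
        {"name": "page_view", "params": {"page_location": "https://example.com/"}},
        # ...and so is any custom interaction you care about.
        {"name": "newsletter_signup", "params": {"form_id": "footer"}},
    ],
}

requests.post(endpoint, json=payload, timeout=10)
```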

So, for example, one of the things that I know people always wanted to be able to report on in Universal Analytics is what pages did people see and how did that relate to conversion rate. And that was really tricky because a pageview was something that was at the hit scope level, which means it was just like the individual thing that happened, whereas conversion rate is a session scoped thing.

So you couldn't mash together a hit-scoped thing like pageview with conversion rate, which is session-scoped. They just didn't combine unless you did some fancy blending stuff in Data Studio. And who's got time for that? So now in GA4, because everything is an event, you have a lot more freedom in how you can slice and dice and interpret your data and figure out what pages people engaged with before they actually converted, or what that path was, not just the landing page, but the entire user journey on their path to conversion. So that part is really exciting.

2. Engagement rate is not reverse bounce rate

Second thing: engagement rate is a new metric in GA4. They do now have bounce rate as well. They recently announced it, and I'm annoyed at it, so we're going to talk about this a little bit. Engagement rate is not simply the reverse of the bounce rate you knew from UA, although in GA4 the new bounce rate literally is reverse engagement rate.

So in Universal Analytics, bounce rate was a metric that people reported on all the time, even though they shouldn't have. I hate bounce rate so much. Just picture like a dumpster fire GIF right now across your screen. I hate bounce rate. And why I hate bounce rate is it's so easily faked. Let's say, for example, your boss says to you, "Hey, you know what, the bounce rate on our site is too high. Could you fix it?"

You're like, "Oh, yeah, boss. Totally." And then what you do is whenever somebody comes on your website, you send what's called an interactive event off to Google Analytics at the same time. And now you have a 0% bounce rate. Congratulations. You got a raise because you made it up. Bounce rate could absolutely be faked, no question. And so when we moved over to GA4, originally there was no bounce rate.

There was engagement rate. Engagement rate has its own issues, but it's not measuring anything similar to what bounce rate was. Bounce rate in UA just meant an interactive event didn't happen. It didn't matter if you spent an hour and a half on the page reading it closely. If you didn't trigger an interactive event, you were still counted as a bounce when you left that page.
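For anyone who never saw it in practice, here's roughly what that gaming looked like. This is a hedged sketch using the legacy UA Measurement Protocol rather than the client-side analytics.js call, and the tracking ID and client ID are placeholders; the point is just that one interactive event hit per pageview forced bounce rate to zero.

```python
import requests

hit = {
    "v": "1",                 # protocol version
    "tid": "UA-12345-1",      # placeholder UA property ID
    "cid": "555.1234567890",  # placeholder client ID
    "t": "event",             # hit type: event
    "ec": "engagement",       # event category
    "ea": "page_loaded",      # event action
    # Omitting "ni" (or setting it to "0") makes the event interactive,
    # which is exactly what zeroed out bounce rate. "ni": "1" would have
    # been the honest, non-interactive choice.
}

requests.post("https://www.google-analytics.com/collect", data=hit, timeout=10)
```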

In GA4, by contrast, an engaged session is by default someone spending 10 seconds with that tab, that website open, so active in their browser, or they visited two pages, or they had a conversion. Now this 10-second rule I think is pretty short. Ten seconds is not necessarily a lot of time for someone to be engaged with a website.

So you might want to change that. It's under the tagging settings in your data stream. So if you go to Admin, then click on your data stream, go to more tagging settings, and then session timeouts, you can change it in there. And I would recommend playing around with that and seeing what feels right to you. Now GA4, literally just as I'm filming this, has announced bounce rate, which is actually just reverse engagement rate. Please don't use it.

Instead, think about engagement rate, which I think is a much more usable metric than bounce rate was in UA. And I'm kind of excited that bounce rate in UA is going away because it was [vocalization]. 

3. Your data will not match

All right. So next thing, your data is not going to match. And this is stressful because you've been reporting on UA data for years, and now all of a sudden it's not going to match and people will be like, "But you said there were 101 users, and today you're saying there were actually 102. What's the problem?"

So, I mean, if you have that kind of dialogue with your leadership, you really need to have a conversation about the idea of accuracy in analytics (as in, it isn't), and about error and everything else. But really, the data is going to be different, and sometimes it's a lot different. It's not just a little bit different. And it's because GA4 measures stuff differently than UA did. There is a page on Google Analytics Help which goes into it in depth. But here are some of the highlights that I think you should really know off the top of your head when you're talking to people about this.

Pageviews and unique pageviews

So first thing, a pageview metric, which we're all familiar with, in Universal Analytics, this was all pageviews, including repeats. In GA4, same, pageview is pageview. Great.

So far so good. Then we had unique pageviews in Universal Analytics, which was only single views per session. So if I looked at the homepage, then went to a services page, and then went back to the homepage, I would have two pageviews of the homepage for pageviews, but one pageview of the homepage for unique pageviews. That metric does not exist in GA4. So that's something to really watch for: if you were used to reporting on unique pageviews, it's gone.

So I recommend changing your reports now to sort of walk people through this, getting them used to the fact that they're not going to get unique pageviews anymore. Or you can implement something that I talk about in another one of my Whiteboard Fridays about measuring the percentage of people who are reloading tabs and tab hoarders. You could work that into this a little bit.

Users

Okay. Next thing is users. Users is really, I think, a difficult topic for a lot of people to get their heads around, because they think, oh, user, that means that if I'm on my laptop and then I go to my mobile device, obviously I am one user. You're usually not, unfortunately. You don't necessarily get associated across multiple devices. Or if you're using, say, a privacy-focused browser, like Safari, you may not even be associated as the same user on the same device, which kind of sucks.

The only real way you can truly measure if someone is the same user across multiple sessions is if you have a login on your website, which not everybody does. A lot of B2B sites don't have logins. A lot of small business sites don't have logins. So users is already kind of a sketchy metric. And unfortunately it's one that people used to report on a lot in Universal Analytics.

So in Universal Analytics, users was total users, new versus returning. In GA4, it's now active users. What is an active user? The documentation is a little unclear on how Google considers an active user. So I recommend reading that in depth. Just know that this is going to be different. You never should have been reporting on new versus returning users anyway, unless you had a login on your site because it was such a sketchy, bad metric, but I don't think a lot of people knew how bad it was.

It's okay. Just start changing your reports now so that when you have to start using GA4 (on July 1, 2023, UA is done for real), it's not so much of a shock when you make that transition. 

Sessions

So one other thing to think about as well with the changes is sessions. So in Universal Analytics, a session was the active use of a site, so you're clicking on stuff.

It had a 30-minute timeout. And you may have heard never to use UTM tags on internal links on your website. And the reason why is because if someone clicked on an internal link on your website that had UTMs on it, your session would reset. And so you would have what's called session breaking, where all of a sudden you would have a session that basically started in the middle of your website with a brand-new campaign and source and medium and completely detached from the session that they just had.

They would be a returning user though. That's great. You shouldn't have been reporting that anyway. Whereas in GA4 instead, now there's an event because, remember, everything is an event now. There is an event that is called session start. And so that records when, well, the session starts. And then there's also a 30-minute timeout, but there is no UTM reset.

Now that doesn't mean that you should go out there and start using UTMs on internal links. I still don't think it's a great idea, but it's not necessarily going to break things the way that it used to. So you can now see where did someone start on my site by looking at the session start event. I don't know if it's necessarily 100% reliable. We've seen situations where if you're using consent management tools, for example, like a cookie compliance tool, you can have issues with sessions starting and whatnot.

So just keep in mind that it's not necessarily totally foolproof, but it is a really interesting way to see where people started on the site in a way that you could not do before. 
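If you already have the BigQuery export running (see the next section), the session_start event is straightforward to query directly. This is a rough sketch, not an official report; the project and dataset names are placeholders and it assumes the standard GA4 export schema.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes you're authenticated to the right project

query = """
SELECT
  (SELECT value.string_value
     FROM UNNEST(event_params)
    WHERE key = 'page_location') AS landing_page,
  COUNT(*) AS session_starts
FROM `my-project.analytics_123456.events_*`   -- placeholder export dataset
WHERE event_name = 'session_start'
  AND _TABLE_SUFFIX BETWEEN '20220901' AND '20220930'
GROUP BY landing_page
ORDER BY session_starts DESC
LIMIT 25
"""

for row in client.query(query).result():
    print(row.landing_page, row.session_starts)
```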

4. Use BigQuery

So bonus, bonus before we go. All right, the fourth thing that I think you should know about GA4, use BigQuery. There's a built-in BigQuery export under the settings for GA4. Use it.

The reason why you should use it is: (a) the default reports in GA4 are not great, they kind of suck; (b) even the explorations are a bit questionable, like you can't really format them to look nice at all. So what I'm saying to people is don't really use the reports inside GA4 for any sort of useful reporting purposes. They're more for ad hoc reporting. But even then, I would still turn to BigQuery for most of my reporting needs.

And the reason why is because GA4 has some thresholding applied. So you don't necessarily get all the data out of GA4 when you're actually looking at reports in it. And this happened to me actually just this morning before I recorded this Whiteboard Friday. I was looking to see how many people engaged with the form on our website, and because it was a relatively low number, it said zero.

And then I looked at the data in BigQuery and it said 12. That amount could be missing from the reports in GA4, but you can see it in BigQuery, and that's because of the thresholding that's applied. So I always recommend using the BigQuery data instead of the GA4 data. And in Google Data Studio, if that's what you use for your reporting tool, the same issue applies when you use GA4 as a data source.
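That kind of cross-check is only a few lines against the export tables. A minimal sketch, assuming a hypothetical export dataset and the built-in form_submit event name (your form interaction may be recorded under a different event name):

```python
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT COUNT(DISTINCT user_pseudo_id) AS users_who_engaged
FROM `my-project.analytics_123456.events_*`   -- placeholder export dataset
WHERE event_name = 'form_submit'              -- placeholder event name
  AND _TABLE_SUFFIX BETWEEN '20220901' AND '20220930'
"""

result = list(client.query(query).result())
print(result[0].users_who_engaged)
```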

Data Studio has the same thresholding problems when GA4 is the data source. So really, just use BigQuery. And you don't need to know BigQuery. All you need to do is get the data going into BigQuery and then open up Google Data Studio and use that BigQuery table as your data source. That's really all you need to know. No SQL required. If you want to learn it, that's neat.

I don't even know it that well yet. But it is not something you have to know in order to report well on GA4. So I hope that you found this helpful and that you can have a little bit better dialogue with your team and your leadership about GA4. I know it seems rushed. It is rushed. Let's all admit it's rushed, but I think it's going to be a really good move. I'm really excited about the new kinds of data and the amounts of data that we can capture now in GA4.

It really frees us from like the category action label stuff that we were super tied to in Universal Analytics. We can record so much more interesting data now on every event. So I'm excited about that. The actual transition itself might be kind of painful, but then a year from now, we'll all look back and laugh, right? Thank you very much.

Video transcription by Speechpad.com

Wednesday, September 21, 2022

How Helpful Was the Helpful Content Update?

On August 25, Google started rolling out the Helpful Content Update, an ongoing effort to reward sites with “people-first” (i.e. not written specifically for SEO) content. MozCast measured rankings flux peaking at 92°F on August 26, which sounds relatively high, but this is what the two weeks on either side of the update looked like:

The dotted blue line shows the 30-day average for the period prior to the start of the update, which came in at 87°F. Ranking flux actually peaked on August 23 above any day of the update rollout. To make matters worse, we had to remove August 8-9 from the 30-day average, because Google’s data center outage completely disrupted search results.

Let me sum up: it’s a mess. I like to think I’m pretty good at handling messes, but this is like trying to find one particular drop of water in two weeks of rain during a summer-long storm. If you like messes, read on, but for the rest of you, I’ll tell you this — I found no clear evidence that this first iteration of the Helpful Content Update moved the needle for most sites.

Averages, lies, and damned lies

Given the extended rollout, I attempted to look at the difference in visibility for individual domains for the 14 days before and after the start of the rollout (which helps smooth out single-day outliers and keeps the days of the week consistent across both sides). One “loser” that immediately popped up was Conch-House.com, with nearly a 50% visibility loss in our data set. I admit, I even got a little judgmental about the hyphen in the domain name. Then, I looked at the daily data:

The averages don’t tell even half of this story. Whatever happened to Conch-House.com, they were completely knocked out of the rankings for 20 out of the 28 days analyzed. Note that the MozCast data set is limited, but our much larger STAT data set showed a similar pattern, with Conch-House.com ranking for up to 14,000 keywords on one day during this period.

What happened? I have no idea, but it quite definitely, almost certainly, very probably maybe was not the Helpful Content Update.

Confirmed content coincidence

Here’s an example I got pretty excited about. WhiteHouse.gov saw a +54% total visibility gain across the two time periods. The keyword set was pretty small so, once again, I dug into the daily numbers:

Looks great, right? There’s a clear spike on August 25 (although it fades a bit), and while the spike wasn’t as large, I was able to confirm this against a larger data set. If I was smart, I would’ve stopped the analysis right here. My friends, I was not smart.

One of the challenges of the Helpful Content Update is that Google has explicitly stated that helpful (or unhelpful) content will impact rankings across a domain:

Any content — not just unhelpful content — on sites determined to have relatively high amounts of unhelpful content overall is less likely to perform well in Search …


Even so, it’s interesting to dig into specific pieces of content that improved or declined. In this case, WhiteHouse.gov clearly saw gains for one particular page:

This brief was published on August 24, immediately followed by a storm of media attention driving people to the official details. The timing was entirely a coincidence.

Is it helpful content (regardless of your take on the issue)? Almost certainly. Could WhiteHouse.gov be rewarded for producing it? Entirely possibly. Was this increase in visibility due to the Helpful Content Update? Probably not.

Is this blog post helpful content?

Hey, I tried. I’ve probably lost three nights of sleep over the past three weeks thanks to the Helpful Content Update. The truth is that extended rollouts mean extended SERP churn. Google search results are real-time phenomena, and the web is always changing. In this case, there was no clear spike (at least, no clear spike relative to recent history) and every once-promising example I found ultimately came up short.

Does that mean the update was a dud? No, I think this is the beginning of something important, and reports of niche impacts in sites with clear quality issues may very well be accurate (and, certainly, some have been reported by reputable SEOs whom I know and respect). The most viable explanation I can come up with is that this was a first step in rolling out a “helpfulness” factor, but that factor is going to take time to iterate on, ramp up, and fully build into the core algorithm.

One mystery that remains is why Google chose to pre-announce this update. Historically, for updates like Mobile-friendly or HTTPS, pre-announcements were Google’s way of influencing us to make changes (and, frankly, it worked), but this announcement arrived only a week before the update began, and after Google stated they had updated the relevant data. In other words, there was no time between the pre-announcement and the rollout to fix anything.

Ultimately, Google is sending us a signal about their future direction, and we should take that signal seriously. Look at the HTTPS update — when it first rolled out in 2014, we saw very little rankings flux and only about 8% of page-one organic results in MozCast were HTTPS URLs. Over time, Google turned up the volume and Chrome began to warn about non-HTTPS sites. In 2017, 50% of page-one organic results in MozCast were HTTPS. Today, in late 2022, it’s 99%.

The Helpful Content Update probably isn’t going to change the game overnight, but the game will change, and we would all do well to start learning the new rules.

Tuesday, September 20, 2022

How to Identify and Refresh Outdated Content

When someone regularly adds new content to their site, they face an inevitable question: What happens to my older articles?

The way blogging works is really unfair to your past work: It gets buried in archives, losing traffic and relevance.

Is there a way to keep your content always up-to-date? Yes, but first let’s discuss the why.

Why update your content?

Keeping your content fresh and updated is more than overcoming the unfairness of your past work fading away. It's actually a legit marketing tactic that saves money and makes your users’ on-site experience smoother.

So let’s dive into why updating old content is so important:

1. User experience

The most obvious reason is that you want each of your site's pages to be an effective landing page: outdated content and broken links will likely result in bounces, and those are lost leads and clients.

2. Search engine optimization

When it comes to SEO, content updates offer quite a few advantages:

  • Maintaining more consistent rankings, especially for those queries that deserve freshness. We’ve all seen this before: A competitor updates a page and suddenly gains 3-4 positions. That boost can be temporary, but unless you update your own page, it may last 12-18 months. Updating your old content on a regular basis will help you avoid these situations without necessarily chasing each particular loss.

  • Creating more helpful content (see what I did here?). What we knew about COVID three years ago doesn't compare to what we know now. So, if you wrote an article on it back then, you have a lot to add now. Adding new facts and guidelines to your old content makes it more in-depth (and yes, helpful), and that’s a ranking factor.

  • Making the most of your already-built link equity: Your old content may have attracted some backlinks and trust signals. You can benefit from those without investing in new links to your new articles.

  • Generating higher click-through thanks to a fresher date within your search snippet. I’m not aware of any organic search click-through study that would include dates in search snippets, but it’s safe to assume that in most cases, most people would be attracted to a fresher date, so if your search snippets include dates, it’s a good idea to make sure they're pretty recent:

How to identify content that needs updating

So how do you identify outdated content (also referred to as expired content or content decay)?

Here are a few methods:

Loss of rankings

If you're monitoring your rankings, you will be notified of any loss.

There may be multiple reasons for a rankings decline, but for content-based pages, it's often about content getting outdated. Evaluate your target SERP to see if a more up-to-date page is now outranking yours.

For new keywords you're not tracking positions for, you can analyze ranking fluctuations using SE Ranking. The tool offers a handy SERP analytics feature that visualizes organic search result dynamics over time. It's a great way to analyze how volatile any SERP is and how often you may need to update your target page to keep up:

Loss of traffic

For multi-page, content-heavy sites, it may be next to impossible to keep track of all the rankings. Therefore, regular page-level traffic audits will help you catch a possibly outdated article.

Google Analytics is a pretty solid way to identify pages that have seen a decline in clicks from organic search. All you need to do is limit your Acquisition channels to “Organic search”, click through to the “Landing Pages” tab, and use the “Compare” checkbox when selecting dates.

You can compare clicks to the same period of the previous month, or go further back, depending on how often you do that exercise:

Make sure to select the same days of the week when comparing, as traffic over the weekends will almost always be lower. You can also compare month-to-month to catch more extended losses.

Once you know which settings work best for your site, you can save that report to save yourself the trouble of clicking through all those settings again.

You can also use Search Console to identify pages losing traffic. There’s an option to compare clicks using various time frames, but you can only go as far as 16 months back:

Search Console lets you sort your pages by those that lost the most clicks, which is a great way to identify your biggest losses:

Clicking any page in the chart will load a new report focusing on that page. This is when you can click to the “Queries” tab to find the actual keywords that are sending fewer clicks.
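If you'd rather pull the same click data programmatically instead of clicking through the UI, the Search Console API exposes it. Here's a rough sketch, assuming you've already set up Google API credentials; the property URL, credentials file, and date ranges are placeholders:

```python
from googleapiclient.discovery import build
from google.oauth2 import service_account

# Placeholder credentials file and property; adjust to your own setup.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

def clicks_by_page(start, end):
    """Return {page_url: clicks} for the given date range."""
    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",  # placeholder property
        body={"startDate": start, "endDate": end, "dimensions": ["page"], "rowLimit": 5000},
    ).execute()
    return {row["keys"][0]: row["clicks"] for row in response.get("rows", [])}

current = clicks_by_page("2022-08-01", "2022-08-31")
previous = clicks_by_page("2022-07-01", "2022-07-31")

# Pages with the biggest month-over-month click losses are refresh candidates.
losses = sorted(
    ((page, previous[page] - current.get(page, 0)) for page in previous),
    key=lambda item: item[1],
    reverse=True,
)
for page, lost in losses[:20]:
    print(page, lost)
```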

It's also a good idea to set up monitoring of your key competing pages, in order to be notified when those are updated. Visual Ping is a great tool for this, allowing you to monitor your competitors’ on-page SEO efforts and be alerted when they change anything on their sites:


In my niche, for example, I use the tool to monitor Wikipedia and Google pages to be alerted when there’s a change there. While I may not be able to ever compete with either, it's a good way to know when there’s an update needed for my pages on the same topic.

Link equity

You probably know your most-linked-to pages (if not, you may find them using Link Explorer):

The logic here is that if a page is heavily (and naturally) linked but has no organic traffic, you’re probably facing one of two issues:

  • It used to rank years ago

  • It's not optimized for any searchable query (and Google has trouble matching it to any)

In both of these cases, updating and re-optimizing a page could help it gain some rankings and clicks.

Likewise, if you see links pointing to one of your articles steadily disappearing, that may mean your content has become badly outdated and editors have started removing the links. Linkchecker is an easy backlink monitoring tool that will help you catch that unfortunate trend in a timely manner.

How to update your old content

Simply republish to a new date (Spoiler: Not recommended)

A popular way to ensure your search snippet shows a recent date is to simply republish the article with a new date.

Well, according to Google, this is wrong:

“...it’s against our guidelines to artificially freshen a story when the publisher didn't add significant information or demonstrated a compelling reason…”

It should be noted that it does help. While I cannot recommend anything that goes against Google’s guidelines, it's really frustrating to try to explain this to clients, especially when their competitors repeatedly and successfully use this simple tactic.

Nonetheless, let’s take this method off the table.

Add “significant” information

Unfortunately, Google won’t tell us what amount of information counts as “significant.”

So you will have to use your editorial judgment. Here are a few ideas to get you started:

  • Include fresh stats or more recent numbers.

  • Update your sources (and replace broken links). Linking to articles that are years old will tell Google (and your site users) that yours is probably quite outdated as well.

  • Update your screenshots / images and embed newer videos.

  • Add new tools, mention new trends or recent events that may have influenced what is described in the article.

  • Add internal links to your newer content.

  • Add your new CTAs, link to your new (and relevant) lead magnets, and update the forms.

  • Optimize your page better, and in a more natural way: WebCEO offers a cool tool that helps you identify which keywords any page can be re-optimized for to gain higher organic visibility.

While it's not the only tool I tend to use, this one helps define the direction.

Next, I run my target keyword through Text Optimizer to find more angles, concepts and entities that can be used to expand my content and make it more relevant and in-depth:

Now, republish the article with a new date to push it to the top of your site’s archives. There’s also a way to display the last-updated date without republishing (there are similar solutions on WP alternatives as well). However, that way the article will remain deeper in your archives. If you update content often and don’t want all of those updates to appear on the front page, this is a good alternative method.

Redirect to a new page

This makes sense only if you have a few old articles on the same topic and you're consolidating them into a single new one.

Another valid reason for redirecting is when you have dates in URLs.

In other cases, I'm not a fan of internal redirects if you can do without them, so I’d almost always try to keep existing URLs.

Make content updates part of your routine

For well-established, content-heavy sites, updating content should be part of your content marketing routine. In other words, it should happen on a consistent and regular basis. To make it happen, try one or a combination of the following tactics:

  • Depending on your new and old content volume, make sure a certain percentage of content being published on your site is an update. For example, for every five new articles on your site, one should be an update.

  • Treat article updates as new content. Many reputable blogs (like this one) pay staff writers the same amount of money for an updated article as for a new one.

  • Allocate time every month to analyzing rankings and traffic losses and see if there’s an update opportunity there. I recommend assessing your declining organic visibility at least once a month.

  • Depending on your new content frequency, dedicate one day in your editorial calendar to article updates. For example, if you update your blog daily, allocate every Friday to an article update.

  • Make sure your updated content is promoted as new: Create social media updates to push it using all available channels and diverse messaging. I use Creatopy for that because it makes this process extremely productive by allowing content writers and promoters to collaborate on visual creatives.

It should be noted that updating your site is not just about SEO and clicks. Many of your static pages that are not necessarily created to attract organic traffic are often left outdated. These include your About Us page, TOS, privacy policy, FAQ page, and more. Keep those updated as well, based on your company’s milestones and legislation changes.

Takeaways

Keeping your existing content updated helps your user experience and SEO by letting you benefit from past effort and already-acquired link equity. Fresher content also likely attracts higher click-through thanks to more recent dates in search snippets.

To identify content that needs updating, assess your losses in rankings and traffic. It's also a good idea to update well-linked content that, for whatever reason, has never ranked.

Republishing an old article with a new date without updating it is against Google’s guidelines. Adding significant information, like new sources, tools, stats, images, and videos, lets you republish old articles and push them to the top of your archives, increasing their chances to rank higher and attract more clicks.

And finally, to make sure your content updates are really effective, make them part of your content marketing routine.

Monday, September 19, 2022

The MozCon 2022 Video Bundle Is Here (Plus, Our 2020 Videos are FREE!)

After three years and two virtual conferences, we gathered some of our best friends in the industry for the biggest SEO party of the year in Seattle: MozCon 2022. It felt great to be back in person at camp MozCon, gathering insights and watching tactical presentations from industry leaders, not to mention the opportunity to connect and network with fellow attendees!

And we're happy to share that if you missed the conference live, the MozCon 2022 video bundle is now available for your viewing pleasure!

Start watching now

For $299, you'll gain access to every presentation and speaker deck to watch as many times as you'd like. Schedule a viewing party with your team and get everyone on board with the best digital marketing advice, data, tools, and resources for the coming year.

If you'd like a taste of what this year's video bundle's got cooking, check out Dr. Pete Meyers' talk from this year's event:



Watch the MozCon 2020 videos for free in our SEO Learning Center!

Welcome to MozCon Virtual 2020 + State of the Industry

Sarah Bird

Sarah has a storied history of kicking MozCon off with a bright, sparkly bang. The former fearless leader of Moz welcomes each and every one of us to this year's virtual event, laying out all the pertinent details of the conference, and setting the tone for two jam-packed days of learning with a look at the State of the Industry.

Thought Leadership and SEO: The 3 Key Elements and Search Ranking Strategies

Andy Crestodina

Everyone wants to do it, but no one really knows what it is. So what is thought leadership? What isn’t it? And how does it affect search rankings?

This presentation is a data-rich perspective on the oh-so-popular topic of thought leadership, filled with practical takeaways for becoming an authority. And it’s all about the relationship between thought leadership and SEO. We’ll see how the research answers the questions and informs the tactics: Can brands be thought leaders? Can it be outsourced? Do you need to publish research? Or strong opinion? And how does it attract links and authority, rankings, and qualified visitors? Learn how a personal brand combines with content to drive big wins in SEO.

Great Expectations: The Truth About Digital PR Campaigns

Shannon McGuirk

In her talk, Shannon will challenge the desire for virality over consistency when it comes to digital PR and link building campaigns, while exploring the impact on the industry, team morale, and client expectations. By honestly sharing her own shortcomings, she'll push you to learn from your own campaign failures using tried and tested frameworks that’ll mean you can face any creative campaign or outreach struggle head-on.

Whatever You Do, Put Billboards in Seattle – Getting Brand Awareness Data from Google

Robin Lord

How can you harness the vast power of Google data to gain special insight into city- and product-level brand awareness? Robin will lead us on a journey through his Google Trends methodology to use Adwords search volume data for better brand intelligence.

How to Build a Global Brand Without a Global Budget

Phil Nottingham

As funnel-based marketing becomes less effective and harder to measure, "building a brand" is frequently touted as the panacea for all marketers' woes. But it's unclear how this can be achieved scalably and with a limited budget. Large enterprises resort to huge creative advertising campaigns that get their names out there by force of spend alone, but this isn't realistic for smaller companies, and the number of impressions is not the number of people impressed. In this session, Phil explains how modern brands are built through advocacy more than awareness alone, offering a deliverable method of brand marketing to radically shake up your content strategy.

The Science of Seeking Your Customer

Alexis Sanders

Users are at the core of everything we do in modern SEO. However, finding and understanding audiences can be daunting. Alexis will cover how to find your audience, share tools that are available for all price points, and show ways in which she’s found audience research to be useful as an SEO.

Moving Targets: Keywords in Crisis

Dr. Peter J. Meyers

Too often, we take a once-and-done approach to keyword research, but Google changes at the pace of information, and that pace speeds up even more during a crisis. How do we do keyword research in fast-paced industries and during world-changing moments? Dr. Pete provides concrete tactics for adaptive keyword research and spotting trends as they happen.

A Novel Approach to Scraping SEO Data

Rob Ousbey

Throughout a decade in SEO consulting, Rob needed to extract data from websites on many an occasion. Often this was at scale from sites that didn't have an API or export feature, or on sites that required some kind of authentication. While this was primarily a way to collect & combine data from different SEO tools, the use-cases were endless.

He found a technique that helped immensely, particularly when traditional tools couldn't do the job — but hadn't seen anyone using the same approach. In this very tactical session, Rob will walk through the steps he's used to extract data from all sorts of sites, from small fry to the giants, and give you the tools and knowledge to do the same.

Let It Go: How to Embrace Automation and Get Way More Done

Francine Rodriguez

Let the robot uprising begin! We've all heard horror stories about the dangers of automating your tasks, but now is not the time to deny yourself extra help. Robots never sleep. They don't get tired or overwhelmed by their to-do lists, and they're ready to work round-the-clock to accomplish whatever task we set before them. In this talk, you'll explore all the areas where automation is kicking butt in PPC, and how you can harness the power of robots to make more time for other efforts.

Designing a Content Engine: Going from Ideation to Creation to Distribution

Ross Simmonds

What does it take to develop a content engine that drives results? In this presentation, Ross will share data around the power of having a content engine, tools & strategies for content ideation, tools and tactics for content creation, and frameworks that brands can use to ensure that their content is distributed effectively after hitting publish. This presentation will help you not only uncover content-market fit, but also capitalize on it.

Accessible Machine Learning Workflows for SEOs

Britney Muller

"Machine learning" and "automation" aren't words SEOs need to fear. Machine learning enthusiast and ambassador of technical SEO Britney Muller shares a series of workflows intended for any SEO to access and use in their everyday work — no intimidation required.

How to Go Beyond Marketing for Clients: The Value of a Thriving Brand Ecosystem

Flavilla Fongang

Too many marketers serve their clients the bare minimum of what's expected from an agency. To stand out among the crowd, cultivate real loyalty, and maximize the lifetime value of your clients, you have to go beyond mere marketing — developing a thriving brand ecosystem that aligns with the brand's ultimate goals. Flavilla Fongang shares her tried-and-true framework for optimizing the customer journey, improving acquisition and retention, and going beyond what's expected to serve your clients well.

How to Be Ahead of the (CTR) Curve

Izzi Smith

Let’s face it: Carrying out SEO magic is all in vain when you’re forgetting about how your brand and products are being surfaced in the SERPs. By not properly analyzing or enhancing our organic CTR, we're greatly limiting our potential. Izzi will help you create the perfect SERP engagement strategy by covering practical ways to uplift your significant CTR, such as remedying your critical keyword rankings that could soon be lost, leveraging brand-empowering entity features (and assessing the risks of doing so), more intelligent testing of rich & featured snippet optimizations, and a whole lot more. CTR-you-ready?? You better be!

How to Promote Your Content Like a Boss

Brian Dean

Creating content is easy. But getting people to see your content? That's a different story. Brian Dean shares over a dozen practical strategies that you can use to spread the word about your latest blog post, podcast episode, or YouTube video.

Google My Business: Battling Bad Info & Safeguarding Your Search Strategy

Joy Hawkins

What's the harm in a little misinformation here and there? In the realm of local SEO, Joy Hawkins is here to outline exactly that. When it comes to local search and Google My Business, bad info can be make or break for your campaigns. Follow real data from a recent case study that illustrates why strategic decisions should be based on accurate information — and what can happen when that info is bad, wrong, or just plain incomplete.

Runtime: The 3-Ring Circus of Technical SEO

Michael King

Mike redefined technical SEO and its importance in our industry back in 2016. In 2018, he taught us everything we didn't know about SEO. This year, he's back to share the hottest technical tactics to up-level your efforts, plus the case studies and data that should be guiding your decisions.

Everyday Automation for Marketers

David Sottimano

As a general rule, we shouldn't be doing things that a computer can do better. However, a lot of automation is achieved through programming expertise — and that expertise isn't usually a marketer's forte. In this session, you'll learn how to gather data, use machine learning, and automate everyday tasks for marketers using low-code or no-code solutions.

Red Flags: Use a Discovery Process to Go from Red Flags to Green Lights

Dana DiTomaso

Ever get a few months into working with a new client and you’re thinking “if only we’d known…”? Or how about when you start that new job, except you can’t seem to make any forward progress because you’re always mopping up prior mistakes? Running a discovery process at the start of a project — or even as its own project — will help you turn those red flags into green lights.

Competitive Advantage in a Commoditized Industry

Heather Physioc

SEO isn't dead — it’s commoditized. In a world where search companies are a dime a dozen and brands tout bland "unique selling propositions" that aren't unique at all, how can you avoid drowning in the sea of sameness? What are you doing that's any different from every other SEO firm? In this talk, you'll learn how to find, activate, and articulate your competitive advantage. Learn how to identify unique strengths and innovative offerings that equate to competitive advantage through these real, working examples so you can bring them to life in search. You'll leave with actionable tips and homework to help your search business stand out — and that you can use with clients to help them find their competitive edge, too.

I Wanna Be Rich: Making Your Consultancy Profitable

Russ Jones

The last MozCon talk from our dear friend and colleague Russ Jones explores a topic more pertinent than ever. How will your company weather the next update? How will you avoid layoffs and salary cuts? Being a master of SEO doesn't guarantee that your consultancy will succeed. With a decade and a half of experience, the late great Russ Jones outlines the techniques that will keep your clients happy and your bottom line healthy.

The CMO Role Has Been Disrupted: Are You Ready for Your New Boss?

Wil Reynolds

CMOs have the shortest tenure in the c-suite, and the CMO role has been eliminated at some of the largest brands. CEOs are now asking tougher and tougher questions about the value of marketing — and oftentimes marketers are not prepared.

Connecting your data and building your data flywheel is one way to support the swift answers CEOs expect from their CMOs. We need to get stronger at bridging our day-to-day work to the value it drives. And more than ever, “brand lift” isn’t enough to satisfy CEOs.

This presentation will start at the top: how businesses are run, how CEOs talk, and how we as search marketers can use the data we have access to every day in new ways to answer the questions of the c-suite and raise our visibility and value in organizations.

Ready for more?

You'll uncover even more SEO goodness in the MozCon 2022 video bundle. At the low price of $299, this is invaluable content you can access again and again throughout the year to inspire and ignite your SEO strategy:


  • 29 full-length videos from some of the brightest minds in digital marketing

  • Instant downloads and streaming to your computer, tablet, or mobile device

  • Downloadable slide decks for presentation

Get my MozCon 2022 video bundle


Friday, September 16, 2022

Visual Search Optimization — Whiteboard Friday

In this week’s episode, MozCon 2022 speaker Crystal Carter talks you through the different optimizations that you can make for visual search, and the kinds of results that you might see for visual search content.

whiteboard outlining the process for visual search optimization

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to my Whiteboard Friday on visual search. Today, I'm going to talk about the different optimizations that you can make for visual search and the different kinds of results that you might see for visual search content.

Visual search optimization

So what happens with visual search is that you would do some optimizations on your website. Then, the user would do a visual search, and then they might get a different kind of result.

Image SEO

So the kinds of optimizations that you should consider for visual search, which means searches made via Google Lens or Pinterest Lens or via Bing's image search tools, include image SEO: making sure that you've got images that are performing well, with good file formats, titles, alt text, schema, all of that sort of thing. 

Entities

Also, you're going to think about the kinds of entities within your photos. Visual search recognition software and tools can understand lots of different kinds of entities. There are a few that they prioritize in particular, though: logos, landmarks, text, and entities that I've called "things" here just as a shorthand, meaning entities that are essentially found within the knowledge graph.
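If you want a rough sense of which of those entities recognition systems actually pick up in your own images, Google's Cloud Vision API exposes the same families of detections. This sketch is just one way to spot-check an image (the file path is a placeholder), not what Google Lens itself runs:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Cloud credentials are configured

with open("product-photo.jpg", "rb") as f:  # placeholder image
    image = vision.Image(content=f.read())

# The entity families from the whiteboard: logos, landmarks, text, and "things" (labels).
logos = client.logo_detection(image=image).logo_annotations
landmarks = client.landmark_detection(image=image).landmark_annotations
text = client.text_detection(image=image).text_annotations
labels = client.label_detection(image=image).label_annotations

print("Logos:", [x.description for x in logos])
print("Landmarks:", [x.description for x in landmarks])
print("Text:", text[0].description if text else "(none)")
print("Things:", [x.description for x in labels])
```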

Composition

And then, the other one you want to think about is your composition. So the composition that you have for your image will affect what Google understands the image to be about. 

So, for instance, the way that different elements are positioned within an image can affect how Google understands the image. So I did an article for Moz at the beginning of the year, where I compared a teapot, and there was a teapot where the handle was here and the spout was here, and they understood that to be a teapot. 

How composition impacts Vision AI interpretation of images.

And then, when they turned it this way, they understood it to be a kettle, and those are two different things. So the way that you think about composition for your image can affect it.

So make sure that you have clean and clear images and also that you're thinking about your images being similar to user-generated content, particularly if you're in a B2C business, and also that you understand the primary focus. So, for instance, if you had a photo of a bicycle and you were trying to emphasize the bicycle part of the image, if you had somebody who was sitting on the bicycle or standing next to the bicycle and they were taking up most of the image, Google would think that that picture was more about that person than it was about the bicycle. So think about where the primary focus is in your image in order to optimize for visual search.

You also want to think about contrast, just making sure that it's very clear what the focus of the image is and so that you've got whatever is the focus of your image very clear and easy to decipher and not too busy if you need it to be about a single thing.

So these different elements are things that you should consider when you're optimizing your images for a visual search, particularly for Google Lens, and as users carry out a visual search. 

Visual search results

So, for instance, if you use Google Lens and you take a picture of a butterfly or a caterpillar or a flower or a chocolate donut, you're going to get lots of different types of results. 

Image pack

So, first of all, you may very well get an image pack result, and this will include some of the information that we were talking about before. 

So the difference between visual search and image search SEO is that with image search SEO, like when you go to the Image tab within Google, you can enter the words "chocolate donut." But let's say you didn't know what a chocolate donut was, or let's say it was a different language and you didn't know the local word for chocolate donut. So what would happen is that the user would make the visual search of the chocolate donut, and Google would use its tools, like Vision AI, for instance, to understand that that's a chocolate donut, and then they would look through their images to understand which ones had text cues that were talking about chocolate donuts and that sort of thing. So that would return, potentially, some image pack information, and also, in the chocolate donut example in particular, it might return something like multisearch. 

So, for instance, you would do a modification. You might say a donut like this, but with sprinkles maybe, for instance. You might also get a result that's around Google Shopping, for instance.

SERP features

The other one you want to think about is the kinds of result you might get for a different SERP feature. So Local Pack is something that might come up. Also, knowledge panel. So particularly with the entities, the entities may very well be attached to a specific knowledge panel. So, for instance, logos for businesses or landmarks will have a knowledge panel, and also certain things, like if you were to think about something like Lego, that may very well have a knowledge panel as well. And landmarks, again, also could very well be showing in Google Maps. 

So think about the kinds of SERP features that you might show there. And that means that you could also, while you're optimizing this as part of your optimization for visual search, you might think about the optimizations that you make for these types of SERP results as well.

Visual match

Finally, the other kinds of results that Google might give to someone when they make a visual search is a visual match. So visual matches are images that look really similar to the picture that the person took, and these will sometimes return image packs and sometimes return a Local Pack, and they'll sometimes just return a general SERP result, like including a featured snippet that might have an image in it. You might also see something for a Google Business Profile. So if there is something that's local that has that, then they may very well get a Google Business Profile visual match, and also just general web content that might come through there.

So there's lots of different opportunities to return a visual match, but this one is particularly good when you're thinking about the composition of your images. So if you have a lot of footfall, if you have a lot of interaction with customers where they are reviewing your content, where they are visiting your establishment, and they're creating a lot of user-generated content, then think about how you can create images and add images to your website that satisfy the visual match queries that users might be making.

And I think there are some great opportunities across visual search in the next few years. Google has been investing in this quite a lot, and I think that this is an opportunity for businesses of all sizes, and I hope to see more people getting involved with visual search optimizations.

Video transcription by Speechpad.com

Wednesday, September 14, 2022

The State of Digital Accessibility: Three Key Challenges

Illustration of the web accessibility icon, the outline of a person, in purple on a white circle against a blue background.

Earlier this year, the Department of Justice (DOJ) published its first web accessibility guidance in 10 years. It was meant to remind businesses of all sizes that their websites — just like physical storefronts — need to be accessible to people with disabilities. 

The DOJ guidance comes at a time when the majority of US businesses are getting swept up in accelerated digital transformation and struggling to make their websites accessible to people of all abilities. 

According to WebAIM’s most recent accessibility analysis of the top one million homepages, 97% of websites have accessibility errors, such as low contrast text and missing written descriptions of images, failing to meet some of the basic Web Content Accessibility Guidelines (WCAG), a de facto international standard. This is a slight improvement from 2020, when 98% of homepages were inaccessible. 

With only 3% of the Internet accessible, we have an urgent problem on a big scale. 

There are a number of reasons why, despite the growing awareness of digital accessibility, expectations of inclusivity, and renewed efforts by the government, we are still lagging behind. 

Among those reasons are the following three challenges that reflect the state of digital accessibility today. 

Three key challenges in digital accessibility 

1. The lack of clarity on legal requirements 

Illustration of a hand bringing down a purple gavel onto the web accessibility icon.

The Americans with Disabilities Act (ADA), which prohibits discrimination based on disability, and other laws governing accessibility in the United States were written before the Internet became an integral part of our lives. Today, the Justice Department and courts across the country decide on digital accessibility lawsuits on a case-by-case basis, relying on WCAG as a technical standard. But because these guidelines haven’t been codified, for many businesses it’s hard to know with certainty which standards are applicable to them, whether their websites meet legal requirements, and what specific steps they should take to comply with the laws.  

The Justice Department’s 2022 guidance somewhat addresses this ambiguity by reaffirming that web accessibility is a requirement under Title III of the ADA. Title III of the ADA requires any business “open to the public” to make their online content and services accessible to people who rely on assistive technologies, such as screen readers, to browse the Internet. 

With the current laws, businesses can choose how to ensure their content is accessible to people with disabilities. The DOJ guidance points to the WCAG and the Section 508 Standards (which the US federal government uses for its own websites), but it doesn’t provide a new legal standard. For example, it’s not clear whether businesses with online-only stores have to adhere to the same legal standard as those with both physical locations and e-commerce sites. 

With so much left to interpretation, including how many and which WCAG criteria a website needs to conform with in order to be considered ADA compliant, it’s hard for businesses to know where they stand when it comes to digital accessibility compliance. 

Further complicating matters is the complex and ever-changing nature of the Internet.

2. The dynamic nature of the Internet 

Illustration of several web page examples floating against a purple and teal background.

Whether it’s personalization based on user actions and preferences, or new content creation – websites are constantly changing, posing an ongoing challenge to keep them accessible. Every change, no matter how small — like adding a new product description or an image — can potentially make content inaccessible to users with disabilities. 

In a recent analysis of 3,500 websites across 22 industries, including healthcare, e-commerce, and employment, AudioEye, a web accessibility platform, found that 79% of the websites had at least three severe accessibility errors that could potentially block an assistive technology user from interacting with the content and/or completing the goal of a site visit, such as submitting a form or requesting information. 

When comparing different industries in the same analysis, the analysis found that 83% of e-commerce sites, 78% of healthcare sites, and 77% of jobs and career sites had accessibility errors that blocked or significantly impacted users’ ability to complete key tasks, such as viewing product descriptions, making a purchase, filling out an application, or booking an appointment.

Considering the dynamic nature of the Internet and the speed of content creation (more than 250,000 sites are launched every day), it’s clear we need a web accessibility solution that can monitor for accessibility errors in real-time and help fix issues as they come up. 

And while automation can provide rapid improvement at scale, it cannot solve all errors. 

3. Current limits of technology

Illustration of the web accessibility icon in a pink circle with a crack through it, centered among web page examples.

Even the best accessibility automation today, which can detect up to 70% of common accessibility errors and resolve two-thirds of them, cannot solve complex accessibility issues that require human judgment. Detecting more subtle errors often requires an understanding of context that is beyond even the most sophisticated AI today. For example, automation can detect that an image lacks a written description, or alt text, but it cannot tell whether an existing description is meaningful or accurate. Even with human judgment, if you ask two people to describe an image, their descriptions may be similar, but it is unlikely they would be exactly the same. Determining which description is the better one is also subjective, and AI is not yet able to make those types of judgments.
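To illustrate where that line sits, here's a tiny sketch of the kind of check automation handles mechanically: flagging images with missing or empty alt attributes on a page (the URL is a placeholder). Judging whether an existing description is meaningful is the part that still needs a person.

```python
import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example.com/", timeout=10).text  # placeholder URL
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    alt = img.get("alt")
    if alt is None:
        print("Missing alt attribute:", img.get("src"))
    elif not alt.strip():
        print("Empty alt attribute:", img.get("src"))
```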

AudioEye’s analysis of 20,000 websites across industries showed that even the sites that were using some type of an automated digital accessibility solution — or about 6% of all sites in the analysis — still had accessibility errors with significant impact on the user experience. 

In another analysis, this time a manual audit of 55 randomly selected websites that relied on manual testing and remediation services (the traditional approach), AudioEye found over 950 accessibility issues. More than 40 of these sites had one or more severe accessibility issues, such as non-functional site navigation, unlabeled graphics, inaccessible video controls, and other issues that made digital content and tools inaccessible to people with disabilities.

Looking specifically at their own customers’ websites, AudioEye found that the majority of accessibility issues (up to 95%) can be fixed and prevented using a mix of automated and manual remediations, leveraging JavaScript, without the need to modify the original source code.

What will it take to solve digital accessibility at scale?

Accessibility solutions today range from simple automation-only tools to labor-intensive manual audits. AudioEye’s research, which included both automated and manual analysis of websites across industries, showed that the most effective way to solve web accessibility at scale is through a combination of technology and human expertise. 

To learn more about the state of digital accessibility and the role of technology in solving accessibility at scale, download AudioEye’s white paper on Building for Digital Accessibility at Scale, which includes the research details.

Tuesday, September 13, 2022

5 Things I Learned About E-A-T by Analyzing 647 Search Results

As a writer at a content marketing agency, I’ve written for a lot of different clients, and almost everything I’ve produced has been intended to rank on Google and encourage website traffic.

Here’s the challenge I (and every other marketing writer on the planet) am up against: search competition.

No matter what industry you’re in, or target audience you’re speaking to, you’re not alone. You have competition. And if you and your competition both understand the SEO game (which is very much the case for most companies nowadays), then what do you have to fall back on to protect your visibility in the all-important SERPs?

According to Google, it’s E-A-T: Expertise, Authoritativeness and Trustworthiness.

But here’s the complicated thing: Every one of my clients — even the small ones thriving in very big industries — has expertise, and authoritativeness, and even trustworthiness. So, how does that help them in search? And how can they possibly prove to Google, amid all the noise and competition and other experts out there, that they deserve a place on Page 1?

Last year, I set out to find out.

Methodology

Google is pretty clear about the fact that websites need E-A-T, but what they don’t really clue us in on is what E-A-T actually is or how it’s measured. I hypothesized that, if I compiled a big list of SERPs and closely analyzed all the Page 1 results, I could narrow down what may comprise E-A-T.

Theoretically, E-A-T affects different industries in different ways. That’s because reliable information is more critical for some topics and subject areas than for others, such as when you’re searching for information about prescription drugs or complicated financial products.

So, the first thing I did was choose seven topic categories to focus on: legal, insurance, health care, loans, pharmaceuticals, military, and informational questions. Next, I picked 10 queries for each category.

Then I searched. The resulting 70 SERPs produced 647 results. I analyzed each of those results, looking specifically for 32 different factors.

Finally, I reviewed what I had recorded and asked:

  • Which factors were the most prevalent across all 647 results?

  • Which factors were most prevalent among the 210 Top 3 results?

  • Were there differences in prevalent factors across the various topic categories I chose?

Before we get into the results, let’s talk about correlation vs. causation for a moment. While each of these factors seemed to be very common among Page 1 results, and it seems clear that some of these factors do play a role in establishing E-A-T, all I can really say for sure is that these traits are associated with pages that rank well in search. They could be indicators of a good page or website, but not necessarily the determining factor that’s putting them on Page 1.
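To make the tallying concrete, here’s a minimal sketch of how those prevalence questions could be answered once each result is recorded as a row of true/false factor flags. The factor names and numbers below are hypothetical placeholders, not the study’s actual data.

```python
# Minimal sketch: compare factor prevalence across all results vs. Top 3 results.
# Factor names and values are hypothetical placeholders.
results = [
    {"position": 1, "original_research": True,  "wikipedia_page": True,  "https": True},
    {"position": 5, "original_research": False, "wikipedia_page": True,  "https": True},
    {"position": 9, "original_research": False, "wikipedia_page": False, "https": False},
    # ...one entry per analyzed result (647 in the study)
]

factors = ["original_research", "wikipedia_page", "https"]
top3 = [r for r in results if r["position"] <= 3]

for factor in factors:
    overall = sum(r[factor] for r in results) / len(results)
    top = sum(r[factor] for r in top3) / len(top3)
    print(f"{factor}: {overall:.0%} of all results, {top:.0%} of Top 3 results")
```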

With that in mind, here are five lessons I learned about E-A-T after closely analyzing the results from those 70 searches.

Lesson 1: Original, relevant, recent content is essential

Infographic outlining lesson one: original, relevant, recent content is a must.

Of all the lessons, this is the least surprising to me, but perhaps the most important. To rank well for relevant terms, you need to strongly demonstrate that your website belongs in search results. How? Content, obviously.

But it’s got to be high-quality content. Usually, I’d say that means you’re addressing the topic from all angles and leaving no questions unanswered. But after this SERP inspection exercise, I’d actually say the three most important characteristics of high-quality content are that it’s:

  1. Original

  2. Relevant

  3. Recently published or updated

Original research

One factor I sought throughout this study was original research. To me, this included any content that’s created using information the organization sources, analyzes, and publishes themselves.

Just shy of two-thirds of the results’ websites contained original research, but among the websites whose results were in the Top 3 positions, 70% had original research available. This shows the importance of creating your own, unique content — a story only you can tell. Trust me, you have one.

Relevance and topical authority

Beyond content just being unique, it also needs to be relevant to your industry and target audience. Topical authority is a weird concept because SEOs know it’s real, but there’s no way to measure it, and Google hasn’t exactly come out and said they have a topical authority ranking factor.

However, they have given us a lot of clues that point to topical authority being a highly important factor in E-A-T — like this patent they filed in 2017. Even in their recent Helpful Content Update, Google highlights questions that creators should ask themselves when considering their own site content. The question, “Does your site have a primary purpose or focus?” in particular alludes to the importance of creating content for a topic niche or specific subject area.

Given the limited tools on this subject, I decided to create my own (rudimentary) method of measuring topical authority by roughly gauging the depth of topic coverage across a whole website. Here’s what I did:

  1. Determine the parent topic of the query in question. “Insurance” is the parent topic for “types of insurance” and “world population” is the parent topic for “how many people are in the world,” for example.

  2. Find the Topic Coverage Score (TCS, as I call it) of each result’s website. That’s the number of pages indexed by Google that contain an exact-match of the parent term.

  3. Calculate the average TCS of all Page 1 results for each query.

  4. Compare the TCS of each result with the average TCS for that query.

After working through those steps, I found that while 25% of Page 1 results had a TCS higher than the average, 40% of Top 3 results did. In other words, the websites with the most topic coverage were more likely to land at the top of the page.
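Here’s a rough sketch of that comparison step, assuming the TCS counts (pages indexed for an exact-match search of the parent term) have already been collected; the domains and counts below are made up for illustration.

```python
# Minimal sketch: compare each site's Topic Coverage Score (TCS) to the
# Page 1 average for a single query. Domains and counts are hypothetical.
tcs_by_site = {
    "site-a.com": 1200,  # indexed pages containing the parent term, e.g. "insurance"
    "site-b.com": 310,
    "site-c.com": 45,
    # ...one entry per Page 1 result for the query
}

average_tcs = sum(tcs_by_site.values()) / len(tcs_by_site)
above_average = [site for site, tcs in tcs_by_site.items() if tcs > average_tcs]

print(f"Average TCS for this query: {average_tcs:.0f}")
print(f"Sites above the average: {above_average}")
# Repeating this per query, then splitting by rank bucket (Page 1 vs. Top 3),
# yields the percentages discussed above.
```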

Recently published or updated

Half of all Top 3 and 48% of Page 1 results were dated within the previous two years. There are plenty of evergreen topics that don’t need regular content changes (the oldest result in my study was a page explaining why the sky is blue from 1997). Updating content just for the sake of giving it a new date won’t help you rank any higher in Google. However, creating timely content and updating old content as necessary could help.

Lesson 2: Your off-site, online presence matters

Here’s a lesson I wasn’t expecting to learn. When I set out on this study, I thought the biggest E-A-T factors would correspond to the website in question more so than the organization that manages it. Not so much: It became clear to me that your off-site, online presence plays a role in helping you rank in Google search results.

The vast majority (95%) of all results had third-party reviews of some kind, whether they’re Google My Business reviews, comments on Glassdoor, site trustworthiness information on Trustpilot, or something else.

Wikipedia is also a common thread between many of the results. While 89% of Page 1 results’ websites or organizations had at least one Wikipedia mention, 93% of Top 3 results did, too. As far as actual Wikipedia pages, 73% of Page 1 results and 82% of Top 3 results’ organizations had one.

The high prevalence on Page 1 tells me that it’s fairly common to have a Wikipedia connection, but the higher numbers corresponding to the Top 3 results hints at what their importance might be.

Another patent from Google, this one updated in 2018, discusses the topic of seed sites. A seed site, theoretically, is one that the search engine trusts because it generally has quality content and good, valuable links. Google hasn’t revealed whether this seed site theory is valid, or to what extent it plays a role in search algorithms (if any). But if I were to choose a seed site, Wikipedia would be a good contender. Each page has tons of links to websites with relevant information on carefully organized topics.

Another website worth mentioning is the Better Business Bureau. While it gives a limited perspective (since it only covers Canadian and US businesses), I found that many Page 1 results’ organizations (70%) and even more Top 3 results (74%) had at least a BBB page, though not necessarily a grade. In fact, a little over one-fourth of the results that had a BBB page didn’t have a rating.

It seems to me that the value is in getting listed on BBB’s website more so than achieving a good grade — perhaps a North American-specific seed site of sorts.

Lesson 3: Transparency and honesty are the best policies

So far, we’ve learned a lot about E (expertise) and A (authoritativeness) but where the T — trustworthiness — really comes into sight is when we start talking about transparency.

Google states right in its Page Quality Rating guidelines that webmasters should state on their website exactly who is responsible for site content. That can be a person or people, or it could be an organization. Moz, for example, explains who is responsible for its site content on its About page. Similarly, 91% of the results I analyzed had a detailed About Us page.

Another way of being transparent about what your site is all about is by publishing editorial standards or guidelines. These documents detail how your site gets populated: where content comes from, what characteristics help it make the cut, what the organization won’t publish, and more.

43% of Page 1 results and 49% of Top 3 results had some sort of editorial guidelines published. These included information quality guidelines, pitch guidelines that reflect editorial standards, correction policies, and corporate governance documentation that addresses communication or media.

Why should publishing guidelines benefit your site? Well, I could see two factors at play here.

First off, Google's Page Quality Rating guidelines specifically note that “High E-A-T news sources typically have published established editorial policies and robust review processes.” That doesn’t prove that the algorithm considers the presence of editorial guidelines (or even knows about them all the time), but it does lend us insight into the mind of Google.

Second, I’d be willing to bet that there’s a strong correlation between organizations that take the time to put together editorial guidelines and those that take the time to ensure their content is worthy of their site. Additionally, the process of putting together editorial guidelines is itself a good exercise in ensuring that your website content is high quality.

Lesson 4: Connections go a long way

No business operates in a vacuum, especially not on the Internet. The connections your organization has made with others, and how you acknowledge them, make an impact on how your community views you.

Reputable partners

There are all kinds of connections a business might make with another organization. Throughout the study, I kept track of something I called “reputable partners.” To earn this mark, a website had to demonstrate a relationship between themselves and another organization that’s plainly in support or favor of their work or mission.

Some of the most common types of demonstrations of these relationships included:

  • Articles and press releases announcing partnerships or outcomes.

  • Explanations of the relationships between those organizations.

  • Accolades from recognized organizations highlighted on-site through badges, links to award announcements, press releases, etc.

  • Links to press releases or articles demonstrating the relationship between organizations, and/or award badge displays.

While 73% of the results I looked at had clear “reputable partners,” 78% of those in the Top 3 did, too. My theory for this pattern is that by making it obvious which other organizations support you, whether generally or financially (e.g. through a grant), or are in favor of your mission, you’re being transparent about how your organization operates. That fits squarely with the T in the E-A-T equation.

Backlinks

Another type of connection modern businesses deal in today is backlinks. Links put the “Inter” in “Internet,” and they’ve become essential for people and (more importantly for this subject) web crawlers to understand and navigate the web.

The average number of backlinks across all 647 results I analyzed was 32,572. Among the 210 Top 3 results, it was 88,581.

It’s certainly possible to get on Page 1 with fewer than that: about one-third had fewer than 100 backlinks, and 28 had none whatsoever. However, the averages suggest that link quantity still plays a role.

But what about link quality? For that, we can look at Moz’s Spam Score. This metric indicates your backlink profile health, with a 1% rating as really healthy and a 99% rating as super unhealthy.

While Moz considers a “low” score to be 30% or less, 44% of Page 1 results had a Spam Score of 1%, indicating that most Page 1 results have a very clean, healthy backlink profile. Another 19% had scores of 2 or 3%. The Top 3 results mirrored these results (with 44% at 1% and 18% at 2 or 3%).

Another way we can make some assumptions about link quality is by looking at referring domains. When there are loads of backlinks but very few referring domains, it seems less likely to be the result of deliberate link-building efforts. On the other hand, a higher number of referring domains could indicate more honest link-building tactics or simply just a really good web page that others want to link to.

The average number of referring domains among Page 1 results was 752. Meanwhile, among the Top 3 results, the average was 1,594. Making connections with other organizations online through honest link-building efforts can expand your reach while also showing Google and other search engines that you offer quality, worthwhile content.
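As an illustration of how these link metrics can be aggregated (a minimal sketch with made-up numbers, assuming backlink counts, referring domains, and Spam Scores have been exported per result, e.g. from a link index like Moz Link Explorer):

```python
# Minimal sketch: aggregate link metrics per result. Values are hypothetical.
link_data = [
    {"position": 1, "backlinks": 120_000, "referring_domains": 2_300, "spam_score": 1},
    {"position": 4, "backlinks": 85,      "referring_domains": 12,    "spam_score": 3},
    {"position": 8, "backlinks": 0,       "referring_domains": 0,     "spam_score": 1},
    # ...one entry per analyzed result
]

def average(rows, key):
    return sum(r[key] for r in rows) / len(rows)

top3 = [r for r in link_data if r["position"] <= 3]

print(f"Avg backlinks, all results: {average(link_data, 'backlinks'):,.0f}")
print(f"Avg backlinks, Top 3:       {average(top3, 'backlinks'):,.0f}")
print(f"Avg referring domains, all: {average(link_data, 'referring_domains'):,.0f}")

low_spam = sum(r["spam_score"] <= 3 for r in link_data) / len(link_data)
print(f"Share of results with a Spam Score of 3% or lower: {low_spam:.0%}")
```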

Lesson 5: The right technology is essential

Last, but absolutely not least: if you have a website, it needs to be set up securely so that visitors can trust that they’re not putting their data at risk by interacting with it. In my study, I found that 96% of all results (and also 96% of just the Top 3 results) used HTTPS. Interestingly, the results that didn’t use HTTPS appeared most often in the military portion of the study.

Websites today also need technologies for cookie notifications, and some use pop-ups to convey important messages. Others use advertising to monetize their site. In any of these situations, the website owner should aim to minimize disruption to the user’s experience. Just 42% of all results had a pop-up: the most common type (81) invited the user to subscribe to something (e.g. a newsletter), while nearly as many (79) communicated information related to cookies.

Having the right technology enabled on your website may not seem inherently connected to E-A-T (which is why I didn’t evaluate even more technological considerations, such as e-payment systems), but considering that a huge aspect of Trustworthiness online today is about data gathering and management (and the ill effects of mismanagement), it’s apparent that this area matters just as much as, if not more than, all your efforts toward quality content creation.

Conclusion

When I set out to uncover the factors associated with E-A-T, I fully anticipated learning about proper author attribution, source citations, and good content. I guess I was thinking with my author hat on and not my web user hat, because I was only close on one of those three.

There are a lot of activities digital marketers can do to promote their businesses and goods and services today. Content creation and content marketing, link building, local SEO, advertising, public relations, and more can all seem like great options that you can pursue.

But the truth is, they’re not options — they’re must-haves for building a holistic digital presence. After conducting this study, my advice to webmasters and business leaders would be to assess your current online presence (including but not limited to your website’s user experience) and determine where there are holes. Working to fill those holes won’t necessarily be easy, but it will be worth it when your web traffic increases and your pages begin to rank.

To see a detailed explanation of each factor considered in this study, check out the full E-A-T study report on the Brafton blog.