Friday, March 12, 2021

How to Select Meaningful B2B SEO Keywords

Posted by Cody_McDaniel

It’s no secret that B2B marketing is different than B2C. The sales cycle is longer, there are multiple stakeholders involved, and it’s usually more expensive. To market effectively, you need to create content that helps, educates, and informs your clientele. The best way to do that is to identify the keywords that matter most to them, and build out content accordingly.

To find out how, watch this week's episode of Whiteboard Friday! 

How to Select Meaningful B2B SEO Keywords

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi and welcome to another Whiteboard Friday. My name is Cody McDaniel, and I'm an SEO manager at Obility. We are a B2B digital marketing agency, and today I want to talk about selecting meaningful B2B SEO keyword targets and the process and steps you can take in your own keyword research.

So B2B is a little bit different than you would see in your normal B2C types of marketing, right? The sales cycle or the length of time it takes to actually make a purchasing decision is usually a lot longer than you would see just buying something off Amazon, right? It's going to take multiple stakeholders. Individuals are going to be involved in that process. It's going to be usually a lot more expensive.

So in order to do that, they're going to want to be informed about their decision. They're going to have to look up content and information across the web to help inform that decision and make sure that they're doing the right thing for their own business. So in order to do that, we have to create content that helps, educates, and informs these users, and the way to do that is finding keywords that matter and building content around them.

1. Gather seed list

So when we're developing keyword research for our own clientele, the first thing that we do is gather a seed list. So usually we'll talk with our client contact and speak to them about what they care about. But it also helps to get a few other stakeholders involved, right, so the product marketing team or the sales team, individuals that will eventually want to use that information for their clients, and talk with them about what they care about, what do they want to show up for, what's important to them.

That will sort of help frame the conversation you want to be having and give you an understanding or an idea of where eventually you want to take this keyword research. It shouldn't be very long. It's a seed list. It should eventually grow, right? 

2. Review your content

So once you've done that and you have a baseline understanding of where you want to go, the next thing you can do is review the content that you have on your own website, and that can start with your homepage.

What's the way that you describe yourselves to the greater masses? What does the flagship page have to say about what you offer? You can go a little bit deeper into some of your other top-level pages and your About Us page. But try to generate an understanding of how you speak to your product, especially in relation to your clients and the industry that you're in. You can use that, and from there you can go a little bit further.

Go through your blog posts to see how you speak to the industry and to educate and inform individuals. Go to newsletters. Just try to get an understanding of what exists currently on the website, where your efficiencies may be, and of course where your deficiencies are or your lack of content. That will help you generate ideas on where you need to look for more keywords or modifications in the keywords you have.

3. Determine your rankings

Speaking of which, with the keywords that you currently have, it's important to know how you stand. So at this point, I try to look to see how we're ranking in the greater scheme of things, and there are a lot of different tools that you can use for that. Search Console is a great way to see how potential users across the web are going to your website currently. That can help you filter by page or by query.

You can get an understanding of what's getting clicks and generating interest. But you can also use other tools — SEMrush, SpyFu, Ahrefs, and Moz, of course. They'll all give you a keyword list that can help you determine what users are searching for in order to find your website and where they currently rank in the search engine results page. Now usually these lists are pretty extensive.

I mean, they can be anything from a few hundred to a few thousand terms. So it helps to pare it down a little bit. I like to filter it: if a term has no search volume, nix it. If it's a branded term, I don't like to include it, because you should be showing up for your branded terms already. And if it's outside the top 50 in rankings, things like that, I don't want that information here right now.
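If you'd rather script those filters than click through them in a spreadsheet, here's a minimal pandas sketch. The file name, column names, and brand pattern are all assumptions; match them to whatever your rank-tracking tool actually exports.

```python
import pandas as pd

# Load the keyword export from your rank tracker (file and column names are assumptions).
df = pd.read_csv("keyword_export.csv")

BRAND_TERMS = "brandname|brand name"  # hypothetical brand patterns, pipe-separated

df = df[df["Volume"] > 0]                                                # no search volume? nix it
df = df[~df["Keyword"].str.contains(BRAND_TERMS, case=False, na=False)]  # drop branded terms
df = df[df["Rank"] <= 50]                                                # ignore anything outside the top 50

df.to_csv("keyword_candidates.csv", index=False)
print(f"{len(df)} keywords left to review")
```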

4. Competitive research

I want to understand how we're showing up, where our competencies are, and how we can leverage that in our keyword research. So that should help the list become a little bit more condensed. But you can also look not just internally but externally, right? You can look at your competition and see how you're ranking or comparing, at least, on the web.

What do they use? What sort of content do they have on their website? What are they promoting? How are they framing that conversation? Are they using blog posts? All that information is going to be useful for maybe developing your own strategies or maybe finding a niche where, if you have particularly stiff competition, you can find areas they're not discussing.

But use that competition as a framework for identifying areas and potential opportunities, and for understanding how the general public or industry speaks to some of the content that you're interested in writing about. So once you have that list (it should be pretty big, and it should give you a good idea of the ecosystem you're working with), it's important to gather metrics.

5. Gather metrics

This is going to contextualize the information that you have, right? You want to make informed decisions on the keywords that you have, so this metric gathering will be important. There are a lot of different ways you can do it. Here at Obility, we might categorize them by different topic types so we can make sure that we're touching on all the different levels of keyword usage for the different topics that we discuss in our content.

You can look at things like search volume. There are a lot of different tools that do that, the same ones I mentioned earlier: Moz, SpyFu, SEMrush. There's a great tool we use called Keyword Keg that kind of aggregates all of them. That will give you an idea of search volume on a monthly basis. But you can also use other metrics, things like difficulty (how hard it is to rank compared to some of the other people on the web) or organic click-through rate (what level of competition you're going to be up against in terms of ads, videos, carousels, or other Google snippets).

Moz does a great job of that. So use these metrics. What they should help you do is contextualize the information, so that if you're pretty close on two or three keywords, that metric gathering helps you identify which one is the easiest, which has the most potential, and so on. Once you have that, you should have a good understanding of where each of those keywords lives, and you can start selecting your targets.
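One way to make that contextualizing concrete is to fold volume, difficulty, and organic CTR into a single rough opportunity score. This is an illustrative heuristic, not a standard formula; the column names and weights are assumptions you should tune to your own data.

```python
import pandas as pd

# Assumed columns: Keyword, Volume, Difficulty, Organic CTR (0-1).
df = pd.read_csv("keywords_with_metrics.csv")

def norm(s):
    """Scale a metric to 0-1 so different units are comparable."""
    return (s - s.min()) / (s.max() - s.min())

# Higher volume and CTR help; higher difficulty hurts. Weights are arbitrary.
df["opportunity"] = (0.5 * norm(df["Volume"])
                     + 0.3 * norm(df["Organic CTR"])
                     - 0.2 * norm(df["Difficulty"]))

print(df.sort_values("opportunity", ascending=False).head(10))
```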

6. Select target keywords

Now I've run into a ton of clients whose former agencies sent them a list of 300 to 400 keywords to rank for, and I cannot stand it. There's no value to be had, because how can you possibly optimize and rank for hundreds and hundreds of different variations of keywords? It would take too long, right? You could spend years in that rabbit hole.

What we try to do is focus on maybe 30 or 40 keywords and really narrow down what sort of content needs to be created for them and what you need to optimize. Does it exist on your website? If not, what do we need to make? Having that list makes for a much more compartmentalized marketing strategy, and you can actually look at it and weigh it against how you're currently deploying content internally.

You can look at success metrics and KPIs. It just helps to have something a little bit more tangible to bite down on. Of course, you can grow from there, right? You start ranking well for those 20 or 30 terms, and you can add a few more on at the end of it. But again, I think it's really important to focus on a very select number, categorizing them by the importance of which ones you want to go first, and start there because this process in content creation takes a long time.

7. Consider intent

But once you've selected those, it's also important to consider intent. You can see I've outlined intent here a little bit more in depth. What do I mean by that? Well, the best way that I've seen intent described online is as an equation. So every query is made up of two parts, the implicit and the explicit. What are you saying, and what do you mean when you're saying it?

So when I think of that and trying to relate it to keywords, it's really important to use that framework to develop the strategy that you have. An example that I have here is "email marketing." So what's the implicit and explicit nature of that? Well, "email marketing" is a pretty broad term.

So implicitly they're probably looking to educate themselves on the topic, learn a little bit more about what it's about. You'll see, when you search for that, it's usually a lot more educational content that helps the user understand it better. They're not ready to buy yet. They just want to know a little bit more. But what happens when I add a modifier to it? What if I add "software"? Well, now the intent changes. It may overlap with "email marketing" in some contexts, but "software" implies that they're looking for a solution.

We've now gone down the funnel and are starting to identify terms where a user is more interested in purchasing. So that type of content is going to be significantly different, focused much more heavily on features and benefits than content for plain "email marketing" would be. So that intent is important for framing your keywords, and it's important to make sure you have keywords at every step of your purchasing funnel.

The way that I like to usually look at that, and you see it everywhere, it's an upside down triangle. You have your top, middle, and bottom level pieces of content. Usually the top is going to be things like blogs and other sorts of informational content that you're going to be having to use to inform users of the types of topics and things in the industry you care about.

That's probably where something like "email marketing" would exist. But "email marketing software" is probably going to be sitting right here in the middle, where somebody is going to want to make an informed decision, relate it to other pieces of content on competitor websites, check those features, and determine if it's a useful product for them, right? From there, you can go a little bit further and move them into different types of content, maybe email marketing software for small business.

That's far more nuanced and specific, and maybe you'll have a white paper or a demo that's specifically tailored to businesses that are looking for email marketing in the small business space. So having content in three separate spaces and three different modifications will help you identify where your content gaps are and make sure that users can move throughout your website and throughout the funnel and inform themselves on the decision they're trying to make.
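To make that mapping concrete, here's a toy sketch that buckets keywords into funnel stages based on their modifiers, mirroring the "email marketing" example above. The modifier lists are illustrative assumptions; in practice you'd sanity-check each bucket against the actual SERPs.

```python
# Illustrative modifier lists -- tune these to your own industry.
BOTTOM_MODIFIERS = ["for small business", "pricing", "demo", "vendor"]
MIDDLE_MODIFIERS = ["software", "tools", "platform", "solution"]

def funnel_stage(keyword: str) -> str:
    kw = keyword.lower()
    if any(m in kw for m in BOTTOM_MODIFIERS):
        return "bottom"  # nuanced, purchase-ready queries
    if any(m in kw for m in MIDDLE_MODIFIERS):
        return "middle"  # comparing solutions and features
    return "top"         # broad, educational queries

for kw in ["email marketing",
           "email marketing software",
           "email marketing software for small business"]:
    print(f"{kw} -> {funnel_stage(kw)}")
```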

Conclusion

So with that, this should give you some idea of how we develop keyword research here at our own agency, and I hope that you guys can utilize some of these strategies in your own keyword research wherever you are out in the world. So thanks again for listening. Happy New Year. Take care.

Video transcription by Speechpad.com



Wednesday, March 10, 2021

What I Found After Experimenting with Google Discover for Two Months

Posted by Moberstein

I’m completely fascinated by Google’s Discover Feed. Besides the fact that it serves highly-relevant content, it also seems beyond the reach of being gamed. In a way, it almost seems beyond the reach of pure SEO (which makes it downright tantalizing to me).

It all made me want to understand what makes the feed tick.

So I did what any sensible person would do. I spent the better part of two months running all sorts of queries in all sorts of different ways to see how it impacted my Discover Feed.

Here are my ramblings.

My approach to analyzing Google’s Discover Feed

Let me explain what I did and how I did it, to both give you a better understanding of this analysis and point out its gaping limitations.

For five days a week, and over the course of two months, I executed all sorts of user behavior aimed at influencing my Discover Feed.

I ran queries on specific topics on mobile, searched for other topics on desktop… clicked on results… didn’t click on results... went directly to sites and clicked… went directly to sites and didn’t click anything, etc.

In other words, I wanted to see how Google reacted to my various behaviors. I wanted to see if one behavior influenced what showed in my Discover Feed more than other behaviors.

To do this, I searched for things I would generally never search for, went to sites I would normally never visit, and limited my normal search behavior at times so as not to influence the feed.

For example, I hate celebrity news and gossip with a passion, so I went to people.com every day (outside of the weekends) and scrolled through the site without clicking a thing. I then recorded if related material (i.e. celebrity nonsense) ended up in my Discover Feed the next day.

I recorded all of my various “web behaviors” in the same way. I would execute a given behavior (e.g. search for things related to a specific topic on mobile, but without clicking any results) and record what happened in my Discover Feed as time went on.

Here’s a breakdown of the various behaviors I executed along with the topics associated with each behavior. (For the record, each behavior corresponds to a single topic or website so I could determine the impact of that behavior on my Discover Feed.)
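A log like that doesn't need anything fancy. Here's a sketch of how you could tally a similar CSV with pandas; the file and column names are assumptions, not my actual workbook.

```python
import pandas as pd

# One row per behavior instance: did related content show up in Discover the next day?
log = pd.read_csv("discover_log.csv")  # assumed columns: date, behavior, topic, appeared (0/1)

summary = (log.groupby("behavior")["appeared"]
              .agg(days="count", hits="sum", hit_rate="mean")
              .sort_values("hit_rate", ascending=False))
print(summary)
```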

Allow me to quickly elaborate on the behaviors above:

  • These are all topics/sites that I am in no way interested or involved in (particularly self-help car repair).
  • When I clicked a YouTube video, I watched the entire thing (I mean, I didn’t actually watch half the time, but Google doesn’t know that… or do they?)
  • When I visited a site, I did scroll through the content and stay on the page for a bit.
  • A search for a “segment of a topic already existing in Discover Feed” means that the overall topic was something that regularly appeared in my feed (in this case, NFL football and MLB baseball). However, the subtopics, in this case the Cowboys and Marlins, were topics I never specifically searched for and did not appear in my feed. Also, the data for these two categories only reflects one month of experimentation.

All of this points to various limitations.

Is it possible that Google sees a topic like entertainment news as more “Discover-worthy” than sewing? It is.

Is it possible that going to a site like Fandango during a pandemic (when many theaters were closed) influenced Google’s decision to include or exclude things related to the topic matter dealt with by the site? It is.

What if I didn’t skip the weekends and executed the above every single day? Would that have made a difference? I don’t know.

I’m not trying to portray any of the data I’ll present as being overly-conclusive. This is merely what I did, what I found, and what it all made me think.

Let’s have at it then.

How user behavior impacted my Discover Feed

Before I dive into the “data”, I want to point out that the heart of my observations isn’t found in the data itself, but in some of the things I noticed in my Discover Feed along the way.

More than that, this data is far from conclusive or concrete, and in most ways speaks to my unique user-profile. That said, let’s have a look at the data, because there just may be some general takeaways.

As I mentioned, I wanted to see the impact of the various online behaviors on my Discover Feed. That is, how frequently did Google insert content related to the topics associated with each specific behavior into my feed?

For all the times I went to japantimes.co.jp, how often was there content in my feed related to Japanese news? For all the times I searched for and watched YouTube videos on lawn care, how often did Google show me such content in Discover?

Survey says:

Here are some of the most intriguing highlights reflected in the graph above:

  • Watching YouTube videos on mobile had no impact on my feed whatsoever (though it certainly did on what YouTube ads I was served).
  • Watching YouTube videos on desktop had little impact (in fact, any insertion of “sewing” into my feed was only as a topic card which contained no URL).
  • Searching on Google alone, without clicking a result, was ineffective.
  • Visiting a desktop site and clicking around was very effective at filling my feed with “cooking” content, however the same was not true on mobile.

Watching YouTube videos (desktop) about sewing was only successful in getting Google to include the topic in its “Discover more” cards.

I want to emphasize that when I say things like “YouTube mobile watches had no impact”, I don’t mean that as a general statement. Rather, such a statement is only aligned with the way I engaged with YouTube (one video watch per day). Clearly, and as is obvious, if you watch a large number of YouTube videos around one topic in a short time, Discover will pick this up.

I did, in fact, test this.

I gave my kids free rein at various moments to take my phone and watch a large number of videos related to specific topics (surprisingly, they were happy to oblige and to watch vast amounts of YouTube).

I have twin 9-year-old boys. One watched an obscene number of YouTube videos and executed an insane number of searches related to airplanes and flight simulators. I am still awaiting the day when my feed stops showing cards related to this topic. Here’s my search history to prove it:

The other little fellow watched videos about the weather and animal behavior that results from it for a few hours straight (hey, it was during the height of quarantine). That same day, this is what I saw in my feed:

You don’t need me to tell you that if Google thinks you’re going gaga over a specific topic, it will throw said topic into your Discover Feed posthaste.

My goal in all of this was not to find the quickest way to get Google to update the topics it shows in your Discover Feed. The point of my methodology was to see if there was one type of behavior that Google seemed to take more seriously than another vis-a-vis inserting new topics into my Discover Feed.

To that, Google did react differently to my various behaviors.

That doesn’t mean I can draw many firm conclusions from the above data, but some patterns stand out. For example, Google clearly saw my going to foodnetwork.com and clicking on an article each day as a strong signal that “cooking” deserves to be in my Discover Feed.

Google was apt to think of my behavior of visiting foodnetwork.com and clicking an article each day as an endorsement for wanting “cooking” content in my Discover Feed.

At the same time, Google completely ignored that behavior on mobile. Each day I went to japantimes.co.jp and scrolled through an article. Yet, not once did Google include anything even remotely related to Japanese news in my feed.

I suspect that the topic here was too far removed from overall search behavior. So while it was reasonable for Google to assume I wanted cooking-related material in my feed, the same did not hold true for topics related to Japan.

I think this is the same reason why the topic associated with my visiting a site on desktop without clicking anything made it into my feed. The topic here was celebrity news, and I imagine that Google has profiled this topic as being one that is highly-relevant to Discover. So much so that Google tested including it in my feed at various points.

Despite never clicking on an article when visiting people.com each day, Google still flirted with showing celebrity news content in my Discover Feed.

That said, there is some reason to believe that desktop behavior has more of an impact than mobile user behavior.

The case for desktop Discover Feed dominance

About a month into my little experiment I wondered what would happen if I started searching for and clicking on things that were segments of topics that already appeared in my feed.

Deciding on these segments was quite easy. My feed is constantly filled with material on baseball and American football. Thus, I decided to search for and click on two teams I have no interest in. This way, while the topic overall was already in my feed, I would be able to see the impact of my behavior.

Specifically, on desktop I searched for things related to the Dallas Cowboys, clicking on a search result each time. Similarly, I did the same for the Miami Marlins baseball team on mobile.

Again, in both cases, content specific to these teams had yet to appear in my feed.

Here are the results of this activity:

Over a 30-day period, I found 10 instances of content related to the Dallas Cowboys in my feed and 6 instances of content about the Miami Marlins.

Again, just as in the first set of data I presented, a disparity between mobile and desktop exists.

Is this a general rule? Is this based on my particular profile? I don’t know. It’s just an interesting point that should be investigated further.

I will say that I doubt the content itself played a role. If anything, there should have been more results on mobile about the Marlins, as I was very much caught up in the World Series that was taking place at the time of my activity.

What does this data actually mean?

There are so many factors at play that using any of the data above is a bit “hard.” Yes, I think there are some trends or indicators within it. However, that’s not really what I want you to take away from all of this. (Also, is it such a crime to consume data solely because it’s interesting to see some of what’s going on?)

What do I want you to take away, then?

As part of my data analysis (if you’ll even call it that) I looked at how long it took for a behavior to result in Discover Feed inclusion. Surprisingly, the numbers were pretty consistent:

Discounting the 31 behavior instances it took for my “Search Desktop No Click” activity (e.g. searching for all things related to "fishing" but clicking on nothing) to impact my feed, Google picked up on what I was doing fairly quickly.

Generally speaking, it took less than 10 behaviors for Google to think it should update the topics shown in my feed.

That’s really the point. Despite the normal things I search for and engage with both regularly and heavily (things like SEO, for example), Google took this “lighter” yet consistent behavior as a signal to update my feed.

Google was very aware of what I was doing and acted on it pretty quickly all things considered. In the case of “food/cooking” content, as shown earlier, Google took my behavior very seriously and consistently showed such content in my feed.
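If you keep the same kind of behavior log described earlier, the "how many behaviors until inclusion" number falls out of it with a few lines of pandas. Again, the file and column names are assumptions.

```python
import pandas as pd

log = pd.read_csv("discover_log.csv")  # assumed columns: date, behavior, appeared (0/1)
log = log.sort_values("date")

def behaviors_until_first_hit(group):
    """Count behavior instances up to and including the first Discover appearance."""
    hits = group["appeared"].reset_index(drop=True)
    first = hits[hits == 1]
    return first.index[0] + 1 if len(first) else None  # None = never appeared

print(log.groupby("behavior").apply(behaviors_until_first_hit))
```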

Forget which behavior on which device produced more cards in my feed. The fact that it varied at all is telling. It shows Google is looking at the type of engagement and where it happens in the context of your overall profile.

Personally, I think if you (yes, you, the person reading this) did this experiment, you would get different results. Maybe some of the trends might align, but I would imagine that would be it.

And now for the really interesting part of all this.

Diving into what was and what wasn’t in my Discover Feed

As I’ve mentioned, the data is interesting in some of the possible trends it alludes to and in that it shows how closely Google is watching your behavior. However, the most interesting facets of this little project of mine came from seeing what Google was and was not showing day-in and day-out.

Is Google profiling users utilizing the same account?

The first month of this study coincided with a lockdown due to COVID-19. That meant my kids were home, all day, for a month. It also meant they watched a lot of YouTube. From Wild Kratts to Woody Woodpecker, my kids consumed a heap of cartoons and they did so using my Google account (so I could see what they were watching).

Wouldn’t you know, a funny thing happened. There was no “cartoon” content in my Discover Feed. I checked my feed religiously that month and not once did I notice a card about a cartoon.

Isn’t that odd?

Not if Google is profiling my account according to the devices being used or even according to the content being consumed. All signs point to Google being well aware that the content my kids were watching was not being consumed by the one using Discover (me).

This isn’t a stretch at all. The same happens in my YouTube feed all the time. While my desktop feed is filled to the brim with Fireman Sam, the YouTube app on my phone is a mixture of news and sports (I don’t “SEO” on YouTube) as my kids generally don’t watch their “programs” on my phone.

The URLs I visited were absent from Discover

There was one other thing missing from my Discover Feed and this one has enormous implications.

URLs.

Virtually none of the URLs I visited during my two-month experiment popped up in my Discover Feed!

I visited the Food Network’s website some 40 times, each time clicking and reading (pretending to read, to be fair) an article or recipe. By the time I was nearing the end of my experiment, Discover was showing me some sort of food/cooking related content every day.

Through all of that, not once did Google show me a URL from the Food Network! Do you like apples? Well, how do you like them apples? (Cooked slowly with a hint of cinnamon.)

This was the general trend for each type of behavior that produced new topics in my feed. I visited a few websites about car repair, and Google threw some cards about the topic into my feed… none of which were from the sites I visited.

The only sites I visited that also appeared in my Discover Feed were ESPN (for some of the sports queries I ran) and people.com (which I visited every day). However, I think that was entirely accidental, as both sites are top sources of content in their spaces.

Yes, some sites I visit regularly do appear in my feed in general. For example, there were some local news sites that I visited multiple times a day for the better part of a month so as to track COVID-19 in my area. I freely admit it was a compulsion. One that Google picked up on.

In other words, it took a heck of a lot for Google to think I wanted that specific site or sites in my feed. Moreover, it would seem that Google doesn’t want to simply show content from the URLs you visit unless the signal otherwise is immense.

This leads me to my next question…

Is Discover really an SEO issue?

What can you do to optimize for Google Discover? It’s almost an absurd question. I visited the same site every day and Google still didn’t include its URL in my feed. (Again, I am aware that certain behaviors will trigger a specific URL, my point is that Google is not as apt to do so as you might think.) It all points to a certain lack of control. It all points to Google specifically not wanting to pigeon-hole the content it shows in Discover.

In other words, you can’t create content specifically for Discover. There’s no such concept. There’s no such control. There is no set of standardized “ranking signals” that you can try to optimize for.

Optimizing your images to make sure they’re high-quality or ensuring they’re at least 1,200 pixels wide and so forth isn’t really “optimizing” for Discover. It’s merely making yourself eligible to get into the ballpark. There is no standardized path to actually get on the field.

The entire idea of Discover is to offer content that’s specifically relevant to one user and all of their various idiosyncrasies. The notion of “optimizing” for something like that almost doesn't compute.

Like with optimizing your images for Discover, all you can really do is position yourself.

And how does one position themselves for inclusion into the Discover Feed?

One of the sites that kept popping up in my feed was dallascowboys.com. This makes sense as I was searching for things related to the Dallas Cowboys and clicking on all sorts of results as a consequence. However, in my “travels” I specifically did not visit dallascowboys.com. Yet, once Google saw I was interested in the Cowboys, it was one of the first sites I was served with.

You don’t need to be a rocket scientist to see why. What other site is more relevant and more authoritative than the official site of the franchise?

If you want your site to be included in Discover, you need to be incredibly relevant and authoritative on whatever it is your site deals with.

That means investing time and resources into creating unique and substantial content. It means crafting an entire strategy around creating topical identity. After all, the idea is to get Google to understand that your site deals with a given topic, deals with it in-depth, and deals with it often (i.e., the topic is closely related to who you are as a site).

That sounds a heck of a lot more like “content marketing” than pure SEO, at least it does to me.

A cross-discipline marketing mind meld

Discover, to me, is the poster child for the merging of pure content creation and SEO. It speaks to the idea of needing a more abstract understanding of what a sound content strategy is, in order to be effective in the “Google-verse.”

It’s perhaps a different sort of motion than what you might typically find in the world of pure SEO. As opposed to diving into the minute details (be it a specific technical problem or a specific aspect of content optimization), Discover urges us to take a more holistic approach, to take a step back.

The way Discover is constructed advocates for a broader approach based on a meta-analysis of how a site is perceived by Google and what can be done to create a stronger profile. It’s almost the perfect blend of content, marketing, and an understanding of how Google works (SEO).

Fascinating.



Tuesday, March 9, 2021

According to the Experts: 5 Technical SEO Trends to Watch in 2021

Posted by morgan.mcmurray

It’s no secret that SEO relies heavily on technical components to drive site rankability, and with so many emerging technologies, new tools, and metrics (*cough* Core Web Vitals *cough*), you might be wondering whether these constant updates will affect your more technical work.

To find out more about the state of technical SEO in 2021, we asked seven industry experts for their thoughts. The overwhelming answer? Keep doing what you’re doing.

“The core essentials in 2021 will remain about the same — every SEO needs to understand the fundamentals of crawling vs. indexing and the technical basics that have to be met before a site can rank,” says Moz Search Scientist, Dr. Pete Meyers. “All the fancy footwork in the world won’t get you anywhere if there’s no floor beneath you.”

But what should that floor be constructed of? Read on to find out!

1. Focus on the fundamentals

Technical best practices are the “best” for a reason, so having a strong foundation of basic technical SEO skills is still a must.

“For me, the most underrated technical SEO strategy has always been the fundamental best practices,” says consultant Joe Hall. “It might sound surprising to some, but the vast majority of my clients have difficulty grasping the importance of best practices or just the basic fundamentals of SEO. I think this is in large part because of our community's focus and attention on the ‘next best thing’, and not very often talking about the basic fundamentals of technical SEO.”

Those fundamentals include hard skills like understanding how to recognize and fix crawlability, indexation, accessibility, and site performance issues, but also how to prioritize the issues you come across.

SEO expert Lily Ray notes that prioritization is an area of improvement that novice technical SEOs need to address first, as they may be more inclined to cite small things as major problems when they’re really not: “It is common for new or junior SEOs to send a laundry list of technical items as exported by [SEO tools] directly to clients, without prioritizing which ones are the most important to fix, or knowing which ones can be ignored,” she says. “In many cases, the tools don’t even flag some of the more severe technical issues that may be affecting crawling, indexation, or rendering… Good technical SEOs are able to pinpoint real problems that are having a significant impact on the website’s ability to rank well, and they know which tools or other resources to use to be able to solve those problems.”

So start taking note of not just the what when it comes to technical issues, but also the influence those issues actually have on the sites you work on.

Need to brush up or build on these hard skills? Not to worry — Moz Academy recently released a Technical SEO Certification that can help you do just that!

Beyond the more hands-on, practical skill sets required for building and maintaining search-optimized websites, our experts agree that basic soft skills are just as important, with several citing the need for cross-team collaboration abilities.

“Technical SEO implementations generally require working with multiple teams... which means there’s a lot of partnership, persuasion, give and take collaborations,” says Alexis Sanders, the SEO Director at Merkle. “Having project management, client services, storytelling, and communication skills will support implementation.”

So don’t get stuck in the weeds of your technical work — make sure you’re in constant communication with the teams and stakeholders who will help support your initiatives.

2. Gear up for Core Web Vitals

One of the hottest topics in the industry right now is no doubt Core Web Vitals, the new Google ranking factors update expected in May 2021. But do technical SEOs really need to worry about them?

The experts say yes, but to work as a team to address them, and make your SEO voice heard. Alexis Sanders puts it this way: “The page experience update consists of Core Web Vitals, mobile-friendliness, web app security, and removing interstitials. Regardless of how teams are structured, making progress is going to require a wide array of talents, giving SEO a more involved seat at the table, as these elements affect our bottom-line.”

When prioritizing what to focus on, make sure that improving site speed is at the top of your list.

“If you only work on one area of Technical SEO in 2021, make it site speed,” advises Kick Point President Dana DiTomaso. “Site speed is one of those great parts of technical SEO where the benefit isn't only for search engines — it also helps the people visiting your website. Remember, not everyone is coming to your website using the latest technology and fastest internet connection.”

When asked about their favorite ways to optimize, here’s what the experts suggested:

  1. Start using a content delivery network, such as cloudflare.com.
  2. Implement server-side caching for markup and design assets like CSS and JavaScript, and minimize the number of individual requests made for each page by bringing CSS and JavaScript in-line.
  3. Optimize media files by converting to next-generation formats and compressing for size and use of data.
  4. Use tools like BuiltWith, Wappalyzer, and Lighthouse to investigate what third party scripts have been loaded on a page, and remove them if you no longer need them, or move as many as compatible to a tag management tool.
  5. Focus on image performance optimization (see the sketch after this list).
  6. Work with analytics and other internal teams to establish processes and expectations for adding and removing tagging.
  7. Set requirements and expectations around page speed early in the development process.
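As one hedged example of the media and image work in items 3 and 5, here's a small Python sketch that batch-converts JPEGs to WebP with Pillow. The folders and quality setting are assumptions; test the output against your own visual standards before shipping anything.

```python
from pathlib import Path

from PIL import Image  # pip install Pillow

SRC = Path("images")        # hypothetical source folder of JPEGs
OUT = Path("images_webp")
OUT.mkdir(exist_ok=True)

for jpg in SRC.glob("*.jpg"):
    img = Image.open(jpg)
    # WebP at quality 80 is typically much smaller than JPEG with little visible loss.
    img.save(OUT / (jpg.stem + ".webp"), "WEBP", quality=80)
    print(f"converted {jpg.name}")
```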

Addressing any site speed and usability issues now will set you up to better weather rankings shake-ups caused by Core Web Vitals.

3. Use schema and structured data strategically

To ensure that crawlers can read, index, and serve the content of their sites to searchers, many SEOs rely on structured data and schema frameworks to organize everything — as well they should. But when implementing structured data, the experts agree, make sure you’re using it to achieve specific goals, and not just because you can.

“Some structured data has material impact on search results or how Google can process and understand a site, while other structured data will be totally irrelevant to any given site or have no measurable impact,” says Dr. Pete. “We need to use structured data with clear intent and purpose in order to see results.”

Lily Ray agrees, pointing out the debate on the topic of schema within the industry:

“There is a wide range of opinions on this topic within the SEO community, with some SEOs wanting to ‘mark up all the things’ and others not believing schema is important if it doesn’t generate Rich Results. Personally, I like to apply structured data if I believe it can provide search engines with more context about the entities included in our clients’ websites, even if that schema does not generate Rich Results. For example, I believe that adding Schema attributes related to your brand and your authors is a good approach to help solidify information in Google’s Knowledge Graph.”

The takeaway? Get clear on your goals, and implement structured data if it makes sense for your strategy, but don’t “mark up all the things” if doing so will create unnecessary work for you and your team without bringing about the results you’re looking for.
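As a small illustration of the brand markup Lily Ray describes, here's a sketch that builds an Organization JSON-LD snippet with Python's json module. The values are placeholders, and schema.org documents which properties are valid; this is not a recommendation to "mark up all the things."

```python
import json

# Placeholder values -- swap in your real brand details.
org_schema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Agency",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "sameAs": [
        "https://twitter.com/example",
        "https://www.linkedin.com/company/example",
    ],
}

# Paste the output into your page template.
print('<script type="application/ld+json">')
print(json.dumps(org_schema, indent=2))
print("</script>")
```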

4. Leverage automation to get things done

Emerging technologies don’t always stick around long enough to become useful, but one innovation that won’t be going away anytime soon is using languages like Python to help automate various workflows, like data analysis and research.

“The technical SEO industry has been exploding with new ideas and innovations in the past couple of years, particularly related to analyzing data at scale and automating SEO processes, which has resulted in programming languages like Python moving into the spotlight,” says Lily Ray.
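To give one small, hedged example of what that automation can look like in practice: a script that batch-checks status codes and redirect destinations for a list of URLs, the kind of chore that eats an afternoon when done by hand. The input file name is an assumption; the requests library is the only dependency.

```python
import concurrent.futures

import requests  # pip install requests

def check(url):
    """Return the URL, its status code, and where it finally resolves."""
    try:
        r = requests.head(url, allow_redirects=True, timeout=10)
        return url, r.status_code, r.url
    except requests.RequestException as exc:
        return url, "error", str(exc)

with open("urls.txt") as f:  # assumed input: one URL per line
    urls = [line.strip() for line in f if line.strip()]

# Checking in parallel is where the time savings come from.
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    for url, status, final in pool.map(check, urls):
        print(f"{status}\t{url} -> {final}")
```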

Why is automation important? Not only can it make your day-to-day work easier and more streamlined, it can have positive effects on your business as well.

“I still think that improving time to task completion (performance optimization) is core to every business,” says Miracle Inameti-Archibong, the Head of SEO at Erudite. “Not just because of the page experience update coming in May, but because it affects all channels and directly affects the bottom line of the business (sale, leads) which is what key decision-makers are interested in.”

In 2021, explore ways in which automation can help you achieve both your technical SEO and broader business goals more effectively.

5. Don’t forget about analytics and reporting

SEO success is incremental and gradual, usually taking months to years before you can definitively show how the work you put in has paid off. But if something goes wrong? Well, Dr. Pete has the perfect analogy: “The truth is that technical SEO is often like washing dishes — no one gives you much credit for it, but they sure notice when you break something.”

While technical SEO is the basis for all other SEO work, your non-SEO co-workers and managers will likely pay attention more when things are going wrong than when they’re going right. To help mitigate this issue, he suggests steering clear of “vanity metrics”, such as pages indexed, and instead “showing how a clear plan of action led to improvements in relevant rankings, traffic, and sales.”

Make sure you’re outlining specific metrics and goals from the start of every campaign, which will help guide your efforts and give you an easier framework for reporting on things down the line. And don’t forget to factor in outside forces that may be affecting your results.

“Organic traffic can be impacted by a lot of external factors, or your other, non-technical SEO campaigns,” says Tom Capper, Moz’s Senior Search Scientist (say that five times fast). “Tactics like SEO split-testing or, at a more primitive level, counterfactual forecasting, can help to isolate these effects in many cases, and happily technical changes tend to have a quicker, more direct impact than some other types of change that don’t see returns until the next core update.”

So when analyzing and reporting, remember: quantity isn’t always quality, and make sure you have the full picture before gleaning insights.

Conclusion

While the core of your technical SEO work will stay the same in 2021, there is plenty of opportunity to build and improve on foundational skills, implement structured data and automation, clarify the way you analyze and report your results, and plan for Core Web Vitals to take effect. And while technical work can sometimes feel isolating, remember that cross-team collaboration is key to success, and that you’re part of a community of SEOs with similar goals!

Speaking of community, we’d be remiss if we didn’t mention the amazing work of Areej AbuAli and the Women in Tech SEO network.

“If you identify as a woman, do join the Women in Tech SEO Slack channel and subscribe to its newsletter,” advises Miracle Inameti-Archibong. “I wish I had a community like that at the beginning of my career. There are loads of people always willing to help with not just technical SEO issues, but mentoring and sharing of opportunities.”

Have questions for the experts, or advice not mentioned here? Let us know in the comments!



Wednesday, March 3, 2021

10 Steps to Blend STAT Ranking Data with Site Performance Metrics

Posted by AndrewMiller

Too often, we assume that SEO best practices will work in any industry against any competitive set. But most best practices go untested and may not be “best” in every situation.

We all know that tactics that worked in 2020 won’t necessarily move the needle in 2021 as Core Web Vitals (CWV) and other signals shuffle to the front. We have to do better for our businesses and our clients.

I’m a data nerd at heart with lots of battle scars from 15 years in SEO. The idea of analyzing thousands of local SERPs sounded like too much fun to pass up. I found some surprising correlations, and just as importantly, built a methodology and data set that can be updated quarterly to show changes over time.

I analyzed 50,000+ SERPs in the retail banking sector so I could make sense of the massive shifts in rankings and search behaviors during the lockdown period. We have a lot of historical data for bank websites, so comparing pre/post COVID data would be easier than starting from scratch.

I’ll share how I did it below. But first, I want to share WHY I think sharing this type of research is so important for the SEO community.

Why validate SEO best practices with data?

It’s a great time to be an SEO. We have amazing tools and can gather more data than ever. We have thriving communities and excellent basic training materials.

Yet, we often see our craft distilled into overly-simplified “best practices” that are assumed to be universally true. But if there’s one universal truth in SEO, it’s that there are no universal truths. A best practice can be misinterpreted or outdated, leading to missed opportunities or outright harm to a business.

Using the increasing importance of CWV as an example, SEOs have an opportunity (and obligation) to separate fact from fiction. We need to know if, and by how much, CWV will impact rankings over time so we can prioritize our efforts.

We can elevate our SEO game individually and collectively by testing and validating best practices with research. It just takes a curious mind, the right tools, and a willingness to accept the results rather than force a narrative.

Failing to validate best practices is a liability for SEO practitioners and shows an unwillingness to challenge assumptions. In my experience, a lack of data can lead to a senior stakeholder’s opinions carrying more weight than an SEO expert’s recommendations.

Start by asking the right questions

Real insight comes from combining data from multiple sources to answer critical questions and ensure your strategies are backed by valid data. In my analysis of local banks, I started by listing the questions I wanted to know the answers to:

  • What characteristics are shared by top-ranking local bank websites?
  • Who are banks actually competing against in the SERPs? Is it primarily other banks?
  • How do competitive SERPs change based on when/where/how users search?
  • How can smaller, local businesses gain an edge over larger competitors from outside their region?
  • How does SERP composition affect a bank’s ability to rank well for targeted keywords?
  • How important are Core Web Vitals (CWV) for rankings? How does this change over time?

You could run this same analysis by replacing “banks” with other local business categories. The list of potential questions is endless so you can adjust them based on your needs.

Here’s an important reminder: be prepared to accept the answers even if they are inconclusive or contradictory to your assumptions. Data-driven SEOs have to avoid confirmation bias if we’re going to remain objective.

Here’s how I analyzed 50,000 search results in a few hours

I combined three of my favorite tools to analyze SERPs at scale and gather the data needed to answer my questions:

  • STAT to generate ranking reports for select keywords
  • Screaming Frog to crawl websites and gather technical SEO data
  • Power BI to analyze the large data sets and create simple visualizations

Step 1: Determine your data needs

I used US Census Bureau data to identify all cities with populations over 100,000, because I wanted a representation of local bank SERPs across the country. My list ended up including 314 separate cities, but you could customize your list to suit your needs.

I also wanted to gather data for desktop and mobile searches to compare SERP differences between the device types.

Step 2: Identify your keywords

I chose “banks near me” and “banks in {city, st}” based on their strong local intent and high search volumes, compared to more specific keywords for banking services.

Step 3: Generate a STAT import file in .csv format

Once you have your keywords and market list, it’s time to prepare the bulk upload for STAT. Use the template provided in the link to create a .csv file with the following fields:

  • Project: The name of the new STAT project, or an existing project.
  • Folder: The name of the new folder, or an existing folder. (This is an optional column that you can leave blank.)
  • Site: The domain name for the site you want to track. Note, for our purposes you can enter any URL you want to track here. The Top 20 Report will include all ranking URLs for the target keywords even if they aren’t listed in your “Site” column.
  • Keyword: The search query you’re adding.
  • Tags: Enter as many keyword tags as you want, separated by commas. I used “city” and “near me” as tags to distinguish between the query types. (This is an optional column that you can leave blank.)
  • Market: Specify the market (country and language) in which you would like to track the keyword. I used “US-en” for US English.
  • Location: If you want to track the keyword in a specific location, specify the city, state, province, ZIP code, and/or postal code. I used the city and state list in “city, st” format.
  • Device: Select whether you would like Desktop or Smartphone results. I selected both.

Each market, location, and device type will multiply the number of keywords you must track. I ended up with 1,256 keywords (314 markets X 2 keywords X 2 devices) in my import file.
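Rather than assemble 1,256 rows by hand, you can generate the import file with a short script. A sketch, assuming your city list lives in a text file in "city, st" format and using a placeholder project name and site:

```python
import csv

KEYWORDS = ["banks near me", "banks in {city}"]
DEVICES = ["Desktop", "Smartphone"]

with open("cities.txt") as f:  # assumed: one "city, st" entry per line
    cities = [line.strip() for line in f if line.strip()]

with open("stat_import.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["Project", "Folder", "Site", "Keyword",
                     "Tags", "Market", "Location", "Device"])
    for city in cities:
        for kw in KEYWORDS:
            for device in DEVICES:
                keyword = kw.format(city=city)
                tag = "near me" if "near me" in kw else "city"
                writer.writerow(["Local Banks", "", "example.com",
                                 keyword, tag, "US-en", city, device])

print(f"Wrote {len(cities) * len(KEYWORDS) * len(DEVICES)} keyword rows")
```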

Once your file is complete, you can import to STAT and begin tracking.

Step 4: Run a Top 20 Report in STAT for all keywords

STAT’s built-in Google SERP Top 20 Comparison report captures the top 20 organic results from each SERP at different intervals (daily, weekly, monthly, etc.) to look at changes over time. I did not need daily data so I simply let it run on two consecutive days and removed the data I did not need. I re-run the same report quarterly to track changes over time.

Watch the video below to learn how to set up this report! 

My 1,256 keywords generated over 25,000 rows of data per day. Each row is a different organic listing and includes the keyword, monthly search volume, rank (includes the local pack), base rank (does not include the local pack), https/http protocol of the ranking URL, the ranking URL, and your tags.

Here’s an example of the raw output in CSV format:

It’s easy to see how useful this data is by itself but it becomes even more powerful when we clean it up and start crawling the ranking URLs.

Step 5: Clean up and normalize your STAT URLs data

At this point you may have invested 1-2 hours in gathering the initial data. This step is a bit more time consuming, but data cleansing allows you to run more advanced analysis and uncover more useful insights in Screaming Frog.

Here are the changes I made to the STAT rankings data to prepare for the next steps in Screaming Frog and Power BI. You’ll end up with multiple columns of URLs. Each serves a purpose later.

  1. Duplicate the Ranking URL column to a new column called Normalized URL.
  2. Remove URL parameters from the Normalized URL fields by using Excel’s text to columns tool and separating by “?”. I deleted the new column(s) containing the URL parameters because they were not helpful in my analysis.
  3. Duplicate the new, clean Normalized URL column to a new column called TLD. Use the text to columns tool on the TLD column and separate by “/” to remove everything except the domain name and subdomains. Delete the new columns. I chose to keep the subdomains in my TLD column but you can remove them if it helps your analysis.
  4. Finally, create one more column called Full URL that will eventually become the list of URLs that you’ll crawl in Screaming Frog. To generate the Full URL, simply use Excel’s concatenate function to combine the Protocol and Normalized URL columns. Your formula will look something like this: =concatenate(A1, “://”, C1) to include the “://” in a valid URL string.

The 25,000+ rows in my data set are well within Excel’s limitations, so I am able to manipulate the data easily in one place. You may need to use a database (I like BigQuery) as your data sets grow.
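If your data outgrows Excel, or you'd simply rather script the cleanup, the same four steps look like this in pandas. The STAT export column names here are assumptions; adjust them to your file.

```python
import pandas as pd

df = pd.read_csv("stat_top20.csv")  # assumed columns include "Protocol" and "Ranking URL"

# Steps 1-2: copy the ranking URL and strip anything after "?" (the URL parameters).
df["Normalized URL"] = df["Ranking URL"].str.split("?").str[0]

# Step 3: keep just the domain and subdomain, i.e. everything before the first "/".
df["TLD"] = df["Normalized URL"].str.split("/").str[0]

# Step 4: rebuild a valid URL string for Screaming Frog's list mode.
df["Full URL"] = df["Protocol"] + "://" + df["Normalized URL"]

df.to_csv("stat_top20_clean.csv", index=False)
df["Full URL"].drop_duplicates().to_csv("crawl_list.txt", index=False, header=False)
```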

Step 6: Categorize your SERP results by website type

Skimming through the SERP results, it’s easy to see that banks are not the only type of website that ranks for keywords with local search intent. Since one of my initial questions was about SERP composition, I had to identify all of the different types of websites and label each one for further analysis.

This step is by far the most time consuming and insightful. I spent 3 hours categorizing the initial batch of 25,000+ URLs into one of the following categories:

  • Institution (banks and credit union websites)
  • Directory (aggregators, local business directories, etc.)
  • Reviews (local and national sites like Yelp.com)
  • Education (content about banks on .edu domains)
  • Government (content about banks on .gov domains and municipal sites)
  • Jobs (careers sites and job aggregators)
  • News (local and national news sites with banking content)
  • Food Banks (yes, plenty of food banks rank for “banks near me” keywords)
  • Real Estate (commercial and residential real estate listings)
  • Search Engines (ranked content belonging to a search engine)
  • Social Media (ranked content on social media sites)
  • Other (completely random results not related to any of the above)

Your local SERPs will likely contain many of these website types and other unrelated categories such as food banks. Speed up the process by sorting and filtering your TLD and Normalized URL columns to categorize multiple rows simultaneously. For example, all the yelp.com rankings can be categorized as “Reviews” with a quick copy/paste.
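A sketch of how you might speed the labeling up further in pandas: seed a lookup of known domains and patterns, bulk-apply it, and hand-label only what's left. The mapping below is illustrative, not a complete list.

```python
import pandas as pd

df = pd.read_csv("stat_top20_clean.csv")

# Seed the obvious domains; everything else stays blank for manual review.
KNOWN = {
    "www.yelp.com": "Reviews",
    "www.indeed.com": "Jobs",
    "www.zillow.com": "Real Estate",
}

df["Category"] = df["TLD"].map(KNOWN)
df.loc[df["TLD"].str.endswith(".gov", na=False), "Category"] = "Government"
df.loc[df["TLD"].str.endswith(".edu", na=False), "Category"] = "Education"

print(f"{df['Category'].isna().sum()} rows left to label by hand")
df.to_csv("stat_top20_categorized.csv", index=False)
```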

At this point, your rankings data set is complete and you are ready to begin crawling the top-ranking sites in your industry to see what they have in common.

Step 7: Crawl your target websites with Screaming Frog

My initial STAT data identified over 6,600 unique pages from local bank websites that ranked in the top 20 organic search results. This is far too many pages to evaluate manually. Enter Screaming Frog, a crawler that mimics Google’s web crawler and extracts tons of SEO data from websites.

I configured Screaming Frog to crawl each of the 6,600 ranking pages for a larger analysis of characteristics shared by top-ranking bank websites. Don’t just let SF loose though. Be sure to configure it properly to save time and avoid crawling unnecessary pages.

These settings ensure we’ll get all the info we need to answer our questions in one crawl:

List Mode: Paste in a de-duplicated Full URL list from your STAT data. In my case, this was 6,600+ URLs.

Database Storage Mode: It may be a bit slower than Memory (RAM) Storage, but saving your crawl results on your hard disk ensures you won’t lose your results if you make a mistake (like I have many times) and close your report before you finish analyzing the data.

Limit Crawl Depth: Set this to 0 (zero) so the spider will only crawl the URLs on your list without following internal links to other pages on those domains.

APIs: I highly recommend using the Pagespeed Insights Integration to pull Lighthouse speed metrics directly into your crawl data. If you have a Moz account with API access, you can also pull link and domain data from the Moz API with the built-in integration.

Once you have configured the spider, let it rip! It could take several minutes to several hours depending on how many URLs you’re crawling and your computer’s speed and memory constraints. Just be patient! You might try running larger crawls overnight or on an extra computer to avoid bogging your primary machine down.

Step 8: Export your Screaming Frog crawl data to Excel

Dumping your crawl data into Excel is remarkably easy.

Step 9: Join your data sets in Power BI

At this point, you should have two data sources in Excel: one for your STAT rankings data and another for your Screaming Frog crawl data. Our goal is to combine the two data sources to see how organic search rank may be influenced by on-page SEO elements and site performance. To do this, we must first merge the data.

If you have access to a Windows PC, the free version of Power BI is powerful enough to get you started. Begin by loading your two data sources into a new project using the Get Data wizard.

Once your data sets are loaded, it’s time to make the magic happen by creating relationships in your data to unlock correlations between rankings and site characteristics. To combine your data in Power BI, create a many-to-many relationship between your STAT Full URL and Screaming Frog Original URL fields. 
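If you ever want to sanity-check that join outside of Power BI, the equivalent in pandas is a single merge. A sketch, with file names assumed and the crawl export keyed by the same URL fields described above:

```python
import pandas as pd

rankings = pd.read_csv("stat_top20_categorized.csv")  # STAT data, keyed by "Full URL"
crawl = pd.read_csv("screaming_frog_export.csv")      # crawl data, keyed by "Original URL"

# One URL can rank for many keywords, so this join fans out like
# Power BI's many-to-many relationship.
merged = rankings.merge(crawl, left_on="Full URL",
                        right_on="Original URL", how="left")
print(merged.shape)
merged.to_csv("merged_rankings_crawl.csv", index=False)
```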

If you are new to BI tools and data visualization, don’t worry! There are lots of helpful tutorials and videos just a quick search away. At this point, it’s really hard to break anything and you can experiment with lots of ways to analyze your data and share insights with many types of charts and graphs.

I should note that Power BI is my preferred data visualization tool, but you may be able to use Tableau or another equally powerful tool. Google Data Studio was not an option for this analysis because it only allows left outer joins of multiple data sources and does not support many-to-many relationships. That's a technical way of saying Data Studio isn't flexible enough to create the data relationships we need.

Step 10: Analyze and visualize!

Power BI’s built-in visualizations allow you to quickly summarize and present data. This is where we can start analyzing the data to answer the questions we came up with earlier.

Results — what did we learn?

Here are a couple of examples of the insights gleaned from merging our rankings and crawl data. Spoiler alert — CWV doesn't strongly impact organic rankings... yet!

Who are banks actually competing against in the SERPs? Is it primarily other banks?

On desktops, about 67% of organic search results belong to financial institutions (banks and credit unions) with heavy competition from reviews sites (7%) and online directories (22%). This information helps shape our SEO strategies for banks by exposing opportunities to monitor and maintain listings in relevant directories and reviews sites.

Okay, now let’s mash up our data sources to see how the distribution of website categories varies by rank on desktop devices. Suddenly, we can see that financial institutions actually occupy the majority of the top 3 results while reviews sites and directories are more prevalent in positions 4-10.

How important are Core Web Vitals (CWV) for rankings? How does this change over time?

Site performance and site speed are hot topics in SEO and will only become more important as CWV becomes a ranking signal in May this year. We can begin to understand the relationships between site speed and rankings by comparing STAT rankings and Pagespeed Insights data from Screaming Frog reports.

As of January 2021, sites with higher Lighthouse Performance Scores (i.e. they load faster) tend to rank better than sites with lower scores. This could help justify investments in site speed and site performance.
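One simple way to quantify that relationship is a rank correlation between position and performance score. A sketch, assuming the merged data from Step 9 with hypothetical column names:

    # speed_vs_rank.py -- relate Lighthouse Performance Score to rank position
    # "Rank" and "Performance Score" are hypothetical column names.
    import pandas as pd

    merged = pd.read_excel("rankings_with_crawl_data.xlsx")

    # Average performance score at each position (higher scores near the top
    # suggest faster sites tend to rank better)
    print(merged.groupby("Rank")["Performance Score"].mean().round(1))

    # Spearman rank correlation: a negative value means faster sites sit at
    # better (numerically lower) positions
    print(merged["Rank"].corr(merged["Performance Score"], method="spearman"))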

Some CWV elements correlate more closely with better rankings and others are more scattered. This isn’t to say CWV aren’t important or meaningful, but rather it’s a starting point for further analysis after May.

So what? What can we learn from this type of analysis?

Separately, STAT and Screaming Frog are incredibly powerful SEO tools. The data they provide is useful on its own if you happen to be an SEO, but the ability to merge data sets and extract relationships will multiply your value in any organization that values data and acts on insights.

Besides validating some generally accepted SEO knowledge with data (“faster sites are rewarded with better rankings”), better use of relational data can also help us avoid spending valuable time on less important tactics (“improve Cumulative Layout Shift at all costs!”).

Of course, correlation does not imply causation, and aggregated data does not guarantee an outcome for individual sites. But if you’re a bank marketing professional responsible for customer acquisition from organic channels, you’ll need to bring this type of data to your stakeholders to justify increased investments in SEO.

By sharing the tools and methodology, I hope others will take this further by building on it and contributing their own findings to the SEO community. What other data sets can we combine to deepen our understanding of SERPs on a larger scale? Let me know your thoughts in the comments!



Monday, March 1, 2021

Featured Snippets Drop to Historic Lows

Posted by Dr-Pete

On February 19, MozCast measured a dramatic drop (40% day-over-day) in SERPs with Featured Snippets, with no immediate signs of recovery. Here's a two-week view (February 10-23):

Here's a 60-day view, highlighting this historic low-point in our 10K-keyword data set:

I could take the graph back further, but let's cut to the chase — this is the lowest prevalence rate of Featured Snippets in our data set since we started collecting reliable data in the summer of 2015.

Are we losing our minds?

After the year we've all had, it's always good to check our sanity. In this case, other data sets showed a drop on the same date, but the severity of the drop varied dramatically. So, I checked our STAT data across desktop queries (en-US only) — over two million daily SERPs — and saw the following:

STAT recorded an 11% day-over-day drop. Interestingly, there's been a 16% total drop since February 10, if we include a second, smaller drop on February 13. While MozCast is desktop-only, STAT has access to mobile data. Here's the desktop/mobile comparison:

While mobile SERPs in STAT showed higher overall prevalence, the pattern was very similar, with a 9% day-over-day drop on February 19 and a total drop of about 12% since February 10. Note that, while there is considerable overlap, the desktop and mobile data sets may contain different search phrases. While the desktop data set is currently about 2.2M daily SERPs, mobile is closer to 1.7M.
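For anyone reproducing this kind of analysis, the day-over-day and cumulative figures are simple percentage changes over a daily prevalence series. A toy sketch with placeholder numbers (not STAT data):

    # prevalence_change.py -- day-over-day and cumulative change in prevalence
    # The series below is made-up placeholder data to show the arithmetic only.
    import pandas as pd

    prevalence = pd.Series(
        [12.0, 11.8, 11.5, 11.4, 10.3],  # % of SERPs with a Featured Snippet
        index=pd.date_range("2021-02-15", periods=5),
    )

    print((prevalence.pct_change() * 100).round(1))  # day-over-day % change
    total = (prevalence.iloc[-1] / prevalence.iloc[0] - 1) * 100
    print(f"Total change over the window: {total:.1f}%")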

Note that the MozCast 10K keywords are skewed (deliberately) toward shorter, more competitive phrases, whereas STAT includes many more "long-tail" phrases. This explains the overall higher prevalence in STAT, as longer phrases tend to include questions and other natural-language queries that are more likely to drive Featured Snippets.

Why the big difference?

What's driving the 40% drop in MozCast and, presumably, more competitive terms? First things first: we've hand-verified a number of these losses, and there is no evidence of measurement error. One helpful aspect of the 10K MozCast keywords is that they're evenly divided across 20 historical Google Ads categories. While some changes impact industry categories similarly, the Featured Snippet loss showed a dramatic range of impact:

Competitive healthcare terms lost more than two-thirds of their Featured Snippets. It turns out that many of these terms had other prominent features, such as Medical Knowledge Panels. Here are some high-volume terms that lost Featured Snippets in the Health category:

  • diabetes
  • lupus
  • autism
  • fibromyalgia
  • acne

While Finance had a much lower initial prevalence of Featured Snippets, Finance SERPs also saw massive losses on February 19. Some high-volume examples include:

  • pension
  • risk management
  • mutual funds
  • roth ira
  • investment

Like the Health category, these terms have a Knowledge Panel in the right-hand column on desktop, with some basic information (primarily from Wikipedia/Wikidata). Again, these are competitive "head" terms, where Google was displaying multiple SERP features prior to February 19.

Both Health and Finance search phrases align closely with so-called YMYL (Your Money or Your Life) content areas, which, in Google's own words "... could potentially impact a person’s future happiness, health, financial stability, or safety." These are areas where Google is clearly concerned about the quality of the answers they provide.

What about passage indexing?

Could this be tied to the "passage indexing" update that rolled out around February 10? While there's a lot we still don't know about the impact of that update, and while that update impacted rankings and very likely impacted organic snippets of all types, there's no reason to believe that update would impact whether or not a Featured Snippet is displayed for any given query. While the timelines overlap slightly, these events are most likely separate.

Is the snippet sky falling?

While the 40% drop in Featured Snippets in MozCast appears to be real, the impact was primarily on shorter, more competitive terms and specific industry categories. For those in YMYL categories, it certainly makes sense to evaluate the impact on your rankings and search traffic.

Generally speaking, this is a common pattern with SERP features — Google ramps them up over time, then reaches a threshold where quality starts to suffer, and then lowers the volume. As Google becomes more confident in the quality of their Featured Snippet algorithms, they may turn that volume back up. I certainly don't expect Featured Snippets to disappear any time soon, and they're still very prevalent in longer, natural-language queries.

Consider, too, that some of these Featured Snippets may just have been redundant. Prior to February 19, someone searching for "mutual fund" might have seen this Featured Snippet:

Google is assuming a "What is/are ...?" question here, but "mutual fund" is a highly ambiguous search that could have multiple intents. At the same time, Google was already showing a Knowledge Graph entity in the right-hand column (on desktop), presumably from trusted sources:

Why display both, especially if Google has concerns about quality in a category where they're very sensitive to quality issues? At the same time, while it may sting a bit to lose these Featured Snippets, consider whether they were really delivering. While this term may be great for vanity, how often are people at the very beginning of a search journey — who may not even know what a mutual fund is — going to convert into a customer? In many cases, they may be jumping straight to the Knowledge Panel and not even taking the Featured Snippet into account.

For Moz Pro customers, remember that you can easily track Featured Snippets from the "SERP Features" page (under "Rankings" in the left-hand nav) and filter for keywords with Featured Snippets. You'll get a report something like this — look for the scissors icon to see where Featured Snippets are appearing and whether you (blue) or a competitor (red) are capturing them:

Whatever the impact, one thing remains true — Google giveth and Google taketh away. Unlike losing a ranking or losing a Featured Snippet to a competitor, there's very little you can do to reverse this kind of sweeping change. For sites in heavily-impacted verticals, we can only monitor the situation and try to assess our new reality.



Friday, February 26, 2021

Google Posts: Conversion Factor — Not Ranking Factor

Posted by Greg_Gifford

While Google Posts aren’t a ranking factor, they can still be an incredibly effective resource for increasing local business conversions — when used correctly. This week’s Whiteboard Friday host, Greg Gifford, shows you how to put your best post forward.


Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another edition of Whiteboard Fridays. I'm Greg Gifford, the Vice President of Search at SearchLab, a boutique digital marketing agency specializing in local SEO and paid search. I'm here today to talk about — you guessed it — Google Posts, the feature on Google My Business that lets you post interesting and attractive things to attract potential customers.

The importance of Google My Business

Mike Blumenthal said it first. Your Google My Business listing is your new homepage. Then we all kind of stole it, and everybody says it now. But it's totally true. It's the first impression that you make with potential customers. If someone wants your phone number, they don't have to go to your site to get it anymore. Or if they need your address to get directions or if they want to check out photos of your business or they want to see hours or reviews, they can do it all right there on the search engine results page.

If you're a local business, one that serves customers face-to-face at a physical storefront location or that serves customers at their location, like a plumber or an electrician, then you're eligible to have a Google My Business listing, and that listing is a major element of your local SEO strategy. You need to stand out from competitors and show potential customers why they should check you out. Google Posts are one of the best ways to do just that.

How to use Google Posts effectively

For those of you who don't know about Google Posts, they were released back in 2016, and they used to show up at the top of your Google My Business panel, and most businesses went crazy over them. In October of 2018, Google moved them down to the very bottom of the GMB panel on desktop and out of the overview panel on mobile results, and most people kind of lost interest because they thought there would be a huge loss of visibility.

But honestly, it doesn't matter. They're still incredibly effective when they're used correctly.

Posts are basically free advertising on Google. You heard that right: they're free advertising. They show up in Google search results, and they're especially effective on mobile, where they're mixed in with other organic results.

But even on desktop, they help your business attract potential customers and stand out from other local competitors. More importantly, they can drive pre-site conversions. You've heard about zero-click search. Now people can convert without getting to your site. They appear as a thumbnail, an image with a little bit of text underneath. Then when the user clicks on the thumbnail, the whole post pops up in a pop-up window that basically fills the window on either mobile or desktop.

Now they have no influence on ranking. They're a conversion factor, not a ranking factor. Think of it this way though. If it takes you 10 minutes to create a post and you do only one a week, that's just 40 minutes a month. If you get a conversion, isn't it worth doing? If you do them correctly, you can get a lot more than just one conversion. 

In the past, I would have told you that posts stay live in your profile for seven days, unless you use one of the post templates that includes a date range, in which case they stay live for the entire date range. But it looks like Google has changed the way that posts work, and now Google displays your 10 most recent posts in a carousel with a little arrow to scroll through. Then when you get to the end of those 10 posts, it has a link to view all of your older posts. 

Now you shouldn't pay attention to most of what you see online about Posts because there's a ridiculous amount of misinformation or simply outdated information out there.

Avoid words on the "no-no" list

Quick tip: Be careful about the text that you use. Anything with sexual connotation will get your post denied. This is really frustrating for some industries. If you put up a post about weather stripping, you get vetoed because of the word "stripping." Or if you're a plumber and you post about "toilet repairs" or "unclogging a toilet", you get denied for using the word "toilet."

So be careful if you have anything that might be on that no-no, naughty list. 

Use an enticing thumbnail



A full post has an image and then up to 1,500 characters of text, and that's what most people pay attention to when they build a post. But the post thumbnail is the key to success. No one is going to see the full post if the thumbnail isn't enticing enough to click on.

Think of it like you're creating a paid search campaign. You need really compelling copy if you want more clicks on your ad or a really awesome image to attract attention if it's a banner image. The same principle applies to posts. 

Make them promotional

It's also important to be sure that your posts are promotional. People are seeing these posts in the search results before they go to your site. So in most cases they have no idea who you are yet.

The typical social fluff that you share on other social platforms doesn't work. Don't share links to blog posts or a simple "Hey, we sell this" message because those don't work. Remember, your users are shopping around and trying to figure out where they want to buy, so you want to grab their attention with something promotional.

Pick the right template

Most of the stuff out there will tell you that the post thumbnail displays 100 characters of text or about 16 words broken into 4 distinct lines. But in reality, it's different depending on which post template you use and whether or not you include a call to action link, which then replaces that last line of text.

But, hey, we're all marketers. So why wouldn't we include a CTA link, right? 

There are three main post types. In the vast majority of cases, you want to use the What's New post template. That's the one that allows for the most text in the thumbnail view, so it's easier to write something compelling. Now with the What's New post, once you include that call to action, it replaces that last line so you end up with three full lines of available text space.

Both the Event and Offer post templates include a title and then a date range. Some people dig the date range because the post stays visible for that whole date range. But now that posts stay live and visible forever, there's no advantage there. Both of those post types have that separate title line, then a separate date range line, and then the call to action link is going to be on the fourth line, which leaves you only a single line of text or just a few words to write something compelling.

Sure, the Offer post has a cool little price tag emoji next to the title and some limited coupon functionality, but that's not reason enough to use it. You should have full coupon functionality on your site. So it's better to write something compelling with a What's New post template and then have the user click through on the call to action link to get to your site, where they can get more information and convert.

There's also a new COVID update post type, but you don't want to use it. It shows up a lot higher on your Google My Business profile, just below your top-line information, but it's text only, with no image allowed. And if you've got an active COVID post, Google hides all of your other active posts. So if you want to share COVID info or updates, it's better to use the What's New post template instead.

Pay attention to image cropping

The image is the frustrating part of things. Cropping is super wonky and really inconsistent. In fact, you could post the same image multiple times and it will crop slightly differently each time. The fact that the crop is slightly higher than vertical center and also a different size between mobile and desktop makes it really frustrating.

The important areas of your image can get cropped out, so half of your product ends up gone, or your text gets cropped out, or things get really hard to read. There's a rudimentary cropping tool built into the image upload function for posts, but it's not locked to an aspect ratio, so you'll end up with black bars on the top or the sides if you don't crop to the correct aspect ratio, which is, by the way, 1200 pixels wide by 900 pixels high.
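One way to sidestep the wonky cropping is to upload an image that's already exactly 4:3. Here's a small Pillow sketch that center-crops to 1200x900; note that Google's actual crop sits slightly above vertical center and varies, so treat this as a starting point rather than a guarantee:

    # crop_for_posts.py -- pre-crop an image to Google Posts' 1200x900 (4:3)
    # A sketch using Pillow; Google's own crop position varies, so this just
    # removes the guesswork by uploading an exact 4:3 image.
    from PIL import Image

    def crop_to_4x3(path, out_path, size=(1200, 900)):
        img = Image.open(path)
        target_ratio = size[0] / size[1]
        w, h = img.size
        if w / h > target_ratio:           # too wide: trim the sides
            new_w = int(h * target_ratio)
            left = (w - new_w) // 2
            box = (left, 0, left + new_w, h)
        else:                              # too tall: trim top and bottom
            new_h = int(w / target_ratio)
            top = (h - new_h) // 2
            box = (0, top, w, top + new_h)
        img.crop(box).resize(size).save(out_path)

    crop_to_4x3("product.jpg", "product_1200x900.jpg")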

You need to have a handle on what the safe area is within the image. So to make things easier, we created this Google Posts Cropping Guide. It's a Photoshop document with built-in guides to show you what the safe area is. You can download it at bit.ly/posts-image-guide. Make sure you put that in lowercase because it's case sensitive.

But it looks like this. Anything within that white grid is safe and that's what's going to show up in that post thumbnail. But then when you see the full post, the rest of the image shows up. So you can get really creative and have things like here's the image, but then when it pops up, there's additional text at the bottom. 

Include UTM tracking

Now, for the call to action link, you need to be sure that you include UTM tracking, because Google Analytics doesn't always attribute that traffic correctly, especially on mobile.

If you include UTM tagging, you can ensure the clicks are attributed to Google organic. You can then use the campaign variable to differentiate between the posts you've published, see which posts generated more click-throughs or conversions, and adjust your strategy moving forward to favor the more effective post types.

So for those of you who aren't super familiar with UTM tagging, it's basically adding a query string to the end of the URL you're tagging, which forces Google Analytics to attribute the session the way you specify.

So here's the structure I recommend using for Google Posts: your domain on the left, then ?utm_source=gmb.post, then utm_medium=organic, and utm_campaign set to some sort of post identifier. Some people like to use Google as the source.
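Put together, a tagged URL looks like the output below. The domain and campaign value are placeholders; a few lines of Python build the string safely if you want to script it:

    # tag_post_url.py -- append UTM parameters to a Google Post CTA link
    # The domain and campaign value are placeholders; use your own naming scheme.
    from urllib.parse import urlencode

    def tag_url(base_url, campaign):
        params = {
            "utm_source": "gmb.post",
            "utm_medium": "organic",
            "utm_campaign": campaign,
        }
        return f"{base_url}?{urlencode(params)}"

    print(tag_url("https://www.example.com/specials", "oil-change-march"))
    # https://www.example.com/specials?utm_source=gmb.post&utm_medium=organic&utm_campaign=oil-change-march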

But at a high level, if you use Google as the source, that traffic gets lumped together with everything else from Google in your source/medium report. That can be confusing for clients who don't realize they can use secondary dimensions to break the traffic apart. More importantly, using gmb.post as the source makes it easy to see your post traffic separately in the default source/medium report.

You want to leave organic as your medium so that post traffic is lumped and grouped correctly with all other organic traffic on the default channel report. Then enter some sort of unique identifier in the campaign variable, such as a short text string, a date, or the title of the post, so you know exactly which post you're looking at in Google Analytics.

It's also important to mention that Google My Business Insights will show you the number of views and clicks, but the numbers are a bit convoluted because multiple impressions and/or multiple clicks from the same user are counted independently. That's why adding UTM tagging is so important for accurately tracking your performance.

Upload videos

Final note: you can also upload videos, so a video shows in the thumbnail and plays in the post.

So when users see that thumbnail with the little play button on it and click it, the video plays right in the pop-up post. The limit is 30 seconds in length or 75 MB in file size, which is basically the perfect fit if you have commercials. So even though they've been around for a few years, most businesses still ignore Posts. Now you know how to rock Posts, so you'll stand out from competitors and generate more click-throughs.

Hopefully you enjoyed the video. If you've got any additional tips to share, please throw them in the comments down below. Thanks for watching, and I'll see you again next time.

Video transcription by Speechpad.com

