Wednesday, March 22, 2023

The ROI of Digital Accessibility

In a recent AudioEye survey of 500 business leaders and web professionals, 70% said that “cost” was their main concern when it came to digital accessibility. Many of the respondents also thought they would have to rebuild their website from the ground up in order to deliver an accessible browsing experience.

This perception of digital accessibility as a cost center without an easy remedy is one of the reasons that just 3% of the internet is accessible to people with disabilities, despite the 1.3 billion people globally who live with a disability.

In this post, I discuss three benefits of digital accessibility — and hopefully, make a case for why inclusion isn’t just the right thing to do, but a huge business opportunity.

Purple illustration of a hand holding a gavel over a human symbol.

Three reasons to prioritize digital accessibility

Many business leaders are aware of the risk of non-compliance with the Americans with Disabilities Act (ADA) and other accessibility legislation. Over the last few years, there has been a record number of digital accessibility lawsuits. More companies are receiving demand letters or being taken to court over alleged violations under the ADA. And when that happens, other business leaders pay attention.

What business leaders don’t always consider is the opportunity that digital accessibility represents, whether it’s reaching more potential customers, building a more inclusive organization, or improving the browsing experience for all users — not to mention search engines and voice assistants.

1. Digital accessibility is not an edge case

Illustration of two piles of monetary bills. On the left, $1.9 trillion the income of people with disabilities. On the right, over $10 trillion the combined income of their friends and family.

One of the biggest misconceptions about digital accessibility is that it’s some sort of edge case. In fact, people with disabilities are the largest minority in the United States.

In the United States, one in four adults lives with some type of disability. That number goes even higher when you include temporary disabilities, like broken limbs or short-term impairments following surgery or medical treatments.

According to the Global Economics of Disability 2020 report, people with disabilities control $1.9 trillion in disposable income, globally. That number reaches over $10 trillion when their friends and family are included.

By designing for accessibility, you can make your website and digital experiences work better for everyone.

2. Accessible design is good for everyone

At its core, digital accessibility is all about eliminating barriers that can prevent people from browsing your website.

By following the best practices of accessible design, you can help ensure that everyone can interact with your digital content — regardless of age, disability, or any other factor.

For example, the World Wide Web Consortium’s (W3C) Supplemental Guidance to WCAG 2 includes best practices for clear and understandable content, such as:

  • Avoiding double negatives, such as “Time is not unlimited.”

  • Using short sentences with one point per sentence.

  • Putting the key takeaway or objective at the start of a paragraph.

  • When possible, using bulleted or numbered lists.

The goal of these recommendations is to remove confusion for people with dyslexia and other learning disabilities. But they could just as easily be general writing best practices.

Every user can benefit from simple, direct language that removes friction and gives them a clear next step. It’s the foundation of any conversion-optimized website — and it just happens to overlap with the best practices of accessible design.

3. Digital accessibility supports discoverability

There’s also a clear overlap between accessibility and discoverability. For example, sites with clear, descriptive headings — the same kinds of headings that make navigation and comprehension easier for people with disabilities — are also easier for search engines like Google to crawl.

Because of this, there’s strong evidence that Google rewards accessibility when ranking websites. In fact, its Webmaster Guidelines — which outline the best practices that help Google find, index, and rank your site — read like accessibility guidelines and often correlate directly with WCAG.

Accessible websites are also beneficial to users who browse with voice search. According to the Google Mobile Voice Study, 41% of US adults and 55% of teens use voice search daily. Businesses with websites that are optimized for voice search have a better chance of being discovered and used by potential customers.

Making the business case for digital accessibility

Illustration of monetary bills in front of a web page.

The first goal of any digital accessibility initiative should be to deliver an inclusive experience to everyone who visits your website. Not only is it the right thing to do, but it can help you reach a market that’s traditionally been underserved.

However, it’s important to note the other benefit of building an accessible website: greater conformance with accessibility standards like the Web Content Accessibility Guidelines (WCAG), which are used to assess a site’s compliance with the ADA.

Based on recent guidance from the Department of Justice, it’s clear that businesses of all sizes are expected to meet accessibility standards like WCAG in order to comply with the ADA.

When you calculate the ROI of digital accessibility, you should factor in that the cost of defending a digital accessibility lawsuit — or even settling a demand letter — can often surpass the cost of making your website accessible.

By taking a more proactive approach to digital accessibility, you can comply with the law while also turning a requirement into an opportunity to grow your business and deliver an inclusive experience to every customer.

As you invest in digital accessibility, it’s worth measuring your progress over time. To get started, you can use a free accessibility checker to assess your website’s accessibility — and then see how it improves as you implement accessibility best practices.

Monday, March 20, 2023

How to Avoid Duplicate Conversions and Recreating the Conversion Funnel for GA4

As you’re probably all too aware at this point, GA4 is coming. Old versions of Google Analytics will be switched off for pretty much everyone come July 2023.

While GA4 is improving all the time, there are quite a few things that people are used to seeing in old versions of Analytics which, at the very least, take a bit of creativity in the new world.

One example is how conversions are handled. In the old versions of Google Analytics, a conversion could only fire once per session. In GA4 conversions are just another kind of event, so it’s possible for a conversion to fire multiple times in one session.

The problem is, you might be very interested the first time someone signs up via your contact-us form. But that person might then reload the thank-you page, or sign up for something else via a different form on the site. That doesn’t mean you necessarily want to track two conversions.

Speaking of signing up via different forms, on some websites, users may wind up on the same thank-you page having taken very different routes to get there. If we don’t have that much control, and we’re having to rely on thank-you page views to track conversions, it can be hard for us to separate out different kinds of conversions.

In old versions of GA you could use funnels with a “required” step. You might have one goal with a funnel requiring your event page, another goal with a funnel requiring a different page, and rely on them to give you different conversions. There isn’t an obvious way to do this in GA4 either.

In this post, I’m going to take you through how to:

  • Avoid double counting in GA4.

  • Automatically ignore suspicious conversions (like people landing directly on the conversion page).

  • Recreate the kind of funnels we expected in Universal Analytics (in fact we’ll make them better).

I’ll take you through a few bits in GA4 and others using Google Tag Manager. The GA4 approach is more straightforward, but the Tag Manager approach is more robust and can help you make sure that all of your conversion pixels are showing roughly the same information (because we’re long past the point where GA is the only place we’re recording conversions).

Managing conversions in GA4

This section is about changes we can make purely through the GA4 interface. As long as you’re sending your page views and conversion events to GA4, you should be able to use these tactics without any code changes.

However: there are some limitations to doing things through GA4. For example, it can mean that your GA data doesn’t line up with conversions recorded via other platforms.

Avoiding double-counting

Julius Fedorovicius (of Analytics Mania fame) has produced a fantastic guide to making sure that conversions are only recorded once per session.

You should have a read but broadly:

  • You create a custom audience based on a sequence that begins with “session_start”

  • You fire an event when someone enters that audience

  • You use that event as your conversion.

No surprise that Julius has come up with a really smart way to handle the problem of double-counting.

If you’ve created Segments in Universal Analytics, Audience sequences in GA4 will look very like the sequences we used to create for Segments. However, the old Segments were just a way of visualizing data, whereas Audiences in GA4 are a way of grouping data. We can use Audiences to create something new.

That distinction is important because we can do cool things like fire custom events when someone enters an audience (which Julius makes use of in this solution).

Universal Analytics Segment sequence creator

GA4 Audience sequence creator

The limitations of using Google Analytics audiences

This isn’t really a limitation as far as GA goes but it’s a consideration nonetheless. Julius’ solution is great for making sure we’re not double-counting conversions in GA, but GA probably isn’t the only way we’re recording conversions.

The average site probably has a bunch of separate conversion tracking pixels and those could end up double-counting conversions.

For example: Facebook and Google both describe how they avoid double-counting conversions, but their solutions largely rely on exactly matching transaction IDs. And even if they’re handling it okay, there are a bunch of smaller fish out there also offering conversion tracking, and they can need a bit more hand-holding.

If we want to make sure that we’re only recording one conversion per session, it’s useful to make sure all of our conversion tracking is working in a similar way. Tag Manager is a great solution for that (I describe a solution in the Tag Manager section below).

You can also run into problems if, for example, your confirmation page is somehow indexed or bookmarked by users — people landing directly on it can lead to weird unexpected conversions. We can also use Tag Manager to guard against that a little bit.

Recreating the conversion funnel

Sticking with the GA4 interface for now, we can also adapt the AnalyticsMania approach to create our funnel-based conversions by adding additional steps to the sequence.

For what it’s worth, conversion funnels are not the ideal way to categorize conversions. If you can use anything more direct (like the ID of the form they’ve filled out, or a separate thank-you page), that’s a much more reliable way to categorize conversions. That said, we don’t live in a perfect world, and sometimes there isn’t the option to completely rebuild your conversion process.

In Fedorovicius’ example we just have two steps in our audience sequence:

  1. Session_start
    Indirectly followed by

  2. Conversion

Which basically means “someone lands on the site and then at any point during their session, they convert”.

To recreate the goal funnels you might be using in Universal Analytics, we can just add another step to the sequence. For instance:

  1. Session_start
    Indirectly followed by

  2. Visiting our event_page
    Indirectly followed by

  3. Landing on our thank you page/converting

That should mean we can create one conversion which is: Users who went through our event page and then converted.

And another conversion which is: Users who went through our sponsorship page and then converted.

There are some limitations here though, for example, what if someone:

  1. Landed on the site

  2. Visited our event page

  3. Then visited our sponsorship page

  4. Converted using the form on either.

They would fulfill the criteria for our event conversion and the criteria for our sponsorship conversion. We’d record a conversion for each and we’d end up double-counting after all.

This is also a limitation of the old Universal Analytics funnels: Just because a step in the funnel was required doesn’t mean the user can’t wander off around the site between that step and their final conversion. So, if it’s any consolation, this isn’t any worse than old Universal Analytics funnels (but we can still do better).

The problem with using “directly followed by”

You might say, “Well, that’s easily solved: at the moment the sequence says ‘is indirectly followed by’, and we can just change that to ‘is directly followed by’.”

Surely that would mean we only count someone who is on the sponsorship page and goes directly from there to the thank-you page, right?

Unfortunately that’s usually not what “directly followed by” means, because there are all kinds of things that can get recorded in analytics which aren’t page views.

For example, if someone lands on the sponsorship page, scrolls down, and then lands on the thank-you page, the thank-you page view doesn’t directly follow the sponsorship page view. It goes:

  • Page view: sponsorship

  • Scroll

  • Page view: thank you

So “directly followed by” isn’t an easy solution.

How about “within x minutes”?

GA4 has a really cool feature in the sequence builder where we can set a timer in between steps. Even outside of tracking conversions within a session, we can use it to keep track of cool things like people who came to our site, didn’t convert that time, but came back and converted within the next couple of days.

Jill Quick has been talking a bunch about how powerful these options are.

We could use this to say something like: person landed on our event page and then landed on our thank you page within 10 minutes.

But as I’m sure you’ve guessed, that ends up being a kind of arbitrary cutoff. Maybe someone spends some time thinking about how to fill out our form, or maybe someone really quickly goes to one of our other pages and converts there. This could be better than the basic funnel, but we could also end up ignoring completely legitimate conversions.

So what do we do?

Using GA4 sequences for this is kind of fine (as I say above, it’s certainly not worse than Universal Analytics), but we could do better with Google Tag Manager.

Managing conversions in Google Tag Manager

These approaches require you to run all your tracking via Tag Manager. Though even aside from this, if you’re not already using Tag Manager, I’d advise you to look into it!

Since we need to keep track of what’s happened to a user across multiple pages, these solutions are also going to make use of cookies. In case that fills you with dread, don’t worry:

  • I’m going to walk you through how to create and delete these cookies (it takes a little JavaScript, but it’s copy-paste and easier than you think!)

  • These aren’t the kinds of cookies designed to give away people’s information to other services.

To reiterate what I say above: while this approach takes a bit more effort than just doing things through Google Analytics, it allows us to do two things:

  1. Make sure all of our various tracking tags are firing in the same way

  2. Have more fine-grained control, particularly if we’re trying to categorise different paths to conversion.

Avoiding double-counting

To recap what we want to do here, we want to make sure that if someone visits our site and converts we fire a conversion. However, if they revisit a thank you page, or go through a different conversion, we don’t fire a second conversion that session.

To do that, we’re going to:

  • Set a cookie when a user converts.

  • Make sure that the cookie automatically disappears after 30 minutes of inactivity (this is the default timeout for GA4 sessions but if you think that’s too short you can set it to whatever you want).

  • Every time we go to fire a conversion, check if that cookie is present and, if it is, don’t fire the conversion.

That should mean that if someone comes to our site and converts, we’ll set the cookie, and that will stop us from firing any more conversions (GA4 or otherwise) until the user has taken a little time away from the site.

Setting a cookie in JavaScript

The first thing you need to know is that we can use Tag Manager to run any JavaScript we want. The second thing to know is that we can use JavaScript to set cookies.

So first: go to Google Tag Manager, create a new tag, and select the Custom HTML type.

Give the tag the name “[Tag] setCookieConverted” and paste the following into the HTML content:

<script>
// Get the time 30 minutes from now (this is because the default GA session timeout
// is half an hour and we want our cookie timeout to match)
var minutesToAdd = 30;
var currentTime = new Date(); // Get the current time
var newDateObj = new Date(currentTime.getTime() + minutesToAdd * 60000); // Add our minutes on

// Set the domain you're working on; this is because we want our cookies to be
// accessible on subdomains (like test.example.com) if needed
var yourDomain = "example.com";

// Set a cookie called 'converted' with the value 'true' which expires in 30 minutes
document.cookie = "converted=true; path=/; domain=" + yourDomain + "; expires=" + newDateObj.toUTCString() + ";";
</script>

It should look like this:

The Custom HTML tag will add that content to the page, and as soon as the browser detects the new script (the one we’ve written), it’ll run it.

What our script does is:

  • It finds the current time, and what time it’ll be in half an hour.

  • It uses that, and your domain, to set a cookie called “converted” which can be read by any page on your website.

When you go to save your tag it’ll probably say “No Triggers Selected”.

For now we’re going to click “Add trigger” and choose the “All Pages” trigger.

This is purely so that we can easily test it while we’re putting this together.

Reading our cookie value

Tag Manager has a built-in way to read cookie values using variables. So go to the variables section, create a new variable called “convertedCookie” and set the Cookie Name as “converted”.

Now, if you click the “Preview” button and open up your site, you can start to look at what value the convertedCookie variable pulls through.

Click into the “Variables” tab and you should see convertedCookie somewhere in the list. Here’s an example with other cookies blocked out so you know what to look for.

So now we can use the value of that variable in Tag Manager as part of our logic.
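If you want a quick sanity check outside of Tag Manager’s preview mode, you can also read the cookie straight from the browser console. This snippet is just a convenience check (it assumes the “converted” cookie name from the tag above):

// Paste into the browser console: logs the value of the "converted" cookie,
// or undefined if it hasn't been set yet
console.log(
  (document.cookie.match(/(?:^|;\s*)converted=([^;]*)/) || [])[1]
);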

Using the conversion cookie in our conversion logic

Everyone’s conversion setup will be different, so this might not match what you’re doing exactly. But if you’re considering using GTM, I’m assuming you are firing conversions something like this:

  1. You have a trigger based on some condition (probably either a custom event or a pageview)

  2. You have a tag (or multiple tags) that send your conversion information whenever that trigger is activated.

What we’re going to do is tweak your trigger to add another condition.

Imagine that your trigger was previously firing on every thank-you page visit:

What we’re going to do is add a second condition to the trigger:

convertedCookie does not contain true

While this example uses the thank-you page path, it doesn’t have to; it can be anything.

Once you make this change, you can go and test your conversion. Because you have another tag adding the converted cookie on each page view, your conversion shouldn’t fire when it normally would.

Now we just need to change our converted cookie so that it only appears after someone has converted.

At the moment we’re setting the “converted” cookie on every page view, so we’ll never get any conversions.

We need to update that so:

  • We set a cookie when someone converts.

  • Every time we load a page, if the person is marked as “converted” we reset the cookie (I‘ll explain).

Setting a cookie only when someone has converted

First: we need to remove the trigger from [Tag] setCookieConverted so it doesn’t fire at all.

Then we go to whatever tag we’re using to send our conversion, open up “Advanced Settings”, click “Tag Sequencing” and select “Fire a tag after”.

Then we select our setCookieConverted tag and check “Don’t fire if conversion tag fails”.

This should mean that whenever we send our conversion, we’ll automatically then activate our cookie tag and mark the user as converted.

So now our logic is:

  • If someone converts, we check if there is a cookie saying they recently converted already.

  • If they don’t have that cookie we send a conversion.

  • Then we automatically set that cookie.
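If it helps to see that logic in one place, here’s a rough JavaScript sketch of what the trigger condition and tag sequencing combine to do. It’s illustrative only; in practice GTM handles this via the trigger and “fire a tag after” settings, and the function names here are just made up for the example:

// Illustrative sketch only: GTM's trigger condition plus tag sequencing do this for us.
function getCookie(name) {
  var match = document.cookie.match(new RegExp('(^|;\\s*)' + name + '=([^;]*)'));
  return match ? match[2] : undefined;
}
function markConverted(minutes) {
  var expires = new Date(new Date().getTime() + minutes * 60000);
  document.cookie = 'converted=true; path=/; expires=' + expires.toUTCString() + ';';
}
// sendConversion is a stand-in for whatever actually fires your conversion tag(s)
function onThankYouPageView(sendConversion) {
  if (getCookie('converted') !== 'true') {
    sendConversion();  // no recent conversion cookie, so fire the conversion
    markConverted(30); // then mark the user as converted for the next 30 minutes
  }
  // If the cookie is already present we do nothing, so there's no double-counting.
}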

To test this, you can either clear the cookie or wait for it to expire. Here are instructions for how to clear cookies in Google Chrome (which you’re probably using if you’re working with tag manager).

Now, if you go into GTM preview and click around, you should be able to look at your variables and see that convertedCookie is back to being ‘undefined’.

If you convert, you should see that both tags fire — your conversion tag and your setCookieConverted tag.

But if you convert again (reload the page, re-fill the form, whatever you’ve got to do) you should see that neither tag fires.

Congratulations! You’re filtering your conversions to avoid recording more than one conversion for someone in a 30-minute window.

We just want to make one last tweak now.

Refreshing the cookie if it has been set

Our cookie has a 30-minute expiration. That means it’ll stick around for 30 minutes and then automatically be deleted from the browser. But what if someone hangs around on our website for more than half an hour, reading a blog post or something, and converts again?

To help deal with that, we’re going to add another trigger which checks if the user has recently converted, and if they have, refreshes the cookie with each new page load.

Head back to [Tag] setCookieConverted

At this point it should have no firing triggers. We’re going to add one back in.

Click the blue plus sign on this screen, and again on the next screen that comes up, to create a new trigger.

In the new trigger, we set it to fire only on page views where convertedCookie contains true.

So this gets a little bit circular, but basically:

  • When someone converts we set a “converted” cookie for the next half hour.

  • Every time someone loads a page, if they have a “converted” cookie we reset that cookie for another 30 minutes.

  • If at any point the user doesn’t load a new page for 30 minutes, the cookie will expire, which means our refresh won’t be triggered.

You can test this by clicking around your site with the GTM preview. Once you’ve converted, the [Tag] setCookieConverted should fire on every new page load.

Wrapping up

All you need to do now is make sure that all of your conversion tags use that same trigger (the one that has the condition that convertedCookie isn’t “true”). Once that’s set up, they should all behave the same — only recording one conversion per session unless someone clears their cookies or just hangs around on one page for a very long time.

What if we find we’re getting weird conversions where users haven’t visited any other pages on the site?

I have worked with sites in the past where:

  • There’s useful information on the thank-you page and users have been keeping it open/coming back to it.

  • Confirmation pages have been indexed in Google or people are finding their way to the conversion page some other way.

That can lead to weird tracked conversions that don’t correspond to actual conversions. While these problems should be solved at source, we can also clear up our analytics using the steps in “Creating a conversion funnel” below.

Creating a conversion funnel

This builds on the cookie meddling we’ve done in the last section, so if you haven’t read that bit, it’s worth taking a look!

If you’re here not because you want a specific funnel, but because you want to deal with weird conversions where users land straight on the conversion page, don’t worry: you follow these instructions exactly the same, you just set the trigger to fire on every page except your conversion page (I’ll take you through that).

Setting a “path” cookie

Just like the “converted” cookie before, we’re going to create a new cookie that records the location of the current page.

Create a new tag called [Tag] setCookiePath, choose “Custom HTML”, and add the following JavaScript:

<script>
// Get the time 30 minutes from now (this is because the default GA session timeout
// is half an hour and we want our cookie timeout to match)
var minutesToAdd = 30;
var currentTime = new Date(); // Get the current time
var newDateObj = new Date(currentTime.getTime() + minutesToAdd * 60000); // Add our minutes on

// Set the domain you're working on; this is because we want our cookies to be
// accessible on subdomains (like test.example.com) if needed
var yourDomain = "therobinlord.com";

var pagePathName = window.location.pathname; // Get the path of the current page

// Set a cookie called 'conversionPath' whose value is the current page path, expiring in 30 minutes
document.cookie = "conversionPath=" + pagePathName + "; path=/; domain=" + yourDomain + "; expires=" + newDateObj.toUTCString() + ";";
</script>

It should look like this:

This will save a cookie that records the location of the page. The first time it runs, it will create a new cookie with that information; every time after that, it’ll replace the value.

We’ll use this to make sure that whichever funnel page our user interacted with last is the one we record.

Triggering on your funnel pages

In creating our “funnel” we’re assuming that there are certain pages a user passes through in order to convert. So we’re going to set this to trigger only when one of those funnel pages is involved.

In your [Tag] setCookiePath tag, click to add a trigger and create a new one.

We’re going to configure our trigger to activate on every user click on one of those funnel pages. This means that if a user is hopping between different funnel pages, each one will overwrite the cookie as they click around, and only the page they interacted with last will stick around in the cookie value.

Getting our funnelCookie

As in the double-counting instructions, create a new variable. But this time, call it funnelCookie and set the “Cookie Name” to conversionPath.

Once you’ve done that you should be able to test by using preview, going to any old page of your site (as long as it’s not one of your funnel pages) and checking funnelCookie in the Variables (it should be undefined).

Then go to one of your funnel pages, and you should be able to see the cookie value change.

As you visit other pages on the site, funnelCookie should stay the same, unless you visit another funnel page.

Changing our conversions based on the funnelCookie

Now, there are smart things you could do here with extracting the value of funnelCookie and putting that into a variable in your conversion tag, but the setup for every tag will be different, and I want to give you an option in case you’re not able to do that.

This will create a little more mess in your Tag Manager account, because you’ll be duplicating some of your triggers and conversion tags.

First, let’s go back to the conversion trigger we were working on before. It looked like this when we left it:

We’re going to add in another condition:

funnelCookie contains event-page

This means now that this conversion will only fire if the last funnel page our user passed through was the event-page.

After this, we can duplicate this trigger and our conversion tags, and then change the funnelCookie value in the duplicated trigger for our other set of conversions.

Maybe instead we make it:

funnelCookie contains form-page

Now you have two sets of conversions, each of which will fire based on which funnel page the user passed through. From there you can edit the values sent.

A couple of caveats

Instead of duplicating our conversion tags, it would be much better to pull in the value of the funnelCookie variable and use it to dynamically change some of the values we’re sending as part of the conversion.

With this approach, you also run the risk of not recording any conversions at all if a user hasn’t passed through one of your funnel pages. That might be what you want, but it’s worth bearing that risk in mind in case you think people might take legitimate-but-unusual routes to conversion.

While I can’t take you through the process of updating all of your conversion tags, one option for making this information easier to use in your conversion tags (and for setting a fallback, in case you want to avoid losing conversions) is to use a lookup table like this, where you take the funnelCookie value and categorise the possible values.

Then instead of adding the funnelCookie value in your trigger, you keep the trigger the same and pull in the lookup table value.

Triggering on any page except your conversion page

If you’re not concerned about constructing page funnels, but you want to make sure that users have visited at least one page before converting, there are a couple of changes:

  • You trigger [Tag] setCookiePath based on any Page View that isn’t your confirmation page.

  • You don’t bother creating different conversion flows; you just have one flow, but you still add a funnelCookie requirement which says that your funnelCookie has to be some page rather than undefined.

Conclusion

Hopefully this has helped you get an idea of how to take more control over the conversions being recorded on your site, whether that’s entirely through GA4 or using the power of Tag Manager.

Happy tracking!

Friday, March 17, 2023

You're Measuring Your Branded SERP Wrong – Whiteboard Friday

Controlling the consumer experience of your brand is key to how people see it and how they interact with it. For SEOs, the part of brand experience that we control the most is the SERP, yet traditional ways of measuring brand reach on the SERP often fall short.

Today, Dominic talks through an example of how they fall short, and how we can do better.

infographic outlining better ways to measure your branded serp

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I'm Dominic. I am the founder of Piped Out, and I'm here to talk to you about measuring your brand. If you have a product, controlling the consumer experience of that brand is very, very important. How do they see it and how do they interact with it?

As SEOs, the bit of that brand experience that we control the most are the SERPs, but traditional ways of measuring that often fall short. We're going to talk through an example of how they fall short and how we can do better. So I'm going to use the example up here of cyberpunk.net. There was a video game that came out a little while ago called Cyberpunk 2077. I'm just picking this as an example of a product that had some real brand reputation problems.

It did not have a good launch. So if you were an SEO working for them, how might you have gone about controlling and measuring how well you were controlling their brand experience? Over here we've got our SERP. Our SERP is we've got some fairly standard blue link results where we've got a title and a meta description. We've got some other richer elements, things like a video block down here and a People Also Ask block. We've got a knowledge block over to the side. If we're looking at this in a rank tracker, we'd go, okay, so for cyberpunk.net, the primary domain that we control, here's our rank, and over time we've moved from five to four, fantastic.

Outside of brand, this is not a terrible way to do it. This is, of course, for a single keyword because we can see the sort of fluctuations there. If we have multiple keywords, we'll have an aggregate so that it will become an average. It's probably going to look a bit smoother. Maybe we've got something like bucket ranks sitting behind here, so the number of keywords that rank position one, number of keywords that rank position two, and we've got that to give us a bit of a richness over time.

But fundamentally, this is still a very simple picture for a branded SERP because, firstly, what we need to think about is how the way people look at SERPs has changed a bit over time. As SERPs have gotten richer, people like Nielsen have seen how we no longer just read straight down. We now ping pong around a bit more. We're willing to interact with more of the SERP. There's also this assumption here: Google has domain crowding, so ideally they're trying not to show too many results from each domain on each particular SERP.

But on a branded SERP, your goal is not to rank with one URL. Your goal is to rank with many URLs, and you have the ability to rank with many URLs unlike normal. For a branded term, like, for this example, it could be "Cyberpunk 2077." It could be something slightly more loaded, like "is cyberpunk good." It could be something slightly less loaded, like "is cyberpunk out." All of these are branded terms, where I think the publisher has a really good chance of appearing for multiple places.

So the green one, right at position four or five, is our cyberpunk.net. But actually we also own position three. That's our publisher domain, the people who published the game. So you've got the game website and the publisher thing. We also might want to count this as a win for us. We've got a journalist who's a really good, basically, champion for our brand. They really like the game, and they've written a really positive article that's down here.

There's one up here in red. This is a negative article. This is someone who is not having such a good time with it. We might have our Twitter account and any other number of things, the YouTube accounts, all of these other bits that we could be getting onto our branded SERP. If we're just measuring by rank, we're only ever going to get one single number to sum up all of this richness. It's just not good enough for brand. So how can we do better?

How do we take this? So firstly, what we need is a rich SERP data model. You can get this from anywhere. In practice, what we do at Piped Out is we use a service called DataForSEO, which is an API that gives a very rich SERP data model. We'd highly recommend. But there are plenty of services that offer this, and it really doesn't matter which one you pick. What's important is that you have a rich enough SERP data model to get all the ins and outs of each of these little SERP features.

So I don't just have the People Also Ask box as a single block. I have each individual question in the People Also Ask box with the title that ranks and then the URL that is below that. Same thing for the video. I need a rich enough SERP data source that for the video element, I don't just have the URL, I also get the source for a video block. Google has a little source there, and that's actually the name of the account.

When you go and get all that rich data, you can then define what is yours or what you want to be yours. We don't technically own this positive article, but we might count it as ours. So you can go and build up this definition of like all of the different things that you think are valuable. You then take this SERP data model and you say, okay, and you calculate the size for each of these things.

So how much is this size in pixels? This is an ordinary SERP block, so it's probably about 180 pixels tall. So you calculate the height of all of these different elements that you own (you could do area as well, but height works for a more basic version), and you turn that into the percentage of owned SERP measured by pixels. What percentage of this SERP do you own?
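As a rough illustration of that calculation, here's a minimal sketch in JavaScript. The elements, pixel heights, and "owned" flags below are made up for the example; in practice they would come from your SERP data source:

// Minimal sketch: percentage of the SERP (by pixel height) that you "own"
var serpElements = [
  { name: 'negative article', heightPx: 180, owned: false },
  { name: 'publisher domain', heightPx: 180, owned: true },
  { name: 'video block', heightPx: 260, owned: true },
  { name: 'cyberpunk.net', heightPx: 180, owned: true },
  { name: 'People Also Ask', heightPx: 220, owned: false }
];

var totalPx = serpElements.reduce(function (sum, el) { return sum + el.heightPx; }, 0);
var ownedPx = serpElements.reduce(function (sum, el) {
  return sum + (el.owned ? el.heightPx : 0);
}, 0);

console.log('Owned share of SERP: ' + Math.round((ownedPx / totalPx) * 100) + '%'); // 61% here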

Again, our definition of ownership is whatever we want it to be. This is enabled by having a very rich SERP data source. Now, we are getting a little bit of something here, which is that with rank, we obviously know five is better than four. We lose a little bit of that context when we come with something like this, which is that obviously ranking down here is not as good as ranking up here. But when we're measuring size and all of these different elements, again, we can't literally have all of them there.

So we have to have some sort of compromise for how do we measure importance without going just back to rank. I think an excellent compromise is to measure above the fold and first page, so the percentage of the pixels of the SERP that you own above the fold on desktop and mobile and on the first page by desktop and by mobile. So the idea is that you have a graph that looks something like this, where we can say, "Okay, for our site, for cyberpunk, for the set of keywords that we're looking at, we own a growing percentage of the mobile pixels above the fold. Great, we're controlling our brand experience."

That gives us essentially a good idea of how what we're doing as SEOs is controlling that branded SERP. Then when you're doing brand projects and you're trying to enrich and fill on this domain, you're getting your results. You're correctly measuring your results rather than having a very simplified picture where you can't do this. Technically, like I said, this practically looks like finding a rich SERP data source.

In our case, at Piped Out, we take DataForSEO and we pump it into BigQuery. But again, it could be anything. Lots of great SERP API data sources do this. STAT might also do this. Don't know. Go ask STAT because it's a Moz product and this is a Moz blog. That's pretty much it.

Go get that data source, use it to build these sort of dashboards, and you can get better insight into how well you're controlling your brand. Thanks and we'll see you next time.

Video transcription by Speechpad.com

Thursday, March 16, 2023

Three Irish Small Business Ideas that Could Be US Hits

Fine art painting of a woman and girls knitting on the steps of a small business
“Knitting the Islands”, by Miriam Ellis

A happy and lucky St. Patrick’s Day to all my readers! I’ve seen again and again that small and local businesses became successful due to a great inspiration and some little happenstance bit of luck that got them noticed. Today, I’d like to celebrate with you by offering a shamrock of three ideas I’ve seen taking off in my mother country of Ireland. You may not replicate the exact business model, but do take away the underlying concepts, which I strongly believe could succeed in the US. I’ll also point out how you can help luck along with a little creative marketing. Share this article with your team for brainstorming new campaigns, or with anyone in your life who wishes they could start a small business.

Finding the "grá"

Ever wondered how to say “I love you” in Irish? One way is "tá grá agam duit" (taw graw ah-gum duts/ditch). It’s not uncommon to hear Irish folk saying they have a "grá" for something when speaking English, and to me, the word not only conveys love but a kind of longing. When people have a "grá" for some really good bread, or a trip to the seaside, or a warm coat they saw in a shop window, it’s what we might call “consumer demand” in American marketing lingo. Pay attention right now, and you may be starting to notice people in the US and elsewhere expressing a special kind of "grá" for a different life. Recently, such a thread stood out to me on Twitter, started by author and founder Dave Gerhardt.

screenshot of tweet in which author expresses fatigue with technology and states that he would like to build something local in his community.

Software, of course, isn’t going anywhere any time soon, and the more we see of the current state of AI chat, the less many analysts are convinced that it’s going to be a major disruptor at present, but what I observe in this tweet and the replies to it is that people are starting to get tired of the one-dimensional confines of too much screen time. Wanting a satisfying local life and community “IRL” is a great "grá" statement. Americans are deeply attached to our tech, but more and more, I’m running across peers talking about having an “analog life”, wishing their kids would become “luddites”, or wondering how an off-grid life would feel for their families. More simply put, many people would like to experience more satisfaction in what is right around them.

This dynamic is, in fact, tailor-made for small business entrepreneurs, so let’s look at these three aspirational concepts to see if you or your clients have got a "grá" tugging at you for any of them.

1. Be about life

Screenshot of a website selling rollout wildflower seed mats to replace lawns.

Within living memory, it was the mark of respectability to have your little weedless patch of green lawn. You constantly cut the grass to keep it under tight control. You yanked out every dandelion - or worse - poisoned your own nest with herbicides. Think things never change for the better? I hear you, but check out TheIrishGardener because now, instead of rolling out bundles of monocrop sod, the Irish are carpeting the outdoors with native wildflower mats. One dimension isn’t enough anymore - folk want flowers and bees and moths and butterflies and bugs and more of everything alive. Yard by yard, they are reinvigorating essential ecosystems. Clever wildflower seed sellers are now marketing their products like seed mats and seed bombs not just to homeowners but as wedding favors, holiday gifts, classroom projects, and more.

There’s been such a base trend in US marketing in which we try to sell things to our neighbors by scaring them. Our ads are full of guns, screaming, threats, panic, anxiety, and danger and it’s very weird contrasting this with the ads I listen to on Irish media which seem to be largely focused on green energy, eating nice things, and enjoying the arts.

Could your great small business reject fear-and-shock-based marketing and instead hinge on beauty and satisfaction in life? We do have that old adage of drawing more flies with honey than vinegar, and if you can align your business with the very strong yearning for life to be abundant, varied, diverse, interesting, healthy, and fun, I think you’re moving away from the old lifeless lawns to the new thriving garden.

2. Be about locality

Screenshot of a website featuring the harvesting of Irish seaweed.

There’s only one place you can get real Irish seaweed - from the coasts of the country, of course! WildIrishSeaWeeds.com is one of those rare businesses that has seen the potential in a gift of nature that many might pass by without noticing. Seaweed is practically a miracle - you can eat it, bathe in it, and use it as a very carbon-friendly fertilizer that elders have always sworn by. What was once mainly a snack remembered fondly by children is now becoming a serious green industry in Ireland, and not far from where I live, I see a Californian company testing whether they can latch onto a similar demand in the US.

What is overlooked where you live? Is it something that can only be gotten in your local area? Something people used to love but are forgetting about now? Maybe it’s a local food source that’s starting to disappear because no one is using it anymore, or maybe it’s a skilled craft like basketmaking in a local style, baking or brewing a regional speciality, knitting or sewing a heritage garment, compounding an old-time remedy. Maybe it’s reviving a tradition that used to anchor your community. Could your great small business idea simply be about reconnecting neighbors with what’s special about where you live…a place that may have started to vanish from our collective consciousness because the screens are blocking the view?

3. Be about people’s simplest pleasures

Restaurateur growing potatoes on the balcony above his establishment.

Our SEO lives may be consumed with ChatGPT right now, or GA4, or what will happen next on or to Twitter, but Padraic Óg Gallagher is up on the balcony of his restaurant, growing real Irish potatoes for his Boxty House in Dublin. If you’ve never had the luck to eat boxty, it’s a delicious potato cake, beloved enough in Ireland to be the inspiration behind a restaurant that’s seen such success, it was able to open a second location. Boxty is not fancy. It’s something your mother would make you from leftovers, something treasured from childhood, the memory of which warms your very soul.

If we look again at Dave Gerhardt’s Twitter thread, he’s not longing for a yacht, nor a mansion, nor a pot of gold. He just wants the simple pleasure you get from “building in your community.” Most of us can be plenty happy with just enough, and rather than creating a business idea around elite luxury, consider what you might offer that actually delivers human contentment to the most people. A basic kitchen good that isn’t made well any more? A handcrafted walking stick? A cozy bookshop, a guided tour for visitors, your grandmother’s pecan pie, a wooden toy, a cloth doll, a sturdy garden implement, a bayberry candle, a regional herbal tea?

The simpler and better quality your idea, the more of a welcome change it could be for customers increasingly expressing fatigue from low-quality, mass-produced, and very limited options. America’s Vermont Country Store has been outstandingly successful in helping people relocate fundamental merchandise they can’t find anymore. Study their approach.

Creative marketing of your small business idea

Creativity in an ancient illuminated manuscript

What can you do to catch the eye of your audience? You’ve probably guessed that I’m going to say that, no matter how small your local business, you’ve got to have a website and local business listings. 30 years ago, I would have said this about the telephone book, and however much we may long for more off-screen time, we’ve got to concede that the web makes it so easy to be found! So yes, publish the best website you can budget for, build out your Google Business Profile and other listings, and invest all you can in learning about digital reputation management. It will help you achieve your goals.

That being said, the room that exists beyond the web for creative marketing could fill all the pages of the Book of Kells. If you’re starting out quite small, try these low-tech approaches to getting the word out about your new business idea in your community:

  • Ask an established business owner to host you as a pop-up shop inside their store, perhaps for tourist season or the holidays.

  • If you produce enough volume, meet with local shop owners to discover whether your product could win a permanent place on their shelves.

  • Approach local reporters with the most succinct, newsworthy angle of your business to seek press.

  • Real-world community message boards still exist in some towns. Use them.

  • Put a sign outside your house or in the window of your apartment. No room? Ask local officials for permission to put a sign in a vacant lot or on a street corner where you’ve seen other signage posted. Be ready to sell them on how your idea benefits the community.

  • Research local regulations regarding hanging fliers around town.

  • Research whether there is an opportunity for you to be included in existing print catalogs. 90 million Americans purchase something from a catalog annually, and even as the Internet has become so established in our lives, catalog shopping has continued to trend upwards.

  • Found or join a local business organization for brainstorming, networking and cross-selling.

  • Coordinate with other micro-business entrepreneurs to host a shared party in a local park, acquainting your community with your presence and offerings.

  • Sponsor local teams, events, and people and be cited for it both on and offline.

  • If your community still has a local radio station, try to get on it, either with an ad or as a guest, to reach 82.5% of US adults.

  • If you live in an area favored by tourists, contact the local visitors’ center to see how to get listed in their publications.

  • Advertise in the mailers and bulletins of local houses of worship and schools.

  • If what you produce relates to any type of food, music, art, cultural, or local festival, participate in it.

“Little as a wren needs, it must gather it.”

Irish stamp featuring a native wren bird.

I’m closing today with this famous Irish proverb, because it seems right for this moment in America, where the myth of endless growth and the dangers of an unchecked appetite for luxury have done no favors to the economy or environment our whole people must live in. The Irish phrase, “Cé gur beag díol, caithfidh sé a sholáthar,” has traditionally been used to remind us that even the small wren has to work hard to provide for itself - a scenario every small business owner and local business marketer will easily relate to.

But I’m starting to see a double-meaning in this phrase, and new business trends in Ireland are helping me to see it: a more sustainable way to found a venture may be in asking not how much you want, but how little you actually need to be satisfied. SEOs everywhere already know it’s a best practice to get clients to define what success looks like before a project begins so that all parties can see when a goal has been attained. For most small business owners not seeking to become big business owners, achievement will simply mean something along the lines of being able to pay themselves and their staff enough to have a modest, good life. To me, this recognition matters right now, because most customers are in search of the same thing - having just enough.

Whether it’s through thrifting in Ireland or thrifting in America, re-storing in Drogheda or re-storing in Simi Valley, eating local and organic at Moyleabbey Farm in Kildare or at Waxwing Farm in Washington, or preserving traditional crafts that last on that side of the water or on this, tandem trends are indicative of a search for a simpler, better life. 57% of Americans say they shop small to keep money local, and there is no overstating how much both nearby economics and the global climate benefit from this approach. If you’ve decided 2023 is the year to lean into the new/old ways by starting or marketing small businesses, I’d say the luck may be on your side!

Monday, March 13, 2023

Daily SEO Fix: Exploring Subfolder Search with Moz Pro

A subfolder, also known as a subdirectory, is a way of organizing the pages on your website. They can be thought of as a way to divide up information or products, depending on your business. Within a subfolder, there can be more subfolders stacked within, like nesting dolls! Subfolders can be great for your website and for SEO. Having clear landing pages for your subfolders is important to both the searcher and the crawler. They are also another way to earn more backlinks, contributing to a greater Domain Authority. A subfolder could be a blog, a product category, an Etsy store, and more!

You may not have thought of conducting keyword research or competitive research for your subfolder, but now may just be the time to look into it. Here’s the good news — you can conduct subfolder research with Moz Pro! Follow the videos below to understand how to get the most out of researching your subfolder with Moz Pro tools.

Why You Should Do Subfolder Research

There are several different ways subfolders can be used on your website. Learn all about why subfolder research is important for your business.

Subfolder Search with Keyword Explorer

Explore keyword opportunities with Moz’s ‘Explore by Site’ tool. Search by subfolder using this tool to discover the number of keywords that you rank for, and the top keywords you rank for, along with keyword metrics that help you understand which keywords you should be focusing your efforts on.

Subfolder Search with True Competitor

You can use the ‘True Competitor’ tool in Moz Pro’s Competitive Research suite to figure out who your competitors are for your subfolder. See who your top 25 competitors are; some you may already know about, and some you may not. Explore more metrics to understand which competitors you should pay closer attention to.

Subfolder Search with Keyword Gap

Dig a little deeper with the Moz ‘Keyword Gap’ tool. Input your own subfolder and your previously discovered competitors to explore the keywords that you share with them. You’ll be able to look for certain keywords that you should work to improve your ranking for, and discover top competing content to give you further content ideas for your subfolder research.

Friday, March 10, 2023

The Fundamentals of Crawling for SEO – Whiteboard Friday

In this week’s episode of Whiteboard Friday, host Jes Scholz digs into the foundations of search engine crawling. She’ll show you why no indexing issues doesn’t necessarily mean no issues at all, and how — when it comes to crawling — quality is more important than quantity.

infographic outlining the fundamentals of SEO crawling

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Good day, Moz fans, and welcome to another edition of Whiteboard Friday. My name is Jes Scholz, and today we're going to be talking about all things crawling. What's important to understand is that crawling is essential for every single website, because if your content is not being crawled, then you have no chance to get any real visibility within Google Search.

So when you really think about it, crawling is fundamental, and it's all based on Googlebot's somewhat fickle attentions. A lot of the time people say it's really easy to understand if you have a crawling issue. You log in to Google Search Console, you go to the Exclusions Report, and you see whether you have the status "Discovered – currently not indexed".

If you do, you have a crawling problem, and if you don't, you don't. To some extent, this is true, but it's not quite that simple because what that's telling you is if you have a crawling issue with your new content. But it's not only about having your new content crawled. You also want to ensure that your content is crawled as it is significantly updated, and this is not something that you're ever going to see within Google Search Console.

But say that you have refreshed an article or you've done a significant technical SEO update, you are only going to see the benefits of those optimizations after Google has crawled and processed the page. Or on the flip side, if you've done a big technical optimization and then it's not been crawled and you've actually harmed your site, you're not going to see the harm until Google crawls your site.

So, essentially, you can't fail fast if Googlebot is crawling slow. So now we need to talk about measuring crawling in a really meaningful manner because, again, when you're logging in to Google Search Console, you now go into the Crawl Stats Report. You see the total number of crawls.

I take big issue with anybody that says you need to maximize the amount of crawling, because the total number of crawls is absolutely nothing but a vanity metric. If I have 10 times the amount of crawling, that does not necessarily mean that I have 10 times more indexing of content that I care about.

All it correlates with is more weight on my server and that costs you more money. So it's not about the amount of crawling. It's about the quality of crawling. This is how we need to start measuring crawling because what we need to do is look at the time between when a piece of content is created or updated and how long it takes for Googlebot to go and crawl that piece of content.

The time difference between the creation or the update and that first Googlebot crawl, I call this the crawl efficacy. So measuring crawling efficacy should be relatively simple. You go to your database and you export the created at time or the updated time, and then you go into your log files and you get the next Googlebot crawl, and you calculate the time differential.
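As a rough sketch of that calculation (the timestamps below are hypothetical; in practice the modified time comes from your database or CMS and the crawl times come from your log files, filtered to Googlebot requests for that URL):

// Crawl efficacy: time between a content update and the next Googlebot crawl.
// The timestamps below are hypothetical stand-ins for your own data.
var lastModified = new Date('2023-03-01T09:00:00Z');
var googlebotCrawls = [
  new Date('2023-02-27T18:30:00Z'), // before the update, so it doesn't count
  new Date('2023-03-03T14:15:00Z'),
  new Date('2023-03-06T02:40:00Z')
];

// The first Googlebot crawl at or after the creation/update time
var firstCrawlAfterUpdate = googlebotCrawls
  .filter(function (t) { return t >= lastModified; })
  .sort(function (a, b) { return a - b; })[0];

if (firstCrawlAfterUpdate) {
  var hours = (firstCrawlAfterUpdate - lastModified) / (1000 * 60 * 60);
  console.log('Crawl efficacy: ' + hours.toFixed(1) + ' hours'); // 53.3 hours here
} else {
  console.log('Not crawled since the last update');
}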

But let's be real. Getting access to log files and databases is not really the easiest thing for a lot of us to do. So you can have a proxy. What you can do is you can go and look at the last modified date time from your XML sitemaps for the URLs that you care about from an SEO perspective, which is the only ones that should be in your XML sitemaps, and you can go and look at the last crawl time from the URL inspection API.

What I really like about the URL inspection API is if for the URLs that you're actively querying, you can also then get the indexing status when it changes. So with that information, you can actually start calculating an indexing efficacy score as well.

So looking at when you've done that republishing or when you've done the first publication, how long does it take until Google then indexes that page? Because, really, crawling without corresponding indexing is not really valuable. So when we start looking at this and we've calculated real times, you might see it's within minutes, it might be hours, it might be days, it might be weeks from when you create or update a URL to when Googlebot is crawling it.
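
For the proxy version, here's a minimal sketch that compares a sitemap lastmod value against the last crawl time (and coverage state) returned by the Search Console URL Inspection API. It assumes you already have an OAuth access token with Search Console access; the token, property, URL, and timestamp below are placeholders:

```python
# Sketch: crawl/indexing efficacy proxy via sitemap <lastmod> + the URL Inspection API.
# Assumes an OAuth 2.0 access token with Search Console scope; all values are placeholders.
from datetime import datetime
import requests

ACCESS_TOKEN = "ya29.your-oauth-token"    # placeholder token
SITE_URL = "https://www.example.com/"     # your verified Search Console property
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

# lastmod values pulled from your XML sitemap for the URLs you care about.
sitemap_lastmod = {
    "https://www.example.com/blog/refreshed-article/": "2023-03-08T08:00:00+00:00",
}

for url, lastmod in sitemap_lastmod.items():
    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"inspectionUrl": url, "siteUrl": SITE_URL},
        timeout=30,
    )
    resp.raise_for_status()
    status = resp.json()["inspectionResult"]["indexStatusResult"]

    last_crawl = status.get("lastCrawlTime")   # RFC 3339, e.g. "2023-03-09T02:11:00Z"
    if not last_crawl:
        print(url, "- not crawled yet")
        continue

    updated = datetime.fromisoformat(lastmod)
    crawled = datetime.fromisoformat(last_crawl.replace("Z", "+00:00"))
    print(url)
    print("  crawl efficacy: ", crawled - updated)
    print("  coverage state: ", status.get("coverageState"))  # feeds an indexing efficacy score
```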

If this is a long time period, what can we actually do about it? Well, search engines and their partners have been talking a lot in the last few years about how they're helping us as SEOs to crawl the web more efficiently. After all, this is in their best interests. From a search engine point of view, when they crawl us more effectively, they get our valuable content faster and they're able to show that to their audiences, the searchers.

It's also something where they can have a nice story because crawling puts a lot of weight on us and our environment. It causes a lot of greenhouse gases. So by making more efficient crawling, they're also actually helping the planet. This is another motivation why you should care about this as well. So they've spent a lot of effort in releasing APIs.

We've got two APIs. We've got the Google Indexing API and IndexNow. The Google Indexing API, Google said multiple times, "You can actually only use this if you have job posting or broadcast structured data on your website." Many, many people have tested this, and many, many people have proved that to be false.

You can use the Google Indexing API to crawl any type of content. But this is where this idea of crawl budget and maximizing the amount of crawling proves itself to be problematic because although you can get these URLs crawled with the Google Indexing API, if they do not have that structured data on the pages, it has no impact on indexing.
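
For context, a notification to the Google Indexing API is a single authenticated POST. A minimal sketch might look like the following, where the service-account file and URL are placeholders:

```python
# Sketch of a Google Indexing API notification (service-account file and URL are placeholders).
# Remember: Google only documents this API for job posting and broadcast/livestream pages.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical credential file
)
session = AuthorizedSession(credentials)

response = session.post(
    "https://indexing.googleapis.com/v3/urlNotifications:publish",
    json={"url": "https://www.example.com/jobs/senior-seo/", "type": "URL_UPDATED"},
)
print(response.status_code, response.json())
```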

So all of that crawling weight that you're putting on the server and all of that time you invested to integrate with the Google Indexing API is wasted. That is SEO effort you could have put somewhere else. So long story short, Google Indexing API, job postings, live videos, very good.

Everything else, not worth your time. Good. Let's move on to IndexNow. The biggest challenge with IndexNow is that Google doesn't use this API. Obviously, they've got their own. So that doesn't mean disregard it though.

Bing uses it, Yandex uses it, and a whole lot of SEO tools and CRMs and CDNs also utilize it. So, generally, if you're in one of these platforms and you see, oh, there's an indexing API, chances are that is going to be powered and going into IndexNow. The good thing about all of these integrations is it can be as simple as just toggling on a switch and you're integrated.
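
Under the hood, that switch is usually just sending a small JSON POST. Here's a minimal IndexNow sketch, with the host, key, and URLs as placeholders:

```python
# Minimal IndexNow sketch (host, key, and URLs are placeholders).
# The key must also be served as a text file at keyLocation so engines can verify ownership.
import requests

payload = {
    "host": "www.example.com",
    "key": "abc123yourindexnowkey",
    "keyLocation": "https://www.example.com/abc123yourindexnowkey.txt",
    "urlList": [
        "https://www.example.com/blog/refreshed-article/",
        "https://www.example.com/products/new-product/",
    ],
}

resp = requests.post("https://api.indexnow.org/indexnow", json=payload, timeout=30)
print(resp.status_code)  # acceptance of the submission, not a promise it will be crawled
```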

This might seem very tempting, very exciting, nice, easy SEO win, but caution, for three reasons. The first reason is your target audience. If you just toggle on that switch, you're going to be telling a search engine like Yandex, big Russian search engine, about all of your URLs.

Now, if your site is based in Russia, excellent thing to do. If your site is based somewhere else, maybe not a very good thing to do. You're going to be paying for all of that Yandex bot crawling on your server and not really reaching your target audience. Our job as SEOs is not to maximize the amount of crawling and weight on the server.

Our job is to reach, engage, and convert our target audiences. So if your target audiences aren't using Bing, they aren't using Yandex, really consider if this is something that's a good fit for your business. The second reason is implementation, particularly if you're using a tool. You're relying on that tool to have done a correct implementation with the indexing API.

So, for example, one of the CDNs that has done this integration does not send events when something has been created or updated or deleted. They rather send events every single time a URL is requested. What this means is that they're pinging to the IndexNow API a whole lot of URLs which are specifically blocked by robots.txt.

Or maybe they're pinging to the indexing API a whole bunch of URLs that are not SEO relevant, that you don't want search engines to know about, and they can't find through crawling links on your website, but all of a sudden, because you've just toggled it on, they now know these URLs exist, they're going to go and index them, and that can start impacting things like your Domain Authority.

That's going to be putting that unnecessary weight on your server. The last reason is does it actually improve efficacy, and this is something you must test for your own website if you feel that this is a good fit for your target audience. But from my own testing on my websites, what I learned is that when I toggle this on and when I measure the impact with KPIs that matter, crawl efficacy, indexing efficacy, it didn't actually help me to crawl URLs which would not have been crawled and indexed naturally.

So while it does trigger crawling, that crawling would have happened at the same rate whether IndexNow triggered it or not. So all of that effort that goes into integrating that API or testing if it's actually working the way that you want it to work with those tools, again, was a wasted opportunity cost. The last area where search engines will actually support us with crawling is in Google Search Console with manual submission.

This is actually one tool that is truly useful. It will trigger a crawl generally within around an hour, and that crawl does positively impact indexing in most cases, not all, but most. But of course, there is a challenge, and the challenge when it comes to manual submission is you're limited to 10 URLs within 24 hours.

Now, don't disregard it just because of that reason. If you've got 10 very highly valuable URLs and you're struggling to get those crawled, it's definitely worthwhile going in and doing that submission. You can also write a simple script where you can just click one button and it'll go and submit 10 URLs in that search console every single day for you.

But it does have its limitations. So, really, search engines are trying their best, but they're not going to solve this issue for us. So we really have to help ourselves. What are three things that you can do which will truly have a meaningful impact on your crawl efficacy and your indexing efficacy?

The first area where you should be focusing your attention is on XML sitemaps, making sure they're optimized. When I talk about optimized XML sitemaps, I'm talking about sitemaps which have a last modified date time, which updates as close as possible to the create or update time in the database. What a lot of your development teams will do naturally, because it makes sense for them, is to run this with a cron job, and they'll run that cron once a day.

So maybe you republish your article at 8:00 a.m. and they run the cron job at 11:00 p.m., and so you've got all of that time in between where Google or other search engine bots don't actually know you've updated that content because you haven't told them with the XML sitemap. So getting that actual event and the reported event in the XML sitemaps close together is really, really important.
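
As a rough sketch of what that looks like in practice, the sitemap can be generated straight from the content table's updated_at values rather than stamped by a nightly cron. The table, columns, and domain below are hypothetical:

```python
# Sketch: build <url> entries whose <lastmod> comes straight from the database's
# updated_at column, so the sitemap reflects updates as soon as they happen.
# Table, column, and domain names are hypothetical.
import sqlite3
from xml.etree import ElementTree as ET

conn = sqlite3.connect("cms.db")
rows = conn.execute("SELECT url, updated_at FROM articles WHERE indexable = 1").fetchall()

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url, updated_at in rows:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url
    ET.SubElement(entry, "lastmod").text = updated_at  # e.g. 2023-03-10T08:00:00+00:00

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```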

The second thing you can do is your internal links. So here I'm talking about all of your SEO-relevant internal links. Review your sitewide links. Have breadcrumbs on your mobile devices. It's not just for desktop. Make sure your SEO-relevant filters are crawlable. Make sure you've got related content links to be building up those silos.

This is something that you have to go into your phone, turn your JavaScript off, and then make sure that you can actually navigate those links without that JavaScript, because if you can't, Googlebot can't on the first wave of indexing, and if Googlebot can't on the first wave of indexing, that will negatively impact your indexing efficacy scores.
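
If you want to approximate that check at scale rather than on your phone, one option is to fetch the raw, unrendered HTML and list the links that exist before any JavaScript runs. This is only an approximation of what Googlebot sees on the first wave, and the URL is a placeholder:

```python
# Sketch: list the internal links present in the raw HTML (i.e. without JavaScript),
# which roughly approximates the first wave of indexing. URL is a placeholder.
import requests
from urllib.parse import urljoin, urlparse
from bs4 import BeautifulSoup

page = "https://www.example.com/category/freshwater-pearls/"
html = requests.get(page, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

internal_links = {
    urljoin(page, a["href"])
    for a in soup.find_all("a", href=True)
    if urlparse(urljoin(page, a["href"])).netloc == urlparse(page).netloc
}
for link in sorted(internal_links):
    print(link)
```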

Then the last thing you want to do is reduce the number of parameters, particularly tracking parameters. Now, I very much understand that you need something like UTM tag parameters so you can see where your email traffic is coming from, you can see where your social traffic is coming from, you can see where your push notification traffic is coming from, but there is no reason that those tracking URLs need to be crawlable by Googlebot.

They're actually going to harm you if Googlebot does crawl them, especially if you don't have the right indexing directives on them. So the first thing you can do is just make them not crawlable. Instead of using a question mark to start your string of UTM parameters, use a hash. It still tracks perfectly in Google Analytics, but it's not crawlable for Google or any other search engine.
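
A tiny sketch of the difference, with illustrative URLs; the point is simply that everything after the hash never creates a new crawlable URL:

```python
# Sketch: tracking parameters after "?" create a distinct crawlable URL;
# the same parameters after "#" stay client-side and never reach Googlebot.
base = "https://www.example.com/blog/refreshed-article/"
utm = "utm_source=newsletter&utm_medium=email&utm_campaign=march"

crawlable_duplicate = f"{base}?{utm}"  # what you want to avoid
fragment_tracked = f"{base}#{utm}"     # read client-side by your analytics setup, as described above

print(crawlable_duplicate)
print(fragment_tracked)
```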

If you want to geek out and keep learning more about crawling, please hit me up on Twitter. My handle is @jes_scholz. And I wish you a lovely rest of your day.

Video transcription by Speechpad.com

Wednesday, March 8, 2023

Diving for Pearls: A Guide to Long Tail Keywords - Next Level

Welcome to this refreshed installment of our educational Next Level series! Originally published in June 2016, this blog has been rewritten to include new tool screenshots and refreshed workflows. Together, we’ll uncover keywords in the vastness of the long tail.

Looking for more Next Level posts? Previously we explored how to create relevant and engaging SEO reports.

One of the biggest obstacles to driving forward your business online is being able to rank well for keywords that people are searching for. Getting your lovely URLs to show up in those precious top positions — and gaining a good portion of the visitors behind the searches — can feel like an impossible dream. Particularly if you’re working on a newish site on a modest budget within a competitive niche.

Well, strap yourself in, because today we’re going to live that dream. I’ll take you through the bronze, silver, and gold levels of finding, assessing, and targeting long tail keywords so you can start getting visitors to your site that are primed and ready to convert.

Quick steps to building a long tail keyword list:

  1. Draw from your industry and customer knowledge

  2. Add suggestions from Google Autocomplete

  3. Explore industry language on social media

  4. Pull relevant suggestions from a keyword tool

  5. Prioritize using popularity and difficulty metrics

  6. Understand the competitive landscape to pinpoint opportunities

What are long tail keywords?

The "long tail of search" refers to the many weird and wonderful ways the diverse people of the world search for any given niche.

People (yes, people! Shiny, happy, everyday, run-of-the-mill, muesli-eating, credit-card-swiping people!) rarely stop at broad and generic 'head' keywords, like “web design” or “camera” or “sailor moon.”

They clarify their search with emotional triggers and technical terms they’ve learned from reading forums, and they compare features and prices before mustering up the courage to commit and convert on your site.

The long tail is packed with searches like “best web designer in Nottingham” or “mirrorless camera 4k video 2016” or “sailor moon cat costume.”

This adaptation of the Search Demand Curve chart visualizes the long tail of search by using the tried and tested "Internet loves cats + animated gifs are the coolest = SUCCESS" formula.

The Search Demand Curve illustrates that while “head” and “body” terms typically amass higher search volume and seem appealing at first, the vastness of the “long tail” presents a more substantial opportunity and a larger percentage of search traffic that shouldn’t be ignored. You can really see this when the terms are combined as a percentage of search traffic. While this graph contains no cats, it is still entirely illustrative. And the long tail of search isn’t slowing down anytime soon: with voice search and AI integrations, we can expect the vastness of the long tail to continue to grow.


While search volume for any individual long tail keyword is typically lower, user intent is much more specific, and viewed as a group, targeting the long tail often enables you to reach a larger, more engaged audience. This is also beautifully illustrated in Dr. Pete’s infamous chunky thorax post.

The long tail of search is being constantly generated by people seeking answers from the Internet hive mind. There's no end to what you’ll find if you have a good old rummage about, including: Questions, styles, colors, brands, concerns, peeves, desires, hopes, dreams… and everything in between.

Fresh, new, outrageous, often bizarre keywords. If you’ve done any keyword research you’ll know what I mean by bizarre. Things a person wouldn’t admit to their best friend, therapist, or doctor, they’ll happily pump into Google and hit search. In this post we’re going to go diving for pearls: keywords with searcher intent, high demand, low competition, and a spot on the SERPs just for you.

Bronze medal: Build your own long tail keyword

It’s really easy to come up with a long tail keyword. You can use your brain, gather some thoughts, take a stab in the dark, and throw a few keyword modifiers around your ‘head’ keyword.

Have you ever played that magnetic fridge poetry game? It’s a bit like that. You can play online if (like me) you have an aversion to physical things.

I’m no poet, but I think I deserve a medal for this attempt, and now I really want some "hot seasonal berry water."

Magnetic poetry not doing it for you? Don’t worry — that’s only the beginning.

Use your industry knowledge

Time to draw on that valuable industry knowledge you’ve been storing up, jot down some ideas, and think about intent and common misconceptions. I’m going to use “pearls” or “freshwater pearls” as the head term in this post, because that’s something I’m interested in.

Let’s go! Let’s say I run a jewelry business and I know that my customers regularly have questions, like:

How do I clean freshwater pearls

Using my knowledge, I can rattle off ideas and build a keyword list.

Search your keyword

Use Google’s suggested search to get some more ideas. Manually enter your keyword into Google and prompt it to populate popular suggestions, like I’ve done below:

Awesome, I’m adding Freshwater pearls price to my list.
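
If you’d rather pull those suggestions programmatically, there’s an unofficial, undocumented suggest endpoint that returns the same autocomplete ideas. Treat it as best-effort only, since Google doesn’t support it:

```python
# Sketch: pull Google Autocomplete ideas from the unofficial suggest endpoint.
# This endpoint is undocumented and unsupported, so treat it as best-effort.
import requests

seed = "freshwater pearls"
resp = requests.get(
    "https://suggestqueries.google.com/complete/search",
    params={"client": "firefox", "q": seed},
    timeout=30,
)
seed_term, suggestions = resp.json()[:2]
for s in suggestions:
    print(s)  # e.g. "freshwater pearls price"
```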

Explore the language of social media

Get amongst the over-sharers and have a look at what people are chatting about on social media by searching your keyword on Twitter, TikTok, Instagram, and YouTube. These are topics in your niche that people are talking about right now.

YouTube is also pulling up some interesting ideas around my keyword. This is simultaneously helping me gather keyword ideas and giving me a good sense about what content is already out there. Don’t worry, we’ll touch on content later on in this post. :)

I’m adding “understanding types of pearls” and “difference between saltwater and freshwater pearls” to my list.

Ask keyword questions…?

You’ll probably notice that I’ve added a question mark to a phrase that is not a question, just to mess with you all. Apologies for the confusing internal-reading-voice-upwards-inflection.

Questions are my favorite types of keywords. What!? You don’t have a fav keyword type? Well, you do now — trust me.

Answer the Public is packed with questions radiating out from your seed term.

Pop freshwater pearls into the tool and grab some questions for our growing list.

To leave no rock unturned (or no mollusk unshucked), let’s pop over to Google Search Console to find keywords that are already sending you traffic (and discover any mismatches between your content and user intent).

Pile these into a list, like I've done in this spreadsheet.
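
If you’d rather pull that query data programmatically than export it by hand, the Search Console API’s Search Analytics method can return the queries already sending you impressions and clicks. This sketch assumes a service-account credential that has been granted access to the property; the file name, property, and dates are placeholders:

```python
# Sketch: pull the queries already sending traffic from the Search Console API.
# Assumes a service-account JSON file that has been granted access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # hypothetical credential file
)
service = build("searchconsole", "v1", credentials=creds)

report = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2023-01-01",
        "endDate": "2023-03-01",
        "dimensions": ["query"],
        "rowLimit": 250,
    },
).execute()

for row in report.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```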

Now this is starting to look interesting: we’ve got some keyword modifiers, some clear buying signals, and a better idea of what people might be looking for around "freshwater pearls."

Should you stop there? I’m flabbergasted — how can you even suggest that?! This is only the beginning. :)

Silver medal: Assess demand and explore topics

So far, so rosy. But we've been focusing on finding keywords, picking them up, and stashing them in our collection like colored glass at the seaside.

To really dig into the endless tail of your niche, you’ll need a keyword tool like our very own Keyword Explorer. This is invaluable for finding topics within your niche that present a real opportunity for your site.

If you’re trying out Keyword Explorer for the first time, you’ll have 10 free searches/mo with a free Moz Community account and even more with a Moz Pro free trial or paid subscription.

Find search volume for your head keyword

To start, enter a broad industry keyword. In my case, I’ll type “pearls” into the Keyword Explorer search box. Now you can see Moz’s Monthly Volume, displaying how often a term or phrase is searched for in Google:

Now try "freshwater pearls." As expected, the search volume goes down, but we’re getting more specific.

We could keep going like this, but we’re going to burn up all our free searches. Just take it as read that, as you get more specific and enter all the phrases we found earlier, the search volume will decrease even more. There may not be any data at all. That’s why you need to explore the searches around this main keyword.

Find even more long tail keywords

Below the search volume, click on "Keyword Suggestions."

Well, hi there, ever-expanding long tail! We’ve gone from a handful of keywords pulled together manually from different sources to 1,000 suggestions right there on your screen, with search volume positioned right next to them to give us an idea of demand.

The diversity of searches within your niche is just as important as that big number we saw at the beginning, because it shows you how much demand there is for this niche as a whole. We’re also learning more about searcher intent.

I’m scanning through those 1,000 suggestions and looking for other terms that pop up again and again. I’m also looking for signals and different ways the words are being used to pick out words to expand my list.

I like to toggle between sorting by Relevancy and search volume, and then scroll through all the results to cherry-pick those that catch my eye.

Now reverse the volume filter so that it’s showing lower-volume search terms and scroll down through the end of the tail to explore the lower-volume chatter.

If we don’t have tracked data in our database, you can always cross-reference with another data set to validate a keyword’s value.

This is where your industry knowledge comes into play again. Bots, formulas, spreadsheets, and algorithms are all well and good, but don’t discount your own instincts and knowledge.

Use the suggestion filters to your advantage and play around with broader or more specific suggestion types.

Looking through the suggestions, I’ve noticed that the word “cultured” has popped up a few times.

To see these all bundled together, I want to look at the grouping options in Keyword Explorer. I like grouping by high lexical similarity so I can see how much discussion is going on within my topics.

Scroll down and expand that group to get an idea of demand and assess intent.

I’m also interested in the words around “price” and “value,” so I’m doing the same and saving those to my sheet, along with the search volume. A few attempts at researching the “cleaning” of pearls weren’t very fruitful, so I’ve adjusted my search to “clean freshwater pearls.”

Because I’m a keyword questions fanatic, I’m also going to filter by questions (the bottom option from the drop-down menu):

OK! How is our keyword list looking? Pretty darn hot, I reckon! We’ve gathered together a list of keywords and dug into the long tail of these sub-niches, and right alongside we’ve got search volume.

You’ll notice that some of the keywords I discovered in the bronze stage don’t have any data showing up in Keyword Explorer (indicated by the hyphen in the screenshot above). That’s ok — they’re still topics I can research further. This is exactly why we have assessed demand; no wild goose chase for us!

Ok, we’re drawing some conclusions, we’re building our list, and we’re making educated decisions. Congrats on your silver-level keyword wizardry! :D

Gold medal: Find out who you’re competing with

We’re not operating in a vacuum. There’s always someone out there trying to elbow their way onto the first page. Don’t fall into the trap of thinking that just because it’s a long tail term with a nice chunk of search volume all those clicks will rain down on you. If the terms you’re looking to target already have big names headlining, this could very well alter your roadmap.

To reap the rewards of targeting the long tail, you’ll have to make sure you can outperform your competition.

Manually check the SERPs

Check out who’s showing up in the search engine results pages (SERPs) by running a search for your head term. Make sure you’re signed out of Google and in an incognito tab.

We’re focusing on the organic results to find out if there are any weaker URLs you can pick off.

I’ll start with “freshwater pearls” for illustrative purposes.

Whoooaaa, this is a noisy page. I’ve had to scroll a whole 2.5cm on my magic mouse (that’s very nearly a whole inch for the imperialists among us) just to see any organic results.

Let’s install MozBar to discover some metrics on the fly, like Domain Authority and backlink data.

Now, if seeing those big players in the SERPs doesn’t make it clear, looking at the MozBar metrics certainly does. This is exclusive real estate. It’s dominated by retailers, although Wikipedia gets a place in the middle of the page.

Let’s get into the mind of Google for a second. It — or should I say "they" (I can’t decide if it’s more creepy for Google to be referred to as a singular or plural pronoun. Let’s go with "they") — anyway, I digress. "They" are guessing that we’re looking to buy pearls, but they're also offering results on what they are.

This sort of information is offered up by big retailers who have created content that targets the intention of searchers. Mikimoto drives us to their blog post all about where freshwater pearls come from.

As you get deeper into the long tail of your niche, you’ll begin to come across sites you might not be so familiar with. So go and have a peek at their content.

With a little bit of snooping you can easily find out:

  • how relevant the article is

  • if it looks appealing, up to date, and sharable

  • be really judge-y: why not?

Now let’s find some more:

  • when the article was published

  • when their site was created

  • how often their blog is updated

  • how many other sites are linking to the page with Link Explorer

  • how many tweets, likes, etc.

Learn more about how to do a competitor analysis in our free guide, and don’t forget to download the handy worksheet.

Document all of your findings in our spreadsheet from earlier to keep track of the data. This information will now inform you of your chances of ranking for that term.

Manually checking out your competition is something that I would strongly recommend. But we don’t have all the time in the world to check each results page for each keyword we’re interested in.

Keyword Explorer leaps to our rescue again

Run your search and click on "SERP Analysis" to see what the first page looks like, along with authority metrics and social activity.


All the metrics for the organic results, like Page Authority, go into calculating the Difficulty score above (lower is better).

And all those other factors — the ads and suggestions taking up space on the SERPs — that's what's used to calculate Organic CTR (higher is better).

Priority is all the other metrics tallied up. You definitely want this to be higher.

So now we have 3 important numerical values we can use to gauge our competition. We can use these values to compare keywords.

After a few searches in Keyword Explorer, you’re going to start hankering for a keyword list or two. For this you’ll need a paid subscription, or a Moz Pro 30-day free trial.

It’s well worth the sign-up; not only do you get 5,000 keyword queries per month and 30 lists (on the Medium plan), but you also get to check out the super-magical-KWE-mega-list-funky-cool metric page. That’s what I call it, just rolls off the tongue, you know?

Okay, fellow list buddies, let’s go and add those terms we’re interested in to our lovely new list.

Then head up to your lists on the top right and open up the one you just created.

Now we can see the spread of demand, competition and SERP features for our whole list.

You can compare Volume, SERP Features, Difficulty, Organic CTR, and Priority across multiple lists, topics, and niches.

How to compare apples with apples

Comparing keywords is something our support team gets questions about all the time.

Should I target this word or that word?

For the long tail keyword, the Volume is a lot lower, Difficulty is also down, the Organic CTR is a bit up, and overall the Priority is down because of the drop in search volume.

But don’t discount it! By targeting these sorts of terms, you’re focusing more on the intent of the searcher. You’re also making your content relevant for all the other neighboring search terms.

Let’s compare “difference between freshwater and cultured pearls” with “how much are freshwater pearls worth.”

Search volume is the same, but for the keyword “how much are freshwater pearls worth,” Difficulty is up, and so is the overall Priority, because the Organic CTR is higher.

But just because you’re picking between two long tail keywords doesn’t mean you’ve fully understood the long tail of search.

You know all those keywords I grabbed for my list earlier in this post? Well, here they are sorted into topics.

Look at all the different ways people search for the same thing. This is what drives the long tail of search — searcher diversity. If you tally all the volume up for the cultured topic, we’ve got a bigger group of keywords and overall more search volume. This is where you can use Keyword Explorer and the long tail to make informed decisions.
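
A quick way to do that tally from your own spreadsheet is to group the keywords by topic and sum the volume. The column names below are placeholders for whatever your sheet actually uses:

```python
# Sketch: tally search volume per topic group from the keyword spreadsheet.
# File and column names ("Topic", "Volume") are placeholders for your own sheet.
import pandas as pd

keywords = pd.read_csv("long-tail-keywords.csv")
topic_volume = (
    keywords.groupby("Topic")["Volume"]
    .sum()
    .sort_values(ascending=False)
)
print(topic_volume)  # e.g. the "cultured" topic summing to more volume than any single keyword
```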

You’re laying out your virtual welcome mat for all the potential traffic these terms send.

Platinum level: I lied — there's one more level!

For all you lovely overachievers out there who have reached the end of this post, I’m going to reward you with one final tip.

You’ve done all the snooping around on your competitors, so you know who you’re up against. You’ve done the research, so you know what keywords to target to begin driving intent-rich traffic.

Now you need to create strong, consistent, and outstanding content.

As Dr. Pete confirmed:

We don’t have to work ourselves to death to target the long tail of search. It doesn’t take 10,000 pieces of content to rank for 10,000 variants of a phrase, and Google (and our visitors) would much prefer we not spin out that content. The new, post-NLP (Natural Language Processing) long tail of SEO requires us to understand how our keywords fit into semantic space, mapping their relationships and covering the core concepts. Study your SERPs diligently, and you can find the patterns to turn your own long tail of keywords into a chonky thorax of opportunity.

Here's where you really have to tip your hat to long tail keywords, because by strategically targeting the long tail you can start to build enough authority in the industry to beat stronger competition and rank higher for more competitive keywords in your niche.

Wrapping up…

The keyword phrases that make up the long tail of your industry are vast in number, often easier to rank for, and indicative of stronger intent from the searcher. By targeting them you’ll find you can start to rank for relevant phrases sooner than if you just targeted the head. And over time, if you get the right signals, you’ll be able to rank for keywords with tougher competition. Pretty sweet, huh? Give Moz’s Keyword Explorer tool a whirl and let me know how you get on :)