Thursday, January 28, 2021

Finding Keyword Opportunities Without Historical Data

Posted by Imogen_Davies

At Google’s Search On event in October last year, Prabhakar Raghavan explained that 15% of daily queries are ones that have never been searched before. If we take the latest figures from Internet Live Stats, which state 3.5 billion queries are searched every day, that means that 525 million of those queries are brand new.

That is a huge number of opportunities waiting to be identified and worked into strategies, optimization, and content plans. The trouble is, all of the usual keyword research tools are, at best, a month behind with the data they can provide. Even then, the volumes they report need to be taken with a grain of salt – you’re telling me there are only 140 searches per month for “women’s discount designer clothing”? – and if you work in B2B industries, search volumes are generally much smaller to begin with.

So, we know there are huge amounts of searches available, with more and more being added every day, but without the data to see volumes, how do we know what we should be working into strategies? And how do we find these opportunities in the first place?

Finding the opportunities

The usual tools we turn to aren’t going to be much use for keywords and topics that haven’t been searched in volume previously. So, we need to get a little creative — both in where we look, and in how we identify the potential of queries in order to start prioritizing and working them into strategies. This means doing things like:

  1. Mining People Also Ask
  2. Scraping autosuggest
  3. Drilling into related keyword themes

Mining People Also Ask

People Also Ask is a great place to start looking for new keywords, and tends to be more up to date than the various tools you would normally use for research. The trap most marketers fall into is looking at this data on a small scale, realizing that (being longer-tail terms) they don’t have much volume, and discounting them from approaches. But when you follow a larger-scale process, you can get much more information about the themes and topics that users are searching for and can start plotting this over time to see emerging topics faster than you would from standard tools.

To mine PAA features, you need to:

1. Start with a seed list of keywords.

2. Use SerpAPI to run your keywords through its Google search API – you can test queries in their demo interface first.

3. Export the “related questions” features returned in the API call and map them to overall topics using a spreadsheet.

4. Export the “related searches” boxes and map these to overall topics as well.

5. Look for consistent themes in the topics being returned across related questions and searches.

6. Pull those themes out as areas to explore further. For example, we can see coffee + health is a consistent topic area, so it becomes an overall theme to dig into through advanced search parameters and modifiers.

7. Add these as seed terms to your preferred research tool to pull out related queries, using broad match (+coffee health) and phrase match (“coffee health”) modifiers to return more relevant suggestions.


This then gives you a set of additional “suggested queries” to broaden your search (e.g. coffee benefits) as well as related keyword ideas you can explore further.
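
If you want to run step 2 at scale rather than through the demo interface, here’s a minimal sketch using SerpAPI’s Python client (the google-search-results package). The API key is a placeholder, and the field names (related_questions, related_searches) follow SerpAPI’s documented Google engine response – verify them against the current docs before relying on this:

# pip install google-search-results
import csv
from serpapi import GoogleSearch

SEED_KEYWORDS = ["coffee", "coffee health"]
API_KEY = "YOUR_SERPAPI_KEY"  # placeholder - use your own key

rows = []
for keyword in SEED_KEYWORDS:
    results = GoogleSearch({
        "engine": "google",
        "q": keyword,
        "gl": "uk",  # swap to "us" to compare locations
        "hl": "en",
        "api_key": API_KEY,
    }).get_dict()

    # "People Also Ask" questions returned for this seed term
    for paa in results.get("related_questions", []):
        rows.append({"seed": keyword, "type": "question", "text": paa.get("question", "")})

    # Related searches shown on the SERP
    for rel in results.get("related_searches", []):
        rows.append({"seed": keyword, "type": "related_search", "text": rel.get("query", "")})

# Export for topic mapping in a spreadsheet (steps 3 and 4)
with open("paa_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["seed", "type", "text"])
    writer.writeheader()
    writer.writerows(rows)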

This is also a great place to start for identifying differences in search queries by location. If you want to see the different topics people are searching for in the UK vs. the US, for example, SerpAPI allows you to do that at a larger scale.

If you’re looking to do this on a smaller scale, or without the need to set up an API, you can also use this really handy tool from Candour – Also Asked – which pulls out the related questions for a broad topic and allows you to save the data as a .csv or an image for quick review.


Once you’ve identified all of the topics people are searching for, you can start drilling into new keyword opportunities around them and assess how they change over time. Many of these opportunities don’t have swathes of historical data reported in the usual research tools, but we know that people are searching for them and can use them to inform future content topics as well as immediate keyword opportunities.

You can also track these People Also Ask features to identify when your competitors are appearing in them, and get a better idea of how they’re changing their strategies over time and what kind of content and keywords they might also be targeting. At Found, we use our bespoke SERP Real Estate tool to do just that (and much more) so we can spot these opportunities quickly and work them into our approaches.

Scraping autosuggest

This one doesn’t need an API, but you’ll need to be careful with how frequently you use it, so you don’t start triggering the dreaded captchas.

Similar to People Also Ask, you can scrape the autosuggest queries from Google to quickly identify related searches people are entering. This tends to work better on a small scale, just because of the manual process behind it. You can try setting up a crawl with various parameters entered and a custom extraction, but Google will be pretty quick to pick up on what you’re doing.

To scrape autosuggest, you use a very simple URL query string:

https://suggestqueries.google.com/complete/search?output=toolbar&hl=&gl=uk&q=

Okay, it doesn’t look that simple, but it’s essentially a search query that outputs all of the suggested queries for your seed query.

So, if you were to enter “cyber security” after the “q=”, you would get back an XML list of suggestions.

This gives you the most common suggested queries for your seed term. Not only is this a goldmine for identifying additional queries, but it can show some of the newer queries that have started trending, as well as information related to those queries that the usual tools won’t provide data for.

For example, if you want to know what people are searching for related to COVID-19, you can’t get that data in Keyword Planner or most tools that utilize its platform, because of the advertising restrictions around it. But if you add it to the suggest queries string, you can see what people are actually searching for around the topic.

This can give you a starting point for new queries to cover without relying on historical volume. And it doesn’t just give you suggestions for broad topics – you can add whatever query you want and see what related suggestions are returned.

If you want to take this to another level, you can change the location settings in the query string, so instead of “gl=uk” you can use “gl=us” and see the suggested queries from the US. This then opens up another opportunity to look for differences in search behavior across different locations, and to start identifying differences in the type of content you should be focusing on in different regions — particularly if you’re working on international websites or targeting international audiences.
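
As a rough sketch, here’s how you might script this in Python. The output=toolbar endpoint returns XML with CompleteSuggestion/suggestion elements – that structure is undocumented and could change, so treat it as an assumption, and keep a delay between requests to avoid the captchas mentioned above:

# pip install requests
import time
import xml.etree.ElementTree as ET
import requests

def fetch_suggestions(query, country="uk", lang="en"):
    """Return Google's autosuggest queries for a seed query."""
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"output": "toolbar", "hl": lang, "gl": country, "q": query},
        timeout=10,
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    # Each CompleteSuggestion element holds one suggested query
    return [s.attrib["data"] for s in root.iter("suggestion")]

for seed in ["cyber security", "covid-19"]:
    print(seed, "->", fetch_suggestions(seed, country="us"))
    time.sleep(2)  # be polite; frequent requests trigger captchas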

Refining topic research

Although the usual tools won’t give you that much information on brand new queries, they can be a goldmine for identifying additional opportunities around a topic. So, if you have mined the PAA feature, scraped autosuggest, and grouped all of your new opportunities into topics and themes, you can enter these identified “topics” as seed terms to most keyword tools.

Google Ads Keyword Planner

Currently in beta, Google Ads now offers a “Refine keywords” feature as part of their Keyword Ideas tool, which is great for identifying keywords related to an overarching topic.

Below is an example of the types of keywords returned for a “coffee” search. Here we can see the keyword ideas have been grouped into:

  • Brand or Non-Brand – keywords relating to specific companies
  • Drink – types of coffee, e.g. espresso, iced coffee, brewed coffee
  • Product – capsules, pods, instant, ground
  • Method – e.g. cold brew, French press, drip coffee

These topic groupings are fantastic for finding additional areas to explore. You can either:

  1. Start here with an overarching topic to identify related terms, and then go through the PAA/autosuggest identification process.
  2. Start with the PAA/autosuggest identification process, and put your new topics into Keyword Planner.

Whichever way you go about it, I’d recommend doing a few runs so you can get as many new ideas as possible. Once you’ve identified the topics, run them through the refine keywords beta to pull out more related topics, then run them through the PAA/autosuggest process to get more topics, and repeat a few times depending on how many areas you want to explore or how in-depth your research needs to be. A scripted version of this loop is sketched below.
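
For illustration, here’s a minimal sketch of that iterative expansion, reusing the fetch_suggestions helper (and the time import) from the autosuggest sketch above. The round count and seed list are arbitrary assumptions:

def expand_seeds(seeds, rounds=3, country="uk"):
    """Iteratively expand a seed list via autosuggest, a few rounds deep."""
    known = set(seeds)
    frontier = list(seeds)
    for _ in range(rounds):
        next_frontier = []
        for seed in frontier:
            for suggestion in fetch_suggestions(seed, country=country):
                if suggestion not in known:
                    known.add(suggestion)
                    next_frontier.append(suggestion)
            time.sleep(2)  # again, avoid hammering the endpoint
        frontier = next_frontier
    return sorted(known)

print(expand_seeds(["coffee health"], rounds=2))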

Google Trends

Trends data is one of the most up-to-date sets you can look at for topics and specific queries. However, it is worth noting that for some topics, it doesn’t hold any data, so you might run into problems with more niche areas.

Using “travel ban” as an example, we can see the trends in searches as well as related topics and specific related queries.


Now, for new opportunities, you aren’t going to find a huge amount of data, but if you’ve grouped your opportunities into overarching topics and themes, you’ll be able to find some additional opportunities from the “Related topics” and “Related queries” sections.

In the example above we see these sections include specific locations and specific mentions of coronavirus – something that Keyword Planner won’t provide data on as you can’t bid on it.

Drilling into the different related topics and queries here will give you a bit more insight into additional areas to explore that you may not have otherwise been able to identify (or validate) through other Google platforms.
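
If you’d rather pull this data programmatically, the unofficial pytrends library wraps Google Trends. It isn’t an official API and can break when Google changes the site, so treat this as a sketch:

# pip install pytrends
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["travel ban"], timeframe="today 12-m", geo="US")

# Dicts keyed by term, each holding "top" and "rising" DataFrames
related_queries = pytrends.related_queries()
related_topics = pytrends.related_topics()

print(related_queries["travel ban"]["rising"])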

Moz Keyword Explorer

The Moz interface is a great starting point for validating keyword opportunities, as well as identifying what’s currently appearing in the SERPs for those terms. For example, a search for “london theatre” returns the following breakdown:


From here, you can drill into the keyword suggestions and start grouping them into themes, as well as reviewing the current SERP to see what kind of content is appearing. This is particularly useful when it comes to understanding the intent behind the terms, to make sure you’re looking at the opportunities from the right angle – if far more ticket sellers are showing than news and guides, for example, then you want to focus these opportunities on commercial pages rather than informational content.

Other tools

There are a variety of other tools you can use to further refine your keyword topics and identify new related ideas, including the likes of SEMrush, Ahrefs, Answer The Public, Ubersuggest, and Sistrix, all of which offer relatively similar methods of refinement.

The key is identifying the opportunities you want to explore further, looking through the PAA and autosuggest queries, grouping them into themes, and then drilling into those themes.

Keyword research is an ever-evolving process, and the ways in which you can find opportunities are always changing, so how do you then start planning these new opportunities into strategies?

Forming a plan

Once you’ve got all of the data, you need to be able to formalize it into a plan to know when to start creating content, when to optimize pages, and when to put them on the back burner for a later date.

A quick (and consistent) way you can easily plot these new opportunities into your existing plans and strategies is to follow this process:

  1. Identify new searches and group them into themes.
  2. Monitor changes in new searches. Run the exercise once a month to see how much they change over time.
  3. Plot trends in changes alongside industry developments. Was there an event that changed what people were searching for?
  4. Group the opportunities into actions: create, update, optimize.
  5. Group the opportunities into time-based categories: topical, interest, evergreen, growing, etc.
  6. Plot timeframes around the content pieces. Anything topical gets moved to the top of the list, growing themes can be plotted in around them, interest-based pieces can be slotted in throughout the year, and evergreen pieces can be turned into more hero-style content.
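
As a toy illustration of steps 4–6, you could tag each opportunity with its action and time-based category and sort by urgency. The priority ordering below just encodes the sequencing described above (topical first, then growing, interest, evergreen):

opportunities = [
    {"theme": "coffee + health", "action": "create", "category": "growing"},
    {"theme": "covid-19 travel", "action": "update", "category": "topical"},
    {"theme": "brewing methods", "action": "optimize", "category": "evergreen"},
]

# Topical first, then growing, interest, evergreen
PRIORITY = {"topical": 0, "growing": 1, "interest": 2, "evergreen": 3}

plan = sorted(opportunities, key=lambda o: PRIORITY[o["category"]])
for item in plan:
    print(f'{item["category"]:>9}: {item["action"]} content for "{item["theme"]}"')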

Then you end up with a plan that covers:

  • All of your planned content.
  • All of your existing content and any updates you might want to make to include the new opportunities.
  • A revised optimization approach to work in new keywords on existing landing pages.
  • A revised FAQ structure to answer queries people are searching for (before your competitors do).
  • Developing themes of content for hubs and category page expansion.

Conclusion

Finding new keyword opportunities is imperative to staying ahead of the competition. New keywords mean new ways of searching, new information your audience needs, and new requirements to meet. With the processes outlined above, you’ll be able to keep on top of these emerging topics to plan your strategies and priorities around them. The world of search will always change, but the needs of your audience — and what they are searching for — should always be at the center of your plans.



Wednesday, January 27, 2021

How We Increased Our Client’s Leads by 751% on Less Than £1K Per Month [Case Study]

Posted by LydiaGerman

It’s a common misunderstanding that working with a small budget for SEO means you can’t generate results. How can you possibly make enough improvements to the site in so few hours per month?

Well, for us at Tao Digital Marketing, our work with Fleetcover goes to show that results can be achieved by focusing on the most important changes in the little time you have.

In this case study, we’ll break down how we increased leads by 751%, keywords by 259%, and impressions by 535% on a budget of less than £1,000 / $1,347 per month, equating to about one day’s work each month. That’s a small spend for SEO, but making the right changes at the right time, and focusing our efforts on the most important aspects, generated these positive results.

Objectives

Our objectives were similar to what every website ultimately wants to achieve: generate leads for the business and increase online visibility for relevant search terms.

To be a little more specific: we picked this client up in March 2019, though results only really started to pick up from November 2019, as Google began crawling the site more regularly.

Our targets/KPIs for the next 12 months were based on numbers from April-November 2019, as below:

  • Increase leads from 175 to 500
  • Install a new chat function on the site and gain 50 leads through it
  • Increase site clicks from 2,200 to 5,000
  • Increase keywords ranked for from 229 to 500

The target audience was businesses that need fleet insurance. This spans a wide range of industries, from those operating coaches and taxis through to motor trade.

Our strategy focused on technical SEO and content creation. There was one big issue, though: we didn’t build the site ourselves, nor did we have the level of access that would allow us to make any design or fundamental changes that could support SEO and lead generation. In turn, our strategy had to be heavily content-driven.

Our strategy

1. Add a chat function

In November 2019, we added the Tawk.to chat function to the site, which has helped generate leads. After analyzing when their audience was visiting the website, we found that most users were on the site late at night and on weekends.

With their team being out of the office and unable to answer any phone calls during these timeframes, we thought it would be of value to offer an online chat function to help capture inquiries so potential customers wouldn’t be put off or frustrated! This would put them at an advantage compared to their competitors who were not doing this.

We implemented the bot so it appears on the tab as a message notification, drawing people’s attention to the page even when it isn’t the active tab. So far, 330 inquiries have been made through this function.


Fleetcover Chat Bot

2. Implement technical SEO

Tweaks that support technical SEO are perhaps some of the most important changes you can make to see real results. We implemented this by:

  • Optimizing page titles
  • Creating meta descriptions between 100 and 155 characters, using keywords that fit naturally
  • Using the optimal image sizes each page required
  • Using alt text for images
  • Implementing internal and external links where possible
  • Utilizing FAQ schema on the more frequently searched questions (see the sketch after this list)
  • Optimizing the sitemap by getting rid of URLs that wouldn't support organic search
  • Using the robots.txt file to point search crawlers in the right direction
  • Creating 301 redirects. There were a number of outdated pages as well as 404 errors that needed to be addressed
  • Making usability tweaks to the design. We were very limited in what we could achieve on the site, as the incumbent developers were not massively helpful in terms of the access they would give us. We were able to get around this in certain areas, an example being the ‘Get a Quote’ buttons. We had a feeling user metrics mattered in this competitive market, so we did our utmost to capitalize on this.
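
For anyone who hasn’t implemented FAQ schema before, here’s a minimal sketch of the standard schema.org FAQPage JSON-LD structure, generated with Python for convenience. The question and answer text are placeholders, not Fleetcover’s actual FAQs:

import json

faqs = [
    ("What is fleet insurance?",
     "A single policy that covers multiple vehicles owned by one business."),
    ("How many vehicles do I need for a fleet policy?",
     "Typically two or more, though insurers' minimums vary."),
]

schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# Paste the output into the page inside a script tag:
# <script type="application/ld+json"> ... </script>
print(json.dumps(schema, indent=2))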

3. Optimize the “Get a Quote” form

We added heat mapping and anonymized visitor recording to the site. When we analyzed the data, it became very apparent that many people weren’t filling out the “Get a Quote” form due to it being too long — like standing at the bottom of a mountain, trying to work out the right route to the top! The original form had almost 10 questions, which overwhelmed users and resulted in low conversion rates.

Step one of Fleetcover Quote form
Step two of Fleetcover Quote form

We’ve had great success using multi-step forms on other clients’ sites, so we decided to create one for Fleetcover. We kept all the questions needed to provide a full quotation, but split them across easier-to-digest tabs with custom-designed icons rather than just text.

Our new form was built creatively and had four steps, making the process easier. With this change alone, leads from the form grew from 175 before November 2019 to 1,489 over the past 12 months (751% increase).

4. Focus heavily on content creation

Example of Fleetcover service page (HGV fleet insurance)

Service pages

Content creation is an area where we really got the chance to demonstrate creative flair alongside data analysis. We started by reviewing Fleetcover’s service pages, and fleshed out the content to make it more engaging.

Example of Fleetcover service page (FAQs)

Keyword research and search intent

Over time, we continued to research keywords, focusing heavily on understanding the search intent behind them, and creating detailed content and FAQs to meet the audience’s needs and Google's understanding of those intents.

One topic we’ve been focusing on is the rise of electric vehicles and how this will grow and affect the insurance industry. As the development and popularity of these vehicles progresses, we’re going to look at how we can use this in our content strategy.

Formatting and style

Including clear, natural CTAs at the end of each piece was really important, not only to round out the articles, but also to encourage readers to use Fleetcover’s broker service. See an example from our piece about business car insurance below.

In addition, utilizing a simple but effective tone of voice helped to meet the needs of potential consumers and give them the information they need in a straightforward way. When focusing on keywords/phrases that contain industry jargon, we always include information about what the word or phrase means for those with informational intent about a particular topic, for example ‘fleet breakdown cover’.

Results

Sales

We achieved the goal of gaining more sales, as website conversion rates jumped from 3% to 14%, and leads increased from 175 to 1,489 (751%). This massive increase (pleasantly) surprised us as we are working with a site with a domain authority of 22 in a competitive industry, so to achieve these results so quickly was a great boost for both ourselves and Fleetcover.

Fleetcover was previously spending a considerable amount on purchasing leads from other companies, whereas now they have invested into SEO, which has significantly increased the number of leads they generate. With SEO, these leads are of a higher quality than PPC leads, and are therefore more likely to use their services. There is little need for Fleetcover to purchase leads now, as the business is becoming its own profitable arm of Walmsleys Insurance Brokers.

Rankings

We’ve helped Fleetcover gain online visibility for keywords such as “fleet insurance brokers” (#1) and “fleet insurance quote”, which has moved from #10 to #2. Their positioning for “fleet breakdown cover” has also moved from #15 to #4. The main benefit of these ranking improvements is the huge increase in traffic!

We also gained top spot for the main keyword of “fleet insurance”, but this has since been taken by one of the juggernauts (excuse the pun) of the industry. We’ll be back, but for now, domain authority reigned supreme.

In April 2019, Fleetcover was only ranking for 229 keywords, and they now rank for 824, a 259% increase.

Traffic

As mentioned, we saw results beginning in November as Google crawled the site more actively and found more relevant content. Therefore, April - November 2019 is our “before” comparison for what we’ve managed to achieve over the past 12 months:

April - November 2019:

  • Impressions: 296,000
  • Clicks: 2,220

November 2019 - November 2020:

  • Impressions: 1,880,000 (up 535%)
  • Clicks: 6,470 (up 194%)

Thanks to exceeding our KPI goals by such a margin, we were shortlisted for three SEO awards this year, and Fleetcover’s CEO had only good things to say:

“For years we’ve been looking for a company to do exactly what you have done and I can honestly say in 12 years of being involved in marketing, this is the first time that any marketing company has proactively gone ahead and done something for us in this way. I’ve whinged about it for so long that it made my day when it dropped in my inbox. Really chuffed.”

Well, that just speaks for itself, doesn’t it?



Tuesday, January 26, 2021

Study Confirms Moz Has the Largest Link Data Set

Posted by CassandraLeeanne

Backlinks continue to be especially valuable for SEO, acting as a signal to search engines that vouches for both the value of your content and the worthiness of your website.

In measuring the success of your backlink efforts and informing your future SEO priorities, the data matters. Depending on the tool you use, your results can vary greatly and — in this case — the more data the better.

With this in mind, we’re excited to share that Moz has the largest link data set, according to a study done by Perficient.

Methodology

The link index study was done by Perficient, a respected, publicly traded consulting firm specializing in digital solutions. This award-winning firm regularly publishes insightful studies focused on SEO trends and topics. For this study, Eric Enge, Principal at Perficient, Search Engine Land author, and well-known SEO subject matter expert, led the investigation.

The study compared 3,000 search queries across the Technology, Health, and Finance market sectors to evaluate the link indexes of Moz, Ahrefs, SEMrush, and Majestic. The queries were evenly split across the sectors, and each was used to pull the top 100 domains from Google’s results. This produced a large sample of 85,308 domains. Perficient then used the APIs for all four link indexes to evaluate the results, and performed manual checks to validate them.

Full disclosure: Moz financed the study, but Perficient conducted it independently — the conclusions are 100% their own, with no influence from Moz.

"The link graph — the sum total of all of the links that connect the pages on the worldwide web — is the foundation of PageRank and the original Google algorithm. While a lot has changed in 20 years, high-authority links are still a driving force in how Google values and surfaces content. Our goal is to reveal as much of the link graph as possible to understand how Google sees the web and rewards authoritative content, while at the same time helping people focus on the highest quality and most actionable links." — Dr. Pete Meyers

Results

The Moz link index reported approximately 90% more links than Majestic, the second-largest index. Moz reported the most links per domain 72% of the time.

In link research, size matters. Whether you’re launching a link building campaign, performing competitive analysis, or creating a Google disavow file, finding the most complete set of links to any URL or domain directly impacts the quality of your work as an SEO.

Moz reported the most linking domains 60% of the time.



When performing link research, the number of raw links isn't always as important as the number of unique linking domains. A tool that reports a million links, but all from the same domain, isn't nearly as valuable as a tool that reports 1,000 links from 100 different domains. Links often repeat across the same domain (think footer or sidebar links), so finding the most unique linking domains — as opposed to raw link counts — is often a more useful metric for SEOs.

Moz narrowly trailed Majestic for the lowest percentage of duplicate links, coming in just 0.2% behind.

Link counts can be inflated when tools report duplicate links. For example, some tools might report a link found with both HTTP:// and HTTPS:// as two separate links, even though one canonicals to the other. A lower percentage of duplicate links can indicate better data quality.

This is the second study in recent months to determine that Moz link data stands above the rest. In October of 2020, Search Engine Land compared eight SEO tools, and also noted that Moz reports a significantly higher number of linking domains.

Reviews

Perhaps just as important as the breadth of the data is that Moz tools make managing SEO easier, as these reviews mention:

Carly Schoonhoven, Senior SEO Manager at Obility: “Moz makes my life easier in so many ways. When doing link building, particularly, I absolutely love the link intersect tool. It’s a really great way to find linking opportunities quickly without having to put in a lot of extra effort. I also love that, you know, any data you need is really only just a couple of clicks away, whether it be site errors or backlinks or keywords. It just really is intuitive and makes finding data really fast.”

Kristina Kledzik, SEO Manager at Rover: “Moz has been a critical part of my link building strategies and competitive analysis for five years now, through agency work and in-house SEO. It’s easy to use, easy to understand, and adds an extra layer of information to every website you visit. Definitely a must have for any SEO.”

Lily Ray, SEO Director at Path Interactive: “I love using Moz Link Explorer to see how potential clients are doing compared to their competitors. You can get as granular as looking at the individual URL level, which is really helpful.”

Try Link Explorer Today


Monday, January 25, 2021

Outranking Tough Competitors: My One-Year Study of a Google Local Finder

Posted by MiriamEllis

Image credit: Sharon Mollerus

A new client comes to your digital marketing agency and says their competitors are stuck to the local packs like mussels cleaved to coastal rock.

“How do we edge our way up in Google’s local finder, and find our place above the tideline? We don’t even know where to begin,” the local business owner says.

The rough truth is that Google’s local search engine results often don’t make sense at first glance, or even at second or third glances. Local brands are left to puzzle out how to achieve maximum growth when they’re consistently being outranked by sticky competition for their core search phrases.

Methodology

About a year ago, I decided to run a study in which I’d track a local finder for a single query — “breakfast (X city)” — to see if anything brands or the public did over the course of 12 months would shift the top eatery out of its #1 spot. I chose a small SF Bay Area city where I’m not physically located (to remove the influence of proximity from the mix) and repeated the same query across time to see what we might learn from trying to explain the results at the end of the test period. I did my searches manually and tracked them in a spreadsheet.
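
If you’d like to replicate this without hand-entering a spreadsheet, a small sketch like the one below could log each manual check to a CSV for later charting. The file layout and brand names are just one assumed way to structure it:

import csv
from datetime import date

def log_positions(path, positions):
    """Append this month's local finder positions (rank -> brand) to a CSV."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for rank, brand in enumerate(positions, start=1):
            writer.writerow([date.today().isoformat(), rank, brand])

# One manual check: the top 10 brands, in order, anonymized
log_positions("local_finder.csv", [
    "Tansy", "Brand2", "Brand3", "Brand4", "Lovage",
    "Brand6", "Brand7", "Brand8", "Brand9", "Rosemary",
])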

My anonymized data and takeaways are at the service of your agency as you work to increase local clients’ visibility so that they can achieve optimum growth.

Visualizing a year of movement in the local finder

Google’s local finder results are paginated in sets of ten. In the following chart, you’ll see the top 10 competitors as they stood in January, moved throughout the year, and finished in December. A total of fifteen brands saw some visibility in the top 10 local finder results over the course of the year, and each brand is represented by its own color.

Takeaways

1) Nothing anyone did in 2020 shifted Brand #1 (which I’ll call Tansy) out of its top spot. No amount of links, reviews, photos, posts, category changes, or any other activities over the course of the year unseated Tansy.

When I see a results set like this, I suspect a sluggish market in which no one is making a strong enough marketing effort to surpass a business like Tansy. If you find a sluggish market, your client can become a winner with the right strategy.

2) The higher a business appeared in the local finder, the more stable it tended to be throughout the year. The lower a business appeared in the local finder, the more erratic its position was as the year moved along. One notable exception to this was the dramatic eight-spot April drop experienced by the business that started out in position #2. My theory is that this outlier may have been tied to changes at the business at the onset of COVID-19. But other than this, the lower levels of the top 10 local finder results are very volatile, with some brands even vanishing while new ones popped into view.

It’s clear that local results are extremely dynamic, sometimes even changing from hour to hour within a single day, but you seem to be most secure at the top. Even if you only report on rankings to clients a few times a year, remind them that variation is the norm, and that they shouldn’t sweat the small ranking stuff because it’s tracking overall upward growth that matters most to their brand.

3) Of the 15 total brands that won a spot in the top 10 results over the course of the study, three began and ended the year in the same position. Two that maintained a presence in the results throughout the study ended up in a higher position at the end of the year than they’d begun with, and two ended up lower. Two latecomers began the year not in the top ten but achieved a top ten placement by year’s end. Finally, two that began the year in the top ten fell out of the set by year’s end and two latecomers made brief appearances at some point, only to disappear later.

I wasn’t expecting the pandemic to happen when I began this study, but my takeaway at the end of the test period was that this set of restaurants had done an amazing job adapting. Though a few restaurants lost their spot “above the tideline” of the top 10, most remained operational and visible, and some even made small gains. Anything you can do to help clients remain safely viable will be work of real value for the duration of COVID-19.

Seek strategic clues to unseat a sticky local competitor

Image credit: Megan Hemphill

So, a year has gone by in which no other brand was able to unseat Tansy. This leaves us with two questions:

  1. What could restaurants lower in this local finder do in the year ahead to mount a challenge to Tansy’s dominance?
  2. As a side query, how do I feel about the results quality? Is it fitting that Tansy is ranking #1 for this search phrase, or are Google’s results inexplicable and/or of low quality?

To investigate this, I did a competitive audit of Tansy at #1, the brand at #5 (which I’ll call Lovage), and the brand at #10 (which I’ll call Rosemary), to continue my herbal theme.

Obviously, I don’t have access to the analytics of any of these brands, but I was able to analyze 48 data points for each brand to discover where Tansy is winning and where Lovage and Rosemary would need to improve to pry Tansy out of its spot, if possible. I won’t cover all 48 points here, but let’s look for answers together in this data set:

Location

All three brands are within Google’s mapped city borders, though Rosemary is right at the edge, just within the red perimeter.

No location shares an address with a competitor with the same primary category, though Lovage is within a block of businesses with the same primary category.

Lovage is moderately winning on proximity to the city centroid, at 0.1 miles from it. Tansy is 0.4 miles away, and Rosemary is much further out at 4.5 miles.

There are no signs of spam, in terms of location. Everything is legit.

Business title

No business has the word “breakfast” or the name of the city in its business title, so no one is either spamming or winning an advantage here.

Categories

Tansy is categorized as “breakfast restaurant”, but both Lovage and Rosemary are categorized as “American Restaurant”.

This is our first big a-ha. If Lovage or Rosemary see breakfast-related queries as their primary queries (their head terms), they would likely need to change their primary category to compete with Tansy. Right now, Tansy is getting a big win here.

Tansy is also doing a better job with secondary categories, according to GMBspy, having selected “brunch restaurant”, “American restaurant”, and “family restaurant” to let Google know more about their relevance. Lovage has only selected the rather redundant “restaurant” as a secondary category, and Rosemary has no secondary categories. Lovage and Rosemary are leaving opportunities on the table here to enrich their secondary categories.

Photos

Tansy comes out ahead again by having uploaded about 20 photos and accumulated 400+ total photos from the public. Lovage has uploaded 0 photos, though the public has stepped in with 100+ uploads. Rosemary has also uploaded 0 photos, and has only amassed 20 public pics.

In terms of quality, I saw lots of good shots for Tansy and Lovage, but Rosemary’s user-uploaded photos are fuzzy, unflattering, and in need of work. Both Lovage and Rosemary need to invest the time in uploading a great photo set to enrich their listing and improve conversions.

Reputation

All three brands have achieved a laudable 4.6 star rating, so there is no clear winner here, but Tansy has 510 reviews, Lovage has 245, and Rosemary has 109.

Tansy is running away with the review game. Lovage needs to double its review corpus and Rosemary needs 5x the reviews it currently has to achieve comparable metrics.

Tansy is also ahead in terms of review recency, with their most recent review being 6 days ago, while it’s been 3 weeks since Lovage was reviewed and 2 weeks for Rosemary.

None of our three competitors have responded to a single Google review. This would seem to shore up the theory that owner responses don’t directly impact local pack rankings, because clearly a lack of responses isn’t preventing high placement in this local finder. That being said, ignoring conversations your customers are starting each time they review your business is not good customer service and could erode reputation and ratings over time. There’s opportunity here for Lovage and Rosemary to become more active than Tansy in ways that could improve customer experience and conversions.

I saw no signs of spam in Tansy’s body of Google reviews, so nothing can be reported by lower competitors to gain an advantage.

Meanwhile, over at Yelp, Tansy is ranking #2, with a 4-star rating, and 673 reviews. Lovage comes in at #7, with a 4-star rating, and 223 reviews. Rosemary is way down at #24, with a 4-star rating, and just 100 reviews. The high stars of all three brands could be doing something to shore up their rankings over in Google’s product, but this is just speculation on my part. I find Rosemary’s top 10 Google visibility a bit more mysterious after looking at Yelp.

Place topics

I consider this an experimental area of Google’s review interface. It surfaces and quantifies subjects reviewers are discussing. I like to look at this to gauge how Google might derive signals of relevance in relationship to the search phrase.

Tansy’s top ranking for a breakfast query might be somewhat supported by 43 mentions of “french toast” and 5 of “breakfast burritos”, but Lovage looks to be in the best shape here with 52 mentions of “breakfast”. Rosemary has received 11 mentions of “breakfast”.

I found the place topics for Lovage especially interesting. I had accumulated some rather low metrics for them elsewhere in my audit, which made their strong #5 ranking surprising, so I revisited this data point. Could this winning number of mentions of “breakfast” be doing a great deal to support Lovage’s ranking for my search term, even though their metrics are severely lacking in other areas of the audit? Food for thought!

Don’t forget that review acquisition campaigns can shape response language by the way requests are phrased. Tansy should secure their relevance by asking patrons to specifically comment about breakfast, and Rosemary needs to keep working on breakfast mentions as they increase their overall review count.

Attributes

Only Lovage was marked with take-out and delivery attributes. Tansy and Rosemary were leaving it up to diners to guess, phone, visit the website, or discover services in some more time-consuming way. This could certainly be giving Lovage a bit of a boost in terms of user behavior/conversions.

Google Posts, Q&A and menus

Tansy is in the lead again, with minor usage of Google Posts, and 4 questions asked with some response from the brand. Our other two competitors have never published a Google Post or received, published, or answered a question.

Lovage and Rosemary could shine here with a moderate effort put into Google Posts since Tansy’s usage has been lukewarm, and it would take about 15 minutes for the two lower-ranked competitors to put up 10 FAQs and answer them to take on a more active appearance than Tansy.

As for menu links, only Tansy had posted one. Smartly, it was a link to the menu on their own website rather than on a third-party platform.

Website

Here’s where I got a fairly significant audit shock.

Tansy has a real website that’s been around for 4 years, with a Domain Authority of 19 and a GMB landing page Page Authority of 20. They’ve earned 70 links from 43 root domains. The basic contact info on the website matched the GMB contact info, but the GMB landing page title tag did not include my search phrase. The site passed Google’s mobile-friendly test but is not served over secure HTTPS. The top link the site has earned is from a local online newspaper with a PA of 40, according to Moz Link Explorer.

But Lovage has no website at all, and isn’t linking its Google My Business listing to anything.

Meanwhile, Rosemary has a sketchy two-month-old subdomain on some sort of free website builder, with a concerning backlink portfolio of 7,324 links from 74 root domains. The actual DA of the website builder domain is 22, and the GMB landing page PA is 15. The GMB NAP matched the landing page NAP, but the GMB landing page title tag did not include my search phrase. The site was neither mobile-friendly nor secure. Moz Link Explorer found that the top followed link to the site was from a completely unrelated web page on a lifestyle site about life in another state, with a PA of 43.

Tansy’s content was minimal, lacked the search phrase in its title tag, and was in what I’d consider pretty poor SEO shape. But it was better than having no website, like Lovage, or the single subdomain page that Rosemary has.

So, this is one of those good but startling audit surprises. No one has a strong website, and despite this, Lovage is ranking #5 with no website and Rosemary is managing top 10 visibility without a real website of their own. There is certainly opportunity for a competitor with a strong, optimized website and a solid backlink profile to make headway in a market like this where high rankings are being awarded despite minimal organic effort.

High level takeaways from the audit

Image credit: Marco Verch

I looked at a variety of other points, like hours of operation, price attributes, and the sites Google was surfacing from around the web on the GMB profiles, but I didn’t see any major wins or losses here.

From the overall audit process, what I did see was that:

Tansy’s win is clear

Of the 20 factors in which one of the three competitors scored a clear win, Tansy won 17, Lovage won 2, and Rosemary won 2.

Nobody but Google knows what all the local ranking factors are, but as far as my auditing process can measure, it made sense that Tansy’s 17 wins were translating to the top ranking among these three competitors. As far as I can measure as a local SEO without access to behavioral signals and other analytics, the top result, at least, makes sense.

Lovage and Rosemary’s claims to visibility are cloudier

Things fall apart a bit after acknowledging that Tansy deserves to be #1. Google is measuring Lovage as being a better result than Rosemary, despite the former having no website and the latter having at least a free subdomain page it is designating as home.

Maybe Google is as suspicious of that backlink profile on the free website builder as I am and is pushing Rosemary below Lovage because of it. Maybe Lovage’s winning Place Topic mentions of “breakfast” are keeping it in the running for my breakfast query, and are even moderately representative of Google’s overall understanding of this entity’s relevance to searches for breakfast in this city.

The trouble is, within the first 10 results of the Local Finder, I saw Lovage outranking restaurants with higher metrics in many areas I haven’t described in my summary, and so, Google’s weighting of ranking factors remains frustratingly vague in this test, as it does in so many real-world cases.

Lovage or Rosemary could unseat Tansy if they chose to

Despite the opacity of Google’s local algorithm, there is clearly room for improvement for both Lovage and Rosemary. If either of these brands were your agency’s client, you would need to take Tansy’s wins column and build your strategy from it. Your strategy could include recommendations for:

  • Primary category adjustment based on ranking goals
  • Website development and optimization
  • Link development
  • Photography
  • Review acquisition, including both numbers and recency, as well as review language
  • Customer service improvements via owner responses and Q&A usage

The main thing is that Tansy’s effort has not been so enormous that it can’t be overcome. It has remained at the top for a year due to a modest presence — not an insurmountable one.

X factors and Google’s local SERP quality

For nearly two decades, local SEOs have been trying to identify and assign weight to the various local search ranking factors. The truth is, whenever I have occasion to conduct an audit, I realize that:

  1. I’m confident that we know some of the factors, but certainly not all of them. I think there are X factors out there still to be discovered.
  2. I have little confidence that we know the weight Google assigns to individual factors, and I strongly suspect that Google weights unique factors differently in different industries.

More on that second point: in this data set, I’ll reveal that the business which ranked #8 in December is IHOP — a large, corporate competitor with a Domain Authority of 68, and nearly 700,000 links from nearly 20,000 root domains. Yet, it ended up being outranked by both Tansy and Lovage, which are single-location, independently owned eateries. How does that happen?

I strongly believe that organic factors have a huge impact on local rankings, but it doesn’t play out that way in this local finder. I also strongly believe that review count matters, but Lovage is beating IHOP with fewer reviews. I still moderately believe that for remote searches, distance to city centroid continues to have some effect, but IHOP is very centrally located in this instance. And so on and so forth.

Overall, I feel Google’s results are, indeed, delivering a good quality experience for a person searching for breakfast in this city. The searcher is certain to find a decent variety of nearby options for a meal, and I saw no spam lying in wait for them in this particular top 10 of the local finder. But as to the individual placement of each restaurant, I did see mysteries that I couldn’t easily solve for myself and that agencies like yours would likely find difficult to explain to clients.

Creating a strong plan of action for clients, despite any ranking mysteries

Image credit: Rose Dav

No one factor will “do the trick” in any local finder. Just like flour doesn’t equal bread unless you add yeast, salt, and water, a single local ingredient won’t equal rankings without attention to the whole recipe.

Your agency will encounter sluggish packs where no brand is taking substantial action to challenge the top competitor, meaning achievable wins are totally possible with a few good ingredients. You’ll uncover local finders so riddled with spam that reporting bad actors will be core to your strategy. And you’ll also encounter SERPs so actively managed by mighty competitors that making any headway for your client will require throwing everything but the kitchen sink at the problem.

Regardless of scenario, you can create the strongest plan of action with these five steps:

  1. The top-ranked businesses in my study were the recipients of tons of love from the public. Their food, their service, their adaptations to the pandemic, and many other human factors really sang out loud in the reviews. The foundation of success both offline and online is positive real-world relationships. Be sure you make this message central to what you teach all clients.
  2. Audit the individual packs and finders for each important search phrase. Make a copy of my free Local Business Competitive Audit spreadsheet to help you out. Always look at the metrics of the top competitor and measure your client’s metrics relative to them.
  3. Identify the top competitor’s wins, and prioritize your local marketing strategy based on which factors you believe are having the most impact in the client’s unique market, whether that’s reviews, photos, or what have you. Even if there are mystery rankings, you’ll typically get the best results by applying best practices to presumed ranking factors, hoping to see cumulative rewards. But don’t take anyone’s word for it. Keep experimenting when you encounter mysteries. It may be your agency that unlocks an X factor.
  4. Some agencies don’t report on rankings at all. If yours does, be sure you’re not overreporting, because the constant variation in ranking order can cause clients needless worry. Rather, use rankings mostly as internal benchmarks, and be sure you’re tracking how the work you’re doing is leading to upward growth in conversions and revenue.
  5. Be sure incoming clients understand the influence of user-to-business proximity, meaning that there are no static #1 rankings. This yields many, many chances for your client to rank because customers are multi-located, mobile, and being served up highly dynamic local SERPs.

What would your agency add to my to-do list? What have you seen in your own year-long or multi-year local SERP tracking? Do you suspect the identity of an X factor no one is talking about? If you’ll share in the comments, we can all keep learning together!



Friday, January 22, 2021

A (Re)Introduction to Guest Posting

Posted by CitationLabs

Garrett French — founder of Citation Labs and all around link building expert — takes you on a comprehensive walkthrough of guest posting on sites supported by sales. Why is this a good strategy? How do your posts benefit these websites? How do you start and what websites do you reach out to? Watch to find out!


Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hello, folks. My name is Garrett French, and I'm a link builder. I run Citation Labs. We have 120 employees, and we build lots of links. Today I am here to reintroduce to you the tactic of guest posting.

All right. Very specifically, though, guest posting with a target of publishers — this largest portion here of the publisher pyramid — who are supported by sales, whose main reason for publishing is to sell things.

Introduction

So let's dig in. We are talking about earned placements. The publishers have to approve this content. There's an editorial gatekeeper. Again, yes/no? Do we want to publish? Do we not?

Is it up to our standards? We're talking about real websites with real audiences. We're talking about flexible format. So you can think beyond an article. You can think into an FAQ, for example, or a glossary or something along those lines. Again, very much we want to emphasize the publishers that we're talking about here get their revenue from sales.

They're publishing so that they can get new clients or to sell products or services. We're not talking about PBNs. We're not talking about sponsored placements. We're not talking about any circumstance where you have to pay money in order to get in front of somebody's audience. Lastly, I want to point out we're not necessarily talking about op-ed circumstances here.

This isn't a branded expertise play. This isn't your chance to show how much you know. Now you're going to be able to show your expertise, but you're going to be second fiddle. You've got to put the publisher themselves and their interest in sales first. That's what you're doing here, and that's why you're approaching this group, and again it's why they publish. That's the publisher benefit that you're going to be emphasizing when you approach this group. 

Why guest posts?



Now, why guest posts? Well, guys, there’s an enormous amount of visibility and reach here. Look at the pyramid. This is representative of most industries generally: 95% of publishers are publishing to get sales, and 4% are mission-based and supported by taxes, tuition, donations, subscriptions, etc.

Then we’ve got the 1% that are ad-supported. There are so many publishers out there trying to sell in your vertical, in your clients’ verticals, in your target vertical if you’re in-house, and there’s a lot of disaggregated reach there. There are a lot of newsletters out there, a lot of social media followings out there, folks, that you could be working to get in front of.

You have a lot more topic and context control when you're publishing on these types of websites, when you're seeking publishing on these sites. Again, if you're looking at the tax, tuition, donation, and subscription supported swath here, the 4%, you can sometimes have topics where you can discuss sales or mention a sales page.

But more frequently you've got to really focus on the publisher's mission, why are they publishing. They're on a mission, and so they're supported by something besides sales. Then lastly, of course, if we're talking about digital PR or any kind of mainstream media focus or PR effort, they want content that's going to drive page views.

That's how they're supported. There's still some mission, of course, in there. But anyhow, you're much less able, at that point, to link into your sales pages. So again, what we're talking about here or one of the benefits here rather links to sales pages, which of course is going to improve the rankings of your sales pages.

How to guest post

Now why is that easier in this context, in the context of helping someone else sell? Well, let's dig in and talk through the how, and you'll see also what makes that possible. 

Finding publishers

So primarily we're talking about finding publishers with whom you have top-of-funnel overlap, where some of your top-of-funnel topics, the pains that your prospective clients have and the pains their prospective clients have are similar, interrelated.

Perhaps we’re talking about audience overlap. Perhaps we’re talking about industry overlap. Even location overlap. There’s some kind of overlap here, and you’re speaking into that place when you’re thinking of topics for a given publisher. Another way to think about the members of that market is as a solution stack.

So in the SEO space, we all have our favorite tool stack, the tools everybody uses, Moz for example. Well, if you’re selling into that space, if you’re an agency like Citation Labs, it might make sense to try to get some visibility on a SaaS tool in the SEO space.

"Unbundling" the stack



Let's work here a little bit longer though, stick on this one a little bit longer and think about unbundling the stack in different verticals, because this is really at the heart of the process and the approach. Let's think about you're a realtor.

So within your stack or your industry and certainly within your location, there are going to be some roofers too, and a handful of these folks are going to have blogs. Not all of them, but a handful will. So you're going to approach a roofer with a topic such as 10 reasons to fix your roof before you put your home up for sale.

Now, this solves a roofer problem, doesn't it? It's reasons to purchase roofing services. Also it gives you an opportunity to talk about your expertise as a realtor and what impact roof condition may have on the sale of a home.

Let's go into this one here, commercial ovens, let's say those brick ovens for pizza. We're looking at somebody in the flour space. Maybe they've got some organic flour. Well, you're going to write them a guide on why you need to use organic flour in your pizza dough for your pizza restaurant, the difference that organic flour can make on the outcome of the quality of the dough, of the crust.

You're going to speak to temperature impact on organic versus not organic, if there is. There might not be, but let's just for the sake of this assume there is. Then you're also going to have a great chance to link to your commercial pizza ovens.

If you're on a site that sells flour into the restaurant space, well, it really makes sense for you to have some visibility there. Let's say you sell cell phones and you're thinking about the fitness or health space. So you can pitch something.

You find a physical therapist. You’ve got 10 apps that augment your physical therapy. This can work just as well for, let’s say, a yoga studio or a CrossFit gym: apps that augment your exercise, your physical fitness regimen. Again, you’re putting them first, because you’re talking about augmenting services or work that’s already going on, which assumes that someone has already chosen to be their customer, to go to this physical therapist, or to attend yoga classes at this particular studio.

So this is what we're talking about when we think about or talk about unbundling this stack. You see as we come up with topics that we would pitch, we're putting the publisher first. Always putting the publisher first and recognizing the reason that they publish.

Hone your pitch

This is the biggest piece, guys. Why do they publish? They publish because they want to sell services and products. So you’re thinking about topics and formats that are going to support that, and that overlap with what you’re selling and how you’re functioning. Let’s see. Here’s another good tip: try to get calls to action for your publisher into the title.

So we could revise our earlier title, "10 reasons to fix your roof before you put your home up for sale," to "10 reasons to call a roofer before you put your home up for sale," or even "10 reasons to call a roofer now if you're going to put your home up for sale in April."

So again, you're really honing your pitch for the intended purpose of this publisher group. You're also thinking beyond the article, as we mentioned earlier. You're thinking about FAQs. You're thinking about glossaries.

Explore different formats

What other formats could be strong? An infographic, even a small one. Any of these topics could be explained or supported through the use of graphics. Again, this is the type of document or pitch that can be really effective, because the publisher is going to see immediately how it could benefit their sales, the reason why they publish.

Keyword research

You're an SEO, right? So lean into keyword research in your pitch: "Hey, it looks like you're not ranking for some of these terms in your area." Again, these terms need to overlap with what you're trying to sell and with what your topic needs to be.

But if you've got some basis behind your pitch, some keyword research to support your topic and why it's going to benefit the publisher, you're miles ahead of anybody else who is pitching them. 

Help promote

Then you could even offer some promotion: you'll link to it from another placement if you get one, share it on Twitter with your following, mention it on Facebook, and so on. Maybe even buy some ads for it.

Fact-based citations

Now one of the key pieces here is kind of hidden down at the bottom. You're going to make sure that when you're linking to pages on your site, you're doing it in the context of a fact-based citation. Ideally you've got something on your sales page, what we call a citable element, that's fact-based: ideally your own data that ultimately supports a purchase decision.

For example, if you know that your ovens do best with organic flour at 412 degrees instead of 418 and you've got the data to support that, well, that's a great place and reason to link back to your oven page that would have that data point mentioned on it.

You're best served by linking in a justifiable manner, specifically when you're citing data or some other kind of source that has to be linked, where the link is absolutely mandatory (a quote, for example).

So again, this model or approach has to be supported by citable elements living on your sales pages, or on whatever pages you're linking to if you choose not to target sales pages directly.

Conclusion

Whoo, I think that's about it, folks.

Probably lots of questions. But that's our approach to guest posting on sales-supported publishers. Give it a shot and let me know how it goes. Love to hear from you at garrett@citationlabs.com — happy to answer any questions. 

Thank you, folks.

Video transcription by Speechpad.com



Thursday, January 21, 2021

4 SEO Strategies for Programmatic Sites

Posted by Royh

Planning and executing SEO strategies for sites with hundreds of millions of pages is no easy task, but there are strategies to make it simpler.

Programmatic pages are pages that have been generated automatically on a very large scale. SEO strategies for these pages are used to target multiple keyword variations by creating landing pages at that scale automatically.

You’ll typically find these pages in major verticals like e-commerce, real estate, travel, and informational sites. These verticals rely on programmatic pages to build their SEO strategy, with a dedicated page for each product and category. This setup can lead to hundreds of millions of pages: they’re efficient, functional, and user-friendly. However, they do come with some SEO challenges.

In my experience, the comprehensive SEO strategy covered in this post works best when tailored to fit a large site with programmatic pages. Many strategies that typically work for sites with only a few hundred pages won't necessarily get the same results on larger sites. Small sites rely on manual, meticulous content creation, whereas on programmatic sites the automatically generated pages are the main traffic-driving pages.

So, let’s get down to business! I’ll explore the four major SEO challenges you'll encounter when dealing with programmatic pages, and unpack how to overcome them.

1. Keyword research and keyword modifiers

Well-planned keyword research is one of the biggest challenges when operating on a programmatic scale. When working on a sizable set of pages and keywords, it’s important to choose and find the right keywords to target across all pages.

In order to function both efficiently and effectively, it’s recommended that you divide site pages into a few templates before digging into the research itself. Some examples of these templates could include:

  • Categories
  • Sub-categories
  • Product pages
  • Static pages
  • Blogs
  • Informational pages
  • Knowledge base/learning

Once you have all the page templates in place, it's time to build keyword buckets and keyword modifiers.

Keyword modifiers are additional keywords that, once combined with your head terms and core keywords, support a long-tail strategy. For example, modifiers for the head term “amazon stock” can be anything related to market share, statistics, insights, etc.

Programmatic pages typically hold the majority of the site's pages. (Take Trulia, for example, which has over 30 million indexed pages — the majority of which are programmatic.) As a result, those pages are usually the most important on a larger website, both in terms of volume and search opportunity. Thus, you must ensure the use of the right keyword modifiers across each page template’s content.

Of course, you can’t go over every single page and manually modify the SEO tags. Imagine a website like Pinterest trying to do that: they’d never finish! On a site with 30-100 million pages, it’s impossible to optimize each one individually. That's why it’s necessary to make changes across sets of pages and categories: you need to come up with the right keyword modifiers to implement across your various page templates so you can efficiently handle the task in bulk.

The main difference here, compared to typical keyword research, is the focus on keyword modifiers. You have to find relevant keywords that can be repeatedly implemented across all relevant pages.
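To make that concrete, here's a minimal sketch of how you might expand a handful of head terms with modifiers into a long-tail keyword list. The terms and the script are hypothetical, not a specific tool's output:

```python
from itertools import product

# Hypothetical head terms and modifiers -- swap in your own research.
head_terms = ["amazon stock", "tesla stock", "apple stock"]
modifiers = ["market share", "statistics", "insights", "forecast"]

# Combine every head term with every modifier to build the long-tail list.
long_tail = [f"{head} {mod}" for head, mod in product(head_terms, modifiers)]

for keyword in long_tail:
    print(keyword)
```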

Let's take a look at this use case on a stock investment website:

The example above shows a website that targets users/investors with informational intent and relies on programmatic pages for its SEO strategy. I found the keyword modifiers by conducting keyword research and competitor research.

I researched several relevant, leading websites using Moz’s Keyword Explorer and SimilarWeb’s Search Traffic feature, and noted the most popular keyword groups. After I’d accumulated the keyword groups, I found the search volume of each keyword to determine which ones would be the most popular and relevant to target.

Once you have the keyword modifiers, you must implement them across the title tags, descriptions, headline tags, and on-page content of the page template(s) they’re for. Even multiplied across millions of pages, having the right keyword modifiers makes updating your programmatic pages much easier and more efficient.

If you have a template of pages organized by a specific topic, you'll be able to update and make changes across all the pages with that topic: for example, a stock information site with a particular type of stock page, or a category of stocks grouped by price or industry. One update will affect all the pages in the same category, so if you update the SEO title tag on the stock page template, then all pages in that category will be updated as well.
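As a rough illustration of that template-driven approach, here's a hypothetical sketch in which a single title-tag template drives the tags for every page of one type; the template text and field names are made up for the example:

```python
# One change to the template updates every page of this type in bulk.
TITLE_TEMPLATE = "{ticker} Stock Price, Market Cap & Statistics | ExampleFinance"

pages = [
    {"ticker": "AMZN"},
    {"ticker": "TSLA"},
]

for page in pages:
    # Render the shared template with each page's own data.
    page["title_tag"] = TITLE_TEMPLATE.format(**page)
    print(page["title_tag"])
```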

In the example above, the intent of the keywords is informational. Keyword intent focuses on how to match search intents to keyword modifiers. We’re targeting searchers who are looking to gather certain insights. They want more information regarding stocks or companies, market caps, expert evaluations, market trends, etc. In this case, it's recommended to add additional keywords that will include questions such as “how?”, “what?”, and “which?”.

As another example, transactional keywords — which are a better fit for e-commerce and B2C websites — are highly effective for addressing searches with purchase intent. These terms can include “buy”, “get”, “purchase”, and “shop”.

2. Internal linking

Smart internal linking plans are vital for large sites. They can significantly increase the number of indexed pages and pass link equity between pages. When you work on massive sites, one of your main priorities should be making sure Google can discover and index your site’s pages.

So, how should you go about building those internal linking features?

When looking at the big picture, the goal is that Page A links to Page B and Page C, while Page B links to Page D and Page E, and so on. Ideally, each page will get at least one link from a different indexed page on the site. For programmatic sites, the challenge is that new pages emerge on a daily basis, so in addition to linking the existing pages, you need to plan ahead to jumpstart internal linking for new pages. This helps them get discovered quickly and indexed properly.

Related pages and “people also viewed”

One strategy that makes link building easier is adding a "related pages" section to the site. It adds value for the user and the crawlers, and also links to relevant pages based on affinity.

You can link to similar content based on category, product type, content, or just about any other descriptive element. Similar content should be sorted in numeric order or alphabetical order.
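Here's a minimal sketch of that idea, assuming a hypothetical page list with a category field; related links are chosen by shared category and sorted alphabetically:

```python
# Hypothetical page inventory -- in practice this would come from your database.
pages = [
    {"url": "/stocks/amzn", "category": "e-commerce"},
    {"url": "/stocks/shop", "category": "e-commerce"},
    {"url": "/stocks/tsla", "category": "automotive"},
    {"url": "/stocks/ebay", "category": "e-commerce"},
]

def related_pages(page, all_pages, limit=5):
    """Pick same-category pages (excluding the page itself), alphabetically."""
    candidates = [p["url"] for p in all_pages
                  if p["category"] == page["category"] and p["url"] != page["url"]]
    return sorted(candidates)[:limit]

for page in pages:
    print(page["url"], "->", related_pages(page, pages))
```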

HTML sitemap

Yes, even large websites use HTML sitemaps to help crawlers find new pages. They’re extremely effective on large-scale sites with millions of pages.

Let’s take a look at this example from the Trulia HTML sitemap (shown above): Trulia built their HTML sitemap based on alphabetical order, and in a way that ensures all pages have links. This way, there won't be any orphan pages, which helps their goal of supplying link juice to all pages that they wish to index.

In general, many e-commerce and real estate websites are sequencing their sitemaps by alphabetical/categorical order to guarantee that no page will be alone.
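Here's one hypothetical way to generate that kind of alphabetized HTML sitemap. The URLs are invented, and bucketing by the first letter of the path is just one possible ordering rule:

```python
from collections import defaultdict

# Hypothetical URL list -- every page appears, so none is orphaned.
urls = ["/agents/austin", "/agents/boston", "/homes/atlanta", "/homes/baltimore"]

# Bucket URLs alphabetically by the first letter of the path.
buckets = defaultdict(list)
for url in sorted(urls):
    buckets[url.strip("/")[0].upper()].append(url)

# Emit a simple nested list: one letter per bucket, one link per page.
html = ["<ul>"]
for letter, links in sorted(buckets.items()):
    html.append(f"  <li>{letter}<ul>")
    html.extend(f'    <li><a href="{u}">{u}</a></li>' for u in links)
    html.append("  </ul></li>")
html.append("</ul>")
print("\n".join(html))
```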

3. Crawl budget and deindexing rules

Crawl budget is a very important issue that large websites need to consider. When you have tens of millions of programmatic pages, you need to make sure Google consistently finds and indexes your most valuable pages. The value of your pages should be based on content, revenue, business value, and user satisfaction.

First, choose which pages should not be indexed:

  1. Use your favorite analytics tool to discover which pages have the lowest engagement metrics (high bounce rates, low average time on site, no page views, etc.).
  2. Use Search Console to discover which pages have high impressions but low CTRs.
  3. Combine these pages into one list.
  4. Check to see if they have any incoming links.
  5. Analyze the attribution of those pages to revenue and business leads.
  6. Once you have all of the relevant data and have chosen the pages to remove from the index, add a noindex tag to all of them and exclude them from your XML sitemap (a minimal sketch of this triage follows below).
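Here's a minimal, hypothetical sketch of that triage, assuming you've exported each data source to CSV. Every file name, column name, and threshold below is an assumption, so map them to your own exports:

```python
import csv

def load(path, key="url"):
    """Load a CSV export into a dict keyed by URL."""
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

analytics = load("analytics_export.csv")    # columns: url, bounce_rate, pageviews
gsc = load("gsc_export.csv")                # columns: url, impressions, ctr
links = load("inbound_links.csv")           # columns: url, linking_domains
revenue = load("revenue_attribution.csv")   # columns: url, revenue

candidates = []
for url, row in analytics.items():
    weak_engagement = float(row["bounce_rate"]) > 0.9 or int(row["pageviews"]) < 10
    g = gsc.get(url, {})
    weak_serp = float(g.get("ctr", 1)) < 0.01 and int(g.get("impressions", 0)) > 1000
    no_links = int(links.get(url, {}).get("linking_domains", 0)) == 0
    no_revenue = float(revenue.get(url, {}).get("revenue", 0)) == 0
    if (weak_engagement or weak_serp) and no_links and no_revenue:
        candidates.append(url)  # noindex these and drop them from the XML sitemap

print(f"{len(candidates)} noindex candidates")
```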

I work for SimilarWeb, a website with over 100 million pages, and I ran a no-index test on over 20 million of them based on the checklist above. I wanted to see the impact of removing a high number of pages from our organic channel.

The results were incredible.

Although we lost over half a million visits over the course of a month, the overall engagement metrics on programmatic pages improved dramatically.



By removing irrelevant pages, I made more room for relevant and valuable pages to be discovered by the Google bot.

Rand Fishkin also has a really comprehensive checklist, which shows you how to determine if a page is low quality according to Google. Another great example is Britney Muller’s experiment, where she deindexed 75% of Moz’s pages with great results.

4. SEO split testing

Test everything! The advantage of working on a large-scale SEO campaign is that you have access to big data and can utilize it for your SEO efforts. Unlike regular A/B testing, which tests human behavior, SEO split testing is aimed purely at crawlers.

The split testing process is usually based on the same or similar page templates. Split the pages into two or three groups: one group acts as a control, while the other groups receive the change. Test the following criteria:

  • Adding structured data
  • Changing the keyword modifier of SEO tags (title tag, description, H tags, etc.)
  • Image ALT tags
  • Content length
  • Page performance
  • Internal linking

In terms of measuring performance, I recommend running one experiment at a time. For instance, you might adjust SEO tags first, and then continue testing other elements after you’ve built some confidence.
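One practical detail the process above glosses over is how to assign pages to groups. A common approach, sketched hypothetically below, is to hash each URL so a page always lands in the same bucket between crawls and analyses:

```python
import hashlib

def bucket(url: str, groups=("control", "test")) -> str:
    """Deterministically assign a URL to a split-test group via its hash."""
    digest = hashlib.md5(url.encode()).hexdigest()
    return groups[int(digest, 16) % len(groups)]

for url in ["/stocks/amzn", "/stocks/tsla", "/stocks/aapl", "/stocks/msft"]:
    print(url, "->", bucket(url))
```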

Diving into a split testing example, let’s look at Etsy. Etsy wanted to test which title tags would rank higher, drive better CTR, and generally improve organic traffic to the tested pages. In the image below, we can see how they performed the split test between control pages with default title tags and test pages with different tag variations, as described in this article.

Pinterest’s dashboard also shows how their growth team relies on split testing experiments for their SEO strategy. Pinterest’s goal was to build an experimentation tool that would help them measure the impact of SEO changes to their rankings and organic traffic.

Now it’s your turn

Since programmatic pages differ from most others, it’s imperative that you build and optimize them in the right way. This requires several adjustments to your normal SEO strategy, along with the application of new and proprietary strategies. The benefit of the approach outlined above is the incremental scale at which you can contribute to your business.

Programmatic pages are supposed to fit the search query, whether it’s a product search, an address, or an informational query. This is why it’s crucial to make sure the content is as unique as possible, and that the user gets the best answer for each query.

Once you grasp the four tactics above, you’ll be able to implement them into your SEO strategy and begin seeing better results for your programmatic pages.



Tuesday, January 19, 2021

The Best-Laid Plans: Can We Predict Anything About 2021?

Posted by Dr-Pete

I've deleted this introduction twice. To say that no one could've predicted how 2020 unfolded seems trite since we're not even a month into 2021, and this new year has already unraveled. Our challenges in the past year, across the globe, have gone far beyond marketing, and I doubt any of us ended the year the way we expected. This graph from Google Trends tells the story better than I can:

The pandemic fundamentally rewrote the global economy in a way none of us has ever experienced, and yet we have to find a path forward. How do we even begin to chart a course in 2021?

What do we know?

Let's start small. Within our search marketing realm, is there anything we can predict with relative certainty in 2021? Below are some of the major announcements Google has made and trends that are likely to continue. While the timelines on some of these are unclear (and all are subject to change), these shifts in our small world are very likely.

Mobile-only indexing (March)

Mobile-first indexing has been in progress for a while, and most sites rolled over in 2020 or earlier. Google had originally announced that the index would fully default to mobile-first by September 2020, but pushed that timeline back in July (ostensibly due to the pandemic) to March 2021.

If you haven't made the switch to a mobile-friendly site at this point, there's not much time left to waste. Keep in mind that "mobile-first" isn't just about speed and user experience, but making sure that your mobile site is as crawlable as your desktop. If Google can't reach critical pages via your mobile design and internal links, then those pages are likely to drop out of the index. A page that isn't indexed is a page that doesn't rank.

Core Web Vitals (May)

While this date may change, Google has announced that Core Web Vitals will become a ranking factor in 2021. Here's a bit more detail from the official announcement ...

Page experience signals in ranking will roll out in May 2021. The new page experience signals combine Core Web Vitals with our existing search signals including mobile-friendliness, safe-browsing, HTTPS-security, and intrusive interstitial guidelines.

Many of these page experience signals already impact ranking to some degree, according to Google, so the important part really boils down to Core Web Vitals. You can get more of the details in this Whiteboard Friday from Cyrus, but the short version is that this is currently a set of three metrics (with unfortunately techie names):

(1) Largest Contentful Paint (LCP)

LCP measures how quickly the largest, visible block of your page loads. It is one view into perceived load-time and tries to filter out background libraries and other off-page objects.

(2) First Input Delay (FID)

FID measures how much time it takes before a user can interact with your page. "Interact" here means the most fundamental aspects of interaction, like clicking an on-page link.

(3) Cumulative Layout Shift (CLS)

CLS measures changes to your page layout, such as ads that appear or move after the initial page-load. I suspect the update will apply mostly to abusive or disruptive layout shifts.

While these metrics are a narrow slice of the user experience, the good news is that Google has defined all of them in a fair amount of detail and allows you to track this data with tools like Google Lighthouse. So, we're in a unique position of being able to prepare for the May algorithm update.
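If you want to pull these numbers programmatically rather than running Lighthouse by hand, here's a sketch against the public PageSpeed Insights v5 API. The audit keys reflect Lighthouse 6-era naming and may change between versions; note that lab tests can't measure real-user FID, so "max-potential-fid" stands in for it here:

```python
import requests

# Public PageSpeed Insights endpoint (an API key is optional for light use).
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

resp = requests.get(API, params={"url": "https://moz.com", "strategy": "mobile"})
audits = resp.json()["lighthouseResult"]["audits"]

# Lab proxies for the three Core Web Vitals.
for key in ("largest-contentful-paint", "max-potential-fid", "cumulative-layout-shift"):
    print(key, audits[key]["displayValue"])
```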

That said, I think you should improve site speed and user experience because it's a net-positive overall, not because of a pending 2021 update. If past history — including the HTTPS update and mobile-friendly update — is any indicator, Google's hope is to use the pre-announcement to push people to make changes now. I strongly suspect that Core Web Vitals will be a very minor ranking factor in the initial update, ramping up over a period of many months.

Passage indexing/ranking (TBD)

In October 2020, Google announced that they were "... now able to not just index web pages, but individual passages from the pages." They later clarified that this wasn't so much passage indexing as passage ranking, and the timeline wasn't initially clear. Danny Sullivan later clarified that this change did not roll out in 2020, but Google's language suggests that passage ranking is likely to roll out as soon as it's tested and ready.

While there's nothing specific you can do to harness passage ranking, according to Google, I think this change is not only an indicator of ML/AI progress but a recognition that you can have valuable, long-form content that addresses multiple topics. The rise of answers in SERPs (especially Featured Snippets and People Also Ask boxes) had a side-effect of causing people to think in terms of more focused, question-and-answer style content. While that's not entirely bad, I suspect it's generally driven people away from broader content to shorter, narrower content.

Even in 2020, there were many examples of rich, long-form content ranking for multiple Featured Snippets, but I expect passage ranking will re-balance this equation even more and give us increased freedom to create content in the best format for the topic at hand, without worrying too much about being laser-targeted on a single topic.

Core algorithm updates (TBD)

It's safe to say we can expect more core algorithm updates in 2021. There were three named "Core" updates in 2020 (January, May, and December), but the frequency and timing have been inconsistent. While there are thematic patterns across the updates, each one seems to contain both new elements and adjustments to old ones, and my own analysis suggests that the patterns (the same sites winning and losing, for example) aren't as prominent as we imagine. We can assume that Google's Core Updates will reflect the philosophy of their quality guidelines over time, but I don't think we can predict the timing or substance of any particular core update.

Googlebot crawling HTTP/2 (2022+)

Last fall, Google revealed that Googlebot would begin crawling HTTP/2 sites in November of 2020. It's not clear how much HTTP/2 crawling is currently happening, and Google said they would not penalize sites that don't support HTTP/2 and would even allow opt-out (for now). Unlike making a site secure (HTTPS) or mobile-friendly, HTTP/2 is not widely available to everyone and may depend on your infrastructure or hosting provider.

While I think we should pay attention to this development, don't make the switch to HTTP/2 in 2021 just for Google's sake. If it makes sense for the speed and performance of your site, great, but I suspect Google will be testing HTTP/2 and turning up the volume on its impact slowly over the next few months. At some point, we might see an HTTPS-style announcement of a coming ranking impact, but if that happens, I wouldn't expect it until 2022 or later.
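If you're curious whether your stack already negotiates HTTP/2, here's a quick sketch using the httpx library (my choice of tool, not one the post prescribes; it needs the http2 extra, installed with `pip install "httpx[http2]"`):

```python
import httpx

# Offer HTTP/2 during the TLS handshake and report what the server agrees to.
with httpx.Client(http2=True) as client:
    response = client.get("https://moz.com")
    print(response.http_version)  # "HTTP/2" if negotiated, else "HTTP/1.1"
```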

When will this end?

While COVID-19 may not seem like a marketing topic, the global economic impact is painfully clear at this point. Any plans we make for 2021 have to consider the COVID-19 timeline, or they're a fantasy. When can we expect the pandemic to end and businesses to reopen on a national and global scale?

Let me start by saying that I'm not a medical doctor — I'm a research psychologist by training. I don't have a crystal ball, but I know how to read primary sources and piece them together. What follows is my best read of the current facts and the 2021 timeline. I will try to avoid my own personal biases, but note that my read on the situation is heavily US-biased. I will generally avoid worst-case scenarios, like a major mutation of the virus, and stick to a median scenario.

Where are we at right now?

As I'm writing this sentence, over 4,000 people died just yesterday of COVID-19 in the US, and over 14,000 globally. As a data scientist, I can tell you that every data point requires context, but when we cherry-pick the context, we deceive ourselves. What data science doesn't tell us is that every one of these data points is a human life, and that matters.

There is a light at the end of the tunnel, in the form of viable vaccines, including (here in the US and in the UK) the Pfizer-BioNTech, Moderna, and Oxford-AstraZeneca vaccines. These vaccines have been approved in some countries, have demonstrated promising results, and are in production. Here in the US, we're currently behind the timeline on distribution, with the CDC reporting about 10 million people vaccinated as of mid-January (initial goal was 20 million vaccinated by the end of 2020). In terms of the timeline, it's important to note that, for maximum effectiveness, the major vaccines require two doses, separated by about 3-4 weeks (this may vary with the vaccine and change as research continues).

Is it getting better or worse?

I don't want to get mired in the data, but the winter holidays and travel are already showing a negative impact here in the US, and New Year's Eve may complicate problems. While overall death rates have improved due to better treatment options and knowledge of the disease, many states and countries are at or near peak case rates and peak daily deaths. This situation is very likely to get worse before it gets better.

When might we reopen?

I'm assuming, for better or worse, that reopening does not imply full "herd immunity" or a zero case-rate. We're talking about a critical mass of vaccinations and a significant flattening of the curve. It's hard to find a source outside of political debates here in the US, but a recent symposium sponsored by Harvard and the New England Journal of Medicine suggests that — if we can adequately ramp up vaccine distribution in the second quarter of 2021 — we could see measurable positive impact by the end of our summer (or early-to-mid third quarter) here in the US.

Any prediction right now requires a lot of assumptions and there may be massive regional differences in this timeline, but the key point is that the availability of the vaccine, while certainly cause for optimism, is not a magic wand. Manufacturing, distribution, and the need for a second dose all mean that we're realistically still looking at a few months for medical advances to have widespread impact.

What can we do now?

First, let me say that there is absolutely no one-size-fits-all answer to this question. Many local businesses were decimated, while e-commerce grew 32% year-over-year in 2020. If you're a local restaurant that managed to stay afloat, you may see a rapid return of customers in the summer or fall. If you're a major online retailer, you could actually see a reduction in sales as brick-and-mortar stores become viable again (although probably not to 2019 levels).

If your e-commerce business was lucky enough to see gains in 2020, Miracle Inameti-Archibong has some great advice for you. To inadequately summarize — don't take any of this for granted. This is a time to learn from your new customers, re-invest in your marketing, and show goodwill toward the people who are shopping online more because of the difficulties they're facing.

If you're stuck waiting to reopen, consider the lead time SEO campaigns require to have an impact. In a recent Whiteboard Friday, I made the case that SEO isn't an on/off switch. Consider the oversimplified diagram below. Paid search is a bit like the dotted gray line: you flip the switch on, and the leads start flowing. The trade-off is that when you flip the switch off, the leads dry up almost immediately.

Organic SEO has a ramp-up. It's more like the blue curve above. The benefit of organic is that the leads keep coming when you stop investing, but it also means that the leads will take time to rebuild when you start to reinvest. This timeline depends on a lot of variables, but an organic campaign can often take 2-3 months or more to get off the ground. If you want to hit the ground running as reopening kicks in, you're going to need to start re-investing ahead of that timeline. I acknowledge that that might not be easy, and it doesn't have to be all or none.

In a recent interview, Mary Ellen Coe (head of Google Marketing Solutions) cited a 20,000% increase during the pandemic in searches from consumers looking to support local businesses. There's a tremendous appetite for reopening and a surge of goodwill for local businesses. If you're a local business, even if you're temporarily closed, it's important to let people know that you're still around and to keep them up-to-date on your reopening plans as they evolve.

I don't expect that the new normal will look much like the old normal, and I'm mindful that many businesses didn't survive to see 2021. We can't predict the future, but we can't afford to wait for months and do nothing, either, so I hope this at least gives you some idea of what to expect in the coming year and how we might prepare for it.

