For a while now, I’ve been disappointed with the People Also Ask (PAAs) feature in Google’s search results. My disappointment is not due to the vast amount of space they take up on the SERPs (that’s another post entirely), but more that the quality is never where I expect it to be.
The quality issue I keep running into is that PAAs often surface obscure questions, or answers drawn from content aimed at other countries.
When I run searches that have a universal answer, such as “can you eat raw chicken?”, the answer is universally correct so there is no issue with the results. But when I run a search that should return local (UK) content, such as “car insurance”, I’m finding a heavy influence from the US — especially around YMYL queries.
I wanted to find out how much of an issue this actually is, so my team and I analyzed over 1,000 of the most-searched-for keywords in the finance industry, where we would expect UK PAA results.
Before we dig in, my fundamental question going into this research was: “Should a financial query originating in the UK, whose products are governed within UK regulations, return related questions that contain UK content?”
I believe that they should and I hope that by the end of this post, you agree, too.
Our methodology
To conduct our analysis, we followed these steps:
1. Tag keywords by category and sub-category.
2. Remove keywords where you would expect a universal result, e.g. “insurance definition”.
3. Extract PAAs and the respective ranking URLs using STAT.
4. Identify country origin through manual review: are we seeing correct results?
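As a sketch of step 4, a first pass at classifying a ranking URL's country of origin can key off its domain suffix before falling back to manual review. The mapping and example URLs below are illustrative, not the exact list we used:

```python
from urllib.parse import urlparse

# Hypothetical ccTLD-to-country mapping; generic TLDs like .com
# can't be classified automatically and need manual review.
CCTLD_COUNTRIES = {
    ".co.uk": "UK", ".uk": "UK",
    ".com.au": "Australia", ".au": "Australia",
    ".ca": "Canada", ".ie": "Ireland",
    ".co.za": "South Africa", ".in": "India",
    ".es": "Spain", ".sg": "Singapore",
}

def classify_origin(url):
    """Guess the country of a ranking URL from its domain suffix."""
    host = urlparse(url).netloc.lower()
    # Check longer suffixes first so .co.uk wins over .uk
    for suffix in sorted(CCTLD_COUNTRIES, key=len, reverse=True):
        if host.endswith(suffix):
            return CCTLD_COUNTRIES[suffix]
    return "manual review"  # .com, .org, etc.

print(classify_origin("https://www.ukpower.co.uk/home_energy"))  # UK
print(classify_origin("https://www.investopedia.com/terms/"))    # manual review
```

Running every ranking URL from the STAT export through a filter like this cuts the manual-review pile down to the ambiguous generic-TLD domains.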
Our findings
55.1% of the 4,507 available financial PAAs returned non-UK content. US content was served 50.5% of the time, while the remaining 4.6% was made up of sites from India, Australia, Canada, Ireland, South Africa, Spain, and Singapore.
Results by category
Breaking it down by category, we see that personal finance keywords bring back a UK PAA 33.72% of the time, insurance keywords 52.10%, utilities keywords 64.89%, and business keywords 38.76%.
Personal finance
Digging into the most competitive products in the UK, personal finance, we found that a significant percentage of PAAs brought back US or Indian content in the results.
Out of the 558 personal finance keywords, 186 keywords didn’t bring back a single UK PAA result, including:
financial advisor
first credit card
best car loans
balance transfer cards
how to buy a house
best payday loans
cheap car finance
loan calculator
Credit cards
Only 17.41% of credit card PAAs were UK-specific, with the US taking just over four out of every five. That’s huge.
Another surprising find is that 61 out of 104 credit card keywords didn’t bring back a single UK PAA. I find this remarkable given the fact that the credit card queries originated in the UK.
Loans
Only 15.8% of searches returned a UK PAA result, with over 75% coming from the US. We also saw highly competitive and scrutinized searches for keywords like “payday loans” generate several non-UK results.
Mortgages
While the UK holds the majority of PAA results for mortgage-related keywords at 53.53%, there are still some major keywords (like “mortgages”) that only bring back a single UK result. If you’re searching for “mortgages” in the UK, then you want to see information about UK mortgages, but instead Google serves up mainly US results.
Insurance
Insurance results weren’t as bad as personal finance. However, there was still a big swing towards the US for some products, such as life insurance.
Out of the 350 insurance keywords tested, there were 64 keywords that didn’t bring back a single UK PAA result, including:
pet insurance
cheap home insurance
life insurance comparison
car insurance for teens
cheap dog insurance
types of car insurance
Car insurance
60.54% of car insurance PAAs were UK-specific, with the US taking 36.97%. Out of the 132 keywords in this sub-category, UK sites were present for 118, which is better than the personal finance sub-categories.
Home insurance
As one of the most competitive spaces in the finance sector, it was really surprising to see that only 56.25% of results for home insurance queries returned a UK PAA. There are nuances to policies across different markets, so this is a frustrating and potentially harmful experience for searchers.
Utilities
Although we see a majority of PAAs in this keyword category return UK results, there are quite a few more specific searches for which you would absolutely be looking for a UK result (e.g. “unlimited data phone contracts”) but that bring back only one UK result.
One interesting find is that this UKPower page has captured 35 PAAs for the 49 keywords it ranks for. That’s an impressive 71.43% — the highest rate we’ve seen across our analysis.
Business
At the time of our analysis, we found that 36.7% of business-related PAAs were from the UK. One of the keywords with the lowest representation in this category was "business loans", which generated only 6.25% UK results. While the volume of keywords is smaller in this category, there is more potential for harm in serving international content for queries relating to UK businesses.
What pages generate the most PAA results?
To make this post a little more actionable, I aggregated which URLs generated the most PAAs across some of the most competitive financial products in the UK.
Ironically, four out of the top 10 were US-based (cars.news.com manages to generate 32 PAAs across one of the most competitive industries in UK financial searches: car insurance). A hat tip to ukpower.co.uk, which ranked #1 in our list, generating 35 results in the energy space.
To summarize the above analysis, it’s clear that there is too much dominance from non-UK sites in finance searches. While there are a handful of UK sites doing well, there are UK queries being searched for that are bringing back clearly irrelevant information.
As an industry, we have been pushed to improve quality — whether it’s increasing our relevancy or the expertise of our content — so findings like these show that Google could be doing more themselves.
What does this mean for your SEO strategy?
For the purpose of this research, we only looked at financial terms, so whilst we can’t categorically say this is the same for all industries, if Google is missing this much across financial YMYL terms then it doesn’t look good for other categories.
My advice would be that if you are investing any time optimizing for PAAs, then you should spend your time elsewhere, for now, since the cards in finance niches are stacked against you.
Featured Snippets are still the prime real estate for SEOs and (anecdotally, anyway) don’t seem to suffer from this geo-skew like PAAs do, so go for Featured Snippets instead.
Have you got any thoughts on the quality of PAAs across your SERPs? Let me know in the comments below!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
Google’s Index Coverage report is absolutely fantastic because it gives SEOs clearer insights into Google’s crawling and indexing decisions. Since its roll-out, we use it almost daily at Go Fish Digital to diagnose technical issues at scale for our clients.
Within the report, there are many different “statuses” that provide webmasters with information about how Google is handling their site content. While many of the statuses provide some context around Google’s crawling and indexation decisions, one remains unclear: “Crawled — currently not indexed”.
Since seeing the “Crawled — currently not indexed” status reported, we’ve heard from several site owners inquiring about its meaning. One of the benefits of working at an agency is being able to get in front of a lot of data, and because we’ve seen this message across multiple accounts, we’ve begun to pick up on trends from reported URLs.
Google’s definition
Let’s start with the official definition. According to Google’s official documentation, this status means: “The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.”
So, essentially what we know is that:
Google is able to access the page
Google took time to crawl the page
After crawling, Google decided not to include it in the index
The key to understanding this status is to think of reasons why Google would “consciously” decide against indexation. We know that Google isn’t having trouble finding the page, but for some reason it feels users wouldn’t benefit from finding it.
This can be quite frustrating, as you might not know why your content isn’t getting indexed. Below I’ll detail some of the most common reasons our team has seen to explain why this mysterious status might be affecting your website.
1. False positives
Priority: Low
Our first step is to always perform a few spot checks of URLs flagged in the “Crawled — currently not indexed” section for indexation. It’s not uncommon to find URLs that are getting reported as excluded but turn out to be in Google’s index after all.
For example, here’s a URL that’s getting flagged in the report for our website: https://gofishdigital.com/meetup/
However, when using a site search operator, we can see that the URL is actually included in Google’s index. You can do this by prepending “site:” to the URL.
If you’re seeing URLs reported under this status, I recommend starting by using the site search operator to determine whether the URL is indexed or not. Sometimes, these turn out to be false positives.
Solution: Do nothing! You’re good.
2. RSS feed URLs
Priority: Low
This is one of the most common examples that we see. If your site utilizes an RSS feed, you might be finding URLs appearing in Google’s “Crawled — currently not indexed” report. Many times these URLs will have the “/feed/” string appended to the end. They can appear in the report like this:
Google finds these RSS feed URLs linked from the primary page. They’ll often be linked via a "rel=alternate" element. WordPress plugins such as Yoast can automatically generate these URLs.
Solution: Do nothing! You're good.
Google is likely selectively choosing not to index these URLs, and for good reason. If you navigate to an RSS feed URL, you’ll see an XML document like the one below:
While this XML document is useful for RSS feeds, there’s no need for Google to include it in the index. This would provide a very poor experience as the content is not meant for users.
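If you want to triage an export of the report quickly, a small filter can separate the ignorable feed URLs from those worth a manual look. This is a minimal sketch; the example.com URLs are placeholders:

```python
# A quick triage pass over a "Crawled - currently not indexed" export:
# split out URLs that are just RSS feeds (safe to ignore) from the rest.
# The "/feed/" suffix is the WordPress convention mentioned above.
def split_feed_urls(urls):
    feeds, other = [], []
    for url in urls:
        path = url.split("?", 1)[0].rstrip("/")
        (feeds if path.endswith("/feed") else other).append(url)
    return feeds, other

exported = [
    "https://example.com/blog/post-one/feed/",
    "https://example.com/blog/post-one/",
    "https://example.com/category/news/feed/",
]
feeds, needs_review = split_feed_urls(exported)
print(len(feeds), len(needs_review))  # 2 1
```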
3. Paginated URLs
Priority: Low
Another extremely common reason for the “Crawled — currently not indexed” exclusion is pagination. We will often see a good number of paginated URLs appear in this report. Here we can see some paginated URLs appearing from a very large e-commerce site:
Solution: Do nothing! You’re good.
Google will need to crawl through paginated URLs to get a complete crawl of the site. This is its pathway to content such as deeper category pages or product description pages. However, while Google uses the pagination as a pathway to access the content, it doesn’t necessarily need to index the paginated URLs themselves.
If anything, make sure that you don’t do anything to impact the crawling of the individual pagination. Ensure that all of your pagination contains a self-referential canonical tag and is free of any “nofollow” tags. This pagination acts as an avenue for Google to crawl other key pages on your site so you’ll definitely want Google to continue crawling it.
4. Expired products
Priority: Medium
When spot-checking individual pages that are listed in the report, a common problem we see across clients is URLs that contain text noting “expired” or “out of stock” products. Especially on e-commerce sites, it appears that Google checks to see the availability of a particular product. If it determines that a product is not available, it proceeds to exclude that product from the index.
This makes sense from a UX perspective as Google might not want to include content in the index that users aren’t able to purchase.
However, if these products are actually available on your site, this could result in a lot of missed SEO opportunity. By excluding the pages from the index, your content isn’t given a chance to rank at all.
In addition, Google doesn’t just check the visible content on the page. There have been instances where we’ve found no indication within the visible content that the product is not available. However, when checking the structured data, we can see that the “availability” property is set to “OutOfStock”.
It appears that Google is taking clues from both the visible content and structured data about a particular product's availability. Thus, it’s important that you check both the content and schema.
Solution: Check your inventory availability.
If you’re finding products that are actually available getting listed in this report, you’ll want to check all of your products that may be incorrectly listed as unavailable. Perform a crawl of your site and use a custom extraction tool like Screaming Frog's to scrape data from your product pages.
For instance, if you want to see at scale all of your URLs with schema set to “OutOfStock”, you can set the “Regex” to a pattern such as: "availability":"http(s?)://schema.org/OutOfStock" (adjust this to match exactly how your pages mark up the availability property).
You can export this list and cross-reference with inventory data using Excel or business intelligence tools. This should quickly allow you to find discrepancies between the structured data on your site and products that are actually available. The same process can be repeated to find instances where your visible content indicates that products are expired.
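As a minimal sketch of that cross-reference (the URLs, availability values, and inventory figures below are invented for illustration), you could compare the scraped availability against your own stock data in a few lines rather than in Excel:

```python
# Hypothetical output of a Screaming Frog custom extraction:
# page path -> schema.org availability value found in the markup.
scraped_availability = {
    "/products/widget-a": "https://schema.org/OutOfStock",
    "/products/widget-b": "https://schema.org/InStock",
    "/products/widget-c": "https://schema.org/OutOfStock",
}

# Hypothetical inventory feed: page path -> units actually in stock.
inventory_in_stock = {"/products/widget-a": 14, "/products/widget-b": 3}

# Pages whose schema says OutOfStock but that inventory says are buyable
discrepancies = [
    url for url, availability in scraped_availability.items()
    if availability.endswith("OutOfStock") and inventory_in_stock.get(url, 0) > 0
]
print(discrepancies)  # ['/products/widget-a']
```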
5. 301 redirects
Priority: Medium
One interesting example we’ve seen appear under this status is destination URLs of redirected pages. Often, we’ll see that Google is crawling the destination URL but not including it in the index. However, upon looking at the SERP, we find that Google is indexing a redirecting URL. Since the redirecting URL is the one indexed, the destination URL is thrown into the “Crawled — currently not indexed” report.
The issue here is that Google may not be recognizing the redirect yet. As a result, it sees the destination URL as a “duplicate” because it is still indexing the redirecting URL.
Solution: Create a temporary sitemap.xml.
If this is occurring on a large number of URLs, it is worth taking steps to send stronger consolidation signals to Google. This issue could indicate that Google isn’t recognizing your redirects in a timely manner, leading to unconsolidated content signals.
One option might be setting up a “temporary sitemap”. This is a sitemap that you can create to expedite the crawling of these redirected URLs. This is a strategy that John Mueller has previously recommended.
To create one, you will need to reverse-engineer redirects that you have created in the past:
Export all of the URLs from the “Crawled — currently not indexed” report.
Match them up in Excel with redirects that have been previously set up.
Find all of the redirects that have a destination URL in the “Crawled — currently not indexed” bucket.
Create a static sitemap.xml of these URLs with Screaming Frog.
Upload the sitemap and monitor the “Crawled — currently not indexed” report in Search Console.
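Step 4 can also be scripted rather than done in Screaming Frog. Here is a minimal sketch using Python's standard library, with placeholder URLs standing in for your matched redirect destinations:

```python
import xml.etree.ElementTree as ET

# Build a static sitemap.xml for the redirect destinations matched
# up in step 3. The URLs here are placeholders.
def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

redirect_destinations = [
    "https://example.com/new-page-1/",
    "https://example.com/new-page-2/",
]
xml_out = build_sitemap(redirect_destinations)
print(xml_out)
```

The output is a standard urlset document you can save as sitemap.xml and submit in Search Console.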
The goal here is for Google to crawl the URLs in the temporary sitemap.xml more frequently than it otherwise would have. This will lead to faster consolidation of these redirects.
6. Thin content
Priority: Medium
Sometimes we see URLs included in this report that are extremely thin on content. These pages may have all of the technical elements set up correctly and may even be properly internally linked to, however, when Google runs into these URLs, there is very little actual content on the page. Below is an example of a product category page where there is very little unique text:
This page is likely either too thin for Google to think it’s useful or there is so little content that Google considers it to be a duplicate of another page. The result is Google removing the content from the index.
Here is another example: Google was able to crawl a testimonial component page on the Go Fish Digital site (shown above). While this content is unique to our site, Google probably doesn’t believe that the single sentence testimonial should stand alone as an indexable page.
Once again, Google has made the executive decision to exclude the page from the index due to a lack of quality.
Solution: Add more content or adjust indexation signals.
Next steps will depend on how important it is for you to index these pages.
If you believe that the page should definitely be included in the index, consider adding additional content. This will help Google see the page as providing a better experience to users.
If indexation is unnecessary for the content you're finding, the bigger question becomes whether or not you should take the additional steps to strongly signal that this content shouldn’t be indexed. The “Crawled — currently not indexed” report is indicating that the content is eligible to appear in Google’s index, but Google is electing not to include it.
There also could be other low quality pages to which Google is not applying this logic. You can perform a general “site:” search to find indexed content that meets the same criteria as the examples above. If you’re finding that a large number of these pages are appearing in the index, you might want to consider stronger initiatives to ensure these pages are removed from the index such as a “noindex” tag, 404 error, or removing them from your internal linking structure completely.
7. Duplicate content
Priority: High
When evaluating this exclusion across a large number of clients, this is the highest priority we’ve seen. If Google sees your content as duplicate, it may crawl the content but elect not to include it in the index. This is one of the ways that Google avoids SERP duplication. By removing duplicate content from the index, Google ensures that users have a larger variety of unique pages to interact with. Sometimes the report will label these URLs with a “Duplicate” status (“Duplicate, Google chose different canonical than user”). However, this is not always the case.
This is a high priority issue, especially on a lot of e-commerce sites. Key pages such as product description pages often include the same or similar product descriptions as many other results across the Web. If Google recognizes these as too similar to other pages internally or externally, it might exclude them from the index altogether.
Solution: Add unique elements to the duplicate content.
If you think that this situation applies to your site, here’s how you test for it:
Take a snippet of the potential duplicate text and paste it into Google.
In the SERP URL, append the following string to the end: “&num=100”. This will show you the top 100 results.
Use your browser’s “Find” function to see if your result appears in the top 100 results. If it doesn’t, your result might be getting filtered out of the index.
Go back to the SERP URL and append the following string to the end: “&filter=0”. This should show you Google’s unfiltered result (thanks, Patrick Stox, for the tip).
Use the “Find” function to search for your URL. If you see your page now appearing, this is a good indication that your content is getting filtered out of the index.
Repeat this process for a few URLs with potential duplicate or very similar content you’re seeing in the “Crawled — currently not indexed” report.
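If you're repeating this for many snippets, the two URL modifications can be scripted. This is a sketch only; Google's SERP URL parameters aren't officially documented and can change, and the snippet below is an invented example:

```python
from urllib.parse import quote_plus

# Build the filtered (top 100) and unfiltered SERP URLs for an
# exact-match snippet search, following the manual steps above.
def serp_urls(snippet):
    base = "https://www.google.com/search?q=" + quote_plus(f'"{snippet}"')
    return base + "&num=100", base + "&num=100&filter=0"

filtered, unfiltered = serp_urls("our best-in-class widget ships worldwide")
print(filtered)
print(unfiltered)
```

Open each pair in a browser and use "Find" as described; a URL that only appears in the unfiltered version is likely being filtered out.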
If you’re consistently seeing your URLs getting filtered out of the index, you’ll need to take steps to make your content more unique.
While there is no one-size-fits-all standard for achieving this, here are some options:
Rewrite the content to be more unique on high-priority pages.
Use dynamic properties to automatically inject unique content onto the page.
Remove large amounts of unnecessary boilerplate content. Pages with more templated text than unique text might be getting read as duplicate.
If your site is dependent on user-generated content, inform contributors that all provided content should be unique. This may help prevent instances where contributors use the same content across multiple pages or domains.
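To estimate how similar two pages really are before rewriting anything, a rough shingle-overlap comparison can flag near-duplicates. The threshold, boilerplate, and sample text below are illustrative only:

```python
# Rough near-duplicate check: compare the word "shingles" (overlapping
# 5-word sequences) of two pages and flag pairs with high Jaccard
# similarity. Pages dominated by shared templated text score high.
def shingles(text, k=5):
    words = text.lower().split()
    if len(words) < k:
        return {tuple(words)} if words else set()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

boilerplate = "Free delivery on all orders over 50 pounds across the UK mainland"
page_a = boilerplate + " The red widget is hand made in Sheffield"
page_b = boilerplate + " The blue widget is machine made in Leeds"

score = jaccard(page_a, page_b)
print(round(score, 2))
print("possible duplicate" if score > 0.5 else "looks unique")
```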
8. Private-facing content
Priority: High
There are some instances where Google’s crawlers gain access to content that they shouldn’t have access to. If Google is finding dev environments, it could include those URLs in this report. We’ve even seen examples of Google crawling a particular client’s subdomain that is set up for JIRA tickets. This caused an explosive crawl of the site, which focused on URLs that shouldn’t ever be considered for indexation.
The issue here is that Google’s crawl of the site isn’t focused, and it’s spending time crawling (and potentially indexing) URLs that aren’t meant for searchers. This can have massive ramifications for a site’s crawl budget.
Solution: Adjust your crawling and indexing initiatives.
This solution is going to be entirely dependent on the situation and what Google is able to access. Typically, the first thing you want to do is determine how Google is able to discover these private-facing URLs, especially if it’s via your internal linking structure.
Start a crawl from the home page of your primary subdomain and see if any undesirable subdomains are able to be accessed by Screaming Frog through a standard crawl. If so, it’s safe to say that Googlebot might be finding those exact same pathways. You’ll want to remove any internal links to this content to cut Google’s access.
The next step is to check the indexation status of the URLs that should be excluded. Is Google sufficiently keeping all of them out of the index, or were some caught in the index? If Google isn’t indexing a large amount of this content, you might consider adjusting your robots.txt file to block crawling immediately. If not, “noindex” tags, canonicals, and password protected pages are all on the table.
Case study: duplicate user-generated content
For a real-world example, this is an instance where we diagnosed the issue on a client site. This client is similar to an e-commerce site as a lot of their content is made up of product description pages. However, these product description pages are all user-generated content.
Essentially, third parties are allowed to create listings on this site. However, the third parties were often adding very short descriptions to their pages, resulting in thin content. The issue occurring frequently was that these user-generated product description pages were getting caught in the “Crawled — currently not indexed” report. This resulted in missed SEO opportunity as pages that were capable of generating organic traffic were completely excluded from the index.
When going through the process above, we found that the client’s product description pages were quite thin in terms of unique content. The pages that were getting excluded only appeared to have a paragraph or less of unique text. In addition, the bulk of on-page content was templated text that existed across all of these page types. Since there was very little unique content on the page, the templated content might have caused Google to view these pages as duplicates. The result was that Google excluded these pages from the index, citing the “Crawled — currently not indexed” status.
To solve for these issues, we worked with the client to determine which of the templated content didn’t need to exist on each product description page. We were able to remove the unnecessary templated content from thousands of URLs. This resulted in a significant decrease in “Crawled — currently not indexed” pages as Google began to see each page as more unique.
Conclusion
Hopefully, this helps search marketers better understand the mysterious “Crawled — currently not indexed” status in the Index Coverage report. Of course, there are likely many other reasons that Google would choose to categorize URLs like this, but these are the most common instances we’ve seen with our clients to date.
Overall, the Index Coverage report is one of the most powerful tools in Search Console. I would highly encourage search marketers to get familiar with the data and reports as we routinely find suboptimal crawling and indexing behavior, especially on larger sites. If you’ve seen other examples of URLs in the “Crawled — currently not indexed” report, let me know in the comments!
Negative SEO can hurt your website and your work in search, even when your rankings are unaffected by it. In this week's Whiteboard Friday, search expert Russ Jones dives into what negative SEO is, what it can affect beyond rankings, and tips on how to fight it.
Click on the whiteboard image above to open a high resolution version in a new tab!
Video Transcription
All right, folks. Russ Jones here and I am so excited just to have the opportunity to do any kind of presentation with the title "Defense Against the Dark Arts." I'm not going to pretend like I'm a huge Harry Potter fan, but anyway, this is just going to be fun.
But what I want to talk about today is actually pretty bad. It's the reality that negative SEO, even if it is completely ineffective at doing its primary goal, which is to knock your website out of the rankings, will still play havoc on your website and the likelihood that you or your customers will be able to make correct decisions in the future and improve your rankings.
Today I'm going to talk about why negative SEO still matters even if your rankings are unaffected, and then I'm going to talk about a couple of techniques that you can use that will help abate some of the negative SEO techniques and also potentially make it so that whoever is attacking you gets hurt a little bit in the process, maybe. Let's talk a little bit about negative SEO.
What is negative SEO?
The most common form of negative SEO is someone who would go out and purchase tens of thousands of spammy links or hundreds of thousands even, using all sorts of different software, and point them to your site with the hope of what we used to call "Google bowling," which is to knock you out of the search results the same way you would knock down a pin with a bowling ball.
The hope is that it's sort of like a false flag campaign, that Google thinks that you went out and got all of those spammy links to try to improve your rankings, and now Google has caught you and so you're penalized. But in reality, it was someone else who acquired those links. Now to their credit, Google actually has done a pretty good job of ignoring those types of links.
It's been my experience that, in most cases, negative SEO campaigns don't really affect rankings the way they're intended to in most cases, and I give a lot of caveats there because I've seen it be effective certainly. But in the majority of cases all of those spammy links are just ignored by Google. But that's not it. That's not the complete story.
Problem #1: Corrupt data
You see, the first problem is that if you get 100,000 links pointing to your site, what's really going on in the background is that there's this corruption of data that's important to making decisions about search results.
Pushes you over data limits in GSC
For example, if you get 100,000 links pointing to your site, it is going to push you over the limit of the number of links that Google Search Console will give back to you in the various reports about links.
Pushes out the good links
This means there are probably links that you should know about or care about that don't show up in the report, simply because Google cuts off the export at 100,000 total links.
Well, that's a big deal, because if you're trying to make decisions about how to improve your rankings and you can't get to the link data you need because it's been replaced with hundreds of thousands of spammy links, then you're not going to be able to make the right decision.
Increased cost to see all your data
The other big issue here is the cost of getting around that limit.
You can get the data for more than 100,000 links pointing to your site. You're just going to have to pay for it. You could come to Moz and use our Link Explorer tool for example. But you'll have to increase the amount of money that you're spending in order to get access to the accounts that will actually deliver all of that data.
The one big issue sitting behind all of this is that even though we know Google is ignoring most of these links, they don't label that for us in any kind of useful fashion. Even after we can get access to all of that link data, all of those hundreds of thousands of spammy links, we still can't be certain which ones matter and which ones don't.
Problem #2: Copied content
That's not the only type of negative SEO that there is out there. It's the most common by far, but there are other types. Another common type is to take the content that you have and distribute it across the web in the way that article syndication used to work. So if you're fairly new to SEO, one of the old methodologies of improving rankings was to write an article on your site, but then syndicate that article to a number of article websites and these sites would then post your article and that article would link back to you.
Now the reason why these sites would do this is because they would hope that, in some cases, they would outrank your website and in doing so they would get some traffic and maybe earn some AdSense money. But for the most part, that kind of industry has died down because it hasn't been effective in quite some time. But once again, that's not the whole picture.
No attribution
If all of your content is being distributed to all of these other sites, even if it doesn't affect your rankings, it still means there's the possibility that somebody is getting access to your quality content without any kind of attribution whatsoever.
If they've stripped out all of the links and stripped out all of the names and all of the bylines, then your hard earned work is actually getting taken advantage of, even if Google isn't really the arbiter anymore of whether or not traffic gets to that article.
Internal links become syndicated links
Then on the flip side of it, if they don't remove the attribution, all the various internal links that you had in that article in the first place that point to other pages on your site, those now become syndicated links, which are part of the link schemes that Google has historically gone after.
In the same sort of situation, it’s not really just about the intent behind the type of negative SEO campaign. It’s the impact that it has on your data, because if somebody syndicates an article of yours that has, let's say, eight links to other internal pages and they syndicate it to 10,000 websites, well, then you've just got 80,000 new external links pointing to your site that should have been internal links.
We actually do know just a couple of years back several pretty strong brands got in trouble for syndicating their news content to other news websites. Now I'm not saying that negative SEO would necessarily trigger that same sort of penalty, but there's the possibility. Even if it doesn't trigger that penalty, chances are it's going to sully the waters in terms of your link data.
Problem #3: Nofollowed malware links & hacked content
There are a couple of other miscellaneous types of negative SEO that don't get really talked about a lot.
Nofollowed malware links in UGC
For example, if you have any kind of user-generated content on your site, like let's say you have comments for example, even if you nofollow those comments, the links that are included in there might point to things like malware.
We know that Google will ultimately identify your site as not being safe if it finds these types of links.
Hacked content
Unfortunately, in some cases, there are ways to make it look like there are links on your site that aren't really under your control, through things like HTML injection. For example, you can actually do this to Google right now.
You can inject HTML onto a page of their own website that makes it look like they're linking to someone else. If Google actually crawled itself, which luckily they don't in this case, and found that malware link, the whole domain would likely start to show in the Google search results that this site might not be safe.
Of course, there's always the issue of hacked content, which is becoming more and more common.
Fear, uncertainty, and doubt
All of this really boils down to this concept of FUD: fear, uncertainty, and doubt. You see, it's not so much about knocking you out of the search engines. It's about making it so that SEO just isn't workable anymore.
1. Lose access to critical data
Now it's been at least a decade since everybody started saying that they used data-driven SEO tactics, data-driven SEO strategies. Well, if your data is corrupted, if you lose access to critical data, you will not be able to make smart decisions. How will you know whether or not the reason your page has lost rankings to another has anything to do with links if you can't get to the link data that you need because it's been filled with 100,000 spammy links?
2. Impossible to discern the cause of lost rankings
This leads to number two: it's impossible to discern the cause of lost rankings. It could be duplicate content. It could be an issue with these hundreds of thousands of links. It could be something completely different. But because the waters have been muddied so much, it makes it very difficult to determine exactly what's going on, and this of course then makes SEO less certain.
3. Makes SEO uncertain
The less certain it becomes, the more valuable other advertising channels become. Paid search becomes more valuable. Social media becomes more valuable. That's a problem if you're a search engine optimization agency or a consultant, because you face a real likelihood of losing clients if you can no longer make smart decisions for them once their data has been damaged by negative SEO.
It would be really wonderful if Google would actually show us in Google Search Console what links they're ignoring and then would allow us to export only the ones they care about. But something tells me that that's probably beyond what Google is willing to share. So do we have any kind of way to fight back? There are a couple.
How do you fight back against negative SEO?
1. Canonical burn pages
Chances are if you've seen some of my other Whiteboard Fridays, you've heard me talk about canonical burn pages. Put simply, when you have an important page on your site that you intend to rank, you should create another version of it that is identical and that has a canonical link pointing back to the original. Any kind of link building that you do should point to that canonical burn page.
The reason is simple. If somebody does negative SEO, they're going to have two choices. They're either going to do it to the page that's getting linked to, or they're going to do it to the page that's getting ranked. Normally, they'll do it to the one that's getting ranked. Well, if they do, then you can get rid of that page and just hold on to the canonical burn page because it doesn't have any of these negative links.
Or if they choose the canonical burn page, you can get rid of that one and just keep your original page. Yes, it means you sacrifice the hard-earned links that you acquired in the first place, but that's better than losing the ability to rank that content at all in the future.
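As a rough sketch of the setup (URLs are hypothetical): the burn page is an identical copy of the page you want to rank, it declares the original as canonical, and all of your outreach points at the burn page rather than the original:

```html
<!-- Hypothetical URLs for illustration only. The burn page at
     /widgets-burn/ duplicates the page you actually want to rank
     and points a canonical link back at it. All link building
     targets /widgets-burn/, never /widgets/ directly. -->

<!-- In the <head> of https://example.com/widgets-burn/ (the burn page): -->
<link rel="canonical" href="https://example.com/widgets/">
```

Either page can then be sacrificed independently, as described above, depending on which one an attacker targets.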
2. Embedded styled attribution
Another opportunity here, which I think is kind of sneaky and fun, is what I call embedded styled attribution.
You can imagine that my content might say "Russ Jones says so-and-so and so-and-so." Well, imagine surrounding "Russ Jones" with H1 tags and then surrounding that with a span tag whose class styles the H1 inside it down to normal-sized text.
Well, chances are if they're using one of these copied content techniques, they're not copying your CSS style sheet as well. When that gets published to all of these other sites, in giant, big letters it has your name or any other phrase that you really want. Now this isn't actually going to solve your problem, other than just really frustrate the hell out of whoever is trying to screw with you.
But sometimes that's enough to get them to stop.
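A minimal sketch of the markup (the class name is made up for illustration): on your own site, your stylesheet shrinks the embedded H1 back down to body-text size, but a scraper that copies your HTML without your CSS renders the name in giant heading type:

```html
<!-- In the article body: the byline wrapped in an H1 inside a classed span.
     (Nesting an H1 inside a span isn't strictly valid HTML, but browsers
     tolerate it, and that's the trick as described.) -->
<span class="attribution-quiet"><h1>Russ Jones</h1></span> says so-and-so...

<style>
  /* In your own stylesheet, which scrapers typically don't copy:
     render the embedded H1 like ordinary inline body text. */
  .attribution-quiet h1 {
    display: inline;
    font-size: 1em;
    font-weight: normal;
    margin: 0;
  }
</style>
```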
3. Link Lists
The third one, the one that I really recommend is Link Lists. This is a feature inside of Moz's Link Explorer, which allows you to track the links that are pointing to your site. As you get links, real links, good links, add them to a Link List, and that way you will always have a list of links that you know are good, that you can compare against the list of links that might be sullied by a negative SEO campaign.
By using Link Lists, you can discern the difference between what's actually being ignored by Google, at least to some degree, and what actually matters. I hope this is helpful. But unfortunately, I've got to say, at the end of the day, a sufficiently well-run negative SEO campaign can make the difference in whether or not you use SEO in the future at all.
It might not knock you out of Google, but it might make it so that other types of marketing are just better choices. So hopefully this has been some help. I'd love to talk with you in the comments about different ways of dealing with negative SEO, like how to track down who is responsible. So just go ahead and fill those comments up with any questions or ideas.
I would love to hear them. Thanks again and I look forward to talking to you in another Whiteboard Friday.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
You’ve produced a piece of content you thought was going to be a huge success, but the results were underwhelming.
You double and triple checked the content for all the crucial elements: it’s newsworthy, data-driven, emotional, and even a bit controversial, but it failed to “go viral”. Your digital PR team set out to pitch it, but writers didn’t bite.
So, what's next?
Two questions you might ask yourself are:
Do I have unrealistic link expectations for my link-building content?
Is my definition of success backed by data-driven evidence?
Fractl has produced thousands of content marketing campaigns across every topic — sports, entertainment, fashion, home improvement, relationships — you name it. We also have several years’ worth of campaign performance data that we use to learn from our successes and mistakes.
In this article, I’m going to explain how businesses and agencies across seven different niches can set realistic expectations for their link-building content based on the performance of 626 content projects Fractl has produced and promoted in the last five years. I’ll also walk through some best practices for ensuring your content reaches its highest potential.
Managing expectations across verticals
You can’t compare apples to oranges. Each beat has its own unique challenges and advantages. Content for each vertical has to be produced with expert-level knowledge of how publishers within each vertical behave.
We selected the following common verticals for analysis:
Health and fitness
Travel
Sex and relationships
Finance
Technology
Sports
Food and drink
Across the entire sample of 626 content projects, a project received on average 23 dofollow links and 88 press mentions in total. Some individual vertical averages didn't deviate much from these overall figures, while other niches did.
Of course, you can’t necessarily expect these numbers when you just start dipping your toes in content marketing or digital PR. It’s a long-term investment, and it usually takes at least six months to a year before you get the results you’re looking for.
A "press mention" refers to any time a publisher wrote about the campaign. A press mention could involve any type of link (dofollow, nofollow, simple text attribution, etc.). We also looked at dofollow links individually, as they provide more value than a nofollow link or text attribution. We excluded campaigns that went "viral" and performed well above the norm from the calculation, so as not to skew the averages higher.
Based on averages from these 626 campaigns, are your performance expectations too high or too low?
Vertical-specific content considerations
Of course, there are universal principles that you should apply to all content no matter the vertical. The data needs to be sound. The graphic assets need to be pleasing to the eye and easy to understand. The information needs to be surprising and informative.
But when it comes to vertical-specific content considerations, what should you pay attention to? What tactics or guidelines apply to one niche that you can disregard for other niches? I solicited advice from the senior team at Fractl and asked what they look out for when making content for different verticals. All have several years of experience producing and promoting content across every vertical and niche. Here’s what they said:
Sex and dating
For content relating to sex and relationships, it’s important to err on the side of caution.
“Be careful not to cross the line between ‘sexy’ content and raunchy content,” says Angela Skane, Creative Strategy. “The internet can be an exciting place, but if something is too out-there or too descriptive, publishers are going to be turned off from covering your content.”
Be especially aware of anything that could be construed as misogynistic or that pits women against each other. It's likely not the message your client will want to promote, anyway.
Finance
Given that money is frequently touted as one of the topics to avoid in polite dinner conversation, there's no doubt that talking and thinking about money evokes a lot of emotion in people.
“Finance can seem dry at first glance, but mentions of money can evoke strong emotions. Tapping into financial frustrations, regrets, and mistakes makes for highly entertaining and even educational content,” says Corie Colliton, Creative Strategy. “For example, one of my best finance campaigns featured the purchases people felt their partners wasted money on. Another showed the amount people spend on holiday gifts — and the number who were in debt for a full year after the holidays as a result.”
Emotion is one of the drivers of social sharing, so use it to your advantage when producing finance-related content.
We also heard from Chris Lewis, Account Strategy: “Relate to your audience. Readers will often try to use financial content marketing campaigns as a way to benchmark their own financial well-being, so giving people lots of data about potential new norms helps readers relate to your content.”
People want to read content and be able to picture themselves within it. How do they compare to the rest of America, or their state, or their age group? Relatability is key in finance-related content.
Sports
A little healthy competition never hurt anyone, and that’s why Tyler Burchett, Promotions Strategy, thinks you should always utilize fan bases when creating sports content: “Get samples from different fan bases when possible. Writers like to pit fans against each other, and fans take pride in seeing how they rank.”
Food and drink
According to Chris Lewis, don’t forgo design when creating marketing campaigns about food: “Make sure to include good visuals. People eat with their eyes!”
If the topic for which you’re creating content typically has visual appeal, it’s best to take advantage of that to draw people into your content. Have you ever bought a recipe book that didn’t include photos of the food?
Technology
Think tech campaigns are just about tech? Think again. Matt Gillespie, Data Science, says: “Technology campaigns are always culture and human behavior campaigns. Comparing devices, social media usage, or more nuanced topics like privacy and security, can only resonate with a general audience if it ties to more common themes like connection, safety, or shared experience — tech savvy without being overly technical.”
Travel
When creating content for travel, it’s important to make sure there are actionable takeaways in the content. If there aren’t, it can be hard for publishers to justify covering it.
“Travel writers love to extract ‘tips’ from the content they're provided. If your project provides helpful information to travelers or little-known statistics on flights and amenities, you're likely to gain a lot of traction in the travel vertical,” says Delaney Kline, Brand Promotions. “Come up with these ideal statistics before creating your project and use them as a template for your work.”
Health and fitness
In the health and wellness world, it can seem like everyone is giving advice. If you’re not a doctor, however, err on the side of caution when speaking about specific topics. Try not to pit any particular standard against another. Be careful around diet culture and mental health topics, specifically.
“Try striking a balance between physical and mental well-being, particularly being careful to not glorify or objectify one standard while demeaning others,” says Matt Gillespie, Data Science. “Emphasize overall wellness as opposed to focus on a single area. In this vertical, you need to be especially careful with whatever is trending. Do the legwork to understand the research, or lack thereof, behind the big topics of the moment.”
Improving content in any vertical
While you can certainly tailor your content production and promotion to your specific niche, there are also some guidelines you can follow to improve the chances that you’ll get more media coverage for your content overall.
Create content with a headline in mind
When you begin mapping out your content, identify what you want the outcome to look like. Before you even begin, ask yourself: what do you want people to learn from your content? What are the elements of the content you’re producing that journalists will find compelling for their audiences?
For example, we wrote a survey in which we wanted to compare the levels of cooking experience across different generations. We hypothesized that we’d see some discrepancies between boomers and millennials specifically, and given that millennials ruin everything, it was a good time to join the discussion.
As it turns out, only 64% of millennials could correctly identify a butter knife. Publishers jumped at the stats revealing millennials have a tough time in the kitchen. Having a thesis and an idea of what we wanted the project to look like in advance had a tremendous positive impact on our results.
Appeal to the emotionality of people
In past research on the emotions that make content go viral, we learned that negative content may have a better chance of going viral if it is also surprising. Nothing embodies this combination of emotional drivers better than a project we did for a travel client, in which we used germ swabs to determine the dirtiest surfaces on airplanes.
This campaign did so well (and continues to earn links to this day) that it’s actually excluded from our vertical benchmarks analysis as we consider it a viral outlier.
Why did this idea work? Most people travel via plane at least once a year, and everyone wants to avoid getting sick while traveling. So, a data-backed report like this one that also yielded some click-worthy headlines is sure to exceed your outreach goals.
Evergreen content wins (sometimes)
You may have noticed from the analysis above that, of the seven topics we chose to look at, the sports vertical has the lowest average dofollow links and total press mentions of any category.
For seasoned content marketers, this is very understandable. Unlike the other verticals, the sports beat is an ever-changing and fast-paced news cycle that’s hard for content marketers to have a presence in. However, for our sports clients we achieve success by understanding this system and working with it — not trying to be louder than it.
One technique we’ve found that works for sports campaigns (as well as other sectors with fast-paced news cycles such as entertainment or politics) is to come up with content that is both timely and evergreen. By capitalizing on the current interests around major sporting events (timely) and creating an idea that would work on any given day of the year (evergreen) we can produce content that's the best of both worlds, and that will still have legs once the timeliness wears off.
In a series of campaigns for one sports client, we took a look at the evolution of sports jerseys and chose teams with loyal fan bases such as the New York Yankees, Carolina Panthers, Denver Broncos, and Chicago Bears.
The sports niche has an ongoing, fast-paced news cycle that changes every day, if not every hour. Reporters are busy covering by-the-minute breaking news, games, statistics, rankings, trades, personal player news, and injuries. This makes it one of the most challenging verticals to compete in. By capitalizing on teams of interest throughout the year, we were able to squeeze projects into tight editorial calendars and earn our client some press.
For example, timing couldn’t have been better when we pitched “Evolution of the Football Jersey”. We pitched this campaign to USA Today right before the tenacious playoffs in which the Steelers and the Redskins played. Time was of the essence — the editor wrote and published this article within 24 hours and our client enjoyed a lot of good syndication from the powerful publication. In total, the one placement resulted in 15 dofollow links and over 45 press mentions. Not bad for a few transforming GIFs!
Top it off with the best practices in pitching
If you have great content and you have a set of realistic expectations for that content, all that’s left is to distribute it and collect those links and press mentions.
In a survey of over 500 journalists in 2019, I asked online editors and writers what their biggest PR pitch pet peeves were. When you conduct content marketing outreach, avoid their top-listed pet peeves and you'll be good to go.
While you might get away with sending one too many follow-ups, most of the offenses on this list are just that — totally offensive to the writer you’re trying to pitch.
Avoid mass email blasts, personalize your pitch, and triple-check that the person you're contacting is receptive to your content before you hit send.
Conclusion
While there are certainly some characteristics that all great content should have, there are ways to increase the chances your content will be engaging within a specific vertical. Research what your particular audience is interested in, and be sure to measure your results realistically based on how content generally performs in your space.
As a local business consultant, I know that deeper marketing insights can be discovered when you set aside formality and share experiences: a moment, a laugh, a common bond.
When I’m looking for ways to make life easier for a client, I sometimes reflect on ancient practices like yoga, tai chi, and mindful breathing, which are increasingly understood as beneficial to human health. For a space in time, they reduce the complex world we live in to a simpler one where being, breath, movement, and focus bring the practitioner to a more intuitive state.
Local marketing agencies can empathize with the complex world their clients inhabit. Local business owners must manage everything from rent and employee benefits to customer service, business reviews, web content, and online listings. When you take on a new client, you expect them to onboard a ton of information about marketing their brand online. Sometimes, the most basic motivations go unaddressed and get lost in assumptions and jargon — instead of decreasing client stress for your least technical clients, you can accidentally increase it.
Today, I'll help you create an intuitive space by sharing five simple meditation exercises you can use with your agency's clients. Instead of talking in acronyms like SEO, CTR, USP, and GMB, let's relax with clients by relating successful local search marketing practices to experiences people at any level of technical proficiency already understand.
Heart
For a local business owner, there is no more important quality than having their heart in the right place when it comes to their motivation for running a company.
Yes, all of us work to earn money, but it’s the dedication to serving others that is felt by customers in every interaction with them. When customers feel that a business is there for them, it establishes the loyalty and reputation that secure local search marketing success.
Heart meditation
Close your eyes for a few seconds and think of a time in your life when you most needed help from a business. Maybe you needed a tow truck, a veterinarian, a dentist, or a plumber. You really needed them to understand your plight, deliver the right help, and treat you as an important person who is worthy of respect. Whether you received what you required or not, remember the feeling of need.
Now, extend that recognition beyond your own heart to the heart of every customer who feels a need for something your client can offer them.
A business owner with their heart in the right place can powerfully contribute to local search marketing by:
Running a customer-centric business.
Creating customer guarantees that are fair.
Creating an employee culture of respect and empowerment that extends to customers.
Creating a location that is clean, functional, and pleasant for all.
Honestly representing their products, services, location, and reputation.
Refraining from practices that negatively impact their customers and reputation.
Participating positively in the life of the community they serve.
A good local search marketing agency will help the business owner translate these basics into online content that meets customer needs, local business listings that accurately and richly represent the business, and genuine reviews that serve as a healthy and vital ongoing conversation between the brand and its customers. A trustworthy agency will ensure avoidance of any tactics that pollute the Internet with spam listings, spam reviews, negative attacks on competitors, and negative impacts on the service community. An excellent agency will also assist in finding and promoting community engagement opportunities, helping to win desirable online publicity from offline efforts.
Ear
Local business success is so linked to the art of listening, I sometimes think Google should replace their teardrop map markers with little ears. In the local SEO world, there are few things sadder than seeing local business profiles filled with disregarded reviews, questions, and negative photos. (Someone cue “The Sound of Silence”.)
From a business perspective, the sound of branded silence is also the sound of customers and profits trickling away. Why does it work this way? Because only 4% of your unhappy customers may actually make the effort to speak up, and if a business owner is not even hearing them, they’ve lost the ability to hear consumer demand. Let’s make sure this doesn’t happen.
Ear meditation
Close your eyes for a few seconds and listen closely to every noise within the range of your hearing. Ask yourself, “Where am I?”
The sound of typing, phone calls, and co-workers chatting might place you in an office. Sliding doors, footsteps on linoleum, and floor staff speaking might mean you’re at your client's brick-and-mortar location. Maybe it’s birdsong outside and the baby in their crib that tell you you’re working from home today. Listen to every sound that tells you exactly where you are right now.
Now, commit to listening with this level of attention and intention to the signals of customer voices, telling you exactly where a local brand is right now in terms of faults and successes.
A business owner who keeps their ears open can actively gauge how their business is really doing with its customers by:
Having one-on-one conversations with customers.
Recording and analyzing phone conversations with customers.
Reading reviews on platforms like Google My Business, Yelp, Facebook and sites that are specific to their industry (like Avvo for lawyers or Healthgrades for physicians).
Reading the Q&A questions of customers on their Google Business Profile.
Reading mentions of their brand on social media platforms like Twitter, Facebook, and Instagram.
Reading the responses to surveys they conduct.
Reading the emails and form submissions the company receives.
A good local search marketing agency will help their client amass, organize, and analyze all of this sentiment to discover the current reputation of the business. From this information, you and your client can chart a course for improvement. Consider that, in this study, a 1.5 star improvement in online reputation increased consumer activity by 10%-12% and generated 13,000 more leads for the brands included. The first step to a better reputation is simply listening.
Eye
When your clients choose their business locations, they weigh several factors. They compare how the mantra of “location, location, location” matches their budget, and whether a certain part of town is lacking something their business could provide. They also look at the local competitors to see if the competition would be hard to beat, or if they could do the job better. Success lies in truly seeing the lay of the land.
Local search mirrors the real world. The market on the Internet is made up of the physical locations of your clients’ customers at the time they search for what your client has to offer.
Eye meditation
You already know most of the businesses on your street, and many of them in your neighborhood. Now, with eyes wide open, start searching Google for the things your listening work has told you customers need. Where appropriate, include attributes you’ve noticed them using like “best tacos near me”, “cheapest gym in North Beach”, or “shipping store downtown.”
See how your client is ranking when a person does these types of searches while at their location. Now, walk or drive a few blocks away and try again. Go to the city perimeter and try again. Where are they ranking, and who is outranking them, as you move about their geographic market?
A local business keeping its eyes open never makes assumptions about who its true competitors are or how its customers search. Instead, it:
Regularly assesses the competition in its market, taking into account the distance from which customers are likely to come for goods and services.
Regularly reviews materials assembled in the listening phase to see how customers word their requests and sentiments.
Makes use of tools to analyze both markets and keyword searches.
A good local search marketing agency will help with the tools needed for market and search language analysis. These findings can inform everything from what a client names their business, to how they categorize it on their Google My Business listing, to what they write about to draw in customers from all geographic points in their market. Clear vision simultaneously enables you to analyze competitors who are outranking your client and assess why they’re doing so. It can empower your client to report spammers who are outranking them via forbidden tactics. An excellent agency will help their client see their competitive landscape with eyes on the prize.
Mind
With hearts ready for service, ears set on listening, and eyes determined to see, you and your client have now taken in useful information about their brand and the customers who make up their local market. You know now whether they’re doing a poor, moderate, or exceptional job of fulfilling needs, and are working with them to strategize next steps. But what are those next steps?
Mind meditation
Sit back comfortably and think of a time a business completely surprised you, or a time when an owner or employee did something so unexpectedly great, it convinced you that you were in good hands. Maybe they comped your meal when it wasn’t prepared properly, or special-ordered an item just for you, or showed you how to do something you’d never thought of before.
Recall that lightbulb moment of delight. Ask yourself how your client’s brand could surprise customers in memorable ways they would love. Create a list of those ideas.
A creative local business gives full play to the awesome imaginative powers of the brain. It gives all staff permission to daydream and brainstorm questions like:
What is something unexpected the business could do that would come as a delightful surprise to customers?
What is the most impactful thing the business could do that would be felt as a positive force in the lives of its customers?
What risks can the business take for the sake of benevolence, social good, beauty, renown, or joy?
A good local search marketing agency will help sort through ideas that could truly differentiate their clients from the competition and bring them closer to making the kinds of impressions that turn local brands into household names. An excellent agency will bring ideas of their own. Study “surprise and delight marketing” as it’s done on the large, corporate scale, and get it going at a local level like this small coffee roaster in Alexandria, Va. selling ethical java while doubling as funding for LGBTQ+ organizations.
Mouth
“Think before you speak” is an old adage that serves well as a marketing guideline. Another way we might say it is “research before you publish”. With heart, ear, eye, and mind, you and your client have committed, collected, analyzed, and ideated their brand to a point where it’s ready to address the public from a firm foundation.
Mouth meditation
Open your favorite word processor on your computer and type a few bars of the lyrics to your favorite song. Next, type the first three brand slogans that come to your mind. Next, type a memorable line from a movie or book. Finally, type out the words of the nicest compliment or best advice someone ever gave you.
Sit back and look at your screen. Look at how those words have stuck in your mind — you remember them all! The people who wrote and spoke those words have indelibly direct-messaged you.
How will you message the public in a way that’s unforgettable?
A well-spoken local business masters the art of face-to-face customer conversation. In-store signage and offline media require great words, too, but local search marketing will take spoken skills onto the web, where they'll be communicated via:
Every page of the website
Every article or blog post
Social media content
Review responses
Answers to questions like Google Business Profile Q&A
Images on the website, business listings, and third-party platforms like Google Images and Pinterest
Videos on the website, YouTube, and other platforms
A good local search marketing agency will help their client find the best words, images, and videos based on all the research done together. An excellent agency will help a local business move beyond simply being discovered online to being remembered as a household name each time customer needs arise. An agency should help their clients earn links, unstructured citations, and other forms of publicity from those research efforts.
Determine to help your client be the "snap, crackle, pop", "un-Cola", "last honest pizza" with everything you publish for their local market, and to build an Internet presence that speaks well of their business 24 hours a day.
Closing pose
One of the most encouraging aspects of running and marketing a local business is that it’s based on things you already have some life experience doing: caring, listening, observing, imagining, and communicating.
I personally could be better at technical tasks like diagnosing errors in Schema, configuring Google Search Console for local purposes, or troubleshooting bulk GMB uploads. I can work at improving in those areas, but I can also work at growing my heart, ear, eye, mind, and mouth to master serving clients and customers.
Business is technical. Business is transactional. But good business is also deeply human, with real rewards for well-rounded growth.