I’m not a literary scholar, but I believe it was Hamlet who said “to have a featured snippet or not to have a featured snippet?” Ever since featured snippets came onto the scene, sites have been trying to secure them.
My team and I wanted in on this craze. Throughout our journey of research, testing, failure, and success, we found some interesting pieces of information that we wanted to share with the community. I’ll walk you through what we did and show you some of our results (though I can’t share traffic numbers).
A featured snippet is the box that appears at the top of the search results page, succinctly and accurately answering your query and citing a website.
Why are featured snippets important?
A featured snippet is important because it represents an additional SERP feature that you can secure. Usually located at the very top of the results page, featured snippets offer you greater visibility to searchers and can boost brand recognition.
Our featured snippet plan of attack
1. Research, research, and more research on how to pull this off
Did we implement everything from what we learned during this discovery phase into our featured snippet strategy? No. Are we perfect at it now after a year and a half of practicing this? No, no, no. We are getting better at it, though.
2. Identify keywords we wanted to target
We originally started out focusing on big “head” keywords. These represented terms that had indeterminate searcher intent. The first head term that we focused on was HRIS. It stands for Human Resources Information System — sexy, right?
Note: Looking back on this, I wish we had focused on longer tail keywords when testing out this strategy. It's possible we could have refined our process faster focusing on long tail keywords instead of the large head terms.
3. Change how we structure our on-page content
We worked closely with our writing team to update how we lay out content on our blog. We changed how we used H2s, H3s (we actually used them now!), lists, and so on to help make our content easier to read for both users and robots.
In most of the content where we’re trying to rank for a featured snippet, we have an H2 in the form of a question. Immediately after the H2, we try and answer that question. We’ve found this to be highly successful (see pictures later on in the post). I wish I could say that we learned this tactic on our first try, but it took several months before this dawned on us.
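As a rough illustration of that pattern (the question and answer text here are invented for the example), the on-page structure is just a question-style H2 followed immediately by a concise answer paragraph. A small Python helper can sketch the HTML:

```python
import html

def snippet_block(question: str, answer: str) -> str:
    """Render the question-as-H2 pattern: an H2 posing the query,
    immediately followed by a concise paragraph answering it."""
    return (
        f"<h2>{html.escape(question)}</h2>\n"
        f"<p>{html.escape(answer)}</p>"
    )

block = snippet_block(
    "What is an HRIS?",
    "An HRIS (Human Resources Information System) is software that "
    "stores and manages employee data such as payroll, benefits, and "
    "attendance.",
)
print(block)
```

The key is that the answer lands directly under the heading, with no intervening content for the crawler to wade through.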
4. Measure, test, and repeat
The first blog post we tried this out on was our “What is an HRIS” article. Overall, the post was a success: it ranked for the head term we were going for (HRIS), but we didn’t win a featured snippet. We deemed it a slight failure and went back to work.
This is where the fun started.
Featured snippet successes
We discovered a featured snippet trigger that we could capitalize on — mainly by accident. What was it?
Is.
Really. That was it. Just by adding that word to some of our content, we started to pick up featured snippets. We did it more and more, and we were winning more and more featured snippets! I believe it was this strategic HR example that clued us in to the “is” trigger.
So we kept it up.
What did we learn?
I want to preface this by saying that all of this is anecdotal evidence. We haven’t looked at several million URLs, run it through any fancy number-crunching, or had a statistician look at the data. These are just a few examples that we’ve noticed that, when repeated, have worked for us.
Blog/HR glossary - We found that it was easier for us to gain featured snippets from our blog or our glossary pages. It seemed like no matter what optimizations we made on our product pages, we weren’t able to make it happen.
Is - No, not the clown from the Stephen King novel. “Is” seemed to be the big trigger word for winning featured snippets. During our audit, we did find some examples of list featured snippets, but the majority were paragraphs and the trigger word was "is."
Definitions - We saw that a definition of the head term we were targeting was usually what won the snippet. Our on-page copy would have an H2 with the keyword (e.g. What is Employee Orientation?) and the paragraph immediately after would answer that question.
Updating old posts - One surprising thing we learned is that when we went back to old posts and tried adding the “is” trigger word, we didn’t see a change — even if we added a good amount of new content to the page. We were only able to grab featured snippets with new content that we created. Also, when we updated large amounts of content on a few pages that had featured snippets, we lost them. We made sure to not touch the sections of the page that the snippet was pulling from, but we still lost the snippet (some have come back, but some are still gone).
Conclusion
A few final things to note:
First, while these examples are anecdotal, I think they show some practices that anyone hoping to capture featured snippets can adopt.
Second, this process played out over a 12–18 month period, and we’re still evolving what we think is the best approach for us and our content team.
Third, we had a lot of failures with this. I showed you one example, but we’ve had many (short-form content, long-form content, glossary terms, blog posts, etc.) that didn’t work. We just kept measuring, testing, and optimizing.
Lastly, I need to give a shout out to our writing team. We massively disrupted their process with this and they have been phenomenal to work with (effective interdepartmental relationships are crucial for any SEO project).
Let me know what's worked for you or if you have any questions by leaving a comment down below.
Note: On January 23, 2020 Google announced that featured snippets would no longer be listed twice on the first page. For more information, you can check out this thread from Google Search Liaison. This may change how valuable featured snippets are to companies and the number of clicks a listing gets. Before you start to panic, remember it will be important to watch and measure how this affects your site before doing anything drastic. If you do decide to go nuclear and remove your featured snippets from the results, check out this documentation.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
A rising tide lifts all ships — and it's a similar story with increased site authority. What factors are affected as you improve PageRank or Domain Authority, and how? In today's Whiteboard Friday, Cyrus details seven SEO processes that are made easier by a strong investment in link building and growing your authority.
Video Transcription
Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. I'm Cyrus Shepard. Quick Whiteboard this week. I want to talk about links.
We know in SEO we love links. Everybody wants links. But why? What do links do for you? They do a surprising amount for you that we sometimes don't realize. So the title of today's Whiteboard, "7 SEO Processes That Get Easier with Increased PageRank and Domain Authority." So why did we choose PageRank and Domain Authority?
Well, these are both algorithms that measure link power: both the quantity and the quality of links. PageRank is Google's algorithm for ranking web pages based on popularity and importance. Domain Authority (which Google doesn't use, just to be clear) is a Moz metric that measures both link quantity and quality.
For our purposes, we can basically use them in the same conversation. We're talking about the power of your links.
1. Ranking ability
The first thing that everybody knows about is links help you rank. They help you rank in many, many ways. You can get higher rankings. You can attack more keywords, but most importantly, you can attack more competitive keywords.
A good exercise I like to do when I'm trying to see if I can rank for a keyword: simply Google it, check the Page Authority (a very similar metric) of all the top-ranking pages, compare that to the Page Authority of your own page, and you'll have a pretty good idea of whether you have the ability to rank for that keyword.
2. Crawl budget
But then we get into the nitty-gritty, the other benefits of having that link equity, one of the most important being crawl budget.
When you have more link equity, Google will crawl more of your pages. If you only have a handful of links and a million pages on your website, it's going to be very difficult to get Google to crawl and index all those million pages. If you're eBay or Amazon or Google, or a site with a Domain Authority of 100, yes, you might be able to get Google to crawl those million pages.
3. Indexation speed
Google will also crawl them faster. You may get Google to crawl your pages with low Domain Authority, but it's going to take a while for Google to visit those again. So then we get into the idea of indexation speed. With a higher Domain Authority, Google is going to crawl and index your content typically much faster than they would without.
So if you have a page that you've updated recently, you're going to see Google update it quicker the more authority that page has. Also you're going to see this in the SERPs. If you have outdated title tags or meta descriptions, you can ask Google to crawl it via the Submit URL tool. But generally, the more authority a page has, the more incoming link power, you're going to see those things updated so much quicker than you would with low link equity.
4. More powerful links
This is my favorite one. With increased link equity, your own links become more powerful, and this gives you incredible ranking power because your internal links, that you're linking to yourself, become more powerful with that link equity. So it makes everything easier to rank. The best link building you can do when you have high authority is linking to yourself, and it's so easy.
But also the links that you link out to other people also become more valuable, which makes you a more attractive target.
5. Insulation from bad links
My friend Everett Sizemore came up with that term, "insulation." With better link equity, you're somewhat protected from a handful of bad links. If you have low link equity and you get a bunch of spam links to your site, your risk of penalization or being impacted by negative SEO is pretty high.
But if you have a million links, a handful of bad links just aren't going to hurt you. A good way to think about this is ratios, because, of course, anybody can get penalized. Anybody can suffer the consequences of bad links. But if those bad links only make up a tiny portion, meaning a small ratio, then you are somewhat insulated by the impact of those bad links.
6. Less over-optimization
Now Google says they don't have an over-optimization penalty. But anecdotally, many SEOs understand that if you're a small site just starting out, it's very easy to over-optimize for keywords with exact match anchor text and not rank. The key, usually, is that in SEO you want a lot of variety.
With a lot of links, that variety is much easier to get, and you have much less risk of over-optimization in linking internally with exact match anchor text. You can get away with a lot more with higher Domain Authority than you can with less Domain Authority. That's kind of the key to this whole thing. With higher Domain Authority, you just get away with a lot more. It's the idea of the rich getting richer.
7. The flywheel effect
Rand Fishkin, our friend, likes to talk about the flywheel effect. When you have more links, everything gets easier. When you start ranking and people start seeing you in the SERPs, you're going to get more links from that content, and more links are going to equal more ranking and the wheel is just going to keep turning and turning.
More people want to link to you and amplify you and work with you. You're also going to get a lot more spam requests and link requests and things like that, so it isn't all fun. But generally, the more Domain Authority you have, the more PageRank you have, the easier life is going to get, and you just want to start building it up day after day after day. So, like I said, a quick and easy Whiteboard Friday this week.
Hope you enjoyed it. We'll talk to you next time. Thanks, everybody.
Without a doubt, it is our job as SEOs to keep an eye on the future and anticipate what Google is planning, testing, or looking to drop on our doorsteps. Over the past 12 months alone, we have seen several changes in Google Search — each impacting how we plan, implement, and report on campaigns.
In this article, I will take a look at what is in store for SEO in 2020 and how these factors will change the way we formulate strategies throughout the next year and beyond.
Artificial intelligence will continue to evolve
Over the past half-decade, artificial intelligence has become a pioneering force in the evolution of SEO.
In 2015, for example, we were introduced to RankBrain, the machine learning-based search algorithm that helps Google push more relevant results to users. Although RankBrain is coming up on its fifth birthday, we are only now catching early glimpses of how artificial intelligence will dominate SEO in the coming years.
The most recent step in this progression of artificial learning is, of course, the introduction of Bidirectional Encoder Representations from Transformers (BERT), which Google announced at the end of October. For those who missed it, BERT is Google’s neural network-based technique for natural language processing, and it’s important because it deals with the very fundamentals of how people search. Google itself says that the algorithm represents “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”
Affecting one in ten searches, BERT gives Google a better understanding of how language is used and helps it comprehend the context of individual words within searches. The important thing to know about BERT (and also RankBrain) is that you cannot optimize for it.
There's nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.
BERT is just one signal of how Google understands language, but it is one of the most important in the search engine’s arsenal. This means that now more than ever, webmasters and SEOs alike must focus their efforts on creating the most useful, natural, and highest-quality content. Quite simply, as Danny Sullivan says, “write content for users.”
It’s also worth understanding how BERT interprets questions, which you can find out more about in the Whiteboard Friday episode below.
Voice search is here to stay
It’s hard to imagine at the dawn of 2020, but when voice search was released in 2012 many assumed it would be just another project consigned to the ever-growing Google graveyard.
Today, however, we know so much more about the technology and, thanks to schema.org, where it is likely to go in the future. The adoption rate is slower than predicted, but it has nevertheless leaked into our lives, so we must not completely ignore voice search.
Schema markup
A new form of markup is released nearly every month, one of the latest developments being markup for movies. Although this might seem insignificant, the fact that we are now seeing markup for films shows just how granular and far-reaching structured data has become.
With smart speakers now numbering 120 million in the US alone, webmasters should be taking the time to investigate where schema can be placed on their website so they can take advantage of the 35.6 million voice search commands taking place every month. What’s more, website markup has a monumental influence on featured snippets, which can be highly lucrative for any website. Take a look at this Moz guide for more information on how voice search influences featured snippets.
Speakable
If you’re in the US, it’s also worth noting that Speakable (BETA) is used by Google Assistant to answer people’s questions about specific topics. The assistant can return up to three relevant articles and provide audio playback using text-to-speech markup. Implementing such markup can be highly lucrative for news sites, because when the assistant provides an answer, it also attributes the source and sends the full article URL to the user's mobile device. If you’re a news site that publishes in English but doesn’t yet have Speakable markup implemented, you can read up on both the technical considerations and content requirements necessary for eligibility.
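For illustration only (the URL, page name, and CSS selectors below are invented), Speakable markup is ordinary JSON-LD embedded in the page; here it's assembled with Python's json module:

```python
import json

# Hypothetical article URL and selectors -- adjust to your own markup.
speakable_ld = {
    "@context": "https://schema.org",
    "@type": "WebPage",
    "name": "Example news article",
    "url": "https://www.example.com/news/example-article",
    "speakable": {
        "@type": "SpeakableSpecification",
        # CSS selectors pointing at the sections suitable for audio playback
        "cssSelector": [".headline", ".article-summary"],
    },
}

# This string would be embedded in a <script type="application/ld+json"> tag.
rendered = json.dumps(speakable_ld, indent=2)
print(rendered)
```

The `cssSelector` entries should point at short, self-contained sections (headline, summary) that read well aloud.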
Google Actions
Actions on Google, a development platform for Google Assistant, is also worth your consideration. It allows the third-party development of "actions" — applets for Google Assistant that provide extended functionality. Actions can be used to get things done by integrating your content and services with the Google Assistant.
Actions allow you to do a number of things:
Build Actions to ensure Google Assistant uses your apps
Allow users to search for and engage with your app
Provide your app as a recommendation for user queries
Check out this fantastic article by Andrea Vopini about how to optimize your content for Google Assistant.
Google is heavily invested in using entities
Entities aren’t something that you hear SEOs talking about every day, but they are something Google is putting a lot of resources into. Put simply, Google itself states that entities are “a thing or concept that is singular, unique, well-defined, and distinguishable.”
Entities don’t have to be something physical; they can be something as vague as an idea or a color. As long as something is singular, unique, distinguishable, and well-defined, it is an entity.
As you can see, Moz shows up in the knowledge panel because the company is an entity. If you search the Google Knowledge Graph API for the company name, you can see how Google understands it:
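The Knowledge Graph Search API is queryable over plain HTTP. As a minimal sketch (the API key is a placeholder; you'd need your own from Google Cloud), the request URL for an entity lookup can be built like this:

```python
from urllib.parse import urlencode

# The public Knowledge Graph Search API endpoint.
ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"

def kg_search_url(query: str, api_key: str, limit: int = 1) -> str:
    """Build a Knowledge Graph Search API request URL for an entity lookup."""
    params = urlencode({"query": query, "key": api_key, "limit": limit})
    return f"{ENDPOINT}?{params}"

url = kg_search_url("Moz", "YOUR_API_KEY")
print(url)
```

Fetching that URL returns JSON-LD describing the entity: its name, type(s), description, and a confidence score for the match.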
But what does this mean for SEOs?
In 2015, Google submitted a patent named “Ranking Search Results Based On Entity Metrics,” which is where the above entity description is sourced from. Although few patents are worth getting excited about, this one caused a stir in the technical SEO scene because it takes machine learning to an entirely new level and allows Google to accurately calculate the probability of user intent, thus giving it an understanding of both user language and tone. What’s more, entities place a reduced reliance on links as a ranking factor, and depending on what your SEO strategy is, that could result in the need for big campaign changes.
The most important aspect you will need to consider is how Google understands the entities on your website.
For example, if your site sells shoes, you need to think about how many different types, colors, sizes, brands, and concepts exist for your shoes. Each shoe will represent a different entity, which means you must consider how to frame each product so that it meets the expectations of users as well as the learning capabilities of Google — which is where we meet markup once again.
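One way to surface those distinguishing attributes is Product structured data, so each variant is described explicitly rather than left for Google to infer. A hedged sketch (the product name, brand, and price are invented for illustration):

```python
import json

# Illustrative product data -- names, sizes, brand, and price are invented.
product_ld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trailblazer Running Shoe",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "color": "Blue",
    "size": "10",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embedded in a <script type="application/ld+json"> tag on the product page.
print(json.dumps(product_ld, indent=2))
```

Each color/size variant would get its own block (or a variant-aware structure), so the attributes that make the entity distinguishable are machine-readable.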
Sites themselves can also become entities, and that provides huge rewards as they appear in the Knowledge Panel, which I will discuss next.
The knowledge panel will be important for personalities and brands
Although Google’s Knowledge Graph was launched way back in 2012, its expansion since then means it is still a core part of the search matrix and one that will reach far into the next decade.
Closely tied with featured snippets and rich results, earlier last year Google began allowing entities to claim their own knowledge panel, giving them access to edit and control the information presented to users in search results. They can make specific requests, such as changing the featured image, panel title, and social profiles provided within the panel.
The benefits of claiming your knowledge panel are numerous. They help users gain quick access to your site, which thanks to the Knowledge Graph, displays trust and authority signals. Knowledge panels also provide brands and personalities with the ability to control what objective information is shown to users. However, there are still many brands that have yet to claim their own panels.
You can claim your business’s knowledge panel in a few easy steps:
As you can see from the above examples, being in the Knowledge Graph can improve trust and add authenticity to your business or personal brand, as well as providing additional visibility. But it's easier said than done.
Unless you're a recognized, famous person or brand, claiming space in the Knowledge Graph is going to be difficult. Having a Wikipedia page can be enough, but I don't recommend creating pages just to get there — it will get deleted and waste your effort. Instead, build brand mentions and authority around your name gradually. While having a Wikidata page can be helpful, it's not guaranteed to work. The goal is to get Google to recognize you as a notable person or brand.
Queryless proactive predictive search is getting better
Google Discover was released in June of 2017, prompting a new kind of search altogether — one that is queryless. Discover is an AI-driven content recommendation tool that Google says reaches 800 million monthly active users.
Using the aforementioned Knowledge Graph, Google added an extra layer called the Topic Layer, which is engineered to understand how a user’s interest develops over time (this article by the University of Maryland offers an in-depth explanation of topic layers and models).
By understanding the many topics a user is interested in, Discover identifies the most accurate content to deliver from an array of websites.
But what does this mean for SEOs?
To appear in Discover, Google states that pages appear “if they are indexed by Google and meet Google News content policies. No special tags or structured data are required.” It ranks content based on an algorithm that inspects the quality of content alongside the interests of the user and the topic of the page in question. The exact formula is unknown; however, based on several studies and experiments, we now have a pretty good idea of how it works.
This screenshot from a presentation by Kenichi Suzuki highlights some of the factors that help pages appear in Discover.
According to Google, there are two ways to boost the performance of your content within Discover:
Post interesting content
Use high-quality images
As ever, ensure that you generate high quality content that is unique and creates a great experience for users. If your site tends to publish clickbait articles, the chance of those articles appearing in Discover is low.
Other tips for appearing in Discover would be to arrange your content semantically so that Google finds it easier to understand your work, and ensure that your website is technically proficient.
Like any form of search, you can use Google Search Console to see how well your articles are performing in Discover. You can find Discover stats under the performance section.
Google Discover analytics data is fairly new, and therefore limited. There isn't currently a native way to segment this traffic inside Google Analytics. To track user behavior data, this article provides a technique to track it inside Google Analytics.
We have yet to see the biggest changes in visual image search
It could be argued that the biggest change to image search happened in September 2018 when Google Lens rolled out. Not only did featured videos begin appearing in image search, but AMP stories and new ranking algorithms and tags were also released.
But while speaking at a Webmaster Meetup in New York last year, John Mueller shared that there will be major changes in image search in the coming year. Rather than merely viewing images, very soon people will use image search to accomplish goals, purchase products, and learn new information.
Google has always said that images should be properly optimized and marked, so if you have not started to add such data or information to your images, now is definitely the time to start.
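At a minimum, that means descriptive alt text plus matching image structured data. A small illustrative sketch (the file names, URL, and caption are invented):

```python
import json

# Descriptive alt text on the <img> tag itself.
img_tag = '<img src="/images/trail-run.jpg" alt="Runner on a forest trail at sunrise">'

# Matching ImageObject structured data for the same image.
image_ld = {
    "@context": "https://schema.org",
    "@type": "ImageObject",
    "contentUrl": "https://www.example.com/images/trail-run.jpg",
    "caption": "Runner on a forest trail at sunrise",
}

print(img_tag)
print(json.dumps(image_ld, indent=2))
```

Keeping the alt text and the structured data caption consistent gives crawlers two reinforcing signals about what the image depicts.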
In the past six months alone we have seen Google introduce small changes such as removing the “view image” function, as well as colossal changes, such as totally revamping image search for Desktop.
Furthermore, people don’t even have to search within it to see images anymore. It's common for the SERP to present a universal search result, which encompasses images, videos, maps, news, and shopping listings. The opportunity to appear in a universal (or blended) result is just another reason why properly tagged and marked images are so important.
Finally, Google has added visual image search attributes to search results. The interesting thing with this update is that these attributes are now available as image carousels within the main search results.
But what does this mean for SEOs?
With so much to play with, webmasters and SEOs should consider how they can take advantage of such changes, which could prove potentially very lucrative for the right sites — especially when you consider that 35% of Google product searches return transactions in as little as five days.
E-A-T doesn’t apply to every site — but it still matters
E-A-T (Expertise, Authoritativeness, Trustworthiness) is something every SEO should know back to front, but remember:
E-A-T isn’t a ranking factor
E-A-T is critical for Your Money or Your Life (YMYL) topics and pages
Although these two statements might seem contradictory, they make more sense when you consider what Google defines as YMYL.
According to Google’s Rater Guidelines, YMYL is a page or topic that “could potentially impact a person’s future happiness, health, financial stability, or safety.” This means that if your page has information that could potentially change a person’s life, it is considered YMYL and offering E-A-T is important. If your site is merely your personal collection of cat pictures, then showcasing authority or expertise is less critical.
But what does this mean for SEOs?
The issue, however, is that the majority of websites (and certainly the ones invested in SEO) are generally going to have YMYL pages or topics, but Google is taking big steps to ensure that low quality or questionable YMYL content is weeded out. As you might know, you can’t optimize for E-A-T because it isn’t an algorithm, but you can implement changes to make sure your site sends the right kind of quality signals to Google. This Moz article by Ian Booth and this guide by Lily Ray offer great tips for how to do that.
Topics and semantics over keywords
Google is putting less priority on both links and keywords, which is where topic modeling and semantics come into the conversation.
Google has become very clever at understanding what a user is searching for based on just a few basic words. This is thanks, in part, to topic modeling (as Google itself admitted in September 2018 when it introduced its “topic layer”). Indeed, this algorithm has a deep understanding of semantics and yearns to provide users with deep troves of information.
But what does this mean for SEOs?
This means that it has never been more important to create high quality, in-depth, and meaningful content for users — but you also need to think about information structure.
For example, if your site sells running shoes, you could create long-form educational pieces about how to choose shoes for specific running environments, athletic diets for runners, or tech accessory reviews. These articles could then be clustered into various topics. By clustering your topics into compartments through your website architecture, both users and crawlers can easily navigate and understand the content provided.
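As an illustration of that clustering (the topic names and article slugs here are invented), a semantically grouped URL structure might organize those articles like this:

```python
# Hypothetical site architecture: articles grouped into topic clusters,
# each cluster living under its own URL path.
clusters = {
    "choosing-shoes": [
        "trail-running-shoes-guide",
        "road-running-shoes-guide",
    ],
    "runner-nutrition": [
        "pre-race-meals",
        "hydration-basics",
    ],
}

def article_url(cluster: str, slug: str) -> str:
    """Map a topic cluster and article slug to a semantically grouped URL."""
    return f"https://www.example.com/running/{cluster}/{slug}/"

urls = [article_url(c, s) for c, slugs in clusters.items() for s in slugs]
for u in urls:
    print(u)
```

The path itself then communicates the topic hierarchy, and internal links within a cluster reinforce the semantic grouping for crawlers.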
Studies have also shown that Google’s crawlers prefer pages with semantic groupings and sites that are designed around topic modeling. This 2018 presentation by Dawn Anderson gives a brilliant insight into this. If you want to know more about topic modeling and semantic connectivity, check out this Whiteboard Friday by Rand Fishkin.
SERPs will continue to evolve
Over the past couple of years, we’ve seen search results evolve and transform like never before. In fact, they have changed so much that, in some cases, being placed first within the organic search results may not be the most lucrative position.
This is something that would have been unthinkable just a few short years ago (check out this Moz article from 2018, which works to quell the panic around zero-position SERPs).
With the introduction of Voice Search, rich results, rich snippets, knowledge panels, Google My Business, and updated Image Search results, SEOs now need to consider a whole new range of technical marketing strategies to appear in a multitude of organic search results.
It’s hard to know where Google is taking SERPs in the next year, but it is fair to say the strategies we use today for the search environment will likely be outdated in as little as six months.
Take, for example, the recent addition and subsequent removal of favicons in the SERPs; after backlash, Google reversed the change, proving we can never predict which changes will stick and which are blips on the radar.
But what does this mean for SEOs?
Ensure that your strategies are flexible and constantly prepare for changes in both your business sector (if you do not work within SEO) and the constantly evolving search environment. Pay attention to the seasonality of searches and use tools such as Google Trends to cover any out-of-season deficit that you may encounter.
You can use tools like Moz Keyword Explorer to help plan ahead and to create campaigns and strategies that provide useful traffic and lucrative conversions.
Conclusion
SEOs need to move away from the ideology that links and traditional search results should be priorities for an organic campaign. Although both still carry weight, without investment in technical strategy or willingness to learn about entities or semantic connectivity, no SEO campaign can reach its full potential.
The world of SEO in 2020 is bright and exciting, but it will require more investment and intelligent strategy than ever before.
It's a brand-new decade, rich with all the promise of a fresh start and new beginnings. But does that mean you should be doing anything different with regards to your SEO?
In this Whiteboard Friday, our Senior SEO Scientist Britney Muller offers a seventeen-point checklist of things you ought to keep in mind for executing on modern, effective SEO. You'll encounter both old favorites (optimizing title tags, anyone?) and cutting-edge ideas to power your search strategy from this year on into the future.
Video Transcription
Hey, Moz fans. Welcome to another edition of Whiteboard Friday. Today we are talking about SEO in 2020. What does that look like? How have things changed?
Do we need to be optimizing for favicons and BERT? We definitely don't. But here are some of the things that I feel we should be keeping an eye on.
☑ Cover your bases with foundational SEO
Titles, metas, headers, alt text, site speed, robots.txt, site maps, UX, CRO, Analytics, etc.
Covering your bases with foundational SEO will continue to be incredibly important in 2020: basic things like title tags, meta descriptions, alt text, all of the basic SEO 101 things.
There have been some conversations in the industry lately about alt text and things of that nature. When Google is getting so good at figuring out and knowing what's in an image, why would we necessarily need to continue providing alt text?
But you have to remember we need to continue to make the web an accessible place, and so for accessibility purposes we should absolutely continue to do those things. But I do highly suggest you check out Google's Vision API and play around with that to see how good they've actually gotten. It's pretty cool.
☑ Schema markup
FAQ, Breadcrumbs, News, Business Info, etc.
Schema markup will continue to be really important, FAQ schema, breadcrumbs, business info. The News schema that now is occurring in voice results is really interesting. I think we will see this space continue to grow, and you can definitely leverage those different markup types for your website.
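For example, FAQ markup is just a structured block of questions and answers. Here's a minimal, hypothetical sketch of FAQPage structured data, built as a plain object and serialized to the JSON-LD you'd place in a script tag (the question and answer text are placeholders, not recommended copy):

```javascript
// Minimal FAQPage structured data, serialized to the JSON-LD that
// would go inside a <script type="application/ld+json"> tag.
// Question/answer text here is placeholder content.
const faqSchema = {
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is a featured snippet?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A featured snippet is the answer box that appears at the top of a Google search results page."
      }
    }
  ]
};

const jsonLd = JSON.stringify(faqSchema, null, 2);
console.log(jsonLd);
```

You can validate output like this with Google's Rich Results Test before shipping it.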
☑ Research what matters for your industry!
Just to keep in mind, there's going to be a lot of articles and research and information coming at you about where things are going, what you should do to prepare, and I want you to take a strategic stance on your industry and what's important in your space.
While I might suggest page speed is going to be really important in 2020, is it for your industry? We should still worry about these things and still continue to improve them. But if you're able to take a clearer look at ranking factors and what appears to be a factor for your specific space, you can better prioritize your fixes and leverage industry information to help you focus.
☑ National SERPs will no longer be reliable
You need to be acquiring localized SERPs and rankings.
This has been the case for a while. We need to localize search results and rankings to get an accurate and clear picture of what's going on in search results. I was going to put E-A-T here and then kind of cross it off.
A lot of people feel E-A-T is a huge factor moving forward. Just for the case of this post, it's always been a factor. It's been that way for the last ten-plus years, and we need to continue doing that stuff despite these various updates. I think it's always been important, and it will continue to be so.
Writing naturally and clearly also helps optimize your text for natural language processing. It helps make it more accessible and friendly for BERT. While you can't necessarily optimize for something like BERT, you can just write really great content that people are looking for.
☑ Understand and fulfill searcher intent, and keep in mind that there's oftentimes multi-intent
One thing to think about this space is we've kind of gone from very, very specific keywords to this richer understanding of, okay, what is the intent behind these keywords? How can we organize that and provide even better value and content to our visitors?
One way to go about that is to consider Google houses the world's data. They know what people are searching for when they look for a particular thing in search. So put your detective glasses on and examine what is it that they are showing for a particular keyword.
Is there a common theme throughout the pages? Tailor your content and your intent to solve for that. You could write the best article in the world on DIY Halloween costumes, but if you're not providing those visual elements that you so clearly see in a Google search result page, you're never going to rank on page 1.
☑ Entity and topical integration baked into your IA
Have a rich understanding of your audience and what they're seeking.
This plays well into entities and topical understanding. Again, we've gone from keywords and now we want to have this richer, better awareness of keyword buckets.
What are those topical things that people are looking for in your particular space? What are the entities, the people, places, or things that people are investigating in your space, and how can you better organize your website to provide some of those answers and those structures around those different pieces? That's incredibly important, and I look forward to seeing where this goes in 2020.
☑ Optimize for featured snippets
Featured snippets are not going anywhere. They are here to stay. The best way to do this is to find the keywords that you currently rank on page 1 for that also have a featured snippet box. These are your opportunities. If you're on page 1, you're way more apt to potentially steal or rank for a featured snippet.
One of the best ways to do that is to provide really succinct, beautiful, easy-to-understand summaries, takeaways, etc., kind of mimic what other people are doing, but obviously don't copy or steal any of that. Really fun space to explore and get better at in 2020.
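As a rough sketch of that opportunity-finding step, here's how you might filter an exported keyword list for page-1 rankings where a snippet box exists but isn't yours. The field names and data shape are illustrative assumptions, not any particular tool's export format:

```javascript
// Find featured snippet opportunities: keywords where we already rank
// on page 1 (positions 1-10), a snippet box appears on the SERP, and
// we don't currently own it. Field names are illustrative.
function snippetOpportunities(keywords) {
  return keywords
    .filter(k => k.position <= 10 && k.hasSnippetBox && !k.ownsSnippet)
    .sort((a, b) => a.position - b.position); // best-ranked first
}

const sample = [
  { keyword: "what is hris", position: 3,  hasSnippetBox: true,  ownsSnippet: false },
  { keyword: "hris software", position: 14, hasSnippetBox: true,  ownsSnippet: false },
  { keyword: "hr systems",   position: 2,  hasSnippetBox: false, ownsSnippet: false },
  { keyword: "payroll basics", position: 6, hasSnippetBox: true,  ownsSnippet: true }
];

console.log(snippetOpportunities(sample).map(k => k.keyword));
// → ["what is hris"]
```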
☑ Invest in visuals
We see Google putting more authority behind visuals, whether it be in search or you name it. It is incredibly valuable for your SEO, whether it be unique images or video content that is organized in a structured way, where Google can provide those sections in that video search result. You can do all sorts of really neat things with visuals.
☑ Cultivate engagement
This is good anyway, and we should have been doing this before. Gary Illyes was quoted as saying, "Comments are better for on-site engagement than social signals." I will let you interpret that how you will.
But I think it goes to show that engagement and creating this community is still going to be incredibly important moving forward into the future.
☑ Repurpose your content
Blog post → slides → audio → video
This is so important, and it will help you excel even more in 2020 if you find your top-performing web pages and repurpose them into maybe a SlideShare, maybe a YouTube video, maybe various pins on Pinterest, or answers on Quora.
You can start to refurbish your content and expand your reach online, which is really exciting. In addition to that, it's also interesting to play around with the idea of providing people options to consume your content. Even with this Whiteboard Friday, we could have an audio version that people could just listen to if they were on their commute. We have the transcription. Provide options for people to consume your content.
☑ Prune or improve thin or low-quality pages
This has been incredibly powerful for me and many other SEOs I know in improving the perceived quality of a site. So consider testing meta no-indexing on low-quality, thin pages of a website. Especially on larger websites, we see a pretty big impact there.
☑ Get customer insights!
This will continue to be valuable in understanding your target market. It will be valuable for influencer marketing, for all sorts of reasons. One incredible tool coming soon from our Whiteboard Friday extraordinaire, Rand Fishkin, is SparkToro. So you guys have to check that out when it gets released. Super exciting.
☑ Find keyword opportunities in Google Search Console
It's shocking how few people do this and how accessible it is. If you go into your Google Search Console and you export as much data as you can around your queries, your click-through rate, your position, and impressions, you can do some incredible, simple visualizations to find opportunities.
For example, if this is the rank of your keywords and this is the click-through rate, where do you have high click-through rate but low ranking position? What are those opportunity keywords? Incredibly valuable. You can do this in all sorts of tools. One I recommend, and I will create a little tutorial for, is a free tool called Facets, made by Google for machine learning. It makes it really easy to just pick those apart.
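That opportunity filter is simple enough to sketch in a few lines. Here's a hypothetical version over exported Search Console rows; the field names and thresholds are illustrative assumptions, not a GSC API shape:

```javascript
// From a Google Search Console export, flag "opportunity" queries:
// healthy CTR but a ranking position outside the top results.
// Thresholds and field names are illustrative only.
function opportunityQueries(rows, { minCtr = 0.05, minPosition = 5 } = {}) {
  return rows.filter(r => r.ctr >= minCtr && r.position > minPosition);
}

const rows = [
  { query: "hris comparison", ctr: 0.12, position: 8.4, impressions: 900 },
  { query: "what is hris",    ctr: 0.31, position: 1.2, impressions: 5000 },
  { query: "hr software",     ctr: 0.02, position: 9.1, impressions: 12000 }
];

console.log(opportunityQueries(rows).map(r => r.query));
// → ["hris comparison"]
```

A query with a strong CTR despite a weak position is exactly the kind of keyword where a content or on-page push tends to pay off.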
☑ Target link-intent keywords
A couple of quick link building tactics for 2020 that will hopefully continue to work very, very well. What I mean by link-intent keywords are searches like "your keyword + statistics" or "your keyword + facts."
These are searches people naturally want to reference. They want to link to it. They want to cite it in a presentation. If you can build really great content around those link-intent keywords, you can do incredibly well and naturally build links to a website.
☑ Podcasts
Whether you're a guest or a host on a podcast, it's incredibly easy to get links. It's kind of a fun link building hack.
☑ Provide unique research with visuals
Andy Crestodina does this so incredibly well. So explore creating your own unique research and not making it too commercial but valuable for users. I know this was a lot.
There's a lot going on in 2020, but I hope some of this is valuable to you. I truly can't wait to hear your thoughts on these recommendations, things you think I missed, things that you would remove or change. Please let us know down below in the comments, and I will see you all soon. Thanks.
On January 13th, MozCast measured significant algorithm flux lasting about three days (the dotted line shows the 30-day average prior to the 13th, which is consistent with historical averages) ...
That same day, Google announced the release of a core update dubbed the January 2020 Core Update (in line with their recent naming conventions) ...
On January 16th, Google announced the update was "mostly done," aligning fairly well with the measured temperatures in the graph above. Temperatures settled down after the three-day spike ...
It appears that the dust has mostly settled on the January 2020 Core Update. Interpreting core updates can be challenging, but are there any takeaways we can gather from the data?
How does it compare to other updates?
How did the January 2020 Core Update stack up against recent core updates? The chart below shows the previous four named core updates, back to August 2018 (AKA "Medic") ...
While the January 2020 update wasn't on par with "Medic," it tracks closely to the previous three updates. Note that all of these updates are well above the MozCast average. While not all named updates are measurable, all of the recent core updates have generated substantial ranking flux.
Which verticals were hit hardest?
MozCast is split into 20 verticals, matching Google AdWords categories. It can be tough to interpret single-day movement across categories, since they naturally vary, but here's the data for the range of the update (January 14–16) for the seven categories that topped 100°F on January 14 ...
Health tops the list, consistent with anecdotal evidence from previous core updates. One consistent finding, broadly speaking, is that sites impacted by one core update seem more likely to be impacted by subsequent core updates.
Who won and who lost this time?
Winners/losers analyses can be dangerous, for a few reasons. First, they depend on your particular data set. Second, humans have a knack for seeing patterns that aren't there. It's easy to take a couple of data points and over-generalize. Third, there are many ways to measure changes over time.
We can't entirely fix the first problem — that's the nature of data analysis. For the second problem, we have to trust you, the reader. We can partially address the third problem by making sure we're looking at changes both in absolute and relative terms. For example, knowing a site gained 100% SERP share isn't very interesting if it went from one ranking in our data set to two. So, for both of the following charts, we'll restrict our analysis to subdomains that had at least 25 rankings across MozCast's 10,000 SERPs on January 14th. We'll also display the raw ranking counts for some added perspective.
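The methodology described above (a minimum of 25 rankings on day one, percentage change across the update) can be sketched roughly like this, with a made-up data shape standing in for the MozCast ranking counts:

```javascript
// Winners by % change in SERP share between two days, restricted to
// subdomains with at least 25 rankings on day one. Mirrors the
// methodology described above; the data shape is illustrative.
function topWinners(sites, minRankings = 25, limit = 25) {
  return sites
    .filter(s => s.day1 >= minRankings)
    .map(s => ({ ...s, pctChange: (s.day2 - s.day1) / s.day1 * 100 }))
    .sort((a, b) => b.pctChange - a.pctChange)
    .slice(0, limit);
}

const sites = [
  { subdomain: "a.example.com", day1: 40,  day2: 60 },  // +50%
  { subdomain: "b.example.com", day1: 10,  day2: 30 },  // excluded: < 25 rankings
  { subdomain: "c.example.com", day1: 100, day2: 90 }   // -10%
];

console.log(topWinners(sites).map(s => s.subdomain));
// → ["a.example.com", "c.example.com"]
```

The minimum-rankings filter is what keeps a site that went from one ranking to two from topping the chart with a misleading +100%.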
Here are the top 25 winners by % change over the 3 days of the update. The "Jan 14" and "Jan 16" columns represent the total count of rankings (i.e. SERP share) on those days ...
If you've read about previous core updates, you may see a couple of familiar subdomains, including VeryWellHealth.com and a couple of its cousins. Even at a glance, this list goes well beyond healthcare and represents a healthy mix of verticals and some major players, including Instagram and the Google Play store.
I hate to use the word "losers," and there's no way to tell why any given site gained or lost rankings during this time period (it may not be due to the core update), but I'll present the data as impartially as possible. Here are the 25 sites that lost the most rankings by percentage change ...
Orbitz took heavy losses in our data set, as did the phone number lookup site ZabaSearch. Interestingly, one of the Very Well family of sites (three of which were in our top 25 list) landed in the bottom 25. There are a handful of healthcare sites in the mix, including the reputable Cleveland Clinic (although this appears to be primarily a patient portal).
What can we do about any of this?
Google describes core updates as "significant, broad changes to our search algorithms and systems ... designed to ensure that overall, we’re delivering on our mission to present relevant and authoritative content to searchers." They're quick to say that a core update isn't a penalty and that "there’s nothing wrong with pages that may perform less well." Of course, that's cold comfort if your site was negatively impacted.
We know that content quality matters, but that's a vague concept that can be hard to pin down. If you've taken losses in a core update, it is worth assessing if your content is well matched to the needs of your visitors, including whether it's accurate, up to date, and generally written in a way that demonstrates expertise.
We also know that sites impacted by one core update seem to be more likely to see movement in subsequent core updates. So, if you've been hit in one of the core updates since "Medic," keep your eyes open. This is a work in progress, and Google is making adjustments as they go.
Ultimately, the impact of core updates gives us clues about Google's broader intent and how best to align with that intent. Look at sites that performed well and try to understand how they might be serving their core audiences. If you lost rankings, are they rankings that matter? Was your content really a match to the intent of those searchers?
I recently finished a project where I was tasked to investigate why a site (that receives over one million organic visits per month) does not rank for any featured snippets.
This is obviously an alarming situation, since ~15% of all result pages, according to MozCast, have a featured snippet as a SERP feature. The project was passed on to me by an industry friend. I’ve done a lot of research on featured snippets in the past, and while I rarely do once-off projects, this one really caught my attention. I was determined to figure out what issue was impacting the site.
In this post, I detail my methodology for the project that I delivered, along with key takeaways for my client and others who might be faced with a similar situation. But before I dive deep into my analysis: this post does NOT have a fairy-tale ending. I wasn’t able to unclog a drain that resulted in thousands of new visitors.
I did, however, deliver massive amounts of closure for my client, allowing them to move on and invest resources into areas which will have a long-lasting impact.
Confirming suspicions with Big Data
Now, when my client first came to me, they had their own suspicions about what was happening. They had been advised by other consultants on what to do.
They had been told that the featured snippet issue was stemming from either:
1. An issue relating to conflicting structured data on the site
OR
2. An issue relating to messy HTML which was preventing the site from appearing within featured snippet results
I immediately shut down the first issue as a cause for featured snippets not appearing. I’ve written about this topic extensively in the past. Structured data (in the context of schema.org) does NOT influence featured snippets. You can read more about this in my post on Search Engine Land.
As for the second point, this is closer to reality, yet still far from it. Yes, HTML structure helps considerably when trying to rank for featured snippets. But could messy HTML alone explain a site that ranks for almost a million keywords not ranking for a single featured snippet? Very unlikely. There’s more to this story, but let’s confirm our suspicions first.
Let’s start from the top. Here’s what the estimated organic traffic looks like:
Note: I’m unable to show the actual traffic for this site due to confidentiality. But the monthly estimation that Ahrefs gives of 1.6M isn’t far off.
Out of the 1.6M monthly organic visits, Ahrefs picks up on 873K organic keywords. When filtering these keywords by SERP features with a featured snippet and ordering by position, you get the following:
I then did similar research with both Moz Pro using their featured snippet filtering capabilities as well as SEMrush, allowing me to see historical ranking.
All three tools displayed the same result: the site did not rank for any featured snippets at all, despite ~20% of my client's organic keywords including a featured snippet as a SERP feature (higher than the MozCast average).
It was clear that the site did not rank for any featured snippets on Google. But who was taking this position away from my client?
The next step was to investigate whether other sites in the same niche were ranking within featured snippets. If they were, this would be a clear sign that the problem was specific to my client's site.
An “us” vs “them” comparison
Again, we need to reflect back to our tools. We need our tools to figure out the top sites based on similarity of keywords. Here’s an example of this in action within Moz Pro:
Once we have our final list of similar sites, we need to complete the same analysis that was completed in the previous section of this post to see if they rank for any featured snippets.
With this analysis, we can figure out whether they have featured snippets displaying or not, along with the % of their organic keywords with a featured snippet as a SERP feature.
The next step is to add all of this data to a Google Sheet and see how everything matches up to my client's site. Here’s what this data looks like for my client:
I now need to dig deeper into the sites in my table. Are they really all that relevant, or are my tools just picking up on a subset of queries that are similar?
I found that from row 8 downwards in my table, those sites weren’t all that similar. I excluded them from my final dataset to keep things as relevant as possible.
Based on this data, I could see five other sites that were similar to my client's. Out of those five sites, only one had results ranking within a featured snippet.
In other words, four of the five similar sites (80%) had the exact same issue. This is extremely important information to keep in mind going forward.
Although the sample size is considerably smaller, one of those sites is unable to be featured for ~34% of the search results it ranks for. Comparatively, this is quite problematic for that site (considering the ~20% figure from my client's situation).
This analysis has been useful in figuring out whether the issue was specific to my client or the entire niche. But do we have guidelines from Google to back this up?
Google featured snippet support documentation
Within Google’s Featured Snippet Documentation, they provide details on policies surrounding the SERP feature. This is public information. But I think a very high percentage of SEOs aren’t aware (based on multiple discussions I’ve had) of how impactful some of these details can be.
For instance, the guidelines state that:
"Because of this prominent treatment, featured snippet text, images, and the pages they come from should not violate these policies."
They then mention 5 categories:
Sexually explicit
Hateful
Violent
Dangerous and harmful
Lack consensus on public interest topics
Number five in particular is an interesting one. This section is not as clear as the other four and requires some interpretation. Google explains this category in the following way:
"Featured snippets about public interest content — including civic, medical, scientific, and historical issues — should not lack well-established or expert consensus support."
And the even more interesting part in all of this: these policies do not apply to web search listings nor cause those to be removed.
It can be lights out for featured snippets if you fall into one of these categories, yet you can still rank highly within the 10-blue-link results. A bit of an odd situation.
Based on my knowledge of the client, I couldn’t say for sure whether any of the five categories were to blame for their problem. It was sure looking like it was algorithmic intervention (and I had my suspicions about which category was the potential cause).
But there was no way of confirming this. The site didn’t have a manual action within Google Search Console. That is literally the only way Google could communicate something like this to site owners.
I needed someone on the inside at Google to help.
The missing piece: Official site-specific feedback from Google
One of the most underused resources in an SEO's toolkit, in my opinion, is the Google Webmaster Hangouts held by John Mueller.
You can see the schedule for these Hangouts on YouTube here and join live, asking John a question in person if you want. You could always try John on Twitter too, but there’s nothing like video.
You’re given the opportunity to explain your question in detail. John can easily ask for clarification, and you can have a quick back-and-forth that gets to the bottom of your problem.
This is what I did in order to figure out this situation. I spoke with John live on the Hangout for ~5 minutes; you can watch my segment here if you’re interested. The result was that John gave me his email address and I was able to send through the site for him to check with the ranking team at Google.
I followed up with John on Twitter to see if he was able to get any information from the team on my client's situation. You can follow the link above to see the full piece of communication, but John’s feedback was that there wasn't a manual penalty being put in place for my client's site. He said that it was purely algorithmic. This meant that the algorithm was deciding that the site was not allowed to rank within featured snippets.
And an important component of John’s response:
If a site doesn’t rank for any featured snippets when they're already ranking highly within organic results on Google (say, within positions 1–5), there is no way to force it to rank.
For me, this is a dirty little secret in a way (hence the title of this article). Google’s algorithms may decide that a site can’t show in a featured snippet (but could rank #2 consistently), and there's nothing a site owner can do.
...and the end result?
The result of this, in the specific niche that my client is in, is that lots of smaller, seemingly less relevant sites (as a whole) are the ones that are ranking in featured snippets. Do these sites provide the best answer? Well, the organic 10-blue-links ranking algorithm doesn’t think so, but the featured snippet algorithm does.
This means that the site has a lot of queries which have a low CTR, resulting in considerably less traffic coming through to the site. Sure, featured snippets sometimes don’t drive much traffic. But they certainly get a lot more attention than the organic listings below:
Based on the Nielsen Norman Group study, when SERP features (like featured snippets) were present on a SERP, they found that they received looks in 74% of cases (with a 95% confidence interval of 66–81%). This data clearly points to the fact that featured snippets are important for sites to rank within where possible, resulting in far greater visibility.
Because Google’s algorithm is making this decision, it's likely a liability thing; Google (the people involved with the search engine) don’t want to be the ones to have to make that call. It’s a tricky one. I understand why Google needs to put these systems in place for their search engine (scale is important), but communication could be drastically improved for these types of algorithmic interventions. Even if it isn’t a manual intervention, there ought to be some sort of notification within Google Search Console. Otherwise, site owners will just invest in R&D trying to get their site to rank within featured snippets (which is only natural).
And again, just because there are categories available in the featured snippet policy documentation, that doesn’t mean that the curiosity of site owners is always going to go away. There will always be the “what if?”
Deep down, I’m not so sure Google will ever make this addition to Google Search Console. It would mean too much communication on the matter, and could lead to unnecessary disputes with site owners who feel they’ve been wronged. Something needs to change, though. There needs to be less ambiguity for the average site owner who doesn’t know they can access awesome people from the Google Search team directly. But for the moment, it will remain Google’s dirty little featured snippet secret.
When it comes to the forms your site visitors are using, you need to go beyond completions — it's important to understand how people are interacting with them, where the strengths lie and what errors might be complicating the experience. In this edition of Whiteboard Friday, Matthew Edgar takes you through in-depth form tracking in Google Analytics.
Click on the whiteboard image above to open a high resolution version in a new tab!
Video Transcription
Howdy, Moz fans. My name is Matthew Edgar. Welcome to another edition of Whiteboard Friday. I am an analytics consultant at Elementive, and in this Whiteboard Friday what I want to talk to you about are new ways that we can really start tracking how people are interacting with our forms.
I'm going to assume that all of you who have a form on your website are already tracking it in some way. You're looking at goal completions on the form, you're measuring how many people arrived on that page that includes the form, and what we want to do now is we want to take that to a deeper level so we can really understand how people are not just completing the form, but how they're really interacting with that form.
So what I want to cover are how people really interact with the form on your website, how people really interact with the fields when they submit the form, and then also what kind of errors are occurring on the form that are holding back conversions and hurting the experience on your site.
1. What fields are used?
So let's begin by talking about what fields people are using and what fields they're really interacting with.
So in this video, I want to use just an example of a registration form. Pretty simple registration form. Fields for name, company name, email address, phone number, revenue, and sales per day, basic information. We've all seen forms like this on different websites. So what we want to know is not just how many people arrived on this page, looked at this form, how many people completed this form.
What we want to know is: Well, how many people clicked into any one of these fields? So for that, we can use event tracking in Google Analytics. If you don't have Google Analytics, that's okay. There are other ways to do this with other tools as well. So in Google Analytics, what we want to do is we want to send an event through every time somebody clicks or taps into any one of these fields.
On focus
So for that, we're going to send an on focus event. The category can be form. Action is interact. Then the label is just the name of the field, so email address or phone number or whatever field they were interacting with. Then in Google Analytics, what we'll be able to look at, once we drill into the label, is we'll be able to say, "Well, how many times in total did people interact with that particular field?"
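As a sketch of what that on-focus event might look like with gtag.js (the field selector, event names, and wiring here are assumptions; adapt them to however your analytics is installed):

```javascript
// Send an analytics event each time a visitor focuses a form field.
// The sendEvent callback keeps this sketch decoupled from any one
// analytics library; in production it might wrap gtag().
function trackFieldFocus(form, sendEvent) {
  form.querySelectorAll("input, select, textarea").forEach(field => {
    field.addEventListener("focus", () => {
      sendEvent("interact", {
        event_category: "form",
        event_label: field.name // e.g. "email_address"
      });
    });
  });
}

// Hypothetical usage in the browser with gtag.js:
// trackFieldFocus(document.querySelector("#registration-form"),
//                 (action, params) => gtag("event", action, params));
```

If a field can be focused repeatedly, you may also want a once-per-field flag so the event count stays closer to "people" than "focus events."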
GA report
So people interacted with the name field 104 times, the revenue field 89 times, sales per day 64 times, and phone number 59 times. Then we could go through all the other fields too to look at that. What this total information starts to give us is an idea of: Well, where are people struggling? Where are people having to really spend a lot of time? Then it also gives us an idea of the drop-off rate.
So we can see here that, well, 104 people interacted with the full name field, but only 89 made it down here to the revenue field. So we're losing people along the way. Is that a design issue? Is that something about the experience of interacting with this form? Maybe it's a device issue. We have a lot of people on mobile and maybe they can't see all of those fields. The next thing we can look at here is the unique events that are happening for each of those.
Unique events aren't an exact measure, but they're close enough to give a general idea of how many unique people interacted with those fields. So in the case of the name field, 102 people interacted 104 times, roughly speaking, which makes sense. People don't need to go back to the name field and enter their name again. But in the case of the revenue field, there were 47 unique interactions against 89 total interactions.
People are having to go back to this field. They're having to reconsider what they want to put in there. So we can start to figure out, well, why is that? Is that because people aren't sure what kind of answer to give? Are they not comfortable giving up that answer? Are there some trust factors on our site that we need to improve? If we really start to dig into that and look at that information, we can start to figure out, well, what's it going to take to get more people interacting with this form, and what's it going to take to get more people clicking that Submit button?
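One simple way to surface those problem fields is to compute the ratio of total to unique interactions per field; a high ratio flags hesitation. A quick sketch using the example counts from above (the sales-per-day unique count is a filler value I've assumed):

```javascript
// Rank fields by how often visitors return to them: total
// interactions divided by unique interactions. A ratio near 1 means
// one-and-done; higher ratios suggest confusion or hesitation.
function reinteractionRatios(fields) {
  return fields
    .map(f => ({ ...f, ratio: +(f.total / f.unique).toFixed(2) }))
    .sort((a, b) => b.ratio - a.ratio);
}

const fields = [
  { name: "full_name",     total: 104, unique: 102 },
  { name: "revenue",       total: 89,  unique: 47 },
  { name: "sales_per_day", total: 64,  unique: 40 }  // unique count assumed
];

console.log(reinteractionRatios(fields)[0].name);
// → "revenue" (89 / 47 ≈ 1.89 interactions per person)
```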
2. What fields do people submit?
The next thing that we want to look at here is what fields do people submit. Not just what do they interact with, but when they click that Submit button, which fields have they actually put information into?
On submit
So for this, when people click that Submit button, we can trigger another event to send along to Google Analytics. In this case, the category is form, the action is submit, and then for the label what we want to do is we want to send just a list of all the different fields that people had put some kind of information in.
So there's a lot of different ways to do this. It really just depends on what kind of form you have, how your form is controlled. One easy way is you have a JavaScript function that just loops through your entire form and says, "Well, which of these fields have a value, have something that's not the default entry, that people actually did give their information to?" One note here is that if you are going to loop through those fields on your form and figure out which ones people interacted with and put information into, you want to make sure that you're only getting the name of the field and not the value of the field.
We don't want to send along the person's email address or the person's phone number. We just want to know that they did put something in the email address field or in the phone number field. We don't want any of that personally identifiable information ending up in our reports.
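Here's a hypothetical sketch of that field-walking step: collect only the names of fields that hold a value, never the values themselves, so no PII reaches your reports:

```javascript
// Build a comma-separated list of the NAMES of filled-in fields.
// Only names are collected, never values, to keep personally
// identifiable information out of analytics reports.
function filledFieldNames(fields) {
  return fields
    .filter(f => f.value && f.value.trim() !== "")
    .map(f => f.name)
    .join(",");
}

// Hypothetical usage in the browser (assumes gtag.js):
// form.addEventListener("submit", () => {
//   gtag("event", "submit", {
//     event_category: "form",
//     event_label: filledFieldNames([...form.elements])
//   });
// });

console.log(filledFieldNames([
  { name: "full_name", value: "Ada" },
  { name: "phone",     value: "" },
  { name: "email",     value: "ada@example.com" }
]));
// → "full_name,email"
```

Depending on your form, you may also want to skip defaults, placeholders, and hidden fields before building the list.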
Review frequency
So what we can do with this is we can look at: Well, how frequently did people submit any one of these fields?
So 53 submissions with the full name field, 46 with revenue, 42 with sales per day, etc.
Compare by interact
The first thing we can do here is compare this to the interaction information. We can say, "Well, there were 53 submissions with the full name field filled out, but there were 102 people who interacted with that full name field."
That's quite the difference. So now we know, well, what kind of opportunity exists for us to clean this up. We had 102 people who hit this form, who started filling it out, but only 53 ended up putting in their full name when they clicked that Submit button. There's some opportunity there to get more people filling out this form and submitting.
Segment by source
The other thing we can do is segment this by source. The reason we would want to do that is to understand something about the quality of these submissions. So we might know that, well, people who give us their phone number, that tends to be a better quality submission on our form. That's not always the case; there are exceptions and edge cases, to be sure.
But generally speaking, people who give us their phone number we know are better quality. So by segmenting by source, we can say, "Well, people who come in from which source are more likely to give their phone number?" That gives us an idea of which source we might want to go after. Maybe your ad network is doing a really good job of driving people who fill out their phone number. Or maybe organic is doing a better job of driving people to submit by giving you that information.
3. What fields cause problems?
The next thing we want to look at on our form is which errors are occurring. What problems are happening here?
Errors, slips, mistakes
When we're talking about problems, when we're talking about errors, it's not just the technical errors that are occurring. It's also the user errors that are occurring, the slips, the mistakes that people are just naturally going to make as they work through your form.
Assign unique ID to each error
The easiest way to track this is every time an error is returned to the visitor, we want to pass an event along to Google Analytics. For that, what we can do is assign a unique ID number to each error on our website, with each ID representing one specific error. So people who forgot a digit on a phone number, that's one ID number. People who forgot the phone number altogether, that's a different ID number.
On return of error
When that error gets returned, we'll pass along the category is form, the action is error, and then the label is that unique ID number.
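One way to sketch that, again assuming analytics.js is on the page; the error keys and ID numbers below are made up for illustration, since your validation layer will have its own way of naming failures:

```javascript
// Hypothetical map from validation failures to stable, unique error IDs.
// e.g. ID 1: phone number missing a digit; ID 2: phone number left empty.
var ERROR_IDS = {
  "phone-too-short": 1,
  "phone-missing": 2,
  "email-invalid": 3
};

// Called whenever the form returns an error to the visitor.
// Sends category "form", action "error", label = the unique error ID.
function trackFormError(errorKey) {
  var id = ERROR_IDS[errorKey];
  if (id === undefined) {
    return null; // unrecognized error: nothing to report
  }
  if (typeof ga === "function") {
    ga("send", "event", "form", "error", String(id));
  }
  return id;
}
```

Keeping the IDs in one map means the numbers stay stable across releases, which matters because your historical reports will reference them.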
Frequency of errors
The first thing we can look at is how frequently each error occurs. So we can say, "Well, Error ID No. 1 occurred 37 times, and Error ID No. 2 occurred 26 times."
Segment by form completion
It starts to give us an idea of how to prioritize these errors. But the more interesting thing to do is segment by form completion, and then we can compare the two. So in this case, we can say, "Well, Error ID No. 1 was hit by 29 people overall, and by 27 of the people who went on to submit the form."
That means pretty much everybody who got that error was able to move beyond the error and submit the form. It's not that big of a deal. It's not hurting the experience on our site all that much. It's not hurting conversions all that much. Error ID No. 4 though, 19 people got the error, but only 3 of the people who got that error were able to submit the form. Clearly whatever this ID is, whatever this error is, that's the one that's really hurting the experience on our site.
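As a rough sketch (not from the video, just making the comparison concrete), you can turn those two numbers into a per-error "blocked rate" and prioritize on that:

```javascript
// Fraction of the people who hit an error and then did NOT submit the form.
// gotError: total people who saw this error
// gotErrorAndSubmitted: people who saw it but still submitted
function blockedRate(gotError, gotErrorAndSubmitted) {
  return 1 - gotErrorAndSubmitted / gotError;
}
```

With the numbers above, Error ID No. 1's blocked rate is about 7% (29 got it, 27 still submitted), while Error ID No. 4's is about 84% (19 got it, only 3 submitted), which is exactly why No. 4 is the one to fix first.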
That's the one that's really going to hurt conversions. So by improving or figuring out why that error is occurring, then we can start to improve conversions on our site. I hope these ideas have given you some new ways to really track and understand how people are interacting with your forms at a deeper level.
I look forward to hearing your comments about different things you're doing on your forms, and certainly if you start using any of these ideas, what kind of insights you're gaining from them. Thank you.