On January 13th, MozCast measured significant algorithm flux lasting about three days (the dotted line shows the 30-day average prior to the 13th, which is consistent with historical averages) ...
That same day, Google announced the release of a core update dubbed the January 2020 Core Update (in line with their recent naming conventions) ...
On January 16th, Google announced the update was "mostly done," aligning fairly well with the measured temperatures in the graph above. Temperatures settled down after the three-day spike ...
It appears that the dust has mostly settled on the January 2020 Core Update. Interpreting core updates can be challenging, but are there any takeaways we can gather from the data?
How does it compare to other updates?
How did the January 2020 Core Update stack up against recent core updates? The chart below shows the previous four named core updates, back to August 2018 (AKA "Medic") ...
While the January 2020 update wasn't on par with "Medic," it tracks closely to the previous three updates. Note that all of these updates are well above the MozCast average. While not all named updates are measurable, all of the recent core updates have generated substantial ranking flux.
Which verticals were hit hardest?
MozCast is split into 20 verticals, matching Google AdWords categories. It can be tough to interpret single-day movement across categories, since they naturally vary, but here's the data for the range of the update (January 14–16) for the seven categories that topped 100°F on January 14 ...
Health tops the list, consistent with anecdotal evidence from previous core updates. One consistent finding, broadly speaking, is that sites impacted by one core update seem more likely to be impacted by subsequent core updates.
Who won and who lost this time?
Winners/losers analyses can be dangerous, for a few reasons. First, they depend on your particular data set. Second, humans have a knack for seeing patterns that aren't there. It's easy to take a couple of data points and over-generalize. Third, there are many ways to measure changes over time.
We can't entirely fix the first problem — that's the nature of data analysis. For the second problem, we have to trust you, the reader. We can partially address the third problem by making sure we're looking at changes both in absolute and relative terms. For example, knowing a site gained 100% SERP share isn't very interesting if it went from one ranking in our data set to two. So, for both of the following charts, we'll restrict our analysis to subdomains that had at least 25 rankings across MozCast's 10,000 SERPs on January 14th. We'll also display the raw ranking counts for some added perspective.
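If you'd like to run the same kind of check against your own rank-tracking data, the math is simple. Here's a minimal sketch in JavaScript; the data shape and the 25-ranking cutoff are illustrative stand-ins, not MozCast's actual code:

```javascript
// Minimal sketch of the winners/losers calculation described above.
// `counts` maps each subdomain to its ranking counts on the two dates.
const MIN_RANKINGS = 25;

function rankingChanges(counts) {
  return Object.entries(counts)
    .filter(([, c]) => c.jan14 >= MIN_RANKINGS) // ignore thin data
    .map(([subdomain, c]) => ({
      subdomain,
      jan14: c.jan14,
      jan16: c.jan16,
      pctChange: ((c.jan16 - c.jan14) / c.jan14) * 100, // relative change
    }))
    .sort((a, b) => b.pctChange - a.pctChange); // winners first
}

// Example: a subdomain going from 40 to 60 rankings is a +50% change.
console.log(rankingChanges({ 'example.com': { jan14: 40, jan16: 60 } }));
```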
Here are the top 25 winners by % change over the 3 days of the update. The "Jan 14" and "Jan 16" columns represent the total count of rankings (i.e. SERP share) on those days ...
If you've read about previous core updates, you may see a couple of familiar subdomains, including VeryWellHealth.com and a couple of its cousins. Even at a glance, this list goes well beyond healthcare and represents a healthy mix of verticals and some major players, including Instagram and the Google Play store.
I hate to use the word "losers," and there's no way to tell why any given site gained or lost rankings during this time period (it may not be due to the core update), but I'll present the data as impartially as possible. Here are the 25 sites that lost the most rankings by percentage change ...
Orbitz took heavy losses in our data set, as did the phone number lookup site ZabaSearch. Interestingly, one of the Very Well family of sites (three of which were in our top 25 list) landed in the bottom 25. There are a handful of healthcare sites in the mix, including the reputable Cleveland Clinic (although this appears to be primarily a patient portal).
What can we do about any of this?
Google describes core updates as "significant, broad changes to our search algorithms and systems ... designed to ensure that overall, we’re delivering on our mission to present relevant and authoritative content to searchers." They're quick to say that a core update isn't a penalty and that "there’s nothing wrong with pages that may perform less well." Of course, that's cold comfort if your site was negatively impacted.
We know that content quality matters, but that's a vague concept that can be hard to pin down. If you've taken losses in a core update, it is worth assessing if your content is well matched to the needs of your visitors, including whether it's accurate, up to date, and generally written in a way that demonstrates expertise.
We also know that sites impacted by one core update seem to be more likely to see movement in subsequent core updates. So, if you've been hit in one of the core updates since "Medic," keep your eyes open. This is a work in progress, and Google is making adjustments as they go.
Ultimately, the impact of core updates gives us clues about Google's broader intent and how best to align with that intent. Look at sites that performed well and try to understand how they might be serving their core audiences. If you lost rankings, are they rankings that matter? Was your content really a match to the intent of those searchers?
I recently finished a project where I was tasked to investigate why a site (that receives over one million organic visits per month) does not rank for any featured snippets.
This is obviously an alarming situation, since ~15% of all result pages, according to MozCast, have a featured snippet as a SERP feature. The project was passed on to me by an industry friend. I’ve done a lot of research on featured snippets in the past. I rarely do one-off projects, but this one really caught my attention. I was determined to figure out what issue was impacting the site.
In this post, I detail my methodology for the project that I delivered, along with key takeaways for my client and others who might be faced with a similar situation. But before I dive deep into my analysis: this post does NOT have a fairy-tale ending. I wasn’t able to unclog a drain that resulted in thousands of new visitors.
I did, however, deliver massive amounts of closure for my client, allowing them to move on and invest resources into areas which will have a long-lasting impact.
Confirming suspicions with Big Data
Now, when my client first came to me, they had their own suspicions about what was happening. They had been advised by other consultants on what to do.
They had been told that the featured snippet issue was stemming from either:
1. An issue relating to conflicting structured data on the site
OR
2. An issue relating to messy HTML which was preventing the site from appearing within featured snippet results
I immediately shut down the first issue as a cause for featured snippets not appearing. I’ve written about this topic extensively in the past. Structured data (in the context of schema.org) does NOT influence featured snippets. You can read more about this in my post on Search Engine Land.
As for the second point, this is closer to reality, yet still far from it. Yes, HTML structure does help considerably when trying to rank for featured snippets. But enough to stop a site that ranks for almost a million keywords from ranking for a single featured snippet? Very unlikely. There’s more to this story, but let’s confirm our suspicions first.
Let’s start from the top. Here’s what the estimated organic traffic looks like:
Note: I’m unable to show the actual traffic for this site due to confidentiality. But the monthly estimation that Ahrefs gives of 1.6M isn’t far off.
Out of the 1.6M monthly organic visits, Ahrefs picks up on 873K organic keywords. When filtering these keywords by SERP features with a featured snippet and ordering by position, you get the following:
I then did similar research with both Moz Pro, using its featured snippet filtering capabilities, and SEMrush, which allowed me to see historical rankings.
All three tools displayed the same result: the site did not rank for any featured snippets at all, despite ~20% of my client's organic keywords including a featured snippet as a SERP feature (higher than the MozCast average).
It was clear that the site did not rank for any featured snippets on Google. But who was taking this position away from my client?
The next step was to investigate whether other sites within the same niche were ranking in featured snippets. If they were, that would be a clear sign that the problem was specific to my client's site.
An “us” vs “them” comparison
Again, we need to turn back to our tools, this time to identify the top competing sites based on keyword similarity. Here’s an example of this in action within Moz Pro:
Once we have our final list of similar sites, we need to complete the same analysis that was completed in the previous section of this post to see if they rank for any featured snippets.
With this analysis, we can figure out whether they have featured snippets displaying or not, along with the % of their organic keywords with a featured snippet as a SERP feature.
The next step is to add all of this data to a Google Sheet and see how everything matches up to my client's site. Here’s what this data looks like for my client:
I now need to dig deeper into the sites in my table. Are they really all that relevant, or are my tools just picking up on a subset of queries that are similar?
I found that from row 8 downwards in my table, those sites weren’t all that similar. I excluded them from my final dataset to keep things as relevant as possible.
Based on this data, I could see five other sites that were similar to my client's. Out of those five sites, only one had results where it was ranking within a featured snippet.
In other words, 80% of the sites similar to my client's had the exact same issue. This is extremely important information to keep in mind going forward.
Although the sample size is considerably smaller, one of those sites is unable to be featured for ~34% of the search results it ranks for, which is comparatively even more problematic than the ~20% figure from my client's situation.
This analysis has been useful in figuring out whether the issue was specific to my client or the entire niche. But do we have guidelines from Google to back this up?
Google featured snippet support documentation
Within Google’s Featured Snippet Documentation, they provide details on policies surrounding the SERP feature. This is public information. But I think a very high percentage of SEOs aren’t aware (based on multiple discussions I’ve had) of how impactful some of these details can be.
For instance, the guidelines state that:
"Because of this prominent treatment, featured snippet text, images, and the pages they come from should not violate these policies."
They then mention 5 categories:
Sexually explicit
Hateful
Violent
Dangerous and harmful
Lack consensus on public interest topics
Number five in particular is an interesting one. This section is not as clear as the other four and requires some interpretation. Google explains this category in the following way:
"Featured snippets about public interest content — including civic, medical, scientific, and historical issues — should not lack well-established or expert consensus support."
And the even more interesting part in all of this: these policies do not apply to web search listings nor cause those to be removed.
It can be lights out for featured snippets if you fall into one of these categories, yet you can still rank highly within the 10-blue-link results. A bit of an odd situation.
Based on my knowledge of the client, I couldn’t say for sure whether any of the five categories were to blame for their problem. It sure looked like an algorithmic intervention, though (and I had my suspicions about which category was the potential cause).
But there was no way of confirming this. The site didn’t have a manual action within Google Search Console. That is literally the only way Google could communicate something like this to site owners.
I needed someone on the inside at Google to help.
The missing piece: Official site-specific feedback from Google
One of the most underused resources in an SEO's toolkit (in my opinion) is the Google Webmaster Hangouts held by John Mueller.
You can see the schedule for these Hangouts on YouTube here and join live, asking John a question in person if you want. You could always try John on Twitter too, but there’s nothing like video.
You’re given the opportunity to explain your question in detail. John can easily ask for clarification, and you can have a quick back-and-forth that gets to the bottom of your problem.
This is what I did in order to figure out this situation. I spoke with John live on the Hangout for ~5 minutes; you can watch my segment here if you’re interested. The result was that John gave me his email address and I was able to send through the site for him to check with the ranking team at Google.
I followed up with John on Twitter to see if he was able to get any information from the team on my client's situation. You can follow the link above to see the full piece of communication, but John’s feedback was that there was no manual penalty in place for my client's site. He said that it was purely algorithmic, meaning the algorithm was deciding that the site was not allowed to rank within featured snippets.
And an important component of John’s response:
If a site doesn’t rank for any featured snippets when they're already ranking highly within organic results on Google (say, within positions 1–5), there is no way to force it to rank.
For me, this is a dirty little secret in a way (hence the title of this article). Google’s algorithms may decide that a site can’t show in a featured snippet (but could rank #2 consistently), and there's nothing a site owner can do.
...and the end result?
The result of this, in the specific niche that my client is in, is that lots of smaller, seemingly less relevant sites (as a whole) are the ones that are ranking in featured snippets. Do these sites provide the best answer? Well, the organic 10-blue-links ranking algorithm doesn’t think so, but the featured snippet algorithm does.
This means that the site has a lot of queries which have a low CTR, resulting in considerably less traffic coming through to the site. Sure, featured snippets sometimes don’t drive much traffic. But they certainly get a lot more attention than the organic listings below:
Based on the Nielsen Norman Group study, when SERP features (like featured snippets) were present on a SERP, they received looks in 74% of cases (with a 95% confidence interval of 66–81%). This data clearly points to the fact that featured snippets are important for sites to rank within where possible, resulting in far greater visibility.
Because Google’s algorithm is making this decision, it's likely a liability thing; Google (the people involved with the search engine) don’t want to be the ones to have to make that call. It’s a tricky one. I understand why Google needs to put these systems in place for their search engine (scale is important), but communication could be drastically improved for these types of algorithmic interventions. Even if it isn’t a manual intervention, there ought to be some sort of notification within Google Search Console. Otherwise, site owners will just invest in R&D trying to get their site to rank within featured snippets (which is only natural).
And again, just because there are categories available in the featured snippet policy documentation, that doesn’t mean that the curiosity of site owners is always going to go away. There will always be the “what if?”
Deep down, I’m not so sure Google will ever make this addition to Google Search Console. It would mean too much communication on the matter, and could lead to unnecessary disputes with site owners who feel they’ve been wronged. Something needs to change, though. There needs to be less ambiguity for the average site owner who doesn’t know they can access awesome people from the Google Search team directly. But for the moment, it will remain Google’s dirty little featured snippet secret.
When it comes to the forms your site visitors are using, you need to go beyond completions — it's important to understand how people are interacting with them, where the strengths lie and what errors might be complicating the experience. In this edition of Whiteboard Friday, Matthew Edgar takes you through in-depth form tracking in Google Analytics.
Video Transcription
Howdy, Moz fans. My name is Matthew Edgar. Welcome to another edition of Whiteboard Friday. I am an analytics consultant at Elementive, and in this Whiteboard Friday what I want to talk to you about are new ways that we can really start tracking how people are interacting with our forms.
I'm going to assume that all of you who have a form on your website are already tracking it in some way. You're looking at goal completions on the form, you're measuring how many people arrived on that page that includes the form, and what we want to do now is we want to take that to a deeper level so we can really understand how people are not just completing the form, but how they're really interacting with that form.
So what I want to cover are how people really interact with the form on your website, how people really interact with the fields when they submit the form, and then also what kind of errors are occurring on the form that are holding back conversions and hurting the experience on your site.
1. What fields are used?
So let's begin by talking about what fields people are using and what fields they're really interacting with.
So in this video, I want to use just an example of a registration form. Pretty simple registration form. Fields for name, company name, email address, phone number, revenue, and sales per day, basic information. We've all seen forms like this on different websites. So what we want to know is not just how many people arrived on this page, looked at this form, how many people completed this form.
What we want to know is: Well, how many people clicked into any one of these fields? So for that, we can use event tracking in Google Analytics. If you don't have Google Analytics, that's okay. There are other ways to do this with other tools as well. So in Google Analytics, what we want to do is we want to send an event through every time somebody clicks or taps into any one of these fields.
On focus
So for that, we're going to send an on focus event. The category can be form. Action is interact. Then the label is just the name of the field, so email address or phone number or whatever field they were interacting with. Then in Google Analytics, what we'll be able to look at, once we drill into the label, is we'll be able to say, "Well, how many times in total did people interact with that particular field?"
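As a rough illustration, here's what that on-focus event might look like, assuming the site uses Universal Analytics via analytics.js (swap in the gtag.js equivalent if that's what you run); the form selector is hypothetical:

```javascript
// Sketch: send a Category/Action/Label event each time a visitor focuses a field.
// Assumes the Universal Analytics `ga` global is loaded on the page.
document.querySelectorAll('#registration-form input, #registration-form select')
  .forEach((field) => {
    field.addEventListener('focus', () => {
      // Category: "form", Action: "interact", Label: the field's name
      ga('send', 'event', 'form', 'interact', field.name);
    });
  });
```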
GA report
So people interacted with the name field 104 times, the revenue field 89 times, sales per day 64 times, and phone number 59 times. Then we could go through all the other fields too to look at that. What this total information starts to give us is an idea of: Well, where are people struggling? Where are people having to really spend a lot of time? Then it also gives us an idea of the drop-off rate.
So we can see here that, well, 104 people interacted with the full name field, but only 89 made it down here to the revenue field. So we're losing people along the way. Is that a design issue? Is that something about the experience of interacting with this form? Maybe it's a device issue. We have a lot of people on mobile and maybe they can't see all of those fields. The next thing we can look at here is the unique events that are happening for each of those.
Unique events aren't an exact count, but they're close enough to give a general idea of how many unique people interacted with those fields. So in the case of the name field, 102 people interacted 104 times, roughly speaking, which makes sense. People don't need to go back to the name field and enter in their name again. But in the case of the revenue field, 47 unique interactions, 89 total interactions.
People are having to go back to this field. They're having to reconsider what they want to put in there. So we can start to figure out, well, why is that? Is that because people aren't sure what kind of answer to give? Are they not comfortable giving up that answer? Are there some trust factors on our site that we need to improve? If we really start to dig into that and look at that information, we can start to figure out, well, what's it going to take to get more people interacting with this form, and what's it going to take to get more people clicking that Submit button?
2. What fields do people submit?
The next thing that we want to look at here is what fields do people submit. Not just what do they interact with, but when they click that Submit button, which fields have they actually put information into?
On submit
So for this, when people click that Submit button, we can trigger another event to send along to Google Analytics. In this case, the category is form, the action is submit, and then for the label what we want to do is we want to send just a list of all the different fields that people had put some kind of information in.
So there's a lot of different ways to do this. It really just depends on what kind of form you have, how your form is controlled. One easy way is you have a JavaScript function that just loops through your entire form and says, "Well, which of these fields have a value, have something that's not the default entry, that people actually did give their information to?" One note here is that if you are going to loop through those fields on your form and figure out which ones people interacted with and put information into, you want to make sure that you're only getting the name of the field and not the value of the field.
We don't want to send along the person's email address or the person's phone number. We just want to know that they did put something in the email address field or in the phone number field. We don't want any of that personally identifiable information ending up in our reports.
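Here's one way that submit-time loop could look, as a hedged sketch: it sends only the names of the fields that were filled in, never their values, so no personally identifiable information reaches your reports. The form ID and the "non-default value" check are illustrative and should be adapted to your own form:

```javascript
// Sketch: on submit, report which fields were completed (names only, no values).
const form = document.querySelector('#registration-form');

form.addEventListener('submit', () => {
  const filledFields = Array.from(form.elements)
    // Treat any non-empty, non-default value as "filled in" (simplification).
    .filter((el) => el.name && el.type !== 'submit' && el.value && el.value !== el.defaultValue)
    .map((el) => el.name); // e.g. ["full_name", "email", "revenue"]

  // Category: "form", Action: "submit", Label: comma-separated field names.
  // transport: 'beacon' helps the hit survive the page navigation after submit.
  ga('send', 'event', 'form', 'submit', filledFields.join(','), { transport: 'beacon' });
});
```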
Review frequency
So what we can do with this is we can look at: Well, how frequently did people submit any one of these fields?
So 53 submissions with the full name field, 46 with revenue, 42 with sales per day, etc.
Compare by interact
The first thing we can do here is we can compare this to the interaction information, and we can say, "Well, there were 53 times that people submitted a field with the full name field filled out. But there are 102 people who interacted with that full name field."
That's quite the difference. So now we know, well, what kind of opportunity exists for us to clean this up. We had 102 people who hit this form, who started filling it out, but only 53 ended up putting in their full name when they clicked that Submit button. There's some opportunity there to get more people filling out this form and submitting.
Segment by source
The other thing we can do is we can segment this by source. The reason we would want to do that is we want to compare this to understand something about the quality of these submissions. So we might know that, well, people who give us their phone number, that tends to be a better quality submission on our form. Not necessarily. There are exceptions and edge cases to be sure.
But generally speaking, people who give us their phone number we know are better quality. So by segmenting by source, we can say, "Well, which people who come in from which source are more likely to give their phone number?" That gives us an idea of which source we might want to go after. Maybe that's a really good thing that your ad network is really driving people who fill out their phone number. Or maybe organic is doing a better job driving people to submit by giving you that information.
3. What fields cause problems?
The next thing we want to look at on our form is which errors are occurring. What problems are happening here?
Errors, slips, mistakes
When we're talking about problems, when we're talking about errors, it's not just the technical errors that are occurring. It's also the user errors that are occurring, the slips, the mistakes that people are just naturally going to make as they work through your form.
Assign unique ID to each error
The easiest way to track this is every time an error is returned to the visitor, we want to pass an event along to Google Analytics. So for that, what we can do is we can assign a unique ID number to each error on our website, and that unique ID number can be for each specific error. So people who forgot a digit on a phone number, that's one ID number. People who forgot the phone number altogether, that's a different ID number.
On return of error
When that error gets returned, we'll pass along the category is form, the action is error, and then the label is that unique ID number.
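As a sketch of what that might look like in code, assuming your existing validation logic can call a small helper whenever it shows an error to the visitor (the ID numbers and error keys here are purely illustrative):

```javascript
// Sketch: report each displayed form error to GA using a stable, unique ID.
const FORM_ERROR_IDS = {
  phone_missing_digit: 1,
  phone_empty: 2,
  email_invalid: 3,
};

function trackFormError(errorKey) {
  // Category: "form", Action: "error", Label: the unique error ID
  ga('send', 'event', 'form', 'error', String(FORM_ERROR_IDS[errorKey]));
}

// Example: call this from your validation code when the phone field is left empty.
// trackFormError('phone_empty');
```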
Frequency of errors
The first thing we can look at is the frequency of how frequently each error occurs. So we can say, "Well, Error ID No. 1 occurred 37 times, and Error ID No. 2 occurred 26 times."
Segment by form completion
It starts to give us an idea of how to prioritize these errors. But the more interesting thing to look at is we want to segment by the form completion, and then we can compare these two. So we can say, "Okay, people who completed this form, how often did they get these errors?" So in this case, we can say, "Well, Error ID No. 1, 29 people got it, but 27 people who submitted this form got it."
That means pretty much everybody who got that error was able to move beyond the error and submit the form. It's not that big of a deal. It's not hurting the experience on our site all that much. It's not hurting conversions all that much. Error ID No. 4 though, 19 people got the error, but only 3 of the people who got that error were able to submit the form. Clearly whatever this ID is, whatever this error is, that's the one that's really hurting the experience on our site.
That's the one that's really going to hurt conversions. So by improving or figuring out why that error is occurring, then we can start to improve conversions on our site. I hope these ideas have given you some new ways to really track and understand how people are interacting with your forms at a deeper level.
I look forward to hearing your comments about different things you're doing on your forms, and certainly if you start using any of these ideas, what kind of insights you're gaining from them. Thank you.
In the fall of 2018 our CEO had a simple yet head-exploding request of the JotForm marketing and growth teams: Produce 100,000 words of high-quality written content in a single month.
All types of content would count toward the goal, including posts on our own blog, help guides, template descriptions, and guest posts and sponsored articles on other sites.
In case you don’t think that sounds like a lot, 100,000 words is the length of a 400-page book. Produced in a single month. By a group of JotFormers who then numbered fewer than eight.
Why on Earth would he want us to do all that?
It’s important to understand intent here. Our CEO, Aytekin, isn’t a crazy man. He didn’t send us on a mission just to keep us busy.
You see, for many months we’d dabbled with content, and it was working. Aytekin’s contributed posts in Entrepreneur magazine and on Medium were big hits. Our redesigned blog was picking up a lot of traction with the content we already had, and we were starting to understand SEO a lot better.
Still. Why would any software company need to produce that much content?
The answer is simple: infrastructure. If we could build a content engine that produces a high volume of quality content, then we could learn what works well and double down on creating great content. But in order to sustain success in content, we needed to have the pieces in place.
He allocated a sufficient budget and gave us the freedom to hire the staff we needed to make it happen. We were going to need it.
A full year later, I’m very proud to say we’ve officially crossed over the 100,000-word count in a single month [hold for applause].
However, it didn’t come without some painful learnings and mistakes.
Here’s what I figured out about scaling content through this process.
Develop a system early
Our old editorial calendar was a Google sheet. I started it back when JotForm was publishing one or two blogs per week and needed a way to keep it organized. It worked.
Back then, the only people who needed to view the editorial calendar were three people on the marketing staff and a couple of designers.
However, no spreadsheet on earth will be functional when you’re loading up 100,000 words. It’s too complicated. We discovered this right away.
After much discussion, we migrated our editorial workflow into Asana, which seemed like the closest thing to what we needed. It has a nice calendar view, the tagging functionality helped keep things orderly, and the board view gives a great overview of everyone’s projects.
This is where our marketing team lives.
Counterintuitively, we also use Trello, since it’s what our growth team had already been using to manage projects. Once the marketing team finishes writing a post, we send a request to our growth team designers to create banners for them using a form that integrates with their Trello board.
The system is intricate, but it works. We’d be lost if we hadn’t spent time creating it.
Style guides are your friends
Speaking of things to develop before you can really grow your content machine: style guides are paramount to maintaining consistency, which becomes trickier and trickier the more writers you enlist to help you reach your content goals.
We consider our style guide to be a sort of living, ever-changing document. We add to it all the time.
It’s also the first thing that any legitimate writer will want to see when they’re about to contribute something to your site, whether they’re submitting a guest post, doing paid freelance work, or they’re your own in-house content writer.
Things to include in a basic style guide: an overview of writing style and tone, grammar and mechanics, punctuation particulars, product wording clarifications, and formatting.
Cheap writing will cost you, dearly
If you want cheap writing, you can find it. It’s everywhere — Upwork, Express Writers, WriterAccess. You name it, we tried it. And for less than $60 a blog post, what self-respecting marketing manager wouldn’t at least try it?
I’m here to tell you it’s a mistake.
I was thrilled when the drafts started rolling in. But our editor had other thoughts. It was taking too much time to make them good — nay, readable.
That was an oversight on my end, and it created a big bottleneck. We created such a backlog of cheap content (because it was cheap and I could purchase LOTS of it at a time) that it halted our progress on publishing content in a timely manner.
Instead, treat your freelance and content agencies as partners, and take the time to find good ones. Talk to them on the phone, exhaustively review their writing portfolio, and see if they really understand what you’re trying to accomplish. It’ll cost more money in the short term, but the returns are significant.
But good writing won’t mask subject ignorance
One thing to check with any content agency or freelancer you work with is their research process. The good ones will lean on subject matter experts (SMEs) to actually become authorities on the subjects they write about. It’s a tedious step, for both you and the writer, but it’s an important one.
The not-so-good ones? They’ll wing it and try to find what they can online. Sometimes they can get away with it, and sometimes someone will read your article and have this to say:
That was harsh.
But they had a point. While the article in question was well-written, it wasn’t written by someone who knew much about the subject at hand, which in this case was photography. Lesson learned. Make sure whoever you hire to write will take the time to know what they’re talking about.
Build outreach into your process
Let’s be real here. For 99.9 percent of you, content marketing is SEO marketing. That’s mostly the case with us as well. We do publish thought leadership and product-education posts with little SEO value, but a lot of what we write is published with the hope that it pleases The Google. Praise be.
But just publishing your content is never enough. You need links, lots of them.
Before I go any further, understand that there’s a right and a wrong way to get links back to your content.
Three guidelines for getting links to your content:
1. Create good content.
2. Find a list of reputable, high-ranking sites that are authorities on the subject you wrote about.
3. Ask them about linking or guest posting on their site in a respectful way that also conveys value to their organization.
That’s it. Don’t waste your time on crappy sites or link scams. Don’t spam people’s inboxes with requests. Don’t be shady or deal with shady people.
Create good content, find high-quality sites to partner with, and offer them value.
Successful content is a numbers game
One benefit to creating as much content as we have is that we can really see what’s worked and what hasn’t. And it’s not as easy to predict as you might think.
One of our most successful posts, How to Start and Run a Summer Camp, wasn’t an especially popular one among JotFormers in the planning stage, primarily because the topic didn’t have a ton of monthly searches for the targeted keywords we were chasing. But just a few months after it went live, it became one of our top-performing posts in terms of monthly searches, and our best in terms of converting readers to JotForm users.
Point being, you don’t really know what will work for you until you try a bunch of options.
You’ll need to hire the right people in-house
In a perfect world JotForm employees would be able to produce every bit of content we need. But that’s not realistic for a company of our size. Still, there were some roles we absolutely needed to bring in-house to really kick our content into high gear.
Here are some hires we made to build our content infrastructure:
Content writer
This was the first dedicated content hire we ever made. It marked our first real plunge into the world of content marketing. Having someone in-house who can write means you can be flexible. When last-minute or deeply product-focused writing projects come up, you need someone in-house to deliver.
Editor
Our full-time editor created JotForm’s style guide from scratch, which she uses to edit every single piece of content that we produce. She’s equal parts editor and project manager, since she effectively owns the flow of the Asana board.
Copywriters (x2)
Our smaller writing projects didn’t disappear just because we wanted to load up on long-form blog posts. Quite the contrary. Our copywriters tackle template descriptions that help count toward our goal, while also writing landing page text, email marketing messages, video scripts, and social media posts.
Content strategist
One of the most difficult components of creating regular content is coming up with ideas. I made an early assumption that writers would come up with things to write; I was way off base. Writers have a very specialized skill that actually has little overlap with identifying and researching topics based on SEO value, relevance to our audience, and what will generate clicks from social media. So we have a strategist.
Content operations specialist
When you aim for tens of thousands of words of published content over the course of a month, the very act of coordinating the publishing of a post becomes a full-time job. At JotForm, most of our posts also need a custom graphic designed by our design team. Our content operations specialist coordinates design assets and makes sure everything looks good in WordPress before scheduling posts.
SEO manager
Our SEO manager had already been doing work on JotForm’s other pages, but he redirected much of his attention to our content goals once we began scaling. He works with our content strategist on the strategy and monitors and reports on the performance of the articles we publish.
The payoff
JotForm’s blog wasn’t starting from scratch when Aytekin posed the 100,000-word challenge. It was already receiving about 120,000 organic site visitors a month from the posts we’d steadily written over the years.
A year later we receive about 230,000 organic site visitors a month, and that's no accident.
The past year also marked our foray into the world of pillar pages.
For the uninitiated, pillar pages are (very) long-form, authoritative pieces that cover all aspects of a specific topic in the hopes that search engines will regard them as a resource.
These are incredibly time-consuming to write, but they drive buckets full of visitors to your page.
We’re getting more than 30,000 visitors a month — all from pillar pages we’ve published within the last year.
To date, our focus on content marketing has improved our organic search to the tune of about 150,000 additional site visitors per month, give or take.
Conclusion
Content isn’t easy. That was the biggest revelation for me, even though it shouldn’t have been. It takes a large team of people with very specialized skills to see measurable success. Doing it at large scale requires a prodigious commitment in both money and time, even if you aren’t tasked with writing 100,000 words a month.
But that doesn’t mean you can’t find a way to make it work for you, on whatever scale that makes the most sense.
There really aren’t any secrets to growing your content engine. No magic recipe. It’s just a matter of putting the resources you have into making it happen.
Best of all, this post just gave us about 2,000 words toward this month’s word count goal.
I’m often asked about what results are earned through content marketing and digital PR.
So I decided to take a data-driven approach to quantifying the value of links from top-tier press mentions by looking at the aggregate improvements seen by a group of domains that have enjoyed substantial press attention in the last few months. Then I examined which publishers can have the biggest impact on rankings.
My goal was to answer this question: What sort of median bump can be expected when your brand secures media coverage? And how can you potentially get the biggest organic lift?
First off: Top-tier links matter a great deal
This chart represents the correlation between the number of times a site was linked to from within the article text of publishers and its rankings and traffic.
Considering the sheer number of possible variables that contribute to ranking changes (on-site factors, amount and quality of on-site content, penalties, etc.), seeing R-values (which measure the strength of the linear relationship) this high is a good result.
In general, the higher the R-value, the stronger the relationship between the number of links from publishers and improvements in organic ranking.
We found significant relationships between the number of mentions on news sites ranked in the Top 500 and an even stronger relationship for those ranked within the Top 300.
The likely reason for this is twofold:
Top 300 publishers confer more Domain Authority than less popular sites.
Top 300 publishers often have larger syndication networks and broader visibility, leading to more links being built as the result of a press mention, leading to more Domain Authority accumulation overall.
Which publishers link out the most?
When pitching publishers, it can be extremely useful to understand who is most likely to actually provide a link.
Some publishers have policies against outbound links of any type or nofollow all outbound links.
Looking at the huge dataset, I got a better understanding for which publishers link out to other sites most frequently.
Notice the large number of local news sites with high numbers of outbound links. Local news is often keen to link out.
Unfortunately, most local news won’t have large scale syndication, so looking at top-tier publishers with large numbers of outbound links is likely a better strategy when developing a pitch list. So when you remove those from the list, here are the winners.
The top 15 national publishers that provide links
Forbes
The New York Times
ZDnet
NPR
PR News Wire
Seeking Alpha
The Conversation
USA Today
CNN
Benzinga
Business Insider
Quartz
The Hill
Heavy
Vox
Sites like Forbes only dole out nofollow links, but many of these others provide dofollow links (in addition to just being great, high-authority coverage to achieve). Some industry specific options, like Seeking Alpha, Benzinga, and The Hill, can make for great vertical-specific dream publications to strive for coverage on.
Which publishers confer the most value in terms of organic search improvements?
Looking at this database, it’s possible to look at the median organic traffic gains aggregated by the site that gave the link.
This view is filtered to only include sites that had linked out 100+ times, in order to exclude outlier publishers that link out in small volumes to only a handful of sites.
More popular sites are clustered near the top, further reiterating the fairly obvious point that the more popular a site, the more value a link from them will be in terms of improving organic ranking.
While most of the top-value links are from these sites, there are quite a few mid-tier sites that seem to grant disproportionate value, including several local news sites and niche authoritative publishers.
Methodology
I used The GDELT Project, a massive repository of news articles that are searchable using BigQuery, to extract the links from all news articles over the last year. Then I aggregated them by root domain.
For each domain in the GDELT dataset that was mentioned in a news article at least 30 times, I then pulled organic data from SEMrush’s API.
I joined the SERP change numbers to the cleaned GDELT data by matching on the URL of the linked-to site. This gave me organic changes (traffic volume, price, ranking keyword volume change) for each of the root URLs linked to more than 30 times from within the text of articles in the GDELT scrape.
From there, I ran a correlation analysis to see if we could find a statistically significant influence of news coverage on rankings.
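To make the idea concrete, here's a rough sketch of that kind of correlation check (not the actual pipeline used for this study): it computes a Pearson R between the number of press links each domain received and its change in organic traffic. The data rows are illustrative.

```javascript
// Pearson correlation coefficient between two equal-length numeric arrays.
function pearsonR(xs, ys) {
  const n = xs.length;
  const mean = (a) => a.reduce((s, v) => s + v, 0) / a.length;
  const mx = mean(xs);
  const my = mean(ys);
  let num = 0;
  let dx2 = 0;
  let dy2 = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx2 += (xs[i] - mx) ** 2;
    dy2 += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx2 * dy2);
}

// rows: one entry per root domain linked to 30+ times in the news data,
// joined with its organic-traffic change (values here are made up).
const rows = [
  { domain: 'example.com', pressLinks: 42, trafficChange: 1800 },
  { domain: 'example.org', pressLinks: 12, trafficChange: 300 },
  { domain: 'example.net', pressLinks: 75, trafficChange: 4100 },
];

const r = pearsonR(
  rows.map((d) => d.pressLinks),
  rows.map((d) => d.trafficChange)
);
console.log(`R = ${r.toFixed(2)}`);
```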
Conclusion
Using insights like the ones above, you’ll be able to craft content better suited to those specific writers and audiences, increasing your chances of getting extremely impactful links via a digital PR strategy.
You can download the Tableau notebook and sort in the desktop version to explore the different sites relevant to your vertical. While not all of them may accept outside content, it’s a great start for building a “dream” pitch list. Study the type of content they typically publish, what their audience seems to enjoy most (based on shares and comments), and consider using these insights to hone your content strategy.
For marketers, Reddit is more than a tool to while away your lunch break. It's a huge, thriving forum with subreddits devoted to almost any topic you can imagine — and exciting new content ideas lurk within threads, just waiting to be discovered. In this edition of Whiteboard Friday, Daniel Russell takes you through five simple steps to mine Reddit for content ideas bolstered by your target audience's interest.
Video Transcription
Howdy, Moz fans. Welcome to another edition of Whiteboard Friday. My name is Daniel Russell. I'm from an agency called Go Fish Digital. Today we're going to be talking about mining Reddit for content ideas.
Reddit, you've probably heard of it, but in case you haven't, it's one of the largest websites on the internet. It gets billions of views and clicks per year. People go there because it is a great source of content. It's really entertaining. But it also means that it's a great source of content for us as marketers. So today what we're going to be talking about is two main groups here.
We're going to first be talking about the features of Reddit, the different things that you can use on Reddit to find good content ideas. Then we're going to be talking about five steps that you can take and apply today to start finding ideas for your company, for your clients and start getting that successful content.
First, a big feature of Reddit is called subreddits. They're essentially smaller forums within Reddit, a smaller forum within a forum dedicated to a particular topic. So there might be a forum dedicated to movies and discussing movies. There's a forum dedicated to food and talking about different types of food, posting pictures of food, posting recipes.
There is a forum for just about everything under the sun. If you can think of it, it's probably got a forum on Reddit. This is really valuable to us as marketers because it means that people are taking their interests and then putting it out there for us to see. So if we are trying to do work for a sports company or if we're trying to do work for our company that's dentistry or something like that, there is a subreddit dedicated to that topic, and we can go and find people that are interested in that, that are probably within our target markets.
Upvoting and downvoting
There's upvoting and downvoting. Essentially what this is, is people post a piece of content to Reddit, and then other users decide if they like it or not. They upvote it or they downvote it. The stuff that is upvoted is usually the good stuff. People that are paying really close attention to Reddit are always upvoting and downvoting things. Then the things that get the most upvotes start rising to the top so that other people can see it.
It's super valuable to us again because this helps verify ideas for us. This helps us see what's working and what's not. Before we even put pen to paper, before we even start designing everything, we can see what has been the most upvoted. The most upvoted stuff leads to the next big feature, which is rankings. The stuff that gets voted the most ends up ranking on the top of Reddit and becomes more visible.
It becomes easier for us to find as marketers, and luckily we can take a look at those rankings and see if any of that matches the content we're trying to create.
Comments
There's the comments section. Essentially what this is, is for every post there's a section dedicated to that post for comments, where people can comment on the post. They can comment on comments. It's almost like a focus group.
It's like a focus group without actually being there in person. You can see what people like, what people don't like about the content, how they felt about it. Maybe they even have some content ideas of their own that they're sharing in there. It's an incredibly valuable place to be. We can take these different features and start digging in to find content ideas using these down here.
Reddit search & filters
Search bar
The search bar is a Reddit feature that works fairly well, though on its own it will probably yield mediocre results most of the time. But you can drill down a little further with that search bar using search parameters. These parameters are things like searching by author or searching by website.
Search parameters
There are a lot of different searches that you can use. There's a full list of them on Reddit. But this essentially allows you to take that mediocre search bar and make it a little bit more powerful. If you want to look for sports content, you can look specifically at content posted from ESPN.com and see what has been the most upvoted there.
Restrict results to subreddit
You can restrict your results to a particular subreddit. So if you're trying to look for content around chicken dishes, you're doing work for a restaurant and you're trying to find what's been the most upvoted content around chicken, you don't want people calling each other chickens. So what you can do is restrict your search to a subreddit so that you actually get chicken the food rather than posts talking about that guy is a chicken.
Filter results
You can filter results. This essentially means that you can take all the results that you get from your search and then you can recategorize it based off of how many upvotes it's gotten, how recently it was posted, how many comments it has.
Filter subreddits
Then you can also filter subreddits themselves. So you can take subreddits, all the content that's been posted there, and you can look at what's been the most upvoted content for that subreddit.
What has been the most controversial content from that subreddit? What's been the most upvoted? What's been the most downvoted? These features make it a really user-friendly place in terms of finding really entertaining stuff. That's why Reddit is often like a black hole of productivity. You can get lost down it and stay there for hours.
That works in our benefit as marketers. That means that we can go through, take these different features, apply them to our own marketing needs, and find those really good content ideas.
5 steps to finding content ideas on Reddit
So for some examples here. There's a set of key steps that you can use. I'm going to use some real-world examples, so some true-blue things that we've done for clients so that you can see how this actually works in real life.
1. Do a general search for your topic
The first step is to do a general search for your topic. So real-world example, we have a client that is in the transportation space. They work with shuttles, with limos, and with taxis. We wanted to create some content around limos. So the way we started in these key steps is we did a general search for limos.
Our search yielded some interesting things. We saw that a lot of people were posting pictures of stretch limos, of just wild limo interiors. But then we also saw a lot of people talking about presidential limos, the limos that the president rides in that have the bulletproof glass and everything. So we started noticing that, hey, there's some good content here about limos. It kind of helped frame our brainstorming and our content mining.
2. Find a subreddit that fits
The next step is to find a subreddit that fits that particular topic. Now there is a subreddit dedicated to limos. It's not the most active. There wasn't a ton of content there. So what we ended up doing was looking at more broad subreddits. We looked at like the cars subreddit.
There was a subreddit dedicated to guides and to breakdowns of different machines. So there were a lot of breakdowns, like cutaways of the presidential limos. So again, that was coming up. What we saw in the general search was coming up in our subreddit specific search. We were seeing presidential limos again.
3. Look at subreddit content from the past month
Step 3, look at that sub's particular content from the past month. The subreddit, for example, that we were looking at was one dedicated to automobiles, as I had mentioned earlier. We looked at the top content from that past month, and we saw there was this really cool GIF that essentially took the Chevy logo back from like the '30s and slowly morphed it over the years into the Chevy logo that we saw today.
We thought that was pretty cool. We started wondering if maybe we could apply that same kind of idea to our presidential limo finding that we were seeing earlier.
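If you want to speed up this kind of monthly review, Reddit also exposes public JSON listings you can query directly. Here's a minimal sketch, assuming you run it from Node 18+ or another environment with fetch available; the subreddit name is just an example:

```javascript
// Sketch: pull the top posts from a subreddit for the past month so you can
// scan titles, upvotes, and comment counts for sticky ideas.
async function topPostsLastMonth(subreddit, limit = 25) {
  const url = `https://www.reddit.com/r/${subreddit}/top.json?t=month&limit=${limit}`;
  const res = await fetch(url, { headers: { 'User-Agent': 'content-research-script' } });
  const json = await res.json();

  // Keep just the fields useful for spotting patterns.
  return json.data.children.map(({ data }) => ({
    title: data.title,
    upvotes: data.ups,
    comments: data.num_comments,
    url: `https://www.reddit.com${data.permalink}`,
  }));
}

// Example: see what resonated in an automotive subreddit over the past month.
topPostsLastMonth('cars').then((posts) => console.table(posts));
```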
4. Identify trends, patterns, and sticky ideas
Number 4 was to identify trends, patterns, and sticky ideas. Sticky ideas, it just means if you come across something and it just kind of sticks in your head, like it just kind of stays there, likely that will happen for your audience as well.
So if you come across anything that you find really interesting, that keeps sticking in your head or keeps popping up on Reddit, it keeps getting lots of upvotes, identify that idea because it's going to be valuable. So for us, we started identifying ideas like morphing GIFs, the Chevy logo morphing over time. We started identifying ideas like presidential limos. People really like talking about it.
5. Polish, improve, and up-level the ideas you've found
That led us to use Step Number 5, which is to take those ideas that we were finding, polish them, improve them, one up it, take it to the next level, and then create some content around that and promote it. So what we did was we took those two ideas, we took presidential limos and the whole morphing GIF idea over time, and we combined them.
We found images of all of the presidential limos since like the '50s. Then we took each of those presidential limos and we created a morphing GIF out of them, so that you started with the old presidential limos, which really weren't really secure. They were convertibles. They were normal cars. Then that slowly morphed up to the massive tanks that we have today. It was a huge success.
It was just a GIF. But that idea had been validated because we were looking at what was the most upvoted, what was the most downvoted, what was ranked, what wasn't ranked, and we saw some ideas that we could take, one up, and polish. So we created this morphing presidential limo, and it did really well.
It got coverage in a lot of major news networks. ABC News picked it up. CBS talked about it. It even got posted to Reddit later and performed really well on Reddit. It was all because we were able to take these features, mine down, drill down, find those good content ideas, and then polish it and make it our own.
I'm really interested to hear if you've tried this before. Maybe you've seen some really good ideas that you'd like to try out on Reddit.
Do you have like a favorite search function that you use on Reddit? Do you like to filter by the past year? Do you like a particular subreddit? Let me know down in the comments. Good luck mining ideas. I know it will work for you. Have a great day.
The annual study provides KPI benchmark data that allows digital marketers to analyze their 2019 performance and plan their 2020. The most popular section in the report amongst Moz readers has always been the conversion correlation, where we crunch the numbers to see what sets the high-performing websites apart.
We're privileged to count a number of particularly high-performance websites among our dataset participants. There have been over twenty international digital marketing awards won by a spread of participant websites in the last three years. In these findings, you're getting insights from the global top tier of campaigns.
If we take a five-year look-back, we can see the conversion correlation section acts as an accurate predictor of upcoming trends in digital marketing.
In our 2016 study, the two stand-out correlations with conversion rate were:
High-performing websites got significantly more paid search traffic than the chasing pack.
High-performing websites got significantly more mobile traffic than the chasing pack.
The two strongest overall trends in our 2020 report are:
It’s the first year in which paid search has eclipsed organic for website revenue.
It’s the first year the majority of revenue has come from mobile devices.
This tells us that the majority of websites have now caught up with what the top-performing websites were doing five years ago.
So, what are the top performing websites doing differently now?
These points of differentiation are likely to become the major shifts in the online marketing mix over the next 5 years.
Let’s count down to the strongest correlation in the study:
4. Race back up to the top! Online PR and display deliver conversions
For the majority of the 2010s, marketers were racing to the bottom of the purchase funnel. More and more budget flowed to search to win exposure to the cherished searcher — that person pounding on their keyboard with their credit card between their teeth, drunk on the newfound novelty of online shopping. The only advertising that performed better than search was remarketing, which inched the advertising closer and closer to that precious purchase moment.
Now in 2020, these essential elements of the marketing mix are operating at maximum capacity for any advertiser worth their salt. Top performing websites are now focusing extra budget back up towards the top of the funnel. The best way to kill the competition on Search is to have the audience’s first search be your brand. Outmarket your competition by generating more of your cheapest and best-converting traffic: luvly brand traffic. We saw correlations with Average Order Value from websites that got higher-than-average referral traffic (0.34), and I can’t believe I’m going to write this, but display also correlated with a conversion success metric, Average Order Value (0.37). I guess there's a first time for everything!
3. Efficiencies of scale
Every budding business student knows that when volume increases, cost per unit decreases. It’s called economies of scale. But what do you call it when it’s revenue per unit that’s increasing with volume? At Wolfgang, we call it efficiencies of scale. Similar to last year’s report, one of the strongest correlations against a number of the success metrics was simply the number of sessions. More visitors to the site equals a higher conversion rate per user (0.49). This stat summons the final wag for the long-tail of smaller specialist retailers. This finding is consistent across both the retail and travel sectors.
And it illustrates another reversal of a significant trend in the 2010s. The long-tail of retailers were the early settlers in the e-commerce land of plenty. Very specialist websites with a narrow product range could capture high volumes of traffic and sales.
For example, www.outboardengines.com could dominate the SERP and then affiliate link or dropship product, making for a highly profitable small business. The entrepreneur behind this microbusiness could automate the process and replicate the model again and again for the products of her choosing. Timothy Ferris’ book, The 4 Hour Work Week, became the bible to the first flush of digital nomads; affiliate conferences in Vegas saw leaning towers of chips being pushed around by solopreneur digital marketers with wild abandon.
Alas, by the end of the decade, Google had started to prioritize brands in the SERP, and the big players had finally gotten their online act together. As a result, we are now seeing significant ‘efficiencies of scale’ as described above.
2. Attract that user back
What’s the key insight digital marketers need to act upon to succeed in the 2020s? Average Sessions per Visitor is 2, Average Sessions per Purchaser is 5.
In other words, the core role of the marketer is to create an elegant journey across touchpoints to deliver a person from two-click prospect to five-click purchaser. Any activity which increases sessions per visitor will increase conversion. Similar to last year’s report, another of the strongest and most consistent correlations was the number of Sessions per User (0.7) — which emphasizes the importance of this metric.
So where should a marketer seek these extra interactions?
Check out the strongest correlation we found with conversion success in the Wolfgang KPI Report 2020….
1. The social transaction
The three strongest conversion correlations across the 4,000 datapoints were related to social transactions. This tells us that the very top performing websites were significantly better than everybody else at generating traffic from social that purchases.
Google Analytics is astonishingly rigorous at suppressing social media success stats. It appears they would rather have an inferior analytics product than accurately track cross-device conversions and give social its due. They can track cross-device conversions in Google Ads — why not in Analytics? So, if our Google Analytics data is telling us social is the strongest conversion success factor, we need to take notice.
This finding runs in parallel with recent research by Forrester which finds one-third of CMOs still don’t know what to do with social.
Our correlation calc finds that social is the biggest point of difference between the high flyers and the chasing pack. The marketers who do know how to use social are the tip-top performing marketers of the bunch. We also have further findings on how to out-market the competition on social in the full study.
Here’s the top tier of correlations we extracted from a third of a billion euro in online revenues and over 100 million website visits:
Retail
Travel
Overall
To read more of our findings pertaining to:
The social sweet spot
Average conversion rates in your industry
In-store sales benchmarked
Why data is the new oil
2010 was the decade of the…
And much, much more
Have a look at the full e-commerce KPI report for 2020. If you found yourself with any questions or anecdotes relating to the data shared here, please let us know in the comments!