Satisfying search intent is a critical component of our daily SEO work. But if you're not thinking ahead to what a searcher might look for after that initial query is answered, you could be missing out.
In today's Whiteboard Friday, Ola tells you what "next search intent" is, why it's important, and how to optimize for it.
Video Transcription
Hi, Moz fans. I'm Ola King. I'm a user researcher here at Moz, and I'm excited to join you today for another edition of Whiteboard Friday. Today I'll be talking to you about next search intent. In a previous Whiteboard Friday, I mentioned the three bosses of SEO, and one of the factors I mentioned that affects SEO is the search intent behind each and every search.
So everyone performing a search on a search engine is looking for something, and the search intent is the purpose behind that search. As an SEO, satisfying that search intent is critical to the success of your content. Britney Muller has a very in-depth Whiteboard Friday on this topic already, so I'm not going to be covering that. Logan Bryant also has another topic called the hidden search intent. So that's something you should check out as well.
But today I am going to be talking about what happens after you satisfy the initial search intent, so the next search intent.
What is next search intent?
So what's next search intent? Well, if you look at search as a journey, the next search intent is the next step in that journey: what someone would most likely be looking for after they've completed the objective of a particular page.
So if search intent helps a searcher stay on your page, next search intent helps a searcher stay on your site.
Why is it important?
So why is this important? Well, SEO is not just about ranking. To really maximize your SEO efforts, you have to start thinking about how your pages convert, how they move people into the next stages of your funnel (funnel optimization), what the user experience is like for your searchers, what the customer journey looks like, and whether searchers are engaging with the relevant content you want them to engage with. You also have to think about how all of this helps you retain your ideal customers or searchers, how each piece of content internally links to the other content on your site, and traffic distribution, that is, how you move traffic from your top-performing pages to pages that might not be getting as much.
How to identify next search intent
So how do you implement next search intent? Well, the goal of next search intent is really to identify what people are most likely to search for next and then nudge the searchers into those next actions. So you can do this with simple calls to action, embeds on pages, and links from one page to another.
Or you can get more advanced by tweaking your nav bar, making things a bit customized, adding a read next section to each one of your pages, having launchers that pop up. So many different ideas. Pretty much your goal is just to think of a particular page and think as a user, as a searcher, "What would I most likely need next after I've consumed this information?"
So some ways to get ideas are to understand your searcher's persona, look at similar keywords that might be related to what your page is ranking for, and look at the other keywords you're already ranking for. Look at what your competitors are ranking for that you might not be ranking for. This can reveal blind spots: content that might not be relevant to this particular page, but is relevant to other related pages.
Understand the curiosity journey. So this is like customer journey, but instead of looking at your funnel, you're trying to look at, in terms of an information let's call it map, what would someone want to know about next. Focus on the user experience as well. Providing the most relevant information always helps with a good user experience.
Check your Google Analytics and see which pages people visit after they land on a particular page. That will give you clues about the next page, or next intent, that they want. You can also look on Google itself. Just search for a keyword and, for some keywords, you'll see a "People also search for" section, and that's the most obvious way to find the next search intent.
Four types of search intent
So how do you do this? Well, if you've watched Britney's video or other information around the search intent, you will understand that there are four main types of search intent — so informational, commercial, navigational, and transactional.
Informational
So for informational, your goal is really to provide a good user experience and to optimize your funnel so that you can move people or searchers from one page to another. So you can do this by surfacing related content and then linking to your relevant pages on your site.
Commercial
For the commercial one, your goal is for conversion because commercial is just about purchase intent.
So you can do this by adding a comparison of your competitors' products or similar products that you have on your site, adding coupons and discounts, and answering any objections that someone might have. So be proactive: anticipate the information people will need before they need it and surface it on your page.
Navigational
Navigational, the goal is also a good user experience, retaining people on your website, and making sure you optimize the journey, that is, the traffic flow from one page to the most relevant next page. You can do this with launchers that pop up as users perform certain actions on the page.
You can have customized nav bars. You can set up your sitelinks correctly so that it's apparent from the search engine results which pages people can visit as well.
Transactional
So for transactional, this is when someone already knows what they want and is just trying to buy it. Your goal in this case is simply to convert and upsell.
So you want to have your related products surfacing, have your product variations, and then have compatible purchases, like Amazon's "people also buy" type of thing. You can put a little demo up as well, to help searchers who might be looking for how your product works in real life, by being proactive and having that on your page.
But yeah, so that really is next search intent. If you have any ideas that I might have missed, please don't hesitate to reach out and I would love to learn from you as well. The main key point to learn here is that when it comes to SEO, you want to think about things in a holistic way.
You don't want to just look at one page on your site. You want to look at how each page connects, and understanding the next search intent allows you to bring value from one page that is performing well to other pages on your website so that your entire site can be blooming. But yeah, thanks for joining me today and see you next time.
Last year, the team at Homeday — one of the leading property tech companies in Germany — made the decision to migrate to a new content management system (CMS). The goals of the migration were, among other things, increased page speed and creating a state-of-the-art, future-proof website with all the necessary features. One of the main motivators for the migration was to enable content editors to work more freely in creating pages without the help of developers.
After evaluating several CMS options, we decided on Contentful for its modern technology stack, with a superior experience for both editors and developers. From a technical viewpoint, Contentful, as a headless CMS, allows us to choose which rendering strategy we want to use.
We’re currently carrying out the migration in several stages, or waves, to reduce the risk of problems that have a large-scale negative impact. During the first wave, we encountered an issue with our cookie consent, which led to a visibility loss of almost 22% within five days. In this article I'll describe the problems we were facing during this first migration wave and how we resolved them.
Setting up the first test-wave
For the first test-wave we chose 10 SEO pages with high traffic but low conversion rates. We established an infrastructure for reporting and monitoring those 10 pages:
Rank-tracking for most relevant keywords
SEO dashboard (DataStudio, Moz Pro, SEMRush, Search Console, Google Analytics)
Regular crawls
After a comprehensive planning and testing phase, we migrated the first 10 SEO pages to the new CMS in December 2021. Although several challenges occurred during the testing phase (increased loading times, a bigger HTML Document Object Model, etc.), we decided to go live as we didn't see any big blockers and we wanted to migrate the first test-wave before Christmas.
First performance review
Very excited about achieving the first step of the migration, we took a look at the performance of the migrated pages the next day.
What we saw next really didn't please us.
Overnight, the visibility of tracked keywords for the migrated pages dropped from 62.35% to 53.59% — we lost 8.76 percentage points of visibility in one day!
As a result of this steep drop in rankings, we conducted another extensive round of testing. Among other things, we tested for coverage and indexing issues, checked that all meta tags were included, and reviewed structured data, internal links, page speed, and mobile friendliness.
Second performance review
All the articles had a cache date after the migration, and the content was fully indexed and being read by Google. Moreover, we could exclude several migration risk factors (change of URLs, content, meta tags, layout, etc.) as sources of error, as there hadn't been any changes.
Visibility of our tracked keywords suffered another drop to 40.60% over the next few days, making it a total drop of almost 22 percentage points within five days. This was also clearly visible in comparison with the competition for the tracked keywords, where our estimated traffic behaved analogously to the visibility.
As other migration risk factors plus Google updates had been excluded as sources of error, it definitely had to be a technical issue. Too much JavaScript, low Core Web Vitals scores, or a larger, more complex Document Object Model (DOM) could all be potential causes. The DOM represents a page as objects and nodes so that programming languages like JavaScript can interact with the page and change, for example, its style, structure, and content.
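As a toy illustration of what "a larger, more complex DOM" means, here is a small Python sketch (standard library only, with invented HTML) that counts the element nodes a parser would build for two snippets. More wrapper elements mean more nodes for a browser, or Googlebot, to build and traverse:

```python
from html.parser import HTMLParser

class NodeCounter(HTMLParser):
    """Count element nodes: each opening tag becomes one node in the DOM tree."""
    def __init__(self):
        super().__init__()
        self.nodes = 0

    def handle_starttag(self, tag, attrs):
        self.nodes += 1

def count_dom_nodes(html: str) -> int:
    counter = NodeCounter()
    counter.feed(html)
    return counter.nodes

# Invented snippets: the same visible content, with and without wrapper divs.
lean_page = "<html><body><h1>Title</h1><p>Copy</p></body></html>"
bloated_page = lean_page.replace("<p>Copy</p>", "<div><div><p>Copy</p></div></div>")

assert count_dom_nodes(lean_page) == 4      # html, body, h1, p
assert count_dom_nodes(bloated_page) == 6   # two extra div nodes
```

Frameworks that wrap content in extra container elements, as described for Nuxt later in this article, inflate exactly this node count.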
Following the cookie crumbs
We had to identify the issue as quickly as possible, fix it, and minimize further negative effects and traffic drops. We finally got the first real hint of the technical cause when one of our tools showed us that the number of pages with high external linking, as well as the number of pages at maximum content size, went up. It is important that pages don't exceed the maximum content size, as pages with a very large amount of body content may not be fully indexed. Regarding the high external linking, it is important that all external links are trustworthy and relevant for users. It was suspicious that the number of external links went up so suddenly.
Both metrics were disproportionately high compared to the number of pages we migrated. But why?
After checking which external links had been added to the migrated pages, we saw that Google was reading and indexing the cookie consent form for all migrated pages. We performed a site search, checking for the content of the cookie consent, and saw our theory confirmed:
This led to several problems:
Tons of duplicated content was created for each page due to the indexing of the cookie consent form.
The content size of the migrated pages drastically increased. This is a problem as pages with a very large amount of body content may not be fully indexed.
The number of external outgoing links drastically increased.
Our snippets suddenly showed a date on the SERPs. This would suggest a blog or news article, while most articles on Homeday are evergreen content. In addition, due to the date appearing, the meta description was cut off.
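One practical way to catch this class of problem is to look for text blocks that repeat verbatim across many URLs. Here is a minimal, hypothetical sketch in Python (the page contents and banner text are made up; this is not the tooling we used):

```python
def shared_paragraphs(pages: list[str], min_len: int = 30) -> set[str]:
    """Return paragraphs (split on blank lines) that appear verbatim on
    every page and are long enough to matter for duplicate content."""
    para_sets = []
    for page in pages:
        paras = {p.strip() for p in page.split("\n\n") if len(p.strip()) >= min_len}
        para_sets.append(paras)
    return set.intersection(*para_sets) if para_sets else set()

# Invented example: three articles that all embed the same consent banner text.
banner = "We use cookies to improve your experience and to show personalized content."
pages = [
    f"Unique article one.\n\n{banner}",
    f"Unique article two.\n\n{banner}",
    f"Unique article three.\n\n{banner}",
]
assert shared_paragraphs(pages) == {banner}
```

A block that surfaces on every migrated page but on none of the old pages is a strong signal that boilerplate, like a consent form, has leaked into the indexable content.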
The cookie consent banner hadn't been indexed on our old pages, so why was it being indexed on the migrated ones? We crawled and rendered the pages with different user agents, but still couldn't find a trace of the Cookiebot in the source code.
Investigating Google DOMs and searching for a solution
The migrated pages are rendered with dynamic data that comes from Contentful and plugins. The plugins contain just JavaScript code, and sometimes they come from a partner. One of these plugins was the cookie manager partner, which fetches the cookie consent HTML from outside our code base. That is why we didn't find a trace of the cookie consent HTML in the source files in the first place. We did see a larger DOM, but traced that back to the default, more complex, larger DOM that Nuxt produces. Nuxt is the JavaScript framework we work with.
To validate that Google was reading the copy from the cookie consent banner, we used the URL inspection tool of Google Search Console. We compared the DOM of a migrated page with the DOM of a non-migrated page. Within the DOM of a migrated page, we finally found the cookie consent content:
Something else that got our attention were the JavaScript files loaded on our old pages versus the files loaded on our migrated pages. Our website has two scripts for the cookie consent banner, provided by a 3rd party: one to show the banner and grab the consent (uc) and one that imports the banner content (cd).
The only script loaded on our old pages was uc.js, which is responsible for the cookie consent banner. It is the one script we need in every page to handle user consent. It displays the cookie consent banner without indexing the content and saves the user's decision (if they agree or disagree to the usage of cookies).
For the migrated pages, aside from uc.js, a cd.js file was also loading. If we have a page where we want to show the user more information about our cookies and index the cookie data, then we have to use cd.js. We thought the two files were dependent on each other, which is not correct: uc.js can run alone. The cd.js file was the reason the content of the cookie banner got rendered and indexed.
It took a while to find it because we thought the second file was just a pre-requirement for the first one. We determined that simply removing the loaded cd.js file would be the solution.
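A lightweight regression check along these lines is to list the external scripts each template loads, so an unexpected file like cd.js stands out immediately. A hypothetical Python sketch (the consent host below is a placeholder, not the real URL):

```python
from html.parser import HTMLParser

class ScriptLister(HTMLParser):
    """Collect the src attribute of every <script> tag on a page."""
    def __init__(self):
        super().__init__()
        self.scripts: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.scripts.append(src)

def list_scripts(html: str) -> list[str]:
    lister = ScriptLister()
    lister.feed(html)
    return lister.scripts

# Filenames mirror the article; "consent.example" is a placeholder host.
migrated = ('<head><script src="https://consent.example/uc.js"></script>'
            '<script src="https://consent.example/cd.js"></script></head>')
filenames = [s.split("/")[-1] for s in list_scripts(migrated)]
assert filenames == ["uc.js", "cd.js"]  # cd.js should not be here
```

Comparing this list between old and migrated templates would have surfaced the extra cd.js on day one.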
Performance review after implementing the solution
The day we deleted the file, our keyword visibility was at 41.70%, still about 21 percentage points lower than pre-migration.
However, the day after deleting the file, our visibility increased to 50.77%, and the next day it was almost back to normal at 60.11%. The estimated traffic behaved similarly. What a relief!
Conclusion
I can imagine that many SEOs have dealt with tiny issues like this. It seems trivial, but it led to a significant drop in visibility and traffic during the migration. This is why I suggest migrating in waves and blocking enough time for investigating technical errors before and after each migration. Moreover, keeping a close eye on the site's performance in the weeks after the migration is crucial. These are definitely my key takeaways from this migration wave. We completed the second migration wave at the beginning of May 2022, and I can say that so far no major bugs have appeared. We'll run two more waves and hopefully complete the migration successfully by the end of June 2022.
The performance of the migrated pages is almost back to normal now, and we will continue with the next wave.
Inclusivity is an important consideration for every business owner and content creator, and should be at the heart of your ongoing design efforts — not something you look at after a website or piece of content goes live.
Before we get to specific tips on creating this inclusive content, let’s go over key definitions and concepts.
What is inclusivity?
Inclusivity is about recognizing diversity. It ensures everyone can participate to the greatest possible extent. Other names for inclusivity include universal design and design for all.
Inclusivity addresses a wide range of issues, including:
Accessibility for people with disabilities
Access to and quality of internet connectivity, computer hardware, and computer software
Computer literacy and skills
Economic circumstances
Education
Geographic location
Culture
Age (older and younger people)
Language
Understanding Usability, Accessibility, and Inclusivity
Usability
According to the World Wide Web Consortium (W3C), an international standards organization that publishes guidelines and recommendations for web technologies, “Usability is about designing products to be effective, efficient, and satisfying.”
Usability factors measure the functionality of a product or design and the design interface’s ease of use. They assess how easy it is for users to learn the basic tasks of the interface, how quickly users can perform tasks on the interface, and whether users can remember how to perform those tasks after time away from the interface. Usability factors also consider whether the design satisfies users, and if there are errors in the interface, how severe those errors are, and the ease of recovering from those errors.
Think about Google’s search page design. When it first launched, it received considerable backlash. This was an era when internet users wanted their own “home pages” or “web portals,” each presenting their favorite news and links when the browser launched. But what did these portals all have in common? A search box. The fact that this simple (some still say ugly) design eventually became the homepage for billions of internet users speaks to the value of simplicity and how it enables inclusivity.
Accessibility

According to W3C, “Accessibility addresses discriminatory aspects related to equivalent user experience for people with disabilities. Web accessibility means that people with disabilities can equally perceive, understand, navigate, and interact with websites and tools.”
In other words, all users, regardless of ability or circumstance, should be able to:
Perceive all interface or document elements.
Operate the controls easily and intuitively.
Understand the content.
Use different assistive technologies, devices, browsers, and operating systems to interact with the content.
For example, using high-contrast colors in an app doesn’t just help people who have low vision or color blindness, it also helps people who use their devices in bright sunlight. Similarly, while improvements to usability, like using simple language and intuitive design, may allow people with cognitive disabilities to use a product or service more productively, they are also beneficial for people who may be busy or distracted, who are learning the language, or even people with slower internet access, because simple and intuitive sites may be faster to load.
Inclusivity

Inclusivity means representing people who have, until now, been underrepresented. Inclusivity issues affect people from specific populations within a community, as well as communities that have been denied the opportunity to participate fully in economic, social, or civic life.
Inclusive content should always recognize diversity in the functional needs and abilities of individuals. To make your content inclusive, think of peoples’ diverse abilities, ensuring your content can be accessed in a variety of ways.
Inclusive content should also encompass diversity in personal needs and experiences. We should all challenge ourselves to do better at including different communities, identities, races, ethnicities, backgrounds, abilities, cultures, and beliefs. We must take steps to avoid “othering” people. By ensuring that everyone feels welcome in our digital spaces, we can more accurately represent the world we live in.
History has taught us about the struggles for equity and inclusion globally. Wars have been fought over human rights and the rights of enslaved peoples. Women have protested for suffrage. People of color have long fought for civil rights, and in recent years the Black Lives Matter movement has highlighted systemic inequalities and the need for social justice. The Gay Revolution has sought to achieve equal rights for gay, lesbian, bisexual, and transgender people. Indigenous people have struggled to gain equality and receive meaningful reconciliation for the abuse and mistreatment suffered at the hands of governments. And the Disability Rights Movement has worked to change attitudes, promote integration, and ensure that all people, regardless of ability, have equal access to transportation, housing, education, and employment opportunities.
While educational systems do teach us about these issues, they often present them as discrete events that have little real connection to the “dominant” society. As a result, most people are not comfortable speaking or writing about them. We simply do not have the vocabulary.
By making empathy an essential part of inclusion, business owners and content creators can improve their audience communications, build trust, and grow networks. Meeting the functional needs of all users creates a better reputation and improved word-of-mouth in more communities.
Nine ways to design inclusive content
Inclusive design should always meet the needs of as many users as possible. Use the following inclusive digital content tips when creating websites, mobile apps, e-mail, and documents.
1. Assess points of bias in your content and design practices.
Does your content default to the pronoun “he,” or does it make equal use of “she” and “they”? Review your text content, stock photography, and illustrations. Look at both digital and printed materials aimed at internal and external audiences. Do they feature mostly white, male, straight, non-disabled people? If so, it’s time to switch up the terminology and create a new aesthetic – one with more diversity.
2. Review words through the lens of inclusivity.

Identify and remove all instances of othering and ableism. This boils down to respect. Avoid collective terms and labels, such as “females” or “the blind,” that group individuals into a category that promotes objectification. Always remember an individual’s disability, gender, race, ethnicity, nationality, or heritage is just a single facet of their unique and complex identity.
Disability
A disability is something that a person has. It is not who they are. Use “person with [a disability]” or “person who has [a disability].” Put the person at the forefront. For example, say “a person who is blind,” not “a blind person.”
Many people with disabilities prefer people-first language. However, there are also people with disabilities who prefer identity-first language, and it is important to respect this preference and give every person the self-determination to choose. Never correct someone who refers to themselves using identity-first language. For example, don’t correct or change the reference when someone calls themselves a “blind person” as opposed to “a person who is blind.”
Avoid any language that suggests pity or hopelessness, or that disempowers people with disabilities, such as “suffers” or “victim.” Avoid words like “brave” or “courageous,” as these words can belittle or trivialize people with disabilities.
Gender
The male-dominated cultures of North America and Europe have created a biased vocabulary that should be adjusted for inclusion. It is common, for example, to refer to groups of people as “guys,” even when the group includes both men and women. Adult women are often “girls,” but adult men are rarely “boys.” Then there are the outdated words and phrases that have become problematic, such as “manpower,” “man-hours,” “man the controls,” or “man up.” Substitute these terms with “workforce,” “hours,” “take the controls,” or “step up.”
What should you do if you’re not sure how to address someone? Use the gender-neutral pronouns they/them/their when meeting someone for the first time. Here’s an inclusive way to introduce yourself: “I refer to myself as (she/her); what pronouns do you prefer?” or “My pronouns are (they/them); what pronouns do you use?”
Race and ethnicity
Just like the vocabulary of gender, words associated with race, ethnicity, nationality, and heritage have been debated over vigorously in recent years. A central part of inclusion is adopting terms that are honest and respectful and being intolerant of terms that are disrespectful. Historic bias has permeated our culture, and as a result, problematic terminology once used to describe people who are not white has been applied to many different areas of life, including technology (whitelist/blacklist, master/slave devices). Always address people’s ethnicity, nationality, or heritage respectfully and without irony, racism, or satire.
3. Use responsive design that allows zoom and orientation changes.
Responsive design allows web content layouts to display well on many form factors, using “breakpoints” to define different widths. Sometimes app makers lock orientation or turn off the zoom ability. These are basic tools that we all need from time to time. Instead of disabling them, design apps to accommodate them.
Here are a few responsive design and accessibility guidelines:
Use link text that describes the link destination.
Write meaningful and descriptive alt text.
Use semantic HTML – tags that clearly describe the purpose of page elements, such as <header>, <article>, <aside>, or <footer>.
Instead of fixed or absolute font sizes, use relative units for font sizes, such as percentage units, viewport width, or viewport height.
Label all buttons and fields.
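To make the font-size guideline concrete, here is a small, illustrative Python lint that flags fixed pixel font sizes in a stylesheet. The CSS is invented for the example, and a real linter would need to handle far more syntax:

```python
import re

def fixed_font_sizes(css: str) -> list[str]:
    """Return every font-size declared in fixed px units, which should
    generally be replaced with relative units like %, rem, or vw."""
    return re.findall(r"font-size\s*:\s*(\d+(?:\.\d+)?px)", css)

css = """
body { font-size: 100%; }   /* relative: scales with user settings */
h1   { font-size: 32px; }   /* fixed: should be flagged */
p    { font-size: 1rem; }   /* relative */
"""
assert fixed_font_sizes(css) == ["32px"]
```

Relative units let text scale when users zoom or raise their system font size, which is exactly the behavior the guidelines above are protecting.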
Using responsive design and accessibility guidelines makes it easier for search engines to crawl and interpret websites. For example, sites with clear, descriptive headings – the same kinds of headings that also make navigation and comprehension easier for people with disabilities – are better optimized for search engines to do their work.
Because of this, Google rewards accessibility when ranking websites. In fact, their Webmaster Guidelines – which lay out the best practices that help Google to find, index, and rank your site – read very much like accessibility guidelines, and often correlate directly with the W3C’s Web Content Accessibility Guidelines (WCAG).
By using accessible design, you are simultaneously improving the on-page experience, making your content more accessible to users with disabilities, and facilitating the work of search engine bots who are busy crawling, indexing your site, and assessing link equity between pages – all of which help boost your SEO ranking.
4. Use freely available Microsoft, Apple, and Android accessibility attributes and accessibility test apps when developing documents, web content, or apps.
Testing is an integral part of developing accessible and inclusive digital content. Fortunately, tech companies have created a range of tools and resources to help ensure your documents, websites, and apps can be accessed easily by all users. For example, Microsoft Office applications – Word, PowerPoint, or Excel – have the Check Accessibility feature, which resides in the Review menu. From there, you can choose the ‘Check Accessibility’ button and follow the instructions to remove accessibility problems, rechecking as you go until all issues have been addressed.
5. Use simple, hierarchical headings to organize documents, web pages, or emails.

This allows all users and their technology to interpret the main ideas of the content and find the information they’re looking for. Graphics should be secondary and should never contain critical information. Provide equivalent alternatives for any image that contains text.
Also be sure to include a link to the web version of any HTML message right at the top, in case an e-mail client has issues with the layout or graphics. If you’re using HTML messages, follow basic, semantic HTML best practices. E-mail is the only technology where tables for layout are still acceptable, but even there, CSS is now preferred.
Evaluate your content without graphics or with the graphics turned off. This will ensure your content can be accessed on low-fi devices, over bad connections, or through a screen reader.
6. Provide text alternatives for non-text elements, such as images and forms.
Use simple, descriptive language as alt text for images. If the image serves a specific function, the alt text should explain what that function is and describe the contents of the image. The goal is to ensure that everyone has access to the same information about the image.
If the image contains a chart or graph, the alt text should include the data. If you’re using a creative photo or a photo as an illustration, the alt text should describe the elements of the image in detail.
If the document or content includes images that are not important, are used for layout, or do not serve a specific function, use null alt text (alt=""). This will keep them hidden from assistive technologies.
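As a sketch of how such an alt-text audit might work, this minimal Python example (standard library only, with invented sample markup) separates images with descriptive alt text, decorative images with an explicit empty alt, and images missing the attribute entirely:

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Sort <img> tags into missing-alt (an error) and decorative (alt="")."""
    def __init__(self):
        super().__init__()
        self.missing: list[str] = []
        self.decorative: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "?")
        if "alt" not in attrs:
            self.missing.append(src)       # no alt attribute at all: flag it
        elif attrs["alt"] == "":
            self.decorative.append(src)    # explicitly decorative: fine

auditor = AltAuditor()
auditor.feed('<img src="chart.png" alt="Sales by quarter, Q4 highest">'
             '<img src="divider.png" alt="">'
             '<img src="hero.jpg">')
assert auditor.missing == ["hero.jpg"]
assert auditor.decorative == ["divider.png"]
```

The key distinction the code encodes: a missing alt attribute leaves screen readers guessing, while an explicit empty alt tells them to skip the image.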
Labels for form fields and options are also areas where inclusion and accessibility should be addressed. Every field and button should have a label, written in plain language. If button labels have icons or images, the alt text should describe the function of the button, rather than its appearance.
When creating contact forms, label all fields visibly. Include formatting hints to reduce errors. If you use a placeholder, ensure that it stays visible.
7. Take advantage of freely available accessibility checkers.
Check your content regularly to keep it inclusive and accessible. Every time you make changes to your website, you run a risk of making content inaccessible to people with disabilities. This is why ongoing monitoring is important.
AudioEye, a digital accessibility platform, offers Active Monitoring to help site owners keep their content accessible and inclusive. It checks for accessibility issues every time a site visitor loads a new page, tests for new accessibility issues across all users and pages, and then automatically fixes the majority of common errors. The platform displays all accessibility issues found and fixed in an Issue Reporting dashboard, along with details on how these issues affect users with disabilities and how to fix unresolved issues that require manual intervention. You can start by trying AudioEye’s Accessibility Checker.
8. Assess the color scheme for contrast and distinction.

White text on a black background is high contrast, while white text on a pale blue background is low contrast. Many people with visual disabilities rely on high color contrast to view digital content. And because visual acuity and the ability to distinguish colors fade with age, high color contrast also ensures older users can access your content.
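Contrast can be measured, not just eyeballed: WCAG defines a contrast ratio based on relative luminance, with 4.5:1 as the AA minimum for normal text. Here is a small Python implementation of that published formula (the "pale blue" RGB value is just an example color):

```python
def channel(c: int) -> float:
    """Linearize one sRGB channel (0-255) per the WCAG luminance formula."""
    c = c / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """WCAG contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter color on top."""
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

white, black = (255, 255, 255), (0, 0, 0)
pale_blue = (173, 216, 230)  # example value for a pale blue background

assert round(contrast_ratio(white, black), 1) == 21.0  # maximum possible
assert contrast_ratio(white, pale_blue) < 4.5          # fails AA for normal text
```

White on black hits the maximum ratio of 21:1, while white on this pale blue lands well under the 4.5:1 AA threshold, confirming the examples in the text.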
9. Make your social media posts accessible.

Social media is used to convey messages or ideas quickly, usually in just a line or two of text, a single image, or a short video. To meet accessibility requirements for social media, ensure that all images and video clips are described in detail. Include voiceovers, narration, and song lyrics in the description. Don’t forget to include the emotion the subject may be trying to evoke. This also helps low-bandwidth users participate in social media.
Assess the cast of your videos. Do they feature mostly white men? Look for ways to feature a broad cross-section of society, including different genders, people of different ethnicities, and people with disabilities.
Emojis make social media posts fun, but they can also pose problems for people who can’t see them. Because screen readers use words to describe the emoji, a series of smiley faces and hearts added in the middle of an Instagram caption becomes “grinning face, smiling face with smiling eyes, smiling face with heart-eyes, red heart, red heart, red heart.” Limit your emojis to two or three, and put them at the end of your text, so they don’t get in the way of the information in the post.
Hashtags are an essential part of social media posts: highlighting key words and phrases with # makes it easy for users to find posts. Making hashtags accessible is simple: capitalize the first letter of each word in the hashtag (also known as CamelCase). That helps screen readers separate the words correctly (#SuperBowl, not #SuperbOwl) and say them as words rather than as separate letters.
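The CamelCase convention is easy to automate. Here's a tiny illustrative helper (the function name is my own, not from any accessibility library):

```python
# Hypothetical helper: build a screen-reader-friendly CamelCase hashtag
# from a list of words, so assistive tech reads each word separately.
def camelcase_hashtag(words):
    return "#" + "".join(word.capitalize() for word in words)

camelcase_hashtag(["super", "bowl"])         # "#SuperBowl"
camelcase_hashtag(["whiteboard", "friday"])  # "#WhiteboardFriday"
```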
As you start applying these best practices, remember that building accessible, inclusive, and usable content takes dedication and continuous improvement. Ask your site visitors and customers for feedback on a regular basis. Pick the right tools to make your efforts more sustainable and effective.
SEOs have powerful metrics at their disposal to measure the success of their strategies, such as Domain Authority (DA) and Page Authority (PA). But how best to use them? In today's Whiteboard Friday, Tom shows you how to think about these metrics as part of a holistic approach to your link building analysis.
Click on the whiteboard image above to open a high resolution version in a new tab!
Video Transcription
Happy Friday, Moz fans, and today's Whiteboard Friday is about measuring link building. So obviously this is a very big and very old topic in the SEO space, and it's one that Moz, as a company, is heavily invested in, right? Like Domain Authority and Page Authority are two very popular products of ours, which are commonly used for this exact purpose.
Now this isn't going to be advertorial, though. I could stand here and just say obviously these are the best metrics in the world and that kind of thing. That's not what I'm here to do. I'm here to give you a bit of nuance about how and when to use these metrics and how to think about them, and how to use them alongside other metrics as well, rather than just having one tool and saying it's a solution to all problems, which isn't necessarily fair.
Google's PageRank
So to do that, I'm actually going to start by going right back to 1998 and Google's PageRank model. Now I know that a lot has changed since 1998, both with the world and with Google. But this was Google's original way of thinking about links, and in a lot of ways it's still the best that we have to go on. A lot of current SEO best practices and dogma are still based on this original understanding, except there are a few things we've sort of picked up along the way that don't really have a basis in anything that Google has said or done, which is part of why I want to sort of point them out.
So PageRank originally was a way of using links to estimate the probability that a user is on a page, and that's already quite interesting, because that shows that this is a model that is about popularity. So when we talk about this now, we often talk about things like trust and authority and this kind of thing. I'm sure those are relevant, but it's worth remembering that originally this was just a way of estimating effectively the popularity of a page.
Note that I said of the page as well, not even the domain. So imagine a world where there's one page on the internet, which is Page A that I've labeled here. Now if there's one page on the internet, it's not that hard to estimate the chance that a random browser is on that page. It's a certainty they're on that page. If we introduce a second page, it's still not that hard, and we just assume it's going to be 50-50 and so on and so forth.
Link probability
That's sort of the baseline probability that we have to work with. But then we can add a bit of spice to the situation when one page links to another, and that's obviously what we're actually interested in. So suppose A links to this second page (at the moment there are still only two pages on the internet; ignore these other boxes, they'll come in later).
We say that 0.85 times this probability is passed on. Now 0.85 is a fairly arbitrary sort of constant. It's one that comes from an old Google document. It probably isn't that exact value, but it's fine for illustrative purposes, and it's the best we've got to go on.
So, in this case, why have we said 0.85 by the way? Why haven't we said that all of the users on this page click through? Well, that's because we assume that some of them are going to go and do their own thing, stop browsing the internet, do something else. It turns out that this damping factor is quite important in a world where pages do actually link to each other in a big web rather than just one link in one direction.
So that's all well and good, right? What if we had a second link and introduced a third page to the internet? So this is still a very simplistic model. We've got an internet with three pages and two links, and the links only go in one direction.
This is very, very simple. But in this case we say we can't have both of these pages getting the full probability. No, the users aren't clicking through to both. They're clicking through to one of them. So that gets half of 0.85A. But then this one does too.
Again, in a more complex model, we might say, oh, one of these links is more likely to be clicked on, so it gets more probability or something like that. But in this simple version, we're saying it's split two ways. Now, in this case, we've already learned something interesting again, because by adding another link we've reduced the value of the existing links and that's something that we hardly ever think about in a link building context.
But that is sort of what we're thinking about when in technical SEO conversations we talk about not having too many links in the top nav and this kind of thing. We're trying to focus our strength where we most want it. Then, lastly, I promise the [indecipherable] will stop soon. Lastly, what if we had another jump in this system? Well, in this case, this 0.85, this damping happens again.
So 0.85 times 0.85 is about 0.72, so it's less. So basically it's 0.85 times this page above it, and so it's gotten even lower. This is why, as technical SEOs, sometimes we get caught up with things like chain redirects and this kind of thing, why we think that's important.
That's where that sort of dogma comes from. So I'm not going to go any further with this sort of simplified PageRank explanation. What I am trying to draw to your attention here is a few things. One is that there's a lot about the specifics of a page here that affects the value of these links, like the number of links that the page sent outwards and also things like what linked to the specific page.
Note that I didn't say anything about domains here. This could be on four different domains. It could be on one domain. We only talked about page specifics here. Google has been a little bit ambiguous over time in terms of how they think about pages versus domains. But broadly speaking, they say they care about pages, not domains. So that's interesting, right, because these could all be on the same domain conceivably and yet this page could potentially be a lot weaker and pass on a lot less strength than this one.
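The arithmetic in this simplified model can be sketched in a few lines of code. To be clear, this is only an illustration of the model described above, not Google's actual algorithm, and the function name is mine:

```python
# Illustrative sketch of the simplified PageRank damping described above.
# The 0.85 damping constant comes from Google's original PageRank
# description and is used here purely for demonstration.
def passed_probability(start_prob, outlinks_per_hop, damping=0.85):
    """Probability passed along a chain of links.

    outlinks_per_hop: for each hop in the chain, how many outgoing
    links the linking page has (probability splits evenly across them).
    """
    prob = start_prob
    for n_outlinks in outlinks_per_hop:
        prob = damping * prob / n_outlinks
    return prob

# One link from a page with a single outlink passes on 0.85.
passed_probability(1.0, [1])      # 0.85
# Two outlinks split the damped probability: each link passes 0.425.
passed_probability(1.0, [2])      # 0.425
# A two-hop chain damps twice: 0.85 * 0.85 = 0.7225, the ~0.72 in the video.
passed_probability(1.0, [1, 1])   # 0.7225
```

Two things fall out of this sketch that match the points above: adding a second outlink halves what each existing link passes on, and every extra hop in a chain multiplies in another 0.85.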
Metrics for link building
So that's interesting, and that's something we don't normally think about with link building. So if we bring this back on topic to what I said I was going to talk about, actual metrics for link building, there are a few qualities that we're looking for.
Fast
Now what I haven't just talked about is these first two. We do want metrics that are fast. We want it to be available as quickly as possible so we can report to our client or our boss or that kind of thing and also just we're busy people. We don't want to waste our time.
Ubiquitous
We want metrics that are ubiquitous, so when I do say to my boss, "Oh, I've got you a link which had DA 90," there's a good chance that he or she or they know what that means. Whereas if I say it had a Tom Capper score of 38B, they're going to say, "What are you talking about?" So I do need to use a metric that's reasonably well understood.
Page & link specifics
But then there's this page and link level specifics that I just talked about. So if I think about a metric like Domain Authority, it does very well on these first two and it does okay on this third one, because it is trained on rankings to some degree, which is some of what this is determining.
So there's some benefit there. It does take into account some of this stuff, but ultimately it's a domain level metric. So it has to treat all the pages on one domain equally by definition. That produces some pros and cons.
Using metrics together
So what I want to do is I want to put some metrics on a chart like this and suggest how you might use them alongside each other.
So I've got actual as the vertical axis here. So the closer it is to what we're actually trying to measure, which is Google's view of the value of the link basically, the further up it's going to be. But then I've also got this fast/slow sort of convenience metric. So a metric like Domain Authority is probably somewhere here. It's very fast.
It's very ubiquitous. But it's missing some of this nuance because it's a domain level metric and it's answering a slightly different question. DA is designed to answer the question, "How likely is a page on this domain, all things being equal, to rank well?" That's a slightly different question to how valuable is the link. But if I'm saying, oh, I want DA, but not necessarily domain level, you might say, "Oh, well, Moz has a metric for that, and it's called Page Authority."
Well, yeah, that is a good candidate. So like most page level metrics in the industry, including Google's and including our own, Page Authority is initially informed by some domain level factors as well as page level factors. We've done correlation studies and this kind of thing.
It is a lot closer to measuring the value and ranking potential of a specific page than the Domain Authority is, as you would expect, because it's a more precise metric and it is capturing some of this nuance. But actually you can go a step further with this as well. Now Page Authority is a bit slower than Domain Authority because you have to wait for Moz to discover and crawl the page.
We do our best, but it's not instant. However, if you're willing to wait even longer than that, you could use a metric like referral traffic. Apologies for my absolutely awful writing there.
So with referral traffic, what we're interested in is how many people actually click through from the link that I built to my site. That's interesting because that's what Google was actually trying to measure in the first place. So if we can measure that, then we're getting pretty close to whatever they were aiming for.
So whatever sophistication they've built in, we're sort of capturing that nuance. Now that has some obvious drawbacks. One is that a lot of link building campaigns don't do very well on this metric, and you can draw your own conclusions about that. The other is that you're obviously going to have to wait quite some time for this data to become available, and even then there might be issues with the client's analytics or this kind of thing. Anyway, that's what I wanted to share with you today.
Essentially what I would suggest is that you use all of these metrics and some others that you could put yourself on this chart. So I'm interested to hear what metrics you would use and where you would draw them on this kind of a chart. I put these green lines in as sort of a guide because I think you could do prospecting in this first section, like before you've even built the link, and then initial reporting to the client.
Then this section would more be after the campaign, when you want to learn from it and think about what kind of links you would build in the future and whether you would do the same sort of thing again. But yeah, I'd love to hear your ideas. Thank you very much and Happy Friday.
Being an SEO, you can’t go a day without hearing about links: “Links are crucial!” or “Prioritize links!” or “Links are the nourishing lifeblood of the almighty algorithm!”
But for those of us who’ve taken the next step to actually figure out how to earn said links, we realize it’s not that straightforward.
It’s hard to sum up all of that in the tweets and LinkedIn posts that get shared about link earning, because the truth is, there’s no such thing as a one-size-fits-all link earning approach or vendor. You need a link earning “stack” that appropriately reflects the complexity of your marketing and content goals.
In an ideal world, here’s what that stack would look like. (Hat tip to Paul Zalewski, SVP of Marketing at Verblio, who gave me the idea for this breakdown!)
Passive link earning
Objective: Set up a foundation for link earning with lower effort over time
Content needed: A “linkable content” asset (this is key)
Promotion needed: Manual outreach and promotion to acquire the first one or two links, to help the page initially rank
Passive link earning is any content you can create that will naturally earn links over time without having to actively promote it on an ongoing basis. These pieces are designed to carry their weight in earning links without much active promotion (which is what separates them from the next category, targeted link earning). They’re often not directly tied to your product or service, though if they are, that’s certainly a bonus.
When SEO teams can collaborate with content teams on creating “linkable content,” passive link earning magic can happen.
Some common examples of linkable content include:
Statistics-based or definition-focused posts: People are always looking for stats to cite or definitions to link to.
Tools or other interactive resources: If they’re useful, folks love to share them!
“Best”/”Top” posts or annual reports: People like to reference lists they’re on or share lists that are interesting, valuable compilations.
Andy Crestodina at Orbit Media Studios is excellent at this strategy.
This post presents new data points around blogging, and it’s earned 2,761 linking domains! How? Because there are so many blog posts about blogging (meta) that want to include statistics relevant to the point they’re trying to make — and Andy is providing them.
Here’s an example of an Alexa blog post citing a data point from the Orbit Media piece:
If you want to dive into this link-earning type more, I recommend reaching out to either Andy or Alex Heinz, who I just saw give an awesome presentation on this very topic.
Targeted link earning
Objective: Help elevate a particular, valuable post in the SERPs
Content needed: N/A — the post you want to boost already exists
Promotion needed: Manual outreach and promotion to more niche sites
You write a piece of content and you know it’s killer, but you want an initial boost to elevate it on page one.
You probably want to build credibility to that page by earning a link or two that will help demonstrate its value.
This is targeted (sometimes called post-specific) link earning. It usually involves highly tailored outreach in which you pitch sites to link back to your post. While it’s often a pretty manual effort, even a couple of links can make a huge difference.
I’m going to use Golden Thread Tarot as an example. I do not work with or for them, so I can’t confirm they did any manual outreach. But this example still illustrates my point. Also, tarot reading is my newest obsession.
This is a great example of a page that should be prioritized for targeted link earning so they can continue to maintain their positions. Why? Because people searching to learn more about reading tarot are the perfect potential customers for their tarot deck and app. Notice how the page has plenty of calls-to-action.
Obviously ranking for “money pages” won’t always be easy, especially for higher-volume terms with greater competition. But if you’re truly creating the best content in response to the searcher’s intent, a couple of links can give you a boost.
There is some overlap between targeted link earning and passive link earning, but the primary difference is that in the latter, you’re designing a piece of content from scratch with the express purpose of attracting links over time, while the former is link outreach you conduct in order to boost an existing page that’s important to your audience acquisition and conversions.
Site authority link earning
Objective: Improve the overall authority of your site/brand
Content Needed: Original reporting and/or newsworthy data
Promotion Needed: Pitching journalists or setting up syndication relationships
A rising tide lifts all boats. In this case, the boats are your specific pages of content, and the rising tide is your site’s domain authority.
If your site and brand are deemed authoritative, it increases the chances that your individual pieces of content will be considered authoritative, as well.
I’ll use one of our brand partners, Sidecar Health, as an example. We’ve been creating newsworthy stories on their behalf and distributing them through the Stacker newswire since December, and Sidecar Health’s domain authority has increased by four points in a few months.
The strategy is to create top-of-the-funnel, newsworthy content that authoritative news publications would be interested in running. That way we can earn links/canonicals that demonstrate that the brand is producing valuable content.
When you earn this type of link equity to your site, you can then leverage internal linking to distribute that equity to pages that are important to your SEO goals. In using this strategy for just a few months, Sidecar Health has seen a 77% increase in keywords in positions 1-3 in the SERPs.
This type of link earning is often the missing piece for brands who have found that their on-site content is top-notch, but their traffic is plateauing anyway.
Niche link earning
Objective: Earn more relevant authority in your specific niche
Content Needed: Original reporting and/or newsworthy data
Promotion Needed: Pitching journalists or setting up syndication relationships
You can view niche link earning as a spinoff of site authority link earning. Niche link earning simply targets sites with lower DA but higher relevance.
I’ve been talking a lot about authority in this piece, and for good reason: it’s one of the most important aspects and benefits of earning links.
But relevance is a piece of the puzzle as well. You don’t want to be earning links that have nothing to do with your brand, even if you’re taking a tangential approach mentioned in the previous section.
It’s good to supplement your general link earning strategy by ensuring you’re earning links from more relevant sites that are specifically aligned with your brand offering.
To continue the Sidecar Health example, in addition to the links we’ve earned for them, their backlink profile includes links from sites like Verywell Health and Healthline as well as even more niche sites like Health Care Business Today and Electronic Health Reporter.
If you’re finding that you’re not naturally earning links from respected sites in your industry, it’s certainly worth trying to pitch them on your content or build syndication partnerships with them.
A well-rounded link earning strategy
Any one of these link-earning strategies can help move the needle for your organic growth, especially if you’re just getting started.
However, as you mature your program, you’ll need all of these strategies in order to grow sustainably and consistently. The trick becomes understanding how to implement these strategies, whether in-house or outsourced. A common approach I see is having an internal team that focuses on targeted and niche link earning while hiring outside help for passive and site authority link earning.
Whatever you try, remember to consistently check that you have your bases covered so you achieve the organic traffic growth you’re aiming for.
Over the next six months, Google is going to employ machine learning and AI to alter the hours of operation on twenty million Google Business Profiles as part of their project to create a “self-updating map”. Some experts estimate that this is roughly one-sixth to one-fifth of all GBP listings, meaning the chances are strong that you or one or more of your clients could experience these edits.
Google has good reason for pursuing accuracy in their local index, but local business owners have even better reason to be on top of this announcement and proactively safeguard the validity of their own data. Today, we’ll show you what to do to take charge on these vital listings which, while they belong to Google, represent your business.
Why is this happening and is this new?
Google is right in observing that the chaos of COVID-19 has affected the accuracy of their local business index. Updating GBP hours to reflect changes may not be at the top of the to-do lists of business owners struggling with so many challenges.
However, Google’s description of how they plan to alter business hours is raising some alarm, due to the peculiarity of their disclosed methods. Some processes are sound. For example, Google mentions use of Duplex to actually phone business owners directly to ask what their current open hours are, which makes excellent horse sense. Additionally, asking Local Guides to validate this information could also help if an owner is unreachable, for some reason, and the guides being tapped are civic-minded instead of just playing for points. All fine and good.
Where we get into murkier waters is in Google saying they will use the hours of other related local businesses to “predict” what the hours should be for the business you are marketing. The example they use is determining what the hours of Liam’s Lemonade Shop should be by looking at the hours of other nearby lemonade shops. In other words, if Larry’s Lemonade Emporium is open from 9-5, Google assumes that Liam’s Lemonade Shop should be, too, which will come as a surprise to him if he runs a late-night citrus spot. I’m not the only one finding Google’s logic less than exemplary on this.
Another process Google mentions is that of deriving information from Street View, which I am dubious about, given that many places I visit via this service have not been updated in more than a year, and in some cases, in more than a decade:
If Google’s thinking is that harried business owners have not had the free time necessary to keep their hours updated throughout the past two years, then trying to glean this information from random snapshots in time of whenever a Google vehicle last passed through town seems like a rather fuzzy solution. The hours of your business in 2022 may be quite different from what they were a year ago, or five years ago.
If some of Google’s ways and means accompanying this big announcement have a familiar ring to them, it’s because what they are describing is not, in fact, totally new. Since the beginning of local search history, Google has crowdsourced information and implemented it in their listings, and all of that time, local SEOs and local business owners have been suggesting that this is not a good substitute for getting information directly from the companies Google is representing and monetizing via their system. We basically have to view this development from Google as an acknowledgement of three things:
Your listings belong to Google.
Google has never reached the level of direct local business owner engagement they actually need to maintain the quality of their index.
In the absence of this, they substitute crowdsourcing and technology in hopes of achieving enough accuracy to maintain a certain degree of public trust necessary to be able to keep monetizing SERPs and having them seen and used.
So, take a deep breath. This is the Google we already know, putting a high tech spin on a historic communications failure, but don’t overlook this announcement. It’s a strong message from the search engine that you have to stay on top of your own listings if you don’t want Google to completely take over and edit your data based on random information. Fortunately, there are specific things you can do to take charge!
How to proactively protect your GBP hours
Here’s a short list of your five best options for signaling to Google that, yes, you are staying on top of your own hours and don’t require assistance.
Be sure the hours of operation on your website are accurate. Google says this is one of the places they investigate.
Secondly, sorry for the pain in the neck, but if you manage your listings manually, you now need to regularly check all of them to see if Google has altered their hours. I’d recommend checking at least every month (as if you don’t already have enough to do). Moz Local customers have a much easier option. Just check the Profile Suggestions section of your dashboard to see, at a glance, whether Google or anyone else is trying to edit your hours, even if you have hundreds of listings. Being alerted when data changes are suggested should provide so much peace of mind, and you can accept or reject edit suggestions. Whew!
Thirdly, take some time this week to edit your hours, even if the edit is small. For example, you could go into your listings today to set special hours for the winter holidays in advance, proving to Google that you are well aware of your own schedule and that your hours of operation are not neglected.
Remember our recent discussion of the QRG and how Google employs human quality raters to get a sense of your business from what others are saying about it? Be sure all of your local business listings across your local search ecosystem are up-to-date with correct hours (another thing Moz Local makes so much easier!) so that quality raters aren’t encountering complaints from customers who came to your business and found it closed when it was listed online as being open for business.
Finally, for brick-and-mortar brands, do step outside today and be sure the hours displayed on your windows, doors, and street level signage are accurate, just in case a Google Maps Car or a local guide is heading your way.
Google is telling us, yet again, that local business listings aren’t a set-and-forget asset
A local SEO myth that I see surfacing frequently is that you can build out your listings and then forget about them. This is simply not true! Ongoing, active management of all of your listings has always been essential for three core reasons:
Incorrect information on neglected listings has been proven to lead to negative reviews from inconvenienced customers, and negative reviews undermine conversions/transactions. If your real-world hours change and you don’t update your online listings across the local search ecosystem, customers will complain in reviews and the low-star rating they assign you will influence the impressions and actions of other potential customers. Meanwhile, remember that wrong hours in one place can then be distributed to multiple local business listing platforms and apps in the absence of active management. Customer care is the number one reason why you can’t neglect your listings.
It isn’t just Google that can decide it knows better than you about key fields of your listing. Any member of the public, including competitors and spammers, can suggest edits to your profiles that you will be unaware of if you are not paying attention.
It’s an outdated perspective to view local business listings as static entities. Year-over-year, Google Business Profiles, in particular, are becoming increasingly interactive and transactional. Competitive local businesses must have a solid strategy for continuous management of photos, reviews, Q&A, messaging, bookings, shopping and more. Far from being a one-and-done scenario, listings management is central to local business operations.
Given that Google shows no signs of ceding total control of listings to business owners, your best strategy is to take as much charge as you can and be as proactive as possible in publishing dynamic information to your listings. With Google’s latest announcement fresh in all our minds, today might be a good day to check out Moz Local to simplify your local to-do list.
If you're familiar with the blue ocean marketing strategy, you know that SEO is inherently a "red ocean" industry. With fierce competition to attain rankings, links, and authority, SEOs are constantly trying to one-up their competitors. In this environment, is there any hope for creating a "blue ocean" — innovating to avoid the choppy waters of the current market?
In today's Whiteboard Friday, PJ Howland of Leaders.com suggests that a blue ocean SEO strategy is achievable through creating realistic content for your customers. Watch to learn more!
Click on the whiteboard image above to open a high resolution version in a new tab!
Video Transcription
Hey, Moz fans. Welcome back to another Whiteboard Friday. I am PJ Howland. I am the Head of SEO and Evergreen Content at leaders.com. So my background is kind of that sweet spot right in between if you've got SEO over here and content and editorial over here, that middle ground where SEO and editorial and content collide.
If you're familiar with that world, you know that we're dealing with thinner margins, we have fewer resources, and we have less clear direction. So this is the world we operate in. It's a place where we're trying to just get a little bit of an advantage over our competitors, and it's kind of created this fixed mindset of just trying to scrape by to just get a little bit of an advantage.
However, I think that there's a mindset that can really help break this mold to make sure that every single month is actually your best month ever. So it's the blue ocean SEO strategy.
What is blue ocean strategy?
So what is blue ocean SEO strategy? Well, start with what blue ocean strategy is.
Maybe you've read the book "Blue Ocean Strategy." Maybe you've heard the term thrown around. Maybe you haven't heard of it at all. Give me just one minute and I'll kind of explain this with a story.
So back in the days when circuses were traveling the countryside, there were two predominant circuses, Ringling Brothers and Barnum & Bailey, and they would spend lots of time, energy, and resources trying to one up each other.
They were competing over who could have crazier stunts, weirder clowns, more eccentric animal tricks, or whatever. It got to the point where they were competing so hard with each other that they forgot what their customers really cared about. The sentiments of the day were that, hey, we don't want to see multiple rings. We don't like looking everywhere. Put it back down to one ring. We don't like the animal tricks. They're cruel. They're outdated. But these circuses were so fixed on competing with each other that both of them ended up losing out.
Then who should come along? This new player, Cirque du Soleil. Now, if you've been to a Cirque du Soleil show, you probably know where I'm going with this. It's unlike a traditional circus. In fact, it's unlike anything else out there. But what they did do is they took what worked and excluded what didn't work from a traditional circus. So they cut out the cruel animal acts. They also kept in the clowns and the acrobats, because those were working in the traditional circus, and they brought in this kind of third heat of this theater element, this fine theater experience.
If you've been to a Cirque show, you know what I'm talking about. There's definitely a narrative there. So what they've done is they've created a blue ocean. They've created a world where there really is no direct apples-to-apples competitors.
Now, Cirque du Soleil isn't the only player out there who's done this. We could talk about iTunes. What were your options before iTunes? Netflix, same thing. Before Netflix was on the scene, where were you watching your stuff? Airbnb. You see where I'm going with this: when all three of these businesses or products launched, there really wasn't anything like them out there. It was a true blue ocean.
Blue ocean SEO
So how do we get that from an SEO perspective?
Well, we need to start by realizing where SEO is right now. So SEO is a fixed mindset game. Inherently it is a red ocean strategy. If you don't believe me, let me illustrate it this way. You have this situation where you've got a keyword.
You look at your competitor and you see, hey, they've got 100 links. You say, "Big deal. I'll get 200 links." Their site speed score is 80? I'm going to make it 90. They write 1,500 words? You see where I'm going with this. We write 2,500 words. That's not blue ocean thinking. That is fixed, red ocean, fierce competition mindset thinking, but it's where a lot of SEOs find themselves. Heck, I've been there. That is essentially skyscraper content.
It's skyscraper article building. Hey, there's been a place for that, but at its core it really is red ocean, fixed mindset thinking. So how do we get over that? How do we as SEOs, digital marketers, content marketers, how do we get over the fact that we're playing a red ocean game?
Start with the customer
Well, it starts with the customer.
So picture this. Picture you're sitting down with a customer, except you're not a marketer anymore. You maybe work on the fulfillment or service or product side. You sit down with them and you say, "Hey, we are enhancing our product, our service, our fulfillment. What is going to make it a better solution for you?" Just picture a great conversation where they're going back and forth with you.
You collect valuable insight, and at the end, you produce a better product, and that customer is grateful for it. Now picture that same customer, except now you're not on the product side anymore, you're a marketer. You're talking about your SEO strategy and you say, "Hey, so what we do is we look at the competitors and we add more words. We see the links that they build and we add more links. What do you think of that? How does that benefit you?"
The customer is going to look at you and say, "That doesn't help me at all. That's not what I asked for. That's not what I want. That's not how business works. You're here to make my world easier." Truth be told, when we add more words to a page, sometimes we're just putting more words out there. Do you know what I mean? It's something that the customers don't want to deal with.
Set your writers up for success
So we live in this reality, though, where as SEOs, as editorial professionals, as content marketers, we're faced with an almost impossible task. How do we deliver exceptional content when the people producing it, maybe it's an SEO, maybe it's a writer, aren't necessarily the subject matter experts?
So it starts with what you give the writer or the SEO to actually begin the project. I have found so many people start by just saying, "Hey, here's the keyword. Go for it. We're trying to rank for this. Just get that piece of content out." Inevitably, you will produce a skyscraper piece of content with that. That sets everyone up for failure.
Nobody wins when all you do is start off with a keyword. Instead, what I have found is it really takes detailed write-ups that begin with interviews with subject matter experts. Of the outlines that I produce with my team, I've never seen one under two pages. Most of them are between three and five pages for these individual topics. That all has to come from your own insight, working with your team. I can't really tell you what your customer wants. That's something I invite you to figure out on your own. However, I have found that no matter what industry you're in, I don't care if it's B2B or B2C, I don't care if you're selling SaaS or if you've got an e-commerce platform, whatever, the person that says, "Yes, I want to buy that," is always a person.
People want reality
Ten times out of ten you're dealing with people. So what do people want? People want reality. They want authenticity. So there are three things that I think can help anyone, regardless of the industry that you're in, boost your content. When I'm talking about content, that could be articles. It could be blog posts. It could be landing pages. It could be white papers, whatever. The point is that people want reality.
1. Stories
So here's how I've been able to find really good delivery methods for this. Stories. My world is a lot more article focused, but we try to start every article with a story. People love that.
I find a much higher time on page when we lead off with stories. I know what you're thinking. It's like, "Well, Wikipedia is someone we're competing against, and they don't start with a story. Nobody starts with a story." Okay, decide if you want to be a red ocean player or a blue ocean player.
2. Advice
The next thing is advice. Now, again, going back to the reality of it is that we're dealing with situations where we've got writers who aren't subject matter experts.
That's okay. Nobody is in trouble for that reality of the situation. But it does matter that you seek that expertise out. So schedule interviews with subject matter experts. Buy your team a book. Have a book club within your organization. Make sure that you're building expertise within your group.
3. Examples
Finally, examples. People want reality. They want authenticity. If you have customer data, if you have case studies, I don't think you should be gating all of that necessarily. There's a great use case to be made for just letting that stuff out there. People want to see real examples happening in the wild.
How to measure success
So you come up with this blue ocean strategy that only you can produce, because you should know your customer better than me, and you want to figure out how to measure it.
Well, if you've been doing content marketing for more than, I don't know, the last 30 minutes, you know that there's a big gap between the first piece of content produced and the dollars in the pocket of the organization. There's a lot that happens in between there. So how do you actually make sure that you can come through on this with proper measurement? I have found that time on page is the first thing that goes up.
So before we rolled this out with leaders.com, we were seeing average time on page of between 3.5 and 4 minutes. Not good, not bad. Fine, whatever. But after rolling this out, we're seeing average time on page of between 5.5 and 6.5 minutes. I have pages that are almost 3,000 words in length that actually get read.
We have articles that have average time on page of eight minutes or higher, and that's a really validating thing. I believe that after time on page goes up, then you can start seeing your rankings and traffic go up. Now, there's no tinfoil hat here. I'm not insinuating a direct causal link between time on page and the rankings that Google is going to hand out to you.
What I am saying is that good content gets noticed, and when it gets noticed, you're going to see those links come in. You see how this works. We're working backwards. A lot of the time we lead off by saying, "I want more links." But in reality, is that really going to serve us? I would rather play a game where links come as a byproduct of good content. So what's really the takeaway here?
I think it's a mindset takeaway. In my organization, we've buried skyscraper content. It's dead. It's in the ground. Now, that doesn't mean we don't look at competitors. Obviously, it would be silly not to. We're in SEO.
However, I think that if you want to really transition to a blue ocean, limitless mindset, there has to be some kind of reality check where you say, "Hey, at some point I'm not going to be listening to the competitors for every little word that I write on my web page." Thanks for watching the video today. I look forward to hearing your comments. For anyone who's made this type of shift in their organization, I look forward to engaging with you in a good discussion in the comments below.
Thanks for stopping by today. See you on the next Whiteboard Friday.