Wednesday, December 4, 2019

Convince Your Boss to Send You to MozCon 2020 (Plus Bonus Letter Template!)

Posted by cheryldraper

It’s that time of year again. Professional development budgets are being distributed and you're daydreaming of Roger hugs and fist bumps. Well, this is a call to arms! It's time to get down to business and convince your boss that you HAVE to go to MozCon 2020.

You're already well acquainted with the benefits of MozCon. Maybe you're a MozCon alumnus. You may have lurked the hashtag once or twice for inside tips and you’ve likely followed the work of some of the speakers for a while. But how are you going to relay that to your boss in a way that sells? Don’t worry, we’ve got a plan.

(And if you want to skip ahead to the letter template, here it is!)

Copy the template

Step #1 - Gather evidence

Alright, so just going in and saying “Rand Fishkin is brilliant and have you seen any of Britney Muller’s Whiteboard Fridays lately?!” probably won’t do the trick — we need some cold hard facts that you can present.

MozCon delivers actionable insights

It’s easy to say that MozCon provides actionable insights, but how do you prove it? A quick scroll through our Facebook Group can prove to anyone that not only is MozCon a gathering of the greatest minds in search, but it also acts as an incubator and facilitator for SEO strategies.

If you can’t get your boss on Facebook, just direct them to the blog post written by Croud: Four things I changed immediately after MozCon. Talk about actionable! A quick Google (or LinkedIn) search will return dozens of similar recaps. Gather a few of these to have in your toolbelt just in case.

Or, if you have the time, pick out some of the event tweets from previous years that relate most to your company. The MozCon hashtag (#MozCon) has plenty of tweets to choose from — things like research findings, workflows, and useful tools are all covered. Some of our favorites from last year are listed below.

Attendees are often given access to exclusive tools and betas by the speakers, and that is something you don’t want to miss!

The networking is unbeatable

The potential knowledge gain doesn’t end with keynote speeches. Many of our speakers stick around for the entire conference and host niche- and vertical-specific Birds of a Feather tables over lunch, in addition to attending the networking events. If you find yourself with questions about their strategies, you'll often have the ability to ask them directly.

But the speakers aren’t the only folks worth networking with. We hand-select industry vendors to attend the conference and showcase their products. These vendors are also available for training and showcases throughout the entire conference.

Lastly, your peers! There's no better way to learn than from those who overcome the same obstacles as you. Opportunities for collaboration and peer-to-peer learning are often invaluable (especially those that happen over yummy snacks) and can lead to better workflows, new business, and even exciting partnerships.

Step #2 - Break down the costs

This is where the majority of the conversation will be focused, but fear not, Roger has already done most of the heavy lifting. So let’s cut to the chase. The goal of MozCon isn’t to make money — the goal is to break even and lift up our friends in search.

Top-of-the-line speakers

Every year we work with our speakers to bring cutting-edge content to the stage. You can be sure that the content you’ll be exposed to will set you up for a year of success.

Videos for everyone

While your coworkers won’t be able to enjoy Top Pot doughnuts or KuKuRuZa popcorn, they will be able to see all of the talks via professional video and audio. Your ticket to MozCon includes a professional video package which allows you (and your whole team) to watch every single talk post-conference, for free. (It's a $350 value for the videos alone!)

Good eats

MozCon doesn’t do anything halfway. We strive to go above and beyond in everything we do, and the food options are no exception. MozCon works with local vendors to ensure there are tasty, sustainable meals for everyone, including those with special diets. From breakfast to lunch and all the snacks in-between, MozCon has you covered (and saves your T&E budget a few bucks, as well).

Swag

Not to brag, but our swag is pretty great. Everyone gets their very own special MozCon memorabilia, in addition to other useful and fun items that vary from year to year. Previous gifts include “conference health” fanny packs (complete with Emergen-C!), Moleskine notebooks, reusable water bottles, and phone chargers.

Discounts

This is probably the detail that'll make your boss's ears perk up. There are indeed discounts available for MozCon tickets! If you're buying now through January 31, 2020, Early Bird pricing is in effect, which saves you a cool $200 off regular ticket costs. If you've got a team interested in attending, we offer group discounts for parties of 5+ as well. And my final top-secret tip: if your company already subscribes to a Moz product, you can save even more — up to $700 off per regular-priced ticket if you snag Early Bird pricing, or $500 off after January 31st. That's a real chunk of change!

Step #3 - Be prepared to prove value

It’s important to go into the conference with a plan to bring back value. It’s easy to come to any conference and just enjoy the food and company (especially this one), but it’s harder to take the information gained and implement change.

Make a plan

Before approaching your boss, make sure you have a plan on how you're going to show off all of the insights you gather at MozCon! Obviously, you'll be taking notes — whether it’s to the tune of live tweets, bullet journals, or doodles, those notes are most valuable when they're backed up by action.

Putting it into action

Set expectations with your boss. "After each day, I'll select three takeaways and create a plan on how to execute them." Who could turn down nine potential business-changing strategies?!

And it really isn’t that hard! Especially not with the content that you'll have access to. At the close of each day, we recommend you look back over your notes and do a brain-dump. 

  • How did today's content relate to your business? 
  • Which sessions resonated and would bring the most value to your team? 
  • Which strategies can easily be executed? 
  • Which would make the biggest impact?

After you identify those strategies, create a plan of action that will get you on track for implementing change.

(Fun fact — if you're traveling, this can actually be done on the plane ride home!)

Client briefs

If you have clients on retainer, ongoing training for employees is something those clients should appreciate — it ensures you’re staying ahead of the game. Offer to not only debrief your in-house SEO team, but to also present to your clients. This sort of presentation is a value add that many clients don’t get and can set your business apart.

These presentations can be short blurbs at the beginning of a regular meeting or a chance to gather up all of your clients and enjoy a bit of networking and education.

Still not enough?

Give the boss a taste of MozCon by having them check out some videos from years past to see the caliber of our speakers. And if you want to break into the speaking circuit, you can also take your shot at securing a community speaker spot onstage. Most years, the call for community speakers opens up in early springtime — keep an eye on the Moz Blog for your chance to pitch!

Lastly, the reviews speak for themselves. MozCon is perfect for SEOs of any level and we even factor in time for you to get a little work done in-between sessions — Vaneese can tell you!

Our fingers are crossed!

Alright friend, now is your time to shine. We've equipped you with some super-persuasive tools and we'll be crossing our fingers that the boss gives you the "okay!" Be sure to grab the letter template and make your case the easy way:

Copy the template

If you can make it, we promise to spoil you to the tune of endless Starbucks coffee, tons of new friends, and an experience that will change your perspective on search. We hope to see your smiling face at MozCon 2020!



Monday, December 2, 2019

Simple Spam Fighting: The Easiest Local Rankings You’ll Ever Earn

Posted by MiriamEllis

Image credit: Visit Lakeland

Reporting fake and duplicate listings to Google sounds hard. Sometimes it can be. But very often, it’s as easy as falling off a log, takes only a modest session of spam fighting and can yield significant local ranking improvements.

If your local business/the local brands your agency markets aren’t using spam fighting as a ranking tactic because you feel you lack the time or skills, please sit down with me for a sec.

What if I told you I spent about an hour yesterday doing something that moved a Home Depot location up 3 spots in a competitive market in Google’s local rankings less than 24 hours later? What if, for you, moving up a spot or two would get you out of Google’s local finder limbo and into the actual local pack limelight?

Today I’m going to show you exactly what I did to fight spam, how fast and easy it was to sweep out junk listings, and how rewarding it can be to see results transform in favor of the legitimate businesses you market.

Washing up the shady world of window blinds

Image credit: Aqua Mechanical

Who knew that shopping for window coverings would lead me into a den of spammers throwing shade all over Google?

The story of Google My Business spam is now more than a decade in the making, with scandalous examples like fake listings for locksmiths and addiction treatment centers proving how unsafe and unacceptable local business platforms can become when left unguarded.

But even in non-YMYL industries, spam listings deceive the public, waste consumers’ time, inhibit legitimate businesses from being discovered, and erode trust in the spam-hosting platform. I saw all of this in action when I was shopping to replace some broken blinds in my home, and it was such a hassle trying to find an actual vendor amid the chaff of broken, duplicate, and lead gen listings that I decided to do something about it.

I selected an SF Bay area branch of Home Depot as my hypothetical “client.” I knew they had a legitimate location in the city of Vallejo, CA — a place I don’t live but sometimes travel to, thereby excluding the influence of proximity from my study. I knew that they were only earning an 8th place ranking in Google’s Local Finder, pushed down by spam. I wanted to see how quickly I could impact Home Depot’s surprisingly bad ranking.

I took the following steps, and encourage you to take them for any local business you’re marketing, too:

Step 1: Search

While located at the place of business you’re marketing, perform a Google search (or have your client perform it) for the keyword phrase for which you most desire improved local rankings. Of course, if you’re already ranking as well as you want to for the searchers nearest you, you can still follow this process for investigating somewhat more distant areas within your potential reach where you want to increase visibility.

In the results from your search, click on the “more businesses” link at the bottom of the local pack, and you’ll be taken to the interface commonly called the “Local Finder.”

The Local Finder isn’t typically 100% identical to the local pack in exact ranking order, but it’s the best place I know of to see how things stand beyond the first 3 results that make up Google’s local packs, telling a business which companies they need to surpass to move up towards local pack inclusion.

Step 2: Copy my spreadsheet

Find yourself in the local finder. In my case, the Home Depot location was at position 8. I hope you’re somewhere within the first set of 20 results Google typically gives, but if you’re not, keep paging through until you locate your listing. If you don’t find yourself at all, you may need to troubleshoot whether an eligibility issue, suspension, or filter is at play. But, hopefully that’s not you today.

Next, create a custom spreadsheet to record your findings. Or, much easier, just make a copy of mine!

Populate the spreadsheet by cutting and pasting the basic NAP (name, address, phone) for every competitor ranking above you, and include your own listing, too, of course! If you work for an agency, you’ll need to get the client to help you with this step by filling the spreadsheet out based on their search from their place of business.

In my case, I recorded everything in the first 20 results of the Local Finder, because I saw spam both above and below my “client,” and wanted to see the total movement resulting from my work in that result set.
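If you'd rather script the skeleton of that spreadsheet than copy one, here's a minimal sketch in Python. The column names are my assumption, mirroring the NAP fields plus the Notes, Action, and Result columns referenced throughout this post:

```python
import csv

# Hypothetical column layout for the Local Finder audit described in this post:
# NAP data plus Notes, Action, and Result columns for each listing you record.
COLUMNS = ["Position", "Business name", "Address", "Phone", "Notes", "Action", "Result"]

# Example row -- replace with the listings ranking above you (and your own listing).
listings = [
    {"Position": 1, "Business name": "Example Blinds Co", "Address": "123 Main St, Vallejo, CA",
     "Phone": "(555) 555-0100", "Notes": "", "Action": "", "Result": ""},
]

with open("local_finder_audit.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(listings)
```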

Step 3: Identify obvious spam

We want to catch the easy fish today. You can go down rabbit holes another day, trying to ferret out weirdly woven webs of lead gen sites spanning the nation, but today, we’re just looking to weed out listings that clearly, blatantly don’t belong in the Local Finder. 

Go through these five easy steps:

  1. Look at the Google Streetview image for each business outranking you.
    Do you see a business with signage that matches the name on the listing? Move on. But if you see a house, an empty parking lot, or Google is marking the listing as “location approximate”, jot that down in the Notes section of your spreadsheet. For example, I saw a supposed window coverings showroom that Streetview was locating in an empty lot on a military base. Big red flag there.
  2. Make note of any businesses that share an address, phone number, or very similar name.
    Make note of anything with an overly long name that seems more like a string of keywords than a brand. For example, a listing in my set was called: Custom Window Treatments in Fairfield, CA Hunter Douglas Dealer.
  3. For every business you noted down in steps one and two, get on the phone.
    Is the number a working number? If someone answers, do they answer with the name of the business? Note it down. Say, “Hi, where is your shop located?” If the answer is that it’s not a shop, it’s a mobile business, note that down. Finally, if anything seems off, check the Guidelines for representing your business on Google to see what’s allowed in the industry you’re investigating. For example, it’s perfectly okay for a window blinds dealer to operate out of their home, but if they’re operating out of 5 homes in the same city, it’s likely a violation. In my case, just a couple of minutes on the phone identified multiple listings with phone numbers that were no longer in service.
  4. Visit the iffy websites. 
    Now that you’re narrowing your spreadsheet down to a set of businesses that are either obviously legitimate or “iffy,” visit the websites of the iffy ones. Does the name on the listing match the name on the website? Does anything else look odd? Note it down.
  5. Highlight businesses that are clearly spammy.
    Your dive hasn’t been deep, but by now, it may have identified one or more listings that you strongly believe don’t belong because they have spammy names, fake addresses, or out-of-service phone numbers. My lightning-quick pass through my data set showed that six of the twenty listings were clearly junk. That’s 30% of Google’s info being worthless! I suggest marking these in red text in your spreadsheet to make the next step fast and easy.
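If your list is long, a rough first pass in code can help you decide which listings deserve the closer manual checks above. This is only a sketch of the red flags described in this step (keyword-stuffed names, shared phone numbers, and the notes from your Street View and phone checks), not a substitute for them; the field names match the hypothetical spreadsheet columns from earlier:

```python
from collections import Counter

def spam_flags(listing, all_listings):
    """Return a list of red flags for one Local Finder entry.

    `listing` is a dict with "name", "phone", and "notes" keys; this is a
    rough heuristic pass only, so verify every flag by hand before reporting.
    """
    flags = []
    name = listing["name"].lower()

    # Overly long, keyword-stuffed names, e.g. "Custom Window Treatments in Fairfield, CA ..."
    if len(name) > 60 or " in " in name:
        flags.append("possible keyword-stuffed name")

    # Phone numbers shared by more than one listing in the result set
    phone_counts = Counter(l["phone"] for l in all_listings if l["phone"])
    if listing["phone"] and phone_counts[listing["phone"]] > 1:
        flags.append("phone number shared with another listing")

    # Anything you jotted down during the Street View and phone checks
    notes = listing["notes"].lower()
    if "out of service" in notes:
        flags.append("phone out of service")
    if "empty lot" in notes or "location approximate" in notes:
        flags.append("no visible storefront")

    return flags
```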

Step 4: Report it!

If you want to become a spam-fighting ace later, you’ll need to become familiar with Google’s Business Redressal Complaint Form which gives you lots of room for sharing your documentation of why a listing should be removed. In fact, if an aggravating spammer remains in the Local Finder despite what we’re doing in this session, this form is where you’d head next for a more concerted effort.

But, today, I promised the easiness of falling off a log, so our first effort at impacting the results will simply focus on the “suggest an edit” function you’ll see on each listing you’re trying to get rid of. This is how you do it:

After you click the “suggest an edit” button on the listing, a popup will appear. If you’re reporting something like a spammy name, click the “change name or other details” option and fill out the form. If you’ve determined a listing represents a non-existent, closed, unreachable, or duplicate entity, choose the “remove this place” option and then select the dropdown entry that most closely matches the problem. You can add a screenshot or other image if you like, but in my quick pass through the data, I didn’t bother.

Record the exact action you took for each spam listing in the “Actions” column of the spreadsheet. In my case, I was reporting a mixture of non-existent buildings, out-of-service phone numbers, and one duplicate listing with a spammy name.

Finally, hit the “send” button and you’re done.

Step 5: Record the results

Within an hour of filing my reports with Google, I received an email like this for 5 of the 6 entries I had flagged:



The only entry I received no email for was the duplicate listing with the spammy name. But I didn’t let this worry me. I went about the rest of my day and checked back in the morning.

I’m not fond of calling out businesses in public. Sometimes, there are good folks who are honestly confused about what’s allowed and what isn’t. Also, I sometimes find screenshots of the local finder overwhelmingly cluttered and endlessly long to look at. Instead, I created a bare-bones representational schematic of the total outcome of my hour of spam-fighting work.

The red markers are legit businesses. The grey ones are spam. The green one is the Home Depot I was trying to positively impact. I attributed a letter of the alphabet to each listing, to better help me see how the order changed from day one to day two. The lines show the movement over the course of the 24 hours.

The results were that:

  • A stayed the same, and B and C swapping positions was unlikely to be the result of my work; local rankings can fluctuate like this from hour to hour.
  • Five out of six spam listings I reported disappeared. The keyword-stuffed duplicate listing which was initially at position K was replaced by the brand’s legitimate listing one spot lower than it had been.
  • The majority of the legitimate businesses enjoyed upward movement, with the exception of position I which went down, and M and R which disappeared. Perhaps new businesses moving into the Local Finder triggered a filter, or perhaps it was just the endless tide of position changes and they’ll be back tomorrow.
  • Seven new listings made it into the top 20. Unfortunately, at a glance, it looked to me like 3 of these new listings were new spam. Dang, Google!
  • Most rewardingly, my hypothetical client, Home Depot, moved up 3 spots. What a super easy win!

Fill out the final column in your spreadsheet with your results.

What we’ve learned

You battle upstream every day for your business or clients. You twist yourself like a paperclip complying with Google’s guidelines, seeking new link and unstructured citation opportunities, straining your brain to shake out new content, monitoring reviews like a chef trying to keep a cream sauce from separating. You do all this in the struggle for better, broader visibility, hoping that each effort will incrementally improve reputation, rankings, traffic, and conversions.

Catch your breath. Not everything in life has to be so hard. The river of work ahead is always wide, but don’t overlook the simplest stepping stones. Saunter past the spam listings without breaking a sweat and enjoy the easy upward progress!

I’d like to close today with three meditations:

1. Google is in over their heads with spam

Google is in over their heads with spam. My single local search for a single keyword phrase yielded 30% worthless data in their top local results. Google says they process 63,000 searches per second and that as much as 50% of mobile queries have a local intent. I don’t know any other way to look at Google than as having become an under-regulated public utility at this point.

Expert local SEOs can spot spam listings in query after query, industry after industry, but Google has yet to staff a workforce or design an algorithm sufficient to address bad data that has direct, real-world impacts on businesses and customers. I don’t know if they lack the skills or the will to take responsibility for this enormous problem they’ve created, but the problem is plain. Until Google steps up, my best advice is to do the smart and civic work of watchdogging the results that most affect the local community you serve. It’s a positive not just for your brand, but for every legitimate business and every neighbor near you.

2. You may get in over your head with spam

You may get in over your head with spam. Today’s session was as simple as possible, but GMB spam can stem from complex, global networks. The Home Depot location I randomly rewarded with a 3-place jump in Local Finder rankings clearly isn’t dedicating sufficient resources to spam fighting or they would’ve done this work themselves.

But the extent of spam is severe. If your market is one that’s heavily spammed, you can quickly become overwhelmed by the problem. In such cases, I recommend that you:

  • Read this excellent recent article by Jessie Low on the many forms spam can take, plus some great tips for more strenuous fighting than we’ve covered today.
  • Follow Joy Hawkins, Mike Blumenthal, and Jason Brown, all of whom publish ongoing information on this subject. If you wade into a spam network, I recommend reporting it to one or more of these experts on Twitter, and, if you wish to become a skilled spam fighter yourself, you will learn a lot from what these three have published.
  • If you don’t want to fight spam yourself, hire an agency that has the smarts to be offering this as a service.
  • You can also report listing spam to the Google My Business Community Forum, but it’s a crowded place and it can sometimes be hard to get your issue seen.
  • Finally, if the effect of spam in your market is egregious enough, your ability to publicize it may be your greatest hope. Major media have now repeatedly featured broadcasts and stories on this topic, and shame will sometimes move Google to action when no other motivation appears to.

3. Try to build a local anti-spam movement

What if you built a local movement? What if you and your friendlier competitors joined forces to knock spam out of Google together? Imagine all of the florists, hair salons, or medical practitioners in a town coming together to watch the local SERPs in shifts so that everyone in their market could benefit from bad actors being reported.

Maybe you’re already in a local business association with many hands that could lighten the work of protecting a whole community from unethical business practices. Maybe your town could then join up with the nearest major city, and that city could begin putting pressure on legislators. Maybe legislators would begin to realize the extent of the impacts when legitimate businesses face competition from fake entities and illegal practices. Maybe new anti-trust and communications regulations would ensue.

Now, I promised you “simple,” and this isn’t it, is it? But every time I see a fake listing, I know I’m looking at a single pebble and I’m beginning to think it may take an avalanche to bring about change great enough to protect both local brands and consumers. Google is now 15 years into this dynamic with no serious commitment in sight to resolve it.

At least in your own backyard, in your own community, you can be one small part of the solution with the easy tactics I’ve shared today, but maybe it’s time for local commerce to begin both doing more and expecting more in the way of protections. 

I’m ready for that. And you?



Friday, November 29, 2019

All About Fraggles (Fragment + Handle) - Whiteboard Friday

Posted by Suzzicks

What are "fraggles" in SEO and how do they relate to mobile-first indexing, entities, the Knowledge Graph, and your day-to-day work? In this glimpse into her 2019 MozCon talk, Cindy Krum explains everything you need to understand about fraggles in this edition of Whiteboard Friday.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, Moz fans. My name is Cindy Krum, and I'm the CEO of MobileMoxie, based in Denver, Colorado. We do mobile SEO and ASO consulting. I'm here in Seattle, speaking at MozCon, but also recording this Whiteboard Friday for you today, and we are talking about fraggles.

So fraggles are obviously a name that I'm borrowing from Jim Henson, who created "Fraggle Rock." But it's a combination of words. It's a combination of fragment and handle. I talk about fraggles as a new way or a new element or thing that Google is indexing.

Fraggles and mobile-first indexing

Let's start with the idea of mobile-first indexing, because you have to kind of understand that before you can go on to understand fraggles. So I believe mobile-first indexing is about a little bit more than what Google says. Google says that mobile-first indexing was just a change of the crawler.

They had a desktop crawler that was primarily crawling and indexing, and now they have a mobile crawler that's doing the heavy lifting for crawling and indexing. While I think that's true, I think there's more going on behind the scenes that they're not talking about, and we've seen a lot of evidence of this. So what I believe is that mobile-first indexing was also about indexing, hence the name.

Knowledge Graph and entities

So I think that Google has reorganized their index around entities or around specifically entities in the Knowledge Graph. So this is kind of my rough diagram of a very simplified Knowledge Graph. But Knowledge Graph is all about person, place, thing, or idea.

Nouns are entities. Knowledge Graph has nodes for all of the major person, place, thing, or idea entities out there. But it also indexes or it also organizes the relationships of this idea to this idea or this thing to this thing. What's useful for that to Google is that these things, these concepts, these relationships stay true in all languages, and that's how entities work, because entities happen before keywords.

This can be a hard concept for SEOs to wrap their brain around because we're so used to dealing with keywords. But if you think about an entity as something that's described by a keyword and can be language agnostic, that's how Google thinks about entities, because entities in the Knowledge Graph are not written up per se; rather, their unique identifier isn't a word, it's a number, and numbers are language agnostic.

But if we think about an entity like mother, mother is a concept that exists in all languages, but we have different words to describe it. But regardless of what language you're speaking, mother is related to father, is related to daughter, is related to grandfather, all in the same ways, even if we're speaking different languages. So if Google can use what they call the "topic layer" and entities as a way to filter in information and understand the world, then they can do it in languages where they're strong and say, "We know that this is true absolutely 100% all of the time."

Then they can apply that understanding to languages that they have a harder time indexing or understanding, they're just not as strong or the algorithm isn't built to understand things like complexities of language, like German where they make really long words or other languages where they have lots of short words to mean different things or to modify different words.

Languages all work differently. But if they can use their translation API and their natural language APIs to build out the Knowledge Graph in places where they're strong, then they can use it with machine learning to also build it and do a better job of answering questions in places or languages where they're weak. So when you understand that, then it's easy to think about mobile-first indexing as a massive Knowledge Graph build-out.

We've seen this happening statistically. There are more Knowledge Graph results and more other things that seem to be related to Knowledge Graph results, like people also ask, people also search for, related searches. Those are all describing different elements or different nodes on the Knowledge Graph. So when you see those things in the search, I want you to think, hey, this is the Knowledge Graph showing me how this topic is related to other topics.

So when Google launched mobile-first indexing, I think the reason it took two and a half years is that they were reindexing the entire web and organizing it around the Knowledge Graph. If you think back to the AMA that John Mueller did right about the time that mobile-first indexing was launching, he answered a lot of questions that were about JavaScript and hreflang.

When you put this in that context, it makes more sense. He wants the entity understanding, or he knows that the entity understanding is really important, so hreflang is also really important. So that's enough of that. Now let's talk about fraggles.

Fraggles = fragment + handle

So fraggles, as I said, are a fragment plus a handle. It's important to know that fraggles — let me go over here — fraggles and fragments, there are lots of things out there that have fragments. So you can think of native apps, databases, websites, podcasts, and videos. Those can all be fragmented.

Even though they don't have a URL, they might be useful content, because Google says its goal is to organize the world's information, not to organize the world's websites. I think that, historically, Google has kind of been locked into this crawling and indexing of websites and that that's bothered it, that it wants to be able to show other stuff, but it couldn't do that because they all needed URLs.

But with fragments, potentially they don't have to have a URL. So keep these things in mind — apps, databases and stuff like that — and then look at this. 


So this is a traditional page. If you think about a page, Google has kind of been forced, historically by their infrastructure, to surface pages and to rank pages. But pages sometimes struggle to rank if they have too many topics on them.

So for instance, what I've shown you here is a page about vegetables. This page may be the best page about vegetables, and it may have the best information about lettuce, celery, and radishes. But because it's got those topics and maybe more topics on it, they all kind of dilute each other, and this great page may struggle to rank because it's not focused on the one topic, on one thing at a time.

Google wants to rank the best things. But historically they've kind of pushed us to put the best things on one page at a time and to break them out. So what that's created is this "content is king, I need more content, build more pages" mentality in SEO. The problem is everyone can be building more and more pages for every keyword that they want to rank for or every keyword group that they want to rank for, but only one is going to rank number one.

Google still has to crawl all of those pages that it told us to build, and that creates this character over here, I think, Marjory the Trash Heap, which if you remember the Fraggles, Marjory the Trash Heap was the all-knowing oracle. But when we're all creating kind of low- to mid-quality content just to have a separate page for every topic, then that makes Google's life harder, and that of course makes our life harder.

So why are we doing all of this work? The answer is because Google can only index pages, and if the page is too long or too many topics, Google gets confused. So we've been enabling Google to do this. But let's pretend, go with me on this, because this is a theory, I can't prove it. But if Google didn't have to index a full page or wasn't locked into that and could just index a piece of a page, then that makes it easier for Google to understand the relationships of different topics to one page, but also to organize the bits of the page to different pieces of the Knowledge Graph.

So this page about vegetables could be indexed and organized under the vegetable node of the Knowledge Graph. But that doesn't mean that the lettuce part of the page couldn't be indexed separately under the lettuce portion of the Knowledge Graph and so on, celery to celery and radish to radish. Now I know this is novel, and it's hard to think about if you've been doing SEO for a long time.

But let's think about why Google would want to do this. Google has been moving towards all of these new kinds of search experiences where we have voice search, we have the Google Home Hub kind of situation with a screen, or we have mobile searches. If you think about what Google has been doing, we've seen the increase in people also ask, and we've seen the increase in featured snippets.

They've actually been kind of, sort of making fragments for a long time or indexing fragments and showing them in featured snippets. The difference between that and fraggles is that when you click through on a fraggle, when it ranks in a search result, Google scrolls to that portion of the page automatically. That's the handle portion.

So handles you may have heard of before. They're kind of old-school web building. We call them bookmarks, anchor links, anchor jump links, stuff like that. It's when it automatically scrolls to the right portion of the page. But what we've seen with fraggles is Google is lifting bits of text, and when you click on it, they're scrolling directly to that piece of text on a page.

So we see this already happening in some results. What's interesting is Google is overlaying the link. You don't have to program the jump link in there. Google actually finds it and puts it there for you. So Google is already doing this, especially with AMP featured snippets. If you have an AMP featured snippet, so a featured snippet that's lifted from an AMP page, when you click through, Google is actually scrolling and highlighting the featured snippet so that you could read it in context on the page.

But it's also happening in other kind of more nuanced situations, especially with forums and conversations where they can pick a best answer. The difference between a fraggle and something like a jump link is that Google is overlaying the scrolling portion. The difference between a fraggle and a site link is site links link to other pages, and fraggles, they're linking to multiple pieces of the same long page.

So we want to avoid continuing to build up low-quality or mid-quality pages that might go to Marjory the Trash Heap. We want to start thinking in terms of can Google find and identify the right portion of the page about a specific topic, and are these topics related enough that they'll be understood when indexing them towards the Knowledge Graph.

Knowledge Graph build-out into different areas

So I personally think that we're seeing the build-out of the Knowledge Graph in a lot of different things. I think featured snippets are kind of facts or ideas that are looking for a home or validation in the Knowledge Graph. People also ask seem to be the related nodes. People also search for, same thing. Related searches, same thing. Featured snippets, oh, they're on there twice, two featured snippets. Found on the web, which is another way where Google is putting expanders by topic and then giving you a carousel of featured snippets to click through on.



 So we're seeing all of those things, and some SEOs are getting kind of upset that Google is lifting so much content and putting it in the search results and that you're not getting the click. We know that 61% of mobile searches don't get a click anymore, and it's because people are finding the information that they want directly in a SERP.

That's tough for SEOs, but great for Google because it means Google is providing exactly what the user wants. So they're probably going to continue to do this. I think that SEOs are going to change their minds and they're going to want to be in that windowed content, in the lifted content, because when Google starts doing this kind of thing for the native apps, databases, and other content, websites, podcasts, stuff like that, then those are new competitors that you didn't have to deal with when it was only websites ranking, but those are going to be more engaging kinds of content that Google will be showing or lifting and showing in a SERP even if they don't have to have URLs, because Google can just window them and show them.

So you'd rather be lifted than not shown at all. So that's it for me and featured snippets. I'd love to answer your questions in the comments, and thanks very much. I hope you like the theory about fraggles.

Video transcription by Speechpad.com



Tuesday, November 26, 2019

The Practical Guide to Finding Anyone's Email Address

Posted by David_Farkas

In link building, few things are more frustrating than finding the perfect link opportunity but being completely unable to find a contact email address.

It’s probably happened to you — if you’re trying to build links or do any sort of outreach, it almost always entails sending out a fairly significant amount of emails. There are plenty of good articles out there about building relationships within the context of link building, but it’s hard to build relationships when you can’t even find a contact email address.

So, for today, I want to focus on how you can become better at finding those important email addresses.

Link builders spend a lot of time just trying to find contact info, and it’s often a frustrating process, just because sussing out email addresses can indeed be quite difficult. The site you’re targeting might not even have a contact page in the first place. Or, if the site does have a contact page, it might only display a generic email address. And, sometimes, the site may list too many email addresses. There are eight different people with similar-sounding job titles — should you reach out to the PR person, the marketing director, or the webmaster? It’s not clear.

Whatever the case may be, finding the right email address is absolutely imperative to any successful outreach campaign. In our industry, the numbers around outreach and replies aren’t great. Frankly, it’s shocking to hear the industry standard — only 8.5% of outreach emails receive a response.

I can’t help but wonder how many mistakes are made along the way to such a low response rate.

While there are certainly instances where there is simply no clear and obvious contact method, that should be the exception — not the rule! An experienced link builder understands that finding relevant contact information is essential to their success.

That’s why I’ve put together a quick list of tips and tools that will help you to find the email addresses and contact information you need when you’re building links.

And, if you follow my advice, here is a glimpse of the results you could expect:

Screenshot of high open and reply rates on an email

We don’t track clicks, in case you were wondering ;)

ALWAYS start by looking around!

First, let’s start with my golden rule: Before you fire up any tool, you should always manually look for the correct contact email yourself.

Based on my experience, tools and automation are a last resort. If you rely solely upon tools and automated solutions, you’ll end up with many more misfired emails than if you were to go the manual route. There’s a simple reason for this: the email address listed on your target website may, surprisingly, belong to the right person you should contact!

Now, if you're using a tool, it may generate dozens of email addresses, and you'll never end up actually emailing the correct individual. Another reason I advocate manually looking for emails is that many email-finding tools are limited and can only find email addresses that are associated with a domain name. So, if there is a webmaster that happens to have a @gmail.com email address, the email-finding tool will not find it.

It’s also important to only reach out to people you strongly believe will have an interest in your email in order to stay GDPR compliant.

So, always start your manual search by looking around the site. Usually, there will be a link to the contact page in the header, footer, or sidebar. If there’s not a page explicitly named “contact,” or if the contact page only has generic email addresses, that’s when I would recommend jumping to an “About Us” page, should there be one. 

You always want to find a personal email, not a generic one or a contact form. Outreach is more effective when you can address a specific individual, not whoever is checking info@domain.com that day.

If you encounter too many emails and aren’t sure who the best person to contact is, I suggest sending an email to your best hunch that goes something like this:

And who knows, you may even get a reply like this:

Screenshot of a reply telling you to contact someone else

If you weren’t able to locate an email address at this point, I’d move on to the next section.

Ask search engines for help

Perhaps the contact page you were looking for was well-hidden; maybe they don’t want to be contacted that much or they're in desperate need of a new UX person.

You can turn to search engines for help.

My go-to search engine lately is Startpage. Dubbed as the world's most private search engine, they display Google SERPs in a way that doesn’t make you feel like you just stepped into Times Square. They also have a cool option to browse the search results anonymously with "Anonymous View."

For our purposes, I would use the site: search operator just like this:

If there is in fact a contact page or email somewhere on their website that you were not able to find, any competent search engine will find it for you. If the above site query doesn't return any results, then I’d start expanding my search to other corners of the web.

Use the search bar and type:

If you’re looking for the email of a specific person, type their name before or after the quotation marks.

With this query you can find non-domain email addresses:

If that person’s email address is publicly available somewhere, you will likely be able to find it within the search results.
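The exact queries in this post were shown as screenshots, so here's a hypothetical sketch (in Python, purely for illustration) of the kinds of operator searches being described; the domain and name are placeholders, not the author's examples:

```python
# Hypothetical examples of the operator-style searches described above.
# Swap in your target domain and contact name.
domain = "example.com"
person = "Jane Doe"

queries = [
    f"site:{domain} contact",     # surface a hard-to-find contact page
    f'site:{domain} "email"',     # pages on the site that mention an email address
    f'"{person}" "@{domain}"',    # a specific person's domain-based address
    f'"{person}" "@gmail.com"',   # non-domain addresses the person may use
]

for query in queries:
    print(query)
```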

Email-finding tools

There are many, many excellent email finding tools to choose from. The first one I want to talk about is Hunter.

Hunter has a Chrome extension that’s really easy to use. After you’ve downloaded the extension, there’s not much more that needs to be done.

Go to the site which you are thinking about sending an email to, click on the extension in the top right corner of your screen, and Hunter, well, hunts.

It returns every email address it can find associated with that domain, and it also allows you to filter the results by category.

Did I say “email address?” I meant to say email address, name, job title, etc. Essentially, it’s a one-click fix to get everything you need to send outreach.

Because I use Hunter regularly (and for good reason, as you can see), it’s the one I’m most familiar with. You can also use Hunter’s online app to look up emails in bulk.

The major downside of working in bulk is coming up with an effective formula to sift through all the emails. Hunter may generate dozens of emails for one site, leaving you to essentially guess which email address is best for outreach. And if you’re relying on guess-work, chances are pretty high you’re leaving perfectly good prospects on the table.

There are several other email finding tools to pick from and I would be remiss to not mention them. Here are 5 alternative email-finding tools:

Even though I personally try not to be too dependent on tools, the fact of the matter is that they provide the easiest, most convenient route in many cases.

The guessing game

I know there's no word in the digital marketing world that produces more shudders than “guessing.” However, there are times when guessing is the easiest route.

Let’s be real: there aren’t too many different ways that companies both large and small format their email addresses. It’s usually going to be something like:

If you’ve ever worked for a living, you know most of the variations. But, in case you need some help, there’s a tool for that.
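If you want to generate the usual variations yourself, here's a minimal sketch; the patterns are assumptions based on common corporate conventions, so your target may use something else entirely:

```python
def guess_email_addresses(first, last, domain):
    """Generate common corporate email formats for a name.

    These patterns are assumptions based on widespread conventions
    (first@, first.last@, flast@, ...); always verify before sending.
    """
    f, l = first.lower(), last.lower()
    return [
        f"{f}@{domain}",         # jane@example.com
        f"{f}.{l}@{domain}",     # jane.doe@example.com
        f"{f}{l}@{domain}",      # janedoe@example.com
        f"{f[0]}{l}@{domain}",   # jdoe@example.com
        f"{f}.{l[0]}@{domain}",  # jane.d@example.com
        f"{f}_{l}@{domain}",     # jane_doe@example.com
        f"{l}.{f}@{domain}",     # doe.jane@example.com
    ]

print(guess_email_addresses("Jane", "Doe", "example.com"))
```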

Now, I’m not suggesting that you just pick any one of these random addresses, send your email, cross your fingers, and hope for the best. Far from it. There are actually tools that you can use that will indicate when you’ve selected the right one.

Sales Navigator is such a tool. Sales Navigator is a Gmail extension that is easy to use. Simply enter the name of the person you’re looking for, and it will return all of the possible standard variations that they may use for their email address. Then, you can actually test the address from your Gmail account. When you type the address into the proper line, a sidebar will appear on your screen. If there is no information in that sidebar, you have the wrong address. If, however, you get a return that looks like this:

Congratulations! You’ve found the right email address.

Obviously, this method only works if you know the name of the person you want to email, but just don’t have their email address. Still, in those scenarios, Sales Navigator works like a charm.

Trust, but verify

There’s nothing more annoying than when you think you’ve finally struck gold, but the gold turned out to be pyrite. Getting an email that bounces back because it wasn’t the correct address is frustrating. And even worse, if it happens too often, your email can end up on email blacklists and destroy your email deliverability.

There are ways to verify, however. At my company, we use Neverbounce. It’s effective and incredibly easy to use. With Neverbounce, you can enter either individual email addresses or bulk lists, and voila!

It will let you know if that email address is currently Valid, Invalid, or Unknown. It’s that easy. Here are some other email verifiers:

Subscribe to their newsletter

Here’s one final out-of-the-box approach. This approach works more often with sites where one person clearly does most, if not all, of the work. A site where someone’s name is the domain name, for example.

If you come across a site like davidfarkas.com and you see a newsletter that can be subscribed to, hit that subscribe button. Once that’s done, you can simply reply to one iteration of the newsletter.

This method has an added benefit. An effective way of building links is building relationships, just like I said in the opening. When you can demonstrate that you're already subscribing to a webmaster’s newsletter, you'll be currying favor with that webmaster.

Conclusion

When you send a link building outreach email, you want to make sure it’s going to a real person and, even more importantly, ending up in the right hands. Sending an email to an incorrect contact periodically may seem like a negligible waste of time, but when you send emails at the volume a link builder should, the waste adds up very quickly. In fact, enough waste can kill everything else that you’re trying to accomplish.

It’s well worth your time to make sure you’re getting it right by putting in the effort to find the right email address. Be a picky link builder. Don’t just choose the first email that comes your way, and never rely solely on tools. If you email the wrong person, it will look to them like you didn’t care enough to spend time on their site, and in return, they will ignore you and your pitch.

With the tips outlined above, you'll avoid these issues and be on your way to more successful outreach.



Monday, November 25, 2019

App Store SEO: How to Diagnose a Drop in Traffic & Win It Back

Posted by Joel.Mesherghi

For some organizations, mobile apps can be an important means to capturing new leads and customers, so it can be alarming when you notice your app visits are declining.

However, while there is content on how to optimize your app, otherwise known as ASO (App Store Optimization), there is little information out there on the steps required to diagnose a drop in app visits.

Although there are overlaps with traditional search, there are unique factors that play a role in app store visibility.

The aim of this post is to give you a solid foundation for investigating a drop in app store visits, and then we’ll go through some quick-fire opportunities to win that traffic back.

We’ll go through the process of investigating why your app traffic declined, including:

  1. Identifying potential external factors
  2. Identifying the type of keywords that dropped in visits
  3. Analyzing app user engagement metrics

And we’ll go through some ways to help you win traffic back including:

  1. Spying on your competitors
  2. Optimizing your store listing
  3. Investing in localisation

Investigating why your app traffic declined

Step 1. Identify potential external factors

Some industries/businesses will have certain periods of the year where traffic may drop due to external factors, such as seasonality.

Before you begin investigating a traffic drop further:

  • Talk to your point of contact and ask whether seasonality impacts their business, or whether there are general industry trends at play. For example, aggregator sites like SkyScanner may see a drop in app visits after the busy period at the start of the year.
  • Identify whether app installs actually dropped. If they didn’t, then you probably don’t need to worry about a drop in traffic too much and it could be Google’s and Apple’s algorithms better aligning the intent of search terms.

Step 2. Identify the type of keywords that dropped in visits

Like traditional search, identifying the type of keywords (branded and non-branded), as well as the individual keywords that saw the biggest drop in app store visits, will provide much needed context and help shape the direction of your investigation. For instance:

If branded terms saw the biggest drop-off in visits this could suggest:

  1. There has been a decrease in the amount of advertising spend that builds brand/product awareness
  2. Competitors are bidding on your branded terms
  3. The app name/brand has changed and hasn’t been able to mop up all previous branded traffic

If non-branded terms saw the biggest drop off in visits this could suggest:

  1. You’ve made recent optimisation changes that have had a negative impact
  2. User engagement signals, such as app crashes, or app reviews have changed for the worse
  3. Your competition have better optimised their app and/or provide a better user experience (particularly relevant if an app receives a majority of its traffic from a small set of keywords)
  4. Your app has been hit by an algorithm update

If both branded and non-branded terms saw the biggest drop off in visits this could suggest:

  1. You’ve violated Google’s policies on promoting your app.
  2. There are external factors at play
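To make the branded vs. non-branded comparison concrete, here's a minimal sketch, assuming you've exported keyword-level visit counts for two comparable periods to a CSV; the file name, column names, and brand term are all placeholders:

```python
import csv
from collections import defaultdict

BRAND_TERMS = ("acme",)  # hypothetical brand name(s)

def is_branded(keyword):
    return any(term in keyword.lower() for term in BRAND_TERMS)

totals = defaultdict(lambda: {"previous": 0, "current": 0})

# Assumed columns: keyword, previous_visits, current_visits
with open("app_keyword_visits.csv") as f:
    for row in csv.DictReader(f):
        bucket = "branded" if is_branded(row["keyword"]) else "non-branded"
        totals[bucket]["previous"] += int(row["previous_visits"])
        totals[bucket]["current"] += int(row["current_visits"])

for bucket, visits in totals.items():
    change = visits["current"] - visits["previous"]
    pct = (change / visits["previous"] * 100) if visits["previous"] else 0.0
    print(f"{bucket}: {visits['previous']} -> {visits['current']} ({pct:+.1f}%)")
```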

To get data for your Android app

To get data for your Android app, sign into your Google Play Console account.

Google Play Console provides a wealth of data on the performance of your android app, with particularly useful insights on user engagement metrics that influence app store ranking (more on these later).

However, keyword specific data will be limited. Google Play Console will show you the individual keywords that delivered the most downloads for your app, but the majority of keyword visits will likely be unclassified: mid to long-tail keywords that generate downloads, but don’t generate enough downloads to appear as isolated keywords. These keywords will be classified as “other”.

Your chart might look like the below. Repeat the same process for branded terms.

Above: Graph of a client’s non-branded Google Play Store app visits. The number of visits are factual, but the keywords driving visits have been changed to keep anonymity.

To get data for your iOS app

To get data on the performance of your iOS app, Apple has App Store Connect. Like Google Play Console, you’ll be able to get your hands on user engagement metrics that can influence the ranking of your app.

However, keyword data is even scarcer than in Google Play Console. You’ll only be able to see the total number of impressions your app’s icon has received on the App Store. If you’ve seen a drop in visits for both your Android and iOS apps, then you could use Google Play Console data as a proxy for keyword performance.

If you use an app rank tracking tool, such as TheTool, you can somewhat plug gaps in knowledge for the keywords that are potentially driving visits to your app.

Step 3. Analyze app user engagement metrics

User engagement metrics that underpin a good user experience have a strong influence on how your app ranks and both Apple and Google are open about this.

Google states that user engagement metrics like app crashes, ANR rates (application not responding) and poor reviews can limit exposure opportunities on Google Play.

While Apple isn't quite as forthcoming as Google when it comes to providing information on engagement metrics, they do state that app ratings and reviews can influence app store visibility.

Ultimately, Apple wants to ensure iOS apps provide a good user experience, so it’s likely they use a range of additional user engagement metrics to rank an app in the App Store.

As part of your investigation, you should look into how the below user engagement metrics may have changed around the time period you saw a drop in visits to your app.

  • App rating
  • Number of ratings (newer/fresh ratings will be weighted more for Google)
  • Number of downloads
  • Installs vs uninstalls
  • App crashes and application not responding

You’ll be able to get data for the above metrics in Google Play Console and App Store Connect, or you may have access to this data internally.
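As a quick way to eyeball those metrics side by side, here's a small sketch comparing values from before and after the drop; the numbers are placeholders, not benchmarks:

```python
# Placeholder values: replace with your own exports from Google Play Console
# or App Store Connect for the periods before and after the traffic drop.
before = {"rating": 4.4, "new_ratings": 320, "installs": 12000, "uninstalls": 3100, "crash_rate": 0.9}
after = {"rating": 4.1, "new_ratings": 240, "installs": 9500, "uninstalls": 3400, "crash_rate": 1.6}

HIGHER_IS_BETTER = {"rating", "new_ratings", "installs"}

for metric in before:
    delta = after[metric] - before[metric]
    if delta == 0:
        verdict = "no change"
    else:
        improved = delta > 0 if metric in HIGHER_IS_BETTER else delta < 0
        verdict = "improved" if improved else "worsened"
    print(f"{metric}: {before[metric]} -> {after[metric]} ({verdict})")
```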

Even if your analysis doesn’t reveal insights, metrics like app rating influence conversion and where your app ranks in the app pack SERP feature, so it’s well worth investing time in developing a strategy to improve these metrics.

One simple tactic could be to ensure you respond to negative reviews and reviews with questions. In fact, users increase their rating by +0.7 stars on average after receiving a reply.

Apple offers a few tips on asking for ratings and reviews for iOS apps.

Help win your app traffic back

Step 1. Spy on your competitors

Find out who’s ranking

When trying to identify opportunities to improve app store visibility, I always like to compare the top 5 ranking competitor apps for some priority non-branded keywords.

All you need to do is search for these keywords in Google Play and the App Store and grab the publicly available ranking factors from each app listing. You should have something like the below.

Brand | Title | Title character length | Rating | Number of reviews | Number of installs | Description character length
COMPETITOR 1 | [Competitor title] | 50 | 4.8 | 2,848 | 50,000+ | 3,953
COMPETITOR 2 | [Competitor title] | 28 | 4.0 | 3,080 | 500,000+ | 2,441
COMPETITOR 3 | [Competitor title] | 16 | 4.0 | 2,566 | 100,000+ | 2,059
YOUR BRAND | [Your brand's title] | 37 | 4.3 | 2,367 | 100,000+ | 3,951
COMPETITOR 4 | [Competitor title] | 7 | 4.1 | 1,140 | 100,000+ | 1,142
COMPETITOR 5 | [Competitor title] | 24 | 4.5 | 567 | 50,000+ | 2,647

Above: anonymized table of a client's Google Play competitors

From this, you may get some indications as to why an app ranks above you. For instance, we see “Competitor 1” not only has the best app rating, but has the longest title and description. Perhaps they better optimized their title and description?

We can also see that competitors that rank above us generally have a larger number of total reviews and installs, which aligns with both Google’s and Apple’s statements about the importance of user engagement metrics.

With the above comparison information, you can dig a little deeper, which leads us on nicely to the next section.
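To make that digging a little more systematic, you can drop the comparison data into a short script and let it flag where you trail the field. A minimal sketch using the anonymized numbers from the table above:

```python
# Flag the gaps between your listing and the competitor set above.
listings = {
    "Competitor 1": {"title_len": 50, "rating": 4.8, "reviews": 2848, "desc_len": 3953},
    "Competitor 2": {"title_len": 28, "rating": 4.0, "reviews": 3080, "desc_len": 2441},
    "Competitor 3": {"title_len": 16, "rating": 4.0, "reviews": 2566, "desc_len": 2059},
    "Your brand":   {"title_len": 37, "rating": 4.3, "reviews": 2367, "desc_len": 3951},
    "Competitor 4": {"title_len": 7,  "rating": 4.1, "reviews": 1140, "desc_len": 1142},
    "Competitor 5": {"title_len": 24, "rating": 4.5, "reviews": 567,  "desc_len": 2647},
}

yours = listings["Your brand"]
for metric in yours:
    # Find the listing with the highest value for this metric
    leader, top_value = max(
        ((name, stats[metric]) for name, stats in listings.items()),
        key=lambda pair: pair[1],
    )
    print(f"{metric}: you {yours[metric]} vs highest {top_value} ({leader})")
```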

Optimize your app text fields

Keywords you add to text fields can have a significant impact on app store discoverability.

As part of your analysis, you should look into how your keyword optimization differs from competitors and identify any opportunities.

For Google Play, adding keywords to the below text fields can influence rankings:

  • Keywords in the app title (50 characters)
  • Keywords in the app description (4,000 characters)
  • Keywords in short description (80 characters)
  • Keywords in URL
  • Keywords in your app name

When it comes to the App Store, adding keywords to the below text fields can influence rankings:

  • Keywords in the app title (30 characters)
  • Using the 100 character keywords field (a dedicated 100-character field to place keywords you want to rank for)
  • Keywords in your app name
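Before comparing against competitors, it's worth a quick check that your own draft copy actually fits within the limits listed above. Here's a minimal sketch; the draft text is hypothetical and the limits simply reflect the numbers above, which may change over time.

```python
# Check draft store listing copy against the character limits listed above.
GOOGLE_PLAY_LIMITS = {"title": 50, "short_description": 80, "description": 4000}
APP_STORE_LIMITS = {"title": 30, "keywords_field": 100}

def check_fields(draft: dict, limits: dict, store: str) -> None:
    for field, limit in limits.items():
        length = len(draft.get(field, ""))
        status = "OK" if length <= limit else "TOO LONG"
        print(f"[{store}] {field}: {length}/{limit} characters -- {status}")

# Hypothetical draft copy
draft_listing = {
    "title": "Example Job Search & Tracker",
    "short_description": "Find, track, and apply for jobs in one place.",
    "description": "Your full Google Play description goes here...",
    "keywords_field": "jobs,job search,employment,career,recruiter",
}

check_fields(draft_listing, GOOGLE_PLAY_LIMITS, "Google Play")
check_fields(draft_listing, APP_STORE_LIMITS, "App Store")
```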

To better understand how your optimization tactics hold up, I recommend comparing your app’s text fields to your competitors’.

For example, if I wanted to know how often keywords are mentioned in competitor app descriptions on Google Play (keywords in the description field are a ranking factor), then I’d create a table like the one below.

Keyword | Competitor 1 | Competitor 2 | Competitor 3 | Your brand | Competitor 4 | Competitor 5
job | 32 | 9 | 5 | 40 | 3 | 2
job search | 12 | 4 | 10 | 9 | 10 | 8
employment | 2 | 0 | 0 | 5 | 0 | 3
job tracking | 2 | 0 | 0 | 4 | 0 | 0
employment app | 7 | 2 | 0 | 4 | 2 | 1
employment search | 4 | 1 | 1 | 5 | 0 | 0
job tracker | 3 | 0 | 0 | 1 | 0 | 0
recruiter | 2 | 0 | 0 | 1 | 0 | 0

     Above: anonymized table of a client's Google Play competitors

From the above table, I can see that the number 1 ranking competitor (competitor 1) has more mentions of “job search” and “employment app” than I do.

Whilst there are many factors that decide the position at which an app ranks, I could deduce that I need to increase the frequency of said keywords in my Google Play app description to help improve ranking.

Be careful though: writing unnatural, keyword-stuffed descriptions and titles will likely have an adverse effect.

Remember, as well as being optimized for machines, text fields like your app title and description are meant to be a compelling “advertisement” of your app for users.

I’d repeat this process for other text fields to uncover other keyword insights.
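If you'd rather not count mentions by hand, a few lines of Python can build a frequency table like the one above. This is a rough sketch that assumes you've pasted each app's description text into the script:

```python
# Build a keyword-frequency table from app description text.
import re

descriptions = {
    "Competitor 1": "Paste Competitor 1's full Google Play description here.",
    "Your brand": "Paste your own full Google Play description here.",
}
keywords = ["job", "job search", "employment", "employment app"]

def count_keyword(text: str, keyword: str) -> int:
    # Whole-phrase, case-insensitive match
    pattern = r"\b" + re.escape(keyword) + r"\b"
    return len(re.findall(pattern, text, flags=re.IGNORECASE))

for keyword in keywords:
    counts = {app: count_keyword(text, keyword) for app, text in descriptions.items()}
    print(keyword, counts)
```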

Step 2. Optimize your store listing

Your store listing is the home of your app on Google Play. It’s where users can learn about your app, read reviews, and more. Surprisingly, not all apps take full advantage of developing an immersive store listing experience.

Whilst Google doesn't seem to state directly that fully utilizing the majority of store listing features impacts your app’s discoverability, it’s fair to speculate that there may be some ranking consideration behind this.

At the very least, investing in your store listing could improve conversion and you can even run A/B tests to measure the impact of your changes.

You can improve the overall user experience and content found in the store listing by adding a video trailer of your app, quality creative assets, a distinctive app icon (you’ll want your icon to stand out amongst a sea of other app icons), and more.

You can read Google’s best practice guide on creating a compelling Google Play store listing to learn more.

Step 3. Invest in localization

The saying goes “think global, act local” and this is certainly true of apps.

Previous studies have revealed that 72.4% of global consumers preferred to use their native language when shopping online and that 56.2% of consumers said that the ability to obtain information in their own language is more important than price.

It makes logical sense. The better you can personalize your product for your audience, the better your results will be, so go the extra mile and localize your Google Play and App Store listings.

Google has a handy checklist for localization on Google Play and Apple has a comprehensive resource on internationalizing your app on the App Store.

Wrap up

A drop in visits of any kind causes alarm and panic. Hopefully this post gives you a good starting point if you ever need to investigate why an app’s traffic has dropped, as well as some quick-fire opportunities to win it back.

If you’re interested in further reading on ASO, I recommend reading App Radar’s and TheTool’s guides to ASO, as well as app search discoverability tips from Google and Apple themselves.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Friday, November 22, 2019

Better Content Through NLP (Natural Language Processing) - Whiteboard Friday

Posted by RuthBurrReedy

Gone are the days of optimizing content solely for search engines. For modern SEO, your content needs to please both robots and humans. But how do you know that what you're writing can check the boxes for both man and machine?

In today's Whiteboard Friday, Ruth Burr Reedy focuses on part of her recent MozCon 2019 talk and teaches us all about how Google uses NLP (natural language processing) to truly understand content, plus how you can harness that knowledge to better optimize what you write for people and bots alike.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I'm Ruth Burr Reedy, and I am the Vice President of Strategy at UpBuild, a boutique technical marketing agency specializing in technical SEO and advanced web analytics. I recently spoke at MozCon on a basic framework for SEO and approaching changes to our industry that thinks about SEO in the light of we are humans who are marketing to humans, but we are using a machine as the intermediary.

Those videos will be available online at some point. [Editor's note: that point is now!] But today I wanted to talk about one point from my talk that I found really interesting and that has kind of changed the way that I approach content creation, and that is the idea that writing content that is easier for Google, a robot, to understand can actually make you a better writer and help you write better content for humans. It is a win-win. 

The relationships between entities, words, and how people search

To understand how Google is currently approaching parsing content and understanding what content is about, Google is spending a lot of time and a lot of energy and a lot of money on things like neural matching and natural language processing, which seek to understand basically when people talk, what are they talking about?

This goes along with the evolution of search to be more conversational. But there are a lot of times when someone is searching, but they don't totally know what they want, and Google still wants them to get what they want because that's how Google makes money. They are spending a lot of time trying to understand the relationships between entities and between words and how people use words to search.

The example that Danny Sullivan gave online, that I think is a really great example, is if someone is experiencing the soap opera effect on their TV. If you've ever seen a soap opera, you've noticed that they look kind of weird. Someone might be experiencing that, and not knowing what that's called they can't Google soap opera effect because they don't know about it.

They might search something like, "Why does my TV look funny?" Neural matching helps Google understand that when somebody is searching "Why does my TV look funny?" one possible answer might be the soap opera effect. So they can serve up that result, and people are happy. 

Understanding salience

As we're thinking about natural language processing, a core component of natural language processing is understanding salience.

Salience, content, and entities

Salience is a one-word way of summing up the question: to what extent is this piece of content about this specific entity? At this point Google is really good at extracting entities from a piece of content. Entities are basically nouns, people, places, things, proper nouns, regular nouns.

Entities are things, people, etc., numbers, things like that. Google is really good at taking those out and saying, "Okay, here are all of the entities that are contained within this piece of content." Salience attempts to understand how they're related to each other, because what Google is really trying to understand when they're crawling a page is: What is this page about, and is this a good example of a page about this topic?

Salience really goes into the second piece. To what extent might any given entity be the topic of a piece of content? It's often amazing the degree to which a piece of content that a person has created is not actually about anything. I think we've all experienced that.

You're searching and you come to a page and you're like, "This was too vague. This was too broad. This said that it was about one thing, but it was actually about something else. I didn't find what I needed. This wasn't good information for me." As marketers, we're often on the other side of that, trying to get our clients to say what their product actually does on their website or say, "I know you think that you created a guide to Instagram for the holidays. But you actually wrote one paragraph about the holidays and then seven paragraphs about your new Instagram tool. This is not actually a blog post about Instagram for the holidays. It's a piece of content about your tool." These are the kinds of battles that we fight as marketers. 

Natural Language Processing (NLP) APIs

Fortunately, there are now a number of natural language processing APIs you can experiment with yourself, including Google's Cloud Natural Language API.

Is it as sophisticated as what they're using on their own stuff? Probably not. But you can test it out. Put in a piece of content and see (a) what entities Google is able to extract from it, and (b) how salient Google feels each of these entities is to the piece of content as a whole. Again, to what degree is this piece of content about this thing?

So this natural language processing API, which you can try for free (and which isn't actually that expensive if you want to build a tool with it), will assign each entity it can extract a salience score between 0 and 1, saying, "Okay, how sure are we that this piece of content is about this thing versus just containing it?"

The closer you get to 1, the more confident the tool is that this piece of content is about this thing. A 0.9 would be really, really good. A 0.01 means the entity is there, but the tool isn't sure how closely it's related.
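If you want to try this on your own content, here's a minimal sketch using the google-cloud-language client library. It assumes you already have a Google Cloud project and authentication set up, and that you're on a recent version of the library.

```python
# Extract entities and salience scores for a piece of content
# using Google's Cloud Natural Language API.
from google.cloud import language_v1

def print_entity_salience(text: str) -> None:
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=text, type_=language_v1.Document.Type.PLAIN_TEXT
    )
    response = client.analyze_entities(request={"document": document})
    # Salience runs from 0 (barely related) to 1 (clearly the topic)
    for entity in response.entities:
        print(f"{entity.name}: salience {entity.salience:.2f}")

print_entity_salience(
    "The best chocolate chip cookies start with butter, sugar, flour, "
    "and chocolate chips, baked at 350 degrees."
)
```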

A delicious example of how salience and entities work


The example I have here, and this is not taken from a real piece of content — these numbers are made up, it's just an example — is if you had a chocolate chip cookie recipe, you would want chocolate chip cookies or chocolate chip cookie recipe, something like that, to be the number one entity, the most salient entity, and you would want it to have a pretty high salience score.

You would want the tool to feel pretty confident, yes, this piece of content is about this topic. But what you can also see is the other entities it's extracting and to what degree they are also salient to the topic. So you can see things like if you have a chocolate chip cookie recipe, you would expect to see things like cookie, butter, sugar, 350, which is the temperature you heat your oven, all of the different things that come together to make a chocolate chip cookie recipe.

But I think that it's really, really important for us as SEOs to understand that salience is the future of related keywords. We're beyond the time when to optimize for chocolate chip cookie recipe, we would also be looking for things like chocolate recipe, chocolate chips, chocolate cookie recipe, things like that. Stems, variants, TF-IDF, these are all older methodologies for understanding what a piece of content is about.

Instead, what we need to understand is: where does Google, using its vast body of knowledge (things like Freebase and large portions of the internet), see these entities co-occur at such a rate that it feels reasonably confident a piece of content about one entity, in order to be salient to that entity, should also include these other entities?

Using an expert is the best way to create content that's salient to a topic

So for chocolate chip cookie recipe, we're now also making sure we're adding things like butter, flour, sugar. This is actually really easy to do if you actually have a chocolate chip cookie recipe to put up there. I think this is what we're going to start seeing as a content trend in SEO: the best way to create content that is salient to a topic is to have an actual expert in that topic create that content.

Somebody with deep knowledge of a topic is naturally going to include co-occurring terms, because they know how to create something that's about what it's supposed to be about. I think what we're going to start seeing is that people are going to have to start paying more for content marketing, frankly. Unfortunately, a lot of companies seem to think that content marketing is and should be cheap.

Content marketers, I feel you on that. It sucks, and it's no longer the case. We need to start investing in content and investing in experts to create that content so that they can create that deep, rich, salient content that everybody really needs. 

How can you use this API to improve your own SEO? 

One of the things that I like to do with this kind of information (and this is something that I've done for years, just not in this context) is look at a prime optimization target in general: pages that rank for a topic, but rank on page 2.

What this often means is that Google understands that that keyword is a topic of the page, but it doesn't necessarily understand that it is a good piece of content on that topic, that the page is actually solely about that content, that it's a good resource. In other words, the signal is there, but it's weak.

What you can do is take content that ranks but not well, run it through this natural language API or another natural language processing tool, and look at how the entities are extracted and how Google is determining that they're related to each other. Sometimes it might be that you need to do some disambiguation. So in this example, you'll notice that while chocolate cookies is called a work of art, and I agree, cookie here is actually called other.

This is because cookie means more than one thing. There's cookies, the baked good, but then there's also cookies, the packet of data. Both of those are legitimate uses of the word "cookie." Words have multiple meanings. If you notice that Google, that this natural language processing API is having trouble correctly classifying your entities, that's a good time to go in and do some disambiguation.

Make sure that the terms surrounding that term are clearly saying, "No, I mean the baked good, not the software piece of data." That's a really great way to bump up your salience. Look at whether or not you have a strong salience score for your primary entity. You'd be amazed at how many pieces of content you can plug into this tool where the top, most salient entity is still only a 0.01 or a 0.14.

A lot of times the API is like "I think this is what it's about," but it's not sure. This is a great time to go in and bump up that content, make it more robust, and look at ways that you can make those entities easier to both extract and to relate to each other. This brings me to my second point, which is my new favorite thing in the world.

Writing for humans and writing for machines, you can now do both at the same time. You no longer have to, and you really haven't had to do this in a long time, but the idea that you might keyword stuff or otherwise create content for Google that your users might not see or care about is way, way, way over.

Now you can create content for Google that also is better for users, because the tenets of machine readability and human readability are moving closer and closer together. 

Tips for writing for human and machine readability:

Reduce semantic distances!

What I've done here is some research not on natural language processing, but on writing for human readability, that is, advice from writers and writing experts on how to write better, clearer, easier-to-read, easier-to-understand content. Then I pulled out the pieces of advice that also work as advice for writing for natural language processing. So natural language processing, again, is the process by which Google (or really anything that might be processing language) tries to understand how entities are related to each other within a given body of content.

Short, simple sentences

Short, simple sentences. Write simply. Don't use a lot of flowery language. Keep sentences short, and try to keep it to one idea per sentence. 

One idea per sentence

If you're running on, if you've got a lot of different clauses, if you're using a lot of pronouns and it's becoming confusing what you're talking about, that's not great for readers.

It also makes it harder for machines to parse your content. 

Connect questions to answers

Then closely connecting questions to answers. So don't say, "What is the best temperature to bake cookies? Well, let me tell you a story about my grandmother and my childhood," and 500 words later here's the answer. Connect questions to answers. 

What all three of those readability tips have in common is they boil down to reducing the semantic distance between entities.

If you want natural language processing to understand that two entities in your content are closely related, move them closer together in the sentence. Move the words closer together. Reduce the clutter, reduce the fluff, reduce the number of semantic hops that a robot might have to take between one entity and another to understand the relationship, and you've now created content that is more readable because it's shorter and easier to skim, but also easier for a robot to parse and understand.

Be specific first, then explain nuance

Going back to the example of "What is the best temperature to bake chocolate chip cookies at?" Now the real answer to what is the best temperature to bake chocolate cookies is it depends. Hello. Hi, I'm an SEO, and I just answered a question with it depends. It does depend.

That is true, and that is real, but it is not a good answer. It is also not the kind of thing that a robot could extract and reproduce in, for example, voice search or a featured snippet. If somebody says, "Okay, Google, what is a good temperature to bake cookies at?" and Google says, "It depends," that helps nobody even though it's true. So in order to write for both machine and human readability, be specific first and then you can explain nuance.

Then you can go into the details. So a better, just as correct answer to "What is the temperature to bake chocolate chip cookies?" is the best temperature to bake chocolate chip cookies is usually between 325 and 425 degrees, depending on your altitude and how crisp you like your cookie. That is just as true as it depends and, in fact, means the same thing as it depends, but it's a lot more specific.

It's a lot more precise. It uses real numbers. It provides a real answer. I've shortened the distance between the question and the answer. I didn't say it depends first. I said it depends at the end. That's the kind of thing that you can do to improve readability and understanding for both humans and machines.

Get to the point (don't bury the lede)

Get to the point. Don't bury the lede. All of you journalists who tried to become content marketers, and then everybody in content marketing said, "Oh, you need to wait till the end to get to your point or they won't read the whole thing," and you were like, "Don't bury the lede": you are correct. For those of you who aren't familiar with journalism speak, not burying the lede basically means get to the point upfront, at the top.

Include all the information that somebody would really need to get from that piece of content. If they don't read anything else, they read that one paragraph and they've gotten the gist. Then people who want to go deep can go deep. That's how people actually like to consume content, and surprisingly it doesn't mean they won't read the content. It just means they don't have to read it if they don't have time, if they need a quick answer.

The same is true with machines. Get to the point upfront. Make it clear right away what the primary entity, the primary topic, the primary focus of your content is and then get into the details. You'll have a much better structured piece of content that's easier to parse on all sides. 

Avoid jargon and "marketing speak"

Avoid jargon. Avoid marketing speak. Not only is it terrible, it's also very hard to understand. You see this a lot. I'm going back again to the example of getting your clients to say what their products do. If you work with a lot of B2B companies, you will often run into this. Yes, but what does it do? "It provides solutions to streamline the workflow and blah, blah." Okay, but what does it do? This is the kind of thing that can be really, really hard for companies to get out of their own heads about, but it's so important for users and for machines.

Avoid jargon. Avoid marketing speak. Not to get too tautological, but the more esoteric a word is, the less commonly it's used. That's actually what esoteric means. What that means is the less commonly a word is used, the less likely it is that Google is going to understand its semantic relationships to other entities.

Keep it simple. Be specific. Say what you mean. Wipe out all of the jargon. By wiping out jargon and kind of marketing speak and kind of the fluff that can happen in your content, you're also, once again, reducing the semantic distances between entities, making them easier to parse. 

Organize your information to match the user journey

Organize it and map it out to the user journey. Think about the information somebody might need and the order in which they might need it. 

Break out subtopics with headings

Then break it out with subheadings. This is like very, very basic writing advice, and yet you all aren't doing it. So if you're not going to do it for your users, do it for machines. 

Format lists with bullets or numbers

You can also really impact skimmability for users by breaking out lists with bullets or numbers.

The great thing about that is that breaking out a list with bullets or numbers also makes information easier for a robot to parse and extract. If a lot of these tips seem like they're the same tips that you would use to get featured snippets, they are, because featured snippets are actually a pretty good indicator that you're creating content that a robot can find, parse, understand, and extract, and that's what you want.

So if you're targeting featured snippets, you're probably already doing a lot of these things, good job. 

Grammar and spelling count!

The last thing, which I shouldn't have to say, but I'm going to say is that grammar and spelling and punctuation and things like that absolutely do count. They count to users. They don't count to all users, but they count to users. They also count to search engines.

Things like grammar, spelling, and punctuation are very, very easy signals for a machine to find and parse. Google has been specific, in things like the "Quality Rater Guidelines," that a well-written, well-structured, well-spelled, grammatically correct document is a sign of authoritativeness. I'm not saying that having a greatly spelled document is going to mean that you immediately rocket to the top of the results.

I am saying that if you're not on that stuff, it's probably going to hurt you. So take the time to make sure everything is nice and tidy. You can use vernacular English. You don't have to be perfect "AP Style Guide" all the time. But make sure that you are formatting things properly from a grammatical standpoint as well as a technical standpoint. What I love about all of this, this is just good writing.

This is good writing. It's easy to understand. It's easy to parse. It's still so hard, especially in the marketing world, to get out of that world of jargon, to get to the point, to stop writing 2,000 words because we think we need 2,000 words, to really think about are we creating content that's about what we think it's about.

Use these tools to understand how readable, parsable, and understandable your content is

So my hope for the SEO world and for you is that you can use these tools not just to think about how to dial in the perfect keyword density or whatever to get an almost perfect score on the salience in the natural language processing API. What I'm hoping is that you will use these tools to help yourself understand how readable, how parsable, and how understandable your content is, how much your content is about what you say it's about and what you think it's about so you can create better stuff for users.

It makes the internet a better place, and it will probably make you some money as well. So these are my thoughts. I'd love to hear in the comments if you're using the natural language processing API now, if you've built a tool with it, if you want to build a tool with it, what do you think about this, how do you use this, how has it gone. Tell me all about it. Holla atcha girl.

Have a great Friday.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!