When you think of video results on Google in 2022 (and video optimization), you might think of something that looks like this (from a search for “flag football”):
In mid-October, we noticed a drop in this type of video result, and the drop became dramatic by late October. Did Google remove these video results, or was our system broken? As it turns out, neither: video results have split into at least three distinct types (depending on how you count).
(1) Video packs (simple & complex)
The example above is pretty simple, with the exception of “Key Moments” (which debuted in 2019), but even the familiar video packs can get pretty complex. Here’s one from a search for the artist Gustav Klimt:
All three of the videos here have Key Moments, including a pre-expanded section for the top video with thumbnails for each of the moments. Some specific SERPs also have minor variations, such as the “Trailers & clips” feature on this search for “Lion King”:
Video packs are still often 3-packs, but can range from two to four results. While only the header really changes here, it’s likely that Google is using a modified algorithm to surface these trailer results.
(2) Branded video carousels
Some videos are displayed in a carousel format, which seems to be common for branded results within YouTube. Here’s an example for the search “Dave and Busters”:
While the majority of these “brand” (loosely defined) carousels are from YouTube, there are exceptions, such as this carousel from Disney Video for “Lightning McQueen”:
Like all carousel-based results, you can scroll horizontally to view more videos. Google’s mobile-first design philosophy has driven more of this format over time, as the combination of vertical and horizontal scrolling is more natural on mobile devices.
(3) Single/thumbnail video results
Prior to breaking out video into separate features, Google typically displayed video results as standard results with a screenshot thumbnail. In the past month, Google seems to have revived this format. Here’s an example for the search “longboarding”:
If you hover over the thumbnail, you’ll see a preview, like this (edited for size):
In some cases, we see multiple video results on a single page, and each of them seems to be counted as one of the “10 blue links” that we normally associate with standard organic results from the web.
There’s also a variant on the single-video format that seems specific to YouTube:
This variant also shows a preview when you hover over it, but it launches a simplified YouTube viewing experience that appears to be new (and will likely evolve over time).
(4) Bonus: Mega-videos
This format has been around for a while and is relatively rare, but certain niches, including hit songs, may return a large-scale video format, such as this one for Taylor Swift’s “Anti-Hero”:
A similar format sometimes appears for “how to” queries (and similar questions), such as the one below for “how to roundhouse kick.” Note the text excerpt below the video that Google has extracted from the audio …
While neither of these formats is new, and neither seems to have changed significantly in the past month, both are important variants of Google video results.
(5) Bonus: TikTok results
Finally, Google has started to display a special format for TikTok videos that typically includes a selection of five videos, each of which previews when you hover over it. Here’s an example from one of my favorite TikTok personalities:
Typically, these are triggered by searches that include “TikTok” in the query. While it’s not a standard video format and seems limited to TikTok content, it’s interesting to see how Google is experimenting with rich video results from other platforms.
Does YouTube still dominate?
Back in 2020, we did a study across 10,000 competitive Google searches that showed YouTube holding a whopping 94% of page-one video results. Has this changed with the recent format shuffling? In a word: no. Across the three main video formats discussed in this post, YouTube still accounts for 94% of results in this data set, with Facebook coming in a distant second at 0.8%. This does not count specialized results, such as the TikTok results above.
What does this mean for you?
If you’re tracking video results and have seen major changes, be aware that they may not have disappeared; more likely, they’ve morphed into another format. This is a good time to go look at your SERPs in the wild (on desktop and mobile) and see what kinds of video formats your target queries are showing. Google is experimenting not only with new formats, but with new video-specific markup and capabilities (such as extracting text directly from the soundtracks of videos and podcasts). You can expect all of this to continue to evolve into 2023.
The year may slowly be wrapping up, but we’ve got an extra-special early gift to share before you log off that laptop and put away your favorite travel mug.
We’re thrilled to announce the first 19 extraordinary speakers who will be taking the MozCon 2023 stage in Seattle this coming August (in alphabetical order).
Amanda is passionate about helping complex, large businesses improve their local visibility. Her background includes working with clients in the legal, health, financial, and home services industries.
Andi is the Founder and Strategy Director of Eximo Marketing, a marketing strategy consultancy based in the UK. Eximo works with established manufacturers who want to grow their business via direct to consumer. Andi also hosts the Strategy Sessions podcast.
Brie E Anderson is an Analytical Nerd with a Soft Spot for Strategy. She's spent the last 10 years helping businesses of all sizes execute data-driven strategies to increase ROI. Today, she runs BEAST Analytics, a digital marketing analytics consultancy.
Carrie Rose is the Founder of Rise at Seven, a leading global search-first creative agency that both drives and facilitates search demand for global brands, operating from four locations across the world, including the UK, US, and EU.
Chris Long is the VP of Marketing for the Go Fish Digital team. He works with unique problems and advanced search situations to help clients improve organic traffic through a deep understanding of Google's algorithm and web technology.
Crystal is Head of SEO Communications at Wix and an SEO & digital marketer with over 15 years of experience. Her clients have included Disney, McDonald’s, and Tomy. An avid SEO communicator, she has had her work featured at Google Search Central, BrightonSEO, and more.
Daniel is a Search Advocate at Google, part of the Search Console engineering team. His job is divided between educating and inspiring the Search community and working with the product’s engineering team to develop new capabilities.
Duane has lived in 6 cities across 3 continents while working with Ecom, DTC and SaaS brands. He now lives in Canada helping brands grow through data, strategy and PPC marketing across search & social ad platforms.
Jackie Chu is currently the SEO Lead, Intelligence for Uber, driving analytics and tooling for the SEO teams globally. She has deep experience in technical SEO, content SEO, ASO and international SEO spanning both B2B and B2C industries.
Jes is Group CMO at Swiss media giant Ringier, a marketing technologist, and mum of two tiny humans. She loves to talk about the future of search, smart marketing automation, and travel.
Lidia has been working in SEO for almost a decade, helping businesses in SaaS, media, and e-commerce grow online. She has a BSc in Psychology and a Master’s in Digital Business, and is a regular speaker at SEO events such as MozCon, BrightonSEO, and WTSFest.
Lily Ray is the Sr. Director, SEO & Head of Organic Research at Amsive Digital, where she provides strategic leadership for the agency’s SEO client programs. Lily began her SEO career in 2010 in a fast-paced start-up environment and moved quickly into the agency world, where she helped grow and establish an award-winning SEO department that delivered high impact work for a fast-growing list of notable clients, including Fortune 500 companies.
Miracle is Head of Organic Search at John Lewis (Financial Services) and is armed with more than a decade of experience supporting national and global brands with technical SEO and data strategy.
Noah is a technical marketer, nicknamed the Kraken, who is happiest building SEO tools, automations, data pipelines and communities. When not in the lab, he loves skiing, fly fishing, camping with his family, and walking his dog, Shadow.
Dr. Pete is Marketing Scientist for Seattle-based Moz, where he works with the marketing and data science teams on product research and data-driven content.
Ross Simmonds is the founder & CEO of Foundation, a global marketing agency that provides services to organizations all over the world, ranging from some of the fastest-growing startups to global brands. He was named one of Atlantic Canada’s Top 50 CEOs.
Tom is CTO at SearchPilot, where he leads the engineering & product teams. Tom has been working on the web for over 25 years, and has a PhD in Computer Science. He lives with his wife and 3 daughters in Germany.
Tom heads up the Search Science team at Moz, providing research and insight for Moz's next generation of tools. Previously he headed up the London consulting team for SEO agency Distilled, and worked as a chef in a roadside grill.
Wil has been leading the charge to leverage “Big Data” to break down silos between SEO, PPC, and traditional marketing, pulling together data from various sources to see the big picture.
From fan favorites to fresh faces, it’s a pretty great start to what’s sure to be the best MozCon yet! We’ll have even more incredible speakers to reveal, including our community speaker lineup, in early 2023.
But don’t wait to snag your tickets! Save up to $600 on MozCon 2023 now with Super Early Bird pricing.
Ranking on Google is not ranking in a vacuum. Ranking is outranking your competitors. When you've got very limited space on the first page of the SERPs, you need to be doing better than your competitors.
In today's Whiteboard Friday, Lidia Infante shows you her recommended strategies for successful SEO gap analysis.
Click on the whiteboard image above to open a high resolution version in a new tab!
Video Transcription
Howdy, Moz fans, and welcome to a new edition of Whiteboard Fridays. My name is Lidia Infante, and I'm the Senior SEO Manager at sanity.io. Today, I'm going to be talking to you about SEO gap analysis, and yes, I know it's a very unsexy topic, but bear with me because it's worth it.
SEO gap analysis takes us to the first principles of what we do in SEO because ranking on Google is not ranking in a vacuum. Ranking is outranking your competitors. You've got a very limited space on the first page of the SERPs, and you need to be doing better than your competitors to be able to rank there. That means, then, you need to know what your competitors are doing and how you're going to do it better.
Once you have your set of competitors ready, you're going to proceed to benchmark yourself against them, and we're going to be doing this across the three pillars of SEO.
So we're going to be looking at content, we're going to be looking at links, and we're going to be looking at tech SEO. We're going to look at how our competitors perform from each of those and how we compare.
Content
So when it comes to content, the very first thing that we want to look at is the estimated traffic by type that our competitors and we have. When I’m talking about traffic by type, what I mean is: Are they getting branded traffic versus unbranded traffic, product traffic, editorial traffic? It’s going to be very different depending on the vertical that you’re in, so adapt it to make it yours.

We’re also going to be looking at the number of editorial URLs that they have and how much traffic those editorial URLs are getting on average.

And lastly, we’re going to be looking at the number of keywords that they’re ranking for. We’re not going to be looking at all of the keywords; we’re going to focus on keywords ranking in positions 1 to 30. Again, you can make this yours. You know your market better, and you know what’s relevant, but that should narrow the entire pool to keywords that are more relevant to your competitors.
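To make that benchmark concrete, here’s a minimal sketch in Python, assuming you’ve exported keyword data from your rank tracker of choice. The field names, domains, and figures are illustrative placeholders, not from any specific tool:

```python
# Hypothetical content-gap comparison. The rows stand in for an export
# from your rank tracker; adapt the fields to whatever your tool provides.
keyword_rows = [
    {"domain": "yoursite.com",   "keyword": "crm software", "position": 4,  "traffic": 1200},
    {"domain": "competitor.com", "keyword": "crm software", "position": 2,  "traffic": 3100},
    {"domain": "competitor.com", "keyword": "best crm",     "position": 35, "traffic": 150},
]

def keywords_in_positions_1_to_30(rows, domain):
    """Count keywords ranking in positions 1-30 for one domain."""
    return sum(1 for r in rows if r["domain"] == domain and 1 <= r["position"] <= 30)

def avg_traffic_per_editorial_url(total_editorial_traffic, editorial_url_count):
    """Average traffic per editorial URL, as described above."""
    return total_editorial_traffic / editorial_url_count

for d in ("yoursite.com", "competitor.com"):
    print(d, keywords_in_positions_1_to_30(keyword_rows, d))
print(avg_traffic_per_editorial_url(54_000, 120))  # placeholder totals
```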
Links
Then, we're going to be looking at links. We're going to begin with link gap analysis. That is we're going to look at how many links your competitors have and how many referring domains are pointing to your competitors. Then, we're going to use this to measure link growth. We're going to look at how many links your competitors had 6 months ago or 12 months ago if your market is a little slower, and we're going to get a percentage of growth out of that. That's going to indicate to you whether your search market is very aggressive with link building and you need to make an effort to keep up or it's a little bit more relaxed. Then, we're going to be looking at branded search. So how many people are looking for your competitors' brands versus how many people are looking for your brand? That's going to indicate the level of brand awareness that you have within your target audience in comparison to your competitors.
And we're going to take it one step further, and we're going to be looking again at branded traffic. There should be a very, very correlated relation between branded search and branded traffic. If you're first for branded search, you should be first for branded traffic and so on. But if there isn't, it might be an indicator that you don't have content within your site that's responding to the users' queries about your brand. So that's definitely a very quick win that you could action right now.
Technical SEO
Lastly, we're going to be looking at tech SEO, and this is incredibly difficult to measure because the requirements in tech SEO vary from website to website, from vertical to vertical. I am personally in the SaaS market, so my requirements for tech SEO is essentially make it readable and make sure that JavaScript is not blocking anything, classic crawling and rendering issues, and that's about it. But if you're in e-commerce, you're likely dealing with faceted navigation. You're dealing with filter management, and it's a little bit more demanding. So the best way that I have found to measure tech SEO changes and performance is Core Web Vital scores. We're going to go on the Chrome UX Report on Data Studio, and we're going to look at the main three Core Web vitals, grab the percentage of good URLs according to Google, and then we're going to average them out into one score. Then we're going to be looking at page speed. You can do this with PageSpeed Insights, and we're going to be looking at the scores for mobile versus desktop. I don't average these out because I think they provide really useful information of what issues your industry is running into when it comes to mobile usability. And then lastly, we're going to do some manual checks. Take a look at the robots.txt, take a look at the sitemap, how they manage canonicalization, and that's going to inform you better on how you could outperform your competitors.
And if this seems very complicated, don't worry. I have provided a free template for you so that you can make it yours.
Thank you so much for watching my Whiteboard Friday. My name is Lidia Infante, and you can find me on Twitter @LidiaInfanteM or on my website, lidia-infante.com. See you soon!
Today, I’m doing a quick follow-up to the manual portion of our earlier study in an effort to quantify and illustrate this abrupt alteration.
A total sea change in local pack headers
Between July and November of 2022, 83% of our previously queried local pack headers underwent a complete transformation of nomenclature. Only 17% of the local pack headers were still worded the same way in autumn as they had been in the summertime. Here is a small set of examples:
In our manual analysis of 60 queries in July, we encountered 40 unique local pack headers, a tremendous variety. Now, all specificity is gone. For all of our queries, headings have been reduced to just three types: “in-store availability”, “places”, and “businesses”.
Entity relationships remain mysterious
What hasn’t changed is my sense that the logic underpinning which businesses receive which local pack header remains rather odd. In the original study, we noted the mystery of why a query like “karate” fell under the heading of “martial arts school” while a query for “tai chi” got a unique “tai chi” heading, or why “adopt dog” results were headed “animal rescue services” but “adopt bunny” got a pack labeled “adopt bunny”. The curious entity relationships continue, even in this new, genericized local pack header scenario. For example, why is my search for “tacos” (which formerly brought up a pack labeled “Mexican restaurants”) now labeled like this:
But my search for “oil change” gets this header:
Is there something about a Mexican restaurant that makes it more of a “place”, and an oil change spot that makes it more of a “business”? I don’t follow the logic. Meanwhile, why are service area businesses, as shown in my search for “high weed mowing”, being labeled “places”?
Surely high weed mowing is not a place…unless it is a philosophical one. Yet I saw many SABs labeled this way instead of as “businesses”, which would seem a more rational label, given Google’s historic distinction between physical premises and go-to-client models. There are many instances like this of the labeling not making much horse sense, and with the new absence of more specific wording, it feels like local pack headers are likely to convey less meaning and be more easily overlooked now.
Why has Google done this and does it matter to your local search marketing?
Clearly, Google decided to streamline their classifications. There may be more than three total local pack header types, but I have yet to see them. Hotel packs continue to have their own headings, but they have always been a different animal:
In general, Google experiments with whatever they think will move users about within their system, and perhaps they felt the varied local pack headers were more of a distraction than an aid to interactivity with the local packs. We can’t know for sure, nor can we say how long this change will remain in place, because Google could bring back the diverse headings the day after I publish this column!
As to whether this matters to your local search campaigns, unfortunately, the generic headers do obscure former clues to the mind of Google that might have been useful in your SEO. I previously suggested that local businesses might want to incorporate the varied local pack terms into the optimization of their website tags and text, but in the new scenario, it is likely to be pointless to optimize anything for “places”, “businesses”, or “in-store availability”. It’s a given that your company is some kind of place or business if you’re creating a Google Business Profile for it. And your best bet for featuring that you carry certain products is to publish them on your listing and consider whether you want to opt into programs like Pointy.
In sum, this change is not a huge deal, but I’m a bit sorry to see the little clues of the diversified headers vanish from sight. Meanwhile, there’s another local pack trend going on right now that you should definitely be paying attention to…
A precipitous drop in overall local pack presence
In our original study, Google did not return a local pack for 18% of our manual July queries. By November, the picture had significantly changed: a startling 42% of our queries suddenly no longer displayed a local pack. This is right in line with Andrew Shotland’s documentation of a 42.3% drop from peak local pack display between August and October. MozCast, pictured above, captured a drop from 39.6% of queries returning local packs on October 24th to just 25.1% on October 25th. The number has remained in the low-to-mid 20s in the ensuing weeks. It’s enough of a downward slope to give one pause.
Because I’m convinced of the need for economic localism as critical to healing the climate and society, I would personally like Google to return local packs for all commercial queries so that searchers can always see the nearest resource for purchasing whatever they need, but if Google is reducing the number of queries for which they deliver local results, I have to try to understand their thinking.
To do that, I have to remember that the presence of a local pack is a signal that Google believes a query has a local intent. Likely, they often get this right, but I can think of times when a local result has appeared for a search term that doesn’t seem to me to be obviously, inherently local. For example, in the study Dr. Pete and I conducted, we saw Google not just returning a local pack for the keyword “pickles” but even giving it its own local pack header:
If I search for pickles, am I definitely looking for pickles near me, or could I be looking for recipes, articles about the nutritional value of pickles, the history of pickles, something else? How high is Google’s confidence that vague searches like these should be fulfilled with a local result?
After looking at a number of searches like these in the context of intent, my current thinking is this: for some reason unknown to us, Google is dialing back presumed local intent. Ever since Google made the user the centroid of search and began showing us nearby results almost by default for countless queries, we users became trained not to have to add many (or any) modifiers to our search language to prompt Google to lay out our local options for us. We could be quite lazy in our searches and still get local results.
In the new context of a reduced number of searches generating local packs, though, we will have to rehabituate ourselves to writing more detailed queries to get to what we want if Google no longer thinks our simple search for “pickles” implies “pickles near me”. I almost get the feeling that Google wants us to start being more specific again because its confidence level about what constitutes a local search has suffered some kind of unknown challenge.
It’s also worth throwing into our thinking what our friends over at NearMedia.co have pointed out:
“The Local Pack’s future is unclear. EU’s no “self-preferencing” DMA takes effect in 2023. The pending AICOA has a similar language.”
It could be that Google’s confidence is being shaken in a variety of ways, including by regulatory rulings, and local SEOs should always expect change. For now, though, local businesses may be experiencing some drop in their local pack traffic and CTR. On the other hand, if Google is getting it right, there may be no significant loss. If your business was formerly showing up in a local pack for a query that didn’t actually have a local intent, you likely weren’t getting those clicks anyway because a local result wasn’t what the searcher was looking for to begin with.
That being said, I am seeing examples in which I feel Google is definitely getting it wrong. For instance, my former searches for articles of furniture all brought up local packs with headings like “accent chairs” or “lamps”. Now, Google is returning no local pack for some of these searches and is instead plugging an enormous display of remote, corporate shopping options. There are still furniture stores near me, but Google is now hiding them, and that disappoints me greatly:
So here’s today’s word to the wise: keep working on the organic optimization of your website and the publication of helpful content. Both will underpin your key local pack rankings, and as we learned from our recent large-scale local business review survey, 51% of consumers are going to end up on your site as their next step after reading reviews on your listings. 2023 will be a good year to invest in the warm and inclusive welcome your site is offering people, and the investment will also stand you in good stead however local pack elements like headers, or even local packs themselves, wax and wane.
Despite the resources they can invest in web development, large e-commerce websites still struggle with SEO-friendly ways of using JavaScript.
And even though 98% of all websites use JavaScript, it’s still common for Google to have problems indexing pages that rely on it. While it’s okay to use JavaScript on your website in general, remember that it requires extra computing resources to be processed into HTML code understandable by bots.
At the same time, new JavaScript frameworks and technologies are constantly arising. To give your JavaScript pages the best chance of indexing, you'll need to learn how to optimize it for the sake of your website's visibility in the SERPs.
Why is unoptimized JavaScript dangerous for your e-commerce?
By leaving JavaScript unoptimized, you risk your content not getting crawled and indexed by Google. And in the e-commerce industry, that translates to losing significant revenue, because your products become impossible to find via search engines.
It’s likely that your e-commerce website uses dynamic elements that are pleasant for users, such as product carousels or tabbed product descriptions. This JavaScript-generated content very often is not accessible to bots. Googlebot cannot click or scroll, so it may not access all those dynamic elements.
Consider how many of your e-commerce website users visit the site via mobile devices. JavaScript is slower to load, so the longer it takes, the worse your website’s performance and user experience become. If Google realizes that it takes too long to load your JavaScript resources, it may skip them when rendering your website in the future.
Top 4 JavaScript SEO mistakes on e-commerce websites
Now, let’s look at some top mistakes when using JavaScript for e-commerce, and examples of websites that avoid them.
1. Page navigation relying on JavaScript
Crawlers don’t act the same way users do on a website ‒ they can’t scroll or click to see your products. Bots must follow links throughout your website structure to understand and access all your important pages fully. Otherwise, relying only on JavaScript-based navigation may mean that bots see the products on just the first page of pagination.
Guilty: Nike.com
Nike.com uses infinite scrolling to load more products on its category pages. And because of that, Nike risks its loaded content not getting indexed.
For the sake of testing, I entered one of their category pages and scrolled down until a product that loads only on scroll appeared. Then, I used the “site:” command to check if its URL was indexed in Google. And as you can see in the screenshot below, this URL is impossible to find on Google:
Of course, Google can still reach your products through sitemaps. However, finding your content in any other way than through links makes it harder for Googlebot to understand your site structure and dependencies between the pages.
To make it even more apparent to you, think about all the products that are visible only when you scroll for them on Nike.com. If there’s no link for bots to follow, they will see only 24 products on a given category page. Of course, for the sake of users, Nike can’t serve all of its products on one viewport. But still, there are better ways of optimizing infinite scrolling to be both comfortable for users and accessible for bots.
Winner: Douglas.de
Unlike Nike, Douglas.de uses a more SEO-friendly way of serving its content on category pages.
They provide bots with page navigation based on <a href> links to enable crawling and indexing of the next paginated pages. As you can see in the source code below, there’s a link to the second page of pagination included:
Moreover, the paginated navigation may be even more user-friendly than infinite scrolling. The numbered list of category pages may be easier to follow and navigate, especially on large e-commerce websites. Just think how long the viewport would be on Douglas.de if they used infinite scrolling on the page below:
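If you want to run this kind of check yourself, here’s a minimal sketch that fetches the raw HTML (with no JavaScript executed, which is roughly what a non-rendering crawler first sees) and lists any pagination links it finds. The URL and the "?page=" pattern are assumptions; adjust both to the site you’re auditing:

```python
# Check whether paginated category pages are reachable via plain <a href>
# links in the initial HTML, without executing any JavaScript.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/category"  # hypothetical category page
html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

pagination_links = [a["href"] for a in soup.find_all("a", href=True)
                    if "?page=" in a["href"]]
print(pagination_links or "No crawlable pagination links in the initial HTML")
```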
2. Generating links to product carousels with JavaScript
Product carousels with related items are one of the essential e-commerce website features, and they are equally important from both the user and business perspectives. Using them can help businesses increase their revenue as they serve related products that users may be potentially interested in. But if those sections over-rely on JavaScript, they may lead to crawling and indexing issues.
Guilty: Otto.de
I analyzed one of Otto.de’s product pages to identify if it includes JavaScript-generated elements. I used the What Would JavaScript Do (WWJD) tool that shows screenshots of what a page looks like with JavaScript enabled and disabled.
Test results clearly show that Otto.de relies on JavaScript to serve related and recommended product carousels on its website. And from the screenshot below, it’s clear that those sections are invisible with JavaScript disabled:
How might this affect the website’s indexing? When Googlebot lacks the resources to render JavaScript-injected links, the product carousels can’t be found and then indexed.
Let’s check if that’s the case here. Again, I used the “site:” command and typed the title of one of Otto.de’s product carousels:
As you can see, Google couldn’t find that product carousel in its index. And the fact that Google can’t see that element means that accessing additional products will be more complex. Also, if you prevent crawlers from reaching your product carousels, you’ll make it more difficult for them to understand the relationship between your pages.
Winner: Target.com
In the case of Target.com’s product page, I used the Quick JavaScript Switcher extension to disable all JavaScript-generated elements. I paid particular attention to the “More to consider” and “Similar items” carousels and how they look with JavaScript enabled and disabled.
As shown below, disabling JavaScript changed the way the product carousels look for users. But has anything changed from the bots' perspective?
To find out, check what the HTML version of the page looks like to bots by analyzing its cached version.
When scrolling, you’ll see that the links to related products can also be found in the cache. If you see them here, it means bots don’t struggle to find them, either.
However, keep in mind that the links to the exact products you can see in the cache may differ from the ones on the live version of the page. It’s normal for the products in the carousels to rotate, so you don’t need to worry about discrepancies in specific links.
But what exactly does Target.com do differently? They take advantage of dynamic rendering: they serve the initial HTML, including the links to the products in the carousels, as static HTML that bots can process.
However, you must remember that dynamic rendering adds an extra layer of complexity that may quickly get out of hand with a large website. I recently wrote an article about dynamic rendering that’s a must-read if you are considering this solution.
Also, the fact that crawlers can access the product carousels doesn’t guarantee these products will get indexed. However, it will significantly help them flow through the site structure and understand the dependencies between your pages.
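For illustration only, here’s a bare-bones sketch of the user-agent routing that dynamic rendering setups rely on. I’m using Python’s Flask purely as an example (Target’s actual stack is unknown to me), and the prerendered snapshot files are assumed to exist:

```python
# Dynamic rendering in miniature: bots get a prerendered HTML snapshot,
# while regular users get the JavaScript app shell.
from flask import Flask, request

app = Flask(__name__)
BOT_TOKENS = ("googlebot", "bingbot", "duckduckbot")

def is_bot(user_agent: str) -> bool:
    # Crude user-agent sniffing; real setups verify crawlers more carefully.
    return any(token in user_agent.lower() for token in BOT_TOKENS)

@app.route("/product/<slug>")
def product(slug):
    if is_bot(request.headers.get("User-Agent", "")):
        # Static snapshot rendered ahead of time (assumed to exist on disk).
        return app.send_static_file(f"prerendered/{slug}.html")
    return app.send_static_file("app_shell.html")  # client-side rendered app
```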
3. Blocking important JavaScript files in robots.txt
Blocking JavaScript for crawlers in robots.txt by mistake may lead to severe indexing issues. If Google can’t access and process your important resources, how is it supposed to index your content?
Guilty: Jdl-brakes.com
It’s impossible to fully evaluate a website without a proper site crawl. But looking at its robots.txt file can already allow you to identify any critical content that’s blocked.
This is the case with the robots.txt file of Jdl-brakes.com. As you can see below, they block the /js/ path with the Disallow directive. It makes all internally hosted JavaScript files (or at least the important ones) invisible to all search engine bots.
Misusing the Disallow directive this way may result in rendering problems across your entire website.
To check if it applies in this case, I used Google’s Mobile-Friendly Test. This tool can help you navigate rendering issues by giving you insight into the rendered source code and the screenshot of a rendered page on mobile.
I headed to the “More info” section to check if any page resources couldn’t be loaded. Using the example of one of the product pages on Jdl-brakes.com, you may see it needs a specific JavaScript file to get fully rendered. Unfortunately, it can’t happen because the whole /js/ folder is blocked in its robots.txt.
But let’s find out if those rendering problems affected the website’s indexing. I used the “site:” command to check if the main content (product description) of the analyzed page is indexed on Google. As you can see, no results were found:
This is an interesting case where Google could reach the website's main content but didn’t index it. Why? Because Jdl-brakes.com blocks its JavaScript, Google can’t properly see the layout of the page. And even though crawlers can access the main content, it’s impossible for them to understand where that content belongs in the page’s layout.
Let’s take a look at the Screenshot tab in the Mobile-Friendly Test. This is how crawlers see the page’s layout when Jdl-brakes.com blocks their access to CSS and JavaScript resources. It looks pretty different from what you can see in your browser, right?
The layout is essential for Google to understand the context of your page. If you’d like to know more about this crossroads of web technology and layout, I highly recommend looking into a new field of technical SEO called rendering SEO.
Winner: Lidl.de
Lidl.de proves that a well-organized robots.txt file can help you control your website’s crawling. The crucial thing is to use the disallow directive consciously.
Although Lidl.de blocks a single JavaScript file with the Disallow directive /cc.js*, this doesn’t seem to affect the website’s rendering process. The important thing to note here is that they block only a single JavaScript file that doesn’t influence other URL paths on the website. As a result, all the other JavaScript and CSS resources they use should remain accessible to crawlers.
If you run a large e-commerce website, you can easily lose track of all the added directives. Always include as many path fragments of a URL you want to block from crawling as possible; it will help you avoid blocking crucial pages by mistake.
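If you’d rather audit this programmatically than eyeball the file, here’s a minimal sketch using Python’s standard-library robots.txt parser. Keep in mind it doesn’t implement every wildcard rule Google supports, so treat it as a first pass; the URLs are placeholders:

```python
# First-pass check: are rendering-critical assets blocked by robots.txt?
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

assets = [
    "https://example.com/js/main.js",
    "https://example.com/css/styles.css",
]
for asset in assets:
    verdict = "allowed" if rp.can_fetch("Googlebot", asset) else "BLOCKED"
    print(f"{asset}: {verdict}")
```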
4. JavaScript removing main content from a website
If you use unoptimized JavaScript to serve the main content on your website, such as product descriptions, you block crawlers from seeing the most important information on your pages. As a result, your potential customers looking for specific details about your products may not find such content on Google.
Guilty: Walmart.com
As you can see above, the product description section on Walmart.com disappears with JavaScript disabled. I decided to use the “site:” command to check whether Google had indexed this content. I copied the fragment of the product description I saw on the page with JavaScript enabled. However, Google didn’t show the exact product page I was looking for.
Will users be determined enough to hunt down that particular product via Walmart.com? They may, but they can also head to any other store selling this item instead.
The example of Walmart.com proves that main content that depends on JavaScript to load is harder for crawlers to find and display. However, that doesn’t necessarily mean Walmart should eliminate all JavaScript-generated elements from its website.
To fix this problem, Walmart has two solutions:
Implementing dynamic rendering (prerendering), which is, in most cases, the easiest option from an implementation standpoint.
Implementing server-side rendering. This is the solution that will solve the problems we are observing at Walmart.com without serving different content to Google and users (as in the case of dynamic rendering). In most cases, server-side rendering also helps with web performance issues on lower-end devices, as all of your JavaScript is being rendered by your servers before it reaches the client's device.
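Whichever route a site takes, the fix can be verified the same way the problem was found: check whether the main content appears in the initial HTML response, before any JavaScript runs. Here’s a minimal sketch of that check; the URL and description snippet are placeholders:

```python
# Does the main content exist in the raw HTML, before any JavaScript runs?
import requests

url = "https://example.com/product/123"  # hypothetical product page
snippet = "a unique phrase from the product description"

html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
if snippet.lower() in html.lower():
    print("Main content is present without JavaScript; crawlers can see it.")
else:
    print("Main content is injected by JavaScript; consider SSR or prerendering.")
```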
Let’s have a look at the JavaScript implementation that’s done right.
Winner: IKEA.com
IKEA proves that you can present your main content in a way that is accessible for bots and interactive for users.
On IKEA.com’s product pages, product descriptions are served behind clickable panels. When you click on one, the details dynamically appear on the right-hand side of the viewport.
Although users need to click to see product details, IKEA also serves that crucial part of its pages even with JavaScript off:
This way of presenting crucial content should make both users and bots happy. From the crawlers’ perspective, serving product descriptions that don’t rely on JavaScript makes them easy to access. Consequently, the content can be found on Google.
Wrapping up
JavaScript doesn’t have to cause issues if you know how to use it properly. As an absolute must-do, you need to follow the best practices for indexing. Doing so can help you avoid the basic JavaScript SEO mistakes that significantly hinder your website’s visibility on Google.
Take care of your indexing pipeline and check if:
You allow Google access to your JavaScript resources,
Google can access and render your JavaScript-generated content. Focus on the crucial elements of your e-commerce site, such as product carousels or product descriptions.