CBPO


Hero Conf London 2019 Speaker Recap: Milka Kramer

November 25, 2019

Key takeaways from Milka Kramer’s captivating presentation, The Data-Driven Marketer’s Blueprint for Success.

Read more at PPCHero.com
PPC Hero


Hero Conf London 2019 Speaker Recap: Sarah Barker

November 24, 2019

SEO and PPC can and should work together was the premise of Sarah Barker’s keynote speech at Hero Conf London this year. Let’s take a dive into the why and how of collaborating effectively between the two specialties.

Read more at PPCHero.com
PPC Hero


2019 Google core algorithm updates: Lessons and tips to future-proof your SEO

November 19, 2019

There’s nothing that beats that organic #1 position in Google’s SERPs when it comes to brand visibility, increase in traffic, trust factor boost, reduction in cost per lead, and so on.

Everyone who’s anyone in online business knows this, which is why the struggle to grab that marketer’s Holy Grail can look like a cut-throat business to many SEO novices.

However, even SEO pros get confused when Google throws a wrench into the intricate workings of the rankings machine. Google’s core algorithm updates can mess up even the best SEO strategies, especially if you react in a panic to a drop in the rankings.

Today, I’ll share with you the three things I’ve learned from 2019 Google algorithm updates that will help you future-proof your SEO. First, however, take a look at the hints that Google rolled out alongside those updates to see if you’re building your SEO strategy on a healthy foundation.

2019 Google core algorithm updates and what they tell us

In 2018, Google reported 3234 algorithm updates.

That’s just a bit shy of 9 updates per day.

All of them change how the algorithm evaluates a website and its rankings, though most only slightly.

However, three of them were so-called ‘core algorithm updates’ – meaning that their impact on the rankings was likely significant for most indexed websites. Google announced these (in March, June, and September of 2019), which is not something that they normally do. This should give you an idea of how important they were in the grand scheme of all things SEO-related.

Google Search Liaison’s tweet on the 2019 Google core algorithm updates

Websites were affected differently, with some seeing increases in their rankings and traffic, and others plummeting to Google’s page #3. Many of the sites that experienced significant drops are in the Your Money, Your Life (YMYL) niche.

(Verywellhealth.com shows a significant drop after the March core update)

“The sensitive nature of the information on these types of websites can have a profound impact on people’s lives,” says Paul Teitelman of Paul Teitelman SEO Agency. “Google has long struggled with this, and at least one of these core algorithm updates was designed to push trustworthy YMYL content to the top while sinking those websites that contain dubious and untrustworthy information.”

Google signaled a path forward with these updates. If you were not paying attention, here are the key takeaways:

  • Google signals an intent to keep rewarding fresh, complete, and unique content. Focus on answering the searcher’s questions thoroughly and precisely.
  • E-A-T (Expertise, Authoritativeness, Trustworthiness) guidelines are more important than ever. Things like backlinks from reputable websites, encryption, and who authors your posts can make or break your organic rankings.
  • Google wants to see you covering a wide range of topics from your broader niche. Increase your relevance with content that establishes you as the go-to source in your niche.

SEO is far from an exact science.

If anything, it’s educated guesswork based on countless hours of testing, tweaking, and then testing again.

Still, there are things that you can do to future-proof your SEO and protect your websites from reacting too violently to core algorithm updates.

Based on Google’s recent hints, here are three things that you should focus on if you’re going after those page #1 rankings in the SERPs.

Three tips to future-proof your website’s SEO

Keep the focus on high-quality, actionable content

I know you’re tired of hearing it by now, but high-quality content is a prerequisite to ranking at the top of the SERPs and staying there.

This means that you need to pinpoint a specific question that the searcher wants answered and then write a piece of content that clarifies the issue in detail. Does it need to be 5,000 words long? That depends on the question but, in most cases, it doesn’t. What it needs to be is concise and thorough, addressing any and all questions that the searcher might have while reading it.

Ideally, you will want your content to be 1,500+ words. According to research by Backlinko’s Brian Dean, Google tends to reward longer content.

 

Source: https://backlinko.com/search-engine-ranking

My advice is to ask yourself the following questions when you’re writing:

  • Am I providing the reader with a comprehensive answer to their question?
  • Is my content more thorough than what’s already on the #1 page of the SERPs?
  • Am I presenting the information in a trustworthy way (citing sources, quoting experts)?
  • Is my content easy to understand, and free from factual, stylistic, and grammar errors?

If your answer to these questions is a yes, you’re already doing better than (probably) 95% of your competitors.

Improve the E-A-T score of your website

In SEO, E-A-T stands for Expertise, Authoritativeness, and Trustworthiness.

In other words – who is authoring blog posts and articles that are published on your website? Are they penned by an expert in the field or by a ghostwriter?

Why should people trust anything you (or your website) have to say? That’s the crux of E-A-T.

The concept appears in Google’s Quality Raters’ Guidelines (QRG), and SEO experts have debated for years whether or not it has any bearing on the actual organic rankings.

In 2018, Google cleared up those doubts, confirming that the QRG is, in fact, their blueprint for developing the search algorithm. “You can view the rater guidelines as where we want the search algorithm to go,” Ben Gomes, Google’s vice president of search, assistant and news, said in a CNBC interview.

Here’s what the QRG has to say about E-A-T:

Source: https://static.googleusercontent.com/media/guidelines.raterhub.com/en//searchqualityevaluatorguidelines.pdf

We have no idea if Google’s core algorithm can evaluate E-A-T parameters as well as an actual human rater. Still, if that’s Google’s end goal, it’s a good idea to pay attention to E-A-T now, regardless of whether it’s fully implemented yet – it most certainly will be at some point in the future.

To improve your E-A-T score, focus on the following:

  • Add an author byline to your posts – every post that you publish should be authored by someone. Use your real name (or your author’s real name), and start building a reputation as an expert in the field (a markup sketch for this follows the list).
  • Create your personal website – even if you’re trying to rank your business site, make sure to have a personal branding website of your own (and of any regularly contributing authors). Those websites should be maintained – you don’t need to SEO the heck out of them but you should publish niche-relevant content regularly.
  • Get featured on Wikipedia and authority websites – QRG clearly instructs raters to check for author mentions on Wikipedia and other relevant sites. That stands to reason because experts in the field will often be quoted by other publications.

(Image source: https://static.googleusercontent.com/media/guidelines.raterhub.com/en//searchqualityevaluatorguidelines.pdf)

  • Get mentions on forums – same goes for forum mentions. If people name-drop you on relevant forums, that means that they feel you have something important to say.
  • Secure your site with HTTPS – security is an important E-A-T factor, especially if you’re selling something via your website. An unsecured website will have a low E-A-T score so make sure to invest in encryption to boost trustworthiness.
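
To make authorship machine-readable as well as visible, you can mark up articles with schema.org structured data, which Google’s structured data documentation supports in JSON-LD form. Below is a minimal sketch – the headline, author name, and URL are placeholders, and to be clear, Google has not confirmed that markup by itself improves any E-A-T evaluation:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example post title",
      "author": {
        "@type": "Person",
        "name": "Jane Doe",
        "url": "https://janedoe.example.com/"
      }
    }
    </script>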

Build quality backlinks and establish a social presence

Quality backlinks are still a very important ranking factor.

However, according to a report released by Backlinko, it’s not about one or two backlinks, regardless of how strong they are.

What moves the ranking needle are sustainable, evergreen link-building strategies – backlinks from trusted, niche-related websites that are acquired by white hat SEO methods such as blogger outreach, guest posting, and collaborations with other influencers in the niche. The more of these types of backlinks you get, the better your organic rankings.

Additionally, getting backlinks from a greater number of referring domains ensures that your rankings are protected if, for example, a couple of those websites get shut down or penalized in the future. When you’re playing the link-building game, it pays to think ahead.

(Image Source: https://backlinko.com/google-ranking-factors)

And, while they don’t carry the same weight as true backlinks, you’d be wrong to underestimate the value Google’s ranking algorithm places on social media signals.

A truly authoritative website – and all the authors that write for it – will have a strong social media presence. They will use it to amplify their message, build additional authority, and drive traffic to their website. Ahrefs’ Tim Soulo does this better than any other SEO expert that I know.


All of this will affect the aforementioned E-A-T parameters. If nothing else, it will distribute your name far and wide, signaling to Google that you’re not a complete nobody who just happens to run a website or write a blog about a certain topic. The stronger your social media presence – the more followers, comments, and shares you earn – the better it is for your E-A-T.

Get people to trust you and the algorithm will follow

Pretty soon, the key to top rankings will be how believable and trustworthy you are. Google’s current insistence on E-A-T parameters clearly demonstrates that. Everything else will be just the icing on the cake after that – the fancy schema you’re using, the on-page SEO gimmicks, and all the other loopholes SEO experts are now using to rank their websites.

I’m interested to hear what you think about the direction that Google is taking with this year’s algorithm updates. Have any of your websites been affected? Leave a comment below and let’s discuss.

The post 2019 Google core algorithm updates: Lessons and tips to future-proof your SEO appeared first on Search Engine Watch.

Search Engine Watch


Only four days left to buy early-bird passes to Disrupt Berlin 2019

November 12, 2019

Last week, we extended the early-bird pricing on passes to Disrupt Berlin 2019 until 15 November at 11:59 p.m. (CEST). Consider it distinctly non-divine intervention from Expeditus, the patron saint of procrastinators (and speedy causes). The countdown continues, and you have just four days left to save serious dough — we’re talking up to €500 depending on the type of pass you purchase.

No matter what role you play in the startup world, you’ll find tremendous value at Disrupt Berlin. Add even more value — buy an early-bird pass to Disrupt Berlin before the early bird flies away for good on 15 November at 11:59 p.m. (CEST).

Disrupt Berlin draws attendees from more than 50 countries across Europe and beyond, making it an international celebration of all things startup. This is the place to see the latest tech from innovative early-stage startups, and you’ll find hundreds of them exhibiting in Startup Alley. Don’t miss the Country Pavilions where you’ll find delegations from different countries showcasing the best of their up-and-coming startups.

You’ll also find TC Top Picks exhibiting in Startup Alley. Our editors selected up to five startups they feel represent the most interesting use of technology in the following categories: AI/Machine Learning, Biotech/Healthtech, Blockchain, Fintech, Mobility, Privacy/Security, Retail/E-commerce, Robotics/IoT/Hardware, CRM/Enterprise and Education. Come to meet, greet and network with the founders who earned the coveted Top Pick designation.

With so many exhibiting startups to see, not to mention all the founders, investors and technologists roaming around the Berlin Arena, how can you cut through the noise to find the people who align with your business goals and interests? Use CrunchMatch, our free business-matchmaking tool that slays the old needle-in-a-haystack approach to networking.

We’ll email all registered attendees when we launch CrunchMatch, and we’ll explain how to access the platform. You then create a professional profile outlining your role and the specific types of people and connections you want to make. CrunchMatch will find and suggest matches and — with your approval — suggest meetings, send out meeting requests and schedule appointments. Closing the deal? That’s up to you.

Beyond all the networking opportunities, you’ll have the chance to learn from and engage with tech and investing experts and icons. Hear from world-class speakers, attend smaller Q&A Sessions where you have the chance to get your pressing questions answered, watch the Startup Battlefield and don’t miss the Hackathon finalists pitch on the Extra Crunch Stage. Check out the Disrupt Berlin agenda.

Join us on 11-12 December for all the value and opportunity Disrupt Berlin 2019 offers. And remember, you have just four more days to grab all the value you can. Channel Saint Expeditus and beat the deadline. Buy your early-bird pass before 15 November at 11:59 p.m. (CEST). We’ll see you in Berlin!

Is your company interested in sponsoring or exhibiting at Disrupt Berlin 2019? Contact our sponsorship sales team by filling out this form.


Startups – TechCrunch


Transformation of Search Summit 2019: Highlight reel

October 29, 2019

On Friday we held the Transformation of Search Summit 2019 here in New York City. Huge thank you to all of our speakers, attendees, and sponsors who made the day a success!

In this article we’ve compiled some key quotes, stats, and otherwise tweetable highlights from the event.

Keynote: The transformation of search

First we heard from Carolyn Shelby, SEO Manager, Audience Development at the Walt Disney Company / ESPN.

One of the key quotes from her session was “The trick is to understand the psychology of people. Get in front of the consumer. That’s where search engines are going. What is the least amount of thinking that I can make a consumer do? How can I get them what they want the fastest?”

She also walked us through a brief SERP evolution, from collecting and organizing, to scoring / ranking relevancy, to now delivering immediate gratification.

The future of search is visual

Next up we heard from Michael Akkerman of Pinterest on the growth of visual search and its role in the future.

He talked about the evolution of consumer expectations, from physical stores, to digital convenience, to omnichannel promise, to the inspired shopping of today.

Where it once may have seemed that consumers were only focused on convenience, we’re now seeing the re-emergence of shopping and discovery in the consumer experience.

He also talked about the role of Pinterest in consumer discovery. On Pinterest, he says, they have billions of text-based searches every month. Of those, 90% are non-brand searches. “People don’t know what they want,” he says. For brands looking to focus on the discovery portion of the consumer journey, Pinterest could be a great option.

Michael was joined on stage by Dave Fall, CEO of BrandNetworks. They did a Q&A about what brands can do to get started with visual search.

For many brands, they said, it can feel like there’s a big barrier to entry or that it has to be a huge undertaking. But, they noted, remember that your brand does have visual assets already — think about what you use for your website, display ads, Amazon product listings, etc. Consider how you can re-purpose those to get started.

What DTCs and legacy brands can learn from each other 

Next we heard from Kerry Curran of Catalyst (GroupM). She talked about what brands can do to flip their performance marketing mindsets.

One particularly interesting finding she shared was that in campaigns, when brands communicate like a human, it can improve conversion by 900%.

She also noted that in the US, women over age 50 have $15 trillion in buying power. For many marketers, it might seem like younger generations have more appeal — but older generations have deeper pockets.

Embarking on a search transformation project

After this, we had a panel discussion on “embarking on a search transformation project.”

The panel included experts from Conde Nast, Microsoft, Mindshare, Volvo, and McKinsey.

John Shehata from Conde Nast shared some work they did to refresh and consolidate older content in order to boost keyword visibility by up to 1000%.

The challenge, as he pointed out, is that 90% of online content was created in the last two years, and 90% of that content gets no traffic. And, 50% of searches on Google end in no clicks. To face that, his team is working on taking past content, consolidating multiple pieces, and focusing on making each piece amazing.

Noel Reilly of Microsoft also touched on the speed at which new content is created. She encouraged marketers to think more broadly about what people want and are looking to discover. At Microsoft Ads, she said, 18% of queries each month are new queries.

When inputs are continuing to change so much, she recommended marketers really look at their search query reports to build content around those.

John Shehata of Conde Nast also spoke a bit about what they’re doing to prepare for voice search. Overall, he’s adopting a more conservative approach: investing a little, getting the foundation ready, and waiting for more clarity before diving into larger scale investment.

He likened the current discussion of voice search to the conversation about mobile a decade ago: “Remember when we said ‘mobile is here’ for ten years? But then it took ten years.”

And to wrap up from this session, we heard another great point from Noel of Microsoft: “The most successful brands I see are the ones putting people at the center of their advertising. Regardless of what the next big thing is in search, your job as a marketer is to understand your customer.”

Amazon search

Next we heard from John Denny with some interesting statistics and expert tips on Amazon search.

When it comes to how different generations search, he revealed that 52% of Gen Z named Amazon as their favorite site for shopping. The number two spot went to Nike, who claimed just 4% of votes — putting Amazon at 13 times that.

He also discussed three of the main options CPG brands have for driving purchases / traffic: a brand’s own website, a brand’s detail page on Amazon, and in-store traffic.

For the largest 100 CPG brands out there, he said, there was five times more traffic on the Amazon detail page plus in-store than there was on the brand’s own website.

His message: for brands not on Amazon, it might be time to consider it.

Optimizing for voice search

Next, we heard another panel, this time specifically on voice search, from Mastercard, Synup, and Advantix Digital.

While earlier in the day we heard a more cautious perspective from Conde Nast, this panel was a bit more bullish on voice search.

Synup CEO Ashwin Ramesh gave one interesting rationale for the rapid adoption of voice search globally in countries like India, Indonesia, and parts of Southeast Asia. In India, he says, 50% of all searches are already done via voice. “They’re leapfrogging markets,” he said. He also gave a personal example: his grandmother doesn’t type and has never used a computer, but she sends him voice messages via her iPad.

Paradigm shifts in search

After this we heard from Stephen Kraus, Head of Digital Insights at Jumpshot. He shared many interesting statistics about the current state of the search industry and how it’s shifting.

90% of all search happens on Google, he says, and it skews branded (unlike on Pinterest). Of the top ten most used search terms on Google in the past couple months, seven are brands: Google, Facebook, Amazon, YouTube, Walmart, Craigslist, and BMW.

The other three, interestingly, were “you,” “weather,” and “news.”

While 90% of all search happens on Google, when it comes to product-related search, 54% happens on Amazon.

Stay tuned for part two with highlights from the afternoon sessions, as well as some deep dives into specific insights!

The post Transformation of Search Summit 2019: Highlight reel appeared first on Search Engine Watch.

Search Engine Watch


The State of SEO 2019 – Infographic

October 12, 2019

Zazzle Media’s second annual “State of SEO survey” has assessed the value and ROI of SEO, looking at its impact in securing funds or resources.

The data suggested that 60% of marketers find that resources and a shortage of budget are the main reasons they don’t spend more on organic search activity. However, almost a third of surveyed marketers still don’t know how to measure the impact of SEO on their results.

The survey’s respondents were 70% in-house marketers and 30% agency heads from various companies. It called for marketers to develop a better understanding of attribution models, measurement tools, brand value, and purpose when it comes to spending more on SEO.

The main reason cited for marketers struggling to secure investment was competitor awareness: marketers are keenly aware of their competitors’ activity, even noting that their branded keywords were being targeted by their competitors.

The report noted that data-led objectives can act as investment enablers as they can easily quantify and measure consumer traffic. They also help marketers prove ROI, by reviewing how marketing practices are improving year on year.

Yet the survey revealed that there is still a lack of understanding around best practices for marketers to use. A quarter of those surveyed called for clearer guidelines on best practice from Google Webmasters, revealing that there is, in fact, a knowledge and skills gap around SEO.

Zazzle Media’s head of search and strategy, Stuart Shaw, said:

“As an industry, we’ve needed to educate, educate, educate – at almost every level of client infrastructure. That challenge still remains – in fact, it probably changes monthly, but now with more noise than ever.

However, knowledge has always been power in this industry; keeping up with updates, marketing news and best practice guidelines across Google and Bing can be the difference in the results marketers need to secure that extra budget.”

You can download the full results of The State of SEO here, and check out the top-line stats on the infographic below.

State of SEO 2019 Infographic

The post The State of SEO 2019 – Infographic appeared first on Search Engine Watch.

Search Engine Watch


Fall 2019 Updates to Google Merchant Center

October 10, 2019

Get a recap of the Fall 2019 updates to Google Merchant Center including updated diagnostics, design, navigation, and much more. 

Read more at PPCHero.com
PPC Hero


5 days left to save on passes to Disrupt Berlin 2019

October 7, 2019

A show of hands, startuppers. Who’s ready to save some money on passes to Disrupt Berlin 2019, our premier tech conference that takes place on 11-12 December? Then listen up, because our super early bird pricing ends in just five days. Right now, passes start at €345 + VAT and, depending on which pass you choose, you can save up to €600. Ka-ching!

Save your euros. Buy your passes to Disrupt Berlin before the Friday, 11 October at 11:59 p.m. (CEST) deadline. Then plan your strategy to make sure you take full advantage of Disrupt. Let’s look at what’s in store.

We’re talking two full days of programming. A roster of world-class speakers and panelists — founders, investors and icons. These are folks who have done the hard work in the trenches. They know how to succeed, and they’ll share their experiences, insights and advice.

We’re thrilled that our roster includes the likes of Julian Stiefel, co-founder/co-CEO of Tourlane. The company’s ongoing mission? Using a recent round of funding ($47 million) to address the challenging problems associated with booking group travel.

You’ll also hear from Jen Rubio, co-founder and chief brand officer of Away, one of the most successful consumer brands in years. How successful? The company, which launched in 2015, has sold more than 1 million suitcases, raised a $100 million round at a $1.4 billion valuation earlier this year and turned profitable in 2018. We’re guessing she might have just one or two tips for aspiring direct-to-consumer entrepreneurs.

Don’t miss the legendary entrepreneurial showdown that is Startup Battlefield. This epic pitch competition has, since its inception, launched 857 companies that have gone on to collectively raise $8.9 billion and produce 113 exits. Be in the room and cheer on some of the world’s top early-stage startups as they compete for the $50,000 equity-free prize, investor love and global media attention.

Ready to network? There’s no better place to start than Startup Alley, the Disrupt expo floor. You’ll find hundreds of innovative early-stage startups exhibiting their tech products, services and platforms. Make connecting with the people who can help move your business forward easier by using CrunchMatch.

Our business-matching platform makes it easier to find and connect with people who share your business interests. You create a profile listing your specific criteria and goals. The CrunchMatch algorithm suggests matches and, subject to your approval, proposes meeting times and sends meeting requests.

When you’re in Startup Alley, be sure to keep an eye out for our TC Top Picks. These companies, curated and selected by TechCrunch editors, represent the best early-stage startups in these categories: AI/Machine Learning, Biotech/Healthtech, Blockchain, Fintech, Mobility, Privacy/Security, Retail/E-commerce, Robotics/IoT/Hardware, CRM/Enterprise and Education.

An amazing slate of speakers, a world-class pitch competition, hundreds of exhibitors and full-tilt networking. That’s just a small taste of what’s waiting for you at Disrupt Berlin 2019 on 11-12 December. Why pay more than necessary? The super early bird pricing disappears on Friday, 11 October at 11:59 p.m. (CEST). Buy your passes here today.

Is your company interested in sponsoring or exhibiting at Disrupt Berlin 2019? Contact our sponsorship sales team by filling out this form.


Startups – TechCrunch


Optimizing for voice search in 2019: Q&A with Amine Bentahar

September 28, 2019

As we gear up for The Transformation of Search Summit at the end of October, we have another speaker Q&A. This time we’re hearing from Amine Bentahar about his upcoming session on voice search optimization.

Amine Bentahar is the Chief Digital and Operating Officer at Advantix Digital. He’s also an author and member of the Forbes Agency Council.


Amine’s session will be about “Optimizing for position 0: Everything you need to know about voice search.”

Tell us about your current work

Amine Bentahar: I’m the Chief Digital & Operating Officer at Advantix Digital. I’m in charge of operations and ensuring that we are delivering the best quality work and exceptional results for our clients.

I’m also responsible for the overall digital and marketing strategy for many of our key clients, which include publicly traded companies, companies backed by major VC and PE firms, and mid-sized companies from various industries.

What are your key priorities over the next twelve months?

AB: Implement a voice search strategy for all of our B2C and B2B clients, and continue to leverage voice search as a channel to drive new customer acquisitions for our clients. 

What is your biggest challenge in achieving those?

AB: Most companies haven’t allocated a budget specific to just voice search, and aren’t taking the time to truly understand how their customers are either looking for information or shopping through voice.

Because of this, we are having to spend a lot of time educating companies about the importance of having a voice search strategy and budget. 

What’s your advice to others who may be facing similar challenges?

AB: Educate your teams or clients on voice search and how it’s changing the way customers are shopping or looking for information. 

What’s an interesting trend you’re seeing in the market right now?

AB: The integration of voice search technology in cars, TVs, appliances and other devices. 

How do you expect it will change in the next 6-12 months?

AB: With all the money being invested in R&D by the big players (Amazon, Google, Apple and Microsoft), I would expect this trend to continue growing, and voice search technology to become available on even more devices.

Tell us a bit about your session at the Search Summit?

AB: My session will be about optimizing for voice search and more specifically about the steps companies must take to rank for position 0. We will help attendees understand how voice search works and how to develop organic content to be “read” by Alexa or Google Home. 

What are you looking forward to most at the Summit?

AB: I’m looking forward to meeting other thought leaders and marketers and learning from their experiences about things that are disrupting the search world. 

What’s one of your favorite search technologies and why?

AB: Voice search – I find it somewhat amazing, especially when you see the fast adoption rate of the technology and how it’s changing the way customers search.

What’s something you do every day that helps you be more successful or productive?

AB: I do my best to exercise every day, and I also take at least 30 minutes of my day to read about either marketing or management.

The post Optimizing for voice search in 2019: Q&A with Amine Bentahar appeared first on Search Engine Watch.

Search Engine Watch


Delete your pages and rank higher in search – Index bloat and technical optimization 2019

July 16, 2019

If you’re looking for a way to optimize your site for technical SEO and rank better, consider deleting your pages.

I know, crazy, right? But hear me out.

We all know Google can be slow to index content, especially on new websites. But occasionally, it can aggressively index anything and everything it can get its robot hands on, whether you want it or not. This can cause terrible headaches, hours of clean-up, and subsequent maintenance, especially on large sites and/or ecommerce sites.

Our job as search engine optimization experts is to make sure Google and other search engines can first find our content so that they can then understand it, index it, and rank it appropriately. When we have an excess of indexed pages, we are not being clear about how we want search engines to treat our pages. As a result, they take whatever action they deem best, which sometimes translates to indexing more pages than needed.

Before you know it, you’re dealing with index bloat.

What is index bloat?

Put simply, index bloat is when you have too many low-quality pages on your site indexed in search engines. Similar to bloating in the human digestive system (disclaimer: I’m not a doctor), the result of processing this excess content can be seen in search engine indices when their information retrieval process becomes less efficient.

Index bloat can even make your life difficult without you knowing it. In this puffy and uncomfortable situation, Google has to go through much more content than necessary (most of the time low-quality and internal duplicate content) before it can get to the pages you want it to index.

Think of it this way: Google visits your XML sitemap to find 5,000 pages, then crawls all your pages and finds even more of them via internal linking, and ultimately decides to index 30,000 URLs. That comes out to an indexation excess of 500% – or even more.

But don’t worry, diagnosing your indexation rate to measure against index bloat can be a very simple and straightforward check. You simply need to cross-reference which pages you want to get indexed versus the ones that Google is indexing (more on this later).
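
To preview what that cross-reference looks like in practice, here is a minimal sketch in Python, assuming you’ve exported two plain-text URL lists – one of the pages you want indexed (e.g., from your XML sitemap) and one of the pages Google reports as indexed (e.g., from a Search Console export). The file names are placeholders:

    # compare_indexation.py - minimal sketch; file names are placeholders
    # wanted.txt: URLs you want indexed (e.g., from your XML sitemap)
    # indexed.txt: URLs Google reports as indexed (e.g., a GSC export)

    with open("wanted.txt") as f:
        wanted = {line.strip() for line in f if line.strip()}
    with open("indexed.txt") as f:
        indexed = {line.strip() for line in f if line.strip()}

    excess = (len(indexed) - len(wanted)) / len(wanted) * 100
    print(f"Indexation excess: {excess:.0f}%")  # 30,000 vs 5,000 = 500%

    print(f"{len(indexed - wanted)} indexed URLs not in your sitemap (bloat candidates)")
    print(f"{len(wanted - indexed)} sitemap URLs missing from the index")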

The objective is to find that disparity and take the most appropriate action. We have two options:

  1. Content is of good quality = Keep indexability
  2. Content is of low quality (thin, duplicate, or paginated) = noindex

You will find that, most of the time, fixing index bloat means removing a relatively large number of pages from the index by adding a “noindex” meta tag. However, through this indexation analysis, it is also possible to find pages that were missed during the creation of your XML sitemap(s); these can then be added to your sitemap(s) for better indexing.
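
For reference, the noindex directive itself is a single meta tag in the page’s <head>:

    <!-- In the <head> of a page you want removed from the index -->
    <meta name="robots" content="noindex">

For non-HTML files such as PDFs, the same directive can be sent as an “X-Robots-Tag: noindex” HTTP response header.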

Why index bloat is detrimental for SEO

Index bloat can slow processing time, consume more resources, and open up avenues outside of your control in which search engines can get stuck. One of the objectives of SEO is to remove roadblocks that hinder great content from ranking in search engines, which are very often technical in nature. For example, slow load speeds, using noindex or nofollow meta tags where you shouldn’t, not having proper internal linking strategies in place, and other such implementations.

Ideally, you would have a 100% indexation rate, meaning every quality page on your site would be indexed – no pollution, no unwanted material, no bloating. But for the sake of this analysis, let’s consider anything above 100% as bloat. Index bloat forces search engines to spend more resources (which are limited) than needed processing the pages they have in their database.

At best, index bloat causes inefficient crawling and indexing, hindering your ranking capability. But index bloat at worst can lead to keyword cannibalization across many pages on your site, limiting your ability to rank in top positions, and potentially impacting the user experience by sending searchers to low-quality pages.

To summarize, index bloat causes the following issues:

  1. Exhausts the limited resources Google allocates for a given site
  2. Creates orphaned content (sending Googlebot to dead-ends)
  3. Negatively impacts the website’s ranking capability
  4. Decreases the quality evaluation of the domain in the eyes of search engines

Sources of index bloat

1. Internal duplicate content

Unintentional duplicate content is one of the most common sources of index bloat. This is because most sources of internal duplicate content revolve around technical errors that generate large numbers of URL combinations that end up indexed. For example, using URL parameters to control the content on your site without proper canonicalization.
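
For illustration, proper canonicalization of a parameterized URL is a single link element in the page’s <head> – the URLs here are hypothetical:

    <!-- On https://www.example.com/shoes?color=red&sort=price -->
    <link rel="canonical" href="https://www.example.com/shoes">

This tells search engines to consolidate all the parameterized variations into the one clean URL instead of indexing each combination separately.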

Faceted navigation has also been one of the “thorniest SEO challenges” for large ecommerce sites, as Portent describes, and has the potential of generating billions of duplicate content pages by overlooking a simple feature.

2. Thin content

It’s important to mention an issue introduced by version 7.0 of the Yoast SEO plugin around attachment pages. This WordPress plugin bug led to “Panda-like problems” in March of 2018, causing heavy ranking drops for affected sites as Google deemed them to be lower in the overall quality they provided to searchers. In summary, there is a setting within the Yoast plugin to remove attachment pages in WordPress – a page created for each image in your library, with minimal content – the epitome of thin content for most sites. For some users, updating to the newest version (7.0 at the time) caused the plugin to overwrite the previous selection to remove these pages and defaulted to indexing all attachment pages.

This meant that a blog post with five images would have its one page of quality content indexed alongside five thin attachment pages – only about 16% of those URLs carrying actual quality content – causing a massive drop in domain value.

3. Pagination

Pagination refers to the concept of splitting up content into a series of pages to make content more accessible and improve user experience. This means that if you have 30 blog posts on your site, you may have ten blog posts per page that go three pages deep. Like so:

  • https://www.example.com/blog/
  • https://www.example.com/blog/page/2/
  • https://www.example.com/blog/page/3/

You’ll see this often on shopping pages, press releases, and news sites, among others.

Within the purview of SEO, the pages beyond the first in the series will very often contain the same page title and meta description, along with very similar (near duplicate) body content, introducing keyword cannibalization to the mix. Additionally, since the purpose of these pages is a better browsing experience for users already on your site, it doesn’t make sense to send search engine visitors to the third page of your blog.

4. Under-performing content

If you have content on your site that is not generating traffic, has not resulted in any conversions, and does not have any backlinks, you may want to consider changing your strategy. Repurposing content is a great way to maximize any value that can be salvaged from under-performing pages to create stronger and more authoritative pages.

Remember, as SEO experts our job is to help increase the overall quality and value that a domain provides, and improving content is one of the best ways to do so. For this, you will need a content audit to evaluate your own individual situation and what the best course of action would be.

Even a 404 page that returns a 200 (live) HTTP status code is a thin, low-quality page that should not be indexed.
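
A quick way to test for this “soft 404” behavior is to request a URL that can’t exist and check the status code that comes back. Here’s a minimal sketch in Python using the requests library – the domain is a placeholder:

    # soft_404_check.py - verify the error page returns a real 404
    import requests

    resp = requests.get("https://www.example.com/this-page-should-not-exist")
    if resp.status_code == 200:
        print("Soft 404: the error page returns 200 OK and can be indexed")
    else:
        print(f"OK: server returned {resp.status_code}")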

Common index bloat issues

One of the first things I do when auditing a site is to pull up their XML sitemap. If they’re on a WordPress site using a plugin like Yoast SEO or All in One SEO, you can very quickly find page types that do not need to be indexed. Check for the following:

  • Custom post types
  • Testimonial pages
  • Case study pages
  • Team pages
  • Author pages
  • Blog category pages
  • Blog tag pages
  • Thank you pages
  • Test pages

Whether the pages in your XML sitemap are low-quality and need to be removed from search really depends on the purpose they serve on your site. For instance, some sites do not use author pages in their blog but still have those pages live – this is not necessary. “Thank you” pages should not be indexed at all, as indexing them can cause conversion tracking anomalies. Test pages usually mean there’s a duplicate somewhere else. Similarly, some plugins or developers build custom features on web builds and create lots of pages that do not need to be indexed. For example, if you find an XML sitemap like the one below, it probably doesn’t need to be indexed:

  • https://www.example.com/tcb_symbols_tax-sitemap.xml

Different methods to diagnose index bloat

Remember that our objective here is to find the greatest contributors of low-quality pages that are bloating the index with low-quality content. Most times it’s very easy to find these pages on a large scale since a lot of thin content pages follow a pattern.

This is a quantitative analysis of your content, looking for volume discrepancies based on the number of pages you have, the number of pages you are linking to, and the number of pages Google is indexing. Any disparity between these numbers means there’s room for technical optimization, which often results in an increase in organic rankings once solved. You want to make these sets of numbers as similar as possible.

As you go through the various methods to diagnose index bloat below, look out for patterns in URLs by reviewing the following:

  • URLs that have /dev/
  • URLs that have “test”
  • Subdomains that should not be indexed
  • Subdirectories that should not be indexed
  • A large number of PDF files that should not be indexed

Next, I will walk you through a few simple steps you can take on your own using some of the most basic tools available for SEO. Here are the tools you will need:

  • Paid Screaming Frog
  • Verified Google Search Console
  • Your website’s XML sitemap
  • Editor access to your Content Management System (CMS)
  • Google.com

As you start finding anomalies, start adding them to a spreadsheet so they can be manually reviewed for quality.

1. Screaming Frog crawl

Under Configuration > Spider > Basics, set Screaming Frog up for a thorough scan of your site: check “crawl all subdomains” and “crawl outside of start folder”, and manually add your XML sitemap(s) if you have them. Once the crawl has been completed, take note of all the indexable pages it has listed. You can find this in the “Self-Referencing” report under the Canonicals tab.

(Screenshot: using Screaming Frog to scan through XML sitemaps)

Take a look at the number you see. Are you surprised? Do you have more or fewer pages than you thought? Make a note of the number. We’ll come back to this.

2. Google’s Search Console

Open up your Google Search Console (GSC) property and go to the Index > Coverage report. Take a look at the valid pages. On this report, Google is telling you how many total URLs they have found on your site. Review the other reports as well, GSC can be a great tool to evaluate what the Googlebot is finding when it visits your site.

(Screenshot: Google Search Console’s Coverage report)

How many pages does Google say it’s indexing? Make a note of the number.

3. Your XML sitemaps

This one is a simple check. Visit your XML sitemap and count the number of URLs included. Is the number off? Are there unnecessary pages? Are there not enough pages?
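
If your sitemap is too large to count by hand, a short script can do the tally. A minimal sketch in Python – the sitemap URL is a placeholder, and a sitemap index file that nests other sitemaps would need an extra loop:

    # count_sitemap_urls.py - count <loc> entries in an XML sitemap
    import requests
    import xml.etree.ElementTree as ET

    SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    resp = requests.get("https://www.example.com/sitemap.xml")
    root = ET.fromstring(resp.content)
    urls = [loc.text for loc in root.findall(".//sm:loc", SITEMAP_NS)]
    print(f"{len(urls)} URLs listed in the sitemap")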

Conduct a crawl with Screaming Frog, add your XML sitemap to the configuration and run a crawl analysis. Once it’s done, you can visit the Sitemaps tab to see which specific pages are included in your XML sitemap and which ones aren’t.

(Screenshot: running a crawl analysis of an XML sitemap in Screaming Frog)

Make a note of the number of indexable pages.

4. Your own Content Management System (CMS)

This one is a simple check too – don’t overthink it. How many pages does your site have? How many blog posts? Add them up. We’re looking for quality content that provides value, but in a quantitative fashion here. It doesn’t have to be exact; the actual quality of a piece of content can be measured via a content audit.

Make a note of the number you see.

5. Google

At last, we come to the final check of our series. Sometimes Google throws a number at you and you have no idea where it comes from, but try to be as objective as possible. Do a “site:domain.com” search on Google and check how many results Google serves you from its index. Remember, this is purely a numeric value and does not truly determine the quality of your pages.

(Screenshot: using Google search results to spot inefficient indexation)

Make a note of the number you see and compare it to the other numbers you found. Any discrepancies you find indicate symptoms of inefficient indexation. Completing a simple quantitative analysis will help direct you to areas that may not meet minimum qualitative criteria. In other words, comparing numeric values from multiple sources will help you find pages on your site that carry low value.

The quality criteria we evaluate against can be found in Google’s Webmaster guidelines.

How to resolve index bloat

Resolving index bloat is a slow and tedious process, but you have to trust the optimizations you’re performing on the site and have patience during the process, as the results may be slow to become noticeable.

1. Deleting pages (Ideal)

In an ideal scenario, low-quality pages would not exist on your site, and thus would not consume any of the limited resources search engines allocate to it. If you have a large number of outdated pages that you no longer use, cleaning them up (deleting them) can often lead to other benefits: fewer redirects and 404s, fewer thin-content pages, and less room for error and misinterpretation by search engines, to name a few.

The less control you give search engines by limiting their options on what action to take, the more control you will have on your site and your SEO.

Of course, this isn’t always realistic. So here are a few alternatives.

2. Using Noindex (Alternative)

Using a noindex meta tag at the page level (please don’t add a site-wide noindex – that happens more often than we’d like), or across a set of pages, is probably the most efficient alternative, as it can be completed very quickly on most platforms.

  • Do you use all those testimonial pages on your site?
  • Do you have a proper blog tag/category in place, or are they just bloating the index?
  • Does it make sense for your business to have all those blog author pages indexed?

All of the above can be noindexed and removed from your XML sitemap(s) with a few clicks on WordPress if you use Yoast SEO or All in One SEO.

3. Using Robots.txt (Alternative)

Using the robots.txt file to disallow sections or pages of your site is not recommended for most websites unless it has been explicitly recommended by an SEO Expert after auditing your website. It’s incredibly important to look at the specific environment your site is in and how a disallow of certain pages would affect the indexation of the rest of the site. Making a careless change here may result in unintended consequences.

Now that we’ve got that disclaimer out of the way: disallowing certain areas of your site means that you’re blocking search engines from even reading those pages. This means that if you add a noindex tag and also disallow the page, Google will never read the noindex tag or follow your directive, because you’ve blocked it from access. The order of operations, in this case, is absolutely crucial for Google to follow your directives.
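
For illustration, a disallow rule in robots.txt looks like this – the /dev/ path is a hypothetical example:

    # robots.txt
    User-agent: *
    Disallow: /dev/

Because the rule above blocks crawling entirely, Googlebot would never see a noindex tag on a page under /dev/ – so don’t rely on Disallow and noindex together for the same pages.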

4. Using Google Search Console’s manual removal tool (Temporary)

As a last resort, an action item that does not require developer resources is using the manual removal tool within the old Google Search Console. Using this method to remove pages, whole subdirectories, and entire subdomains from Google Search is only temporary. It can be done very quickly, all it takes is a few clicks. Just be careful of what you’re asking Google to deindex.

A successful removal request lasts only about 90 days, but it can be revoked manually. This option can also be done in conjunction with a noindex meta tag to get URLs out of the index as soon as possible.

Conclusion

Search engines despise thin content and try very hard to filter out all the spam on the web, hence the never-ending search quality updates that happen almost daily. In order to appease search engines and show them all the amazing content we spent so much time creating, webmasters must make sure their technical SEO is buttoned up as early in the site’s lifespan as possible before index bloat becomes a nightmare.

Using the different methods described above can help you diagnose any index bloat affecting your site so you can figure out which pages need to be deleted. Doing this will help you optimize your site’s overall quality evaluation in search engines, rank better, and get a cleaner index, allowing Google to find the pages you’re trying to rank quickly and efficiently.

Pablo Villalpando is a Bilingual SEO Strategist for Victorious. He can be found on Twitter @pablo_vi.

The post Delete your pages and rank higher in search – Index bloat and technical optimization 2019 appeared first on Search Engine Watch.

Search Engine Watch