
Dynamic Search Ads For Beginners

September 1, 2019

Learn what Dynamic Search Ads (DSAs) are, how to set them up and how they can be used alongside remarketing.

Read more at PPCHero.com
PPC Hero


How to check Google search results for different locations

August 30, 2019

One of the fundamental truths about SEO is that no two Google searches are the same.

The logic behind it is simple: things you’ve Googled, read, and watched are stored for at least three months before your Web & App Activity is deleted, if it is deleted at all.

This, together with data on devices you use as well as places you go – both in terms of location history and the current IP – lets Google deliver personalized results. While this is convenient, you end up in the infamous “filter bubble”.  

In a world of highly customized SERPs on the one hand, and a host of ranking signals for local search Google uses in its algorithms on the other, pulling relevant ranking data is as challenging as it gets.

Luckily, there are a bunch of ways to pop the filter bubble, targeting the one thing that seems to be dominating personalized search – location.

Not only does it determine what users see in search results, but it also helps business owners address the issue of inconsistent SERP performance across their service areas.

The thing is, doing your local SEO homework doesn’t stop at continuous content improvement and link building targeted specifically at local search. Poor performance can still be an issue – one that is oftentimes attributed to not having enough of a customer base in a certain location. The problem can only be diagnosed by checking SERPs across the entire geographical area covered.

Without further ado, let’s look at how you can fetch rankings for different locations manually and using designated tools – all from the comfort of your home.

Country-level search

First off, decide on the level of localization.

For brands working in multiple countries, pulling nationwide results is more than enough. For local businesses operating within a city, ranking data will differ district by district and street by street.

Check manually

So, say you want to see how well a website performs in country-level search. For that, you’ll need to adjust Google search settings and then specify the region you’d like to run a search for. And yes, you read that right: simply using the correct country TLD is no longer enough, since Google stopped tying results to its separate country domains a while back.

Now, in order to run a country-specific search manually, locate Search settings in your browser and pick a region from the list available under Region Settings.

[Screenshot: Google’s regional search settings]

Alternatively, use a proxy or VPN service – both work for doing a country-wide search.

Use rank tracking software

To automate the job, turn to your rank tracking software of choice, for example, Rank Tracker. The results will closely mirror the SERPs you fetched by manually adjusting search settings in the browser.

There you have it – performance tracking for non-geo-sensitive queries and multilingual websites is taken care of.

City-level search

Doing SEO for a small or medium-sized business comes with many challenges, not the least of which is making sure your website shows up in local search.

Whether you have a physical store or simply provide services within a specific area, tracking ranking coverage on the city level will ultimately improve findability, and drive leads and customers.

Check manually

To manually run a search limited to a specific city, use the ‘&near=cityname’ search parameter in your Google URL:
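For instance, here is a minimal Python sketch that builds such a URL (the query and city are placeholder values):

```python
from urllib.parse import urlencode

def city_search_url(query: str, city: str) -> str:
    """Build a Google search URL scoped to a city via the 'near' parameter."""
    return "https://www.google.com/search?" + urlencode({"q": query, "near": city})

# Placeholder example: check rankings as seen near Austin
print(city_search_url("emergency plumber", "Austin"))
# https://www.google.com/search?q=emergency+plumber&near=Austin
```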

As the name suggests, “&near=cityname” lets you pull SERPs near a certain location. While this method is easy to master, many claim that it’s unreliable, with results often delivered for a larger city nearby.

[Screenshot: city-level search in Google]

Still, the trick is nice to have up your sleeve as a quick and sound way of checking city-specific rankings manually.

Another silver bullet of local search that is sure to hit the city target is Google Ads’ Ad Preview and Diagnosis tool.

The Ad Preview and Diagnosis tool lets you pick a location, specify a language as well as user device – and fetch local SERPs regardless of your current whereabouts.

Use rank tracking software

Pretty much every rank tracking tool out there is able to run a city-specific ranking check.

Rank Tracker, Ahrefs, SEMrush, Whitespark, AccuRanker, BrightLocal – you name it – all boast the functionality and deliver local search results. That said, picking the right software for you and your business is a two-fold process.

First, take the time to look into the supported locations for search, since some of the tools, like Whitespark or SEMrush, have a somewhat limited location catalog. Second, double-check that the software you’re most interested in uses its own database, with results relying on a well-designed and trusted crawler.

Doing this type of research helps ensure you can easily see accurate SERPs for the location of your choosing.

In case you’re new to city-level ranking checks and/or baffled by the variety of options on the market, go for a single-dashboard tool: BrightLocal would be a perfect example of clean design and intuitive navigation.

Better yet, all data lives on BrightLocal’s website, which adds to the overall user-friendliness and lets you easily automate the monitoring of top search engines for multiple locations.

Street-level search

Google’s Local Pack is the place to be when running any kind of business. With over half of searches run from mobile devices, a single Local Pack may take up as much as an entire results page on a smartphone.

Both Maps and Local Pack results are extremely location-sensitive – always keep that in mind while doing your research. To verify that your business shows up for the right locations within a city, narrow the search down to a specific street address.

Check manually

That’s not to say you cannot configure an address-specific search by yourself – even manually, it’s perfectly doable.

However, unlike relying on a toolkit that would basically do the whole process for you, setting up a highly localized search in a browser involves multiple steps and also requires some groundwork.

  1. To start off, you need to get the exact geo-coordinates of the location you’d like to run the search from. When in doubt, use a designated tool.
  2. In your Google Chrome browser, open DevTools: click the three-dot menu in the top right corner of the browser window, then More tools > Developer tools. You can also press Control+Shift+C (on Windows) or Command+Option+C (on Mac).

[Screenshot: checking search results manually using Developer Tools]

3. Within DevTools, click the three-dot menu icon in the top right corner, then More Tools > Sensors. This step is also the appropriate time to give yourself some credit for getting that far in Google search configuration.

4. In the Geolocation dropdown, select “Other” and paste your target latitude and longitude coordinates.

5. Run a search and retrieve the SERPs for the exact location you specified.
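If you’d rather script this setup than click through DevTools each time, the same geolocation override is exposed through Chrome’s DevTools Protocol. Here is a minimal sketch using Selenium with Chrome (the coordinates and query are placeholders; it assumes Selenium 4 and chromedriver are installed):

```python
from selenium import webdriver

# Placeholder coordinates: the street address you want to search from
LATITUDE, LONGITUDE = 40.7580, -73.9855

driver = webdriver.Chrome()
# Apply the same override DevTools exposes under Sensors > Geolocation
driver.execute_cdp_cmd("Emulation.setGeolocationOverride", {
    "latitude": LATITUDE,
    "longitude": LONGITUDE,
    "accuracy": 100,
})
driver.get("https://www.google.com/search?q=coffee+shop+near+me")
# ...inspect the local results, then clean up with driver.quit()
```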

In case you aren’t particularly excited about a multistep search setup, try the Valentin app; it lets you check search results for any location with no DevTools involved.

Use rank tracking software

If anything, rank tracking for multiple precise locations is the one job you want automated and done for you by a tool that was specifically developed for local search.

That’s the idea behind SEO PowerSuite’s Rank Tracker, designed to, among other things, pull hyper-localized SERPs for an unlimited number of locations. Configure as many custom search engines as you wish. On top of that, set up scheduled tasks and have local search results checked automatically.

I rely on Rank Tracker not only because it was built by my team, but also because it’s the only toolkit out there that automates what both Chrome and the Valentin app help you configure manually. And of course, the ranking data retrieved by the software is precise and easily exportable.

Another tool that lets you visualize – quite literally – any business’ search performance across a service area is Local Falcon. Created for Google Maps, the platform runs a search for up to 225 locations within any area specified.

With an overview of your search performance at hand, you can make better targeting choices while expanding outreach and winning new customers.

Final thoughts

Given that there are as many SERP variations as there are searches, rank tracking may feel utterly discouraging: if no two users get to see quite the exact same results, why bother? Well, the sentiment is totally understandable.

But in fact, it all boils down to understanding the reasons behind tracking rankings in the first place.

Is it to see how quickly your SEO efforts transform into higher positions in SERPs? That’d be one. Is it to make sense of the changes in traffic and sales at every point and in every location? Sure.

Businesses big and small simply have to keep tabs on their rankings today, not just country-wide but even on a street-by-street basis. There is hardly any excuse to ignore a single metric here.

More than that, in business as well as SEO, there is no such thing as an unexplainable dynamic. And more often than not, you have to take a closer look to see the root of any problem.

We all understand that rankings in themselves aren’t the only metric of success. It’s not just about getting more traffic; getting more business is the main goal.

But that shouldn’t in any way undermine the overall importance of tracking rankings as a tried and tested way of checking that your website is served among relevant search results.

Local search is all about making sure your customers see you and get to you. So use it to your best advantage – whether you go for checking manually or using rank tracking software.

Aleh is the Founder and CMO at SEO PowerSuite and Awario. He can be found on Twitter at @ab80.

The post How to check Google search results for different locations appeared first on Search Engine Watch.

Search Engine Watch


Augmented Search Queries Using Knowledge Graph Information

August 24, 2019

What are Augmented Search Queries?

Last year, I wrote a post called Quality Scores for Queries: Structured Data, Synthetic Queries and Augmentation Queries. It described how Google may look at query logs and structured data (table data and schema data) related to a site to create augmentation queries, and then evaluate how searches for those augmentation queries perform compared to the original queries for pages from that site. If the augmentation queries do well in those evaluations, searchers may see search results that combine results from the original queries and the augmentation queries.

Around the time that patent was granted to Google, another patent about augmented search queries was also granted, and it is worth discussing alongside the one I wrote about last year. It keeps the concept of blending results from augmented search queries with original search results, but it has a different way of coming up with those augmented queries. This newer patent starts off by telling us what it is about:

This disclosure relates generally to providing search results in response to a search query containing an entity reference. Search engines receive search queries containing a reference to a person, such as a person’s name. Results to these queries are often times not sufficiently organized, not comprehensive enough, or otherwise not presented in a useful way.

Augmentation from the first patent means possibly providing additional information in search results based upon additional query information from query logs or structured data from a site. Under this new patent, augmentation comes from recognizing that an entity exists in a query, and providing some additional information in search results based upon that entity.

This patent is interesting to me because it combines an older type of search, where a query returns pages in response to the keywords typed into a search box, with a newer type of search, where an entity is identified in a query and knowledge information about that entity is reviewed to create possible augmentation queries whose results could be combined with the results of the original query.

The process behind this patent can be described in this way:

In some implementations, a system receives a search query containing an entity reference, such as a person’s name, that corresponds to one or more distinct entities. The system provides a set of results, where each result is associated with at least one of the distinct entities. The system uses the set of results to identify attributes of the entity and uses the identified attributes to generate additional, augmented search queries associated with the entity. The system updates the set of results based on one or more of these augmented search queries.

A summary of that process can be described as:

  1. Receiving a search query associated with an entity reference, wherein the entity reference corresponds to one or more distinct entities.
  2. Providing a set of results for the search query where the set of results distinguishes between distinct entities.
  3. Identifying one or more attributes of at least one entity of the one or more distinct entities based at least in part on the set of results.
  4. Generating one or more additional search queries based on the search query, the at least one entity, and the one or more attributes.
  5. Receiving an input selecting at least one of the one or more additional search queries and providing an updated set of results based on the selected one or more additional search queries, where the updated set of results comprises at least one result not in the set of results.

The step of generating one or more additional search queries means ranking the identified one or more attributes and generating one or more additional search queries based on the search query, the at least one entity, the one or more attributes, and the ranking.

That ranking can be based on the frequency of occurrence.
The ranking can also be based on a location of each of the one or more attributes with respect to at least one entity in the set of results.
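As a rough illustration only (not the patent’s actual implementation), that frequency-based ranking step might look something like this in Python: count how often each attribute appears across the result set, then append the top attributes to the original query:

```python
from collections import Counter

def augment(original_query: str, attributes_per_result, top_n: int = 3):
    """Rank attributes by frequency across results and build augmented queries."""
    counts = Counter(a for attrs in attributes_per_result for a in attrs)
    return [f"{original_query} {attr}" for attr, _ in counts.most_common(top_n)]

# Hypothetical attributes extracted from results for the entity reference "john adams"
results = [["second president", "abigail adams"],
           ["second president", "alien and sedition acts"],
           ["second president", "abigail adams"]]
print(augment("john adams", results))
# ['john adams second president', 'john adams abigail adams',
#  'john adams alien and sedition acts']
```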

[Image: augmented search queries – Planet of the Apes example]

This process can identify two different entities in a query. For instance, there are two versions of the movie Planet of the Apes: one was released in 1968, and the other, considered a reboot of the first, was released in 2001 with a different cast.

When results are generated in instances where there may be more than one entity involved, the search queries provided may distinguish between the distinct entities. They may identify one or more attributes of at least one entity of the one or more distinct entities based at least in part on the set of results. Augmented search queries may be generated for “one or more additional search queries based on the search query, the at least one entity, and the one or more attributes.”

This patent can be found at:

Providing search results using augmented search queries
Inventors: Emily Moxley and Sean Liu
Assignee: Google LLC
US Patent: 10,055,462
Granted: August 21, 2018
Filed: March 15, 2013

Abstract

Methods and systems are provided for updating a set of results. In some implementations, a search query associated with an entity reference is received. The entity reference corresponds to one or more distinct entities. A set of results for the search query is provided, and the set of results distinguishes between distinct entities. One or more attributes for at least one entity of the one or more distinct entities are identified based at least in part on the set of results. One or more additional search queries are identified based on the search query, the at least one entity, and the one or more attributes. An input selecting at least one of the additional search queries is received. An updated set of results is provided based on the selected additional search queries. The updated set of results comprises at least one result not in the set of results.

Some Additional Information About How Augmented Search Queries are Found and Used

A couple of quick definitions from the patent:

Entity Reference – refers to an identifier that corresponds to one or more distinct entities.

Entity – refers to a thing or concept that is singular, unique, well defined, and distinguishable.

This patent is all about augmenting a set of query results by providing more information about entities that may appear in a query:

An entity reference may correspond to more than one distinct entity. An entity reference may be a person’s name, and corresponding entities may include distinct people who share the referenced name.

This process is broader than queries involving people. The patent gives us a list of what an entity can cover: “a person, place, item, idea, topic, abstract concept, concrete element, other suitable thing, or any combination thereof.”

And when an entity reference appears in a query, it may cover a number of entities. For example, a query that refers to John Adams could be referring to:

  • John Adams the Second President
  • John Quincy Adams the Sixth President
  • John Adams the artist

Entity attributes

In addition to an entity reference in a query, we may see a mention of an attribute for that entity, which is “any feature or characteristic associated with an entity that the system may identify based on the set of results.” For the John Adams entity reference, we may also see attributes included in search results, such as [second president], [Abigail Adams], and [Alien and Sedition Acts].

[Screenshot: entity selection box]

It sounds like an entity selection box could be shown that allows a searcher to identify which entity they would like to see results about. So when a query contains an entity such as John Adams, and there are at least three different John Adamses that could be included in augmented search results, there may be clickable hyperlinks letting a searcher select or deselect the entity they are interested in seeing more about.

Augmented Search Queries with Entities Process Takeaways

When an original query includes an entity reference, Google may allow searchers to identify which entity they are interested in, and possibly attributes associated with that entity. This really brings the knowledge graph to search, using it to augment queries in such a manner. A flowchart from the patent illustrates this process in a way that was worth including in this post:

[Image: augmented search queries flowchart]

The patent provides a very detailed example of how a search that includes entity information about a royal wedding in England might be surfaced using this augmented search query approach. That may not be a query I would perform, but I can imagine some that I would like to try out, involving sports, movies, and business. If you own a business and it is not in Google’s knowledge graph, you may end up missing out on being included in results from augmented search queries.



The post Augmented Search Queries Using Knowledge Graph Information appeared first on SEO by the Sea ⚓.


SEO by the Sea ⚓


New Releases on Hero Academy! Starting Pinterest & Apple Search Ads

August 21, 2019

Hero Academy is Hanapin’s newest initiative featuring short and basic how-tos on paid advertising in a variety of platforms.

Read more at PPCHero.com
PPC Hero


Pinterest’s New Search Tool Puts Stress Relief in Your Feed

July 22, 2019

Soon the company will begin placing anxiety-relieving exercises within its search results to help boost your mood.
Feed: All Latest


Delete your pages and rank higher in search – Index bloat and technical optimization 2019

July 16, 2019

If you’re looking for a way to optimize your site for technical SEO and rank better, consider deleting your pages.

I know, crazy, right? But hear me out.

We all know Google can be slow to index content, especially on new websites. But occasionally, it can aggressively index anything and everything it can get its robot hands on whether you want it or not. This can cause terrible headaches, hours of clean up, and subsequent maintenance, especially on large sites and/or ecommerce sites.

Our job as search engine optimization experts is to make sure Google and other search engines can first find our content so that they can then understand it, index it, and rank it appropriately. When we have an excess of indexed pages, we are not being clear with how we want search engines to treat our pages. As a result, they take whatever action they deem best which sometimes translates to indexing more pages than needed.

Before you know it, you’re dealing with index bloat.

What is index bloat?

Put simply, index bloat is when you have too many low-quality pages on your site indexed in search engines. Similar to bloating in the human digestive system (disclaimer: I’m not a doctor), the result of processing this excess content can be seen in search engines’ indices when their information retrieval process becomes less efficient.

Index bloat can even make your life difficult without you knowing it. In this puffy and uncomfortable situation, Google has to go through much more content than necessary (most of the time low-quality and internally duplicated content) before it can get to the pages you want it to index.

Think of it this way: Google visits your XML sitemap to find 5,000 pages, then crawls all your pages and finds even more of them via internal linking, and ultimately decides to index 30,000 URLs. This comes out to an indexation excess of approximately 500% or even more.

But don’t worry, diagnosing your indexation rate to measure against index bloat can be a very simple and straightforward check. You simply need to cross-reference which pages you want to get indexed versus the ones that Google is indexing (more on this later).

The objective is to find that disparity and take the most appropriate action. We have two options:

  1. Content is of good quality = Keep indexability
  2. Content is of low quality (thin, duplicate, or paginated) = noindex

You will find that most of the time, fixing index bloat means removing a relatively large number of pages from the index by adding a “noindex” meta tag. However, through this indexation analysis, it is also possible to find pages that were missed during the creation of your XML sitemap(s), and they can then be added to your sitemap(s) for better indexing.

Why index bloat is detrimental for SEO

Index bloat can slow processing time, consume more resources, and open up avenues outside of your control in which search engines can get stuck. One of the objectives of SEO is to remove roadblocks that hinder great content from ranking in search engines, which are very often technical in nature. For example, slow load speeds, using noindex or nofollow meta tags where you shouldn’t, not having proper internal linking strategies in place, and other such implementations.

Ideally, you would have a 100% indexation rate, meaning every quality page on your site would be indexed – no pollution, no unwanted material, no bloating. For the sake of this analysis, let’s consider anything above 100% to be bloat. Index bloat forces search engines to spend more of their limited resources than needed processing the pages in their database.

At best, index bloat causes inefficient crawling and indexing, hindering your ranking capability. But index bloat at worst can lead to keyword cannibalization across many pages on your site, limiting your ability to rank in top positions, and potentially impacting the user experience by sending searchers to low-quality pages.

To summarize, index bloat causes the following issues:

  1. Exhausts the limited resources Google allocates for a given site
  2. Creates orphaned content (sending Googlebot to dead-ends)
  3. Negatively impacts the website’s ranking capability
  4. Decreases the quality evaluation of the domain in the eyes of search engines

Sources of index bloat

1. Internal duplicate content

Unintentional duplicate content is one of the most common sources of index bloat. This is because most sources of internal duplicate content revolve around technical errors that generate large numbers of URL combinations that end up indexed. For example, using URL parameters to control the content on your site without proper canonicalization.
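To see how quickly parameters multiply, here is a sketch that collapses parameterized URLs to the one canonical version you would actually want indexed (the parameter names are hypothetical facets):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

FACET_PARAMS = {"color", "size", "sort", "page"}  # hypothetical facet parameters

def canonical(url: str) -> str:
    """Strip facet parameters so URL variants collapse to one canonical page."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical("https://www.example.com/shoes?color=red&sort=price_asc"))
# https://www.example.com/shoes
```

Every combination of those parameters is a separate URL to Google; a rel=canonical pointing at the stripped version tells search engines which one to index.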

Faceted navigation has also been one of the “thorniest SEO challenges” for large ecommerce sites, as Portent describes, and has the potential of generating billions of duplicate content pages by overlooking a simple feature.

2. Thin content

It’s important to mention an issue introduced by version 7.0 of the Yoast SEO plugin around attachment pages. This WordPress plugin bug led to “Panda-like problems” in March of 2018, causing heavy ranking drops for affected sites, as Google deemed them lower in the overall quality they provided to searchers. In summary, there is a setting within the Yoast plugin to remove attachment pages in WordPress – a page created for each image in your library, with minimal content – the epitome of thin content for most sites. For some users, updating to the newest version (7.0 at the time) caused the plugin to overwrite the previous selection and default to indexing all attachment pages.

This then meant that having five images per blog post would turn one indexed URL into six (the post plus five thin attachment pages), leaving only about 16% of those URLs with actual quality content and causing a massive drop in domain value.

3. Pagination

Pagination refers to the concept of splitting up content into a series of pages to make content more accessible and improve user experience. This means that if you have 30 blog posts on your site, you may have ten blog posts per page that go three pages deep. Like so:

  • https://www.example.com/blog/
  • https://www.example.com/blog/page/2/
  • https://www.example.com/blog/page/3/

You’ll see this often on shopping pages, press releases, and news sites, among others.

Within the purview of SEO, the pages beyond the first in the series will very often contain the same page title and meta description, along with very similar (near-duplicate) body content, introducing keyword cannibalization to the mix. Additionally, since the purpose of these pages is a better browsing experience for users already on your site, it doesn’t make sense to send search engine visitors to the third page of your blog.

4. Under-performing content

If you have content on your site that is not generating traffic, has not resulted in any conversions, and does not have any backlinks, you may want to consider changing your strategy. Repurposing content is a great way to maximize any value that can be salvaged from under-performing pages to create stronger and more authoritative pages.

Remember, as SEO experts our job is to help increase the overall quality and value that a domain provides, and improving content is one of the best ways to do so. For this, you will need a content audit to evaluate your own individual situation and what the best course of action would be.

Even a 404 page that returns a 200 (live) HTTP status code is a thin, low-quality page that should not be indexed.

Common index bloat issues

One of the first things I do when auditing a site is to pull up their XML sitemap. If they’re on a WordPress site using a plugin like Yoast SEO or All in One SEO, you can very quickly find page types that do not need to be indexed. Check for the following:

  • Custom post types
  • Testimonial pages
  • Case study pages
  • Team pages
  • Author pages
  • Blog category pages
  • Blog tag pages
  • Thank you pages
  • Test pages

Whether the pages in your XML sitemap are low-quality and need to be removed from search really depends on the purpose they serve on your site. For instance, some sites do not use author pages in their blog but still have those pages live, which is unnecessary. “Thank you” pages should not be indexed at all, as indexing them can cause conversion tracking anomalies. Test pages usually mean there’s a duplicate somewhere else. Similarly, some plugins or developers build custom features on web builds and create lots of pages that do not need to be indexed. For example, if you find an XML sitemap like the one below, it probably doesn’t need to be indexed:

  • https://www.example.com/tcb_symbols_tax-sitemap.xml

Different methods to diagnose index bloat

Remember that our objective here is to find the greatest contributors of low-quality pages bloating the index. Most of the time it’s easy to find these pages at scale, since a lot of thin content pages follow a pattern.

This is a quantitative analysis of your content, looking for volume discrepancies based on the number of pages you have, the number of pages you are linking to, and the number of pages Google is indexing. Any disparity between these numbers means there’s room for technical optimization, which often results in an increase in organic rankings once solved. You want to make these sets of numbers as similar as possible.

As you go through the various methods to diagnose index bloat below, look out for patterns in URLs by reviewing the following:

  • URLs that have /dev/
  • URLs that have “test”
  • Subdomains that should not be indexed
  • Subdirectories that should not be indexed
  • A large number of PDF files that should not be indexed

Next, I will walk you through a few simple steps you can take on your own using some of the most basic tools available for SEO. Here are the tools you will need:

  • Paid Screaming Frog
  • Verified Google Search Console
  • Your website’s XML sitemap
  • Editor access to your Content Management System (CMS)
  • Google.com

As you start finding anomalies, start adding them to a spreadsheet so they can be manually reviewed for quality.

1. Screaming Frog crawl

Under Configuration > Spider > Basics, configure Screaming Frog for a thorough scan of your site pages (check “crawl all subdomains” and “crawl outside of start folder”, and manually add your XML sitemap(s) if you have them). Once the crawl has been completed, take note of all the indexable pages it has listed. You can find this in the “Self-Referencing” report under the Canonicals tab.

[Screenshot: using Screaming Frog to scan through XML sitemaps]

Take a look at the number you see. Are you surprised? Do you have more or fewer pages than you thought? Make a note of the number. We’ll come back to this.

2. Google’s Search Console

Open up your Google Search Console (GSC) property and go to the Index > Coverage report. Take a look at the valid pages. On this report, Google is telling you how many total URLs it has found on your site. Review the other reports as well; GSC can be a great tool to evaluate what Googlebot is finding when it visits your site.

[Screenshot: Google Search Console’s coverage report]

How many pages does Google say it’s indexing? Make a note of the number.

3. Your XML sitemaps

This one is a simple check. Visit your XML sitemap and count the number of URLs included. Is the number off? Are there unnecessary pages? Are there not enough pages?
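If the sitemap is too large to count by hand, a few lines of Python will do it (the sitemap URL is a placeholder, and a sitemap index file would need one extra level of recursion):

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=30).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
```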

Conduct a crawl with Screaming Frog, add your XML sitemap to the configuration and run a crawl analysis. Once it’s done, you can visit the Sitemaps tab to see which specific pages are included in your XML sitemap and which ones aren’t.

[Screenshot: using Screaming Frog to run a crawl analysis of an XML sitemap]

Make a note of the number of indexable pages.

4. Your own Content Management System (CMS)

This one is a simple check too, don’t overthink it. How many pages on your site do you have? How many blog posts do you have? Add them up. We’re looking for quality content that provides value, but in a quantitative fashion for now. It doesn’t have to be exact, as the actual quality of a piece of content can be measured via a content audit.

Make a note of the number you see.

5. Google

At last, we come to the final check of our series. Sometimes Google throws a number at you and you have no idea where it comes from, but try to be as objective as possible. Do a “site:domain.com” search on Google and check how many results Google serves you from its index. Remember, this is purely a numeric value and does not truly determine the quality of your pages.

[Screenshot: using Google search results to spot inefficient indexation]

Make a note of the number you see and compare it to the other numbers you found. Any discrepancy you find indicates a symptom of inefficient indexation. Completing a simple quantitative analysis will help direct you to areas that may not meet minimum qualitative criteria. In other words, comparing numeric values from multiple sources will help you find pages on your site that provide little value.
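To turn those raw numbers into an actionable list, you can diff the URL sets themselves. A minimal sketch, assuming you have saved your sitemap URLs and the indexed URLs (for example, exported from GSC) into two plain-text files with one URL per line (the file names are hypothetical):

```python
def load(path: str) -> set:
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

wanted = load("sitemap_urls.txt")    # pages you want indexed
indexed = load("indexed_urls.txt")   # pages Google is indexing

print("Bloat candidates (indexed but unwanted):", sorted(indexed - wanted)[:20])
print("Gaps (wanted but not indexed):", sorted(wanted - indexed)[:20])
print(f"Indexation rate: {len(indexed) / len(wanted):.0%}")
```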

The quality criteria we evaluate against can be found in Google’s Webmaster guidelines.

How to resolve index bloat

Resolving index bloat is a slow and tedious process, but you have to trust the optimizations you’re performing on the site and have patience during the process, as the results may be slow to become noticeable.

1. Deleting pages (Ideal)

In an ideal scenario, low-quality pages would not exist on your site and thus would not consume any of the limited resources search engines allocate to it. If you have a large number of outdated pages that you no longer use, cleaning them up (deleting them) can often lead to other benefits: fewer redirects and 404s, fewer thin-content pages, and less room for error and misinterpretation by search engines, to name a few.

The less control you give search engines by limiting their options on what action to take, the more control you will have on your site and your SEO.

Of course, this isn’t always realistic. So here are a few alternatives.

2. Using Noindex (Alternative)

Using a noindex at the page level (please don’t add a site-wide noindex – it happens more often than we’d like) or within a set of pages is probably the most efficient method, as it can be completed very quickly on most platforms.

  • Do you use all those testimonial pages on your site?
  • Do you have a proper blog tag/category in place, or are they just bloating the index?
  • Does it make sense for your business to have all those blog author pages indexed?

All of the above can be noindexed and removed from your XML sitemap(s) with a few clicks on WordPress if you use Yoast SEO or All in One SEO.
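Once the tags are in place, it’s worth spot-checking that they actually render. A minimal sketch that flags whether each URL carries a robots noindex directive, via either the meta tag or the X-Robots-Tag header (assumes the requests and beautifulsoup4 packages; the URLs are placeholders):

```python
import requests
from bs4 import BeautifulSoup

urls = [  # placeholder URLs to audit
    "https://www.example.com/thank-you/",
    "https://www.example.com/tag/news/",
]

for url in urls:
    resp = requests.get(url, timeout=30)
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    directives = (meta.get("content", "") if meta else "") + resp.headers.get("X-Robots-Tag", "")
    status = "noindex" if "noindex" in directives.lower() else "indexable"
    print(f"{status:10} {url}")
```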

3. Using Robots.txt (Alternative)

Using the robots.txt file to disallow sections or pages of your site is not recommended for most websites unless it has been explicitly recommended by an SEO Expert after auditing your website. It’s incredibly important to look at the specific environment your site is in and how a disallow of certain pages would affect the indexation of the rest of the site. Making a careless change here may result in unintended consequences.

Now that we’ve got that disclaimer out of the way, disallowing certain areas of your site means that you’re blocking search engines from even reading those pages. This means that if you added a noindex, and also disallowed, Google won’t even get to read the noindex tag on your page or follow your directive because you’ve blocked them from access. Order of operations, in this case, is absolutely crucial in order for Google to follow your directives.
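A quick way to catch that conflict is Python’s built-in robots.txt parser: if a URL you are trying to noindex is also disallowed, the tag will never be read. A sketch (the domain and URL are placeholders):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")  # placeholder domain
rp.read()

url = "https://www.example.com/tag/news/"  # a page you added noindex to
if not rp.can_fetch("Googlebot", url):
    print("Blocked by robots.txt: Google will never see the noindex tag on", url)
```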

4. Using Google Search Console’s manual removal tool (Temporary)

As a last resort, an action item that does not require developer resources is using the manual removal tool within the old Google Search Console. Using this method to remove pages, whole subdirectories, and entire subdomains from Google Search is only temporary. It can be done very quickly, all it takes is a few clicks. Just be careful of what you’re asking Google to deindex.

A successful removal request lasts only about 90 days, but it can be revoked manually. This option can also be done in conjunction with a noindex meta tag to get URLs out of the index as soon as possible.

Conclusion

Search engines despise thin content and try very hard to filter out all the spam on the web, hence the never-ending search quality updates that happen almost daily. In order to appease search engines and show them all the amazing content we spent so much time creating, webmasters must make sure their technical SEO is buttoned up as early in the site’s lifespan as possible before index bloat becomes a nightmare.

Using the different methods described above can help you diagnose any index bloat affecting your site so you can figure out which pages need to be deleted. Doing this will help you optimize your site’s overall quality evaluation in search engines, rank better, and get a cleaner index, allowing Google to find the pages you’re trying to rank quickly and efficiently.

Pablo Villalpando is a Bilingual SEO Strategist for Victorious. He can be found on Twitter @pablo_vi.

The post Delete your pages and rank higher in search – Index bloat and technical optimization 2019 appeared first on Search Engine Watch.

Search Engine Watch


Paid Search Basics for Events

May 23, 2019

This blog provides tips and insights into event-based paid search strategies surrounding audiences types and different ads to garner success.

Read more at PPCHero.com
PPC Hero


Five ways blockchain will impact search marketing

May 21, 2019

Few technologies promise as tremendous an impact on the marketplace as blockchain, though many professionals in the search marketing industry are still entirely unfamiliar with it. Blockchain’s disruptive nature is changing digital advertising whether or not those professionals hear about it, so it’s imperative to catch up on how this technology is changing the industry if you want to remain competitive.

Here are five of the major ways that blockchain will impact search marketing, and how advertising professionals are already beginning to master this interesting technology as it takes over.

1. Blockchain will make ads trustworthy

Consumers hate advertisements for a number of reasons, but by and large the most common is that they simply think advertising technology is untrustworthy. Nobody likes feeling as if they are being surveilled 24/7, and few people trust the digital advertisements that appear on their screens enough to click on them, even if their contents are interesting. Blockchain technology promises to help with this problem by securing the ad supply chain and making the marketing process more trustworthy to consumers everywhere.

Soon, thanks to blockchain services, ad tech vendors, buyers, and publishers will be more connected than ever before. The transparency that is sorely needed in the ad supply chain can be brought about by blockchain services, which, as ledgers, are accessible to every party involved in a financial transaction. Website owners and ad vendors of the future will thus be able to work with one another much more securely when making marketing arrangements.

2. Blockchain is delivering ad transparency

Elsewhere, blockchain services will be applied to make ads more transparent in an effort to win over the trust of skeptical consumers. Companies like Unilever are now teaming up with the likes of IBM on blockchain projects that they hope will disclose information about their business footprint and the way they collect and utilize information on customers. As these endeavors become more successful, others will be convinced to enlist the help of blockchain technology when it comes to ensuring a transparent advertising industry.

3. Blockchain is changing ad payments

Blockchain technology will also impact search marketing by disrupting the way that advertisement payments are facilitated. Companies like Amino Payments will soon be springing up left and right as the market for blockchain services grows larger and larger. These businesses will help mainstream blockchain-powered ad buys that make use of interesting smart contracts. While smart contracts are only just beginning to become an accepted part of the business world, they’ll be a mainstream facet of doing business sooner than we think, all thanks to the wonderful power of blockchain.

4. New advertising ecosystems are springing up

Some of the ways that blockchain is impacting search marketing are truly monumental. Blockchain technology is helping new advertising ecosystems get on their feet, for instance, with nascent companies like Adshares that are working hard to create a blockchain-based advertising ecosystem. As cryptocurrencies and other blockchain-powered technologies become more mainstream, we’ll see an increased need for blockchain-friendly payment systems.

Search marketing professionals may, in the future, have to rely on specialized expertise when navigating these new blockchain-powered advertising ecosystems that use a standard bitcoin wallet – a space likely to be dominated by the IT-savvy. Programmatic advertising has already been upended time and again in recent years as the digital revolution brought about better computers, and the rise of blockchain could very well be the next stage in that cycle of disruption.

5. New blockchain browsers will reshape user experiences

Finally, the digital experience of the average consumer will be fundamentally changed by the introduction of blockchain browsers. Browser options like Brave are becoming more popular and grabbing headlines as they promise a privacy-respecting internet experience that features more honest and safer ad tech. Our current understanding of the marketing world may be entirely useless a few years from now, when blockchain-powered browsers offer secure, personalized search options to users who are sick and tired of modern advertising.

Search marketing is in for more than its fair share of disruptive changes in the forthcoming years, largely because of the advent of blockchain technology. Like any other technological innovation, blockchain will take time and investment to grow into its full potential, but it’s already quite clear that its development is jarring advertising professionals.

The post Five ways blockchain will impact search marketing appeared first on Search Engine Watch.

Search Engine Watch


SEO case study: How Venngage turned search into their primary lead source

April 27, 2019

Venngage is a free infographic maker that has catered to more than 21,000 businesses. In this article, we explore how they grew their organic traffic from about 275,000 visitors per month in November 2017 to about 900,000 today — more than tripling in 17 months.

I spoke with Nadya Khoja, Chief Growth Officer at Venngage, about their process.

Venngage gets most of their leads from content and organic search. The percentage varies from month to month in the range of 58% to 65%.

In November 2017, Venngage enjoyed 275,000 visitors a month from organic search. Today (17 months later) it’s 900,000. Nadya extrapolated from their current trend that by December 2019 (in nine months) they will enjoy three million organic search visitors per month.

[Screenshot: Venngage’s traffic statistics]

In 2015, when Nadya started with Venngage, they saw 300 to 400 registrations a week. By March of 2018, this was up to 25,000 a week. Today it’s 45,000.

While Nadya had the advantage of not starting from zero, that is impressive growth by any reasonable metric. How did they do it?

Recipe

There are a lot of pieces to this puzzle. I’ll do my best to explain them, and how they tie together. There is no correct order to things per se, so what is below is my perspective on how best to tell this story.

The single most important ingredient: Hypothesize, test, analyze, adjust

This critical ingredient is surprisingly not an ingredient, but rather a methodology. I’m tempted to call it “the scientific method”, as that’s an accurate description, but perhaps it’s more accurate to call it the methodology written up in the books “The Lean Startup” (which Nadya has read) and “Running Lean” (which Nadya has not read).

This single most important ingredient is the methodology of hypothesize, test, analyze, and adjust.

What got them to this methodology was a desire to de-risk SEO.

The growth in traffic and leads was managed through a series of small and quick iterations, each one of which either passed or failed. Ones that passed were done more. Ones that failed were abandoned.

This concept of hypothesizing, testing, analyzing, and adjusting is used both for SEO changes and for changes to their products.

The second most important ingredient

This ingredient is shared knowledge. Venngage marketing developed “The Playbook”, which everyone in marketing contributes to. “The Playbook” was created both as a reference with which to bring new team members up to speed quickly, as well as a running history of what has been tested and how it went.

The importance of these first two ingredients cannot be overstated. From here on, I am revealing things they learned through trial and error. You have the advantage of learning from their successes and failures. They figured this stuff out the hard way, one hypothesis and one test at a time.

Their north star metrics

They have two north star metrics. The first one seems fairly obvious. “How many infographics are completed within a given time period?” The second one occurred to them later and is as important, if not more so. It is “how long does it take to complete an infographic?”

The first metric, of course, tells them how attractive their product is. The second tells them how easy (or hard) their product is to use.

Together these are the primary metrics that drive everything Venngage does.

The 50/50 focus split

As a result of both the company and the marketing department having a focus on customer acquisition and customer retention, every person in marketing spends half their time working on improving the first north star metric and the other half working on improving the second.

Marketing driving product design

Those north star metrics have led to Venngage developing what I call marketing driven product design. Everywhere I ever worked has claimed they did this. The way Venngage does this exceeds anything ever done at a company I’ve worked for.

“How do I be good?”

This part of Nadya’s story reminds me of the start of a promo video I once saw for MasterClass.com. It’s such a good segue to this part of the story that I cropped out all but the good part to include in this article.

[Video: When Steve Martin shed light on an important marketing question]

I’ve encountered a number of companies through the years who thought of marketing as “generating leads” and “selling it”, rather than “how do we learn what our customers want?”, or “how do we make our product easier to use?”

Squads

The company is structured into cross-functional squads, a cross-functional squad being people from various departments within Venngage, all working to improve a company-wide metric.

For example, one of the aspects of their infographic product is templates. A template is a starting point for building an infographic.

As templates are their largest customer acquisition channel, they created a “Template Squad”, whose job is to work on their two north star metrics for their templates.

The squad consists of developers, designers, UI/UX people, and the squad leader, who is someone from marketing. Personally, I love this setup, as it spreads marketing beyond a single department and makes it something that permeates everything the company does.

There is another squad devoted to internationalization, which, as you can infer, is responsible for improving the two north star metrics for users in countries around the world.

Iterative development

Each template squad member is tasked with improving their two north star metrics.

Ideas on how to do this come from squad members with various backgrounds and ideas.

Each idea is translated into a testable hypothesis. Modifications are made weekly. As you can imagine, Venngage is heavy into analytics, as without detailed and sophisticated analytics, they wouldn’t know which experiments worked and which didn’t.

Examples of ideas that worked are:

  • Break up the templates page into a series of pages, which contain either category of templates or single templates.
  • Ensure each template page contains SEO keywords specific for the appropriate industry or audience segment. This is described in more detail further in this document.
  • Undo the forced backlink each of the embedded templates used to contain.
    • This allowed them to get initial traction, but it later resulted in a Google penalty.
    • This is a prime example of an SEO tactic that worked until it didn’t.
  • Create an SEO checklist for all template pages with a focus on technical SEO.
    • This eliminated human error from the process.
  • Eliminate “React headers” Google was not indexing.
  • Determine what infographic templates and features people don’t use and eliminate them.

Measuring inputs

I personally think this is really important. To obtain outputs, they measured inputs. When the goal was to increase registrations, they identified the things they had to do to increase registrations, then measured how much of that they did every week.

Everyone does SEO

In the same way that marketing is something that does not stand alone, but rather permeates everything Venngage does, SEO does not stand alone. It permeates everything marketing does. Since organic search traffic is the number one source of leads, they ensure everyone in marketing knows the basics of technical SEO and understands the importance of this never being neglected.

Beliefs and values

While I understand the importance of beliefs and values in human psychology, it was refreshing to see this being proactively addressed within an organization in the context of improving their north star metrics.

They win and lose together

Winning and losing together is a core belief at Venngage. Nadya states it minimizes blame and finger-pointing. When they win, they all win. When they lose, they all lose. It doesn’t matter who played what part. To use a sports analogy, a good assist helps to score a goal. A bad assist, well, that’s an opportunity to learn.

SEO is a team effort

While it is technically possible for a single person to do SEO, the volume of tasks required these days makes it impractical. SEO requires quality content, technical SEO, and the building of backlinks through content promotion, guest posting, and other tactics. Venngage is a great example of effectively distributing SEO responsibilities through the marketing department.

To illustrate the importance of the various pieces fitting together, consider that while content is king, technical SEO is what gets content found, but when people find crappy content, it doesn’t convert.

You can’t manage what you don’t measure

This requires no elaboration.

But what you measure matters

This probably does justify some elaboration. We’ve all been in organizations that measured stupid stuff. By narrowing down to their two north star metrics, then focusing their efforts on improving those metrics, they’ve aligned everyone’s activity towards things that matter.

The magic of incremental improvements

This is the Japanese concept of Kaizen put into play for the development and marketing of a software product.

Done slightly differently, this concept helped Britain dominate competitive cycling at the 2008 Olympics in Beijing.

Customer acquisition is not enough

Venngage developed their second north star metric after deciding that acquiring new customers was not, in and of itself, any form of the Holy Grail. They realized that if their product was hard to use, fewer people would use it.

They decided a good general metric of how easy the product is to use was to measure how long people take to build an infographic. If people took “too long”, they spoke to them about why.

This led them to change the product in ways to make it easier to use.

Link building is relationship building

As a reader of Search Engine Watch, you know link building is critical and central to SEO. In the same way that everyone in Venngage marketing must know the basics of technical SEO, everyone in Venngage marketing must build links.

They do so via outreach to promote their content. As people earn links from the content promotion outreach, they record those links in a shared spreadsheet.

While this next bit is related to link building, everyone in Venngage marketing has traffic goals as well.

This too is tracked in a simple and reasonable way. Various marketers own different “areas” or “channels”. These channels are broken down into specific traffic acquisition metrics.

As new hires get more familiar with how things work at Venngage, they are guided into traffic acquisition channels which they want to work on.

Learning experience, over time

My attempt here is to provide a chronology of what they learned in what order. It may help you avoid some of the mistakes they made.

Cheating works until it doesn’t

Understanding the importance of links to search ranking, they thought it would be a good idea to implement their infographics with embedded backlinks. Each implemented infographic contained a forced backlink to the Venngage website.

They identified a set of anchor text they thought would be beneficial to them and rotated through them for these forced backlinks.

And it worked, for a while. Until they realized they had invited a Google penalty. This took a bit to clean up.

The lessons learned:

  • The quality of your backlinks matter.
  • To attract quality backlinks, publish quality content.

Blog posts brought in users who activated

At some point, their analytics helped them realize that users who activated from blog posts were ideal users for them. So they set a goal to increase activations from blog posts, which led to the decision to test whether breaking up templates into category pages and individual pages with only one template made sense. It did.

Website design matters

Changing the website from one big template page to thousands of smaller ones helped, and not just because it greatly increased the number of URLs indexed by Google. It also greatly improved the user experience. It made it easier for their audience to find templates relevant to them, without having to look at templates that weren’t.

Lesson learned: UI/UX matters for both users and SEO.

Hybrid content attracts

Hybrid content is where an article talks about two main things. For example, talking about Hogwarts houses sorting within the context of an infographic. This type of content brings in some number of Harry Potter fans, some of whom have an interest in creating infographics. The key to success is tying these two different topics together well.

Content is tuneable

By converting one huge templates page into thousands of small template pages, they realized that a template or set of templates that appeal to one audience segment would not necessarily appeal to others. This caused them to start to tune templates towards audience segments in pursuit of more long tail organic search traffic.

How did they figure out what users wanted in terms of better content? They used a combination of keyword research and talking with users and prospects.

Some content doesn’t make the cut

After they caught onto the benefits of tuning content to attract different audience segments, they looked for content on their site that no one seemed to care about. They deleted it. While it decreased the amount of content on their site, it increased their overall content quality.

Traffic spikes are not always good news

When they first started creating forced backlinks in their infographics, they saw their traffic increase, including some spikes. Their general assumption was that more traffic is good.

When they experienced the Google penalty, they realized how wrong they were. Some traffic spikes are bad news. Others are good news.

When your website traffic shows a sudden change, even if you’re experiencing a spike in organic search traffic, you must dig into the details and find out the root cause.

Lesson learned: There is such a thing as bad traffic. Some traffic warns you of a problem.
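
A simple way to make that digging systematic is to flag days that deviate sharply from recent history. A minimal sketch, using a crude z-score over a trailing window:

```python
import statistics

def flag_anomalies(daily_sessions, window=28, threshold=3.0):
    """Flag days whose traffic deviates sharply from the trailing window.

    A spike OR a drop beyond `threshold` standard deviations is worth
    a manual look at its sources before celebrating or panicking.
    """
    flagged = []
    for i in range(window, len(daily_sessions)):
        history = daily_sessions[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1.0
        z = (daily_sessions[i] - mean) / stdev
        if abs(z) > threshold:
            flagged.append((i, daily_sessions[i], round(z, 1)))
    return flagged

# Example: a sudden spike on the final day gets flagged for review.
traffic = [1000, 1020, 980, 1010, 995] * 6 + [2400]
print(flag_anomalies(traffic))
```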

Links from product embeds aren’t all bad

They just needed to make the embedded links optional, letting each customer decide whether or not to include the backlink. While this did not change their organic search traffic, it was necessary to resolve the Google penalty.
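
What "optional" might look like in practice: the embed code offers attribution, and the publisher chooses. A hypothetical sketch (this is illustrative markup, not Venngage's actual embed code):

```python
from html import escape

def build_embed_code(infographic_url, title, include_attribution=False):
    snippet = (
        f'<img src="{escape(infographic_url)}" alt="{escape(title)}">'
    )
    if include_attribution:
        # Attribution is opt-in: the publisher decides whether to credit you.
        snippet += '\n<p>Made with <a href="https://venngage.com">Venngage</a></p>'
    return snippet

print(build_embed_code("https://example.com/infographic.png",
                       "Hogwarts Houses, Sorted",
                       include_attribution=True))
```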

Boring works

Incremental continuous improvement seems repetitive and boring. A one percent tweak here, a two percent tweak there, but over time, you’ve tripled your organic search traffic and your lead flow.

It’s necessarily fun, but it delivers results.

Lesson learned: What I’ll call “infrastructure” is boring, and it matters. Both for your product and your SEO.
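
The arithmetic behind boring is worth spelling out, because small percentages compound:

```python
import math

# How many 1% improvements does it take to triple a metric?
# (1.01)^n = 3  =>  n = ln(3) / ln(1.01)
n = math.log(3) / math.log(1.01)
print(round(n))  # ~110 iterations

# One small tweak a week for a little over two years triples your traffic.
traffic = 1.0
for week in range(111):
    traffic *= 1.01
print(f"{traffic:.2f}x")  # ~3.02x
```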

Figure out what to measure

The idea of measuring the amount of time required to complete an infographic did not occur to them on day one. It came up when they were looking for a metric that would indicate how easy (or difficult) their product was to use.

Once they decided this metric made sense, they established a baseline, then iteratively made product improvements that shaved that time down, little by little.

As they did so, the feedback from the users was positive, so they doubled down on this effort.

Lesson learned: What you measure matters.
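
If you want to adopt a similar metric, the measurement itself can stay simple. A minimal sketch, assuming you log the seconds from start to publish for each completed infographic (the numbers below are made up):

```python
import statistics

# Hypothetical event log: seconds from "started infographic" to
# "published infographic", one value per completed session.
completion_times = [780, 920, 640, 1150, 835, 700, 990, 870, 760, 1040]

baseline = {
    "median": statistics.median(completion_times),
    "p90": statistics.quantiles(completion_times, n=10)[-1],
}
print(baseline)

# Re-run after each product tweak: a falling median (and p90) suggests
# the product is genuinely getting easier to use, not just faster for
# your most experienced users.
```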

Teach your coworkers well

They created “The Playbook”, which is a compendium of the combined knowledge they’ve accumulated over time. The playbook is written by them, for them.

Marketing employees are required to add chapters to the playbook as they learn new skills and methods.

Its primary purpose is to bring new team members up to speed quickly, and it also serves as a historical record of what did and did not work.

An important part of continuous improvement is that new people don't re-suggest experiments that have already failed.

Additionally (and I love this), every month everyone in marketing gives Nadya an outline of what they’re learning and what they’re improving on.

Their marketing stack

While their marketing stack is not essential to understanding their processes, I find it useful to understand what software tools a marketing organization uses, and for what. So here is theirs. This is not a list of what they’ve used and abandoned over time, but rather a list of what they use now.

  • Analytics: Google Analytics and Mixpanel
  • Customer communications: Intercom
  • Link analysis and building: Ahrefs
  • Link building outreach: Mailshake
  • Project management: Trello
  • General purpose: G Suite

In closing

To me, what Nadya has done at Venngage is a case study in how to do SEO right, and most of doing it right is not technical SEO work.

  • Help senior management understand that some things that are not typically thought of as SEO (website design for example) can have serious SEO implications.
  • Get senior management buy-in to include these non-SEO functions in your SEO efforts.
  • Understand which few basic metrics matter for your company, and how to measure them.
  • Distribute required SEO work through as many people as reasonably possible. Include people whose job functions are not necessarily SEO related (writers, designers, UI/UX, and more).
  • Test and measure everything.
  • Win big through a continuous stream of small incremental improvements.

Venngage has led by example, and the guidelines and pointers shared above can help your organization turn search into a source of increased sales.

Kevin Carney is the Founder and CEO of the boutique link building agency Organic Growth. 

The post SEO case study: How Venngage turned search into their primary lead source appeared first on Search Engine Watch.

Search Engine Watch


How to optimize paid search ads for phone calls

April 16, 2019 No Comments

There has been an abundance of hand-wringing articles wondering whether the era of the phone call is over, not to mention speculation that millennials would give up the option to make a phone call altogether if it meant unlimited data.

But actually, the rise of direct dialing through voice assistants and click to call buttons for mobile search means that calls are now totally intertwined with online activity.

Calling versus buying online is no longer an either/or proposition. When it comes to complicated purchases like insurance, healthcare, and mortgages, the need for human help is even more pronounced. Over half of consumers prefer to talk to an agent on the phone in these high-stakes situations.

In fact, 70% of consumers have used a click to call button. And three times as many people prefer speaking with a live human over a tedious web form. And calls aren’t just great for consumers either. A recent study by Invoca found that calls actually convert at ten times the rate of clicks.

However, if you’re finding that your business line isn’t ringing quite as often as you’d like it to, here are some surefire ways to optimize your search ads to drive more high-value phone calls.  

Content produced in collaboration with Invoca.

Four ways to optimize your paid search ads for more phone calls

  1. Let your audience know you’re ready to take their call — and that a real person will answer

If you’re waiting for the phone to ring, make sure your audiences know that you’re ready to take their call. In the days of landlines, if customers wanted a service, they simply took out the yellow pages and thumbed through the business listings until they found the service they were looking for. These days, your audience is much more likely to find you online, either through search engines or social media. But that doesn’t mean they aren’t looking for a human to answer their questions.

If you’re hoping to drive more calls, make sure your ads are getting that idea across clearly and directly. For example, if your business offers free estimates, make sure that message is prominent in the ad with impossible-to-miss text reading, “For a free estimate, call now,” with easy access to your number.

And to make sure customers stay on the line, let them know their call will be answered by a human rather than a robot reciting an endless list of options.

  2. Cater to the more than half of users who will likely be on mobile

If your customer found your landing page via search, odds are they're on a mobile device.

While mobile accounted for just 27% of organic search engine visits in Q3 of 2013, its share increased to 57% as of Q4 2018.

(Chart: Mobile share of organic search engine visits in the United States, Q3 2013 to Q4 2018. Source: Statista)

That’s great news for businesses looking to boost calls, since mobile users obviously already have their phone in hand. However, forcing users to dig up a pen in order to write down your business number only to put it back into their phone adds an unnecessary extra step that could make some users think twice about calling.  

Instead, make sure mobile landing pages offer a click to call button that lists your number in big, bold text. Usually, the best place for a click to call button is in the header of the page, near your form, but it’s best practice to A/B test button location and page layouts a few different ways in order to make sure your click to call button can’t be overlooked.
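
Under the hood, a click to call button is just a link with a tel: URL. As for the A/B testing, here's a minimal sketch of how you might compare two button placements once each variant has collected traffic; the counts are invented, and a two-proportion z-test stands in for whatever your testing tool reports:

```python
from math import sqrt

# Hypothetical A/B results: calls started per visitor for two
# click-to-call button placements (numbers are made up).
header = {"visitors": 4800, "calls": 312}   # variant A: button in header
inline = {"visitors": 4750, "calls": 251}   # variant B: button mid-page

p1 = header["calls"] / header["visitors"]
p2 = inline["calls"] / inline["visitors"]

# Two-proportion z-test: is the difference bigger than chance?
pooled = (header["calls"] + inline["calls"]) / (header["visitors"] + inline["visitors"])
se = sqrt(pooled * (1 - pooled) * (1 / header["visitors"] + 1 / inline["visitors"]))
z = (p1 - p2) / se
print(f"header: {p1:.2%}, inline: {p2:.2%}, z = {z:.2f}")
# |z| > 1.96 is roughly 95% confidence that the placements differ.
```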

  3. Use location-specific targeting

Since 2014, local search queries from mobile have skyrocketed in volume as compared to desktop.

(Chart: Local search query volume in the United States from 2014 to 2019, by platform, in billions. Source: Statista)

In 2014, there were 66.5 billion search queries from mobile and 65.6 billion search queries from desktop.

Now in 2019, desktop has decreased slightly to 62.3 billion, while mobile has shot up to 141.9 billion, more than doubling in five years.

Mobile search is by nature local, and vice versa. If your customer is searching for businesses hoping to make a call and speak to a representative, chances are, they need some sort of local services. For example, if your car breaks down, you’ll probably search for local auto shops, click a few ads, and make a couple of calls. It would be incredibly frustrating if each of those calls ended up being to a business in another state.

Targeting your audience by region can ensure that you offer customers the most relevant information possible.

If your business only serves customers in Kansas, you definitely don’t want to waste perfectly good ad spend drumming up calls from California.

If you’re using Google Ads, make sure you set the location you want to target. That way, you can then modify your bids to make sure your call-focused ads appear in those regions.  

  4. Track calls made from ads and landing pages

Keeping up with where your calls are coming from in the physical world is important, but tracking where they’re coming from on the web is just as critical. Understanding which of your calls are coming from ads as well as which are coming from landing pages is an important part of optimizing paid search. Using a call tracking and analytics solution alongside Google Ads can help give a more complete picture of your call data.

And the more information you can track, the better. At a minimum, you should make sure your analytics solution captures data around the keyword, campaign/ad group, and the landing page that led to the call. But solutions like Invoca also allow you to capture demographic details, previous engagement history, and the call outcome to offer a total picture of not just your audience, but your ad performance.
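
Many call tracking platforms can also push call events to your own systems via webhooks. Purely as an illustration, here's a minimal receiver that logs those attribution fields; the payload field names are hypothetical, not Invoca's actual schema:

```python
from flask import Flask, request
import csv

app = Flask(__name__)

@app.route("/call-events", methods=["POST"])
def call_event():
    # Hypothetical payload fields; check your provider's docs for
    # the real schema.
    event = request.get_json()
    with open("call_log.csv", "a", newline="") as f:
        csv.writer(f).writerow([
            event.get("keyword"),
            event.get("campaign"),
            event.get("ad_group"),
            event.get("landing_page"),
            event.get("call_outcome"),
        ])
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```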

For more information on how to use paid search to drive calls, check out Invoca’s white paper, “11 Paid Search Tactics That Drive Quality Inbound Calls.”

The post How to optimize paid search ads for phone calls appeared first on Search Engine Watch.

Search Engine Watch