
Google’s Duo video chat app gets a family mode with doodles and masks

May 9, 2020

Google today launched an update to its Duo video chat app (which you definitely shouldn’t confuse with Hangouts or Google Meet, Google’s other video, audio and text chat apps).

There are plenty of jokes to be made about Google’s plethora of chat options, but Duo is trying to be a bit different from Hangouts and Meet: it’s mobile-first and puts the emphasis on personal conversations. In its early days, it was very much about one-on-one conversations only (hence its name), but that has obviously changed (hence why Google will surely change its name sooner or later). This update underscores that emphasis with the addition of what the company calls a “family mode.”

Once you activate this mode, you can start doodling on the screen, activate a number of new effects and virtually dress up with new masks. These effects and masks are now also available for one-on-one calls.

For Mother’s Day, Google is rolling out a special new effect that is sufficiently disturbing to make sure your mother will never want to use Duo again and immediately make her want to switch to Google Meet instead.

Only last month, Duo increased the maximum number of chat participants to 12 on Android and iOS. In the next few weeks, it’s also bringing this feature to the browser, where it will work for anyone with a Google account.

Google also launched a new ad for Duo. It’s what happens when marketers work from home.

Mobile – TechCrunch


How to improve your SEO after Google’s spot-zero-termination

March 17, 2020

On January 22, Google announced a change to the world of search engine optimization: the so-called “spot zero” granted to featured snippets in search engine results pages (SERPs) was terminated.

In other words, the URL of a featured snippet now appears a single time in the SERP, instead of appearing both in the snippet and at its organic position lower in the results.

The change is already having a significant impact on the number of clicks that featured snippets receive.

Digital marketing agency 97th Floor conducted a study looking at almost 3,000 high-volume SERPs that were affected by the spot-zero termination.

They’ve written a whitepaper to share practical tips on how to prioritize your SEO needs after Google’s update, and have also provided a free STAR (Situation, Task, Action, Results) reporting template to help you strategize.

Here is how to get started.

Content produced in collaboration with 97th Floor.

1. Highlight the date of the change in your analytics platform

The first step is to make sure that you mark the date of January 22 in your analytics platform. It’s the easiest way to keep track of all the changes that happen after Google’s update.

You can also mark the date in other SEO tools that you’re using to track your success with keywords and traffic to your site.

2. Exclude featured snippets from your new keyword research

According to 97th Floor’s research, there was a significant drop in the number of clicks on featured snippets after the spot-zero termination. Thus, make sure you run new keyword research to explore new opportunities. This time, filter out featured snippets and “people also ask” boxes to find keywords that will earn more clicks.

3. Re-optimize your URLs to become featured snippets

97th Floor noticed that it has become easier for URLs on the first page of SERPs to turn into featured snippets. Until now, most featured snippets came from spots 1-3, but there is now a shift toward including more URLs from across the first page.

This means that you might not necessarily need to aim for spot 1 or 2 to land a featured snippet. Spend some time optimizing the URLs that hold lower-position keywords to increase your chances of success.

4. Review your traffic coming from featured snippets

Google’s update has hit the popular snippets that used to attract a high volume of site clicks, so landing a featured snippet is no longer necessarily worth aiming for.

If you’ve noticed a drastic decline in your number of clicks, explore the idea of opting out of featured snippets and aiming for spot two instead.

5. Review your title tags

Now that clicks are reduced, it’s more important than ever to work on your title tags. Aim for text that is more “clickable” without being misleading.

For example, if your brand is not popular, you can leave out its name from the title to focus on the content that will make your URL more clickable.

Double the time you spend on optimization and on reviewing your SEO results.

6. Improve your meta descriptions

As with title tags, it’s crucial to pay attention to your meta descriptions to make your links more appealing.

It’s the best time to review your meta descriptions and explore how they affect clicks to your site.
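As a minimal illustration (the page topic and wording here are hypothetical), a title tag and meta description tuned for clicks might look like this:

```html
<!-- Hypothetical example: brand name omitted so the content itself earns the click -->
<title>How to Win Back Clicks After Google’s Spot-Zero Change</title>
<meta name="description" content="Practical steps for recovering traffic lost to featured snippet changes, with free templates and examples.">
```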

7. Review your structured data mark-up

Featured snippets aren’t the only way to highlight your links. The structured data associated with your URLs can also help you boost your performance in SERPs.

There are many mark-ups for your URLs, and they vary based on the content (a sample is shown after this list):

  • Customer reviews
  • Event details
  • Product pricing
  • Recipe information
  • Local business information
  • Single Images or Carousels
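For instance, a product page could describe its pricing and review data with JSON-LD; this is a minimal sketch with made-up values, not markup taken from the whitepaper:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Mouse",
  "image": "https://example.com/images/mouse.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "USD"
  }
}
</script>
```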

8. De-optimize the featured snippet if needed

The best way to ‘deactivate’ the featured snippet for your link is to add the “data-nosnippet” attribute to the HTML of any page you want to de-optimize.

It’s safer to use this code instead of changing the copy as this could potentially affect your ranking.
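For example (hypothetical page copy), the attribute wraps only the text you want excluded from snippets:

```html
<p>
  This sentence may still appear in a snippet.
  <span data-nosnippet>This sentence is excluded from Google’s snippets.</span>
</p>
```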

9. Communicate the changes

Don’t forget to update your boss or client about the latest changes. Communication can make your job easier both in the short and long term.

Educating your clients can also help you get buy-in for long-term action plans that are clear on the implications of any changes coming from Google.

Start by presenting the situation, how you’re going to address it and the next actions.

This article only provides a flavor of the actionable recommendations in 97th Floor’s whitepaper. Download “The 10 Actions SEOs Need to Take Following Google’s Spot-Zero-Termination” for a more in-depth view, as well as their free STAR reporting template.

The post How to improve your SEO after Google’s spot-zero-termination appeared first on Search Engine Watch.



Four tips to help your brand thrive despite Google’s notification changes

March 5, 2020

The browser Google Chrome (v80) is following in the footsteps of Mozilla Firefox (v72) and Apple Safari (v12.1) on notifications: websites that ask for opt-in immediately will now only be able to use quiet notification prompts.

These prompts are far less visible than the standard prompts that show up below the address bar. What’s more, in Chrome, users can now receive all opt-in requests quietly if they choose.

Many brands—retailers and publishers, in particular—have experienced tremendous success with web notifications. For instance, Asda’s George.com gets an astonishing 40% conversion rate with notifications on abandoned carts and a 27% clickthrough rate on segmented alerts.

While web browsers give users more control, brands must adapt. Here are five ways of dealing with these changes:

1. Be clear about the benefits of opting-in 

What value does your website messaging offer? Will subscribers get exclusive content or offers, or get alerts when their product ships? It’s key to highlight such value in a soft prompt before triggering the browser’s actual notification prompt.

2. Provide granular preferences

Offer visitors a preference center for them to customize settings to receive only notifications they truly want. For instance, a merchant may offer notifications for daily flash sales, weekly specials, new product arrivals and/or transaction updates. More control over notifications equals more customer happiness.

3. Don’t rush the “ask”

Like needy people, needy brands are a turn-off. Consider waiting until visitors have taken an action that signals interest before asking them to opt in. Have they looked at a promotion, watched a video or searched for a specific product? Pinpoint the moment when asking for the opt-in will streamline the customer journey instead of stalling it.
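A minimal sketch of this pattern: show your own soft prompt first, and trigger the browser’s native prompt only after the visitor clicks it. The button ID is hypothetical, but Notification.requestPermission() is the standard browser API:

```js
// Hypothetical soft-prompt button; the native browser prompt fires only on click.
document.getElementById('soft-prompt-yes').addEventListener('click', async () => {
  const permission = await Notification.requestPermission();
  if (permission === 'granted') {
    new Notification("You're in! We'll alert you when wish list items go on sale.");
  }
});
```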

4. Test various flows 

Web opt-ins are often the largest addressable audiences for brands, hence marketers don’t want to wait too long before making the ask. You should continuously A/B test your opt-in prompts, including timing, language, and offers. While browsers will judge your site by opt-in rate, brands should be focused on better long-term engagement, more conversions, higher frequency, and greater lifetime value.

5. Reward opens 

Last but not least, notifications have become central to the customer experience for both apps and mobile platforms, which explains why the opt-in rate for apps exceeds 50%. Website marketers should reward customers for notification engagement. For example, they can offer double loyalty point days, early access to the biggest deals or notifications when wish list items go on sale.

Mike Stone is the SVP of marketing at Airship.

The post Four tips to help your brand thrive despite Google’s notification changes appeared first on Search Engine Watch.



A look at performance post Google’s average position sunset: Top vs side

February 25, 2020

Average position was retired as a metric at the end of September. This is a big change: for years, clients, agencies, and advertisers of all kinds have practiced at least a little vanity management. By that I mean everyone, at some point, has submitted a bid with the sole goal of being “number one” rather than hitting any actual business metric.

This change was implemented to acknowledge that average position is not meaningful in a world of personalized search. Stopping vanity bidding is just a beneficial side effect. I wanted to take a look at some data, specifically CPC and CTR, to see how performance varies between top and side positions. I also wanted to look at how these metrics vary on Google.com vs. search partners. What I found were some very interesting insights that might affect how you think about your campaigns.

When it comes to the differences between Google and its partners, and top vs. other placements, the key findings are:

  • Google top vs. other has the biggest differences when it comes to CTR. The data showed a >900% increase in CTR across desktop, mobile, and tablet. This was the highest delta across the entire data set, except for partner top vs. other, which was nearly 4x the difference.
  • Mobile for Google vs. the partners was also a significant difference, at 918%. This was noticeable because the desktop variance was only 30% (basically a tie). The importance of mobile can’t be overstated.

CTR differences after average position sunset

The variances were really noticeable when it comes to cost per click. The drop-off between Google and partners was at least 100% and as high as 268%. The differences are driven primarily by demand: many advertisers do not participate in the partner network, so demand is lower and the cost per click falls accordingly. If the conversion rates are right, this is where you could pick up some additional scale. Comparing top vs. other within Google and within the partner network shows a much smaller delta, which just highlights the demand point above. The difference on mobile was only 13%: demand is so high, and ad slots so few, on mobile that the gap between top and side was the smallest of any data set reviewed.

CPC differences after average position sunset

While the CPCs weren’t that different, the CTRs for Google mobile top placements were significantly higher than for search partners’ top placements. I thought it was worth showing the actual data to illustrate the differences between mobile and desktop. The drop-off from mobile top is very steep, indicating a different search experience and relevance. At the “Other” positions, the differences are very small and the CTRs much lower.

CTR actuals at other positions after average position sunset

What action should you take based on this data?

1. Don’t manage to these metrics – Optimize them

Ultimately, you shouldn’t really care what the CPC is or what your CTR is. The goal is hitting your KPIs. If you pay $100 per click but convert 100% of the clicks, it’s no different from paying $20 per click with a 20% conversion rate. That’s not to say you shouldn’t optimize to improve; you should. I’m just suggesting that metrics like top vs. side CTR are simply indicators of how you can improve. These are not your true KPIs.
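A quick sketch of that arithmetic, using the numbers from the example above:

```js
// Cost per acquisition = cost per click / conversion rate
const cpa = (cpc, conversionRate) => cpc / conversionRate;

console.log(cpa(100, 1.0)); // 100 -> $100 per conversion
console.log(cpa(20, 0.2));  // 100 -> the same $100 per conversion
```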

2. Understand the value the search partner network brings your campaign

The search network provides scale to your campaigns, and a revenue stream to Google. That doesn’t mean you need or require that scale in every case. If you are struggling to perform, break down your traffic by Google vs. the partner network. Look not only at CTR and CPC data, but also at conversion rates. What would happen to your volume and your cost per acquisition if you cut off the search partner network? Does this additional scale provide your business value, or would the budget be better spent in other areas that perform better? This isn’t a one-size-fits-all answer. You need to do the work, and the result might differ by campaign or even keyword.

Note: The stats and observations shared by the author have been derived from BrandMuscle’s anonymized client data.
Feel free to share your observations in the comments section.

The post A look at performance post Google’s average position sunset: Top vs side appeared first on Search Engine Watch.



How to make the most of Google’s “People also ask” results

February 21, 2020

Google’s “People also ask” boxes are widely discussed within the SEO industry, as they take up a lot of SERP real estate while providing little to no organic visibility to publishers’ sites.

That said, “People also ask” listings are probably helpful for Google’s users, allowing them to get a better understanding of a topic they are researching. Yet whether they send actual clicks to publishers’ pages remains a huge question.

While we have no power over Google’s search engine page elements, our job as digital marketers is to find ways to take any opportunity to boost our clients’ organic visibility.

Is there any way for marketers to utilize this search feature better? Let’s see.

1. Understand your target query intent better

One of the cooler aspects of “People also ask” boxes is that they are dynamic.

When you click one question, it will take you in a new direction by generating more follow-up questions underneath. Each time you choose, you get more to choose from.

The coolest thing though is that the further questions are different (in topic, direction or intent) based on which question you choose.

Let me explain this by showing you an example. Let’s search for something like – “Is wine good for your blood?”

Now try clicking one of those questions in the box, for example, “What are the benefits of drinking red wine?” and watch more follow-up questions show up. Next, click a different question “Is red wine good for your heart and blood pressure?”. Do you see the difference?

Understanding search intent through Google's people also ask

 

Source: Screenshot made by the author, as of Feb 2020

Now, while this exercise may seem rather insignificant to some people, to me, it is pretty mind-blowing as it shows us what Google may know of their users’ research patterns and what may interest them further, depending on their next step.

To give you a bit of context, Google seems to rely on semantic analysis when figuring out which questions best fit each searcher’s needs. Bill Slawski did a solid job covering a related patent called “Generating related questions for search queries”, which also states that those related questions rely on search intent:

Providing related questions to users can help users who are using uncommon keywords or terminology in their search query to identify keywords or terms that are more commonly used to describe their intent.

Google patent on generating related questions for search queries

Source: Google patent

For a deeper insight into the variety of questions and the types of intent they may signal, try Text Optimizer. The tool extracts questions using a process similar to Google’s. For example, here are intent-based questions on the topic of bitcoin.

Finding intent based questions for people also ask using Text Optimizer

 

Source: TextOptimizer’s search screenshot, as of Jan 2020

2. Identify important searching patterns

This one somewhat relates to the previous one but it serves a more practical goal, beyond understanding your audience and topic better. If you search Google for your target query enough, you will soon start seeing certain searching patterns.

For example, lots of city-related “People also ask” boxes will contain questions concerning the city’s safety, whether it is a good place to live, and what it is famous for:

Finding important search patterns through Google's people also ask

Identifying these searching patterns is crucial when you want to:

  • Identify your cornerstone content
  • Re-structure your site or an individual landing page
  • Re-think your site navigation (both desktop and mobile)
  • Create a logical breadcrumb navigation (more on this here)
  • Consolidate your multiple pages into categories and taxonomies

3. Create on-page FAQs

Knowing your target users’ struggles can help in creating a really helpful FAQ section that can diversify your rankings and help bring steady traffic.

All you need to do is to collect your relevant “People also ask” results, organize them in sections (based on your identified intent/searching patterns) and answer all those questions on your dedicated FAQ page.

When working on the FAQ page, don’t forget to:

  • Use FAQPage schema to generate rich snippets in Google search (WordPress users can take advantage of this plugin; a sample markup sketch appears at the end of this section). If you have a lot of questions in your niche, it is a good idea to build a standalone knowledge base to address them. Here are all the plugins for the job.
  • Set up engagement funnels to keep those readers interacting with your site and ultimately turn them into customers. Finteza is a solid option to use here, as it lets you serve custom CTAs based on the users’ referral source and landing page that brought them to your site:

Screenshot on Finteza

 

Source: Screenshot by Finteza, as of July 2019
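Returning to the schema point above, here is a minimal FAQPage sketch (the question and answer are hypothetical) that you could adapt for your own FAQ page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Is red wine good for your heart?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Some studies associate moderate consumption with heart health, but the evidence is mixed."
      }
    }
  ]
}
</script>
```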

4. Identify your competitor’s struggles

If you have an established competitor with a strong brand, their branded queries and consequent “People also ask” results will give you lots of insight into what kinds of struggles their customers are facing (and how to serve them better).

When it comes to branded “People also ask” results, you may want to organize them based on possible search intent:

  • ROPO questions: These customers are researching a product before making a purchasing decision.
  • High-intent questions: Customers are closest to a sale. These are usually price-related queries, for example, those that contain the word “reviews”.
  • Navigational questions: Customers are lost on your competitor’s site and need some help navigating. These queries can highlight usability issues for you to avoid when building your site.
  • Competitive questions: These queries compare two of your competitors.
  • Reputation questions: Those customers want to know more about your competitor’s company.

Identifying competitor challenges through people also ask

Source: A screenshot made by the author in January 2020

This information helps you develop a better product and a better site than those of your competitors.

Conclusion

With the changes in search algorithms over the years, the dropping and adding of key search elements, and the evolution of Google’s SERPs, navigating digital marketing trends can seem almost treacherous.

Yet, at the core of things, not much has really shifted and much of what we do remains the same. In fact, some of those changes have made it even easier to make an impact on the web than ever before. While we may welcome or frown upon each new change, there’s still some competitive advantage in each of them.

Our job, as digital marketers, is to distinguish that competitive advantage and make the most of it.

I hope the above ideas will help you use “People also ask” results to your advantage.

Ann Smarty is the Brand and Community manager at InternetMarketingNinjas.com.

The post How to make the most of Google’s “People also ask” results appeared first on Search Engine Watch.



The perils of tricking Google’s algorithm

February 9, 2020

Let’s admit it: we are all trying our best to please search engines (SE) and crack Google’s algorithm. After all, who doesn’t want some extra visibility and revenue?

Naturally, billions of websites are adopting innovative practices to gain Google’s attention and approval. To rank high on the SERP, businesses should comply with the Google updates that are introduced on a regular basis. But this in no way means finding loopholes in these search engine algorithms or adopting strategies to trick them. In fact, businesses employing such empty SEO tricks have to face the music later. Many firms have already experienced Google’s wrath in the past.

Google has been regularly introducing algorithm updates to improve the quality of its search results. But it also penalizes sites that employ unethical or outdated practices to rank higher. This can adversely impact a brand’s reputation and bottom line. Ideally, these updates should be used as a guide for improving a site’s UX, ranking on SERPs is an end result that will follow.

Read on to learn the ill effects of chasing Google’s algorithms. There’s also a bonus: some effective tips to stay on top of these updates while boosting your business reputation.

1. Google penalties

Google’s algorithm updates are designed to reward good content and to identify and penalize websites using unethical or outdated SEO practices. Google absolutely doesn’t approve of tactics like keyword stuffing, buying links, linking to penalized sites, and unnatural links. Algorithm updates such as Panda, Penguin, Pigeon, RankBrain, and Broad Core aim at improving the quality of search results for users.

Google webmaster guidelines

Source: Google Webmaster Guidelines

Thus, web developers, digital marketers, bloggers, and online businesses messing with these updates are penalized, sending their websites plummeting down the SERP.

Google can penalize such websites in two ways:

A. Algorithmic penalty

Several factors can cause your ranking to go down. Still, with the introduction of an update, there’s a fair chance that your website may be affected. This is especially true if your site doesn’t adhere to the specific parameters the update assesses.

For instance, Google Panda assigns a quality score to your site after checking for duplicate content and keyword stuffing. If your site has duplicate content, its ranking is bound to suffer.

Similarly, the latest January 2020 Core Update will be checking websites for authoritative and relevant content with a healthy E-A-T rating. So, if your website violates any of the guidelines shared by Google, it will automatically be penalized or filtered.

Make sure you check for issues in your domain on Google Search Console at regular intervals.

B. Manual penalty

This is a direct result of your website being penalized by a Google employee for not complying with the search quality guidelines. Manual penalties are Google’s way of punishing websites with spammy behavior. The Manual Actions Report on Search Console allows you to check such penalties, offering an opportunity to resolve them.

Check out this infographic by DigitalThirdCoast that shares an analysis of the businesses that tried to cheat Google along with the repercussions they had to face later.

2. Loss of reputation and credibility

Businesses obsessed with algorithm updates not only attract penalties but also lose focus on improving their site’s UX. Either way, the business loses its reputation and credibility. Lost reputation means an immediate loss of potential revenue, benefiting no one else but the competition.

Check out what John Mueller, Webmaster Trends Analyst at Google, has to say about cleaning up the mess after being hit by a Google penalty.

John Mueller's comment about Google penalties

Source: Reddit

Of course, there are ways to recover from Google penalties. But it takes a lot of effort to rebuild the business reputation and trustworthiness, let alone improving the firm’s online ranking and winning back the lost customers.

3. Marketing myopia

One of the gravest dangers of being preoccupied with Google algorithm updates is losing sight of the business vision and goals. Instead of focusing on the audience’s needs, the firm tends to adopt an inward-looking approach only to satisfy Google.

Google will forever introduce these updates. There’s no end to their journey towards improving the quality of search results. Google is clearly focused on its vision. Are you?

Don’t lose sight of your vision. Use Google’s algorithm updates as a guide to steer closer to your business goals.

What can you do to rank better on Google?

1. Don’t perennially chase Google updates

Google makes minor changes to its algorithm almost every other day. In 2019 alone, multiple updates were reported, though not all were confirmed, as Google is not always upfront about these updates.

List of Google's algorithm updates in 2019

Source: Moz

The sole objective of these updates is to create a better user experience. Merely chasing them and going all over the place with execution will not only land you with a penalty but also affect your reputation in the long term.

Stop obsessing about these updates and focus on making your website and content better each day.

2. Focus on delivering first-rate digital experience

Google’s algorithms are constantly judging and rating sites based on the quality of experience they offer and their E-A-T rating. In a nutshell, you need to prioritize these pointers.

A. Serve quality content

“Quality” seems to be a subjective term but not for Google. The search giant clearly states that the content on a website should be in-depth, relevant, useful, and from a credible source. Simply put, it asks us to create E-A-T worthy content.

This is especially true for the YMYL websites that affect an individual’s health, happiness, safety, or financial stability.

Google's page quality rating standards for YMYL websites

(Source: Google’s Search Quality Guidelines)

Ask yourself these three questions when creating a piece of content:

  • Is the content contributor an expert on the subject? (Expertise)
  • Is the content contributor an influencer or an authority in the domain? (Authority)
  • Is this content accurate and from a credible source? (Trustworthiness)

B. Work on your backlink profile

Backlinks are one of the top-ranking factors that help Google decide a website’s authority and credibility in its niche. Focus on getting quality backlinks from authority sites.

How?

Well, authoritative sites will award links to websites serving relevant, useful, and shareable content. Build authority by creating great content in various forms like videos, podcasts, case studies, infographics, and others.

You should also collaborate with experts for content-creation projects. For instance, expert roundups can not only strengthen your network with influential people in a niche but also provide solid content for your upcoming posts.

Tip to work on back links via roundup posts to rank well

(Source: https://www.rankwatch.com/future-of-seo.html)

Check out how RankWatch conducted an expert roundup involving 25 marketing experts like Rand Fishkin and Barry Adams to discuss the future of SEO. Such inbound link-building initiatives have earned the website a healthy number of backlinks from websites with healthy page authority (PA).

Here are the results as seen on MozBar.

Inbound link result on Mozbar

Source: Moz Analytics

C. Improve your site speed

A website’s bounce rate rises with its load time. Google recommends that pages load in under three seconds.

How page load time affects traffic

Source: Think with Google

If your website takes longer than three seconds to load, be prepared to wear Google’s “Badge of Shame”. You read that right: Google is planning to slap slow websites with this badge.

Google's badge of shame

Source: Chromium blog

Take effective steps to improve your site speed; this will, in turn, boost your site’s UX and improve your ranking.

D. Avoid over-optimizing webpages

Google will see through any unscrupulous SEO hacks employed to game the system. Build sites to improve your audience’s online experience, not to trick Google. We’ll touch on such unethical practices in the next point.

3. Play by the rules

Though Google isn’t transparent with its algorithm updates, it keeps sharing valuable tips for webmasters and content creators, encouraging them to serve quality content and boost their site’s UX. Use these tips to your advantage.

A. Take learnings from the search quality guidelines

Google wants webmasters to follow its guidelines when building sites and posting online content. So, it’s important to constantly stay updated about the current guidelines. Refer to the search quality guidelines when creating an SEO strategy for your business.

B. Avoid black and gray-hat SEO tactics

Avoid black-hat SEO techniques and monetization schemes like keyword stuffing, private blog networks, spammy links, and affiliate links, among others. Moreover, Google absolutely disapproves of gray-hat SEO tricks like buying expired domains, cloaking, dummy social accounts, and scraped content. These techniques may go unnoticed at first, but when used excessively they are spotted by Google, attracting a penalty.

Therefore, it’s best to avoid both these unethical SEO tactics that only focus on tricking the algorithm. Make delivering value to users a priority!

4. Check for crawl errors

At times, your website isn’t featured in the top searches because Google’s spiders haven’t crawled it. One of the major reasons for this is a possible error in your code. Use Google’s Index Coverage report and URL Inspection tool to identify and fix the gaps in your code.

Also, remember to optimize your crawl budget by making sure robots.txt isn’t blocking your important webpages from being crawled. Finally, watch out for 301 and 302 redirect chains, which can eat into your crawl budget and cause the SE crawler to stop crawling your site.
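As a reference point, a minimal robots.txt (the domain and paths are hypothetical) that keeps low-value pages out of the crawl while leaving everything else open might look like:

```
# Block low-value pages; everything else stays crawlable
User-agent: *
Disallow: /cart/
Disallow: /internal-search/

Sitemap: https://example.com/sitemap.xml
```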

Wrapping up

If a website doesn’t enjoy high visibility on Google, it practically doesn’t exist. That’s why everyone’s bending over backward to crack Google’s algorithm. However, businesses adopting strategies merely to trick Google are headed down a slippery slope.

Google’s algorithms are smart enough to identify and punish websites that are up to no good. So, take my advice – instead of trying to crack Google’s algorithm updates, work towards creating awesome content and offering the best experience to users. The tips shared in this post will guide you in the process.

George Konidis is the co-founder of Growing Search, a Canada-based digital marketing agency providing optimal SEO and link-building services worldwide. He can be found on Twitter @georgekonidis.

The post The perils of tricking Google’s algorithm appeared first on Search Engine Watch.



Google’s average position sunset: Are you set up for the transition?

November 23, 2019

On September 30th, Google turned off average position as a metric for search campaigns and now requires advertisers to transition to new impression share and impression rate tools.

The news was first announced in February as an effort to establish more accurate and transparent measurement. Advertisers now get to see how often their ads appear for eligible searches (share) and how often their ads show at the top of the search results page (rate). While these new tools will ultimately be beneficial, the forced change from Google will undoubtedly disrupt many advertisers’ routines.

Here are a few ways advertisers can get set up with the rollout of new metrics.

Understanding the basics

To understand the impact of this change, let’s first define impression share and impression rate. Impression share is the percentage of impressions an ad receives compared to the total number that the ad is qualified for on the search engine results page (SERP). Impression share is a novel way to discover room for ad performance improvements—it displays any missed opportunities by showing how often a certain ad showed up in the top search results.

In contrast, average position did not measure whether ads showed up above the organic results; it only showed their order relative to other ads. Advertisers were left with a guessing game.

Impression rate shows advertisers how often their ads show up at the top of the SERP based on their total impressions—in other words, what percent of the time an ad is in the very top spot (absolute top) or shown anywhere above the organic search results (top). These details address another shortcoming of average position since even an ad in position two might be at the bottom of the page.

Measuring impression share and impression rate

There are three versions of impression share, all of which measure ad impressions divided by the total eligible impressions for that ad, but based on different locations on the SERP:

  • Search (abs.) top IS: The impressions an ad has received in the absolute top location (the very first ad above the organic search results) divided by the estimated number of impressions the ad was eligible to receive in that location. This metric is new.
  • Search top IS: The impressions an ad has received anywhere above the organic search results compared to the estimated number of impressions the ad was eligible to receive in the top location. This metric is also new.
  • Search impression share: This already-existing metric measures impressions anywhere on the page.

For impression rate, there are two metrics, based only on ad impressions rather than the total number of eligible impressions (a worked example follows the list).

  • Impr. (absolute top) %: The percent of ad impressions that are shown as the very first ad above the organic search results.
  • Impr. (top) %: The percent of ad impressions that are shown anywhere above the organic search results.
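To make the definitions concrete, here is a toy calculation with made-up counts (this only illustrates the formulas above, it is not Google’s code):

```js
// Hypothetical counts for one campaign
const impressions = 1000;            // total ad impressions
const topImpressions = 700;          // shown anywhere above the organic results
const absTopImpressions = 400;       // shown as the very first ad
const eligibleTopImpressions = 2000; // estimated impressions eligible at the top

// Impression rate: based only on the ad's own impressions
const topRate = topImpressions / impressions;       // Impr. (top) % = 0.70
const absTopRate = absTopImpressions / impressions; // Impr. (absolute top) % = 0.40

// Impression share: impressions vs. all impressions the ad was eligible for
const searchTopIS = topImpressions / eligibleTopImpressions; // Search top IS = 0.35
```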

Optimizing for awareness and performance

If an advertiser is more focused on driving awareness than ROI, impression share and impression rate are both greatly valuable, as they guarantee the ads are meeting a visibility threshold and can boost awareness.

On the other hand, advertisers using Google’s new impression share options in Smart Bidding should be cautious. The impression share data is not accessible on the same day, so it’s hard to track performance – and setting a high target may significantly boost spending by making an ad eligible for additional, unwanted auctions. A better strategy for Smart Bidding is to bid to impression rate, which has data available intraday. This approach allows advertisers to optimize their impressions showing at the top of the SERP.

As a general starting point, the easiest way for advertisers to set targets is to look at recent performance for campaigns across the three impression % (rate) metrics. This should ensure the smoothest transition from targeting a position to targeting impression share.

Impression share metrics table updated

Setting up for the transition

Advertisers using Google have been encouraged to focus on the impression metrics for some time. Still, many advertisers will probably feel an impact from the shift to these metrics, particularly because of the new obstacles it presents for bidding strategies. Therefore, advertisers should set the right bids to achieve their impression share goals.

With this switch to the new metrics, advertisers should check any rules that support average position, and update reports and saved columns that include the average position. The following applications may include average position:

  • Bidding settings and AdWords rules
  • Custom columns
  • Saved reports (especially any with filters)
  • AdWords scripts
  • Saved column sets
  • Scorecards that use average position in dashboards
  • URLs using the {adposition} ValueTrack parameter

Google announced it will be automatically migrating “Target Position on Page” bid strategies, but there’s no certainty on a timeline or details regarding the migration. Therefore, advertisers should watch for any campaign targeting average position from now on to ensure they’re getting the expected results.

Wes MacLaggan is SVP of Marketing at Marin Software.

The post Google’s average position sunset: Are you set up for the transition? appeared first on Search Engine Watch.



Evolution of Google’s News Ranking Algorithm

November 1, 2019

Image: Photo by Nathan Dumlao on Unsplash

Did the Algorithm Behind How News Articles Rank at Google Just Change?

A Google patent about how news articles are ranked was updated this week, and in this case it suggests how entities in those documents can have an impact on ranking.

How Have News Articles Been Ranked at Google?

This patent was originally filed in 2003.

The beta version of Google News was first launched by Google in 2002, so this was one of the early patents that described how Google ranked news articles.

One of the inventors of the original patent was Krishna A. Bharat, known as a founder of Google News.

The newest version (a continuation patent) was just granted and is the sixth version of the patent. It can be found at:

Systems and methods for improving the ranking of news articles
Inventors: Michael Curtiss, Krishna A. Bharat, and Michael Schmitt
Assignee: Google LLC
US Patent: 10,459,926
Granted: October 29, 2019
Filed: April 27, 2015

This version of the patent provides a history of previous versions, when they were filed, and the patent numbers of the earlier five versions:

This application is a

(1) continuation of U.S. patent application Ser. No. 14/140,108, filed on Dec. 24, 2013, which is a

(2) continuation of U.S. patent Ser. No. 13/616,659, filed on Sep. 14, 2012 (now U.S. Pat. No. 8,645,368), which is a

(3) continuation of U.S. patent application Ser. No. 13/404,827, filed Feb. 24, 2012, (now U.S. Pat. No. 8,332,382), which is a

(4) continuation of U.S. patent application Ser. No. 12/501,256, filed on Jul. 10, 2009, (now U.S. Pat. No. 8,126,876), which is a

(5) continuation of U.S. patent application Ser. No. 10/662,931, filed Sep. 16, 2003, (now U.S. Pat. No. 7,577,655),

the disclosures of which are hereby incorporated by reference herein.

What A Continuation Patent is

Continuation patents take the filing date of the patent they continue (or the ones those patents continue) and are intended to show how the process described by the patents has changed. The processes are set out in the claims sections of the patents, which are the parts a patent examiner reviews when deciding whether or not to grant the new patents.

Often, looking at the very first claim of each patent can help identify important aspects that have changed from one version to another. It is somewhat rare (in my experience) to see a patent that has been updated six times, as this one has. I recently wrote about Google’s Universal Search Interface patent, which was recently updated a fourth time, in Google’s New Universal Search Results.

What Caused A Recent Rankings Change at the New York Times?

A post on Twitter this week suggested that The New York Times may have been negatively impacted by BERT, a new algorithm Google just released and announced in “Understanding searches better than ever before.”

That tweet tells us that BERT, or a move to mobile-first indexing, may have caused a loss of rankings at the newspaper’s site. But seeing that tweet, and seeing that there was a new version of this patent, made me curious about what it contained and what changes it may have brought about.

The Changing Claims from the Ranking of News Articles Patents

But it’s possible that other changes at Google could also have an impact on rankings at news sites. One way to tell how Google has changed the way it ranks articles is to look at how the patent covering the ranking of news articles has changed over time.

Compare how the first four claims from this patent have changed over time.

The latest first claim in this patent introduces some new things to look at:

What is claimed is:

1. A method for ranking results, comprising: receiving a list of objects; identifying a first object in the list and a first source with which the first object is associated; identifying a second object in the list and a second source with which the second object is associated; determining a quantity of named entities that (i) occur in the first object that is associated with the first source, and (ii) do not occur in objects that are identified as sharing a same cluster with the first object but that are associated with one or more sources other than the first source; computing, based at least on the quantity of named entities that (i) occur in the first object that is associated with the first source, and (ii) do not occur in objects that are identified as sharing a same cluster with the first object but that are associated with one or more sources other than the first source, a first quality value of the first source using a first metric, wherein a named entity corresponds to a person, place, or organization; computing a second quality value of the second source using a second metric that is different from the first metric; and ranking the list of objects based on the first quality value and the second quality value.

2. The method of claim 1 wherein the identifying the first source with which the first object is associated includes: identifying the first source based on a uniform resource locator (URL) associated with the first object.

3. The method of claim 1 wherein the first source is a news source.

4. The method of claim 1 wherein computing the first quality value of the first source is further based on: one or more of a number of articles produced by the first source during a first time period, an average length of an article produced by the first source, an amount of important coverage that the first source produces in a second time period, a breaking news score, network traffic to the first source, a human opinion of the first source, circulation statistics of the first source, a size of a staff associated with the first source, a number of bureaus associated with the first source, a breadth of coverage by the first source, a number of different countries from which traffic to the first source originates, and a writing style used by the first source.

From the version of the patent that was filed on Sep. 14, 2012 (now U.S. Pat. No. 8,645,368):

What is claimed is:

1. A method comprising: determining, using one or more processors and based on receiving a search query, articles and respective scores; identifying, using one or more processors, for an article of the articles, a source with which the article is associated; determining, using one or more processors, a score for the source, the score for the source being based on: a metric that represents an evaluation, by one or more users, of the source, and an amount of traffic associated with the source; and adjusting, using one or more processors, the score of the article based on the score for the source.

2. The method of claim 1, where identifying the source includes identifying the source based on an address associated with the article.

3. The method of claim 1, where determining the score includes accessing a memory to determine the score for the source.

4. The method of claim 1, where the score for the source is further based on a length of time between an occurrence of an event and publication, by the source, of an article associated with the event.

From the Version of the patent filed on Feb. 24, 2012, (now U.S. Pat. No. 8,332,382):

What is claimed is:

1. A computer-implemented method comprising: obtaining, in response to receiving a search query, articles and respective scores; identifying, using one or more processors, for an article of the articles, a source with which the article is associated; determining, using one or more processors, a score for the source, based on polling one or more users to request the one or more users to provide a metric that represents an evaluation of a source and based on a length of time between an occurrence of an event and publication, by the source, of another article associated with the event; and adjusting, using one or more processors, the score of the article based on the score for the source.

2. The method of claim 1, where identifying the source includes identifying the source based on an address associated with the article.

3. The method of claim 1, where adjusting the score of the article includes: determining, using the score for the source, a new score for the article associated with the source; and adjusting the score of the article based on the determined new score.

4. The method of claim 1, where the score for the source is further based on a usage pattern indicating traffic associated with the source.

From the version of the patent that was filed on February 10, 2009, (Now U.S. Pat. No. 8,126,876):

What is claimed is:

1. A method, performed by one or more server devices, the method comprising: receiving, at one or more processors of the one or more server devices, a search query, from a client device; generating, by one or more processors of the one or more server devices and in response to receiving the search query, a list of references to news articles; identifying, by one or more processors of the one or more server devices and for each reference in the list of references, a news source with which each reference is associated; determining, by one or more processors of the one or more server devices and for each identified news source, whether a news source rank exists; determining, by one or more processors of the one or more server devices and for each reference with an existing corresponding news source rank, a new score by combining the news source rank and a score corresponding to a previous ranking of the reference; and ranking, by one or more processors of the one or more server devices, the references in the list of references based, at least in part, on the new scores.

2. The method of claim 1, where determining whether each news source rank exists includes accessing a database to locate the news source rank.

3. The method of claim 1, further comprising: providing the ranked list of references to the client device.

4. The method of claim 1, where determining the new score comprises: determining, for each reference with an existing corresponding news source rank, a weighted sum of the news source rank and the score corresponding to the previous ranking of the reference.

And the Very First Version of the patent filed on September 16, 2003, (Now U.S. Pat. No. 7,577,655):

What is claimed is:

1. A method comprising: determining, by a processor, one or more metric values for a news source based at least in part on at least one of a number of articles produced by the news source during a first time period, an average length of an article produced by the news source, an amount of coverage that the news source produces in a second time period, a breaking news score, an amount of network traffic to the news source, a human opinion of the news source, circulation statistics of the news source, a size of a staff associated with the news source, a number of bureaus associated with the news source, a number of original named entities in a group of articles associated with the news source, a breadth of coverage by the news source, a number of different countries from which network traffic to the news source originates, or a writing style used by the news source determining, by the processor, an importance metric value representing the amount of coverage that the news source produces in a second time period, where the determining an importance metric includes: determining, by the processor, for each article produced by the news source during the second time period, a number of other non-duplicate articles on a same subject produced by other news sources to produce an importance value for the article, and adding, by the processor, the importance values to obtain the importance metric value; generating, by the processor, a quality value for the news source based at least in part on the determined one or more metric values; and using, by the processor, the quality value to rank an object associated with the news source.

2. The method of claim 1 where the determining includes: determining, by the processor, a plurality of metric values for the news source.

3. The method of claim 2 where the generating includes: multiplying, by the processor, each metric value in the plurality of metric values by a factor to create a plurality of adjusted metric values, and adding, by the processor, the plurality of adjusted metric values to obtain the quality value.

4. The method of claim 3 where the plurality of metric values includes a predetermined number of highest metric values for the news source.

How the News Ranking Claims Differ

An analysis of changes over Time to the patent for “Systems and methods for improving the ranking of news articles,” should reflect how Google has changed how they have been implementing that patent.

We can see in the claims of the very first patent (filed in 2003) that Google was looking at metric values for different news sources to rank the content those sources were creating. That very long first claim lists a number of metrics used to rank news sources, and that source ranking influenced the ranking of news articles. So a story from a very well-known news agency would tend to rank higher than a story from a lesser-known agency.

The version of the patent filed in 2009 still focuses upon news sources (and a “news source rank”), along with references to the news articles generated by those news sources.

The version of the patent filed in February 2012 again tells us about a score for a news article that is influenced by a score for a news source, but it doesn’t include the many metrics that the 2003 version of the patent does.

The version of the patent filed in September 2012 holds on to the score for the source, but tells us that this score is based on a metric representing an evaluation of the source by one or more users and the amount of traffic associated with the source, with the article’s score adjusted based on the source’s score.

The most recent published version of this patent, filed in April 2015 and granted in October 2019, introduces some changes in how news articles may be ranked by Google. It tells us how articles covering different topics are placed in clusters (which isn’t new in itself), and how those articles may rank higher than other articles by covering more entities that aren’t covered by articles in the same clusters.
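As a toy illustration of the set logic in that first claim (the entity names are invented, and this is in no way Google’s implementation), the “quantity of named entities” is a count of entities unique to one source within a cluster:

```js
// Entities in our article vs. entities in same-cluster articles from other sources
const ourEntities = new Set(['Jane Doe', 'Acme Corp', 'Springfield']);
const otherSourcesEntities = new Set(['Acme Corp', 'Springfield']);

// Named entities original to our source within the cluster
const originalEntities = [...ourEntities].filter(e => !otherSourcesEntities.has(e));
console.log(originalEntities.length); // 1 -- feeds into the first source's quality value
```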



The post Evolution of Google’s News Ranking Algorithm appeared first on SEO by the Sea ⚓.




The evolution of Google’s rel “no follow”

October 29, 2019

Google updated the nofollow attribute on Tuesday, 10th September 2019, a change it says aims to help fight comment spam. The nofollow attribute had remained unchanged for 15 years, but Google has had to make this change as the web evolves.

Google also announced two new link attributes to help website owners and webmasters clearly call out what type for link is being used,

rel=”sponsored”: Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.

rel=”ugc”: UGC stands for User Generated Content, and the ugc attribute value is recommended for links within user-generated content, such as comments and forum posts.

rel=”nofollow”: Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.
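In practice (the URLs here are hypothetical), the attributes are applied like any other rel value, and Google has said they can be combined:

```html
<a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>
<a href="https://example.com/commenter-site" rel="ugc">Commenter's site</a>
<a href="https://example.com/some-page" rel="nofollow">Linked without endorsement</a>

<!-- Attributes can be combined -->
<a href="https://example.com/affiliate-deal" rel="sponsored nofollow">Affiliate link</a>
```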

March 1st, 2020 changes

From the 1st of March 2020, all of these link attributes will also serve as hints for crawling and indexing purposes. Anyone who was relying on rel=nofollow to try to block a page from being indexed should look at other methods to block pages from being crawled or indexed.
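For example, keeping a page out of the index is done on the page itself with a robots meta tag (or an equivalent X-Robots-Tag HTTP header), not with nofollow on the links pointing to it:

```html
<meta name="robots" content="noindex">
```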

John Mueller mentioned the use of rel=sponsored in one of the recent Google Hangouts.

Source: YouTube

The question he was asked:

“Our website has a growing commerce strategy, and some members of our team believe that affiliate links are detrimental to our website ranking for other terms. Do we need to nofollow all affiliate links? If we don’t, will this hurt our organic traffic?”

John Mueller’s answer:

“So this is something that, I think, comes up every now and then. From our point of view, affiliate links are links that are placed with a kind of commercial background, in that you are obviously trying to earn some money by having these affiliate links and pointing to a distributor that you trust and have some kind of arrangement with.

From our point of view that is perfectly fine; that’s a way of monetizing your website, and you’re welcome to do that.

We do kind of expect that these types of links are marked appropriately so that we understand these are affiliate links. One way to do that is to use just a nofollow.

A newer way to let us know about this kind of situation is to use the sponsored rel link attribute. That link attribute specifically tells us this is something to do with an advertising relationship, and we treat it the same as a nofollow.

A lot of the affiliate links out there follow really clear patterns, and we can recognize those, so we try to take care of those on our side when we can. But to be safe, we recommend just using a nofollow or rel=sponsored link attribute. In general, this isn’t something that would really harm your website if you don’t do it; it’s something that makes it a little clearer for us what these links are for. If we see, for example, a website engaging in large-scale link selling, then that’s something where we might take manual action, but for the most part, if our algorithms recognize these are links we don’t want to count, then we just won’t count them.”

How quickly are website owners acting on this?

This was only announced by Google in September, and website owners have until March to make the required change, but data from SEMrush shows that website owners are already starting to switch over to the new rel link attributes.

The data shows that out of one million domains, only 27,763 have at least one UGC link. The interesting fact is that those 27,763 domains each have, on average, 20,904,603 follow backlinks, 6,373,970 nofollow, 22.8 UGC, and 55.5 sponsored.

Source: Semrush.com

These are still very early days, but we can see that there is change, and I would expect it to grow significantly into next year.

Conclusion

I believe that Google is going to use the data from these link attributes to catch out website owners who continue to sell links and mark them up incorrectly in order to pass SEO value to another website under any sort of agreement, paid or otherwise.

Paul Lovell is an SEO Consultant And Founder at Always Evolving SEO. He can be found on Twitter @_PaulLovell.

The post The evolution of Google’s rel “no follow” appeared first on Search Engine Watch.



Google’s How News Works, aimed at clarifying news transparency

June 11, 2019

In May, Google announced the launch of a new website aimed at explaining how they serve and address news across Google properties and platforms.

The site, How News Works, states Google’s mission as it relates to disseminating news in an unbiased manner. The site aggregates a variety of information about how Google crawls, indexes, and ranks news stories, as well as how news can be personalized for the end user.

How News Works provides links to various resources within the Google news ecosystem all in one place and is part of The Google News Initiative.

What is The Google News Initiative?

The Google News Initiative (GNI) is Google’s effort to work with news industry professionals to “help journalism thrive in the digital age.” The GNI is driven and summarized by the GNI website which provides information about a variety of initiatives and approaches within Google including:

  • How to work with Google (e.g., partnership opportunities, training tools, funding opportunities)
  • A list of current partnerships and case studies
  • A collection of programs and funding opportunities for journalists and news organizations
  • A catalog of Google products relevant to journalists

Google attempts to work with the news industry in a variety of ways. For example, it provides funding opportunities to help journalists from around the world.

Google is now accepting applications (through mid-July) from North American and Latin American applicants to help fund projects that “drive digital innovation and develop new business models.” Applicants who meet Google’s specified criteria (and are selected) will be awarded up to $300,000 in funding (for U.S. applicants) or $250,000 (for Latin American applicants), with an additional award of up to 70% of the total project cost.

The GNI website also provides users with a variety of training resources and tools. Journalists can learn how to partner with Google to test and deploy new technologies such as the Washington Post’s participation in Google’s AMP Program (accelerated mobile pages).

AMP is an open source initiative that Google launched in February 2016 with the goal of making mobile web pages faster.

AMP mirrors content on traditional web pages, but uses AMP HTML, an open source format architected in an ultra-light way to reduce latency for readers.

News transparency and accountability

The GNI’s How News Works website reinforces Google’s mission to “elevate trustworthy information.” The site explains how the news algorithm works and links to Google’s news content policies.

The content policy covers Google’s approach to accountability and transparency, its requirements for paid or promotional material, copyright, restricted content, privacy/personalization and more.

This new GNI resource, a subsection of the main GNI website, acts as a starting point for journalists and news organizations to delve into Google’s vast news infrastructure including video news on YouTube.

Since it can be difficult to ascertain if news is trustworthy and accurate, this latest initiative by Google is one way that journalists (and the general public) can gain an understanding of how news is elevated and indexed on Google properties.

The post Google’s How News Works, aimed at clarifying news transparency appeared first on Search Engine Watch.
