CBPO

Monthly Archives: November 2022

Turn insights into ROI with Google Analytics

November 8, 2022

Three years ago, we introduced Google Analytics 4, a re-imagined tool that helps you get a complete view of consumer behavior across web and app by using first-party, modeled data. This is critical in an evolving privacy and technology landscape, where marketers have to rethink their approach to measurement in order to keep getting the insights they rely on. Today we’re introducing new resources to help you make the switch to Google Analytics 4, improved machine learning features, actionable reporting and new integrations.

Make the switch now to Google Analytics 4 with helpful solutions

Earlier this year we shared that we will begin sunsetting standard Universal Analytics properties on July 1, 2023. We recognize that setting up Google Analytics 4 to fit your needs takes time and resources, in particular for large enterprises with complex Analytics 360 setups. To give enterprise customers more time for a smoother transition to Google Analytics 4, we're moving the sunset date for Universal Analytics 360 properties from October 1, 2023 to July 1, 2024. We're focusing our efforts and investments on Google Analytics 4 to deliver a solution built to adapt to a changing ecosystem. Because of this, throughout 2023 we'll be shifting support away from Universal Analytics 360 and will move our full focus to Google Analytics 4 in 2024. As a result, performance will likely degrade in Universal Analytics 360 up until the new sunset date.

To help everyone make the move, we're launching new resources and tools to help you get started with Google Analytics 4. Our step-by-step guide helps you complete the entire setup of Google Analytics 4 at your own pace and customize it to your needs. Or, if you prefer a more automated experience, you can use the Setup Assistant in the admin section of your Universal Analytics property. Once a Google Analytics 4 property is created and connected, the Setup Assistant can automate some required setup steps and help you track your progress. For example, the Setup Assistant lets you select the goals you want to import to Google Analytics 4, copy desired Google Ads links and audiences, and add users who have access to your current property.

The Setup Assistant tools for configuring data collection and property settings for a new Google Analytics 4 property

The best Google Analytics 4 setup comes from following the steps above to create a customized property tailored to your needs. The earlier you do this, the more historical data and insights you will have in Google Analytics 4. For example, Suncorp, one of Australia's largest financial services brands, prioritized setting up Google Analytics 4 to build a base of historical insights.

"When Universal Analytics stops collecting data in 2023, we will have over two years of insights and reporting in Google Analytics 4. This is critical for a business like us to ensure we have a robust foundation of data to inform decision making."
Mim Haysom
Chief Marketing Officer, Suncorp Group

Beginning in early 2023, the Setup Assistant will also create a new Google Analytics 4 property for each standard Universal Analytics property that doesn’t already have one — helping you jumpstart your migration. These new Google Analytics 4 properties will be connected with the corresponding Universal Analytics properties to match your privacy and collection settings. They’ll also enable equivalent basic features such as goals and Google Ads links. If you’d rather begin the switch on your own, you can opt out of having the Setup Assistant do it for you.

Get accurate insights with new machine learning solutions

Behavioral modeling uses machine learning to fill gaps in your understanding of customer behavior when cookies and other identifiers aren't available. Soon, behavioral modeling will also be available in real-time reporting, giving you a complete view of the consumer journey as it happens. It's helping marketers like Nestlé get accurate insights from more customer activity.

"Behavioral modeling with Consent Mode in Google Analytics 4 drove a 23% increase in the observable traffic in analytics reporting on European and UK websites."
Jaime Rodera
Privacy & Consumer Data Manager, Nestlé

Improve ROI with new actionable reporting and integrations

To get a more accurate picture of your campaigns across all of your marketing touchpoints, we will soon introduce custom channel grouping in Google Analytics 4 to help you see the aggregated performance of your different channels. For example, you'll be able to compare the performance of your paid search brand campaigns with your non-brand campaigns. These custom channel groupings apply retroactively in reporting and work across the Advertising and Explore workspaces.

Your insights are only as good as the actions you can take from them. On top of Google Ads, Display & Video 360 and Search Ads 360, we will soon launch an integration with Campaign Manager 360 via Floodlight. This will allow marketers to bid towards Google Analytics 4 conversions in Display & Video 360’s automated bid strategies.

Now is the time to make Google Analytics 4 your cross-platform analytics solution. Get started with Google Analytics 4 now, complete the setup by following our step-by-step guide, and learn how to get the most out of it with the refreshed Google Analytics 4 certification.


Google Analytics Blog


How to ‘Quiet Quit’ Elon Musk’s Twitter

November 8, 2022

The chaos engulfing the platform provides an opportunity to reclaim control of your online life, without logging off for good.
Feed: All Latest


BMW i7 2022 Review: High Life, High-Tech

November 6, 2022

Yes, we enjoyed the built-in 31-inch 8K TV, but this electric sedan also offers impeccable interior design and handling (if you can afford it).
Feed: All Latest


Could these alternative SEO techniques be key to ranking successfully?

November 5, 2022

With algorithm updates being rolled out on a more regular basis, staying at the top of Google’s search engine results pages (SERPs) has never been harder.

Gone are the days of signing up to directories, exact match domains, and keyword stuffing; SEO practitioners must do whatever they can to outrank their competitors.

Sure, you can put in more grunt work through aggressive link-building outreach, hire a PR agency or a team of content writers, but could the simple alternative strategies below give you the edge?

Tip #1 – Post content when no one else is

The notion that 'content is king' is certainly true – you need good-quality content uploaded regularly to show that your site is active rather than dormant.

Content cannot be thin, nor padded out to thousands and thousands of words for their own sake; it needs to be relevant, answer questions in your industry, and be presented with heading tags, useful links, images, and videos where possible.

“A competitive edge over your rivals is posting content when no one else is,” explains Rosie Marie, CEO of Rosca Technologies, a data optimisation solution.

“You have to understand that Google is an algorithm and a machine and not just a bunch of suits who look at websites one-by-one.”

Google recrawls every day or at least every few days, Marie explains, stressing it is hard to know precisely when this is. “If you can post content at alternative times of the day or year and Google decides to index your site, who knows, this could give you a competitive advantage,” she says.

Marie notes her business has tried posting content on weekends – easy to do if using a content management system – because others are unlikely to do so.

“In addition, we take advantage of things like UK bank holidays and that lazy week between Christmas and New Year’s because Google could very well pick up that we are being more proactive than our competitors. If this is the case, don’t we deserve to rank higher?”

Tip #2 – Get impartial users to critique your website

We regularly hear that time on site is a good SEO indicator. After all, if people stay on your site for a long time and click through to various pages, it shows that your information is useful – compared with a user who arrives and leaves after five seconds, resulting in a high bounce rate.

“You could ask your partners and people in your industry to look through your site and offer their feedback,” says Gavin Cooper, founder of Claims Bible.

“Start by posting on LinkedIn and Facebook and say to your friends that you have just redesigned your site or have launched a new business and would truly welcome some feedback,” he says.

“Some of the feedback may not be nice to hear! But you will get a lot of dedicated users really looking through pages and scrolling through and this is great for SEO, certainly in the early days of a website launch.”

Cooper notes that businesses shouldn't forget that a higher click-through rate on Google's search results also helps during the early buzz of a site launch.

“For instance, if you are ranked position 9 and more people click on you than position 3 or 4, this should also help your ranking,” he says.  “Similar ideas include sending out blast emails and SMS messages or making a big announcement on LinkedIn, but linking to your website. Don’t give LinkedIn the traffic, keep it for yourself!”

Tip #3 – Acquire links from simple sources

One of the most traditional link-building techniques is to create quality, data-driven pieces – calculators, savings guides, and so on – and then email around to earn links back to them as a resource.

Not only is this very time-consuming, but you also have no guarantee over which sites link back to you or what anchor text is used.

“Our alternative technique involves finding websites that have already written articles or blog posts on your subject, whether you talk about health, finance, travel or anything that has expert opinion,” explains Luke Fitzpatrick, head of digital at Earned Media.

"Our approach involves reaching out to all those guides on pages two to ten and offering additional data to help 'bulk up' and refine their articles," he says.

“Understandably, several publishers were thrilled to have more information in their articles and were pleased to give a follow link back as a reference. Link building achieved!”

Tip #4 – Use link bait that has already been successful

Content is king, but we know that links make the world go round. There are some things that work as excellent link bait for a brand, such as being nominated or winning awards, and being featured in press sections.

“But looking into competitors, there seem to be some lists that grab more attention than others,” says Richard Allan, co-founder of Capital Bean.

"Creating top lists, such as the best cities to do something, start a family, or retire, tends to attract more interest than other content, especially if they are filled with data."

Allan also notes businesses should consider sponsoring large organisations in the health industry or non-profits since they often give a link and badge as part of it on their websites. “You get to help a great cause too,” he says.

“Another fascinating one is launching a scholarship or essay writing competition – which can attract links from universities and colleges if positioned well.”

Tip #5 – Starting to fall? Simple: refresh the content

Finally, if you ranked beautifully for some big keywords but find yourself starting to slip, consider refreshing the content – replacing it with new and improved information and incorporating points used by the pages that seem to be ranking better lately.

Google loves fresh content, and this helps your indexability, so it would not be strange to update your main landing pages every six months or so to give them that refresh.

The post Could these alternative SEO techniques be key to ranking successfully? appeared first on Search Engine Watch.

Search Engine Watch


What Is Apple One? A Breakdown of Plans, Pricing, and Included Services

November 4, 2022

Going all-in with the services bundle could be a smart move, especially for families. We break down what’s included and how much it costs.
Feed: All Latest


Is Google headed towards a continuous “real-time” algorithm?

November 4, 2022


30-second summary:

  • The present reality is that Google presses the button and updates its algorithm, which in turn can update site rankings
  • What if we are entering a world where it is less of Google pressing a button and more of the algorithm automatically updating rankings in “real-time”?
  • Advisory Board member and Wix’s Head of SEO Branding, Mordy Oberstein shares his data observations and insights

If you’ve been doing SEO even for a short while, chances are you’re familiar with a Google algorithm update. Every so often, whether we like it or not, Google presses the button and updates its algorithm, which in turn can update our rankings. The key phrase here is “presses the button.” 

But, what if we are entering a world where it’s less of Google pressing a button and more of the algorithm automatically updating rankings in “real-time”? What would that world look like and who would it benefit? 

What do we mean by continuous real-time algorithm updates?

It is obvious that technology is constantly evolving, but what needs to be made clear is that this applies to Google's algorithm as well. As the technology available to Google improves, the search engine can do things like better understand content and assess websites. However, this technology needs to be incorporated into the algorithm. In other words, as new technology becomes available to Google, or as the current technology improves (we might refer to this as machine learning "getting smarter"), Google needs to "make it a part" of its algorithms in order to utilize these advancements.

Take MUM for example. Google has started to use aspects of MUM in the algorithm. However, (at the time of writing) MUM is not fully implemented. As time goes on and based on Google’s previous announcements, MUM is almost certainly going to be applied to additional algorithmic tasks.  

Of course, once Google introduces new technology or has refined its current capabilities it will likely want to reassess rankings. If Google is better at understanding content or assessing site quality, wouldn’t it want to apply these capabilities to the rankings? When it does so, Google “presses the button” and releases an algorithm update. 

So, say one of Google’s current machine-learning properties has evolved. It’s taken the input over time and has been refined – it’s “smarter” for lack of a better word. Google may elect to “reintroduce” this refined machine learning property into the algorithm and reassess the pages being ranked accordingly.    

These updates are specific and purposeful. Google is “pushing the button.” This is most clearly seen when Google announces something like a core update or product review update or even a spam update. 

In fact, perhaps nothing better concretizes what I've been saying here than what Google said about its spam updates:

“While Google’s automated systems to detect search spam are constantly operating, we occasionally make notable improvements to how they work…. From time to time, we improve that system to make it better at spotting spam and to help ensure it catches new types of spam.” 

In other words, Google was able to develop an improvement to a current machine learning property and released an update so that this improvement could be applied to ranking pages. 

If this process is "manual" (to use a crude word), what then would continuous "real-time" updates be? Let's take Google's Product Review Updates. Initially released in April 2021, Google's Product Review Updates aim at weeding out product review pages that are thin, unhelpful, and (if we're going to call a spade a spade) exist essentially to earn affiliate revenue.

To do this, Google is using machine learning in a specific way, looking at specific criteria. With each iteration of the update (such as there was in December 2021, March 2022, etc.) these machine learning apparatuses have the opportunity to recalibrate and refine. Meaning, they can be potentially more effective over time as the machine “learns” – which is kind of the point when it comes to machine learning. 

What I theorize, at this point, is that as these machine learning properties refine themselves, rank fluctuates accordingly. Meaning, Google allows machine learning properties to “recalibrate” and impact the rankings. Google then reviews and analyzes and sees if the changes are to its liking. 

We may know this process as unconfirmed algorithm updates (for the record I am 100% not saying that all unconfirmed updates are as such). It’s why I believe there is such a strong tendency towards rank reversals in between official algorithm updates. 

It’s quite common that the SERP will see a noticeable increase in rank fluctuations that can impact a page’s rankings only to see those rankings reverse back to their original position with the next wave of rank fluctuations (whether that be a few days later or weeks later). In fact, this process can repeat itself multiple times. The net effect is a given page seeing rank changes followed by reversals or a series of reversals.  

A series of rank reversals impacting almost all pages ranking between position 5 and 20 that align with across-the-board heightened rank fluctuations

This trend, as I see it, is Google allowing its machine learning properties to evolve or recalibrate (or however you’d like to describe it) in real-time. Meaning, no one is pushing a button over at Google but rather the algorithm is adjusting to the continuous “real-time” recalibration of the machine learning properties.

It’s this dynamic that I am referring to when I question if we are heading toward “real-time” or “continuous” algorithmic rank adjustments.

What would a continuous real-time Google algorithm mean?

So what? What if Google adopted a continuous real-time model? What would the practical implications be? 

In a nutshell, it would mean that rank volatility would be far more of a constant. Instead of waiting for Google to push the button on an algorithm update for rankings to be significantly impacted, this would simply be the norm. The algorithm would be constantly evaluating pages/sites "on its own" and making adjustments to rank in more real-time.

Another implication would be a lack of having to wait for the next update for restoration. While not a hard-and-fast rule, if you are significantly impacted by an official Google update, such as a core update, you generally won't see rank restoration occur until the release of the next version of the update – whereupon your pages will be evaluated. In a real-time scenario, pages are constantly being evaluated, much the way links are with Penguin 4.0, which was released in 2016. To me, this would be a major change to the current "SERP ecosystem."

I would even argue that, to an extent, we already have a continuous "real-time" algorithm. That we at least partially have a real-time Google algorithm is simply a fact: as mentioned, in 2016 Google released Penguin 4.0, which removed the need to wait for another version of the update, as this specific algorithm evaluates pages on a constant basis.

However, outside of Penguin, what do I mean when I say that, to an extent, we already have a continuous real-time algorithm? 

The case for real-time algorithm adjustments

The constant "real-time" rank adjustments that occur in the ecosystem are so significant that they have redefined the volatility landscape.

Per Semrush data I pulled, there was a 58% increase in the number of days that reflected high-rank volatility in 2021 as compared to 2020. Similarly, there was a 59% increase in the number of days that reflected either high or very high levels of rank volatility: 

Semrush data showing the year-over-year increase in the number of high-volatility days
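For illustration only, a comparison like this could be produced from a daily volatility export along the following lines; the file name, column names, and "high volatility" threshold are assumptions for the sketch, not Semrush's actual methodology.

```python
import pandas as pd

# Hypothetical export: one row per day with a volatility score.
# File name and column names are assumptions made for this sketch.
daily = pd.read_csv("serp_volatility_daily.csv", parse_dates=["date"])

HIGH = 8  # assumed threshold for a "high volatility" day

daily["year"] = daily["date"].dt.year
high_days = daily[daily["volatility"] >= HIGH].groupby("year").size()

# Year-over-year change in the count of high-volatility days (2020 -> 2021).
yoy_pct = (high_days[2021] - high_days[2020]) / high_days[2020] * 100
print(f"High-volatility days per year: {high_days.to_dict()}")
print(f"Change 2020 -> 2021: {yoy_pct:+.0f}%")
```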

Simply put, there is a significant increase in the number of instances that reflect elevated levels of rank volatility. After studying these trends and looking at the ranking patterns, I believe the aforementioned rank reversals are the cause. Meaning, a large portion of the increased instances in rank volatility are coming from what I believe to be machine learning continually recalibrating in “real-time,” thereby producing unprecedented levels of rank reversals. 

Supporting this is the fact that, along with the increased instances of rank volatility, we did not see an increase in how drastic the rank movement was. Meaning, there are more instances of rank volatility, but the degree of volatility did not increase.

In fact, there was a decrease in how dramatic the average rank movement was in 2021 relative to 2020! 

Why? Again, I chalk this up to the recalibration of machine learning properties and their “real-time” impact on rankings. In other words, we’re starting to see more micro-movements that align with the natural evolution of Google’s machine-learning properties. 

When a machine learning property is refined as its intake/learning advances, you're unlikely to see enormous swings in the rankings. Rather, you will see a refinement in the rankings that aligns with the refinement in the machine learning itself.

Hence, the rank movement we’re seeing, as a rule, is far more constant yet not as drastic. 

The final step towards continuous real-time algorithm updates

While much of the ranking movement that occurs is continuous in that it is not dependent on specific algorithmic refreshes, we’re not fully there yet. As I mentioned, much of the rank volatility is a series of reversing rank positions. Changes to these ranking patterns, again, are often not solidified until the rollout of an official Google update, most commonly, an official core algorithm update. 

Until longer-lasting ranking patterns are set without the need to "press the button," we don't have a full-on continuous or "real-time" Google algorithm.

However, I have to wonder if the trend is not heading toward that. For starters, Google’s Helpful Content Update (HCU) does function in real-time. 

Per Google:

"Our classifier for this update runs continuously, allowing it to monitor newly-launched sites and existing ones. As it determines that the unhelpful content has not returned in the long-term, the classification will no longer apply."

How is this so? The same as what we’ve been saying all along here – Google has allowed its machine learning to have the autonomy it would need to be “real-time” or as Google calls it, “continuous”: 

"This classifier process is entirely automated, using a machine-learning model."

For the record, continuous does not mean ever-changing. In the case of the HCU, there’s a logical validation period before restoration. Should we ever see a “truly” continuous real-time algorithm, this may apply in various ways as well. I don’t want to let on that the second you make a change to a page, there will be a ranking response should we ever see a “real-time” algorithm.

At the same time, the “traditional” officially “button-pushed” algorithm update has become less impactful over time. In a study I conducted back in late 2021, I noticed that Semrush data indicated that since 2018’s Medic Update, the core updates being released were becoming significantly less impactful.

Data indicates that Google's core updates are presenting less rank volatility overall as time goes on

Since then, this trend has continued. Per my analysis of the September 2022 Core Update, there was a noticeable drop-off in volatility relative to the May 2022 Core Update.

Rank volatility change was far less dramatic during the September 2022 Core Update relative to the May 2022 Core Update

It’s a dual convergence. Google’s core update releases seem to be less impactful overall (obviously, individual sites can get slammed just as hard) while at the same time its latest update (the HCU) is continuous. 

To me, it all points towards Google looking to abandon the traditional algorithm update release model in favor of a more continuous construct. (Further evidence could be in how the release of official updates has changed. If you look back at the various outlets covering these updates, the data will show you that the roll-out now tends to be slower with fewer days of increased volatility and, again, with less overall impact). 

The question is, why would Google want to go to a more continuous real-time model? 

Why a continuous real-time Google algorithm is beneficial

A real-time continuous algorithm? Why would Google want that? It’s pretty simple, I think. Having an update that continuously refreshes rankings to reward the appropriate pages and sites is a win for Google (again, I don’t mean instant content revision or optimization resulting in instant rank change).

Which is more beneficial to Google’s users? A continuous-like updating of the best results or periodic updates that can take months to present change? 

The idea of Google continuously analyzing and updating in a more real-time scenario is simply better for users. How does it help a user looking for the best result to have rankings that reset periodically with each new iteration of an official algorithm update? 

Wouldn’t it be better for users if a site, upon seeing its rankings slip, made changes that resulted in some great content, and instead of waiting months to have it rank well, users could access it on the SERP far sooner? 

Continuous algorithmic implementation means that Google can get better content in front of users far faster. 

It’s also better for websites. Do you really enjoy implementing a change in response to ranking loss and then having to wait perhaps months for restoration? 

Also, Google would only rely so heavily on machine learning, and trust the adjustments it makes, if it were confident in its ability to understand content, relevancy, authority, and so on. SEOs and site owners should want this. It means that Google could rely less on secondary signals and more directly on the primary commodity: content and its relevance, trustworthiness, etc.

Google being able to more directly assess content, pages, and domains overall is healthy for the web. It also opens the door for niche sites and sites that are not massive super-authorities (think the Amazons and WebMDs of the world). 

Google’s better understanding of content creates more parity. Google moving towards a more real-time model would be a manifestation of that better understanding.

A new way of thinking about Google updates

A continuous real-time algorithm would intrinsically change the way we would have to think about Google updates. It would, to a greater or lesser extent, make tracking updates as we now know them essentially obsolete. It would change the way we look at SEO weather tools in that, instead of looking for specific moments of increased rank volatility, we’d pay more attention to overall trends over an extended period of time. 

Based on the ranking trends we already discussed, I'd argue that, to a certain extent, that time has already come. We're already living in an environment where rankings fluctuate far more than they used to, and that has, to an extent, redefined what stable rankings mean in many situations.

To both conclude and put things simply, edging closer to a continuous real-time algorithm is part and parcel of a new era in ranking organically on Google’s SERP.


Mordy Oberstein is Head of SEO Branding at Wix. Mordy can be found on Twitter @MordyOberstein.

Subscribe to the Search Engine Watch newsletter for insights on SEO, the search landscape, search marketing, digital marketing, leadership, podcasts, and more.

Join the conversation with us on LinkedIn and Twitter.

The post Is Google headed towards a continuous “real-time” algorithm? appeared first on Search Engine Watch.

Search Engine Watch


What Is Predictive Analysis and Its Role in a Winning Marketing Strategy

November 3, 2022

Predictive analysis is often applied to manage supply chains and business operations and to analyze consumer behavior. According to Statista.com, predictive analysis is here to stay: the market was valued at $5.29 billion in 2020 and is forecast to grow to $41.52 billion by 2028.

But what is it? How can it positively impact your business and marketing strategies? Let’s find out.    

What is Predictive Analysis?

Predictive analysis is a form of business analysis that uses statistics or machine learning to predict the outcome of something. That something can be anything from consumer intent and customer lifetime value to sales trends. 

Compared with the other types of business analysis, predictive analysis focuses on what is likely to happen, while descriptive analysis looks at what has happened. Prescriptive analysis draws on both to determine what should happen – according to what has happened and what is likely to happen.

Predictive analysis can be used to:

  • Forecast future customer churn rates. 
  • Produce more accurate sales forecasts.
  • Enable businesses to order the optimal amount of inventory to meet customer demand.
  • Calculate a customer’s lifetime value (CLV).
  • Predict what products a customer is likely to buy in the future. 
  • Prevent logistics or warehouse equipment malfunctions. 

What Are Methods of Predictive Analysis?

Harnessing current and/or historical data with statistical techniques like predictive modeling, deep learning algorithms, machine learning, and data mining, predictive analysis can forecast future likely events.  

Other types of predictive analysis techniques include:

  • Data clustering uses machine learning to group objects into categories based on similarities, such as audience segmentation based on past engagement (see the sketch after this list).  
  • Classification is a prediction technique that involves calculating the probability that an item belongs to a particular category.
  • Logistic regression models the relationship between inputs and an outcome, such as the probability that a customer converts.
  • Decision trees are supervised learning algorithms used to determine courses of action and the probabilities associated with each, depending on sets of variables. 
  • Time series analysis examines data points collected over time to identify trends, seasonality, and other changes over periods. 
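To make the first of these techniques concrete, here is a minimal sketch of clustering-based audience segmentation using scikit-learn; the engagement columns, the sample values, and the choice of four segments are assumptions made purely for illustration.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical engagement table: one row per customer (made-up values).
customers = pd.DataFrame({
    "sessions_last_90d":   [3, 40, 12, 1, 55, 8, 22, 2],
    "avg_order_value":     [20, 180, 60, 15, 210, 45, 95, 10],
    "days_since_last_buy": [200, 5, 30, 340, 3, 60, 14, 400],
})

# Scale features so no single metric dominates the distance calculation.
scaled = StandardScaler().fit_transform(customers)

# Group customers into 4 segments based on similarity of past engagement.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42)
customers["segment"] = kmeans.fit_predict(scaled)

print(customers.sort_values("segment"))
```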

What is a Predictive Analysis Example?

A good use case for predictive analysis is in the eCommerce space – specifically, product recommendations. Smart algorithms create accurate projections for consumers based on what they've previously bought and other contextual signals.

One example of these algorithms in practice involves looking at a consumer's purchase and review history and recommending products based on data from similar users. Any products the user has previously purchased would be disregarded.

Brands seeking to improve customer engagement and conversion rates often garner great results from recommendation engines. Done right, this predictive analysis marketing strategy encourages upsells and cross-sells, establishes brand loyalty, and ensures the customers return for more.
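As a rough illustration of that logic (not any particular vendor's engine), a minimal collaborative-filtering sketch can score products by what similar shoppers bought and then drop anything the shopper already owns; the product names and purchase matrix below are invented for the example.

```python
import numpy as np

# Hypothetical user-product purchase matrix (1 = bought); rows = users, cols = products.
products = ["kettle", "toaster", "blender", "mixer", "scale"]
purchases = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 1, 0, 1],
    [1, 1, 1, 0, 0],
])

def recommend(user_idx: int, top_n: int = 2) -> list[str]:
    """Score products by how often similar users bought them, then
    disregard anything the user has already purchased."""
    user = purchases[user_idx]
    # User-user similarity: count of co-purchased products (simple overlap).
    similarity = purchases @ user
    similarity[user_idx] = 0  # ignore the user's own row
    # Weight every other user's purchases by their similarity to this user.
    scores = similarity @ purchases
    scores = scores * (1 - user)  # drop products already bought
    ranked = np.argsort(scores)[::-1][:top_n]
    return [products[i] for i in ranked if scores[i] > 0]

print(recommend(user_idx=0))  # suggests products user 0 hasn't bought yet
```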

The Role of Predictive Analysis in Marketing 

Personalized Experiences 

Predictive analysis forms the backbone of winning marketing strategies. This is because using data in the right way enables personalized customer experiences and drives sales. In marketing, needs forecasting is a widely used predictive analytics tool, where businesses anticipate customer needs based on their web browsing habits. 

For instance, online home renovation retailers can predict when a customer is in the market for decorating products due to increased searches for home improvements. 

Solving Problems

Predictive analysis solves customer problems before they are aware that they have them. Using customer intent and behavior data, businesses can see which customers are more at risk of churn and act accordingly. Proactively addressing potential issues is a good position for a business to be in and minimizes the impact on the overall customer experience.

New Customer Acquisition 

Use data segmentation as predictive analysis to define customer identification models. This practice works by identifying potential customers based on your existing customers’ needs, wants, purchase behavior, and preferences.  

Optimize Marketing Budget 

Predictive analysis enables marketers to spend budgets more effectively – whether the goal is to convert potential customers, attract a new audience segment, or retain existing customers. Because predictive analysis can help you understand the actions of users that indicate their conversion intentions, you can craft relevant landing pages, sales funnels, and marketing campaigns that are poised to positively impact your bottom line.

The Predictive Analysis Marketing Process

How could predictive analysis look in your business? 

  1. Define what question you want to answer – e.g. which prospects are likely to sign up for my service within the next 30 days? 
  2. Gather the data – our example needs historical prospects data (specifically how much time it took past prospects to convert), demographic and channel data, plus a current list of prospects. 
  3. Undertake descriptive analysis to determine facts, such as whether the average conversion time varies between channels and whether demographics correlate with these time frames.  
  4. Use statistical techniques to test your theories. 
  5. Create a predictive model after your test discoveries to predict outcomes. 
  6. Deploy the predictive model to glean actionable insights, e.g., the prospects that will likely sign up within the next 30 days.
  7. Create targeted marketing strategies with these prospects in mind in the hope of maximum conversions. 
  8. Update the predictive model regularly to meet new requirements. 

Remember that external influences can skew your data – think about seasonal changes, news events, global crises, etc. 
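For illustration, a minimal sketch of steps 2 to 6 might look like the following; the file names, feature columns, and the simple logistic regression model are assumptions made for the example, and a real project would validate the model far more rigorously.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical historical data: one row per past prospect (assumed file and columns).
history = pd.read_csv("past_prospects.csv")
features = ["days_since_first_visit", "pages_viewed", "email_opens", "is_paid_channel"]
X = history[features]
y = history["signed_up_within_30d"]  # 1 / 0 label

# Hold out a test split to sanity-check the model before deploying it.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")

# Score the current prospect list and surface the likeliest converters.
current = pd.read_csv("current_prospects.csv")  # assumed file
current["p_signup_30d"] = model.predict_proba(current[features])[:, 1]
print(current.sort_values("p_signup_30d", ascending=False).head(10))
```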

How to Maximize Success in Marketing with Predictive Analysis

Some critical tips to use predictive analysis to its best effect include:

  • Gather as much of the right data as possible. 
  • Decide on the most relevant modeling techniques and algorithms for the specific project. 
  • Have processes to reduce potential biases. 

We Predict That Predictive Analysis is the Future of Marketing

Predictive analysis is fast becoming a vital decision-making tool for forward-thinking businesses. Regardless of industry, predictive analysis can give you the insights you need to drive your marketing. By enabling intelligent data collection and harnessing it to accurately predict future outcomes, organizations use predictive analysis to make extremely profitable decisions.

The post What Is Predictive Analysis and Its Role in a Winning Marketing Strategy first appeared on PPC Hero.

PPC Hero


Twitter Had a Plan to Fix Social Media. Will Elon Musk Follow It?

November 2, 2022

For years, the platform has funded a project that’s meant to create a better, decentralized online experience. Now Twitter’s new owner will decide its future.
Feed: All Latest


In a sea of signals, is your on-page on-point?

November 2, 2022


30-second summary:

  • Content managers who want to assess their on-page performance can feel lost at sea due to numerous SEO signals and their perceptions
  • This problem gets bigger and highly complex for industries with niche semantics
  • The scenarios they present to the content planning process are highly specific, with unique lexicons and semantic relationships
  • Sr. SEO Strategist at Brainlabs, Zach Wales, uses findings from a rigorous competitive analysis to shed light on how to evaluate your on-page game

Industries with niche terminology, like scientific or medical ecommerce brands, present a layer of complexity to SEO. The scenarios they present to the content planning process are highly specific, with unique lexicons and semantic relationships. 

SEO has many layers to begin with, from technical to content. They all aim to optimize for numerous search engine ranking signals, some of which are moving targets. 

So how does one approach on-page SEO in this challenging space? We recently had the privilege of conducting a lengthy competitive analysis for a client in one of these industries. 

What we walked away with was a repeatable process for on-page analysis in a complicated semantic space. 

The challenge: Turning findings into action

At the outset of any analysis, it’s important to define the challenge. In the most general sense, ours was to turn findings into meaningful on-page actions — with priorities. 

And we would do this by comparing the keyword ranking performance of our client’s domain to that of its five chosen competitors.

Specifically, we needed to identify areas of the client’s website content that were losing to competitors in keyword rankings. And to prioritize things, we needed to show where those losses were having the greatest impact on our client’s potential for search traffic.

Adding to the complexity were two additional sub-challenges:

  1. Volume of keyword data. When people think of “niche markets,” the implication is usually a small number of keywords with low monthly search volumes (MSV). Scientific industries are not so. They are “niche” in the sense that their semantics are not accessible to all—including keyword research tools—but their depth & breadth of keyword potential is vast.
  2. Our client already dominated the market. At first glance, using keyword gap analysis tools, there were no product categories where our client wasn’t dominating the market. Yet they were incurring traffic losses from these five competitors from a seemingly random, spread-out number of cases. Taken together incrementally, these losses had significant impacts on their web traffic. 

If the needle-in-a-haystack analogy comes to mind, you see where this is going. 

To put the details to our challenge, we had to:

  • Identify where those incremental effects of keyword rank loss were being felt the most — knowing this would guide our prioritization;
  • Map those keyword trends to their respective stage of the marketing funnel (from informational top-of-funnel to the transactional bottom-of-funnel) 
  • Rule out off-page factors like backlink equity, Core Web Vitals & page speed metrics, in order to…
  • Isolate cases where competitor pages ranked higher than our client’s on the merits of their on-page techniques, and finally
  • Identify what those successful on-page techniques were, in hopes that our client could adapt its content to a winning on-page formula.   

How to spot trends in a sea of data

When the data sets you’re working with are large and no apparent trends stand out, it’s not because they don’t exist. It only means you have to adjust the way you look at the data.

As a disclaimer, we’re not purporting that our approach is the only approach. It was one that made sense in response to another challenge at hand, which, again, is one that’s common to this industry: The intent measures of SEO tools like Semrush and Ahrefs — “Informational,” “Navigational,” “Commercial” and “Transactional,” or some combination thereof — are not very reliable. 

Our approach to spotting these trends in a sea of data went like this:

Step 1. Break it down to short-tail vs. long tail

Numbers don’t lie. Absent reliable intent data, we cut the dataset in half based on MSV ranges: Keywords with MSVs above 200 and those equal to/below 200. We even graphed these out, and indeed, it returned a classic short/long-tail curve.

Short-tail vs. long-tail keyword performance

This gave us a proxy for funnel mapping: Short-tail keywords, defined as high-MSV & broad focus, could be mostly associated with the upper funnel. This made long-tail keywords, being less searched but more specifically focused, a proxy for the lower funnel. 

Doing this also helped us manage the million-plus keyword dataset our tools generated for the client and its five competitor websites. Even if you perform the export hack of downloading data in batches, neither Google Drive nor your device's RAM wants anything to do with that much data.
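As a minimal sketch (with an assumed export file and column names), the split itself is a simple filter on MSV:

```python
import pandas as pd

# Hypothetical keyword export (e.g. from a keyword gap report); columns assumed.
keywords = pd.read_csv("keyword_gap_export.csv")  # columns: keyword, msv, position, domain

MSV_CUTOFF = 200  # the threshold used to split the dataset

short_tail = keywords[keywords["msv"] > MSV_CUTOFF]   # proxy for the upper funnel
long_tail = keywords[keywords["msv"] <= MSV_CUTOFF]   # proxy for the lower funnel

print(len(short_tail), "short-tail keywords,", len(long_tail), "long-tail keywords")
```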

Step 2. Establish a list of keyword-operative root words

The “keyword-operative root word” is the term we gave to root words that are common to many or all of the keywords under a certain topic or content type. For example, “dna” is a common root word to most of the keywords about DNA lab products, which our client and its competitors sell. And “protocols” is a root word for many keywords that exist in upper-funnel, informational content.

We established this list by placing our short- and long-tail data (exported from Semrush’s Keyword Gap analysis tool) into two spreadsheets, where we were able to view the shared keyword rankings of our client and the five competitors. We equipped these spreadsheets with data filters and formulas that scored each keyword with a competitive value, relative to the six web domains analyzed.  

Separately, we took a list of our client's product categories and brainstormed all possibilities for keyword-operative root words. Then we filtered the data for each root word and noted trends, such as the number of keywords that a website ranked for on Google page 1 and the sum of their MSVs. 

Finally, we applied a calculation that incorporated average position, MSV, and industry click-through rates to quantify the significance of a trend. So if a competitor appeared to have a keyword ranking edge over our client in a certain subset of keywords, we could place a numerical value on that edge. 
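Here is a simplified sketch of that filtering and scoring step; the column names, the CTR curve, and the value formula are illustrative stand-ins for the actual spreadsheet filters and formulas, not a reproduction of them.

```python
import pandas as pd

keywords = pd.read_csv("keyword_gap_export.csv")  # hypothetical export, as above

# Assumed industry CTR curve by ranking position (stand-in values).
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def estimated_traffic(row: pd.Series) -> float:
    """Approximate monthly clicks a domain earns from one keyword."""
    ctr = CTR_BY_POSITION.get(int(row["position"]), 0.01)  # beyond page 1: nominal CTR
    return row["msv"] * ctr

# Filter to one keyword-operative root word, e.g. all "dna" keywords.
dna_keywords = keywords[keywords["keyword"].str.contains(r"\bdna\b", case=False)]
dna_keywords = dna_keywords.assign(
    est_traffic=dna_keywords.apply(estimated_traffic, axis=1)
)

# Summarise the competitive value of the trend per domain.
trend = dna_keywords.groupby("domain")["est_traffic"].sum().sort_values(ascending=False)
print(trend)
```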

Step 3. Identify content templates

If one of your objectives is to map keyword trends to the marketing funnel, then it’s critical to understand the role of page templates. Why? 

Page speed performance is a known ranking signal that should be considered. And ecommerce websites often have content templates that reflect each stage of the funnel. 

In this case, all six competitors conveniently had distinct templates for top-, middle- and bottom-funnel content:

  • Top-funnel templates: Text-heavy, informational content in what was commonly called “Learning Resources” or something similar;
  • Middle-funnel templates: Also text-heavy, informational content about a product category, with links to products and visual content like diagrams and videos — the Product Landing Page (PLP), essentially;
  • Bottom-funnel templates: Transactional, Product Detail Pages (PDP) with concise, conversion-oriented text and purchasing calls-to-action.

Step 4. Map keyword trends to the funnel

After cross-examining the root terms (Step 2), keyword ranking trends began to emerge. Now we just had to map them to their respective funnel stage.

Having identified content templates, and having the data divided by short- & long-tail made this a quicker process. Our primary focus was on trends where competitor webpages were outranking our client’s site. 

PageSpeed Insights scores by device, compared across competitors

Identifying content templates brought the added value of seeing where competitors, for example, outranked our client on a certain keyword because their winning webpage was built in a content-rich, optimized PLP, while our client’s lower-ranking page was a PDP.

Step 5. Rule out the off-page ranking factors

Since our goal was to identify & analyze on-page techniques, we had to rule out off-page factors like link equity and page speed. We sought cases where one page outranked another on a shared keyword, in spite of having inferior link equity, page speed scores, etc. 

For all of Google’s developments in processing semantics (e.g., BERT, the Helpful Content Update) there are still cases where a page with thin text content outranks another page that has lengthier, optimized text content — by virtue of link equity. 

To rule these factors out, we assigned an “SEO scorecard” to each webpage under investigation. The scorecard tallied the number of rank-signal-worthy attributes the page had in its SEO favor. This included things like Semrush’s page authority score, the number of internal vs. external inlinks, the presence and types of Schema markup, and Core Web Vitals stats.

The SEO scorecard

The scorecards also included on-page factors, like the number of headers & subheaders (H1, H2, H3…), use of keywords in alt-tags, meta titles & their character counts, and even page word count. This helped give a high-level sense of on-page performance before diving into the content itself. 
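A stripped-down sketch of the scorecard idea might look like this; the attributes, thresholds, and example URLs are illustrative assumptions rather than the rubric actually used in the study.

```python
from dataclasses import dataclass

@dataclass
class PageScorecard:
    """Tally of rank-signal-worthy attributes for one URL (illustrative fields)."""
    url: str
    page_authority: int          # e.g. a 0-100 tool-reported score
    internal_inlinks: int
    has_product_schema: bool
    passes_core_web_vitals: bool
    h_tag_count: int
    meta_title_length: int

    def off_page_points(self) -> int:
        # Crude tally used to rule pages in or out of the on-page comparison.
        return (
            (self.page_authority >= 40)
            + (self.internal_inlinks >= 50)
            + self.has_product_schema
            + self.passes_core_web_vitals
        )

ours = PageScorecard("example.com/dna-kits", 38, 30, True, False, 4, 58)
theirs = PageScorecard("rival.com/dna-kits", 52, 80, True, True, 9, 61)

# A case worth studying: the page with the weaker off-page tally outranks the stronger one.
print(ours.off_page_points(), "vs", theirs.off_page_points())
```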

Our findings

When comparing the SEO scorecards of our client’s pages to its competitors, we only chose cases where the losing scorecard (in off-page factors) was the keyword ranking winner. Here are a few of the standout findings.

Adding H3 tags to product names really works

This month, OrangeValley’s Koen Leemans published a Semrush article, titled, SEO Split Test Result: Adding H3 Tags to Products Names on Ecommerce Category Pages. We found this study especially well-timed, as it validated what we saw in this competitive analysis.

To those versed in on-page SEO, placing keywords in <h3> HTML format (or any level of <h…> for that matter) is a wise move. Google crawls this text before it gets to the paragraph copy. It’s a known ranking signal. 

When it comes to SEO-informed content planning, ecommerce clients have a tendency — coming from the best of intentions — to forsake the product name in pursuit of the perfect on-page recipe for a specific non-brand keyword. The value of the product name becomes a blind spot because the brand assumes it will outrank others on its own product names.

It’s somewhere in this thought process that an editor may, for example, decide to list product names on a PLP as bolded <p> copy, rather than as a <h3> or <h4>. This, apparently, is a missed opportunity. 

More to this point, we found that this on-page tactic performed even better when the <h>-tagged product name was linked (index, follow) to its corresponding PDP and accompanied by a sentence description beneath the product name. 

This is in contrast to the product landing page (PLP) which has ample supporting page copy, and only lists its products as hyperlinked names with no descriptive text. 
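To illustrate the pattern (not any specific platform's template), a product card on the PLP might be rendered as a hyperlinked heading plus a one-sentence description, rather than a bolded paragraph; the product and URL below are hypothetical.

```python
def product_card(name: str, pdp_url: str, blurb: str) -> str:
    """Render one PLP product card: an <h3> whose link points to the PDP,
    followed by a short descriptive sentence, instead of a bolded <p> name."""
    return (
        f'<h3><a href="{pdp_url}">{name}</a></h3>\n'
        f"<p>{blurb}</p>"
    )

print(product_card(
    "PCR Master Mix",                    # hypothetical product
    "/products/pcr-master-mix",          # hypothetical PDP URL
    "Ready-to-use 2x mix for routine PCR amplification.",
))
```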

Word count probably matters, <h> count very likely matters

In the ecommerce space, it’s not uncommon to find PLPs that have not been visited by the content fairy. A storyless grid of images and product names. 

Yet, in every case where two PLPs of this variety went toe-to-toe over the same keyword, the sheer number of <h> tags seemed to be the only on-page factor that ranked one PLP above its competitors’ PLPs, which themselves had higher link equity. 

The takeaway here is that if you know you won’t have time to touch up your PLPs with landing copy, you should at least set all product names to <h> tags that are hyperlinked, and increase the number of them (e.g., set the page to load 6 rows of products instead of 4).  

And word count? Although Google’s John Mueller confirmed that word count is not a ranking factor for the search algorithm, this topic is debated. We cannot venture anything conclusive about word count from our competitive analyses. What we can say is that it’s a component of our finding that…

Defining the entire topic with your content wins

Backlinko’s Brian Dean ventured and proved the radical notion that you can optimize a single webpage to rank for not the usual 2 or 3 target keywords, but hundreds of them. That is if your copy encompasses everything about the topic that unites those hundreds of keywords. 

That practice may work in long-form content marketing but is a little less applicable in ecommerce settings. The alternative to this is to create a body of pages that are all interlinked deliberately and logically (from a UX standpoint) and that cover every aspect of the topic at hand.

This content should address the questions that people have at each stage of the awareness-to-purchase cycle (i.e., the funnel). It should define niche terminology and spell out acronyms. It should be accessible.

In one stand-out case from our analysis, a competitor page held position 1 for a lucrative keyword, while our client’s site and that of the other competitors couldn’t even muster a page 1 ranking. All six websites were addressing the keyword head-on, arguably, in all the right ways. And they had superior link equity.

What did the winner have that the rest did not? It happened that in this lone instance, its product was being marketed to a high-school teacher/administrator audience, rather than a PhD-level, corporate, governmental or university scientist. By this virtue alone, their marketing copy was far more layman-accessible, and, apparently, Google approved too.

The takeaway is not to dumb-down the necessary jargon of a technical industry. But it highlights the need to tell every part of the story within a topic vertical. 

Conclusion: Findings-to-action

There is a common emphasis among SEO bloggers who specialize in biotech & scientific industries on taking a top-down, topical takeover approach to content planning. 

I came across these posts after completing this competitive analysis for our client. This topic-takeover emphasis was validating because the “Findings-To-Action” section of our study prescribed something similar:

Map topics to the funnel. Prior to keyword research, map broad topics & subtopics to their respective places in the informational & consumer funnel. Within each topic vertical, identify:

  • Questions-to-ask & problems-to-solve at each funnel stage
  • Keyword opportunities that roll up to those respective stages
  • How many pages should be planned to rank for those keywords
  • The website templates that best accommodate this content
  • The header & internal linking strategy between those pages

The need to appeal to two audiences is especially pronounced in scientific industries, more so than in common-language industries. One is the AI-driven audience of search engine bots that scour this complex semantic terrain for symmetry of clues and meaning. The other is human, of course, but with a mind that has already mastered this symmetry and is highly capable of discerning it. 

To make the most efficient use of time and user experience, content planning and delivery need to be highly organized. The age-old marketing funnel concept works especially well as an organizing model. The rest is the rigor of applying this full-topic-coverage, content approach.


Zach Wales is Sr. SEO Strategist at Brainlabs.

Subscribe to the Search Engine Watch newsletter for insights on SEO, the search landscape, search marketing, digital marketing, leadership, podcasts, and more.

Join the conversation with us on LinkedIn and Twitter.

The post In a sea of signals, is your on-page on-point? appeared first on Search Engine Watch.

Search Engine Watch


Introducing two new solutions powered by Ads Data Hub

November 1, 2022

Ads Data Hub helps advertisers, agencies and measurement partners do customized analysis of campaigns while protecting user privacy. More than 3,000 brands, agencies, and measurement partners use cloud-based Ads Data Hub to perform analyses for their specific business objectives.

Customers of Ads Data Hub have different needs, so we’ve created more specialized entry points to get started. Marketers require tools to quantify a consumer’s path to purchase and the ability to activate new audiences. At the same time, measurement partners conduct third-party assessment of metrics such as video viewability and audience reach.

To offer a more tailored experience, we are evolving the Ads Data Hub core platform by introducing two dedicated solutions: Ads Data Hub for Marketers and Ads Data Hub for Measurement Partners.

New solutions for specialized needs

Ads Data Hub for Marketers offers a new way for advertisers and agencies to analyze their data. With this solution, they can seamlessly access insights to better inform the way they purchase media. This means a simplified experience for marketers running queries and activating their first-party data.

Riot Games, for example, used Ads Data Hub for richer marketing analyses. The company centralized their insights and combined them with Display & Video 360 and Campaign Manager 360 data. This let Riot Games attribute credit to various ad touch points, accurately measure return on ad spend (ROAS), and establish a new benchmark showing that for every $1 Riot Games spent on Google media, it received $2 in revenue. Marketers, like Riot Games, perform these analyses regularly, with hundreds of thousands of queries run in 2022 alone.

Over time, new query templates, automated workflows, and updates to reporting will reduce the need for additional technical resources and decrease time to generate insights – with plans to implement Publisher Advertiser Identity Reconciliation, also known as PAIR. In addition to these improvements, marketers will soon be able to activate their audience segments on new inventory, including YouTube. As privacy expectations evolve, we will continue to build more solutions that enable advertisers and agencies to measure and activate their first-party data with Ads Data Hub for Marketers.

Ads Data Hub for Measurement Partners gives partners a new access point to provide YouTube measurement services on behalf of marketers, advertisers, agencies, or publishers. With this launch, it’ll be easier for partners to offer accurate measurement and deliver near real-time insights. For marketers, this means they can work with independent third-party partners to calculate and report on YouTube ad performance across devices, formats, and metrics.

These third-party independent measurement services are available to marketers via our growing partner ecosystem. With Dynata and other vendors, we have expanded measurement services on Ads Data Hub to enable cross-media solutions for YouTube. Customers will be able to analyze the performance of YouTube campaigns relative to other media channels (including linear TV, streaming TV, or online video sources). Another partner, DoubleVerify, has earned YouTube Video Viewability accreditation from the Media Rating Council (MRC), in addition to Ads Data Hub's own accreditation announced last year.

In 2023 we plan to integrate with new partners such as iSpot and VideoAmp, joining the list of measurement partners already available with Ads Data Hub.

Commitment to a privacy-centric future

Marketers and measurement partners will benefit from rigorous privacy checks that protect the personal data of users online while still being able to perform comprehensive analytics. These analyses, in addition to insight generation and audience activation, can all be performed with Ads Data Hub users only having access to aggregated data. By investing in privacy-centric solutions that address the specific needs of marketers and measurement partners, we’ve simplified the path to accurate measurement across YouTube and Google campaigns.


Google Analytics Blog