CBPO

Monthly Archives: November 2017

How to use your data to supercharge paid search

November 14, 2017

In today’s marketing climate, data is key. Indeed, more data is generated in a 24-hour period than ever before, with 2.5 quintillion bytes of data being created daily across the globe (IBM, 2017).

The challenge lies in being able to harness this data to optimize marketing activities. After all, without an understanding of what your customers are doing, it is almost impossible to increase conversions and ROI.

One of the key channels for marketers is paid search. Indeed, it is rapidly becoming the most powerful digital marketing channel, with over 2.3 million searches occurring per day. With all of these interactions, marketers are paying a premium to get their brand in front of searchers, reflected in the fact that pay-per-click advertising costs are skyrocketing.

Marketers can gain visibility on their paid search activities, and overcome the rising cost of customer acquisition and retention in this channel, by taking control of their customer data.

In my previous article on how to stop Google AdWords campaigns from failing, I looked at how businesses can use a Customer Data Platform to gain a holistic overview of customer conversion, and properly attribute the role of each keyword in the conversion path.

In this article, I’ll expand on how data-driven attribution and the use of a Customer Data Platform can supercharge your paid search activities.

Content produced in partnership with Fospha.

Step 1: Integrate

The key challenge posed by the rise of multi-channel and multi-device customer journeys is that businesses store this multitude of data in disparate silos, as illustrated in Figure 1.

The result? No unified view of the customer journey, and no understanding of how they are interacting with various marketing channels and campaigns. Businesses must therefore look to integrate their various data sources, using a Customer Data Platform, to provide this granular single customer view.

As well as integrating customer data, a Customer Data Platform will stitch data together, and link typically anonymised data with known identifiers. In doing so, multiple visits – across numerous sessions, channels and devices – are linked to one individual, so marketers can begin to understand who specific customers are, where they came from, what they viewed, and how they interacted with marketing channels on their path to purchase.
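To make the stitching step concrete, here is a minimal sketch in Python of how visits across sessions and devices could be linked to one individual once an identifying event (a login, purchase or email click) ties an anonymous ID to a known customer. The event fields are assumptions for illustration, not Fospha's actual implementation; a real Customer Data Platform does this at far larger scale and with many more signals.

    from collections import defaultdict

    def stitch_identities(events):
        """Link anonymous visitor IDs to known customer IDs.

        Each event is assumed to carry an 'anonymous_id' (cookie or device ID),
        a 'timestamp', and, when the visitor identifies themselves through a
        login, purchase or email click, a 'customer_id'.
        """
        anon_to_customer = {}
        for event in events:
            customer_id = event.get("customer_id")
            if customer_id:
                anon_to_customer[event["anonymous_id"]] = customer_id

        # Group every event under the stitched owner (the known customer if
        # linked, otherwise the anonymous ID itself).
        journeys = defaultdict(list)
        for event in events:
            owner = anon_to_customer.get(event["anonymous_id"], event["anonymous_id"])
            journeys[owner].append(event)

        # Order each journey chronologically so the path to purchase is visible.
        for owner in journeys:
            journeys[owner].sort(key=lambda e: e["timestamp"])
        return journeys

A journey produced this way is what makes it possible to see every paid search interaction that preceded a purchase.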

Once this view is in place, marketers are better equipped to understand the role of specific marketing channels – in this instance, paid search activities – in relation to customer conversions, as they have a full view of where customers interacted with their business before purchase.

Step 2: Attribute

Once your customer data is integrated and providing a clearer picture of what your customers are doing, the next step is to accurately attribute the role of your paid search channels in customer conversions.

For this, a data-driven attribution model – defined as ‘accurately assigning value to each digital channel marketing touchpoint across the complete user journey’ – is key. This model uses advanced algorithmic modelling to help marketers understand the real value and cost associated with each of their marketing touchpoints.

With these insights, you can identify where marketing activity in a particular channel plays little to no role in driving conversions. Marketers can then drill down into their paid search channel, to understand which individual keywords are leading to these conversions.

With this in-depth view, and the granular data source from the Customer Data Platform, marketers gain a much more comprehensive understanding of which keywords are a drain on resources, and which are bringing in high ROI. With this knowledge, they can redistribute spend to help accelerate growth without a drop in leads.
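As a rough illustration of what that redistribution decision rests on, the sketch below takes the stitched journeys from Step 1, splits each conversion's revenue evenly across the paid search keywords that appeared in the journey, and compares attributed revenue to spend. The even split is a deliberate simplification standing in for the advanced algorithmic modelling a real data-driven attribution model uses, and the field names are assumptions.

    from collections import defaultdict

    def attribute_and_compare(journeys, spend_by_keyword):
        """Toy attribution: split each conversion's revenue evenly across the
        paid search keywords seen in that customer's journey, then express the
        result as ROI per keyword (attributed revenue versus spend)."""
        revenue_by_keyword = defaultdict(float)
        for events in journeys.values():
            purchases = [e for e in events if e.get("type") == "purchase"]
            keywords = [e["keyword"] for e in events if e.get("channel") == "paid_search"]
            if not purchases or not keywords:
                continue
            total_revenue = sum(e.get("revenue", 0.0) for e in purchases)
            share = total_revenue / len(keywords)
            for keyword in keywords:
                revenue_by_keyword[keyword] += share

        # Keywords with negative ROI are candidates for reduced spend;
        # high-ROI keywords are candidates for more budget.
        return {
            keyword: (revenue_by_keyword[keyword] - cost) / cost
            for keyword, cost in spend_by_keyword.items()
            if cost > 0
        }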

Step 3: Operationalize

Once marketers have access to these insights, the final step in supercharging their paid search activities is being able to operationalize at scale and in real-time. A Customer Data Platform can integrate directly with bid management platforms – which are already great at optimizing and automating PPC campaigns – to boost their efforts.

The granular understanding of keyword performance, derived through the Customer Data Platform’s rich data and attribution modelling layer, is pushed directly into a bid management platform, like Kenshoo or Marin, to automatically optimize the algorithms that inform their bidding.

This data-driven approach, executed in an automated and frictionless way, helps marketers optimize their paid search channel at scale.
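The hand-off itself can be as simple as a scheduled job that posts the attributed keyword values to whatever integration the bid management platform exposes. The endpoint, payload shape and authentication below are placeholders only; Kenshoo, Marin and similar platforms each define their own contracts, so their documentation is the source of truth.

    import json
    import urllib.request

    def push_keyword_values(keyword_roi, endpoint_url, api_token):
        """Post attributed keyword performance to a bid management platform.

        endpoint_url, the payload shape and the bearer-token header are
        illustrative placeholders, not a real platform's API.
        """
        payload = json.dumps({"keyword_values": keyword_roi}).encode("utf-8")
        request = urllib.request.Request(
            endpoint_url,
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": "Bearer " + api_token,
            },
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.status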

Once you have taken these steps to optimize your paid search channels, you can use your Customer Data Platform to tackle other priority channels – to reduce costs and boost ROI – simply by integrating that data source into your Customer Data Platform and applying the same data-driven attribution modelling.

Content produced in partnership with Fospha. Views expressed in this article do not necessarily reflect the opinions of Search Engine Watch.

Search Engine Watch


So You Want to Geoengineer the Planet? Beware the Hurricanes

November 14, 2017

A new study gives fascinating insight into how pumping sulfur into the stratosphere could affect hurricanes.
Feed: All Latest


Does Tomorrow Deliver Topical Search Results at Google?

November 14, 2017

At one point in time, search engines such as Google learned about topics on the Web from sources such as Yahoo! and the Open Directory Project, which provided categories of sites, within directories that people could skim through to find something that they might be interested in.

Those listings included hierarchical topics and subtopics, but they were managed by human beings, and both directories have since closed down.

In addition to learning about categories and topics from such places, search engines used to use such sources to do focused crawls of the web, to make sure that they were indexing as wide a range of topics as they could.

It’s possible that we are seeing those sites replaced by sources such as Wikipedia, Wikidata, Google’s Knowledge Graph, and the Microsoft Concept Graph.

Last year, I wrote a post called Google Patents Context Vectors to Improve Search. It focused upon a Google patent titled User-context-based search engine.

In that patent we learned that Google was using information from knowledge bases (sources such as Yahoo Finance, IMDB, Wikipedia, and other data-rich and well organized places) to learn about words that may have more than one meaning.

An example from that patent was that the word “horse” has different meanings in different contexts.

To an equestrian, a horse is an animal. To a carpenter, a horse is a work tool. To a gymnast, a horse is a piece of equipment they perform maneuvers upon during competitions with other gymnasts.

A context vector takes these different meanings from knowledge bases, along with the number of times each meaning is mentioned in those places, to catalogue how often the word is used in which context.
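As a toy illustration of that counting idea (not the patent's actual method, and with an assumed data shape for the knowledge-base entries), a context vector for a term could be built by counting how often the term appears under each sense:

    from collections import Counter

    def build_context_vector(term, knowledge_base_entries):
        """Count how often each sense of a term appears across knowledge-base
        entries, e.g. ("equestrian", "...a horse is an animal ridden in...").
        Returns the counts normalised into proportions per sense."""
        counts = Counter()
        for sense, text in knowledge_base_entries:
            counts[sense] += text.lower().count(term.lower())
        total = sum(counts.values()) or 1
        return {sense: count / total for sense, count in counts.items()}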

I thought knowing about context vectors was useful for doing keyword research, but I was excited to see another patent from Google appear in which the word “context” plays a featured role. When you search for something such as a “horse”, the search results you receive are going to be a mix of the different types of horse, depending upon the meaning. As this new patent tells us about such search results:

The ranked list of search results may include search results associated with a topic that the user does not find useful and/or did not intend to be included within the ranked list of search results.

If I were searching for the animal, I might include another word in my query that better identified the context of my search. The inventors of this new patent seem to have had a similar idea. The patent mentions:

In yet another possible implementation, a system may include one or more server devices to receive a search query and context information associated with a document identified by the client; obtain search results based on the search query, the search results identifying documents relevant to the search query; analyze the context information to identify content; and generate a group of first scores for a hierarchy of topics, each first score, of the group of first scores, corresponding to a respective measure of relevance of each topic, of the hierarchy of topics, to the content.

From the pictures that accompany the patent, it looks like this context information takes the form of headings that appear above groups of search results, identifying the context those results fit within. Here’s a drawing from the patent showing topical search results (rock/music and geology/rocks):

Search results in context: different types of ‘rock’ on a search for ‘rock’ at Google

This patent does remind me of the context vector patent, and the two processes look like they could work together. The new patent is:

Context-based filtering of search results
Inventors: Sarveshwar Duddu, Kuntal Loya, Minh Tue Vo Thanh and Thorsten Brants
Assignee: Google Inc.
US Patent: 9,779,139
Granted: October 3, 2017
Filed: March 15, 2016

Abstract

A server is configured to receive, from a client, a query and context information associated with a document; obtain search results, based on the query, that identify documents relevant to the query; analyze the context information to identify content; generate first scores for a hierarchy of topics, that correspond to measures of relevance of the topics to the content; select a topic that is most relevant to the context information when the topic is associated with a greatest first score; generate second scores for the search results that correspond to measures of relevance, of the search results, to the topic; select one or more of the search results as being most relevant to the topic when the search results are associated with one or more greatest second scores; generate a search result document that includes the selected search results; and send, to a client, the search result document.
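To see how that flow fits together, here is a rough sketch that follows the steps in the abstract: score topics against the context, pick the best-scoring topic, then re-score and select results for that topic. The patent does not publish its scoring functions, so the simple term-overlap counts and data shapes below are assumptions for illustration.

    def filter_results_by_context(search_results, context_terms, topic_hierarchy, top_n=10):
        """Follow the abstract's flow: first scores rank topics against the
        context; second scores rank search results against the chosen topic.

        Topics and results are represented as bags of terms, e.g.
        {"name": "geology", "terms": ["rock", "mineral", "sediment"]}.
        """
        def overlap(terms_a, terms_b):
            return len(set(terms_a) & set(terms_b))

        # First scores: relevance of each topic in the hierarchy to the context.
        best_topic = max(topic_hierarchy, key=lambda topic: overlap(topic["terms"], context_terms))

        # Second scores: relevance of each search result to the selected topic.
        ranked = sorted(
            search_results,
            key=lambda result: overlap(result["terms"], best_topic["terms"]),
            reverse=True,
        )
        return best_topic["name"], ranked[:top_n]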

It will be exciting to see topical search results start appearing at Google.





SEO by the Sea ⚓


Twitter launches lower-cost subscription access to its data through new Premium APIs

November 14, 2017

 Twitter tried to mend its relationship with developers earlier this year with the launch of a new API platform which focused on streamlining APIs and the promise of additional tiers of access. Twitter said it would offer free APIs for testing ideas, self-serve access, as well as paid access for increased functionality, in addition to its enterprise APIs. Today, Twitter is delivering on its… Read More
Social – TechCrunch


Semantic Keyword Research and Topic Models

November 14, 2017


I went to the Pubcon 2017 Conference this week in Las Vegas, Nevada, and gave a presentation about Semantic Search topics based upon white papers and patents from Google. My focus was on things such as Context Vectors and Phrase-Based Indexing.

I promised in social media that I would post the presentation on my blog so that I could answer questions if anyone had any.

I’ve been doing keyword research like this for years: I look at other pages that rank well for the keyword terms I want to use, identify phrases and terms that tend to appear upon those pages, and include them on the pages I am trying to optimize. It made a lot of sense to start doing that after reading about phrase-based indexing in 2005 and later.
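As a rough sketch of that workflow (the inputs are assumed to be page texts you have already collected; this is not a description of how Google's phrase-based indexing itself works), you could surface the terms that several top-ranking pages have in common:

    import re
    from collections import Counter

    STOPWORDS = frozenset({"the", "and", "a", "an", "of", "to", "in", "for", "on", "is"})

    def common_terms(top_ranking_pages, min_pages=3):
        """Return terms that appear on at least min_pages of the pages that
        rank well for a query; these are candidates to work into the page
        being optimized."""
        pages_containing = Counter()
        for text in top_ranking_pages:
            terms = set(re.findall(r"[a-z]+", text.lower())) - STOPWORDS
            pages_containing.update(terms)
        return [term for term, count in pages_containing.most_common() if count >= min_pages]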

Some of the terms I see when I search for Semantic Keyword Research include things such as “improve your rankings,” “conducting keyword research,” and “smarter content.” I’m also seeing phrases I’m not a fan of, such as “LSI Keywords,” which has about as much scientific credibility as keyword density, which is to say next to none. Researchers from Bell Labs wrote a white paper about Latent Semantic Indexing in 1990; it was used with small (fewer than 10,000 documents) and static collections of documents, whereas the web is constantly changing and hasn’t been that small for a long time.

There are many people who call themselves SEOs who tout LSI keywords as keywords chosen for having related meanings to other words; unfortunately, that has nothing to do with the LSI that was developed in 1990.

If you are going to present research or theories about things such as LSI, it really pays to do a little research first. Here’s my presentation. It includes links to the patents and white papers that the ideas within it are based upon. I do look forward to questions.





SEO by the Sea ⚓


How Often Does Google Search Match to Demographics?

November 14, 2017

Google publicly rolled out demographic targeting for search in the fall of 2016. This feature allows advertisers to segment data and adjust bids based on gender and age estimates of users. The one downside is the “undetermined” segment, which can make up a large proportion of the traffic. This can make some advertisers […]

Read more at PPCHero.com
PPC Hero


iUNU aims to build cameras on rails for growers to keep track of their crop health

November 14, 2017

 You’ve probably spent a lot of time keeping track of your plants and all the minor details, like the coloration of the leaves, in order to make sure they’re healthy — but for professional growers in greenhouses, this means keeping track of thousands of plants all at once. That can get out of hand really quickly as it could involve just walking through a greenhouse with an… Read More
Enterprise – TechCrunch


Killer demand gen strategy, Part 2: Google Display Network targeting

November 14, 2017

This is Part 2 of my blog series on crafting and executing killer demand gen strategies.

In Part 1, I discussed building out various personas to target, as well as how to craft the right creative. Now let’s chat through how to actually target these personas!

Both Google Display Network and Facebook have great audience targeting capabilities that allow you to get in front of your target audiences and the personas you have built out. Full disclosure: I was planning to wrap the GDN and Facebook together for this post, but both have so many features that they warrant their own edition.

So let’s dive into how to target your personas and audiences on the GDN, and save Facebook for Part 3.

Keyword contextual targeting (KCT)

Keyword contextual targeting is where you bid on keywords and Google will match you to pages relevant to your terms. You’ll notice two options when it comes to KCT:

  1. Content – your ads show on webpages relevant to your keywords.
  2. Audience – your ads show on relevant pages and to people who might be interested in those keywords (essentially giving Google more control to do its thing).

My recommendation is to start off with Content, because you know exactly what you are getting into; don’t give Google control right away and make it hard to understand true performance. Content will have a lot less reach, but you have full visibility into things. As you begin seeing results, you can always adjust accordingly.

As a starting point, seed your keyword lists with your top 10-15 performing search terms – and then, of course, layer on age and gender demographic information so you are getting in front of the most relevant eyes.
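If you already pull a search terms report, picking that seed list can be as simple as ranking terms by conversions and cost efficiency. The report shape below (term, conversions, cost) is assumed for illustration:

    def top_search_terms(search_term_report, limit=15, min_conversions=1):
        """Pick the strongest search terms to seed a keyword contextual
        targeting campaign: rank by conversions, then by lower cost per
        conversion, and keep the top `limit` terms."""
        qualifying = [row for row in search_term_report if row["conversions"] >= min_conversions]
        ranked = sorted(
            qualifying,
            key=lambda row: (row["conversions"], -row["cost"] / row["conversions"]),
            reverse=True,
        )
        return [row["term"] for row in ranked[:limit]]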

Additionally, think about the personas you developed. In Part 1, I gave the example of a persona that loved celebrity fashion and gossip; building terms around those interests to get onto those pages is another way to get in front of the right eyes.

Custom Affinity Audiences

With Custom Affinity Audiences, you can input domains and Google will look at the users visiting those domains – their makeup, demographics, the topics of other sites they visit, and so on. Google then crafts an audience similar to those users, which you can target.

With Custom Affinity Audiences, I recommend creating different audiences to target based off of:

  1. Competitor domains
  2. Industry-relevant websites
  3. Persona-relevant websites (think of the personas you have created and the types of websites they would visit)

In-Market Audiences

With In-Market Audiences, Google identifies people who are actively shopping for certain products and services. This is pretty clear-cut – choose In-Market Audiences relevant to your business.

Don’t forget to leverage the audience insights that Google gave you when developing your personas; those typically showcase other products and services that your core audience is in the market for!

Refine your targeting to get closer to your target personas

For both KCT and In-Market Audiences, I recommend that you further refine your targeting by applying demographic layering onto those campaigns to get closer to your target personas. (With Custom Affinity Audiences, Google already incorporates demographic information from the data they pull as they analyze the audiences visiting the sites you enter.)

The above strategies are well worth testing out as you look to get in front of the right eyes and scale your business.

In Part 3, we’ll dive into Facebook and how to best leverage its advanced targeting capabilities to get in front of your personas and target market!

Search Engine Watch


iOS 11.2 is going to support faster 7.5W Qi wireless charging

November 14, 2017

 The iPhone 8, iPhone 8 Plus and iPhone X all support wireless charging using the Qi standard. It means that iPhones are now compatible with hundreds of chargers out there. But iPhone Qi charging is currently limited to 5W, or the slowest wireless charging speed. Apple is currently working on iOS 11.2 — this update is going to support 7.5W charging.
Wireless charging is nice if you… Read More

Gadgets – TechCrunch