The middle ground for single keyword ad groups (SKAGs)

May 25, 2019

Its acronym aside (perhaps the most unfortunate in the industry), do single keyword ad groups (SKAGs) have a role in modern paid search?

For many years, single keyword ad groups were the hallmark of good PPC strategy. And aside from a slight feeling of unease when saying the word, SKAGs appeared to offer much.

In simple terms, this was the practice of placing a single keyword in each ad group, rather than a small group of closely themed keywords. This gave the advertiser increased control: the ad copy could contain the exact keyword, maximizing relevance and Quality Score. Match types and negative keywords could be used to ensure queries matched your keyword exactly, providing precise control over visibility. And finally, you could easily understand the true performance of an individual keyword.

Complexity at scale

Arguably, however, the benefits of this approach were incremental when implemented in an otherwise well organized and maintained PPC account structure. In fact, the benefits could be outweighed by the challenges they posed.

The complexity of the SKAG structure, when operated at scale, could jeopardize accuracy. For example, if you were operating a standard structure with 1,500 keywords, averaging three ads and five keywords per ad group, you would be managing 900 individual ads. Convert this to a SKAG structure, maintaining three ads per ad group, and that number jumps to 4,500 individual creatives to be maintained.
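The ad-count arithmetic above can be verified with a quick sketch (the helper function is ours; the figures are the article's worked example):

```python
# Total ads to maintain = (keywords / keywords per ad group) * ads per ad group.
def total_ads(keywords, ads_per_group, keywords_per_group):
    ad_groups = keywords // keywords_per_group
    return ad_groups * ads_per_group

standard = total_ads(1500, 3, 5)  # 300 ad groups -> 900 ads
skag = total_ads(1500, 3, 1)      # 1,500 ad groups -> 4,500 ads
print(standard, skag)  # 900 4500
```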

Not only that, but applying cross-matching negatives to stream traffic accurately across this number of ad groups makes management significantly more complex. And this is just a simplified example with a modest number of keywords; retailers with a large product range may operate keywords in the tens of thousands.

Operating SKAGs at scale increases the chances of inadvertently blocking traffic to keywords, as well as of poor-quality or inappropriate ad copy being overlooked. Both would have a negative impact on performance. To mitigate this, an increasing amount of “housekeeping” work is required, either detracting from more strategic work to develop and grow the activity or increasing costs to cover the extra resource required.

So, are the merits of the SKAG structure outweighed by the effort to maintain them? Or even worse, and perhaps ironically, do they increase the risk of inaccuracy?

Are SKAGs still relevant?

Putting that debate aside, there is a question over whether SKAGs are even appropriate in contemporary PPC accounts.

In a widely discussed recent article, Emma Franks of Hanapin Marketing makes the case that SKAGs no longer serve as a best practice for paid search. Her argument centers on the evolution of Google’s match types, which are shifting to better match keywords to the user’s intent, rather than simply matching words to the query.

Emma F's comment on SKAGs

Source: Unbounce

This means that a single keyword could effectively be matched to many variations, all of which are relevant and have the same intent. The below example of how this works is taken directly from Google Ads Help pages:

Source: Google Inside AdWords

This level of variant matching implies that, to truly achieve the goal of the SKAG structure (complete control over which queries match and which creative is served), the extent of negative cross-matching required becomes too taxing and hard to achieve.

Emma’s summary of the potential issues was:

  • Multiple ad groups that address the same keyword intent
  • Duplicated ad copy that is no longer customizable for each individual search
  • Cross-contamination among keyword search terms for multiple ad groups
  • The potential for missed impressions/clicks/conversions/revenue due to an overabundance of negative keywords
  • Wasted time spent on keyword additions and exclusions, ad copy testing and revisions further topped with stress about new Google updates

Essentially, as Google increasingly takes advantage of machine learning to better match ads to the user’s intent, the SKAG structure offers advertisers an ever more difficult way to grab that control back from Google and manage it manually. But in an industry driven by automation, machine learning, and AI, can a manually controlled account ever keep up?

Is there a middle ground?

So then, SKAGs are a challenge to manage at scale and essentially pull in the opposite direction to the way in which Google is developing the Google Ads offering. In this case, they don’t have a place in a well-managed PPC campaign, right? Well, not entirely.

Where individual keywords command a very high share of the overall search volume, placing those terms in ad groups of their own can offer greater flexibility. You get the control over matching, landing pages, and copy that SKAGs provide, but at a far more manageable scale. You can also apply specific audience targeting, demographic and device modifiers, and day-parting at what is effectively keyword level. This provides many more levers for optimizing such high-volume terms. Take things a step further and place each SKAG in its own campaign, and you can now apply a specific budget, ad rotation and delivery settings for that keyword, as well as its very own bid strategy.

Once again, this comes back to an assessment of “effort vs reward”. To be truly worth it and indeed to make automated features such as bid strategies work, the individual keywords themselves must drive a high volume.

A blended approach

So, in the war of opinions on this subject (refer back to the comments section of Emma Franks’ article!), there is an answer to the mixed feelings around SKAGs. Yes, SKAGs do have a role in effective PPC activity, but they should be used strategically alongside other approaches to maximize performance.

High-volume hero or brand terms can benefit from the SKAG structure to increase the levels of control and flexibility at a keyword level for the terms that drive the largest proportion of your traffic. Using traditional, tightly themed ad groups for the bulk of your remaining inventory will ensure more manageability while it continues to deliver performance. Finally, tools such as Dynamic Search Ads can offer a “catch-all” strategy to capture new and emerging search terms when deployed correctly.

An approach such as this provides maximum control over the terms that drive the most performance, whilst also allowing advertisers to reap the benefits of machine learning and automation to efficiently and effectively manage the body and long tail terms.

Advertisers are all different, so inevitably, each paid search structure will be unique as a result. The key, as ever, is finding the right balance that works for you.

Jon Crowe is Director of PPC Strategy at a global digital marketing agency, Croud.

The post The middle ground for single keyword ad groups (SKAGs) appeared first on Search Engine Watch.

Search Engine Watch


Five ways blockchain will impact search marketing

May 21, 2019

Few technologies promise to have as tremendous an impact on the marketplace as blockchain, though many professionals in the search marketing industry are still entirely unfamiliar with it. Blockchain’s disruptive nature is changing digital advertising whether those professionals hear about it or not, however, meaning it’s imperative to catch up on how this technology is changing the industry if you want to remain competitive.

Here are five of the major ways that blockchain will impact search marketing, and how advertising professionals are already beginning to master this interesting technology as it takes over.

1. Blockchain will make ads trustworthy

Consumers hate advertisements for a number of reasons, but by and large the most common is that they simply think advertising technology is untrustworthy. Nobody likes feeling as if they are being surveilled 24/7, and few people trust digital advertisements that appear on their screen enough to click on them, even if their contents are interesting. Blockchain technology promises to help with this problem by securing the ad supply chain and making the marketing process more trustworthy to consumers everywhere.

Soon, thanks to blockchain services, ad tech vendors, buyers, and publishers will be more connected than ever before. The transparency that is sorely needed in the ad supply chain can be brought about by blockchain services, which, as shared ledgers, are accessible to every party involved in a financial transaction. Website owners and ad vendors of the future will thus be able to work with one another much more securely when making marketing arrangements.

2. Blockchain is delivering ad transparency

Elsewhere, blockchain services will be applied to make ads more transparent in an effort to win over the trust of skeptical consumers. Companies like Unilever are now teaming up with the likes of IBM on blockchain projects that they hope will disclose information about their business footprint and the way they collect and utilize information on customers. As these endeavors become more successful, others will be convinced to enlist the help of blockchain technology when it comes to ensuring a transparent advertising industry.

3. Blockchain is changing ad payments

Blockchain technology will also impact search marketing by disrupting the way that advertisement payments are facilitated. Companies like Amino Payments will soon be springing up left and right as the market for blockchain services grows larger and larger. These businesses will help mainstream blockchain-powered ad buys that make use of interesting smart contracts. While smart contracts are only just beginning to become an accepted part of the business world, they’ll be a mainstream facet of doing business sooner than we think, all thanks to the wonderful power of blockchain.

4. New advertising ecosystems are springing up

Some of the ways that blockchain is impacting search marketing are truly monumental. Blockchain technology is helping new advertising ecosystems get on their feet, for instance, with nascent companies like Adshares that are working hard to create a blockchain-based advertising ecosystem. As cryptocurrencies and other blockchain-powered technologies become more mainstream, we’ll see an increased need for blockchain-friendly payment systems.

Search marketing professionals in the future may have to rely on specialized expertise when navigating these new blockchain-powered advertising ecosystems, some of which use a standard bitcoin wallet and may come to be dominated by the IT-savvy. Programmatic advertising has already been upended time and again in recent years as the digital revolution brought about better computers, and the rise of blockchain could very well be the next stage in that cycle of disruption.

5. New blockchain browsers will reshape user experiences

Finally, the digital experience of the average consumer will be fundamentally changed by the introduction of blockchain browsers. Browser options like Brave are becoming more popular and grabbing headlines as they promise a privacy-respecting internet experience that features more honest and safer ad tech. Our current understanding of the marketing world may be entirely useless a few years from now, when blockchain-powered browsers offer secure, personalized search options to users who are sick and tired of modern advertising.

Search marketing is in for more than its fair share of disruptive changes in the forthcoming years, largely because of the advent of blockchain technology. Like any other technological innovation, blockchain will take time and investment to grow into its full potential, but it’s already quite clear that its development is jarring advertising professionals.



Using IF functions on Google Ads to improve productivity

May 18, 2019

Back in the days when I was learning PPC, the two biggest growing pains I had were:

  1. Learning when segmenting campaigns out would maximize efficiency
  2. Recognizing the point where the juice is no longer worth the squeeze

Rather than creating clutter and a burdensome account to manage, I’ve since learned to make use of everything I can to speed up my workflow and free up bandwidth to focus on things that actually make a difference.

IF functions are a versatile means to tailor your ads to users in real time, using either the type of device they’re browsing on or the audience segment they belong to as signals to serve up specialized ad copy. The right message at the right time can make all the difference between a conversion or another bounced visitor. Search marketing is rapidly moving towards heavy automation and personalization, so IF functions are helpful because they’re a simple way to keep your seat at the table.

Setting up IF functions

The process of setting up IF functions is painless. You could easily set one up in the time it takes to finish this article, regardless of your comfort level with Excel formulas. And if doing it in Excel is too daunting, you can set them up directly in the Google Ads UI under the Ads tab.

The basic logic is as follows:

{=IF(condition is met, show this text):If not, show this text}.

So, if you wanted specific messaging for users on mobile, the logic runs something like this:

IF the user is ON a mobile device, show mobile-friendly CTA. If not, show the general CTA.

To put that in the basic formula:

{=IF(device=mobile, Call Now!):Get a Quote.}

Another common usage of IF statements is serving specific offers to specific audience segments.

The basic formula for audience-based IF functions is:

{=IF(Audience IN(audience name), Audience-specific copy.):General copy}

To put the above into a sentence: “If a user is IN this specific audience segment, serve them this specific copy. Otherwise, serve this more general copy.”

Suppose you were running a tiered promotion where Club Members were eligible for an additional 15% discount on top of a 30% off sale. That text would look something like this:

Shop Now for {=IF(Audience IN(ClubList),45%):30%} Off!

Or, if your nurture campaigns weren’t entirely broken out and you wanted to move recent visitors into booking a consultation, you might have something like:

{=IF(Audience IN(Returning Visitor 7 Days), Book Your Consultation Today!):Download Our Free Guide.}

Take note that you can target multiple audience segments in the same IF function. However, you are still limited to two copy options. The syntax is the same, just with your audiences separated by commas in the Audience IN section:

{=IF(Audience IN(Segment1,Segment2,Segment3), Learn More!):Get a Quote.}

If you’re feeling overwhelmed by keeping track of all of those brackets, commas, and colons, you can also build IF functions directly in the Google Ads UI. Simply add an open brace ({) in any ad field from Headline 1 through the URL paths (note that ad customizers are not supported in Final URLs) and let the system walk you through putting it together.
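As an illustration only (this is not Google's implementation; the function and variable names are ours), the substitution logic of the device- and audience-based examples above can be sketched in Python:

```python
# Models {=IF(device=mobile,Call Now!):Get a Quote.}: one custom option,
# one default, chosen by the user's device at serve time.
def resolve_device_if(target_device, text_if_match, default_text, user_device):
    return text_if_match if user_device == target_device else default_text

# Models {=IF(Audience IN(Seg1,Seg2),custom):default}: matching any one of
# the listed segments triggers the custom copy, but there are still only
# two possible copy options.
def resolve_audience_if(segments, text_if_in, default_text, user_audiences):
    return text_if_in if any(s in user_audiences for s in segments) else default_text

print(resolve_device_if("mobile", "Call Now!", "Get a Quote.", "mobile"))   # Call Now!
print(resolve_audience_if(["ClubList"], "45%", "30%", {"ClubList"}))        # 45%
```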

Things to note while using IF functions

  • The character limits for each field still apply (they count the ad text variants your function defines, not the function syntax itself).
  • Symbols in the function’s ad text options like quote marks (both single and double), commas, and colons will need to be preceded by backslashes (\) for the function to work properly. For example, rather than “SearchEngineWatch’s” your function copy would read “SearchEngineWatch\’s.”
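The escaping rule in the second bullet (a backslash before quotes, commas, and colons) can be captured in a small helper (a hypothetical utility of ours, not part of any Google tool):

```python
import re

# Prefix quote marks (single and double), commas, and colons with a
# backslash so the text is safe inside an IF function's ad text options.
def escape_for_if(text):
    return re.sub(r"([\"',:])", r"\\\1", text)

print(escape_for_if("SearchEngineWatch's"))  # SearchEngineWatch\'s
```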

Using IF functions for fun and profit

Although IF functions don’t offer as many options to customize ads as using a business data feed, the options they do provide are staggering.

Shaping expectations based on device type is a must. While mobile browsers have come a long way in recent years, filling out long forms on a small screen with no keyboard is a slog, and desktop users might not have the same propensity to turn into brick and mortar visitors.

Tailoring your copy for devices isn’t a replacement for setting realistic device bid modifiers and taking cross-device/cross-channel conversions into account. But it is another way to squeeze more efficiency out of your ad budget.

Beyond device type, the real power of IF functions comes from the ease with which you can target specific audience segments. If you have a large enough CRM list to make customer match audiences viable for search, great. If your lists aren’t quite big enough, have no fear: you can build audiences in Google Analytics and import them into Google Ads. The options are endless.

Bonus: Countdown ads

Countdown ads are yet another feature that is effective and easy to use but tends to fly under the radar. Beyond highlighting promotions, I’ve seen success in highlighting shipping windows (keep that in mind for the holiday shopping season), special events (for example, store openings), and more. Just like the other customizers available, countdowns can be placed anywhere in an ad except the URL.

The syntax is pretty straightforward:

  • Specify a date in Year/Month/Day, pick a time in Hour:Minute:Second
  • Specify the language you’re targeting, and how many days you’d like the countdown to run

In the below example, the countdown will end at noon on July 7, 2019, after starting seven days prior:

{=COUNTDOWN("2019/7/7 12:00:00","en-US",7)}
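As a sanity check on the parameters, the start of the countdown (the end time minus the days-before value) can be computed directly:

```python
from datetime import datetime, timedelta

# {=COUNTDOWN("2019/7/7 12:00:00","en-US",7)}: the ad stops counting down
# at the end datetime and begins displaying 7 days earlier.
end = datetime(2019, 7, 7, 12, 0, 0)
start = end - timedelta(days=7)
print(start)  # 2019-06-30 12:00:00
```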

The future is now

Running a successful paid search campaign has always required knowing who your customers are. Ad customizers make reaching the right user with the right messaging easier, and at scale. IF functions are an easy inroad toward tailoring your users’ experiences to their needs. They give you more control over your ad copy than dynamic keyword insertion or responsive search ads, with a lower likelihood of matching undesirable search queries than dynamic search ads. And with less setup needed than Ad Customizer feeds, IF functions ultimately give savvy search marketers a powerful tool to boost performance.

Have any questions or interesting functions of your own? Share them in the comments.

Clay Schulenburg is Director of SEM at PMG.



A summary of Google Data Studio: Updates from April 2019

May 14, 2019

April was a big month for Google Data Studio (GDS), with Google introducing some significant product updates to this already robust reporting tool.

For those not familiar with GDS, it is a free dashboard-style reporting tool that Google rolled out in June 2016. With Data Studio, users can connect to various data sources to visualize and share data from a variety of web-based platforms.

GDS supports native integrations with most Google products including Analytics, Google Ads, Search Ads 360 (formerly Doubleclick Search), Google Sheets, YouTube Analytics, and Google BigQuery.

GDS also supports connectors, which users can purchase to import data from over one hundred third-party sources such as Bing Ads, Amazon Ads, and many others.

Sample Google Data Studio dashboard

Source: Google

1. Google introduces BigQuery BI Engine for integration with GDS

BigQuery is Google’s massive enterprise data warehouse. It enables extremely fast SQL queries by using the same technology that powers Google Search. The new BigQuery BI Engine is an in-memory analysis service that connects BigQuery to GDS, making interactive dashboards built on BigQuery data much more responsive. Per Google,

“Every day, customers upload petabytes of new data into BigQuery, our exabyte-scale, serverless data warehouse, and the volume of data analyzed has grown by over 300 percent in just the last year.”

BigQuery BI Engine stores, analyzes, and finds insights on your data

Source: Google

2. Enhanced data drill-down capabilities

You can now reveal additional levels of detail in a single chart using GDS’s enhanced data drill down (or drill up) capabilities.

You’ll need to enable this feature in each specific GDS chart. Once enabled, you can drill down from a higher level of detail to a lower one (for example, country to city), or drill up from a lower level of detail to a higher one (for example, city to country). You must be in “View” mode to drill up or drill down (as opposed to “Edit” mode).

Here’s an example of drilling up in a chart that uses Google’s sample data in GDS.

GDS chart showing clicks by month

Source: Google

To drill up by year, right-click on the chart in “View” mode and select “Drill up” as shown below.

GDS chart showing the option to “Drill up” the monthly data to yearly data

Visit the Data Studio Help website for detailed instructions on how to leverage this feature.

3. Improved formatting of tables

GDS now allows for more user-friendly and intuitive table formatting. This includes the ability to distribute columns evenly with just one click (by right-clicking the table), resizing only one column by dragging the column’s divider, and changing the justification of table contents to left, right, or center via the “Style” properties panel in “Edit” mode.

Example of the table properties tab in GDS

Source: Google

Detailed instructions on how to access this feature are located here.

4. The ability to hide pages in “View” mode

GDS users can now hide pages in “View” mode by right-clicking on the specific page (accessed via the top submenu), clicking the three vertical dots to the right of the page name, and selecting “Hide page in view mode”. This feature comes in handy when you’ve got pages you don’t want your client (or anyone else) to see when presenting the GDS report.

The new “Hide page” feature in GDS

Source: Google

5. Page canvas size enhancements

Users can now customize each page’s size with a new feature that was rolled out on March 21st (we’re sneaking this into the April update because it’s a really neat feature).

Canvas size settings can be accessed from the page menu at the top of the GDS interface. Select Page>Current Page Settings, and then select “Style” from the settings area at the right of the screen. You can then choose your page size from a list of pre-configured sizes or set a custom size of your own.

GDS Page Settings Wizard

Source: Google

6. New Data Studio help community

As GDS adds more features and becomes more complex, it seems only fitting that Google would launch a community help forum for this tool. So, while this isn’t exactly a new feature to GDS itself, it is a new resource for GDS users that will hopefully make navigating GDS easier.

Users can access the GDS Help Community via Google’s support website, or by selecting “Help Options” from the top menu bar in GDS (indicated by a question mark icon) and then clicking the “Visit Help Forum” link.

The Help menu within GDS

Source: Google

Conclusion

We hope that summarizing the latest GDS enhancements has made it a little easier to digest the many new changes that Google rolled out in April (and March). Remember, you can always get a list of updates, both new and old, by visiting Google’s Support website.

Jacqueline Dooley is the Director of Digital Strategy at CommonMind.



Study: How to use domain authority for digital PR and content marketing

May 11, 2019

For the SEO community, Domain Authority is a contentious metric.

Domain Authority (DA) is defined by Moz as

“A search engine ranking score developed by Moz that predicts how well a website will rank on search engine result pages (SERPs). A Domain Authority score ranges from one to 100, with higher scores corresponding to a greater ability to rank.”

Some people say that this score does more harm than good because it distracts digital marketers from what matters. Improving your DA doesn’t mean you’re improving your rankings. Others tend to find it useful on its own as a quick way to determine the quality or trustworthiness of a site.

Here’s what I say: from a digital PR perspective, domain authority is valuable when you’re using it to compare sites relative to one another. In fact, DA provides real value for us PRs and is incredibly useful to our work.

Think of it this way: there are more websites than ever before, roughly 1.5 billion, which in some ways means there is more opportunity for marketers to get their content out into the world and in front of new audiences. While most people think that journalism is dying out, an enlightening post on Recode by Rani Molla explains that “while job postings for journalists are off more than 10 percent since 2004, jobs broadly related to content have almost quadrupled.”

In other words, if outreach is executed well, there are more places than ever to get your content featured and lead to driving traffic, broadening your audience, and improving your search ranking.

But even the most skilled PR teams can’t reach out to 1.5 billion sites. The knowledgeable ones know that you really only need one successful placement to get your content to spread like wildfire all over the Internet, earning links and gaining exposure for your brand in the process. With so many options out there, how do PR professionals know which sites to spend time targeting?

That’s where DA comes into play. When it comes to link building, content marketers know that not all backlinks and brand mentions are created equal. The value of a link or mention varies depending on the referring website. Moz’s DA score is a way for us PRs to quickly and easily assess the quality of the websites we target for our clients’ content marketing campaigns.

Our team tends to bucket online publishers, blogs, and websites into three categories:

  • Top-tier
  • Mid-tier
  • Low-tier

Keep in mind, particularly with the new Moz update, that when deciding who to pitch you must take a holistic approach. While domain authority is an excellent way to quickly assess the quality of a website, a site’s DA can change at any minute due to a multitude of factors, so make sure you also take into account your goals and the site’s audience, social following, and reputation, as well as its Moz DA score. In response to a Marketing Land tweet about the new DA, Stevie Howard put it perfectly.

Screenshot of Stevie Howard's tweet in response to a Marketing Land tweet about the new DA

Top-tier sites

What constitutes a top-tier website? Can a top-tier site have a low DA? Potentially, but it’s uncommon.

When you look at the holy grail of media coverage, DA tends to align perfectly. Take, for example, the following seven major publishers that any brand or business would love to earn coverage on. The DA scores for all of these sites fall above 90. These sites all have an extremely large audience, both on-site and on social media.

List of top tier sites having a DA score of 90 and above

Our team at Fractl has an innate sense of the online publisher landscape, and the largest and most well-known content publishers out there all tend to have a domain authority above 90. This is what we consider to be the “top-tier”.

These publishers are difficult to place with because of their large audience, social following, and reputation, so for the best chance at earning organic press mentions on these sites, offer them authoritative, unique, exclusive, and newsworthy content.

Mid-tier sites

Mid-tier sites may not be the holy grail of news publishers, but they’re our bread and butter. This is where the majority of placements tend to happen. These publishers hit a sweet spot for digital PR pros—they’re not as sought-after as Buzzfeed and don’t deeply scrutinize pitches the way The New York Times does, but they have large audiences and tend to be much more responsive to content pitches.

I tend to categorize the mid-tier as publishers that fall within a DA of 66 to 89. Here are some examples of publishers that may be considered mid-tier.

List of mid-tier publishers that have a DA of 66 to 89

Low-tier sites

Don’t underestimate a low-tier site simply because of its domain authority. For example, it wasn’t long ago that the personal finance website Money-ish had a DA of 1. Launched in 2017, it was its own website before being absorbed into the larger MarketWatch domain. MarketWatch has a DA of 93, with social engagement as high as 12,294,777 in the last year. If you had ignored Money-ish because of its DA when it first started, you would have missed out on a chance to get your content featured on MarketWatch, as well as to build relationships with writers who are now under the MarketWatch umbrella.

There are all types of content, and most marketers can figure out which projects have “legs” and which have less appeal. Lower-tier sites are often very niche and the perfect home for content aimed at smaller, more precise audiences. They also tend to have high engagement where it matters: your target audience.

Consider the site’s community. Does it have a ton of email subscribers or high comment engagement? Is it killing it on Instagram or another social network? You never know which site will become the next Money-ish, either!

List of low-tier sites with DA below 60 or 65
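The three buckets described above can be expressed as a simple helper (the thresholds, 90 and above for top-tier and 66 to 89 for mid-tier, are taken from the article; the function itself is ours):

```python
# Bucket a site by its Moz Domain Authority score, per the tiers above.
def da_tier(domain_authority):
    if domain_authority >= 90:
        return "top-tier"
    if domain_authority >= 66:
        return "mid-tier"
    return "low-tier"

print(da_tier(93), da_tier(82), da_tier(40))  # top-tier mid-tier low-tier
```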

Pitching differences for each tier

There are plenty of sites that fall within different ranges of domain authority that would be an excellent fit for your content. It all just depends on your goals. In Fractl’s latest internal study, we were able to identify trends in the way journalists respond to PR professionals, based on the DA of the site they write for.

Graph on how journalists respond to PRs based on their sites DA score

Observations

  • Feedback from writers working for sites with a DA lower than 89 was most likely to be complimentary of the content campaigns we pitched them.
  • The verbiage of their responses was also more positive on average than those from journalists working for publishers with a DA of 90 or above.

An example of the feedback we received that would be labeled as complimentary is,

“Thanks for sending this over, it fits perfectly with our audience. I scheduled a post on this study to go up tomorrow.” – Contributor, Matador Network (DA: 82)

Those of us that have been pitching mainstream publishers for a while know from experience that it’s often easier to place with websites that tend to fall in the mid to low-tier buckets. Writers at these publishers are usually open to email pitches and open to writing about outside content because such websites have less stringent editorial guidelines.

Conversely, writers at publishers that fall into our definition of “top-tier” were less positive on average than those working for publishers with a DA below 90. On average, the higher the DA, the less positive the language becomes.

Why might that be? It makes perfect sense that publishers like The New York Times, CNN, TIME, and The Washington Post would be less positive. They’re likely receiving hundreds of PR pitches a day because of their popularity. If they do respond to a pitch, they want to ensure that they’re inquiring about content that would eventually meet their editorial guidelines, should they decide to cover it.

According to our study, when journalists at publishers with a DA of 90 or above do respond, they’re more likely to be asking about the methodology or source of the content.

An example of this feedback is from a staff writer at CNN.

“Thanks for sending along. I’m interested to know more about the methodology of the study.”

A response like this isn’t necessarily bad; in fact, it’s quite good. If a journalist is taking the time to ask you more about the details of the content you pitched, it’s a good indication that the writer is hoping to cover it; they just need more information to ensure that any data-driven content is methodologically sound.

Conclusion

Domain authority will remain a controversial metric for SEOs, but for those of us working in digital PR, the metric provides a lot of value. Our study found a link between the DA of a site and the type of responses we received from writers at these publishers. High DA sites were less positive on average and asked about research methodologies more often than lower-tier sites. Knowing the DA of a site allows you to:

  • Improve your list building process and increase outreach efficacy
  • Customize each outreach email you send to publishers of varying DAs
  • Anticipate the level of editorial scrutiny you’re up against in terms of content types and research methodologies
  • Optimize content you create to fit the needs of your target publisher
  • Predict the outcome of a content campaign depending on where you placed the “exclusive”

Remember, just because a site has a high DA doesn't mean it's necessarily a good fit for your content. Always take a holistic approach to your list building process. Keep in mind the site's social engagement, the topics it covers, who its audience is, its editorial guidelines, and most importantly, your own or your client's goals before reaching out to any publisher solely based on domain authority.

Domenica is a Brand Relationship Manager at Fractl. She can be found on Twitter @atdomenica.

The post Study: How to use domain authority for digital PR and content marketing appeared first on Search Engine Watch.

Search Engine Watch


The SEO metrics that really matter for your business

May 4, 2019 No Comments

Whether you are a business owner, a marketing manager, or simply interested in the world of ecommerce, you may be familiar with how a business can approach SEO.

To every person involved, the perception of SEO and its success can vary from a sophisticated technical grasp to a knowledge of the essentials.

At all levels, measurement and understanding of search data are crucial, and different metrics will stand out, from rankings to the finer details of goals and page speed.

As you may know, you can’t rely solely on ranks as a method to track your progress. But there are other, simple ways to measure the impact of SEO on a business.

In a recent AMA on Reddit, Google's Gary Illyes urged SEO professionals to stick to the basics, and this way of thinking can be applied to the measurement of organic search performance.

In this article, we will look to understand the best metrics for your business when it comes to understanding the impact of SEO, and how they can be viewed from a technical and commercial perspective. Before we start, it's worth mentioning that this article has used Google's own demo analytics account for screenshots. If you need further info to get to grips with it, check out this article or access the demo version of Google Analytics.

Each of these is a commercial SEO metric: data that means something to everyone in a business.

Organic traffic

This is undoubtedly the simplest way of understanding the return on any SEO effort. The day-to-day traffic from search engines is the key measure for many marketers, and any increase can often be tied to an improved level of search visibility (excluding seasonal variation).

In a world where data drives decisions, these figures are pretty important and represent a key part of any internet user’s session, whether that is to get an answer, make a purchase or something else.

In Google Analytics, simply follow the path Acquisition -> All Traffic -> Channels to see the organic traffic received within your chosen time period.

Identifying traffic sources in Google Analytics

You might be asking, “how can I know more?”

Google might have restricted access to keyword data back in 2011, but you can still dig down into your traffic from organic search to look at landing pages and locations.

Organic traffic data – Filtered by landing page 

Not all traffic from search hits your homepage; some users head to your blog or to specific landing pages, depending on their needs. For some searches, however, like those for your company name, your homepage will be the most likely option.

To understand the split of traffic across your site, use the “Landing Page” primary dimension and explore the new data, split by specific page URL.

Understanding the traffic split using Google Analytics
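For those who prefer to pull this data programmatically rather than through the interface, the same view can be approximated via the Google Analytics Reporting API v4. The sketch below only constructs the request body (the view ID and dates are placeholders); in a real script it would be passed to the API client's reports().batchGet() call.

```python
# Sketch: replicating the "organic traffic by landing page" view as a
# Google Analytics Reporting API v4 request body. The view ID is a placeholder.

def organic_landing_page_request(view_id, start_date, end_date):
    """Build a batchGet body: sessions by landing page, organic medium only."""
    return {
        "reportRequests": [{
            "viewId": view_id,
            "dateRanges": [{"startDate": start_date, "endDate": end_date}],
            "metrics": [{"expression": "ga:sessions"}],
            "dimensions": [{"name": "ga:landingPagePath"}],
            "dimensionFilterClauses": [{
                "filters": [{
                    "dimensionName": "ga:medium",
                    "operator": "EXACT",
                    "expressions": ["organic"],
                }]
            }],
        }]
    }

body = organic_landing_page_request("123456789", "2019-04-01", "2019-04-30")
print(body["reportRequests"][0]["dimensions"][0]["name"])
```

With the google-api-python-client library this body would be executed as `analytics.reports().batchGet(body=body).execute()`; here we only build the request to show the shape of the filter.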

Organic traffic data – Filtered by location

Within the same section, the organic search data can be split by location, such as city, to give even further detail on the makeup of your search traffic. Depending on how your business operates, the locations shown may be within the same country or across international locations. If you have spent time optimizing for audiences in specific areas, this view will be key to monitor overall performance.

Screenshot of search data filtered by city

Screenshot of the city wise breakdown of the search traffic in Google Analytics

Revenue, conversions, and goals

In most cases, your website is likely to be set up to drive conversions, whether that is product sales, document downloads, or leads.

Part of understanding the success of SEO is its contribution to the goals of the business, whether those are monetary or lead-based.

For revenue-based data, head to the Conversions section within Google Analytics, then select Product Performance. Within that section, set the secondary dimension to Source/Medium to show just the sales that originate from search engine traffic.

Screenshot of the product performance list to track search originated sales

If your aim isn’t totally revenue based, perhaps a signup form or some downloadable content, then custom analytics goals are your way of fully understanding the actions of visitors that originate from search engines.

Within the Conversions section, your goal completions can be split by source, allowing you to focus solely on visits from organic search.

Graph on source wise split of goal conversions

If a visitor finds your site from a search and then buys something or registers their details, it really suggests you are visible to the right audience.

However, if you are getting consistent organic search visits with no further actions taken, that suggests the key terms you rank for aren't totally relevant to your website.

SEO efforts should focus on reaching relevant audiences. You might rank #1 for a search query like "cat food", but if you only sell dog products, your optimization hasn't quite worked.

Search and local visibility

If your business has a web and/or physical store presence, you can use the tools within Google My Business to look into, and beyond, the performance of the traditional blue links.
Specifically, you can understand the following:

  • How customers search for your business
  • How someone sees your business
  • What specific actions they take

The better your optimization, the more of these actions you will see.

Doughnut graph of search volume seen in Google Analytics

Graph of customer actions

Graph of listing sources for Google my business

Average search rankings

Rankings for your key terms on search engines have traditionally been an easy way to quickly get a view of overall performance. However, a "quick Google" can be hard to draw conclusions from. Personalization based on your search history and location skews what you see, to the point where manual spot checks have diminished value.

A variety of tools can be used to get a handle on average rankings for specific terms. The free way to do this is through Google Search Console; freemium tools like SEMrush and Ahrefs also offer the ability to understand average rank distribution.

With search rankings becoming harder to accurately track, the measure of averages is the best way to understand how search ranking relates to and impacts the wider business.

Graph on average positioning of the website in search
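As an illustration of why averages need care: a site-wide average position should be weighted by impressions, otherwise a handful of obscure queries can drag the number around. The rows below are hypothetical, shaped like a Search Console query export.

```python
# Sketch: a site-wide average position, weighted by impressions so that
# high-volume queries count for more. Rows mimic a Search Console export.

def weighted_average_position(rows):
    """rows: list of (query, impressions, avg_position) tuples."""
    total_impressions = sum(imp for _, imp, _ in rows)
    if total_impressions == 0:
        return None
    return sum(imp * pos for _, imp, pos in rows) / total_impressions

rows = [
    ("dog food", 10_000, 3.2),   # hypothetical export rows
    ("puppy toys", 2_000, 8.5),
    ("dog leads", 500, 14.0),
]
print(round(weighted_average_position(rows), 2))
```

An unweighted mean of the three positions would be 8.6, but weighting by impressions shows the typical searcher actually sees this site near position 4.5.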

Technical metrics – Important but not everyone pays attention to these

When it comes to the more technical side of measuring SEO, you have to peel back the layers and look beyond clicks and traffic. These metrics help complete the wider picture of SEO performance, and they can help uncover additional opportunities for progress.

Search index – Through search consoles and other tools

Ensuring that an accurate index of your website exists is a fundamental part of SEO, because if only part of your site, or the wrong pages, are indexed, your overall performance will suffer.

Although a small part of overall SEO work, it's arguably one of the most crucial.

One quick check is to enter the search operator "site:" followed by your site's domain, to see the total number of pages that exist in a search engine's index.

To inspect the status of a specific page on Google, the Google Search Console is your best option. The newest version of the search console provides a quick way to bring up results.

Screenshot of the latest Google Search Console

Search crawl errors

As well as looking at what has been indexed, any website owner needs to keep an eye out for what may be missing, or if there have been any crawl errors reported by Google. These often occur because a page has been blocked, or the format isn’t crawlable by Google.

Head to the “Coverage” tab within Google Search Console to understand the nature of any errors and what page the error relates to. If there’s a big zero, then you and your business naturally have nothing to worry about.

Screenshot of viewing error reports in Google Search Console

Click-through rate (CTR) and bounce rate

In addition to where and how your website ranks for searches, a metric to consider is how often your site listing is clicked in the SERPs. Essentially, this shows the percentage of impressions that result in a site visit.

This percentage indicates how relevant your listing is to the original query and how well your result ranks compared to your competitors.

If people like what they see and can easily find your website, then you’ll likely get a new site visit.

The Google Search Console is the best go-to resource again for the most accurate data. Just select the performance tab and toggle the CTR tab to browse data by query, landing page, country of origin, and device.

Screenshot of a CTR performance graph on the basis of query, landing page, country of origin, and device
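The calculation itself is simple, as the sketch below shows; the query names and figures are hypothetical, purely for illustration.

```python
# CTR as described above: the share of impressions that became clicks,
# computed per query from hypothetical Search Console-style rows.

def ctr(clicks, impressions):
    return 0.0 if impressions == 0 else 100.0 * clicks / impressions

rows = {"brand name": (420, 1_000), "generic term": (30, 6_000)}
for query, (clicks, impressions) in rows.items():
    print(f"{query}: {ctr(clicks, impressions):.1f}%")
```

Note how a branded query (42% here) naturally dwarfs a generic one (0.5%), which is why CTR is best compared against queries of the same type and position.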

If someone does venture onto your site, you will want to ensure the page they see is relevant to their search; after all, search algorithms love to reward relevance. If the page doesn't contain the required information or isn't user-friendly, the user will likely leave to find a better resource without taking any action, which is known as a bounce.

In some cases, one visit may be all that is needed, therefore a bounce isn’t an issue. Make sure to view this metric in the wider context of what your business offers.

Mobile friendliness

The unveiling of mobile-friendliness as a ranking factor was widely reported in 2015. It reflects the evolution of browsing behavior, with mobile traffic often greater in volume than desktop for many sites.

Another report in the ever-useful Google Search Console gives a clear low-down on how mobile-friendly a site is, showing warnings for any issues. It's worth saying that this measure isn't an indication of how likely a conversion is, but rather of the quality of your site on a mobile device.

Graph for tracking the mobile-friendliness of a website

Follow your metrics and listen to the data

As mentioned at the start of this article, data drives decisions. In all areas of business, certain numbers will stand out. With SEO, a full understanding comes from multiple data points, with positives and negatives to be taken at every point of the journey.

Ultimately, it often comes down to traffic, rankings, and conversions, but the numbers that drive the business are built on metrics that don't often see the light of day and are just as important.

As a digital marketer, it is always a learning experience to know how data drives the evolution of a business and ultimately, how successes and opportunities are reported and understood.

Matthew Ramsay is Digital Marketing Manager at Digitaloft. 


The post The SEO metrics that really matter for your business appeared first on Search Engine Watch.







SEO case study: How Venngage turned search into their primary lead source

April 27, 2019 No Comments

Venngage is a free infographic maker that has catered to more than 21,000 businesses. In this article, we explore how they grew their organic traffic from about 275,000 visitors per month in November 2017 to about 900,000 today — more than tripling in 17 months.

I spoke with Nadya Khoja, Chief Growth Officer at Venngage, about their process.

Venngage gets most of their leads from content and organic search. The percentage varies from month to month in the range of 58% to 65%.

In November 2017, Venngage enjoyed 275,000 visitors a month from organic search traffic. Today (17 months later) it's 900,000. Nadya extrapolated from their current trend that by December of 2019 (in nine months) they will enjoy three million organic search visitors per month.

Screenshot of Venngage's statistics

In 2015, when Nadya started with Venngage, they saw 300 to 400 registrations a week. By March of 2018, this was up to 25,000 a week. Today it’s 45,000.

While Nadya had the advantage of not starting from zero, that is impressive growth by any reasonable metric. How did they do it?
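As a quick sanity check on those numbers (a simple constant-rate model, not Venngage's own forecast): growing from 275,000 to 900,000 in 17 months implies roughly 7% compound growth per month, and projecting that forward lands well short of three million, which suggests their extrapolation assumed growth would accelerate rather than stay constant.

```python
# Sketch: constant-rate extrapolation of the figures quoted above.
# 275k/month in Nov 2017 -> 900k/month 17 months later.

def monthly_growth_rate(start, end, months):
    """Compound monthly growth factor implied by two data points."""
    return (end / start) ** (1 / months)

rate = monthly_growth_rate(275_000, 900_000, 17)
projected = 900_000 * rate ** 9  # nine more months, to December 2019
print(f"{rate:.3f}x per month -> {projected:,.0f} visitors/month")
```

At a constant ~1.072x per month, nine more months yields roughly 1.7 million, not 3 million, a reminder that trend lines can be read more than one way.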

Recipe

There are a lot of pieces to this puzzle. I’ll do my best to explain them, and how they tie together. There is no correct order to things per se, so what is below is my perspective on how best to tell this story.

The single most important ingredient: Hypothesize, test, analyze, adjust

This critical ingredient is surprisingly not an ingredient, but rather a methodology. I’m tempted to call it “the scientific method”, as that’s an accurate description, but perhaps it’s more accurate to call it the methodology written up in the books “The Lean Startup” (which Nadya has read) and “Running Lean” (which Nadya has not read).

This single most important ingredient is the methodology of hypothesize, test, analyze, and adjust.

What got them to this methodology was a desire to de-risk SEO.

The growth in traffic and leads was managed through a series of small and quick iterations, each one of which either passed or failed. Ones that passed were done more. Ones that failed were abandoned.

This concept of hypothesizing, testing, analyzing, and adjusting is used both for SEO changes and for changes to their products.

The second most important ingredient

This ingredient is shared knowledge. Venngage marketing developed “The Playbook”, which everyone in marketing contributes to. “The Playbook” was created both as a reference with which to bring new team members up to speed quickly, as well as a running history of what has been tested and how it went.

The importance of these first two ingredients cannot be overstated. From here on, I am revealing things they learned through trial and error. You have the advantage of learning from their successes and failures. They figured this stuff out the hard way, one hypothesis and one test at a time.

Their north star metrics

They have two north star metrics. The first one seems fairly obvious. “How many infographics are completed within a given time period?” The second one occurred to them later and is as important, if not more so. It is “how long does it take to complete an infographic?”

The first metric, of course, tells them how attractive their product is. The second tells them how easy (or hard) their product is to use.

Together these are the primary metrics that drive everything Venngage does.

The 50/50 focus split

As a result of both the company and the marketing department having a focus on customer acquisition and customer retention, every person in marketing spends half their time working on improving the first north star metric and the other half working on improving the second.

Marketing driving product design

Those north star metrics have led to Venngage developing what I call marketing-driven product design. Everywhere I've ever worked has claimed to do this. The way Venngage does it exceeds anything ever done at a company I've worked for.

“How do I be good?”

This part of Nadya’s story reminds me of the start of a promo video I once saw for MasterClass.com. It’s such a good segue to this part of the story that I cropped out all but the good part to include in this article.

When Steve Martin shed light on an important marketing question

I’ve encountered a number of companies through the years who thought of marketing as “generating leads” and “selling it”, rather than “how do we learn what our customers want?”, or “how do we make our product easier to use?”

Squads

The company is structured into cross-functional squads, a cross-functional squad being people from various departments within Venngage, all working to improve a company-wide metric.

For example, one of the aspects of their infographic product is templates. A template is a starting point for building an infographic.

As templates are their largest customer acquisition channel, they created a “Template Squad”, whose job is to work on their two north star metrics for their templates.

The squad consists of developers, designers, UI/UX people, and the squad leader, who is someone in marketing. Personally, I love this marketing focus, as it de-silos marketing and causes it to permeate everything the company does.

There is another squad devoted to internationalization, which, as you can infer, is responsible for improving their two north star metrics for users in countries around the world.

Iterative development

Each template squad member is tasked with improving their two north star metrics.

Ideas on how to do this come from squad members with various backgrounds and ideas.

Each idea is translated into a testable hypothesis. Modifications are done weekly. As you can imagine, Venngage is heavy into analytics; without detailed and sophisticated analytics, they wouldn't know which experiments worked and which didn't.

Examples of ideas that worked are:

  • Break up the templates page into a series of pages, which contain either categories of templates or single templates.
  • Ensure each template page contains SEO keywords specific for the appropriate industry or audience segment. This is described in more detail further in this document.
  • Undo the forced backlink each of the embedded templates used to contain.
    • This allowed them to get initial traction, but it later resulted in a Google penalty.
    • This is a prime example of an SEO tactic that worked until it didn’t.
  • Create an SEO checklist for all template pages with a focus on technical SEO.
    • This eliminated human error from the process.
  • Eliminate “React headers” Google was not indexing.
  • Determine what infographic templates and features people don’t use and eliminate them.
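To make the pass/fail decision in the hypothesize-test-analyze-adjust loop concrete, here is a minimal sketch (not Venngage's actual tooling, and the traffic figures are invented) of a two-proportion z-test comparing the registration rates of a control and a variant template page.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical week of traffic: control converts 4.0%, variant 5.0%.
z = two_proportion_z(400, 10_000, 500, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 is roughly significant at the 95% level
```

A rule like this turns "the variant looks better" into a clear pass/fail verdict, which is what lets failed experiments be abandoned without argument.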

Measuring inputs

I personally think this is really important. To obtain outputs, they measured inputs. When the goal was to increase registrations, they identified the things they had to do to increase registrations, then measured how much of that they did every week.

Everyone does SEO

In the same way that marketing is something that does not stand alone, but rather permeates everything Venngage does, SEO does not stand alone. It permeates everything marketing does. Since organic search traffic is the number one source of leads, they ensure everyone in marketing knows the basics of technical SEO and understands the importance of this never being neglected.

Beliefs and values

While I understand the importance of beliefs and values in human psychology, it was refreshing to see this being proactively addressed within an organization in the context of improving their north star metrics.

They win and lose together

Winning and losing together is a core belief at Venngage. Nadya states it minimizes blame and finger-pointing. When they win, they all win. When they lose, they all lose. It doesn’t matter who played what part. To use a sports analogy, a good assist helps to score a goal. A bad assist, well, that’s an opportunity to learn.

SEO is a team effort

While it is technically possible for a single person to do SEO, the volume of tasks required these days makes it impractical. SEO requires quality content, technical SEO, and the building of backlinks through content promotion, guest posting, and other tactics. Venngage is a great example of effectively distributing SEO responsibilities through the marketing department.

To illustrate the importance of the various pieces fitting together, consider this: content is king, but technical SEO is what gets content found, and when people find crappy content, it doesn't convert.

You can’t manage what you don’t measure

This requires no elaboration.

But what you measure matters

This probably does justify some elaboration. We've all been in organizations that measured stupid stuff. By narrowing down to their two north star metrics, then focusing their efforts on improving those metrics, they've aligned everyone's activity towards things that matter.

The magic of incremental improvements

This is the Japanese concept of Kaizen put into play for the development and marketing of a software product.

Done slightly differently, this concept helped Britain dominate competitive cycling at the 2008 Olympics in Beijing.

Customer acquisition is not enough

Venngage developed their second north star metric after deciding that acquiring new customers was not, in and of itself, any form of the Holy Grail. They realized that if their product was hard to use, fewer people would use it.

They decided a good general metric of how easy the product is to use was to measure how long people take to build an infographic. If people took “too long”, they spoke to them about why.

This led them to change the product in ways to make it easier to use.
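As a hypothetical sketch of how such a metric might be summarized (the session times below are invented), the median shows the typical session, while a high 90th percentile flags the "too long" tail worth talking to users about.

```python
import statistics

# Sketch of the second north star metric: minutes taken to complete an
# infographic. All numbers are hypothetical, for illustration only.

def completion_summary(minutes):
    ordered = sorted(minutes)
    p90_index = int(0.9 * (len(ordered) - 1))
    return {"median": statistics.median(ordered), "p90": ordered[p90_index]}

times = [12, 15, 9, 45, 14, 11, 60, 13, 16, 10]
summary = completion_summary(times)
print(summary)  # a p90 far above the median flags a hard-to-use flow
```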

Link building is relationship building

As a reader of Search Engine Watch, you know link building is critical and central to SEO. In the same way that everyone in Venngage marketing must know the basics of technical SEO, everyone in Venngage marketing must build links.

They do so via outreach to promote their content. As people earn links from the content promotion outreach, they record those links in a shared spreadsheet.

While this next bit is related to link building, everyone in Venngage marketing has traffic goals as well.

This too is tracked in a simple and reasonable way. Various marketers own different “areas” or “channels”. These channels are broken down into specific traffic acquisition metrics.

As new hires get more familiar with how things work at Venngage, they are guided into traffic acquisition channels which they want to work on.

Learning experience, over time

My attempt here is to provide a chronology of what they learned in what order. It may help you avoid some of the mistakes they made.

Cheating works until it doesn’t

Understanding the importance of links to search ranking, they thought it would be a good idea to implement their infographics with embedded backlinks. Each implemented infographic contained a forced backlink to the Venngage website.

They identified a set of anchor text they thought would be beneficial to them and rotated through them for these forced backlinks.

And it worked, for a while. Until they realized they had invited a Google penalty. This took a bit to clean up.

The lessons learned:

  • The quality of your backlinks matter.
  • To attract quality backlinks, publish quality content.

Blog posts brought in users who activated

At some point, their analytics helped them realize that users who activated from blog posts were ideal users for them. So they set a goal to increase activations from blog posts, which led to the decision to test whether breaking up templates into categories and individual pages with only one template made sense. It did.

Website design matters

Changing the website from one big template page to thousands of smaller ones helped, and not just because it greatly increased the number of URLs indexed by Google. It also greatly improved the user experience. It made it easier for their audience to find templates relevant to them, without having to look at templates that weren’t.

Lesson learned: UI/UX matters for both users and SEO.

Hybrid content attracts

Hybrid content is where an article talks about two main things. For example, an article about sorting Hogwarts houses within the context of an infographic. This type of content brings in some number of Harry Potter fans, some of whom have an interest in creating infographics. The key to success is tying these two different topics together well.

Content is tuneable

By converting one huge templates page into thousands of small template pages, they realized that a template or set of templates that appeal to one audience segment would not necessarily appeal to others. This caused them to start to tune templates towards audience segments in pursuit of more long tail organic search traffic.

How did they figure out what users wanted in terms of better content? They used a combination of keyword research and talking with users and prospects.

Some content doesn’t make the cut

After they caught onto the benefits of tuning content to attract different audience segments, they looked for content on their site that no one seemed to care about. They deleted it. While it decreased the amount of content on their site, it increased their overall content quality.

Traffic spikes are not always good news

When they initially started creating forced backlinks in their infographics, they could see their traffic increase. They saw some spikes. Their general thought was more traffic is good.

When they experienced the Google penalty, they realized how wrong they were. Some traffic spikes are bad news. Others are good news.

When your website traffic shows a sudden change, even if you’re experiencing a spike in organic search traffic, you must dig into the details and find out the root cause.

Lesson learned: There is such a thing as bad traffic. Some traffic warns you of a problem.

Links from product embeds aren’t all bad

They just needed to make the embedded links optional, allowing the customer to decide whether or not to include a backlink. While this did not change their levels of organic search traffic, it was necessary to resolve the Google penalty.

Boring works

Incremental continuous improvement seems repetitive and boring. A one percent tweak here, a two percent tweak there, but over time, you’ve tripled your organic search traffic and your lead flow.

It's not necessarily fun, but it delivers results.

Lesson learned: What I’ll call “infrastructure” is boring, and it matters. Both for your product and your SEO.
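The arithmetic behind "a one percent tweak here, a two percent tweak there" is worth seeing once. A back-of-the-envelope sketch:

```python
import math

# How many compounding 1% weekly improvements does it take to triple a
# metric? Compounding does the heavy lifting in "boring" incremental work.

def weeks_to_multiply(weekly_gain, target_multiple):
    return math.ceil(math.log(target_multiple) / math.log(1 + weekly_gain))

weeks = weeks_to_multiply(0.01, 3)
print(weeks, round(1.01 ** weeks, 2))
```

A 1% weekly gain triples the metric in a little over two years of steady iteration, which roughly matches the timescale of the growth described in this case study.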

Figure out what to measure

The idea of measuring the amount of time required to complete an infographic did not occur to them on day one. This idea came up when they were looking for a metric to indicate to them how easy (or difficult) their product was to use.

Once they decided this metric made sense, they determined their baseline, then, through an iterative process, made improvements to the product to shave the time down, little by little.

As they did so, the feedback from the users was positive, so they doubled down on this effort.

Lesson learned: What you measure matters.

Teach your coworkers well

They created “The Playbook”, which is a compendium of the combined knowledge they’ve accumulated over time. The playbook is written by them, for them.

Marketing employees are required to add chapters to the playbook as they learn new skills and methods.

Its primary purpose is to bring new team members up to speed quickly, and it also serves as a historical record of what did and did not work.

One important aspect of continuous improvement is for new people to avoid suggesting experiments that previously failed.

Additionally (and I love this), every month everyone in marketing gives Nadya an outline of what they’re learning and what they’re improving on.

Their marketing stack

While their marketing stack is not essential to understanding their processes, I find it useful to understand what software tools a marketing organization uses, and for what. So here is theirs. This is not a list of what they’ve used and abandoned over time, but rather a list of what they use now.

  • Analytics: Google Analytics and Mixpanel
  • Customer communications: Intercom
  • Link analysis and building: Ahrefs
  • Link building outreach: Mailshake
  • Project management: Trello
  • General purpose: G Suite

In closing

To me, what Nadya has done at Venngage is a case study in how to do SEO right, and most of doing it right is not technical SEO work.

  • Help senior management understand that some things that are not typically thought of as SEO (website design for example) can have serious SEO implications.
  • Get senior management buy-in to include these non-SEO functions in your SEO efforts.
  • Understand what very few basic metrics matter for your company, and how you measure them.
  • Distribute required SEO work through as many people as reasonably possible. Include people whose job functions are not necessarily SEO related (writers, designers, UI/UX, and more).
  • Test and measure everything.
  • Win big through a continuous stream of small incremental improvements.

Venngage has led by example, and the guidelines and pointers shared above can help your organization harness search for increased sales.

Kevin Carney is the Founder and CEO of the boutique link building agency Organic Growth. 

The post SEO case study: How Venngage turned search into their primary lead source appeared first on Search Engine Watch.



SEO writing guide: From keyword to content brief

April 23, 2019 No Comments

If content is queen, and SEO plays the critical role of bridging content and audience to drive growth, then there's no question that keyword research is important.

However, connecting the dots to create content that ranks well can be difficult. What makes it so difficult? How do you go from a target keyword phrase to an article that is unique, comprehensive, encompasses all the major on-page SEO elements, touches the reader, and isn't structured like the "oh-so-familiar" generic SEO template?

Example of a typical article template structure

There's no one-size-fits-all approach. However, there is a simple way to support any member of your editorial, creative writing, or content team in shaping up what they need in order to write SEO-friendly content, and that's an SEO content brief.

Key benefits of a content brief:

  • Productivity and efficiency – A content brief clearly outlines expectations for the writer, resulting in fewer revisions
  • Alignment – Writers understand the intent and goals of the content
  • Quality – Reduces garbage in, garbage out.

The rest of this article will cover how we actually get there, and we'll use this very article as an example:

  • Keyword research
  • Topical expansion
  • Content/SERP (search engine results page) analysis
  • Content brief development
  • Template and tools

Any good editor will tell you great content comes from having a solid content calendar with topics planned in advance for review and release at a regular cadence. To support topical analysis and themes, as SEOs we need to start with keyword research.

Start with keyword research: Topic, audience, and objectives

The purpose of this guide isn’t to teach you how to do keyword research. It’s to set you up for success in taking the step beyond that and developing it into a content brief. Your primary keywords serve as your topic themes, but they are also the beginning makings of your content brief, so try to ensure you:

  • Spend time understanding your target audience and aligning their goals to your keywords. Many call this keyword intent mapping. Rohan Ayyr provides an excellent guide to matching keywords to intent in his article, ‘How to move from keyword research to intent research’.
  • Do the keyword research in advance; it will give writers and editors the freedom to move things around and line them up with trending topics.

How does all this help in supporting a content brief?

You and your team can get answers to the key questions mentioned below.

  • What will they write about? Primary keywords serve as the topic in your content brief.
  • Who is the intended audience? Keyword intent helps unearth what problem the user is trying to solve, helping us understand who they are, and what they need.

Now with keywords as our guide to overall topical themes, we can focus on the next step, topical expansion.

Topical expansion: Define key points and gather questions

Writers need more than keywords. They require insight into the pain points of the reader, the key areas of the topic to address, and, most of all, the questions the content should answer. This too will go into your content brief.

We’re in luck as SEOs because there is no shortage of tools that allow us to gather this information around a topic.

For example, let’s say this article focuses on “SEO writing”. There are a number of ways to expand on this topic.

  • Using a tool like SEMRush’s topic research tool, you can take your primary keyword (topic) and get expanded/related topics, a SERP snapshot, and questions in a single view. I like this because it covers what many other tools do separately. Ultimately, it supports both content expansion and SERP analysis at the same time.

Example of finding potential topics using SEMRush's topic research tool

  • Use keyword suggestion tools like KeywordTool.io or Ubersuggest to expand the terms combined with Google search results to quickly view potential topics.

Finding potential topics by combining keyword suggestion tools' results with Google's search results

  • Use Answerthepublic.com to get expanded terms and inspirational visuals.

Example of finding potential topics using Answerthepublic

You’ve taken note of what to write about, and how to cover the topic fully. But how do we begin to determine what type of content and how in-depth it should be?

Content and SERP analysis: Specifying content type and format

Okay, so we’re almost done. We can’t tell writers to write unique content if we can’t specify what makes it unique. Reviewing the competition and what’s being displayed consistently in the SERP is a quick way to assess what’s likely to work. You’ll want to look at the top ten results for your primary topic and collect the following:

  • Content type – Are the results skewed towards a specific type of content? (For example, in-depth articles, infographics, videos, or blog posts)
  • Format – Is the information formatted as a guide? A how-to? Maybe a list?
  • Differentiation points – What stands out about the top three results compared to the rest?

Content brief development: Let’s make beautiful content together

Now you’re ready to prepare your SEO content brief which should include the following:

  • Topic and objective – Your topic is your primary keyword phrase. Your objective is what this content is supposed to accomplish.
  • Audience – Based on your keyword intent mapping, describe who the article is meant to reach.
  • Topical coverage – Top three related keyword phrases from your topical expansion.
  • Questions to answer – Top three to five from topical expansion findings. Ensure they support your related keyword phrases as well.
  • Voice, style, tone – Use an existing content/brand style guide.
  • Content type and format – Based on your SERP analysis.
  • Content length – Based on SERP Analysis. Ensure you’re meeting the average across the top three results based on content type.
  • Deadline – This is only pertinent if you are working solo, otherwise, consult/lean on your creative team lead.

[Note: If/when using this internally, consider making it part of the content request process, or a template for the editorial staff. When using it externally, be sure to include where the content will be displayed, the format/output, and any specialty editorial guidance.]

Template and tools

Want to take a shortcut? Feel free to download and copy my SEO content brief template, it’s a Google doc.

Other content brief templates/resources:

If you want to streamline the process as a whole, MarketMuse provides a platform that manages the keyword research, topic expansion, provides the questions, and manages the entire workflow. It even allows you to request a brief, all in one place.

I only suggest this for larger organizations looking to scale as there is an investment involved. You’d likely also have to do some work to integrate into your existing processes.

Jori Ford is Sr. Director of Content & SEO at G2Crowd. She can also be found on Twitter @chicagoseopro.

The post SEO writing guide: From keyword to content brief appeared first on Search Engine Watch.



Using Python to recover SEO site traffic (Part three)

April 20, 2019 No Comments

When you incorporate machine learning techniques to speed up SEO recovery, the results can be amazing.

This is the third and last installment in our series on using Python to speed up SEO traffic recovery. In part one, I explained how our unique approach, which we call “winners vs losers,” helps us quickly narrow down the pages losing traffic to find the main reason for the drop. In part two, we improved on our initial approach by manually grouping pages using regular expressions, which is very useful for sites with thousands or millions of pages, as is typically the case with ecommerce sites. In part three, we will learn something really exciting: how to group pages automatically using machine learning.

As mentioned before, you can find the code used in part one, two and three in this Google Colab notebook.

Let’s get started.

URL matching vs content matching

When we grouped pages manually in part two, we benefited from the fact that the URL groups had clear patterns (collections, products, and so on), but it is often the case that there are no patterns in the URL. For example, Yahoo Stores’ sites use a flat URL structure with no directory paths. Our manual approach wouldn’t work in this case.

Fortunately, it is possible to group pages by their contents, because most page templates have different content structures. They serve different user needs, so their structures necessarily differ.

How can we organize pages by their content? We can use DOM element selectors for this. We will specifically use XPaths.

Example of using DOM elements to organize pages by their content

For example, I can use the presence of a big product image to know the page is a product detail page. I can grab the product image address in the document (its XPath) by right-clicking on it in Chrome and choosing “Inspect,” then right-clicking to copy the XPath.

We can identify other page groups by finding page elements that are unique to them. However, note that while this would allow us to group Yahoo Store-type sites, it would still be a manual process to create the groups.
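As an illustration (not the author’s original code), here is a minimal sketch of this idea using lxml, with hypothetical XPaths standing in for the selectors you would copy from Chrome:

```python
from lxml import html

# Hypothetical XPaths that identify each page template; real values
# would be copied via Chrome's "Inspect" as described above.
TEMPLATE_XPATHS = {
    "product": "//img[@id='main-product-image']",
    "category": "//ul[@class='product-grid']",
}

def classify_page(html_text):
    """Return the first template whose signature XPath matches, else 'other'."""
    tree = html.fromstring(html_text)
    for template, xpath in TEMPLATE_XPATHS.items():
        if tree.xpath(xpath):
            return template
    return "other"
```

With these signatures, any page containing the big product image would be classified as "product", and pages matching no signature fall back to "other" — which is exactly why this stays a manual process: someone still has to pick the signature elements.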

A scientist’s bottom-up approach

In order to group pages automatically, we need to use a statistical approach. In other words, we need to find patterns in the data that we can use to cluster similar pages together because they share similar statistics. This is a perfect problem for machine learning algorithms.

BloomReach, a digital experience platform vendor, shared their machine learning solution to this problem. To summarize it, they first manually selected cleaned features from the HTML tags, like class IDs, CSS style sheet names, and so on. Then, they automatically grouped pages based on the presence and variability of these features. In their tests, they achieved around 90% accuracy, which is pretty good.

When you give problems like this to scientists and engineers with no domain expertise, they will generally come up with complicated, bottom-up solutions. The scientist will say, “Here is the data I have, let me try different computer science ideas I know until I find a good solution.”

One of the reasons I advocate practitioners learn programming is that you can start solving problems using your domain expertise and find shortcuts like the one I will share next.

Hamlet’s observation and a simpler solution

For most ecommerce sites, most page templates include images (and input elements), and those generally change in quantity and size.

Hamlet's observation: a simpler approach based on testing the quantity and size of images

I decided to test the quantity and size of images, plus the number of input elements, as my feature set. We were able to achieve 97.5% accuracy in our tests. This is a much simpler and more effective approach for this specific problem. All of this is possible because I didn’t start with the data I could access, but with a simpler domain-level observation.

I am not trying to say my approach is superior, as they have tested theirs in millions of pages and I’ve only tested this on a few thousand. My point is that as a practitioner you should learn this stuff so you can contribute your own expertise and creativity.

Now let’s get to the fun part and get to code some machine learning code in Python!

Collecting training data

We need training data to build a model. This training data needs to come pre-labeled with “correct” answers so that the model can learn from the correct answers and make its own predictions on unseen data.

In our case, as discussed above, we’ll use our intuition that most product pages have one or more large images on the page, and most category type pages have many smaller images on the page.

What’s more, product pages typically have more form elements than category pages (for filling in quantity, color, and more).

Unfortunately, crawling a web page for this data requires knowledge of web browser automation, and image manipulation, which are outside the scope of this post. Feel free to study this GitHub gist we put together to learn more.

Here we load the raw data already collected.
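The original notebook loads pre-collected files; the sketch below (with hypothetical column names and inline sample rows) shows the shape each data frame takes:

```python
import io
import pandas as pd

# In the notebook the data comes from pre-collected files, e.g.
# pd.read_csv("form_counts.csv"). The inline samples below are
# hypothetical, just to show each frame's shape.
form_csv = """url,form_count,input_count
/product-1,2,6
/category-a,1,1
"""
img_csv = """url,img_size,img_height,img_width
/product-1,81234,600,600
/product-1,2048,80,80
/category-a,3072,120,120
"""
form_counts = pd.read_csv(io.StringIO(form_csv))  # one row per URL
img_counts = pd.read_csv(io.StringIO(img_csv))    # one row per image
```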

Feature engineering

Each row of the form_counts data frame above corresponds to a single URL and provides a count of both form elements, and input elements contained on that page.

Meanwhile, in the img_counts data frame, each row corresponds to a single image from a particular page. Each image has an associated file size, height, and width. Pages are likely to have multiple images, and so there are many rows corresponding to each URL.

It is often the case that HTML documents don’t include explicit image dimensions. We are using a little trick to compensate for this: we capture the size of the image files, which is proportional to the product of the width and the height of the images.

We want our image counts and image file sizes to be treated as categorical features, not numerical ones. When a numerical feature, say new visitors, increases, it generally implies improvement, but we don’t want bigger images to imply improvement. A common technique for this is called one-hot encoding.

Most site pages can have an arbitrary number of images. We are going to further process our dataset by bucketing images into 50 groups. This technique is called “binning”.
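A minimal sketch of the binning step using pandas, with hypothetical image sizes and three bins instead of 50 for brevity:

```python
import pandas as pd

# Hypothetical per-image file sizes (bytes); the article uses 50 bins,
# we use 3 here for brevity.
img_counts = pd.DataFrame({
    "url": ["/p1", "/p1", "/c1", "/c1", "/c1"],
    "img_size": [81234, 2048, 3072, 2900, 3100],
})

# Bin the continuous file sizes into discrete buckets ("binning")...
img_counts["size_bin"] = pd.cut(img_counts["img_size"], bins=3, labels=False)

# ...then pivot so each bin becomes its own column holding the count of
# images per page — a categorical, one-hot-style feature matrix.
features = img_counts.pivot_table(
    index="url", columns="size_bin", aggfunc="size", fill_value=0)
```

Each row of `features` now describes a page by how many images it has in each size bucket, which is the kind of input the model trains on.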

Here is what our processed data set looks like.

Example view of processed data for "binning"

Adding ground truth labels

As we already have correct labels from our manual regex approach in part two, we can use them as the ground truth to feed the model.

We also need to split our dataset randomly into a training set and a test set. This allows us to train the machine learning model on one set of data and test it on another set it has never seen before. We do this to prevent our model from simply “memorizing” the training data and doing terribly on new, unseen data.
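A minimal sketch of the split using scikit-learn’s train_test_split, with hypothetical stand-in features and labels:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix (one row per page) and labels taken from
# the regex-based grouping in part two.
X = np.arange(20).reshape(10, 2)
y = ["product"] * 5 + ["category"] * 5

# Hold out 30% of pages so the model is tested on data it never saw.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)
```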

Model training and grid search

Finally, the good stuff!

All the steps above, the data collection and preparation, are generally the hardest part to code. The machine learning code is generally quite simple.

We’re using the well-known scikit-learn Python library to train a number of popular models using a bunch of standard hyperparameters (settings for fine-tuning a model). Scikit-learn will run through all of them to find the best one. We simply need to feed the X variables (our engineered features above) and the Y variables (the correct labels) to each model, call the .fit() method, and voila!
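As a sketch rather than the notebook’s exact grid, here is what a grid search over one model (a linear SVM) looks like, using synthetic stand-in data:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import LinearSVC

# Synthetic stand-in data; in the article, X and y come from the
# image/form features and the regex labels above.
X, y = make_classification(n_samples=200, n_features=6, random_state=0)

# Cross-validate a small grid of regularization strengths and keep
# the best-scoring model.
grid = GridSearchCV(LinearSVC(), {"C": [0.1, 1.0, 10.0]}, cv=3)
grid.fit(X, y)
best_model = grid.best_estimator_
```

In practice you would repeat this for each candidate model (logistic regression, random forest, and so on) and compare their best cross-validated scores.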

Evaluating performance

Graph for evaluating image performances through a linear pattern

After running the grid search, we find our winning model to be the linear SVM (0.974), with logistic regression (0.968) coming in a close second. Even with such high accuracy, a machine learning model will make mistakes. If it doesn’t make any mistakes, then there is definitely something wrong with the code.

In order to understand where the model performs best and worst, we will use another useful machine learning tool, the confusion matrix.

Graph of the confusion matrix to evaluate image performance

When looking at a confusion matrix, focus on the diagonal squares. The counts there are correct predictions, and the counts outside are failures. In the confusion matrix above, we can quickly see that the model does really well labeling products, but terribly labeling pages that are neither products nor categories. Intuitively, we can assume that such pages lack consistent image usage.

Here is the code to put together the confusion matrix:
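The original code lives in the Colab notebook; a minimal equivalent using scikit-learn’s confusion_matrix, with hypothetical labels, looks like this:

```python
from sklearn.metrics import confusion_matrix

# Hypothetical true vs. predicted page groups.
y_true = ["product", "product", "category", "category", "other"]
y_pred = ["product", "product", "category", "other", "category"]

labels = ["category", "other", "product"]
cm = confusion_matrix(y_true, y_pred, labels=labels)
# Rows are true labels, columns are predictions; the diagonal holds
# the correct counts.
```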

Finally, here is the code to plot the model evaluation:
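Again as a sketch rather than the notebook’s exact code, a confusion matrix can be plotted as a matplotlib heatmap:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical confusion-matrix counts (rows = true, cols = predicted).
cm = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 2]])
labels = ["category", "other", "product"]

fig, ax = plt.subplots()
im = ax.imshow(cm)                 # heatmap of the counts
ax.set_xticks(range(len(labels)))
ax.set_xticklabels(labels)         # predicted labels along the x axis
ax.set_yticks(range(len(labels)))
ax.set_yticklabels(labels)         # true labels along the y axis
for i in range(cm.shape[0]):       # annotate each cell with its count
    for j in range(cm.shape[1]):
        ax.text(j, i, cm[i, j], ha="center", va="center")
fig.colorbar(im)
```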

Resources to learn more

You might be thinking that this is a lot of work just to tell page groups apart, and you are right!

Screenshot of a query on custom PageTypes and DataLayer

Mirko Obkircher commented in my article for part two that there is a much simpler approach, which is to have your client set up a Google Analytics data layer with the page group type. Very smart recommendation, Mirko!

I am using this example for illustration purposes. What if the issue requires a deeper exploratory investigation? If you already started the analysis using Python, your creativity and knowledge are the only limits.

If you want to jump onto the machine learning bandwagon, here are some resources I recommend to learn more:

Got any tips or queries? Share them in the comments.

Hamlet Batista is the CEO and founder of RankSense, an agile SEO platform for online retailers and manufacturers. He can be found on Twitter @hamletbatista.

The post Using Python to recover SEO site traffic (Part three) appeared first on Search Engine Watch.
