“Hey Siri, what is the cost of an iPad near me?”
In today’s internet, a number of specialist search engines exist to help consumers search for and compare things within a specific niche.
As well as search engines like Google and Bing which crawl the entire web, we have powerful vertical-specific search engines like Skyscanner, Moneysupermarket and Indeed that specialize in surfacing flights, insurance quotes, jobs, and more.
Powerful though web search engines can be, they aren’t capable of delivering the same level of dedicated coverage within a particular industry that vertical search engines are. As a result, many vertical-specific search engines have become go-to destinations for finding a particular type of information – above and beyond even the all-powerful Google.
Yet until recently, one major market remained unsearchable: prices.
If you ask Siri to tell you the cost of an iPad near you, she won’t be able to provide you with an answer, because she doesn’t have the data. Until now, a complete view of prices on the internet has never existed.
Enter Pricesearcher, a search engine that has set out to solve this problem by indexing all of the world’s prices. Pricesearcher provides searchers with detailed information on products, prices, price histories, payment and delivery information, as well as reviews and buyers’ guides to aid in making a purchase decision.
Founder and CEO Samuel Dean calls Pricesearcher “The biggest search engine you’ve never heard of.” Search Engine Watch recently paid a visit to the Pricesearcher offices to find out about the story behind the first search engine for prices, the technical challenge of indexing prices, and why the future of search is vertical.
Pricesearcher: The early days
A product specialist by background, Samuel Dean spent 16 years in the world of ecommerce. He previously held a senior role at eBay as Head of Distributed Ecommerce, and has carried out contract work for companies including Powa Technologies, Inviqa and the UK government department UK Trade & Investment (UKTI).
He first began developing the idea for Pricesearcher in 2011, purchasing the domain Pricesearcher.com in the same year. However, it would be some years before Dean began work on Pricesearcher full-time. Instead, he spent the next few years taking advantage of his ecommerce connections to research the market and understand the challenges he might encounter with the project.
“My career in e-commerce was going great, so I spent my time talking to retailers, speaking with advisors – speaking to as many people as possible that I could access,” explains Dean. “I wanted to do this without pressure, so I gave myself the time to formulate the plan whilst juggling contracting and raising my kids.”
More than this, Dean wanted to make sure that he took the time to get Pricesearcher absolutely right. “We knew we had something that could be big,” he says. “And if you’re going to put your name on a vertical, you take responsibility for it.”
Dean describes himself as a “fan of directories”, relating how he used to pore over the Yellow Pages telephone directory as a child. His childhood also provided the inspiration for Pricesearcher in that his family had very little money while he was growing up, and so they needed to make absolutely sure they got the best price for everything.
Dean wanted to build Pricesearcher to be the tool that his family had needed – a way to know the exact cost of products at a glance, and easily find the cheapest option.
“The world of technology is so advanced – we have self-driving cars and rockets to Mars, yet the act of finding a single price for something across all locations is so laborious. Which I think is ridiculous,” he explains.
Despite how long it took to bring Pricesearcher to fruition, Dean wasn’t worried that someone else would launch a competitor search engine before him.
“Technically, it’s a huge challenge,” he says – and one that very few people have been willing to tackle.
There is a significant lack of standardization in the ecommerce space – in the way retailers list their products, the formats they present them in, and even the barcodes they use. Rather than solve this by imposing strict formatting requirements and making retailers do the hard work of being present on Pricesearcher (as Google and Amazon do), Pricesearcher was more than willing to come to the retailers.
“Our technological goal was to make listing products on Pricesearcher as easy as uploading photos to Facebook,” says Dean.
As a result, most of the early days of Pricesearcher were devoted to solving these technical challenges for retailers, and standardizing everything as much as possible.
In 2014, Dean found his first collaborator to work with him on the project: Raja Akhtar, a PHP developer working on a range of ecommerce projects, who came on board as Pricesearcher’s Head of Web Development.
Dean found Akhtar through the freelance website People Per Hour, and the two began working on Pricesearcher together in their spare time, putting together the first lines of code in 2015. The beta version of Pricesearcher launched the following year.
For the first few years, Pricesearcher operated on a shoestring budget, funded entirely out of Dean’s own pocket. However, this didn’t mean that there was any compromise in quality.
“We had to build it like we had much more funding than we did,” says Dean.
They focused on making the user experience natural, and on building a tool that could process any retailer product feed regardless of format. Dean knew that Pricesearcher had to be the best product it could possibly be in order to be able to compete in the same industry as the likes of Google.
“Google has set the bar for search – you have to be at least as good, or be irrelevant,” he says.
PriceBot and price data
Pricesearcher initially built up its index by directly processing product feeds from retailers. Some early retail partners who joined the search engine in its first year included Amazon, Argos, IKEA, JD Sports, Currys and Mothercare. (As a UK-based search engine, Pricesearcher has primarily focused on indexing UK retailers, but plans to expand more internationally in the near future).
In the early days, indexing products with Pricesearcher was a fairly lengthy process, taking about 5 hours per product feed. Dean and Akhtar knew that they needed to scale things up dramatically, and in 2015 began working with a freelance DevOps engineer, Vlassios Rizopoulos, to do just that.
Rizopoulos’ work sped up the process of indexing a product feed from 5 hours to around half an hour, and then to under a minute. In 2017 Rizopoulos joined the company as its CTO, and in the same year launched Pricesearcher’s search crawler, PriceBot. This opened up a wealth of additional opportunities for Pricesearcher, as the bot was able to crawl any retailers who didn’t come to them directly, and from there, start a conversation.
“We’re open about crawling websites with PriceBot,” says Dean. “Retailers can choose to block the bot if they want to, or submit a feed to us instead.”
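For retailers who do want to block the crawler, a standard robots.txt rule is all it takes. The user-agent token below is an assumption for illustration – check Pricesearcher’s own documentation for the crawler’s actual name:

```
# Block the price crawler site-wide (user-agent token assumed for illustration)
User-agent: PriceBot
Disallow: /

# Or restrict it to product pages only (most specific matching path wins)
User-agent: PriceBot
Allow: /products/
Disallow: /
```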
For Pricesearcher, product feeds are preferable to crawl data, but PriceBot provides an option for retailers who don’t have the technical resources to submit a product feed, as well as opening up additional business opportunities. PriceBot crawls the web daily to get data, and many retailers have requested that PriceBot crawl them more frequently in order to get the most up-to-date prices.
Between the accelerated processing speed and the additional opportunities opened up by PriceBot, Pricesearcher’s index went from 4 million products in late 2016 to 500 million in August 2017, and now numbers more than 1.1 billion products. Pricesearcher is currently processing 2,500 UK retailers through PriceBot, and another 4,000 using product feeds.
All of this gives Pricesearcher access to more pricing data than has ever been accumulated in one place – Dean is proud to state that Pricesearcher has even more data at its disposal than eBay. The data set is unique, as no-one else has set out to accumulate this kind of data about pricing, and the possible insights and applications are endless.
At Brighton SEO in September 2017, Dean and Rizopoulos gave a presentation entitled, ‘What we have learnt from indexing over half a billion products’, presenting data insights from Pricesearcher’s initial 500 million product listings.
The insights are fascinating for both retailers and consumers: for example, Pricesearcher found that the average length of a product title was 48 characters (including spaces), with product descriptions averaging 522 characters, or 90 words.
Fewer than half of the products indexed – 44.9% – included shipping costs as an additional field, and two-fifths of products (40.2%) did not provide attributes such as size and color.
Between December 2016 and September 2017, Pricesearcher also recorded 4 billion price changes globally, with the UK ranking top as the country with the most price changes – one every six days.
It isn’t just Pricesearcher who have visibility over this data – users of the search engine can benefit from it, too. On February 2nd, Pricesearcher launched a new beta feature which displays a pricing history graph next to each product.
This allows consumers to see exactly what the price of a product has been throughout its history – every rise, every discount – and use this to make a judgement about when the best time is to buy.
“The product history data levels the playing field for retailers,” explains Dean. “Retailers want their customers to know when they have a sale on. This way, any retailer who offers a good price can let consumers know about it – not just the big names.
“And again, no-one else has this kind of data.”
As well as giving visibility over pricing changes and history, Pricesearcher provides several other useful functions for shoppers, including the ability to filter by whether a seller accepts PayPal, along with delivery information and a returns link.
This is, of course, if retailers make this information available to be featured on Pricesearcher. The data from Pricesearcher’s initial 500 million products shed light on many areas where crucial information was missing from a product listing, which can negatively impact a retailer’s visibility on the search engine.
Like all search engines, Pricesearcher has ranking algorithms, and there are certain steps that retailers can take to optimize for Pricesearcher, and give themselves the best chance of a high ranking.
With that in mind, how does ‘Pricesearcher SEO’ work?
How to rank on Pricesearcher
At this stage in its development, Pricesearcher wants to remove the mystery around how retailers can rank well on its search engine. Pricesearcher’s Retail Webmaster and Head of Search, Paul Lovell, is currently focused on developing ranking factors for Pricesearcher, and conceptualizing an ideal product feed.
The team are also working with select SEO agencies to educate them on what a good product feed looks like, and educating retailers about how they can improve their product listings to aid their Pricesearcher ranking.
Retailers can choose to either go down the route of optimizing their product feed for Pricesearcher and submitting that, or optimizing their website for the crawler. In the latter case, only a website’s product pages are of interest to Pricesearcher, so optimizing for Pricesearcher translates into optimizing product pages to make sure all of the important information is present.
At the most basic level, retailers need to have the following fields in order to rank on Pricesearcher: a brand, a detailed product title, and a product description. Category-level information (e.g. garden furniture) also needs to be present – Pricesearcher’s data from its initial 500 million products found that category-level information was not provided in 7.9% of cases.
If retailers submit location data as well, Pricesearcher can list results that are local to the user. Additional fields that can help retailers rank are product quantity, delivery charges, and time to deliver – in short, the more data, the better.
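As a rough sketch of how a retailer or agency might sanity-check a feed against these requirements before submitting it – the field names below are assumptions based on the article, not Pricesearcher’s actual feed specification:

```python
# Sketch: validate a product feed record against the fields the article says
# are needed to rank. Field names are illustrative assumptions, not
# Pricesearcher's real feed schema.

REQUIRED = {"brand", "title", "description", "category"}       # needed to rank at all
RECOMMENDED = {"location", "quantity", "delivery_charge", "delivery_time"}  # help ranking

def check_record(record: dict) -> dict:
    """Report missing required fields and which recommended fields are present."""
    missing = sorted(REQUIRED - record.keys())
    present = sorted(RECOMMENDED & record.keys())
    return {"rankable": not missing, "missing": missing, "recommended_present": present}

product = {
    "brand": "Acme",
    "title": "Acme 4-Seater Rattan Garden Sofa",
    "description": "Weather-resistant rattan sofa with washable cushions.",
    "category": "garden furniture",
    "delivery_charge": "4.99",
}
print(check_record(product))
# → {'rankable': True, 'missing': [], 'recommended_present': ['delivery_charge']}
```

The principle mirrors the article’s advice: the more of the recommended fields a feed carries, the better its chances of ranking well.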
A lot of ‘regular’ search engine optimization tactics also work for Pricesearcher – for example, implementing schema.org markup is very beneficial in communicating to the crawler which fields are relevant to it.
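As a sketch of what that markup might look like on a product page – the property names are standard schema.org Product vocabulary, but the product details here are invented for illustration:

```
<!-- schema.org Product markup in JSON-LD (example values are invented) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme 4-Seater Rattan Garden Sofa",
  "brand": { "@type": "Brand", "name": "Acme" },
  "description": "Weather-resistant rattan sofa with washable cushions.",
  "offers": {
    "@type": "Offer",
    "price": "299.00",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```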
It’s not only retailers who can rank on Pricesearcher; retail-relevant webpages like reviews and buying guides are also featured on the search engine. Pricesearcher’s goal is to provide people with as much information as possible to make a purchase decision, but that decision doesn’t need to be made on Pricesearcher – ultimately, converting a customer is seen as the retailer’s job.
Given Pricesearcher’s role as a facilitator of online purchases, an affiliate model where the search engine earns a commission for every customer it refers who ends up converting seems like a natural way to make money. Smaller search engines like DuckDuckGo have similar models in place to drive revenue.
However, Dean is adamant that this would undermine the neutrality of Pricesearcher, as there would then be an incentive for the search engine to promote results from retailers who had an affiliate model in place.
Instead, Pricesearcher is working on building a PPC model for launch in 2019. The search engine plans to offer intent-based PPC to retailers, which would allow them to opt in to be notified about returning customers, and to serve an offer to customers who come back and show interest in a product.
Other than PPC, what else is on the Pricesearcher roadmap for the next few years? In a word: lots.
The future of search is vertical
The first phase of Pricesearcher’s journey was all about data acquisition – partnering with retailers, indexing product feeds, and crawling websites. Now, the team are shifting their focus to data science, applying AI and machine learning to Pricesearcher’s vast dataset.
Head of Search Paul Lovell is an analytics expert, and the team are recruiting additional data scientists to work on Pricesearcher, creating training data that will teach machine learning algorithms how to process the dataset.
“It’s easy to deploy AI too soon,” says Dean, “but you need to make sure you develop a strong baseline first, so that’s what we’re doing.”
Pricesearcher will be out of beta by December of this year, by which time the team intend to have all of the prices in the UK (yes, all of them!) listed in Pricesearcher’s index. After the search engine is fully launched, the team will be able to learn from user search volume and use that to refine the search engine.
The Pricesearcher rocket ship – founder Samuel Dean built this by hand to represent the Pricesearcher mission. It references a comment made by Eric Schmidt to Sheryl Sandberg when she interviewed at Google. When she told him that the role didn’t meet any of her criteria and asked why she should work there, he replied: “If you’re offered a seat on a rocket ship, don’t ask what seat. Just get on.”
At the moment, Pricesearcher is still a well-kept secret, although retailers are letting people know that they’re listed on Pricesearcher, and the search engine receives around 1 million organic searches per month, with an average of 4.5 searches per user.
Voice and visual search are both on the Pricesearcher roadmap; voice is likely to arrive first, as a lot of APIs for voice search are already in place that allow search engines to provide their data to the likes of Alexa, Siri and Cortana. However, Pricesearcher are also keen to hop on the visual search bandwagon as Google Lens and Pinterest Lens gain traction.
Going forward, Dean is extremely confident about the game-changing potential of Pricesearcher, and moreover, believes that the future of the industry lies in vertical search. He points out that in December 2016, Google’s parent company Alphabet specifically identified vertical search as one of the biggest threats to Google.
“We already carry out ‘specialist searches’ in our offline world, by talking to people who are experts in their particular field,” says Dean.
“We should live in a world of vertical search – and I think we’ll see many more specialist search engines in the future.”
For small and medium businesses that want to compete on the same playing field as much larger corporations with greater resources at their disposal, having a strong local SEO strategy is crucial.
Irrespective of what industry you’re in, you’ll always have at least one competitor who has been around longer and has allocated more budget and resources to building their visibility on the web and in search engines.
It may feel futile to try and compete with them in the realm of SEO.
But local SEO plays by slightly different rules to the regular kind. You don’t need to have reams of funding at your disposal or hundreds of links pointing to your site to be visible and relevant to a local audience – you just need to understand the unique characteristics of local SEO, and apply a few simple strategies to cater to them.
In this article, we’ll explore four cost-effective strategies that small and medium businesses can use to give themselves the best chance of ranking locally.
Verify your business’ Google Plus page
Your first step is to link your business’ Google Plus page with Google My Business. Google My Business allows SMEs to update their information across Google Search, Google Maps, and Google Plus in one fell swoop, to ensure that a potential customer can find you wherever they are and whatever device they’re on.
Local searches lead to more purchases than non-local searches, and verifying your Google Plus page makes it possible for you to dominate the search results page for your brand name, especially the Local Business Card on the right.
Business owners need to realize that anyone can edit your business listing – and this includes your competitors. Once you have provided Google My Business with all your details, it is very important to log in to your Google My Business dashboard regularly to ensure that no one has attempted to make any unwanted changes to your listing.
Take advantage of the many interesting features available to businesses on Google My Business, such as Google Posts, the booking button feature, messaging, Questions & Answers, and more. The possibilities are limitless, so get busy!
Implement a pay-per-click strategy
Launching your website is only the first step. Implementing a marketing strategy that includes pay-per-click advertising is an important part of small business success within the digital landscape.
Virtually every small business can benefit from implementing a pay-per-click marketing strategy to build its web presence. The idea is to identify targeted, relevant keywords, understand your target audience and develop a strategy that will drive the right types of leads.
Selling a product or service that is difficult for consumers to find locally makes your business a great candidate for PPC advertising campaigns. People often rely on internet searches to locate unusual or rare products.
On top of this, many local searches with high purchase intent take place on mobile, as consumers search for a business or service “near me” while they’re out and about. As PPC advertising dominates a greater proportion of screen space on mobile, having paid search ads in place will give you the best chance of appearing in front of consumers in these moments.
By combining the strengths of both search and social media, pay-per-click advertising will effectively round out any small business’s paid advertising strategy. Understanding the differences between platforms, and when to use each, will increase visibility and decrease cost.
Host user-generated reviews
Google sees reviews as a major factor for ranking on the new carousel design; however, more than anything your reviews are for Google users who see your company on a SERP. Peer-to-peer reviews are powerful because they give your potential customers a good sense of what it’s really like to use your goods or services.
In this regard, the internet has leveled the playing field for small businesses across the globe through the power and exposure of online user-generated reviews.
Search engine spiders like content that is unique and frequently updated, and user reviews are an easy way to create more of it. User-generated content is unique by nature, which sets it apart from the generic manufacturer descriptions that many ecommerce sites reuse verbatim.
This, combined with the fact that the words and phrases used by reviewers are often the same as those used by searchers, increases the chances of ranking well for search queries that are relevant to your product.
When you consider that 88% of shoppers consult product reviews before making a purchase, it’s a safe bet that more and more consumers will be searching for the name of your product along with the word ‘review’, or related words like ‘ratings’.
Get your visitors started by putting a review button on your webpage, prompting visitors to leave a review after purchasing something or visiting a particular landing page, or talking directly with people in your store or company about leaving a review.
Optimize your images
Optimization for local SEO is not limited to text. Due to the increasingly blended nature of search results, you can now see images on the search listings page, so it’s important to optimize your imagery for search engines.
Ensure your images are search engine-friendly. It all starts with the file name. There are a billion and one images out there, so you don’t want to use a generic image file name like ‘image12345.jpg’ that will guarantee your business gets lost in the pile. Instead, you want to use something descriptive to make it easier for your images to compete in rankings.
Search engines can’t read images, so it is up to you to use alt tags to help describe your image to ensure it pops up during relevant queries. Write a concise, relevant description that contains the appropriate keywords. Don’t forget to write content above and below the images on your website, using keywords where appropriate; the more the text is related to the image, the better.
Most importantly, if you want your images to rank for localized keywords, make sure you add local keywords wherever you can for blended results optimized for a specific local area.
In short, there’s no elevator to the top of the search engine rankings, especially when a massive competitor is lingering on the scene. But with a strategy that leverages your geographic location, you can selectively overcome your competitors in specific key areas.
Give yourself an advantage by narrowing your topic and keyword focus and increasing your location-specific relevance. You might not rank for as many keywords as the big players, but you will be able to surpass them in relevance for your chosen focal points.
If you want to dive further into local SEO strategies after reading these tips, the following articles will take you more in-depth:
- How to create a kickass link-building strategy for local SEO
- 6 ways to market your local business online (beyond Google Maps)
- How creating relevant experiences can boost your clicks on local search ads
- How to optimize Google My Business listings for multi-location businesses
Pius Boachie is the founder of DigitiMatic, an inbound marketing agency.
Streamlined account management
With centralized account management, you can control user access and permissions across multiple products, like Analytics, Tag Manager, and Optimize.
The first step is to create an organization to represent your business. You then link this organization to all of the different accounts that belong to your business. You can also move accounts between the organizations you create.
Now you have a central location where administrators for your organization can:
- Create rules for which types of new users should be allowed access to your organization
- Audit existing users and decide which products and features they should have access to
- Remove users who have left your organization or no longer need access to the tools
- See the last time a user in your organization accessed Google Analytics data
- Allow users to discover who are your organization’s admins and contact them for help
New home page
Setting up an organization also gives you access to a new home page that provides an overview of your business. You’ll be able to manage accounts and settings across products and get insights and quick access to the products and features you use most. For example, you might see a large increase in visitors for a specific Analytics property, and then click through to Analytics to investigate where the visitors are coming from.
Finally, you’ll get a unified user experience across products. Common navigation and product headers make it easy to switch between products and access the data you need. You can view accounts by organization, or see everything you have access to in one place. We’ve also redesigned search, making it possible to search across all of your accounts in a single place.
These updates will be rolling out over the next few weeks, so please stay tuned if you don’t yet have access.
Note: If you’re using the enterprise versions of our products, like Analytics 360, you already have access to these features as part of the Google Analytics 360 Suite.
Posted by John Oberbeck, Product Manager Google Analytics
In May, we announced Google Attribution, a new free product to help marketers measure the impact of their marketing across devices and across channels. Advertisers participating in our early tests are seeing great results. Starting today, we’re expanding the Attribution beta to hundreds of advertisers.
We built Google Attribution to bring smarter performance measurement to all advertisers, and to solve the common problems with other attribution solutions.
Google Attribution is:
- Easy to set up and use: While some attribution solutions can take months to set up, Google Attribution can access the marketing data you need from tools like AdWords and Google Analytics with just a few clicks.
- Cross-device: Today’s marketers need measurement tools that don’t lose track of the customer journey when people switch between devices. Google Attribution uses Google’s device graph to measure the cross-device customer journey and deliver insights into cross-device behavior, all while protecting individual user privacy.
- Cross-channel: With your marketing spread out across so many channels (like search, display, and email), it can be difficult to determine how each channel is working and which ones are truly driving sales. Google Attribution brings together data across channels so you can get a more comprehensive view of your performance.
- Easy to take action: Attribution insights are only valuable if you can use them to improve your marketing. Integrations with tools like AdWords make it easy to update your bids or move budget between channels based on the new, more accurate performance data.
Results from Google Attribution beta customers
Last April, we shared that for AdWords advertisers, data-driven attribution typically delivers more conversions at a similar cost-per-conversion compared with last-click attribution. This shows that data-driven attribution is a better way to measure and optimize the performance of search and shopping ads.
Today we’re pleased to share that early results from Google Attribution beta customers show that data-driven attribution helps marketers improve their performance across channels.
HelloFresh, a meal delivery service, grew conversions by 10% after adopting Google Attribution. By using data-driven attribution to measure across channels like search, display, and email, Google Attribution gives HelloFresh a more accurate measurement of the number of conversions each channel is driving. And because Google Attribution is integrated with AdWords, HelloFresh can easily use this more accurate conversion data to optimize their bidding.
“With Google Attribution, we have been able to automatically integrate cross-channel bidding throughout our AdWords search campaigns. This has resulted in a seamless change in optimization mindset as we are now able to see keyword and query performance more holistically rather than inadvertently focusing on only last-click events.” – Karl Villanueva, Head of Paid Search & Display, HelloFresh
Pixers, an online marketplace, is also seeing positive results including increased conversions. Google Attribution allows Pixers to more confidently evaluate the performance of their AdWords campaigns and adopt new features that improve performance.
“By using Google Attribution data we have finally eliminated guesswork from evaluating the performance of campaigns we’re running, including shopping and re-marketing. The integration with AdWords also enabled us to gradually roll out smart bidding strategies across an increasing number of campaigns. The results have significantly exceeded expectations, as we managed to cut the CPA while obtaining larger conversion volumes.” – Arkadiusz Kuna, SEM & Remarketing Manager, Pixers
Google Attribution can also help brands get a better understanding of their customer’s path to purchase. eDreams ODIGEO, an online travel company, knows that people don’t usually book flights or hotels after a single interaction with their brand. It often requires multiple interactions with each touchpoint having a different impact.
“Some channels open the customer journey and bring new customers, whereas other channels are finishers and contribute to closing the sale. Google Attribution is helping us to understand the added value of each interaction. It enhances our ability to have a holistic view of how different marketing activities contribute to success.” – Manuel Bruscas, Director of Marketing Analytics & Insights, eDreams ODIGEO
In the coming months we’ll invite more advertisers to use Google Attribution. If you’re interested in receiving a notification when the product is available for you, please sign up here.
Don’t forget, even before adopting Google Attribution, you can get started with smarter measurement for your AdWords campaigns. With attribution in AdWords you can move from last-click to a better attribution model, like data-driven attribution, that allows you to more accurately measure and optimize search and shopping ads.
Posted by Bill Kee, Group Product Manager, Measurement and Attribution
Domain Authority (DA) is a metric that serves as a handy heuristic in the SEO industry. Put simply, it provides insight into how likely a site is to rank for specific keywords, based on the SEO authority it holds. There are numerous tools that can help us arrive at these useful scores.
Below, we round up some of the most accurate and intuitive ways to see a site’s SEO equity.
In an often opaque industry, with few insights into how Google’s algorithms really work for organic search, the lure of a metric like Domain Authority is self-evident.
It provides a glimpse into the SEO “strength” of a website, in a similar fashion to the now obsolete PageRank toolbar. Google still makes use of some variation of the PR algorithm internally, but its scores are no longer visible to the public and were never particularly helpful.
If anything, they encouraged some negative attempts to “game” Google’s rankings through link acquisition.
However, many SEOs make use of Domain Authority to sense-check the quality of their inbound links and to understand how these are affecting their own site’s SEO health.
What is Domain Authority?
“Domain Authority (DA) is a search engine ranking score developed by Moz that predicts how well a website will rank on search engine result pages (SERPs). A Domain Authority score ranges from one to 100, with higher scores corresponding to a greater ability to rank.
Domain Authority is calculated by evaluating linking root domains, number of total links, MozRank, MozTrust, and other factors into a single DA score. This score can then be used when comparing websites or tracking the "ranking strength" of a website over time." – Moz.
Ultimately, this is a representative model of how Google decides which pages should rank for each query, and in what order they should rank.
As is the case with the term ‘relevance’, authority covers a very broad area of assessment that is open to interpretation. Domain Authority aims to cut through that ambiguity by providing a metric that can compare the SEO strength of different websites based on a consistent methodology.
Although marketers are aware that DA has intrinsic limitations as a metric, it is at least a barometer of whether our SEO efforts are gaining traction or not. As such, it serves an important purpose.
When prospecting for new links, for example, it is helpful to check the DA of external sites before contacting the site about a potential partnership. Combined with a range of other metrics – both qualitative and quantitative – Domain Authority can therefore guide brands towards more effective SEO decisions.
‘Domain Authority’ was devised by Moz and they have naturally taken ownership of this name. Their suite of tools (some of which are discussed in this article) will reveal the authority of particular domains, but dozens of other free tools use Moz’s API to show these scores too.
However, a couple of other SEO software packages provide a slightly different view on a domain’s SEO strength.
Moz’s scores are based on the links contained within its own index, which is undoubtedly smaller than Google’s index of URLs.
Other SEO software companies, such as Majestic and Ahrefs, have their own index of URLs. These indexes will largely overlap with each other, but there are still questions to pose to your chosen provider:
- Index size: How many URLs are contained within the software’s index?
- Frequency of index crawling: How often is the index refreshed?
- Live links: Are there common instances of ‘false positives’, where inactive links are reported with 200 status codes?
- Correlation with actual rankings: Simply, does a higher domain score equate to better rankings?
The importance of these questions, and the resultant significance of their answers, will depend on a brand’s context. Nonetheless, these are points worth considering when assessing the scores your site receives.
Each of the main players in this space has subtle distinctions within its methodology, which will be important for most SEOs.
We will begin our round-up with the Moz tools (some of them free) that will show the Domain Authority for any site, before looking at a couple of alternatives that provide a valuable reference point.
Moz (MozBar, Open Site Explorer)
It should be clear that Moz is the major contender when it comes to checking a domain’s SEO authority. We included MozBar on our list of the best Google Chrome extensions for SEO and it deserves its place in this list, too.
MozBar will highlight the Domain Authority of any site a user is browsing, along with the Page Authority (PA) of that particular URL. As the name suggests, PA applies a similar methodology to DA, but localized to a particular URL rather than an entire domain.
This is also available in search results pages, making it possible to see whether a site’s Domain or Page Authority correlates with higher rankings for particular queries.
As such, these two metrics in combination are a great starting point for investigations into the quality and quantity of backlinks pointing to a domain.
Marketers should be aware, however, that these scores do fluctuate.
That should be viewed as a positive, as the scores are an increasingly accurate reflection of how Google is evaluating sites. Moz employs machine learning algorithms to re-calibrate the authority scores based not only on link activity across its index, but also on the impact that certain types of link have.
We can consider this an attempt to peg the Moz index to that of Google, and we know the latter is tweaked thousands of times a year.
Therefore, we should be careful about the causal links we infer from DA scores.
When tracking Domain Authority, always benchmark against similar sites to avoid viewing this as an absolute indication of how well you are performing. By viewing it as a relative metric instead, we can gain a healthier insight into whether our strategy is working.
This is where another Moz-owned tool, Open Site Explorer, proves its worth. Open Site Explorer uses a range of proprietary Moz metrics to highlight the areas in which specific sites under- or over-perform. The side-by-side comparisons it creates are an intuitive way to spot strengths and weaknesses in a site's link profile on a broader scale.
Moz’s Domain Authority is undoubtedly useful – especially when used as an entry point into deeper investigation. MozBar and Open Site Explorer provide access to this metric for all marketers, so they should be viewed as the go-to resources for anyone seeking a check on their site’s SEO ranking potential.
Ahrefs
Ahrefs boasts an index of over 12 trillion links and data on 200 million root domains, making it an invaluable repository for SEOs wanting to understand their site’s SEO performance.
The two metrics that matter within the scope of this article are URL Rating (UR) and Domain Rating (DR).
We can consider these Ahrefs’ equivalents to Page Authority and Domain Authority, respectively, at least in terms of their purpose.
The latter is defined by Ahrefs as “a proprietary metric that shows the strength of a target website’s total backlink profile (in terms of its size and quality).”
It appears frequently within the software interface, in examples like the one in the screenshot below:
So, why would you use the Ahrefs DR score over Moz’s DA calculation? Their definitions do seem strikingly similar, after all.
As always, the detail is critical. If we refer back to our initial points for consideration, it becomes possible to compare Ahrefs with Moz:
- Index size
- Frequency of index crawling
- Live links
- Correlation with actual rankings
Both Moz and Ahrefs have invested significantly in improving the size, quality and freshness of their link data. Some SEOs have a preference for one over the other, and their scores do vary significantly on occasion.
Those that prefer Ahrefs typically do so for the freshness of its index and DR’s correlation with actual rankings.
The clarity of the Ahrefs methodology is also very welcome, right down to the number of links typically required to reach a specific DR score.
To put things simply, we calculate the DR of a given website the following way:
- Look at how many unique domains have at least 1 dofollow link to the target website;
- Take into account the DR values of those linking domains;
- Take into account how many unique domains each of those websites link to;
- Apply some math and coding magic to calculate “raw” DR scores;
- Plot these scores on a 0–100 scale (which is dynamic in nature and will “stretch” over time).
The approximate number of referring domains associated with each DR band gives a sense of this scale:
- DR 0–20: 20 ref.domains
- DR 20–40: 603 ref.domains
- DR 40–60: 4,212 ref.domains
- DR 60–80: 25,638 ref.domains
- DR 80–100: 335,717 ref.domains
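The roughly logarithmic relationship between referring domains and DR can be illustrated with a toy calculation. To be clear, this is an illustrative sketch only, not Ahrefs' actual formula; the anchor points are simply lifted from the bands listed above.

```python
import math

# Anchor points taken from the DR bands above (ref. domains -> approximate DR).
# Illustrative toy model only -- NOT Ahrefs' real algorithm.
ANCHORS = [(1, 0), (20, 20), (603, 40), (4212, 60), (25638, 80), (335717, 100)]

def toy_dr(ref_domains: int) -> float:
    """Interpolate a 0-100 score on a log scale between the anchor points."""
    if ref_domains <= 1:
        return 0.0
    if ref_domains >= ANCHORS[-1][0]:
        return 100.0
    for (lo_d, lo_s), (hi_d, hi_s) in zip(ANCHORS, ANCHORS[1:]):
        if lo_d <= ref_domains <= hi_d:
            frac = (math.log(ref_domains) - math.log(lo_d)) / (
                math.log(hi_d) - math.log(lo_d))
            return lo_s + frac * (hi_s - lo_s)

print(round(toy_dr(603)))    # 40
print(round(toy_dr(10000)))  # falls somewhere in the 60-80 band
```

The log scale is the key point: each additional 20 points of DR requires roughly an order of magnitude more referring domains, which is why moving from DR 60 to DR 80 is so much harder than moving from DR 20 to DR 40.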
Ahrefs requires a monthly licence to access its data; for those that do sign up, it provides a very useful sanity check for the domain strength scores seen elsewhere.
Majestic
Majestic is marketed as “The planet’s largest link index database” and it remains a trusted component of any SEO toolbox for the thorough nature of its backlink data.
Offering two index options (Fresh and Historic), it also gives marketers different views of how their domain is performing. As with Moz and Ahrefs, Majestic’s scores for site strength are calculated almost exclusively from the quality and quantity of inbound links.
Opting for the Historic Index will see Majestic scour the billions of URLs it has crawled within the last 5 years, while the Fresh Index is updated multiple times per day.
This software takes a slightly different tack in relation to the labeling of its domain metrics, which are known as Trust Flow and Citation Flow.
These are interrelated metrics that combine to form the set of Majestic Flow Metrics. These are very insightful because of the immediate score they provide (ranging from a low of 0 to a high of 100), and also for the opportunities to dig further into the backlink data.
One favorite feature of Majestic is the ability to analyze historical backlink acquisition trends, both in terms of links gained and links lost. As such, Majestic’s domain strength metrics provide actionable insight that can be used to shape strategy immediately. For example, the loss of a lot of links on a particular date may provide an opportunity to reach out to webmasters and try to regain that equity.
Majestic also comes with a handy toolbar that overlays domain metrics on the site a user is browsing. An apples-to-apples comparison between Majestic and Moz, or Majestic and Ahrefs, in terms of the efficacy of their domain authority rankings would be difficult, but it would also miss the point.
All of these tools are aiming to mimic the functioning of Google as accurately as they can; taken together they form a more rounded picture.
Given the ongoing significance not only of backlinks, but also the potential of unlinked mentions to boost performance, search marketers are quite rightly looking to Domain Authority to assess their SEO potential.
The core elements of a successful, customer-centric SEO strategy remain the same as they always were; higher scores, from whichever domain metrics one chooses to monitor, should be seen as a natural by-product of a strategy that fulfils the modern consumer’s needs.
Any stellar SEO strategy should be meticulously tracked and heavily data-driven.
Gut feel is great when deciding on which new pair of shoes to buy, but it’s not the best foundation to base your SEO work upon.
Google Analytics is a treasure trove of insightful data. And it’s free! However, with so much data available at our fingertips, it can be a bit of a minefield, and most people only scratch the surface.
Keyword rankings are great for stroking your ego and making your client smile and nod, but they don’t tap into the bigger picture.
In order to continually build on and improve your campaign, you need to pay close attention to the nitty-gritty of your data. There’s a lot to take into account, but in this post we’ll provide an overview of the key Google Analytics reports and views to bolster your SEO campaigns.
Many of these reports can be created as custom reports, which is handy for tailoring your reporting to specific business needs and sharing with clients.
Read on and we’ll help you to track and measure your SEO efforts like the analytical guru you are.
1. Organic search
Where to find it: ‘Acquisition’ > ‘Overview’ > Click through to ‘Organic Search’
It’s an obvious one but a good place to start. Head to the ‘Overview’ tab under ‘Acquisition’ for a base level indication of your website’s primary traffic channels. This provides an immediate summary of your top channels and how each is performing in terms of traffic volume, behavior and conversions.
As well as showing a general overview of organic traffic, you can also dig deeper into the data by clicking on ‘Organic Search’ in the table and playing around with the filters. Consider the most popular organic landing pages, an overview of keywords, search engines sending the most traffic, exit pages, bounce rates, and more.
On the topic of bounce rates, it’s a good idea to pay particular attention to this metric with regards to individual pages. Identify those pages with a bounce rate above the average for your site. Take some time to review these pages and work out why that might be, subsequently applying any UX/UI or targeting amendments.
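The page-by-page review can be automated with a few lines of scripting. The figures below are made up for illustration; in practice they would come from a Google Analytics export.

```python
# Flag landing pages whose bounce rate exceeds the site average.
# The page data here is hypothetical, standing in for a GA export.
pages = {
    "/": 0.35,
    "/pricing": 0.62,
    "/blog/seo-tips": 0.48,
    "/contact": 0.71,
}

site_average = sum(pages.values()) / len(pages)
flagged = sorted(p for p, rate in pages.items() if rate > site_average)

print(f"Site average bounce rate: {site_average:.2%}")
print("Review these pages:", flagged)
```

Pages that bounce well above the site average are the natural candidates for a UX or targeting review.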
This is all very well but wouldn’t it be handy if you could view only your organic traffic across the whole of your Google Analytics? It’s easier than you think. Simply click to ‘Add Segment’ and check the box for organic traffic.
Leave the ‘All Users’ segment for a handy comparison, or remove this segment for a view of only your organic traffic.
2. Landing page and page titles
Where to find it: ‘Behavior’ > ‘Site Content’ > ‘Landing Pages’ > Add secondary dimension ‘Page Titles’
One of the most frustrating aspects of Google Analytics organic reports is the dreaded ‘(not provided)’ result which features under ‘Keyword’.
This occurs when searches are carried out securely: when the search engine is served over HTTPS, or when the user is logged into a Google account and therefore protected by data privacy policies. In these scenarios, the search term used will not be passed on.
But how wonderful would it be to see a list of all the search terms people used to find your site? Unfortunately I’m not a magician and I can’t abracadabra these search phrases from the Google abyss. But I can offer an alternative solution that will at least give you an overview.
View your organic traffic via landing page and page title, as this will show which pages are performing best in terms of organic search. By including the page title, you can then look at which keywords those pages are optimized for and get a pretty good idea of the search phrases users are deploying and those which are performing best in terms of traffic and bounce rate.
This can also help you identify the pages which are not performing well in terms of organic traffic. You can then review whether the keywords need refining, the onsite optimization needs an overhaul, or the content needs revamping.
3. Conversion goals
Where to find it: ‘Conversions’ > ‘Goals’ > ‘Overview’
It’s all very well having a high volume of organic traffic but if it isn’t converting then there’s really not much point. To test the quality of your organic traffic, you need to be tracking conversions. There are two levels to this.
The first is your conversion goals. You can filter these with regards to traffic and understand what percentage of a website’s conversions are resulting from organic traffic.
To further improve this data, add monetary value to your conversions to better demonstrate the value that your SEO efforts are bringing. Some clients care only about keyword rankings, some care only about the dollar signs. Either way, it’s worth spending some time with your client to work out how much each conversion is worth and the data that they are most interested in.
For example, let’s say you sell kitchens. If you know the average cost of a sale and the percentage of kitchen brochure downloads which convert to a sale, then you can work out an approximate value for each conversion.
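The kitchen-brochure arithmetic works out like this. The figures are hypothetical; substitute your own averages.

```python
# Worked version of the kitchen-brochure example above (hypothetical figures).
average_sale_value = 8000.00   # average revenue per kitchen sold
brochure_to_sale_rate = 0.05   # 5% of brochure downloads become sales

# Each brochure download is worth, on average, the sale value weighted by
# the probability that the download leads to a sale.
value_per_download = average_sale_value * brochure_to_sale_rate
print(f"Approximate value per brochure download: ${value_per_download:.2f}")
```

With that figure assigned as the goal value in Google Analytics, the organic channel's contribution can be reported in revenue terms rather than raw conversion counts.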
4. Assisted conversions
Where to find it: ‘Conversions’ > ‘Multi-Channel Funnels’ > ‘Assisted Conversions’
Although useful, conversion goals only give a surface view of conversions. What if someone initially found your website via Google and didn’t convert, but then later returned to your website by typing in the URL direct and then converted?
It’s very common for users not to convert on their first visit to a website, especially if they are only in the awareness or consideration phase of the sales funnel. When returning the next time around to make a purchase, they are more likely to go direct, or perhaps they see a reminder via social media.
This is where assisted conversions can save the day. Find these by clicking on ‘Multi-Channel Funnels’ under ‘Conversions’, and then ‘Assisted Conversions’.
With this data, you can identify whether each channel featured on the conversion path of a user, therefore providing more accurate data in terms of the quality of your organic traffic.
Pay attention to any drops or surges in organic traffic in this section. If, for example, you have noticed a drop in organic assisted conversions yet your organic traffic has remained consistent, then it may indicate that the leads are no longer as qualified. This should prompt a review of your keyword and content strategy.
5. Site speed
Where to find it: ‘Behavior’ > ‘Site Speed’ > ‘Overview’
Site speed is important, we all know that. There are a number of tools we can use to find out the overall speed of a website: Google PageSpeed Insights, Pingdom, GTmetrix. However, these don’t tend to drill down into specific pages. The site speed report via Google Analytics can help you to identify any pages which are proving particularly slow.
You are likely to see a correlation between page load time and exits, and you can also layer in bounce rate metrics.
Using this information regarding individual pages, you can then approach your development team with the cold hard evidence that they need to resolve that page speed issue.
6. Site search
Where to find it: ‘Behavior’ > ‘Site Search’ > ‘Search Terms’
If you have a site search function on your website then this report is super useful for a number of reasons. Firstly, it can indicate where the user experience may not be particularly strong on your website. If a page is proving difficult to find without having to search for it then it may hint at a wider site navigation issue.
In addition, it can also help identify any keywords or search terms which you may need to create a new page for if one does not already exist. The site search report is ideal for unearthing these gaps in your website’s offering.
7. Mobile
Where to find it: ‘Audience’ > ‘Mobile’ > ‘Overview’
Comparing the traffic of mobile users to that of desktop and tablet is a handy way of identifying whether your site may have some mobile optimization issues. For example, if the bounce rate of mobile sessions is significantly higher than that of your desktop sessions, then you may need to carry out a mobile site audit.
It’s also worth considering the conversion rate of the different devices, as this can indicate which device traffic is the most valuable.
Given that over half of website traffic is now on mobile, you should see similar results reflected in your own analytics, although it’s worth bearing in mind that some types of business naturally attract more mobile traffic than others.
For example, a local business should feature in a lot of mobile searches, whereas a business to business service is more likely to be searched for on desktop by people sitting in an office.
8. Customize your dashboard
Where to find it: ‘Customization’ > ‘Dashboards’
Finally, for a quick overview of reporting, it pays to design a tailored dashboard for your client. We often find that clients don’t appreciate too much text or complex tables in reports, as they can be overwhelming at an initial glance.
Sure, you may be a Google Analytics whizz, but the chances are that your client isn’t. Therefore presenting the data in a way that is digestible and manageable is key to convincing them of your SEO prowess.
Create a dashboard that your client will understand. Use digestible charts, like bar graphs, pie charts and simplified tables. This will help the client visualize all of the data in one easy-to-view report. This can also be emailed to your client each week so they get regular updates.
Dashboards are created using customizable widgets. Begin by selecting the type of widget: this could be a simple metric, a timeline, a geomap, a table, or a pie or bar chart. With some widgets, you can also select whether to show a specified date range or whether to show data in real-time.
Once you have chosen your widget, you can configure the finer details, such as dimensions and other options depending on the type. Widgets can be edited, cloned or deleted, allowing flexibility in refining your dashboard as both you and your client see fit. For further information on creating a custom dashboard, have a read of Google’s handy guide.
There are a myriad of other reports and views available within Google Analytics; it takes time to become familiar with all the different types of data and formats. Hopefully this list has provided a solid starting point for genuinely valuable and insightful SEO reporting.
Pivot tables let users narrow down a large data set or analyze relationships between data points. They reorganize a user’s dimensions and metrics to help quickly summarize data and reveal relationships that might otherwise be hard to spot.
Example Pivot Table (Help center doc here)
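The reshaping that a pivot table performs can be illustrated with pandas; a Data Studio pivot table applies the same idea to a connected data source. The traffic figures here are made up.

```python
import pandas as pd

# Hypothetical session data: one row per (channel, device) combination.
df = pd.DataFrame({
    "channel":  ["Organic", "Organic", "Paid", "Paid"],
    "device":   ["Desktop", "Mobile", "Desktop", "Mobile"],
    "sessions": [1200, 800, 400, 600],
})

# Pivot: one dimension becomes rows, another becomes columns,
# and the metric is aggregated in the cells.
pivot = df.pivot_table(values="sessions", index="channel",
                       columns="device", aggfunc="sum")
print(pivot)
```

The long, row-per-observation table becomes a compact grid in which the channel/device relationship is visible at a glance.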
Coordinated coloring allows users to bind colors to specific data. When a user creates visualizations, Data Studio automatically binds colors to data, so that color:data pairs stay consistent between visualizations and when filtering. This feature is automatically turned on for all new reports, and available in old reports.
Example Coordinated Coloring (Help center doc here)
Google Analytics Sampling Indicator
Google Analytics samples data in order to provide accurate reporting in a timely manner. Data Studio now shows a sampling indicator in Data Studio reports when a component contains sampled Analytics data.
Field Reports Editing
Data Studio has also recently added new options to the field chips in reports. These new options allow you to:
- Rename fields
- Change aggregation types
- Change semantic types
- Change date functions
- Apply % of total, difference from total, or percent difference from total to a metric from within the report.
Example of new field editing options (Help center doc here).
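The comparison options in that list amount to simple arithmetic over a metric. A minimal sketch, using made-up per-channel session counts:

```python
# Illustrative computation of "% of total" and "difference from total"
# for a sessions metric (hypothetical figures).
sessions = {"Organic": 5000, "Paid": 3000, "Direct": 2000}
total = sum(sessions.values())

pct_of_total = {k: v / total for k, v in sessions.items()}
diff_from_total = {k: v - total for k, v in sessions.items()}
pct_diff_from_total = {k: (v - total) / total for k, v in sessions.items()}

print(pct_of_total["Organic"])  # 0.5
```

In Data Studio these transformations are applied from the chip itself, without editing the underlying data source.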
Submitting and voting for new features
The Data Studio team will continue to introduce new features and product enhancements based on your submissions. You can view requests submitted by other users, upvote your favorites, or create new ones. Learn more here.
Posted by Dave Oleson, Product Manager, Google Data Studio
AdWords integration: Find the best landing page
Marketers spend a lot of time optimizing their Search Ads to find the right message that brings the most customers to their site. But that’s just half the equation: Sales also depend on what happens once people reach the site.
The Optimize and AdWords integration we announced in May gives marketers an easy way to change and test the landing pages related to their AdWords ads. This integration is now available in beta for anyone to try. If you’re already an Optimize user, just enable Google Optimize account linking in your AdWords account. (See the instructions in step 2 of our Help Center article.) Then you can create your first landing page test in minutes.
Suppose you want to improve your flower shop’s sales for the keyword “holiday bouquets.” You might use the Optimize visual editor to create two different options for the hero spot on your landing page: a photo of a holiday dinner table centerpiece versus a banner reading “Save 20% on holiday bouquets.” And then you can use Optimize to target your experiment to only show to users who visit your site after searching for “holiday bouquets.”
If the version with the photo performs better, you can test it with other AdWords keywords and campaigns, or try an alternate photo of guests arriving with a bouquet of flowers.
Objectives: More flexibility and control
Since we released Optimize and Optimize 360, users have been asking us for a way to set more Google Analytics metrics as experiment objectives. Previously, Optimize users could only select the default experiment objectives built into Optimize (like page views, session duration, or bounces), or select a goal they had already created in Analytics.
With today’s launch, Optimize users no longer need to pre-create a goal in Analytics; they can create the experiment objective right in Optimize:
When users build their own objective directly in Optimize, we’ll automatically help them check to see if what they’ve set up is correct.
Plus, users can also set their Optimize experiment to track against things like Event Category or Page URL.
Learn more about Optimize experiment objectives here.
Why do these things matter?
It’s always good to put more options and control into the hands of our users. A recent study showed that marketing leaders – those who significantly exceeded their top business goal in 2016 – are 1.5X as likely to say that their organizations currently have a clear understanding of their customers’ journeys across channels and devices.1 Testing and experimenting is one way to better understand and improve customer journeys, and that’s what Optimize can help you do best.
1Econsultancy and Google, “The Customer Experience is Written in Data”, May 2017, U.S.
Posted by Rotimi Iziduh and Mary Pishny, Product Managers, Google Optimize
More than six hundred developers have signed up for developer access to Data Studio Community Connectors since the Developer Launch. Community Connectors give developers an opportunity to come up with innovative solutions for data access and broaden the scope of data sources users can connect to.
Based on community feedback, we recognized that many of you are looking to share your work on connectors with the community. Also, developers are looking for more examples to follow. With these community needs in mind, today, we are announcing the Open Source Community Connectors repository on GitHub.
Use open source Community Connectors
For every connector that is hosted in the open source repository, the Data Studio Developer Relations team will manage a deployment for the connector’s latest code. This managed deployment will enable all users to immediately try the connector in Data Studio by simply clicking a link. Managed deployments also make it easier for developers since you do not have to deploy and maintain the connectors yourself; we’ll take care of this for you.
You can try out the following Open Source connectors directly in Data Studio:
- npm Downloads connector: Fetch download counts for specific npm packages by date
- Fusion Tables connector: Fetch data from Google Fusion Tables
- Stack Overflow Questions connector: Fetch Stack Overflow Question metadata for specific tags
Example dashboards using these connectors:
Learn about best practices
If you want to connect to new data sources using Data Studio but have not yet looked into Community Connectors, now would be a good time to start, since a variety of example connector code has become available. These examples will give you a head start and create a platform for you to learn from and share with other community members.
We are releasing the connectors listed above as the initial contents of our open source repository.
Contribute to the community
If you want to submit your own open source connector to the repository, you can send us a pull request. Alternatively, you can maintain your own repository and link to it from the official repository.
This Git repository is a small start, and we plan to make new additions. We have already seen other open source Community Connectors like data.world and getSTAT. We hope that this initiative will help developers and users create connectors to new data sources and thus make more data accessible in Data Studio. Developers can also collaborate with each other, as well as report new issues and fix existing ones, through these open source connectors.
This collaboration platform gives developers the option to leverage support from the community. If you want to develop your own connector but are unable to maintain it in the long run, you can add it to our repository so that the community can support it.
Today at Dreamforce, Google and Salesforce are announcing a strategic partnership to deliver four new, turnkey integrations between Google Analytics 360, Salesforce Sales Cloud and Salesforce Marketing Cloud:
- Sales data from Sales Cloud will be available in Analytics 360 for use in attribution, bid optimization and audience creation
- Data from Analytics 360 will be visible in the Marketing Cloud reporting UI for a more complete understanding of campaign performance
- Audiences created in Analytics 360 will be available in Marketing Cloud for activation via direct marketing channels, including email and SMS
- Customer interactions from Marketing Cloud will be available in Analytics 360 for use in creating audience lists
These new connections between our market-leading digital analytics solution and Salesforce’s market-leading customer relationship management (CRM) platform will change the game for how our clients understand and reach their customers — and how they measure the impact of their marketing. These integrations are fully consistent with our privacy policies and have settings that offer privacy controls and choice on how data is used.
By integrating your customer data, you can see a customer’s path from awareness all the way through to conversion and retention. And with connections to Google’s ad platforms and Salesforce’s marketing platform, you can quickly take action, engaging them at the right moment. You’ll see these new integrations begin to arrive in the first half of 2018.
Until now, businesses have not been able to connect offline interactions, such as an estimate provided by a call center rep or an order closed by a field sales rep, with insights on how customers use digital channels. With the connection between Sales Cloud and Analytics 360, soon you’ll be able to include offline conversions in your attribution modeling when using Google Attribution 360, so you’ll have a more complete view of ROI for each of your marketing channels and even more reason to move away from a last-click attribution method. This integration will also let you see how your most valuable customers engage with your digital properties, answering some important questions like, what are they looking for and are they actually finding what they need?
With the integration allowing data from Analytics 360 to be visible in Marketing Cloud, you’ll gain a more complete understanding of how your marketing campaigns perform. For example, if you send an email campaign to frequent shoppers to promote your fall fashion line, you’ll be able to see right in Marketing Cloud information such as how many pages people visited when they came to your site, the number of times people clicked on product details to learn more, and how many people added items to their shopping cart and converted.
Easy to take action
Today, Google Analytics allows you to create audience lists and goals that you can easily send to AdWords and DoubleClick for digital remarketing and to optimize bids. With the new connection from Sales Cloud to Analytics 360, in addition to unlocking new insights and more data for attribution modeling, you’ll be able to combine Salesforce data (such as sales milestones or conversions) with behavioral data from your digital properties to create richer audiences and for smarter bidding.
For example, if you’re a residential solar panel company and want to find new customers, you can create an audience in Analytics 360 of qualified leads from Sales Cloud and use AdWords or DoubleClick Bid Manager to reach people with similar characteristics. Or, create a goal in Analytics 360 based on leads marked as closed in Sales Cloud, and automatically send that goal to AdWords or DoubleClick Search to optimize your bidding and drive more conversions.
With the Analytics 360 connection to Marketing Cloud, you’ll be able to use customer insights to take action in marketing channels beyond Google’s ad platforms, such as email, SMS or push notification. For example, you can create an audience in Analytics 360 of customers who bought a TV on your site and came back later to browse for home theater accessories, and use that list in Salesforce to promote new speakers with a timely and relevant email.
Every day, Google Analytics processes hundreds of billions of customer moments, Salesforce Marketing Cloud sends 1.4 billion emails, and there are over 5 million leads and opportunities created in Salesforce Sales Cloud. These new integrations represent a powerful combination, and we believe they will help marketers take a big step closer to the ultimate dream: providing every customer with a highly relevant experience at each step of their journey.
You’ll see these new joint capabilities become available beginning in 2018, and we’ll be sure to keep you updated along the way. Contact us here if you would like to learn more about Analytics 360. We hope you’re as excited as we are!
Posted by Babak Pahlavan, Senior Director of Product Management, Measurement & Analytics
The Google Analytics 360 + Salesforce integrations are just one part of a broader strategic alliance announced today between Google and Salesforce. Read about new integrations between G Suite and Salesforce and a new partnership between Google Cloud and Salesforce here.