CBPO

Tag: Analytics

Moz Local Search Analytics and industry trends: Q&A with Moz’s Sarah Bird and Rob Bucci

October 7, 2019

Moz is known and loved by many in the SEO community not only for their tools, but also for the ways they’ve contributed to SEO education via their blog, Whiteboard Fridays, Search Ranking Factors study, and more.

We caught up with Moz’s Sarah Bird and Rob Bucci to learn about what they’ve been working on and trends they’re seeing in SEO. Sarah is CEO of Moz and has been at the company since joining as the eighth employee in 2007. She’s helped grow the company from a few hundred customers to now more than 37,000. Sarah holds a J.D. and previously worked as an attorney before getting into the startup space.

Image: Sarah Bird, CEO of Moz

Rob is VP of R&D at Moz. He was previously CEO of STAT Search Analytics, which he helped build from 2011 until it was acquired by Moz in October 2018.

Image: Rob Bucci, VP of R&D at Moz

Their company is headquartered in Seattle, where Sarah is based, and they also have a large office in Vancouver, where Rob is based.

In this conversation, we focus mostly on Moz’s interest in and work on local search, as well as better understanding queries the way that Google understands them.

SEW: Tell us about what you’ve been working on lately around local search?

Sarah: We’re really excited — we think this is the golden age of search. More people are searching than ever before, and they have more devices and opportunities to use when searching. That’s come also with changes at Google of not wanting to just be a portal or a gateway to websites, but to actually allow users to transact and interact right there on Google property. Google is more of a destination now and not just a gateway.

What we’ve noticed is that while we may have more searches than ever before, not all those searches are created equal. Some searches are simply not commercializable anymore for anyone but Google. But we think you still have some great opportunities, particularly in the local space.

Research coming out from Google, others, and our own internal research is really showing that local intent searches lead to a purchase much more quickly.

And it’s hyper-local. You can get one search result on one street corner, then walk four blocks and get a different search result on that corner. It means that more people can actually play the search game. There’s much more SEO opportunity in local.

A big theme at Moz right now is focusing on making local search more understood and easier to do for SEOs.

Rob: In today’s Google, there’s really, for the vast majority of queries, no such thing as a national SERP anymore. Everything is local. Google gets a lot of local signals, especially from mobile devices. And the mobile device doesn’t say “I’m searching from the U.S.,” it says “I’m searching from the corner of 5th avenue and Tucker Street.” Google takes that information and uses it to create a SERP that has all sorts of content relevant to that specific local area. 

We’ve been helping our users adapt to that reality by building out a set of functionality that we call Local Market Analytics. It allows users to get actual, on-the-ground reality that a searcher would see in the area where they’re searching.

Part of how we do that is by sampling within a given market. Let’s say a market is Toronto, San Francisco, or Seattle. Local Market Analytics would sample from several different zip codes within that market to pull out an average rank or average appearance on that SERP. So truly, this is the actual appearance in that market.
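
To make the sampling idea concrete, here is a rough sketch with made-up data (an illustration of the concept only, not Moz’s actual methodology): given the rank of a domain sampled from a few zip codes within one market, compute an average rank and an appearance rate for that market.

```python
from statistics import mean

# Hypothetical samples: the rank of one domain for one keyword, sampled from
# several zip codes within a single market (None = did not appear in that sample).
samples = {
    "98101": 3,
    "98105": 5,
    "98115": None,
    "98122": 4,
}

appearing = [rank for rank in samples.values() if rank is not None]
avg_rank = mean(appearing) if appearing else None
appearance_rate = len(appearing) / len(samples)

print(f"Average rank across sampled zip codes: {avg_rank}")        # 4
print(f"Appearance rate within the market: {appearance_rate:.0%}")  # 75%
```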

We have studies that have shown that even for sites that don’t have brick-and-mortar locations, their performance varies dramatically depending on where their searcher or their customer is searching from. 

We hope that this functionality better allows our users to adapt to this new reality and make sure they can have the right data to build the foundation of their strategies.

Moz Local vs Local Market Analytics

Sarah: We at Moz are dedicated to local search because we know it’s so commercializable and because we know there’s so much organic opportunity. Because it’s so hyper-local and focused, it opens up some really interesting ways of thinking about local search.

We’ve relaunched our Moz Local product. The new Moz Local allows you to do even more than the prior version. We’re enabling even more review management, which is super important for search right now, as well as more Google Posts and more nuanced GMB (Google My Business) management. Moz Local is separate from Local Market Analytics, and there’s a good reason for that.

With the new Moz Local, you really need to have a physical location in order for it to be valuable.

But Local Market Analytics doesn’t require you to have a physical location. It just requires that the kind of queries that you care about will vary by location.

Rob: For local SEOs, the spectrum of things that they care about is varied. On one hand, they’ll care about the appearance of their business’s local listings — the accuracy of that data, review management, and having the right distribution partners for those listings. Moz Local, especially the new version that we’ve launched, handles that side of the equation very well.

Where we believe the market has been traditionally underserved has to do with the performance of a website itself in organic search results. As those organic search results get increasingly hyper-local, we’ve found that local SEOs have been underserved with the quality of data they’ve had in order to build their local strategies.

Local Market Analytics seeks to solve that part of the problem: performance of their websites in hyper-local organic search.

What kind of feedback have you gotten about the tool so far?

Rob: There’s a ton of excitement. We talked about this at MozCon, and it really resonated with people: this idea that “Yes, I search from my phone all the time and see a lot of local results, even when I’m not looking for a local business, and I see my search change.” Or agencies that have customers in three different areas and they’re asking why the rankings they’re sending aren’t the same as what their clients are seeing, because they’re impacted by local.

I think a lot of people intuitively understand that this is where Google is. Google is by nature right now intensely hyper-local. So there’s a great hunger for this kind of data. Historically, people have thought they just couldn’t get it.

A lot of times people get accustomed to the idea that we can’t get what we need from Google — that the data just isn’t available. 

So when we’re able to show them that the data actually is available, and that we’ve built functionality around it, there’s a lot of excitement.

Local Search Volume: New functionality

We also rolled out our new Local Search Volume functionality. It’s a brand-new data point that people traditionally haven’t been able to get. 

Most products on the market can tell you “search volume in the U.S. is X and in Germany it’s Y.” That’s very broad — nationwide. But when we care about tracking the market of Toronto or San Fran or Memphis, we want to know what our search volume is in that city. People have traditionally thought that they couldn’t get that data, but we’ve now made it available, and we’re really excited about that.

Right now, we’re doing it on a city basis, and we’ve rolled it out to states. I don’t want to over-promise. I would love to have it be more specific, and that’s certainly something that we’re thinking about.

What’s going to be really cool is when we can get to a place where we help people understand demand per capita in their markets. 

Let’s take an example. We might think that Brooklyn is the epicenter of pizza. But when we actually look at New Hampshire and the number of searches there versus how many people are in that market, we might find that the demand per capita for pizza is greater in New Hampshire than in Brooklyn.
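
A back-of-the-envelope sketch of that comparison (the figures below are invented purely for illustration): demand per capita is simply searches divided by population.

```python
# Hypothetical figures, purely for illustration.
markets = {
    "Brooklyn":      {"monthly_pizza_searches": 260_000, "population": 2_600_000},
    "New Hampshire": {"monthly_pizza_searches": 180_000, "population": 1_400_000},
}

for name, market in markets.items():
    per_capita = market["monthly_pizza_searches"] / market["population"]
    print(f"{name}: {per_capita:.3f} pizza searches per person per month")

# With these made-up numbers, New Hampshire (0.129) out-indexes Brooklyn (0.100):
# higher demand per capita despite fewer total searches.
```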

Being able to show people if there’s a big untapped opportunity — I’m really looking forward to empowering that kind of analysis.

Sarah: This ties into what I alluded to before – we need to understand queries and types of search results like Google does. Search results vary dramatically nowadays, with all kinds of SERP features. All of this impacts whether there’s a click at all, and certainly the clickthrough rate.

We are doing a bunch of R&D right now to make sure that we can help our audience of SEOs understand queries like Google, and also understand what a search result might look like for a kind of query, and what impact that could have on CTR. This stuff is more in the R&D territory. Local Search Volume is part of that interest and investment on our part.

When it comes to the distribution of clicks between organic, paid, and no-click searches, some people see the rise of paid and no-click searches as disheartening. You sound optimistic. What’s your response to those trends?

Sarah: Absolutely, for some portion of the searches happening, if you’re not Google you can’t take advantage of them. The value stays with Google — that is absolutely true. But the overall number of searches continues to rise — that’s also a trend.

And I believe very strongly that just because there isn’t a click doesn’t mean there isn’t some value created. 

We have these old ways of thinking about whether or not you’re successful in SEO. Those ways are deeply entrenched, but we need to let go of them a bit. Traffic to your website is no longer an accurate measure of the value you’re getting from search. It might be a minimum — that’s at least the value you’re getting. But it’s nowhere near the maximum. 

I think that brand marketers, who come from different disciplines, have always known that visibility — how you show up and how compelling it is — matters, even if you can’t measure it the way you can with old-school SEO or PPC.

There’s a danger in equating an increase in no-click searches with a decrease in the value of SEO. 

We should shift our attention to not just “am I showing up” and “am I getting traffic,” but “how am I showing up in search results?” 

What does it look like when someone lands on your search result? Are they getting a phone number? Are they getting what they want, the answer they need? Is your search result compelling? 

That’s part of what’s driving our interest in thinking more holistically about what a search result looks like and feels like, and how users interact with it. We want to know more about how you’re showing up and how Google thinks about queries.

Those two concepts: How does Google understand queries, and what does a search result look like, feel like, and how does the searcher experience it — those are related.

Rob: There’s still a ton of value out there, especially just for building a sense of credibility and brand authority. 

We live in a world, right now at least, where we’ll continue to see Google chipping away at these opportunities. They’re a business and they’re trying to maximize shareholder value. They have a natural inclination to grab as much as they can. 

We shouldn’t get despondent because of that. There’s still a lot of value there. Even with no-click SEO, you can still deliver a lot of brand authority. 

What are other trends that SEOs should be paying attention to?

Rob: One of the other areas we’re thinking about is how do we better help our customers think about queries in the same way Google thinks about queries? 

Google goes a lot deeper than just understanding which words mean what. They look at the intent of the searcher — what are they trying to solve? We’re really interested in helping people think about queries in that way.

We have some really interesting R&D work right now around intent and understanding what Google thinks an intent is. How can our users use that information to adapt their content strategies? That’s an area that’s really ripe and that people in the industry should be paying attention to. It’s not going anywhere. I’m really excited about that.

How do you go about understanding how Google understands intent?

Rob: Without getting too deep into it, there’s a number of ways that one could do it. One might be inclined to look at the NLP (natural language processing) approach — what might these words mean when used together and what might they say about the state of mind of the searcher? That’s a viable approach rooted in NLP and ML (machine learning).

Another approach might be to look at the SERP itself. Google has already decided what the intent is, so I can treat what Google has chosen to show as a signal of that intent. Both of these are approaches one might use.
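
As a toy illustration of that second approach (my own sketch, not Moz’s or Google’s actual logic), you can treat the features Google chooses to show on a SERP as votes on the underlying intent. The feature names and mapping below are hypothetical.

```python
# Toy heuristic: infer a likely intent from which SERP features Google shows.
FEATURE_INTENT = {
    "shopping_results": "transactional",
    "local_pack": "local",
    "featured_snippet": "informational",
    "people_also_ask": "informational",
    "sitelinks": "navigational",
}

def infer_intent(serp_features):
    votes = {}
    for feature in serp_features:
        intent = FEATURE_INTENT.get(feature)
        if intent:
            votes[intent] = votes.get(intent, 0) + 1
    # Fall back to "unknown" when no recognizable features are present.
    return max(votes, key=votes.get) if votes else "unknown"

print(infer_intent(["featured_snippet", "people_also_ask"]))  # informational
print(infer_intent(["shopping_results"]))                     # transactional
```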

SEO is an ever-changing industry. What skills should people be focused on developing or learning about in the next few months?

Sarah: From a skills perspective, this is what I’ve always loved about SEO and what makes it challenging to be great at, but something that’s critical nonetheless — it’s a great blend of art and science. 

You have to be technical, but you also have to be able to put your mind into the user. Or rather, you have to be able to think about what Google will think about what the user thinks. 

What could the ultimate user be trying to accomplish, and how will Google follow that? 

You also have to have a strong technical foundation, so you know how to go out and execute. But those aren’t necessarily new skills.

Rob: I think people always look for what’s new, but sometimes we overlook the basic fundamentals which never go out of style. It’s about reaffirming what’s really important. 

There are two basic skills I think all SEOs need:

  1. You need to be able to interpret data. You need to be able to look at a bunch of disparate data points and weave them together into a narrative. What is it telling you? In doing that, people need to get really good at overcoming their own self-serving biases about interpreting data in a way that’s convenient or how they think the world should line up. The ability to interpret data is critical to an SEO who’s going to succeed at finding new opportunities that no one else has spotted.
  2. Understanding how to talk to people in a way that will get them to do what you want them to do. That really comes down to understanding how your content should be optimized and what you should be saying on your pages. What problem are you trying to solve for them and how are you trying to solve it?

Those are good fundamental skills I think people should continue to focus on, rather than thinking about, “I need to learn Python.” That’s a lot of distraction and it’s very specialized. 

Learning Python or R might seem sexy because technical SEO is having a renaissance right now. But at the end of the day, it’s not a basic skill you need to succeed in SEO.

SEO is a broad career and discipline. If you find yourself in a role that requires you to know that stuff, great. But I wouldn’t make that sweeping advice to the entire SEO industry because I think it’s a bit of a distraction.

Thanks so much to Sarah and Rob for talking with us!

P.S. — They’re running a pilot program for their Local Market Analytics tool. It’s invite-only, but anyone can register interest to be selected. They’re quite excited about it and would love feedback from the industry.

The post Moz Local Search Analytics and industry trends: Q&A with Moz’s Sarah Bird and Rob Bucci appeared first on Search Engine Watch.



Google Website Optimizer Moves to Google Analytics – Experiments under Content Section

September 17, 2019

Google Analytics and Google Website Optimizer have merged. Google Website Optimizer, a free A/B and multivariate testing tool, is now available in Google Analytics via the Experiments link under the Content section (see image below).

You can create and manage all your tests within Google Analytics without going to the Google Website Optimizer site.
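
For teams that prefer to script this, experiments created in Google Analytics can also be read through the Analytics Management API. A minimal sketch, assuming the google-api-python-client library, OAuth credentials you have already obtained, and placeholder account, property, and view IDs:

```python
from googleapiclient.discovery import build

# Assumes `credentials` is an authorized OAuth2 credentials object you have
# already created; the account/property/view IDs below are placeholders.
analytics = build("analytics", "v3", credentials=credentials)

experiments = analytics.management().experiments().list(
    accountId="12345",
    webPropertyId="UA-12345-1",
    profileId="67890",
).execute()

for experiment in experiments.get("items", []):
    print(experiment["name"], experiment["status"])
```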

Functionality Difference between Experiments and Google Website Optimizer

  1. Easy Implementation – Since you already have Google Analytics on your site, you only need to add one script to the original version of your page; the rest of the work is done by the standard Google Analytics script.
  2. No Multivariate Testing Anymore – There is no option to run multivariate tests (MVT); Experiments only supports A/B testing.

 

The last day you’ll be able to access Google Website Optimizer will be August 1, 2012.

We will add more posts as we uncover new functionality in Experiments.



Google Analytics Premium


How to View the Visitor Flow of Specific Pages in Google Analytics

August 24, 2019

Open the Visitor Flow report under the “Audience” menu in “Standard Reports” (see below). This view will give you the visitor flow for the complete site.

Left click on the page that you want the visitor flow for and select “Explore traffic through here” from the menu.

Now you have the visitor flow for that page (see below).



Google Analytics Premium


How to View Click Map of a Page in Google Analytics

August 20, 2019

Click maps are a great way to visually see where visitors are clicking on a page. Google Analytics provides a click map under Content –> In-Page Analytics (see below).

Click on In-Page Analytics and the first page you will see is your site’s home page. To view a different page, either click a link on the home page (still within In-Page Analytics) or select a page from the drop-down available at the top left (see below).

Other options

  • Select clicks, goal value, or a specific goal to display on the click map, as well as a threshold, from the drop-down next to it.
  • Show Bubbles – shows the orange bubbles and numbers you see in the report. This is the default view when you first land on the page.
  • Show Colors – this option adds another visual layer to the click map; it is a sort of heat map of clicks.
  • Browser Size – this option shows what percentage of your visitors see the area displayed in your report. Choosing it gives you a slider that lets you pick a percentage of visitors; as you adjust it, the page resizes to show the area of the page that percentage of visitors can see.



Google Analytics Premium


How to Filter out Bots and Spiders from Google Analytics

August 13, 2019

A common misconception is that Google Analytics, or any other JavaScript-based web analytics solution, filters out spiders and bots automatically. This was true until a few years ago, because most spiders and bots could not execute JavaScript and hence were never captured by JavaScript-based web analytics solutions. As shown in “4 reasons why your bounce rate might be wrong,” these days bots and spiders can execute JavaScript and therefore show up in your web analytics reports.

Google Analytics has released a new feature that lets you filter out known spiders and bots. Here are a few things to keep in mind:

  1. The filter only applies from the day you enable this setting. It won’t be applied to data that has already been processed.
  2. Since this will filter out bots, you might notice a drop in your visits, page views, etc.

 

Here are the steps to filter out Spiders and Bots

  1. Go to the Admin section of your Google Analytics report.
  2. Click the “View” section and choose the right report view.
  3. Click on “View Settings” (see image 1 below).
  4. Check the box under “Bot Filtering” that says “Exclude all hits from known bots and spiders” (see image 2 below).
  5. Click the “Save” button at the bottom and you are done.

 

Image 1

Image 2
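
The steps above flip the setting for a single view. If you manage many views, the same setting can, in principle, be changed programmatically. A hedged sketch, assuming the google-api-python-client library, authorized credentials, placeholder IDs, and that the view (profile) resource’s botFilteringEnabled field is writable in your setup:

```python
from googleapiclient.discovery import build

# Assumes `credentials` is an authorized OAuth2 credentials object; the
# account/property/view IDs below are placeholders for your own.
analytics = build("analytics", "v3", credentials=credentials)

analytics.management().profiles().patch(
    accountId="12345",
    webPropertyId="UA-12345-1",
    profileId="67890",
    body={"botFilteringEnabled": True},  # same setting as the checkbox above
).execute()
```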


Google Analytics Premium


6 Reasons Why Your Google Analytics Reports Might Be Wrong

August 7, 2019


  1. Missing Tags – This is the most common cause of wrong data. It generally happens when new pages are added or existing pages are redesigned or recoded and the developer forgets to include the tags. Make sure all of your pages are tagged with the Google Analytics code. You can use a tool like GAChecker to verify whether the Google Analytics tags are missing on any pages of your site (a minimal do-it-yourself check is sketched after this list).
  2. Mistagged Pages – Incorrect implementation or double tagging leads to wrong data in Google Analytics. Double tagging results in inflated page views and a low bounce rate. If your bounce rate is lower than 20%, that’s the first thing you should check.
  3. Location of GA Tags – Placing the tag towards the bottom of the page can result in no data, particularly for users with slow connections or pages that are slow to load. This happens when a user tries to load a page and clicks on another link before the first page has loaded; since the Google Analytics tag is towards the bottom of the page, it might not get a chance to execute. To avoid this issue, put your Google Analytics JavaScript in the <head> section of the page.
  4. Incorrect Filters – Wrong filters can mess up the data and distort the view. Always create an unfiltered view so that you have correct data to fall back on.
  5. Tags Not Firing Properly – This can happen when your page(s) have JavaScript errors. A JavaScript error on any part of the page can result in an error in the Google Analytics code. Verify the JavaScript on your site to make sure there are no errors.
  6. Sampling – Sampling happens on highly trafficked sites. Sampling in Google Analytics is the practice of selecting a subset of your traffic data and reporting on the trends in that sample. For most purposes this is a non-issue; however, it can be a concern for e-commerce sites, where sampling can (and will) result in wrong sales figures. You can get more information about GA sampling in “How Sampling Works.”
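
For the first item on the list, here is a minimal do-it-yourself check in the spirit of GAChecker (my own sketch, not the GAChecker tool): fetch a list of URLs and flag any page whose HTML does not contain a recognizable Google Analytics snippet. The URLs are placeholders.

```python
import re
import requests

# Pages to audit; replace with your own URLs (or a crawl of your sitemap).
URLS = [
    "https://www.example.com/",
    "https://www.example.com/about/",
]

# Rough signatures of common GA snippets (analytics.js, gtag.js, legacy ga.js).
GA_PATTERN = re.compile(
    r"google-analytics\.com/analytics\.js"
    r"|googletagmanager\.com/gtag/js"
    r"|google-analytics\.com/ga\.js"
)

for url in URLS:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException as err:
        print(f"{url}: could not fetch ({err})")
        continue
    status = "tagged" if GA_PATTERN.search(html) else "MISSING Google Analytics tag"
    print(f"{url}: {status}")
```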


Google Analytics Premium



SQL for Marketing Analysis – All Google Analytics Analysts Should Know

July 23, 2019

Marketers and marketing analysts generally depend on tools or the IT department to help them pull data for marketing purposes. There comes a time when they can’t just wait around for IT to help them with data pulls and manipulations. They have to know how to do it on their own. This course is for marketers who would like to know how to use SQL to conduct their marketing analysis.

The course uses MySQL to show how SQL works, but all the learnings and syntax apply to other databases as well. Sign up for SQL for Marketers and Marketing Analysts.
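
As a taste of the kind of pull the course teaches, here is a small, self-contained example with a hypothetical table and data, written against SQLite so it runs anywhere (the SQL itself carries over to MySQL with little or no change): sessions and conversion rate by marketing channel.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sessions (channel TEXT, visits INTEGER, conversions INTEGER);
INSERT INTO sessions VALUES
  ('organic', 1200, 48),
  ('paid',     800, 40),
  ('email',    300, 21);
""")

query = """
SELECT channel,
       SUM(visits)                                    AS total_visits,
       SUM(conversions)                               AS total_conversions,
       ROUND(1.0 * SUM(conversions) / SUM(visits), 4) AS conversion_rate
FROM sessions
GROUP BY channel
ORDER BY conversion_rate DESC;
"""

for row in conn.execute(query):
    print(row)
```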



Google Analytics Premium


Software development analytics platform Sourced launches an enterprise edition

July 2, 2019

Sourced, or source{d}, as the company styles its name, provides developers and IT departments with deeper analytics into their software development lifecycle. It analyzes codebases, offers data about which APIs are being used and provides general information about developer productivity and other metrics. Today, Sourced is officially launching its Enterprise Edition, which gives IT departments and executives a number of advanced tools for managing their software portfolios and the processes they use to create them.

“Sourced enables large engineering organizations to better monitor, measure and manage their IT initiatives by providing a platform that empowers IT leaders with actionable data,” said the company’s CEO Eiso Kant. “The release of Sourced Enterprise is a major milestone towards proper engineering observability of the entire software development life cycle in enterprises.”


Since these are hallmarks of every good enterprise tool, it’s no surprise that Sourced Enterprise also offers features like role-based access control and other security features, as well as dedicated support and SLAs. IT departments can also run the service on-premises or use it as a SaaS product.

The company also tells me that the enterprise version can handle larger codebases, so even complex queries over a large dataset take only a few seconds (or minutes if it’s a really large codebase). The enterprise edition includes a number of add-ons for building these advanced queries. “These are available upon request and tailored to help enterprises overcome specific challenges that often rely on machine learning capabilities, such as identity matching or code duplication analysis,” the company says.


The service integrates with most commonly used project management and business intelligence tools, but it also ships with Apache Superset, an open-source business intelligence application that offers built-in data visualization capabilities.

These visualization capabilities are also now part of the Sourced Community Edition, which is now available in private beta.

“Sourced Enterprise gave us valuable insights into the Cloud Foundry codebase evolution, development patterns, trends, and dependencies, all presented in easy-to-digest dashboards,” said Chip Childers, the CTO of the open-source Cloud Foundry Foundation, which tested the Enterprise Edition ahead of its launch. “If you really want to understand what’s going on in your codebase and engineering department, Sourced is the way to go.”

To date, the company has raised $10 million from Frst VC, Heartcore Capital, Xavier Niel and others.



Enterprise – TechCrunch


Sisense acquires Periscope Data to build integrated data science and analytics solution

May 14, 2019

Sisense announced today that it has acquired Periscope Data to create what it is calling a complete data science and analytics platform for customers. The companies did not disclose the purchase price.

The two companies’ CEOs met about 18 months ago at a conference, and running similar kinds of companies, hit it off. They began talking and, after a time, realized it might make sense to combine the two startups because each one was attacking the data problem from a different angle.

Sisense, which has raised $174 million, tends to serve business intelligence requirements either for internal use or externally with customers. Periscope, which has raised more than $34 million, looks at the data science end of the business.

Both CEOs say they could have eventually built these capabilities into their respective platforms, but after meeting they decided to bring the two companies together instead, and they made a deal.

Harry Glasser from Periscope Data and Amir Orad of Sisense.

“I realized over the last 18 months [as we spoke] that we’re actually building leadership positions into two unique areas of the market that will slowly become one as industries and technologies evolve,” Sisense CEO Amir Orad told TechCrunch.

Periscope CEO Harry Glasser says that as his company built a company around advanced analytics and predictive modeling, he saw a growing opportunity around operationalizing these insights across an organization, something he could do much more quickly in combination with Sisense.

“[We have been] pulled into this broader business intelligence conversation, and it has put us in a place where as we do this merger, we are able to instantly leapfrog the three years it would have taken us to deliver that to our customers, and deliver operationalized insights on integration day on day one,” Glasser explained.

The two executives say this is part of a larger trend about companies becoming more data-driven, a phrase that seems trite by now, but as a recent Harvard Business School study found, it’s still a big challenge for companies to achieve.

Orad says that you can debate the pace of change, but that overall, companies are going to operate better when they use data to drive decisions. “I think it’s an interesting intellectual debate, but the direction is one direction. People who deploy this technology will provide better care, better service, hire better, promote employees and grow them better, have better marketing, better sales and be more cost effective,” he said.

Orad and Glasser recognize that many acquisitions don’t succeed, but they believe they are bringing together two like-minded companies that will have a combined ARR of $100 million and 700 employees.

“That’s the icing on the cake, knowing that the cultures are so compatible, knowing that they work so well together, but it starts from a conviction that this advanced analytics can be operationalized throughout enterprises and [with] their customers. This is going to drive transformation inside our customers that’s really great for them and turns them into data-driven companies,” Glasser said.


Enterprise – TechCrunch