- A failing SEO strategy can happen to the best of us
- No doubt it’s disheartening when your competitors are miles ahead and your business is struggling to bring in new leads
- Founder of LSEO and best-selling author, Kristopher (Kris) Jones provides comprehensive steps and advice on how you can salvage your SEO performance
Dumpster fires: surely they can’t happen to you. Right? But before you know it, your website’s traffic has tanked, your competitors are getting all the organic love, and you couldn’t get a conversion if your life depended on it. Folks, if your SEO performance sounds like that, you might just have a dumpster fire on your hands.
A failing SEO strategy can happen to the best of us. There’s no doubt it’s disheartening when your competitors are all miles ahead of you and your business isn’t bringing in new leads.
The good news is that it’s never too late to turn things around.
When is the best time to plant a tree? 20 years ago.
When’s the second-best time? Right now, so let’s get to it.
Here’s how to salvage your dumpster fire of an SEO strategy.
1. Review and optimize all your current content
I’m going to talk about content a few times in this post.
That’s because content has long been and remains the most important element to focus on in your overall SEO strategy.
Websites are nothing without content.
You can see a website getting by with no meta descriptions, or without optimized images, but without content, what do you have?
Not a website!
But if you’re focusing on content first to turn around your SEO strategy, where do you start?
You start by optimizing everything you already have.
You don’t want to get ahead of yourself by constantly creating new content when you have a whole slew of old pages and posts that may have fallen into SEO disrepair.
Google treats optimized content the same as new content, so to start out, you’ll want to audit your existing content to see what’s good, what’s bad, and what you can fix up to be good again.
You can use a content audit tool like the one in Semrush, or, if you have a more manageable load of content to work with, checking things out manually works well, too.
This is about more than just deciding what content you like or do not like, although you should be able to tell at a glance which topics are still relevant to your website.
But to check out the SEO performance of each page and post, you can use Semrush as I said, or go manual with Google Search Console.
What I like to do is to put each URL into Search Console and check out how it’s doing as far as impressions versus clicks, click-through rate, and the average positions of its ranking keywords.
That gives me a decent snapshot of which pages need attention.
A page with 10,000 impressions in a 30-day period but only 100 clicks will have a CTR of only one percent (not too great).
I would then go to that page to figure out what is causing the low CTR.
The page is obviously being ranked for the keyword, given its high impressions, but if few people are clicking, then maybe the page isn’t as relevant for the term as it once was.
If that’s the case, then optimizing the page for SEO could be a matter of creating new sections of content around that keyword, and certainly retooling what’s there already.
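For a quick way to spot those pages at a glance, the impressions-versus-clicks math can be scripted. This is just an illustrative sketch with made-up numbers, not a real Search Console export:

```python
# Hypothetical Search Console-style numbers, not real export data.
pages = {
    "/blog/old-post": {"impressions": 10_000, "clicks": 100},
    "/services": {"impressions": 4_000, "clicks": 220},
}

def ctr(clicks, impressions):
    """Click-through rate as a percentage."""
    return 100.0 * clicks / impressions if impressions else 0.0

for url, stats in pages.items():
    rate = ctr(stats["clicks"], stats["impressions"])
    flag = "needs attention" if rate < 2.0 else "ok"
    print(f"{url}: {rate:.1f}% CTR ({flag})")
```

The 2% threshold here is arbitrary; pick whatever baseline makes sense for your vertical.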
Optimizing your website’s content is a major part of improving your SEO strategy because it involves so many things that are going to help you.
For this first point, I focused only on the writing and editing part of the content optimization.
Let’s now move on to some other parts of an SEO strategy where you could update things (things that could nonetheless still be involved in content optimization).
2. Assess and update all meta tags
Your pages’ meta tags play an important role in your website’s overall SEO health.
Meta tags are also one of the easiest things to let slip by as you work on your website, because they’re so brief and simple, and there are so many of them.
The thing is, meta tags can go out of date as the landscape shifts around your industry and the keywords for which you were optimizing are no longer relevant.
Meta tags are a classic example of why you can’t set it and forget it with SEO.
Meta tags are another element to look at as you go through your content pages to improve their CTR.
Sure, a lot of your content itself could use updates, but retool the meta titles and descriptions, as well.
Remember, the meta information is what organic users see as they scroll a SERP.
If your title and description aren’t interesting or urgent enough to draw in audiences that are in the awareness stage, then those people will keep on scrolling.
Redoing meta tags could include using a new target keyword, rewriting the call to action, or making everything more concise.
Maybe start with a handful of pages only, say 20 or 30, and A/B test the old and new titles and descriptions to see how traffic and CTR change after your edits.
Doing that will confirm for you whether the updating you’re doing is worth it, and whether you should continue down this road with the rest of your pages.
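If you want a rough read on whether a title test actually moved the needle, rather than just reflecting normal fluctuation, a simple two-proportion z-test does the job. The click counts below are hypothetical:

```python
import math

def ctr_z_score(clicks_a, impr_a, clicks_b, impr_b):
    """Two-proportion z-test: did variant B's CTR really differ from A's?"""
    p_a = clicks_a / impr_a
    p_b = clicks_b / impr_b
    pooled = (clicks_a + clicks_b) / (impr_a + impr_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / impr_a + 1 / impr_b))
    return (p_b - p_a) / se

# Old title: 100 clicks on 10,000 impressions; new title: 150 on 10,000.
z = ctr_z_score(100, 10_000, 150, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 roughly means significant at the 95% level
```

This won't replace a proper testing tool, but it keeps you from reading too much into a handful of extra clicks.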
3. Work on your technical performance
When you have to turn around your entire SEO strategy, you have to think about your website holistically.
That means focusing not just on your keywords and content, but also on how your pages perform technically.
I’m grouping issues such as image compression, site speed, mobile responsiveness, and Core Web Vitals all together under the umbrella of “technical performance.”
Although these factors are less “creative” and open-ended than new keyword research or content optimization, they matter just as much.
When people get to your website and are greeted with slow pages, a messy mobile appearance, and content elements that jump around as they load, their trust in you drops.
In a world as competitive as ours, you can’t afford to give people cause for distrust, because you can bet that there are a hundred competitors waiting in line to market to those customers if you can’t do so successfully.
If development work isn’t your forte, look into contracting out to someone who can clean up your website’s coding and otherwise speed things up while also optimizing for mobile.
Images should be compressed so they take up less space but don’t lose any of their quality, and each image should have optimized alt text.
Compressing and optimizing images is something you can definitely do yourself, either through a plugin (on WordPress) or manually if it’s feasible.
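The alt-text side of this is also easy to audit yourself. Here’s a minimal stdlib-only sketch that flags img tags with no alt attribute at all; the HTML snippet is just an example:

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing.append(dict(attrs).get("src", "(no src)"))

auditor = AltTextAuditor()
auditor.feed('<img src="hero.jpg" alt="Team at work"><img src="logo.png">')
print("Images missing alt text:", auditor.missing)  # ['logo.png']
```

Run it over your rendered page HTML and you get a to-do list of images to describe.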
Even though page speed and load times aren’t always the most accessible kind of work to business owners and website owners, those are important issues to keep in mind as you labor toward turning around your dumpster fire of an SEO strategy.
4. Resume creating new content
You can turn around even the worst SEO strategy in the world.
Google isn’t going to hold your feet to the fire forever just because your SEO has been in the dumps, even for the last few years.
Google crawls your site every so often whether you’re doing something with it or not, and as it sees that your SEO is improving, it can start to rank some of those pages higher.
So here is where we get into creating all-new, high-quality content.
Content in 2023 can mean a whole range of things, from blog posts to infographics to videos and podcasts and webinars and slide decks.
Whatever makes sense for your business and your industry is what you should do. Whatever types of media you know your audiences like to consume, give that to them.
In 2023, however, you have to be incredibly mindful of being comprehensive and useful for people.
If there’s anything that we’ve learned from 2022’s helpful content update, it’s that you just cannot skimp on content creation (not that you ever could, but Google is smarter than it was 10 years ago).
Gone are the days of skirting by on SEO-centric content, created just to score some ranking for this or that keyword.
Google is paying much more attention now to the intent and usefulness of a piece, and rewarding those web pages featuring actually helpful content (get it?) with higher rankings.
A perfect example of how Google is thinking these days is the product review update, also from 2022.
Google is now deprioritizing the ranking of low-quality product reviews in favor of more expert-level reviews where the reviewer has actually used the product or service and can speak to its pros and cons.
Why? Because Google wants to direct users to content they can actually trust to help them.
When you take the product reviews update and helpful content update together, you can see why content marketing has gotten so much harder over the years.
You can’t just rank after spending an hour on a 400-word blog post anymore.
You have to be a real expert, or at least put in the time and effort to create deep content if you’re working across a client portfolio.
These are all things you must keep in mind as you create new content for your website in the name of putting out your dumpster fire of an SEO strategy.
Now, of course, there are the nuts and bolts you have to remember, as well, when it comes to new content.
You have to mine the SERPs, develop the proper keyword strategy, and understand the correct intent behind those keywords to be sure you’re creating what people expect to see when they search that keyword.
That stuff you can all learn.
What I want you to take from this section is the idea that you have to work to create that new content. You have to put in that time and dedication to do it well.
5. If you’re local, focus on reviews
I don’t want to leave out the local businesses here: if you’re a local business, did you know that one of the largest factors in helping your SEO is getting positive Google reviews?
Now, local businesses need to perform all the on-page SEO work that anyone else does, but what do you do as an ongoing SEO strategy?
The play here isn’t keyword-driven SEO content so much, because your local audience isn’t really going to find you that way.
Local audiences find local businesses by performing local searches and checking out the reviews in the map pack.
In fact, 77 percent of local buyers always read online reviews while checking out local businesses.
Your reviews affect the level of trust the public has in you. More people are likely to visit your website and use your business when they see that others have had a positive experience with you.
The cycle goes on when you encourage your customers to leave positive Google reviews.
The more reviews you have, and the more positive they are, the better off your chances will be of rising to the top of your local map pack.
Being at the top should translate into more traffic and better SEO overall.
6. Build natural backlinks
Finally, I want to mention another pillar of Google’s list of known ranking factors: natural backlinks.
Links are what tie everything on the internet together.
They’re also vital in keeping the ranking juices flowing to your web pages when it comes to your SEO strategy.
Backlinks to your website from other websites show Google that you’re an authority in your market niche since people want to reference what you have to say.
Link building, then, is really about building relationships to get your name out there as a trustworthy resource for others.
When Google sees your links coming from relevant, authoritative websites, it will assign more trust to your own site.
Just remember to keep the links coming from websites that make sense for your own.
The quality matters much more than the quantity here.
To do it, create content that people would want to link to, something with a lot of useful stats and other data.
You can also scout other websites in your niche to see where they may have content gaps, and then create content to fill that gap and ask for a link back.
It takes time and effort, and you’re not guaranteed anything, but it’s the natural way to earn backlinks that will actually help your SEO.
Give your SEO time to turn around
You can put out even the biggest dumpster fire when you know what to do and how to do it.
I’ll say again that SEO dumpster fires can happen to the best of us. Sometimes we go all-in on things we think will work, and they don’t.
Sometimes we get lazy and let our SEO go for years.
But it’s never too late to correct things.
It will definitely take time to see things start to shift for you, though; SEO isn’t an overnight solution. It needs anywhere from three to six months or longer to start showing a difference.
If you keep in mind both the broad strokes and the specifics of everything I’ve described here, you truly can reinvent your SEO strategy and be on your way to business growth.
Kris Jones is the founder and former CEO of digital marketing and affiliate network Pepperjam, which he sold to eBay Enterprises in 2009. Most recently Kris founded SEO services and software company LSEO.com and has previously invested in numerous successful technology companies. Kris is an experienced public speaker and is the author of one of the best-selling SEO books of all time called, ‘Search-Engine Optimization – Your Visual Blueprint to Effective Internet Marketing’, which has sold nearly 100,000 copies.
The post Is your SEO performance a dumpster fire? Here’s how to salvage it appeared first on Search Engine Watch.
There are many factors that can have a negative impact on paid search conversion performance, including prices, shipping costs, and competition.
Read more at PPCHero.com
To help address these challenges, we’re introducing Server-Side Tagging to Google Tag Manager and Tag Manager 360. You’ll now be able to move many third-party tags off your site and into a new server container hosted in your Google Cloud account. That means when customers interact with a page on your site, third-party tags are loaded directly in the server container rather than the site. This provides you with faster page load times, greater security for your customer data, and additional data controls.
Deliver faster site experiences to your customers
When you move third-party tags off your site, fewer tags must load when your customers visit – leading to faster page load times. A recent research study showed that a decrease in page load times for mobile sites improved progression rates for every step of the purchase funnel for all brands surveyed. In fact, for retail sites, every 0.1 second reduction in mobile site speed increases average order value by nearly 10 percent on average.
Consider an ecommerce retailer that works with many technology partners to execute marketing campaigns and measure customer behavior. Whenever this retailer wants to work with a new partner, for example to run email marketing campaigns, it needs to add a new third-party tag to its site to measure success. Instead of doing that, the retailer can now place the new tag into its server container in Tag Manager. And when a customer loads the retailer’s site, this tag will run in the server container after the page loads. This allows businesses to measure the success of their campaign without impacting the customer experience.
Secure your customer data
When customers engage with your business online, they share information with you. You want to ensure that information is safe and only authorized partners are able to access it.
When third-party tags are implemented directly on your site, these tags are able to access and interact with other information customers are entering into your site. With Server-Side Tagging, you place third-party tags in a secure server container in your Google Cloud project. This means tags in your server container only have access to information sent to the server and no longer have access to the information entered on your site. And because these tags are placed into your server container, you gain visibility into what data the tags are collecting and where that information is being sent.
Control the behavior of third-party tags
Each tag that you add to your server container will have to declare how it will behave, for example which cookies can be accessed or where data can be sent. And you can also set policies to automatically control what tags are allowed to do. This helps you ensure that any new tags added to your container follow the same permissions so you do not need to continuously check tag behaviors in the future.
Get started with Server-Side Tagging
Server-Side Tagging is now available to all Tag Manager and Tag Manager 360 accounts. When you log into your Tag Manager account, you can create a new server container and connect it with a new or existing Cloud account. You can learn more about setting up Server-Side Tagging for your business with this guide. And if you don’t have a Tag Manager account, you can create one for free.
- The research shows most agencies failed when it comes to the performance of their website.
- Search engine ranking is a multi-factor game, and performance, while it matters for many reasons, is just one piece in this puzzle.
- Nebojsa Radakovic shares insights.
Ever since Google announced that page speed would be a ranking factor in its mobile-first index in 2018, the need for speed has become one of the most important traits of web development. A lot of businesses jumped onto the speed train.
Sure enough, one year later, Google reported that sites are faster, and abandonment rates are down since making page speed a ranking factor.
With performance being one of the top selling points of Jamstack, the modern-day web dev architecture that we are so into, it was only natural to take a deep dive into the industries that tackle website performance and see how we stand against our peers.
TL;DR: Key findings
Don’t have the time to read through the research? Here are the key findings:
- 27% of websites from our 20K sample still run on HTTP
- 65.7% of the websites are built with WordPress
- Only 2.7% of websites have good performance scores
- 2.9% of websites provide a good user experience to their users, i.e. Largest Contentful Paint (LCP) occurs within 2.5 seconds of when the page first starts loading
What data was I interested in, and why?
Lighthouse performance metrics. There are a couple of popular speed testing tools, but most people use Lighthouse. While it may not be perfect because it provides a mix of both lab and field data about a page, I’ve used the PageSpeed Insights API as described in James McNulty’s UpBuild post, updated to show Core Web Vitals.
CMS. WordPress or not. 37% of all websites are powered by WordPress. Being the most popular web dev solution, it would be interesting to see and compare different solutions in terms of speed and performance.
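For anyone who wants to reproduce this kind of check, here’s a hedged sketch of querying the PageSpeed Insights v5 API referenced above. The endpoint and response fields follow Google’s published v5 schema; for real volume you’d add an API key parameter:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(target, strategy="mobile"):
    """Build the request URL for a single page test."""
    return PSI_ENDPOINT + "?" + urlencode({"url": target, "strategy": strategy})

def performance_score(target):
    """Fetch the 0-100 Lighthouse performance score for a URL."""
    with urlopen(psi_request_url(target)) as resp:
        data = json.load(resp)
    # Lighthouse reports category scores as 0-1; scale to 0-100.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

# performance_score("https://example.com")  # requires network access
```

Loop that over a URL list and you have the raw material for exactly this kind of study.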
Where did I get my URLs from?
Gathering URLs is time-consuming work, but I managed to get 20K URLs (20,397 to be exact). I cross-referenced results from scraping the first-page organic results for a set of keywords (like “SEO agency”, “web dev agency”, etc.), results from using tools such as Phantombuster to scrape review websites, and results from hiring virtual assistants on Upwork and Fiverr.
There were a couple of issues I had to take care of first. Amazingly, 27% of websites from my 20K sample still run on HTTP. That’s not good at all. On top of that, I had a bunch of URLs coming up with a NET::ERR_CERT_DATE_INVALID error message in Chrome. Once those were taken care of, I ended up with results for 13,945 URLs instead of 20K.
Of course, the most popular CMS is WordPress, with 65.7% of websites from my sample using it. For 18.8%, I was not able to detect any CMS. 2.58% run on Squarespace, 1.6% are built with Drupal, 1.41% are on Wix, and so on.
The results should not come as a surprise given that WordPress powers 37% of all the websites on the Internet or 63.6% of all the websites with known CMS.
Performance scores – How scores are color-coded by Google
The metrics scores and the perf score are colored according to these ranges:
- 0 to 49 (Red): Poor
- 50 to 89 (Orange): Needs Improvement
- 90 to 100 (Green): Good
You can read more about it here.
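Those bands are simple to encode if you’re bucketing scores yourself:

```python
def score_band(score):
    """Map a 0-100 Lighthouse score to Google's color band."""
    if score >= 90:
        return "Good"
    if score >= 50:
        return "Needs Improvement"
    return "Poor"

print(score_band(35))  # Poor
print(score_band(72))  # Needs Improvement
print(score_band(95))  # Good
```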
As far as the performance scores for all websites are concerned, 77.1% of the websites are in the poor range, which means there is a lot of room for improvement.
It’s pretty much the same story when we check only WordPress websites: 83.9% are in the poor performance range.
Core Web Vitals
By now, you probably are well aware of Core Web Vitals. Their importance is twofold:
- Google considers them essential in a webpage’s overall user experience, and understanding them can help you improve the quality of experience you are delivering to your users,
- Google plans to make page experience an official Google ranking factor, with Core Web Vitals being an essential part of it.
The current set for Core Web Vitals focuses on three aspects of the user experience: loading (described with Largest Contentful Paint (LCP) metric), interactivity (described with First Input Delay (FID) metric), and visual stability (described with Cumulative Layout Shift (CLS) metric).
For this research, numbers follow the performance scores. For example, check out the Largest Contentful Paint (LCP) results.
Given that I’ve tested only 20K URLs (13,945 after cleanup), let’s not generalize the conclusions. However, the general ‘feel’ is that the industries most expected to think about speed and performance failed the test.
Performance, while it matters for many reasons, is not and should not be the end goal. It depends not only on the tech used but also on the ‘features’ you’ll have on a website, which pretty much depend on the industry/theme your website is in. And balancing performance and functionality successfully depends on the value a feature brings to your business versus the reduction in speed it causes.
The thing is, whatever tech you use, you can end up with good scores (some easier than others). The real question is, how important are the scores for your client, their business, and their audience?
Nebojsa Radakovic is an SEO wiz with 20 years of experience. He is also an extreme sports enthusiast. He can be found on Twitter @CookieDuster_N.
The post Speed and performance of Web dev, SEO, and marketing agencies websites appeared first on Search Engine Watch.
Many brands are seeing strong year over year growth and in some cases with a conservative PPC strategy. Why is this and what does it mean for future strategies?
Read more at PPCHero.com
Average position as a metric was retired at the end of September. This is a big change, since for years clients, agencies, and advertisers of every kind have always had at least a little bit of vanity in their management. By that I mean, everyone at some point submitted a bid with the sole goal of being “number one” rather than hitting any actual business metric.
This change was implemented to acknowledge that the average position is not meaningful when you are in a world of personalized search. Stopping vanity bidding is just a beneficial side effect. I wanted to take a look at some data, specifically CPC and CTR, to see how performance varies for top and side positions. I also wanted to look at how these metrics vary on Google.com vs. Search partners. What I found were some very interesting insights that might impact how you think about your campaigns.
When it comes to the differences between Google and its partners, and top vs. other placements, the keys are:
- Google top vs. other has the biggest differences when it comes to CTR. The data showed a >900% increase in CTR across desktop, mobile, and tablet. This was the highest delta across the entire data set, except for Partner top vs. other, which was nearly 4x the difference.
- Mobile for Google vs. the Partners was also a significant difference at 918%. This was noticeable because the desktop variance was only 30% (basically a tie). The importance of mobile can’t be overstated.
The variances were really noticeable when it comes to cost per click. The drop-off between Google and partners was at least 100% and as high as 268%. The differences are driven primarily by demand: many advertisers do not participate in the partner network, so demand is down and cost per click falls as well. This is where, if the conversion rates are right, you would be able to pick up some additional scale. Looking at top vs. other within Google and within the Partners, the delta is much smaller, which just highlights the demand point above. The difference on mobile was only 13%. There is such high demand, and so few spaces for mobile, that the difference between top and side was the smallest of any data set reviewed.
While the CPCs weren’t that different, the CTRs for Google mobile top were significantly higher than for search partners top. I thought it was worth showing the actual data to illustrate the differences between mobile and desktop. The drop in mobile top is very high, indicating a different search experience and relevance. When looking at the “Other” positions, the differences are very small and the CTRs much lower.
What action should you take based on this data?
1. Don’t manage to these metrics – Optimize them
Ultimately, you shouldn’t really care what the CPC is or what your CTR is. The goal is hitting your KPIs. If you need to pay $100 per click but convert 100% of the clicks, it’s no different from paying $20 per click with a 20% conversion rate. That’s not to say you shouldn’t optimize to improve; you should. I’m just suggesting that metrics like top vs. side CTR are simply indicators of where you can improve. These are not your true KPIs.
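The CPC example above boils down to one line of arithmetic: CPA equals CPC divided by conversion rate.

```python
def cost_per_acquisition(cpc, conversion_rate):
    """CPA = cost per click divided by conversion rate."""
    return cpc / conversion_rate

# $100 CPC converting 100% of clicks...
print(cost_per_acquisition(100, 1.00))
# ...costs the same per acquisition as $20 CPC converting 20%.
print(cost_per_acquisition(20, 0.20))
```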
2. Understand the value the search partner network brings your campaign
The search partner network provides scale to your campaigns, and a revenue stream for Google. That doesn’t mean you need or require that scale in every case. If you are struggling to perform, break down your traffic by Google and the partner network. Look not only at CTR and CPC data, but also understand conversion rates. What would happen to both your volume and your cost per acquisition if you cut off the search partner network? Does this additional scale provide your business value, or would the budget be better spent in other areas that perform better? This isn’t a one-size-fits-all answer. You need to do the work, and the result might be different by campaign or even keyword.
The post A look at performance post Google’s average position sunset: Top vs side appeared first on Search Engine Watch.
Google Analytics (GA) is one of the most popular traffic analytics tools for websites, but it can have serious drawbacks for anyone looking to measure content performance.
The problem is systemic: Analytics was built to track traffic for ecommerce and content sites, with the structure of its reports built around pageviews. It can provide some sophisticated data around those views – what kinds of audience members are behind them, how they might have arrived, what they did next, and other such questions – but today’s content marketers need the ability to measure and understand much more than that.
How do people interact with your content when they’re viewing an individual landing page? How do they feel about your brand after having been exposed to it on other media channels? Where are they running into conversion roadblocks? What are the content assets across touchpoints that people are consuming most on their paths to conversion? What assets are most compelling to your most qualified individual leads?
GA can hint at some of the answers to these types of questions, but to truly understand these aspects of your content marketing performance, you’ll need to turn elsewhere.
Here are a few of the biggest ways that Google Analytics can’t measure your content performance properly, along with some tips for overcoming these shortcomings.
1. On-page behavior
Google Analytics only tracks page views and movement within your site. Unless you manually add layers of event tracking, it can’t reveal what people do within specific pages. You’ll never know if visitors get two lines into your content and then get distracted by an interesting link.
This is the value of heatmaps, which are remarkably effective at showing user behavior. They map out which areas of the page get the most view time and the most clicks, and where the mouse rests.
A heatmap shows areas that get the most attention in red, shading to blue for those that get the least. It reveals whether the visitor engaged and interacted with the page, or left it open and unread for hours. With a heatmap, you can discover the most popular parts of your pages, the navigation links people click on most, and whether key elements below the fold are going unseen.
To get started experimenting with heatmaps, you can try using Hotjar, Lucky Orange or CrazyEgg.
2. Brand sentiment lift
Google Analytics is limited to tracking page views on your own website. It can’t tell you anything about the impact of your content on earned or shared media channels, where you don’t have the ability to install its tracking pixel. And even if you could use it to track content views on all channels, you still wouldn’t know much about the impact that the content has on brand sentiment, or your share of voice in the general market.
Instead, use a social listening tool to track what people think about your brand. Social listening tools track social media shares, comments, reactions and mentions. This information has many key use cases, one of which is gaining a holistic view of brand sentiment.
The better platforms track far more than the number of brand mentions on social media, using semantic text analysis to reveal the emotions behind the posts and comparing these signals to those of your competitors. Merge these trends with your timeline of content marketing achievements, and correlations will start to emerge.
To get started experimenting with social listening for brand sentiment tracking, you can try using Awario, Mention or Talkwalker.
3. Friction points on forms
If a visitor tries to complete an online form and gives up in frustration, Google Analytics will never let you know. The best it can do is to show you how much time all visitors spent on the page. (Even this information can be extremely misleading since GA measures page view durations starting from the moment given page loads to the moment the next internal page loads. If your visitor stays for 10 minutes, reads your article from top to bottom, shares it, and then closes the tab without browsing any further within your site, GA will register ‘zero’ time on page.)
When it comes to lead capture forms, contact forms, and sales checkout forms, it can be hard to tell how many fields you’re best off including. The fewer fields your forms have, the less friction people will face opting in, which makes for more conversions.
On the other hand, the more fields you include, the more data you’ll have to work with when people do complete and submit forms, which is useful for identifying personas when executing segmented nurture sequences. You’ll also learn more about your audience, and you’ll be in the best possible position for determining the relevance of your leads. And there’s something to be said for asking a lot of your audience, as it helps to filter out people who are “just curious” about your lead magnet and will never actually do business with you.
To really understand the extent to which form fields are serving as roadblocks on the path to conversion, turn to your form builder tool’s analytics. The better platforms will reveal partial submissions, and how far a user gets through a form before abandoning it, so you can see if any single field is too long or question too confusing.
To get started experimenting with form conversion optimization, I recommend Formstack, Formisimo or Jotform.
4. The identity of every visitor
One of GA’s biggest weaknesses is its inability to give context to visitor behavior. It can’t show you much about the identity of your visitors – at best, you can segment data about your entire pool of visitors according to their physical locations, devices, referrers, rough demographics and points of entry to your site.
What’s more, Google Analytics often reports on only a sample of your sessions, so even if you tinker with your report settings to surface data about individual sessions, you can’t rely on this information as a comprehensive source of individual user insights.
Instead of GA, use audience intelligence tools that provide information about the interests, behavior, personal data (handled in a GDPR-compliant manner, of course) and historical activity of every user, so that you can gain a deeper understanding of your visitors. This allows you to fine-tune your content to appeal to your audience, and it also reveals opportunities for account-based marketing.
To get started with audience intelligence, try Albacross, LinkedIn Website Demographics or Visitor Queue.
5. Funnel analytics
It is possible to use Google Analytics to track users through your funnel and measure its effectiveness. However, setting this all up can be highly complicated. You have to build a confusing series of filters and a dedicated URL structure that allows GA to correlate content pages with each stage of the funnel.
It’s much better to use a single tool that follows users through your funnel. Pick one that logs abandonment points and the cumulative impact of your various key funnel touchpoints. You’ll also need a good way to track the activity of returning visitors, which is another weak point for GA, thanks to uncertainty about cookies, lack of reliability when tracking visitors across devices, and the aforementioned notorious data sampling issue.
And if you integrate a funnel analytics tool with your CRM, logging each lead’s engagement activity on your website, you’ll be in great shape to set up a smart lead scoring system for identifying sales-readiness levels.
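The lead scoring idea can be sketched in a few lines of Python. The activity names, point values and threshold below are hypothetical placeholders; in practice you would calibrate them against your own closed-won deal data.

```python
# Hypothetical point values for website engagement activities;
# tune these against your own sales outcomes.
ACTIVITY_POINTS = {
    "pricing_page_view": 20,
    "whitepaper_download": 15,
    "blog_view": 2,
    "webinar_signup": 25,
}
SALES_READY_THRESHOLD = 50  # assumed cutoff for handing a lead to sales

def score_lead(activities):
    """Sum point values for a lead's logged activities; unknown
    activity types score zero."""
    return sum(ACTIVITY_POINTS.get(a, 0) for a in activities)

lead_activity = [
    "blog_view", "whitepaper_download", "pricing_page_view", "webinar_signup",
]
score = score_lead(lead_activity)
print(score, score >= SALES_READY_THRESHOLD)  # → 62 True
```

Even a simple additive model like this, fed by CRM-logged website activity, beats guessing at which leads are sales-ready.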
To get started with funnel analytics, check out Kissmetrics, Woopra or Yandex Metrica.
6. Off-site interactions
Google Analytics only measures interactions with the content on your own site. It’s not something you can use to measure the impact of content on shared, paid or earned media. So that guest post you recently published on someone else’s blog, or your LinkedIn Publisher articles, for example, will be blind spots for you.
GA can show you information about some of the visits you acquired via clickthroughs from these media presences, but that’s about it.
You’ll get better results from a multi-channel dashboard tool that pulls together user analytics from all channels, including email marketing, advertising tools, and social media. This type of solution can’t show you how people found your content on these properties, nor where they went next if they didn’t end up on your website, but it will help you consolidate all your metrics into one centralized dashboard for a more holistic analysis.
What’s more, if you combine data relating to engagement on all touchpoints into one timeline, you’ll start to see correlations between spikes on certain channels and website conversions, which can point you in the right direction for further drill-downs.
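One simple way to spot those correlations, once your daily metrics live in one place, is a plain Pearson correlation across channel timelines. The daily figures below are invented for illustration; in practice you would export them from your dashboard tool.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily timelines pulled from a consolidated dashboard
email_clicks  = [12, 40, 15, 38, 11, 45, 14]
social_shares = [30, 28, 33, 29, 31, 27, 32]
conversions   = [3, 9, 4, 8, 3, 10, 4]

print(pearson(email_clicks, conversions))   # close to 1: email tracks conversions
print(pearson(social_shares, conversions))  # negative: shares don't drive them here
```

A strong correlation is not proof of causation, but it tells you which channel deserves the next drill-down.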
To get started with multi-channel dashboards, try Klipfolio, Databox or Geckoboard.
Google Analytics isn’t a magic button
Google Analytics is hugely popular, but it can’t do everything, especially if you’re concerned about content performance. Fortunately, there are other tools that fill the gaps GA leaves behind, giving you a much clearer understanding of your content marketing success.
The post Six key content performance aspects that Google Analytics can’t measure appeared first on Search Engine Watch.