How many times have you quoted a metric plucked from Google Analytics without really knowing what it means? Fear not, you’re not alone.
For far too long now, marketers have harbored misconceptions about how to define one particular metric – bounce rate – either confusing it with exit rate or adding non-existent criteria. So, we’ve put together a quick-fire guide to help you become a bounce rate aficionado.
How is the bounce rate calculated in Google Analytics?
The Google Analytics help guide is a good first stop when trying to get to the bottom of the topic. And with it, you only need to remember two key things:
1. A bounce in Google Analytics is a single-page session on a website
2. The bounce rate for a page is based only on sessions that start with that page
What does this mean in practice?
Here’s an example with three sessions:
Imagine there have been three user sessions on your website. During these sessions, the following pages were viewed in this order:
- Session one: Page A > Page B > Page C > exit
- Session two: Page B > Page A > Page C > exit
- Session three: Page A > exit
Page A bounce rate = 50%
Page B bounce rate = 0%
Page C bounce rate = 0%
Why? You might be tempted to think that Page A’s bounce rate is 33% because the page was viewed three times and only one of those views was immediately followed by an exit. It’s a typical misconception, but that logic is actually the definition of “exit rate”.
Similarly, you might be tempted to think that Page C’s bounce rate is 100%, as all the sessions that have included Page C as part of their journey have been immediately followed by an exit. However, only pages that start a session are included in these calculations.
Here’s an example with five sessions:
- Page B > Page A > Page C > exit
- Page B > exit
- Page A > Page C > Page B > exit
- Page C > exit
- Page B > Page C > Page A > exit
Page C’s bounce rate is 100%. It was visited four times, but only one session started with it, so that session is the only one Google Analytics counts in its bounce rate calculation.
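The rule can be sketched in a few lines of code. This is an illustrative approximation of the calculation, not Google Analytics’ actual implementation; the session data mirrors the five-session example above.

```python
from collections import defaultdict

def bounce_rates(sessions):
    """Bounce rate per page: of the sessions that START on a page,
    the share that consisted of that single page view only."""
    starts = defaultdict(int)   # sessions beginning on this page
    bounces = defaultdict(int)  # ...that were single-page sessions
    for session in sessions:
        first = session[0]
        starts[first] += 1
        if len(session) == 1:
            bounces[first] += 1
    return {page: bounces[page] / starts[page] for page in starts}

# The five sessions above (the exit is implied at the end of each list):
sessions = [["B", "A", "C"], ["B"], ["A", "C", "B"], ["C"], ["B", "C", "A"]]
rates = bounce_rates(sessions)
print(rates["C"])  # 1.0 — only one session started on Page C, and it bounced
```

Note that Page C’s four total views are irrelevant here; only the one session that started on it enters the calculation.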
What is an exit in Google Analytics?
Simply put, an exit is recorded whenever a user leaves the website in one way or another.
This means that if one of the goals of your website is to get users to click through to a third-party retailer after visiting a product page, users will need to exit the website in order to be counted as a conversion.
In this particular case, you could theoretically have pages with both a 100% bounce rate and a 100% conversion rate at the same time. But is lowering the number of single-page sessions on your website really your objective?
If not, you might want to consider a different KPI for your business. For SEO marketers, it is often the “go-to” KPI when reporting on performance, but others – such as exit rate – might be a better fit depending on your website’s objectives.
How should we use bounce rate and exit rate for efficient reporting?
1. Bounce rate at a website level
At a website level – the figure typically found on the Google Analytics dashboard – bounce rate only means the percentage of single-page sessions compared to overall sessions.
Due to its default settings, Google Analytics can be misleading: it marks a decreasing bounce rate with a green arrow, suggesting it is “good”, while any upturn is marked in red and perceived as “bad”. However, a higher bounce rate can be a good thing – perhaps the user only needed to visit one page to find the information they needed. It entirely depends on the type of website you are reporting on and the content it serves (ecommerce, blog, informational, and so on).
Changes in bounce rate at the website level should not be used to evaluate website performance, but rather to flag a change that requires further investigation.
2. Bounce rate at the page level
If it increases for a particular page, it is important to evaluate the type of page to understand if the change is positive or negative:
A non-exhaustive list of examples
- Homepage: an increase in bounce rate is generally negative and means fewer users are willing to visit the website beyond its homepage.
- Content/article: an increase in bounce rate could mean that users have found the information they need. In this case, bounce rate alone cannot be used to determine a positive or negative change.
- Product page: an increase in bounce rate on pages with ecommerce functionality needs to be analyzed in conjunction with recent website template changes to ensure those changes are not degrading the shopping experience.
3. Exit rate at the website level
At a website level, the exit rate does not provide very meaningful data because users will always have to exit a website from one of its pages at some point.
Google Analytics still provides this type of data under the behavior tab, but it is not recommended to use this information to report web performance.
Conceptually, every session must end with an exit, so a website-level exit rate carries no information. Be aware, however, that Google Analytics averages the exit rates of all pages of the website to come up with a “website average”.
4. Exit rate at the page level (or set of pages)
This is where the exit rate really shines. If you have an ideal user journey for your website, exit rate can help you identify changes in user behavior. From there, you can tweak web page templates to move users from one point to the next – monitoring where they exit along the way – so that more of them complete the journey.
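By contrast with bounce rate, exit rate divides by every view of a page, not just session starts. Here is a hedged sketch of that calculation (again, an approximation rather than Google Analytics’ implementation), using the three-session example from earlier:

```python
from collections import defaultdict

def exit_rates(sessions):
    """Exit rate per page: of ALL views of a page, the share that
    were the last page of their session."""
    views = defaultdict(int)
    exits = defaultdict(int)
    for session in sessions:
        for page in session:
            views[page] += 1
        exits[session[-1]] += 1  # last page of the session is the exit page
    return {page: exits[page] / views[page] for page in views}

# The three-session example: A > B > C, B > A > C, A
sessions = [["A", "B", "C"], ["B", "A", "C"], ["A"]]
rates = exit_rates(sessions)
print(rates)  # A: 1/3 (the "33%" figure), B: 0.0, C: 1.0
```

This makes the distinction concrete: Page A’s exit rate is 33% even though its bounce rate is 50%, because the two metrics use different denominators.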
Now that you’ve mastered the difference between bounce rate and exit rate and how to use them effectively in your reporting, it’s time to put your knowledge into practice. Log into Google Analytics and start to delve into what these stats really mean for the website.
The post Want to reduce your bounce rate, but what does that actually mean? appeared first on Search Engine Watch.
A new guide, Best Practices for Website Redesign & Migration, outlines detailed best practices for implementing search engine optimization (SEO) for a website redesign.
It includes a list of comprehensive tips aimed at educating website owners about common redesign obstacles, website structure as it relates to SEO, preserving domain and URL equity, keyword research, and more.
This post summarizes some key SEO migration elements listed in the guide, with an emphasis on helping organizations avoid costly errors related to a website redesign and domain or platform migration.
Content produced in collaboration with Investis Digital.
Common SEO obstacles in website redesign and migration
A proliferation of web design platforms and tools has made it easier than ever for businesses to complete a website redesign in record time.
However, many of these platforms aren’t fully compatible with older computers or slower connection speeds, and they present obstacles for search engine spiders.
The Investis Digital SEO guide lists twelve of the most common obstacles and includes tips for avoiding them. Here are a few examples:
Dynamic URLs that don’t include keywords
Dynamic URLs rely on variable parameters provided by the web server and are not easily indexed by search engines since they change based on user query input. A dynamic URL typically includes character strings rather than keywords. Investis Digital recommends that dynamic URLs be rewritten to include relevant keywords.
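As a hedged illustration of the idea (the URL pattern, parameter name, and slug mapping below are hypothetical examples, not taken from the guide):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical lookup, e.g. from a product database: id -> keyword slug
ID_TO_SLUG = {"4278": "blue-canvas-sneakers"}

def rewrite_dynamic_url(url):
    """Turn a parameter-driven URL into a keyword-rich static path."""
    parsed = urlparse(url)
    product_id = parse_qs(parsed.query).get("id", [None])[0]
    slug = ID_TO_SLUG.get(product_id)
    if slug is None:
        return url  # leave URLs we can't map untouched
    return f"/products/{slug}/"

print(rewrite_dynamic_url("/catalog/item?id=4278"))
# /products/blue-canvas-sneakers/
```

In practice this mapping is usually implemented as server rewrite rules (plus 301 redirects from the old dynamic URLs), but the principle is the same: replace opaque parameters with descriptive keywords.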
Since text is the backbone of how search engines determine keyword relevancy, the Investis Digital guide recommends that some relevant content be included on all pages. Ideally, at least one short paragraph of unique text should be present on each page. If this isn’t possible, then tier-one and tier-two pages should incorporate text-based, keyword-rich headers.
Three of the twelve obstacles listed in the Investis Digital SEO guide
Six key elements of an SEO-focused redesign
The Investis Digital guide lays out a comprehensive list of elements that companies should incorporate into their redesign strategy, from website structure to keyword research, and from meta information to internal site linking. The guide acts as a blueprint for businesses so they can minimize the impact of the redesign on organic search engine rankings and traffic. Here is a brief summary of each element:
1. Website structure
Folder structure, web page file names, and keyword-rich content all play a role in an optimal website structure. “The decisions you make about the naming conventions of your folders and files, and the way in which you point to specific pages of your website, can have a huge impact on overall traffic and sales,” writes Investis Digital.
2. Keyword research
Keyword research should be the starting point of your website redesign so that you can incorporate relevant, high-volume keywords throughout the entire site structure. Investis Digital reviews important keyword guidelines such as how many different terms to target and what keyword research tools to use when gathering information.
3. Meta information
Meta information—also referred to as metadata—is the information that appears in search engine results pages for organic listings. SEO meta information includes a variety of tags, such as the <title> tag and the <meta name="description"> tag. The guide provides checklists to help businesses fully optimize each of these important tags.
4. Body content
Body content is important for good search engine rankings as well as overall website usability. The Investis Digital guidelines cover the specifics of creating high-quality, SEO-friendly content, including attributes that contribute to search ranking such as keyword choice, frequency, placement, spacing, and titles.
5. Internal site linking
Internal links are an important element of good SEO design as they determine how search engines perceive relevancy for specific keywords. Investis Digital covers the best practices for internal link creation such as using descriptive text-based links in the main navigation, limiting the number of links on a page, and more.
6. URL equity
URL equity is “the sum of several important values tied to URL structure.” Dynamic versus static URLs, as noted above, play a role in URL equity as do a URL’s external links to the website, age of the domain, and more.
The importance of creating an SEO redesign strategy
A key pain point with any redesign is the loss of organic search traffic that occurs when established domain equity is lost. The Investis Digital guide provides information to help companies avoid the negative effect a large-scale redesign can have on search visibility and website traffic.
Example of a successful client domain migration. Source—Investis Digital
With a strong emphasis on maintaining URL equity, a sample workflow to help with planning, and a list of expected obstacles, businesses can use this guide to create a comprehensive SEO strategy for their website redesign or platform migration.
For more tips on how to create an SEO strategy for website redesign and migration, check out the full guide, “Best Practices for Website Redesign & Migration.”
The post How to create an SEO strategy for website redesign and migration appeared first on Search Engine Watch.
Today, SEOs have a massive opportunity to expand their role in digital workflows, as far as both the volume and importance of tasks available. As companies increasingly look for equal parts creativity and analytical skills in digital leadership, experienced SEOs are uniquely positioned to fill the gap—that is, if they are able to capitalize on the innovations that enable them to practice scaling their best SEO efforts.
I wrote recently about the power of intelligent automation; that is, automation supercharged by a layer of artificial intelligence. Digital is encountering and even coming to rely on AI in predictive analytics, automated sales analysis, research and information aggregation, automated communications (think chatbots and email), and even virtual personal assistants.
Across digital channels, campaigns, and tactics, it seems nothing is untouched by automation. And yet the degree of automation and just how much AI informs the decisions being made can vary widely. Intelligent automation isn’t an all-or-nothing prospect—there are many different ways we can work alongside the machines revolutionizing digital.
In fact, those with the greatest understanding of how to collaborate with intelligent machines hold the keys to driving digital forward.
5 levels of automation: Elevating SEO and driving digital performance
Let’s look to the automotive industry for a model automation framework that illustrates just how much technology impacts the human experience in a defined task. The Society of Automotive Engineers created its “Levels of Driving Automation” standard to define the six levels of automation in the driverless car industry.
From Level 0 through to 5, the tasks a human driver must undertake decrease, while technological features increase.
Image source: NHTSA.gov
In Level 0, the driver is on her own and must constantly steer, brake, accelerate, signal, and otherwise control all aspects of the car’s performance. By Level 3, she may be in the driver’s seat but various automated driving features are engaged. The conditions must be right for automated operation, and the machine can ask her to take over at any time.
Lane-keeping assist, adaptive cruise control, and self-parking are a few examples of automated driving features you might experience in a Level 3 or 4 vehicle. (As Lance Eliot pointed out recently in Forbes, we’ve yet to see a true Level 5—a fully automated, driverless car—in action. The U.S. Department of Transportation expects we’ll see fully driverless cars from 2025 and beyond.)
We can apply this sort of framework to help further our understanding of the intelligent automation opportunity for SEOs. And while the automotive industry has all sorts of regulations and laws in place to protect public safety, you are a lot freer to explore the boundaries of intelligent automation in digital (and in SEO, in particular), so long as you respect user privacy.
Level 1: Manual SEO
The SEO of days gone by was almost entirely manual and incredibly time-consuming. That’s not to say that the more labor-intensive style of SEO doesn’t still happen today; there are still some SEOs who toil away in Excel, manually auditing and optimizing sites. Manual SEO gives you complete control over your entire search strategy, from selecting backlinks at the individual level to careful optimizations throughout the site. But with 53% of website traffic coming from organic search, the opportunities to capitalize and scale are immense.
However, it’s simply impossible to ingest, analyze and activate search data at any sort of scale without automation. Manual SEO can be incredibly effective, but that value is reduced by every missed opportunity caused by slow implementation and the expensive overhead of a team of experts.
Level 2: Simple automation
Some of the tools you use in SEO today were probably borne out of earlier SEOs’ need to automate manual tasks such as keyword research and tracking rankings. Second generation SEO tools began to automate content optimizations (at least, pointing out opportunities to optimize).
Simple automation is almost exclusively about reducing manual labor for simple tasks: gathering the same data set at regular intervals (daily rankings on specific keywords, for example), or scouring the web for links to your site.
Simple automation will reduce labor spent on these tasks, freeing up time and energy for more creative and strategic activities. Think of this level of SEO automation as cruise control—you can take your foot off the pedal and let the machine do some of the work, but you are very much still in control of the car.
Level 3: Application of AI for insights
In Level 3, our SEO software becomes a whole lot smarter thanks to artificially intelligent analysis. In this stage, the car is driving itself—but only under certain conditions.
From your place in the driver’s seat, you control what input you feed your SEO tools and define what you’d like as outcomes. Through the analysis of massive data sets (far more than you could ever plow through on your own), you’re able to glean greater insights and make better decisions.
You can automate the process of analyzing your site’s content and have your software return optimization recommendations based on your selected keywords, for example. This can inform your content creation efforts and ensures that you’re spending the time you do have on the areas with the potential for the highest impact.
Level 4: Real-time interactions
The car is driving itself, but you can’t throw caution to the wind and take it out in all conditions. In the driverless car world, a Level 4 vehicle may not even have a steering wheel or brake pedals installed. In a 2018 test case in Japan, for example, a driverless taxi ferried paying fares on a defined route through the manic streets of Tokyo.
Chatbots are a great example of a Level 4 automation in digital. SEOs are a natural fit for driving conversational AI strategy, as consumers often turn to chatbots as an extension of the search experience. Whether querying by text or by voice, connected consumers look to intelligent automated assistants to help them solve their immediate needs. Who understands that customer’s journey and which pieces of content best answer their questions better than the SEO team? If chatbots are being treated as a function of sales, it’s important that marketing (and search in particular) assert their right to be consulted, or even to lead.
Level 5: Real-time decision-making and automated optimizations
The fully autonomous car may drive itself, but its independence is an illusion. The car—when it comes to market—will rely on teams of skilled designers, engineers, and developers to create and maintain the systems that will enable it to make decisions and take action in the moment to keep its passengers safe.
And so it is with fully AI-integrated search software. Disparate tools that automate only a handful of functions have fallen to the wayside as platforms have evolved to ensure that all functions are able to share data and “talk” to one another. Empowering the machines to prioritize tasks and make decisions about which optimizations can be executed in real time, as consumers make their needs apparent, enables brands to be fully responsive and even proactive in personalizing the search experience.
Don’t be left holding the wrench | Add automated talent to your team
Let me use a mechanic as a generic example. Being a mechanic is a perfectly respectable profession, but 20 years from now a person with those skills alone is going to have a difficult time finding employment. The mechanic who is upgrading his skills today—studying engineering, transportation safety, or automotive software development—is ensuring his employability when driverless cars hit the market.
Those mechanical skills will still be needed, but they’ll have to be complemented by a solid understanding of the computerized systems that drive the car. They’ll have to be able and willing to work alongside the machines.
SEOs today have the opportunity to elevate their role by self-driving some SEO functions with automation—something my company does with BrightEdge Autopilot—and to become the CMOs and CDOs of tomorrow, especially if they can position themselves as the best choice to work alongside Level 5 systems.
Search’s broad impact across the entire digital customer journey gives SEOs a wide-ranging set of skills and perspectives on which to build and eventually lead. To capitalize on the opportunity, search professionals must be willing to embrace AI not as a tool, but as a collaborative digital partner—one that can be trusted to make the right decisions when guided by the right strategy.
All too often, marketers find themselves in heated debate over the merits of one digital channel versus another, the pros and cons of artificial intelligence, and the opportunities for the next promotion. The reality is that in any thriving ecosystem, balance is key. The same is true for search and digital marketing, and particularly for SEO: working in a technically oriented environment demands a lot of balance, given SEO’s influence on content and all digital channels.
Those who can intelligently embrace advancements in AI and automation are the ones who stand to get that next promotion and elevate their role in digital. After all, they already have extra members on their team.
Jim Yu is the founder and CEO of leading enterprise SEO and content performance platform BrightEdge. He can be found on Twitter @jimyu.
The post Scaling SEO: 5 levels of automated digital progression & elevation appeared first on Search Engine Watch.
Google updated the nofollow attribute on Tuesday, September 10, 2019—a change it says is aimed at helping fight comment spam. The nofollow attribute had remained unchanged for 15 years, but Google has had to make this change as the web evolves.
Google also announced two new link attributes to help website owners and webmasters clearly call out what type of link is being used:
- rel=”sponsored”: Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships, or other compensation agreements.
- rel=”ugc”: UGC stands for user-generated content, and the ugc attribute value is recommended for links within user-generated content, such as comments and forum posts.
- rel=”nofollow”: Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.
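A rough sketch of how a site template might choose among these values is shown below. The domain list and function signature are hypothetical illustrations, not part of Google’s announcement:

```python
from urllib.parse import urlparse

# Hypothetical list of affiliate/advertiser domains for this site
SPONSORED_DOMAINS = {"partner-shop.example"}

def rel_for_link(href, user_generated=False, endorse=True, paid=False):
    """Pick a rel attribute value per the three categories above."""
    host = urlparse(href).netloc
    if paid or host in SPONSORED_DOMAINS:
        return "sponsored"  # ads, sponsorships, compensation agreements
    if user_generated:
        return "ugc"        # comments, forum posts
    if not endorse:
        return "nofollow"   # link without implying endorsement
    return None             # ordinary editorial link: no attribute needed

print(rel_for_link("https://partner-shop.example/item?aff=42"))
# sponsored
```

Centralizing the decision in one place like this makes it easy to audit which category every outbound link falls into.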
March 1st, 2020 changes
All of these link attributes now serve as hints for ranking purposes, and from the 1st of March 2020, nofollow will be treated as a hint for crawling and indexing too. Anyone relying on rel=”nofollow” to block a page from being indexed should look at other methods of blocking pages from being crawled or indexed.
John Mueller mentioned the use of rel=”sponsored” in one of the recent Google hangouts.
The question he was asked
“Our website has a growing commerce strategy and some members of our team believe that affiliate links are detrimental to our website ranking for other terms do we need to nofollow all affiliate links? If we don’t will this hurt our organic traffic?”
John Mueller’s answer
“So this is something that, I think, comes up every now and then. From our point of view, affiliate links are links that are placed with a kind of commercial background, in that you are obviously trying to earn some money by having these affiliate links and pointing to a distributor that you trust and have some kind of arrangement with.
From our point of view that is perfectly fine, that’s a way of monetizing your website, you’re welcome to do that.
We do kind of expect that these types of links are marked appropriately so that we understand these are affiliate links. One way to do that is to use just a nofollow.
A newer way to do that, to let us know about this kind of situation, is to use the sponsored rel link attribute. That link attribute specifically tells us this is something to do with an advertising relationship, and we treat that the same as a nofollow.
A lot of the affiliate links out there follow really clear patterns and we can recognize those, so we try to take care of those on our side when we can. But to be safe we recommend just using a nofollow or rel sponsored link attribute. In general this isn’t something that would really harm your website if you don’t do it; it’s something that makes it a little clearer for us what these links are for. If we see, for example, a website is engaging in large-scale link selling, then that’s something where we might take manual action. But for the most part, if our algorithms just recognize these are links we don’t want to count, then we just won’t count them.”
How quickly are website owners acting on this?
This was only announced by Google in September, and website owners have until March to make the required change, but data from Semrush shows that website owners are starting to switch over to the new rel link attributes.
The data shows that, out of one million domains, only 27,763 have at least one UGC link. Interestingly, each domain on that list has, on average, 20,904,603 follow backlinks, 6,373,970 nofollow, 22.8 UGC, and 55.5 sponsored.
This is still very early days but we can see that there is change and I would expect that to grow significantly into next year.
I believe that Google is going to use the data from these link attributes to catch out website owners who continue to sell links and mark them up incorrectly in order to pass SEO value to another website under any sort of agreement, paid or otherwise.
Paul Lovell is an SEO Consultant And Founder at Always Evolving SEO. He can be found on Twitter @_PaulLovell.
Google always ranks a web page after determining its overall quality. Page quality is a measure of the importance of a web page in the eyes of Google.
In order to determine the overall quality of a web page, Google hires real humans who are known as “Search Quality Raters”.
Page Quality rating, or PQ, is a grade given by these quality raters, who have the responsibility of evaluating “how well a page achieves its purpose”.
Purpose of the content, author expertise, links, and brand citations all come into play while measuring the quality of a page.
In this article, I will discuss the top five factors that directly impact the overall quality of a web page. Let’s start!
1. Purpose of the page
The purpose of the page is the real reason behind the creation of the page.
A page can be created to serve a particular purpose or multiple purposes, make money or harm the user by inserting malicious code via cookies or download buttons.
The first thing that Google does is understanding the purpose of the page in response to the user search. Google applies semantic search to understand the meaning of the words behind the query and matches them with the purpose of the page.
Google presents the best answers to the user after accurately identifying the real intent of the searcher. The purpose of your page must match the real intent of the searcher.
Different sites have different purposes. Hence it is important to identify the real purpose of the page.
Some common purposes of a page
- The homepage of a news website to share the news with the people.
- The category page of a shopping portal to sell products to people.
- A personal review site to inform users about the features, pros, and cons of the product.
- A how-to page created to help users find the answers to a specific question.
- A video created to educate people on how to draw a summer landscape.
- Category page of a software website to allow people to download a particular software.
For example, this page of the Best VPN Zone site might have a high PQ rating for the query “how to save money on internet safety” because it lists 55 ways that genuinely help the searcher save money on internet safety. The content is over 3,000 words and is divided into proper subheadings, which improves the overall readability of the page. (For tools that you can use to check the word count and readability levels of a web page, please see point three.)
When creating a web page, you should keep in mind the actual intent of the user. Identify the main purpose of your page and ask yourself – Does it accurately serve the user intent? The answer should be “yes”.
A page should not be created solely to earn money by running ads or to harm the user. Such pages have the lowest PQ rating.
2. Amount of expertise, authoritativeness, and trustworthiness
Expertise, authoritativeness, and trustworthiness are collectively known as EAT in SEO. Pages that have strong EAT are rated highly by the search quality raters. Let’s understand what EAT means:
Who is the creator of the content? (An article written by Danny Sullivan on SEO has more expertise when compared to an article written by any new author having a few years of experience).
How authoritative is the website where the content is published, or how authoritative is the author? (An article published in the science mission section of the NASA website is far more authoritative than an article published in a local science magazine such as this).
How trustworthy is the website where the content is published or how trustworthy is the author? (An article published by the Medical Association of Alabama is found to be more trustworthy when compared with the information in the personal blog of any random Alabama blogger).
EAT is an extremely important factor to evaluate the overall quality of a page. A page lacking EAT is considered to be of a low-quality and ranks poorly in the search results.
3. Main content quality and amount
The quality of the MC, or main content, is another major criterion in the calculation of the PQ rating. While determining the quality of MC, Google pays special attention to the following:
- There should be no spelling or grammatical errors.
- Content should be clearly written and comprehensive (an interesting point to note here is that long-form content gets more backlinks when compared to shorter content and this is another reason why long-form content actually helps in rankings. This Backlinko study proves it.)
- The information presented on the site should be factually correct.
- The information should be presented well.
- Content on a shopping website should allow users to find the products easily.
- Any video or other features on the site like a calculator or game should be working properly.
- EAT also applies here.
You can check the word count of a web page using a tool like Word Counter. Similarly, Grammarly can be used to check the content for any grammatical errors. Sophisticated tools like Readable give you a score for your content based on its readability levels.
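Under the hood, readability scores like those from Readable are built from simple ratios. Here is a minimal, self-contained sketch of the classic Flesch Reading Ease formula (the syllable counter is a crude vowel-group heuristic, so treat the output as approximate rather than as what any particular tool reports):

```python
import re

def count_syllables(word):
    # Vowel-group count: a rough approximation of syllables.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Higher scores mean easier reading; ~60-70 is plain English."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (n_words / sentences)       # words per sentence
            - 84.6 * (syllables / n_words))       # syllables per word

score = flesch_reading_ease("The dog ran. The cat sat. It was fun.")
print(round(score, 1))  # very simple text scores high
```

Shorter sentences and shorter words both push the score up, which is why breaking long-form content into clear subheadings and plain sentences tends to improve the readability figures these tools report.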
A good example of a page having high-quality MC is this Wiki on Siberian Husky. The information is comprehensive, clearly written, accurate, has lots of images to make readers understand the various characteristics and every point is backed up by proper data. This makes this Wiki a page having very high-quality MC and no wonder it ranks on the first position in Google for its target keyword.
4. Clear and satisfying website information
Any website on the web should have clear information about who is responsible for the information contained on the website along with details like office address and other contact details.
Having all the contact details on your websites adds to a high degree of trust. For websites that are directly responsible for the health and well-being of a human, disclosing the details of the organization or the person behind the site is extremely necessary.
For shopping websites, adding a customer support number is important because it helps the users to resolve issues. Hence, contact information along with customer support numbers or live chats are a factor in the PQ rating of Google. Depending on the niche of your website, you must add all the information in it that will help your users.
5. Website reputation
Google also assesses the reputation of a website by analyzing what experts across the web have written or said about it.
Some ways how Google identifies a website’s reputation
- Articles published in reputed news agencies about the website.
- Awards and recognitions won by the business. For example, a website run by a culinary expert who has won the James Beard Foundation Award for culinary excellence would be trusted more by Google when compared to any random blog run by a blogger who hasn’t received any awards.
- User ratings about an online store or business or about a particular product or service. Google considers a large number of positive reviews as evidence of a positive reputation.
- For health-related queries, Google carefully considers both the website's and the author's reputation when evaluating PQ ratings. For example, for a query like "what is CBD", this resource from CBD Central might achieve high PQ ratings because it has clear information about the author. Similarly, this resource from Medicine Net has all of its claims backed up by trustworthy references and might be rated highly by the raters.
- Any other information about the website or the author of the article on any other website like Wikipedia, niche blogs, magazine articles, and forums.
You can check the reputation of a website using tools like Moz (for checking Domain Authority), SEMrush (for checking the Trust Score), Ahrefs (for checking the Ahrefs Domain Rating), and Majestic SEO (for checking the Trust Flow). Each of these metrics helps determine a website's reputation.
Here are some useful ways to build your website's reputation.
You can't ignore page quality if you want your page(s) to rank highly in the search results. Consider the above five factors carefully and take steps to optimize your pages accordingly.
Remember, PQ rating is given by real people so don’t think of applying any Black Hat tactics to fool them. Offer the best services to your customers and genuinely earn a positive reputation for your brand. Focus on the main content quality and the purpose of the page.
Last but not least, try to earn brand mentions and links from reputed media publications and nominate your business for prestigious awards in your business category.
Joydeep Bhattacharya is a digital marketing evangelist and author of the SEO Sandwitch blog.
The post Five factors that determine the overall page quality appeared first on Search Engine Watch.
A common misunderstanding is that web accessibility standards stand in the way of search engine optimization, preventing SEO experts from doing their job properly. That's simply not true. In fact, SEO and web accessibility overlap in many areas.
Web accessibility has recently become a hot topic in the digital marketing industry due to a wave of widely publicized lawsuits, with web users suing businesses big and small for failing to provide them with a smooth user experience.
Moreover, Google raises awareness by encouraging the wide adoption of web accessibility standards. Google has official guidelines explaining accessibility and how it helps create a better user experience.
Broadly speaking, when we say a site is accessible, we mean that the site’s content is available, and its functionality can be operated, by literally anyone.
And yet, while smart marketers have recognized the tangible benefit of making your site accessible (that is, making it possible for more web users to buy from your site), web accessibility seldom makes it to marketing priority lists.
What if I told you that by making a site accessible you can actually improve your SEO? Let’s see how:
1. Site and page structure
The foundation of web accessibility is very similar to that of an SEO strategy: You need a clear, logical site and page structure.
1.1. Site structure
At the site level, this includes:
- Clear navigation
The navigation of a website is important in helping visitors quickly find the content they want. It can also help search engines understand what content the webmaster thinks is important.
Although Google’s search results are provided at a page level, Google also likes to have a sense of what role a page plays in the bigger picture of the site.
For WordPress sites, an easy way to quickly improve navigation is to add breadcrumbs using the free Yoast plugin: it's really a one-click integration.
Image: Screenshot created by the author
1.2. Page structure
When it comes to page structure, this includes:
- The meaningful use of subheadings
- Clickable table of contents taking you to a specific subhead
Both are recommended for web accessibility purposes as they enable screen readers to navigate a page. From an SEO perspective, these two elements have very important benefits:
- Using your target keyword in subheadings improves its visibility (helping the page rank higher)
- A table of contents generates "Jump to" links in search snippets, improving click-through rates
- For guidelines on how to use subheadings refer to this detailed article on article structure
- To create a clickable table of contents, use a plugin called “Easy Table of Contents”. It automates the process, so you don’t have to do anything apart from ensuring the consistent use of H2-H3 subheadings:
Image source: wpbeginner.com
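Under the hood, a clickable table of contents is just a set of anchor links pointing at `id` attributes on your subheadings — exactly what the plugin generates for you. A minimal hand-rolled sketch (all headings and ids below are hypothetical) looks like this:

```html
<!-- Table of contents: each link targets the id of a subheading below -->
<nav class="toc">
  <ul>
    <li><a href="#what-is-seo">What is SEO?</a></li>
    <li><a href="#why-it-matters">Why it matters</a></li>
  </ul>
</nav>

<!-- Subheadings carry matching ids so the links (and "Jump to" snippets) work -->
<h2 id="what-is-seo">What is SEO?</h2>
<p>…</p>
<h2 id="why-it-matters">Why it matters</h2>
<p>…</p>
```

The same id-based anchors are what screen readers and Google's "Jump to" links rely on, which is why the plugin insists on consistent H2-H3 usage.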
Additionally, structured markup helps all types of devices better understand and interpret information, so using schema never hurts. Here's a list of six free Schema generators to semantically structure your content.
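For reference, a schema generator's output is typically a JSON-LD snippet you paste into the page's `<head>`. A minimal Article example might look like this (the headline, author, date, and URL below are placeholders, not values from any real page):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Accessibility and SEO: Where they overlap",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2020-01-15",
  "image": "https://example.com/cover.jpg"
}
</script>
```

JSON-LD is generally the easiest format to maintain because it lives in one block rather than being woven through your HTML.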
2. Alt text for visual content
The basic SEO principle is that you need keyword-optimized alt text for every image on your page to make it easier for Google to understand what it is about.
This rule applies to web accessibility as well. The only difference is that when it comes to web accessibility, the alt text should make sense. Imagine going through your page without actually seeing any images but instead reading the alt text. Are you able to understand the full context?
People with visual difficulties use assistive technologies that rely on image alt text to describe image contents to the user. This is what makes alt text so important for usability.
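In markup terms, the difference comes down to how descriptive the `alt` attribute is. A quick illustration (image paths and wording are hypothetical):

```html
<!-- Unhelpful: tells neither Google nor a screen reader anything useful -->
<img src="/img/photo1.jpg" alt="image1">

<!-- Helpful: describes the image content and can naturally include a keyword -->
<img src="/img/husky-snow.jpg"
     alt="Siberian Husky running through fresh snow">
```

If reading only the alt text in the second example still gives you the full context of the image, it passes the test described above.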
The featured snippet tool may be of help here, showing you which images are missing an alt tag and which have a meaningless one, on a page-by-page basis:
Image: Screenshot created by the author
If you operate a huge website and going from page to page is not an option, accessiBe can automate the process. AccessiBe uses AI image-recognition technology to provide accurate alt text for images site-wide. This is a great way to make your site accessible (and more SEO-friendly) without a large investment of money or time.
Check out multiple examples of how the tool works to better understand what it does:
Image source: Screenshot from the demo video
3. Video transcripts
Providing a text version of your video page helps deaf users understand what it is about. In fact, when it comes to accessibility, a video transcript is the only required element.
A video transcript also helps the video page rank for a wider variety of queries, because text content is just as important to Google.
The YouTube video description is what Google uses to rank the page in organic results, as well as in featured snippets and People Also Ask results:
Image: Screenshot created by the author
There are lots of automated solutions for creating video transcripts but I really prefer Speechpad.com.
Finally, here's another accessibility principle that can also boost your SEO: making your copy readable means writing in a clear way, using simple words. Basically, this means you should:
- Write in short sentences and paragraphs
- Use simple short words
- Provide definitions for any professional terms or slang
We don't know exactly how Google uses readability analysis in its algorithm, but what we know for sure is that focusing on easier readability levels will help you:
- Get featured more: Google prefers concise, easy-to-understand answers to feature
- Rank in voice search: Voice search devices are essentially screen readers. They need simple wording and structure to adequately convey the message to a human being. Google knows that, so it features easier answers, and consequently those are the ones read aloud in response to a voice query.
Keep readability in mind when having your content created. Some smart content creation platforms already have readability checks built in. For example, Narrato uses artificial intelligence to match content orders to content writers, and lets you select the writing style, specify the writer's expertise, and upload content guidelines to keep your content quality and readability at the required level.
Image: Screenshot created by the author
You can read more about Narrato's process.
Again, Yoast has a reading level analysis integrated into its free plugin version, but there are also multiple tools to analyze and improve the readability of your content.
At the end of the day, web accessibility is basically about making your site easier to navigate and understand. It’s pretty much what SEO is about too.
The post Accessibility and SEO: Where they overlap and how to optimize for both appeared first on Search Engine Watch.
It goes without saying that the world of SEO is becoming ever more technical, and over the past decade, webmasters, SEOs, and in-house teams have been widening their knowledge and skillsets to help their sites compete in search engine results pages.
One of these areas, which has seen the most development since its launch in 2011, is, of course, schema.org markup.
Although it has been eight years since the data schema was introduced, many popular brands, whether due to a lack of development capability or technical knowledge, have yet to implement structured data on their websites.
In this article, we’re going to take a look at what structured data is, and the benefits that the markup can provide for websites.
A brief introduction to structured data
Put simply, structured data is a form of markup that is implemented in the code of a website and provides search engines with specific pieces of information about a page, site, or organization.
By improving the knowledge that a search engine has about a particular page or site, it can, therefore, provide users with the information that they need when conducting a search.
It also means that if a business invests in structured data throughout its site, it could enjoy higher and more relevant levels of traffic.
But how does this happen?
Structured data can enhance AMP pages
Although structured data is not a direct ranking factor, it can influence other elements of your website that are ranking factors.
In a world where the majority of searches are made on mobile devices, site speed has never been more important, especially when you consider that users will leave a page that takes longer than three seconds to load.
For this reason, many businesses have implemented Accelerated Mobile Pages (AMP) on their site (read more about them here), which can help overcome critical mobile speed issues and improve the usability of pages.
But most people don’t realize that AMPs can actually be enhanced via structured data markup.
Google states that by implementing structured data on AMP pages, you can enhance their appearance in mobile search results and gain the ability to appear within rich results.
If a site gains the opportunity to appear within rich results for an important search term, the site could gain a great amount of search traffic as a result.
You can learn a little more about how structured data enhances AMP pages in this handy Google guide.
Structured data helps sites appear in Google’s Knowledge Graph
For sites that appear in highly competitive verticals, getting the edge over your competition is critical, and one way to do this is by establishing your site presence with Google and appearing in the Knowledge Graph.
Knowledge Graph cards appear on the right-hand side of search results and they provide users with functional and visual elements of your site; making it far easier for users to familiarise themselves with it.
To enable your business Knowledge Graph card, you need to add the necessary Corporate Contact markup on the homepage of your website.
Like all types of markup however, there are important guidelines and rules that you must follow, such as ensuring that markup is not blocked from crawling by robots.txt directives.
You can find more information on how to properly implement Corporate Contact markup in this Google Developers Guide.
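As a rough illustration, Corporate Contact markup is a JSON-LD `Organization` block placed on the homepage. The company name, URLs, and phone number below are placeholders, not part of Google's example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Ltd",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "contactPoint": [{
    "@type": "ContactPoint",
    "telephone": "+1-800-555-0199",
    "contactType": "customer service"
  }]
}
</script>
```

Note that this block must sit on a page Google can crawl — which is why the robots.txt rule mentioned above matters.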
Structured data can be vital for improving a site’s click-through rates
The CTR of a website matters for its rankings. According to Neil Patel, the best way to increase it is to research and use keywords, especially long-tail keywords. Serpstat can help you conduct deep, useful keyword research and improve your rankings as well.
Also, the whole point of structured data is to provide clean and concise parcels of information to search engines so that you can clarify the purpose of your site and its pages to quickly provide users with the accurate information that they require.
This means that by implementing well-written and relevant structured data into your pages, your site should be shown to a more relevant audience base, meaning that your click-through rates will inevitably improve.
In fact, some sites that implemented structured data have reported CTR improvements of 10% or more.
How to implement structured data
We’ve already learned the meaning and value of structured data on the site. Now, we’ll explore two of the main approaches for adding schema markup to your website.
How to add Schema.org micro-markup with Schema plugin
To install it, go to Plugins > Add New in the WordPress console and search for "Schema". Install and activate the plugin, then go to Settings.
Fill in basic information, such as the locations of your About and Contact pages, and upload your website logo.
By filling out the additional sections (content, knowledge graph, and search results), you can optimize your site for each of these areas.
Then, you can go to Schema – Types and add the selected schema type or publication category.
If the above-mentioned plugin doesn't suit you, there is a large number of alternative WordPress schema markup plugins to choose from. Here are some of them:
- All In One Schema Rich Snippets
- Schema JSON-LD Markup
- Rich Reviews
- WP SEO Structured Data Schema
- Markup (JSON-LD) structured in schema.org
How to add Schema.org markup manually
This approach involves working more with code, but it lets you add schema markup individually to any page or post.
With arbitrary schema markup, you can include several different types of markup on the same page. For example, if you have an event page, and you also want to place a feedback schema on it, you can easily do it.
Remember to follow all Google structured data guidelines while creating the code for your markup.
To use this approach, go to any post or page where you want to put the markup. Click Screen Settings at the top of the page and check the “Custom Fields” box. Now, scroll down to the “Custom Fields” settings and press “Enter new” to create a new field. Name it “Schema” and enter the code. For example, local businesses data type:
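The original screenshot isn't reproduced here, but a LocalBusiness snippet of the kind you'd paste into that custom field looks roughly like the sketch below (all business details are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "postalCode": "12345"
  },
  "telephone": "+1-555-0100",
  "openingHours": "Mo-Sa 08:00-18:00"
}
</script>
```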
Next, you’ll need to edit your header.php file. Open it and paste the following code before the closing </head> tag:
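The article's original snippet isn't shown, but a minimal sketch of the idea — reading the custom field and echoing it into the head — would be something like this (it assumes the field was named "Schema" exactly as above; WordPress meta keys are case-sensitive):

```php
<?php
// Output the current post's "Schema" custom field (if any) before </head>.
// get_post_meta() with $single = true returns the field value as a string.
if ( is_singular() ) {
    $schema = get_post_meta( get_the_ID(), 'Schema', true );
    if ( ! empty( $schema ) ) {
        echo $schema; // the field already contains the full <script> block
    }
}
?>
```

Because the field stores the complete `<script type="application/ld+json">` block, no extra wrapping is needed when echoing it out.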
Thanks to these actions, your schema code will be output along with your page's metadata. You can add any kind of custom schema markup to your WordPress website with this approach.
Just remember to run your page or post through the Google Structured Data Testing Tool to check your markup for errors. This validator understands JSON-LD, Microdata, and RDFa.
Using it, you can check the page in two ways:
- Copy in HTML format
- Specify a link to the page
Use the first method if the site is still being developed locally or if you need to test a few variants. The second is suitable for final verification of the finished markup. You can also use it to check pages built on ready-made CMS templates, which may contain markup errors.
For example, let's check the Phase 5 Analytics page. After copying in the URL and clicking the "Run test" button, the result of the verification appeared on screen: the HTML code on the left, and the detected markup on the right, along with any errors that were found.
Adding structured data to the site will not take a lot of time. This action will help improve the look of the snippet in the search engine and increase traffic to the site.
The process may seem a little technically complicated, but you’ll discover that even the option to manually add it is not as hard as you’d assume. In addition, many available plugins will make developing structured data very simple.
Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter @erin_yat.
Every business wants as much customer feedback as possible. That’s why we obsessively measure NPS (which barely has any statistical validity) and run surveys (which, in addition to being biased by definition, can negatively impact customer experience) like it’s the end of the world.
But the feedback we really want is different. It’s genuine, quick and easy to get, and structured enough so we can analyze it effectively. That’s where social listening, or social media monitoring, comes in.
Social listening is the process of monitoring mentions of keywords (for example, a brand name) or key phrases across social media and the Internet at large. Think of it as a way to measure people’s awareness of any subject – and their opinion on it – without having to ask questions.
More and more companies are adopting social media monitoring every year, and social listening tools are also evolving quickly. Even though they’re called social media monitoring tools, many apps go beyond social media and monitor the web at large. Finally, they analyze the data in order to provide you with insights you can learn from and act on.
In this post, we’ll look at the best social media monitoring tools you can use in 2020.
Awario is one of the best options in terms of bang for the buck. With pricing starting at $29/month, it comes equipped with many features of Enterprise-geared tools: sentiment analysis, topic clouds, Boolean search, and more.
In terms of coverage, Awario monitors Twitter, Facebook, Instagram, YouTube, Reddit, news and blogs, and the rest of the web. Let’s look at what makes Awario stand out.
Awario lets users measure dozens of social listening metrics, such as sentiment, reach, share of voice, key themes, top countries, and more. On top of that, you can use the tool to identify your biggest influencers and compare several brands side-by-side against crucial metrics for benchmarking and competitor analysis.
Boolean search isn’t for every brand. If your company name isn’t a common or ambiguous word (think Apple or Tesla), you’ll be just fine by simply feeding your brand name to the tool.
However, social listening has plenty of benefits beyond brand monitoring: from lead generation and PR, to doing research for your content strategy, this is where Boolean search comes in handy. It’s an advanced search mode that uses Boolean logic, letting you create flexible queries of any complexity to make sure you only get relevant results, whatever your use case may be.
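As a hypothetical illustration (the exact operator syntax varies by tool), a Boolean query for surfacing leads in, say, the project-management space might look like this:

```
("can anyone recommend" OR "looking for") AND
("project management tool" OR "task tracker") AND NOT "hiring"
```

The parentheses group synonyms, `AND` requires both groups to appear, and `AND NOT` filters out irrelevant job-related chatter — which is how you keep results relevant for use cases beyond plain brand monitoring.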
Awario offers a free 7-day trial. Pricing starts at $29/mo for the Starter plan (with 3 topics to monitor and 30,000 mentions/mo) and goes up to $299/mo for Enterprise.
Although TweetDeck isn’t a specialized social media monitoring tool, it definitely deserves a place on this list.
First of all, TweetDeck is free. Second, it lets you run Twitter searches using its powerful filters. And third, it combines the search functionality with everything else you’ll need to manage your Twitter presence.
Monitoring and scheduling in one tool
TweetDeck lets you schedule tweets, manage your DMs, and track mentions of your company on the network. You can set up as many searches as you need and reply to tweets right from the dashboard by connecting your Twitter account to the app.
Customizable column layout
Another great thing about TweetDeck is its column layout where you get to choose what each column shows. For instance, you could have your Twitter feed in one column, your DMs in another, and your social listening search in yet another one.
TweetDeck is free.
Talkwalker is an excellent social listening tool for digital agencies. The software collects the latest mentions of your brand and offers detailed analytics on your social media presence.
The tool's social media coverage is pretty impressive: on top of Twitter, Facebook, Instagram, and YouTube, the platform also monitors Flickr and Pinterest.
In addition to monitoring mentions, Talkwalker will give you insights on people who mention you, including your audience’s gender, age, interests, and geography.
Talkwalker's Enterprise plan offers the ability to monitor images and videos; this way, you'll be notified whenever your logo appears in an Instagram photo or YouTube video.
Pricing starts at $9,600/year for 10,000 mentions/mo.
Mention is a social media tool that’s primarily geared towards agencies and big brands, although they do offer plans for smaller businesses. Mention’s focus is on real-time monitoring – if you sign up and create an alert, you’ll only see mentions from the last 24 hours. Historical data is available under custom plans.
For businesses that like to have their analytics in one place, Mention offers API access, letting you integrate it into your own tools. If you’re not into coding, Mention offers an integration with Zapier, letting you automatically send mentions to a Google Spreadsheet, set up Slack notifications, and more.
In addition to social media monitoring, Mention lets you search for industry influencers across Twitter and Instagram; on top of that, it finds influential websites that you can partner with or guest post on.
Mention's pricing starts at $29/mo for its basic Solo plan, which lets you monitor one topic. For bigger brands, the app offers custom plans starting at $600/mo.
That’s our list of the best social media monitoring tools for the coming year. Each of them has its own unique pros, so I do hope you’ve found one that’s a perfect fit for your use case and budget.
This is a sponsored post from Awario. Awario is a social listening and analytics platform trusted by over 5,000 companies worldwide. The tool gives brands access to meaningful insights on their customers, industry, and competitors through real-time social media and web monitoring. Awario monitors social media networks, news websites, blogs, and the rest of the web in real time, crawling over 13 billion pages daily to ensure you never miss important conversations that spring up online.
The post Top four social listening tools for 2020 and why they’re great appeared first on Search Engine Watch.
Zazzle Media’s second annual “State of SEO survey” has assessed the value and ROI of SEO, looking at its impact in securing funds or resources.
The data suggested that a shortage of resources and budget is the main reason 60% of marketers don't spend more on organic search activity. Meanwhile, almost a third of surveyed marketers still don't know how to measure the impact of SEO on their results.
The survey's respondents were 70% in-house marketers and 30% agency heads from various companies. It called for marketers to develop a better understanding of attribution models, measurement tools, brand value, and purpose when it comes to spending more on SEO.
The main reason cited for marketers struggling to secure investment was competitor awareness: marketers are acutely aware of their competitors' activity, even noting that their own branded keywords were being targeted by competitors.
The report noted that data-led objectives can act as investment enablers as they can easily quantify and measure consumer traffic. They also help marketers prove ROI, by reviewing how marketing practices are improving year on year.
Yet the survey revealed that there is still a lack of understanding around best practices for marketers to use. A quarter of those surveyed called for clearer guidelines on best practice from Google Webmasters, revealing that there is, in fact, a knowledge and skills gap around SEO.
Zazzle Media's head of search and strategy, Stuart Shaw, said:
“As an industry, we’ve needed to educate, educate, educate – at almost every level of client infrastructure. That challenge still remains, in fact, it probably changes monthly but now with more noise than ever.
However, knowledge has always been power in this industry. Keeping up with updates, marketing news, and best practice guidelines across Google and Bing can be the difference in the results marketers need to secure that extra budget."
You can download the full results of The State of SEO here, and check out the top-line stats on the infographic below.