CBPO

Monthly Archives: May 2020

How To Double Relevant Google Shopping Volume in 5 Steps

May 10, 2020 No Comments

If you have been frustrated by limited targeting options for Google Shopping ads, this easily-implemented strategy will grow your reach and revenue.

Read more at PPCHero.com
PPC Hero


Google’s Duo video chat app gets a family mode with doodles and masks

May 9, 2020 No Comments

Google today launched an update to its Duo video chat app (which you definitely shouldn’t confuse with Hangouts or Google Meet, Google’s other video, audio and text chat apps).

There are plenty of jokes to be made about Google’s plethora of chat options, but Duo is trying to be a bit different from Hangouts and Meet in that it’s mobile-first and putting the emphasis on personal conversations. In its early days, it was very much only about one-on-one conversations (hence its name), but that has obviously changed (hence why Google will surely change its name sooner or later). This update shows this emphasis with the addition of what the company calls a “family mode.”

Once you activate this mode, you can start doodling on the screen, activate a number of new effects and virtually dress up with new masks. These effects and masks are now also available for one-on-one calls.

For Mother’s Day, Google is rolling out a special new effect that is sufficiently disturbing to make sure your mother will never want to use Duo again and immediately make her want to switch to Google Meet instead.

Only last month, Duo increased the maximum number of chat participants to 12 on Android and iOS. In the next few weeks, it’s also bringing this feature to the browser, where it will work for anyone with a Google account.

Google also launched a new ad for Duo. It’s what happens when marketers work from home.

Mobile – TechCrunch


How writers can optimize content for a variety of search engines

May 9, 2020 No Comments

30-second summary:

  • If you think optimizing your content for Google is tough, then you're going to be amazed by how many factors you'll have to consider when optimizing your writing for multiple search engines. 
  • Two benefits of doing so can be seen in local SEO and voice search.
  • UK Linkology's Content Marketing Manager, Hannah Stevenson, walks you through the complex process of understanding and implementing content optimization for search engines beyond Google – Bing, DuckDuckGo, Ask.com, and more.

Optimizing your content for just one search engine can be a challenge, as we've still got no idea what Google expects. 

There is a range of different tools out there designed to help, but they're all merely making educated guesses. To use them effectively, you need to be assessing what they tell you and, where possible, using more than one metric to evaluate your site's success and boost its rankings. 

If you think optimizing your content for Google is tough, then you're going to be amazed by how many factors you'll have to consider when optimizing your writing for multiple search engines.  

Read on to find out why it's important that you don't overlook alternative search engines and how you can include them in your optimization process.  

Why you need to optimize content for a range of search engines 

Google has the largest market share of any search engine in the world, so, understandably, most writers and SEOs focus on optimizing for it. 

However, there are a wide variety of alternative search engines out there. Bing, Microsoft's search engine offering, has 5.53% of this market. This might seem like a small percentage, but when you consider that the digital population around the world is in the billions, it is still a significant number of users that you're overlooking by only optimizing your content for Google. 

A percentage of Bing's users will have it set as their preferred search tool due to the browser or device they are using. Microsoft favours its own tools, which is why Bing is the default search engine on Windows phones, tablets and computers.  

Some developers have deals to make certain search engines their default. Many of these deals involve Google, but in some cases, the titan of the search engine market is usurped.  

For example, AOL chose Bing over Google in 2015, meaning that Bing is the default engine on AOL browsers. While this might not seem significant, many users will not bother to change their settings, and simply use the default search engine, meaning if this option is not Google, then other search engines will rise in popularity.  

Additionally, some smaller search engines target specific demographics, such as Ecosia, which is marketed at environmentally-conscious users and donates money towards planting trees with every search that users make.  

For users who are concerned about privacy and data storage, DuckDuckGo is a search engine that promises not to store information and block out hidden tracking software.  

As such, if you are targeting these specific demographics, then you need to make sure that you optimize your content for these tools.  

Research the search engines on the market 

Before you start optimizing your content, you need to check out the search engines on offer and work out which ones are the most relevant to your website.  

Some of the key search engines on the market, not including Google, are: 

  • Bing: As mentioned earlier, Bing is Microsoft’s search engine, which has a strong market share.  
  • Yahoo!: Powered by Bing, Yahoo! uses the same technology but is a different platform, meaning that you can optimize for this solution using the same techniques you use for Bing.  
  • Ecosia: An eco-friendly search engine that promotes itself by offering to donate money towards tree planting efforts for every search users make on its platform.  
  • DuckDuckGo: A privacy-focused search engine that does not track user data, making it harder to optimize for and less informative than other tools.  
  • Qwant: Another search engine that's dedicated to privacy, Qwant has its own indexing engine and doesn't track user activity.  
  • Ask.com: Using a question-and-answer format, Ask.com provides users with answers to any queries they may have by showing them relevant pages and content.  

Look beyond Google Analytics 

The first step toward optimizing content for alternative search engines is to find new sources of traffic information.

Most webmasters use Google Analytics to review their traffic and site information, but this platform only shows clicks from Google searches. 

If you want to find out where you're getting all of your page visits from, then you'll need to find alternative ways to review your traffic.  

Analytics tools such as SEMrush, SimilarWeb and Ahrefs all show you where your traffic is coming from, as well as offering a wide range of additional tools such as keyword searches and top page analysis. As such, they're definitely worth investing in if you want to boost your site, both on Google and a range of other search engines.   

Follow them on social media to stay updated 

One of the easiest ways to learn about the latest developments in how these alternative search engines operate, and how you can optimize content for them, is to stay updated.  

As such, you should follow them on social media and sign up to their newsletters to read the latest developments and advice that they're offering to users and content creators.  

Keeping tabs on so many different search engines can be a challenge, particularly if you're trying to optimize your content around several different tools.  

You'll be able to get all of the updates as and when they're released. You'll also receive expert commentary on what these developments mean for you and your content.  

Local SEO benefits some alternative search engines

Some search engines offer tailored local insight, meaning that you can use local SEO practices to target these platforms. 

For example, Bing offers Bing Places, a directory of local companies, and is committed to offering users search results tailored around their location.  

As Yahoo! is powered by Bing, boosting your reach on one platform will translate to growth on the other.  

Bing's dedication to sharing local search results means that, if you use local search terms in your content, you will be more likely to rank on this platform.  

Bing and rich internet applications

Bing also publishes technical guidelines, particularly regarding Flash- and Silverlight-based applications: 

Rich Internet Applications (RIAs), such as Microsoft Silverlight and Adobe Flash Player, can improve the aesthetic appearance or the functional ability of a site for end-users. However, the way these technologies are typically implemented often causes problems with the ability of search engine bots to crawl and fetch any meaningful data from the site. 

Sites extensively employing JavaScript and AJAX technologies can also cause the same problems for search. This is because search bots are primarily readers of the text. It is much more difficult to parse and derive indexable, relevant content from graphic and multimedia content. As a result, sites who implement these technologies without regard to search bot accessibility often unexpectedly see their search rankings drop off (from the search bot perspective, the site simply has little-to-no indexable content available, which adversely affects its relevance to the site’s main theme).

As such, you need to be cautious about relying on RIAs where possible, and optimize the meta tags and description tags to help you achieve strong results on Bing.  

Ask.com is optimized in a similar way to voice search

Voice search is one of the fastest-growing trends in the SEO market currently, with so many consumers now turning to their smart devices and virtual assistants to give them the information they need. 

When optimizing content for voice search, the key is to answer questions, as the majority of verbal searches are questions. 

This is because voice search queries use natural voice commands, as users are speaking rather than typing. Google has identified that almost 70% of searches on Google Assistant are performed in natural language, rather than the keywords that you often find in written searches.  

It's more natural to ask a question than it is to yell keywords towards your device. As such, optimizing your content for voice search involves including questions and providing the answers.  

Creating content with questions and answers in it not only helps you to boost your voice search results, but also helps you to optimize your content for Ask.com. 

As Ask.com focuses on providing users with the answers to questions, by also focusing on this format, you can kill two birds with one stone and optimize your writing for both Ask.com and voice search.  
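One common way to make question-and-answer content machine-readable is FAQ structured data. As a rough sketch (the helper name and input shape here are illustrative, not from the article; FAQPage itself is a schema.org vocabulary that some search engines read), a script could generate the markup from your Q&A content like this:

```javascript
// Sketch: turn a list of question/answer pairs into schema.org FAQPage
// JSON-LD, which can be embedded in a page's <script type="application/ld+json">.
function buildFaqJsonLd(faqs) {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map(({ question, answer }) => ({
      "@type": "Question",
      name: question,
      acceptedAnswer: { "@type": "Answer", text: answer },
    })),
  }, null, 2);
}

const jsonLd = buildFaqJsonLd([
  {
    question: "What is voice search?",
    answer: "Searching by speaking a question to a device or virtual assistant.",
  },
]);
console.log(jsonLd);
```

The same question-led structure serves written searchers, voice queries, and Q&A-oriented engines like Ask.com at once.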

Never sacrifice quality and relevance

The key to search engine visibility and increased traffic will always be quality and relevance. No matter what tools you use and what search engines you choose to target, you should always focus on creating readable content that grabs your readers' attention.  

Always make sure that your content is thoroughly proofread and that you haven't stuffed too many keywords into obscure positions. If you start every sentence with your target keywords, then search engines will pick up on this and may penalize your site.  

A Google penalty is a serious issue, but a penalty from any other search engine can also cause you major problems.  

Tools such as Grammarly or Hemingway Editor can help with readability, while SEO Surfer can help you understand keyword density and content layout. It should be mentioned that SEO Surfer takes much of its data from Google, but the tool can be useful for spreading out your keywords to help boost your rankings in a variety of search engines.  
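To illustrate what a keyword-density metric actually measures, here is a simplified sketch (this is a rough whitespace-based count, not how any particular tool computes it; real tools handle stemming, phrases, and stop words):

```javascript
// Sketch: keyword density as (occurrences of keyword / total words) * 100.
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().match(/[a-z']+/g) || [];
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return words.length ? (hits / words.length) * 100 : 0;
}

const sample = "SEO tools help writers. Good SEO takes practice.";
console.log(keywordDensity(sample, "seo").toFixed(1) + "%"); // → 25.0%
```

A density that high would itself be a stuffing red flag; the point of such tools is to help you spread keywords out, not pack them in.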

Conclusion  

At the end of the day, content remains king in the SEO market. Creating quality content needs to remain your key focus, with optimizing it and getting it in front of your target audience your second priority.  

As with any business decision, when you're optimizing your content, you should try to spread your risk. Aim to create content that is valuable not only for Google but also for a range of other search engines.   

Hannah Stevenson is the Content Marketing Manager at UK Linkology. 

The post How writers can optimize content for a variety of search engines appeared first on Search Engine Watch.

Search Engine Watch


Microsoft and AWS exchange poisoned pen blog posts in latest Pentagon JEDI contract spat

May 9, 2020 No Comments

Microsoft and Amazon are at it again as the fight for the Defense Department JEDI contract continues. In a recent series of increasingly acerbic pronouncements, the two companies continue their ongoing spat over the $10 billion, decade-long JEDI contract spoils.

As you may recall (or not), last fall in a surprise move, the DoD selected Microsoft as the winning vendor in the JEDI winner-take-all cloud infrastructure sweepstakes. The presumed winner was always AWS, but when the answer finally came down, it was not them.

To make a very long story short, AWS took exception to the decision and went to court to fight it. Later it was granted a stay of JEDI activities between Microsoft and the DoD, which, as you can imagine, did not please Microsoft. Since then, the two companies have been battling in PR pronouncements and blog posts trying to get the upper hand in the war for public opinion.

That fight took a hard turn this week when the two companies really went at it in dueling blog posts after Amazon filed its latest protest.

First there was Microsoft with PR exec Frank Shaw taking exception to AWS’s machinations, claiming the company just wants a do-over:

This latest filing – filed with the DoD this time – is another example of Amazon trying to bog down JEDI in complaints, litigation and other delays designed to force a do-over to rescue its failed bid.

Amazon’s Drew Herdner countered in a blog post published this morning:

Recently, Microsoft has published multiple self-righteous and pontificating blog posts that amount to nothing more than misleading noise intended to distract those following the protest.

The bottom line is that Microsoft believes it won the contract fair and square with a more competitive bid, while Amazon believes it should have won on technical superiority, and that there was political interference from the president because he doesn’t like Amazon CEO Jeff Bezos, who also owns the Washington Post.

If you’ve been following this story from the beginning (as I have), you know it has taken a series of twists and turns. It’s had lawsuits, complaints, drama and intrigue. The president has inserted himself into it, too. There have been accusations of conflicts of interest. There have been investigations, lawsuits and more investigations.

Government procurement tends to be pretty bland, but from the start when the DoD chose to use the cutesy Star Wars-driven acronym for this project, it has been anything but. Now it’s come down to two of the world’s largest tech companies exchanging angry blog posts. Sooner or later this is going to end right?


Enterprise – TechCrunch


Five digital marketing tips for freelancers

May 9, 2020 No Comments

30-second summary:

  • Freelancers are responsible for handling every aspect of their business to see growth – and digital marketing helps boost business growth. 
  • Your freelancing career involves marketing yourself while also handling client relationships. For different stages of the client journey, you need to stay organized, send relevant emails, and think creatively about what you have to offer.
  • Digital marketing opens opportunities for you to market your freelance business with the best possible website content, start new client relationships, reassure clients about their worries, and send personalized, automated emails at each stage of your client relationships.  

Freelancers have to execute work for their clients, but they also need to promote their business to get new clients. Digital marketing can be a powerful way to grow a business, but it’s also an additional skill set that not all freelancers practice.  

When it comes to building your brand, creating a compelling website, and generally marketing yourself online, what are the best ways to get started?  

Here are five digital marketing tips for freelancers.  

1. Focus on these four important website pages

As a freelancer, your website is one of your most essential tools. One of the reasons that people trust businesses is because of what they have on their website. 

And no website is complete without these four most important pages: 

  • The homepage 
  • The “About me” page 
  • The portfolio and work experience page 
  • The “Contact me” page 

Your website is the easiest and most direct way for a prospective client to learn who you are, what skills you offer, and how to work with you, just as previous clients have.  

The homepage is where you should include your value proposition, and proof, like a client testimonial, of how working with you helps clients see their desired results. Most website visitors leave a page within 10-20 seconds of arriving. However, a web page with a clear and immediate value proposition is known to keep visitors on a page longer – long enough to (hopefully) convert them into clients.  

The “About me” page is where you show people what kind of a person your clients will work with – and that's important to them. A report by BBMG states that 73% of people care about the company, not just the product or service, when making a purchase. People want to buy from people – authentic, similar, and likeable people.  

The portfolio and work experience page is where you can show off your best work. Clients often want to see a sample of the work that they’re going to get, to see if your style matches what they’re looking for. This page is that sample. 

The “Contact me” page is key. This is where people can hire you, so it’s a great place to answer any outstanding questions — to lay out the next steps and reiterate your value proposition. It’s also a good place to collect important information, like the budget, from people who contact you. 

You can have all other types of pages on your site. But start by getting these four right. 

2. Create an online course 

According to a Stratistics MRC report, the global e-learning market is projected to account for over $275 billion by 2022 (which is way up from the $165 billion accounted for in 2015). 

Creating an online course can feel like a lot of work, but it’s become easier than ever to sell your expertise. Online platforms like Thinkific or Podia give freelancers the tools they need to productize their experience — and potentially introduce some recurring income. 

When it comes to creating your course, keep a running list of the questions you get most often from your clients. Which projects come your way the most often? What information do you wish all your clients understood? 

Each of those questions can serve as the backbone for a new course — and most technology platforms will give you more ideas about how exactly you can package your product to sell.  

It takes an up-front investment to get an online course off the ground, but once you launch you can create a revenue stream that isn’t hourly or project-based billing (and expands beyond selling your time directly). 

You can host online courses on your website or through another platform, and you can promote courses via social media posts, paid advertising, and segmented emails.  

3. Create a client onboarding checklist 

When you have a new client, your first step is not to immediately start work on their project.  

Up until now, you and your new client have talked about things like your skills, the types of projects they have in mind, and the cost you agreed on for your services. You made a general plan for your work together. Now is the time when you put the plan in motion. 

A client onboarding process can make sure that everyone is on the same page. More importantly, onboarding is where you show new clients what you can do.  

As you begin work with a new client, you can upsell them on an online course, gated content, and other helpful resources. And as you start, it helps to have an onboarding checklist to keep track of each necessary step to sustain productive client relationships.  

A client onboarding checklist includes steps like:

  • Send an automated welcome email with details about what to expect next 
  • Send an invoice email for the first half of payment 
  • Set up a shared project management tool for project progress visibility 

The structure is an important part of any relationship – and especially your client relationships. You can’t expect clients to follow your lead on a project if they don’t know what path they are walking. Clearly lay out a client onboarding process for yourself and use a checklist to stay on track.  

4. Address these three most common client objections 

Any prospective client is going to have questions for you – and they are also going to come ready with reasons why working with you won’t work out. 

But you have to remember – they are talking to you because they need your help. When common client objections arise, it’s up to you to be able to handle them. 

You can use your digital marketing to handle client objections before they even reach you. Your website copy, blog content, social media posts, and emails can all showcase how you deal with your clients’ problems. 

For example, a prospective client who is worried about the cost of hiring a freelancer may respond positively to website copy that says, “Pricing is month to month with no contracts, no extra fees, and no hidden gimmicks”. 

When you are dealing with a client directly through emails and phone calls, being prepared with a list of objections and your responses can help you win them over. 

Three common client objections you may encounter are:

  1. It’s too expensive: Instead of leading with price, focus on the value and benefits of your service. Circle back to what the client gets with your offering, and the problems or pain points it solves.  
  2. You don’t understand my business: Respond with curiosity instead of getting defensive. Agree with them and let it be known that you want to learn about their business, then ask them to tell you about it. 
  3. I don’t have the time or resources for this right now: Instead of asking them when the best time to talk later is, ask them what time and resources they DO have right now. This helps you learn what support you can offer them in the meantime.  


5. Send automated follow-up and thank you emails to clients 

Throughout a client relationship, certain emails are required. During a project, you should send multiple check-in emails. After a project is done, you should still keep in contact with clients through follow-up and thank you emails. 

Follow-up emails can go to past clients that you want to work with again. In many email marketing platforms, you can find customizable follow-up email templates to cut down on communication time for you.   

When you have a new client, it’s important to build on your existing relationship and keep communication open. Sending a personal thank you note after you complete a client’s first project shows that you care. 

Personalized, thoughtful communication is the root of good customer experiences – and a big reason why clients continue to work with businesses. However, it can sound like a lot of time-consuming work to send out all of the necessary types of emails to clients. Automation exists to help you send every email you need without extra work from you.  

Nucleus found that marketing automation improved business productivity by an average of 20%. It can help you track and maintain communication with your clients. You should use marketing automation to send a thank-you email to first-time clients and follow-up emails to past clients.  
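Under the hood, automation rules like these amount to simple conditional logic on client data. As a hypothetical sketch (the function, fields, and template names are illustrative, not from any specific platform):

```javascript
// Sketch: pick which automated email template, if any, a client should
// receive, based on where they are in the relationship.
function pickEmailTemplate(client) {
  if (client.completedProjects === 1) {
    // First project just wrapped: send the personal thank-you note.
    return "first-project-thank-you";
  }
  if (client.daysSinceLastProject > 90) {
    // Past client gone quiet: send a follow-up to reopen the conversation.
    return "past-client-follow-up";
  }
  return null; // no automated email due right now
}

console.log(pickEmailTemplate({ completedProjects: 1, daysSinceLastProject: 3 }));
// → "first-project-thank-you"
```

An automation platform evaluates rules like this for you on a schedule or trigger, so the emails go out without extra work on your end.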

I hope freelancers everywhere can benefit from these tips and other free resources for freelancers. Manage clients, promote your freelance brand, and grow your business by taking hold of your marketing strategy.

Lauren Minning is a Content Marketing Specialist at ActiveCampaign.

The post Five digital marketing tips for freelancers appeared first on Search Engine Watch.

Search Engine Watch


Platforms scramble as ‘Plandemic’ conspiracy video spreads misinformation like wildfire

May 8, 2020 No Comments

A coronavirus conspiracy video featuring a well-known vaccine conspiracist is spreading like wildfire on social media this week, even as platforms talk tough about misinformation in the midst of the pandemic.

In the professionally-produced video, a solemn interviewer named Mikki Willis interviews Judy Mikovits, a figure best known for her anti-vaccine activism in recent years. The video touches on a number of topics favored among online conspiracists at the moment, filtering most of them through the lens that vaccines are a money-making enterprise that causes medical harm.

The video took off mid-week after first being posted to Vimeo and YouTube on May 4. From those sites, it traveled to Facebook, Instagram and Twitter where it circulated much more widely, racking up millions of views. Finding the video is currently trivial across social platforms, where it’s been reposted widely, sometimes with its title removed or reworded to make it more difficult to detect by AI moderation.

According to Twitter, tweets by Mikovits apparently don't violate the platform's rules around COVID-19 misinformation, but it has marked the video's URL as "unsafe" and blocked the related hashtags #PlagueOfCorruption and #Plandemicmovie. The company also hasn't found evidence that her account is being amplified as part of a coordinated campaign.

Over on Facebook, the video indeed runs afoul of the platform’s coronavirus and health misinformation rules—but it’s still very easy to find. For this story, I was able to locate a copy of the full video within seconds and at the time of writing Instagram’s #plandemic hashtag was well-populated with long clips from the video and even suggestions for related hashtags like #coronahoax. Facebook is currently working to stem the video’s spread, but it’s already collected millions of views in a short time.

On YouTube, a search for “Plandemic” mostly pulls up content debunking the video’s many false claims, but plenty of clips from the video itself still make the first wave of search results.

The video itself is a hodgepodge of popular false COVID-19 conspiracies already circulating online, scientifically unsound anti-vaccine talking points and claims of persecution.

Mikovits, who in the video states that she's not opposed to vaccines, later goes on to make the claim that vaccines have killed millions of people. “The game is to prevent the therapies ‘til everyone is infected and push the vaccines, knowing that the flu vaccines increase the odds… of getting COVID-19,” Mikovits says, conspiratorially. At the same time, she suggests that doctors and health facilities are incentivized to overcount COVID-19 cases for the Medicare payouts, an assertion that contradicts the expert consensus that coronavirus cases are likely still being meaningfully undercounted.

In the video, Mikovits accuses Dr. Anthony Fauci of suppressing treatments like hydroxychloroquine—falsely touted by President Trump as a likely cure for the virus. While her claims appear to have landed at the perfect opportunistic moment, her beef with Fauci is actually longstanding. As Buzzfeed reported, in a book she wrote six years ago, Mikovits accused Dr. Fauci of banning her from the NIH’s facilities—an event Fauci himself was not familiar with.

Mikovits also touches on a popular web of conspiracy theories fixated on the idea Bill Gates is somehow implicated in causing the pandemic to profit off the eventual vaccine and makes the unfounded claim that “it’s very clear this virus was manipulated and studied in the laboratory.”

In other interviews, Mikovits has suggested that face masks pose a danger because they can “activate” the virus in the wearer. In the “Plandemic” clip, Mikovits also makes the unscientific claim that beaches should not have been closed due to “healing microbes in the saltwater” and “sequences” in the sand that protect against the coronavirus.

To the uninformed viewer, Mikovits might appear to ably address scientific-sounding topics, but her own scientific credentials are extremely dubious. In 2009, Mikovits authored a study on chronic fatigue syndrome that was retracted by the journal Science two years later when an audit found “evidence of poor quality control” in the experiment and the results could not be replicated in subsequent studies. That event and her subsequent firing from a research institute appear to have kicked off her more recent turn as an anti-vaccine crusader, conspiracist and author.

With “Plandemic,” Mikovits seems to have positioned herself successfully for relevance in the pandemic’s information vacuum—her book sales have even soared on Amazon. Toward the end of the clip, her interviewer even cannily sets up a future outrage cycle at the inevitable crackdown from social media platforms, where the video flouts rules ostensibly banning harmful health conspiracies like the ones it contains.

“It’s other people shutting down other citizens and the big tech platforms follow suit and they shut everything down,” Willis says with steely concern. “There is no dissenting voices allowed any more in this free country.” 

As we’ve reported previously, the coronavirus crisis is fertile ground for conspiracy theories and potentially lethal misinformation— a fact that the “Plandemic” video’s apparent mainstream crossover success demonstrates. Widespread uncertainty and fear is a powerful thing, capable of breathing new life into debunked ideas that would have otherwise kept collecting dust in conspiracist backwaters, where they belong.


Social – TechCrunch


Alphabet’s Sidewalk Labs Scraps Its Ambitious Toronto Project

May 8, 2020 No Comments

The Google sibling envisioned a tech-enabled and eco-friendly neighborhood. But residents rebelled over plans to collect and use their data, among other things.
Feed: All Latest


JavaScript rendering and the problems for SEO in 2020

May 7, 2020 No Comments

30-second summary:

  • Anyone working in enterprise SEO in 2020 will have encountered this web architecture scenario with a client at some point. Frameworks like React, Vue, and Angular make web development faster and simpler.
  • There are tons of case studies but one business Croud encountered migrated to a hybrid Shopify / JS framework with internal links and content rendered via JS. They proceeded to lose traffic worth an estimated $8,000 per day over the next 6 months… about $1.5m USD.
  • The experienced readers amongst us will soon start to get the feeling that they’re encountering familiar territory.
  • Croud’s VP Strategic Partnerships, Anthony Lavall discusses JavaScript frameworks that deal with the most critical SEO elements.

While running the SEO team at Croud in New York over the last three years, 60% of our clients have been through some form of migration. Another ~30% have either moved from or to a SPA (Single Page Application) often utilizing an AJAX (Asynchronous Javascript and XML) framework to varying degrees.

Anyone working in enterprise SEO in 2020 will have encountered this web architecture scenario with a client at some point. Frameworks like React, Vue, and Angular make web development faster and simpler. This is especially true when creating dynamic web applications which offer relatively quick new-request interactivity (once the initial libraries powering them have loaded – Gmail is a good example) by utilizing the power of the modern browser to render the client-side code (the JavaScript), and then using web workers to offer network-request functionality that doesn't require a traditional server-based URL call.

With the increased functionality and deployment capabilities comes a cost – the question of SEO performance. I doubt any SEO reading this is a stranger to that question. However, you may be still in the dark regarding an answer.

Why is it a problem?

Revenue, in the form of lost organic traffic via lost organic rankings. It’s as simple as this. Web developers who recommended JavaScript (JS) frameworks are not typically directly responsible for long-term commercial performance. One of the main reasons SEOs exist in 2020 should be to mitigate strategic mistakes that could arise from this. Organic traffic is often taken as a given and not considered as important (or controllable), and this is where massive problems take place. There are tons of case studies, but one business we encountered migrated to a hybrid Shopify / JS framework with internal links and content rendered via JS. They proceeded to lose traffic worth an estimated $8,000 per day over the next six months… about $1.5m USD.

What’s the problem?

There are many problems. SEOs are already trying to deal with a huge number of signals from the most heavily invested commercial algorithm ever created (Google… just in case). Moving away from a traditional server-rendered website (think Wikipedia) to a contemporary framework is potentially riddled with SEO challenges, some of which are:

  • Search engine bot crawling, rendering, and indexing – search engine crawlers like Googlebot have adapted their crawling process to include the rendering of JavaScript (starting as far back as 2010) in order to be able to fully comprehend the code on AJAX web pages. We know Google is getting better at understanding complex JavaScript. Other search crawlers might not be. But this isn’t simply a question of comprehension. Crawling the entire web is no simple task and even Google’s resources are limited. They have to decide if a site is worth crawling and rendering based on assumptions that take place long before JS may have been encountered and rendered (metrics such as an estimated number of total pages, domain history, WhoIs data, domain authority, etc.).

Google’s Crawling and Rendering Process – The 2nd Render / Indexing Phase (announced at Google I/O 2018)

  • Speed – one of the biggest hurdles for AJAX applications. Google crawls web pages un-cached so those cumbersome first loads of single page applications can be problematic. Speed can be defined in a number of ways, but in this instance, we’re talking about the length of time it takes to execute and critically render all the resources on a JavaScript heavy page compared to a less resource intensive HTML page.
  • Resources and rendering – with traditional server-side code, the DOM (Document Object Model) is essentially rendered once the CSSOM (CSS Object Model) is formed or to put it more simply, the DOM doesn’t require too much further manipulation following the fetch of the source code. There are caveats to this but it is safe to say that client-side code (and the multiple libraries/resources that code might be derived from) adds increased complexity to the finalized DOM which means more CPU resources required by both search crawlers and client devices. This is one of the most significant reasons why a complex JS framework would not be preferred. However, it is so frequently overlooked.

Now, everything prior to this sentence has made the assumption that these AJAX pages have been built with no consideration for SEO. This is slightly unfair to the modern web design agency or in-house developer. There is usually some type of consideration to mitigate the negative impact on SEO (we will be looking at these in more detail). The experienced readers amongst us will now start to get the feeling that they are encountering familiar territory: territory which has resulted in many an email discussion between the client, development, design, and SEO teams about whether or not said migration is going to tank organic rankings (sadly, it often does).

The problem is that solutions to creating AJAX applications that work more like server-based HTML for SEO purposes are themselves mired in contention; primarily related to their efficacy. How do we test the efficacy of anything for SEO? We have to deploy and analyze SERP changes. And the results for migrations to JavaScript frameworks are repeatedly associated with drops in traffic. Take a look at the weekly stories pouring into the “JS sites in search working group” hosted by John Mueller if you want some proof.

Let’s take a look at some of the most common mitigation tactics for SEO in relation to AJAX.

The different solutions for AJAX SEO mitigation

1. Universal/Isomorphic JS

Isomorphic JavaScript, AKA Universal JavaScript, describes JS applications which run both on the client and the server: either side can execute the <script> and other code delivered, not just the client. Typically, complex JavaScript applications would only be ready to execute on the client (typically a browser); isomorphic JavaScript mitigates this. One of the best explanations I’ve seen (specifically related to Angular JS) is from Andres Rutnik on Medium:

  1. The client makes a request for a particular URL to your application server.
  2. The server proxies the request to a rendering service which is your Angular application running in a Node.js container. This service could be (but is not necessarily) on the same machine as the application server.
  3. The server version of the application renders the complete HTML and CSS for the path and query requested, including <script> tags to download the client Angular application.
  4. The browser receives the page and can show the content immediately. The client application loads asynchronously and, once ready, re-renders the current page, replacing the server-rendered static HTML. Now the web site behaves like an SPA for any interaction moving forwards. This process should be seamless to a user browsing the site.

Source: Medium

To reiterate, following the request, the server renders the JS and the full DOM/CSSOM is formed and served to the client. This means that Googlebot and users have been served a pre-rendered version of the page. The difference for users is that the HTML and CSS just served is then re-rendered to replace it with the dynamic JS so it can behave like the SPA it was always intended to be.

The problems with building isomorphic web pages/applications appear to be just that… actually building the thing isn’t easy. There’s a decent series here from Matheus Marsiglio who documents his experience.

2. Dynamic rendering

Dynamic rendering is a simpler concept to understand: it is the process of detecting the user-agent making the server request and serving the correct response based on whether that request is from a validated bot or a user.
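As a minimal sketch, the routing decision looks something like the following. The bot pattern is illustrative only; a production setup would also validate the crawler via reverse DNS lookup rather than trusting the user-agent string, which is trivially spoofed.

```javascript
// Illustrative subset of crawler user-agent tokens (not exhaustive).
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

// Serve the pre-rendered snapshot to crawlers and the normal client-side
// AJAX application to everyone else.
function chooseResponse(userAgent) {
  return BOT_PATTERN.test(userAgent || '') ? 'prerendered-html' : 'client-side-app';
}
```

A reverse proxy service such as prerender.io performs this same branch in front of your application server.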

This is Google’s recommended method of handling JavaScript for search. It is well illustrated here:


The Dynamic Rendering Process explained by Google

The output is a pre-rendered iteration of your code for search crawlers and the same AJAX that would have always been served to users. Google recommends a solution such as prerender.io to achieve this. It’s a reverse proxy service that pre-renders and caches your pages. There are some pitfalls with dynamic rendering, however, that must be understood:

  • Cloaking – In a world wide web dominated primarily by HTML and CSS, cloaking was a huge negative as far as Google was concerned. There was little reason for detecting and serving different code to Googlebot aside from trying to game search results. This is not the case in the world of JavaScript. Google’s dynamic rendering process is a direct recommendation for cloaking. They are explicitly saying, “serve users one thing and serve us another”. Why is this a problem? Google says, “As long as your dynamic rendering produces similar content, Googlebot won’t view dynamic rendering as cloaking.” But what is similar? How easy would it be to inject more content for Googlebot than is shown to users, or to use JS with a delay to remove text for users, or to manipulate the page in some other way that Googlebot is unlikely to see (because it is delayed in the DOM, for example)?
  • Caching – For sites that change frequently, such as large news publishers who require their content to be indexed as quickly as possible, a pre-render solution may just not cut it. Pages that are constantly added and changed need to be pre-rendered almost immediately for the approach to be effective. The minimum caching time on prerender.io is measured in days, not minutes.
  • Frameworks vary massively – Every tech stack is different, every library adds new complexity, and every CMS will handle this all differently. Pre-render solutions such as prerender.io are not a one-stop solution for optimal SEO performance.

3. CDNs yield additional complexities… (or any reverse proxy for that matter)

Content delivery networks (such as Cloudflare) can create additional testing complexities by adding another layer to the reverse proxy network. Testing a dynamic rendering solution can be difficult, as Cloudflare blocks non-validated Googlebot requests via reverse DNS lookup. Troubleshooting dynamic rendering issues therefore takes time: time for Googlebot to re-crawl the page, and then a combination of Google’s cache and a buggy new Search Console to interpret those changes. The mobile-friendly testing tool from Google is a decent stop-gap, but you can only analyze one page at a time.

This is a minefield! So what do I do for optimal SEO performance?

Think smart and plan effectively. Luckily only a relative handful of design elements are critical for SEO when considering the arena of web design and many of these are elements in the <head> and/or metadata. They are:

  • Anything in the <head> – <link> tags and <meta> tags
  • Header tags, e.g. <h1>, <h2>, etc.
  • <p> tags and all other copy / text
  • <table>, <ul>, <ol>, and all other crawl-able HTML elements
  • Links (must be <a> tags with href attributes)
  • Images

Every element above should be served without any JS rendering required by the client. As soon as you require JS to be rendered to yield one of the above elements, you put search performance in jeopardy. JavaScript can, and should, be used to enhance the user experience on your site. But if it’s used to inject the above elements into the DOM, then you have a problem that needs mitigating.

Internal links often provide the biggest SEO issues within JavaScript frameworks. This is because onclick events are sometimes used in place of <a> tags, so it’s not only an issue of Googlebot rendering the JS to form the links in the DOM. Even after the JS is rendered, there is still no <a> tag to crawl, because the onclick event is used instead.

Every internal link needs to be an <a> tag with an href attribute containing the value of the link destination in order to be considered valid. This was confirmed at Google’s I/O event last year.
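For illustration (the router.navigate() handler is a hypothetical SPA routing call, not a real API), the difference looks like this:

```html
<!-- Crawlable: a real <a> tag with an href attribute -->
<a href="/category/widgets">Widgets</a>

<!-- Not crawlable: no <a> tag at all, navigation handled purely by JS -->
<span onclick="router.navigate('/category/widgets')">Widgets</span>
```

Even if Googlebot fully renders the JavaScript, the second version never produces an href for it to follow.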

To conclude

Be wary of the statement, “we can use React / Angular because we’ve got next.js / Angular Universal so there’s no problem”. Everything needs to be tested, and that testing process can be tricky in itself. The factors are, again, myriad. To give an extreme example, what if the client is moving from a simple HTML website to an AJAX framework? The additional processing and possible issues with client-side rendering of critical elements could cause huge SEO problems. What if that same website currently generates $10m per month in organic revenue? Even the smallest drop in crawling, indexing, and performance capability could result in the loss of significant revenue.

There is no avoiding modern JS frameworks and that shouldn’t be the goal – the time saved in development hours could be worth thousands in itself – but as SEOs, it’s our responsibility to vehemently protect the most critical SEO elements and ensure they are always server-side rendered in one form or another. Make Googlebot do as little leg-work as possible in order to comprehend your content. That should be the goal.

Anthony Lavall is VP Strategic Partnerships at digital agency Croud. He can be found on Twitter @AnthonyLavall.

The post JavaScript rendering and the problems for SEO in 2020 appeared first on Search Engine Watch.

Search Engine Watch


The Top 50 Most Influential PPC Experts [2020]

May 7, 2020 No Comments

While our readers anxiously await the Top 25 Most Influential PPC Experts list (which will be released in a few weeks), we are sharing the Top 50!

Read more at PPCHero.com
PPC Hero


Eight HTML elements crucial for SEO

May 6, 2020 No Comments

30-second summary:

  • SEOs love to write about HTML elements as a vital ranking signal, and as a part of any “perfectly” optimized page.
  • To avoid possible confusion, this is not an HTML guide.
  • Aleh Barysevich, Founder and CMO of SEO PowerSuite and Awario, takes a detailed look at the top eight HTML elements to better communicate with the search engines to achieve better SERP rankings.
  • Lots of pro tips to watch out for, read on!

Using HTML for SEO benefits isn’t new. SEOs love to write about HTML elements as a vital ranking signal, and as a part of any “perfectly” optimized page.

Why do we love them so much? Because the essence of SEO is communicating to the search engine what a webpage/website is all about, and using HTML tags and their attributes is one of the best ways to do so. 

To avoid possible confusion: this is not an HTML guide. Instead, I’ll look at how you can use HTML tags to better communicate with the search engines to achieve better rankings. 

1. Title tag

The title tag is your main anchor. Both on the SERP and on social media like Facebook, <title> is used as the anchor to your page.


So writing it isn’t only about SEO: it also needs to be concise, informative, unique, and eye-catching.

How to use it for SEO

First, don’t make your title tags longer than 60-70 characters. Long titles are truncated at about 600-700px on the SERP, so a longer title simply ends up looking incomplete. Second, the keywords. A long time ago, Google only understood exact keyword matches. Today, thanks to RankBrain among other things, Google gets you. Plus, you get penalized for overstuffing your titles. 

Conclusion: use your keywords in your <title> tag, but only to help search engines parse out the meaning of your page and to help your users. 

Structure of the <title> tag
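As a minimal sketch (the store name and copy are invented for illustration):

```html
<head>
  <title>Blue Widgets – Handmade &amp; Fast Shipping | Example Store</title>
</head>
```

At roughly 55 characters, this stays within the 60-70 character guideline while leading with the target keyword.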

Pro tip 

A page’s title is not just visible on a SERP. It’s also shown in the web browser as a tab title. Some webmasters use that title tag to attract a user’s attention — if you switch tabs the text changes to something like “Come back, we miss you!”. 

It’s the exact approach used by Facebook and LinkedIn to show you that you have notifications, and it can be used to pretty good effect.

2. Meta description tag

The meta description tag determines what’s going to be written about your page on the SERP.


How to use it for SEO

First, before writing a meta description, it’s a great idea to check out the first SERP for your target keywords and get a feeling for how the top-ranking results compose their descriptions. Plus, avoid repeating other descriptions word for word. Second, try to explain what your page is about in 70-200 characters, and be careful not to overoptimize. Instead, aim to match the search intent of a potential query. 

Structure of the meta description tag
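A hypothetical example within the 70-200 character range (the product and claims are invented):

```html
<head>
  <meta name="description" content="Shop handmade blue widgets with free next-day shipping. Browse 200+ designs and read verified customer reviews.">
</head>
```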

Pro tip

Don’t use raw quotation marks in your description tag; use the HTML entity “&quot;” to encase any word you want quoted. If you simply use literal “ quotation marks around your content, the search engine is likely to cut off your description at that point.

3. Meta robots tag

Among meta tags, the robots tag occupies a special place. It’s used to instruct crawlers on how to crawl and index your page. It should be noted that the meta robots tag might be ignored at any point, but for the most part, crawlers respect webmasters’ wishes.

How to use it for SEO

You can use one or a combination of the following attributes within this tag:

  • noindex — stop search engines from indexing the page entirely.
  • nofollow — tells search engines to not follow the outgoing links on the page, and to not take these links into account when creating SERPs.
  • noimageindex — stop the image indexing on the page.
  • noarchive — stops search engines from showing a cached version of this page on the SERP.
  • nosnippet — don’t show any meta description on the SERP.
  • unavailable_after — after a certain date, the page won’t be indexed.

Structure of the meta robots tag
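The values above combine in a single comma-separated content attribute, for example:

```html
<!-- Keep this page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Indexable, but with no cached copy and no image indexing -->
<meta name="robots" content="noarchive, noimageindex">
```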

Pro tip

Use the nofollow attribute to optimize your crawl budget. Remember to balance your meta robots values and your robots.txt parameters: if you block a page in robots.txt, then obviously a crawler won’t be able to access it and heed its meta tags. On the other hand, a crawler might ignore the block in robots.txt, and then, if your meta tags don’t specify that your page is noindex/nofollow, the crawler might index it anyway.

4. Headings tags

Heading tags, from h1 to h6, are arranged hierarchically. Use them to break your text up into chapters and as convenient headings for your contents table.

<h1> is the “main” text heading, and by far the most important for our purposes. 

How to use it for SEO

To answer this question, our colleagues ran an experiment not so long ago. I’d recommend you check out the entire findings, but in summary: <h1> tags considerably influence your rankings. Definitely fill them out, and definitely use some of your target keywords.

Structure of headings tags
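A sketch of the hierarchy, using this article's own sections as the outline:

```html
<h1>Eight HTML elements crucial for SEO</h1>

<h2>1. Title tag</h2>
<h3>How to use it for SEO</h3>
<h3>Pro tip</h3>

<h2>2. Meta description tag</h2>
```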

Pro tip

While using keywords in your headings is important, overall your title tag is what the search engines will be looking at much more attentively. That said, Google recommends matching your title tag and h1 heading, so you can pretty much repeat the same thing, maybe a little more user-friendly.

5. Canonical tag

The rel=”canonical” is an attribute within the <link> tag. Use it to point towards the “main” version of the page among its duplicates. It’s used because a certain amount of duplication is inevitable, and massive duplication will actually harm your rankings in the long run. It’s generally a great idea to use an auditor tool to keep an eye on all of your duplicate pages and canonical tags in a single dashboard. 

How to use it for SEO

It should be correctly implemented within the <head> section of the page and should point to the version that you want to be ranking. Alternatively, if you can configure your server, you can indicate the canonical URL using rel=”canonical” HTTP headers.

Structure of the canonical tag
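A hypothetical example, placed in the <head> of a parameterized duplicate (the URLs are invented):

```html
<!-- In the <head> of https://example.com/widgets/?sort=price -->
<link rel="canonical" href="https://example.com/widgets/">
```

The server-configured alternative sends the equivalent HTTP header: Link: <https://example.com/widgets/>; rel="canonical".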

Pro tip

rel=”canonical” tag may be used not just for duplicates, but also near-duplicates. Be careful though: if the two pages connected by a canonical tag differ too much in content, the search engine will simply disregard the tag. Use it for two nearly identical product pages in two different categories, for example, or for two products differing in one small attribute.

6. Nofollow attribute

We all know that links are super important for ranking. But a link’s weight will significantly change depending on how that link is covered by the rel attribute in the <a> tag. 

The rel=”nofollow” value tells Google that you don’t want this link associated with your webpage, and that you don’t want to pass your link authority to its destination.

How to use it for SEO

The most obvious use for the nofollow element is to block out spam and promotional links. Remember that by default, all of the links on your pages are “follow”, so be careful not to let Google associate you with the wrong pages. When doing link building, you want to avoid nofollow links; as a webmaster, the situation is reversed, and you should tag any link you don’t want the search engine to associate you with as nofollow.

Structure of the nofollow element
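For illustration (the destination URLs are invented):

```html
<!-- Don't pass link authority to this destination -->
<a href="https://example.com/promo" rel="nofollow">Promotional link</a>

<!-- More specific values for user-generated and paid links -->
<a href="https://example.com/user-link" rel="ugc">Link from a comment</a>
<a href="https://example.com/partner" rel="sponsored">Paid placement</a>
```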

Pro tip

Please remember that internal PageRank sculpting using nofollow, which is sometimes promoted as solid advice, is actually useless. On the other hand, use rel values such as ugc to tag your user-generated content, and sponsored for paid links. This will help Google make a sober assessment of your ranking.

7. Structured data markup

Data markup is an approach to organizing the information on your page, and it’s a win-win decision for a webmaster. Structured data is both good for the UX and carries huge SEO value. You see, lists, along with data markup (schema.org in particular), are absolutely vital for earning additional SERP real estate in the form of rich snippets. From FAQ information to your review ratings and much more: to get that additional SERP space, you need to use data markup.

How to use it for SEO

It’s about much more than simply using <ul>/<ol> tags, although it’s still not very difficult. Simply go to schema.org, find the type of markup that suits your page, and implement it into your page’s code.

Additionally, in terms of SEO, data markup is absolutely necessary to get into a featured snippet. While there is no guarantee that the featured snippet will be yours thanks to structured data, the rich snippets alone make it a worthy investment of your time.

Structure of data markup
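A hypothetical JSON-LD snippet combining the “person”, “organization”, and “address” types (all names and places are invented):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "jobTitle": "Founder",
  "worksFor": {
    "@type": "Organization",
    "name": "Example Inc.",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "New York",
      "addressCountry": "US"
    }
  }
}
</script>
```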

Pro tip

Note that you can combine different markup schemas, and should do so when appropriate. When creating a page describing a person, a “person” schema is an obvious choice, but you can also easily add “address” and “organization” to the relevant page elements. That would net you the best SERP results.

8. Image alt text descriptions

Within the <img> tag, the key attribute for SEO is definitely alt. The thing about alt text is that it gets indexed: having your images show up for a certain search query is all about writing a good alt text.

How to use it for SEO

Alt text gives your page a relevance boost, plus an additional opportunity to be displayed for relevant search queries. What you need to do is describe the image in about 125 characters or less. Don’t use the words “picture of” or “image of”: just jump straight to the point and explain, specifically and in some detail, what’s in the image.

Structure of image alt text description
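For illustration, a weak and a strong version of the same alt text (file names invented):

```html
<!-- Vague, wastes the attribute -->
<img src="/img/photo1.jpg" alt="image of a widget">

<!-- Specific and descriptive, no "image of" filler -->
<img src="/img/widget-blue-ceramic.jpg" alt="Handmade blue ceramic widget with brass fittings">
```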

Pro tip

In a situation where you’ve mapped certain keywords to a page with multiple images, the overoptimization threshold is different: you might get penalized for using your target keywords too consistently across a number of alt attributes. A good idea is to choose the image that best reflects what you’re trying to rank for, and put a keyword in its description. Describe the rest of the pictures as naturally and specifically as you can.

Conclusion

SEO is not an isolated practice; it influences every part of a website’s life, webpage creation included. 

Sure, there is no golden rule to writing HTML tags, no “trick” that would guarantee you a top ranking — what’s important is the accumulation of the best SEO practices. 

The next step after implementing the advice described here would be to switch wholesale to HTML5 semantic tags. These elements help the search engines sort out which element occupies which semantic place on the webpage.

By carefully optimizing your tags you get an opportunity to communicate with browsers and search engines directly, and this is something you need to be proactive about.

Aleh is the Founder and CMO at SEO PowerSuite and Awario. He can be found on Twitter at @ab80.

The post Eight HTML elements crucial for SEO appeared first on Search Engine Watch.

Search Engine Watch