- Google recently rolled out the “Full Coverage” feature for mobile SERPs
- Will this impact SEO traffic for news sites, SEO best practices, and content strategies?
- Here’s what in-house SEOs from The LA Times, New York Times, Conde Nast, Wall Street Journal, and prominent agency-side SEOs foresee
Google’s “Full Coverage” update rolled out earlier this month – but what does it really mean for news-SEOs? In-house SEOs from The LA Times, New York Times, Conde Nast, Wall Street Journal, and prominent agency-side SEOs weigh in.
As a news-SEO person myself, I was eager to get my peers’ opinions on:
- Whether this feature will result in greater SEO traffic for news sites
- Whether editorial SEO best practices and content strategies will evolve because of it
- Whether it will lead to closer working relationships between SEO and editorial teams
- Or whether everything will remain “business as usual”
ICYMI: Google’s new “Full Coverage” feature in mobile search
Google added the “full coverage” feature to its mobile search functionality earlier this month – with the aim of making it easier for users to explore content related to developing news stories from a diverse set of publishers, perspectives, and media slants.
Just below the “Top Stories” carousel, users will now begin seeing the option to tap into “Full Coverage”/“More news on…” for developing news stories. The news stories on this page will be organized in a variety of sub-news topics (versus one running list of stories like we’re used to seeing), such as:
- Top news
- Local news
- Beyond the headlines, and more
Take a look at it in action here:
While the concept of Google “Full Coverage” dates back to 2018, it originally pertained strictly to the Google News site and app. The underlying technology, temporal co-locality, works by mapping the relationships between entities – understanding the people, places, and things in a story as it evolves – and then organizing coverage around storylines, all in real time, to provide “full coverage” of the topic searched for.
The launch of Google’s new “Full Coverage” feature in mobile search, specifically, is exciting because it takes that technology a step further: it can detect long-running news stories that span anywhere from several days, like the Super Bowl, to many weeks or months, like the pandemic. The feature is currently available to English speakers in the U.S. and will be rolled out to additional languages and locations over the next few months.
What five news-SEO experts think about “Full Coverage” in mobile search
1. Lily Ray, Senior Director, SEO & Head of Organic Research at Path Interactive
Lily Ray is a Senior SEO Director at Path Interactive in New York. She’s a prominent voice within the SEO community (with 15K+ followers on Twitter) and has been nominated for multiple search marketing awards throughout her career. She is well known for her E-A-T expertise. Here’s what she had to say:
“Full Coverage appears to be another new tool in Google’s arsenal for displaying a diversity of perspectives and viewpoints on recent news and events. It’s a good thing for publisher sites because it represents another opportunity to have news content surfaced organically. It may also serve as a way for niche or local publishers to gain more visibility in organic search, since Google is specifically aiming to show a broader range of viewpoints that may not always come across with the major publications.
Hopefully, Google will allow us to be able to monitor the performance of Full Coverage via either Search Console or Google Analytics, so we can segment out how our articles do in this area compared to in other areas of search.”
2. Louisa Frahm, SEO Editor at The LA Times
Louisa Frahm currently serves as the SEO Editor at the Los Angeles Times and is also pursuing a master’s degree in communication management at the University of Southern California. Prior to the LA Times, Frahm was an SEO strategist at other high-profile digital publications including Entertainment Weekly, People Magazine, TMZ, Yahoo!, and E! Online. Here’s her take:
“I’ve always liked that element of Google News. It taps into readers (like me!) who are consistently hungry for more information.
Working in the journalism field, I’m always in favor of readers utilizing a diverse array of news sources. I’m glad that this new update will tap into that. I’m interested to see which stories will fall into the “develop over a period of time” criteria. I could see it working well for extended themes like COVID-19, but big breakout themes like Harry and Meghan could also potentially fit that bill.
A wide variety of story topics have resulted from that Oprah interview, and fresh angles keep flowing in! As we’re in the thick of 2021 awards season, I could also see the Golden Globes, Grammys, and Oscars playing into this with their respective news cycles before, during, and after the events.
The long-term aspect of this update inspires me to request more updates from writers on recurring themes, so we can connect with the types of topics this particular feature likes. Though pure breaking news stories with short traffic life cycles will always be important for news SEO, this feature reinforces the additional importance of more evergreen long-term content within a publisher’s content strategy.
I could see this update providing a traffic boost, since it provides one more way for stories to get in front of readers. We always want as many eyeballs as possible on our content. Happy to add one more element to my news SEO tool kit. Google always keeps us on our toes!”
3. Barry Adams, Founder of Polemic Digital
Barry Adams is the founder of the SEO consultancy Polemic Digital. He has earned numerous search marketing awards throughout his career and has spoken at several industry conferences. His company has helped news and publishing companies such as The Guardian, The Sun, FOX News, and TechRadar, to name a few. This is his opinion:
“The introduction of Full Coverage directly into search results will theoretically mean there’s one less click for users to make when trying to find the full breadth of reporting on a news topic.
Whether this actually results in significantly more traffic for publishers is doubtful. The users who are interested in reading a broad range of sources on a news story will already have adopted such click behaviour via the news tab or directly through Google News.
This removal of one layer of friction between the SERP and a larger number of news stories seems more intended as a way for Google to emphasize its commitment to showing news from all kinds of publishers – the fact remains that the initial Top Stories box is where the vast majority of clicks happen. This Full Coverage option won’t change that.”
4. John Shehata, Global VP of Audience Development Strategy at Conde Nast, Founder of NewzDash News SEO
John Shehata is the Global VP of Audience Development Strategy at Conde Nast, the media company known for brands such as Architectural Digest, Allure, Vanity Fair, and Vogue. He’s also the founder of NewzDash News SEO, a news and editorial SEO tool that helps publishers and news sites boost their visibility and traffic in Google Search. This is his opinion:
“Google has been surfacing more news stories on its SERPs over the past few years: first Top Stories was a two-to-three-link box, then it became a 10-link carousel. Google then started grouping related stories together, expanding the Top Stories carousel from one to three, featuring up to 30 news stories. It also introduced local news carousels for some local queries, [and now, this new feature]. It is obvious that Google keeps testing different formats when it comes to news. One of our top news trends and predictions for 2021 is that Google will continue to introduce multiple different formats in the SERPs beyond the Top Stories article format.
As for the impact on traffic back to publishers, it is a bit early to predict, but I do not expect much of a boost in traffic. Don’t get me wrong, this feature provides more chances for more publishers to be seen; the question is how many search users will click. And if users click, Google surfaces over 50 news links plus tweets, which makes it even more competitive for publishers to get clicks back to their stories.
I did some quick analysis back in July of last year, when Google Search Console started providing News tab data. I found that News impressions are less than five percent of total web impressions. I’m not quite sure what the new “Full Coverage” feature’s CTR will be or how many users will click! The “Full Coverage” link placement is better than the tabs, though, so we might see a higher CTR.”
5. Claudio Cabrera, Deputy Audience Director, News SEO at The New York Times
Claudio Cabrera serves as the Deputy Audience Director of News SEO at the New York Times. He is an award-winning audience development expert, journalist, and educator. Prior to working at The New York Times, he was Director of Social and Search strategy at CBS Local. Here are his thoughts:
“It can be looked at in so many ways. Some brands will look at it as an opportunity to gain more visibility while some will feel their strong foothold may be lost. I think it just encourages better journalism and even better SEO because it forces us to think outside of our playbooks and adjust on some level to what we’re seeing Google provide users.
From a site traffic perspective, I can’t really comment on whether this has affected us or not, but I do know there are so many other areas, like Discover, that sites have done serious research and testing on, where audiences can grow and be picked up if you do see a drop-off. I don’t think the best practices of SEO change too much, but I think the relationship between search experts and editors deepens and becomes even closer due to the changes in the algo.”
Google’s new “Full Coverage” feature in mobile search rolled out earlier this month and is an extension of the full coverage function developed for Google News back in 2018. The aim of this new feature is to help users gain a holistic understanding of complex news stories as they develop – by organizing editorial content in such a way that it goes beyond the top headlines and media outlets. In essence, giving users the “full coverage” of the event.
News-SEO experts seem to be in agreement that this new feature will make it simpler for users to explore – and gain a holistic understanding of – trending news stories. As far as what this new feature means for SEO traffic and strategy, experts can only speculate until more developing news stories emerge and we can analyze impact.
Elizabeth Lefelstein is an SEO consultant based in Los Angeles, California. She’s worked with a variety of high-profile brands throughout her career and is passionate about technical SEO, editorial SEO, and blogging. She can be found on LinkedIn and Twitter @lefelstein.
The post What five news-SEO experts make of Google’s new, “Full Coverage” feature in mobile search results appeared first on Search Engine Watch.
- Knowledge gap stands as the biggest challenge for AI technology adoption and implementation
- Our AI Summit 2020 is a cost-free event that aims to equip marketers with the much-needed knowledge to adopt AI, realize AI’s true power, and know how to build strategies that create huge competitive advantages.
- Brian Solis, IBM Watson Advertising, Adobe and Esri are our headline speakers
- More details on why marketers can’t afford to miss this golden opportunity
Artificial intelligence (AI) has long been hailed as an “industry game-changer” but has remained more jargon than actual hands-on technology.
While the AI market continues to grow rapidly – it is expected to grow from $28.42 billion in 2019 to $40.74 billion in 2020 at a CAGR of 43.39% – we observed that the knowledge gap stands as one of the biggest challenges for AI technology adoption and implementation, and our AI Summit 2020 aims to help businesses address exactly that challenge.
For a better idea, these quick facts perfectly display the AI-related challenges faced:
- According to Gartner, only one in 25 CIOs reported applying AI in their business verticals
- Retailers that implemented machine learning for personalization gained 2X compared to retailers that did not
- According to a McKinsey survey, only 8% of respondents across industries said their AI-relevant data are accessible by systems across the organization
- Only 3% of an organization’s data meet the quality standards needed for analytics
About the ClickZ AI Summit 2020
Our AI Virtual Summit on June 25 is a half-day event that aims to equip marketers with the much-needed knowledge to adopt and realize AI’s true power, and to know how to build strategies that create huge competitive advantages.
AI is the next dream boat that marketers need to be on in order to stay ahead of the curve. Why?
- Better customer experiences
- Lower CPAs
- More profitable and customer-focused business
Our event headliners help you become AI confident and AI ready
Leading experts along with cutting edge AI technology providers will enable you to discover the realistic power of AI, what you should be doing/using right now, and explore what’s next.
Brian Solis is a world-renowned digital anthropologist and futurist. He is also an award-winning author and global keynote speaker.
Brian’s research, advisory and presentations humanize the relationship between disruptive innovation and its impact on institutions, markets and societies.
He not only helps audiences understand what’s happening and why, he visualizes future trends and inspires people to take leading roles in defining the future they want to see.
Brian serves as Global Innovation Evangelist at Salesforce. His work focuses on thought leadership and research that explores digital transformation, innovation and disruption, CX, commerce, and the cognitive enterprise.
Dave Neway is the head of product marketing at IBM Watson Advertising (formerly The Weather Company’s ad sales business).
Watson Advertising offers marketers and agencies a suite of media, data, and AI technology solutions to help improve decision-making and reduce costs across key facets of the marketing lifecycle – from media planning through measurement.
In this role, Neway is responsible for ideating the go-to-market strategy for all Watson Advertising offerings. He works closely with the offering management team and key stakeholders to position, price, and present Watson Advertising’s products across media, data and technology categories to the marketplace.
Previously, Neway was director of sales strategy, where he created, developed, and executed plans to drive business across consumer packaged goods, pharmaceuticals, and financial services.
Tim Waddell is Director of Product Marketing for Adobe Experience Platform.
He has been with Adobe since 2009 working on a variety of projects, but always with a passion for audience activation built on rich customer profiles. Tim brings significant experience in the online and traditional marketing disciplines from both the customer and agency perspectives.
Prior to Adobe, Tim built and managed the Bing marketing analytics team at Microsoft. He also managed MSN’s commerce team, driving the demand generation program and developed packaging solutions for partners. His online experience began with the launch of Travelocity, managing the advertising and sales efforts.
Robert Yocum is Marketing Technologist at Esri, an international supplier of geographic information system software, web GIS and geodatabase management applications.
Robert functions across the Marketing Technology suite to integrate and use tools to advance the capabilities and maturation of the overall Marketing Department. He works with Change Enablement, Data and Analytics, IT, and marketing groups across the enterprise to create, prioritize, and implement new capabilities to advance digital marketing best practices.
To book your seat for the AI Virtual Summit on June 25, sign up free of charge here.
The post ClickZ AI Summit 2020: Where industry experts bridge the knowledge gap appeared first on Search Engine Watch.
Over two dozen encryption experts call on India to rethink changes to its intermediary liability rules
Security and encryption experts from around the world are joining a number of organizations to call on India to reconsider its proposed amendments to local intermediary liability rules.
In an open letter to India’s IT Minister Ravi Shankar Prasad on Thursday, 27 security and cryptography experts warned the Indian government that if it goes ahead with its originally proposed changes to the law, it could weaken security and limit the use of strong encryption on the internet.
The Indian government proposed (PDF) a series of changes to its intermediary liability rules in late December 2018 that, if enforced, would require millions of services operated by anyone from small and medium businesses to large corporate giants such as Facebook and Google to make significant changes.
The originally proposed rules say that intermediaries — which the government defines as those services that facilitate communication between two or more users and have five million or more users in India — will have to proactively monitor and filter their users’ content and be able to trace the originator of questionable content to avoid assuming full liability for their users’ actions.
“By tying intermediaries’ protection from liability to their ability to monitor communications being sent across their platforms or systems, the amendments would limit the use of end-to-end encryption and encourage others to weaken existing security measures,” the experts wrote in the letter, coordinated by the Internet Society.
With end-to-end encryption, there is no way for the service provider to access its users’ decrypted content, they said. Some of these experts include individuals who work at Google, Twitter, Access Now, Tor Project and World Wide Web Consortium.
“This means that services using end-to-end encryption cannot provide the level of monitoring required in the proposed amendments. Whether it’s through putting a ‘backdoor’ in an encryption protocol, storing cryptographic keys in escrow, adding silent users to group messages, or some other method, there is no way to create ‘exceptional access’ for some without weakening the security of the system for all,” they added.
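The property the experts describe, that a relaying service sees only ciphertext it cannot decrypt, can be shown with a toy sketch. This is purely illustrative: it uses a one-time-pad XOR in place of a real cipher, and real end-to-end systems use authenticated encryption (such as AES-GCM) with keys negotiated by dedicated protocols. Nothing here reflects any actual messaging product.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: a toy stand-in for a real authenticated cipher
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # held only by sender and recipient

ciphertext = encrypt(message, key)       # this is all the server ever relays
assert decrypt(ciphertext, key) == message  # only a key holder can recover it
```

Because the server never holds `key`, it cannot comply with a mandate to monitor message content without the protocol itself being weakened, which is precisely the experts' objection.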
Technology giants have so far enjoyed what is known as “safe harbor” laws. The laws, currently applicable in the U.S. under the Communications Decency Act and India under its 2000 Information Technology Act, say that tech platforms won’t be held liable for the things their users share on the platform.
Many organizations have expressed in recent days their reservations about the proposed changes to the law. Earlier this week, Mozilla, GitHub and Cloudflare requested the Indian government to be transparent about the proposals that they have made to the intermediary liability rules. Nobody outside the Indian government has seen the current draft of the proposal, which it plans to submit to India’s Supreme Court for approval by January 15.
Among the concerns raised by some is the vague definition of “intermediary” itself. Critics say the last publicly known version of the draft had an extremely broad definition of the term “intermediary,” that would be applicable to a wide-range of service providers, including popular instant messaging clients, internet service providers, cyber cafes and even Wikipedia.
Amanda Keton, general counsel of Wikimedia Foundation, requested the Indian government late last month to rethink the requirement to bring “traceability” on online communication, as doing so, she warned, would interfere with the ability of Wikipedia contributors to freely participate in the project.
A senior executive with an American technology company, who requested anonymity, told TechCrunch on Wednesday that even though the proposed changes to the intermediary guidelines need major revisions, it is high time the Indian government looked into the issue at all.
“Activity on social media platforms and instant communications services is causing damage in the real world. The spread of hoaxes has cost us at least 30 lives. If tomorrow someone’s sensitive photos and messages leak on the internet, there is currently little they can expect from their service providers. We need a law to deal with the modern internet’s challenges,” he said.
In this video, Hanapin’s Danielle Gonzales and John Williams discuss the future of PPC and what’s on their wishlists for 2020.
Read more at PPCHero.com
It’s that time of year again! The time to release our newest edition of the Top 25 and honor some of the hardest workers in our tight-knit PPC community. Find out the 2019 Top 25 Most Influential PPC Experts in the world.
For the first time in the history of the Top 25 list, we decided to announce a Top 50. The Top 50 is based on votes only…meaning only those with the most votes got in the Top 50 to be scored for the final Top 25 list. Find out who made the Top 50!
In what appears to be the latest salvo in a new, wired form of protest, developer Sam Lavigne posted code that scrapes LinkedIn to find Immigration and Customs Enforcement employee accounts. His code, which is basically a Python-based tool that scans LinkedIn for keywords, is gone from GitHub and GitLab, and Medium took down his original post. The CSV of the data is still available here and here and WikiLeaks has posted a mirror.
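The approach described above is, at its core, just keyword matching over public profile data. Here is a minimal, generic sketch of that idea, run against a hypothetical CSV; the field names and sample data are invented for illustration and are not drawn from Lavigne's actual code or any real dataset:

```python
import csv
import io

def find_matches(csv_text: str, keyword: str, field: str):
    """Return rows whose given field contains the keyword (case-insensitive)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader
            if keyword.lower() in row.get(field, "").lower()]

# Hypothetical sample data for demonstration only.
sample = "name,employer\nA. Smith,Acme Corp\nB. Jones,Widget Inc\n"
matches = find_matches(sample, "acme", "employer")
```

The triviality of the technique is part of the story: the same few lines serve a recruiter, a salesperson, or a doxxer; only the keyword and the intent differ.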
“I find it helpful to remember that as much as internet companies use data to spy on and exploit their users, we can at times reverse the story, and leverage those very same online platforms as a means to investigate or even undermine entrenched power structures. It’s a strange side effect of our reliance on private companies and semi-public platforms to mediate nearly all aspects of our lives. We don’t necessarily need to wait for the next Snowden-style revelation to scrutinize the powerful — so much is already hiding in plain sight,” said Lavigne.
Doxxing is the process of using publicly available information to target someone online for abuse. Because we can now find out almost anything about anyone for a few dollars – a search for “background check” brings up dozens of paid services that can surface names and addresses in seconds – scraping public data on LinkedIn seems comparatively easy and innocuous. That doesn’t make it legal.
“Recent efforts to outlaw doxxing at the national level (like the Online Safety Modernization Act of 2017) have stalled in committee, so it’s not strictly illegal,” said James Slaby, Security Expert at Acronis. “But LinkedIn and other social networks usually consider it a violation of their terms of service to scrape their data for personal use. The question of fairness is trickier: doxxing is often justified as a rare tool that the powerless can use against the powerful to call attention to perceived injustices.”
“The problem is that doxxing is a crude tool. The torrent of online ridicule, abuse and threats that can be heaped on doxxed targets by their political or ideological opponents can also rain down on unintended and undeserving targets: family members, friends, people with similar names or appearances,” he said.
The tool itself isn’t to blame. No one would fault a job seeker or salesperson who scraped LinkedIn for targeted employees of a specific company. That said, scraping and publicly shaming employees walks a thin line.
“In my opinion, the professor who developed this scraper tool isn’t breaking the law, as it’s perfectly legal to search the web for publicly available information,” said David Kennedy, CEO of TrustedSec. “This is known in the security space as ‘open source intelligence’ collection, and scrapers are just one way to do it. That said, it is concerning to see ICE agents doxxed in this way. I understand emotions are running high on both sides of this debate, but we don’t want to increase the physical security risks to our law enforcement officers.”
“The decision by Twitter, Github and Medium to block the dissemination of this information and tracking tool makes sense – in fact, law enforcement agents’ personal information is often protected. This isn’t going to go away anytime soon, it’s only going to become more aggressive, particularly as more people grow comfortable with using the darknet and the many available hacking tools for sale in these underground forums. Law enforcement agents need to take note of this, and be much more careful about what (and how often) they post online.”
Ultimately, doxxing is problematic. Because we place our information on public forums, there is little to stop anyone from finding and posting it. However, the expectation that people will use our information for good and not evil is swiftly eroding. Today, wrote one security researcher, David Kavanaugh, doxxing is becoming dangerous.
“Going after the people on the ground is like shooting the messenger. Decisions are made by leadership and those are the people we should be going after. Doxxing is akin to a personal attack. Change policy, don’t ruin more lives,” he said.