By 2025, 463 exabytes of data will be created each day, according to some estimates. (For perspective, one exabyte of storage could hold 50,000 years of DVD-quality video.) It’s now easier than ever to translate physical and digital actions into data, and businesses of all types have raced to amass as much data as possible in order to gain a competitive edge.
However, in our collective infatuation with data (and obtaining more of it), what’s often overlooked is the role that storytelling plays in extracting real value from data.
The reality is that data by itself is insufficient to really influence human behavior. Whether the goal is to improve a business’ bottom line or convince people to stay home amid a pandemic, it’s the narrative that compels action, rather than the numbers alone. As more data is collected and analyzed, communication and storytelling will become even more integral in the data science discipline because of their role in separating the signal from the noise.
Data alone doesn’t spur innovation — rather, it’s data-driven storytelling that helps uncover hidden trends, powers personalization, and streamlines processes.
Yet this can be an area where data scientists struggle. In Anaconda’s 2020 State of Data Science survey of more than 2,300 data scientists, nearly a quarter of respondents said that their data science or machine learning (ML) teams lacked communication skills. This may be one reason why roughly 40% of respondents said they were able to effectively demonstrate business impact “only sometimes” or “almost never.”
The best data practitioners must be as skilled in storytelling as they are in coding and deploying models — and yes, this extends beyond creating visualizations to accompany reports. Here are some recommendations for how data scientists can situate their results within larger contextual narratives.
Make the abstract more tangible
Ever-growing datasets help machine learning models better understand the scope of a problem space, but more data does not necessarily help with human comprehension. Even for the most left-brain of thinkers, it’s not in our nature to understand large abstract numbers or things like marginal improvements in accuracy. This is why it’s important to include points of reference in your storytelling that make data tangible.
For example, throughout the pandemic, we’ve been bombarded with countless statistics around case counts, death rates, positivity rates, and more. While all of this data is important, tools like interactive maps and conversations around reproduction numbers are more effective than massive data dumps in terms of providing context, conveying risk, and, consequently, helping change behaviors as needed. In working with numbers, data practitioners have a responsibility to provide the necessary structure so that the data can be understood by the intended audience.
Last year, we introduced Consent Mode, a beta feature to help advertisers operating in the European Economic Area and the United Kingdom take a privacy-first approach to digital marketing. When a user doesn’t consent to ads cookies or analytics cookies, Consent Mode automatically adjusts the relevant Google tags’ behavior to not read or write cookies for advertising or analytics purposes. This enables advertisers to respect user choice while helping them still capture some campaign insights.
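On the tag side, this behavior is driven by the documented `gtag('consent', …)` API. Here is a minimal sketch of a default-deny setup that is updated after the user accepts a cookie banner; the `dataLayer` shim stands in for the full Google tag snippet, and the banner flow itself is illustrative:

```javascript
// Shim for the dataLayer queue that the Google tag snippet normally
// creates on the page; gtag() just records commands for the tags to read.
var dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// Deny storage by default, before any tags fire, until the user chooses.
gtag('consent', 'default', {
  ad_storage: 'denied',
  analytics_storage: 'denied'
});

// Later, once the user accepts the cookie banner:
gtag('consent', 'update', {
  ad_storage: 'granted',
  analytics_storage: 'granted'
});
```

With this in place, Google tags adjust their own behavior based on the recorded consent state rather than requiring the site to gate each tag manually.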
Without cookies, advertisers experience a gap in their measurement and lose visibility into user paths on their site. They are no longer able to directly tie users’ ad interactions to conversions, whether the users are repeat visitors or whether those users have arrived from paid or organic traffic sources. To help close this gap, we’re introducing conversion modeling through Consent Mode. This will help marketers preserve online measurement capabilities, using a privacy-first approach.
Now, Consent Mode will enable conversion modeling to recover the attribution between ad-click events and conversions measured in Google Ads. Early results from Google Ads have shown that, on average, conversion modeling through Consent Mode recovers more than 70% of ad-click-to-conversion journeys lost due to user cookie consent choices. Results for each advertiser may vary widely, depending primarily on user cookie consent rates and the advertiser’s Consent Mode setup.
How modeling fills in measurement gaps
Conversion modeling can help fill in blanks in media measurement at times when it’s not possible to observe the path between ad interactions and conversions. Conversion modeling through Consent Mode specifically addresses gaps in observable data from regulations on cookie consent in various regions. Conversion modeling uses machine learning to analyze observable data and historical trends, in order to quantify the relationship between consented and unconsented users. Then, using observable user journeys where users have consented to cookie usage, our models will fill in missing attribution paths. This creates a more complete and accurate view of advertising spend and outcomes — all while respecting user consent choices. Conversion modeling also upholds privacy by not identifying individual users, unlike tactics like fingerprinting which Google has a strict policy against.
Using modeling to probabilistically recover linkages between ad interactions and conversions that would otherwise go unattributed means more conversion insights for optimizing campaign bidding and understanding what’s driving sales. It’s important for any modeling approach to account for the fact that people who consent to cookies are likely to convert at a different rate than those who don’t.
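Google has not published the internals of its model, but the core idea described above can be sketched with a toy calculation. Everything here is illustrative, not Google's actual method: the observed conversion rate among consented users is scaled to unconsented clicks, with an explicit bias factor because consenters may convert at a different rate than non-consenters.

```javascript
// Toy sketch of consent-gap conversion modeling (illustrative only).
function modelConversions(consentedClicks, consentedConversions,
                          unconsentedClicks, consentBiasFactor) {
  // Conversion rate actually observed among users who consented.
  const observedRate = consentedConversions / consentedClicks;
  // consentBiasFactor < 1 assumes non-consenters convert less often.
  const modeled = Math.round(unconsentedClicks * observedRate * consentBiasFactor);
  // Total reported conversions = observed + modeled.
  return consentedConversions + modeled;
}

// 10,000 consented clicks with 300 conversions, plus 4,000 clicks that
// could not be observed through to conversion:
console.log(modelConversions(10000, 300, 4000, 0.8)); // → 396
```

The real system learns the relationship between consented and unconsented users from historical trends rather than using a fixed factor, but the accounting is the same: modeled conversions are added on top of observed ones.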
Holistic measurement for your Google Ads campaigns
It’s important for advertisers to have accurate reporting so they can make their marketing investments go further. Advertisers using Consent Mode will now see their reports in Google Ads updated: for Search, Shopping, Display, and Video campaigns, the “Conversions,” “All conversions” and “Conversion value” columns will now include modeled conversions for consent gaps. All other Google Ads campaign performance reports that use conversion data will also reflect the impact from adding in modeled conversions.
Modeled conversions through Consent Mode will be integrated directly in your Google Ads campaign reports with the same granularity as observed conversions. This data then makes its way into Google’s bidding tools so that you can be confident your campaigns will be optimized based on a full view of your results.
Advertisers who want to optimize their campaigns based on return on ad spend or cost per acquisition can use the Target Return on Ad Spend (tROAS) or Target Cost Per Acquisition (tCPA) Smart Bidding strategies with Consent Mode. If you had previously adjusted targets to account for cookie consent changes, you can now go back to setting targets in line with your ROI goals. Note that you’re likely to see gradual improvements in reported performance as we recover lost conversions through modeling.
For advertisers who want to maintain their campaign spend, conversion modeling through Consent Mode also works with the Maximize conversions or Maximize conversion value Smart Bidding strategies in Google Ads. We recommend you make sure that the budget you’ve decided on is well-aligned with your spend goals.
If you’re an advertiser operating in the European Economic Area or the United Kingdom, have implemented Consent Mode and are using Google Ads conversion tracking, conversion modeling from Consent Mode is available for you today.
And if you aren’t using Consent Mode yet, you have two options to get started. You can implement it yourself on your website by following our instructions. Or if you need some extra help, we’ve partnered closely with several Consent Management Platforms, a few of which already take care of critical implementation steps on behalf of advertisers.
We are continuously adding new privacy-forward techniques to help our machine learning solutions better understand the aggregate behavior of non-consenting users, and offer actionable insights in reporting for deeper clarity on your marketing spend. We’ll be bringing conversion modeling through Consent Mode to other Google advertising products, like Campaign Manager 360, Display & Video 360 and Search Ads 360 later this year.
- In 2020, the majority of Google’s 181.7 billion U.S. dollars in revenue came from advertising through Google Sites or its network sites
- Even though Google will remove the third-party cookie from 2022, the search giant still has a wealth of first-party data from its 270+ products, services, and platforms
- The Trade Desk’s 20 percent stock price drop is proof of Google’s monopoly, and of why it shouldn’t enjoy it any longer
- Google expert Susan Dolan draws on her rich experience to detail the current search landscape and predict the key themes that will arise from the death of the third-party cookie
Imagine search as a jungle gym, and you automatically picture Google as the kingpin player on this ground. This has been the reality for decades, and we all know the downside of such dominance, which is why the industry now acknowledges a need for regulation. Google announced that it would remove the third-party cookie from 2022. But a lot can happen in a year; 2020 is proof of that! Does this mean that cookies will completely bite the dust? Think again. Below, I draw on years of experience with the web to share some thoughts, observations, and insights on what this really means.
For once, Google is a laggard
Given the monopoly that Google has enjoyed, and the list of lawsuits against it (the antitrust case among others), this move is a regulatory step toward a “net-vironment” that feels less like a net: one driven by transparency and equality in the search landscape.
But Firefox and Safari had already beaten Google to the punch, in 2019 and 2020 respectively. Firefox launched its Enhanced Tracking Protection feature in September 2019 to protect users from third-party tracking cookies and cryptominers, and Safari followed with its Intelligent Tracking Prevention (ITP) update on March 23, 2020.
Google’s solution to respect user privacy
Google recently announced that it won’t be using identifiers. Google is developing a “Privacy Sandbox” to ensure that publishers, advertisers, and consumers find a fair middle ground in terms of data control, access, and tracking. The idea is to protect anonymity while still delivering results for advertisers and publishers. The Privacy Sandbox will include the FLoC (Federated Learning of Cohorts) API to support interest-based advertising. Google says it will not use fingerprinting or the email-based PII graphs that some others in the industry rely on; instead, it will move toward a Facebook-like “lookalike audience” model that groups users into cohorts for targeting.
Did that raise eyebrows? There’s more.
Don’t be fooled – They still have a lavish spread of first-party data
Google is already rich with clusters of historical, individual unique data that they’ve stored, analyzed, predicted, and mastered over the years and across their platforms and services. These statistics give you a clear sense of the gravity of the situation:
- Google has 270+ products and services (Source)
- Among the leading search engines, the worldwide market share of Google in January 2021 was almost 86 percent (Source)
- In 2020, the majority of Google’s 181.7 billion U.S. dollars in revenue came from advertising through Google Sites or Google Network Sites (Source)
- There are 246 million unique Google users in the US (Source)
- Google Photos has over one billion active users (Source)
- YouTube has over 1.9 billion active users each month (Source)
- According to Google statistics, Gmail has more than 1.5 billion active users (Source)
- A less-known fact, there are more than two million accounts on Google Ads (Source)
- There are more than 2.9 million companies that use one or more of Google’s marketing services (Source)
- As of Jan 2021, Google’s branch out into the Android system has won it a whopping 72 percent of the global smartphone operating system market (Source)
- Google sees 3.5 billion searches per day and 1.2 trillion searches per year worldwide (Source)
Google has an almost never-ending spectrum of products, services, and platforms. Here’s the complete list of everything under Google’s gigantic umbrella.
Google already has access to your:
- Search history
- Credit/debit card details shared on Google Pay
- Data from businesses (more than 2.9 million!) that use Google services
- Device microphone
- Mobile keyboard (G-board)
- Apps you download from the Google Playstore and grant access to
- Device camera, and that’s not even the tip of the iceberg
Google’s decision to eliminate the third-party cookie dropped The Trade Desk’s stock by 20 percent
Nobody should have a monopoly, and this incident serves as noteworthy proof. Google’s decision to drop third-party cookies shocked The Trade Desk’s stock price, causing a 20 percent slump in its value. The Trade Desk is the largest demand-side platform (DSP), and Google’s decision kills demand for The Trade Desk’s proprietary Unified ID 1.0 (UID 1.0), a unique asset that eliminated the need for the cookie-syncing process and delivered match-rate accuracy of up to 99 percent.
Google’s statement on not using PII also jeopardizes the fate of The Trade Desk’s Unified ID 2.0, which already has more than 50 million users.
Here’s what Dave Pickles, The Trade Desk’s Co-Founder and Chief Technology Officer, had to say:
“Unified ID 2.0 is a broad industry collaboration that includes publishers, advertisers and all players in the ad tech ecosystem.”
“UID provides an opportunity to have conversations with consumers and provide them with the sort of transparency we as an industry have been trying to provide for a really long time.”
Adweek’s March town hall saw advertisers and publishers haunted by the mystery that surrounds Google, as Google declined to participate in the event. The industry is growing wary that Google will use this change as a new way to establish market dominance that feeds its own interests.
We love cookies (only when they’re on a plate)
Cookies are annoying because they leave crumbs everywhere… on the internet! Here’s how people feel about being tracked on the web:
- 72 percent of people feel that almost everything they do online is being tracked by advertisers, technology firms or other companies
- 81 percent say that the potential risks of data collection outweigh the benefits for them
These stats were originally sourced from Pew Research Center but, ironically, I found them on one of Google’s blogs.
On a hunt to escape these cookies, or to understand the world’s largest “cookie jar,” I checked out YouTube, which seemed like a good place to start since it has over 1.9 billion monthly active users. You can visit this link to see how ads are personalized for you – the list is long!
My YouTube curiosity further landed me on this page, where I could see how my cookies are shared (you can opt out of these). Even my least-used account had 129 websites on the list; imagine how many sites are accessing your data right now.
Back in 2011, when I became the first to crack the PageRank algorithm, I could already sense the power Google held and where this giant was headed – the playground just wasn’t big enough.
Key themes that will emerge
The bottom line is that the death of the cookie is opening up conversations about advertising transparency and a web that is user-first and privacy-compliant. Here’s what I foresee happening in search and the digital sphere:
- Ethical consumer targeting
- Adtech companies collaborating to find ways that respect their audience’s privacy
- A more private, personalized web
- More conversations around how much and what data collection is ethical
- More user-led choices
- Rise in the usage of alternative browsers
- Incentivizing users to voluntarily share their data
- Better use of technology for good
What do you think about the current climate on the internet? Join the conversation with me on @GoogleExpertUK.
Susan Dolan is a Search Engine Optimization consultant and the first person to crack the Google PageRank algorithm, as confirmed by Eric Schmidt’s office in 2014. Susan is also the CEO of The Peoples Hub, which was built to help people and to love the planet.
The post The search dilemma: looking beyond Google’s third-party cookie death appeared first on Search Engine Watch.
Instagram today will begin a new test around hiding Like counts on users’ posts, following its experiments in this area which first began in 2019. This time, however, Instagram is not enabling or disabling the feature for more users. Instead, it will begin to explore a new option where users get to decide what works best for them — either choosing to see the Like counts on others’ posts, or not. Users will also be able to turn off Like counts on their own posts, if they choose. Facebook additionally confirmed it will begin to test a similar experience on its own social network.
Instagram says tests involving Like counts were deprioritized after Covid-19 hit, as the company focused on other efforts needed to support its community. (Except for that brief period this March where Instagram accidentally hid Likes for more users due to a bug.)
The company says it’s now revisiting the feedback it collected from users during the tests and found a wide range of opinions. Originally, the idea with hiding Like counts was about reducing the anxiety and embarrassment that surrounds posting content on the social network. That is, people would stress over whether their post would receive enough Likes to be deemed “popular.” This problem was particularly difficult for Instagram’s younger users, who care much more about what their peers think — so much so that they would take down posts that didn’t receive “enough” Likes.
In addition, the removal of Likes helped reduce the sort of herd mentality that drives people to like things that are already popular, as opposed to judging the content for themselves.
But during tests, not everyone agreed the removal of Likes was a change for the better. Some people said they still wanted to see Like counts so they could track what was trending and popular. The argument for keeping Likes was more prevalent among the influencer community, where creators used the metric in order to communicate their value to partners, like brands and advertisers. Here, lower engagement rates on posts could directly translate to lower earnings for these creators.
Both arguments for and against Likes have merit, which is why Instagram’s latest test will put the choice back into users’ own hands.
This new test will be enabled for a small percentage of users globally on Instagram, the company says.
If you’ve been opted in, you’ll find a new option to hide the Likes from within the app’s Settings. This will prevent you from seeing Likes on other people’s posts as you scroll through your Instagram Feed. As a creator, you’ll be able to hide Likes on a per-post basis via the three-dot “…” menu at the top. Even if Likes are disabled publicly, creators are still able to view Like counts and other engagements through analytics, just as they did before.
The tests on Facebook, which has also been testing Like count removals for some time, have not yet begun. Facebook tells TechCrunch those will roll out in the weeks ahead.
Making Like counts a choice may initially seem like it could help address everyone’s needs. But in reality, if the wider influencer community chooses to continue to use Likes as a currency that translates to popularity and job opportunities, then other users will continue to do the same.
Ultimately, communities themselves have to decide what sort of tone they want to set, preferably from the outset — before you’ve attracted millions of users who will be angry when you later try to change course.
There’s also a question as to whether social media users are really hungry for a “Like-free,” safer space. For years we’ve seen startups build an “anti-Instagram” of sorts by dropping one or more Instagram features, like algorithmic feeds, Likes and other engagement mechanisms: Minutiae, Vero, Dayflash, Oggl, and now newcomers like troubled Dispo or under-the-radar Herd. But Instagram has yet to fail because of an anti-Instagram rival. If anything is a threat, it’s a new type of social network entirely, like TikTok, where, it should be noted, getting Likes and engagement is still very important for creator success.
Instagram didn’t say how long the new tests would last or if and when the features would roll out more broadly.
“We’re testing this on Instagram to start, but we’re also exploring a similar experience for Facebook. We will learn from this new small test and have more to share soon,” a Facebook company spokesperson said.
Startup Battlefield — the matriarch of all pitch competitions — is the stuff of tech legend. Heck, it even played a role in the HBO show, “Silicon Valley,” and its influence touches early-stage startups around the globe. Under no circumstance will you find a bigger, better platform for launching your startup to the world.
Battlefield has a long history of producing notable names. Need an example? A little startup by the name of Dropbox competed in the Battlefield at TC50 (the precursor to Disrupt) way back in 2008.
TechCrunch is on the hunt for innovative, game-changing startups to take the Startup Battlefield challenge and wrangle with the best-of-the-best at TC Disrupt 2021 in September. Are you game?
Apply to compete in Startup Battlefield before the deadline closes on May 13 at 11:59 pm (PT).
The stakes: A shot at $100,000 in equity-free prize money. Major exposure for all competing startups — think investors eager to find and fund the next big thing, journalists in search of exciting, game-changing startups to cover, and potential customers and partners who can help take your business to new levels of success.
The investment: Your time. Yup, that’s it. Applying to and participating in Startup Battlefield is 100 percent free. No fees, no equity cut. You simply invest your time — all participating founders receive several weeks of training with the Startup Battlefield team. Your demo and presentation will be, well, pitch perfect when you deliver it to panels of top VC judges. And you’ll be thoroughly prepped to handle the Q&A that follows.
The perks: In addition to the massive interest from just about all Disrupt attendees, competing startups get exhibition space in the Startup Alley expo area, free passes to future TechCrunch events, a free membership to Extra Crunch and invitations to private events like the Startup Battlefield reception.
You’ll meet members of the Startup Battlefield alumni community — we’re talking about 922 companies (like Vurb, Mint, Yammer and, yes, Dropbox) that have collectively raised $9.5 billion and produced 117 exits. Once Disrupt ends, you’re part of this phenomenal community — just imagine the networking possibilities.
The details: Read more about how Startup Battlefield works.
TC Disrupt 2021 takes place September 21-23. If you’ve got an innovative, game-changing startup, apply to compete in Startup Battlefield. Make sure you submit your completed application before the deadline expires on May 13 at 11:59 pm (PT).
Is your company interested in sponsoring or exhibiting at Disrupt 2021? Contact our sponsorship sales team by filling out this form.
SambaNova raises $676M at a $5.1B valuation to double down on cloud-based AI software for enterprises
Artificial intelligence technology holds a huge amount of promise for enterprises — as a tool to process and understand their data more efficiently; as a way to leapfrog into new kinds of services and products; and as a critical stepping stone into whatever the future might hold for their businesses. But the problem for many enterprises is that they are not tech businesses at their cores and so bringing on and using AI will typically involve a lot of heavy lifting. Today, one of the startups building AI services is announcing a big round of funding to help bridge that gap.
SambaNova — a startup building AI hardware and the integrated systems that run on it, which only officially came out of three years in stealth last December — is announcing a huge round of funding today to take its business out into the world. The company has closed $676 million in financing, a Series D that co-founder and CEO Rodrigo Liang has confirmed values the company at $5.1 billion.
The round is being led by SoftBank, which is making the investment via Vision Fund 2. Temasek and the Government of Singapore Investment Corp. (GIC), both new investors, are also participating, along with previous backers BlackRock, Intel Capital, GV (formerly Google Ventures), Walden International and WRVI, among other unnamed investors. (Sidenote: BlackRock and Temasek separately kicked off an investment partnership yesterday, although it’s not clear if this falls into that remit.)
Co-founded by two Stanford professors, Kunle Olukotun and Chris Ré, and Liang, who had been an engineering executive at Oracle, SambaNova has been around since 2017 and has raised more than $1 billion to date — both to build out its AI-focused hardware, which it calls DataScale, and to build out the system that runs on it. (The “Samba” in the name is a reference to Liang’s Brazilian heritage, he said, but also the Latino music and dance that speaks of constant movement and shifting, not unlike the journey AI data regularly needs to take that makes it too complicated and too intensive to run on more traditional systems.)
SambaNova on one level competes for enterprise business against companies like Nvidia, Cerebras Systems and Graphcore — another startup in the space which earlier this year also raised a significant round. However, SambaNova has also taken a slightly different approach to the AI challenge.
In December, the startup launched Dataflow-as-a-service as an on-demand, subscription-based way for enterprises to tap into SambaNova’s AI system, with the focus just on the applications that run on it, without needing to focus on maintaining those systems themselves. It’s the latter that SambaNova will be focusing on selling and delivering with this latest tranche of funding, Liang said.
SambaNova’s opportunity, Liang believes, lies in selling software-based AI systems to enterprises that are keen to adopt more AI into their business, but might lack the talent and other resources to do so if it requires running and maintaining large systems.
“The market right now has a lot of interest in AI. They are finding they have to transition to this way of competing, and it’s no longer acceptable not to be considering it,” said Liang in an interview.
The problem, he said, is that most AI companies “want to talk chips,” yet many would-be customers will lack the teams and appetite to essentially become technology companies to run those services. “Rather than you coming in and thinking about how to hire scientists and hire and then deploy an AI service, you can now subscribe, and bring in that technology overnight. We’re very proud that our technology is pushing the envelope on cases in the industry.”
To be clear, a company will still need data scientists, just not the same number, and specifically not the same number dedicating their time to maintaining systems, updating code and other more incremental work that comes with managing an end-to-end process.
SambaNova has not disclosed many customers so far in the work that it has done — the two reference names it provided to me are both research labs, the Argonne National Laboratory and the Lawrence Livermore National Laboratory — but Liang noted some typical use cases.
One was in imaging, such as in the healthcare industry, where the company’s technology is being used to help train systems based on high-resolution imagery, along with other healthcare-related work. The coincidentally-named Corona supercomputer at the Livermore Lab (it was named after the 2014 lunar eclipse, not the dark cloud of a pandemic that we’re currently living through) is using SambaNova’s technology to help run calculations related to some Covid-19 therapeutic and antiviral compound research, Marshall Choy, the company’s VP of product, told me.
Another set of applications involves building systems around custom language models, for example in specific industries like finance, to process data quicker. And a third is in recommendation algorithms, something that appears in most digital services and frankly could always stand to work a little better than it does today. I’m guessing that in the coming months it will release more information about where and who is using its technology.
Liang also would not comment on whether Google and Intel were specifically tapping SambaNova as a partner in their own AI services, but he didn’t rule out the prospect of partnering to go to market. Indeed, both have strong enterprise businesses that span well beyond technology companies, and so working with a third party that is helping to make even their own AI cores more accessible could be an interesting prospect, and SambaNova’s DataScale (and the Dataflow-as-a-service system) both work using input from frameworks like PyTorch and TensorFlow, so there is a level of integration already there.
“We’re quite comfortable in collaborating with others in this space,” Liang said. “We think the market will be large and will start segmenting. The opportunity for us is in being able to take hold of some of the hardest problems in a much simpler way on their behalf. That is a very valuable proposition.”
The promise of creating a more accessible AI for businesses is one that has eluded quite a few companies to date, so the prospect of finally cracking that nut is one that appeals to investors.
“SambaNova has created a leading systems architecture that is flexible, efficient and scalable. This provides a holistic software and hardware solution for customers and alleviates the additional complexity driven by single technology component solutions,” said Deep Nishar, Senior Managing Partner at SoftBank Investment Advisers, in a statement. “We are excited to partner with Rodrigo and the SambaNova team to support their mission of bringing advanced AI solutions to organizations globally.”
Rising consumer expectations and changing industry regulations have set higher standards for user privacy and data protection. This has led many businesses to revisit how they are managing data in their Google Analytics accounts. To help, Analytics provides businesses with a variety of features to control how their data is used. Here is an updated overview of controls in Analytics that govern how data is collected, stored, and used, all of which can be adjusted at any time.
Three ways businesses can manage data in Google Analytics:
Control the data settings in your account
You can access various settings in your Analytics account to control how you collect, retain, and share data.
Decide if you need to accept the Data Processing Terms.
The optional Data Processing Terms are meant for businesses affected by the European Economic Area General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and other similar regulations. You can review and accept the terms if needed in your Analytics account, under Account Settings.
Anonymize IP addresses for your Web property.
When you enable IP anonymization in your Web property, Analytics will anonymize the addresses as soon as technically feasible. This may be useful for you to comply with your company’s privacy policies or government regulations. For Apps properties and App + Web properties, IP anonymization is enabled by default.
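For a Universal Analytics Web property measured with gtag.js, IP anonymization is enabled per tag with the documented `anonymize_ip` config parameter. A minimal sketch, where the property ID is a placeholder and the `dataLayer` shim stands in for the full Google tag snippet:

```javascript
// Shim for the dataLayer queue normally created by the Google tag snippet.
var dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// Ask Analytics to anonymize IP addresses for this Web property.
// 'UA-XXXXXXX-1' is a placeholder property ID.
gtag('config', 'UA-XXXXXXX-1', { anonymize_ip: true });
```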
Disable some or all data collection.
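One mechanism gtag.js and analytics.js support for this is a window-scoped opt-out flag that must be set before the Analytics tag runs. A sketch, with the browser global stubbed so it runs standalone and a placeholder property ID:

```javascript
// Stub for the browser global so this sketch runs outside a page;
// on a real site you set the property on the actual window object.
const window = {};

// Disable all Analytics data collection for this (placeholder) property.
// This must run before the gtag.js / analytics.js snippet executes.
window['ga-disable-UA-XXXXX-Y'] = true;
```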
Set the data retention period.
You can select how long user-level and event-level data is stored by Analytics, and whether new events can reset that time period. Once that amount of time has passed, the data will be scheduled for automatic deletion from your account and Google’s servers.
Select what data you share with your support team and Google.
The data sharing settings allow you to customize whether to share Analytics data with Google, including whether to allow Google technical support representatives and Google marketing specialists to access your account when you want support using the product or performance recommendations.
Review your Google signals setting.
The Google signals setting allows you to enable additional features in Analytics like remarketing, demographics and interests reports, and Cross Device reports. You can also further customize this setting to keep Google signals enabled for reporting while limiting or disabling advertising personalization.
Choose whether your data is used for ads personalization
Digital advertising helps you reach people online and drive conversions on your app and website. When you enable ads personalization in Analytics, for example by activating Google signals, you gain the ability to use your Analytics audiences to personalize your digital ads, which can improve the performance of your campaigns. You can customize how your Analytics data is used for ads personalization.
Control ads personalization for your entire Analytics property.
You can choose to disable ads personalization for an entire property, which will cause all incoming events for that property to be marked as not for use in ads personalization. You can manage this in the property settings of your account.
Control ads personalization by geography.
If you need to set the ads personalization setting for your property at the geographic level, you now have the ability to enable or disable this setting by country. And in the United States, you can adjust the setting at the state level.
Control ads personalization by event type or user property.
In App + Web properties, you can adjust the ads personalization setting for a specific event type or user property. For example, you can exclude specific events or user properties from being used to personalize ads and only use that data for measurement purposes.
Control ads personalization for an individual event or session.
You can also manage whether an individual event or session is used for ads personalization. For example, if you need to obtain consent before enabling the setting you can dynamically disable ads personalization at the beginning of the session and on each subsequent event until consent is obtained.
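With gtag.js, this kind of dynamic control can be expressed through the `allow_ad_personalization_signals` setting. A sketch (the gtag environment is stubbed so it runs standalone, and the property ID is a placeholder):

```javascript
// Stand-ins for what the standard gtag.js snippet defines on a real page.
const dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// Before consent: flag all subsequent events as not usable for
// ads personalization, then configure the (placeholder) property.
gtag('set', 'allow_ad_personalization_signals', false);
gtag('config', 'UA-XXXXX-Y');

// Later, once the visitor grants consent, flip the signal back on so
// events from that point forward can be used for ads personalization.
function onConsentGranted() {
  gtag('set', 'allow_ad_personalization_signals', true);
}
onConsentGranted();
```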
Independent of these ads personalization controls that Analytics offers to advertisers, users can control their own ads personalization setting for their Google account. Once they’ve turned off this setting, Google will no longer use information about them for ads personalization.
Remove data from Analytics
You can remove your data from Analytics for any reason and at any time. You can request the data to be deleted from the Analytics servers or delete information for a single user.
Request data to be deleted.
If you need to delete data from the Analytics servers, you can submit a request for its removal. There is a seven-day grace period starting from the time you make the request before Analytics will begin the deletion process. All administrators and users with edit permission for your account will be informed of your request and have the ability to cancel the request during the grace period. Similar functionality will be available in App + Web properties soon.
Delete data for individual users.
You are able to delete a single user’s data from your Analytics account. If you have edit permission for the account, you can do this through the User Explorer report in Web properties or the User Explorer technique in the Analysis module in App + Web properties. Data associated with this user will be removed from the report within 72 hours and then deleted from the Analytics servers in the next deletion process. Your reports based on previously aggregated data, for example user counts in the Audience Overview report, won’t be affected. If you need to delete data for multiple users, you can use the Analytics User Deletion API.
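For orientation, here is a sketch of what a request body for the User Deletion API might look like, assuming the v3 `userDeletionRequest:upsert` shape; the client ID and property ID below are placeholders, and the actual call must be sent with OAuth credentials that have edit access to the property:

```javascript
// Build the JSON body for a userDeletionRequest:upsert call to the
// Analytics User Deletion API. All identifiers below are placeholders.
function buildUserDeletionRequest(clientId, webPropertyId) {
  return {
    kind: 'analytics#userDeletionRequest',
    // 'CLIENT_ID' identifies the user by their Analytics client ID;
    // other identifier types exist (e.g. for User-ID setups).
    id: { type: 'CLIENT_ID', userId: clientId },
    webPropertyId: webPropertyId,
  };
}

const body = buildUserDeletionRequest('555.123456789', 'UA-XXXXX-Y');
```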
Delete a property.
All of the above features are available to use right now. For more information, please visit the Help Center.
We hope that you found this overview of current controls helpful. Google Analytics is continuously investing in capabilities to ensure businesses can access durable, privacy-centric, and easy-to-use analytics that work with and without cookies or identifiers. Please stay tuned for more in the coming months.
- The story of SEO and UX began almost 20 years ago with both making a foray into the market in the 1990s
- After years of analyzing data, I found that UX is a critical ranking factor for SEO
- If you’ve exhausted all your SEO techniques but still don’t see a considerable movement on your website or rankings – you’re probably losing at user experience (UX)
- Adobe Research’s Sr. Web Engineer Atul Jindal condenses years of his experience and observations into this SEO guide to help you win at SEO and search experience
I’ve worked with many SEO and CRO campaigns, as well as Fortune 50 companies, over the years. This gave me access to valuable data that helped me understand what is working and what’s not, and analyzing that data showed me that UX is a critical ranking factor for SEO.
The story of SEO and UX began almost 20 years ago with both making a foray into the market in the 1990s. While SEO was widely used as a marketing technique, UX (user experience) concentrated on giving the users an enhanced engaging experience on the website.
If you have exhausted all your SEO techniques but still don’t see considerable movement in your traffic or rankings, you’re probably losing at user experience.
But it is quite difficult to find UX-related issues when you’re only looking at your website from an SEO perspective. You need to look at your website through your users’ (customers’) eyes.
In this guide, I’ll explain UX and guide you on how to implement it into your SEO campaigns to get results.
What is UX?
User experience (UX) is the experience a user has with your website or application. An easy-to-use website will provide a pleasant user experience, but a poorly planned one will deliver a bad user experience.
UX focuses on site architecture, the visitor journey, desktop and mobile layouts, and user flows. In short, user experience is driven by how easy or difficult it is to navigate through the user interface elements that the website designers have created.
User interface (UI) focuses on the graphical layout of any application. It includes several factors such as fonts and design styles, text entry fields, transitions, images, and animation interface. In short, anything visual comes under the umbrella of UI.
It is important to note that UI and UX are two different functionalities. While UI revolves around design layout, UX is the experience of the user on the website while they are navigating the web pages.
Since we have a better understanding of the two, let us further understand how we can successfully implement UX into an SEO campaign.
Why does UX matter in SEO?
In recent years, Google has changed its ranking criteria. There was a time when Google looked mainly at keyword repetition in your content or the number of backlinks your website had.
But now the scenario has completely changed. Google is becoming more user-centric by the day. It uses artificial intelligence (AI), machine learning (ML), natural language processing (NLP), and other modern technologies to understand queries, evaluate pages, and deliver the best possible results.
Google has introduced the E-A-T concept, and signals like search intent, page speed, mobile-friendliness, and dwell time are now ranking factors on Google. All these factors are part of a rich user experience.
A rich user experience is the factor that creates the difference between the first and second positions. Providing a rich user experience helps visitors and encourages them to stay longer and engage more with your website. That sends Google positive quality signals indicating your website is the best result, and Google rewards you with top spots.
How to implement UX into an SEO campaign?
As mentioned above, SEO and UX share common end goals – audience engagement. SEO will answer a person’s query, while UX will take care of their navigational queries once they reach the webpage.
Today, it has become imperative to include the two while designing SEO campaigns or any digital marketing strategy. Google is constantly evolving its user experience and merging effective SEO strategies to give the audience a more meaningful experience.
An excellent example of UX and SEO design is IKEA. We all know what IKEA stands for, but their website forms a story at every step. It guides the user to the correct landing pages and keeps them engaged. The color palette, tags, and categories keep users on the website longer and more engaged.
Source: IKEA. Designed on Canva
Empathy plays a vital role in optimizing your web pages with the right combination of keywords. The days when exact keyword matches were enough to rank well are gone. Today, it is about putting yourself in the searcher’s shoes and thinking from a bigger perspective.
Google has done a great job over the past five years of getting away from ranking signals that can be spammed easily such as links and keyword stuffing.
In other words, understanding your audience’s buying intent and analyzing their search queries will lead to refined and sustainable results.
Let us understand the three most critical factors that influence the SEO + UX ranking.
Understand your audience
It is probably one of the trickiest parts of running any successful campaign – Understanding the target audience.
Most companies spend a considerable amount of time researching the audience before concluding who their right target will be. That is why we have spent a sizable amount of time highlighting its importance.
We have often heard of marketers, businesses, and content creators emphasizing the importance of the right target audience. While sometimes it is more or less commonsensical to grasp the audience’s pulse, there are times when you need to explicitly ask:
- Who is my target audience?
- What do they want?
- What are they searching for?
- How are they looking for the information?
- Did my searcher bounce right away?
- Was there any action taken on the link?
These are key questions Google’s algorithm takes into consideration to understand whether search results align with the searcher’s intent.
For example, Airbnb works on an inclusive design model that concentrates on improving readability across all platforms. Their target audience is clearly defined: travel enthusiasts, people looking for holiday home options, and people looking for holiday hosting solutions. Their focal point has been improving the user experience by leading users to the right landing pages, coupled with catchy CTAs that prompt the user to take an action. Whether you are a host or someone seeking an extraordinary travel experience, their comprehensive holiday solutions make booking a holiday faster and easier.
Source: Airbnb. Designed on Canva
Once you understand your audience completely, and if you are on the first page of Google search results, your page can get clicks and drive action.
UX helps the audience stay glued to the page, while SEO honors the intent that led them to click and land on it. Whatever you do, your focus should always be a satisfying experience for your users. From addressing their color preferences to the layout and messaging, build everything to cater to your customers.
Another critical factor in understanding the audience is user intent. Address it while building a detailed audience persona: is the purpose informational, navigational, transactional, or commercial? In each case, the queries have to be predefined to understand the user’s need.
Understanding the intent of potential visitors landing on your web page through search is another crucial factor in an effective UX and SEO strategy. If your website is not fully optimized with the right set of keywords, there is only a slim chance of it ranking on Google or leading to any action.
For example, imagine searching for the keywords – “How to wear a bowtie?”
The most logical conclusion is that your search will lead you to a tutorial or a video, right? If the same set of keywords is used by an ecommerce site selling bowties, your query will remain unanswered. You may conclude that the website using this keyword is not worth visiting in the future because it applies ‘click-bait’ words to lead a consumer to its website.
But if the person lands on the right page with the instructions clearly outlined, they stay to learn, thus increasing the dwell time and may browse the website for more information. Here your keyword has played a vital role in leading the consumer straight to the tutorial.
Google keyword planner, Moz keyword explorer, Keywordtool.io, Ahrefs Keywords explorer, or SECockpit are some practical tools used widely to search for the right keywords.
The best way to select the right keywords for your SEO strategy is to iterate on the keywords you want to rank for. Research relevant topics based on your business to understand how user intent affects keyword usage.
In short, keyword research, done before setting up SEO campaigns and merging them with UX, helps you evolve with changing market trends.
Designing a website without optimizing it for search engines is a waste of time and vice versa. Both these aspects work together and need to be carefully considered right from the beginning.
The site’s architecture is how the pages flow on your website. From the SEO point of view, good website architecture means Google will easily find and index your pages. Simply put, links should help Google navigate smoothly from high- to low-authority pages. Google Search Console has improved a lot since its early days and has become highly informative for SEO technicians, helping them understand how a website is indexed and appears to Google.
H1 and H2 tags, headings, taglines, catchy CTAs, and informative menu labels decide whether your audience will interact with your website or not. Remember: no page should be more than four clicks away from your homepage.
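One way to audit the four-click rule is a breadth-first walk over your internal link graph, where each page's depth is the minimum number of clicks from the homepage. A minimal sketch using a made-up toy site map (the URLs are illustrative only):

```javascript
// Toy internal-link graph: each page lists the pages it links to.
const siteMap = {
  '/': ['/products', '/blog'],
  '/products': ['/products/chairs'],
  '/blog': ['/blog/post-1'],
  '/products/chairs': [],
  '/blog/post-1': [],
};

// Breadth-first search from the homepage: the depth recorded for each
// page is the minimum number of clicks needed to reach it.
function clickDepths(graph, start) {
  const depths = { [start]: 0 };
  const queue = [start];
  while (queue.length > 0) {
    const page = queue.shift();
    for (const link of graph[page] || []) {
      if (!(link in depths)) {
        depths[link] = depths[page] + 1;
        queue.push(link);
      }
    }
  }
  return depths;
}

const depths = clickDepths(siteMap, '/');
// Flag pages deeper than four clicks from the homepage.
const tooDeep = Object.keys(depths).filter((p) => depths[p] > 4);
```

In practice you would feed the graph from a crawl of your own site rather than a hand-written object.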
Mobile-responsive design has gained significant importance for both user experience and SEO. Over 50 percent of all traffic is now driven by mobile search, and sites that are not mobile-responsive compromise the user experience.
According to Google’s page experience documentation, mobile-friendly websites are prioritized in search results. Enhancing readability by using the right font family and text size is a must for improving the mobile experience. A responsive website that loads fast on varying screen sizes has become the standard these days.
Bad SEO + UX undermines the entire motive of brand building. It pays to give importance to the finer attributes today: domain name, informational content, internal links, optimized meta tags and meta descriptions, image alt tags, headings, and page titles all make the entire experience worthwhile.
Implementing SEO with UX design may seem a little daunting initially; however, it is critical to boost rankings and build a great brand.
Atul Jindal is Sr. Web Engineer at Adobe Research.