CBPO

Monthly Archives: November 2018

LinkedIn violated data protection by using 18M email addresses of non-members to buy targeted ads on Facebook

November 25, 2018

LinkedIn, the social network for the working world with close to 600 million users, has been called out a number of times for how it is able to suggest uncanny connections to you, when it’s not even clear how or why LinkedIn would know enough to make those suggestions in the first place.

Now, a run-in with a regulator in Europe illuminates how some of LinkedIn’s practices leading up to GDPR implementation in Europe were not only uncanny, but actually violated data protection rules, in LinkedIn’s case concerning some 18 million email addresses.

The details were revealed in a report published Friday by Ireland’s Data Protection Commissioner covering activities in the first six months of this calendar year. In a list of investigations that have been reported concerning Facebook, WhatsApp and the Yahoo data breach, the DPC revealed one investigation that had not been reported before. The DPC had conducted — and concluded — an investigation of Microsoft-owned LinkedIn, originally prompted by a complaint from a user in 2017, over LinkedIn’s practices regarding people who were not members of the social network.

In short: in a bid to get more people to sign up to the service, LinkedIn admitted that it was using people’s email addresses — some 18 million in all — in a way that was not transparent. LinkedIn has since ceased the practice as a result of the investigation.

There were two parts to the supervision, as the DPC describes it:

First, the DPC found that LinkedIn in the US had obtained emails for 18 million people who were not already members of the social network, and then used these in a hashed form for targeted advertisements on the Facebook platform, “with the absence of instruction from the data controller” — that is, LinkedIn Ireland — “as is required.”

Some backstory on this: LinkedIn, Facebook and others in the lead-up to GDPR coming into effect moved data processing that had been going through Ireland to the US.

The claim was that this was to “streamline” operations, but critics have said that the moves could help to shield the companies a bit more from any GDPR liability over how they process data for non-EU users.

“The complaint was ultimately amicably resolved,” the DPC said, “with LinkedIn implementing a number of immediate actions to cease the processing of user data for the purposes that gave rise to the complaint.”

Second, the DPC then decided to conduct a further audit after it became “concerned with the wider systemic issues identified” in the initial investigation. There, it found that LinkedIn was also applying its social graph-building algorithms to build networks — to suggest professional networks for users, or “undertaking pre-computation,” as the DPC describes it.

The idea here was to build up suggested networks of compatible professional connections, helping users overcome the hurdle of having to build a network from scratch, one of the barriers that puts some people off social networks.

“As a result of the findings of our audit, LinkedIn Corp was instructed by LinkedIn Ireland, as data controller of EU user data, to cease pre-compute processing and to delete all personal data associated with such processing prior to 25 May 2018,” the DPC writes. May 25 was the date that GDPR came into force.

LinkedIn has provided us with the following statement in relation to the whole investigation:

“We appreciate the DPC’s 2017 investigation of a complaint about an advertising campaign and fully cooperated,” said Denis Kelleher, Head of Privacy, EMEA, for LinkedIn. “Unfortunately the strong processes and procedures we have in place were not followed and for that we are sorry. We’ve taken appropriate action, and have improved the way we work to ensure that this will not happen again. During the audit, we also identified one further area where we could improve data privacy for non-members and we have voluntarily changed our practices as a result.”

(The ‘further area’ is the pre-computation.)

There are some takeaways from the incident:

Taking LinkedIn’s words at face value, the company appears to be trying to show good faith by going a step further than simply fixing what the DPC identified, changing its practices voluntarily before it gets called out.

Then again, LinkedIn would not be the first company to “ask for forgiveness, not permission,” when it comes to pushing the boundaries of what is considered permissible behavior.

If you are wondering why LinkedIn did not get fined in this process — which could be one lever for pushing a company to act right from the start, rather than only change practices after getting called out — that’s because until the implementation of GDPR at the end of May, the regulator had no power to enforce fines.

What we also don’t really know here — the DPC doesn’t really address it — is where LinkedIn obtained those 18 million email addresses, and any other related data, in the first place.

Other cases reviewed in the report, such as the inquiry into Facebook’s use of facial recognition and the way WhatsApp and Facebook share user data with each other, are still ongoing. Others, such as the investigation into the Yahoo security breach that affected 500 million users, are now resulting in the companies modifying their practices.


Social – TechCrunch


You Can Pry My Air Fryer Out of My Cold, Greasy Hands

November 25, 2018

Do air-fried foods taste as good as their fat-bathed analogues? No. But an air fryer can provide comfort food on demand.
Feed: All Latest


8 New Jobs Added to the PPC Hero Job Board!

November 24, 2018

From time to time, we’ll let you know if new jobs have been added to the PPC Hero Job Board. In the past 2 weeks, 8 new jobs have been posted! Here’s a brief look at some of the newly posted positions: Go Local Interactive Overland Park, KS Role: Paid Media Specialist As a Paid Media […]

Read more at PPCHero.com
PPC Hero


BlueCargo optimizes stacks of containers for maximum efficiency

November 24, 2018

Meet BlueCargo, a logistics startup focused on seaport terminals. The company was part of Y Combinator’s latest batch and recently raised a $3 million funding round from 1984 Ventures, Green Bay Ventures, Sound Ventures, Kima Ventures and others.

If you picture a terminal, chances are you see huge piles of containers. But current sorting methods are not efficient at all. Yard cranes end up moving a ton of containers just to reach a container sitting at the bottom of the pile.

BlueCargo wants to optimize those movements by helping you store containers in the right spot: the first container that is going to leave the terminal should be at the top of the pile.

“Terminals spend a lot of time making unproductive or undesired movements,” co-founder and CEO Alexandra Griffon told me. “And yet, terminals only generate revenue every time they unload or load a container.”

Right now, ERP-like solutions only manage containers according to a handful of business rules that don’t take into account the timeline of a container. Empty containers are all stored in one area, containers with dangerous goods are in another area, etc.

The startup leverages as much data as possible on each container — where it’s coming from, the type of container, if it’s full or empty, the cargo ship that carried it, the time of the year and more.

Every time BlueCargo works with a new terminal, the startup collects past data and processes it to create a model. The team can then predict how BlueCargo can optimize the terminal.

“At Saint-Nazaire, we could save 22 percent on container shifting,” Griffon told me.

The company will test its solution in Saint-Nazaire in December. It integrates directly with existing ERP solutions. Cranes already scan container identification numbers. BlueCargo could then instantly push relevant information to crane operators so that they know where to put down a container.

Saint-Nazaire is a relatively small port compared to the biggest European ports. But the company is already talking with terminals in Long Beach, one of the largest container ports in the U.S.

BlueCargo also knows that it needs to tread carefully — many companies have promised magical IT solutions in the past, but not much has changed in seaports.

That’s why the startup wants to be as seamless as possible. It only charges fees based on shifting savings — 30 percent of what it would have cost you with the old model. And it doesn’t want to alter workflows for people working at terminals — it’s like an invisible crane that helps you work faster.

There are six dominant players managing terminals around the world. If BlueCargo can convince those companies to work with the startup, it would represent a good business opportunity.


Enterprise – TechCrunch


Facebook policy VP, Richard Allan, to face the international ‘fake news’ grilling that Zuckerberg won’t

November 23, 2018

An unprecedented international grand committee comprised of 22 representatives from seven parliaments will meet in London next week to put questions to Facebook about the online fake news crisis and the social network’s own string of data misuse scandals.

But Facebook founder Mark Zuckerberg won’t be providing any answers. The company has repeatedly refused requests for him to answer parliamentarians’ questions.

Instead it’s sending a veteran EMEA policy guy, Richard Allan, now its London-based VP of policy solutions, to face a roomful of irate MPs.

Allan will give evidence next week to elected members from the parliaments of Argentina, Brazil, Canada, Ireland, Latvia, Singapore, along with members of the UK’s Digital, Culture, Media and Sport (DCMS) parliamentary committee.

At the last call the international initiative had a full eight parliaments behind it but it’s down to seven — with Australia being unable to attend on account of the travel involved in getting to London.

A spokeswoman for the DCMS committee confirmed Facebook declined its last request for Zuckerberg to give evidence, telling TechCrunch: “The Committee offered the opportunity for him to give evidence over video link, which was also refused. Facebook has offered Richard Allan, vice president of policy solutions, which the Committee has accepted.”

“The Committee still believes that Mark Zuckerberg is the appropriate person to answer important questions about data privacy, safety, security and sharing,” she added. “The recent New York Times investigation raises further questions about how recent data breaches were allegedly dealt with within Facebook, and when the senior leadership team became aware of the breaches and the spread of Russian disinformation.”

The DCMS committee has spearheaded the international effort to hold Facebook to account for its role in a string of major data scandals, joining forces with similarly concerned committees across the world, as part of an already wide-ranging enquiry into the democratic impacts of online disinformation that’s been keeping it busy for the best part of this year.

And especially busy since the Cambridge Analytica story blew up into a major global scandal this April, although Facebook’s 2018 run of bad news hasn’t stopped there…

The evidence session with Allan is scheduled to take place at 11.30am (GMT) on November 27 in Westminster. (It will also be streamed live on the UK’s parliament.tv website.)

Afterwards a press conference has been scheduled — during which DCMS says a representative from each of the seven parliaments will sign a set of ‘International Principles for the Law Governing the Internet’.

It bills this as “a declaration on future action from the parliaments involved” — suggesting the intent is to generate international momentum and consensus for regulating social media.

The DCMS’ preliminary report on the fake news crisis, which it put out this summer, called for urgent action from government on a number of fronts — including floating the idea of a levy on social media to defend democracy.

However UK ministers failed to leap into action, merely putting out a tepid ‘wait and see’ response. Marshalling international action appears to be DCMS’ alternative action plan.

At next week’s press conference, grand committee members will take questions following Allan’s evidence — so expect swift condemnation of any fresh equivocation, misdirection or question-dodging from Facebook (which has already been accused by DCMS members of a pattern of evasive behavior).

Last week’s NYT report also characterized the company’s strategy since 2016, vis-a-vis the fake news crisis, as ‘delay, deny, deflect’.

The grand committee will hear from other witnesses too, including the UK’s information commissioner Elizabeth Denham, who was before the DCMS committee recently to report on a wide-ranging ecosystem investigation her office instigated in the wake of the Cambridge Analytica scandal.

She told it then that Facebook needs to take “much greater responsibility” for how its platform is being used, warning that unless the company overhauls its privacy-hostile business model it risks burning user trust for good.

Also giving evidence next week: Deputy information commissioner Steve Wood; the former Prime Minister of St Kitts and Nevis, Rt Hon Dr Denzil L Douglas (on account of Cambridge Analytica/SCL Elections having done work in the region); and the co-founder of PersonalData.IO, Paul-Olivier Dehaye.

Dehaye has also given evidence to the committee before — detailing his experience of making Subject Access Requests to Facebook — and trying and failing to obtain all the data it holds on him.


Social – TechCrunch


LinkedIn cuts off email address exports with new privacy setting

November 22, 2018

A win for privacy on LinkedIn could be a big loss for businesses, recruiters and anyone else expecting to be able to export the email addresses of their connections. LinkedIn just quietly introduced a new privacy setting that defaults to blocking other users from exporting your email address. That could prevent some spam, and protect users who didn’t realize anyone who they’re connected to could download their email address into a giant spreadsheet. But the launch of this new setting without warning or even a formal announcement could piss off users who’d invested tons of time into the professional networking site in hopes of contacting their connections outside of it.

TechCrunch was tipped off by a reader that emails were no longer coming through as part of LinkedIn’s Archive tool for exporting your data. Now LinkedIn confirms to TechCrunch that “This is a new setting that gives our members even more control of their email address on LinkedIn. If you take a look at the setting titled ‘Who can download your email’, you’ll see we’ve added a more detailed setting that defaults to the strongest privacy option. Members can choose to change that setting based on their preference. This gives our members control over who can download their email address via a data export.”

That new option can be found under Settings & Privacy -> Privacy -> Who Can See My Email Address? This “Allow your connections to download your email [address of user] in their data export?” toggle defaults to “No.” Most users don’t know it exists because LinkedIn didn’t announce it; there’s merely been a folded up section added to the Help center on email visibility, and few might voluntarily change it to “Yes” as there’s no explanation of why you’d want to. That means nearly no one’s email addresses will appear in LinkedIn Archive exports any more. Your connections will still be able to see your email address if they navigate to your profile, but they can’t grab those from their whole graph.

Facebook came to the same conclusion about restricting email exports back when it was in a data portability fight with Google in 2010. Facebook had been encouraging users to import their Gmail contacts, but refused to let users export their Friends’ email addresses. It argued that users own their own email addresses, but not those of their Friends, so they couldn’t be downloaded — though that stance conveniently prevented any other app from bootstrapping a competing social graph by importing your Facebook friend list in any usable way. I’ve argued that Facebook needs to make friend lists interoperable to give users choice about what apps they use, both because it’s the right thing to do and because it could deter regulation.

On a social network like Facebook, barring email exports makes more sense. But on LinkedIn’s professional network, where people are purposefully connecting with those they don’t know, and where exporting has always been allowed, making the change silently seems surreptitious. Perhaps LinkedIn didn’t want to bring attention to the fact it was allowing your email address to be slurped up by anyone you’re connected with, given the current media climate of intense scrutiny regarding privacy in social tech. But trying to hide a change that’s massively impactful to businesses that rely on LinkedIn could erode the trust of its core users.


Social – TechCrunch


Google Assistant iOS update lets you say ’Hey Siri, OK Google’

November 22, 2018

Apple probably didn’t intend to let competitors take advantage of Siri Shortcuts this way, but you can now launch Google Assistant on your iPhone by saying “Hey Siri, OK Google.”

But don’t expect a flawless experience — it takes multiple steps. After updating the Google Assistant app on iOS, you need to open the app to set up a new Siri Shortcut for Google Assistant.

As the name suggests, Siri Shortcuts lets you record custom phrases to launch specific apps or features. For instance, you can create Siri Shortcuts to play your favorite playlist, launch directions to a specific place, text someone and more. If you want to chain multiple actions together, you can even create complicated algorithms using Apple’s Shortcuts app.

By default, Google suggests the phrase “OK Google.” You can choose something shorter, or “Hey Google,” for instance. After setting that up, you can summon Siri and use this custom phrase to launch Google’s app.

You may need to unlock your iPhone or iPad to let iOS open the app. The Google Assistant app then automatically listens to your query. Again, you need to pause and wait for the app to appear before saying your query.

This is quite a cumbersome workaround and I’m not sure many people are going to use it. But the fact that “Hey Siri, OK Google” exists is still very funny.

On another note, Google Assistant is still the worst when it comes to your privacy. The app pushes you to enable “web & app activity,” the infamous all-encompassing privacy destroyer. If you activate that setting, Google will collect your search history, your Chrome browsing history, your location, your credit card purchases and more.

It’s a great example of dark pattern design. If you haven’t enabled web & app activity, there’s a flashy blue banner at the bottom of the app that tells you that you can “unlock more Assistant features.”

When you tap it, you get a cute little animated drawing to distract you from the text. There’s only one button, which says “More.” If you tap it, the “More” button becomes “Turn on” — many people are not even going to see the “No thanks” option on the bottom left.

It’s a classic persuasion method. If somebody asks you multiple questions and you say yes every time, you’ll tend to say yes to the last question even if you don’t agree with it. You tapped on “Get started” and “More” so you want to tap on the same button one more time. If you say no, Google asks you one more time if you’re 100 percent sure.

So make sure you read everything and you understand that you’re making a privacy trade-off by using Google Assistant.

Mobile – TechCrunch


Microsoft acquires FSLogix to enhance Office 365 virtual desktop experience

November 20, 2018

Back in September, Microsoft announced a virtual desktop solution that lets customers run Office 365 and Windows 10 in the cloud. They mentioned several partners in the announcement that were working on solutions with them. One of those was FSLogix, a Georgia virtual desktop startup. Today, Microsoft announced it has acquired FSLogix. It did not share the purchase price.

“FSLogix is a next-generation app-provisioning platform that reduces the resources, time and labor required to support virtualization,” Brad Anderson, corporate VP for Microsoft Office 365, and Julia White, corporate VP for Microsoft Azure, wrote in a joint blog post today (https://blogs.microsoft.com/blog/2018/11/19/microsoft-acquires-fslogix-to-enhance-the-office-365-virtualization-experience/).

When Microsoft made the virtual desktop announcement in September they named Citrix, CloudJumper, Lakeside Software, Liquidware, People Tech Group, ThinPrint and FSLogix as partners working on solutions. Apparently, the company decided it wanted to own one of those experiences and acquired FSLogix.

Microsoft believes by incorporating the FSLogix solution, it will provide a better virtual desktop experience for its customers by enabling better performance and faster load times, especially for Office 365 ProPlus customers.

Randy Cook, founder and CTO at FSLogix, said the acquisition made sense given how well the two companies have worked together over the years. “From the beginning, in working closely with several teams at Microsoft, we recognized that our missions were completely aligned. Both FSLogix and Microsoft are dedicated to providing the absolute best experience for companies choosing to deploy virtual desktops,” Cook wrote in a blog post announcing the acquisition.

Lots of companies have what are essentially dumb terminals running just the tools each employee needs, rather than a fully functioning standalone PC. Citrix has made a living offering these services. When employees come in to start the day, they sign in with their credentials and they get a virtual desktop with the tools they need to do their jobs. Microsoft’s version of this involves Office 365 and Windows 10 running on Azure.

FSLogix was founded in 2013 and has raised more than $10 million, according to data on Crunchbase. Today’s acquisition, which has already closed according to Microsoft, comes on the heels of last week’s announcement that Microsoft was buying Xoxco, an Austin-based developer shop with experience building conversational bots.


Enterprise – TechCrunch


How to optimize your local business for voice search

November 20, 2018

Voice search is growing, a claim that appears time and time again across the web. It has fundamentally changed the way people search, and it’s here to stay.

With a simple command, users can conduct searches for information, products, services and local businesses.

It’s such a hot topic that our Head of Search and Strategy, Stuart Shaw, spoke at one of the UK’s largest SEO conferences a few weeks ago about the details of voice search and why it’s important for brands.

While voice isn’t likely to surpass traditional search any time soon, it has spurred us to explore how local businesses can optimize, adjust their marketing strategies and understand the potential voice search could have on their bottom lines.

The opportunity for local businesses

To get information about a local service near us, we pull out our phones and search for it:

  • ‘Plumbing services near me’
  • ‘Local pizza delivery’
  • ‘What are the opening times for…’
  • ‘Is so and so open today?’ etc.

In fact, a recent study by BrightLocal highlighted that 53% of people in the US who own smart speakers such as Amazon’s Alexa and Google Home perform searches like these for local businesses every day:

Putting that in context for the UK

A recent YouGov study showed that the proportion of people in the UK owning a smart speaker doubled between Q3 2017 and Q1 2018, reaching 10% of the total population.

A study by radiocentre predicted that this growth could reach as high as 40% by the end of 2018.

Looking a little deeper, each household typically has more than one occupant: on average there are 2.3 people per household, according to the most recent UK government statistics:

Source: Office for National Statistics

So, if the 40% of UK households prediction is correct, that is potentially around 11 million households (40% of the UK’s roughly 27 million households) exposing voice search content to some 25 million people (11 million × 2.3 occupants).

Who’s leading the smart speaker market?

Three-quarters of the market share in the UK in Q1 2018 was taken up by Amazon’s Alexa. This, of course, will change but right now this is where the biggest opportunity lies for local businesses optimizing for smart speakers in the UK:

 

Source: Office for National Statistics

Although voice search is still in its infancy, and we have only talked about smart speakers, it’s clear just how relevant this technology is to brick-and-mortar businesses.

And, it’s constantly evolving…

Here’s a timeline from Stuart’s presentation highlighting significant changes in voice search, showing how it’s becoming increasingly accessible for more and more people to conduct a voice search every day:

3 Biggest steps to optimize your local business for voice search

1. Take ownership of your digital footprint

Although voice assistants seem all-knowing, they rely heavily on information they can find around the web about your business.

A big part of optimizing for local SEO is ‘citations’ which are online references to your business name, address and phone number (NAP).

Voice assistants use these citations from trusted sources to provide information to users that are conducting local search queries.

So, where should I cite my business?

Each voice assistant relies on different and sometimes multiple data aggregators for answers to local search queries:

  • Siri
    • Search: Google
    • Business listings: Apple Maps
    • Reviews: Yelp
  • Alexa
    • Search: Bing
    • Business listings: Yelp and more recently Yext
    • Reviews: Yelp
  • Google Assistant
    • Search: Google
    • Business listings: Google My Business
    • Reviews: Google My Business
  • Cortana
    • Search: Bing
    • Business listings: Bing
    • Reviews: Yelp

So, these data sources are the most important places to make sure your business is correctly cited, up to date and optimized:

2. Utilize schema markup

Schema is a type of on-page data markup that allows webmasters to provide search engines with data about their business in a more structured way.

The structured format allows search engines to understand the contents and context of web pages much more easily (less algorithmic interpretation) and, subsequently, the engines can better understand the relevance of pages to particular search queries and present richer results.

Schema is only going to play a bigger part in ranking for rich results and featured snippets, which are heavily used for voice search content.

What does schema markup do?

Search engines experiment with how they display rich results all the time and by having your site marked up, you have the opportunity to be featured in new rich results.

For example, Google experimented with a ‘prominent knowledge panel card’ shown on mobile devices which displays when users conduct a branded search for the business. In the knowledge card you can see ‘place actions’ such as ‘find a table’ or ‘book an appointment’ which would direct searchers into an appropriate webpage to conduct the action.

These rich results went on to influence the structure of Google My Business, which is now heavily used by local businesses. The point here is that the business websites shown in the example image below were ‘future-proofed’ and optimal, which qualified them for this rich result.

In other words, as Gary Illyes, web trends analyst at Google, puts it:

“If you want your sites to appear in search features, implement structured data.”

The biggest benefit, and the main ‘thing it does’, is to help Google understand relevance much more fluently. A few more quotes from Gary Illyes help explain this:

“Add structured data to your pages because, during indexing, we will be able to better understand what your site is about.”

“And don’t just think about the structured data that we documented on developers.google.com. Think about any schema.org schema that you could use on your pages. It will help us understand your pages better, and indirectly… it leads to better ranks.”

Why it’s important for local businesses

Schema is a tool which search engines and subsequently voice assistants are using to paint a clearer picture of a business website’s central topic and the services the site can offer users.

With structured data present, it is much more likely that your business (if relevant) will be identified as a good candidate for answering local voice search queries.

Using local business schema will:

  • Future-proof your website for richer search features (which voice search content is heavily influenced by)
  • Reinforce your online digital footprint
  • Bolster relevancy signals & geographic accuracy
  • Help drive more conversions both online and offline
  • Indirectly help your website rank better (important for voice)

So how do you take control?

There are hundreds of schema types which can be utilised for hundreds of business and content types.

There are also multiple ways of marking up schema in your page source code. By far the easiest is using JSON-LD. Using the example from above, the marked-up code looks something like this:
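As a minimal illustrative sketch, LocalBusiness markup in JSON-LD can look like the snippet below; every value (business name, address, coordinates, opening hours, rating) is a placeholder rather than data from the example above:

<!-- Hypothetical LocalBusiness markup: all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Services",
  "url": "https://www.example.com",
  "telephone": "+44 20 0000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "EC1A 1AA",
    "addressCountry": "GB"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 51.5074,
    "longitude": -0.1278
  },
  "openingHours": "Mo-Fr 09:00-17:30",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>

A script like this would typically sit before the closing </head> tag of the relevant page, as described in the implementation steps below.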

The best way to get your code ready is to go to SchemaApp.com, follow the instructions or use this tutorial and locate the schema types that are most relevant to you and your business.

Types of local business data that can be marked up:

  • Business name
  • Address
  • Phone number
  • Main email address
  • Business opening hours
  • Geo-location information (latitude and longitude)
  • Reviews
  • Company logo
  • Business description
  • Social profile links
  • Site name

Bear in mind there are guidelines for usage summarized below:

  • ‘Data must not deceive or mislead experience for search users’
  • ‘Use only the most specific types and property names defined by schema.org’
  • ‘Marked-up content must be visible on the page where the script is added’

See Google’s policies for structured data for more information.

Once you’ve gone through SchemaApp, copy and paste the output code into relevant pages before your closing </head> tag or, if it’s content specific schema (such as the review rating above), paste the code before the closing </body> tag in the HTML of your page.

Finally, check your markup with this structured data testing tool, which will highlight any errors once implemented.

Note: Avoid using Google Tag Manager for this markup, apply the code natively where possible.

3. Produce content relevant to voice search needs

There are great ways of optimizing specifically for voice search using your on-site content.

The simplest is to explore the realm of user intent and uncover the types of questions people may want answered when it comes to your business.

That doesn’t mean you need to create thousands of pages optimized specifically for voice search terms. Instead, search engines such as Google pull answers to voice queries directly from page content, even if the answer is a snippet that makes up only a small section of the page.

Work long tail queries into long-form content

Conduct some long tail keyword research and look for questions people ask about your local business and work them into your content, where it is relevant to do so. I highly recommend Answer the Public to scale your efforts here.

Here’s an example of what I mean.

This is a query I searched recently that could be relevant to any local business:

‘Does tesco take american express?’

Here’s what was shown at the top in a featured snippet (the content that will be read out if conducting a voice search with Google Home):

And here’s the content that Google has pulled out from halfway down the page from choose.co.uk:

FAQ pages can be perfect for voice search

Written correctly, an FAQ page can serve voice search queries really effectively, and if you struggle to work your long tail optimization into relevant pages, an FAQ page is a great way around it (see the markup sketch after this list):

  • People use voice search conversationally, which you can naturally replicate on an FAQ page without the content appearing out of place
  • It appeals to long tail voice and traditional searches, which widens your reach
  • Voice search often seeks concise information, under 30 words, which an FAQ page can clearly communicate
  • Creating a dedicated page specifically with this key information in mind could help with higher placement in SERPs for voice searches, which is vital for capturing that first click/interaction
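As a sketch of how that kind of FAQ content can be marked up in the same JSON-LD style, here is a hypothetical FAQPage block; the questions and answers are placeholder examples of the concise, conversational phrasing described above, not content from a real page:

<!-- Hypothetical FAQPage markup: questions and answers are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Do you take American Express?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes, we accept American Express, Visa and Mastercard in store and online."
      }
    },
    {
      "@type": "Question",
      "name": "What are your opening times?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "We are open Monday to Friday from 9am to 5.30pm, and Saturday from 10am to 4pm."
      }
    }
  ]
}
</script>

As with other content-specific schema, a block like this would be pasted before the closing </body> tag of the FAQ page itself, as covered in the schema section above.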

Conclusion

However you look at SEO, voice is the future: it’s growing exponentially and being integrated into more and more of our everyday tech. Local business marketers should be making specific efforts to capitalize on voice search to maximize their online and offline conversions.

The caveat here is not to let your standard SEO practice fall behind. Having a fully mobile responsive website, fast site speed and good quality local backlinks, among many other optimizations, are still, and will remain, vital for ranking in local search and will greatly impact your voice search efforts.

Get a deeper dive into voice search or get help with your voice search strategy.

The post How to optimize your local business for voice search appeared first on Search Engine Watch.

Search Engine Watch