Learn a new Google Ads campaign strategy focused on audience segmentation and how to leverage the segments to drive efficiency.
Read more at PPCHero.com
Reputation management translates directly into income management. Your public image directly affects your sales, career, and financial well-being in any field – whether you're searching for an investor, overcoming negativity spread by your rivals, changing fields, or creating a new public persona.
But what should you do if there are already lots of negative things written about you on the Internet? In this post, we’ll use one of our actual cases as an example to show how we changed a client’s reputation from 48% negative to neutral.
This article has been created by BDCenter Digital. We sign an NDA with all our customers. Therefore, all the data that could infringe on the client’s confidentiality have been changed. This doesn’t affect the mechanism of reputation management in any way.
Our assignment was to make sure that searching for our client’s name on Google in the US would yield zero negative content on the first two search engine results pages (SERPs).
At the time when the client asked us to help improve their reputation, 48% of the top 20 results were negative:
A total of seven BDCenter Digital team members worked on this reputation-improvement project, including:
Two SEO specialists + an assistant: Their job was to monitor and analyze search results, work out a strategy to eliminate negativity, and publish content on appropriate resources.
PR specialist: Who identified newsworthy content, contacted the media, and prepared and published articles.
SMM specialist: Who created social media accounts for the client and filled them with info.
Project Manager: Who allocated tasks, tracked progress, kept in touch with the client and the team, and evaluated the results.
Designer: Who prepared templates for social media and news resources.
Four months and 560 hours of work later, there was NO negativity left in the top two result pages on Google. Reputation improved!
Read on to find out how we did it.
Igor Erenkov, Artem Shcherbakov, Olga Vodchyts
1. Identifying resources containing negative content and monitoring changes
Our first step was to study the SERPs (with our client’s name as the search query) and find the sites that published negative content about him. This helped us understand the scope of the job and see which sites we would have to work with to push all negativity out of the top 20 results.
Every week, we would fine-tune our strategy – since Google often changes its ranking algorithm, we would get slightly differing results every day. For instance, a resource that was ranked as no.1 yesterday might not even be on the first page tomorrow.
For this reason, we checked on the situation once a week and recorded the results in a spreadsheet:
The color indicates the tonality of each resource relative to the individual in question. The names of sites were removed for the purposes of confidentiality.
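A tracking sheet like this is easy to maintain in code. The sketch below is a minimal, hypothetical version of the idea: each week's top-20 snapshot (hard-coded here; in practice it would come from a rank-tracking tool or a SERP API) is labeled by tone, can be appended to a CSV, and is summarized as a tone share – the same kind of figure as the 48% negative the project started from.

```python
import csv
from datetime import date

# Hypothetical weekly snapshot of the top results; in practice these
# would come from a rank-tracking tool or a SERP API.
snapshot = [
    {"position": 1, "domain": "corporate-site.example", "tone": "positive"},
    {"position": 2, "domain": "news-site.example", "tone": "negative"},
    {"position": 3, "domain": "blog.example", "tone": "neutral"},
]

def tone_share(results, tone):
    """Share of results carrying the given tone (0.0 to 1.0)."""
    return sum(r["tone"] == tone for r in results) / len(results)

def record_week(path, results):
    """Append this week's snapshot to the tracking spreadsheet (CSV)."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for r in results:
            writer.writerow([date.today().isoformat(), r["position"],
                             r["domain"], r["tone"]])

print(f"negative share: {tone_share(snapshot, 'negative'):.0%}")
```

Run weekly, this produces exactly the kind of color-coded history shown above, just in machine-readable form.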
One of the factors impacting how results are placed on a SERP is the age of the content. A new relevant piece of content can easily get a resource in the top 10, but just a week or two later, it can lose around 30 to 50 positions.
2. Posting mentions of the person on various websites
Undesirable information about the client was posted on large resources, one of them with 20 million monthly visitors. One of the obvious solutions was to overcome this negativity by posting positive content on even larger websites.
However, we couldn’t rely on this tool alone for two reasons:
A. High costs: The client would have to pay $4,000 to $5,000 per publication, and the actual budget was much lower.
B. Risk of repetitiveness: Google tries to vary its results, filling its SERPs with sites in different formats. Therefore, we decided to post content about our client on the following types of sites:
- News websites
- Blogging platforms
- Profiling sites
- Video hosting sites
- Podcast sites
- Social networks
- Interview-centered sites
- Client’s corporate pages
- Dropped domains
- Presentation hosting sites
3. Optimizing the client’s corporate site
Google prioritizes the sites that are most relevant to the search query. What do you see at the top of the list when you google someone's name? Depending on the person's popularity, it can be a Wikipedia article, a corporate website, or a social media account.
In our case, the client’s corporate website was among the top results already, but we wanted to strengthen its position. To do this, we optimized the Team page and created an additional page with the client’s bio.
As a result, these two pages ended up in Google’s top three in the US, pushing all the negativity down the list.
4. Using dropped domains
When time is limited and you need a quick result, you can benefit from dropped domains.
A dropped domain (or "drop") is a domain whose owner decided not to pay for it any longer and which is now for sale. Some of these dropped domains are still indexed by Google, and you can get good results by publishing backlinks there.
After confirming this step with the client, we created a site based on a good dropped domain and published new content on that site. In just a month, the site was ranked among the top five on Google.
5. Pushing negativity out of Google Image Search
The image search also yielded some negative results, so we had to work not only on pushing individual websites out of the top 20 but specific images, too.
Since Google likes unique content, we made sure to use only unique images of the client in our publications and his social media accounts.
If you don’t have any fresh pictures available, you can edit some of the old ones, changing the background, size, or color profile. This will make Google see them as unique, showing them first.
By the way, changing just the size doesn’t work. Google views such pictures as identical, showing only the one with the best resolution.
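This resize observation matches how perceptual hashing works. The toy sketch below (pure Python, emphatically not Google's actual algorithm) implements a tiny average hash: resizing an image leaves the hash unchanged, while recoloring the background flips it – which is consistent with why edited images can register as "unique" content while resized ones don't.

```python
def average_hash(img, size=8):
    """Tiny perceptual hash: sample the image down to size x size by
    nearest-neighbor, then mark each sample as above/below the mean."""
    h, w = len(img), len(img[0])
    samples = [img[r * h // size][c * w // size]
               for r in range(size) for c in range(size)]
    mean = sum(samples) / len(samples)
    return tuple(s > mean for s in samples)

def upscale(img, factor):
    """Nearest-neighbor upscaling: changes the size, not the content."""
    return [[img[r // factor][c // factor]
             for c in range(len(img[0]) * factor)]
            for r in range(len(img) * factor)]

# A toy 16x16 grayscale "photo": bright square on a dark background.
photo = [[200 if 4 <= r < 12 and 4 <= c < 12 else 30
          for c in range(16)] for r in range(16)]

resized = upscale(photo, 2)                                          # same picture, bigger
recolored = [[220 if p == 30 else p for p in row] for row in photo]  # new background

assert average_hash(photo) == average_hash(resized)    # size-only change: "identical"
assert average_hash(photo) != average_hash(recolored)  # background change: "unique"
```

Real systems use far more robust hashes, but the principle is the same: changing only the dimensions preserves the image's structure, while changing the background or colors does not.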
PR and content
1. Identifying newsworthy materials
The client didn’t have any important news to share, so we had to create it ourselves. In particular, we watched the industry news closely – and as soon as we found something valuable, we paired the event with our client’s expertise. Thanks to his status and extensive experience, he could provide commentary on the latest research and news for the media.
2. Publishing content
The technique described above provided us with publications on news websites – however, they would allow free coverage only for really important events. Working with niche websites was much easier: we used them to publish expert articles and interviews.
We only chose sites that fit the following three criteria:
- Relevance to the subject – wealth management, finance, and investment.
- The site had to contain a negative article about our client. Publishing fresh content on the same site would get the old article to rank lower.
- Importance – the site’s “weight”, or authority, had to be equal to or higher than that of the sites containing negativity, helping to overcome it.
By weight we mean the level of Google’s trust in the resource. This trust is based on the number of visitors, the site’s age and level of optimization.
If you need quick results, you can get a lot of coverage fast by publishing your content on PR Newswire. Read our recent post on how to do this.
Our client’s name had to be mentioned in the title: this helped articles rank much better for our search query.
However, our headline didn’t always fit the editorial guidelines of individual resources: some preferred to list the author at the very bottom of the piece. Such articles weren’t useful to us since they didn’t rank the way we would’ve liked.
We tested this headline theory many times. Even a publication on the gigantic Yahoo! Finance with a single mention in the body text works worse than an article on a small website that mentions the client’s name in the title, lead-in, and body.
SMM
1. Creating and filling social media accounts
We created accounts for the client on Twitter, LinkedIn, Facebook, and other platforms. We didn’t use social networks that weren’t relevant to the client’s business — such as Pinterest, for example.
LinkedIn yielded the best result: our client’s profile on this platform still ranks as no. 1 in the search results, pushing out the old negative content. Xing, Tumblr, and Instagram didn’t produce any result at all: none of them got into the top 20.
We made sure to fill the new social media pages with expert content – mostly drawn from the articles we wrote for the media. Naturally, we always adapted the text for social media. The posts were accompanied by photos of the client: we arranged special photoshoots for that purpose.
2. Posting podcasts and videos
Google prefers content to be varied. So it prioritizes not only fresh articles but also video and podcasts.
We started accounts on YouTube and Vimeo for our client and added several videos: some we created specifically to fit recent news, others were chosen among existing content.
We posted those videos not only on the client’s own accounts but also in other users’ profiles. By the way, it was a video posted on the page of another user that ended up in the top 10 of Google.
As for podcasts, they can work well, too – as long as you post them on popular platforms, such as iTunes or audioboom.com, which has over two million monthly users.
Project Manager’s comments
SERM, or search engine reputation management, combines such tools as SEO, PR, and SMM. In order to leverage this combination with maximum benefit, we utilize the following principles:
- Regular strategy updates – since both SERPs and relevant content change all the time, we have to monitor all changes and reassess our action plans when required.
- Analysis of the results – we constantly check what works and what doesn’t. This helps us work faster, better, and without wasting our resources.
- Daily contact with the client – this way we can quickly make strategic decisions and create fresh content.
- Generating relevant content – even though SERM is more about pushing negativity as far down the SERPs as possible, we are also very serious about what we post – and so are our clients, of course. Content should also be relevant to the objective. In the case we’ve described, that meant niche articles, podcasts, and videos that accentuated the client’s expertise.
By using all these tools, we managed to radically transform the first two Google result pages. 90% of the top 20 were now positive, with the remaining 10% neutral.
Based on our experience with reputation management – and we’ve already worked with a Nobel laureate, several politicians, and CEOs of financial institutions – your public image can have a tremendous impact on your business and career. Maintaining a good public image on a constant basis is much easier and cheaper than launching a major reputation overhaul every few years.
To maintain your reputation, make sure to monitor the search results for your name or brand. Select your key search queries and set up alerts: this way you’ll know what Google users see when they look for information about you and will be ready to react to any negativity.
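A minimal version of that monitoring can be automated. The sketch below assumes you keep a list of domains you've already reviewed, each labeled with its tone, and compares it against today's top results: anything new, or any known negative still ranking, gets flagged for review. The domain names and tone labels are invented for illustration; fetching the real top-20 is left to a rank-tracking tool or API.

```python
# Baseline: domains you have already reviewed, mapped to their tone.
known = {
    "corporate-site.example": "positive",
    "linkedin.example": "positive",
    "old-news.example": "negative",
}

def check_serp(top20, known_tones):
    """Return (position, domain, reason) triples that need attention:
    anything new, plus any known negative that is still ranking."""
    alerts = []
    for position, domain in enumerate(top20, start=1):
        tone = known_tones.get(domain)
        if tone is None:
            alerts.append((position, domain, "new - review"))
        elif tone == "negative":
            alerts.append((position, domain, "negative still ranking"))
    return alerts

# Today's (hypothetical) search results for your name, in rank order.
today = ["corporate-site.example", "fresh-blog.example", "old-news.example"]
for alert in check_serp(today, known):
    print(alert)
```

Run on a schedule, a check like this plays the same role as the alerts described above: you find out about new results before they have time to settle into the top 20.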
The post Case Study: How BDCenter transformed a reputation from 48% negative on Google to neutral appeared first on Search Engine Watch.
The cameras on our phones are getting good enough that it’s becoming hard to justify having a dedicated picture-taking device. Fujifilm’s X100 series has always made one of the strongest cases for it, however, and the latest iteration makes it more convincing than ever.
I reviewed the original X100 back in 2011, and the series has received a new model about every two years since its announcement; today’s X100V is the fifth. But its changes are more significant than those of any one of its predecessors.
The X100V has a new 24-megapixel APS-C sensor and image processor, taken from Fuji’s high-end X-Pro3, which I’ve used and been quite impressed with. It also inherits the X-Pro3’s much-improved OLED/optical viewfinder, autofocus system, and other features. But they’re married to a redesigned 35mm-equivalent f/2 lens that improves on what was already excellent glass.
The series has always had a throwback aesthetic, adding dials while others eliminated them, but in a concession to modernity the rear LCD is now a tilting touchscreen, a must-have for many shooters. It also has improved video capabilities and is now weatherproofed as well.
All these fit into a package that is highly compact and attractive, though admittedly considerably thicker than a phone. But although under some circumstances a phone camera can indeed rival a dedicated camera, the X100V perhaps more than any other compact camera justifies itself (incidentally, DPReview’s initial impressions are highly favorable).
The shooting experience is so different (the hybrid viewfinder is and always has been genius), it puts so many options at your disposal, and the resulting image will not only be superior, but more defined by what you want to create than by what your phone is capable of doing.
I’ve been trying to reconnect with photography and I’ve found that relying on the phone for that simply isn’t an option for me any more. I want the right tool for the job, yet I don’t want to be inconvenienced by a camera’s size or operation, or obsess over my lens selection. I want an image-taking device as dedicated to that purpose as a knife is to cutting.
Is that the X100V? There is real competition from Ricoh’s latest GR III street shooter, as well as the Canon G5 X II and Sony’s RX100 VII. Although camera sales are dropping, there’s no better time to want or have a compact device in this class. Fortunately it seems to come down to personal preference. I’d be happy with any one of those in my hand, if it means I can leave my phone in my pocket.
While Facebook CEO Mark Zuckerberg seemed cheerful and even jokey when he took the stage today in front of journalists and media executives (at one point, he described the event as “by far the best thing” he’d done this week), he acknowledged that there are reasons for the news industry to be skeptical.
Facebook, after all, has been one of the main forces creating a difficult economic reality for the industry over the past decade. And there are plenty of people (including our own Josh Constine) who think it would be foolish for publishers to trust the company again.
For one thing, there’s the question of how Facebook’s algorithm prioritizes different types of content, and how changes to the algorithm can be enormously damaging to publishers.
“We can do a better job of working with partners to have more transparency and also lead time about what we see in the pipeline,” Zuckerberg said, adding, “I think stability is a big theme.” So Facebook might be trying something out as an “experiment,” but “if it kind of just causes a spike, it can be hard for your business to plan for that.”
At the same time, Zuckerberg argued that Facebook’s algorithms are “one of the least understood things about what we do.” Specifically, he noted that many people accuse the company of simply optimizing the feed to keep users on the service for as long as possible.
“That’s actually not true,” he said. “For many years now, I’ve prohibited any of our feed teams … from optimizing the systems to encourage the maximum amount of time to be spent. We actually optimize the system for facilitating as many meaningful interactions as possible.”
For example, he said that when Facebook changed the algorithm to prioritize friends and family content over other types of content (like news), it effectively eliminated 50 million hours of viral video viewing each day. After the company reported its subsequent earnings, Facebook had the biggest drop in market capitalization in U.S. history.
Zuckerberg was onstage in New York with News Corp CEO Robert Thomson to discuss the launch of Facebook News, a new tab within the larger Facebook product that’s focused entirely on news. Thomson began the conversation with a simple question: “What took you so long?”
The Facebook CEO took this in stride, responding that the question was “one of the nicest things he could have said — that actually means he thinks we did something good.”
Zuckerberg went on to suggest that the company has had a long interest in supporting journalism (“I just think that every internet platform has a responsibility to try to fund and form partnerships to help news”), but that its efforts were initially focused on the News Feed, where the “fundamental architecture” made it hard to find much room for news stories — particularly when most users are more interested in that content from friends and family.
So Facebook News could serve as a more natural home for this news (to be clear, the company says news content will continue to appear in the main feed as well). Zuckerberg also said that since past experiments have created such “thrash in the ecosystem,” Facebook wanted to make sure it got this right before launching it.
In particular, he said the company needed to show that tabs within Facebook, like Facebook Marketplace and Facebook Watch, could attract a meaningful audience. Zuckerberg acknowledged that the majority of Facebook users aren’t interested in these other tabs, but when you’ve got such an enormous user base, even a small percentage can be meaningful.
“I think we can probably get to maybe 20 or 30 million people [visiting Facebook News] over a few years,” he said. “That by itself would be very meaningful.”
Facebook is also paying some of the publishers who are participating in Facebook News. Zuckerberg described this as “the first time we’re forming long-term, stable relationships and partnerships with a lot of publishers.”
Several journalists asked for more details about how Facebook decided which publishers to pay, and how much to pay them. Zuckerberg said it’s based on a number of factors, like ensuring a wide range of content in Facebook News, including from publishers who hadn’t been publishing much on the site previously. The company also had to compensate publishers who are taking some of their content out from behind their paywalls.
“This is not an exact formula — maybe we’ll get to that over time — but it’s all within a band,” he said.
Zuckerberg was also asked about how Facebook will deal with accuracy and quality, particularly given the recent controversy over its unwillingness to fact check political ads.
He sidestepped the political ads question, arguing that it’s unrelated to the day’s topics, then said, “This is a different kind of thing.” In other words, he argued that the company has much more leeway here to determine what is and isn’t included — both by requiring any participating publishers to abide by Facebook’s publisher guidelines, and by hiring a team of journalists to curate the headlines that show up in the Top Stories section.
“People have a different expectation in a space dedicated to high-quality news than they do in a space where the goal is to make sure everyone can have a voice and can share their opinion,” he said.
As for whether Facebook News will include negative stories about Facebook, Zuckerberg seemed delighted to learn that Bloomberg (mostly) doesn’t cover Bloomberg.
“I didn’t know that was a thing a person could do,” he joked. More seriously, he said, “For better or worse, we’re a prominent part of a lot of the news cycles. I don’t think it would be reasonable to try to have a news tab that didn’t cover the stuff that Facebook is doing. In order to make this a trusted source over time, they have to be covered objectively.”
When budgets are time-flexible, the client’s bottom line is more than likely to be positively impacted. I’ve noticed four specific areas where this setup is tremendously beneficial: flexibility with inconsistent seasonality, end-of-month opportunity, capitalization on unexpected traffic changes, and more room for error.
Read more at PPCHero.com
Venngage is a free infographic maker that has catered to more than 21,000 businesses. In this article, we explore how they grew their organic traffic from about 275,000 visitors per month in November 2017 to about 900,000 today — more than tripling in 17 months.
I spoke with Nadya Khoja, Chief Growth Officer at Venngage, about their process.
Venngage gets most of their leads from content and organic search. The percentage varies from month to month in the range of 58% to 65%.
In Nov 2017, Venngage enjoyed 275,000 visitors a month from organic search traffic. Today (16 months later) it’s 900,000. Nadya Khoja (their Chief Growth Officer) extrapolated from their current trend that by December of 2019 (in nine months) they will enjoy three million organic search visitors per month.
In 2015, when Nadya started with Venngage, they saw 300 to 400 registrations a week. By March of 2018, this was up to 25,000 a week. Today it’s 45,000.
While Nadya had the advantage of not starting from zero, that is impressive growth by any reasonable metric. How did they do it?
There are a lot of pieces to this puzzle. I’ll do my best to explain them, and how they tie together. There is no correct order to things per se, so what is below is my perspective on how best to tell this story.
The single most important ingredient: Hypothesize, test, analyze, adjust
This critical ingredient is surprisingly not an ingredient, but rather a methodology. I’m tempted to call it “the scientific method”, as that’s an accurate description, but perhaps it’s more accurate to call it the methodology written up in the books “The Lean Startup” (which Nadya has read) and “Running Lean” (which Nadya has not read).
This single most important ingredient is the methodology of the hypothesize, test, analyze, and adjust.
What got them to this methodology was a desire to de-risk SEO.
The growth in traffic and leads was managed through a series of small and quick iterations, each one of which either passed or failed. Ones that passed were done more. Ones that failed were abandoned.
This concept of hypothesizing, testing, analyzing, and adjusting is used both for SEO changes and for changes to their products.
The second most important ingredient
This ingredient is shared knowledge. Venngage marketing developed “The Playbook”, which everyone in marketing contributes to. “The Playbook” was created both as a reference with which to bring new team members up to speed quickly, as well as a running history of what has been tested and how it went.
The importance of these first two ingredients cannot be overstated. From here on, I am revealing things they learned through trial and error. You have the advantage to learn from their successes and failures. They figured this stuff out the hard way. One hypothesis and one test at a time.
Their north star metrics
They have two north star metrics. The first one seems fairly obvious. “How many infographics are completed within a given time period?” The second one occurred to them later and is as important, if not more so. It is “how long does it take to complete an infographic?”
The first metric, of course, tells them how attractive their product is. The second tells them how easy (or hard) their product is to use.
Together these are the primary metrics that drive everything Venngage does.
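In code, both north star metrics reduce to simple aggregations over an event log. The sketch below is a hypothetical version – the field names and timestamps are invented, not Venngage's actual schema – showing completions in a period (metric one) and median minutes to complete (metric two).

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical event log: one record per completed infographic,
# with start and finish timestamps.
completions = [
    {"started": datetime(2019, 3, 4, 10, 0), "finished": datetime(2019, 3, 4, 10, 25)},
    {"started": datetime(2019, 3, 5, 9, 0),  "finished": datetime(2019, 3, 5, 9, 40)},
    {"started": datetime(2019, 3, 6, 14, 0), "finished": datetime(2019, 3, 6, 15, 10)},
]

def completed_in_period(events, start, end):
    """North star #1: how many infographics were completed in the period."""
    return sum(start <= e["finished"] < end for e in events)

def median_minutes_to_complete(events):
    """North star #2: typical time (in minutes) to finish an infographic."""
    return median((e["finished"] - e["started"]).total_seconds() / 60
                  for e in events)

week_start = datetime(2019, 3, 4)
week_end = week_start + timedelta(days=7)
print(completed_in_period(completions, week_start, week_end))
print(median_minutes_to_complete(completions))
```

The median (rather than the mean) is a deliberate choice for the second metric: a handful of abandoned-and-resumed sessions would otherwise dominate the average.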
The 50/50 focus split
Because both the company and the marketing department focus on customer acquisition and customer retention, every person in marketing spends half their time working to improve the first north star metric and the other half working to improve the second.
Marketing driving product design
Those north star metrics have led Venngage to develop what I call marketing-driven product design. Everywhere I’ve ever worked has claimed to do this. The way Venngage does it exceeds anything ever done at a company I’ve worked for.
“How do I be good?”
This part of Nadya’s story reminds me of the start of a promo video I once saw for MasterClass.com. It’s such a good segue to this part of the story that I cropped out all but the good part to include in this article.
When Steve Martin shed light on an important marketing question
I’ve encountered a number of companies through the years who thought of marketing as “generating leads” and “selling it”, rather than “how do we learn what our customers want?”, or “how do we make our product easier to use?”
The company is structured into cross-functional squads, a cross-functional squad being people from various departments within Venngage, all working to improve a company-wide metric.
For example, one of the aspects of their infographic product is templates. A template is a starting point for building an infographic.
As templates are their largest customer acquisition channel, they created a “Template Squad”, whose job is to work on their two north star metrics for their templates.
The squad consists of developers, designers, UI/UX people, and the squad leader, who is someone in marketing. Personally, I love this marketing focus, as it de-focuses marketing and causes marketing to be something that permeates everything the company does.
There is another squad devoted to internationalization, which, as you can infer, is responsible for improving their two north star metrics for users in countries around the world.
Each template squad member is tasked with improving their two north star metrics.
Ideas on how to do this come from squad members with various backgrounds and ideas.
Each idea is translated into a testable hypothesis. Modifications are done weekly. As you can imagine, Venngage is heavy into analytics: without detailed and sophisticated analytics, they wouldn’t know which experiments worked and which didn’t.
Examples of ideas that worked are:
- Break up the templates page into a series of pages, each containing either a category of templates or a single template.
- Ensure each template page contains SEO keywords specific to the appropriate industry or audience segment. This is described in more detail further in this document.
- Undo the forced backlink each of the embedded templates used to contain.
  - This allowed them to get initial traction, but it later resulted in a Google penalty.
  - This is a prime example of an SEO tactic that worked until it didn’t.
- Create an SEO checklist for all template pages with a focus on technical SEO.
  - This eliminated human error from the process.
- Eliminate “React headers” Google was not indexing.
- Determine which infographic templates and features people don’t use, and eliminate them.
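Pass-or-fail calls like these are typically made with a significance test. The sketch below is a generic two-proportion z-test in plain Python – not Venngage's actual tooling, and the registration counts are invented – that turns an experiment's numbers into a pass / fail / inconclusive verdict of the kind described above.

```python
from math import sqrt

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-score: did variant B convert better than A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error of the difference
    return (p_b - p_a) / se

def verdict(conv_a, n_a, conv_b, n_b, threshold=1.96):
    """Pass (keep the change), fail (abandon it), or keep testing.
    threshold=1.96 corresponds to roughly 95% confidence."""
    z = z_score(conv_a, n_a, conv_b, n_b)
    if z > threshold:
        return "pass"
    if z < -threshold:
        return "fail"
    return "inconclusive"

# e.g. old template page: 300/10000 registrations; split pages: 390/10000
print(verdict(300, 10000, 390, 10000))
```

The weekly cadence matters here: each experiment needs enough traffic for the test to reach a verdict, which is one reason small, frequent iterations work better than rare, sweeping ones.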
I personally think this is really important. To obtain outputs, they measured inputs. When the goal was to increase registrations, they identified the things they had to do to increase registrations, then measured how much of that they did every week.
Everyone does SEO
In the same way that marketing is something that does not stand alone, but rather permeates everything Venngage does, SEO does not stand alone. It permeates everything marketing does. Since organic search traffic is the number one source of leads, they ensure everyone in marketing knows the basics of technical SEO and understands the importance of this never being neglected.
Beliefs and values
While I understand the importance of beliefs and values in human psychology, it was refreshing to see this being proactively addressed within an organization in the context of improving their north star metrics.
They win and lose together
Winning and losing together is a core belief at Venngage. Nadya states it minimizes blame and finger-pointing. When they win, they all win. When they lose, they all lose. It doesn’t matter who played what part. To use a sports analogy, a good assist helps to score a goal. A bad assist, well, that’s an opportunity to learn.
SEO is a team effort
While it is technically possible for a single person to do SEO, the volume of tasks required these days makes it impractical. SEO requires quality content, technical SEO, and building backlinks through content promotion, guest posting, and other tactics. Venngage is a great example of effectively distributing SEO responsibilities through the marketing department.
To illustrate the importance of the various pieces fitting together, consider that while content is king, technical SEO is what gets content found, but when people find crappy content, it doesn’t convert.
You can’t manage what you don’t measure
This requires no elaboration.
But what you measure matters
This probably does justify some elaboration. We’ve all been in organizations that measured stupid stuff. By narrowing down to their two north star metrics, then focusing their efforts to improving those metrics, they’ve aligned everyone’s activity towards things that matter.
The magic of incremental improvements
This is the Japanese concept of Kaizen put into play for the development and marketing of a software product.
Done slightly differently, this concept helped Britain dominate competitive cycling at the 2008 Olympics in Beijing.
Customer acquisition is not enough
Venngage developed their second north star metric after deciding that acquiring new customers was not, in and of itself, any form of the Holy Grail. They realized that if their product was hard to use, fewer people would use it.
They decided a good general metric of how easy the product is to use was to measure how long people take to build an infographic. If people took “too long”, they spoke to them about why.
This led them to change the product in ways to make it easier to use.
Link building is relationship building
As a reader of Search Engine Watch, you know link building is critical and central to SEO. In the same way that everyone in Venngage marketing must know the basics of technical SEO, everyone in Venngage marketing must build links.
They do so via outreach to promote their content. As people earn links from the content promotion outreach, they record those links in a shared spreadsheet.
While this next bit is related to link building, everyone in Venngage marketing has traffic goals as well.
This too is tracked in a simple and reasonable way. Various marketers own different “areas” or “channels”. These channels are broken down into specific traffic acquisition metrics.
As new hires get more familiar with how things work at Venngage, they are guided into traffic acquisition channels which they want to work on.
Learning experience, over time
My attempt here is to provide a chronology of what they learned in what order. It may help you avoid some of the mistakes they made.
Cheating works until it doesn’t
Understanding the importance of links to search ranking, they thought it would be a good idea to implement their infographics with embedded backlinks. Each implemented infographic contained a forced backlink to the Venngage website.
They identified a set of anchor text they thought would be beneficial to them and rotated through them for these forced backlinks.
And it worked, for a while. Until they realized they had invited a Google penalty. This took a bit to clean up.
The lessons learned:
- The quality of your backlinks matter.
- To attract quality backlinks, publish quality content.
Blog posts brought in users who activated
At some point, their analytics helped them realize that users who activated from blog posts were ideal users for them. So they set a goal to increase activations from blog posts, which led to the decision to test whether breaking up templates into category pages and individual single-template pages made sense. It did.
Website design matters
Changing the website from one big template page to thousands of smaller ones helped, and not just because it greatly increased the number of URLs indexed by Google. It also greatly improved the user experience. It made it easier for their audience to find templates relevant to them, without having to look at templates that weren’t.
Lesson learned: UI/UX matters for both users and SEO.
Hybrid content attracts
Hybrid content is an article that talks about two main things. For example, an article about sorting people into Hogwarts houses, framed within the context of an infographic: it brings in some number of Harry Potter fans, some of whom have an interest in creating infographics. The key to success is tying the two different topics together well.
Content is tuneable
By converting one huge templates page into thousands of small template pages, they realized that a template or set of templates that appeal to one audience segment would not necessarily appeal to others. This caused them to start to tune templates towards audience segments in pursuit of more long tail organic search traffic.
How did they figure out what users wanted in terms of better content? They used a combination of keyword research and talking with users and prospects.
Some content doesn’t make the cut
After they caught onto the benefits of tuning content to attract different audience segments, they looked for content on their site that no one seemed to care about. They deleted it. While it decreased the amount of content on their site, it increased their overall content quality.
Traffic spikes are not always good news
When they initially started creating forced backlinks in their infographics, they could see their traffic increase. They saw some spikes. Their general thought was more traffic is good.
When they experienced the Google penalty, they realized how wrong they were. Some traffic spikes are bad news. Others are good news.
When your website traffic shows a sudden change, even if you’re experiencing a spike in organic search traffic, you must dig into the details and find out the root cause.
Lesson learned: There is such a thing as bad traffic. Some traffic warns you of a problem.
Links from product embeds aren’t all bad
They just needed to make the embedded links optional, letting customers decide whether or not Venngage deserved a backlink. While this did not change their levels of organic search traffic, it was necessary to resolve the Google penalty.
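The opt-in embed idea can be sketched with a tiny helper. This is hypothetical code illustrating the approach; Venngage's actual embed implementation isn't public, and the function and URL here are assumptions:

```python
def build_embed_html(infographic_url: str, include_backlink: bool = False) -> str:
    """Build an infographic embed snippet.

    The attribution backlink is strictly opt-in: it is added only when the
    customer explicitly chooses to credit the tool, rather than being forced
    into every embed (the practice that earned the Google penalty).
    """
    html = f'<img src="{infographic_url}" alt="Infographic">'
    if include_backlink:
        html += '\n<p><a href="https://venngage.com">Made with Venngage</a></p>'
    return html

# By default, no backlink is injected:
print(build_embed_html("https://example.com/chart.png"))
```

The design point is simply that the link becomes the customer's choice, which turns a forced link scheme into a legitimate, earned citation.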
Incremental continuous improvement seems repetitive and boring. A one percent tweak here, a two percent tweak there, but over time, you’ve tripled your organic search traffic and your lead flow.
It’s not necessarily fun, but it delivers results.
Lesson learned: What I’ll call “infrastructure” is boring, and it matters. Both for your product and your SEO.
Figure out what to measure
The idea of measuring the amount of time required to complete an infographic did not occur to them on day one. This idea came up when they were looking for a metric to indicate to them how easy (or difficult) their product was to use.
Once they decided this metric made sense, they determined their baseline, then worked through an iterative process of product improvements, each making completion a little faster.
As they did so, the feedback from the users was positive, so they doubled down on this effort.
Lesson learned: What you measure matters.
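As a sketch of how such a metric might be computed from product analytics events (the event shape and numbers here are invented for illustration, not Venngage's actual schema):

```python
from statistics import median

# Hypothetical (start, finish) timestamps in seconds for design sessions,
# e.g. pulled from a product analytics tool such as Mixpanel.
sessions = [(0, 1260), (30, 980), (100, 2400), (0, 1500), (60, 840)]

# Convert each session to a duration in minutes, then take the median as
# the baseline "time to complete an infographic" metric.
durations_min = [(end - start) / 60 for start, end in sessions]
baseline = median(durations_min)
print(f"median time to complete an infographic: {baseline:.1f} minutes")
# → median time to complete an infographic: 21.0 minutes
```

A median is a reasonable choice here because a few abandoned or marathon sessions would badly skew a simple average.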
Teach your coworkers well
They created “The Playbook”, which is a compendium of the combined knowledge they’ve accumulated over time. The playbook is written by them, for them.
Marketing employees are required to add chapters to the playbook as they learn new skills and methods.
Its primary purpose is to bring new team members up to speed quickly, and it also serves as a historical record of what did and did not work.
One important aspect of continuous improvement is for new people to avoid suggesting experiments that previously failed.
Additionally (and I love this), every month everyone in marketing gives Nadya an outline of what they’re learning and what they’re improving on.
Their marketing stack
While their marketing stack is not essential to understanding their processes, I find it useful to understand what software tools a marketing organization uses, and for what. So here is theirs. This is not a list of what they’ve used and abandoned over time, but rather a list of what they use now.
- Analytics: Google Analytics and Mixpanel
- Customer communications: Intercom
- Link analysis and building: Ahrefs
- Link building outreach: Mailshake
- Project management: Trello
- General purpose: G Suite
To me, what Nadya has done at Venngage is a case study in how to do SEO right, and most of doing it right is not technical SEO work.
- Help senior management understand that some things that are not typically thought of as SEO (website design for example) can have serious SEO implications.
- Get senior management buy-in to include these non-SEO functions in your SEO efforts.
- Understand which few basic metrics matter for your company, and how to measure them.
- Distribute required SEO work through as many people as reasonably possible. Include people whose job functions are not necessarily SEO related (writers, designers, UI/UX, and more).
- Test and measure everything.
- Win big through a continuous stream of small incremental improvements.
Venngage has led by example, and the guidelines and pointers shared above can help your organization turn search into a source of increased sales.
Kevin Carney is the Founder and CEO of the boutique link building agency Organic Growth.
The post SEO case study: How Venngage turned search into their primary lead source appeared first on Search Engine Watch.
When it comes to best practices in digital marketing, practitioners should embrace a “trust, but verify” mindset. Specifically, adopt best practices when possible, but don’t assume that they’ll improve performance – every account exists in a different context with a multitude of different intervening variables that can affect the impact of any initiative. One such best practice that deserves interrogation is the adoption of promotions in ad copy in Google Ads.
Read more at PPCHero.com
No one likes being stalked around the Internet by adverts. It’s the uneasy joke you can’t enjoy laughing at. Yet vast people-profiling ad businesses have made pots of money off of an unregulated Internet by putting surveillance at their core.
But what if creepy ads don’t work as claimed? What if all the filthy lucre that’s currently being sunk into the coffers of ad tech giants — and far less visible but no less privacy-trampling data brokers — is literally being sunk, and could both be more honestly and far better spent?
Case in point: This week Digiday reported that the New York Times managed to grow its ad revenue after it cut off ad exchanges in Europe. The newspaper did this in order to comply with the region’s updated privacy framework, GDPR, which includes a regime of supersized maximum fines.
The newspaper business decided it simply didn’t want to take the risk, so first blocked all open-exchange ad buying on its European pages and then nixed behavioral targeting. The result? A significant uptick in ad revenue, according to Digiday’s report.
“NYT International focused on contextual and geographical targeting for programmatic guaranteed and private marketplace deals and has not seen ad revenues drop as a result, according to Jean-Christophe Demarta, SVP for global advertising at New York Times International,” it writes.
“Currently, all the ads running on European pages are direct-sold. Although the publisher doesn’t break out exact revenues for Europe, Demarta said that digital advertising revenue has increased significantly since last May and that has continued into early 2019.”
It also quotes Demarta summing up the learnings: “The desirability of a brand may be stronger than the targeting capabilities. We have not been impacted from a revenue standpoint, and, on the contrary, our digital advertising business continues to grow nicely.”
So while (of course) not every publisher is the NYT, publishers that have or can build brand cachet, and pull in a community of engaged readers, must and should pause for thought — and ask who is the real winner from the notion that digitally served ads must creep on consumers to work?
The NYT’s experience puts fresh taint on long-running efforts by tech giants like Facebook to press publishers to give up more control and ownership of their audiences by serving and even producing content directly for the third party platforms. (Pivot to video anyone?)
Such efforts benefit platforms because they get to make media businesses dance to their tune. But the self-serving nature of pulling publishers away from their own distribution channels (and content convictions) looks to have an even more base string to its bow — as a cynical means of weakening the link between publishers and their audiences, thereby risking making them falsely reliant on adtech intermediaries squatting in the middle of the value chain.
There are other signs behavioural advertising might be a gigantically self-serving con too.
Look at non-tracking search engine DuckDuckGo, for instance, which has been making a profit by serving keyword-based ads and not profiling users since 2014, all the while continuing to grow usage — and doing so in a market that’s dominated by search giant Google.
DDG recently took in $10M in VC funding from a pension fund that believes there’s an inflection point in the online privacy story. These investors are also displaying strong conviction in the soundness of the underlying (non-creepy) ad business, again despite the overbearing presence of Google.
Meanwhile, Internet users continue to express widespread fear and loathing of the ad tech industry’s bandwidth- and data-sucking practices by running into the arms of ad blockers. Figures for usage of ad blocking tools step up each year, with between a quarter and a third of U.S. connected device users estimated to be blocking ads as of 2018 (rates are higher among younger users).
Ad blocking firm Eyeo, maker of the popular AdBlock Plus product, has achieved such a position of leverage that it gets Google et al to pay it to have their ads whitelisted by default — under its self-styled ‘acceptable ads’ program. (Though no one will say how much they’re paying to circumvent default ad blocks.)
So the creepy ad tech industry is not above paying other third parties for continued — and, at this point, doubly grubby (given the ad blocking context) — access to eyeballs. Does that sound even slightly like a functional market?
In recent years, expressions of disgust and displeasure have also been coming from the ad-spending side — triggered by brand-denting scandals attached to the hateful stuff algorithms have been serving shiny marketing messages alongside. You don’t even have to be worried about what this stuff might be doing to democracy to be a concerned advertiser.
Fast moving consumer goods giants Unilever and Procter & Gamble are two big spenders which have expressed concerns. The former threatened to pull ad spend if social network giants didn’t clean up their act and prevent their platforms algorithmically accelerating hateful and divisive content.
While the latter has been actively reevaluating its marketing spending — taking a closer look at what digital actually does for it. And last March Adweek reported it had slashed $200M from its digital ad budget yet had seen a boost in its reach of 10 per cent, reinvesting the money into areas with “‘media reach’ including television, audio and ecommerce”.
The company’s CMO, Marc Pritchard, declined to name which companies it had pulled ads from but in a speech at an industry conference he said it had reduced spending “with several big players” by 20 per cent to 50 per cent, and still its ad business grew.
So chalk up another tale of reduced reliance on targeted ads yielding unexpected business uplift.
At the same time, academics are digging into the opaquely shrouded question of who really benefits from behavioral advertising. And perhaps getting closer to an answer.
Last fall, at an FTC hearing on the economics of big data and personal information, Carnegie Mellon University professor of IT and public policy, Alessandro Acquisti, teased a piece of yet to be published research — working with a large U.S. publisher that provided the researchers with millions of transactions to study.
Acquisti said the research showed that behaviourally targeted advertising had increased the publisher’s revenue but only marginally. At the same time they found that marketers were having to pay orders of magnitude more to buy these targeted ads, despite the minuscule additional revenue they generated for the publisher.
“What we found was that, yes, advertising with cookies — so targeted advertising — did increase revenues — but by a tiny amount. Four per cent. In absolute terms the increase in revenues was $0.000008 per advertisement,” Acquisti told the hearing. “Simultaneously we were running a study, as merchants, buying ads with a different degree of targeting. And we found that for the merchants sometimes buying targeted ads over untargeted ads can be 500% as expensive.”
“How is it possible that for merchants the cost of targeting ads is so much higher whereas for publishers the return on increased revenues for targeted ads is just 4%,” he wondered, posing a question that publishers should really be asking themselves — given, in this example, they’re the ones doing the dirty work of snooping on (and selling out) their readers.
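The arithmetic behind that question can be made concrete. Only the 4% uplift, the $0.000008 per ad, and the roughly five-fold cost figure come from Acquisti's talk; the baseline revenue per ad below is an assumption implied by treating $0.000008 as a 4% relative uplift:

```python
# Publisher side: targeted (cookie-based) ads added $0.000008 of revenue
# per ad, a 4% relative uplift — implying an assumed baseline of ~$0.0002.
uplift_per_ad = 0.000008
baseline_rev_per_ad = uplift_per_ad / 0.04   # ≈ $0.0002 per ad (assumed)

# Advertiser side: targeted ads can cost on the order of 5x ("500%")
# what untargeted ads do, per the talk.
cost_multiple = 5.0
revenue_multiple = (baseline_rev_per_ad + uplift_per_ad) / baseline_rev_per_ad

print(f"advertiser cost multiple: {cost_multiple:.0f}x")
print(f"publisher revenue multiple: {revenue_multiple:.2f}x")  # ~1.04x
```

The asymmetry is the whole point: the buy side pays a multiple of the untargeted price while the sell side captures only a few percent more, leaving the gap to the intermediaries in between.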
Acquisti also made the point that a lack of data protection creates economic winners and losers, arguing this is unavoidable — and thus qualifying the oft-parroted tech industry lobby line that privacy regulation is a bad idea because it would benefit an already dominant group of players. The rebuttal is that a lack of privacy rules also does that. And that’s exactly where we are now.
“There is a sort of magical thinking happening when it comes to targeted advertising [that claims] everyone benefits from this,” Acquisti continued. “Now at first glance this seems plausible. The problem is that upon further inspection you find there is very little empirical validation of these claims… What I’m saying is that we actually don’t know very well to what extent these claims are true and false. And this is a pretty big problem because so many of these claims are accepted uncritically.”
There’s clearly far more research that needs to be done to robustly interrogate the effectiveness of targeted ads against platform claims and vs more vanilla types of advertising (i.e. which don’t demand reams of personal data to function). But the fact that robust research hasn’t been done is itself interesting.
Acquisti noted the difficulty of researching “opaque blackbox” ad exchanges that aren’t at all incentivized to be transparent about what’s going on. Also pointing out that Facebook has sometimes admitted to having made mistakes that significantly inflated its ad engagement metrics.
His wider point is that much current research into the effectiveness of digital ads is problematically narrow and so is exactly missing a broader picture of how consumers might engage with alternative types of less privacy-hostile marketing.
In a nutshell, then, the problem is the lack of transparency from ad platforms — a lack that serves the selfsame opaque giants.
But there’s more. Critics of the current system point out it relies on mass scale exploitation of personal data to function, and many believe this simply won’t fly under Europe’s tough new GDPR framework.
They are applying legal pressure via a set of GDPR complaints, filed last fall, that challenge the legality of a fundamental piece of the (current) adtech industry’s architecture: Real-time bidding (RTB); arguing the system is fundamentally incompatible with Europe’s privacy rules.
We covered these complaints last November but the basic argument is that bid requests essentially constitute systematic data breaches because personal data is broadcast widely to solicit potential ad buys and thereby poses an unacceptable security risk — rather than, as GDPR demands, people’s data being handled in a way that “ensures appropriate security”.
To spell it out, the contention is the entire behavioral advertising business is illegal because it’s leaking personal data at such vast and systematic scale it cannot possibly comply with EU data protection law.
Regulators are considering the argument, and courts may follow. But it’s clear adtech systems that have operated in opaque darkness for years, with no worry of major compliance fines, no longer have the luxury of being able to take their architecture as a given.
Greater legal risk might be catalyst enough to encourage a market shift towards less intrusive targeting; ads that aren’t targeted based on profiles of people synthesized from heaps of personal data but, much like DuckDuckGo’s contextual ads, are only linked to a real-time interest and a generic location. No creepy personal dossiers necessary.
If Acquisti’s research is to be believed — and here’s the kicker for Facebook et al — there’s little reason to think such ads would be substantially less effective than the vampiric microtargeted variant that Facebook founder Mark Zuckerberg likes to describe as “relevant”.
The ‘relevant ads’ badge is of course a self-serving concept which Facebook uses to justify creeping on users while also pushing the notion that its people-tracking business inherently generates major extra value for advertisers. But does it really do that? Or are advertisers buying into another puffed up fake?
Facebook isn’t providing access to internal data that could be used to quantify whether its targeted ads are really worth all the extra conjoined cost and risk. While the company’s habit of buying masses of additional data on users, via brokers and other third party sources, makes for a rather strange qualification. Suggesting things aren’t quite what you might imagine behind Zuckerberg’s drawn curtain.
Behavioral ad giants are facing growing legal risk on another front. The adtech market has long been referred to as a duopoly, on account of the proportion of digital ad spending that gets sucked up by just two people-profiling giants: Google and Facebook (the pair accounted for 58% of the market in 2018, according to eMarketer data) — and in Europe a number of competition regulators have been probing the duopoly.
Earlier this month the German Federal Cartel Office was reported to be on the brink of partially banning Facebook from harvesting personal data from third party providers (including but not limited to some other social services it owns). Though an official decision has yet to be handed down.
While, in March 2018, the French Competition Authority published a meaty opinion raising multiple concerns about the online advertising sector — and calling for an overhaul and a rebalancing of transparency obligations to address publisher concerns that dominant platforms aren’t providing access to data about their own content.
The EC’s competition commissioner, Margrethe Vestager, is also taking a closer look at whether data hoarding constitutes a monopoly. And has expressed a view that, rather than breaking companies up in order to control platform monopolies, the better way to go about it in the modern ICT era might be by limiting access to data — suggesting another potentially looming legal headwind for personal data-sucking platforms.
At the same time, the political risks of social surveillance architectures have become all too clear.
Whether microtargeted political propaganda works as intended or not is still a question mark. But few would support letting attempts to fiddle elections just go ahead and happen anyway.
Yet Facebook has rushed to normalize what are abnormally hostile uses of its tools; aka the weaponizing of disinformation to further divisive political ends — presenting ‘election security’ as just another day-to-day cost of being in the people farming business. When the ‘cost’ for democracies and societies is anything but normal.
Whether or not voters can be manipulated en masse via the medium of targeted ads, the act of targeting itself certainly has an impact — by fragmenting the shared public sphere which civilized societies rely on to drive consensus and compromise. Ergo, unregulated social media is inevitably an agent of antisocial change.
The solution to technology threatening democracy is far greater transparency: regulating platforms so that we can understand how, why and where data is flowing, and thus get a proper handle on impacts in order to shape desired outcomes.
Greater transparency also offers a route to begin to address commercial concerns about how the modern adtech market functions.
And if and when ad giants are forced to come clean — about how they profile people; where data and value flows; and what their ads actually deliver — you have to wonder what if anything will be left unblemished.
People who know they’re being watched alter their behavior. Similarly, platforms may find behavioral change enforced upon them, from above and below, when it becomes impossible for everyone else to ignore what they’re doing.
Let’s face it. We live in a fast-paced era and work in a technology-driven industry where the ability and expectations to quickly make changes can inhibit critical thinking. In fact, when speaking to the pros of paid search, the speed at which marketers are able to make changes based on data is typically a highlight. We can increase bids, change budgets, update ads, migrate to new platforms, and more with a little analysis and a click of a mouse. However, from time to time the pace at which we can move can sometimes leave us missing the biggest piece of the puzzle – the…
Read more at PPCHero.com