Organizations spend ungodly amounts of money — millions of dollars — on business intelligence (BI) tools. Yet, adoption rates are still below 30%. Why is this the case? Because BI has failed businesses.
Logi Analytics’ 2021 State of Analytics: Why Users Demand Better survey showed that knowledge workers spend more than five hours a day in analytics, and more than 99% consider analytics very to extremely valuable when making critical decisions. Unfortunately, many are dissatisfied with their current tools due to the loss of productivity, multiple “sources of truth,” and the lack of integration with their current tools and systems.
A gap exists between the functionalities provided by current BI and data discovery tools and what users want and need.
Throughout my career, I’ve spoken with many executives who wonder why BI continues to fail them, especially when data discovery tools like Qlik and Tableau have gained such momentum. The reality is, these tools are great for a very limited set of use cases among a limited audience of users — and the adoption rates reflect that reality.
Data discovery applications allow analysts to link with data sources and perform self-service analysis, but they still come with major pitfalls. Lack of self-service customization, the inability to integrate into workflows with other applications, and an overall lack of flexibility seriously impact the ability of most users (who aren’t data analysts) to derive meaningful information from these tools.
BI platforms and data discovery applications are supposed to launch insight into action, informing decisions at every level of the organization. But many organizations are instead left with costly investments that actually create inefficiencies, hinder workflows and exclude the vast majority of employees who could benefit from those operational insights. Now that’s what I like to call a lack of ROI.
Business leaders across a variety of industries — including “legacy” sectors like manufacturing, healthcare and financial services — are demanding better and, in my opinion, they should have gotten it long ago.
It’s time to abandon BI — at least as we currently know it.
Here’s what I’ve learned over the years about why traditional BI platforms and newer tools like data discovery applications fail and what I’ve gathered from companies that moved away from them.
The inefficiency breakdown is killing your company
Traditional BI platforms and data discovery applications require users to exit their workflow to attempt data collection. And, as you can guess, stalling teams in the middle of their workflow creates massive inefficiencies. Instead of having the data you need to make a decision readily available, you have to exit the application, enter another application, secure the data and then reenter the original application.
According to the 2021 State of Analytics report, 99% of knowledge workers had to spend additional time searching for information they couldn’t easily locate in their analytics solution.
- Agencies are particularly struggling to find ways to gain a broad view of the search market.
- Many agencies rely too heavily on Google tools, which only provide top-level search insights, and need better tools.
- COVID-19 is resulting in surprising search results and agencies are having trouble explaining these outcomes without proper data.
Search advertising is one of the most dynamic and rapidly evolving areas of the advertising ecosystem today. And as search continues to emerge as the barometer by which all other advertising activities are gauged, the need for sophisticated search intelligence has never been higher.
Yet, agencies, in particular, are continuing to have difficulties deriving the search intelligence they need and finding ways to unlock the potential of the insights that they already have on hand. Moreover, as agencies continue to invest in more data-generating tools, they are having to sift through more data than ever, and are struggling to keep up.
With that in mind, below are some key items agencies should keep in mind when it comes to their search intelligence infrastructure and how they can get the best out of it.
Agencies only have a fragmented search view
The search landscape is vast and continues to reshape itself on a daily basis. Therefore, having the most comprehensive view of the search landscape and all its nuances is imperative to driving success and making the most informed decisions possible. And data is the key component in building this holistic view.
Incomplete and inaccurate data can not only depress campaign effectiveness but can also have detrimental impacts on an advertiser’s standing versus competitors. For example, without high-quality data insights, it becomes impossible for advertisers to detect when competitors start to encroach on their brand terms — among other things. However, with the proper data tools in place, agencies can build better strategies for clients so that they can achieve maximum ROI and protect their market position.
Google tools don’t allow for proper performance analysis
While Google does provide a top-level view of search performance, it does not do so in nearly the depth that agencies need to properly explain performance to their clients, particularly as it relates to competitor activity. Agencies need to be able to quickly justify why performance has changed and what steps can be taken to address these fluctuations — positive or negative. And Google simply does not allow them to do this. Additionally, without a comprehensive set of insights, it can be very hard for agencies to justify budget needs to their clients, or to explain how to counteract the spend that competitors are dedicating to certain segments. So agencies should be very wary of relying only on Google’s analytics tools.
Explaining the COVID-19 effect
As COVID-19 has disrupted consumer online and search behavior, it has also materially impacted the search industry. From differences in the types of searches to a growing prevalence of local search as individuals look to stay closer to home amid the pandemic, the entire search industry is scrambling to make sense of what may unfold next as a result of the current crisis. In addition, as we move toward the conclusion of the pandemic, search professionals are also being tasked with figuring out which pandemic-era trends may stick around and which ones won’t, adding a further layer of complexity to this already hectic period. Questions like, “Which industries will emerge first?” and “Which competitors will emerge fastest?” all need to be answered.
Luckily, by embracing a more ‘whole-market’ approach to data, agencies can quickly make sense of the changes that are occurring and deliver data-driven explanations to clients seeking answers for why an unexpected outcome took place. Furthermore, agencies can keep track of which pandemic-era trends seem to have “staying power” and plan accordingly.
Enabling a holistic view
Given how many different silos exist organizationally at agencies, it isn’t surprising that synthesizing all of the data that exists and reporting on it is hugely labor-intensive. This can be particularly challenging for agencies that are assessing strategies across the full complement of clients’ advertising activity, including traditional channels such as TV and radio along with other digital channels like mobile and paid social in addition to search.
Breaking down the walls that exist between the different branches of agencies is the only way to get the “truth” when it comes to reporting. This means making sure that the data is fully harmonized, comparable, and accessible through an integrated tool that provides the right capabilities for each agency role. AI can also play a critical role in creating fast, highly usable insights that can quickly translate into action. This integrated and intelligent approach will cut down significantly on time spent generating reports while also making an agency’s performance much more agile, effective, and accurate.
After having to deal with a tremendous amount of upheaval and rethinking over the last decade, the idea of having to adapt is not a new one for agencies. Yet, while agencies have done well to roll with the times thus far, search still represents a bit of a pain point. However, by re-examining the current state of their data operations, agencies can boost their search intelligence exponentially, while making their entire business more intelligent as well.
Ian O’Rourke is CEO at Adthena and Stephen Davis is the Global Product Leader for Media Intelligence at Kantar, a leading British market research company.
As the pandemic rages on, companies are looking for an edge when it comes to sales. Having the right data about the customers most likely to convert can be a huge boost right now. Slintel, an early-stage startup building a sales intelligence tool, announced a $4.2 million seed round today.
The investment was led by Accel with help from Sequoia Capital India and existing investor Stellaris Venture Partners. The company reports it has now raised $5.7 million, including a pre-seed round last year.
Deepak Anchala, company founder and CEO, says that while sales and marketing teams are trying to target a broad market, most of the time their emails and other forms of communication with customers fall flat. As a salesperson at previous startups Eightfold and Tracxn, Anchala experienced this problem firsthand. He believed that with data he could improve this, and he started Slintel to build a tool that provides the sales data he was missing in those previous positions.
“We focus on helping our customers solve that [lack of data] by identifying people with high buying intent. So we are able to tell sales and marketing teams, for example, who is most likely to buy your product or your service, and who is most likely to buy your product today, as opposed to two months or six months from now,” Anchala explained.
They do this by looking at signals that might not be obvious, but which let sales teams know key information about these companies and their likelihood of buying soon. He says that every company leaves a technology footprint. This could be data from SEC filings, annual reports, job openings and so forth.
“In today’s world there is an enormous amount of footprint left online when a company uses a certain product. So what our algorithms do is we map that at scale for about 15 million companies to all the products that they’re using from the different sources we are able to identify — and we track it all from week to week,” he said.
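Slintel hasn’t published its algorithms, but the general idea of mapping a company’s technology footprint can be sketched as keyword matching over public text such as job posts and filings. The keyword table, function name and sample documents below are purely illustrative assumptions, not the company’s actual approach:

```python
# Illustrative keyword table; a real system would use far richer signals.
PRODUCT_KEYWORDS = {
    "kubernetes": "Kubernetes",
    "salesforce": "Salesforce",
    "snowflake": "Snowflake",
}

def detect_stack(documents):
    """Return the products mentioned anywhere in a company's public text."""
    found = set()
    for doc in documents:
        text = doc.lower()
        for keyword, product in PRODUCT_KEYWORDS.items():
            if keyword in text:
                found.add(product)
    return sorted(found)

job_posts = [
    "Seeking a DevOps engineer with Kubernetes experience.",
    "Our analytics team uses Snowflake daily.",
]
print(detect_stack(job_posts))  # → ['Kubernetes', 'Snowflake']
```

Run weekly over millions of companies, even a matcher this naive yields the week-to-week change signal the quote describes; the hard parts are sourcing the documents and disambiguating the mentions.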
The company has 45 employees today and expects to double that number by the end of 2021. As he builds the company, especially as an immigrant founder, Anchala wants to build a diverse and inclusive organization.
“I think one of the key successes for companies today is having diversity. We have a global workforce, so we have a workforce in the U.S. and India and we want to capitalize on that. In the next phase of hires we are looking at hiring more diverse candidates, more female employees and people of different nationalities,” he said.
The company, which was founded in 2018 and emerged from stealth last year, has amassed 100 enterprise customers and has seen most of those customers come on board this year as COVID has forced companies to find ways to be more efficient with their sales processes.
Meet Watchful, a Tel Aviv-based startup coming out of stealth that wants to help you learn more about what your competitors are doing when it comes to mobile app development. The company tries to identify features that are being tested before getting rolled out to everyone, giving you an advantage if you’re competing with those apps.
Mobile app development has become a complex task, especially for the biggest consumer apps, from social to e-commerce. Usually, mobile development teams work on a new feature and try it out on a small subset of users. That process is called A/B testing, because you separate your customers into two buckets — bucket A or bucket B.
For instance, Twitter is trying out its own version of Stories called Fleets. The company first rolled it out in Brazil to track the reaction and get some data from its user base. If you live anywhere else in the world, you’re not going to see that feature.
There are other ways to select a group of users to try out a new feature — you could even take part in a test because you’ve been randomly picked.
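At its simplest, that random assignment is often implemented as a deterministic hash of the user ID, so each user consistently sees the same variant across sessions. A minimal sketch, with illustrative function and experiment names rather than any particular company’s implementation:

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, buckets=("A", "B")) -> str:
    """Deterministically assign a user to an experiment bucket.

    Hashing the experiment name together with the user ID gives a
    stable, roughly uniform split with no per-user state to store.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# The same user always lands in the same bucket for a given experiment,
# and different experiments split users independently.
print(assign_bucket("user-42", "stories-rollout"))
```

Seeding the hash with the experiment name is what lets many tests run at once without the same users always being the guinea pigs, which is exactly why two phones rarely show the same version of a big app.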
“When you open the app, you’ll probably see a different version from the app I see. You’re in a different region, you have a different device,” co-founder and CEO Itay Kahana told me. He previously founded popular to-do app Any.do.
For product designers, it has become a nightmare, as you can’t simply open an app and look at what your competitors are doing. At any point in time, there are as many different versions of the same app as there are A/B tests running simultaneously.
Watchful lets you learn from the competition by analyzing all those different versions and annotating changes in user flows, flagging unreleased features and uncovering design changes.
It is different from other mobile intelligence startups, such as App Annie or Sensor Tower. Those services mostly let you track downloads and rankings on the App Store and Play Store to uncover products that are doing well.
“We’re focused on everything that is open and visible to the users,” Kahana said.
Like other intelligence startups, Watchful needs data. App Annie acquired a VPN app called Distimo and a data usage monitoring app called Mobidia. When you activate those apps, App Annie captures data about your phone usage, such as the number of times you open an app and how much time you spend in those apps.
According to a BuzzFeed News report, Sensor Tower has operated at least 20 apps on iOS and Android to capture data, such as Free and Unlimited VPN, Luna VPN, Mobile Data and Adblock Focus. Some of those apps have been removed from the stores following BuzzFeed’s story.
I asked a lot of questions about Watchful’s source of data. “It’s all real users that give us access to this information. It’s all running on real devices, real users. We extract videos and screenshots from them,” Kahana said.
“It’s more like a panel of users that we have access to their devices. It’s not an SDK that is hidden in some app and collects information and do shady stuff,” he added.
You’ll have to trust him as the company didn’t want to elaborate further. Kahana also said that data is anonymized in order to remove all user information.
Images are then analyzed by a computer vision algorithm focused on differential analysis. The startup has a team in the Philippines that goes through all that data and annotates it. The annotated data is then sent to human analysts so that they can track apps and write reports.
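Watchful hasn’t detailed its pipeline, but “differential analysis” at its most basic means comparing two captures of the same screen and measuring what changed. A toy sketch, assuming screenshots are represented as 2D grids of pixel values (a real system works on actual images with far more robust comparisons):

```python
def changed_fraction(before, after):
    """Fraction of pixels that differ between two same-sized screenshots,
    each represented here as a 2D grid of pixel values."""
    total = sum(len(row) for row in before)
    changed = sum(
        1
        for row_a, row_b in zip(before, after)
        for px_a, px_b in zip(row_a, row_b)
        if px_a != px_b
    )
    return changed / total

# One of six pixels changed between the two captures.
before = [[0, 0, 0], [1, 1, 1]]
after = [[0, 9, 0], [1, 1, 1]]
print(changed_fraction(before, after))  # → 0.16666666666666666
```

Screens whose change score crosses a threshold are the ones worth sending to human annotators, which keeps the manual review queue proportional to what actually changed rather than to everything captured.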
Watchful shared one of those reports with TechCrunch earlier this year. Thanks to this process, the startup discovered that TikTok parent company ByteDance has been working on a deepfake maker. The feature was spotted in both TikTok and its Chinese sister app Douyin.
But Watchful’s customers aren’t news organizations. The company sells access to its service to big companies working in the mobile space. Kahana didn’t want to name them, but said the company is already working with “the biggest social network players and the biggest e-commerce players, mainly in the U.S.”
The startup sells annual contracts based on the number of apps that you want to track. It has raised a $3 million seed round led by Vertex Ventures.
Twitter chief executive Jack Dorsey and Facebook chief operations officer Sheryl Sandberg will testify in an open hearing at the Senate Intelligence Committee next week, the committee’s chairman has confirmed.
Larry Page, chief executive of Google parent company Alphabet, was also invited but has not confirmed his attendance, a committee spokesperson confirmed to TechCrunch.
Sen. Richard Burr (R-NC) said in a release that the social media giants will be asked about their responses to foreign influence operations on their platforms in an open hearing on September 5.
It will be the second time the Senate Intelligence Committee, which oversees the government’s intelligence and surveillance efforts, will have called the companies to testify. But it will be the first time that senior leadership will attend — though, Facebook chief executive Mark Zuckerberg did attend a House Energy and Commerce Committee hearing in April.
It comes in the wake of Twitter and Facebook recently announcing the suspension of accounts from their platforms that they believe to be linked to Iranian and Russian political meddling. Social media companies have been increasingly under the spotlight in the past years following Russian efforts to influence the 2016 presidential election with disinformation.
A Twitter spokesperson said the company didn’t yet have details to share on the committee’s prospective questions. TechCrunch also reached out to Google and Facebook for comment and will update when we hear back.
Mobile search drives billions of calls to business each year, and calls convert at a higher rate than digital leads. When properly optimized, calls can have a transformational impact on your bottom line. Join this webinar to learn tactical tips and smart strategies to boost your PPC results with call analytics.
Did you know that by 2020 the digital universe will consist of 44 zettabytes of data (source: IDC), while the human brain can only store the equivalent of about 1 million gigabytes of memory?
The explosion of big data has meant that humans simply have too much data to understand and handle daily.
For search, content and digital marketers to make the most of the valuable insights that data can provide, it is essential to utilize artificial intelligence (AI) applications, machine learning algorithms and deep learning to move the needle of marketing performance in 2018.
In this article, I will explain the advancements and differences between artificial intelligence (AI), machine learning and deep learning while sharing some tips on how SEO, content and digital marketers can make the most of the insights – especially from deep learning – that these technologies bring to the search marketing table.
I studied artificial intelligence in college and after graduating took a job in the field. It was an exciting time, but our programming capabilities, when looking back now, were rudimentary. More than intelligence, it was algorithms and rules that did their best to mimic how intelligence solves problems with best-guess recommendations.
Fast forward to today and things have evolved significantly.
The Big Bang: The big data explosion and the birth of AI
Since 1956, AI pioneers have been dreaming of a world where complex machines possess the same characteristics as human intelligence.
In 1996, the industry reached a major milestone when IBM’s Deep Blue computer defeated a chess grandmaster, considering 200,000,000 chess positions per second to make optimal moves.
Between 2000 and 2017, there were many developments that enabled great leaps forward. Most important were the geometric increases in the amount of data collected, stored and made retrievable. That mountain of data, which came to be known as big data, ushered in the advent of AI.
And it keeps growing exponentially: in 2016 IBM estimated that 90% of the world’s data had been generated over the last few years.
When thinking about AI, machine learning and deep learning, I find it helps to simplify and visualize how the three categories work and relate to each other – this framework also works from a chronological, sub-set development and size perspective.
Artificial intelligence is the science of making machines do things requiring human intelligence. It is human intelligence in machine format where computer programs develop data-based decisions and perform tasks normally performed by humans.
Machine learning takes artificial intelligence a step further in the sense that algorithms are programmed to learn and improve without the need for human data input and reprogramming.
Machine learning can be applied to many different problems and data sets. Google’s RankBrain algorithm is a great example of machine learning that evaluates the intent and context of each search query, rather than just delivering results based on programmed rules about keyword matching and other factors.
Deep learning is a more detailed algorithmic approach, taken from machine learning, that uses techniques based on logic and exposing data to neural networks (think human brain) so that the technology trains itself to perform tasks such as speech and image recognition.
Massive data sets are combined with pattern recognition capabilities to automatically make decisions, find patterns, emulate previous decisions, etc. Self-learning comes from this: the machine gets better as it is supplied with more data.
Driverless cars, Netflix movie recommendations and IBM’s Watson are all great examples of deep learning applications that break down tasks to make machine actions and assists possible.
Organic search, content and digital performance: Challenge and opportunity
Organic search (SEO) drives 51% of all website traffic, so it is only natural to explain the key benefits that deep learning brings to SEO and digital marketers.
Organic search is a data-intensive business. Companies value and want their content to be visible on thousands or even millions of keywords in one to dozens of languages. Search best practices involve about 20 elements of on-page and off-page tactics. The SERPs themselves now come in more than 15 layout varieties.
Organic search is your market-wide voice of the customer, telling you what customers want at scale. However, marketers are faced with the challenge of making sense of so much data, having limited resources to mine insights and then actually act on the right and relevant insight for their business.
Succeeding in highly demanding markets against your competitors’ many brands now requires the expertise of an experienced data analyst, and this is where machine learning and deep learning layers help recommend optimizations to content.
Connecting the dots with deep learning: Data and machine learning
The size of organic search data and the number of potential patterns that exist in that data make it a perfect candidate for deep learning applications. Unlike simple machine learning, deep learning works better when it can analyze a massive amount of relevant data over long periods of time.
Deep learning and its ability to identify or prioritize material changes in interests and consumption behavior allows organic search marketers to gain a competitive advantage, be at the forefront of their industry, and produce the material that people need before their competitors, boosting their reputation.
In this way, marketers can begin to understand the strategies put forth by their competitors. They will see how well they perform compared to others in their industry and can then adjust their strategies to address the strengths or weaknesses that they find.
- The insights derived from deep learning technologies blend the best of search marketing and content marketing practices to power the development, activation, and automated optimization of smart content: content that is self-aware and self-adjusting, improving content discovery and engagement across all digital marketing channels.
- Intent data offers in-the-moment context on where customers want to go and what they want to know, do, or buy. Organic search data is the critical raw material that helps you discover consumer patterns, new market opportunities, and competitive threats.
- Deep learning is particularly important in search, where data is copious and incredibly dynamic. Identifying patterns in data in real-time makes deep learning your best first defense in understanding customer, competitor, or market changes – so that you can immediately turn these insights into a plan to win.
To propel content and organic search success in 2018, marketers should let the machines do more of the legwork to provide the insights and recommendations that allow marketers to focus on the creation of smart content.
Below are just a few examples of the benefits for the organic search marketer:
Pinpoint and fix critical site errors that drive the greatest benefits to a brand’s bottom line. Deep learning technology can incorporate website data and detect anomalies, tying site errors to estimated marketing impact so that marketers can prioritize fixes for maximum results.
Without a deep learning application to help you, you might be staring at a long list of potential fixes that typically get postponed.
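As an illustration of the underlying idea only (not any vendor’s actual, proprietary method), an anomaly detector over site-error data can be as simple as a z-score test on a daily count:

```python
import statistics

def find_anomalies(daily_errors, threshold=2.0):
    """Flag (day_index, count) pairs that deviate strongly from the norm.

    A plain z-score test: any day more than `threshold` standard
    deviations from the mean is surfaced for prioritized review.
    """
    mean = statistics.mean(daily_errors)
    stdev = statistics.stdev(daily_errors)
    return [
        (day, count)
        for day, count in enumerate(daily_errors)
        if abs(count - mean) > threshold * stdev
    ]

crawl_errors = [12, 14, 11, 13, 12, 95, 13]  # note the spike at index 5
print(find_anomalies(crawl_errors))  # → [(5, 95)]
```

Production systems add seasonality, trend modeling and impact estimation on top, but the payoff is the same: the marketer sees the one spike worth fixing instead of the whole error log.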
Surface high-value topics that target different content strategies, such as stopping competitive threats or capitalizing on local demand.
Deep learning technology can be used to assess the ROI of new content items and prioritize their development by unveiling insights such as topic opportunity, consumer intent, characteristics of top competing content, and recommendations for improving content performance.
Score the quality and relevance of each piece of content produced. Deep learning technology can help save time by automating content production tasks, such as header tags, cross-linking, copy optimization, image editing, highly optimized CTAs that drive performance, and embedded performance tracking of website traffic and conversion.
Deep learning technology can help ensure that each piece of content is optimized for organic performance and customer experience—such as schema for structure, AMP for better mobile experiences, and Open Graph for Facebook. Technology can also help marketers amplify their content in social networks for greater visibility.
Automation helps marketers do more with less and execute more quickly. It allows marketers to manage routine tasks with little effort, so that they can focus on high-impact activities and accomplish organic business goals at scale.
Note: To make the most of the insights and recommendations from deep learning, marketers need to take action and make the relevant changes to web page content to keep website visitors engaged and, ultimately, converting.
Additionally, because the search landscape changes so frequently, deep learning fuels the development of smart content and can be used to automatically adjust to changes in content formats and standards.
Deep learning in action
An example of deep learning in organic search is DataMind from BrightEdge (disclosure: my employer). DataMind is like a virtual team of data scientists built into the platform, combining massive volumes of data with immediate, actionable insights to inform marketing decisions.
In this case, the deep learning engine analyzes huge, complex, and dynamic data sets (from multiple sources that include first- and third-party data) to determine patterns and derive the insights marketers need. Deep learning is used to detect anomalies in a site’s performance and interpret the reasons, such as industry trends, while making recommendations about how to proceed.
Think of deep learning applications as your own personal data scientist – here to help and assist, not to replace. The adoption of AI, machine learning and now deep learning technologies enables faster decisions and more accurate, smarter insights.
Brands compete in the content battleground to ensure their content is optimized and found, engages audiences and ultimately drives conversions and digital revenue. When armed with these insights from deep learning, marketers get a new competitive weapon and a massive competitive edge.
MioTech, a financial tech startup with offices in Hong Kong and Shanghai, has raised $7 million in Series A funding to develop artificial intelligence based software for investment managers. The round was led by Horizons Ventures, the private investment arm of Hong Kong business tycoon Li Ka-shing, with participation from returning investor Zhenfund.
Social media can be hard. I’m not speaking hypothetically here. There have definitely been days where I’ve thought, Shoot, it’s been way too long since I’ve posted on Instagram or Twitter — and then completely failed to post anyway because I had nothing to say. Maybe I would’ve done better if I’d used Post Intelligence.
Michel Morvan is the CEO of The CoSMo Company, a big data service provider and insight generator. He knows how to use AI to help C-level execs make decisions and he thinks the current crop of AI is just the beginning. Morvan doesn’t believe AI will become “self-aware.” Instead he speaks of “augmented intelligence,” robots that help us think better and make…