- Following the passage of landmark consumer privacy laws, Google announced its intention to phase out third-party cookies by 2022
- Businesses that rely on these cookies for granular consumer data are now forced to rethink their strategies for accurate audience targeting
- Some businesses are turning to publisher walled gardens, while others are leaning more into contextual advertising
- Coegi’s Sean Cotton explores the challenges and opportunities marketers face in the absence of third-party cookies, as well as viable alternatives they can use to keep audience targeting on point
Following the passage of landmark consumer privacy laws, Google officially announced its intention to phase out third-party cookies on Chrome browsers by next year. This is certainly a victory for the conscious consumer wary of having their data sold to advertisers, but it’s also one that might leave businesses scrambling when the cookie jar disappears. But these businesses should be more excited than alarmed. While the death of third-party cookies is an obstacle, it’s also an opportunity: As alternatives emerge, advertisers may find themselves with better-equipped audience targeting and acquisition methods.
Third-party cookies haven’t always been perfect right out of the oven, and their quality was largely dependent on factors such as the data provider’s methodologies, the latency and recency of that data, and any related acquisition costs. Although occasionally stale, these prebuilt audiences allowed advertisers to quickly scale their audiences. The forthcoming phaseout will put pressure on marketers to rethink their strategies for accurately targeting audiences.
What are the alternatives to third-party cookies?
Publisher walled gardens (in which publishers trade free content for first-party data) are a solid starting point for advertisers seeking alternatives to third-party cookies. These audiences won’t come cheap, but it will be possible to find publishers with audiences that strongly align with your own customer base. And because these sources of data are generally authenticated, they’re also an accurate source of modeling data to use as you construct your own user databases.
Many purchases these days begin with online research, so savvy marketers are also exploring contextual advertising as a third-party cookie alternative. Mapping out the sales funnel for your product or service will help you identify opportunities for targeted advertising as your audience performs research, but it’s important to be precise at the same time. Be sure to use negative search terms and semantic recognition to prevent your brand or product from appearing in potentially embarrassing or unsafe placements. (Just consider the word “shot,” which in this day and age could relate to anything from COVID-19 or health and wellness to debates surrounding the Second Amendment.)
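The negative-term screening described above can be sketched as a simple token filter. This is only an illustration — the blocklist and placement text are hypothetical, and real brand-safety tools layer semantic recognition on top of exact matching:

```python
# Minimal sketch of negative-term screening for contextual placements.
# The blocklist and placement headlines below are hypothetical examples.
NEGATIVE_TERMS = {"shot", "shooting", "crash"}

def is_brand_safe(placement_text: str) -> bool:
    """Return False if the placement contains any blocked term."""
    words = {w.strip(".,!?").lower() for w in placement_text.split()}
    return NEGATIVE_TERMS.isdisjoint(words)

# Keep only placements that pass the screen.
safe = [p for p in ["A perfect espresso shot", "Local bakery wins award"]
        if is_brand_safe(p)]
```

Exact matching like this is deliberately blunt; semantic recognition is what lets a real system distinguish an espresso “shot” from the unsafe senses of the word.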
There’s still time for a smooth transition away from your dependency on cookies, but you shouldn’t wait much longer to get started. As you explore new ways to get your message out to precise audiences, these strategies are a great place to start:
1. Lean on second-party data
Second-party data (such as the kind provided on publisher walled gardens) can offer accurate audience targeting for advertisers in a hurry to replace third-party cookies. This type of data can inform people- or account-based marketing strategies, helping you identify individuals in a specific industry or those with a certain relevant job title. Similarly, integrating second-party data with your broader digital marketing strategy can create use cases for lookalike modeling or provide a strong foundation for sequential messaging.
Because second-party data will come at a potentially high cost, however, try to partner with publishers and providers for the long term to keep rates as low as possible. As an added benefit, this will give you time to experiment and use various types of data in different ways.
2. Implement mobile ad ID (or MAID) targeting
MAID targeting is based on an anonymous identifier associated with a user’s mobile device operating system. MAIDs have always been the go-to for application targeting because they’re privacy-compliant and serve as a great way to segment audiences based on behaviors and interests. MAID usage was widely expected to grow as mobile and in-app activity accelerated: In the U.S., for instance, mobile users spend just over an hour more on those devices each day than on their computers, and 87 percent of their smartphone time is spent in-app. But the death of third-party cookies will accelerate the usage of these audiences across channels even more.
One of the most powerful insights offered by MAIDs is the ability to track a user’s location data. If a device is frequenting an NFL stadium, for example, you can infer that the user is a football fan, which allows a host of other inferences to form. You can also enrich MAIDs with offline deterministic data, allowing you to construct a more complete picture of the user, their demographic information, and their relevant interests.
Note that recent changes to Apple’s iOS 14 platform might limit this type of targeting on the company’s devices. It’s also important to verify the precision and accuracy of your location data provider.
3. Build custom models and indexes
Algorithmic targeting and lookalike modeling got a bad rap from advertisers who worried that modeled audiences would broaden targeting too far. But as the quality of your audience input increases, the quality of your modeling output increases as well. In other words, those concerns are justified only if you’re modeling audiences on already-modeled data.
On the other hand, models can be an excellent source of additional insight if you’re using deterministic data. This information comes from all kinds of sources, including social media platforms, questionnaires and surveys, and e-commerce sites that have information on user purchase history. In short, it’s data you can trust — meaning it can inform the creation of accurate audience segments and models that capture real customer intent. With deterministic data at the helm, you can create your own models and indexes to aid in your targeting efforts.
First-party data from customers and active social media followers generally provides the best source for models. Be aware of outliers when it comes to audience insights, though; signals should be strong enough to imply the target audience’s actual behavior.
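As a rough illustration of how a lookalike model works, the sketch below scores hypothetical prospects by cosine similarity to the centroid of a seed audience. The feature vectors are invented for the example; production systems use far richer deterministic signals:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical features: [purchases/year, avg order value, site visits/month]
seed_customers = [[12, 80, 9], [10, 95, 7], [14, 70, 11]]

# Centroid of the seed audience (column-wise mean).
centroid = [sum(col) / len(seed_customers) for col in zip(*seed_customers)]

prospects = {"a": [11, 85, 8], "b": [1, 15, 1]}
scores = {pid: cosine(v, centroid) for pid, v in prospects.items()}
best = max(scores, key=scores.get)  # prospect most like the seed audience
```

The quality point from the text shows up here directly: if `seed_customers` were itself modeled data, the centroid would drift and the similarity scores would inherit that error.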
4. Use Unified ID solutions
The death of third-party cookies doesn’t mean the death of all your strategies, and you can expect to see a variety of sophisticated solutions emerge in the coming years that offer audience segmentation with increased control for advertisers and enhanced privacy protections for consumers. In fact, some companies are already working collaboratively to create Unified ID solutions that modernize audience targeting and measurement.
The solutions they’re creating aim to collect user information (such as email addresses) in exchange for free content. Those addresses will then be assigned encrypted IDs that are transmitted along the bid stream to advertisers. If publishers widely adopt unified identity products, they’ll provide an excellent alternative to an overreliance on walled gardens.
However, one of the biggest hurdles for a unified ID solution will be scalability: It will likely not be a solution that can stand on its own for some time.
The death of third-party cookies will absolutely shake up the advertising world, but that’s probably a good thing. Cookies were never designed to be the backbone of digital advertising, and their disappearance makes room for alternatives to third-party cookies that actually deliver a better experience for advertisers and the audiences they’re looking to target. As advertisers gain more granular control over who hears their messaging (and when) and customer data is ensconced behind modern encryption and privacy protection tools, it’s not hard to argue that everyone wins when we put away the cookie jar.
The post Everything you need to know about audience targeting without relying on third-party cookies appeared first on Search Engine Watch.
One specialist shares her top 5 tips for taking a PPC holiday without disaster striking.
Read more at PPCHero.com
Here are simple actions that boost your Facebook campaign traffic without throwing tons of extra budget at the campaign or sacrificing efficiency.
- With Google frequently changing its search engine algorithm in recent times in a bid to reduce the organic reach of most businesses so they can invest more in Ads, what are the options left for your small business in this period?
- According to GoGulf, 46% of all Google searches come from people seeking local information, and 86% of people look up the location of a business on Google Maps.
- If Local SEO is that effective, what is it, and why should your business rely on it?
- In this article, I will examine basic things that still work like adding your business to online directories, building backlinks, developing local content targeted at your local audience, incorporating titles and meta description tags, and the use of targeted keywords.
SEO changes all the time. That’s why you need to update your SEO strategies regularly to remain visible.
According to this article, 72% of consumers who did a local search visited a store within five miles. This shows how powerful local SEO can be. If you can make your business visible so that it appears in the search results when a potential customer searches for a product or service, odds are good they’ll drop by.
But what is local SEO? Local SEO involves the optimization of your online presence in order to improve your chances of being discovered by people who make local searches. Think of it as your traditional SEO, but with the inclusion of geography in it. In other words, it’s you trying to attract more business from local searches.
In this article, you’ll learn five local SEO tactics that will help you skyrocket your visibility without breaking the bank.
1. Be strategic about your title and meta description tags
When you search for something on Google, you’ll see millions of results competing for your attention. The only way to tell whether a result has what you’re looking for is its title and the description that appears immediately below it.
Many business owners take the titles of their blog posts and meta descriptions for granted. You need to start seeing the title and description as a way to “sell” your page to a potential visitor.
A useful tool that will help you optimize your title and description is Yoast SEO. It’ll be able to test how good your title and description are.
Deliberately include the location of your business in your blog post titles. For example, if you sell wine and your potential client is looking for the best New York wine, you’d be doing yourself a great service by including the words “New York” in the title and the description.
Source: Google Search
2. Optimize your Google My Business account
You know how you’ll search for a pizza place on your phone and Google will show you a list of pizza places near you? That’s made possible by using Google My Business. Google My Business (GMB) is a tool used to manage your online presence across Google, including Google Search and Google Maps.
You’ll find it shocking that 56% of local businesses haven’t claimed their Google My Business listings. So don’t sleep on this tip.
If you haven’t claimed your GMB listing then make sure you do so. But don’t just claim your GMB listing and forget all about it. Optimize your GMB by filling in your Google My Business Profile, choosing the relevant category, and including images. This will not only help your potential customer find you, but it will also give them some information about your business and thus influence their decision to stop by your business.
3. Create local content
According to GoGulf, 46% of all Google searches come from people seeking local information. So how do you harness that attention and get your target audience to know about your business?
Creating local content that will be of interest to your target audience makes you the local authority for your industry. By local content, I mean the creation of content that is targeted to your local audience. This will require you to be strategic with keywords.
So as a florist, instead of creating content on the best flowers to give your wife, write about the best flowers your customer can give their wife in Florida. That way you’re specifically addressing a Florida audience, and those who come across your article see you as the go-to florist in the state.
4. Get inbound links to raise the domain authority
As beneficial as it is to create local content for your own website, Moz revealed that link signals are an important local search ranking factor that will help enhance your visibility as shown in the diagram below.
Link signals include inbound anchor text, linking domain authority, and linking domain quantity. All of these help raise your domain authority, which in turn increases your local search rankings.
To improve your link signals, guest post local content on other websites and create valuable local resources that your target audience will love.
As you guest post and refer people to the blog on your business website, you’re acquiring inbound links to raise your domain authority. Those will help you with your SEO and increase your chances of being visible on Google when the people within your geographical location search for things related to your business.
5. Add your business to online directories
Why stop at adding your business to only one directory when there are so many out there? Adding your business to as many online directories (especially local ones) as you can find will increase your chances of being discovered. It will be time-consuming, but it’s worth it.
Online business directories like Bing Places for Business, Yelp, Yellow Pages, Angie’s List, and TripAdvisor will make you more visible to those who need your services locally. Better still, getting listed on these sites earns you backlinks, which helps build your domain authority and improves your ranking on Google’s search engine results pages (SERPs), increasing your organic reach.
Make sure the information about your business on other directories matches the information on your GMB listing. This consistency will help with your rankings.
Ready to maximize local SEO?
Over time, Google has tweaked its search engine algorithm to reduce the organic reach of businesses and nudge them toward investing in ads. As a small business with limited resources, investing in ads may seem like a long shot.
If Google’s reducing your ability to organically reach your target audience, then what’s the next available option for you? Local SEO can give you the needed exposure to your target audience organically and at little or no cost.
The good news is that applying the steps above will put you ahead of other local businesses scrambling for their customers’ attention on the most coveted first page of Google search.
Now that you’ve got the tips, how would you say you’ve fared with your local SEO?
The post Maximizing local SEO: Five tactics to enhance visibility without breaking the bank appeared first on Search Engine Watch.
When I ask prospects or clients if they are tracking phone calls from their website, they often tell me they are not, that they never thought of it, or “I guess we could look at our records from the phone company.” To make things worse, trying to make sense of attribution and tell the client a clear story about performance has become an analytical nightmare. In this post, I will discuss the many benefits of call tracking and why it matters so much for both advertisers and agencies.
Let’s be clear: call tracking may not be beneficial for every business. In fact, some may not want to receive phone calls at all because they rely solely on online forms or digital transactions. But here’s the problem: for those businesses that do rely on phone calls for their success, it’s imperative that they know where the calls are coming from. This is a dilemma not only for the business, but also for the agency or marketing director handling the marketing and advertising dollars.
Benefits of Call Tracking
For many years, I have managed everything from PPC and SEO to email, landing pages, and social. Even with extensive Google Analytics and platform pixels installed, tracking phone calls from the website was always the biggest obstacle because I could not verify that metric. Adding call tracking into the mix allows me as a marketer to identify which ad platforms, campaigns, and keywords generate phone calls. In addition, I can correlate the caller IDs in the reporting to distinguish a valuable lead from a junk lead.
Learn more about CallRail
While there are many call tracking companies available online, I have found that CallRail provides the best features, the easiest integration, and frankly the best customer service around. Here are just some of CallRail’s features:
Visitor & Keyword-Level Tracking
CallRail’s call tracking can reveal which keywords, campaigns, and landing pages are effectively driving phone conversions. See your visitor’s journey through your website before, during, and after the call.
Dynamic Number Insertion
Campaign-Level Call Tracking
Create trackable phone numbers to use in all of your online and offline marketing campaigns, including paid search, digital advertising, direct mail, television, radio, and print ads. Find out which ads are effective.
Multi-Channel Call Attribution
See the full story on your PPC, organic, social, remarketing, and other campaigns. Understand how they influence your customer’s journey. Multi-channel call attribution goes beyond first- and last-click metrics.
Capture leads from forms instantly, and let CallRail alert you by phone, text message, or email. View detailed information about where your form completions are coming from and call customers back immediately.
The founders of Glide, a member of the Y Combinator Winter 2019 class, had a notion that building mobile apps in the enterprise was too hard. They decided to simplify the process by starting with a spreadsheet, and automatically turning the contents into a slick mobile app.
David Siegel, CEO and co-founder at Glide, was working with his co-founders Jason Smith, Mark Probst and Antonio Garcia Aprea at Xamarin, a cross-platform mobile development company that Microsoft acquired for $500 million in 2016. There, they witnessed first-hand the difficulty that companies were having building mobile apps. When their two-year stint at Microsoft was over, the four founders decided to build a startup to solve the problem.
“We saw how desperate some of the world’s largest companies were to have a mobile strategy, and also how painful and expensive it is to develop mobile apps. And we haven’t seen significant progress on that 10 years after the smartphone debuted,” Siegel told TechCrunch.
The founders began with research, looking at almost 100 no-code tools and were not really satisfied with any of them. They chose the venerable spreadsheet, a business tool many people use to track information, as the source for their mobile app builder, starting with Google Sheets.
“There’s a saying that spreadsheets are the most successful programming model of all time, and smartphones are the most successful computers of all time. So when we started exploring Glide we asked ourselves, can these two forces be combined to create something very valuable to let individuals and businesses build the type of apps that we saw Xamarin customers needed to build, but much more quickly,” Siegel said.
The company developed Glide, a service that lets you add information to a Google Sheet spreadsheet, and then very quickly create an app from the contents without coding. “You can easily assemble a polished, data-driven app that you can customize and share as a progressive web app, meaning you can get a link that you can share with anybody, and they can load it in a browser without downloading an app, or you can publish Glide apps as native apps to app stores,” Siegel explained. What’s more, there is a two-way connection between app and spreadsheet, so that when you add information in either place, the other element is updated.
The founders decided to apply at Y Combinator after consulting with former Xamarin CEO, and current GitHub chief executive, Nat Friedman. He and other advisors told them YC would be a great place for first-time founders to get guidance on building a company, taking advantage of the vast YC network.
One of the primary lessons he says that they have learned is the importance of getting out in the field and talking to customers, and not falling into the trap of falling in love with the act of building the tool. The company has actually helped fellow YC companies build mobile apps using the Glide tool.
Glide is live today and people can create apps using their own spreadsheet data, or using the templates available on the site as a starting point. There is a free tier available to try it without obligation.
There’s a reason why our team thinks we are a great place to work and no, its not because we have a ping pong table set up. See more about Hanapin’s latest certification + we’ll let you in on a little secret!
Microsoft is capping off a rather impressive year without any major missteps in its final report for its performance in its 2018 fiscal year, posting a quarter that seems to have been largely non-offensive to Wall Street.
In the past year, Microsoft’s stock has gone up more than 40 percent. In the past two years, it’s nearly doubled. All of this came after roughly a decade of the price not really doing anything as Microsoft initially missed major trends like the shift to mobile and the cloud. But since then, CEO Satya Nadella has turned that around and increased the company’s focus on both, and Azure is now one of the company’s biggest highlights. Microsoft is now an $800 billion company, which, while still well behind Apple, Amazon and Google, is a remarkable high considering the past decade.
In addition, Microsoft passed $100 billion in revenue for a fiscal year. So, as you might expect, the stock didn’t really do much, given that nothing seemed to be wrong with what was going on. For a company at around $800 billion, not doing anything bad at this point is likely a good thing. That Microsoft is even in the discussion of companies chasing a $1 trillion market cap is something we wouldn’t have been talking about just three or four years ago.
The company said it generated $30.1 billion in revenue, up 17 percent year-over-year, and adjusted earnings of $1.13 per share. Analysts were looking for earnings of $1.08 per share on revenue of $29.23 billion.
So, under Nadella, this is more or less a tale of two Microsofts — one squarely pointed at a future of productivity software with an affinity toward cloud and mobile tools (though Windows is obviously still a part of this), and one that was centered around the home PC. Here are a couple of highlights from the report:
- LinkedIn: Microsoft said revenue for LinkedIn increased 37 percent, with LinkedIn sessions growth of 41 percent. Microsoft’s professional network was also listed in a bucket of other segments that it attributed to increased operating expenditures, which also included cloud engineering, and commercial sales capacity. It was also bucketed into a 12 percent increase in research and development with cloud engineering, as well as a bump in sales and marketing expenses. This all seems pretty normal for a network Microsoft hopes to continue to grow.
- Azure: Microsoft’s cloud platform continued to drive its server products and cloud services revenue, which increased 26 percent. The company said Azure’s revenue was up 89 percent “due to growth from consumed and SaaS revenue.” Once again, Microsoft didn’t break out specifics on its Azure products, though it seems pretty clear that this is one of their primary growth drivers.
- Office 365: Office 365 saw commercial revenue growth of 38 percent, and consumer subscribers increased to 31.4 million. Alongside LinkedIn, Microsoft seems to be assembling a substantial number of subscription SaaS products that offset a shift in its model away from personal computing and into a more cloud-oriented company.
- GitHub: Nada here in the report. Microsoft earlier this year said it acquired it for a very large sum of money (in stock), but it isn’t talking about it. But bucket it alongside Office 365 and LinkedIn as part of that increasingly large stable of productivity tools for businesses, as GitHub is one of the most widely adopted developer tools available.
Facebook is scrambling to add safeguards against abuse of user data as it reels from backlash over the Cambridge Analytica scandal. Now TechCrunch has learned Facebook will launch a certification tool that demands that marketers guarantee email addresses used for ad targeting were rightfully attained. This new Custom Audiences certification tool was described by Facebook representatives to their marketing clients, according to two sources. Facebook will also prevent the sharing of Custom Audience data across Business accounts.
This snippet of a message sent by a Facebook rep to a client notes that “for any Custom Audiences data imported into Facebook, Advertisers will be required to represent and warrant that proper user consent has been obtained.”
Once shown the message, Facebook spokesperson Elisabeth Diana told TechCrunch “I can confirm there is a permissions tool that we’re building.” It will require that advertisers and the agencies representing them pledge that “I certify that I have permission to use this data”, she said.
Diana noted that “We’ve always had terms in place to ensure that advertisers have consent for data they use, but we’re going to make that much more prominent and educate advertisers on the way they can use the data.” The change isn’t in response to a specific incident, but Facebook does plan to re-review the way it works with third-party data measurement firms to ensure everything is responsibly used. “This is a way to safeguard data,” Diana concluded. The company declined to specify whether it’s ever blocked usage of a Custom Audience because it suspected the owner didn’t have user consent.
The social network is hoping to prevent further misuse of ill-gotten data after Dr. Aleksandr Kogan’s app that pulled data on 50 million Facebook users was passed to Cambridge Analytica in violation of Facebook policy. That sordid data is suspected to have been used by Cambridge Analytica to support the Trump and Brexit campaigns, which employed Custom Audiences to reach voters.
Facebook launched Custom Audiences back in 2012 to let businesses upload hashed lists of their customers’ email addresses or phone numbers, allowing advertisers to target specific people instead of broad demographics. Custom Audiences quickly became one of Facebook’s most powerful advertising options because businesses could easily reach existing customers to drive repeat sales. The Custom Audiences terms of service require that businesses have “provided appropriate notice to and secured any necessary consent from the data subjects” to attain and use these people’s contact info.
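The hashing step works roughly like this: identifiers are normalized and hashed with SHA-256 before upload, so raw addresses never need to leave the advertiser’s systems in plain text. A minimal sketch (the addresses are made up):

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize an email (trim whitespace, lowercase) and hash it with
    SHA-256, the scheme Custom Audiences uses for uploaded identifiers."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Differently formatted copies of the same address hash identically,
# which is what lets Facebook match them to a single account.
a = hash_email(" Jane.Doe@Example.com ")
b = hash_email("jane.doe@example.com")
```

Note that hashing here enables matching rather than anonymity: anyone holding the same email list can compute the same hashes, which is why consent for the underlying list still matters.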
But just like Facebook’s policy told app developers like Kogan not to sell, share, or misuse data they collected from Facebook users, the company didn’t go further to enforce this rule. It essentially trusted that the fear of legal repercussions or suspension on Facebook would deter violations of both its app data privacy and Custom Audiences consent policies. With clear financial incentives to bend or break those rules and limited effort spent investigating to ensure compliance, Facebook left itself and its users open to exploitation.
Last week Facebook banned the use of third-party data brokers like Experian and Acxiom for ad targeting, closing a marketing feature called Partner Categories. Facebook is believed to have been trying to prevent ill-gotten data from being laundered through these data brokers and then imported directly to Facebook to target users. But that left open the option for businesses to compile illicit data sets or pull them from data brokers, then upload them to Facebook as Custom Audiences themselves.
The Custom Audiences certification tool could close that loophole. It’s still being built, so Facebook wouldn’t say exactly how it will work. I asked if Facebook would scan uploaded user lists and try to match them against a database of suspicious data, but for now it sounds more like Facebook will merely require a written promise.
Meanwhile, barring the sharing of Custom Audiences between Business Accounts might prevent those with access to email lists from using them to promote companies unrelated to the one to which users gave their email address. Facebook declined to comment on how the new ban on Custom Audience sharing would work.
Now Facebook must find ways to thwart misuse of its targeting tools and audit anyone it suspects may have already violated its policies. Otherwise it may receive the ire of privacy-conscious users and critics, and strengthen the case for substantial regulation of its ads (though regulation could end up protecting Facebook from competitors who can’t afford compliance). Still the question remains why it took such a massive data privacy scandal for Facebook to take a tougher stance on requiring user consent for ad targeting. And given that written promises didn’t stop Kogan or Cambridge Analytica from misusing data, why would they stop advertisers bent on boosting profits?
For more on Facebook’s recent scandals, check out TechCrunch’s coverage:
Beginning in 2011, search marketers began to lose visibility over the organic keywords that consumers were using to find their websites, as Google gradually switched all of its searches over to secure search using HTTPS.
As it did so, the organic keyword data available to marketers in Google Analytics and other analytics platforms was slowly replaced by “(not provided)”. By 2014, the (not provided) issue was estimated to affect 80-90% of organic traffic, representing a massive loss in visibility for search marketers and website owners.
Marketers have gradually adjusted to the situation, and most have developed rough workarounds or ways of guessing what searches are bringing customers to their site. Even so, there’s no denying that having complete visibility over organic keyword data once more would have a massive impact on the search industry – as well as benefits for SEO.
One company believes that it has found the key to unlocking “(not provided)” keyword data. We spoke to Daniel Schmeh, MD and CTO at Keyword Hero, a start-up which has set out to solve the issue of “(not provided)”, and ‘Wizard of Moz’ Rand Fishkin, about how “(not provided)” is still impacting the search industry in 2017, and what a world without it might look like.
Content produced in association with Keyword Hero.
“(not provided)” in Google Analytics: How does it impact SEO?
“The “(not provided)” keyword data issue is caused by Google the search engine, so that no analytics program, Google Analytics included, can get the data directly,” explains Rand Fishkin, founder and former CEO of Moz.
“Google used to pass a referrer string when you performed a web search with them that would tell you – ‘This person searched for “red shoes” and then they clicked on your website’. Then you would know that when people searched for “red shoes”, here’s the behavior they showed on your website, and you could buy ads against that, or choose how to serve them better, maybe by highlighting the red shoes on the page better when they land there – all sorts of things.”
“You could also do analytics to understand whether visitors for that search were converting on your website, or whether they were having a good experience – those kinds of things.
“But Google began to take that away around 2011, and their reasoning behind it was to protect user privacy. That was quickly debunked, however, by folks in the industry, because Google provides that data with great accuracy if you choose to buy ads with them. So there’s obviously a huge conflict of interest there.
“I think the assumption at this point is that it’s just Google throwing their weight around and being the behemoth that they can be, and saying, ‘We don’t want to provide this data because it’s too valuable and useful to potential competitors, and people who have the potential to own a lot of the search ranking real estate and have too good of an idea of what patterns are going on.’
“I think Google is worried about the quality and quantity of data that could be received through organic search – they’d prefer that marketers spend money on advertising with Google if they want that information.”
Where Google goes, its closest competitors are sure to follow, and Bing and Yandex soon followed suit. By 2013, the search industry was experiencing a near-total eclipse of visibility over organic keyword data, and found itself having to simply deal with the consequences.
“At this point, most SEOs use the data of which page received the visit from Google, and then try to reverse-engineer it: what keywords does that page rank for? Based on those two points, you can sort of triangulate the value you’re getting from visitors from those keywords to this page,” says Fishkin.
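The triangulation Fishkin describes can be sketched as splitting a page's organic visits across the keywords it ranks for, weighted by an assumed position-based click-through curve. The CTR values and example numbers below are invented for illustration, not taken from the article:

```python
# Assumed click-through rates by ranking position (illustrative only).
ASSUMED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimate_keyword_visits(page_visits: int, rankings: dict) -> dict:
    """Apportion a page's organic visits across its ranking keywords
    in proportion to an assumed position-based CTR."""
    weights = {kw: ASSUMED_CTR.get(pos, 0.02) for kw, pos in rankings.items()}
    total = sum(weights.values())
    return {kw: round(page_visits * w / total, 1) for kw, w in weights.items()}

# Hypothetical page with 1,000 organic visits, ranking for three queries:
print(estimate_keyword_visits(1000, {"red shoes": 1,
                                     "buy red shoes": 3,
                                     "scarlet sandals": 8}))
```

This is exactly the kind of rough guesswork the article refers to: the split is only as good as the assumed CTR curve and the completeness of the ranking data.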
However, data analysis and processing have come a long way since 2011, or even 2013. One start-up believes that it has found the key to unlocking “(not provided)” keyword data and giving marketers back visibility over their organic keywords.
How to unlock “(not provided)” keywords in Google Analytics
“I started out as an SEO, first in a publishing company and later in ecommerce companies,” says Daniel Schmeh, MD and CTO of SEO and search marketing tool Keyword Hero, which aims to provide a solution to “(not provided)” in Google Analytics. “I then got into PPC marketing, building self-learning bid management tools, before finally moving into data science.
“So I have a pretty broad understanding of the industry and ecosystem, and was always aware of the “(not provided)” problem.
“When we then started buying billions of data points from browser extensions for another project that I was working on, I thought that this must be solvable – more as an interesting problem to work on than a product that we wanted to sell.”
Essentially, Schmeh explains, solving the problem of “(not provided)” is a matter of getting access to the data and engineering around it. Keyword Hero uses a wide range of data sources to deduce the organic keywords hidden behind the screen of “(not provided)”.
“In the first step, the Hero fetches all our users’ URLs,” says Schmeh. “We then use rank monitoring services – mainly other SEO tools and crawlers – as well as what we call “cognitive services” – among them Google Trends, Bing Cognitive Services, Wikipedia’s API – and Google’s search console, to compute a long list of possible keywords per URL, and a first estimate of their likelihood.
“All these results are then tested against real, hard data that we buy from browser extensions.
“This info will be looped back to the initial deep learning algorithm, using a variety of mathematical concepts.”
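A heavily simplified sketch of the loop Schmeh outlines: start from prior likelihoods per candidate keyword (from rank monitors, Search Console, trend APIs), then re-weight those priors against observed click counts from browser-extension panels. The function, numbers, and keywords are invented for illustration; Keyword Hero's actual model is proprietary:

```python
def reweight(priors: dict, observed_clicks: dict) -> dict:
    """Boost each candidate keyword's prior likelihood by the hard
    click evidence observed for it, then renormalize to probabilities."""
    scores = {kw: p * (1 + observed_clicks.get(kw, 0))
              for kw, p in priors.items()}
    total = sum(scores.values())
    return {kw: s / total for kw, s in scores.items()}

# Hypothetical candidates for one URL, before and after seeing panel data:
priors = {"red shoes": 0.5, "red shoes sale": 0.3, "crimson footwear": 0.2}
posterior = reweight(priors, {"red shoes": 9})
print(max(posterior, key=posterior.get))  # → red shoes
```

Each pass of real data sharpens the estimates, which is why the article describes the results being "looped back" into the algorithm rather than computed once.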
Ultimately, the process used by Keyword Hero to obtain organic keyword data is still guesswork, but very advanced guesswork.
“All in all, the results are pretty good: in 50–60% of all sessions, we attribute keywords with 100% certainty,” says Schmeh.
“For the remainder, at least 83% certainty is needed, otherwise they’ll stay (not provided). For most of our customers, 94% of all sessions are matched, though in some cases we need a few weeks to get to this matching rate.”
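The thresholds Schmeh quotes amount to a simple assignment rule: keep a session's best candidate keyword only if its certainty reaches 83%; otherwise the session stays “(not provided)”. A sketch, with hypothetical session data:

```python
THRESHOLD = 0.83  # minimum certainty quoted in the article

def assign_keyword(candidates: dict) -> str:
    """Return the most likely candidate keyword for a session,
    or "(not provided)" if even the best falls below the threshold."""
    best_kw, best_certainty = max(candidates.items(), key=lambda kv: kv[1])
    return best_kw if best_certainty >= THRESHOLD else "(not provided)"

print(assign_keyword({"red shoes": 1.00}))                     # → red shoes
print(assign_keyword({"red shoes": 0.60, "red heels": 0.40}))  # → (not provided)
```

Keeping a hard threshold means the tool trades coverage for trustworthiness: the sessions it does label can be relied on, and the ambiguous remainder is left untouched.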
If the issue of “(not provided)” organic keywords has been around since 2011, why has it taken us this long to find a solution that works? Schmeh believes that Keyword Hero has two key advantages: one, they take a scientific approach to search, and two, they have far greater data-processing power than was available six years ago.
“We have a very scientific approach to SEO,” he says.
“We have a small team of world-class experts, mostly from Fraunhofer Institute of Technology, that know how to make sense of large amounts of data. Our background in SEO and the fact that we have access to vast amounts of data points from browser extensions allowed us to think about this as more of a data science problem, which it ultimately is.
“Processing the information – the algorithm and its functionalities – would have worked back in 2011, too, but the limiting factor is our capability to work with these extremely large amounts of data. Just uploading the information back into our customers’ accounts would take 13 hours on the largest AWS [Amazon Web Services] instance, the X1 – something we could never afford.
“So we had to find other cloud solutions – ending up with things that didn’t exist even a year ago.”
A world without “(not provided)”: How could unlocking organic keyword data transform SEO?
If marketers and website owners could regain visibility over their organic keywords, this would obviously be a huge help to their efforts in optimizing for search and planning a commercial strategy.
But Rand Fishkin also believes it would have two much more wide-reaching benefits: it would help to prove the worth of organic SEO, and would ultimately lead to a better user experience and a better web.
“Because SEO has such a difficult time proving attribution, it doesn’t get counted and therefore businesses don’t invest in it the way they would if they could show that direct connection to revenue,” says Fishkin. “So it would help prove the value, which means that SEO could get budget.
“I think the thing Google is most afraid of is that some people would see that they rank organically well enough for some keywords they’re bidding on in AdWords, and ultimately decide not to bid anymore.
“This would cause Google to lose revenue – but of course, many of these websites would save a lot of money.”
And in this utopian world of keyword visibility, marketers could channel that revenue into better targeting the consumers whose behavior they would now have much higher-quality insights into.
“I think you would see more personalization and customization on websites – so for example, earlier I mentioned a search for ‘red shoes’ – if I’m an ecommerce website, and I see that someone has searched for ‘red shoes’, I might actually highlight that text on the page, or I might dynamically change the navigation so that I had shades of red inside my product range that I helped people discover.
“If businesses could personalize their content based on the search, it could create an improved user experience and user performance: longer time on site, lower bounce rate, higher engagement, higher conversion rate. It would absolutely be better for users.
“The other thing I think you’d see people doing is optimizing their content efforts around keywords that bring valuable visitors. As more and more websites optimized for their unique search audience, you would generally get a better web – some people are going to do a great job for ‘red shoes’, others for ‘scarlet sandals’, and others for ‘burgundy sneakers’. And as a result, we would have everyone building toward what their unique value proposition is.”
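The on-site personalization Fishkin imagines could be as simple as resurfacing products that share terms with the landing keyword. A toy sketch, assuming the keyword were available; the product list and matching rule are invented for illustration:

```python
def personalize(keyword: str, products: list) -> list:
    """Reorder a product list so items sharing a term with the
    visitor's search keyword appear first."""
    terms = set(keyword.lower().split())
    matches = [p for p in products if terms & set(p.split())]
    rest = [p for p in products if p not in matches]
    return matches + rest

PRODUCTS = ["green boots", "red shoes", "blue shoes", "red sandals"]
print(personalize("red shoes", PRODUCTS))
# → ['red shoes', 'blue shoes', 'red sandals', 'green boots']
```

Real implementations would of course use proper search relevance rather than word overlap, but the point stands: with the keyword in hand, the page can adapt to the visitor's stated intent.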
Daniel Schmeh adds that unlocking “(not provided)” keyword data has the ability to make SEO less about guesswork and more substantiated in numbers and hard facts.
“Just seeing simple things, like how users convert that use your brand name in their search phrase versus those who don’t, has huge impact on our customers,” he says. “We’ve had multiple people telling us that they have based important business decisions on the data.
“Seeing thousands of keywords again is very powerful for the more sophisticated, data-driven user, who is able to derive meaningful insights; but we’d really like the Keyword Hero to become a standard tool. So we’re working hard to make this keyword data accessible and actionable for all of our users, and will soon be offering features like keyword clustering – all through their Google Analytics interface.”
To find out more about how to unlock your “(not provided)” keywords in Google Analytics, visit the Keyword Hero website.