American Online Phonebook


The search dilemma: looking beyond Google’s third-party cookie death

April 17, 2021

30-second summary:

  • In 2020, the majority of Google’s 181.7 billion U.S. dollars in revenue came from advertising through Google Sites or its network sites
  • Even though they will be removing the third-party cookie from 2022, the search giant still has a wealth of first-party data from its 270+ products, services, and platforms
  • The Trade Desk’s 20 percent stock price drop is proof of Google’s monopoly, and of why the company shouldn’t enjoy it any longer
  • Google expert Susan Dolan draws on her rich experience to detail the current search landscape, share insights, and predict the key themes that will arise from the death of the third-party cookie

Imagine search as a jungle gym, and you automatically picture Google as the kingpin on the playground. This has been the reality for decades, and we all know the downside of such dominance, which is why the industry now acknowledges the need for regulation. Google announced that it would remove the third-party cookie from 2022. But a lot can happen in a year; 2020 is proof of that! Does this mean that cookies will completely bite the dust? Think again. Below, I draw on my years of experience with the web to share some thoughts, observations, and insights on what this really means.

For once, Google is a laggard

Given the monopoly that Google has enjoyed, and the list of lawsuits it faces (including an antitrust case), this move is a regulatory step toward a “net-vironment” that feels less like a net and is driven toward transparency and equality in the search landscape.

But Firefox and Safari had already beaten Google to the punch, in 2019 and 2020 respectively. Firefox launched its Enhanced Tracking Protection feature in September 2019 to protect users from third-party tracking cookies and cryptominers, and Safari launched its Intelligent Tracking Prevention (ITP) update on March 23, 2020.

Google’s solution to respect user privacy

Google recently announced that it won’t be using identifiers. Instead, it is developing a ‘Privacy Sandbox’ to ensure that publishers, advertisers, and consumers find a fair middle ground in terms of data control, access, and tracking. The idea is to protect anonymity while still delivering results for advertisers and publishers. The Privacy Sandbox will include the FLoC API, which can help with interest-based advertising. Google says it will not use fingerprinting or the PII graphs, based on people’s email addresses, that others in the industry rely on. Instead, it will move toward a Facebook-like “lookalike audience” model that groups users for profiling.

Did that raise eyebrows? There’s more.

Don’t be fooled – They still have a lavish spread of first-party data

Google is already rich with clusters of historical, individual unique data that they’ve stored, analyzed, predicted, and mastered over the years and across their platforms and services. These statistics give you a clear sense of the gravity of the situation:

  • Google has 270+ products and services (Source)
  • Among the leading search engines, the worldwide market share of Google in January 2021 was almost 86 percent (Source)
  • In 2020, the majority of Google’s 181.7 billion U.S. dollars in revenue came from advertising through Google Sites or Google Network Sites (Source)
  • There are 246 million unique Google users in the US (Source)
  • Google Photos has over one billion active users (Source)
  • YouTube has over 1.9 billion active users each month (Source)
  • According to Google statistics, Gmail has more than 1.5 billion active users (Source)
  • A less-known fact, there are more than two million accounts on Google Ads (Source)
  • There are more than 2.9 million companies that use one or more of Google’s marketing services (Source)
  • As of Jan 2021, Google’s branch out into the Android system has won it a whopping 72 percent of the global smartphone operating system market (Source)
  • Google sees 3.5 billion searches per day and 1.2 trillion searches per year worldwide (Source)

Google has an almost never-ending spectrum of products, services, and platforms

Here’s the exhaustive list of Google’s gigantic umbrella.

Google's 270+ products, services, and platforms

Source: Matrics360

Google already has access to your:

  • Location
  • Search history
  • Credit/debit card details shared on Google Pay
  • Data from businesses (more than 2.9 million!) that use Google services
  • Your device microphone
  • Mobile keyboard (G-board)
  • Apps you download from the Google Playstore and grant access to
  • Device camera, and that’s not even the tip of the iceberg

Google’s decision to eliminate the third-party cookie dropped The Trade Desk’s stock by 20 percent

Nobody should have a monopoly, and this incident serves as noteworthy proof. Google’s decision to drop third-party cookies shocked The Trade Desk’s stock price, causing a 20 percent slump in its value. The Trade Desk is the largest demand-side platform (DSP), and Google’s decision kills the demand for The Trade Desk’s proprietary Unified ID 1.0 (UID 1.0), a unique asset that eliminated the need for the cookie-syncing process and delivered match-rate accuracy of up to 99 percent.

Google’s statement on not using PII also jeopardizes the fate of The Trade Desk’s Unified ID 2.0, which already has more than 50 million users.

Here’s what Dave Pickles, The Trade Desk’s co-founder and chief technology officer, had to say:

“Unified ID 2.0 is a broad industry collaboration that includes publishers, advertisers and all players in the ad tech ecosystem.”

“UID provides an opportunity to have conversations with consumers and provide them with the sort of transparency we as an industry have been trying to provide for a really long time.”

Adweek’s March town hall saw advertisers and publishers haunted by the mystery that surrounds Google, as Google declined to participate in the event. The industry is growing wary that Google will use this as a new way to establish market dominance that feeds its own interests.

We love cookies (only when they’re on a plate)

Cookies are annoying because they leave crumbs everywhere… on the internet! This is how people feel about being tracked on the web:

  • 72 percent of people feel that almost everything they do online is being tracked by advertisers, technology firms or other companies
  • 81 percent say that the potential risks of data collection outweigh the benefits for them

These stats were originally sourced from Pew Research Center but, ironically, I found them on one of Google’s blogs.

On a hunt to escape these cookies, or to understand the world’s largest “cookie jar”, I checked out YouTube, which seemed like a good place to start since it has over 1.9 billion monthly active users. You can visit this link to see how ads are personalized for you; the list is long!

My YouTube curiosity further landed me on this page, which shows how my cookies are shared (you can opt out of these). Even my least-used account had 129 websites on this list; imagine how many sites are accessing your data right now.

Back in 2011, when I was the first to crack the PageRank algorithm, I could already sense the power Google held and where this giant was headed; the playground just wasn’t big enough.

Key themes that will emerge

The bottom line is that the cookie’s death is opening up conversations about advertising transparency and a web that is user-first and privacy-compliant. Here’s what I foresee happening in search and the digital sphere:

  • Ethical consumer targeting
  • Adtech companies collaborating to find ways that respect their audience’s privacy
  • A more private, personalized web
  • More conversations around how much and what data collection is ethical
  • More user-led choices
  • Rise in the usage of alternative browsers
  • Incentivizing users to voluntarily share their data
  • Better use of technology for good

What do you think about the current climate on the internet? Join the conversation with me on @GoogleExpertUK.

Susan Dolan is a Search Engine Optimization consultant and the first person to crack the Google PageRank algorithm, as confirmed by Eric Schmidt’s office in 2014. Susan is also the CEO of The Peoples Hub, which was built to help people and to love the planet.

The post The search dilemma: looking beyond Google’s third-party cookie death appeared first on Search Engine Watch.



Beyond the ad: Conversion Optimization

April 6, 2021

Today, using data to drive business decisions has become common practice for most companies; many have dedicated analytics teams measuring the impact of marketing investments and deciding which channels to invest in. But most of these activities focus on optimizing parameters before the audience clicks the ad. The question is: are you taking the same data-driven approach to your website design?

If you don’t use data to optimize your site’s user experience, you risk low conversion rates and lost revenue. A well-designed user interface can increase your website’s conversion rate by up to 200 percent, and better UX design can yield conversion rates up to 400 percent higher.

Now take your revenue, check your conversion rate, and calculate what it would be if the conversion rate increased by 200 percent. That number is why the companies that thrive in the future will most likely be the ones that are data-driven in, and focus equally on, both crucial moments of the user journey: before and beyond the ad.
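As a quick sanity check, that arithmetic can be sketched in a few lines. The revenue figure below is a hypothetical assumption for illustration, not data from the article:

```python
# Revenue scales linearly with conversion rate, all else (traffic, order
# value) being equal, so a +200% conversion-rate uplift triples revenue.
def projected_revenue(current_revenue: float, uplift_pct: float) -> float:
    """Project revenue after a given percentage uplift in conversion rate."""
    return current_revenue * (1 + uplift_pct / 100)

baseline = 50_000  # assumed monthly revenue in dollars (hypothetical)
print(projected_revenue(baseline, 200))  # prints 150000.0
```

Even a rough projection like this makes the business case for investing in conversion optimization concrete.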

The solution

Building this strength comes down to working with the research methods of conversion optimization and, step by step, A/B testing your way to a website your customers will love using.

Here are three steps on how to get started:

  1. Find the weak spots on the site. Combine quantitative research in Google Analytics, qualitative research such as user testing (the Optimize Resource Hub has easy instructions), and inspiration from best practices. The Optimize Resource Hub gives you best-practice suggestions from Google and a library of test results from other companies.

  2. Prioritize the most impactful tests. Give each test idea a score of one to ten according to the uplift you think it will generate, then subtract a score of one to ten depending on the effort the test will require.

  3. Start testing. You can get started today by setting up Google Optimize—the tool that uses the full power of Google Analytics. A free version is available so you can have a test up and running within a few minutes. 
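The prioritization in step 2 can be sketched as a simple uplift-minus-effort sort. The test ideas and scores below are hypothetical examples, not recommendations from the article:

```python
# Each idea gets an estimated uplift score (1-10) minus an effort score
# (1-10), as described in step 2; ideas then run in descending priority.
test_ideas = [
    {"name": "Shorter checkout form", "uplift": 8, "effort": 3},
    {"name": "New hero image",        "uplift": 4, "effort": 2},
    {"name": "Full redesign",         "uplift": 9, "effort": 9},
]

for idea in test_ideas:
    idea["priority"] = idea["uplift"] - idea["effort"]

backlog = sorted(test_ideas, key=lambda i: i["priority"], reverse=True)
print([i["name"] for i in backlog])
# → ['Shorter checkout form', 'New hero image', 'Full redesign']
```

A high-uplift idea can still land at the bottom of the backlog if the effort to build and QA the test outweighs the expected gain.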

For more in-depth knowledge around the process of conversion optimization, check out the CRO tips in the Optimize Resource Hub.


Learn from experts

We have one more treat for you, in the form of a new series of articles that will be published here on the blog: The Optimize CRO Series—Experts share their secrets. In this series, CRO experts from all over the world will give their best advice around these topics:

  • Research methods

  • Prioritizing tests

  • Favorite frameworks for analyzing sites

  • How to do a QA (quality assurance) of an A/B test 

  • The experts’ best tests

  • Learning from failed tests

Eager to know more? Make sure you follow the Google Analytics products blog through the channel that suits you to get the upcoming guides.


Google Analytics Blog


Going beyond keywords: how conversational insights take the guesswork out of marketing

April 6, 2021

30-second summary:

  • Keywords represent the tip of the iceberg when it comes to understanding consumer intent
  • Using AI-powered chatbots, conversational data that occurs over messaging channels like Facebook Messenger and Instagram Messaging can give businesses a deeper understanding of what consumers want
  • Below, we’ll discuss how conversational marketing platforms like Spectrm use natural language processing (NLP) and artificial intelligence (AI) to guide customers through the buying funnel
  • A robust conversational marketing platform makes it possible for companies to build chatbots that engage and convert customers on the websites, apps, and social platforms where people spend their time

conversational insights and keywords - Spectrm

For more than two decades, Google and other search engines have attempted to crack the consumer intent code. The entry point for a search marketing campaign is the keyword list. Yet keywords—whether spoken or typed—represent the tip of the iceberg when it comes to understanding what a user wants. There’s no way to clearly measure (or identify) user intent, but Google is getting better at figuring out what a user wants with technologies like Google Hummingbird, an algorithm update they rolled out in 2013. Google introduced Hummingbird in response to the increasingly conversational nature of search queries. 

Per a 2013 article in Wired, “Google is now examining the searcher’s query as a whole and processing the meaning behind it.” In January 2020, Statista reported roughly 40 percent of US search queries contained four or more terms.

Asking a search engine or virtual assistant a question is the beginning of a conversational journey that carries the searcher across channels until they ultimately find what they want (or not). Keywords pull the curtain of intent back, but they only provide a glimpse of the customer journey, labeling the searcher’s thoughts without revealing the “why” of what they’re searching for. 

Once a user clicks on a search result, the conversation—from the search engine’s perspective—is over. 

But thanks to advances in natural language processing (NLP), machine learning (ML), and artificial intelligence (AI), businesses have access to a much deeper understanding of what consumers want across the entire buying journey. 

AI-powered chatbots that “speak” to consumers can collect customer intent data and take the conversation beyond an initial keyword query. They enable businesses to leverage that customer intent data instantly to scale one-to-one personalization in direct chat.

Below, we discuss how conversational marketing platforms employ NLP and AI in chatbots to guide customers through the buying funnel, using conversational analysis to gain an understanding of customer intent that goes far beyond keywords. 

Content created in partnership with Spectrm.

The customer conversation is online

According to Hootsuite’s Digital In 2020 report, 60 percent of the world’s population is online. The report found that, globally, users spend an average of 6 hours and 43 minutes online each day—40 percent of their waking life using the internet. A large chunk of that time, more than two hours, is spent using social media.

Consumers were using mobile messaging and chat an average of 20 minutes per day in 2020, with Business Insider predicting that the average would grow to 24 minutes by 2021. Interacting with chatbots is a natural extension of consumers’ comfort with messaging in social media apps like Facebook and Instagram.

Increasingly, messaging is how we connect with each other. Facebook and Instagram are at the center of this trend. Businesses have the potential to reach and engage with over two billion people on Facebook and Instagram using their respective messengers. This level of engagement gets to the root of consumer intent, diving beneath surface keywords to the conversational data that can help companies understand what’s motivating the consumer to conduct their search in the first place. 

Leveraging conversations to drive results

Conversational marketing platforms use messaging apps to engage with consumers and determine intent. This is next-level chatbot technology that uses AI to create a two-way exchange with every customer, asking them questions throughout the buying process and capable of operating on multiple messaging channels.

Spectrm is an example of a conversational marketing platform that goes beyond simple, generic approaches to conversational AI by using domain-specific NLP to guide consumers through the customer journey. Generic conversational AI uses general NLP that can be used for simple tasks like autosuggestions and basic keyword matching. Domain-specific NLP is trained for the individual business. Spectrm’s approach to conversational AI combines domain-specific NLP with the use of generative adversarial networks, a type of machine learning that enables enterprises with little or no customer intent data to quickly generate their own data sets to train the algorithm.

“Marketing chatbots that use domain-specific NLP learn how your individual customers speak. The customer intent data specific to your business, customers, and goals are used to continuously improve your chatbot. It’s about understanding how your customers engage naturally with your brand, and training your bot to respond to that to drive outcomes valuable to your business. Even if you don’t have a lot of conversational data to train your bot.” – Spectrm

Chatbots are only part of what makes conversational marketing platforms work. Platforms like Spectrm operate across the messaging channels where consumers spend their time, including Facebook Messenger, Instagram Messaging, Google Business Messages, and even at the display level via conversational display ads using AdLingo and Google DV360.

Consumers like chatting with businesses. They’re already moving through the buying cycle using one-on-one conversations that provide much more in-depth intent data than a simple keyword search. Consider the following statistics:

  • 75 percent of consumers prefer to engage with brands in private messaging channels versus traditional channels
  • 65 percent of people are more likely to shop with a business they can reach via chat

Conversational data = More targeted campaigns

Conversational data can be used to create marketing campaigns that are more targeted than traditional search and display campaigns. They enable businesses to design targeted messaging around the customer journey, learning what customers want/need in the context of how they’re interacting with the chatbot.

Conversational data also enables businesses to create customer profiles using the answers people provide in chat. Personalization and segmentation become easier based on the granularity and specificity of conversational data. This information can be used to personalize marketing messages at a one-to-one level directly in chat. 

None of this is possible without the right platform. Some factors to consider strongly when evaluating an enterprise-level conversational marketing platform:

  • An easy-to-implement, no-code setup
  • Customizations for your specific company and customer needs
  • Easy integrations with your tech stack
  • Enforcement of the highest privacy standards (GDPR, CCPA, and others)
  • Connection to your product feed (for ecommerce websites) and ability to serve product recommendations/content in real-time based on user input
  • Flexible role management with the ability to set user access roles

Tools like Spectrm are at the heart of marketing automation, enabling companies to acquire new customers at scale. A robust conversational marketing platform makes it possible for companies to build chatbots that engage and convert customers on the websites, apps, and social platforms where people spend their time—no engineering resources needed.

Just like search engines, conversational intelligence tools effectively use language to get to the heart of consumer intent. They go beyond keywords to make every datapoint actionable, using chatbot analytics to optimize funnels and segment customers.

In Spectrm’s words, “Reaching the right audience is getting harder every day. Consumers are more curious, demanding, and impatient than ever. They expect their digital experiences to be personalized, instant, and effortless. Chatbots enable brands to connect with their audience personally and offer seamless customer experiences from the start.”

To view Spectrm’s offerings, click here.

The post Going beyond keywords: how conversational insights take the guesswork out of marketing appeared first on Search Engine Watch.



On-page SEO: a handy checklist to tick off in 2021 and beyond

March 12, 2021

30-second summary:

  • On-page SEO is the process of optimizing your web pages and content for search engine crawlers
  • It involves a lot of moving parts, so it’s easy to forget some elements or tweak them incorrectly
  • This quick checklist will help you keep all the various on-page SEO elements on track

On-page SEO is essentially a set of techniques and best practices that make your web pages more search-engine friendly and thus boost your rankings.

Now, as you know, keywords are at the heart of nearly everything in on-page SEO. But on-page optimization involves a lot of elements, not just keywords, so it’s easy to overlook some of them.

To make it easy for you to ensure all your pages are correctly optimized for the best possible rankings, here’s a handy checklist to tick off.

URLs

Review the URLs of all pages on your site to ensure they’re concise rather than long and complex. Shorter URLs tend to have better click-through rates and are more easily understood by search engine crawlers.

Include your page’s primary keyword in the URL, remove filler (aka stop) words like “the”, “for”, and “to”, and keep it under 60 characters.
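As an illustrative sketch (not an official tool), the URL guidelines above can be automated. The stop-word list here is a small assumed sample, not an exhaustive one:

```python
import re

# Build a short, keyword-first slug: lowercase, strip punctuation and
# common stop words, join with hyphens, and cap the length at 60 chars.
STOP_WORDS = {"the", "for", "to", "a", "an", "and", "of", "in", "on"}

def make_slug(title: str, max_len: int = 60) -> str:
    words = re.findall(r"[a-z0-9]+", title.lower())
    kept = [w for w in words if w not in STOP_WORDS]
    slug = "-".join(kept)
    return slug[:max_len].rstrip("-")

print(make_slug("A Handy Checklist for On-Page SEO in 2021 and Beyond"))
# → handy-checklist-page-seo-2021-beyond
```

The result keeps the primary keywords near the front and stays well under the 60-character guideline.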

Images

Your website is likely brimming with images, and that’s a good thing, as images contribute significantly to improving both user experience and rankings. They make your content easier to consume, more engaging and memorable, and, when optimized correctly, help you drive more traffic to your website.

To optimize your images for on-page SEO, here are a couple of things to ensure:

Image filename and alt text

Google bots can’t “see” images like humans. They need accompanying text to understand what the image is about. So, write a descriptive filename (“blue-running-shoes.jpg” instead of “82596173.jpg”) and alt text (which helps in case the image fails to load for some reason) for every image on your site, including keywords in both.

Alt text also helps make your website more accessible, as screen readers use the alt text to describe images to visually-challenged users. In fact, it’s prudent to test your website’s accessibility to ensure you don’t ever have to cough up big bucks for ADA lawsuit settlements.
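A quick audit can catch images that are missing alt text before they hurt SEO or accessibility. This is a minimal sketch using Python’s standard-library HTML parser; the sample markup is hypothetical:

```python
from html.parser import HTMLParser

# Collect the src of every <img> whose alt attribute is missing or empty,
# so those images can be given descriptive alt text.
class AltTextAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "<no src>"))

html = """
<img src="blue-running-shoes.jpg" alt="Blue running shoes">
<img src="82596173.jpg">
"""
audit = AltTextAudit()
audit.feed(html)
print(audit.missing)  # → ['82596173.jpg']
```

Run against a full page, this flags exactly the images a screen reader cannot describe.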

Image file size

Page speed is a major ranking signal for both desktop and mobile searches, and bulky images slow down your site’s load speed. So make sure to compress all images to reduce their size, ideally to under 70 KB each.

Titles and meta descriptions

See to it that you’ve included your main keywords at the front of the title tags of all pages. Keep your title tags to around 60-65 characters and never longer than 70; otherwise they may get truncated in the SERPs.

Also, the title should be the only element wrapped in the H1 heading tag. In other words, only one H1 tag per page that’s reserved for the title.

For meta descriptions, write a keyword-rich, inviting description that is relevant to your user’s search intent, and keep it under 160 characters for all your pages. If you don’t provide one, Google will pick some relevant text from the page and display it as the meta description in the SERP, which isn’t ideal for SEO.
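The length rules above are easy to enforce with a small check. This is an illustrative sketch using the limits suggested in this section (65 characters for titles, 160 for descriptions); the example strings are hypothetical:

```python
# Flag titles and meta descriptions that exceed the character limits
# discussed above, so they can be shortened before they get truncated.
TITLE_LIMIT = 65
META_LIMIT = 160

def check_lengths(title: str, meta: str) -> list[str]:
    problems = []
    if len(title) > TITLE_LIMIT:
        problems.append(f"title is {len(title)} chars (max {TITLE_LIMIT})")
    if len(meta) > META_LIMIT:
        problems.append(f"meta description is {len(meta)} chars (max {META_LIMIT})")
    return problems

print(check_lengths(
    "On-page SEO: a handy checklist to tick off in 2021 and beyond",
    "A quick checklist covering URLs, images, titles, links, and content.",
))  # → []
```

Wiring a check like this into a build step keeps every new page inside the limits automatically.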

Page load speed

Speed is a major ranking factor you simply can’t afford to overlook. If your pages take more than two to three seconds to load, your visitors will bounce to a competitor, and first-page rankings will remain a dream.

Thus, verify that:

  • Code is optimized with minified CSS and JS
  • There are no unnecessary redirects
  • You have compressed all images
  • You’ve enabled file compression and browser caching
  • Server response time is optimal

Regularly review your site speed using PageSpeed Insights to find out the exact areas that can be improved.

Links – internal and external

Ensure you have a proper linking strategy that you always follow. Both internal and external links play a role in your on-page SEO.

External links

Citing external sources and having outbound links is crucial for building credibility in the eyes of both Google crawlers and human visitors. However, make sure that you’re only linking back to high-quality websites and reliable sources.

Plus, ensure there are no broken (“404 not found”) links, as they hurt SEO and user experience. If your site has a lot of pages, it’s best to create an engaging, easy-to-navigate 404 error page. This will help you retain visitors and guide them to relevant content or actions.
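Finding broken links boils down to filtering crawl results by status code. This offline sketch assumes you already have (URL, status) pairs from a crawler or per-URL HEAD requests; the sample data is hypothetical:

```python
# Report links whose HTTP status is 4xx/5xx so they can be fixed
# or redirected; 3xx redirects are left alone here.
def broken_links(crawl_results):
    return [url for url, status in crawl_results if status >= 400]

results = [
    ("/pricing", 200),
    ("/old-blog-post", 404),  # broken: fix or redirect
    ("/about", 301),
]
print(broken_links(results))  # → ['/old-blog-post']
```

In practice you would feed this from a sitemap crawl and re-run it regularly, since external pages you link to can disappear at any time.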

Internal links

Make sure to strategically interlink pages and content on your website. This helps crawlers to better understand and rank your content for the right keywords.

Internal linking also helps to guide visitors to relevant pages and keep them engaged.

Content

All your blog posts and website copy play a pivotal role in on-page optimization. Beyond ensuring your target keywords are sprinkled judiciously and naturally throughout your title, URL, subheadings, and paragraphs, here are a couple of things to get right.

Structure and readability

Verify the structure of the content on all pages. Make sure you’ve used keyword-optimized headings and subheadings (H1, H2, H3, and so on) to build a logical hierarchy, which improves both the readability and the crawlability of your content.

Comprehensiveness

Studies suggest that longer, in-depth posts perform better than shorter ones when it comes to Google rankings. So, strive for a word count of 2,000+ words in every piece of content.

Comprehensive, long-form content will also serve your audience better as it likely answers all their questions about the topic so they don’t have to look for more reading resources.

Over to you

With each new update to its core algorithm, Google is shifting its focus toward rewarding websites that offer the best user experience.

But nailing your on-page optimization, which ties closely to UX, will continue to help you achieve top rankings and stay there. So keep this checklist handy as you work on your SEO in 2021 and beyond.

Gaurav Belani is a senior SEO and content marketing analyst at Growfusely, a content marketing agency specializing in content and data-driven SEO. He can be found on Twitter @belanigaurav.

The post On-page SEO: a handy checklist to tick off in 2021 and beyond appeared first on Search Engine Watch.



Taking your SEO content beyond the acquisition

February 2, 2021

30-second summary:

  • The typical advice around merely improving on the content already ranking at the top of the SERP is fundamentally flawed.
  • SEOs often limit their content possibilities by thinking of SEO content from purely acquisitional point of view.
  • Thinking of content from a branding perspective leads to differentiation and aligns with Google’s focus on topical expertise and authority.
  • Emerging AI writing technology may not be symmetrical with Google’s evolving algorithm.

I have a bone to pick with the way our industry thinks about content. In general, I don’t think we appreciate what good content really is, nor do I think we consider what should go into creating great content. Here, specifically, I want to challenge the notion that all content is “acquisition” content.

I don’t just mean landing pages, but blog posts as well. That’s right, not all content should be created with the objective of getting more conversions or even more traffic to your site. 

Does that sound outlandish? Perhaps. But by the time you finish reading this, you might agree with me. (Although let’s be honest, you probably won’t).

SEO from a branding perspective

I often think of SEO from a branding perspective. I know, you’re probably thinking, “Well, that’s a crazy statement right there!”. Outlandish as it might sound, thinking of SEO in terms of branding will greatly impact how you see “SEO content”. Why? Because in terms of mindset, content creation and branding are very similar.

Let’s substitute “your brand” with “your site” because your site is your brand to both users and search engines. 

Think of your site as your brand. Just as you think about your brand’s identity and perception, that’s how you should think about your site, because that’s how it’s seen by Google.

We, as SEOs, might refer to this as your site’s “trust” and “authority.” When you break those concepts down fundamentally, what you’re really talking about is how your site is being perceived based on what it’s meant to be doing (that is, its identity). 

In other words, what would the fundamental problem be with a site that offered cancer treatment advice while peddling payday loans? It would be the perception that the health advice is, at best, “tainted”. Even if the site wasn’t “seedy” and offered cancer treatment advice as well as investment advice, there would be a severe lack of identity. 

In many fundamental ways, things like E-A-T and brand identity (and subsequently, perception) are the same thing. 

So let’s ask, if you wanted your brand to be perceived as trustworthy and authoritative how would you go about writing your content? What would your content look and sound like? 

That kind of content would have to be substantial, nuanced, and detailed. Most importantly, it would have to be unique. Having a brand identity that is borrowed from another brand is entirely antithetical to having your own. This applies to everything from an in-depth blog post to a product image or description. Brand identity and differentiation go hand in hand. Differentiation and nuance go hand in hand. Do you see where I’m going here?

Does your “SEO content” sound like this? Are we hyper-focused on differentiation? 

Quite the opposite. A lot of the basic advice you hear about writing “good SEO content” is about replicating what the top-ranking sites are doing already. 

The typical “content for SEO” is irksome

The typical advice about creating “SEO content” flies in the face of content that has a unique identity and brand value. Namely, it often calls on folks to see what’s ranking on the top of the SERP and make sure the topics that the top-ranking sites cover make their way to your content as well. Differentiation is damned.  

Worse, this advice is often directed to new SEOs and it’s presented without a hint that there’s more to the story here. 

Obviously, surveying the top-ranking pages and taking some ideas away is a fine thing to do. However, it does not create unique value. Skyscraper content, as it’s often called, doesn’t help you differentiate your content in any substantial way. 

For those of you who adhere to the notion of simply improving upon what currently ranks let me ask you, would you take the same approach with your brand?

Would you be happy with a brand identity that was simply a take on another brand’s identity? That kind of feels a bit cheap and it isn’t a truly effective branding strategy. 

Why is your content any different? 

Is regurgitating what’s already out there going to help your content stand out or be memorable? (The answer is no in case you were really wondering) 

By the way, there is a fundamental flaw in this approach. Namely, it rests on the assumption that what is there already is the best that it can possibly be. But, isn’t it entirely possible that Google would prefer content that took the topic from a totally different angle? Isn’t it possible that the content already ranking isn’t the best, but is simply the best Google has at the moment? What if you were to take a new approach or introduce new relevant subtopics that other pages don’t? Isn’t there a chance that you would rank and not those other pages? 

However, if you only look at content that’s already ranking, you won’t think about the content that people really want and need, that doesn’t exist yet. That’s potentially a huge opportunity that you’d be missing out on. 

So, why is this tolerated? Why do we spread the idea that all it takes is a wee bit of keyword research and some surveying of the ranking sites? 

I believe it comes down to mindset. We generally think of content as acquisitional and that’s a bit problematic. 

The problem with thinking about content as purely acquisitional

When you think of content as being purely acquisitional, you become blinded by the drug that is acquisition. When your sole goal is acquisition you’re not thinking about things like: 

  • What’s genuinely good for the user? 
  • How do I differentiate my content? 
  • What does my content say about my brand?  

The idea of content being acquisitional is not intrinsically problematic. Content should bring in new users, it should generate traffic, it should result in sales…but it should also do more. 

Content should help give identity to your site. It should create relationships with users. It should lend an air of authority and expertise to your site. (We’re right back at the whole E-A-T thing again because branding and E-A-T are two peas in a pod) 

However, we don’t live in a world of identity, relationships, and authority. Our world consists of clicks, traffic, conversions, sales, and so forth. In turn, we distort content, which in this author’s opinion is not fundamentally about acquisition, into only being about acquisition. 

It’s not hard to see how a mentality that revolves around seeing what already works and replicating it came to dominate our industry. Things like identity and consumer trust, well those are “marketing” concepts. What do they have to do with SEO? SEO is about traffic. Let’s create content that brings in that traffic, no? 

Except, I would argue, SEO is not that at all. Search engines are looking at who your site is and what it claims to be (and if the content you have aligns with that). They are judging your expertise and authority. They want to match the user with helpful content that aligns to query intent. 

Search engines don’t care about your traffic and conversions. They care about users, much the way that a more ‘brand-centric’ outlook on SEO would care about how a user perceives a website.  

What should content be created for if not the acquisition of more sales or traffic?

So if you’re not writing content for acquisition then who and what are you writing content for? I don’t know, how about your audience or potential audience? (I’m referring to creating content for the user. So cliché, I know.)

There are various starting points when thinking about content that serves users. One of which is thinking about yourself and your site and how the content you create represents you. Because once you do, you sure are not going to want to put out anything that presents you the wrong way. 

I don’t want to get into the whole “is keyword research dead” debate (it’s dead, it’s not really a debate). Do what you want with your keywords. I don’t care about your keywords, I care about your content. 

Your content is you. The content you have on your site is who you are to the users who visit your site. Your content is branding. There isn’t a way around that. So while you’ve been focused on scraping every topic and subtopic you can from your competitors, your users (can we call them readers?) are asking why your content looks and feels like every other piece of content they’ve come across. Congratulations. 

(By the way, I personally believe search engines are most likely saying the same thing. That is, what is the real value in ranking this page over what’s already there, if fundamentally, they are the same?) 

Traffic and growth and conversions or however you want to frame this is not a linear equation. Driving more traffic or getting more conversions is a complex and messy endeavor. You can’t just think about what is immediately in front of you. How users feel about your site and perceive your brand over time is an important part of the equation. The content your readers consume, whether it be a product description or a blog post, defines you and your brand. That can determine if they return to your site, recommend your site, link to your site, mention your site, and so forth. 

Is this not part of SEO? Because if it is, that only happens when you think about content from a “perception” or “branding” (or whatever you want to call it) point of view.  

Moreover, thinking of your content and your site overall from a brand authority perspective naturally hones your topical focus. It forces you to create substantial content that reflects well on who you are. And as I mentioned earlier, that topical focus gives your site identity to both users (in the form of brand identity) and to search engines (in the form of, “hey, this site comprehensively tackles this topic over multiple posts, let’s rank them for this topic across the board”). 

But this only happens if you step back from the acquisition mindset and think of your content from a wider and less strictly “traditional SEO” perspective. This only happens when you write content that’s differentiated, that focuses on quality, and that isn’t about making sure you cover certain topics for the sake of covering a certain topic. 

What I am trying to say is that content is naturally closer to branding than it is to SEO (at least SEO as many of us know it). If you don’t look at your content from a branding/perception point of view you are fundamentally missing out on what content is.

That, in turn, means creating strong and quality content will be an uphill battle for you. And that means that ranking long term is also going to be an uphill battle for you, as Google continues to refine how it understands language and how it profiles sites. 

Succinctly, instead of asking “how will this content get me more traffic?”, ask yourself, “How will this content make me look to my users?” That will put you on the path to writing unique, helpful content. 

GPT-3, it’s a trap!

I could end the piece here, but I have one more “concern” that needs to be addressed: AI writers. 

Do I think AI writers, namely GPT-3 will be good at writing a product description? Yes, I do. I think AI writers will ultimately do a wonderful job with something like a product description. 

Do I think AI writers, namely GPT-3, will be good at writing something titled, “A Speculative Critique of Relativity from a Quantum Physics Perspective”? Absolutely not. Do you? 

As this field rapidly develops I want to issue a warning: don’t fall into the trap. Don’t think that you can get away with using something like GPT-3 to write a deeply nuanced and differentiated article or blog post. 

Yes, I do think people will try to do just that. Why? Because of the same acquisition mindset I complained about earlier. When it comes to more substantial content, an AI writer just can’t deliver the nuance and quality that you need to make a difference.

As I see it the danger is that it’s easy to get caught up in emerging technology and go all-in on it. Just remember, Google is also an emerging technology, and a lot of what it’s doing in the algorithm stands in contradiction to the full-on adoption of AI written content. 

While the emergence of AI writers might make it easier to create content, you could be creating the very content that Google does not want. And while something like GPT-3 would, all things being equal, work well on a landing page, the content it produces for a topic your blog handles may need more nuance and depth.

Of course, all of this hinges on thinking there is a world of content beyond acquisition fluff. (If you love fluff, go ahead GPT-3 yourself to death.)

Feel the perception pressure

How do users perceive your site? How do they feel about you after reading the content on your site or interacting with your site? Thinking about your site’s perception can be a pathway to creating content that is substantial and ultimately effective (and I mean from an SEO point of view). 

The problem is when we get so caught up in linear metrics that we don’t even feel that pressure. When SEO content creation becomes a hustle to outrank whatever is currently at the top of the SERP it sacrifices perspective. That perspective can be the difference between being another piece of the same ol’ content versus being something both users and search engines value. 

End the hustle. 

Mordy Oberstein is Liaison to the SEO Community at Wix.

The post Taking your SEO content beyond the acquisition appeared first on Search Engine Watch.

Search Engine Watch


Startups look beyond lidar for autonomous vehicle perception

January 18, 2021 No Comments

Last CES was a time of reckoning for lidar companies, many of which were cratering due to a lack of demand from a (still) non-existent autonomous vehicle industry. The few that excelled did so by specializing, and this year the trend has pushed beyond lidar, with new sensing and imaging methods pushing to both compete with and complement the laser-based tech.

Lidar pushed ahead of traditional cameras because it could do things they couldn’t — and now some companies are pushing to do the same with tech that’s a little less exotic.

A good example of addressing the problem of perception by different means is Eye Net’s vehicle-to-x tracking platform. This is one of those techs that’s been talked about in the context of 5G (admittedly still somewhat exotic), which for all the hype really does enable short-distance, low-latency applications that could be life-savers.

Eye Net provides collision warnings between vehicles equipped with its tech, whether they have cameras or other sensing tech equipped or not. The example they provide is a car driving through a parking lot, unaware that a person on one of those horribly unsafe electric scooters is moving perpendicular to it ahead, about to zoom into its path but totally obscured by parked cars. Eye Net’s sensors detect the position of the devices on both vehicles and send warnings in time for either or both to brake.
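Eye Net hasn’t published how its warnings are actually computed, but the geometry the example describes is a classic closest-point-of-approach check. Here’s a minimal sketch, with hypothetical function names, thresholds, and tracks, assuming each device reports a 2D position and velocity:

```python
import math

def time_of_closest_approach(p1, v1, p2, v2):
    """Time in seconds at which two constant-velocity tracks are nearest.

    p1, p2: (x, y) positions in meters; v1, v2: (vx, vy) velocities in m/s.
    Returns 0 if the tracks are already at their closest (or separating).
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]        # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]      # relative velocity
    dv2 = dvx * dvx + dvy * dvy
    if dv2 == 0:                                  # same velocity: gap never changes
        return 0.0
    return max(-(dx * dvx + dy * dvy) / dv2, 0.0)

def collision_warning(p1, v1, p2, v2, radius=2.0, horizon=5.0):
    """Warn if the tracks come within `radius` meters in the next `horizon` seconds."""
    t = time_of_closest_approach(p1, v1, p2, v2)
    if t > horizon:
        return False
    a = (p1[0] + v1[0] * t, p1[1] + v1[1] * t)
    b = (p2[0] + v2[0] * t, p2[1] + v2[1] * t)
    return math.dist(a, b) < radius

# Car heading east at 5 m/s; scooter heading north at 4 m/s, hidden behind
# parked cars. Their paths cross in about two seconds, so both get a warning.
print(collision_warning((0, 0), (5, 0), (10, -8), (0, 4)))  # True
```

The point of the math is that neither party needs a camera or line of sight, only each other’s broadcast position and velocity, which is exactly what makes the obscured-scooter scenario tractable.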

CG illustration of a bicyclist and car being warned of an imminent collision.

Image Credits: Eye Net

They’re not the only ones attempting something like this, but they hope that by providing a sort of white-label solution, a good-sized network can be built relatively easily, instead of starting from nothing, then equipping all the VWs, then some Fords and some e-bikes, and so on.

But vision is still going to be a major part of how vehicles navigate, and advances are being made on multiple fronts.

Brightway Vision, for instance, addresses the issue of normal RGB cameras having limited visibility in many real-world conditions by going multispectral. In addition to ordinary visible-light imagery, the company’s camera is mated to a near-infrared beamer that scans the road ahead at set distance intervals many times a second.

CG illustration of a camera using infrared to see further ahead at night.

Image Credits: Brightway Vision

The idea is that if the main camera can’t see 100 feet out because of fog, the NIR imagery will still catch any obstacles or road features when it scans that “slice” in its regular sweep of the incoming area. It combines the benefits of traditional cameras with those of IR ones, but manages to avoid the shortcomings of both. The pitch is that there’s no reason to use a normal camera when you can use one of these, which does the same job better and may even allow another sensor to be cut out.
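Brightway hasn’t detailed how the two feeds are merged, but the slice idea lends itself to a toy sketch (all names and numbers here are hypothetical): trust the RGB camera out to the current fog limit, and take the gated-NIR slices beyond it.

```python
def fuse_detections(visible_range_m, rgb_hits, nir_slices):
    """Merge visible-camera detections (trusted only up to visible_range_m,
    e.g. the current fog limit) with gated-NIR detections tagged by slice.

    rgb_hits:   list of (label, distance_m) from the RGB camera
    nir_slices: dict mapping (slice_start_m, slice_end_m) -> list of labels
    Returns a combined list of (label, distance_m).
    """
    fused = [(label, d) for label, d in rgb_hits if d <= visible_range_m]
    for (start, end), labels in nir_slices.items():
        if end > visible_range_m:                 # beyond what RGB can see today
            mid = (start + end) / 2               # place hits at the slice center
            fused.extend((label, mid) for label in labels)
    return fused

# Fog limits the RGB camera to ~30 m, so its 55 m "truck" hit is dropped as
# unreliable, but the NIR sweep's 45-60 m slice still catches a stalled car.
print(fuse_detections(
    visible_range_m=30,
    rgb_hits=[("pedestrian", 12), ("truck", 55)],
    nir_slices={(0, 15): [], (15, 30): [], (45, 60): ["car"]},
))  # [('pedestrian', 12), ('car', 52.5)]
```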

Foresight Automotive also uses multispectral imagery in its cameras (chances are hardly any vehicle camera will be limited to the visible spectrum in a few years), dipping into thermal via a partnership with FLIR, but what it’s really selling is something else.

To provide 360-degree (or close) coverage, generally multiple cameras are required. But where those cameras go differs on a compact sedan versus an SUV from the same manufacturer — let alone on an autonomous freight vehicle. Because those cameras have to work together, they need to be perfectly calibrated, aware of the exact position of the others, so they know, for example, that they’re both looking at the same tree or bicyclist and not two identical ones.

Image showing Foresight cameras being attached magnetically to a car's body.

Image Credits: Foresight Automotive

Foresight’s advance is to simplify the calibration stage, so a design or test platform doesn’t need to be laboriously re-tested and certified every time the cameras need to be moved half an inch in one direction or the other. The Foresight demo shows them sticking the cameras on the roof of the car seconds before driving it.

It has parallels to another startup called Nodar that also relies on stereoscopic cameras, but takes a different approach. The technique of deriving depth from binocular triangulation, as the company points out, goes back decades, or millions of years if you count our own vision system, which works in a similar way. The limitation that has held this approach back isn’t that optical cameras fundamentally can’t provide the depth information needed by an autonomous vehicle, but that they can’t be trusted to remain calibrated.

Nodar shows that its paired stereo cameras don’t even need to be mounted to the main mass of the car, which would reduce jitter and fractional mismatches between the cameras’ views. Attached to the rear view mirrors, their “Hammerhead” camera setup has a wide stance (like the shark’s), which provides improved accuracy because of the larger disparity between the cameras. Since distance is determined by the differences between the two images, there’s no need for object recognition or complex machine learning to say “this is a shape, probably a car, probably about this big, which means it’s probably about this far away” as you might with a single camera solution.
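Nodar’s own pipeline is proprietary, but the triangulation math the paragraph describes is standard: depth is Z = f·B/d for focal length f, baseline B, and pixel disparity d, and the first-order depth uncertainty shrinks as the baseline widens. A sketch with purely illustrative numbers:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Binocular triangulation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px, baseline_m, depth_m, disparity_err_px=0.25):
    """First-order depth uncertainty: dZ ~ Z^2 / (f * B) * delta_d.
    The error grows with the square of distance and shrinks as the
    baseline B widens, which is the point of the wide 'Hammerhead' stance."""
    return depth_m ** 2 / (focal_px * baseline_m) * disparity_err_px

f = 1000.0  # focal length in pixels (illustrative)
print(stereo_depth(f, 0.3, 6.0))   # 50.0 m from a 6 px disparity, 30 cm baseline
print(depth_error(f, 0.3, 50.0))   # ~2.08 m uncertainty at 50 m, narrow baseline
print(depth_error(f, 1.8, 50.0))   # ~0.35 m with a mirror-to-mirror stance
```

That quadratic error term is also why no object recognition is needed: the distance falls straight out of the pixel offset between the two views.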

Image Credits: Nodar

“The industry has already shown that camera arrays do well in harsh weather conditions, just as human eyes do,” said Nodar COO and co-founder Brad Rosen. “For example, engineers at Daimler have published results showing that current stereoscopic approaches provide significantly more stable depth estimates than monocular methods and LiDAR completion in adverse weather. The beauty of our approach is that the hardware we use is available today, in automotive-grade, and with many choices for manufacturers and distributors.”

Indeed, a major strike against lidar has been the cost of the unit — even “inexpensive” ones tend to be orders of magnitude more expensive than ordinary cameras, something that adds up very quickly. But team lidar hasn’t been standing still either.

Sense Photonics came onto the scene with a new approach that seemed to combine the best of both worlds: a relatively cheap and simple flash lidar (as opposed to spinning or scanning, which tend to add complexity) mated to a traditional camera so that the two see versions of the same image, allowing them to work together in identifying objects and establishing distances.

Since its debut in 2019, Sense has refined its tech for production and beyond. The latest advance is custom hardware that has enabled it to image objects out to 200 meters — generally considered the far end for both lidar and traditional cameras.

“In the past, we have sourced an off-the-shelf detector to pair with our laser source (Sense Illuminator). However, our 2 years of in-house detector development has now completed and is a huge success, which allows us to build short-range and long-range automotive products,” said CEO Shauna McIntyre.

“Sense has created ‘building blocks’ for a camera-like LiDAR design that can be paired with different sets of optics to achieve different FOV, range, resolution, etc,” she continued. “And we’ve done so in a very simple design that can actually be manufactured in large volumes. You can think of our architecture like a DSLR camera where you have the ‘base camera’ and can pair it with a macro lens, zoom lens, fisheye lens, etc. to achieve different functions.”
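McIntyre’s DSLR analogy maps naturally onto simple composition. This is only a hypothetical sketch of that design choice (the classes and numbers are invented, not Sense’s actual architecture): one base unit whose effective range and field of view depend on the optics it’s paired with.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Optics:
    name: str
    fov_deg: float   # horizontal field of view the lens provides
    gain: float      # how much the lens concentrates the laser return

@dataclass(frozen=True)
class LidarUnit:
    base_range_m: float   # range of the bare "base camera" module
    optics: Optics

    @property
    def range_m(self):
        # Narrower optics concentrate the return and extend range,
        # at the cost of field of view; wider optics do the reverse.
        return self.base_range_m * self.optics.gain

wide = LidarUnit(100, Optics("fisheye", fov_deg=120, gain=0.5))
far = LidarUnit(100, Optics("telephoto", fov_deg=15, gain=2.0))
print(wide.range_m, far.range_m)  # 50.0 200.0 -- same base, different optics
```

The manufacturing appeal is the same as with camera bodies and lenses: one validated core part, many products.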

One thing all the companies seemed to agree on is that no single sensing modality will dominate the industry from top to bottom. Leaving aside that a fully autonomous (i.e. Level 4-5) vehicle has very different needs from a driver assist system, the field moves too quickly for any one approach to remain on top for long.

“AV companies cannot succeed if the public is not convinced that their platform is safe and the safety margins only increase with redundant sensor modalities operating at different wavelengths,” said McIntyre.

Whether that means visible light, near-infrared, thermal imaging, radar, lidar, or as we’ve seen here, some combination of two or three of these, it’s clear the market will continue to favor differentiation — though as with the boom-bust cycle seen in the lidar industry a few years back, it’s also a warning that consolidation won’t be far behind.

Gadgets – TechCrunch


Openbook is the latest dream of a digital life beyond Facebook

August 13, 2018 No Comments

As tech’s social giants wrestle with antisocial demons that appear to be both an emergent property of their platform power, and a consequence of specific leadership and values failures (evident as they publicly fail to enforce even the standards they claim to have), there are still people dreaming of a better way. Of social networking beyond outrage-fuelled adtech giants like Facebook and Twitter.

There have been many such attempts to build a ‘better’ social network of course. Most have ended in the deadpool. A few are still around with varying degrees of success/usage (Snapchat, Ello and Mastodon are three that spring to mind). None has usurped Zuckerberg’s throne of course.

This is principally because Facebook acquired Instagram and WhatsApp. It has also bought and closed down smaller potential future rivals (tbh). So by hogging network power, and the resources that flow from that, Facebook the company continues to dominate the social space. But that doesn’t stop people imagining something better — a platform that could win friends and influence the mainstream by being better ethically and in terms of functionality.

And so meet the latest dreamer with a double-sided social mission: Openbook.

The idea (currently it’s just that; a small self-funded team; a manifesto; a prototype; a nearly spent Kickstarter campaign; and, well, a lot of hopeful ambition) is to build an open source platform that rethinks social networking to make it friendly and customizable, rather than sticky and creepy.

Their vision to protect privacy as a for-profit platform involves a business model that’s based on honest fees — and an on-platform digital currency — rather than ever watchful ads and trackers.

There’s nothing exactly new in any of their core ideas. But in the face of massive and flagrant data misuse by platform giants these are ideas that seem to sound increasingly like sense. So the element of timing is perhaps the most notable thing here — with Facebook facing greater scrutiny than ever before, and even taking some hits to user growth and to its perceived valuation as a result of ongoing failures of leadership and a management philosophy that’s been attacked by at least one of its outgoing senior execs as manipulative and ethically out of touch.

The Openbook vision of a better way belongs to Joel Hernández who has been dreaming for a couple of years, brainstorming ideas on the side of other projects, and gathering similarly minded people around him to collectively come up with an alternative social network manifesto — whose primary pledge is a commitment to be honest.

“And then the data scandals started happening and every time they would, they would give me hope. Hope that existing social networks were not a given and immutable thing, that they could be changed, improved, replaced,” he tells TechCrunch.

Rather ironically Hernández says it was overhearing the lunchtime conversation of a group of people sitting near him — complaining about a laundry list of social networking ills; “creepy ads, being spammed with messages and notifications all the time, constantly seeing the same kind of content in their newsfeed” — that gave him the final push to pick up the paper manifesto and have a go at actually building (or, well, trying to fund building… ) an alternative platform. 

At the time of writing Openbook’s Kickstarter crowdfunding campaign has a handful of days to go and is only around a third of the way to reaching its (modest) target of $115k, with just over 1,000 backers chipping in. So the funding challenge is looking tough.

The team behind Openbook includes crypto(graphy) royalty, Phil Zimmermann — aka the father of PGP — who is on board as an advisor initially but billed as its “chief cryptographer”, as that’s what he’d be building for the platform if/when the time came. 

Hernández worked with Zimmermann at the Dutch telecom KPN, building security and privacy tools for internal usage — so he called him up and invited him for a coffee to get his thoughts on the idea.

“As soon as I opened the website with the name Openbook, his face lit up like I had never seen before,” says Hernández. “You see, he wanted to use Facebook. He lives far away from his family and Facebook was the way to stay in the loop with his family. But using it would also mean giving away his privacy and therefore accepting defeat on his life-long fight for it, so he never did. He was thrilled at the possibility of an actual alternative.”

On the Kickstarter page there’s a video of Zimmermann explaining the ills of the current landscape of for-profit social platforms, as he views it. “If you go back a century, Coca Cola had cocaine in it and we were giving it to children,” he says here. “It’s crazy what we were doing a century ago. I think there will come a time, some years in the future, when we’re going to look back on social networks today, and what we were doing to ourselves, the harm we were doing to ourselves with social networks.”

“We need an alternative to the social network revenue model that we have today,” he adds. “The problem with having these deep machine learning neural nets that are monitoring our behaviour and pulling us into deeper and deeper engagement is they already seem to know that nothing drives engagement as much as outrage.

“And this outrage deepens the political divides in our culture, it creates attack vectors against democratic institutions, it undermines our elections, it makes people angry at each other and provides opportunities to divide us. And that’s in addition to the destruction of our privacy by revenue models that are all about exploiting our personal information. So we need some alternative to this.”

Hernández actually pinged TechCrunch’s tips line back in April — soon after the Cambridge Analytica Facebook scandal went global — saying “we’re building the first ever privacy and security first, open-source, social network”.

We’ve heard plenty of similar pitches before, of course. Yet Facebook has continued to harvest global eyeballs by the billions. And even now, after a string of massive data and ethics scandals, it’s all but impossible to imagine users leaving the site en masse. Such is the powerful lock-in of The Social Network effect.

Regulation could present a greater threat to Facebook, though others argue more rules will simply cement its current dominance.

Openbook’s challenger idea is to apply product innovation to try to unstick Zuckerberg. Aka “building functionality that could stand for itself”, as Hernández puts it.

“We openly recognise that privacy will never be enough to get any significant user share from existing social networks,” he says. “That’s why we want to create a more customisable, fun and overall social experience. We won’t follow the footsteps of existing social networks.”

Data portability is an important ingredient to even being able to dream this dream — getting people to switch from a dominant network is hard enough without having to ask them to leave all their stuff behind as well as their friends. Which means that “making the transition process as smooth as possible” is another project focus.

Hernández says they’re building data importers that can parse the archive users are able to request from their existing social networks — to “tell you what’s in there and allow you to select what you want to import into Openbook”.
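The real importers would have to handle each network’s actual export format, which Openbook hasn’t published details on. As a hedged sketch, assuming a hypothetical JSON archive keyed by content type, the summarize-then-select step might look like:

```python
import json

def summarize_archive(archive_json):
    """Count what's in the export so the user can decide what to bring along."""
    data = json.loads(archive_json)
    return {kind: len(items) for kind, items in data.items()}

def select_for_import(archive_json, wanted):
    """Keep only the categories the user ticked."""
    data = json.loads(archive_json)
    return {kind: items for kind, items in data.items() if kind in wanted}

# A stand-in for the archive a social network lets you download.
export = json.dumps({
    "posts": [{"text": "hello"}, {"text": "again"}],
    "photos": [{"file": "beach.jpg"}],
    "ad_interests": [{"topic": "sneakers"}],  # the sort of thing you'd leave behind
})
print(summarize_archive(export))   # {'posts': 2, 'photos': 1, 'ad_interests': 1}
print(select_for_import(export, {"posts", "photos"}))
```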

These sorts of efforts are aided by updated regulations in Europe — which bolster portability requirements on controllers of personal data. “I wouldn’t say it made the project possible but… it provided us with a unique opportunity no other initiative had before,” says Hernández of the EU’s GDPR.

“Whether it will play a significant role in the mass adoption of the network, we can’t tell for sure but it’s simply an opportunity too good to ignore.”

On the product front, he says they have lots of ideas — reeling off a list that includes the likes of “a topic-roulette for chats, embracing Internet challenges as another kind of content, widgets, profile avatars, AR chatrooms…” for starters.

“Some of these might sound silly but the idea is to break the status quo when it comes to the definition of what a social network can do,” he adds.

Asked why he believes other efforts to build ‘ethical’ alternatives to Facebook have failed he argues it’s usually because they’ve focused on technology rather than product.

“This is still the most predominant [reason for failure],” he suggests. “A project comes up offering a radical new way to do social networking behind the scenes. They focus all their efforts in building the brand new tech needed to do the very basic things a social network can already do. Next thing you know, years have passed. They’re still thousands of miles away from anything similar to the functionality of existing social networks and their core supporters have moved into yet another initiative making the same promises. And the cycle goes on.”

He also reckons disruptive efforts have fizzled out because they were too tightly focused on being just a solution to an existing platform problem and nothing more.

So, in other words, people were trying to build an ‘anti-Facebook’, rather than a distinctly interesting service in its own right. (The latter innovation, you could argue, is how Snap managed to carve out a space for itself in spite of Facebook sitting alongside it — even as Facebook has since sought to crush Snap’s creative market opportunity by cloning its products.)

“This one applies not only to social network initiatives but privacy-friendly products too,” argues Hernández. “The problem with that approach is that the problems they solve or claim to solve are most of the time not mainstream. Such as the lack of privacy.

“While these products might do okay with the people that understand the problems, at the end of the day that’s a very tiny percentage of the market. The solution these products often present to this issue is educating the population about the problems. This process takes too long. And in topics like privacy and security, it’s not easy to educate people. They are topics that require a knowledge level beyond the one required to use the technology and are hard to explain with examples without entering into the conspiracy theorist spectrum.”

So the Openbook team’s philosophy is to shake things up by getting people excited for alternative social networking features and opportunities, with merely the added benefit of not being hostile to privacy nor algorithmically chain-linked to stoking fires of human outrage.

The reliance on digital currency for the business model does present another challenge, though, as getting people to buy into this could be tricky. After all payments equal friction.

To begin with, Hernández says the digital currency component of the platform would be used to let users list secondhand items for sale. Down the line, the vision extends to being able to support a community of creators getting a sustainable income — thanks to the same baked in coin mechanism enabling other users to pay to access content or just appreciate it (via a tip).

So, the idea is that creators on Openbook would be able to benefit from the social network effect via direct financial payments derived from the platform (instead of merely ad-based payments, such as are available to YouTube creators) — albeit that assumes reaching the necessary critical mass of usage. Which of course is the really, really tough bit. 

“Lower cuts than any existing solution, great content creation tools, great administration and overview panels, fine-grained control over the view-ability of their content and more possibilities for making a stable and predictable income such as creating extra rewards for people that accept to donate for a fixed period of time such as five months instead of a month to month basis,” says Hernández, listing some of the ideas they have to stand out from existing creator platforms.

“Once we have such a platform and people start using tips for this purpose (which is not such a strange use of a digital token), we will start expanding on its capabilities,” he adds. (He’s also written the requisite Medium article discussing some other potential use cases for the digital currency portion of the plan.)

At this nascent, still-not-actually-funded prototype stage they haven’t made any firm technical decisions on this front either. They also don’t want to end up accidentally getting into bed with unethical tech. 

“Digital currency wise, we’re really concerned about the environmental impact and scalability of the blockchain,” he says — which could risk Openbook contradicting stated green aims in its manifesto and looking hypocritical, given its plan is to plough 30% of its revenues into ‘give-back’ projects, such as environmental and sustainability efforts and also education.

“We want a decentralised currency but we don’t want to rush into decisions without some in-depth research. Currently, we’re going through IOTA’s whitepapers,” he adds.

They do also believe in decentralizing the platform — or at least parts of it — though that would not be their first focus on account of the strategic decision to prioritize product. So they’re not going to win fans from the (other) crypto community. Though that’s hardly a big deal given their target user-base is far more mainstream.

“Initially it will be built on a centralised manner. This will allow us to focus in innovating in regards to the user experience and functionality product rather than coming up with a brand new behind the scenes technology,” he says. “In the future, we’re looking into decentralisation from very specific angles and for different things. Application wise, resiliency and data ownership.”

“A project we’re keeping an eye on and that shares some of our vision on this is Tim Berners-Lee’s MIT Solid project. It’s all about decoupling applications from the data they use,” he adds. 

So that’s the dream. And the dream sounds good and right. The problem is finding enough funding and wider support — call it ‘belief equity’ — in a market so denuded of competitive possibility as a result of monopolistic platform power that few can even dream an alternative digital reality is possible.

In early April, Hernández posted a link to a basic website with details of Openbook to a few online privacy and tech communities asking for feedback. The response was predictably discouraging. “Some 90% of the replies were a mix between critiques and plain discouraging responses such as “keep dreaming”, “it will never happen”, “don’t you have anything better to do”,” he says.

(Asked this April by US lawmakers whether he thinks he has a monopoly, Zuckerberg paused and then quipped: “It certainly doesn’t feel like that to me!”)

Still, Hernández stuck with it, working on a prototype and launching the Kickstarter. He’s got that far — and wants to build so much more — but getting enough people to believe that a better, fairer social network is even possible might be the biggest challenge of all. 

For now, though, Hernández doesn’t want to stop dreaming.

“We are committed to make Openbook happen,” he says. “Our back-up plan involves grants and impact investment capital. Nothing will be as good as getting our first version through Kickstarter though. Kickstarter funding translates to absolute freedom for innovation, no strings attached.”

You can check out the Openbook crowdfunding pitch here.


Social – TechCrunch


Periscope expands virtual tipping via Super Hearts beyond the U.S.

December 1, 2017 No Comments

Twitter’s big push to draw in more live video stars to its Periscope streaming service is now expanding beyond the U.S. The company announced today the Periscope Super Broadcaster program, which allows video stars to earn revenue from their streams through a virtual tipping mechanism, is now available in Canada, Ireland, and the U.K. Other countries will be added to the program soon…
Social – TechCrunch


Bing Exact Match Close Variant Update and Beyond

August 31, 2017 No Comments

In an effort to help advertisers reach more customers, Bing expands exact match close variant technology.

Read more at PPCHero.com
PPC Hero


5 Tips To Stay Organized Through The Holidays And Beyond

November 24, 2016 No Comments

Just like fashion, everyone has a management style all their own. Here are a few tips and pointers to help you stay organized this holiday season and into the new year.

Read more at PPCHero.com
PPC Hero

