
How to make the most of internal linking for higher rankings and improved organic search visibility

September 29, 2018

An internal link is a hyperlink pointing to a page within the same domain. Internal linking is crucially important for both website rankings and usability:

  • Internal links allow users to conveniently navigate around the website (e.g. to complete a purchase, learn more about a product or read about your business)
  • Internal links allow crawlers to discover more of your site's pages, even those (especially important ones) that have no external backlinks
  • Internal links are thought* to improve each page's authority (Google puts some emphasis on the signal: The more internal links a page has, the more internal authority it is supposed to have).

*This has never been officially confirmed by Google (unless I missed the announcement), but we've seen web pages do considerably better once we add internal in-links pointing to them, so let's call this an educated theory backed by multiple experiments.

Now the question is how to use internal links correctly. Let's see…

1. Internal Linking Basics and Best Practices

I won’t repeat what Rand said in this Whiteboard Friday video because I agree with most (all?) points. But let me recap:

  • Well-structured navigation is crucial both for user experience and crawling… however
  • In-content internal links (links embedded within meaningful context) seem to carry more weight for rankings
  • Google is believed to give the least importance to footer links
  • Internal anchor text does matter. This has almost been confirmed by a Googler. So if you target specific queries with a specific page, use descriptive, keyword-focused (but still meaningful) anchor text when linking to that page, where that makes sense (see the sketch after this list). However, stay away from always using exact-match anchor text, as it may look unnatural.
  • If there are two internal links to the same page on one page, only the first anchor text seems to matter to Google
  • Google seems to like text links more than image links with alt text
  • Generally, the more in-links a page has, the better its rankings (this is easy to test: Just pick a page on your site and start linking to it consistently. You are likely to see it moving up in SERPs)
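To make the anchor text advice concrete, here is a minimal HTML sketch (the URL and copy are hypothetical, purely for illustration):

<!-- In-content internal link with descriptive, keyword-focused anchor text -->
<p>
  Before building new pages, make sure you understand
  <a href="/guides/internal-linking/">how internal linking affects rankings</a>.
</p>

<!-- Generic anchors like this waste the relevance signal: -->
<p>
  To learn about internal linking, click <a href="/guides/internal-linking/">here</a>.
</p>

Both links pass internal authority, but only the first one tells Google what the target page is about.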

2. Analyzing and Evaluating the Internal Link Structure

Surprisingly, given the amount of weight SEOs put on internal linking, there are not many tools that let you see the internal structure clearly. Yes, there are a few powerful crawling solutions, including this free one as well as one of my favorites, Screaming Frog.

But there’s no easy way to analyze how each specific landing page is linked to throughout the site.

JetOctopus solves that problem with its Linking Explorer feature. Crawl your site using JetOctopus and open the Linking Explorer section (behind the "Explorer" link in the navigation). From there you can provide the URL of your (or your competitor's) landing page to see exactly how it is linked to from other pages on the site.

I love the section of the report that shows in-linking anchor text. This gives you great insight into:

  • Whether you are optimizing your internal anchor text enough (or over-optimizing to the point where it looks a bit ridiculous / unnatural)
  • Which keywords your competitors want each specific page to rank for

[Image: JetOctopus Linking Explorer]

3. Using Structured Data for Internal Linking

Apart from anchor text, there's an even better way to signal your internal structure to Google when linking: Schema.org.

Some Schema.org properties have been confirmed by Google as impacting the way they interpret websites (or at least the way pages look when listed in SERPs). Others presumably help because, as Google has confirmed, structured data in general helps Google understand websites better, and it may even be a ranking factor.

So which Schema.org properties can be implemented for internal linking?

1. /BreadcrumbList

Google says marking up breadcrumbs using Schema.org is one of the enhancements that could positively influence your website's organic visibility and engagement (i.e. click-through rate):

  • The markup helps Google understand the website’s hierarchy better
  • /BreadcrumbList markup helps Google generate a breadcrumb-style format of the URL in search listings, which is more appealing and may increase click-through

[Image: indicating the position of each URL in the site's hierarchy using BreadcrumbList]
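Here is a minimal JSON-LD sketch following Google's documented BreadcrumbList format (the example.com URLs and category names are hypothetical):

<!-- Hypothetical URLs and names, for illustration only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Books",
      "item": "https://example.com/books"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Science Fiction",
      "item": "https://example.com/books/science-fiction"
    }
  ]
}
</script>

Each ListItem's position reflects the page's depth in the hierarchy, and each item URL doubles as an internal link signal.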

There are a variety of WordPress plugins allowing you to easily implement the markup, including this one.

2. Authorship

Even though Google's authorship program has been discontinued (meaning authors are no longer highlighted in search results), that experiment showed how interested Google is in understanding who is behind content.

That being said, making their life easier never hurts, so marking up internal bio links using Schema.org/author is a smart idea.
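A minimal sketch of what that could look like on an article page, assuming a hypothetical author bio page on example.com:

<!-- Hypothetical author and URLs, for illustration only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to make the most of internal linking",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://example.com/about/jane-doe"
  }
}
</script>

The author's url property points Google at your internal bio page, explicitly tying the article and the author together.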

3. /ListItem

Another way to stand out in search is getting into those search carousels. Officially, Google supports the list format for the following content types: Recipe, Film, Course, Article. However, as they confirm, this list is ever-growing, so marking up your product lists is not a bad idea.

[Image: Google carousels, showing an example of a list from a single website shown in a carousel]
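Here is what a summary-page sketch could look like in Google's documented ItemList format, with each ListItem pointing at one of your detail pages (the example.com URLs are hypothetical):

<!-- Hypothetical URLs, for illustration only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "url": "https://example.com/recipes/apple-pie" },
    { "@type": "ListItem", "position": 2, "url": "https://example.com/recipes/cherry-pie" },
    { "@type": "ListItem", "position": 3, "url": "https://example.com/recipes/peach-cobbler" }
  ]
}
</script>

Again, every url here is an internal link that both feeds the carousel and reinforces your site structure.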

4. Reviews!

Reviews get huge SERP visibility. Reviews are one of the oldest rich snippets Google has been experimenting with, and today Google supports a variety of types, "including businesses, products, and different creative works such as books or movies." Here's a solid collection of WordPress plugins for each supported type, depending on what it is you are doing. All of the plugins in that list are Schema.org-based.

Google recommends using schema.org/URL whenever you want to point them to the page with the full review.

[Image: schema.org/URL]
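A minimal sketch of review markup using the url property to link internally to the page with the full review (the product, rating and URLs are hypothetical):

<!-- Hypothetical product and URLs, for illustration only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Acme Widget" },
  "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
  "author": { "@type": "Person", "name": "Jane Doe" },
  "url": "https://example.com/reviews/acme-widget"
}
</script>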

5. More!

Again, whether Google currently supports a certain Schema.org type or not, it's always worth asking "What else can I do to help them understand my site more easily?"

Besides, Google has stated many times that they are working on supporting more and more schema properties, including (just recently) FAQ and How-to. So whenever you are creating or editing pages, consider the Schema.org properties that make sense there. For example, you can:

  • Point to your About page using schema.org/Organization
  • Link to your home page using schema.org/copyrightHolder whenever you publish a new content asset, etc. (see the sketch below)
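A combined sketch of both ideas on a new content asset, again with hypothetical example.com URLs (the Organization's url could just as well point at your About page):

<!-- Hypothetical organization and URLs, for illustration only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "A new content asset",
  "copyrightHolder": {
    "@type": "Organization",
    "name": "Example Inc.",
    "url": "https://example.com/"
  }
}
</script>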

What internal linking tactics are you using to maximize your website’s organic visibility? Please share in the comments!

Search Engine Watch


New Interface, New Syntax: Automating Rules in Google Ads

September 28, 2018

Whether you are pro-change or anti-change with Google, there is one certainty – You have to change. Learn how to create automated rules in the new interface.

Read more at PPCHero.com
PPC Hero


FCC cracks the whip on 5G deployment against protests of local governments

September 28, 2018

The FCC is pushing for speedy deployment of 5G networks nationwide with an order adopted today that streamlines what it perceives as a patchwork of obstacles, needless costs and contradictory regulations at the state level. But local governments say the federal agency is taking things too far.

5G networks will consist of thousands of wireless installations, smaller and more numerous than cell towers. This means that wireless companies can't rely on existing facilities, at least not for all of it, and will have to apply for access to lots of new buildings, utility poles and so on. It's a lot of red tape, which of course impedes deployment.

To address this, the agency this morning voted 3 to 1 along party lines to adopt the order (PDF) entitled "Accelerating Wireless Broadband Deployment by Removing Barriers to Infrastructure Investment." What it essentially does is exert FCC authority over state and local wireless regulators and subject them to a set of new rules superseding their own.

First the order aims to literally speed up deployment by standardizing new, shorter “shot clocks” for local governments to respond to applications. They’ll have 90 days for new locations and 60 days for existing ones — consistent with many existing municipal time frames but now to be enforced as a wider standard. This could be good, as the longer time limits were designed for consideration of larger, more expensive equipment.

On the other hand, some cities argue, it’s just not enough time — especially considering the increased volume they’ll be expected to process.

Cathy Murillo, mayor of Santa Barbara, writes in a submitted comment:

The proposed ‘shot clocks’ would unfairly and unreasonably reduce the time needed for proper application review in regard to safety, aesthetics, and other considerations. By cutting short the necessary review period, the proposals effectively shift oversight authority from the community and our elected officials to for-profit corporations for wireless equipment installations that can have significant health, safety, and aesthetic impacts when those companies have little, if any, interest to respect these concerns.

Next, and even less popular, is the FCC’s take on fees for applications and right-of-way paperwork. These fees currently vary widely, because as you might guess it is far more complicated and expensive — often by an order of magnitude or more — to approve and process an application for (not to mention install and maintain) an antenna on 5th Avenue in Manhattan than it is in outer Queens. These are, to a certain extent anyway, natural cost differences.

The order limits these fees to "a reasonable approximation of their costs for processing," which the FCC estimated at about $500 for a single application covering up to five installations or facilities, $100 for each additional facility, and $270 per facility per year, all-inclusive.
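To put rough, hypothetical numbers on those caps: a carrier applying to deploy 20 small cells in one city would owe about $500 for the first five facilities plus 15 × $100 = $1,500 for the rest, i.e. $2,000 up front, and then 20 × $270 = $5,400 per year in recurring fees.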

For some places, to be sure, that may be perfectly reasonable. But as Catherine Pugh, mayor of Baltimore, put it in a letter (PDF) to the FCC protesting the proposed rules, it sure isn’t for her city:

An annual fee of $270 per attachment, as established in the above document, is unconscionable when the facility may yield profits, in some cases, many times that much in a given month. The public has invested and installed these assets [i.e. utility poles and other public infrastructure], not the industry. The industry does not own these assets; the public does. Under these circumstances, it is entirely reasonable that the public should be able to charge what it believes to be a fair price.

There’s no doubt that excessive fees can curtail deployment and it would be praiseworthy of the FCC to tackle that. But the governments they are hemming in don’t seem to appreciate being told what is reasonable and what isn’t.

“It comes down to this: three unelected officials on this dais are telling state and local leaders all across the country what they can and cannot do in their own backyards,” said FCC Commissioner Jessica Rosenworcel in a statement presented at the vote. “This is extraordinary federal overreach.”

New York City’s commissioner of information technology told Bloomberg that his office is “shocked” by the order, calling it “an unnecessary and unauthorized gift to the telecommunications industry and its lobbyists.”

The new rules may undermine deployment deals that already exist or are under development. After all, if you were a wireless company, would you still commit to paying $2,000 per facility when the feds just gave you a coupon for 80 percent off? And if you were a city looking at a budget shortfall of millions because of this, wouldn't you look for a way around it?

Chairman Ajit Pai argued in a statement that “When you raise the cost of deploying wireless infrastructure, it is those who live in areas where the investment case is the most marginal—rural areas or lower-income urban areas—who are most at risk of losing out.”

But the basic market economics of this don’t seem to work out. Big cities cost more and are more profitable; rural areas cost less and are less profitable. Under the new rules, big cities and rural areas will cost the same, but the former will be even more profitable. Where would you focus your investments?

The FCC also unwisely attempts to take on the aesthetic considerations of installations. Cities have their own requirements for wireless infrastructure, such as how it’s painted, where it can be located and what size it can be when in this or that location. But the FCC seems (as it does so often these days) to want to accommodate the needs of wireless providers rather than the public.

Wireless companies complain that the rules are overly restrictive or subjective, and differ too greatly from one place to another. Municipalities contend that the restrictions are justified and, at any rate, their prerogative to design and enforce.

“Given these differing perspectives and the significant impact of aesthetic requirements on the ability to deploy infrastructure and provide service, we provide guidance on whether and in what circumstances aesthetic requirements violate the [Communications] Act,” the FCC’s order reads. In other words, wireless industry gripes about having to paint their antennas or not hang giant microwave arrays in parks are being federally codified.

“We conclude that aesthetics requirements are not preempted if they are (1) reasonable, (2) no more burdensome than those applied to other types of infrastructure deployments, and (3) published in advance,” the order continues. Does that sound kind of vague to you? Whether a city’s aesthetic requirement is “reasonable” is hardly the jurisdiction of a communications regulator.

For instance, Hudson, Ohio city manager Jane Howington writes in a comment on the order that the city has 40-foot limits on pole heights, to which the industry has already agreed, but which would be increased to 50 under the revisions proposed in the rule. Why should a federal authority be involved in something so clearly under local jurisdiction and expertise?

This isn’t just an annoyance. As with the net neutrality ruling, legal threats from states can present serious delays and costs.

“Every major state and municipal organization has expressed concern about how Washington is seeking to assert national control over local infrastructure choices and stripping local elected officials and the citizens they represent of a voice in the process,” said Rosenworcel. “I do not believe the law permits Washington to run roughshod over state and local authority like this and I worry the litigation that follows will only slow our 5G future.”

She also points out that the predicted cost savings of $2 billion — by telecoms, not the public — may be theorized to spur further wireless deployment, but there is no requirement for companies to use it for that, and in fact no company has said it will.

In other words, there’s every reason to believe that this order will sow discord among state and federal regulators, letting wireless companies save money and sticking cities with the bill. There’s certainly a need to harmonize regulations and incentivize wireless investment (especially outside city centers), but this doesn’t appear to be the way to go about it.

Mobile – TechCrunch


Moving Pieces: Thinking About PPC Strategy

September 27, 2018

Strategy. What is your strategy? As an agency, our team responds to that very question on a daily basis. How do we respond?

Read more at PPCHero.com
PPC Hero


Tech and ad giants sign up to Europe’s first weak bite at ‘fake news’

September 26, 2018

The European Union's executive body has signed up tech platforms and ad industry players to a voluntary Code of Practice aimed at trying to do something about the spread of disinformation online.

Something, just not anything too specifically quantifiable.

According to the Commission, Facebook, Google, Twitter, Mozilla, some additional members of the EDIMA trade association, plus unnamed advertising groups are among those that have signed up to the self-regulatory code, which will apply in a month’s time.

Signatories have committed to taking not exactly prescribed actions in the following five areas:

  • Disrupting advertising revenues of certain accounts and websites that spread disinformation;
  • Making political advertising and issue based advertising more transparent;
  • Addressing the issue of fake accounts and online bots;
  • Empowering consumers to report disinformation and access different news sources, while improving the visibility and findability of authoritative content;
  • Empowering the research community to monitor online disinformation through privacy-compliant access to the platforms’ data.

Mariya Gabriel, the European commissioner for digital economy and society, described the Code as a first “important” step in tackling disinformation. And one she said will be reviewed by the end of the year to see how (or, well, whether) it’s functioning, with the door left open for additional steps to be taken if not. So in theory legislation remains a future possibility.

“This is the first time that the industry has agreed on a set of self-regulatory standards to fight disinformation worldwide, on a voluntary basis,” she said in a statement. “The industry is committing to a wide range of actions, from transparency in political advertising to the closure of fake accounts and demonetisation of purveyors of disinformation, and we welcome this.

“These actions should contribute to a fast and measurable reduction of online disinformation. To this end, the Commission will pay particular attention to its effective implementation.”

“I urge online platforms and the advertising industry to immediately start implementing the actions agreed in the Code of Practice to achieve significant progress and measurable results in the coming months,” she added. “I also expect more and more online platforms, advertising companies and advertisers to adhere to the Code of Practice, and I encourage everyone to make their utmost to put their commitments into practice to fight disinformation.”

Earlier this year a report by an expert group established by the Commission to help shape its response to the so-called 'fake news' crisis called for more transparency from online platforms, as well as urgent investment in media and information literacy education to empower journalists and foster a diverse and sustainable news media ecosystem.

Safe to say, no one has suggested there’s any kind of quick fix for the Internet enabling the accelerated spread of nonsense and lies.

Including the Commission’s own expert group, which offered an assorted pick’n’mix of ideas — set over various and some not-at-all-instant-fix timeframes.

Though the group was called out for failing to interrogate evidence around the role of behavioral advertising in the dissemination of fake news — which has arguably been piling up. (Certainly its potential to act as a disinformation nexus has been amply illustrated by the Facebook-Cambridge Analytica data misuse scandal, to name one recent example.)

The Commission is not doing any better on that front, either.

The executive has been working on formulating its response to what its expert group suggested should be referred to as ‘disinformation’ (i.e. rather than the politicized ‘fake news’ moniker) for more than a year now — after the European parliament adopted a Resolution, in June 2017, calling on it to examine the issue and look at existing laws and possible legislative interventions.

Elections for the European parliament are due next spring and MEPs are clearly concerned about the risk of interference. So the unelected Commission is feeling the elected parliament’s push here.

Disinformation — aka “verifiably false or misleading information” created and spread for economic gain and/or to deceive the public, and which “may cause public harm” such as “threats to democratic political and policymaking processes as well as public goods such as the protection of EU citizens’ health, the environment or security”, as the Commission’s new Code of Practice defines it — is clearly a slippery policy target.

And online multiple players are implicated and involved in its spread. 

But so too are multiple, powerful, well resourced adtech players incentivized to push to avoid any political disruption to their lucrative people-targeting business models.

In the Commission’s voluntary Code of Practice signatories merely commit to recognizing their role in “contributing to solutions to the challenge posed by disinformation”. 

“The Signatories recognise and agree with the Commission’s conclusions that “the exposure of citizens to large scale Disinformation, including misleading or outright false information, is a major challenge for Europe. Our open democratic societies depend on public debates that allow well-informed citizens to express their will through free and fair political processes,” runs the preamble.

“[T]he Signatories are mindful of the fundamental right to freedom of expression and to an open Internet, and the delicate balance which any efforts to limit the spread and impact of otherwise lawful content must strike.

“In recognition that the dissemination of Disinformation has many facets and is facilitated by and impacts a very broad segment of actors in the ecosystem, all stakeholders have roles to play in countering the spread of Disinformation.”

“Misleading advertising” is explicitly excluded from the scope of the code — which also presumably helped the Commission convince the ad industry to sign up to it.

Though that further risks muddying the waters of the effort, given that social media advertising has been the high-powered vehicle of choice for malicious misinformation muck-spreaders (such as Kremlin-backed agents of societal division).

The Commission is presumably trying to split the hairs of maliciously misleading fake ads (still bad because they’re not actually ads but malicious pretenders) and good old fashioned ‘misleading advertising’, though — which will continue to be dealt with under existing ad codes and standards.

Also excluded from the Code: “Clearly identified partisan news and commentary”. So purveyors of hyper biased political commentary are not intended to get scooped up here, either. 

Though again, plenty of Kremlin-generated disinformation agents have masqueraded as partisan news and commentary pundits, and from all sides of the political spectrum.

Hence, we must again assume, the Commission including the requirement to exclude this type of content where it’s “clearly identified”. Whatever that means.

Among the various ‘commitments’ tech giants and ad firms are agreeing to here are plenty of firmly fudgey sounding statements that call for a degree of effort from the undersigned. But without ever setting out explicitly how such effort will be measured or quantified.

For example:

  • The Signatories recognise that all parties involved in the buying and selling of online advertising and the provision of advertising-related services need to work together to improve transparency across the online advertising ecosystem and thereby to effectively scrutinise, control and limit the placement of advertising on accounts and websites belonging to purveyors of Disinformation.

Or

  • Relevant Signatories commit to use reasonable efforts towards devising approaches to publicly disclose “issue-based advertising”. Such efforts will include the development of a working definition of “issue-based advertising” which does not limit reporting on political discussion and the publishing of political opinion and excludes commercial

And

  • Relevant Signatories commit to invest in features and tools that make it easier for people to find diverse perspectives about topics of public interest.

Nor does the code exactly nail down the terms it’s using to set goals — raising tricky and even existential questions like who defines what’s “relevant, authentic, and authoritative” where information is concerned?

Which is really the core of the disinformation problem.

And also not an easy question for tech giants — which have sold their vast content distribution farms as neutral ‘platforms’ — to start to approach, let alone tackle. Hence their leaning so heavily on third party fact-checkers to try to outsource their lack of any editorial values. Because without editorial values there’s no compass; and without a compass how can you judge the direction of tonal travel?

And so we end up with very vague suggestions in the code like:

  • Relevant Signatories should invest in technological means to prioritize relevant, authentic, and authoritative information where appropriate in search, feeds, or other automatically ranked distribution channels

Only slightly less vague and woolly is a commitment that signatories will “put in place clear policies regarding identity and the misuse of automated bots” on the signatories’ services, and “enforce these policies within the EU”. (So presumably not globally, despite disinformation being able to wreak havoc everywhere.)

Though here the code only points to some suggestive measures that could be used to do that — and which are set out in a separate annex. This boils down to a list of some very, very broad-brush “best practice principles” (such as “follow the money”; develop “solutions to increase transparency”; and “encourage research into disinformation”… ).

And set alongside that uninspiringly obvious list is another — of some current policy steps being undertaken by the undersigned to combat fake accounts and content — as if they’re already meeting the code’s expectations… so, er…

Unsurprisingly, the Commission’s first bite at ‘fake news’ has attracted some biting criticism for being unmeasurably weak sauce.

A group of media advisors — including the Association of Commercial Television in Europe, the European Broadcasting Union, the European Federation of Journalists and International Fact-Checking Network, and several academics — are among the first critics.

Reuters reports them complaining that signatories have not offered measurable objectives to monitor the implementation. “The platforms, despite their best efforts, have not been able to deliver a code of practice within the accepted meaning of effective and accountable self-regulation,” it quotes the group as saying.

Disinformation may be a tough, multi-pronged, multi-dimensional problem but few would try to argue that an overly dilute solution will deliver anything at all — well, unless it’s kicking the can down the road that you’re really after.

The Commission doesn’t even seem to know exactly what the undersigned have agreed to do as a first step, with the commissioner saying she’ll meet signatories “in the coming weeks to discuss the specific procedures and policies that they are adopting to make the Code a reality”. So double er… !

The code also only envisages signatories meeting annually to discuss how things are going. So no pressure for regular collaborative moots vis-a-vis tackling things like botnets spreading malicious disinformation then. Not unless the undersigned really, really want to.

Which seems unlikely, given how their business models tend to benefit from engagement — and disinformation-fuelled outrage has shown itself to be a very potent fuel on that front.

As part of the code, these adtech giants have at least technically agreed to make information available to the Commission on request — and generally to co-operate with its efforts to assess how/whether the code is working.

So, if public pressure on the issue continues to ramp up, the Commission does at least have a route to ask for relevant data from platforms that could, in theory, be used to feed a regulation that’s worth the paper it’s written on.

Until then, there’s nothing much to see here.


Social – TechCrunch


Inside Facebook Stories’ quest for originality amidst 300M users

September 26, 2018

There's a secret Facebook app called Blink. Built for employees only, it's how the company tests new video formats it hopes will become the next Boomerang or SuperZoom. They range from artsy Blur effects to a way even old Android phones can use Slo-Mo. One exciting format in development offers audio beat detection that syncs visual embellishments to songs playing in the background, or to licensed soundtracks added via the Music feature, which is coming to Facebook Stories after debuting on Instagram.

“When we first formed the team . . . we brought in film makers and cinematographers to help the broader team understand the tropes around storytelling and film making,” says Dantley Davis, Facebook Stories’ director of design. He knows those tropes himself, having spent seven years at Netflix leading the design of its apps and absorbing creative tricks from countless movies. He wants to democratize those effects once trapped inside expensive desktop editing software. “We’re working on formats to enable people to take the video they have and turn it into something special.”

For all the jabs about Facebook stealing Stories from Snapchat, it’s working hard to differentiate. That’s in part because there’s not much left to copy, and because it’s largely succeeded in conquering the prodigal startup that refused to be acquired. Snapchat’s user count shrank last quarter to 188 million daily users.

Meanwhile, Facebook's versions continue to grow. The Messenger Day brand was retired a year ago, and now Stories posted to either the chat app or Facebook sync to both. After announcing in May that Facebook Stories had 150 million users, with Messenger citing 70 million last September, today the company revealed they have a combined 300 million daily users. The Middle East, Central Latin America and Southeast Asia, where people already use Facebook and Messenger most, are driving that rapid growth.

With the success of any product comes the mandate to monetize it. That push ended up pushing out the founders of Facebook acquisition WhatsApp, and encroachment on product decision-making did the same to Instagram’s founders who this week announced they were resigning.

Now the mandate has reached Facebook Stories, which today opened up to advertisers globally, and also started syndicating those ads into Stories within Messenger. Facebook is even running "Stories School" programs to teach ad execs the visual language of ephemerality now that all four of its family of apps, including Instagram and WhatsApp, monetize with Stories ads. As sharing to Stories is predicted to surpass feed sharing in 2019, Facebook is counting on the ephemeral slideshows to sustain its ad revenue. Fears that they wouldn't lopped $120 billion off Facebook's market cap this summer.

[Image: Facebook Stories ads open to all advertisers today]

But to run ads you need viewers, and that will require responses to questions that have dogged Facebook Stories since its debut in early 2017: “Why do I need Stories here too when I already have Instagram Stories and WhatsApp Status?” Many find it annoying that Stories have infected every one of Facebook’s products.

[Image: Facebook user experience research manager Liz Keneski]

The answer may be creativity. However, Facebook is taking a scientific approach to determining which creative tools to build. Liz Keneski is a user experience research manager at Facebook. She leads the investigative trips, internal testing and focus groups that shape Facebook’s products. Keneski laid out the different types of research Facebook employs to go from vague idea to polished launch:

  • Foundational Research – “This is the really future-looking research. It’s not necessarily about any specific products but trying to understand people’s needs.”
  • Contextual Inquiry – “People are kind enough to invite us into their homes and talk with us about how they use technology.” Sometimes Facebook does “street intercepts” where they find people in public and spend five minutes watching and discussing how they use their phone. It also conducts “diary studies” where people journal about how they spend their time with tech.
  • Descriptive Research – “When we’re exploring a defined product space,” this lets Facebook get feedback on exactly what users would want a new feature to do.
  • Participatory Design – "It's kind of like research arts and crafts. We give people different artifacts and design elements and actually ask them to design what an experience that would be ideal for them might look like."
  • Product Research – "Seeing how people interact with a specific product, the things they like or don't like, the things they might want to change" lets Facebook figure out how to tweak features it's built so they're ready to launch.

Last year Facebook went on a foundational research expedition to India. Devanshi Bhandari, who works on globalization, discovered that even in emerging markets where Snapchat never got popular, people already knew how to use Stories. "We've been kind of surprised to learn . . . Ephemeral sharing wasn't as new to some people as we expected," she tells me. It turns out there are regional Stories copycats around the globe.

As Bhandari dug deeper, she found that people wanted more creative tools, but not at the cost of speed. So Facebook began caching the Stories tray from your last visit so it’d still appear when you open Facebook Lite without having to wait for it to load. This week, Facebook will start offering creative tools like filters inside Facebook Lite Stories by enabling them server-side so users can do more than just upload unedited videos.

That trip to India ended up spawning whole new products. Bhandari noticed some users, especially women, weren’t comfortable showing their face in Stories. “People would sometimes put their thumb over the video camera but share the audio content,” she tells me. That led Facebook to build Audio Stories.

[Image: Facebook now lets U.S. users add music to Stories just like Instagram]

[Image: Dantley Davis, Facebook Stories' director of design]

Back at Facebook headquarters in California, the design team runs exercises to distill their own visions of creativity. "We have a phase of our design cycle where we ask the designers . . . to bring in their inspiration," says Davis. That means everything from apps to movie clips to physical objects. Facebook determined that users needed better ways to express emotion through text. While it offers different fonts, from billboard to typewriter motifs, they couldn't convey whether someone is happy or sad. So now Davis reveals Facebook is building "kinetic text." Users can select whether their text is supposed to be funny or happy or sad, and their words will appear stylized with movement to get that concept across.

But to make Stories truly Facebook-y, the team had to build them into all its products while solving problems rather than creating them. For example, birthday wall posts are one of the longest-running emergent behaviors on the social network. But most people just post a thin, generic "happy birthday!" or "HBD" message, which can feel impersonal, even dystopic. So after announcing the idea in May, Facebook is now running Birthday Stories that encourage friends to submit a short video clip of well wishes instead of bland text.

Facebook recently launched Group and Event Stories, where members can collaborate by all contributing clips that show up in the Stories tray atop the News Feed. Now Facebook is going to start building its own version of Snapchat’s Our Stories. Facebook is now testing holiday-based collaborative Stories, starting with the Mid-Autumn Festival in Vietnam. Users can opt to post to this themed Story, and friends (but not the public) will see those clips combined.

This is the final step of Facebook's three-part plan to get people hooked on Stories, according to Facebook's head of Stories, Rushabh Doshi. The idea is that first, Facebook has to give people a taste of Stories by spotlighting them atop the app as well as amidst the feed. Then it makes it easy for people to post their own Stories by offering simple creative tools. And finally, it wants to "Build Stories for what people expect out of Facebook." That encompasses all the integrations of Stories across the product.

[Image: Rushabh Doshi, Facebook's head of Stories]

Still, the toughest nut to crack won’t be helping users figure out what to share but who to share to. Facebook Stories’ biggest disadvantage is that it’s built around an extremely broad social graph that includes not only friends but family, work colleagues and distant acquaintances. That can apply a chilling effect to sharing as people don’t feel comfortable posting silly, off-the-cuff or vulnerable Stories to such a wide audience.

Facebook has struggled with this problem in News Feed for over a decade. It ended up killing off its Friend List Feeds, which let people select a subset of their friends and view a feed of just their posts, because so few people were using them. Yet the problem remains rampant, and the invasion of parents and bosses has pushed users to Instagram, Snapchat and other younger apps. Unfortunately for now, Doshi says there are no Friend Lists or specific ways to keep Facebook Stories more private amongst friends. "To help people keep up with smaller groups, we're focused on ways people are already connecting on Facebook, such as Group Stories and Event Stories," Doshi tells me. At least he says, "We're also looking at new ways people could share their stories with select groups of people."

At 300 million daily users, Facebook Stories doesn’t deserve the “ghost town” label any more. People who were already accustomed to Stories elsewhere still see the feature as intrusive, interruptive and somewhat desperate. But with 2.2 billion total Facebookers, the company can be forced to focus on one-size-fits-all solutions. Yet if Facebook’s Blink testing app can produce must-use filters and effects, and collaborative Stories can unlock new forms of sharing, Facebook Stories could find its purpose.

Mobile – TechCrunch


How using a VPN can benefit SEO

September 25, 2018

The SEO industry is growing rapidly—estimated to balloon to $79 billion by 2020. Even though you may already be doing great with SEO marketing, you should not ignore the potential benefit of a virtual private network (VPN) for SEO strategies. A VPN is a solution that connects two parties on the internet anonymously over an encrypted, private network. It is mainly used to protect your online privacy and access content that is not available in a particular region.

So, how does a VPN benefit SEO? Let’s get started.

Understand local SEO using a VPN

There are many locations that a company might want to target. For example, you might be in Australia and want to target India. However, if you do a quick Google search, it will show local results from Australia, not the India-specific results you wanted. As an SEO specialist, you may want to know what the people of India are searching for. Moreover, you would also want to know about the competition in those areas. If that's the case, you should use a VPN.

A VPN can seriously change how you do research about a local market. By using a VPN, you can trick Google into thinking that you are in India (or any region that you are trying to learn about). This means you can do a local search, learn about the local audience's needs and try to understand what queries they are using. All this information can change how you perform in those areas.

Why use a VPN when you can always use targeted ads? Well, first, you need to learn what the local audience searches for. Clearly, there is an added advantage in knowing local searches. Moreover, you can also see competitors' local ads and learn how they are targeting the audience.

Last, but not least, you can also see how your own ads are being served in local areas.

Protect privacy when working

SEO is a competitive market. To succeed, you need always to be ahead of your competitors. This means hiding your steps when you visit your competitors or when you mimic/modify their strategies.

All of this sounds good, but competitors can easily track your IP and know about you ahead of time. This can lead them to your site, which in turn will open up the possibility of them copying your strategy. As competition is high, you should always try to hide your steps, or at least hide your strategy from competitors as much as you can.

That’s not the only problem. Google can also track you if they find anything suspicious. They can know if you are buying backlinks—not good.

The solution to all these problems is to use a VPN. It doesn't matter whether your website is new or old; you should always hide your digital footprint as much as you can. With a VPN, you can do your research and stay hidden at the same time. This will improve your chances of growing in the market.

Do remote SEO work

With the rise in remote work, there is no denying that we need to protect our privacy when working online. Also, as an SEO specialist, you need access to different tools, which might be restricted in the location from which you are trying to access them. For example, China blocks most Google services. But if your main focus is SEO, you must have access to Google.

That’s why you should use a VPN to have stress-free access to any tool, website, or service you want. This will improve your productivity, and you won’t have to think twice when working.

Get past Google's search query reCAPTCHAs

Working as an SEO specialist, you are expected to keep tabs on certain SEO stats and keywords that are relevant to your project. Not only that, but you also need to search for new keywords every now and then. However, Google might flag you for doing too many searches too often. If you are flagged, you will constantly be redirected to Google reCAPTCHAs.

You may also get a different error saying there is unusual traffic from your network. To proceed, you then need to solve a reCAPTCHA every few searches. This can lead to a loss of productive work. Also, it is no fun to fill out reCAPTCHAs all the livelong day.

To solve the issue, you should use a VPN. A good VPN lets you change IP addresses, so you can keep working without interruption. VPNs are also useful when building a blog, which is why good blogging guides often encourage new bloggers to use one.

Why should I use a VPN when I can use a proxy?

One of the common questions we receive is: why use a VPN when a proxy can be used to the same effect? A proxy can do some of this, but there are many advantages to using a VPN over a proxy. With a VPN, you can:

  • Work faster than with a proxy
  • Change the IP address on the fly
  • Not have to worry about reCAPTCHAs
  • Work with cross-platforms
  • Secure your connection completely
  • Protect your privacy
  • Have a stable and great user experience

What do you think about using VPNs for SEO benefits?

Search Engine Watch


Snapchat lets you take a photo of an object to buy it on Amazon

September 25, 2018

See, snap, sale. In a rare partnership for Amazon, the commerce giant will help Snapchat challenge Instagram and Pinterest for social shopping supremacy. Today Snapchat announced it’s slowly rolling out a new visual product search feature, confirming TechCrunch’s July scoop about this project, codenamed “Eagle.”

Users can use Snapchat’s camera to scan a physical object or barcode, which brings up a card showing that item and similar ones along with their title, price, thumbnail image, average review score and Prime availability. When they tap on one, they’ll be sent to Amazon’s app or site to buy it. Snapchat determines if you’re scanning a song, QR Snapcode or object, and then Amazon’s machine vision tech recognizes logos, artwork, package covers or other unique identifying marks to find the product. It’s rolling out to a small percentage of U.S. users first before Snap considers other countries.

Snap refused to disclose any financial terms of the partnership. It could be earning a referral fee for each thing you buy from Amazon, or it could just be doing the legwork for free in exchange for added utility. A Snapchat spokesperson tells me the latter is the motivation (without ruling out the former), as Snapchat wants its camera to become the new cursor — your point of interface between the real and digital worlds.

Social commerce is heating up as Instagram launches Shopping tags in Stories and a dedicated Shopping channel in Explore, while Pinterest opens up Shop the Look pins and hits 250 million monthly users. The feature should mesh well with Snap’s young and culture-obsessed audience. In the U.S., its users are 20 percent more likely to have made a mobile purchase than non-users, and 60 percent more likely to make impulse purchases according to studies by Murphy Research and GfK.

The feature functions similarly to Pinterest’s Lens visual search tool. In the video demo above, you can see Snapchat identifying Under Armour’s HOVR shoe (amongst all its other models), and the barcode for CoverGirl’s clean matte liquid makeup. That matches our scoop based on code dug out of Snapchat’s Android app by TechCrunch tipster Ishan Agarwal. Snapchat’s shares popped three percent the day we published that scoop, and again this morning before falling back to half that gain.

The feature could prove useful for when you don’t know the name of the product you’re looking at, as with shoes. That could turn visual search into a new form of word-of-mouth marketing where every time an owner shows off a product, they’re effectively erecting a billboard for it. Eventually, visual search could help users shop across language barriers.

Amazon is clearly warming up to social partnerships, recognizing its inadequacy in that department. Along with being named Snapchat’s official search partner, it’s also going to be bringing Alexa voice control to Facebook’s Portal video chat screen, which is reportedly debuting this week according to Cheddar’s Alex Heath.

Snapchat could use the help. It’s now losing users and money, down from 191 million to 188 million daily active users last quarter while burning $353 million. Partnering instead of trying to build all its technology in-house could help reduce that financial loss, while added utility could aid with user growth. And if Snap can convince advertisers, they might pay to educate people on how to scan their products with Snapchat.

Snap keeps saying it wants to be a “Camera Company,” but it’s really an augmented reality software layer through which to see the world. The question will be whether it can change our behavior so that when we see something special, we interact with it through the camera, not just capture it.


Social – TechCrunch


MetroPCS is now Metro by T-Mobile

September 25, 2018

It’s been five years since T-Mobile picked up MetroPCS, and now the prepaid service is finally getting a fresh coat of paint. The “PCS” bit is getting the old heave-ho, while the brand’s owners are letting you know who’s boss with the new Metro by T-Mobile brand name.

The new name involves some new plans, along with a couple of perks from key partners. There are two new (pricier) tiers, in addition to the standard ones. The new unlimited plans run $50 and $60 a month, and both include storage via Google One.

That makes the newly rebranded service the first to offer up access to Google’s new storage plan. The cloud deal also offers access to Google Experts, who can help you troubleshoot issues with any Google service.

The $60-a-month plan, meanwhile, tosses in Amazon Prime for good measure. That’s not exactly a solid reason to upgrade in and of itself, given that an Amazon Prime plan currently runs $119 a year, but the more premium plan offers 15GB of LTE data for its mobile hotspot versus 5GB.

Mobile – TechCrunch


Bing’s New(ish) Automated Bidding Strategies

September 24, 2018

Bing continues to roll out automated strategies to extract more performance from your campaigns. Take a look at the two newest and how to best use them.

Read more at PPCHero.com
PPC Hero

