Fraudulent Web Traffic Continues to Plague Advertisers, Other Businesses - WSJ

In a recent study, Adobe found that about 28% of website traffic showed strong “non-human signals,” leading the company to believe that the traffic came from bots or click farms. The company studied traffic across websites belonging to thousands of clients.

Adobe is currently working with a handful of clients in the travel, retail and publishing industries to identify how much of their web traffic has non-human characteristics. By weeding out that misleading data, brands can better understand what prompted consumers to follow their ads and ultimately visit their websites and buy their products.

“It’s really about understanding your traffic at a deeper level. And not just understanding, ‘I got this many hits.’ What do those hits represent? Were they people, malicious bots, good bots?” said Dave Weinstein, director of engineering for Adobe Experience Cloud.

Hardly the first study of online fraud, Adobe's research is one more indication of how the problem has roiled the fast-changing ad, media and digital commerce industries and prompted marketers to rethink their web efforts.

Non-human traffic can create an “inflated number that sets false expectations for marketing efforts,” said Mr. Weinstein.

Marketers often use web traffic as a proxy for how many consumers saw their ads, and some even pay their ad vendors when people see an ad and subsequently visit their website. Knowing how much of that traffic was non-human could change the way they pay those vendors.

Advertisers have told Adobe that the ability to break down human and non-human traffic helps them understand which audiences matter “when they’re doing ad buying and trying to do re-marketing efforts, or things like lookalike modeling,” he said. Advertisers use lookalike modeling to reach online users or consumers who share characteristics with their existing audiences or customers.
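In general terms, lookalike modeling works by scoring prospective users on how closely they resemble a “seed” audience of known customers. The sketch below illustrates that idea only; it is not Adobe's method, and the feature vectors and similarity measure (cosine similarity against the seed centroid) are assumptions chosen for illustration.

```python
# Illustrative sketch of lookalike modeling (not Adobe's implementation):
# rank prospective users by similarity to a seed audience of known customers.
import numpy as np

def lookalike_scores(seed_users: np.ndarray, prospects: np.ndarray) -> np.ndarray:
    """Score prospects by cosine similarity to the seed audience's centroid.

    seed_users: (n_seed, n_features) matrix of known customers' feature vectors.
    prospects:  (n_prospects, n_features) matrix of candidate users.
    """
    centroid = seed_users.mean(axis=0)
    centroid /= np.linalg.norm(centroid) + 1e-12
    normed = prospects / (np.linalg.norm(prospects, axis=1, keepdims=True) + 1e-12)
    return normed @ centroid  # higher score = more similar to existing customers

# Example with made-up feature values: pick the two most similar prospects.
seed = np.array([[1.0, 0.8, 0.1], [0.9, 1.0, 0.2]])
candidates = np.array([[1.0, 0.9, 0.0], [0.1, 0.2, 1.0], [0.8, 0.7, 0.3]])
top_two = np.argsort(lookalike_scores(seed, candidates))[::-1][:2]
```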

Ad buyers can also exclude visitors with non-human characteristics from future targeting segments by removing the cookies or unique web IDs that represented those visitors from their audience segments.
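Conceptually, that cleanup amounts to subtracting the flagged IDs from an audience list before it is used for retargeting. The snippet below is a minimal sketch of that set operation with hypothetical cookie IDs; it does not reflect any particular ad platform's API.

```python
# Minimal sketch (hypothetical IDs, not a real ad-platform API): remove visitor
# IDs flagged as non-human from an audience segment before retargeting.
audience_segment = {"cookie_123", "cookie_456", "cookie_789", "cookie_abc"}
flagged_non_human = {"cookie_456", "cookie_abc"}  # IDs showing bot-like signals

# Keep only IDs that were not flagged, so future ad buys target real people.
cleaned_segment = audience_segment - flagged_non_human
print(sorted(cleaned_segment))  # ['cookie_123', 'cookie_789']
```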

In addition to malicious bots, many web visits also come from website “scrapers,” such as search engines, voice assistants or travel aggregators looking for business descriptions or pricing information. Some are also from rivals “scraping” for information so they can undercut the competition on pricing.

While bots from big search engines and aggregators tend to present themselves overtly as bots and can easily be discounted from human web traffic, a small percentage of scrapers generate visits that get counted as human, even if they aren’t intentionally posing as visitors, said Mr. Weinstein.
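Self-identifying crawlers typically announce themselves in the User-Agent header, which is why they are easy to discount. The sketch below shows that kind of check; the token list is illustrative, and real analytics systems rely on far larger lists plus behavioral signals, precisely because some scrapers don’t identify themselves this way.

```python
# Illustrative sketch: discount self-identifying crawlers from human traffic
# by inspecting the User-Agent header. Token list is illustrative only.
KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot", "baiduspider")

def looks_like_declared_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string openly identifies a known crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

print(looks_like_declared_bot(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # True: this visit can be excluded from human traffic counts
```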

“We realized that with the growth of things like Alexa and Google Home and other assistants, increasingly more and more traffic is going to be automated in nature,” he said. “In the long term, real humans at real browsers will be a diminishing portion of traffic.”

While there aren’t any plans to monetize a tool that can analyze non-human web traffic for clients, Adobe eventually could use it to sell something like a “bot score,” said Mr. Weinstein. For now, the company will likely just build the function into its existing analytics products.

http://archive.is/LWhJ7