Property websites targeted

Bot attacks on real estate websites

Real estate websites have seen a 300 percent increase in bad bot activity, with large real estate sites suffering the most, according to research from Distil Networks, whose annual report analyses statistically significant trends in global bot traffic.

Bad bots are used by competitors, hackers and fraudsters, and are the key culprits behind web scraping, brute force attacks, competitive data mining, online fraud, account hijacking, data theft, unauthorised vulnerability scans, spam, man-in-the-middle attacks, digital ad fraud, and downtime.

“When we dug into the bot activity in 2015, we identified an influx of Advanced Persistent Bots (APBs),” said Rami Essaid, co-founder and CEO of Distil Networks.

“APBs can mimic human behaviour, load JavaScript and external assets, tamper with cookies, perform browser automation, and spoof IP addresses and user agents. The persistency aspect is that they evade detection with tactics like rotating through huge pools of IP addresses, using Tor networks and peer-to-peer proxies to obfuscate their origins, and distributing attacks over hundreds of thousands of IP addresses.

“A huge 88 percent of 2015 bad bot traffic was made up of APBs. This shows that bot architects have already taken note of traditional bot detection techniques and are finding new, sophisticated ways to invade websites and APIs in an effort to take advantage of critical assets and impact a business’s bottom line.”
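To illustrate the kind of browser automation described above, the sketch below (not taken from the report) uses Selenium to drive a headless Chrome instance that executes a page’s JavaScript and presents a mainstream Chrome user agent. The target URL, user-agent string and CSS selector are illustrative placeholders, and the code assumes Selenium 4 and a matching ChromeDriver are installed.

```python
# Illustrative sketch only: the style of browser automation the report describes.
# Assumes Selenium 4 and a matching ChromeDriver; the URL, user agent and
# selector are placeholders, not details from the report.
import random
import time

from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By

SPOOFED_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/48.0.2564.116 Safari/537.36")

options = Options()
options.add_argument("--headless")                    # no visible browser window
options.add_argument(f"--user-agent={SPOOFED_UA}")    # pose as a mainstream Chrome build

driver = webdriver.Chrome(options=options)
driver.get("https://example-property-site.test/listings")  # hypothetical target page

time.sleep(random.uniform(2, 6))   # pause, loosely mimicking human reading time
listings = driver.find_elements(By.CSS_SELECTOR, ".listing")  # placeholder selector
print(len(listings), "listings rendered")
driver.quit()
```

Because the page’s JavaScript actually runs, naive analytics and log-based checks see what looks like an ordinary Chrome visitor, which is why the report argues that more detailed behavioural analysis is needed.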

Key findings on bot traffic

  • Some 46 percent of all web traffic originates from bots, with over 18 percent from bad bots.
  • For the first time since 2013, humans outnumbered bots in website traffic.
  • Medium-sized websites (Alexa ranking 10,001 to 50,000) are at greater risk, as bad bot traffic made up 26 percent of all web traffic for this group.
  • Chrome edged out Firefox as the browser of choice for bad bot creators, with over 26 percent of bad bot user agents posing as the Google browser.

The rise of Advanced Persistent Bots (APBs)

  • Some 88 percent of all bad bot traffic has one or more characteristics of an Advanced Persistent Bot.
  • 53 percent of bad bots are now able to load external resources such as JavaScript, meaning these bots end up falsely attributed as humans in Google Analytics and other tools.
  • A total of 39 percent of bad bots are able to mimic human behaviour, so tools such as WAFs, web log analysis, or firewalls, which perform less detailed analysis of clients and their behaviour, are likely to produce large numbers of false negatives.
  • At least 36 percent of bad bots disguise themselves using two or more user agents, and the worst APBs change their identities more than 100 times.
  • A total of 73 percent of bad bots rotate or distribute their attacks over multiple IP addresses and, of those, a whopping 20 percent used more than 100 IP addresses (a simple log-based heuristic for spotting this kind of rotation is sketched below).
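As a minimal sketch of what flagging that kind of rotation might look like in practice (this is not Distil’s detection method), the snippet below scans hypothetical parsed access-log records and flags sessions that present more than one user agent or fan out over several IP addresses. The record layout and thresholds are illustrative assumptions.

```python
# Minimal sketch (not Distil's method) of one signal a site operator could
# compute from its own access logs: sessions that present several different
# user agents or spread requests over many IP addresses. The record layout
# and thresholds are illustrative assumptions.
from collections import defaultdict

# Hypothetical parsed log records: (session_id, ip_address, user_agent)
records = [
    ("sess-1", "203.0.113.7",  "Mozilla/5.0 ... Chrome/48.0"),
    ("sess-1", "198.51.100.2", "Mozilla/5.0 ... Firefox/44.0"),
    ("sess-1", "192.0.2.99",   "Mozilla/5.0 ... Chrome/48.0"),
    ("sess-2", "203.0.113.50", "Mozilla/5.0 ... Safari/601.4"),
]

MAX_USER_AGENTS = 1   # more than one user agent per session is suspicious
MAX_IPS = 2           # more than two IP addresses per session is suspicious

ips_per_session = defaultdict(set)
uas_per_session = defaultdict(set)
for session_id, ip, ua in records:
    ips_per_session[session_id].add(ip)
    uas_per_session[session_id].add(ua)

for session_id in ips_per_session:
    if (len(uas_per_session[session_id]) > MAX_USER_AGENTS
            or len(ips_per_session[session_id]) > MAX_IPS):
        print(f"flag {session_id}: "
              f"{len(uas_per_session[session_id])} user agents, "
              f"{len(ips_per_session[session_id])} IP addresses")
```

A heuristic this crude catches only the clumsiest rotation; APBs that keep a consistent identity within each session will slip past it, which is exactly why the report stresses more detailed behavioural analysis.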

Digital publishing and real estate industry websites were the most significant bot targets last year. As an industry, digital publishers were hit hardest by bad bots, which make up over 31 percent of all their traffic, and for small digital publishers (Alexa ranking 50,001 to 150,000) 56 percent of traffic originates from bad bots. There was a huge increase in bad bot traffic from China, but the United States still has the biggest bot problem.

The 2016 Bad Bot Landscape Report is based on aggregate data gathered from Distil Networks’ bot detection and mitigation solution, which identifies and tracks bots in real time; from the world’s largest Known Violators Database of bad bot fingerprints; and from Distil’s global network of 17 data centres.