In today’s climate, we cannot overstate the importance of data in the initiation, maintenance and continual development of businesses. Put simply: data shows you what is working and what isn’t. It enables businesses and marketers to understand why some customers buy while others abandon. And fundamentally, it gives businesses an opportunity to course correct, adapting their strategy to one that resonates with their customers.
Look no further than 2015, when Facebook began overstating to advertisers the average time users spent watching videos on the platform. This inflation of figures didn’t just rile up advertisers globally, but members of the press too. Journalists were fired, business strategies were changed, and marketing campaigns were scrapped and reimagined, all because inaccurate data pointed to a ‘pivot to video’. In June 2017, Fox Sports fired editors and writers in favour of a video-led content strategy informed by this data, and within three months its audience had declined by 88%. It doesn’t take long for bad data to result in bad business.
Any marketer worth their salt knows that inaccurate data can lead to bad marketing decisions too. Yet the cause of inaccurate data still eludes many marketing departments globally. Typically, skewed analytics are the result of automated traffic, otherwise known as bots. Bots have traditionally been considered a cybersecurity threat, not a threat to data analytics, but this perception needs to change. If companies base their website design, product pricing strategy and ad campaign investments on unreliable, bot-skewed data, they will waste money, time and effort.
The impact of skewing and scraping
Data is imperative in marketing. When it comes to building a content-enabled marketing campaign, you need data to understand which pieces are engaging your audience in dialogue and why. The same applies when creating buyer personas: you need to understand when your target market is online, how they interact with your website and, ultimately, what makes them tick.
However, with half of all online traffic generated by bots, marketing teams need to ensure the data they use to make these decisions reflects customer behaviour, not bot behaviour. Bots are designed to perform simple but repetitive tasks that would be laborious for a human. Not all bot traffic is bad: some bots check that website links work, and crawler bots catalogue and index web pages for search providers like Google. But half of all bot traffic has malicious intent.
In the context of marketing, both good and bad bots can skew your website analytics. You may notice sudden spikes in website traffic from certain geographical locations at certain times. While a successful marketing campaign will cause similar spikes in activity, it’s imperative to look out for anomalies that indicate bot activity.
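To make this concrete, here is a minimal sketch in Python, assuming session-level data exported from your analytics tool into a pandas DataFrame with hypothetical ‘timestamp’ and ‘country’ columns. It simply flags country-and-hour buckets whose traffic sits well above that country’s usual hourly volume; the column names and the threshold are illustrative, not a prescribed method.

```python
import pandas as pd

def flag_geo_spikes(sessions: pd.DataFrame, threshold: float = 3.0) -> pd.DataFrame:
    """Flag country/hour buckets whose session count sits far above that
    country's typical hourly volume. Assumes one row per session, with a
    datetime 'timestamp' column and a 'country' column."""
    df = sessions.copy()
    df["hour"] = df["timestamp"].dt.floor("h")

    # Count sessions per country per hour.
    hourly = df.groupby(["country", "hour"]).size().rename("sessions").reset_index()

    # Baseline: each country's mean and standard deviation of hourly sessions.
    stats = hourly.groupby("country")["sessions"].agg(["mean", "std"]).reset_index()
    hourly = hourly.merge(stats, on="country")

    # A bucket is suspicious if it sits several standard deviations above the mean.
    hourly["z_score"] = (hourly["sessions"] - hourly["mean"]) / hourly["std"].replace(0, 1)
    return hourly[hourly["z_score"] > threshold].sort_values("z_score", ascending=False)

# Example usage with a hypothetical analytics export:
# spikes = flag_geo_spikes(pd.read_csv("sessions.csv", parse_dates=["timestamp"]))
```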
Ultimately, though, the bigger threat to your data is posed by bad bots. Competitors can use bots to scrape data from your website and exploit it to gain an advantage in the market. Take your company’s pricing strategy: a competitor could scrape your pricing information and then mark down the price of their own product, undercutting you. When it comes to content, scraper bots can download your marketing content and repurpose it with malicious intent, for example duplicating it for search engine optimisation on websites the attacker owns, violating copyright and diverting organic traffic.
As technology evolves and new threats continue to present themselves, preserving the integrity of data must become a top priority for businesses.
Staying one step ahead
Increasingly sophisticated bots need equally sophisticated defences. Businesses and marketing teams need to differentiate not only between human and bot traffic, but also between good and bad bot activity to preserve the integrity of their business data.
The only way to make these distinctions is by looking at behaviour. Bots tend to follow certain behavioural patterns: they spend an abnormally short time on a page and move through pages at very high speed. If you see a lot of page views with very little time spent on each page, that could well be bots. Moreover, a sudden increase in traffic to your website with no recently launched or ongoing ad campaign is again likely to be caused by bots.
Another indicator of bot traffic is a significant gap between website visits and actual customers. If the number of visits to your website suddenly increases but isn’t converting to sales, check where that traffic originates. Likewise, a daily spike in referral traffic from an unknown domain probably comes from bots.
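As an illustration only, and not a substitute for a dedicated bot-management tool, the Python sketch below applies these behavioural heuristics to session-level analytics data. The column names (‘duration_seconds’, ‘page_views’, ‘referrer_domain’), the thresholds and the list of known referrer domains are all assumptions to be tuned against your own baseline.

```python
import pandas as pd

# Hypothetical thresholds: tune these against your own traffic baseline.
MIN_SECONDS_PER_PAGE = 2.0    # humans rarely read a page in under a couple of seconds
MAX_PAGES_PER_MINUTE = 20.0   # paging through a site faster than this looks automated
KNOWN_REFERRERS = {"google.com", "bing.com", "facebook.com"}  # domains you actually work with

def score_sessions(sessions: pd.DataFrame) -> pd.DataFrame:
    """Attach simple bot-likelihood signals to each session.

    Assumed columns: 'duration_seconds', 'page_views', 'referrer_domain'.
    Each signal mirrors one of the heuristics above: very little time per
    page, an implausible paging rate, and referral traffic from a domain
    you have never worked with."""
    s = sessions.copy()
    seconds_per_page = s["duration_seconds"] / s["page_views"].clip(lower=1)
    pages_per_minute = s["page_views"] / (s["duration_seconds"] / 60).clip(lower=1 / 60)

    s["fast_reader"] = seconds_per_page < MIN_SECONDS_PER_PAGE
    s["rapid_paging"] = pages_per_minute > MAX_PAGES_PER_MINUTE
    s["unknown_referrer"] = s["referrer_domain"].notna() & ~s["referrer_domain"].isin(KNOWN_REFERRERS)

    # Sessions showing two or more signals deserve a closer look before they
    # are allowed to influence campaign or pricing decisions.
    s["bot_signals"] = s[["fast_reader", "rapid_paging", "unknown_referrer"]].sum(axis=1)
    return s

# Example usage:
# suspicious = score_sessions(sessions_df).query("bot_signals >= 2")
```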
Marketing teams must introduce the term ‘bot’ into their everyday vocabulary. The sooner the term is dissociated from cybersecurity alone and businesses wise up to the reality of bot activity, the better equipped they will be to make the right decisions, based on the right data.
By Thomas Platt, Head of E-commerce at Netacea