Fraud Prevention

Online Gaming Has Become a Big Target for Fraud

March 26, 2020 · 3 min read

Online gaming has drastically changed the industry and the way people play video games. Around 700 million people around the world play games online. Additionally, those aged 18-25 spend 77 percent more time watching online gaming on platforms such as Twitch than watching traditional sports on television.

This explosion in popularity in recent years has also created a new breeding ground for fraudsters. According to data from the Arkose Labs platform, online gaming fraud is on the rise, with a 30% spike in attacks on gaming platforms last quarter.

Findings from the Arkose Labs Q1 2020 Fraud and Abuse Report further show most of the growth is coming from new account registration attacks, which increased more than 70%. Attacks targeting this sector are increasing at a rapid pace and often demonstrate highly sophisticated fraud patterns.

Fraudsters target online games in a number of ways. Many games offer microtransactions, where players can pay real currency for new items or various upgrades. Beyond that, most online platforms offer some sort of in-game virtual currency that is also extremely valuable. In-game assets are now estimated to be worth more than $50 billion. These assets are emerging as a hot new target for fraudsters, who use account takeover attacks and illicit real money trading to steal in-game assets and virtual gold and sell them on.

Fraudsters can also use bots to create large numbers of fake accounts in order to quickly and efficiently level up the account profiles, which can then be sold to other gamers willing to pay for a high-powered character.

Arkose Labs further found that human-driven attacks on gaming platforms grew sharply this past quarter -- especially for logins and payments. Fraudsters tap into global “sweatshops” to scale up attacks with human labor while keeping costs low. Arkose Labs has detected an increase in human-driven in-game fraud, with fraudsters trying to improve their success rates in attacks that involve two-way interactions bots cannot execute well.

The ingenuity of fraudsters means that they can monetize online interactions intended to have no financial value and tap into a shadow support ecosystem in order to attack at scale and sell assets on a virtual black market. The more success they see, the stronger their fraud operations become, as profits are used to expand resources and launch fresh attacks.

It can be extremely difficult – if not impossible – for online gaming platforms to effectively fight the rising tide of fraud while also maintaining a good user experience for their customers. By forcing legitimate users through too much friction in the authentication process, gaming companies risk customers abandoning the platform or using it less, which means lost revenue and eroded trust among the user base.

To properly fend off fraud while still offering a seamless experience to users, gaming companies need to implement a two-pronged approach – first accurately distinguishing real customers from bots and sweatshops, and then presenting authentication challenges to traffic that appears suspicious. Bots would be presented with an enforcement challenge they are unable to solve because it requires human cognition, while sweatshops would be slowly drained of their profits by having to complete increasingly arduous and complicated challenges. Such an approach mitigates fraud without adversely affecting the experience of gamers.
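The two-pronged approach above can be sketched in code. This is a purely illustrative example – the class names, signals, and thresholds (`Session`, `headless_browser`, the solve-rate cutoff) are hypothetical assumptions for demonstration, not an actual Arkose Labs API or detection logic:

```python
from dataclasses import dataclass
from enum import Enum, auto

class TrafficType(Enum):
    LEGITIMATE = auto()
    BOT = auto()
    SWEATSHOP = auto()

@dataclass
class Session:
    # Illustrative risk signals; real platforms combine many more.
    solved_challenges_per_hour: float  # actor's throughput
    headless_browser: bool             # automation fingerprint
    device_reuse_count: int            # accounts seen on this device

def classify(session: Session) -> TrafficType:
    """Prong 1: distinguish real customers from bots and sweatshops."""
    if session.headless_browser:
        return TrafficType.BOT
    # Humans solving challenges at machine-like rates across many
    # accounts suggest an organized sweatshop, not a single player.
    if session.solved_challenges_per_hour > 50 and session.device_reuse_count > 10:
        return TrafficType.SWEATSHOP
    return TrafficType.LEGITIMATE

def challenge_for(traffic: TrafficType) -> str:
    """Prong 2: present a challenge proportionate to the risk."""
    if traffic is TrafficType.LEGITIMATE:
        return "none"        # no friction for real players
    if traffic is TrafficType.BOT:
        return "cognitive"   # unsolvable without human cognition
    return "escalating"      # drain sweatshop margins over time

# Example: a headless-browser session is routed to a cognitive challenge.
bot = Session(solved_challenges_per_hour=200, headless_browser=True, device_reuse_count=1)
print(challenge_for(classify(bot)))  # -> cognitive
```

The key design point is that friction is targeted: legitimate traffic passes with no challenge at all, while the cost of each attack type is raised in the way that hurts it most.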