Botting, an often misused term that encapsulates multi-accounting, cheating, and fraud, plays a pivotal role in gaming — sometimes enhancing experiences but often undermining them.

Our host, Alexandra Takei, Director at Ruckus Games, sits down with Henry LeGarde, CEO and co-founder of Verisoul, and George Ng, CTO and co-founder of GGWP, to unpack the botting phenomenon. Their nuanced conversation covers why gaming is rife with bots, when bots are good, when bots are bad, and what gaming can learn from cybersecurity and homeland security.

The conversation digs into detection strategies (including behavioral analysis, device fingerprinting, and the deployment of ban waves) and also explores the role the community plays in bot detection and reporting. Finally, the trio discusses how the botting industry will change, for better or worse, with advancements in AI.

Reforged Labs

We’d also like to thank Reforged Labs for making this episode possible. This YC startup automates the traditionally costly and time-consuming ad production process, delivering high-quality, cost-effective video ads in under 24 hours – and it’s all powered by a proprietary AI engine trained on thousands of successful game ads. Learn more: https://reforgedlabs.com/?utm_source=newsletter&utm_medium=email&utm_campaign=naavik_promo&utm_id=naavik+promo&utm_term=newsletter&utm_content=october30.


This transcript is machine-generated, and we apologize for any errors.

Alexandra: What's up, everyone, and welcome to the Naavik Gaming Podcast. I'm your host, Alex Takei, and this is the Interview and Insights segment. So all industries struggle with bots.

As one of our guests will explain today, they're especially pervasive in social media and e-commerce. X faces a botting overload that has diluted its platform's information and comments, while e-commerce sites have suffered bot attacks that drain inventory; just ask anyone who tried to buy a PlayStation 5 in 2020.

In gaming, bots play a fascinating dual role. Sometimes they're beneficial actors, facilitating matchmaking, building followership, or serving as NPCs. However, they often turn malicious, you know, farming game economies, ruining the experience for real players, or enabling cheating. In fact, gaming ranks as the number one vertical for botting with the highest percentage of fake and fraudulent traffic.

This has led to the development of numerous infrastructure and security services specifically designed to reduce botting and catch bad actors. And two experts from the field are joining me today. We'll explore the botting industry in games, examining different types of bots, understanding why games are targeted, discussing solutions for bot detection across a product's lifecycle, and investigating where the business opportunities lie, specifically how AI is changing the botting industry.

So my first guest is George Ng, CTO and co-founder of GGWP. George, prior to founding GGWP, worked at cybersecurity startups as well as the U.S. Department of Homeland Security and US-CERT. I am super excited to have him on air today, as it will be really interesting to see how strategies in fraud and cyber translate to games.

Welcome to the pod, George.

George: Welcome. Thank you. Awesome.

Alexandra: All right. My next guest is Henry LeGarde, founder and CEO of Verisoul. Verisoul is a platform that helps businesses automate fake-account detection to keep their business healthy and bot-free. I ran into Henry at an event a couple of months ago, and we just had a fascinating conversation about the botting industry that inspired this podcast.

So, Henry, I'm super excited to have you on air today.

Henry: Yeah, thank you, Alex. Thanks for making our dream come to life here. It was super great to meet, and George, great to see you here today as well. Awesome.

Alexandra: All right. So before we dive in, I'd love to do a quick round of intros. Tell the audience a bit about how you came to found your respective companies. I'd also love to hear your best case study on how you've helped a group crack their botting problem. And I'll share one of my favorite botting problems as well after you guys kick it off.

Henry, how about you lead here?

Henry: Perfect. So I'm Henry, I'm the CEO and founder of Verisoul. We help companies detect and prevent fake accounts, bots, et cetera. One of our primary focuses is gaming. I come from the fraud and identity space, and throughout my entire career I've realized how challenging it is to stay ahead of fake accounts and bots.

And, you know, when we talk to customers, their engineering teams are fed up with it; they don't want to work on it, et cetera. And so we figured, hey, there's safety in numbers and there's value in bringing this all under the same roof. So we take strategies we see across all of our different clients and help apply those to the rest of them.

Really, we focus on helping customers prevent multi-accounting. So, you know, people trying to farm economies with multiple accounts, whether that's poker tables or reward systems or referral systems, all the way to the most advanced AI bots that mimic human behavior and human conversation, et cetera, which George will talk about.

One of the most interesting case studies we've seen recently, call it in 2024, is one bad actor that was responsible for over a million dollars in chargebacks, creating basically over 50 percent of the fake accounts on a single gaming platform. So, basically, a massive problem with fake accounts and a huge number of chargebacks.

They were losing millions of dollars monthly to fraudulent payments. When we uncovered it, it was one single individual creating accounts at massive scale. And I'm sure, George, Alex, you've seen this same concept before, where it's likely not that there's a huge number of fraudsters in there; it's a few fraudsters creating huge numbers of fake accounts and wreaking havoc.

And so helping them uncover that was a massive unlock, both for growth and for cost reduction. So that was a super interesting one, just from the nature of one unmasked guy.

Alexandra: Dang. Yeah, that's a good one. George, what've you got?

George: Yeah. So I'll tell you a little bit about GGWP first.

We have a community management solution using AI-based content moderation. We also have sanctions, nudges, and warnings in the product. Let me think. I can't share anything too spicy about any specific game, but I will share this: recently, you know, we look at things like aimbots, speed hacks, and spam in games.

But we did a lot of work around smurfing and boosting for a popular competitive game. And just in the first ban wave, we looked at, let's call it, in the tens of thousands of accounts. We obviously don't know the total volume in the community, and that's part of the challenge in this space. But just from that alone, with fairly rudimentary behavioral analysis as a starting point, we were able to pick up a good number of these types of accounts in game.
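As a rough illustration of the kind of rudimentary behavioral analysis George describes, a first-pass smurf filter might simply flag brand-new accounts with anomalously high win rates for human review. This is a hedged sketch: the thresholds and field names are invented for illustration, not GGWP's actual pipeline.

```python
def flag_possible_smurfs(accounts,
                         min_games=20,
                         winrate_threshold=0.75,
                         max_account_age_days=30):
    """Naive behavioral filter: brand-new accounts that win far more
    often than the population norm are smurf candidates for review.
    All thresholds are illustrative; real systems use many more features."""
    flagged = []
    for acct in accounts:
        enough_games = acct["games"] >= min_games
        is_new = acct["age_days"] <= max_account_age_days
        winrate = acct["wins"] / acct["games"] if acct["games"] else 0.0
        if is_new and enough_games and winrate >= winrate_threshold:
            flagged.append(acct["id"])
    return flagged
```

A filter like this only surfaces candidates; as George notes, a ban wave would come after deeper review, since high-skill legitimate players can also trip simple heuristics.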

Alexandra: Got it, all right. So that's GGWP. And you, how did you find yourself at GGWP, given your background in cyber?

George: Yeah. So, as you mentioned before, I was at Homeland Security for some time. I was also at DARPA, doing DoD research. We were supporting the conflict in Afghanistan, and that's kind of how I got into cyber in the first place, given the nature of the threat.

I sort of moved into digital in a lot of areas. After founding Science and being acquired, I took some time off. I was teaching at Cal, just thinking about what I wanted to do. I've always been a lifelong gamer, so it was a good opportunity. During COVID, I guess, you know, you couldn't go out as often.

So I resorted to playing online games with some friends, some of whom I co-founded GGWP with. We were playing competitive games, and we just kind of saw how people were talking, and we were kind of surprised that hadn't really changed over the years. So we really just wanted to address that problem.

Alexandra: Got it. Very cool. Awesome. Co-founders who found each other on Discord over gaming. That's the best gaming founder story. All right. So one of my favorite botting stories is from my time at Blizzard. I'll not name the franchise, but this franchise, in a particular region, had a pretty malignant bot problem.

And so it was really frustrating from the business side, because the DAU of this game was basically inaccurate to some extent, right? And so that diluted lots of different KPI metrics, like ARPDAU and DAU/MAU ratios and other key business health indicators that you would use to make product decisions or steer business decisions.

And so it was basically common knowledge that in this region there are X number of bots, and you have to strip them out every single time, in every report that's ever made, collected through Tableau, collected through financial software systems, et cetera. It obviously created a lot of challenges, because you also needed to educate anybody new to the franchise about the bot community, because otherwise they might report something inaccurately.

And so I thought that was really interesting. A lot of these bots were also basically farming the economy, so it had big impacts on currency reserves, et cetera.

Henry: It's a really interesting point. And I'll just mention, it's one of those pernicious problems people don't really think about when they're listing the top impacts of bots. It's like, oh yeah, user experience for real users, cost, farming, et cetera.

But just the day-to-day data ops for the team, it drags on you. You're like, all right, new person hired. Okay, here's the...

Alexandra: Exactly.

Henry: Here’s the fake numbers. Here's the real stuff. Like here's the calc. You got to back it out.

Alexandra: It's like a hard-coded minus X every single time, across the board. Yeah. And it's pretty crazy.

Yeah, it's definitely huge. So, all right. We all approached this conversation with a couple of different botting problems. Yours, Henry, from this one malignant bad actor that was really sabotaging the monetary outcome of this company, creating, you know, 50 percent of its fake accounts at scale. George, yours has a lot to do with competitive smurfing, et cetera. And then mine is more on the business side.

But these were all obviously in gaming. And the first topic I want to talk about today is why bots are such a prolific problem in games. I'd be curious to hear from you guys: why games over any other industry? And specifically, what games are most susceptible to bots?

I can think of several genres off the top of my head, but I'd love to hear from you guys, since you work with a lot of these clients. So Henry, maybe I'll pass it to you first. What's your pitch on why bots and gaming?

Henry: Yeah, it's a great question. We think about this a lot.

I think there are three main components at play here. Number one is generally an unbounded play economy. If you think about most industries, there's some sort of transaction, some moment at which you can extract value. Let's say in e-commerce, you've got one transaction point; in ticketing, it's your one ticket purchase, and then you kind of get detected.

Within gaming, there's almost an unbounded earning capability, because you have this rich, almost lifelike economic system that mirrors the real world and has this unbounded nature. The more you play, the more you earn. So it's kind of an uncapped reward structure, which is super interesting versus other industries.

Secondly, from what we see in terms of client sophistication, I think gaming companies are a little bit behind the ball versus some other industries, obviously compared to, like, financial services. There are some triple-A games that have a lot of internal resources spent on fraud, trust, safety, et cetera, with extremely strong teams behind them. I just think that, on the whole, the industry is a bit weaker from a defensive standpoint, in terms of the tools and technologies at play.

I also think the folks joining gaming companies just want to build games. They really care about the user experience, and they don't want to get saddled with fraud and abuse and stopping that. And it often falls less on a dedicated team and more on a single person who's like, hey, I don't really want to work on this.

And then, yeah, the third is actually slipping my mind, but let me think on it. George, I'll pass it to you in terms of what you think, and then I'll come back. Let me think on that for a sec.

George: Sure. I definitely agree with the unbounded play economy point, and I agree with the point about less sophisticated companies as well. One consideration is that even for these really large game publishers, usually we're talking about thousands of employees.

To be fair, if you're talking about, let's call it, the equivalent of a AAA financial institution, you're talking about tens of thousands of employees. Which means, yes, you do have a larger surface to cover, but you also have a lot more people and a lot more capital to set up some of this infrastructure.

As for why you see a lot of this in gaming, I agree there are three points. One is limited deterrence, in my mind, right? One consideration is, what is the penalty if you get caught, what happens? In gaming (and we could have a whole other podcast on free-to-play models alone), if someone is banned, they just create a new account, so that minimizes the punitive impact, let's say. And often the space itself is gray, right?

In terms of legal enforcement, that makes it potentially less risky than financial or e-commerce fraud. Another thing you have is high engagement, right? You have these younger, tech-savvy people that are already playing these games, so they just get to spend time in environments they're already familiar with. And to go further, people get really invested in these games.

So with these competitive games, it's not just about a financial incentive; they really want to win. So in turn, they start to cheat, and that's a big part of the botting that we see especially. And then, yeah, the last one is just persistent market demand, right? To Henry's point, it's an unbounded play economy.

There's a huge black market for in-game currencies, and there are new games coming up all the time. So, lots of opportunities there.

Henry: Yeah, what you just said, George, resonates and actually reminds me of the third point, which is the black or gray economic market on which a lot of these things transact.

There's just less visibility on what's happening there. Often it's a secondary market that exists outside of the game itself, in some circumstances a black or gray market. And so there's less of a spotlight on those transactions, and it's much easier to commit fraud, or friendly-fraud-type abuse, in that circumstance.

Alexandra: Those are great points. And George, you kind of already alluded to this, but before we talk about genres or particular regions where botting is more malignant in video games: I was curious what you've observed in defense that games could potentially learn from, because you did some comparing to other industries. The consequences are low, the appetite is there, the marketplace is gray. What's a lesson games could take from other industries you've worked in, like cyber?

George: Yeah, cyber, I think, does a much better job of digital fingerprinting. The same opportunities are available in gaming, so games can certainly better leverage behavioral analysis. And again, it's not that games don't do anything there; there's just a lot more they can do with digital fingerprints and ML to better detect these threats.

I think another easy one, from banking, is KYC. Not just banking, but a lot of areas are a little bit stricter on identity. Think about how banking deals with fraud: for creating a loan, you're using a Social Security number in the U.S. It's not as easy as in a game, where your ID is often just an email and you're creating new accounts, generating fake emails constantly. If it was tied to your real ID, you might not have the same signups, but that's a different problem. From a security standpoint, that's obviously going to be easier to manage.
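To make the digital-fingerprinting idea George mentioned a moment ago concrete, a minimal sketch might hash a handful of client attributes into a stable device ID and flag signups that reuse it too often. Everything here (the attribute names, the threshold) is a hypothetical illustration, not any vendor's real schema.

```python
import hashlib
import json


def device_fingerprint(attrs: dict) -> str:
    """Hash a canonicalized set of client attributes into a stable ID.
    The attribute keys are hypothetical, not a specific SDK's schema."""
    canonical = json.dumps(attrs, sort_keys=True)  # stable key order
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]


class SignupMonitor:
    """Flag signups whose device fingerprint backs too many accounts."""

    def __init__(self, max_accounts_per_device: int = 3):
        self.max_accounts = max_accounts_per_device
        self.seen: dict[str, set[str]] = {}  # fingerprint -> account ids

    def register(self, account_id: str, attrs: dict) -> bool:
        """Record a signup; return True if it looks suspicious."""
        fp = device_fingerprint(attrs)
        self.seen.setdefault(fp, set()).add(account_id)
        return len(self.seen[fp]) > self.max_accounts
```

Real fingerprinting systems combine far more entropy sources (and fuzzier matching), but the core idea is the same: the signup email changes freely, the underlying device signature much less so.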

And then the last thing: I think the intel community in general does a very good job around intel sharing, right? You think about law enforcement agencies sharing watch lists or behavioral attack patterns. Games, from what I understand, do some of it; the large triple-As do have security people that they send to those same conferences. But beyond that, I think the gaming industry could do a much better job around shared databases of, I don't know, known CSAM threats, known spam threats, right?

Bad actors of different types, known fraudsters, and just patterns as well.

Alexandra: That's really interesting. Yeah, those are great pieces of advice. Imagine signing up for a game with your Social Security number. I think I would adopt maybe a very different play style. Admittedly, that's really interesting.

No more cussing people out when you get connected and stuff like that, right?

George: I mean, Korea actually does, you know, use your national ID to sign up, and people still play a ton in Korea. So that's not necessarily the end of it. But yeah, there are certainly UX and CCU considerations the game has to balance.

Alexandra: Got it. Henry, you've worked with a couple of clients. Is there a particular pattern you see in the genres of games you've worked with?

Henry: Yeah, it's a great question. I think there are a couple of things we look for, kind of telltale signs that there will be fraud.

The first and clear front-runner is economic value somewhere. So whether that's an in-game currency you can use to buy items, some sort of in-game marketplace or, you know, maybe a dark gray or black marketplace for it; whether there are marketplaces for in-game items, things like skins, et cetera; whether there's potentially even a cryptocurrency tied to it, some sort of token, some sort of economic component.

Far and away, that's the most powerful driver of true fraud. I think secondary to that would be one point, George, you brought up: just this incentive to win. You know, I want to win, and therefore I'm going to go to extra lengths to do it, even if I'm not earning money. Maybe it's leaderboards, prestige, things like that.

So some sort of multiplayer component is the next thing I would think about. If it's a single-player, story-mode game, there's really not an incentive for bots and fraud. So, right, number one is that economic piece. Number two is some sort of other value, some sort of intrinsic value.

Maybe that's pride or prestige or things like that. Those are some of the signs we see.

Alexandra: Okay. So I think that's a nice segue to start talking about some of the definition bingo in the bot space, because you just gave two examples there: one is an economically motivated bot, and the other is about winning, or leaderboards, et cetera.

And so before we dive into the meat of things, I want to make sure we're all aligned around the same language. I would love for you guys to tell me: what's the difference between a game-developed bot, a player-side bot, and a server-side bot? Or are some of those things overlapping?

George, maybe you can kick us off on that.

George: Yeah, so I think of game or server bots as being more like NPCs or CPU-controlled adversaries in training modes, right? Or like a teammate in a Left 4 Dead-type game. Player bots are things that the players are creating, oftentimes doing things that the game doesn't necessarily want them to do, like a WoW gold-farming bot, or an aimbot reducing recoil or, you know, just automatically locking onto or tracking heads.

Alexandra: Okay. And so for those, some of that is an account-side bot, and some of that is potentially a tool that's helping me cheat. Is there a nomenclature you use to delineate gray-area tools, like aim-assist software or inventory optimizers, versus "this is a fake account"? Henry, in the example you opened with, that person had created a lot of fake accounts.

George: Yeah, I think some of it depends on the game's definition, and it can actually change by mode as well, right?

So, is setting a macro okay? I think that's a great example, and one which probably blurs the definition rather than clarifying it. Games can allow you to create macros within the game; then it's a server-side or game-server-type macro. But players can also create macros through hardware or through software.

And if you're going to play League of Legends at the World Championship, there's uniform hardware, because they deliberately don't want you to create these macros to simplify complex button-press combinations. Some games, though, are actually fine with it. It kind of goes back to what Henry was describing:

it's about the impact on other players.

Alexandra: Right. Got it. And we'll definitely talk a little bit about when botting is good and when it's bad; I think that's a great example right there, as there are so many tools that actually help players experience the game better that you might consider to be, quote, a bot.

Do you guys use this word "user integrity" a lot? What does user integrity mean in gaming?

Henry: It's a good question. When we think about whether users are good or bad, we think about it in a couple of different categories.

The first is root-level user integrity. This is: are they a real human? Are they a unique actor? And do they have good or bad intentions? You can discern some of these things without even looking at their behavior, by looking at things like their device, their network, how they connect, how they move the mouse, these types of root-level interactions and connection activities. Then there's the behavioral integrity component, right? This is: are they doing good or bad things in your application? And this is really where George and team specifically focus, trying to root out those bad behaviors. Where we typically focus is: hey, are these unique users signing up? Do they have good intentions? Are they just here to extract? Have we seen them before? Have they been banned before? All that type of stuff. Is it, prima facie, a good or bad person signing up?
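A toy version of the root-level integrity check Henry describes might combine a few such signals into a simple risk score. The signal names and weights below are invented for illustration; real systems use far richer models than a weighted checklist.

```python
def account_risk_score(signals: dict) -> float:
    """Combine a few root-level integrity signals into a 0..1 risk score.
    Signal names and weights are illustrative, not a production model."""
    score = 0.0
    if signals.get("vpn_or_proxy"):        # network anonymization
        score += 0.25
    if signals.get("device_seen_before"):  # same device across many accounts
        score += 0.25
    if signals.get("headless_browser"):    # a common automation tell
        score += 0.25
    # Very low mouse-movement entropy suggests scripted, non-human input.
    if signals.get("mouse_entropy", 1.0) < 0.2:
        score += 0.25
    return min(score, 1.0)
```

The point of the sketch is the separation Henry draws: none of these inputs depend on in-game behavior; they can all be evaluated at signup, before the behavioral layer ever sees the account.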

Alexandra: Got it. Okay. So there are two parts to that: there's the behavior, and then there's the, are you a real person?

Henry: Yeah, exactly. And part of that user integrity is also, do they align with your terms of service, right? Certain applications might be geo-fenced, right? So are they in the right geography? Are they in your target market? Are they users you'd feel comfortable showing your CEO, or are they folks creating multiple accounts, spoofing geolocation to pretend to be in your target geo when in reality they're coming from a high-risk geography, things like that.

Alexandra: Got it. All right. Okay, so we've talked about user integrity, which has a behavioral component and a user identity component. We talked about game-developed bots, player-side bots, and server-side bots.

We talked a little bit about aim-assist software and gray-area tools, like some sort of modding and things like that. Are there any other terms, from your perspective as experts in this space, that are basically used totally wrong by noobs like me? And if so, what are they?

Henry: The only thing that comes to mind is the word "bot" in general. I feel like people use it as this catch-all term for bad, when, you know, in reality there are good and bad bots, like we just talked about. And then there are bots, but there are also real users that are just manually creating a bunch of accounts with a bunch of devices, or getting around blocks.

And so everything kind of gets looped into this "bot" idea. And I don't think that's necessarily bad; it's easy to do. It's just that when you peel it back, there are different ways to look at it. Like, hey, is it actually an automated thing happening? That's what I think of when I think of a bot. Or is it, hey, this one person's creating many accounts? That's more like multi-accounting, or maybe even just pure fraud: they're using fake credit cards and fake emails to buy stuff, charge it back, and then resell it. If you think about it all as one, you're going to apply a sledgehammer when you really need different approaches for each of those different problem sets.

Alexandra: George, any thoughts?

George: Yeah, I think maybe it's helpful just to define, at the core, what is a bot? My perspective would be that we're really just talking about automation; some things are intelligently automated and some aren't. And then there's other bad activity that doesn't require bots, which Henry is also referring to, that often gets swept in, right?

Because you can do fraud without using a bot, and you can do fraud with a bot. And then, going back to player versus game bots, really you're talking about who creates it and what the intended scope of use is.

Alexandra: Got it. In other words, I should not call this episode an episode about bots. It should be an episode about fraud and bots.

Henry: No, I think it's good. I don't want to disparage the term at all; it's clear, everyone understands it, which is great. It's just that when I think about the solution, I like separating it into three or four unique problems under that category. But at a high level, conversationally, it's "bots suck and we hate bots."

Yeah, that's the right way to call it.

Alexandra: Got it. All right, so we're going to talk a little bit about setting up the infrastructure once you've decided that bots are bad. You know, both of you have started businesses anchored in the fundamentals of removing toxicity or removing bad actors.

And we've talked a little bit about how bots are not always bad; you even mentioned that, Henry, a couple of minutes ago. There are examples of games that might be better with bots in them. Something like COD Mobile uses bots for matchmaking liquidity, and that actually seems to be helping with player retention.

What are the positive applications in your minds, besides something obvious like matchmaking or NPCs? Are there other positive things that bots can be used for besides those two, which people think of as, you know, the golden examples of good bots, the white hats?

Henry: Yeah, it's a good question. I think it depends on the game, but there are instances where there's a lot of low-level activity that might be extremely manual or tedious, that's not creating unique value and isn't super fun for players. In some circumstances, bots can help automate that away and provide some economic liquidity, in that they're able to do some of these tedious tasks that people don't want to do along the way. So in that way, maybe there's something interesting.

I think it also can create buzz. One of the main things we see is that folks can be apprehensive to remove bots, or they don't want to look under the curtain, because there's growth, right? You want numbers, you want users, and you want to see those go up and to the right. And so there's something scary about trying to uncover who's real and who's not. So in some circumstances, I think bots in a game can actually be a positive signal that there's some value there, something interesting from an economic perspective that should be driving real user activity as well.

But our perspective is not always that you need to remove every bot. It's that you need to figure out which ones are bots and which are not, so you can make that decision effectively, right? If you figure out, hey, there's a cohort of bots that's kind of in the gray area, but they're not actively detracting from the user experience or the economy, then in that way, they might not be that bad.

So there are circumstances where it's kind of a judgment call at the end of the day. Are they really hurting you or not in this certain case? And you can decide case by case at that point.

George: Yeah, I think you can tell because games are incorporating it into the workflows, right?

You think about. Monopoly Go and being able to auto roll, or like even, Pokemon, the, the new TCG Pocket Game, where you can auto battle. So there's definitely a lot of that. Again, those are game created bots, so we know it's okay, or at least they believe it's okay and, and helpful for the players. But I, I think the spicier take is the, are some of these player designed bots that, on the surface, hurt the game economy, actually good for player base?

At least in the short term, right? So if you think about some of the popular MMOs, like RuneScape back in the day, or WoW: did some players actually enjoy the game more because of this market? Now you, as someone who doesn't want to grind that long, get access to really cool gear that you wouldn't otherwise get. I think the answer is probably yes, in some ways, right? Because even if you're not buying from the gold farmers directly, the entire economy is lower cost for you, meaning you're able to get this gear at a cheaper rate. The trade-off is that in the long term, it can create some market imbalance.

And the game has to decide when to really crack down on this to find that balancing act, because they lose one of their core levers in maintaining that balance. It's not great either way, right? When it goes off the rails and no one can get anything, or everybody can get everything, it starts to delegitimize the value for the person who's spending 40 hours a week and going on raids with their friends to get the same gear. And that's the trade-off.

Henry: It is. It's a really good point, George. And there's one component of that that's kind of interesting: if there are bots or farmers that are creating item liquidity, let's say, or creating that economy, a lot of those transactions will take place in a black or gray market outside of the game itself. So in the RuneScape example, a lot of the transaction activity happened outside of the game. There is value there, but I can't help but think the game could be better off capturing that economic value with some sort of in-game marketplace, and I think a lot of games are headed that way. The more you can keep within your walled garden, the more you can monetize the transactions that are taking place, and also, as you were saying, govern or watch over them and make sure the supply-demand balance stays in equilibrium to some extent.

So I think that's another great point about when bots can be good. They can be good, but maybe even better if you can capture some of the extra economic activity that's happening because of them.

Alexandra: Yeah, that makes a lot of sense. And George, to your question of whether it's making the experience better or worse for players: I used a bot to open my Hearthstone packs, because I would grind a lot and open a lot of packs, so I just built a little spacebar-pressy thingy, because the game refuses to let you mass-open packs.

And I was like, well, I don't want to open pack 75. Pack one was fun, but 75 is not. I also have another friend who's a huge Lost Ark person, and I would make fun of him for playing Lost Ark at, like, 9:45 in the morning. I'm like, aren't you supposed to be at work? And he's like, no, I'm at work.

These are my bots. And I'm like, oh, so every single time the little icon comes up that says so-and-so is playing Lost Ark, I know that's his bots. As a matter of fact, he's probably having more fun, because he feels like he's optimizing and min-maxing, and that's actually enjoyable, right?

And so there's this balance, I think, between how much you as a developer allow bots for gameplay purposes, because your players actually feel like that's a fun part of playing the game, versus the consequences for the rest of the player base. I think, like you said, Henry, maybe the question is not whether or not you exterminate them, but how you identify them and make sure that you at least know this is a bot, or this is a player using an augmented application of some kind, and flag those accounts. And so maybe that's where we should start. So you've decided that you're on a mission to identify the players in your player base that are bots, or identify players who are using software that's outside the realm of the original scope of the game. And my first question comes before we talk about the actual game studios themselves, like your game itself, right?

Is there anything that the first parties do, like Steam, Xbox, or PlayStation, to expunge bots? For example, I was doing some research: YouTube explicitly forbids the use of automated systems to artificially grow any account. What's the landscape looking like out there for some of the primary PC and console platforms?

Henry: Yeah, it's a great question. I can't speak specifically to the consoles, but I can speak to PC as well as online service providers. From what we see, almost everyone does have something here. Steam certainly has internal resources focused on this, and I would venture a guess that the consoles have teams focused on this as well.

It's just a really hard problem. And so you kind of need it at every layer, and you should really never trust that the guy in front of you in line, or the service in front of you in the stack, did a good job. One example of this, broadly: I'll talk to folks who say, oh, well, we use sign-in with Google, expecting that Gmail has done the work of preventing fake accounts. You can buy an aged Gmail account for like 7 cents an inbox, right?

So you can get basically unlimited numbers of these things at nearly zero cost. The key is trust, but verify, right? You want to have some trust that there's something happening further up the stack, but definitely not rely on it, as far as what we see.

George: Yeah. As far as what I see in the gaming space specifically, I know Steam has Valve Anti-Cheat, specifically for some of the behavioral stuff in game.

I know Sony works with a lot of third-party anti-cheats for detections. Microsoft has been public in the space; they've filed a few patents on their own behavioral analysis detection, though I'm unsure what they actually use in practice. And it's a difficult problem. There's a lot of variation, especially on the behavioral side in game, from title to title.

So it's a little bit different. More notably, I think they've taken more action on deterrence. One is setting policy, as you mentioned. If you don't have a policy, it catches users off guard, because you didn't set the guardrails in the first place. And at this point, everybody, or at least all the major players, are much better in terms of where their policies and terms of use are today. For example, I think just this year Sony decided to block Cronus, which lets you run macros and do things that help with aim assist. And it kind of makes sense: in the cross-platform first-person shooter games, allowing that type of controller input did give some console users an advantage.

So in that way, they're rejecting certain partnerships or affiliations with certain hardware types, for example. And on the deterrence side, you actually see platform-level bans, and I think that in itself is a major deterrent. Meaning, if you get caught cheating or doing something fraudulent in any Sony game, you get banned in that ecosystem.

Which is much heavier than a mobile free to play, where you can just create a new account for that one game.

Henry: Yeah, and on that concept: anyone who owns the hardware, for instance the console providers, or in mobile's case Apple and Google, has hardware-level data about the users.

And so at a platform level, on an owned device like a console, they can take that action. The challenge is that when it comes to PC or online, it's nearly impossible to block one user from returning, or even to detect them as the same person across console and PC, things like that.

The other question at play here is how much data the games themselves get from the device owners. So for instance, Xbox, PlayStation, et cetera will know the unique piece of hardware a user is playing on, but that data likely isn't exposed to the game. I don't know the console side as well, but in mobile, you're not really exposed to that data. You're given an ID that you can then use, but through resets and things like that, users can basically expunge that data and appear to be a totally fresh device. So there are things in place, and some of this data can be used by games, but the data the games themselves get from the device owners is transient, kind of ephemeral, and not fully fraud-resistant.

Alexandra: I see. Okay. So that goes to the question of how you should set up the infrastructure to catch your bots, and obviously your answer is probably going to depend a lot on the kind of bot behavior in the game, whether it's cheating, fake accounts, et cetera.

But say I'm starting from a fresh start, right? I'm building a game. When should I start thinking about setting up bot detection infrastructure, and what are the bare-bones basic minimums I should be doing? Kind of like food, water, shelter, in terms of protecting my game and being able to identify the bad actors, either on the game server side or among the player-created bots.

Henry: Yeah, it's a great question, and something companies spend huge teams figuring out. The piece of advice I would give folks is: I definitely wouldn't start too early.

I think you want to have users in there before you spend a lot of resources on this, but once you have a thriving economy, that's when you want to be focused on it. Basically, don't spend your resources before you have game-market fit, if you will.

You want to have some sort of economic fit or product-market fit with your game before you're spending all of your time thinking about bots and fraud. So from a timing perspective, I would do it then, or when it becomes a problem. And I'd love to hear your opinion on the timing as well, George.

From a defense framework perspective, there are a couple of things I would say. Super high level: the more signals, the better. The more unique signal components you have, whether that's behavioral, device, network, or server-side linkages, the better. You want as many unique, complementary signals as you can get.

And then secondly, the more you can solve it higher up the stack, earlier in the user journey or earlier in the user funnel, the more work you're going to save, right? If you can catch them at sign-up, you're going to save yourself ten times the work versus catching them in game, just because of the sheer number of data points you have to look at down the funnel versus higher up in the user journey.

So those are my high-level things. I'd love to hear your take on those, George, and then we can get into the details as well.
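As a rough illustration of the "many complementary signals, caught high up the funnel" idea, here is a minimal sketch of a signup-time risk score. Every signal name, weight, and threshold here is a hypothetical assumption for illustration, not Verisoul's actual model:

```python
# Hypothetical signup-time risk score combining independent signals.
# Each check contributes a weight; accounts above a threshold get
# blocked or asked for extra verification before any gameplay data
# ever needs to be analyzed.

DISPOSABLE_DOMAINS = {"mailinator.com", "tempmail.io"}  # illustrative list

def signup_risk(signup: dict) -> float:
    """Combine several weak signals into one 0..1 risk score."""
    score = 0.0
    domain = signup["email"].rsplit("@", 1)[-1].lower()
    if domain in DISPOSABLE_DOMAINS:
        score += 0.4                      # throwaway email provider
    if signup.get("email_age_days", 0) < 1:
        score += 0.2                      # brand-new inbox
    if signup.get("is_datacenter_ip"):
        score += 0.3                      # proxy/VPN/datacenter exit node
    if signup.get("is_emulator"):
        score += 0.3                      # emulator or virtual machine
    return min(score, 1.0)

def action_for(score: float) -> str:
    """Map the score to an action at the top of the funnel."""
    if score >= 0.7:
        return "block"
    if score >= 0.4:
        return "verify"                   # e.g. phone verification step
    return "allow"
```

The point of the sketch is the structure, not the numbers: several cheap, complementary signals combined at sign-up, so the expensive behavioral analysis only ever sees accounts that survived this gate.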

George: Yeah. I think ultimately, as a game developer, you're betting on your own future success. So I agree that in practice a lot of this stuff is overkill if you're not a large multiplayer game.

But in success, these become massive problems. So you should at least address the low-hanging fruit and plan for what you need to do in advance. Practically speaking, that means logging and data pipelines. If you don't have logs for the early sign-up flows you mentioned, Henry, there's lots of low-hanging fruit there.

Look at things like hardware profiles and keyboard bindings. Those are relatively simplistic attributes, and you can find a lot of information about a player's behavioral pattern just from that. But also log key events in game, for the stuff GGWP focuses more on; in addition to chat and voice, we're looking at gameplay too.

You can't find things like aimbots if you don't track any of the events in the game. So you just need to track a lot of that stuff up front. And when you get down to it, there are a lot of low-hanging-fruit elements. Some stuff is pretty straightforward, like repetitive movement.

I mean, that in itself can get complex, but in its most simplistic form, look for perfect timing. If the time from cooldown to press is zero milliseconds, or less than 20 milliseconds, over and over, that's like a DDoS pattern in normal cybersecurity, and in gaming that's just clearly not a human being with perfect response to something. Or long play durations: there's got to be some upper bound.

Why is someone logged on for 48 hours doing stuff non-stop? There's some cutoff if you care about maintaining the balance of gameplay progression. Those are really easy things to implement. Other things obviously get way harder, right? Like if you're saying, oh, someone hit the right angle while they were looking the wrong way, and that looks like cheating.

I mean, player support teams can't always figure that stuff out consistently. But again, there's also low-hanging fruit, so not all of this stuff is equally challenging. At a minimum, I think you can invest in that, or work with somebody, because a lot of those things work out of the box.
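The simple behavioral checks George mentions, sub-20-millisecond cooldown-to-press timing and implausibly long sessions, could be sketched like this. The thresholds are the examples from the conversation, not tuned production values:

```python
# Illustrative behavioral heuristics: superhuman reaction times after
# an ability cooldown, and sessions longer than any human sustains.
# The 20 ms and 48 h figures are the examples given in the discussion.

HUMAN_REACTION_MS = 20
MAX_SESSION_HOURS = 48

def superhuman_timing(reaction_times_ms: list, min_repeats: int = 5) -> bool:
    """Flag if the cooldown-to-press gap is under 20 ms over and over."""
    fast = [t for t in reaction_times_ms if t < HUMAN_REACTION_MS]
    return len(fast) >= min_repeats

def session_too_long(session_hours: float) -> bool:
    """Flag sessions past an upper bound no human plays non-stop."""
    return session_hours > MAX_SESSION_HOURS
```

A single fast press proves nothing; the repetition is the signal, which is why `superhuman_timing` requires the pattern to recur before flagging.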

Alexandra: Interesting. Yeah. And the tracking of the data itself could just be useful for other things too, right? So it has a double purpose. I would definitely want to know how long my average session lengths are in my game. And since I'm already interested in tracking that KPI, if I see a bunch of accounts with session lengths above 48 hours, I'm like, this looks suspicious. So it sounds like your recommendation is, from the beginning of a game, just do some of the bare-bones things.

And Henry, your recommendation is that once it becomes a serious problem, once you have product-market fit to some extent, then you can crack down on your specific bot problem in a specific way. So maybe let's split those two things. The first is fake accounts.

Right? So walk me through it, and maybe we can use the case study from my example. Now that we're at scale, and we definitely have product-market fit, how do I get rid of them? And let's split it this way, because GGWP is more on the anti-cheat and toxicity side.

And your side is the fake, fraudulent accounts. Let's just say you have identified them. How do you get rid of your bot problem?

Henry: Yeah, great question. As I said earlier, the more you can focus early in the user journey, the better. Behavioral signals are an incredibly important component of this, but the more you can catch before you even have to get to the behavioral component, the better off you will be. From a data perspective alone, you're saving yourself massive amounts of time sifting through data. The three main things we think about are three problem vectors: multi-accounting, bots or automation, and fraud signals.

So let me break each down individually. For multi-accounting, the way we typically approach it, and the way we recommend folks approach it, is to look at things like device and network timings, sign-in timings, locations, hardware IDs, configurations of the device, all this type of stuff, to create clusters or maps or probabilities that users are the same person.

That helps to identify clusters of fraud, multi-accounting, Sybil attacks, this type of stuff. On the fraud signals side, we're typically looking for things as simple as fake emails or extremely new emails; if you're on PC or free-to-play, things like virtual machines and emulators; connecting via proxy IPs or VPNs; tampering with your device. These things really clue you into fraud signals.
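A minimal sketch of the clustering idea Henry describes: group accounts that share a device or network fingerprint and surface the clusters. The field names (`hardware_id`, `ip`, `config_hash`) are illustrative assumptions, and a real system would weigh signals probabilistically rather than linking on any exact match:

```python
# Toy multi-accounting clusterer: a union-find over accounts, where
# sharing any fingerprint attribute links two accounts into one
# cluster. Large clusters are candidates for review or ban waves.

from collections import defaultdict

def cluster_accounts(accounts: list) -> list:
    """Return sets of account ids; accounts sharing any fingerprint
    value (hardware id, IP, config hash) land in the same cluster."""
    parent = {a["id"]: a["id"] for a in accounts}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    seen = {}  # (attribute, value) -> first account id seen with it
    for acct in accounts:
        for attr in ("hardware_id", "ip", "config_hash"):
            value = acct.get(attr)
            if value is None:
                continue
            if (attr, value) in seen:
                union(acct["id"], seen[(attr, value)])
            else:
                seen[(attr, value)] = acct["id"]

    clusters = defaultdict(set)
    for a in accounts:
        clusters[find(a["id"])].add(a["id"])
    return list(clusters.values())
```

For example, two accounts signing in from different IPs but reporting the same hardware ID would fall into one cluster, which is exactly the kind of linkage that survives a botter rotating proxies.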

One of the biggest things we see is that decently sophisticated folks know you're looking at things like device and network, so they try to obfuscate those details. They're trying to hide, right? They'll tamper with their device details, they'll sign in from different locations, different IPs, different networks, things like this.

Identifying that at the onset is a really easy way to cluster users and detect fraud signals. And then the last is automation. It's a lot of the stuff George mentioned, but a lot of these tools that folks use give off telltale signals, or actually change the device details themselves.

And you can detect them just by looking at the device details for certain keywords or patterns, or weird stuff happening in permissions. These small checks can actually catch a lot of these automation frameworks or macros before they even run. So that's the set of things we look at.
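A toy version of that keyword scan over reported device details might look like the following; the marker list is purely illustrative, since real automation frameworks and the traces they leave vary:

```python
# Hypothetical scan of device-reported properties for telltale
# automation markers. Real detection would combine many such traces;
# this just shows the shape of the check Henry describes.

AUTOMATION_MARKERS = ("selenium", "puppeteer", "headless", "autoclicker", "macro")

def automation_markers(device_details: dict) -> list:
    """Return any automation-related keywords found anywhere in the
    device's reported properties (case-insensitive substring match)."""
    blob = " ".join(str(v).lower() for v in device_details.values())
    return [m for m in AUTOMATION_MARKERS if m in blob]
```

The appeal of this class of check is that it runs on data collected at or before sign-in, so the automation can be caught before it ever acts in game.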

The other thing I would mention: almost everyone is using some sort of cloud infrastructure, and all of those offer a web application firewall, a WAF. I would highly recommend folks use that. It works out of the box and will prevent DDoS attacks, or really high-scale attacks from one or two IPs or data centers, from hitting you.

So that's a super easy way to get started, with an out-of-the-box, nearly free product.

Alexandra: Okay. So those are all things to address, like you said, as early in the player journey as possible. Basically, prevent them from signing up in the first place; they cannot even create an account, because it's like, you've entered your email,

I know you're a bot, you've been rejected. But let's just say the bots are in the ecosystem and they're tampering with the game. How do you expunge them at scale? Do you just go in, use some of the behavioral patterns you've set up, and then cancel the accounts?

So they can just never log in again. Is that the way you guys would approach it, or is that too naive?

Henry: Yeah, I think it comes back to something you said earlier, George, that I liked, which is how clear you've been with the community around the terms of service. What do you have jurisdiction to do, right?

Have you been clear about no bots, no fraud, stuff like that? One of the smart strategies we see a lot of games take, and you used this word earlier, George, is ban waves. They'll do it in waves. One of the reasons is that if you block someone at sign-up, they quickly get that feedback, and they can retry and come back in with a new account, et cetera.

But by doing it in waves, you lengthen that feedback cycle. If they are malicious, we would recommend either banning them or asking for some additional form of verification, whether that's KYC, George, or phone number verification, something that says, hey, we're going to add cost to this process for you. Or we're going to flat-out ban you and say, hey, if you have a problem with this, go to customer service.

One of the games you worked with, I think they banned over 30,000 accounts in one wave and had two or three complaints, right? Bots know they've been caught; they don't complain. So if your targeting is good, you're going to be fine. These guys, when they get caught, they just go. They're not reaching out and wasting their time or yours.

They just don't. They know they've been hit. So if the targeting is good, you should feel comfortable banning. And if you don't, we think things like additional verification steps are fine, because you're not actually adding friction for the real users, right? So in the KYC example, George, you probably don't want everyone to do that.

It's just going to deter people, as you said. But if users look really suspicious, it's a fine time to ask them to do that, because they're fraudsters or bots anyway. And if you did get a false positive, they'll probably just do it, and it's fine. They lost a minute of time.
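The ban-wave pattern Henry describes (queue detections silently, then act in batches so the botter's feedback cycle is stretched out) could be sketched like this. The structure and the interval are hypothetical, not any vendor's actual system:

```python
# Toy ban-wave scheduler: flagged accounts are queued with no
# immediate feedback to the botter, then banned together once per
# interval. Immediate blocking would let botters probe detection
# in real time; batching denies them that feedback.

import datetime

class BanWave:
    def __init__(self, wave_interval_days: int = 14):
        self.interval = datetime.timedelta(days=wave_interval_days)
        self.queued = []
        self.last_wave = None

    def flag(self, account_id: str) -> None:
        """Queue an account silently; the account keeps playing for now."""
        self.queued.append(account_id)

    def maybe_execute(self, now: datetime.datetime) -> list:
        """Ban the whole queue at most once per interval; return the
        accounts banned in this wave (empty if it's not time yet)."""
        if self.last_wave is not None and now - self.last_wave < self.interval:
            return []
        banned, self.queued = self.queued, []
        self.last_wave = now
        return banned
```

One design note: because flagged accounts stay live until the wave fires, this approach deliberately trades a short window of continued bot activity for a much slower adversary learning loop.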

George: Yeah. From a security standpoint and a policy standpoint, you could absolutely ban all these people. I think it's a little more complicated than that, though, for many of these consumer platforms. There has to be an internal commitment that you're willing to address the problem, and that means also acknowledging the growth numbers are going to take a hit.

There's going to be some change in the communication of those numbers. I actually think that's the biggest one, because you need internal executive alignment that it's now okay to go public with these things. And then once you choose that route, I think you get there because you believe announcing this is better for the game than not, right?

Because there is some trade-off. In some ways, you want to be transparent, so not only having the ban wave, but maybe selectively indicating it, telegraphing it in advance, doing early communication, giving people time to stop on their own, even. You make this announcement and say, in X amount of time, we're going to do this.

That way your entire community can react to it and give feedback. I think that matters a lot. And then once you do it, you publicly announce it, and that's the deterrence step, right? You show that you've actually taken this step. You note the people you got, because you're not going to get everybody, right?

Because there are different paths or approaches to it. And when people see that, it causes additional deterrence as well. And you continue to engage the community and get their feedback. It matters if, I don't know, everybody actually liked the bots because they were tired of getting crushed by sweaty real players all the time in some game.

Then, oh well, you've actually created a slight problem for your community. Maybe you balance it: at low-level ranked or unranked, you just let it be, but in ranked games, obviously you can't have these bots boosting each other, because that's not something you want. So be very specific with the goal, and just share. Gamers as a whole are very smart; they know the community well, they know the platform well. So share with them and engage them

in that decision-making. I think that's very valuable.

Alexandra: Yeah. That actually has a lot less to do with the infrastructure and more to do with the humans involved in the process: your community, your executive team, and how you manage that, right? It sounds like, to you, the community should play a big role in either reporting and bot detection, or advocating,

yes, it's positive, or no, it's negative. Is that the correct conclusion?

George: Yeah, and I think in part it's because these are all large multiplayer games in the first place that have these problems. So they have very large communities.

Henry: That makes sense. Yeah, I think it's a really great set of points, George.

There's one piece I might jump in on, and it's this idea of telegraphing in advance. I think it's good to do, but there's a double-edged sword there, where you basically show your hand. Anyone who has accumulated resources in a fraudulent way can now go unload those and capitalize on them before they've lost the chance.

And so if you get into this pattern of telegraphing exactly when you're going to do something, weeks beforehand, then botters know, hey, I just have to beat the deadline, and they'll tell me when it is. So I think there's a surprise element you do want to have in there at some point.

Maybe you're really clear that it's happening, but you're not telling them when. It's like, if you're playing football and I tell you the play I'm going to run, it's pretty easy to prepare for. So you want some element of surprise in there, but you do want community buy-in.

Totally agree with that. It's a really good point that I didn't touch on at all, and we do see it's really effective. People do cheer when you boot out botters. Sometimes it's even fun to show the fraudsters you've found, almost like a leaderboard of the bad guys, right?

This guy had 84 accounts; we kicked them all out. We returned 30 skins and 20,000 resources to you all. You gamify the bot killing, you know? Or you have hunters who can report them, and if they find them and it's true, maybe they get some resources.

There's this community aspect. So community is super important. Those are all great points; just some minor nuances from what I've seen.

George: Yeah, I totally agree. It's a trade-off, right? It ultimately comes down to what you're looking for as well. If it's very serious, if, I don't know, it's completely destroying the economy

and you have a crime syndicate around it, you'd treat it like an FBI sting case: you don't say anything until you're ready, right? Versus Alex with her Hearthstone packs. You probably don't need to jump on her with a silent ban. You should probably give her a heads-up and create it as a game feature.

I think that's just reasonable.

Henry: Yeah. There are times when you want the secret SWAT team, and there are times when you want the nice neighbor calling. That's right.

George: That's right.

Alexandra: Yeah, that's great nuance. And even after you've identified everybody, the manner of execution, to your point, Henry, whether you string the bots up and put their heads on a pike, is different depending on what you're trying to accomplish.

And so, guys, we're wrapping up on time here, and there are a ton of topics on the details of detection systems that we weren't able to get into. But I want to ask a final, concluding question, which is about AI and its relationship to botting.

I'd love to hear your takes. On the one hand, is AI making it easier to catch the bots, or is it making it easier for bad actors to build bots? Do you think botting will get worse or better for games in the years to come?

Henry: From our perspective, it's definitely both.

The thing that's really interesting is splitting the bad-actor side into two halves. One is that AI is adding sophistication to the existing bots, but it also makes creating bots that much more approachable. Now almost anyone with a computer can do it. And I think the scale is actually where we've seen the most interesting explosion.

So yes, the sophistication is getting more advanced, but really it's the scale that has exploded. Anyone can do this stuff now, and you can do it at scale really easily. That's where we've seen the most interesting inflection point. From a detection perspective, yes, AI of course helps.

And George, it's probably even more effective for what you guys do. The other thing that's kind of a headwind is player privacy. It's a huge focus, and of course it's super important, but every time there's more privacy, there's less data you can use to actually determine whether someone is real or unique.

AI can help bridge that gap a little, and help cluster or see things faster than deterministic models can. But yeah, I think this explosion of scale is probably the most interesting thing we've seen with AI.

George: I mean, I think AI is essentially creating, or escalating, this botting arms race, right?

You can detect stuff better, but you can also create stuff much more easily, and more complicated, nuanced stuff too. So what you'd expect to see is that at the floor, the easier stuff is just going to get picked up. It takes less effort for a studio to do something small to at least catch the basic stuff.

There are more tools out there; it's easier to set up. But at the same time, the really nuanced, hard stuff is just going to become even harder to detect. I hear your point about scale, Henry, but what I've found is that humans tend to prioritize and focus energy on larger and larger scale problems.

The bigger the problem, the more likely people will swarm around it and solve it collectively. In fact, the argument for why you don't see more of this in gaming is that there hasn't been something that led to massive criminal activity or completely destroyed a game, to that same extent.

So I'm slightly less worried, because if you think about it from a bad-actor perspective, the motivation is money, or winning, but it's also not getting caught. And the bigger something gets, or the worse it goes, the much more likely you are to get caught at the end of the day.

Alexandra: Yeah, two very different takes, which is the whole point of predicting the future. Well, guys, this was really fantastic. I feel like I'm much more educated on the nuances of botting, or fraud, or multi-accounting, or whichever word is most appropriate to encapsulate this topic. It was a pleasure to have you guys on air today. If there are people in the audience looking to get in touch with your companies, because they want to implement a detection solution or bolster their defenses and user integrity, how can they get in touch with you?

Henry: Our website, or shoot me an email at [email protected]

George: Same website, ggwp.com or [email protected].

Alexandra: Excellent. Well, as always friends, that's our episode today. If you've got feedback or ideas, please hit me up at [email protected]. We are always open. And with that, we are out. Henry and George, thank you so much.

Henry: Thank you so much. Super fun.

George: Thank you.

If you enjoyed today's episode, whether on YouTube or your favorite podcast app, make sure to like, subscribe, comment, or give a five-star review. And if you wanna reach out or provide feedback, shoot us a note at [email protected] or find us on Twitter and LinkedIn. Plus, if you wanna learn more about what Naavik has to offer, make sure to check out our website, www.naavik.co. There, you can sign up for the number one games industry newsletter, Naavik Digest, or contact us to learn about our wide-ranging consulting and advisory services.

Again, that is www.naavik.co. Thanks for listening and we'll catch you in the next episode.