Meet Christina Camilleri, Netflix Games' "Chaos Gremlin" and "Safety By Design" Champion
From a dedicated MapleStory player to an "ethical hacker" to the Head of Trust and Safety at Netflix Games, Camilleri has blazed a unique trail across the digital community support landscape
[A] misconception of trust and safety is that it inhibits fun and it’s a cost center, and it’s similar to how security is viewed, and that it’s something that the company has to do for legal reasons or safety reasons, but not necessarily something that contributes to the profitability or the success of the game, which I think is a really unfortunate…because it’s the opposite. I think safe games are sticky games. When people feel respected and welcome, they stay.—Christina Camilleri
Last August, Greg Posner, host of the Player Driven podcast, had a wide-ranging and thought-provoking discussion with Christina Camilleri, Netflix Games’ Head of Trust and Safety. In this post, I’ll highlight parts of their exchange that I found especially resonant. (Note: the full episode is likely on your preferred podcast platform, and it’s here too: From Hacker to Guardian: How Safe Games Keep Players Hooked.) A key theme of the discussion was how video game studios can, and are working to, ensure the safety of their users online, but many of the takeaways and best practices apply to nongaming digital community contexts as well.
Before diving into the highlights, it’s worth acknowledging the two elephants in the room: first, Netflix is, naturally, far better known for video streaming than for games (although their mobile game Squid Game: Unleashed struck a chord last year) and, second, in early December Netflix announced it intended to acquire Warner Bros. Discovery in a blockbuster deal valued at $82.7 billion. Camilleri’s and Posner’s discussion took place well before this megadeal was announced, but it’s worth pointing out that Warner Bros. operates several game studios with popular IP, including Harry Potter, Mortal Kombat, and Batman: Arkham. Assuming Camilleri stays on and the acquisition closes as anticipated, it’s likely that several of the topics addressed below will touch millions of gamers worldwide in 2026.
“I used to ride motorcycles,” Camilleri told Posner early on in their August 2025 exchange. "But I had to give it up after an accident…My current, I guess, half project is my truck that I’ve kitted out to go overlanding…[It’s] a pickup truck, which now has a camper on the back so I can work and live out of it when I want to get away. So it’s fun.”
Camilleri’s Aussie roots showed through in this comment: overlanding, a term popular down under, describes self-reliant vehicle travel across remote terrain, roughly what many Americans would call off-roading.
Posner asked what drove her to invest in that big restoration project.
Camilleri answered that it “stems back to my love for just taking things apart and figuring out how they work. And then, in that process, when putting them back together, you realize you can put it back together sort of however you like and customize it.”
“I’ve always had a bit of chaos gremlin in me. Again, I love figuring out how things work by breaking them. And yeah, that naturally translated into security.”
“I think there’s a thrill in learning how something works, and then learning how to bypass them, especially systems that were supposedly locked down.”
A true hacker’s mindset there!
Beyond playing games, two of Camilleri’s favorite pastimes are playing with her cat and…fly fishing?
I’ve “gotten really into fly fishing,” she told Posner. “I actually just got my first pair of waders two weekends ago. I wanted to make sure I loved fly fishing before starting to invest in all of the gear.”
“I just got back from a trip from Italy a few weeks ago. And one of the things I did there, up in the Dolomites, was fly fish for a day with a guide, which was a magical experience. And that was a point where I was like, ‘Okay, it’s time I’m getting waders.’”
Read on to catch three of Camilleri’s best practices for designing and managing safe and trusted online communities:
Cover the basics with a set of proactive and reactive community moderation tools
Use “safety by design” techniques that can help mitigate toxic user behaviors before they begin
Offer easy-to-use customization options so that users who experience toxicity can opt out of similar future episodes.
Bonus coverage: Along the way, we’ll chronicle how an impressionable teen MapleStory Aussie noob wound up becoming Product Manager, Games Trust and Safety at Netflix, a Fortune 150 U.S. company.
Community Management 101, With A Side of MapleStory Love
Safeguards have gotten better, but there’s also just a lot more people online and other ways for exploitation and unsafe things to happen. I think a lot of that is just tied to human nature. Unfortunately, you put a system in front of them, they’ll find a way to get around it.—Christina Camilleri
“My mom worked three jobs,” Camilleri explained to Posner at the outset of their Player Driven podcast discussion last August. “When I was young, I was home alone a lot…Games were around. I think my first console that my mom spent a long time saving up for was an Xbox. And then I sort of naturally transitioned to a PC.”
Games “were, as cheesy as it sounds, I guess my second home,” Camilleri continued. Wizet’s and Nexon’s “MapleStory was, I think, the first MMO that I played.”
“I was definitely a bit of an awkward and shy kid.”
Social video games soon became “a way for me to connect with other like-minded people online. I actually made some of my really important and influential friendships” on those early online communities. “My first relationship actually came from someone that I met online.”
Looking back, Camilleri also credits MapleStory for launching her career.
“One of the strong memories I have was in MapleStory. I was part of a guild, and the guild predominantly used IRC, Internet Relay Chat, for communication.”
“It was through IRC that I, weirdly, formed my community around security.”
“I think Freenode was kind of the go-to server for a lot of InfoSec and, uh, nerdy folks…That’s actually how I got connected to some security folks and then ultimately ended up at my first conference.”
“It was the Wild West in, like, a lot of ways. I was, I mean, obviously, a younger female in those spaces and there weren’t really many, if any, safeguards or safety nets on these games or on IRC.”
“Now I think that the duality of being in both games and communities, both being places, like, a safe haven and a place of risk, was what shaped my passion for safety.”
“I have, probably, MapleStory to thank for my career and my move to this country in a weird and roundabout way.”
MapleStory friends pointed her to IRC. IRC led her “to those security folks that led me to my first conference,” Camilleri recalled. “If I didn’t go to that first conference, I wouldn’t have gone to this, like, recruiting event. If I didn’t go to that recruiting event, I wouldn’t have gotten my first job.”
The timestreams have officially been crossed.
Let’s switch gears and break down one of Camilleri’s best practices for supporting healthy, secure, and trusted online communities: delivering a baseline of backend support systems and tools.
Near the middle of the podcast, Posner asked Camilleri a broad question about the kinds of mistakes she’s seen studios and other companies make over the years.
“There’s sort of three buckets...where devs [developers] go wrong” and fail, she said, “to address the potential for harm.”
The first bucket involves “reactive moderation tooling, [which] is getting better and better at preventing and removing harmful content, but it still requires players to misbehave or something to go wrong before you [the community provider’s support team] can do something about it.”
A core piece of a safe, trusted online community is a flagging system that allows users to report what they believe is inappropriate behavior by other users. Think of it as the community listening for danger. Game studios and other online community providers need to aggregate those incoming reports so they can be analyzed, triaged by severity, and acted upon as needed, ideally with as short a turnaround as possible.
If players and users are interacting by voice chat, text, or other communication systems, all of those data streams need to be captured and stored in a way that lets cases of alleged toxicity be judged fairly and quickly. False positives happen, and they need to be correctable. A studio can also proactively mine this backend data repository to identify and weed out its worst trolls before they are ever flagged. Ideally, these proactive and reactive tools will also surface repeat offenders for increased scrutiny, ramp up their penalties if they don’t change their behavior, and, when warranted, permaban them.
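To make the aggregate-and-triage idea concrete, here is a minimal sketch of a severity-ranked report queue. Everything here is illustrative: the category names, the severity tiers, and the volume-based boost for repeat targets are assumptions, not any studio's real taxonomy or policy.

```python
from collections import defaultdict
from dataclasses import dataclass, field
import heapq

# Illustrative severity tiers; real moderation taxonomies are far richer.
SEVERITY = {"spam": 1, "harassment": 2, "hate_speech": 3, "threat_of_violence": 4}

@dataclass(order=True)
class Report:
    priority: int                          # negated severity, so heapq pops worst first
    reporter_id: str = field(compare=False)
    target_id: str = field(compare=False)
    category: str = field(compare=False)
    evidence: str = field(compare=False)   # chat excerpt, clip ID, etc.

class TriageQueue:
    """Aggregates incoming player reports and surfaces the most serious first."""

    def __init__(self):
        self._heap = []
        self._reports_per_target = defaultdict(int)

    def submit(self, reporter_id, target_id, category, evidence=""):
        sev = SEVERITY.get(category, 1)
        self._reports_per_target[target_id] += 1
        # Repeat offenders bubble up: boost priority as reports accumulate.
        boosted = sev + self._reports_per_target[target_id] // 5
        heapq.heappush(
            self._heap,
            Report(-boosted, reporter_id, target_id, category, evidence),
        )

    def next_case(self):
        """Pop the highest-severity open case, or None if the queue is empty."""
        return heapq.heappop(self._heap) if self._heap else None
```

The design choice worth noting is that severity, not arrival order, drives the queue, which is what keeps turnaround short on the worst cases.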
Each user account needs an integrated, searchable behavioral log associated with it. If a studio doesn’t cover this base, Camilleri maintained, especially if the associated video game is competitive, it’s inviting serious harm and blowback.
“I read a case study around Among Us a while ago,” she told Posner. “It was initially prototyped and tested with, like, a trusted group of friends and developers. And then they launched it, and then it got a massive reception. I think COVID had to do with some of that.”
“But they released the game with no chat filtering and no basic reporting or blocking. And the whole game is about murdering people and lying about it.”
“The game had a massive surge in toxic behavior and griefing, I think, largely due to the amount of strangers now interacting with one another, and because it was a competitive game.”
(I’ll add parenthetically that Among Us was, and is, especially popular with younger kids, including my daughter, who was six back in 2020, who were home from school for months as the pandemic raged.)
“You can’t just bolt on a moderation tool and hope for the best,” Camilleri added. “You need some kind of reassessment or reset into that system.”
“Some studios just don’t even care” to cover this important trust and safety base. Without getting specific *cough COD*, Camilleri said that some “competitive online FPSs [first-person shooters] are like, ‘Yeah, we know our players are toxic and horrible. It’s a fucking shooting game and we don’t care to do anything about it.’”
Lol.
Naturally, these basic proactive and reactive backend community systems should be paired with a clearly spelled out and repeatedly stressed set of community guidelines and standards. Community members need to know what the rules are and understand there are consequences if they run afoul of them. This is policing 101.
Before moving to Netflix in 2019, Camilleri spent about two years working security at Riot Games in Los Angeles.
At Riot, Camilleri recalled to Posner, “I had a bit of a weird hybrid role. I was focused on both employee security, so helping employees...understand how to improve their security posture, but I also had a player investigation angle, and you can probably imagine that the reports that came in, and the things that the tooling picked up at the time on League of Legends chat was probably some of the worst parts of player behavior I’ve seen.”
“I was responsible for investigating high sensitivity threats there, then figuring out who was behind the threats, and then, sometimes, escalating that to law enforcement.”
Clearly, some hardcore League of Legends players weren’t just violating the game’s community standards, they were breaking irl laws.
A studio or company that fails to track how its users are interacting on its servers isn’t just opening the door to toxic behavior but, potentially, to legal liability. (This isn’t theoretical: in addition to private cases, at least six U.S. state attorneys general have pending lawsuits against Roblox for allegedly failing to protect children.) It’s true that community trust and safety systems aren’t free. Camilleri’s point to Posner was that these tools can also have a net positive ROI because they (1) help reduce churn by weeding out toxic trolls and (2) shield studios and companies from legal risks.
Safety by Design
Safety by design is about intentionally shaping systems to prevent harm before it happens. So, not just patching things when they go wrong or after things go wrong…It’s a much more proactive mindset, not just a feature set.—Christina Camilleri
“I don’t think a design always means forcing purely pro-social behavior,” Camilleri explained to Posner in their August 2025 Player Driven discussion. “While games should promote respectful play, part of the beauty of gaming is that emotional range.”
“I think smart design acknowledges, as well, that not all intense or competitive interactions are toxic, and we should support rivalry without, necessarily, the real world harm aspect.”
“Instead of anticipating bad things to happen, let’s instead look at what might be causing that bad behavior to happen,” Camilleri added. “And that leads us to the next thing [bucket of mistakes], which is trying to design features without considering social dynamics or proper controls.”
This is an important mindset to have, she said, “when you’re releasing to an audience, potentially, full of anonymous strangers, not, like, a trusted testing group.”
Moreover, “Competitive elements, like I was talking about before, can increase people’s aggression.”
“What Riot Games opened my eyes to,” Camilleri said, “is the types of badness you can see, and a better understanding into what motivates some of that.”
Games such as League of Legends make a mistake if they put “low-trust users into high-trust situations.” If a studio puts players who “have never interacted with each other before…into a situation where they’re expected to perform well…that’s a lot of pressure on someone to do well without really knowing anything about” their teammates (or the players on the opposing team).
“That can create toxicity because, if someone lets you down, maybe the reaction is going to be pretty poor, or you’re going to put a lot of heat and hate and pressure on that person that isn’t performing well. And they’re going to get defensive because everyone is in a high-stress situation. And that just, sort of, cultivates a high-stress, aggressive environment for players.”
A safety by design technique can help studios and their player communities avoid this problem by not putting low-trust players into a high-trust situation. This has more to do with human psychology and social dynamics than coding.
At GDC 2024, Camilleri was part of a Community Clubhouse panel that got into this topic in more detail than we’ll do here. That panel, “Building Safe and Prosocial Game Platforms,” was quite instructional (and if you have access to GDC Vault, I definitely recommend it).
Daniel Cook of Spry Fox was especially eloquent on the panel about the value of a safety by design mindset. He said that humans are capable of miraculous things if a high level of trust exists between them. If a studio, instead, puts a group of people who don’t know or trust each other into a situation where deeply coordinated action is the only way they can “win,” those people inevitably turn on each other as soon as things go sideways. That’s a poor design choice, not a fault in the coding. High-trust situations are something that must be worked toward over time, and they have to be an opt-in environment for all users.
Safety by design is about mitigating toxicity before it begins based on an understanding of human psychology and social dynamics. Online community toxicity, in part, is driven by suboptimal design choices. Those poor choices can increase churn rates and lower the amount of money that players spend, especially in free-to-play games.
Camilleri told Posner she’d recently run across some research that showed “that 65% of players quit or...would quit a game because of harmful interactions. And 61% of those who stay still reduce their in-game spending when faced with inappropriate behavior.”
If directionally accurate, this implies that studios that use a safety by design technique to help mitigate toxicity in their online communities—and have a solid baseline of policing tools and community standards in place—may reap massive windfalls over time. (Note: I poked around online to find this research; a 2023 report, put together by Take This based on Nielsen survey data, appears to back up Camilleri’s assertion: it found that 6 out of 10 North American gamers reported quitting a session/match or quitting a game permanently because they were subjected to harassment and hate within that gaming community; it also found 61% of players reported they had, at least once, decided not to spend money in a game because of how other players had treated them.)
If even half true generally, that’s a lot of potential spending left on the table, and it would take a big increase in marketing spend to replace those exiting players.
“There’s a game called Sky: Children of Light, a very cozy, wonderful game,” Camilleri added. Developed and published by Thatgamecompany, the game, Camilleri said, was the subject of another piece of research she’d read, which “showed that generosity in that game is very contagious. So players who observe or receive pro-social acts engage more and engage more frequently.”
This is another aspect of the safety by design mindset: done right, the technique encourages pro-social behavior, which in turn, can make game communities stickier. (Note: I did track down the associated study, which was published by the nonprofit computing society, the Association for Computing Machinery, in late 2022. This paper indeed found “empirical evidence” that the MMO Sky: Children of Light was designed in a way that created “contagious generosity,” which led to “higher future engagement in the game.”)
Posner asked Camilleri when she began thinking about such techniques. Her thoughts ran back to her pre-Riot days when she was working on security at companies like BAE.
“What I loved about being a consultant, or some people might call it, like, an ‘ethical hacker,’ is putting yourself in the, I guess, hacker’s shoes, or that offensive mentality where your goal is to try every way to break into the system or to circumvent those controls…I learned over time that the weakest link was usually the human behind it.”
Solving trust and safety problems is as much about understanding what makes people tick as it is about designing high-tech systems and rolling out software tools.
Wrapping It Up With Reputation Systems and Rich User Customization Options
I believe Overwatch has a system now where, when you file a report, if an action is taken on that player, they’ll actually send you a message in-game a few days later saying, “Hey, thanks for sending that report in…We investigated and we took action on that player,” which made two things happen. One, the reporter feels like their reports were taken seriously and actually something was done about it…And two, that then reinforces the community of like, “Oh, this company actually does something to stop and prevent bad behavior from happening. So maybe I should cut down on what I say or what I do in-game.”—Christina Camilleri
According to Camilleri, another “common theme I see [the third bucket of mistakes], where developers go wrong, or can consider doing better, is user controls.”
“A lot of trust a player has in a game’s ability to address harm, I think, can be tied to features like reporting systems and feedback loops,” she explained. “When we deploy controls like blocking, muting, reporting, they often fall short on actually combating the harassment.”
Closing the feedback loop by letting a reporting community member know their experience matters, and that their report made a difference, is an important aspect of building trust between community providers and members. If a baseline set of tools is about a community provider listening to users, then these feedback notes, which don’t have to be fancy at all, are the community provider talking directly back to individual users and letting them know their action mattered.
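The Overwatch-style loop Camilleri describes can be sketched in a few lines. This is a minimal illustration, assuming a `send_message` hook for whatever in-game mail or notification system a platform actually provides; the names and message wording are mine, not any studio's.

```python
from dataclasses import dataclass

@dataclass
class Resolution:
    """Outcome of an investigated report (illustrative fields)."""
    report_id: str
    reporter_id: str
    action_taken: bool

def close_the_loop(resolution, send_message):
    """Notify the reporter once their report is resolved.

    `send_message(user_id, body)` is an assumed platform hook,
    e.g. in-game mail; it is not a real API.
    """
    if resolution.action_taken:
        body = ("Thanks for sending that report in. We investigated "
                "and took action on that player.")
    else:
        body = ("Thanks for sending that report in. We investigated and "
                "found no violation this time, but your report still helps.")
    send_message(resolution.reporter_id, body)
    return body
```

The point of the sketch is simply that the notification fires in both branches: even a "no action taken" note tells the reporter someone looked.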
Posner wondered what happens in cases of false reporting.
Camilleri quickly donned her “ethical hacker” hat.
“We were talking,” she said, “a bit before about testing systems.” When you’re designing “safety systems, like block [and] reporting functionality, you have to test those too, because users will find ways to repurpose systems that are designed to keep people safe into weapons.”
False reporting is a real issue, Camilleri continued, and some games take “automated action on a user when you submit, like, a certain amount of reports on them. And players will usually quickly find out, and then use that against players that they don’t want to continue being online.”
“I always encourage that games with [a] large online community, or [an] online interaction aspect, [to] have some kind of reputation system…You want to track both the good and bad. So, how many times someone’s been reported, or how many times they’ve been actioned against, but also positive contributions, like how many reports have they submitted where it has been accurate and has led to action?”
Such a reputation scoring system “can help you make better decisions about actions to take against those users. If they have a proven track record of being a terrible person, maybe you don’t want them to have access to really rich online communication features.”
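A toy version of the reputation system Camilleri describes, tracking both the bad (times actioned) and the good (accurate reports), might look like the following. The thresholds, weights, and method names are all assumptions for illustration, not a real scoring model.

```python
class Reputation:
    """Per-account reputation tracking both negative and positive signals."""

    def __init__(self):
        self.times_actioned = 0     # confirmed violations against this player
        self.reports_filed = 0      # reports this player has submitted
        self.accurate_reports = 0   # their reports that led to action

    def report_accuracy(self):
        # New reporters start at a neutral 0.5 rather than 0 or 1.
        if self.reports_filed == 0:
            return 0.5
        return self.accurate_reports / self.reports_filed

    def report_weight(self):
        """Weight this player's future reports by their track record.

        Discounting chronic false reporters (range 0.25..1.25) blunts
        mass-reporting as a weapon against innocent players.
        """
        return 0.25 + self.report_accuracy()

    def can_use_rich_comms(self):
        # Proven bad actors lose access to rich communication features first.
        return self.times_actioned < 3
```

Note the double duty: the same record that gates a toxic player's access to voice chat also makes a trustworthy player's reports count for more.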
Giving users more control over the features they prefer to use, in Camilleri’s view, is another underappreciated aspect of high-quality trust and safety systems.
“I remember first getting a VR headset,” she recalled, “and jumping into VRChat to see how ridiculous that was. I was immediately thrown into open proximity chat with no way of muting it, or turning it off, or like controlling who I could hear or who could, like, interact with me.”
“That’s where a lot of abuse can come from. Just assuming that you can throw someone new coming into your game into, like, an open voice chat, or open text chat, or global chat environment, and hope that nothing bad will happen. And when it does happen, having controls that don’t really allow the user to opt in or opt out…It can feel super overwhelming.”
We’re back to a key churn driver in social games, especially competitive titles.
“When you don’t give the appropriate amount of controls or levers to the player, you’re sort of making an assumption on how they want to play and experience your game,” Camilleri continued. “Maybe someone loves to play hyper-competitive shooters, but just doesn’t ever want to be on voice chat.”
“By not giving them the controls, you’re assuming, ‘Hey, this kind of player wants this kind of experience and that’s the experience they’re going to have.’ But by just adding a little bit more customization options, or ways that the player can choose to opt in or opt out of certain experiences” you can help users avoid toxicity, which can lead to more and longer engagement (and probably more spending).
Roblox, Camilleri noted, “requires your ID for using spatial voice…Fortnite lets players opt out entirely of voice chat. Destiny 2 gives you really, really granular privacy settings.”
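The opt-in philosophy behind those controls can be captured in a small settings object: default everything risky to off, and make audibility a function of the listener's choices. All field and function names here are illustrative assumptions, not any game's real settings schema.

```python
from dataclasses import dataclass

@dataclass
class CommsSettings:
    """Per-player communication preferences, defaulting to the safest
    state (opt-in rather than opt-out). Fields are illustrative."""
    voice_chat: bool = False        # off until the player enables it
    proximity_chat: bool = False    # open-world nearby-stranger voice
    text_chat: bool = True
    friends_only_dms: bool = True

def can_hear(listener: CommsSettings, in_proximity: bool) -> bool:
    """A speaker is audible only if the listener has opted in."""
    if in_proximity:
        return listener.voice_chat and listener.proximity_chat
    return listener.voice_chat
```

Compare this with the VRChat experience Camilleri described: a new player dropped into proximity chat by default would, under this scheme, hear nothing until they chose otherwise.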
“This is shifting the conversation a bit away from what reactive controls can we put in place, to sort of stop bad behavior after it happens, and shifting it more into what controls can we give to the players, and what power can we give to the players to tailor the experience.”
Posner noted that some games, such as Roblox, Minecraft, and Fortnite, offer UGC (user-generated content), and he suggested that UGC can open up another can of potentially toxic worms.
Camilleri acknowledged this was indeed an issue.
You’ve probably heard of “TTP or ‘time to penis,’” she answered. “When you give players controls and the ability to create content, the first thing they’re typically going to make is...a penis or phallic object or something inappropriate.”
“It’s just a great example of players being passionate, and that can lead to great things, and also terrible things.”
“I am very quick to drop games like that, especially if I can’t turn off or opt out of voice chat. It’s just like, ‘Nope I don’t have the energy for this today. I don’t want to play this game.’”
Unsurprisingly, Camilleri comes down solidly on the side of gamers who quickly opt out of communities that have holes in their trust and safety systems. Are you?