Earlier this week, LEGO released LEGO Worlds, “a fully open-world, creativity-driven game.” It’s the digital LEGO set and is widely described as the company’s answer to Minecraft.

However, this isn’t LEGO’s first foray into open-world gaming. Previously, they offered LEGO Universe, an MMO (massively multiplayer online game) that was officially released to the public in October of 2010. The service was shuttered in January of 2012 due to a lack of a “satisfactory revenue model.”

Megan Fox, a former senior programmer on the LEGO Universe team who now heads Glass Bottom Games, shared an interesting story on Twitter, describing one of the big challenges they faced in creating a kid-friendly MMO: penises.

The Story

Understandably, this series of tweets has been picked up by numerous well-read outlets, including The Guardian, Polygon and Eurogamer.net. I wanted to take some time to reflect on this from the community and moderation side, with people who are actually out there doing this work.

More Than Penises

The first person I thought of when I read those tweets was Rebecca Newton. Newton was an early outside consultant for LEGO Universe in 2008 and 2009. She is currently the chief community and safety officer at Mind Candy, who are best known for Moshi Monsters, a virtual world and online game for kids. Beyond that, she has a long track record of building kid-friendly spaces online.

“As I recall, in the early days, LEGO was quite restrictive in what their users could ‘say’ in terms of text/chat,” Newton said, when I emailed. “Disney was equally restrictive back in the mid-2000s. They believed their customers (parents, really) would flee if there was anything offensive. The problem with restricting communication so severely is that kids (or adults) either make up their own language, use a third party product so your walled garden is essentially not walled or leave (enter tumbleweeds).

“Kids, like adults, will police each other in any environment online or off. I remember my daughter, at 2 years old, telling her cousin the rules of the nursery, on her cousin’s first day at pre-school. I thought, ‘now, there’s a hall monitor if I ever saw one.’ Kids will apply peer pressure. There are the hall monitor types, ‘you’re not allowed to do X or Y.’ And there are the rebels, ‘watch me do X or Y!’ And there are the in-betweens, ‘wonder if anything will happen to them for doing X or Y?’ Generally, if you set the tone (as a moderator), the majority of community members will respect that tone and enforce it themselves.”

For a company like LEGO, there is an uncomfortable balance. LEGO empowers people to be creative, and that creativity really knows no limits (even if the brand is generally kid-friendly). People can use LEGO bricks to build whatever they want. If LEGO can’t relinquish some control, it can’t really begin to compete with Minecraft, because users will run into roadblocks they wouldn’t hit there. I’m not even talking about offensive content. Just something normal that might trip a filter and create frustration.

“No company is willing to admit their space isn’t completely ‘safe,’ yet realistically any free build component poses risks,” adds Alison Michalk, CEO of Quiip, a content moderation company. “Even in worlds such as Club Penguin users might arrange their objects or even meet with other avatars to form ‘inappropriate shapes.’

“As technology improves automated scanning may make this more cost effective, but people are often strangely determined to find ways around filters so it may only reduce the burden slightly.”
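Michalk’s point about users routing around filters is easy to demonstrate. The sketch below is purely illustrative (the word list, substitution table and function names are my own assumptions, not any vendor’s actual product): a naive blocklist misses trivial leetspeak, while the normalization step that moderation software typically applies catches the simple workarounds.

```python
import re

# Hypothetical blocklist -- real systems use far larger, professionally managed lists.
BLOCKED_WORDS = {"penis"}

# Common character substitutions players use to slip past naive filters.
SUBSTITUTIONS = str.maketrans({"3": "e", "1": "i", "0": "o", "$": "s", "@": "a"})

def naive_filter(message: str) -> bool:
    """Return True if the message should be blocked (exact word matches only)."""
    words = re.findall(r"[a-z]+", message.lower())
    return any(w in BLOCKED_WORDS for w in words)

def normalizing_filter(message: str) -> bool:
    """Undo simple leetspeak and strip separators before matching."""
    text = message.lower().translate(SUBSTITUTIONS)
    collapsed = re.sub(r"[^a-z]", "", text)  # "p e n i s" -> "penis"
    return any(w in collapsed for w in BLOCKED_WORDS)

print(naive_filter("p3n1s"))            # False -- trivially evaded
print(normalizing_filter("p3n1s"))      # True  -- substitutions undone
print(normalizing_filter("p e n i s"))  # True  -- separators stripped
```

Of course, collapsing separators creates false positives of its own (the classic “Scunthorpe problem”), which is exactly the judgment-call territory where, as the people quoted here argue, human moderators still have to step in.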

Software (and the Need for People)

That said, software is getting better and, with any project of this scale, it’s vital. Newton mentioned that Mind Candy is working with Community Sift, a company that has partnered with Image Analyzer to offer image filtering.

“LEGO and Disney and many other kid content providers tried to solve the moderation issue with people, which would not and did not scale,” Newton explained. “You can’t throw hundreds of human moderators at a mountain of UGC and expect to keep every F word and penis pic out. That system hasn’t been effective since the late ’90s. I believe filtering/moderation software is not only cost-efficient, it’s non-judgmental and so much more efficient than the antiquated community watch system (pushing a button to report a problem). That’s a reactive system – where using software and AI is proactive, scalable, efficient and accurate in my experience. You always need the humans and you always need professional, experienced, moderators for judgment, context and the human touch. The human moderation system on its own, however, will never scale and is very expensive, so your ROI is shot to hell in 5 minutes.”

Software may be limited, but people are just as limited, which is why they need each other. For success to be possible, you must strike a balance, where automated solutions knock out massive quantities of simple, repetitive issues and people are reserved for more nuanced concerns.
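One way to picture that division of labor is a tiered triage queue: the software auto-resolves anything it scores with high confidence in either direction, and everything ambiguous lands in front of a person. This is a minimal sketch under my own assumptions (the thresholds, risk scores and class names are hypothetical, not any company’s actual design):

```python
from dataclasses import dataclass, field

@dataclass
class ModerationQueue:
    """Route content by classifier confidence; humans handle the middle."""
    approve_below: float = 0.2   # risk score under this: auto-approve
    reject_above: float = 0.9    # risk score over this: auto-reject
    human_queue: list = field(default_factory=list)

    def triage(self, item: str, risk_score: float) -> str:
        if risk_score < self.approve_below:
            return "auto-approved"
        if risk_score > self.reject_above:
            return "auto-rejected"
        # Ambiguous: a family pool photo and genuinely abusive content can
        # score similarly, so a human moderator makes the final call.
        self.human_queue.append(item)
        return "escalated to human moderator"

queue = ModerationQueue()
print(queue.triage("castle build #4812", 0.05))    # auto-approved
print(queue.triage("reported build #4813", 0.97))  # auto-rejected
print(queue.triage("odd statue #4814", 0.55))      # escalated to human moderator
```

The design choice the interviewees describe lives in those two thresholds: widen the middle band and your human costs balloon; narrow it and the software starts making the nuanced calls it is worst at.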

“Good technology supports speed and scale,” says Jenna Woodul, chief community officer at LiveWorld, where she leads the company’s client services, including moderation. “However, we rely on people in the final analysis. There’s nothing like human judgment to bring the sensitivity and context that software may miss or get wrong – like the difference between a family pool picture and child porn – or an innocent structure versus a phallus palace.”

If the goal is to be kid-friendly, then a substantial budget should be allocated to community, moderation and safety, because they are priority issues. It would be nice to be able to say “you can do this with software and it’ll be cheap!” But that’s not realistic. The quality of the experience on your service, which is tied directly to moderation efforts, is part of the value proposition.

Offensive vs. Unsafe

When we talk about something like a penis made out of LEGO pieces, we might be inclined to use the word “safety.” But it’s important to draw a distinction between offensive content and an unsafe situation. While an F word or a penis might be offensive to some, there probably is no real danger associated with it. It’s not a safety issue. When we label everything offensive as “unsafe,” we risk crying wolf and missing out on a situation where someone is truly in danger.

“The F word is most definitely offensive, but it is not unsafe,” Newton told me. “A picture of a penis might be offensive to some, but it is most definitely not unsafe. The statue of David is art, so it’s considered neither offensive nor unsafe. Penis pictures are to kids’ user-generated content (UGC), what cat content is to the internet. Prolific, at best.”

COPPA

I just wanted to touch on COPPA briefly because Fox mentioned it, and it really isn’t of consequence to this discussion. Fox herself said this later. Short for the Children’s Online Privacy Protection Act of 1998, COPPA is targeted at the information you collect online from people who are younger than 13 years of age. The 2013 update of COPPA – often referred to as COPPA 2.0 – placed stronger restrictions on companies, including game developers.

LEGO (Worlds) Now

LEGO Worlds has a lot of potential, and it could be a great success. The first try at something – in this case, LEGO Universe – isn’t usually the best. From what Fox and Newton have said, LEGO Universe provided a lot of lessons that can be applied to LEGO Worlds.

Though the LEGO penis story helps create a narrative for us to discuss these topics, these issues are really bigger than LEGO. This is a challenge that many platforms and game developers face. Even if they aren’t kid-friendly, they have to deal with abuse and safety issues. If they are kid-friendly, obviously, that takes it up a notch. But I don’t think it’s hopeless. It just takes the right leadership, a strong team and good technology partners.