LEGO® plans to increase its UK and Danish workforce to support the expansion of its digital services. In an interview with the BBC, CEO Niels Christiansen said that users upload a LEGO creation to the company's digital platforms every 2.77 seconds. The BBC also reported on the launch of LEGO VIDIYO™, a partnership with Universal Music that enables children to make and upload their own music videos.
LEGO’s desire to move with the times and build digital services and communities that appeal to children is understandable when over half of Britain’s ten-year-olds play games on Roblox, a platform valued at £33.8 billion. What struck me about this announcement is that LEGO will have to comply with impending legislation requiring platform operators to moderate what users upload to their digital platforms in order to prevent online harms. When we think about online content moderation and legislation, we tend to think of the more obvious social media giants.
As a company operating in the UK and Europe, LEGO’s digital services will fall within the scope of the UK Online Safety Bill and the EU Digital Services Act, which will require interactive website owners to provide the oversight necessary to ensure that content uploaded to their platforms does not harm other members of their online communities.
In October 2019, Image Analyzer participated in the UK’s inaugural content moderation symposium, where experts from government, academia, law enforcement and the technology sector discussed the growing responsibility of online gaming communities and interactive website operators to protect minors from online harms, including toxic images, grooming and cyberbullying.
The stay-at-home orders imposed during the pandemic have massively increased the number of minors playing online games. Games companies already have a moral obligation to safeguard their young customers; soon this will be a legal obligation too.