Ofcom noted that in its view, CSAM does include "AI-generated imagery, deepfakes and other manipulated media," which "would fall under the category of a 'pseudo-photograph.'" As Ofcom explained, "If the impression conveyed by a pseudo-photograph is that the person shown is a child, then the photo should be treated as showing a child."

Similarly, "manipulated images and videos such as deepfakes should be considered within the scope" of intimate image abuse, Ofcom said. "Any photograph or video which appears to depict an intimate situation" that a real person would not want publicly posted should "be treated as a photograph or video actually depicting such a situation."

Some Grok fans think that the chatbot's outputs undressing people and putting them in skimpy bikinis or underwear aren't abuse. However, the UK law further details that an "intimate situation" could be an image where a person's "genitals, buttocks, or breasts" are "covered only with underwear" or "covered only by clothing that is wet or otherwise transparent."

It's unclear how long Ofcom may take to reach its decision, but the regulator moved urgently to intervene. And UK officials who were shocked by the scandal have confirmed that they are quickly moving to protect people in the UK from being targeted by Grok's worst outputs.

While Ofcom does not directly refer to Musk's comments on censorship, the regulator takes a defensive stance in its announcement, likely preparing to counter X's argument by pointing out that X would be the one responsible for deciding what counts as illegal content and what should be removed. "The legal responsibility is on platforms to decide whether content breaks UK laws, and they can use our Illegal Content Judgements Guidance when making these decisions," Ofcom noted. "Ofcom is not a censor; we do not tell platforms which specific posts or accounts to take down."