On the face of it, Taylor Swift, Jenna Ortega, Alexandria Ocasio-Cortez, Giorgia Meloni and I don’t have a lot in common. But one thing unites us: we’ve all fallen victim to “deepfake” porn, in which AI is used to create sexually explicit material without consent.
By the time graphic pictures of Swift had been viewed some 45 million times, my colleagues on the Channel 4 News investigations team were already several weeks into a project researching the extent and impact of this new and insidious phenomenon. They’d set out to gauge how many women had been “deepfaked”, the damage this abuse causes, and what should be done to stop it.
They