Hundreds of AI apps and websites can now easily generate fake pornography from photos of anyone, including children. Three women in Minneapolis told CNBC how a close friend made explicit imagery of them using a “nudify” tool. Often marketed as simple face-swap apps, these tools are everywhere, from the Apple and Google app stores to ads on Facebook and Instagram. Creating nonconsensual fake nudes is legal, but a group of friends is now trying to change that. Warning: This story contains sensitive content.