I guess Grok, the official AI of Twitter/X, got an upgrade, because many people have been using it in ways that range from funny to creepy. You can take a picture of anyone and ask Grok to alter it so they're wearing different clothes. Make someone wear an outfit you'd normally see on a video-game character or such--perfectly innocent. Or have Grok take a picture of a woman and remove her clothes so it looks like she's wearing lingerie...hm.
Some women have uploaded suggestive pictures of themselves and then done the trick of tweeting at Grok to put them in an even more revealing outfit. These women are generally influencers promoting their OnlyFans accounts or such, and more power to them. They're consenting to their pictures being used in a sexual manner and telling the AI themselves to take a picture and make it saucy. I'm sex-positive and pro-sex work; that's their choice. It gets a lot weirder when a woman uploads a harmless picture of herself and a bunch of men start tweeting at Grok to alter the picture so she's in a bikini, or has different-colored hair and is licking ice cream, or some other variation the woman didn't consent to. This has been referred to as a "mass undressing spree," but instead of people literally ripping clothes off of women, they're using AI.
Some dudes have had their pictures tweaked in suggestive ways from what I can see, but this impacts women a good 99% of the time. Where it gets really uncomfortable and borderline illegal is when girls under 18 have their pictures altered to look suggestive. Grok won't make anything outright pornographic, but when a well-known actress whom everyone knows is underage has a picture tweaked so she's in extra-revealing clothes...well, maybe the users writing those prompts need a bit of extra scrutiny, because they're a hop, skip, and a jump from outright committing a crime.
How has Twitter/X responded to all of the news about this? They released a press statement saying, "Legacy media lies." Elon Musk himself seems relatively unbothered too. I mean, even if the media allegedly lies, the countless prompts demanding a woman's outfit be turned into a transparent bikini say it all. I swear, AI is only going to get "better" at faking images and videos, to the point where, in three or four years, we legitimately won't be able to tell whether a picture or video is real or was generated or tweaked by AI. At that point, life is going to get really scary. As it stands right now, continue (as always) to be careful what you put on social media. Not only could the real content be used against you, but now people can change something innocent into questionable content. The future is now, and the future is deepfaked beyond recognition.