You can finally tell when an image is an AI fake made by DALL-E

We've been talking about the problems fake AI photos can create for months, ever since it became clear that AI image generators would be able to produce pictures that are indistinguishable from reality. That's why I don't like how easy it is to use Google's AI photo-editing tools to alter your memories until they no longer resemble what you actually photographed. These tools can be used for malicious purposes, like faking photos of candidates during a big election year, and people who still don't realize how good AI imagery has become can be fooled easily.

Then, a few weeks ago, fake AI-generated pornographic images of Taylor Swift started popping up. I'm sure it was an incredibly painful episode for the beloved music star, but the flip side is that the world is now aware of just how realistic these fake AI images can be.

I'm not saying the Taylor Swift AI scandal is the reason OpenAI has just announced that it will watermark AI photos created with ChatGPT and DALL-E 3, or that Meta is taking a similar approach on its social platforms because of the explicit AI content that made the rounds in the past week. But it's all happening now, much later than it should have.

The post You can finally tell when an image is an AI fake made by DALL-E appeared first on BGR.