Kamala Harris rally crowds are not AI-generated. Here's how you can find out


Suffice it to say, this mountain of evidence from direct sources outweighs the annotated images posted by conservative commentators like Chuck Callesto and Dinesh D'Souza, both of whom have been caught spreading election disinformation in the past.

When it comes to accusations of AI tampering, the more disparate sources of information you have, the better. While a single source can easily generate a plausible picture of an event, multiple independent sources showing the same event from multiple angles are much less likely to be involved in the same hoax. Photos that match video evidence are even better, especially since creating long, convincing videos of humans or complex scenes remains a challenge for many AI tools.

It’s also important to track down the original source of any alleged AI images you view. It’s incredibly easy for a social media user to create an AI-generated image, pretend it’s from a news report or live footage of an event, and then use the obvious flaws of this false image as “proof” that the event itself was faked. Links to original images on an original source’s website or verified account are much more reliable than screenshots, which could come from anywhere (and/or be edited by anyone).
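One mechanical way to check whether a circulating copy of an image matches the claimed original is a file digest: identical bytes produce identical hashes, while any edit, crop, or re-encode changes the digest entirely. This is a minimal stdlib sketch (the file paths are hypothetical, and note that platforms routinely re-compress uploads, so a mismatch shows the files differ, not necessarily that either was faked):

```python
import hashlib

def file_digest(path, algo="sha256"):
    """Hash a file in chunks (memory-friendly for large images)."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# A matching digest proves the copy is byte-identical to the original;
# it says nothing about whether the original itself is authentic.
# file_digest("rally_original.jpg") == file_digest("rally_screenshot.jpg")
```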

Telltale signs

While finding original and/or corroborating sources is useful for a major news event like a presidential rally, confirming the authenticity of images and videos from a single source can be trickier. Tools like Winston AI Image Detector or IsItAI.com claim to use machine-learning models to determine whether an image is AI-generated. But as detection techniques continue to evolve, these tools rest on unproven theories whose reliability has not been demonstrated in any large-scale study, making false positives and false negatives a real risk.

Writing on LinkedIn, UC Berkeley professor Hany Farid cited two GetReal Labs models as showing “no evidence of AI generation” in the Harris rally photos released by Trump. Farid then cited specific parts of the image that speak to its authenticity.

“The text on the signs and on the plane shows none of the usual signs of generative AI,” Farid writes. “While the lack of evidence of manipulation does not prove the image is real, we find no evidence that this image is AI-generated or digitally altered.”

And even when parts of a photo look like obvious signs of AI manipulation (like the distorted hands produced by some AI image models), consider that there may be a simple explanation for an apparent optical illusion. The BBC notes that the missing reflection of the crowd on the plane in some photos of Harris’ rally could be explained by a large empty stretch of tarmac between the plane and the crowd, as seen in reverse angles of the scene. Simply circling odd-looking objects in a photo with a red marker is not, by itself, strong evidence of AI manipulation.
