The blue plague, 1970s Soviet Union, made by AI

The war that never was - when AI dreams up images that do not exist

Recently I came across an article on interestingengineering.com about AI being able to create images of events that never happened. While it might still be amusing to see what an arrest of Trump by the police would look like, or the Pope as a fashion icon, AI can, or could, be used to create images of things that do not exist in contexts that are dangerous for everyone: images of uprisings, armed conflicts and catastrophes.

Until a few weeks ago, one could be reasonably sure about pictures: if there is a picture, it is proof of a situation, a circumstance that a photographer has just captured. Although even here it must be said that a photo alone rarely shows the truth, only a perspective, namely that of the photographer.

But as AI systems like Midjourney become more and more powerful, the images they can produce, or imagine, are getting better and more realistic. I myself use Midjourney for post images when I want to show something that I could not find as a real image on the internet. But the images I use are nothing world-moving and only serve to replace a placeholder.

But by producing images that don't exist, and then rows and rows of them, scammers could raise funds for disasters like "The 2001 Great Cascadia 9.1 Earthquake & Tsunami - Pacific Coast of US/Canada". One has to look very closely to realize that these pictures never originated in reality but were generated. Former president George Bush can also be seen in them, which would let one place the event in time as well. But this catastrophe never happened.

And there is another danger: history can be altered with images as well. Cinematic shots from the Vietnam War and World War 2, new images, different perspectives, and you could already start to question history.

Take, for example, the blue plague in the 1970s in the Soviet Union.

And who has not yet heard of the UFO attack in WW2? Or that Kim Jong Un is travelling?

In the future, we will receive more and more images where we have to ask ourselves whether they really show reality, or whether they have been manipulated or entirely created by AI. I think that photo analysts will be in high demand, because their expertise will be needed when news portals have to verify the truth behind images, and with each new generation of image-generating AIs, that gets harder. Not for nothing do I practice Criticism of Generative AI.

I think a law requiring AI images to be marked is only a matter of time. Perhaps via a link on the image or a note in the respective post.
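To make that idea a little more concrete, here is a minimal sketch of how such a marking could look at the file level. This is purely my own illustration, not an existing standard: the file names and tag names are assumptions, and I use the Python library Pillow simply because it can write PNG metadata.

```python
# Minimal sketch: embedding an "AI-generated" note into a PNG's metadata.
# File names and tag names are hypothetical, chosen only for this example.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

img = Image.open("generated_post_image.png")

meta = PngInfo()
meta.add_text("ai-generated", "true")      # the actual marker
meta.add_text("generator", "Midjourney")   # which tool produced the image

img.save("generated_post_image_marked.png", pnginfo=meta)

# A news portal (or anyone else) could read the note back like this:
with Image.open("generated_post_image_marked.png") as marked:
    print(marked.text.get("ai-generated"))  # -> "true"
```

Of course, a marker like this can be stripped just as easily as it was added, so it could only ever complement human verification, never replace it.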

 



Posted by Petr Kirpeit

All articles are my personal opinion and are written in German. In order to offer English-speaking readers access to the article, they are automatically translated via DeepL. Facts and sources will be added where possible. Unless there is clear evidence, the respective article is considered to be my personal opinion at the time of publication. This opinion may change over time. Friends, partners, companies and others do not have to share this position.