Do you think an AI image detector is needed nowadays?

Posted by picki 10 hours ago


Comments

Comment by picki 10 hours ago

Hi everyone, I am just validating an idea: is a project like susmeter.ai actually needed at all? It detects AI/deepfake signatures in images and videos, and it does it for free. However, there is already so much AI-generated content that it doesn't even feel like such projects are needed... What do you think? The detector scans the signatures within files, and the accuracy is actually quite good, tbh.

Comment by minimaxir 10 hours ago

A reliable AI image detector would be useful, with "reliable" being the operative word. It is impossible to identify AI images reliably with current techniques.

"The detector scans the signatures within files" makes zero sense. I tried some knowingly AI-generated images and it only predicted a 55% chance they were AI, with the rationale being "Extremely uniform byte distribution - may indicate encrypted or random data" which also makes zero sense.

Comment by picki 10 hours ago

That's sweet actually - 55% is a high chance that it's AI-generated, tbh. I'm wondering what Gemini or another visually capable model would tell you about those images you tested.

Comment by rolph 9 hours ago

That end of the stick is kinda nasty - wash your hands and grasp the other end. You should detect "iamhuman" watermarks rather than AI signatures. You would be searching for opted-in humans who want to be known as human, which is much easier than searching for AI that wants to disguise itself as human, or forcing industry to let you reliably identify AI products.

Comment by picki 9 hours ago

That's OK - I am happy to work with that 'nasty' end. Generally, there is no such signature as 'iamhuman' in media files; however, there are plenty of AI/deepfake/edit signatures that can be traced (through simple metadata, SynthID, or per-pixel distributions). When one knows how to read the deep data, a file can immediately tell how it was 'created'. Interestingly enough, even screenshots of AI-generated images have a good detection rate in susmeter.
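For the "simple metadata" part of this, here is a rough sketch of what one such check could look like: several AI image tools leave text chunks in their PNG output (e.g. the Stable Diffusion web UI writes a "parameters" chunk; ComfyUI writes "prompt"/"workflow"). The key names and the helper below are illustrative assumptions, not susmeter's actual method, and absence of a chunk proves nothing since metadata is trivially stripped - it's a weak positive signal only:

```python
import struct

# Assumed keyword names left by common AI generators (illustrative only).
AI_HINT_KEYS = {b"parameters", b"prompt", b"workflow"}

def png_text_chunks(data: bytes):
    """Yield the keyword of each tEXt/iTXt chunk in a PNG byte string."""
    assert data[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG file"
    pos = 8
    while pos + 8 <= len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        if ctype in (b"tEXt", b"iTXt"):
            body = data[pos + 8:pos + 8 + length]
            # The keyword is null-terminated at the start of the chunk body.
            yield body.split(b"\x00", 1)[0]
        pos += 12 + length  # 4 length + 4 type + data + 4 CRC

def looks_ai_tagged(data: bytes) -> bool:
    """Weak heuristic: does any text chunk carry a known generator keyword?"""
    return any(k in AI_HINT_KEYS for k in png_text_chunks(data))
```

Pixel-level signals (SynthID, diffusion artifacts) are a much harder problem than this, but metadata checks like the above are cheap and explain why some files can be classified "immediately."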

Comment by rolph 9 hours ago

> there is no such signature as 'iamhuman' in media files

No official specification exists, but humans do irrational things.

e.g. Pee-wee's Playhouse had the secret word of the day, [X], and whenever a visitor invoked the secret variable of the day, the playhouse gang would crack up laughing and finger-pointing.

An unwritten tradition of giving alternate meaning and context to the secret word of the day was in effect.

A human can demonstrate humanity to other humans, in a subjective fashion, that AI will never deduce through logic.

When humans can reliably identify themselves to each other as human, all else that fails to be human is, by inductive logic, AI.

Of course there are fallacies to consider, but humans have workarounds.

Comment by picki 8 hours ago

I get the point you’re making about human self-identification and social signals. That’s an interesting alternative framing of the problem.

What I’m trying to validate at this stage isn’t the philosophical question of “what proves humanity,” but a much narrower one: "Is there demand for a tool that can probabilistically assess whether an image or video has been AI-generated or materially manipulated, using technical signals available today?"

For this idea validation I’m specifically testing whether probabilistic AI/media detection (imperfect as it is) solves a real problem for people today, without requiring new standards or behavior changes.