

When used in computational photography to fill in the gaps, AI might hallucinate things that aren’t there.
So it’s inserting things that “aren’t there” when it’s… filling in things that aren’t there? This is why “hallucinate” is such a problematic term: it obfuscates the fact that this is what these programs do with everything. They were designed from the start to make shit up. It’s not “hallucinating”; it’s fulfilling its core programming function.
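For anyone who wants to see what “filling in the gaps” actually looks like mechanically, here’s a minimal sketch of generative inpainting using the Hugging Face diffusers library. The checkpoint, file names, and prompt are my own illustrative choices, not anything from the quote above:

```python
# Minimal generative-inpainting sketch. Assumes diffusers, torch, and Pillow
# are installed; "photo.png" and "mask.png" are hypothetical file names.
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

# Load a publicly available inpainting checkpoint (illustrative choice).
pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting"
)

photo = Image.open("photo.png").convert("RGB")  # the original capture
mask = Image.open("mask.png").convert("RGB")    # white = region to generate

# The model never sees what was actually under the masked region. It samples
# pixels that are statistically plausible given the surrounding image and the
# prompt; any resemblance to what was really there is coincidental.
result = pipe(prompt="a quiet city street", image=photo, mask_image=mask).images[0]
result.save("filled.png")
```

The point being: the generation step is identical whether the output happens to match reality or not. There is no separate “hallucination mode”; synthesis is the mechanism.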
Bookmark it. Or download a copy to have on hand locally.