These days AI-generated images leave a meta trail that shows they're fake if you know what you're looking for, and as I'm fond of saying, this is why in journalism we always have (or used to have) a second independent verifiable source - in this case it would be other photos from a different source showing the same scene from a different angle.
Photos can be used, but they need an identification and evidence chain along with substantial other independent verification to be admissible; otherwise a single photo is at best circumstantial.
CCTV is different because you can investigate the tape/DVD/whatever - you can look for hacks, changes, overwrites, false footage overlaid on top and so on. That's why it's preferred.
Meta trail in that sense is probably the wrong phrase, but the core idea is the same - it's authentication, simply put, and it's really no different a process than the art world using provenance to check that an artwork is genuine.
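That provenance/authentication idea can be sketched in code - a minimal, hypothetical example (the file contents and workflow names here are made up for illustration): you fingerprint the footage when it first enters the evidence log, then any copy presented later has to match that fingerprint or the authentication fails.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest of the raw footage bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical workflow: hash the footage when it enters the evidence log...
original = b"frame-data-from-the-cctv-export"   # stand-in for real file bytes
logged_hash = fingerprint(original)

# ...then, when a copy is presented later, re-hash it and compare.
presented = b"frame-data-from-the-cctv-export"  # untampered copy
tampered = b"frame-data-with-overlaid-footage"  # doctored copy

print(fingerprint(presented) == logged_hash)  # True  - provenance holds
print(fingerprint(tampered) == logged_hash)   # False - authentication fails
```

Real forensic workflows do far more than this, of course, but the principle is the same as art provenance: an unbroken, checkable chain from origin to courtroom.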
It's like DNA: comparing alleles has improved massively over the years, further ruling out mistakes, and video/photo forensics are improving in just the same way.
And as for BB's point, the risk of manipulation existed long before Photoshop came along - it's deepfakes on the net people should be worried about (much like the above game), because they aren't scrutinised or questioned the way evidence in a court case would be, mate.