Man, the AI Bros shilling this stuff are really active on 4chan, apparently.
Part of the fun of watching stuff isn’t that it’s “customised to me”; it’s sharing an experience with the creator(s), friends, family, etc.
I see genAI being used as a tool for creators but not as an automation of content creation.
AI would be chronically incapable of implementing actually surprising plot twists that are both unexpected and consistent with the rest of the plot (and don’t somehow write someone back into existence). If it hadn’t been written before, an AI would never make Darth Vader Luke’s father unless specifically prompted, at which point, why even.
(I’ve just finished a hexalogy marathon, my head is full of Jedi.)
I don’t think everyone is into that link tho (/j)
This has already happened, many years ago. I know this because everyone but me is actually a highly sophisticated robot that resembles a member of my species. I’m onto you.
When I was a kid I had a theory that I was the only conscious being in the world, and that everyone else was some sort of robot.
I couldn’t share it with anyone, because obviously no one was real but me.
He figured it out. Time to shut it down.
deleted by creator
Finally. This iteration was starting to become weird anyway.
You can’t trick me, machine. You can’t convince me that I am the robot and you are the conscious one. It can’t be possible.
deleted by creator
Just swap security cameras back over to analog, problem solved for video evidence
You could just as easily burn a deepfake onto tape.
Shit
The only thing that I can come up with is tamper-proof cameras with a secure execution environment that cryptographically sign the recordings and physically trash themselves when the enclosure is opened.
Sounds better than anything I could think of
Wouldn’t be against that
I would put up with all the negatives if I could generate quality video games with a prompt.
Would.
That porn had to be trained on real people’s bodies, and those people will never see a penny of it. That’s laundered revenge porn.
We should get polaroids and analog film again
Video evidence is relatively easy to fix, you just need camera ICs to cryptographically sign their outputs. If the image/video is tampered with (or even re-encoded) the signature won’t match. As the private key is (hopefully!) stored securely in the hardware IC taking the photo/video, any generated images or videos can’t be signed by such a private key.
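Something like this, as a toy sketch in Python with the pyca/cryptography package (purely illustrative: in real hardware the private key would be generated inside the IC and never be readable):

```python
# Toy sketch of in-camera signing, using the pyca/cryptography package.
# In real hardware the private key would live inside the camera IC and
# never leave it; here it's just an in-memory key for illustration.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

camera_key = Ed25519PrivateKey.generate()   # would live inside the camera IC
camera_pub = camera_key.public_key()        # published so anyone can verify

video = b"...raw encoded frames..."
signature = camera_key.sign(video)          # done at capture time, in hardware

# Anyone can check the footage against the camera's public key.
# Any edit, or even a re-encode, changes the bytes and breaks this check.
try:
    camera_pub.verify(signature, video)
    print("signature matches: bytes are exactly what the camera produced")
except InvalidSignature:
    print("tampered, re-encoded, or generated: signature does not match")
```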
Wouldn’t this be as easy to break as pointing a camera at a screen playing whatever you want?
Perhaps not with light field cameras. But then you could probably tamper with the hardware somehow.
Getting the picture to perfectly replicate the image on the screen, without it being noticeable that it’s just a picture of a screen, would be so difficult that it would probably be easier to modify the camera instead.
So whatever way the camera output is being signed, what’s stopping you from signing an altered video with a private key of your own and then saying “you can all trust that my video is real because I have the private key for it”?
The doubters will have to concede that the video did indeed come from you because it pairs with your key, but why would anyone trust that the key came from the camera step instead of coming from the editing step?
You can enter the camera as evidence, and prove that it has been used for other footage. Each camera should have a unique key to be effective.
So if you create a new key, it won’t match the one on an existing camera. If you steal the key, then once that’s discovered, the camera should generate a new one.
But if you don’t actually check the physical camera and prove that key for yourself, then it can easily be faked: just generate a key that never came from any camera and use it to sign both the “proof” video and the fake video.
Any self-respecting judge would check, and hopefully most journalists would keep records of these things to prove where the footage came from.
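For what it’s worth, the usual fix for “how do I know this key came from a camera at all” is to have the manufacturer sign each camera’s public key at the factory. A hypothetical sketch, same Python library as in the signing example above, with made-up names:

```python
# Hypothetical sketch: each camera's unique public key is signed by the
# manufacturer at the factory, so a self-generated key can't impersonate one.
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey,
)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from cryptography.exceptions import InvalidSignature

maker_key = Ed25519PrivateKey.generate()      # manufacturer root, kept offline
maker_pub = maker_key.public_key()            # published, like a root CA

# At the factory: give this one camera its own key and certify it.
camera_key = Ed25519PrivateKey.generate()
camera_pub_bytes = camera_key.public_key().public_bytes(
    Encoding.Raw, PublicFormat.Raw
)
device_cert = maker_key.sign(camera_pub_bytes)  # "this key is one of our cameras"

def footage_checks_out(footage: bytes, footage_sig: bytes,
                       cam_pub_bytes: bytes, cert: bytes) -> bool:
    """Verify the whole story: footage -> camera key -> manufacturer."""
    try:
        maker_pub.verify(cert, cam_pub_bytes)   # key really left the factory
        cam_pub = Ed25519PublicKey.from_public_bytes(cam_pub_bytes)
        cam_pub.verify(footage_sig, footage)    # footage signed by that key
        return True
    except InvalidSignature:
        return False
```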
You, the end user, don’t have access to your camera’s private key. Only the camera IC does. When your phone / SD card first receives the image/video it’s already been signed by the hardware.
So you want the hardware to be significantly more opaque, and make it almost impossible for new manufacturers to compete?
It’s pretty standard practice these days to have some form of secure enclave on an SoC - Arm’s TrustZone, Intel’s SGX, AMD’s SME/SEV. This wouldn’t be any different. Many camera ICs already use an Arm CPU internally.
Mate, digital cinema uses this encryption/decryption method for KDMs.
The keys are tied to multiple physical hardware IDs, many of which (such as player/projector) are also married cryptographically. Any deviation along a massive chain and you get no content.
Those playback keys are produced from DKDMs that are insanely tightly controlled. The DKDM production itself even more so.
And that’s just to play a movie. This is proven tech, decades old. You’re not gonna break it with Premiere.
This is for restricting use, not proving the authenticity of the video recording. Anyone can spin up keys and sign videos, so in a legal battle it would be worthless.
The technology would be extremely easy to adapt, with the certs being tied to the original recording equipment’s hardware. Given I don’t see a $60 IP cam having a Dolphin board, it would probably be relegated to much higher-end equipment, but any modification with a new key would break the chain of veracity.
This is blatantly not true; it would be extremely simple to circumvent. How do you “tie” the cert to specific hardware without trusting manufacturers? You just can’t, it’s like putting a padlock on a pizza box.
I literally explained earlier how this exact technology is used in digital cinema, dude, c’mon.
That doesn’t mean it’s useful for forensics, IMO.
Edit: not saying it won’t be though, just that it’s not as bulletproof as you’d think, IMO.
As with everything, trust is required eventually. It’s more about reducing the amount of trust required than removing it entirely. It’s the same with HTTPS - website certificates only work if you trust the root certificate authorities, for example. Manufacturer root keys might only be certified once the manufacturer has passed some level of vetting by the root authority or authorities. Proving that trust is well-founded is more a physical issue than an algorithmic one; as with root CAs, it may involve physical cybersecurity audits, etc.
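To make the HTTPS parallel concrete, a toy chain walk might look like this (illustrative structure, not any real PKI format): the verifier pre-trusts only the root, and every other key has to be vouched for by its parent.

```python
# Toy trust-chain walk, mirroring how HTTPS certificates work: the verifier
# pre-trusts only the root authority; every other key must be vouched for
# by the key above it. Structure is illustrative, not a real PKI format.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
from cryptography.exceptions import InvalidSignature

def verify_chain(root_pub: Ed25519PublicKey,
                 chain: list[tuple[bytes, bytes]]) -> bool:
    """chain = [(pub_key_bytes, sig_by_parent), ...],
    ordered root authority -> manufacturer -> individual camera."""
    parent = root_pub
    for pub_bytes, sig in chain:
        try:
            parent.verify(sig, pub_bytes)   # parent vouches for this key
        except InvalidSignature:
            return False                    # one broken link sinks the chain
        parent = Ed25519PublicKey.from_public_bytes(pub_bytes)
    return True
```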
Yep, totally fair. It’s kind of crazy actually how we all trust that stuff, and when there’s a breach people just want to expire certificates more often etc.
I bet there is a better way, but as long as no one is paying, we’re stuck with this mess. I have programmed stuff with X.509 in the medical sector; what a trusty spaghetti mess that was, but once you finally got your cert, you could basically do whatever.
Sorry for the rant 😅 I just want to show people that even if the mathematics behind RSA is fantastic and secure, the human side is always there to break that 🤷🏼♀️.
But how would an ordinary member of the audience easily determine whether this whole chain is valid, when they don’t even understand how it works or what to look out for?
You’d need the public keys of trusted sources that browsers check automatically, but all the steps in between need to be trusted too. I can imagine it’s too much of a hassle for most.
But then again, that has always been the case for most.
This is just standard public key cryptography, we already do this for website certificates. Your browser puts a little lock icon next to the URL if it’s legit, or provides you with a big, full-page warning if something’s wrong with the cert.
I know, but since a physical, mobile object like a camera is involved, I imagine it’s much more vulnerable to man-in-the-middle attacks than today’s TLS certificates for sites. There are more moving parts / physical steps, and the camera is probably not always online.
But in essence you are right, operating the camera the same way as a server should be possible of course. We need some basic trusted authorities that are as trusted as we have for our current TLS certificates.
What it will prove is whether the video actually comes from a specific camera certificate. Not who owns the camera, whether it has been swapped, or whether the footage is real.
…what audience?
I mean the viewers of the video.
deleted by creator
Removed by mod
I personally doubt that will happen, since the current models require a lot of data to get better, something we actually don’t have. The real danger is what happens once we figure out how to make models without an absurd amount of data.
Hopefully never
well at the very least those models’ drawings might not be an assault on our collective eyes
Making models without a mountain of data is just engineering lol. That’s what we were doing before, are still doing, and will continue doing for the rest of our existence.
As well as that, the internet is less reliable since there’s a lot more botshit on it.
Most readers would gladly read AI slop instead of real literature, and thousands likely already do. Just look at how much brainrot genre fiction is pushed on “booktok” and the now-common practice of choosing books by tags only.
Also, for AI porn, it’s already all over /b/
Reading this made my eye twitch.
Just the kind of anthropomorphism a bot with no eyes would use 🤔
The last bullet is true even now. Just go on Threads or Bluesky. So many bots and scammers.
Actually, polls show that most people are not fond of AI-generated content and want it to be labelled or don’t want it at all.
As for generating your own entertainment at home, see interactive movies. They did not take off because people don’t want to be “working” for their entertainment. That’s their time to relax and not make decisions.
All in all, we’re not as careless as it may seem.
A fb group I moderate recently had an AI jammed into it. I ran a poll on whether to keep or disable it. “Get rid of it” got more votes than the option “Put a gimp mask on it and whore it out for grapefruit”.
Not to mention those interactive-movie games from the early ’90s, which also didn’t take off because they were sorely lacking in the game department.
I’d be very interested in these polls if you have some to link!
TIHI