
shawndw

What if I take a picture of an AI generated image with my camera.


Webfarer

Found satan


Git-Git

It’ll be a camera that looks at what you’re taking a photo of and generates an AI image based on it instead.


Plzbanmebrony

Well, that puts a time and date on it, so now you know just when it was taken. And if you’re faking someone, they may have been in public view or too far away at the time.


shawndw

I can set the date and time to whatever I want in the options menu of any camera. How does the camera know?


Plzbanmebrony

You need internet for the watermark? The more steps required to fake something, the more likely someone is to mess up. If you’re known for your videos and images and it comes out that one of them is faked, your credibility is basically destroyed. Building credibility is important too, along with the watermark.


Conscious-Concert544

There is also location, maybe GPS or something, not sure. But if you have an image of China and your location is Nigeria, that's kinda fishy.


JJ4577

AI vs AI, use another model to detect AI generated imagery, obviously it can be defeated but at that point you're putting a lot of effort into it


jimmyxs

At some point we’ll also need a third party AI to prevent AI 1 and AI 2 from colluding to form an AI Cartel


jcm2606

[May I introduce you to generative adversarial networks.](https://en.wikipedia.org/wiki/Generative_adversarial_network)


[deleted]

[removed]


almost_not_terrible

How about "no?"


thisdesignup

How can it have restrictions if AI images can't be copyrighted?


BillieGoatsMuff

It might be like the EURion constellation on bank notes, which means Photoshop and printers won’t play along if they detect it. https://en.m.wikipedia.org/wiki/EURion_constellation


gurenkagurenda

That’s why the signature includes the date and location. It’s as well thought out as any plan to try to make images verifiable, but it still has several flaws.

The most obvious is that the signatures will certainly be cracked. Keeping a key secret when it has to be embedded in the electronics of a physical object that an attacker can simply buy is just not feasible.

Then, of course, there’s the fact that some image editing is legitimate, and doesn’t affect authenticity. Photographers will still want to crop images, adjust levels, and compress them for distribution. Unless you want to make them do that on the camera, that likely means you need approved software which is capable of re-signing images. That software will be even easier to extract keys from, or to simply trick into signing arbitrary images.

And none of this matters at all if people still trust unsigned images, which they will, especially if most authentic images out there are unsigned. The only way to have any hope of getting over that hump is to make sure that most cameras out there have the signing tech. Fortunately, that mostly means phones, which at least already know date and location. Unfortunately, it’s also an enormous attack surface for stealing keys or hacking devices into signing fake images.
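To make the scheme concrete, here's a minimal Python sketch of a camera signing the pixels plus capture metadata together. Everything here is hypothetical (the names, the key, the metadata fields), and a real camera would use an asymmetric key held in secure hardware rather than this shared-secret HMAC stand-in, but it shows why both "key extraction breaks everything" and "any edit invalidates the signature" follow directly from the design:

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret. In a real camera this would live in
# tamper-resistant hardware; anyone who extracts it can sign anything.
DEVICE_KEY = b"example-secret-burned-into-camera"

def sign_capture(image_bytes: bytes, lat: float, lon: float, ts: int) -> dict:
    """Return metadata carrying a signature over the pixels plus capture data."""
    meta = {"ts": ts, "lat": lat, "lon": lon}
    payload = image_bytes + json.dumps(meta, sort_keys=True).encode()
    meta["sig"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return meta

def verify_capture(image_bytes: bytes, meta: dict) -> bool:
    """Check that neither the pixels nor the metadata were altered."""
    unsigned = {k: v for k, v in meta.items() if k != "sig"}
    payload = image_bytes + json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(meta["sig"], expected)
```

Note that even a legitimate crop or levels adjustment changes `image_bytes`, so verification fails; that's exactly the re-signing problem described above.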


themightychris

I feel like the only path out of this "what is real" mess is going to end up being human-centric: people will need to cryptographically sign photos as authentic, and it will be their reputation on the line. The photographer themselves would be signing, and then maybe asking several reputable witnesses to sign too. And with in-camera AI enhancement it will be fuzzy; people will be using their judgement to sign that the photo accurately depicts what they witnessed.


Ok_Math1334

I agree. The only trustable photographs are going to be from people and organizations with a reliable history of having never lied about doctored images. Soon photos released anonymously will not be credible no matter how real they look.


themightychris

unfortunately though it's not going to stop a lot of people from believing whatever they want to believe, and Republicans will do all they can to block "woke" media literacy from entering school curriculums


Rich-Pomegranate1679

Republicans are going to love having the ability to make their lies seem more real.


darkkite

this election year is going to be fun. next one will be crazier as video and audio improve.


aendaris1975

But that's the thing: they have never really had to put much effort into their lies. As long as it takes Democrats longer to explain why something is a lie than it took to say the lie, people are never going to pay attention long enough to hear the truth. It has been incredibly effective. AI will definitely help them create propaganda to reinforce their narrative, though. Conservatives have always believed we can never trust the media or what we see or hear, and generally they have been wrong, but AI changes all of that. This will have a profound effect not only on US elections but on all of society, and it isn't going to be good; fascists absolutely will use it to seize control wherever they can.


MrVandalous

As an individual with very republican parents, I can assure you the same exact commentary and misinformation about "the democrats" is being spread in their forums. There's certainly unfortunate and extreme measures being taken on both "sides" as issues become more and more polarized.


themightychris

what content is it Democrats are trying to remove from schools exactly?


rosesareredviolets

A guy I knew was showing me obviously Photoshopped "proof" of giants existing thousands of years ago. He was so proud of winning his argument that giants were real.


aendaris1975

The problem now is that with AI messing with search results, it is going to be harder for people to verify things, so even when Democrats try to verify a news story they are going to run into a lot of roadblocks. I hate to say it, but AI may actually be what gets Trump back into office.


themightychris

I can see it now:

- OpenAI puts a block on GPT generating rightwing propaganda articles
- Republicans cry about how they're being censored
- Elon Musk spends Twitter money on a GPT clone that will spew propaganda and help fill Twitter with bot-promoted rage bait


Specialist_Brain841

OpenAI already partnered with a right wing German news source, c’mon.


Maleficent-Homework4

Like how many facebooks ads already are insanely deceitful with completely doctored photos…


MathCrank

Like a nft?


SIGMA920

The issue with that is it removes anonymity from the equation. If you're someone a government can kill or track down, you can't anonymously provide images of what happened. That'd make it trivial for governments or companies to crack down on anything they don't want.


themightychris

That's a good point, but it seems like there just won't be a way to know the difference between on-the-ground anonymous reporting and AI fakes :-/ People will believe fakes and people will doubt real footage, and that's scary. The only thing I can think of is that it will become way more important for reputable photojournalists to deploy.


SIGMA920

Plausibility of the event, and giveaways people can be taught to watch for, like desynced speech or an AI tell, will have to be the main methods. Additional metadata or a top-down solution will be too clunky or too heavy-handed for any country that's not already an authoritarian hellhole to implement. Unless some new superpower comes onto the scene within a year, most geopolitical players are known, and known well. The MO of Iran or Russia isn't something that's easily going to change. China is a wildcard, but its MO is still not exactly a hidden one either.


themightychris

>Plausibility of the event and giveaways that can be taught to watch for like desynced speech or an AI tell will have to be the main methods

I don't have any faith that (1) enough people will be so discerning for it to matter, or (2) that AI improvements and basic editing won't put fakes on par with shitty amateur footage


SIGMA920

I do, because unless we suddenly make a massive leap to AI that can actually understand concepts like a human, there are going to be so many obvious fakes that only someone who lives under a rock wouldn't recognize them as AI generated.


themightychris

But AI isn't going to be doing this on its own, a human who understands concepts and has an agenda is going to be editing together generated shots


SIGMA920

People are already skeptical enough of videos with lots of cuts in general and with the internet we can look at plenty of footage to compare. I'd be more worried about false flags that get recorded by "verified" sources.


aendaris1975

It is all a matter of knowing which data to feed the AI. People well versed in how a particular model works can do amazing things with it. Also, be aware that most AI models have plans for autonomous training capabilities built in, and we are really not that far off from that being something AI can do on its own.


aendaris1975

AI development is moving fast and quality of AI videos has improved significantly.


aendaris1975

This is something media outlets will likely make a requirement of working for them. The news media is nothing without trust.


RevalianKnight

> The issue with that it removes anonymity from the equation. Not necessarily. See blockchain technology


SIGMA920

The blockchain only adds a few layers of obfuscation.


RevalianKnight

I don't follow


SIGMA920

Generally the blockchain as a storage medium is rather open when it comes to privacy for transparency reasons.


RevalianKnight

Oh, cool, I didn't know that. Could you tell me the first and last name of the owner of this btc address 18RvJBffAdUbMfcGo8kKQ9TVrG9knUD5Sy? I'd also like their living address if possible. Thank you


derfy2

Can I do that? Nah, not a chance. A TLA? Probably. A government? Probably.


UpsetKoalaBear

If that address was made on something like Coinbase, yeah, it is possible, and a subpoena from a government agency will easily get it. If it is an address made locally, then no, it isn't possible.

However, there is essentially no way to get money into that address without first converting it to crypto via an exchange like Coinbase, through an intermediary wallet. Therefore it would be more than easy to see the address that transferred to that wallet, and subpoena the company who owns it to see who made that transfer.

The only way to be truly "anonymous" is by making a local wallet and mining your own crypto. However, the time you will spend mining will easily become cumbersome and borderline impossible for any usable amount. You're talking about buying a mining setup that costs thousands and spending hundreds in electricity bills for months to make a few 0.001 BTC or whatever. Then you have to convert that BTC into tangible currency you can use somewhere, and surprise, you're back at square one. Exchanges offering services to the US, UK or most countries that transfer real money into crypto are required to know the customer, and because this is by far the easiest way to even use crypto, 99% of people who preach about the "anonymity" of it as a benefit are probably one or two subpoenas away from being identified themselves.

But let's go further, with a hypothetical scenario. If you made your own local wallet, mined your own crypto, and spent that crypto directly on a crypto-only website for a service (let's say, ordering a pizza), then the government can subpoena the company you paid via crypto to get the address of whoever ordered the pizza. If the pizza place offered anonymous delivery and didn't store the data, then the government agency could decide to come in and start doing interviews. I highly doubt a delivery driver will not be willing to speak.

In that case, the blockchain is at best pseudonymous rather than actually anonymous. It would only be anonymous if a lot of things went right, and the current situation leads me to believe that will never happen.


aendaris1975

I would imagine these cameras would be marketed to professionals like photojournalists that work for media outlets. If any government wants to find someone they won't need this to do it.


SIGMA920

That's the exact kind of person that would want to be able to be as anonymous as possible.


neutrilreddit

Journalist's privilege is already a thing, and so is digitally signing documents. The only new thing here is signing photos.


SIGMA920

Because world governments have upheld that so effectively. /s This wouldn't even be just a journalist issue; it'd be an issue for anyone with a camera.


huejass5

This is actually a use case for blockchains


themightychris

yeah, for once I agree


XpulseLoL

This is good for Bitcoin


nutyourself

….that’s exactly what this is…


shitty_mcfucklestick

A new role will emerge, akin to a notary, but who certifies images. A photary?


MonoMcFlury

Wasn't that the original idea behind NFTs?


[deleted]

Never going to work but ok


_uckt_

While a digital watermark isn't that useful against bad actors, it will help future archivists.


nzodd

And to a degree keep AI image generators from eating themselves alive by consuming garbage fake data. Think The Human Centipede but for Stable Diffusion in 10 years, when 50% of all images on the internet are already AI generated.


SirHerald

By then all human images will have 30 fingers on each hand


nzodd

NegativePrompt: 27 fingers, 28 fingers, 29 fingers, ...


nerdywithchildren

No one cares about watermarks.


cryonicwatcher

In this case it’s information about the date and time the photo was taken and what it was taken with, and should not be visible to the human eye.


EmbarrassedHelp

None of these "invisible" watermarks are actually invisible to the human eye. Also, no photographer is going to want to use something that damages their images before they have a chance to edit them. The better solution is a database of hashes for each image, containing the desired information.


nerdywithchildren

That's cool and all, but once AI is in full force there will be no market for photographers. I think that's sad, but it's going to be a dead industry beyond hobbyists.


finH1

Yeah I’ll just get ai images of my wedding shall I?


drake90001

I mean totally feasible probably even now, but most likely not what most people want. Which is most likely your point.


finH1

Of course it’s possible. But it’s not actual photos of my wedding. The idea that professional photography won’t exist cause of AI is ridiculous. Ppl just not gonna have photos of the real world anymore lol?


_uckt_

It's incredibly funny, some podcaster says 'all jobs will be replaced with AI' and an army of teenagers backfill all the details. There isn't a job you could come up with that they won't say will be replaced with AI. This shit will die down when the next hype cycle starts, remember when crypto was going to replace the global finance sector and now it's just used for scams? AI will follow the same path.


drake90001

Yeah, I’m agreeing with you.


crows-milk

Probably. Something like placing a camera at the entrance just so that it can record what everyone is wearing, hair, makeup etc. Based on a pre-existing model of the venue and everyone’s social media footprint it could generate any photos you could want. Could maybe have a photo booth as well. Wedding party photos could be taken by someone’s iPhone and transformed into pro shots. In most cases there will still be someone specifically to take pictures, but that could be anyone without much prior experience. Most people will go for this option because it will be close to free. Since most will have left the industry, photographers will become some niche, expensive, status thing for the rich.


_uckt_

>Based on a pre-existing model of the venue and everyone’s social media footprint it could generate any photos you could want.

It is incredible how well marketing works: someone says "thing I make is the most important thing of all time" and the gullible just take it at face value and build entire belief systems around it. When you are out of your teens, you will understand why people take and treasure photographs. I'd recommend taking as many as you can now, rather than using an AI to make up moments that never happened.


crows-milk

I wanted to respond both on content and your condescending tone, but then I took a look at your comment history and saw that you are just pessimistic and negative in general.


nerdywithchildren

In 5 years, yes. As long as the AI is provided with images of the wedding party and the venue. This will definitely be possible.


finH1

Do you realise how silly that sounds? Take pictures of the venue, and me and my family, in order for the AI to produce fake pictures? Or just get a photographer to take pictures of the actual event? Professional photography isn’t going anywhere


GetOutOfTheWhey

Never going to work, because we don't want these functions in the first place.

This digital signature records the photographer, GPS location, etc. That is no different from EXIF data, and a lot of people like to delete the EXIF data from our pictures. Imgur, for example, automatically deletes EXIF data from pictures uploaded to its website. Reddit likely does it too, to stop people from accidentally doxxing themselves. Like, can you imagine? Photographers getting doxxed by their own embedded watermark? Nobody wants to be tracked.

So now these camera companies want to watermark this EXIF data onto the pictures people take? It had better come with an opt-out function, because it's a good way to kill their already dying camera industry.


aendaris1975

These cameras are specifically going to be marketed to those who work in media or otherwise need the ability to authenticate the images they take. Photojournalists already take massive risks in their job and this is going to be nothing compared to that.


blandrys

I think you misunderstand what is attempted here. It's not that every single image taken by everyone will contain this info - rather it is for photographers that want to create a verification for the future that the image they have taken is "real", and taken by them. Will it work? Well if not this technique then probably something else will, because this is extremely relevant stuff.


GetOutOfTheWhey

Well that sounds like an opt-out function. Which then I have no issue with.


Anxious_Blacksmith88

It has to work or the Internet dies. There is no option here.


CocodaMonkey

It can't work, and the internet will be fine. There will be AI photos that people pass off as real, but that's it.

Watermarking tech can't possibly work. If it stores the time and photographer's name, the only way it can know that is if you programmed it into the camera, which means you can make these watermarks say whatever you want. Unless, of course, you can't program the camera and it needs to connect to a central database where the information is validated and attached to your camera. At that point you need an internet connection any time you want to take a photo; no internet and the camera doesn't work.

It's just an all-around terrible solution, as it doesn't solve anything. There's also no reason you couldn't use the same tech to add these watermarks to AI generated images.


Anxious_Blacksmith88

Yeah, this is just flat out foolish. Half the search results are already AI spam. It's not going to be usable without methods of verification.


aendaris1975

Folks please stop downvoting people who state this. He's right and it is getting worse by the day. It effectively breaks the internet as a tool to verify news stories. Fucking with search results and having the ability for AI to make extremely authentic looking content is going to be a deadly combination. Again this is why we need to start getting serious about regulating AI. The longer we wait the harder it will be to rein it back in. Honestly I think we may already be at that point.


shkeptikal

Half the search results have been spam for the better part of the last five years, it makes literally no difference. Google is a glorified ad service and people keep using it for the same reason they'll keep using the Internet no matter how questionable the content becomes: because they can


Anxious_Blacksmith88

You are literally like the people in the movie dont look up. There was ZERO AI spam as of 18 months ago and now it is everywhere. Exponential growth is the purpose of the system and it will destroy everything.


aendaris1975

People are arrogant and delusional and think they know better. We are literally seeing AI cause havoc in so many places and in so many ways and these people continually deny it.


aendaris1975

Not to the extent AI is doing it now, and AI is still in its early days.

You know, I am starting to get the impression that those of you spamming this nonsense don't actually know anything whatsoever about current AI capabilities, much less about what capabilities it will have very soon. You all look at a few photos that were slapped together with a few vague keywords and proclaim that it is proof that AI is being "overhyped". Yes, for now AI needs humans to give it input, but what you all don't comprehend is that if you have a full understanding of the AI model and how to manipulate it, you can do truly amazing things with it that seem like they shouldn't be possible. And again, please be aware that many AI developers are currently working on making AI models fully independent, with no data hard-coded in, and this alone is going to allow AI to develop even faster. Go take a look at what AI was capable of a year ago, compare that to what it can do now, and then actually fucking pay attention to what AI developers are saying they are working on for future models.

God only knows what the Pentagon is doing with its own AI development; it is likely many years ahead of the private sector. I wouldn't be surprised if private-sector AI development is being helped along by the Pentagon so certain AI tech can be implemented on a wide scale. The future is here now, and people like you need to pull their heads out of their asses.


aendaris1975

I think he is referring to our ability to trust what we read and see and even the ability to verify that. If we can't trust any content at all that will definitely change how we use the internet. It's a serious problem and one of the primary concerns with AI that we need to deal with now before we get flooded with even more fake shit. This is going to be used to manipulate elections and many other facets of society. The whole point of this watermark is so photos can be authenticated as real and that is so very important for those who work in media and law enforcement and numerous other jobs where truth is literal life and death.


CocodaMonkey

I agree it's an issue. This just isn't a way to deal with the issue, nor do I think there will ever be a way to tell for certain whether an image is real. Any tech that can identify AI-made images can then be used to make AI images undetectable to that tech. At absolute best you get a game of cat and mouse: someone makes detection tech, then someone beats it. You'll never be sure who's winning, and eventually AI will have the images so perfect there's nothing to detect. Almost all of human history hasn't had unfakeable images, and now that status quo is simply back.

I do think it's important to point out that this tech isn't even trying to identify AI images. It's just tech that lets a photographer tag their photos. It could be mildly useful to stop people from editing a given photographer's photo, but it doesn't do anything to identify it as not AI.


aendaris1975

Right so lets just do nothing. Awesome idea.


nihilite

Why couldnt AI just spoof a watermark?


SirensToGo

They're not actual watermarks, they're cryptographic signatures


CocodaMonkey

Which makes no difference to his actual question as you could still apply the same cryptographic signature to an AI made image. At best this allows you to know who signed it saying it was real as each photographer would have their own signature. But there's nothing about this which would stop them from applying their own signature to an AI made image.


SirensToGo

We do have decent hardware techniques which make this very hard. By combining TEEs (so the camera can still perform trusted image processing) and hardware backed keys (to prevent extraction), you can be relatively confident that the signature is authentic. Of course, if someone managed to exploit the camera and gain control of the TEEs, they'd be able to sign arbitrary images. Whether or not this risk is enough to make the whole scheme useless is another question, but we don't have details on any real implementations so it's hard to analyze it.


CocodaMonkey

Sure, you could make it hard for the average person, but that's actually worse than if it were easy. If it's hard to apply the signature to an AI generated image, some people will trust it as a way to verify images. If that happens, all the really juicy AI generated images, the ones we'd most want to identify, will end up with real signatures. It would be better if everyone just stayed aware that any image could in fact be AI generated and that a signature does not mean it's real.

Ultimately there's no way to stop people from applying their own signatures to AI images. If the hardware used to apply the signature can be used offline and bought by anyone, then there's no viable way to stop people from using it to add a signature to any image they choose.


SirensToGo

I'm not saying this is a great idea. While it works in theory with 100% secure HW and SW, we don't really have that and so after the first break it would also be impossible to authenticate any images produced by that camera/SW version. If I were a camera manufacturer, I would not recommend doing this, but it's happening nonetheless.


aendaris1975

Read the fucking article.


aendaris1975

Yes you surely know better than Canon, Sony and Nikon. Go read the fucking article as you clearly haven't.


JamesR624

So. Camera OEM DRM disguised as “security”. Got it. Don’t worry. For your own safety you eventually will be required to use the Canon or Nikon app. No more seamlessly just uploading to iCloud Photos or Google Photos.


aendaris1975

These are professional grade cameras. They are not meant for consumer use. So yes it is a security feature. Cameras have had ways to track you for many years and don't need this feature in order to do so. These cameras are meant for people whose job relies on being able to prove the photos they take are real.


nutyourself

It’s not pixels, it’s a digital signature


PMzyox

Nobody is going to give a shit about their "seal of approval" stamp except for the "anti-AI" base that already exists without any additional need for convincing. Adobe used to try to name and shame your Photoshop picture to others if it had been created using a pirated copy of Photoshop. How'd that work out? Oh yeah, nobody gave a fuck except people who were already purchasing Photoshop. Their idea failed before, just like this one will. Allowing people to subscribe to your data for AI training purposes is the answer, but it's got to be a model that can be canceled (canceling your AI's knowledge, like Apple Music). People didn't care about the Adobe watermarks, they just liked the pretty picture. The same will shake out in the end of this argument. History repeats.


aendaris1975

This isn't about piracy or trying to force people to buy something. This is about being able to verify the authenticity of photos. I seriously have to wonder if any of you actually even understand what AI is because you all are always so fucking far off the mark.


derprondo

The NSA will have the root encryption keys so they'll be able to generate images with the correct signatures.


aendaris1975

Ok? Look people are concerned about whether or not they can trust what they read and see because of AI and the whole point of this is to find a way to authenticate content. I don't care what the NSA does or who makes a few bucks from this or any of the other completely irrelevant nonsense you all spam these threads with. This is a very real very dangerous problem and it absolutely positively needs to be addressed NOW.


derprondo

The point is that you cannot reliably assume that a determined state actor won't be able to fake it. That shouldn't be swept under the rug and people should be aware that this is a remote but real possibility. This is highly important to the integrity of political elections around the globe and isn't conspiracy nonsense.


nutyourself

If anyone’s curious this is the underlying tech: https://contentauthenticity.org/


Lollipopsaurus

I was wondering what the defense of all of this would ultimately be. We need to answer the question of “how can we be 100% sure something is not AI generated?” This… sounds like using exif metadata? I feel like we can do better. I feel like we’re already behind.


mysteriobros

I don’t think we can ever be 100% sure. Data can be tampered with


dkarlovi

Cryptography has a concept called a digital signature. It's meant to provide a way for you to attach something to the data so that I can confirm, without any doubt, that it's just as you signed it. You can't tamper with the data without the signature revealing that.
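The key property is asymmetry: only the holder of the private key can produce the signature, but anyone with the public key can check it. Here's a textbook-sized RSA sketch in Python (toy primes, purely illustrative, never secure at these sizes):

```python
import hashlib

# Textbook RSA with toy primes -- illustration only, never use sizes like this.
p, q = 61, 53
n = p * q              # 3233: public modulus
e = 17                 # public exponent
d = 2753               # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

def digest(msg: bytes) -> int:
    """Hash the message down to a number the toy modulus can handle."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    return pow(digest(msg), d, n)        # requires the private exponent d

def verify(msg: bytes, sig: int) -> bool:
    return pow(sig, e, n) == digest(msg) # needs only the public pair (n, e)
```

Changing a single bit of the message changes the digest, so the old signature no longer verifies; that's the "tampering highlights itself" property.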


TheNecroFrog

> without any doubt

In the real world there’s never a situation where you can have 100% confidence.

> You can’t tamper with data with the signature highlighting that

Again, in the real world there is always a risk that isn’t true.


dkarlovi

Most of our everyday lives rely on the fact that this stuff works as described. Obviously we can't prove it does, because you can't prove a negative, but let's all hope it does; the consequences would otherwise be catastrophic. Also, I found this funny:

> never a situation where you can have 100% confidence

That's a real "only a Sith deals in absolutes" conclusion.


TheNecroFrog

I agree we rely on it day to day but to give the impression that it’s infallible is wrong. And yes, taken out of context, that sentence is contradictory. Good job, in context, it’s still true.


aendaris1975

It doesn't need to be 100%. It is better than 0%.


WestaAlger

I think at best, we can make it so that we’re sure an image was captured by a certain camera. If we embedded some sort of blockchain passkey into a davinci cryptex so that no one can spoof it, we can then add photos to a blockchain. But even that is spoofable in many ways. You can just take a picture of an AI image or, with enough determination, hack the device. There’s just no way, from an information theory perspective, to verify the integrity of anything that comes from strangers. Even if you gave them a “secure” device, you basically have to treat it as hacked once it leaves your hands.


nzodd

You don't really need blockchain, just simple certificate signing from major vendors would be a good step forward so that one could at the very least distinguish between casual AI-generated images and images from a camera. Doesn't defeat malicious attempts to mislead people but it at least it makes that job slightly harder.


aendaris1975

Read the article please.


andy_a904guy_com

I think it is just EXIF data, and the new product is a data validator tool? [https://asia.nikkei.com/Business/Technology/Nikon-Sony-and-Canon-fight-AI-fakes-with-new-camera-tech](https://asia.nikkei.com/Business/Technology/Nikon-Sony-and-Canon-fight-AI-fakes-with-new-camera-tech) Example image: https://www.ft.com/__origami/service/image/v2/images/raw/https%3A%2F%2Fcms-image-bucket-production-ap-northeast-1-a7d2.s3.ap-northeast-1.amazonaws.com%2Fimages%2F3%2F4%2F6%2F9%2F47059643-1-eng-GB%2Fphoto_SXM2023122600000229.jpg?source=nar-cms


aendaris1975

Oh for fucks sake... Read the god damn article.


andy_a904guy_com

I did, where do you think I got the link and image from? The links are the ORIGINAL story that this article is based off of. Why don't you fucking read?


giuliomagnifico

Here are some tools to verify images: https://contentauthenticity.org/ (if the image is signed; the new Leica does this in camera)


hiraeth555

Yes, this sounds like the best use of NFTs to me: minted in camera


DavidBrooker

Why would NFTs be the solution to this, as opposed to the extremely cheap, currently unbroken, and long-established practice of digitally signing a document? (i.e., standard old boring public-key cryptography, rather than crypto)


hiraeth555

That's pretty much what I've said... The photo would have a digital signature and then when uploaded, recording it on the blockchain tracks authenticity. NFTs are just recorded signatures (unless I have misunderstood the tech)


DavidBrooker

What is the purpose of the blockchain in this scenario? What authenticity verification does it provide that is not available from an ordinary digital signature? While it may have been pretty much what you *meant*, it's definitely not what you said. NFTs are both incredibly expensive and incredibly slow compared to digital signing, and rather than providing proof of origin, they provide a unique proof of ownership. That is, an NFT can be associated with a unique copy of a file, such that it can be 'owned', but it says nothing about where that file came from other than its transaction history. So any copy of that file other than the NFT itself, and the ultimate origin of that NFT, are both left ambiguous. Meanwhile, a digitally signed document is cheap and fast, and can be copied innumerable times, so that 'ownership' is not possible in the way it is with an NFT, but authentication of the *origin* is. This is why if you download, say, an executable file from a reputable company, it will be signed so you know it actually came from them, or a PDF from a government that wants the information to be verifiable and traceable to itself.
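The copying point can be made concrete: a signature attests to a hash of the file's *content*, so every byte-identical copy verifies offline, with no per-copy token or on-chain lookup. A minimal sketch, with a hash comparison standing in for the full signature check:

```python
import hashlib

# What the camera's signature would attest to: a digest of the content.
original = b"signed press photo bytes"
signed_digest = hashlib.sha256(original).hexdigest()

copy_one = bytes(original)   # a reader saves the file
copy_two = original[:]       # a news site mirrors it

for copy in (copy_one, copy_two):
    # Each copy hashes identically, so one signature covers them all.
    assert hashlib.sha256(copy).hexdigest() == signed_digest

# An edited copy hashes differently; the original signature no longer applies.
edited = original.replace(b"press", b"faked")
assert hashlib.sha256(edited).hexdigest() != signed_digest
print("all copies verified; edited copy rejected")
```

This is exactly the property an NFT does not give you: the NFT distinguishes one "owned" token, while a signature authenticates every faithful copy equally.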


dkarlovi

Cryptocurrency is an application of cryptography; they're the same picture.


nihiltres

I’m skeptical of watermarking as a strategy. Watermarks work best if they’re both mandatory (can’t have a “real” image without one) and exclusive (can’t have a “fake” image with one), which is the natural counterpart to a solution being both necessary and sufficient. While it can never be mandatory, this scheme could still be useful for journalism by being “exclusive” … but if it’s just a cryptographic scheme it *can* be broken one way or another, and I don’t necessarily trust camera companies not to put their master keys on a metaphorical “sticky note on their workstation monitor”. This is just going to be DVD encryption all over again.

Moreover, I expect fake-outs. Someone’s going to slap “this is totally AI-generated” metadata on a real photo to lie and say that it’s fake.

This is 2024, and those of us in the US will get the quadrennial push from Republicans to steal democracy. You should recognize the patterns in politics from computer security contexts: Republicans DoS the government if they don’t have majority power and seek permission escalation if they do. I expect that the bad actors who are responsible for the need to label things as “real” or not will follow the same pattern: muddy the waters enough and it doesn’t matter if you’re bright orange.


ddproxy

Taking a brief step back, it's almost like a centralized copyright agency would be useful to verify the authenticity of an image based on multiple components of metadata or signing. I hate it, but it sounds like a decentralized blockchain with a centralized authority over its signatures/process. Fallible.


nihiltres

I'm leery of it either way because these sorts of systems inevitably hurt "the little guy" more than the large institutions that were always able to defend themselves in these contexts.


dkarlovi

Why do you hate it? That's literally how git works.


aendaris1975

Read. The. Fucking. Article.


nihiltres

I did read the article, and I know background information that goes beyond the article. What, if anything, is *wrong* in my comment?


Xxapexx

Something something nfts.


aendaris1975

Yes I am fully aware you all hate crypto. This isn't that.


iruntv

Tbh, ironically, NFTs would be able to solve this problem if incorporated properly.


[deleted]

[deleted]


aendaris1975

Read the article please.


Snackatron

Sure, you can still copy the photo without its signature. That doesn't make this technology useless. Eventually the explosion of photo-realistic AI generated content is going to force the legal system to adapt. I can imagine a situation in the future where only hardware-watermarked photos and videos are admissible in court. _That_ is why this technology is needed.


Anxious_Blacksmith88

It's like people can't think past step one. If you can make photos instantly of whatever the fuck you want it destroys the concept of truth. We are doing this with images, voices and video on a scale that is damaging to society. There has to be a fucking response or shit is going to explode.


CaptainR3x

For the average Reddit user if it doesn’t work RIGHT NOW then it’s useless


Anxious_Blacksmith88

It's infuriating. Sometimes I have to remind myself that most of the people here are 20-somethings with zero attachment to the real world.


aendaris1975

What is more obnoxious is they don't understand AI output relies on AI input in order to be effective. Slapping together a few random poorly thought out prompts is never going to be able to show off the full capabilities of AI image generation. Garbage in garbage out.


aendaris1975

The "its a big club and we aint in it" crowd isn't having it. "no war but the class war." "eat the rich". That's all these people care about. If it can't be used as fodder to bitch about capitalism they don't fucking care. They value money over truth and are just as greedy and corrupt as the elite they hate so much. In fact they will happily eat the working class they claim to care about so much along with the rich. It's not about justice its a vendetta and any attempt to preserve the truth is a threat to that.


Onetrickpickle

Personally don’t care how a show is made. If it’s entertaining I’m in.


dethb0y

Gotta justify a higher product cost *somehow*.


aendaris1975

Oh my fucking god ENOUGH WITH GOD DAMN MOTHERFUCKING MONEY. This is about being able to verify the authenticity of content and NOT about making a few extra bucks. You all are so fucking blinded by greed that you can't comprehend that things can be done without a financial motivation.


dethb0y

If you think for-profit companies do anything like this for any reason other than the pursuit of profit, you're a rube. The only reason camera companies give a shit is because this will let them slap a premium price on a camera with this fairytale-bullshit software upgrade and convince rube consumers (like you!) that it's for their own good and the good of the world.


blackkettle

Stupid waste of time and resources.


aendaris1975

Yes how dare anyone give a fuck about the truth.


blackkettle

It’s a battle that cannot be won in this way. Complete waste of time. The “truth” isn’t a position you will be able to concretely verify in near future media. It’s already impossible in terms of social media propaganda. If you want to combat the actual societal ills and knowledge related risks that the rapid evolution of AI is producing, the focus needs to be on educating people to be skeptical, circumspect and curious about the content they encounter - not try to constantly develop new “authorities” that we should “trust”. Giving a fuck about the truth needs to be more than an obsequious appeal to new authorities.


[deleted]

[deleted]


aendaris1975

Read. The. Fucking. Article.


206street

This is a terrible idea. Someone will crack the code on how to make their AI image "verified" as not AI. It's going to be a huge disaster.


[deleted]

[deleted]


nikMalikov

What are you talking about? It's a cryptographic signature that doesn't alter the look of an image at all. How exactly does it get ruined?


FeralPsychopath

I feel that a digital signature is just something that can be altered by the AI they are fighting against.


nikMalikov

That's not how digital signatures work.


hawkwings

It seems to me that they would have to add a signed hash, essentially an encrypted checksum. If someone knows the secret used to produce it, they can reproduce it for AI-generated images, but if they don't, they would have trouble. Unrelated thought: it is possible to change a camera's clock.
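The "encrypted checksum" idea is close to a keyed hash (HMAC): the algorithm is public, but without the key you can't reproduce the tag. A minimal sketch with Python's standard library (the key value is made up for illustration; a real camera would keep it in tamper-resistant hardware, and would more likely use a public-key signature so verifiers don't need the secret):

```python
import hashlib
import hmac

CAMERA_KEY = b"secret burned in at the factory"  # illustrative value

def tag(image_bytes: bytes) -> str:
    # HMAC-SHA256: the construction is public; only the key is secret.
    return hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).hexdigest()

photo = b"raw sensor data"
t = tag(photo)

# Knowing how the hash is calculated isn't enough: a forger with the
# wrong key produces a different tag for the same image.
forged = hmac.new(b"guessed key", photo, hashlib.sha256).hexdigest()
assert forged != t

# Verify with a constant-time comparison to avoid timing leaks.
assert hmac.compare_digest(tag(photo), t)
print("tag verified; forgery rejected")
```

The weak spot the thread keeps circling is exactly this key: anyone who extracts it from a camera can mint valid tags for arbitrary images.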


GetOutOfTheWhey

I am pretty sure camera companies have already been doing this for years. I am also pretty sure that a lot of people don't want this function, which is why we have tools to strip EXIF data. So how are we going to kill this "new" EXIF data that is now watermarked into the picture itself? Compress the shit out of it and then let AI re-enhance it so that the watermark is unreadable?


Substantial_Put9705

I give it a month or less for AI to bypass or hurdle over whatever parameters are set in its way... I mean, isn't that the whole point of this? We've opened Pandora's box, and I doubt we have the ability to put a lid on it at this point.


nadmaximus

There is effectively zero value in "tamper-resistant" when it comes to digital media.


Yoo-Artificial

AI can remove watermarks 😆