I write code for a living. I certainly complain when I find a bug in one of my dependencies.
By Code do you mean VSCode? I use it all the time with VIM key bindings. It offers so much more than VIM with less finicky configuration. It’s the first IDE I’ve ever actually liked. Before now it was VIM or nothing.
At one point I had a plugin for MS Word that added vim key bindings because I kept leaving stray vim commands while editing other people’s documents.
Do you have recommendations for tutorials on this?
For raster imagery (and probably vector) I recommend imagemagick.
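As a quick illustration of the kind of thing ImageMagick does from the command line (file names here are placeholders; on ImageMagick 7 the `convert` command is renamed `magick`):

```shell
# Generate a small white test image, then downscale it to 50%
convert -size 100x100 xc:white input.png
convert input.png -resize 50% output.png

# Inspect the result's dimensions
identify -format "%wx%h\n" output.png
```

Format conversion is driven by the output file's extension (e.g. writing `output.jpg` instead of `output.png`), which makes it handy for batch scripting.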
They currently live in the US. So, whether they properly answered your prompt seems to depend on your definition of “your country”.
Wow, $2 for a pack of cigarettes. I’m glad it’s not that cheap here. It would have made it harder for me to quit!
I’m fairly sure this is only the debt accrued due to US legal losses plus interest.
What kind of laws are on the books in the UK relating to biometric data?
It would be nice to enact similar laws in the US but I doubt it will ever happen…
I don’t understand why companies who commit blatant fraud like this aren’t required to disgorge all fraudulently earned money. If someone defrauds banks they get fined based on their earnings in a way that hurts. If someone defrauds consumers for “tens of millions of dollars” they are only fined $16M.
Well, actually I do understand, I just don’t like it and don’t like what it says about this country’s priorities.
I generally agree with you but that is still a fuzzy line to draw that is likely very dependent on circumstances. The devil is in the details.
I still disagree that there is a clear line. Yes, it is obvious that photo grain is different from making you look like a human head on a shark’s body. The problem is somewhere in the middle. Determining where that line is drawn is going to be difficult and is only going to become more difficult as this technology advances.
Add to that the fact that our brains run software that doesn’t even try to faithfully store images, and you have part of the reason that photos are, currently, more reliable than eyewitnesses. That may be changing, though.
Our brains are natural intelligence and perform natural learning. The results are even less reliable, predictable, and repeatable than the results provided by artificial intelligence.
Yeah, you’re right. It still scares me somewhat, though. What happens when courts fall behind and continue to rely on photo evidence after it becomes easy for anyone to fake? What happens when the courts finally do realize that photos are unreliable?
I don’t think this change can or should be stopped. It is just worrisome, and thought should be put into how to mitigate the problems it will inevitably cause.
It’s not just the sensors though. The software used to convert what the sensors saw into an image makes decisions. Those decisions are sometimes simple and sometimes complex. Sometimes they are the result of machine learning and might already be considered to be AI. This is just another step in the direction of less faithfulness in photos.
I disagree. It’s not that easy to draw a line.
First, current cameras that we consider to not use AI still manipulate images beyond just attempting to approximate the scene. They may not allow easy face swapping but they still don’t faithfully represent the scene much of the time.
Also, I don’t even think it is clear where we can draw a line between “normal” algorithms and “AI” algorithms. What level of machine learning is required before we consider an algorithm to be AI?
Simple non-AI algorithms and generative AI are on a spectrum of complexity. They aren’t discrete from one another such that they can be easily categorized.
A Polaroid is the best representation that can be made of a scene on Polaroid photo film. The lens, the paper, and other factors will always make the representation, to a degree, not real. That was the Samsung exec’s point. It’s a little disingenuous, though. The discussion shouldn’t be about “real” vs “fake” it should be about “faithful” vs “misleading”.
The statement that “There is no such thing as a real picture” isn’t wrong. It kind of misses the point, though. It’s true that, even when a photo attempts to make the most faithful representation possible, it can only approximate what it sees. The sensors used all have flaws and idiosyncrasies, and the software that processes the images makes different decisions in different situations to give a good image. Trying to draw a line between a “real” picture and a “fake” picture is like trying to define where the beach ends and where the ocean begins. The line can be drawn in many places for many reasons.
That said, the editing that the S24 is going to allow may be going a little far in the direction of “fake”, from the sounds of things. I’m not sure if that is good or bad, but it does scare me that photos can’t really be relied upon to give an accurate representation of a scene anymore. Everyone having access to this kind of AI is going to make it tremendously difficult to distinguish between realistic and misleading images.
So, just an FYI, I bought Eufy cameras because I believed their marketing bullshit about being secure and end-to-end encrypted. About two months later they changed how they describe their security and quietly modified their privacy policy. Turns out they’re not really end-to-end encrypted and it is possible to gain access to the streams sometimes.
My recommendation, after doing my research, is not to buy anything that can be viewed remotely. Buy something that stores the video locally, in your home. If possible, buy and install wired cameras.
FWIW the US is claiming that the researcher had confidential information from Los Alamos National Labs against the terms of his NDA. They claim the researcher admitted to taking the information and attempting to conceal it.
I honestly hope this is the explanation. If we’re starting to deny entry to the country simply due to criticisms of domestic policy decisions, we’re going down yet another dark path.