Always easier when you can avoid the law and just buy it off the shelf. It’s fine to do this, we say, because it’s not being done by the government - but if they’re allowed to turn around and buy it we’re much worse off.
digiown 50 minutes ago [-]
That's why it doesn't make sense to ban governments from doing things while still allowing private companies. Either it is illegal to surveil the public for everyone, or the government can always do it indirectly with the same effect.
I don't think the deal described here is even that egregious. It's basically a labeled data scrape. Any entity capable of training these LLMs are able to do this.
asveikau 26 minutes ago [-]
The difference is that a government can take personal liberty away from people in the most direct way. A private company can't decide to lock somebody away in prison or send them to death row. (Hopefully anyway.) So we put a higher standard on government.
That said, I do believe there ought to be more restrictions on private use of these technologies.
pixl97 8 minutes ago [-]
>A private company can't decide to lock somebody away in prison or send them to death row.
A private company can 100% do this in many ways. They already do this by putting up and using their technology in minority areas, for example.
unethical_ban 56 seconds ago [-]
It's a distinction. Private companies are partnering with the government to take away personal liberty.
We should ban the government from accessing data gathered by private companies by default, perhaps. I need to mull on it.
helterskelter 14 minutes ago [-]
Yeah but these companies are operating hand in glove with govt such that there's no discernible difference between the current system and government just doing it themselves. Ban it outright.
kristopolous 10 minutes ago [-]
The separation between private and the government is purely theatrics - a mere administrative shell.
I really don't understand why people treat it with such sacrosanct reverence.
It reminds me of a cup and ball street scam. Opportunistic people move things around and there's a choir of true believers who think there's some sacred principles of separation to uphold as they defend the ornamental labels as if they're some divine decree.
I mean come on. Know when you're getting played.
CGMthrowaway 27 minutes ago [-]
Or we could just restrict funding for the governments/agencies/projects which are doing bad things (power of the purse, line item veto, sunset clauses, etc)...without hampering legitimate private enterprise. Instead we renew the NDAA every year.
(downvotes, keep em coming. I thought defunding ICE was more popular)
plagiarist 5 minutes ago [-]
Facial recognition is not a legitimate private enterprise. It is a complete failure of legislation that it is allowed to exist.
throwaway894345 21 minutes ago [-]
I would much rather have a democratically elected and constitutionally constrained government than private enterprise with limitless power. It would also be helpful if the “government is bad” people would stop electing the people who seek to sabotage the government.
duped 9 minutes ago [-]
This is why we should shun the people that build this stuff. If you take a paycheck to enable fascism, you're a bad person and should be unwelcome in polite society.
givemeethekeys 5 minutes ago [-]
How long before they bring the price down and local PDs start using it too?
yababa_y 1 hours ago [-]
local laws forbidding facial recognition tech have never been wiser
There are certain people who believe that average citizens can be held responsible for the actions of their government, to the point that they are valid military targets.
Well, if that's true then employees of the companies that build the tools for all this to happen can also be held responsible, no?
I'm actually an optimist and believe there will come a time when a whole lot of people will deny ever working for Palantir, for Clearview, and so on.
What you, as a software engineer, help build has an impact on the world. These things couldn't exist if people didn't create and maintain them. I really hope people who work at these companies consider what they're helping to accomplish.
the_gastropod 20 minutes ago [-]
I never worked at a company that could broadly be considered unethical, I don't think. But it was always a bit disheartening how many little obviously unethical decisions (e.g., advertised monthly plans with a small print "annual contract" and cancellation fee) almost every other employee would just go along with implementing, no pushback whatsoever. I don't know what it is, but your average employee seemingly sees themselves as wholly separate from the work they're paid to do.
I have friends who are otherwise extremely progressive people, who I think are genuinely good people, who worked for Palantir for many years. The cognitive dissonance they must've dealt with...
throw-qqqqq 14 minutes ago [-]
> I don't know what it is, but your average employee seemingly sees themselves as wholly separate from the work they're paid to do.
Hannah Arendt coined the term “the banality of evil”. Many people think they are just following orders without reflecting on their actions.
neuroelectron 1 hours ago [-]
Don't we already have facial recognition technology that isn't based on AI? Why is throwing AI into the mix suddenly a reasonable product? Liability waivers?
dylan604 1 hours ago [-]
I think the facial rec systems you're thinking of will recognize faces, but not ID them. They need you to label a face, and then it recognizes that face with a name from there on. Clearview is different in that you can provide it an unknown face and it returns a name. Whether it's just some ML based AI vs an LLM, it's still under the AI umbrella technically.
lazide 51 minutes ago [-]
Uh no? Facial recognition to names has been the bread and butter of facial recognition since the beginning. It’s literally the point.
dylan604 45 minutes ago [-]
There are plenty of facial rec systems. I'm thinking of systems like iOS Photos, or any of the other similar photo library systems. I think pretty much everyone would be freaked out if they started IDing people in your local libraries.
joering2 5 minutes ago [-]
Unsure what you mean by starting to ID? The majority of businesses in the US do it already. All banks use facial recognition to know who comes through their door (a friend who works in IT at Bank of America told me they implemented it across all Florida branches sometime in 2009), as do most large chain gas stations, car rentals, most hotels, etc. I was recently booted out of a Mazda dealership in Florida because 11 years ago in Georgia I sued a Toyota dealership over a lemon sale, and now they're both under the same ownership and my name came up on a "no business" alert when I entered their offices.
porridgeraisin 36 minutes ago [-]
Note that there is no difference in the model or in the training. The only thing needed to convert iOS Photos into one that IDs people is access to a database mapping names to images. The IDing part is done after the "AI" part; it's just a dot product.
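The dot-product step described above can be sketched in a few lines. This is a toy illustration with made-up names and tiny vectors, assuming the face model outputs unit-normalized embeddings (so a dot product equals cosine similarity):

```python
import numpy as np

# Hypothetical database mapping names to unit-length face embeddings,
# as produced by some off-the-shelf face-recognition model.
db = {
    "alice": np.array([0.6, 0.8, 0.0]),
    "bob":   np.array([0.0, 0.6, 0.8]),
}

def identify(query, threshold=0.7):
    """Return the best-matching name, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, emb in db.items():
        score = float(query @ emb)  # dot product = cosine similarity here
        if score > best_score:
            best_name, best_score = name, score
    return best_name

print(identify(np.array([0.6, 0.8, 0.0])))  # prints "alice"
print(identify(np.array([1.0, 0.0, 0.0])))  # prints None: no confident match
```

The same model with no database at hand can only cluster faces; it is the lookup table that turns clustering into identification.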
lazide 41 minutes ago [-]
Huh? What relevance does that have with the discussion?
porridgeraisin 51 minutes ago [-]
After the literal first one from the 1960s, which just measured the distance between nose and mouth and the like, everything else has been based on AI.
If my memory serves, we had PCA- and LDA-based ones in the 90s, and then in the 2000s a lot of hand-crafted AdaBoost classifiers and (non-AI) SIFT features. This is where 3D sensors proved useful, and it is the basis for all sci-fi portrayals of facial recognition (a surface depth map drawn on the face).
In the 2010s, when deep learning became feasible, facial recognition, like the rest of AI, moved to end-to-end neural networks. This is what is used to this day. It is pretty much the first iteration to work reliably regardless of lighting, angle, and so on. [1]
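The 90s-era PCA approach mentioned above ("eigenfaces") is simple enough to sketch end to end. This is a toy on synthetic data, not a real pipeline: "faces" here are random flattened 8x8 images, and identification is nearest neighbor in the PCA subspace:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for 20 enrolled face images, flattened to 64 pixels each.
faces = rng.normal(size=(20, 64))

# PCA via SVD: center the data, keep the top 5 components ("eigenfaces").
mean = faces.mean(axis=0)
_, _, vt = np.linalg.svd(faces - mean, full_matrices=False)
components = vt[:5]

def project(img):
    # Coordinates of an image in eigenface space.
    return (img - mean) @ components.T

gallery = project(faces)  # projected enrolled faces

def nearest(img):
    """Identify a probe image by nearest neighbor in eigenface space."""
    dists = np.linalg.norm(gallery - project(img), axis=1)
    return int(np.argmin(dists))

# A slightly noisy copy of face 3 should still come back as face 3.
probe = faces[3] + 0.05 * rng.normal(size=64)
print(nearest(probe))  # prints 3
```

The deep-learning systems that replaced this swap the hand-derived linear projection for a learned nonlinear embedding, but the match-by-distance step at the end is essentially unchanged.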
Note about the terms AI, ML, Signal processing:
In any given era:
- whatever data-fitting/function approximation method is the latest one is typically called AI.
- the previous generation one is called ML
- the really old now boring ones are called signal processing
Sometimes the calling-it-ML stage is skipped.
[1] All data-fitting methods are only as good as the data. Most of these were initially trained on Caucasian faces, so many of them were not as good for other people. These days the ones deployed by Google Photos and the like of course work for other races as well, but many models don't.
mschuster91 2 hours ago [-]
And this right here is why Clearview (and others) should have been torn apart back when they first appeared on stage.
I 'member people who warned about something like this having the potential to be abused for/by the government, we were ridiculed at best, and look where we are now, a couple of years later.
gostsamo 1 hours ago [-]
"This cannot happen here" should be classified as a logical fallacy.
dylan604 59 minutes ago [-]
As stated in many of the comments in my code, where some else branch claims "this shouldn't be happening".
josefritzishere 1 hours ago [-]
Skynet. "You only postponed it. Judgment Day is inevitable."
lenerdenator 1 hours ago [-]
Wear a face mask in public. Got it.
estebank 53 minutes ago [-]
I think anything short of fully obscuring your face (à la ICE agent/stormtrooper) will be merely a mitigation and not 100% successful. I recall articles about face recognition being used "successfully" on people wearing surgical masks in China. In the US they ask you to remove face masks in places where face recognition is used (at the border, TSA checkpoints), but I would be unsurprised if that isn't strictly needed in most cases (asking people to remove them preemptively just ends up being faster for throughput).
quantified 49 minutes ago [-]
Probably room to add little cheek pads or other shape-shifters under the mask.
verdverm 29 minutes ago [-]
You have to change how you walk and sound as well.
lotsofpulp 21 minutes ago [-]
99.9% of people walk around with an electronic device that identifies them. If a particular person doesn’t, it should be trivial to filter out all the people that it couldn’t have been, leaving only a small list of possible people.
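The filtering described above is essentially set subtraction: start from everyone plausibly present, remove everyone whose device places them elsewhere. A minimal sketch with invented identifiers:

```python
# Everyone plausibly present at the scene, per camera footage (hypothetical IDs).
candidates = {"p1", "p2", "p3", "p4", "p5"}

# People whose devices pinged towers elsewhere at the same time,
# so it could not have been them.
elsewhere = {"p2", "p4", "p5"}

suspects = candidates - elsewhere
print(sorted(suspects))  # prints ['p1', 'p3']
```

The point of the comment is that carrying no device doesn't help much: the absence is itself distinguishing once everyone else has been subtracted out.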
dylan604 59 minutes ago [-]
Aren't we back to where this is illegal again, unless you're an ICE agent?
lenerdenator 55 minutes ago [-]
"Hey man, doctor's orders. Gotta wear it to get allergy relief. And no, can't ask about it... HIPAA stuff."
hackingonempty 25 minutes ago [-]
It is not a good idea to lie to an employee of the USA: https://www.law.cornell.edu/uscode/text/18/1001
Sadly, I'm sure that will go over "not well" with ICE agents who will happily assault you for carrying a phone...
seanw444 19 minutes ago [-]
I disagree with the shooting too, but this is such a massive oversimplification of the event.
dylan604 32 minutes ago [-]
"I'll show you mine if you show me yours"
OutOfHere 1 hours ago [-]
We need a Constitutional amendment that guarantees a complete right to anonymity at every level: financial, vehicular, travel, etc. This means the government must not take any steps to identify a person or link databases identifying people until there has been a documented crime where the person is a suspect.
Only if an anonymous person or their property is caught in a criminal act may the respective identity be investigated. This should be sufficient to ensure justice. Moreover, the evidence corresponding to the criminal act must be subject to a post-hoc judicial review for the justifiability of the conducted investigation.
Unfortunately for us, the day we stopped updating the Constitution is the day it all started going downhill.
quantified 46 minutes ago [-]
Maybe. Anonymity is where bad actors play. Better to have stronger disclosure and de-anonymization in some cases. If some live in fear (e.g. of cartels), go after the cartels harder than they go after you.
OutOfHere 13 minutes ago [-]
> Anonymity is where bad actors play
That is a myth spread by control freaks and power seekers. Yes, bad actors prefer anonymity, but the quoted statement is intended to mislead and deceive because good actors can also prefer strong anonymity. These good actors probably even outnumber bad ones by 10:1. To turn it around, deanonymization is where the bad actors play.
Also, anonymity can be nuanced. For example, vehicles can still have license plates, but the government would be banned from tracking them in any way until a crime has been committed by a vehicle.
_3u10 1 hours ago [-]
That will be wildly unpopular with both parties and, most importantly, their constituents. I doubt even the Libertarian Party, should they win the presidency, the House, and the Senate, could pull it off.
OutOfHere 1 hours ago [-]
Note that the Amendment would apply only to the government, not to private interests. Even so, it could be unpopular among advertisers and data resellers, e.g. Clearview, who sell to the government. I guess these are what qualify as constituents these days. The people themselves have long been forgotten as constituents.
catlover76 1 hours ago [-]
[dead]
comrade1234 2 hours ago [-]
"You’ve read your last free article."
I don't think I've read a Wired article since 2002...
j45 1 hours ago [-]
Wired still seems to write some good pieces.
toomuchtodo 1 hours ago [-]
I subscribe to keep the reporting going. Journalism costs money.
Most Americans don’t pay for news and don’t think they need to - https://news.ycombinator.com/item?id=46982633 - February 2026
(ProPublica, 404media, APM Marketplace, Associated Press, Vox, Block Club Chicago, Climate Town, Tampa Bay Times, etc get my journalism dollars as well)
laweijfmvo 55 minutes ago [-]
are you using a vpn or something like that that might look like “you” have read wired articles?
charcircuit 1 hours ago [-]
Having AI-assisted law enforcement will be a big force for applying the law evenly. Law enforcement has limited resources, so giving them a force multiplier will help clean up a lot of issues that were thought to be impossible to enforce before.
runako 1 hours ago [-]
This is exactly, precisely the opposite of what the impact will be.
For example:
- every technology has false positives. False positives here will mean 4th Amendment violations and will add an undue burden on people who share physical characteristics with those in the training data. (This is the updated "fits the description.")
- this technology will predictably be used to enable dragnets in particular areas. Those areas will not necessarily be chosen on any rational basis.
- this is all predictable because we have watched the War on Drugs for 3 generations. We have all seen how it was a tactical militaristic problem in cities and became a health concern/addiction issues problem when enforced in rural areas. There is approximately zero chance this technology becomes the first use of law enforcement that applies laws evenly.
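The false-positive concern in the first bullet above can be made concrete with base-rate arithmetic. All the numbers here are illustrative assumptions, not measured figures for any real system:

```python
# Illustrative base-rate arithmetic for a face-recognition dragnet.
population = 1_000_000        # faces scanned
actual_matches = 10           # people genuinely on the watchlist
false_positive_rate = 0.001   # 99.9% specificity, optimistic for field conditions
true_positive_rate = 0.99

false_alarms = (population - actual_matches) * false_positive_rate
hits = actual_matches * true_positive_rate
precision = hits / (hits + false_alarms)

print(round(false_alarms), round(precision, 4))  # prints 1000 0.0098
```

Even with an optimistic 99.9%-accurate system, roughly a thousand innocent people get flagged for every ten real matches, so about 99% of stops triggered by the system would be of the wrong person.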
aunty_helen 1 hours ago [-]
Same could be said about the computer systems that have been developed in the last 20 years. But that hasn’t happened…
rhcom2 1 hours ago [-]
The targets for the AI are still set by humans, the data the AI was trained on is still created by humans. Involving a computer in the system doesn't magically make it less biased.
charcircuit 1 hours ago [-]
That is true for now, but eventually it should be possible for it to be more autonomous without needing humans to set its target.
Refreeze5224 50 minutes ago [-]
Not only is this incredibly naive, it misses that whole "consent of the governed" thing. I don't want AI involved in policing. They are bad enough and have so little accountability without "computer says so" to fall back on. That's all AI will do: make a bad situation worse.
monknomo 1 hours ago [-]
Are you sure it won't enable targeted enforcement of people law enforcement finds irritating, rather than evenly applied law? It's still people setting the priorities and exercising discretion about charging.
charcircuit 1 hours ago [-]
It should be easier to audit since you would have a list of who broke the law, but action had not been taken yet.
monknomo 14 minutes ago [-]
Do you think the records of the vast number of police departments and agencies could be combined with the separate court records, as well as the facial recognition access logs (if they exist)?
I think that is pretty unlikely
HPsquared 1 hours ago [-]
I wonder how many laws and sentencing guidelines etc are formulated with an implicit assumption that most of the time, people aren't caught.
cucumber3732842 1 hours ago [-]
In my estimation all of the criminal ones and at least half of the civil ones.
charcircuit 1 hours ago [-]
I think it will reveal unfair laws and as a society we will have to rebalance things that had such an assumption in place.
iLoveOncall 1 hours ago [-]
Meanwhile, all AI face recognition software works poorly on non-Caucasians.
dylan604 58 minutes ago [-]
With this administration, I think that is a feature not a bug
mrguyorama 1 hours ago [-]
None of the destruction of your rights has led to an improvement in clearance rates.
Crimes aren't solved, despite having a literal panopticon. This view is just false.
Cops are choosing to not do their job. Giving them free access to all private information hasn't fixed that.
charcircuit 1 hours ago [-]
Then cops should be taken out of the core law enforcement agentic loop. There could be a new role of people who the AI dispatches instead to do law enforcement work in the real world.
Refreeze5224 49 minutes ago [-]
I think you fundamentally misunderstand what the role of the police is. They protect property, the owning class, and the status quo. Laws are just a tool for them to do that. Equal justice for all is not a goal for them, and AI will not provide more of it.
mindslight 1 hours ago [-]
Why do you write so many low-effort, disingenuous, inflammatory comments? They're "not even wrong", yet they just suck energy right out of productive discussion as people inevitably respond to one part of your broken framing, and then they're off to the races arguing about nonsense.
The main problem with the law not being applied evenly is structural - how do you get the people tasked with enforcing the law to enforce the law against their own ingroup? "AI" and the surveillance society will not solve this, rather they are making it ten times worse.
charcircuit 30 minutes ago [-]
I want to share my opinion even if I know that it may not be a popular one on HN. I am not trying to maximize my reputation by always posting what I believe will get the most upvotes, but instead I prioritize sharing my opinion.
>people inevitably respond to one part of your broken framing, and then they're off to the races arguing about nonsense.
I agree that this is unproductive. When people have two very different viewpoints, it is hard for that gap to be bridged. I don't want to lay out my entire worldview and argue from first principles, because it would take too much time and I doubt anyone would read it. Call it low effort if you want, but at least discussions don't turn into a collection of a single belief.
>how do you get the people tasked with enforcing the law to enforce the law against their own ingroup?
Ultimately law enforcement is responsible to the people, so if the people don't want it, it will be hard to change. In regard to avoiding ingroup preference, it would be worth coming up with ways of auditing cases that are not being looked into and having AI try to find patterns in what is causing it. The summaries of these patterns could be made public to allow voters and other officials to react to such information and apply needed changes to the system.
Ar-Curunir 1 hours ago [-]
LE has been getting increasingly advanced technology over the years. The only thing that’s increased is their ability to repress and oppress.
Go lick boots elsewhere.