Ban the sale of precise geolocation
Posted by hn_acker 8 hours ago
Comments
Comment by Johnbot 8 hours ago
Comment by rockskon 7 hours ago
It's a rhetorical fiction the ad industry tells itself.
Comment by Terr_ 5 hours ago
Comment by mapt 4 hours ago
Comment by wafflemaker 3 hours ago
Comment by breppp 13 minutes ago
Comment by abustamam 36 minutes ago
Comment by Forgeties79 7 hours ago
Edit: It's a rhetorical fiction the ad industry tells us.
Comment by teraflop 6 hours ago
https://arxiv.org/abs/cs/0610105
If movie ratings are vulnerable to pattern-matching from noisy external sources, then it should be obvious that location data is enormously more vulnerable.
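The linked paper is Narayanan and Shmatikov's de-anonymization of the Netflix Prize dataset, which scores each "anonymized" record against a handful of noisy external observations. A toy sketch of that idea, with invented data and a much cruder scoring rule than the paper's actual algorithm:

```python
# Illustrative linkage attack: match a few noisy public ratings
# against an "anonymized" ratings database. All data is invented.
anon_db = {
    "user_17": {"MovieA": 5, "MovieB": 2, "MovieC": 4, "MovieD": 1},
    "user_42": {"MovieA": 3, "MovieB": 5, "MovieD": 5},
    "user_99": {"MovieA": 5, "MovieB": 1, "MovieC": 2, "MovieD": 1},
}

# Ratings the target posted publicly elsewhere, with +/-1 star noise:
public_trace = {"MovieA": 4, "MovieC": 4, "MovieD": 1}

def score(record, trace):
    """Count approximately matching ratings (within one star)."""
    return sum(
        1 for movie, stars in trace.items()
        if movie in record and abs(record[movie] - stars) <= 1
    )

best = max(anon_db, key=lambda uid: score(anon_db[uid], public_trace))
print(best)  # -> user_17: three noisy ratings suffice to re-identify
```

With location data the same attack only gets easier, since places visited are far sparser and more distinctive than movie ratings.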
Comment by vovanidze 7 hours ago
Waiting for legislation or EULAs to fix this is a lost cause, since adtech always finds a loophole. The fix has to be architectural: moving toward stateless proxies that strip device identifiers at the edge, before they even hit upstream servers. If the payload never touches a persistent DB, there is literally nothing to de-anonymize. Stateless infra is the only sane way forward.
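The "strip identifiers at the edge" idea above boils down to a scrubbing step applied to each request before it is forwarded upstream. A minimal sketch; the header names below are illustrative, not a complete list of identifying fields:

```python
# Minimal sketch of an edge scrubber: drop identifying headers
# before a request is forwarded upstream. If the proxy keeps no
# state and the identifiers never leave it, there is nothing
# persistent to de-anonymize. Header names are illustrative.
IDENTIFYING_KEYS = {
    "x-device-id", "x-advertising-id", "cookie",
    "authorization", "x-forwarded-for",
}

def scrub(headers: dict) -> dict:
    """Return a copy of the headers with identifying keys removed."""
    return {
        k: v for k, v in headers.items()
        if k.lower() not in IDENTIFYING_KEYS
    }

incoming = {
    "User-Agent": "ExampleApp/1.0",
    "X-Device-ID": "a1b2c3d4",
    "Cookie": "session=deadbeef",
    "Accept": "application/json",
}
print(scrub(incoming))  # only User-Agent and Accept survive
```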
Comment by microtonal 7 hours ago
Comment by rolph 6 hours ago
why would someone include tech that makes people think twice about using the app, unless it is required if you want to "sell" in a particular venue?
if you're developing geolocation-based apps, location tracking is a core function.
a calendar absolutely does not require location tracking beyond what side of the prime meridian you are on.
Comment by nickburns 6 hours ago
But the subsequent sale of that data is not. That is the discussion here.
Comment by rolph 4 hours ago
you can't sell what you don't have, unless you lie lower than a rug.
fix the data collection problem and a second-order effect emerges: no data for sale.
Comment by nickburns 4 hours ago
Comment by LeifCarrotson 3 hours ago
Because the overwhelming majority of people don't think twice about this tech.
I do, and that's why I use a lot of web tools or old-fashioned phone calls, but most people think metadata=unimportant and assume that the purpose of the app is what it does for them rather than to gather their personal information for sale.
Comment by CPLX 6 hours ago
Comment by chimeracoder 4 hours ago
Even if Google and Apple both want to commit to fighting this, it becomes a game of whack-a-mole, because there are all sorts of different ways to track users that the platforms can't control.
As an easy example: every time you share an Instagram post/video/reel, they generate a unique link that is tied back to you, so they can track your social graph by seeing which users end up viewing that link. (TikTok does the same thing, although they at least make it more obvious by showing it in the UI with "____ shared this video with you".)
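The per-share link mechanism described above comes down to a token table mapping each generated URL back to the sharer; when the recipient opens it, the platform records a social-graph edge. A hypothetical sketch, not Instagram's actual implementation:

```python
import secrets

share_tokens = {}   # token -> (post_id, sharer)
graph_edges = []    # observed (sharer, viewer) relationships

def make_share_link(post_id, sharer):
    """Generate a unique per-share URL tied back to the sharer."""
    token = secrets.token_urlsafe(8)
    share_tokens[token] = (post_id, sharer)
    return f"https://example.invalid/p/{post_id}?s={token}"

def record_view(token, viewer):
    """When the link is opened, connect viewer back to sharer."""
    post_id, sharer = share_tokens[token]
    graph_edges.append((sharer, viewer))
    return post_id

link = make_share_link("reel123", sharer="alice")
token = link.split("s=")[1]
record_view(token, viewer="bob")
print(graph_edges)  # [('alice', 'bob')]
```

The link works identically for every viewer; the only thing the unique token adds is the tracking edge.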
Comment by uxhacker 4 hours ago
Is there not also a requirement for clean consent? I.e., a weather app can't track your precise location?
Comment by sroussey 7 hours ago
Comment by ImPostingOnHN 7 hours ago
Comment by gessha 5 hours ago
Comment by nzach 3 hours ago
I think a lot of people don't realize the power of a big enough sample size. With enough samples even something pretty innocent looking like your daily step counter could make you identifiable.
As far as I know we don't have large enough databases to make this happen in practice, but I don't think this is impossible in the future.
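To make the step-counter point concrete: even one week of daily counts is already a distinctive vector, so nearest-neighbor matching against a second (hypothetical) database would often single out one record despite measurement noise. All data below is invented:

```python
# Illustrative: match a week of step counts against a second,
# "anonymous" database. All numbers are invented.
db = {
    "anon_1": [8123, 9450, 7301, 10233, 6120, 12050, 4980],
    "anon_2": [3021, 2988, 3104, 2875, 3300, 2790, 3010],
    "anon_3": [15022, 14310, 16894, 13577, 15990, 14205, 16100],
}

# The same week observed elsewhere (fitness leaderboard, insurer,
# data broker...), with small per-day measurement noise:
observed = [8120, 9462, 7295, 10240, 6111, 12066, 4975]

def distance(a, b):
    """Sum of absolute per-day differences between two weeks."""
    return sum(abs(x - y) for x, y in zip(a, b))

match = min(db, key=lambda k: distance(db[k], observed))
print(match)  # -> anon_1: the noisy trace still singles out one record
```

With millions of users and longer histories the same matching still tends to work, because the space of plausible step-count sequences is enormous relative to the population.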
Comment by jandrewrogers 3 hours ago
Comment by ninalanyon 6 hours ago
Comment by kube-system 6 hours ago
Alone, these points are not deanonymizing, it's when there's other data associated.
Comment by jandrewrogers 7 hours ago
The analytic reconstruction of identity from location is far more sophisticated than the scenarios people imagine. You don't need to know where they live to figure out who they are. Every human leaves a fingerprint in space-time.
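This is well documented: de Montjoye et al.'s "Unique in the Crowd" (2013) found that four coarse spatio-temporal points were enough to uniquely identify about 95% of people in a mobile-phone dataset. A toy version of that uniqueness test, with invented traces:

```python
# Toy uniqueness test: how many known traces are consistent with a
# handful of (place, hour) observations? All data is invented.
traces = {
    "person_a": {("cafe", 8), ("office", 9), ("gym", 18), ("home", 22)},
    "person_b": {("cafe", 8), ("school", 9), ("park", 17), ("home", 21)},
    "person_c": {("diner", 7), ("office", 9), ("gym", 18), ("home", 23)},
}

# Just two observed points about the target...
observed = {("cafe", 8), ("gym", 18)}

# ...and only one trace in the dataset contains both of them.
candidates = [p for p, t in traces.items() if observed <= t]
print(candidates)  # ['person_a']
```

No home address is involved: the combination of a few routine stops is itself the fingerprint.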
Comment by nickburns 7 hours ago
It's not though.
Critical for myriad elective purposes? Sure.
Comment by jandrewrogers 6 hours ago
Comment by xphos 6 hours ago
Comment by philipallstar 4 hours ago
Comment by nickburns 6 hours ago
Could you be more specific with maybe a single example of where my physical geographic location is electronically critical for a purpose that isn't elective/optional/avoidable?
(And I'm not just trying to be obtuse. I think you're touching on at least part of the 'heart' of both this conversation and that of digital ID verification.)
Comment by quickthrowman 6 hours ago
Edit: I assume I am missing a crucial part of logistics that you’re familiar with, genuinely curious.
Comment by ramoz 3 hours ago
Comment by 1121redblackgo 7 hours ago
Comment by malfist 7 hours ago
A lot isn't good enough.
Comment by ch4s3 8 hours ago
Comment by Dwedit 8 hours ago
Comment by ch4s3 7 hours ago
Comment by ryandrake 7 hours ago
Comment by pocksuppet 7 hours ago
Comment by kube-system 6 hours ago
Furthermore, you cannot contract away criminal liability if any exists.
Comment by lukeschlather 6 hours ago
Comment by kube-system 6 hours ago
Comment by celeritascelery 6 hours ago
Comment by ch4s3 7 hours ago
Comment by pwg 7 hours ago
Comment by kube-system 6 hours ago
Comment by nine_k 3 hours ago
The fact that 100% of its users, except the litigant, skimmed through the EULA and did not notice anything does not relieve the company from the responsibility.
Comment by nickburns 6 hours ago
Comment by rolph 6 hours ago
that is defined as extortion, but labeled as onboarding.
Comment by kube-system 6 hours ago
Comment by stavros 7 hours ago
Comment by teeray 5 hours ago
Comment by toofy 7 hours ago
Comment by rubyfan 8 hours ago
Comment by rubyfan 8 hours ago
Comment by gruez 7 hours ago
That's opt-in, not opt-out.
Comment by nickburns 6 hours ago
Comment by rcxdude 2 hours ago
Comment by wakawaka28 4 hours ago
Comment by troupo 7 hours ago
GDPR tried. And the narrative around GDPR was deliberately and completely derailed by adtech.
Lack of enforcement didn't help either.
Comment by ch4s3 7 hours ago
Comment by microtonal 7 hours ago
The problem is not the GDPR, the problem is the surveillance industry that wants to grab as much data as possible and try to do as much malicious compliance as possible.
Comment by jandrewrogers 6 hours ago
The costs are often worse on the industrial side because the data is so much larger and faster than web or mobile data.
Comment by gwerbin 5 hours ago
Comment by jandrewrogers 5 hours ago
The trouble started when lawyers correctly noticed that these are incidentally capable surveillance systems even though that isn't how we use them or what they were designed for.
Comment by gwerbin 3 hours ago
Comment by jandrewrogers 1 hour ago
GDPR frames everything in the context of a person's data. There is no "person_id" or similar field in these data models. That isn't the purpose of the data, it would be expensive to extract it, and then it would create obvious liability under GDPR. This makes the idea of finding a person's data expensive -- brute-force search on huge data volumes.
Compounding this, these data systems are often operational and some of the data may be in situ at the edge because it is too large to move all of it. The power and compute budget may not exist to find a person using brute force.
AFAICT, current best practice is to maintain a polite fiction that people aren't being tracked because that is not the intent. No one thinks that would stand up to serious legal scrutiny though. If the regulators come after you then plead best effort based on the technical infeasibility of doing anything else.
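The brute-force cost described above follows from the absence of a person-keyed index: answering "find this person's data" means scanning every record for indirect evidence, while building the index that would make it cheap is itself new processing of personal data. A toy illustration with invented sensor records:

```python
# Illustrative: sensor records are keyed by machine, not by person.
# A subject-access request ("find Alice's data") then requires
# scanning every record, because no person index exists -- and
# maintaining one would make the dataset overtly personal data.
records = [
    {"machine_id": f"m{i % 50}",
     "badge_seen": "alice" if i % 5000 == 0 else None}
    for i in range(100_000)
]

# Brute force: a full scan over all records.
alice_rows = [r for r in records if r["badge_seen"] == "alice"]
print(len(alice_rows))  # found only by touching all 100,000 rows

# With a person-keyed index the same query is one dict lookup:
by_person = {}
for r in records:
    if r["badge_seen"] is not None:
        by_person.setdefault(r["badge_seen"], []).append(r)
print(len(by_person["alice"]))  # same answer, no scan at query time
```

At real industrial volumes, distributed across edge devices, even the scan may exceed the available compute and power budget, which is the infeasibility argument above.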
Comment by GJim 2 hours ago
The GDPR is there to protect your personal/sensitive data, or data that can personally identify you. It has nothing whatsoever to do with data capture from industrial machinery.
I remain astounded at how ignorant some people are of the basic GDPR principle: protecting your _personal_ data.
Comment by jandrewrogers 1 hour ago
How is this not your personal data?
Exploitation of these types of data sources has been demonstrated for 15+ years at this point. Abuse is often impractical for technical reasons, but the GDPR doesn't give you a free pass on collecting personal data just because you aren't using it like personal data.
Comment by pocksuppet 7 hours ago
Comment by ch4s3 7 hours ago
Comment by jandrewrogers 7 hours ago
I have read GDPR and don't work in adtech. It is vague and it is pretty easy to find pathological scenarios that don't make much sense or impose an unusually high burden for no benefit. Every European law firm seems to agree with this assessment despite what proponents assert. Consequently, it forces a lot of expensive defensive activity in practice.
To some extent, it was just a failure of imagination on the part of GDPR's authors. Many things are not nearly as simple as it seems to assume and it bleeds into data models that have nothing to do with people.
It is what it is but no one should pretend it is not a burden for companies that have nothing to do with adtech or even data about people.
Comment by troupo 7 hours ago
Congrats on gullibly believing the ad tech narrative.
Comment by ryandrake 7 hours ago
Comment by kentm 4 hours ago
YOUR collection of users' data is an overreach and breach of privacy. MY collection of data is absolutely necessary to grow my scrappy small business and provide value. I am a good person with good intentions, so it's OK. You are a bad person doing bad things, so it's not OK.
Comment by IX-103 4 hours ago
What data processing is essential for the services being provided? Many publishers assumed that getting paid was an essential part of providing a service, and it was not until 3 months before the implementation deadline that the committee clarified that getting paid is not included when you are being paid by a third party.
How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)? Is making that determination a service essential for providing your service? The answers apparently were "You don't" and "No", which would effectively make companies assume that the GDPR applies to everyone on the planet.
The GDPR is also fundamentally opposed to how things currently work on the internet, making almost all advertising on the web illegal overnight. It was too big a change to happen at once, so it is effectively only loosely enforced in practice.
I like the idea of the GDPR, but the implementation sucks.
Comment by GJim 1 hour ago
What utter utter FUD
You are free to collect as much personal data as you want, PROVIDING you have my explicit opt-in informed consent to do so.
What about this is difficult to understand?
> How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)?
The GDPR provides _basic_ data safety and consumer protection. If you aren't protecting users' private data, regardless of where they live, in line with GDPR principles (such as collecting it fairly, and not selling it to randoms), then you are playing fast and loose with your users' private, sensitive data. In which case you need to _seriously_ consider whether what you are doing is ethical.
> The GDPR also is fundamentally opposed to how things currently work in the internet, making almost all advertising on the web illegal overnight.
Utter Bullshit!
You are free to advertise as much as you like! But if you want to track me with your advertising (hello scummy adtech industry) then you need my explicit informed consent to do so. And so you should!
Again, what about this is difficult to understand?
Comment by ryandrake 1 hour ago
It's interesting and revealing when someone responds to a law that says "You're not allowed to abuse users in countries X, Y, and Z" with "How can I figure out who's in the other countries, so I can abuse them?" instead of "I'll just stop abusing everyone, and then I don't even need to worry about where anyone is."
Whenever you find yourself asking "how do I toe as close to the 'illegal' line as I can without technically going over it?" I think it's time to ask yourself some pretty hard questions.
Comment by pocksuppet 7 hours ago
Comment by lukeschlather 6 hours ago
Comment by ch4s3 7 hours ago
Comment by buzer 3 hours ago
DPA won't punish you for not following EDPB's recommendations, they will punish you for breaking GDPR. You are free to ignore EDPB if you think your legal position is strong, but you carry the risk if you are wrong.
Comment by stavros 7 hours ago
The rest of the "It'S So LaRgE AnD UndErSpEciFieD" is just FUD. The regulators don't just slap fines, they work with you to get you to comply, and they just want to see that you're putting in the effort instead of messing them about.
I have literally never been surprised by the GDPR. Whenever I thought "surely this is allowed" it was, whenever I thought "this can't be allowed", it wasn't. For everything in the middle, nobody will punish you for an honest mistake.
Comment by kentm 4 hours ago
This is not too hard if you do proper engineering work ahead of time and are purposeful about how you move and manage data (step 1 is just not collecting it unless it's vital). But the industry encourages us to be very bad about that, because we gotta "move fast and break things or you're not gonna make it."
Comment by ch4s3 5 hours ago
How do you know that? Again, the law establishes a rule-making body that can at any time change or add rules, and as far as I can tell there's no public review process.
Comment by stavros 4 hours ago
Comment by redwall_hp 7 hours ago
Just don't spy on people.
Comment by stavros 6 hours ago
Comment by KaiserPro 2 hours ago
Until that changes you're going to be stuck.
Something as simple as the data protections act 1998 (https://en.wikipedia.org/wiki/Data_Protection_Act_1998) would kneecap a lot of the shady shit that goes on in the USA.
Comment by treebeard901 4 hours ago
Comment by crummy 3 hours ago
Comment by romaniv 7 hours ago
So the current feedback process involves: construction → exploitation → reporting → public awareness → legislation. This is too slow. Moreover, operating in this environment is exhausting.
We need a different feedback loop altogether. I'm not sure which one would work best, but something different needs to be considered.
Comment by jjk166 5 hours ago
And critically, it is not someone becoming aware of private information that is the abuse of privacy, it is exploiting that private information which is the abuse. There may be countless legitimate technical reasons you need to collect data, but there can not possibly be a technical justification for selling it.
Comment by groos 4 hours ago
Comment by linkjuice4all 7 hours ago
Comment by gruez 6 hours ago
Comment by ButlerianJihad 3 hours ago
There was one person with a feminine name who showed up with a “home address” that would correspond to being my “neighbor” at home, at my clinic, at church, when I went to college, etc. All the years corresponded correctly, and the addresses were some residential place about a block or less away from the places where I went.
For all I know, this person was either fictional or an innocent bystander. She did appear to have a Facebook account or two. I was never able to directly contact her. But I found it very strange, and I wondered what would be gained by doxxing me in this manner.
Of course this has nothing directly to do with GPS coordinates, but imagine if the GPS began to be part of your public record as well, or on your credit report. Imagine if it was entered into the public record what coffee house you visited every morning, or if there were errors in this record.
Comment by GJim 1 hour ago
* coordinates
There are many ways of establishing one's latitude and longitude without recourse to one particular GNSS system.
Comment by ButlerianJihad 46 minutes ago
Comment by victor22 1 hour ago
Comment by uxhacker 8 hours ago
https://citizenlab.ca/research/analysis-of-penlinks-ad-based...
Comment by reenorap 5 hours ago
The previous views on privacy didn't take into account the fact that everyone now has video cameras, and people are incentivized to violate privacy to make money as influencers. I think people's privacy needs to be protected, and I think that means making the laws around it much, much stricter. This includes things like location data; it shouldn't be sold or exposed at all.
Comment by wakawaka28 4 hours ago
Comment by pnw 4 hours ago
Comment by lifeisstillgood 6 hours ago
Imagine an option on your iPhone that says "Enable this to allow geolocation tracking for organisations registered under the NOADSJUSTPUBLICGOOD Act." Then any wifi endpoint could locate you based on signal strength etc., and that data could only be made available to people registered under the act.
Would we see new understanding of how people move around in cities? Would we see better traffic information? I think so, as long as people believe that there are real teeth to the laws and they are enforced loudly and publicly.
We should embrace the benefits of a society-wide epidemiology experiment; the benefits for public health are incredible. (Add to that supply chain logistics on open ledgers and many of the new things that just were not possible before, and the future of open, transparent, but well regulated democracies is bright.)
Let me know if you spot one.
Comment by Terr_ 3 hours ago
What about: "If something bad happens because of the data your company shared or lost, it is criminally and financially liable?"
Comment by atmosx 3 hours ago
Comment by kidnoodle 4 hours ago
Alas, I was stymied by not having any cash to work on it, and the unit economics were not very VC friendly (at least I assume that’s one of the reasons why I didn’t get any traction from VCs).
Comment by eptcyka 3 hours ago
Comment by lionkor 3 hours ago
Comment by Eextra953 7 hours ago
Comment by Cider9986 5 hours ago
Comment by titzer 6 hours ago
Comment by glitchc 7 hours ago
Comment by davebren 7 hours ago
Comment by warkdarrior 5 hours ago
Comment by ssl-3 2 hours ago
Fitness apps can be local. We have pocket supercomputers; certainly, we don't need help from the clown to keep track of how far (or how energetically) we biked or walked today, or where that took place.
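The point above is easy to demonstrate: total distance for a walk or ride can be computed entirely on-device from locally stored GPS samples with the standard haversine formula. A minimal sketch, with invented coordinates:

```python
import math

def haversine_km(p, q):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

# GPS samples recorded and kept locally; no server involved.
track = [(52.5200, 13.4050), (52.5210, 13.4060), (52.5225, 13.4085)]
total = sum(haversine_km(a, b) for a, b in zip(track, track[1:]))
print(f"{total:.2f} km")
```

Nothing here needs a network connection, which is the whole argument: the cloud is a business-model choice, not a technical requirement.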
Comment by Mithriil 7 hours ago
Comment by kristianpaul 6 hours ago
Comment by charcircuit 3 hours ago
Comment by erelong 5 hours ago
Comment by shevy-java 5 hours ago
Comment by lifestyleguru 8 hours ago
Comment by Cider9986 5 hours ago
Comment by reorder9695 6 hours ago
Comment by mystraline 6 hours ago
And the FLOSS/Linux phone hardware attempts have frankly sucked.
I was hoping that my PinePhone Pro would actually be usable. But no, it's a PineDoorstop.
Proper Linux would be a great 3rd choice. But yeah. We've got a duopoly and not much we can do about it.
Comment by 9991 5 hours ago
Comment by troupo 8 hours ago
Comment by Swizec 8 hours ago
Comment by mzajc 8 hours ago
> Lifespan: 13 Months
> ...
> Standard retention (4320 Days)
It looks like a cookie prompt, so I assume "Lifespan" refers to cookie expiration and "retention" to how long the data (including geolocation) is retained on the spyware company's servers.
Comment by troupo 8 hours ago
Data Retention: Standard Retention (4320 days)
Comment by wolvoleo 8 hours ago
Missed opportunity by the EU when they wrote GDPR.
Comment by GJim 5 hours ago
Not really.
There are legitimate reasons why I might wish to be tracked or give my personal data to a company. As long as I'm asked to give clear, opt-in informed consent, this is perfectly fine. This is the very essence of the GDPR!
Instead, direct your ire to the scummy adtech industry who are constantly asking to invade my privacy and smell my knickers trying to work out what I ate for lunch. Another law to ban the adtech industry would be welcome from me, though would meet fierce resistance from the likes of Google.
The GDPR is well written.
Comment by wolvoleo 4 hours ago
In these cases they don't even need to ask for your permission.
> Instead, direct your ire to the scummy adtech industry who are constantly asking to invade my privacy and smell my knickers trying to work out what I ate for lunch. Another law to ban the adtech industry would be welcome from me, though would meet fierce resistance from the likes of Google.
No, the EU should have done more to prevent this. They didn't want to kill a billions-of-euros industry. But they should have.
Comment by troupo 7 hours ago
GDPR has literally nothing to do with cookie popups. That was, and is, adtech.
Comment by em-bee 7 hours ago
that's what causes the popups.
it should prohibit it outright, consent or not.
Comment by SoftTalker 7 hours ago
Comment by em-bee 6 hours ago
Comment by GJim 5 hours ago
Comment by wolvoleo 4 hours ago
Comment by nathanlied 4 hours ago
The adtech industry has, time and again, proven they cannot self-regulate to any decent capacity. At this point, the only reasonable course of action is to shackle them down with such heavy legislative burdens they're rendered de facto extinct.
I will not mourn their loss.
Comment by pocksuppet 7 hours ago
Comment by wolvoleo 4 hours ago
For example, giving consent should be the same difficulty as denying it. So one-click consent means there must also be one-click non-consent. But this is policed very poorly.
I think they should just ban adtech altogether, at least any form of targeted advertising, individual pricing (which is already illegal in many EU countries) and ideally also deep market research.
Comment by lotu 7 hours ago