GDPR: Is It Worth It? (arxiv.org)
72 points by belter 15 days ago | 205 comments



This may be an unpopular opinion, but I work at a company in the EU and feel the law is very positive for consumers in general.

In today's world users hand over their data to a huge number of companies for all kinds of reasons, and it's important to know that the law pushes those companies to collect and retain only the information they need to provide their services.


The only part of GDPR most people know about is the website pop-ups, which were probably a bad idea. They don't know about all the other stuff that is really positive.


The pop-up is not part of the law. The law says the user should be aware and able to make informed choices, not that they should be annoyed. That part is entirely on the companies.


If I go to the EU website ( https://european-union.europa.eu ) and see a pop-up, surely they would have implemented it in a better way if this wasn't the best way to do so.

If there is a pop up there, for just accessing the home page of the European Union, is that site collecting too much data?

There's even a bit at the bottom of https://gdpr.eu

> We use cookies to ensure that we give you the best experience on our website. If you continue to use this site we will assume that you are happy with it.


GDPR.eu is operated by Proton AG, a Swiss technology company offering privacy-focused online services.

It’s not an EU site.


Nonetheless, is it possible for the site to fulfill its obligations under the GDPR without needing a cookie or other tracking information that would warrant needing to say "you are being tracked"?

    GDPR.EU is a website operated by Proton Technologies AG, which is co-funded by Project REP-791727-1 of the Horizon 2020 Framework Programme of the European Union. This is not an official EU Commission or Government resource. The europa.eu webpage concerning GDPR can be found here. Nothing found in this portal constitutes legal advice.

The "here" link goes to https://commission.europa.eu/law/law-topic/data-protection/r...

Would you be surprised if there was a big banner at the bottom of the page that read

    This site uses cookies. Visit our cookies policy page or click the link in any footer for more information and to change your preferences


As far as I can tell, yes. If cookies (or other means of storing data on the user's computer) are strictly technically necessary, you do no data collection under legitimate interest, and you do not process data requiring consent, then neither the ePrivacy directive nor the GDPR requires you to create a cookie banner. Needless to say, I'm not a lawyer and this is not legal advice.

On a technical level this would likely mean turning off access logs and other tracking mechanisms. (IP addresses are considered personal data last I checked.)
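
A minimal sketch of what "not logging identifying IPs" could look like at the application level, assuming a Python app where you control the log line yourself (the truncation rule is illustrative only, not legal advice):

    import ipaddress

    def anonymize_ip(ip: str) -> str:
        """Zero out the host part so the logged value no longer points at a single visitor."""
        addr = ipaddress.ip_address(ip)
        prefix = 24 if addr.version == 4 else 48          # keep /24 for IPv4, /48 for IPv6
        net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
        return str(net.network_address)

    # anonymize_ip("203.0.113.42") -> "203.0.113.0"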


I'm being a bit snarky, but maybe the better way involves not tracking people to death.


Under the GDPR, an IP address is PII and recording the IP address of a visitor can be problematic.

https://gdpr.eu/eu-gdpr-personal-data/

    Looking back at the GDPR’s definition, we have a list of different types of identifiers: “a name, an identification number, location data, an online identifier.” A special mention should be made for biometric data as well, such as fingerprints, which can also work as identifiers. While most of these are straightforward, online identifiers are a bit trickier. Fortunately, the GDPR provides several examples in Recital 30 that include:

    Internet protocol (IP) addresses;
    cookie identifiers; and
    other identifiers such as radio frequency identification (RFID) tags.

    These identifiers refer to information that is related to an individual’s tools, applications, or devices, like their computer or smartphone. The above is by no means an exhaustive list. Any information that could identify a specific device, like its digital fingerprint, are identifiers.


The trick is to read through and understand what falls under "Legitimate Interest" (important note: it never involves giving other services access to it!).

A very common example: recording the IP address for technical (debugging, ensuring site functionality) and legal (security, audit logs) purposes does fall under legitimate interest.

Worst case, you have to mention it in your privacy policy.

The thing you can't do is let your marketing team or others grab at it in any way, or sell it to other vendors, and arguably you should not move it outside of safe countries.


That popup is not the half-height YOUR PRIVACY IS IMPORTANT TO US blocker, it's a banner at the bottom. Totally different (but still worth a block from my side if possible)


Yes, popups are not part of the law. They are an implementation chosen by the websites.

Also, I put a bit of the responsibility on us, the consumers. Pop-ups do not seem to be such a real issue, since no website has reported: we are removing pop-ups because users are so annoyed that they stopped visiting our website.


Right, the whole pop-up idea was just copy-and-pasted. It's possible to build commercial websites without it by turning off tracking, so there is no logging and nothing to explicitly opt in to, possibly even as added value. There's a lot of confusion, while the law is mainly about differentiating which data is needed to operate and which is purely nice to have. That said, there are plenty of GDPR-compliant services that keep data for 10 years, mostly in finance or auditing related use cases, and nobody bats an eye.


> The law says user should be aware and able to do informed changes, not that they should be annoyed.

Then tell us how companies should implement this law both

- without annoying the user

- without being at risk of getting sued

If the law does not make this relatively easy, one has a right to be furious at the EU.


They could just not track users? I seem to remember coding a lot of websites that totally failed to track users.


Your websites have no logging at all? Remember, even an IP address is PII under GDPR.


You don't have to log IPs. And if you need them for rate limits / abuse prevention, that falls under the "necessary / legitimate interest" rule.
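
A minimal sketch of that kind of rate limiting, assuming a Python service: key the counter on a salted hash of the IP, keep it only in memory for a short window, and never write the raw address to durable logs. (Whether a hashed IP still counts as personal data is debatable, so treat this as an illustration rather than legal advice.)

    import hashlib
    import os
    import time
    from collections import defaultdict

    SALT = os.urandom(16)       # regenerated on every restart, so keys can't be correlated long term
    WINDOW = 60                 # seconds
    LIMIT = 100                 # requests allowed per window
    _buckets: dict[str, list[float]] = defaultdict(list)

    def allow_request(ip: str) -> bool:
        """Return False once a client exceeds LIMIT requests within WINDOW seconds."""
        key = hashlib.sha256(SALT + ip.encode()).hexdigest()
        now = time.time()
        _buckets[key] = [t for t in _buckets[key] if now - t < WINDOW]   # drop expired timestamps
        if len(_buckets[key]) >= LIMIT:
            return False
        _buckets[key].append(now)
        return True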


Security, debugging and audit logs all fall under "legitimate interest".

Just keep them out of hands of parties that might want to use them for other reasons.


You don't need to ask if the information you collect is necessary for your site to work.


> You don't need to ask if the information you collect is necessary for your site to work.

Shysters/pettifoggers are very "flexible" (in a negative way) in their interpretation of what is "necessary for your site to work". :-(


It's written in plain English: you can read it yourself. Articles 5, 12, 24 and 25.

The main problem, and the reason certain parties are working so hard not to comply, is that GDPR makes several business models illegal. (Not that I see that as much of a problem.)


> It's written in plain English: you can read it yourself. Articles 5, 12, 24 and 25.

This is what the law says, but not how shysters/pettifoggers interpret it. :-(


> It's written in plain English: you can read it yourself.

That isn't how the law works for things like this though; what is plain to a lawyer is different from what is plain to an untrained person. People make that mistake with tax law all the time; they read something and come up with a plain interpretation that shows they don't have to pay their taxes, and that delusion persists until someone explains the actual, less plain, reading.

It isn't the worst starting point but winging the law with untrained readings is not really acceptable for running a business.


People also come up with "plain interpretations" of the laws of physics that allow them to construct perpetual motion machines. Arrogance and wishful thinking are a bad combination, no matter the field.


> People also come up with "plain interpretations" of the laws of physics that allow them to construct perpetual motion machines.

But you don't (typically) get into trouble with the law enforcement agencies if you don't/didn't understand the laws of physics correctly.


The GDPR is not the ePrivacy Directive (the "cookie law"), the ePrivacy Directive predates the GDPR by many years.


I don’t have much of an opinion on the overall value of GDPR, but I’ve been asked multiple times now to create indexes of personal data to _comply_ with the law when said indexes were not wanted for business or technical reasons.

An example is in big data repositories where personal data may have been provided but not used for anything. Yes it’s there but nothing uses it. To enable the reporting and deletion requirements you build a system that keeps track of all that, now the data _is_ trivially correlated which seems like the opposite of the intent.

Even when I build systems that prune the data on ingest it feels weird. To conform with a privacy law I'm building a sophisticated filter for finding personal data I don't even want.


> An example is in big data repositories where personal data may have been provided but not used for anything

In that case you have to delete it, you know this, right?


Yes. That’s the point. I’m building the capability to comply with the deletion requirements. But prior to that requirement the systems _didn’t_ have any usage of the personal info.


So, you are just building a huge pool of personally-identifiable data that you are holding onto for no commercial or business reason or benefit to the user?

You’re literally the reason the law was passed. Don’t do that. Like what happens when you get hacked or leave an Amazon bucket public and someone sells all your customers’ PII on the dark web?

A lot of HN'ers ""struggle"" with the concept that the law isn't "confusing" or "bad"; it's that they themselves are taking actions that are socially harmful and laws have been passed to target this behavior. But it's difficult to make a man understand something when his paycheck depends on not understanding it.

The axiom has been for businesses to hoover up as much data as possible, and that’s explicitly the mindset the EU wants you to stop. Being asked to at least catalog what you’re hoovering up is a tiny tiny ask, let alone the immense risks you’re running with other people’s identifiable data. Don’t create these targets for people to hack in the first place, especially when they self-admittedly serve no useful purpose even to your marketing.

You’re literally the guy getting mad the government is making him clean up his superfund site. Moving the drums is riskier than just letting them sit there forever… up until they inevitably rust out.

https://en.wikipedia.org/wiki/Valley_of_the_Drums

“I ain’t hurtin’ nobody, it’s a free country!”


I agree that in an ideal case the data wouldn’t be collected or stored.

My concern is once it has, is it better to build a capability that can specifically be used to identify personal data to remove it or to let it lie.

I’m glad you have moral certainty on that. It’s certainly never been that clear to me.


If you are able to build something to identify PII from your dataset, anyone is. So if you "let it lie", once you (inevitably) leak the data, someone else can just build that same thing and get the PII.


If you didn't have any use for that data, why was it being stored in the first place? If it continues to be unnecessary for your business, why don't you stop recording it instead of building an index to remove it after the fact?


Usually it’s a case where a data provider gives you data that is necessary for a business but _also_ includes data you don’t want or need.

Some process will ingest and index the valuable data and throw the original file into non-indexed storage in case it needs to be re-run.

Ideally you filter that data at ingest time but even that is building a capability that feels against the intent.

But let’s say you didn’t even know to build the filter. Now you need to go in and scan all those files to remove the personal data. A capability that requires you to read data you hadn’t prior.
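
As a rough illustration of the filter-at-ingest idea (the field names are hypothetical): keep an allow-list of the fields the business actually needs and drop everything else before the record touches storage.

    # Hypothetical allow-list of the only fields the downstream process actually uses.
    ALLOWED_FIELDS = {"store_id", "sale_date", "total_amount", "item_count"}

    def scrub(record: dict) -> dict:
        """Drop anything not explicitly allow-listed before the record is stored anywhere."""
        return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

    raw = {"store_id": 17, "sale_date": "2024-05-01", "total_amount": 99.5,
           "loyalty_member_name": "Jane Doe"}        # unwanted personal data the provider slipped in
    print(scrub(raw))    # {'store_id': 17, 'sale_date': '2024-05-01', 'total_amount': 99.5}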


"An example is in big data repositories where personal data may have been provided but not used for anything. Yes it’s there but nothing uses it."

It's not being used until somebody decides to use it.


Sure, but does me building a capability to proactively do that make it more likely that someone decides to use the data than if I hadn’t built it?


Having the data at all makes it more likely that somebody will use it.


> To conform with a privacy law I’m building a sophisticated filter for finding personal data I don’t even want.

...and then presumably deleting it, as required by the law if there is no business or technical reason for having the personal data. That seems like a pretty good pro-privacy move, vs leaving that data hanging around forever.


It seems weird to me because I’m creating a capability that could be used to invade privacy when none would exist besides the compliance requirement.


It seems a net positive overall to me: the system has gone from collecting and storing personal info for no reason, with all the risk that entails (e.g. a data breach that exposes that info) to no longer storing that info (and hopefully no longer collecting it in the first place either). It seems to me like you've built an important auditing tool for picking up areas where the system as a whole was being loose with personal data.

If the end goal is to improve privacy, not collecting and storing the data seems a much stronger protection than crossing your fingers hoping someone doesn't go rummaging around looking for personal data.


Just because you do not yet have the capability to abuse the privacy invasion you have already committed does not prevent you from doing so later.

GDPR's goal was to make PII into "toxic material", to be only handled if you really, really need it and are willing to deal with the toxicity.

Having a pool of toxic sludge and declaring everything is fine because you do not use it after filtering what was useful for you does not stop it from being a dangerous pool of toxic sludge.


The only way _you_ can think of is "invading privacy".

If you delete any data that can single out people when you're ingesting it, then nothing more needs to be done. So you delete any user-identifying data by default rather than keeping it from the very start, unless you have a very strong use case. You can also have completely anonymised but still useful data: there are many statistical methods to collect useful but reasonably and provably anonymous data, such as differential privacy.

If the businesses that give you data don't provide the option of completely deleting it then they are non-compliant as well.
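
A minimal sketch of the differential privacy idea mentioned above: instead of publishing an exact count, add Laplace noise calibrated to a privacy budget (the epsilon value here is purely illustrative).

    import random

    def dp_count(true_count: int, epsilon: float = 0.5) -> float:
        """Laplace mechanism for a counting query: a count has sensitivity 1,
        so noise drawn from Laplace(0, 1/epsilon) gives epsilon-differential privacy."""
        scale = 1.0 / epsilon
        # the difference of two exponential samples is Laplace-distributed
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        return true_count + noise

    # report dp_count(1234) instead of the exact number of distinct visitors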


Don't ask for personal data then.


I had a case once where an HR system was supposed to provide a catalog of office locations.

Unbeknownst to us (or the HR system owner), after a software update it started sending us extra fields which would occasionally include the office manager's name and contact details (including home address at times).

Had another case where a POS vendor was supposed to be sending us aggregate sales data. Again, not in the agreed spec, they included the names of any loyalty users who had used each POS.

I once saw a script kiddie spew a whole credit card file into a nonexistent endpoint, filling the request logs with PCI-scoped data.

Many web-based APIs (especially JSON-based ones) will not refuse requests that have extra fields in them, so you can end up with poorly crafted clients sending data they shouldn't.

My point is many many systems don’t ask for personal data and still get and store it.
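
A minimal sketch of the kind of strict input validation that would at least reject those surprise fields instead of silently storing them, assuming a plain-Python handler with made-up field names:

    EXPECTED_FIELDS = {"store_id", "period", "total_sales"}

    def validate_payload(payload: dict) -> dict:
        """Refuse the request outright if the client sends anything we did not ask for."""
        unexpected = set(payload) - EXPECTED_FIELDS
        if unexpected:
            raise ValueError(f"unexpected fields: {sorted(unexpected)}")
        return payload

    validate_payload({"store_id": 1, "period": "2024-05", "total_sales": 420.0})    # accepted
    try:
        validate_payload({"store_id": 1, "period": "2024-05", "total_sales": 420.0,
                          "loyalty_user": "Jane Doe"})
    except ValueError as e:
        print(e)    # unexpected fields: ['loyalty_user']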


> My point is many many systems don’t ask for personal data and still get and store it.

It's strange that your response to it is "yup, it's totally fine who cares" and not "the systems should be updated to not do that".


Where did I respond with that?

My central concern is there is a trade off to be made between keeping the data in place and building a capability to actually read/index the actual personal data. My opinion many times is that not having the technical capability to read that data is often a better privacy control in practice than building it to delete data and hoping it doesn’t get repurposed.

That seems to be a minority opinion here.


You've just almost literally said "yup, we should continue ingesting whatever data, it's totally fine" instead of "yes, we should upgrade our systems not to send and not to receive/store that data"


Seems it's good to have regulations that force people to pay attention.


None of those cases other than the PCI-scoped one were covered by regulations at the time. They are just experiences I've had where I got data I didn't want and didn't ask for, and had to decide whether building a capability to read/detect it systematically for deletion was a better privacy control than not.

I have had similar cases covered by GDPR as well of course but my broader point is that the idea that you only get the data you ask for doesn’t align with my lived experience.


Wow, storing data blindly like that sounds like a nightmare but I can totally see it happening.


They could have taken the cheap way of just deleting everything. So someone already wants to use it, or at the very least wants the option.


What a strange game. The only winning move is not to play.


That is precisely the intent of the law: align the incentives of users and service providers, making storing PII annoying enough that the service provider will ask themselves "do I really need this data in the first place?"


> making storing PII annoying enough that the service provider will ask themselves "do I really need this data in the first place?"

No, the service provider will rather ask himself: "Where can I find a really good lawyer who can tell me how to handle shitty EU laws without getting into a risk of getting sued?"

In other words: this law is a job creation scheme for lawyers.


If you're technically unable to comply with the requirements of fetching all PII for a given person and subsequently delete it, you have no business storing that PII in the first place. Your line of argument is like complaining that seatbelts are government overreach that increases the cost of cars.


Usually, complying with the law is the best solution if you do not want to get sued.


The problem is that it is often hard to interpret what the laws mean (they are made by politicians and not computer scientists), and which behaviour risks you getting sued and which not.


Complying with GDPR by not taking data you do not need is easy.

Lawyers mainly worked on producing disinformation about it because confusing people made for better consulting fees, but especially once guidance on "legitimate interest" filtered out from the legalese, they kinda stopped having an angle.

... unless what you actually want to do is violate GDPR and trade in PII; then you need to lawyer up.


> Complying with GDPR by not taking data you do not need is easy.

If your claim was true, the solution would be easy: I simply find a need for all data that I collect.

Problem solved? For sure not, which obviously falsifies your "idea" that it is so easy.


It does not, because what falls under legitimate interest is also quite clearly described.

Is it necessary for performing the service to the end user? Is it necessary to support technical or legal issues in providing the service? (Trading data with other entities, marketing, etc. ARE NOT; logging, debugging, and security/audit logs ARE.)

Creating BS claims of "legitimate interest" like IAB does is just setting yourself up for big fines for breaking the law.


Well, that seems like a good idea, but will that lead to a larger rise in more anonymous services? It also seems like KYC (know your customer) style laws may be incompatible with ECD (E-commerce directive, similar to USA Section 230) laws too.


More than unpopular, it is polarizing.


The GDPR has been a godsend for me as a developer.

It provides me with a language to speak with the product and marketing side of our business when pushing back against unnecessary surveillance and requesting resources for data security. And even when I'm working with a team that is unconcerned about the ethical side of things, the GDPR places a cost on collecting data that can be used to prevent at least some of the worst suggested ideas purely on a net value basis.

This was not the case before the GDPR where data security and privacy were of limited concern and everyone was trying to collect as much data as possible on our users.

My company's products have never been on the GDPR's radar, but even without facing any threat of direct consequence from GDPR it has indirectly been responsible for us creating a significantly better product, both for our users and for the company.


Same here. GDPR has forced us to think more about customer data and how we store it.


Yeah, there's been a number of times where I've pointed out that the thing product is asking for has GDPR implications and then the critical data collection we Must Have magically turns out to not be all that important after all. Even when we go forward with things in a GDPR-compliant way it's made it a lot easier to push back on some user-hostile things and ship something much less awful.


Yes, GDPR is worth it. It’s not cookie banners.

If you’re going to maintain a database about members of the general public, you should be responsible for stewardship of that data. The GDPR requires some forms of data stewardship that are inconvenient for businesses, but that’s why it’s a government regulation. When all businesses have to do it, individual businesses won’t need to put themselves at a competitive disadvantage for expending the cost to do the right thing.

Idc about the cookies, they can repeal that part if they want. But keep the rest of it.


For you EU citizens, how do you feel about cookie banners?


They are the most adversarial implementation of the law. There is no requirement for them. The Do Not Track HTTP header would've been perfectly compliant but of course the ad companies -- and let's not forget Google is one -- did not want it. They want you to be pissed off by the cookie banners so eventually they might go away.


I get pissed off at the website for asking to track me, not pissed off at the law that requires my consent to track me. I hope many people feel the same way I do.


Unfortunately not enough people care, or are even aware of what they don't care about, and that is a large part of what the dark patterns hope to take advantage of.

I'd accept a straight opt-in/opt-out choice if, as per the law, opting out was just as easy as opting in and opting in was not any sort of default. But the stalky side of the industry (most of it) doesn't want informed consent, they don't want to have to acknowledge the idea of consent at all.


I just remove cookie popups with uBlock Origin. No site has annoyed me enough yet to check whether they're compliant afterwards - pretty much all pages work properly after that, and they'd need to handle that case as if I'd selected the refusal button.


That is expected to happen with every law.


To which the solution would be to tell companies not to use intrusive banners, not to go back to allowing tracking like before.


There were already updated requirements to avoid some dark patterns deployed in the wild, like mandating providing a "reject all" button.


Unfortunately, I still see tons of sites that don't provide one, or have opted-in-by-default settings for news, etc.

These definitely need to be enforced better.


Hopefully leeches like IAB who are behind many of these banners will get their dues: https://www.osborneclarke.com/insights/digital-ad-ruling-cje...


They are an excellent way to see which sites I really can't trust. Which is unfortunately many of them. If they are willing to use that many dark patterns to try to engineer accidental acceptance of stalking, then I have no doubt that they'd carry drive-by downloaded rootkits if that were still as easy as it once was, there was a fraction of a sniff of profit in it, and they thought they'd get away with it (or could convincingly blame someone else).

The majority of the cookie banners that you see aren't even compliant with the regulations that the site owners claim they are caused by; some aren't even close to compliant. At that point it isn't “malicious compliance” but instead “overstepping as far as they feel safe”. Almost none of the cookie pop-overs would be needed at all if allowing advertisers to follow you around your life was opt-in. You don't need to ask permission for anything that is strictly necessary, like non-permanent session tracking & auth tokens; presenting that part at all is part of the theatre to try to convince users that the regulations are the problem, not the sites themselves doing things that need to be regulated.

One of the most annoying bits of many is “legitimate interest” which basically means “we see your opt-out preference, but fuck you and your preferences we want to stalk you and we will”. Note that these are usually implemented in such a way that if you don't explicitly object they'll be allowed, even if you click the “reject all” button without opening the concertina UI element that hides them from default view.

The phrase “cookie banner” itself is misleading, though we are guilty of that as well as them. You are opting in/out of the ability for them to do things, cookies are just part of the options they have for implementing those doings.

And while I'm having a little rant… “We value your privacy!” — no you don't, you lying sack of shit. You value the opportunity to sell the possibility of invading my privacy to the highest bidder.


They are annoying, but I quite like the option to choose. I definitely prefer not to have tracking cookies.


I find them very informative, to have a clear indicator if a site's answer to upholding my rights is genuine effort or malicious compliance.


They are annoying, but they wouldn't be needed if people respected privacy. How about not using tracking or collecting too much info? Then you don't have to put them up.


I use Consentomatic, which automatically rejects cookies and can also reject dark pattern banners. Of course this should be better solved, but at least this way I have some control over my privacy without being disturbed.


That they're a form of malicious compliance, doing anything except actually complying with very reasonable laws.

Cookie banners predate the GDPR by like, a decade anyway. They were the soft attempt to get tracking companies to back off; instead, tracking companies just attempted (and mostly succeeded) at tiring out the user.

I highly recommend the Consent-O-Matic extension to fully reject all that crap automatically[0] in lieu of corporations properly complying with the law and accepting the general rejection signals like DNT and GPC.

[0]: https://consentomatic.au.dk/


Consent-o-matic takes care of all honest cookie banners out there and opts out for me before I even realize it.

When I run into a website that it doesn't work with, I sometimes use the report function and set it myself.

However, most websites with a cookie banner that's too obtrusive for Consent-o-matic turn out to be useless clickbait LLM spam that's not worth the effort. So most of the time I just go somewhere better and leave the obnoxious cookie wall to itself.

I think it all turned out fine (for the time being) thanks to cookie walls :)


A lot of websites and people misunderstand and think that you need popups for essential cookies like login cookies -- you don't. Every time you see a cookie popup it's because they want to track your actions beyond the basics needed for their site to work.

I'm all for them.


The vast majority of the sites don't misunderstand, I wouldn't give them that much benefit of doubt. They present things in a way to encourage people to misunderstand, in the hope that people will think the regulations are the problem and support getting rid of them.


It becomes very obvious when the "cookie banner" talks about "Company and our 1234 partners ..."


They are annoying, disrespectful, unwarranted and unasked for.

But with uBlock Origin and Firefox, we have solid tooling to almost never see them now.

GDPR is also not the issue. The issue is incentives to track people.


> They are annoying, disrespectful, unwarranted and unasked for.

The tracking itself is: if there is a banner, it shows you this company is indeed being disrespectful.


It is nice to see which websites do not value their visitors' privacy. It is too bad that there are so many of them.


I hate all the illegal consent popups.


I like it. Gives you a good idea of who actually cares about their users and who considers us to be the product.


They are pretty annoying, and I very much prefer for web sites not to need them.

But when they employ the practices that make them have them, I'm glad I can say no.


I wish the rules for consent banners were actually enforced. What bothers me most is when some site clearly doesn't care and puts up a "you consent by viewing this site" banner.

It's understandable, they can do whatever they want without any consequences right now, but that needs to change.


It's enforced, but not fast enough, and not hard enough.

Even the enforcers are just not mentally ready (or just have too many cases) to really, really slam the monopoly of force on offenders.


They are easier to click now. The first sites with cookie wall setups were often dark patterns: it was hard to disagree with non-essential cookies and so on, and they basically forced you or tried to trick you into accepting all cookies - essential, marketing, social and whatnot. Now you see more sites where the default setting only allows essential cookies.

But the regulator obviously didn't know how cookies work, what they are good for, what they are bad for, and whatever they try to outlaw now gets correlated via different means: user agent, IP and some other browser fingerprinting.

So, bottom line, we have to click once more (or every time, in private/incognito mode) when visiting a site.


> But the regulator obviously didn't know how cookies work, what they are good for, what they are bad for, and whatever they try to outlaw now gets correlated via different means: user agent, IP and some other browser fingerprinting.

The regulators are very aware of it; that's why the law doesn't say anything about cookies. You'd know that if you cared to read it. It's only been 6 years, and the law can be read in its entirety in an afternoon or less. So I see why the task of actually learning anything about GDPR beyond what the industry tells you seems insurmountable to most IT people.


So there's the old e-privacy directive which I agree was poorly thought out, but cookie banners in their current form were implemented by companies as a reaction to GDPR which has no thoughts on cookies - fingerprinting is equivalent to cookies under the GDPR.


It already was equivalent under ePrivacy.


This should be done with an API, not banners. As someone who prefers using incognito mode in order to regularly clear third-party cookies, the choice of where my GDPR preferences for each website are stored is nuts.

BTW I do not use many websites with those banners because they are for idiots. Good forums and torrent trackers don't have them, because there will always be an anarchic spirit on the internet, no matter how severe the censorship is going to be. And that part of the internet is really the better part.


Checking if the user has "Do Not Track" or its newer equivalent set and just triggering the same flow as if the user had refused all tracking (outside of what falls under "legitimate interest" according to GDPR, not IAB) would be a perfectly fine implementation.
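
A minimal sketch of that idea, assuming a Python backend that just inspects the request headers ("DNT: 1" is the old Do Not Track signal, "Sec-GPC: 1" the newer Global Privacy Control one):

    def wants_no_tracking(headers: dict) -> bool:
        """Honour either opt-out signal sent by the browser."""
        return headers.get("DNT") == "1" or headers.get("Sec-GPC") == "1"

    def consent_state(headers: dict) -> dict:
        if wants_no_tracking(headers):
            # same outcome as an explicit "reject all" click: only what is strictly
            # necessary or covered by legitimate interest, nothing else
            return {"strictly_necessary": True, "analytics": False, "marketing": False}
        # otherwise fall back to whatever consent UI the site normally shows
        return {"strictly_necessary": True, "analytics": None, "marketing": None}   # None = ask

    print(consent_state({"Sec-GPC": "1"}))   # no banner needed, treated as a refusal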


"Do Not Track" has some problems with reputation, probably "Do Not Track Even The System Cookies" might solve all the problem such as repeating to see GDPR banners and Mozilla certainly will not make the "Do Not Track Even The System Cookies" as a default choice.


They are a very minor part of GDPR. It's kind of a shame that it is most people's only experience of it, because the other changes are all very positive for EU citizens.


> They are a very minor part of GDPR.

Cookie banners are not a part of GDPR. Especially not the common dark-pattern-riddled "we sell your info to thousands of companies" ones


As a non-EU citizen that is affected by this, I find the entire idea stupid and they're annoying as heck. Every website I go to has a stupid cookie banner. It's beyond ridiculous. This should be a simple setting in the browser that I can override per site and never see a cookie banner again. This is a perfect example of government overreach done wrong.


Not tracking people solves the entire issue. Complain to the companies who want to fingerprint and track you; those are the evil ones and the ones with overreach. Without that, you don't have to put up banners.


> This should be a simple setting in the browser that I can override per site and never see a cookie banner again.

There was the Do Not Track header, which the industry you're defending immediately used to fingerprint and track people.

> This is a perfect example of government overreach done wrong.

The law doesn't say anything about cookie banners. The blame for them lies squarely with the great amazing privacy-preserving and customer-loving industry of ours.


It's horrible.

Cookies should be a client thing, browsers should forget them once the tab/window is closed by default, and there should be a button by the url bar to remember cookies for that domain. EU should mandate the default settings in preinstalled browsers on all devices sold in eu, and that would solve 99% of the problem.


It's mostly malicious compliance.


They remind me of how we as humans often end up with ineffective solutions to problems we try to solve. It's like it goes off the rails halfway and ends up stuck there.


I have uMatrix installed and never see them. The greatest achievement for GDPR for me was that it significantly reduced the amount of spam I receive.


They are an annoyance, but it is great not to have websites track you if you select them not to.


The ones that disappear AS SOON as you click your choice are ok :-)


It's great to know which websites track you and which don't.


Cookie banners are just like ad banners: adblock fodder.


IMHO one of the worst things about the current state of the web, especially if you're browsing on a device like an iPhone which doesn't have uBlock Origin.

It used to be, you open a website, you can view the content. Now it's more like you open a website, get an overlay popup, take 30 seconds to solve the dark pattern logic puzzle of disabling tracking, then you view the content. Every. Single. Time.


I like them, it's a sign that the website is shit and helps me go away from it.


Extremely annoying.


Here's a thing governments should say a lot more - a combination of being humble and also instilling humbleness in companies that need to hear it:

"Look mates, if you don't clean up your stuff, we're gonna have to write a bunch of laws and nobody wants that, so, fix your problems". (i.e: Government is aware legislating something probably results in a highly suboptimal/inefficient endresult, but, it's the only tool governments have. It's just as much on a company / on all companies in a given area of expertise that government sees no way out other than to wield the heavy and inaccurate club of law-making).

Separate from the many comments that highlight both how [A] the GDPR is a lot more than those cookie banners and [B] those cookie banners are like those email footers - lawyers gone nuts. A whole bunch of them are not required as per GDPR, just there because someone asked a lawyer and of course they say 'yeah sure add it'. If I ask a painter if it's a good idea to paint my house at this point in time, they're bound to say 'yes':

Companies went way too far with it, so GDPR happened. It's a good thing to teach companies that going too far results in annoying laws. It's fine even if the GDPR is a loser - if everybody loses, that's.. sad, and a better law would be much preferred, but that can still be better than no laws, because its mere existence tells companies to try not to go too far.

So far, US-based companies do not appear to have understood this (see also: USB-C rules, apple's current problems, repeated and widespread illegal shit going on at banking institutions), but it's worth doubling down on it until they do.


> Companies went way too far with it, so GDPR happened. It's a good thing to teach companies that going too far results in annoying laws. It's fine even if the GDPR is a loser - if everybody loses, that's.. sad, and a better law would be much preferred, but that can still be better than no laws, because its mere existence tells companies to try not to go too far.

What rather happens/happened is that the hatred in the population for the EU increased a lot (as witnessed by the rise of right-wing nationalist parties, even though admittedly there also exist a lot of other reasons for this rise).


I have been involved in trying to implement GDPR, and one constant source of frustration was that it's not actually very clear when a system is fully compliant. We had engineers and product people and legal people who had different understandings of what various terms meant.

For example, you have a 'right to object' to a data controller doing further processing of your personal data, and you have a 'right to be forgotten' in which case we should not keep your personal data any longer -- but we need to remember enough that when we encounter you again, we recognize that you've objected to further processing. How are we supposed to know in future interactions that you've opted out if we've deleted all mentions of you and your PII?

What does 'deletion' mean? If I have a DB which is based on an append-only WAL which can include write and delete "operations", which mean the DB will respond to queries as if the record were deleted, but the record is definitely on disk, and still gets read into memory but just isn't returned as part of any query, but someone with access to the machine could still in principle read it ... is it 'deleted'? Are you 'forgotten'?

What if you're gone from the DB but an old DB backup in cold storage still has a record? What if a columnar file for the datalake in block storage still has a record but you're gone from all DBs that are part of online systems? What if no DB has your raw PII, but your IP was added to a bloomfilter or other sketch datastructure, so it can't be read back out but we could potentially identify with some confidence that your IP had previously been in our logs?

I totally think GDPR was a step in the right direction, and I wish my own country would adopt a strong data privacy law. But I also wish that the EU had set up e.g. a certification system, a large set of reference examples for how pieces fit together, or something to give implementing parties some confidence about whether they're doing it correctly.


Oh right, the other one that was totally baffling: people have a right to request their data, and we should give it to them in a timely manner -- but it's not sufficiently clear how we should know that the person making a request, possibly based on PII, is in fact the data subject in question. How do you check that? How do you find all the data associated with a person, given that accounts are not people? Sometimes it's going to be impossible, but GDPR doesn't say that your company's obligation to give people their data is contingent on them still having control of the email address they used to sign up years ago, etc.


> but we need to remember enough that when we encounter you again, we recognize that you've objected to further processing. How are we supposed to know in future interactions that you've opted out if we've deleted all mentions of you and your PII?

Just hash the PII, delete the original and reinsert the hashed version (perhaps into another table). On insert check if hashes match and the opt out bit is set, if they match and it’s set then act appropriately.
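
A minimal sketch of that suggestion, assuming the identifier is an email address and using a keyed hash so the table is useless without the key. (Whether a hash of personal data is itself still personal data is exactly the objection raised in the reply below, so this is an illustration, not a recommendation.)

    import hashlib
    import hmac

    SECRET_KEY = b"rotate-and-store-me-separately"    # hypothetical key, kept outside the main DB
    optout_hashes: set[str] = set()

    def _digest(email: str) -> str:
        # HMAC rather than a bare SHA-256, so the list can't be checked without the key
        return hmac.new(SECRET_KEY, email.strip().lower().encode(), hashlib.sha256).hexdigest()

    def record_optout(email: str) -> None:
        """Remember the objection, then delete the original record elsewhere."""
        optout_hashes.add(_digest(email))

    def may_process(email: str) -> bool:
        """Checked on every new insert: skip anyone who previously objected."""
        return _digest(email) not in optout_hashes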


You used the word "just" there, but I don't think storing hashed PII is necessarily an out here. Clearly, the whole point of the suggestion is that we can still match the person against a record, and the hashed device id or ip or email or whatever else is functioning as an identifier.


The best thing about GDPR is that in case of violation, you're not immediately raked over coals.

The responsible national body is supposed to cooperate with willing business in ensuring compliance - if the business cooperates in good faith, the fines are minor or can be waived altogether.

And part of such cooperation would be for the regulator to help pin down the specifics of the case.


Hmm, if anything the emphasis on national bodies having discretion in when this is enforced seems sketchy and concerning to me. Sketchy on the regulatory side because I don't trust them to not pick targets in a manner which is informed by public sentiment or some other motivation. Concerning on the corporate side because I have heard people say things like "oh, it doesn't have to be 100%, we just need to be good enough to not be a target for enforcement", which then becomes the basis for not making an entirely good faith effort.

Whereas something like PCI compliance for businesses handling credit card transactions isn't a law, but does have pretty clear standards, a certification process, and serious potential risks to your business if you do it wrong. But you go through the process, you get certified, and then you know you're doing it.


It's not "discretion at enforcement".

It's that the goal of the national body is to get you up to compliance, with reduced fines as incentives for helping you get there. The difference with PCI is that with PCI, non-compliance means that payment networks will refuse to work with you, but at least in theory you're not forced to work with them either - you cannot "opt out" of GDPR. So the purpose of the national body is to get the rules followed, and it has both ways to sweeten the deal (reduced fines for a cooperating business) and the stick (a vast increase in fines).

Worst thing about GDPR was the cottage industry of "GDPR lawyers" that exploded around it coming into force in order to capitalize on panicking businesses, often fueling said panic. In reality, the main difference in GDPR versus previously active law is that GDPR increased fines and introduced policy framework for enforcing and helping compliance.

BTW, there's no such thing as "not being a target". It's just enforcement depends on finite resources - and on complaints from people.

And "do the minimum to not be target for enforcement" would be actually grounds for increased fines for acting in bad faith.


Your description sounds exactly like I'd expect a regulator to describe discretionary enforcement.

My point about PCI is that people can actually implement it and know that they've done so. If the goal of GDPR is to get people up to compliance ... well since no one but the regulators can determine if a company is complying, we have no idea whether they're actually accomplishing that goal.

The cottage industry you're complaining about exists precisely because the law was so unclear. I remember reviewing "non-binding recital" documents before it went into effect which were not helpful. For a law which created new requirements for systems at thousands of companies, they could have written material which directly was targeted at the parties who would be responsible for implementation, and which they would commit to being binding. But they didn't.

> you can not "opt out" of GDPR

For companies based elsewhere, there was a very serious consideration of should we just not try to do business in the EU. If you're a midsized company in the US faced with a large engineering effort to possibly meet regulatory requirements for a region where you have a small minority of customers, it's a real question.


I wonder if some of that doesn't come from different interpretation doctrines in different places. Under the teleological interpretation used by practically all courts relevant to GDPR, there's more flexibility than in the mechanical, direct-wording approach used in some places - and practiced by many PCI implementations and auditors to the point of being a running joke in some circles. A basic guiding principle of GDPR can be considered to metaphorically make "personal data" toxic, something you want to deal with as little as possible (with "sensitive private data" being extra toxic), with the goal of preventing profiling and other abuses. A significant majority of business processes do not actually require sensitive data. A company that does not handle sensitive data does not even require a DPO.

Similarly, the flexibility means that as a company you can make a point of showing action in good faith, while regulator does not have to update the law for every possible new detail.

Consider for a moment that GDPR provided, on average, about ZERO new requirements compared to previous law on the books.

Yes, zero. If anything, it cleared up some questions from before.

The big difference is that previously, most member states law had consequences for non-compliance that were pretty much ignorable especially for bigger companies.

The cottage industry that sprang up was mainly oriented at extracting as much value as possible from the panic, and in fact often increased said panic on purpose. If you're in the B2B market, you might have close to zero work anyway.

The big problem was that with extremely lacking enforcement before, corporations built up, often unknowingly, huge amounts of mess and now someone started considering that the fines might be substantial.


As a German, and as a privacy conscious person, yes, I think it's worth it.

As a CTO of an AI company that does automated background checks for international compliance, .. jesus fucking christ, my job is 90% GDPR compliance at this point.

off-topic: did you know in Germany you need a court order to surveil Taliban terrorists ... in Afghanistan? Fun times.


> did you know in Germany you need a court order to surveil Taliban terrorists ... in Afghanistan? Fun times.

sounds reasonable, otherwise it'll be abused, no?


Abused how? If you need sigint on a possible bomb threat you learned about via humint in the field and you're waiting for bureaucracy to happen, people will die, and they have died already.


I could see plenty of ways in which a law like "you can surveil people if xyz without a court order" could be abused


Side note:

German intelligence service BND famously argued ~10 years ago that them intercepting satellite communication at their listening station in Bad Aibling does not in fact happen on German soil but in space. Therefore, they argued, their surveillance activities weren't bound by German laws at all.

https://de.wikipedia.org/wiki/Weltraumtheorie


> The very people who comply with and execute the GDPR consider it to be positive for their company, positive for privacy and not a pointless, bureaucratic regulation.

It's their job to do GDPR, of course they consider it a positive.


I’m a run of the mill developer and I consider it a net positive.

Less sensitive customer data to process.

Less stuff compromised in case of a security breach.

Easier to say no to product managers who wants to unnecessarily track customers.

Faster response times since fewer scripts are loaded on first visit.

The largest flaw is the lax enforcement in some EU countries.


It's usually not their main job. In many companies, managing GDPR is a responsibility that just gets added to someone's job description.


This is about the random sysadmins and devs that have to ensure their apps or systems comply with GDPR. Not the regulators.


> It's their job to do GDPR, of course they consider it a positive.

No, actually it is annoying to do. But it is extremely positive. The two things do not conflict.

It's annoying for restaurants to maintain the high level of hygiene that the laws enforce. But I expect most people in the restaurant business see it as a positive (and I'm not going to the restaurants that don't...)


So, regulators consider that regulating is good. Science is amazing, really.


Not just regulators, I'm a developer in the EU and I find GDPR a great thing. I have yet to find a fellow developer here that disagrees.

Yes, it can sometimes be a hassle to comply with it. But it's a no-brainer that it's a good idea.

In that sense this site is a bit of a bubble. In real life it's not even a discussion.


> Not just regulators, I'm a developer in the EU and I find GDPR a great thing. I have yet to find a fellow developer here that disagrees.

I know lots of developers who are still today incredibly furious about the legal effort that complying with GDPR requires (even if they - typically - don't intend to track users).


Then they should change whoever is advising them on law, cause they got duped.


> I find GDPR a great thing. I have yet to find a fellow developer here that disagrees.

Besides compliance costs, operational complexity, impact on small businesses, inconsistent implementation across EU Member States, and extraterritorial reach and global compliance challenges...

IMO a lot of the text in the law is extremely vague and subjective, while simultaneously sounding very longwinded and authoritative.

> Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data

Notice it says “should”, not shall. Does this mean consent is technically optional? What is their definition of informed, affirmative, or agreement?

Requiring companies to remove individual records from backups also quickly proves to be a practical impossibility.

And it’s ironic to me that the law requires you to have even more centralized control over personal data from start to finish than they have now. To me this is a bit alarming from a security AND privacy perspective.

> The requirement to notify regulators of data breaches

Here’s a scary thought… what if they just stopped acknowledging breaches in the first place and claim they weren’t aware of it? What if they actually were unaware because the system was designed that way? There’s no law against being oblivious to breaches.

Other undefined terms such as “undue delay,” “likelihood of (high) risk to rights and freedoms” and “disproportionate effort” will require further clarity by the courts or regulators, or time for specific market practices to develop.

They don’t even define what a website is. Does that require HTTP be used to qualify? Or HTML? What about FTP or other protocols?

Also, it forces even non-EU platforms with zero domestic presence to risk being blocked due to not complying with rules that don’t even apply to them. Seems like net neutrality is fundamentally incompatible with GDPR in some ways. What if the US blocked EU sites for not being free enough?


> Notice it says “should”, not shall. Does this mean consent is technically optional? What is their definition of informed, affirmative, or agreement?

It looks like you are reading from the preamble, not the provisions. The purpose of the preamble is to describe the general intent behind the regulation or directive, not the technical requirements themselves. When national regulators of member states create the laws that are legally-binding within their jurisdictions, they might refer to the preamble to check that the spirit of the law is compatible with the original European text. The preamble can also inform judges in the situation that a case is taken to one of the European Union's courts. When one finally gets to the provisions, there's even more introductory text, but it's a lot more terse than the preamble.


> Notice it says “should”, not shall. Does this mean consent is technically optional?

Does it mean you never actually read the law? Does it mean you assume that the words of an HN user are a direct quote for the law? Does it mean that any statements you make about the law are false and misleading because you never read the law and rely on misinterpretation of the words of strangers to paint a picture of what the law is about?

The law does not deal in "should"s.

Scroll down to "suitable articles" to see what the law actually says, and not what you think it says: https://gdpr-info.eu/issues/consent/ Start with definitions and work your way through the referenced articles


That was my own quote taken directly from the exact law text.

https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELE...

There's many more as well:

> The processing of personal data should be designed to serve mankind

> Natural persons should have control of their own personal data.

> Legal and practical certainty for natural persons, economic operators and public authorities should be enhanced

> The protection afforded by this Regulation should apply to natural persons

And that's just from the first 3 pages.


Yes, because the first three pages are justifications and reasoning for the law to exist.

Which means that you clearly did not read the law. Truly, you can lead a horse to water...

Now quote the actual regulation. You can post the entirety of Article 7, for example


To clarify, if I'm not mistaken, out of 422 shoulds in the document, 420 are in Chapter I, "General Provisions", which:

- spells out more-or-less in layman terms why the law exists, its scope and applications

- definitions that will be used throughout the document

There are two shoulds used in the rest of the regulation, in Article 47.1(j). IMO that has to be shall, too, but it's in a large list of other binding corporate rules, so it's not too bad.


> Besides compliance costs, operational complexity, impact on small businesses, inconsistent implementation across EU Member States, and extraterritorial reach and global compliance challenges...

In my experience the compliance cost really isn't that big for the vast majority of businesses. Where it does get trickier is if you're collecting a lot of data relative to the size of your customer base e.g. facebook, google etc. and in those cases I think some extra eyeballs on how data is handled is sensible.

Pretty much everything GDPR gets you to do is good practice, so if as a business you can't comply then imo your business is on shaky ground. For example:

- Right to be informed -> have a privacy policy which explains what data you collect, what it's used for, how long you keep it, when you delete it and so on.

- Right of access - > have a mechanism for providing users with a copy of their data.

- Right of rectification -> have a process where you can update customer data.

[1]

> There’s no law against being oblivious to breaches.

I'm not sure where you got this idea from. One of the seven key principles of GDPR is "Integrity and confidentiality (security)", or more explicitly: "You must ensure that you have appropriate security measures in place to protect the personal data you hold." [2] Being oblivious to breaches clearly goes against this.

> Also, it forces even non-EU platforms with zero domestic presence to risk being blocked due to not complying with rules that don’t even apply to them.

GDPR applies to the resident whose data you are processing, so if as a business you want to deal with those customers, you need to deal with those laws. Seems reasonable enough to me.

[1] https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-re...

[2] https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-re...


> Being oblivious to breaches clearly goes against this

I don't see how it does. Even "ensuring appropriate security measures" is subjective in the first place, plus as we all know, no amount of security is perfect and breaches will always happen either way.

Even if you took enough subjectively appropriate security measures, and it still happened anyway, there's nothing that guarantees you'll even know about it in the first place.


Yes, yes it is.


There still exist data brokers and personalized advertising in the EU so it hasn’t gone far enough.


EU regulations are why I have a USB-C iPhone with a replaceable battery and a GBA emulator.


Wow, you can replace iphone batteries in Europe?


You can do it worldwide on iPhone 12 and up iirc, you just bring it to the apple store and they do it in a few hours. Not user replaceable but better than the whole phone.


The study only tells us that some workers who implemented GDPR think it was worth it. As if they had enough information and context to judge whether the extra work was useful. But people do have a bias to believe that their work is useful.

The study does not tell us if GDPR is worth it. It's not.


This comment only tells me that one specific HN user thinks it's not worth it.


German here.

+ I like the right to be able to download a dump of my personal data from platforms

- cookie banners are a total pointless waste of everyone's time

- institutions and organizations use "data protection" and "privacy" as a generic gaslighting argument for things they can't do and/or lack competence in, and as an excuse for keeping their 20y+ old processes

- consumer data is in no way safer than before, just the means of collection have changed

- the law is so open to interpretation (e.g. "what exactly is PII data? IP addresses? the evercookie? what is an evercookie?") that, without blindly copying implementations from other sites, you'll probably end up in a grey area

- they started using GDPR for politically motivated, unrelated prosecution. E.g., the first thing Italy did when ChatGPT became the hype was use GDPR to quickly block the service. The case is only somewhat related, because of the data scraped for training; however, GDPR in my interpretation is about personal data.

So all in all is it worth it? I guess no. But YMMV.


> "what is exactly PII data?"

PII is not even mentioned in the GDPR. It is a notion in US law.


> the law is so interpretable (e.g. "what is exactly PII data? IP addresses? the evercookie? what is an evercookie?")

It's only open to interpretation if you never read it. You could start by showing where the law defines PII or where it mentions "evercookies" (or cookies in general).


The abstract points to totally unbiased science /s

GDPR is way overrated. For most businesses it was a minor annoyance, but the benefits are nowhere to be found. Our data is mashed potatoes now (GPTs) and will forever be. Politicians loved GDPR because it gives them +100 undeserved virtue points.

The real problem with BigTech is the addictiveness, and we are only beginning to wake up to it.


> ...but the benefits are nowhere to be found.

I disagree pretty strongly:

- Loads of companies now have tools for getting copies of all the personal data they hold on you, e.g. Google Takeout, which is a direct consequence of GDPR.

- Companies are punished for improper handling of personal data e.g. data breaches (though you could definitely argue that more should be done here)

- The US government's lacklustre approach to data privacy has been repeatedly laid bare (https://www.gdprsummary.com/schrems-ii/), leading to better data residency practices, e.g. many services now have a "store data in the EU" option (a small region-pinning sketch follows below)

Not to say there aren't a lot of dodgy things happening in the data collection world, but to say the benefits are "nowhere to be found" is an exaggeration.
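For what it's worth, the "store data in the EU" option mentioned above usually boils down to a region choice at provisioning time. A minimal sketch assuming AWS S3 via boto3 (the bucket name is made up):

    import boto3

    # Pinning the bucket to an EU region keeps the stored objects in the EU;
    # it doesn't, by itself, answer who can access them from where.
    s3 = boto3.client("s3", region_name="eu-central-1")
    s3.create_bucket(
        Bucket="example-customer-data-eu",
        CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
    )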


Google Takeout was released in 2011.


Surely the benefits are for the users?


GDPR: All websites ask if cookies are ok. If not ok, can't use website. Everyone clicks ok. Amazing work, regulators.


I always choose the “essential cookies only” option. This is normally harder than accepting all cookies, but it is getting easier over time. I rarely see a site that has essential-only.


> I rarely see a site that has essential-only.

You'd need no cookie banner, then. :-)


GDPR is not the ePrivacy directive.

The latter is the one that resulted in the cookie banners; the former protects the right of Europeans to have their private data remain private.


Even with the ePrivacy Directive you don't need the dark-pattern banners the industry puts up.


Correct. The banners started as malicious compliance and then became the "industry norm", so everyone followed suit.


What's even more annoying is that there were already browser plug-ins, and now there are built-in browser tools, for rejecting cookies or deciding which cookies to store permanently versus which to accept temporarily. GDPR forced a one-size-fits-all solution to a "problem" that was already solved, and got people used to a hostile UI pattern on websites.


> GDPR forced a one-size-fits-all solution

Can you point me to the exact place in GDPR that talks about browsers and cookies?

> to a "problem" that was already solved

If it was already solved, why did the industry come up with dark patterns to trick people into blindly agreeing to tracking, and then blame the law for it?


Not everyone clicks ok. Having a choice is better.


You always have the choice not to use the internet. A far better solution would be to manage this on the client side.


I always click no and the websites do work. What do you mean?


> If not ok, can't use website.

If you care to read it, you'll find out this is the opposite of what it says.


Not everyone clicks ok. You can typically use websites even if you don't click ok.


I wonder how many websites show a cookie banner and don't change a thing whether you click Accept or anything else.


The last study I saw put it at about 15% for violations detectable client-side (e.g. loading ad-tracking scripts before the banner is interacted with, or despite a negative interaction).

Probably a bit higher when you consider server-side tracking and those who make up not-legally-valid claims of "legitimate interest".
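The client-side part is easy to approximate yourself. Here's a rough sketch with Playwright in Python (the tracker-domain list and URL are just placeholders); it simply records which requests fire before the banner is ever touched:

    from playwright.sync_api import sync_playwright

    TRACKER_HINTS = ("doubleclick.net", "google-analytics.com", "facebook.com/tr")

    def pre_consent_tracker_requests(url):
        hits = []
        with sync_playwright() as p:
            browser = p.chromium.launch()
            page = browser.new_page()
            # Log every request the page makes before we interact with anything.
            page.on("request", lambda req: hits.append(req.url))
            page.goto(url, wait_until="networkidle")
            browser.close()
        return [u for u in hits if any(h in u for h in TRACKER_HINTS)]

    for u in pre_consent_tracker_requests("https://example.com"):
        print("tracker request before consent:", u)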


My least favorite part of GDPR is the requirements around processing EU citizens' data on servers inside the EU. China has similar laws and I think other countries will follow suit.


There is no such requirement. The GDPR allows for the transfer of personal data to non-EU countries, provided that the data controller can demonstrate that the rights of the data subjects will be respected.

https://gdpr-info.eu/chapter-5/



If data is processed outside of the EU, presumably it would then fall under different laws?


GDPR would be fine if it had exemptions for small businesses. As written, startup founders don't have the resources to comply, so they often end up blocking all of Europe.


Why should small businesses be allowed to spy on their users without their consent?


A lot of the time, it's not even spying on users. It's not wanting to put in the time and effort to determine whether you are in compliance or not. So you block all of Europe and get around to it if you ever have the resources or care. You might have been in compliance the whole time, but why chance it when IP blocking is easy? That's basically every local US newspaper right after the GDPR passed. Hell, I've worked for companies where I literally knew we weren't tracking users and we were pretty secure, but we blocked the EU because no one had the time to check if there was something specific we needed to do. My current company had to re-architect its entire deployment pipeline specifically for the EU, not because we changed literally anything, but because the lawyers found something about our cloud host provider that the GDPR disallowed, since it was hosted on US soil. We have one EU client. I assume if they weren't so big we would have dropped their contract.


My wife and I run a small (2-person) business in the EU. The largest hurdle was finding a hosting provider (VPS) that wouldn't transfer data outside the EU, so we wouldn't have to add SCCs to our privacy policy. As a business owner, I'd say the balance is still positive; it forces some self-reflection on data-gathering practices.

Not sure about the "hosted on US soil" part: if you are a US company, the data gets transferred anyway when you view it.


Why do you think not being willing to put resources into complying with the rules equals intentionally "spying on their users"? And what makes you think the rules guarantee that businesses can't spy on their users without their consent? You'd do better to look at what's inside the law's box instead of just at the packaging.


The value of data varies a lot. Something like behavioural targeting data for marketing is probably inconsequential. But what about health care or financial information? Those could have a much larger impact, and they could be handled by a pretty small business. It is easier to give generic guidelines than to specify each sector separately.


If you're too stupid to handle a simple "do not track, write down that you do not track, do not sell your users to others" policy, I have bad things to say about the validity of the rest of your startup.


"The very people who comply with and execute the GDPR consider it to be positive for their company, positive for privacy and not a pointless, bureaucratic regulation."

I have been involved in multiple GDPR compliance pushes and I want to emphatically state the opposite position. It is not worth it. It is not worth ANYTHING. It should be destroyed.


Do you have more detail to share in support of such a strongly worded opinion?


The removal of WHOIS information for domains, which GDPR directly and indirectly brought about, has made the internet a drastically less communicative space. In the old days you could always just email someone if they had a website on some TLD. Now communication is usually not possible.


IMHO privacy is important. If a domain owner wants to be contacted, they know where to put their contact info.


If you're a business, you should be required to have the actual name and address of the company in Whois.

That's a requirement for the .us TLD, by the way.


Guess what, corporate contact information does not fall under GDPR.


Right, that falls under the European Directive on Electronic Commerce, which is strongly against business anonymity.


The law already forces you to state it clearly on your site, AFAIK.


In which jurisdictions? California law does, if you accept payments. It's not, as far as I know, a US federal requirement.


Sorry, I thought we were talking about the GDPR context. Don't know about other jurisdictions.


Nothing to do with GDPR and everything to do with unwanted spam from email and phone number harvesting.


I suppose that if someone wants to be contacted, they can provide a contact link on the website in question. Not to mention that the whois databases could offer the option of a public profile. Loudly complaining, however, is much easier and much more popular.


Is it less convenient? For sure. Is it impossible? No.

Even NICs such as DENIC, which do not divulge any contact information on any old whois query, usually send a response explaining how to get the contact information, at least for abuse and admin contacts.


For domains, yes. For IP addresses, no, as abuse email addresses are still a requirement.


Compared to the number of people with domain names, the number of people running their own ASN with their own IP netblock(s) is unfortunately insignificant.



