Ask HN: Why are law documents (GDPR) so difficult to understand?
71 points by evervevdww221 on March 23, 2018 | 80 comments
As much as I want to comply with the GDPR, I think its articles are difficult to understand, like many other law documents.

https://gdpr-info.eu/

As an engineer, I find it very difficult to translate the regulation text into code, into an actual implementation.

Taking the following statement as an example:

https://gdpr-info.eu/art-5-gdpr/

>>>

(Personal data shall be) processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’).

===

"In a manner". In what manner?

What's "appropriate security" and "appropriate technical measures"? How to interpret it? There seems to be much flexibility?

Every website has some security measures that protect data to a certain degree. How do I know whether that's "appropriate" or enough to meet the GDPR?

Do I need symmetric encryption, or asymmetric encryption? Which kind of cryptographic hash is considered "appropriate"? What if I use a database that has security flaws I don't know about, or lack the technical expertise to discover? What if encryption on my backend causes a performance penalty? What if I run a hosted, non-profit BBS based on some open source BBS program that might be insecure? Should I patch the server with OS Update JKB8948, which is known to fix a security hole but open another? Is that an "appropriate measure"?

I feel this regulation puts too much of a burden on small businesses. Just understanding the GDPR text may require paying for consulting. What if this law is abused as a tactic to attack business competitors? I'm worried.

How do you interpret the "security appropriateness" in the text above? How can you be sure your understanding is correct?




This might sound a little mean, and I don't mean it to be this way, but this is a really naive viewpoint.

Look at any profession -- accounting, for instance -- and you'll find all sorts of stuff like this. As an example, there's a concept in accounting of "materiality" - basically, something that's big enough to matter. Materiality is what lets Fortune 500 companies present their financial statements rounded to the nearest thousand dollars. When you're talking about tens or hundreds of millions, individual dollars just don't matter.

Whether or not something is "material" is a matter of professional judgment, to be made in the context of a large body of professional knowledge, history, prevailing industry standards, economic/cost considerations, etc, basically that thing called "experience" that we so often toss under the bus in SV.

Perhaps the biggest difference between law and code, which are in many ways quite similar, is that law is highly reliant on context. For a court to determine whether "appropriate security" and "appropriate technical measures" were followed, it would solicit testimony from experts in the field (people like us) on whether they felt someone took "appropriate security" measures. So ultimately it's a matter of opinion, but one made with context and expertise.

It works surprisingly well.

EDIT: For really complicated stuff, implementation is often delegated to an agency, such as the FCC, to create specific guidelines like you want. But this is the job of executive action, which is easy to change, not statute (on-the-books laws), which is much harder to modify once passed.


One thing I think a lot of people don't realize is that the more specific a law is, the more it becomes like a zero-tolerance policy. Allowing for ambiguity, as you said, allows for the law to be enforced with context.

A contrived example: I could try to look up some tax information on the IRS website. An error occurs, and the server spits out a bunch of log data not meant for the public. This data happens to contain sensitive URLs. I navigate to one, and it gives me unfettered access to the server. So long as I stop here and report it, I should be in the clear.

I don't. I look around a bit to see if I can include additional details when I contact the proper person. I haven't actually done anything bad per se, but now I'm knowingly accessing a government computer system without proper authorization. A law with full specificity would say that I should be jailed for looking around. Common sense says that although I should have closed the tab, I was only doing my best to help, and since I never did anything detrimental, I should be in the clear.


I do understand this.

I am just wondering what happens when there is a vested interest in attacking or suppressing the company involved.

For example, if a company becomes unpopular on social media and by "public opinion" (such as Facebook right now), a court can feel pressured into a slanted decision. Given that so much is now based on opinion, what defense does the company have?

It seems that if someone had the intention to nail a company on GDPR as a PR attack, regardless of the amount of effort the company put in, they almost certainly could.

(I don't work for Facebook)


Perhaps you could describe how you would pressure a judge successfully?

I often see comments like this: abstract what-ifs without any details on the "what".

So, try to illustrate what might happen. Also, describe what protections the judges might have against this. It’s a useful mental exercise and you might realize that it’s a fair bit harder than posting on 4chan or Twitter.


See reply to Tomte on that.


That's why we have independent judges.

"Pressuring a judge" would be an impressive feat. They are generally obnoxiously aware of their untouchable status.

And if society's stance really changes, we want the courts to take that into account. Again: feature, not bug.


Well I'm talking about exactly that - social pressure.

Just because popular opinion (aka the vocal social media / news / social media echo chamber) approves of something, it doesn't mean it is correct.

Governments and courts have definite pressure to legalize marijuana, for example. That pressure is based on popular public opinion. Therefore approving it gives that legal body or state acceptance / goodwill. This is an incentive that goes quite far.

It can also be popular to smash a company.


I don’t think this deserves to be downvoted. It’s a valid perspective, one I think is shared among those who are perhaps removed from legal specifics and how they come about.

I also think this point resonates fairly well in smaller courts (read: maybe more rural areas) where the legal system is closely tied with the social system of the area and there are indeed LOTS of incentives to introduce, we’ll call them, ‘alternative judgements’.

All that said, I think law has to be appropriately ambiguous in order to remain relevant and applicable through change and societal adaptation in norms. Hence, case by case context.

This is why it looks to contain so much flex in the language. Right and wrong is implicitly an ambiguous and ever changing notion, described and defined only by the same body of individuals that mutually agree to uphold it. It’s fluid.

However, I also see the perspective that the fluidity of societal definitions, and the increasing ease with which technology can influence a vast chunk of a population’s opinion, can make these things misalign with ethical appropriateness. See the Nissan.com website case, or any number of other court cases that clearly concluded under the coercive pressure of the more powerful/wealthy party.


>For really complicated stuff, implementation is often delegated to an agency, such as the FCC, to create specific guidelines like you want.

If you start at around article 43 and work your way onwards, over the next 20 or so articles the GDPR document goes on to specify that all nations should set up organizations to perform this task and that these organizations have a responsibility to create and make available such specific guidelines.


FWIW I recently attempted to translate literally the entirety of the GDPR into Plain English (albeit for a technical audience). It's at:

https://blog.varonis.com/gdpr-requirements-list-in-plain-eng...

In general I think legislatures putting out goals/guidelines instead of detailed specifications is a feature, not a bug. Tech moves faster than they can possibly keep up with, and calling things out down to the patch-note level just isn't feasible.

Try to think of it more like: "jury of your peers". If a dozen fellow sysadmins / devops / programmers would consider what you're doing to be reasonable then you're probably ok.

One big caveat to that with the GDPR is that the legislature is very purposefully pushing for what many would consider fairly innocuous "personal data" to be treated more like how many developers today treat credit card numbers or banking info, including PINs and passwords.

If the format/style of the article feels familiar to you, it's probably because you read "AWS in Plain English" which I also wrote and which periodically blows up on HN.


I'll second Michael's page, he has definitely provided a useful starting point.

The law itself is not written for engineers as an audience. Not even for non-specialist-data-protection-lawyers as an audience.

That said, as an engineer, I found a book targeted to non-specialist lawyers to be enormously helpful: Peter Carey's _Data Protection: A Practical Guide to UK and EU Law_: https://www.amazon.com/gp/product/B00VU5XJHK/ref=oh_aui_sear....

It's not cheap, but if understanding GDPR is a professional concern, consider it a resource for explaining the history and motivation for the requirements that Michael extracts.

In the wake of the data protection issues we're having here in the US, I would love to have a GDPR-influenced regime.

And for a small business, it mostly just means being careful and respectful with people's personal data, which can be done without it being a burden.


Nice summary. I liked the "plain English" format and enjoyed reading it.

You might want to put a disclaimer in your blogpost that this is not legal advice.


I wish I could upvote this a lot more than once. This is excellent, thank you.


You don't really expect a law to specify which hash algorithm you're supposed to use, do you?

The answer is simple: the law will stand for a long time, and legislators know their limits. Unlike many engineers, unfortunately.

Having courts interpret laws, with help from experts, is not a bug, but a feature!


I really like this explanation. I think code could be thought of as extremely formal requirements interpreted by the computer extremely rigidly (called instructions). And the process of making software is a translation from high-level requirements specified in all sorts of ways (like this law text) to gradually lower levels of abstraction. We meet somewhere in the middle by using APIs/SDKs. And I think programming languages could benefit a lot from becoming more elastic, in the sense that they should allow you to consciously choose which level of abstraction you need at every point in a project's lifetime. Not in the sense that PHP is dynamic and eats errors, but more in the way the optional type forces you to consider whether you care about which values you really need and what to do if you don't have them. Or kind of (not this, but kind of!) the way some business process modelling tools let you create a graphical model of what work needs to be done and let you automate parts of it gradually. I don't know if this makes sense, but programming languages have a very long way to go and I think they can be used in far more ways than we imagine today.


I disagree with “code can be thought of as formal requirements” - perhaps if you twist the meaning of the word requirement, yes.

This analogy puts us into a place where we mix intent and implementation and make them the same. It doesn’t allow for bugs in a sense.

This is why I dislike analogous thinking!


I was thinking of requirements in the sense of "tell, don't ask". And I really believe that if we can define intent as implementation (on a high level, of course) we have solved many problems. For example, if you want to sort transactions by time in an accounting system, you should only have to ask for a sorted collection (sort(transactions)) and not worry about how the data gets sorted. Then the people developing the sorting library will be able to optimize the implementation independently. This is something we do today, but I think this concept can be taken a lot further if we think of abstracting even higher. At the same time, I think this requires rigor and extremely well defined boundaries between layers, so we don't end up with a mess like Java null pointers.
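For what it's worth, a rough Python sketch of that "state the intent, not the procedure" idea (the Transaction type here is just an illustration, not anything from a real accounting system):

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Transaction:
        amount: float
        time: datetime

    def transactions_by_time(transactions):
        # The caller only states the intent ("give me these sorted by time").
        # The sorting strategy itself is an implementation detail the library
        # is free to improve without breaking callers.
        return sorted(transactions, key=lambda t: t.time)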


Exactly. The law should express the intent, not the implementation means, which can—and does—improve over time.

Imagine a law mandating SHA-1 in 1999; it would end up enforcing a security problem today.


South Korea did a shockingly similar thing to your example in 1999, which enshrined Internet Explorer's use long after the rest of the world had moved on.


They're delegating creating the detailed regulations to the courts. When some court rules that something is acceptable, or unacceptable, you'll know what the "real" rule is.

Another approach some countries have taken is to have laws cite some set of government regulations, which some part of the government is supposed to update. That tends to be even messier as the courts still get involved, but at least you get faster updates.


Mostly agree, but there's a middleground between the ambiguities OP raises and mandating a hash algorithm. Legislators sure do know their limits; they know that laws that are not narrowly scoped remove a lot of limits on the enforcers.


I did one semester of law school before deciding I didn't really want to go that route, and while I'm not going to comment on how to read this law, I can tell you my most effective technique for really breaking down the language of legal documents:

Treat every paragraph like a great big chain of boolean logic. The confusing parts of law are normally due to long paragraphs of 'and' and 'or', all mingled together. Parsing those in legal documents isn't any different than parsing them in code. But we normally don't have to think that way when reading, so it feels more confusing than it really is.

Try re-reading it specifically looking for the ands/ors, and envision how they really operate on the words of the paragraph, and legal reading will suddenly become far more clear.
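To make that concrete with the clause quoted at the top of the thread, here's my own rough decomposition of Art. 5(1)(f) into its and/or structure -- an illustration of the reading technique, not an official interpretation:

    from dataclasses import dataclass

    @dataclass
    class Measures:
        prevents_unauthorised_processing: bool
        prevents_unlawful_processing: bool
        prevents_accidental_loss: bool
        prevents_accidental_destruction: bool
        prevents_accidental_damage: bool

    def integrity_and_confidentiality(m: Measures) -> bool:
        # The clause lists the events to protect against:
        # (unauthorised OR unlawful) processing, AND accidental
        # (loss OR destruction OR damage). "Protection against X or Y"
        # means both must be covered, so each listed risk becomes an
        # AND of protections.
        return (
            m.prevents_unauthorised_processing
            and m.prevents_unlawful_processing
            and m.prevents_accidental_loss
            and m.prevents_accidental_destruction
            and m.prevents_accidental_damage
        )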


Regulations like this usually draw from other sources for inspiration. So if your company is subject to other regulations (like PCI-DSS), then these really vague sentences start to seem more concrete.

1.) Make sure your software has all vendor-supplied patches.

2.) When personal data is being processed, keep it in RAM.

3.) When personal data is at rest, ensure that it's on a locked-down system and safe. Encryption at rest is called out in GDPR, but it's not required. (The definition of "locked-down" can fill a couple paragraphs, but consider it like SOX - only give access to employees that need it as part of their job title. Block off all access for everyone else - network, physical, logins).

4.) Make sure that all systems that store/receive/transmit personal data are audited and logged, and do not give out access to people unless they absolutely require it. (Anonymize data for BI, developers, business reports when able)
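For point 3, here's a minimal sketch of field-level encryption at rest using the `cryptography` package's Fernet recipe. The FIELD_ENCRYPTION_KEY environment variable is just an example of keeping the key out of the database; key management is the hard part, and nothing here is a statement of what the GDPR itself requires.

    import os
    from cryptography.fernet import Fernet

    # The key must be a 32-byte url-safe base64 key, e.g. generated once
    # with Fernet.generate_key() and stored outside the database.
    fernet = Fernet(os.environ["FIELD_ENCRYPTION_KEY"])

    def encrypt_email(email: str) -> bytes:
        return fernet.encrypt(email.encode("utf-8"))

    def decrypt_email(token: bytes) -> str:
        return fernet.decrypt(token).decode("utf-8")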


For the same reason the control problem is so hard in AI. It's difficult to anticipate how intelligent agents will behave in a reasonably complex environment. To address this challenge, some legal propositions are intended to be clear cut rules, but many are intended to kick the can down the road with less than clear language to be interpreted with the benefit of specific scenarios at hand and experience.


As others have said, the law isn't a technical spec. It describes what you should do, not how you should do it. The concept of "due care" (http://www.businessdictionary.com/definition/due-care.html) comes into play here. The GDPR, at its core, is about legally requiring businesses to care for customer data. It gives customers increased control over how their data is used and how it should be protected.

In answer to OP's question "how do I know it is appropriate," as a first pass, how would you feel if your most important personal data were being treated that way? As a developer, if that makes you uncomfortable, that's probably a warning sign.


Legislators don't (and shouldn't, imho) get into the specifics of how an industry should abide by the laws they make. That makes the law flexible for the future. In the US, at least, if you were brought up on charges of a violation, they would use the reasonable person test: would a reasonable person in your position (with the expected knowledge of the industry/field/underlying tech) have done it the way you did? The reasonable person is hashed out by the jury based on testimony by experts on both sides of the case. Basically, if you are using best practices, you should be ok. Qualification: I am a non-practicing attorney.


But during that process, an accused business will have to spend money and time defending itself, which could be costly.


Others are giving an optimistic interpretation. Here's my pessimistic one (at least in the GDPR's case): they want the wiggle room to subjectively apply these rules to companies they don't like. The intention may be valid, but with the boundaries this vague you can bet that enforcement won't be uniform. It never is, and history has shown how ambiguities in law can be bent for targeted application based on political will.

Also, most comments will say this is just how it has to be because the law cannot be very specific on highly technical matters. I believe that part is true, but it is not just how it has to be. The other option is the absence of the law and alternative measures to tackle this problem (e.g. education/awareness, encouragement of alternatives, public equivalents or assistance w/ caveats, etc, etc).


Really good points.


EU law is written so it can apply for many decades – when the precursor of the GDPR was written (1995), MD5 was considered secure.

So, you should expect the "appropriate" part to mean the current state of the art to keep something secure.

An "appropriate" hashing algorithm today would be bcrypt, scrypt, or potentially still a salted SHA512 with many rounds.

An "appropriate" protection against unauthorised access would probably be a strict permissions setup in your AWS rules, proper firewalling, and potentially at-rest encryption.

An "appropriate" encryption would be AES 256 GCM.

"Appropriate" always just refers to the current state of the art for what is considered secure.


I agree with your points. I think mostly the problem is there is no one specific place to find the list of "appropriate" methods to achieve the objective. Someone working in the infosec field could probably spit them out, but a dev may not be so up to date on such nuances.


Isn't this a risk decision based on 'could I defend this against a likely prosecution'?

In that kind of situation, you'll probably end up getting measured against something between 'industry normal practice', and 'industry ideal practice'.

If you don't expect to actually get prosecuted or audited for compliance by a client or whatever, this probably doesn't matter much.

If you do, then you should probably look at whether an infosec consultant would pay for themselves in terms of avoiding fines or winning contracts.


> I feel this regulation puts too much of a burden on small businesses.

It's not. You are wrong.

> What if this law is abused as a tactic to attack business competitors?

Why would that happen?

> How do you interpret the "security appropriateness" in the text above? How can you be sure your understanding is correct?

You use your knowledge of the regulation to read it and make decisions. If you don't have the required experience, you hire a consultant or a lawyer. Just like you do when complying with any other piece of legislation.


> What if this law is abused as a tactic to attack business competitors? Why would that happen?

For example, Business A has a competitor, startup B, which has fewer resources to hire a security consultant. Business A hires person C to register for the service provided by B with a weak password, and hires D to breach C's account. C claims that he has been hacked, so he takes startup B to court. B goes bankrupt because it runs out of money to hire lawyers.

> You use your knowledge of the regulation to read it and make decisions. If you don't have the required experience,

How do I know I have the required experience (what experience is required is not stated in the regulation text)? I know MD5 is insecure and that passwords need salting. I'm a self-taught, garage-based entrepreneur with $1000 in my bank to either buy food or hire a consultant; is that the required experience?


> B goes bankrupt because it runs out of money to hire lawyers.

Right, that's like any other malicious lawsuit – i.e. this is totally irrelevant.

> I'm a self-taught, garage-based entrepreneur with $1000 in my bank to either buy food or hire a consultant; is that the required experience?

Yes. If you don't have the knowledge or resources to correctly comply with the appropriate regulation, then you should not be operating in that space. "I didn't know that I needed to keep raw and cooked meat separate" would not be a valid excuse in food prep; why would "I didn't know I needed to use a secure hash" be a valid excuse for an engineer?


Food safety standards are a concrete set of rules that are trivially captured in training for low-wage, low-literacy, high-turnover workers. Restaurants do not need to hire specialist lawyers to guess at how the courts might interpret them. This analogy is completely inappropriate.


Food safety standards are actually a good example of how laws don't include specifics like "keep raw and cooked meat separate".

For example, the FDA Food Safety and Modernization Act[1] doesn't include specifics about how food should be handled. It specifies some areas where the FDA is supposed to issue rules and then the FDA makes rules based on the authority that the law gives it.

GDPR has Data Protection Offices that issue more specific guidance about how to comply. For instance, the UK Data Protection Office issued this guidance[2] about how to prepare for GDPR.

[1] https://www.fda.gov/NewsEvents/PublicHealthFocus/ucm239907.h...

[2] https://ico.org.uk/media/1624219/preparing-for-the-gdpr-12-s...


So just set the bar of compliance to start any company at several hundred thousand dollars (which is pretty much where it is) and you get the lovely contemporary effect of 40-year lows in new business creation [1][2].

There are active, observable, and semi-quantifiable costs to society when lawmakers create arcane, incomprehensible laws meant to prevent entry into markets by making the barrier of compliance too high for most people to afford to compete.

[1] http://money.cnn.com/2016/09/08/news/economy/us-startups-nea...

[2] https://www.washingtonpost.com/news/on-small-business/wp/201...


The GDPR is not arcane or incomprehensible. It’s quite simple, there are a million explanations out there about how to comply, and even in the case you don’t get it right regulators will invariably give you the benefit of the doubt.

Many things might cause low levels of business creation. I am entirely unconvinced that the cost of regulation is one of them.


The hiring a consultant/lawyer thing has potential to be troublesome for a lot of solo devs. Such services are not cheap and it’s not uncommon for independent devs to be very much cash-strapped.

I wonder if this means we’ll start to see sites/services start out as US only and only become available in Europe once finances are no longer an issue.


That's up to you - what techniques do you know that allow you to process personal data in a way that ensures it's secure? It's a law, not a specification or an implementation detail.

More generally, it's difficult to understand because it's in legalese, which is aimed at people who have studied it for years - it's the English programming language. And like code, there are lots of holes it doesn't cover, despite many person-years of effort.


My favorite website is this: https://ico.org.uk/for-organisations/data-protection-reform/...

They have explained GDPR in reasonably everyday language, with checklists and examples. That site should be your first choice.

If you are a developer, you should check https://github.com/gdprhq/GdprHq.Io.ClientSdk - you can find interfaces and default implementations. For example, to implement the right to erasure (to be forgotten) in your app, you'll need to erase personal data and to inform an individual that you've done so. Even though actual erasure might be tricky, at least you know what you need to implement to be compliant. However, note that having the app GDPR compliant isn't the same as having the business compliant; primarily, GDPR is a set of rules and processes that apply to organizations.
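As a generic illustration of that erase-and-inform step (this is not the SDK's actual API; the in-memory store and print-based notification are stand-ins for a real database and mailer):

    from typing import Dict

    class InMemoryUserStore:
        def __init__(self) -> None:
            self.personal_data: Dict[str, dict] = {}

        def delete_personal_data(self, user_id: str) -> dict:
            # Remove everything held about the user; return the last-known
            # contact details so the erasure can be confirmed to them.
            return self.personal_data.pop(user_id, {})

    def handle_erasure_request(store: InMemoryUserStore, user_id: str) -> None:
        removed = store.delete_personal_data(user_id)
        email = removed.get("email")
        if email:
            # In a real system this would go through an email/notification service.
            print(f"Erasure confirmation sent to {email}")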


Every Data Protection Office in the EU member states has published guidance for businesses and individuals. Here's the British ICO: https://ico.org.uk/for-organisations/guide-to-the-general-da...


Okay, let me try to translate that legalese for you (I'm not a lawyer BTW, but I regularly deal with GDPR and data protection issues):

"In a manner" means that you're absolutely free to use whichever means (i.e. technologies, systems, ...) you want to do your data processing with, as long as you make sure you keep the data secure.

"Appropriate security" is indeed a very vague term, but it is vague on purpose: As you probably know firsthand, technologies change rapidly these days, and what's considered "state of the art" today might be a "legacy system" in five years. Therefore, laws often do leave the interpretation of terms like the "appropriateness" above open to interpretation by the executive branch. In case of the GDPR, this means that at the highest level it will be the European Court that will decide if a given measure/technology was appropriate or not. In practice we can't (and do not want to) fight out each definition in court of course, so in addition to that last instance the member countries try to release guidelines that should help companies to judge what measures are appropriate. Unfortunately, there's not always consensus between individual countries here so you will have to find a compromise or look at the guidelines of the country you're based in (as that's where complaints about your company will be handled in the first instance). For Germany, the BSI (Bundesamt für die Sicherheit in der Informationstechnik) would be the relevant instance to look for guidance when it comes to IT security best practices, and the standard that they define will (usually) be followed by the federral data protection agencies.

As a final remark, what helped me a lot in understanding the intent behind the law is to read the "motivations" section, which is where the lawmakers write down the actual intent they had when creating a given law. These are used by courts to interpret laws in case of ambiguity and can (in my opinion) greatly help to gain a better understanding of some of the more cryptic articles. Here's the link:

https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A...

If you have any specific questions about appropriate measures or the GDPR please feel free to reach out to me (contact info in my profile), I'm always eager to learn about your problems and will be glad to give you free advice wherever I can.


> I feel this regulation puts too much of a burden on small businesses.

There's a very simple way around that problem - don't ask for your users' data.

The GDPR is about making sure you do your best to protect what they share with you. If they don't need to share anything then there is no burden on you to protect anything. In my opinion this is the ideal outcome. If you gather their data then there really should be a burden on you and your business to do the necessary work to make sure you've done at least the minimum to protect what they've shared, especially if you're profiting from that data.


Most websites ask for users' email addresses, used as the account name.

From what I have read on this topic, an email address is considered personal information.

> the necessary work to make sure you've done at least the minimum to protect what they've shared, especially if you're profiting from that data.

The OP was willing to comply; she asked what "necessary work" means and how to define the "minimum".

Also it seems to me that GDPR applies to non-profit sites.


> Most websites ask for users' email addresses, used as the account name.

> an email address is considered personal information.

As another poster mentioned - just don't use email.

Or, if you must, then just make sure you do only the minimum you have to with it - e.g. don't send it to a third party, and have a way to delete it when a user wants to close their account (unless you have a good reason to keep it, e.g. to match it to a financial transaction).

What's the difficulty?

GDPR shouldn't be a burden for a small business unless the business is in the personal data space.


So what? Don't ask for an email address. Use an OAuth provider instead. Or let people use the server without signing up. Or assign users a random number as their login.

There are plenty of options available if you don't want the "burden" of securing your users' private data, but ignoring it isn't one of them anymore. This is a good thing.
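A tiny sketch of the "random number as a login" idea, for what it's worth (just an illustration of data minimisation, not a complete auth scheme):

    import secrets

    def new_account_handle() -> str:
        # 128 bits of randomness, hex-encoded: unguessable and, unlike an
        # email address, carries no personal information about the user.
        return secrets.token_hex(16)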


There is a good guide for developers here

https://techblog.bozho.net/gdpr-practical-guide-developers/


The GDPR isn't a set of technical specs. It purposefully sets out broad guidelines and leaves the implementation to each data-handling organization. Obviously the requirements and challenges of a hospital are much different from those of an e-commerce site. Therefore, it is the organization, or more precisely its DPO, that has to define what is "appropriate" for their business.

Then, according to your interests/knowledge/SOW, you can act as a security consultant who gives proactive advice, or as a contractor that develops a solution from a set of specs.


GDPR is basically written as something to be hashed out in court. The question is not "Are you compliant?". The question is "Can you use what you've done to tell a convincing story that you're compliant enough that you shouldn't be punished after a breach? When someone at the regulator's office might be looking to make their career over the corpse of your company?".


It is amazing, the confusion that is arising around the GDPR. I have to participate in negotiating a data processing agreement with a vendor because of the GDPR. The material data they have to handle for us (the reason we buy their services) has very little personal data in it, so I thought this would not be a big deal. But it turns out that we can (not that we do, but that we can) transfer more or less anything to them in their support/incident channel, which is implemented over email. So we spend hour after hour discussing how to regulate the case where emails might end up on various devices in their organisation. Completely unstructured data where mentions of names or addresses might occur.


The magic phrase you want for Mr. Google is “legislative drafting”. In fact there is a long-existing methodology in government for writing laws and regulations. Here, for example, is the US House of Representatives’ guideline document: http://legcounsel.house.gov/HOLC/Drafting_Legislation/Drafti...

Chesterton’s Fence.

Having said that, I am a tax lawyer and I view the output of the Federal government as astonishingly bad. Poorly-written laws cause millions or billions of dollars of friction to taxpayers.

There are many reasons for why laws are written so poorly:

- the people are stupid or malicious or lazy (appealing but this conclusion reveals more about its holder than its object)

- the task of writing the law is impossible

- the people writing the law are technically deficient in this style of writing

I’m sure you can think of others.

There is another factor. After you have been a lawyer for a long time, you see the folly of believing it is possible to write binary yes/no rules to govern human behavior. Politicians believe this. An economist would call this a belief in a static economic model.

Thus, politicians think “we will impose this tax and collect a lot of money”. Then they are shocked because people change their behavior and avoid paying the tax. I do not mean maliciously. I mean like a toll-road. Imposing a toll for driving on a road will affect the traffic patterns and fewer people will drive on the road. Less money will be collected. Economists think that way. Politicians are not as . . . astute. (That statement reveals more about me than it reveals about politicians as a group). :-)

Back to why laws are so badly written. There is a second reason and it is mentioned in another comment. Lawyers and judges are comfortable with ambiguity. In tax law, we often ask for penalties to be waived for “reasonable cause”. WTF is that? Yet it (more or less) works. At the margin, some people get away with stuff that “should” be penalized. And some people are penalized “unjustly”.

This discovery in law is the result of centuries of evolution. It works. So if you see ambiguity, understand that it is a feature, not a bug.

But for tax law, quite often I think of the rules as a broken set of algebra rules written by teenage sociology majors while drunk, for bribes. Over decades. Again, this reveals more about me than the United States Code and the Treasury Department.


Also, is Hacker News compliant with the GDPR? I don't see a "delete account" button. As I understand it, the GDPR requires that users' data can be deleted at any time.


IANAL, but it's probably not compliant. http://www.ycombinator.com/legal/ even explicitly says that

Please note that we have no obligation to delete any of stories, favorites or comments listed in your profile or otherwise remove their association with your profile or username.

I presume this is based on the theory that European law can't be enforced against HN since it operates from California. On the other hand, it might be possible to convince a judge to take action against YC companies instead? Not sure.


No, the GDPR says entities outside the EU are affected too, as long as they process EU residents' data. I read somewhere the penalty is 20 million dollars!


What matters is whether or not you target users in the EU. If people in the EU use your services in spite of you taking no action to target them, then GDPR doesn't apply.

HN probably would not be considered to be targeting EU users because it is an English-only forum based in the US that does no marketing towards EU users. If they added a German-language forum, then they would probably need to start following GDPR because that would be interpreted as targeting users in the EU.

This is based on CJEU's interpretation of previous regulations[1]. Factors that they listed were:

> Use of the language of a Member State (if the language is different than the language of the home state);

> Use of the currency of a Member State (if the currency is different than the currency of the home state);

> Use of a top-level domain name of a Member State;

> Mentions of customers based in a Member State; or

> Targeted advertising to consumers in a Member State.

[1] https://www.wileyrein.com/newsroom-newsletters-item-May_2017...


What GDPR says and what can be enforced are two different things. 20 million dollars are irrelevant if there's no way to extract them. (Hence my musing about YC companies.)


What a wonderful feature for the professional astroturfing companies. Are public comments really "personal data"? This is part of the war on general-purpose computing; it's an attack on memory.


No, email addresses are. I think the GDPR considers email addresses and perhaps cookies personal information, as a person can be identified by them.


And IPs. Comments are easily used to writeprint (identify) commenters even if they don't contain "personal information".


> I think its articles are difficult to understand

IANAL; I am a security/privacy consultant. I have a strictly technical/security background but didn't find the GDPR that hard to understand at all. Actually, I was pleasantly surprised that the text itself was quite easy to read, even though really understanding the consequences requires a bit of background research. A year ago, I got CIPP-E certified in a month just by self-study.

> I find it very difficult to translate the regulation text into code, into an actual implementation

Well yeah, I'm with you on that one. I think it is not because the text is too vague, but rather because it was written to allow companies to implement it in a way that fits their size and the sensitivity of the data. Art. 32 is the most important one for security/technical protection. It allows you to implement security controls the way you see fit, as long as you can demonstrate that you made an appropriate decision based on the risk of the data. I think that is its strength, not its weakness: a small company doesn't need formal authorization processes if it can show that the user administrator knows all personnel personally and issued the correct authorization profiles for their roles. Telephone numbers from contacts don't need to be protected the same way you need to protect medical data.

The advantage of the wording is that the controls just need to be "good enough". The disadvantage is that there is no checklist of security controls to follow, and hence you never know for sure whether good is really good enough until you've had a visit from the data protection authorities. That remains a problem, but if you can explain your reasoning, they might disagree, yet if the reasoning is solid enough they won't fine you for it, because you can demonstrate you acted in good faith. Therefore it is important to document the reasoning behind your decisions.

> I feel this regulation puts too much of a burden on small businesses.

For a small company, setting up the "records of processing activities", doing a basic risk assessment and setting up processing agreements can be done in a couple of days. I don't think that's too much of a burden. For many of my clients, it helped them identify weak spots in their security, which is a win-win for both the company and its customers.


When reading a regulation, your first priority always needs to be answering the question, "Who does this apply to?"

If the rule applies to people who do x, and you encounter a requirement that is clearly burdensome, then you should reevaluate whether or not you really want to do x.

i.e., the new liquor regulation is 10^100 pages long, but page 1 says we don't have to read the rest if we just serve wine and beer.


I have 2 concrete questions to ask on this topic:

1. Is an email address considered personal information?

2. Is a cookie, in the form of a random hash code, considered personal information?


Yes and yes, as both are "personal identifiers" -- "anything that you could conceivably use to identify a person within a larger group".


Same reason large tax preparation companies lobby against simplification of rules. They need you to need them.


The GDPR explicitly does not set out to specify implementation details because of the reasons here: https://gdpr-info.eu/recitals/no-15/ (Technology neutrality)


Because you don't have the training and knowledge level that they were written for. Assuming you're a developer, imagine taking a programming language spec and giving it to a layperson. Would you expect them to perfectly understand it? I wouldn't think so.


I won't beat about the bush. If you don't understand the laws, or don't know how to keep your stack secure, YOU SHOULD NOT COLLECT PERSONAL DATA!

Ignorance isn't an excuse.

It is a burden. It's supposed to be a burden. It's supposed to make it harder for companies to amass toxic dumps of sensitive data that go on to get hacked. And to stop the Facebooks of the world weaponising every anecdote they can link to you.

It's a good set of measures. It's just going to be a cock to comply with sometimes. I get that. I'm going to have to comply with it. But I'm happy that companies I deal with will also have to deal with it, for my sake, as a consumer.


Lawyers find code hard to read as well.

I guess the difference is that the law doesn't have to implement code. Code has to implement the law.


There are a few summaries online, search for "GDPR for developers"


Can I trust those "summaries" (open interpretations)? What if I get fined for trusting them? Who is to blame?



[flagged]


Yes. How dare they request that you actually take care of your users' data, and ask yourself if you really need it, instead of just gobbling it all up as fast as you can.


[flagged]


> Lawyers wished to create a 'walled garden' for their knowledge and methods. Doctors do the same and the churches also did it. These people charge fees for access to this knowledge and do not want easy access to the general public.

And programmers do the same thing, using a secret language called "code" to make their work incomprehensible to mere mortals. /s

More seriously, I've heard it stated that there are actually a lot of similarities between legalese and software code, as they're both constrained languages meant to make certain kinds of statements more precise.


This is utter nonsense and you have no basis for stating any of it.


I see, I gored your sacred cow. It is well known that all the trades and professions had their private languages so they could discuss things without their customers knowing what they discussed. Latin for the church and medical trades. Lawyers still have their arcane tongue. Most of the others - butchers, tinkers, bakers, etc. - had their trade language. I am from England and had a butcher and a baker in my family, and they told me bits of their language. True, it was not as comprehensive as English. I also grew up in a Yiddish-speaking family, as my father was a major on the Burma Road and we housed a family of displaced persons from Poland. Mother, father, bubba, and two young boys - all Jewish, plus me and my mom - I soaked up Yiddish and by the time the war was over I could speak it as well as most kids of 9. After WW2, when we emigrated to Canada, I used to startle the Jewish shopkeepers, because this little Brit kid could understand whatever they said. They were in the habit of carrying on conversations, secure in their privacy - hah.

It stood me in good stead: I made many Jewish friends in Toronto in high school and college. In those days there was a major barrier to many Jewish people in business circles - no brokers, etc. This all changed in the mid-60s after a few prosecutions, and now it is wide open to all.

https://en.wikipedia.org/wiki/Craft


The "walled garden" is knowledge. To understand legalese, you have to know the definitions of the words. Specifically, you have to know the definitions as used by the courts. Many words have very precise legal definitions, which may or may not correspond with the non-legal definition.

Why is it that way? When you're writing a contract or a law, having words that let you state precisely what you mean saves a lot of trouble and confusion. We can go to court about people actually breaking the law or the contract, rather than about what the definition of "is" is.

And, as that snarky example illustrated, sometimes it doesn't work. Sometimes you wind up arguing exactly about the definition of words. But the precision of words in the legal community means that that happens a lot less often than it otherwise would.


Yes, there is a dual use, keeping information a trade secret, as well as good communication across legal or medical fraternities. The trade and craft secrecy was mainly to mulct clients and restrict trade competition. https://www.google.ca/search?q=mediaeval+trade+monopolies&oq...


Most laws of this kind are made so that they can exclude small players from the market.

If something goes wrong, Facebook can easily hire an army of lawyers and "prove" that they "processed [data] in a manner that ensures appropriate security".

But a small player can't do that. This has the following advantages for big corps and governments:

1. Big companies can use the laws to destroy small companies (they just need to push the enforcers of regulations in the right direction, maybe with a little gift or something, wink wink).

2. People don't even try to create a small company, because it's too risky, so big companies don't have to face any competition at all.

3. Governments and other institutions can use the laws to stop people who spread information or products that go against their interests. In fact, the main reason for the GDPR is stopping the spread of true information about the current political situation in Europe (what they technically call "fake news"). It's much easier to control the web if you only have Facebook, YouTube and other channels you can easily manipulate. Good luck instead controlling hundreds of thousands of small blogs, mailing lists, chat rooms, etc.



