
Should Developers Be Sued For Security Holes?

samzenpus posted more than 2 years ago | from the who's-to-blame dept.

Crime 550

An anonymous reader writes "A Cambridge academic is arguing for regulations that allow software users to sue developers when sloppy coding leaves holes for malware infection. European officials have considered introducing such a law but no binding regulations have been passed. Not everyone agrees that it's a good idea — Microsoft has previously argued against such a move by analogy, claiming a burglary victim wouldn't expect to be able to sue the manufacturer of the door or a window in their home."


Nah (5, Insightful)

Anrego (830717) | more than 2 years ago | (#41102783)

I think excessively poor software should result in some form of negligence ... but general “can happen to anyone” type bugs.. no.

You can buy software with a (real) warranty attached. In general this costs a fuck tonne of money because they are accepting a fair amount of liability. Even in a very horizontal market, the price increase for accepting that liability is going to be way more than anyone can afford.

You get what you pay for. Want software that is very secure and unlikely to have serious bugs? You can get it, but it's gonna cost more than you are willing to pay if you don't really _need_ that level of support.

Re:Nah (4, Insightful)

Mitreya (579078) | more than 2 years ago | (#41102925)

excessively poor software should result in some form of negligence ... but general "can happen to anyone" type bugs.. no.

And how do you define the difference?
Based on the quality of code?
Based on the amount of unit testing that was (provably) performed?

This will start a slew of software that is only warranted under specific OS/software configurations (and then installing an aggressive anti-virus or not error-checking your RAM chips regularly would void your warranty).

Re:Nah (4, Interesting)

Shikaku (1129753) | more than 2 years ago | (#41103029)

Simply requiring encryption when handling something sensitive like credit card info is a start. See: Sony and the PSN disaster.

Re:Nah (4, Insightful)

tomhath (637240) | more than 2 years ago | (#41103261)

So how do you define "sensitive"? There's no end to it; once you open the door, a good lawyer can convince a jury of anything.

Re:Nah (1)

Anrego (830717) | more than 2 years ago | (#41103033)

That of course is a huge issue.

Realistically you would need a standard (or set of standards) defining what "secure software" is... and good luck with that!

I would venture that in the case of a huge vulnerability, the company would be required to show "what they did" to secure the software (what kind of testing they did, review, etc..) and a jury would decide if they were negligent (excessively negligent would be the lead dev cracking on the stand about how the boss kept shouting "ship it or you get the cane again!".)

Re:Nah (1)

DJRumpy (1345787) | more than 2 years ago | (#41103201)

You don't define the difference. We already have mechanisms under law to handle cases like this. A good example would be cases where someone's computer was damaged by a virus scanner that quarantined a key system file and caused a failed boot condition. Those consumers could sue the company for losses associated with getting their computers fixed. It could also escalate into a class action lawsuit. I would think the same recourse is available to an enterprise.

Re:Nah (4, Insightful)

mcvos (645701) | more than 2 years ago | (#41102981)

Some things, like allowing SQL injection, might be considered negligence. But no programmer can possibly guarantee a complete absence of bugs, and any bug can be a security hole. It takes time and money to track them down. If you don't give them that time and money, you can't expect perfect security.

Re:Nah (3, Insightful)

Anrego (830717) | more than 2 years ago | (#41103065)

Perfect, no... but I suspect there are companies that, if required to justify what they did to protect the end users (was security even a consideration at any point?), would pretty much give a blank stare.

It's not about having perfect security imo, but rather about at least making an effort proportional to the risk you are putting users in.

Re:Nah (1)

mcvos (645701) | more than 2 years ago | (#41103269)

That last line is the big one: think about what's at stake. If there's any sensitive data involved (credit cards, medical, whatever), security is a very real concern. Of course that also needs to be included in the price. You get what you pay for.

Re:Nah (2, Insightful)

Anonymous Coward | more than 2 years ago | (#41103103)

Allowing SQL injection attacks is negligence. If you allow an SQL injection attack, you need to find another line of work.

SQL injection is the easiest attack vector to ward off: all you have to do is use prepared statements.

Done.
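To make the point concrete, here is a minimal sketch using Python's stdlib sqlite3 module (the table, rows, and attacker input are hypothetical, purely for illustration):

```python
import sqlite3

# In-memory database with a toy users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

# Attacker-controlled input that rewrites a naively concatenated query.
user_input = "' OR '1'='1"

# Vulnerable: string concatenation lets the input become part of the SQL.
vulnerable = "SELECT * FROM users WHERE name = '" + user_input + "'"
print(len(conn.execute(vulnerable).fetchall()))  # 1 -- the row leaks despite a bogus name

# Safe: a prepared statement treats the input as a literal value, never as SQL.
safe = "SELECT * FROM users WHERE name = ?"
print(len(conn.execute(safe, (user_input,)).fetchall()))  # 0
```

The `?` placeholder is all the "hard work" there is; the driver does the escaping for you.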

Re:Nah (1)

mcvos (645701) | more than 2 years ago | (#41103243)

Absolutely. It amazes me how many sites, important ones, even, are vulnerable to it. It's trivial to prevent, and doing so makes your code prettier and faster. There's no excuse.

Re:Nah (1)

craigminah (1885846) | more than 2 years ago | (#41103151)

How 'bout we sue people who post stupid articles on /.?

Re:Nah (1)

Elminster Aumar (2668365) | more than 2 years ago | (#41103247)

What makes you think it's stupid? I mean, after all, your response reeks of empirical rationale and conclusive examples...

Re:Nah (1)

alen (225700) | more than 2 years ago | (#41103231)

no, the developer will just have a very constrained usage scenario for the no-security-hole policy. The NSA has proved that you can make Windows and Linux secure; it's just going to be a pain in the a$$ to use and annoying for doing the simplest things.

Re:Nah (2)

jellomizer (103300) | more than 2 years ago | (#41103293)

Blame the organization, not the developer.
Week one: we need a working prototype.
Week two: we need to put the prototype into production.

We want it to do all these features... then 6 months down the line we need to go to production with half the features.

Yes, this product will only be used internally.

As developers we are often not given the full picture, and the organization changes its mind. Often good developers in bad organizations write bad code.

Short answer: No (1)

Anonymous Coward | more than 2 years ago | (#41102805)

Long answer: Noooooooooooooooooooooooooooo

Re:Short answer: No (1)

Kergan (780543) | more than 2 years ago | (#41102879)

Why not? As an indie dev, it kind of freaks me out, but if it drives as little as half of the crap coders out of the market, it might be a very good idea...

Re:Short answer: No (5, Insightful)

Anrego (830717) | more than 2 years ago | (#41102935)

It'll have very little impact on actual code quality.

All that will happen is:
- software prices will increase
- a whole insurance industry will spring up around it (think malpractice insurance)..
- people will specifically seek out stuff developed by small shops and try to break it specifically so they can sue..
- producing software will become so expensive and require so much up-front investment that indie devs will be SOL
- the big guys will keep producing shit, and just protect themselves behind lawyers (and feed the cost back to the customer)

Re:Short answer: No (1)

lightknight (213164) | more than 2 years ago | (#41103115)

Exactly. Imagine the insurance premiums -> they will be comparable to, or larger than, what engineers / doctors pay.

Software will become so expensive, in reaction, that the field will collapse.

Re:Short answer: No (0)

Anonymous Coward | more than 2 years ago | (#41103173)

It will start to resemble the medical business with developers becoming hyper conservative and progress slowing to a crawl.

Re:Short answer: No (1)

trev.norris (2010080) | more than 2 years ago | (#41103267)

Can we all email this a million times to the "Cambridge academic" who believes suing developers is a good idea? Seriously, how long did it take to figure out this was such a bad idea?

Re:Short answer: No (1)

Nethemas the Great (909900) | more than 2 years ago | (#41103007)

The "why not" is because all it would result in is more outsourcing to countries outside of jurisdiction.

Either way, bad software is more a consequence of bad managers than bad developers. Managers allocate resources to projects, including development talent, and managers demand deadlines contrary to software quality. If people should be held accountable, it should be the ones running the show, not the ones taking orders.

Re:Short answer: No (1, Funny)

lightknight (213164) | more than 2 years ago | (#41103133)

Yes, and I am sure the managers will be the ones to pay the price.

Re:Short answer: No (3, Insightful)

Elminster Aumar (2668365) | more than 2 years ago | (#41103163)

Just because managers hire employees doesn't mean they're in 100% control of whom they hire. Managers have supervisors just like everyone else. They have deadlines, too. Besides that, I'm not so sure it's wise to assume that just because you're the one hiring, you somehow control whether you hire talented developers. Last but not least, just because you have a talented pack doesn't mean they work together well or acclimate to the technologies clients need serviced. Same with new hires: out of 200 applicants, hiring the one who screened best doesn't imply automatic success, because there are too many moving parts in play. Long story short, it's not all on the manager's shoulders, just like it's not all the fault of the developers, clients, etc. All this article is about is some whiny asshole trying to point his fat, stubby finger at someone because he made a bad decision. They're angry, they're pissed, and now someone has to pay their bill. That's all this boils down to.

Re:Short answer: No (1)

eddy (18759) | more than 2 years ago | (#41103077)

I'd invoke some form of principle of proportionality. A law where all programmers were liable for their code (no exceptions, the same for something given away for free as for something used in airplanes) would mean the eradication of industries, a setback of science and progress almost unfathomable. Who would have written the first web browser under such tyranny? No one. Who'd write a $4 game when the liability at the other end of the balance weighs in at multiple millions? Or conversely, who could write a $4 game using methods that guarantee no software errors? No one. Things we take for granted today couldn't exist. Freedoms we have to distribute our code on the web, no strings attached, would have to go away. And to solve what problem?

At the most basic level, do we really need to give lawyers more tools to fuck us all over with? Really? Because lawyers are the only ones who would profit from this kind of legislation. Everyone else will be losers.

Re:Short answer: No (1)

Anonymous Coward | more than 2 years ago | (#41103295)

Did you develop the programming language you use? Did you build the IDE and the compiler? Did you design the framework, if applicable? How about the target hardware, did you do that yourself?

If you can't answer "yes" to every single one of the above questions, then you should already know why laying the blame at your feet for bugs found potentially upstream from you is a really bad idea.

Would stop a lot of development (5, Insightful)

Burdell (228580) | more than 2 years ago | (#41102809)

If it was possible to prevent all security holes, this wouldn't be a bad idea. However, it is provably impossible to do so. This would just create a new insurance industry, profiting from others' mistakes. It would really only serve to cut down on development, especially from small companies and individuals that couldn't afford to make a single security mistake (or afford insurance against lawsuits).

Re:Would stop a lot of development (0)

Anonymous Coward | more than 2 years ago | (#41102865)

If it was possible to prevent all security holes, this wouldn't be a bad idea.

Indeed - there are too many variables involved (OS, memory corruption, etc.).
To go for a car analogy, wouldn't it be something like a car warranty guaranteeing that the car will run under any gravity or in the presence of acid rain?

Re:Would stop a lot of development (0)

Anonymous Coward | more than 2 years ago | (#41103317)

Or even suing the manufacturer for any repair/work outside of the recommended maintenance.

Re:Would stop a lot of development (1, Interesting)

proprioceptionZ (784155) | more than 2 years ago | (#41102945)

Yes. You can't "prove" that anything but a trivial program works correctly. I think that was the conclusion of the famous paper by Turing in the 1930s.

Re:Would stop a lot of development (1)

ZombieEngineer (738752) | more than 2 years ago | (#41102959)

Not really - you would need professional indemnity insurance.

The insurance is based on the risk of a claim (the more copies sold, the bigger the premium; it could be priced at a fixed rate per copy) and the impact of damage (just make sure that the license terms exclude indirect consequential damages).

The risk side of the equation can be reduced by using appropriate development structures (code reviews, etc.).

This could improve the quality of the industry long term, but there will be some pain getting there...

Re:Would stop a lot of development (0)

Anonymous Coward | more than 2 years ago | (#41102983)

My first thought was, "Is this being pushed by the insurance sector?"

Re:Would stop a lot of development (4, Informative)

arth1 (260657) | more than 2 years ago | (#41103131)

If it was possible to prevent all security holes, this wouldn't be a bad idea. However, it is provably impossible to do so.

This is true. However, I still think it should be possible to sue for gross negligence. Like lack of input validation, or storing passwords in plain text, or installing everything world writable.

That's like a bike lock manufacturer whose locks open if hit with a shoe, or a car manufacturer whose cars start if you roll them downhill and put them in gear, even without an ignition key. Both existed, but would be considered gross negligence today.

I don't expect software to be perfect, but I do expect it to not be outright stupid.
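For the plain-text password example above, the non-stupid baseline is genuinely cheap. A minimal sketch using only Python's standard library (the function names and parameters are my own, for illustration, not from any particular product):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    # A random per-user salt defeats precomputed rainbow tables.
    salt = os.urandom(16)
    # A deliberately slow key-derivation function makes brute force expensive.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("guess", salt, digest))    # False
```

Only the salt and digest ever touch disk; the password itself is never stored, so a leaked database doesn't hand out credentials.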

Re:Would stop a lot of development (1)

Palinchron (924876) | more than 2 years ago | (#41103185)

However, it is provably impossible to do so.

Oh? Please prove it, then.

Sure (5, Insightful)

BigSlowTarget (325940) | more than 2 years ago | (#41102819)

What we need is more and richer lawyers and frightened software developers with malpractice costs bigger than doctors. Perhaps we can eventually make sure all code is only developed by giant corporations made up primarily of legal defense teams dedicated to patent exploitation and liability control with tiny development arms tagged on the end.

Windows (4, Funny)

MrEricSir (398214) | more than 2 years ago | (#41102843)

Microsoft has previously argued against such a move by analogy, claiming a burglary victim wouldn't expect to be able to sue the manufacturer of the door or a window in their home.

Interesting choice of words there!

Re:Windows (0)

Anonymous Coward | more than 2 years ago | (#41103025)

But if there were a master key glued into the lock, it would be another story; of course, I am referring to developers leaving private keys in software/hardware.

Re:Windows (1)

Elminster Aumar (2668365) | more than 2 years ago | (#41103307)

...in which case, you take the bastard to court who made the freaking key--not the entire industry!

Another Analogy (0)

Anonymous Coward | more than 2 years ago | (#41102853)

Yes... But could a file cabinet manufacturer be sued if the drawers could be locked but the side of the cabinet fell off with one solid blow?

For "sloppy coding"? Definitely! (1, Insightful)

gweihir (88907) | more than 2 years ago | (#41102859)

Exception: FOSS. All commercial software vendors should be liable for any and all damage caused by sloppy coding, including system cleanup, downtimes, etc. In most European countries this would just require classifying sloppy coding as "gross negligence". I am all for it.

Re:For "sloppy coding"? Definitely! (1)

DemonGenius (2247652) | more than 2 years ago | (#41103009)

Except you fail to consider the fact that many managers set unrealistic deadlines for developers, leaving them no choice but to take the quick and easy way out. I'd say that most developers want to create the best code they can, but constraints on timelines, requirements, and feature creep often work against this ideal. Do you really want to be sued because of an incompetent manager?

Re:For "sloppy coding"? Definitely! (3, Insightful)

swillden (191260) | more than 2 years ago | (#41103073)

I think the company would be liable, not the individual.

Re:For "sloppy coding"? Definitely! (1)

turbidostato (878842) | more than 2 years ago | (#41103249)

"I'd say that most developers want to create the best code they can"

No, they don't.

Because if they did, they are in a perfect position to enforce their rules: you just don't write down a single semicolon and the software doesn't run at all. It's up to the developer when exactly to write down said semicolon.

Re:For "sloppy coding"? Definitely! (4, Insightful)

Ronin Developer (67677) | more than 2 years ago | (#41103019)

Why should FOSS get a pass? What user really has the time to validate the code, line by line, to search for security weaknesses BEFORE using it? None. Users expect the software, free or commercial, to work as advertised. And, given the "superiority" that FOSS purports to have over commercial software, maybe it should be held to an even higher standard? Didn't think you'd want to go there.

In many ways, FOSS would find itself encountering lawsuits despite the "good samaritan" approach it provides. A loss, whether from something you paid for or got for free, is still a loss and, in our litigious society, fair game.

No, leave it to an academic to propose making individual developers liable for each line of code they write. This would destroy the entire IT industry (and most institutions) in a sweeping blow. Who could afford the "malpractice" insurance, given the widespread dissemination of most commercial and FOSS software?

 

Re:For "sloppy coding"? Definitely! (2, Informative)

DeathFromSomewhere (940915) | more than 2 years ago | (#41103023)

You realize the most visible open source software projects are built by commercial software vendors? Also, how would you define "sloppy coding" in a law?

Re:For "sloppy coding"? Definitely! (0)

Anonymous Coward | more than 2 years ago | (#41103037)

Dream on, freetard. The malpractice lawyers and insurance industry smell vast sums of money so FOSS will not be exempt.

Re:For "sloppy coding"? Definitely! (1)

bws111 (1216812) | more than 2 years ago | (#41103101)

That makes no sense at all.

First, that puts FOSS at a huge disadvantage. If a customer uses 'commercial software' (whatever that is), they can sue if something goes wrong. If they use FOSS - too bad?

Second, define 'commercial software', and more importantly 'commercial developers'. What about the large amount of FOSS that is developed by 'commercial' developers (IBM, Red Hat, etc)? What about FOSS that is 'sold' (RHEL, SuSE)? Do those companies get a free pass on selling crap, or are the developers of the stuff they are selling going to be held responsible, no matter who they are?

And while we're at it, why don't we apply that silly theory to everyday life? If you're a cab driver and you kill a passenger, you can get sued. But if you just give someone a ride and kill them, too bad? It makes no sense. The law applies equally to everyone, not just some class of people you happen to not like and want to punish.

Re:For "sloppy coding"? Definitely! (1)

Githaron (2462596) | more than 2 years ago | (#41103291)

Keep in mind that no one is going to give you a free ride if they think there is any chance of you suing them. FOSS would completely disappear if there was a fairly decent chance that the developers were going to get sued.

Bad Analogy (4, Informative)

ZombieEngineer (738752) | more than 2 years ago | (#41102861)

You cannot sue a door or window manufacturer for the failure of your own action (leaving the door / window open).

You should be able to successfully sue a door / window manufacturer for failing to provide the requested product (i.e. seal the opening).

That then hits the ugly question of what is "reasonable". Did the manufacturer provide a reasonable product that delivered the expected level of security?

Re:Bad Analogy (1)

kubernet3s (1954672) | more than 2 years ago | (#41103045)

The door to my apartment is in the habit of blowing open during strong winds, even when "locked", unless we carefully move the bolt in when the door is positioned correctly. My landlord will not fix it: he doesn't claim a reason, just never gets around to it. If the door blows open and someone steals our stuff, I should be able to sue the landlord by arguing that his negligence led to the conditions under which I was burgled: the door would have remained shut had he furnished the apartment with a working door.

Re:Bad Analogy (1)

bmo (77928) | more than 2 years ago | (#41103213)

Many rental laws give you the right to self-fix and deduct from the next rent payment. Look into them wherever you live.

--
BMO

Re:Bad Analogy (2)

Derekloffin (741455) | more than 2 years ago | (#41103075)

Well, it isn't the greatest analogy, sure, but it does have a point. It is like the faulty 'If you can build a bridge that doesn't fail, why not a program?', where the easy counter is 'Bridges generally don't have people trying 24/7 to destroy them through every imaginable means available, and when they do, the bridges generally don't last long.' If you can't hold a door maker responsible for the failure of a door and lock to stop a determined intruder who has to be physically present, how do you really expect a software maker to ensure their software is secure when people across the whole globe can be attacking it constantly? Security just isn't the kind of thing you can ensure. Now, if the software maker gives you some kind of guarantee, then I can see it, as they are making a claim they are now responsible for backing up, but for software in general I just can't see it working.

Re:Bad Analogy (1)

amicusNYCL (1538833) | more than 2 years ago | (#41103229)

You should be able to successfully sue a door / window manufacturer for failing to provide the requested product (i.e. seal the opening).

They provided the product, it's not their fault how the product was actually used or installed. It's not the manufacturer's fault that the person responsible for the actual installation was on his first day on the job after reading several tutorials online. It's the responsibility of whoever sets up a computer to secure that computer, you can't sue the manufacturer of your CPU because it executed malicious code that deleted your files.

Re:Bad Analogy (2)

lightknight (213164) | more than 2 years ago | (#41103233)

Yeah, no. Software programs juggle so many variables that it's virtually impossible to prove a program is bug-free. And add in the computer illiterate, who will find a way to generate giant lawsuits because "the computer didn't preserve the placement of the file icons I dragged around the folder after I copied it to a new drive; therefore it's a design flaw", and you know as well as I do that tech will be sacrificed to the greater human stupidity.

Engineering Discipline (4, Insightful)

DemonGenius (2247652) | more than 2 years ago | (#41102867)

If software development was an official engineering discipline that required P.Eng designation, then maybe this case would have more legs. Even then I'd be in disagreement. Otherwise, hell no, HELL NOOOOOOOOOOOO!!!!!!! That is definitely one way to drive people away from a career in software development. This actually seems like a sneaky way for management to evade culpability if their product harms a customer/user.

Re:Engineering Discipline (-1)

Anonymous Coward | more than 2 years ago | (#41103169)

If software development was an official engineering discipline that required P.Eng designation

... which is exactly why this liability should happen. We need to start driving the hacks out. Thinning the ranks by 90% would be a good thing, and there would be far fewer problems and vulnerabilities. This idea could serve as a catalyst to get the industry to finally clean up its act, something that's desperately needed.

We demand professional competence in other fields. It's time for this one to grow up.

Betteridge's law of headlines (4, Insightful)

fiannaFailMan (702447) | more than 2 years ago | (#41102877)

Sue the actual developer? How would you propose to do that if they're working for an incorporated company with limited liability?

Re:Betteridge's law of headlines (1)

Githaron (2462596) | more than 2 years ago | (#41103013)

That is what I was thinking. If a developer is being paid a standard wage rather than a major percentage of sales, why should said developer have to assume responsibility for the product rather than the company that is actually reaping the profits from the product?

Fair's fair (3, Insightful)

Anonymous Coward | more than 2 years ago | (#41102881)

Sure, let's agree with Prof. Clayton that you should be able to sue developers for malpractice if their code contains security holes.

Then perhaps you should also be able to sue professors, like Richard Clayton for malpractice, if their students are undereducated, or if their papers contain flaws.

it depends (1)

roc97007 (608802) | more than 2 years ago | (#41102895)

> Microsoft has previously argued against such a move

Well, of course (/snark)

> claiming a burglary victim wouldn't expect to be able to sue the manufacturer of the door or a window in their home.

Maybe one could expect that, if the advertisement for the door or window led one to expect a level of security that the door or window was not designed to supply. Or if a reasonable person would assume, for instance, that a door with a security-type cylindrical-key lock on it could not be opened with a common ink pen (true story).

I think it's about reasonable expectations. For instance, if there was an unknown back door in an otherwise reasonably secure OS, and the manufacturer lost the credentials, and didn't 'fess up, and as a result a bad guy nabbed my customer credit card database, yeah, as a person with reasonable expectations, I'd sue.

Hanlon's (3, Interesting)

gmuslera (3436) | more than 2 years ago | (#41102901)

You could consider suing developers who intentionally planted backdoors (even if it was done following NSA or other US government agency orders), but you can't target the ones who weren't aware of them, who introduced them by mistake or through lack of knowledge, or who were bitten because things changed (i.e. having/forcing an 8-char password was "good enough" several years ago, but not anymore), or even because assumptions were invalidated by end-user choices (how many portals meant for intranets without especially strong security end up being used on the internet?).

Also, whom do you sue over a bug in an open source program with a lot of contributors? Or a big corporation that put in legalese saying they aren't responsible for any damage or problem that could result from using the software (that is, most commercial software licenses)?

Yes, definitely (0)

Anonymous Coward | more than 2 years ago | (#41102929)

Especially since purposely coding security vulnerabilities into your product can be a lucrative enterprise:

The Vulnerabilities Market and the Future of Security [forbes.com]

This is why the new market for vulnerabilities is so dangerous; it results in vulnerabilities remaining secret and unpatched. That it’s even more lucrative than the public vulnerabilities market means that more hackers will choose this path. And unlike the previous reward of notoriety and consulting gigs, it gives software programmers within a company the incentive to deliberately create vulnerabilities in the products they’re working on — and then secretly sell them to some government agency.

In fact, criminal sanctions may be needed in order to protect the public.

Sure. it can be done. (2)

bmo (77928) | more than 2 years ago | (#41102933)

As long as you're going to foot the bill for a $500 application that changes your computer's wallpaper.

PEs and PLSs, doctors, psychologists, etc, all carry liability insurance. They're also not cheap. In the 80s, a survey crew cost $100/hr to come out and measure your land with a half-day minimum.

Now apply these costs to software.

--
BMO

Should Wall Street be sued for wrecking the world? (0)

Anonymous Coward | more than 2 years ago | (#41102939)

Hack journalists should just shut up about what should or shouldn't happen. What should happen and what really happens rarely even have a superficial resemblance, s*!theads.

Re:Should Wall Street be sued for wrecking the wor (0)

Anonymous Coward | more than 2 years ago | (#41102997)

I'd be satisfied if they were just hung up by their scrotums in Times Square.

certification (0)

Anonymous Coward | more than 2 years ago | (#41102973)

I suspect this is another attempt to guildify the field by requiring a certificate in order to be allowed to code.
I.e., code produced by a coder with a "certificate" from an "accredited" institution would be indemnified.

All code has holes - Read "secrets and lies" (0)

Anonymous Coward | more than 2 years ago | (#41102977)

All code has such problems.
A law would only create income for lawyers.
Lawyers are dreaming that they can create security via the law.
They fail to understand reality.
(Again)

Re:All code has holes - Read "secrets and lies" (0)

Anonymous Coward | more than 2 years ago | (#41103149)

All code has such problems.
A law would only create income for lawyers.
Lawyers are dreaming that they can create security via the law.
They fail to understand reality.
(Again)

No, lawyers understand reality well enough. Especially the part from your quote I put in bold.

only if you think you should (1)

nimbius (983462) | more than 2 years ago | (#41102987)

sue engineers for degraded roads or 9/11. As time passes, holes emerge and the code shows its age. Things that were once deemed safe and sound, like Blowfish, are no longer the norm, much as roads from the sixties can no longer handle multi-ton SUV traffic. Maybe it's a problem with the roads, or a problem that can be addressed by changing the environmental factors contributing to their degradation. Security, much like road construction, continues to improve, but to retroactively fault an engineer for not knowing what was unthinkable at the time is wrong-headed.
It's also why you can't sue an engineer for a building that collapses under the shock force of an earthquake that meets or exceeds its structurally rated limit. In Russia it was actually fair practice to jail engineers for failing to prevent catastrophic accidents. Hence the account of senior control room engineers who were incarcerated for failing to safeguard against the bureaucratic failures that precipitated the Chernobyl disaster.

Re:only if you think you should (0)

Anonymous Coward | more than 2 years ago | (#41103067)

People want better roads.
People don't want to pay for better roads.
People don't want to deal with construction.
People are unrealistic.

Re:only if you think you should (2)

Belial6 (794905) | more than 2 years ago | (#41103299)

Even worse is that software is held to a standard no other engineering field gets held to. A bridge need only function well enough not to completely collapse. Software gets raked over the coals if it has the smallest crack that gets exploited by teams of highly skilled criminals working full-time to break it.

Even if developers are liable, are they negligent? (1)

CokeJunky (51666) | more than 2 years ago | (#41102989)

From the article:

“The question is ‘Are they being negligent?’. The usual test is ‘Are they applying contemporary standards to the quality of their work?’,”

It seems to me that at the moment the contemporary standards are that almost all software has security holes. The "contemporary standards" are that this is acceptable -- very few customers with very specific needs can afford to insist otherwise, and even they have to build in redundancy and monitoring systems to handle the case where something doesn't work as advertised.

If the author's argument is based on contemporary standards, it's not a very good one.

When did Slashdot... (0)

Anonymous Coward | more than 2 years ago | (#41102995)

turn into a Jezebel blog?

Re:When did Slashdot... (0)

Anonymous Coward | more than 2 years ago | (#41103189)

ASSAAAAANGE!! (shakes fist)

"Microsoft has previously argued against" (0)

RocketRabbit (830691) | more than 2 years ago | (#41102999)

Well no shit Microsoft has argued against such a move. There's an entire industry that feeds off of the insecurity of the MS platform, and I am beginning to suspect that MS leaves holes in its software as part of a probable arrangement with the Federal Government, in exchange for being the primary user-facing IT platform there. What better way to spy on friends and enemies, to ruin centrifuges, and so forth, than to have the makers of the standard business / academic / government OS in your pocket?

I for one would love to see some moves toward making OS companies and application developers liable for their own fuck-ups. If an infant's car seat repeatedly chokes babies, the product is recalled and the manufacturer is usually sued. People incur real, measurable monetary, business, and emotional damages from poorly crafted software.

If such a regulation passed, we'd see a lot more attention given to removing bugs, and a lot less attention given to creating new but probably useless features. Yes, I'm looking at both the Gnome3 and Win8 dev teams here.

Re:"Microsoft has previously argued against" (1)

FrangoAssado (561740) | more than 2 years ago | (#41103237)

If such a regulation passed, we'd see a lot more attention given to removing bugs, and a lot less attention given to creating new but probably useless features. Yes, I'm looking at both the Gnome3 and Win8 dev teams here.

If such a regulation passed, you wouldn't be able to look at the Gnome3 dev team at all, because it wouldn't exist anymore. As would most free software developers.

Put yourself in the position of Linus Torvalds when he started developing Linux, for instance. Do you think he would even start distributing it if he could be sued for it?

Yet again... (1)

Elminster Aumar (2668365) | more than 2 years ago | (#41103003)

...blame the developer.

Re:Yet again... (1)

lightknight (213164) | more than 2 years ago | (#41103251)

'Tis quite alright. We'll just stop making software for the other sectors. They can go back to doing things on an abacus, while the rest of us have 30 minute work days.

like other engineering fields (2, Insightful)

Anonymous Coward | more than 2 years ago | (#41103015)

It's really time for computer science to grow up and join the rest of the pack. If a civil engineer designs a bridge that collapses under normal load, that engineer can be held PERSONALLY responsible for breach of duty. It's long past time for us to stop forgiving shoddy practices and people making the same old mistakes, over and over, that cause 90% of security vulnerabilities. Until people are held accountable, there won't be any meaningful change, and we'll keep having a field dominated by hacks and talent-free people without a real understanding of what they are doing.

We NEED this responsibility, and so does the public we serve. They're growing tired of the mess that exists right now. Apple is trying to do better on this front, but really it needs to go much further, and the whole field needs to improve. We've had many decades of ad-hoc cowboy coders. It's time to start demanding professional behavior.

Re:like other engineering fields (5, Insightful)

Todd Knarr (15451) | more than 2 years ago | (#41103141)

OTOH a professional engineer differs from a software developer in one key way: he can't legally be overridden on safety matters. If management orders him to use steel that doesn't meet spec for the bridge's designed load, he can refuse to sign off on the plans and if the company tries to fire him the company is the one who'll end up in legal hot water after he reports them. If you want to make software developers responsible in that same way, you need to give them the same authority and immunity to repercussions for using that authority.

SONY BMG Rootkit (0)

Anonymous Coward | more than 2 years ago | (#41103017)

what was the outcome of that?

Re:SONY BMG Rootkit (0)

Anonymous Coward | more than 2 years ago | (#41103143)

A ridiculously non-punitive class-action settlement that didn't require Sony to recall the CDs loaded with the rootkit. Still, I believe it set a precedent for suing a company that purposely shipped a product containing a security vulnerability.

If you're going to be adding accountability (2)

Teunis (678244) | more than 2 years ago | (#41103035)

If you're going to be adding accountability, be sure of the origin of the security weakness. If it originates in management, in outside requirements, or is in some other way part of the contract, the developer shouldn't be held responsible.

What happens if there is gross negligence? (3, Interesting)

ChumpusRex2003 (726306) | more than 2 years ago | (#41103039)

Bugs and security vulns are almost unavoidable, but some are due to gross negligence, and gross negligence should always be open to litigation. To follow on from Microsoft's analogy: suppose a door manufacturer (let's assume the door includes the lock and hinges, even though this isn't normally the case) sold a high-security door system but had accidentally keyed all the doors to a single grand-master key. If you were then burgled because a burglar happened to find out about this grand-master key, you potentially have a claim.

I don't see why it should be any different in software development. A software vendor needs to bear some responsibility for good programming practice.

Bad software is everywhere; some is so bad, that it does border on grossly negligent.

As an example, I recently reverse engineered an "electronic patient record" system that was installed at a local hospital. This had a number of interesting design features:
1. Password security was via encryption rather than hashing. The encryption was a home-brew modified Vigenere cipher.
2. The database connection string was stored in the clear in a conf file in the user's home directory. Interestingly, the database connection used the "sa" user.
3. Presumably for performance reasons, certain database tables (notably "users") would be cached in plaintext to the user's home directory, so that SQL joins could be avoided and done client-side instead.
4. The software ran an auto-updater that would automatically connect to a specified web site, then download and run patches as admin, without any kind of signature verification.
5. All SQL queries were dynamically generated strings: no parameters, prepared statements, or stored procedures. Not all user input was properly escaped; entering a patient name with an apostrophe in it would cause very peculiar behavior. In the end, regular memos had to go round to staff telling them under no circumstances to use apostrophes in patient names, and to avoid the use of apostrophes in plain-text entries wherever possible.

This is by no means all of the security problems this software had, never mind the ordinary bugs, e.g. a race condition when synchronising with a second application, which would result in the two components opening different patients' charts.

Amazingly, there weren't any security breaches or significant medical errors as a result of this software, but I can't really conclude that its production was anything other than grossly negligent.
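For anyone wondering what fixing points 1 and 5 actually looks like, here's a minimal sketch. Python with the stdlib sqlite3 and hashlib modules is used purely for illustration (the actual system described above was presumably written in something else entirely):

```python
import hashlib
import os
import sqlite3

# --- Point 5: dynamic SQL strings vs. parameterized queries ---
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (name TEXT)")

name = "O'Brien"  # the dreaded apostrophe from the staff memos

# Naive string building: the apostrophe terminates the SQL literal early,
# which is a syntax error here and an injection vector in general.
try:
    conn.execute("INSERT INTO patients VALUES ('%s')" % name)
except sqlite3.OperationalError as err:
    print("dynamic SQL broke:", err)

# Parameterized query: the driver treats the value as pure data.
conn.execute("INSERT INTO patients VALUES (?)", (name,))
print(conn.execute("SELECT name FROM patients").fetchone()[0])  # O'Brien

# --- Point 1: hash passwords, don't (reversibly) encrypt them ---
# A salted, deliberately slow KDF; the stored digest cannot be turned
# back into the password, unlike any Vigenere-style cipher.
salt = os.urandom(16)
digest = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 100_000)

# Login check: recompute with the stored salt and compare digests.
attempt = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 100_000)
print("password ok:", attempt == digest)  # password ok: True
```

With parameterized queries the apostrophe in "O'Brien" is just data, so the memos become unnecessary; and because a PBKDF2 digest can't be decrypted, a leaked password table doesn't hand out everyone's credentials the way a reversible home-brew cipher does.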

Re:What happens if there is gross negligence? (1)

DNS-and-BIND (461968) | more than 2 years ago | (#41103205)

In no event shall Microsoft be liable for any damages whatsoever, even in the event of fault (including negligence).
-- Windows XP Professional license agreement

For those who didn't RTFA (4, Informative)

dkleinsc (563838) | more than 2 years ago | (#41103057)

They aren't talking about suing the individual programmers; they're talking about suing the software companies. Specifically, they want to disallow the kind of language that is very common in EULAs (this is taken from an actual EULA, name omitted to protect the guilty):

_______ and/or its respective suppliers hereby disclaim all warranties and conditions with regard to this product, including all implied warranties and conditions of merchantability, fitness for a particular purpose, title and non-infringement. In no event shall _______ and/or its respective suppliers be liable for any special, indirect or consequential damages or any damages whatsoever resulting from loss of use, data or profits, whether in an action of contract, negligence or other tortious action, arising out of or in connection with the use of this software.

The translation of this clause out of legalese is "No matter what happens, you can't sue us, we're not responsible. We don't promise that this software is even remotely like what we advertised it to be."

Microsoft will change their mind (0)

Anonymous Coward | more than 2 years ago | (#41103071)

Microsoft will change their mind once they realize they can afford the insurance, and the open source developers can't.

First define 'security hole' (1)

Mister Liberty (769145) | more than 2 years ago | (#41103079)

Then answer the question.

No (0)

Anonymous Coward | more than 2 years ago | (#41103089)

Academics should be -- for not doing a good job of teaching.

No way (0)

Anonymous Coward | more than 2 years ago | (#41103097)

Can I sue my government for their corruption?

If they did it on purpose (1)

Cockatrice_hunter (1777856) | more than 2 years ago | (#41103105)

If they left a hole in the fence on purpose, or knew about a hole and didn't patch it, I'd say sure, go ahead and sue. But would you sue a fence builder if someone dug under your fence? Not all security holes are immediately apparent, and most software has holes of some kind; you can't just sue everybody.

No more lawsuits (1)

craigminah (1885846) | more than 2 years ago | (#41103111)

I'm so sick of the legal situation in the USA, where people sue for everything. If we allow software developers to be sued for security holes, we might as well ban software development. As in the medical field, a large majority of the cost would go to insurance and mitigation. Let developers develop; so long as their code isn't negligent, security holes are to be expected. What today is "secure code" will tomorrow be "vulnerable code" thanks to some clever haxor, so don't complicate development any further, don't burden those who buy software with extra costs, and don't burden our legal system with more stupid cases.

NO (1)

PostPhil (739179) | more than 2 years ago | (#41103129)

No, developers should NOT be sued. I'm quite frankly tired of hearing this drivel. COMPANIES or their UPPER MANAGEMENT should be sued (depending on the type of company), because THEY are the ones truly responsible and accountable. "They get paid the big bucks for a reason." Unless the person is a very crappy developer, most devs I know actually WANT quality control and the time required to write software properly. It's almost always management that tells them no, that "time to market" with something that vaguely resembles a product is most important, no matter how angry the result makes customers. Until the people with the actual power to change company decisions are held accountable for those decisions, nothing changes. So why are we wasting time persecuting the people who have little power and who actually agree with us?

Depends on the intention (1)

vchoy (134429) | more than 2 years ago | (#41103139)

If the security hole was introduced intentionally, with malice, or nothing is done about a known security bug, then getting sued may be on the cards.
It also depends on the impact of the security hole, e.g. a privacy or information breach.

If the security hole was introduced without malice, was fixed in time, or does not have major impact (i.e. affects only test data, or performance), then you're unlikely to have a case for litigation.

Only an academic would think this is a good idea. (1)

Barlo_Mung_42 (411228) | more than 2 years ago | (#41103197)

That is all.

Oh, there are ways around this. (1)

Bill, Shooter of Bul (629286) | more than 2 years ago | (#41103215)

Add a EULA that forbids anyone suing me.

Short of that: no longer license the software at all, but provide it for free, tivo'd. The binary blob inside is what I'll license for a cost. And guess what, it's just a trivial piece of software that cannot contain any bugs.

si (1)

shentino (1139071) | more than 2 years ago | (#41103223)

Gee, I wonder why a company like Microsoft that writes perfect code would ever lobby against this...

Doors vs Vault Doors (4, Interesting)

holophrastic (221104) | more than 2 years ago | (#41103227)

Just like anything else, you pay for whatever guarantee you desire. If you want your software created in record time, for a low cost, then bugs are part of the equation. If you want secure coding, then you'll pay for it in time and money. It's always been that simple. You don't sue the manufacturer of your house door, but you do sue the manufacturer of your bank vault door. The difference in cost is tremendous.

It's rare that my clients ask for proper security. But for the elements that they do indeed want to protect, they pay for me to do my very best work. And you'd better believe that they hold me responsible and often accountable for significant problems should they result.

But in the end, it's all just insurance anyway. If a client of mine wants a particular e-commerce feature to be super-secure, then they'll ask me to pay for any dollars lost due to bugs. I know that I'm not perfect, and of the thirty possible bugs, there's a small chance that I'll fall into one or two of them, and a partial chance that I won't catch it before it's exploited. So while much of the added price is for me to sit there and check things closely, the rest of the added price is for me to accumulate in the event that I need to pay it back. Over multiple clients and multiple exploits, that's the only way to do it.

The obvious alternative of checking things even closer winds up being far more money, and is only really relevant when physical safety is an issue.
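The pricing logic described above boils down to a simple expected-value calculation. A toy sketch, with every number invented purely for illustration:

```python
# Back-of-envelope version of pricing a "pay for dollars lost" guarantee.
# All figures below are made up; only the structure of the calculation matters.
p_bug = 2 / 30        # chance a project ships with one of the known pitfalls
p_exploited = 0.25    # chance that bug is actually found and exploited
loss_if_exploited = 40_000.0  # dollars reimbursed to the client in that case

# Expected payout per project: probability of a bug, times probability
# it gets exploited, times the loss you agreed to cover.
expected_payout = p_bug * p_exploited * loss_if_exploited
print(f"expected payout per project: ${expected_payout:,.2f}")

# The surcharge per contract is the extra review time plus the accumulated
# risk reserve; spread over many clients, this is just insurance.
review_cost = 1_500.0  # extra hours spent checking things closely
premium = review_cost + expected_payout
print(f"security surcharge per contract: ${premium:,.2f}")
```

The point of the parent's argument is visible in the numbers: as long as the premium is collected on every contract, the occasional payout is covered, which is cheaper than trying to drive the exploit probability all the way to zero.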

What is so hard about this? (0)

Anonymous Coward | more than 2 years ago | (#41103285)

Just adopt the PCI-DSS model:

  • Define a reasonable set of standards for what is "sloppy"
  • Have tiered requirements that scale based on the types of information the application can/will expose
  • ???
  • Profit!
  • ???
  • Security auditors / scanners Profit!

Sounds like a good idea... (1)

multicoregeneral (2618207) | more than 2 years ago | (#41103321)

until you realize that there's no such thing as "safe" software. All software is potentially exploitable, and potentially dangerous. As developers, we can do our best to catch the obvious defects, but given enough time and motivation, any piece of software can be cracked. It's just the way computers work. To assume negligence by default is dangerous for everyone, and could have the opposite of the intended effect (think gun laws). It will lead to less innovation, and to more software companies shielding themselves from these kinds of suits. Not a good scenario.