Programming Security IT

Choice of Programming Language Doesn't Matter For Security 192

An anonymous reader writes "The Security Ninja has written a blog post which discusses web programming languages and the fact that they are all insecure. It's based on a report from WhiteHat Security and aims to dispel the myth that some languages will guarantee that an application will be more or less secure than other languages. '... secure code is the product of a secure development process and real business commitment to deliver secure applications which includes developer education. The absence of these processes and business commitments will lead to web applications being developed insecurely regardless of the language being used.'"

Comments Filter:
  • It's a great point that security awareness is paramount in any web programming...

    But I dare you to write a more secure web service in , than in Java. Sure you have to be security aware, but it's still the case that some languages make acting on that awareness easier than others.

    • by Anonymous Coward on Friday May 07, 2010 @02:50PM (#32131182)

      But I dare you to write a more secure web service in , than in Java.

      I didn't know Whitespace [wikipedia.org] supported web services.

    • Don't know where it went, but there was supposed to be a "C" in there and I swear there was one when I hit submit...

      As it is though, I guess you can fill in your least favorite language!

      • Don't know where it went, but there was supposed to be a "C" in there and I swear there was one when I hit submit...

        Between pressing 'S-c' and 'Enter', you had a race condition.

    • Re: (Score:3, Insightful)

      by phantomfive ( 622387 )
      I've been thinking about this. Is Java really more secure than C? It is a harder question than it seems at first sight.

      Usually when you see claims that Java is more secure than C, it is based on people finding more security bugs in C than in Java, but I am not sure that is a good measurement because C and Java are used in different places, and that makes a huge difference. For example, a buffer overrun in a desktop app (excel, photoshop, whatever) is not a security breach, it's just annoying. C is use
      • You fail it.

        If you're worried about security, you don't assume a best case scenario. "lalala, ladee dah, I'll just make sure my C code is perfect with no exploits and it'll be just as secure as Java."

        The reality is it only takes one simple, hard to find and debug fuck up and your application will be owned. In the same scenario using Java, the app would still be secure.

        In a perfect world, C and Java are just as secure as one another. In reality, it's not even comparable, Java wins hands down.

        • That is your argument? Really? It sounds more like talking points from the Java faction. Your argument implies that you think it is impossible to make one simple, hard to find and debug fuck-up in Java. If you really believe that, you will be a lousy coder in any language.
          • Your argument implies that you think it is impossible to make one simple, hard to find and debug fuck-up in Java.

            But this is exactly the point: for certain types of f-ups, it is impossible to make them in some languages. You cannot overflow a string in Java. You can in C. Java is unambiguously safer in this respect.

            This is not a black and white issue. It is safer to use a language where there are fewer opportunities for human error, even if there are still some such opportunities.
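
            A minimal sketch of that difference (buffer size is arbitrary; Java shown). The equivalent one-past-the-end write in C would silently scribble over adjacent memory, whereas here the runtime refuses it:

                public class BoundsDemo {
                    public static void main(String[] args) {
                        byte[] buf = new byte[8];
                        try {
                            buf[8] = 1; // one element past the end
                        } catch (ArrayIndexOutOfBoundsException e) {
                            // The runtime checks every index: the worst case is an
                            // exception and a stack trace, not a smashed stack.
                            System.err.println("rejected: " + e.getMessage());
                        }
                    }
                }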

          • Re: (Score:3, Insightful)

            The problem is that *because* people are protected from certain very basic screw-ups in Java, companies automatically downgrade the quality of programmer and the level of oversight they use. The result, I believe, is that the end product is *even worse* than it would have been for the "more vulnerable" language.

            So - if you are talking academically about languages - Java is more secure by a long way. C has all the vulnerabilities that Java has plus a lot more. If you are talking about actual outcomes in

      • by dotgain ( 630123 ) on Friday May 07, 2010 @05:21PM (#32133272) Homepage Journal

        For example, a buffer overrun in a desktop app (excel, photoshop, whatever) is not a security breach, it's just annoying.

        Bad choice of examples. That's what we were saying and thinking in 1998: IT to PHB: "Don't open any EXE files mailed to you; Excel spreadsheets, Word docs, etc., are fine".

        An exploitable buffer overrun in any application where malicious inputs exist is a security hole.

    • Never underestimate incompetence. Sure, Java protects you against some kinds of buffer overflows (but then a couple of versions had such vulnerabilities in their native parts of the JRE instead), but it doesn't protect against any other kind of incompetence.

      There are probably a few SQL injection vulnerabilities and an XSS exploit being written somewhere right now. And someone out there is writing a servlet which reads and writes files off the hard drive, but isn't checking the paths, so really you can reque

      • Just because a bunch of Java programmers are morons doesn't mean the language sucks.

        • I never said that Java sucks. But it seems to me like TFA has a point. You still need to educate your devs and take security seriously. There is no magic amulet that you can just put on and be immune from security problems.
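
          To make the grandparent's SQL injection point concrete, here is a minimal JDBC sketch (the users table and column names are hypothetical). The parameterized form is a mechanical, one-line discipline, but a developer still has to know to apply it, which is exactly the education point:

              import java.sql.Connection;
              import java.sql.PreparedStatement;
              import java.sql.ResultSet;
              import java.sql.SQLException;

              public class UserLookup {
                  // Vulnerable style: attacker-controlled input pasted into the SQL text.
                  //   stmt.executeQuery("SELECT * FROM users WHERE name = '" + name + "'");
                  // Parameterized style: the value is bound separately, so quotes in
                  // 'name' cannot change the structure of the query.
                  static ResultSet findUser(Connection conn, String name) throws SQLException {
                      PreparedStatement ps =
                          conn.prepareStatement("SELECT id, name FROM users WHERE name = ?");
                      ps.setString(1, name);
                      return ps.executeQuery();
                  }
              }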

  • Obviously... (Score:2, Interesting)

    by Meshach ( 578918 )
    That seems like a no-brainer. It doesn't matter what language I use: if I write insecure code, the application will be insecure.

    More at 11
    • That seems like a no-brainer. It doesn't matter what language I use: if I write insecure code, the application will be insecure.

      The point is that it's easier to write insecure code in some languages than in others. In fact, in some of them, it practically writes itself.

      That said, TFA is pointless. Let's see what the guy is comparing:

      - ASP (languages: VBScript, JScript)
      - ASP.NET (languages: C#, VB)
      - ColdFusion
      - JSP & Struts (language: Java)
      - PHP
      - Perl

      Now, one thing that stands out: there's no C or C++ here. In fact, there's no memory-unsafe language here at all. You know, the kind where you can actually get things such as buffer

  • by aBaldrich ( 1692238 ) on Friday May 07, 2010 @02:46PM (#32131080)
    I think that on average, programs written in Haskell (exempli gratia) tend to be more secure, because it takes a better programmer to write them than to write a quick-and-dirty VB application.
    • Re: (Score:3, Funny)

      The problem is that the set of all haskell applications is too small to be statistically significant. OK, I'm just kidding.

    • by blair1q ( 305137 )

      I think the opposite because the better programmer will be rendered an average programmer by the difficulties of the language.

      • If that were true, we could prove evolution wrong.

        That which does not kill us makes us stronger, not weaker.

        The difficulties of the language are negligible compared to the difficulty of writing a secure, complex app, simply because the language's complexity is negligible compared to the sum complexity of all the apps that can be written in it.

        Easy-to-use, "simple" languages ought to be more secure than C, but in the real world, only Logo is really safer than anything else, just because you can do almost

    • There's also the fact that in Haskell (or, say, O'Caml) you can structure your application's interface to the web such that you cannot (by abstraction) do unsafe things unless you really want to(*). If you have such a framework the programmer has to be actively working against you to do unsafe things. (And there's nothing that can save you from an actively destructive/malevolent programmer inside your organization.)

      (*) Define Unsafe/Safe string variants and force all strings through Escape/Unescape for all t

    • I don't know if I agree; Haskell programmers tend to be demographically more experienced (it's the only language I know of where it seems that the median programmer has or is working on a PhD), but I would also trust a relatively inexperienced programmer to write fairly good code in Haskell, especially if they used an existing web framework like HappStack. Static typing and well-defined libraries go a long way towards making it hard to do the wrong thing. This is one of the things I find compelling about

  • by by (1706743) ( 1706744 ) on Friday May 07, 2010 @02:46PM (#32131098)
    'Cause even if the source is available, the would-be attacker won't be able to understand it [wikipedia.org]!
  • by Anonymous Coward on Friday May 07, 2010 @02:47PM (#32131104)

    Anyone who says all programming languages are equally exploitable is a fool. Sure, secure coding practices and standards are the way to approach the issue, not language selection, but it is, for instance, impossible to overrun a buffer in interpreted byte code the way you can in natively executed code. The fact that stack smashing doesn't exist in interpreted code alone demonstrates that languages (or their runtime environments that are inherent to a language) are not all equal in exploitability levels. To say they are all the same is simplifying things too much. Yes, all languages have their exploitable bad practices, but some have more than others.

    • Re: (Score:2, Insightful)

      by phantomfive ( 622387 )
      Wow, way to let your pre-existing ideas blind you to the truth. Do you realize these guys are doing the scientific thing here, and testing assumptions like the one you just made? They made a statistical analysis of 1,700 websites in different languages. Do you think you could at least read their assessment before spouting off your rage? If you have a problem with their study, you should point it out. If you disagree with them because of what you 'think', you might as well start a church and call it a r
      • by Fnkmaster ( 89084 ) on Friday May 07, 2010 @03:23PM (#32131706)

        Yeah, except this isn't a comparison by language. It's a comparison by platform technology. For example, JSP shows as one of the highest vulnerability ratios, whereas Struts (Apache's Java MVC framework) has just about the lowest vulnerability ratio (on par with ASPX).

        Clearly they are measuring *something* but it seems to have relatively little to do with languages themselves.

        If anything, it seems like web apps written in frameworks that don't actively discourage mixing code and presentation are more likely to have vulnerabilities, whereas frameworks that encourage separation more actively (and perhaps are newer frameworks) are less likely to have vulnerabilities. The worst two measured, Perl and JSP, are older technologies that date from the era before frameworks that enforced more MVC separation were common and before web app best practices really existed.

        • Good. You are criticizing the study. That is reasonable. The OP was not. The OP was spouting off his opinion without anything to back it up. If you can't see the difference, you're in trouble.
      • by hibiki_r ( 649814 ) on Friday May 07, 2010 @03:24PM (#32131720)

        The test itself already has bias, precisely because it works on a family of programs that happen to have a very limited set of inputs, and where the avenues of attack are relatively limited in some very important ways. The core vulnerabilities of websites have been done to death, so at this point, barring utter stupidity, I'd have been surprised if the security problems were noticeably different depending on the language.

      • I think you're right to say that it's better to trust empirical evidence than go on seemingly logical assumptions. However, to me it looks like the study is making a _business case_ that all the tested languages are likely to produce roughly the same number of flaws. That is to say, as a business decision the programming language viewed by itself is not a significant factor. However I don't think it can be extended out to saying the language doesn't matter. It's not accounting for quality of programmer, des

      • by gangien ( 151940 )

        I read the link, but since it requires registration of sorts to download the PDF, I won't do that.

        But it seems like a very macro-level type of study, and that seems to gloss over technical details that you need when judging a language.

        To use a car analogy: if you're measuring accident avoidance, and you note that the Prius has the fewest accidents, you conclude it's the best for avoiding accidents, ignoring what you're doing if you're driving a Prius, or the type of person that typically would u

      • by dgatwood ( 11270 ) on Friday May 07, 2010 @03:31PM (#32131814) Homepage Journal

        They made a statistical analysis of web languages. That's not generalizable to all programming languages as the Slashdot headline implies. All of these languages have several things in common:

        • Variable-length strings.
        • No truly fixed-size data structures or buffers.
        • No direct access to pointers.

        In short, all of these programming languages eliminate entire classes of potential exploits that other programming languages allow. Therefore, although these programming languages happen to be similar, that does not mean that programming language choice has no bearing on security. It just means that choice of programming language within a very narrow range of languages that are not a representative sample of programming languages as a whole has no bearing on security.
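
        For contrast, here is a minimal sketch of a flaw class that none of these languages eliminates: reflecting user input into HTML without escaping (XSS). The escaper below is hand-rolled purely for illustration; a real application should use its framework's or a library's escaping:

            public class HtmlEscape {
                // Minimal escaping of the characters that let injected markup or
                // script break out of an HTML text node or attribute.
                static String escapeHtml(String s) {
                    StringBuilder out = new StringBuilder(s.length());
                    for (int i = 0; i < s.length(); i++) {
                        char c = s.charAt(i);
                        switch (c) {
                            case '<':  out.append("&lt;");   break;
                            case '>':  out.append("&gt;");   break;
                            case '&':  out.append("&amp;");  break;
                            case '"':  out.append("&quot;"); break;
                            case '\'': out.append("&#39;");  break;
                            default:   out.append(c);
                        }
                    }
                    return out.toString();
                }

                public static void main(String[] args) {
                    // Without escaping, this string would run as script in the page.
                    System.out.println(escapeHtml("<script>alert('xss')</script>"));
                }
            }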

      • by bmajik ( 96670 ) <matt@mattevans.org> on Friday May 07, 2010 @03:49PM (#32132084) Homepage Journal

        Well, the full paper is behind nag-ware, but here are the "top 3 findings"

        Empirically, programming languages / frameworks do not have similar security postures when deployed in the field. They are shown to have moderately different vulnerabilities, with different frequency of occurrence, which are fixed in different amounts of time.

        The size of a Web application's attack surface alone does not necessarily correlate to the volume and type of issues identified. For example Microsoft's .NET (ASPX) and Struts (DO), with near-average attack surfaces, turned in the two lowest historical vulnerability averages. ASPX and DO websites have had an average of 18.7 and 19.9 serious vulnerabilities respectively.

        Perl (PL) had the highest average number of vulnerabilities found historically by a wide margin, at 44.8 per website, and also the largest number currently, at 11.8. [...] vulnerabilities have taken over 50 days to fix.

        Gosh. To me that says that they found significant differences between the languages and platforms.

        However, I am not ready to make any claims based on this, other than, they sampled a bunch of websites and then recorded information about vulnerabilities, and did "group by" on language/framework.

        Isn't it likely that there is some selection bias here?

        Rather than making claims about the intrinsic nature of some language or framework [like all of them are equal, or one is better than the others], don't you need to correct for the lack of control.. like same coders in the same organization trying to implement the same _type_ of application?

        If I gave the same group of developers equal training time, equal implementation time, and equal specs... and then said "do this in ASP.NET", and then "do it in Perl". And I did this with 10 groups of developers, and I changed the order of which application came first (i.e. some groups did Perl first, some did ASP.NET first).

        __then__ I would feel comfortable saying something about the relationship between language/framework and security vulnerabilities. What we really want to know is, given developers _like yours_, who've had equal training, expertise, and time, when trying to produce equivalent functionality.. how is _their_ production of security defects, and is there a difference between toolchains?

        Now, I didn't read the PDF because of the nag-wall in front of it. But that doesn't sound like what they did here.

        The goal out here in the real world is this: make an application that is secure-enough, cheaply-enough. "Cheaply-enough" means what caliber of people you need to hire, and how long it takes them to produce value-adding output. Secure enough means that the cost of fixing your bugs is higher than the cost of (risk of penetration * financial impact of penetration).
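
        As a toy illustration of that trade-off (all three numbers are made up for the example, not taken from the report):

            public class SecureEnough {
                public static void main(String[] args) {
                    double riskOfPenetration   = 0.05;     // chance of exploitation this year (assumed)
                    double impactOfPenetration = 200000.0; // cost if it happens (assumed)
                    double costToFix           = 8000.0;   // engineering cost of the fix (assumed)

                    double expectedLoss = riskOfPenetration * impactOfPenetration; // 10,000
                    // By the rule of thumb above, fixing is worthwhile while the fix
                    // costs less than the expected loss it removes.
                    System.out.println(costToFix < expectedLoss ? "fix it" : "accept the risk");
                }
            }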

    • Re: (Score:3, Insightful)

      by BitZtream ( 692029 )

      Please show me the interpreted byte code language/runtime that has never had a buffer overflow exploit.

      I promise you that I can count on one finger more than the number of systems you can show me without a buffer overflow exploit.

      The fact that you're saying what you're saying tells me you really have no understanding of how exploits happen at all, let alone any reason you should be talking about secure code.

    • by Delusion_ ( 56114 ) on Friday May 07, 2010 @03:29PM (#32131794) Homepage

      "All languages are exploitable" != "all languages are equally exploitable".

      You're the first person to bring the word "equally" into the conversation, and have missed the point.

    • Re: (Score:3, Informative)

      by eulernet ( 1132389 )

      The fact that stack smashing doesn't exist in interpreted code alone demonstrates that languages (or their runtime environments that are inherent to a language) are not all equal in exploitability levels.

      You are totally wrong, since JavaScript, which is interpreted, has numerous exploits involving stack smashing.
      http://en.wikipedia.org/wiki/Heap_spraying [wikipedia.org]
      with an example here:
      http://stackoverflow.com/questions/381171/help-me-understand-this-javascript-exploit [stackoverflow.com]

      ActionScript (from Flash) is also an interpreted language, and full of security bugs!

  • "That aims to dispel the myth that some languages will guarantee that an application will be more or less secure than other languages."

    Who ever said, besides your 16-year-old cousin who just figured out how to add a flaming skull animation to his MySpace page, that there is any web application programming language that will guarantee security? Sheesh.

  • by Anonymous Coward
    You mean I am actually supposed to know what I'm doing?!
  • If you know what programming language a programmer uses you can tell if they know what they are doing or not.

  • The Python Paradox (Score:4, Informative)

    by calmofthestorm ( 1344385 ) on Friday May 07, 2010 @02:52PM (#32131224)

    If you haven't heard of it, the python paradox is an interesting read: http://www.paulgraham.com/pypar.html [paulgraham.com]

    Simply put, the kind of people who learn a language out of interest rather than out of wanting to get a job tend to be better programmers on average. (This was written a while ago, when Python had little use outside the FOSS community. Now that Python is looking like it may someday replace Java, perhaps the Haskell Paradox is a better term.)

    Anyway, perhaps the same issue is at play here. Perhaps the people who use PHP tend to be less aware of security or more apathetic toward it, and thus there is a two-way feedback between language and programmer. (The last time I used Visual Basic, the compiler was as full of holes as a piece of Swiss cheese and Microsoft wanted me to pay $100 each to report counterexample bugs, but that was 6.0, back in middle school.)

    • by barzok ( 26681 ) on Friday May 07, 2010 @02:58PM (#32131346)

      Simply put, the kind of people who learn a language out of interest rather than out of wanting to get a job tend to be better programmers on average.

      People who do anything because it interests & fascinates them on a personal level do better than those who are only in it for the paycheck. Doesn't matter whether it's programming, auto repair, landscaping, or anything else.

      • People who do anything because it interests & fascinates them on a personal level do better than those who are only in it for the paycheck. Doesn't matter whether it's programming, auto repair, landscaping, or anything else.

        Except gambling/poker.

      • People who do anything because it interests & fascinates them on a personal level do better than those who are only in it for the paycheck. Doesn't matter whether it's programming, auto repair, landscaping, or anything else.

        I paid for my undergrad degree by landscaping, and I can say with some certainty that it is neither an interesting nor a fascinating subject.

    • Re: (Score:3, Insightful)

      by tsalmark ( 1265778 )
      I think PHP is rather an interesting case. Looking at SQL injection: the language is strong enough and easy enough to protect against attack, yet if you look at the programming documentation, examples, and free applets available on the web, many of them have no protection at all. Also, the forums providing answers to novice questions are often answered by other novices. Best practices do not yet seem as agreed upon and pointed to in PHP as in other languages, so the bad practices are almost self-perpetuating.
      • by calmofthestorm ( 1344385 ) on Friday May 07, 2010 @03:23PM (#32131698)

        Exactly. The culture of a language is as important as, or more important than, the language itself. Indeed, the culture shapes the language (but of course, to a degree, the language shapes the culture).

        Java itself isn't a very good language, but it's the hordes of incompetent Java programmers who make it such a terrible choice for everything. This goes back to the Python paradox: companies want Python programmers to write Java for them.

        I will say this in Java's favor, however: It's a language where the smartest can't write code that confuses the dumbest, and where the dumbest can't write code that does too much damage.

        • I will say this in Java's favor, however: It's a language where the smartest can't write code that confuses the dumbest

          Not really. They dug the grave for the dumb in 1.1 already, with inner classes; and nailed the coffin shut in 5, with generics. In 7, they are going to finally bury it with lambdas [java.net].

          Me, I'm waiting eagerly to see what the next incarnation of COBOL turns out to be...

        • Re: (Score:3, Insightful)

          by tsotha ( 720379 )

          I will say this in Java's favor, however: It's a language where the smartest can't write code that confuses the dumbest, and where the dumbest can't write code that does too much damage.

          That alone makes it the best language for large business projects. Your coworkers will be a mix of good and bad, and pretending the bad programmers aren't there is a more damaging mistake than anything you can do to the code.

        • I will say this in Java's favor, however: It's a language where the smartest can't write code that confuses the dumbest, [...].

          In my AP computer science class, some of my classmates don't seem to grok the whole OO concept (i.e. if I create multiple interacting classes, it confuses them. They do however understand "[Type] foo=new [Type]();", but only for predefined types).

    • It's an interesting essay, but it's just speculation. Not any more insightful than the serious posts on Slashdot.

    • This is a very good point. It is the programmers that are important, not so much the languages, but languages do bias what kinds of programmers will use them.

      See also why Linus Torvalds prefers C to C++ [cat-v.org]; it has not so much to do with the language itself but with the kind of programmers that use the language.

  • Any language that lacks an inherent insecurity can be used to write secure apps, just as any language (Brainfuck, anyone?) can be used to write any program.*

    You choose a language not because it makes it possible for you to do anything, but because it makes it easier than another language.

    *I realize that there are cases where performance-per-core is critical, and that narrows your choices considerably. Still, in that situation, some use C, others use C++, and still others use Lisp.

  • I see the big problem here is that they are specifically talking about web applications. When you're doing web applications, the major security holes are SQL injection, XSS attacks, HTML/JavaScript injection, and other such things. Almost nobody uses C for programming on the web. Of all the popular web languages (PHP, .NET, Java, Python, and all the others), none of them use pointers, and none of them require you to manage your own memory. This cuts out a lot of security exploits found in languages like C,
  • by david.emery ( 127135 ) on Friday May 07, 2010 @03:02PM (#32131406)

    1. The languages being considered/charted are ASP, ASPX, CFM, DO, JSP, PHP and PL (I can guess at most of these acronyms).

    What's missing, obviously, are 'real' programming languages such as C, Java, FORTRAN, Ada, C++, Eiffel, etc.

    2. A lot of these languages share a common (C) heritage, and I'd assert "inherit" a lot of the security weaknesses of C. That's particularly true of weak typing for scalars, including array bounds.

    The conclusion I think can be drawn from this is that we need a substantial improvement in web programming practices, including languages. Any other conclusion is overreach.

    • by hAckz0r ( 989977 )
      Agreed. I have yet to find a book on web programming in Ada, and I doubt anyone would read it if one ever did surface. Those features that make a language secure tend also to make it unpopular.
    • You forgot the functional programming languages, like Haskell, OCaml, Standard ML, Erlang, etc.
      They usually have a track record of higher security without the performance loss that Java has, since the checks can happen at compile time.

      Of course you can still mess up things even in Haskell. E.g. [0,1,2,3,4,5]!!10 will cause a runtime exception. But the functions that can cause errors are well-known and can be avoided. Also, QuickCheck is a really great tool to test out all the possibilities.
      If I ever had a

    • ... "inherit" a lot of the security weaknesses of C. That's particularly true of weak typing for scalars, including array bounds.

      Can you explain what you mean here?

  • by ctrl-alt-canc ( 977108 ) on Friday May 07, 2010 @03:03PM (#32131426)
    "My favourite programming language is a soldering iron".
    • by micheas ( 231635 )

      Reminds me of one of my roommates in college.

      He had a poster of the saying (IIRC) "Real programmers don't use operating systems; they write directly on the hardware, as God intended."

  • "You can write a FORTRAN program in any language."

    If anyone knows who deserves the credit for that one, BTW ...

  • Java (Score:3, Insightful)

    by caluml ( 551744 ) <slashdot@spamgoe ... minus herbivore> on Friday May 07, 2010 @03:26PM (#32131742) Homepage
    I'm confused. When I was thinking of learning a new language a few years ago, I took a good look at them all (well, the top 5 to 10), and picked one based on how many jobs there were, pay levels, non-proprietary-ness, etc.

    One of the things I liked about Java was that there aren't any buffer overflows to worry about. Well, apart from ones in the JVM, but they are few and far between.
    I don't understand when people say that all languages are as insecure as each other. Sure, people can do stupid things like not checking input, etc., but when it comes to finding some sort of buffer overflow in a function/library?
    If I had to write a website that would be deployed onto a box which was not touched for 5 years, I imagine that a Java-based site would have a better chance of faring well than a PHP one.
  • I think this is more a testament to the fact that crappy programmers will write crappy code in any language, rather than a sign that all languages are equally crappy for writing secure code. If the same person wrote the same program in different languages then you might have a fair comparison; otherwise this report just shows a similar ratio of bad programmers across different languages, which I don't find all that surprising.
  • There isn’t just “secure” and “not secure”, you know? Some are more and others less secure.

    And I have only one thing to say:

    Haskell > Java > C :)

    (Java has built-in checks that prevent the worst errors of C. And Haskell also has them, but at compile time, so you get back the performance. Of course those languages are just examples, and any similar languages could be placed in there instead.)

    • Re: (Score:3, Insightful)

      In the case of an advanced JVM like HotSpot (the official Sun/Oracle JVM), you also get the performance back.

      If the array bounds checking can be removed without compromising security, HotSpot's JIT compiler will do so when compiling the Java bytecode into native instructions.
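
      A sketch of the textbook case (whether a given HotSpot build actually drops the check is a JIT implementation detail, but this loop shape is the one it is designed to recognize):

          public class SumArray {
              // The loop bound is the array's own length, so every index is
              // provably in range and the per-element bounds check can be
              // eliminated in the compiled code without losing safety.
              static long sum(int[] values) {
                  long total = 0;
                  for (int i = 0; i < values.length; i++) {
                      total += values[i];
                  }
                  return total;
              }

              public static void main(String[] args) {
                  System.out.println(sum(new int[] {1, 2, 3, 4})); // prints 10
              }
          }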

  • Thousands of Banks can't be wrong! Right?

  • Sure, if you take extra precautions with the buffer-overflow languages, your software can be just as secure. But in the real world, that's almost never the case. Projects are always rushed and mistakes happen. Every team has one or more weak links. And what coder prefers burdensome process to development anyway? Anyone who promotes C/C++ over Java for a back-end enterprise application is not a professional IMO. They come across as stubborn basement hackers who can't keep their resumes up to date. C/C++ was n

  • http://www.c-program.com/kt/reflections-on-trusting.html [c-program.com] is worth another read. Apple can make a reasonable case that allowing other development tools in their little garden reduces their ability to ensure a secure system.

    I concur with posters who observe that some languages do have more protections than others; but in the end the application programmer needs to be careful and security aware, and there has to be complete trust in every step of the processing from what the programmer writes to what is run on

  • Languages maybe. Standard libraries, oh boy for sure.

    From the man(3) page for the set of functions including gets():

    BUGS
    Never use gets(). Because it is impossible to tell without knowing the
    data in advance how many characters gets() will read, and because gets()
    will continue to store characters past the end of the buffer, it is

    • Re: (Score:3, Informative)

      gets() is officially deprecated in the ISO C99 standard, though, and will be removed entirely in ISO C1X. Most compilers today (even non-C99 ones) will output a warning if they see it, and warnings-as-errors is standard development practice for C/C++ these days.

      Also the "secure" functions in TR 24731 (strcpy_s, strcat_s etc [us-cert.gov]) will be part of the base standard library in C1X.

  • A security "ninja" who uses Windows XP for everything? It's a bit like a design "guru" who uses MS Paint on a monochrome monitor.

    And yeah, it's kind of obvious it's not really down to the language, it's down to the programmer. A little surprised by the stats though: I'd have thought Perl hackers would have more security know-how than your average Java monkey.

  • There's a hell of a lot more to programming than just web programming, and there are a lot of real programming languages that go to great lengths to make secure programming easier.

    The Security Ninja is a paper tiger.

