Programming

How Relevant is C in 2014? 641

Nerval's Lobster writes: Many programming languages have come and gone since Dennis Ritchie devised C in 1972, and yet C has not only survived three major revisions, but continues to thrive. But aside from this incredible legacy, what keeps C atop the Tiobe Index? The number of jobs available for C programmers is not huge, and many of those also include C++ and Objective-C. On Reddit, the C community, while one of the ten most popular programming communities, is half the size of the C++ group. In a new column, David Bolton argues that C remains extremely relevant due to a number of factors including newer C compiler support, the Internet ("basically driven by C applications"), an immense amount of active software written in C that's still used, and its ease of learning. "Knowing C provides a handy insight into higher-level languages — C++, Objective-C, Perl, Python, Java, PHP, C#, D and Go all have block syntax that's derived from C." Do you agree?
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Si. (Score:5, Funny)

    by Anonymous Coward on Tuesday December 09, 2014 @06:09AM (#48553741)
    Si.
    • Why, very relevant! I use the C. word almost daily!

      • Re: (Score:3, Funny)

        by Anonymous Coward

        I have gone out of my way to never use that letter. Notise that at first it kan be a bit diffikult but you get used to it.

        • Re:Si. (Score:4, Funny)

          by Thiez ( 1281866 ) on Tuesday December 09, 2014 @07:54AM (#48554101)
          Said the Anonymous Coward...
        • Re:Si. (Score:4, Funny)

          by jc42 ( 318812 ) on Tuesday December 09, 2014 @04:19PM (#48558195) Homepage Journal

          I have gone out of my way to never use that letter. Notise that at first it kan be a bit diffikult but you get used to it.

          In English, pretty much the only "real" use of 'c' rather than 's' or 'k' is in the digraph "ch", which represents a phoneme that has no other standard spelling. However, you kan replase it with "tsh", which produses the same phoneme bekause phonetikally "ch" really is just 't' + 'sh'. So with this tshoise of letters, you kan further approatsh the kommendable goal of replasing an utterly unnesessary English letter with a more phonetikally-korrekt ekwivalent. At the same time, we kan make kwik work of replasing that idiotik 'q' with a sensible replasement.

          (Kyue the Mark Twain kwotes on the topik. ;-)

    • Re:Si. (Score:4, Insightful)

      by cheesybagel ( 670288 ) on Tuesday December 09, 2014 @11:11AM (#48555363)

      Even if you mostly program in other languages eventually you need to interface with some system function or legacy library and you *will* need to use C.

• Or a library written in another language, in which case C, or more exactly, the ABI (usually dictated by your C runtime of choice), is the common dialect.
  • Relevant C (Score:5, Funny)

    by smittyoneeach ( 243267 ) * on Tuesday December 09, 2014 @06:11AM (#48553749) Homepage Journal
    Relevant C
    2B || !2B
    Either learn what you're doing
    Or stick to the Wii
    Burma Shave
    • by BarbaraHudson ( 3785311 ) <barbara.jane.hudson@nospAM.icloud.com> on Tuesday December 09, 2014 @11:54AM (#48555851) Journal
      To "C" or not to "C", that is the question.
      Whether ’tis nobler in the mind to suffer
      The slings and arrows of java coders,
      Or to take arms against a sea of perl,
      And by opposing end them? To die(): To sleep()
      No more; and by die() to say we exit(),
      The heart-ache of the thousand malloc()s
      That c code is heir to, ’tis a consummation (of ram)
      Devoutly to be wish’d to die() when we forget to free(),
      To sleep(): perchance to dream(): ay, there’s the rub;
      For in that sleep() of die() what random() instructions may come
      When we have shuffled off other's poor performance,
      Must give us pause: there’s the respect That makes durability of so long C;
      Burma Shave
      For who would bear the whips and scorns of n00bs,
      The oppressor’s wrong, the proud man’s memory managed tools,
  • Embedded Systems (Score:4, Insightful)

    by Anonymous Coward on Tuesday December 09, 2014 @06:15AM (#48553757)

Those widgets the clueless newspaper reporters and marketers call 'the internet of things', otherwise known as embedded systems, depend on Linux and C. Therefore C is 'the next big thing'.

• And even more things, like your central heating programmer and things of that ilk, won't necessarily run Linux or even an RTOS, but do their own co-operative multitasking, if required.
    • Re:Embedded Systems (Score:5, Interesting)

      by jythie ( 914043 ) on Tuesday December 09, 2014 @07:37AM (#48554035)
One of the big reasons C will probably not be going away any time soon is that there is no replacement and not much work being done on one. Language designers are constantly trying to redo or replace the higher-level languages, but there is not much interest in replacing such a low level language... and the people who do use C are not interested either, since they tend not to be language fetishists.
      • by Tom ( 822 ) on Tuesday December 09, 2014 @08:09AM (#48554147) Homepage Journal

        and the people who do use C are not interested either since they tend not to be language fetishists.

        This. Half of the newer high-level languages today are just the mental masturbations of someone who either thinks he can make the wheel more round or the result of a "not invented here" mindset. There's so much crap out there forking a perfectly good language because someone thinks it should be a =+ b; instead of a += b;

        It's sickening, and a good reason to stay away from all this shit, because five years down the road someone will fork the fork and you can throw all your code away because support and development just stops as all the ADD kids jump at the new toy. That'll never happen with your C code.

        • by chuckinator ( 2409512 ) on Tuesday December 09, 2014 @12:18PM (#48556053)
I agree with PP and GP, but there's more to it than just that. Software is like an organ of your computer; your computer typically won't do much worthwhile if there's not a whole bunch of the things working together to make complete systems. Almost every one of the higher-level languages is implemented in C at some point in the software stack. Some might argue that certain JVM languages like Scala and Groovy and Clojure are written in pure Java, but guess what? The JVM is written in C. Almost every piece of software out in the wild is either written in C or depends on critical components written in C all the way down to the operating system. If you're running embedded, you might not have an OS, but you probably should be using C on microcontrollers and embedded systems unless there's a real good reason not to.
        • by phantomfive ( 622387 ) on Tuesday December 09, 2014 @01:02PM (#48556331) Journal

          It's sickening, and a good reason to stay away from all this shit, because five years down the road someone will fork the fork and you can throw all your code away because support and development just stops as all the ADD kids jump at the new toy. That'll never happen with your C code.

          This is one good reason I program in C.
          The other reason is portability. You can write something in C and it will run on every major platform and almost every embedded platform. Furthermore, it is portable between languages. If you write a library in C, people can call it from C#, Java, Python, TCL, Ruby, or nearly any other language. It is the lingua franca of programming.

  • Because it's extremely dangerous and a lot of people are still using it. The 'standard' standard library is so full of security holes it's not even funny, and attempts to 'improve' it over the years have mostly been unsuccessful because the bad coding patterns still exist.

    C is a great language, it's just that most humans are incapable of using it safely and securely. It's like a .45 with a downward-pointing barrel. It's all too easy to shoot yourself in the foot.

    For full disclosure, I used to be an avid C p

    • by gsslay ( 807818 ) on Tuesday December 09, 2014 @06:34AM (#48553809)

      It's like when you drunk drive and think you're just fine.

      Well the problem there is you're drunk, not that you can drive. C is a great language, and it gives its programmers a great deal of power and flexibility. But with that comes responsibility not to code like an idiot. If you're going to wield its power carelessly, of course you're a danger.

      Perhaps C's greatest weakness is that it places too much trust in the coder, where other languages don't.

      • OP was equating intentionally using C with intentionally driving drunk. As a long time C hack (still am) I concur.

      • by Tom ( 822 )

        Perhaps C's greatest weakness is that it places too much trust in the coder, where other languages don't.

        I consider this its greatest strength. If you want a training wheels language, there are probably 200 to choose from. If you want a language for adults, there aren't all that many choices.

      • by gweihir ( 88907 ) on Tuesday December 09, 2014 @09:20AM (#48554527)

        C is not a tool for the incompetent (whether temporary due to alcohol or permanently). It is an expert-only tool. There are a few of those around and they will stay around, because in the hands of somebody skilled, these tools deliver exceptional results that no more generally usable tool can match.

• This is why romanticizing C is not a good idea. It's a pain-in-the-ass language that should only be used when it has to be. Currently there are far too many C zealots trying to project to the world that they are experts, but who would be better described as Dunning-Kruger sufferers.

    • by Viol8 ( 599362 ) on Tuesday December 09, 2014 @06:38AM (#48553825) Homepage

If it's too dangerous for humans, who do you think is going to write all the compiler/interpreter and low level OS interfaces of whatever alternative language of your choice? At some point someone has to get their hands dirty down at the metal, whether it's in C or assembler. If you're not up to that then fine, but please spare us the poor workman blaming his tools excuse.

      • by codewarren ( 927270 ) on Tuesday December 09, 2014 @09:43AM (#48554647)

        Actually this is false. It is possible to write a language that is both safe* and compiles itself.

        If you're not up to that then fine, but please spare us the poor workman blaming his tools excuse

I can cut a straight line with a circular saw without using a guide or a guard, but I can do it a hell of a lot quicker with a guide to rest against and a guard to keep me from having to constantly check my fingers and cords etc. These things weren't invented because of bad workmen, but because they make good workmen better. Not everyone who notices that there may be better tools out there than C for the very things that C is used for is a workman blaming his tools.

Someone eventually needs to write the rules for translating the higher level language down to lower levels, but this isn't the same as "getting their hands dirty down to the metal" in the way you've implied, because it can be done in tiny, self-contained chunks, following yet more rules, as rigorously as a mathematical proof, and therefore not be subject to the same pitfalls as languages like C. It also only has to be done once (per processor), but the safety is ongoing.

        This layering is just modular design and separation of concern. Look at IR in the LLVM project which has allowed an explosion of languages that can enjoy most of the same compiler optimizations that the C family enjoy using this principle.

        (btw, the Rust project is very interesting in this subject)

        * Of course, the term "safe" has a limited meaning. A compiler can't read your mind but, to the extent that a language is well designed, it can prevent you from doing things that you could not have intended to do and force you to follow rules that will never allow certain common errors that result from people having limited memory.

    • by TheRaven64 ( 641858 ) on Tuesday December 09, 2014 @06:51AM (#48553873) Journal
      There are good reasons and bad reasons why C is still popular.

The main good reason is the small footprint. I was recently given an ARM Cortex M3 prototyping board to play with. This is a pretty high-end part by IoT standards, but has 128KB of RAM and 512KB of flash for code and data storage. It's programmed using C++, but unless you stick to a very restrictive subset of C++ that's almost C, then you'll end up generating too much code (C++ templates are not just a good way of blowing away your i-cache on high-end systems, they're also a good way of blowing away your total code storage on embedded chips).

      The other good reason is that it makes it relatively easy to have fine control. Not quite as easy as you'd want. To give one example, the JavaScriptCore interpreter and baseline JIT were rewritten from C++ into macro assembler a couple of years back because C and C++ don't give you fine-grained control over stack layout. To give another example, some game devs were recently complaining on the LLVM list that their hand-optimised memcpy implementations were being turned into memcpy library calls, because they assume that they're using a macro assembler when they write C, and not a language with a complex optimising compiler behind it. It does, however, give you flow control primitives that make it easy to reason about performance and fine-grained control over memory layout. These are particularly valuable in certain contexts, for example when implementing higher-level languages.

      The biggest bad reason for C being popular is that we've standardised on C as the way of defining library APIs in UNIX-land. There's no IDL that describes higher-level concepts, there are just C headers, and the language that makes it easiest to use C libraries wins. There has been some improvement in C-calling FFIs recently, and a big part of the popularity of Python is the ease with which you can use C/C++ libraries from it. Even simple things are hard when interoperating with C. It's hard for an FFI generator to know whether that char * parameter is a null-terminated string or a pointer to an arbitrary block of memory that's going to be read by the callee, a pointer to a single char that's going to be written back, or whether the callee returns a pointer to within the block and needs the original memory to persist. Lots of libraries take function pointers that have a void* context pointer, so can be mapped to closures in the caller's language, but they all put the context object in different places so you need a custom trampoline for each one.

      With over 8 billion lines of open source C code (source: OpenHub.net), there's a good chance that the library that you want to use is written in C.

• That's why most application developers will not touch C with a ten-foot pole. C remains an extremely fast and simple programming language, but it has little built-in support for "safe programming".

      To be honest, some of that is not the failure of the language but the libraries. For example, string handling is a big source of programming mistakes in C. So why isn't there a _standard_ library for safe string handling? (I know there may be several third party libraries) A library could abstract away the management

      • So why isn't there a _standard_ library for safe string handling? (I know there may be several third party libraries) A library could abstract away the management of pointers to chars, things like growing and shrinking storage of the strings, creating string objects, destroying them, etc. without programmer ever touching a raw pointer to memory containing the string data.

        Sounds like you're looking for C++ and std::string

  • by Ihlosi ( 895663 ) on Tuesday December 09, 2014 @06:21AM (#48553787)
    C is the high-level language there. If you want actual control over your target, you'll need to use assembly.
    • by andyn ( 689342 )

      Depends mostly on compiler and toolchain availability on those platforms. You still have Python-capable [telit.com] processors for embedded systems if you can't afford to learn C.

      FWIW, I've been struggling with LPC4300 series processors. The open source toolchain is just so bad that your CPU hard faults on first attempted function call (most likely due to incorrect memory maps).

    • by jareth-0205 ( 525594 ) on Tuesday December 09, 2014 @07:55AM (#48554103) Homepage

      C is the high-level language there. If you want actual control over your target, you'll need to use assembly.

      Luxury! You trust a compiler? When I were a lad we inputted the hex codes directly.
      /
      Well of course we had it tough... tape and a magnetised pin was all we needed.
      /
      You kids don't know you were born... we used to program using a cigarette end to burn holes in the punch cards.
      /
      etc...

      • Once. Gave me a real appreciation for high level languages like C.

        And, yes, it was an embedded application. No user interface, the controller talked to the device, the device ran some very precise valves. It was actually a fun month getting all that working.

  • by Anonymous Coward on Tuesday December 09, 2014 @06:30AM (#48553801)

    Is there another OS/system programming language that is universally accepted as a reference, rather simple to learn, available on virtually any OS, real fast? I mean, go is nice and multi platform and powerful, but it is not even close to C popularity.

C++ should have been C's successor, but it is too complex to be.

• I don't think there is any other. The latest C++ specs have made some functionality much simpler, but it's still a rather complex language. C is the gold standard for embedded and other small-footprint stuff; C++ provides high-performance OOP for GUI apps and video games.
    • Re: (Score:3, Insightful)

      by Rei ( 128717 )

C++ should have been C's successor, but it is too complex to be.

      C++ is no more complicated to use than C. You can write C code in C++ and it'll work just fine, with only a few rare exceptions.

      What C++ does give you is many more capabilities. Now, if you don't want to take the time to learn these capabilities, that's not the language's fault. There's a few things that were implemented a bit awkwardly (mainly looking at you, streams), but the vast majority is quite simple and straightforward, and it just keeps

  • Modern, best-practice C can be compiled with a C++ compiler. (There are a few gotchas moving in either direction - http://www.cprogramming.com/tu... [cprogramming.com] - but it's not hard to avoid them.) For all its object-oriented impurity and spec-bloat, the one thing I love about C++ is that you can write relatively high-level code when that makes sense, but you always have the option to grapple with all the fine detail when that's useful.

    • It's not C until it has designated initializers.

    • Here be monsters (Score:4, Insightful)

      by luis_a_espinal ( 1810296 ) on Tuesday December 09, 2014 @07:14AM (#48553949)

      C++ is C

      I used to believe in this until I had to work on both. Although one can compile best-practice C with a C++ compiler (sans the gotchas), that glosses over the idiosyncrasies of each language. C does not have initializers as in C++.

More importantly, it does not have references or type-safe casting operators, and it has no Turing-complete template language as C++ does. These differences will never go away, and they completely alter the design and implementation of your code and your abstractions.

Not to mention the C++ rules of PODs versus everything else, which affect how we link C code with C++ code (and vice versa.) And modern C++ uses templates heavily, in a manner that makes the language resemble something else entirely. Whether that is a good thing is highly subjective, but whatever.

      So from a practical point of view, it is sane to treat both languages as fundamentally different.

When we program in a language (be it Ruby, Java, C or C++ or whatever), we ought to do so in the idiomatic way that naturally exploits the best capabilities of the language. So with that in mind, we cannot treat C and C++ as the same language (nor is it quite accurate to describe modern C++ as a superset of C, regardless of historical evolution.)

I do believe, however, that it is very important, if not fundamental, to understand C semantics to use C++ effectively. The fundamental semantics behind the primitive types and control structures remain more or less the same. And I've always found that C++ programmers without a good background in C tend to make certain mistakes when they need to operate with pointers (since they are so used to working with references.)

      Furthermore, integration of C with C++ is not an uncommon task, and development of C++ code with that in mind is paramount. It is very hard to do that without a good understanding of C.

  • Too many C++ programmers who learnt C++ without learning C first jumped straight into the OO and/or generics including the STL. Which is fine up to a point. But they tend to get completely lost when someone asks them to do any low level coding such as writing a bespoke B+ tree from scratch or something similar. Also when presented with multi level pointers they tend to get confused and don't really seem to understand the difference between pointers and C arrays.

    • by janoc ( 699997 )

      " don't really seem to understand the difference between pointers and C arrays"

Well, because there isn't one at the language level. The array syntax using square brackets is only syntactic sugar for pointer arithmetic, nothing more. It is a common myth that there is a difference.

      I suppose you mean the difference in the sense that an array means a continuously allocated block of memory of a certain size, whereas a pointer can point anywhere and you need to explicitly allocate that block if you want it. H

      • by Viol8 ( 599362 )

        "Well, because there isn't one at the language level. "The array syntax using square brackets is only a syntactic sugar for pointer arithmetic, nothing more"

        Seriously??

char a[100];
char *b = a;

printf("%zu, %zu\n", sizeof(a), sizeof(b)); /* e.g. 100, 8 */

      • by jeremyp ( 130771 ) on Tuesday December 09, 2014 @08:27AM (#48554229) Homepage Journal

        " don't really seem to understand the difference between pointers and C arrays"

        Well, because there isn't one at the language level. The array syntax using square brackets is only a syntactic sugar for pointer arithmetic, nothing more.

        There is a difference between an array and a pointer.

        char a[100];
        char* b;

        b = a; // Fine
        a = b; // Not fine.

        If you read the standard, the language used is that, in an expression, an array "decays" to a pointer with the rule being that you get a pointer to the array's first element. The "array is not a pointer" rule is further demonstrated by passing an array to sizeof (as viol8 points out).

    • I'm not sure I agree. I learned C++ first and then C a year or two later, and I can manage raw pointers just fine, thank you. I just think you're crazy if you do it willingly when there are much better alternatives available.

      I've seen plenty of C turned C++ programmers who essentially treat classes like a giant package for wrapping up loosely related functions into horrifying kitchen-sink classes. They don't know how to create proper interfaces, and pass all sorts of raw pointers around, managing memory

  • A lot of problems want to have a solution that is very close to the hardware. C is an excellent macro assembler, but you need to remember to treat it as such.

    It seems like very many programmers don't know, or don't want to think over the lower level implications of everything you do in C.

    C is relevant because as long as we use computers, we need to tell them what to do, and C (and fuzzy bloated C like C++) does that for us.

    Most people and most programmers don't need to touch computers on that level, and the

    • by N1AK ( 864906 )

      Any good programmer should be able to program on any level from assembler to C and C++ to Python and shell-scripts and up.

      And every good butcher should be a great farmer, every good soldier an expert weapon maker, every good driver a world class mechanic ;)...

      I learnt assembler, I think it was valuable to do so and I'd still suggest it to others, but it's nonsense to suggest that you need to know it to be a good programmer.

      • Where are you drawing the line for good?

I can see that somebody could program without knowing anything at all about assembly language, but I find it difficult to believe that they would be any good at it. For many years CS curricula around the world contained the same sequence of courses: a "high" level language (be it C, C++ or Java depending on time and location), assembly language for a real architecture (SPARC, MIPS or x86), then a compiler course later in the degree that explicitly teaches the mapping fr

      • by Tom ( 822 )

        And every good butcher should be a great farmer, every good soldier an expert weapon maker, every good driver a world class mechanic ;)...

        Strawman argument.

        The OP doesn't argue that people whose profession is different from programming should be able to program. He argues that a good butcher should be able to kill with a mechanical tool, not just the fancy bolt gun the slaughterhouses have now. That a good driver should be able to drive stick-shift even if his car has automatic.

        And I agree. When I was in university, I tortured students with proper input handling in C until they got it, until they understood that unless they check their input

  • Libraries (Score:5, Informative)

    by heikkile ( 111814 ) on Tuesday December 09, 2014 @06:39AM (#48553831)

    If you write a good useful library in C, it can be used from almost any other language, with little effort. If you write your library in any other language, you limit its use to a handful of related languages. Also, properly written C can be very portable to a wide variety of systems.

  • by dargaud ( 518470 ) <slashdot2@@@gdargaud...net> on Tuesday December 09, 2014 @06:40AM (#48553837) Homepage
One of the main reasons is that entire operating systems are written with it. When there are operating systems written from scratch in Erlang or Java or Go or whatever, then and only then will we see C start to fade away. Until then it's here to stay. All the other reasons are secondary: ease of use (gcc test.c; ./a.out), widespread availability from tiny micro-controllers to behemoth supercomputers, low overhead, precise and full control of everything to the bit level, huge choice of well tested libraries, etc... That's why I regularly try and learn new languages but most of what I do is in C.

    As to why there are more threads related to C++ on the Internet, easy, it's because C++ is a lot more complicated and complex to grok. I need as much help as I can with some of its tortured constructs and seldom used idioms. C is more straightforward (even if there are plenty of tricky things in it, which the seasoned programmers will either know how to use or steer well clear of).

    • by AmiMoJo ( 196126 ) *

      I find C++ less and less relevant these days. I use C most of the time for embedded work or little desktop utilities, and C# when I want to do a more complex desktop app. C++ fits in somewhere between those two and most of the code written in it might as well be C, as it just uses C++ features to organize modules and do a bit of lazy memory allocation.

      There are areas where C++ shines, but I think these days it's becoming less and less relevant as there are better languages for embedded systems (C, assemble

  • by inflex ( 123318 ) on Tuesday December 09, 2014 @06:46AM (#48553853) Homepage Journal

Lots of old traps in there. I stopped working on this book about 5 years ago; it needs a lot more work, but it covers the basic "oops" events. Thankfully, with things like Valgrind and clang/gcc, a lot of the older dramatic mistakes can be picked out quickly.

    "C of Peril" - the book (pdf, free) at http://www.pldaniels.com/c-of-... [pldaniels.com]

  • I always tell people who want to learn to code to go for modern C++ (preferably C++11 or newer) first, and then if necessary learn some C afterwards.

    It is crazy how much more C code is needed to get the same level of performance and security that equivalent C++ has, and C coders know it. Just look at all the extensions that C compilers, and even the C11 Standard, borrow from C++ (generics, RAII) - but in a convoluted ugly way to preserve the precious ABI for 50 years.

    And for all those who will say that C++

• I hope that C will evolve to add a clean, simple, single-inheritance object model without all the over-complex crap from C++.
    An even bigger advance would be for C to get an optional standard library, comparable to Boost and the like, that commands consensus in the developer community.
    Finally, a JIT would make it competitive in the growing markets with a high diversity of silicon processor architectures.

    • by mwvdlee ( 775178 )

      I sure hope you were being sarcastic.
      It's hard to tell the difference sometimes.

  • The C language has a stable ABI, and it is the only mainstream language that can make such a claim.
  • by janoc ( 699997 ) on Tuesday December 09, 2014 @07:11AM (#48553935)

C is very much still relevant - most of the deeply embedded computer firmware is written in either assembler or C, where the bit twiddling capabilities, compactness of the language and efficient generated code are of high importance. All those ATMegas, PICs, 80x51, Z80, Renesas, small ARM Cortex cores - chips that are too small in terms of available memory to use higher level languages and OSes effectively. Essentially, if you are writing "to the metal", you are most likely going to use C, assembler and (rarely) C++. Those chips cost peanuts and are pretty much everywhere, controlling everything from your toaster to the brakes in your car ...

    Programming is not only about the desktop and web, you know.

Even on more "grown up" platforms you will find C in the network code; most system programming is done in C, and C, with its standardized ABI, serves as an interface language (e.g. you can load a C-interfaced DLL into Python or Java), among many other applications. I would say that knowing at least the basics of C is as much a must for any programmer as knowing the basics of English - unless all that you do is web apps in Javascript.

  • by Required Snark ( 1702878 ) on Tuesday December 09, 2014 @07:17AM (#48553957)
    C is important because it directly presents the actual machine memory model. If you want to have an understanding of how software works, you need to understand this. People who never learn how memory is really organized lack fundamental knowledge.

    It's as if engineers could skip calculus because there are automated systems that will do it for them. Even if they never work directly with calculus, the experience is critical to being a competent engineer.

    Yes, C has features/bugs that can be really ugly. But if, as a professional, you can make a system like C and its runtime libraries work, then you are much better equipped to do other complex tasks. The experience can result in careful habits that will help your entire career.

    • by Ihlosi ( 895663 )
      C is important because it directly presents the actual machine memory model.

      Well, not really. There are some architectures that were basically designed to be used with C (68k, ARM), but there are others (8051) where a C compiler needs to jump through some major hoops.

      And the C compiler still shields the programmers from things like stack frames or worrying about CPU register allocation.

  • The most important reason to learn and use C is that it's the LOWEST COMMON DENOMINATOR in many other languages' attempts at interoperability. These APIs are often written in C so that libraries can operate seamlessly between languages. Another good reason is to improve what you can do with Lua. With C and Lua you can tackle almost any problem, from drivers to databases. It might not be the most efficient way, but you will definitely get more bang for your time and money.

  • Relevant to what? (Score:2, Insightful)

    by aglider ( 2435074 )
    To mobile application market? Irrelevant.
    To online web services? Not so relevant.
    To online web server? Very relevant!
    To high efficiency applications? It's almost the standard.
    To operating systems? There's almost nothing other than C (in terms of market share).

    Please, elaborate more on your stupid question, you insensitive mobile clod!
    • by AmiMoJo ( 196126 ) *

      Android apps can use C for native code, and many do for performance reasons (e.g. games) or to access lower level OS functions (e.g. interfacing with USB devices).

  • A lot of HW related companies use mostly just C. Any sort of small device development or most appliances (switches, storage, etc.) have software written in C. Any driver development, etc. It is still the language anyone can pretty much agree on.

  • C is great for low-level stuff since it is capable of generating machine code that has zero dependencies. K&R even explicitly mentions "non-hosted mode" with no libc and implementation-defined entry-point semantics. In fact, it is the only language in mainstream use today that has this feature (aside from assembly).

    For kernels, drivers, firmware, embedded, and RTOS work, it's pretty much the only choice unless you want to code everything in straight assembly. Since my current job is firmware programming, I've

  • by msobkow ( 48369 ) on Tuesday December 09, 2014 @07:30AM (#48554011) Homepage Journal

    C started out with high-level "constructs" that mapped closely to the operators of DEC's PDP-11 processors (and, later, the VAX). While those constructs have mapped well to other processors, virtually every statement in C originally compiled to one instruction on those machines.

    To this day, C still gives you the power and flexibility of any low-level language worth its salt, and ample opportunity to hang yourself and your code. Don't forget -- C++ originally targeted C as its output, not machine code. C has similarly provided the "back end" for no small number of special-purpose compilers.

    Then there are the operating systems and device drivers that have been written in it, and all the embedded systems logic for all manner of devices.

    C will never die any more than assembly or COBOL and FORTRAN will. There will always be those special-purpose high-performance tasks where it is worthwhile to use C instead of a higher level language. Just as there are times where it still makes sense to drop even lower and into assembly.

    You go ahead and try to bootstrap an entire C++ implementation so you can write a kernel in it. Good luck with that. Getting a C library framework running on a bootstrapped kernel is hard enough. C++ would be orders of magnitude harder.

  • If you need to be efficient, you write in C, end of story. That's why.

    C++ and all the other languages out there simply have too much overhead and don't give you enough control over what's actually happening.

    Assembler isn't worth the major hassle for the tiny improvement you get over C.

    C, however, is at this unique point between the bare metal and the high-level abstractions where the trade-offs are perfect. You get just enough abstraction to a) be hardware-independent and b) use a high-level programming language.

  • by Ottibus ( 753944 ) on Tuesday December 09, 2014 @10:02AM (#48554787)

    "C++, Objective-C, Perl, Python, Java, PHP, C#, D and Go all have block syntax that's derived from C"

    And C got the block syntax from B, which got it from BCPL, which was a simplified version of CPL, which was influenced by the first block-structured language, ALGOL.

    I was taught ALGOL at university, though I had already been "mentally mutilated beyond hope of regeneration" by BASIC before that...

  • by Lawrence_Bird ( 67278 ) on Tuesday December 09, 2014 @10:36AM (#48555053) Homepage

    How many times more this year (only 22 days left!) are we going to see these types of "articles"? Give it a rest. C, COBOL and Fortran will be around long after all of you are dead.

  • by Delwin ( 599872 ) on Tuesday December 09, 2014 @10:56AM (#48555227)
    There's also OpenCL which is far closer to C than the rest of them, and that is a language that is still up and coming.
  • by jbolden ( 176878 ) on Tuesday December 09, 2014 @11:12AM (#48555377) Homepage

    First off, I'd consider C++ and Objective-C to both be variants of C. And you can make a fairly good case that Java is a variant of C as well.

    That being said, there is a good use case for C by itself. Many algorithms' execution times take forms like R*n^2 + S*n + T. For n large, the R*n^2 term is all that matters; for n small, T can often be the dominant factor. The C language, like assembly, excels at getting T down. However, for most business processing, execution time isn't really all that important, since n isn't large and T isn't particularly big. In those cases programmer efficiency matters a great deal.

He has not acquired a fortune; the fortune has acquired him. -- Bion

Working...