Fighting the Culture of 'Worse Is Better' 240
An anonymous reader writes: Developer Paul Chiusano thinks much of programming culture has been infected by a "worse is better" mindset, where trade-offs to preserve compatibility and interoperability cripple the functionality of vital languages and architectures. He says, "[W]e do not merely calculate in earnest to what extent tradeoffs are necessary or desirable, keeping in mind our goals and values -- there is a culture around making such compromises that actively discourages people from even considering more radical, principled approaches." Chiusano takes C++ as an example, explaining how Stroustrup's insistence that it retain full compatibility with C has led to decades of problems and hacks.
He says this isn't necessarily the wrong approach, but the culture of software development prevents us from having a reasoned discussion about it. "Developing software is a form of investment management. When a company or an individual develops a new feature, inserts a hack, hires too quickly without sufficient onboarding or training, or works on better infrastructure for software development (including new languages, tools, and the like), these are investments or the taking on of debt. ... The outcome of everyone solving their own narrow short-term problems and never really revisiting the solutions is the sea of accidental complexity we now operate in, and which we all recognize is a problem."
The whole point of C++ was its C compatibility (Score:5, Insightful)
Back in the day. The clue is in the name. If it wasn't compatible but simply similar then it would have been called something else. Java perhaps.
Easy to say when not dealing with customers (Score:5, Insightful)
It's easy for a programmer to say "We should stop worrying so much about compatibility and interoperability" when they don't have to deal with customers, support, or actually selling the end product. When a customer calls up and says, "Hey, how come this new version of Windows doesn't work with any of my old Windows software?" you can't just tell them "Because our programmers thought it was better to get a fresh start."
Re: (Score:3)
When a customer calls up and says, "Hey, how come this new version of Windows doesn't work with any of my old Windows software?" you can't just tell them "Because our programmers thought it was better to get a fresh start."
"Hey, how come this new version of Mac OS doesn't work with any of my old Mac OS 9 software?", said Mac users in response to Classic support being dropped with the release of Mac OS X 10.5.
"Hey, how come this new version of OS X doesn't work with any of my old PowerPC software?", said Mac users in response to Rosetta being dropped with the release of OS X 10.7.
Both of those are from just the last 7 years, and I wouldn't be surprised if we could rattle off more, both for OS X and iOS. The fact is, you can te
Re: (Score:2)
Re: (Score:2)
Yeah, but obviously that would break things; look at how little warning they gave developers. They only shipped APIs and standards that UAC played well with back in 2001, with Windows XP. Surely that's not enough time to modify their code.
Seriously, I rememb
Re: (Score:3)
Near as I can tell, you've pointed out an additional difference between Microsoft and Apple, rather than addressing or contradicting anything I was discussing.
Basically, while Apple does indeed slap developers for misusing APIs, as you stated, it also frequently deprecates features and APIs that are working as intended while their customers are still using software that's dependent on those features and APIs, which is what I was pointing out. Both Classic and Rosetta were working as intended and were being
I can answer these, as I was there. (Score:2)
I can answer these, as I was there.
"Hey, how come this new version of Mac OS doesn't work with any of my old Mac OS 9 software?", said Mac users in response to Classic support being dropped with the release of Mac OS X 10.5.
Because Apple was unwilling to port the Classic 68K emulator to Intel because of the difference in processor byte order, among other things, making such a port not worthwhile in terms of performance of the Classic software. The user experience would have been crap, and so the decision was made by upper management to not support Classic going forward on Intel.
For the PPC versions of Classic, they could have been supported under Rosetta, but it would have meant an approxima
Re: (Score:2)
For the most part we get these compatibility issues from someone trying to make something the previous system wasn't designed to do well, so they did some hacks to get it to work.
When we went from DOS to Windows, the idea of common drivers came into play. Before that, you needed to make assembly calls to support your devices.
Now the real troublemakers are the salesmen who push the product as something to solve all your needs, when it was made to solve particular problems.
Re: (Score:3)
Yes, and this is precisely WHY programmers don't get to make those decisions. Someone who actually understands the customers and the business has to come in and set boundaries. It's all well and good for the OP to call for radical change, but a real-world manager has to come in at some point and say "Look guys, if you want to be revolutionaries, go somewhere else. We are here to sell software to customers who don't give a shit about your revolution."
Re: (Score:3)
Yeah, but minimum requirements are usually on the application side of things and say "If you want to run this latest version of the software you'll need a relatively new computer to do it." That's a lot different than saying "This software will no longer read any legacy files from older versions, it will no longer work the same way it used to, it will no longer be compatible with the OS's it used to work with" etc.
Yeah, I'm sure programmers around the world would just love to throw all their legacy code out
Huh (Score:5, Insightful)
So.. preserving backwards compatibility and interoperability across versions is a bad thing? If he's unhappy with the feature set of C++ (and I wouldn't blame him for that), then how about simply picking up a different language instead? That's what a new, non-compatible C++ version would be in any case.
Look at how great it has worked out for Python. It's been six years since the only mildly incompatible version 3 was released, and it has still not managed to become dominant over the legacy version 2. A more radical break would almost certainly have had an even tougher road ahead.
Re: (Score:2)
And the stupid thing is that everything python 3 changed were things the language desperately needed.
Exactly one function parsed as a statement instead?
Exception handling syntax different from every other block format?
Defaulting strings to unicode?
I mean, you're right that Python 3 never truly caught on, but those were all things the language desperately needed.
Re: (Score:2)
Yes. Absolutely. If you can't treat print as a regular old function, it causes problems. You can't pass it as a __call__able argument to methods, you can't wrap and replace it for special situations, you can't use any number of language features for function calls(like *args).
The language has powerful tools that are more obscure to fiddle with functions, and not nearly as many for statements. Unifying print with the rest of the language was absolutely a necessary, if painful, step.
Re: (Score:2)
All great reasons to add a proper print function, but not necessarily to break compatibility with a program that uses the old print statement.
Re: (Score:2)
> A more radical break would almost certainly have had an even tougher road ahead.
This is why we are still waiting for Perl 6, if it ever gets released.
It's just too radical a break from what the Perl community expects. Even if you've never used Perl, you can tell from the drastic difference between Perl 5's dependable Camel [wikimedia.org] to Perl 6's Camelia [wikimedia.org].
Re: (Score:2)
> This is why we are still waiting for Perl 6, if it ever gets released.
I suspect in the case of Perl 6 (and perhaps also for Python) it may have been better to give the language a new name, and allow even more radical changes. Keeping the name strongly signals that it's still the same language. Breaking compatibility is exactly what makes it a different one.
Re: (Score:3)
Once you change a language enough it is no longer the same language.
I just do not see the issue. If you want C++ without C, create a new language that fits that description. Apple created Objective-C without C and called it Swift. So make C++ without C and call it Sure.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Python can't hold a candle to Perl in this regard. The Perl 6 design process started in 2000.
Re: (Score:2)
He's saying that compatibility is not (or rather, should not be) the sole consideration.
In that case, he's making a needless argument, because that has NEVER been the sole consideration with any software development project, and never will be.
hmm... (Score:2)
Re: (Score:3)
Seems like someone (or a consortium of someones) should take C++, drop the C compatibility requirement, make whatever "cleanup" changes that allows, and call it C+++. Just make sure there's a module ready to go for gcc.
Yeah and basically you also clean up the syntax since if you're breaking compatibility then you may as well make it parsable. What you get is D or Rust.
Both of them seem like fine languages.
However, I end up always reaching for C++, because then I can reuse my old code and keep using my libra
Re: (Score:3)
TLDR (Score:3)
"Technical Debt is a thing, people"
Re: (Score:3)
Re: (Score:2)
It's always been that way. Fewer than 1% of Americans are illiterate, but something like 97% are aliterate. When I was in school, very few kids wore glasses; no computers or cell phones and the TV was across the room. Reading (or any other close up work) at a young age makes you nearsighted.
Worse is Better doesn't mean that. (Score:2)
Don't create app functionality that the user won't use. More bells and whistles mean more dev time, test time and more to maintain, it's a simple concept.
That's why apps now have functionality metrics (Firefox seems really big on it for example).
You keep using that word.... (Score:2, Informative)
Re: (Score:2)
Worse is better is basically the KISS principle for software.
Huh? It seems the opposite of the KISS principle. Making my old code no longer work is hardly keeping it simple. Most modern, ugly web design is as confusing and complicated and bloated as they can make it.
I always followed the KISS principle, it's the easiest way to write bug-free code. Hard to write a buggy version of "hello, world".
This guy is an idiot.
I certainly don't disagree there.
Re: (Score:2)
I disagree with the premise... (Score:5, Insightful)
So he thinks that compatibility and interoperability are not features which he likes. OK, I'm OK with that.
However, that is his opinion, nothing more, nothing less.
There are reasons why interoperability and compatibility are desired. It is not the easiest path to provide those characteristics; on the contrary, it is easier to just say "screw compatibility, screw interoperability," and you'll probably finish your task more quickly.
So then the question becomes, why do people invest extra effort in order to assure interoperability and compatibility?
...which we all recognize is a problem....
And now he presumes to speak for everyone....
Overall it sounds like he just got out of a bad meeting in which someone told him that his opinions are not worth the air used to utter them, and now he's trying to convince the world that he is right and the world is wrong.
Radical changes are not (always) good. (Score:2)
Re: (Score:2)
Take a look at evolution. Sexual reproduction has so many hurdles to jump through before a beneficial mutation could find a toehold. In asexual reproduction individuals can rapidly and radically adjust to the changing environment and pass on the beneficial mutations to the next generations.
The late writer and biochemist Isaac Asimov would have disagreed with you [bestlibraryspot.net] vehemently. Asimov held a PhD in biochemistry and did cancer research at Boston University.
The above linked short sci-fi story was originally title
Simple != worse (Score:5, Insightful)
I don't know who to credit for this (probably read it on Slashdot), but a single perspective completely changed the way I view coding:
It takes substantially more effort to debug than it does to write code in the first place. If, therefore, I write code as clever as I possibly can - I can't effectively debug it (without investing far more time than I should) if something changes or goes wrong.
Now, that doesn't mean "worse is better"... I can still produce good code; I can even still write the occasional clever function when performance demands it. But for the 99.9% of code that has almost no impact whatsoever on performance, I can just say "if X then Y else Z" rather than using cool-but-cryptic bitmasking tricks to avoid executing a conditional instruction. And hey, whaddya know, I can actually read it at a glance six months later, rather than praying I didn't forget to update my comments.
On the flip side of this, a few weeks ago I helped a friend put together a spreadsheet with a few complex formulas in it. I love me some IFSUMS, arguably the best new feature of Excel in the past decade. Note that clause, "in the past decade". This weekend, she called me because her nice helpful spreadsheet wouldn't work - On Excel 2003. It seems that while 2003 has IFSUM, MS didn't add IFSUMS until 2007. The choice of one seemingly harmless backward-compatibility-breaking function made the whole thing useless in a given context. Now, in fairness, I can hear you all screaming "just upgrade already!"... But in the real world, well, we still have people using Windows 95.
Keeping it Simple (Score:2)
I'm with you on this. As a programmer, the thing I hate the most is "Gee, Mom, look what I can do!" code -- obtuse code written to impress rather than be simple, obvious and functional. And yes there are indeed times when something mind-bendingly complex is needed to achieve the required goal, but by and large, the KISS principle applies. As to the article's main point, I have to ask what is the purpose of breaking backward compatibility: Making it faster to produce readable, easily maintainable code, being
Re: (Score:2)
I agree, but I also think there needs to be a balance between KISS and other principles (e.g. DRY). I've come across developers who use KISS as an excuse to be lazy.
I recently came across some code from a colleague where hundreds of lines filled in object properties from data in a spreadsheet. Each property being filled in was coded as a separate line, calling one of five different routines based on the data type to be parsed.
I asked "why didn't you just add a configuration (or just an array)
Re: (Score:2)
Re: (Score:2)
I don't know who to credit for this
...
If, therefore, I write code as clever as I possibly can - I can't effectively debug it
Based on your quote, probably (originally) Don Knuth.
Re: (Score:2)
But for the 99.9% of code that has almost no impact whatsoever on performance, I can just say "if X then Y else Z" rather than using cool-but-cryptic bitmasking tricks to avoid executing a conditional instruction.
... and even in that other 0.1% of the time, it's likely that your compiler will optimize the pretty human-readable code into the cool-but-cryptic bitmasking trick at the assembly level anyway. There's no need for the human programmer to do that sort of obfuscatory wizardry at the source code level when the compiler can do it for him -- and likely do it more reliably as well, since compiler writers pay more attention to what is strictly language-legal vs. what-seems-to-sort-of-work-today.
Re: (Score:2)
That's almost universally untrue. The 99.9% is made up of the union of "code that executes infrequently enough" and "code that the compiler can auto-optimize."
Now, predicting what's in that 0.1% is tricky, which is why it is often better to optimize later after profiling reveals it. And may someone protect you from me
Re: (Score:2)
It sounds like what you're referring to may be a reference to an oft-quoted piece from Brian Kernighan (originally via "The Elements of Programming Style"):
“Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?”
Re: (Score:2)
Heh, SUMIFS. Not IFSUMS. Duh, thanks. And no, I didn't charge her anything - I did say "friend", not "client". Just doing her a favor, took a whopping five minutes of my time.
Although my solution and insight was worth much more than yours.
You can approach any given problem in two different ways:
You can work with the conditions of the problem as given and find a solution under those conditions, or,
you can whine ab
Bad example, interesting points. (Score:2)
I think the author's focus on C++ as an example of "worse is better" is a sad distraction. Clearly C++ was designed with the goal of being compatible with C. There are plenty of examples of languages which took on object-oriented programming but threw away backwards compatibility as a design goal: D, Java, and C# come to mind.
That said, I think he does have an interesting point about our unwillingness to sit down and carefully consider our response to problems as they arise during dev
Re: (Score:2)
Clojure is designed to be compatible - not backwards compatible, but intercalling compatible - with Java. The consequence is that a Clojure program can crash out of stack when it still has masses of heap. Why? Well, the JVM was designed for small embedded devices which would run small programs, which weren't expected to do a lot of recursion; and were low power with limited memory, so allocating the stack as a vector was seen as an efficiency win. The fact that most of the time we don't run Java on small embed
About my book.... (Score:2)
So Paul, is this a plug for your book or yet another argument for functional programming?
Not much useful information or examples as to why or where "Worse is better" is harming the world of computing. There is always a better tool to help solve a problem. But for many reasons they may or may not be appropriate. You the programmer should know when and where to use them.
One of my favorite quotes (the bold text at the end is the good part):
Fresh start / the opposite is happening. (Score:2)
I think we are seeing fresh starts. We've seen a shift away from desktop applications towards the web as a common framework, which has become remarkably more useful. And now we are seeing moves towards mobility. Both web and mobility have forced a genuine separation between view systems and business logic that in the desktop world was a goal often not met. Then with the rise of devops, architectures are becoming ever more componentized, where the parts are interchangeable and thus design mistakes can be corr
What hacks? (Score:2)
Does anyone have any idea of the hacks he's talking about?
Since C++ is intended to and has always been a superset of C, how could there be any problems and hacks caused by compatibility with C? How could it be any better by discarding part of the language itself?
It's Not Even That (Score:3)
A problem I've been seeing lately is that everyone seems to think software is carved in stone. In the past 3 or 4 years I've heard a LOT of excuses why some flaw in one system or another made a feature impossible. In these cases, fixing the flaw would be pretty trivial. Instead of doing THAT, people just build another layer of crap on top of the previous layer of crap and try to kind-of get something working. Code is not immutable. If it doesn't do something you need it to do, MAKE it do what you need it to do. Write a library, redesign a layer, simplify an interface, whatever. Don't just wring your hands and make the problem worse! Code is made to change. No design is ever perfect right from the start. If you try to make your design perfect from the start, you'll just end up paralyzed, afraid of doing anything because you might do it wrong. Start with a design that seems reasonable and adjust it as needed. Write small, decoupled libraries that can support that, and write unit tests to ensure that each component behaves as expected. It's really not that hard, people!
Re: (Score:2)
Re: (Score:2)
Code is not immutable. If it doesn't do something you need it to do, MAKE it do what you need it to do. Write a library, redesign a layer, simplify an interface, whatever.
I completely agree in principle, but in practice, the more software that is using the current version of the code, the more things will break when you change the design. That has the effect of making the code less malleable, proportional to the number of its dependents.
So for a function that is used only by your own program, it's no problem at all. For an in-house library that is used in several programs across your company, it's a bit of a hassle but doable. For a new computer language that is being use
Re: (Score:2)
Python (Score:4, Insightful)
How did breaking compatibility work out for Python 3?
Re: (Score:2)
Features matter. (Score:3)
If being perfect means not having critical core features then you're confused about what is and is not perfect. Compatibility is important. In many applications it is vital. Period - end of story. Does maintaining compatibility make the project more complicated? Yep. Coding is hard.
Next issue.
BS (Score:2)
Interoperability is king.
I don't care how awesome your new application is, if I can't get my data into it and out of it, it's worthless to me. I've been down this road. If I have to start a massive project just to start using your application, or plan for a massive one to stop using it, that's a cost to me... a big one. And if my people have to go to training just to be able to use it because you didn't want to bother meeting a standard... or I have to hire people straight out of college? Again, that's a huge
Too bad (Score:2)
If you don't like the mess in C++, find a better language and use that. C++ is actually the most popular language for performance-critical code? Hm, I wonder how that happened? Because of or despite its C compatibility?
The world is full of bad technology that is popular because its version 1.0 was really popular at the time. The fact we still use all of it says something about the market for technology; apparently, backwards compatibility to a fault makes for more long-term popular systems than do-it-right-
Wrong lens (Score:3)
Worse is better is better (Score:2)
Without worse-is-better, you make sure the job is completely done before release. So it takes years to make progress, and you end up building an extremely complex system to cover every possibility. Because of that very complexity, it is difficult to extend for new requirements, which become apparent after the system is specified and before it is built.
If you want to build a nuclear power plant control system or something equally as critical and unchanging, sure, go ahead and engineer everything out the wazoo
Backwards Compatibility - Backward Languages (Score:5, Insightful)
So far, I don't think I've seen a single comment here that got the point of the essay.
He's not talking about incremental "improvements" to existing languages, he's pointing out that the common attitude of "we'll make this language easy to learn by making it look like C" is a poor way to achieve any substantial progress.
This is true but everyone who's invested a substantial amount of time learning the dominant, clumsy, descended-from-microcode paradigm is reluctant to dip a toe into anything requiring them to become a true novice again.
I've long been a big fan of what are now called "functional" languages like APL and J - wait, hold on - I know that started alarm bells ringing and red lights flashing for some of you - and find it painful to have to program in the crap languages that still dominate the programming eco-system. Oh look, another loop - let me guess, I'll have to set up the same boilerplate that I've done for every other loop because this language does not have a grammar to let me apply a function across an array. You want me to continue doing math by counting on my fingers when I've got an actual notation that handles arrays at a high level, but I can't use it because it's "too weird". (end rant)
There have been any number of studies - widely ignored in the CS world - going back decades (see this http://cacm.acm.org/magazines/... [acm.org]) - pointing out how poorly dominant programming memes mesh with the way most people think about problems and processes. Meanwhile, the 1960s called - they want their programming languages and debugging "techniques" back - "printf", anyone?
Re:Backwards Compatibility - Backward Languages (Score:4, Insightful)
Meanwhile, the 1960s called - they want their programming languages and debugging "techniques" back - "printf", anyone?
What's wrong with printf?
printf has some very nice features. Firstly, you get the history of what happened at previous iterations of your algorithm, right there, which is something you don't get with a stepping debugger. Secondly, you can use a second language, such as awk, to process it very easily to find things out.
I find debugging numeric code without printf to be a major pain. Sometimes it's worth plotting things a bit more dynamically, but you can do that with awk+gnuplot on the stream of data coming out with printf.
I agree!! (Score:2)
All of my competitors should adopt the author's philosophy of software development immediately. His ivory tower FP idealism is worthy of emulation by all.
I will keep muddling through based on years of experience, leveraging existing code and know-how, maintaining backwards compatibility, planning long-term changes that sometimes take years to complete, deprecating unneeded features in as non-disruptive a manner as possible. And then, when the opportunity arises to do something radically different (like wi
It's a rant against (Score:2)
silliness (Score:5, Interesting)
To accuse the C++ community of not having engaged in "reasoned discussion" about backwards compatibility is silly. Chiusano may not like the tradeoffs that C++ makes (I don't), but they are the result of a glacially slow and tedious community process and discussions. Whatever C++ is, it is by choice and reflection. Furthermore, "worse is better" refers to keeping things simple by cutting corners, and you really can't accuse C++ of keeping things simple.
(Charges about too much backwards compatibility are ironic from someone who promotes Scala, a language that makes many compromises just in order to run on top of the JVM and remain backwards compatible with Java.)
Sounds like ... (Score:3)
Too much of a good thing (Score:2)
Backwards compatibility is a good thing, but too much of it can be bad. There are glaring errors in languages which have not been fixed in the name of some faux backwards compatibility argument.
I'll start with one that was fixed.... thirty years after the fact, all the while arguing that it couldn't be fixed, in the name of backward compatibility. The example? ANSI C gets the right answer for sqrt(2) while the original C didn't.
An example of things that C got wrong and have yet to be fixed? the bug inducing
The Free Market? (Score:2)
The outcome of everyone solving their own narrow short-term problems and never really revisiting the solutions is the sea of accidental complexity we now operate in, and which we all recognize is a problem.
It strikes me that this describes our economic system as well.
Re: (Score:2)
There's a difference between linking C libraries (many non-C++ languages can do this) and Stroustrup's insistence that a C++ compiler must be able to compile C code. This requirement is one of the reasons C++ is such a mess.
Re: (Score:2)
Pulling two examples out of my ass:
int *foo = malloc(sizeof *foo);
and
int bar(void);
C++ and C are two different, incompatible languages.
Re: (Score:3)
Try compiling this as both C and C++. It's completely valid in both languages, but the output will be different in each.
Re: (Score:3)
Re: (Score:2)
Either way, we might as well do it the other way around and consider int foo(); in C and in C++. In one of those languages, foo doesn't take any arguments.
Re:New langauge (Score:5, Insightful)
The two languages are not incompatible
That's vague. The real question is whether C++ is a strict superset of C. Answer: it is not.
Some constructs valid in C are invalid in C++. Some valid C code is also valid C++, but behaves differently.
See Section 1 and Section 2 of this Wikipedia article. [wikipedia.org]
Re: (Score:2)
I don't see what a compiler's ability to do with other formats/languages has to do with a different format/language.
GCC can compile fortran, and that has nothing at all to do with C++.
"GNU C compiler" versus "GNU Compiler Collection" (Score:5, Informative)
Ordinarily, it doesn't. But the thing is, there are two things called GCC: the GNU C compiler (which handles C) and the GNU Compiler Collection (a set of compilers which, though they share the same backend, are still separate entities).
GCC, the C compiler, cannot handle Fortran. GCC, the set of compilers, can handle it via g77 (the old compiler) or gfortran (the new one), but the C compiler can't. This is considered the traditional way of doing things.
What makes C++ different from many languages is that its maintainers insist that C++ compilers must be able to handle C code. It's not enough to have a different compiler in the set, the way GCC does: it must be doable with the C++ compiler itself, in the same application. And so g++ can do it too, because that's what the standard requires of it.
That's what makes the difference. Ordinarily, as you say, a compiler's ability to handle multiple languages shouldn't affect any of the languages in it. But C++ was defined in a way that not only makes those effects possible: it makes them mandatory.
Re: (Score:2)
Re: (Score:2)
If it didn't meet the requirement to be fully compatible with C, it wouldn't be C++.
Like the man says, you're welcome to fork a new language that's C++ without C compatibility. Frankly I don't see the point. If you don't need C compatibility there are far better OO languages than C++.
Re: (Score:2)
Re:New langauge (Score:5, Informative)
Obligatory XKCD [xkcd.com]
Re:Since when... (Score:5, Insightful)
Since when has slashdot ever been news? Its masthead may be "news for nerds" but its news is seldom very new. It's about seeing one's fellow nerds' views on that topic.
Re: (Score:2)
I think this article might be a direct response to the unveiling of JSONx
http://www.reddit.com/r/progra... [reddit.com]
Re: (Score:2)
Re: (Score:2, Funny)
Re: (Score:3)
Yes, but you are even worse. (Score:5, Informative)
And that compatibility still is important today. For one thing, APIs can be written in C (starting with the POSIX API) and be used by programs written in either C or C++ or in language X, or language Y, or language Z.
FTFY. This is due to calling conventions, not to "language compatibility".
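What that calling-convention compatibility looks like in practice can be sketched like this (the function name is hypothetical, chosen for illustration): extern "C" gives a function unmangled C linkage, so C, C++, and any language with a C FFI all reach it through the same symbol.

```cpp
#include <cmath>

// extern "C" suppresses C++ name mangling, exposing a plain C symbol
// that C code, C++ code, or a foreign-function interface can call.
extern "C" double c_style_hypot(double x, double y) {
    return std::sqrt(x * x + y * y);
}
```

This is exactly the mechanism a POSIX-style C API relies on to stay callable from everywhere.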
Here's an idea: if you don't know shit about C++, don't post shit about C++?
Here's an idea: If you don't know shit about the basics of programming, don't post, like, at all. Especially avoid calling others idiots when you're at the same time making clear you're even less competent.
Yes, but you are even worse. (Score:3, Informative)
Wrong.
C++ has zero problem dealing with structs as function parameters (usually pointers to structs) which have arbitrarily complex internal structure, including embedded pointers. Since C does not provide OO encapsulation, it is often useful for calling code to manipulate pointers directly.
In Perl, you've got to wait until someone provides a binding for these complex cases... or, yeah, "why don't you do it yourself? It's open source."
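A minimal sketch of the struct-passing described above (all names hypothetical): a C-style struct with an embedded pointer is consumed by C++ directly, no binding layer required, and the caller can manipulate the pointer itself.

```cpp
#include <cstddef>

// A C-style struct with an embedded pointer, as a POSIX-flavoured
// API might declare it. C++ uses it as-is.
struct buffer {
    int         *data;
    std::size_t  len;
};

int sum(const buffer *b) {
    int total = 0;
    for (std::size_t i = 0; i < b->len; ++i)
        total += b->data[i];   // direct pointer access, just as in C
    return total;
}
```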
Re:Yes, but you are even worse. (Score:5, Insightful)
Backward compatibility isn't just important, it's paramount. It's not 1957.
Nobody wants to upgrade their build system to a newer version of the language and find out it breaks the code all over the place. Nobody has time for that. New versions of the same language need to ONLY
(1) fix errors
(2) add new features that you invoke with new code that would be rejected by the old version's parser.
If you change how old code will behave, that's not a new version. It's a new language.
Likewise other systems, such as operating systems and even user-exposed interfaces, because ultimately programs depend on them working a certain way.
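Rule (2) above can be illustrated with a sketch (my example, not the parent's): range-for is new C++11 syntax that a C++03 parser rejects outright rather than quietly compiling to mean something else, so code written for the old version is untouched.

```cpp
#include <vector>

// New-version feature invoked with new syntax: under C++03 the loop
// below is a parse error, never a silent behavior change.
int total(const std::vector<int> &xs) {
    int t = 0;
    for (int x : xs)   // C++03: rejected; C++11: well-defined
        t += x;
    return t;
}
```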
Re:Yes, but you are even worse. (Score:5, Insightful)
Re: (Score:3)
Dude, by your argument, "compatibility" counts as "place your arguments in registers A, B, C..."
The compatibility between C and C++ goes a bit deeper than that.
Worse is worse (Score:4, Insightful)
I would assert precisely the opposite. "Trade-offs to preserve compatibility and interoperability" do not cripple functionality for users -- failures to engineer compatibility and interoperability are what cripple functionality.
The number of times that there's been a new feature and I've said "oh, excellent, it's true that my old files no longer work, but this is so wonderful I don't care" has been very close to zero. The number of times there's been a new feature and I've said "those assholes, I have twenty thousand files that don't work any more, what in the world were those idiots thinking?" is decidedly not zero.
It's bigger than that... (Score:3, Insightful)
It's a combination of "standards are never finished being implemented" and features never being explicitly declared.
Both C and C++ have been hobbled for years not by their backwards compatibility, but by features that are intentionally ambiguous or undefined because they are "implementation-specific details", without regard to the effect those missing guarantees have on cross-platform compatibility and on the auditability of the resulting code. If there is no guarantee as to
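The kind of under-specification being described can be sketched briefly (my illustration): the standards fix only minimums, and the rest is left implementation-defined, so portable code cannot assume any one answer.

```cpp
#include <climits>

// None of these are fixed by the C or C++ standards; only the
// minimums are guaranteed, the rest is implementation-defined.
const bool plain_char_is_signed = (CHAR_MIN < 0);  // varies by compiler/flags
const int  bits_per_byte        = CHAR_BIT;        // >= 8, usually exactly 8
const int  bytes_per_int        = sizeof(int);     // 2 on some targets, 4 on most
```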
Re: (Score:3)
Look at K&R, ANSI, C90, C14, etc. Many of these standards were never fully implemented in any particular compiler. K&R was supported in early gcc releases but deprecated and then dumped around the gcc-2.95-to-early-3.x releases. Many of the later releases break features of the earlier ones piecemeal, despite the original standard never having had a "stable" release that could properly generate code for all applications. (While rare, there are still corner cases in which even perfectly "valid" C programs, whether through developer error or a mistaken implementation of standard features, triggered code generation bugs. Some of these weren't fixed before the standard was deprecated, or were altered for compatibility with C++ in a manner that broke a formerly "conforming" application.)
There is no such thing as C14. C90 isn't really a thing either (it's C89, with a few errata fixed). C89 was the first ISO standard C. K&R wasn't a standard, it was just the documentation of a specific implementation. To claim that it wasn't implemented is nonsense - it was implemented, it was never standardised.
After C89, both versions of the C standard (C99 and C11) have been backwards compatible. They are not always backwards compatible with vendor extensions. C99, for example, added an inline
Database upgrades (Score:2)
We had FoxPro 6 on Windows 98; when XP hit our desks, FoxPro no longer worked. They made me use (ugh!) MS Access.
So I had a few dozen Access apps when they "upgraded" to Office '03, and not a single one would run. Access had become a completely different program with completely different code and was completely incompatible with Access '98. I had to rewrite every God damned program!
OTOH the NOMAD mainframe databases seldom had glitches. I'd been a PC kind of guy, but NOMAD on the mainframe and Microsoft's
Re: (Score:3)
On the other hand, I would say that's what you get for using something like Access or FoxPro as a tool for developing ongoing projects. They are great for things like
Re: (Score:2)
And this is why there are so many programming languages with massive overlap in usage. Because once you start down a path, you can never, ever change. If you want something 42 years down the road that breaks with convention, go create a new language.
Re: (Score:2)
I had an interesting talk with an accountant friend of my wife's. Being on the accounting side, she was completely puzzled by IT projects: time isn't booked properly to projects, budgets overrun, schedules slip, and no one can give proper estimates because no one knows the backend...
We had a really good discussion actually, and one thing that actually came to my mind is there is a real accounting deficit on the engineer/IT side.
Here's the example that we both really understood.
She wanted some field added to some web
Re: (Score:2)
Here's the example that we both really understood.
She wanted some field added to some web application. In her head, it's a simple field... you know, like adding a new column in Excel.
"All You Have To Do Is..."
The most deadly words in Information Technology.
A lot of the missed deadlines and cost overruns are because of the AYHTDI effect. Because people won't believe that the job isn't as simple and straightforward as it appears in their pointy little heads. They forget - and worse yet - the DEVELOPERS forget that computers are STUPID. And that you have to allow extra time and effort to take what's a simple job for humans and make it simple enough for computers.
And to compound the issue,
Re: (Score:3)
Part of the problem is that people wrongly equate creating software with building something when it is actually more like designing something.
Yes, once a building is designed, it is possible (and expected) that a very close estimate can be provided for the cost to build the thing. For some reason, people expect to be able to get an exact time and cost for designing a combination can opener/jumbo jet/submarine (oh, and can it have interchangeable designer steering wheels and make a dolphin noise when it surfaces?)
Re: (Score:2)