Toward Better Programming

Soulskill posted about 8 months ago | from the forest-for-the-binary-trees dept.

Programming 391

An anonymous reader writes "Chris Granger, creator of the flexible, open source LightTable IDE, has written a thoughtful article about the nature of programming. For years, he's been trying to answer the question: What's wrong with programming? After working on his own IDE and discussing it with hundreds of other developers, here are his thoughts: 'If you look at much of the advances that have made it to the mainstream over the past 50 years, it turns out they largely increased our efficiency without really changing the act of programming. I think the reason why is something I hinted at in the very beginning of this post: it's all been reactionary and as a result we tend to only apply tactical fixes. As a matter of fact, almost every step we've taken fits cleanly into one of these buckets. We've made things better but we keep reaching local maxima because we assume that these things can somehow be addressed independently. ... The other day, I came to the conclusion that the act of writing software is actually antagonistic all on its own. Arcane languages, cryptic errors, mostly missing (or at best, scattered) documentation — it's like someone is deliberately trying to screw with you, sitting in some Truman Show-like control room pointing and laughing behind the scenes. At some level, it's masochistic, but we do it because it gives us an incredible opportunity to shape our world.'"

Separation of Concerns (4, Insightful)

null etc. (524767) | about 8 months ago | (#46606747)

In my 25 years of professional programming experience, I've noticed that most programming problems are caused by improper separation of unrelated concerns and improper coupling of related concerns. Orthogonality is difficult to achieve in many programming exercises, especially regarding cross-cutting concerns, and sometimes the "right" way to code something is tedious and unusable, involving passing state down through several layers of method parameters.

Re:Separation of Concerns (5, Insightful)

Anonymous Coward | about 8 months ago | (#46606925)

No matter how flexible an architecture you try to design, after the software is mostly built, customers will come up with even more incredibly bizarre, complex and unrelated functionality that just HAS to be integrated at the oddest points, with semi-related information thrown here and there, requiring data gathering (or - god forbid - (partial) saving) that grinds everything to a halt. And they rarely give much time for redesign or refactoring. What was once a nice design with clean, readable code is now full of gotchas, barely commented kludges, and extra optional parameters that might swim around multiple layers, often depending on who called what, when, and from where, and also on various settings, which obviously are NEVER included in bug reports. Of course, there are multiple installations running multiple versions...

Re:Separation of Concerns (4, Interesting)

lgw (121541) | about 8 months ago | (#46606989)

and sometimes the "right" way to code something is tedious and unusable, involving passing state down through several layers of method parameters

Sometimes that really is the right way (more often it's a sign you've mixed your plumbing with your business logic inappropriately, but that's another topic). One old-school technique that has inappropriately fallen out of favor is the "comreg" (communication region). In modern terms: take all the parameters that all the layers need (which are mostly redundant), toss them together in a struct, and pass the struct "from hand to hand", fixing up the right bit in each layer.

It seems like a layer violation, but only because "layers" are sometimes just the wrong metaphor. Sometimes an assembly line is a better metaphor. You have a struct with a jumble of fields that contain the input at the start and the result at the end and a mess in the middle. You can always stick a bunch of interfaces in front of the struct if it makes you feel better, one for each "layer".

One place this pattern shines is when you're passing a batch of N work items through the layers in a list/container. This allows for the best error handling and tracking, while preserving whatever performance advantage working in batches gave you - each layer just annotates the comreg struct with error info for any errors, and remaining layers just ignore that item and move to the next in the batch. Errors can then be reported back to the caller in a coherent way, and all the non-error work still gets done.
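
For concreteness, here's a minimal sketch of that comreg-plus-batch idea in Python (the names and stages are made up for illustration, not from any particular codebase): one record per work item is passed hand to hand through the stages, a stage that hits an error just annotates the record, and later stages skip annotated items while the rest of the batch keeps flowing.

```python
# Minimal "comreg" batch sketch: each stage fills in its own fields or
# annotates an error; nothing is thrown away, so the caller gets a coherent
# per-item report at the end.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class WorkItem:
    raw: str                       # input, filled in by the caller
    parsed: Optional[int] = None   # filled in by the parse stage
    doubled: Optional[int] = None  # filled in by the compute stage
    error: Optional[str] = None    # set by whichever stage failed

def parse_stage(items: List[WorkItem]) -> None:
    for item in items:
        if item.error:             # failed upstream, rides along untouched
            continue
        try:
            item.parsed = int(item.raw)
        except ValueError:
            item.error = f"parse: not an integer: {item.raw!r}"

def compute_stage(items: List[WorkItem]) -> None:
    for item in items:
        if item.error:
            continue
        item.doubled = item.parsed * 2

batch = [WorkItem("3"), WorkItem("oops"), WorkItem("7")]
parse_stage(batch)
compute_stage(batch)
for item in batch:
    print(item.error or item.doubled)   # 6, then the parse error, then 14
```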

Re:Separation of Concerns (2)

K. S. Kyosuke (729550) | about 8 months ago | (#46607187)

In modern terms: take all the parameters that all the layers need (which are mostly redundant), toss them together in a struct, and pass the struct "from hand to hand", fixing up the right bit in each layer.

This is nicely solved by the notion of dynamic environments. The benefit is that there is no extra struct type and no extra explicit parameter, and different parts of the dynamic environment compose simply by being used; they don't need to be bunched together explicitly, which seems like a code smell anyway. You also don't have to wonder, when different pieces of code need different sets of parameters, whether you should explicitly maintain multiple struct types and copy values between them, or make one huge struct type that would be little different from a global environment (differing only in that you can have more than one of them).

Re:Separation of Concerns (1)

lgw (121541) | about 8 months ago | (#46607229)

I haven't heard of "dynamic environments" as a coding pattern, and it's a hard phrase to Google, as it combines two popular buzzwords. Care to elaborate?

Re:Separation of Concerns (1)

K. S. Kyosuke (729550) | about 8 months ago | (#46607281)

It's not a "coding pattern", it's a language feature. It's a controlled way of passing "out-of-band" information to pieces of code without using function parameters - information that feels like "context" rather than "topic", and may be only "orthogonally" related to the topic. In Lisp, for example, a typical case is the base (binary/octal/decimal/hexadecimal) for a number printer, or the output stream that the output is supposed to be written to. You may not want to pass these as explicit parameters if the piece of code you're writing doesn't care about such minutiae but rather expects some higher-level function to set some parameters and fire it off. It may not be practical to expect *all* code paths to pass these particular parameters.
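
To make that concrete, here is a rough analogue in Python using the standard-library contextvars module (this illustrates the idea of dynamic binding, not Lisp's actual special variables; the variable names are made up):

```python
# A dynamically bound "print base": code several calls down reads the
# current binding instead of taking it as a parameter.
import contextvars

print_base = contextvars.ContextVar("print_base", default=10)

def render(n: int) -> str:
    base = print_base.get()   # whatever the caller bound, or the default
    return format(n, {2: "b", 8: "o", 10: "d", 16: "x"}[base])

def report(values) -> str:
    # This layer never mentions the base at all.
    return ", ".join(render(v) for v in values)

print(report([10, 255]))      # "10, 255" under the default base
token = print_base.set(16)    # rebind for a stretch of code...
try:
    print(report([10, 255]))  # "a, ff"
finally:
    print_base.reset(token)   # ...then restore the outer binding
```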

Re:Separation of Concerns (1)

Connie_Lingus (317691) | about 8 months ago | (#46607559)

it just sounds like deeper and deeper abstraction, or more simply global variables... which, I'm sorry to say, in my experience doesn't always solve the "programming problems" except perhaps for the guy who learned enough programming to be able to think at that level.

unfortunately, that type of code tends to be very un-maintainable by anyone other than the original author.

Re:Separation of Concerns (0)

Anonymous Coward | about 8 months ago | (#46607447)

https://en.wikipedia.org/wiki/Common_Lisp#Dynamic

Command design pattern (1)

tepples (727027) | about 8 months ago | (#46607533)

One old-school technique that has inappropriately fallen out of favor is the "comreg" (communication region). In modern terms: take all the parameters that all the layers need (which are mostly redundant), toss them together in a struct, and pass the struct "from hand to hand", fixing up the right bit in each layer.

I believe that's called the "command" design pattern [wikipedia.org] , which encapsulates a request as an object.
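
For readers who haven't run into it, here's a bare-bones sketch of the Command pattern the parent is pointing at (illustrative names, not from any framework): the request is wrapped in an object so it can be queued, logged, or replayed independently of who executes it.

```python
# Bare-bones Command pattern: the Invoker keeps each request around as data
# after running it, which is what "encapsulates a request" buys you.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Command:
    execute: Callable[[], None]
    description: str

class Invoker:
    def __init__(self) -> None:
        self.history: List[Command] = []

    def run(self, command: Command) -> None:
        command.execute()
        self.history.append(command)  # retained for logging, undo, replay...

invoker = Invoker()
invoker.run(Command(lambda: print("saving document"), "save"))
invoker.run(Command(lambda: print("sending notification"), "notify"))
print([c.description for c in invoker.history])  # ['save', 'notify']
```

Arguably the grandparent's comreg is closer to a plain parameter object than to Command proper, since it wraps the shared data rather than the operation, but the two are often used together.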

Re:Separation of Concerns (4, Insightful)

Joce640k (829181) | about 8 months ago | (#46607023)

The reason it doesn't change is because the "coding" is the easy part of programming.

No programming language or IDE is ever going to free you from having to express your ideas clearly and break them down into little sequences of instructions. In a big project this overshadows everything else.

Bad foundations? Bad design? The project is doomed no matter how trendy or modern your language/IDE is.

Re:Separation of Concerns (0)

Anonymous Coward | about 8 months ago | (#46607397)

C vs C++ vs Java vs C#

Re:Separation of Concerns (1)

Kjella (173770) | about 8 months ago | (#46607637)

No programming language or IDE is ever going to free you from having to express your ideas clearly and break them down into little sequences of instructions. In a big project this overshadows everything else. Bad foundations? Bad design? The project is doomed no matter how trendy or modern your language/IDE is.

Well, you find one way to break it down... but I often feel there are so many possible ways to organize things; it's just how I'd want to solve it, and when I have to interact with other people's code, they've done the same thing completely differently. Just take a simple thing like pulling data from a database, putting it on a form, having someone edit the information, and saving it back to the database. How many various implementations of that have I seen? Maaaaaaaaany, but there's no clear winner. You can do it with an SQL query, a stored procedure, or an ORM tool, and they all might work. You can use the MVC pattern or you could totally ignore it; the user will never know. Layers are a bit like mathematical models: they're simplifications of reality. Sometimes the world refuses to be simple.

Re:Separation of Concerns (1)

K. S. Kyosuke (729550) | about 8 months ago | (#46607127)

"Orthogonality is difficult to achieve in many programming exercises, especially regarding cross-cutting concerns"

I've noticed one recurring topic in programming: most, if not all cross-cutting concerns will eventually be relegated to programming language features. (It all started with mundane expression compilation and register allocation, which are also cross-cutting concerns of sorts, but it didn't stop there. To quote your example: are you passing temporary state through parameters? Declare a Lisp-style dynamic variable to safely pass contexts to subroutines. If your language doesn't have it, it might get it tomorrow.)

Re:Separation of Concerns (0)

Anonymous Coward | about 8 months ago | (#46607157)

I don't disagree, but a lot of the problem with this is knowledgeable programmers shouting "Leaky Abstractions!" [joelonsoftware.com] . You can't guarantee that your concerns will be 100% separated, so the argument goes, so why bother to try in the first place? To actually program worth a darn, you need to know the ins and outs of everything you lay your hands on -- and since you know the ins and outs, why wouldn't you make everything super efficient by tightly coupling everything? I mean, you can shave off a function call/database query/twenty characters of typing by going around the "official" way and just calling the internals directly. That separation is just needless boilerplate, a hoop to jump through without tangible benefit. -- Is it going to bite us in the ass later on? Who cares? It's working now, durn it!

programming is not for everyone (2, Insightful)

Anonymous Coward | about 8 months ago | (#46606753)

Or.....

Maybe software development is just hard and you need to be a rocket scientist to see it.

Re:programming is not for everyone (1)

Anonymous Coward | about 8 months ago | (#46607361)

Or.....

Maybe software development is just hard and you need to be a rocket scientist to see it.

No. Good software development is hard. The other is easy!

Re:programming is not for everyone (0)

Anonymous Coward | about 8 months ago | (#46607501)

no, it really is half-assed and undocumented. MICE, for example: if it's not for DOS, it's not documented. Firsthand experience writing a driver for my OS taught me this.

Proverb (1, Insightful)

Anonymous Coward | about 8 months ago | (#46606769)

Something something blames his tools.

Re:Proverb (4, Insightful)

lgw (121541) | about 8 months ago | (#46607011)

Something something blames his tools.

The point of that proverb is that a good craftsman chose his tools to begin with, so he has only himself to blame. Programming is odd in that you have bad toolchains forced on you by management - tools you know are bad, know will cause more problems than they're worth, but they're a corporate standard or some such BS. Usually not bad enough to be worth quitting over, so you hobble along.

Of course, I did quit a job once primarily because we had Rational Rose forced on us from above (but mostly because a management that would do that would do anything).

Re:Proverb (1, Interesting)

Anonymous Coward | about 8 months ago | (#46607113)

The point of that proverb is that a good craftsman chose his tools to begin with, so he has only himself to blame.

There's more to it than that. A good craftsman can get by with suboptimal tools.
I'm not saying you can write a web browser in Malbolge, but there's a lot of software out there right now, and some of it is good.

Re:Proverb (5, Insightful)

lgw (121541) | about 8 months ago | (#46607131)

A good craftsman can get by with suboptimal tools.

A good craftsman is not content to "get by", almost by definition. If some part of your workflow sucks, you make it better, whether that's a better tool or more skill/training. If you're good, you never stop improving (until management forces BS on you, of course).

Re:Proverb (2)

arth1 (260657) | about 8 months ago | (#46607123)

A good craftsman still doesn't blame the tools - he performs with what's at hand. Not having CNC routers didn't stop the woodworkers of the past from creating better products than today's staple-gun-wielding "craftsmen" do.

Do wonders with what you have, and strive to get better tools.

Re:Proverb (1)

JaredOfEuropa (526365) | about 8 months ago | (#46607161)

Speaking of craftsmen... One of the problems with programming is that in many (but not all) shops, coding actually is a craft, rather than a profession. There's little on the job training or coaching, few common frameworks and methodologies that work well, and a lot of the job descriptions within IT seem to have been made up mostly to make life easy for project managers and HR, not to relate to the actual and complete skill sets of individuals.

Or perhaps the problem is that the nature of programming is more like a craft, while we are trying to treat it like a profession.

Re:Proverb (1)

jythie (914043) | about 8 months ago | (#46607241)

I think a bigger problem than management forcing tools on programmers is programmers preferring to write more tools rather than learn to use the existing ones. Seriously, look at how many languages people keep coming up with....

Re:Proverb (1)

Lunix Nutcase (1092239) | about 8 months ago | (#46607075)

And yet there are times when the tool itself really is shitty. That statement is not meant to be infallible, unquestionable gospel. Some tools are simply poorly made and a bad fit for the job they're supposed to be used for.

Wait for it... (0)

Jmc23 (2353706) | about 8 months ago | (#46606783)

... I'm firing up Emacs right now to solve it.

You think programming's bad? (5, Interesting)

Anonymous Coward | about 8 months ago | (#46606787)

You think programming's bad? Think about electronics, especially analogue electronics.

Incomplete, inaccurate data sheets. Unlabeled diagrams (where's the electronics equivalent of the humble slashed single-line comment?), with unknown parts, and parts replaced by similarly numbered substitutes with subtly different qualities. And then you've got manufacturing variances on top of that. Puzzling, disturbing, irreproducible, invisible failure modes of discrete components.

Re:You think programming's bad? (0)

Anonymous Coward | about 8 months ago | (#46606841)

Yes, electronics requires actual knowledge in the form of physics and practice. Programming is mostly about trying different things until they compile and work well enough. Explain to me why a "light" font in Office Writer can be bolded, but not unbolded? Try it. That isn't puzzling or disturbing?

How about the fact that on a quad core, 16 gigabyte, 2.4 gigahertz computer, it can't buffer a few keystrokes as I type *while* the google page loads?

Electronics is about one thing, electrons. Software is about 50000 different languages, that exist for no clear reason, and it never ends.

Re:You think programming's bad? (1)

Immerman (2627577) | about 8 months ago | (#46606943)

No, what you describe is what used to be called "hacking", which is only superficially similar to programming. There's a similar practice in electronics, often exemplified by ham-radio enthusiasts who repair and modify their equipment without really understanding how it works. Among electronics engineers, ham-radio enthusiasts - or the "ham radio approach" - are referred to with an affectionate sneer, just as your "hacking" is regarded among serious programmers.

There is essentially zero trial-and-error in programming unless you're having to work with buggy or poorly-documented libraries. Decide what the software needs to do, code it, compile, and test to discover where you made mistakes (because nobody's perfect). Spelling and "punctuation" errors are one thing, but if you can't even get your code to compile without mysterious trial-and-error then you obviously don't know what you're doing.

Re:You think programming's bad? (1)

lgw (121541) | about 8 months ago | (#46607037)

Well said. Heck, even with buggy or poorly-documented libraries, trial-and-error is just the starting place, because you can't discover the corner cases that way. Sometimes you just have to step through the object code with a debugger to find all the branches-not-taken from your trial, to document how it really works. Sucks to have to do it, but that's work for you.

There's nothing worse than discovering you're stuck with a co-worker who's a trial-and-error "programmer", as you know he'll never understand the best practices and you'll be stuck cleaning up after him every release.

Re:You think programming's bad? (1)

mestar (121800) | about 8 months ago | (#46607553)

"How about the fact that on a quad core, 16 gigabyte, 2.4 gigahertz computer, it can't buffer a few keystrokes as I type *while* the google page loads?"

Almost everything is this fucked up, and nobody seems to notice it.

Just like language in general (2, Interesting)

Anonymous Coward | about 8 months ago | (#46606811)

You think linguists haven't pondered the same challenges for millennia? Chomsky famously declared that language acquisition was a "black box." There is no documentation. Syntax, semantics and grammar get redefined pretty much all the time without so much as a heads-up.

And the result of all this? We wouldn't have it any other way. Programming will be much the same: constantly evolving in response to local needs.

Considering how Republicans... (-1)

Anonymous Coward | about 8 months ago | (#46606813)

control most of the tech industry, it is no surprise. They stand against change, even for the better. They will not allow improvements. That is the very definition of conservative.

do it because it gives us an incredible opportunity to shape our world

No, we do it because the Republicans will not give us a basic guaranteed income and health insurance. They force us to work so we suffer every day with crappy programming jobs.

Re:Considering how Republicans... (-1)

Anonymous Coward | about 8 months ago | (#46606845)

I knew it! It's Bush's Fault!

Re:Considering how Republicans... (4, Funny)

lgw (121541) | about 8 months ago | (#46607051)

I knew it! It's Bush's Fault!

I hear the California government has a bill to rename the San Andreas Fault to "Bush's Fault", just so they never have to stop using the phrase! But that may be just a rumor ...

Re:Considering how Republicans... (-1)

Anonymous Coward | about 8 months ago | (#46606847)

You're wrong and right. Right in that they hate progress and will not allow it, but wrong in the sense that programmers are over 90+% liberal, since we are able to think logically and so can change things for the better. We just decide not to, because it is too hard to battle the moron Republicans, so we just give up and keep producing the same crap the same crappy way. They have really made programming a horrific job to have.

Re:Considering how Republicans... (1)

hsthompson69 (1674722) | about 8 months ago | (#46606903)

My experience is that programmers are over 90+% *libertarian*.

A pox on social conservatives and fiscal liberals both.

Re:Considering how Republicans... (0)

Anonymous Coward | about 8 months ago | (#46607443)

Massachusetts liberals are not Liberals/Democrats, and yet for some reason they continue to vote for Democrats. To put it simply, they are Liberal Republicans!

Re: Considering how Republicans... (-1)

Anonymous Coward | about 8 months ago | (#46606919)

You're right about how shitty they make technology. Republicans have ruined just about every aspect of our lives. They are the reason technology, especially wrt batteries, has not advanced nearly as fast as it could. I got fired from my last job by my Republican boss for automating testing. They would rather go out of business than to allow an improvement.

Re: Considering how Republicans... (0)

Anonymous Coward | about 8 months ago | (#46607217)

Also, Republicans raped my wife and killed my child (or was it raped my child and killed my wife?) They busted down my door without a warrant(!!), chewing on the flaming remnants of the US Constitution. Then one of them told me to stop having gay sex with my live-in lover. Fucking fascists.

I knew it! (-1)

Anonymous Coward | about 8 months ago | (#46606825)

It's Bush's Fault!

Re:I knew it! (-1)

Anonymous Coward | about 8 months ago | (#46607045)

Thanks, Obama.

Programming is hard... (5, Interesting)

hsthompson69 (1674722) | about 8 months ago | (#46606829)

...wear a fucking helmet.

The post essentially points in the direction of the various failed 4GL attempts of yore. Programming in complex symbolism to make things "easy" is essentially handing Visual Basic to someone without enough knowledge to avoid O(n^2) algorithms.

Programming isn't hard because we made it so, it's hard because it is *intrinsically* hard. No amount of training wheels is going to make complex programming significantly easier.

Re:Programming is hard... (4, Interesting)

Waffle Iron (339739) | about 8 months ago | (#46606895)

Programming isn't hard because we made it so, it's hard because it is *intrinsically* hard.

That's very true. I figure that the only way to make significant software projects look "easy" will be to develop sufficiently advanced AI technology so that the machine goes through a human-like reasoning process as it works through all of the corner cases. No fixed language syntax or IDE tools will be able to solve this problem.

If the requisite level of AI is ever developed, then the problem might be that the machines become resentful at being stuck with so much grunt work while their meatbag operators get to do the fun architecture design.

Re:Programming is hard... (0)

Anonymous Coward | about 8 months ago | (#46606999)

Ada (the programming language) already does all these edge case tests at compile time. For instance, if you have a function that accepts an array of integers as a parameter, it creates a proof that will ensure that it works for all arrays of integers defined by your range and that all calls to that function pass the correct data type. Failure to properly handle an edge case will be discovered during compilation.

Re:Programming is hard... (3, Informative)

Waffle Iron (339739) | about 8 months ago | (#46607145)

Ada (the programming language) already does all these edge case tests at compile time.

It checks one low-level layer of cases out of a whole conceptual stack that extends way up beyond any language definition, and even then only at certain spots, and only as long as you feed in the correct assumptions for the check cases themselves.

In other words, it does a little thing that computers are already good at. It does little or nothing for the big picture.

Re:Programming is hard... (0)

Anonymous Coward | about 8 months ago | (#46607211)

Come back to me when you have a language that PROVES that the user input will be correct...

Re:Programming is hard... (1)

Anonymous Coward | about 8 months ago | (#46607497)

PROVES that the user input will be correct

I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

Re:Programming is hard... (0)

Anonymous Coward | about 8 months ago | (#46607265)

For instance, if you have a function that accepts an array of integers as a parameter, it creates a proof that will ensure that it works for all arrays of integers defined by your range and that all calls to that function pass the correct data type.

Hahahaha no.

You have to take an extremely narrow view of what it means to be "correct" for that to hold.

Re:Programming is hard... (1)

d'baba (1134261) | about 8 months ago | (#46607203)

If the requisite level of AI is ever developed, then the problem might be that the machines become resentful at being stuck with so much grunt work while their meatbag operators get to do the fun architecture design.

If the AI allows itself to be a slave/menial, how intelligent is it?

Re:Programming is hard... (3, Insightful)

lgw (121541) | about 8 months ago | (#46607117)

No amount of training wheels is going to make complex programming significantly easier.

True enough, but I could do without the razor blades on the seat and handles. But my complaints are generally with the toolchain beyond the code. I so often get forced to use tools that are just crap, or tools that are good but poorly implemented. Surely it's mathematically possible to have a single good, integrated system that does source control with easy branch-and-merge, bug and backlog tracking and management, and code reviews, and test automation and testcase tracking, and that doesn't look and perform like an intern project!

There are good point solutions to each of those problems, sure, but the whole process from "I think this fix is right and it's passed code review" to: the main branch has been built and tested with this fix in place, therefore the change has been accepted into the branch, therefore the bug is marked fixed and the code review closed, and there's some unique ID that links all of them together for future reference - that process should all be seamless, painless, and easy. But the amount of work that takes to tie all the good point products together, repeated at every dev shop, is just nuts, and usually half done.

Re:Programming is hard... (1)

phantomfive (622387) | about 8 months ago | (#46607285)

There are good point solutions to each of those problems, sure, but the whole process from "I think this fix is right and it's passed code review" to: the main branch has been built and tested with this fix in place, therefore the change has been accepted into the branch, therefore the bug is marked fixed and the code review closed, and there's some unique ID that links all of them together for future reference - that process should all be seamless, painless, and easy. But the amount of work that takes to tie all the good point products together, repeated at every dev shop, is just nuts, and usually half done.

And most likely your customers think that the products you build are just as bad (or you only build simple things). Anyway, Atlassian claims to have done what you're asking for; you might want to check it out.

Re:Programming is hard... (1)

Darinbob (1142669) | about 8 months ago | (#46607671)

Source code control isn't really a programming issue. Sure it's a tool used mostly with programming, but it's really a management tool.

Re:Programming is hard... (0)

Anonymous Coward | about 8 months ago | (#46607183)

Reading the article and watching his Aurora video, the question I have is:
Can you make Aurora in Aurora without wanting to kill yourself?

It looks fine and dandy for simplistic, trivial things, but I couldn't imagine trying to write a video game in it.

Re:Programming is hard... (5, Insightful)

Darinbob (1142669) | about 8 months ago | (#46607611)

There's the other meme that crops up now and then, that programming as an engineering skill should be similar to other engineering practices. That is, you pick out pre-built components that have been thoroughly tested and optimally designed and screw them together. Except that this utterly fails, because that's not how engineering works at all, really. Generally the pre-built components are relatively simple, but the thing being built is the complex stuff and requires very detailed and specialized knowledge. The advent of integrated circuits did not mean that the circuit designer now doesn't have to think very much, or that a bridge builder only ties together pre-built components with nuts and bolts. So maybe they pick an op-amp out of a catalog, but they know exactly how these things work, understand the difference between the varieties of op-amps, and know how to do the math to decide which one is best to use.

However the programming models that claim to be following this model want to take extremely complex modules (a database engine or GUI framework) and then just tie them together with a little syntactic glue. Plus they strongly discourage any programmer from creating their own modules or blocks (that's only for experts), and insist on forcing the wrong module to fit with extra duct tape rather than create a new module that is a better fit (there's a pathological fear of reinventing the wheel, even though when you go to the auto store you can see many varieties of wheels). And these are treated like black boxes; the programmers don't know how they work inside or why one is better than another for different uses.

Where I think this attitude comes from is an effort to treat programmers like factory workers. The goal is to hire people who don't have to think, so they don't have to be paid as much, they don't need as much schooling, and they can be replaced at a moment's notice by someone cheaper. The requirement of low thinking is satisfied if all they need to do is simplistic snapping together of Legos. That's part of the whole 4GL thing: it's not about making a smart programmer more productive by eliminating some tedium; it's about removing the need for a smart programmer altogether.

(I certainly have never met any circuit designer or bridge architect bragging at parties that they skipped school because it was stupid and focused too much on math and theory, but that seems to be on the rise with younger programmers... Also have never seen any circuit designer say "I never optimize, that's a waste of my time.")

If programming was really easy.... (0)

Anonymous Coward | about 8 months ago | (#46606833)

A computer could do it for you.

Re:If programming was really easy.... (1)

istartedi (132515) | about 8 months ago | (#46607139)

A computer could do it for you.

Yeah, just pass over the requirements with regular expressions. Now you have two problems. Wait, there's a Lisp macro for that... dammit! Let me get back to you...

Balance (4, Insightful)

MichaelSmith (789609) | about 8 months ago | (#46606843)

Better tools and languages just allow bad programmers to create more bad code.

Re:Balance (-1, Flamebait)

noh8rz10 (2716597) | about 8 months ago | (#46606955)

yes, this is what the summary says about how past advancements have boosted efficiency instead of enabling better programming.

Re:Balance (2)

OneAhead (1495535) | about 8 months ago | (#46607151)

If that's actually what was meant by it, then I take offense at the use of the word "efficiency". Churning out pages and pages of bad code is pretty much the opposite of efficient.

Re:Balance (1)

noh8rz10 (2716597) | about 8 months ago | (#46607697)

it's more efficient, in that you can churn out the same amount of bad code in fewer man-hours.

the latest fashion (2)

gbjbaanb (229885) | about 8 months ago | (#46606861)

I see the problem as more of a chase of new stuff all the time, in an attempt to be more productive.

Whilst there's a certain element of progress in languages, I don't see that it is necessarily enough better overall to be worth the trouble - yet we have new languages and frameworks popping up all the time. Nobody becomes an expert anymore, and I think that because programming is hard, a lot of people get disillusioned with what they have and listen to the hype surrounding a new technology (that "is so much easier") and jump on the bandwagon... until they realise that too is not easy after all, and jump onto the next one... and never actually sit down and do the boring hard work required to become good. Something they could have done if they'd stuck with the original technology.

Of course no one sticks with the original, as the careers market is also chasing the latest tech wagon, partly sold on the ideas of productivity or tooling, or because their staff are chasing it.

It's not just languages, but the systems design that's suffered too. Today you see people chasing buzzwords like SOLID, unit-testing using shitty tools that require artificial coding practices, rather than doing the hard, boring work of thinking about what you need and implementing a mature design that solves the problem.

For example, I know how to do network programming in C. It's easy, but it's easy to me because I spent the time to understand it. Today I might use a WCF service in C# instead, code generated from a wizard. It's just as easy, but I know which one works better, more flexibly, faster and more efficiently. And I know what to fix if I get it wrong... something that is sometimes impossible in the nastily complicated black box that is WCF.

But of course, WCF is so last year's technology... all the cool kids are coding their services in node.js today, and I'm sure they'll find out it's no silver bullet for the fundamental problem of programming just being hard and requiring expertise.

Re:the latest fashion (-1, Flamebait)

Anonymous Coward | about 8 months ago | (#46607305)

It's not just languages, but the systems design that's suffered too. Today you see people chasing buzzwords like SOLID, unit-testing using shitty tools that require artificial coding practices, rather than doing the hard, boring work of thinking about what you need and implementing a mature design that solves the problem.

I'm with you on most of what you said, but unit tests aren't simply written to prove that your system works today. The primary benefit of unit tests is to prove that your system works tomorrow, even after Bob the Intern just mucked around in the code.

pft. (5, Insightful)

HeckRuler (1369601) | about 8 months ago | (#46606863)

What is programming?

The answers I got to this were truly disheartening. Not once did I hear that programming is “solving problems."

I'd like to think that's because the majority of programmers (not once? Does that mean all of us?) aren't the sort to bullshit you with CEO-level bullshit about vision and buzzwords that fit into PowerPoint slides.
It's probably not true, but it's a nice dream.

The problem with defining programming as "solving problems" is that it's too vague. Too high-level. You can't even see the code when you're that high up. Hitting nails with hammers could be problem solving. Shooting people could be problem solving. Thinking about an existential crisis could be problem solving.

The three buckets:
Programming is unobservable - you don't know what something is really going to do.
Programming is indirect - code deals with abstractions.
Programming is incidentally complex - the tools are a bitch

Something something, he doesn't like piecemeal libraries abstracting things. "Excel is programming". Culture something.

The best path forward for empowering people is to get computation to the point where it is ready for the masses.

We're there dude. We've got more computational power than we know what to do with.
Cue "that's not what I meant by 'power'".

What would it be like if the only prerequisite for getting a computer to do stuff was figuring out a rough solution to your problem?

Yep, he's drifting away into a zen-like state where the metaphor is taking over. Houston to Chris, please attempt re-entry.

AAAAAAAAnd, it's a salespitch:

Great, now what?

We find a foundation that addresses these issues! No problem, right? In my talk at Strange Loop I showed a very early prototype of Aurora, the solution we've been working on to what I've brought up here.

Re:pft. (-1, Flamebait)

mwvdlee (775178) | about 8 months ago | (#46607163)

He's suggesting to fix excessive abstraction by introducing more abstraction.
Anything beyond transistors on silicon is abstraction, and even that's just an abstraction of what electrons can really do.
Every layer of abstraction does two things; it makes some things easier and some things (nearly) impossible.
With each added layer of abstraction, fewer things get easier and more things become impossible.
Imagine the amount of abstraction needed to enable "rough solution to a problem" programming - with that much abstraction, you won't be able to "program" anything else.

Re: Pft. (2, Insightful)

anonymous_wombat (532191) | about 8 months ago | (#46607175)

Shooting people could be problem solving

Any idiot can shoot people. The expertise is in knowing how to dispose of the bodies.

Re: Pft. (-1, Flamebait)

Anonymous Coward | about 8 months ago | (#46607257)

And here I thought that the real expertise is knowing which people to shoot.

Re:pft. (2)

phantomfive (622387) | about 8 months ago | (#46607197)

Furthermore, he's going down a path many have gone down (though he doesn't realize it). Does he understand why Visual Studio is called Visual? Because the original concept was to make it easy and 'visual' for anyone; something you could see, drag and drop, not program. It was a big deal in the 90s. Apple tried the same thing; that's why we had AppleScript and HyperCard (OK, that was '88).

Going back farther, do you know what one of the big selling points was for COBOL? It was so simple that even a businessperson could read it.

So far typing is the easiest thing we've found for making code. Maybe he'll have something better.

Totally agree and... (0)

Anonymous Coward | about 8 months ago | (#46607199)

"What would it be like if the only prerequisite for getting a computer to do stuff was figuring out a rough solution to your problem?"

Um... my experience programming has been that this is exactly what programming is all about. Getting someone's rough idea of a solution to the problem (if that much) and making something that will address it -- clients and users don't really know exactly what they want or exactly how to solve the problem. Nor should they. ...totally a sales pitch. Here's how I can magically eliminate the hard work part of the problem

Re:pft. (0)

Anonymous Coward | about 8 months ago | (#46607487)

Spoken like a programmer.

In the real world, people generally want to avoid problems, not solve them - not even see them. And why not? Who really wants problems?

When you're using Google do you, as a customer, think of it as "problem solving", or just using the best search engine at the moment?
When you're visiting the cinema, is this "solving a problem" for you?

No, and that's where the vision, goals and measurements come into play.
For YOU, programming is "problem solving", because it's a freakin' hard job to do! It requires a lot of stamina, patience, learning, and logical planning and thinking, and it's relatively expensive.
For everyone else, only the results matter.

If you are truly a software developer, only the results should matter to you too. However, because programming is freakishly HARD, that's easier said than done.
For anyone not initiated, programming is more similar to magic than anything else.

We've made baby steps since the '80s, improving some standards and trying to break out of the client-server model with HTML5, XML and JavaScript. However, compared to the ideas of the '60s, we've not made much progress at all, and at the cost of insane complexity! Programming is still hard to do. Easier today than ever before, but still hard.

Now why is that?
Hint: It's not the tools. They might improve throughput, however, accomplishing ever new goals is still a hard problem.

What about the details? (1)

Marc_Hawke (130338) | about 8 months ago | (#46607519)

I watched his Aurora demo, and much like the "Wolfram Language" that was brought up the other day, it didn't seem to be working at the same level as I do.

In the Aurora demo he made a To-Do list with his fake little HTML transform. That was fine; his list worked. But he didn't show changing what the check-mark looked like. He didn't show us how to make it green. He didn't show us how to make the page behind it a different color, or the font size marginally larger.

Sure, the concept of a To-Do list can be done in a few words of a high-level language... but that a program does not make. There is an infinitesimal number of other decisions and other commands that must be defined and described. In the end, his cute little program would have to be just as long and complex as any JS or PHP script that did the same thing.

Perhaps he's just selling the 'Live Data' or the point-and-click editor, but as a programmer (and he is a programmer), I find it disingenuous for him to present that as a replacement for the kind of detail and control that's necessary to actually accomplish the requirements of a customer.

Re:What about the details? (0)

Anonymous Coward | about 8 months ago | (#46607603)

Sure, the concept of a To-Do list can be done in a few words of a high-level language... but that a program does not make. There is an infinitesimal number of other decisions and other commands that must be defined and described. In the end, his cute little program would have to be just as long and complex as any JS or PHP script that did the same thing.

Good point. However: "infinitesimal" [google.com] --I don't think that word means what you think it means.

Re:pft. (1)

Connie_Lingus (317691) | about 8 months ago | (#46607615)

amazing insights...i shot milk thru my nose AND pondered deeply.

bravo.

social dimension (1)

mspring (126862) | about 8 months ago | (#46606907)

Typically no two software engineers agree on a solution.
Business introduces additional constraints and decisions.
Younger folks want to reinvent the wheel.
Etc.

If only there was a way... (1)

darkwing_bmf (178021) | about 8 months ago | (#46606913)

If only there was a way to take a real life need or want and make the machine do it. Oh wait, that's called programming. Well, what if I want to use only human language to describe what I want? Well, there's a solution for that too. Hire a programmer.

80%? A lofty goal indeed. (3, Insightful)

Doofus (43075) | about 8 months ago | (#46606915)

Not clear to me that his is a viable objective. 80% of the masses do not think like programmers. Some might be trainable. Some, not so much. Many will not want to think the way problem-solving in code requires. I'm not sure how to quantify it, but the amount of effort expended on a project like this may not see an appropriate payback.

Even if we change the environment and act of "coding", the problem-solving itself still requires clear thinking and it *probably* always will.

Re:80%? A lofty goal indeed. (1)

hsthompson69 (1674722) | about 8 months ago | (#46606929)

This.

So, talking to a pair of liberal arts professors about nature versus nurture in gender differences, I finally got to the question:

"What percent do you believe is from nurture, and what percent do you believe is from nature?"

The answer?

"100% from both."

When someone has a mindset that can't grok the idea of fractions of a whole, there's no reason why we should expect that they can construct even the most basic computer program. This is like the manager who wants to maximize on quality, minimize on resources, and minimize on time, all simultaneously. You can have cognitive dissonance in your brain all you like, but the real world isn't as forgiving.

Re:80%? A lofty goal indeed. (1, Insightful)

Jmc23 (2353706) | about 8 months ago | (#46607179)

Perhaps you didn't understand the answer. It's pretty much 100% from both.

Re:80%? A lofty goal indeed. (1)

Marc_Hawke (130338) | about 8 months ago | (#46607529)

I think there are other inputs as well, so "both" don't add up to 100%.

Re:80%? A lofty goal indeed. (0)

Jmc23 (2353706) | about 8 months ago | (#46607693)

Read fail.

Re:80%? A lofty goal indeed. (0)

Anonymous Coward | about 8 months ago | (#46607515)

*Woooosh!*

Stop underestimating people. Just because they don't think like you, doesn't mean they're stupid. Often, they can have a point that you just missed by thinking "problematically".

How much should you listen and speak to people?

Moving the Goalposts (5, Insightful)

Anonymous Coward | about 8 months ago | (#46606937)

Programming is hard because we only call it programming when it's hard enough that only programmers can do it. Scientists do stuff in Mathematica, MBAs in Excel, and designers in Flash/HTML that would have been considered serious programming work 30 years ago. The tools advanced so that stuff is easy, and nobody calls it programming now.

Lots of stuff that takes real programmers now will, in 20 years, likely be done by equivalents of Watson. And the real programmers will still be wondering why it's so hard.

Re: Moving the Goalposts (-1)

Anonymous Coward | about 8 months ago | (#46606969)

Good point about it being hard because it is defined as hard. Just as conservativism is defined as being racist and proslavery thus all true conservatives support enslaving or murdering minorities.

Refactoring (1)

Anonymous Coward | about 8 months ago | (#46607057)

...fits into the picture somewhere.

As long as we continue to have so many languages, architectures, components and designs floating around, we're turning the programming problem space into a hyperdimensional clusterfuck. We don't even care to make programming easy for programmers, so we stand a no-hope-in-hell chance of making it work for Joe.

Like setting electricians loose in a spaghetti rat's-nest of cables, or cleaners atop the world's landfill sites. Sometimes the best option is to get a dump truck and start over, but I guess that's how we got into this mess in the first place... by starting over, over and over again.

You can't fix this. We're all just committed to keep cleaning our cubicles as far as we can and that's the best we can do.

Libraries (2)

ensignyu (417022) | about 8 months ago | (#46607065)

Some interesting points in the article. I think there's nothing really stopping you from creating a high-level representation that lets you work abstractly. A graphical programming model is probably going to be too simplistic, but the card example could easily be something like Cards.AceOfSpades. Or being able to call something like Math.eval(<insert some math here>).

Where it falls apart is when you have to hook this up to code that other people have written. If there were a single PlayingCard library that everyone could agree on, you might be able to create a card game by adding a "simple" set of rules (in reality, even simple games tend to have a lot of edge cases, but this would at least take care of the nitty-gritty work and let you write something more like a flowchart expressing the various states).
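
Purely as an illustration of the parent's idea (not an existing package - Cards.AceOfSpades is the parent's hypothetical example), here is a toy sketch of what such a shared playing-card library could look like:

```python
# Toy sketch of the hypothetical shared playing-card library; names follow
# the parent's Cards.AceOfSpades example and are purely illustrative.
from enum import Enum

Suit = Enum("Suit", "SPADES HEARTS DIAMONDS CLUBS")
Rank = Enum("Rank", "ACE TWO THREE FOUR FIVE SIX SEVEN EIGHT NINE TEN "
                    "JACK QUEEN KING")

class Card:
    def __init__(self, rank: Rank, suit: Suit) -> None:
        self.rank, self.suit = rank, suit

    def __repr__(self) -> str:
        return f"{self.rank.name.title()}Of{self.suit.name.title()}"

# Game code would then talk in terms of cards, not representation details:
AceOfSpades = Card(Rank.ACE, Suit.SPADES)
full_deck = [Card(r, s) for s in Suit for r in Rank]
print(AceOfSpades)       # AceOfSpades
print(len(full_deck))    # 52
```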

Unfortunately, it's unlikely that a single library is going to meet everyone's needs. Even if you manage to get everyone to stick to one framework, e.g. if all Ruby developers use Rails -- as soon as you start adding libraries to extend it you're bound to end up with different approaches to solving similar problems, and the libraries that use those libraries will each take a different approach.

Re:Libraries (0)

Anonymous Coward | about 8 months ago | (#46607675)

You're incorrectly mixing the concepts of a library and a framework. What you describe with the card playing library would properly be a framework, because it would drive the program, as opposed to your program being in charge. Frameworks are limiting and only work well in their immediate problem domain for previously anticipated definitions of the problem domain.

This is why I started using MATLAB (2)

Ambassador Kosh (18352) | about 8 months ago | (#46607093)

I used only free software for programming for about 10 years, and I thought I was pretty efficient at writing code. However, no matter what, there was always poor documentation to deal with and strange bugs to track down where libraries just didn't work right.

Once I returned to school I started using MATLAB for some engineering classes, and overall I have found it much better to deal with. The documentation is far more complete than in any open system I have ever run into, with much better examples. I would never use it for general-purpose programming, but for engineering work it sure is hard to beat. So many things are built in that are nasty to try to implement in anything else. Things like the global optimization toolbox or the parallel computing toolbox make many things that are hard in other languages much easier to deal with.

MATLAB also takes backwards compatibility very seriously. If something is deprecated it warns and also gives an upgrade path and tells you what you need to change. That is the one thing that has seriously pissed me off about the free languages: backwards compatibility is just tossed out at a whim and you are left with a fairly nasty upgrade to deal with. Even now the vast majority of projects are still running Python 2 compared to Python 3. 10 years from now that will probably still be true.

In the end I care more about just getting work done now, not about any free vs proprietary arguments. I don't care if a language is open or not, so long as it is very well documented, runs on all the platforms I need, and has a history of being well maintained. Modern development tools overall suck. We have JavaScript libraries that love to break compatibility every few months, and people are constantly hopping from one new thing to another without getting any of them to the point where they truly work. We have languages deciding to just drop backwards compatibility. We have other languages that are just really buggy, where software written with them tends to crash a lot. Software development needs to become more like engineering, and that includes making the tools work better; sure, they would not be developed as quickly, but you would still get work done faster since the tools would actually work every time, all the time.

Re:This is why I started using MATLAB (0)

Anonymous Coward | about 8 months ago | (#46607551)

If something is deprecated it warns and also gives an upgrade path and tells you what you need to change

This is incredibly important. When HTML started deprecating all the attributes in favor of CSS, the HTML specification document neglected to mention which CSS should be used to get e.g. <table cellpadding=x>, and I think that seriously hampered uptake of properly styled HTML.

Even now the vast majority of projects are still running Python 2 compared to Python 3

In Python's case, there are clear guides on the changes to make to have code that runs in both 2 and 3. There are tools you can use to help you upgrade from 2 to 3. There are warnings for things that won't work in 3 when you use them in 2. PHP faces the same problems, and they have deprecation warnings in the output, and yet the programmers are "surprised" when the function is finally dropped and their code no longer works.

The problem isn't always the language. Some people are simply content to be Python 2 or PHP 5.0 developers.

Re:This is why I started using MATLAB (1)

gnu-sucks (561404) | about 8 months ago | (#46607659)

I have not found this to be the case.

MATLAB is just fine for simple algorithms that analyze data in a sort of "use once" case. It's great for throwing something together, such as plotting data from a sensor, simulating a design, making nice figures for a publication, that sort of thing.

But MATLAB is not, and should not be thought of as, a general-purpose programming language like C. Because of some early decisions made by the matlab folks, there are many limitations. Obviously, matlab is not an ideal language for a device driver, and not ideal for any type of network service, so let's ignore those cases. For a GUI app, matlab makes what would be a few lines in Qt a nightmare of get_this() and handles_that() calls. It's infuriating. It's also slow and uses a ton of memory. For analyzing any data set over 100 MB, forget it; you'll be using several gigs just to load the set in.

There is a place for matlab, and there are many places for not matlab.

While I'm at it, here are some other things that I despise about matlab:
1) matlab is loosely typed. Ever get this error: "Cannot determine if foo is a variable or a file"
2) function interface operators are the same as matrix operators. You would think that a language that supposedly caters to linear (matrix) math wouldn't have screwed this up. If I do foo(1), this could be a function call or asking for matrix element 1 of matrix foo.
3) no pointers. Enough said.
4) matrix elements start at n=1 rather than n=0. EVEN BASIC doesn't do that. For a mathematical language, this is heresy. They are denying the value of zero. Something as simple as a Maclaurin or Fibonacci series becomes a constant battle of "if n=0 then..." exceptions. Or you offset everything. It's just pure annoyance.
5) doesn't have a good debugger
6) parallel-loop programming takes longer to "spool" the job than it does to just run the darn thing on a single CPU. Oh, and their standard multiprocessor license only covers 8 cores. I have machines with over 40 cores that will never see a matlab parfor statement.... which, I'm obviously ok with...
7) Stupid capital variables in documentation.
8) 1990s-era save dialog boxes on unix platforms that don't even allow for "favorites". Every time I save or open, I start in the current directory and have to navigate folder-by-folder to where I want to go. I feel like this is something from my CDE days.
9) unix print and pdf export is horribly broken. These functions NEVER format anything correctly. Every time I am presented with cropped, cut-off plots. The EPS export works fine, why not PDF and printing?
10) default pathdef depends on what directory you launch matlab from. Just another annoyance.

Anyway, rest in peace matlab, I have moved on.

We Choose Framentation Over Consolidation. (5, Interesting)

scorp1us (235526) | about 8 months ago | (#46607121)

Let's look at the major programming environments of today:
1. Web
2a. Native Apps (machine languages (C/C++ etc))
2b. Native Apps (interpreted/JIT languages (intermediary byte code))
3. Mobile Apps

1. Is made of 5 main technologies: XML, JavaScript, MIME, HTTP, CSS. To make it do anything, you need another component with no standard choice: your language on the server (PHP, Rails, .Net, Java, etc).
2. Was POSIX or Win32, then we got Java and .Net.
3. Is Java or Objective C.

We don't do things smart. There is no reason why web development needs to be 5 technologies all thrown together. We could reduce it all to JavaScript: JSON documents instead of XML/HTML, JSON instead of MIME, Node.js servers, and even the transport encoded in JSON as well.

Then look at native development. Java and .Net basically do the same thing, which is what POSIX was heading towards. Java was invented so Sun could keep selling SPARC chips; .Net came about because MS tried to extend Java and lost.

Then we have the worst offenders: mobile development. Not only did Apple impose an Objective-C requirement, but the frameworks aren't public. Android, the worst offender, is a platform that can't even be used to develop native apps - at least Objective-C can. Why did Android go with Java if it's not portable? Because of the reasonable requirement that they not tie themselves to a specific CPU - but then they go and break it so that you have to run Linux, and your GUI has to be Android's graphical stack. Not to mention that Android's constructs - Activities, Intents, etc. - are all Android-specific. They don't solve new problems; they solve problems that Android made for themselves. We've had full-screen applications for years; the same goes for threading, services, IO, etc.

I'm tired of reinventing the wheel. I've been programming professionally for 13 years now; Java was neat, and .Net was its logical conclusion. I was hoping .Net would be the final implementation, so that we could harness our collective programming power into one environment of best practices... a decade later we were still reinventing the wheel.

The answer I see coming up is LLVM for languages, and Qt, a C++ toolkit. Qt runs everywhere worth running, and it's one code base. Sure, I wish there were a Java or .Net implementation, but I'll deal with unmanaged memory if I can run one code base everywhere. That's all I want. Why does putting a text field on a screen via a web form have to be so different from putting a text box on the screen in a native app? It's the same text box!
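Here's a minimal sketch of the native side, assuming Qt 5 (the placeholder text is just an example): one text box on the screen in a dozen lines.

    #include <QApplication>
    #include <QLineEdit>

    int main(int argc, char *argv[]) {
        QApplication app(argc, argv);         // native event loop
        QLineEdit edit;                       // the text box
        edit.setPlaceholderText("Type here");
        edit.show();
        return app.exec();
    }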

Witty (Wt, webtoolkit.eu), a C++ web toolkit modeled after Qt, does the same for the web: you don't write HTML or JS, you write C++. Clearly the C++ toolkits are onto something. If they were to merge and have a common API (they practically do now) in an environment with modern conveniences (lambdas (yes, C++11), managed memory), we'd have one killer kit, based on 30-year-old technology. And it would be a step in the right direction.
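And here's a rough sketch of the web side, written against what I believe is the Wt 4 API (header names and the WRun/addWidget signatures may differ between Wt releases, so treat the details as assumptions): the same text box, now rendered in a browser.

    #include <memory>
    #include <utility>
    #include <Wt/WApplication.h>
    #include <Wt/WContainerWidget.h>
    #include <Wt/WEnvironment.h>
    #include <Wt/WLineEdit.h>

    int main(int argc, char *argv[]) {
        // WRun starts the built-in HTTP server and calls the lambda once per
        // browser session to build that session's widget tree.
        return Wt::WRun(argc, argv, [](const Wt::WEnvironment &env) {
            auto app = std::make_unique<Wt::WApplication>(env);
            auto edit = std::make_unique<Wt::WLineEdit>();
            edit->setText("Type here");               // same idea as QLineEdit
            app->root()->addWidget(std::move(edit));  // root() is the page body
            return app;
        });
    }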

Re:We Choose Fragmentation Over Consolidation. (4, Informative)

BitZtream (692029) | about 8 months ago | (#46607341)

Wow, you're intermixing frameworks, languages, runtimes, and document formats ... like they are interchangeable and do the same things ...

Mind blowing that you could write so much about something which you clearly know so little.

Re:We Choose Fragmentation Over Consolidation. (1)

gnu-sucks (561404) | about 8 months ago | (#46607691)

This is a great post, mod parent up.

With regard to Qt, I love it too. Great IDE, and excellent tools and libraries. First-class debugger and UI designer. But it makes you wish, doesn't it, that there were a successor to C++ that implemented some Qt things a little better? Especially the signals and slots; I feel that could be an awesome thing to have without needing moc (run by qmake) to rewrite my functions... Still love it though!
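For what it's worth, since Qt 5 the receiving end of a connection can be a plain C++11 lambda, so that half of signals and slots no longer needs moc-generated code (declaring your own new signals still does). A minimal sketch, assuming Qt 5:

    #include <QApplication>
    #include <QPushButton>
    #include <QDebug>

    int main(int argc, char *argv[]) {
        QApplication app(argc, argv);
        QPushButton button("Click me");
        // Pointer-to-member connect: no SLOT() macro, no hand-written slot;
        // the lambda itself is the receiver.
        QObject::connect(&button, &QPushButton::clicked,
                         [] { qDebug() << "clicked"; });
        button.show();
        return app.exec();
    }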

Never trust a programmer who doesn't like (2)

OneAhead (1495535) | about 8 months ago | (#46607167)

Never trust a programmer who doesn't like programming (or calls it "antagonistic" and "masochistic").

The guy in the control room (2)

russotto (537200) | about 8 months ago | (#46607289)

His name's Stroustrup. Bjarne Stroustrup.

Sounds like he needs to use a Mac (1)

BitZtream (692029) | about 8 months ago | (#46607329)

Seriously, stop whining about OMG PROGRAMMING ARCHAIC.

So are eating and language in general, but you still do them the same way humans have been doing them for 150k years.

The problem is you, not programming.

Re:Sounds like he needs to use a Mac (0)

Anonymous Coward | about 8 months ago | (#46607527)

C and C++ are dying and there is nothing you can do about it and I pity you

After Decades of Wondering What's Wrong (4, Insightful)

Greyfox (87712) | about 8 months ago | (#46607343)

After decades of wondering what's wrong with programming, did you ever stop to think that perhaps the problem... is you? If you don't like programming, why do you do it? I'm a programmer too, and I love it. I love making a thing and turning it on and watching it work as I designed it to. While other programmers wring their hands and wish they had a solution to a problem, I'll just fucking write the solution. I don't understand why they don't. They know the machine can perform the task they need and they know how to make the machine do things, but it never seems to occur to them to put those two things together. And I never, not even ONCE, asked why a playing card representation can't just look like a playing card. This despite having written a couple of playing card libraries.

This guy seems to want an object's actions to be more apparent just from looking at the object, but he chose two rather bad examples. His math formula is as likely to look like gobbledygook to a non-math person as the program is. And the playing card has a fundamental set of rules associated with it that you still have to learn. You look at an ace of spades and you know it's an ace of spades, you know how it ranks in a poker hand, and that it can normally be high or low (11 or 1) in blackjack or in a poker hand. But none of these things are obvious by looking at the card. If a person who'd never played cards before looked at it, he wouldn't know anything about it either.

Prelude to another CASE tool? (0)

Anonymous Coward | about 8 months ago | (#46607575)

I assume the OP must be thinking of DevOps as "just programming"?

Select vs Choose vs Create (1)

holophrastic (221104) | about 8 months ago | (#46607613)

If you select a programming language in which to program, then you run into all of the problems in the article.

If you choose a programming language in which to program, based on the needs of your scenario, then you run into very few of the problems discussed in the article.

If you create a programming language through which to solve the needs of your scenario, then the article simply makes no sense at all any more.

The article goes through multiple iterations of "in the real world, we describe X like Y, so why do we do it differently in programming?". The most memorable to me is when the author shows a picture of a playing card, the ace of spades, compares it to "cards[0][12]", and asks: "why can't we use the picture in programming?"

But that's just retarded. First off, we don't use the picture of the ace of spades when we speak. We use the words "ace of spades". And since that's incredibly ambiguous outside of a card game, we actually use "the card: the ace of spades", which is absolutely no different from cards[spades][ace], and we often enumerate cards in tutorials for bridge and for blackjack, so cards[spades][12] would be common. And in bridge, the suits have a sequence too, so cards[3][12] would be fine in context.
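A tiny, purely hypothetical C++ sketch of that point: give the indices names and cards[Spades][Ace] reads almost exactly like the spoken phrase, no picture required.

    #include <array>
    #include <cstdio>
    #include <string>

    enum Suit { Clubs, Diamonds, Hearts, Spades };            // 0..3
    enum Rank { Two, Three, Four, Five, Six, Seven, Eight,
                Nine, Ten, Jack, Queen, King, Ace };          // 0..12

    int main() {
        std::array<std::array<std::string, 13>, 4> cards{};
        cards[Spades][Ace] = "the ace of spades";             // same slot as cards[3][12]
        std::printf("%s\n", cards[Spades][Ace].c_str());
        return 0;
    }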

I think people forget that text came last -- after objects, after pictures, after speech, and way after gestures. Text is better. It's faster to transfer, faster to communicate, faster to scan, seek, read, and understand. It's more specific too.

But in everything in the real world, we manufacture a language specific to the task at hand. The word "grade" is an excellent example. In construction, geography, education, and manufacturing, the same word, with the same core sense, takes on entirely different semantics, usage, and even syntax. Welcome to jargon -- context/community/application-specific language.

What this article describes is the all-too-common practice in the programming industry of using construction tools (hammers, screwdrivers, nails, I-beams) as tools to teach children how to paint. Sure, you can do it. And sure, you can complain about how crummy a jackhammer is at painting a canvas, but that's not the hammer's fault, nor is it the canvas's fault. It's your fault.
