
Remember the Computer Science Past Or Be Condemned To Repeat It?

Soulskill posted about a year ago | from the never-get-involved-in-a-land-war-in-COBOL dept.

Programming 479

theodp writes "In the movie Groundhog Day, a weatherman finds himself living the same day over and over again. It's a tale to which software-designers-of-a-certain-age can relate. Like Philip Greenspun, who wrote in 1999, 'One of the most painful things in our culture is to watch other people repeat earlier mistakes. We're not fond of Bill Gates, but it still hurts to see Microsoft struggle with problems that IBM solved in the 1960s.' Or Dave Winer, who recently observed, 'We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.' And then there's Scott Locklin, who argues in a new essay that one of the problems with modern computer technology is that programmers don't learn from the great masters. 'There is such a thing as a Beethoven or Mozart of software design,' Locklin writes. 'Modern programmers seem more familiar with Lady Gaga. It's not just a matter of taste and an appreciation for genius. It's a matter of forgetting important things.' Hey, maybe it's hard to learn from computer history when people don't acknowledge the existence of someone old enough to have lived it, as panelists reportedly did at an event held by Mark Zuckerberg's FWD.us last Friday!"


FP (-1)

Anonymous Coward | about a year ago | (#44430699)

First Fap

Back to BASIC (3, Funny)

Bob_Who (926234) | about a year ago | (#44430749)

10 GOTO 20
20 GOTO 10

Re:Back to BASIC (-1)

Anonymous Coward | about a year ago | (#44430779)

Inefficient -- should be
10 GOTO 10

Re:Back to BASIC (0)

rrhal (88665) | about a year ago | (#44430795)

#include stdlib;
main() {
fork(); main();

Re:Back to BASIC (-1, Offtopic)

Samantha Wright (1324923) | about a year ago | (#44430811)

rrhal.c:1:10: error: #include expects "FILENAME" or <FILENAME>

Re:Back to BASIC (0)

Anonymous Coward | about a year ago | (#44430923)

Why does every post on Slashdot lately where someone tries to be funny by writing C fail miserably at the most basic level? I just don't get it. From a 5-digit UID, no less. I guess I should stop thinking that older readers should know better.

Re:Back to BASIC (1)

ultranova (717540) | about a year ago | (#44431037)

Your fork bomb fizzles due to running out of stack. Try this:

main() {
    while(1) fork();
}


Re:Back to BASIC (0, Troll)

Anonymous Coward | about a year ago | (#44430821)

You chuckle, but hidden variants of LISP keep coming back every 20 years like a slow-motion herpes infection. Anyone familiar with LISP knows that it was used because it was easy to parse using a stack, but it was horribly prone to errors in coding. Good LISP programs are short, but nearly impossible to write.

Another example is Python - basically a repackaged CBASIC script interpreter. Runtime VB6, VBA and Python share a lot in common (Python of course has some bells and whistles, such as partial object support).

Anyway, it gives me somewhat of a sigh of relief to see C, C++ and Java/C# be stable in the face of this recurring tide of fad languages, as there is some genuine progress in making programming easier (i.e. not more clever, not making it an infinite jackknife). On the negative side, C++0x and C# have both seen all sorts of non-C++ feature creep from the fad languages of today.

Lisp is just a representation of ASTs. (0)

Anonymous Coward | about a year ago | (#44430867)

Lisp doesn't "come back" every few years. For crying out loud, the entire language is nothing but an abstract syntax tree. Of course it's present in all languages! It's what they all end up getting converted down into before they finally get converted again into assembly or directly into machine code in some cases.

I don't see what's surprising or insightful about noticing that a common subset of almost all programming languages is, *gasp*, present in almost all programming languages.
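The "Lisp is just an AST" point above can be made concrete with a toy sketch (in Python, purely for illustration; the `evaluate` function and its tiny environment are inventions here, not any real Lisp): s-expressions are literally nested lists, and evaluation is just a walk over that tree.

```python
# A toy evaluator: Lisp source *is* its syntax tree, so nested
# Python lists stand in for s-expressions directly.
import operator

ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr, env=ENV):
    """Walk the tree: numbers evaluate to themselves, strings look up
    a binding, and lists are (operator, *operands)."""
    if isinstance(expr, (int, float)):
        return expr
    if isinstance(expr, str):
        return env[expr]
    op, *args = expr
    fn = evaluate(op, env)
    return fn(*[evaluate(a, env) for a in args])

# (* (+ 1 2) (- 10 4))  =>  18
print(evaluate(["*", ["+", 1, 2], ["-", 10, 4]]))  # 18
```

Every compiler builds something shaped like this internally, which is the sense in which the tree-walking core of Lisp is "present in" other languages.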

Re:Lisp is just a representation of ASTs. (1)

Anonymous Coward | about a year ago | (#44430885)

LISP 1958
Zeta 1973
Scheme 1975
Common LISP 1985
CL/CLISP 1994, 1999
Clojure 2008

These aren't just "influences" from LISP - they are essentially new branches. "Almost all programming languages" is a huge stretch.

Re:Lisp is just a representation of ASTs. (2)

Samantha Wright (1324923) | about a year ago | (#44431001)

There's arguably a list of things that make a language into "a" Lisp [paulgraham.com] , and not all of the languages that meet those criteria are actually forks or directly inspired by McCarthy et al.'s LISP programming language. GP was referring to this concept, but probably has a much looser understanding of what it means to be a Lisp. Tragically, TFA is mostly about APL.

Re:Back to BASIC (4, Informative)

Samantha Wright (1324923) | about a year ago | (#44431065)

Here's some worthwhile reading on why Lisp has trouble staying put—possibly a little flamebait-y: Lisp is not an acceptable Lisp [blogspot.ca] , The Lisp Curse [winestockwebdesign.com] , and Revenge of the Nerds [paulgraham.com] . The core arguments seem to be (a) it's really easy to invent things in Lisp so no one can agree on how to do it, and (b) the lack of a coherent standard platform means there is no easy target for university courses or job descriptions.

Re:Back to BASIC (5, Funny)

cheater512 (783349) | about a year ago | (#44431231)

Lisp never 'comes back'. It merely recurses.

It's not the programmers making the decisions (4, Insightful)

cultiv8 (1660093) | about a year ago | (#44430753)

It's managers and executives who make the decisions, and to them whether it's a browser or mobile app or SaaS or whatever the latest trend is, who cares if you're reinventing the wheel as long as profits are up.

Re:It's not the programmers making the decisions (5, Insightful)

fldsofglry (2754803) | about a year ago | (#44430841)

Also in line with this: I can't imagine that the way patents work actually helps with the problem of reinventing the wheel. You almost have to reinvent the wheel to create a working solution that won't get you sued.

Re:It's not the programmers making the decisions (1)

aXis100 (690904) | about a year ago | (#44431033)

I wish I had mod points. +1 Insightful.

Re:It's not the programmers making the decisions (2, Insightful)

DaveAtFraud (460127) | about a year ago | (#44430845)

It's managers and executives who make the decisions, and to them whether it's a browser or mobile app or SaaS or whatever the latest trend is, who cares if you're reinventing the wheel as long as profits are up.

That hasn't changed either. Just the specific subject of the idiocy has changed. Idiotic managers are timeless. Lady Ada probably had the same thing to say about Charles Babbage.


Re:It's not the programmers making the decisions (-1)

Anonymous Coward | about a year ago | (#44430909)

No, no. LadyAda is quite a competent electrical engineer and programmer. She was even on Slashdot last year.

                        http://hardware.slashdot.org/story/12/12/19/0053228/open-source-hardware-hacker-ladyada-awarded-entrepreneur-of-the-year [slashdot.org]

Re:It's not the programmers making the decisions (5, Informative)

sjames (1099) | about a year ago | (#44431327)

Actually, it was Babbage who faced such idiocy from Parliament:

On two occasions I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

And you thought the clogged tubes thing was bad.

Yes - and no... apk (-1)

Anonymous Coward | about a year ago | (#44430971)

Yes - Most mgt. in my experience hadn't even EVER DONE THE JOB hands-on, much less well (that was thru my time coding full time 1994-2009). They don't belong there. Promote from "within the ranks" SHOULD be "how it is", but what I call the "BBC" ("billionaire boys club") fraternal organizations that run the world don't allow it - take that as you will - while they place "their own" into those positions and burn companies blind with STUPID decisions (Mr. Ballmer). It's gotten better, but still goes on.

NOW - As far as my "NO" though?

Ok - I posted a couple things to a young guy here the other day (who made a decent point, 1 I even overlooked here -> http://tech.slashdot.org/comments.pl?sid=4020769&cid=44396911 [slashdot.org] & I posted the rest on "DLL HELL" to a "PHB" named Jeremiah Cornelius here (modded down too no less, 'gee wonder why' (not) -> http://tech.slashdot.org/comments.pl?sid=4020769&cid=44396873 [slashdot.org] ).

So... why do I overlook the 'crutches' the OS provides (even though they're good)? Well - there are SIMPLER BETTER WAYS AROUND IT you as the programmer have to use:

Statically compiled libs http://tech.slashdot.org/comments.pl?sid=4020769&cid=44428463 [slashdot.org]


DLL placements http://tech.slashdot.org/comments.pl?sid=4020769&cid=44402835 [slashdot.org]

(And guys? That is ELEMENTARY STUFF... but "PHB" shooting his mouth off about his former alleged employer didn't even KNOW them!)

Problem? Using "OS Crutches"... I first saw it when during Win9x's start & moreso with NT actually, devs were told "Don't worry about timeslicing/ceding back CPU time to apps yourself - the OS can do it, due to pre-emptive multitasking"... lol, that was a HUGE bust!

Where'd I see it? On a giant system with Oracle DB on Solaris, Windows 2000 clients, thru Citrix to remote campuses... within 20 minutes? Remote campus clients "locked up" cold.

We tried DoEvents in VB - no dice. Should have worked, but didn't (even though it oddly enough relies on the fix I used) - what did? A "sleep" API call in loops with return recordsets.

Middleware drivers aren't coded for niceness/timeslice. They're coded for BRUTE FORCE SPEED... doesn't "mix" well in single session setups thru Citrix (even with its "badboy app" crutch session tweaks you can apply server-side using a 'crutch' again, bad move).
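The workaround described above - explicitly yielding the CPU inside a result-processing loop instead of trusting the scheduler or DoEvents - can be sketched like this (in Python rather than VB, and with `fetch_batch` / `process_all` as hypothetical stand-ins for a real DB cursor loop):

```python
# Hedged sketch of "a sleep API call in loops with return recordsets":
# yield explicitly between batches so other sessions get CPU time.
import time

def fetch_batch(cursor, size=100):
    # Hypothetical helper: pops up to `size` rows off a list standing
    # in for a database cursor; returns [] when exhausted.
    rows = cursor[:size]
    del cursor[:size]
    return rows

def process_all(cursor, handle_row, yield_every=0.01):
    processed = 0
    while True:
        batch = fetch_batch(cursor)
        if not batch:
            return processed
        for row in batch:
            handle_row(row)
            processed += 1
        # The "sleep in the loop": cede the CPU between batches
        # instead of hammering it with brute-force fetches.
        time.sleep(yield_every)

rows = list(range(250))
seen = []
print(process_all(rows, seen.append, yield_every=0))  # 250
```

The point is the explicit sleep between batches, not the batch size; under a shared single-session setup like Citrix, that pause is what keeps one greedy client from starving the rest.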


P.S.=> Sometimes, the "old men" who have "been there/done that" know best... & why - we've been "punched in the head" by the mistakes you youngsters will run into before is why! apk

Re:Yes - and no... apk (-1)

Anonymous Coward | about a year ago | (#44431053)

You know what would fix it? A good HOSTS file!

Also ask your doctor to review your meds.

The thing about repeating the past (1)

FooAtWFU (699187) | about a year ago | (#44430763)

I saw the Lady Gaga quip and Scott's fondness for effective ancient map-reducey techniques on unusual hardware platforms. It reminded me about things like discovering America. Did the Vikings discover it years before any other Europeans? Certainly. Did the Chinese discover it as well? There's some scholarly thought that maybe they did. But you know whose discovery actually effected change in the world? Lame old Christopher Columbus.

Perhaps there's a lesson to be learned here from people who want to actually change the world with software and if we spent less time ranting about mmap-vs-scanf-in-Hive we could learn it.

Re:The thing about repeating the past (1)

iggymanz (596061) | about a year ago | (#44430831)

Eh? Starting 15,000 years ago, various waves of people came here from Asia, and huge and important civilizations have risen and fallen in the Americas since then. Some of those people are still around, and their influence on art, food, and medicine continues into our culture. One group of those Asians was absolutely crucial to the United States winning its independence and also had influence on our Constitution. Talk about effecting change in the world - and they're still around, by the way.

Re:The thing about repeating the past (-1)

Anonymous Coward | about a year ago | (#44431237)

caucASIANs perhaps? Posting as anon since even the suggestion that white people did anything good for anyone is considered bigotry.

Re:The thing about repeating the past (0, Offtopic)

Anonymous Coward | about a year ago | (#44430907)

It's worth pointing out that Columbus thought he was discovering a path to the East Indies, that he enslaved friendly natives, and that it was the English colonists, and not the French or Spanish (who both had claims in North America) who "effected change in the world" in the area Columbus "discovered"; by which you mean that small part of the world I assume, since the French, Spanish, Portuguese, Chinese and many others effected many changes in other parts of the world at exactly the same time. Did I mention that Columbus was Italian and funded by the Spanish?

So, if by domain range-mapping in the mathematics functional sense, you mean that the lesson is "changing the world by software" by the function of greedy funders, then that is also true... MS, Oracle, Apple, Google all have changed the world via software :)

Sorry, didn't mean to be rude or nit-picky about mmaps, but it's just so ironic

Re:The thing about repeating the past (4, Insightful)

symbolset (646467) | about a year ago | (#44430963)

Lady Gaga is mentioned because she is both a classically trained artist and sui-generis of successful PopTart art through self-exploitation. Yes, the reference is recursive - as this sort of folk are prone to be. They can also be rude, if you bother to click through, as they give not one shit about propriety - they respect skill and art and nothing else.

When I plussed this one on the Firehose I knew most of us weren't going to "get it" and that's OK. Once in a while we need an article that's for the outliers on the curve to maintain the site's "geek cred". This is one of those. Don't let it bother you. Most people aren't going to understand it. Actually, if you can begin to grasp why it's important to understand this you're at least three sigmas from the mean.

Since you don't understand why it's important, I wouldn't click through to the article and attempt to participate in the discussion with these giants of technology. It would be bad for your self-esteem.

For the audience though, these are the folk that made this stuff and if you appreciate the gifts of the IT art here is where you can duck in and say "thanks."

Re:The thing about repeating the past (4, Funny)

Anonymous Coward | about a year ago | (#44431157)

This is perhaps the most masterful parody of the passive-aggressive, cringe-inducingly self-congratulatory hipster attitude that I've seen on this site, and possibly anywhere on the internet, in some time. Bravo.

Re:The thing about repeating the past (3, Insightful)

Shavano (2541114) | about a year ago | (#44431329)

Yeah, but too many of today's programmers think they discovered America themselves.

A (very) recent OSCON talk (5, Informative)

zmughal (1343549) | about a year ago | (#44430767)

John Graham-Cumming gave a talk at OSCON 2013 titled "Turing's Curse [youtube.com] " that speaks to this same idea. Worth a watch.

Re:A (very) recent OSCON talk (1)

Samantha Wright (1324923) | about a year ago | (#44431019)

I find it curious that he didn't mention this [chris-granger.com] or this [vimeo.com] at the end, given that they're both about a year old and both flirt with death and/or the halting problem in order to offer better debugging features.

We don't shun those who should be shunned. (5, Insightful)

Anonymous Coward | about a year ago | (#44430783)

It's pretty damn obvious why this is: as an industry, we no longer shun those who should definitely be shunned.

Just look at all of the damn fedora-wearing Ruby on Rails hipster freaks we deal with these days. Whoa, you're 19, you dropped out of college, but you can throw together some HTML and some shitty Ruby and now you consider yourself an "engineer". That's bullshit, son. That's utter bullshit. These kids don't have a clue what they're doing.

In the 1970s and 1980s, when a lot of us got started in industry, a fool like that would've been filtered out long before he could even get a face-to-face interview with anyone at any software company. While there were indeed a lot of weird fuckers in industry back then, especially here in SV, they at least had some skill to offset their oddness. The Ruby youth of today have none of that. They're abnormal, yet they're also without any ability to do software development correctly.

Yeah, these Ruby youngsters should get the hell off all of our lawns. There's not even good money in fixing up the crap they've produced. They fuck up so badly and produce so much utter shit that the companies that hired them go under rather than trying to even fix it!

The moral of the story is to deal with skilled veteran software developers, or at least deal with college graduates who at least have some knowledge and potential to do things properly. And the Ruby on Rails idiots? Let's shun them as hard as we can. They have no place in our industry.

Re:We don't shun those who should be shunned. (-1)

Anonymous Coward | about a year ago | (#44430807)

who wants to deal with a crabby old fuck like you who will shit on his own mother before divulging any masterpiece trade skills?

Re:We don't shun those who should be shunned. (0)

Anonymous Coward | about a year ago | (#44430933)

He isn't a "crabby old fuck", he's explaining his frustration with the poor quality of Ruby code that young programmers with a casual interest in coding and very little experience produce in great volumes.

For once think about somebody else instead of yourself. This guy is upset, but he's explained why. You could respond creatively instead of insulting his age or his mother. Instead you gave the very response some 19-year old idiot with a joint in his mouth and 4 months of Ruby coding under his belt would say. You've contributed absolutely nothing to the discussion.

Instead of rebelling against people that are different from you, try actually listening to them for once. The original AC had a point and all you've shown is that it flew over your head.

Re:We don't shun those who should be shunned. (0)

Anonymous Coward | about a year ago | (#44431015)

There was no point; it was merely 'rargh Ruby hipsters are what's wrong with everything,' then further paragraphs in search of new insults to apply to them. If you think that's a well-reasoned argument or anything approaching a valid point, you should wipe the shit off of your face instead of slinging any more of it.

Re:We don't shun those who should be shunned. (0)

Anonymous Coward | about a year ago | (#44430959)

We have nothing to "divulge". It's all out there, available for anyone smart enough to seek it! There are hundreds of excellent books on topics like C, C++, networking, parallel programming, SQL, relational databases, 3D graphics, simulations, UI design, and whatever else you're interested in.

But we can't force you to learn that material. If you're going to insist that Ruby and JavaScript be used everywhere, all the time, without exception, then there's not much we can do to help you. If you refuse to use SQL and don't want to learn about relational database theory, and instead dick around with your NoSQL nonsense, our hands are tied.

Don't blame veteran software developers for your ignorance, especially when you try so damn hard to remain as ignorant as you possibly can be.

Re:We don't shun those who should be shunned. (1)

AK Marc (707885) | about a year ago | (#44431249)

But it must all be re-discovered. If you know what, without why, then you will not know where the true limits are, or when you can break them. Knowing they are there is a good thing, but knowing why is much much more valuable. And experience (failures) gives you that information. Others often don't document their failures well, which leads to problems for those wanting to learn the "why" without having expensive failures.

We just had an earthquake here, and one of the most heavily damaged buildings was one of the newest (and up to code, held up as earthquake ready). The structure was fine, but the fittings and fixtures didn't hold up to a shake, causing millions in damage to an almost new building.

Re:We don't shun those who should be shunned. (-1)

Anonymous Coward | about a year ago | (#44431021)

well, aren't you cute, 19 year old Ruby on Rails 'developer'

Re:We don't shun those who should be shunned. (5, Interesting)

symbolset (646467) | about a year ago | (#44431029)

These guys were openly publishing their brilliance before hiding how your shit works was even a thing. Believe it or not once upon a time if you invented a brilliant thing in code you shared it for others to build upon so you could learn and grow and benefit. Hiding it for profit wasn't even thought of yet. It wasn't just undesirable: the thought did not even occur. That was the golden age of much progress, as each genius built upon the prior - standing upon the shoulders of giants reaching for fame. Now that we're in a hiding era we go around and around reinventing the same shit over and over, suing each other over who invented it first. It is madness. In the process we have moved backwards, losing decades of developed wisdom.

Re:We don't shun those who should be shunned. (0)

Anonymous Coward | about a year ago | (#44430859)

What Ruby on Rails, Python, Clojure, etc. represent are programmers who think that changing the tools will change the problems, or somehow make them skilled. The bottom line is that if you can't be a good C++ programmer, hiding inside Ruby won't do anything for you other than make you feel comfortable that other people are just as unfamiliar with Ruby as you are.

Re:We don't shun those who should be shunned. (1)

PPH (736903) | about a year ago | (#44430899)

I Googled "A poor worker blames his tools". All I got was links to Craftsman and Harbor Freight.

Re:We don't shun those who should be shunned. (2, Insightful)

Anonymous Coward | about a year ago | (#44431011)

I can tell that you're young simply because you used C++ in a debate where someone slightly older would have used C. Either that, or you're a Windows programmer.

C utterly dominated open source (and thus the Slashdot community) until about 5 years ago. That's when the overwhelming majority of universities switched to C++. Of course, before that it was Java, so you can see the trend.

Unless you're a Windows programmer, I'd stick with C, which is infinitely simpler, and provides you freedom to maintain competency in other languages, many of which have far cooler features than C++ will ever be able to provide.

Re:We don't shun those who should be shunned. (1, Informative)

Anonymous Coward | about a year ago | (#44431043)

I would actually recommend you use C++ with only a subset of its features. C is still the most popular language globally, but C++ is a close 2nd - unfortunately, there are some benefits to C++. For the record, universities have been teaching C++ since 1995 at least; you're nearly 20 years out of date. Also, Slashdot is not a good measure of what languages people are using, as it does not represent the general coding community.

Re:We don't shun those who should be shunned. (2)

DutchUncle (826473) | about a year ago | (#44430937)

I think you were trolling, but there's a point under there. In the '70s you had to have a clue to get anything done. As more infrastructure and support systems have been built, in the interest of not having to reinvent the wheel every project, you *can* have people produce things - or appear to produce things - while remaining clueless. Flash and sizzle have been replacing the steak.

Re:We don't shun those who should be shunned. (-1)

Anonymous Coward | about a year ago | (#44430977)

There is one simple reason: Programming (and IT in general) is like meat packing. Coders too expensive? Chuck them, and get a guaranteed SLA and bug-fix level from Tata. Can't offshore? Tata will send you an H-1B who will be able to deal with almost any programming situation.

Programming is like meat packing or migrant work. It used to be something you could make a living at, but not with the influx of people who will do 10,000 lines of bug-free code a day for $10/hour, and for whom loyalty is never an issue.

Want something meaningful? Go law. That's not going off US shores anytime soon, and there is no such thing as an unemployed lawyer outside of NYC and LA.

Re:We don't shun those who should be shunned. (1)

Anonymous Coward | about a year ago | (#44431089)

As a young C++ dev, I see a lot of senior devs blind to what these new languages can do. Want a quick web server for xyz, that's like 3 lines of python. It's always a trade off between various things. The right tool for the right job; sometimes that's ruby on rails. The reality is, you can make easy money as a rails dev - instant results. Who cares if it scales. If the idea holds it can be redone properly. That doesn't really work for important systems. If you want a website that matches fedora-wearing hipsters with other fedora-wearing hipsters, based on STDs acquired and taste in music, you need a Rails dev.
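The "quick web server in a few lines of Python" claim above is roughly real, at least for static files - a sketch using only the standard library (the address and the commented-out serve call are illustrative choices):

```python
# Minimal static-file web server straight from the Python stdlib.
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Port 0 asks the OS for any free port; a real deployment would pin one.
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
print("serving on port", server.server_address[1])
# server.serve_forever()  # uncomment to actually handle requests
server.server_close()
```

Whether that scales is exactly the trade-off the comment describes - it's a prototype tool, not an architecture.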

Re:We don't shun those who should be shunned. (0)

Anonymous Coward | about a year ago | (#44431153)

I've said it before and I'll say it again: Ruby/Node + NoSQL is this generation's Visual Basic + Access.

Management will ask for a quick prototype and some teenager in college will throw it together in a couple hours on the cheap. Then they'll see a working product that they got out the door on a shoestring budget. Never mind that they're shipping the prototype, and that it's full of terrible code and is a bloated mess - we can just throw a ton of hardware at it!

Fast forward 10 years and the industry will hit a tipping point where there's just so much crap being put out that we see a huge push back towards statically typed languages, or languages that don't tolerate BS to the extent that JS, Ruby, and PHP do. At that point developers like me will don our maintenance hats and get to work converting all that crappy code into something that will stand the test of time. Last round the victory went to C++, while all the VB6 was force-upgraded to Java/C#. My only question is what language will emerge as the dominant answer to the current round of stupidity.

Re: We don't shun those who should be shunned. (2, Insightful)

Anonymous Coward | about a year ago | (#44431285)

I'm offended that Ruby keeps getting thrown in with this Node/NoSQL stuff. Node has a couple of real use cases, but outside of those it's a waste of time. NoSQL has a couple of real use cases, but outside of them it's not something you build around.

Ruby, on the other hand, is a really interesting language that has the benefit of being so flexible that it's made for creating DSLs: Puppet, Chef, Capistrano and Rails, just off the top of my head. Do some libraries have memory leak issues? Yes. Does its thread handling suck? Yes. Does JRuby fix all of that? Yes. Does Torquebox let you deploy on hardened JBoss infrastructure? Yes. Is the Rails community driving a huge chunk of the web to embrace PostgreSQL over MySQL? Yes.

Ruby, and specifically JRuby, is a powerful development platform with a great community around it. It does not deserve to be thrown in with the likes of Node as a fad. The biggest issue I see in Rails code is developers who try to do everything in Rails instead of leveraging their database, but the popularity of PostgreSQL and its features is even starting to change that.

indeed, too many bad code monkeys, few engineers (5, Insightful)

raymorris (2726007) | about a year ago | (#44431163)

Indeed. Half of today's programmers have roughly zero engineering education, and want to be called software engineers. They have no idea, no idea at all, what their data structures look like in memory and why they are so damn slow. Heck "data structure" is an unfamiliar term to many.

It's not entirely young vs old, either. I'm in my 30s. I work with people in their 50s who make GOOD money as programmers, but can't describe how the systems they are responsible for actually work.

How do we fix it? If you want to be good, studying the old work of the masters like Knuth is helpful, of course. Most helpful, I think, is to become familiar with languages at different levels. Get a little bit familiar with C. Not C# or C++, but C. It will make you a better programmer in any language. Also get familiar with high level. You truly appreciate object oriented code when you do GUI programming in a good Microsoft language. Then, take a peek at Perl's objects to see how the high level objects are implemented with simple low level tricks. Perl is perfect for understanding what an object really is, under the covers. Maybe play with microcontrollers for a few hours. At that point, you'll have the breadth of knowledge that you could implement high level entities like objects in low level C. You'll have UNDERSTANDING, not just rote repetition.

* none of this is intended to imply that I'm any kind of expert. Hundreds, possibly thousands of people are better programmers than I. On the other hand, tens of thousands could learn something from the approach I described.
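The "peek under the covers" advice above can be sketched (in Python rather than Perl, and only as an illustration of the idea, not of any particular language's object system): an "object" is just some state plus a table of functions closed over that state.

```python
# What an object "really is" under the covers: instance data plus a
# dispatch table of functions that share it. No class statement needed.
def make_counter(start=0):
    state = {"n": start}          # the "instance data"

    def incr(by=1):               # a "method": closes over state
        state["n"] += by
        return state["n"]

    def value():
        return state["n"]

    return {"incr": incr, "value": value}   # the "method table"

c = make_counter(10)
c["incr"]()
c["incr"](5)
print(c["value"]())  # 16
```

This is essentially what a class-based language generates for you; seeing it built by hand is what gives the understanding, rather than rote repetition, that the comment is arguing for.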

All things being equal scales to infinity (0)

Anonymous Coward | about a year ago | (#44430787)

If you have defined features that are more of a consequence of having a particular data model rather than any use required by the customers, you have a shit product. If your UI is coded for what's easiest for the programmer rather than what allows the user to make the most money, you have a shitty UI. If you depend on lock in for your revenue rather than usefulness, you are hoping your customers are idiots and idiots are harder to support than smart people.

Rinse and repeat.

Paging Linus (3, Insightful)

TubeSteak (669689) | about a year ago | (#44430803)

http://scottlocklin.wordpress.com/2013/07/28/ruins-of-forgotten-empires-apl-languages/#comment-6301 [wordpress.com]

Computer science worked better historically in part because humorless totalitarian nincompoopery hadn't been invented yet. People were more concerned with solving actual problems than paying attention to idiots who feel a need to police productive people's language for feminist ideological correctness.

You may now go fuck yourself with a carrot scraper in whatever gender-free orifice you have available. Use a for loop while you're at it.

Re:Paging Linus (-1)

Anonymous Coward | about a year ago | (#44431091)

Why crosspost a mentally-ill person's axe-grinding rant? If we wanted one, we could ride the subway.

Hive: distributed and free (2, Interesting)

michaelmalak (91262) | about a year ago | (#44430833)

That's genius: comparing a "$100k/CPU" non-distributed database to a free distributed database. Also no mention that, yes, everyone hates Hive, and that's why there are a dozen replacements coming out this year promising 100x speedup, also all free.

And on programming languages, Locklin is condescending, speaking from his high-and-mighty functional-programming mountain, and makes no mention of the detour the industry had to take into object-oriented programming to handle and organize the exploding size of software programs before combined functional/object languages could make a resurgence. He also neglects to mention Python, which has been popular and mainstream since the late '90s.

large teams (2)

fermion (181285) | about a year ago | (#44430835)

One of the things that I still see is the idea that when a problem exists, throw more people at it. The Mythical Man-Month pretty much threw that to the wind for software development, and I am sure there are a whole slew of books that predate it saying essentially the same thing. Yes, advancements do mean that more people can communicate more directly, but there still is a limit, and I do not think it is as great as some believe. Define interfaces, define tests that ensure those interfaces exhibit high fidelity, and let small teams, even a single person, solve a small problem. What technological advance has done is make clock cycles very cheap, so there is less excuse to go digging around trying to change code just to make it run a little faster.

Speaking of interfaces, we know that when data and processes are not highly encapsulated, it is nearly impossible to create a bug-free large project. One thing that object-oriented programming has done is to create a structure where data and processes can be hidden, so they can be changed as needed without damaging the overall software application. Now, many complain because the data is not really hidden; it is just a formality. But really, coding is just a formality, and a professional is mostly one who knows how to respect that formality to generate the most manageable and defect-free code possible. One thing that has been lost with the generation of rapid development systems quickly spouting out bad code is that code, and the ability to tweak it, is the basis of what we do.

what the lack of QA and to much auto testing? (1)

Joe_Dragon (2206452) | about a year ago | (#44431041)

With to much auto testing, people can just code to pass the test; even if someone was looking at it they would mark it a fail, but it still passes whatever the auto system thinks is good.
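The failure mode described above, code written to pass the automated checks rather than to meet the spec, is easy to sketch in Python (the function and its suite are invented for illustration):

```python
def is_prime(n):
    # "Coded to the test": correct for every case the automated suite
    # below happens to check, wrong in general.
    return n in (2, 3, 5, 7)


# The automated suite passes...
assert is_prime(2) and is_prime(7)
assert not is_prime(4) and not is_prime(9)

# ...but a human reviewer would mark this a fail:
print(is_prime(11))  # False, even though 11 is prime
```

This is why test suites complement, rather than replace, human QA: the suite only checks what someone thought to write down.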

Re:what the lack of QA and to much auto testing? (0)

Anonymous Coward | about a year ago | (#44431201)

If your code passes a test but fails in the field, it is your QA that sucks, not your programmers. If you let your programmers bypass QA, then it is your management that sucks.
In your case you might want to get a better auto spell checker, or learn to code english. It's spelled TOO.

Re:what the lack of QA and to much auto testing? (0)

Anonymous Coward | about a year ago | (#44431295)

> With to much auto testing...

Do you mean too much or is much a type of auto testing?

Optimal team size (2)

DaveAtFraud (460127) | about a year ago | (#44431095)

For any given software project there is an optimal team size. If the project is small enough, you can keep the team size down to what works with an agile development methodology. If the project is bigger than that, things get ugly. I started my career in a company that considered projects of 50 to 100 man-years to be small to medium sized. Big projects involved over a thousand man-years of effort and the projects were still completed in a few years calendar time. You can do the math as to what that means as far as number of developers working concurrently (I remember one project that had approximately 500 to 600 people working on it).

The methodologies and discipline exist to handle such projects. It isn't efficient compared to a small project, but small-project techniques can't solve big problems in time. Usually when you attempt to explain to management what it entails, you get a response of, "We don't have time for that." So the project flounders for twice as long before finally being put out of everyone's misery.

Oh yeah. Been there. Done that. Over and over. Seen it done right when people used the right approach and I've seen more than my share of death marches that only seemed to convince good developers to look for another line of work.


Re:large teams (1)

Anonymous Coward | about a year ago | (#44431227)

9 women can't have a baby in a month

'Web Based' Coding is not the same... (4, Funny)

neorush (1103917) | about a year ago | (#44430855)

We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac

I don't remember that code running cross-platform on varying architectures. The web as a platform for distribution should not be compared to an actual OS... that doesn't even make sense.

Re:'Web Based' Coding is not the same... (4, Interesting)

DutchUncle (826473) | about a year ago | (#44430975)

I don't remember that code running cross platform on varying architectures.

Yes. No code runs cross platform on varying architectures - INCLUDING the stuff that supposedly does, like Java and Javascript and all of the web distributed stuff. All of it DEPENDS on an interpretation level that, at some point, has to connect to the native environment.

Which is what BASIC was all about. And FORTRAN. Expressing the algorithm in a slightly more abstract form that could be compiled to the native environment, and then in the case of BASIC turned into interpreted code (Oh, you thought Java invented the virtual machine?)
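The point that BASIC bytecode (and UCSD p-code) anticipated the JVM can be illustrated with a toy stack machine; the opcodes here are invented, not any real bytecode:

```python
def run(program):
    """Interpret a tiny stack-machine 'bytecode': the algorithm is
    expressed once in an abstract form, then executed by a small VM
    that connects to whatever native environment it runs on."""
    stack = []
    for op, *args in program:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError("unknown opcode: %s" % op)
    return stack.pop()


# (2 + 3) * 4, expressed in the abstract form
print(run([("push", 2), ("push", 3), ("add",), ("push", 4), ("mul",)]))  # 20
```

Only the interpreter loop is machine-specific; the program itself is portable, which is the whole idea Java later popularized.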

In a lot of ways it is closed source vs open (2)

GoodNewsJimDotCom (2244874) | about a year ago | (#44430857)

There are a lot of things that if source code was available, other people could build on it and make higher quality products. In the absence of source code, people need to start from scratch often rebuilding the wheel.

Competition for money might get people to strive to make better pieces of art. But on the flip side, this same competition will sue your pants off for any reason they can find so you don't compete with them either.

And on an unrelated note, I had an idea today for a zombie video game like Groundhog Day. When you die, it starts over at the beginning of a zombie pandemic. As you die and play through it over and over, you learn secrets about where weapons and supplies are. You find tricks you can use to survive and save people. Eventually you find out who caused the zombie pandemic. You can then kill him before he goes through with it. I'm not sure an ending where you serve time in prison is a good ending, though. I didn't think it the whole way through, but it sounded like a good premise for a zombie game.

Re:In a lot of ways it is closed source vs open (0)

Anonymous Coward | about a year ago | (#44430935)

>There are a lot of things that if source code was available, other people could build on it and make higher quality products.

And yet the open source landscape is littered with the half-dead corpses of crap tools that don't quite work. The answer is that whether the source is open or closed is irrelevant. It is how much investment, money and time go into the project that determines the quality. There is good open source, and there is good closed source. A good programmer knows that source is source.

Re:In a lot of ways it is closed source vs open (2)

MrEricSir (398214) | about a year ago | (#44431123)

There are a lot of things that if source code was available, other people could build on it and make higher quality products. In the absence of source code, people need to start from scratch often rebuilding the wheel.

That doesn't seem true for the most part.

All open source does with regard to code reuse is that it makes it painfully obvious how much redundancy there is. The spat between the different Linux display managers is one recent example, but I'm sure you can think of many others.

As for why this is, there's many reasons: incompatible licenses, NIH syndrome, incompatible technologies/versions, copyright assignment, etc. Getting people to work together towards a common goal over the long term is a lot harder than slapping the right license on your code.

In Browser (5, Insightful)

MacDork (560499) | about a year ago | (#44430871)

We marvel that the runtime environment of the web browser can do things that we had working 25 years ago on the Mac.

Did the Mac, 25 years ago, allow people to load code from a remote server and execute it locally in a sandbox and in a platform independent manner all in a matter of a couple of seconds? No. No it did not.

We should then pay homage to the Mac of 25 years ago, when it basically did what Doug Engelbart demonstrated 45 years ago. [youtube.com] Nice logic you have there.

Re:In Browser (1)

DutchUncle (826473) | about a year ago | (#44431023)

25 years ago you couldn't transmit the data in a matter of seconds. You *could* execute BASIC bytecode, though. Dynamic link libraries were invented for MULTICS in the 1960s. IBM assembler macros in the '70s could do more than a C++ template function. (OTOH, IBM deliberately crippled the small-computer world by choosing an overlapped 24-bit address space instead of a 32-bit linear one (on the Motorola chips) because their mainframes were still linear 24-bit.)

Re:In Browser (1)

antifoidulus (807088) | about a year ago | (#44431325)

This, exactly this. This guy seems to elevate programming to some sort of sacred art, an art where "doing x" is the only important consideration, not how much will it cost to allow anyone in the world to do x, not how much of a pain it is to make sure everyone updates to the newest version when you want to do x and y etc.

Basically he seems to be lamenting the fact that we aren't programming solely for the sake of programming. From TFA:
Modern code monkeys don't even recognize mastery; mastery is measured in dollars or number of users, which is a poor substitute for distinguishing between what is good and what is dumb.

It's easy to be a pundit (0)

Anonymous Coward | about a year ago | (#44430873)

Look at Barnes and Noble. They proactively placed a huge bet on the Nook, cannibalizing their bread and butter business in terms of floor space, and (by most reviewers) executed it the right way. The trouble is they still lost. Had they never done the Nook people would've said, "look at those overpaid idiots. Completely asleep with no response to Amazon in the digital age". Everyone would've nodded sagely.

Microsoft has a big problem. They're completely married to Windows, but it is 1) an inferior architecture to Unix and 2) a proprietary architecture when the world wants openness. Also, big companies like the telcos and handset makers don't trust Microsoft as a partner, so MS almost has to go it alone, and that's not their strength as a company.

In real life, things can't be wrapped up easily like they can in a Harvard Business School classroom.

What Mac did you use? (0)

Anonymous Coward | about a year ago | (#44430887)

Please explain how a 25 year old Mac can play streaming media.

Re:What Mac did you use? (2)

Guspaz (556486) | about a year ago | (#44431265)

LocalTalk file sharing, AIFF files, 8-bit audio support.

it Is Un-Fucking-Believable (0)

Anonymous Coward | about a year ago | (#44430929)

that HTTP provided no optional persistent connection establishment and thus no way for the server to transmit an update to a page being viewed by a browser. The web would be 20 years farther along than it is now if that simple mechanism had been in place.
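The server-to-page push the comment wishes HTTP had from the start did eventually arrive, first via Netscape's multipart hacks and much later as Server-Sent Events over a long-lived response. A sketch of the SSE wire format (the helper name is illustrative):

```python
def sse_event(data, event=None):
    """Format one Server-Sent Events message. Sent on a persistent
    HTTP response, each blank-line-terminated block is delivered to
    the page as an update -- the mechanism the parent wanted."""
    lines = []
    if event is not None:
        lines.append("event: " + event)
    for chunk in data.splitlines():
        lines.append("data: " + chunk)
    lines.append("")  # blank line terminates the event
    return "\n".join(lines) + "\n"


print(sse_event("price=42", event="update"))
```

A browser-side `EventSource` keeps the connection open and fires a callback per event, so the server finally gets to talk first.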

Re:it Is Un-Fucking-Believable (2)

Russ1642 (1087959) | about a year ago | (#44430999)

If the people in the middle ages had only realized it was the rats, and the fleas they brought with them, they wouldn't have suffered from plague for so long. Hindsight is 20/20.

Re:it Is Un-Fucking-Believable (0)

Anonymous Coward | about a year ago | (#44431097)

that HTTP provided no optional persistent connection establishment and thus no way for the server to transmit an update to a page being viewed by a browser. The web would be 20 years farther along than it is now if that simple mechanism had been in place

I remember sitting at my desk about 17 years ago and doing this via some awful multi-part chunking thing in Netscape.

Re:it Is Un-Fucking-Believable (0)

Anonymous Coward | about a year ago | (#44431111)

Dude: IPv6. The people who designed that thought like chess masters, 7-8 moves ahead. The problem is that most of us care only about the next move.

This is how it works (1)

Anonymous Coward | about a year ago | (#44430943)

The computer industry repeats itself all the time. The current virtualization trend is like the emulation of the 60s but better.

The push toward heavyweight scripting languages with VMs over simple scripting. Watch the next trend come in a few years: simple scripting languages with fast startup for these virtualized environments.

The push for terminals happens every few years... in reality, web apps are the evolution of the terminal, because we get dumb devices to view them with.

Tablets are like computers from the mid-'80s... no multi-user mode, no multitasking, etc. New versions of Android support multiple users and multitasking. We've almost hit Windows 95 on tablets... I can't wait for NT.

Devices are getting dumber so the little people can use them. Soon we'll have complex tablets that mimic PCs from 8 years ago, and new dumber devices will be on the horizon again.

The push for multiple CPU architectures, and then the alternate push to standardize on one. Currently we've got the ARM vs. x86 war; we had PPC and Alpha in the past, or farther back the 68k, or even farther back IBM vs. DEC.

History is doomed to repeat itself.

ANSI on a BBS (1)

flyingfsck (986395) | about a year ago | (#44431331)

Well, yeah, Facebook is just a slightly improved BBS and a browser is just a slightly improved ANSI terminal.

troll (0)

Anonymous Coward | about a year ago | (#44430961)

troll article? i mean really now.

The third link (1)

phantomfive (622387) | about a year ago | (#44430969)

The third link is mainly a praise of APL, the programming language. Talk about odd.

It would be great if he'd actually given examples of why APL is a good language. I would be interested in that. Instead he says mmap is really interesting, which doesn't actually have anything to do with the programming language.

He says that old programmers have left a lot of examples of good source code. It would be great if he'd actually linked to their code.......

What past was he from? (2, Funny)

Russ1642 (1087959) | about a year ago | (#44430979)

He says system performance is the same as it was way back then. He thinks that stuff just happened immediately on those systems because they were running very efficient code. So what. Here's a simple test. Go get one of those computers and set it next to yours. Turn them both on. Mine would be at a desktop before the old one even thinks about getting down to actually running the operating system. Or start a program. On a current system it loads now. As in, right now. Back then it was a waiting game. Everything was a waiting game. He must have simply forgotten or repressed those memories.

Re:What past was he from? (3, Insightful)

siride (974284) | about a year ago | (#44431215)

Also those old programs did a lot less than many of our new programs. People often forget that when complaining about performance.

That's not to say, of course, that modern programs couldn't be written more efficiently. Because of Moore's Law and other considerations, we have moved away from spending a lot of time on performance and efficiency.

Too much theory / lack of apprenticeships in CS? (1)

Joe_Dragon (2206452) | about a year ago | (#44430985)

When people don't learn from those who have made mistakes, or who have real workplace experience (not just years of academic experience), it is easy to end up making mistakes that only seem like good ideas in theory.

It's also like some of the certification material: the book says one thing, but in the real workplace that doesn't work.

first (-1)

Anonymous Coward | about a year ago | (#44430987)

Random luck caused a couple of people to be first. They are not geniuses -- not extraordinary. Let them compete with God's intellect.

Lady Gaga (0)

Anonymous Coward | about a year ago | (#44430991)

that line about Lady Gaga made me laugh.

Net, CPU and GPU bound (5, Insightful)

AHuxley (892839) | about a year ago | (#44430993)

The best and brightest at Apple, MS, BeOS, and Linux did learn from "the great masters" -- thank you to all of them.
They faced the limits of the data that fits on a floppy or a CD.
They had to think about updates over dial-up, ISDN, and ADSL.
Their art had to look amazing and be responsive on first-gen CPUs and GPUs.
They had to work around the quality and quantity of consumer RAM.
They were stuck with early sound output.
You got a generation of GUIs that worked, file systems that looked after your data, and, over time, better graphics and sound.
You got a generation of programming options that let you shape your 3D car on screen rather than write your own GUI and then think about the basics of 3D art for every project.
They also got the internet working at home.

Re:Net, CPU and GPU bound (0)

Anonymous Coward | about a year ago | (#44431313)

And now we're losing all this, because of some idiots pushing to "save the children."


Some things are missing (4, Interesting)

Gim Tom (716904) | about a year ago | (#44431005)

As a 66 year old life long geek I actually saw many of the things I worked with decades ago reinvented numerous times under a variety of names, but there is one thing I used extensively on IBM OS/360 that I have never seen in the PC world that was a very useful item to have in my tool kit. The Generation Data Set and by extension the Generation Data Group were a mainstay of mainframe computing on that platform the entire time I worked on it. When I moved on to Unix and networks in the last few decades of my career I looked for something similar, and never found anything quite as simple and elegant (in the engineering sense of the word) as the Generation Data Set was. Oh, you can build the same functionality in any program, but this was built into the OS and used extensively. If anyone has seen a similar feature in Unix or Linux I would love to know about it.
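For readers who never met one: a Generation Data Group kept a fixed number of numbered generations of a dataset and rotated out the oldest automatically, as a built-in OS facility. A rough user-space imitation in Python (the class, naming scheme, and rotation policy are invented for illustration; nothing like this ships with Unix, which is the commenter's point):

```python
import os
import tempfile


class GenerationGroup:
    """Rough imitation of an MVS GDG: each write creates base.G0001,
    base.G0002, ...; generations beyond the limit are removed, oldest first."""

    def __init__(self, base, limit=3):
        self.base = base
        self.limit = limit
        self.next_number = 1
        self.generations = []  # newest last

    def new_generation(self, data):
        name = "%s.G%04d" % (self.base, self.next_number)
        self.next_number += 1
        with open(name, "w") as f:
            f.write(data)
        self.generations.append(name)
        while len(self.generations) > self.limit:
            os.remove(self.generations.pop(0))  # rotate out the oldest
        return name

    def current(self):
        return self.generations[-1]


workdir = tempfile.mkdtemp()
gdg = GenerationGroup(os.path.join(workdir, "payroll"), limit=2)
for day in ("mon", "tue", "wed"):
    gdg.new_generation(day)
print(os.path.basename(gdg.current()))  # payroll.G0003; G0001 was rotated out
```

The real thing was catalog-managed and could be referenced relatively (current generation, previous generation), which is what made it elegant compared to ad-hoc logrotate-style schemes.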

Re:Some things are missing (1)

tsiv (622512) | about a year ago | (#44431297)

Something like GDG would have been anathema to the original file system creators of Unix - at least through the Berkeley FFS days. I think I saw something like it on some extensions in 4.1ES. It was certainly on VMS. Maybe something like ext3cow will do what you are thinking of. Of course much of the problem is that Unix at the mothership was used extensively to submit JCL to OS/360 hosts, so all that stuff was done back on the big iron, not the cheap Unix FEs. Departments had to cough up for the unix hosts but projects paid for time on big iron. I never worked on projects that cared about that - network design not legacy OSS dev.

Remember the strategy gaming past... (1)

macraig (621737) | about a year ago | (#44431051)

... or be doomed to repeat it. And they have been for 20 years, every year. Strategy game development in particular seriously needs a persistent collective consciousness.

Re:Remember the strategy gaming past... (1)

macraig (621737) | about a year ago | (#44431063)

Well, maybe I should have been more specific and said 4X strategy game development....

Think Like an Egyptian (0)

Anonymous Coward | about a year ago | (#44431055)

Imagine how many pharaohs are rolling over in their graves watching us repeat the same "mistake" of building these rounded rectangular stones, discarded like fodder scattered on fields throughout the world as we can't seem to build a proper monument to the dead.

Oh wait, you mean we aren't trying to rebuild pyramids exactly?

In other news, a Turing machine is the mathematical definition of a computer, where all concrete computers are actually a subset of said mathematical concept. I'm not really sure why anything related to "on a computer" is patentable. If it was on a Turing machine, it would be a mathematical definition of that thing, and hence not patentable. So painful to watch as we waste our time looking at subsets of Turing machines. The only new inventions are mathematical concepts, and they are not patentable.

A symptom of popular culture in the '60s (4, Insightful)

DutchUncle (826473) | about a year ago | (#44431061)

... which really means the late '60s into the '70s. Isaac Newton said that he saw far because he stood on the shoulders of giants. Bill Gates and Steve Jobs were *proud* of knowing nothing about the industry they were trying to overturn. The same free, open, do-your-own-thing attitude (partly based on the new abundance helped along by technological advancement) that permitted startups to overtake established manufacturers, also encouraged tossing out anything "established" as "outdated" whether it was useful or not.

Darkness At Noon (-1, Offtopic)

conner_bw (120497) | about a year ago | (#44431079)

Now, every technical improvement creates a new complication to the
economic apparatus, causes the appearance of new factors and combinations,
which the masses can not penetrate for a time. Every jump of technical
progress leaves the relative intellectual development of the masses a step
behind, and thus causes a fall in the political maturity thermometer. It
takes sometimes tens of years, sometimes generations, for a peculiar level
of understanding gradually to adapt itself to the changed state of
affairs, until it has recovered the same capacity for self-government, as
it had already possessed at a lower stage of civilisation. Hence the
political maturity of the masses can not be measured by an absolute
figure, but only relatively, i.e., in proportion to the stage of
civilisation at that moment.

    When the level of mass-consciousness catches up with the objective
state of affairs, there follows inevitably the conquest of democracy,
either peaceably or by force. Until the next jump of technical
civilization - the discovery of the mechanical loom, for example - again
sets back the masses in a state of relative immaturity, and renders
possible or even necessary the establishment of some form of absolute

    This process might be compared to the lifting of a ship through a
lock with several chambers. When it first enters a lock chamber, the ship
is on a low level relative to the capacity of the chamber; it is slowly
lifted up until the water-level reaches its highest point. But this
grandeur is illusory, the next lock chamber is higher still, the levelling
process has to start again. The walls of the lock chambers represent the
objective state of control of natural forces, of the technical
civilisation; the water-level in the lock chamber represents the political
maturity of the masses. It would be meaningless to measure the latter as
an absolute height above sea-level; what counts is the relative height of
the level in the lock chamber.

    The discovery of the steam engine started a period of rapid
objective progress, and consequently, of equally rapid subjective
political retrogression. The industrial era is still young in history, the
discrepancy is still great between its extremely complicated economic
structure and the masses' understanding of it. Thus it is comprehensible
that the relative political maturity of the nations in the first half of
the twentieth century is less than it was in 200 B.C. or at the end of the
feudal epoch.

    The mistake in the socialist theory was to believe that the level
of mass-consciousness rose constantly and steadily. Hence its helplessness
before the latest swing of the pendulum, the ideological self-mutilation
of the peoples. We believed that the adaptation of the masses' conception
of the world to changed circumstances was a simple process, which one
could measure in years; whereas, according to all historical experience,
it would have been more suitable to measure by centuries. The people of
Europe are still far from having mentally digested the consequences of the
steam engine. The capitalist system will collapse before the masses have
understood it.

Climate computations (0)

Anonymous Coward | about a year ago | (#44431085)

I think I'll use a recursive function with floating point arguments for my climate computations then choose the hardware that gives me the result I want.

All Mozart's Works are Open Source (4, Insightful)

Flwyd (607088) | about a year ago | (#44431139)

You can learn a lot from Mozart because you can read all the notes he published.
You can listen to many interpretations of his works by different people.
We don't have the chance to read through 25-year-old Mac symphonies^W programs.
We aren't even writing for the same instruments.

Re:All Mozart's Works are Open Source (0)

Anonymous Coward | about a year ago | (#44431173)

If we did have a chance to read through 25-year-old Mac source code, yeah, there would be quite a few curiosity downloads, but the number of people who would take it up and spend more than 10 hours on it could probably be counted on one hand.

Software is currently not at the same stage as music in terms of notation.

Back to chariots and horses (1, Insightful)

Tony Isaac (1301187) | about a year ago | (#44431167)

Chariots were masterpieces of art. They were often made of precious metals and had elegant design work. They were environmentally friendly, using no fossil fuels whatsoever. They didn't cause noise pollution, or kill dozens of people when they crashed.

Aircraft makers should learn from the past. They have totally functional designs, no semblance of artistry anywhere. Accommodations are cramped, passengers treated like cattle.

We should go back to the good old days, things were so much better back then.

No. I've loaded decks of punch cards and distributed printouts. But who could afford a computer of their own? Nobody. He can have his good old days...no way *I* want to go back!

The past was nice but today is not then (5, Insightful)

Coditor (2849497) | about a year ago | (#44431185)

I'm old enough at 55 to remember the past, and yes, I did love APL briefly, but lamenting that the present isn't like the past is like wishing it were 1850 again so you could have slaves do all your work. Neither the web nor the modern mobile application is anything like the past, and what we use to write code today is nothing like what I started with. Trying to relive the past is why old programmers get a reputation for being out of touch. The past is important in that I learned a lot then that still rings true today, but I can say that about every year since I started. Today is a new day, every day.

Library Code Archives (1)

MarkvW (1037596) | about a year ago | (#44431251)

Libraries should be archiving (and date-stamping) code. When copyright expires, that code can form public domain building blocks for a lot of cool stuff.

The kids of the future won't have to reinvent the wheel, they'll be able to improve it.

Software patents suck.

Nothing wrong with APL, as far as it goes... (0)

Anonymous Coward | about a year ago | (#44431273)

But people can get far more done today with matlab and mathematica.

Remember the past? (1)

Anonymous Coward | about a year ago | (#44431275)

Hell, I'd be happy if they just learned about it in the first place.

I had a coworker once who claimed a computer science/engineering education. He did not know what parity was -- not "didn't know how it worked" (that would be OK, given the slow creep of IP stacks that manage that), but DID NOT KNOW WHAT IT WAS.

captcha: restart

Re:Remember the past? (1)

flyingfsck (986395) | about a year ago | (#44431347)

Hmm, so he also didn't understand CRCs, error correction codes, encryption and compression - anything in a Galois field. Where did he buy his CS degree?