Slashdot: News for Nerds


What Are the Genuinely Useful Ideas In Programming?

Unknown Lamer posted about 10 months ago | from the lattices-are-pretty-cool dept.

Programming 598

Hugh Pickens DOT Com writes "Computer Scientist Daniel Lemire has had an interesting discussion going on at his site about the ideas in software that are universally recognized as useful. 'Let me put it this way: if you were to meet a master of software programming, what are you absolutely sure he will recommend to a kid who wants to become a programmer?' Lemire's list currently includes structured programming; Unix and its corresponding philosophy; database transactions; the 'relational database;' the graphical user interface; software testing; the most basic data structures (the heap, the hash table, and trees) and a handful of basic algorithms such as quicksort; public-key encryption and cryptographic hashing; high-level programming and typing; and version control. 'Maybe you feel that functional and object-oriented programming are essential. Maybe you think that I should include complexity analysis, JavaScript, XML, or garbage collection. One can have endless debates but I am trying to narrow it down to an uncontroversial list.' Inspired by Lemire, Philip Reames has come up with his own list of 'Things every practicing software engineer should aim to know.'"

598 comments

I can think of one that Steve Jobs disagreed with (4, Funny)

Cryacin (657549) | about 10 months ago | (#45066473)

The "on" button.

FUDD'S FIRST LAW OF OPPOSITION (2)

Jeremiah Cornelius (137) | about 10 months ago | (#45066565)

"If you push something hard enough, it will fall over"

I think also,

"It goes in, it must come out"
-- Testlcle's deviant to Fudd's law.

Re:FUDD'S FIRST LAW OF OPPOSITION (1)

Jane Q. Public (1010737) | about 10 months ago | (#45066861)

"It goes in, it must come out"

... is actually Teslacle's Deviant to Fudd's Law, with an "L". But hey... could be a typo.

---
"Close B clothes mode on Deputy Dan."

Re:I can think of one that Steve Jobs disagreed wi (5, Insightful)

Z00L00K (682162) | about 10 months ago | (#45066623)

I would say that one of the most important things in programming is to break a problem down into parts that are useful and easy to manage. It doesn't matter which language you code in. It's very much like building with Lego: you have more use for all those 4x2 bricks than for any other brick, while the humongous bricks are use-once. A right-sized brick can be copied and pasted into future code as well, possibly tweaked a bit to suit the new environment. In the process of breaking down a problem, define interfaces. Design the important interfaces so they can remain stable over time; that makes maintenance easier.

The second most important thing is to learn what compiler warnings mean and how to fix them. Here strong typing isn't your enemy but your friend, since it tells you about problems before you even run the code.

Third is to learn which known algorithms are already out in the wild so you don't have to invent them yourself. Quicksort has been implemented a thousand times already, so there's no need to implement it again; just find the library you need. If you are developing a commercial app, start with Apache-licensed projects, since that license is permissive about how the libraries may be incorporated. The LGPL is also good. But leave the license headaches to someone else to sort out if you aren't sure.
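A minimal illustration of reaching for the library instead of reinventing it (Python here, purely for illustration):

```python
# Don't reimplement quicksort; the standard library's sort is
# battle-tested (CPython uses Timsort) and handles the edge cases.
data = [5, 3, 8, 1, 9, 2]
print(sorted(data))                  # ascending order
print(sorted(data, reverse=True))    # descending, same routine
```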

These are the useful ideas I try to follow, the rest is just a mixture of ingredients and seasoning to get something running.

Remember: You can build a great machine with simple stuff or a stupid machine with expensive stuff.

Re:I can think of one that Steve Jobs disagreed wi (3, Insightful)

Jane Q. Public (1010737) | about 10 months ago | (#45066891)

"Quicksort is already implemented a thousand times, so there's no need to implement it again, just find which library you need."

Yes, that's true, but we're talking about education here, not building websites.

If you're a coder, and you don't know how to BUILD a hash table from genuinely fundamental, low-level components, or if you can't do a quicksort from those same fundamental building blocks, guess what? I won't hire you.

It's great to be able to buy or borrow a used V8, but if you don't know how to build one, you're not going to be my mechanic.
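For the record, a hash table built from those fundamental pieces is small enough to sketch. This is a hypothetical minimal version (separate chaining, no resizing), not production code:

```python
class HashTable:
    """Minimal separate-chaining hash table, for illustration only."""

    def __init__(self, nbuckets=8):
        self.buckets = [[] for _ in range(nbuckets)]

    def _bucket(self, key):
        # Map the key's hash onto one of the fixed buckets.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite an existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)
```

A real implementation also needs resizing and a good hash function; the point is that none of it is magic.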

Re:I can think of one that Steve Jobs disagreed wi (1)

acscott (1885598) | about 10 months ago | (#45066939)

Yes. Divide and conquer (or structured systems programming), and make your inheritor's life good (good karma). A good compiler is good. Typing here is a computer science term, not a reference to pecking at the keyboard. Algorithms is a funky term; ultimately it is a reference to doing things in discrete mathematics. Mathematicians really do give "it" away. "...the rest is just a mixture of ingredients..." is a reference to making things taste good. It's relevant, informative, honest, and a description of doing the right thing. It's a moral attitude. To sum it up for people not in the culture: do good, engineer; break problems up into their component parts. To lazily invoke a giant: beauty is harmony of parts in a whole (Aristotle, somewhere).

It's not just about the concepts (4, Informative)

Krishnoid (984597) | about 10 months ago | (#45066479)

if you were to meet a master of software programming, what are you absolutely sure he will recommend to a kid who wants to become a programmer?

Make it clear that 'mastery' of programming involves wisdom and experience beyond knowledge of techniques. My go-to example for this is Code Complete.

Great read: The Pragmatic Programmer (3, Interesting)

gentryx (759438) | about 10 months ago | (#45066617)

Exactly. Code Complete is a great book. I liked The Pragmatic Programmer -- from Journeyman to Master [amazon.com] even better. It's slightly more meta, but the tips inside are really universal.

Some are even applicable beyond software engineering, e.g. "don't repeat yourself": don't keep two versions of the same information (say, your source in your repository and its documentation on your website) in two different places, because the probability that they will diverge over time equals 1. It's better to make one the master copy and derive the other from it. I recommend this book to all my students.
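A trivial sketch of the master-copy idea (names invented for illustration):

```python
# Single source of truth: the version string lives in exactly one place.
VERSION = "1.4.2"

def user_agent():
    return "myapp/" + VERSION          # derived from the master copy

def about_text():
    return "MyApp version " + VERSION  # also derived, so it can't diverge
```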

Re:It's not just about the concepts (1)

bondsbw (888959) | about 10 months ago | (#45066657)

Practice makes perfect.

It always amazes me how I can go back to a language I knew so well 5 years ago, yet I make mistakes you'd see from a first year CS student.

Re:It's not just about the concepts (4, Insightful)

Anonymous Coward | about 10 months ago | (#45066811)

Practice makes perfect.

It always amazes me how I can go back to a language I knew so well 5 years ago, yet I make mistakes you'd see from a first year CS student.

Nah, practice makes better... nobody's perfect.

Recursion (X) (0)

Anonymous Coward | about 10 months ago | (#45066675)

Recursion = Recursion (X-1) + Recursion (X-2)
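For anyone taking the joke literally: real recursion needs base cases to terminate. A textbook example (Fibonacci):

```python
def fib(x):
    if x < 2:       # base cases: the joke above omits these,
        return x    # so it would recurse forever
    return fib(x - 1) + fib(x - 2)
```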

Continuous Integration (1)

Anonymous Coward | about 10 months ago | (#45066481)

I'm talking about two points here:
1. The tools: have a CI server to do automatic validation.
2. The concept: try to integrate early, integrate often. Your automatic validation will catch errors and you'll be able to solve them quickly. If you wait a long time to integrate, you will just have a huge bunch of stuff to integrate, and possibly too many errors.

The second point is why I prefer not using feature branches in version control (and yes, I use DVCS).

Re:Continuous Integration (2)

Anrego (830717) | about 10 months ago | (#45066631)

I'm all for CI (Hudson/Jenkins being my preferred tool), but no feature branches? Lunacy!

I strongly believe integration has to be balanced. Sometimes people have to be able to run on their own for a bit and then merge their work back in. Every commit going straight into trunk and being expected to work adds a lot of overhead and kills momentum.

The most basic (0)

Anonymous Coward | about 10 months ago | (#45066483)

Problem solving.
The ability to think through how to put pieces together to solve a problem.
Without this ability, nothing else is useable.
McLae

Regular Expressions (5, Informative)

Kohath (38547) | about 10 months ago | (#45066495)

are the most useful thing I learned in the last 5 years.

Re:Regular Expressions (5, Funny)

jlar (584848) | about 10 months ago | (#45066529)

"Some people, when confronted with a problem, think "I know, I'll use regular expressions." Now they have two problems."

- Jamie Zawinski

Re:Regular Expressions (0, Offtopic)

Anonymous Coward | about 10 months ago | (#45066595)

You, sir, deserve to be modded down.

Re:Regular Expressions (1)

gl4ss (559668) | about 10 months ago | (#45066605)

yeah that'll get the kid to get interested in programming!

how about just simple stuff? nibbles.bas . something that makes him see that the programs are not magic but something he can understand, modify and create.

some people went on to recommend shit like db guides and other stuff. that'll come naturally if the kid gets it into his head that programs are just programs. if he gets it into his head that some programs are magic created by pixies over at oracle land then he'll always be just a user. might just as well recommend learning vbscript for excel (has its uses, if he is a _user_ of excel).

most regexp users are just using "magic"; don't go that route to introduce someone to programming. show him that he, or anyone, can be a programmer and not just a user of programs.

Re:Regular Expressions (3, Insightful)

H0p313ss (811249) | about 10 months ago | (#45066887)

are the most useful thing I learned in the last 5 years.

Regular expressions are incredibly powerful and useful, if you know how to use them and how to not abuse them.

Much like welding torches.
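A small example of the kind of job where a regex earns its keep: extracting a well-delimited token from flat text, rather than trying to parse a whole grammar (log line invented for illustration):

```python
import re

line = "2013-10-07 12:34:56 ERROR disk full on /dev/sda1"

# Good fit: pull out one fixed, well-delimited field.
level = re.search(r"\b(ERROR|WARN|INFO)\b", line).group(1)
print(level)  # ERROR
```

The abuse case is the opposite: trying to parse nested or recursive structure (HTML, source code) with regexes, which they fundamentally cannot handle.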

Brian Harvey CS 61A (1)

Anonymous Coward | about 10 months ago | (#45066497)

Structure and Interpretation of Computer Programs

https://www.youtube.com/watch?v=GAHA0nJv8EA

Re:Brian Harvey CS 61A (1)

narcc (412956) | about 10 months ago | (#45066869)

Overrated

lowest common denominator (3, Insightful)

Laxori666 (748529) | about 10 months ago | (#45066499)

By definition, most programmers are not masters of software programming. So why is Daniel trying to compile a list that everybody will agree with? That would be a list of what every non-master programmer agrees a master programmer should know, which is different than a list of what a programmer should know to be a master programmer...

As for my approach, it would be to list the qualities that make learning JavaScript, XML, relational databases, etc. easy enough to do; by which I mean, the qualities that allow a programmer to teach himself these things, to the master level if his tasks require it. A master programmer doesn't have to know Objective-C or JavaScript, but he sure as heck better be able to learn how to use them effectively if he needs to.

A Lost Art? (0)

Anonymous Coward | about 10 months ago | (#45066523)

Clean Coding

Re:lowest common denominator (1)

Jane Q. Public (1010737) | about 10 months ago | (#45066931)

"So why is Daniel trying to compile a list that everybody will agree with? That would be a list of what every non-master programmer agrees a master programmer should know, which is different than a list of what a programmer should know to be a master programmer..."

No, it isn't. It's INTENDED to be the opposite: a list that master programmers can agree every non-master should know.

You have to admit: there's a lot of master-level programming talent who check in to Slashdot.

Indirection and caching (0)

Anonymous Coward | about 10 months ago | (#45066505)

The two duct tape equivalents for IT.

"Indirection can solve any problem in software, except for too many layers of indirection."

- Anonymous

Can't Trick Me! (4, Funny)

Erik_Kahl (260470) | about 10 months ago | (#45066507)

I'm not writing your "How to be a Programmer in 20 Minutes!" ebook for you. You'll have to spend 20 years learning like the rest of us.

Does it do what you want? (0)

Anonymous Coward | about 10 months ago | (#45066511)

No? Then keep hacking.

That's all you need to know in order to learn programming.

Re:Does it do what you want? (1)

Anonymous Coward | about 10 months ago | (#45066801)

Given the average quality of software, I fear that's indeed the mindset of the majority of programmers.

databases (0)

Anonymous Coward | about 10 months ago | (#45066513)

database transactions; the 'relational database;' ... I am trying to narrow it down to an uncontroversial list

Well he's already failed. Databases are a niche topic that doesn't belong in an "uncontroversial" list of things that every software engineer needs to know.

Re: Database Transactions (2)

EdmundSS (264957) | about 10 months ago | (#45066689)

... are too specific. The concept of an Atomic Operation is important; database transactions are just a domain-specific example.

Re:databases (5, Interesting)

reluctantjoiner (2486248) | about 10 months ago | (#45066691)

Surely any programmer ought to know the underlying principles that make databases work (i.e. ACID, etc.), even if they never intend to go anywhere near multithreading. Even in single-threaded programs, knowing what ACID is and how it works can help. Have you never done a write() and wondered where the data you sent to disk went?

Perhaps the relational calculus isn't strictly necessary, but if knowing the theory behind relations keeps engineers from naively treating databases as data garbage dumps, it's worth it.
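A minimal, self-contained demonstration of the A in ACID, using SQLite (table and names invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

try:
    with conn:  # the with-block is one transaction: all or nothing
        conn.execute("UPDATE accounts SET balance = balance - 150 "
                     "WHERE name = 'alice'")
        raise RuntimeError("simulated crash mid-transfer")
except RuntimeError:
    pass

# Atomicity: the partial transfer was rolled back, not left half-done.
print(dict(conn.execute("SELECT name, balance FROM accounts")))
```

Without the transaction, the simulated crash would have left alice 150 short with the money gone nowhere.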

Re:databases (5, Insightful)

dgatwood (11270) | about 10 months ago | (#45066741)

Well he's already failed. Databases are a niche topic that doesn't belong in an "uncontroversial" list of things that every software engineer needs to know.

When it was part of our CS required curriculum, I suspected I would never use it, but it turns out that the vast majority of projects I've been involved with have used databases in some way, and one of them even involved some pretty serious database query optimization. As far as I can tell, unless you pretty much code exclusively down at the kernel level, you're going to eventually be asked to work on some project involving databases. They're the glue that holds technology together. Outside of a handful of niche fields, I'd be surprised if any programmer managed to go more than five years out of school without having to work with one.

Also, once you understand databases conceptually, everything starts to look like a special case of a database. This is a good thing. C data structures? Table records. Pointers? Relations. And so on. It ends up helping you understand complex problems even if you're one of those rare people who never ends up touching an actual database.

After 30 years of programming (5, Insightful)

jgotts (2785) | about 10 months ago | (#45066519)

Forget about having to learn any specific language or environment. You should be able to pick up any language or environment on the job.

You need to learn how to plan, estimate how long that plan will take to complete, and finish it on time. Very few programmers I've worked with are any good at estimating how much time they will take to complete anything. The worst offenders take double the amount of time they say they will.

Forget about specific computer science trivia. You can look that all up, and it's all available in libraries with various licenses. When you're starting a new job, refresh yourself on how that problem is already being solved. If you need a refresher on a specific computer science concept, take some time and do so.

With this advice you won't burn out at age 25.

Re:After 30 years of programming (5, Funny)

pezpunk (205653) | about 10 months ago | (#45066577)

the WORST offenders take twice as long as their estimate? you know some pretty good programmers!

Re:After 30 years of programming (4, Insightful)

tlhIngan (30335) | about 10 months ago | (#45066625)

Forget about specific computer science trivia. You can look that all up, and it's all available in libraries with various licenses. When you're starting a new job, refresh yourself on how that problem is already being solved. If you need a refresher on a specific computer science concept, take some time and do so.

Well, it's helpful to have a basic understanding of Big-O and of the complexity of common algorithms. It's also worthwhile to know when it really doesn't matter: using bubble sort is bad, but if you know you're only ever sorting 10 items, it'll work in a pinch.

Knowing this can affect which algorithms you choose and even how you architect the system; perhaps the data you're dealing with ends up triggering quicksort's worst-case behavior far more often than expected.

And you laugh, but I've seen commercial software falter because of poor choices of algorithms - where doing a 1 second capture is perfectly fine, but a 10 second capture causes it to bog down and be 100 times slower.

Or the time when adding a line to a text box got progressively slower, because each added line caused a memory allocation and a memory copy, making the whole operation quadratic.
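That text-box anecdote is the classic accidentally-quadratic pattern: copying everything accumulated so far on every append. A hedged sketch of the two shapes:

```python
def build_slow(lines):
    text = ""
    for line in lines:
        # Each += can copy the whole buffer built so far: O(n^2) total.
        text += line + "\n"
    return text

def build_fast(lines):
    # Accumulate pieces and copy once at the end: O(n) total.
    return "".join(line + "\n" for line in lines)
```

Both produce the same string; only the work done to get there differs, and the gap grows with input size.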

Next, understand OS fundamentals and how the computer really works. Things like how virtual memory and page files operate, how the heap works and synchronization and even lock-free algorithms and memory barriers. It's difficult to learn on your own - it takes a lot of time to sit down and really understand how it works and why it works, and even then it can take 3-4 different methods of explanation until it clicks.

Concurrent programming isn't hard especially if concurrency was taken into account when the system was designed. Adding concurrency to a non-concurrent system though is a huge, difficult and trouble-prone process. Especially once bit-rot has set in and you find 10 different ways of getting at the variable.

Re:After 30 years of programming (0)

GoodNewsJimDotCom (2244874) | about 10 months ago | (#45066663)

Sir, you're the type of person I'd like to work with (and hey, I'm looking). I typically pick up a language in about two weeks, enough to crank out reasonable code, then move toward mastery over time. I agree that having a general idea of how to code matters more than any specific language. However, HR doesn't see it that way today. They demand you know all sorts of specific technologies, as if you couldn't pick them up on the job.

I have 10+ years in C/C++, but I haven't touched it much in 4 years, so I'm rusty for tests, though I can still code in it. I applied to EA, and they had a multiple-choice test on arcane syntax. It was pretty horrible. I'm reminded of the time Einstein couldn't remember how many feet were in a mile. His reply was: "I don't know. Why should I fill my brain with facts I can find in two minutes in any standard reference book?" We're still in an era where hiring practices reject competent people yet often hire inept ones. At least Google finally realized those put-you-on-the-spot brain teasers aren't good for interviewing.

Hey, if anyone is looking to hire a solid programmer who's been programming steadily for 24 years, send me an email Jimjobseek AT yahoo.com

I'm a very strong programmer, with a specialization in rapid prototyping (speed development).

Re:After 30 years of programming (2)

Anrego (830717) | about 10 months ago | (#45066669)

Yup!

Estimation is a dark art, and most people suck at it.

My best approach has always been to break a problem down into chunks, then break those chunks into smaller chunks, until I'm left with something granular enough to actually base an estimate on.

I'd expand on this by adding that estimation isn't the only important skill; self-management is equally important. Is that chunk you thought would take a week now in its third week? How is that going to affect the schedule? (Hint: "I'll just do the next chunk faster" is not a good path to go down.) Recognizing how you are performing, and identifying that you are about to blow the schedule while it can still be mitigated, instead of being just as surprised as your boss when it's way too late, is a really good thing.

Re:After 30 years of programming (3, Insightful)

bzipitidoo (647217) | about 10 months ago | (#45066785)

Estimates? How long does it take to solve a maze? Take all the correct turns, and you'll be done in a few minutes. One wrong turn, and it could take hours. How do you estimate that?

A big reason why estimates tend to be low is the tendency to overlook all the little problems that shouldn't happen but do. It's not just that libraries have bugs too. Systems sometimes go down. Networks can corrupt data. I could never get any programming or system administration work done quickly, because I'd always run into 3 or 4 other things that I had to fix or work around first. A hard drive crash is when you find out that the DVD drive which was fine for light usage overheats during an install, that the updated OS breaks part of the software, and that it was only inertia keeping the server on the misconfigured network, so once it was powered down another server would grab its IP address, and so on. I once had to work around the libpng site being blocked for "inappropriate content" by the latest update of the company's web monitoring software. But those are relatively trivial problems that don't blow estimates by orders of magnitude.

Your advice is fine for hacks who need to grind out simple business logic, or glue a few well-tested and well-traveled library functions together, and who don't have to think much about performance or memory constraints. There's very little uncertainty in that kind of work. But when you're trying to do new things, trying out new algorithms with no idea whether they will even work, let alone be fast enough, you're back in the maze. We could have gotten astronauts to the moon 5 years sooner if we had known beforehand which directions were blind alleys.

Code is not done until it's done. (1)

Anonymous Coward | about 10 months ago | (#45066877)

The worst you have seen is someone taking twice their estimate. You must be working on some easy problems.

The practical issue is that when solving real problems, the work of writing code is 90% design, and since no one has done that design before, estimating it with any accuracy is a pure guessing game. The harder and more challenging the problem, the harder it becomes to give an accurate estimate.

That doesn't mean you can't control your schedule. You can always drop features or build incremental developments and deliver what you have. But giving a precise estimate is nonsense.

In fact I have found that I have gotten better and not worse with age, as the problems I am tackling have gotten harder, and correctness has become more important.

Re:After 30 years of programming (0)

Anonymous Coward | about 10 months ago | (#45066901)

+1 from an AC who's coded for 26+ years.

XML? Really? (2, Insightful)

Anonymous Coward | about 10 months ago | (#45066531)

What is this, the 90s? In a world with JSON and YAML, why should we bother learning XML for anything other than legacy systems?

Don't ask me how I know this (3, Insightful)

Crash McBang (551190) | about 10 months ago | (#45066535)

People don't know what they want till they see it. Prototype early, prototype often.

Re:Don't ask me how I know this (5, Insightful)

Anrego (830717) | about 10 months ago | (#45066685)

But never make the prototype too good.

"You need 12 weeks to turn it into actual software? this works fine!"

the most basic data structures (4, Insightful)

TopSpin (753) | about 10 months ago | (#45066539)

the heap, the hash table, and trees

There is nothing basic about these. Each is the subject of on-going research and implementations range from simplistic and poor to fabulously sophisticated.

An important basic data structure? Try a stack.

Yes, a stack. How many supposed graduates of whatever programming-related education you care to cite are basically ignorant of the importance and qualities of a stack? Some of the simplest processors implement exactly one data structure, a stack, from which all other abstractions must be derived. A stack is what you resort to when you can't piss away cycles and space on ....better.... data structures. Yet that feature pervades essentially all ISAs, from the 4004 to every one of our contemporary billion-transistor CPUs.
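In its pure form the structure really is this small (Python list as backing store, purely for illustration):

```python
class Stack:
    """The whole contract: push and pop at one end, LIFO order."""

    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()   # last in, first out

    def __len__(self):
        return len(self._items)
```

Everything from call frames to expression evaluation can be derived from just these two operations.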

Re:the most basic data structures (1)

techno-vampire (666512) | about 10 months ago | (#45066731)

Not just a stack, a queue as well. Of course, if you look at them right, they're two sides of the same coin because they both make you deal with events or data in an ordered manner.

Back when I was actually doing programming, I spent several years working with Dan Alderson, [wikipedia.org] at JPL. His data structure of choice was always the linked list, either one way or two, depending on what was needed at the time. Yes, it's an abstraction layer, but it's a very useful one when you don't know how many items you're going to be working with from one run to another and it's not exactly hard to add to your toolkit.
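A minimal one-way linked list in the same spirit, showing why it works when you don't know the item count in advance (illustrative sketch):

```python
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

def prepend(head, value):
    # O(1) insertion at the front; no need to know the final length.
    return Node(value, head)

def to_list(head):
    # Walk the chain node by node until we fall off the end.
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out
```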

Re:the most basic data structures (1)

Jane Q. Public (1010737) | about 10 months ago | (#45066807)

"There is nothing basic about these."

Correct. I wondered about that myself. Those are important but hardly basic.

A stack is a basic structure, as you say. A string is a basic structure, with several different but classic storage schemes. (Please don't start in with "That's a data type, not a structure." Nonsense. They're all structures.)

Integers are basic. As are longints, etc. They may vary from platform to platform, but they're still pretty basic. As is the byte, for example. The meaning of that has changed over the years (now meaning just 8 bits, period... it used to be variable depending on the architecture), but it's pretty basic and quite important.

Arrays are basic.

Hash tables are not "basic". In fact, how to implement an efficient hash table in code used to be a rather lengthy section of basic CS classes. There is nothing even remotely basic about them; the data and the underlying code are quite complex compared to truly basic structures. Some modern languages just make them SEEM simple. But I know few coders today who could actually build a good one from really basic components.

Saying that a hash table is a basic data structure is like saying Quicksort is a native operator.

Stuff you should learn (1)

Anonymous Coward | about 10 months ago | (#45066541)

Learn an assembly language. Program hardware. Learn what goes under the hood.

Re:Stuff you should learn (3, Insightful)

Anrego (830717) | about 10 months ago | (#45066699)

I don't know if that's really good advice anymore.

I mean, if it's an interest, sure. Personally I love that stuff.. but unless you plan on doing low-level coding for a career, modern programming is so far abstracted from machine language that knowing what's going on down there is in most cases interesting trivia at best.

Re:Stuff you should learn (0)

Anonymous Coward | about 10 months ago | (#45066873)

Learn C. It's like a portable assembly language.

Er (0)

Anonymous Coward | about 10 months ago | (#45066543)

OP sounds like something a CS major would go on about to impress people. And asking implies you think we're "master programmers", and are really playing on our hubris while asking our opinion.

But for the sake of pretending, I'd first briefly explain the Turing machine, and that it can run any possible computer program, then introduce them to Python. After that, they can explore to fill in the gaps. No one likes to be lectured, especially not new programmers.

Reasonable list (0)

Anonymous Coward | about 10 months ago | (#45066545)

Some stuff I would probably add:

Basic understanding of caching/Virtual memory and the concept of temporal and spatial locality
Ability to write/read simple examples of RISC-style assembly (MIPS, etc.), in order to understand what exactly it is that a compiler outputs
Passing by reference vs. Passing by value.
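Python blurs the textbook reference-vs-value distinction (it passes object references by value), but the practical difference shows up as mutation versus rebinding:

```python
def mutate(items):
    items.append(4)   # mutates the object the caller also sees

def rebind(items):
    items = [0]       # rebinds a local name only; the caller is unaffected

data = [1, 2, 3]
mutate(data)
rebind(data)
print(data)           # the append stuck, the rebind did not
```

In a language with true by-value semantics (e.g. C structs), even the `mutate` version would leave the caller's copy untouched.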

On a par with those others, I think... (1)

Empiric (675968) | about 10 months ago | (#45066547)

Software componentization.

I'll use a generic term here, whether the specific implementation at hand is via OOP, a defined API to a DLL, a web service, or any of myriad other concrete forms.

The list as presented is indeed core to being able to develop an application up to a certain level of complexity, in a learning framework where one can conceivably, and can be expected to, understand or visualize the entirety of the project.

Beyond that level of complexity, though, the ability to deal efficiently with software functionality as a "black box", for which the developer need not know all the particulars of implementation, allows mental resources to be allocated to the particular requirements of the task at hand. Being able to use others' code in this manner, and to write code in this manner, is to my mind a crucial ability for contributing to an actual professional development team.

Uncontroversial? (1)

Urkki (668283) | about 10 months ago | (#45066551)

GUI programming is a relatively new thing and still rapidly changing. Change that item to event-driven application architecture, which almost all GUI apps share.

Relational databases, well... NoSQL would be enough for this list.

Model-view patterns seem to be missing from the list, and design patterns in general. Also totally missing from TFS are client-server models, be it the backend-frontend model of web apps or traditional TCP/IP protocols.

That being said, if you master every aspect of Qt5 including QML and using network and databases, and development for different mobile devices, while also embracing the functional aspect of JavaScript (for QML and HTML), you should be pretty well set for everything on every platform ;-)

Impossible (0)

goodmanj (234846) | about 10 months ago | (#45066553)

Any discipline that can list its core ideas in a short Slashdot post isn't worth studying.

You mean basic stuff? (1)

NotSoHeavyD3 (1400425) | about 10 months ago | (#45066557)

You know, rules of thumb like "NO, YOUR FUNCTION SHOULDN'T BE 5000 FUCKING LINES LONG. IT SHOULD FIT ON THE GOD DAMN SCREEN." or "KNOCK IT OFF, STUPID. YOU SHOULD HAVE WRITTEN ONE FUNCTION TO DO THAT, NOT CUT AND PASTED THOSE 10 LINES OF CODE 70 TO 80 TIMES." (Before anyone asks: yes, I've literally seen programmers, software engineers, or whatever we're calling them, with years of experience, make amateurish blunders such as the above. And yes, they really did do those 2.)

Re:You mean basic stuff? (5, Insightful)

dgatwood (11270) | about 10 months ago | (#45066897)

IMO, a function should be as long as it needs to be, and no shorter or longer. If the most easily understood way to express a concept is as a 5,000 line function, then you should write a 5,000 line function. Splitting up a function based on some arbitrary length limitation can only lead to less readable code.

For example, my record is almost 5,500 lines. The entire function is basically a giant switch statement in a parser (post-tokenization). The only way you could make that function significantly shorter would be to shove each case into a function, and all that would do is make it harder to follow the program flow through the function for no good reason. At any given moment, you're still going to be staring at exactly one of those cases per token (plus a little code on either end), so having each case in a separate function just adds execution overhead without improving readability, and it makes debugging harder because you now have to pass piles of flags around everywhere instead of just using a local variable.

One of the data structures for the function in question is almost 1200 lines long by itself (including anywhere from two to fifteen lines of explanation per field, because I wanted to make sure this code is actually maintainable). By itself, the initializer for that data structure cannot meet your "fits on one screen" rule, even with most of the fields auto-initialized to empty. And there's no good way to shrink that data structure. It is a state object for a very complex state machine. The code to interpret the end state is over a thousand lines of code by itself.

In short, those sorts of rules simply don't make sense once the problem you're trying to solve becomes sufficiently complex. They're useful as guidelines for people who don't know how to write code yet—to help them avoid making obscenely complex functions when the functionality is reasonably separable into smaller, interestingly reusable components, to keep themselves from shooting themselves in the foot by repeating code where they should call a shared function, and so on. However, IMO, if you're still thinking about rules by a few years out of school, they're probably doing you more harm than good, causing you to write code with unnecessary layers of abstraction for the sake of abstraction.

quicksort better than OOP? (1)

phantomfive (622387) | about 10 months ago | (#45066569)

He lists quicksort as more useful than OOP? Quicksort is cool, but.......

I'm a C programmer, so I understand people who say OOP isn't everything, but the concepts you learn with OOP are a whole lot more useful than the concepts behind quicksort. You even use those concepts when you use a language like C. It could be argued that learning algorithmic complexity is more important than OOP, but that is different from learning quicksort.

Re:quicksort better than OOP? (0)

Anonymous Coward | about 10 months ago | (#45066817)

Quicksort is a prototypical example of recursion. Recursion is one of the key things every programmer should understand.
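To make that concrete, here's a minimal sketch of quicksort's recursive structure (Python chosen for brevity; the thread doesn't specify a language):

```python
def quicksort(xs):
    """Recursively sort a list: partition around a pivot, then sort each side."""
    if len(xs) <= 1:                         # base case: nothing left to sort
        return xs
    pivot, rest = xs[0], xs[1:]
    left = [x for x in rest if x < pivot]    # everything smaller than the pivot
    right = [x for x in rest if x >= pivot]  # everything greater or equal
    return quicksort(left) + [pivot] + quicksort(right)
```

This copies sublists rather than partitioning in place, so it's the teaching version of the algorithm, not the fast one, but the self-similar "solve two smaller instances of the same problem" shape is exactly the recursion being praised.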

The Closure (5, Insightful)

gillbates (106458) | about 10 months ago | (#45066571)

The most useful concept I've ever come across is the notion of a closure in Lisp: a function packaged together with the environment it was defined in, so the entire operating state of the function travels with it. This, and the McCarthy Lisp paper (1960!), which explains how a Lisp interpreter can be built from just a handful of primitive operations, is well worth the read. It is from the fundamental concepts first pioneered in Lisp that all object-oriented programming paradigms spring; if you can understand and appreciate Lisp, the notions of encapsulation, data hiding, abstraction, and privacy will become second nature to you.

Furthermore, if you actually put forth the time to learn lisp, two things will become immediately apparent:

  1. A language's usefulness is more a matter of the abstractions it supports than the particular libraries available, and
  2. Great ideas are much more powerful than the language used to express them.
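The closure idea can be sketched in a few lines (Python rather than Lisp, purely as an illustration):

```python
def make_counter():
    count = 0                  # state captured by the closure, invisible outside
    def next_value():
        nonlocal count         # the inner function carries its environment with it
        count += 1
        return count
    return next_value

tick = make_counter()
```

Each call to `tick()` returns 1, 2, 3, ..., and a second counter from `make_counter()` keeps its own independent `count` — encapsulation and data hiding with no class machinery at all, which is exactly the point about ideas outliving any particular language.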

In Stroustrup's "The C++ Programming Language", there are numerous examples of concise, elegant code. These spring from the concept of deferring the details until they can be deferred no more: the top-down approach results in code which is easily understood, elegant, efficient, robust, and maintainable.

Many years ago, a poster commented that the work necessary to complete a particular project was the equivalent of writing a compiler; he was trying to emphasize just how broken and unmaintainable the code was. The irony in his statement is that most professional projects are far more complex than a compiler needs to be; because he didn't understand how compilers worked, he thought of them as necessarily complex. However, the operation of a compiler is actually quite simple to someone who understands it; the McCarthy paper shows how high-level code constructs can be broken down into lower-level machine instructions, and Knuth implements a MIX interpreter in a few pages of "The Art of Computer Programming." Neither building a compiler nor building an interpreter is a monumental undertaking if you understand the principles of parsing and code structure, i.e., what it means for something to be an operator versus, say, an identifier.

Ideas are powerful; the details, temporarily useful. Learn the ideas.

Re:The Closure (1)

Guy Harris (3803) | about 10 months ago | (#45066921)

A language's usefulness is more a matter of the abstractions it supports than the particular libraries available

Not that the libraries aren't useful. All other things being equal, including the stuff you don't have to write in that language because somebody else has already done so, yes, a language with clean abstractions is better than a language with not so clean abstractions, but the libraries aren't exactly chopped liver.

Many years ago, a poster commented that the work necessary to complete a particular project was the equivalent of writing a compiler; he was trying to emphasize just how broken and unmaintainable the code was. The irony in his statement is that most professional projects are far more complex than a compiler needs to be; because he didn't understand how compilers worked, he thought of them as necessarily complex. However, the operation of a compiler is actually quite simple to someone who understands it; the McCarthy paper shows how high-level code constructs can be broken down into lower-level machine instructions, and Knuth implements a MIX interpreter in a few pages of "The Art of Computer Programming." Neither building a compiler nor building an interpreter is a monumental undertaking if you understand the principles of parsing and code structure, i.e., what it means for something to be an operator versus, say, an identifier.

You are aware that a compiler is more than just a parser, right? There are these minor sub-projects of a compiler called "code generators" and "optimizers". Understanding the principles of parsing and code structure won't help you understand the principles of optimizing and generating code.

I'm best (0)

Anonymous Coward | about 10 months ago | (#45066573)

I'm the best programmer on the planet. That's why God chose me for His temple.

I took 5 asm courses, a compiler course, and an operating system course. Meat and potatoes.
I learned to read datasheets. I studied digital design. I am an expert on the boundary between hardware and software. Plus all my job experience.
I was a professional OS developer at age 20, on Ticketmaster's VAX OS, in VAX asm. I was paid for x86 asm, 8051 asm, 68000 asm, PIC asm, and Atmel AVR asm projects. I am a master of bare-metal asm programming, interacting with hardware with no operating system.

Code Comments (4, Insightful)

gimmeataco (2769727) | about 10 months ago | (#45066575)

What about code comments? I hated doing it as a beginner, but when you're working with something big or revisiting code after a long period of time, it's invaluable.

Re:Code Comments (1, Informative)

UnknownSoldier (67820) | about 10 months ago | (#45066911)

Comments should explain WHY, as in "work around X because commit #123 made assumption Y".
If I want to know HOW the code works, I should be able to read the code. (Barring "tricky" optimizations, of course.)

I would also add _proper_ naming of variables: not everything abbreviated with its vowels stripped out, which won't make any sense in 6 months.

Take the time to do it RIGHT the first time. 90% of the time on a job you are maintaining code, not writing new code.
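A small sketch of the WHY-vs-HOW distinction (Python; the billing-feed scenario is invented for illustration):

```python
def normalize_rate(percent_per_year):
    # WHY: the upstream billing feed (hypothetical) reports rates as annual
    # percentages, so convert to a monthly fraction before any compounding.
    # A HOW comment ("divide by 100, then by 12") would just restate the code;
    # the WHY above records the external assumption the code depends on.
    return percent_per_year / 100 / 12
```

The code is trivially readable on its own; the only thing worth writing down is the assumption that would otherwise be lost in 6 months.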

Search + Search & Replace (1)

ClassicASP (1791116) | about 10 months ago | (#45066579)

Getcha a really nice GREP tool of some sort where you can search through source code easily and do replacements when necessary. I use mine all the time. Comes in quite handy when you're working on code written by other developers.

KISS principle (2, Insightful)

Anonymous Coward | about 10 months ago | (#45066581)

As simple as possible to accomplish the task correctly, and no simpler. That and find a career that hasn't been decimated in the last 5 years.

Wait that is what I would tell a kid. An expert would probably preach the methodology he has the most vested interest in.

Abstractions (1)

gehrehmee (16338) | about 10 months ago | (#45066585)

Seek to understand the various levels of abstraction available in any problem -- and to solve the problem at the appropriate level. It's a complicated lesson, and something that will take a long time to get right, but once you do, so many things fall out naturally, like clean and reusable code, the need for different languages and tools, design patterns, and on and on and on.

Re:Abstractions (1)

ggpauly (263626) | about 10 months ago | (#45066707)

Correct you are

Pattern Recognition (0)

Anonymous Coward | about 10 months ago | (#45066589)

If there's one thing I've found most useful in developing software and designing computer systems (or any system, for that matter), it's a keen sense of pattern recognition.

How stuff works. (0)

Anonymous Coward | about 10 months ago | (#45066591)

For the low level, missing (at least from the summary): arrays, caches, and concurrency.

For higher level: abstraction, interfaces and algorithms. (Why cover quicksort if you don't include the concept of having an algorithm, with an interface that provides an abstraction?)

Some of my other favorites: idempotency, stateless vs. stateful, mutability, policy vs. implementation, lazy vs. eager, protocols (as a concept), cache-oblivious algorithms.

Also: higher order functions, dynamic dispatch, polymorphism.

Lastly, screw teaching quicksort. Timsort shows something very important in software engineering: hide your optimizations behind a good abstraction, because they make one hell of a mess.

I can give you one tip (1)

GoodNewsJimDotCom (2244874) | about 10 months ago | (#45066597)

If you're not sure whether you want to use a pointer-driven data structure or an array, favor the array. In languages like C++, when you use pointers, you can dereference one and your code will work perfectly for the time being. So when you're doing your standard debug cycles of checking your code for errors, these bugs can pass by, and not be seen again for days or weeks. A dereferenced pointer can end up causing your code to crash abruptly over completely arbitrary things. The only way to debug this is to read through your entire code base. I also suggest prayer. I find prayer works.

Anyway, if you favor arrays over pointer-driven data structures, you simply won't blow your leg off, as they say in C++. Your errors will be easier to track down. You have a lot less chance of encountering a bug that you can't easily debug and that crashes your entire program randomly. And finally, you'll become an array master (since you use them so much), so your code will come out faster.

Now some people might disagree with this, but it works for me. Sure, there are times you absolutely need a creatively designed linked list, but if you can do it with an array and an unnoticeable slowdown, go for it!

Re:I can give you one tip (0)

Anonymous Coward | about 10 months ago | (#45066905)

I think you mean a dangling pointer when you say a dereferenced pointer. A dereferenced pointer is completely harmless if it is not dangling (or uninitialized, but only a crappy programmer would leave pointers uninitialized).

Anyway, there's another reason to use arrays when possible: They are generally faster than pointer-based data structures.

However, if you get into pointer problems for something as simple as a linked list, consider yourself a bad programmer.

Yes. (0)

Anonymous Coward | about 10 months ago | (#45066611)

Yes.

Input validation (5, Insightful)

KevMar (471257) | about 10 months ago | (#45066619)

I think he was missing input validation from his list. The idea that you can never trust user input and you must validate it. The idea that you should white list what you want instead of black list the things you don't want. Ideas that consider the security of the system and not just the working condition of it.
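A minimal whitelist sketch (Python; the username rule is a made-up example):

```python
import re

# Whitelist: describe exactly what is allowed and reject everything else,
# rather than trying to enumerate every dangerous input (a blacklist).
USERNAME_RE = re.compile(r'[a-z][a-z0-9_]{2,15}')

def validate_username(raw):
    """Return the username if it matches the whitelist, else raise ValueError."""
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError("invalid username")
    return raw
```

The whitelist mindset is the point: the pattern above never has to anticipate SQL metacharacters, shell syntax, or path separators, because anything outside `[a-z0-9_]` is rejected by construction.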

Encryption (1)

AHuxley (892839) | about 10 months ago | (#45066621)

There's a great future in encryption. Think about it. Will you think about it?
Beware of charming open source projects via forums and irc rooms. You will be falling into a personality cult.
Become a polymath, you will need history, science, art, music, math and much more to fully understand the needs of your buzzword touting clients.
Beware of the security clearance - years of your life will be dedicated to larger projects with real world stories, the press and politics.
Be aware of who you're working for and their reputation. Their global vision, lies and domestic issues will be all over your clean CV for years.
Study your boss the way a team studied your CV. Who is really paying the bills.
Programming languages are like fads, short-lived and very amazing. Learn the educational basics, then see what interests you.

Actors and State (2)

KagatoLNX (141673) | about 10 months ago | (#45066643)

Most programming confusion I've had to combat in the workplace comes from a fundamental misunderstanding of the two most basic facts of your program:

1. Where is your program's state stored? (NOTE: 90% of the time it's "the call stack" and 90% of the time that's the wrong place to put it.)
2. Where in your code is execution happening?

Threaded program generating weirdness? It's probably because you can't answer those two questions. Distributed program a mess to debug? I bet your state is smeared all over the place. Is your code a pain to port to an evented architecture? Bet you modeled your state badly. Can't map some failure to a certain, detectable set of circumstances? I guarantee your answer starts there.

For me, the answer to understanding these problems was found in functional programming. The no-side-effects stuff forces you to make all of your state concrete and also to deeply understand what the call stack does for you (or, more often than not, *to* you). The cruel reality, though, is that applying this hard-won knowledge *doesn't* seem to lie in functional programming (or, at least, not in Lisp, Scheme, Haskell, and crew).

If you're an academic, start with Hoare's Communicating Sequential Processes (http://www.usingcsp.com/cspbook.pdf), then learn Erlang (or Go, with a heavy emphasis on goroutines). If you're less Ivory Tower, try to grok this blog entry (http://blog.incubaid.com/2012/03/28/the-game-of-distributed-systems-programming-which-level-are-you/), then learn Erlang (or Go, with a heavy emphasis on goroutines).
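For a taste of the message-passing style without leaving a mainstream language, here is a rough Python analogue of an Erlang process or goroutine, using a thread and queues (a sketch only, not how you'd structure production code):

```python
import queue
import threading

def actor(inbox, outbox):
    """All state lives in local variables; the only sharing is via messages."""
    total = 0                      # explicit, private state: easy to answer
    while True:                    # "where is my state?" and "where am I executing?"
        msg = inbox.get()
        if msg is None:            # sentinel message: report and shut down
            outbox.put(total)
            return
        total += msg

inbox, outbox = queue.Queue(), queue.Queue()
t = threading.Thread(target=actor, args=(inbox, outbox))
t.start()
for n in (1, 2, 3):
    inbox.put(n)                   # communicate by sending, not by sharing memory
inbox.put(None)
t.join()
result = outbox.get()
```

Because `total` is never touched by two threads, both of the parent's questions have crisp answers, which is the whole argument for the actor/CSP style.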

The first law of code: make sure it works (2)

mveloso (325617) | about 10 months ago | (#45066649)

EOM

Maintenance (1)

xbytor (215790) | about 10 months ago | (#45066655)

Understand that every line of code you write may/will be maintained by somebody else. Architect/Design/Write/Test accordingly.

Recursion (1)

EdmundSS (264957) | about 10 months ago | (#45066681)

In the list of concepts, along with OO, closures & functions, recursion is essential knowledge. If in doubt, see this post.

Re:Recursion (1)

ggpauly (263626) | about 10 months ago | (#45066805)

and higher order functions

Re:Recursion (0)

Anonymous Coward | about 10 months ago | (#45066833)

In the list of concepts, along with OO, closures & functions, recursion is essential knowledge. If in doubt, see this post.

Teach them to think logically (0)

Anonymous Coward | about 10 months ago | (#45066693)

To solve a problem you have to understand it.

An idiot with the best tools in the world will still be bested by a thinking man with only what he needs.

Arrays start at zero (1)

spitzak (4019) | about 10 months ago | (#45066695)

Arrays start at zero and go through size-1.

Left-corner design (4, Insightful)

steveha (103154) | about 10 months ago | (#45066697)

The most important book I read as a beginning software developer was Software Tools in Pascal [google.com] . That book teaches a technique it calls "left-corner design". It's kind of a rule-of-thumb for how to do agile development informally.

The basic idea: pick some part of the task that is both basic and essential, and implement that. Get it working, and test it to make sure it works. Now, pick another part of the task, and implement as above; continue iterating until you either have met all the specs or are out of time.

If you meet all the specs, great! If you are out of time, you at least have something working. The book says something like "80% of the problem solved now is usually better than 100% of the problem solved later."

For example, if you are tasked with writing a sorting program, first make it sort using some sort of sane default (such as simply sorting by code points). Next add options (for example, being able to specify a locale sort order, case-insensitive sorting, removing duplicate lines, pulling from multiple input files, etc.). A really full-featured sorting program will have lots of options, but even a trivial one that just sorts a single way is better than nothing.
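A sketch of that progression (Python; the option names are invented): the first iteration is just a plain code-point sort, and each option is bolted on afterward without disturbing what already works:

```python
def sort_lines(lines, *, case_insensitive=False, unique=False):
    """Left-corner version: plain code-point sort first; options added later."""
    key = (lambda s: s.lower()) if case_insensitive else None
    result = sorted(lines, key=key)       # iteration 1: this alone already ships
    if unique:                            # iteration 2: drop adjacent duplicates
        deduped = []
        for line in result:
            if not deduped or deduped[-1] != line:
                deduped.append(line)
        result = deduped
    return result
```

Had time run out after the `sorted` call, you'd still have a working sort — which is the "80% solved now beats 100% solved later" claim in miniature.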

Also, the book recommends releasing early and often. If you have "customers" you let them look at it as early as possible; their feedback may warn you that your design has fatal flaws, or they may suggest features that nobody had thought of when imagining how the software should work. I have experienced this, and it's really cool when you get into a feedback loop with your customer(s), and the software just gets better and better.

Way back in high school, I tried to write a program to solve a physics problem. I hadn't heard of "left-corner design" and I didn't independently invent it. I spent a lot of time working on unessential features, and when I ran out of time I didn't have a program that did really anything useful.

This is the one thing I would most wish to tell a new software developer. Left-corner design.

P.S. Software Tools in Pascal is a rewrite of an older book, Software Tools [google.com] , where the programs were written in a language called RATFOR [wikipedia.org] . Later I found a copy of Software Tools and found it interesting what things were easier to write in Pascal vs. what things were easier in RATFOR... and when I thought about it I realized that everything was just easier in C. C really is the king of the "Third-Generation" languages.

Re:Left-corner design (2)

davide marney (231845) | about 10 months ago | (#45066837)

Great advice. In picking what is both "basic and essential" I simply look at dependencies, using two perspectives: first, gating dependency -- what, if it doesn't work, would prevent other things from working -- then, structural dependency -- what is the thing that other things are built on.

First satisfy yourself that there are several approaches to meeting the gating dependencies; this will actually give you the best all-around sense of what the design of your application will likely become. Then, start from the bottom of the structural dependencies and work your way up. Happily, in most languages you'll find that library support is strongest at the bottom, so your useful level of work will proceed very quickly, and you will be satisfying all the gating dependencies early in the process.

Done like this, it gets increasingly easier to call in extra hands to help with the work, because what remains is more obvious/common.

Non-blocking I/O (0)

Anonymous Coward | about 10 months ago | (#45066723)

Node may feel too 'advanced' for the absolute beginner, but I would recommend any course mention input/output issues and how the use of callbacks can be so important.

Regular Expressions (2)

tbg58 (942837) | about 10 months ago | (#45066735)

Learn to manipulate text and you can do just about anything.
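For example (Python's `re` module; the date-rewriting task is arbitrary), one substitution can restructure text that would take a loop and a pile of string slicing otherwise:

```python
import re

def iso_to_dmy(text):
    """Rewrite ISO dates like 2013-10-09 as 09/10/2013, wherever they appear."""
    return re.sub(r'(\d{4})-(\d{2})-(\d{2})', r'\3/\2/\1', text)
```

The capture groups do the parsing and the replacement template does the reassembly, which is why regex fluency pays off across editors, grep, and nearly every scripting language.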

FP; project scheduling (1)

Allen Akin (31718) | about 10 months ago | (#45066745)

The behavior of floating-point arithmetic. This wasn't covered in my university curriculum, and was necessary in tasks including graphics, machine learning, and finance once I got into the industrial world.

As a manager, possibly the single most important skill for me was learning the ways to estimate the time required for complex programming tasks. Once you're tackling problems beyond the scope of a single programmer, coordination is required, and schedule estimation is essential.

Recommendations for a kid? (4, Insightful)

Rinikusu (28164) | about 10 months ago | (#45066747)

1) Learn how to have fun. Even when you're mired knee-deep in a gigantic pile of horseshit that is a 10-15 year old VBA/Access/Excel monstrosity written by a half dozen people and commented occasionally in non-english languages, if you can't find a way to enjoy the challenge, your career will be short-lived and miserable.

2) Work on things that interest you. When you invariably get to the point where you think "I wonder if there's an easier way to do this", google it. Chances are, you're right. With any luck, you can avoid #1 above if you really work at it. I don't think anyone ever woke up in the morning thinking "Fuck this fun shit, I want to be a Programmer III at some .gov contractor", but you never know. I happen to like maintaining and bug-fixing code more than I do architecting full solutions, but I'll accept that I'm an odd bug.

I mean, I'm reading some of these comments and thinking "yeah, if you pushed Knuth or SICP on me when I was 10, you'd have killed any interest I had in computers." Instead, I was POKEing away on my C64, etc. If anything, figuring out how to solve logic puzzles, breaking down problems, etc., was much more fun for me in the 5th and 6th grade than reading some of the current compsci literature that I *still* require significant motivation to go through. I'm not saying it's not important, I'm just saying that a middle schooler (a kid) may not be all that willing to put in that kind of work, and waiting until college wouldn't be so bad. You know, going back to the comment thing... a lot of these comments sound like the anti-jock jocks. I remember these kids whose parents were forcing them to play sports and everything revolved around these kids playing sports, but the kids themselves weren't having any fun at all. Now we have nerds acting like jock parents, treating their kids in the same manner. It can't be healthy.

The nature of reality. (2)

VortexCortex (1117377) | about 10 months ago | (#45066749)

Cybernetics. Information Theory. Done. Everything else in the Universe can be mastered & described with these, even physics and quantum physics.

I taught my 12 year old cousin how to build an adder circuit using a neural net visualization simulation. Then we built it IRL with contractors, then transistors, then we cheated and used some pre-made ICs. He "invented" the Von Neumann architecture by himself -- well, I should say it was self-evident given the core concepts of cybernetics and information theory. He was as excited as ever to learn assembly languages and higher-level languages, marveling at the heights everyone has gotten to from those core principles. There is no system or language he cannot now understand. Even encryption systems he can grok using information theory / entropy.

I really couldn't imagine introducing anyone to even higher mathematics without giving them the tools to apply and visualize it first... I mean, Turing gave us a universal calculator and we still teach times tables first?

Re:The nature of reality. (1)

VortexCortex (1117377) | about 10 months ago | (#45066765)

s/contractor/contactor/

Damn it, that's the 2nd time this week I trusted spell check erroneously on that same word. "add to dictionary"

the absolute most useful idea.... (0)

Anonymous Coward | about 10 months ago | (#45066771)

DOCUMENTATION. whether for yourself later on, other developers, or the oft-neglected end user.

Re:the absolute most useful idea.... (0, Funny)

Anonymous Coward | about 10 months ago | (#45066849)

Learn to read code, loser.

Keep narrowing - a LOT (4, Interesting)

SuperKendall (25149) | about 10 months ago | (#45066775)

Half of those things are NOT things I would "recommend to a kid who wants to become a programmer".

Version control, UNIX philosophy, software testing - it's too much! Someone who wants to be a programmer should start to learn programming first, and then they can explore the wild twists and ideas that surround the thing once they have a grasp of what programming means to them.

I would say even starting languages to recommend depend on the person. If a programmer likes some languages and not others later in life, why should that not be true from day one because of how they like to think? What if you are recommending a language that will turn them off programming forever?

It would almost be best to develop a kind of programming sandbox, that would let them use a variety of languages and concepts (like functional or OO or even, yes, procedural!) and see the path they take most naturally.

State machines, Pushdown Automata, Turing Machines (0)

Anonymous Coward | about 10 months ago | (#45066787)

These lead into the ideas of computability, complexity, and space (memory) / time constraints. All other algorithms can be understood better, and their suitability evaluated, based on one's understanding of these.
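A finite state machine in miniature (Python; the "even number of 1s" language is a textbook example): the entire behavior is a transition table plus a current state.

```python
# DFA over the alphabet {'0', '1'} accepting strings with an even number of 1s.
TRANSITIONS = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd',  '0'): 'odd',  ('odd',  '1'): 'even',
}

def accepts(s):
    state = 'even'                        # start state; also the accept state
    for ch in s:
        state = TRANSITIONS[(state, ch)]  # one table lookup per input symbol
    return state == 'even'
```

Everything else in the hierarchy (pushdown automata, Turing machines) is this picture plus progressively more memory, which is why the trio is such a compact lens for judging algorithms.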

Reflection is one of the most useful concepts. (1)

aristotle-dude (626586) | about 10 months ago | (#45066827)

Reflection, as found in .NET, allows you to write extremely generic yet powerful code easily. For example, I have used it to write a method that can write any class (or list thereof) to a CSV file of any format by passing in a field-to-value map object along with the object data you want to store. The beauty of this is that you can store the format of the file externally and make changes to it without having to rewrite the code. The only time you would need code changes would be if you needed additional properties added to the class you want to write out.
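The analogous trick in Python (the `Order` class and field map here are hypothetical) uses `getattr` to keep the column layout out of the code entirely:

```python
import csv
import io

class Order:
    def __init__(self, order_id, total):
        self.order_id = order_id
        self.total = total

def to_csv(objects, field_map):
    """field_map: CSV column name -> attribute name; could be loaded from config."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(field_map.keys())
    for obj in objects:
        # Reflection: look attributes up by name instead of hard-coding them.
        writer.writerow(getattr(obj, attr) for attr in field_map.values())
    return buf.getvalue()
```

Renaming a column or reordering the file is then a config change, not a code change, which is exactly the maintainability win described above.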

Fun & Logic & Games (2)

VernonNemitz (581327) | about 10 months ago | (#45066839)

My introduction to programming went something like this: I started with a small home computer, pre-IBM-PC. It came with interpreted BASIC. Lucky me: Consumer Reports recommended that particular computer above the others for 2 years, because the manual, teaching BASIC, was very engaging.

But let's back up a bit. A few years before computers started getting into homes, a logic game hit the market ("Master Mind") and received a Game of the Year award. That game made it fun to practice logic. Computer programming relies a great deal on thinking logically, so practicing logic is very important -- but having fun practicing logic is better still.

Later, I jumped from BASIC into assembly language programming -- and once again I was lucky. Not only was the processor of that home computer designed to make programming it easy, the book I got on how to do that was among the best in the field.

So, what sorts of programs did I write? Games, of course! They can incorporate all the important features of any other application: user interface, visual (and sometimes also audio) outputs, data storage/retrieval, occasional oddball hardware manipulation, algorithms. And, of course, the more developed a game-under-construction became, the more fun it was to test/play. We all learn best by doing. Later, when I began doing professional work, and it wasn't always fun, I could still "hold my own" because the foundations of that career had been solidly emplaced.

Two words (0)

Anonymous Coward | about 10 months ago | (#45066845)

Bubble sort. Best idea ever.

Design/architecture (2)

Todd Knarr (15451) | about 10 months ago | (#45066895)

The single most important thing isn't about software engineering specifically; it's the ability to analyze a problem, break it down into its component parts, and work out a structure for your solution that solves the problem well. Just like the most important part of building a house isn't anything to do with actually building it: it's deciding what kind of house you're going to build, what rooms it's going to need, and how they're going to be arranged. You need a very different house if you want it to support a family with 2 or 3 kids vs. just a single person or a couple without children. If you don't get that right, all the technical chops in the world later on won't make up for being hamstrung by a bad overall design. Lack of that ability has been the root cause behind something like 75% of the "software engineering" failures I've had to deal with.

Programming Books are full of Fairy Tales (0)

Anonymous Coward | about 10 months ago | (#45066919)

Reality is defined by what the machine does, not what books say the machine should do.

Sanity (4, Funny)

stanlyb (1839382) | about 10 months ago | (#45066947)

You would be surprised how many delusional, idiot developers are out there.