
When Good Interfaces Go Crufty 668

An anonymous reader writes "A good article over at mpt on why good interfaces go crufty." A nice read before or after a visit to the Interface Hall of Shame.
This discussion has been archived. No new comments can be posted.

  • by Longinus ( 601448 ) on Friday November 08, 2002 @06:37AM (#4624043) Homepage
    Is the interface a command line? If not, it's crufty ;-)
  • A bit of history... (Score:5, Informative)

    by acehole ( 174372 ) on Friday November 08, 2002 @06:39AM (#4624047) Homepage
    GEOS on the Commodore 64 had a good interface for its time.

    Yes kids, that's right: the C64 had a GUI.
  • by Anonymous Coward on Friday November 08, 2002 @06:41AM (#4624054)

    In Vernor Vinge's sci-fi novel A Fire Upon the Deep [rambles.net], he presents the idea of software archeology. Vinge's future has software engineers spending large amounts of time digging through layers of decades-old code in a computer system, like layers of dirt and rubbish in real-world archeology, to find out how, or why, something works.

    So far, in 2002, this problem isn't so bad. We call such electronic garbage cruft [ddj.com], and promise to get rid of it [refactoring.com] someday. But it's not really important right now, we tell ourselves, because computers keep getting faster [intel.com], and we haven't quite got to the point where single programs [mozilla.org] are too large [blogspot.com] for highly coordinated teams to understand.

    But what if cruft makes its way into the human-computer interface? Then you have problems, because human brains aren't getting noticeably faster. (At least, not in the time period we're concerned with here.) So the more cruft there is in an interface, the more difficult it will be to use.

    Unfortunately, over the past 20 years, I've noticed that cruft has been appearing in computer interfaces. And few people are trying to fix it. I see two main reasons for this.

    1. Microsoft and Apple don't want to make their users go through any retraining, at all, for fear of losing market share. So rather than make their interfaces less crufty, they concentrate on making everything look pretty [theregister.co.uk].

    2. Free Software developers have the ability to start from a relatively cruft-free base, but (as a gratuitously broad generalization) they have no imagination whatsoever. So rather than making their interfaces more usable, they concentrate on copying whatever Microsoft and Apple are doing [phrasewise.com], cruft and all.

    Here are a few examples of interface cruft.

    1. In the 1970s and early '80s, transferring documents from a computer's memory to permanent storage (such as a floppy disk) was slow. It took many seconds, and you had to wait for the transfer to finish before you could continue your work. So, to avoid disrupting typists, software designers made this transfer a manual task. Every few minutes, you would save your work to permanent storage by entering a particular command.

      Trouble is, since the earliest days of personal computers, people have been forgetting to do this, because it's not natural. They don't have to save when using a pencil, or a pen, or a paintbrush, or a typewriter, so they forget to save when they're using a computer. So, when something bad happens, they've often gone too long without saving, and they lose their work [adweek.com].

      Fortunately, technology has improved since the 1970s. We have the power, in today's computers, to pick a sensible name for a document, and to save it to a person's desktop as soon as she begins typing, just like a piece of paper in real life. We also have the ability to save changes to that document every couple of minutes (or, perhaps, every paragraph) without any user intervention.

      We have the technology. So why do we still make people save each of their documents, at least once, manually? Cruft. (A sketch of such automatic saving appears after these examples.)

    2. The original Macintosh, which introduced graphical interfaces to the general public, could only run one program at a time. If you wanted to use a second program, or even return to the file manager, the first program needed to be unloaded first. To make things worse, launching programs was slow, often taking tens of seconds.

      This presented a problem. What if you had one document open in a program, and you closed that document before opening another one? If the program unloaded itself as soon as the first document was closed, the program would need to be loaded again to open the second document, and that would take too long. But if the program didn't unload itself, you couldn't launch any other program.

      So, the Mac's designers made unloading a program a manual operation. If you wanted to load a second program, or go back to the file manager, you first chose a menu item called Quit to unload the first program. And if you closed all the windows in a program, it didn't unload by itself; it stayed running, usually displaying nothing more than a menu bar, just in case you wanted to open another document in the same program.

      Trouble is, the Quit command has always annoyed and confused people, because it's exposing an implementation detail: the lack of multitasking in the operating system. It annoys people because occasionally they choose Quit by accident, losing their careful arrangement of windows, documents, toolboxes, and the like with an instantaneity that is totally disproportionate to how difficult it was to open and arrange them all in the first place. And it confuses people because a program can be running without any windows being open, so, while all open windows may belong to the file manager (which is now always running in the background), menus and keyboard shortcuts get sent to the invisible program instead, producing unexpected behavior.

      Fortunately, technology has improved since 1984. We have the power, in today's computers, to run more than one program at once, and to load programs in less than five seconds.

      We have the technology. So why do we still punish people by including Quit or Exit menu items in programs? Cruft.

    3. As I said, the original Macintosh could only run one program at a time. If you wanted to use a second program, or even return to the file manager, the first program needed to be unloaded first.

      This presented a problem when opening or saving files. The obvious way to open a document is to launch it (or drag it) from the file manager. And the obvious way to save a document in a particular folder is to drag it to that folder in the file manager. But on the Mac, if another program was already running, you couldn't get to the file manager. What to do? What to do?

      So, the Mac's designers invented something called a file selection dialog, or filepicker: a lobotomized file manager for opening and saving documents when the main file manager wasn't running. If you wanted to open a document, you chose an Open menu item, and navigated your way through the filepicker to the document you wanted. Similarly, if you wanted to save a document, you chose a Save menu item, entered a name for the document, and navigated your way through the filepicker to the folder you wanted.

      Trouble is, this interface has always been awkward to use, because it's not consistent with the file manager. If you're in the file manager and you want to make a new folder, you do it one way; if you're in a filepicker and you want to make a new folder, you do it another way. In the file manager, opening two folders in separate windows is easy; in a filepicker, it can't be done.

      Fortunately, technology has improved since 1984. We have the power, in today's computers, to run more than one program at once, and to run the file manager all the time. We can open documents from the file manager without quitting all other programs first, and we can save copies of documents (if necessary) by dragging them into folders in the file manager.

      We have the technology. So why do we still make people use filepickers at all? Cruft.

    4. This last example is particularly nasty, because it shows how interface cruft can be piled up, layer upon layer.

      1. In Microsoft's MS-DOS operating system, the canonical way of identifying a file was by its pathname: the concatenation of the drive name, the hierarchy of directories, and the filename, something like C:\WINDOWS\SYSTEM\CTL3DV2.DLL. If a program wanted to keep track of a file (in a menu of recently opened documents, for example), it used the file's pathname. For backward compatibility with MS-DOS, all Microsoft's later operating systems, right up to Windows XP, do the same thing.

        Trouble is, this system causes a plethora of usability problems in Windows, because filenames are used by humans.

      2. What if a human renames a document in the file manager, and later on tries to open it from that menu of recently-opened documents? He gets an error message complaining that the file could not be found.

      3. What if he makes a shortcut to a file, moves the original file, and then tries to open the shortcut? He gets an error message, as Windows scurries to find a file which looks vaguely similar to the one the shortcut was supposed to be pointing at.

      4. What happens if he opens a file in a word processor, then renames it to a more sensible name in the file manager, and then saves it (automatically or otherwise) in the word processor? He gets another copy of the file with the old name, which he didn't want.

      5. What happens if a program installs itself in the wrong place, and our fearless human moves it to the right place? If he's lucky, the program will still work, but he'll get a steady trickle of error messages the next time he launches each of the shortcuts to that program, and the next time he opens any document associated with the program.

      6. Fortunately, technology has improved since 1981. We have the power, in today's computers, to use filesystems which store a unique identifier for every file, separate from the pathname: such as the file ID [apple.com] in the HFS and HFS+ filesystems, or the inode [webopedia.com] in most filesystems used with Linux and Unix. In these filesystems, shortcuts and other references to particular files can keep track of these unchanging identifiers, rather than the pathname, so none of those errors will ever happen. (A small inode demonstration appears at the end of this comment.)

        We have the technology. So why does Windows still suffer from all these problems? Cruft.

        Lest it seem like I'm picking on Microsoft, Windows is not the worst offender here. GNU/Linux applications are arguably worse, because they could be avoiding all these problems (by using inodes), but their programmers so far have been too lazy. At least Windows programmers have an excuse.

      7. To see how the next bit of cruft follows from the previous one, we need to look at the mechanics of dragging and dropping. On the Macintosh, when you drag a file from one folder to another, what happens is fairly predictable.

      8. If the source and the destination are on different storage devices, the item will be copied.
      9. If the source and destination are on the same storage device, the item will be moved.
      10. If you want the item to be copied rather than moved in the latter case, you hold down the Option key.
      11. Windows has a similar scheme, for most kinds of files. But as I've just explained, if you move a program in Windows, every shortcut to that program (and perhaps the program itself) will stop working. So as a workaround for that problem, when you drag a program from one place to another in Windows, Windows makes a shortcut to it instead of moving it, and lands in the Interface Hall of Shame [iarchitect.com] as a result.

        Naturally, this inconsistency makes people rather confused about exactly what will happen when they drag an item from one place to another. So, rather than fixing the root problem which led to the workaround, Microsoft invented a workaround to the workaround. If you drag an item with the right mouse button, when you drop it you'll get a menu of possible actions: move, copy, make a shortcut, or cancel. That way, by spending a couple of extra seconds choosing a menu item, you can be sure of what is going to happen. Unfortunately this earns Microsoft another citation in the Interface Hall of Shame for inventing the right-click-drag, perhaps the least intuitive operation ever conceived in interface design. Say it with me: Cruft.

      12. It gets worse. Dragging a file with the right mouse button does that fancy what-do-you-want-to-do-now-menu thing. But normally, when you click the right mouse button on something, you want a shortcut menu: a menu of common actions to perform on that item. But if pressing the right mouse button might mean the user is dragging a file, it might not mean you want a shortcut menu. What to do, what to do?

        So, the Windows designers made a slight tweak to the way shortcut menus work. Instead of making them open when the right mouse button goes down, they made them open when the right mouse button comes up. That way, they can tell the difference between a right-click-drag (where the mouse moves) and a right-click-I-want-a-shortcut-menu (where it doesn't).

        Trouble is, that makes the behavior of shortcut menus so much worse that they end up being pretty useless as an alternative to the main menus.

      13. They take nearly twice as long to use, since you need to release the mouse button before you can see the menu, and click and release a second time to select an item.

      14. They're inconsistent with every other kind of menu in Windows, which opens as soon as you push down on the mouse button.

      15. Once you've pushed the right mouse button down on something which has a menu, there is no way you can get rid of the menu without releasing, clicking the other mouse button, and releasing again. This breaks the basic GUI rule that you can cancel out of something you've pushed down on by dragging away from it, and it slows you down still further.

      16. In short, Windows' native shortcut menus are so horrible to use that application developers would be best advised to implement their own shortcut menus which can be used with a single click, and avoid the native shortcut menus completely. Once more, with feeling: Cruft.

      17. Meanwhile, we still have the problem that programs on Windows can't be moved around after installation, otherwise things are likely to break. Trouble is, this makes it rather difficult for people to find the programs they want. In theory you can find programs by drilling down into the Program Files folder, but they're arranged rather uselessly (by vendor, rather than by subject), and if you try to rearrange them for quick access, stuff will break.

        So, the Windows designers invented something called the Start menu, which contained a Programs submenu for providing access to programs. Instead of containing a few frequently-used programs (like Mac OS's Apple menu did, before OS X), this Programs submenu has the weighty responsibility of providing access to all the useful programs present on the computer.

        Naturally, the only practical way of doing this is by using multiple levels of submenus, thereby breaking Microsoft's own guidelines about how deep submenus should be.

        And naturally, rearranging items in this menu is a little bit less obvious [microsoft.com] than moving around the programs themselves. So, in Windows 98 and later, Microsoft lets you drag and drop items in the menu itself, thereby again breaking the general guideline about being able to cancel a click action by dragging away from it.

        This Programs menu is the ultimate in cruft. It is an entire system for categorizing programs, on top of a Windows filesystem hierarchy which theoretically exists for exactly the same purpose. Gnome and KDE, on top of a Unix filesystem hierarchy which is even more obtuse than that of Windows, naturally copy this cruft with great enthusiasm [freedesktop.org].
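
    To make example 1 concrete: a rough sketch of a document that names and saves itself, assuming a plain-text document and a made-up pick_default_name() helper. It illustrates the idea under those assumptions; it is not a prescription.

        import os, threading, time

        def pick_default_name(first_line):
            # Hypothetical helper: derive a sensible file name from the first words typed.
            words = first_line.split()[:5]
            return (" ".join(words) or "Untitled") + ".txt"

        class AutoSavingDocument:
            def __init__(self, first_line, folder=os.path.expanduser("~/Desktop")):
                # Name and save the document as soon as typing begins, like a sheet of paper.
                self.path = os.path.join(folder, pick_default_name(first_line))
                self.text = first_line
                self.lock = threading.Lock()
                self._save()
                threading.Thread(target=self._autosave_loop, daemon=True).start()

            def type(self, more_text):
                with self.lock:
                    self.text += more_text

            def _save(self):
                with self.lock:
                    tmp = self.path + ".tmp"
                    with open(tmp, "w") as f:
                        f.write(self.text)
                    os.replace(tmp, self.path)   # atomic swap, so a crash never leaves a torn file

            def _autosave_loop(self, interval=120):
                # Save every couple of minutes without any user intervention.
                while True:
                    time.sleep(interval)
                    self._save()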

    Following those examples, it's necessary to make two disclaimers.

    Firstly, if you've used computers for more than six months, and become dulled to the pain, you may well be objecting to one or another of the examples. "Hey!", you're saying, "that's not cruft, it's useful!" And, no doubt, for you that is true. In human-computer interfaces, as in real life, horrible things often have minor benefits to some people. These people manage to avoid, work around, or blame on user stupidity the large inconvenience which the cruft imposes on the majority of people.

    Secondly, there are some software designers who have waged war against cruft [joesacher.com]. Word Place's Yeah Write [yeahwrite.com] word processor abolished the need for saving documents. Microsoft's Internet Explorer for Windows [microsoft.com], while having many interface flaws [phrasewise.com], sensibly abolished the Exit menu item. Acorn's RISC OS [riscos.org] abolished filepickers. The Mac OS uses file IDs to refer to files, avoiding all the problems I described with moving or renaming. And the ROX Desktop [sourceforge.net] eschews the idea of a Start menu, in favor of using the filesystem itself to categorize programs.

    However, for the most part, this effort has been piecemeal and on the fringe. So far, there has not been a mainstream computing platform which has seriously attacked the cruft that graphical interfaces have been dragging around since the early 1980s.

    So far.

    Discuss [phrasewise.com]
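
    To make the file-ID/inode point in example 4 concrete: on a Unix-style filesystem the inode number survives a rename, so a reference that remembers the inode (rather than the pathname) still identifies the right file. A rough sketch, with made-up filenames:

        import os

        path = "report.txt"
        with open(path, "w") as f:
            f.write("draft")

        inode = os.stat(path).st_ino            # the file's unchanging identifier

        os.rename(path, "renamed-report.txt")   # the human renames it in the file manager

        assert not os.path.exists(path)         # a pathname-based reference is now broken...
        assert os.stat("renamed-report.txt").st_ino == inode   # ...but the inode still matches

    Turning an inode back into something openable still needs help from the filesystem or the platform (which is what file IDs on the Mac provide), so this is the hook an operating system would have to expose, not a complete solution.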

    • Code Archaeologist (Score:4, Insightful)

      by Bill_EEE ( 623572 ) on Friday November 08, 2002 @01:48PM (#4626444)
      I worked on a very valuable system as a code archaeologist. The legacy system was in a million-dollar-apiece semiconductor furnace. Doing the work involved grepping through old C code that was written by a brilliant assembly language programmer.

      If code is working and shipping you don't throw it away. What I did was decouple the various patterns that I found and made something that was more modern. I did all of this work in C. It involved a lot of grepping and creating interfaces.

      Just because code is old and kludgey doesn't mean that it is not valuable. Elegance is getting paid a million dollars for a device that only costs a fraction of that to manufacture.

      Bottom line: if you don't have the cash you can't stay in business.

      There is a difference between bad code and old idiom code. Archaic code that is shipping and works is much more valuable than pie-in-the-sky new code that no one wants. ;)
  • somewhat OT (Score:5, Insightful)

    by zephc ( 225327 ) on Friday November 08, 2002 @06:44AM (#4624066)
    But I have karma to burn...

    I just wanted to know, WHY on earth MS would use the directory name 'Program Files' when so often installers and path names, etc. can only work with the 8.3 format and end up calling it 'PROGRA~1'. Plus, the space in the file path screws up some apps... just WTF were they thinking? Why not call it 'Applications'? At least that abbreviates to 'APPLIC~1', which sounds slightly less silly.
    • Re:somewhat OT (Score:4, Insightful)

      by 91degrees ( 207121 ) on Friday November 08, 2002 @06:50AM (#4624073) Journal
      Or even better.... Call it 'Programs'. 8 characters.
    • Re:somewhat OT (Score:3, Informative)

      Because that is no longer the case. It's been seven years since Windows 95 was released, and it's been a while since I've used a Win32 app that didn't use long file names... If you *would* bother to read the article, you'd notice that the premier cause of cruft is trying to keep everything "backwards-compatible". "Program Files" is an accurate description of what is in there; "Applications" is not, because it is not *just* applications. Not to mention that not all applications are in the Program Files directory; most of the Windows applications (Notepad, Calculator, etc.) are in the Windows or Winnt directory. ~Noodle
    • Re:somewhat OT (Score:3, Informative)

      by Slashamatic ( 553801 )
      Even worse, many of the special directory names used by Windows change depending upon the language.

    • I just wanted to know, WHY on earth MS would use the directory name 'Program Files' when so often installers and path names, etc. can only work with the 8.3 format and end up calling it 'PROGRA~1'. Plus, the space in the file path screws up some apps... just WTF were they thinking? Why not call it 'Applications'?


      Actually, "Programs" would have been better. No abbreviations.

      Michael
    • I guess we're lucky in Sweden. The "Program Files" folder is named "Program", which has the short name "PROGRAM". This is because the Swedish equivalent of "Program Files" is just "program". OK, actually it's "programfiler", and "program" is more as if the English equivalent were "Programs".

      Everything is fine, it seems, until the first application installs itself into "C:\Program Files\TheApp" anyway, since it doesn't use the proper Win32 API calls to get the localized "special folder names" (see the sketch below). Arrgh.

      With around 30 apps in the Program dir, I always seem to have 3 or so apps in a "Program Files" dir. At least I know the developer of the programs since it's very apparent and I can complain if I wish. ;-)
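
      A rough sketch of asking Windows for the special folder instead of hard-coding the name, here via Python's ctypes; SHGetFolderPathW is one of the Win32 calls that returns the localized path, and the example output is only illustrative:

          import ctypes

          CSIDL_PROGRAM_FILES = 0x0026              # the "Program Files" special folder
          buf = ctypes.create_unicode_buffer(260)   # MAX_PATH
          ctypes.windll.shell32.SHGetFolderPathW(None, CSIDL_PROGRAM_FILES, None, 0, buf)
          print(buf.value)                          # e.g. "C:\Program" on a Swedish install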
    • On all my (own, not work) machines "C:\Program Files" does not exist. It is called "E:\WinApp". Same thing for the infamous "My Documents": it's called "Home" (actually D:\Home\%username%). And I like it that way.
      All this needs is just some tweaking in the registry and a few tricks, and you never have to live with Bill's insane directory-name choices again...
      Same for the Start menu: I just organize it by topic. It's not hard to do, and most people would do it if they weren't afraid of breaking everything. Because, just deal with it: users are scared of "breaking their computer". I actually learned a lot by breaking my computer, but that was in the DOS days, with PCTools in my hands. I now know why my dad made backups so often ;-))
    • Re:somewhat OT (Score:5, Insightful)

      by Tim Browse ( 9263 ) on Friday November 08, 2002 @09:53AM (#4624607)

      This one's easy actually - a friend of mine independently came to the same conclusion as me on this one, which is that Microsoft deliberately chose "Program Files" as both a 'long filename' and a filename with a space in it precisely to speed the adoption of long filenames. They did it to bring into sharp relief any program that didn't support LFN properly. Remember, Windows 95 was the time when they introduced their "Designed for Windows" logo, which at the time was a pretty big deal, and as far as I can remember, pretty much mandated support for LFN.

      The PROGRA~1 is ugly, but it only happens on old programs - I certainly now use it as an indicator of quality in a Windows app (it reflects how much the author respects the user experience).

      Now, if you want a real gripe, I hate the way most apps just plain don't work if you install them somewhere other than Program Files. I also hate the way most apps have a slavish belief in whatever path information they stored in the registry, meaning you can't ever move an installed app. I try to make my own apps as location agnostic as possible (Mac users: feel free to gloat at this point, with considerable justification).

      Tim

  • by krazyninja ( 447747 ) on Friday November 08, 2002 @06:51AM (#4624076)
    Though the author points out cruft only in software, it is prevalent in other places too, including interfaces to portable devices. I have a similar document on portable device interfaces here [geocities.com].

    It comes out of designing without taking into account user actions and reactions. This subject is unfashionably called "Industrial Design", but is becoming fashionable again...
    • by Anonymous Coward on Friday November 08, 2002 @07:07AM (#4624114)

      Good user interface design is hard, and even though the author claims that "we have the technology now", some of the ideas just reflect his personal preference and are not really the obviously better design. Many times the interface which users would find "working as expected" requires nothing short of magic (or artificial intelligence, whichever arrives first). "Industrial design" has one major advantage over user-oriented design: it's learnable. Learning a system which itself adapts its behaviour to the user can be really frustrating and time-consuming, because it follows more complicated hidden rules. That's why "power users" turn off as many automagic functions as possible.

      The real user interface crimes are when well-researched principles of perception are ignored: making every icon round by dropping the actual icon into a marble of colored glass may be pleasing to the eye, but it's working against the way we recognize patterns. Adding bevelled lines around and between everything, even when there is no logical or functional separation, makes user interfaces distracting. And those are just the worst offenders in the graphical representation area.

      • by corporatemutantninja ( 533295 ) on Friday November 08, 2002 @08:45AM (#4624335)
        Well, you wrote my comment for me, but I'll add that the author tries to deflect this sentiment at the end by implying that anyone who disagrees with his preferences has obviously just bought into the dominant paradigm. ("But I *like* QWERTY..."). I suppose the proper reponse to him is, "If you're under 25 years old you obviously haven't had enough time to really think about this problem." And some specific criticisms: - Lots of modern software does, in fact, automatically save your documents. But by doing at least one manual save you get to pick a name and a location so you can find it again without needing Autonomy built into your computer. And while I like that the computer saves its own copy for safety, I specifically do NOT want it overriding the master copy without permission. - On Mac OS X the file picker does in fact look exactly like the file manager, with a few extra buttons around it. - I read his criticism of the "Quit" function several times and still don't understand his gripe. Yes, our computers can multi-thread so we can run multiple applications, but they have limited memory so we can't run EVERYTHING at once. And I for one would rather control which ones are running rather than wondering what my computer chose to quit. Also, Windows doesn't behave as he describes...close the last window and the application quits. Finally, there aren't any "mystery" menu commands unless you don't actually look at the menu bar when you use hot keys. I will admit that the "invisible application" phenomenon can be confusing to new Mac users, but I disagree with MPT's prescribed solutions. The current state is less "cruft" than it is the lack of a perfect alternative. Overall grade for mpt's "cruft" essay: C+.
    • And all this can also be put as: "errare humanum est" (to err is human).
      Like your HTML code, which breaks the display in my browser: you must close the HTML tags in the same order you opened them:
      <b><u><font>"your title"</font></u></b>
      Fix it so I can read it, please.
    • Anyone from the UK may have noted that Crufts is the name of the annual dog show [the-kennel-club.org.uk] of the Kennel Club in the UK.
  • by Svenne ( 117693 ) on Friday November 08, 2002 @06:56AM (#4624084) Homepage
    Well, I'll tell you why. In the article, the author compares a word processor to pen and paper, where everything you write gets "stuck" on the paper, while in the computer world you have to manually "Save" the work before anything really gets written. This, he deduces, is an arbitrary obstacle, and not intuitive at all.

    On the contrary, my dear Watson. What if I change something in my Word document, but later on decide it was no good and wish to discard it? Nope, sorry. My old document has already been overwritten, with no turning back. Or is he suggesting that everyone should always make a copy of a document before editing it, just in case? Wouldn't THAT seem terribly unintuitive?

    The "Save" function is one thing that separates the wordprocessor from a real pen and paper, and it certainly has it's uses.
    • That thought occurred to me too, but the drag-and-drop saving the author mentions in RISC OS and the ROX desktop sounds very sensible, and in any case some combination of undo and default file names should improve the existing scenario.

      Indeed, maybe they will happen; they would provide a great reason for corporate users to upgrade, and would potentially be a better way for somebody like Microsoft to protect their turf than obfuscating file formats.
    • by panaceaa ( 205396 ) on Friday November 08, 2002 @07:07AM (#4624113) Homepage Journal
      Or is he suggesting that everyone should always make a copy of a document before editing it, just in case? Wouldn't THAT seem terribly unintuitive?

      I really like the article's idea. I've lost a lot of work in my lifetime due to software crashes, power outages, or clicking things without thinking. On the other hand, it's not often that I change things temporarily and then revert back to the saved version. (Probably a 20-to-1 ratio.) With this paradigm, it'd be easy to get in the habit of marking a document as 'temporary' with all the benefits.

      It might make even more sense when content management platforms mature. These platforms keep track of different versions of a document, allowing you to revert back or see document evolution with ease. Then you can have it both ways, your latest changes will always be saved, and you can revert to previous versions. But of course, then you'd have the non-intuitive 'Save This as New Version' button, since you wouldn't be saving your documents manually anymore.
      • by kcbrown ( 7426 ) <slashdot@sysexperts.com> on Friday November 08, 2002 @09:02AM (#4624381)
        Version control is something any user-friendly system should handle automatically. It seems to me that programs should automatically save the diffs against the previous version and should save the full version on a very regular basis (with saves happening after a relatively short timeout or after a certain amount of changes have been made, whichever comes first). Reverting to an old version is a matter of applying the diffs in reverse to the current version. You should always be able to drop back to any previous version you want.

        There's no need for a "save as new version" button: the program will do that automatically when you exit, when you switch to a different program, or when the timeout/max diff condition occurs.

        What is needed in addition is something that should be intuitively obvious: "create a new document based on this one". This will create a "fork", and doing so will cause the program to ask you by what name you'd like to refer to the new document (as it should whenever you create a new document). Perhaps this is what you were talking about.

        We've gotten so used to working with low-level files that methodologies like this get discarded automatically by developers. But that should be done only if there's a lot of hard data that shows that users actually have a harder time dealing with it that way. That may be true of users who are used to dealing with files, but I strongly suspect people who are new to computers will have an easier time with an application that doesn't know about "files" but only about "documents". The system should keep track of the mapping between the two, and the filestore should never be seen directly except with a tool designed to manage it.

        All IMHO, of course.
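
        As a toy sketch of the diff idea: difflib's ndiff deltas carry enough context that an earlier version can be recovered from its own delta, so a program only has to append a delta at each automatic checkpoint (the class and method names here are made up):

            import difflib

            class VersionedDocument:
                def __init__(self, text=""):
                    self.text = text
                    self.deltas = []                      # one ndiff delta per checkpoint

                def checkpoint(self, new_text):
                    # Called automatically on a timeout, on focus loss, on exit, etc.
                    old, new = self.text.splitlines(True), new_text.splitlines(True)
                    self.deltas.append(list(difflib.ndiff(old, new)))
                    self.text = new_text

                def version(self, n):
                    # Recover the text as it was just before checkpoint n.
                    return "".join(difflib.restore(self.deltas[n], 1))

            doc = VersionedDocument("hello\n")
            doc.checkpoint("hello world\n")
            doc.checkpoint("goodbye world\n")
            assert doc.version(1) == "hello world\n"      # roll back to any earlier state

        Note that ndiff deltas keep whole lines from both versions, so this sketch is not space-minimal; storing reverse diffs against the current text, as described above, would be the space-saving variant.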

        • Oh, no, no no. Explicit version control is too confusing and CRUFTY. In the REAL world, people use pen and paper, not funny complicated things with "version control." And if they make a mistake, they simply wad their work up into a little ball and place it in a round canister, and start over. It's so wonderfully intuitive that way.

          If you insist on complicating computers beyond pen and paper, at least use the Undo button, which already exists and whose use is fairly intuitive. To see what your work looked like 6 months ago, simply click Undo 60,000 times.

        • I'm not entirely sure I think this is a good idea. You're talking about combining automatic saving (saving after every keystroke, or every n seconds, or whatever) with automatic version control. The net result is that your document would include one version every few seconds as long as you work on it. It'd be easy to accumulate a document with tens of thousands of versions. How would that be useful? You could, in theory, go back in time to any point to recover your work, but how would you know which point in time was the right one?

          I'm not saying it wouldn't be neat; you would basically have enough data to do an instant-replay of the entire document creation process. But it doesn't sound too practical to me.
    • On the contrary, my dear Watson. What if I change something in my Word document, but later on decide it was no good and wish to discard it? Nope, sorry. My old document has already been overwritten, with no turning back. Or is he suggesting that everyone should always make a copy of a document before editing it, just in case? Wouldn't THAT seem terribly unintuitive?

      Like an eraser, or a bottle of Tipp-Ex, that's what the "undo" button is there for. All this means is that the save process has to be a bit more sophisticated and store the last n changes.

      Q.
      • by Anonymous Coward
        The undo function reaches back only the last n steps, which, depending on the application, sometimes cover only a few minutes. The "revert" function returns you to the last "saved" state. Some applications also have the concept of making "snapshots" to which you can revert regardless of the number of undo levels. Autosaving may be a very feasible idea for word processors, which hardly need the kind of computers they are running on today, but I doubt that anybody would want to wait for images upwards of 50MB, or bigger media files, to be saved to disk every couple of minutes.
      • by Shimbo ( 100005 ) on Friday November 08, 2002 @07:59AM (#4624227)
        Like an eraser, or a bottle of tippex, that's what the "undo" button is there for. All this means is that the save process has to be a bit more sophisticated and store the last n changes.

        A more sophisticated file system could help us there. During the day, we rsync [samba.org] the development areas every 15 minutes. It takes a trivial amount of space and CPU time. Yet for years I was stuck in the metaphor of doing nightly backups and telling folks they couldn't get back the files they changed in the morning.

        The point is that saving files or versions in case we stuff things up shouldn't be our problem. We should have 'hard' commit points (this is a published document/reviewed code). In between, 'soft' checkpointing could be managed by the OS.
    • It's a fair point; I don't want a pile of files cluttering my desktop!

      What should happen is:

      1. User creates a new document
      2. Software saves a copy of it to disk in a temp folder
      3. Software saves updates periodically to this file
      Now, if the user exits, it will ask if he wants to save; if he does, it saves to the specified location & deletes the temp file. If he doesn't, it deletes the temp file.

      If the software/computer crashes, on the next startup, it prompts the user that it has a file stored; would you like to open it? Options are open, leave or delete.

      Some of these options are already available in other software; vi will store its buffers if it's killed off, and Word (and other word processors) have autosave. It's not rocket science to implement the missing features.
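
      A rough sketch of steps 1-3 plus the crash-recovery prompt (the names are made up; a real editor would also need the "save to the chosen place on exit" dialog described above):

          import os, tempfile

          TEMP_DIR = os.path.join(tempfile.gettempdir(), "myeditor")

          def autosave(doc_id, text):
              # Steps 2-3: keep a periodically updated copy in a temp folder.
              os.makedirs(TEMP_DIR, exist_ok=True)
              with open(os.path.join(TEMP_DIR, doc_id + ".autosave"), "w") as f:
                  f.write(text)

          def finish(doc_id, text, destination=None):
              # Normal exit: save to the chosen location (if any) and delete the temp copy.
              if destination:
                  with open(destination, "w") as f:
                      f.write(text)
              os.remove(os.path.join(TEMP_DIR, doc_id + ".autosave"))

          def recover_on_startup():
              # After a crash: offer any leftover autosave files back to the user.
              if not os.path.isdir(TEMP_DIR):
                  return []
              return [os.path.join(TEMP_DIR, name)
                      for name in os.listdir(TEMP_DIR) if name.endswith(".autosave")]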

    • by Katravax ( 21568 ) on Friday November 08, 2002 @07:15AM (#4624128)
      There are already programs that support undo past save. If we do something intelligent like get rid of the save command, we should also do something intelligent like undo past save. For the file-size whiners (like myself), we could have the option to lose prior undo information to reduce the file size.

      File systems that support multiple streams (like NTFS) could save undo information in a separate stream. "Not everyone has such a file system," you might say. I say, whatever -- if we're talking about moving forward here, we'll have to go past FAT and other beginner's file systems.

      We're not talking about taking away something that's required for usability today. We're talking about improvements for the next generation. Get over your "Save" command. You'll be able to undo beyond the automatic save.
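
      For what it's worth, an NTFS alternate stream really is just a second stream on the same file, so undo history could ride along with the document; a tiny sketch (Windows on NTFS only; on FAT the second open simply fails, which is the caveat above):

          # NTFS alternate data streams are addressed as "filename:streamname".
          with open("letter.txt", "w") as f:
              f.write("the visible document contents")

          with open("letter.txt:undo", "w") as f:      # hidden stream on the same file
              f.write("serialized undo history goes here")

          with open("letter.txt:undo") as f:
              print(f.read())                          # the document itself is untouched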
      • There are already programs that support undo past save. If we do something intelligent like get rid of the save command, we should also do something intelligent like undo past save

        The correct solution is version control built in at the OS level. This would mean all file types would have to have a useful diff defined.

        That would also allow multiple people to work on the same document, with control over how their changes are merged and so on. After all, all these tools were developed for source control not because they are related to programming, but because they are related to editing; any application which can be seen as an editing operation could benefit.

      • by hacker ( 14635 ) <hacker@gnu-designs.com> on Friday November 08, 2002 @09:11AM (#4624411)
        File systems that support multiple streams (like NTFS) could save undo information in a separate stream. "Not everyone has such a file system," you might say. I say, whatever -- if we're talking about moving forward here, we'll have to go past FAT and other beginner's file systems.

        VMS has done this for a decade or more. Every time you edit a file, you get 'file.txt;1', 'file.txt;2' and so on, which you can pick up at any point and continue editing. It's semantically similar to cvsfs, where every file save creates a revision. Implementing cvsfs globally could be "A Good Thing[tm]" overall.

    • Could and should be solved with versioning, just like we do it when writing code.

      "But versioning contains the concept of saving", you say. Not necessary, have a look at the "undo"-feature. Undo is a simple and crude form of versioning, without any mentioning of saving.

      How about an undo that lets you say "take this document back to the way it looked two hours ago"?

      Think about it.

    • ... and he also doesn't seem to have heard of the old concept of "auto save"??
    • What if I change something in my Word document, but later on decides it was no good and wish to discard it?

      I've never had to click "save" on my palm pilot. To undo a change you simply click "undo". 'Nuf said.

    • by photon317 ( 208409 ) on Friday November 08, 2002 @07:45AM (#4624195)

      I wrote a GTK+ text editor that saved the document on every keystroke, with a frequency limiter so that it didn't save more than once every 5 seconds. It used a background thread for the saving so the user interface didn't hiccup, and every save file contained a complete undo/redo history back to the beginning of the document's life. It had no save buttons, only "open" and "close". I never finished it because it was plaintext-only anyways, so nobody was ever gonna use it.
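
      The frequency-limiter-plus-background-thread idea can be sketched in a few lines (a generic sketch, not the editor described above): every keystroke just updates the buffer, and a worker thread writes at most once per interval.

          import threading, time

          class ThrottledSaver:
              def __init__(self, path, interval=5.0):
                  self.path, self.interval = path, interval
                  self.text, self.dirty = "", False
                  self.lock = threading.Lock()
                  threading.Thread(target=self._worker, daemon=True).start()

              def on_keystroke(self, new_text):
                  # Called from the UI thread; never blocks on disk I/O.
                  with self.lock:
                      self.text, self.dirty = new_text, True

              def _worker(self):
                  while True:
                      time.sleep(self.interval)         # save at most once per interval
                      with self.lock:
                          if not self.dirty:
                              continue
                          snapshot, self.dirty = self.text, False
                      with open(self.path, "w") as f:   # disk write happens off the UI thread
                          f.write(snapshot)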
    • by Ed Avis ( 5917 ) <ed@membled.com> on Friday November 08, 2002 @07:50AM (#4624204) Homepage
      It might be better to unify the 'undo' and 'save/load' functions, so that loading an older version is a kind of undo. But there needs to be a better way to navigate back to these old versions.

      IMHO all desktop apps should have built-in version control, so instead of File->Save you do File->Tag this version and give it a description. All editing changes are saved to disk as soon as they are made (this is only a few bytes per second, no problem on modern machines), and you're prompted to make another version tag before quitting the app.

      There's no longer any disk space argument against saving all versions of the document, all the time. At least not for wordprocessing and most 2d graphics, small spreadsheets etc.

      Collaborative working with merging in different sets of changes (a la CVS) would be tricky to implement, depending on the application: it might require storing a list of commands executed rather than the current state of the document.
    • Never heard of version control?

      I'm currently evaluating a new IDE for developers in my area, IntelliJ IDEA 3. Anyway, it has gone away from the Save-button paradigm as well. Whilst there is still a manual save button, which you can hit whenever you like, it background-saves a lot. Whenever you compile, it autosaves; whenever you close a file, it autosaves (without prompting); and whenever the app window loses focus, it autosaves all open files.

      The way it gets around the "but I didn't mean to save those changes" problem is with a local VCS. Every time it does an autosave it keeps a version, and automatically deletes those older than x days (configurable). This works alongside your "real" version control (say CVS or Bitkeeper) - essentially the local one protects you from those "oops" moments when you accidentally write over a modified file, and the external VCS does what it does now, holding actual useful revisions for a file forever.

      You could compare this system to a multi-level undo feature which spans saves, but it's better than that: being a proper VCS, you can visually diff between versions, label particular versions, etc. It takes a while to get used to the "what do you mean it's already saved!" aspect, but it's really very neat, and it means you essentially will never lose any work again.

      Oh and yes, it ends up using a fair amount of disc space, but so what? It's a lot cheaper than my time. I could easily see a similar system working well with a word processor; in fact Word has something similar with its revision tracking. All you have to add in is the ultra-frequent, non-obtrusive autosaves, and removal of unimportant old versions to keep the size manageable.
  • I recognize this... (Score:4, Informative)

    by HiQ ( 159108 ) on Friday November 08, 2002 @07:00AM (#4624094)
    "software archeology"

    I work day in, day out on a ten-year-old system. I do not use the term archeology; however, I frequently find what I call 'fossils': parts of code that are still there but are never executed; fields of the database that should have been deleted but are still there, and are still updated, though no program ever uses them. A system has to be sufficiently large, however, to experience this. But it's actually funny to read about this.
  • Flawed (Score:5, Informative)

    by Mr_Silver ( 213637 ) on Friday November 08, 2002 @07:00AM (#4624095)
    Whilst the author makes some good points, there are plenty of flaws in his reasoning.

    Fortunately, technology has improved since the 1970s. We have the power, in today's computers, to pick a sensible name for a document, and to save it to a person's desktop as soon as she begins typing, just like a piece of paper in real life. We also have the ability to save changes to that document every couple of minutes (or, perhaps, every paragraph) without any user intervention.

    Yes we do, but for starters a computer is a tool. You tell the computer what to do, the computer does not tell you. Sure we have autosave, but any sensible application auto-saves to a different filename so that if you decide to abandon your changes, you can just quit without saving and revert to your original version. If you quit a document, you'd still have to confirm. What happens when you do want to commit those changes to your file but you don't want to quit? You have to "save".

    Fortunately, technology has improved since 1984. We have the power, in today's computers, to run more than one program at once, and to load programs in less than five seconds.

    Here the author obviously hasn't used a PocketPC. With the PPC it's very, very easy not to close applications. What happens? The system slows down to a crawl as it tries to run 5 or 6 different applications. Again, this is the user being in control of the computer. I want the ability to close applications when I'm not using them. That is my decision, not the computer's. It's the desktop analogy. Once I've finished with a book, I put it away, because otherwise my desk gets cluttered. I don't leave it out, because otherwise my desk gets full and working becomes a problem. Sure, we could get around this by having the PC unload or suspend applications that aren't used in a while - but how does it decide? Just because I've not typed something into Word for the past 30 minutes doesn't mean that I'm not using it. You'd get to the point where the cleverness of the OS/application was causing me more hassle as it tried to be helpful and suspend stuff for me.

    Fortunately, technology has improved since 1984. We have the power, in today's computers, to run more than one program at once, and to run the file manager all the time. We can open documents from the file manager without quitting all other programs first, and we can save copies of documents (if necessary) by dragging them into folders in the file manager.

    What if the application is taking over the whole of the desktop? I'll have to minimise and then drag. Having said that, though, RISC OS (I think; the one on the Archimedes) used to allow that. You hit Save and an icon appeared for you to drag somewhere. The best thing was that you could drag from one application into another to have it load in there. Neat. But very weird.

    As for the inode stuff, it sounds neat. But I know so little about that type of thing, I wouldn't even know if it's feasible.

    So in short, some good ideas, but some of them just aren't practical or possible, and would end up being a bigger annoyance than what we have now.

    • Re:Flawed (Score:2, Insightful)

      by flippet ( 582344 )
      Having said that, though, RISC OS (I think; the one on the Archimedes) used to allow that. You hit Save and an icon appeared for you to drag somewhere.

      ...nice in that you could see exactly where you were saving it, but not nice when you clicked "save" and typed in a name, only to realise the window you want to save it into isn't visible.

      In the good old Acorn vs. Amiga wars of yore the Amiga's file requesters were a deadly weapon.

      Phil, just me

    • Re:Flawed (Score:3, Informative)

      by jilles ( 20976 )
      What mpt is referring to is a document-centric rather than a program-centric environment. Most existing operating systems are program-centric. You start a program, instruct the program to create a document, tell the program to do something with a document, tell the program to quit. In a document-centric environment, you create documents, you edit documents, etc. Stuff like file dialogs, opening and closing applications, etc. does not fit in with this paradigm.

      Of course you need to have some means to identify a document. As mpt points out, pathnames are an extremely lousy way of doing so. Rather, you want an inode with some associated metainformation, which may or may not include a name. The whole concept of a name plus a three-letter extension is flawed.

      Each type of document has a number of useful metainformation items associated. Obvious ones are date of creation, last date of editing, user that created it. In the case of a bitmap, a small thumbnail might be handy. Of course users should be able to add descriptions and short names as meta info.

      Most of this meta information can be generated automatically. There is no need to bother the user with this.

      Take, for example, mp3 files. A major problem with these files is that they must have a filename and that they may also have meta information (which more often than not does not match either the filename or the contents of the file). You would want this the other way around. An mp3 file has meta information (like artist, title, track number, etc.). Based on this info, programs like file managers may query the metainfo to generate a small name (e.g. artist - album - track - title) that is displayed on the screen. There is no need for this generated string to be the unique identifier for the file!

      BeOS actually got this right. Every file in the BeOS filesystem could have an arbitrary number of meta attributes associated with it. Programs like mp3 players, mail readers, etc. actually used this to organize data in the BeOS filesystem.

      You are right that it is a huge undertaking to fix this, since it would require re-engineering a lot of applications and operating systems. That was the whole point of mpt's article. Existing programs are an accumulation of decades of fundamental design errors. Many severe usability issues can be traced to these design errors from the seventies and eighties. Many programmers are unaware of this and have actually duplicated the errors in efforts to improve usability. Their workarounds are symptoms of, rather than solutions for, the problems.
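
      BeOS attributes do not map directly onto most of today's systems, but Linux extended attributes give a rough feel for the idea: attach the metadata to the file itself and let the file manager build the label it displays from it. A sketch (Linux only, on a filesystem with user xattrs enabled; the attribute names and values are made up):

          import os

          path = "track01.mp3"
          open(path, "ab").close()                       # make sure the file exists

          # Attach metadata to the file itself, not to its name.
          os.setxattr(path, "user.artist", b"Some Artist")
          os.setxattr(path, "user.album",  b"Some Album")
          os.setxattr(path, "user.title",  b"Some Title")

          def display_name(p):
              # A file manager could generate its on-screen label from the attributes...
              get = lambda a: os.getxattr(p, a).decode()
              return get("user.artist") + " - " + get("user.album") + " - " + get("user.title")

          print(display_name(path))                      # ...while the on-disk name stays a mere identifier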
      • Re:Flawed (Score:3, Insightful)

        by symbolic ( 11752 )
        You are right that it is a huge undertaking to fix this, since it would require re-engineering a lot of applications and operating systems. That was the whole point of mpt's article. Existing programs are an accumulation of decades of fundamental design errors. Many severe usability issues can be traced to these design errors from the seventies and eighties. Many programmers are unaware of this and have actually duplicated the errors in efforts to improve usability. Their workarounds are symptoms of, rather than solutions for, the problems.

        I don't think some of his proposed ideas are really solutions so much as alternate means of implementation. Take his thoughts on file saving, for example. As someone who spends time on and off working on larger files, I fully appreciate the flexibility I have now to save a file on MY terms: when I want, how often, and whether or not it will be saved as a new, incremental version of the original. Saving a large file in Painter or Photoshop takes time. I can't see any way in hell that an automated save will not become a huge source of annoyance. I sometimes start work on an image, and decide that I don't want to keep the changes that I've made. That should be my decision.

        Personally, I think time would be much better spent on providing systems with transparent backup (give the electronic data a form of persistence closer to that of hard copy) so that recovering from a hard disk failure isn't such a traumatic experience - or even something that people need to worry about.
  • Skinning == crap! (Score:5, Insightful)

    by SexyKellyOsbourne ( 606860 ) on Friday November 08, 2002 @07:03AM (#4624100) Journal
    One thing I hate is the "skinning" of everything, particularly media players. It was popular for mainstream kids' software, and it worked okay there; but for everything else, the standard GUI (preferably written with something nice like wxWindows [wxwindows.org]) should be the only thing that is used. If I see something with colorful, bubbly bitmaps on the GUI, I probably won't use it.

    What is intuitive to us is what is standard -- adding new buttons with new pictures, new dials, and other things in a single instance interface only confuses everyone. Even if some of the properties are inefficient, regular GUI standards are the way to go.
    • If I see something with colorful, bubbly bitmaps on the gui, I probably won't use it.

      That's what skinning is for! Just change the skin to something that doesn't have colorful, bubbly bitmaps! ;)

    • I guess you probably leave your background as the default blue of (insert OS here) too. And don't change the colours of the title bars and so forth. So the default for you is fine.

      On the other hand, many 'power users' like to personalise their desktop. My background has purple penguins in ear muffs and my colours reflect this purple rather than the default blue.

      If it was possible to change the colours easily in applications as well as the window manager, then I'd do so as well. Only those apps which allow for skinning, due to the over enthusiasm for graphics everywhere, allow changing the colours at all.

      If skinning is bad, then why allow us to 'skin' our desktop by changing the background?

      --Azaroth
      • On the other hand, many 'power users' like to personalise their desktop. My background has purple penguins in ear muffs and my colours reflect this purple rather than the default blue.

        You say that 'power users' like to personalize their desktop. I'd say, at least in the Windows world, the opposite is true. It's the inexperienced users that get the biggest kick out of themes and GUI cruft. It gives them a false sense of control over their computer. Power users know that having a snake shaped cursor only gets in the way.

        How many hardcore computer people do you know who run a copy of Webshots? Bonzi Buddy? It's all graphical masturbation. I'm glad that it makes you happy to get all Martha Stewart on your desktop. Unless it changes the functionality in some way, I simply don't care.

        Linux is a different story because you get much more control over the GUI. You have the control to change how things work, not just how they look. That's a good thing. However, even under Linux, it seems like most skins try to:

        Rip off MacOS

        or

        Rip off Windows

        The skinners are like Spinal Tap with the volume turned up to 11. If transparency is good, they make everything at least partially transparent. If goofy, bubbly icons with drop shadows are trendy, they make the goofiest and bubbliest. I'm in the camp of keep it simple and make it work. Spending three days to get the metallic pastel alpha blend on the widgets "just right" doesn't do much for me.

        If skinning is bad, then why allow us to 'skin' our desktop by changing the background?

        So I can instantly tell which box I'm on when I use a flaky KVM? So Joe Luser can have his brain damaged offspring smiling back at him? Because $COMPETITORS_OS does it and they did it to compete? Just because you can do it doesn't make it a good idea.

    • Some people seem to like skinning (not me, sorry). To satisfy those, at least skins should be implemented at the OS level, so that a user can enable/disable them system-wide, even with per-application rules.

      It also saves you the time and space you waste downloading a skin that is 10x the size of the tiny application you just installed, repeated for each individual application. What a waste.
    • by hacker ( 14635 )
      What is intuitive to us is what is standard -- adding new buttons with new pictures, new dials, and other things in a single instance interface only confuses everyone. Even if some of the properties are inefficient, regular GUI standards are the way to go.

      Answer me this... why do icons in Windows have titles underneath them? Why do ANY icons have titles underneath them? Do you even care what the picture is? No, you read the titles. Why? Because Microsoft failed to standardize on them and make them as commonly known as a picture of a "Stop" sign, or a "green light" as we see while driving every day.

      What is intuitive is what works or what is used, not what is standard, because there are no standards in this space. Making "pretty pictures" under the buttons makes them understandable, in the absence of other descriptive features (such as a title).

  • (getting the fscking html tags correct this time) Lest it seem like I'm picking on Microsoft, Windows is not the worst offender here. GNU/Linux applications are arguably worse, because they could be avoiding all these problems (by using inodes), but their programmers so far have been too lazy. At least Windows programmers have an excuse.
    No, the hackers aren't lazy - they're just too busy trying to ape the MS Windows look and feel...
  • The interfaces of today represent how a computer actually works. For example, the need to save a document. This is not a bad thing if done properly; in fact, it helps (forces) the users to understand how the actual machine works (it writes periodically to the disk, and not one character at a time).
    When saying this I'm not saying that there are no flaws in the interfaces of today, and I do agree that the open source movement lacks much of the initiative to make things better. However, making all computer tasks behave like things in 'the real world' will not work. I'm not even sure all the metaphors used today are good. It's better to reflect the actual tool than to try to make it look like something that it is not; that will only result in disappointed and surprised users.
    • For example, the need to save a document. This is not a bad thing if done properly, in fact, it helps (forces) the users to understand how the actual machine works (it writes periodically to the disk, and not one character at a time).

      With modern technology, why shouldn't it write one character at a time to the disk? Or at least write it into the disk cache. I know it's less than optimal, but your word processor isn't going to stress the system by doing this on a modern PC, and it would be much more intuitive to the user (assuming the user wasn't already familiar with having to press the "save" button). And why should the modern user understand how the machine works anyhow? I don't know how my microwave oven generates microwaves (although I could find out easily enough if I needed to), but that doesn't prevent me from heating up ready-meals.

      Q.
  • More cruft! (Score:5, Insightful)

    by krazyninja ( 447747 ) on Friday November 08, 2002 @07:09AM (#4624119)
    The most annoying one I have found is when you are typing away madly at the keyboard, and a window pops up saying "xxx yyy operation failed" or "zzz download complete". It does two agonising things:
    1. The letter you are typing corresponds to a shortcut in the window, and the window happily closes itself, having done God knows what damage.
    2. It slows down your pace, disturbs your thinking process, and by the time you close the window and move back to the position you were in before, one more word gets added to your swearing vocabulary.

    • While I really like GAIM for its all-in-one approach to messaging protocols, the authors deserve a kick in the balls for having windows that constantly raise to the front every time someone sends you a message. The result is, you are typing an e-mail or programming and, all of a sudden, what you typed ends up in the wrong window, simply because GAIM is receiving an incoming message for you. Bad, bad, bad GAIM.

      People coding window managers should also wake up to the fact that not everyone wants the latest application they started to pop up in their face while they are returning to another window during the application's startup. E.g., if I start OpenOffice and I know for a fact that this piece of bloatware needs 5 minutes before the main window comes to the screen, so I go back to typing e-mails until the application has loaded, I do not want frigging OpenOffice popping up and assuming that what I was in the middle of typing was meant for its Untitled 1 document. As such, Xlib and other OSes' GUI libs should completely remove the ability for an application to request that one of its windows be raised and brought to the foreground since, anyhow, window management is the responsibility of... window managers.

      • While I really like GAIM for its all-in-one approach to messaging protocols, the authors deserve a kick in the balls for having windows that constantly raise to the front every time someone sends you a message. The result is, you are typing an e-mail or programming and, all of a sudden, what you typed ends up in the wrong window, simply because GAIM is receiving an incoming message for you. Bad, bad, bad GAIM.

        Got an extra 10 seconds? Take a peek in the preferences dialog, and turn that behavior off.
    • Re:More cruft! (Score:3, Insightful)

      by G-funk ( 22712 )
      The answer to this is simple. If you're typing (determined by some magic "keys pressed in the last minute" sort of thing), the window manager should NEVER EVER AMEN NO MATTER WHAT interrupt you. What we need is a little thing in a corner somewhere (a corner because it's easier to click, usability 101) that blinks orange/whatever when the computer has something to tell you. Just like ICQ in shrunken mode. In fact it could be integrated with any message protocol: it blinks blue when people want to talk to you, orange when the computer wants to talk to you, and red when the computer really wants to talk to you.
      • Re:More cruft! (Score:3, Interesting)

        by Rich0 ( 548339 )
        AMEN TO THAT!

        Really, if MS and KDE want to keep the feature of the "SetFocus" function, they should at least put a window manager level preference setting to turn it off. If set, no application will be able to take the focus away from another.

        If I want to type in a window - I'll click on it or tab to it or whatever.

        When web browsing I have a habit of opening multiple windows at once so that web sites can load in the background while I read other sites. At home I just run Mozilla, and tabbed browsing works great - about the only interruption is when I need to enter a master password (which shouldn't interrupt me, but I can live with that). At work I use IE, and I'm constantly pestered by windows raising to the top simply because they've finished loading...

        Ditto with applications. The first thing I do after I log in is typically launch a few apps off the desktop - before it gets buried. Then for the next minute, while login scripts run and the disk thrashes, I can't use my computer - not because it's too slow, but because one window after another keeps grabbing focus simply because it has loaded...
      • Wish granted (Score:3, Informative)

        by devphil ( 51341 )


        I've not had this problem except with Windows -- mainly because the apps that suffer most from the "I did something that required CPU cycles, therefore I will tell you about it in a popup" disease seem to be Windows apps. So I'll tell you the Windows solution:

        Go to microsoft.com. Find wherever they've hidden TweakUI this month. Download it. (If necessary, download the whole "power tools" thingy that it's a part of.) Install it. (Install the "open cmd.exe at this directory" power tool too, while you're at it.)

        Go to the [Out-Of-]Control Panel, fire up TweakUI, and disallow applications to grab focus. There's even a "what should they do instead" selection that lets you make them blink.

        Disadvantage: some programs fire off a splash screen, then bring it down and replace it with the real program. Window focus doesn't traverse like that now, so the real program won't start off with focus, even though the last thing you did was double-click its startup icon. Minor annoyance only.
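
        For anyone who would rather script the change than hunt down TweakUI, the checkbox described above is, to the best of my knowledge, just a per-user registry value (ForegroundLockTimeout under HKCU\Control Panel\Desktop) on Windows 2000/XP-era systems. A minimal sketch in Python -- the value name and behaviour here are my reading of what TweakUI edits, not anything from the article:

            # Minimal sketch: stop applications from stealing keyboard focus by
            # setting ForegroundLockTimeout (the number of milliseconds an app
            # must wait before it may force itself to the foreground).
            import winreg

            TIMEOUT_MS = 200000  # 0 = allow focus stealing; larger values block it

            with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                                r"Control Panel\Desktop",
                                0, winreg.KEY_SET_VALUE) as key:
                winreg.SetValueEx(key, "ForegroundLockTimeout", 0,
                                  winreg.REG_DWORD, TIMEOUT_MS)

        With this set, a blocked application flashes its taskbar button instead of popping to the front, which matches the "what should they do instead" behaviour mentioned above. Log off and back on for the change to take effect.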

  • Some time ago ('99) Wired ran this article [wired.com] on Sony R&D. They were developing a kind of OS (Apeiros OS) which would deal with some of the issues discussed in the article on the MPT site. (The Wired article makes a good read on Sony R&D in itself.)

    Does anybody know what happened to this project? I'm curious because Sony R&D usually comes up with brilliant solutions for common problems.

  • by Slashamatic ( 553801 ) on Friday November 08, 2002 @07:17AM (#4624133)
    I agree with the general point about cruft accumulating in old s/w but the guy gives some very bad examples.

    In a) he talks about the use of inode numbers as an internal reference used by the system. Regrettably, inodes and the equivalent internal reference numbers used by file systems under other operating systems can move around. Generally, opening by inode is only recommended after first opening by name.

    Having more than one pathway to a file as mentioned in d) in windows is most definitely a feature. For engineering reasons a manufacturer may want to keep a set of files from related applications together, however to the user they may be presented somewhat differently. If anything this is an improvement of interface because of the separation between external and internal representations.

    As for the problems of moving applications around, that is also an issue with meta-information held in INI files or the registry. It is quite possible to make a program easier to move (i.e., by including code to update the file locations), but this isn't often done.

    The file/folder metaphor may have problems for newbies, but the only real problem is that a file (particularly with Unix-style file systems) may have more than one name. This is a feature, not a bug.

    • Having more than one pathway to a file as mentioned in d) in windows is most definitely a feature. For engineering reasons a manufacturer may want to keep a set of files from related applications together, however to the user they may be presented somewhat differently. If anything this is an improvement of interface because of the separation between external and internal representations
      No it isn't; it's a kluge to hide the fact that applications have gotten more complex and Microsoft wasn't prepared to deal with it. On the Macintosh, until about 1995, applications, generally speaking, did not need support files. You had the application, you had a preference file, and possibly you had an extension or two, but the application usually sat by itself. However, at about that time, both in response to Windows and in response to the fact that applications simply were getting far more complex, most applications began having massive numbers of support folders. Macs just ignored the implications and kept on truckin'; Windows adopted the solution you mention.

      But Mac OS X, and OPENSTEP/NEXTSTEP before it, manage to keep the Mac metaphor while still hiding the implementation details, and it does it much better. Each application is actually a fairly complex directory structure, and all support files can be hidden within the application itself. This can include movies, help files, whatever--you name it, it's there. Now, to the user, you still have just the application, but the application can suddenly be dragged around at will without disturbing anything. For the application, you now can also guarantee a very rigid directory structure that the user can't even mess with. Next time you're on an OS X system, control-click a program and choose "Show Package Contents," or, if you prefer, cd right into the app. You'll see what I mean.

      That's the right way to solve the problem, and that's why he's slamming Windows' metaphor and lauding ROX/OS X app wrappers/packages/bundles.
  • by RAMMS+EIN ( 578166 ) on Friday November 08, 2002 @07:18AM (#4624137) Homepage Journal
    ``In the 1970s and early '80s, transferring documents from a computer's memory to permanent storage (such as a floppy disk) was slow. It took many seconds, and you had to wait for the transfer to finish before you could continue your work.''
    Wait...are they suggesting there weren't multi-tasking operating systems in the 1970s? What about Unix? Wasn't next to every system multi-user, multi-tasking in those days (timesharing)?
  • by flippet ( 582344 ) on Friday November 08, 2002 @07:22AM (#4624144) Homepage
    We have the technology. So why do we still punish people by including "Quit" or "Exit" menu items in programs? Cruft.

    So we should get rid of ways to close programs? I dread to think how much you'd have running if the computer is on for more than an hour or two.

    Phil, just me

    • So we should get rid of ways to close programs? I dread to think how much you'd have running if the computer is on for more than an hour or two.

      Let the system handle it. When it's idle, it can clean the processes up itself. Think of garbage collection under Java or flushing a disk cache...

    • We have the technology. So why do we still punish people by including "Quit" or "Exit" menu items in programs? Cruft.

      What he is implying is an SDI interface, where you use the "close this window" metaphor and, once all windows are closed, the application is gone. The problem is, not all types of applications work well as SDI. IDEs, for one, are better off as MDI applications.
  • by Katravax ( 21568 ) on Friday November 08, 2002 @07:23AM (#4624145)
    Some people are posting that they can't undo past a save, or don't like the slowdown of multiple apps, etc. By the time we can lose the cruft the author is talking about, you won't have that problem. The system will intelligently swap out applications and you'll be able to undo past save.

    If you're fighting for reasons we need the crappy methods we have today, the very methods the author was talking about, then you haven't thought it through. It's obvious there's a better way than we have now. It will take some intelligent design and programming to make it happen. That's all. We're smart enough to figure out how to make this happen, and shouldn't screw it up making excuses for why we have to keep old methods around.
  • by X_Caffeine ( 451624 ) on Friday November 08, 2002 @07:28AM (#4624149)
    Like most other articles and editorials that criticize GUI design, he points out a lot of flaws but offers few answers.


    Oddly enough, most of his complaints about the handling of files are being addressed in the next Microsoft file system, that's reportedly being based on ODBC (effectively turning the entire file system into a massive database -- the BeOS guys tried and failed at doing something similar).


    Perhaps the Windows right-click-drags he vilifies should be an "advanced feature" that has to be turned on manually, and maybe it isn't magically intuitive, but damn, I'd sure like to see him come up with an alternative that lets a user copy, move, or alias files with a single gesture this easily.

  • by melonman ( 608440 ) on Friday November 08, 2002 @07:30AM (#4624157) Journal

    He is of course right that you don't have to save your work when using a pencil. But, on the other hand, the eraser on the other end of the pencil won't wipe out 100 pages of work in half a second by accident either. Personally, I am very happy to take responsibility for losing my data, and eternally grateful that emacs has a 'revert buffer' option!

    More generally, why does "not exactly like a real desktop" equal bad? It's an analogy, right? Does he want files to start curling up at the edges after a couple of years, too?

  • About menus (Score:5, Insightful)

    by jeti ( 105266 ) on Friday November 08, 2002 @07:35AM (#4624169)
    Slightly OT:

    Did you notice the different feel of menus in common GUIs? Without tricks, it would be hard to select submenus. You have to keep the mouse pointer in a narrow 'tunnel'.

    MacOS Classic works around that problem by using a V shaped buffer zone. If you move your mouse to the right within a certain angle, the submenu doesn't change.
    MS used an inferior workaround. Submenus open with a delay, and you have to select them slowly or they won't open at all.

    KDE submenus work like the Windows ones. Gnome behaves like the old MacOS. Sadly enough, menus in MacOS X now work like the ones in Windows.

    The worst implementation is used by Swing. Submenus open with a delay, but close without one. You have to wait for a submenu to open, and when your pointer leaves the tunnel, it vanishes instantly!
    • Re:About menus (Score:5, Informative)

      by Mr_Silver ( 213637 ) on Friday November 08, 2002 @08:46AM (#4624337)
      MacOS Classic works around that problem by using a V shaped buffer zone. If you move your mouse to the right within a certain angle, the submenu doesn't change. MS used an inferior workaround. Submenus open with a delay, and you have to select them slowly or they won't open at all.

      The reason for this is that in the many studies Microsoft did into user interfaces, it found that users did not like sub-menus appearing immediately. They actually found it less intimidating when the sub-menu appeared after a second or two.

      It also reduces screen clutter. As you move through a menu, sub-menus don't appear until you come to rest on the option you want, rather than all the sub-menus popping open and then closing again as you move. Having all these sub-menus flashing about tended to unsettle users.

      Having said all that, there is a registry setting that can increase or decrease this delay to any number of milliseconds you want. I've no idea where it is, but I do remember TweakUI allowing you to change it.
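
      For what it's worth, the setting being described is, as far as I know, the per-user MenuShowDelay value under HKCU\Control Panel\Desktop -- the same value TweakUI exposes. A minimal sketch of changing it from Python (the delay chosen here is arbitrary):

          # Minimal sketch: change the sub-menu open delay described above.
          # MenuShowDelay is stored as a string, in milliseconds; Explorer
          # reads it at logon, so log off and back on to see the change.
          import winreg

          DELAY_MS = "100"  # Windows default is "400"; "0" opens sub-menus instantly

          with winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                              r"Control Panel\Desktop",
                              0, winreg.KEY_SET_VALUE) as key:
              winreg.SetValueEx(key, "MenuShowDelay", 0, winreg.REG_SZ, DELAY_MS)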

  • by Jugalator ( 259273 ) on Friday November 08, 2002 @07:42AM (#4624182) Journal
    Getting rid of Quit

    What does he mean by getting rid of Quit and Exit options? Should the application auto-close when I switch to another one (a bad idea for obvious reasons), or should applications just never close (whoa, look, my memory just ran out)?

    The alternatives to not exiting apps manually seem horrible to me, so I have to be missing something...?

    Windows' shortcut menus

    "In short, Windows native shortcut menus are so horrible to use that application developers would be best advised to implement their own shortcut menus which can be used with a single click, and avoid the native shortcut menus completely"

    Ctrl+drag = Copy
    Shift+drag = Move
    Alt+drag = Shortcut

    No modifier means the most likely operation according to Windows, i.e. Shortcut if dragging to the Start menu, Copy if dragging between local/network drives, Move if dragging between folders on the same drive.

    The operation Windows will perform is always shown with an icon next to the mouse pointer.

    I'm not sure how I'd design a quick move/copy/link operation better myself.
  • Acorn (Score:3, Interesting)

    by mirko ( 198274 ) on Friday November 08, 2002 @08:02AM (#4624240) Journal
    Some years ago, I ordered from Acorn the "RiscOS [riscos.com] developer reference manuals".
    It is not only an exhaustive reference but also comes with a style guide.
    I guess this style guide would be invaluable for non-RiscOS developers, especially after browsing through the Interface Hall of Shame...
    So, why don't these development suites (Visual Studio, etc.) come with such a book?
  • by Avalonia ( 169675 ) on Friday November 08, 2002 @08:11AM (#4624258) Homepage
    The save mechanism used by RISC OS [riscos.com], which consisted pretty much of just an icon representing your document that could be dragged either to the Filer (i.e. the file manager) or even directly into another running application, has yet to be beaten in terms of ease-of-use. The much-copied Windows Save As dialog box just doesn't cut it. Why should I have to tediously navigate to a directory in the save dialog box when there is a representation of it already on my screen?


    The ROX Desktop [sourceforge.net] has gone some way toward implementing this on X - rather than blindly re-implementing the Windows Way like so many other projects.

  • Many good points... (Score:5, Interesting)

    by jonr ( 1130 ) on Friday November 08, 2002 @08:21AM (#4624279) Homepage Journal
    I agree with many points. Modern GUIs use a lot of crutches to make themselves workable. Some problems and wishes:

    My favorite is the Save/Open dialog box, a relic from the single-tasking days; why do people use a crippled version of the file browser? RISC OS did it correctly: drag an icon from the app to the browser, or even to another application!
    Just get rid of the File menu altogether.
    Finding files is still a chore. I do miss BFS's instant and always-up-to-date live queries. Can we please have that, Apple/Microsoft/Others?

    Installation of applications, what is up with that? Why can't I just copy a file from the CD/net and be done with it?

    File IDs are a good idea; I could move apps/files around on BeOS and my shortcuts still worked!

    Why can't we implement version control transparently in the filesystems? It's hardly due to lack of space. Each save creates a new version. Word processors could even use intelligent versioning, like making a new version for each paragraph or chapter. Does Photoshop save undo in its files? (A rough sketch of the idea follows at the end of this comment.)

    Right now I am using very Explorer-like client for CVS, and it doesn't matter if I am browsing my hard drive or some remote server in Elbonia, it all looks the same, wonderful! :)

    I have been using XP for a while now, and it is making me quite frustrated. Why can't I use all that NTFS has to offer? (AFAIK, NTFS is pretty close to BFS in features; not sure about live queries, though.)

    What else... yes... Use open standards when data leaves the application!!!! I can't <em> this enough. BeOS sort of did this with its Translator service. Your image viewer/editor didn't even have to know how to load/save JPEG or PNG files!

    Well, enough of this rant. :)
    J.
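
    As a rough illustration of the transparent-versioning wish above (purely a sketch of the idea, not any real filesystem's behaviour; the names and limits are made up): every save silently snapshots the previous contents next to the document and prunes the oldest copies, which also answers the obvious "won't this fill my disk?" objection.

        # Illustrative sketch: transparent versioning done at save time, in the
        # absence of filesystem support. Keeps the newest MAX_VERSIONS snapshots.
        import shutil
        import time
        from pathlib import Path

        MAX_VERSIONS = 20

        def save_with_version(doc_path: Path, contents: str) -> None:
            doc_path = Path(doc_path)
            if doc_path.exists():
                # Snapshot the previous contents before overwriting.
                stamp = time.strftime("%Y%m%d-%H%M%S")
                versions = doc_path.parent / (doc_path.name + ".versions")
                versions.mkdir(exist_ok=True)
                shutil.copy2(doc_path, versions / f"{stamp}{doc_path.suffix}")
                # Prune the oldest snapshots (timestamped names sort oldest first).
                for old in sorted(versions.iterdir())[:-MAX_VERSIONS]:
                    old.unlink()
            doc_path.write_text(contents)

        # e.g. save_with_version(Path("letter.txt"), "Dear Bob, ...")

    A filesystem-level implementation would do the same work below the application, but the naming and pruning questions are identical.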
  • by z_gringo ( 452163 ) <(moc.liamtoh) (ta) (ognirg_z)> on Friday November 08, 2002 @08:29AM (#4624294)
    The article calls the fact that you have to do a manual save at least once per document a bad thing.

    Fortunately, technology has improved since the 1970s. We have the power, in today's computers, to pick a sensible name for a document, and to save it to a person's desktop as soon as she begins typing, just like a piece of paper in real life. We also have the ability to save changes to that document every couple of minutes (or, perhaps, every paragraph) without any user intervention.

    We have the technology. So why do we still make people save each of their documents, at least once, manually? Cruft.


    I don't want to name the document until I decide to save it. Does anyone else here want this feature? I create many documents every day - to re-format, print, view differently, cut/paste from the web for printing, for email, etc. - and I don't want my hard drive cluttered with this crap. That's why I don't save it. Yet this Matthew Thomas guy thinks this would be good. I think his first example of "cruft" is a bad one.

  • by tlambert ( 566799 ) on Friday November 08, 2002 @08:41AM (#4624319)
    Personally, I don't like cruft, but the way he wants to "correct" some of the things he doesn't like, well... the cure is worse than the disease.

    My idea of hell is an editor that auto-saves code that I'm in the process of hacking up in an editor to let me think about the problem over top of code that already works.

    My idea of hell is a platform where every document I've ever opened has no way to close it and no way to exit the application that's got it up in a window, because there's no 'Quit' or 'Exit' option.

    My idea of hell is not being able to drag something in a GUI from one folder to another, because they have an obscure "parent of my parent" relationship, which makes me have to cut and paste the document instead of just dragging it, because I only have one file manager, which is running all the time, instead of a "file picker".

    My idea of hell is symbolic links that get changed when I rename a file out from under them because the OS thinks it knows what I want better than I do, making it impossible to replace a file with another while keeping the old one, unless you copy it, rename the original, rename the copy, and then edit the original (instead of replacing it).

    -- Terry
  • by Fastball ( 91927 ) on Friday November 08, 2002 @09:14AM (#4624425) Journal
    Look, I don't disagree that UIs are slow to evolve, but isn't that good to a degree? What's the point of an interface that is completely different version to version? Fact is, on the PC desktop, we've basically reached UI valhalla. There's not much more you can do *better* with a mouse and keyboard.

    Now, if the hardware were to change such that we weren't tethered to mice and keyboards, then I can see some interesting possibilities. But things being what they are, I'm quite content with my shell and VI.

  • by dpilot ( 134227 ) on Friday November 08, 2002 @09:31AM (#4624501) Homepage Journal
    For the most part, we have only ourselves to blame for this cruft problem. Alternatives have already been brought to market and died. Maybe we were short-sighted consumers?

    Example 1: Single-level store
    Actually this one survived in the market - sort of. It was part of the old IBM Future Systems effort, and made it out the door as the System/38, with follow-ons in the AS/400 and iSeries. Single-level store says you get rid of that silly distinction between RAM and disk - everything is memory. What we quaintly call main memory is merely another level of caching for the disk, which is the real memory. Then you make the big thing a database instead of just a filesystem, and it can readily solve pretty much all of his numbered problems in one fell swoop. Was this perhaps something like Longhorn, only about 20 years ago?

    The System/38 and its descendants have met with success, largely as the closest thing to 'just plug it in and run' that has made it to market. At another level it hasn't been that successful, largely because of its unconventional and rather opaque system model.

    As an interesting aside, IBM's first entry into the workstation arena, the Romp microprocessor, also had single-level store capability (actually expressed in inverted page tables). Then, in order to make it more Unix-familiar, they mapped a conventional filesystem on top of that. I don't know if Rios and PowerPC follow-ons retained that capability or went to more conventional paging architectures.

    Double aside: Romp/Rios/PowerPC are yet another fallout of the Future Systems effort. Any big project has a backup plan, and one of the backup plans for FS was the 801 minicomputer, the original RISC.

    Example 2: The OS/2 Workplace Shell
    Just a bunch of UI glue, but what a bunch of it! It directly solved the broken-link problems, and had a more consistent, if different, set of mouse semantics. It also had a group feature that kind of got around his 'quit' problems.

    But I disagree with overusing the inode the way he suggests. The inode is an internal structure and isn't meant to have a UI-level life. He really wants access to Data, not to Filename or Inode. Does he really want a database-type filesystem?

    My own fantasy is a semi-conventional filesystem, but instead of a conventional directory structure use a semantic network. The role of directory navigation is taken on by relationships. It's an incomplete idea at the moment, though.
  • by west ( 39918 ) on Friday November 08, 2002 @10:10AM (#4624741)
    After accidentally hitting a geocities site, I now have to *manually* close 150 pop-under/over/beside windows, each one of which pops up another 150 windows.

    I want IE dead, and I want it now!

    Where's my Exit menu item?

    (I know, I know, it's in Mozilla. Time to switch.)
  • by jdkane ( 588293 ) on Friday November 08, 2002 @10:32AM (#4624901)
    We also have the ability to save changes to that document every couple of minutes (or, perhaps, every paragraph) without any user intervention.
    It's called "auto-save". The feature already exists in most word-processors.

    We have the technology. So why do we still make people save each of their documents, at least once, manually? Cruft.
    Well, maybe we want to allow people the choice of whether or not the work gets saved. So to make it less crufty for the user, should we auto-save a different document every time and fill up the user's hard drive? Then the user doesn't have cruft anymore, but does have to look through dozens of similar documents in order to revert changes.

    Interesting article. I like it, but the author doesn't appear to have a good concept of what cruft really should be.

  • by os2fan ( 254461 ) on Friday November 08, 2002 @10:47AM (#4625035) Homepage
    I had a read of the article.

    The issue is not so much that extra features have been added, but that the intent is not correctly communicated, or is inappropriate.

    For example, the WPS applied to the OS/2 desktop is a wondrous thing, one that people desire in other systems. When this file-viewing device is applied to files in general [eg DRIVES], the result is a nightmare. Drives is *not* one of OS/2's better features.

    Windows copied this feature into their shell, along with a network browser. Unlike the OS/2 ones, these *cannot be hidden*, especially without corrupting the operating system. [Deleting Network Neighbourhood removes UNC support.]

    It's not that the "start menu" is totally bad either. It relies on an established practice of menus. So does the send-to [as a configurable context menu that allows drag-and-drop to otherwise hidden targets]. Folders = submenus, so you can have submenus in the send-to as well.

    It's not that one can't make the Windows shell liveable. Create a directory called grotto, and move these folders from Windows: sendto, start menu, desktop, shellnew, recent. You can create other folders there as well.

    Create an icon with the command
    explorer.exe /e,/root=c:\grotto,start menu

    This gives you a super-program-manager from which you can fix your start menu, send-to, etc., as well as drag recently edited docs out to the desktop as shortcuts.

    The other issue, of what happens when one closes dialogs (as to whether it's an OK or a Cancel), frustrates users no end.

    The issue is not so much cruft as a lack of consistency. Were cars like this, they would be hazardous.

  • by thatguywhoiam ( 524290 ) on Friday November 08, 2002 @10:54AM (#4625098)
    (as in the former Wired columnist)

    As off-base as I think the author is, it's good to think this way. Even if it's not practical or better.

    Save is an advantage, not an obstacle. The article's author limits the use of Save to things like word processing (immediately betraying his lack of experience with more esoteric formats). As others will surely point out, Save can save you when something you're working on goes off on a tangent. Besides, Word (and others) can AutoSave.

    Launching/Quitting programs, while arguably cruft, has been accepted insofar as people do like the tool metaphor. You use a jigsaw to cut complicated shapes in wood. A screwdriver for attaching things. Photoshop for graphics, etc. Although I will admit Quit is getting a bit weird... esp. on modern systems like OS X, where there isn't that much of a reason to Quit things. I still do it out of habit.

    Filenames are... well, filenames, and they don't seem to ever change. I don't really see a disadvantage with a 256-character filename. The dot-3 suffix is a bit of an anachronism, but it's a comfy one, one that gives the user a bit of metadata as well as the computer. Windows' behaviour of only recognizing files with certain suffixes in file dialogs by default has reinforced this.

    I don't know what he means by the File Picker. I launch/pick stuff from the Finder all the time.

    What I'd really like to see is a better representation of relationships between files. Something akin to The Brain, or another derivative of Xerox's Hyperbolic Tree structures. Radial menus, with branches running in various directions to the related objects/files, have been proven to be more effective than lists of data (there's something humans like about remembering angles as opposed to just the names). People themselves need a better representation, too. iChat has taken baby steps towards this, but really, ponder for a moment; why can you not see the heads of all your friends popping up on your desktop? Why is it that we have to 'browse' for other people, either through an AIM window (another list) or some such mechanism? If I get an email from Mike I want to see a mail icon next to Mike's head. I want to send files to Mike by dropping them onto his head. I want to see *everything* that is related to Mike at a click.

    Also, to mention another pet peeve: themes. People love themes. People abuse themes. There is a need here that has never been addressed fully, IMHO - the problem is that people are dressing up the cosmetics of the interface while doing nothing to change the behaviour. It's sort of ridiculous to think that we can come up with a Swiss Army Knife interface that will be maximally productive for all conceivable computer tasks. I've actually taken to creating several different accounts on OS X, each with their own specialized settings. If I'm doing graphics work, I want everything to look like a light table; big shiny icon previews, sorted by meta data (time photo was taken, type of graphic, etc.) If I'm doing coding, I want list views or column views everywhere, and lots of reporting tools running on the desktop (bandwidth, terminal). There really should be interface schemas that can switch on-the-fly to whatever sort of task you are engaged in.

  • by bmabray ( 84486 ) on Friday November 08, 2002 @11:39AM (#4625423) Homepage Journal
    ...in the "Interface Hall of Shame" using frames? :-)
  • Save and Exit (Score:4, Insightful)

    by Bugmaster ( 227959 ) on Friday November 08, 2002 @12:04PM (#4625576) Homepage
    Much of the stuff he says is true. However, I think the author gets carried away when he declares the "Save" and "Exit" commands as "Cruft!".

    These commands are, IMO, actually examples of good interface design. The (unwritten) rule these commands are implementing is,

    A program should not try to outsmart the user
    Let's say the "Save" command was automatic. Where are the files saved ? Under what name ? How often ? What if I made a mistake, and want to restore the old file -- is it possible ? How far back can I go ? Infinitely back ? What if I don't have infinite disk space ? Etc. etc. Instead of making a program that would try to solve these questions for all people and all applications at once, I can simply tell the user, "look, when you want to permanently persist your document, hit Save". This is a lot better in the long run than a program that would overwrite your files every so often because it feels like it.

    Similarly, the "Quit" command is useful. Without it, applications would just pile up on the screen and in RAM. When I am done with writing my letter, and want to play some Warcraft 3, I want to close MS Word and open WC3. It's very natural; just as when I am done working and want to go to the beach, I take off my suit and put on swimming trunks. If I could not quit any programs, they would pile up like layers upon layers of dirty clothes -- media players, web browsers, p2p programs, text editors, word processors, compilers, virtual machines, graphics editors... and that's just what I use before breakfast ! Yes, it would be nice if we had infinite CPU, RAM, disk space and screen space, but we don't, so the "Quit" command is the next best thing.

    Note that, ironically, on single-threaded OSs, such as PalmOS, the Quit command is actually not neccessary. There, it makes a lot more sense to just save the state of the current program when you want to run something else, then restore the state when you reactivate the program. This only works because you can run one program at a time, and that's it, so there's no room for confusion.

    Contrast the ease of use of the "Save" and "Quit" commands to other commands which have been implemented as automatic agents, just as the article suggests. MS Word's "auto-correct" (more like auto-confuse), Visual Studio's auto-complete (it knows best what function you want to call) and that damn paperclip all come to mind. My computer is not psychic (yet); it cannot sense what I want to do. If it tries to predict what I want to do ahead of time, it will fail and mess up.

  • by Ilan Volow ( 539597 ) on Friday November 08, 2002 @12:17PM (#4625664) Homepage
    The author of the article, Matthew Thomas, also wrote two very good pieces
    "Why Free Software Usability Tends To Suck" [phrasewise.com]
    "Why Free Software Usability Tends To Suck Even More" [phrasewise.com].

    They are an eye-opener for anyone who has wondered why Linux is still not ready for the desktop despite the presence of so many talented programmers in the Free Software community.

  • by mhackarbie ( 593426 ) on Friday November 08, 2002 @12:58PM (#4626051) Homepage Journal
    It's fine to try to improve on things, but Thomas makes a fundamental error in dismissing things like 'Save' and 'Quit' commands as outdated. Even though hardware speeds have increased, so have document sizes, and many people work with huge files where continuous saving would bring their work to a screeching halt.

    I notice quite often that people who try to analyze the shortcomings of current UIs make this kind of error because they are not aware of the diversity of needs that must be served.

    For people who like to develop new, improved and consistent UI models, I would suggest that they also spend some time in describing the particular context and subset of computer users for which this model would apply.

  • by w3woody ( 44457 ) on Friday November 08, 2002 @01:15PM (#4626192) Homepage
    I strongly disagree with the author's comment about saving documents, but not for the reason that most people think.

    The problem with eliminating save and only having one version (the version that is currently open, which is reflected in the file on the disk) is that you eliminate a primitive sort of "versioning" where the saved document on disk represents an older version of the document. The "save" command becomes a sort of "push current version out" and "revert" becomes a sort of "roll back changes to last version."

    Now I would happily eliminate the "save" menu from my programs, but only if we could replace it with a "mark version" and "rollback version" command which would allow me to maintain several versions of the same document. That is, I wouldn't mind if we created a word processor which saved the current version of the document to disk as it was typed, but only if I have the power to mark different document versions and either roll back to an older version or mark the contents as the current version in that file.

    I strongly believe this is the reason why eliminating the "save" command was not accepted by MacOS usability testing when they were working on OpenDoc. OpenDoc eliminated file saving entirely, and users hated it--because users were using the "save" command and "revert" command as a sort of "commit changes/rollback changes" command--that is, as a primitive way of version control. And OpenDoc, by eliminating the save command, took that away from users.

    Don't take away file version control; give me a more powerful version of it!
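
    To make the "mark version" / "rollback version" idea concrete, here is a purely illustrative sketch layered on top of continuous saving; the function names and the "@label" naming scheme are invented for the example and are not from the article or OpenDoc:

        # Illustrative sketch: explicit checkpoints on top of an auto-saved file.
        # mark_version() copies the live document to a labelled snapshot;
        # rollback_version() restores that snapshot over the live document.
        import shutil
        from pathlib import Path

        def mark_version(doc: Path, label: str) -> Path:
            snapshot = doc.with_name(f"{doc.stem}@{label}{doc.suffix}")
            shutil.copy2(doc, snapshot)
            return snapshot

        def rollback_version(doc: Path, label: str) -> None:
            snapshot = doc.with_name(f"{doc.stem}@{label}{doc.suffix}")
            shutil.copy2(snapshot, doc)

        # Example: checkpoint before a risky rewrite, then roll back if needed.
        # mark_version(Path("chapter1.txt"), "before-rewrite")
        # rollback_version(Path("chapter1.txt"), "before-rewrite")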
