The Unix-Haters Handbook
|"Many say that DOS is the dark side [from Star Wars], but actually UNIX is more
like the dark side: It's less likely to find the one way to destroy your incredibly powerful
machine, and more likely to make upper management choke."
~ Lore Sjöberg, noted Internet humorist
The Unix-Haters Handbook is a Unix classic published in 1994 by IDG Books. It originated from messages to the Unix-Haters mailing list. The messages were edited and compiled into the book by Simson Garfinkel, Daniel Weise and Steven Strassmann. What is interesting is that the book has two forewords instead of one: a regular foreword by Donald Norman and an "anti-foreword" by Dennis Ritchie.
The book was started by Daniel Weise and continued by Simson Garfinkel and Steven Strassmann. Together they created a masterpiece by culling the best of the Unix-Haters mailing list posts and organizing the collection into a readable book. The book is now available online for free.
The book serves as a requiem for now largely forgotten operating systems such as VAX/VMS, VM/CMS, OS/360 and their derivatives, all of which from the 1980s on quickly lost ground to cheaper Unix workstations. Losing the skill set you have spent a decade or two acquiring, or being forced to use the Unix C compiler after IBM's PL/1 compilers, is pretty painful, and a lot of the bile on the Unix-Haters mailing list can be explained by the fact that the list provided a perfect outlet for such feelings.
Now, with Solaris losing ground, we could probably create a second volume by systematizing the bad feelings of Solaris users and sysadmins toward Linux ;-). Or an even more biting book about the experiences of those who try to move to Linux from Windows. See, for example, the Linux Hater's Blog.
This book contains some valuable anecdotes about the dark side of Unix, written by pretty knowledgeable specialists. Some of the chapter subtitles include "Power Tools for Power Fools" and, for the C++ chapter, "The COBOL of the 90s." It reminds us that while Unix is powerful and elegant, it is far from the embodiment of divine perfection. It essentially points out how new technology makes older designs obsolete, turning them into more of a straitjacket than a help. But truth be told, many of the contributors do not see the forest for the trees, and their postings suffer from the shortsightedness typical of people who miss the bigger picture.
The critique of the Unix interface to character terminals remains valid. It is a horrible kludge that nobody has tried to replace with a sound subsystem, even of, say, MS DOS caliber, despite the fact that the typical terminal is now a PC. Support for AT keyboards is also weak, as can be seen in programs that originated in the MS DOS world, such as Midnight Commander.
Shortcomings of C as a system programming language are also well known, and the critique of C's handling of arrays is on target: the absence of built-in checks for index overflow (bounds checking) is a real, unmitigated disaster.
The key set of topics revolves around various kinds of user frustration with Unix. Some of them are legitimate issues connected with the complexity of the system, which makes learning the details of its functionality a pretty challenging task even for specialists with years of experience. The "worse is better" design philosophy led to a pretty steep learning curve for anybody coming from, say, a Windows background, and the power of Unix cannot be unleashed without extensive training. The book is now rather dated, with most of the material describing Unix as it existed in the early 1990s. Some problems mentioned in the book no longer exist (for example, the lack of a journaling file system), and on some topics subsequent development has shown that the authors were barking up the wrong tree.
But a large proportion of the complaints are about anomalies in the command-line interface, and this problem still exists and was almost 100% inherited by Linux. That is what makes this book a classic.
The book clearly shows the deficiencies of the Unix command-line interface and raises the question of why so little progress has been achieved in the twenty years since the book was published. In a way it should be bedtime reading for Linus Torvalds ;-). The question of why nobody has tried to build a next-generation command interface remains unanswered even today. If Unix sysadmins were paid ten cents for each ls, cd, vi and cp command they type, it would add up to decent pocket money each month. For some reason Unix sysadmins have been very slow to adopt DOS-style visual shell environments such as those provided by Midnight Commander and other Orthodox file managers.
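As a toy illustration of just how repetitive that interface is, the tallying itself is a one-liner; the history file below is synthetic, but with a real ~/.bash_history the pipeline is exactly the same:

```shell
# Tally the most-typed commands from a (made-up) shell history file.
hist=$(mktemp)
printf 'ls\ncd /tmp\nls -l\nvi notes.txt\ncp a b\nls\n' > "$hist"
awk '{print $1}' "$hist" |      # keep only the command word
  sort | uniq -c | sort -rn |   # count occurrences, rank by frequency
  head -5
```

Running this against a real history file usually puts ls, cd and an editor at the top, which is the point of the ten-cents joke.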
Another problem is that way too many Unix utilities have roots in the 1970s and idiosyncratic interfaces. The classic example remains the complexity of the idiosyncratic interface (actually a mini-language) of the find utility (although it solves a complex task, so criticizing it for complexity alone is not altogether fair ;-), along with the six types of regular expressions used across different utilities. On the other hand, Unix has proved remarkably stable over all those years, and most of the material remains current.
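A small sketch of the find mini-language in question (the directory and file names are invented for the example):

```shell
# find's predicates form a tiny expression language: tests are ANDed
# implicitly, -o is OR with lower precedence, and the grouping parentheses
# must be escaped so the shell does not eat them.
dir=$(mktemp -d)
touch "$dir/main.c" "$dir/util.h" "$dir/README"
find "$dir" \( -name '*.c' -o -name '*.h' \) -print | sort
# Without the \( \), -print would bind only to the second -name test.
```

The escaping, the operator precedence, and the fact that -print is itself an expression term are exactly the sort of 1970s idiosyncrasy the book complains about.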
The book was written for IT professionals who already know Unix well. It provides some insights into the origins of some Unix utilities and behaviors, such as the origin of some command names and the quirky mechanics of the man page reference system.
The front page has a humorous dedication: "To Ken and Dennis, without whom this book would not have been possible," in reference to Ken Thompson and Dennis Ritchie, the creators of Unix. As we mentioned above, Dennis Ritchie provided an "anti-foreword" for the book.
1 Unix
- The World’s First Computer Virus
- History of the Plague
- Sex, Drugs, and Unix
- Standardizing Unconformity
- Unix Myths
2 Welcome, New User!
- Like Russian Roulette with Six Bullets Loaded
- Cryptic Command Names
- Accidents Will Happen
- Consistently Inconsistent
- Online Documentation
- Error Messages and Error Checking, NOT!
- The Unix Attitude
3 Documentation?
- What Documentation?
- On-line Documentation
- This Is Internal Documentation?
- For Programmers, Not Users
- Unix Without Words: A Course Proposal
4 Mail
- Don’t Talk to Me, I’m Not a Typewriter!
- Sendmail: The Vietnam of Berkeley Unix
- Subject: Returned Mail: User Unknown
- From: <MAILER-DAEMON@berkeley.edu>
- Apple Computer’s Mail Disaster of 1991
5 Snoozenet
- I Post, Therefore I Am
- Netnews and Usenet: Anarchy Through Growth
- Alt.massive.flamage
- This Information Highway Needs Information
- rn, trn: You Get What You Pay for
- When in Doubt, Post
- Seven Stages of Snoozenet
6 Terminal Insanity
- Curses! Foiled Again!
- Original Sin
- The Magic of Curses
7 The X-Windows Disaster
- How to Make a 50-MIPS Workstation Run Like a 4.77MHz IBM PC
- X: The First Fully Modular Software Disaster
- X Myths
- X Graphics: Square Peg in a Round Hole
- X: On the Road to Nowhere
8 csh, pipes, and find
- Power Tools for Power Fools
- The Shell Game
- Shell Programming
- Pipes
9 Programming
- Hold Still, This Won’t Hurt a Bit
- The Wonderful Unix Programming Environment
- Programming in Plato’s Cave
- “It Can’t Be a Bug, My Makefile Depends on It!”
- If You Can’t Fix It, Restart It!
10 C++
- The COBOL of the 90s
- The Assembly Language of Object-Oriented Programming
- Syntax Syrup of Ipecac
- Abstract What?
- C++ Is to C as Lung Cancer Is to Lung
- The Evolution of a Programmer
11 System Administration
- Unix’s Hidden Cost
- Keeping Unix Running and Tuned
- Disk Partitions and Backups
- Configuration Files
- Maintaining Mail Services
- Where Did I Go Wrong?
12 Security
- Oh, I’m Sorry, Sir, Go Ahead, I Didn’t Realize You Were Root
- The Oxymoronic World of Unix Security
- Holes in the Armor
- The Worms Crawl In
13 The File System
- Sure It Corrupts Your Files, But Look How Fast It Is!
- What’s a File System?
- UFS: The Root of All Evil
14 NFS
- Nightmare File System
- Not Fully Serviceable
- No File Security
- Not File System Specific? (Not Quite)
Epilogue
- Enlightenment Through Unix
B Creators Admit C, Unix Were Hoax
- FOR IMMEDIATE RELEASE
C The Rise of Worse Is Better
- By Richard P. Gabriel
- Just When You Thought You Were Out of the Woods…
Review: "Interesting footnote to an 80's platform war" by Doctor Goats (Wellington, New Zealand)
August 26, 2010
I'll get this out of the way first: the book is a polemic, and I've no idea how serious the authors are. Given when it was written and the example annoyances it comes across as a requiem for the operating systems on minicomputers and mainframes, which from the 1980s on lost ground to cheaper Unix workstations.
Losing the skillset you've spent the last decade or two perfecting isn't easy, and the unix-haters mailing list appears to have provided the perfect outlet.
The contributors to the mailing list and subsequent book are all technical, and as such are in an ideal position to articulate criticisms. Many of the criticisms are of a historical nature, even at the time of writing; many seem to be aimed at a different target (e.g. Usenet or Sendmail), but try to drag Unix in by association; and some are spot on and could be updated and expanded to modern *nix.
But... having power users write the book does have a couple of downsides. Firstly, you'll need Unix familiarity to know what they're talking about. Secondly, the book goes overboard with the nit-picking: e.g. some functionality that the user of one operating system likes is absent in Unix -- never mind that it's also absent in almost every other OS.
Also, the fact that we're comparing what were then called "open systems" with the legacy systems they displaced may be lost on a modern reader. This book is an historical footnote to a high-end 80s platform war, of interest to anyone who was around for it.

Review: "This book would be hilarious if it weren't so painfully true" by Amazon Customer (Cambridge, MA, USA)
November 27, 1996
My wife hates this book: "He kept waking me up all night by either screaming in anger or shaking with laughter."
Although flawed by repeating now-obsolete tales from Unix's chequered history, so much of it still rings far too true.
It shines a glaring light of insight at the background design (or lack of it) that is *still* going into Unix.
For that reason alone it's a must-read for anyone charged with using or supporting it.
The book ranges from a petulant "backword" from Unix-Father Dennis Ritchie (synopsis: "Phhtttt"), through a tears-to-the-eyes explanation of how "sendmail" processes your mail "like a bat chasing insects in a cave, looping and swerving",
to an explanation of how the X-window system got that way, and why it's still blindly heading in that direction. This book is a brilliant beacon down the slippery path known as Unix.
It's so good people keep stealing mine, so the above quotes may be slightly inaccurate.
Even a cursory reading of this book will remove all surprise at announcements like Win95's ability to crash half the Unix systems on the planet with a one-line command.
Highly recommended (and, if you've got my copy, please return it).
Buy it, read it, and use it as ammo when people tell you "we need to switch to Unix."

Review: "Entertaining and often even true -- now free!"
January 25, 2004
This is a breezy book poking fun at the foibles of Unix. As a sarcastic screed, it is not at all balanced or fair or reasonable, or even necessarily historically accurate. But it is valuable.
It is valuable because in many ways it is a catalog of design errors that you can make when putting together a system -- any system. Designers of new systems should be able to learn from it.
It is valuable because it shows you how over time design decisions and compromises that seemed reasonable can come to seem ridiculous.
It is valuable because it really does show you that "Worse is Better". That is, Unix really did survive, and all the 'better' systems like Multics and Tenex failed (and of course they weren't necessarily better across the board). There is a lesson here for engineers who don't understand that making the 'best' product by some narrow technical definition does NOT guarantee market success.
It is valuable because it documents some of the *alternatives* to doing things the Unix way. Not enough to substitute for studying Multics and whatever, but valuable nonetheless.
It is valuable because many of the analyses of Unix apply to other systems, certainly including MS-DOS and Windows. Yes, Windows does some things better, and some things worse. But you're smart; you can figure out how to transpose the analysis.
Finally, it is valuable because it punctures the pretensions of those who hold up Unix (and Linux) as images of perfection.
The Unix Hater’s Handbook, Reconsidered (Eric S. Raymond, from his blog Armed and Dangerous)
A commenter on my post pre-announcing Why C++ Is Not Our Favorite Programming Language asked “esr, from the perspective of a graybeard, which chapters did you consider good and which chapters did you consider bad?”
(Technical note: I do not in fact have a beard, and if I did it would not be gray.)
Good question, and worthy of a blog entry. I was the first technical reviewer for the manuscript of this book back when it was in preparation — IDG published it, but I think it was passed to me through MIT Press. As I noted in the same comment thread, I worked hard at trying to persuade the authors to tone down the spleen level in favor of making a stronger technical case, but didn't have much success.
They wanted to rant, and by God they were gonna rant, and no mere reviewer was gonna stop them.
I’ve thought this was a shame ever since. I am, of course, a long-time Unix fan; I’d hardly have written The Art of Unix Programming otherwise. I thought a book that soberly administered some salutary and well-directed shocks to the Unix community would be a good thing; instead, many of their good points were obscured by surrounding drifts of misdirected snark.
You can browse the Handbook itself here. What follows is my appraisal of how it reads 14 years later, written in real-time as I reread it. After the chapter-by-chapter re-review I’ll sum up and make some general remarks.
Introduction I have a lot of respect for Don Norman, but he did not write this on one of his better days. The attempts at intentional humor mostly fall flat. And “…now that's an oxymoron, a graphical user interface for Unix” looks unintentionally humorous in 2008. Otherwise there’s very little content here.
Preface Similarly unfortunate. Sets the tone for too much of the rest of the book, being mostly hyperbolic snark when it could have been useful criticism. Very dated snark, too, in today’s environment of Linuxes wrapped in rather slick GUIs. The anecdotes about terminal sessions on Sun hardware from 1987 look pretty creaky and crabby today.
The authors write: “It's tempting to write us off as envious malcontents, romantic keepers of memories of systems put to pasture by the commercial success of Unix, but it would be an error to do so: our judgments are keen, our sense of the possible pure, and our outrage authentic.” I know and rather like some of the authors, so it actually makes me a little sad to report that fourteen years later, writing them off this way is easier than ever.
Anti-Foreword: Dennis Ritchie’s rejoinder is still funny, and his opening and concluding words are still an accurate judgment on the book as a whole:
I have succumbed to the temptation you offered in your preface: I do write you off as envious malcontents and romantic keepers of memories. The systems you remember so fondly [...] are not just out to pasture, they are fertilizing it from below. [... Y]our book is a pudding stuffed with apposite observations, many well-conceived. Like excrement, it contains enough undigested nuggets of nutrition to sustain life for some. But it is not a tasty pie: it reeks too much of contempt and of envy.
Chapter 1 – Unix: The World's First Computer Virus About equal parts of history and polemic, history not new, polemic cleverly written but not very interesting once you get through chuckling at the verbal pyrotechnics. The stuff on the standards wars is really dated now.
And no mention of Linux, which had just acquired TCP/IP support the year this was written and already had a thriving community. No one can blame the authors for not foreseeing 2008, but it’s as though they were mentally stuck in the 1980s, oblivious to the reality of 1994.
Chapter 2 – Welcome, New User! There is some fair criticism in here. Yes, Unix command names are cryptic and this is a UI issue — mitigated a lot in 2008 by the presence of GUIs that no longer suck, but still an issue.
Quality of documentation still ain’t so great, and comprehensible and useful feedback when commands fail is something Linux applications could stand to be a lot better at. But these are problems almost everywhere, not just Unix-land; it seems a bit unfair to reproach Unix for special sinfulness on their account.
On other points they do less well. “Consistency and predictability in how commands behave and in how they interpret their options and arguments” sounds very nice, but the first part is impossible (different commands have to behave differently because they solve different problems) and the second is more a gripe about shell wildcard expansion than anything else. Sorry, no sale; it’s too useful. If you don’t like the way rm * works, go fire up a GUI file manager.
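The wildcard-expansion gripe is easy to reproduce in a sandbox; the file names here are invented, and the option-parsing behavior shown is that of GNU rm:

```shell
# It is the shell, not rm, that expands '*', so a file named '-rf' arrives
# in rm's argument list looking exactly like options.
dir=$(mktemp -d)
cd "$dir"
touch -- -rf precious.txt   # '--' stops touch itself from parsing '-rf'
rm *                        # rm takes '-rf' as options and deletes precious.txt
ls -A                       # only the file named '-rf' survives
rm ./-rf                    # the safe spellings: ./-rf, or rm -- -rf
```

Whether this is a defensible design (programs never see the wildcard) or a trap (programs cannot tell filenames from options) is exactly the argument the chapter is having.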
Then there’s a lot of flamage about Unix developers being supposedly content to write shoddy programs. Neither true nor interesting, just more hyperbolic snark obscuring the points on which they have a point.
There are some good nuggets in this chapter, but on balance digging through the excrement to find them does not seem worth it.
Chapter 3 – Documentation? What Documentation? Yes, man(1) is still clunky and man pages are still references, not tutorials. This isn’t news in 2008, and wasn’t in 1994, either. The difference is that in 2008 the man page style they’re excoriating is less of a blocker; we have the Web and search engines now.
Their fling at “the source code is the documentation” has got some unintentional irony since open source happened. Gripes about obsolete shells don’t add much, if anything, to the discussion.
Much that’s true in this chapter, but almost nothing that’s still useful or novel in 2008.
Chapter 4 – Mail: Don't Talk to Me, I'm Not a Typewriter This is mostly a rant against sendmail. Most of the criticism is justified. A lot of Linux distributions default to using Postfix these days, and the percentage is increasing; end of story.
Chapter 5 – Snoozenet: I Post, Therefore I Am USENET is not exactly dead, but these days it’s mainly a relay channel for p2p media sharing and porn. There’s some stuff in this chapter of interest to historians of hackerdom, but nothing relevant to current conditions.
Chapter 6 – Terminal Insanity: This chapter has dated really badly. To a good first approximation there simply aren’t any actual VDTs any more; one sees a few on obsolete point-of-sale systems, but that’s about it. It’s all terminal emulators or the OS console driver, they all speak VT100/ANSI, end of story, end of problem.
There’s an attempt at an architectural point buried in the snark. Yes, it would have been really nice if Unix kernels had presented a uniform screen-painting API rather than leaving the job to a userspace library like curses(3). But — and I speak as an expert here, having implemented large parts of ncurses, the open-source emulation of it — moving all that complexity to kernel level would basically have solved nothing. The fundamental problem was that Unix (unlike the earlier systems these guys were romantically pining for) needed to talk to lots of VDTs that didn’t identify themselves to the system (so you couldn’t autoconfigure them), and the different VDT types had complicatedly different command sets. The stuff that curses did had to exist somewhere, and so did its capability databases; putting it all in a service library in userspace at least guaranteed that bugs in this rather tricky code would not crash the system.
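The capability-database machinery in question can still be poked at from the shell; this sketch assumes the ncurses tput utility and the standard terminfo entries for vt100 and xterm are installed:

```shell
# tput consults the terminfo database: the abstract operation "cursor
# position" (cup) is translated into whatever escape sequence the named
# terminal type needs. od makes the nonprinting bytes visible.
tput -T vt100 cup 5 10 | od -An -c   # VT100 addressing (row/col count from 0)
tput -T xterm setaf 1 | od -An -c    # xterm "set foreground color 1" (red)
```

This is the userspace answer to the non-self-identifying-terminal problem he describes: one database entry per terminal type, one library to interpret it.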
But this is yesterday’s issue; the VDT is dead, and the problems they’re griping about dead along with it.
Chapter 7 – The X-Windows Disaster This chapter begins unfortunately, with complaints about X’s performance and memory usage that seem rather quaint when comparing it to the applications of 14 years later. It continues with a fling at the sparseness of X applications circa 1990 which is unintentionally funny when read from within evince on a Linux desktop also hosting the Emacs instance I’m writing in, a toolbar with about a dozen applets on it, and a Web browser.
I judge that the authors’ rejection of mechanism/policy separation as a guiding principle of X was foundationally mistaken. I argued in The Art of Unix Programming that this principle gives X an ability to adapt to new technologies and new thinking about UI that no competitor has ever matched. I still think that’s true.
But not all the feces flung in this chapter is misdirected; Motif really did suck pretty badly, it’s a good thing it’s dead. ICCCM is about as horrible as the authors describe, but that’s hard to notice these days because modern toolkits and window managers do a pretty good job of hiding the ugliness from applications.
Though it’s not explicitly credited, I’m fairly sure most of this chapter was written by Don Hopkins. Don is a wizard hacker and a good man who got caught on the wrong side of history, investing a lot of effort in Sun’s NeWS just before it got steamrollered by X, and this chapter is best read as the same bitter lament for NeWS I heard from him face to face in the late 1980s.
Don may have been right, architecturally speaking. But X did not win by accident; it clobbered NeWS essentially because it was open source while NeWS was not. In the 20 years after 1987 that meant enough people put in enough work that X got un-broken, notably when Keith Packard came back after 2001 and completely rewrote the rendering core. The nasty resources system is pretty much bypassed by modern toolkits. X-extension hell and the device portability problems the authors were so aggrieved by turned out to be a temporary phenomenon while people were still working on understanding the 2D-graphics problem space.
That having been said, Olin Shivers’s rant about xauth is still pretty funny and I’m glad I haven’t had to use it in years.
Chapter 8 – csh, pipes, and find: Power Tools for Power Fools Of the “plethora of incompatible shells” they anatomize in the first part of this chapter (including the csh in the chapter title), most are basically dead; the bash shell won. Accordingly, a lot of this chapter is just archeology, known to old farts like me but about as relevant to present-day Linux or Unix users as Ptolemaic epicycles.
The portability problem in shell programming is almost, though not quite, as historical now. Languages like Perl and Python have replaced the kind of fragile shell scripting the authors fling at — in fact, that’s why they’re called scripting languages. The authors anticipated this development:
At the risk of sounding like a hopeless dream keeper of the intergalactic space, we submit that the correct model is procedure call (either local or remote) in a language that allows first-class structures (which C gained during its adolescence) and functional composition.
I give them credit, they were right about this. It seems curious, though, that they exhibited no knowledge of Perl; it had already been supplying exactly this sort of thing in public view for some years in 1994.
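A sketch of the kind of fragile shell scripting that scripting languages displaced (the file names are invented for the example):

```shell
# Classic fragility: parsing ls output word-splits filenames with spaces.
dir=$(mktemp -d)
touch "$dir/my file.txt" "$dir/plain.txt"
for f in $(ls "$dir"); do            # BROKEN: "my file.txt" becomes two words
  printf 'broken saw: %s\n' "$f"
done
for f in "$dir"/*.txt; do            # robust: glob directly, quote everywhere
  printf 'ok saw: %s\n' "$f"
done
```

The broken loop reports three names for two files; the robust one gets it right, but only because the author remembered two non-obvious rules. Languages with real lists and procedure calls make the robust form the natural one.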
The end-of-chapter rant on find(1) is still funny.
Chapter 9 – Programming: Hold Still, This Won’t Hurt a Bit It is a shame that the authors are so quick to dismiss the Unix toolkit as a primitive toybox in this chapter, because that jaundiced error gives Unix programmers an excellent excuse to ignore the parts the authors got right. To point out a first and relatively minor example, the use of tabs in make really was a botch that ought to serve as a horripilating example to tool designers.
More generally, many of their points about C and its associated assumptions and toolchains are well taken. Yes, all those fixed-length-buffer assumptions are an indictment of weak tools and bad habits formed by them. Yes, LISP would have been a better alternative. Yes, exception-catching is an important thing to have.
We didn’t get LISP. We got Python, though. I could have cited Perl and Tcl, too, but they aren’t as close to being LISP (see Peter Norvig’s detailed argument that Python is Scheme with funky syntax.) My point here is not to advocate Python, it’s to observe that the Unix community noticed that C was inadequate and addressed the problem. If the statistics on Freshmeat are to be believed, more new projects now get started in scripting languages than in C.
Gradually, in a messy and evolutionary way, the Unix community is teaching itself the lesson that the authors of this chapter wanted to give it. I agree with them that it could have happened faster and should have happened sooner.
I’d say this chapter had dated the least badly of anything in the book, if not for the next one.
Chapter 10 – C++: The COBOL of the 90s Though out of date in minor respects (C++ got namespaces after 1994, and the authors couldn’t address templates because templates hadn’t been added yet) this chapter remains wickedly on target. The only major error in it is the assumption that C++ is in the mainline of the Unix tradition, was gleefully adopted by Unix programmers en masse, and is therefore an indictment of Unix. Language usage statistics on open-source hosting sites like SourceForge and Freshmeat convincingly demonstrate otherwise.
Chapter 11 – System Administration: Unix’s Hidden Cost This chapter is a remembrance of things past. When it was written, people still actually used magnetic tape for backups. It is probably the most dated chapter in the book.
In 2008 my septuagenarian mother uses a Linux machine and, after a handful of calls during the getting-used-to-it period, I’ve gotten used to not hearing about it for months at a time. Enough said.
Chapter 12 – Security Many of the technical criticisms in this chapter remain valid, in the sense that Unix systems still exhibit these behaviors and have these vulnerabilities. But on another level the chapter is suspended in a curious vacuum; the authors could not point to an operating system with a better security model or a better security record. They didn’t even try to write about what they imagined such a system would be like.
The contrast with Chapters 9 and 10 is instructive. Many of the authors come from a tradition of computer languages (LISP, Scheme, and friends) that were in many and significant ways superior to Unix’s native languages as they existed in 1994 (the gap has since closed somewhat). They knew what comparative excellence looked like, and could therefore criticize from a grounding in reality.
There is no corresponding way in which the authors can suggest Unix’s security model and tools could be fundamentally improved. That’s because, despite all its flaws, nobody has ever both found and successfully deployed a better model. Laboratory exercises like capability-based OSes remain tantalizing but not solutions.
The correct rejoinder to this chapter is: “You’re right. Now what?”.
- Chapter 13 – The File System Most of the sniping about the performance and reliability of Unix filesystems that is in this chapter is long obsolete. We’ve learned about hardening and journaling; the day of the nightmare fsck session is gone. The gripes about unreliable and duplicative locking facilities have also passed their sell-by date; the standards committees did some good work in this area.
The authors’ critique of the unadorned bag-of-bytes model is not completely without point, however; as with languages, some of the authors had real-world experience with systems supporting richer semantics and knew what they were talking about.
Some Linux filesystem hackers seem to be groping towards a model in which files are units of transportability that can be streamed, but internally have filesystem-like structure with the ability to embed metadata and multiple data forks. Others have experimented with database views a la BeOS.
There is probably progress to be made here. Alas, it won’t be helped by the authors’ persistent habit of burying an ounce of insight in a pound of flamage.
- Chapter 14 – NFS: Nightmare File System Some of the specific bugs described in this chapter have been fixed, but many of the architectural criticisms of NFS made here remain valid (or at least were still valid the last time I looked closely at NFS). This chapter is still instructive.
Summing Up: What’s Still Valid Here? The original question was: “which chapters did you consider good and which chapters did you consider bad?”. Let’s categorize them.
The worst chapters in the book, at least in the sense of being the most dated and content-free for a modern reader, are probably 11 (Administration), 5 (Snoozenet), 6 (Terminal Insanity), 4 (Mail), and 1 (Unix: The World's First Computer Virus), in about that order from worst to least worst.
The chapter with the soundest exposition and the most lessons still to teach is certainly 10 (C++), followed closely by 14 (NFS).
A few chapters are mostly flamage or obsolete but have a good lesson or two buried in them. In rough order of descending merit:
- The authors were right to argue in chapter 8 that classic shell scripting is fragile and rebarbative, and should be replaced with languages supporting data structures and real procedural composition; this has in fact largely come to pass.
- The knocks on C in Chapter 9 (Programming) were justified.
- The objections to the pure bag-of-bytes model in Chapter 13 (The File System) should provoke a non-dismissive thought or two.
- At the bottom of this heap are the few nuggets in Chapter 2 (Welcome, New User!) about spiky command names and proliferated options.

Some chapters tell us things that are true and negative about Unix, but merely rehearse problems that are (a) well known in the Unix community, and (b) haven’t been solved outside it, either. It may have made the authors feel better to vent about them, but their doing so hasn’t contributed to a solution. I’d definitely put 12 (Security) and 3 (Documentation? What Documentation?) in that category.
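The chapter 8 complaint about shell fragility is easy to demonstrate; here is a minimal sketch in POSIX sh (the filename is invented):

```shell
# Classic shell fragility: an unquoted variable expansion undergoes
# word splitting, so a filename containing a space silently becomes
# two separate arguments.
f="My Essay.txt"
set -- $f      # unquoted: splits into "My" and "Essay.txt"
echo "$#"      # number of arguments seen: 2
set -- "$f"    # quoted: passed through as one argument
echo "$#"      # number of arguments seen: 1
```

Languages with real data structures (a list of strings rather than one flat command line) make this whole class of bug impossible, which is much of what the authors were asking for.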
Chapter 7 (The X-Windows Disaster) is the hardest for me to categorize. There’s still ugliness under the covers in some places they mention, but I think they’re mistaken both in asserting that the whole system is functionally horrible and in slamming the architecture and design philosophy of the system.
More than ever I see this book as a missed opportunity. The 14 years since 1994 has been enough time for useful lessons to be absorbed and integrated; if all the chapters had been up to the level of 10 or 14, we might have better Unixes than we do today. Alas that the authors were more interested in parading some inflammatory rhetoric than starting a constructive conversation.
This entry was posted on Tuesday, September 30th, 2008 at 3:57 am and is filed under Software. You can follow any responses to this entry through the RSS 2.0 feed. You can leave a response, or trackback from your own site.
98 Responses to “The Unix Hater’s Handbook, Reconsidered”

Shenpen Says:
September 30th, 2008 at 5:01 am “Yes, it would have been really nice if Unix kernels had presented a uniform screen-painting API rather than leaving the job to a userspace library”
Let’s hope Andy Tanenbaum isn’t listening :-)
Jeff Read Says:
September 30th, 2008 at 7:04 am
No, Python is not Scheme with funky syntax. Python is decrypted Perl.
That’s probably its biggest strength and a significant weakness as well.
Jeff Read Says:
Adriano:
September 30th, 2008 at 7:20 am
“Yes, it would have been really nice if Unix kernels had presented a uniform screen-painting API rather than leaving the job to a userspace library”
Let's hope Andy Tanenbaum isn't listening :-)
Microsoft certainly listened. With the release of Windows NT 4.0 they moved the GDI into kernel space, with all the advantages (performance) and disadvantages (bugs in the graphical layer can now crash the system) that implies.
I think they tried to implement a more sensible separation of concerns in Vista. Which of course leads users to gripe…

esr:
September 30th, 2008 at 7:38 am
Two small quips:
“When it was written, people still actually used magnetic tape for backups” Sorry, what do they use now? Big companies don’t exactly store to DVD.
I’m also interested in knowing how do you rate MSDN as a documentation source. After all, it (OIMHO) seems the biggest doc source outside of Unix.

jrm:
September 30th, 2008 at 8:42 am
>Sorry, what do they use now? Big companies don't exactly store to DVD.
I’m aware there are a few really high-end magnetic-tape robots around, but I was referring to old style 9-track tape. That stuff is now so rare that finding a reader for it is not easy.
> I'm also interested in knowing how do you rate MSDN as a documentation source. After all, it (OIMHO) seems the biggest doc source outside of Unix.
I’ve never developed under Windows, so I have no idea. And no interest in finding out. Er, if their documentation is at the quality level of their software I don’t think I need to worry about it falsifying my review point.

grendelkhan:
September 30th, 2008 at 9:06 am
> I'm also interested in knowing how do you rate MSDN as a documentation source. After all, it (OIMHO) seems the biggest doc source outside of Unix.
I’d rather read man pages. The same goes for Javadocs, too. All I really need is the function signature with helpful names.
If I want a tutorial, there’s google, and I always skip over the MSDN pages these days. They’re too chatty. Maybe if you don’t know what you’re doing they could be helpful, but I think in that case, you should buy a book on what you’re doing instead of reading the tutorial while you’re coding. People who do that scare the hell out of me.

Phil:
September 30th, 2008 at 9:43 am
But this is yesterday's issue; the VDT is dead, and the problems they're griping about dead along with it.
If only. These problems will be dead only when I no longer hit backspace or delete or an arrow key on a system I’ve ssh’d into, and see ^something. Yes, it’s not impossible to track down why it’s happening, or impossible to fix, but it still happens.
It is rather amusing that page 152 contains, in reference to a decision to use termcap in curses, “Starting over, learning from the mistakes of history, would have been the right choice”, while a footnote in reference to AT&T’s decision to start from scratch reads, ” And if that wasn't bad enough, AT&T developed its own, incompatible terminal capability representation system…”.
Are you familiar with the more recent trends in UNIX (now Linux) hatedom? There’s Linux Hater’s Blog, which surrounds its criticism with John Solomon-like levels of invective. There’s also elliotth’s blog, which is much less interested in being purely cranky. For example, his post on Rhythmbox links to the relevant Bugzilla entries for the problems he points out.

DGentry:
September 30th, 2008 at 9:48 am Introduction: This quote still appears to be relevant.
Literature avers that Unix succeeded because of its technical superiority. This is not true. Unix was evolutionarily superior to its competitors, but not technically superior. Unix became a commercial success because it was a virus. Its sole evolutionary advantage was its small size, simple design, and resulting portability. Later it became popular and commercially successful because it piggy-backed on three very successful hosts: the PDP-11, the VAX, and Sun workstations. (The Sun was in fact designed to be a virus vector.)
The specific details of the standards wars may be dated, but the ‘spirit’ is still with us. LSB is still fairly recent, and it is not always reliable. Also, different distros have different release times, so they include different versions of software with different problems. Standards have improved, but they are still not to the point where most software projects have a Linux binary alongside their Windows and OSX binaries.
Welcome New User!
“Consistency and predictability in how commands behave and in how they interpret their options and arguments” sounds very nice, but the first part is impossible (different commands have to behave differently because they solve different problems)
They may solve different problems, but they often have similarities. Microsoft Word and Mozilla Firefox solve different problems, but they have the same keyboard shortcuts for copying and pasting text (Ctrl-c and Ctrl-v respectively). Unix still suffers from four different ways to list arguments: -r (traditional), r (BSD), --recursive (GNU), and -recursive (X11). Also, they say that it would have been a good idea to provide a standard library to handle regular expressions. This would have helped bring consistent regular expression behavior to C programs and Unix in general.
and the second is more a gripe about shell wildcard expansion than anything else. Sorry, no sale; it's too useful.
One of their main points seems to be that it would be nice if the called program also had a way to see the specific arguments it was called with, before wildcard expansion. This would allow applications like rm to do a sanity check to prevent dangerous operations.
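The four option styles mentioned above can be put side by side; this is a rough sketch (the exact spellings each tool accepts vary by system, and the commented lines are illustrative only):

```shell
# The same "do it with options" idea, spelled four different ways.
mkdir -p /tmp/optdemo/sub && echo needle > /tmp/optdemo/sub/f
grep -r needle /tmp/optdemo             # traditional: single dash, one letter
grep --recursive needle /tmp/optdemo    # GNU: double dash, long word
# tar cf archive.tar dir                # BSD-style: bare key letters, no dash
# xterm -geometry 80x24                 # X11: single dash, long word
rm -rf /tmp/optdemo
```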
If you don't like the way rm *, go fire up a GUI file manager.
As they say in Chapter 7, Graphical interfaces can only paper over misdesigns and kludges in the underlying operating system; they can't eliminate them. ‘rm *’ would not hurt so much if Unix provided a mechanism to retrieve deleted files. Would it have been so hard to provide a special directory where the system would move files after they were ‘deleted’ and where the system would only truly remove them when it needed the disk space? Both GNOME and KDE provide a ‘garbage can’, but it only works with programs specifically designed to utilize it.
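That ‘garbage can’ can at least be sketched in userland (the del and undel_last names and the ~/.trash location are invented for illustration; as the comment argues, a real fix would live below the applications, not in an alias):

```shell
# "Delete" by moving files into a trash directory instead of unlinking.
# Nothing is truly lost until the trash directory itself is emptied.
TRASH="${TRASH:-$HOME/.trash}"

del() {
    mkdir -p "$TRASH"
    for f in "$@"; do
        # append the shell PID so two deletions of "foo" don't collide
        mv -- "$f" "$TRASH/$(basename "$f").$$"
    done
}

undel_last() {
    # crude recovery: move the newest trashed file back, minus its suffix
    last=$(ls -t "$TRASH" | head -n 1)
    [ -n "$last" ] && mv -- "$TRASH/$last" "./${last%.*}"
}
```

This is, of course, exactly the class of kludge the book complains about: any program that calls unlink() directly bypasses it, which is why the commenters want the mechanism in the filesystem itself.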
System Administration I am not sure that it is as out-of-date as you say. The section on configuration files is still spot on. The section on funky device names is still relevant. I still remember a time I was installing Gentoo (yes, I know it is Gentoo, but still) and had to play the which-device-is-my-hard-drive game. It turned out to be /dev/sda, even though I think that is supposed to be a SCSI disc, and my hard-drive used SATA. *nix still does not provide a convenient way (that I know of) to make multiple hard drives look like one disk. The best way seems to be RAID, which is a hardware hack. Also, the requirement for a separate swap partition is still annoying. Both Windows and OSX seem to make do without it; why do people insist on it for Linux?
September 30th, 2008 at 10:10 am I have fond memories of this book. I worked at Sun Microsystems at the time, and reading the book made my blood boil. Though I read heavily from the O’Reilly catalog at that time I doubt I could name more than a couple other technical books I consumed. The rest merely become incorporated into the mental fabric; this book refused to be so easily assimilated. In that sense, the authors did a standout job.
I will be forever grateful to Dennis Ritchie for supplying a quote which I use to this day (heavily paraphrased to conceal my plagiarism): Like excrement, though it may contain nuggets of digestible material it is, on the whole, unsatisfying.
David Delony Says:
September 30th, 2008 at 11:01 am I have a copy of “The Unix Hater’s Handbook” in PDF, and I came to the same conclusions that you did, Eric. The rhetoric was amusing and I enjoyed commiserating with all the people’s tales of lossage. (They really need to do something about that “file deletion is forever” thing.) I remember thinking that most of the problems had long since been solved by journaling filesystems. I’ve never had a Unix crash on me except for the one kernel panic on Mac OS X (which is very nice, by the way :-)).
Whenever I get a chance, however, I should tell you about the Ubuntu upgrade that went badly. :-)
The “Handbook” is useful, along with “Life With Unix”, as a time capsule of the state of Unix before the rise of Linux and open source.
David Delony Says:
esr:
September 30th, 2008 at 11:02 am Oh, I forgot what I was going to say about documentation. Of course, the man pages are terse, but thankfully most projects have better Web-based documentation. Even better are wikis, which allow the documentation to actually match reality. ;-)

Phil:
September 30th, 2008 at 1:26 pm >This quote still appears to be relevant.
In a sense, yes. The issue is with the implied value judgment that if an operating system is observed to spread because of virality, then it must be inferior on other axes. That isn’t necessarily true; it may be the case that virality is correlated with design traits that make it a better bet in the long term – like, say, open source. Richard Gabriel first confronted this possibility in his 1987 “Worse Is Better” paper; the UHH guys reproduce that text in an appendix but don’t appear to have really grasped the implications. Which is not a knock on them, really; nobody, not even Gabriel himself, fully thought them through for a decade afterwards. Then I did it, and RG and I spent an entertaining 90 minutes the first time we met trying to figure out whether I had unconsciously lifted the central Worse Is Better idea for my “The Cathedral and the Bazaar”. We tentatively concluded that I hadn’t, but we’re still not sure.
I guess my real point here is that the UHH guys in 1994 had a narrow definition of “technical superiority” that doesn’t take into account the parts of the software ecosystem consisting of human beings operating under economic constraints. Some of them have matured since then. Some haven’t.
>Unix still suffers from four different ways to list arguments: -r (traditional), r (BSD), --recursive (GNU), and -recursive (X11).
Yes, that’s a reasonable point, and one of the “nuggets” I had in mind.
>Graphical interfaces can only paper over misdesigns and kludges in the underlying operating system; they can't eliminate them
That’s true, but who says rm is a misdesign? At some level hard deletion has to happen and there has to be a tool to do it; rm has to exist. People who rant about rm being misdesigned really mean, I think, that it is not a tool ordinary users should touch. That’s a defensible position, and it starts a more useful discussion.

Phil:
September 30th, 2008 at 1:44 pm
At some level hard deletion has to happen and there has to be a tool to do it; rm has to exist.
Yes, hard deletion has to exist, but it does NOT have to be exposed to the user or the user’s programs. Above, I mentioned a relatively simple way that it could be / have been accomplished.
September 30th, 2008 at 2:07 pm Security
the authors could not point to an operating system with a better security model or a better security record.
Well, they can’t, but I can. The B5000 had automatic bounds checking, sophisticated file locking and automatic logging. VMS is known for being built like a tank. Plan 9 abolishes the superuser and provides a finer grain of access control, etc.
They didn't even try to write about what they imagined such a system would be like.
Well, they do suggest doing away with the superuser, abolishing SUID, and switching to a finer level of access control to allow specific control over certain files (/etc/passwd) and certain actions (such as spawning a shell). They also suggest per-user CPU time quotas and per-user I/O quotas. Also, they suggest a Trusted Path.
Jeff Read Says:
Adriano:
September 30th, 2008 at 4:45 pm I argued in The Art of Unix Programming that this principle gives X an ability to adapt to new technologies and new thinking about UI that no competitor has ever matched.
Someone on Reddit pointed this out but it’s worth repeating: Multihead Just Worked on the Mac in 1987. Xinerama is still fucking broken in 2008.
Also, to get any sort of decent 3D performance much of X must be bypassed entirely. (The proprietary NVIDIA driver actually reimplements much of the X server; it remains the only performant 3D stack on Linux.)
Way to adapt to new technologies.

Adriano:
September 30th, 2008 at 4:56 pm Hey, if new technologies are dismissed with a blanket “I've never developed under Windows, so I have no idea. And no interest in finding out. Er, if their documentation is at the quality level of their software I don't think I need to worry about it falsifying my review point.”, you start to understand why some people came up with the UHH.
Because, of course, the topic wasn’t at all related with ways Unix sucked. And of course, someone that has never programmed under a platform can immediately tell if it’s of good quality without eyes rolling.
Sure, you’ll get thousands of stories about Windows sucking. That’s because of Sturgeon’s Law.

esr:
September 30th, 2008 at 5:07 pm … And of course, because it does suck, and has sucked even more in the past. But the point was that All Software Sucks. It’s, as we say here, “the dead laughing at the hanged man”.

Si:
September 30th, 2008 at 6:19 pm
>Hey, if new technologies are dismissed
It’s true I will instantly dismiss anything from Microsoft, but “new” is not the relevant predicate — “proprietary” is.
Um, and don’t mistake this for a religious position. It isn’t — it’s my burn scars aching.

Adriano:
September 30th, 2008 at 6:56 pm “Yes, hard deletion has to exist, but it does NOT have to be exposed to the user or the user's programs.”
I think “Your deleted files can be recovered, except sometimes when they can’t” is less pleasant than “your deleted files cannot be recovered, take care”. At least then you know where you stand.
Jim Thompson Says:
September 30th, 2008 at 8:04 pm esr, is the software technology for your car’s ECU open source? your cell phone? your vcr?
X11 bites. Hard, but as the man said, “Worse is Better”, which is the founding story of linux, and most, if not all of “open source” and “free software”.
Just why is it that you didn’t address Gabriel’s chapter in UHH, anyway? It has always struck me that “Worse is Better” pre-dated CATB by several years, but you’ve never before now (that I’ve found) given Gabriel attribution for having had largely similar ideas much earlier.
There is also a view that the OS is the bits that were left out of the programming language, and therefore, the parts implemented by the OS are actually bugs. By that measure, the “operating system” needs to evaporate.
I’m still hoping for a new “lisp machine” with lisp over a (mostly hidden) linux kernel, a la the architecture of Android, only s/java/lisp/…

esr:
September 30th, 2008 at 8:35 pm You have burn scars over something you haven’t programmed in? That you don’t know?
I’ve also had contact with Windows and MS software for most of my life, and I’m not typing this from Ubuntu just because. Still, I don’t just dismiss everything before at least having tried it. It might not be a religious position, but it sure sounds like it.

Lewis:
September 30th, 2008 at 9:12 pm
>You have burn scars over something you haven't programmed in? That you don't know?
No, I have burn scars from experiences with using and developing proprietary software in general. My animus is not specifically against Microsoft. Because Microsoft software is proprietary, I don’t feel I have to try it to reject having anything to do with it.
Actually, I already have more to do with it than I want. My wife has a work computer that’s a Windows laptop — I’ve had to troubleshoot things like WiFi authentication on the maggot-ridden pile of stale dog vomit Microsoft calls an OS, more than once.

jt:
September 30th, 2008 at 10:21 pm
>This chapter [on NFS] is still instructive.
And as irrelevant as complaints about Sendmail and incompatible shells. See FUSE (in particular sshfs, which is made of win and goodness).

Phil:
September 30th, 2008 at 11:15 pm Usenet is still relevant for comp.* and sci.*
If you kids don’t use it, that’s your stupid problem. It seems like you like the illusion of searching through /n/ (for a sufficiently big n) fora for your answers. That might look like you’re doing research/working.

Phil:
October 1st, 2008 at 12:14 am Si Says: I think “Your deleted files can be recovered, except sometimes when they can’t” is less pleasant than “your deleted files cannot be recovered, take care”.
The UHH addresses this concern in Chapter 1. Utilizing the kernel may have its problems, but it is much better than implementing it at a higher level. There seems to be a significant demand for a ‘garbage can’, which is evidenced by the fact that BOTH KDE and GNOME have implemented this functionality in their system, and the UHH provides further evidence with its description of all the various kludges users and sysadmins have come up with to ‘safely’ delete a file (‘alias rm rm -i’ & ‘alias rm mv ~/.trash’, etc.). Now, there probably should be a special mechanism to ‘securely’ delete a file (overwrite the blocks several times with random bits).
One of the major problems is that the *nix environment has made it especially difficult to ‘be careful.’ Historically, *nix programs had “a criminally lax attitude towards error checking.” Wildcard expansion is the major culprit, and since the program had no way to see the options with which it was called, it had no way of doing sanity checks.
At least then you know where you stand.
Yeah, you know you are standing on a mass of nylon in a sealed, high-pressure, 100% oxygen environment.

Miles:
October 1st, 2008 at 12:43 am I have been thinking about instances where the kernel would have already reallocated the bits in a deleted file before the user had time to retrieve it, and I am mostly drawing a blank. Even if half of the files on a hard drive were deleted, all the kernel would have to do is point the inodes at the garbage directory and put all the blocks on the free list.

Shenpen:
October 1st, 2008 at 4:38 am Phil:
> Now, there probably should be a special mechanism to 'securely'delete a file (overwrite the blocks several times with random bits).
There is. It’s called shred. I’m reminded of Neal Stephenson’s remark about learning Unix being an endless series of incidents where you’re on the point of inventing some useful utility and then notice that it’s already there, and has been for decades, and that explains that odd file or command you’ve vaguely wondered about.
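For anyone who hasn’t met it, shred (from GNU coreutils) does the overwrite-then-unlink dance; the file contents here are made up:

```shell
# Overwrite a file's blocks in place, then unlink it.
f=$(mktemp)
echo "launch codes" > "$f"
shred -u -n 3 "$f"   # -n 3: three overwrite passes; -u: unlink afterwards
[ -e "$f" ] || echo "gone"
```

shred’s own documentation warns that on journaling or copy-on-write filesystems the overwrite may not hit the original blocks, which loops right back into the chapter 13 discussion.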
Jeff:

mrg@delirium:~$ perl -le 'print 1 + "2"'
3
mrg@delirium:~$ python -c 'print 1 + "2"'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
TypeError: unsupported operand type(s) for +: 'int' and 'str'
mrg@delirium:~$ mzscheme -e '(+ 1 "2")'
Welcome to MzScheme v372 [3m], Copyright (c) 2004-2007 PLT Scheme Inc.
+: expects type <number> as 2nd argument, given: "2"; other arguments were: 1

Python owes a lot to Perl (and Perl owes a lot to Python), but it’s closer to Scheme in some important respects. Of course, it’s not just a skin over Scheme either – think about Python’s broken lambdas (which Perl gets right).
October 1st, 2008 at 5:00 am
To me, the UHH is a history book, like the Jargon File, the chronicle of a very interesting period. It fascinates me that at that time when I was completely sure that computing equals taking the Commodore out of the cabinet and plugging it into the TV, a lot of other people used multi-user time-sharing systems with e-mail, Usenet discussions etc. When I first saw a PC, I found it really strange that the computer isn’t built into the keyboard, but is a separate box. It looked very unusual to me. I figure the mini-computer Unixers had it the other way around: nice terminal, but where is the computer? :-)
Jeff Read Says:
Phil:
October 1st, 2008 at 5:19 am I've had to troubleshoot things like WiFi authentication on the maggot-ridden pile of stale dog vomit Microsoft calls an OS, more than once.
Ever had to troubleshoot WiFi authentication on Linux? None of the GUIs seem to do the right thing, and generally you have to hand-tweak /etc/wpa.supplicant.conf. (Using wpa_supplicant, even when your hotspot is WEP-encrypted or not encrypted at all, is generally a good idea.) Not accessible to end users. And that’s if — if — the kernel supports your WiFi card in the first place. Most WiFi chipsets were completely inaccessible without loading (gee whiz) a Windows driver into the kernel through the ndiswrapper compatibility shim.
So that steaming pile of dog vomit has a few advantages over Linux when it comes to wifi. At least it did as of a year ago when I last really messed with wifi on both.
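For reference, the hand-tweaking in question usually amounts to a stanza like this in wpa_supplicant’s configuration file (SSID and passphrase are placeholders):

```
ctrl_interface=/var/run/wpa_supplicant

network={
    ssid="ExampleNet"
    psk="not the real passphrase"
}
```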
Miles, much like the UHH, “In the beginning…” is now obsolete. Stephenson has switched to Mac OS X. Telling.

esr:
October 1st, 2008 at 5:47 am
So Miles, how do I undelete a file?

Adriano:
October 1st, 2008 at 7:11 am
>Ever had to troubleshoot WiFi authentication on Linux? None of the GUIs seem to do the right thing, and generally you have to hand-tweak /etc/wpa.supplicant.conf
Get a better distro. The GUI tools work for me, and I didn’t even know /etc/wpa.supplicant.conf existed until you brought it up.
October 1st, 2008 at 7:26 am
“Get a better foo. This bar works for me”. That’s exactly the positive attitude needed to solve Linux’s problems.
Jeff Read Says:
Phil:
October 1st, 2008 at 8:54 am Not that it matters much, but I misspelled /etc/wpa_supplicant.conf. n.b.: on some distros it’s /etc/wpa_supplicant/wpa_supplicant.conf. Yaaaaay standards!!!
October 1st, 2008 at 9:24 am Miles, my point was that ‘securely’ deleting a file was the only case I could think of where my ‘garbage can’ idea would not work. Of course, one could ‘securely’ delete a file once it is in the ‘garbage can’, like in OSX.
Jeff Read Says:
Miles:
October 1st, 2008 at 9:26 am Eric, you may want to give MSDN, Windows, and their developer tools a second, unprejudiced look; they really are better than what Linux has to offer. If you don’t believe me, ask Jonathan Blow, a game developer who tried to port his game to Linux and found out how made of MASSIVE EPIC FAIL it is at even basic stuff.
I like Linux and these days I develop for embedded Linux for a living. But when it comes to being user- and developer-friendly for ordinary stuff, Mac OS is by far the best, Windows comes in a distant second, and Linux still isn’t even in the race.
October 1st, 2008 at 10:07 am Phil: Oh, I see. Just trying to be helpful… FWIW, garbage cans annoy the hell out of me, and I never use them even when they’re available.
I too have had far more problems using wifi under Linux than under Windows.
David Delony Says:
Phil:
October 1st, 2008 at 11:08 am The biggest obstacle to widespread Linux acceptance is Linux culture. It seems that a lot of geeks think that Linux should only be accessible to kernel hackers. Too many hackers roll their eyes at Ubuntu and say “there goes the neighborhood.”

esr:
October 1st, 2008 at 11:09 am Jim says: I’m still hoping for a new “lisp machine” with lisp over a (mostly hidden) linux kernel, a la the architecture of Android, only s/java/lisp/…
Well, it is funny that you mentioned that, because yesterday I was looking at the OSKit project. For those of you who do not know, OSKit is a framework that implements many of the common functionalities required of all modern x86 operating systems: TCP/IP stack, multiple filesystem support, Linux driver support, POSIX support layer, etc. The cool thing about this project is that all the components were meant to be plugged into any operating system. This would allow OS developers to focus on innovative new features without having to reimplement all the boring stuff. It is a damn shame that the project seems to have been dormant since 2002. Anyway, to the point, while I was searching Google for any signs of life in the project, I came across this email by a grad student that combined OSKit and MzScheme and had a smooth, working lisp machine in 7-8 hours. Now THAT is impressive!! Like I said, it is a damn shame this project died.
October 1st, 2008 at 11:25 am >Eric, you may want to give MSDN, Windows, and their developer tools a second, unprejudiced look; they really are better than what Linux has to offer.
There is no possibility of “better” if the effect is to lock me into a vendor-controlled jail. None. That’s like saying “Here, try this lovely heroin. It feels soooo gooood that you won’t mind that you’ll never be able to stop.” You can list features and capabilities until doomsday and you’ll never get past the word “proprietary”, which stinks to me of lossage and of personal pain too keenly remembered.
If that’s “prejudiced”, I guess I am. And utterly immovable on this subject.
Ken Burnside Says:
esr:
October 1st, 2008 at 11:49 am Eric:
On a Windows machine, you’d still have a copy of a sestina about quantum physics.
There isn’t a text editor in Windows that doesn’t autosave files; many will even keep spiral backups (where each autosave is incremented by a date stamp, and after four have been saved, the fifth is deleted to save space). Yes, I know, it’s mollycoddling by a tinker toy OS that doesn’t [include:rant001 through rant999.]
You may consider the loss of 3 hours of writing to be a trivial price to pay for using Linux. I don’t.
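The “spiral backup” rotation described above takes only a few lines of sh to approximate (the save_copy name and the four-copy limit come from the comment’s description, not from any particular editor):

```shell
# Keep the four newest timestamped copies of a file; prune the rest.
save_copy() {
    f=$1
    cp -- "$f" "$f.$(date +%s%N).bak"   # %N (nanoseconds) needs GNU date
    # list backups newest first; everything after the fourth is deleted
    ls -t -- "$f".*.bak | tail -n +5 | while read -r old; do
        rm -- "$old"
    done
}
```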
I’ve worked on Windows, Mac OS X, Mac OS 9 and Ubuntu. And by “Worked”, I mean “Did something that I got paid for.”
Even adjusting for biases – like the fact that I have more experience with Windows than I do the other three OSs combined – Ubuntu STILL doesn’t match the feature sets I need for most of what I need to do. The command line is cryptic, and requires mental overhead on the part of the user that is incredibly annoying.
Unix documentation, with some exceptions, is atrocious. You’re either going to become a kernel hacker, or we don’t want you.
Microsoft’s documentation is, in general, excellent. Microsoft’s developer tools are very good, and there’s a strong market based demand for them – learn to use the tools, code software, get a job. Easier to learn dev tools (through better documentation, and lots of handholding) and putting those dev tools out there for dirt cheap might – just might – explain why Windows has a 90+ percent market share.
Hell, even Windows’ Office suite includes programming language support with VBA. Is it a great programming language? Categorically not. Is it something that allows someone who isn’t a formally trained programmer or self taught hacker to get useful work done? Yes. Is there a thriving community of people who share VBA apps and source code? Damned straight. Do those people help each other improve applications? Yep.
Sounds a lot like the Open Source ‘community development’ meme, with easier to use tools to me…and if a quick scan of Monster.com and Jobs.com is any indication, there appear to be about 20x the job prospects for people competent in those tools written by Microsoft.
As a free market anarchist, even you have to concede that a company that’s kept an 80% or greater market share in the fastest growing computer market segment, in the 30 years since you entered the job market, has to be doing something right beyond glitzy marketing.
October 1st, 2008 at 12:31 pm >On a Windows machine, you'd still have a copy of a sestina about quantum physics.
Don’t be silly. Emacs makes backup files too; I’m not sure how it got lost, but failure of my editor wasn’t the problem.
>Microsoft's documentation is, in general, excellent.
That’s “Here, try this heroin” again. It won’t work this time, either. Other people can be junkies if they want; I refuse to go there.
>As a free market anarchist, even you have to concede that a company that's kept an 80% or greater market share in the fastest growing computer market segment, in the 30 years since you entered the job market, has to be doing something right beyond glitzy marketing.
Oh, they’ve done a great many things competently. None of those things includes operating systems, however… what they’ve done instead is successfully lowered everyone’s expectations to the point where customers think that shit-smell is attar of roses or something. And those of us with actual standards get looked at like we’re crazy for pointing out the suck.
Jeff Read Says:
October 1st, 2008 at 1:15 pm None of those things includes operating systems, however…
Nonsense. The NT kernel, designed by legendary Unix-hater Dave Cutler, is a spiritual descendant of VMS and as such boasts a design that is cleaner and more orthogonal than any Unix. (Linux’s ad-hockery looks egregiously bad in comparison.) Admittedly, Windows NT has exhibited problems with stability; many of those were caused by buggy third-party drivers, a default security policy deliberately designed to maintain bug-for-bug compatibility with 16-bit Windows, and buffer overflows in userland software. Many of these problems have been fixed with Windows Vista, and I know Vista users who report high uptime with no crashes. It’s not 1998 anymore. Windows is stable.
What I think Microsoft does particularly well, however, is to act in a leader/gatekeeper role for the PC platform, negotiating the complex relationships between IHVs, ISVs, and end users. Which means that l33t cutting-edge stuff kind of tends to get left by the wayside if ordinary users don’t care about it. (The Macintosh didn’t have PMT or proper memory management for the first 15 years of its life. Did the Mac user base, known for its technological elitism, care? No.)

Any platform needs such a gatekeeper. Open source has coasted along for a while without one, but the results so far have been disastrous. Linux is broken because its community is broken: without a gatekeeper to say “this is what we need to focus on, this is what we will support, this is the API you can rely on”, and so forth, what you have is a horde of anarchic cluster-communities competing with, and often clashing with, each other when their energies can and should be channeled to make significantly more progress.

As a recovering fosstard (not at all dissimilar from being a recovering Catholic; I’ve experience in both), I used to say “yeah, let a thousand APIs, window managers, and subsystems bloom!” But then it dawned on me: it’s 2008 and sound still doesn’t fucking work on Linux. App A uses pulseaudio, B uses ALSA, C tries to open ALSA’s backward-compatible OSS /dev/dsp interface and finds it blocked and so trudges onward with no sound until restarted, and D is trying to connect to a nonexistent esd daemon. FAIL. The only solution that’s been proven to work is for someone to decide on one sound API for Linux. That will never happen, though, because of the monumental cat-herding involved.
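The API jungle described above can be sketched abstractly. This is a toy model, not real audio code: the backend names refer to the real daemons, but the probe logic and all function names are invented for illustration. Each app walks its own hard-coded preference list, and whatever isn’t running (or is blocked by whatever grabbed the sound device) yields silence.

```python
# Toy sketch of the circa-2008 Linux audio-backend guessing game.
# Nothing here touches real audio; "available" stands in for whichever
# daemons/devices happen to be reachable on a given system.

def probe(available, backend):
    """Pretend-probe: a backend 'works' only if its daemon/device is present."""
    return backend in available

def pick_backend(available, preference):
    """Walk an app's hard-coded preference list; fall back to silence."""
    for backend in preference:
        if probe(available, backend):
            return backend
    return "no sound"

# A system where only PulseAudio is running and it has grabbed the device:
system = {"pulseaudio"}

app_a = pick_backend(system, ["pulseaudio"])     # happy
app_b = pick_backend(system, ["alsa"])           # silent: device busy
app_c = pick_backend(system, ["oss", "alsa"])    # silent
app_d = pick_backend(system, ["esd"])            # connects to nothing

print(app_a, app_b, app_c, app_d)
```

Four apps, four preference lists, one working audio path: exactly the coordination failure a single blessed API would eliminate.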
You see proprietary software as a jail. Fine. Being a software engineer, you’re extraordinarily lucky in being able to call that shot. For these people it’s rather akin to a gated community: some freedom is traded for predictability, peace of mind, and gorgeous scenery. And it’s preferable by far to the stone-knives-and-bearskins primeval village which open source represents.
For large numbers of people, the choice is between using a proprietary program on Windows, or not doing their job at all. At a talk, RMS once told a motion picture editor to get a new job rather than continue using proprietary software (all the editing tools which are any good are proprietary). Nicely illustrative of why “proprietary software = bad” is a completely untenable position.
Ken Burnside Says:
esr:
October 1st, 2008 at 1:39 pm Oh, they’ve done a great many things competently. None of those things includes operating systems, however… what they’ve done instead is successfully lowered everyone’s expectations to the point where customers think that shit-smell is attar of roses or something. And those of us with actual standards get looked at like we’re crazy for pointing out the suck.
Here are my standards for what makes a “Good OS”.
The last time I had a system outright crash (Win NT through XP SP3) was four years ago. The cause of the crash? Enough dust had accumulated in my CPU fan that the CPU was overheating. Using a blower to shove the dust out solved the stability problem.
I buy the vast majority of hardware out there, plug it in, and it works. I don’t have to sort first by whether or not there’s a driver for the distribution of Windows I use.
The only time I’ve had a problem connecting my geriatric laptop to a network where other people could connect was at your place. Worked fine at all three airports I was at, worked fine at WBC, worked fine in the hotel room at WBC, worked fine in the hotel room at GenCon, worked fine at the Indianapolis convention center.
Data Integrity and Restore
My file system backups work just fine. I run test restores every month.
My applications install without me having to open a text editor, or recompiling a patch into a kernel. I do have to log in as admin when installing them, and log back out after said installation.
I run in user-mode. I only switch to Admin mode when installing software. I live behind a router, run anti-virus sweeps three times per week, and have a firewall. I spend about as much time doing this as most of the Linux people I know spend maintaining their systems for security purposes. My biggest security hole is that I have to switch to Admin mode to install about one Windows Patch update in three.
Fonts and color spaces display right (Important for me). Apps I am dependent upon, and which I could not personally write alternatives for, and for which, no viable alternative exists in the OpenSource space, run natively rather than through some malebolgian kitbash of emulators and pipes.
What should I expect from Linux in those categories?
What categories have I missed that are important where Linux is better?
October 1st, 2008 at 1:48 pm >What categories have I missed that are important where Linux is better?
I could list them for pages, but here’s just one that will stand as a good example. If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn’t exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit. If that weren’t the case, plugging the few remaining open relays would make spam traceable to sites that could be shut down.
Jeff Read Says:
October 1st, 2008 at 2:32 pm Fonts and color spaces display right (Important for me).
Point in Linux’s favor: it has decent screen font antialiasing now. Windows antialiasing looks like ass; Linux is marginally better. As always, the Macintosh is the best by far.
(See a pattern emerging here? Mac OS X has become the “no compromises” Unix. If you need or are interested in the arcane power of Unix on your desktop, probably the best favor you can do for yourself is to get a Mac.)
What categories have I missed that are important where Linux is better?
Here’s one. Linux scales from tiny handheld devices all the way up to supercomputers. These days, Windows and Mac OS X run on cell phones, but only in special vendor-approved builds. However, the open-source nature of Linux makes it far easier to use it as a starting point for your custom embedded or large-scale computing solution.
Jeff Read Says:
October 1st, 2008 at 2:39 pm They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.
Again, this is due to a combination of broken userland software (blame the IE and Outlook teams), and an idiotic security policy that was, to some extent, necessary to maintain full compatibility with legacy apps. It really has nothing to do with Windows as an OS (and no, IE isn’t a part of the operating system; I don’t think even Microsoft ever believed that for a second.) Microsoft is quite cognizant of its mistakes in this regard and has been after them like a chicken on a June bug.
Oh, and the “compatibility with legacy apps” is something Linux fails at too: try running an old binary on a new Linux system and boggle at all the library dependencies you have to resolve.
David Delony Says:
October 1st, 2008 at 3:51 pm > Oh, and the “compatibility with legacy apps” is something Linux fails at too: try running an old binary on a new Linux system and boggle at all the library dependencies you have to resolve.
That’s why GCC was invented. ;-)
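The library-dependency complaint above is easy to check for any given binary: `ldd` prints the shared libraries a dynamically linked executable wants, flagging unresolved ones as “not found”. A quick sketch (the binary name is a placeholder; substitute your actual old executable):

```shell
# Inspect a binary's shared-library dependencies; anything the runtime
# loader can't resolve shows up as "not found" and must be supplied by hand.
ldd ./some-old-binary | grep "not found" \
  && echo "unresolved dependencies: find or build the old .so versions" \
  || echo "all libraries resolved"
```

On a binary built against an older glibc or long-gone libraries, the “not found” list is precisely the boggle Jeff is describing.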
Ken Burnside Says:
esr:
October 1st, 2008 at 3:59 pm
I could list them for pages, but here's just one that will stand as a good example. If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn't exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit. If that weren't the case, plugging the few remaining open relays would make spam traceable to sites that could be shut down.
My understanding of this is that it’s an exploit in Outlook (easily replaced, not part of the OS any more than Thunderbird is part of Ubuntu), and in IE 5/6. (There are reasons I don’t run IE and Outlook.)
So, what part of the OS allows that to happen, that you place the blame there?

Miles:
October 1st, 2008 at 4:20 pm >So, what part of the OS allows that to happen, that you place the blame there.
It’s not specific to IE, or Outlook, or any other application. And it’s not any one thing; there are multiple gaping holes in the architecture. I’ll name two: (1) the Windows Update channel is easy to hijack, and (2) for performance reasons, the GUI has run in the same ring as the kernel since NT 4.0, which means the most trivial application buffer overflows can give a cracker the equivalent of root privs if he knows what he’s doing.
The botnet herds are huge for a reason; the herders can find holes faster than Microsoft can fix them. That’s a sufficient indictment of Microsoft’s technical competence right there.
If you take a clean install of Windows and put it, un-firewalled, on the net, do you know how long you can expect it not to be pwned? This has been measured by experiment; I believe the last time I read about it the figure was 17 seconds, down from 43 the previous time the experiment had been run. But the exact figure doesn’t matter; 17 minutes, 17 hours, or 17 days would still be evidence of incompetence.
And yet people still look at me funny when I complain about Microsoft.

Miles:
October 1st, 2008 at 4:25 pm > There is no possibility of “better” if the effect is to lock me into a vendor-controlled jail. None. That’s like saying “Here, try this lovely heroin. It feels soooo gooood that you won’t mind that you’ll never be able to stop.” You can list features and capabilities until doomsday and you’ll never get past the word “proprietary”, which stinks to me of lossage and of personal pain too keenly remembered.
Eric, that’s just daft. To continue with your analogy, you’re in effect saying that because the drug cartels and the pushers are selling heroin, they can’t possibly have any interesting lessons about drug synthesis or marketing or hypodermic needle design that could benefit legitimate pharmaceutical companies.

Mike:
October 1st, 2008 at 4:47 pm OK, just read your next post. To clarify, I’m not suggesting that you use Windows, MSDN, Visual Studio etc, just that it’s worth keeping an eye on them from an appropriate distance in case they have any ideas worth stealing.

Adriano:
October 2nd, 2008 at 2:51 am (Jeff Read) > Again, this is due to a combination of broken userland software (blame the IE and Outlook teams), and an idiotic security policy that was, to some extent, necessary to maintain full compatibility with legacy apps. It really has nothing to do with Windows as an OS
Most of the complaints presented here about Linux/Unix are also about the userland. What’s a part of the OS, anyway (…because everyone likes a never-ending debate)? Technically even the Linux audio API jungle problems can be labeled as “that’s just some crappy userland tools; we have this new shiny world order (pulseaudio or such) and everyone should just get with the program”.
I have very little experience with Windows. I recently helped a colleague at work set up a new Lenovo laptop that came with factory-installed Vista. The first time the thing was booted, straight out of the cardboard box, it took two hours to come up, including probably a couple of reboots that the system required. I have no idea what it was doing. Possibly some configuration or such that Lenovo put in their factory installation, so I don’t know whether Microsoft had anything to do with it. When it was finally up, the amount of disk space taken was 13 GB. This is plain Vista with nothing installed, with the possible exception of some Lenovo utilities. I had no desire to find out what was going on, so I installed the tools we need at work and handed the machine over as quickly as I could (AFAIK it has been stable since). Is Vista really *that* bloated, or are there DVD-quality instruction videos in there or something?

Marco:
October 2nd, 2008 at 6:51 am Yes, it is really that bloated. The standard install of Vista (Starter ed. aside) is upwards of 10 GB.

SomeDude:
October 3rd, 2008 at 8:00 pm Jeff Read Says:
> Point in Linux’s favor: it has decent screen font antialiasing now.
> Windows antialiasing looks like ass; Linux is marginally better.
I have a corporate issued T61 with Win-XP, but I frequently run my Ubuntu 8.04 LTS Desktop Edition live CD. I think the basic Gnome interface (as-is) as packaged by Ubuntu is more than adequate, and if I were a small business owner, I would not pay any Microsoft tax for the MS UI. What I do notice is the fonts seem ugly. Not sure you are old enough to remember, but when I first used OS/2 Warp, with Adobe Type Manager, compared to MS-Windows 3.1, the subtle improvement in screen fonts that ATM provided seemed like a big deal to me. Similarly, MS-WinXP seems just a bit crisper in its fonts.
Do you have a link to a FAQ that might describe why this is, and what I could do about it (I am sure I will need to stop using the Live CD to start)?

esr:
October 4th, 2008 at 5:14 am
I’ve spent quite a lot of time on your trash. You have more tendency to sex than software. Indeed, you are the one who suggests, in your “How to become a hacker” article, that would-be hackers not waste time on distractions like sex, social approval, etc. In addition to being an attention whore, DON’T BE AN ASSHOLE, MR. RAYMOND
October 4th, 2008 at 5:50 am
A handy guide to translating hissy fits like SomeDude’s:
“attention whore” = anyone who isn’t painfully introverted.
“asshole” = anyone who can talk to a woman without scuffing his shoes and keeping his eyes on the floor.
To be fair, this isn’t just SomeDude. A lot of socially-handicapped hackers tend this way. It’s not really a choice, but something about their neurotransmitter balance, and I try to make allowances. Because in some alternate universe where I don’t have whatever minor genetic quirk gave me an entrepreneur-extrovert type personality rather than the standard geeky quasi-autistic one, I’m probably spitting enviously at an ESR-type. And reading his blog. And wishing, deep down, that I were more like him.
Peter Bessman Says:
October 6th, 2008 at 3:10 pm Jeff Read,
Long time no talk. I too would consider myself a recovering fosstard (perhaps even recovered at this point). Shoot me an email sometime if you’re so inclined. Debating you was one of the more fulfilling wastes of my digital youth that I can recall, and I’d be interested to hear what you took away from your time in the hostel that GNU built.
Jeff Read Says:
Joe:
October 15th, 2008 at 11:48 pm Marco,
The “crispness” is due to a combination of factors: use of ClearType (subpixel antialiasing, though Linux does this too), and the fact that Microsoft’s TrueType engine still tends to use hinting, which means it adjusts the font shape so that crucial points fall neatly on pixel boundaries. I don’t know the magic switch to turn it off or even if there is one.
That hinting makes for sharper lines, but it takes its toll on the glyph shape, making the glyphs look less natural and even. It was necessary in the days of Windows 3.1 and aliased screen fonts, but today it is far less necessary and even detrimental, as it tends to mess with character spacing as well. The Mac, which employs a subpixel rendering system throughout its Quartz layer, is easily the best at font rendering; all screen fonts on the Mac look page-perfect. The Linux Xft layer comes in second, and Windows third; at least that is my preference, for I find the hinted glyph shapes distracting when combined with font antialiasing.
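SomeDude’s question has a concrete answer: Xft font rendering on Linux is driven by fontconfig, and antialiasing, hinting strength, and subpixel rendering can all be tuned per user. A minimal sketch using standard fontconfig syntax (save as ~/.fonts.conf, or on newer systems ~/.config/fontconfig/fonts.conf; the exact rendering you get still depends on your distribution’s freetype build):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Ask Xft for antialiased, lightly hinted glyphs with subpixel (LCD) rendering -->
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting"   mode="assign"><bool>true</bool></edit>
    <!-- hintslight keeps glyph shapes natural; hintfull gives crisper,
         more Windows-like stems at the cost of distorted shapes -->
    <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    <edit name="rgba"      mode="assign"><const>rgb</const></edit>
  </match>
</fontconfig>
```

Switching hintstyle between hintslight and hintfull is essentially choosing between the Mac-like and Windows-like tradeoffs described above.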
October 17th, 2008 at 7:29 pm I was a bit of a UNIX fan when I first read that book, but I saw a lot of good points in it.
The lesson I took away from it was: all general-purpose OSes suck. If you don’t think your OS sucks, you just haven’t tried to do the right/wrong things with it yet. Unix and Windows can be improved, but they will both still suck, because of history, poorly defined goals, and general screw-ups in getting all the parts together.
I have concluded that the correct way of thinking about OSes is not as a grand building designed and built for a purpose, with a majestic skyline, but as city infrastructure, with streets and sewers that should just work. However, people only notice your system when it overflows into the basement. Infrastructure that is not overtaxed is in a city that is not growing; if your OS sucks because it is being taken places that weren’t considered when it was first built, well, there are worse fates.
Jeff Read Says:
esr:
October 27th, 2008 at 8:17 pm Joe,
No. The Mac and the Amiga did not suck. They were (and the Mac still is) beautiful machines with beautiful OS software designed specifically to make full use of the underlying hardware. Their developers had well-defined goals and succeeded enormously in “getting all the parts together”, but sadly they were marginalized for other reasons (corporate mismanagement, lack of marketing direction, Microsoft aggression, etc.) They had flaws, and they had to be expanded to keep up with ever-changing innovations in hardware and software, but they aren’t the pile of kluges that Unix and DOS/Windows were and are.
These days, you’d hardly know such things existed, as most desktops are generic x86 shitboxes running Windows or, on occasion, some Unixoid thing. But once upon a time there were companies with strong engineering traditions who were close to their user bases, listened to them, and delivered what they wanted.
Don Marti Says:
November 6th, 2008 at 7:13 pm (You never actually know if your beard will be gray or not until you grow it out. The apparent color of sink stubble can be deceptive.) The World Wide Web, originally developed on the Unix-based Nextstep, has a lot of Unix-ish ideas. A lot of fundamental stuff seems half-done or inconsistent, but later generations of software are getting more and more of it right. Would a Unix-hater’s idea of a WWW-like system have worked as well?
November 7th, 2008 at 12:19 am >They had flaws, and they had to be expanded to keep up with ever-changing innovations in hardware and software, but they aren't the pile of kluges that Unix and DOS/Windows were and are.
Romanticizing old also-rans is always a temptation. There were major flaws in both systems; ask someone who was there, sometime, about the horror that was dynamic-memory management under early Mac OS versions. Or the truly peculiar kludges necessitated by the fact that the Amiga’s multitasking had no pre-emption.
Jonathan Abbey Says:
esr:
November 7th, 2008 at 5:29 pm The thought that Commodore had a ‘strong engineering tradition’ is highly amusing. The Amiga team were good Engineers, but anyone who claimed the Amiga bettered Unix in any substantial way never used it. The lack of memory protection or virtualization handicapped it significantly from the long view, and the CLI and user interface were not all that. I say this as a person who used an Amiga 1000 exclusively for 7 years, and programmed it in C a goodly bit.
And, Eric, the Amiga did have pre-emptive multitasking. ;-)

esr:
November 7th, 2008 at 6:13 pm >And, Eric, the Amiga did have pre-emptive multitasking. ;-)
When did that happen? I never programmed in it, but I had a buddy who did…Noah Feldman, does that name ring any bells? This would have been around 1984, I think. He claimed that Amiga programs had to give up control, couldn’t be pre-empted out of their scheduler slots. If that was bad information, please correct me.
I do know directly about the Mac OS horror; I used early Macs on which it was occasionally necessary to manually assign memory to applications because the OS couldn’t do it. Even at the time I thought that was insane.
Jonathan Abbey Says:
November 7th, 2008 at 6:28 pm The Amiga Exec was always pre-emptive, from day one. It had message ports that could act as synchronization points for communications with the GUI system and kernel, but the system never depended on manual yield points like the Mac did.
One of the things that Amiga users used to do to mock Windows users was format a floppy drive while manipulating windows, showing the boing ball bouncing in the background screen, etc. Windows didn’t gain the ability to do floppy I/O without blocking other operations until Windows 95.
See http://en.wikipedia.org/wiki/Exec_(Amiga) for details, and a reference to a 1991 Byte magazine article that discusses the structure of the Amiga Exec kernel a bit.
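The cooperative-vs-preemptive distinction being argued here is easy to see in miniature. Below is a toy round-robin scheduler in Python (nothing Amiga- or Mac-specific; every name is invented for illustration). Tasks are generators that hit voluntary yield points, which is the cooperative model: a task that never yields would starve all the others, exactly the failure mode a preemptive kernel like Exec avoids by interrupting tasks on a timer.

```python
# Toy cooperative scheduler: tasks are generators and must yield control.
# In a cooperative system (early Mac OS), a task that skips its yield
# points hogs the machine; a preemptive kernel (Amiga Exec) instead
# takes the CPU back on a timer interrupt, so no task can monopolize it.

def well_behaved(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"   # voluntary yield point

def scheduler(tasks, max_slices=10):
    """Round-robin over generator tasks until all finish or slices run out."""
    log = []
    queue = list(tasks)
    while queue and len(log) < max_slices:
        task = queue.pop(0)
        try:
            log.append(next(task))   # run until the task's next yield
            queue.append(task)       # requeue it for another turn
        except StopIteration:
            pass                     # task finished; drop it
    return log

log = scheduler([well_behaved("A", 2), well_behaved("B", 2)])
print(log)  # tasks interleave: A, B, A, B
```

The `max_slices` cap is the toy stand-in for what a real preemptive kernel provides in hardware: a guarantee that one misbehaving task cannot run forever.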
November 9th, 2008 at 5:04 am What is your take on OpenSolaris, as the replacement for GNU/Linux in terms of being the next generation development environment?
November 9th, 2008 at 7:01 am >What is your take on OpenSolaris, as the replacement for GNU/Linux in terms of being the next generation development environment?
Next generation in what sense? Linux has an effective community, a mega-crapload of pretty good software, and a strong brand. It wouldn’t bother me if OpenSolaris displaced Linux, but I see no overwhelming technical advantage there and its community is orders of magnitude smaller. As a fallback for people tied to legacy Solaris software it makes sense, but as a Linux successor? Nah.
Jeff Read Says:
November 11th, 2008 at 4:07 am Jonathan, in those days Unix was still a minicomputer OS designed to handle multiple users logging in via terminal over serial lines. Amiga was designed to be a desktop OS; as such, it could achieve graphical displays at speeds untouchable by a similarly configured Unix machine (even if you had one for the desktop). A legacy of the days when hardware and software were designed to work together as a cohesive unit, for the benefit of the user. These days, the attitude is that hardware and software are both cheap and fungible. It’s similar to the difference between European engineering (“We build cars”) and American engineering (“We sell cars”). I don’t think even modern Linux has gotten low-latency syscalls to the point where even a stripped-down GUI like xfce feels as snappy and responsive as the Macintosh or Amiga did in their heyday; heavyweight GUIs like KDE and GNOME are not even in the same league.
And yes, Eric, the Amiga had a fully preemptive multitasking kernel in 1985, at a time when such things were unheard of on any other personal computing platform. It also was the first personal computer to support hardware-accelerated video and multi-channel sound. It could do things computers today can’t, like switch video resolutions in the middle of a vertical scan. It was in many ways a revolutionary machine. You would do well to read up on its capabilities and maybe even play around with the Amiga emulators to get a feel for what it was like.
November 12th, 2008 at 4:44 am “Next generation in what sense?”
Next generation in the sense that it incorporates all the best from both worlds:
- System V, a much, much better and more consistent environment than GNU
- ABI forward compatibility
- DDI/DDK driver forward AND backward compatibility
- a GNU alternative for those hopelessly tied to GNU/Linux (in /usr/sfw)
- MPxIO, 1st class enterprise iSCSI support, enterprise grade clustering (gratis, open source!), grid engines (N1 and N1 geographic), IPMP (all these gratis AND open source!)
- zones, to a lesser extent LDOMs, and Xen
- ZFS! What’s not to love about ZFS?
- DTrace!
- runs on the same hardware as GNU/Linux
- enterprise grade compiler suite – gratis!
- runs the same free/open source software as GNU/Linux!
One can have his/her cake, and eat it, too: standardization and backward/forward compatibility of a System V UNIX, with boatloads of high quality, enterprise grade, yet open source software! So what’s not to love about OpenSolaris these days?
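For readers who haven’t met it, the ZFS enthusiasm above is easiest to justify with a short example. The commands below are standard zpool/zfs syntax, but the pool name, dataset name, and disk identifiers are made up, and running them requires root on an (Open)Solaris box; treat this as a sketch, not a recipe:

```shell
zpool create tank mirror c0t0d0 c0t1d0    # mirrored pool from two whole disks
zfs create tank/home                      # filesystems are cheap datasets
zfs set compression=on tank/home          # per-dataset properties, no mkfs/fstab
zfs snapshot tank/home@before-upgrade     # instant, constant-time snapshot
zfs rollback tank/home@before-upgrade     # undo everything since the snapshot
zpool scrub tank                          # verify every block against its checksum
```

No volume manager, no separate fsck, no editing config files: that integration is the core of the "what's not to love" argument.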
November 12th, 2008 at 4:50 am And, speaking of a smaller community, I believe OpenSolaris would benefit tremendously if YOU joined that community. Yes, you, ESR.
November 12th, 2008 at 5:06 am
>And, speaking of a smaller community, I believe OpenSolaris would benefit tremendously if YOU joined that community. Yes, you, ESR.
Probably. However, you averted your gaze from the elephant in the room, which is comparative size of developer bases.
There was a time when I thought the NetBSD/FreeBSD/OpenBSD crowd had a superior system to Linux. I didn’t switch, because it’s a straight-line prediction of my own theories that (other things being equal) the open-source project with more developers will improve faster than the one with fewer. Therefore, I believed that in time Linux’s metrics on real-world performance, security, and stability would improve to pass those of the BSDs. Eventually, I’d say around 2002, I judged that this had in fact occurred. Had I in fact switched because of perceived technical superiority, I would have spent a great deal of effort backing the wrong horse.
Having avoided this mistake with respect to BSD, I’m not going to make it with respect to OpenSolaris.
What’s worthwhile in your projects will be assimilated by the expanding Linux blob. What’s not will die. Sorry, but that’s the way it is. Evolution is not pretty to watch sometimes.
Jeff Read Says:
November 12th, 2008 at 10:18 am What's worthwhile in your projects will be assimilated by the expanding Linux blob.
I get it. So the Linux community are Zerg. Problem is they’ve spent so much time rushing other people’s bases they’ve neglected their own.
I stripped Linux off of one of my machines the other week and replaced it with NetBSD. It just so happened that this was my primary art machine — a TabletPC — and there was no official support for such machines under NetBSD that I could find. This was to prove a difficult move fraught with hardware-support issues but one I ultimately don’t regret making. Tweaking and compiling the X.org driver from the linuxwacom project got the last piece of the hardware puzzle — support for the machine’s built-in tablet — in place.
There isn’t the broad base of driver support that there is under Linux, but I’ve found that when NetBSD supports a piece of kit it supports it hardcore. It is able to configure the machine’s sound card for full duplex — something I haven’t been able to observe under Linux (being able only to record XOR playback adds to the sound frustration and general ALSA brokenness). Meaning that I can use this machine as an audio workstation as well.

Back in 2001 when I first tried NetBSD, Linux’s USB support was only kinda, sorta there; NetBSD was able to autodetect and immediately start using any USB input device I plugged into it. For a driver to make it into the NetBSD kernel it must not only work but fit smoothly in with NetBSD’s infrastructure — a 30-year history of getting different kinds of hardware across different buses and CPU architectures to work together under a single, unified model. The elegance is astounding.

Linux, by contrast, is making improvements in this regard but only really had a unified device model as of the 2.6 kernel series; the hacked-together, git-r-done model of development seems to have exacted a severe toll in terms of code quality and maintainability. It’s kind of sad to see, as I had used Linux for 13 years and become rather fond of it, but it’s approaching a threshold where, until it gains the refined culture of quality that the BSDs have cultivated for years, it will collapse under its own ponderous bulk. The same goes for the GNU userland.
November 12th, 2008 at 10:18 am I don’t, from experience, subscribe to the notion of “how many masons, that much wall”: programming software is not like building a wall, and more people working on it does not necessarily mean that the software will be plentiful, high-quality, or even correct.
I’ve seen many a time that a handful of true experts make miracles in the shortest amount of time possible, while “experts” muck around and produce crappy software.
Quantity is nice, but quality is even better.
Someone like yourself can do, and has done, miracles. That’s why I firmly believe one ESR can do what would normally take 100 “average” programmers. I’m deeply convinced of it, because I myself experienced it.
So what if an OS isn’t mainstream? Why would it have to be? Didn’t you yourself write in “why I hate proprietary software” that programming is an art, and that one should enjoy it?
And since the same FOSS software runs on (Open)Solaris just like it runs on GNU/Linux, I don’t really see why community size is a problem.
Every open source beginning was hard. I don’t see why OpenSolaris would be any different; it has to start somewhere.
Jeff Read Says:
November 12th, 2008 at 2:15 pm Eric S. Raymond, Miracle Worker.
And you thought the Obama supporters were blinded by hype!
November 12th, 2008 at 4:00 pm Have you read his book, “The art of UNIX programming”?
I look up to him, as far as programming tenets go. There is no shame in learning from those who are smarter and wiser than yourself.
I also believe that ESR would be of great use as a guide to all those GNU/Linux people coming to OpenSolaris. As a “Gray beard”, he’d be a great mentor to the GNU/Linux newcomers in the OpenSolaris world. We have lots of those.
After all, who knows UNIX better than a “Gray beard”? No, who *understands* better than a “Gray beard”, what UNIX is all about?
And ESR, when I read your “why I hate proprietary software”, you touched a spot in me, deep inside. When you described what you went through, I saw myself in that description, word for word.
Thank you for sharing that essay.
November 12th, 2008 at 4:04 pm “We have lots of those” == lots of GNU/Linux people coming over to use, play and work with/on OpenSolaris. Just look at opensolaris.org/os/discussions/Irgendeiner:
November 12th, 2008 at 7:10 pm >And since the same FOSS software runs on (Open)Solaris just like it runs on GNU/Linux, I don't really see why community size is a problem.
Two words: device support.
November 13th, 2008 at 2:34 am With all due respect, that is something that used to be the case, and is definitely no longer so.
Most contemporary hardware is supported now, and support for even more hardware is in the works.
I have an Intel quad-core PC, and everything just works, out of the box: the network, the HD audio, the Nvidia 3D accelerator (with acceleration).
November 13th, 2008 at 2:36 am Ultimately, you will do what you want. I just think it would have been great to have you along for the ride.
Jeff Read Says:
November 19th, 2008 at 1:41 pm UX-admin, Solaris seems worth looking into. It’s rather shameful that we must accept the worst possible systems (Linux, frickin’ Windows, x86) just because of mob support.
SGilmour Says:
December 2nd, 2008 at 3:08 pm @ESR: We will be watching closely to see how much of your publications will not look ridiculous 15 years later!
Dave Says:
January 28th, 2009 at 10:59 am Well, I’m not a veteran like you guys, but I thought I’d share my feelings anyway. I’ve been using Vista at work and at home for about a year now. It crashed once. I used my aunt’s MacBook for one summer, running OS X. The thing must have crashed on me about twice a week. Every time I ran too many apps (meaning a torrent client, a video player and a web browser, and maybe a few PDFs) the whole thing just froze and had to be rebooted. I also run GNU/Linux at home. The sound problem (mentioned in another response) is just ridiculous. Even printing isn’t easy. I’m not running Gentoo or using a Windows-only printer either. I’m running Ubuntu, and my printer is a 15-year-old HP LaserJet 4. It can print a test page or standard jobs fine, but try to make it print even pages only and it’s no go; the print job just disappears. You say I should have to recompile something? If your car broke down every few days and you were told all you had to do was polish the spark plug to keep the thing going, you would also think about finding something better. Isn’t that what’s happening to US car manufacturers right now?
As for being proprietary, please grow up. Why should I care to be able to modify something if it already works? If it doesn’t work, why would I buy it? Microsoft software is expensive? Let’s say I have to pay 1000$/year on Microsoft licenses. That’s less than 2% of most entry level salaries for IT jobs. If the software increases my efficiency by 10 minutes per day I’m a clear winner. But what could save 10 minutes per day? Let’s see: not having to read through endless poorly structured documentation, or filtering through forums looking for a command line, somewhere inserted between the top-notch *nix intelligentsia comments that tell you what an idiot you are for not knowing the command, instead of giving you the command itself.
GNU/Linux may have a ton of free software, but most of it does not work properly (meaning without having to change code), has a tendency to crash without warning or error messages, is not user friendly, and has poor or no documentation. And I almost forgot: there is a lot of open source software available for Windows, and it usually has better GUIs and more functions than the *nix equivalent (e.g., 7-zip vs p7zip).
Don’t get me wrong, I wouldn’t be spending so much personal time using Ubuntu, or trying to use FreeBSD, if I didn’t love the concept of freedom-based software, but the *nixes still have a long way to go before the hate becomes unjustified.
Nate Says:
February 22nd, 2009 at 9:07 pm @SGilmour:
My experience with OS X has been much different. Usually such crashing is caused by hardware problems (typically bad RAM). Software for OS X is very stable (for the most part; there are a few apps that crash).
Windows is a mess. I work in IT and see problems every day. When I used to use it at home, it was horrible.
Linux is nice, but requires a certain level of maintenance. A LaserJet 4 should work fine (though it may be necessary to modify settings in the CUPS web-admin page).
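To make the CUPS suggestion concrete, here is a minimal command-line sketch. The queue name `LaserJet4` is a placeholder of this compilation's editor, not something from the thread; substitute whatever `lpstat -p` reports on your system.

```shell
#!/bin/sh
# Hedged sketch: inspect CUPS queues and try an even-pages-only job.
# "LaserJet4" is a hypothetical queue name; use the one lpstat reports.
if command -v lpstat >/dev/null 2>&1; then
    lpstat -p || true                  # configured printers and their states
    lpoptions -p LaserJet4 -l || true  # driver-settable options for the queue
    # page-set is a standard CUPS job option: print only the even pages.
    lp -d LaserJet4 -o page-set=even document.pdf || true
else
    echo "CUPS client tools not installed; nothing to inspect"
fi
```

The same `page-set` option is what the CUPS web interface exposes in its job options; if an even-pages job vanishes the way Dave describes, `/var/log/cups/error_log` is the first place to look.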
Nate Says:
February 23rd, 2009 at 12:36 pm “There is no possibility of ‘better’ if the effect is to lock me into a vendor-controlled jail.”
This would make sense, except GCC has many GCCisms and bugs that developers program around, effectively locking their code base to GCC. It took a while for the Linux kernel to compile under compilers other than GCC (notably Intel’s, and later Sun’s, but I don’t know if the Sun “port” was ever finished). Microsoft has better standards compliance these days than GCC, so you run the risk of vendor lock-in more with the Linux compiler than with the Microsoft compiler. Most compilers have to offer a “GCC compatibility” mode for GCC-targeted code to compile without [sometimes major] changes. Microsoft doesn’t have this switch, so I guess that’s why you refuse to use their development tools?
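As an illustration of the kind of GCCism the commenter means, the following sketch (an editor's example, not from the thread) writes a tiny C file that uses a GNU "statement expression" and, if gcc happens to be installed, shows that the very same compiler rejects it in strict ISO mode:

```shell
#!/bin/sh
# Demonstrate a GCCism: the statement expression ({ ... }) extension.
cat > gccism.c <<'EOF'
#include <stdio.h>

int main(void) {
    /* ({ ... }) is a GNU C statement expression; strict ISO C rejects it. */
    int x = ({ int t = 21; t * 2; });
    printf("%d\n", x);   /* prints 42 */
    return 0;
}
EOF

if command -v gcc >/dev/null 2>&1; then
    # Default GNU mode: compiles and runs fine.
    gcc gccism.c -o gccism && ./gccism
    # Strict ISO mode: the extension is rejected.
    gcc -std=c99 -pedantic-errors gccism.c -o gccism_iso 2>/dev/null \
        || echo "rejected under -std=c99 -pedantic-errors"
fi
```

Code that accumulates such constructs compiles only on GCC or on compilers that emulate it, which is the lock-in being described.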
MSDN documentation (both online and installed) is vastly superior to anything Linux has to offer. It is not a coincidence that KDevelop mimics the Visual Studio user experience (unsuccessfully, however), as well as trying to mimic the MSDN documentation experience.
Proprietary is not as bad as you say. Try to keep the discussion from getting too religious. The UHH still has more than enough relevance these days. X still takes more resources to run than Windows XP. I can run XP on a P2 233 MHz computer with 160 MB RAM and a 16 MB graphics card, while GNOME/KDE show laggy performance on a Celeron 2.7 GHz with 1 GB RAM and a 128 MB Radeon card. I don’t even want to get into hardware support when it comes to X.
Linux developers are worrying about too much **** that doesn’t matter for desktop users. Sun is doing the same thing with OpenSolaris. They are trying to do this desktop distro, but it will not install/run on 3 of my 4 computer systems.
In the words of a UNIX hacker, “If you want an Open Source Windows, then contribute to the ReactOS project and stop crying about proprietary crap.”
Nate Says:
February 23rd, 2009 at 12:43 pm Also, yes, I know Linux’s bread and butter is the server, but if they want to displace Microsoft Windows they will have to standardize on one user experience and really optimize X and grow its hardware support. OpenSolaris has to do the same thing. I cried when I found out they were doing a KDE port to be included in the base install. I shall not use it.
Linux is a cluster**** to develop for. That’s why commercial tools like Kylix have failed so colossally on that platform. Kylix actually was a good development tool, but it’s hard to develop commercial native Linux GUI applications because of the GUI/library clashes and assorted differences among the distros.
The only ways to really be accurate in developing for Linux are to use WineLib, to develop a core application with much of its functionality implemented in a proprietary scripting language (as some editors do; SlickEdit?), to use Java, or to accept that your application will alienate half of the Linux userbase, because some hate KDE and some hate GNOME, and applications will not integrate properly if they use a different base…
People need to wake up and see how horrible Linux is as both a development platform for end-users and a desktop/delivery platform. When they get rid of all the fragmentation, things will get better. Linux is not knocking on the door of Windows. Not Ubuntu, not Novell, not any of them. The hate for proprietary software won’t help, either, since proprietary software has, on average, much better quality and support compared to open source software (this includes “community” support, btw, IME).
Nate Says:
February 23rd, 2009 at 1:20 pm “With all due respect, but that is something that used to be the case, and is definitely no longer so.”
Yes, it is so.
Maybe it runs on your machine, but I have 4 computers and Solaris won’t even boot the installer on 3 of them because of its pretty terrible hardware support.
I do not buy hardware to run free OSes; I just renew my RHEL subscription instead.
PJ Says:
February 23rd, 2009 at 1:35 pm “Windows is a mess. I work in IT and see problems every day. When I used to use it at home, it was horrible.”
Learn to use Windows, I guess? The same way you learned to use Linux?
There are so many people who disagree with that statement :) The only people who really say that, are those who don’t use Windows and only want to tarnish it to persuade others to move to Linux. I have had huge arguments with Linux users who have bashed the Windows GUI/Userland/Security and have been proven wrong handily. I do not want to have this discussion again, but if you feel like you must go that route, I have a saved copy of those posts that I can paste to this blog.
Overstating and exaggerating things is not good. What you say is FUD.
Prasoon Says:
April 2nd, 2009 at 4:51 am > If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn't exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.
And to be fair, all the efforts (and wasted time of people’s lives) around these problems have to be added to the TCO (of the respective OS).
Now it seems some politicians even want to have the military step in for cyberproblem mitigation, so this will become even more expensive. And thus become kind of a subsidy to MS. It’s the same “rake in profit for myself now for substandard quality and leave the problems to others” principle as bailouts for the financial gamblers, dumping waste into the environment for ecology, raping girls leaving their inner scars to others to solve, etc.
David Says:
August 10th, 2009 at 4:17 am > If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn't exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.
Nice argument. Viruses and worms existed before Windows and before MS-DOS.
esr Says:
August 25th, 2009 at 5:06 pm >>> …rather than leaving the job to a userspace library like curses(3)
OK, here’s a question… what is that “3” doing after “curses”? Or find(1) and man(1)? Is find(1) different from find(2)? Or is find always (1)? (And why?)
Ben Says:
August 25th, 2009 at 6:54 pm >what is that â€ś3â€ł doing after â€ścursesâ€ť?
David, it’s a Unix documentation convention. The number in parentheses is the manual section you’d find the feature in. (1) is utility programs, (2) is system calls, (3) is library routines. Higher numbers are more variable, except that (6) is always games if it exists.
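The convention is easy to poke at from a shell; a small illustrative sketch, assuming the standard man-db/BSD man tools (which provide `man -w` and `whatis`):

```shell
#!/bin/sh
# printf exists both as a command-line utility and as a C library
# function; the section number selects which manual page you get.
if command -v man >/dev/null 2>&1; then
    man -w 1 printf || true   # path of the printf(1) page (the utility)
    man -w 3 printf || true   # path of the printf(3) page (the C function)
fi
if command -v whatis >/dev/null 2>&1; then
    whatis printf || true     # one-line summaries from every matching section
fi
```

So `find` is find(1) because it is a user command; a find(2) would be a system call, and none exists by that name.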
Ben Says:
November 6th, 2009 at 8:00 pm Fair comments; I do think the book was true at the time, and I think your comments are fair now.
I do think OS X, Windows and Unix are ’70s dinosaurs whose design assumptions are no longer relevant, and requirements have changed.
Riley Says:
November 6th, 2009 at 8:13 pm > If Microsoft were competent at designing operating systems, botnets and the spam problem wouldn't exist. They are pretty much entirely an artifact of the fact that Windows is trivially easy to remote-exploit.
HAHA… UNIX is NOT secure at all. MULTICS (and VMS, OS/390, OS/400) was FAR more secure, and there is a nice paper about a Trojan on Multics; the author’s answer was that these systems can never be secure and should never be shared (i.e., on the internet). We knew this in the ’70s.
As long as someone can convince users to run a program (which is a given) you will always get botnets in systems with ambient authority. Unix (and Windows) security is fundamentally flawed, as it has security at the user level (which was OK in the ’70s, but not now); you need ACL security at the process level, or better, use a capability security system like EROS, Coyotos or MOOOS. Note that file descriptors in Unix are capabilities; unfortunately the access control is flawed.
Bruce Says:
November 10th, 2009 at 4:44 am @Nate: overstatements and exaggerations, you say? http://blogs.zdnet.com/security/?p=4825&tag=nl.e550 Let’s not forget the 7 million PCs _CURRENTLY_ infected with Conficker, and the 2003 blunder of the MS-SQL Slammer worm.
James Says:
January 29th, 2010 at 2:56 am I am relatively new to Unix, well Ubuntu/Linux, with previous experience in Macs and Windows. I am forced to use it at work as key software is not available in 64 bit windows, or I simply need mainframe power.
Unix/Linux is an absolute crock in my view. It takes ages to learn, and I am very dependent on others. It is not obvious to me how I can do anything beyond work with a simple Cygwin window. Some colleagues who know Unix aloofly claim its superiority, yet refuse to help me learn it.
I would dearly like to install and run an FVWM and an X-Windows client myself, especially when the commercial X-Win32 program fails on me, but I haven’t a hope of sorting things out without technical support.
For all its faults, I can work most MS Windows issues out myself. The book rang absolutely true for me. What a ridiculous and obsolete operating system! Operating systems should not be an impediment to your work, but an enabler. Even the current BASH/Linux/Ubuntu implementation of Unix utterly fails in this regard for a new user.
morgan greywolf Says:
January 29th, 2010 at 1:52 pm @Bruce:
Go get the Ubuntu Live CD — you’ll download that one if you click the big, green, friendly Download button — and give it a try. Just boot it on any reasonably current Windows desktop or laptop; you’ll get to play with it without ever installing it. Then tell me what you think of Unix. Most people in your situation are using engineering workstations, thinking that that is the current state of Unix on the desktop. Hardly. My non-technical wife (she’s a psychologist) uses Linux at home on her desktop and on her Dell Vostro laptop with very little help from me.
Ball Says:
April 16th, 2010 at 5:04 am Typo fix:
$ rcsdiff 'index.html?p=538'
===================================================================
RCS file: index.html?p=538,v
retrieving revision 1.1
diff -r1.1 index.html?p=538
145c145
< Similarly unfortunate. Sets the tone for too much of the rest of the book, being mostly hyperbolic snark when it could have been useful criticism. Very dated snark, too, in today’s environment of Linuxes wrapped in rather slick GUIs. The anecdates about terminal sessions on Sun hardware from 1987 look pretty creaky and crabby today.
---
> Similarly unfortunate. Sets the tone for too much of the rest of the book, being mostly hyperbolic snark when it could have been useful criticism. Very dated snark, too, in today’s environment of Linuxes wrapped in rather slick GUIs. The anecdotes about terminal sessions on Sun hardware from 1987 look pretty creaky and crabby today.
September 15th, 2010 at 6:01 pm I’ve read the UHH, and the fact that many of these problems still exist is the greatest damning evidence that UNIX sucks and will always suck. It’s not 1969, folks!
The chapter on the X server is not only still relevant, it is more relevant than ever! Sure, some of the ICCCM ugliness has been hidden if you only use certain toolkits, but it’s still there and no better standard exists. The mass proliferation of toolkits has only worsened the chaos and defeats the original purpose of the GUI. When using Linux, I stick to the command line except for web browsing. In spite of having its own chaos, it’s still less random than what we jokingly refer to as the Linux desktop.
The only silver lining is that the UNIX toolset and assorted crap are being standardized, which is the first step towards sandboxing and moving on to the next big thing. I can’t wait for the day when we can have a system where the left hand knows what the right hand is doing. Hopefully, instead of the everything-is-a-file paradigm, it will have the nothing-is-a-file paradigm.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusively for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.
Copyright © 1996-2016 by Dr. Nikolai Bezroukov. www.softpanorama.org was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.
Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
Last modified: November, 04, 2014