Softpanorama

May the source be with you, but remember the KISS principle ;-)
(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and  bastardization of classic Unix

Open Source Software:
(slightly skeptical)
Annotated Chronicle

1999 Q4

October November December


December 1999

[Dec 22, 1999] LWN 1999 Linux timeline

[Dec 17, 1999] Slashdot: Ask Slashdot: Is SCSI Sub-Par Under Linux?

[Dec 12, 1999] NYT: Easing on Software Exports Has Limits -- OpenBSD as the most secure of the free Unix variants.

The appeal is not only security but cost. Marcus Ranum, chief executive of Network Flight Recorder, said the OpenBSD operating system enabled his engineers to read the source code, check for bugs and build a very secure tool for detecting attackers. If the attorney general succeeds in persuading the Europeans and Canadians to shut off the flow of open-source security software, he said, "I think it would be a tragedy."

But in case Reno has her way, the software industry is developing end runs. The administration, for example, has so far declined to regulate the international movement of source code if it is printed on paper, presumably out of concern that such regulation would violate the First Amendment. Thus, several companies are already shipping printouts of their code to Europe where it is scanned into computers.

When asked about the policy's impact on the development of Linux, FreeBSD, and other open-source projects that serve the government's own needs, Reinsch, the commerce undersecretary, said: "It's an important question which we need to study a lot more. We don't have all of the answers."

[Dec 4, 1999] Salon Technology: Code critic, by Rachel Chalmers

Great paper. See also my Softpanorama Bookshelf / Classic Books

I stumbled across Lions' books in 1996. I'd majored in literature and seemed to have spent my entire life searching for a witty, literate man. When I finally found the man who would become my fiancé, he was a Unix hacker. This baffled me. I couldn't begin to imagine how the arid world of over-lit computer labs and humming server rooms could have produced someone so much droller and more insightful than my fellow humanities graduates. So I did what I always do when I want to get inside someone's head: I browsed his bookshelves.

There I found -- and devoured -- Eric Raymond's hilarious "The New Hacker's Dictionary." Written up from the legendary Jargon file, it describes a literary culture weirdly like my own, complete with movements, jokes, manifestoes and Great Works. And speaking of great works, I also found on those shelves the first editions of the Lions books. The books -- a two-volume set titled "Source Code and Commentary on Unix Level 6" -- include not only the entire source code for the Unix Version 6 kernel but also a detailed and often witty discussion of it written in the mid-1970s.

The genesis of Lions' books is, of course, tied to the Unix tradition that began 25 years ago when the magazine Communications of the Association for Computing Machinery -- technology's equivalent to Nature -- published a paper by Ken Thompson and Dennis Ritchie, the two Bell Labs employees who created Unix and the C programming language. The paper was called "The Unix Time-Sharing System" and included a description of the operating system, a justification of its design and a few notes on why it was built in the first place. The article captured the imagination of many a programmer, including Ken Robinson, a teacher at the University of New South Wales (UNSW), who wrote away for a copy of the new operating system. When it arrived, Lions, his colleague, read the source.

Source code is the blueprint for software, like a set of spells that humans can read and use to control machines. Given source code, a programmer can modify an application by fixing bugs or adding features. But most commercial software is sold only in machine-readable form. Access to source code is power.

Lions' career had followed the classic path for an Australian academic of his generation. In 1959, he took a first-class honors degree from Sydney University and promptly left the country. He earned his doctorate at Cambridge in 1963 and spent the next decade working for Burroughs Corp. in Canada and Los Angeles. By 1972 he was married and had a young family. He moved back to Australia and took a position as senior lecturer in UNSW's department of computing. He would teach there for the rest of his working life.

The Unix code base enchanted Lions -- so much so that he decided to make significant changes to two of the courses he taught. Until then, most teachers of operating systems loftily imparted general principles about programs their students had probably never seen, or encouraged students to build toy operating systems of their own. Unix offered a third approach. It ran on a comparatively affordable computer system, a Digital Equipment Corp. PDP 11 -- a machine that UNSW already owned. Unix was compact and accessible but offered a remarkable set of features. To top it off, in Lions' words, it was "intrinsically interesting." Unix could be read and understood with far less effort than IBM's bloated OS/360 and TSS/360 (which, funnily enough, are pygmies by modern standards), but it had the industrial-strength functionality to which homemade toys could never aspire. One student, Greg Rose, remembers, "John expressed this by saying, 'The only other big programs they see were written by them; at least this one is written well.'"

The first editions of Lions' books were slender computer printouts covered with red cardboard and stamped with the UNSW crest; they bore the titles "Unix Operating System Source Code Level 6" and "A Commentary on the Unix Operating System." Lions had originally prepared them for his students, who were astonished at the availability of the source code. Here was an entire operating system you could hold in your hand. "The whole documentation is not unreasonably transportable in a student's briefcase," noted Lions. Later he would joke that in subsequent editions of Unix, this had been fixed.

[Dec 1, 1999] ZDNet Enterprise Linux Opinion That ol' devil is still at it... By Evan Leibovitch, Linux

In most contexts, I recall the phrase "A rising tide lifts all boats" as an explanation of the trickle-down theory of economics popularized during the Reagan years. Now I find myself applying this term, quite justifiably, to Linux.

For Linux, quite clearly, is blazing a path through which a number of non-Linux software projects are finding new vigor. To be sure, Linux has brought Unix operating system philosophies to a far greater audience than Unix itself ever did. While on one hand SCO must certainly see Linux as competition -- especially at the low end -- SCO must be pleased with the fact that Linux has managed to make Unix(-ish) systems on Intel hardware acceptable to the business computing world. For the longest time SCO was caught in the middle between what some in the media portrayed as a RISC/Unix versus Intel/Windows battle. Now that Linux has bridged that gap, SCO is happy to come along for the ride.

But the projects that have benefitted the most from Linux's popularity, if somewhat jealously, are unquestionably the outgrowths of the Unix BSD project. BSD stands for Berkeley Software Distribution, and has a history that goes back to some of the earliest days of Unix. BSD started in 1977 as a series of modifications that made the "pure" Unix systems of the day more usable. It's not hard to make the case that the Berkeley code was the first real Unix(-ish) freeware, predating the creation of the GNU Project by many years.

Sharing the Linux spotlight

While BSD has been around for almost the entire lifespan of Unix, Linux has seemingly come from out of nowhere to steal the free software spotlight. But BSD and its many users have certainly not gone away. In fact, at the recent Linux Business Expo, BSD's most popular release, FreeBSD, had a prominent place on the show floor. It was doing a brisk business in CDs and doll-sized versions of BSD's Daemon mascot.

FreeBSD co-founder Jordan Hubbard has, for as long as I've known him, taken a very reasonable and pragmatic approach to the relationship between BSD and Linux. "Our goals are more in common than they are different," he said. "Fighting between BSD and Linux is sort of like Oxfam and CARE fighting over who's going to feed the poor."

Indeed, Hubbard said, his primary goal is that BSD code be used. He doesn't care where or how, just that people find it useful -- even if it's embedded within someone else's (possibly proprietary) code. The BSD license is not only older than the GNU General Public License, it's also simpler. The BSD license requires only that users:

    retain the copyright notice and disclaimer when redistributing the source code;
    reproduce that notice and disclaimer in the documentation of binary redistributions; and
    refrain from using the authors' names to endorse derived products without permission.

After that, everything is fair game. There is no GPL-type requirement that any changes you make to BSD-covered code be made publicly available in source form. You're free to do whatever you want with BSD-licensed code, including making proprietary changes. BSD fans accurately point out that, in this sense, BSD code is freer because it comes with fewer restrictions than the GPL.

Fears of exploitation

This kind of thing, of course, is the stuff that holy wars are made of -- and there are certainly sparks of ill will in some parts of the Linux and BSD worlds. Many BSD folk have an allergic reaction to the impositions of the GPL, while GNU supporters believe that BSD's scheme offers an invitation to proprietary exploitation that ultimately inhibits the growth and popularity of free software.

It's an unwinnable debate, and both sides have valid points. Unfortunately, the debate is often fueled by an underlying resentment of Linux in some BSD circles that contains more than a hint of jealousy at times. Fragments of the debate-turned-argument surface in the strangest places, most recently within a Slashdot discussion of the new Slackware release.

I have myself heard from BSD people who still think of Linux as a toy, and of its developers as amateurish hackers (in contrast to BSD's serious Unix-bred developers, they seem to suggest). Even moderates such as Hubbard believe that Linux is merely a well-hyped first step that gets people into understanding the value of free software. Once there, such converts will naturally migrate to BSD when they want to do serious work -- or so the belief goes.

It's for this reason that the BSD world is full of comparisons with Linux. The best known is a chart that's now being distributed as FreeBSD marketing material. Hubbard himself wrote an excellent piece on the topic for Performance Computing magazine, October 1998 issue. In the print version, the magazine publicized Hubbard's article using a tasteless cartoon of the Linux penguin impaled on the BSD Daemon's trident...

Battle for market share

...I see it differently. Linus Torvalds, more than the BSD folk or the GNU folk or anyone else, succeeded personally in inviting and encouraging a community effort as nobody had done before. It's this sense of widespread community participation, in large part reflecting Torvalds' own skills at herding cats, that has moved Linux past BSD (and past the GNU Hurd project) in any perceived battle for the attention of the mainstream computing public.

...What I do know is that FreeBSD is rock-solid in powering high-volume Web servers or mail servers -- but Linux is also proving its mettle in such situations. BSD fans are quick to point out that FreeBSD powers Yahoo, Walnut Creek (also known as cdrom.com) and similar workhorse sites. I have certainly heard -- from people whose opinions I trust -- more than my share of glowing reports about FreeBSD's abilities to handle heavy Web traffic. But if someone is already using Linux and is happy with it, I'm still uncertain why they would want to switch, or need to switch. Given the nature of free software, if BSD's networking code is really that superior, it may find its way into Linux anyway -- just like Hubbard wants.

November 1999

[Nov 23, 1999] The Consulting Times - Feature Article -- by Tom Adelstein, CIO Bynari Inc.

Global consulting firms tend to lag the rest of the population in the adoption of new products and services. Rationalizations for staying in the slow lane include the need to wait until a technology has proven itself. In the case of Linux, this could prove costlier than in previous times. Ignoring Linux and the Open Source model could have repercussions similar to Jericho for global firms such as IGS, EDS, Andersen, Cap Gemini et al.

In several recent analysts' meetings, including those with Gaskin's IPO Desktop and RadioWallStreet, the subject of Linux and the large consulting firms has come up several times. Some very well-known analysts see the global firms stepping in when and if the market is ripe. Those of us more familiar with Linux might take issue with that notion, but only out of emotion, not out of traditional business logic.

In order for global consulting firms to start Linux practices, they have to answer some fundamental questions. Let's take a look at some of them.

Will Linux Survive?

The first question consultants look at is Linux's viability. Looking past the hype Linux receives, one can reasonably ask about revenue. For example, if an analyst compares Red Hat's revenues with its market capitalization, the imbalance becomes obvious. Cobalt Networks suffers from the same problem. A revenue model that appears viable hasn't presented itself, and commercial Linux companies do not generate revenue commensurate with the attention they receive. This makes one wonder whether Linux is viable.

Rumors abound regarding how well Compaq, IBM, Dell and Siemens have done with Linux. For example, Siemens has taken a leading role with SAP in porting R/3 to Linux, and the port achieved the highest benchmark ever recorded. Yet colleagues of mine in Berlin tell me that internal policy at Siemens prevents anyone from using Linux. As one of my associates said, "they threatened to fire anyone who plugged anything but Microsoft into the ethernet."

Another associate at IBM told a group of us that Lotus refused to provide the source code to the Notes client to the Linux team. The Linux team wanted it so they could use it on their laptops. Since Notes is the only mail client IBM allows, the Linux team felt compromised. One of the Linux developers found the source code to the AIX client and ported it to Linux. It worked so well that the AIX team ported the changes back and achieved improved performance. One has to wonder why Lotus refuses to release the ported Notes client for Linux.

How Much Can Linux Grow?

Another question globals have to ask in analyzing Linux involves its ultimate market penetration. How big is the potential Linux market? Under one interpretation, one can easily say that the Linux market pales in comparison to other UNIX systems, to the IBM midsize market and to Windows NT/2000. Unlike Sun's Solaris operating system or HP-UX, no specific processor depends on Linux for its existence. As far as growth goes, the market potential for Linux doesn't show up on the radar screens of large consulting firms.

To gain a sense of perspective, compare the revenues of Sun Microsystems, a leading UNIX company, at $11 billion with those of the entire Linux industry, which might have to stretch to reach $200 million. One has to question the demand for Linux in terms of offering real-world solutions rather than its appeal as an alternative to Microsoft. What would compel a global consulting firm like Andersen to organize a department around Linux? Certainly a firm as clever as Andersen wouldn't adopt Linux merely because of the backlash against Microsoft's branding as a monopoly.

The Empty Bench Ultimatum

What's the demand for Linux consultants in the marketplace? How much would someone with Linux skills command? Global consulting firms employ individuals with specific skill sets that they can market at a profit. For example, one might have excellent project management skills. A large firm would pay such an individual $100,000 per year or about $50 per hour. The firm would then find someone needing a project manager and bill them out at $125 per hour.

Even small information technology firms, like IT Partners, earn a living by paying someone less than they charge. For example, if I had 20 people billing 40 hours per week and I made $10 an hour off of each one, my margin would be $8,000 per week, or about $34,640 per month. Consulting firms call this leveraging personnel. Others might call it a racket, but the demand exists and companies willingly pay the going rate.
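
To make the leveraging arithmetic concrete, here is a minimal Python sketch using the article's own figures (20 consultants, 40 billable hours a week, a $10-per-hour margin over pay); the 4.33 weeks-per-month factor is an assumption chosen to reproduce the article's monthly number.

    # A sketch of the "leveraging personnel" arithmetic, using the article's figures.
    consultants = 20
    hours_per_week = 40
    margin_per_hour = 10                   # billed rate minus pay rate, in dollars

    weekly_margin = consultants * hours_per_week * margin_per_hour
    monthly_margin = weekly_margin * 4.33  # assumed average weeks per month

    print(f"weekly margin:  ${weekly_margin:,.0f}")    # $8,000
    print(f"monthly margin: ${monthly_margin:,.0f}")   # about $34,640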

Information technology firms do face a significant risk if they can't deploy all their consultants. A firm in the industry must recruit consultants and bear the high cost of acquiring them. If a consultant doesn't bill, it could take three or more billable consultants to cover the cost of the idle one. Two people on the bench could hurt a branch's profitability. Ten non-billing consultants could put a mid-size firm below break-even. This is a predicament firms like Cap Gemini America face every day.
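
As a rough illustration of why even one idle consultant stings, the sketch below computes how many billable consultants are needed to carry an unbilled one; the hourly cost and margin figures here are illustrative assumptions, not numbers from the article.

    # Back-of-the-envelope bench arithmetic; the figures are assumptions for illustration.
    import math

    idle_cost_per_hour = 50    # what an unbilled consultant still costs the firm
    margin_per_hour = 15       # contribution of each billable consultant

    needed = math.ceil(idle_cost_per_hour / margin_per_hour)
    print(f"{needed} billable consultants cover one idle one")   # 4 with these figures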

Consulting firms hire resource managers to make certain all billable consultants stay billable. People say that the resource manager must manage the bench. Ultimately, that means the bench should remain empty.

With an empty bench ultimatum facing every branch manager in every large consulting firm, the incentive to hire Linux consultants and then find them positions remains low. For every request made for a Linux programmer, a branch manager would rather say no than have one on the bench for any length of time. What would you rather have, ten SAP consultants bringing in $175 per hour or ten Linux consultants billing $65 an hour?

[Nov 22, 1999] COLUMNS

For people like Tony Baines, corporate evangelist for the Santa Cruz Operation Inc. (SCO), long-time purveyor of Unix systems, it was all great fun. But when the week was over, it was time to get back to business. The high profile Linux now enjoys, he postulated, is just the fallout of a year's worth of media focus on Microsoft's woes with the Justice Department and Judge Jackson's recent findings of fact. Although Linux may be new, he says, "the concept of collaborative development and shared code is not. It has been a ubiquitous feature in the UNIX industry from the beginning. And that, of course, is a whole 20-year involvement."

Indeed, for those who watched Unix grow up (and into more flavors than Baskin-Robbins) on the campuses of universities around the country, the energy of the programmers who have written and advanced Linux programs across the Internet for the past couple of years is not new at all. The tradition of Open Source, borrowing code and advancing it, is part of that campus tradition. What is new is the fact that the operating system is free (for the cost of a download) and that, in spite of that, a number of vendors have staked high claims on the territory and are attempting to build it into a business.

Part of the charm at the Linux Business Expo was watching the recent adopters--guys in ties--standing side by side with pony-tailed programmers, watching demos at Corel, Red Hat and Caldera one minute and lining up at the Andover.Net exhibit the next to order their ThinkGeek sweatshirts and Fridge Code.

It has all the schizophrenic charm of computing in the early days, when development still took place in garages and the programmers who did it were ambitious but still hungry. In these days of IPOs, venture capitalists, mergers and acquisitions, it will be interesting to watch the movement evolve. Linux as a business enterprise is raising the profile of the operating system and earning it ink in publications such as The Wall Street Journal, Forbes and BusinessWeek as well as trade journals such as VARBusiness. Corporations are embracing it. And that opens new opportunities for VARs and integrators to build and enhance systems and offer service bundles for them. Even within the industry, companies like SCO, long-time marketers of a commercial Unix, are embracing it in their business plans.

[Nov 21, 1999] Salon Technology: Who controls free software -- the second part of the paper is extremely well written

"If this were to happen," says Kuh, "and the company holds the copyright, they can license no future versions as free software. Of course, the community would have that last free version and that couldn't be taken away, but if the primary developers all work for the company, then the community is forced to fork a new version and begin anew -- and the expertise may very well be lost to the proprietary world."

Who would stand to lose most from such a development? One possibility is other distribution vendors, who, even as they take a publicly cautious attitude towards the merger, can't be overjoyed to see Red Hat solidify its position in the marketplace.

"I guess it comes down to the question of whether you can buy people or not," says Cliff Miller, CEO of TurboLinux. This is the open source world's first test of a time-honored, competitive strategy in the software marketplace -- buying talent to consolidate a company's grip.

PCWEEK Webcast presents COMDEX'99

Network Operating Systems Linux Targets The Data Center

Remote management is another example of Linux's flawed integration. Red Hat managed to release something along these lines in the new 6.1 version, but that's the first time we've seen any kind of support for this capability under the Linux OS.

Considering that companies like Cobalt, Compaq, HP, Rave and VA Research are offering these systems mostly in rack-oriented configurations, remote management should be paramount. But Cobalt offers only a proprietary Web-based utility to this effect, while Red Hat's is also proprietary, and neither can compare with what you can do under Solaris or using third-party utilities for Windows NT/2000.

While these results show that Linux still has a way to go as a seamless corporate-serving platform, that's not to say it hasn't made strides. All the distributions we saw here, save possibly the one from SuSE, took pains to make sure the software was easily able to connect to a network and interact with file systems based on Windows or NFS. We would like to see a bit more support for native NetWare in this regard, but overall we had very little trouble getting these distributions working in our test network.

Basic tasks like drag-and-drop file-sharing are now commonplace, and even print-sharing is easier. Graphical configuration tools and much-improved native documentation across the board have gone a long way toward taking the frustration factor out of Linux. Hooking Windows-based clients up to Linux servers has also become easier, with distributions like Red Hat and TurboLinux Server making this a completely invisible process on the client side and a simple one on the server side.

Is Linux ready for a higher profile on your network? For those looking down the barrel of a costly Windows 2000 migration, now is a perfect time to ask that question.

In general, our testing showed that Linux is certainly ready as a general-purpose network server for midsized businesses, as long as they are willing to embrace the platform wholeheartedly.

Linux Today: Reinventing the Wheel -- weak arguments about the problems of complete redesign in an OSS environment, but the topic is important enough to warrant inclusion.

Software design is a peculiar business. Companies working in the field of developing software show the strangest product evolution cycle a person can come up with: the product never fundamentally changes, it only expands. Very rarely do any of the basic assumptions of what a program should be doing and how it should be treating the data it processes change past its first release.

When the basic functionality of a program gets extended beyond the scope of the original design, the added functionality either becomes a system of its own, or a major amount of mind-bending and gluing is performed to make it work on top of the existing philosophy. Software evolution is a bottom-up process most of the time.

The basic problem stems from a misconception that evolved mostly with the rapid growth of the industry and consumer demands. It is the dogma preaching that reinventing the wheel is a sin and that reuse of existing code should be maximized. This idea seems to be on a par with the way knowledge and engineering evolved during our Age of Technology. But is it really?

If we take a look at the history of our understanding of the physical world, it becomes clear that science has always been about getting to the fundamentals of a reality that is already there. The growth of our understanding is fed from the bottom by shifts in the definitions of the elementary processes that make the world we register as reality theoretically possible.

The scientific evolution (a process kicked off by and sometimes intermingled with philosophy) is in contrast with the process of software development, where the requirements keep growing. By extending the requirements, we change the contextual reality of the program's "world", which can lead us to a point where the abstracted fundamental elements are no longer adequate to describe it. Often this leads to a point where developers should say: "Our basic assumption sucks, let's reimplement."

The fear of doing this leads to legacy bloat: a growth in the amount of work needed (both in programming and in executing) to implement the new "world" inside the existing philosophy.

If Microsoft, when computers commonly started getting more than 640K of memory, had taken a good look at the assumptions MS-DOS made about RAM and had rightfully concluded: "This sucks, let's redesign", the long-term usability of that OS would have grown tremendously. Instead they gave us EMM386. When they implemented the desktop paradigm, with icons and verbose descriptions of the programs or actions they represent, Microsoft should have taken a look at the limitations of DOS and the FAT filesystem and reimplemented them to accommodate the new requirements. Instead they turned icons into shortcuts pointing to 8.3 filenames.

Microsoft is not alone in being caught in this cycle. It's an Industry illness which even becomes apparent in hardware. (Or do you really think that ISA was such a good idea to begin with?) And in the field of software. (Do you think Netscape Communicator couldn't possibly do the things it does in less than a 13MB executable file?) Microsoft has been around for quite a while, though, and the assumptions their world started with were notably limited. The industry pushes towards commoditization, but lacks the patience to get the requirements straight before things are built and declared final.

Free software (in both the beer sense and the speech sense) has an advantage over commercial software in being better able to keep this effect from becoming perpetual. Developers always have the choice to set aside the ever-growing feature demands from the user base and to stand still for a moment to weigh the usability of the underlying philosophy against the direction in which the software is growing. Commercial software can only refer to the competition. If the competition adds features and disregards bugs and bloat, the commercial developer is economically forced to follow the trend, for fear of losing existing market share.

Taking Linux as an example, we can see a lot of distinctive kernel features that have been reimplemented over time. New ways to look at the desired system functionality have made it necessary to rethink the underlying kernel mechanisms. Linux also has the advantage of being built on top of the UNIX philosophy, which turns out to be more adaptable to modern thinking than DOS ever was (by having fewer hardcoded assumptions and by being more generic in the way it treats data).

If we step away from the documented merits of using an Open Source model for software development with regard to developer co-operation, we can see the much more fundamental advantage: it creates a buffer between the software designers and the end users. This implies a freedom not only for the user, but more importantly for the developer -- the freedom to make correct technical decisions at any time, thus prioritizing technical excellence over feature-demand; the freedom to reinvent the wheel because the old one wasn't round enough and to learn a whole lot about wheels in general while doing it.

The UNIX philosophy has brought us a long way, but we should never hang on to it so tightly that we end up repeating the same mistakes yet again. Just because an element of the functionality we are looking for already exists, a software developer should not automatically assume that the existing code is philosophically compatible with what's happening on the program's higher and lower levels. More to the point: the design of a program should not by definition be a function of the existing tools it could integrate.

CRN Despite the Linux hype, Windows 2000 and Solaris will be slugging it out in corporate America

Linux Today

"At least one other large Microsoft Certified Solution Provider is watching to see how Sun reacts and responds to the increased pressure.

"As for stepping up our Solaris business, it's still too soon to tell. We'll know better next summer," said Rand Morimoto, president of Inacom Information Systems Oakland in Oakland, Calif., which does 55 percent of its business today in NT/Windows, 20 percent in Novell NDS and 15 percent in Solaris.

"As for Solaris for Intel, we're not serious about that -- it's more Linux for Intel these days. The Solaris folks are still hard-core Sun hardware."

THE REGISTER Sun touts pre-launch Solaris 8 for $20

Less than the cost of the official RH 6.1 release, which is actually also close to a beta :-). Solaris is free for non-profits.

Sun Microsystems is offering early-access releases of both the Sparc and Intel versions of Solaris 8, now due in February, for the knockdown price of $20 (to cover materials and documentation).

It is wooing Linux users with the inclusion of Perl and GNU development tools in Solaris 8.

With the new release, the Solaris kernel is "hardened". So if a system runs out of resources when it gets a request for memory, it rejects the request as opposed to shutting down.
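
The practical difference shows up on the application side: a rejected memory request is an error the program can handle, rather than a system-wide failure. The Python snippet below is only an illustrative caller-side sketch of that idea, not Solaris-specific code.

    # Illustrative only: when an allocation request is rejected rather than the
    # system going down, the application can catch the error and degrade gracefully.
    def load_items(count):
        try:
            return [0] * count             # a large allocation that may be refused
        except MemoryError:
            print("allocation refused; using a smaller working set")
            return [0] * (count // 10)     # degrade instead of crashing

    if __name__ == "__main__":
        items = load_items(10_000)         # modest size so the sketch runs anywhere
        print(f"loaded {len(items)} items")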

According to Sun, all Solaris applications are forward compatible with Solaris 8. This means there's no need for applications upgrades.

Solaris 8 also supports Java 2SE, the Java Media Framework for streaming media, PDA synchronisation and other goodies. ®

Richard Stallman on freedom and the GNU GPL

In Stig's view, to be idealistic is to be ineffective. The Free Software movement is idealistic, but very effective: free operating systems exist because of our idealism. If you are using Linux, Linus Torvalds' kernel, you are most likely using it in conjunction with the GNU system. This combination, the GNU/Linux operating system, the subject of LinuxWorld magazine, exists because of the FSF's idealism. The system is the idealism of the GNU project made real.

I've done business in the world of free software for 14 years now, ever since I began selling tapes of GNU Emacs in 1985, and I agree with Jamie Zawinski (as quoted in Stig's article) that free software and greed are not incompatible -- at least, most of the time they can coexist. But greed alone will not protect our freedom. There are occasions where defending freedom requires a special effort, an effort that requires a motivation beyond material gain.

History shows that people who don't value freedom enough to defend it will tend to lose it. Fortunately, Eric Raymond's doctrinaire, almost Marxian vision of economic determinism is not realistic: history also shows that people who do care can defend their freedom, if they make an effort. To keep free software alive, we need many people to make small efforts, and a few people to make great efforts.

Please join the Free Software movement, and help these efforts.

The Mozilla story: Milestone 10

The bottom line (for me) is that, for the first time, Mozilla is really usable. Of course, that usability is primarily as a tool to explore what's coming in the future, rather than as my full-time browser.

It's worth mentioning that even if Mozilla continues on schedule through a successful beta and to its ultimate production release, its window of opportunity may already have passed. There are new choices now appearing on the browser horizon.

KDE will have one -- possibly two -- browsers embedded in version 2.0. Konqueror is KDE's next-generation file manager/browser, and it promises to have all the power of Internet Explorer or Netscape Navigator. That's in addition to the minibrowser that comes up now when you click on a link in KMail.

Then there's Opera, a popular, lightweight, counter-bloatware kind of browser that's being ported to Linux, among other platforms. Its philosophy of being small, fast, and agile rather than fat, slow, and all-encompassing is certain to appeal to many Linux users, even if it is a commercial offering.

Mnemonic is another project in progress which promises to deliver what Mozilla should have been, but can't be now: a truly open source (it's GPLd) and full-featured browser. That alone could make it the winner in the Linux browser sweepstakes.

So once again, I happily find myself in the position of bringing you good news followed by more good news. The first is that Mozilla is maturing nicely and looks like it will be a dandy when it's finally here. The second is that there is growing competition to be your Linux browser. Ain't it grand to be a Linux user these days?

A brighter future: Mozilla and open sourcing redux

...But in retrospect, it is unfair simply to blame Mozilla and the open-source process for the delays and for making schedules more unpredictable. The rewrite was unavoidable. Netscape deserves the blame for not rearchitecting the browser code earlier.

Then there is the licensing. I have now heard from many non-Netscape developers who insist that Netscape's Mozilla license doesn't discourage people who believe in open source from contributing code. Most (but not all) comments also indicate that Netscape doesn't exert any undue control over the project, though it retains some special licensing rights.

I learned about other real positives in the Mozilla project: open source has inspired major improvements in Netscape's development methods, with much more emphasis on bug analysis, newsgroups and other feedback loops, code reviews, and documentation, in addition to more modular code. Mozilla.org was slow to provide a road map of where it was heading, but it now seems to be gaining momentum.

Ten or so major corporations, including Intel, appear to be committing resources to the project. Perhaps most promising is Mozilla's potential as a cross-platform technology base that could greatly simplify Web applications development. There is a renewed vision and "can-do" attitude permeating the Mozilla team.

I ended my last column criticizing open-source development as a "free lunch." I realize now that if this process is to operate effectively, it requires enormous effort from companies such as Netscape (and IBM, which has done well with the Apache Web server) and participating outside developers. It doesn't look so free to me anymore.

The jury is still out, but the future looks brighter for Mozilla and open source than I had thought, although I remain skeptical regarding how much and when Netscape Navigator will benefit.

The Mozilla team still has something to prove. It needs to finish and deliver a product.

October 1999

XML.com - Where the Web Leads Us by Tim O'Reilly

" The Linux community is far too focused on the battle with Microsoft's current operating system."

Now, that's the classic definition of a "killer application": one that makes someone go out to buy a computer. What's interesting is that the killer application is no longer a desktop productivity application or even the web as a whole, but an individual web site. And once you start thinking of web sites as applications, you soon come to realize that they represent an entirely new breed, something you might call an "information application," or perhaps even "infoware."

So what does all this have to do with Open Source software? There's one obvious answer: most of the technologies that make the Web possible are Open Source. Features of the Internet itself, like the TCP/IP network protocol and key infrastructure elements such as the Domain Name System (DNS), were developed through the open-source process. It's easy to argue that the open-source BIND (Berkeley Internet Name Daemon) program that runs the DNS is the single most mission-critical Internet application. Even though most web browsing is done with proprietary products (Netscape's Navigator and Microsoft's Internet Explorer), both are outgrowths of Tim Berners-Lee's original open-source web implementation and open protocol specification. According to the automated Netcraft web server survey (http://www.netcraft.co.uk/survey), more than 60% of all visible web sites are served by the open-source Apache web server. The majority of web-based dynamic content is generated by open-source scripting languages such as Perl, Python, and Tcl.

Information applications are used to computerize tasks that just couldn't be handled in the old computing model. A few years ago, if you wanted to search a database of a million books, you talked to a librarian who knew the arcane search syntax. If you wanted to buy a book, you went to a bookstore and looked through its relatively small selection. Now, tens of thousands of people with no specialized training find and buy books online from that million-record database every day. As a result of information applications, computers have come one step closer to the way that people communicate with each other. Web-based applications use plain English to build their interface--words and pictures, not specialized little controls that acquire meaning only as you learn the software.

I also have a message to entrepreneurs who are trying to come up with new open source businesses. With commodity software, the rules are different. We need new business models. And those models are not always what you might expect. Let me illustrate once again with a story.

One person who's made buckets of money from open source software is Rick Adams, the founder of UUNet. How many of you remember when Rick was the hostmaster at a site called seismo, the world's largest Usenet hub? He was also the author of B News, which was the most widely used Usenet news software. Rick didn't say, "Oh, I'm going to put this software in a box and sell it." When Rick saw that his bosses at the U.S. Geological Survey, or wherever it was that seismo was housed, were starting to ask, "Why are our phone bills several hundred thousand dollars a month for passing Usenet feeds to anyone who asks?", he realized that we needed some way to have Usenet pay for itself. And he really invented what we now take for granted, the commercial Internet service provider business. But when people think about free software and money, they very often play right into Bill Gates's hands, because they think about the paradigm that he has perfected so well, which is "put software in a box, ship it, get a locked-in customer base, then upgrade 'em." Rick went sideways from that. He was the first person to say, "I'm going to build a serious business that is based on free software," and the business was in providing a service that was needed by the people who use that software, who talk to each other, who distribute it, who work with each other online.

Profiling Linux Developers

Open source software, or free software, has generated much interest and debate in the wake of a number of high-impact applications and systems produced under open source models for development and distribution. Despite the high degree of interest, little hard data exists to-date on the membership of collaborative open source communities and the evolutionary process of their repositories. This paper contributes a baseline quantitative study of one of the oldest continuous repositories for the Linux open source project (the UNC MetaLab Linux Archives), including demographic information on its broad community of developers. Our methodology is a close examination of collection statistics, including custom monitoring scripts on the server, as well as an analysis of the contents of user-generated metadata embedded within the Archives. User-generated metadata files in a format known as the Linux Software Map (LSM) are required when submitting open source software for inclusion in non-mirrored portions of the MetaLab Linux Archives. The over 4500 LSMs in the Archives then provide a demographic profile of contributors of LSM-accompanied software as well as other information on this broad subset of the Linux community. To explore repository evolution directly, an instrumented Linux Archives mirror was developed, and aggregate statistics on content changes seen over a month-long period are reported. In sum, our results quantify aspects of the global Linux development effort in dimensions that have not been documented before now, as well as providing a guide for more detailed future studies.
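
For readers unfamiliar with the format, here is a short Python sketch that parses a record laid out in the common LSM style (Begin/End markers around "Field: value" lines); the sample entry and the field set are illustrative assumptions, not an actual record from the MetaLab Archives.

    # Parse a Linux Software Map (LSM) style record into a dict.
    # The sample entry is hypothetical; real LSM files can also carry multi-line
    # field values, which this minimal sketch does not handle.
    SAMPLE_LSM = """\
    Begin3
    Title:          example-tool
    Version:        1.0
    Entered-date:   1999-10-01
    Description:    A made-up utility used only to illustrate the LSM layout.
    Author:         jane@example.org (Jane Doe)
    Primary-site:   metalab.unc.edu /pub/Linux/utils
    Copying-policy: GPL
    End
    """

    def parse_lsm(text):
        """Return a dict of LSM fields, skipping the Begin/End markers."""
        fields = {}
        for line in text.splitlines():
            stripped = line.strip()
            if stripped.startswith(("Begin", "End")) or ":" not in stripped:
                continue
            key, _, value = stripped.partition(":")
            fields[key.strip()] = value.strip()
        return fields

    if __name__ == "__main__":
        record = parse_lsm(SAMPLE_LSM)
        print(record["Title"], "-", record["Author"])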

THE REGISTER French senators propose making open source compulsory

French senators Pierre Laffitte and René Trégouët are proposing that national and local government and administrative systems should only use open source software. Arguing in favour of their proposed law number 495, they say ease of communication and free access by citizens to information can only be achieved if the administration is not dependent on the goodwill of the publishers of the software.

"Open systems whose evolution can be guaranteed via the free availability of source code are needed," they say. The two senators have set up a discussion forum for the proposed law at the French Senate Web site, and put forward the text, and their own explanation of why the move is needed.

They see the Internet as becoming the primary way for government and citizens to communicate, and propose a period of transition prior to a switchover to wholly electronic communications. According to Article 3 of law 495, "State administration, local government and administrative services... can only use software free of [IP] rights and whose source code is available. A decree will fix the terms of transition from the current situation."

In addition, the senators see the switch to open source by the state as providing the engine to drive a far broader movement. Private companies dealing with the state, in bidding for contracts, for example, will tend to switch to open source to make it easier to do so electronically, while those who supply the state with computer systems will have to redouble their open source efforts.

Richard Brandt Sun's War of Trenches

To: Richard Brandt
From: Bill Joy
Subject: sorry, you got it backwards

scsl is LESS restrictive than gpl ala linux. it allows you to innovate and profit from your innovation in the normal ways. gpl does not.

in hopes of helping straighten this out, here's some material. sorry if we confused you ...

if you do something to improve linux, you have to give it back to everyone because of gpl. therefore, you don't own your own innovations and the reward for these (other than fame) goes to ???.

with java, jini and scsl you can make proprietary enhancements to the technology and resell the enhancements and NOT give it back to the community. This is subject to a compatibility constraint but doesn't constrain innovation. you can add additional packages and are NOT restricted to running them through JCP, for example. you are allowed to do vertical APIs. this does NOT restrict your ability to add value. so when you say "you can't make significant changes" you have it exactly backwards: you can make significant changes, and you own them. you can either make them freely available if you want, or sell them or whatever.

scsl is very simple: 1. you join the community, getting access to the intellectual property; 2. you agree to give back any bug fixes you make, as part of community responsibility; 3. if you create new value, improving an implementation or adding new APIs, you can keep it and sell it, give it away or whatever; 4. in any case, you have to be compatible as a responsibility to the community; and 5. you aren't allowed to try to do "embrace and extinguish," changing APIs and trying to hijack the stuff away from the community. this requires you to keep new "platform" APIs open to the community. Note that linux, etc. don't have this provision, so are more "hijackable" by you-know-who extending stuff to be proprietary.

so with scsl, you have an extra right vs. gpl, etc.: you own your innovation and can profit from it; and an extra responsibility: you must be compatible and give back bug fixes.

the goal of scsl is NOT simply to engage the hacker community in doing free software, the goal is rather to engage creative energy, including energy directed with profit, investment and the market in mind. to build on the java/jini base and receiving rewards from the normal market mechanisms is fine. this DOES NOT prevent people from doing stuff and contributing it to the java/jini community for free or under gpl or whatever license they choose.

(the reason why so many of the networking startups, for example, are using BSD rather than LINUX is that the BSD license (which i originated about 20 years ago) allows them to innovate and keep the rewards of their innovation. if they innovated on Linux, they would have to give the stuff back. so i believe, for example, that Juniper and others use BSD for that reason.)

Bill Joy: that rewarding innovation is a good thing comes from our constitution, which says that "Congress shall have the power ... to promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries." that's what intellectual property is all about. that's why if you have a great idea you can get java and/or jini and other code and improve it to make money or give it away to try to get just fame. SCSL supports both. SCSL builds on the constitutional idea of rewarding innovation through the normal mechanisms.

in that way it differs from GPL and open source, which separate the creation of value from its commercialization. hence remarks like [Richard] stallman's "programmers should work as waiters so they can give their code away," and the other open source ideas rooted in the notion that source code should not be owned. i think the idea of ownership is most closely linked to the notion of stewardship. things that aren't owned often suffer from the "tragedy of the commons." it's important that things have stewardship through their owners.

Bill Joy: sun is also committed to the idea that APIs should be open, while implementations can be proprietary. thus you can clone the java APIs, provided you pass compatibility, i.e., you clone them all and you respect patents and copyrights. this has always been our position. (in practice for most anyone creating commercial value, i believe java and jini and other licensing fees are modest enough that the most sensible thing to do is to be a java licensee, but if you want to be a cloner, and some people do, you have that option, PROVIDED YOU ARE COMPATIBLE.)

there are elements in the open source community that would rather that scsl wasn't widely understood. for example, eric raymond has incorrectly described scsl in at least one document that he has written and has been unwilling to correct what i view as the factual errors in his account.

sendmail.net An Interview with Kirk McKusick [Part One]

osOpinion Tech Opinion commentary for the people by the people.

...in August 1999, the Gartner Group released a study claiming that the share of Windows-first developers is in the middle of a sharp decline, from 65% in 1998 to a projected 40% in 2000.

Supporters of Linux Worry That Commercialization Could Bring Chaos

Although Linux and the GNU components can be downloaded free, more than 22 companies today sell GNU/Linux software. The best known are Red Hat, Suse, Turbo Linux, Caldera Systems and Mandrake Soft. Last week, Silicon Graphics, VA Linux Systems and O'Reilly & Co., a publisher of computer books, announced a plan to sell Debian GNU/Linux, considered by some to be the purest version.

Originally, these companies were founded largely on service models, giving information technology managers someone to call -- or blame -- when things went wrong. But each also packages Linux differently, adding its own set of software tools for tasks like diagnosing problems and maintaining networks. With so many companies struggling to differentiate themselves, some experts say incompatibilities are inevitable.

"They say that Balkanization is not as possible with Linux as it was with Unix because Linus and the guys are controlling it," said Maureen O'Gara, publisher of Client Server News, a trade publication. But some vendors, she said, "are more inclined to chase money and less inclined to share all their toys with their friends."

"They have some proprietary stuff they've developed and are reluctant to put it in the public domain," she said.

Others disagree. "Only the trade press is really squeaking about this," said Eric S. Raymond, president of the Open Source Initiative, a programmer group, because it makes a good story hook when you have nothing real to write about and the advertising department is pressuring you to make closed-software vendors look good.

Raymond added: "The actual Linux developers know better. Fragmentation isn't going to happen, because developers outside of the Linux distributors effectively control all the key pieces." That is because Torvalds and a small group of his colleagues control the Linux standard and subject all modifications to peer review.

But Linux vendors are already pointing fingers.

"One where you might see a problem is Caldera, because they see part of their value added in proprietary tools they have licensed from third parties," said Bob Young, Red Hat's chief executive.

Benoy Tamang, Caldera's vice president for marketing, counters: "We have produced a product that combines the best of open-source and commercial packages; we are doing Linux for business. We do add to it commercial packages that allow business users to easily integrate it."

Slashdot Articles NY Times on the Fragmentation of Linux

Recreating the history... the right way (Score:2, Insightful)
by rkt ([email protected]) on Monday October 18, @08:59AM EDT (#6)
http://www.pobox.com/~rkt/

While I still personally believe that Linux is breaking up, I can also see unification on the horizon. The way Linux is maintained and the way UNIX was maintained are different in all possible ways.
Linux is backed by a process which is more democratic, unlike the older UNIXes, which were essentially maintained by companies for economic reasons, in what I would call the truly capitalist way of management. The FSF, Linus and the rest of the gang around the world play a very vital role in regulating code, a role that was absent in previous UNIXes. However, that's just my feeling... others might have a different view of it.

Some Fragmentation Illusory, Some Good (Score:3, Insightful)
by Christopher B. Brown ([email protected]) on Monday October 18, @09:03AM EDT (#9)
http://www.hex.net/~cbbrowne/linux.html

Consider that most of the critical pieces of software (things like GCC, glibc, Perl, Samba, and the Linux kernel come to mind) involve Dipping Into The Same Source Code Stream. Thus, while distributions may pick different versions of these components, the differences are not persistent, since the next releases will pick a later version from the same stream of development.

The main place where differences between Linux distributions are persistent are with regard to two things:

  1. Installation tools

    ... In the case of "initialization" stuff, the custom tools built by Caldera versus RHAT versus SuSE versus ... may be permanently different, but this is relatively uninteresting since you only run this stuff once.

  2. System Management tools

    This is arguably a matter for more concern.

    Tools include rpm/dpkg, and the recent proliferation of distributions based on Debian has resulted in RPM no longer being quite as "worshipped" as it used to be.

    I regard the increase in interest in Debian-based distributions as a good thing, since Debian has more automated tools for managing packages and validating their integrity, which is an area where RPM had "gotten pretty stuck" for a long time.

    Aside from package management, there is then "system management," with tools like COAS and Linuxconf, where different distributions are promoting different tools. (And I'd put in a plug for the OS-independent tool cfengine that's good for lots of purposes...)

There's some fragmentation, but my old essay Linux and Decentralized Development has the thesis that the net results are positive. I haven't seen compelling evidence to the contrary yet.
Those who do not understand Unix are condemned to reinvent it, poorly. -- Henry Spencer

by LHOOQtius_ov_Borg (LHOOQtius ov Borg ([email protected])) on Monday October 18, @12:21PM EDT (#35)

What really will balkanize Linux is software that is made binary-incompatible among Linux systems. In the BSD and SysV worlds, as with Linux, there is plenty of software that can be recompiled cross-platform, but it was software locked into release only in proprietary binary formats that fueled the competition between systems like IRIX, Solaris, HP-UX, SCO, and OSF.

People use computers to perform various tasks other than running an OS. If software is not available for an OS, no matter how good the OS is, it wanes in popularity and possibly dies. If companies only support RedHat with software, then no matter how good Linuxen like SUSE and Debian may be, they're going to eventually decline in favor of RedHat, because people need software to do work.

Also, many people can't handle recompiling software, so if they've got a Linux variant with a nice installer, and they can get commercial software in proprietary binary formats that have nice installers, then they are using a computing paradigm that is familiar to them...

.... ...

Non-Story (Score:1)
by Morchella ([email protected]) on Monday October 18, @10:14PM EDT (#65)
http://www.mycoinfo.com/

The following quote from the article said it all:


Others disagree. "Only the trade press is really squeaking about this," said Eric S. Raymond, president of the Open Source Initiative, a programmer group, because it makes a good story hook when you have nothing real to write about and the advertising department is pressuring you to make closed-software vendors look good.

Raymond added: "The actual Linux developers know better. Fragmentation isn't going to happen, because developers outside of the Linux distributors effectively control all the key pieces." That is because Torvalds and a small group of his colleagues control the Linux standard and subject all modifications to peer review.

C'mon folks! Here's ESR telling reporters from the Times that this is a NON-STORY, and that they should better examine their motives for writing it! Kudos to Raymond for being so politic about it that the writer didn't catch it. The single quote above is the only thing in the article that actually made a modicum of sense.

--B

Still have a few years (Score:2, Interesting)
by Anonymous Coward on Monday October 18, @09:28AM EDT (#14)

Linux as a server OS will be OK; with pressure placed on commercial distributors to adopt the LSB, many of these problems will be minor. Even Caldera's integration of proprietary tools will be moot as some of the more interesting protocols mature (e.g., LDAP, XML-RPC, etc.).

We will, however, see a flurry of activity on the desktop side. There are a ton of people who do not need to run a server, but instead want a fast, stable, and cheap platform to surf the web, play games, and write letters and resumes. These people are willing to pay US$60 a pop for this (or part of it) and have paid US$400 for just the software to gain this functionality. Aside from installation, there isn't much support that is required and when it is, the established companies are already charging per incident.

Unfortunately, this Linux desktop will probably not come from today's larger distributors. It will come from a company that adopts the Linux kernel and extends it with its own proprietary GUI. It won't be X-compliant, or even have X available. The winner will eventually get X support through a company like Hummingbird. Early entrants will make developers pay a couple of thousand dollars for the privilege of developing for "their" platform. That will eventually stop as competing desktops vie for developers. Free tools and "open" APIs will finally arrive.

The Linux you know and love will still be strong, serving enterprises and power users' home desktops, but your mom will be running Linux without even knowing it. From a casual inspection, you might not know it either.

Fragmentation happening rapidly! (Score:1)
by eyepeepackets ([email protected]) on Tuesday October 19, @11:01AM EDT (#70)

As a person who tests bleeding-edge hardware against the four major commercial Linux distros daily (and others as well, but mainly the big four -- Caldera, SuSE, RedHat, and TurboLinux), I strongly agree with the NYT article: Linux is fragging to the point of looking like a massive gibbing in a Quake fest.

No longer can the user be sure that any generic code will work on any one distribution. No longer can the user even be sure the basic functionality of the kernel will work consistently from one distribution to another.

The source of all this incompatibility? How do I loathe these commercial distros? Let me count the ways!

  1. Lack of strong, pro-active support for the LSB by the commercial distros: lip service spewed simply to avoid getting flamed doesn't quite serve the purpose of getting a solid LSB.

  2. The commercial vendors really don't want an LSB, at least their marketing folks don't: one very strong concept in marketing is DIFFERENTIATION! You need to make your product different enough, and drone on about the "superior" aspects of your variety, to get the consumer to buy the product.

  3. Money counts more than quality. The commercial vendors have to be concerned with money first, and their products show it. RedHat is buggy crap when running X; Caldera's install won't even let you make a boot floppy during install (hey, you know those newbies just gotta love that); SuSE has so much proprietary patching done to their kernels that I often can't get common drivers to work; and the list goes on and on and on....

  4. The frickin' long-running libc5 vs. glibc (libc6) mess. This has opened the door to all sorts of opportunities for the differentiators to make trouble. Any LSB should deal with this ASAP! Perhaps dual-library cross-compilers as a standard feature? Make the effort to ensure glibc is fully backward compatible with libc5? (A quick way to check which library a binary expects is sketched below.)
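
For reference, the quickest way to see which C library an existing binary was linked against is to ask the dynamic linker with ldd. A minimal sketch; the binary path and load addresses are only illustrative:

    # ldd lists the shared libraries a dynamically linked executable needs.
    ldd /usr/local/bin/someapp
    #     libc.so.6 => /lib/libc.so.6 (0x40030000)     <- built against glibc (libc6)
    # An old libc5 binary would instead show a line like:
    #     libc.so.5 => /lib/libc.so.5 (0x40020000)

If the library a binary asks for is not installed, or has been patched incompatibly, the binary simply will not run; that is the practical face of the "mess" described above.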

To sum it up: the commercial distros are desktop-manager happy and want the entire Linux world to look and act like a Microsoft product, apparently to the point of being sloppy, unreliable crap just like their favored model. The commercial distros care far more about making money than about providing a quality product. One commercial variety of Linux will not be consistent with another, either in how it works or in which programs the user can run on it.

What do I use? My control testing box is Slackware-based, and I don't use either of the slow, unreliable desktop managers (both Gnome and KDE sucketh in a big, bad, buggy kinda way) except when I'm testing X/video-related stuff. I've tried both Gnome and KDE; they are buggy and unreliable, and they offer very little functionality in exchange for the loss of speed and the increase in instability they bring. IMHO, both are still beta-stage code.

A prediction? If there isn't a strong LSB in place soon, Microsoft will continue to dominate the desktop, will make a turnaround in server space, and Linux will have been a flash in the pan. Why? Because users won't abandon one buggy, unreliable mess for another: better a known evil than an unknown evil, to paraphrase an old saying.

Be prepared, Linux folk: the commercial vendors will try every way possible to either sink the LSB or render it a toothless (i.e., worthless) tiger, because it is not in their best interest, which is making money.

Ya, ya, I know, everything in the universe sucks: It's the law.

Linux should be set free (Humor)
by Anonymous Coward on Monday October 18, @09:40AM EDT (#18)

We must let Linux mutate by itself without anybody in control of it. There should be no Monster Dictator on top who dictates what should go in and what should not. We do not want another Stalin amongst our comrades. We peasants should take control, not let any power-hungry man on top tell us how kernel lock synchronization should be implemented.

This way, all variations would grow, with the lackluster ones dying off. There is no other way to evolve in our commune. We have to follow the example of the human species, which came to dominate the mammal world not because of brute strength but because of superior intelligence. If we blow each other up, that is the way it is, since the survivors would go on to build a better race (whatever that would be).

So, all comrades, rise to arms and pick up your mouse. We must topple anyone with power within our group, as they are the impediments to our progress toward Utopia. Suppression of our freedom to do it any way we want shall not be a rule of the game anymore.

! Central control is the true way of evolving.
ZZZ

[Oct 19, 1999] ZDNN Joy details pros, cons of open source

Joy said strengths of open-source software include that it's open, that more developers can work on it, that companies don't rely on a single provider for fixes, and that the boundaries are flexible.

But he said some of the drawbacks of open source include: no guarantee of quality from a single source; limits to financial gains; and something he called a "reintegration bottleneck," which could fragment the product. He said the bottleneck happens when software becomes so successful, drawing so many contributions, that it becomes daunting -- and even impossible -- to wade through all of them and decide which changes make it into the product.

Furthermore, he said, many companies are concerned about the compatibility of open-source software. "They actually want someone they can yell at" when things aren't compatible, Joy said. "They really want someone to finger and harass."

Joy also ran through some of the aspects that make proprietary software licensing appealing to Sun, including that it protects intellectual property, and it provides for one owner and brand control.

[Oct 17, 1999] Linux Today: Joe Pranevich -- Wonderful World of Linux 2.4 (100299 Final Edition) -- among other things, outlines the problems with the 2.2 kernel.

[Oct 17, 1999] Will Linux Be Viable Competition for Windows Desktops -- Microsoft report; see critical comments here

1. Linux will become a major success and will succeed in supplanting 32-bit Windows as the dominant desktop operating system (0.1 probability). For this scenario to be feasible, several things in the marketplace would need to occur. If we assume that Linux will gain measurable market share among desktop users, then the door will open for additional momentum. If this were to happen, it would be possible for the Linux distributors to leverage this into increased support from third-party hardware vendors to bundle Linux instead of, or in addition to, 32-bit Windows. As a result, third-party ISVs could view the Linux marketplace as profitable and support the platform with native applications. However, we think it is late in the game for this scenario to be feasible. Some major force would be required to break the Windows inertia, and Linux does not offer a compelling force against 32-bit Windows. It remains a complex Unix variant that offers little advantage to mainstream users.

2. Linux will fail to supplant 32-bit Windows as the dominant desktop operating system and will fade from the market by 2004 (0.2 probability). It is very unlikely that Linux will vanish from the market during the next five years. Linux will continue to be supported and maintained by a core community that has embraced the OS with near religious fervor. We expect that future releases will focus on polishing the product but will not offer a major feature or enhancement that will be enough of an incentive to drive the installed Windows base to Linux.

3. Linux will fail to supplant 32-bit Windows as the dominant desktop operating system but will remain an alternative operating system, gaining no more than 5 percent of the installed desktop space by 2004 (0.7 probability). Despite the lack of appeal that Linux will have for mainstream users, other users will still flock to Linux as an alternative to 32-bit Windows. The Linux community has demonstrated long-term loyalty toward Linux that will continue to grow. While standards for such things as the user interface will remain poorly defined, homegrown applications at no cost, or almost no cost, will provide the minimum level of functionality to keep the OS alive. In addition, as PC vendors continue to look for ways to lower the costs of their systems, we expect that many "white box" systems will offer Linux as a lower-cost alternative to Windows and broaden acceptance in that segment of the market.

[Oct 14, 1999] Debian Linux Backers To Introduce Retail OS

Debian, a non-commercial Linux distribution since the early 1990s, will be packaged for the international operating systems market. The companies said all profits will be donated to Software in the Public Interest, which includes the Debian Project, a non-profit organization of about 500 Linux developers.

Debian has at least one unique technical feature, called "apt-get," which can automatically update the system every time users are online, "conceivably eliminating future OS installations," according to Debian backers (typical usage is sketched below).
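
For the curious, typical apt-get usage looks roughly like this (the package name is hypothetical):

    apt-get update              # refresh the list of packages available from the configured sources
    apt-get install someapp     # install a package plus whatever it depends on
    apt-get upgrade             # bring every installed package up to its newest available version
    apt-get dist-upgrade        # move to a new distribution release, adding or removing packages as needed

Because the same mechanism that installs a single package can also upgrade the whole system in place, a Debian machine can in principle track new releases without ever being reinstalled -- which is the claim being made above.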

[Oct 13, 1999] CNET.com - News - Enterprise Computing - Traditional firms showing Linux the money

Santa Cruz Operation, which has been selling the Unix operating system for years, has made its most serious Linux move to date with an investment in Linux software site LinuxMall, which will be announced tomorrow.

And Motorola soon will announce an investment in Lineo, a company making a version of Linux for TV set-top boxes, medical imaging equipment, and other non-PC devices.

The sizes of the investments weren't disclosed. Motorola didn't comment on the investment by deadline.

[Oct 12, 1999] Extremely interesting discussion of Microsoft strategy in Korea in the first part of the paper

A most promising development for the free software movement in Korea is the government's Ministry of Information and Communication announcement in late July that it will "provide government support for the development and proliferation of Linux." The Korea Herald, among others, reported that the ministry "will establish a Linux consultative body composed of software experts from the government, academic and industry sectors to standardize Korean versions of Linux and develop a variety of programs based on the operating system."

At the forefront of the Korean government's support for Linux is the Electronics and Telecommunications Research Institute (ETRI). According to Kim Hae-jin (family names are first in Korean), who is heading the ETRI Linux project, ETRI's plan is to "provide a highly scalable, highly available, single system server image cluster [technology]... adaptable from Internet [servers] to [the] mission critical enterprise."

A non-profit organization called the "Linux Council" has been established. Four committees within the Council have been designated:

  1. Standardization -- standardize Linux's Hangul terminology and documentation
  2. R & D -- promote research in and development of Linux software
  3. Supply and Support -- support Linux in end-user markets, schools and government offices
  4. Education and Training -- promote Linux education and training

[Oct. 1, 1999] ZDNN: Sun to make Solaris code available

From one point of view, it looks like Solaris can serve as a new roadmap for Linux -- especially in the SMP and failover areas. From another, it could lessen the appeal of Red Hat and dampen the company's stock value -- why bother with Red Hat if you can get the Solaris source code and enjoy the same tech-support advantages that Linux offers, without many of Linux's disadvantages? At this point RH stock is around $88, down from a maximum of $135 but well up from its $14 IPO price. In any case, it can probably be considered a blow to Microsoft NT.

...Under Sun's community-source license, programmers around the world will be free to download the Solaris source code and to make any changes they desire, so long as they provide open "interfaces" to the software and report bugs back to Sun and other programmers. Developers will also be allowed to use Solaris for free in noncommercial applications, but will owe Sun licensing fees if they incorporate it into commercial products.

By contrast, Linux and similar "open source" software is available free of charge to anyone who wants it -- including for commercial use, although users are required to make public any changes to the source code.

Sun is hoping that by simply having the source code available, programmers will trust the software more since they will be able to see its inner workings. Sun also expects that other software programmers outside the company will come up with ways to perhaps even improve Solaris.

The move to open up Solaris amounts to an attempt to ensconce Sun's operating system, which it has long described as one of its crown jewels, as the Unix operating system of choice for Internet sites and corporate data centers.

Sun has been a vigorous proponent of its community-source licenses, which it has used to promote its Java and Jini software technologies, as well as its major microprocessor designs.

Avoid free-for-alls
Sun hopes community-source licensing can help it avoid the free-for-all spirit that characterizes -- some would say, energizes -- the Linux programming community. In particular, Sun officials remain concerned that Linux-style open-source agreements could fragment Solaris into a host of incompatible operating systems.


