|(slightly skeptical) Educational society promoting "Back to basics" movement against IT overcomplexity and bastardization of classic Unix|
Rui Miguel Seabra wrote:
> A copyright law violation is as severely pursued be it Free Software as proprietary software
"The church of GNU denounce copyright as an evil plot... unless this is connected with the direct attacks of the legality of GPL. ;-)"
Letter by Alexander Telehov
This page is a collection of my research notes for the eBook Labyrinth of Software Freedom. Like any research collection, it is mainly for internal consumption and as such is pretty raw. Still, I think that software developers should be aware of the misrepresentation of copyright in the GPL, and by RMS (aka the FSF ;-) in general, and here this page might be somewhat useful.
I actually like the "you get something for free, you have to give something back" idea; however, I like it from the point of view of academic ethics. I don't like the GNU manifesto, I don't like Stallman's ideas about commercial software (and yes, that includes Microsoft as the dominant PC commercial software vendor ;-), and I really don't like people who blindly follow software anarchists like Stallman.
Again, this is a "slightly skeptical" page. If someone decides to license software under the GPL, that's fine, but that decision should be made by looking at the license, reading the relevant discussions, papers, and historic cases, and seeing what it's all about (and what the consequences are). Most Linux programmers, however, blindly stamp the GPL on their programs without making any effort to understand the consequences and real benefits of such a decision. Sometimes they naively think that they are preventing companies from profiting off their work, forgetting that companies like IBM, Red Hat, and SUSE (now Novell) are profiting from them already.
Copyright concepts as well as the length of copyright evolved with time. The authors were originally given copyright for a relatively short period — in the U.S., it was initially only 14 years from the first publication of the work. For most authors, that would be enough time to earn the bulk of the income that they would ever receive from their writings; after that, the works would be in the public domain.
But corporations built fortunes on copyright and repeatedly pushed Congress to extend it, to the point that in the U.S. it now lasts for 70 years after the creator's death, and for works of corporate authorship for 120 years after creation or 95 years after publication, whichever endpoint is earlier. The 1998 legislation responsible for the last extension was nicknamed the "Mickey Mouse Protection Act" because it allowed the Walt Disney Company to retain the copyright on its famous cartoon character. See the Copyright Term Extension Act.
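The corporate-authorship rule above (the earlier of creation + 120 years or publication + 95 years) is easy to get backwards, so here is a minimal sketch of the arithmetic in Python. The function name and its simplifications are mine, not part of the statute: it covers only the post-1998 rules and ignores the many special cases (works published before 1978, renewal terms, unpublished works, and so on).

```python
def us_term_end(year_of_death=None, year_created=None, year_published=None):
    """Rough year a U.S. copyright expires under the post-1998 rules.

    Individual authors: life of the author + 70 years.
    Corporate authorship: the earlier of creation + 120 years
    or publication + 95 years.
    """
    if year_of_death is not None:
        # Work of individual authorship: life + 70.
        return year_of_death + 70
    # Work of corporate authorship: take whichever endpoint comes first.
    candidates = []
    if year_created is not None:
        candidates.append(year_created + 120)
    if year_published is not None:
        candidates.append(year_published + 95)
    return min(candidates)

# A corporate work created and published in 1928: publication + 95 = 2023
# is earlier than creation + 120 = 2048, so 2023 wins.
print(us_term_end(year_created=1928, year_published=1928))  # 2023
```

Under the original 14-year term the same 1928 work would have entered the public domain in 1942, which illustrates the scale of the extensions the paragraph above describes.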
The arrival of open source licenses such as the GPL was another development that changed the landscape of copyright protection. While copyright protection was initially designed explicitly to allow the author to earn revenue from his or her creation for some period of time, here the key element is different. Open source licenses are mainly concerned with preventing the hijacking of an open source product by commercial companies that resell the results for profit.
The arrival of cloud services changes the landscape once again, especially the notion of distribution.
The GPL has problems typical of any utopia. It has some systemic shortcomings: it is too complex (which means that its interpretation in courts requires the help of highly paid lawyers), too vague, contains too much irrelevant information, and is too restrictive in the case of object-oriented component reuse (you might need a new license to get the equivalent of what the LGPL provides for non-OO languages).
Still, it does carry the positive value of promoting altruism, which is absent in traditional copyright licenses and which is tremendously important in the neoliberal world we live in. In The Soul of Man Under Socialism, Oscar Wilde wrote that "a map of the world which does not include Utopia is not worth looking at". As Jon "Hannibal" Stokes aptly noted in Ars Technica, Intellectual Property and the Good Society (August 2001):
Each aspect of a structure--the choice of a foundation, the materials, the location--reflects the values, interests, and goals of the person or group who built it. This is nowhere more evident than in the international intellectual property structures currently under construction by parties with an interest in maintaining the status quo of the offline world into the digital age. This global intellectual property regime is being developed quite deliberately by a very select group of transnational corporations with vast patent, copyright, and trademark holdings, holdings that are essential to their survival. Furthermore, this group justifies their vision of how this structure is turning out by using the language of rights--language that works to this group's benefit in exactly the way I outlined above. By focusing the debate on the "rights" of individual parties, this group has been able to distract us from the construction work that's going on right under our noses. Make no mistake, this group may talk "rights" to the public, but they're thinking "structures," and even a cursory examination of the documents they produce will bear this out.
The type of structures that this regime is developing are outlined in a document typical of it: a product of the World Intellectual Property Organization's (WIPO) Workshop on Implementation Issues of the WIPO Copyright Treaty and the WIPO Performances and Phonograms Treaty, entitled, "Technical Protection Measures: The Intersection of Technology, Law, and Commercial Licenses." In this document, the authors outline a three-pronged strategy for the "protection of intellectual property." Again, it is important to note that this strategy was not developed by content producers "on the ground," but is a product of transnational corporate interests whose overriding concern is the maintenance of the status quo. I'll summarize this strategy briefly, because it's important that each of its three components be understood if those of us with an interest in the outcome of these developments are going to be able to address them.
The first element in this strategy is the most familiar to the technical crowd, as it involves the development of technological protection measures aimed at "protecting" content from "unauthorized uses." The current focus of the "content protection" industry is on the use of encryption and steganography (i.e. information hiding techniques like digital watermarking) to control access to digital works. The industry doesn't expect encryption to completely prevent unauthorized copying, however. Rather, it is felt that encryption will enable content owners to raise the level of difficulty associated with unauthorized reproduction and distribution of copyrighted works. If it is suitably difficult for a consumer to compromise the digital locks placed on published content, the reasoning goes, then such "pirate" activity will be limited to the few, the competent, and the dedicated. Content owners openly admit that this is a direct attempt to artificially reproduce the constraints on copying naturally inherent in analog media, thus doing away with the advantages of digital media for everyone but the content owners themselves.
Open/free software licenses are probably farther from "traditional" copyright law than the approach discussed in the quote above. To a certain extent they are connected to the political notions of freedom and power (as in "Power without freedom is tyranny. Freedom without power is impotent."). RMS's chronic abuse of the word "freedom" and his simplistic (anarchistic) understanding of this complex issue are very symptomatic in this respect. That's why www.gnu.org sometimes resembles the Web site of some obscure "software cult". In no way can it be considered a software developers' site, despite the fact that RMS was a programmer himself in the past. That's probably why the most important open source products are developed outside the FSF umbrella...
It's important to understand that the material presented below covers a pretty limited spectrum of questions related to the theme of the e-book Labyrinth of Software Freedom.
My impression is that for successful software the GPL has partially outlived its usefulness, and that for such products dual licensing, with the Artistic license as the second license, is preferable (the Plan 9 license can generally be considered a longer derivative of the Artistic license, with more legalese, designed for corporate use). In any case, dual licensing looks like a more viable alternative to the "pure" GPL, and it is possible to switch to it even if the GPL was adopted as the single license in the early stages of development. It can also be used from the very beginning, especially if some public funding is involved. One interesting property of the GPL is that it is difficult to change to another license, and an attempt at such a change often generates a backlash, including a possible attempt to fork the product. That's why adding a second license, which achieves 80% of what changing the license would achieve, is a better way to accomplish the same thing. In any case, the GPL is influential enough that authors who once adopted it can't change to another license without some damage (in the worst case, being subjected to "GPL jihad" attacks). Some good software products were destroyed by a change of license from the GPL to another. That means that dual licensing is preferable to changing the license. This idea was pioneered by Perl.
It's clear that viable alternatives to the GPL exist. The Sun Community Source License is probably another interesting, underutilized license that can be used in the early stages of the software development cycle. The BSD license is better for mature software, and here the GPL is simply inadequate.
But the GPL has proved to be a very viable license for abandonware, which now constitutes a large share of open source software, as it greatly simplifies a change of developer/maintainer.
Abandonware is a rapidly growing phenomenon, and it has some grassroots support. It will probably need a separate page really soon ;-). The term abandonware is usually applied to commercial software (like Norton Commander, DOS, MS Word for DOS, WordPerfect for Linux, etc.). But we also need to understand that most open source projects listed on sites such as Freshmeat (now Freecode.net) are abandonware. Few are actively maintained, and of those, even fewer are useful. Quantity does not turn into quality automatically.
The GPL also has an interesting problem related to enforcing a zero price (which can be considered a special case of price fixing). This first surfaced in the two Wallace lawsuits, both of which were dismissed on technical grounds. In neither case was the GPL itself upheld: Wallace's complaint was dismissed without a discussion of its merits, at the request of the FSF (FSF Tries Again To Get GPL Antitrust Suit Dismissed @ ENTERPRISE OPEN SOURCE MAGAZINE):
The Defendant FREE SOFTWARE FOUNDATION INC. has entered into contracts and otherwise conspired and agreed with individual software authors and commercial distributors of commodity software products such as Red Hat Inc. and Novell Inc. to artificially fix the prices charged for computer software programs through the promotion and use of an adhesion contract that was created, used and promoted since at least the year 1991 by the FREE SOFTWARE FOUNDATION INC. This license is known as the GNU GENERAL PUBLIC LICENSE. The price fixing scheme implemented with the use of the GNU GENERAL PUBLIC LICENSE substantially lessens the ability of individual software authors to compete in a free market through the creation, sale and distribution of computer software programs. [emphasis mine]
The ruling was:
"First, while Mr. Wallace contends that the GPL is "foreclosing competition in the market for computer operating systems" (id.), his problem appears to be that GPL generates too much competition, free of charge. The court's understanding from the GPL itself is that it is a software licensing agreement through which the GNU/Linux operating system may be licensed and distributed to individual users so long as those users "cause any work that [they] distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License." (GPL 3.) The GPL purportedly functions to "guarantee [users'] freedom to share and change free software." (GPL Preamble.) As alleged, the GPL in no way forecloses other operating systems from entering the market. Instead, it merely acts as a means by which certain software may be copied, modified and redistributed without violating the software's copyright protection. As such, the GPL encourages, rather than discourages, free competition and the distribution of computer operating systems, the benefits of which directly pass to consumers. These benefits include lower prices, better access and more innovation. See Jason B. Wacha, Taking the Case: Is the GPL Enforceable, 21 Santa Clara Computer & High Tech L.J. 451, 487 (2005). And the Sherman Act "was enacted to assure customers the benefits of price competition, and . . . prior cases have emphasized the controlling interest in protecting the economic freedom of participants in the relevant market." Assoc.'d Gen. Contractors v. Cal. State Council of Carpenters, 459 U.S. 519, 528 (1983). Therefore, the court finds that the Fourth Amended Complaint does not adequately set forth an injury to competition as a whole."- John Daniel Tinder, Judge, United States District Court, Daniel Wallace v. Free Software Foundation, Inc.
It looks like Wallace discovered an interesting property of the GPL that had never been discussed before. In the article by the Stanford Center for Internet and Society, "Court Finds No Antitrust Injury From GNU General Public License (GPL)", the substance of Wallace's claims was outlined as follows:
This case involves the GNU General Public License (GPL), which governs the use of many products sold and distributed by the Free Software Foundation (FSF), including GNU/Linux operating systems. The GPL requires, among other things, that users who distribute or publish any work derived from GPL-covered software license that work under the GPL to all third parties at no charge. The plaintiff, Daniel Wallace (Wallace), was not a user of FSF software; rather, he was a competitor of FSF's, trying to sell his own operating system. Wallace brought an action pro se against FSF claiming that it was conspiring with commercial distributors IBM, RedHat, Novell, and others to fix prices for intellectual property in the market by attaching the GPL to GNU/Linux operating system software. Wallace claimed, in essence, that the GPL constituted a horizontal price-fixing scheme among competitors in violation of Section 1 of the Sherman Antitrust Act and sought to enjoin FSF from developing and distributing Linux under the GPL. On motion by FSF, Judge Tinder of the United States District Court for the Southern District of Indiana dismissed the complaint for failure to show any "antitrust injury" from FSF's conduct, but held that Wallace had otherwise stated a claim upon which relief could be granted.
In his third amended complaint, Wallace alleged that FSF was conspiring with its competitors to fix prices for software via the GPL. The court determined that Wallace was effectively claiming the existence of a horizontal price-fixing agreement, which would be illegal per se under the Section 1 of the Sherman Antitrust Act (prohibiting contracts and conspiracies in restraint of trade) because such horizontal arrangements are perceived to have a “pernicious effect” on competition. By comparison, vertical agreements (those between enterprises at different levels within the same chain of distribution) are governed by a “rule of reason” analysis because their effects will not always be anticompetitive. The court determined that the GPL could not be reasonably characterized as a horizontal agreement because it governs agreements between licensees and licensors, who are users at different levels within the same chain of distribution. Therefore, the court reasoned, the GPL is a vertical agreement, and it cannot alone constitute a per se violation of the Sherman Act.
The court then analyzed the GPL under the rule of reason to determine whether it might be an unreasonable restraint of trade. Under the rule of reason, a vertical licensing agreement may violate the Sherman Act if it produces adverse, anti-competitive effects such as a reduction in output, increase in price, or deterioration in the quality of goods and services, among other factors. FSF argued that its practice of allowing free access to software with the GPL aids competition rather than hinders it. However, the court held that the GPL may have an anticompetitive effect by discouraging software developers from creating better programs for Linux (since they could not be adequately compensated) and reducing the number of quality programs available to consumers. Thus, Wallace’s complaint sufficiently alleged a violation of the Sherman Act.
However, the complaint ultimately failed because the court found that Wallace had suffered no antitrust injury, i.e., injury of the sort that the antitrust laws are designed to prevent. Examining Wallace’s complaint, the court found that his only alleged injury was an inability or unwillingness to enter into the software business because he could not compete with users of Linux. Because this is an injury to a (potential) competitor rather than an injury to consumers or to competition itself, the court found no antitrust injury and dismissed the complaint.
Due to the size of this page, the software licenses catalog was moved to separate pages. There are two common misconceptions that I would like to point out.
The first, very common misconception is that many people confuse "GPL" and "open source". Open source is actually an umbrella term, and it is more correct to distinguish between OSS under particular licenses: the Artistic license, BSD, GPL, MPL, and several other less popular licenses. GPLed software represents less than 50% of "open source" code; the majority of "open source" code is made up of BSD, Artistic license (and its derivatives), and the other open-source license types.
The second misconception is that open source is mainly Linux-related. In reality, the BSD operating systems and BSD-licensed software form an older, in some areas more prominent, and more technically advanced (for example, more secure, as in OpenBSD) branch of the open software space. While Linux might be getting all the ink, at least in the ISP world BSD is doing much of the work (BTW, Yahoo runs on BSD). Windows open source software is also pretty popular. Such open source Windows programs as VirtualBox, Firefox, 7-Zip, MPC, Tera Term, Cygwin, Ghostscript, FAR, and FileZilla, among many others, are of professional quality. OpenOffice is also very popular open source software for Windows, although it exists for Linux too.
See also the Usenet group misc.int-property for relevant discussions.
Dr. Nikolai Bezroukov
May 30, 2021 | firstmonday.org
Shadow libraries' circumvention tactics
Russian shadow libraries developed survival strategies according to three objectives. The first ensured the sustainability and growth of collections, which in turn depended on the vitality of their communities. The second provided personal security to administrators, who could face diverse legal responsibilities. Finally, these libraries sought to guarantee access to their collections in the Russian Federation.
Today's shadow libraries are "communities" that bring together readers and administrators (Bodó, 2018). Unlike Moshkov's Library, Librusec, Flibusta and Maxima Library operate as Wiki-like libraries, where texts and annotations on authors and works are uploaded and corrected by users themselves. Their range of possible actions depends on their seniority, "abilities" (such as coding skills) and "merits" (e.g., technical problems resolved, number of books uploaded). But the possibility for any logged-in member to upload and modify content remains a basic principle for shadow libraries. This may have allowed senior administrators (who are usually very few in number) to disavow themselves in the hypothetical case of legal liability: they could say that they were not responsible for user-generated content [ 52 ]. This argument, though, is no longer valid since the introduction of the notion of "information intermediaries" by "anti-piracy" legislation.
On the other hand, this collective work ensures the possibility of exponential growth of collections. Ilya Larin, a programmer by profession who shares the ethics of free software exchange, has made the source code of his library freely available [ 53 ]. As Coleman noticed, this practice of code sharing is deeply rooted in a worldview where "code is speech", typical of hacker culture, and thus freedom of speech and code sharing are intrinsically connected [ 54 ]. The interview I conducted with Larin shows that he relies on both code and text sharing to guarantee the survival of the ecosystem:
They (Flibusta) borrowed my site engine, cloned my archive. A lot of libraries once took the Librusec site engine. Since then, they all have developed in different directions. But there is constant cross-pollination: someone has a new book, there is a mass of librarians that work simultaneously on several Web sites and drag and drop books. It is now a dense community that is much more sustainable. If Librusec disappears tomorrow, well, nothing will happen.
This "cross-pollination" preserves collections by multiplying media and means of dissemination, thanks to members of these communities who share particular know-how or have access to specific libraries. For instance, Larin argues that, when a large volume of archives needs to be downloaded (such as all of the updates for a month or a year), it is "more convenient to download them from a torrent, and there are volunteers who regularly pull these updates from the Librusec and hand them out through the torrents". Some shadow libraries have their own torrents. Major torrent aggregators also distribute these archives and their updates. Torrents allow users to download the main archives in their entirety onto their computers, accompanied by a program that transforms the archive into a database for convenient use. Once the full database is installed on a computer, the user may download monthly or annual updates. Comments on the torrent aggregators show that these distributions are the work of a small number of volunteers and that some users take over and distribute uploaded archives through their own channels. Thus, the library's collection is replicated many times and kept not only online, but also off-line. Its ability to be restored any time makes its total disappearance impossible.
The server on which the main collection is stored is also of great importance for the security of the library. The legislation of the country where it is located must be sufficiently tolerant of copyright infringements. On the forums, administrators exchange comments on "good" and "bad" locations. Larin states in the interview that there is a balance to be found between various types of censorship. Hosted in the Netherlands, a country perceived by the respondent as tolerant of pirate sites, Librusec had been attacked for "child pornography" because of an old sex education manual, and had to find a host in another country. Thus, administrators of the shadow libraries use the possibilities offered by the global nature of the Internet, moving servers according to local affinities.
Additional security is provided by the choice of anti-abuse hosts. Their services are more expensive, but they are a bulwark against possible complaints from rightsholders that an ordinary host would satisfy without going into detail.
Another task for library administrators is to ensure their personal safety. Indeed, "anti-piracy" legislation makes them responsible for illegal content on the sites that they maintain. Two tactics co-exist: first, the use of a geographically remote location providing immunity; second, the strictest anonymity, which does not permit establishing a link between a specific individual and his virtual double. The geography of immunity does not coincide with the formal, traditional borders of Russia. Rather, it is defined by the interest of local authorities and industry in the piracy of books, most of which are Russian-language.
An example of the first tactic is Larin, who has been living in Ecuador since the beginning of the 2000s. This location allows him not to hide his identity, which he uses in particular on social networks (LiveJournal, Facebook) and in public interviews. During my interview with him, he stressed the ineffectiveness of court decisions in his country of residence:
"" Litres brought a lawsuit against me, even in Ecuador. I think they even won, but it did not affect me in any way. The specificity of Latin America is that when one gringo sues another, it does not mean that they are somehow concerned.
"" Can they charge you with a fine?
"" They can charge me, yes. But they cannot force me to pay it. Besides, court decided that the Librusec domain should somehow be taken away from me, and that should kill the library. And it has been like that for two years now. However, it works, I renew it every year. Well, that is Ecuador. [ 55 ]
The word "gringo" underlines his status as a foreigner who is considered detached from local concerns. However, he feels distant from the Russian context too:
I have never seen them (people from Litres) alive; they are some people in a faraway Moscow on another continent who are doing their business. Well, great for them. [ 56 ]
His posture as a person who is ultimately free of geographical constraints makes him impervious to possible prosecution. We can note the difference with WikiLeaks founder Julian Assange, who also saw in Ecuador (or at least in its London embassy) a safe harbour where he could escape rape charges by the Swedish police and possible extradition to the U.S. to face trial for leaking confidential government documents. The asylum granted to Assange was the result of an "anti-imperialist" stance by Rafael Correa, President of Ecuador, in 2012. Assange was handed over to the British police by his successor, Lenin Moreno, anxious to distance himself from his predecessor. Larin, for his part, underlines the apolitical nature of his choice of residence, made in 2000, i.e., well before the creation of Librusec, "due to its pleasant climate". If the public figure of Assange was meaningful for Ecuador's foreign policy, Larin draws his strength precisely from his total insignificance in the local context. His example is cited by administrators of other libraries as the reason why they cannot afford to reveal their identity.
One of Flibusta's senior directors, who calls himself Stiver, supposedly lives in Germany. However, in November 2014 he related on the library's forum his confrontation with local law enforcement authorities, without naming the country where the conflict took place:
As some of you know, I have been under investigation for the last two years or so. The same nationally acclaimed publishing house filed a complaint against me, accusing me of all sorts of sins: from illegal reading to running an international crime syndicate. And of the related shadow enrichment, of course.
The police forces collected available information on me, searched my place of residence and work and seized my laptop. Then they sat down and started to examine what (...) they got.
Result so far: case closed. The investigation has not found and does not expect to find any irregularities beyond the scope of the negligible details. (...)
In addition to me personally, an attempt was made to criminalize the entire library as well (with absolutely hilarious arguments), which similarly failed. The investigation specifically stated that the library is non-profit and does not generate income. (...) Sometimes a banana is just a banana. [ 57 ]
This apparently positive news was met with contrasting reactions from members of the forum. Most of them expressed their joy, hoping that this would set a precedent that would allow illegal libraries to emerge from the shadows. Others advised not to relax or lose vigilance and emphasized the "stress" of being exposed to an investigation. Some users teased Stiver, inviting him to reveal his identity since he is already known to the police. However, users finally agreed on the usefulness of remaining anonymous in order to avoid trouble. Notably, international precedents for the arrest of individuals responsible for shadow libraries, such as the Pirate Bay founders [ 58 ], were not mentioned in the discussion, even though the Pirate Bay case had been previously debated on the forum.
The investigation of "Stiver" did not set a precedent in Russia: a German court resolution changed neither the Russian authorities' attitude towards the library nor its perception by Russian publishing houses. However, it led to a strong imperative to remain anonymous. The site's "rules" made clear the crucial importance of anonymity, stating that "actions intended or likely to endanger the security or personal rights of users are prohibited. In particular, impersonation is prohibited" [ 59 ]. When I requested an in-person interview with Flibusta's other principal administrator, "Roger", he categorically refused any means of communication that could potentially lead back to him, referring me back to the trial "Stiver" had endured.
Finally, the third task to be accomplished in order to guarantee the durability of collections is to ensure access. Roskomsvoboda tried to set a judicial precedent against the blocking of shadow libraries during the first lawsuit of that kind: the trial against Rutracker, one of the biggest Russian torrent trackers, in November 2015. After the court decided in favor of "eternal" blocking of this resource, Roskomsvoboda's lawyers launched a campaign for the abolition of such restrictive measures. Roskomsvoboda called this campaign "The battle for RuNet", thus placing free access to cultural content in the category of digital freedoms, together with other types of resistance to censorship. A Web site was set up to collect funds and signatures [ 60 ], and a complaint signed by 7,000 people was submitted to the court. Their argument was that "restricting access to the entire site due to a few items from the catalogue of Eksmo Publishing violates the rights of millions of site users to access content on the site and the rights of thousands of authors who voluntarily distribute their works to the Rutracker audience" [ 61 ]. This attempt to turn the affair into a political struggle did not bear fruit: the claim was dismissed twice and did not affect further proceedings.
In a context of increasingly frequent blockages from 2015 onwards, shadow libraries nonetheless needed to find a way to keep their resources visible to the wider public. The first way consisted of multiplying "mirrors" in locations that are considered safe. For example, the .lib domain, run by Emercoin, is regarded as such thanks to its decentralized structure. In its current form, the domain name system (DNS) root is under the centralized authority of the Internet Corporation for Assigned Names and Numbers (ICANN). In this system, every DNS record is kept by the DNS provider and can be blocked under political or commercial pressure; however, "in a decentralized DNS each record is managed solely by its owner, and is readable by all users on the network" [ 62 ]. Networks that guarantee anonymity, like Tor and the "Invisible Internet Project" (i2p), are also suitable for escaping national constraints, which led shadow libraries to have their "representatives" there [ 63 ].
The second way focused on instructing users. Since the beginning of site blockages, libraries and forums have displayed lists of the technical means to escape communication barriers created by Russian Internet service providers at Roskomnadzor's request. This dissemination of circumvention details seems to be omnipresent and circulated through many additional channels: mailings [ 64 ], posts on social networks and explanations on YouTube. Roskomsvoboda also organized masterclasses and disseminated lists of circumvention strategies [ 65 ]. The proposed solutions included changing the DNS server; browsers with turbo mode (Opera, Chrome, and Yandex) [ 66 ] or embedded VPN; special plug-ins for browsers; VPN services; Tor browser; and the use of anonymizers. These lists were often illustrated with pictures and tutorial videos [ 67 ]. Community members (actual programmers) explained to non-technical users how each method worked and presented its advantages and disadvantages (usually speed of execution versus simplicity of use). They also monitored specific tools and then shared the results of their observations with other users, advising them on a particular VPN or anonymizer. This mastery of circumvention tools became part of the expertise of every person visiting shadow libraries.
In addition to the torrents mentioned earlier, library users and administrators competed in creativity to organize the distribution of new archive updates. Here again, the proliferation of means and media was the rule. For example, Librusec.ucoz, a forum common to several shadow libraries and which acts as a safe harbour in case the internal forum is inaccessible, has a section called "Our Tortuga" [ 68 ]. One of the administrators of the forum provides links to updates of Librusek and Flibusta databases via free file-sharing sites.
In parallel to these external but friendly sites, the communities used social networks and messaging. For example, a bot on the Russian social media platform VKontakte allowed users, for a while, to run quick searches of shadow libraries, facilitating access in case of a breakdown or blocking [ 69 ]. However, in June 2019, VKontakte and the publishing house Eksmo concluded a settlement according to which the social network has to check the legal status of all the books uploaded by users. Even so, this did not lead to the disappearance of the bot: in September 2019, it was transferred to a safer storage medium [ 70 ].
After the tightening of VKontakte's attitude towards illegal content, some shadow libraries began to actively use Telegram bots to distribute books, as this messenger was known to be tolerant of illegal content thanks to the libertarian worldview of its owner, Pavel Durov. However, by August 2020, these bots had reportedly stopped working, supposedly as the result of an agreement between Telegram and the Russian authorities. In response, users immediately suggested a technical solution to bypass this new barrier:
Flibusta bot has been blocked not only on iOS, but also on android.
How to make the bot work again? It is simple: you need to do the following steps:
1. Create a group;
2. Add a bot to it by clicking on the following link (...) [ 71 ]
Once the bot was added to a group, the system would stop treating it as a bot. Thus, users leveraged the internal shortcomings of the messaging system to bypass blockages.
To summarize, the survival of libraries and the communities that sustain them was influenced by the multiplication of media and preservation sites, responsiveness of the members of this community to the obstacles that arose as well as their ability to share tips and find creative technical solutions and to share these techniques as widely as possible. The communities that animated shadow libraries leveraged the technical knowledge of their members to help make these libraries perennial and elusive. The permanence of communication channels seemed essential to this process, hence their multiplication and sometimes their redundancy.
This is likely part of a wider dynamic. According to Keucheyan and Tessier, in today's digital world, the idea of a "revolution" does not seem very promising in the light of the counter-revolutions it would entail; thus, hackers do not seek confrontation. Rather, "they take advantage of the interstices, disperse and reform elsewhere" [ 72 ]. But while such symbolic diversions might seem possible only in a democratic country, we have shown how they also occur in authoritarian contexts. Russian NGOs fighting for open access, such as the Association of Internet Publishers, Wikimedia.ru or Roskomsvoboda, are part of international networks and endorse globalized political struggles. Shadow libraries stand aside and promote a conception of "freedom" that goes beyond politics, anchored in the mastery of technical tools and cultural practices that are not censored by an authority outside their communities.
In Russia, the struggles for an Internet free of censorship and those for open and fair access to literary texts overlap to some extent. On the one hand, digital rights defenders are indeed mobilizing for both causes, encompassing them under the same label of a "free Internet". On the other hand, the organizations fighting against these shadow libraries, whether at the service of the state (Roskomnadzor) or that of the Russian book industry (ASAPI), conceptualize "pirates" in the same way as the "extremists" or "child pornographers" whom they censor: the idea is then to "purge" RuNet of these dangerous and impure elements, supposedly linked to the darknet and foreign organizations, and to banish them from the Russian online public space. Outside Russia, this has led to a gradual politicization of shadow library administrators: having started as "merry men happily sharing what was sold", they found themselves in the same struggle as "victims of persecution and political censorship", which gave them a "cause for rebellion" [ 73 ]. This politicization has not yet been detected in the communities around Russian mass-literature shadow libraries: concentrating on the technical aspects of circumventing constraints and training users, they have kept away from public protest actions, conducted on their behalf by organisations defending digital freedoms. For example, the few explicitly "political" posts on anti-copyright mobilizations around the world hardly provoke any reaction, unlike posts on technical issues, which have been widely discussed. However, these communities cultivate an irony and a folklore reflecting their detachment from, and sometimes even aggressive rejection of, the copyright industry and its profit values.
These communities hold a special place in the nebula of "hackers". They overcome legal and technical boundaries to disseminate copyright-protected content, and technical expertise goes hand in hand with the literary knowledge that is their raison d'être. Russia presents a rich breeding ground for a symbiosis of love of books and technical mastery, since the technical intelligentsia, known for their infatuation with reading, have strongly invested in the Internet. Yet, the majority of Russian hackers are confronted, above all, with foreign industries. Thus, the so-called "carders" have mainly devoted themselves to stealing American credit cards, sometimes defending their efforts with a patriotic anti-American discourse (Turovskij, 2019). Shadow libraries with audio or video content have been primarily in conflict with U.S. copyright industries (Kiryia, 2011). Russian academic shadow archives, like LibGen and SciHub (Bodó, 2018), challenge the Western academic publishing system that generates inequalities in access to knowledge (Karaganis, 2018). Unlike them, literary text archives focus on Russian-language texts (originals or translations), and therefore interfere very little with foreign actors. Facing the Russian book industry presents additional challenges. That industry is the foundation for the supposedly "most-reading nation", a cliché inherited from the Soviet Union that the current government wishes to promote as a marker of a specific identity. This explains the state's continued support for this industry and the weight of this sector in the creation of new legislation against the illicit distribution of cultural products.
Communities with changing contours, promoting the idea of "absolute freedom" attached to a pirate imaginary, unintentionally disrupt the enterprise of creating a sovereign "pure" RuNet allegedly containing a national legally approved literature, and thus contribute to the development and dissemination of circumvention techniques.
About the author
Bella Ostromooukhova is Associate Professor of Russian Language and Culture at Sorbonne Université. Her research focuses on sociological aspects of independent book publishing in contemporary Russia.
E-mail: ostrob [at] gmail [dot] com
1. Interview with Maksim Moshkov, 4 July 2019.
2. The term "pirate", although frequent in common speech, has multiple connotations that are often instrumentalized by actors. Stigmatizing "piracy" highlights the violent or even "terrorist" aspect of their actions and therefore justifies their banishment. However, "pirates" often refer to themselves as such, because of the idea of freedom and power attached to this figure, which can therefore play a liberating role (Keucheyan, 2008; Hayat and Paloque-Bergès, 2014). In order to distance myself from these additional meanings, I will call our object of study, following in the footsteps of Karaganis (2018), "shadow libraries", using the word "pirate" or "piracy" only when it is used by the actors themselves.
3. Dagiral, 2008, p. 495.
4. Bacot and Canonne, 2019, p. 10.
5. Coleman, 2013, p. 15.
6. According to its administrator, it was the most visited Russian-language Web site in 2012–2013.
7. Flibusta is currently one of the most active shadow archives, navigating legislative loopholes and introducing new technical means to circumvent prohibitions.
8. For instance, the science fiction writer Leonid Kaganov considers that "free distribution can be very useful for the authors, especially young and unknown ones", since it increases their professional potential. See Leonid Kaganov, "There are benefits to pirate libraries," at https://web.archive.org/web/20090730000859/http://www.slon.ru/articles/59387 , accessed 1 June 2020.
Sep 14, 2019 | economistsview.typepad.com
anne , September 13, 2019 at 06:48 PM

http://cepr.net/blogs/beat-the-press/patents-and-copyright-protection-racket-for-intellectuals
September 12, 2019
Patents and Copyright: Protection Racket for Intellectuals
By Dean Baker
Last week I was asked on Twitter why proposals for replacing patent monopoly financing of prescription drugs with direct public financing have gained so little traction. After all, this would mean that drugs would be cheap; no one would have to struggle with paying tens or hundreds of thousands of dollars for drugs that are needed for their health or to save their life. (This is discussed in Rigged: How Globalization and the Rules of the Modern Economy Were Structured to Make the Rich Richer - it's free. * )
Public funding would also eliminate the incentive to misrepresent the safety and effectiveness of drugs in order to maximize sales at the patent monopoly price. Without patent monopolies, the drug companies would not have had the same incentive to push opioids, as well as many other drugs of questionable safety and effectiveness.
The idea of direct funding of biomedical research also should not seem strange to people. We currently spend close to $45 billion a year on research through the National Institutes of Health and other government agencies. The idea of doubling or tripling this funding to replace the roughly $70 billion of patent-supported research now done by the pharmaceutical industry should not appear outlandish, especially since the potential savings from free-market drugs would be close to $400 billion annually (1.9 percent of GDP).
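As a quick sanity check, the figures in this paragraph can be tied together with a few lines of arithmetic. This is only a sketch using the numbers quoted in the article; the implied-GDP and net-savings values are my own back-calculations, not figures Baker states.

```python
# Figures quoted in the article, in dollars.
nih_spending = 45e9           # current annual public research funding (NIH and other agencies)
industry_research = 70e9      # patent-supported research now done by the pharmaceutical industry
annual_savings = 400e9        # potential savings from free-market drug prices
savings_share_of_gdp = 0.019  # the same savings expressed as a share of GDP

# Back out the GDP that the "1.9 percent" figure implies: about $21 trillion,
# consistent with U.S. GDP around 2019.
implied_gdp = annual_savings / savings_share_of_gdp

# Even tripling public funding (two extra $45 billion tranches) more than covers
# the $70 billion of industry research and still leaves large net savings.
extra_public_cost = 2 * nih_spending
net_savings = annual_savings - extra_public_cost  # $310 billion

print(f"Implied GDP: ${implied_gdp / 1e12:.1f} trillion")
print(f"Net savings after tripling public funding: ${net_savings / 1e9:.0f} billion")
```

Note that even under the most expensive variant (tripling, not doubling, public funding), the net savings dwarf the added public cost.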
So why is there so little interest in reforming the prescription drug industry along these lines? I can think of two plausible answers. The first is a self-serving one for the elites who dominate policy debates. They don't like to have questions raised about the basic underpinnings of the distribution of income.
The second is perhaps a more simple proposition. Intellectuals have a hard time dealing with new ideas, and paying for innovation outside of the patent system, or creative work outside of the copyright system is a new idea that most intellectual types would rather not wrestle with.
Starting with the first one, the elites who dominate public policy debates are among the winners in the upward redistribution of the last four decades. While there are plenty of journalists struggling to keep their jobs or working as freelancers, and there are plenty of adjunct faculty who can't pay the rent, these are not the people setting the agenda in policy debates. Rather, we are talking about columnists who enjoy high six or even seven figure incomes from their writings and speaking fees, and top university faculty who can count on comparable pay.
These people do not want to entertain the idea that they didn't end up as big winners through a combination of skill, hard work, and perhaps a dose of good luck. Even the progressives in this group, who support redistributive tax and transfer policy, would rather see this as an expression of their generosity than a refusal to take part in theft.
The issue can be seen as a distinction between someone who wins a big pile of money in a lottery and someone who slips in a fake card to win the poker pot. If we recognize that patent and copyright monopolies are government policies, that could be completely restructured or even eliminated altogether, it destroys the idea that technology has been responsible for upward redistribution or even a major factor in upward redistribution.
If Bill Gates got very rich because of Windows and other Microsoft software, it was not because of the technology, but rather because the government gave him copyright and patent monopolies on this software. All the high-paying jobs in the science, technology, engineering, and mathematics sector are not the result of technological change creating new opportunities, but rather the large incentives the government provides with long and strong intellectual property protections.
This is a direction that many, perhaps most, elite types would rather not go. They might be open to coughing up more money in taxes to reduce inequality and provide opportunities for the poor, but they are not open to the idea that they never should have had the money in the first place.
Motivated reasoning is common in public debates, and this seems a plausible story. However, there is also the alternative option, that questioning patent and copyright monopolies is a new idea to most elite types, and they would rather not expend any mental energy on the effort.
If it had not been for my experiences during the housing bubble, and subsequent collapse and recession, I might be reluctant to accept that intellectual types would have a hard time thinking seriously about patent and copyright monopolies. That experience taught me that some ideas can be too simple for intellectuals to understand.
I first noticed that house sale prices were running far out of line with other prices and with historical experience in 2002. House sale prices had generally moved more or less in step with overall inflation. In the years from 1996 to 2002 they hugely outpaced inflation.
Furthermore, this did not seem to be driven by the fundamentals in the market. Rents were pretty much moving in line with inflation. And, vacancy rates were actually very high, not the story you expect when prices are rising rapidly.
Making the matter even more worrisome, the bubble was clearly driving the economy. Residential construction was growing rapidly as a share of the economy and consumption was soaring as people took advantage of the newly created equity in their homes to increase spending. The construction boom would clearly end when prices came back down to earth and the home equity driven consumption surge would hit a wall when the home equity disappeared.
All of this was very straightforward. The analysis could be constructed from summary data in publicly available series; it didn't require any sophisticated econometric analysis.
Nonetheless, I couldn't get any economists to take my concerns seriously. (Paul Krugman was a notable exception.) It wasn't that they had any counter-arguments. It basically boiled down to "we haven't seen anything like that before."
Incredibly, even after the fact, economists could not own up to their mistake. They insisted the story was not the housing bubble, but rather the financial crisis.
This gave them an out, since credit default swaps and collateralized debt obligations can get complicated. Looking at the growth of residential construction and the fall in the savings rate in the GDP data is pretty damn simple. Rather than own up to being too lazy to look at the data that was right in front of their faces, economists and their followers in policy circles whipped up a cock-and-bull story to conceal their incredible incompetence and/or negligence.
Anyhow, having seen first-hand the laziness and narrow-mindedness of economists in refusing to take the risks of the housing bubble seriously, I certainly can find it plausible that they simply don't want to entertain the idea that we could have alternative mechanisms to patents and copyrights to finance research and innovation. Furthermore, they don't want to have to alter their view of the government and the economy to incorporate the fact that these forms of property are determined by policy and can be altered pretty much any way we like.
I have two accounts that are certainly consistent with this view. In one case, I had written an article for a major progressive publication, arguing that we should have publicly funded research for pharmaceuticals, rather than supporting the research through patent monopolies. After it had been through the editing process, the editor sent me a note asking whether I was arguing for "short patents" or no patents.
I wrote back clarifying that I meant no patents. I explained that short patents wouldn't even make any sense. Since the government had paid for the research, who would get the patent?
When the piece was published it said "short patents." Apparently the editor could not even conceive of an innovative new drug being sold in the free market without some form of patent monopoly.
In the other case, I asked an editor at the Atlantic, for whom I had just written another piece, whether they would be interested in an article that made the point that patents and copyrights were tools of public policy, and that there are alternative mechanisms for financing innovation and creative work. I indicated that the piece would focus on prescription drugs, given the enormous financial and health consequences at stake in the sector.
The editor responded that, although they personally were sympathetic to my argument, the magazine doesn't publish opinion pieces. If it's not obvious, nothing that I proposed was opinion. I was giving facts and logic, but apparently this editor could not make the distinction.
The takeaway here is that thinking about patents and copyrights as policy tools that can be altered, as opposed to being natural features of the market, requires more reflection than most people in policy debates are prepared to do. After all, it is not as though any of them will lose their job, or even see their career advancement jeopardized, by not considering the implications of this obvious truth, just as none of them suffered any consequence from ignoring the housing bubble.
So, how do we get the idea of replacing patent monopolies with public funding of prescription drug research into the public debate? Wish I had an answer. I couldn't get the housing bubble into the public debate before its collapse sank the economy, or even after the fact.
Anyhow, I'm open to suggestions. I will keep trying.
Apr 22, 2019 | economistsview.typepad.com
anne -> anne... , April 21, 2019 at 05:26 PM

http://cepr.net/images/stories/reports/ip-2018-10.pdf
Is Intellectual Property the Root of All Evil? Patents, Copyrights, and Inequality
By Dean Baker
This paper raises three issues on the relationship between intellectual property and inequality. The first is a simple logical point. Patents, copyrights, and other forms of intellectual property are public policy. They are not facts somehow given to us by the world or by the structure of technology. While this point should be self-evident, it is rarely noted in discussions of inequality or ways to address it.
The second issue is that there is an enormous amount of money at stake with intellectual property rules. Many items that sell at high prices as a result of patent or copyright protection would be free or nearly free in the absence of these government granted monopolies. Perhaps the most notable example is prescription drugs where we will spend over $420 billion in 2018 in the United States for drugs that would almost certainly cost less than $105 billion in a free market. The difference is $315 billion annually or 1.6 percent of GDP. If we add in software, medical equipment, pesticides, fertilizer, and other areas where these protections account for a large percentage of the cost, the gap between protected prices and free market prices likely approaches $1 trillion annually, a sum that is more than 60 percent of after-tax corporate profits.
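The same kind of back-of-the-envelope check works for the drug figures in this paragraph. Again, this is only a sketch built from the numbers quoted above; the implied GDP and the implied profit level are my own derivations, not figures from the paper.

```python
# Prescription drug figures quoted above (2018, in dollars).
protected_spending = 420e9    # U.S. drug spending at patent-protected prices
free_market_spending = 105e9  # estimated spending at free-market prices
gap_share_of_gdp = 0.016      # the drug gap expressed as a share of GDP

# The gap between protected and free-market drug prices: $315 billion.
drug_gap = protected_spending - free_market_spending

# Back out the GDP that the "1.6 percent" figure implies: roughly $19.7 trillion,
# consistent with U.S. GDP in 2018.
implied_gdp = drug_gap / gap_share_of_gdp

# If the ~$1 trillion total gap (drugs plus software, medical equipment, etc.)
# is "more than 60 percent" of after-tax corporate profits, after-tax profits
# must be below roughly $1.67 trillion.
implied_profit_ceiling = 1e12 / 0.60

print(f"Drug gap: ${drug_gap / 1e9:.0f} billion")
print(f"Implied GDP: ${implied_gdp / 1e12:.1f} trillion")
```

The point of the check is simply that the quoted dollar amounts, GDP shares, and profit shares are mutually consistent.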
The third issue is that the effect of these protections is to redistribute income upward. This can be seen most easily in looking at the origins of the fortunes of some of the country's richest people, starting with Bill Gates. It also is apparent from looking at the leading companies in terms of market capitalization and profits, starting with Apple.
In addition, the demand for people with advanced skills in computer science, biotechnology, and other technical areas is highly dependent on the patent and copyright monopolies which ultimately pay for their work. With a different set of rules for promoting innovation and creative work, there could be far less demand for their work. While it can be debated whether or not that situation is desirable, the point is that this is a policy decision, not anything that is determined by technology or the natural development of the economy.
Instead of being a sidebar pursued by a small clique of economists and people concerned about access to medicines, rules on intellectual property should play a central role in debates on inequality. There is a huge amount at stake in setting these rules and those concerned about inequality should be paying attention.
Dec 28, 2017 | www.moonofalabama.org
Tony_0pmoc , Dec 27, 2017 4:08:19 PM | 117

@b, 108
To post a full article of yours or anyone else's without asking permission, and without giving attribution, is completely outrageous. However, what you write is extremely good, and people will want to spread your wise words. They should, however, ask permission, and maybe even pay you if they have the resources to do so.
I haven't asked permission of an Australian journalist, to re-link her words. I first came across her only a few weeks ago on Tom Feeley's website Information Clearing House. I often read you there too. Did you know?
Her words are the same, but there are subtle differences in the detail and the imagery. Is that Mr Fish's cartoon on ICH? If so has he been acknowledged and paid?
Has Caitlin Johnstone been paid? I think what she writes is awesome.
Maybe you guys, should contact each other, and maybe form an independent co-operative, cos from where I'm reading you - you are all on the same side, and you are all incredibly talented and brave.
Sep 16, 2017 | news.slashdot.org
Posted by msmash on Thursday September 07, 2017
Eugene Kim, reporting for CNBC: Shortly before Amazon Prime Day in July, the owner of the Brushes4Less store on Amazon's marketplace received a suspension notice for his best-selling product, a toothbrush head replacement.
The email that landed in his inbox said the product was being delisted from the site because of an intellectual property violation. In order to resolve the matter and get the product reinstated, the owner would have to contact the law firm that filed the complaint. But there was one problem: the firm didn't exist. Brushes4Less was given the contact information for an entity named Wesley & McCain in Pittsburgh.
The website wesleymccain.com has profiles for five lawyers. A Google image search shows that all five actually work for the law firm Brydon, Swearengen & England in Jefferson City, Missouri. The phone number for Wesley & McCain doesn't work while the address belongs to a firm in Pittsburgh called Robb Leonard Mulvihill. The person who supposedly filed the complaint is not registered to practice law in Pennsylvania.
One section on Wesley & McCain's site stole language from the website of the Colby Law Office. The owner of Brushes4Less agreed to tell his story to CNBC but asked that we not use his name out of concern for his privacy.
As far as he can tell, and based on what CNBC could confirm, Amazon was duped into shutting down the seller's key product days before the site's busiest shopping event ever.
Feb 01, 2017 | economistsview.typepad.com

anne : January 31, 2017 at 05:11 AM

pgl -> anne... , January 31, 2017 at 05:23 AM
January 31, 2017
David Leonhardt Says the Future of the Free World Depends on Longer Copyrights for Mickey Mouse
That's not exactly what he said but pretty damn close. Since you get thrown out of elite circles if you question the merits of the Trans-Pacific Partnership (TPP), the members are doubling down. They are insisting that terrible things will happen now that the TPP is dead.
David Leonhardt picked up the mantle in his New York Times column * today, telling readers that, to counteract China, the countries of the region supported the TPP. He says they were:
"willing to adopt American-style rules on intellectual property, pollution and labor unions, even though those rules created some political tensions in those countries."
Among the rules on intellectual property was the retroactive extension of copyrights, requiring that countries protect works created in the past for at least 75 years. The retroactive extension of copyrights makes virtually no sense. Copyright monopolies are supposed to provide an incentive to produce creative work. While longer copyrights can in principle provide more incentive going forward, they can't provide incentive for past behavior.
Retroactive copyright extension has been a practice in the United States in large part to keep Mickey Mouse under copyright protection. The length of copyright has twice been extended retroactively ** in the United States as a result of Disney's ability to lobby Congress.
This sort of protectionism is very costly. The Obama administration, at the request of the entertainment industry, the software industry, and pharmaceutical industry, insisted on stronger and longer patent and copyright related protections in the TPP. Unfortunately, the projections of the economic impact of the TPP do not take account of the costs of these protections.
Anyhow, it is worth noting these handouts to politically powerful corporations. If the future of the free world depends on the TPP, as Leonhardt argues here, then maybe it shouldn't have included measures that will hugely raise the cost of everything from prescription drugs to software to Mickey Mouse memorabilia.
-- Dean Baker

A 75 year copyright is really weird. Let's say Taylor Swift comes out with a great new hit. Is she going to live until she is 100 years old? Besides, she has already made so much money that Tom Brady is jealous.

Observer -> anne... , January 31, 2017 at 06:53 AM

And then there is Peanuts... which is worth considerably more than peanuts.
Exclusive: Peanuts, home of Snoopy and Charlie Brown, up for sale - sources
"U.S. brand management company Iconix Brand Group Inc (ICON.O) is exploring a sale of its majority stake in Peanuts Worldwide LLC, which owns the rights to cartoon strip characters Snoopy and Charlie Brown ...
Created by Charles Schulz and licensed in over 100 countries, the characters generate about $30 million in 12-month earnings before interest, taxes, depreciation and amortization, the people added. They declined to comment on the expected deal valuation."
Jan 28, 2017 | economistsview.typepad.com

anne : January 28, 2017 at 05:38 AM

http://cepr.net/blogs/beat-the-press/a-trade-war-everyone-can-win

anne -> anne... , January 28, 2017 at 05:48 AM
January 27, 2017
A Trade War Everyone Can Win
Donald Trump has indicated * that he might slap high tariffs on imports from Mexico as a way to make the country pay for his border wall. While it's not clear this makes sense, since U.S. consumers would bear the bulk of the burden from this tax, it would certainly reduce imports from Mexico. It would also violate the North American Free Trade Agreement and World Trade Organization rules, thereby opening the door to a trade war with Mexico and possibly other countries.
Many have seen this as taking us down a road to ever higher tariffs, leading to a plunge in international trade, which would have substantial economic costs for everyone. However, Mexico could take an alternative path that would provide far more effective retaliation against President Trump, while leading to fewer barriers and more growth.
The alternative is simple: Mexico could announce that it would no longer enforce U.S. patents and copyrights on its soil. This would be a yuuge deal, as Trump would say.
To take one prominent example, suppose that Mexico allowed for the free importation of generic drugs from India and elsewhere. The Hepatitis C drug Sovaldi has a list price in the United States of $84,000. A high quality generic is available in India for $200. There are also low cost generic versions available of many other drugs that carry exorbitant prices in the United States, with savings often more than 95 percent.
Suppose that people suffering from Hepatitis C, cancer, and other devastating and life-threatening diseases could get drugs in Mexico for a few hundred dollars rather than the tens or even hundreds of thousands of dollars they cost in the United States? That would likely lead to lots of business for Mexico's retail drug industry, although it would be pretty bad news for Pfizer and Merck.
The same would apply to other areas. Medical equipment, like high-end scanning and diagnostic devices, would be very cheap in Mexico if they could be produced without patent protections. This should be great for a medical travel industry in Mexico.
There would be a similar story on copyright protection. People could get the latest version of Windows and other software for free in Mexico with their new computers. This is bad news for Bill Gates and Microsoft, but good news for U.S. consumers interested in visiting Mexico, along with Mexico's retail sector. Mexico could also make a vast amount of recorded music and video material available without copyright protection. That's great news for consumers everywhere but very bad news for Disney, Time-Warner, and other Hollywood giants.
Of course the erosion of patent and copyright protection will undermine the system of incentives that now supports innovation and creative work. This means that we would have to develop more efficient alternatives to these relics of the feudal guild system. Among other places, folks can read about alternatives in my book, "Rigged: How Globalization and the Rules of the Modern Economy Were Structured to Make the Rich Richer" ** (it's free).
Anyhow, this would be a blueprint for a trade war in which everyone, except a few corporate giants, could be big winners.
-- Dean Baker, http://deanbaker.net/images/stories/documents/Rigged.pdf
Rigged: How Globalization and the Rules of the Modern Economy Were Structured to Make the Rich Richer
By Dean Baker
The Old Technology and Inequality Scam: The Story of Patents and Copyrights
One of the amazing lines often repeated by people in policy debates is that, as a result of technology, we are seeing income redistributed from people who work for a living to the people who own the technology. While the redistribution part of the story may be mostly true, the problem is that the technology does not determine who "owns" the technology. The people who write the laws determine who owns the technology.
Specifically, patents and copyrights give their holders monopolies on technology or creative work for their duration. If we are concerned that money is going from ordinary workers to people who hold patents and copyrights, then one policy we may want to consider is shortening and weakening these monopolies. But policy has gone sharply in the opposite direction over the last four decades, as a wide variety of measures have been put into law that make these protections longer and stronger. Thus, the redistribution from people who work to people who own the technology should not be surprising - that was the purpose of the policy.
If stronger rules on patents and copyrights produced economic dividends in the form of more innovation and more creative output, then this upward redistribution might be justified. But the evidence doesn't indicate there has been any noticeable growth dividend associated with this upward redistribution. In fact, stronger patent protection seems to be associated with slower growth.
Before directly considering the case, it is worth thinking for a minute about what the world might look like if we had alternative mechanisms to patents and copyrights, so that the items now subject to these monopolies could be sold in a free market just like paper cups and shovels.
The biggest impact would be in prescription drugs. The breakthrough drugs for cancer, hepatitis C, and other diseases, which now sell for tens or hundreds of thousands of dollars annually, would instead sell for a few hundred dollars. No one would have to struggle to get their insurer to pay for drugs or scrape together the money from friends and family. Almost every drug would be well within an affordable price range for a middle-class family, and covering the cost for poorer families could be easily managed by governments and aid agencies.
The same would be the case with various medical tests and treatments. Doctors would not have to struggle with a decision about whether to prescribe an expensive scan, which might be the best way to detect a cancerous growth or other health issue, or to rely on cheaper but less reliable technology. In the absence of patent protection even the most cutting edge scans would be reasonably priced.
Health care is not the only area that would be transformed by a free market in technology and creative work. Imagine that all the textbooks needed by college students could be downloaded at no cost over the web and printed out for the price of the paper. Suppose that a vast amount of new books, recorded music, and movies was freely available on the web.
People or companies who create and innovate deserve to be compensated, but there is little reason to believe that the current system of patent and copyright monopolies is the best way to support their work. It's not surprising that the people who benefit from the current system are reluctant to have the efficiency of patents and copyrights become a topic for public debate, but those who are serious about inequality have no choice. These forms of property claims have been important drivers of inequality in the last four decades.
The explicit assumption behind the steps taken over the last four decades to increase the strength and duration of patent and copyright protection is that the higher prices resulting from increased protection will be more than offset by an increased incentive for innovation and creative work. Patent and copyright protection should be understood as being like very large tariffs. These protections can often raise the price of protected items to several multiples of the free market price, making them comparable to tariffs of several hundred or even several thousand percent. The resulting economic distortions are comparable to what they would be if we imposed tariffs of this magnitude.
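The tariff-equivalent framing can be made concrete with the Sovaldi figures quoted earlier in this page ($84,000 U.S. list price versus a $200 Indian generic). The numbers come from the article; treating them as a tariff calculation is just an illustrative sketch:

```python
# Tariff-equivalent of patent protection, using the Sovaldi figures
# quoted earlier ($84,000 US list price, $200 high-quality generic).
us_price = 84_000        # patent-protected US price, dollars
free_market_price = 200  # generic price, dollars

# A tariff of rate t (percent) raises a price p to p * (1 + t/100),
# so the implied tariff-equivalent rate is:
tariff_equivalent = (us_price - free_market_price) / free_market_price * 100
print(f"Implied tariff-equivalent: {tariff_equivalent:,.0f}%")  # 41,900%
```

For this extreme case the markup is equivalent to a tariff of tens of thousands of percent, well beyond the "several thousand percent" Baker cites for more typical drugs.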
The justification for granting these monopoly protections is that the increased innovation and creative work that is produced as a result of these incentives exceeds the economic costs from patent and copyright monopolies. However, there is remarkably little evidence to support this assumption. While the cost of patent and copyright protection in higher prices is apparent, even if not well-measured, there is little evidence of a substantial payoff in the form of a more rapid pace of innovation or more and better creative work....
Oct 30, 2016 | economistsview.typepad.com
anne, October 30, 2016 at 07:40 AM
http://cepr.net/publications/op-eds-columns/inequality-as-policy-selective-trade-protectionism-favors-higher-earners
October 27, 2016
Inequality As Policy: Selective Trade Protectionism Favors Higher Earners
By Dean Baker
Globalization and technology are routinely cited as drivers of inequality over the last four decades. While the relative importance of these causes is disputed, both are often viewed as natural and inevitable products of the working of the economy, rather than as the outcomes of deliberate policy. In fact, both the course of globalization and the distribution of rewards from technological innovation are very much the result of policy. Insofar as they have led to greater inequality, this has been the result of conscious policy choices.
Starting with globalization, there was nothing pre-determined about a pattern of trade liberalization that put U.S. manufacturing workers in direct competition with their much lower paid counterparts in the developing world. Instead, that competition was the result of trade pacts written to make it as easy as possible for U.S. corporations to invest in the developing world to take advantage of lower labor costs, and then ship their products back to the United States. The predicted and actual result of this pattern of trade has been to lower wages for manufacturing workers and non-college educated workers more generally, as displaced manufacturing workers crowd into other sectors of the economy.
Instead of only putting manufacturing workers into competition with lower-paid workers in other countries, our trade deals could have been crafted to subject doctors, dentists, lawyers and other highly-paid professionals to international competition. As it stands, almost nothing has been done to remove the protectionist barriers that allow highly-educated professionals in the United States to earn far more than their counterparts in other wealthy countries.
This is clearest in the case of doctors. For the most part, it is impossible for foreign-trained physicians to practice in the United States unless they have completed a residency program in the United States. The number of residency slots, in turn, is strictly limited, as is the number of slots open for foreign medical students. While this is a quite blatantly protectionist restriction, it has persisted largely unquestioned through a long process of trade liberalization that has radically reduced or eliminated most of the barriers on trade in goods. The result is that doctors in the United States earn an average of more than $250,000 a year, more than twice as much as their counterparts in other wealthy countries. This costs the country roughly $100 billion a year in higher medical bills compared to a situation in which U.S. doctors received the same pay as doctors elsewhere. Economists, including trade economists, have largely chosen to ignore the barriers that sustain high professional pay at enormous economic cost.
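Baker's roughly $100 billion figure is easy to sanity-check. The pay figures below come from the article; the physician count (about 900,000 active U.S. physicians) is an assumed round number of mine, not a number from the text:

```python
# Rough sanity check of the ~$100 billion excess-pay estimate.
# The physician count is an ASSUMED round figure, not from the article.
n_physicians = 900_000
us_avg_pay = 250_000    # average US physician pay (from the article)
peer_avg_pay = 125_000  # roughly half, per "twice as much as their counterparts"

excess = n_physicians * (us_avg_pay - peer_avg_pay)
print(f"Implied excess pay: ${excess / 1e9:.1f} billion per year")
```

Under these assumptions the excess comes to a bit over $110 billion a year, in the same ballpark as the article's "roughly $100 billion" claim.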
In addition to the items subject to trade, the overall trade balance is also very much the result of policy choices. The textbook theory has capital flowing from rich countries to poor countries, which means that rich countries run trade surpluses with poor countries. While this accurately described the pattern of trade in the 1990s up until the East Asian financial crisis (a period in which the countries of the region enjoyed very rapid growth), in the last two decades developing countries taken as a whole have been running large trade surpluses with wealthy countries.
This implies large trade deficits in rich countries, especially the United States, which in turn has meant a further loss of manufacturing jobs with the resulting negative impact on wage inequality. However, there was nothing inevitable about the policy shifts associated with the bailout from the East Asian financial crisis that led the developing world to become a net exporter of capital.
The pattern of gains from technology has been even more directly determined by policy than is the case with gains from trade. There has been a considerable strengthening and lengthening of patent, copyright, and related protections over the last four decades. The laws have been changed to extend patents to new areas such as life forms, business methods, and software. Copyright duration has been extended from 55 years to 95 years. Perhaps even more important, the laws have become much friendlier to holders of these property claims, tilting legal proceedings in their favor, with courts becoming more patent-friendly and penalties for violations becoming harsher. And the United States has placed stronger intellectual property (IP) rules at the center of every trade agreement negotiated in the last quarter century.
In this context, it would hardly be surprising if the development of "technology" was causing an upward redistribution of income. The people in a position to profit from stronger IP rules are almost exclusively the highly educated and those at the top end of the income distribution. It is almost definitional that stronger IP rules will result in an upward redistribution of income.
This upward redistribution could be justified if stronger IP rules led to more rapid productivity growth, thereby benefitting the economy as a whole. However, there is very little evidence to support that claim. Michele Boldrin and David Levine have done considerable research * on this topic and generally found the opposite. My own work, using cross-country regressions with standard measures of patent strength, generally found a negative and often significant relationship between patent strength and productivity growth.
There is also a substantial amount of money at stake. In the case of prescription drugs alone, the United States is on path to spend more than $430 billion in 2016 for drugs that would likely cost one-tenth of this amount in the absence of patent and related protections....
* http://levine.sscnet.ucla.edu/general/intellectual/againstfinal.htm
anne -> anne..., October 30, 2016 at 07:41 AM
January 2, 2008
Against Intellectual Monopoly
By Michele Boldrin and David K. Levine
It is common to argue that intellectual property in the form of copyright and patent is necessary for the innovation and creation of ideas and inventions such as machines, drugs, computer software, books, music, literature and movies. In fact intellectual property is a government grant of a costly and dangerous private monopoly over ideas. We show through theory and example that intellectual monopoly is not necessary for innovation and as a practical matter is damaging to growth, prosperity and liberty.
http://levine.sscnet.ucla.edu/general/intellectual/againstfinal.htm
In late 1764, while repairing a small Newcomen steam engine, the idea of allowing steam to expand and condense in separate containers sprang into the mind of James Watt. He spent the next few months in unceasing labor building a model of the new engine. In 1768, after a series of improvements and substantial borrowing, he applied for a patent on the idea, requiring him to travel to London in August. He spent the next six months working hard to obtain his patent. It was finally awarded in January of the following year. Nothing much happened by way of production until 1775. Then, with a major effort supported by his business partner, the rich industrialist Matthew Boulton, Watt secured an Act of Parliament extending his patent until the year 1800. The great statesman Edmund Burke spoke eloquently in Parliament in the name of economic freedom and against the creation of unnecessary monopoly – but to no avail. The connections of Watt's partner Boulton were too solid to be defeated by simple principle.
Once Watt's patents were secured and production started, a substantial portion of his energy was devoted to fending off rival inventors. In 1782, Watt secured an additional patent, made "necessary in consequence of ... having been so unfairly anticipated, by [Matthew] Wasborough in the crank motion." More dramatically, in the 1790s, when the superior Hornblower engine was put into production, Boulton and Watt went after him with the full force of the legal system.
During the period of Watt's patents the U.K. added about 750 horsepower of steam engines per year. In the thirty years following Watt's patents, additional horsepower was added at a rate of more than 4,000 per year. Moreover, the fuel efficiency of steam engines changed little during the period of Watt's patent; while between 1810 and 1835 it is estimated to have increased by a factor of five.
After the expiration of Watt's patents, not only was there an explosion in the production and efficiency of engines, but steam power came into its own as the driving force of the industrial revolution. Over a thirty year period steam engines were modified and improved as crucial innovations such as the steam train, the steamboat and the steam jenny came into wide usage. The key innovation was the high-pressure steam engine – development of which had been blocked by Watt's strategic use of his patent.
Many new improvements to the steam engine, such as those of William Bull, Richard Trevithick, and Arthur Woolf, became available by 1804: although developed earlier, these innovations were kept idle until the Boulton and Watt patent expired. None of these innovators wished to suffer the same fate as Jonathan Hornblower.
Ironically, not only did Watt use the patent system as a legal cudgel with which to smash competition, but his own efforts at developing a superior steam engine were hindered by the very same patent system he used to keep competitors at bay....
Feb 09, 2016 | www.nakedcapitalism.com
Carolinian:
Here's an important article about Pharma as the modern-day enclosure of the commons. It says patent protection for medicines was largely unknown until the middle of the 20th century and that big pharma has been the driving force behind international intellectual property agreements like the TPP. Just a sampler:
When governments outside the U.S. refused to block generic manufacturing, the pharmaceutical industry argued, they were indulging acts of piracy.
But there was little in the way of binding international law to back up that position. So the industry pushed directly for the U.S. government to make intellectual property protection a priority in all trade negotiations. Of course, inserting monopoly patent rights into trade agreements runs counter to those agreements' stated purpose of dismantling barriers to global competition. Yet the pharmaceutical industry, reliably at the top of the list in both lobbying expenditures and political campaign contributions in the United States, quickly found willing partners on Capitol Hill and in the White House. The U.S. soon adopted intellectual property protection as a litmus test for its trade partners.
The approach was to offer carrots to patent-resistant countries - enhanced access to U.S. markets and some reductions in the subsidies of U.S. agricultural exports - while simultaneously brandishing some imposing sticks. In 1984, aggressive pharmaceutical sector lobbying helped amend the U.S. Trade Act to give the president the authority to impose duties on or withdraw trade benefits from any nation that did not provide "adequate and effective" protection for U.S. intellectual property.
In short, pharma greed and giant profits could be driving our whole trade policy. The below is relevant to much of what gets discussed around here (but not today's Topic A, I'll admit).
Chris in Paris: Molecules were not covered by patent until the mid-20th century in France, for example. The TRIPS agreement was the real coup that got Pharma the international IP protection they desperately wanted. India was smart enough to get an opt-out.
March 01 | Slashdot
Bennett Haselton writes "The U.S. government recently announced that academic papers on federally-funded research should become freely available online within one year of publication in a journal. But the real question is why academics don't simply publish most papers freely anyway. If the problem is that traditional journals have a monopoly on the kind of prestige that can only be conferred by having your paper appear in their hallowed pages, that monopoly can easily be broken, because there's no reason why open-access journals can't confer the same imprimatur of quality."
Read on for the rest of Bennett's thoughts on the great free-access debate.
Aug 15, 2001 | Sourceware.com
... And now for some not so nice things.
Stallman recently tried what I would call a hostile takeover of the glibc development. He tried to conspire behind my back and persuade the other main developers to take control, so that in the end he would be in control and could dictate whatever pleases him. This attempt failed, but he kept on pressuring people everywhere and it got really ugly. In the end I agreed to the creation of a so-called "steering committee" (SC). The SC is different from the SC in projects like gcc in that it does not make decisions. On this front nothing changed. The only difference is that Stallman now has no right to complain anymore, since the SC he wanted acknowledged the status quo. I hope he will now shut up forever.
The moral of this is that people will hopefully realize what a control freak and raging maniac Stallman is. Don't trust him. As soon as something isn't in line with his views he'll stab you in the back. *NEVER* voluntarily put a project you work on under the GNU umbrella, since in Stallman's opinion this means he has the right to make decisions for the project.
The glibc situation is even more frightening if one realizes the story behind it. When I started porting glibc 1.09 to Linux (which eventually became glibc 2.0), Stallman threatened me and tried to force me to contribute instead to the work on the Hurd; work on Linux would be counter-productive to the Free Software cause. Then came what would be called embrace-and-extend if performed by the Evil of the North-West, and his claim to everything which led to Linux's success.
Which brings us to the second point. One change the SC forced to happen against my will was to use LGPL 2.1 instead of LGPL 2. The argument was that the poor lawyers cannot see that LGPL 2 is sufficient. Guess who were the driving forces behind this. The most remarkable thing is that Stallman was all for this despite the clear motivation of commercialization. The reason: he finally got the provocative changes he made to the license through. In case you forgot or haven't heard, here's an excerpt: [...] For example, permission to use the GNU C Library in non-free programs enables many more people to use the whole GNU operating system, as well as its variant, the GNU/Linux operating system. This $&%$& demands everything to be labeled in a way which credits him, and he does not stop before making completely wrong statements like "its variant". I find this completely unacceptable and can assure everybody that I consider none of the code I contributed to glibc (which is quite a lot) to be part of the GNU project, and so a major part of what Stallman claims credit for is simply going away.
This part has a morale, too, and it is almost the same: don't trust this person. Read the licenses carefully and rip out parts which give Stallman any possibility to influence your future. Phrases like
[...] GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version.
just invite him to screw you when it pleases him. Rip out the "any later version" part and make your own decisions about when to use a different license, since otherwise he can potentially do you or your work harm.
In case you are interested in why the SC could make this decision, I'll give a bit more background. When this SC idea came up I wanted to fork glibc (out of Stallman's control) or resign from any work. The former was not welcome since it was feared to cause fragmentation. I didn't agree, but if nobody would use a fork it's of no use. There also wasn't much interest in me resigning, so we ended up with the SC arrangement where the SC does nothing except the things I am not doing myself at all: handling political issues. All technical discussion happens as before on the mailing list of the core developers, and I reserve the right of final decision.
The LGPL 2.1 issue was declared political and therefore in scope for the SC. I didn't feel this was reason enough to leave the project for good, so I tolerated the changes, especially since at the time I hadn't realized the mistake in the wording of the copyright statements, which allows applying later license versions.
I cannot see this repeating, though. Despite what Stallman believes, maintaining a GNU project is *NOT* a privilege. It's a burden, and the bigger the project the bigger the burden. I have no interest in letting somebody else tell me what to do and what not to do in my free time. There are plenty of other interesting things to do, and I'll immediately walk away from glibc if I see a situation like this coming up again. I will always be able to fix my own system (and, if the company I work for wants it, their systems).
Jul 12, 2012 | The Atlantic
After dismissing a high-profile suit between Apple and Motorola, one of our leading jurists discusses the problems plaguing America's intellectual property system.
Recently, while sitting as a trial judge, I dismissed a case in which Apple and Motorola had sued each other for alleged infringement of patents for components of smartphones. My decision undoubtedly will be appealed, and since the case is not yet over with, it would be inappropriate for me to comment publicly on it.
But what I am free to discuss are the general problems posed by the structure and administration of our current patent laws, a system that warrants reconsideration by our public officials.*
U.S. patent law confers a monopoly (in the sense of a right to exclude competitors), generally for 20 years, on an invention that is patented, provided the patent is valid -- that is, that it is genuinely novel, useful, and not obvious. Patents are granted by the Patent and Trademark Office and are presumed valid. But their validity can be challenged in court, normally by way of defense by a company sued by a patentee for patent infringement.
With some exceptions, U.S. patent law does not discriminate among types of inventions or particular industries. This is, or should be, the most controversial feature of that law. The reason is that the need for patent protection in order to provide incentives for innovation varies greatly across industries.
The prime example of an industry that really does need such protection is pharmaceuticals. The reasons are threefold. First, the invention of a new drug tends to be extremely costly--in the vicinity of hundreds of millions of dollars. The reason is not so much the cost of inventing as the cost of testing the drug on animal and human subjects, which is required by law in order to determine whether the drug is safe and efficacious and therefore lawful to sell. Second, and related, the patent term begins to run when the invention is made and patented, yet the drug testing, which must be completed before the drug can be sold, often takes 10 or more years. This shortens the effective patent term, which is to say the period during which the inventor tries to recoup his investment by exploiting his patent monopoly of the sale of the drug. The delay in beginning to profit from the invention also reduces the company's recoupment in real terms, because dollars received in the future are worth less than dollars received today. And third, the cost of producing, as distinct from inventing and obtaining approval for selling, a drug tends to be very low, which means that if copying were permitted, drug companies that had not incurred the cost of invention and testing could undercut the price charged by the inventing company yet make a tidy profit, and so the inventing company would never recover its costs.
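Posner's two pharmaceutical points, the shortened effective term and the discounting of future dollars, can be illustrated with a toy present-value calculation. All numbers below (20-year term, 10-year testing delay, 8 percent discount rate) are illustrative assumptions of mine, not figures from the essay:

```python
# Toy illustration: testing delay shortens the effective patent term,
# and discounting shrinks its value further. All figures are
# illustrative assumptions, not from the article.
patent_term = 20      # statutory term, years
testing_delay = 10    # years of trials before sales can begin
discount_rate = 0.08  # assumed annual discount rate

# Present value (at the filing date) of $1 per year of monopoly profit
# over each year the drug can actually be sold.
pv_effective = sum(1 / (1 + discount_rate) ** t
                   for t in range(testing_delay, patent_term))
# For comparison: the same $1 per year over the full 20-year term.
pv_full_term = sum(1 / (1 + discount_rate) ** t
                   for t in range(patent_term))

print(f"Effective years of sales: {patent_term - testing_delay}")
print(f"PV with testing delay:    {pv_effective:.2f} (per $1/year of profit)")
print(f"PV over full term:        {pv_full_term:.2f}")
```

Under these assumptions the delay cuts the selling window in half, but discounting cuts the present value of the monopoly to roughly a third of the undelayed figure, which is Posner's point about why drug development is especially dependent on patent protection.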
So pharmaceuticals are the poster child for the patent system. But few industries resemble pharmaceuticals in the respects that I've just described. In most, the cost of invention is low; or just being first confers a durable competitive advantage because consumers associate the inventing company's brand name with the product itself; or just being first gives the first company in the market a head start in reducing its costs as it becomes more experienced at producing and marketing the product; or the product will be superseded soon anyway, so there's no point to a patent monopoly that will last 20 years; or some or all of these factors are present. Most industries could get along fine without patent protection.
I would lay particular stress on the cost of invention. In an industry in which teams of engineers are employed on a salaried basis to conduct research on and development of product improvements, the cost of a specific improvement may be small, and when that is true it is difficult to make a case for granting a patent. The improvement will be made anyway, without patent protection, as part of the normal competitive process in markets where patents are unimportant. It is true that the easier it is to get a patent, the sooner inventions will be made. But "patent races" (races, induced by hope of obtaining a patent, to be the first with a product improvement) can result in excessive resources being devoted to inventive activity. A patent race is winner take all. The firm that makes an invention and files for a patent one day before its competitors reaps the entire profit from the invention, though the benefit to consumers of obtaining the product a day earlier may be far less than the cost of the accelerated invention process.
Moreover, a firm that can get along without patent protection may have compelling reasons to oppose such protection because of fear of how its rivals may use it against the firm. A patent blocks competition within the patent's scope and so if a firm has enough patents it may be able to monopolize its market. This prospect gives rise to two wasteful phenomena:
- defensive patenting and
- patent trolls.
Defensive patenting means getting a patent not because you need it to prevent copycats from making inroads into your market, but because you want to make sure that you're not accused of infringing when you bring your own product to market. The cost of patenting and the cost of resolving disputes that may arise when competitors have patents are a social waste.
Patent trolls are companies that acquire patents not to protect their market for a product they want to produce -- patent trolls are not producers -- but to lay traps for producers, for a patentee can sue for infringement even if it doesn't make the product that it holds a patent on.
These problems are aggravated by several additional factors. One is that the Seventh Amendment to the U.S. Constitution confers a right to a jury trial in cases in federal court if the plaintiff is asking for an award of money damages, as plaintiffs in patent infringement suits normally do. Judges have difficulty understanding modern technology and jurors have even greater difficulty, yet patent plaintiffs tend to request trial by jury because they believe that jurors tend to favor patentees, believing that they must be worthy inventors defending the fruits of their invention against copycats -- even though, unlike the rule in copyright law, a patentee need not, in order to prevail in an infringement suit, show that the defendant knew he was infringing. This problem is exacerbated by the fact that in some industries it is very difficult to do a thorough search of patent records to discover whether you may be infringing someone's patent; and even if doable, the search may be very expensive. Notice too--an independent problem with current patent law -- that difficulties of search, and the prospect of incurring litigation costs to defend an infringement suit, may actually discourage innovation.
Another troublesome factor is that the Patent and Trademark Office is seriously understaffed. As a result, many patent examinations are perfunctory, and there is a general concern that too many patents are being issued, greatly complicating the problems I've been discussing. There is now a three-year backlog in the office--a three-year delay on average between the filing of a patent application and the decision by a patent examiner on whether to grant the application.
There are a variety of measures that could be taken to alleviate the problems I've described. They include: reducing the patent term for inventors in industries that do not have the peculiar characteristics of pharmaceuticals that I described; instituting a system of compulsory licensing of patented inventions; eliminating court trials including jury trials in patent cases by expanding the authority and procedures of the Patent and Trademark Office to make it the trier of patent cases, subject to limited appellate review in the courts; forbidding patent trolling by requiring the patentee to produce the patented invention within a specified period, or lose the patent; and (a reform now beginning) providing special training for federal judges who volunteer to preside over patent litigation.
I am not enough of an expert in patent law to come down flatly in favor of any of the reforms that I have listed. I wish merely to emphasize that there appear to be serious problems with our patent system, but almost certainly effective solutions as well, and that both the problems and the possible solutions merit greater attention than they are receiving.
*This issue is separate from what is presented to a court in a patent case. Lawsuits are governed by existing law as interpreted by the Supreme Court and the U.S. Court of Appeals for the Federal Circuit, which has (under the Supreme Court) exclusive jurisdiction of appeals in patent cases.
Marian the Librarian writes "UCSF is among the first public institutions to adopt an open access policy, and is the largest scientific institution to have such a policy. The policy, voted unanimously by the faculty, will allow UCSF authors to put electronic versions of their published scientific articles on an open access repository making their research findings freely available to the public. Dr. Richard A. Schneider, who led the initiative, said, 'Our primary motivation is to make our research available to anyone who is interested in it, whether they are members of the general public or scientists without costly subscriptions to journals. The decision is a huge step forward in eliminating barriers to scientific research.'"
GPL violations are a dime a dozen. Some are intentional, some are not - but I don't think I've ever seen one quite as surprising as this one. Yes, Richard Stallman has sent out a note letting everybody know that the 23.2 and 23.3 releases of GNU Emacs are in violation of the GPL. Says Stallman, "We have made a very bad mistake. Anyone redistributing those versions is violating the GPL, through no fault of his own."
You read that right - GNU Emacs, possibly the most GPLish of GPL'ed programs, has a GPL violation. The specifics as reported by David Kastrup are that Emacs includes a handful of "binary blobs" related to a Collection of Emacs Development Environment Tools (CEDET). We're talking maybe eight files that were autogenerated from Bison grammar files, and the Bison grammar files weren't distributed. Therein lies the GPL violation.
It also means that every distribution or downstream distributor of GNU Emacs 23.2 and 23.3 is, technically and unwittingly, violating the GPL.
The central issue is this: how can we make books and articles - not just snippets, but entire works - available to everyone, while preserving the rights of the works' creators? To answer that, of course, we need to decide what those rights are. Just as inventors are given patents so that they can profit from their inventions for a limited time, so, too, authors were originally given copyright for a relatively short period - in the U.S., it was initially only 14 years from the first publication of the work.
For most authors, that would be enough time to earn the bulk of the income that they would ever receive from their writings; after that, the works would be in the public domain. But corporations built fortunes on copyright and repeatedly pushed Congress to extend it, to the point that in the U.S. it now lasts for 70 years after the creator's death. (The 1998 legislation responsible for the last extension was nicknamed the "Mickey Mouse Protection Act" because it allowed the Walt Disney Company to retain copyright of its famous cartoon character.)
It is because copyright lasts so long that as many as three-quarters of all library books are "orphaned." This vast collection of knowledge, culture, and literary achievement is inaccessible to most people. Digitizing it would make it available to anyone with Internet access. As Peter Brantley, director of technology for the California Digital Library, has put it: "We have a moral imperative to reach out to our library shelves, grab the material that is orphaned, and set it on top of scanners."
Robert Darnton, director of the Harvard University Library, has proposed an alternative to Google's plans: a digital public library, funded by a coalition of foundations, working in tandem with a coalition of research libraries. Darnton's plan falls short of a universal library, because works in print and in copyright would be excluded; but he believes that Congress might grant a non-commercial public library the right to digitize orphan books.
That would be a huge step in the right direction, but we should not give up the dream of a universal digital public library. After all, books still in print are likely to be the ones that contain the most up-to-date information, and the ones that people most want to read.
Many European countries, as well as Australia, Canada, Israel, and New Zealand, have adopted legislation that creates a "public lending right" - that is, the government recognizes that enabling hundreds of people to read a single copy of a book provides a public good, but that doing so is likely to reduce sales of the book. The universal public library could be allowed to digitize even works that are in print and in copyright, in exchange for fees paid to the publisher and author based on the number of times the digital version is read.
If we can put a man on the moon and sequence the human genome, we should be able to devise something close to a universal digital public library. At that point, we will face another moral imperative, one that will be even more difficult to fulfill: expanding Internet access beyond the less than 30 percent of the world's population that now has it.

Peter Singer is a professor of bioethics at Princeton University and Laureate Professor at the University of Melbourne. His most recent book is "The Life You Can Save." © Project Syndicate, 2011.
snydeq writes "Savio Rodrigues sheds light on the limitations open source software faces in app stores, a problem that will only increase as the app store model proliferates. 'In effect, in the context of a GPLv2 license, an Apple App Store item that abides by Apple's terms of service is deemed to be restricting usage and imposing further limitation on usage rights than were envisioned by the original licensor of the open source code,' Rodrigues writes. 'Far from being an abstract example, this situation is precisely why the popular VLC media player was removed from the App Store.' Microsoft, for its part, disallows the use of GPLv2 altogether. 'With the vast amount of GPLv2 code available for use, the incompatibility between the App Store's (and Windows Marketplace's) terms of service on one hand and GPLv2 on the other is a problem in need of a fix.'"
yakatz: Can you get the source via the App Store? If not, it's a violation of the GPL.
They have to offer you the source via exactly the same means as the binaries.
http://www.gnu.org/licenses/old-licenses/gpl-2.0.html [gnu.org]

3. You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following:
- a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or,
- b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or,
- c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.)
Microlith: Re: Wrong, two other ways
The first way is jailbreaking; but let's ignore that for the moment.
Yes, let's. Because it forces you to violate an EULA you agree to when you start using the device. You shouldn't be forced to violate a contract (of any kind) to be free to do as you wish.
The second way is compiling and installing yourself. Which is something you would be able to do with the developer tools, which you would need anyway once you get access to the source.
The developer tools by themselves do not let you load anything onto the device. You must pay the yearly $99 fee to load your own builds onto your phone, and even then it is a limited "beta" signature that will eventually expire (90 days, I believe). So even then Apple is placing restrictions on your use of the software.
Anyone who can make use of the source can also get a build onto the device, in two different but equally effective ways.
I hardly call forcing people to violate an EULA, and forcing them to pay $99 to load software they compile themselves on a device they own "effective" or even remotely reasonable.
Anthony Mouse: Re:Limited problem.
I thought you could build a fully open source Android? The Cyanogen wiki says it is GPL and Apache licensed (which I assume means that some bits are GPL and others are Apache).
Right, but that's not what you get on your phone when you go to Verizon and ask them for a Droid, because Motorola has made changes (presumably to the Apache licensed bits) which they don't publish, and have everything set up to prevent you from changing things as you like.
I think what I'd really like to see is something like the Nexus line, but available on all carriers and capable of running apt-get and anything else in the GNU userland.
Red Hat has decided to release its kernel source with all patches pre-applied, to make it difficult for its competitors to lure away its customers.
Brian Stevens, the chief technical officer of the company, who offered this explanation towards the end of an eight-paragraph blog entry attempting to explain the sudden change, did not respond directly to questions sent to the company by iTWire.
As reported in these columns, the change was first publicised in an interview given by Maximilian Attems, a member of the Debian kernel team. It received further publicity on the Linux Weekly News website.
A number of other publications have since reported the issue.
Stevens spoke to the British technology website, The Register, and named the two competitors for whom Red Hat wanted to make life difficult - Oracle and Novell.
However, Stevens said nothing about the fact that Red Hat could well be violating the GNU General Public Licence under which the Linux kernel is released.
The GPL specifies that additional restrictions cannot be placed on redistribution of GPL-ed sources. Red Hat's policy now is to cancel the support contract of any subscriber who redistributes the source of its kernel.
In his blog entry, Stevens said Red Hat's competitors had now changed their tactics and, instead of offering their own GNU/Linux distributions as alternatives, were trying to lure away Red Hat's customers by offering to support Red Hat's own enterprise distribution, Red Hat Enterprise Linux.
"Frankly, our response is to compete," Stevens wrote. "Essential knowledge that our customers have relied on to support their RHEL environments will increasingly only be available under subscription.
"The itemization (sic) of kernel patches that correlate with articles in our knowledge base is no longer available to our competitors, but rather only to our customers who have recognized (sic) the value of RHEL and have thus indirectly funded Red Hat's contributions to open source that will advance their business now and in the future."
Red Hat's tactic does not prevent either Oracle or Novell from taking out a single subscription each to RHEL and profiting from all the work done by Red Hat.
Back in May, GCC developer Mark Mitchell started a discussion on the topic of documentation. As the GCC folks look at documenting new infrastructure - plugin hooks, for example - they would like to be able to incorporate material from the GCC source directly into the manuals. It seems like an obvious idea; many projects use tools like Doxygen to just that end. In the GCC world, though, there is a problem: the GCC code carries the GPLv3 license, while the documents are released under the GNU Free Documentation License (GFDL). The GFDL is unpopular in many quarters, but the only thing that matters with regard to this discussion is that the GFDL and the GPL are not compatible with each other. So incorporating GPLv3-licensed code into a GFDL-licensed document and distributing the result would be a violation of the GPL.
After some further discussion, Mark was able to get a concession from Richard Stallman on this topic: "If Texinfo text is included in the .h files specifically to be copied into a manual, it is ok for you to copy that text into a manual and release the manual under the GFDL."
This is a severely limited permission in a number of ways. To begin with, it applies only to comments in header files; the use of more advanced tools to generate documentation from the source itself would still be a problem. But there is another issue: this permission only applies to FSF-owned code. As Mark put it: "However, if I changed the code, but did not regenerate the docs, and you then picked up my changes, possibly made more of your own, and then regenerated the docs, *you* would be in breach. (Because my changes are only available to you under the GPL; you do not have the right to relicense my changes under the GFDL.)"
I find that consequence undesirable. In particular, what I did is OK in that scenario, but suddenly, now, you are a possibly unwitting violator.
Dave Korn described this situation as being "laden with unforeseen potential booby-traps" and suggested that it might be better to just give up on generating documentation from the code. The conversation faded away shortly thereafter; it may well be that this idea is truly dead.
One might poke fun at the FSF for turning a laudable goal (better documentation) into a complicated and potentially hazardous venture. But the real problem is that we as a community lack a copyleft license that works well for both code and text. About the only thing that even comes close to working is putting the documentation under the GPL as well, but the GPL is a poor fit for text. Nonetheless, it may be the best we have in cases where GPL-licensed code is to be incorporated into documentation.
Ian Lance Taylor recently described a problem which will be familiar to many developers in growing projects: "The gcc project currently has a problem: when people who are not regular gcc developers send in a patch, those patches often get dropped. They get dropped because they do not get reviewed, and they get dropped because after review they do not get committed. This discourages new developers and it means that the gcc project does not move as fast as it could."
He also noted that a contributor who goes by the name NightStrike had offered to build a system which would track patches and help ensure that they are answered; think of it as a sort of virtual Andrew Morton. This system was never implemented, though, and it doesn't appear that it will be. The reason? The GCC Powers That Be were unwilling to give NightStrike access to the project's infrastructure without knowing something about the person behind the name. As described by Ian, the project's reasoning would seem to make some sense: "Giving somebody a shell account on gcc.gnu.org means giving them a very high level of trust. There are quite a few people who could translate a shell account on gcc.gnu.org into a number of difficult-to-detect attacks on the entire FLOSS infrastructure, including the kernel, the source code control systems, etc. It's hard for us to get to the required level of trust in somebody whom we have never met and who won't provide any real world contact information."
NightStrike, who still refuses to provide that information, was unimpressed: "What you guys need to realize is that if I did just make something up, there wouldn't be an issue. Your policies are vintage computer security circa 1963. That's what's so darn frustrating about this whole entire thing. You don't have any actual security, but yet you think I'm going to try to bring down everything GNU. That's just awesome."
Awesome or not, this episode highlights a real problem that we have in our community. We place a great deal of trust in the people whose code we use and we place an equal amount of trust in the people who work with the infrastructure around that code. The potential economic benefits of abusing that trust could be huge; it's surprising that we have seen so few cases of that happening so far. So it makes sense that a project would want to know who it is taking code from and who it is letting onto its systems. To do anything else looks negligent.
But what do we really know about these people? In many projects, all that is really required is to provide a name which looks moderately plausible. Debian goes a little further by asking prospective maintainers to submit a GPG key which has been signed by at least one established developer. But, in general, it is quite hard to establish that somebody out there on the net is who he or she claims to be. Much of what goes on now - turning away obvious pseudonyms but accepting names that look vaguely real, for example - could well be described as a sort of security theater. The fact that Ian thanked NightStrike for not making up a name says it all: the project is turning away contributors who are honest about their anonymity, but it can do little about those who lie.
Fixes for this problem will not be easy to come by. Attempts to impose identity structures on the net - as the US is currently trying to do - seem likely to create more problems than they solve, even if they can be made to work on a global scale. What we really need are processes which are robust in the presence of uncertain identity. Peer review of code is clearly one such process, as is better peer review of the development and distribution chain in general. Distributed version control systems can make repository tampering nearly impossible. And so on. But no solution is perfect, and these concerns will remain with us for some time. So we will have to continue to rely on feeling that, somehow, we know the people we are trusting our systems to.
McKusick on SCO's latest copyright claims
December 29, 2003
NewsForge asked longtime Unix and BSD guru Kirk McKusick [ http://www.mckusick.com/ ], who has intimate knowledge of the original AT&T versus BSD legal battles over Unix source code in the early 1990s, to comment on SCO's recent claims of copyright infringement in Linux. McKusick says he believes Torvalds when he says he did not copy the files in question, but notes that may not be the real issue here. McKusick also questions whether the GPL license could be applied to the code even if the requisite copyright notices had appeared.
SCO's letter of warning to certain Linux users claims that "Certain copyrighted application binary interfaces ("ABI Code") have been copied verbatim from our copyrighted UNIX code base and contributed to Linux for distribution under the General Public License ("GPL") without proper authorization and without copyright attribution."
Here is McKusick's reaction to the SCO claims as evidenced in SCO's letter [ http://lwn.net/Articles/64052/ ] as published by LWN:

The argument that SCO is making is that they own the ABI, for example that EPERM (Operation not permitted) will have the value 1 (as defined in /usr/include/errno.h). AT&T made the same argument with BSD. In the end we argued that these interfaces had been distributed without copyright notices in *all* binary distributions which were available without signing non-disclosure agreements. The conclusion was that Berkeley could distribute these files with the following notice:
/*
* Copyright (c) 1982, 1986, 1989, 1993
* The Regents of the University of California. All rights reserved.
* (c) UNIX System Laboratories, Inc.
* All or some portions of this file are derived from material licensed
* to the University of California by American Telephone and Telegraph
* Co. or Unix System Laboratories, Inc. and are reproduced herein with
* the permission of UNIX System Laboratories, Inc.
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. All advertising materials mentioning features or use of this software
* must display the following acknowledgement:
* This product includes software developed by the University of
* California, Berkeley and its contributors.
* 4. Neither the name of the University nor the names of its contributors
* may be used to endorse or promote products derived from this software
* without specific prior written permission.
* THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
* OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
* OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
* SUCH DAMAGE.
*/
Linus Torvalds says that he typed in these files from scratch and I believe that he did. Unfortunately, that does not get him out of the ABI argument because, by necessity, he had to use the same names and values. Thus, I am guessing that Linux can distribute these files, but only with the above notice. It is not clear to me whether it would be permissible to also add the GPL to the above set of notices.
The tangled web of translucent IP claims spun by SCO is further complicated by the fact that they may not even own the copyrights they claim have been infringed upon. Novell, the company from whom SCO bought Unix, disputes SCO's ownership of those copyrights. Both Novell and SCO have filed for those copyrights [ http://www.fortwayne.com/mld/journalgazette/7574560.htm ] during the past year.
Salt Lake Tribune
The trial involves the 2004 lawsuit SCO filed after Novell said it owned the copyrights. That was important because SCO had sued IBM the year before, and it could not hope to win if its ownership of the computer code was in doubt. SCO claims that IBM and Novell have collaborated to press Novell's claim to ownership as a strategic business decision. Both were getting heavily into Linux, a rival operating system into which SCO says IBM had improperly placed Unix code.
The opening for Novell's claim is the language of the 1995 sales contract between Novell and the Santa Cruz Operation. At one point the text says that "copyrights" were not sold.
But SCO has brought to the stand -- or played videotaped testimony from -- a number of former officials from Novell and Santa Cruz. They say the intention of the deal was to sell the copyrights -- and that it makes no sense without them.
That point was bolstered again on Friday. William Broderick, the former contracts manager for Novell and Santa Cruz who now works for SCO, said without the copyrights, "that would have destroyed our business."
But SCO has further argued that even if that language is ambiguous, it was quickly superseded by Amendment No. 2 to the sales contract, the Asset Purchase Agreement.
Broderick, describing himself as a "contracts guy," erupted when Novell attorney Eric Acker pressed him on what the original sales agreement actually says.
"It makes no sense to use that language," Broderick said. "It does not exist in the APA. It was replaced by Amendment No. 2."
That type of questioning also brought a sharp reaction from another SCO witness, Ty Mattingly, who was on the Novell team that negotiated the Santa Cruz deal.
"You can't take a rifle shot of a thing and badger me with it," he told attorney Sterling Brennan.
That line of questioning has been one of Novell's main areas of attack against SCO's evidence during the week.
Novell's questioning in cross examinations of witnesses has focused on the intent of the contract based on the original language, over why the board of directors appeared to hold back the copyrights, and whether the lawyers who drafted the language had been trusted to reflect the deal's intent.
Novell attorneys also have pointed out that at least three of the former Novell officials have a financial stake in SCO, suggesting that tainted their testimony.
Novell is scheduled to begin presenting its case next week.
anonymous-insider: 3/12/2010 8:51:00 PM
The intent behind the transfer of copyrights to SCO is easily verifiable from 1996 through 2002, when the PDP Unix Preservation Society (PUPS) negotiated licenses for 'ancient' Unix source code with SCO.
The most important negotiations for the ancient Unix source code licenses are documented in this Web page:
There's even an interesting e-mail from Mike Tilson, a VP at SCO, to PUPS, dated 12 Aug 1996. This is a copy of the original email:
Mike Tilson wrote in part: "The UNIX intellectual property has great value (SCO gave up nearly 20% of its equity plus future cash payments to obtain it.)"
There's a copy of the whole PUPS mail archive at:
You may find key documents as relevant as Tilson's message in the PUPS mail archive.
An important petition for the licenses was signed in 1997-1998 by industry insiders and it's available at
The petition process ended in early 2002 when SCO released 'ancient' versions of Unix under a BSD type license.
During the whole process of obtaining ancient Unix source code licenses from 1996 through 2002, all negotiations were conducted through SCO, not Novell. During all this time nobody challenged the fact that SCO owned the Unix copyrights.
Interest in ancient Unix source code licenses decreased in 2000, when Sun Microsystems released the Solaris 8 source code to the public. The code was made easily available on the Internet in late 2000. The release of the Solaris 8 source code is documented at:
Please note I'm an independent researcher. I'll be glad to give you my real identity (not a la PJ) and talk to you over the phone if required. Best regards.
anonymous-insider: 3/13/2010 10:03:00 AM
A future Novell employee compared SCO to OJ in 2003. Bill Claybrook wrote the following while working for Aberdeen Group:
"Some day soon, let us hope the SCO-IBM lawsuit will end. The terror tactics of SCO will be over, and Linux and Microsoft Windows can fight it out in the marketplace. While there are many possible endings, don't be surprised if the unexpected happens. Remember OJ won his case. For that reason, we have to be prepared for anything."
Did IBM pay Aberdeen Group for the 2003 attack on SCO?
Bill Claybrook wrote plenty of anti-SCO propaganda and was then hired by Novell in late 2004 as Linux Product Marketing Manager. Here are some of the things Claybrook wrote:
It's unbelievable Novell hired Claybrook after all the things he wrote against SCO.
Pamela Jones then wrote in early 2005:
"I stopped in at LinuxWorld just for a quick look around today... First, I met Bill Claybrook, which was certainly an honor. I learned he works for Novell now. I hope he will write something for Groklaw some time..."
Did Claybrook write articles for Groklaw?
For unknown reasons Bill Claybrook quit Novell in mid 2009.
Most anti-SCO criticism is done by anonymous individuals whose real and obscure motives remain undisclosed.
Other anti-SCO criticism is done by individuals who are associated with dubious entities like the Free Software Foundation (FSF).
Individuals like Peter Salus have provided erroneous accounts of how previous Unix copyright holders conducted business in the 80s:
See item 7 on Salus' declaration.
Open source software leaders have spoken about the dubious tactics of the Free Software Foundation too:
The video content at that link, starting at minute 1:25, is concise about the tactics of the Free Software Foundation and its 'wholesale attack on intellectual property rights'.
This talk by Eben Moglen* from the Free Software Foundation provides evidence on the 'wholesale attack on intellectual property rights':
Here is more stuff about the tactics of the FSF:
*Moglen has coordinated for the Free Software Foundation a public relations attack on SCO since 2003:
What I Couldn't Say…
I feel for Google – Steve Jobs threatened to sue me, too.
In 2003, after I unveiled a prototype Linux desktop called Project Looking Glass*, Steve called my office to let me know the graphical effects were "stepping all over Apple's IP." (IP = Intellectual Property = patents, trademarks and copyrights.) If we moved forward to commercialize it, "I'll just sue you."
My response was simple. "Steve, I was just watching your last presentation, and Keynote looks identical to Concurrence - do you own that IP?" Concurrence was a presentation product built by Lighthouse Design, a company I'd helped to found and which Sun acquired in 1996. Lighthouse built applications for NeXTSTEP, the Unix-based operating system whose core would become the foundation for all Mac products after Apple acquired NeXT in 1996. Steve had used Concurrence for years, and as Apple built their own presentation tool, it was obvious where they'd found inspiration. "And last I checked, MacOS is now built on Unix. I think Sun has a few OS patents, too." Steve was silent.
And that was the last I heard on the topic. Although we ended up abandoning Looking Glass, Steve's threat didn't figure into our decision; the last thing enterprises wanted was a new desktop (in hindsight, exactly the wrong audience to poll; we should've been asking developers, not CIOs).
Bluster and Threat (Often Credible)
As in life, bluster and threat are commonplace in business – especially the technology business. So that interaction was good preparation for a later meeting with Bill Gates and Steve Ballmer. They'd flown in over a weekend to meet with Scott McNealy, Sun's then CEO – who asked me and Greg Papadopoulos (Sun's CTO) to accompany him. As we sat down in our Menlo Park conference room, Bill skipped the small talk, and went straight to the point, "Microsoft owns the office productivity market, and our patents read all over OpenOffice." OpenOffice is a free office productivity suite found on tens of millions of desktops worldwide. It's a tremendous brand ambassador for its owner – it also limits the appeal of Microsoft Office to businesses and those forced to pirate it. Bill was delivering a slightly more sophisticated variant of the threat Steve had made, but he had a different solution in mind. "We're happy to get you under license." That was code for "We'll go away if you pay us a royalty for every download" – the digital version of a protection racket.
Royalty bearing free software? Jumbo shrimp. (Oxymoron.)
But fearing this was on the agenda, we were prepared for the meeting. Microsoft is no stranger to imitating successful products, then leveraging their distribution power to eliminate a competitive threat – from tablet computing to search engines, their inspiration is often obvious (I'm trying to like Bing, I really am). So when they created their web application platform, .NET, it was obvious their designers had been staring at Java – which was exactly my retort. "We've looked at .NET, and you're trampling all over a huge number of Java patents. So what will you pay us for every copy of Windows?" Bill explained the software business was all about building variable revenue streams from a fixed engineering cost base, so royalties didn't fit with their model… which is to say, it was a short meeting.
I understand the value of patents – offensively and, more importantly, for defensive purposes. Sun had a treasure trove of some of the internet's most valuable patents – ranging from search to microelectronics – so no one in the technology industry could come after us without fearing an expensive counter assault. And there's no defense like an obvious offense.
But for a technology company, going on offense with software patents seems like an act of desperation, relying on the courts instead of the marketplace. See Nokia's suit against Apple for a parallel example of frivolous litigation – it hasn't slowed iPhone momentum (I'd argue it accelerated it). So I wonder who will be first to claim Apple's iPad is stepping on their IP… perhaps those that own the carcass of the tablet computing pioneer Go Corp.? Except that would be AT&T. Hm.
Having watched this movie play out many times, suing a competitor typically makes them more relevant, not less. Developers I know aren't getting less interested in Google's Android platform, they're getting more interested – Apple's actions are enhancing that interest.
Sun was sued numerous times – most big companies are sued almost constantly by entities or actors whose sole focus is suing others. Groups with no business focus other than litigating patent suits are affectionately known as trolls – pure litigation entities. (For good humor, read this, an application to patent the act of trolling. If granted, it would give the patent holder a reciprocal claim against a patent troll.)
The most egregious of such suits was filed against Sun by Kodak (yes, the film photography people).
Egregious, because Kodak had acquired a patent from a defunct computer maker (Wang) for the exclusive purpose of suing Sun over an esoteric technology, Java Remote Method Invocation ("Java RMI" – not exactly the first thing that comes to mind when you hear "Kodak"). Given how immature Kodak's technology business was (they were just starting out in the digital world), we had little we could respond with – I suppose we could've hunted for a Wang-like opportunity to hit at their core, but Kodak was a customer, which certainly complicated things, and the time and expense involved would've been prohibitive.
Their case was eventually heard before a jury in Rochester, New York, famous for being home to… the Eastman Kodak company. Lo and behold, the local jury decided Sun should pay Kodak more than a hundred million dollars. So here's something I could never say as Sun's CEO.
Day 2 of the SCO v. Novell Trial - Fire Sale Pricing
Authored by: Faluzeer on Wednesday, March 10 2010 @ 09:28 AM EST
PJ wrote: "$59.5 million in stock? For what had cost approximately $360 million only four years before? And yet they want us to believe that everything Novell got from USL, it passed on to Santa Cruz? That's some fire sale."
Whilst I agree with you PJ, I thought I would play devil's advocate and try to anticipate some of the responses that tSCOg would use.
I believe that tSCOg would respond by stating that:
The sale price of UNIX was consistent with the price at which Novell sold the combined WordPerfect Group and Quattro Pro to Corel. Both sets of assets were sold by Novell under Frankenberg for a fraction of the price Novell had paid for them less than four years previously.
Whilst I personally believe that the core UNIX assets had more value at the time of the sale than the WordPerfect Group and Quattro Pro, I am sure that tSCOg would argue that the fire sale of those assets indicates that Novell had bought them at vastly inflated prices, and that, in an attempt to stay competitive against Microsoft, Novell wanted to get some return on assets deemed not core to its business.
Day 2 of the SCO v. Novell Trial - Fire Sale Pricing
Authored by: ralevin on Wednesday, March 10 2010 @ 12:45 PM EST
Let's not forget one very simple possibility: Novell vastly overpaid for the various assets they later sold. It does happen, particularly if the boss has big dreams of building an empire.
Aug 4, 2009 | Monty says
The first example of dual-licensing was probably Ghostscript, which Peter Deutsch licensed first under the GPL and later under the Aladdin Free Public License, but also under a proprietary license.
Inspired by his idea, David Axmark and I released MySQL under similar dual-licensing terms. Dual licensing has since become one of the most common and popular ways to create profit centers around Open Source/Free Software, in addition to support and services around the product.
To be able to bootstrap MySQL Ab, we originally had a license that allowed free usage, but required a "pay-for" license if you used MySQL commercially or on the Windows platform. In 2000 we changed the free license to the GPL, mostly to avoid having to explain our own license to everyone.
The basic idea for our dual-licensing was this: if you bought a license then we waived the GPL restriction that you have to redistribute your code as GPL. You could change, modify, extend, distribute, and redistribute the copy in any way you wanted (but of course not change the license of the MySQL code). The license was for any version and usage of MySQL, for now and forever.
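The arrangement Monty describes is usually made visible in each source file's license header, which offers the recipient a choice of terms. A hypothetical sketch of such a notice (the project and company names are invented; this is not MySQL's actual wording):

```text
This file is part of ExampleDB.

ExampleDB is dual-licensed: you may redistribute and/or modify it
under the terms of the GNU General Public License, version 2, as
published by the Free Software Foundation; OR under the terms of a
commercial license purchased from Example Ab, which waives the GPL
requirement that you distribute your own combined code under the GPL.

In either case, you may not change the license of the ExampleDB
code itself.
```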
This is still reflected in the MySQL FAQ on this topic.
This is what I personally think is the appropriate way to dual-license open source software and how we intend to do it in my new company, Monty Program Ab, for the software we produce.
The MySQL OEM License
I was recently made aware that the above is no longer the case with the standard MySQL OEM agreement. Sun is now, by default, putting the following limitations on their licensees:
(Sun has, of course, every right to put restrictions on its code, but this is not how dual licensing used to work with MySQL, nor how it works with other Open Source projects; see, for example, the license information for Ghostscript. You should, however, be aware of these issues if you ever intend to acquire a commercial license for MySQL.)
- You cannot modify MySQL in any way: for example, to fix bugs, to optimise MySQL for your applications, to include publicly available enhancements (such as the BSD-licensed "Google patch"), or to compile it with another storage engine, in order to improve MySQL as part of your product.
- You cannot use any forks of MySQL (such as Drizzle, ExtSQL or MariaDB).
- You are tied to the current major release of MySQL Enterprise (i.e., you have to pay for upgrades). This may be normal in a closed-source environment, but it is not normal for Open Source.
- There are serious limitations on what kind of applications you can build with the MySQL code; for instance, the default agreement prohibits installations in hosting facilities or using your version as a SQL server.
- The end user can't transfer/sell the license to someone else (to be used under the same conditions).
Recommendations to licensees and those considering the purchase of a MySQL license
With the above limitations in place, you should consider whether it's worth it to you to buy licenses for MySQL under the current terms. Also, if you are an old licensee of MySQL, you should be careful to review any new conditions when your license is up for renewal. Note that this warning is not specific to Sun but applies when working with any software vendor.
If you are running an old, modified, community, or forked version of MySQL at your company, you need to be aware that the default OEM agreement is not applicable to you. This is also the case if you modify MySQL code to implement a new storage engine or MySQL extensions, or if you are a hardware vendor that wants to tune MySQL for your setup.
If you need to buy a commercial license, because you cannot use the GPL, you need to seriously consider if you can accept the default restrictions. If not, then you should contact Sun and renegotiate the terms. I know there are examples where MySQL licensees have been allowed to change MySQL code and also have the right to publish those changes (Infobright openly advertises that they've done so). You should ask to get those same rights.
If you plan to do dual licensing yourself, you also need to make sure that the license allows you to use an Open Source version of MySQL with your Open Source product.
When agreeing to a license, ensure that you get enough freedom to do what is required for your business and you are not completely dependent on one vendor for your success!
Recommendations for companies doing Dual-Licensing
I believe one should be very permissive when doing dual licensing with Open Source, as otherwise you lose many of the business advantages you get from being Open Source. The Open Source community is a very effective ecosystem, and if you allow it to participate in your business you have a better chance to succeed.
The only restrictions you need when re-licensing are that the licensee must not be able to change the license of your code and that they may use and/or distribute only the pre-negotiated number of copies of it.
By being fair to others, you will get a reputation as a trustworthy business partner and you will get more business in the long run.
- Allowing changes to the licensed code lets the licensee combine community code and their own code to create a better product. It also gives your customers more trust in your product, as they don't feel locked into a single vendor for things like bug fixes and enhancements.
- Make it easy to use your product or part of your code with other products.
- Allowing re-distribution of the product creates a market for people doing addons, enhancements and totally new products based on yours.
- Don't be afraid of forks; they enlarge your ecosystem, and anyone who wants to buy a license for these forks also has to buy one from you.
- Don't limit the license to a specific version; if you allow changes, this is meaningless anyway, as one can easily work around it. In the long run it's not a winning proposition to sell the same software over and over again to the same customer. Instead, work on the software and with the customer to increase the usage of the software.
- Don't limit in any way how the product/code can be used; it just forces people to choose or develop other products that will compete with you and will limit the business you can create.
- Make the end-user license transferable. This is already allowed in many countries, it is what people expect from most things they buy, and it will create opportunities for new business by others. If you got paid for every copy of your software that exists, do you really care who uses it second-hand?
Recommendations to Community contributors
I assume for this blog that it's clear why it's beneficial for you to donate code to an Open Source project. (If not, that could be a topic for another blog post.)
However, when donating your code to an Open Source project that is using dual-licensing, you also need to consider how the project is going to use your code when re-licensing it under a non-Open Source license. This is very important if you ever want to license the project yourself under a commercial (non-Open Source) license.
- What are the restrictions on how you can use the re-licensed work? (Ideally it should be usable for any purpose and in any manner).
- What changes can you make to the code when you re-license it? (Ideally there should be no restrictions, except that you can't change the license).
- Can a licensing agreement be used to restrict the licensee's ability to publish their own code as Open Source, or to include Open Source code in their product?
- Is the re-licensing agreement tied to a specific version of the project?
- Is the contributor agreement for the project clear about how you may donate code to it? Can the project, for example, take any code you ever send to any related email list, or do you need to explicitly sign every contribution separately? (Our contributor agreement wasn't clear in this respect, so I recently added: "Each submission must explicitly be marked that it's donated under the MCA". You can of course also mark the code to be under BSD.)
If you agree with the above and you have signed contributor agreements that do not include such a note, you should consider contacting those projects and asking for a new one with such a clause or get some other public guarantee that the project re-licenses code in an appropriate manner.
Note that releasing your code as BSD to a project that has or may have GPL code doesn't protect your code from being dual-licensed in an unfavorable way. The only way to ensure full freedom for others is to donate your code only under a contributor agreement with a clause like the one suggested below, or to a project that has agreeable guidelines for how it licenses its code!
To assure our users, contributors, and customers of how we at Monty Program Ab intend to re-license the code we produce or the code people donate to us, I have added the following note to our contributor agreement:
"Monty Program Ab agrees that when it dual licenses code, it will not restrict the way the third party licensee uses the licensed copy of the code nor restrict how they use their own code."
If you have any comments/ideas around this, feel free to join the maria-discuss Launchpad team and its associated mailing list and discuss this topic.
Posted by Monty at 23:59
Labels: licensing, Dual-licensing, OEM
Kaj Arnö said...
- From Sun's perspective, nothing has changed in our Dual Licensing implementation for many years. It has been substantially the same since you, Monty, were actively part of fine-tuning the first principles that you and David established.
The restrictions of the commercial MySQL license are industry standard and talk about what others can do with MySQL code. Contributors under both SCA and CLA grant rights to us, but continue to own their own code and may thus do whatever they please with it -- including releasing and using it under the rules of the GPL. Those are all basic tenets of dual licensing (commercial rules for commercial licensees, GPL rules for GPL licensees) which I believe are widely understood and accepted.
- August 5, 2009 4:47 PM
- water outbreaks said...
- What exactly is your new company going to dual license? Unless I'm mistaken, you can't dual license the MySQL fork, because you are using the GPL license from Sun/MySQL.
Are there additional utilities and such that are going to be independent enough of the MySQL codebase to allow a dual license?
- August 5, 2009 8:18 PM
- Monty said...
- About what Monty Program Ab could Dual-license:
Over time we at Monty Program Ab will produce a lot of code, tools and extensions for MariaDB and other products that could be dual licensed. I want to keep the options open to dual license these.
It's true that we can't dual license MariaDB as such (as part of it is owned by Sun). This doesn't, however, stop anyone from buying a license from Sun for the MySQL part and then buying a license from us for the MariaDB part.
The same is true for MySQL/MariaDB storage engine vendors and those companies that provide dual-licensed extensions to MySQL/MariaDB.
- August 6, 2009 11:43 PM
- Monty said...
- In response to Kaj's comment that 'Nothing has changed':
The current MySQL OEM license was updated September 2008, after MySQL was bought by Sun. I don't know how the previous copy of the OEM license looked, but I do know that when I was part of deciding the OEM licensing scheme in MySQL Ab, it was very liberal (as described in my blog). I don't know when things changed (at least I was never consulted or even informed about it), but I know that the current one does not match my principles of how to do dual-licensing of Open Source software.
As you, Kaj, should know, one of my basic principles in doing business is that one should never write or propose an agreement that you would never want to receive or sign yourself. It should be more than clear to you that the current OEM agreement is, for me, not such a document.
As my blog already describes, the current commercial MySQL license does not follow the "industry standard" of dual-licensing (where did you get this idea?); MySQL Ab was much more open in the beginning (and for longer than the current limitations have existed), and other dual-licensed projects are much more open. It's also clearly not what people expect from an Open Source project, as it seriously limits other people's ability to work with, use, and do business around the product. I have gotten lots of comments about this, so that part is easily verifiable.
I strongly disagree with the notion that you can have commercial rules for a commercial license and GPL rules for a GPL license. People donate code to a product because they are using it and intend to recommend and use it in the future. If the commercial license is not agreeable to them, there is no reason for them to donate time and effort to help the project. Why help someone who doesn't understand your needs and is working against you?
Because of that, I strongly recommend anyone doing dual-licensing take their users and contributors into account when defining how they dual-license their software. It's to everyone's benefit to have a liberal dual-licensing policy to create a working developer ecosystem around products that benefit a large number of people.
- August 7, 2009 1:29 AM
- Mark Callaghan said...
- Does the OEM agreement allow the use of plugins? Does that include storage engine plugins?
How do you get bug fixes under this license? Is that covered when you also buy a support contract?
I understand the need to limit the agreement to a particular release as there should be different prices for a customer who only wants a license for 5.0.84 versus someone who wants a perpetual license for all future releases of MySQL. Without such a limit, it is not as easy to define the difference between fixing a bug, using the Percona patch, and upgrading to new versions of official MySQL.
However, I also think that opportunities are being squandered by Sun. If their agreements preclude external developers from having a chance at making money, then I suspect that many of those external developers will stop contributing via the CLA/SCA.
- August 7, 2009 6:04 PM
- Sheeri K. Cabral said...
- Monty, I think you're spreading FUD. You admit "I don't know when things changed (at least I was never consulted or even informed about it)" but you're spreading *FEAR* that Sun changed the OEM license.
You don't actually know that Sun changed it.
I think it would be best if you retracted the part of the post that slams Sun for changing the license, since you don't know that Sun actually changed it.
- August 7, 2009 9:10 PM
- Monty said...
- In response to Sheeri:
Please read my post/comment again; there is no FUD involved, only facts. I did not say that it was Sun who made the OEM license as restricted as it is now; I just said that, contrary to what Kaj implied, I was not part of making such a change.
When it comes to Sun, the only thing that is clear is that the OEM license was changed after Sun bought MySQL (just check the date in the license), and Sun has thus approved of the current content of the license.
The main point is, however, not whether the change was made before or after Sun bought MySQL. The important thing is that the current OEM license for MySQL is something that, in my mind, is unacceptable when doing dual-licensing of Open Source software. I think that people contributing code to an Open Source project should be aware of how their code is used and why it's important to know this.
I know that Sun is not the only company doing this wrong. However, now that the Oracle/Sun deal is getting a lot of attention, it's important to know what Sun is doing, so that we can help the DOJ and the EU Commission understand better where MySQL stands now and what they should do to ensure that MySQL stays free in the future. For this to happen, we need to be able to discuss things openly instead of staying silent. This is difficult, since any criticism of Sun can always be called FUD, but anyone in the MySQL community must be allowed to have an opinion and to talk about it in public - that is how Open Source works, after all. You can spread more FUD by not saying anything than by being transparent!
A final word about Monty Program. Our business model targets those who use the GPL version of MySQL (or MariaDB). We don't have a self-interest in this topic. Sure, if things were different, maybe we could do some other business too, but things are what they are. The only reason to bring this up for public discussion was that we were made aware of these problems by people affected by them. After publishing this, we have also gotten feedback from other leaders in the community, who told us they did not know about this, had indeed assumed that dual licensing worked differently, and were thankful for the blog post.
- August 10, 2009 5:12 PM
- Monty said...
- In response to Mark:
If you use the standard OEM agreement, you can't use any non-default storage engines, and you can't fix bugs yourself or ask anyone to fix the bugs for you. You can get a separate support contract from Sun and hope that they fix the bugs you report in the version that you bought the OEM contract for.
I don't think there is a reason to limit an OEM agreement for Open Source software to a particular version at all; it gets too hard to define which changes you can or cannot apply without having to change the version number yourself. I think it's a better business model to combine dual-licensing with a separate support offering. This allows you to get money from the customer over time, as he uses the product for new things, instead of trying to get the customer to pay over and over for almost the same code (just because he wants the latest bug fixes, which are only in the latest release)...
- August 10, 2009 5:17 PM
- ccx said...
- Via archive.org, one can see that the default OEM agreement included restrictions on modifying the source since at least June 2005.
There are four fundamental questions/topics in open source:
- Open-source licenses and the availability of source code;
- The impact of free (as in cost) software;
- The value of brand. As Red Hat knows, Red Hat is indomitable because of its brand, not its source tree;
- Who's asking? The answer you give to an 8-year-old is different from the one you'd give to a CIO. This last topic provides the answer to the open-source revenue question.
... ... ...
I don't expect many college students, developers, or start-ups to spend a lot of money on intellectual property. I expect someone whose job is on the line if a system fails to spend considerably more than nothing. The key is figuring out the difference between one's market and one's community. They are not the same.
On the GPL, Apache and Open-Core
Matthew Aslett, August 28, 2009 @ 5:48 am ET
Jay has already provided a good overview of the debate related to the apparent decline in the usage of the GPLv2. I don't intend to cover the same ground, but I did want to quickly respond to a statement made by Matt Asay in his assessment of the reasons for and implications of reduced GPLv2 usage.
"as Open Core becomes the default business model for 'pure-play' open-source companies, we will see more software licensed under the Apache license"
I don't doubt that we will see more software licensed under the Apache license, and also more vendors making use of permissively-licensed code, but I don't see a correlation with the Open-Core model.
In our report, "Open Source is Not a Business Model", we found that 23.7% of the 114 vendors we covered were using Open-Core as a vendor licensing strategy. Looking at the stats, over 70% of Open-Core strategy users also used a variant of the GPL or LGPL.
The main reason for the correlation of the L/GPL and Open-Core is, as Matt notes, that "the GPL makes sense in a world where vendors hope to exercise control over their communities". Carlo Daffara agrees: "the GPL is not a barrier in adopting this new style of open core model, and certainly creates a barrier for potential freeriding by competitors".
Carlo cites as an example the use of the GPL by the usually Apache-focused SpringSource for its SpringSource dm Server as a means of restricting the commercial opportunities for potential rivals, something that we covered here.
As Matt explains, however, "if the desire is to foster unfettered growth, Apache licensing offers a better path". Savio Rodrigues offers an example of a usually L/GPL-focused company - Red Hat/JBoss - choosing the Apache License for its new HornetQ messaging software because "the project team felt that the Apache license would ensure that the project's code could be more easily included into products from the ecosystem."
1-1 then. But this isn't about point scoring. What the examples demonstrate is that vendors choose licenses for individual projects/products based on pragmatic business reasons rather than dogmatic commitment to licensing philosophy, and that - as we previously suggested - there is actually some benefit in the proliferation of different licenses.
Of course it is also important to remember that many vendors don't have the luxury of choosing a license for the project they attempt to commercialize. Mike Olson notes that adoption has been a factor for the Apache-licensed Hadoop project - but which came first: commercialization or adoption?
I believe we are seeing increased adoption of permissively-licensed open source software by both new open source specialists, such as Mike's Cloudera, and also proprietary vendors such as Oracle, SAP and - as recently discussed - Day Software.
In these cases, the commercial vendor doesn't choose the Apache license for software to encourage widespread adoption, it is encouraged to choose Apache-licensed software because of widespread adoption (not to mention the low cost and high quality advantages of being part of a true developer *community*).
That has more to do with the patron model, as discussed by Day Software's chief marketing officer, Kevin Cochrane, than it does Open-Core.
Additionally, as Carlo notes, it is a product of the shift towards what he calls "consortia-managed projects". Or as I previously stated: "if Open-Core was a significant revenue strategy of open source 3.0 (vendor-dominated open source projects such as MySQL, JasperSoft), then Embedded [as I was referring to the patron model at the time] is one of the commercial open source strategies of open source 4.0 (vendor-dominated open source communities such as Eclipse, Symbian)."
So while we expect Open-Core to remain a significant business model for 'pure-play' open-source companies, and we expect to see more software licensed under the Apache license, we don't see the two as being directly related.
Anyway, this was supposed to be a quick post. That's enough for now.
April 29, 2009 | CNET News
I have spent years advocating the GNU General Public License as the optimal open-source license for commercial open source.
Roughly nine years after I first became a fan of the GPL, I think I've been wrong.
My admiration for the GPL mostly stemmed from its ability to mimic, but then invert, proprietary licensing. The GPL is like opening a canister of radioactive waste: while your competitors can touch it, you're dead certain that they won't.
Given that openness is increasingly a winning business model--if not the winning business model, as Red Hat executive Michael Tiemann argues--one has to wonder if pretending to be open through the GPL accomplishes as much as fully opening up through Apache-style licensing would.
Open-source luminary Eric Raymond is pretty clear on this point:
I think we live in a...universe...in which the GPL is unnecessary rather than futile. Mind you, I am not claiming the GPL is entirely useless. It's a signaling behavior, like wearing a crucifix or yarmulke or pentagram; it helps build trust groups. But it has costs, too.
It creates a lot of needless fear from potential allies and users who suspect they won't be able to control their exposure, if they let it in...Is the GPL's utility as a form of in-group signaling worth the degree to which fear and uncertainty about it slows down open-source adoption? Increasingly, I think the answer is no.
The GPL may be a community-building signaling device, but it is also a confession of fear and weakness. To believe that it matters, you have to believe that you live in a...universe where closed-source development is such an attractive proposition that you have to punish people for trying to move to it.
In other words, if openness works (in the Jamesian, pragmatic way), why not give it free rein, rather than hedging our open-source bets to the point of obviating their efficacy?
Equally important, we may not be getting the "protection" we seek from the GPL, anyway, as the GPL becomes the new BSD in the cloud, as Linus Torvalds recently commented to me in an e-mail:
AGPL/GPLv3 anti-ASP/TiVo language doesn't "protect" anything. There is no upside to pushing freeloaders away.
Sun Microsystems CEO Jonathan Schwartz rightly identifies adoption, not protection of freedom, as a key open-source benefit: open source provides an efficient way to distribute software to the maximum audience at the minimum price. With this in mind, unfettered Apache-style licensing would be the ideal license to maximize adoption, despite likely being the worst way to directly monetize software.
So long, however, as one's business either monetizes software indirectly (i.e., Google with its advertising model) or adds to the open-source components with commercial extensions (i.e., IBM with proprietary software, services, and hardware add-ons), then a company should be able to reap a bounteous harvest from its open-source seeds.
In sum, the GPL may well be an excellent capitalist tool, but Apache licensing could well be even better.
Disclosure: My company uses the GPL, not an Apache license.
July 04, 2006
A look at the impact of the GPL on free software development.
If that's the case, perhaps Richard Stallman should rewrite the GPL so that it doesn't (rather effectively) hinder small, grass-roots, free software development and distribution projects.
Perhaps he ought to use his influence to stop the Free Software Foundation from using the clauses already in the GPL as an excuse to bully small free software development and distribution projects with its legal heft.
MEPIS wasn't the only distribution targeted: also take note of the threat of legal action looming over other projects such as Kororaa Xgl LiveCD.
John Andrews of Damn Small Linux reportedly agrees with the estimation of MEPIS' founder Warren Woodford that a great many small Linux distribution projects are at significant risk, and new projects are considerably less likely to spring up in the evolving legal climate of the "free" software community.
"how much more trouble is it to set up a source code tree and allow people to access it?"
Easy to say if you're not paying for the bandwidth, or with the Mepis example above, the time and materials to burn, package, and ship the DVDs. There's a reason the hippie communes all fell apart. Peace, love and harmony are fine concepts until it comes time to pay the bill, or actually get something done. Simple fact is, even under the GPL, the source is freely available, but the work/resources required to convey it from source to destination are under no such restriction - and shouldn't be.
That said, if Zenwalk isn't making their source code available, they should at least implement something along the lines of the Mepis solution.
May 23, 2008
12:47 PM GMT Yeah, if I was burning the DVDs to order (or even several at a time), you could probably figure about 15 minutes of time to get it done per set, and that translates to $38.75 at my current billable rate (oh, if only I was the one getting $155/hr, alas, it is my employer).
May 23, 2008
12:48 PM GMT I wouldn't trust a distro that doesn't have at least a local copy of the original sources. When you download something, put a copy in a folder on your hard drive. When someone contributes, make them give you the original source.
I have no use for anyone who thinks he/she can ignore a software license because complying is inconvenient. In addition, any responsible developer knows you should keep a copy of the sources. A few GB on a hard drive. No big deal. Slap it on a DVD and mail it to someone that wants it. If there were anything difficult about complying with this rule, I would be at least 1% sympathetic.
> They have a script for users that supposedly pulls sources from upstream
Then what are they whining about? They can just run the script themselves. Problem solved.
What are they going to do if there's a bug and the upstream source is no longer available?
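The worry raised above - that a distribution may still owe someone the corresponding source after the upstream site has vanished - is exactly why responsible distributors mirror what they ship. A minimal sketch of such an archiver (hypothetical, not any distro's actual tooling; URLs and paths are illustrative):

```python
# Hypothetical sketch: mirror each upstream source tarball locally,
# with a checksum, so GPL source requests can still be satisfied
# even if the upstream site disappears.
import hashlib
import shutil
import urllib.request
from pathlib import Path

def archive_tarball(url: str, archive_dir: str = "source-archive") -> Path:
    """Download a source tarball and keep a checksummed local copy."""
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    dest = archive / url.rsplit("/", 1)[-1]
    # Stream the tarball to disk rather than holding it in memory.
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        shutil.copyfileobj(resp, out)
    # Record a SHA-256 next to the tarball so the copy can be verified later.
    digest = hashlib.sha256(dest.read_bytes()).hexdigest()
    Path(str(dest) + ".sha256").write_text(f"{digest}  {dest.name}\n")
    return dest
```

Run once per shipped package, this leaves a checksummed local copy that can be burned to a DVD or served on request, regardless of what happens upstream.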
May 23, 2008
12:50 PM GMT You're ranting at the wrong people, dumper. No sources are freely available unless someone makes them available, and the GPL specifies exactly how to make it available. "Go find it yourself somewhere upstream" is not one of the allowed methods, and people who expect Linux distributors to honor the GPL are hardly dopey hippies with crazy ideas.
There is a simple rule for using GPL code: if you don't want to honor the terms of the GPL, don't use GPL code. Mepis and Zenwalk are the ones who are balking at paying their GPL bills, which is merely making their source trees available. I daresay that is considerably cheaper than creating and maintaining an entire operating system with productivity applications from scratch. They're getting the benefit of gigabytes of other people's code, and yet they're still whiny and ungrateful. Warren is displaying bad faith with DVD-only distribution at an inflated cost, even if he is technically, by a hair, in GPL compliance, and his licensing FAQ is full of errors. Users who want the sources for just one or a few applications are stuck with buying the DVD set, and there is no way of knowing how up-to-date the DVD sets are, or how soon they will receive them.
I don't understand why "little guys" like Mepis and Zenwalk should get a pass on GPL violations; they're in the wrong just as much as any BigEvilCorp that does the same thing.
May 23, 2008
1:51 PM GMT Easy tuxchick - take a deep breath. I'm not ranting at anyone, and we fundamentally agree. I believe any user of any code should comply with said code's license. I further believe that under the GPL, any applicable code should be made available in a reasonable manner.
We part company in that I don't share your bent towards entitlement. Mepis and Zenwalk and anyone else using GPL code should be held to the same standard as anyone else. You seem to have re-defined that standard to mean "they should provide free, immediate access to everything for everyone." I view that as unhealthy and irrational. Sadly, the effort to re-define "free" software as the technical equivalent of a welfare state is becoming a common sentiment.
- Yes, they should honor the license of the code they redistribute.
- Yes, they should provide current code when it's requested.
- Yes, they should provide it to anyone who requests it.
- Yes, it should be done at a reasonable cost - what they reasonably feel it costs them to maintain and distribute such resources.
- No, they're not under any obligation to provide instant access to such code.
- No, they're not obligated to let you decide for them what you feel a reasonable fee is for such provision.
- No, they're not obligated to play indentured servant for anyone, and sort through gigabytes of code for someone who only wants this . . . and this . . . . and this, but not this . . . or this.
I never claimed "people" who expect anyone else to honor the terms of a given software license are "dopey hippies with crazy ideas". Although it's cute how you'd associate your perspective with that of "people" as a whole - we think very highly of ourselves, don't we? :)
What I am saying is that the irrational entitlement motif you seem to be championing, if codified, would tend to cripple the freedom you claim so loudly to respect. Once again, passion is worth very little if it can't be balanced with a practical implementation that serves the users of the software.
May 23, 2008
2:11 PM GMT
Quoted: to give any third party, for a charge no more than your cost of physically performing source distribution
Neither GPLv2 nor v3 says anything near that.
GPLv2 Section 1
Quoted: You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
GPLv3 Section 4
Quoted: You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee.
It is a misconception I have run across that you can't make a tidy profit off of source code. I can put source code on a CD/DVD and charge a thousand bucks for it if I choose. Granted, the first CD/DVD to go out can then be shared wholly or in part for free.
Free as in Freeloader does not apply to source code either, a GPL project can make the source code prohibitively expensive. Sorry if I seem a bit short TC, I'm writing this before my first cup of coffee.
May 23, 2008
2:20 PM GMT
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange;
May 23, 2008
2:58 PM GMT Thank you TC, you missed something in your reading of Section 3 though. Section 3.b (the cost-of-the-act limit) kicks in only when you choose it as alternate compliance instead of Section 3.a.
Section 3.b, as alternate compliance, limits the cost; Section 3.a has no such limit.
I'm not saying anything about Mepis or Zenwalk compliance in this, just that the presumption that source code must be offered at a mandated minimal charge is incorrect.
*edit* a good place for an example of charging in extremis would be the FSF itself. [HYPERLINK@agia.fsf.org] */edit*
May 23, 2008
3:49 PM GMT
> Easy to say if you're not paying for the bandwidth,
Meer.net provides DSL service with a fixed IP address in the Morgantown area for $45 a month and doesn't restrict you to non-commercial uses. Yeah, the speed would be lousy, but the GPL doesn't address that, and if Warren is so absolutely certain nobody needs the source, there shouldn't be that much demand for it, should there?
May 23, 2008
5:07 PM GMT
"Yeah, the speed would be lousy, but the GPL doesn't address that . . ."
Then you're still left with the same complaints (from the same complainers) I tried to address in my second post. Now it'll be "with this crappy connection, I still can't download, and they're violating the 'spirit' of the GPL." It's the same slippery entitlement-based slope, with a twist of rationalization to make it more palatable. So let's add another point:
- No, they're not obligated to let anyone else decide (outside of the explicit declarations of the license) what is a reasonable method of distribution for them.
May 23, 2008
5:31 PM GMT
heh, dumper, that's some world-class straining at gnats there. Indentured servant? Sorting through code for someone else? Please, give me one example of where this will happen by providing an ordinary source tree online. I know when I need to fetch source tarballs, SRPMs, or get something from SVN I don't need anyone else to "sort" them for me, and I rather doubt anyone else does either. Where do you get this stuff? The "medium customarily used for software interchange" is via Internet download, and in the FOSS world it's been that way for years. The only "Irrational entitlement" here is from Zenwalk and Mepis, who should really be using BSD code since they're so GPL-averse.
May 23, 2008
6:07 PM GMT
"Users who want the sources for a single or a just a few applications are stuck with buying the DVD set,"
I got that from you, tuxchick. You're right, it wouldn't happen by providing an ordinary source tree online, but you've also completely skirted the point: you have no right to determine for them what method of distribution they choose. Distribution via media is perfectly valid, whether you like it or not. It's you straining at gnats, to somehow try to validate your entitlement mindset.
>"The "medium customarily used for software interchange" is via Internet download"
So declares tuxchick - ah, the arrogance of elitism. And in typical "I am entitled" fashion, you'd dictate to these developers (translate: "free" code users) just what their "freedom" means, and exactly how it can be exercised. Your position sounds more like an MS EULA than a "free" software license.
>"Zenwalk and Mepis, who should really be using BSD code since they're so GPL-averse."
Mepis is apparently complying, just not in a manner you care for. Zenwalk should, we don't even have anything to argue about here. Sadly, it's not the GPL you're arguing for.
I must admit, arguments like yours make me curious to see what a GPLv4 would come out looking like following your ideology. As I said, this "community's" more passionate members look more like a welfare state all the time.
May 23, 2008
11:02 PM GMT
Now it'll be "with this crappy connection, I still can't download and they're violating the "spirit" of the GPL".
And your point is?
You were saying that the bandwidth costs were a concern. I pointed out that they weren't. The only reason Warren doesn't provide a source code tree is because he doesn't want to. Why he doesn't want to isn't something I'm worried about, since he is complying with the GPL (unlike Zenwalk), but there's no reason to sugar coat the matter.
Added: Yes, I'm somewhat sensitive on the matter, since Morgantown is only about 30 miles away as the crow flies.
BTW, in the judgment of folks here, would providing a Bittorrent feed meet the requirements of the GPL?
May 24, 2008
1:49 AM GMT
I'm not actually trying to sugar coat anything, or even trying to stick up for Mepis or Zenwalk. I happen to agree with your first post in this thread (and indirectly tuxchick's argument) - a good SVN repository on even a moderate connection is reasonable - if they're worried about cost, they could charge a reasonable fee for access to the repository, and change access permissions monthly or something to help pay for it.
My point is there's always a reason to gripe if you don't like the way someone else does something. As of yet, nobody's offered an intelligent reason to believe the "community" is entitled to any special consideration beyond strict compliance with the license. That's the part I have a problem sugar coating. :)
The Bittorrent question is interesting. By tuxchick's definition, it is a medium "customarily used for software interchange . . . and in the FOSS world it's been that way for years". Case in point - any number of ISOs for any number of distros. I'm not sure it's any more convenient or accessible (or customary) than ordering a DVD, but that depends on how much attention they pay to their feed.
May 24, 2008
2:02 AM GMT TC -
If we're going to talk about straining and bending meaning --
Where do you come up with being unable to charge a fee that does not exceed your cost of physically transferring the source, and, presuming that to be true, what do you think that includes?
As I read the GPLv2, you may charge a fee for the physical transfer. The license doesn't say that you can't make a profit in the transfer. It merely specifies what the fee actually covers. You cannot charge for the source. Period. Not even a penny. You can charge for the transfer.
May 24, 2008
11:22 AM GMT As of yet, nobody's offered an intelligent reason to believe the "community" is entitled to any special consideration beyond strict compliance with the license.
Well, that's because we're not. And, like his method or not, Warren is now complying with the license.
The only reason I have any problem with what he's doing is that I think it reflects badly on my home area; and we have enough problems in that regard. That, of course, is entirely a non-GPL issue (and not something I expect anyone else to necessarily agree with).
Zenwalk, on the other hand, is a GPL issue. I've looked at the distro a few times and even considered recommending it to people. This issue pretty much shoots that idea.
May 24, 2008
11:49 AM GMT
dino, you're right, and azerthoth addressed that too. In the context of Mepis, I am extra-critical because of Woodford's cruddy attitude and misinformation. Yes, technically he's in compliance, but his whining and foot-dragging is bush league. I bet if he had to buy DVDs every time he needed to update his own source tree, or just a couple of applications, he'd complain plenty.
dumper, I believe in the spirit of a license as well as the strict letter. You may recall Theo complaining how all these big commercial software companies use and profit from BSD code, and don't give anything back.
Well, the BSD license permits that. But even so, I believe that for both BSD and GPL code, that which is freely given should be received with gratitude, and something given back. Instead of grudgingly following the rules to the least extent possible- I think that is a slap in the face to all the contributors whose hard work and talent make all this possible.
[Apr 30, 2009] Linux.com: A GPL requirement could have a chilling effect on derivative distros
A GPL requirement could have a chilling effect on derivative distros
By Bruce Byfield on June 27, 2006 (8:00:00 AM)
Warren Woodford, the founder of the MEPIS distribution, would prefer to be concentrating on polishing his latest release. Instead, he is distracted by an official notice from the Free Software Foundation that, because MEPIS has not previously supplied source code for the packages already available from the distribution it is based on -- once Debian, and now Ubuntu -- it is in violation of the GNU General Public License (GPL). Woodford intends to comply, but he worries about how this requirement might affect all distributions derived from other distributions -- especially those run by one or two people in their spare time.
The requirement to supply source code is covered by section 3 of the second version of the GPL. Under this section, the distributor of GPL code is obligated to provide source code "on a medium customarily used for software interchange," or a written offer, valid for at least three years, to do so. In practice, this medium is usually a CD or DVD, or a server from which it can be downloaded. Under section 6 of the GPL, each distributor of the code comes under the obligations specified in section 3. This obligation is specified even more strongly in section 10 of the draft for the third version of the GPL, which specifically states that "downstream users" (those who, like Woodford, adopt the work of another project -- the "upstream distributor" -- for their own use) fall under these obligations.
"We think it's pretty clear," says David Turner, GPL compliance engineer at the FSF. "One problem with allowing people to skip out on source code distribution is that there's nothing that requires the upstream distributor to continue to offer source code. If they stop doing so, the source could become totally unavailable. Or, more commonly, the upstream distributor will upgrade the version of the source code available, leaving downstream distributors totally out of sync. In order to fix bugs, users need to get source code exactly corresponding to the binaries they have available."
Woodford does supply the source code for MEPIS' reconfigured kernel in a Debian source-package. His mistake seems to have been the assumption that, so long as the source code was available somewhere, he did not have to provide it himself if he hadn't modified it. While he has not contacted any other distributions, he suspects that he is far from the only one to make this assumption. "We, like 10,000 other people, probably, believed we were covered by the safe harbor of having an upstream distribution available online," Woodford says. "I think, of the 500 distributions tracked by DistroWatch, probably 450 of them are in trouble right now per this position."
A safe harbor is a legal term for a provision that shields a party from liability for a violation made in good faith.
Compliance in the community
Woodford is exaggerating, but not enough to change the basic truth of what he says. Klaus Knopper, who develops the popular Knoppix live CD, says that he maintains a source repository and will make source code available on request. Talking on behalf of CentOS, Johnny Hughes says, "CentOS has been providing source for all packages, changed and unchanged, in their distribution. CentOS has the same understanding of the GPL as expressed by the FSF on this issue." Similarly, Texstar, the main maintainer for PCLinuxOS, says, "I am aware of the GPL requirements and make all of my source code available via DVD and it can be downloaded from a free server."
However, a majority of distributions and their distributors are apparently unaware of the requirements. "Before I was contacted by the FSF, I didn't know that we needed to actually offer the source code of binaries we didn't modify," says John Andrews, the source code maintainer of Damn Small Linux. "Yet we do comply now, and the FSF occasionally pops in with an email to make sure we do." Similarly, LinuxCD.org, a distributor, makes only Fedora source code available -- and only provides that because it was specifically requested to do so.
Unsurprisingly, no non-compliant distribution was willing to go on record for this article. However, a search through the Web pages of two dozen randomly selected smaller distributions in DistroWatch's top hundred shows only a few download repositories that contain source code, and no offers to provide it on request. The fact that only a few replied to a request for comments may also be significant, suggesting that the maintainers, having become aware of their non-compliance, do not wish to advertise their status -- although it might simply be that, being small operations, they prefer to focus on their work rather than answer questions. Still, even if Woodford's exact percentage is wrong, his suggestion that the majority of distributions are unaware of the GPL requirements does seem accurate.
Implications and solution-seeking
Woodford is now working to come into compliance. "Either I go along or go to court with them about it, and it's a lot easier to go along," he says. "I'm not making any money here. I can't afford a lawyer. I have an income, but I'm just barely staying afloat. We're going to reply to their request, and it seems like the request is consistent with the GPL license."
Woodford also understands that, while the FSF is firm about compliance, it is showing restraint in its effort to get MEPIS to comply. "If we were a big corporate entity, then they would ask us to pay them money," he says.
Yet, despite his willingness to comply, Woodford remains concerned about the implications. According to Turner, because MEPIS distributes both online and on CD and DVD, it would need to provide the source code in both media under the third version of the GPL, although section 3b of the second version would require distribution in only one medium. Woodford is also concerned about the practical considerations of automating the regular extraction of only the packages that MEPIS uses from the Ubuntu repositories.
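The automation Woodford is concerned about, fetching sources for exactly the packages his distro ships at exactly the versions shipped (the "exactly corresponding" source the FSF's Turner describes), can be approximated with standard apt tooling. A hedged sketch follows; the package-to-version pins are invented for illustration, and a real derivative would generate the list from its actual package manifest:

```shell
#!/bin/sh
# Hypothetical sketch: fetch source matching the exact binary versions a
# derivative ships. The pkg=version pins below are invented examples;
# a real distro would generate pkglist.txt from its build manifest.
printf '%s\n' 'bash=5.1-2' 'coreutils=8.32-4' > pkglist.txt

while IFS= read -r pin; do
    # 'apt-get source' accepts pkg=version to select a specific source
    # version. 'echo' makes this a dry run; remove it to actually download.
    echo apt-get source -d "$pin"
done < pkglist.txt
```

Pinning versions matters because, as Turner notes in the article, upstream repositories move on: without a pinned, locally archived copy, the source offered to users can silently drift out of sync with the binaries they are running.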
Even more importantly, Woodford says, "I think that what they're doing is probably going to be bad for creativity in the open source community. There's plenty of people out there who like to be the GPL police. And with this extra little thing in their bag of tricks, somebody is going to go out there looking at everybody who puts out a new release of anything."
"What is really needed for the benefit of the community is if there could be a way to have an exception for the little guy," Woodford says. "But how can you do that when the whole thing is designed around the idea that every entity and every person that uses the GPL is held to the exact same rules and standards? How do you start making exceptions to that?"
Asked about the possibility of adding such an exception to the third version of the GPL, Turner replied, "If someone submitted a comment to that effect, we would of course consider that comment. But I don't think it likely that it will be changed.... I just asked Richard Stallman about this. He noted that the requirement isn't particularly onerous -- source code isn't much larger than binaries."
Woodford, though, disagrees. "If I had been told this when I was getting ready to create MEPIS in the first place, I never would have done it. I didn't have a server, I didn't have a repository, and it would have been a daunting task." His concern is that others will be similarly discouraged.
Andrews from Damn Small Linux also disagrees with Turner and Stallman, saying, "I understand why the FSF makes sure small-time players comply with their requirements. However, I also know from experience that it's quite a burden for the hobbyist or small-time developer who wants to share something cool with the world but doesn't have the finances or organizational structure of the big corporations."
"Of course, non-profit distributors can always arrange with their upstream distributors to help them with the source code distribution," Turner suggests. "If such an arrangement is in place, the problems mentioned above won't happen, and the non-profit distributor will be able to save time and bandwidth."
Major upstream distributors, however, are unlikely to enter such arrangements, if Fedora is any indication. Max Spevack, chair of the Fedora Board, says, "There are several reasons why the Fedora Project would be hesitant to officially sanction downstream distributions to point to upstream code repositories. The first has to do with the issue of forking. If the downstream developer has improvements, those improvements should be fed into the upstream code whenever possible. If downstream doesn't want to push those changes upstream, then it makes sense that the downstream distribution should bear the burden of redistributing the source for the forked code.
"Second, there is an issue of legal liability," Spevack continues. "The upstream party would be assuming legal liability for the downstream modifier, and that is not something that the Fedora Project is interested in doing.
"The third issue is that of cost -- which, while a valid concern, in my opinion is a lesser issue than the other two."
A possible solution for some distributions would be rPath's rBuilder Online, a tool whose use is free for non-commercial purposes and which allows users to build their own distribution using a repository of the Conary packaging system. Because a Conary repository contains both source and binary packages, using its version control system to keep track of them, Erik Troan, one of rPath's founders, notes that "rBuilder automatically solves the problem by providing permanent access to binaries and the sources." Distributions based on rBuilder would still need to maintain their own repositories, but would not need to set up separate source repositories. This is the solution that Foresight Linux has chosen. However, rBuilder Online is not available to commercial distributions, and Conary is still a new and relatively unknown packaging system.
Many derivative distributions, then, seem to be on their own in a difficult situation where good intentions and creativity count for nothing beside the letter of the law.
For Woodford, the situation means struggling for compliance while preparing his next release, and the strain of the additional concerns is taking its toll. "I'm just trying to get back to the point where I can sleep at night," Woodford says. "Last night, I went to bed at 1:30 and just lay in bed thinking of all the technicalities that have been discussed about the GPL and how I'm going to access the source and make it available."
Bruce Byfield is a course designer and instructor, and a computer journalist who writes regularly for NewsForge, Linux.com, and IT Manager's Journal.
Re: Play by the rules, plain and simple
Posted by: Anonymous Coward on June 29, 2006 04:51 AM
According to a copy of the GPL, the paragraph says:
"b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or,"
Emphasis added by me. Whether US$100 reflects my costs to physically perform the source distribution is a matter of calculation and the current circumstances.
Please note that some people may never pay, I may need to buy a CD burner to burn the disc, it will take some work to prepare the disc depending on how the original sources are stored, there's packaging, fees, maybe taxes, you need something to track the money, telephone calls, etc.
I don't know whether US$100 is a good guess. It might be less, it might be more. That probably also depends on where you live.
[Apr 30, 2009] GPL Compliance FAQ MEPIS
I am not an attorney. This is not legal advice.
Q1. What is this GPL license all about?
A1. The GPL license and the Free Software Foundation make sense to me if I assume that the purpose of the GPL license is to force the redistribution of all source code and to prevent commerce that does not include the unencumbered redistribution of all source code. The FSF recommends that you assign your copyrights to them, so they can insure your software "freedom." If the FSF succeeds, all source code will be GPL licensed and controlled by the Free Software Foundation; and all Laws regarding software patents and copyrights will be rendered ineffective.
Q2. Why would anyone want the GPLed source code in MEPIS?
A2. Except for a few packages, the sources are available at the Ubuntu and Debian repositories. The MEPIS kernel source is available from the MEPIS repository. So there is no obvious reason for anyone to want to get the MEPIS related GPLed source code from MEPIS, except to verify that MEPIS is complying with the GPL license restrictions.
Q3. Who is required to distribute GPL licensed source code?
A3. According to the GPLv2 license, any party who causes a party to receive GPL licensed binary code is required to make available to that party a copy of the source code, if the other party requests it. It doesn't matter whether you have ever previously had or wanted a copy of the source code, you are required to have a copy so you can redistribute it.
Q4. Does this mean that if I give a copy of MEPIS to a friend, I also have to give them a copy of the GPLed source code?
A4. According to the Free Software Foundation, if they want the source code, it means exactly that. Whether you give MEPIS to a friend or install it on a computer and sell it, or even if you give it away on the street corner, you are still obligated by the restrictions of the GPL license.
Q5. I want to distribute MEPIS to others. How can I do that and meet the legal restrictions of the GPL license?
A5. MEPIS offers the source code in compliance with the GPL license restrictions. If you have an agency relationship with MEPIS, then you are not distributing MEPIS independently, and therefore you are not independently obligated by the GPL license.
Q6. How can I have an agency relationship with MEPIS so I can give away copies of MEPIS Linux?
A6. Only for the purpose of satisfying the restriction of the GPL license regarding GPLed source code, MEPIS hereby grants an automatic limited agency relationship to individuals and groups giving MEPIS CDs to others free of charge or for a fee that is charged only to raise funds for a legal not-for-profit activity. This includes individuals giving copies to friends, Linux User Groups, the KDE Project, the Debian Project, Ubuntu, and other not-for-profit entities. This relationship is not granted to for-profit entities and this relationship is not granted in jurisdictions where MEPIS is prohibited by applicable Law from distributing MEPIS, specifically the "T6" and the "restricted strong encryption" countries.
Q7. How can you charge for the source code? Isn't it supposed to be free?
A7. The "Free" in Free Software Foundation is not about price. It's about who controls the source code. The FSF has created a non-standard definition of the phrase "free software." See A1.
Q8. I'm not sure if my situation is covered by the other answers, what should I do?
A8. Contact MEPIS via email@example.com and explain your situation.
[Apr 30, 2009] Interview with Warren Woodford - Founder of Mepis, December 12th, 2008 | howsoftwareisbuilt.com
Interviewers: Scott Swigart and Sean Campbell
Interviewee: Warren Woodford
In this interview we talk with Warren. In specific, we talk about:
- The origins of SimplyMEPIS
- Ubuntu's role in the larger community
- Differences among distros from a developer perspective
- Corporate use of free versus for-fee Linux
- The Linux desktop and the future of client-side Linux
- Future directions of note: IPv6 and DNSSEC
Sean Campbell: Warren, could you introduce yourself and tell us about some of the things you have worked on during your career?
Warren Woodford: Sure. I've been pushing electrons for a very long time. I grew up with what is now the computer industry, and I was already working at almost the VP level when the first microcomputers came along.
My background includes telecommunications, entertainment, field service, mini computers, micro computers, mainframe computers, PCs before they were called PCs, real time processing systems, software for business, software for home, software for government, and tools that people have heard of if they've been around a long time–always on the bleeding edge.
That's the way I worked until the Internet bubble burst, at which time I kind of withdrew and decided to take it easy while the economy was down, not realizing it was going to be so volatile for so long. It was in 2000 or 2001 that I first started looking at Linux.
Aside from the philosophy and technical foundations of Linux, there was a lot there that I really didn't like, frankly. Because of my background, I had been a champion of GUI interfaces since the early '80s, and that aspect in particular was very inadequate at the time.
The bottom line is that, when I first found Linux, it was too rough around the edges for me. That represented the possibility of opportunity, not that I was really looking for work. This will piss off a few people, but there was a certain amateur quality about it.
Around 2001 was the first time I used a version of Linux that felt pretty good, which was SUSE, but it also had some significant bugs. It was pretty mature, but it was stiff–just too rigid in the way it did certain things. Mandrake seemed like it was on the right track, but there were bugs in the installation process and things like that.
Still, I felt that there was promise, so I started using Mandrake around 2001, and as I got familiar with everything, I decided that it was marginally good enough. Then, in 2002, Mandrake stumbled badly with their release in the September/October timeframe. They made some big mistakes, in part because of pure hubris.
It was around that point that I started thinking about building a version of Linux, instead of depending on other people. I jumped into it, deciding to pursue it as a way to learn technology that I didn't know.
It has always turned out that when I learn a new technology, opportunities arise, whether my original reason for getting involved worked out or not. That's how I got into Linux and decided to develop MEPIS. It was first for myself, and then I decided to see what would happen if I gave it free rein.
It got picked up by Distrowatch and went to #10 in one month, and that told me something. I started spending almost all of my time on it, but then in 2004 I had an injury that laid me up for a long, long time. During that time, MEPIS made it to #1 at Distrowatch, but I couldn't really do much to maintain it.
Then it slid. Mark Shuttleworth saw opportunity and forked Ubuntu off of Debian. To be clear, I think that has ultimately been good for Debian, and Ubuntu contributes a lot back to the Debian community, but it is clearly a fork.
Sean: What do you think about Ubuntu's strategy? They contribute upstream, more as of late, in fact, but at the same time they fuzz the distinction for users that Linux is really a collection of subprojects that are traveling in the same direction at roughly the same velocity.
I actually think what he's trying to do isn't all that bad, but I may have the luxury of some detached pragmatism. I see it as a logical, commercially driven decision, but I'm curious what you think about it, because you've obviously got a lot more history in this than I do.
Warren: I'm not bothered by anything that Ubuntu does. I think that Ubuntu pushes the boundaries regarding purity, and I don't think that's a bad thing, although that gets me in trouble with some people.
Some people call me a whiner about the GPL, while from my point of view they are the whiners. The GPL deserves to be scrutinized closely and to be debated, as does any legal document that restricts people's rights. Calling a person a whiner because they care enough to challenge, question, or state positions about something is itself whining.
I think it's good that Ubuntu challenges the boundaries regarding what is and is not proper open source. I think that what Ubuntu contributes back, both upstream and cross-stream to Debian, is good. And I think that the way that they kicked Debian in the collective butt has been good for Debian.
The fact that Mark is out there trying to make commercial deals actually may or may not make a big difference in the long run, but it's not necessarily a bad thing. For example, I know that a couple of years ago he negotiated with IBM to get Ubuntu approved as a platform for running DB2. That would make it very easy to get that DB2 approval extended to Debian if anybody wanted to, and I think that's OK.
From the point of view of what's right and wrong, I don't think there's anything at all wrong with their fuzzing things a bit, as long as they don't do it as much as Xandros did. Xandros at one point was blatantly changing copyright, renaming things and such, to make it appear that they had invented KDE or something. I can't speak for what was in their mind, but something was going on there.
I don't see Ubuntu doing that. I see them creating projects of their own to build utilities that represent their philosophy about how such things should be done, like the Adept project for a package manager. However if there's a good product out there already, then there's no good reason for them to be reinventing the wheel.
I think KPackage has been kind of so-so. On the other hand, I think Synaptic is awfully darn good. But Synaptic is GTK based, and while that's not necessarily a bad thing, I wouldn't program in that world. For personal reasons, it would be too inefficient and take too long to do. I wouldn't program in Python for the same reason.
I don't see anything wrong with creating or sponsoring Adept, just as I don't see anything wrong with the idea of starting Ubuntu in the first place. You can complain or you can say that competition's a good thing.
Mark, for one, is probably not going to pursue something unless there really is a shortcoming to be addressed.
Sean: What about Ubuntu's OEM strategy? They have done really solid work in getting Toshiba to push Ubuntu, for example, and of course they also have Dell with five systems. They seem to be doing a really bang-up job in getting OEMs to ship, promote, and push the Linux based desktop, and that doesn't even count the netbook push that's happening.
Do you think they may have figured out a strategy that could give them a certain position on desktops that others haven't quite figured out yet?
Warren: An important consideration is that Mark is more or less a billionaire, and he made that money in the computer industry. He can talk to companies like Dell, IBM, Toshiba, and others with a level of credibility that no one else in the Linux industry that I'm aware of can match.
He can get in the door to propose things, and he can get things agreed to that nobody else can. That gives him an advantage when you consider one distro over another, but that's just how things are.
That's good for Linux as a whole, and it means that companies like Dell and Toshiba are starting to think about compatibility more than they were before. And that's a good thing. There are people who have told me, "Hey, this is really great. I bought a Dell machine with Ubuntu and then put SimplyMEPIS on it and it all worked."
Sean: You've got Intel producing video drivers and wireless drivers and you've got network managers, so this is where the Ubuntu fuzzing works both for and against you. On the one hand, it makes the user feel like the network manager is Ubuntu's network manager, even though it isn't, really. On the other hand, they can just go stick a different distro on it.
Warren: Yes. In that regard, they're having a positive impact on Linux compatibility with mainstream hardware, by getting the mainstream hardware companies to think a little bit about compatibility.
Scott Swigart: How much difference is there, from an application developer's perspective, between different Linux distros? And how much work is that to take into account, and how much does something like the Linux Standard Base help with that?
To put it another way, for an application developer, how much effort is it to support and test and ensure that you're compatible with lots of different Linux distributions? And how much of that work is done by the app developer versus the distros themselves?
Warren: That is a very big question. About three years ago, some of the biggest companies in the computer industry brought together some Debian-affiliated companies like mine regarding this very question. They wanted to explore having Debian-based Linux as a competitor for Red Hat and Novell, and it gave rise to the ill-fated Debian Common Core Consortium.
Those conversations arose from this very issue. You couldn't have commercial applications or commercial support for a particular Linux distro without a known, stable base. That plays out at different levels, because it depends on what kind of application it is. If it's something for a server, then probably, you only care about a core set of packages.
You can have a core set of packages, and you can have standards around that. If you do, then at the very lowest foundational level, companies that are considering something commercial related to Linux have a common base that they can rely on. But what's the real core if you're running a practical application?
And if we're not using straight X, then what toolkit are you using, and what version? Consider the case of Acrobat Reader for Linux. How are you going to release Acrobat Reader in a way that runs cross distro, when each distro and each release of each distro may have different versions of key libraries?
Adobe does it by basically bundling those libraries, so they only have to rely on the very minimum number of compatible packages, or libraries, on a particular release.
From the point of view of an application developer, the problem is that every distro and every release of every distro has variations in what versions of core packages are installed. And because each distro has a different philosophy about long term maintainability and about stability of the distro, it's a moving target forever.
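As a small, hedged illustration of the moving target Warren describes, an application can ask the running C library which version it is, which is one of the core packages whose version varies from distro to distro and release to release. This is a minimal sketch, assuming a glibc-based Linux system; `gnu_get_libc_version` is glibc-specific and will simply be absent on other C libraries.

```python
import ctypes
import ctypes.util

# Locate the platform C library; its name varies by platform and distro.
libc_name = ctypes.util.find_library("c") or "libc.so.6"
libc = ctypes.CDLL(libc_name)

try:
    # glibc-specific: returns a version string such as "2.31".
    libc.gnu_get_libc_version.restype = ctypes.c_char_p
    print(libc.gnu_get_libc_version().decode())
except AttributeError:
    # Non-glibc C libraries (musl, BSD libc) do not export this symbol.
    print("not glibc")
```

Bundling, as Adobe does, sidesteps exactly this check: ship known versions of the libraries and depend on as little of the host system as possible.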
You know, it's a miracle that Firefox works so wonderfully. Those guys are incredible, and so are the Open Office people. Figuring out how to write code that is compatible with so many different versions of libraries to run with or be compiled against is a huge job. This is where Linux has a really big disadvantage when it comes to building complicated applications that you want to distribute broadly.
Scott: Educate me on the Linux Standard Base. What you basically said is that it's a moving target forever. How much does the Linux Standard Base do to alleviate that? It feels kind of like POSIX back in the day.
Warren: In my opinion, LSB doesn't do a lot for that. LSB provides some value, like POSIX provides some value, but it doesn't resolve all the issues.
New releases of MEPIS are still on top of Debian, which acts like a stable core or foundation on which MEPIS is built. If you take the latest version of OpenOffice and recompile it for Debian Lenny, OpenOffice 3.0 will compile in that environment. But OpenOffice 3.0 will not compile on top of Etch, because so many things have changed. One of the things that happens is that a lot of libraries change over time–names of libraries, APIs, and entire philosophies change.
Underneath it all, Linux represents a very active developer community in various areas with people trying out new ideas or making improvements. Without them being coordinated with or accountable to projects like OpenOffice, or Firefox or whomever, it's a real challenge to build a stable distro or to build a complex application and have it be compatible.
Scott: There seems to be a lot of pressure as companies become more comfortable with Linux, to move from the paid distributions to the free ones, because they find that they're not picking up the phone. They are not making a lot of support calls and things like that.
On the other hand, if we are at a conference and we walk up to the Red Hat booth and ask them about that, their response is typically that the Linux market overall is growing, and it's only going to be a small percentage of the total number of Linux users that go from free to fee. They are just happy to see the overall market grow.
Warren: In general, corporate America wants something that is supported officially. You can argue about why that is, and it may be a bit cynical to say so, but my impression is that people in companies can't afford to take the chance of guaranteeing something themselves.
You are in a company and you are working toward retirement. You have a good job with benefits, and you do not want to say, "Well, let's use this free version of Linux. I guarantee you it will be fine, and we will take care of any problems that come up."
People don't want to do that in the corporate world. They want to say, "Well, Gartner says this thing over here is great and will work fine for our purposes. They have an annual support contract, and they're an established company, so we can go with this." The company can feel comfortable, and everybody up the food chain can be held blameless if something goes wrong.
I think that is the number one driving factor for free versions of Linux not being used in corporate America, except covertly or in very special circumstances where management is used to taking risks. Some industries are more risk-averse than others, of course.
The commercial versions of Linux also offer extras that the free versions don't. If you look at Red Hat versus Fedora or Novell versus SUSE, they are offering extra things, like guaranteed update schedules, tiered support, and other kinds of extras. For example, their versions also contain extra utilities that facilitate and manage enterprise deployment of Linux.
Now, a small company can sure start out with Red Hat and switch to Fedora, and in a ten-person company with a good technical person in house, that could work out fine. For the most part, though, that won't happen in larger companies except where risk taking is the norm.
Scott: Do you see that changing over the next five years or so? Do you see some of those larger, somewhat risk-averse companies realizing that there is money that they could be saving by not writing a big check every year, if they haven't been having the problems they were worried about?
Warren: They absolutely will consider writing a smaller check to a different company, but I don't see them going with something that is completely free and open source unless it is not critical to their operations.
I know of a recent scenario where a company had no approved product for doing a translation; if I remember correctly, it was a transform between PDF and TIFF. They couldn't find a commercial product that exactly met their requirements and was approved by their enterprise architecture organization.
Since they couldn't get enterprise architecture to suggest a product, they went with something open source. But that's a minor usage of an open source product. I guarantee that same company won't use Linux as a platform OS. They hang their hat on IBM big iron and AIX.
In the corporate world, it's largely not a matter of writing a check or not–it's more writing a big check versus a smaller one, where the person championing it to upper management isn't going to get hung if something doesn't work.
People will be bothered if there's a big problem with some software, but if the company is big enough and proper due diligence was done, then nobody is going to be fired if it doesn't work out.
Scott: What about the year of the Linux desktop?
Warren: It's never going to happen. Sorry.
Scott: Why not?
Warren: There's a chicken and egg problem with getting it on the desktops, where no matter how much Mark Shuttleworth does, Michael Dell is not going to tell Bill Gates where to go. No one is going to forget that Microsoft's the big game in town, no matter how much Microsoft stumbles.
Mark Shuttleworth can spend his entire billion dollars on trying to make Ubuntu good enough to shoot down Microsoft on the desktop and that won't change. It goes back to what I was saying earlier about the fragmentation in the Linux market.
You don't have one set of products against which you can build commercial software, or do commercial deployment, or even long term enterprise deployment. It's doable with Ubuntu, but it's not a no brainer, although Novell and Red Hat are trying to address that part.
Right now, I don't know of a single major corporation that would go with Linux on the desktop for one reason–no Visio. Until OpenOffice does a Visio clone, you can forget it.
Sean: That reminds me of a story. Wikipedia went to Ubuntu recently for all their servers, but they still have one Windows machine to run QuickBooks, which runs their financials.
There's the problem, right? And it leads to the question, if you were king for a day, what is a reasonable goal for Linux on the desktop?
Is it netbooks, where because it can be thin and light, it's kind of a doorway to the Internet? And then if you want to leverage all those other locally installed apps, you have to go over to the Windows thick client? That scenario is a bit like the way Apple is carving out a niche with a certain set of consumers.
Warren: Like I said, Linux desktops cannot win in big business as long as there is no Visio clone. They can't win in small business because of the very thing you mentioned–QuickBooks. Small companies all seem to use it.
On the other hand, some things that are happening right now, like Nokia buying Trolltech and Google inventing Android, can shed light on where the opportunity lies.
In other words, appliances are a place where Linux, with a GUI, can land and really thrive in my opinion–not the desktop we'd normally think of.
Scott: We talked to someone from Mandriva, and his impression was that by focusing on the Linux desktop, people are missing the boat. He suggests that Linux is going to bypass the desktop altogether, because people are looking for the day when they won't need their laptop on the road, because they have a phone and similar devices that are sufficiently capable.
So Linux is not well suited for the desktop, as we know it, but it might be very well suited for what comes after the desktop. Even if the desktop always remains as a mainline use case, there are going to be other scenarios with handheld devices.
Linux shouldn't really be aiming at where people have been, it should be aiming at where they're going, and it might be better suited for that.
Warren: I think that's exactly where it is going, and it is happening right now, specifically with things like Android and Nokia. Notice that in the handheld arena, you probably don't want QuickBooks or Visio.
Because these application markets are just getting established, if you make sure that the environments support the major toolkits for developing applications for those environments, you're not going to fall short.
To touch back on Linux on the conventional desktop, though, I use MEPIS for everything I do. I just spent a year on site in a corporate environment, and I used MEPIS every day.
Almost everybody there was using Windows, but my Citrix connections were better, and so was my wireless connection within the corporate environment. Through Citrix, I was able to use Microsoft apps. An individual can absolutely use Linux for their desktop and have very little that they're falling short on.
I also run all kinds of things in VirtualBox. Actually, for MEPIS, I do a recompile of the open source VirtualBox, because since Sun took over, they've been a little behind on recompiling the open source edition. If you're using a 2.6.26 or a 2.6.27 kernel, you can't run a guest on VirtualBox unless you have VirtualBox version 2.0.2 or better.
The bottom line is that I can run Windows in VirtualBox, and I can run all kinds of test scenarios in VirtualBox. I can run from whichever 32 or 64 bit version of MEPIS I want. My main development machine is a Mac Pro, so I can even run OS X when I need to do that. It gives me one heck of an environment as a developer.
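The kernel-versus-VirtualBox constraint Warren mentions (2.6.26+ kernels needing VirtualBox 2.0.2 or newer) is the kind of check a packager can script. A minimal sketch, using only the Python standard library; the tuple threshold is taken from the text, and the parsing of release strings like "2.6.27-11-generic" is an assumption about typical formats:

```python
import platform

def kernel_at_least(required):
    # Parse a release string such as "2.6.27-11-generic" into a
    # numeric tuple and compare it against the required version.
    release = platform.release().split("-")[0]
    current = tuple(int(p) for p in release.split(".")[:3] if p.isdigit())
    return current >= required

# Per the interview: on a 2.6.26+ kernel, VirtualBox 2.0.2+ is needed.
print(kernel_at_least((2, 6, 26)))
```

Tuple comparison makes the version test read naturally: (2, 6, 27) >= (2, 6, 26) compares element by element, left to right.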
Scott: I want to be sensitive to the time, so is there anything you want to add in closing?
Warren: Earlier you asked about what I thought was not given as much attention as perhaps it might be. To follow up on that, there are reasons for DNSSEC and IPv6, for example, to be implemented and used.
In the circles that I run in, I haven't heard much talk about DNSSEC or IPv6, except very recently. I think that with IPv4 running out of IP addresses, IPv6 has to come along really soon. But the problem is infrastructure, not OSes; I think most Linuxes can do IPv6 out of the box. MEPIS certainly does.
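Warren's claim that most Linux distributions handle IPv6 out of the box can be spot-checked from userspace. This is a minimal sketch using only the Python standard library: it checks whether the interpreter was built with IPv6 support and whether the OS will actually hand out an IPv6 socket.

```python
import socket

def ipv6_available():
    # socket.has_ipv6 reports whether this Python build supports IPv6;
    # opening an AF_INET6 socket checks that the OS stack will provide one.
    if not socket.has_ipv6:
        return False
    try:
        s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
        s.close()
        return True
    except OSError:
        return False

print(ipv6_available())
```

This says nothing about whether the surrounding network infrastructure routes IPv6, which is exactly the gap Warren points at.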
DNSSEC, though, has been with us for 16 years now as a concept, and it hasn't been implemented. It should greatly improve security on the Internet regarding spoofing and things like that, and there's work being done to actually implement it now. Still, though, I don't hear much being said about it, and I don't know to what degree people are getting ready for it or considering implementing it sooner rather than later.
The .gov domain is going to start implementing DNSSEC January 1st. There are trials that have been done, I believe, for .com and .org, but if you look to see what's been done regarding integrating DNSSEC in user applications, there's practically nothing.
It's all at the experimental stage. To go to a website and be able to identify immediately that it does not have a valid DNS record would be a great thing. That's something that I would hope that the Firefox project is going to put in the next release, but I don't know that they are.
Sean: Those are great points. Thanks for taking the time to talk today.
Warren: Thank you.
[Apr 30, 2009] Is Xming Another Example of Misunderstanding Libre Licenses?
Another rabid "libre" defender. Pretty annoying and narrow minded...
Wed, 2007-11-14 04:13 - dcp
Xming appears to be a useful program for accessing and running your GNU/Linux applications remotely from a Windows computer. It is licensed under the GPLv2. But just how free is it, really?
Xming looks like a great tool for connecting Windows boxes to remote GNU/Linux computers to run applications. And, according to the license included with the software, it is released under the terms of the LGPLv2. Unfortunately, the developer's website includes a strange notice that poses a potentially confusing challenge for those who wish to redistribute the software. In addition to the GNU GPL license, the author requires that anyone desiring to redistribute the software must ask permission to do so.
According to the 3rd paragraph (displayed according to "fair use", as allowed by U.S. copyright law):
Xming is mostly a derivative work with many component licenses e.g. the zlib license, LGPLv2 for Pthreads, MIT/X11 for all PuTTY tools and the Pixman library, or Creative Commons by-nc-sa. Redistributing any part (or whole) of the Xming website, documentation, images, executables or installers, by the internet, other projects/products or via media such as CD's, without asking permission, attributing 'Colin Harrison' and providing links to http://StraightRunning.com/XmingNotes/ and SourceForge Project Xming will be regarded as a breach of copyright.
Note that the statement, as it currently reads, states quite clearly that the program is derived from several others, including software licensed under the GNU LGPLv2. The statement then refers to redistributing any part (or whole). Taken as is, the statement appears to cover the upstream code licensed under the terms of the LGPLv2 - in violation of that license's Section 10. Whether this is what Colin Harrison, the Xming developer intends is another question.
When Blue GNU contacted Harrison seeking clarification, his initial response was that if I didn't like the license, I didn't have to use the software. When I clarified that I was not attempting to engage in a debate, he replied that, "I don't have the time to explain my licensing at the moment. When the dust settles on the GPLv3 debate I will clarify the situation more in Xming." Meanwhile, the statement leads to possible confusion now, and downstream users need to be clear as to what is covered and what is not.
Harrison did take time to complain "that most of the software I have written in the past for free is now embedded in devices and products that I don't have source access to, or even change feedback or attribution." He did not specify whether he thought any of these uses were in violation of his license, or whether he has made any effort to enforce his license in such cases. He went on to further complain about "freeloaders" who do not give "attribution" without being more specific.
Harrison's e-mail ended with a question about whether FOSS might be devaluing the expertise of professional developers. Blue GNU has replied further saying that clarification of the statement on the website is needed, since Xming may be violating the LGPLv2 under which Pthreads is distributed by adding further restrictions on the software (section 10 of the LGPLv2) and failing to make the source code available. No source code for Pthreads is available from the Xming site, nor included in the distribution. Blue GNU also contacted the Win32 Pthreads developers and the FSF to find out if they agree there might be a violation.
Harrison's stipulation that redistributed copies of Xming must include links back to his own project appears hypocritical in view of the fact that he does not link back to the projects mentioned in the same paragraph. And, while it might withstand scrutiny in court, the stipulation amounts to a sort of advertising clause - something that has always been considered problematic. Perhaps this could have been better resolved through a trademark?
Neither Colin Harrison, the Pthreads developers, nor the FSF have had time to respond to the latest e-mails, so it remains to be seen how the issue will play out. However, the situation does raise some serious questions about how well developers understand Free Software licenses.
- How well do developers understand the various licenses they choose to use?
- Do they even bother to read them, or do they simply choose them because they're canned legalese?
- How well do people really understand that Free Software, according to the Free Software Definition and other documents, must be free to distribute and use commercially (for any purpose)?
- Do developers understand that a company, an individual - anyone - can modify Free Software in-house, and never give back so much as a comment, as long as they do not redistribute the modified version?
- Can a developer release something under a given license, and then still require explicit permission to redistribute it, or does meeting the license requirements satisfy copyright law?
- How does one define "attribution"? If all the copyright notices are intact, is that not attribution? Or must a project necessarily advertise for its upstream project(s) in order to be considered to be "attributing"?
Developers and others who might modify and/or redistribute Free Software should always undertake to read and understand any licenses for code they choose to modify and/or redistribute, so they can ensure compliance. They also need to be clear, when imposing additional stipulations that may involve code others have developed, as to exactly what is, and what is not, covered. While the issue can very likely be resolved fairly quickly and with little fanfare, developers could save themselves a little time and a lot of stress by doing some research and communicating clearly from the beginning.
Note: As of 14:25-14:30 GMT-5, the Xming site is returning a time-out error.
Note: Warren Woodford ran into problems when notified by the FSF that he must provide the source code for his Mepis distribution.
Walk away
All my software on SourceForge is Public Domain. Do with it what you want, or shove it. I have explained the minor error I made on my site and elsewhere.
Walk away Parris, stop trolling now.
Don Parris... stop bearing false witness
Having failed to troll me on my own Xming Forum, Parris appears to have forced me to troll his. Xming is no longer licensed GPL; it is a derivative work with many licenses, none of which is GPL, except for the legal use of one LGPL library. As I said to him, if you don't like the license, don't use it. That statement appears to have started World War Three. I suppose you could not expect anything better from an 'Absolute Truth' merchant like him, so we all have to suffer defamation at his hands.
Neither the FSF nor the maintainer of Pthreads-Win32 has given me the thumbs down as described by Parris here. And yes, I have wasted lots of time on this, and yes, I have corrected some minor licensing anomalies in Xming.
Again, "Parris, don't use anything you don't want to," and I'm not a blue-GNU supporter (nor can I be converted and baptised as one for your gratification).
BTW I have no objection to developing GNU/FSF software, but if it could ever be used by people like Parris, perhaps I should :)
LGPL version 2
SCO-Linux controversies - Wikipedia, the free encyclopedia
[Apr 22, 2009] Once more on SCO Paul Murphy ZDNet.com
From this blog for January 8 2008:
What's worst about [the hate mail I get whenever I comment on SCO], however, is that the "groklaw effect" has become a significant component in the overall Linux "gestalt" - and just about everything most of that mob has been led to believe about the case is wrong.
The fundamentals of the case are simple: SCO (and I use the name generically) asked IBM to pay continuing royalties under its AT&T Unix licenses; IBM said words to the effect of "Nope, we have a fully paid perpetual license"; a discussion ensued during which SCO became convinced that IBM had breached the contracts by allowing people with intimate knowledge of the AT&T source to contribute to Linux; and so SCO issued the 90 day license suspension warning against AIX required under that contract.
At the time I expected that IBM's senior management would review the issue, recognize a problem, and settle expeditiously with SCO; but that didn't happen. Instead IBM circled the wagons and waited for SCO to do what it had to under the contract: escalate the conflict by formally suspending IBM's Unix licenses with respect to AIX and then ask a court to enforce that order against IBM.
That should have been a simple process: all SCO had to do was show the court that at least some IBM Linux contributors had significant prior or concurrent exposure to AIX source and it would have been game over for IBM. That didn't happen either: instead it appears that someone somewhere in the process saw the combination of IBM's intransigence and deep pockets with a rather obvious contractual dispute as a potential gold mine - and out of that we get the next act in which a major east coast firm, headed by the same lawyer who had been unable to prove that Microsoft benefited from an illegally obtained and enforced monopoly, gets a cost plus style contingency agreement to prosecute the case against IBM and we start to see inflammatory, and largely incorrect, claims issued in SCO's name.
[Apr 22, 2009] Technology Review: Why IBM Needs Sun
Re: Does Cringely even follow the SCO lawsuit
anonymous-insider on 04/06/2009 at 7:48 PM
Groklaw's anti-SCO efforts began in 2003 at http://radio.weblogs.com/0120124/ as a PR campaign by Pamela Jones for her company Medabiliti Software Inc.
Medabiliti Software Inc had just written for Exemplar International (known today as Examinetics) a Web application named XM Network. XM Network ran on Linux and was written on open source software.
Medabiliti Software Inc needed a PR campaign to convince its clients, including Exemplar International, that SCO's suit against IBM had no merit. A timeline of Medabiliti Software is found at http://tinyurl.com/djp8lj .
Originally, Groklaw was Medabiliti Software's vehicle for that campaign.
While Medabiliti Software ceased operations in 2004, Groklaw's private agenda continues to be a PR campaign against SCO.
Re: Does Cringely even follow the SCO lawsuit
anonymous-insider on 04/07/2009 at 1:33 PM
Pamela Jones began to write at http://radio.weblogs.com/0120124/ .
Later in 2003, she stopped writing at http://radio.weblogs.com/0120124/ and moved to http://www.groklaw.net/ . See the note 'We Have Moved Permanently' at http://radio.weblogs.com/0120124/ .
Some of Pamela Jones' writings were factually wrong due to her anti-SCO agenda. The article 'And They Call Linus Careless', found at http://www.groklaw.net/articlebasic.php?story=66 , is an example of such writing.
Linus Torvalds was indeed careless on Linux kernel source code management. See the paper 'Kernel comparison: Improvements in kernel development from 2.4 to 2.6' at http://www.ibm.com/developerworks/linux/library/l-dev26/ .
Groklaw is mainly read by fanatics that blindly follow Pamela Jones' agenda. Still, you're welcome to draw your own conclusions with content found at http://tinyurl.com/c4ypfg and http://tinyurl.com/cy8mcz
Re: Anonymous Insider
anonymous-insider on 04/10/2009 at 12:47 PM
In regard to Kimball, I'll use the same words written by Paul Murphy:
"So here's a bet - and if you want to accept the bet just put a note to that effect in the comments here. If this court [ http://www.tmcnet.com/usubmit/2009/03/07/4038711.htm ] or one directed by it does not overturn Judge Kimbal's ruling and I'm still writing this blog [ http://anonymous-insider.blogspot.com/ ] the week after it happens, I will dedicate a Saturday entry entirely to lines that look like this: Dear XX - I'm sorry, I was wrong about the appeal court's response on SCO's ownership of the copyrights."
Groklaw's intent to influence Kimball's decision is clear.
Then, the speculation on Solaris 8 source code and methods on Linux arises on how easy it was for Linux kernel developers to obtain third party source code.
As Paul Murphy said, "the corpse is still twitching". The corpse might go to court again...
Finally, you'll find a note by Pamela Jones at http://www.groklaw.net/newsitems.php about Cringely's article:
"[PJ: For what it's worth, my guess is that SCO has absolutely nothing to do with it. For one thing, while Cringely says "some mingling" has taken place, the allegations SCO has still on the table are essentially none. Here's what SCO and IBM each filed, telling the judge what each believes survived the decision in SCO v. Novell that Novell owns the copyrights and that Novell has the right to tell SCO not to sue IBM. Unless SCO is able to overturn that decision on appeal and then win before a jury after a trial, which I consider unlikely, personally, there is nothing dogging IBM about any SCO allegations about copyrights. Even if that could happen, and SCO were named the owner of the copyrights, the code that SCO finally presented before the court's ruling was more or less nothing at all, with multiple defenses at hand even if they were not laughed out of court, to the extent they were known publicly. So whatever the proposed deal was about, I don't believe IBM is worried about SCO v. IBM. And the fact that IBM apparently just walked away from the deal indicates whether I am right in my supposings or not.]"
That's the note that sent me here.
Your original pro-Groklaw comment motivated me to argue with you.
IBM's future is indeed tied to Sun's acquisition
anonymous-insider on 04/06/2009 at 4:28 PM
Recently, Rob Enderle wrote:
"There is some speculation that Solaris is the source of many of the core components in the current generations of Linux, and that IBM's acquisition could prevent another SCO event in the future, should someone less friendly acquire Sun instead."
Some of the rationale behind the speculation has been laid out at http://tinyurl.com/c9nzuu (see the comments at the bottom of the main article). If the speculation proves correct, IBM will have to pay Sun's asking price.
Re: IBM's future is indeed tied to Sun's acquisition
anonymous-insider on 04/07/2009 at 1:43 PM
I welcome you to obtain a copy of Solaris 8 source code as it was released by Sun in late 2000. See http://tinyurl.com/cbedk3
Solaris 8 source code was available for download on the Web until sometime in 2002. There might be some copies at MIT.
Then compare against Linux kernel versions 2.4.17 and later, all 2.5 tests and early 2.6 versions.
Do your comparisons in the following kernel areas: task scheduling, virtual memory management (VM), communication device drivers, TCP/IP, storage device drivers, web server, kernel locking, kernel preemptibility (SMP only), buffer cache management, IPC (semaphores, shared memory, message queues, and pipes).
Re: IBM's future is indeed tied to Sun's acquisition
anonymous-insider on 04/07/2009 at 5:35 PM
"... there is no PROOF that PJ ever worked for Medabiliti Software..."
That was exactly Medabiliti Software's strategy.
For better 'credibility' in its PR campaign against SCO, Medabiliti Software needed to distance itself from Groklaw.
There are a few bytes of information that relate Pamela Jones of Groklaw to Pamela Jones of Medabiliti. See http://tinyurl.com/c4ypfg
Another reason for Medabiliti to distance itself from Groklaw was XM Network, built on open source software and never shared back with the community. See http://tinyurl.com/dgssaa
In regard to Solaris 8 source code in Linux, there'll be plenty of time in the near future to prove the speculation.
Re: IBM's future is indeed tied to Sun's acquisition
anonymous-insider on 04/07/2009 at 6:00 PM
"And They Call Linus Careless"
"SCO's Amended Complaint attacks Linus for allegedly being careless, allowing code in without checking for IP problems first."
On November 2, 2001, Alan Cox announced "Marcelo Tosatti will be the head maintainer over the 2.4 stable kernel tree", see http://www.advogato.org/article/370.html and http://tinyurl.com/cbtlez .
Did Marcelo Tosatti have the age and experience to discern whether code added to the Linux kernel was free of copyright infringement? For an answer, see http://web.archive.org/web/*/http://marcelothewonderpenguin.com
A few weeks after Tosatti was named the new Linux 2.4 kernel maintainer, an e-mail message was transmitted over the Internet. The subject of the message was "2.4.16 & OOM killer screw up".
A well-known Linux kernel developer wrote "The VM code lacks comments, and nobody except yourself understands what it is supposed to be doing."
Another Linux kernel developer wrote "Andrea, it seems -aa is not the holy grail VM-wise. If you want to merge your good stuff with marcelo, please do it in the 'one patch with explanation per problem' style marcelo asked."
The 'author' of the VM code responded "it's not true that I'm the only one who understands it. For instance Linus understand it completly, I am 100% sure."
All the "2.4.16 & OOM killer screw up" messages have been gathered at one single URL, see http://tinyurl.com/dxse7b . Nevertheless, copies of the entire electronic exchange are available on the Web in various locations.
Allowing a young and inexperienced Brazilian programmer to maintain Linux kernel version 2.4.16 and later 2.4 releases was indeed a careless decision by Linus Torvalds. Or was the decision intentional?
Robert X Cringely's quote, not mine
anonymous-insider on 04/20/2009 at 7:40 PM
It was Robert X Cringely who wrote
"While IBM has the upper hand in the SCO suit, which has been ongoing since 2003, it has become clear that some code commingling has taken place, which could hurt future copyright and intellectual-property claims over software developed for Linux and AIX."
Now that Oracle has formally made a deal with Sun, it's definitely IBM's loss. Very similar to Cringely's narration of how Gary Kildall sent IBM away.
In this case, it was IBM's turn to send Sun away.
There is no "distribution" here, so the reciprocal clause of the GPL is never triggered. In such a world, service providers can use GPL-licensed code in proprietary back-end server farms with impunity. This seems contrary to the spirit of the authors of much of the GPL-licensed code used in this way, although it strictly complies with the license. It means that, as Bradley warned, GPL code can be used in the cloud computing market in exactly the same way as BSD code can be used in the traditional software market.
Dec 20, 2008 | Yaakov Nemoy
Summary, in case you want to skip this.
Open Source development is pretty close to Anarchism. Still, we rely on the courts and government to protect Open Source. What if we were to lose that support, what would the Open Source ecosystem look like then?
Before I begin, let me redefine anarchism away from the bad taste in your mouth: the purely chaotic society where anyone would kill his parents if it meant a few bucks. It's really an insult to the decency of mankind to presume anyone would act in such a way. When I refer to anarchism, I refer to a self-regulating, self-ruling society where the individual decides which rules are important.
I was watching an interview with Eric S. Raymond where the interviewer asked him the million dollar question: "Is Open Source Communism?". His response was extreme disgust, and his argument against this was about the very nature of Communism. Communism forces the individual to share and participate in a single monoculture society, where if you chose not to be a member, you were thrown in the Gulag, shot in the back of the head, and even buried in an unmarked grave. The question was raised around the 'viral' aspect of the GPL, in how it forces the redistributor to retain that license on all modified code. But let's face it, very few people actually want to force people to use the GPL and nothing but the GPL.
Let's take this completely the other direction. Economically, Capitalism is considered the economic polar opposite* of Communism. The idea behind Red Hat is that Open Source makes perfect business sense, because it's been proven to encourage faster economic development than the traditional methods that preceded Open Source development. Capitalism is certainly akin to Anarchism, in that they both encourage a certain free growth, unimpeded by any other limitations. For example, in our society, most capitalistic economies are limited by government regulation, but are otherwise completely subject to the consumer demand.** Capitalism, especially as Open Source moves into it more, relies on a set of organically grown collective agreements between the different corporations. Still, it relies on a level of government regulation and intervention to support and maintain these agreements. For example, corporations rely more than ever on the court systems to enforce trademarks worldwide, because without an overarching court, any individual can use a trademark freely with little retribution. Open Source moves corporations into a space though, where they no longer compete with each other directly, but actually support each other. This is fairly close to an anarchistic economy.
written by Armin Ronacher, on Thursday, February 12, 2009 7:13.
When I started using Linux I was totally sold to the concept of Open Source. I still am, but my view changed. The first code I released under an Open Source license was GPL licensed and I continued to do so for some time. The last code under the GPL license I actively developed was Zine until a few days before the release when I relicensed it under the modified BSD license.
The reason why I changed the license is a rather complex one and I want to share my experiences with the GPL and other Open Source licenses here for a moment. I suppose many people acted like me and chose the GPL because everyone else did but didn't know about all the implications it has.
Left versus Right
The GPL and BSD (and friends) licenses couldn't be more different. It starts with the length of the text: the BSD license is two or three clauses plus a copyright notice and a no-warranty clause, while the GPLv3 has 600 lines of text. BSD enumerates restrictions; the GPL enumerates permissions. Restricting rights sounds bad, but it just means that you can do everything with the code except the things listed in the license. The GPL instead starts by explaining what you can do with it. The GPL follows the copyleft principle, which the BSD license does not.
This has some very complex implications many GPL / BSD users don't know about but should.
What BSD means
Let's start with the BSD license, the license of my choice. The three clause version is very similar to the MIT license and the two clause version is basically the MIT license. What does it say?
Copyright (c) <year>, <copyright holder>
All rights reserved.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
- Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
- Neither the name of the <organization> nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
Pretty simple. It allows the user to do everything with the application except remove the copyright notice. The third clause means that derived works may not use the authors' names for advertising. This clause is not in the 2-clause BSD and MIT licenses.
Now this of course means that someone can take your software, change the branding and sell it. The world is bad and you can be sure that this will happen if your application is worth it. We'll cover that part of the license a little bit later.
Let's see how the GPL works there.
What GPL means
The GPL license is too long to be quoted here, but I'll try to sum up the most important aspects of it:
- Copies may be distributed free of charge or for money, but the source code has to be shipped or provided free of charge (or at cost price) on demand. The receiver of the source code has the same rights, meaning he can share copies free of charge or resell them.
- The licensed material may be analysed or modified.
- Modified material may be distributed under the same licensing terms, but does not have to be distributed at all.
There is a lot more in the license, like how long source code has to remain available and how to deal with that, but that is the essence. As with the BSD license, someone can take the application, rebrand it, and sell it; however, the license demands that the modified source be made available under the same terms.
However, modifications only have to be made available to the public if distribution happens. So it's perfectly fine to take GPL code, modify it heavily, and use it in an application that is never distributed. This is how companies like Google can run their own patched versions of Linux, for example.
But this also means that non-GPL code can't use GPL code, which is the main problem with it.
BSD is GPL compatible, but the GPL does not permit the use of GPL-licensed code in non-GPL code. This is especially annoying when important libraries that users expect are GPL licensed. For example, the very popular readline library is GPL licensed. Users of OS X will know this, because the interactive shells of Python and other non-GPL applications suck there. People have tried to rewrite readline to get rid of the GPL problem, but the alternatives are not as well maintained as the original.
I guess this is also what Steve Ballmer referred to as "cancer". Unfortunately he's not entirely wrong there. For example, I tried to develop an interactive administration shell for Zine, but without readline (which I cannot use, as Zine is BSD licensed) the user experience is just meh. I would have to relicense the entire application to GPL just so that I can have an interactive shell with readline support.
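To make the dependency concrete, here is a minimal sketch of such an admin shell in Python (hypothetical and illustrative only, not Zine's actual code): merely importing the stdlib `readline` module, which on most Unix builds is a thin wrapper linked against GNU readline, is what gives `input()` and interactive consoles their line editing and history, and is also what raises the GPL question.

```python
# Hypothetical admin-shell sketch. Importing `readline` (a stdlib binding
# that typically links against GNU readline) enables line editing and
# history for input(); without it, the prompt falls back to plain input.
import code

try:
    import readline  # may be absent on some builds; absence = "meh" UX
    HAVE_LINE_EDITING = True
except ImportError:
    HAVE_LINE_EDITING = False

def make_admin_shell(namespace=None):
    """Build an interactive console; editing quality depends on readline."""
    banner = "admin shell (%s line editing)" % (
        "readline" if HAVE_LINE_EDITING else "no")
    console = code.InteractiveConsole(locals=namespace or {})
    return banner, console

banner, console = make_admin_shell({"app": "zine"})
print(banner)
# console.interact(banner)  # would block waiting for user input; not run here
```

The point of the sketch is that no readline-specific API calls are needed; the mere presence of the linked library changes the behavior, which is why avoiding it from a BSD-licensed application is awkward.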
Now this depends on how you define freedom. The people behind the GPL have a very communistic point of view in terms of freedom: free software should be available to everybody under the same terms. Unfortunately, like communism, it does not work out that well, because it turns out humans are not really compatible with that way of looking at things. On the other hand there are the permissive licenses like BSD that just give away all rights except the copyright and do not enforce freedom. You can take BSD code and re-license it under the GPL if you want to. That kind of freedom, however, is a one-way ticket. Once you've made a GPL release of your code, there will always be a GPL version of it. If not for future releases, then at least for that one release, as you can't revoke the license.
Ultimately the goal of software development for many is to make money. Many people decide to utilize the GPL for that by dual-licensing the code under the GPL and a proprietary license, where the latter is only available to customers. As a single developer it's arguably harder to sell code that is licensed under the BSD license. There the business model is probably more about selling non-open-source extensions to paying customers. If you open source all your code under the BSD license, you have to be really good to make money out of it.
Many developers don't really care about that; they have their fun developing it and BSD-license it for others to start where they stopped. Good examples of successful BSD / MIT code are Django and Ruby on Rails. Both projects are developed by strong communities with supporting companies behind them. The company behind Rails creates very successful closed source applications based on Rails; many of the developers working on Django are paid by individual companies that work with it.
Before you license your code under an Open Source license: Think about the license! Both types of licenses have their advantages and disadvantages and it would be stupid to use the GPL without thinking just because "everybody does". Many just do because they haven't read the license either.
August 15, 2008 | legaltimes.typepad.com
Federal Circuit: Copyright Infringement Applies to 'Artistic' Licenses
Free software is available everywhere on the Web, and downloading it is a cinch. But breaking the terms of so-called "open-source" or artistic licenses can amount to copyright infringement, the U.S. Court of Appeals for the Federal Circuit in the District of Columbia ruled this week.
The appeals court in a California case reversed a federal district court ruling that said a person who breaks the terms of an artistic license can be held liable for breach of contract, not copyright infringement. The remedy for infringement - including injunctions, statutory damages and attorney's fees - can be more substantial than a breach of contract award.
Bob Jacobsen, a physics professor at the University of California, Berkeley, manages a software group called Java Model Railroad Interface, which produces a programming application called DecoderPro. Model railroad enthusiasts use DecoderPro to program chips in model trains. Jacobsen accused Oregon resident Matthew Katzer and Kamind Associates of copying materials from the publicly available software and incorporating them without following the terms of the public license. Jacobsen filed a copyright infringement complaint and sought an injunction against Katzer and Kamind Associates of Hillsboro, Oregon.
The U.S. District Court for the Northern District of California, in San Francisco, ruled Jacobsen's artistic license was "intentionally broad" and had unlimited scope. The Federal Circuit remanded to the district court, where Jacobsen's complaint was filed in 2006. A status conference will be held in the coming weeks.
"This decision confirms what everyone in the community knew, that the terms must be followed or it is copyright infringement," Jacobsen's attorney, Victoria Hall of Bethesda, told Legal Times. A message left with Katzer's attorney, R. Scott Jerger of Field Jerger, in Portland, Oregon, was not immediately returned. Katzer could ask for a rehearing en banc or petition the Supreme Court for certiorari.
Much of the open-source litigation, Hall said, has been disposed of through arrangements between the parties and through settlements. Bloggers advocating open-source software heralded the Federal Circuit's ruling. Appeals of copyright law are rare in the Federal Circuit compared to patent law litigation.
"Copyright holders who engage in open source licensing have the right to control the modification and distribution of copyrighted material," the Federal Circuit said. Chief Circuit Judge Paul Michel of the Federal Circuit, Circuit Judge Sharon Prost and U.S. District Judge Faith Hochberg of the Northern District of New Jersey issued the order.
The judges said: "Open source licensing has become a widely used method of creative collaboration that serves to advance the arts and sciences in a manner and at a pace that few could have imagined just a few decades ago."
Posted by Mike Scarcella on August 15, 2008 at 05:05 PM in D.C. Courts and Government | Permalink
From: Linus Torvalds <torvalds-AT-linux-foundation.org>
To: Alexandre Oliva <aoliva-AT-redhat.com>
Subject: Re: Dual-Licensing Linux Kernel with GPL V2 and GPL V3
Date: Tue, 12 Jun 2007 08:45:46 -0700 (PDT)
Cc: Ingo Molnar <mingo-AT-elte.hu>, Tarkan Erimer <tarkan-AT-netone.net.tr>, debian developer <debiandev-AT-gmail.com>, "david-AT-lang.hm" <david-AT-lang.hm>, Linux Kernel Mailing List <linux-kernel-AT-vger.kernel.org>, Andrew Morton <akpm-AT-linux-foundation.org>, Greg KH <greg-AT-kroah.com>
On Tue, 12 Jun 2007, Alexandre Oliva wrote:
> Per this reasoning, Sun wouldn't be waiting for GPLv3, and it would
> have already released the OpenSolaris kernel under GPLv2, would it not? ;-)
Umm. You are making the fundamental mistake of thinking that Sun is in this to actually further some open-source agenda. Here's a cynical prediction (but backed up by past behaviour of Sun):
- First off: they may be talking a lot more than they are or ever will be doing. How many announcements about Sun and Linux have you seen over the years? And how much of that has actually happened?
- They may like open source, but Linux _has_ hurt them in the marketplace. A lot. They almost used to own the chip design market, and it took quite a long time before the big EDA vendors ported to Linux (and x86-64 in particular). But when they did, their chip design market just basically disappeared: sparc performance is so horribly bad (especially on a workstation kind of setup), that to do chip design on them is just idiotic. Which is not to say that there aren't holdouts, but let's face it, for a lot of things, Solaris is simply the wrong choice these days. Ergo: they sure as hell don't want to help Linux. Which is fine. Competition is good.
- So they want to use Linux resources (_especially_ drivers), but they do *not* want to give anything back (especially ZFS, which seems to be one of their very very few bright spots).
- Ergo: they'll not be releasing ZFS and the other things that people are drooling about in a way that lets Linux use them on an equal footing. I can pretty much guarantee that. They don't like competition on that level. They'd *much* rather take our drivers and _not_ give anything back, or give back the stuff that doesn't matter (like core Solaris: who are you kidding - Linux code is _better_).
End result:
- They'll talk about it. They not only drool after our drivers, they drool after all the _people_ who write drivers. They'd love to get kernel developers from Linux, they see that we have a huge amount of really talented people. So they want to talk things up, and the more "open source" they can position themselves, the better.
- They may release the uninteresting parts under some fine license. See the OpenSolaris stuff - instead of being blinded by the code they _did_ release under an open source license, ask yourself why the open source parts are not ready to bootstrap a competitive system, or why they are released under licenses that Sun can make sure they control.
So the _last_ thing they want to do is to release the interesting stuff under GPLv2 (quite frankly, I think the only really interesting thing they have is ZFS, and even there, I suspect we'd be better off talking to NetApp, and seeing if they are interested in releasing WAFL for Linux).
Yes, they finally released Java under GPLv2, and they should be commended for that. But you should also ask yourself why, and why it took so long. Maybe it had something to do with the fact that other Java implementations started being more and more relevant?
Am I cynical? Yes. Do I expect people to act in their own interests? Hell yes! That's how things are _supposed_ to happen. I'm not at all berating Sun, what I'm trying to do here is to wake people up who seem to be living in some dream-world where Sun wants to help people.
So to Sun, a GPLv3-only release would actually let them look good, and still keep Linux from taking their interesting parts, and would allow them to take at least parts of Linux without giving anything back (ahh, the joys of license fragmentation). Of course, they know that. And yes, maybe ZFS is worthwhile enough that I'm willing to go to the effort of trying to relicense the kernel. But quite frankly, I can almost guarantee that Sun won't release ZFS under the GPLv3 even if they release other parts. Because if they did, they'd lose the patent protection.
And yes, I'm cynical, and yes, I hope I'm wrong. And if I'm wrong, I'll very happily retract anything cynical I said about Sun. They _have_ done great things, and maybe I'm just too pessimistic about all the history I've seen of Sun with open source. The _good_ news is that Jonathan Schwartz actually does seem to have made a difference, and I hope to God he is really as serious about open-sourcing things as he says he is.
And don't get me wrong: I think a truly open-source GPLv3 Solaris would be a really really _good_ thing, even if it does end up being a one-way street as far as code is concerned!
Linus
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 18:14 UTC (Tue) by JoeBuck (subscriber, #2330)
Yes, I am an EDA developer, who once developed primarily on Solaris/Sparc and who now develops primarily on Linux. Sun dropped the ball years ago; they had a Solaris/x86 in the early 90s that never got any attention, because Sun management wanted to put "all the wood behind SPARC".
Provided that Sun eventually does go the GPLv3 route, or if other GPLv3 code appears interesting, Linux could start a transition to a dual license: GPLv2 or GPLv3. The advantage of those terms is that they would achieve the advantages of GPLv3 (better internationalization, compatibility with other free software licenses such as Apache's) but still avoid the DRM restriction Linus objects to (anything GPLv2 permits would still be allowed). Furthermore the kernel already has a lot of "GPLv2 or any later version" code. A transition would take a while to do, and a complete transition might require replacement of code from authors who won't play or can't be located. But the Mozilla project managed to do it; if Linus asked, I would expect the vast majority of developers to go along.
But it's up to him.
I'm sure Sun is only making these moves to attract developers. But I'm happy to see more choices.
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 18:27 UTC (Tue) by cpeterso (subscriber, #305)
> Furthermore the kernel already has a lot of "GPLv2 or any later version" code.
I thought the kernel code was licensed under GPLv2 only? From http://lxr.linux.no/source/COPYING :
"Also note that the only valid version of the GPL as far as the kernel is concerned is _this_ particular version of the license (ie v2, not v2.2 or v3.x or whatever), unless explicitly otherwise stated."
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 19:54 UTC (Tue) by yokem_55 (subscriber, #10498)
The kernel as a complete work is GPL V2 only. Many of the individual files in the source, though, have the "V2 or later" language in them. So, if a relicensing project were started to bring the complete work of the kernel to GPL V3, those files would already be taken care of.
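The first step of the relicensing audit described above can be sketched as a toy classifier over per-file license headers (a hypothetical heuristic for illustration only, not an actual kernel tool; real headers vary far more than this):

```python
# Hypothetical helper: classify a source file's license boilerplate as
# "v2 or later" vs "v2 only". A real audit would need much more careful
# parsing (and legal review) than these two regular expressions.
import re

OR_LATER = re.compile(r"either\s+version\s+2.*any\s+later\s+version",
                      re.IGNORECASE | re.DOTALL)
MENTIONS_V2 = re.compile(r"version\s+2", re.IGNORECASE)

def classify(header_text):
    """Return 'v2-or-later', 'v2-only', or 'unknown' for a license header."""
    if OR_LATER.search(header_text):
        return "v2-or-later"      # already fine for a GPLv3 move
    if MENTIONS_V2.search(header_text):
        return "v2-only"          # would need the author's consent
    return "unknown"

print(classify("...under the terms of the GNU General Public License as "
               "published by the Free Software Foundation; either version 2 "
               "of the License, or (at your option) any later version."))
```

Running such a scan over a tree would separate the files "already taken care of" from those whose authors would have to be asked (or whose code would have to be replaced).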
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 18:27 UTC (Tue) by arcticwolf (guest, #8341)
How would a dual-licensed Linux help, though? If you go that route, you still can't port things like ZFS, since that'd be - if it's GPL'ed at all - GPLv3 only. So the only way to port it would be to essentially split Linux into a GPLv2 version and a GPLv3 version; and given that there'd likely be code (old or new) that would be GPLv2 only, the latter would not even be a superset of the former.
So as long as Linu(x|s) doesn't go GPLv3, period, there would really be no way to make code flow from Solaris to Linux, and in fact, a dual-licensed version of Linux with an "official", integrated GPLv3 branch would actually make it easier for Sun to pull in code from Linux.
But then, maybe Linus is just a strategic genius, too - maybe all his vocal opposition to the GPLv3 is just a clever ploy to lull Sun into a false sense of security, and once they've released Solaris and/or ZFS under the GPLv3, he'll just switch over as well and reap the rewards. ;)
1. Yeah, I know, he can't just do that, but for the sake of the joke, let's pretend he can.
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 18:52 UTC (Tue) by ajross (subscriber, #4563)
The v2 and v3 GPL variants are explicitly compatible with each other in both directions. So there's no reason the kernel couldn't simply ship different parts of the tree under different licenses. No fork would be required except in the case where someone wanted to put together a "GPL2 only" distribution for some reason.
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 19:23 UTC (Tue) by dlang (subscriber, #313)
GPLv2 and GPLv3 are not compatible in either direction.
the only thing that lets GPLv2 code change to GPLv3 is if people gave the FSF a blank check and said 'GPLv2 or later'
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 21:04 UTC (Tue) by man_ls (subscriber, #15091)
True:
"When we say that GPLv2 and GPLv3 are incompatible, it means there is no legal way to combine code under GPLv2 with code under GPLv3 in a single program. This is because both GPLv2 and GPLv3 are copyleft licenses: each of them says, 'If you include code under this license in a larger program, the larger program must be under this license too.' There is no way to make them compatible."
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 8:20 UTC (Wed) by forthy (guest, #1525)
The FSF went to great lengths to ensure that GPL versions can be made compatible. The paragraph that deals with this is section 9 of the GPL. Read it, especially the last part: many files in the Linux kernel are not explicitly restricted to a specific GPL version, which means "any version". And section 6 makes sure that everybody receives a license from the original licensor, not from a compilation editor like Linus Torvalds.
The compilation editor (Linus Torvalds) can set terms under which he redistributes the work, i.e. conditions he has to follow. But since everybody receives the license from the original licensors, this "restriction" is null and void; you can still make a compilation yourself which does not restrict the license version, and then most parts of Linux are compatible with GPLv3 (because you can choose either any GPL version or explicitly v2 or later).
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 9:37 UTC (Wed) by man_ls (subscriber, #15091)
That kind of compatibility is not much help, unless all of the kernel is licensed as "v2 or later". As long as there is a single file licensed under "v2 only", it becomes impossible to link with a single "v3 only" file.
Meanwhile, relicensing all files under a "v2 or later" license might seem to be a necessary first step toward a GPLv3 kernel. But given Linus's reluctance to blanket-license, I would rather expect a dual "v2 or v3" license, if the migration is to be done at all.
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 13:09 UTC (Wed) by job (subscriber, #670)
Maaaybe there was a good reason why the FSF recommended the use of "v2 or later" licensing. Then you basically leave the choice to the user. I never understood what Linus didn't like about that, except some unspecified fear of the FSF, which would be not only ridiculous but also unfortunate.
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 13:36 UTC (Wed) by man_ls (subscriber, #15091)
It is not so unreasonable. Linus said:
"How can you _ever_ sign anything sight unseen? That's just stupid, and that's totally regardless of any worries about the FSF."
Said that way, it looks like the correct thing to do. However, given that (as you say) "v2 or later" licensing gives the choice to the user, I'm not particularly worried about misuse.
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 21:56 UTC (Wed) by notamisfit (subscriber, #40886)
It creates the possibility that code created in a downstream work may not be usable upstream. Linus has put his cards on the table in the past; he wants code back.
Linus on GPLv3 and ZFS
Posted Jun 14, 2007 1:59 UTC (Thu) by error27 (subscriber, #8346)
Instead of "ridiculous and unfortunate" I would say "justified by current events."
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 19:06 UTC (Tue) by proski (subscriber, #104)
We could also theorize that Linus is hinting at the possibility of switching Linux to GPLv3 to dissuade Sun from releasing ZFS under GPLv3.
But why would Linus not want ZFS in the kernel? The history of Linux shows that reimplemented code is more successful than ported code. XFS and JFS are rarely used, whereas ext2 and ext3 are wildly popular.
Patent issues: GPLv3 and ZFS
Posted Jun 12, 2007 19:30 UTC (Tue) by dwheeler (subscriber, #1216)
Patent issues. If Sun releases ZFS under GPLv3, ZFS is patented, and its patents on ZFS are valid, then anyone else using GPLv3 can use ZFS. They can even "bring in" the GPLv3 code and completely rewrite it, so the IMPLEMENTATION may be different but they'd still be okay legally (I think). Using GPLv2 wouldn't give them access to patents released only under GPLv3.
Patent issues: GPLv3 and ZFS
Posted Jun 12, 2007 20:01 UTC (Tue) by atai (subscriber, #10977)
One would expect Sun to have already considered this aspect, assuming Sun will release OpenSolaris under GPLv3.
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 1:48 UTC (Wed) by wolfrider (guest, #3105)
> JFS [is] rarely used
--Depends on who you ask. I use JFS now almost exclusively for Vmware and "bkps" (read: large) filesystems, where before I would use ReiserfsV3 with notail.
--After seeing how fast (and reliable) JFS is, I switched almost all my Reiser filesystems over to it - and have been much happier. Reiser is great for root and squid (tail-packing) but not ideal when you're trying to run a VM from a USB2 IDE drive. JFS makes it usable.
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 20:43 UTC (Tue) by JoeBuck (subscriber, #2330)
If all of the kernel code were GPLv2 || GPLv3, it could be combined with a GPLv3 ZFS. The collection as a whole would be GPLv3 only if ZFS were added, but ZFS could be a module, and everything would be legal, while embedded software developers who want to do DRM could still use the rest of Linux (except ZFS).
Linus on GPLv3 and ZFS
Posted Jun 12, 2007 22:28 UTC (Tue) by cyperpunks (subscriber, #39406) [Link]
Linus is right: Sun doesn't want Linux source code, they want Linux's kernel hackers (and then later Linux's users).
Of course the CDDL is hopeless in this regard, as hackers must transfer copyright to Sun; who wants to do that?
Sun has to fix the bootstrap problem too: it's currently not possible to build a complete free Solaris "distribution". You must use some non-free Sun tools at some point.
Who wants to contribute to a project you can't build yourself?
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 0:00 UTC (Wed) by JoeBuck (subscriber, #2330) [Link]
To be fair, Sun's launching a project, called "Indiana", to correct that deficiency and produce something that would resemble a GNU/Linux distribution. It will take them some time to do it, but I'm looking forward to it.
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 12:36 UTC (Wed) by paulj (subscriber, #341) [Link]
> Of course the CDDL is hopeless in this regard, as hackers must transfer copyright to Sun; who wants to do that?
This is false.
You need to sign a contributor agreement with Sun, granting Sun joint ownership, if you wish to have Sun incorporate your contributions into the various open-source projects that Sun founded and maintains (such as OpenSolaris, amongst others).
However, the CDDL does not require any copyright transfer, and you're quite free to take and hack away on CDDL'ed code, like OpenSolaris, without giving copyright ownership to Sun or anyone else.
for the clueless
Posted Jun 12, 2007 20:01 UTC (Tue) by ccyoung (subscriber, #16340) [Link]
Such as myself: what is ZFS and what makes it so hot?
Google is your friend
Posted Jun 12, 2007 20:21 UTC (Tue) by rfunk (subscriber, #4054) [Link]
http://www.opensolaris.org/os/community/zfs/
for the clueless
Posted Jun 12, 2007 21:14 UTC (Tue) by huerlisi (guest, #44534) [Link]
Because it's a nice product of computer engineering. Here's a quote from a nice-to-read geeky background article: "64 bits would have been plenty ... but then you can't talk out of your ass about boiling oceans then, can you?"
Simon
WAFL != ZFS
Posted Jun 12, 2007 21:56 UTC (Tue) by qu1j0t3 (subscriber, #25786) [Link]
I have to assume Linus knows that. Sigh. If not, like another poster here, he should Google... I'm tired of posting ZFS linkfests ;-)
He's treading close to the FUD-line with this one. There's also a hidden assumption here that Jonathan Schwartz is being disingenuous with his massively revamped corporate strategy.
Sun's a hardware company. They're happy for you to run Linux on your Sun gear if you prefer - it's a supported option - heck, they even support Windows.
WAFL != ZFS
Posted Jun 12, 2007 23:03 UTC (Tue) by allesfresser (subscriber, #216) [Link]
I don't think it's anywhere near FUD, personally. It sounded simply like classic Linus--he's being very transparently honest. He hopes and wishes that Schwartz and company are being as open and forthright as they claim to be, but knowing human nature and the temptations that beset us, he is keeping his powder dry and his head down, so to speak.
forthright != Linus
Posted Jun 12, 2007 23:43 UTC (Tue) by qu1j0t3 (subscriber, #25786) [Link]
It's not very "forthright" to inject snide Linusisms such as "the only really interesting thing they have is ZFS, and even there, I suspect we'd be better off talking to NetApp"! XFS is already integrated, and that has about as much in common with ZFS as WAFL does.
It comes across as sour grapes about the license, and even some N-I-H ("core Solaris: who are you kidding - Linux code is _better_"). Btw, there is as much spurious rancor of the opposite polarity from the Sun camp, as recent zfs-discuss flamefests can attest.
Why can't we all just get along? - Admit that some people like BSD license, some people like GPL, Sun likes CDDL for now, and ZFS plain rocks... :)
Linux devs ignore it at their peril; Linus, being an engineer of Sun's calibre, could do a much more helpful job of deconstructing the issue.
forthright != Linus
Posted Jun 13, 2007 0:03 UTC (Wed) by JoeBuck (subscriber, #2330) [Link]
The issue with Sun is not that they prefer a particular license, but that they are choosing to license patents only to code that uses their particular license, while IBM, Red Hat, Novell, and others are licensing a number of patents (or in Red Hat's case, all their patents) to developers who use a much larger set of open source licenses.
Posted Jun 13, 2007 0:21 UTC (Wed) by qu1j0t3 (subscriber, #25786) [Link]
This still does not fully explain to me why, to date, kernel devs aren't looking dispassionately at the affordances of ZFS and how they might have them without stepping on anyone's patent*. Max V. Yudin recently asked on zfs-discuss,
... is it legal to write ZFS clone from scratch while maintaining binary compatibility with original?
Jeff mentioned in his blog that Sun filed 56 patents on ZFS-related technologies. Can anybody from the company provide me with more information about this?
If porting ZFS to Linux kernel is not an option and I were to implement different file system with ZFS ideas in mind how can I be safe and not break any Sun patents?
There has been no meaningful resolution of his questions. At least it may prove that, thanks to software patents, interesting development is now impossible. So much for stimulating innovation...
* - I suppose NetApp has patents too, but perhaps Linus wishes to imply that they would be more tractable to deal with than Sun (maybe he actually knows somebody @ NetApp). Let's dream for a moment, and imagine that Linus and Jonathan, over a piña colada one Sunday, work out a magical way to free ZFS for kernel inclusion. That would be a P/R coup for Sun an order of magnitude greater than even the Apple buzz. Since Solaris 10 famously runs on all varieties of hardware (IBM, HP, Dell, even Macs), I don't seriously think Jonathan believes this would damage hardware sales. Then again, I only have the ponytail, not an MBA, and my bonuses are a few zeroes short of his. ;-)
Posted Jun 13, 2007 7:57 UTC (Wed) by TRS-80 (subscriber, #1804) [Link]
One could always start from Sun's GPLv2 ZFS code in GRUB. And Jonathan Schwartz has just posted saying Linux ZFS would have full patent indemnity.
Posted Jun 13, 2007 9:38 UTC (Wed) by michaeljt (subscriber, #39183) [Link]
That said "GPLv2 or later", if I read correctly. I didn't take the time to read the code, but presumably it is only code for reading and would not affect potential patents on the writing parts.
On another note, if Sun makes Solaris GPLv3 and accepts external contributions, it might get tricky to keep parts (i.e. ZFS) under another licence.
Posted Jun 13, 2007 19:51 UTC (Wed) by dlang (subscriber, #313) [Link]
Note that the ZFS code released for GRUB is not enough to actually be able to write to the filesystem, just enough for GRUB to be able to find the files that it needs.
Netapp and patents
Posted Jun 22, 2007 7:10 UTC (Fri) by anton (guest, #25547) [Link]
> I suppose NetApp has patents too, but perhaps Linus wishes to imply that they would be more tractable to deal with than Sun
Yes, Netapp has patents, and they caused Daniel Phillips to stop working on the tux2 filesystem; I have not followed the story closely enough to know whether Netapp did anything other than file the patents to achieve this result.
Netapp's WAFL is not very interesting for Linux anyway, because it requires special NVRAM hardware to buffer writes during some of the more time-consuming operations (e.g., snapshot creation). I don't think that this hardware dependence can be eliminated without major changes to the WAFL code.
Concerning not infringing Sun's patents, you can look for older sources where similar ideas have been described: various papers on log-structured file systems, our Freenix 2000 paper, or (maybe too recent to count as prior art) my file system ideas.
ZFS and WAFL
Posted Jun 15, 2007 0:02 UTC (Fri) by joern (subscriber, #22392) [Link]
> XFS is already integrated, and that has about as much in common with ZFS as WAFL does.
That is plain wrong. XFS is in the huge class of traditional filesystems with a static mapping between file offsets and device offsets. ZFS is in the somewhat smaller (ignoring research projects) class of COW filesystems, just like WAFL. Anyone unable to see the similarities is well advised to read more and write less. ;)
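[Editor's note: to make joern's distinction concrete, here is a toy Python sketch. It is my illustration, not actual XFS, ZFS, or WAFL code: a traditional filesystem overwrites a block in place, while a COW filesystem writes the new version to a fresh block and hands back a new pointer, leaving the old block intact.]

```python
# Toy model of "static mapping" vs. copy-on-write block updates.

def update_in_place(disk, block_no, data):
    """Traditional update: file offset -> device offset never changes."""
    disk[block_no] = data          # old contents are destroyed
    return block_no                # same block, same pointer

def update_cow(disk, free_list, block_no, data):
    """COW update: never overwrite live data."""
    new_block = free_list.pop()    # allocate a fresh block
    disk[new_block] = data         # write the new version there
    return new_block               # caller must update its pointer

disk = {0: b"old"}
free = [1, 2, 3]
ptr = update_cow(disk, free, 0, b"new")
assert disk[0] == b"old"           # old version survives (the seed of snapshots)
assert disk[ptr] == b"new"
```

The surviving old block is exactly why COW formats get snapshots almost for free, which neither the in-place model nor this sketch provides.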
ZFS and WAFL
Posted Jun 17, 2007 15:30 UTC (Sun) by qu1j0t3 (subscriber, #25786) [Link]
As you can see, I haven't studied XFS in depth; I was under the impression it was COW like ZFS. AFAIK, WAFL also lacks the more interesting features of ZFS (foremost being end-to-end checksumming).
ZFS and WAFL
Posted Jun 17, 2007 17:29 UTC (Sun) by joern (subscriber, #22392) [Link]
Checksums are easy to add once you have a COW format. Either you add them to the block pointers, as ZFS did, or you add them to the objects themselves, as JFFS2 and LogFS did.
Either way you have an incompatible format change, but the amount of code affected is rather small. It took about 1-2% of the effort of designing a new filesystem in the LogFS case.
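[Editor's note: a toy sketch of the block-pointer placement joern mentions. The structures and names here are my illustration, not actual ZFS or LogFS formats: keeping the checksum in the pointer rather than next to the data means a corrupted block cannot vouch for itself, so silent corruption is caught on read.]

```python
# Toy model of checksums stored in block pointers (ZFS-style placement).
import zlib

def write_checked(disk, free_list, data):
    """Write a block; the returned *pointer* carries the checksum."""
    blk = free_list.pop()
    disk[blk] = data
    return {"block": blk, "checksum": zlib.crc32(data)}

def read_checked(disk, ptr):
    """Verify the block against the checksum stored in its pointer."""
    data = disk[ptr["block"]]
    if zlib.crc32(data) != ptr["checksum"]:
        raise IOError("checksum mismatch detected on read")
    return data

disk, free = {}, [0, 1]
ptr = write_checked(disk, free, b"payload")
assert read_checked(disk, ptr) == b"payload"

disk[ptr["block"]] = b"corrupt"        # simulate silent on-disk corruption
try:
    read_checked(disk, ptr)
    assert False
except IOError:
    pass                               # corruption caught, not returned as data
```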
Posted Jun 17, 2007 18:06 UTC (Sun) by qu1j0t3 (subscriber, #25786) [Link]
Maybe so, but there's quite a lot of catch-up to do. Once you have COW, transactions, and checksums, then you want self-healing; then snapshots; pools; quotas; compression; and so on, until you eventually have something like ZFS. :)
Linus' grandstanding aside, it's possible there is quiet work going on to improve the situation, as David Magda commented on zfs-discuss:
> Somewhat off topic, but it seems that someone released a COW file system for Linux (currently in 'alpha'):
> * Extent based file storage (2^64 max file size)
> * Space efficient packing of small files
> * Space efficient indexed directories
> * Dynamic inode allocation
> * Writable snapshots
> * Subvolumes (separate internal filesystem roots)
> - Object level mirroring and striping
> * Checksums on data and metadata (multiple algorithms available)
> - Strong integration with device mapper for multiple device support
> - Online filesystem check
> * Very fast offline filesystem check
> - Efficient incremental backup and FS mirroring
> http://lkml.org/lkml/2007/6/12/242
> http://oss.oracle.com/~mason/btrfs/
Via Storage Mojo
Posted Jun 17, 2007 19:05 UTC (Sun) by joern (subscriber, #22392) [Link]
> Maybe so, but there's quite a lot of catch-up to do. Once you have COW, transactions, and checksums, then you want self-healing; then snapshots; pools; quotas; compression; and so on, until you eventually have something like ZFS. :)
Sure, ZFS has an impressive set of features. If nothing else, it has shown how things can be done. And I have little doubt that btrfs, which you quoted, will end up having most of those features relatively soon. And even if Chris dies tomorrow, I'll keep working on LogFS.
The only question is when, not if. :)
good to hear
Posted Jun 17, 2007 19:15 UTC (Sun) by qu1j0t3 (subscriber, #25786) [Link]
The only question is when, not if. :)
From comments I've heard, and the buzz around Linus' and Jon's posts, there seems to be considerable community interest around ZFS (I don't want to use anything else, or wait, so I switched to Solaris 10 some time ago).
Look forward to more news on this front. Did you follow up on LKML? (I have not checked :)
good to hear
Posted Jun 17, 2007 21:43 UTC (Sun) by joern (subscriber, #22392) [Link]
> From comments I've heard, and the buzz around Linus' and Jon's posts, there seems to be considerable community interest around ZFS (I don't want to use anything else, or wait, so I switched to Solaris 10 some time ago).
> Look forward to more news on this front. Did you follow-up on LKML? (I have not checked :)
My personal interest is in flash, not hard disks. Therefore ZFS is impressive technology, but it is solving someone else's problem. It is not the last word in filesystems either, as the fsck will run for hours or days if one ever becomes necessary. So there remain valid reasons to work on different filesystems.
Impressive technology nonetheless.
COW for Flash?
Posted Jun 18, 2007 17:26 UTC (Mon) by qu1j0t3 (subscriber, #25786) [Link]
It has been said that COW is ideal for Flash. Can you explain why ZFS isn't relevant here?
There is no fsck; ZFS is "always consistent on disk" (through COW+atomic transactions). It seems to me this is a necessary invariant to achieve its other features (such as snapshots). Debate flares up (occasionally) as to whether a scavenger will be necessary. If so, it won't much resemble 'fsck' - and certainly won't be run in normal operation or after reset/powerfail/etc (ZFS behaviour under impromptu reset is extremely well tested).
I suspect, but correct me if I'm wrong, that once you "know" you've lost data in ZFS (through exhausting redundancy or ditto blocks), it's actually gone by definition, and unrecoverable by re-creating links. No doubt Bonwick et al. have explained it better somewhere...
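[Editor's note: the "always consistent on disk" claim above can be sketched in a few lines of Python. This is a minimal illustration, not ZFS code: all new blocks are written first, then one atomic pointer update (the uberblock rewrite, in ZFS terms) commits the transaction, so a crash at any earlier point simply leaves the old, consistent tree in place and no fsck is needed.]

```python
# Toy model of COW + atomic commit: why there is no fsck to run.

class CowFs:
    def __init__(self):
        self.blocks = {"rootA": {"file": b"v1"}}
        self.root = "rootA"            # the committed, consistent state

    def transaction(self, name, data):
        # 1. Write the whole new tree to fresh blocks; old tree untouched.
        new_root = self.root + "'"
        self.blocks[new_root] = dict(self.blocks[self.root], **{name: data})
        # 2. A crash before the next line loses nothing: the old root
        #    still describes a complete, consistent filesystem.
        self.root = new_root           # single atomic commit point

fs = CowFs()
fs.transaction("file", b"v2")
assert fs.blocks[fs.root]["file"] == b"v2"
assert fs.blocks["rootA"]["file"] == b"v1"   # old state still intact
```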
COW for Flash?
Posted Jun 19, 2007 8:17 UTC (Tue) by joern (subscriber, #22392) [Link]
> It has been said that COW is ideal for Flash. Can you explain why ZFS isn't relevant here?
Raw flash behaves sufficiently differently from hard disks that some ZFS design assumptions become untrue. Flash has large erase blocks. Within erase blocks, data must be written from front to back. Writing a block again requires erasing all of it. So the filesystem block size either has to be equal to the erase block size, or you need garbage collection. And with garbage collection comes a nasty deadlock problem most people don't even realize exists. :)
Next comes wear-out and protection against it. AFAICS, ZFS has several hot zones that receive significantly more writes than others.
I guess those two are the big reasons.
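[Editor's note: the erase-block constraint joern describes can be modeled with a toy Python class. The sizes and names are assumptions for illustration, not real NAND parameters: pages within a block can only be programmed front to back, and reclaiming any space means erasing the whole block, which is what forces garbage collection and causes wear.]

```python
# Toy model of a flash erase block: append-only pages, whole-block erase.

class EraseBlock:
    PAGES = 4                          # assumed tiny size for illustration

    def __init__(self):
        self.pages = [None] * self.PAGES
        self.next_page = 0             # writes may only go front to back

    def program(self, data):
        if self.next_page >= self.PAGES:
            raise IOError("block full: must garbage-collect and erase")
        self.pages[self.next_page] = data
        self.next_page += 1

    def erase(self):                   # the only way to reuse space
        self.pages = [None] * self.PAGES
        self.next_page = 0

blk = EraseBlock()
for i in range(4):
    blk.program(f"page{i}".encode())   # fills the block front to back
try:
    blk.program(b"one more")           # in-place update is impossible
    assert False
except IOError:
    blk.erase()                        # wears the block a little more
    blk.program(b"one more")           # now there is room again
```

A real flash filesystem must first copy any still-live pages elsewhere before the erase, and spread erases evenly across blocks, which is where the garbage-collection deadlock and wear-leveling problems come from.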
linux core better?
Posted Jun 12, 2007 23:54 UTC (Tue) by genius (guest, #19981) [Link]
I don't think I can agree with that. There was a blog post a while back about the Linux kernel having problems scaling beyond 8-way compared to BSD; not sure whether they have solved it. On the other hand, Linux has definitely revived interest in Unix.
linux core better?
Posted Jun 13, 2007 0:11 UTC (Wed) by JoeBuck (subscriber, #2330) [Link]
You are seriously out of date; folks at SGI have Linux running on 1024 processors.
The Solaris and BSD folks cannot claim to be more scalable than Linux at this point; it appears that the reverse is true.
what epoch are you posting from?
Posted Jun 13, 2007 2:37 UTC (Wed) by xoddam (subscriber, #2322) [Link]
> linux kernel having problem scaling beyond 8-way compared to bsd.
You're several years behind. A long time_t in LKML-land.
what epoch are you posting from?
Posted Jun 13, 2007 3:03 UTC (Wed) by Nick (subscriber, #15060) [Link]
Poster is probably talking about this blog entry.
So he is right, and Linux did have a problem on this workload. Basically it was a combination of a glibc inefficiency and the fact that nobody seems to have reported such a workload before. The fix was basically a small change to the way malloc/free works, and a little patch to the kernel to optimise the new path used by glibc.
That post found the fixes to have eliminated the big dropoff. Note it still doesn't scale past 8-way, but this is likely to be a MySQL issue -- BSD doesn't do any better.
what epoch are you posting from?
Posted Jun 14, 2007 7:36 UTC (Thu) by drag (subscriber, #31333) [Link]
Those things seem to have triggered some sort of bug.
Rest assured, people do have real-world Linux boxes with over 512 CPUs in a single system image. SGI, at least, has Linux boxes with 4096 CPUs in a single system image.
As far as clustering goes, there are Linux systems with tens of thousands of CPUs running.
The Linux kernel itself does scale past 8 CPUs. Of course, nothing is perfect.
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 10:03 UTC (Wed) by venkatesh045 (guest, #45744) [Link]
You could read Jonathan Schwartz's reply to this post. I think he clarifies a lot of issues as to what Sun is looking at. This post is actually worth a good read.
Linus on GPLv3 and ZFS
Posted Jun 13, 2007 12:25 UTC (Wed) by marduk (subscriber, #3831) [Link]
Reads like typical salesmanship. Sales talk always "sounds" good...
I actually disagreed with his implication that communities don't compete (only corporations). There exists competition in the OSS community.
Posted Jun 13, 2007 12:47 UTC (Wed) by qu1j0t3 (subscriber, #25786) [Link]
The difference may be that it's augmentative competition rather than destructive. In business, it's optimal to completely eliminate rivals; in open source, you don't have to do that, you can just do better. It's a purer meritocracy. I hope. :)
That said, there's still some dirty pool played from time to time, but since it's played in the open, it hardly festers.
Linus Torvalds, leader of the Linux kernel project and a major figure in the open-source programming movement, said Wednesday he's "pretty pleased" with changes in a third draft of the General Public License (GPL) released Wednesday.
The Linux kernel and many higher-level software packages are governed by the current GPL 2, and Torvalds has expressed strong displeasure with earlier version 3 drafts. After a preliminary analysis of GPL 3, however, some of those concerns are gone or moderated, he said.
"I'm actually pretty pleased. Not because I think it's perfect, but simply because I think it's certainly a lot better than I really expected from the previous drafts," he said in an interview. "Whether it's actually a better license than the GPLv2, I'm still a bit skeptical, but at least it's now 'I'm skeptical' rather than 'Hell no!'"
In particular, one provision against digital rights management has been narrowed, and another that Torvalds feared could lead to multiple incompatible versions of the GPL has been removed or defanged.
"I'm much happier with many parts of it. I think much of it reads better, and some of the worst horrors have been removed entirely," Torvalds said.
The Free Software Foundation (FSF) has been accused of working to prevent co-operation between the free and proprietary software sectors, thanks to new terms in the latest draft version of the GNU GPL.
Unsurprisingly, the speediest criticism came from Microsoft, whose deal with Novell prompted the inclusion of the controversial clauses in the first place.
Horacio Gutierrez, Microsoft's vice president of intellectual property and licensing, told eWeek: "We note that the draft of the GPLv3 does not tear down the bridge Microsoft and Novell have built for their customers. It is unfortunate, however, that the FSF is attempting to use the GPLv3 to prevent future collaboration among industry leaders to benefit customers."
Microsoft holds that Linux infringes several of its patents and late last year signed a deal with Novell, under which Novell's customers were indemnified against legal action by Microsoft. Novell was roundly criticised at the time: the Open Source sector felt that the deal was a tacit admission that Linux does infringe Redmond's IP, something Novell has strenuously denied.
Many also felt the deal ran counter to the spirit of the GPL, even if it was technically compliant. Jeremy Allison, now ex-head of Novell's Samba team, resigned in protest. He said in a memo: "We can pledge patents all we wish, we can talk to the press and 'community leaders', we can do all the right things w.r.t. all our other interactions, but we will still be known as GPL violators and that's the end of it."
Novell maintains that the agreement did comply with the terms of the GPL, specifically the requirement that all recipients of the code should be treated equally, since there was no agreement between Novell and Microsoft, just between Microsoft and Novell's customers.
The new draft specifically prohibits deals like the one done by Microsoft and Novell from now on.
Morgan Reed, executive director of The Association for Competitive Technology said the new terms mean the GPL "no longer just defines freedom; it is designed to punish companies and business models that Richard Stallman just doesn't like".
The FSF's Richard Stallman believes the foundation had to do something. He argues that there are four "defining freedoms" to free software: the freedom to run the program as you see fit, study and adapt it for your own purposes, redistribute copies to help your neighbour, and release your improvements to the public.
"The recent patent agreement between Microsoft and Novell aims to undermine these freedoms. In this draft, we have worked hard to prevent such deals from making a mockery of free software," he said.
The second draft of GPLv3 is just that, a draft. There is a 60-day period during which suggestions can be submitted. You can comment on the draft here.
Sun Sticks With Solaris CDDL (For Now)
By Sean Michael Kerner
Whether or not Sun will migrate to the upcoming GPL version 3 license for OpenSolaris and Java is a question resulting in much speculation.
Currently OpenSolaris is licensed under Sun's Common Development and Distribution License (CDDL) and Java is set to be licensed under GPL v2. GPL v3, which is still under development, adds new terms for digital rights management (DRM) and patents that could have wide-ranging effects on licensees.
Sun Microsystems' Chief Open Source Officer, Simon Phipps, explained that Sun is picking the best license on a case-by-case basis for its software and will continue to use the license that is most appropriate for the community involved.
That said, some things aren't going to change.
"I've got no intention of removing CDDL from OpenSolaris as it has been an ideal license for OpenSolaris," Phipps told internetnews.com. "The CDDL is doing a fine job with that community. The role of the license is to empower the innovator and the CDDL is demonstrably doing a good job of empowering OpenSolaris."
Phipps noted that under CDDL, OpenSolaris has grown its user base and contributions. At least five distributions are now available that are based on OpenSolaris, which is facilitated by the CDDL.
Just because the CDDL is working doesn't necessarily mean that Phipps won't consider adding another license to OpenSolaris. He commented that if the community wants another license, then he would consider it. In fact, Phipps noted that he is just starting to see a debate in the OpenSolaris community on whether to add GPL v3.
Currently Sun uses the GPL v2 license in some of its software applications, though Sun isn't automatically going to migrate to v3 when it comes out.
Under the terms of the GPL v2, licensees "have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation."
However, if "the program does not specify a version number of this license, you may choose any version ever published by the Free Software Foundation." Some applications, notably the Linux kernel and MySQL, have included the language "GPLv2 only," as opposed to "GPLv2 or later," implying that an automatic changeover will not occur.
"I look at the 'or any latter version clause' and think it's a really strange thing for any responsible enterprise to use in its licensing," Phipps said. "That's carte blanche to a successive body to act in a way that is against your interests."
The fact that Sun is not using the "or any later version clause" does not imply any sort of criticism or lack of confidence in the GPL v3 process. Rather, it's a matter of responsibility, according to Phipps.
Phipps argued that with Java, for example, there are five million developers that rely on Java for their livelihood. "It would be absolutely irresponsible of me to license Java in a way that would endanger the livelihood of the developers working on it," Phipps said.
Sun has been very active in the GPL v3 process since the beginning. Phipps noted that he has every confidence that GPL v3 will be a license that will be usable in some areas of Sun's software business.
In the case of both OpenSolaris and Java, the respective communities will debate on whether or not GPL v3 is right for them, though, in the final analysis, the decision to actually use GPL v3 is up to Sun.
"Ultimately in each of those cases, Sun is the copyright holder and it is Sun that has to take the action," Phipps said. "So ultimately the decision is mine."
"I'm not going to pick a license that is still not published," Phipps said. "Licenses give freedom to developers and I need to know that the license chosen gives the developers that I'm serving and protecting the freedoms that they desire."
Copyright © 1996-2020 by Softpanorama Society. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.
This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...
|You can use PayPal to buy a cup of coffee for authors of this site|
Last updated: January 06, 2020