The invention of computers was the high point of the development of the US high-tech industry, the area which defined the progress of high tech as a whole. This is an area where the USA really was the greatest nation in the world, a real "shining city on the hill". The USA gave the world great programmers, hardware designers, network architects and managers. Those unique conditions, when the whole country was in effect the Silicon Valley of the world, were destroyed by the neoliberal transformation of society and, especially, the neoliberal transformation of higher education (when universities were turned into for-profit corporations run for the benefit of the university elite), which started in the 1980s and got up to full speed after 2001.
When ENIAC was declassified in 1946 (it made the front page of the New York Times), the computer revolution was put into fast motion. As early as the presidential election night of 1952, a UNIVAC computer correctly predicted the winner. While the chances were 50% ;-), this was an impressive introduction of computers into mainstream society. IBM, DEC, CDC and later Intel, HP, Apple and Dell emerged as the leading producers of hardware. With the advent of microprocessors, all major CPUs, including the Intel x86, Motorola 68000, PowerPC, etc., were US-designed. US programmers created all the major operating systems, such as OS/360, Multics, VM/CMS, VAX/VMS, Unix, CP/M, DOS, Windows, Mac System 7, the major Linux distributions such as Red Hat and Debian, and Android. In 1967 they wrote the first hypervisor (CP-67; later renamed CP/CMS, it was available to IBM customers from 1968 to 1972, in source code form without support). In 1972 IBM shipped the first commercial hypervisor, VM/370. Later the USA produced a series of impressive offerings in this area too, such as VirtualBox and VMware.
History is written by the winners, and the computer history of the 20th century was definitely written in the USA. If we assume that professional success is a mixture of natural abilities, hard labor and luck (including being born in the right place at the right time), as Malcolm Gladwell suggested in his unscientific but now popular "10,000 Hour Rule", it is clear that US scientists had all three components. But they were not alone -- conditions in Great Britain, Germany and France were not bad either. While we should take Gladwell's findings and his 10,000-hour rule with a grain of salt, they point to one interesting observation. Most of those whom I mention below were born between 1920 and 1955 -- a window of opportunity in computer science which has since been virtually closed. It is similar to the 1830-1840 window for the titans of the Gilded Age such as Rockefeller (1839), Carnegie (1835), Gould (1836) and J.P. Morgan (1837) that Gladwell mentioned. "No one—not rock stars, not professional athletes, not software billionaires, and not even geniuses—ever makes it alone", writes Gladwell.
At the same time it is important to see this history not only as "people, places and events" but also via artifacts, be they machines, programs or interviews with pioneers. This part of history is badly preserved in the USA. Moreover, there is a trend toward dumbing down the history of computer science. As Donald Knuth remarked (Kailath Lecture and Colloquia):
For many years the history of computer science was presented in a way that was useful to computer scientists. But nowadays almost all technical content is excised; historians are concentrating rather on issues like how computer scientists have been able to get funding for their projects, and/or how much their work has influenced Wall Street. We no longer are told what ideas were actually discovered, nor how they were discovered, nor why they are great ideas. We only get a scorecard.
Similar trends are occurring with respect to other sciences. Historians now generally prefer "external history" to "internal history", so that they can write stories that appeal to readers with almost no expertise.
Historians of mathematics have thankfully been resisting such temptations. In this talk the speaker will explain why he is so grateful for the continued excellence of papers on mathematical history, and he will make a plea for historians of computer science to get back on track.
History is always written by the winners, and that means right now it is written by neoliberals. Dumbing down the history of computer science is just the application of neoliberalism to one particular narrow field. The essence of the neoliberal approach to history is to dumb down everything: a deliberate lowering of the intellectual level of education, literature, cinema, news, and culture. Deliberate dumbing down is the goal.
They use the power of vanity to rob us of the vision which history can provide. Knuth's lecture "Let's Not Dumb Down the History of Computer Science" can be viewed at Kailath Lecture and Colloquia. He made the important point that historical errors are as important as achievements, and probably more educational. In this "drama of ideas" (and he mentioned the high educational value of the errors/blunders of Linus Torvalds in the design of the Linux kernel) errors and achievements all have their place and historical value. History gives people stories that are much more educational than anything else; that's the way people learn best.
Giants of the field were either US citizens or people who worked in the USA for a long time. Among them:
The people mentioned above are all associated with the USA. And I named just a few whose work I know personally... US computer science research was often conducted in close collaboration with British computer scientists, who also made significant contributions (some of the most impressive IBM compilers were actually designed and implemented in Britain), but the leadership role of the USA was indisputable. CACM was always a more important publication than the Computer Journal.
A large part of this unique technology culture was destroyed by the outsourcing frenzy which started around 1998, but the period from approximately 1950 until approximately 2000 was really the triumph of US computer engineering. Simultaneously it was a triumph of New Deal policies. When they were dismantled (starting with Reagan, or even Carter), and neoliberalism became the ruling ideology, computer science was quickly overtaken by commercial interests and became very similar to economics in the level of corruption of academics and academic institutions.
But that did not happen overnight, and the inertia lasted until the late 1990s.
Firms did not escape this transformation into money-making machines either, with IBM as a primary example of the disastrous results of such a transformation, which started under the "American Express"-style leadership of Lou Gerstner -- the first master of financial shenanigans to become CEO of a major technical company, of the kind who would later destroy several other major US computer companies in the interests of shareholders and personal bonuses ;-). See IBM marry Linux to Outsourcing.
Here is a timeline, modified from History of Computer Science:
In 1949 the U.S. Army and the University of Illinois jointly funded the construction of two computers, ORDVAC and ILLIAC (ILLInois Automated Computer), and the Digital Computer Laboratory was organized, with Ralph Meagher, a physicist and chief engineer for ORDVAC, as head. In 1951 ORDVAC (Ordnance Variable Automated Computer), one of the fastest computers in existence, was completed. In 1952 ORDVAC moved to the Army Ballistic Research Laboratory in Aberdeen, Maryland; it was used remotely from the University of Illinois via a teletype circuit for up to eight hours each night until the ILLIAC computer was completed.
Grace Murray Hopper (1906-1992) invented the notion of a compiler at Remington Rand in 1951. Earlier, in 1947, Hopper found the first computer "bug" -- a real one -- a moth that had gotten into the Harvard Mark II. (Actually, the use of "bug" to mean defect goes back to at least 1889.) The first compiler was written by Hopper in 1952, for the A-0 System language, and the term "compiler" was coined by her. See History of compiler construction - Wikipedia.
In a famous paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" using a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could be fairly said that the computer was "thinking".
In 1952, Alan Turing was arrested for "gross indecency" after a burglary led to the discovery of his affair with Arnold Murray. Overt homosexuality was taboo in 1950s England, and Turing was forced to take estrogen "treatments" which rendered him impotent and caused him to grow breasts. On June 7, 1954, despondent over his situation, Turing committed suicide by eating an apple laced with cyanide.
In the same year, 1952, ILLIAC, the first computer built and owned entirely by an educational institution, became operational. It was ten feet long, two feet wide, and eight and one-half feet high, contained 2,800 vacuum tubes, and weighed five tons.
In the same year IBM developed the first magnetic disk. In September 1952, IBM opened a facility in San Jose, Calif. -- a critical moment in the story of Silicon Valley. The company set to work developing a new kind of magnetic memory for its planned Model 305 RAMAC (Random Access Method of Accounting and Control), the world's first "supercomputer."
In 1952 UNIVAC correctly predicted the results of the presidential election in the USA. Remington Rand seized the opportunity to introduce itself to America as the maker of UNIVAC -- the computer system whose name would become synonymous with "computer" in the 1950s. Remington Rand was already widely known as the company that made Remington typewriters. It had bought out the struggling Eckert-Mauchly Computer Corporation in 1950. Pres Eckert and John Mauchly had led the ENIAC project and made one of the first commercially available computers, UNIVAC. See the Computer History Museum's "Have you got a prediction for us, UNIVAC?"
The IBM 650 Magnetic Drum Data Processing Machine was announced 2 July 1953 (as the "Magnetic Drum Calculator", or MDC), but not delivered until December 1954 (same time as the NORC). Principal designer: Frank Hamilton, who had also designed ASCC and SSEC. Two IBM 650s were installed at IBM Watson Scientific Computing Laboratory at Columbia University, 612 West 116th Street, beginning in August 1955.
Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for the minimum spanning tree in order to minimize the wiring needed for the X1 computer. (Dijkstra is famous for his caustic, opinionated memos. For example, see his opinions of some programming languages).
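The algorithm Dijkstra demonstrated on the ARMAC remains the standard approach to single-source shortest paths. A minimal sketch in Python (the graph representation and function name are my own choices; a binary heap stands in for the simple array scan of 1956):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.

    graph: dict mapping node -> list of (neighbor, weight) pairs.
    Returns a dict mapping each reachable node to its distance.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

For example, on the graph `{'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}` the call `dijkstra(graph, 'a')` finds the shorter two-hop route to `c` through `b`.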
In 1956 IBM 305 RAMAC was announced. It was the first commercial computer that used a moving head hard disk drive (magnetic disk storage) for secondary storage. The 305 was one of the last vacuum tube computers that IBM built. The IBM 350 disk system stored 5 million 8-bit (7 data bits plus 1 parity bit) characters. It had fifty 24-inch-diameter (610 mm) disks.
The same year the Case Institute of Technology Computing Center got an IBM 650, and the same year Donald Knuth entered Case and managed to start working at the Computing Center. That later led to the creation of his three-volume series The Art of Computer Programming -- "the bible of programming", as it was called.
On October 4, 1957, the first artificial Earth satellite, Sputnik, was launched by the USSR into an elliptical low Earth orbit. In a way it was a happenstance, due to the iron will and talent of Sergey Korolev, the charismatic head of the USSR rocket program (who had actually served some years in the GULAG). But it opened a new era. ILLIAC I, the pioneering computer built in 1952 by the University of Illinois and the first computer built and owned entirely by a US educational institution, was the first to calculate Sputnik's orbit. The launch of Sputnik led to the creation of NASA and, indirectly, of the US Advanced Research Projects Agency (later DARPA) in February 1958, to regain a technological lead. It also led to a dramatic increase in U.S. government spending on scientific research and education via President Eisenhower's National Defense Education Act. This bill encouraged students to go to college and study math and science; their tuition fees would be paid. It brought a new emphasis on science and technology to American schools. In other words, Sputnik created the building blocks which probably shaped the way computer science developed in the USA for the next decade or two. DARPA later funded the creation of the TCP/IP protocol and the Internet as we know it. It also contributed to the development of large-scale integrated circuits. The rivalry in space, even though it had military reasons, served as a tremendous push forward for computers and computer science.
John Backus and his team delivered the first complete compiler -- the FORTRAN compiler -- in April 1957. FORTRAN stands for FORmula TRANslating system. Backus went on to contribute to the development of ALGOL and the well-known syntax-specification system known as BNF. The first FORTRAN compiler took 18 person-years to create.
LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. The same year Alan Perlis, John Backus, Peter Naur and others developed Algol.
In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) invented the integrated circuit in 1959.
In 1959 LISP 1.5 appears. The same year COBOL is created by the Conference on Data Systems and Languages (CODASYL).
See also Knuth Biographic Notes
Operating systems saw major advances. Fred Brooks at IBM designed System/360, a line of different computers with the same architecture and instruction set, from small machines to the top of the line. DEC designed the PDP series. The first PDP-1 was delivered to Bolt, Beranek and Newman in November 1960, and formally accepted the next April. The PDP-1 sold in basic form for $120,000, or about $900,000 in 2011 US dollars. By the time production ended in 1969, 53 PDP-1s had been delivered. At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed.
In 1960 ALGOL 60, the first block-structured language, appears. It is the root of the family tree that would ultimately produce PL/I, ALGOL 68, Pascal, Modula, C, Java, C# and other languages. ALGOL became a popular language in Europe in the mid- to late 1960s. Attempts to simplify ALGOL led to the creation of BASIC (developed c. 1964 by John Kemeny (1926-1992) and Thomas Kurtz (b. 1928)), which became very popular with the PC revolution.
The 1960's also saw the rise of automata theory and the theory of formal languages. Big names here include Noam Chomsky and Michael Rabin. Chomsky introduced the notion of context free languages and later became well-known for his theory that language is "hard-wired" in human brains, and for his criticism of American foreign policy.
Sometime in the early 1960s, Kenneth Iverson began work on the language that would become APL -- A Programming Language. It uses a specialized character set that, for proper use, requires APL-compatible I/O devices. APL is documented in Iverson's book A Programming Language, published in 1962.
In 1962 ILLIAC II, a transistorized computer 100 times faster than the original ILLIAC, became operational. ACM Computing Reviews said of the machine: "ILLIAC II, at its conception in the mid-1950s, represents the spearhead and breakthrough into a new generation of machines." In 1963 Professor Donald B. Gillies discovered three Mersenne primes while testing ILLIAC II, including the largest prime then known, 2^11213 - 1, which is over 3,000 digits long.
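The standard way to check whether a Mersenne number 2^p - 1 is prime, then as now, is the Lucas-Lehmer test that ILLIAC II ran. A minimal Python sketch (the function name is mine):

```python
def lucas_lehmer(p):
    """Lucas-Lehmer primality test for the Mersenne number 2**p - 1,
    where p is an odd prime. Iterates s -> s*s - 2 (mod 2**p - 1)
    p - 2 times starting from s = 4; the number is prime iff s ends at 0."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0
```

For instance, `lucas_lehmer(13)` confirms that 2^13 - 1 = 8191 is prime, while `lucas_lehmer(11)` correctly rejects 2047 = 23 × 89. Gillies's prime 2^11213 - 1 has 3376 decimal digits, matching the "over 3,000 digits" above.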
The famous IBM System/360 (S/360) was announced by IBM on April 7, 1964. The S/360 became the most popular line of computer systems for more than a decade. It introduced the 8-bit byte, byte addressing, a 24-bit address space and many other things. The same year (1964) PL/I was released. It became the most widely used programming language in Eastern Europe and the USSR, and later served as a prototype of several other languages, including PL/M and C.
In 1964 the IBM 2311 Direct Access Storage Facility was introduced (History of IBM magnetic disk drives - Wikipedia,) for the System/360 series. It was also available on the IBM 1130 and (using the 2841 Control Unit) the IBM 1800. The 2311 mechanism was largely identical to the 1311, but recording improvements allowed higher data density. The 2311 stored 7.25 megabytes on a single removable IBM 1316 disk pack (the same type used on the IBM 1311) consisting of six platters that rotated as a single unit. Each recording surface had 200 tracks plus three optional tracks which could be used as alternatives in case faulty tracks were discovered. Average seek time was 85 ms. Data transfer rate was 156 kB/s.
Along with the development of the unified System/360 series of computers, IBM wanted a single programming language for all users. It hoped that Fortran could be extended to include the features needed by commercial programmers. In October 1963 a committee was formed, composed originally of three IBMers from New York and three members of SHARE, the IBM scientific users group, to propose these extensions to Fortran. Given the constraints of Fortran, they were unable to do this and embarked on the design of a "new programming language" based loosely on ALGOL, labeled "NPL". This acronym conflicted with that of the UK's National Physical Laboratory and was replaced briefly by MPPL (MultiPurpose Programming Language) and, in 1965, by PL/I (with a Roman numeral "I"). The first definition appeared in April 1964.

IBM took NPL as a starting point and completed the design to a level at which the first compiler could be written: the NPL definition was incomplete in scope and in detail. Control of the PL/I language was vested initially in the New York Programming Center and later at the IBM UK Laboratory at Hursley. The SHARE and GUIDE user groups were involved in extending the language and had a role in IBM's process for controlling the language through their PL/I Projects. The language was first specified in detail in the manual "PL/I Language Specifications. C28-6571", written in New York from 1965, and superseded by "PL/I Language Specifications. GY33-6003", written in Hursley from 1967. IBM continued to develop PL/I in the late sixties and early seventies, publishing it in the GY33-6003 manual. These manuals were used by the Multics group and other early implementers.

The first production PL/I compiler was the PL/I F compiler for the OS/360 operating system, built by John Nash's team at Hursley in the UK; the runtime library team was managed by I.M. (Nobby) Clarke. Release 1 shipped in 1966. It was a significant step forward in comparison with earlier compilers.
The PL/I D compiler, using 16 kilobytes of memory, was developed by IBM Germany for the DOS/360 low end operating system. It implemented a subset of the PL/I language requiring all strings and arrays to have fixed extents, thus simplifying the run-time environment. Reflecting the underlying operating system it lacked dynamic storage allocation and the controlled storage class. It was shipped within a year of PL/I F.
C.A.R. Hoare invented Quicksort around 1960, while on a visit to Moscow State University.
Douglas C. Engelbart invented the computer mouse c. 1968, at SRI.
The first volume of The Art of Computer Programming was published in 1968 and instantly became a classic. Donald Knuth (b. 1938) later published two additional volumes of his world-famous treatise.
In 1968 ALGOL 68 , a monster language compared to ALGOL 60, appears. Some members of the specifications committee -- including C.A.R. Hoare and Niklaus Wirth -- protest its approval. ALGOL 68 proves difficult to implement. The same year Niklaus Wirth begins his work on a simple teaching language which later becomes Pascal.
Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971.
In the late 1960s the PDP-11, one of the first 16-bit minicomputers, was designed in a crash program by Harold McFarland, Gordon Bell, Roger Cady, and others, as a response to Data General's 16-bit Nova minicomputers. The project was able to leap forward in design with the arrival of McFarland, who had been researching 16-bit designs at Carnegie Mellon University. One of his simpler designs became the PDP-11. It was launched in 1970 and became a huge success. The first officially named version of Unix ran on the PDP-11/20 in 1970. It is commonly stated that the C programming language took advantage of several low-level PDP-11-dependent programming features, albeit not originally by design. A major advance in the PDP-11 design was Digital's Unibus, which supported all peripherals through memory mapping. This allowed a new device to be added easily, generally requiring only plugging a hardware interface board into the backplane and then installing software that read and wrote the mapped memory to control it. The relative ease of interfacing spawned a huge market of third-party add-ons for the PDP-11, which made the machine even more useful. The combination of architectural innovations proved superior to competitors, and the "11" architecture was soon the industry leader, propelling DEC back to a strong market position.
A second generation of programming languages, such as BASIC, ALGOL 68 and Pascal (designed by Niklaus Wirth in 1968-1969), appeared at the end of the decade.
Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson (b. 1943) and Dennis Ritchie (b. 1941) after AT&T withdrew from the Multics project. Ritchie developed C, which became the most influential systems programming language and was also used as a general-purpose language on personal computers; Brian Kernighan co-authored its classic description with Ritchie. The first release of C was made in 1972; the definitive reference manual for it would not appear until 1974.
In the early 1970s the PL/I Optimizer and Checkout compilers produced in Hursley supported a common level of PL/I language and aimed to replace the PL/I F compiler. The compilers had to produce identical results -- the Checkout Compiler was used to debug programs that would then be submitted to the Optimizer. Given that the compilers had entirely different designs and were handling the full PL/I language, this goal was challenging: it was achieved.

The PL/I Optimizing Compiler took over from the PL/I F compiler and was IBM's workhorse compiler from the 1970s to the 1990s. Like PL/I F, it was a multiple-pass compiler with a 44-kilobyte design point, but it was an entirely new design. Unlike the F compiler, it had to perform compile-time evaluation of constant expressions using the run-time library, reducing the maximum memory for a compiler phase to 28 kilobytes. A second-time-around design, it succeeded in eliminating the annoyances of PL/I F, such as cascading diagnostics. It was written in S/360 Macro Assembler by a team, led by Tony Burbridge, most of whom had worked on PL/I F. Macros were defined to automate common compiler services and to shield the compiler writers from the task of managing real-mode storage, allowing the compiler to be moved easily to other memory models. Program optimization techniques developed for the contemporary IBM Fortran H compiler were deployed: the Optimizer equaled Fortran execution speeds in the hands of good programmers. Announced with the IBM S/370 in 1970, it shipped first for the DOS/360 operating system in August 1971, and shortly afterward for OS/360 and the first virtual-memory IBM operating systems OS/VS1, MVS and VM/CMS (the developers were unaware that while they were shoehorning the code into 28 kB sections, IBM Poughkeepsie was finally ready to ship virtual memory support in OS/360). It supported batch programming environments and, under TSO and CMS, it could be run interactively.
Simultaneously PL/C, a dialect of PL/I for education, was developed at Cornell University in the early 1970s. It was designed with the specific goal of being used for teaching programming. The main authors were Richard W. Conway and Thomas R. Wilcox. They published the famous article "Design and implementation of a diagnostic compiler for PL/I" in the Communications of the ACM in March 1973. PL/C eliminated some of the more complex features of PL/I and added extensive debugging and error-recovery facilities. The PL/C compiler had the unusual capability of never failing to compile any program, through the use of extensive automatic correction of many syntax errors and by converting any remaining syntax errors to output statements.
In 1972 Gary Kildall implemented a subset of PL/I, called PL/M, for microprocessors. PL/M was used to write the CP/M operating system -- and much application software running on CP/M and MP/M. Digital Research also sold a PL/I compiler for the PC written in PL/M. PL/M was used to write much other software at Intel for the 8080, 8085, and Z80 processors during the 1970s.
In 1973-74 Gary Kildall developed CP/M, an operating system for an Intel Intellec-8 development system equipped with a Shugart Associates 8-inch floppy disk drive interfaced via a custom floppy disk controller. It was written in PL/M. Various aspects of CP/M were influenced by the TOPS-10 operating system of the DECsystem-10 mainframe computer, which Kildall had used as a development environment.
The LSI-11 (PDP-11/03), introduced in February 1975, was the first PDP-11 model produced using large-scale integration -- a precursor to the personal computer.
The first RISC architecture was begun by John Cocke in 1975, at the Thomas J. Watson Laboratories of IBM. Similar projects started at Berkeley and Stanford around this time.
In March 1976 one of the first supercomputers, the CRAY-1, designed by Seymour Cray (b. 1925), was shipped. It could perform 160 million operations per second. The Cray X-MP came out in 1982. Later Cray Research was taken over by Silicon Graphics.
There were also major advances in algorithms and computational complexity. In 1971, Stephen Cook published his seminal paper on NP-completeness, and shortly thereafter Richard Karp showed that many natural combinatorial problems were NP-complete. In 1976 Whitfield Diffie and Martin Hellman published a paper that introduced the theory of public-key cryptography, and in 1977 a public-key cryptosystem known as RSA was invented by Ronald Rivest, Adi Shamir, and Leonard Adleman.
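The arithmetic behind RSA fits in a few lines. A toy sketch in Python with textbook-sized primes (the numbers are chosen for illustration only; real RSA uses primes hundreds of digits long plus padding, and the three-argument `pow` with exponent -1 needs Python 3.8+):

```python
# Toy RSA key generation and round trip -- illustrative, not secure.
p, q = 61, 53                # two small primes (secret)
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: e*d = 1 (mod phi)

m = 65                       # message, must be less than n
c = pow(m, e, n)             # encrypt: c = m^e mod n
assert pow(c, d, n) == m     # decrypt: c^d mod n recovers m
```

Anyone can encrypt with the public pair (n, e); only the holder of d can decrypt, and recovering d from (n, e) is believed to require factoring n.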
Microsoft was formed on April 4, 1975 to develop and sell BASIC interpreters for the Altair 8800. Bill Gates and Paul Allen wrote a version of BASIC that they sold to MITS (Micro Instrumentation and Telemetry Systems) on a per-copy royalty basis. MITS was producing the Altair, one of the earliest 8080-based microcomputers, which came with an interpreter for a programming language.
The Apple I went on sale in July 1976 and was market-priced at $666.66 ($2,572 in 2011 dollars, adjusted for inflation.)
The Apple II was introduced on April 16, 1977 at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because it came with color graphics and an open architecture. While early models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1/4 inch floppy disk drive and interface, the Disk II.
In 1976, DEC decided to extend the PDP-11 architecture to 32 bits while adding a complete virtual memory system to the simple paging and memory protection of the PDP-11. The result was the VAX architecture. The first computer to use a VAX CPU was the VAX-11/780, which DEC referred to as a superminicomputer. Although it was not the first 32-bit minicomputer, the VAX-11/780's combination of features, price, and marketing almost immediately propelled it to a leadership position in the market after it was released in 1978. VAX systems were so successful that they propelled Unix to the status of a major OS. In 1983, DEC canceled its Jupiter project, which had been intended to build a successor to the PDP-10 mainframe, and instead focused on promoting the VAX as the single computer architecture for the company.
In 1978 AWK -- a text-processing language named after the designers, Aho, Weinberger, and Kernighan -- appears. The same year the ANSI standard for FORTRAN 77 appears.
In 1977 Bill Joy, then a graduate student at Berkeley, started compiling the first Berkeley Software Distribution (1BSD), which was released on March 9, 1978
In 1979, three graduate students in North Carolina developed a distributed news server which eventually became Usenet.
The Second Berkeley Software Distribution (2BSD), was released in May 1979. It included updated versions of the 1BSD software as well as two new programs by Joy that persist on Unix systems to this day: the vi text editor (a visual version of ex) and the C shell.
The same year, 1979, VisiCalc, the first spreadsheet program available for personal computers, was conceived by Dan Bricklin, refined by Bob Frankston, developed by their company Software Arts, and distributed by Personal Software (later renamed VisiCorp) for the Apple II computer.
At the end of 1979 the kernel of BSD Unix was largely rewritten by Berkeley students to include a virtual memory implementation, and a complete operating system, including the new kernel and ports of the 2BSD utilities to the VAX, was released as 3BSD.
Microsoft purchased a license for Version 7 Unix from AT&T in 1979, and announced on August 25, 1980 that it would make it available for the 16-bit microcomputer market.
The success of 3BSD was a major factor in the Defense Advanced Research Projects Agency's (DARPA) decision to fund Berkeley's Computer Systems Research Group (CSRG), which would develop a standard Unix platform for future DARPA research in the VLSI Project, including a TCP/IP stack. 4BSD, released in November 1980, offered a number of enhancements over 3BSD, notably job control in the previously released csh, delivermail (the antecedent of sendmail), "reliable" signals, and the curses programming library.
This decade also saw the rise of the personal computer, thanks to Steve Wozniak and Steve Jobs, founders of Apple Computer.
In 1981 the IBM PC was launched, which made personal computers mainstream. The first computer viruses were also developed in 1981; the term was coined by Leonard Adleman, now at the University of Southern California. The same year the first truly successful portable computer (a predecessor of modern laptops), the Osborne 1, was marketed.
In 1982 one of the first scripting languages REXX was released by IBM as a product. It was four years after AWK was released. Over the years IBM included REXX in almost all of its operating systems (VM/CMS, VM/GCS, MVS TSO/E, AS/400, VSE/ESA, AIX, CICS/ESA, PC DOS, and OS/2), and has made versions available for Novell NetWare, Windows, Java, and Linux.
In 1982 PostScript appears, which revolutionized printing on dot matrix and laser printers.
1983 was a year of major events in the programming language area.
In 1984 Stallman published a rewritten version of Gosling's Emacs as GNU Emacs (Gosling had sold the rights to his code to a commercial company), released it as "free" software, and launched the Free Software Foundation (FSF) to support the GNU project. One of the first programs he decided to write was a C compiler, which became widely known as gcc. The same year Steven Levy's book "Hackers" was published, with a chapter devoted to RMS that presented him in an extremely favorable light.
In January 1984 Apple introduced the Macintosh, the first mass-produced GUI-based personal computer. It came roughly three years after the IBM PC was launched and seven years after the Apple II launch. It went on sale on January 24, 1984, two days after Ridley Scott's US$1.5 million television commercial "1984" aired during Super Bowl XVIII on January 22, 1984. The ad, now considered a masterpiece, featured an unnamed heroine representing the coming of the Macintosh (indicated by a Picasso-style picture of Apple's Macintosh computer on her white tank top) as a means of saving humanity from the "conformity" of IBM's attempts to dominate the computer industry.
In 1985 the Intel 80386 introduced 32-bit logical addressing. It became instrumental in the Unix renaissance, which started the same year with the launch of Xenix 2.0 by Microsoft, based on UNIX System V. An update numbered 2.1.1 added support for the Intel 80286 processor. The Sperry PC/IT, an IBM PC AT clone, was advertised as capable of supporting eight simultaneous dumb terminal users under this version. Subsequent releases improved System V compatibility. The era of PC Unix had begun, and Microsoft became the dominant vendor of Unix: in the late 1980s, Xenix was, according to The Design and Implementation of the 4.3BSD UNIX Operating System, "probably the most widespread version of the UNIX operating system, according to the number of machines on which it runs". In 1987, SCO ported Xenix to the 386 processor. Microsoft used Xenix on Sun workstations and VAX minicomputers extensively within the company as late as 1992.
Microsoft Excel was first released for Macintosh, not IBM PC, in 1985. The same year the combination of the Mac, Apple's LaserWriter printer, and Mac-specific software like Boston Software's MacPublisher and Aldus PageMaker enabled users to design, preview, and print page layouts complete with text and graphics—an activity to become known as desktop publishing.
The first version of GCC was able to compile itself in late 1985. The same year the GNU Manifesto was published.
In 1986-1989 a series of computer viruses for PC DOS made headlines. One of the first mass viruses was a boot-sector virus called Brain, created in 1986 by the Farooq Alvi brothers in Lahore, Pakistan, reportedly to deter piracy of software they had written.
In 1987, the US National Science Foundation started NSFNET, a precursor to part of today's Internet.
The same year, 1987, Perl was released by Larry Wall. In 1988 Perl 2 was released.
In 1985 Steve Jobs was ousted from Apple and formed a new company, NeXT Computer, with a dozen former Apple employees. The NeXT cube, released in 1988, was the first affordable workstation with over a megaflop of computing power; the smaller NeXTstation followed in 1990. It was a NeXT machine that was used to develop the World Wide Web at CERN. NeXT was also instrumental in creating complex modern GUI interfaces and launching object-oriented programming into the mainstream.
In 1988 the human genome sequencing project started. A Brief History of the Human Genome Project:
In 1988, Congress funded both the NIH and the DOE to embark on further exploration of this concept, and the two government agencies formalized an agreement by signing a Memorandum of Understanding to "coordinate research and technical activities related to the human genome."
James Watson was appointed to lead the NIH component, which was dubbed the Office of Human Genome Research. The following year, the Office of Human Genome Research evolved into the National Center for Human Genome Research (NCHGR).
In 1990, the initial planning stage was completed with the publication of a joint research plan, "Understanding Our Genetic Inheritance: The Human Genome Project, The First Five Years, FY 1991-1995." This initial research plan set out specific goals for the first five years of what was then projected to be a 15-year research effort.
In 1992, Watson resigned, and Michael Gottesman was appointed acting director of the center. The following year, Francis S. Collins was named director.
The advent and employment of improved research techniques, including the use of restriction fragment-length polymorphisms, the polymerase chain reaction, bacterial and yeast artificial chromosomes and pulsed-field gel electrophoresis, enabled rapid early progress. Therefore, the 1990 plan was updated with a new five-year plan announced in 1993 in the journal Science (262: 43-46; 1993).
In 1989 the FSF introduced the GNU General Public License (GPL), also known as 'copyleft'. Stallman redefined the word "free" in software to mean "GPL compatible". In 1990, as president of the League for Programming Freedom (an organization that fights software patents), Stallman was given a $240,000 fellowship by the John D. and Catherine T. MacArthur Foundation.
4.3BSD-Reno came in early 1990. It was an interim release during the early development of 4.4BSD, and its use was considered a "gamble", hence the naming after the gambling center of Reno, Nevada. This release was explicitly moving towards POSIX compliance. Among the new features was an NFS implementation from the University of Guelph. In August 2006, Information Week magazine rated 4.3BSD as the "Greatest Software Ever Written", commenting: "BSD 4.3 represents the single biggest theoretical undergirder of the Internet."
On December 25, 1990 the first successful communication between a Hypertext Transfer Protocol (HTTP) client and server via the Internet was accomplished at CERN. It was running on a NeXT:
" Mike Sendall buys a NeXT cube for evaluation, and gives it to Tim [Berners-Lee]. Tim's prototype implementation on NeXTStep is made in the space of a few months, thanks to the qualities of the NeXTStep software development system. This prototype offers WYSIWYG browsing/authoring! Current Web browsers used in "surfing the Internet" are mere passive windows, depriving the user of the possibility to contribute. During some sessions in the CERN cafeteria, Tim and I try to find a catching name for the system. I was determined that the name should not yet again be taken from Greek mythology. Tim proposes "World-Wide Web". I like this very much, except that it is difficult to pronounce in French..." by Robert Cailliau, 2 November 1995.
In 1991 Linux was launched. The same year the USSR was dissolved, which led to an influx of Russian programmers (as well as programmers from Eastern European countries) into the USA.
The first website was online on 6 August 1991:
"Info.cern.ch was the address of the world's first-ever web site and web server, running on a NeXT computer at CERN. The first web page address was http://info.cern.ch/hypertext/WWW/TheProject.html, which centred on information regarding the WWW project. Visitors could learn more about hypertext, technical details for creating their own webpage, and even an explanation on how to search the Web for information. There are no screenshots of this original page and, in any case, changes were made daily to the information available on the page as the WWW project developed. You may find a later copy (1992) on the World Wide Web Consortium website." -CERN
BSDi, the company formed to commercialize the BSD Unix system, found itself in legal trouble with AT&T's Unix System Laboratories (USL) subsidiary, then the owner of the System V copyright and the Unix trademark. The USL v. BSDi lawsuit was filed in 1992 and led to an injunction on the distribution of Net/2 until the validity of USL's copyright claims on the source could be determined. That launched Linux into the mainstream.
FreeBSD development began in 1993 with a quickly growing, unofficial patchkit maintained by users of the 386BSD operating system. This patchkit forked from 386BSD and grew into an operating system based on U.C. Berkeley's 4.3BSD-Lite (Net/2) tape, with many 386BSD components and code from the Free Software Foundation.
In April 1993 CERN released the web technology into the public domain.
In 1994 the first official Linux kernel, version 1.0, was released. Linux already had about 500,000 users. The Unix renaissance had started.
The same year, 1994, Microsoft incorporated Visual Basic for Applications into Excel, creating a way for Microsoft Office to knock out its competition.
In February 1995, ISO accepts the 1995 revision of the Ada language. Called Ada 95, it includes OOP features and support for real-time systems.
In 1995 TCP/IP connectivity in the USA became mainstream, and the Internet boom (aka the dot-com boom) hit the USA. Red Hat was formed by a merger with ACC, with Robert Young of ACC (a co-founder of Linux Journal) as CEO.
In 1995 Java was released. Although a weak and primitive programming language if we consider its design (it was originally intended for embedded systems), it proved to be a durable and successful successor to Cobol. Sun Microsystems proved to be a capable marketing machine, but that led to the deterioration of Solaris's position and the partial neglect of other projects such as Solaris on x86 and Tcl. Microsoft launched a successful derivative of Java, called C#, in 2002.
In 1996 the first computer monitoring systems, such as Tivoli and OpenView, became established players.
In 1998 the first ANSI/ISO C++ standard was released.
In 1998 outsourcing, which within ten years would destroy the USA programming industry, became fashionable, fueled by the financial industry's attempts to exploit the Internet boom for quick profits.
In 1999 a craze connected with the so-called Millennium bug (Y2K) hit the USA. It demonstrated the lasting intellectual deterioration of some key US political figures, including Fed chairman Greenspan, a cult-like figure at the time.
In March 1999 Al Gore claimed that "During my service in the United States Congress, I took the initiative in creating the internet," which was partially true.
This decade ended with the dot-com bust of 2000. See Nikolai Bezroukov, Portraits of Open Source Pioneers, Ch. 4: Grand Replicator aka Benevolent Dictator (A Slightly Skeptical View on Linus Torvalds).
Oct 15, 2018 | www.theverge.com
Microsoft co-founder Paul Allen died today from complications with non-Hodgkin's lymphoma. He was 65. Allen said earlier this month that he was being treated for the disease.
Allen was a childhood friend of Bill Gates, and together, the two started Microsoft in 1975. He left the company in 1983 while being treated for Hodgkin's lymphoma and remained a board member with the company through 2000. He was first treated for non-Hodgkin's lymphoma in 2009, before seeing it go into remission.
In a statement given to ABC News , Gates said he was "heartbroken by the passing of one of my oldest and dearest friends." He went on to commend his fellow co-founder for his life after Microsoft:
From our early days together at Lakeside School, through our partnership in the creation of Microsoft, to some of our joint philanthropic projects over the years, Paul was a true partner and dear friend. Personal computing would not have existed without him.
But Paul wasn't content with starting one company. He channelled his intellect and compassion into a second act focused on improving people's lives and strengthening communities in Seattle and around the world. He was fond of saying, "If it has the potential to do good, then we should do it." That's the kind of person he was.
Paul loved life and those around him, and we all cherished him in return. He deserved much more time, but his contributions to the world of technology and philanthropy will live on for generations to come. I will miss him tremendously.
Microsoft CEO Satya Nadella said Allen's contributions to both Microsoft and the industry were "indispensable." His full statement is quoted below:
Paul Allen's contributions to our company, our industry, and to our community are indispensable. As co-founder of Microsoft, in his own quiet and persistent way, he created magical products, experiences and institutions, and in doing so, he changed the world. I have learned so much from him -- his inquisitiveness, curiosity, and push for high standards is something that will continue to inspire me and all of us as Microsoft. Our hearts are with Paul's family and loved ones. Rest in peace.
In a memoir published in 2011, Allen says that he was responsible for naming Microsoft and creating the two-button mouse. The book also portrayed Allen as going under-credited for his work at Microsoft, and Gates as having taken more ownership of the company than he deserved. It created some drama when it arrived, but the two men ultimately appeared to remain friends, posing for a photo together two years later.
After leaving Microsoft, Allen became an investor through his company Vulcan, buying into a diverse set of companies and markets. Vulcan's current portfolio ranges from the Museum of Pop Culture in Seattle, to a group focused on using machine learning for climate preservation, to Stratolaunch, which is creating a spaceplane . Allen's investments and donations made him a major name in Seattle, where much of his work was focused. He recently funded a $46 million building in South Seattle that will house homeless and low-income families.
Both Apple CEO Tim Cook and Google CEO Sundar Pichai called Allen a tech "pioneer" while highlighting his philanthropic work in statements on Twitter. Amazon CEO Jeff Bezos said Allen's work "inspired so many."
Allen has long been the owner of the Portland Trail Blazers and Seattle Seahawks as well. NFL Commissioner Roger Goodell said Allen "worked tirelessly" to "identify new ways to make the game safer and protect our players from unnecessary risk." NBA Commissioner Adam Silver said Allen "helped lay the foundation for the league's growth internationally and our embrace of new technologies."
He also launched a number of philanthropic efforts, which were later combined under the name Paul G. Allen Philanthropies. His "philanthropic contributions exceed $2 billion," according to Allen's own website, and he had committed to giving away the majority of his fortune.
Allen's sister, Jody Allen, wrote a statement on his family's behalf:
My brother was a remarkable individual on every level. While most knew Paul Allen as a technologist and philanthropist, for us he was a much loved brother and uncle, and an exceptional friend.
Paul's family and friends were blessed to experience his wit, warmth, his generosity and deep concern. For all the demands on his schedule, there was always time for family and friends. At this time of loss and grief for us – and so many others – we are profoundly grateful for the care and concern he demonstrated every day.
Some of Allen's philanthropy has taken a scientific bent: Allen founded the Allen Institute for Brain Science in 2003, pouring $500 million into the non-profit that aims to give scientists the tools and data they need to probe how brain works. One recent project, the Allen Brain Observatory , provides an open-access "catalogue of activity in the mouse's brain," Saskia de Vries, senior scientist on the project, said in a video . That kind of data is key to piecing together how the brain processes information.
In an interview with Matthew Herper at Forbes , Allen called the brain "hideously complex" -- much more so than a computer. "As an ex-programmer I'm still just curious about how the brain functions, how that flow of information really happens," he said . After founding the brain science institute, Allen also founded the Allen Institute for Artificial Intelligence and the Allen Institute for Cell Science in 2014, as well as the Paul G. Allen Frontiers Group in 2016 , which funds cutting-edge research.
Even back in 2012, when Allen spoke with Herper at Forbes , he talked about plans for his financial legacy after his death -- and he said that a large part of it would be "allocated to this kind of work for the future."
In a statement emailed to The Verge, The Allen Institute's President and CEO Allan Jones said:
Paul's vision and insight have been an inspiration to me and to many others both here at the Institute that bears his name, and in the myriad of other areas that made up the fantastic universe of his interests. He will be sorely missed. We honor his legacy today, and every day into the long future of the Allen Institute, by carrying out our mission of tackling the hard problems in bioscience and making a significant difference in our respective fields.
According to Quincy Jones, Allen was also an excellent guitar player .
Oct 15, 2018 | science.slashdot.org
bennet42 ( 1313459 ), Monday October 15, 2018 @08:24PM ( #57483472 )
Re:RIP Paul! ( Score: 5, Informative)
Man what a shock! I was lucky enough to be working at a Seattle startup that Paul bought back in the 90s ( doing VoIP SOHO phone systems ). He liked to swing by the office on a regular basis as we were just a few blocks from Dick's hamburgers on Mercer St (his favorite). He was really an engineer's engineer. We'd give him a status report on how things were going and within a few minutes he was up at the white board spitballing technical solutions to ASIC or network problems. I especially remember him coming by the day he bought the Seahawks. Paul was a big physical presence ( 6'2" 250lbs in those days ), but he kept going on about how after meeting the Seahawks players, he never felt so physically small in his life. Ignore the internet trolls. Paul was a good guy. He was a humble, modest, down-to-earth guy. There was always a pick-up basketball game on his court on Thursday nights. Jam sessions over at his place were legendary ( I never got to play with him, but every musician that I know that played with him was impressed with his guitar playing ). He left a huge legacy in the Pacific Northwest. We'll miss you Paul!
Futurepower(R) ( 558542 ) writes: < MJennings.USA@NOT_any_of_THISgmail.com > on Monday October 15, 2018 @06:56PM ( #57482948 ) Homepage
Bill Gates was so angry, Allen left the company. ( Score: 5, Interesting)
The book Paul Allen wrote avoids a full report, but gives the impression that Bill Gates was so angry, Paul Allen left the company because interacting with Bill Gates was bad for his health.
Quotes from the book, Idea Man [amazon.com] by Paul Allen.
THREE DECADES AFTER teaching Bill and me at Lakeside, Fred Wright was asked what he'd thought about our success with Microsoft. His reply: "It was neat that they got along well enough that the company didn't explode in the first year or two."
When Bill pushed on licensing terms or bad-mouthed the flaky Signetics cards, Ed thought he was insubordinate. You could hear them yelling throughout the plant, and it was quite a spectacle-the burly ex-military officer standing toe to toe with the owlish prodigy about half his weight, neither giving an inch.
Bill was sarcastic, combative, defensive, and contemptuous.
"For Bill, the ground had already begun shifting. At product review meetings, his scathing critiques became a perverse badge of honor. One game was to count how many times Bill confronted a given manager; whoever got tagged for the most "stupidest things" won the contest. "I give my feedback," he grumbled to me, "and it doesn't go anywhere."
RubberDogBone ( 851604 ), Monday October 15, 2018 @10:16PM ( #57483928 )
toadlife ( 301863 ), Monday October 15, 2018 @06:29PM ( #57482740 ) Journal
RIP Dr. Netvorkian ( Score: 2 )
Rest well, Mr. Allen.
He used to have the nickname "Doctor NetVorkian" because many of the things he invested in promptly tanked in one way or another after his investment. He had a lot of bad luck with his investments.
For those who don't understand the joke, a certain Dr. Kevorkian became notorious for helping ill patients commit suicide.
Heyyyyy! ( Score: 5, Funny)
hey! ( 33014 ) writes:
Allen had nothing to do with systemd!
Re: ( Score: 2 )
CohibaVancouver ( 864662 ), Monday October 15, 2018 @06:44PM ( #57482862 )
What a ray of sunshine you are.
Re:Now burning in hell ( Score: 5, Informative)
BitterOak ( 537666 ), Monday October 15, 2018 @06:56PM ( #57482940 )
He is now burning in hell for Microsoft and Windows
Windows, Anonymous Coward? Allen left Microsoft in 1982. Windows 1.0 launched in 1985.
("The" Windows - Windows 3.1 - didn't launch until 1992, a decade after Allen had left.)
Re:Now burning in hell ( Score: 5, Insightful)
El Cubano ( 631386 ) writes:
Microsoft created Windows and Allen co-founded Microsoft - he cannot wipe that blood off his hands!
But you can wipe Windows off your hard drive, so I don't get your point. Paul Allen was a great guy in many, many ways.
Re: ( Score: 2 )
110010001000 ( 697113 ) writes:
But you can wipe Windows off your hard drive, so I don't get your point. Paul Allen was a great guy in many, many ways.
Agreed. Even if you could "blame" him for all or part of Windows, he did start the Museum of Pop Culture [wikipedia.org]. If you are ever in Seattle, it is a must see. I mean, they have what is probably the best Star Trek museum display anywhere (which is saying a lot since the Smithsonian has a very nice one as well), including most of the original series set pieces and I believe one of the only actual Enterprise models used for filming. In my mind, that gives him a great deal of geek cred. Plus, as I under
Re: ( Score: 2 )
110010001000 ( 697113 ), Monday October 15, 2018 @07:20PM ( #57483078 ) Homepage Journal
Well if he donated guitars and liked Star Trek then he must have been a good guy.
Re:And Then? ( Score: 1 )
You forgot he was a big Patent Troll. He won't be missed or remembered.
110010001000 ( 697113 ), Monday October 15, 2018 @10:28PM ( #57483964 ) Homepage Journal
Re:And Then? ( Score: 2 )
I knew someone would say that. You are right. I won't. But he won't either. He was a patent troll. Oh but: RIP and thoughts and prayers, right? He was a great guy and will be missed.
Sep 07, 2018 | science.slashdot.org
[Editor's note: all links in the story will lead you to Twitter] : In the 1970s the cost -- and size -- of calculators tumbled. Business tools became toys; as a result prestige tech companies had to rapidly diversify into other products -- or die! This is the story of the 1970s great calculator race... Compact electronic calculators had been around since the mid-1960s, although 'compact' was a relative term. They were serious, expensive tools for business . So it was quite a breakthrough in 1967 when Texas Instruments presented the Cal-Tech: a prototype battery powered 'pocket' calculator using four integrated circuits . It sparked a wave of interest. Canon was one of the first to launch a pocket calculator in 1970. The Pocketronic used Texas Instruments integrated circuits, with calculations printed on a roll of thermal paper. Sharp was also an early producer of pocket calculators. Unlike Canon they used integrated circuits from Rockwell and showed the calculation on a vacuum fluorescent display . The carrying handle was a nice touch!
The next year brought another big leap: the Hewlett-Packard HP-35 . Not only did it use a microprocessor, it was also the first scientific pocket calculator. Suddenly the slide rule was no longer king; the 35 buttons of the HP-35 had taken its crown. The most stylish pocket calculator was undoubtedly the Olivetti Divisumma 18 , designed by Mario Bellini. Its smooth look and soft shape has become something of a tech icon and an inspiration for many designers. It even featured in Space:1999! By 1974 Hewlett-Packard had created another first: the HP-65 programmable pocket calculator . Programmes were stored on magnetic cards slotted into the unit. It was even used during the Apollo-Soyuz space mission to make manual course corrections. The biggest problem for pocket calculators was the power drain: LED displays ate up batteries. As LCD displays gained popularity in the late 1970s the size of battery needed began to reduce . The 1972 Sinclair Executive had been the first pocket calculator to use small circular watch batteries , allowing the case to be very thin. Once LCD displays took off watch batteries increasingly became the norm for calculators. Solar power was the next innovation for the calculator: Teal introduced the Photon in 1977, no batteries required or supplied!
But the biggest shake-up of the emerging calculator market came in 1975, when Texas Instruments -- who made the chips for most calculator companies -- decided to produce and sell their own models. As a vertically integrated company Texas Instruments could make and sell calculators at a much lower price than its competitors . Commodore almost went out of business trying to compete: it was paying more for its TI chips than TI was selling an entire calculator for. With prices falling the pocket calculator quickly moved from business tool to gizmo : every pupil, every student, every office worker wanted one, especially when they discovered the digital fun they could have! Calculator games suddenly became a 'thing' , often combining a calculator with a deck of cards to create new games to play. Another popular pastime was finding numbers that spelt rude words if the calculator was turned upside down; the Samsung Secal even gave you a clue to one!
The calculator was quickly evolving into a lifestyle accessory . Hewlett Packard launched the first calculator watch in 1977... Casio launched the first credit card sized calculator in 1978 , and by 1980 the pocket calculator and pocket computer were starting to merge. Peak calculator probably came in 1981, with Kraftwerk's Pocket Calculator released as a cassingle in a calculator-shaped box . Although the heyday of the pocket calculator may be over they are still quite collectable. Older models in good condition with the original packaging can command high prices online. So let's hear it for the pocket calculator: the future in the palm of your hand!
Anonymous Coward, Monday September 03, 2018 @10:39PM ( #57248568 )
mmogilvi ( 685746 ) writes:
HP were real engineers ( Score: 3, Informative)
I have a HP-15C purchased in 1985 and it is still running on the original batteries - 32 years!
That is phenomenal low power design for the technology and knowledge at the time.
Re: ( Score: 3, Interesting)
cyn1c77 ( 928549 ), Monday September 03, 2018 @11:58PM ( #57248754 )
I replaced the batteries in my 15c for the first time a couple of years ago. And just to be clear, it has three small non-rechargable button batteries, like you would find in a watch.
Re:HP were real engineers ( Score: 2 )
I have a HP-15C purchased in 1985 and it is still running on the original batteries - 32 years!
That is phenomenal low power design for the technology and knowledge at the time.
That's phenomenal even by today's design standards!
JBMcB ( 73720 ), Monday September 03, 2018 @11:21PM ( #57248666 )
Olivetti Divisumma 18 ( Score: 3 )
My dad's friend was a gadget hound, and had one of these in the 80's. Not a great machine. The keys were weird and mushy. It had no electronic display. It only had a thermal printer that printed shiny dark gray numbers on shiny light gray paper. In other words, visibility was poor. It looked amazing, though, and you could spill a coke on it and the keys would still work.
Much more impressive but more utilitarian - he had a completely electro-mechanical rotary auto-dial telephone. It took small, hard plastic punch cards you'd put the number on. You'd push the card into a slot on the telephone, and it would feed the card in and out, generating pulses until it got to the number you punched out. Then it would pull the card back in and do it again for the next number until the whole number was dialed. No digital anything, just relays and motors.
fermion ( 181285 ), Tuesday September 04, 2018 @01:53AM ( #57248998 ) Homepage Journal
chicken or the egg ( Score: 5, Insightful)
In some ways, the electronic calculator market was created by TI and its need to sell the new IC. There were not many applications, and one marketable application was the electronic calculator. In some ways it was like Apple leveraging the microdrive for the iPod.
Like the iPod, the TI calculators were not great, but they were very easy to use. The HP calculators were and are beautiful. But ease of use won out.
Another thing that won out was that until about a decade ago all TI calculators were very limited. This made them ideal machines for tests. HP calculators could do unit analysis, and since 1990 they have had algebra systems and could even do calculus. This made them the ideal machine for technical students and professionals, but no high school would waste time teaching them because all they care about is filling out bubbles on an answer sheet.
The interesting contemporary issue that I see is that schools are still teaching calculators when really smart phones can do everything and more, especially with apps like Wolfram Alpha. Unless you are a legacy HP user, asking kids to buy a calculator just to boost TI's profits seems very wasteful to me. This is going to change as more tests move to an online format, and online resources such as Desmos take over from the physical calculator, but in the meantime the taxpayer is on the hook for millions of dollars a year per large school district just for legacy technology.
Dhericean ( 158757 ), Tuesday September 04, 2018 @05:01AM ( #57249428 )
Japan and Calculators ( Score: 3 )
An interesting NHK World documentary about Japanese calculator culture and the history of calculators in Japan. I generally watch these at speed = 1.5.
Begin Japanology (13 June 2013) - Calculators [youtube.com]
weilawei ( 897823 ), Tuesday September 04, 2018 @05:17AM ( #57249468 ) Homepage
No TI-89 Fans Yet? ( Score: 2 )
I love my TI-89. I still use it daily. There's a lot to be said for multiple decades of practice on a calculator. Even the emulator of it on my phone, for when I don't have it handy, isn't the same.
It doesn't need to be particularly fast or do huge calculations--that's what programming something else is for. But nothing beats a good calculator for immediate results.
mknewman ( 557587 ), Tuesday September 04, 2018 @09:31AM ( #57250228 )
Ken Hall ( 40554 ), Tuesday September 04, 2018 @09:50AM ( #57250316 )
TI started in 1972 not 1975 ( Score: 2 )
I had a Datamath in 1973, and a SR-57 programmable (100 steps, 10 memories) in 1975. Those were the days.
TI Programmer ( Score: 2 )
First calculator that did octal and hex math (also binary). Got one when they came out, cost $50 in 1977. Still have it, still works, although the nicad battery died long ago. In a remarkable show of foresight, TI made the battery pack with a standard 9V battery connector, and provided a special battery door that let you replace the rechargeable battery with a normal 9V. I replaced it with a solar powered Casio that did a bunch more stuff, but the TI still works.
Sep 01, 2018 | www.nakedcapitalism.com
Posted on September 1, 2018 by Lambert Strether
Lambert: This is an important post, especially as it pertains to our implicit and unacknowledged industrial policy, militarization.
By Marshall Auerback, a market analyst and Research Associate at the Levy Institute. Cross-posted from Alternet .
President Trump and his Mexican counterpart, Enrique Peña Nieto, recently announced resolution of major sticking points that have held up the overall renegotiation of the NAFTA Treaty (or whatever new name Trump confers on the expected trilateral agreement). At first glance, there are some marginal improvements on the existing treaty, especially in terms of higher local content sourcing, and the theoretic redirection of more "high wage" jobs back to the U.S.
These benefits are more apparent than real. The new and improved NAFTA deal won't mean much, even if Canada ultimately signs on. The deal represents reshuffling a few deck chairs on the Titanic , which constitutes American manufacturing in the 21st century: a sector that has been decimated by policies of globalization and offshoring.
Additionally, what has remained onshore is now affected adversely to an increasing degree by the Pentagon. The experience of companies that have become largely reliant on military-based demand is that they gradually lose the ability to compete in global markets.
As early as the 1980s, this insight was presciently confirmed by the late scholar Seymour Melman . Melman was one of the first to state the perhaps not-so-obvious fact that the huge amount of Department of Defense (DoD) Research and Development (R&D) pumped into the economy has actually stifled American civilian industry innovation and competitiveness, most notably in the very manufacturing sector that Trump is seeking to revitalize with these "reformed" trade deals.
The three biggest reasons are:
1. The huge diversion of national R&D investment into grossly overpriced and mostly unjustifiable DoD R&D programs has tremendously misallocated a large proportion of America's finest engineering talent toward unproductive pursuits (e.g., the tactical fighter fiascos , such as the F-35 Joint Strike Fighter that, among myriad other deficiencies, cannot fly within 25 miles of a thunderstorm; producing legacy systems that reflect outdated Cold War defense programs designed to counter a massive national power rather than 21st-century terrorist insurgencies). Indicative of this waste, former congressional aide Mike Lofgren quotes a University of Massachusetts study , illustrating that comparable civilian expenditures "would produce anywhere from 35 percent to 138 percent more jobs than spending the same amount on DoD [projects]." The NAFTA reforms won't change any of that.
2. By extension, the wasteful, cost-is-irrelevant habits of mind inculcated into otherwise competent engineers by lavish DoD cost-plus contracting have ruined these engineers for innovation in competitive, cost-is-crucial civilian industries.
3. The ludicrously bureaucratized management systems (systems analysis, systems engineering, five-year planning and on and on through a forest of acronyms) that DoD has so heavily propagandized and forced on contractors has, in symbiosis with the Harvard Business School/Wall Street mega-corporate managerial mindset, thoroughly wrecked efficient management of most sectors of American industry.
Let's drill down to the details of the pact, notably automobiles, which have comprised a big part of NAFTA. Under the new deal, 25 percent of auto content can be produced outside North America, down from the 37.5 percent permitted before -- a concession to the multinational nature of every major automobile manufacturer . Twenty-five percent is still a very large share of the high-end auto content, much of which is already manufactured in Europe -- especially expensive parts like engines and transmissions, especially for non-U.S. manufacturers, that won't be much affected by this deal.
Additionally, much of the non–North American auto content that can be or is being manufactured in Europe is the high end of the value-added chain . Certainly the workers producing the engines and transmissions have higher-than-$16-per-hour wage rates, which are trumpeted in the new agreement as proof that more "good jobs for working people" are being re-established by virtue of this deal. Since when is $16 per hour a Trumpian boon for U.S. auto workers? Objectively, $16 is only 27 percent above the 2018 Federal Poverty Threshold for a family with two kids ; even worse, $16 is only 54 percent of today's actual average hourly pay ($29.60) in U.S. automobile manufacturing, according to 2018 BLS numbers .
But beyond cars, here's the real problem: Although the ostensible goal of all of Trump's trade negotiations is to revitalize American manufacturing, the truth is that U.S. manufacturing suffered a catastrophic setback when China entered the World Trade Organization (WTO) back in 2001. Along with liberalized capital flows, the extensive resort to "offshoring" of manufacturing to China has sapped American manufacturing capabilities and engendered a skills shortage. This includes (to quote a recent Harvard Business Review study co-authored by Professors Gary Pisano and Willy Shih ):
"[the] tool and die makers, maintenance technicians, operators capable of working with highly sophisticated computer-controlled equipment, skilled welders, and even production engineers [all of whom] are in short supply.
"The reasons for such shortages are easy to understand. As manufacturing plants closed or scaled back, many people in those occupations moved on to other things or retired. Seeing fewer job prospects down the road, young people opted for other careers. And many community and vocational schools, starved of students, scaled back their technical programs."
The one ready source of demand for U.S.-manufactured goods is the military. High-tech enthusiasts like to claim that the U.S. Defense Department had a crucial role in creating Silicon Valley . The truth is far more nuanced. Silicon Valley grew like a weed with essentially no Department of Defense investment; in fact, until quite recently, most successful Silicon Valley enterprises avoided DoD contracts like the plague.
A bit of history: the transistor was invented through entirely privately funded research at Bell Labs in 1947. Next, the first integrated circuit patent was filed by Werner Jacobi, a Siemens commercial engineer, in 1949, for application to hearing aids . The next advance, the idea of a silicon substrate, was patented in 1952 as a cheaper way of using transistors by Geoffrey Dummer , a reliability engineer working at a British government radar lab; for all the talk of the defense establishment's role in developing high tech, ironically the British military showed no interest, and Dummer couldn't secure the funding support to produce a working prototype. In 1958, Jack Kilby, a newcomer at Texas Instruments (which had developed the first transistor radio as a commercial product in 1954), came up with the idea of multiple transistors on a germanium substrate. Almost simultaneously, in 1960, Robert Noyce at Fairchild Semiconductor patented a cheaper solution, a new approach to the silicon substrate, and implemented working prototypes. Both men envisioned mainly civilian applications (Kilby soon designed the first integrated circuit pocket calculator, a commercial success).
It is true that the first customers for both the Noyce and Kilby chips were the U.S. Air Force's B-70 and Minuteman I projects, which gave rise to the idea that the Pentagon played the key role in developing U.S. high-tech manufacturing, although it is worth noting that the electronics on both projects proved to be failures. Both companies soon developed major commercial applications for their integrated circuit innovations, Texas Instruments with considerably more success than Fairchild.
The Defense Advanced Research Projects Agency (aka "DARPA") is generally credited with inventing the internet. That's overblown. In fact, civilian computer science labs in the U.S., UK and France developed the idea of wide area networks in the 1950s. In the early '60s, DARPA started funding ARPANET design concepts to connect DoD laboratories and research facilities, initially without the idea of packet switching. Donald Davies at the UK's National Physical Laboratory first demonstrated practical packet switching in 1967 and had a full inter-lab network going by 1969. The first two nodes of the ARPANET were demonstrated in 1969 using a primitive centralized architecture and the Davies packet switching approach. In 1973, DARPA's Cerf and Kahn borrowed the idea of decentralized nodes from the French CYCLADES networking system, but this wasn't fully implemented as the TCP/IP protocol on the ARPANET until 1983. In 1985, the civilian National Science Foundation started funding the NSFNET, based on the ARPANET's TCP/IP protocol, for a much larger network of universities, supercomputer labs and research facilities. NSFNET started operations with a much larger backbone than ARPANET in 1986 (ARPANET itself was decommissioned in 1990) and started accepting limited commercial service providers in 1988, and, with further expansion and much-needed protocol upgrades, NSFNET morphed into the internet in 1995, at which time the NSFNET backbone was decommissioned.
Today, of course, the DoD is showering the largest Silicon Valley companies with multi-billions and begging them to help the U.S. military out of the hopeless mess it has made of its metastasizing computer, communications and software systems. Needless to say, if this DoD money becomes a significant portion of the income stream of Google, Microsoft, Apple, etc., it is safe to predict their decay and destruction at the hands of new innovators unencumbered by DoD funding, much as occurred with aviation companies such as Lockheed. NAFTA's reforms won't change that reality.
As a result of the militarization of what's left of U.S. manufacturing, along with the enlargement of the trans-Pacific supply chains with China (brought about through decades of offshoring), a mere tweak of the "new NAFTA" is unlikely to achieve Trump's objective of revitalizing America's industrial commons. With China's entry into the WTO, it is possible that U.S. manufacturing has hit a "point of no return," which limits the utility of regional trade deals as a means of reorienting multinational production networks in a way that produces high-quality, high-paying jobs for American workers.
Offshoring and the increasing militarization of the American economy, then, have rendered reforms of the kind introduced by NAFTA almost moot. When it becomes more profitable to move factories overseas, or when it becomes more profitable, more quickly, to focus on finance instead of manufacturing, then your average red-blooded capitalist is going to happily engage in the deconstruction of the manufacturing sector (most, if not all, outsourced factories are profitable, just not as profitable as they might be in China or, for that matter, Mexico). The spoils in a market go to the fastest and strongest, and profits from finance, measured in nanoseconds, are gained more quickly than profits from factories, whose time frame is measured in years. Add to that the desire to weaken unions and the championing of an overvalued dollar by the financial industry, and you have a perfect storm of national decline and growing economic inequality.
Seen in this broader context, the new NAFTA-that's-not-called-NAFTA is a nothing-burger. The much-trumpeted reforms give crumbs to U.S. labor, in contrast to the clear-cut bonanza granted to corporate America under the GOP's new tax "reform" legislation. Hardly a great source of celebration as we approach the Labor Day weekend. In fact, by the time the big bucks corporate lawyer-lobbyists for other mega corporations have slipped in their little wording refinements exempting hundreds of billions of special interest dollars, the new NAFTA-that's-not-called-NAFTA will most likely include an equal screwing for textile, steel, energy sector, chemical, and other U.S. industry workers, and anything else that is left of the civilian economy.
This article was produced by the Independent Media Institute .
Mark Pontin , September 1, 2018 at 3:08 am
Anthony K Wikrent , September 1, 2018 at 8:09 am
From Auerback's article: "Silicon Valley grew like a weed with essentially no Department of Defense investment."
In 1960, 100 percent of all integrated circuits produced by Silicon Valley -- therefore, IIRC, at that time 100 percent of all integrated circuits produced in the world -- were bought by the U.S. Department of Defense. This was primarily driven by their use in the guidance systems of the gen-1 ICBMs then being developed under USAF General Bernard Schriever (which also supplied the rockets used for NASA's Redstone and Mercury programs), and by their use in NORAD and descendant early-warning radar networks.
As late as 1967, 75 percent of all integrated circuits from Silicon Valley were still bought by the Pentagon.
By then, Japanese companies like Matsushita, Sharp, etc. had licensed the technology so Silicon Valley was no longer producing all the world's chip output. But still .
That aside, I think Auerback makes some very good points.
Scott , September 1, 2018 at 8:40 am
While I take Auerback's (and Melman's) point that the best engineering talent was diverted by military spending, that does not mean that military spending had an entirely negative effect on the ability to compete in other markets. What no one addresses is: why did the USA's computer and electronics industries become so successful, while Britain's did not?
The key, I think, is the conscious policy to deliberately seed new technologies developed under military auspices into the civilian economy. The best recent example was the Moore School Lectures in July – August 1946, which can be identified as the precise point in time when the USA government, following Hamiltonian principles of nation building, acted to create an entire new industry, and an entirely new phase shift in the technological bases of the economy.
This followed historical examples such as the spread of modern metal working machine tools out of the national armories after the War of 1812; the role of the Army in surveying, planning, and constructing railroads in the first half of the 19th century; the role of the Navy in establishing scientific principles of steam engine design during and after the Civil War; and the role of the Navy in creating the profession of mechanical engineering after the Civil War.
The left, unfortunately, in its inexplicable hatred of Alexander Hamilton, cannot understand Hamiltonian principles of nation building, because the left does not want to understand the difference between wealth and the creation of wealth. (Consider the left's false argument that Hamilton wanted to create a political economy in which the rich stayed rich and in control -- easily disproven by comparing the descendants of the USA's wealthy elites of the early 1800s with the descendants of Britain's wealthy elites of the same era.) The left sees Hamilton's focus on creating new wealth (and, be it noted, Hamilton was explicit in arguing that wealth is created by improving the productive powers of labor) and misunderstands it as coddling of wealth.
Marshall Auerback , September 1, 2018 at 9:04 am
I agree; please see this presentation titled The Secret History of Silicon Valley, which details the relationship in far greater detail. It is not from a professional historian, but the depth of research and citations shows a story that runs counter to the common myths. https://steveblank.com/secret-history/
Carolinian , September 1, 2018 at 9:19 am
But there had already been substantial private investment EARLIER than 1960. The DoD investment came later and initially was unsuccessful. I do later acknowledge that the Pentagon played A role, but not THE role, and that by and large the military's role has been unhelpful to American manufacturing (as opposed to the common myths to the contrary). Multiple events, multiple players, and multiple points of origin need to be mentioned in any sensible understanding of the emergence of Silicon Valley, starting with Bell Laboratories in 1947.
4corners , September 1, 2018 at 3:23 am
Thank you for this bit of pushback against anti-SV paranoia. And the above comment is only referring to one aspect of Silicon Valley whereas the industry as we now think about it is as much about personal computing and software as about hardware.
That doesn't make them good guys necessarily, but the notion that the USG is running the whole show is surely wrong.
anon48 , September 1, 2018 at 12:24 pm
With China's entry into the WTO, it is possible that the U.S. manufacturing has hit a "point of no return," which mitigates the utility of regional trade deals.
So, what then does Mr Auerback propose? Give up on trade deals altogether? Leave in place the unbalanced tariff structures that Trump claims to be addressing?
Re: autos, 25% imported content (by weight or what?!) seems like a significant move from 37.5%. But the author just dismisses that with a vague suggestion that the higher value-added products might constitute the 25%, and that it probably doesn't matter anyway.
The other half of the double feature -- about inefficiency from DoD investments -- is interesting, but what is the main criticism of DoD? Is it that contracting with the private sector is inherently inefficient, or is it more about how those programs have been managed? For example, the F-35 seems ill-conceived and poorly managed, but does that suggest DoD-private sector projects are inherently counterproductive?
Mark Pontin , September 1, 2018 at 3:30 am
"So, what then does Mr Auerback propose? Give up on trade deals all together?"
That was my takeaway too. Mr Auerback, should we wait until you drop the other shoe?
The Rev Kev , September 1, 2018 at 3:42 am
I guess I should cite a source. An interesting one is Donald MacKenzie's Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance, MIT Press, 1990.
MacKenzie -- a professor of sociology at the University of Edinburgh, specializing in science and technology studies -- is also a good deep dive on financial engineering, the algorithms and the actual hardware of HFT, etc. A book of his from 2006, An Engine, Not a Camera: How Financial Models Shape Markets, is absolutely worth reading.
DF , September 1, 2018 at 10:35 am
Even working in the private sector can let engineers fall into bad habits. In at least one major corporation, electronic engineers were used to just selecting whatever material they needed to finish a design. That was, until the EU put a strict ban on the import of electronic equipment containing certain metals and chemicals, because they were such bad pollutants. The EU is far too big a market to ignore, so adjustments had to be made.
A talk was given by a consultant on how the new rules would work in practice. Straight away the engineers asked if a deferral was possible. No. Was it possible to get a waiver for the use of those materials? No. The consultant stated that these were the rules, and then pointed out that he was on his way to China to give the same talk to Chinese engineers. He also said that these rules would probably become de facto world standards. Wish that I had saved that article.
The Rev Kev , September 1, 2018 at 10:39 am
Was this RoHS?
John Wright , September 1, 2018 at 11:54 am
Don't know, but I only saw the original article talking about this a year or two ago.
David , September 1, 2018 at 1:10 pm
This appears to be the original RoHS (Restriction of Hazardous Substances) EU directive.
This involved a lot of different chemicals, but of interest to me was the replacement of tin-lead solder in electronic equipment.
In the early 2000s my job involved migrating a number of shipping tin-lead (Sn-Pb) printed circuit assemblies to Pb-free designs to meet RoHS standards.
A lot of changes were necessary, as the new Pb-free (tin-silver-copper) solder melted at around 223°C while the previous tin-lead solder melted at 183°C.
All components had to tolerate the higher temperature, and this caused a slew of problems. Usually the bare board had to be changed to a different material that could take the higher temp.
The EU ended up being somewhat flexible in the RoHS implementation, as I know that they allowed a number of exemptions for some high-Pb-content (>= 85% Pb) solder where there was no good substitute (for example, high-temperature solders for devices that run hot, such as transducers and loudspeakers, and for internal power IC connections).
As I remember, medical equipment and US military equipment were not required to adhere to RoHS.
It was not a simple task to meet the new RoHS standards, so I can understand that some engineers wanted a deferral.
But it provided employment for me.
AbateMagicThinking But Not Money , September 1, 2018 at 5:22 am
When a system is designed to last 20+ years with high reliability and availability requirements, techniques and methods that are used to design and build this years I-phone aren't always applicable.
The lead in the solder helps prevent the growth of "tin whiskers", which can cause electronic systems to fail over time. One usually gets a RoHS waiver for the part(s) for longer life systems.
But what happens in five or ten years when the part(s) need to be replaced? Will one be able to find that leaded-part? Try and buy a new 5 year old I-phone.
Will that part be a genuinely new part or one that was salvaged, and possibly damaged, from another piece of equipment? How would one know if the part is damaged? It looks okay. It functions in the lab, but will it function, as expected, in the real world? For how long?
What's the risk of putting a similar but different part in the system? Could people be harmed if the part fails or doesn't function as expected?
This is systems engineering. Some wave it off as "ludicrously bureaucratized management systems," but it is important. They may prefer the commercial approach of letting the public find faults in designs, but that's how people get run over by autonomous vehicles.
Unfortunately, systems engineering doesn't directly contribute to the quarterly bottom line, so it is usually underfunded or neglected. It only becomes important when, after all the designers have moved on, the system can't be used because it can't be repaired or something dangerous happens because the system was maintained with parts that were "good enough", but weren't.
What this has to do with NAFTA, I don't know.
Disturbed Voter , September 1, 2018 at 5:41 am
Perhaps covert regime change is the only way (instead of tariffs):
The medicine required will probably involve hard-faced interviews with all the top CEOs (involving rubber gloves at the private airports as they come back from their jaunts abroad). Who better than special ops for carrying out such persuasion? It will have to be done on the QT, under stricter than usual national security secrecy, so that the 'persuaded' won't lose face.
Look out for an appropriate and revolutionary "refocusing initiative" announced by all the top people in the private sector as an indication that the threat of the rubber glove has worked.
Pip-Pip!
jo6pac , September 1, 2018 at 8:30 am
Utopians will make any society into a no-place. You can't run a society solely on consumerism; it has to also run on MIC expenditures, because of excess productive capacity, as was shown in WW II and the Cold War. The other trend is that we don't want to pay anyone with a college degree more than the minimum wage. Per management, engineers make too much salary.
Felix_47 , September 1, 2018 at 8:48 am
The CIA money is everywhere in the silly valley.
The article is a good read.
ambrit , September 1, 2018 at 10:24 am
I work in this area with the government. This is exactly what I see. It is a deep-seated cancer on our nation. What can be done to change this? I always thought we should cut the military budget, but then the only jobs Americans (at least vets with security clearances) can get would vanish. Does anyone have any solutions?
Jeff , September 1, 2018 at 10:48 am
Declare the Infrastructure Renovation Program to be "In the Interests of National Security," cut the Military Industrial Complex in for a piece of the action, and start hiring. After all, General Dynamics got some serious contracts to run the call centres for Obamacare. The template is there.
Newton Finn , September 1, 2018 at 10:43 am
That's disgusting. Giving call center contracts to defense firms! And likely at a much bigger mark-up than might have been given to call center management companies.
Wukchumni , September 1, 2018 at 10:55 am
As Edward Bellamy envisioned in the late 19th Century, one possible "solution" would be to transition the military model into serving benevolent civilian uses. Imagine if our now-monstrous MIC was tasked not with protecting elite wealth and destroying other nations, but rather with repairing environmental damage and restoring environmental health, along with addressing other essential needs of people and planet. Change the purpose pursued, and a great evil can become a great good.
blennylips , September 1, 2018 at 1:38 pm
We largely did that in the early 1970's, as angst against the Vietnam War was at a fever pitch and the ecological movement was ascendant.
It took awhile for the effects to be felt, but smog in L.A. went from horrendous to manageable within a decade of auto emissions being regulated. Lakes & waterways across the land were cleaned up as well, to the benefit of everybody.
Bellamy was quite the visionary; so much of what he described in Looking Backward actually happened.
John Zelnicker , September 1, 2018 at 9:37 am
Second Hanford radioactive tunnel collapse expected. And it could be more severe
By Annette Cary
August 28, 2018 07:14 PM
https://www.tri-cityherald.com/news/local/hanford/article217470425.html
MichaeLeroy , September 1, 2018 at 9:48 am
Lambert -- Did you intend to post only about a third of Auerbach's article? That's all I see.
DF , September 1, 2018 at 10:33 am
One of my gigs was working as a scientist/engineer for a fledgling Minnesota-based medical device company. The company's R&D team originally consisted of experienced medical device development professionals, and we made progress on the product. Management secured a round of funding from Silicon Valley investors. One of the strings attached to that investment was that Silicon Valley expertise be brought in to the R&D efforts. There was a huge culture clash between the Minnesota and the California teams, as the Silicon Valley engineers tried to apply their military systems perspective to the development of a device meant for use in humans. From our perspective, the solutions proposed by the weapons guys were dangerous and impractical. However, management considered the Silicon Valley expertise golden and consistently ruled in their favor. This was the early 2000s, so none of the Minnesota R&D team had difficulty moving on to other projects, which is what we did. The result of the Silicon Valley turn was a several-fold increase in cash burn, loss of medical device engineering knowledge, and ultimately no product.
John Wright , September 1, 2018 at 12:09 pm
Another view is that defense procurement is one of the few things keeping a lot of US manufacturing and technical expertise alive at all. I went to one of the only electronic parts stores (i.e., a place that sells stuff like ICs, capacitors, resistors, etc.) in Albuquerque, NM a few months ago. The person at the cash register told me that nearly all of their business comes from Sandia National Laboratories, the big nuclear weapons laboratory in ABQ.
John , September 1, 2018 at 10:40 am
I suspect there is a lot more USA technical expertise out there than indicated by the small quantity of retail electronics parts stores surviving. On-line companies such as Digi-key and Mouser have very large and diverse component selections and quick delivery to retail customers.
Go to https://www.digikey.com/products/en
The blank search shows 8,380,317 items.
Mel , September 1, 2018 at 11:31 am
NC ran a piece on Aug 21, 2015, "How Complex Systems Fail," that seems appropriate to this. Death is an unavoidable part of the great cycle. Get used to it.
How Complex Systems Fail PDF here .
Jun 18, 2017 | hardware.slashdot.org (arstechnica.com)
Charles Thacker, one of the lead hardware designers on the Xerox Alto, the first modern personal computer, died after a brief illness on Monday. He was 74.
The Alto, which was released in 1973 but was never a commercial success, was an incredibly influential machine. Ahead of its time, it boasted resizeable windows as part of its graphical user interface, along with a mouse, Ethernet, and numerous other technologies that didn't become standard until years later. (Last year, Y Combinator acquired one and began restoring it.)
"Chuck" Thacker was born in Pasadena, California, in 1943. He first attended the California Institute of Technology in his hometown but later transferred to the University of California, Berkeley in 1967. While in northern California, Thacker began to abandon his academic pursuit of physics and dove deeper into computer hardware design, where he joined Project Genie , an influential computer research group. By the end of the decade, several members, including Thacker, became the core of Xerox's Palo Alto Research Center (PARC) computer research group, where they developed the Alto.
In a 2010 interview, Thacker recalled :
We knew [the Alto] was revolutionary. We built it with the very first semiconductor dynamic RAM, the Intel 1103, which was the first memory you could buy that was less than a 10th of a cent a bit. As a result, we realized we could build a display that was qualitatively better than what we had at the time. We had character generator terminals, and some of them were quite nice. But they were limited in various ways, whereas the Alto had the property that anything you could represent on paper, you could put on the screen. We knew that was going to be a big deal.
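Taking the quote's "10th of a cent a bit" at face value, a back-of-envelope sketch shows what a full bitmapped display cost in 1103 DRAM. The 606x808 monochrome resolution is the commonly cited Alto spec, not a figure from the interview, so the numbers are illustrative:

```python
# Rough cost of a one-bit-per-pixel framebuffer at Intel 1103 prices.
bits = 606 * 808                            # commonly cited Alto display: 606x808, 1 bit/pixel
cost_per_bit = 0.001                        # a tenth of a cent per bit, per Thacker
print(bits, f"${bits * cost_per_bit:.0f}")  # 489648 $490
```

Roughly $490 of memory just for the screen: expensive, but for the first time merely expensive rather than impossible, which is the point Thacker is making.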
At the age of 24, Apple co-founder Steve Jobs famously visited Xerox's Palo Alto Research Center (PARC) and saw an Alto firsthand in 1979. That episode is often credited with being a huge inspiration for the eventual release of the Macintosh five years later. (Like Jobs, Thacker also founded a company in his 20s: in 1969, the short-lived Berkeley Computer Company was born.)
Michael Hiltzik, a journalist who wrote an entire book on the history of Xerox PARC called Dealers of Lightning , said that Thacker was a "key designer."
Hiltzik told Ars:
He was the quintessential hardware guy at a time when designing the hardware was a key aspect of the task. He was a master at designing logic boards when that was the guts of the computer. This was before the silicon revolution. He did all that, and he had built a couple of computers even before the Alto.
Later in his career, Thacker joined Microsoft in 1997 to help establish the company's lab in Cambridge, England. Two years after that, he designed the hardware for Microsoft's Tablet PC, which was first conceived of by his PARC colleague Alan Kay during the early 1970s.
In 2009, Thacker received the Association for Computing Machinery's prestigious A.M. Turing award. According to Thomas Haigh , a computer historian and professor at the University of Wisconsin, Milwaukee, it is "very rare" for a "system builder in industry," as opposed to a theorist or an academic, to be given this honor.
Haigh wrote the following in an e-mail to Ars:
Alto is the direct ancestor of today's personal computers. It provided the model: GUI, windows, high-resolution screen, Ethernet, mouse, etc. that the computer industry spent the next 15 years catching up to. Of course others like Alan Kay and Butler Lampson spent years evolving the software side of the platform, but without Thacker's creation of what was, by the standards of the early 1970s, an amazingly powerful personal hardware platform, none of that other work would have been possible.
In the same 2010 interview, Thacker had some simple advice for young computer scientists: "Try to be broad. Learn more math, learn more physics."
Posted by EditorDavid on Saturday June 17, 2017 @01:46PM
An anonymous reader quotes Ars Technica : Charles Thacker, one of the lead hardware designers on the Xerox Alto, the first modern personal computer, died of a brief illness on Monday . He was 74. The Alto, which was released in 1973 but was never a commercial success, was an incredibly influential machine...
Thomas Haigh, a computer historian and professor at the University of Wisconsin, Milwaukee, wrote in an email to Ars , "Alto is the direct ancestor of today's personal computers. It provided the model: GUI, windows, high-resolution screen, Ethernet, mouse, etc. that the computer industry spent the next 15 years catching up to. Of course others like Alan Kay and Butler Lampson spent years evolving the software side of the platform, but without Thacker's creation of what was, by the standards of the early 1970s, an amazingly powerful personal hardware platform, none of that other work would have been possible."
In 1999 Thacker also designed the hardware for Microsoft's Tablet PC , "which was first conceived of by his PARC colleague Alan Kay during the early 1970s," according to the article. "I've found over my career that it's been very difficult to predict the future," Thacker said in a guest lecture in 2013 .
"People who tried to do it generally wind up being wrong."
woboyle ( 1044168 ) , Saturday June 17, 2017 @03:13PM ( #54639641 )
Ethernet and Robert Metcalfe ( Score: 5 , Interesting)
The co-inventor of Ethernet at PARC, Robert Metcalfe, has been a friend of mine for 35 years. My sympathies to Thacker's family for their loss. I never knew him, although I may have met him in the early 1980s in Silicon Valley. As a commercial computer sales rep in the Valley back then, I sold Robert the first 100 IBM PCs for his startup, 3Com. When I was an engineer in Boston in the late 1980s and early 1990s we would meet for dinner before IEEE meetings.
Mar 27, 2018 | developers.slashdot.org
The National Museum of Computing (tnmoc.org) published a blog post in which it tried to find the person who has been programming the longest. At the time, it declared Bill Williams, a 70-year-old, to be one of the world's most durable programmers, claiming to have started coding for a living in 1969 and still doing so at the time of publication. The post has been updated several times over the years, and over the weekend, TNMOC updated it once again. The newest contender is Terry Froggatt of Hampshire, who writes: I can beat the claim of your 71-year-old by a couple of years (although I can't compete with the likes of David Hartley). I wrote my first program for the Elliott 903 in September 1966. Now at the age of 73 I am still writing programs for the Elliott 903! I've just written a 903 program to calculate the Fibonacci numbers. And I've written quite a lot of programs in the years in between, some for the 903 but also a good many in Ada.
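Froggatt's test program is a classic choice. As a sketch only (his original of course ran in Elliott 903 code, not a modern language), the same computation looks like this in Python:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting 1, 1."""
    seq = []
    a, b = 1, 1
    for _ in range(n):
        seq.append(a)       # record the current term
        a, b = b, a + b     # advance the pair: each term is the sum of the two before it
    return seq

print(fibonacci(10))  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```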
Oct 14, 2017 | opensource.com
Larry Wall released Perl 1.0 to the comp.sources.misc Usenet newsgroup on December 18, 1987. In the nearly 30 years since then, both the language and the community of enthusiasts that sprung up around it have grown and thrived -- and they continue to do so, despite suggestions to the contrary!
Wall's fundamental assertion -- there is more than one way to do it -- continues to resonate with developers. Perl allows programmers to embody the three chief virtues of a programmer: laziness, impatience, and hubris. Perl was originally designed for utility, not beauty. Perl is a programming language for fixing things, for quick hacks, and for making complicated things possible partly through the power of community. This was a conscious decision on Larry Wall's part: In an interview in 1999, he posed the question, "When's the last time you used duct tape on a duct?"
A history lesson
- Perl 1.0 - Perl 4.036
Larry Wall developed the first Perl interpreter and language while working for System Development Corporation, later a part of Unisys. Early releases focused on the tools needed for the system engineering problems that he was trying to solve. Perl 2's release in 1988 made improvements to the regular expression engine. Perl 3, in 1989, added support for binary data streams. In March of 1991, Perl 4 was released, along with the first edition of Programming Perl , by Larry Wall and Randal L. Schwartz. Prior to Perl 4, the documentation for Perl had been maintained in a single document, but the O'Reilly-published "Camel Book," as it is called, became and continues to be the canonical reference for the Perl language. As Perl has changed over the years, Programming Perl has been updated, and it is now in its fourth edition.
- Early Perl 5
Perl 5.000, released on October 17, 1994, was a nearly complete rewrite of the interpreter. New features included objects, references, lexical variables, and the use of external, reusable modules. This new modularity provided a tool for growing the language without modifying the underlying interpreter. Perl 5.004 introduced CGI.pm, which contributed to Perl's use as an early scripting language for the internet. Many Perl-driven internet applications and sites still in use today emerged about this time, including IMDb, Craigslist, Bugzilla, and cPanel.
- Modern Perl 5
Perl 5.10 was released on the 20th anniversary of Perl 1.0: December 18, 2007. Version 5.10 marks the start of the "Modern Perl" movement. Modern Perl is a style of development that takes advantage of the newest language features, places a high importance on readable code, encourages testing, and relies heavily on the use of the CPAN ecosystem of contributed code. Development of Perl 5 continues along more modern lines, with attention in recent years to Unicode compatibility, JSON support, and other useful features for object-oriented coders.
... ... ...
The Perl community
... ... ...
- Perl Mongers
In 1997, a group of Perl enthusiasts from the New York City area met at the first O'Reilly Perl Conference (which later became OSCON) and formed the New York Perl Mongers, or NY.pm . The ".pm" suffix for Perl Mongers groups is a play on the fact that shared-code Perl files are suffixed .pm, for "Perl module." The Perl Mongers organization has, for the past 20 years, provided a framework for the foundation and nurturing of local user groups all over the world and currently boasts 250 Perl Mongers groups. Individual groups, or groups working as a team, sponsor and host conferences, hackathons, and workshops from time to time, as well as local meetings for technical and social discussions.
- PerlMonks
Have a question? Want to read the wisdom of some of the gurus of Perl? Check out PerlMonks . You'll find numerous tutorials, a venue to ask questions and get answers from the community, along with lighthearted bits about Perl and the Perl community. The software that drives PerlMonks is getting a little long in the tooth, but the community continues to thrive, with new posts daily and a humorous take on the religious fervor that developers express about their favorite languages. As you participate, you gain points and levels . The Meditations section contains discussions about Perl, hacker culture, or other related things; some include suggestions and ideas for new features.
... ... ...
As Perl turns 30, the community that emerged around Larry Wall's solution to sticky system administration problems continues to grow and thrive. New developers enter the community all the time, and substantial new work is being done to modernize the language and keep it useful for solving a new generation of problems. Interested? Find your local Perl Mongers group, or join us online, or attend a Perl Conference near you!
Ruth Holloway - Ruth Holloway has been a system administrator and software developer for a long, long time, getting her professional start on a VAX 11/780, way back when. She spent a lot of her career (so far) serving the technology needs of libraries, and has been a contributor since 2008 to the Koha open source library automation suite. Ruth is currently a Perl developer at cPanel in Houston, and also serves as chief of staff for an obnoxious cat. In her copious free time, she occasionally reviews old romance...
Sep 21, 2017 | www.msn.com
Microsoft co-founder Bill Gates has pledged to give more than half of his enormous wealth to philanthropy and helped make the personal computer a reality, but he cannot live down the control-alt-delete keyboard function.
On Wednesday, the billionaire admitted that the three keystrokes PC users must press to log on to their computers or interrupt a program could have been just one button.
David Rubenstein, co-founder and co-CEO of private equity firm The Carlyle Group, raised the issue Wednesday during a panel discussion at the Bloomberg Global Business Forum in New York City.
"You are the person who came up with the idea of doing it that way," Rubenstein said in this Bloomberg video of the discussion . "Why did you do that?"
The question drew laughs from the audience and from Gates' fellow panelists. Gates took a long pause before answering.
"Clearly, the people involved, they should have put another key on in order to make that work," he said. "I'm not sure you can go back and change small things in your life without putting the other things at risk. Sure, if I can make one small edit, I'd make that a single key operation." It wasn't the first time Gates addressed control-alt-delete. Rubenstein asked him about it four years ago during an interview at Harvard University.
"We could have had a single button," Gates said then . "The guy who did the IBM keyboard design didn't want to give us our single button... It was a mistake."
May 28, 2017 | economistsview.typepad.com
libezkova , May 27, 2017 at 10:53 PM
"When combined with our brains, human fingers are amazingly fine manipulation devices."
Not only fingers. The whole human arm is an amazing device. Pure magic, if you ask me.
To emulate those capabilities on computers will probably require another 100 years or more. Selective functions can be imitated even now (a manipulator that dealt with blocks in a pyramid was created in the '70s or early '80s, I think), but the capabilities of the human "eye-controlled arm" are still far, far beyond even the wildest dreams of AI.
Similarly, human intellect is completely different from AI. At the current level the difference is probably 1,000 times larger than the difference between a child with Down syndrome and a normal person.
The human brain is actually a machine that creates languages for specific domains (or acquires them via learning) and then is able to operate in terms of those languages. A human child forced to grow up among animals, including wild animals, learns and is able to use "animal language," at least to a certain extent. Some such children managed to survive in this environment.
Such cruel natural experiments have shown that the flexibility of the human brain is something really incredible. And IMHO it cannot be achieved by computers (although never say never).
Here we are talking about tasks that are a million times more complex than playing Go or chess, or driving a car on the street.
My impression is that most recent AI successes (especially IBM's win in Jeopardy ( http://www.techrepublic.com/article/ibm-watson-the-inside-story-of-how-the-jeopardy-winning-supercomputer-was-born-and-what-it-wants-to-do-next/ ), which probably was partially staged) are by and large due to the growth of storage and the number of cores in computers, not so much to the sophistication of the algorithms used.
The limits of AI are clearly visible in the quality of translation from one language to another. For more or less complex technical text it remains medium to low, as in "requires human editing."
If you are bilingual, try Google Translate on this post. You might be impressed by the recent progress in this field. It has improved considerably and no longer causes instant laughter.
Same thing with speech recognition. The progress is tremendous, especially in the last three to five years, but it is still far from perfect. Now, with some training, programs like Dragon are quite usable as dictation devices on, say, a PC with a 4-core 3 GHz CPU and 16 GB of memory (especially if you are a native English speaker), but if you deal with specialized text or have a strong accent, they still leave much to be desired (although your knowledge of the program, experience, and persistence can improve the results considerably).
One interesting observation I have is that automation does not always improve the functioning of an organization. It can be quite the opposite :-). Only the costs are cut, and even that is not always true.
Of course the last 25 years (or so) were years of tremendous progress in computers and networking that changed human civilization. And it is unclear whether we have reached the limit of current capabilities in certain areas (in CPU speeds and die shrinking we probably have; I do not expect anything significant below 7 nanometers: https://en.wikipedia.org/wiki/7_nanometer ).
Sep 16, 2017 | yro.slashdot.org
Posted by msmash on Thursday September 07, 2017
A federal judge in Massachusetts has dismissed a libel lawsuit filed earlier this year against tech news website Techdirt. From a report: The claim was brought by Shiva Ayyadurai, who has controversially claimed that he invented e-mail in the late 1970s. Techdirt (and its founder and CEO, Mike Masnick) has been a longtime critic of Ayyadurai and institutions that have bought into his claims.
"How The Guy Who Didn't Invent Email Got Memorialized In The Press & The Smithsonian As The Inventor Of Email," reads one Techdirt headline from 2012. One of Techdirt's commenters dubbed Ayyadurai a "liar" and a "charlatan," which partially fueled Ayyadurai's January 2017 libel lawsuit.
In the Wednesday ruling, US District Judge F. Dennis Saylor found that because it is impossible to define precisely and specifically what e-mail is, Ayyadurai's "claim is incapable of being proved true or false."
Jul 25, 2017 | www.msn.com
Adobe Systems Inc's Flash, a once-ubiquitous technology used to power most of the media content found online, will be retired at the end of 2020, the software company announced on Tuesday.
Adobe, along with partners Apple Inc., Microsoft Corp., Alphabet Inc.'s Google, Facebook Inc. and Mozilla Corp., said support for Flash will ramp down across the internet in phases over the next three years.
After 2020, Adobe will stop releasing updates for Flash and web browsers will no longer support it. The companies are encouraging developers to migrate their software onto modern programming standards.
"Few technologies have had such a profound and positive impact in the internet era," said Govind Balakrishnan, vice president of product development for Adobe Creative Cloud.
Created more than 20 years ago, Flash was once the preferred software used by developers to create games, video players and applications capable of running on multiple web browsers. When Adobe acquired Flash in its 2005 purchase of Macromedia, the technology was on more than 98 percent of personal computers connected to the web, Macromedia said at the time.
But Flash's popularity began to wane after Apple's decision not to support it on the iPhone.
In a public letter in 2010, late Apple CEO Steve Jobs criticized Flash's reliability, security and performance. Since then, other technologies, like HTML5, have emerged as alternatives to Flash.
In the past year, several web browsers have begun to require users to enable Flash before running it.
On Google's Chrome, the most popular web browser, Flash's usage has already fallen drastically. In 2014, Flash was used each day by 80 percent of desktop users. That number is now at 17 percent "and continues to decline," Google said in a blog on Tuesday.
"This trend reveals that sites are migrating to open web technologies, which are faster and more power-efficient than Flash," Google said. "They're also more secure."
Flash, however, remains in use among some online gamers. Adobe said it will work with Facebook as well as Unity Technologies and Epic Games to help developers migrate their games.
Adobe said it does not expect Flash's sunset to have an impact on its bottom line. "In fact, we think the opportunity for Adobe is greater in a post-Flash world," Balakrishnan said.
(Reporting by Salvador Rodriguez; additional reporting by Stephen Nellis; editing by Jonathan Weber and Lisa Shumaker)
Jul 09, 2017 | tech.slashdot.org (multicians.org)
"The seminal operating system Multics has been reborn," writes Slashdot reader doon386 :
The last native Multics system was shut down in 2000 . After more than a dozen years in hibernation a simulator for the Honeywell DPS-8/M CPU was finally realized and, consequently, Multics found new life... Along with the simulator an accompanying new release of Multics -- MR12.6 -- has been created and made available. MR12.6 contains many bug and Y2K fixes and allows Multics to run in a post-Y2K, internet-enabled world. Besides supporting dates in the 21st century, it offers mail and send_message functionality, and can even simulate tape and disk I/O. (And yes, someone has already installed Multics on a Raspberry Pi.)
Version 1.0 of the simulator was released Saturday, and Multicians.org is offering a complete QuickStart installation package with software, compilers, install scripts, and several initial projects (including SysDaemon, SysAdmin, and Daemon).
Plus there are also useful wiki documents about how to get started, noting that Multics emulation runs on Linux, macOS, Windows, and Raspbian systems. The original submission points out that "This revival of Multics allows hobbyists, researchers and students the chance to experience first hand the system that inspired UNIX."
www.sorehands.com ( 142825 ) , Sunday July 09, 2017 @01:47AM ( #54772267 ) Homepage
I used it at MIT in the early 80s ( Score: 4 , Informative)
I was a project administrator on Multics for my students at MIT. It was a little too powerful for students, but I was able to lock it down. Once I had access to the source code for the basic subsystem (in PL/1) I was able to make it much easier to use. But it was still command line based. A command line, emails, and troff. Who needed anything else?
Gravis Zero ( 934156 ) , Sunday July 09, 2017 @02:10AM ( #54772329 )
It's not the end! ( Score: 4 , Interesting)
Considering that the processor was likely made with a three-micrometer lithographic process, it's quite possible to make the processor in a homemade lab using maskless lithography. Hell, you could even make it NMOS if you wanted. So yeah, emulation isn't the end, it's just another waypoint in bringing old technology back to life.
Tom ( 822 ) , Sunday July 09, 2017 @04:16AM ( #54772487 ) Homepage Journal
Multics ( Score: 5 , Interesting)
The original submission points out that "This revival of Multics allows hobbyists, researchers and students the chance to experience first hand the system that inspired UNIX."
More importantly: to take some of the things that Multics did better and port them to Unix-like systems. Much of the secure system design, for example, was dumped from early Unix systems and was then later glued back on in pieces.
nuckfuts ( 690967 ) , Sunday July 09, 2017 @02:00PM ( #54774035 )
Influence on Unix ( Score: 4 , Informative)
From here [wikipedia.org]...
The design and features of Multics greatly influenced the Unix operating system, which was originally written by two Multics programmers, Ken Thompson and Dennis Ritchie. Superficial influence of Multics on Unix is evident in many areas, including the naming of some commands. But the internal design philosophy was quite different, focusing on keeping the system small and simple, and so correcting some deficiencies of Multics because of its high resource demands on the limited computer hardware of the time.
The name Unix (originally Unics) is itself a pun on Multics. The U in Unix is rumored to stand for uniplexed as opposed to the multiplexed of Multics, further underscoring the designers' rejections of Multics' complexity in favor of a more straightforward and workable approach for smaller computers. (Garfinkel and Abelson cite an alternative origin: Peter Neumann at Bell Labs, watching a demonstration of the prototype, suggested the name/pun UNICS (pronounced "Eunuchs"), as a "castrated Multics", although Dennis Ritchie is claimed to have denied this.)
Ken Thompson, in a transcribed 2007 interview with Peter Seibel, refers to Multics as "...overdesigned and overbuilt and over everything. It was close to unusable. They (i.e., Massachusetts Institute of Technology) still claim it's a monstrous success, but it just clearly wasn't." He admits, however, that "the things that I liked enough (about Multics) to actually take were the hierarchical file system and the shell, a separate process that you can replace with some other process."
Shirley Marquez ( 1753714 ) , Monday July 10, 2017 @12:44PM ( #54779281 ) Homepage
A hugely influential failure ( Score: 2 )
The biggest problem with Multics was GE/Honeywell/Bull, the succession of companies that made the computers that it ran on. None of them were much good at either building or marketing mainframe computers.
So yes, Multics was a commercial failure; the number of Multics systems that were sold was small. But in terms of moving the computing and OS state of the art forward, it was a huge success. Many important concepts were invented or popularized by Multics, including memory-mapped file I/O, multi-level file system hierarchies, and hardware protection rings. Security was a major focus in the design of Multics, which led to it being adopted by the military and other security-conscious customers.
May 29, 2017 | economistsview.typepad.com
libezkova - , May 28, 2017 at 06:13 PM
"computers merely use syntactic rules to manipulate symbol strings, but have no understanding of meaning or semantics."
It is actually more complex than that. Here I again would like to point to a very simple program called ELIZA, created around 1966, which was one of the first to fake the Turing test ( https://en.wikipedia.org/wiki/ELIZA ). It really did deceive humans into believing it was a Rogerian psychotherapist.
It was the first to show that "understanding" in a very deep sense is based on the ability to manipulate symbols in some abstract language, and that the inability of humans to tell that this is not another human being (the essence of the Turing test) is intrinsically connected with our notion of "understanding." In this sense, a program that is able to pass the Turing test in a particular domain "understands" this domain.
For example, when you ask your smartphone voice assistant "What is the weather in San Francisco today?" the answer demonstrates clear understanding of what you are asking for. Your smartphone clearly "understands" you.
On the other hand, when you submit a text for translation to Google and get the result, it clearly shows that in this domain the computer is unable to pass the Turing test. Although if we limit the domain to a very narrow area, I think the situation improves.
Similarly, the ability of a computer to compete with humans in chess in a very deep sense means that such a specialized computer "understands" chess (or any other game), despite the fact that the mechanisms by which humans and computers arrive at the next move are different.
So it is the instrumental aspect of understanding that matters most.
What is currently lacking is what we mean by the word "creativity" -- the ability to create a new "artificial language" for a domain (humans invented chess).
At the same time, the ability to perform manipulations of symbols complex enough for a program to pass the Turing test remains a very good proxy for understanding.
So the question is only about the complexity and power of those manipulations of the symbols of the language. In other words, at some point quantity turns into quality.
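The purely syntactic keyword-matching that let ELIZA fake understanding can be illustrated in a few lines. This is a minimal sketch with made-up rules in the spirit of Weizenbaum's DOCTOR script, not his original code:

```python
import re

# Hypothetical rules: match a keyword pattern, echo the fragment back
# as a non-directive question, exactly the trick ELIZA relied on.
RULES = [
    (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"\bI feel (.+)", re.I), "Tell me more about feeling {0}."),
    (re.compile(r"\bmy (\w+)", re.I), "Your {0} seems important to you."),
]

def respond(utterance):
    # No semantics anywhere: the first matching pattern wins,
    # and its captured text is pasted into a canned template.
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(*m.groups())
    return "Please go on."  # default when no rule matches

print(respond("I am worried about my job"))
# Why do you say you are worried about my job?
```

The program manipulates strings it has captured without any model of what they mean, yet the output reads like attentive listening, which is the point of the Turing-test discussion above.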
Apr 26, 2017 | sourceforge.net
On this day in 1960 IBM announced their plans for their Stretch supercomputer, the IBM 7030. In an upbeat release the company stated that the Stretch would be the world's fastest and most powerful computer of the time, outperforming the IBM 704 by 60 to 100 times. The actual performance of the IBM 7030 ended up disappointing, at only about 30 times that of the IBM 704. This caused considerable embarrassment for IBM and a significant price cut of the Stretch from $13.5 million to $7.78 million.
Though the IBM 7030 was not considered a success, it spawned technologies which were incorporated in many, more successful machines.
Apr 19, 2017 | sourceforge.net
On this day in 1957, the first FORTRAN program was run.
Back in 1953, IBM computer scientist John W. Backus proposed to his superiors that a more practical language be developed for programming their IBM 704 mainframe computer. He along with a team of programmers later invented FORTRAN, a "high-level" programming language that greatly simplified program writing. And by April 19, 1957 the first FORTRAN program was run (apart from internal IBM testing) at Westinghouse, producing a missing comma diagnostic, followed by a successful attempt.
Apr 15, 2017 | economistsview.typepad.com
anne -> anne... , April 14, 2017 at 11:47 AM
https://plato.stanford.edu/entries/chinese-room/
libezkova -> anne... , April 14, 2017 at 07:22 PM
April 9, 2014
The Chinese Room Argument
The argument and thought-experiment now generally known as the Chinese Room Argument was first published in a paper * in 1980 by American philosopher John Searle (1932- ). It has become one of the best-known arguments in recent philosophy. Searle imagines himself alone in a room following a computer program for responding to Chinese characters slipped under the door. Searle understands nothing of Chinese, and yet, by following the program for manipulating symbols and numerals just as a computer does, he produces appropriate strings of Chinese characters that fool those outside into thinking there is a Chinese speaker in the room. The narrow conclusion of the argument is that programming a digital computer may make it appear to understand language but does not produce real understanding. Hence the "Turing Test" ** is inadequate. Searle argues that the thought experiment underscores the fact that computers merely use syntactic rules to manipulate symbol strings, but have no understanding of meaning or semantics. The broader conclusion of the argument is that the theory that human minds are computer-like computational or information processing systems is refuted. Instead minds must result from biological processes; computers can at best simulate these biological processes. Thus the argument has large implications for semantics, philosophy of language and mind, theories of consciousness, computer science and cognitive science generally. As a result, there have been many critical replies to the argument.
The Turing Test is a test, developed by Alan Turing in 1950, of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human.
Anne,
This guy is 50 years late
The first researcher to show that the Turing test can be faked was Joseph Weizenbaum from MIT.
In 1964-1966 he wrote his famous ELIZA program
( https://en.wikipedia.org/wiki/ELIZA ) which simulated a Rogerian psychotherapist and used rules, dictated in the script, to respond with non-directional questions to user inputs.
As such, ELIZA was one of the first chatterbots, but was also regarded as one of the first programs capable of passing the Turing Test.
In general humans are unique in the ability to construct new languages and rules of manipulating objects in those languages.
Computers so far can only manipulate objects in a given language using preprogrammed rules, although with neural-network recognition of visual images it is more complex than that.
Amazing achievements of computers in chess and natural language recognition, including spoken language, are mostly due to huge storage and multiprocessing. A regular desktop for $8K-$10K can now have 64 cores (2 x 32) and 128 GB or even 1 TB of memory. This was a supercomputer just 15 years ago.
In chess, computers are also able to work in parallel to deterministically explore positions from a given one to a greater number of "half-moves" than humans can. That's why computers became competitive with humans in chess and Deep Blue managed to beat Kasparov in 1996 ( https://en.wikipedia.org/wiki/Deep_Blue_versus_Garry_Kasparov ). That has nothing to do with the ability to think.
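That deterministic exploration of "half-moves" is, at its core, minimax search. A toy sketch follows, using an invented abstract game with hypothetical children and evaluate functions rather than real chess, purely to show how depth of search works:

```python
def minimax(state, depth, maximizing, children, evaluate):
    """Explore the game tree to a fixed number of half-moves (plies)."""
    kids = children(state)
    if depth == 0 or not kids:
        return evaluate(state)  # leaf or depth limit: score the position
    # Players alternate: one maximizes the score, the other minimizes it.
    scores = [minimax(k, depth - 1, not maximizing, children, evaluate) for k in kids]
    return max(scores) if maximizing else min(scores)

# Toy game: a state is a number; each move doubles it or adds one;
# the evaluation is the number itself (purely illustrative).
children = lambda s: [s * 2, s + 1] if s < 40 else []
evaluate = lambda s: s
print(minimax(1, 6, True, children, evaluate))
```

Engines of the Deep Blue era won largely by pushing this brute-force exploration deeper and wider in parallel, not by any deeper "understanding" of the game.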
Winning in Jeopardy is a more involved and impressive feat, but I believe IBM cheated.
After Louis Gerstner Jr. destroyed it, IBM became a completely financially driven company capable of performing all kinds of dirty tricks.
But to an outsider such programs definitely look "intelligent."
Clarke's third law states: "Any sufficiently advanced technology is indistinguishable from magic."
That's what we are currently observing with computers.
Note: BTW the quality of translation from one language to another remains so dismal that any bilingual human can do better than the best computer program. Although recent experiments with recognizing spoken language, translating it to another language, and verbalizing it in the second language are nothing short of amazing.
Apr 04, 2017 | www.reddit.com
Thirty years ago this month, IBM declared war on reality – and lost. For the 30 years prior to that April day in 1987, nobody had dared to challenge IBM's dominance of the computer industry. It would take almost a decade for the defeat to be complete, but when it came, it was emphatic.
In April of '87, IBM announced both a new PC architecture, PS/2, and a new operating system to run on the boxes, OS/2. "Real" business computers would also run IBM networking software and IBM SQL database software, all bundled into an "Extended Edition" of the operating system. There was only one real way of computing, IBM declared, and it had a /2 on the end. This was signalled by the job titles: the PC business was called "Entry Systems" – bikes with training wheels.
While IBM itself has subsequently survived and prospered, it's a very different company today. It subsequently divested its PCs, printers and microprocessor divisions (along with much else) and one wonders how much different it would be today if it hadn't devoted a decade and thousands of staff to trying to bring the PC industry back under its control. Ironically, Lenovo is the biggest PC company again – but by following the rules set by Microsoft and Intel.
OS/2 is an oft-told story, not least here at the Reg , where we gave an account of the OS wars on the 25th anniversary of OS/2. Dominic Connor also shared lots of evocative detail about the IBM culture at the time here (Part One) and here (Part Two) . So there's no need to do so again.
But every time history is written, it's from a different vantage point – a different context. It no longer seems a joke to suggest, as IBM CEO Thomas J Watson probably never did , that the world needs only five computers. It's a quote that adorned many .sig files in the early days of email. Hah hah. How stupid could a tech CEO be?
So if IBM had fought a different war, how might the world look?
The tl;dr
If you're not familiar with the significance of OS/2 and PS/2 and don't want to read thousands of words, here's a capsule summary. The setter of standards for the past three decades, IBM had responded to the microprocessor revolution by allowing its PC to be easily "cloneable" and run a toy third-party operating system. It didn't matter too much to senior IBM management at first: PCs weren't really computers, so few people would buy them. Business computing would be done on real multiuser systems. That was the norm at the time. But the runaway sales of the PC clones worried IBM and in 1984, just three years after the launch of the IBM PC, Big Blue plotted how to regain control. The result was a nostalgic backward-looking vision that under-estimated that computing standards would increasingly be set by the open market, not by IBM.
The PS/2 series of PCs had world-beating futuristic industrial design, based around clip-together plastic parts that made the beige tins of the time look instantly dated. And PS/2 would have been fine if it hadn't been for the notorious proprietary bus: Micro Channel Architecture .
This plug and play bus was much better than the ISA standards of its day, and good enough for IBM to be using in workstations and even mainframes. However, it was not compatible with the (fairly crude) expansion boards of the day required to do networking or graphics; MCA cards were twice the price of comparable cards; and hybrid PCs were difficult to build. But the killer was that IBM demanded a licence fee from OEMs and sent its lawyers after MCA clones. The result was years of uncertainty. Seven years later, Apple was still making fun of how hard it was to get expansion cards to work in PCs. The real bedevilment of IBM's PCs was the tech titan's cost structure, which made it more expensive than the competition, at least without a heavy corporate discount.
But people don't get emotional about PS/2s in the way they got emotional about OS/2, which, even to a former user, is pretty strange, given how much grief it gave you. The tl;dr of the OS/2 story is that IBM announced a highly advanced operating system for PCs, but it was five years (1992) before it shipped a version that was demonstrably better for the average user. (In fact, it was almost a year before anything shipped at all.)
Since operating systems aren't an end in themselves, but merely a means to an end, a means of running something that alleviates your grunt work (like dBase or Lotus 1-2-3 at the time), the advantages of OS/2 were pretty elusive.
And, even in 1992, "better" meant managing old apps and files better – and the action in that "space" was taking place on Windows.
Because the industry was so unused to believing it could actually set standards, for a long time it didn't. This nervousness in the wake of PS/2 and OS/2 had caused a kind of winter. People just didn't bother upgrading – well, why would you? You had to wait and see what the standards would be.
So Microsoft put next to no effort into updating DOS (or even Windows) for the next two years. Application vendors continued to update their applications, but these remained character-mode, and the lack of a standard for addressing extended memory added to the inertia. Remember that OS/2 had been written for the 80286 chip introduced in 1984, and while the 80386 chip added new modes for protecting memory and virtualizing sessions, without the software to take advantage of it, weak demand ensured 386 PCs remained very expensive.
I described IBM's vision of computing as nostalgic and backward-looking, with PCs limited to office paperwork while real computing took place on (IBM) servers. But what if IBM had dared skip a generation and tried something really daring?

Alternative histories
At the time, there just weren't that many options.
A week before IBM's PS/2 and OS/2 roadmap was unveiled, Digital Research Inc shipped a multitasking DOS for 286 computers – Concurrent DOS 286 – to OEMs. This emulated the original IBM PC chip in software, and offered a huge leap in multitasking over Microsoft DOS. DRI also had a graphical shell, GEM. Why not dump Microsoft for DRI, which ran the industry standard OS when IBM set about making its first PC? It was really about control. IBM had excellent engineers and many PhDs; it could make it itself. IBM had little appetite to give Gary Kildall, who understood the microcomputer business much better than IBM, another sniff.
As it turned out, DRI struggled, along with everyone else, to make Concurrent DOS work reliably on the 80286 chip, particularly when it came to networking. The PC was a wild west, and reliable compatibility really needed the hardware virtualisation of the 80386 chip.
The obvious alternative was rapidly maturing: Unix. The trade press had been busily hyping "open systems" for years. In fact, it's little remembered now that the PS/2 came with a version of IBM's Unix: the Advanced Interactive Executive, or AIX, for the PS/2. "The multiuser, multitasking virtual memory operating system will be a subset of the Unix-like AIX operating system for the RT PC", as InfoWorld reported.
(Note that a Unix would be "Unix-like" forever after.)
But IBM didn't like Unix, and didn't get serious about selling it until Sun, with the rest of the industry in hot pursuit, was eating into its business. IBM didn't like distributed architectures that were too distributed, and for IBM, Sun's emphasis on "the network is the computer" put the emphasis in completely the wrong place. The "Open Systems" hype was that the CIO user would mix and match their IT – and that was the last thing IBM wanted.
And there were more immediate, practical difficulties with leaping a generation of technology and yoking the IBM PC to Unix long term. It wasn't ease of use. Later, usability was the stick used to beat Unix vendors with, but at the time DOS and Unix were equally hard to use. Backward-compatibility was the main issue: they wouldn't run the PC applications of the day. It seemed far more plausible that IBM could persuade the big PC application vendors of the day – like Lotus and Ashton Tate – to migrate to OS/2 than bring their Unix ports to IBM's version of Unix. Far less risky for IBM too.
With the benefit of hindsight, the third option would have been far more attractive: why didn't IBM just buy Apple? It wasn't IBM-compatible, but most of IBM's kit wasn't IBM-compatible, either. (Hence one of those grand IBM unifying strategies of the time: "System Application Architecture".)
As it turned out, IBM and Apple would work closely together, but only out of desperation, once it was clear that Windows was a runaway success, and both had lost the developers to Microsoft. Much of the first half of the 1990s saw several thousand IBM and Apple developers working on ambitious joint projects that never bore fruit: Taligent, Workplace OS, and much else.
Apple had a working Intel port of MacOS... in 1994. IBM and Apple had even agreed in principle to merge in 1995, only for Apple CEO Michael Spindler to get cold feet at the signing. He wanted more money from the buyer (the Apple Way).
Still, it's fascinating to speculate what a popular, consumer-friendly OS might have done if IBM had been prepared to license Apple's Macintosh OS aggressively, so it supplanted DOS as the industry standard. Many vendors had a foot in both camps at the time, so it really would have been a case of investing in existing initiatives, rather than starting from scratch. Without big IBM's support, Microsoft would have been relegated to a tools company with a nifty spreadsheet, possibly eventually divesting both. I wonder who would have bought Excel?
Alas, nothing was further from the thoughts of IBM's management at the time, obsessed with taming the industry and bringing it back under central control. What they'd always done had always worked. Why change?
Thomas J Watson may never have said the world will have five computers - it is almost certainly a myth. But it's striking that, as the world moves to three (or four) computing platforms, IBM isn't one of them. ®
Anonymous South African Coward
Re: Interesting times
On the other hand, my experience with OS/2 on a 33MHz 386SX with 4Mb RAM was excellent. First 2.1 then Warp, then Merlin... haven't had any crashes or funny things. Just wished the dang PC would go faster.
DooM played better than under DOS and some games just loved the extra memory I was able to give to them with the DOS settings under OS/2...
Good days, good memories.

AndrueC
Re: Interesting times
Agreed, I had it on a 386sx 25 (2MB RAM) and then a 486dx2 66 (8MB RAM); it ran videos, games and Windows apps faster than in Windows and DOS. It was a good OS, shame it didn't do better than it did. I had been using it up until a couple of years ago, as some older machines where I work used it. But that wasn't a pleasant task, keeping it integrated with the current environment and working around issues to keep it 'compliant'.

ChrisC
Re: Interesting times
I remember Sunday afternoons playing Geoff Crammond's Formula One Grandprix in a VDM while downloading messages from CompuServe using the multi-threaded Golden Compass. My first experience of real multi-tasking on a PC.
I also used it to develop DOS applications as the crash protection meant that I only had to reopen the VDM, not reboot the machine.

LDS
Re: Interesting times
I have fond memories of Warp too - back then I was doing some research work on robotic equations of motion, which had eventually evolved into a hideously complex Matlab script to do all the hard work for me. I'd just define the system geometry parameters at the start, click Go, twiddle my thumbs for an hour or so, and then get a complete set of optimised motion equations out the other end.
Unfortunately this was all being done in the Win3.1 version of Matlab, and as bad as the co-operative multitasking was in 3.1 generally, it was a shining beacon of excellence compared to how it behaved once Matlab started up - I'm pretty sure the Matlab devteam must have misread the Windows documentation and thought it featured "un-cooperative multitasking", because once you let Matlab loose on a script it was game over as far as being able to do anything else on that PC was concerned.
As a hardcore Amiga user at the time, I knew that multitasking didn't have to be this godawful, and I was convinced that the PC I had in front of me, which at the time had roughly twice the raw processing power of the fastest Amiga in my collection, really ought to be able to multitask at least as well as the slowest Amiga in my collection...
I can't recall how I stumbled upon OS/2 as the solution, all I do remember is that having learned of its existence and its claimed abilities to do stuff that Windows could only dream of doing, I dashed into town and bought my own copy of Warp, and once I got over the hurdle of getting it installed as a multi-boot setup with my existing fine-tuned DOS/Win3.1 setup (having expended god knows how many hours tweaking it to run all my games nicely - yes, even those that expected to have almost all of the base memory available, but still also needed to have CDROM *and* mouse drivers shoe-horned in there somewhere too - I didn't want to mess that up) I fired it up, installed Matlab, and tentatively clicked Go... Umm, is it running? This can't be right, the OS is still perfectly responsive, I can launch other Win3.1 applications without any signs of hesitation, and yet my Matlab script really does claim to be churning its way through its calculations about as quickly as it did hogging Win3.1 all to itself.
From that day on, Warp became my go-to OS for anything work-related until the day I finally ditched Win3.1 and made the switch to 95.
So yes, count me in as another one of those people who, despite the problems OS/2 had (I'll readily admit that it could be a bit flakey or just a bit obtuse when trying to get it to do what you wanted it to do) will still quite happily wax lyrical about just how bloody amazing it was in comparison to a DOS/Win16 based setup for anyone wanting to unlock the true potential of the hardware in front of them. Even today I still don't think the Windows dev team *really* understand how multitasking ought to behave, and I do wonder just how much productivity is lost globally due to those annoying random slowdowns and temporary hangs which remain part and parcel of everyday life as a Windows user, despite the underlying hardware being orders of magnitude more powerful than anything we could dream of having sat on our desks back in the 90s.

Planty
Re: Interesting times
OS/2 had a single applications message queue, or something like it, and a non-responding application could block the whole queue. 3.0 ran DOS applications very well, and most Windows applications worked well, but some had issues - e.g. Delphi 1.0. But despite IBM asserting it could also support Win32 applications, it never did, and from 1995 onwards the world was quickly migrating to 32-bit applications - and native OS/2 applications were too few and far between.
But OS/2 was just a part of the PS/2 equation - PS/2 and MCA boards were really too expensive compared to clones - and once Compaq and others started delivering good-enough machines, it was too late to close the stable door. Neither DRI/GEM nor Apple could have turned the tide - it was a matter of customers' money.
Back then PCs and their software were still expensive (much more expensive than today), and for many it was quite a demanding investment - asking even more, as IBM did, just cut out the many, many customers who could only afford clones - all of them running MS-DOS and then Windows.

Richard Plinston
Re: Interesting times
Interesting times again. Microsoft are now in IBM's shoes, facing irrelevance as nobody really cares about their products anymore. Microsoft have been running around throwing out all sorts of random things, hoping something will stick. Nothing stuck, everything sucked.

AndrueC
Re: Interesting times
> OS/2 had a single applications message queue,
It was Windows 1 through 3.11 that "had a single applications message queue".
"""Preemptive multitasking has always been supported by Windows NT (all versions), OS/2 (native applications) , Unix and Unix-like systems (such as Linux, BSD and macOS), VMS, OS/360, and many other operating systems designed for use in the academic and medium-to-large business markets."""
> But despite IBM asserting it could also support Win32 applications, it never did,
OS/2 Windows 3.x did support Win32s applications by loading in the win32s module, just like Windows 3.1 could do. However, Microsoft added a completely spurious access to virtual memory beyond the 2Gbyte limit of OS/2 (Windows supported 4Gbyte accesses) just to stop OS/2 using that beyond a particular version. Microsoft then required the new version in their software.
Exactly what you would expect from Microsoft, and still should.
> But OS/2 was just a part of the PS/2 equation - PS/2 and MCA boards were really too expensive compared to clones
OS/2 ran fine on PC clones and ISA boards. The link between PS/2 and OS/2 was entirely spurious.

cmaurand
Re: Interesting times
I think you might be confusing two things there. Having pre-emptive multi-tasking doesn't preclude having a single message queue. The two things are only tangentially related.
Multi-tasking is the ability to run multiple processes (or multiple threads) by switching between them. Pre-emptive multitasking means that the OS can force a task switch; cooperative multitasking means that each process has to yield control back to the OS. OS/2 was indeed one of the earliest (possibly the earliest) PC OS that was preemptive. Windows was only cooperative until 9x and NT.
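The distinction above can be sketched in a few lines of Python (purely illustrative; the names are hypothetical and nothing here is a Windows or OS/2 API). A cooperative scheduler only switches when a task chooses to yield, which is why a single busy application could hang a cooperative system; a pre-emptive OS instead uses a timer interrupt to force the switch.

```python
# Minimal sketch of cooperative multitasking (hypothetical names,
# illustrative only). Each task is a generator that must explicitly
# yield control back to the scheduler.

def cooperative_run(tasks, max_steps=100):
    """Round-robin scheduler: a task runs until *it* decides to yield."""
    order = []
    pending = list(tasks)
    steps = 0
    while pending and steps < max_steps:
        gen = pending.pop(0)
        try:
            order.append(next(gen))   # task runs until its next yield
            pending.append(gen)       # re-queue the polite task
        except StopIteration:
            pass                      # task finished, drop it
        steps += 1
    return order

def polite(name, n):
    """A well-behaved task that yields control on every step."""
    for i in range(n):
        yield f"{name}:{i}"

# Two polite tasks interleave cleanly:
print(cooperative_run([polite("a", 2), polite("b", 2)]))
# → ['a:0', 'b:0', 'a:1', 'b:1']
# A task that never yields would never return control -- the scheduler
# (and every other task) simply stops. A pre-emptive OS forcibly
# suspends it on a timer tick instead, so no task can monopolise the CPU.
```
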
But nothing about being multi-tasking requires that the OS even support message queues. Early versions of Unix offered pre-emptive multitasking but, in the absence of X Windows, probably didn't have any message queues. Arguably an OS never needs to: message queues are usually a higher-level construct, typically implemented in the GUI framework.
And, sadly, it is indeed true that the early versions of the Workplace Shell (the OS/2 default GUI) had a single message queue. IBM's recommendation was that developers implement their own queue sinks. The idea being that every application would have a dedicated thread that did nothing but accept messages and store them in a queue. The main application thread(s) would then empty this 'personal' queue at their leisure. I'm not sure why they wanted this design - maybe because then it was the application's responsibility to manage message storage? Sadly (and not surprisingly) most developers couldn't be arsed. As a result the WPS could often lock up. Now the OS itself wasn't locked - other processes would keep running just fine. If you were lucky enough to have a full screen VDM open you wouldn't even notice until you tried to go back to the WPS. When it happened to us, my colleague and I used to ask the other one to Telnet in to our boxes and kill the main WPS thread to get things going again.
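The queue-sink pattern described above can be sketched in Python (hypothetical names, illustrative only - this is not the actual OS/2 Presentation Manager API): a dedicated thread does nothing but drain the shared system queue into the application's own private queue, so the shared queue never backs up behind a slow application.

```python
# Sketch of the "queue sink" pattern (hypothetical names; not an
# OS/2 API). A dedicated thread accepts messages immediately and
# stores them privately; the main thread empties the private queue
# at its leisure without ever blocking the single system-wide queue.
import queue
import threading

system_queue = queue.Queue()    # stands in for the single shared queue
private_queue = queue.Queue()   # this application's 'personal' queue

STOP = object()                 # sentinel to shut the sink down

def queue_sink():
    """Dedicated thread: accept each message at once, store privately."""
    while True:
        msg = system_queue.get()
        if msg is STOP:
            break
        private_queue.put(msg)

sink = threading.Thread(target=queue_sink)
sink.start()

# The "system" posts messages; the sink never blocks on them.
for i in range(3):
    system_queue.put(f"msg-{i}")
system_queue.put(STOP)
sink.join()

# The main application thread processes its backlog whenever it wants.
received = [private_queue.get() for _ in range(3)]
print(received)   # → ['msg-0', 'msg-1', 'msg-2']
```

The point of the design is that the sink thread's loop does no real work, so the shared queue is always drained promptly even if the main thread is busy - which is exactly what the applications that "couldn't be arsed" failed to do.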
One of the big features that OS/2 Warp finally brought was multiple message queues. Sadly, by then it was too late. It's a shame because I did like the WPS. Its object-oriented nature was great. Windows has never offered that. In the WPS an icon is an object that knows where it should appear. In Windows it's just an icon that Explorer chooses to render in a particular location. Right-click it and Explorer tries to work out what to put on the menu. Do the same in WPS and the icon will create its own menu.
OOP GUIs are powerful things.

Re: Interesting times
Let's try not to forget that Microsoft only got the PC-DOS gig after pitching Microsoft's port of AT&T Unix (Xenix) for IBM's Personal Computer. It is also worth mentioning that Windows was initially developed on Xenix and ported to DOS after initial testing.
Had it not been for IBM, the world would be.... pretty much as it is now
Re: Interesting times

AndrueC
Actually, it did support 32-bit applications. Microsoft kept changing something in Windows in the way it handled 32-bit applications; IBM adjusted, then Microsoft came out with Win32s, IBM adjusted, Microsoft changed it again (something about the way Windows does things in memory) and IBM gave up on trying to keep up with Microsoft's changes.

Mine's a Large One
Re: Interesting times
Actually, it did support 32 bit applications.
That Windows integration piece in OS/2 was cool, I thought. Pretty much seamless and you could choose how seamless it was - full screen or pretend to be an application on the desktop. The bit where it could link into an existing installation was just brilliant. Licensing payments to Microsoft for Windows? Not today, thanks :D
But then the entire VDM subsystem was cool. A true 'Virtual DOS Machine' rather than the Windows poor cousin. You could even boot up different versions of DOS. I seem to recall one of their bug fixes was to the audio subsystem to support a golf simulator that triggered tens of thousands of interrupts per second. From what I vaguely recall they said they removed the need on the card side and faked the results inside the VDM. The VDM team must've gone to extraordinary lengths in order to create what we'd probably now call a Virtual Machine.

Faux Science Slayer
Re: Interesting times
"Now the OS itself wasn't locked - other processes would keep running just fine. If you were lucky enough to have a full screen VDM open you wouldn't even notice until you tried to go back to the WPS. When it happened to us my colleague and I used to ask the other one to Telnet in to our boxes and kill the main WPS thread to get things going again."
We had a couple of apps which used to do this occasionally on our servers, and the first we'd know about it would be when we tried to do anything at the server, something we didn't often do - everything had been merrily working away in the background but with a single app having frozen the WPS.

CrazyOldCatMan
Rockefeller front IBM replaced by Rockefeller front MicroSoft
Edison did not invent electric light, Alex Bell stole his telephone patent, Marconi STOLE Tesla radio patents and Glenn Curtiss stole a dozen Wright Brothers patents. All were set up fronts for Rockefeller monopolies.
"JFK to 9/11, Everything is a Rich Man's Trick" on YouTube....end feudalism in 20171 day schifreen
Re: Interesting times
my experience with OS/2 on a 33MHz 386SX with 4Mb RAM was excellent
It was (even more so than DOS/Windows) sensitive to the hardware - if you had anything slightly out then OS/2 would do lots of mysterious things..
I used it, right up to the end when IBM dropped it. Along with linux (my first linux machine was a 386sx25 with a 330MB ESDI drive - was originally OS/2 with an IDE 80MB drive but OS/2 simply wouldn't load when the ESDI drive interface card was inserted. So I used it for linux instead).
After all, OS/2 did TCP/IP long before Windows had a stable IP stack supported by the manufacturer (yes, yes, there was Trumpet WinSock and Chameleon TCP/IP but they were not from Microsoft and MS refused to support them).

Little Mouse
Too much credit
You seem to conclude that the problem lay with IBM's poor strategy. You give the company way too much credit. They didn't really have a strategy at all and, if they did, no one knew what it was. Or where they would be working once the latest departmental reorg was complete.
The problem with OS/2 is that it was a solution to a problem that hardly anyone actually had. IBM knew this. Ask senior IBM people at the time (as I did) exactly what problem OS/2 was designed to solve, and why the world needed to ditch the free OS that came with their PC in order to install something that, frankly, never installed properly anyway, and they singularly failed to come up with a sensible answer. Or any answer at all.

I am the liquor
Re: Too much credit
I'd almost forgotten how pervasive the image of IBM was back in those days. I joined the PC party in the days of the 286, and IIRC, PCs only really fell into two camps - either IBM or IBM clones.
Amazing that they could get from that position of dominance to this.

Warm Braw
Re: Too much credit
I don't think it's true that OS/2 was a solution to a problem that never existed. Perhaps it was too early. But by the mid 90s when customers did understand the problem, and Microsoft was answering it with Windows NT, surely that should have meant that OS/2 was mature and ready to take advantage.
OS/2 Warp was a better design in many ways than NT 3.51, but IBM's head start didn't help them win the race, and now we all run Windows NT on our PCs.

Vic
Re: A solution to a problem that hardly anyone actually had
It could have been a solution to a problem that a lot of people had. If you've ever had the misfortune to write software for IBM mainframes and been stuck with TSO or SPF (or even VM/CMS, which is only better by comparison with the alternatives) you'd have given your eye teeth for a less developer-hostile environment. If they'd put decent mainframe developer tools onto a personal workstation they'd have had a core market to build on.
But this wasn't the IBM way: accounts were structured along product lines and no mainframe systems salesman was going to let a PC salesman or a System/34 salesman onto his territory if he thought it might cannibalise his commission. No product manager was going to risk the ongoing investment in mainframe software by allowing personal computers to connect other than as the dumbest of terminals. It's ironic that IBM used the word "System" so much: it never really understood systems, just product families.

LDS
Re: Too much credit

"You give the company way too much credit. They didn't really have a strategy at all"
I was working at an IBM Systems Centre when the PS/2 launched. We had to go to customers to tell them how wonderful this MCA thing was - without having any knowledge of whether or not it was any better than ISA.
And that was a shame, really - MCA *was* better. But no-one found out until it was way too late. And the Model/30 didn't have MCA anyway...

a_yank_lurker
Re: Too much credit
Many people started to have problems with the single-application model DOS had - it's no surprise one of the first big hits for Borland was SideKick - which made TSR (Terminate and Stay Resident) applications common.
Still, everything had to work in the 1MB of memory available in real mode. Extended/Expanded memory couldn't be used to run code from. DOS Extenders later could run bigger applications, but still a single one.
Windows was well accepted not only because it was a GUI, but because it allowed users to run more than one application, switch among them easily, and move data across applications. The limited multitasking was not an issue on desktops with only a single-core CPU.
If you were using the PC just to play games, it really didn't matter, but for business users it was a great productivity boost.

Richard Plinston
Re: Too much credit
@schifreen - Itsy Bitsy Morons always had a schizophrenic attitude towards PCs, and to a lesser extent minis, at the time. They worshipped big iron and could not understand why people would want a "toy" or "crippled iron". What they failed to grasp is that many computing activities are not very resource-intensive on any computer (I wrote a thesis on an Apple IIe), even early PCs. These are activities that could be easily automated and put on a smaller computer. Others saw the vacuum left and moved in with both feet.
The other issue for them was that selling PCs is very different from selling a mainframe. One buys a PC much like one buys any other appliance from a retailer. There is no formal bid process with tenders to be opened and reviewed. At retail, the sales staff is less interested in the brand you bought and very interested in selling you something.

addinall
Re: Too much credit
> and why the world needed to ditch the free OS that came with their PC
MS-DOS and Windows were never free*. When you bought a computer with Windows installed, and sometimes when Windows was _not_ installed, money went from the OEM to Microsoft. That cost was part of the price.
* actually there was a 'free' version: 'Windows with Bing' that no one wanted.

Richard Plinston
Re: Too much credit
MS-DOS was never free. Unless you stole it.

Richard Plinston
Re: Too much credit
> MS-DOS was never free. Unless you stole it.
When SCP sold 86-DOS to Microsoft for development into PC-DOS and MS-DOS (MS had previously licensed it from SCP) the agreement was that SCP would have as many copies of MS-DOS as they wanted for free as long as they were shipped with a computer (SCP built the Zebra range of S-100 based computers).
After the fire in the SCP factory, which stopped them building computers, they started selling V20 chips (faster clone of the 8088* with 8085 emulation built in) and V30 chips (ditto 8086) with a free copy of MS-DOS. MS bought out the agreement for a reputed $1million.
* swap this for the 8088 to get a 20% faster machine that could also run CP/M software (with suitable loader).

Richard Plinston
Re: Too much credit
> MS-DOS was never free. Unless you stole it.
After per-box pricing* was declared illegal, MS came up with another scheme where MS-DOS and Windows were bundled together at the price of Windows alone. Effectively this was MS-DOS for free to stop DR-DOS being installed. At the time it was MS-DOS 4.01 versus DR-DOS 5, which was infinitely superior, and it took MS 20 months to nearly catch up, with MS-DOS 5, at which point DR released DR-DOS 6 with task switching. MS took another year to almost catch up, with MS-DOS 6.
* OEMs were contracted to pay Microsoft for MS-DOS on every box sold regardless of whether it had MS or DR-DOS (or other) installed. This was to strangle DR-DOS sales. The alternative to accepting this contract was to never sell any MS products ever again.

Pompous Git
Re: Too much credit
> Itsy Bitsy Morons always had a schizophrenic attitude towards PCs and to a lesser extent minis at the time. They worshipped big iron and could not understand why people would want a "toy" or "crippled iron".
You write as if IBM were one thing. It was divided up into several divisions, each with their own sales and marketing and each competing against the others. The mainframe division (360/370) wanted a small computer to counter the Apple IIs with Visicalc, Z80 softcard and CP/M software invading their sites. The IBM-PC was designed to be 20% better than the Apple II (160Kb floppies instead of 120Kb etc) and also act as a terminal (which is why the IBM PC has DTE serial ports while other micros had DCE) while running the same software. There were also 3740 (terminal) PCs and 360 emulating PCs (with Motorola 68x00 co-processor boards) for developers to use to write mainframe software.
The mainframe division did look down on the Series One, System 3, System 36 and System 38 (AS400) and other divisions, but did not see the IBM PC as any threat at all. They did want to exclude other brands though.7 hrs Richard Plinston
Re: Too much credit

> MS-DOS was never free. Unless you stole it.

Only two of the many computers I've owned came with MS-DOS, most were without an OS. Since MS refused to sell DOS at retail, most people did just that; they stole a copy of DOS. A much smaller number of us purchased DR-DOS and reaped the benefits of an arguably better DOS than DOS. Especially if you also ran the 4DOS command processor.

Pompous Git
Re: Too much credit
> Since MS refused to sell DOS at retail
MS had a contractual moratorium on selling MS-DOS at retail for 10 years with IBM. This expired and MS released MS-DOS 5 for retail sales.
> A much smaller number of us purchased DR-DOS and reaped the benefits of an arguably better DOS than DOS.
It was significantly better than the contemporary MS-DOS 4.01 and had a 20-month lead on MS-DOS 5.
Allegedly it reached a 20% market share until MS brought in illegal per-box pricing and bundled MS-DOS+Windows at the Windows price.

Anonymous Coward
Re: Too much credit

"MS had a contractual moratorium on selling MS-DOS at retail for 10 years with IBM. This expired and MS released MS-DOS 5 for retail sales."

B-b-b-b-ut that can't be true. Shirley MS are responsible for everything bad in computing... ;-)

FuzzyWuzzys
Re: Too much credit
"At the time it was MS-DOS 4.01 versus DR-DOS 5 which was infinitely superior and it took MS 20 moths to nearly catch up with MS-DOS 5, at which point DR released DR-DOS 6 with task switching. MS took another year to almost catch up with MS-DOS 6"
But MS had a trick up its sleeve to sideline DR-DOS ... with Win 3.0 they had a long (several months) "beta" period where people could download the "beta" for free to try out, so there was a lot of prelaunch publicity. Part of that publicity was that Win 3.0 didn't work on DR-DOS as there was an error during start-up. In legal terms DR weren't allowed to access the beta so they couldn't counter all the "if you want Win 3.0 you'll need MS-DOS and not DR-DOS" stories in the press. In reality the error was, I believe, due to MS deliberately using a result from a DOS call which wasn't precisely specified, where Win3 worked with the value MS-DOS returned but not with the one DR-DOS returned ... trivial for DR to fix, and I seem to recall they did this as soon as Win3 launched, but the damage was done.
At the time I was using DR-DOS, so I assumed that to get Win3 I'd need to factor in the cost of MS-DOS as well, at which point the price of OS/2 (with a special launch offer) was similar, so I went for OS/2.

CrazyOldCatMan
Re: Too much credit
DR-DOS was superb: it had so much stuff bundled in, it worked like a dream and, as noted above, MS took months to catch up with what DR-DOS did. DR also has one of the more tragic stories in PC history, well worth checking out.
Re: Too much credit
OS/2 Warp was a better design in many ways than NT 3.51, but IBM's head start didn't help them win the race, and now we all run Windows NT on our PCs.
Windows winning has often been called "the triumph of marketing over excellence".
Martin 47

"Since operating systems aren't an end in themselves, but merely a means to an end, a means of running something that alleviates your grunt work (like dBase or Lotus 1-2-3 at the time), the advantages of OS/2 were pretty elusive."

stephanh
Think someone needs to mention that to Microsoft

Pompous Git
The whole situation seems very similar to Microsoft today, desperately trying to get a bolthole in the mobile market against Android. Microsoft's "Windows Experience" Android strategy reminds me a lot of OS/2 (replacing a working, familiar OS with something nobody needs).
"Hegel remarks somewhere that all great world-historic facts and personages appear, so to speak, twice. He forgot to add: the first time as tragedy, the second time as farce."
-- Karl Marx, "The Eighteenth Brumaire of Louis Napoleon"

Admiral Grace Hopper

"Since operating systems aren't an end in themselves, but merely a means to an end... the advantages of OS/2 were pretty elusive."

Not really. It was far from unknown in publishing to have to leave the machine rendering a print job because it couldn't do anything else at the same time. Being able to continue working while printing would have been a blessing, but had to await WinNT.

The Man In The High Data Centre

Tinslave_the_Barelegged
I enjoy a good counterfactual. I still wonder occasionally how the world would have looked if IBM and Apple had got Pink to the point where it was marketable.

jake
Re: The Man In The High Data Centre
A bloke at our small village local was being cagey about what he did at IBM. Eventually I twigged and said "Oh, you're working on Pink?" He seemed amazed that anyone would know about it, and eventually chilled, but it struck me that that little episode was a good analogy for the problems with the ill-fated IBM/Apple dalliance.

Colin Bull 1
Re: The Man In The High Data Centre
Somewhere, I have a T-shirt with the IBM logo of the time superimposed over the "full color" Apple logo of the time on the front. On the back, it reads "Your brain, on drugs.". The first time we wore them at work, we were told we'd be fired if we wore them again ...
1 day MrT
The PS2 brought with it one of the longest used PC standards - VGA. Only in the last year or two has the VGA standard connector been superceded by HDMI.1 day Peter Gathercole
Re: successful standard
And the PS/2 port for mice/keyboards.
The Model M keyboard set a high standard that many modern items still struggle to beat...1 day Wade Burchette
Re: successful standard
Though to be truthful, the key action in the Model M keyboards had appeared in the earlier Model F keyboards, and the very first Model M Enhanced PC keyboard appeared with an IBM PC 5-pin DIN connector on the 5170 PC/AT.
We had some 6MHz original model PC/ATs where I worked in 1984, and even then I liked the feel of the keyboard. Unfortunately, the Computer Unit decided to let the departmental secretaries compare keyboards before the volume orders went in, and they said they liked the short-travel 'soft-touch' Cherry keyboards over all the others (including Model Ms).
As this was an educational establishment, the keyboards got absolutely hammered, and these soft-touch keyboards ended up with a lifetime measured in months, whereas the small number of Model Ms never went wrong unless someone spilled something sticky into them.
I wish I had known at the time that they were robust enough to withstand total immersion in clean water, as long as they were dried properly.
Re: successful standard
1 day Danny 14
The PS/2 standard is for keyboards/mice and it is still an important connector. It just works.
And VGA has not been supplanted by HDMI, but by DVI, which was supplanted by DisplayPort. HDMI is designed for TVs and has a licensing cost.
22 hrs Dave 32
Re: successful standard
I bought a second-hand Denford milling machine. The license comes on a 3.5in floppy disk - I needed to buy a USB drive as I didn't have any working ones left (not even sure I had a motherboard with a connection any more either).
Re: successful standard
Ah, those proprietary floppy disk drives which couldn't tell the difference between a 720Kb and 1.4Mb disk. I hate to think about the hours spent recovering data from misformatted floppies.
And whilst the industrial design of the PS/2 was attractive and the internals were easily accessible, the quality of manufacture (of the ones made in Scotland, at least) was lousy. I still carry the scars from fitting lids back on Model 55s.
14 hrs bombastic bob
Re: successful standard
Ah, you forgot about the 2.88MB floppy disks.
I happen to have an IBM PS/2-model 9595 sitting here under the desk with one of those drives in it. ;-)
Ah, yes, the model 9595; those had that little 8 character LED "Information Panel" display on the front of the case. It only took a tiny bit of programming to write an OS/2 device driver to turn that into a time-of-day clock. For years, that was the only thing that I ran on that 9595 (I called it my "600 Watt clock".).
Hmm, why was there suddenly a price spike for old IBM PS/2-model 9595 machines? ;-)
P.S. I'll get my coat; it's the one with the copy of Warp in it.
10 hrs Pompous Git
Re: OS/2 and PS/2 Memories
Back in the '90s, a few months before the release of Windows 3.0, I took an OS/2 Presentation Manager programming class at the local (night school) city college. Got an 'A'. Only 6 students survived to complete the class, and the room was packed on day 1 when I thankfully had my add slip signed by the prof... [and we had "the PS/2 machines" in the lab to ourselves, since they were the only ones that could run OS/2].
And I really _LIKED_ OS/2 PM. I was able to format a diskette while doing OTHER THINGS, kinda cool because DOS could _NEVER_ do that! Version 1.2 was nice looking, too, 3D SKEUOMORPHIC just like Windows 3.0 would soon become!
But when i tried to BUY it, I ran into NOTHING but brick walls. It was like "get a PS/2, or wait forever for OEMs to get it 'ported' to THEIR machines". Bleah.
THAT is what killed it. NOT making it available for CLONES. When 'Warp' finally released, it was too little, too late.
But the best part of OS/2 was its API naming, which follows the MORE SENSIBLE object-verb naming, rather than verb-object. So in Windows, it's "CreateWindow". In OS/2, it's "WindowCreate". And wouldn't you know it, when you read the DOCUMENTATION all of the things that work with WINDOWS are in the SAME PART OF THE MANUAL!
Damn, that was nice! OK I have hard-copy manuals for OS/2 1.2 still lying about somewhere... and corresponding hard-copy Windows 3.0 manuals that I had to thumb back and forth with all the time. Old school, yeah. HARDCOPY manuals. And actual message loops (not toolkits nor ".Not" garbage).
1 day Lord Elpuss
Re: OS/2 and PS/2 Memories
"THAT is what killed it. NOT making it available for CLONES. When 'Warp' finally released, it was too little, too late."
A friend who was a developer at the time says the main thing that killed OS/2 was the cost of the SDK: over AU$1,000. BillG was giving away the SDK for Windows at computer developer events.
1 day wolfetone
"But it's striking as the world moves to three (or four) computing platforms, IBM isn't one of them ."
IBM is very much one of the top players. In Cloud, I would say AWS, Azure and IBM are the top 3. Business Intelligence? Oracle, IBM, Microsoft. AI/Cognitive? IBM Watson, Google DeepMind are the big two, Microsoft Cognitive coming a far third (along with highly innovative smaller vendors with a great solution but lacking scale - these will be swallowed by the big 2 (or 3) in short order).
Don't discount Big Blue too early.1 day Peter Gathercole
The IBM PS/2 Model 70 was my first PC, bought for me on my 10th birthday in 1997. I knew nothing about computers, so it got thrown out a year later for an upgraded Acer 486. It had a huge impact on me, probably the one thing that got my love of computers and computing in general going.
In 2012, after many a weekend looking at eBay, I found a Model 70 for sale in London for £50. I got in the car, drove down and picked it up. I've other more interesting pieces of technology in my collection, but the Model 70 is the jewel as far as I'm concerned.1 day Danny 14
When the IBM AIX Systems Support Centre in the UK was set up in 1989/1990, the standard system on the desks of the support specialists was a PS/2 Model 80 running AIX 1.2. (I don't recall if they were ever upgraded to 1.2.1, and 1.3 was never marketed in the UK.)
386DX at 25MHz with 4MB of memory as standard, upgraded to 8MB of memory and an MCA 8514 1024x768 XGA graphics adapter and Token Ring card. IIRC, the cost of each unit excluding the monitor ran to over £4500.
Mine was called Foghorn (the specialists were asked to name them, using cartoon character names).
These systems were pretty robust, and most were still working when they were replaced with IBM Xstation 130s (named after Native American tribes), and later RS/6000 43Ps (named after job professions - I named mine Magician, but I was in charge of them by then so could bend the rules).
I nursed a small fleet of these PS/2s, reinstalled with OS/2 Warp (and memory cannibalized from the others to give them 16MB), for the Call-AIX handlers while they were in Havant. I guess they were scrapped after that. One user who had a particular need for processing power had an IBM Blue Lightning 486 processor (made by AMD) bought and fitted.
1 day kmac499
I remember the day the boss signed a huge contract to strip out the IBMs and put Gateways in instead. 200 machines, and in 1990 it was brand new 486 33s (not the x2 ones); it was a fortune. Out with IBM and in with DOS, Windows for Workgroups and good old Novell NetWare logon scripting. Good days, and it all pretty much worked. We even had Pegasus Mail back in the day and a JANET connection.
Back in them days the old "No-one got fired for buying IBM" was still rampant, and the site I was on, or more accurately the tech support crew, backed PS/2 and OS/2 for that very reason.
22 hrs Bandikoto
The wheels started to come off quite quickly with the cost and availability of MCA cards.
The other thing that really killed it isn't mentioned in the article: Token Ring networking. Anyone else remember Madge cards? I'm no network guy but IIRC it was all very proprietary, and an expensive pig to modify once installed (MAUs?). Unlike the Xerox Ethernet system with simpler cabling and connectors.
Of course some PS/2 architecture does survive to this day. The keyboard and mouse connectors; all that work and all that's left is a color coded plug .....
1 day Peter2
I fixed the Madge Tolkien Ring network card driver when I worked at Telebit, long, long ago, in a Valley far, far away. I'm pretty sure that was part of the work to get IPX working properly in Fred, as well. Token Ring was rated at 60% faster than thin Ethernet, and actually much faster under load, but a real pain in the behind to work with.
I remember the pictured computer in the article, and this was typed on the (pictured) Model M keyboard that came with that computer. They don't build equipment like that any more. Which is probably just as well, given that the keyboard weighs practically as much as a modern laptop.
15 hrs Anonymous Coward
notebook.
1 day BarryProsser
I still have a Model M on my home PC. Only real issue with it is that it doesn't have a Windows key. I also wonder how long motherboards will come with a PS/2 keyboard port.
Nothing I have ever used since comes close to it.
Perspective of the PC developers?
22 hrs Bandikoto
As a retired IBM UK employee, I have followed The Register's articles for years. I like this article more than most of the ones written about IBM. I would love to hear the perspective of the IBMers who were directly involved in the PC developments 30 years ago. I doubt many of them read or participate here. Where best to engage them? Yahoo, Linkedin, Facebook, others?1 day John Smith 19
Re: Perspective of the PC developers?
Do you participate in your local IBM Club? I know that at least one still exists in San Jose, California. There's an IBM Club in Boca, which seems like a good place to start, given that was the home of the IBM Personal Computer. http://www.ibmsfqccaa.org/
1 day LDS
Although technically OS/2 is still around
In hindsight (always 20/20) IBM's mistake was to seek to start making money off MCA ASAP. Had they been much more reasonable, they would have had a little bite of every board made (through licensing) and MCA would have dominated.
BTW let's not forget what a PoS Windows 2.0 was or how the 286 was so retarded that the MS tech who worked out how to switch it from real to virtual mode and back was hailed a f**king genius.
Re: Although technically OS/2 is still around
The 286 designers never thought that someone would want to return to the much more limited real mode once in the much more advanced protected (not virtual) mode. Entering protected mode was easy - just set the proper bit. Getting out was "impossible" - except through a reset.
After all, back then, backward compatibility wasn't yet a perceived issue. Probably Intel believed everyone would rewrite their software to work in the new, better, protected world. It turned out they were wrong (and they made the same mistake years later with Itanium, though...).
The problem was not the 286 itself (and IMHO we'll see the segmented memory model back again one day, because it implements a far better security model...); it was that most advanced DOS applications were written to bypass DOS itself and access the BIOS and hardware (RAM and I/O ports) directly, something that was hard to emulate on a 286 and made porting to other OSes more difficult.
Probably CP/M applications were better behaved, and a "protected" CP/M would have been easier to develop.
17 hrs Paul Crawford
Re: 286
14 hrs Richard Plinston
The article has one significant mistake - the 286 did support protected operation, even the option for "no execute" on memory segments. But as it was designed to be either 'real mode' 8086 compatible OR protected mode you had the fsck-up of having to use a keyboard controller interrupt to bring it out of halt state back to 'real' mode.
The major advances for the 386 were:
1) 32-bit registers
2) The "flat" memory model and virtual memory support (not the 16-bit segments of 286, OK still segments but way big enough for a long time)
3) The option to easily change protection modes.
6 hrs LDS
Re: Although technically OS/2 is still around
> BTW let's not forget what a PoS Windows 2.0 was or how the 286 was so retarded that the MS tech who worked out how to switch it from real to virtual mode and back was hailed a f**king genius.
It was an IBM tech who worked out how to switch the 80286 back to 8086 mode using the keyboard interface chip; there was a standard instruction to switch it to protected mode. The mechanism was incorporated into OS/2 1.0, and MS stole it for Windows/286.
1 day David Lawrence
The segmented mode has more than the "no execute" bit. It has an "execute only" mode - the CPU can execute the code in the segment, but the application can neither read nor modify it (goodbye, ROP!). Still, managing segments adds complexity, security checks slow down execution, and applications are less free (which, from a security point of view, is a good thing).
The "flat memory model" was not an inherent new feature - it means just to use large segments, large enough the application never need to load another one (which usually meant the whole application address space). Also, usually, code and data segments overlap to make everything "easier" (and far less secure).
286 16-bit segments were too small to allow for that. 386 32-bit segments allowed it, while the new paging feature allowed for "virtual memory" with page (4k) granularity. 286 virtual memory worked at the segment level - with 64k segments it could work, but with flat 4GB segments it couldn't, obviously.
But what made the 386 able to run DOS applications was the Virtual 86 mode. In that mode, the hardware itself trapped direct accesses to memory and I/O ports, and allowed the OS to handle them, without requiring complex, fragile hacks.
This mode is no longer available in 64-bit mode, and that's why 64-bit Windows can't run DOS applications any longer (Windows supports console applications, which are native Windows applications, not DOS ones).
Ah yes I remember it well
1 day Danny 14
I was working for IBM at the time, at their UK Headquarters, so I got my free copy of OS/2 Warp.
There was a big demo with loads of PCs all running Windows-based games and software. I hit problems installing it on my PC because I wasn't prepared to re-partition my hard drive and lose all my existing data. Rubbish.
I watched OS/2 go down the drain, followed by several other doomed products. Then the bottom dropped out of the mainframe market and things took another slide. The 'partnership' with Lotus was a bit of a disaster too.
IBM? They just do software and services right? And a lot (but not all) of the software was obtained through acquisitions. I remember when they actually made stuff - like a PC keyboard that cost £120.
Shame really.
1 day adam payne
Re: Ah yes I remember it well
Lotus Notes still makes me blink asymmetrically when I see pictures of it. <shudder>
I loved the Model M keyboard - such a fine piece of design and engineering.
1 day ShelLuser
1 hr Uncle Ron
IBM was its own worst enemy
It's been a while but back in the days I was a serious OS/2 advocate. Look, if you even get other people to end up trying out OS/2 because they became sick and tired of Windows 3.11 often bodging up and not being able to network properly then yeah...
But IBM more often than not didn't even seem to care all that much. Looking back I think it was a bit the same as the stories we hear about Microsoft now: how divisions in the company do different things, don't always work together, and in some rare cases even compete. Even at the expense of customers if they have to!
But IBM... I enrolled in the OS/2 support program (I seriously don't remember how I pulled this off anymore, I think I asked (and got) permission from my work to look into all this and also use their name) which ended up with IBM sending me several beta versions of OS/2 products. Including several OS/2 server environments. It was awesome. OS/2 server (a green covered double CD, that much I remember) was basically OS/2 with additional user management and network configuration settings.
Yet the funniest thing: IBM couldn't care less about your test results. At one time I got an invitation to go to IBM in the Netherlands for an OS/2 server demonstration which would also showcase some of their latest products (I recall being shown a very lightweight laptop). At arrival you had to search for the entrance and where it all was, because any announcements or directions were nowhere to be found on site.
I bought OS/2 3.0 Warp and 4.0 Merlin and they always worked like a charm. I seriously liked OS/2 much better than anything else. So when I had the opportunity to buy a PC through my work it was obvious what I would need to get, right? An IBM Aptiva. That would be the ultimate, the thing to get for OS/2. Because obviously an IBM OS will definitely run on IBM hardware, right?
Context: this was at the prime of my OS/2 endeavors. I could optimize and write a config.sys file from mind if I had to, I knew what drivers to use, which to skip, what each command did. Memory optimization? Easy. Bootstrapping a *single* floppy disk to get an OS/2 commandline? Hard, yet not impossible (try it, you'd normally get multiple disks to boot with).
It took me one whole weekend and dozens of phone calls to the IBM support line, and the conclusion was simple: IBM did not care about OS/2 on their own hardware. And by that I mean not at all. It did not work, no matter what I tried. Even they told me that this wasn't going to work. Compaq, of all brands, did care. Compaq, the brand which tried extremely hard to appeal to the general customer by making their hardware "easy" to use and also "easy" to customize (comparable to Dell a bit), didn't only target Microsoft and Windows. Noooo.... When I eventually ditched my IBM I got myself a Compaq, and I also purchased an extra set of drivers and installation media (3 boxes of 3.5in floppy disks, approx. 37 in total) and guess what? Next to a full Windows 3.11 installation plus a different program manager and dozens of drivers, it also included several disks with OS/2 drivers. I removed Windows and installed OS/2 that very same evening.
Compaq... which often advertised that they made Windows easier. And also delivered OS/2 drivers for their hardware...
IBM, which made OS/2 and also made hardware, never even bothered to provide OS/2 drivers for their own PCs. Not even if you asked them.
Does that look like a company which cared?
IBM was its own enemy sometimes.
1 day Anonymous South African Coward
Re: IBM was its own worst enemy
IBM was and still is its own enemy. So many comments above reflect this so well. E.g.: "IBM's mistake was that it tried to make money right away from MCA." So true. IMHO, it is the MBAs permeating the entire company that are the enemy. They know nothing about the business IBM is actually in, only about cost recovery, expense containment, and fecking business models. For the last 30 years, the real heroes in IBM have been the ones who cut the most, or spend the least, or pound suppliers the worst.
This virus is especially dangerous when a non-MBA contracts it. When they see who gets the most recognition, they can't wait to de-fund sales commissions or training programs or development staffs. They think they are "doing good." It is not only true in IBM. Companies all over the West are infected with the idea that reducing costs (and innovation) towards zero, and increasing revenue towards infinity, is all we should be working on. So, fewer Cheerios in the box, fewer ounces of pepper in the same size can, cut the sales-force, reduce the cost (and quality) of support, and on and on and on.
If there is one word that summarizes this disease, and a word I cannot stand to hear in -any- context, it is the word, "Monetize." It encapsulates all the evils of what I feel is the "Too Smart by Half" mentality. I cannot say how many times I have heard the phrase, "how much money are we leaving on the table?" or, "how many more will we sell if we..." and the room goes silent and a good idea is dropped.
I am sorry I am rambling, I am sad. There will never be another System/360 or Boeing 747. Incremental from here on out. Elon Musk doesn't seem to be infected...
Single Input Queue
1 day Kingstonian
Gotta love the silly old SIQ... one app chokes the SIQ and you can do absolutely nothing, except hard reboot :)
Fun times indeed. I had a couple of SIQ incidents as well. All but forgotten, but recalled to memory now :) Sometimes a CTRL-ALT-DEL would work, sometimes not.
And who remembers the CTRL-ALT-NUMLOCK-NUMLOCK sequence?
15 hrs addinall
Memories - Nostalgia isn't what it used to be.
It all worked well and was of its time. PS/2 and OS/2 made sense in an IBM mainframe-using corporate environment (which I worked in), with some specialized workgroup LANs too. OS/2 EE had mainframe connectivity built in and multitasking that worked. Token Ring was much better for deterministic performance too where near-real-time applications were concerned, and more resilient than Ethernet at the time - Ethernet would die at high usage (CSMA/CD on a bus system) whereas Token Ring would still work if loaded to 100% and just degrade performance gracefully. Ethernet only gained the upper hand in many large corporate environments when 10BASE-T took off. Token Ring would connect to the mainframe too, so no more IRMA boards for PCs.
There was OS/2 software available to have a central build server where each workstation could be defined on the server and then set up via the network by booting from floppy disk - useful in the corporate world. DB2 was available for OS/2, so a complete family of useful tools was available. And IBM published its standards.
IBM was used to the big corporate world and moving down to individuals via its PCs whereas Microsoft at that time was more individual standalone PCs and moving up to corporate connectivity. The heritage still shows to some extent. Novell was still the LAN server of choice for us for some time though.
The PS/2 was easy to take apart - our supplier showed us a PS/2 Model 50 when it first came out. He had to leave the room briefly, and we had the lid off the machine and had taken it apart (no tools needed) before he returned. He was very worried, but it was very easy just to slide the parts back together and they clipped into place - not something you could do with other PCs then. I came across an old price list recently - the IBM Model M keyboard for the PS/2 was around £200 (without a cable, which came with the base unit - short for the Model 50 and 70 desktops and long for the 60 and 80 towers!). Memory was very expensive too, and OS/2 needed more than DOS. In fact EVERYTHING was expensive.
OS/2 service packs (patches) came on floppy disks in the post. You had to copy them and then return them!
Starting in computing just after the original IBM PC was announced, this all brings back fond memories and a huge reminder of the industry changes.
5 hrs Anonymous Coward
Fake News. Re: Memories - Nostalgia isn't what it used to be.
Your memory is shot. OS/2 was developed by Microsoft AND IBM, first released jointly in December 1987. Bill Gates was at Wembley flogging the OS during 1988.
Way before then, Microsoft had Version 7 UNIX running and released as XENIX for the 8086/80286 architecture.
The development of OS/2 began when IBM and Microsoft signed the "Joint Development Agreement" in August 1985. It was code-named "CP/DOS" and it took two years for the first product to be delivered.
OS/2 1.0 was announced in April 1987 and released in December. The original release is textmode-only, and a GUI was introduced with OS/2 1.1 about a year later. OS/2 features an API for controlling the video display (VIO) and handling keyboard and mouse events so that programmers writing for protected-mode need not call the BIOS or access hardware directly. In addition, development tools include a subset of the video and keyboard APIs as linkable libraries so that family mode programs are able to run under MS-DOS. A task-switcher named Program Selector is available through the Ctrl-Esc hotkey combination, allowing the user to select among multitasked text-mode sessions (or screen groups; each can run multiple programs).
Communications and database-oriented extensions were delivered in 1988, as part of OS/2 1.0 Extended Edition: SNA, X.25/APPC/LU 6.2, LAN Manager, Query Manager, SQL.
The promised graphical user interface (GUI), Presentation Manager, was introduced with OS/2 1.1 in October, 1988. It had a similar user interface to Windows 2.1, which was released in May of that year. (The interface was replaced in versions 1.2 and 1.3 by a tweaked GUI closer in appearance to Windows 3.1).
OS/2 occupied the "Intelligent Workstation" part of SAA (Systems Application Architecture) and made use of the APPC PU2.1 LU6.2 SNA network stack.
7 mins Anonymous Coward
"The development of OS/2 began..." etc. etc.
That mostly looks copied and pasted from Wikipedia. A link would have been enough.
Re: Fake News. Memories - Nostalgia isn't what it used to be.
1 day Primus Secundus Tertius
"OS/2 was developed by Microsoft AND IBM"
And that's why an OS/2 subsystem lingered in Windows for some time.
(They finally killed it, right? And the POSIX subsystem?)
No Stop button in window
22 hrs Frish
The big defect in OS/2 that I met was the lack of a stop button in any window. Yes, you could close the window but that did not stop the task, just left it sitting in the background.
It was Windows 95 that brought us a proper stop button.
We had an OS/2 system returned to us as not working. The user had been closing the window, and then later starting up another instance. The system was clogged with dormant tasks. Once I removed them, everything worked again; what we had to do then was to update our User Guide.
IBM what could have been
At one point, OS/2 could run multiple PC-DOS Windows "Better than Microsoft can" since memory was partitioned, and a crash in one window wouldn't affect the whole machine. Microsoft wrote a stub to detect whether OS/2 was installed, and killed the attempt to load PC-DOS...
Where IBM Boca missed the boat was thinking that this is a "personal" computer, and therefore, not a "commercial" one. IBM should have OWNED that market, entirely, and the competing product within the corporation that lost to Boca's product recognized that but got shoved aside by the sexiness of the PC, and all the departures from Big Blue tradition, etc.
Also, the way IBM execs got paid meant they were shortsighted about 'solutions' that included IBM Products that they didn't get paid for.
As Product Marketing Manager for IBM CICS OS/2 (announced at Comdex '89; a Sr. VP from Microsoft shared with me on the show floor "That's the most important announcement in this entire show" as I handed him a press release THAT WAS NOT included in the show's press kit, since the PC division was in charge and they kept other divisions' products from being included...), I tried to get the then President of the PC division to just whisper CICS OS/2 to the management of a very large insurance firm. He would have left with a 40,000 PC order, but, instead, chose to say nothing... IDIOTIC but true.
18 hrs Primus Secundus Tertius
9 hrs Pompous Git
Re: IBM what could have been
"...Microsoft wrote a stub to detect whether OS/2 was installed..."
Windows 95 was distributed to end users as an "update CD". It would not run unless it detected an installed Windows 3.x or was presented with the first W3 floppy disk. It would also accept the first OS/2 Warp 3 floppy disk.
Re: IBM what could have been
"Windows 95 was distributed to end users as an "update CD". It would not run unless it detected an installed Windows 3.x or was presented with the first W3 floppy disk. It would also accept the first OS/2 Warp 3 floppy disk."
That's because Warp included a full Win 3.x license.
22 hrs Jim 59
GEM
22 hrs Justin Clift
I remember seeing a demonstration of GEM at the PCW show in 1986 or 1987, at Olympia I think it was. Very impressive it was too. Didn't it also come bundled on some Amstrad PCs?
Re: GEM
> Didn't it also come bundled on some Amstrad PCs ?
Yes. They came with both DRI's DOS+ and GEM and MS-DOS.
It was in all Atari 512 (and derivatives) running on TOS, which was written by DRI. It also came with the BBC Master 512, which ran DRI's DOS+ and GEM on an 80186 (or was it 80188?) processor.
OS/2 resurrection
22 hrs Jim-234
Just to point out, OS/2 is being resurrected as "ArcaOS":
They've been working on it for a while, and (though I've not used it) apparently it's targeted for release in under two weeks:
Modern Qt 5.x has been ported to it, and many Qt based apps are apparently working properly.
Saying this as DB Browser for SQLite is one of them. One of our Community members has been keeping us informed. ;)
OS/2 Desktop virtualization before it was cool
18 hrs Handlebar
I used to run OS/2 very well on a 386 based PC
I could have several windows open each with different virtual OSes, and you could move around the virtual OS image files etc.
So for example if you had some odd networking hardware (like LanTastic) that only wanted a specific DOS version, well spin up a new little dedicated Virtual OS image for it.
While people think all this virtualization stuff is so new.. it was around 25 years ago and worked rather well considering the hardware you had available at the time.
It's a shame Microsoft was able to blackmail IBM into discontinuing OS/2, with a different vision OS/2 might have become a serious contender for workplace use.
Re: OS/2 Desktop virtualization before it was cool
9 hrs Pompous Git
Er, IBM were doing virtual machines in the 1960s ;-)
22 hrs VanguardG
Re: OS/2 Desktop virtualization before it was cool
"It's a shame Microsoft was able to blackmail IBM into discontinuing OS/2"
Say what? Are you crazy?
The quote from Mr Watson was supposedly in 1943 - when computers were larger than many houses and weighed as much as several buses. To say nothing of being extremely pricey (since they were essentially built on-site by hand) and expensive to operate, since they needed to be staffed by a crew trained in just that one machine, plus the power needs were enormous, and the heat produced prodigious. And, at that time, you really had fairly limited tasks that needed doing that required that kind of number-crunching. Did he say it? Maybe not, but given the time frame, it really doesn't seem as boneheaded as it would have sounded 15 years later.
Microchannel *would* have been sweet, were it not for the expense. A basic sound card of the time was $70 for ISA - the same card in MCA was $150.
As for plugging in a keyboard after boot... I still find it amusing that someone actually wrote an error code of "Keyboard not detected. Press F1 to continue." If there's no keyboard, there's no F1 to press.
21 hrs Primus Secundus Tertius
There are other variations of that quote about five computers: e.g. the UK would only need that kind of number.
For the publicly known computations of that time - gunnery trajectories etc., that number is perhaps right. But I believe over a dozen instances of Colossus were built for code cracking. So even then the estimate of five was way out.
However the real expansion of computing came with data processing, where record keeping outweighed the relatively small amount of computation. IBM should have known that, given their existing business in accounting machines fed with punched cards.
18 hrs Updraft102
Well, yeah... there's no F1 to press, so if you want the PC to boot and you see that message, you will have to plug a keyboard in and press F1. That and make sure the keyboard lock is in the off/unlocked position, which is probably more what the keyboard check/stop on fail configuration was about than anything else.
The keyboard lock was an actual physical lock on the front of the PC that used a round key (like on vending machines) which, if set to the locked position, would prevent the PC from detecting the keyboard or responding to keypresses.
Enabling the "Press F1 to resume on keyboard error" BIOS setting made the keylock into a rudimentary system protection device. It wasn't all that effective against anyone who could get into the computer case, as bypassing the lock was as easy as pulling the cord for the keylock off the motherboard, but PC cases also typically had provisions for a small padlock to keep the cover on the case back then too. It wasn't great protection, but it probably provided some relief from pranksters, who probably would not be determined enough to cause physical damage to someone's PC for a joke.
14 hrs Richard Plinston
> So even then the estimate of five was way out.
Actually he was speculating on the sales of the particular model that they were currently building. He was counting the number of government agencies that would be able to afford them and find them useful.
It was only much later that anyone tried to use computers for commercial purposes that would find them a place in businesses: LEO - Lyons Electronic Office was developed for payroll, stock distribution and manufacturing (of cakes).
In the 1950s the British government were deciding where _the_ computer would go. They chose a town that was a major railway junction, because the trainloads of punched cards could then easily be shipped to it.
20 mins VanguardG
I remember - and people were perpetually misplacing the keys to their padlocks. The round keylocks require a lot of expertise to crack open, but the padlocks... well... they were nearly always bought cheap. Basic (legal) hand tools could pop them open in seconds, without any damage to the case and, at most, a scratch or two on the padlock. Some people who were fairly serious had monitor stands with a built-in locking keyboard drawer, but those were usually employed by people who had a "favorite" keyboard they were afraid would get stolen by a jealous co-worker, rather than because of any actual security concerns.
21 hrs Tim Brown 1 The UK had the best tech for personal computers at the time
For PCs during that period, in pure tech terms , Acorn's ARM machines running RISC-OS were way ahead of offerings from anyone else and prior to that the BBC micro (built by Acorn).
It's just such a shame that Acorn lacked any international marketing savvy then.
14 hrs Richard Plinston Re: The UK had the best tech for personal computers at the time
> It's just such a shame that Acorn lacked any international marketing savvy then.
And yet Acorn became a worldwide powerhouse of chip design that currently sells licences for billions of chips every year. Even before phones started using them, ARM was selling tens of millions of chip licences to power embedded equipment (modems, routers, PABXs, ...).
ARM = Acorn RISC Machines
9 hrs jake Re: The UK had the best tech for personal computers at the time
Not really. I had a 16 bit Heath H11A personal computer in 1978. Acorn didn't ship a 16 bit computer until 1985 ... The old girl still runs. Loudly.
https://en.wikipedia.org/wiki/Heathkit_H11
21 hrs Anonymous Coward
"Thomas J Watson may never have said the world will have five computers"
It is funny that people take this statement to be a sign of IBM not understanding the potential of the computing market, if it was ever even said. It actually makes a lot of sense, though. TJW, correctly, didn't think it would make sense for every business out there to go build its own little data center and a "best we can do" computing infrastructure. Better to let a giant time-share (now called a cloud provider) handle all of that complexity and just rent the resources as needed. It is kind of like saying there is only room for a handful of electrical utilities in the world. Even if everyone, at MSFT's urging, went out and bought a gas-powered generator for their house... it still makes sense that there is only room for a handful of utilities.
21 hrs JohnCr A tiny bit more
A few comments.
1) IBM was STRONGLY urged to skip the 286 and create a true 32 bit, 386 version of OS/2. Even Microsoft was strongly pushing IBM in this direction. IBM's resistance to do so was a major contributor to the split between IBM and Microsoft.
2) The MicroChannel was better than the ISA bus. The problem was it was not good enough for the future. The PC evolution was moving towards faster graphics and network (100 mbps) connectivity. The MicroChannel, even 3 generations into the future did not have the bandwidth to meet these needs. The industry evolved to the PCI interface. As an interesting historical coincidence, PCI uses the same type of connectors as did the MicroChannel. And the PCI interface found its way into other IBM systems.
3) The IBM PC was inferior to Apple's products. The PC became more successful because a new industry and IBM's customers worked together to make it successful. (Steve Jobs - The Lost Interview.) In 1987 IBM turned a deaf ear to the industry and its customers. When IBM stopped listening, its fortunes turned. This culture took over the whole company and was a major factor in the company almost going out of business in the 1990s.
14 hrs Richard Plinston Re: A tiny bit more
> Even Microsoft was strongly pushing IBM in this direction.
Microsoft had developed its own 286 versions of MS-DOS: 4.0 and 4.1 (not to be confused with the much later 4.01). These were also known as European DOS because they were used by Siemens, ICL (where I worked) and Wang. These versions supported limited multitasking of background tasks plus one foreground program. I have a manual here on how to write 'family' applications that would run on 8086 MS-DOS or in protected mode on 80286 MS-DOS 4.x.
It was dumped when they switched to writing OS/2 with IBM.
9 hrs niksgarage Re: A tiny bit more
MCA was an architecture that worked fine in workstations, just not PCs. In Austin, TX in the '80s I met Fred Strietelmeyer (fairly sure that's his name), the architect of MCA, who told me that MCA was a fine architecture, but not all implementations were. The RS/6000 had multiple bus-mastering working with MCA, mostly because the address on the channel was a logical address that went through the memory manager, so AIX could easily control and protect the physical memory space. The PS/2 used physical addresses, which meant that either bus mastering was turned off, or the bus-mastering cards needed to have a copy of the memory manager on board as well. If you were running AIX, MCA was not a problem, or even a question deserving of five minutes of thought.
The PC industry hated MCA: the connector, the architecture and its licencing. They came out with EISA, a backward-compatible connector to extend the AT bus. I always found it a huge irony that PCI used the same physical connector as MCA years later.
9 hrs niksgarage Re: A tiny bit more
AND you are exactly right about the 286/386 wars. I was in Boca when the AIX guys from Austin came to see the CPDOS developers (as OS/2 was known in Dev) and showed them true multi-tasking on the 386. They were baffled why IBM was writing another OS when we already had a virtualising, 32-bit-ready, pre-emptive multi-tasker that ran on multiple hardware platforms. Its only issue was that it couldn't run on a 286. And for that reason alone, IBM spent enough money on OS/2 development that it could have paid for the Hubble telescope AND had it repaired.
I also saw the numerous prototype machines in Boca (one called 'Nova' in dev, as I recall) which had a 16MHz 386, lots of memory on board and an AT expansion bus. (It also had the 1.44MB diskette drive.) Nice machine; it could have sold really well. Only the Model 30 was allowed to be shipped, in case the AT bus continued to outshine MCA.
Marshalltown What grief?
It always puzzled me what grief OS/2 was supposed to create. I used it as a substitute for Windows, running Windows software, into the early years of the century. I gathered that IBM might have still been attempting to extract its pound of flesh from developers, but as far as I was concerned, it worked fine. I built my own machines and it ran on them without problems. I also liked Rexx as a scripting language. It was immensely more useful than DOS batch files and much less of a pain (to me) than MS BASIC and all its little dialects and subspecialties.
The only real grief I encountered was developers who "simply couldn't" do a version for OS/2 - and, of course, MS doing their best to see that their software was less compatible than need be.
History seems to forget how thoroughly MS would break things with each "improvement." Many of the useful "improvements" in Windows were first present in OS/2, and some really nice features vanished when IBM decided the effort wasn't worth the candle. The browser with its tree-structured browsing history was remarkable; no browser since has had anything to match it. Even now, relics of the OS/2 interface are still present in KDE and GNU. Microsoft has finally moved "on" with the horrible-looking and -acting interface of Windows 10.
Ah... Concurrent DOS...
IBM did actually use DR Concurrent DOS 286 - but in their 4680 point-of-sale OS (described often as a real P.O.S. by those of us who used it).
Re: Ah... Concurrent DOS...
> IBM did actually use DR Concurrent DOS 286 - but in their 4680 Point-of-sale (described often as a real P.O.S. by those of us who used it) OS.
Yes, that was a DRI product, but it was not Concurrent-DOS, it was FlexOS. FlexOS shared code with MP/M-86, as did Concurrent-DOS (neither of which had an 80286 product), but it was 80286 based. The main difference was that FlexOS would not run MS/PC-DOS programs, while Concurrent-CP/M-86 / Concurrent-DOS would run several of them at once (as well as CP/M-86 programs).
DRI had pre-emptive multi-user, multi-tasking systems since 1978 with MP/M, which ran on 8085 and Z80 micros with bank-switched memory (I have a couple of RAIR Blackbox/ICL PC1s here and an ICL PC2 8085AH2 with 512Kbyte). MP/M 2 and MP/M-86 (for the 8086) were released around 1980. Concurrent-CP/M-86, with multiple virtual screens, ran on an IBM PC (and other machines - I have a stack of 8086 ICL PC2s) and could use EEMS memory cards such as the AST RamPage to get several Mbytes of memory and do context switching with just a handful of register moves.
Concurrent-CP/M-86 was demonstrated the same month as MS-DOS 2 was released. It had pre-emptive multi-tasking (and multiuser with serial terminals). The virtual screens were just a keystroke away so one could run SuperCalc, Wordstar, and other programs at the same time and just flick between them - even on the serial terminals.
Later, this was developed for 386 into DR-Multiuser-DOS from which DR-DOS 5 and 6 were derived.
There was a FlexOS-386, which had an enhanced GEM-X, but it was dropped to concentrate on the Concurrent range.
PS/2 and OS/2
The naming convention always struck me as odd. In my mind, the /2 meant divide by 2, i.e. half an OS and half a PS :-)
Re: PS/2 and OS/2
Ah - "OS divided by 2" - I'd never thought of it that way before.
Interestingly, for the short time that MS were promoting it, they spelt it OS-2, which could I suppose be read as "OS minus 2".
I'm not sure what, if anything, we can deduce from that!
Re: PS/2 and OS/2
Nah. IBM (and many others) used the "/" for many, many products. Like System/360. It was a naming convention that seemed to lend weight, not division, to a product family.
Big Blue: IBM's Use and Abuse of Power
To understand the lead up to PCs people should read Richard DeLamarter's Big Blue: IBM's Use and Abuse of Power.
It goes back to the 1890s and shows how IBM became dominant through not-so-ethical practices. The antitrust suit against IBM was dropped by Ronald Reagan, which prompted DeLamarter to write the book.
Make sure your driver strategy is in place if you launch a new O/S
Good article Andrew. I was there in Miami when IBM launched the PS/2. They closed off the streets six blocks around the exhibition centre and, as far as I remember, the Beach Boys played. They should probably have saved their money.
One thing you missed is that not only was OS/2 late, but the driver support in the operating system was very poor. This meant that as well as blocking all the plug-in cards through the new bus architecture, they also bricked all of the add-on peripherals for the PS/2 that had worked with the IBM PC. Add to that the fact that the OS/2 driver team started competing with driver developers for driver business, and also blocked them from developing for the architecture (until the OS/2 DDK made a brief appearance and then disappeared), and the set of factors that contributed to IBM's demise was complete.
I recall that when the driver team saw the source code of one of our drivers some years later, they threatened to sue us. That was until I pointed out that the code was based on the OS/2 DDK; they went quiet, but couldn't quite believe that we had managed to obtain a copy in the few weeks that it had popped its head above the surface.
Microsoft worked out early on that driver support is a key element in the success of an operating system. That is something they seem to have lost sight of a bit from Windows Vista onwards, although I suppose the switch to 64-bit has made backwards compatibility more difficult. Keep the nostalgia coming Andrew, it's not like it used to be! Tony Harris
6 hrs BinkyTheMagicPaperclip
Such a missed opportunity
I didn't use OS/2 1.x at the time (only later), but beyond 1.0 it was fine for server based apps and a good, solid platform. Not so much for desktop apps - insufficient driver support, high memory requirements, and limited app support put paid to that.
OS/2 2.x and beyond was a much improved proposition, but suffered in competition with the large number of Windows apps. The userbase were not, in general, prepared to pay more for a smaller amount of higher quality features - the reality of running a minority platform.
OS/2 might have got further if IBM concentrated on Intel, but instead they wasted vast amounts of effort on OS/2 PPC. Much though I loved OS/2, the succession plan was flawed. Windows NT is simply better architected, they spent the time maintaining compatibility with 16 bit apps, and had much improved security, and multi user support. OS/2 was effectively dead before it really caused a problem, but it would have caused issues later on.
System <n>/Mac OS were also flawed, and the early versions of OS X sucked, but Apple are much better at retaining compatibility whilst updating the OS (at least for a few years, until they drop old kit like a brick).
I've still got an OS/2 system, and a lot of apps, and will be assembling an OS/2 1.3 system (because I'm a masochist and like trying OSes). Haven't bothered with eComStation, but might give Arca 5.0 a go if it's any good and not ludicrously priced. There aren't too many OS/2 apps I really want to run these days, though.
One final note: it's a *synchronous* input queue, not single. If messages are not taken off the input queue, it hangs the interface, but does not stop apps running. There was a workaround implemented in Warp 3 fixpack 16, but until then a badly behaved app was a real pain. However, Win32 successfully moved from the synchronous input queue of Win16 to asynchronous queues in Win32 without breaking too many apps. IBM should have put in the engineering effort to do the same.
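The fragility of a synchronous shared input queue can be sketched in a few lines. This is a toy model in Python, purely illustrative (it is not OS/2 or Windows code, and the app names are made up): with one shared in-order queue, a hung app at the head of the queue blocks delivery to every app behind it, while per-app queues isolate the damage.

```python
from collections import deque

def deliver_shared(events, responsive):
    """Single shared queue: messages are delivered strictly in order,
    so a hung app at the head stalls delivery to everyone behind it."""
    queue = deque(events)
    delivered = {app: [] for app in responsive}
    while queue:
        app, msg = queue[0]
        if not responsive[app]:      # hung app holds up the whole queue
            break
        delivered[app].append(msg)
        queue.popleft()
    return delivered

def deliver_async(events, responsive):
    """Per-app queues: a hung app only delays its own messages;
    everyone else's input is delivered normally."""
    delivered = {app: [] for app in responsive}
    for app, msg in events:
        if responsive[app]:
            delivered[app].append(msg)
    return delivered

events = [("editor", "key A"), ("hung_app", "click"), ("editor", "key B")]
status = {"editor": True, "hung_app": False}

print(deliver_shared(events, status))  # editor never sees "key B"
print(deliver_async(events, status))   # editor gets both keystrokes
```

In the shared model the responsive editor loses input the moment any other app stops pumping messages, which is exactly the hang the Warp 3 fixpack worked around.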
There are also some substantial differences between OS/2's architecture and Windows (or indeed anything else). For instance, the co-ordinate origin in Windows is at the top left of the screen, but in OS/2 it's the bottom left (OS/2 uses the mathematically correct option here).
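The practical consequence for anyone porting drawing code between the two is that y-coordinates must be flipped. A minimal sketch (Python, illustrative only; real GDI and Presentation Manager calls look nothing like this):

```python
def win_y_to_os2_y(y_win, screen_height):
    """Convert a y-coordinate from a top-left origin (the Windows
    convention, y growing downward) to a bottom-left origin (the
    OS/2 PM convention, y growing upward). With pixel rows numbered
    0..screen_height-1, the top row maps to screen_height-1."""
    return screen_height - 1 - y_win

# The same formula converts in both directions (it is its own inverse):
assert win_y_to_os2_y(win_y_to_os2_y(10, 768), 768) == 10

print(win_y_to_os2_y(0, 768))  # Windows top row 0 is OS/2 row 767
```

x-coordinates need no conversion, since both systems put x = 0 at the left edge.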
The "fun" takeaway from this is that while "industry analysts" have been consistently wrong for at least the last 30 years, we're still listening to what they say...
Concurrent DOS, great stuff!
One of the first multiuser systems I really learned as a spotty teenager back in the late '80s. Working with my Dad on getting shared DataEase databases working at his workplace. We had C/DOS on the "master" PC system and a couple Wyse terminals hanging off the serial ports, 3 systems independently using a shared, albeit cutdown, PC based RDBMS system. I loved it so much as a kid that I ended up with a career working in database systems.
Better than OS/2 / Win9x / NT
Around the time OS/2 was making its way onto the scene and most people used DESQview to multitask on their 286/386 PCs, Quantum Software Systems (now QNX Software Systems) had a real-time, multiuser, multitasking, networked and distributed OS available for 8086/8088 & 80286 processors.
On PC/XT hardware it ran without any memory protection, but the same binaries would run on both the real-mode and protected-mode kernels.
Something about being able to do
$  unzip /tmp/blah.zip
would run the unzip program on node 2, read the /tmp/blah.zip file on node 3 as the source archive, and extract the files into the current working directory.
We accessed a local BBS that ran a 4-node QNX network (6 incoming phone lines + X.25 (Datapac) )
Even supported diskless client booting and the sharing of any device over the network. Though at around $1k for a license, it wasn't "mainstream".
It's too bad that the few times Quantum tried to take it mainstream, the plans failed. Both the Unisys ICON and a new Amiga had chosen QNX as the base for their OS.
OS/2 from the Big App developers' POV
WordPerfect's then-CEO wrote this memoir and eventually put it on the web for free. It's actually a good read, especially if you've never experienced the insane self-absorption of American corporate insiders, which is why the company went from biggest in the world to *pop* so suddenly.
OS/2 first mentioned near end of Ch.8 and then passim. It shows quite a different view of OS/2.
We had a productive OS/2 machine at one of our sites up until very recently. I think the only reason it was got rid of was the site being axed.
It was running a protein-separation machine and had to dual-boot into Win98 if you wanted to copy your data to the NAS. It was impressive that it lasted so long, bearing in mind it lived in a cold room at <5°C.
1 hr ImpureScience
Still Sort Of Miss It
I really liked OS/2, and for a while I thought it would take the place that Windows now owns. But IBM had no idea how to sell to individual end users, or how to get developers on board. Despite having a superior product, their financial policies guaranteed that the only customers ended up being banks and insurance companies.
I'm a musician, and I remember going back and forth for over a year with the guy they had put in charge of MIDI on OS/2. It never happened, because which bank, or what insurance company, would be interested?
Apr 05, 2017 | sourceforge.net
April 5, 1911 – Born on this day is Cuthbert Hurd, a mathematician hired by IBM in 1949. At the time, Hurd was only the second IBM employee hired with a Ph.D. While he may not be widely-known, his contribution to IBM and the development of computers is invaluable. During the early 1950s IBM profited greatly from traditional punch card accounting. It was Hurd who quietly encouraged the upper management of IBM to enter the field of computing, as a cross-country sales trip revealed pent-up demand for scientific computers. It was a difficult move for the company, but was a rewarding one. Hurd was able to sell 10 out of the 18 computers marketed as the IBM 701, the first commercial scientific machines that could be rented at $18,000 a month.
Hurd soon became director of the IBM Electronic Data Processing Machines Division, and later became president of the Computer Usage Company, the first independent computer software company.
Mar 29, 2017 | sourceforge.net
March 29, 1989 – Pixar wins an Academy Award for Tin Toy , the first fully computer-animated work to ever win best animated short film. The film was directed by John Lasseter and financed by then Pixar owner Steve Jobs. It was an official test of the PhotoRealistic RenderMan software. The short film later caught the attention of Disney, which partnered with Pixar to create the highly successful Toy Story , an animated movie turned film franchise with elements inspired by the original short film.
Mar 22, 2017 | sourceforge.net
This is the first of a new blog series, "Today in Tech" and will feature the significant technological events of specific dates throughout history. Today in Tech:
1971 – Intel announces that the world's first commercial microprocessor – the Intel 4004 – is officially ready for shipping. The 4-bit central processing unit was designed by engineers Ted Hoff and Stan Mazor under the leadership of Federico Faggin, and with the assistance of Masatoshi Shima, an engineer from the Japanese firm Busicom. The microprocessor was designed in April of 1970 and completed in January of 1971, the same year when it was first made commercially available.
Feb 21, 2017 | economistsview.typepad.com
RC AKA Darryl, Ron, February 20, 2017 at 04:48 AM
RE: Designing and managing large technologies
cm -> RC AKA Darryl, Ron..., February 20, 2017 at 12:32 PM
[This is one of those days where the sociology is better than the economics or even the political history.]
What is involved in designing, implementing, coordinating, and managing the deployment of a large new technology system in a real social, political, and organizational environment? Here I am thinking of projects like the development of the SAGE early warning system, the Affordable Care Act, or the introduction of nuclear power into the civilian power industry.
Tom Hughes described several such projects in Rescuing Prometheus: Four Monumental Projects That Changed the Modern World. Here is how he describes his focus in that book:
Telling the story of this ongoing creation since 1945 carries us into a human-built world far more complex than that populated earlier by heroic inventors such as Thomas Edison and by firms such as the Ford Motor Company. Post-World War II cultural history of technology and science introduces us to system builders and the military-industrial-university complex. Our focus will be on massive research and development projects rather than on the invention and development of individual machines, devices, and processes. In short, we shall be dealing with collective creative endeavors that have produced the communications, information, transportation, and defense systems that structure our world and shape the way we live our lives. (3)
The emphasis here is on size, complexity, and multi-dimensionality. The projects that Hughes describes include the SAGE air defense system, the Atlas ICBM, Boston's Central Artery/Tunnel project, and the development of ARPANET...
[Of course read the full text at the link, but here is the conclusion:]
...This topic is of interest for practical reasons -- as a society we need to be confident in the effectiveness and responsiveness of the planning and development that goes into large projects like these. But it is also of interest for a deeper reason: the challenge of attributing rational planning and action to a very large and distributed organization at all. When an individual scientist or engineer leads a laboratory focused on a particular set of research problems, it is possible for that individual (with assistance from the program and lab managers hired for the effort) to keep the important scientific and logistical details in mind. It is an individual effort. But the projects described here are sufficiently complex that there is no individual leader who has the whole plan in mind. Instead, the "organizational intentionality" is embodied in the working committees, communications processes, and assessment mechanisms that have been established.
It is interesting to consider how students, both undergraduate and graduate, can come to have a better appreciation of the organizational challenges raised by large projects like these. Almost by definition, study of these problem areas in a traditional university curriculum proceeds from the point of view of a specialized discipline -- accounting, electrical engineering, environmental policy. But the view provided from a discipline is insufficient to give the student a rich understanding of the complexity of the real-world problems associated with projects like these. It is tempting to think that advanced courses for engineering and management students could be devised making extensive use of detailed case studies as well as simulation tools that would allow students to gain a more adequate understanding of what is needed to organize and implement a large new system. And interestingly enough, this is a place where the skills of humanists and social scientists are perhaps even more essential than the expertise of technology and management specialists. Historians and sociologists have a great deal to add to a student's understanding of these complex, messy processes.
[A big YEP to that.]
Another rediscovery that work is a social process. But certainly well expressed.
It (or the part you quoted) also doesn't say, but hints at the obvious "problem" - social complexity and especially the difficulty of managing large scale collaboration. Easier to do when there is a strong national or comparable large-group identity narrative, almost impossible with neoliberal YOYO. You can always compel token effort but not the "intangible" true participation.
People are taught to ask "what's in it for me", but the answer better be "the same as what's in it for everybody else" - and literally *everybody*. Any doubts there and you can forget it. The question will usually not be asked explicitly or in this clarity, but most people will still figure it out - if not today then tomorrow.
Jan 11, 2017 | theregister.co.uk
Re: The point stands
The point is flat on its back, just like the sophistic reply.
Let's take Apple's first machines: they copied the mouse from Olivetti, and they took the OS look from a Rank Xerox engineer's work. The private sector takes risks and plagiarizes when it can. But the missing person here is the amateur. Take the BBS, which private individuals designed, built and ran: it was the precursor to the net, and a lot of .com companies like AOL and CompuServe were born there.
And the poor clarity in the BBC article is mind-numbing. The modern tech industry has the Fairchild camera company as its granddaddy, which is about as far from federal or state intervention and innovation as you can get.
Deconstructionism only works when you understand the brief and use correct and varied sources, not just one crackpot seeking attention.
Re: Engineering change at the BBC?
"The BBC doesn't "do" engineering "
CEEFAX, PAL Colour TV, 625 line transmissions, The BBC 'B', Satellite Broadcasting, Digital Services, the iPlayer, micro:bit, Smart TV services.
There's also the work that the BBC did in improving loudspeakers including the BBC LS range. That work is one reason that British loudspeakers are still considered among the world's best designs.
By all means kick the BBC, but keep it factual.
Re: I thought I invented it.
That was the first market demographic: iPod users happy to buy one that could also make calls. But that's also where Nokia failed spectacularly - it was by nature phone-centric. Its models were phones that could also do something else. True smartphones are instead little computers that can also make phone calls.
In many ways the Treo/Palm and Windows CE anticipated it, but the latter especially tried to bring a "desktop" UI to tiny devices (and designed UIs around a stylus and a physical keyboard).
The iPod probably taught Apple that you need a proper "finger-based" UI for this kind of device - especially for the consumer market - and multitouch solved a lot of problems.
Re: I thought I invented it.
Shortly there-after I duct-taped 4 of them together and invented the tablet.
My version of it all is that the glory goes to iTunes for consumer friendly interface (ignore that concept Linux guys) and easy music purchases, the rest was natural progression and Chinese slave labor.
Smart phones and handheld computers were definitely driven by military dollars world wide but so was the internet. All that fact shows is that a smart balance of Capitalism & Socialism can go a long way.
Re: I thought I invented it.
>That was the first market demographics - iPod users happy to buy one who could also make calls. But that's also were Nokia failed spectacularly - it was by nature phone-centric. Its models where phones that could also make something else. True smartphones are instead little computers that can also make phone calls. In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard). the iPod probably taught Apple you need a proper "finger based" UI for this kind of devices - especially for the consumer market - and multitouch solved a lot of problems.
I don't know exactly why Nokia failed, but it wasn't because their smartphones were "phone-centric". The N900, N810 and N800 are to this day far more "little computers" than any other smartphone. Indeed, they ran a Debian Linux derivative with a themed Enlightenment-based desktop, which is pretty much off-the-shelf Linux software. While they didn't have multitouch, you could use your finger on the apps no problem. There was a stylus for when you wanted extra precision, though.
I could apt-get (with some sources tweaking) what I wanted outside of their apps. You could also compile and run proper Linux desktop apps on it, including openoffice (back in the day). It ran like a dog and didn't fit the "mobile-UI" they created, but it worked.
It also had a proper X server, so I could forward any phone app to my big PC if I didn't feel like messing about on a small touchscreen. To this day I miss this ability. To just connect via SSH to my phone over wifi, run an smartphone app, and have it appear on my desktop like any other app would.
It had xterm, it had Perl built in, it had Python (a lot of it was written in Python), you even could install a C toolchain on it and develop C code on it. People ported standard desktop UIs on it, and with a VNC/RDP server you could use it as a portable computer just fine (just connect to it using a thin client, or a borrowed PC).
I had written little scripts to batch send New years SMS to contacts, and even piped the output of "fortune" to a select few numbers just for kicks (the days with free SMS, and no chat apps). To this day I have no such power on my modern phones.
Damn, now that I think back, it really was a powerful piece of kit. I actually still miss the features *sniff*
And now that I think about it, I suspect they failed because their phones were too much "little computers" at a time when people wanted a phone. Few people (outside of geeks) wanted to fiddle with X forwarding, install SSH, script/program/modify, or otherwise customise their stuff.
Arguably the one weakest app on the N900 was the phone application itself, which was not open source, so could not be improved by the community, so much so people used to say it wasn't really a phone, rather it was a computer with a phone attached, which is exactly what I wanted.
Invention of iPhone
It wasn't even really an invention.
The BBC frequently "invents" tech history. They probably think MS and IBM created personal computing, when in fact they held it back for 10 years and destroyed innovative companies back then.
The only significant part was the touch interface by Fingerworks.
I was reading a BBC news web article and it was wrong too. It failed to emphasise that the real reason for the iPhone's success in 2007 was the deals with operators: cheap high-cap data packages, often bundled with the iPhone by the mobile operator.
This is nonsense:
"Those were the days, by the way, when phones were for making calls but all that was about to change."
Actually if you had a corporate account, you had a phone already with email, Apps, ability to read MS Office docs, web browser and even real Fax send/receive maybe 5 or 6 years before the iPhone. Apart from an easier touch interface, the pre-existing phones had more features like copy/paste, voice control and recording calls.
The revolution was ordinary consumers being able to have a smart phone AND afford the data. The actual HW was commodity stuff. I had the dev system for the Samsung SC6400 ARM CPU it used.
Why did other phones use resistive + stylus instead of capacitive finger touch?
- 1) Apple Newton and Palm: Handwriting & annotation. Needs high resolution.
- 2) Dominance of the MS CE interface (only usable with a high resolution stylus.)
Capacitive touch existed in the late 1980s, but the "holy grail" was handwriting recognition, not gesture control, though Xerox and IIS had both worked on it and gestures were defined before the 1990s. So the UK guy didn't invent anything.
Mine's the one with an N9110, and later an N9210, in the pocket. The first commercial smartphone appeared in 1998 and was crippled by high per-MByte or per-second (or both!) charging. Also in 2002, max speed was often 28K; in 2005 my landline was still 19.2K till I got Broadband, though I had 128K in the 1990s in the city (ISDN) before I moved.
Re: Invention of iPhone
The ground breaking elements of the iPhone were all to do with usability:
The fixed price data tariff was - to me - the biggest innovation. It may have been the hardest to do, as it involved entrenched network operators in a near monopoly. The hardware engineers only had to deal with the laws of physics.
The apple store made it easy to purchase and install apps and media. Suddenly you didn't have to be a geek or an innovator to make your phone do something useful or fun that the manufacturer didn't want to give to everyone.
The improved touch interface, the styling, and Apple's cachet all helped, and, I assume, fed into the efforts to persuade the network operators to give the average end user access to data without fear.
Re: Invention of iPhone
"Those were the days, by the way, when phones were for making calls but all that was about to change."
I remember having a Motorola A920 way back in 2003/2004 maybe, and on that I made video calls, went online, had a touch interface, ran 'apps', watched videos... in fact I could do everything the iPhone could do and more... BUT it was clunky and the screen was not large... the iPhone was a nice step forward in many ways but also a step back in functionality.
Re: Invention of iPhone
"The fixed price data tariff was - to me - the biggest innovation".
In my experience, the iPhone killed the "all you can eat" fixed price data tariffs.
I purchased an HTC Athena (T-Mobile Ameo) on a T-Mobile Web and Walk contract in Feb 2007. I had unlimited 3.5G access (including tethering) and fixed call minutes/texts.
When it was time to upgrade, I was told that iPhone 3G users were using too much data and that T-Mobile were no longer offering unlimited internet access.
For fun, I put "first smartphone" into Google. It wasn't Apple's. I think a BBC editor may have temporarily said that it was.
As for Apple inventing the first multitouch smartphone, though -
http://www.bbc.co.uk/news/technology-38552241 claims, with some credibility, that Apple's engineers wanted to put a keyboard on their phone. The Blackberry phone had a keyboard. But Steve Jobs wanted a phone that you could work with your finger (without a keyboard).
If you're only using one finger, you're not actually using multi touch?
Apple invented everything... They may have invented the iPhone but they DID NOT invent the "smartphone category" as that article suggests.
Microsoft had Smartphone 2002 and Pocket PC 2000 which were eventually merged into Windows Mobile and, interface aside, were vastly superior to the iPhone's iOS.
Devices were manufactured in a similar fashion to how android devices are now - MS provided the OS and firms like HTC, HP, Acer, Asus, Eten, Motorola made the hardware.
People rarely know how long HTC has been going as they used to OEM stuff for the networks - like the original Orange SPV (HTC Canary), a candybar style device running Microsoft Smartphone 2002. Or the original O2 XDA (HTC Wallaby), one of the first Pocket PC "phone edition" devices and, IIRC, the first touchscreen smartphone to be made by HTC.
Re: Apple invented everything...
Yup, I had Windows-based smartphones made by Qtek and HTC. My first smartphone was an Orange SPV M2000 (a Qtek 9090) three years before the first iPhone, and I had an O2 XDA after that, which in 2006 had GPS, MMS, and an SD card slot that held music for my train commute.
Now I'm a fan of the Note series, I had one capacitive screen smartphone without a stylus (HTC HD2), and missed it too much.
Re: Apple invented everything...
Lotaresco, I used to review a lot of the devices back in the day, as well as using them daily and modifying them (my phone history for ref: http://mowned.com/nedge2k ). Not once did they ever fail to make a phone call. Maybe the journalist was biased and made it up (Symbian was massively under threat at the time and all sorts of bullshit stories were flying about), maybe he had dodgy hardware, who knows.
Either way, it doesn't mean that the OS as a whole wasn't superior to what Nokia and Apple produced - because in every other way, it was.
Re: Apple invented everything...
"The weak spot for Microsoft was that it decided to run telephony in the application layer. This meant that any problem with the OS would result in telephony being lost....
Symbian provided a telephone which could function as a computer. The telephony was a low-level service and even if the OS crashed completely you could still make and receive calls. Apple adopted the same architecture, interface and telephony are low level services which are difficult to kill."
Sorry, but if iOS (or Symbian) crashes you cannot make calls. In what capacity were you evaluating phones in 2002? I cannot recall ever seeing a Windows Mobile blue screen. It would hang from time to time, but it never blue screened.
Seeing how much free advertising the BBC has given Apple over the years I doubt they will care.
And let's be honest here, the guy is kinda correct. We didn't just go from a dumb phone to a smart phone; there was a gradual move towards it as processing power increased and electronic packages got smaller. Had we gone from the old brick phones straight to an iPhone, then I would agree that they had invented something, like TNT.
Did Apple design the iPhone - Yes, of course.
Did Apple invent the Smart Phone - Nope.
IBM had a touch screen "smart" phone in 1992 that had a square screen with rounded corners.
What Apple did was put it into a great package with a great store behind it, and they made sure it worked - and worked well. I personally am not fond of Apple due to the huge price premium they demand and their overly locked-down ecosystems, but I will admit it was a wonderful product design.
Re: "opinion pieces don't need to be balanced"
"I am no fan of Apple, but to state that something was invented by the State because everyone involved went to state-funded school is a kindergarten-level of thinking that has no place in reasoned argument."
It's actually "Intellectual Yet Idiot" level thinking. Google it. You're right that arguments of this calibre have no place in reasoned argument, but this sort of quality thinking being shoved down people's throats by the media is why a hell of a lot of people are "fed up with experts".
I actually got one of these for my wife. It was awful. It almost felt like a beta product (and these are just a few of things I still remember):
- It had no kind of face sensor so it was common for the user to disconnect mid-call via their chin or cheek;
- Its autocorrect function was terrible - tiny little words above the word in question and an even tinier x to close the option;
- Inability to forward messages;
- No email support;
- No apps.
I think it's reasonably fair to say that it was the app store that really allowed the iPhone to become so successful, combined with the then Apple aura and mystique that Jobs was bringing to their products.
As to who invented this bit or that bit - I suggest you could pull most products released in the last 10-20 years and have the same kind of arguments.
But poor show on the beeb for their lack of fact checking on this one.
Re: Hmmm....iPhone 1.0
"...The original iPhone definitely has a proximity sensor. It is possible that your wife's phone was faulty or there was a software issue...."
Have an upvote - hers definitely never worked (and at the time I didn't even know it was supposed to be there), so yeah, probably faulty. I'd just assumed it didn't have one.
There is of course...
.. the fact that the iPhone wouldn't exist without its screen and all LCD displays owe their existence to (UK) government sponsored research. So whereas I agree that Mazzucato is guilty of rabidly promoting an incorrect hypothesis to the status of fact, there is this tiny kernel of truth.
The government was looking for a display technology for aircraft that was rugged, light, low powered and more reliable than CRTs. They also wanted to avoid the punitive royalties taken by RCA on CRTs. It was the work done in the 1960s by the Royal Radar Establishment at Malvern and George William Gray and his team at the University of Hull that led to modern LCDs. QinetiQ, which inherited RSRE's intellectual property rights, is still taking royalties on each display sold.
anonymous boring coward
Re: There is of course...
I had a calculator in the late 1970s with an LCD display. It had no resemblance to my phone's display.
Not even my first LCD screened laptop had much resemblance with a phone's display. That laptop had a colour display, in theory. If looked at at the right angle, in the correct light.
Innovation is ongoing, and not defined by some initial stumbling attempts.
Apple invented the iPhone...
... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.
And that's been true of pretty much every invention since someone discovered how to knap flint.
As to how much of a part the state had to play: a lot of things - especially in the IT and medical field - have been spun out of military research, though by the same token, much of this is done by private companies funded by government sources.
Equally, a lot of technology has been acquired through trade, acquisition or outright theft. In WW2, the United Kingdom gave the USA a lot of technology via the Tizard mission (and later, jet-engine technology was also licenced), and both Russia and the USA "acquired" a lot of rocket technology by picking over the bones of Germany's industrial infrastructure. Then, Russia spent the next 40 years stealing whatever nuclear/military technology it could from the USA - though I'm sure some things would have trickled the other way as well!
Anyway, if you trace any modern technology back far enough, there will have been state intervention. That shouldn't subtract in any way from the work done by companies and individuals who have produced something where the sum is greater than the parts...
Re: Apple invented the iPhone...
... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.
And that's been true of pretty much every invention since someone discovered how to knap flint.
Not so sure. Singer did a little more with respect to the sewing machine - his was the first that actually worked. Likewise Marconi was the first with a working wireless. Yes, both made extensive use of existing technology, but both clearly made that final inventive step; something that isn't so clear in the case of the examples you cite.
Equally, a lot of technology has been acquired through trade, acquisition or outright theft.
Don't disagree, although your analysis omitted Japanese and Chinese acquisition of 'western' technology and know-how...
Anyway, if you trace any modern technology back far enough, there will have been state intervention.
Interesting point, particularly when you consider the case of John Harrison, the inventor of the marine chronometer. Whilst the government did offer a financial reward it was very reluctant to actually pay anything out...
Apple invented the iPhone, but not the smartphone.
The smartphone had been shown before in several incarnations, including the "all touch screen", several years before Apple decided to dabble in smartphones. So no invention here.
As for the experience, again, nothing new. All thought of before, and in good part even implemented.
The key here is that Steve Jobs had the guts to force the idea of a useful smartphone, a gadget for the user first and a phone second, into the minds of the Telcos, and he was the one to get unlimited/big data bundles.
He identified correctly, as many had before but without the power to do anything about it, that the customers are the final users, not the telcos.
The rest of the smartphones were culled before birth by the telecom industry, which demanded certain "features" that nobody wanted but that lined their pockets nicely with minimum investment.
So I thank Steve Jobs for that and for being able to buy digital music.
Dec 26, 2016 | news.slashdot.org(freedos.org) 59
Posted by EditorDavid on Sunday December 25, 2016 @02:56PM from the long-term-projects dept.
Very long-time Slashdot reader Jim Hall -- part of GNOME's board of directors -- has a Christmas gift. Since 1994 he's been overseeing an open source project that maintains a replacement for the MS-DOS operating system, and he has just announced the release of the "updated, more modern" FreeDOS 1.2!
[Y]ou'll find a few nice surprises. FreeDOS 1.2 now makes it easier to connect to a network. And you can find more tools and games, and a few graphical desktop options including OpenGEM. But the first thing you'll probably notice is the all-new installer that makes it much easier to install FreeDOS. And after you install FreeDOS, try the FDIMPLES program to install new programs or to remove any you don't want. The official announcement is also available at the FreeDOS Project blog.
FreeDOS also lets you play classic DOS games like Doom, Wolfenstein 3D, Duke Nukem, and Jill of the Jungle -- and today marks a very special occasion, since it's been almost five years since the release of FreeDOS 1.1. "If you've followed FreeDOS, you know that we don't have a very fast release cycle," Jim writes on his blog. "We just don't need to; DOS isn't exactly a moving target anymore..."
Nov 23, 2016 | developers.slashdot.org(fedscoop.com) 116
Posted by BeauHD on Wednesday November 23, 2016 @02:00AM from the blast-from-the-past dept.
An anonymous reader quotes a report from FedScoop:
President Barack Obama awarded Presidential Medals of Freedom to two storied women in tech -- one posthumously to Grace Hopper, known as the "first lady of software," and one to programmer Margaret Hamilton. Hopper worked on the Harvard Mark I computer, and invented the first compiler.
"At age 37 and a full 15 pounds below military guidelines, the gutsy and colorful Grace joined the Navy and was sent to work on one of the first computers, Harvard's Mark 1," Obama said at the ceremony Tuesday. "She saw beyond the boundaries of the possible and invented the first compiler, which allowed programs to be written in regular language and then translated for computers to understand." Hopper followed her mother into mathematics, and earned a doctoral degree from Yale, Obama said.
She retired from the Navy as a rear admiral. "From cell phones to Cyber Command, we can thank Grace Hopper for opening programming up to millions more people, helping to usher in the Information Age and profoundly shaping our digital world," Obama said. Hamilton led the team that created the onboard flight software for NASA's Apollo command modules and lunar modules, according to a White House release.
"At this time software engineering wasn't even a field yet," Obama noted at the ceremony. "There were no textbooks to follow, so as Margaret says, 'there was no choice but to be pioneers.'" He added: "Luckily for us, Margaret never stopped pioneering. And she symbolizes that generation of unsung women who helped send humankind into space."
Sep 05, 2016 | economistsview.typepad.com
pgl: Monday, September 05, 2016 at 11:07 AM
Al Gore could not have invented the Internet since Steve Jobs is taking the bow for that. Actually Jobs started NeXT, which Apple bought in 1997 for a mere $427 million. NeXT had sold a couple of computer models that did not do so well, but the platform software allowed Apple to sell Web-based computers. BTW - the internet really began in the 1980's as something called Bitnet. Really clunky stuff back then, but new versions and applications followed. But yes - the Federal government in the 1990's was very supportive of the ICT revolution.
ilsm -> pgl... , Monday, September 05, 2016 at 11:59 AM
DARPA did most of it to keep researchers talking.
RC AKA Darryl, Ron -> pgl... , Monday, September 05, 2016 at 12:35 PM
https://en.wikipedia.org/wiki/ARPANET
The Advanced Research Projects Agency Network (ARPANET) was an early packet switching network and the first network to implement the protocol suite TCP/IP. Both technologies became the technical foundation of the Internet. ARPANET was initially funded by the Advanced Research Projects Agency (ARPA) of the United States Department of Defense.
The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory. The TCP/IP communications protocols were developed for ARPANET by computer scientists Robert Kahn and Vint Cerf, and incorporated concepts by Louis Pouzin for the French CYCLADES project.
As the project progressed, protocols for internetworking were developed by which multiple separate networks could be joined into a network of networks. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States from research and education organizations. ARPANET was decommissioned in 1990...
By mid-1968, Taylor had prepared a complete plan for a computer network, and, after ARPA's approval, a Request for Quotation (RFQ) was issued for 140 potential bidders. Most computer science companies regarded the ARPA-Taylor proposal as outlandish, and only twelve submitted bids to build a network; of the twelve, ARPA regarded only four as top-rank contractors. At year's end, ARPA considered only two contractors, and awarded the contract to build the network to BBN Technologies on 7 April 1969. The initial, seven-person BBN team were much aided by the technical specificity of their response to the ARPA RFQ, and thus quickly produced the first working system. This team was led by Frank Heart. The BBN-proposed network closely followed Taylor's ARPA plan: a network composed of small computers called Interface Message Processors (or IMPs), similar to the later concept of routers, that functioned as gateways interconnecting local resources. At each site, the IMPs performed store-and-forward packet switching functions, and were interconnected with leased lines via telecommunication data sets (modems), with initial data rates of 56kbit/s. The host computers were connected to the IMPs via custom serial communication interfaces. The system, including the hardware and the packet switching software, was designed and installed in nine months...
sanjait -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 01:09 PM
Though the thing we currently regard as "the Internet", including such innovations as the world wide web and the web browser, was developed as part of "the Gore bill" from 1991. In case anyone is trying to argue Gore didn't massively contribute to the development of the Internet, as he claimed.
pgl -> sanjait... , Monday, September 05, 2016 at 02:37 PM
So the American government helped pave the way for this ICT revolution. Steve Jobs figured out how Apple could make incredible amounts of income off of this. He also sheltered most of that income in a tax haven so Apple does not pay its share of taxes. And Tim Cook lectured the Senate in May of 2013 on why they should accept this. No wonder Senator Levin was so upset with Cook.
ilsm -> pgl... , Monday, September 05, 2016 at 04:29 PM
In 1980 DoD was a huge percent of the IC business, and a lot of the R&D was done at Bell Labs, some of that for telecom, not DoD. By 1995 or so DoD was shuttering its IC development. Which is a minor cause of why software is so hard for DoD; the chips are not under its control and change too fast.
ilsm -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 04:25 PM
About 20 years ago I conversed with a fellow who was in ARPANET at the beginning. We were getting into firewalls at the time with concerns for security (Hillary was recently elected to the senate) and he was shaking his head saying: "It was all developed for collaboration.... security gets in the way".
yeah thanks Carly ...
HP made bullet-proof products that would last forever..... I still buy HP workstation notebooks, especially now when I can get them for $100 on ebay ....
I sold HP products in the 1990s .... we had HP laserjet IIs that companies would run day & night .... virtually no maintenance ... when PCL5 came around then we had LJ IIIs .... and still companies would call for LJ I's, .... 100 pounds of invincible Printing ! .... this kind of product has no place in the World of Planned Obsolescence .... I'm currently running an 8510w, 8530w, 2530p, Dell 6420 quad i7, hp printers hp scanners, hp pavilion desktops, .... all for less than what a Laserjet II would have cost in 1994, Total.
Not My Real Name
I still have my HP 15C scientific calculator I bought in 1983 to get me through college for my engineering degree. There is nothing better than a hand held calculator that uses Reverse Polish Notation!
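For readers who never keyed one: an RPN calculator takes its operands first and the operator last, pushing intermediate results on a stack, so no parentheses or "=" key are needed. A minimal sketch in Python (an editorial illustration, not from any HP firmware; the function name and the four-operator table are assumptions for the example):

```python
# Minimal Reverse Polish Notation evaluator, the scheme HP calculators used:
# operands go on a stack; each operator pops two values and pushes the result.
def rpn(expr):
    stack = []
    ops = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
           "*": lambda a, b: a * b, "/": lambda a, b: a / b}
    for token in expr.split():
        if token in ops:
            b, a = stack.pop(), stack.pop()  # second operand is on top
            stack.append(ops[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

# (3 + 4) * 2, keyed the way it would be on an RPN calculator:
print(rpn("3 4 + 2 *"))  # 14.0
```

The stack is exactly what the HP machines exposed to the user as the X/Y/Z/T registers, which is why chained calculations needed no parentheses.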
HP used to make fantastic products. I remember getting their RPN calculators back in the 80's; built like tanks.
Then they decided to "add value" by removing more and more material from their consumer/"prosumer" products until they became unspeakably flimsy. They stopped holding things together with proper fastenings and starting hot melting/gluing it together, so if it died you had to cut it open to have any chance of fixing it.
I still have one of their Laserjet 4100 printers. I expect it to outlast anything they currently produce, and it must be going on 16+ years old now.
Fuck you, HP. You started selling shit and now you're eating through your seed corn. I just wish the "leaders" who did this to you had to pay some kind of penalty greater than getting $25M in a severance package.
HP12C. 31 years old and still humming. WTF happened?
+100. The path of HP is everything that is wrong about modern business models. I still have a 5MP laserjet (one of the first), still works great. Also have a number of 42S calculators.....my day-to-day workhorse and several spares. I don't think the present HP could even dream of making these products today.
How well will I profit, as a salesman, if I sell you something that works?
How valuable are you, as a customer in my database, if you never come back?
Confucious say "Buy another one, and if you can't afford it, f'n finance it!"
It's the growing trend. Look at appliances. Nothing works anymore.
Son of Loki
GE to cut Houston jobs as work moves overseas
" Yes we can! "
hey big brother.... if you are curious, there is a damn good Android emulator of the HP42S available (Free42). really it is so good that it made me relax about accumulating more spares. still not quite the same as a real calculator. (the 42S, by the way, is the modernization/simplification of the classic HP41, the real hardcore very-programmable, reconfigurable, hackable unit with all the plug-in modules that came out in the early 80s.)
Imagine working at HP and having to listen to Carly Fiorina bulldoze you...she is like a blow-torch...here are 4 minutes of Carly and Ralph Nader (if you can take it):
My husband has been a software architect for 30 years at the same company. Never before has he seen the sheer unadulterated panic in the executives. All indices are down and they are planning for the worst. Quality is being sacrificed for " just get some relatively functional piece of shit out the door we can sell". He is fighting because he has always produced a stellar product and refuses to have shit tied to his name ( 90% of competitor benchmarks fail against his projects). They can't afford to lay him off, but the first time in my life I see my husband want to quit...
I've been an engineer for 31 years - our managements's unspoken motto at the place I'm at (large company) is: "release it now, we'll put in the quality later". I try to put in as much as possible before the product is shoved out the door without killing myself doing it.
Do they even make test equipment anymore?
HP test and measurement was spun off many years ago as Agilent. The electronics part of Agilent was spun off as keysight late last year.
HP basically makes computer equipment (PCs, servers, Printers) and software. Part of the problem is that computer hardware has been commodized. Since PCs are cheap and frequent replacements are need, People just by the cheapest models, expecting to toss it in a couple of years and by a newer model (aka the Flat screen TV model). So there is no justification to use quality components. Same is become true with the Server market. Businesses have switched to virtualization and/or cloud systems. So instead of taking a boat load of time to rebuild a crashed server, the VM is just moved to another host.
HP has also adopted the Computer Associates business model (aka Borg). HP buys up new tech companies and sits on the tech and never improves it. It decays and gets replaced with a system from a competitor. It also has a habit of buying outdated tech companies that never generate the revenues HP thinks it will.
When Carly was CEO of HP, she instituted a draconian "pay for performance" plan. She ended up leaving with over $146 Million because she was smart enough not to specify "what type" of performance.
An era of leadership in computer technology has died, and there is no grave marker, not even a funeral ceremony or eulogy ... Hewlett-Packard, COMPAQ, Digital Equipment Corp, UNIVAC, Sperry-Rand, Data General, Tektronix, ZILOG, Advanced Micro Devices, Sun Microsystems, etc, etc, etc. So much change in so short a time, leaves your mind dizzy.
Dec 26, 2014 | Slashdot
Anonymous Coward on Friday December 26, 2014 @01:58PM (#48676489)
The physics does NOT define Computer Science (Score:5, Insightful)
The physics does NOT define Computer Science. Computer Science has nothing that depends on transistors, or tubes, or levers and gears.
Computers can be designed and built, and computing performed, at many different levels of physical abstraction.
You can do computer science all on paper for fucks sake.
Ever heard of this guy called Alan Turing?
Knuth is right, the ignorance, even among technical people, is astounding
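The commenter's point, that computation is independent of its physical substrate, can be made concrete on "paper". Below is a minimal editorial sketch of a Turing machine simulator; the rule table shown (a machine that adds 1 to a binary number) and all names are assumptions chosen for the illustration:

```python
# A computation needs no transistors: a tiny Turing machine, simulable by hand.
def run(tape, rules, state="scan", blank="_"):
    """Run a Turing machine until it reaches the 'halt' state."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        sym = tape.get(pos, blank)
        write, move, state = rules[(state, sym)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Rules to add 1 to a binary number: scan right to the end of the digits,
# then move left turning carry 1s into 0s until a 0 (or blank) becomes 1.
rules = {
    ("scan", "0"): ("0", "R", "scan"),
    ("scan", "1"): ("1", "R", "scan"),
    ("scan", "_"): ("_", "L", "add"),
    ("add",  "1"): ("0", "L", "add"),   # propagate the carry
    ("add",  "0"): ("1", "R", "halt"),
    ("add",  "_"): ("1", "R", "halt"),  # overflow: grow the tape leftwards
}

print(run("1011", rules))  # 1100
```

Nothing here depends on tubes, gears, or silicon; the same table can be executed with a pencil, which is Turing's 1936 point exactly.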
Dracos (107777) on Friday December 26, 2014 @12:59PM (#48676173)
Re:False Summary - Haigh Agrees with Knuth's Thesis (Score:5, Insightful)
there are indeed no good technical histories of computer science, and little prospect of any.
I see the posthumous reactions to Steve Jobs and Dennis Ritchie as indicators that Knuth is absolutely right.
- Jobs, who was essentially just a marketing asshole, gets every manner of fanfare commemorating his "world-changing" achievements.
- Ritchie on the other hand is almost completely ignored in the media, even though he is one of the giants upon whose shoulders Jobs undeservingly stood.
I bet anyone here would agree that co-authoring UNIX is a far more important event than being the iPod/iPhone taskmaster.
ripvlan (2609033) on Friday December 26, 2014 @11:53AM (#48675785)
But wait,there's more (Score:3, Insightful)
I returned to college several years ago after a 20 year hiatus (the first 6 years were my creative period). My first time around I studied what might be called pure Computer Science. A lot has happened in the industry after 20 years and I very much enjoyed conversations in class - esp with the perspective of the younger generation. I found it fascinating how many kids of today hoped to enter the gaming industry (my generation - Zork was popular when I was a kid and Myst was a breakout success on a new level). Kids today see blockbuster gaming as an almost make it rich experience - plus a "real world" job that sounds like fun.
But more interesting was the concepts of Computer Engineering vs Computer Science. What is Science vs Engineering? Are software "engineers" really scientists? Do they need to learn all this sciencey stuff in order to enter the business school? I attended a large semi-well-known University. Back in the '80s the CS department was "owned" by the school of business. Programming computers was thought to be the money maker - only business really used them with a strong overlap into engineering because computers were really big calculators. However it was a real CS curriculum with only 1 class for business majors. Fast forward a dozen years and CS is now part of the Engineering school (with Business on its own). The "kids" wondered why they needed to study Knuth et al when they were just going to be programming games. What about art? Story telling? They planned on using visual creative studio tools to create their works. Why all this science stuff? (this in a haptics class). Should a poet learn algorithms in order to operate MS-Word?
Since computers are ubiquitous they are used everywhere. I tell students to get a degree in what interests them - and learn how to use/program computers because...well..who doesn't use a computer? I used to program my TI calculator in highschool to pump out answers to physics & algebra questions (basic formulas).
Are those who program Excel Macros computer scientists? No. Computer Engineers? no. Business people solving real problems? Yes/maybe. The land is now wider. Many people don't care about the details of landing a man on the moon - but they like it when the velcro strap on their shoes holds properly. They receive entertainment via the Discovery Channel and get the dumbed down edition of all things "science."
When creating entertainment - it needs to be relatable to your target audience. The down and dirty details and technicalities interest only a few of us. My wife's eyes glaze over when I talk about some cool thing I'm working on. Retell it as saving the world and improving quality - she gets it (only to politely say I should go play with the kids -- but at least she was listening to that version of events).
I think that the dumbing down of history is ... well.. history. There was this thing called World War 2. The details I learned in grade school - lots of details. Each battle, names of important individuals. Today - lots of history has happened in the meantime. WW2 is now a bit dumbed down - still an important subject - but students still only have 8 grades in school with more material to cover.
My brain melts when I watch the Discovery Channel. I'm probably not the target audience. The details of historical science probably interest me. The history of Computing needs to be told like "The Social Network."
Virtucon (127420) on Friday December 26, 2014 @12:19PM (#48675913)
xororand (860319) writes: on Friday December 26, 2014 @02:07PM (#48676543)
it's everywhere (Score:3)
we've raised at least two generations of self obsessed, no attention-span kids who want instant gratification. Retards like Justin Bieber who today tweets that he bought a new plane. As the later generations grow into the workforce and into fields like journalism, history and computer science it's no small wonder they want to reduce everything down to one liners or soundbites. Pick your field because these kids started with censored cartoons and wound up with Sponge Bob. Shit, even the news is now brokered into short paragraphs that just say "this shit happened now onto the next.."
Screw that! Yeah I'm getting older so get the fuck off my lawn!
The Machine That Changed The World
There's a gem of a documentary about the history of computing before the web.
The Machine That Changed the World is the longest, most comprehensive documentary about the history of computing ever produced.
It's a whirlwind tour of computing before the Web, with brilliant archival footage and interviews with key players - several of whom passed away since the filming.
Episode 1 featured interviews with, among others:
Paul Ceruzzi (computer historian), Doron Swade (London Science Museum), Konrad Zuse (inventor of the first functional computer and high-level programming language, died in 1995), Kay Mauchly Antonelli (human computer in WWII and ENIAC programmer, died in 2006), Herman Goldstine (ENIAC developer, died in 2004), J. Presper Eckert (co-inventor of ENIAC, died in 1995), Maurice Wilkes (inventor of EDSAC), Donald Michie (Codebreaker at Bletchley Park)
luis_a_espinal (1810296) on Friday December 26, 2014 @03:49PM (#48676989) Homepage
There is a CS dumbing down going on (Score:2)
Donald Knuth Worried About the "Dumbing Down" of Computer Science History
Whether CS education is appropriate for all people who do computer-assisted technical work is largely irrelevant to me, since practical forces in real life simply settle that issue.
The problem I care about is one I have seen in CS for real. I've met quite a few CS grads who don't know who Knuth, Lamport, Liskov, Hoare, Tarjan, or Dijkstra are.
If you (the generic CS grad) do not know who they are, how the hell do you know about basic CS things like routing algorithms, pre- and post-conditions, data structures - you know, the very basic shit that is supposed to be the bread and butter of CS????
It is ok not to know these things and these people if you are a Computer Engineer, MIS or Network/Telecomm engineer (to a degree dependent on what your job expects from you.)
But if you are Computer Scientist, my God, this is like hiring an Electrical Engineer who doesn't know who Maxwell was. It does not inspire a lot of confidence, does it?
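The pre- and post-conditions the poster mentions are not exotic. A minimal sketch (my own illustration, not from the thread) shows Hoare-style contracts written as runtime assertions around a binary search:

```python
# Hypothetical illustration: pre- and post-conditions as runtime assertions.

def binary_search(xs, target):
    """Return an index i with xs[i] == target, or -1 if absent.

    Precondition: xs is sorted in non-decreasing order.
    Postcondition: result is -1, or xs[result] == target.
    """
    # Precondition check: the input must be sorted.
    assert all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1)), "precondition: sorted input"
    lo, hi = 0, len(xs) - 1
    result = -1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            result = mid
            break
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    # Postcondition check: either not found, or the index really holds target.
    assert result == -1 or xs[result] == target, "postcondition violated"
    return result

print(binary_search([1, 3, 5, 7, 9], 7))   # prints 3
print(binary_search([1, 3, 5, 7, 9], 4))   # prints -1
```

The assertions are exactly the contract Hoare logic makes precise: if the precondition holds on entry, the postcondition holds on exit.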
Aikiplayer (804569) on Friday December 26, 2014 @05:59PM (#48677657)
Re:Don't do what they did to math (Score:1)
Knuth did a nice job of articulating why he wants to look at the history of things at the beginning of the video. Those reasons might not resonate with you but he does have definite reasons for wanting technical histories (not social histories which pander to "the stupid") to be written.
Dec 26, 2014 | Communications of the ACM, January 2015
In his lecture Knuth worried that a "dismal trend" in historical work meant that "all we get nowadays is dumbed down" through the elimination of technical detail. According to Knuth "historians of math have always faced the fact that they won't be able to please everybody." He feels that other historians of science have succumbed to "the delusion that ... an ordinary person can understand physics ..."
I am going to tell you why Knuth's tears were misguided, or at least misdirected, but first let me stress that historians of computing deeply appreciate his conviction that our mission is of profound importance. Indeed, one distinguished historian of computing recently asked me what he could do to get flamed by Knuth. Knuth has been engaged for decades with history. This is not one of his passionate interests outside computer science, such as his project reading verses 3:16 of different books of the Bible. Knuth's core work on computer programming reflects a historical sensibility, as he tracks down the origin and development of algorithms and reconstructs the development of thought in specific areas. For years advertisements for IEEE Annals of the History of Computing, where Campbell-Kelly's paper was published, relied on a quote from Knuth that it was the only publication he read from cover to cover. With the freedom to choose a vital topic for a distinguished lecture Knuth chose to focus on history rather than one of his better-known scientific enthusiasms such as literate programming or his progress with The Art of Computer Programming.
... Distinguished computer scientists are prone to blur their own discipline, and in particular a few dozen elite programs, with the much broader field of computing. The tools and ideas produced by computer scientists underpin all areas of IT and make possible the work carried out by network technicians, business analysts, help desk workers, and Excel programmers. That does not make those workers computer scientists. The U.S. alone is estimated to have more than 10 million "information technology workers," which is about a hundred times more than the ACM's membership. Vint Cerf has warned in Communications that even the population of "professional programmers" dwarfs the association's membership.7 ACM's share of the IT workforce has been in decline for a half-century, despite efforts begun back in the 1960s and 1970s by leaders such as Walter Carlson and Herb Grosch to broaden its appeal.
... ... ...
So why is the history of computer science not being written in the volume it deserves, or the manner favored by Knuth? I am, at heart, a social historian of science and technology and so my analysis of the situation is grounded in disciplinary and institutional factors. Books of this kind would demand years of expert research and sell a few hundred copies. They would thus be authored by those not expected to support themselves with royalties, primarily academics.
... ... ...
The history of science is a kind of history, which is in turn part of the humanities. Some historians of science are specialists within broad history departments, and others work in specialized programs devoted to science studies or to the history of science, technology, or medicine. In both settings, historians judge the work of prospective colleagues by the standards of history, not those of computer science. There are no faculty jobs earmarked for scholars with doctoral training in the history of computing, still less in the history of computer science. The persistently brutal state of the humanities job market means that search committees can shortlist candidates precisely fitting whatever obscure combination of geographical area, time period, and methodological approaches are desired. So a bright young scholar aspiring to a career teaching and researching the history of computer science would need to appear to a humanities search committee as an exceptionally well qualified historian of the variety being sought (perhaps a specialist in gender studies or the history of capitalism) who happens to work on topics related to computing.
... ... ...
Thus the kind of historical work Knuth would like to read would have to be written by computer scientists themselves. Some disciplines support careers spent teaching history to their students and writing history for their practitioners. Knuth himself holds up the history of mathematics as an example of what the history of computing should be. It is possible to earn a Ph.D. within some mathematics departments by writing a historical thesis (euphemistically referred to as an "expository" approach). Such departments have also been known to hire, tenure, and promote scholars whose research is primarily historical. Likewise medical schools, law schools, and a few business schools have hired and trained historians. A friend involved in a history of medicine program recently told me that its Ph.D. students are helped to shape their work and market themselves differently depending on whether they are seeking jobs in medical schools or in history programs. In other words, some medical schools and mathematics departments have created a demand for scholars working on the history of their disciplines and in response a supply of such scholars has arisen.
As Knuth himself noted toward the end of his talk, computer science does not offer such possibilities. As far as I am aware no computer science department in the U.S. has ever hired as a faculty member someone who wrote a Ph.D. on a historical topic within computer science, still less someone with a Ph.D. in history. I am also not aware of anyone in the U.S. having been tenured or promoted within a computer science department on the basis of work on the history of computer science. Campbell-Kelly, now retired, did both things (earning his Ph.D. in computer science under Randell's direction) but he worked in England where reputable computer science departments have been more open to "fuzzy" topics than their American counterparts. Neither are the review processes and presentation formats at prestigious computer conferences well suited for the presentation of historical work. Nobody can reasonably expect to build a career within computer science by researching its history.
In its early days the history of computing was studied primarily by those who had already made their careers and could afford to indulge pursuing historical interests from tenured positions or to dabble after retirement. Despite some worthy initiatives, such as the efforts of the ACM History Committee to encourage historical projects, the impulse to write technical history has not spread widely among younger generations of distinguished and secure computer scientists.
... ... ...
Contrary both to Knuth's despair and to Campbell-Kelly's story of a march of progress away from technical history, some scholars with formal training in history and philosophy have been turning to topics with more direct connections to computer science over the past few years. Liesbeth De Mol and Maarten Bullynck have been working to engage the history and philosophy of mathematics with issues raised by early computing practice and to bring computer scientists into more contact with historical work.3 Working with like-minded colleagues, they helped to establish a new Commission for the History and Philosophy of Computing within the International Union of the History and Philosophy of Science. Edgar Daylight has been interviewing famous computer scientists, Knuth included, and weaving their remarks into fragments of a broader history of computer science.8 Matti Tedre has been working on the historical shaping of computer science and its development as a discipline.22 The history of Algol was a major focus of the recent European Science Foundation project Software for Europe. Algol, as its developers themselves have observed, was important not only for pioneering new capabilities such as recursive functions and block structures, but as a project bringing together a number of brilliant research-minded systems programmers from different countries at a time when computer science had yet to coalesce as a discipline.c Pierre Mounier-Kuhn has looked deeply into the institutional history of computer science in France and its relationship to the development of the computer industry.16
Stephanie Dick, who recently earned her Ph.D. from Harvard, has been exploring the history of artificial intelligence with close attention to technical aspects such as the development and significance of the linked list data structure.d Rebecca Slayton, another Harvard Ph.D., has written about the engagement of prominent computer scientists with the debate on the feasibility of the "Star Wars" missile defense system; her thesis has been published as an MIT Press book.20 At Princeton, Ksenia Tatarchenko recently completed a dissertation on the USSR's flagship Akademgorodok Computer Center and its relationship to Western computer science.21 British researcher Mark Priestley has written a deep and careful exploration of the history of computer architecture and its relationship to ideas about computation and logic.18 I have worked with Priestley to explore the history of ENIAC, looking in great detail at the functioning and development of what we believe to be the first modern computer program ever executed.9 Our research engaged with some of the earliest historical work on computing, including Knuth's own examination of John von Neumann's first sketch of a modern computer program10 and Campbell-Kelly's technical papers on early programming techniques.5
Nov 12, 2014 | Wikipedia,
The year 2014 will see release of numerous games, including new installments for some well-received franchises, such as Alone in the Dark, Assassin's Creed, Bayonetta, Borderlands, Call of Duty, Castlevania, Civilization, Dark Souls, Donkey Kong, Dragon Age, The Elder Scrolls, Elite, EverQuest, Far Cry, Final Fantasy, Forza Horizon, Infamous, Kinect Sports, Kirby, LittleBigPlanet, Mario Golf, Mario Kart, Metal Gear, MX vs. ATV, Ninja Gaiden, Persona, Pokémon, Professor Layton, Shantae, Sniper Elite, Sonic the Hedgehog, Strider Hiryu, Super Smash Bros., Tales, The Amazing Spider-Man, The Legend of Zelda, The Settlers, The Sims, Thief, Trials, Tropico, Wolfenstein and World of Warcraft.
May 17, 2013 | Scientific American
The biggest concern seems to be distraction. Google Glass looks like a pair of glasses, minus the lenses; it's just a band across your forehead, with a tiny screen mounted at the upper-right side. By tapping the earpiece and using spoken commands, you direct it to do smartphone-ish tasks, such as fielding a calendar alert and finding a nearby sushi restaurant.
Just what we need, right? People reading texts and watching movies while they drive and attaining new heights of rudeness by scanning their e-mail during face-to-face conversation.
Those are misguided concerns. When I finally got to try Google Glass, I realized that they don't put anything in front of your eyes. You still make eye contact when you talk. You still see the road ahead. The screen is so tiny, it doesn't block your normal vision.
Hilarious parody videos show people undergoing all kinds of injury while peering at the world through a screen cluttered with alerts and ads. But that's not quite how it works. You glance up now and then, exactly as you would check your phone. But because you don't have to look down and dig around in your pocket, you could argue that there's less distraction. By being so hands-free, it should be incredibly handy.
Stormport May 17, 2013, 12:42 PM
Although the fallibility of the human monkey is much trumpeted (e.g. "To Err is Human", NAS study of out of control corporate iatrogenic death in America), there is one area of human activity where we score an almost 100% reliability: the 'justifiability' of the sport shooting of the mentally ill, Big Pharma crazed, suicidal, or just simply angry folks amongst us by our local and national 'law enforcement' (LE). Well, not all are simply shooting for sport, many are the result of overwhelming panic (e.g. 17 or 57 bullet holes in the human target) of individuals who shouldn't be allowed to own a sharp pencil much less carry a gun with a license to kill. I have not bothered to look for the statistics presuming them to be either not available or obfuscated in some way but rely on my local newspaper for my almost daily story of a local police shooting.
With that said, one can only say YES! to Google Glass and its most obvious use, replacing the patrol car dash cam. Every uniformed 'law enforcement' officer and 'security guard' must be so equipped and required to have the camera on and recording at any time on 'duty' and not doing duty in the can or on some other 'personal' time. Consider it simply as having one's supervisor as a 'partner'. Same rules would apply. No sweat.
25 Aug 2014 | The Register
Obituary Former IBM CEO John Akers has died in Boston aged 79.
Big Blue announced Akers' passing here, due to a stroke according to Bloomberg. After a stint as a US Navy pilot, the IBM obit states, Akers joined the company in 1960. His 33-year career with IBM culminated in his appointment as its sixth CEO in 1985, following three years as president.
The top job became something of a poisoned chalice for Akers: the IBM PC project was green-lit during his tenure, and the industry spawned by this computer would cannibalize Big Blue's mainframe revenue, which was already under attack from minicomputers.
His career was founded on the success of the iconic System/360 and System/370 iron, but eventually fell victim to one of the first big disruptions the industry experienced.
He was eventually replaced by Lou Gerstner (as Bloomberg notes, the first CEO to be appointed from outside IBM).
To Gerstner fell the task of reversing the losses IBM was racking up – US$7.8 billion over two years – by embarking on a top-down restructure to shave US$7 billion in costs.
According to retired IBM executive Nicholas Donofrio, Akers took a strong interest in nursing the behind-schedule RS6000 Unix workstation project through to fruition in the late 1980s:
"he asked what additional resources I needed and agreed to meet with me monthly to ensure we made the new schedule".
Apr 21, 2014 forbes.com
It really is a great idea.
A pair of glasses that can project information or perform actions on a virtual screen in front of you about pretty much anything, and all you have to do is ask. Driving directions. LinkedIn connections. Order history. A photo. A video. A phone call. An email. The options seem limitless. And they are. Google Glass really is a great idea. The technology can and probably will change the world. So how did Google screw it up?
Yes, screw it up. Since first announcing the product in 2012, Google Glass has been subject to ridicule and even violence. It's become a symbol of the anti-tech, anti-Silicon Valley crowd. Surveys like this one demonstrate the American public's general dislike and distrust of Google Glass. The product has not yet spawned an industry. It has not generated revenues for Google. It's become a frequent joke on late night TV and a target for bloggers and comedians around the country. The word "glasshole" has now risen to the same prominence as "selfie" and "twerk." Yes, it's getting attention. But only as a creepy gimmick which, I'm sure, is not the kind of attention that Google intended when they initially introduced it. As cool as it is, let's admit that Google Glass will go down in the annals of bad product launches. And it will do so for the following reasons.
Apr 10, 2014
For a limited time starting Tuesday, Google will make the wearable device available to more than just the select group of users such as apps developers in its Glass Explorer program.
In a blogpost, Google did not say how many pairs it would sell, just that the quantity would be limited.
"Every day we get requests from those of you who haven't found a way into the program yet, and we want your feedback too," the company said on a Thursday blogpost.
"That's why next Tuesday, April 15th, we'll be trying our latest and biggest Explorer Program expansion experiment to date. We'll be allowing anyone in the U.S. to become an Explorer by purchasing Glass."
Many tech pundits expect wearable devices to go mainstream this year, extending smartphone and tablet capabilities to gadgets worn on the body, from watches to headsets. Google has run campaigns in the past to drum up public involvement, including inviting people to tweet under the hashtag #ifihadglass for a chance to buy a pair of the glasses.
Google Glass has raised privacy concerns, prompting some legislators to propose bans on the gadget.
Apr 08, 2014 | The Register
IBM's System 360 mainframe, celebrating its 50th anniversary on Monday, was more than just another computer. The S/360 changed IBM just as it changed computing and the technology industry.
The digital computers that were to become known as mainframes were already being sold by companies during the 1950s and 1960s - so the S/360 wasn't a first.
Where the S/360 was different was that it introduced a brand-new way of thinking about how computers could and should be built and used.
The S/360 made computing affordable and practical - relatively speaking. We're not talking the personal computer revolution of the 1980s, but it was a step.
The secret was a modern system: a new architecture and design that allowed the manufacturer - IBM - to churn out S/360s at relatively low cost. This had the more important effect of turning mainframes into a scalable and profitable business for IBM, thereby creating a mass market.
The S/360 democratized computing, taking it out of the hands of government and universities and putting its power in the hands of many ordinary businesses.
The birth of IBM's mainframe was made all the more remarkable given that making the machine required not just a new way of thinking but a new way of manufacturing. The S/360 produced a corporate and a mental restructuring of IBM, turning it into the computing giant we know today.
The S/360 also introduced new technologies, such as IBM's Solid Logic Technology (SLT) in 1964 that meant a faster and a much smaller machine than what was coming from the competition of the time.
Big Blue introduced new concepts and de facto standards that are still with us now: virtualisation - today the toast of cloud computing on the PC and distributed x86 servers that succeeded the mainframe - and the 8-bit byte in place of the 6-bit byte.
The S/360 helped IBM see off a rising tide of competitors such that by the 1970s, rivals were dismissively known as "the BUNCH" or the dwarves. Success was a mixed blessing for IBM, which got in trouble with US regulators for being "too" successful and spent a decade fighting a government anti-trust law suit over the mainframe business.
The legacy of the S/360 is with us today, outside of IBM and the technology sector.
S/360 I knew you well
The S/390 name is a hint to its lineage: S/360 -> S/370 -> S/390 (I'm not sure what happened to the S/380). Having made a huge jump with the S/360, they tried to do the same thing in the 1970s with the Future Systems project. This turned out to be a huge flop: lots of money spent on creating new ideas that would leapfrog the competition, but it ultimately failed. Some of the ideas emerged in the System/38 and then the original AS/400s, like having a queryable database as the file system rather than what we are used to now.
The link to NASA with the S/360 is explicit in JES2 (Job Entry Subsystem 2), the element of the OS that controls batch jobs and the like. Messages from JES2 start with the prefix HASP, which stands for Houston Automatic Spooling Program.
As a side note, CICS is developed at Hursley Park in Hampshire, though it wasn't started there. CICS system messages start with DFH, which allegedly stands for Denver Foot Hills - a hint to its physical origins; IBM swapped the development sites for CICS and PL/I long ago.
I've not touched an IBM mainframe for nearly twenty years, and it worries me that I have this information still in my head. I need to lie down!
Re: S/360 I knew you well
I have great memories of being a Computer Operator on a 360/40. They were amazingly capable and interesting machines (and peripherals).
Re: S/360 I knew you well
ESA is the bit that you are missing - the whole extended address thing, data spaces,hyperspaces and cross-memory extensions.
Fantastic machines though - I learned everything I know about computing from Principles of Operation and the source code for VM/SP - they used to ship you all that, and send you the listings for everything else on microfiche. I almost feel sorry for the younger generations that they will never see a proper machine room with the ECL water-cooled monsters and attendant farms of DASD and tape drives. After the 9750's came along they sort of looked like very groovy American fridge-freezers.
Mind you, I can get better mippage on my Thinkpad with Hercules than the 3090 I worked with back in the 80's, but I couldn't run a UK-wide distribution system, with thousands of concurrent users, on it.
Nice article, BTW, and an upvote for the post mentioning The Mythical Man Month; utterly and reliably true.
Happy birthday IBM Mainframe, and thanks for keeping me in gainful employment and beer for 30 years!
Re: S/360 I knew you well
I started programming on the IBM 360/67 and have programmed several IBM mainframe computers. One of the reasons for their ability to handle large amounts of data is that these machines communicate with terminals in EBCDIC characters, an 8-bit character code that fills the same role as ASCII. It took very few of these characters to drive the 3270 display terminals, while modern x86 computers use a graphical display and need a lot of data transmitted to paint a screen. I worked for a company that had an IBM 370-168 with VM running both OS and MVS.
We had over 1,500 terminals connected to this mainframe across 4 states. IBM had envisioned VM/CMS for that role; CICS was only supposed to be a temporary solution for handling display terminals, but it became the mainstay in many shops.
Our shop had over 50 3330 300 meg disk drives online with at least 15 tape units. These machines are in use today, in part, because the cost of converting to X86 is prohibitive.
On these old 370 CICS systems, the screen definitions were separate from the program. JCL (Job Control Language) was used to initiate jobs, but unlike modern batch files, it would attach resources such as a disk drive or tape unit to the program. This is totally foreign to any modern OS.
Linux or Unix can come close but MS products are totally different.
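A hedged aside (my own sketch, not from the comment above): Python's standard codecs include EBCDIC code pages, so the point about EBCDIC playing the same one-byte-per-character role as ASCII, and the rough data-volume gap between a 3270 text screen and a bitmap display, can both be checked directly. The screen dimensions below are illustrative assumptions:

```python
# EBCDIC vs ASCII: same one-byte-per-character role, different code points.
text = "HELLO 370"
ascii_bytes = text.encode("ascii")
ebcdic_bytes = text.encode("cp037")   # cp037: EBCDIC (US/Canada), in Python's stdlib

print(ascii_bytes.hex())    # 48454c4c4f20333730
print(ebcdic_bytes.hex())   # c8c5d3d3d640f3f7f0 - letters live in the C0-D6 range

# Why a 3270 terminal was cheap to drive: a full 24x80 text screen is under
# 2 KB of character data, while even a modest bitmap framebuffer runs to
# megabytes (dimensions here are illustrative assumptions).
text_screen = 24 * 80              # 1,920 bytes, one per character cell
bitmap_screen = 1024 * 768 * 3     # ~2.4 MB at 24-bit colour
print(bitmap_screen // text_screen)  # the bitmap costs over 1,000x more bytes
```

This is of course a simplification - the 3270 data stream carried orders and attributes as well as characters - but the orders-of-magnitude gap the commenter describes is real.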
Re: S/360 I knew you well
S/380 was the "future systems program" that was cut down to the S/38 mini.
HASP was the original "grid scheduler" in Houston running on a dedicated mainframe scheduling work to the other 23 mainframes under the bridge.. I nearly wet myself with laughter reading Data-Synapse documentation and their "invention" of a job-control-language. 40 years ago HASP was doing Map/Reduce to process data faster than a tape-drive could handle.
If we don't learn the lessons of history, we are destined to IEFBR14!
Come and look at this!
As a senior IT bod said to me one time, when I was doing some work for a mobile phone outfit.
"it's an IBM engineer getting his hands dirty".
And so it was: a hardware guy, with his sleeves rolled up and blood and grime on his hands, replacing a failed board in an IBM mainframe.
The reason it was so noteworthy, even in the early 90's, was that it was such a rare occurrence. That reliability - they didn't blow a gasket if you looked at them wrong - was probably one of the major selling points of IBM computers (the other one, with just as much traction, being the ability to do a fork-lift upgrade in a weekend and know it will work).
The reliability and compatibility across ranges is why people choose this kit. It may be arcane, old-fashioned, expensive and untrendy - but it keeps on running.
The other major legacy of OS/360 was, of course, The Mythical Man-Month, whose readership is still the most reliable way of telling the professional IT managers from the wannabes who have only buzzwords as a knowledge base.
Re: Come and look at this!
They were bloody good guys from IBM!
I started off working on mainframes around 1989, as a graveyard-shift "tape monkey" loading tapes for batch jobs. My first solo job was as a Unix admin on a set of RS/6000 boxes; I once blew out the firmware and a test box wouldn't boot.
I called out an IBM engineer after I completely "futzed" the box. He came out and spent about 2 hours with me teaching me how to select and load the correct firmware. He then spent another 30 mins checking my production system with me and even left me his phone number so I could call him directly if I needed help when I did the production box.
I did the prod box with no issues because of the confidence I got and the time he spent with me. Cheers!
Re: 16 bit byte?
The typo must be fixed, the article says 6-bit now. The following is for those who have no idea what we are talking about.
Generally, machines prior to the S/360 were 6-bit if character-oriented or 36-bit if word-oriented. The S/360 was the first IBM architecture (thank you, Drs. Brooks, Blaauw and Amdahl) to provide both data types with appropriate instructions, to include a "full" character set (256 characters instead of 64), and to provide a concise decimal format (2 digits in one character position instead of 1). 8 bits was chosen as the "character" length.
It did mean a lot of Fortran code had to be reworked to deal with 32-bit single precision or 32-bit integers instead of the previous 36-bit.
If you think the old ways are gone, have a look at the data formats for the Unisys 2200.
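The "2 digits in one character position" above refers to packed decimal. A minimal sketch (my own, assuming the usual S/360 packed-decimal layout, with the sign in the low nibble of the final byte: 0xC for plus, 0xD for minus):

```python
# Hedged sketch of S/360-style packed decimal: two decimal digits per byte,
# sign in the final nibble. This is what "2 digits in one character position"
# buys over one zoned/character digit per byte.

def pack_decimal(n: int) -> bytes:
    sign = 0xD if n < 0 else 0xC          # conventional plus/minus sign nibbles
    digits = str(abs(n))
    if len(digits) % 2 == 0:              # pad so digits + sign nibble fill whole bytes
        digits = "0" + digits
    nibbles = [int(d) for d in digits] + [sign]
    # Pair up nibbles, high nibble first, into bytes.
    return bytes((nibbles[i] << 4) | nibbles[i + 1] for i in range(0, len(nibbles), 2))

print(pack_decimal(12345).hex())   # 12345c - three bytes instead of five characters
print(pack_decimal(-42).hex())     # 042d
```

The real hardware added decimal arithmetic instructions (AP, SP, MP, etc.) that operated on this format directly, which is why it survived in commercial code for decades.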
Came with the S/370, not the S/360, which didn't even have virtual memory.
The 360/168 had it, but it was a rare beast.
Nope. CP/67 was the forerunner of IBM's VM. Ran on S/360
S/360 Model 67 running CP67 (with CMS, which became VM) or the Michigan Terminal System. The Model 67 was a Model 65 with a DAT box to support paging/segmentation, but CP67 only ever supported paging (I think; it's been a few years).
The 360/168 had a proper MMU and thus supported virtual memory. I interviewed at Bradford University, where they had a 360/168 on which they were doing all sorts of things that IBM hadn't contemplated (like using conventional glass teletypes hooked to minicomputers so they could emulate the page-based - and more expensive - IBM terminals).
I didn't get to use an IBM mainframe in anger until the 3090/600 was available (where DEC told the company that they'd need a 96 VAX cluster and IBM said that one 3090/600J would do the same task). At the time we were using VM/TSO and SQL/DS, and were hitting 16MB memory size limits.
Re: Virtualisation @Steve Todd
I'm not sure that the 360/168 was a real model. The Wikipedia article does not think so either.
As far as I recall, the only /168 model was the 370/168, one of which was at Newcastle University in the UK, serving other Universities in the north-east of the UK, including Durham (where I was) and Edinburgh.
They also still had a 360/65, and one of the exercises we had to do was write some JCL in OS/360. The 370 ran MTS rather than an IBM OS.
You're right. The 360/67 was the first VM machine - I had the privilege of trying it out a few times. It was a bit slow though. The first version of CP/67 only supported 2 terminals, as I recall... The VM capability was impressive. You could treat files as though they were in real memory - no explicit I/O necessary.
This was a big factor in the profitability of mainframes. There was no such thing as an 'industry-standard' interface - either physical or logical. If you needed to replace a memory module or disk drive, you had no option* but to buy a new one from IBM and pay one of their engineers to install it (and your system would probably be 'down' for as long as this operation took). So nearly everyone took out a maintenance contract, which could easily run to an annual 10-20% of the list price. Purchase prices could be heavily discounted (depending on how desperate your salesperson was) - maintenance charges almost never were.
* There actually were a few IBM 'plug-compatible' manufacturers - Amdahl and Fujitsu. But even then you couldn't mix and match components - you could only buy a complete system from Amdahl, and then pay their maintenance charges. And since IBM had total control over the interface specs and could change them at will in new models, PCMs were generally playing catch-up.
So true re the service costs, but "Field Engineering" was a profit centre, and a big one at that. Not true regarding having to buy "complete" systems for compatibility. In the 70's I had a room full of CDC disks on a Model 40, bought because they were cheaper and had a faster linear-motor positioner (the thing that moved the heads), while the real 2311's used hydraulic positioners. Bad day when there was a puddle of oil under the 2311.
"This was a big factor in the profitability of mainframes. There was no such thing as an 'industry-standard' interface - either physical or logical. If you needed to replace a memory module or disk drive, you had no option* but to buy a new one from IBM and pay one of their engineers to install it (and your system would probably be 'down' for as long as this operation took). So nearly everyone took out a maintenance contract, which could easily run to an annual 10-20% of the list price. Purchase prices could be heavily discounted (depending on how desperate your salesperson was) - maintenance charges almost never were."
Back in the day one of the Scheduler software suppliers made a shed load of money (the SW was $250k a pop) by making new jobs start a lot faster and letting shops put back their memory upgrades by a year or two.
Mainframe memory was expensive.
Now owned by CA (along with many things mainframe) and so probably gone to s**t.
Done with some frequency. In the DoD agency where I worked we had mostly Memorex disks as I remember it, along with various non-IBM as well as IBM tape drives, and later got an STK tape library. Occasionally there were reports of problems where the different manufacturers' CEs would try to shift blame before getting down to the fix.
I particularly remember rooting around in a Syncsort core dump that ran to a couple of cubic feet from a problem eventually tracked down to firmware in a Memorex controller. This highlighted the enormous I/O capacity of these systems, something that seems to have been overlooked in the article. The dump showed mainly long sequences of chained channel programs that allowed the mainframe to transfer huge amounts of data by executing a single instruction to the channel processors, and perform other possibly useful work while awaiting completion of the asynchronous I/O.
@ChrisMiller - The IBM I/O channel was so well-specified that it was pretty much a standard. Look at what the Systems Concepts guys did - a Dec10 I/O and memory bus to IBM channel converter. Had one of those in the Imperial HENP group so we could use IBM 6250bpi drives as DEC were late to market with them. And the DEC 1600 bpi drives were horribly unreliable.
The IBM drives were awesome. It was always amusing explaining to IBM techs why they couldn't run online diags. On the rare occasions when they needed fixing.
It all comes flooding back.
A long CCW chain, some of which are the equivalent of NOP in channel talk (where did I put that green card?) with a TIC (Transfer in Channel, think branch) at the bottom of the chain back to the top. The idea was to take an interrupt (PCI) on some CCW in the chain and get back to convert the NOPs to real CCWs to continue the chain without ending it. Certainly the way the page pool was handled in CP67.
And I too remember the dumps coming on trollies. There was software to analyse a dump tape but that name is now long gone (as was the origin of most of the problems in the dumps). Those were the days I could not just add and subtract in hex but multiply as well.
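The self-modifying CCW loop described in the comment above can be sketched in C. This is a hypothetical illustration, not real channel code: the field layout is simplified (a real S/360 CCW packs a 24-bit data address and flags into 8 bytes), and the names are invented for the sketch. It shows the shape of the trick: a ring of command-chained NOPs with a TIC at the bottom branching back to the top, and a PCI flag so the CPU gets an interrupt in time to overwrite NOPs with real commands before the channel reaches them.

```c
#include <stdint.h>

/* Simplified stand-in for a S/360 Channel Command Word (CCW).
   Real CCWs are 8 bytes with a 24-bit data address; this layout
   is loosened for readability. */
enum {
    CCW_CMD_NOP  = 0x03,  /* control no-op */
    CCW_CMD_TIC  = 0x08,  /* Transfer In Channel: branch to another CCW */
};
enum {
    CCW_FLAG_CC  = 0x40,  /* command chaining: continue with next CCW */
    CCW_FLAG_PCI = 0x08,  /* program-controlled interruption */
};

typedef struct {
    uint8_t  cmd;
    uint32_t addr;        /* data address, or TIC target (index here) */
    uint8_t  flags;
    uint16_t count;
} CCW;

/* Build the never-ending chain: n-1 chained NOPs, then a TIC back to
   the top.  The PCI on the first CCW gives the CPU its cue to convert
   upcoming NOPs into real commands without ever ending the chain. */
void build_ring(CCW *chain, int n) {
    for (int i = 0; i < n - 1; i++) {
        chain[i].cmd   = CCW_CMD_NOP;
        chain[i].flags = CCW_FLAG_CC | (i == 0 ? CCW_FLAG_PCI : 0);
    }
    chain[n - 1].cmd  = CCW_CMD_TIC;
    chain[n - 1].addr = 0;  /* branch target: top of the chain */
}
```

The point of the design, as the comment says, is that one Start I/O instruction hands the whole (potentially endless) program to the channel processor, leaving the CPU free.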
The Mythical Man-Month
Fred Brooks' seminal work on the management of large software projects, was written after he managed the design of OS/360. If you can get around the mentions of secretaries, typed meeting notes and keypunches, it's required reading for anyone who manages a software project. Come to think of it...*any* engineering project. I've recommended it to several people and been thanked for it.
// Real Computers have switches and lights...
The Mythical Man-Month
The key concepts of this book are as relevant today as they were back in the 60s and 70s - it is still oft quoted ("there are no silver bullets" being one I've heard recently). Unfortunately fewer and fewer people have heard of this book these days and even fewer have read it, even in project management circles.
Was IBM ever cheaper?
I've been in IT since the 1970s.
My understanding from the guys who were old timers when I started was the big thing with the 360 was the standardized Op Codes that would remain the same from model to model, with enhancements, but never would an Op Code be withdrawn.
The beauty of IBM s/360 and s/370 was you had model independence. The promise was made, and the promise was kept, that after re-writing your programs in BAL (360's Basic Assembler Language) you'd never have to re-code your assembler programs ever again.
Also the relocating loader and method of link-editing meant you didn't have to re-assemble programs to run them on a different computer. Either they would simply run as is, or they would run after being re-linked. (When I started, linking might take 5 minutes, where re-assembling might take 4 hours, for one program. I seem to recall talk of assemblies taking all day in the 1960s.)
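The relocation step that made "re-link instead of re-assemble" possible can be sketched as follows. This is a toy model, not the OS/360 format: in real OS/360 the link editor and loader work from RLD (relocation dictionary) entries in the object deck, and the structure below invents a minimal equivalent for illustration.

```c
#include <stdint.h>
#include <stddef.h>

/* Toy stand-in for an OS/360 RLD entry: it records where in the
   loaded image an address constant sits. */
typedef struct {
    size_t offset;   /* word index of an address constant in the image */
} RldEntry;

/* At load time, every address constant (assembled relative to origin 0)
   is adjusted by the actual load address.  The code itself is untouched,
   which is why no re-assembly is needed. */
void relocate(uint32_t *image, const RldEntry *rld, size_t n,
              uint32_t load_address) {
    for (size_t i = 0; i < n; i++)
        image[rld[i].offset] += load_address;
}
```

A program loaded at a different address on a different machine gets the same fix-up pass with a different `load_address`, and runs unchanged.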
I wasn't there in the 1950s and 60s, but I don't recall any one ever boasting at how 360s or 370s were cheaper than competitors.
IBM products were always the most expensive, easily the most expensive, at least in Canada.
But maybe in the UK it was different. After all, the UK had its own native computer manufacturers that IBM had to squeeze out, despite patriotism still being a thing in business at the time.
Cut my programming teeth on S/390 TSO architecture
We were developing CAD/CAM programs in this environment starting in the early eighties, because it's what was available then, based on use of this system for stock control in a large electronics manufacturing environment. We fairly soon moved this Fortran code onto smaller machines, DEC/VAX minicomputers and early Apollo workstations. We even had an early IBM-PC in the development lab, but this was more a curiosity than something we could do much real work on initially. The Unix based Apollo and early Sun workstations were much closer to later PCs once these acquired similar amounts of memory, X-Windows like GUIs and more respectable graphics and storage capabilities, and multi-user operating systems.
Ahh S/360 I knew thee well
Cut my programming teeth on OS/390 assembler (TPF) at Galileo - one of Amadeus' competitors.
I interviewed for Amadeus's initial project for moving off of S/390 in 1999, and it had been planned for at least a year or two before that - now that was a long-term project!
Re: Ahh S/360 I knew thee well
There are people who worked on Galileo still alive? And ACP/TPF still lives, as zTPF? I remember a headhunter chasing me in the early 80's for a job in Oz, Qantas looking for ACP/TPF coders, $80k US, very tempting.
You can do everything in 2k segments of BAL.
No mention of microcode?
Unless I missed it, there was no reference to microcode which was specific to each individual model of the S/360 and S/370 ranges, at least, and provided the 'common interface' for IBM Assembler op-codes. It is the rough equivalent of PC firmware. It was documented in thick A3 black folders held in two-layer trolleys (most of which held circuit diagrams, and other engineering amusements), and was interesting to read (if not understand). There you could see that the IBM Assembler op-codes each translated into tens or hundreds of microcode machine instructions. Even 0700, NO-OP, got expanded into surprisingly many machine instructions.
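The idea that one architectural op-code expands into many micro-steps can be illustrated with a deliberately toy example. This is not IBM's microcode (names, structure, and the bit-serial algorithm are all invented for the sketch); it only shows the trade the comments below describe: a low-end model could implement in a microcode loop what a high-end model did in full hardware, while the assembler-level instruction stayed identical.

```c
#include <stdint.h>

/* Records how many "micro-steps" the toy microcode executed. */
typedef struct { int micro_steps; } MicroTrace;

/* Toy "microcoded" ADD: instead of a full hardware adder, loop on
   carry propagation -- sum-without-carry, compute carries, shift --
   until no carries remain.  One architectural ADD thus costs many
   micro-instructions, as the microcode listings described above. */
uint32_t microcoded_add(uint32_t a, uint32_t b, MicroTrace *t) {
    t->micro_steps = 0;
    while (b != 0) {
        uint32_t carry = a & b;   /* micro-step: find carry bits */
        a = a ^ b;                /* micro-step: sum without carry */
        b = carry << 1;          /* micro-step: propagate carries */
        t->micro_steps += 3;
    }
    return a;
}
```

The customer-visible instruction set is the same either way; only the number of hidden micro-steps (and the price of the machine) differs.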
John Smith 19
Re: No mention of microcode?
"I first met microcode by writing a routine to do addition for my company's s/370. Oddly, they wouldn't let me try it out on the production system :-)"
I did not know the microcode store was writeable.
Microcode was a core (no pun intended) feature of the S/360/370/390/4300/z architecture.
It allowed IBM to trade actual hardware (e.g. a full-spec hardware multiplier) for partial (part-word or single-word) or completely software-based (microcode loop) implementations, depending on the machine's spec (and the customer's pocket), without needing a recompile, as at the assembler level it was the same instruction.
I'd guess hacking the microcode would call for exceptional bravery on a production machine.
Arnaut the less
Re: No mention of microcode? - floppy disk
Someone will doubtless correct me, but as I understood it the floppy was invented as a way of loading the microcode into the mainframe CPU.
The rule of thumb in use (from Brooks's Mythical Man Month, as I remember) is around 5 debugged lines of code per programmer per day, pretty much irrespective of the language. And although the end code might have been a million lines, some of it probably needed to be written several times: another memorable Brooks item about large programming projects is "plan to throw one away, because you will."
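The rule of thumb above turns into a useful back-of-envelope staffing estimate. The figures below (250 workdays a year, a million-line system) are illustrative, matching the numbers mentioned in the surrounding comments rather than any official source.

```c
/* Back-of-envelope from Brooks's rule of thumb: total effort in
   person-years for a system of `lines` debugged lines, at
   `loc_per_day` lines per programmer per day. */
unsigned person_years(unsigned lines, unsigned loc_per_day,
                      unsigned workdays_per_year) {
    return lines / (loc_per_day * workdays_per_year);
}
```

At 5 lines/day and 250 workdays/year, a million-line system comes out at 800 person-years, which is why "just add programmers" becomes the central fallacy the book dismantles.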
Programming systems product
The main reason for what appears, at first sight, low productivity is spelled out in "The Mythical Man-Month". Brooks freely concedes that anyone who has just learned to program would expect to be many times more productive than his huge crew of seasoned professionals. Then he explains, with the aid of a diagram divided into four quadrants.
Top left, we have the simple program. When a program gets big and complex enough, it becomes a programming system, which takes a team to write it rather than a single individual. And that introduces many extra time-consuming aspects and much overhead.
Going the other way, writing a simple program is far easier than creating a product with software at its core. Something that will be sold as a commercial product must be tested seven ways from Sunday, made as maintainable and extensible as possible, be supplemented with manuals, training courses, and technical support services, etc.
Finally, put the two together and you get the programming systems product, which Brooks estimates costs nine times as much to create as an equivalent simple program - a factor of three along each axis.
"Why won't you DIE?"
I suppose that witty, but utterly inappropriate, heading was added by an editor; Gavin knows better.
If anyone is in doubt, the answer would be the same as for other elderly technology such as houses, roads, clothing, cars, aeroplanes, radio, TV, etc. Namely, it works - and after 50 years of widespread practical use, it has been refined so that it now works *bloody well*. In extreme contrast to many more recent examples of computing innovation, I may add.
Whoever added that ill-advised attempt at humour should be forced to write out 1,000 times:
"The definition of a legacy system: ONE THAT WORKS".
Re: Pay Per Line Of Code
I worked for IBM UK in the 60s and wrote a lot of code for many different customers. There was never a charge. It was all part of the built in customer support. I even rewrote part of the OS for one system (not s/360 - IBM 1710 I think) for Rolls Royce aero engines to allow all the user code for monitoring engine test cells to fit in memory.
Sole Source For Hardware?
Even before the advent of Plug Compatible Machines brought competition for the Central Processing Units, the S/360 peripheral hardware market was open to third parties. IBM published the technical specifications for the bus and tag channel interfaces allowing, indeed, encouraging vendors to produce plug and play devices for the architecture, even in competition with IBM's own. My first S/360 in 1972 had Marshall not IBM disks and a Calcomp drum plotter for which IBM offered no counterpart. This was true of the IBM Personal Computer as well. This type of openness dramatically expands the marketability of a new platform architecture.
Eventually we stripped scrapped 360s for components.
"IBM built its own circuits for S/360, Solid Logic Technology (SLT) - a set of transistors and diodes mounted on a circuit twenty-eight-thousandths of a square inch and protected by a film of glass just sixty-millionths of an inch thick. The SLT was 10 times more dense the technology of its day."
When these machines were eventually scrapped we used the components from them for electronic projects. Their unusual construction was a pain, much of the 'componentry' couldn't be used because of the construction. (That was further compounded by IBM actually partially smashing modules before they were released as scrap.)
"p3 [Photo caption] The S/360 Model 91 at NASA's Goddard Space Flight Center, with 2,097,152 bytes of main memory, was announced in 1968"
Around that time our 360 only had 44kB memory, it was later expanded to 77kB in about 1969. Why those odd values were chosen is still somewhat a mystery to me.
Re: Eventually we stripped scrapped 360s for components.
@RobHib - The odd memory size was probably the memory available to the user, not the hardware size (which came in power-of-two multiples). The size the OS took was a function of what devices were attached and a few other sysgen parameters. Whatever was left after the OS was user space. There was usually a 2k boundary, since memory-protect keys worked on 2k chunks, but not always; some customers ran naked to squeeze out those extra bytes.
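The arithmetic in that explanation is easy to make concrete. A minimal sketch, assuming the arrangement described above (power-of-two total core, sysgen-dependent OS size, user space rounded down to a 2 KiB storage-protect boundary unless the shop ran "naked"); the function and its parameters are invented for illustration:

```c
/* Storage-protect keys cover 2 KiB blocks on S/360. */
#define PROTECT_KEY_BLOCK 2048u

/* User space = total core minus the sysgen'd OS, normally rounded
   down to a protect-key boundary; "naked" shops skipped the rounding
   to squeeze out the last few hundred bytes. */
unsigned user_space(unsigned total_core, unsigned os_size, int naked) {
    unsigned left = total_core - os_size;
    if (naked)
        return left;
    return left - (left % PROTECT_KEY_BLOCK);
}
```

So an oddball figure like 44kB of "memory" is perfectly consistent with a power-of-two machine whose OS happened to leave that much over.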
Glen Turner 666
Primacy of software
Could have had a little more about the primacy of software: IBM had a huge range of compilers, and having an assembly language common across a wide range was a huge winner (as obvious as that seems today, in an age of a handful of processor instruction sets). Furthermore, IBM had a strong focus on binary compatibility, and the lack of that in some competitors' ranges made shipping software for those machines much more expensive than for IBM.
IBM also sustained that commitment to development. Which meant that until the minicomputer age they were really the only possibility if you wanted newer features (such as CICS for screen-based transaction processing or VSAM or DB2 for databases, or VMs for a cheaper test versus production environment). Other manufacturers would develop against their forthcoming models, not their shipped models, and so IBM would be the company "shipping now" with the feature you desired.
IBM were also very focused on business. They knew how to market (e.g., the myth of the 'idle' versus 'ready' light on tape drives, whitepapers to explain technology to managers). They knew how to charge (e.g., essentially a lease, which matched a company's revenue). They knew how to do politics (e.g., lobbying the Australian PM after they lost a government sale). They knew how to do support (with their customer engineers basically being a little bit of IBM embedded at the customer). Their strategic planning is still world class.
I would be cautious about lauding the $0.5B taken to develop the OS/360 software as progress. As a counterpoint consider Burroughs, who delivered better capability with fewer lines of code, since they wrote in Algol rather than assembler. Both companies got one thing right: huge libraries of code which made life much easier for applications programmers.
DEC's VMS learnt that lesson well. It wasn't until MS-DOS that we were suddenly dropped back into an inferior programming environment (but you'll cope with a lot for sheer responsiveness, and it didn't take too long until you could buy in what you needed).
What killed the mainframe was its sheer optimisation for batch and transaction processing and the massive cost if you used it any other way. Consider that TCP/IP used about 3% of the system's resources, or $30k pa of mainframe time. That would pay for a new Unix machine every year to host your website on.
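The cost argument in that last paragraph can be made explicit. A trivial sketch using only the figures the comment itself gives (3% of resources, $30k per annum); the implied total is an inference, not a quoted number:

```c
/* If a given fraction of the machine's capacity costs `fraction_cost`
   per year, the whole machine's implied annual cost is the ratio. */
double implied_total_cost(double fraction, double fraction_cost) {
    return fraction_cost / fraction;
}
```

At 3% and $30k/year, the whole mainframe is implicitly running around $1M a year - against which a $30k Unix box every year for the website looks very attractive.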
Copyright © 1996-2018 by Dr. Nikolai Bezroukov. www.softpanorama.org was initially created as a service to the (now defunct) UN Sustainable Development Networking Programme (SDNP) in the author free time and without any remuneration. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. Original materials copyright belong to respective owners. Quotes are made for educational purposes only in compliance with the fair use doctrine.
FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available to advance understanding of computer science, IT technology, economic, scientific, and social issues. We believe this constitutes a 'fair use' of any such copyrighted material as provided by section 107 of the US Copyright Law according to which such material can be distributed without profit exclusively for research and educational purposes.
This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops like a living tree...
You can use PayPal to make a contribution, supporting development of this site and speeding up access. In case softpanorama.org is down you can use the mirror at softpanorama.info
The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author present and former employers, SDNP or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.
Last modified: October 16, 2018