
History of "The Unofficial Unix Administration Horror Story Summary"

Anatoly Ivasyuk created "The Unofficial Unix Administration Horror Story Summary" from postings to the Usenet forum after Arne Asplem asked for contributions:

aras@multix.no (Arne Asplem) wrote:

> I'm the program chair for a one day conference on Unix system
> administration in Oslo in 3 weeks, including topics like network
> management, system administration tools, integration, print/file-servers,
> security, etc.

> I'm looking for actual horror stories of what have gone wrong because
> of bad system administration, as an early morning wakeup.

> I'll summarise to the net if there is any interest.

This list exists in several versions:

What is interesting is that Anatoly Ivasyuk was at that time just a student at the Rochester Institute of Technology; his later career revolved almost exclusively around the Windows platform:


Anatoly Ivasyuk

With over 10 years of professional Windows™ software development experience, Mr. Ivasyuk designs and develops the award-winning software products that comprise the desktop side of the DTLink business.

Prior to founding DTLink, Mr. Ivasyuk established himself as an internationally recognized developer with the release of his Personal Stock Monitor product, which has gone on to become one of the most highly regarded stock market portfolio management applications available on the Net. Consistently rated at the top of its class, Personal Stock Monitor has been featured in every edition of Investing Online for Dummies, has been covered in Barron's Online and has won countless awards from virtually every major software site.

Prior to that, Mr. Ivasyuk was Chief Windows Architect at WebThreads, LLC, a Vienna, Virginia-based internet technology startup, where he was responsible for the Windows version of the company's 1-to-1 marketing communications solution.

Mr. Ivasyuk was also responsible for development of Windows-based Network Management Software (NMS) for Telogy, Inc., a developer of satellite modem software and technology.



[Apr 06, 2017] It's 30 years ago: IBM's final battle with reality by Andrew Orlowski

Notable quotes:
"... OS/2 ran fine on PC clones and ISA boards. The link between PS/2 and OS/2 was entirely spurious. ..."
"... After all, OS/2 did TCP/IP long before Windows had a stable IP stack supported by the manufacturer (yes, yes, there was Trumpet WinSock and Chameleon TCP/IP but they were not from Microsoft and MS refused to support them). ..."
"... If they'd put decent mainframe developer tools onto a personal workstation they'd have had a core market to build on. ..."
"... No product manager was going to risk the ongoing investment in mainframe software by allowing personal computers to connect other than as the dumbest of terminals. It's ironic that IBM used the word "System" so much: it never really understood systems, just product families.. ..."
"... Way before then Microsoft had UNIX System 7 running and released as XENIX for the 8086/80286 architecture. ..."
Apr 04, 2017 | www.reddit.com
Thirty years ago this month, IBM declared war on reality – and lost. For the 30 years prior to that April day in 1987, nobody had dared to challenge IBM's dominance of the computer industry. It would take almost a decade for the defeat to be complete, but when it came, it was emphatic.

In April of '87, IBM announced both a new PC architecture, PS/2, and a new operating system to run on the boxes, OS/2. "Real" business computers would also run IBM networking software and IBM SQL database software, all bundled into an "Extended Edition" of the operating system. There was only one real way of computing, IBM declared, and it had a /2 on the end. This was signalled by the job titles: the PC business was called "Entry Systems", bikes with training wheels.

While IBM itself has subsequently survived and prospered, it's a very different company today. It subsequently divested its PC, printer and microprocessor divisions (along with much else), and one wonders how different it would be today if it hadn't devoted a decade and thousands of staff to trying to bring the PC industry back under its control. Ironically, Lenovo is the biggest PC company again - but by following the rules set by Microsoft and Intel.

Analysts, eh?

OS/2 is an oft-told story, not least here at the Reg, where we gave an account of the OS wars on the 25th anniversary of OS/2. Dominic Connor also shared lots of evocative detail about the IBM culture at the time here (Part One) and here (Part Two). So there's no need to do so again.

But every time history is written, it's from a different vantage point – a different context. It no longer seems a joke to suggest, as IBM CEO Thomas J Watson probably never did , that the world needs only five computers. It's a quote that adorned many .sig files in the early days of email. Hah Hah. How stupid could a tech CEO be?

Well today, just a handful of cloud platforms are threatening to dominate both consumer and business computing, as big data analytics and AI (we're told) will only really work at scale. And only a few companies (Amazon, Alphabet, Microsoft) will have that data. So how many computers will the world really have? Not counting the JavaScript interpreter in your pocket or your living room, which clicks on invisible advertisements in the always-connected world and has been relegated to a runtime.

So if IBM had fought a different war, how might the world look?

The tl;dr

If you're not familiar with the significance of OS/2 and PS/2 and don't want to read thousands of words, here's a capsule summary. The setter of standards for the past three decades, IBM had responded to the microprocessor revolution by allowing its PC to be easily "cloneable" and run a toy third-party operating system. It didn't matter too much to senior IBM management at first: PCs weren't really computers, so few people would buy them. Business computing would be done on real multiuser systems. That was the norm at the time. But the runaway sales of the PC clones worried IBM and in 1984, just three years after the launch of the IBM PC, Big Blue plotted how to regain control. The result was a nostalgic, backward-looking vision that failed to grasp that computing standards would increasingly be set by the open market, not by IBM.

The PS/2 series of PCs had world-beating futuristic industrial design, based around clip-together plastic parts that made the beige tins of the time look instantly dated. And PS/2 would have been fine if it hadn't been for the notorious proprietary bus: Micro Channel Architecture.

This plug and play bus was much better than the ISA standards of its day, and good enough for IBM to be using in workstations and even mainframes. However, it was not compatible with the (fairly crude) expansion boards of the day required to do networking or graphics; MCA cards were twice the price of comparable cards; and hybrid PCs were difficult to build. But the killer was that IBM demanded a licence fee from OEMs and sent its lawyers after MCA clones. The result was years of uncertainty. Seven years later, Apple was still making fun of how hard it was to get expansion cards to work in PCs. The real bedevilment of IBM's PCs was the tech titan's cost structure, which made it more expensive than the competition, at least without a heavy corporate discount.

But people don't get emotional about PS/2s in the way they got emotional about OS/2, which, even to a former user, is pretty strange, given how much grief it gave you. The tl;dr of the OS/2 story is that IBM announced a highly advanced operating system for PCs, but it was five years (1992) before it shipped a version that was demonstrably better for the average user. (In fact, it was almost a year before anything shipped at all.)

Since operating systems aren't an end in themselves, but merely a means to an end, a means of running something that alleviates your grunt work (like dBase or Lotus 1-2-3 at the time), the advantages of OS/2 were pretty elusive.

And, even in 1992, "better" meant managing old apps and files better – and the action in that "space" was taking place on Windows.

Because the industry was so unused to believing it could actually set standards, for a long time it didn't. This nervousness in the wake of PS/2 and OS/2 had caused a kind of winter. People just didn't bother upgrading – well, why would you? You had to wait and see what the standards would be.

So Microsoft put next to no effort into updating DOS (or even Windows) for the next two years. Application vendors continued to update their applications, but these remained character mode, and the lack of a standard for addressing extended memory also added to the inertia. Remember that OS/2 had been written for the 80286 chip introduced in 1984, and while the 80386 chip added new modes for protecting memory and virtualizing sessions, without the software to take advantage of it, weak demand ensured 386 PCs remained very expensive.

I described IBM's vision of computing as nostalgic and backward-looking, with PCs limited to office paperwork while real computing took place on (IBM) servers. But what if IBM had dared skip a generation and tried something really daring?

Alternative histories

At the time, there just weren't that many options.

A week before IBM's PS/2 and OS/2 roadmap was unveiled, Digital Research Inc shipped a multitasking DOS for 286 computers – Concurrent DOS 286 – to OEMs. This emulated the original IBM PC chip in software, and offered a huge leap in multitasking over Microsoft DOS. DRI also had a graphical shell, GEM. Why not dump Microsoft for DRI, which ran the industry standard OS when IBM set about making its first PC? It was really about control. IBM had excellent engineers and many PhDs; it could make it itself. IBM had little appetite to give Gary Kildall, who understood the microcomputer business much better than IBM, another sniff.

As it turned out, DRI struggled, along with everyone else, to make Concurrent DOS work reliably on the 80286 chip, particularly when it came to networking. The PC was a wild west, and reliable compatibility really needed the hardware virtualisation of the 80386 chip.

The obvious alternative was rapidly maturing: Unix. The trade press had been busily hyping "open systems" for years. In fact, it's little remembered now that the PS/2 came with a version of IBM's Unix: the Advanced Interactive Executive, or AIX for the PS/2. "The multiuser, multitasking virtual memory operating system will be a subset of the Unix-like AIX operating system for the RT PC", as InfoWorld reported.

(Note that a Unix would be "Unix-like" forever after.)

But IBM didn't like Unix, and didn't get serious about selling it until Sun, with the rest of the industry in hot pursuit, was eating into its business. IBM didn't like distributed architectures that were too distributed, and for IBM, Sun's emphasis on "the network is the computer" put the emphasis in completely the wrong place. The "Open Systems" hype was that the CIO user would mix and match their IT – and that was the last thing IBM wanted.

And there were more immediate, practical difficulties with leaping a generation of technology and yoking the IBM PC to Unix long term. It wasn't ease of use. Later, usability was the stick used to beat Unix vendors with, but at the time DOS and Unix were equally hard to use. Backward-compatibility was the main issue: they wouldn't run the PC applications of the day. It seemed far more plausible that IBM could persuade the big PC application vendors of the day – like Lotus and Ashton Tate – to migrate to OS/2 than bring their Unix ports to IBM's version of Unix. Far less risky for IBM too.

With the benefit of hindsight, the third option would have been far more attractive: why didn't IBM just buy Apple? It wasn't IBM-compatible, but most of IBM's kit wasn't IBM-compatible, either. (Hence one of those grand IBM unifying strategies of the time: "System Application Architecture".)

As it turned out, IBM and Apple would work closely together, but only out of desperation, once it was clear that Windows was a runaway success, and both had lost the developers to Microsoft. Much of the first half of the 1990s saw several thousand IBM and Apple developers working on ambitious joint projects that never bore fruit: Taligent, a WorkPlace OS, and much else.

Apple had a working Intel port of MacOS... in 1994. IBM and Apple had even agreed in principle to merge in 1995, only for Apple CEO Michael Spindler to get cold feet at the signing. He wanted more money from the buyer (the Apple Way).

Still, it's fascinating to speculate what a popular, consumer-friendly OS might have done if IBM had been prepared to license Apple's Macintosh OS aggressively, so it supplanted DOS as the industry standard. Many vendors had a foot in both camps at the time, so it really would have been a case of investing in existing initiatives, rather than starting from scratch. Without big IBM's support, Microsoft would have been relegated to a tools company with a nifty spreadsheet, possibly eventually divesting both. I wonder who would have bought Excel?

Alas, nothing was further from the thoughts of IBM's management at the time, obsessed with taming the industry and bringing it back under central control. What they'd always done had always worked. Why change?

Thomas J Watson may never have said the world will have five computers - it is almost certainly a myth. But it's striking that as the world moves to three (or four) computing platforms, IBM isn't one of them. ®

1 day Anonymous South African Coward

Re: Interesting times

On the other hand, my experience with OS/2 on a 33MHz 386SX with 4Mb RAM was excellent. First 2.1 then Warp, then Merlin... haven't had any crashes or funny things. Just wished the dang PC would go faster.

DooM played better than under DOS and some games just loved the extra memory I was able to give to them with the DOS settings under OS/2...

Good days, good memories.

1 day Anonymous Coward

Re: Interesting times

Agreed, I had it on a 386sx 25 (2mb ram) and then a 486dx2 66 (8mb ram), ran videos, games and windows apps faster than in windows and dos. Was a good OS, shame it didn't do better than it did. Have been using it up until a couple of years ago, some older machines where I work used it. But that wasn't a pleasant task, keeping it integrated with the current environment and making work around to keep it 'compliant'.

1 day AndrueC

Re: Interesting times

I remember Sunday afternoons playing Geoff Crammond's Formula One Grandprix in a VDM while downloading messages from CompuServe using the multi-threaded Golden Compass. My first experience of real multi-tasking on a PC.

I also used it to develop DOS applications as the crash protection meant that I only had to reopen the VDM, not reboot the machine.

1 day ChrisC

Re: Interesting times

I have fond memories of Warp too - back then I was doing some research work on robotic equations of motion, which had eventually evolved into a hideously complex Matlab script to do all the hard work for me. I'd just define the system geometry parameters at the start, click Go, twiddle my thumbs for an hour or so, and then get a complete set of optimised motion equations out the other end.

Unfortunately this was all being done in the Win3.1 version of Matlab, and as bad as the co-operative multitasking was in 3.1 generally, it was a shining beacon of excellence compared to how it behaved once Matlab started up - I'm pretty sure the Matlab devteam must have misread the Windows documentation and thought it featured "un-cooperative multitasking", because once you let Matlab loose on a script it was game over as far as being able to do anything else on that PC was concerned.

As a hardcore Amiga user at the time, I knew that multitasking didn't have to be this godawful, and I was convinced that the PC I had in front of me, which at the time had roughly twice the raw processing power of the fastest Amiga in my collection, really ought to be able to multitask at least as well as the slowest Amiga in my collection...

I can't recall how I stumbled upon OS/2 as the solution, all I do remember is that having learned of its existence and its claimed abilities to do stuff that Windows could only dream of doing, I dashed into town and bought my own copy of Warp, and once I got over the hurdle of getting it installed as a multi-boot setup with my existing fine-tuned DOS/Win3.1 setup (having expended god knows how many hours tweaking it to run all my games nicely - yes, even those that expected to have almost all of the base memory available, but still also needed to have CDROM *and* mouse drivers shoe-horned in there somewhere too - I didn't want to mess that up) I fired it up, installed Matlab, and tentatively clicked Go... Umm, is it running? This can't be right, the OS is still perfectly responsive, I can launch other Win3.1 applications without any signs of hesitation, and yet my Matlab script really does claim to be churning its way through its calculations about as quickly as it did hogging Win3.1 all to itself.

From that day on, Warp became my go-to OS for anything work-related until the day I finally ditched Win3.1 and made the switch to 95.

So yes, count me in as another one of those people who, despite the problems OS/2 had (I'll readily admit that it could be a bit flakey or just a bit obtuse when trying to get it to do what you wanted it to do) will still quite happily wax lyrical about just how bloody amazing it was in comparison to a DOS/Win16 based setup for anyone wanting to unlock the true potential of the hardware in front of them. Even today I still don't think the Windows dev team *really* understand how multitasking ought to behave, and I do wonder just how much productivity is lost globally due to those annoying random slowdowns and temporary hangs which remain part and parcel of everyday life as a Windows user, despite the underlying hardware being orders of magnitude more powerful than anything we could dream of having sat on our desks back in the 90's.

1 day LDS

Re: Interesting times

OS/2 had a single application message queue, or something alike, and a non-responding application could block the whole queue. 3.0 ran DOS applications very well, and most Windows applications worked well, but some had issues - i.e. Delphi 1.0. But despite IBM asserting it could also support Win32 applications, it never did, and from 1995 onwards the world was quickly migrating to 32-bit applications - and native OS/2 applications were too few and sparse.

But OS/2 was just a part of the PS/2 equation - PS/2 and MCA boards were really too expensive compared to clones - and once Compaq and others started delivering good-enough machines, it was too late to close the stable door. Neither DRI/GEM nor Apple could have turned the tide - it was a matter of customers' money.

Back then PCs and their software were still expensive (much more expensive than today), and for many they were quite a demanding investment - asking even more, as IBM did, just cut out many, many customers who could only afford clones - all of them running MS-DOS and then Windows.

16 hrs Planty

Re: Interesting times

Interesting times again. Microsoft are now in IBM's shoes, facing irrelevance as nobody really cares about their products anymore. Microsoft have been running around throwing out all sorts of random things, hoping something will stick. Nothing stuck, everything sucked.

15 hrs Richard Plinston

Re: Interesting times

> OS/2 had a single applications message queue,

It was Windows 1 through 3.11 that "had a single applications message queue".

"""Preemptive multitasking has always been supported by Windows NT (all versions), OS/2 (native applications) , Unix and Unix-like systems (such as Linux, BSD and macOS), VMS, OS/360, and many other operating systems designed for use in the academic and medium-to-large business markets."""

> But despite IBM asserting it could also support Win32 applications, it never did,

OS/2 Windows 3.x did support Win32s applications by loading in the win32s module, just like Windows 3.1 could do. However, Microsoft added a completely spurious access to virtual memory beyond the 2Gbyte limit of OS/2 (Windows supported 4Gbyte accesses) just to stop OS/2 using that beyond a particular version. Microsoft then required the new version in their software.

Exactly what you would expect from Microsoft, and still should do.

> But OS/2 was just a part of the PS/2 equation - PS/2 and MCA boards were really too expensive compared to clones

OS/2 ran fine on PC clones and ISA boards. The link between PS/2 and OS/2 was entirely spurious.

6 hrs AndrueC

Re: Interesting times

I think you might be confusing two things there. Having pre-emptive multi-tasking doesn't preclude having a single message queue. The two things are only tangentially related.

Multi-tasking is the ability to run multiple processes (or multiple threads) by switching between them. Pre-emptive multitasking means that the OS can force a task switch; cooperative multitasking means that each process has to yield control back to the OS. OS/2 was indeed one of the earliest (possibly the earliest) PC OSes that were preemptive. Windows was only cooperative until 9x and NT.

But nothing about being multi-tasking requires that the OS even support message queues. Early versions of Unix offered pre-emptive multitasking but, in the absence of X Windows, probably didn't have any message queues. In fact, arguably an OS itself never would. Message queues are usually a higher-level construct, typically implemented in the GUI framework.
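The cooperative model described above is easy to sketch. Here is a hypothetical Python toy (not how Win3.1 actually worked internally): tasks are generators that must yield control voluntarily, and a task that does all its work before yielding, like the Matlab script in the story above, monopolises the machine for that whole stretch.

```python
# Toy cooperative scheduler: each task is a generator that must
# voluntarily yield control back to the scheduler.
def scheduler(tasks, max_steps=20):
    log = []
    queue = list(tasks)
    steps = 0
    while queue and steps < max_steps:
        task = queue.pop(0)
        try:
            log.append(next(task))   # run the task until its next yield
            queue.append(task)       # well-behaved: requeue it
        except StopIteration:
            pass                     # task finished, drop it
        steps += 1
    return log

def polite(name):
    for i in range(3):
        yield f"{name} step {i}"     # yields after each unit of work

# Two well-behaved tasks interleave fairly.
print(scheduler([polite("A"), polite("B")]))
# → ['A step 0', 'B step 0', 'A step 1', 'B step 1', 'A step 2', 'B step 2']
```

A "hog" task that loops without yielding would simply never return control inside `next()`, freezing every other task; preemption removes the need for this cooperation because the OS interrupts tasks on a timer, which is why the OS/2 desktop stayed responsive while a cooperative Win3.1 app hogged the CPU.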

And, sadly, it is indeed true that the early versions of Work Place Shell (the OS/2 default GUI) had a single message queue. IBM's recommendation was that developers implement their own queue sinks. The idea being that every application would have a dedicated thread that did nothing but accept messages and store them in a queue. The main application thread(s) would then empty this 'personal' queue at their leisure. I'm not sure why they wanted this design - maybe because then it was the application's responsibility to manage message storage? Sadly (and not surprisingly) most developers couldn't be arsed. As a result the WPS could often lock up. Now the OS itself wasn't locked - other processes would keep running just fine. If you were lucky enough to have a full screen VDM open you wouldn't even notice until you tried to go back to the WPS. When it happened to us my colleague and I used to ask the other one to Telnet in to our boxes and kill the main WPS thread to get things going again.
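The queue-sink pattern described above is straightforward to sketch. This is a hypothetical Python illustration (not the actual OS/2 Presentation Manager API): a dedicated thread does nothing but drain the shared system queue into the application's private queue, so the system queue is never blocked even while the main thread is busy.

```python
import queue
import threading

def queue_sink(system_queue, private_queue):
    """Dedicated thread: accept messages immediately and buffer them
    privately, so the (single) system queue never blocks on this app."""
    while True:
        msg = system_queue.get()
        if msg is None:          # sentinel: shut down the sink
            break
        private_queue.put(msg)

system_q = queue.Queue()   # stands in for the single shared message queue
private_q = queue.Queue()  # this application's personal buffer

sink = threading.Thread(target=queue_sink, args=(system_q, private_q))
sink.start()

# Messages arrive while the "main thread" is busy elsewhere...
for i in range(3):
    system_q.put(f"msg {i}")
system_q.put(None)
sink.join()

# ...and the main thread empties its personal queue at its leisure.
received = []
while not private_q.empty():
    received.append(private_q.get())
print(received)   # → ['msg 0', 'msg 1', 'msg 2']
```

The design choice matches the comment's guess: the shared queue stays short because the sink thread never does real work, and message storage becomes the application's own responsibility.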

One of the big features that OS/2 Warp finally brought was multiple message queues. Sadly, by then it was too late. It's a shame because I did like the WPS. Its object-oriented nature was great. Windows has never offered that. In the WPS an icon is an object that knows where it should appear. In Windows it's just an icon that Explorer chooses to render in a particular location. Right-click it and Explorer tries to work out what to put on the menu. Do the same in WPS and the icon will create its own menu.

OOP GUIs are powerful things.

Re: Interesting times

Let's try not to forget that Microsoft only got the PC-DOS gig after pitching its port of AT&T Unix (Xenix) for IBM's Personal Computer. It is also worth mentioning that Windows was initially developed on Xenix and ported to DOS after initial testing.

Had it not been for IBM, the world would be.... pretty much as it is now

5 hrs cmaurand

Re: Interesting times

Actually, it did support 32-bit applications. Microsoft kept changing something in Windows in the way it handled 32-bit applications; IBM adjusted. Then Microsoft came out with Win32s; IBM adjusted. Microsoft changed it again (something about the way Windows does things in memory) and IBM gave up on trying to keep up with Microsoft's changes.

4 hrs AndrueC

Re: Interesting times

Actually, it did support 32 bit applications.

That Windows integration piece in OS/2 was cool, I thought. Pretty much seamless and you could choose how seamless it was - full screen or pretend to be an application on the desktop. The bit where it could link into an existing installation was just brilliant. Licensing payments to Microsoft for Windows? Not today, thanks :D

But then the entire VDM subsystem was cool. A true 'Virtual DOS Machine' rather than the Windows poor cousin. You could even boot up different versions of DOS. I seem to recall one of their bug fixes was to the audio subsystem, to support a golf simulator that triggered tens of thousands of interrupts per second. From what I vaguely recall, they said they removed the need on the card side and faked the results inside the VDM. The VDM team must've gone to extraordinary lengths in order to create what we'd probably now call a Virtual Machine.

2 hrs Mine's a Large One

Re: Interesting times

"Now the OS itself wasn't locked - other processes would keep running just fine. If you were lucky enough to have a full screen VDM open you wouldn't even notice until you tried to go back to the WPS. When it happened to us my colleague and I used to ask the other one to Telnet in to our boxes and kill the main WPS thread to get things going again."

We had a couple of apps which used to do this occasionally on our servers and the first we'd know about it would be when we tried to do anything at the server, something we didn't often do - everything had been merrily working away in the background but with a single app having frozen the WPS.

2 hrs Faux Science Slayer

Rockefeller front IBM replaced by Rockefeller front MicroSoft

Edison did not invent electric light, Alex Bell stole his telephone patent, Marconi STOLE Tesla radio patents and Glenn Curtiss stole a dozen Wright Brothers patents. All were set up fronts for Rockefeller monopolies.

"JFK to 9/11, Everything is a Rich Man's Trick" on YouTube....end feudalism in 2017

18 mins CrazyOldCatMan

Re: Interesting times

my experience with OS/2 on a 33MHz 386SX with 4Mb RAM was excellent

It was (even more so than DOS/Windows) sensitive to the hardware - if you had anything slightly out then OS/2 would do lots of mysterious things..

I used it, right up to the end when IBM dropped it. Along with linux (my first linux machine was a 386sx25 with a 330MB ESDI drive - was originally OS/2 with an IDE 80MB drive but OS/2 simply wouldn't load when the ESDI drive interface card was inserted. So I used it for linux instead).

After all, OS/2 did TCP/IP long before Windows had a stable IP stack supported by the manufacturer (yes, yes, there was Trumpet WinSock and Chameleon TCP/IP but they were not from Microsoft and MS refused to support them).

1 day schifreen

Too much credit

You seem to conclude that the problem lay with IBM's poor strategy. You give the company way too much credit. They didn't really have a strategy at all and, if they did, no one knew what it was. Or where they would be working once the latest departmental reorg was complete.

The problem with OS/2 is that it was a solution to a problem that hardly anyone actually had. IBM knew this. Ask senior IBM people at the time (as I did) exactly what problem OS/2 was designed to solve, and why the world needed to ditch the free OS that came with their PC in order to install something that, frankly, never installed properly anyway, and they singularly failed to come up with a sensible answer. Or any answer at all.

1 day Little Mouse

Re: Too much credit

I'd almost forgotten how pervasive the image of IBM was back in those days. I joined the PC party in the days of the 286, and IIRC, PCs only really fell into two camps - either IBM or IBM clones.

Amazing that they could get from that position of dominance to this.

1 day I am the liquor

Re: Too much credit

I don't think it's true that OS/2 was a solution to a problem that never existed. Perhaps it was too early. But by the mid 90s when customers did understand the problem, and Microsoft was answering it with Windows NT, surely that should have meant that OS/2 was mature and ready to take advantage.

OS/2 Warp was a better design in many ways than NT 3.51, but IBM's head start didn't help them win the race, and now we all run Windows NT on our PCs.

1 day Warm Braw

Re: A solution to a problem that hardly anyone actually had

It could have been a solution to a problem that a lot of people had. If you've ever had the misfortune to write software for IBM mainframes and been stuck with TSO or SPF (or even VM/CMS, which is only better by comparison with the alternatives) you'd have given your eye teeth for a less developer-hostile environment. If they'd put decent mainframe developer tools onto a personal workstation they'd have had a core market to build on.

But this wasn't the IBM way: accounts were structured along product lines, and no mainframe systems salesman was going to let a PC salesman or a System/34 salesman onto his territory if he thought it might cannibalise his commission. No product manager was going to risk the ongoing investment in mainframe software by allowing personal computers to connect other than as the dumbest of terminals. It's ironic that IBM used the word "System" so much: it never really understood systems, just product families.

1 day Vic

Re: Too much credit

You give the company way too much credit. They didn't really have a strategy at all

Indeed.

I was working at an IBM Systems Centre when the PS/2 launched. We had to go to customers to tell them how wonderful this MCA thing was - without having any knowledge of whether or not it was any better than ISA.

And that was a shame, really - MCA *was* better. But no-one found out until it was way too late. And the Model/30 didn't have MCA anyway...

1 day LDS

Re: Too much credit

Many people started to have problems with the single-application model DOS had - it's no surprise one of the first big hits for Borland was Sidekick, which made TSR (Terminate and Stay Resident) applications common.

Still, everything had to work in the 1MB of memory available in real mode. Extended/Expanded memory couldn't be used to run code from. DOS Extenders later could run bigger applications, but still a single one.

Windows was well accepted not only because it was a GUI, but because it allowed users to run more than one application, switch among them easily, and move data across applications. The limited multithreading was not an issue on desktops with only a single-core CPU.

If you were using the PC just to play games, it really didn't matter, but for business users it was a great productivity boost.

18 hrs a_yank_lurker

Re: Too much credit

@schifreen - Itsy Bitsy Morons always had a schizophrenic attitude towards PCs, and to a lesser extent minis, at the time. They worshipped big iron and could not understand why people would want a "toy" or "crippled iron". What they failed to grasp is that many computing activities are not very resource-intensive on any computer (I wrote a thesis on an Apple IIe), even early PCs. These are activities that could be easily automated and put on a smaller computer. Others grasped the vacuum left and moved in with both feet.

The other issue for them was that selling PCs is very different from selling a mainframe. One buys a PC much like one buys any other appliance from a retailer. There is no formal bid process with tenders to be opened and reviewed. At retail, the sales staff is less interested in the brand you buy than in selling you something.

15 hrs Richard Plinston

Re: Too much credit

> and why the world needed to ditch the free OS that came with their PC

MS-DOS and Windows were never free*. When you bought a computer with Windows installed, and sometimes when Windows was _not_ installed, money went from the OEM to Microsoft. That cost was part of the price.

* actually there was a 'free' version: 'Windows with Bing' that no one wanted.

14 hrs addinall

Re: Too much credit

MS-DOS was never free. Unless you stole it.

13 hrs Richard Plinston

Re: Too much credit

> MS-DOS was never free. Unless you stole it.

When SCP sold 86-DOS to Microsoft for development into PC-DOS and MS-DOS (MS had previously licensed it from SCP) the agreement was that SCP would have as many copies of MS-DOS as they wanted for free as long as they were shipped with a computer (SCP built the Zebra range of S-100 based computers).

After the fire in the SCP factory, which stopped them building computers, they started selling V20 chips (a faster clone of the 8088* with 8085 emulation built in) and V30 chips (ditto 8086) with a free copy of MS-DOS. MS bought out the agreement for a reputed $1 million.

* swap this for the 8088 to get a 20% faster machine that could also run CP/M software (with suitable loader).

10 hrs Richard Plinston

Re: Too much credit

> MS-DOS was never free. Unless you stole it.

After per-box pricing* was declared illegal, MS came up with another scheme where MS-DOS and Windows were bundled together at the price of Windows alone. Effectively this was MS-DOS for free, to stop DR-DOS being installed. At the time it was MS-DOS 4.01 versus DR-DOS 5, which was infinitely superior, and it took MS 20 months to nearly catch up with the release of MS-DOS 5, at which point DR released DR-DOS 6 with task switching. MS took another year to almost catch up again with MS-DOS 6.

* OEMs were contracted to pay Microsoft for MS-DOS on every box sold regardless of whether it had MS or DR-DOS (or other) installed. This was to strangle DR-DOS sales. The alternative to accepting this contract was to never sell any MS products ever again.

10 hrs Richard Plinston

Re: Too much credit

> Itsy Bitsy Morons always had a schizophrenic attitude towards PCs and to a lesser extent minis at the time. They worshipped big iron and could not understand why people would want a "toy" or "crippled iron".

You write as if IBM were one thing. It was divided into several divisions, each with their own sales and marketing and each competing against the others. The mainframe division (360/370) wanted a small computer to counter the Apple IIs with VisiCalc, Z80 SoftCard and CP/M software invading their sites. The IBM PC was designed to be 20% better than the Apple II (160KB floppies instead of 140KB, etc.) and also to act as a terminal (which is why the IBM PC has DTE serial ports while other micros had DCE) while running the same software. There were also 3270 (terminal) PCs and 370-emulating PCs (with Motorola 68x00 co-processor boards) for developers to use to write mainframe software.

The mainframe division did look down on the Series One, System 3, System 36 and System 38 (AS400) and other divisions, but did not see the IBM PC as any threat at all. They did want to exclude other brands though.

10 hrs Pompous Git

Re: Too much credit

MS-DOS was never free. Unless you stole it.
Only two of the many computers I've owned came with MS-DOS, most were without an OS. Since MS refused to sell DOS at retail, most people did just that; they stole a copy of DOS. A much smaller number of us purchased DR-DOS and reaped the benefits of an arguably better DOS than DOS. Especially if you also ran the 4DOS command processor.
7 hrs Richard Plinston

Re: Too much credit

> Since MS refused to sell DOS at retail

Under its agreement with IBM, MS had a 10-year contractual moratorium on selling MS-DOS at retail. When this expired, MS released MS-DOS 5 for retail sale.

> A much smaller number of us purchased DR-DOS and reaped the benefits of an arguably better DOS than DOS.

It was significantly better than the contemporary MS-DOS 4.01 and had a 20-month lead on MS-DOS 5.

Allegedly it reached a 20% market share until MS brought in illegal per-box pricing and bundled MS-DOS+Windows at Windows price.

6 hrs Pompous Git

Re: Too much credit

"MS had a contractual moratorium on selling MS-DOS at retail for 10 years with IBM. This expired and MS released MS-DOS 5 for retail sales."
B-b-b-b-ut that can't be true. Shirley MS are responsible for everything bad in computing... ;-)
5 hrs Anonymous Coward

Re: Too much credit

"At the time it was MS-DOS 4.01 versus DR-DOS 5 which was infinitely superior and it took MS 20 moths to nearly catch up with MS-DOS 5, at which point DR released DR-DOS 6 with task switching. MS took another year to almost catch up with MS-DOS 6"

But MS had a trick up its sleeve to sideline DR-DOS ... with Win 3.0 they had a long (several months) "beta" period where people could download the "beta" for free to try out, so there was a lot of pre-launch publicity. Part of that publicity was that Win 3.0 didn't work on DR-DOS, as there was an error during start-up. In legal terms DR weren't allowed to access the beta, so they couldn't counter all the "if you want Win 3.0 you'll need MS-DOS and not DR-DOS" stories in the press. In reality the error was, I believe, due to MS deliberately using a result from a DOS call which wasn't precisely specified: Win 3.0 worked with the value MS-DOS returned but not with the one DR-DOS returned ... trivial for DR to fix, and I seem to recall they did so as soon as Win 3.0 launched, but the damage was done.

At the time I was using DR-DOS, so I assumed that to get Win 3.0 I'd need to factor in the cost of MS-DOS as well, at which point the price of OS/2 (with a special launch offer) was similar, so I went for OS/2.

4 hrs FuzzyWuzzys

Re: Too much credit

DR-DOS was superb, it had so much stuff bundled in and it worked like a dream and as proven MS took months to catch up to what DR-DOS did. DR also has one of the more tragic stories in PC history, well worth checking out.

13 mins CrazyOldCatMan

Re: Too much credit

OS/2 Warp was a better design in many ways than NT 3.51, but IBM's head start didn't help them win the race, and now we all run Windows NT on our PCs.

Windows winning has often been called "the triumph of marketing over excellence".

1 day Martin 47
Since operating systems aren't an end in themselves, but merely a means to an end, a means of running something that alleviates your grunt work (like dBase or Lotus 1-2-3 at the time), the advantages of OS/2 were pretty elusive.

Think someone needs to mention that to Microsoft

1 day stephanh

The whole situation seems very similar to Microsoft today desperately trying to get a foothold in the mobile market against Android. Microsoft's "Windows Experience" Android strategy reminds me a lot of OS/2 (replace a working, familiar OS with something nobody needs).

"Hegel remarks somewhere that all great world-historic facts and personages appear, so to speak, twice. He forgot to add: the first time as tragedy, the second time as farce."

-- Karl Marx, "The Eighteenth Brumaire of Louis Napoleon"

10 hrs Pompous Git
"Since operating systems aren't an end in themselves, but merely a means to an end... the advantages of OS/2 were pretty elusive."
Not really. It was far from unknown in publishing to have to leave the machine rendering a print job because it couldn't do anything else at the same time. Being able to continue working while printing would have been a blessing, but had to await WinNT.
1 day Admiral Grace Hopper
The Man In The High Data Centre

I enjoy a good counterfactual. I still wonder occasionally how the world would have looked if IBM and Apple had got Pink to the point where it was marketable.

1 day Tinslave_the_Barelegged

Re: The Man In The High Data Centre

Bloke at our small village local being cagey about what he did at IBM. Eventually I twigged and said "Oh, you're working on Pink?" He seemed amazed that anyone would know about it, and eventually chilled, but it struck me that little episode was a good analogy for the problems with the ill fated IBM/Apple dalliance.

22 hrs jake

Re: The Man In The High Data Centre

Somewhere, I have a T-shirt with the IBM logo of the time superimposed over the "full color" Apple logo of the time on the front. On the back, it reads "Your brain, on drugs.". The first time we wore them at work, we were told we'd be fired if we wore them again ...

1 day Colin Bull 1

succesful standard

The PS/2 brought with it one of the longest-used PC standards - VGA. Only in the last year or two has the VGA connector been superseded by HDMI.

1 day MrT

Re: succesful standard

And the PS/2 port for mice/keyboards.

The Model M keyboard set a high standard that many modern items still struggle to beat...

1 day Peter Gathercole

Re: successful standard

Though to be truthful, the key action in the Model M keyboards had appeared in the earlier Model F keyboards, and the very first Model M Enhanced PC keyboard appeared with an IBM 5-pin DIN connector on the 5170 PC/AT.

We had some 6MHz original model PC/ATs where I worked in 1984, and even then I liked the feel of the keyboard. Unfortunately, the Computer Unit decided to let the departmental secretaries compare keyboards before the volume orders went in, and they said they liked the short-travel 'soft-touch' Cherry keyboards over all the others (including Model Ms).

As this was an educational establishment, the keyboards got absolutely hammered, and these soft-touch keyboards ended up with a lifetime measured in months, whereas the small number of Model Ms never went wrong unless someone spilled something sticky into them.

I wish I had known at the time that they were robust enough to be able to withstand total immersion in clean water, as long as they were dried properly.

1 day Wade Burchette
Re: succesful standard

The PS/2 standard is for keyboards/mice and it is still an important connector. It just works.

And VGA has not been supplanted by HDMI, but by DVI, which was in turn supplanted by DisplayPort. HDMI is designed for TVs and has a licensing cost.

1 day Danny 14

Re: succesful standard

I bought a second-hand Denford milling machine. The license comes on a 3.5in floppy disk - I needed to buy a USB floppy drive as I didn't have any working ones left (not even sure I had a motherboard with a connection any more either).

Re: succesful standard

Ah, those proprietary floppy disk drives which couldn't tell the difference between a 720KB and a 1.44MB disk. I hate to think about the hours spent recovering data from misformatted floppies.

And whilst the industrial design of the PS/2 was attractive and internals were easily accessible, quality of manufacture (the ones made in Scotland, at least) was lousy. I still carry the scars from fitting lids back on Model 55s.

22 hrs Dave 32

Re: succesful standard

Ah, you forgot about the 2.88MB floppy disks.

I happen to have an IBM PS/2-model 9595 sitting here under the desk with one of those drives in it. ;-)

Ah, yes, the model 9595; those had that little 8 character LED "Information Panel" display on the front of the case. It only took a tiny bit of programming to write an OS/2 device driver to turn that into a time-of-day clock. For years, that was the only thing that I ran on that 9595 (I called it my "600 Watt clock".).

Hmm, why was there suddenly a price spike for old IBM PS/2-model 9595 machines? ;-)

Dave

P.S. I'll get my coat; it's the one with the copy of Warp in it.

14 hrs bombastic bob

Re: OS/2 and PS/2 Memories

Back in the 90's, a few months before the release of Windows 3.0, I took an OS/2 presentation manager programming class at the local (night school) city college. Got an 'A'. Only 6 students survived to complete the class, and the room was packed on day 1 when I thankfully had my add slip signed by the prof... [and we had "the PS/2 machines" in the lab to ourselves, since they were the only ones that could run OS/2].

And I really _LIKED_ OS/2 PM. I was able to format a diskette while doing OTHER THINGS, kinda cool because DOS could _NEVER_ do that! Version 1.2 was nice looking, too, 3D SKEUOMORPHIC just like Windows 3.0 would soon become!

But when i tried to BUY it, I ran into NOTHING but brick walls. It was like "get a PS/2, or wait forever for OEMs to get it 'ported' to THEIR machines". Bleah.

THAT is what killed it. NOT making it available for CLONES. When 'Warp' finally released, it was too little, too late.

But the best part of OS/2 was its API naming, which follows the MORE SENSIBLE object-verb naming, rather than verb-object. So in Windows, it's "CreateWindow". In OS/2, it's "WindowCreate". And wouldn't you know it, when you read the DOCUMENTATION all of the things that work with WINDOWS are in the SAME PART OF THE MANUAL!

Damn, that was nice! OK I have hard-copy manuals for OS/2 1.2 still laying about somewhere... and corresponding hard-copy Windows 3.0 manuals that I had to thumb back-forth with all the time. Old school, yeah. HARDCOPY manuals. And actual message loops (not toolkits nor ".Not" garbage).

10 hrs Pompous Git

Re: OS/2 and PS/2 Memories

"THAT is what killed it. NOT making it available for CLONES. When 'Warp' finally released, it was too little, too late."
A friend who was a developer at the time says the main thing that killed OS/2 was the cost of the SDK: over AU$1,000. BillG was giving away the SDK for Windows at developer events.
1 day Lord Elpuss

"But it's striking as the world moves to three (or four) computing platforms, IBM isn't one of them."

IBM is very much one of the top players. In Cloud, I would say AWS, Azure and IBM are the top 3. Business Intelligence? Oracle, IBM, Microsoft. AI/Cognitive? IBM Watson, Google DeepMind are the big two, Microsoft Cognitive coming a far third (along with highly innovative smaller vendors with a great solution but lacking scale - these will be swallowed by the big 2 (or 3) in short order).

Don't discount Big Blue too early.

1 day wolfetone

The IBM PS/2 Model 70 was my first PC, bought for me on my 10th birthday in 1997. I knew nothing about computers, so it got thrown out a year later for an upgraded Acer 486. It had a huge impact on me, probably the one thing that got my love of computers and computing in general going.

In 2012, after many a weekend looking at eBay, I found a Model 70 for sale in London for £50. I got in the car, drove down and picked it up. I've other more interesting pieces of technology in my collection, but the Model 70 is the jewel as far as I'm concerned.

1 day Peter Gathercole

When the IBM AIX Systems Support Centre in the UK was set up in 1989/1990, the standard system on the desks of the support specialists was a PS/2 Model 80 running AIX 1.2. (I don't recall if they were ever upgraded to 1.2.1, and 1.3 was never marketed in the UK).

386DX at 25MHz with 4MB of memory as standard, upgraded to 8MB of memory and an MCA 8514 1024x768 XGA graphics adapter and Token Ring card. IIRC, the cost of each unit excluding the monitor ran to over £4500.

Mine was called Foghorn (the specialists were asked to name them, using cartoon character names).

These systems were pretty robust, and most were still working when they were replaced with IBM Xstation 130s (named after Native American tribes), and later RS/6000 43Ps (named after job professions - I named mine Magician, but I was in charge of them by then so could bend the rules).

I nursed a small fleet of these PS/2s, re-installed with OS/2 Warp (and memory cannibalized from the others to give them 16MB), for the Call-AIX handlers while they were in Havant. I guess they were scrapped after that. One user who had a particular need for processing power had an IBM Blue Lightning 486 processor (made by AMD) bought and fitted.

1 day Danny 14

I remember the day the boss signed a huge contract to strip out the IBMs and put Gateways in instead. 200 machines, and in 1990 it was brand new 486/33s (not the clock-doubled ones); it was a fortune. Out with IBM and in with DOS, Windows for Workgroups and good old Novell NetWare logon scripting. Good days, and it all pretty much worked. We even had Pegasus Mail back in the day, and a JANET connection.

1 day kmac499
Back in them days the old "No-one got fired for buying IBM" was still rampant, and the site I was on, or more accurately the tech support crew, backed PS/2 OS/2 for that very reason.

The wheels started to come off quite quickly with the cost and availability of MCA cards.

The other thing that really killed it isn't mentioned in the article: Token Ring networking. Anyone else remember Madge cards? I'm no network guy, but IIRC it was all very proprietary, and an expensive pig to modify once installed (MAUs?). Unlike the Xerox Ethernet system, with simpler cabling and connectors.

Of course some PS/2 architecture does survive to this day. The keyboard and mouse connectors; all that work, and all that's left is a color-coded plug ...

22 hrs Bandikoto

I fixed the Madge Tolkien Ring network card driver when I worked at Telebit, long, long ago, in a Valley far, far away. I'm pretty sure that was part of the work to get IPX working properly in Fred, as well. Token Ring was rated at 60% faster than thin Ethernet, and actually much faster under load, but a real pain in the behind to work with.

1 day Peter2
I remember the pictured computer in the article, and this was typed on the (pictured) Model M keyboard that came with that computer. They don't build equipment like that any more. Which is probably just as well, given that the keyboard weighs practically as much as a modern laptop.
15 hrs Anonymous Coward

Keyboard

I still have a Model M on my home PC. The only real issue with it is that it doesn't have a Windows key. I also wonder how long motherboards will keep coming with a PS/2 keyboard port.

Nothing I have ever used since comes close to it.

1 day BarryProsser
Perspective of the PC developers?

As a retired IBM UK employee, I have followed The Register's articles for years. I like this article more than most of the ones written about IBM. I would love to hear the perspective of the IBMers who were directly involved in the PC developments 30 years ago. I doubt many of them read or participate here. Where best to engage them? Yahoo, Linkedin, Facebook, others?

22 hrs Bandikoto

Re: Perspective of the PC developers?

Do you participate in your local IBM Club? I know that at least one still exists in San Jose, California. There's an IBM Club in Boca, which seems like a good place to start, given that was the home of the IBM Personal Computer. http://www.ibmsfqccaa.org/

1 day John Smith 19

Although technically OS/2 is still around

As eComStation

In hindsight (always 20/20), IBM's mistake was to seek to start making money off MCA ASAP. Had they been much more reasonable, they would have had a little bite (through licensing) of every board made, and MCA would have dominated.

BTW let's not forget what a PoS Windows 2.0 was or how the 286 was so retarded that the MS tech who worked out how to switch it from real to virtual mode and back was hailed a f**king genius.

1 day LDS

Re: Although technically OS/2 is still around

The 286 designers never thought that someone would want to return to the much more limited real mode once in the much more advanced protected (not virtual) mode. Entering protected mode was easy - just set the proper bit. Getting out was "impossible" - except through a reset.

After all, back then, backward compatibility still wasn't a perceived issue. Intel probably believed everyone would rewrite their software to work in the new, better, protected world. It turned out to be wrong (only to make the same mistake years later with Itanium, though...).

The problem was not the 286 itself (and IMHO we'll see the segmented memory model back again one day, because it implements a far better security model...); it was that most advanced DOS applications were written to bypass DOS itself and access the BIOS and hardware (RAM and I/O ports) directly, something that was hard to emulate on a 286 and made porting to other OSes more difficult.

Probably CP/M applications were better behaved, and a "protected" CP/M would have been easier to develop.

17 hrs Paul Crawford

Re: 286

The article has one significant mistake - the 286 did support protected operation, even the option for "no execute" on memory segments. But as it was designed to be either 'real mode' 8086-compatible OR protected mode, you had the fsck-up of having to use a keyboard-controller interrupt to bring it out of the halt state back to 'real' mode.

The major advances for the 386 were:

1) 32-bit registers

2) The "flat" memory model and virtual memory support (not the 16-bit segments of 286, OK still segments but way big enough for a long time)

3) The option to easily change protection modes.

14 hrs Richard Plinston

Re: Although technically OS/2 is still around

> BTW let's not forget what a PoS Windows 2.0 was or how the 286 was so retarded that the MS tech who worked out how to switch it from real to virtual mode and back was hailed a f**king genius.

It was an IBM tech that worked out how to switch the 80286 back to 8086 mode using the keyboard interface chip; there was a standard instruction to switch it to protected mode. The mechanism was incorporated into OS/2 1.0, and MS stole it for Windows/286.

6 hrs LDS

Re: 286

The segmented mode has more than the "no execute" bit. It has an "execute only" mode - the CPU can execute the code in the segment, but the application can neither read nor modify it (goodbye, ROP!). Still, managing segments adds complexity, security checks slow down execution, and applications are less free (which, from a security point of view, is a good thing).

The "flat memory model" was not an inherently new feature - it just means using segments large enough that the application never needs to load another one (which usually meant the whole application address space). Also, usually, code and data segments overlap to make everything "easier" (and far less secure).

286 16-bit segments were too small to allow for that. 386 32-bit segments allowed it, while the new paging feature allowed for "virtual memory" with a page (4k) granularity. 286 virtual memory worked at the segment level - with 64k segments it could work, but with flat 4GB segments it obviously couldn't.

But what made the 386 able to run DOS applications was the Virtual 86 mode. In that mode, the hardware itself trapped direct accesses to memory and I/O ports, and allowed the OS to handle them, without requiring complex, fragile hacks.

This mode is no longer available in 64 bit mode, and that's why Windows 64 bit can't run DOS applications any longer (Windows supports console applications which are native Windows applications, not DOS ones).
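The access rules LDS describes can be sketched as a toy model. This is purely illustrative Python (the `Segment` class and `check_access` helper are hypothetical names, and real x86 descriptors pack these rights into type bits of an 8-byte structure), but it captures the idea that an execute-only segment can run but never be read or written:

```python
# Toy model of x86 segment-descriptor access checks (illustrative only).
from dataclasses import dataclass

@dataclass
class Segment:
    limit: int          # highest valid offset within the segment
    executable: bool    # code segment if True, data segment if False
    readable: bool      # False for an "execute only" code segment
    writable: bool      # data segments only; code is never writable

def check_access(seg: Segment, offset: int, kind: str) -> bool:
    """True if a 'read', 'write' or 'execute' at `offset` would pass
    the segment's protection checks; False means a fault."""
    if offset > seg.limit:                 # limit check -> protection fault
        return False
    if kind == "execute":
        return seg.executable
    if kind == "write":
        return (not seg.executable) and seg.writable
    if kind == "read":
        # An execute-only code segment can't even be read, which is
        # what defeats ROP-style harvesting of code bytes.
        return seg.readable
    return False

code = Segment(limit=0xFFFF, executable=True, readable=False, writable=False)
data = Segment(limit=0xFFFF, executable=False, readable=True, writable=True)

assert check_access(code, 0x100, "execute")     # running code: OK
assert not check_access(code, 0x100, "read")    # execute-only: read faults
assert not check_access(code, 0x100, "write")   # code is never writable
assert check_access(data, 0x100, "write")       # ordinary data access: OK
assert not check_access(data, 0x10000, "read")  # beyond limit: faults
```

The extra checks on every access are also where the complexity and slowdown mentioned above come from.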

1 day David Lawrence
Ah yes I remember it well

I was working for IBM at the time, at their UK Headquarters, so I got my free copy of OS/2 Warp.

There was a big demo with loads of PCs all running Windows-based games and software. I hit problems installing it on my PC because I wasn't prepared to re-partition my hard drive and lose all my existing data. Rubbish.

I watched OS/2 go down the drain, followed by several other doomed products. Then the bottom dropped out of the mainframe market and things took another slide. The 'partnership' with Lotus was a bit of a disaster too.

IBM? They just do software and services right? And a lot (but not all) of the software was obtained through acquisitions. I remember when they actually made stuff - like a PC keyboard that cost £120.

Shame really.

1 day Danny 14

Re: Ah yes I remember it well

lotus notes still makes me blink asymmetrically when I see pictures of it. <shudder>

1 day adam payne
I loved the Model M keyboard - such a fine piece of design and engineering.
1 day ShelLuser

IBM was its own worst enemy

It's been a while but back in the days I was a serious OS/2 advocate. Look, if you even get other people to end up trying out OS/2 because they became sick and tired of Windows 3.11 often bodging up and not being able to network properly then yeah...

But IBM more often than not didn't even seem to care all that much. Looking back I think it was a bit the same as the stories we hear about Microsoft now: how divisions in the company do different things, don't always work together and in some rare cases even compete. Even at the expense of customers if they have to!

But IBM... I enrolled in the OS/2 support program (I seriously don't remember how I pulled this off anymore, I think I asked (and got) permission from my work to look into all this and also use their name) which ended up with IBM sending me several beta versions of OS/2 products. Including several OS/2 server environments. It was awesome. OS/2 server (a green covered double CD, that much I remember) was basically OS/2 with additional user management and network configuration settings.

Yet the funniest thing: IBM couldn't care less about your test results. At one time I got an invitation to go to IBM in the Netherlands for an OS/2 server demonstration which would also showcase some of their latest products (I recall being shown a very lightweight laptop). On arrival you had to search for the entrance and where it all was, because announcements or directions were nowhere to be found on site.

I bought OS/2 3.0 Warp and the 4.0 Merlin and it always worked like a charm. I seriously liked OS/2 much better than anything else. So when I had the opportunity to buy a PC through my work it was obvious what I would need to get, right? An IBM Aptiva. That would be the ultimate, the thing to get for OS/2. Because obviously an IBM OS will definitely run on IBM hardware, right?

Context: this was at the prime of my OS/2 endeavors. I could optimize and write a config.sys file from mind if I had to, I knew what drivers to use, which to skip, what each command did. Memory optimization? Easy. Bootstrapping a *single* floppy disk to get an OS/2 commandline? Hard, yet not impossible (try it, you'd normally get multiple disks to boot with).

It took me one whole weekend, dozens of phonecalls to the IBM support line, and the conclusion was simple: IBM did not care about OS/2 for their own hardware. And with that I mean not at all. It did not work, no matter what I tried. Even they told me that this wasn't going to work. Compaq out of all brands did care. Compaq, the brand which tried extremely hard to appeal to the general customer by making their hardware "easy" to use and also "easy" to customize (comparable to Dell a bit) didn't only target Microsoft and Windows. Noooo.... When I eventually ditched my IBM I got myself a Compaq and I also purchased an extra set of drivers and installation media (3 boxes of 3.5 floppy disks, approx. 37 in total) and guess what? Next to a full Windows 3.11 installation plus a different program manager and dozens of drivers it also included several disks with OS/2 drivers. I removed Windows and installed OS/2 that very same evening.

Compaq... which often advertised that they made Windows easier. And also delivered OS/2 drivers for their hardware...

IBM, which made OS/2 and also made hardware, never even bothered to provide OS/2 drivers for their own PCs. Not even if you asked them.

Does that look like a company which cared?

IBM was its own enemy sometimes.

1 hr Uncle Ron

Re: IBM was its own worst enemy

IBM was and still is its own enemy. So many comments above reflect this so well. E.g.: "IBM's mistake was that it tried to make money right away from MCA." So true. IMHO, it is the MBAs permeating the entire company that are the enemy. They know nothing about the business IBM is actually in, only about cost recovery, expense containment, and fecking business models. For the last 30 years, the real heroes in IBM have been the ones who cut the most, or spend the least, or pound suppliers the worst.

This virus is especially dangerous when a non-MBA contracts it. When they see who gets the most recognition, they can't wait to de-fund sales commissions or training programs or development staffs. They think they are "doing good." It is not only true in IBM. Companies all over the West are infected with the idea that reducing costs (and innovation) towards zero, and increasing revenue towards infinity, is all we should be working on. So, fewer Cheerios in the box, fewer ounces of pepper in the same size can, cut the sales-force, reduce the cost (and quality) of support, and on and on and on.

If there is one word that summarizes this disease, and a word I cannot stand to hear in -any- context, it is the word, "Monetize." It encapsulates all the evils of what I feel is the "Too Smart by Half" mentality. I cannot say how many times I have heard the phrase, "how much money are we leaving on the table?" or, "how many more will we sell if we..." and the room goes silent and a good idea is dropped.

I am sorry I am rambling, I am sad. There will never be another System/360 or Boeing 747. Incremental from here on out. Elon Musk doesn't seem to be infected...

1 day Anonymous South African Coward
Single Input Queue

Gotta love the silly old SIQ... one app chokes the SIQ and you can do absolutely nothing, except hard reboot :)

Fun times indeed. I had a couple of SIQ incidents as well. All but forgotten, but recalled to memory now :) Sometimes a CTRL-ALT-DEL would work, sometimes not.

And who remembers the CTRL-ALT-NUMLOCK-NUMLOCK sequence?

1 day Kingstonian

Memories - Nostalgia isn't what it used to be.

It all worked well and was of its time. PS/2 and OS/2 made sense in an IBM mainframe-using corporate environment (which I worked in), with some specialized workgroup LANs too. OS/2 EE had mainframe connectivity built in, and multitasking that worked. Token Ring was much better for deterministic performance where near-real-time applications were concerned, and more resilient than Ethernet at the time - Ethernet would die at high usage (CSMA/CD on a bus system) whereas Token Ring would still work if loaded to 100% and just degrade performance gracefully. Ethernet only gained the upper hand in many large corporate environments when 10BASE-T took off. Token Ring would connect to the mainframe too, so no more IRMA boards for PCs.
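The qualitative point about load behaviour can be illustrated with a deliberately crude slotted-contention model. This is toy code, not an accurate CSMA/CD or 802.5 simulation (real Ethernet senses the carrier and backs off, so it degrades later than this model suggests), but it shows why random contention collapses as offered load rises while token passing merely saturates:

```python
# Toy slotted model: each of n stations transmits in a slot with
# probability p. A slot carries a frame only if exactly ONE station
# transmits; two or more collide and the slot is wasted. A token ring
# serializes transmissions, so it never collides.

def contention_throughput(n: int, p: float) -> float:
    # P(exactly one of n stations transmits) = n * p * (1-p)^(n-1)
    return n * p * (1 - p) ** (n - 1)

def token_throughput(n: int, p: float) -> float:
    # Offered load n*p, serialized by the token: at most 1 frame/slot.
    return min(n * p, 1.0)

n = 20
for p in (0.01, 0.05, 0.5, 0.9):
    print(f"p={p}: contention={contention_throughput(n, p):.3f} "
          f"token={token_throughput(n, p):.3f}")
```

As per-station demand p approaches 1, the contention throughput tends to zero (every slot collides), while the token-ring figure stays pinned at 1 frame per slot - "loaded to 100% and just degrading gracefully".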

There was OS/2 software available to have a central build server where each workstation could be defined on the server and then set up via the network by booting from floppy disk - useful in the corporate world. DB2 was available for OS/2, so a complete family of useful tools existed. And IBM published its standards.

IBM was used to the big corporate world and moving down to individuals via its PCs whereas Microsoft at that time was more individual standalone PCs and moving up to corporate connectivity. The heritage still shows to some extent. Novell was still the LAN server of choice for us for some time though.

The PS/2 was easy to take apart - our supplier showed us a PS/2 Model 50 when it first came out. He had to leave the room briefly, and we had the lid off the machine and had taken it apart (no tools needed) before he returned. He was very worried, but it was very easy just to slide the parts back together and they just clipped into place - not something you could do with other PCs then. I came across an old price list recently - the IBM Model M keyboard for the PS/2 was around £200 (without a cable, which came with the base unit - short for the Model 50 and 70 desktops and long for the 60 and 80 towers!). Memory was very expensive too, and OS/2 needed more than DOS. In fact EVERYTHING was expensive.

OS/2 service packs (patches) came on floppy disks in the post. You had to copy them and then return them!

Starting in computing just after the original IBM PC was announced this all brings back fond memories and a huge reminder of the industry changes.

15 hrs addinall

Fake News. Re: Memories - Nostalgia isn't what it used to be.

Your memory is shot. OS/2 was developed by Microsoft AND IBM, first released jointly in December 1987. Bill Gates was at Wembley flogging the OS during 1988.

Way before then, Microsoft had Version 7 UNIX running, released as XENIX for the 8086/80286 architecture.

The development of OS/2 began when IBM and Microsoft signed the "Joint Development Agreement" in August 1985. It was code-named "CP/DOS" and it took two years for the first product to be delivered.

OS/2 1.0 was announced in April 1987 and released in December. The original release was text-mode only; a GUI was introduced with OS/2 1.1 about a year later. OS/2 featured an API for controlling the video display (VIO) and handling keyboard and mouse events, so that programmers writing for protected mode need not call the BIOS or access hardware directly. In addition, the development tools included a subset of the video and keyboard APIs as linkable libraries, so that family-mode programs were able to run under MS-DOS. A task switcher named Program Selector was available through the Ctrl-Esc hotkey combination, allowing the user to select among multitasked text-mode sessions (or screen groups; each could run multiple programs).

Communications and database-oriented extensions were delivered in 1988, as part of OS/2 1.0 Extended Edition: SNA, X.25/APPC/LU 6.2, LAN Manager, Query Manager, SQL.

The promised graphical user interface (GUI), Presentation Manager, was introduced with OS/2 1.1 in October 1988. It had a similar user interface to Windows 2.1, which was released in May of that year. (The interface was replaced in versions 1.2 and 1.3 by a tweaked GUI closer in appearance to Windows 3.1.)

OS/2 occupied the "Intelligent Workstation" part of SAA (Systems Application Architecture) and made use of the APPC PU2.1 LU6.2 SNA network stack.

5 hrs Anonymous Coward

"The development of OS/2 began..." etc. etc.

That mostly looks copied and pasted from Wikipedia. A link would have been enough.

7 mins Anonymous Coward
Re: Fake News. Memories - Nostalgia isn't what it used to be.

"OS/2 was developed by Microsoft AND IBM"

And that's why an OS/2 subsystem lingered in Windows for some time.

(They finally killed it, right? And the POSIX subsystem?)

1 day Primus Secundus Tertius
No Stop button in window

The big defect in OS/2 that I met was the lack of a stop button in any window. Yes, you could close the window but that did not stop the task, just left it sitting in the background.

It was Windows 95 that brought us a proper stop button.

We had an OS/2 system returned to us as not working. The user had been closing the window, and then later starting up another instance. The system was clogged with dormant tasks. Once I removed them, everything worked again; what we had to do then was to update our User Guide.

22 hrs Frish
IBM what could have been

At one point, OS/2 could run multiple PC-DOS Windows "Better than Microsoft can" since memory was partitioned, and a crash in one window wouldn't affect the whole machine. Microsoft wrote a stub to detect whether OS/2 was installed, and killed the attempt to load PC-DOS...

Where IBM Boca missed the boat was in thinking that this was a "personal" computer, and therefore not a "commercial" one. IBM should have OWNED that market entirely, and the competing product within the corporation that lost to Boca's product recognized that, but got shoved aside by the sexiness of the PC, all the departures from Big Blue tradition, etc.

Also, the way IBM execs got paid meant they were shortsighted about 'solutions' that included IBM Products that they didn't get paid for.

As Product Marketing Manager for IBM CICS OS/2 (announced at Comdex '89 - a Sr. VP from Microsoft shared with me on the show floor, "That's the most important announcement in this entire show," as I handed him a press release that was NOT included in the show's press kit, since the PC division was in charge and kept other divisions' products from being included), I tried to get the then-President of the PC division to just whisper CICS OS/2 to the management of a very large insurance firm. He would have left with a 40,000-PC order but instead chose to say nothing... IDIOTIC but true.

18 hrs Primus Secundus Tertius

Re: IBM what could have been

"...Microsoft wrote a stub to detect whether OS/2 was installed..."

Windows 95 was distributed to end users as an "update CD". It would not run unless it detected an installed Windows 3.x or was presented with the first W3 floppy disk. It would also accept the first OS/2 Warp 3 floppy disk.

9 hrs Pompous Git

Re: IBM what could have been

"Windows 95 was distributed to end users as an "update CD". It would not run unless it detected an installed Windows 3.x or was presented with the first W3 floppy disk. It would also accept the first OS/2 Warp 3 floppy disk."
That's because Warp included a full Win 3.x license.

22 hrs Jim 59

GEM

I remember seeing a demonstration of GEM at the PCW show in 1986 or 1987 - at Olympia, I think it was. Very impressive it was too. Didn't it also come bundled on some Amstrad PCs?

Re: GEM

> Didn't it also come bundled on some Amstrad PCs ?

Yes. They came with DRI's DOS Plus and GEM, as well as MS-DOS.

It was in all Atari ST machines (520ST and derivatives) running TOS, which was written by DRI. It also came with the BBC Master 512, which ran DRI's DOS Plus and GEM on an 80186 (or was it 80188?) processor.

22 hrs Justin Clift
OS/2 resurrection

Just to point out, OS/2 is being resurrected as "ArcaOS":

https://www.arcanoae.com/blue-lion/

They've been working on it for a while, and (though I've not used it) it's apparently targeted for release in under two weeks:

https://www.arcanoae.com/arcaos-5-0-launch-on-hold-for-a-few-more-days/

Modern Qt 5.x has been ported to it, and many Qt based apps are apparently working properly.

I say this because DB Browser for SQLite is one of them. One of our community members has been keeping us informed. ;)

22 hrs Jim-234

OS/2 Desktop virtualization before it was cool

I used to run OS/2 very well on a 386 based PC

I could have several windows open, each with a different virtual OS, and you could move the virtual OS image files around, etc.

So, for example, if you had some odd networking hardware (like LANtastic) that only wanted a specific DOS version, you could spin up a new little dedicated virtual OS image for it.

While people think all this virtualization stuff is so new... it was around 25 years ago and worked rather well, considering the hardware available at the time.

It's a shame Microsoft was able to blackmail IBM into discontinuing OS/2; with a different vision, OS/2 might have become a serious contender for workplace use.

18 hrs Handlebar

Re: OS/2 Desktop virtualization before it was cool

Er, IBM were doing virtual machines in the 1960s ;-)

9 hrs Pompous Git

Re: OS/2 Desktop virtualization before it was cool

"It's a shame Microsoft was able to blackmail IBM into discontinuing OS/2"
Say what? Are you crazy?

22 hrs VanguardG

The quote from Mr Watson was supposedly in 1943 - when computers were larger than many houses and weighed as much as several buses. To say nothing of being extremely pricey (since they were essentially built on-site by hand) and expensive to operate, since they needed to be staffed by a crew trained in just that one machine, plus the power needs were enormous, and the heat produced prodigious. And, at that time, you really had fairly limited tasks that needed doing that required that kind of number-crunching. Did he say it? Maybe not, but given the time frame, it really doesn't seem as boneheaded as it would have sounded 15 years later.

Microchannel *would* have been sweet, were it not for the expense. A basic sound card of the time was $70 for ISA - the same card in MCA was $150.

As for plugging in a keyboard after boot... I still find it amusing that someone actually wrote an error message of "Keyboard not detected. Press F1 to continue." If there's no keyboard, there's no F1 to press.

21 hrs Primus Secundus Tertius

There are other variations of that quote about five computers: e.g. the UK would only need that kind of number.

For the publicly known computations of that time - gunnery trajectories etc., that number is perhaps right. But I believe over a dozen instances of Colossus were built for code cracking. So even then the estimate of five was way out.

However the real expansion of computing came with data processing, where record keeping outweighed the relatively small amount of computation. IBM should have known that, given their existing business in accounting machines fed with punched cards.

18 hrs Updraft102

Well, yeah... there's no F1 to press, so if you want the PC to boot and you see that message, you will have to plug a keyboard in and press F1. That and make sure the keyboard lock is in the off/unlocked position, which is probably more what the keyboard check/stop on fail configuration was about than anything else.

The keyboard lock was an actual physical lock on the front of the PC that used a round key (like on vending machines) which, if set to the locked position, would prevent the PC from detecting the keyboard or responding to keypresses.

Enabling the "Press F1 to resume on keyboard error" BIOS setting made the keylock into a rudimentary system protection device. It wasn't all that effective against anyone who could get into the computer case, as bypassing the lock was as easy as pulling the cord for the keylock off the motherboard, but PC cases also typically had provisions for a small padlock to keep the cover on the case back then too. It wasn't great protection, but it probably provided some relief from pranksters, who probably would not be determined enough to cause physical damage to someone's PC for a joke.

14 hrs Richard Plinston

> So even then the estimate of five was way out.

Actually he was speculating on the sales of the particular model that they were currently building. He was counting the number of government agencies that would be able to afford them and find them useful.

It was only much later that anyone tried to use computers for the commercial purposes that would find them a place in businesses: LEO - the Lyons Electronic Office - was developed for payroll, stock distribution and manufacturing (of cakes).

In the 1950s the British government were deciding where _the_ computer would go. They chose a town that was a major railway junction, because then the train loads of punched cards could easily be shipped to it.

20 mins VanguardG

I remember - and people were perpetually misplacing the keys to their padlocks. The round keylocks required a lot of expertise to crack open, but the padlocks... well... they were nearly always bought cheap. Basic (legal) hand tools could pop them open in seconds, without any damage to the case and, at most, a scratch or two on the padlock. Some people who were kinda serious had monitor stands with a built-in locking keyboard drawer. But those were usually employed by people who had a "favorite" keyboard they were afraid would get stolen by a jealous co-worker, rather than because of any actual security concerns.

21 hrs Tim Brown 1

The UK had the best tech for personal computers at the time

For PCs during that period, in pure tech terms , Acorn's ARM machines running RISC-OS were way ahead of offerings from anyone else and prior to that the BBC micro (built by Acorn).

It's just such a shame that Acorn lacked any international marketing savvy then.

14 hrs Richard Plinston

Re: The UK had the best tech for personal computers at the time

> It's just such a shame that Acorn lacked any international marketing savvy then.

And yet Acorn became a worldwide powerhouse of chip design expertise that currently sells licences for billions of chips every year. Even before phones started using them, ARM was selling tens of millions of chip licences to power embedded equipment (modems, routers, PABXs, ...).

ARM = Acorn RISC Machines

9 hrs jake

Re: The UK had the best tech for personal computers at the time

Not really. I had a 16 bit Heath H11A personal computer in 1978. Acorn didn't ship a 16 bit computer until 1985 ... The old girl still runs. Loudly.

http://www.decodesystems.com/heathkit-h11-ad-1.gif

https://en.wikipedia.org/wiki/Heathkit_H11

21 hrs Anonymous Coward

"Thomas J Watson may never have said the world will have five computers"

It is funny that people take this statement as a sign of IBM not understanding the potential of the computing market, if it was ever even said. It actually makes a lot of sense, though. TJW, correctly, didn't think it would make sense for every business out there to go build its own little data center and a "best we can do" computing infrastructure. Better to let a giant time-share (now called a cloud provider) handle all of that complexity and just rent the resources as needed. It is kind of like saying there is only room for a handful of electrical utilities in the world. Even if everyone, at MSFT's urging, went out and bought a gas-powered generator for their house... it still makes sense that there is only room for a handful of utilities.

21 hrs JohnCr

A tiny bit more

A few comments.

1) IBM was STRONGLY urged to skip the 286 and create a true 32 bit, 386 version of OS/2. Even Microsoft was strongly pushing IBM in this direction. IBM's resistance to do so was a major contributor to the split between IBM and Microsoft.

2) The MicroChannel was better than the ISA bus. The problem was that it was not good enough for the future. The PC's evolution was moving towards faster graphics and network (100 Mbps) connectivity. The MicroChannel, even three generations into the future, did not have the bandwidth to meet these needs. The industry evolved to the PCI interface. As an interesting historical coincidence, PCI uses the same type of connectors as the MicroChannel did. And the PCI interface found its way into other IBM systems.

3) The IBM PC was inferior to Apple's products. The PC became more successful because a new industry and IBM's customers worked together to make it successful. (Steve Jobs - The Lost Interview) In 1987 IBM turned a deaf ear to the industry and its customers. When IBM stopped listening, its fortunes turned. This culture took over the whole company and was a major factor in the company almost going out of business in the 1990s.

14 hrs Richard Plinston

Re: A tiny bit more

> Even Microsoft was strongly pushing IBM in this direction.

Microsoft had developed its own 286 versions of MS-DOS: 4.0 and 4.1 (not to be confused with the much later 4.01). These were also known as European DOS because they were used by Siemens, ICL (where I worked) and Wang. These versions supported limited multitasking of background tasks and one foreground program. I have a manual here on how to write 'family' applications that would run on 8086 MS-DOS or in protected mode on 80286 MS-DOS 4.x.

It was dumped when they switched to writing OS/2 with IBM.

9 hrs niksgarage

Re: A tiny bit more

MCA was an architecture that worked fine in workstations, just not in PCs. I met Fred Strietelmeyer (fairly sure that's his name), the architect of MCA, in Austin, TX in the '80s; he told me that MCA was a fine architecture, but not all implementations were. The RS/6000 had multiple bus mastering working with MCA, mostly because the address on the channel was a logical address which went through the memory manager, so AIX could easily control and protect the physical memory space. The PS/2 used physical addresses, which meant that either bus mastering was turned off or the bus-mastering cards needed to have a copy of the memory manager on board as well. If you were running AIX, MCA was not a problem, or even a question deserving of five minutes of thought.

The PC industry hated MCA: the connector, the architecture and its licensing. They came out with EISA, a backward-compatible connector to extend the AT bus. I always found it a huge irony that PCI used the same physical connector as MCA years later.

9 hrs niksgarage

Re: A tiny bit more

AND you are exactly right about the 286/386 wars. I was in Boca when the AIX guys from Austin came to see the CPDOS developers (as OS/2 was known in Dev) and showed them true multitasking on the 386. They were baffled why IBM was writing another OS when we already had a virtualising, 32-bit-ready, pre-emptive multitasker that ran on multiple hardware platforms. Its only issue was that it couldn't run on the 286. And for that reason alone, IBM spent enough money on OS/2 development that they could have paid for the Hubble telescope AND had it repaired.

I also saw the numerous prototype machines in Boca (one called 'Nova' in dev, as I recall) which had a 16MHz 386, lots of memory on board and an AT expansion bus. (It also had the 1.44MB diskette drive.) Nice machine; it could have sold really well. Only the Model 30 was allowed to be shipped, in case the AT bus continued to outshine MCA.

Marshalltown

What grief?

It always puzzled me what grief OS/2 was supposed to create. I used it as a substitute for Windows, running Windows software, into the early years of the century. I gathered that IBM might have still been attempting to extract its pound of flesh from developers, but as far as I was concerned, it worked fine. I built my own machines and it ran on them without problems. I also liked Rexx as a scripting language. It was immensely more useful than DOS and much less of a pain (to me) than MS BASIC and all its little dialects and subspecialties.

The only real grief I encountered was developers who "simply couldn't" do a version for OS/2 - and, of course, MS doing their best to see that their software was less compatible than need be.

History seems to forget how thoroughly MS would break things with each "improvement." Many of the useful "improvements" in Windows were first present in OS/2, and some really nice features vanished when IBM decided the effort wasn't worth the candle. The browser with its tree-structured browsing history was remarkable; no browser since has had anything to match it. Even now, relics of the OS/2 interface are still present in KDE and GNOME. Microsoft has finally moved "on" with the horrible-looking and -acting interface of Windows 10.

Ah... Concurrent DOS...

IBM did actually use DR Concurrent DOS 286 - but in their 4680 Point-of-Sale OS (described often as a real P.O.S. by those of us who used it).

Re: Ah... Concurrent DOS...

> IBM did actually use DR Concurrent DOS 286 - but in their 4680 Point-of-sale (described often as a real P.O.S. by those of us who used it) OS.

Yes, that was a DRI product, but it was not Concurrent-DOS; it was FlexOS. This shared code with MP/M-86, as did Concurrent-DOS (neither of which had an 80286 product), but was 80286-based. The main difference was that FlexOS would not run MS/PC-DOS programs, while Concurrent-CP/M-86 and Concurrent-DOS would run several of them at once (as well as CP/M-86 programs).

DRI had had pre-emptive multi-user, multi-tasking systems since 1978, with MP/M, which ran on 8085 and Z80 micros with bank-switched memory (I have a couple of RAIR Blackbox/ICL PC1s here and an ICL PC2 8085AH2 with 512 Kbytes). MP/M 2 and MP/M-86 (for the 8086) were released around 1980. Concurrent-CP/M-86, with multiple virtual screens, ran on an IBM PC (and other machines - I have a stack of 8086 ICL PC2s) and could use EEMS memory cards such as the AST RamPage to get several Mbytes of memory and do context switching with just a handful of register moves.

Concurrent-CP/M-86 was demonstrated the same month as MS-DOS 2 was released. It had pre-emptive multi-tasking (and multiuser with serial terminals). The virtual screens were just a keystroke away so one could run SuperCalc, Wordstar, and other programs at the same time and just flick between them - even on the serial terminals.

Later, this was developed for 386 into DR-Multiuser-DOS from which DR-DOS 5 and 6 were derived.

There was a FlexOS-386 which had an enhanced GEM-X but it was dropped to concentrate on the Concurrent range.

PS/2 and OS/2

The naming convention always struck me as odd. In my mind, the /2 meant divide by 2, i.e. half an OS and half a PS :-)

Re: PS/2 and OS/2

Ah - "OS divided by 2" - I'd never thought of it that way before.

Interestingly, for the short time that MS were promoting it, they spelt it OS-2, which could I suppose be read as "OS minus 2".

I'm not sure what, if anything, we can deduce from that!

Re: PS/2 and OS/2

Nah. IBM (and many others) used the "/" for many, many products. Like System/360. It was a naming convention that seemed to lend weight, not division, to a product family.

Big Blue: IBM's Use and Abuse of Power

To understand the lead up to PCs people should read Richard DeLamarter's Big Blue: IBM's Use and Abuse of Power.

It goes back to the 1890s and shows how IBM became dominant through less than ethical practices. The antitrust suit against IBM was dropped by Ronald Reagan, which prompted DeLamarter to write the book.

Make sure your driver strategy is in place if you launch a new O/S

Good article Andrew. I was there in Miami when IBM launched the PS/2. They closed off the streets six blocks around the exhibition centre, and as far as I remember the Beach Boys played. They should probably have saved their money.

One thing you missed is that not only was OS/2 late, but the driver support in the operating system was very poor. This meant that as well as blocking all the plug-in cards through the new bus architecture, they also bricked all of the add-on peripherals for the PS/2 that had worked with the IBM PC. Add to that the fact that the OS/2 driver team started competing with driver developers for driver business and also blocked them from developing for the architecture (until the OS/2 DDK made a brief appearance and then disappeared), and the factors that contributed to IBM's demise were complete.

I recall that when the driver team saw the source code of one of our drivers some years later, they threatened to sue us. That was until I pointed out that the code was based on the OS/2 DDK, and they went quiet - but they couldn't quite believe that we had managed to obtain a copy in the few weeks that it had popped its head above the surface.

Microsoft worked out early on that driver support is a key element in the success of an operating system. Something that they seem to have lost sight of a bit from Windows Vista onwards, although I suppose the switch to 64-bit has made backwards compatibility more difficult.

Keep the nostalgia coming Andrew, it's not like it used to be! Tony Harris

6 hrs BinkyTheMagicPaperclip

Such a missed opportunity

I didn't use OS/2 1.x at the time (only later), but beyond 1.0 it was fine for server based apps and a good, solid platform. Not so much for desktop apps - insufficient driver support, high memory requirements, and limited app support put paid to that.

OS/2 2.x and beyond was a much improved proposition, but suffered in competition with the large number of Windows apps. The userbase were not, in general, prepared to pay more for a smaller amount of higher quality features - the reality of running a minority platform.

OS/2 might have got further if IBM had concentrated on Intel, but instead they wasted vast amounts of effort on OS/2 PPC. Much though I loved OS/2, the succession plan was flawed. Windows NT is simply better architected: Microsoft spent the time maintaining compatibility with 16-bit apps, and NT had much improved security and multi-user support. OS/2 was effectively dead before it really caused a problem, but it would have caused issues later on.

System <n>/Mac OS were also flawed, and the early versions of OS X sucked, but Apple are much better at retaining compatibility whilst updating the OS (at least for a few years, until they drop old kit like a brick).

I've still got an OS/2 system, and a lot of apps, and will be assembling an OS/2 1.3 system (because I'm a masochist and like trying OSes). I haven't bothered with eComStation, but might give Arca 5.0 a go if it's any good and not ludicrously priced. There aren't too many OS/2 apps I really want to run these days, though.

One final note: it's a *synchronous* input queue, not a single one. If messages are not taken off the input queue, it hangs the interface but does not stop apps running. There was a workaround implemented in Warp 3 FixPack 16, but until then a badly behaved app was a real pain. However, Win32 successfully moved from the synchronous input queues of Win16 to asynchronous queues in Win32 without breaking too many apps. IBM should have put in the engineering effort to do the same.

There are also some substantial differences between OS/2's architecture and Windows (or indeed anything else). For instance, the co-ordinate origin in Windows is at the top left of the screen, but in OS/2 it's the bottom left (OS/2 uses the mathematically correct option here).

The "fun" takeaway from this is that while "industry analysts" have been consistently wrong for at least the last 30 years, we're still listening to what they say...

Concurrent DOS, great stuff!

One of the first multiuser systems I really learned as a spotty teenager back in the late '80s, working with my Dad on getting shared DataEase databases working at his workplace. We had C/DOS on the "master" PC system and a couple of Wyse terminals hanging off the serial ports - three systems independently using a shared, albeit cut-down, PC-based RDBMS. I loved it so much as a kid that I ended up with a career working in database systems.

Better than OS/2 / Win9x / NT

Around the time OS/2 was making its way onto the scene and most people used DESQview to multitask on their 286/386 PCs, Quantum Software Systems (now QNX Software Systems) had a real-time, multiuser, multitasking, networked and distributed OS available for the 8086/8088 and 80286 processors.

On PC/XT hardware it ran without any memory protection, but the same binaries would run on both the real-mode and protected-mode kernels.

Something about being able to do

$ [2] unzip [3]/tmp/blah.zip

would run the unzip program on node 2, read the /tmp/blah.zip file on node 3 as the source archive, and extract the files into the current working directory.

We accessed a local BBS that ran a 4-node QNX network (6 incoming phone lines + X.25 (Datapac)).

Even supported diskless client booting and the sharing of any device over the network. Though at around $1k for a license, it wasn't "mainstream".

It's too bad the few times Quantum tried to take it mainstream, the plans failed. Both the Unisys ICON and a new Amiga had chosen QNX as the base for their OS.

OS/2 from the Big App developers' POV

http://www.wordplace.com/ap/index.shtml

WordPerfect's then-CEO wrote this memoir and eventually put it on the web for free. It's actually a good read, especially if you've never experienced the insane self-absorption of American corporate insiders - which is why the company went from biggest in the world to *pop* so suddenly.

OS/2 is first mentioned near the end of Ch. 8, and then passim. It gives quite a different view of OS/2.

Productive OS/2?

We had a productive OS/2 machine at one of our sites until very recently. I think the only reason it was got rid of was that the site was axed.

It was running a protein separation machine and had to dual-boot into Win98 if you wanted to copy your data to the NAS. It was impressive that it lasted so long, bearing in mind it lived in a cold room at <5°C.

1 hr ImpureScience

Still Sort Of Miss It

I really liked OS/2, and for a while I thought it would take the place that Windows now owns. But IBM had no idea how to sell to single end users and get developers on board. Despite having a superior product, their financial policies guaranteed that the only customers ended up being banks and insurance companies.

I'm a musician, and I remember going on for over a year with the guy they had put in charge of MIDI on OS/2. It never happened, because which bank, or what insurance company, would be interested?

[Feb 21, 2017] Designing and managing large technologies

Feb 21, 2017 | economistsview.typepad.com
RC AKA Darryl, Ron : February 20, 2017 at 04:48 AM
RE: Designing and managing large technologies

http://understandingsociety.blogspot.com/2017/02/designing-and-managing-large.html

[This is one of those days where the sociology is better than the economics or even the political history.]

What is involved in designing, implementing, coordinating, and managing the deployment of a large new technology system in a real social, political, and organizational environment? Here I am thinking of projects like the development of the SAGE early warning system, the Affordable Care Act, or the introduction of nuclear power into the civilian power industry.

Tom Hughes described several such projects in Rescuing Prometheus: Four Monumental Projects That Changed the Modern World. Here is how he describes his focus in that book:

Telling the story of this ongoing creation since 1945 carries us into a human-built world far more complex than that populated earlier by heroic inventors such as Thomas Edison and by firms such as the Ford Motor Company. Post-World War II cultural history of technology and science introduces us to system builders and the military-industrial-university complex. Our focus will be on massive research and development projects rather than on the invention and development of individual machines, devices, and processes. In short, we shall be dealing with collective creative endeavors that have produced the communications, information, transportation, and defense systems that structure our world and shape the way we live our lives. (3)

The emphasis here is on size, complexity, and multi-dimensionality. The projects that Hughes describes include the SAGE air defense system, the Atlas ICBM, Boston's Central Artery/Tunnel project, and the development of ARPANET...


[Of course read the full text at the link, but here is the conclusion:]


...This topic is of interest for practical reasons -- as a society we need to be confident in the effectiveness and responsiveness of the planning and development that goes into large projects like these. But it is also of interest for a deeper reason: the challenge of attributing rational planning and action to a very large and distributed organization at all. When an individual scientist or engineer leads a laboratory focused on a particular set of research problems, it is possible for that individual (with assistance from the program and lab managers hired for the effort) to keep the important scientific and logistical details in mind. It is an individual effort. But the projects described here are sufficiently complex that there is no individual leader who has the whole plan in mind. Instead, the "organizational intentionality" is embodied in the working committees, communications processes, and assessment mechanisms that have been established.

It is interesting to consider how students, both undergraduate and graduate, can come to have a better appreciation of the organizational challenges raised by large projects like these. Almost by definition, study of these problem areas in a traditional university curriculum proceeds from the point of view of a specialized discipline -- accounting, electrical engineering, environmental policy. But the view provided from a discipline is insufficient to give the student a rich understanding of the complexity of the real-world problems associated with projects like these. It is tempting to think that advanced courses for engineering and management students could be devised making extensive use of detailed case studies as well as simulation tools that would allow students to gain a more adequate understanding of what is needed to organize and implement a large new system. And interestingly enough, this is a place where the skills of humanists and social scientists are perhaps even more essential than the expertise of technology and management specialists. Historians and sociologists have a great deal to add to a student's understanding of these complex, messy processes.

[A big YEP to that.]


cm -> RC AKA Darryl, Ron... , February 20, 2017 at 12:32 PM
Another rediscovery that work is a social process. But certainly well expressed.

It (or the part you quoted) also doesn't say, but hints at the obvious "problem" - social complexity and especially the difficulty of managing large scale collaboration. Easier to do when there is a strong national or comparable large-group identity narrative, almost impossible with neoliberal YOYO. You can always compel token effort but not the "intangible" true participation.

People are taught to ask "what's in it for me", but the answer better be "the same as what's in it for everybody else" - and literally *everybody*. Any doubts there and you can forget it. The question will usually not be asked explicitly or in this clarity, but most people will still figure it out - if not today then tomorrow.

[Jan 11, 2017] Fake History Alert: Sorry BBC, but Apple really did invent the iPhone

Notable quotes:
"... In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard). ..."
"... The N900, N810 and N800 are to this day far more "little computers" than any other smartphone so far. Indeed, as they ran a Debian Linux derivative with a themed Enlightenment based desktop, which is pretty much off the shelf Linux software. While they didn't have multitouch, you could use your finger on the apps no problem. It had a stylus for when you wanted extra precision though. ..."
"... I was reading a BBC news web article and it was wrong too. It missed out emphasising that the real reason for success in 2007 was the deals with operators, cheap high cap data packages, often bundled with iPhone from the Mobile Operator. ..."
"... Actually if you had a corporate account, you had a phone already with email, Apps, ability to read MS Office docs, web browser and even real Fax send/receive maybe 5 or 6 years before the iPhone. Apart from an easier touch interface, the pre-existing phones had more features like copy/paste, voice control and recording calls. ..."
"... I remember having a motorola A920 way back in 2003/2004 maybe, and on that I made video calls, went online, had a touch interface, ran 'apps', watched videos.... in fact I could do everything the iPhone could do and more... BUT it was clunky and the screen was not large... the iPhone was a nice step forward in many ways but also a step back in functionality ..."
"... Apple invented everything... They may have invented the iPhone but they DID NOT invent the "smartphone category" as that article suggests. ..."
"... Microsoft had Smartphone 2002 and Pocket PC 2000 which were eventually merged into Windows Mobile and, interface aside, were vastly superior to the iPhone's iOS. ..."
"... Devices were manufactured in a similar fashion to how android devices are now - MS provided the OS and firms like HTC, HP, Acer, Asus, Eten, Motorola made the hardware. ..."
"... The government was looking for a display technology for aircraft that was rugged, light, low powered and more reliable than CRTs. They also wanted to avoid the punitive royalties taken by RCA on CRTs. It was the work done in the 1960s by the Royal Radar Establishment at Malvern and George William Gray and his team at the University of Hull that led to modern LCDs. QinetiQ, which inherited RSRE's intellectual property rights, is still taking royalties on each display sold. ..."
"... The key here is that Steve Jobs had the guts to force the thought of a useful smartphone, gadget for the user first and phone second into the minds of the Telcos, and he was the one to get unlimited/big data bundles. ..."
"... He identified correctly, as many had before but before the power to do anything about it, that the customers are the final users, not the telcos. ..."
Jan 11, 2017 | theregister.co.uk

deconstructionist

Re: The point stands

The point is flat on its back, just like the sophistic reply.

Let's take Apple's first machines: they copied the mouse from Olivetti, and they took the OS look from a Rank Xerox engineer's work. The private sector takes risks and plagiarizes when it can. But the missing player here is the amateur: take the BBS, which private individuals designed, built and ran. It was the precursor to the net, and a lot of .com companies like AOL and CompuServe were born there.

And the poor clarity in the BBC article is mind-numbing. The modern tech industry has the Fairchild Camera company as its granddaddy, which is about as far from federal or state intervention and innovation as you can get.

Deconstructionism only works when you understand the brief and use correct and varied sources, not just one crackpot seeking attention.

Lotaresco

Re: Engineering change at the BBC?

"The BBC doesn't "do" engineering "

CEEFAX, PAL Colour TV, 625 line transmissions, The BBC 'B', Satellite Broadcasting, Digital Services, the iPlayer, micro:bit, Smart TV services.

There's also the work that the BBC did in improving loudspeakers including the BBC LS range. That work is one reason that British loudspeakers are still considered among the world's best designs.

By all means kick the BBC, but keep it factual.

LDS

Re: I thought I invented it.

That was the first market demographic: iPod users happy to buy one that could also make calls. But that's also where Nokia failed spectacularly: it was by nature phone-centric. Its models were phones that could also do something else. True smartphones are instead little computers that can also make phone calls.

In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard).

The iPod probably taught Apple that you need a proper "finger-based" UI for this kind of device - especially for the consumer market - and multitouch solved a lot of problems.

Emmeran

Re: I thought I invented it.

Shortly thereafter I duct-taped four of them together and invented the tablet.

My version of it all is that the glory goes to iTunes for a consumer-friendly interface (ignore that concept, Linux guys) and easy music purchases; the rest was natural progression and Chinese slave labor.

Smartphones and handheld computers were definitely driven by military dollars worldwide, but so was the internet. All that fact shows is that a smart balance of Capitalism & Socialism can go a long way.

Ogi

Re: I thought I invented it.

>That was the first market demographics - iPod users happy to buy one who could also make calls. But that's also were Nokia failed spectacularly - it was by nature phone-centric. Its models where phones that could also make something else. True smartphones are instead little computers that can also make phone calls. In many ways Treo/Palm and Windows CE anticipated it, but especially the latter tried to bring a "desktop" UI on tiny devices (and designed UIs around a stylus and a physical keyboard). the iPod probably taught Apple you need a proper "finger based" UI for this kind of devices - especially for the consumer market - and multitouch solved a lot of problems.

I don't know exactly why Nokia failed, but it wasn't because their smart phones were "phone centric". The N900, N810 and N800 are to this day far more "little computers" than any other smartphone so far. Indeed, as they ran a Debian Linux derivative with a themed Enlightenment based desktop, which is pretty much off the shelf Linux software. While they didn't have multitouch, you could use your finger on the apps no problem. It had a stylus for when you wanted extra precision though.

I could apt-get (with some sources tweaking) what I wanted outside of their apps. You could also compile and run proper Linux desktop apps on it, including openoffice (back in the day). It ran like a dog and didn't fit the "mobile-UI" they created, but it worked.

It also had a proper X server, so I could forward any phone app to my big PC if I didn't feel like messing about on a small touchscreen. To this day I miss this ability: to just connect via SSH to my phone over wifi, run a smartphone app, and have it appear on my desktop like any other app would.

It had xterm, it had Perl built in, it had Python (a lot of it was written in Python); you could even install a C toolchain on it and develop C code on it. People ported standard desktop UIs to it, and with a VNC/RDP server you could use it as a portable computer just fine (just connect to it using a thin client, or a borrowed PC).

I had written little scripts to batch-send New Year's SMS greetings to contacts, and even piped the output of "fortune" to a select few numbers just for kicks (the days of free SMS and no chat apps). To this day I have no such power on my modern phones.
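The batch-greeting script described above can be sketched in Python (the language much of the platform itself was written in, per the comment). This is a minimal sketch under stated assumptions: `send_sms`, the contact numbers, and the greeting are all hypothetical stand-ins, since a real device-side version would call the platform's messaging interface, which is not shown here.

```python
# Sketch of a batch New Year's SMS script, in the spirit of the one
# described above. send_sms() is a hypothetical stand-in for the
# phone's real messaging call; only the batching logic is shown.

outbox = []  # records (number, text) pairs instead of really sending

def send_sms(number, text):
    # Stand-in: a real implementation would hand the message to the
    # device's messaging service.
    outbox.append((number, text))

contacts = ["+4712345678", "+4798765432"]  # hypothetical numbers

for number in contacts:
    send_sms(number, "Happy New Year!")
```

Swapping the greeting for the output of `subprocess.run(["fortune"], ...)` would reproduce the fortune-piping trick, assuming `fortune` is installed.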

Damn, now that I think back, it really was a powerful piece of kit. I actually still miss the features *sniff*

And now that I think about it, I suspect they failed because their phones were too much "little computers" at a time when people wanted a phone. Few people (outside of geeks) wanted to fiddle with X-forwarding, install SSH, script/program/modify, or otherwise customise their stuff.

Arguably the weakest app on the N900 was the phone application itself, which was not open source and so could not be improved by the community - so much so that people used to say it wasn't really a phone but rather a computer with a phone attached, which is exactly what I wanted.

Mage

Invention of iPhone

It wasn't even really an invention.

The BBC frequently "invents" tech history. They probably think MS and IBM created personal computing, when in fact they held it back for ten years and destroyed innovative companies along the way.

The only significant part was the touch interface by FingerWorks.

I was reading a BBC news web article and it was wrong too. It missed out emphasising that the real reason for success in 2007 was the deals with operators, cheap high cap data packages, often bundled with iPhone from the Mobile Operator.

This is nonsense:

http://www.bbc.com/news/technology-38550016

"Those were the days, by the way, when phones were for making calls but all that was about to change."

Actually if you had a corporate account, you had a phone already with email, Apps, ability to read MS Office docs, web browser and even real Fax send/receive maybe 5 or 6 years before the iPhone. Apart from an easier touch interface, the pre-existing phones had more features like copy/paste, voice control and recording calls.

The revolution was ordinary consumers being able to have a smartphone AND afford the data. The actual HW was commodity stuff: I had the dev system for the Samsung SC6400 ARM CPU that it used.

Why did other phones use resistive + stylus instead of capacitive finger touch?

Capacitive touch existed in the late 1980s, but the "holy grail" was handwriting recognition, not gesture control, though Xerox and IIS had both worked on it, and gestures were defined before the 1990s. So the UK guy didn't invent anything.

Also irrelevant.

http://www.bbc.com/news/technology-38552241

Mine's the one with an N9110 and later an N9210 in the pocket. The first commercial smartphone was 1998, and crippled by high per-MByte or per-second (or both!) charging. Also in 2002 max speed was often 28K, while in 2005 my landline was still 19.2K till I got Broadband, though I had 128K (ISDN) in the 1990s in the city before I moved.

xeroks

Re: Invention of iPhone

The ground breaking elements of the iPhone were all to do with usability:

The fixed price data tariff was - to me - the biggest innovation. It may have been the hardest to do, as it involved entrenched network operators in a near monopoly. The hardware engineers only had to deal with the laws of physics.

The apple store made it easy to purchase and install apps and media. Suddenly you didn't have to be a geek or an innovator to make your phone do something useful or fun that the manufacturer didn't want to give to everyone.

The improved touch interface, the styling, and Apple's cachet all helped and, I assume, fed into the efforts to persuade the network operators to give the average end user access to data without fear.

MrXavia

Re: Invention of iPhone

"Those were the days, by the way, when phones were for making calls but all that was about to change."

I remember having a motorola A920 way back in 2003/2004 maybe, and on that I made video calls, went online, had a touch interface, ran 'apps', watched videos.... in fact I could do everything the iPhone could do and more... BUT it was clunky and the screen was not large... the iPhone was a nice step forward in many ways but also a step back in functionality

imaginarynumber

Re: Invention of iPhone

"The fixed price data tariff was - to me - the biggest innovation".

In my experience, the iPhone killed the "all you can eat" fixed-price data tariffs.

I purchased a HTC Athena (T-Mobile Ameo) on a T-Mobile-Web and Walk contract in Feb 2007. I had unlimited 3.5G access (including tethering) and fixed call minutes/texts.

When it was time to upgrade, I was told that iphone 3G users were using too much data and that T-Mobile were no longer offering unlimited internet access.

Robert Carnegie

"First smartphone"

For fun, I put "first smartphone" into Google. It wasn't Apple's. I think a BBC editor may have temporarily said that it was.

As for Apple inventing the first multitouch smartphone, though -

http://www.bbc.co.uk/news/technology-38552241 claims, with some credibility, that Apple's engineers wanted to put a keyboard on their phone. The Blackberry phone had a keyboard. But Steve Jobs wanted a phone that you could work with your finger (without a keyboard).

One finger.

If you're only using one finger, you're not actually using multi-touch?

nedge2k

Apple invented everything... They may have invented the iPhone but they DID NOT invent the "smartphone category" as that article suggests.

Microsoft had Smartphone 2002 and Pocket PC 2000 which were eventually merged into Windows Mobile and, interface aside, were vastly superior to the iPhone's iOS.

Devices were manufactured in a similar fashion to how android devices are now - MS provided the OS and firms like HTC, HP, Acer, Asus, Eten, Motorola made the hardware.

People rarely know how long HTC has been going as they used to OEM stuff for the networks - like the original Orange SPV (HTC Canary), a candybar-style device running Microsoft Smartphone 2002. Or the original O2 XDA (HTC Wallaby), one of the first Pocket PC "phone edition" devices and, IIRC, the first touchscreen smartphone to be made by HTC.

GruntyMcPugh

Re: Apple invented everything...

Yup, I had Windows-based smartphones made by Qtek and HTC, and my first smartphone was an Orange SPV M2000 (a Qtek 9090) three years before the first iPhone, and I had an O2 XDA after that, which in 2006 had GPS, MMS, and an SD card slot, which held music for my train commute.

Now I'm a fan of the Note series; I had one capacitive-screen smartphone without a stylus (the HTC HD2) and missed the stylus too much.

nedge2k

Re: Apple invented everything...

Lotaresco, I used to review a lot of the devices back in the day, as well as using them daily and modifying them (my phone history for ref: http://mowned.com/nedge2k ). Not once did they ever fail to make a phone call. Maybe the journalist was biased and made it up (Symbian was massively under threat at the time and all sorts of bullshit stories were flying about), maybe he had dodgy hardware, who knows.

Either way, it doesn't mean that the OS as a whole wasn't superior to what Nokia and Apple produced - because in every other way, it was.

imaginarynumber

Re: Apple invented everything...

@Lotaresco

"The weak spot for Microsoft was that it decided to run telephony in the application layer. This meant that any problem with the OS would result in telephony being lost....

Symbian provided a telephone which could function as a computer. The telephony was a low-level service and even if the OS crashed completely you could still make and receive calls. Apple adopted the same architecture, interface and telephony are low level services which are difficult to kill."

Sorry, but if iOS (or Symbian) crashes you cannot make calls. In what capacity were you evaluating phones in 2002? I cannot recall ever seeing a Windows Mobile blue screen. It would hang from time to time, but it never blue-screened.

MR J

Seeing how much free advertising the BBC has given Apple over the years I doubt they will care.

And let's be honest here, the guy is kinda correct. We didn't just go from a dumb phone to a smartphone; there was a gradual move towards it as processing power increased and electronic packages were made smaller. Had we gone from the old brick phones straight to an iPhone, then I would agree that they owned something like TNT.

Did Apple design the iPhone - Yes, of course.

Did Apple invent the Smart Phone - Nope.

IBM had a touch screen "smart" phone in 1992 that had a square screen with rounded corners.

What Apple did was put it into a great package with a great store behind it, and they made sure it worked - and worked well. I personally am not fond of Apple due to the huge price premium they demand and the overly locked-down ecosystems, but I will admit it was a wonderful product design.

Peter2

Re: "opinion pieces don't need to be balanced"

"I am no fan of Apple, but to state that something was invented by the State because everyone involved went to state-funded school is a kindergarten-level of thinking that has no place in reasoned argument."

It's actually "Intellectual Yet Idiot" level thinking. Google it. You're right that arguments of this calibre have no place in reasoned argument, but this sort of quality of thinking being shoved down people's throats by the media is why a hell of a lot of people are "fed up with experts".

TonyJ

Hmmm....iPhone 1.0

I actually got one of these for my wife. It was awful. It almost felt like a beta product (and these are just a few of the things I still remember).

I think it's reasonably fair to say that it was the app store that really allowed the iPhone to become so successful, combined with the aura and mystique that Jobs was bringing to Apple's products at the time.

As to who invented this bit or that bit - I suggest you could pull most products released in the last 10-20 years and have the same kind of arguments.

But poor show on the beeb for their lack of fact checking on this one.

TonyJ

Re: Hmmm....iPhone 1.0

"...The original iPhone definitely has a proximity sensor. It is possible that your wife's phone was faulty or there was a software issue...."

Have an upvote - hers definitely never worked (and at the time I didn't even know it was supposed to be there), so yeah, probably faulty. I'd just assumed it didn't have one.

Lotaresco

There is of course...

.. the fact that the iPhone wouldn't exist without its screen and all LCD displays owe their existence to (UK) government sponsored research. So whereas I agree that Mazzucato is guilty of rabidly promoting an incorrect hypothesis to the status of fact, there is this tiny kernel of truth.

The government was looking for a display technology for aircraft that was rugged, light, low powered and more reliable than CRTs. They also wanted to avoid the punitive royalties taken by RCA on CRTs. It was the work done in the 1960s by the Royal Radar Establishment at Malvern and George William Gray and his team at the University of Hull that led to modern LCDs. QinetiQ, which inherited RSRE's intellectual property rights, is still taking royalties on each display sold.

anonymous boring coward

Re: There is of course...

I had a calculator in the late 1970s with an LCD display. It had no resemblance to my phone's display.

Not even my first LCD screened laptop had much resemblance with a phone's display. That laptop had a colour display, in theory. If looked at at the right angle, in the correct light.

Innovation is ongoing, and not defined by some initial stumbling attempts.

juice

Apple invented the iPhone...

... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.

And that's been true of pretty much every invention since someone discovered how to knap flint.

As to how much of a part the state had to play: a lot of things - especially in the IT and medical field - have been spun out of military research, though by the same token, much of this is done by private companies funded by government sources.

Equally, a lot of technology has been acquired through trade, acquisition or outright theft. In WW2, the United Kingdom gave the USA a lot of technology via the Tizard mission (and later, jet-engine technology was also licenced), and both Russia and the USA "acquired" a lot of rocket technology by picking over the bones of Germany's industrial infrastructure. Then, Russia spent the next 40 years stealing whatever nuclear/military technology it could from the USA - though I'm sure some things would have trickled the other way as well!

Anyway, if you trace any modern technology back far enough, there will have been state intervention. That shouldn't detract in any way from the work done by companies and individuals who have produced something where the sum is greater than the parts...

Roland6

Re: Apple invented the iPhone...

... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.

And that's been true of pretty much every invention since someone discovered how to knap flint.

Not so sure; Singer did a little more with respect to the sewing machine - his was the first that actually worked. Likewise Marconi was the first with a working wireless. Yes, both made extensive use of existing technology, but both clearly made that final inventive step, something that isn't so clear in the case of the examples you cite.

Equally, a lot of technology has been acquired through trade, acquisition or outright theft.

Don't disagree, although your analysis omitted Japanese and Chinese acquisition of 'western' technology and know-how...

Anyway, if you trace any modern technology back far enough, there will have been state intervention.

Interesting point, particularly when you consider the case of John Harrison, the inventor of the marine chronometer. Whilst the government did offer a financial reward it was very reluctant to actually pay anything out...

Aitor 1

Apple invented the iPhone, but not the smartphone.

The smartphone had been shown before in several incarnations, including "all touch screen" designs, several years before Apple decided to dabble in smartphones. So no invention here.

As for the experience, again, nothing new. All thought of before, and in good part even implemented.

The key here is that Steve Jobs had the guts to force the thought of a useful smartphone, gadget for the user first and phone second into the minds of the Telcos, and he was the one to get unlimited/big data bundles.

He identified correctly, as many had before but without the power to do anything about it, that the customers are the final users, not the telcos.

The rest of the smartphones were culled before birth by the telecom industry, as they demanded certain "features" that nobody wanted but that lined their pockets nicely with minimum investment.

So I thank Steve Jobs for that and for being able to buy digital music.

[Dec 26, 2016] FreeDOS 1.2 Is Finally Released

Notable quotes:
"... Jill of the Jungle ..."
Dec 26, 2016 | news.slashdot.org
(freedos.org)

Posted by EditorDavid on Sunday December 25, 2016 @02:56PM from the long-term-projects dept.

Very long-time Slashdot reader Jim Hall -- part of GNOME's board of directors -- has a Christmas gift. Since 1994 he's been overseeing an open source project that maintains a replacement for the MS-DOS operating system, and has just announced the release of the "updated, more modern" FreeDOS 1.2 !

[Y]ou'll find a few nice surprises. FreeDOS 1.2 now makes it easier to connect to a network. And you can find more tools and games, and a few graphical desktop options including OpenGEM. But the first thing you'll probably notice is the all-new installer that makes it much easier to install FreeDOS. And after you install FreeDOS, try the FDIMPLES program to install new programs or to remove any you don't want. The official announcement is also available at the FreeDOS Project blog.

FreeDOS also lets you play classic DOS games like Doom, Wolfenstein 3D, Duke Nukem, and Jill of the Jungle -- and today marks a very special occasion, since it's been almost five years since the release of FreeDOS 1.1. "If you've followed FreeDOS, you know that we don't have a very fast release cycle," Jim writes on his blog. "We just don't need to; DOS isn't exactly a moving target anymore..."

[Nov 24, 2016] American Computer Scientists Grace Hopper, Margaret Hamilton Receive Presidential Medals of Freedom

Nov 23, 2016 | developers.slashdot.org
(fedscoop.com)

Posted by BeauHD on Wednesday November 23, 2016 @02:00AM from the blast-from-the-past dept.

An anonymous reader quotes a report from FedScoop:

President Barack Obama awarded Presidential Medals of Freedom to two storied women in tech -- one posthumously to Grace Hopper, known as the "first lady of software," and one to programmer Margaret Hamilton. Hopper worked on the Harvard Mark I computer, and invented the first compiler.

"At age 37 and a full 15 pounds below military guidelines, the gutsy and colorful Grace joined the Navy and was sent to work on one of the first computers, Harvard's Mark 1," Obama said at the ceremony Tuesday. "She saw beyond the boundaries of the possible and invented the first compiler, which allowed programs to be written in regular language and then translated for computers to understand." Hopper followed her mother into mathematics, and earned a doctoral degree from Yale, Obama said.

She retired from the Navy as a rear admiral. "From cell phones to Cyber Command, we can thank Grace Hopper for opening programming up to millions more people, helping to usher in the Information Age and profoundly shaping our digital world," Obama said. Hamilton led the team that created the onboard flight software for NASA's Apollo command modules and lunar modules, according to a White House release.

"At this time software engineering wasn't even a field yet," Obama noted at the ceremony. "There were no textbooks to follow, so as Margaret says, 'there was no choice but to be pioneers.'" He added: "Luckily for us, Margaret never stopped pioneering. And she symbolizes that generation of unsung women who helped send humankind into space."

[Sep 06, 2016] The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory

Notable quotes:
"... The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory.[6] The TCP/IP communications protocols were developed for ARPANET by computer scientists Robert Kahn and Vint Cerf, and incorporated concepts by Louis Pouzin for the French CYCLADES project. ..."
"... In 1980 DoD was a huge percent of the IC business, a lot of the R&D was done at Bell Labs, some of that for telecom not DoD. By 1995 or so DoD was shuttering its IC development as it was all being done for Wii. Which is a minor cause for why software is so hard for DoD; the chips are not under control and change too fast. ..."
"... About 20 years ago I conversed with a fellow who was in ARPANET at the beginning. We were getting into firewalls at the time with concerns for security (Hillary was recently elected to the senate) and he was shaking his head saying: "It was all developed for collaboration.... security gets in the way". ..."
Sep 05, 2016 | economistsview.typepad.com

pgl : Monday, September 05, 2016 at 11:07 AM

Al Gore could not have invented the Internet since Steve Jobs is taking the bow for that. Actually Jobs started NeXT which Apple bought in 1997 for a mere $427 million. NeXT had sold a couple of computer models that did not do so well but the platform software allowed Apple to sell Web based computers. BTW - the internet really began in the 1980's as something called Bitnet. Really clunky stuff back then but new versions and applications followed. But yes - the Federal government in the 1990's was very supportive of the ICT revolution.
ilsm -> pgl... , Monday, September 05, 2016 at 11:59 AM
DARPA did most of it to keep researchers talking.
RC AKA Darryl, Ron -> pgl... , Monday, September 05, 2016 at 12:35 PM
https://en.wikipedia.org/wiki/ARPANET

The Advanced Research Projects Agency Network (ARPANET) was an early packet switching network and the first network to implement the protocol suite TCP/IP. Both technologies became the technical foundation of the Internet. ARPANET was initially funded by the Advanced Research Projects Agency (ARPA) of the United States Department of Defense.[1][2][3][4][5]

The packet switching methodology employed in the ARPANET was based on concepts and designs by Americans Leonard Kleinrock and Paul Baran, British scientist Donald Davies, and Lawrence Roberts of the Lincoln Laboratory.[6] The TCP/IP communications protocols were developed for ARPANET by computer scientists Robert Kahn and Vint Cerf, and incorporated concepts by Louis Pouzin for the French CYCLADES project.

As the project progressed, protocols for internetworking were developed by which multiple separate networks could be joined into a network of networks. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States from research and education organizations. ARPANET was decommissioned in 1990...

Creation

By mid-1968, Taylor had prepared a complete plan for a computer network, and, after ARPA's approval, a Request for Quotation (RFQ) was issued for 140 potential bidders. Most computer science companies regarded the ARPA–Taylor proposal as outlandish, and only twelve submitted bids to build a network; of the twelve, ARPA regarded only four as top-rank contractors. At year's end, ARPA considered only two contractors, and awarded the contract to build the network to BBN Technologies on 7 April 1969. The initial, seven-person BBN team were much aided by the technical specificity of their response to the ARPA RFQ, and thus quickly produced the first working system. This team was led by Frank Heart. The BBN-proposed network closely followed Taylor's ARPA plan: a network composed of small computers called Interface Message Processors (or IMPs), similar to the later concept of routers, that functioned as gateways interconnecting local resources. At each site, the IMPs performed store-and-forward packet switching functions, and were interconnected with leased lines via telecommunication data sets (modems), with initial data rates of 56kbit/s. The host computers were connected to the IMPs via custom serial communication interfaces. The system, including the hardware and the packet switching software, was designed and installed in nine months...

sanjait -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 01:09 PM
Though the thing we currently regard as "the Internet", including such innovations as the world wide web and the web browser, was developed as part of "the Gore bill" from 1991.

http://www.theregister.co.uk/2000/10/02/net_builders_kahn_cerf_recognise/

https://en.wikipedia.org/wiki/High_Performance_Computing_Act_of_1991

In case anyone is trying to argue Gore didn't massively contribute to the development of the Internet, as he claimed.

pgl -> sanjait... , Monday, September 05, 2016 at 02:37 PM
So the American government helped pave the way for this ICT revolution. Steve Jobs figured out how Apple could make incredible amounts of income off of this. He also sheltered most of that income in a tax haven so Apple does not pay its share of taxes. And Tim Cook lectured the Senate in May of 2013 on why they should accept this. No wonder Senator Levin was so upset with Cook.
ilsm -> pgl... , Monday, September 05, 2016 at 04:29 PM
In 1980 DoD was a huge percent of the IC business, a lot of the R&D was done at Bell Labs, some of that for telecom not DoD. By 1995 or so DoD was shuttering its IC development as it was all being done for Wii. Which is a minor cause for why software is so hard for DoD; the chips are not under control and change too fast.
ilsm -> RC AKA Darryl, Ron... , Monday, September 05, 2016 at 04:25 PM
About 20 years ago I conversed with a fellow who was in ARPANET at the beginning. We were getting into firewalls at the time with concerns for security (Hillary was recently elected to the senate) and he was shaking his head saying: "It was all developed for collaboration.... security gets in the way".

aras@multix.no (Arne Asplem) wrote:

> I'm the program chair for a one day conference on Unix system
> administration in Oslo in 3 weeks, including topics like network
> management, system administration tools, integration, print/file-servers,
> security, etc.

> I'm looking for actual horror stories of what has gone wrong because
> of bad system administration, as an early morning wakeup.

> I'll summarise to the net if there is any interest.

> -- Arne



Etc

FAIR USE NOTICE This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. We are making such material available in our efforts to advance understanding of environmental, political, human rights, economic, democracy, scientific, and social justice issues, etc. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 107, the material on this site is distributed without profit exclusively for research and educational purposes. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.

ABUSE: IPs or network segments from which we detect a stream of probes might be blocked for no less than 90 days. Multiple types of probes increase this period.


Copyright © 1996-2016 by Dr. Nikolai Bezroukov. www.softpanorama.org was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License.

The site uses AdSense, so you need to be aware of Google's privacy policy. If you do not want to be tracked by Google, please disable JavaScript for this site. This site is perfectly usable without JavaScript.

Copyright of original materials belongs to their respective owners. Quotes are made for educational purposes only, in compliance with the fair use doctrine.


This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links as it develops like a living tree...

You can use PayPal to make a contribution to support the development of this site and speed up access. In case softpanorama.org is down, you can use the mirror at softpanorama.info

Disclaimer:

The statements, views and opinions presented on this web page are those of the author (or referenced source) and are not endorsed by, nor do they necessarily reflect, the opinions of the author's present and former employers, SDNP, or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.

Last modified: October 01, 2017