So my dad was knee-deep in networking back before there was networking. He built the first network at DOE. By the time he retired he had 7,000 computers in his workgroup. Even now he's got his legacy machines in their own special black market corner of the DOE web. He's like the old guy with the modem in TRON. And he loves telling his version of the OS/2 story, which I've never been able to find on the web. According to him, IBM and Microsoft had a legal agreement that Microsoft would help develop OS/2 for IBM, and IBM, for their part, was compelled to use whatever Microsoft used. When "the divorce" happened, Microsoft looked over the terms of their agreement and discovered that they were not required to support what they had written for IBM. The way my dad tells it, IBM got a big bucket of assembly called "OS/2". They notably did not get any source code. Which put IBM in the position of having to decompile an operating system before they could even keep adapting it to their machines, let alone sell it. Microsoft basically threw an 18-month delay at them and they had to eat it. Microsoft, meanwhile, was busy developing NT. And while Win95 launched onto the scene with a big splash, it's the legacy code from NT that kept Windows running through Vista. In other words, Microsoft was playing the long game for big institutional clients (knowing that they'd never have to worry about IBM in the home) and monkeywrenched IBM right when they had the opportunity to destroy the server-based makeup of big business. I spent the summer at an internship where the only computer ran OS/2 Warp. It was the summer Win95 came out. OS/2 was kludgey; even getting that Windows VM open was problematic at best. It was slicker than Win 3.11, for damn sure, but also hella expensive, slow, and nothing ran on it. Also, clicking on things could permanently alter stuff in ways you didn't understand, rendering the machine inoperable. Me? I'm still bummed BeOS never caught on. That shit was slick.
That said, Haiku is there if you want it. It's still pretty cool, and still not much good for anything but being pretty cool.
BeOS was beautiful, but it was too late to be able to just toss an OS out there and have it catch on. There were no applications, and not much reason to write them. If they'd had some good graphics applications they might have been able to win over the TV people who still missed the Amiga, or maybe the movie people fleeing SGI, but they didn't. The free Unixes were better nerd toys, and BeOS wasn't suited for the server-side applications where the money was. There were no games for it, and making the kind of games that would draw people had already gotten too expensive for someone to do as a hobby. Plenty of people bought copies, and I've never heard from anyone who didn't remember it being really cool, but no one found a use for it.
Yesterday it worked
Today it is not working
The web is like that.
OS/2 let you move a window without bringing it to the front. That's the only thing I miss.
Focus-follows-mouse and click-to-raise could give you something like that in X window managers that let you pick their focus policy.
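(Aside, for anyone who wants to poke at this: in the X protocol, moving a window and raising it are separate requests, which is why the window manager's focus/raise policy gets to decide whether a moved or focused window also comes forward. The little C/Xlib sketch below is just an illustration, not anything from the thread; the window id would come from something like xwininfo, and the names and usage are my own assumptions.)

    /* move_no_raise.c -- toy illustration: in X11, "move" and "raise" are
     * separate requests, so a window can be repositioned without being
     * brought to the front.  XMoveWindow only changes x/y; the stacking
     * order is untouched unless something also asks for a raise.
     *
     * Caveat: under a reparenting window manager the move may be applied
     * to the client inside its decorative frame; a real tool would send
     * the EWMH _NET_MOVERESIZE_WINDOW message instead.  This is a sketch.
     *
     * build: cc move_no_raise.c -o move_no_raise -lX11
     * usage: ./move_no_raise <window-id> <x> <y>   (id from xwininfo)
     */
    #include <X11/Xlib.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        if (argc != 4) {
            fprintf(stderr, "usage: %s <window-id> <x> <y>\n", argv[0]);
            return 1;
        }

        Display *dpy = XOpenDisplay(NULL);   /* connect to $DISPLAY */
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        Window win = (Window) strtoul(argv[1], NULL, 0);

        /* reposition only -- note there is no XRaiseWindow call here */
        XMoveWindow(dpy, win, atoi(argv[2]), atoi(argv[3]));
        XFlush(dpy);                         /* push the request out */

        XCloseDisplay(dpy);
        return 0;
    }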
Surely someone has posted In the Beginning was the Command Line by now. Has any computer essay from the last century held up so well?
The Rise of "Worse is Better"
Yes, that's pretty good, thanks. He makes two points:
I believe that worse-is-better, even in its strawman form, has better survival characteristics than the-right-thing, and that the New Jersey approach when used for software is a better approach than the MIT approach.
He supports the first point well, but I don't see the values by which he makes the second claim. Is market dominance the measuring rod of quality?