We were still supporting IE6 until 2014 or so because some of our government users refused to use anything else, including on some pages that used Google Maps. Google dropped support for IE6 long before we did, so I got to reverse engineer the parts of Maps that broke and monkeypatch them to work in IE6 again. There were still plenty of holdouts when we decided it wasn't worth the trouble of bending over backwards for them anymore.
I helped a friend a while back. They were running software on Windows 95. This was in, I think, 2013? Win95, yeah, sigh... yeah. The new software was bundled with hardware and the vendor wanted $700K for it. We bought a desktop with serial ports, found a PCI serial extender with Win95 support that also worked in VMware, installed VMware, made a virtual Win95 box, connected all the serial stuff, tested, and got it all working. We cheered when it all worked, let me tell you. Now they have a VMware file that they can move from hardware to hardware and just load the VM to get everything running again. Unfortunately the new versions of VMware no longer support Win95, AS THEY FUCKING SHOULDN'T. IE6? I'd charge double. I'd even consider charging triple.
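For anyone attempting the same trick: pointing the guest's serial port at a host port is only a few lines in the VM's .vmx file. A sketch, assuming a Windows host where the PCI extender's first port enumerates as COM3 (the port name and numbering are my assumptions, not the actual setup):

    serial0.present = "TRUE"
    serial0.fileType = "device"
    serial0.fileName = "COM3"
    serial0.startConnected = "TRUE"

"device" tells VMware to pass through a real host port rather than a file or named pipe, which is what you want when actual hardware hangs off the other end.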
I can't recall the particulars, but my father has a radiation instrument of some sort that was custom built in 1981 to the tune of $350k. It communicates via proprietary software running on a National Semiconductor Starplex. Its data has to get across the Department of Energy's encrypted network. There's something about a random cipher that opens a random extension on a sandboxed VoIP PBX like some kludged-up Enigma machine or something. The thing is literally this guy.
We are both chumps compared to the guys who are still using a C64 in production. Somewhere out there, a VIC-20 is chugging along.
Let's be honest, though: a C64 at an auto shop is useful for showing you spark hysteresis and that's it. American Motors Corp was making cars well past the point of Commodore making 64s. Back when I was an acoustician, I had gear that ran on CP/M. But that was ten years ago now.
I still run into stuff running CP/M as a friend offering "bail me out, pal" advice to people I know. It is lightweight, low latency, as solid as software can be, and loves to have analog data fed into it via serial ports. Novell supposedly has some similar benefits, but I never got into the Novell stuff. From my own experience, USB-to-serial conversions work for people who do not need microsecond or better timing. I use a few of these adapters to feed GPS time signals into cameras for a telescope and spectroscopy project I gave up on a few years back. I did not have an old PC that could run WinXP reliably, so I had to use Windows 7 and the converters. Since you are an audio guy, you've probably run into this as well. This same issue is one of several reasons why the 'real' stuff uses Arduino and not Raspberry Pi for its inputs. Microsats, for instance. It is my understanding that the Pi inputs run over USB while the Arduino's run native into the bus.
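To make the "good enough timing" point concrete: reading a GPS puck through one of those adapters really is just opening a port and streaming text. A minimal sketch, assuming Linux, a device at /dev/ttyUSB0, and the usual 4800-baud NMEA output (device path and baud rate are assumptions); the USB hop adds millisecond-ish jitter, which is fine for tagging camera frames and hopeless for microsecond work:

    // Minimal sketch: dump NMEA sentences from a USB-serial GPS on Linux.
    // /dev/ttyUSB0 and 4800 baud are assumptions; adjust for your adapter.
    #include <cstdio>
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    int main() {
      int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);
      if (fd < 0) { perror("open"); return 1; }

      termios tio{};
      tcgetattr(fd, &tio);
      cfmakeraw(&tio);            // raw mode: no line editing, no translation
      cfsetispeed(&tio, B4800);   // most NMEA units default to 4800 8N1
      cfsetospeed(&tio, B4800);
      tcsetattr(fd, TCSANOW, &tio);

      char buf[256];
      ssize_t n;
      while ((n = read(fd, buf, sizeof buf)) > 0)
        fwrite(buf, 1, n, stdout); // $GPRMC sentences carry the UTC time fix
      close(fd);
      return 0;
    }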
I have never in my life encountered serial protocols that require microsecond timing. Pretty much every serial thing I've ever had to deal with runs RS232, and I have seen RS232 run a thousand feet on f'n lamp cord. The 'real' stuff uses Arduino because it has a much longer lineage and a lot more devkit support. The 'fake' stuff runs RaspPi because it was deliberately designed for frivolous uses by people afraid of programming. Not that there's anything wrong with that. They both spook the shit out of me.
Raspberry Pi is really good at all the things you'd use a Mac mini or an old beige box gathering dust in your closet for, at $30 and small enough to fit in a project box. It's not very good at embedded systems, but it's not really meant for embedded systems. My standing desk is controlled by an Arduino, because after a couple of months of sending bug reports to the manufacturer I learned they didn't have any actual engineers on staff, just some dude with a degree in industrial design, and it would be much less trouble to DIY than to try to funnel electronics 101 to the dude with the industrial design degree via customer support people who act insulted when I point out that half the bugs I'm reporting are caused by them confusing pin 4 and pin 17.
These things are freaking awesome. We are swapping out all our thin clients for these, as the minis are cheaper than the new thin clients, have a Windows license, and we can change our minds and run local software if needed. For the price, they are great for businesses that need cheap, yet good, desktops. Still can't beat a Pi 3 for the price if you are willing to put in some sweat equity and do some Linux learning.
Glad to be of service. The thing is literally a laptop with no battery and no screen. And everything inside is Intel so Linux will run great on them; I hope to put an Ubuntu load on one of these and tinker with it when the deployment slows down.
I so want to get into it but all the shit I need accomplished is like way the fuck harder than anybody who sucks at compiling should try for. I decided Arduino wasn't for me when I was trying to figure out how to write code that would turn a button press into RS232 and even that was well beyond my ability. You have my mad respect. Hey, you're smart. Know what I want to do? I want to use a RaspPi or equivalent to pull a webcam image and display it full screen. Or pull a webstream and display it full screen. That's it. Like, I want a live-capture photo frame. Why is this so hard?
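For what it's worth, the button-to-serial thing is about as small as Arduino sketches get. A minimal sketch, assuming a button wired between pin 2 and ground; the pin number, baud rate, and message are all made up for illustration, and note that the Arduino's pins speak TTL-level serial, so true RS232 voltages need a MAX232-style level shifter in between:

    // Hypothetical sketch: print a line over serial whenever a button is pressed.
    const int BUTTON_PIN = 2;  // button wired between this pin and ground
    int lastState = HIGH;

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP); // internal pull-up: unpressed reads HIGH
      Serial.begin(9600);                // 9600 8N1, readable by any serial terminal
    }

    void loop() {
      int state = digitalRead(BUTTON_PIN);
      if (state == LOW && lastState == HIGH) { // HIGH-to-LOW edge = press
        Serial.println("PRESS");
        delay(50); // crude debounce
      }
      lastState = state;
    }

And the live-capture photo frame is a few lines with OpenCV. A sketch assuming camera index 0, an attached display, and Esc as the quit key (all my choices, not any particular product's):

    // Hypothetical live-capture photo frame: grab frames from the default
    // camera and show them full screen until Esc is pressed.
    #include <opencv2/opencv.hpp>

    int main() {
      cv::VideoCapture cap(0);           // camera index 0 is an assumption
      if (!cap.isOpened()) return 1;
      cv::namedWindow("frame", cv::WINDOW_NORMAL);
      cv::setWindowProperty("frame", cv::WND_PROP_FULLSCREEN, cv::WINDOW_FULLSCREEN);
      cv::Mat img;
      while (cap.read(img)) {
        cv::imshow("frame", img);
        if (cv::waitKey(30) == 27) break; // 27 = Esc
      }
      return 0;
    }

Swap the VideoCapture argument for a stream URL and the same loop handles the webstream case.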
That's… I don't know. Baffling? Yeah, let's go with that. To think that I was struggling with my professor's explanation of why the university grid still offers (in parallel to the Debian stable stack) some 'ancient' versions of compilers like Python 2.4 or GCC 3.2. Or why it took until 1996 to start using Fortran 90 instead of 77, and why C99 only replaced plain ANSI C in 2007. Why are so many institutions opposed to upgrading? I get that the cost is nothing to sneeze at, but this just feels silly. What's next, the Pentagon still using hub-based network infrastructure because these newfangled MAUs are expensive? ;P
I think it's often a case of some weird inertia, how jacked government acquisition processes are, and pragmatism (if it ain't broke, don't fix it). A few years back, I worked for the state deciding unemployment claims (i.e., I'd do a hearing and decide whether someone got benefits). Once we did a hearing, we had two options. One, we could write our decision via a telnet client, where the program we used could not do line breaks or word wrapping in the editor (thankfully decisions would print okay), but we had to put in paragraph breaks via a special character so that when the decision was printed, it would look right. The other option was to dictate your decision into a little Java applet so that a bunch of typists we had on staff could then deal with the ancient editor. I should add that all of our state's IT infrastructure is provided by Northrop Grumman, which generally sucks (and is likely going to be sued by the state in the near future). It's a mess. I also worked for the state Medicaid system in a similar capacity, and our infrastructure there was even dumber, but that was due to incompetent management in my department rather than any external factors. Oh yeah! I'd forgotten. I interned with the attorney general's office when I was in law school, so this would've been early 2008, and the computer in the closet they assigned me to was still running Windows 2000.
Not really, at least as far as I bothered to go through the scripts. The JavaScript on (and off) the site is for stuff like indenting code samples, highlighting (colouring) syntax, and similar cosmetic stuff. While a bit ironic, since the page is about not needing it, it's much more expedient when you have a lot of stuff to mark up (tedious when done by hand; imagine adding stuff like <span class="keyword"></span> on literally every line of the examples). CSS features are cool to use on some bits and pieces, but JavaScript can automate a lot of the tedious work. On the other hand, JavaScript often is just purely tedious to work with, so you can't ever be happy anyway. ;) I think it's used not because of some hypocrisy but because not all CSS3 features have been added to all modern browsers. Also bear in mind that NoScript blocks all scripts, and oftentimes, if you don't host your website from literally your very own home server, the hosting service will add some of their scripts regardless of your wishes, for whatever reason. Usually benign in purpose (tracking whether people click ads, reporting back what kind of browser you use, how long you've spent on the page) but not something to dismiss in general.