6 comments

  • masswerk 10 hours ago
    Great article!

    Another (mid-to-late) data point may be the original NeXT brochure[1], which claims the NeXT to be "a mainframe on two chips". It provides a definition along the lines of a throughput-oriented architecture with peripheral channels (in analogy to the NeXT's two DSP chips) and, while doing so, also marries the concepts of physical size and architecture (there's also some romanticism around uncompromised designs and "ruthless efficiency" involved):

    > The NeXT Computer acknowledges that throughput is absolutely key to performance. For that reason, we chose not to use the architecture of any existing desktop computer. The desired performance could be found only in a computer of a different class: the mainframe.

    > Having long shed any self-consciousness over such mundane matters as size and expense, mainframes easily dwarf desktop computers in the measure of throughput.

    > This is accomplished by a different kind of architecture. Rather than require the attention of the main processor for every task, the mainframe has a legion of separate Input/Output processors, each with a direct channel to memory. It's a scheme that works with ruthless efficiency.

    [1] https://lastin.dti.supsi.ch/VET/sys/NeXT/N1000/NeXTcube_Broc...

    • kens 9 hours ago
      There's also the Intel iAPX 432 "micro-mainframe" processor (1981) on two chips. (This was supposed to be Intel's flagship processor, but it was a disaster and the 8086 took over instead. The NYT called it "one of the great disaster stories of modern computing".) I think that microprocessor manufacturers had mainframe envy in the 1980s.
  • tommiegannert 14 hours ago
    > Based on my research, the earliest computer to use the term "main frame" was the IBM 701 computer (1952)

    > This shows that by 1962, "main frame" had semantically shifted to a new word, "mainframe."

    > IBM started using "mainframe" as a marketing term in the mid-1980s.

    I must conclude it takes the competition 10 years to catch up to IBM, and IBM about 20 years to realize they have competition. Setting a countdown timer for IBM to launch an LLM in 2040.

    Thanks for researching and writing this up. It's a brilliant read!

    • masswerk 10 hours ago
      I can kind of see why this would have been. The 1401, which was really intended as a replacement for IBM's punched-card appliances, was widely known as IBM's small mainframe. On the other hand, there are the 701, things like the 7030 (Stretch), and then the ranges of the S/360 and S/370. Considering this inconceivably wide class of machines, stepping in and deciding what's a mainframe and what's not is a losing game from a marketing perspective. So better to keep silent and reap the fruits…
  • kens 16 hours ago
    Author here. Anyone have interesting mainframe stories?
    • ggm 12 hours ago
      A rumour from my mainframe days was that Digital Equipment hired lacemakers from France to show people how they did it. This was wiring up the core-memory planes for the DEC-10 (I have one, a folded three-part card), which just barely squeezes into the mainframe class.

      The guy who told me this was the Australian engineer sent over to help build the machine and bring it back for UQ. He parked on the quiet side of the Maynard factory, not realising why the other drivers avoided it. Then his car got caught in a snowdrift.

      A prior engineer told me about the UUO wire-wrap feature on the instruction-set backplane: you were allowed to write your own higher-level ALU "macros" in the instruction space by wiring patches in this backplane. The DEC-10 had a five-element complex instruction model. Goodness knows what people did in there, but it had a BCD arithmetic model for the six-bit data (36-bit word, so six bytes of six bits in BCD mode); the sketch below shows the word layout.
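
      To make that word layout concrete, here is a minimal Python sketch of six 6-bit bytes packed into one 36-bit word. The SIXBIT-style character encoding (code = ASCII minus 32) is an assumed stand-in for illustration, not the BCD format the engineer described:

        # Six 6-bit "bytes" per 36-bit word, as on the DEC-10.
        def pack_sixbit(s: str) -> int:
            word = 0
            for ch in s[:6].ljust(6):                      # pad to six characters
                word = (word << 6) | ((ord(ch.upper()) - 32) & 0o77)
            return word

        def unpack_sixbit(word: int) -> str:
            return "".join(chr(((word >> shift) & 0o77) + 32)
                           for shift in range(30, -6, -6))

        w = pack_sixbit("DEC-10")
        print(f"{w:012o}")        # a 36-bit word is twelve octal digits
        print(unpack_sixbit(w))   # -> "DEC-10"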

      A guy from La Trobe Uni told me that for their Burroughs, you edited the kernel inside a permanently resident Emacs-like editor, which recompiled on exit and threw you back in on a bad compile. So it was "safe to run" only when it decided your edits were legal.

      We tore down our IBM 3030 before junking it, to free the room for a secondhand Cray-1. We kept so many of the water-cooled chip pads (6-inch-square aluminium-bonded grids of chips for the water-cooler pad, about 64 chips per pad) that the recycler reduced his bid price because of all the gold we held back.

      The Cray needed two regenerator units to convert Australian 220 V to 110 V for some things, and to 400 Hz for other bits (this high-voltage AC frequency was some trick they used for power distribution across the main CPU backplane), and we blew one up spectacularly by closing a breaker badly. I've never seen a field engineer leap back so fast. It turned out reusing the IBM raised floor for a Cray didn't save us money: we'd assumed the floor bed for liquid-cooled computers was the same; not so. Cray used a different bend radius for the Fluorinert plumbing. The Fluorinert recycling tank was clear plastic, so we named the Cray "yabby" and hung a plastic lobster in it. This tank literally had a float valve like a toilet cistern.

      When the Cray was scrapped, one engineer kept the round tower "love seat" module as a wardrobe for a while. The only CPU cabinet I've ever seen that came from the factory with custom cushions.

      • dhosek 8 hours ago
        I heard a story of Seymour Cray doing a demo of one of the machines and it turned out there was a bug in some hardware procedure. While the customers were at lunch, Seymour opened up the machine, redid the wire wrap and had the bug fixed when they returned. (Note that many details are likely inaccurate as this is a 35-year-old memory of a second-hand story.)
    • AnimalMuppet 8 minutes ago
      My mom was a programmer back in the 1950s. First thing in the morning, she would run a memory test. If it failed, she would slide out the failing tray of memory (100 words by 36 bits, built from tubes), hand it off to a tech to fix, slide in a fresh tray, and proceed.

      She had one CPU she worked on where you could change its instruction set by moving some patch cords.

    • wslh 10 hours ago
      A tax agency unified all its data from different agencies via X.25 and satellite connections. However, the process was expensive and slower than expected because the files were uncompressed and stored as basic plain-text ASCII/EBCDIC files.

      One obvious solution to this problem was to buy an Ethernet network device for the mainframe (which used Token Ring), but that was yet another very expensive IBM product. With that device, we could have simply compressed and uncompressed the files on any standard PC before transferring them to/from the mainframe.

      Another obvious solution was to use C to compile a basic compression and decompression tool. However, C wasn’t available—buying it would have been expensive as well!

      So, we developed the compression utility twice (for performance comparisons), in COBOL and in REXX. These turned out to be two amusing projects, as we had to handle bits in COBOL, a language never intended for that purpose. (A sketch of the general idea follows below.)
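
      As a rough illustration of what such a utility involves, here is a minimal run-length-encoding sketch in Python. The comment doesn't say which algorithm they used, so RLE is an assumption, chosen because fixed-width mainframe records are full of repeated padding bytes:

        # Minimal RLE: encode a byte string as (count, byte) pairs,
        # with runs capped at 255 so the count fits in one byte.
        def rle_encode(data: bytes) -> bytes:
            out, i = bytearray(), 0
            while i < len(data):
                run = 1
                while i + run < len(data) and data[i + run] == data[i] and run < 255:
                    run += 1
                out += bytes([run, data[i]])
                i += run
            return bytes(out)

        def rle_decode(data: bytes) -> bytes:
            out = bytearray()
            for count, byte in zip(data[::2], data[1::2]):
                out += bytes([byte]) * count
            return bytes(out)

        # Blank-padded fixed-width records compress well under RLE.
        record = b"SMITH, J. " + b" " * 50 + b"000001234"
        assert rle_decode(rle_encode(record)) == record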

      • Spooky23 8 hours ago
        The capture of all thought by IBM at these places was nuts.

        Circa 2002, I'm a Unix admin at a government agency. Unix is a nascent platform there, previously used only for terminal services: mostly AIX and HP-UX, with some Digital stuff as well. I created a ruckus when I installed OpenSSH on a server (Telnet was the standard). The IBM CE/spy ratted me out to the division director, who summoned me for an ass chewing.

        He turned out to be a good guy who listened to, and ultimately agreed with, my concerns. (He was surprised, as mainframe Telnet has encryption.) Except one: "Son, we don't use freeware around here. We'll buy an SSH solution for your team. Sit tight."

        I figured they'd buy the SSH Communications software. Turned out we got IBMSSH, for the low price of $950/CPU for a shared-source license.

        I go about getting the bits and installing the software… and the CLI is very familiar. I grab the source tarball, and it turns out this product I'd never heard of was developed by substituting the word "IBM" for "Open", to the point that the man page had a sentence that read "IBM a connection".

      • guenthert 53 minutes ago
        > Another obvious solution was to use C to compile a basic compression and decompression tool. However, C wasn’t available—buying it would have been expensive as well!

        I would have thought in the (IBM) mainframe world, PL/I (or PL/S) would have been the obvious choice.

      • somat 10 hours ago
        > C wasn’t available—buying it would have been expensive as well!

        On the subject of expensive mainframe software, I once got to do the spit take of "You're paying how much for a lousy FTP client? Per month?!" I think it was around $500 per month.

        Man, open source software really has us spoiled.

        • dugmartin 6 minutes ago
          I think I paid $299 (one-time fee) for a PKZIP license in 1993, for a product I was building. Open source is pretty amazing.
        • dhosek 8 hours ago
          The youngs have no idea. You used to have to pay for development tools. In high school, I would hand-assemble my 6502 assembly programs by writing them out longhand on graph paper, filling in the hex in the left columns, then typing the hex into the Apple monitor to get the program running (the sketch below shows the kind of translation involved). Heaven forbid there was anything wrong with the code, the hand-assembling, or the keyboarding. But I couldn't afford one of the paid assemblers (in retrospect, I should have written one in AppleSoft, but hindsight is 20/20, and I don't know that I was smart enough then to write an assembler anyway). Spending $500–1000 for a compiler in the 90s was typical. Now the kids whine about paying the relatively cheap fee for things like JetBrains IDEs.
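
          For a concrete sense of the hand-assembly step, here is a toy Python sketch: the opcode table is what you would keep on paper, and the three-instruction routine (storing one character to the Apple II text page) is made up for the example:

            # Hand-assembly: each 6502 mnemonic/addressing mode has a fixed
            # opcode byte; operands are filled in by hand, little-endian for
            # absolute addresses.
            OPCODES = {"LDA #": 0xA9, "STA abs": 0x8D, "RTS": 0x60}

            code  = [OPCODES["LDA #"], 0xC1]          # LDA #$C1  (screen code for 'A')
            code += [OPCODES["STA abs"], 0x00, 0x04]  # STA $0400 (top-left of text page)
            code += [OPCODES["RTS"]]                  # RTS       (back to the monitor)

            print(bytes(code).hex(" ").upper())       # A9 C1 8D 00 04 60
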
          • philiplu 7 hours ago
            Back around 1978 to 1982, I wrote and sold a 6800/6809 disassembler ("Dynamite Disassembler") for, I think, $200 a copy. I sold enough copies to pay for computer equipment during my college days. I think we reduced the price to $100 or maybe $150 when the TRS-80 Color Computer came out with a 6809 powering it.

            $200 back in 1980 is about $800 today. Amazing to think anyone would spend that much for a fairly simple tool.

  • dboreham 10 hours ago
    Ken, surely frames significantly pre-date the computer? They were used in telephone exchanges. The Colossus reproduction at Bletchley is constructed from frames.
    • kens 8 hours ago
      I discuss this a bit in footnote 1. Thousands of things were built on frames before computers, of course. However, IBM both built computers from semi-portable cabinets constructed around metal frames and referred to the computer's units as "frames", including the "main frame".

      Telephone systems used main distribution frames, but these are unrelated. First, they look nothing like a computer mainframe, being large racks with cable connections. Second, there isn't a path from the telephony systems to the computer mainframes; if the first mainframes had been developed at Bell Labs, for instance, a connection would be plausible.

      As for Colossus, it was built on 90-inch racks, called the J rack, K rack, S rack, C rack, and so forth; see https://www.codesandciphers.org.uk/lorenz/colossus.htm

    • Animats 8 hours ago
      Telephone switches introduced the idea of frames. They were the first systems with enough components to require modular organization. Western Electric tended to use 24-inch spacing.

      19-inch racks seem to come from railroad interlocking, and may have been introduced by Union Switch and Signal, founded by Westinghouse in 1881 and still around as a unit of Ansaldo.

    • ggm 9 hours ago
      Likewise bus bars, and hence the data bus.
  • m463 9 hours ago
    something... related?

    https://youtu.be/QHnJ9NmK3Pc

    (the mainframe song, uncertain of its background)

  • exodust 5 hours ago
    Loosely related to unexpected word origins, I didn't know until recently that when "gaslighting" someone, you're using an expression tied to literal gaslights from a 1944 movie. I'm likely late to the party on learning this origin.
    • kens 4 hours ago
      Yes; I watched the movie "Gaslight" recently, and the meaning that everyone uses has drifted pretty far from the movie's meaning. But it's probably hopeless to resist a meaning when it becomes trendy.