• 516052 (unregistered)

    This story brought a smile to my face like no other. God I miss working with assembly.

  • (nodebb)

Some say that compiler is still being used, but it became much cheaper due to inflation.

  • (nodebb)

    There were other, similar issues with handling text. For example, any code which read data from a file or input was capped at reading 73 or 80 characters at a time- the screen width of the terminals when Ike had been designing the code.

    More likely that it was the width of the punch cards...

  • Tim Ward (unregistered)

    Wasn't the column 72 / column 80 thing part of the language definition? So there was no conceivable need for a FORTRAN compiler to read beyond column 80, and checking the sequence number in 73-80 was always a bit optional.
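For reference, the fixed-form card layout behind the column 72 / column 80 convention splits each 80-column line into label, continuation marker, statement, and sequence-number fields. A minimal Python sketch (field boundaries per the standard FORTRAN fixed-form convention; the example card is made up):

```python
def split_card(line):
    """Split an 80-column FORTRAN fixed-form card into its fields.

    Columns 1-5:   statement label
    Column  6:     continuation marker (non-blank, non-zero continues)
    Columns 7-72:  the statement itself
    Columns 73-80: sequence number -- ignored by the compiler
    """
    line = line.ljust(80)[:80]          # pad/truncate to one card
    return {
        "label":        line[0:5].strip(),
        "continuation": line[5] not in (" ", "0"),
        "statement":    line[6:72].rstrip(),
        "sequence":     line[72:80].strip(),
    }

# A made-up card: label 10, statement, and a sequence number in 73-80.
card = "   10 X = Y + Z".ljust(72) + "SEQ00010"
fields = split_card(card)
```

This is also why a compiler never needed to read past column 80: everything beyond the statement field was, at most, a deck-resequencing aid.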

  • I dunno LOL ¯\(°_o)/¯ (unregistered)

    So it's a Mel story, except Mel still worked there, except it had been so long he couldn't understand what he did either!

  • Need ... more ... Mel! (unregistered) in reply to I dunno LOL ¯\(°_o)/¯

    100% a Mel story. Loved it. Thank you!

  • Ollie Jones (unregistered)

    Running 6600s took real heroics. Ike and George's story is a great example.

    I knew Dana, a sysadmin guy who took care of an academic 6600. He had a problem where the thing would sometimes crash. George and Ike knew what Dana knew: downtime was really expensive.

    The CDC tech support guy came in with an oscilloscope, a wire-wrap tool, a wire-unwrap tool and a spool of wire-wrap wire. After fiddling with the 'scope for a while, he turned off the power, removed some wire from the backplane with the wire-unwrap tool, then replaced it with a wire that was 10cm longer.

    Adding a 0.3 nanosecond delay to that one circuit fixed the problem.

  • my name is missing (unregistered)

I remember CDC assembly instructions being basically random letters of the alphabet. For my first ever project as a programmer in the early '80s, I could use any programming language I wanted, as long as it was Fortran. I do not miss those days at all!

  • LCrawford (unregistered)

    I wonder what created the project to rewrite the compiler? It seemed to work pretty well, except for the linker. And wouldn't there be commercial products that finally rivaled Ike's compiler?

  • The Running Nose (unregistered)

And then we find ourselves in the year 2021, customizing a SharePoint Online environment for a customer that needs to store hundreds of thousands of documents in a single library, and hitting a list view threshold of 20,000 that's undocumented and unchangeable. Even the Microsoft support engineer was able to confirm it only by scripting their own tests...

Of course we only found out in the production environment because our QA cases weren't properly sized. Of course we're met with heavy criticism because the end-users understandably won't accept such a limit. Of course we need to redesign our information architecture because of this.

    Talk about constraints we don't think about often. 😅

  • Hasseman (unregistered)

At university we did Sintran-III Basic, which was actually a Basic with matrix operations. Later, Simula-67 on the NORD-500. I have also briefly touched the CDC 6600.

My first job was FORTRAN-66 on the HP3000. That Fortran had IF but no ELSE part; you used GOTO to simulate it.

My very first computer was an ALPHA-LSI with Basic. You had to load the interpreter from paper tape, and the debug device was a special punch for making new holes in the tape.

  • DCL (unregistered)

    I remember running CPU intensive COBOL compilations on a 370/155 under MVT a long time ago. We were charged for the CPU time used. To reduce what we were being billed by the service bureau I would occasionally run the compiles with a TIME=1440 parameter which basically caused the OS to say zero CPU time was used i.e. free compiles.

  • (nodebb) in reply to Hasseman

At university we did Sintran-III Basic.

Oh God, Sintran. That brings back memories. See, my dad worked for Norsk Data for a few years, and talked often about Sintran. I even had a job there one summer while I was a student.

  • tbo (unregistered)

    Wait, so Ike was a genius, not a "genius"? That's a new one.

  • It's about waste o'clock... (unregistered)

Why add in "feels like the real thing" delays if computer time is $0.10/second? I assume they were put in so that other code would be written to deal correctly with the delays, and not just to make the program run slower in general?

  • (nodebb)

When I started, an hour on the mainframe (not compute, just utilization of an online terminal) cost almost as much as a week's pay (more than net, less than gross - admittedly I was underpaid). The result was that countless hours of "debugging on paper" were spent. Frustrating at the time, but I learned much about software design and the practices of writing said software.

  • jay (unregistered)

    "BCDBIT EQU 6" Wait, they didn't just hardcode 6 wherever it was needed? Why, this sounds almost like someone was trying to make the code maintainable or some silly idea like that.

  • jay (unregistered) in reply to 516052

    "I miss working with assembly." When I started in this business in 1980, everything was much more "bare metal". You read individual characters from the keyboard, poked data into memory-mapped video addresses, etc. Now everything I do is on a so much higher level. And I wonder some times: For people new to the business, who never worked at the low level, do they understand what's actually going on? Do they have any idea what their code is getting compiled into and why it matters?

  • jay (unregistered) in reply to Steve_The_Cynic

    "More likely that it was the width of the punch cards..." Yes. Early monitors had a width of 80 columns because that was the number of characters you could fit on a punch card. And punch cards had 80 columns because that was the number of columns they could fit on a piece of cardboard that size. And computer punch cards had to be that size to fit the machines built to handle pre-computer days punch cards -- there were machines to sort punch cards and find cards in a deck with certain characters in certain columns. And those machines handled cards of that size because that was the size of a Continental dollar bill.

  • Officer Johnny Holzkopf (unregistered) in reply to DCL

Sounds like a fascinating installation-specific "abnormality": TIME=1440 (the equivalent of 24 hours, but interpreted as "no time limit"), or TIME=NOLIMIT, would not cancel the job or step, no matter how long it had already run. If that caused a specific MVT setup to stop accounting (!), not just for time but probably for resources in general - that would be a nice way of saying: "Be honest and you'll pay; be greedy and you get it for free." Lesson to be learned: honesty has been and always will be punished.

  • Conradus (unregistered) in reply to It's about waste o'clock...

    "Why add in "feels like the real thing" delays if computer time is $0.10/second?"

Because the delays don't count. In the old time-sharing days, you were only charged compute time when the CPU was processing your instructions. A delay meant "give the CPU to someone else and don't call me again until the timer expires," and didn't count.
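The billing distinction Conradus describes is easy to see on any modern system: process CPU time stops accumulating while a program sleeps, even though wall-clock time marches on. A small Python sketch (the 0.2-second durations are arbitrary):

```python
import time

def cpu_and_wall(fn):
    """Run fn, returning the (CPU seconds, wall-clock seconds) it consumed."""
    c0, w0 = time.process_time(), time.monotonic()
    fn()
    return time.process_time() - c0, time.monotonic() - w0

# A delay yields the CPU: wall time passes, billable CPU time does not.
cpu_sleep, wall_sleep = cpu_and_wall(lambda: time.sleep(0.2))

# A busy-wait, by contrast, burns CPU for the whole interval.
def busy():
    start = time.monotonic()
    while time.monotonic() - start < 0.2:
        pass

cpu_busy, wall_busy = cpu_and_wall(busy)
```

Under per-CPU-second billing, the sleeping version is nearly free while the busy-waiting version costs the full interval, which is exactly why the "feels like the real thing" delays didn't inflate the bill.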

  • It's about waste o'clock... (unregistered) in reply to Conradus

    Fascinating that they had such purview. Reminds me of today's cloud computing, which is... basically the same thing. So I guess that means no need for anyone here to reminisce, when they can live it today!

  • (nodebb)

    Compuserve also wrote their own FORTRAN compiler in (PDP-10) assembler, and their own runtime to go with it, but at least they were separate; and they used DEC's linker. All of those were also originally upper-case only (characters 6 bits wide); I converted the assembler, the linker, and the runtime to case-sensitive 9-bit characters, which gave them a couple of extra years on that platform before they had to abandon it. The FORTRAN compiler was written by my near-namesake Steve Wilhite, better known today as the inventor of Graphics Interchange Format (GIF, like the peanut butter, not like a Christmas present with a mouth full of marbles).
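The "characters 6 bits wide" mentioned here refers to packing six 6-bit characters into each 36-bit PDP-10 word. A Python sketch of the general DEC SIXBIT scheme (SIXBIT code = ASCII code minus 32, covering space through underscore, hence upper-case only; this illustrates the format in general, not Compuserve's actual runtime):

```python
def pack_sixbit(s):
    """Pack up to six characters into one 36-bit word, DEC SIXBIT style.

    SIXBIT code = ASCII code - 32, covering ' ' (0) through '_' (63).
    Lower-case letters fall outside that range, which is why these
    systems were upper-case only.
    """
    assert len(s) <= 6
    word = 0
    for ch in s.ljust(6):                 # pad with spaces to a full word
        code = ord(ch) - 32
        assert 0 <= code < 64, "not representable in SIXBIT"
        word = (word << 6) | code
    return word

def unpack_sixbit(word):
    """Recover the six characters packed into a 36-bit word."""
    codes = [(word >> shift) & 0o77 for shift in (30, 24, 18, 12, 6, 0)]
    return "".join(chr(c + 32) for c in codes)
```

Moving to 9-bit characters meant only four characters per word, but bought a full range including lower case, which is what made the case-sensitive conversion possible.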

  • markm (unregistered) in reply to LCrawford

    "I wonder what created the project to rewrite the compiler?" FORTRAN was updated repeatedly with new features added, and they wanted to keep up. Fortran 77 was quite a lot improved from FORTRAN 66; among other things, string variables were added in 1977, which was OK when you were trying to compute a rocket trajectory but makes quite a difference for business applications. (Imagine printing out and mailing customer account statements without string variables to hold the customer's name and address. Or writing a FORTRAN compiler in FORTRAN 66.) I suspect both computer manufacturers and programming shops would have often developed non-standard workarounds for this, which leads into your other question:

    "And wouldn't there be commercial products that finally rivaled Ike's compiler?" Sure, but you'd probably have to buy a newer computer to use a newer compiler. Compilers were quite specific to the computer architecture, which originally changed with each new model. Before the System 360 (introduced late in the 1960's - with the OS being several years later than the hardware!), IBM was often selling three incompatible architectures at a time, each of which would be obsolete in a few years; updating software for a computer that was no longer made was never a high priority. Control Data Corporation ("CDC"), maker of the computer mentioned in the original post, was a small company that specialized in giant computers for screaming fast scientific calculations, and updated their designs as fast as possible. (I think one of the reasons their compiler was slow was that the computer was optimized for floating point calculations, not for the string and logic operations needed by compilers.) If you came to CDC in 1977 and asked for a FORTRAN 77 compiler for your outdated computer, they wouldn't laugh at you, but they would have to say no and try to sell you the 1978 model, ask you to pay in advance to get a position high on the waiting list, and promise there would be a compiler for it no later than 1979.

  • markm (unregistered)

    Continuing: But when you got the new computer with the new compiler, your old program would either not compile at all, or would compile but run with many new bugs. In those days, language standards were treated more as suggestions than as rules, and quite a few details of the language would depend on how the compiler writers used the specifics of the computer architecture. Integer and floating point sizes depended on the computer, and things like how many characters could be in a variable name varied between compilers supposedly implementing the same standard. (IIRC, the first compiler I ever used - for FORTRAN II on an NCR 100 - allowed 5 characters.) And then there were non-standard additions to the language or the libraries, to take advantage of a special feature of the computer, or to bridge gaps such as the lack of a string variable to hold the customer name. Update and you'd be rewriting all that code...

  • markm (unregistered)

HOW specific could the language be to the architecture? Consider the arithmetic IF, which gave a three-way branch (3 GOTOs) depending on whether the calculated result was negative, zero, or positive. E.g., "IF (N-I) 10,20,30", where "10,20,30" are three labels to branch to. If you wanted to continue with the next line when N was less than I, you put the label "10" on the next line.

AFAIK, this happened only because FORTRAN I was created for an early IBM machine with 3 target addresses in each instruction word. (They must have been short addresses, but memory sizes were small because memory cost so much.) For instance, TEMP = N - I would be translated into a single subtraction instruction that fetched N and I, subtracted, then stored TEMP. Then there was a three-way branch instruction, going to one of three target addresses depending on the result of the previous instruction. So the arithmetic IF translated very efficiently into just two instructions - for that computer - and now and again a clever programmer could do more with it than with a "logical IF" that either dropped through to the next instruction or branched to one address. But on every computer made in the last 50 or 60 years, branch instructions have only one address, so the arithmetic IF takes about 4 instructions, while the "logical IF" used in every newer computer language can be as short as two instructions.

    For FORTRAN IV, they added the logical if, although without an "else" clause or a block structure to put multiple lines in the "then" clause. Most of the time, the "then" was a GOTO; the logical IF was followed by code for the false case, ending in a GOTO to where the two branches rejoined, then a label targeted by the IF ... GOTO, and the code for the true case. But the arithmetic IF was kept for compatibility, and as far as I know it's still there in the newest best version of Fortran, even though from day 1 most arithmetic IF's have been written to simulate a logical if. (You use two labels, one duplicated, and place one label right after the IF.)
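The three-way branch markm describes can be mirrored in a modern language by dispatching on the sign of the tested expression. A Python sketch (the callables stand in for FORTRAN's statement labels; this is purely an illustration of the semantics, not how any compiler implemented it):

```python
def arithmetic_if(value, neg, zero, pos):
    """Mirror FORTRAN's arithmetic IF: branch three ways on the sign.

    "IF (N-I) 10, 20, 30" jumps to label 10 if N-I is negative,
    20 if it is zero, and 30 if it is positive; here the "labels"
    are callables.
    """
    if value < 0:
        return neg()
    elif value == 0:
        return zero()
    return pos()

# The duplicate-label trick for simulating a logical IF: a test like
# "IF (X .NE. 0)" was often written "IF (X) 10, 20, 10", with two of
# the three branches sharing one target.
def nonzero(x):
    return arithmetic_if(x,
                         lambda: "branch",        # negative -> label 10
                         lambda: "fall through",  # zero     -> label 20
                         lambda: "branch")        # positive -> label 10
```

The duplicated target is exactly the "two labels, one duplicated" pattern mentioned above: only the zero case falls through.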

  • (nodebb)

    The CDC tech support guy came in with an oscilloscope, a wire-wrap tool, a wire-unwrap tool and a spool of wire-wrap wire. After fiddling with the 'scope for a while, he turned off the power, removed some wire from the backplane with the wire-unwrap tool, then replaced it with a wire that was 10cm longer.

The CDC tech support guy's first name wasn't Seymour, was it?

  • Gumpy Gus (unregistered) in reply to zomgwtf

    Well, actually, it was. One of the 6600's on the production line would not do a floating point divide correctly. This was a big deal as the computer was touted as the fastest and best computer for floating-point calculations, and the floating-divide hardware consisted of a whole 4x8 foot panel of modules. The regular computer checkout guys didn't get anywhere for a few days, so they called in Mr. Cray. He had designed most of the computer, so he knew that circuitry by heart. After just a few minutes of probing with an oscilloscope, he asked for a 10.5 foot length of coaxial cable. The standard cables were mostly 10 feet long, in order to keep the propagation delays consistent. He replaced one important signal cable with the 10.5 foot one, and then the floating-point divide unit gave correct results. The RG-174A cable used had a propagation speed of around 0.66 the speed of light, so he had added about 1.5 nanoseconds.

  • Gumpy Gus (unregistered) in reply to LCrawford

    There were THREE official CDC FORTRAN compilers.

    The first one was written by Seymour Cray, in OCTAL assembly language. It worked surprisingly well but was completely unintelligible and unsupportable.

    The second one was the "cheap" compiler. It worked "OK" but had no friendly features like a debugger or stack backtraces and no optimization.

    There was finally a "fancy" compiler, with extreme optimization, but it took up all of memory, had many passes, and was very, very slow to compile and would only link to one bloated I/O and runtime package. Also they never quite seemed to get the compiler and runtime and fancy overlay linker all working compatibly at the same time, so running it was always a gamble. Quite often it would run, run, run, and at the end you'd have a bill for like $14 and a program that dumped. Things were exciting but a bit grim too back then.

    That opened up a niche for a small and fast compiler.

  • (nodebb) in reply to markm

But on every computer made in the last 50 or 60 years, branch instructions have only one address, so the arithmetic IF takes about 4 instructions, while the "logical IF" used in every newer computer language can be as short as two instructions.

Some architectures can do it in fewer, at least in some cases. ARM has conditionals as part of other instructions, and modern compilers are quite good at taking advantage of this (producing really very dense code indeed). Beating modern compilers is hard, but possible if you're able to take advantage of higher-level knowledge (such as knowing that a function is only ever called from the highest-priority interrupt handler and so can be much more aggressive in register usage).

  • Barf4Eva (unregistered)

    Damn it, Remy, I can't get this song out of my head now, and now all I hear is FORTRAN in place of "I RAN" every time... arghhhhh. what evil have you unleashed upon me...?

  • Gnasher729 (unregistered) in reply to Gumpy Gus

I vaguely remember that S. Jasik, known to every early Mac developer for his debugger, wrote the optimising Fortran compiler that I used as a student on our CDC Cyber 175. 40 megahertz! That compiler was good. One interesting feature was a 10x60-bit instruction cache; any loops had to fit into that cache to run at full speed.

Later I worked at a company making graphics cards. I found a book describing the hardware and showed it to the guys designing our hardware. Their estimate was one square millimeter, plus whatever you wanted to spend on RAM.

  • lgh (unregistered)

Hello! I will be a bit off-topic, but it's still FORTRAN. I never programmed in FORTRAN, but the article got my attention and made me curious. Do you know of any FORTRAN 77 code available on the internet, ideally real software rather than example programs from a book, that I could download to see what a real program looked like in the 1970s-80s? Also, I read somewhere something like "Whoever can program FORTRAN can program FORTRAN in any language." I would be curious to see what that sentence means; maybe it's referring to some techniques unique to the language. Thank you!

Leave a comment on “And FORTRAN, FORTRAN So Far Away”
