Fortran, the first modern computer language, is shared with the coding community for the first time. Three years in the making, it would be refined in work that continues to this day.
While this ground-breaking “high level” language has been long eclipsed, it defined an approach to programming that still informs the art of computer science.
Back at the dawn of the computer age, thinking machines were oversized, petulant infants that understood only their own private, nearly incomprehensible languages. There wasn't yet a pressing need for languages that worked on every possible machine, because there weren't many kinds of machine to begin with. So programs written in "assembly" or "low-level" languages were good enough, even though they were difficult to learn, took lots of time to write and compile, and had no lasting value.
Unlike the software and web apps of today, which can run on different operating systems and platforms with, at worst, slight modifications, early languages ran only on a single series of computer. A program written for a WingBat Series 51 couldn't operate on a BatWing Series 15, because it issued instructions based on the unique architecture of the box on which, and for which, it was written. Trying to port it would be like handing driving directions meant for a driver in Paris to someone walking around in Nairobi.
Enter John W. Backus, whose permanent place in computing history began with a stroll through midtown Manhattan in 1950. The 25-year-old grad student, intrigued by a room-sized computer on display on the ground floor of IBM's New York City offices, wandered inside to get a closer look.
A tour guide learned he was studying math at Columbia University uptown and sent him upstairs for what would be a brief oral exam of “brain teasers.” Backus was immediately hired — as a programmer. “That was the way it was done in those days,” he would later tell The New York Times with a shrug.