
1. Introduction to Software History

by Cornelis Robat, editor

This chapter will be revised constantly.

Related Articles

OS
OOP
Program Development

Introduction to software history part 1


First Steps
How It All Started
Enter the Information Age
   programming the new tools
   machine language
   subroutines
   fortran
   programming language
   enter C
   artificial intelligence
   enter OOP
   new methods required
What is a program
Creation of a program
Computer languages
Interpreters and Compilers
   interpreters
   compilers
Standardization
Software Generations
    first Generation
    second Generation
    third Generation
    fourth Generation
    fifth Generation
Programming languages
    languages_index

 

First Steps

This part will be different from the History of the Computer: not a chronological travel through software-land, but a collection of articles and essays on software.

Software has a long history, and as far as the facts are known to us we will give them to you. When missing stories, data, or other information are shared with us, they will be put on this site. If you have any comments or suggestions regarding this page or any other page, please do not hesitate to contact us.

A simple question: "What is software?" A very simple answer is: hardware you can touch, software you can't. But that answer is really too simple.

When talking about software you talk about programming and programming languages, but also about producing and selling the products made with those languages.

There are over 300 different ("common") computer languages in existence, apart from the various dialects stemming from them. Most can be classified into definable groups, but others don't belong anywhere, either because they are rather new, or because their use never became widespread and remained limited to small groups of specialized professionals or scientists. This is often the case with a language designed for just one purpose, e.g. telecommunications or supercomputing.
Some languages are dead languages, others are revived and expanded upon again, and some constantly rejuvenate. In the latter case a programmer sometimes wonders whether he or she is merely upgrading to a newer version or in fact learning a completely new language.

 

How It All Started

It shouldn't be a big surprise that the creation of software also proceeded in large but distinguishable steps. Compared with hardware, fewer developments ran in parallel or overlapped. In rare cases developments were reinvented, sometimes because the original invention was not published or was even prohibited from being made public (war, secrecy acts, etc.), or because it became known at the same time elsewhere and, after (legal) discussions, the "other" party won the honors.

The earliest practical form of programming was probably done by Jacquard (1804, France). He designed a loom that performed predefined tasks by feeding punched cards into a reading contraption. This new technology allowed carpets and fabrics to be manufactured with less skill and even with fewer people: the little kid sitting under the loom changing rods and other parts vanished, and a single person could now handle a loom. That this met resistance from the weavers leaves no question. The same thing happened in England during the industrial revolution there, where a whole movement arose, the Luddites (anti-technology, or just concerned citizens fighting for their bread?).


[Picture: the manufacturing of punched cards for looms]

The technology of punched cards would later be adopted by the Computing-Tabulating-Recording Company (later IBM) to process data.

The situation was still a one-on-one game: a problem needed to be solved, so a machine was built (Pascal, Babbage, Scheutz & Son). And when some sort of instruction was needed, a sequence was designed or written and transferred either to cards or to mechanical aids such as wires, gears, shafts, actuators, etc. Can you call that programming? Well, according to our definition, yes it was.

First there was Ada Lovelace, who wrote a rudimentary program (1843) for the Analytical Engine designed by Charles Babbage in the 1830s, but the machine never came into operation.

Then there was George Boole (1815-1864), a British mathematician, who proved the relation between mathematics and logic with his algebra of logic (BOOLEAN algebra or binary logic) in 1847.

This meant a breakthrough for mathematics. Boole was the first to prove that logic is part of mathematics and not of philosophy.

A big step in thinking too.

But it would take one hundred years before this algebra of logic was put to work for computing.

It took Claude Shannon (1916-2001), who showed in his 1937 thesis how binary logic could be used in computing and who later wrote "A Mathematical Theory of Communication" (Bell System Technical Journal, 1948), to complete the software concept of modern computing.

The hardware also needed to make jumps ahead, like the following:

  • It took Konrad Zuse to create the first binary programmable computer, which was relay based.
  • The Bomba, originally built by Polish engineers to crack the Enigma code, pushed the envelope further.
  • The Colossus, built by the people of Bletchley Park (near London, UK) for codebreaking as well.
  • Atanasoff and Berry designed the ABC computer, a binary computer like Zuse's, but now 100% electronic.
  • And not to forget the ENIAC, by Eckert and Mauchly and a team made up of many others.

Now things were in place to start off with the information age.

"What was first: software or hardware"

Frankly, this is a matter of philosophy, or put more simply: of how you look at it.

 

Enter the Information Age

In the beginning of the so-called "Information Age", computers were programmed by "programming" direct instructions into them. This was done by setting switches or by making connections between the different logical units with wires (circuitry).

[Photo: Two women wiring the right side of the ENIAC with a new program. US Army photo, from the archives of the ARL Technical Library, courtesy of Mike Muuss. (1)]

Programming like this was nothing other than rewiring these huge machines in order to use all the options, possibilities and calculations. Reprogramming always meant rewiring.

In that way a calculation could need days of preparation: handling thousands of wires, resetting switches, plugs, etc. (in the most extreme case, that is), while the programmed calculation itself took just a few minutes, provided the "programmed" wiring did not contain a wrong connection. (The word "bug" was not yet in use for programming errors.)
The coding panels looked very much like a few telephone switchboards hustled together, and in fact many parts actually came from switchboards.

With the invention of the vacuum tube, along with many other inventions, much of the rewiring belonged to the past. Machines built with tubes replaced the slow machines based on relays.

 

When the transistor was invented, it in turn replaced a technology: the vacuum tube.

When Shannon rediscovered, or rather reapplied, binary calculus and indicated how it could be used for computing, a revolution started. The race was on!

 

Programming the new tools (or toys?)

Programming the first binary computers was still not an easy task and was very prone to mistakes. At first, programming was done by typing in 1's and 0's that were stored on different information carriers: paper tape, punched cards, mercury delay lines (sound or electric pulses) and later magnetic drums and, much later, magnetic and optical discs.

By storing these 0's and 1's on a carrier (first used for Konrad Zuse's Z1 in 1938) it was possible to have the computer read the data back at any later time. But mistyping a single zero or one meant disaster, because all coding (instructions) had to be in exactly the right place and in the right order in memory. This technique was called absolute addressing.

An example:

1010010101110101011

If this is a sequence of switch settings it means: switch one on, switch two off, and so on.

In simple language:

Panel 1          function: enter house
Switch 0   1     open the door
Switch 1   1     put the lights on
Switch 2   0     close the door (please)

In fact the protocols for programming the machines in this way looked very much like that.
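
To make this concrete, here is a minimal C sketch (the panel layout, values and names are made up for illustration) that reads such a bit pattern as a row of switches:

/* switches.c - a made-up "panel" of three switches, one bit each */
#include <stdio.h>

int main(void)
{
    unsigned int panel = 0x3;   /* binary 011: switch 0 on, switch 1 on, switch 2 off */
    const char *function[] = {
        "open the door",
        "put the lights on",
        "close the door (please)"
    };

    for (int i = 0; i < 3; i++) {
        int on = (panel >> i) & 1;   /* read the i-th bit of the panel */
        printf("switch %d is %s : %s\n", i, on ? "on" : "off", function[i]);
    }
    return 0;
}

Mistyping a single 0 or 1 here changes which switches are set, which illustrates how unforgiving this style of coding was.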

In the early 50's programmers started to let the machines do part of the job. This was called automatic coding and it made life a lot easier for the early programmers.

 

Soon the next step was to have the program select the proper memory address itself, instead of using absolute addressing.

The next development was to combine groups of instructions into so-called words, and abbreviations for them were thought up, called opcodes (Hopper, 1948).

 

Machine Language

An opcode works like shorthand and represents, as said, a machine instruction. The opcodes are translated by another program into zeros and ones, something the machine can turn into instructions.

But the relation is still one to one: one code for one single instruction. Nevertheless, in a very basic sense this is already a programming language. It was called assembly language.

An example:

Label   Opcode  Operands
CALC:   STO     R1, HELP1
        STO     R2, HELP2
        LD      R3, HELP1
        ADD     R3, HELP2
        LD      R4, HELP1
        SUB     R4, HELP2
        RSR     SP, 0
HELP1:  DS      2
HELP2:  DS      2

                           

This piece of assembly code calculates the sum and the difference of two numbers.
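
For comparison, a rough C equivalent of the same routine might look like the sketch below (the mapping to the registers and labels is shown in the comments; it is an illustration, not a literal translation):

/* calc.c - roughly what the CALC routine above does */
#include <stdio.h>

static void calc(int r1, int r2, int *sum, int *diff)
{
    int help1 = r1;              /* STO R1, HELP1 */
    int help2 = r2;              /* STO R2, HELP2 */
    *sum  = help1 + help2;       /* LD R3, HELP1  then  ADD R3, HELP2 */
    *diff = help1 - help2;       /* LD R4, HELP1  then  SUB R4, HELP2 */
}                                /* RSR SP, 0 : return to the caller */

int main(void)
{
    int sum, diff;
    calc(7, 5, &sum, &diff);
    printf("sum = %d, difference = %d\n", sum, diff);
    return 0;
}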

 

Subroutines

Soon after machine languages and the first crude programming languages began to appear, the danger of inextricable and thus unreadable coding became apparent. Later this messy programming was called "spaghetti code".
One important step in unraveling or preventing spaghetti code was the development of subroutines. It took Maurice Wilkes, realizing that "a good part of the remainder of his life was going to be spent in finding errors in ... programs", to develop the concept of subroutines in programs, creating reusable modules. Together with Stanley Gill and David Wheeler he produced the first textbook on programming, "The Preparation of Programs for an Electronic Digital Computer".(6)
The formalized concept of software development (not named that way for another decade) had its beginning in 1951.

Below is an example of how subroutines would work.

Begin program;                            (start of program)

Main;                                     (the main "menu")

    Printf ("Hello World");               (call the second subroutine, with a parameter: what to print)
    DoSomethingElse();                    (call the first subroutine)
    Printf ("Hello World");

(end of program)

Function DoSomethingElse;                 (first subroutine)

    Add two numbers;

Return OK                                 (back to the main "menu")

Function Printf(what_to_print)            (second subroutine, with a parameter: the contents of what to print)

    Open channel to printer interface;
    Initialize printer;
    Send "what_to_print" to printer;
    Send page feed to printer;
    Close printer interface;

Return OK                                 (back to procedure: main)

 

This program would print "Hello World" twice, on two different pages.

By re-using the Printf subroutine, a possible error in this routine exists in only one place. That is an enormous advantage when looking for errors. Of course, the Open, Initialize, Send, and Close "commands" in this Printf function are subroutines themselves.
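
In real C, the same structure could look like the following sketch (the function names are chosen for illustration, and the printer handling is replaced by the standard library):

/* subroutines.c - the example above as compilable C */
#include <stdio.h>

static void print_page(const char *what_to_print)
{
    /* stands in for the Printf routine: write the text, then a page feed */
    printf("%s\f", what_to_print);
}

static void do_something_else(void)
{
    int result = 2 + 2;          /* "add two numbers" */
    (void)result;                /* the result is not used further here */
}

int main(void)
{
    print_page("Hello World");
    do_something_else();
    print_page("Hello World");
    return 0;
}

Any fix to print_page automatically benefits every place that calls it, which is exactly the advantage described above.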

 

Fortran

The next big step in programming came when an IBM team under John W. Backus created FORTRAN (FORmula TRANslator), beginning in 1954. At first it could only be used on IBM's own machine, the IBM 704, but versions for other machines and platforms were sold soon after. Until long past 1960, different CPUs required different instruction sets even to add a number, so for each different machine a different compiler was needed. Typically, the manual only appeared years later, in 1957!
Rewiring machines to reprogram them now definitely belonged to the past!

Programming language

FORTRAN soon came to be called a programming language. So why call a set of predefined instructions a programming language?

Because some characteristics of a language are met: it has a vocabulary of its own, and rules that give each statement a defined meaning. These criteria were easily met by this strictly defined set of computer instructions.

An example:

Let's presume that communication with a computer can be accomplished. How would you then tell it, in simple terms, to add two numbers?

human                   computer
Add 2 and 2
Show me the answer      print 2+2

Depending on which (dialect of a) computer language you use, it could look different:

human                   computer
Add 2 and 2             answer := 2+2;
Show me the answer      printf ("%d\n", answer);
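
Wrapped into a complete program, the C version of this tiny dialogue could look like this (a minimal sketch):

/* addtwo.c */
#include <stdio.h>

int main(void)
{
    int answer = 2 + 2;          /* Add 2 and 2        */
    printf("%d\n", answer);      /* Show me the answer */
    return 0;
}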

And by the time COBOL (Common Business Oriented Language) was published in 1960 by the CODASYL committee (of which Hopper was a member), the term computer language was an established fact.

In the meantime, hardware development raced ahead.

Computers were already being connected to teletype machines to expedite the entry of programs. In the late 1960's the first video interfaces were connected, the network was invented, and floppy disks, hard drives, etc. made life for programmers a lot easier.

As mentioned, you could not simply run FORTRAN on any machine. It had to be rewritten for each particular machine if the type of processor was different, and in those early days ALL types were different. This did not promote the development of programming at all!

 

Enter "C"

C came into being in the years 1969-1973 and was developed by Dennis Ritchie and Brian Kernighan, both working at Bell Laboratories.(3)

And the magic word was portability.

Parallel to the development of computer languages, operating systems (OS) were developed. These were needed because writing all the machine-specific instructions over and over again for every program was a waste of time. By introducing the OS principle, almost all input and output tasks, such as reading from and writing to storage and peripheral devices, were taken over by the OS.

Just as the common computer languages were hard to carry over from one machine to another, the OS had to take the same hurdle every time a new machine was developed.

The need and pressure for a common, portable language were enormous.

There were some unsuccessful projects aimed at solving this problem, like Multics, a joint venture of MIT, General Electric, and Bell Labs, and other developments at departments of defense in different countries. But they all either came too late or became too complex to succeed.

But the demise of Multics inspired Dennis Ritchie and Brian Kernighan to develop C in 1972. This was, and still is, a very strict language that stays close to the machine's internal logic and structure, if you are allowed to put it that way, yet is reasonably easy for humans to read and understand. Because of this combination the language is fast and compact, and it became very popular amongst system programmers and commercial software manufacturers.

With that language they also developed UNIX, a generic operating system.

The power of C was that the language had a small base vocabulary but leaned heavily on what were called libraries. Libraries contain machine-specific instructions to perform tasks, much as the OS does. These libraries were the only parts that had to be redesigned for different machines or processor families; the programming language and interface itself remained the same, and that was C's strength. Portability was born: source code could be reused and only had to be recompiled to run on other machines.
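
A minimal sketch of that idea is shown below; the function machine_write_line is hypothetical and here simply backed by the standard library, but on another machine only its body would have to change:

/* portable.c - the portable part of the program never changes */
#include <stdio.h>

/* hypothetical machine-specific layer; on another machine only this
   function would be rewritten (for a different device or OS call)   */
static void machine_write_line(const char *text)
{
    printf("%s\n", text);
}

int main(void)
{
    machine_write_line("the same source code runs on every machine");
    return 0;
}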

A classic example of C, printing "Hello World" on your screen:

 

/* helloworld.c */
#include <stdio.h>

int main(void)
{
    printf("Hello World\n");
    return 0;
}

In another chapter this routine will be shown in various languages, to make the differences between them visible.

Soon other language manufacturers jumped on the bandwagon and the software industry leaped ahead. The industry took off like a rocket!

 

Next chapter continues with Artificial Intelligence

See also the languages index.

For further reading: Impressions and thoughts on creating software, an essay

 


 

Footnotes & References