Coding software

(from the evolutionary point of view)


Programmable computers, like living species, appeared in different places and with different constructive strategies, but in response to similar needs.

At the beginning, these needs were:

1. Managing large amounts of data (like a country's census)

2. Carrying out cumbersome calculations (for instance, all the calculations for the planes fighting in WW2 were done with the sole help of mechanical calculators)

There seems to be consensus that the first machine worthy of being called a 'computer' was the Z3. It was technologically primitive (electromechanical) and at the same time usefully programmable (using punched tape), whereas the first electronic computer, Colossus, was technologically more advanced (vacuum valves) but more difficult to program (by changing connections).

At first, several different architectures were explored. Neural networks, however promising, were too little understood to do 'something useful' with in an environment that demands 'predictable' results for any investment. Analog computers (like the Typhoon) worked perfectly, but they had a fundamental drawback: an analog computer does what it has been designed for and will never go a step further. Nor were 'state machines' easy to implement effectively. It was von Neumann's proposal that won the day: we are still using his conception of the computer.

On the other hand, this development is intertwined with earlier discoveries (for example, punched cards were already used by the Jacquard loom in the early nineteenth century) and later ones (when the first transistor appeared, it took very little time for it to join the nascent computer industry).

I will focus on the programming aspects.

The first step was to replace machine code (exactly the instructions the machine understands) with mnemonic codes (each one paired to one opcode). It is easier to remember and write ADD 23, 48 when one wants to add 23 and 48 than a series of bits or holes that mean nothing to a human. The second, immediate step was to extend the same strategy to a series of instructions. If one often finds oneself writing:

MOV AH, 09h   ; DOS function 09h: print a '$'-terminated string
MOV DX, ptr   ; DS:DX points to the string
INT 21h       ; call DOS

one prefers to tell the machine that 'PRINT (ptr)' means exactly the same as the above, but in a single line: a 'macro'.
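
The same mechanism survives today in, for instance, the C/C++ preprocessor. A minimal sketch (the PRINT name and the message are just illustrations, not the historical assembler syntax):

#include <cstdio>

// A macro: the preprocessor replaces every PRINT(x) with the fputs call below,
// much as the assembler expanded 'PRINT (ptr)' into the three instructions above.
#define PRINT(s) std::fputs((s), stdout)

int main() {
    PRINT("Hello\n");   // becomes std::fputs(("Hello\n"), stdout);
    return 0;
}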

This illustrates one of the forces that drive evolution. We can call it laziness, or a desire for efficiency. If there is something we do very frequently, we try to simplify it; meanwhile, we leave for later the possible developments for which we do not feel a pressing need. This force has operated, and continues to operate, ever since.

It shapes syntax into simpler forms. In obvious ways (like going from X=X+4 to X+=4). And in not so obvious ways (as in lambda expressions, where implied or deducible information is omitted).
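
Both shortcuts, in a small C++ sketch (the variable names are arbitrary):

#include <algorithm>
#include <vector>

int main() {
    int x = 3;
    x += 4;   // shorthand for x = x + 4

    std::vector<int> v{3, 1, 2};
    // Lambda: the compiler deduces the parameter and return types,
    // so only what cannot be inferred needs to be written out.
    std::sort(v.begin(), v.end(), [](auto a, auto b) { return a < b; });
    return 0;
}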

The step that followed macros was to store many of them together (usually related among themselves) in one single file, in a precompiled form: libraries. This way you just needed to link these files together with the newly written ones to reuse your work of many hours.
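
In modern terms the idea looks roughly like this (a sketch; the file and function names are hypothetical):

// mathlib.h -- the part the user sees
int twice(int n);

// mathlib.cpp -- compiled once, e.g. into libmathlib.a:
//   g++ -c mathlib.cpp && ar rcs libmathlib.a mathlib.o
int twice(int n) { return n * 2; }

// main.cpp -- reuses the precompiled work by linking against it:
//   g++ main.cpp -L. -lmathlib
#include "mathlib.h"
int main() { return twice(21); }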

Combining chunks of code (like macros) into more complex forms requires some form of grammar, a language. And here we can see a major trend in evolution.

A certain way of thinking leads to a certain way of defining a new language. But, in turn, using this new language brings out a new way of thinking.

The first languages (after assemblers) were bound to math (like FORTRAN, FORmula TRANslator) and procedural: just sets of instructions, each bundling many low-level opcodes, which could be invoked by a simple command.

Notably, one of these languages, BASIC (created solely for learning purposes), has experienced successive evolutions and survived for decades where many others ran into extinction.

By the mid-sixties another issue appeared, one which has only grown until today: complexity.

An application that can be written in a hundred lines (of a high-level language) comfortably fits in ONE programmer's mind. For a software suite that requires writing millions of lines of code, there is no human mind able to know both the whole and the details. Such a suite requires the cooperation of a team whose members cannot know the details of their colleagues' work. And a bug in one part of the software can actually crash the entire system.

In order to avoid the collapse of the whole due to a failure in just one part, nature implemented its first solution, in plants, at least 200 million years ago. Fortress builders have applied the same strategy for a few thousand years. And engineers did the same to ships as soon as ships grew large enough to warrant it. We call it compartmentalization. (Had the Titanic had complete compartmentalization, it would not have sunk.)

In software this is called Object-Oriented Programming (OOP). All you need to know about your fellow programmer's work is the portion of code his objects expose for you to use: the 'interface' (well, it is not quite that simple, but let's put it like this for the sake of the argument). This has some additional advantages. Further work on the objects will require no changes in the code of their users, provided the interface is kept untouched. Moreover, this way of programming prevents the users of an object (the work of someone else) from intentionally modifying it, or inadvertently spoiling it. (This was not achieved in the first versions of objects; it has been an evolution.)
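
A minimal C++ sketch of the idea (class and member names are, of course, just illustrations):

#include <iostream>

class Counter {
public:                  // the 'interface': all a fellow programmer needs to know
    void increment() { ++count; }
    int value() const { return count; }
private:                 // compartmentalized: no outside code can touch this
    int count = 0;
};

int main() {
    Counter c;
    c.increment();
    c.increment();
    std::cout << c.value() << '\n';   // prints 2; how Counter works inside stays hidden
    return 0;
}

Changing the internals of Counter (storing the count differently, say) forces no change in main, as long as increment() and value() keep behaving the same.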

Not surprisingly, we are repeating life's path. Complex organisms, integrating billions of cells, are composed of organs, each of which performs some specialized task, taking some inputs and releasing some outputs in accordance with its nature and function. Within organs there are also different parts. (And within software 'objects' there are also other 'objects'.)
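
That nesting has a direct software counterpart; another illustrative sketch (the names are hypothetical):

class Wheel {
    int turns = 0;
public:
    void spin() { ++turns; }
};

class Car {                // an object composed of other objects,
    Wheel wheels[4];       // as an organ is composed of parts
public:
    void drive() { for (auto& w : wheels) w.spin(); }
};

int main() { Car c; c.drive(); return 0; }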

Now that the word 'interface' has shown up, let's take a glance at it. In computer science, interfaces have had an evolution of their own.

 

The idea that computing machines can hold some kind of living beings dates from the beginning of computer science. John von Neumann himself was developing his theory of self-reproducing automata before 1950. To build new life (even if it is a virtual one) is indeed a seductive idea, so some people flirted with it. First only as a mind exercise. Then as a coding exercise. Even as a competition, or a show-off of expertise. By the mid-80s, viruses and worms had come into existence and spread to places beyond their programmers' aims. In the same way that some people find it fun to annoy others, early malware programmers played a kind of heavy joke.

It did not take long to see that these 'jokes' could also be weapons. Computer viruses, worms and Trojan horses are currently used to steal information (spying), to block an adversary's servers and, more rarely, to interfere with their systems.

Maybe it is time to have a look at networks and the Internet.


Initially I intended to make a historical tour through the evolution of software, with the goal of highlighting its parallels with biological evolution. Among many diverse interests I am forced to choose, and my choice is to drop this exploration.

For those interested in such matters, here is a link to an article by Andrew Smith in The Guardian which points out some of the chilling aspects of this evolution: Franken-algorithms: the deadly consequences of unpredictable code.

(to be continued)