Wednesday, 12 October 2011

Enter "C"


C came into being in the years 1969-1973 and was developed by Dennis Ritchie, together with Brian Kernighan, both working at the Bell Laboratories.(3)
And the magic word was portability.
In parallel with the development of computer languages, Operating Systems (OS) were developed. These were needed because writing all the machine-specific instructions over and over again for every new program was a "waste of time" job.
So by introducing the OS 'principle', almost all input and output tasks were taken over by the OS.
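To make that concrete, here is a minimal C sketch (my example, not from the sources cited here): the program hands its output to the OS through the C standard library, so the same source runs on any machine with a C compiler, without a single machine-specific I/O instruction.

    #include <stdio.h>

    int main(void)
    {
        /* printf() passes the actual output work down to the C
           library and, underneath it, to the operating system.
           No machine-specific I/O instructions appear here, which
           is exactly what made C programs portable. */
        printf("Hello from a portable C program!\n");
        return 0;
    }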

Subroutines

Soon after machine languages were developed and the first crude programming languages began to appear, the danger of inextricable and thus unreadable coding became apparent. Later this messy programming was called "spaghetti code".
One important step in unraveling, or preventing, spaghetti code was the development of subroutines. It took Maurice Wilkes, realizing that "a good part of the remainder of his life was going to be spent in finding errors in ... programs", to develop the concept of the subroutine: a reusable module within a program. Together with Stanley Gill and David Wheeler he produced the first textbook on the subject, "The Preparation of Programs for an Electronic Digital Computer".(6)
The formalized concept of software development (though it would not be called that for another decade) thus had its beginning in 1951.
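As an illustration (my sketch, not from Wilkes's book), the C fragment below shows the idea in its modern form: a routine is written and debugged once and then reused, so an error has to be found and fixed in only one place.

    #include <stdio.h>

    /* A subroutine: written (and debugged) once, called many times. */
    double average(double a, double b)
    {
        return (a + b) / 2.0;
    }

    int main(void)
    {
        /* Every caller reuses the same module; before subroutines,
           this logic would have been copied out by hand at each
           point of use. */
        printf("%f\n", average(3.0, 5.0));
        printf("%f\n", average(10.0, 20.0));
        return 0;
    }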

Machine Language


An opcode works like a shorthand: as mentioned, it stands for a machine instruction. The opcode is translated by another program into the zeros and ones that the machine executes directly.
But the relation is still one to one: one code for one single instruction. However basic, this is already a programming language. It was called assembly language.
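As a rough sketch (mine, using a made-up three-instruction machine), this is all such a translator does, expressed in C: look up each mnemonic and emit exactly one numeric opcode.

    #include <stdio.h>
    #include <string.h>

    /* A toy instruction set: each mnemonic maps to exactly one opcode. */
    struct instruction { const char *mnemonic; unsigned char opcode; };

    static const struct instruction table[] = {
        { "LOAD",  0x01 },
        { "ADD",   0x02 },
        { "STORE", 0x03 },
    };

    /* The whole job of this toy "assembler": one mnemonic in,
       one opcode out. */
    int assemble(const char *mnemonic)
    {
        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
            if (strcmp(table[i].mnemonic, mnemonic) == 0)
                return table[i].opcode;
        return -1; /* unknown mnemonic */
    }

    int main(void)
    {
        printf("ADD -> 0x%02x\n", assemble("ADD"));
        return 0;
    }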

Programming the new tools (or toys?)


Programming the first binary computers was still not an easy task and was very prone to mistakes. At first, programming was done by typing in the 1's and 0's directly, and these were stored on various information carriers: paper tapes, punched cards, mercury delay lines (sound pulses) and later magnetic drums and, much later, magnetic and optical discs.
By storing these 0's and 1's on a carrier (first done for Konrad Zuse's Z1 in 1938) it was possible to have the computer read the data back at any later time. But mistyping a single zero or one spelled disaster, because all the coding (the instructions) had to be in exactly the right place and in the right order in memory. This technique was called absolute addressing.
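A hypothetical C sketch (my illustration) of what absolute addressing implies: every instruction must sit at a fixed memory location, and a fetch is nothing more than a lookup by that fixed address.

    #include <stdio.h>

    int main(void)
    {
        /* Toy "memory": the program only works if each opcode sits
           at its exact absolute address. */
        unsigned char memory[3];
        memory[0] = 0x01; /* LOAD  belongs at address 0 */
        memory[1] = 0x02; /* ADD   belongs at address 1 */
        memory[2] = 0x03; /* STORE belongs at address 2 */

        /* A fetch is a plain lookup by absolute address; had one
           extra or missing value shifted the contents by a single
           position, address 1 would no longer hold ADD and the run
           would be garbage -- the disaster described above. */
        unsigned char next = memory[1];
        printf("opcode at address 1: 0x%02x\n", next);
        return 0;
    }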

Information Age

In the beginning of the so-called "Information Age", computers were programmed by "programming" direct instructions into them. This was done by setting switches or by making connections between the different logical units with wires (circuitry).

Programming language

Programming languages have been under development for years and will remain so for many years to come. They got their start as lists of steps for wiring a computer to perform a task. These steps eventually found their way into software and began to acquire newer and better features. The first major languages were characterized by the simple fact that each was intended for one purpose and one purpose only, while the languages of today are differentiated mainly by the style in which they are programmed, since almost any of them can be used for almost any purpose. And perhaps the languages of tomorrow will be more natural still, with the invention of quantum and biological computers.

Microsoft

Microsoft has extended BASIC in its Visual Basic (VB) product. The heart of VB is the form: a blank window onto which you drag and drop components such as menus, pictures, and slider bars. These items are known as "widgets". Widgets have properties (such as their color) and events (such as clicks and double-clicks), and this model is central to building any user interface today, in any language. VB is most often used to create quick and simple front ends to other Microsoft products such as Excel and Access without needing a lot of code, though it is possible to create full applications with it.
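The property-and-event model is not tied to VB. As a rough illustration (mine, written in C to match the other sketches here rather than in VB itself), a widget can be seen as a record of properties plus a callback that fires when an event occurs.

    #include <stdio.h>

    /* A widget: a bundle of properties plus an event handler. */
    struct widget {
        const char *color;        /* a property         */
        void (*on_click)(void);   /* an event callback  */
    };

    static void button_clicked(void)
    {
        printf("button was clicked\n");
    }

    int main(void)
    {
        struct widget button = { "red", button_clicked };

        /* A real toolkit would dispatch this when the user clicks;
           here we fire the event by hand. */
        button.on_click();
        printf("color property: %s\n", button.color);
        return 0;
    }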