Computer programming and its rich history
If the history of programming is to be told, it is safe to begin an account with Charles Babbage's difference engine of 1822. Even though these early machines were simple, they still needed instructions fed into them in order to perform their tasks. Such a set of instructions is what is known today as a computer program.
In the era of the difference engine, the machine's gears had to be positioned by hand in order to carry out a calculation. All this changed when electrical signals replaced physical motion in the ENIAC, a machine built for the U.S. government and completed in 1945. ENIAC, too, followed the principle of accepting a set of instructions, a program.
In 1945, John von Neumann, then at the Institute for Advanced Study, developed two essential concepts that directly influenced the path of programming languages. The first was the stored-program technique. Under this concept the hardware should stay simple and should not need to be rewired by hand for each program; instead, complex instructions held in memory control the simple hardware, which makes reprogramming much faster.
The second concept, “conditional control transfer,” introduced blocks of code that could be jumped to and reused from different parts of a program, now known as subroutines. The other part of the concept was logical branching: code that executes only when a condition holds. Together these created the idea of blocks of code that can be used and reused.
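The two ideas above are easiest to see side by side in modern code. This Python sketch (the function name is illustrative, not historical) shows a reusable subroutine containing a logical branch:

```python
def absolute_value(x):
    """A reusable subroutine: the same block serves every call site."""
    if x < 0:          # logical branching: control transfers on a condition
        return -x
    return x

print(absolute_value(-7))  # prints 7
print(absolute_value(3))   # prints 3
```

The point of "conditional control transfer" is exactly this: one block of code, written once, invoked from anywhere, with its behavior steered by conditions at run time.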
In 1949, the Short Code language was released. She became the mother of the computer language of electronic devices. With this language, the programmer had to use 0 and 1 instead of the old explanations. 1951 marked the appearance of the compiler called A-0 by Grace Hopper. This program has translated all 0’s and 1’s to the computer. This gave way to a much faster schedule.
FORTRAN (FORmula TRANslating System) was introduced in 1957 and was the first major language. It was designed by IBM for scientific computation, and it included the GOTO, DO, and IF statements. FORTRAN's strength, however, was not business computing: it was good at manipulating numbers but poorly suited to handling business data.
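FORTRAN's DO and IF statements are the ancestors of the loop and conditional constructs in nearly every later language. A rough Python analogue (the FORTRAN fragments in the comments are schematic, not a compilable program):

```python
# Sum the even numbers from 1 to 10, roughly as FORTRAN would with
# a DO loop and a logical IF.

total = 0
for i in range(1, 11):   # analogue of: DO 10 I = 1, 10
    if i % 2 == 0:       # analogue of a logical IF statement
        total += i
print(total)  # prints 30
```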
COBOL was then developed in 1959. It was designed as the language of business. A COBOL program reads rather like an essay, with four or five major sections building into a whole, which made the language easy to study.
The LISP language, developed in 1958 by John McCarthy for artificial-intelligence research and also known as Cambridge Polish, is highly abstract and specialized, which is one reason it is still in use. LISP stores its data, and even its programs, as lists, and a program can manipulate those lists itself.
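LISP's list handling rests on two primitives, car (the head of a list) and cdr (the rest of it). This Python sketch borrows the LISP names purely for illustration:

```python
def car(lst):
    """Head of a list, as in LISP's car."""
    return lst[0]

def cdr(lst):
    """Everything after the head, as in LISP's cdr."""
    return lst[1:]

# In LISP, code itself is such a list: (plus 1 2)
expr = ["plus", 1, 2]
print(car(expr))   # prints plus
print(cdr(expr))   # prints [1, 2]
```

Because programs are lists, a LISP program can build and transform other programs with the same two operations, which is much of what made the language so well suited to AI research.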
In the same year, the ALGOL language was produced. It became the parent of Pascal, C, C++, and also Java. ALGOL also had the first formal grammar, known as Backus-Naur Form, or BNF. ALGOL 68, the next version, proved much harder to use, and that difficulty led to Pascal.
Niklaus Wirth introduced the Pascal language in 1968. It became a widely used teaching language, combining features of ALGOL, FORTRAN, and COBOL. It was also Pascal that improved the pointer data type. Its downfall was its lack of variable-sized groups such as dynamic arrays. Wirth later released Modula-2, but by the time it appeared, C was already popular with many users.
Dennis Ritchie's C (1972, used to develop Unix) was similar to Pascal, but its direct predecessors were B and BCPL. It is still used today on Windows, Linux, and macOS. OOP (object-oriented programming) was developed through the 1970s and 1980s and led to the C++ language in 1983. C++ was designed for large programs that handle many tasks at once, and it became the language of choice in AP Computer Science courses. In 1987, Perl (Practical Extraction and Reporting Language) was developed.
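The core OOP idea that C++ popularized is bundling data together with the procedures that act on it. A minimal sketch in Python (the class and its names are illustrative):

```python
class Counter:
    """Data and behavior packaged into one type."""

    def __init__(self):
        self.value = 0

    def increment(self):
        # The behavior lives alongside the data it changes.
        self.value += 1

c = Counter()
c.increment()
c.increment()
print(c.value)  # prints 2
```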
Java soon followed in the mid-1990s. It still has many goals to achieve, especially regarding the slowness of its programs, but there are high expectations that much lies ahead for this language. Microsoft has also developed VB, or Visual Basic, which is built around widgets and is now in wide use.
Many more developments lie ahead for computer programming. It may have started in a rough way, but looking at the languages in use today, so much has been achieved that we can only imagine what “impossible” things may become possible very soon.