History of the computer: what it was like and its characteristics

We explain and summarize the history of computers, what the first ones that were invented were like, and what their main characteristics are.

The computer is the most advanced and efficient calculating machine ever invented.

What is the history of the computer?

The history of the computer is the account of events, innovations and technological developments in the field of computing and automation, which gave rise to the machines we know as computers. It also records their improvement and updating until reaching the miniaturized and fast versions of the 21st century.

Computers are the most advanced and efficient calculating machines invented by humans. They are equipped with sufficient operational power, autonomy and speed to replace humans in many tasks, and to allow virtual and digital work dynamics that were never before possible.

The invention of these devices in the 20th century forever revolutionized the way we understand industrial processes, work, society and countless other areas of our lives. It affects everything from the way we relate to one another to the kinds of global information exchange we are capable of carrying out.

Computer background

Joseph Marie Jacquard invented a punched card system in 1802.

The computer has a long prehistory, going back to the first slide rules and the first machines designed to ease the task of arithmetic for humans. The abacus, for example, was an important advance in the field, created around 4000 BC.

There were also much later inventions, such as Blaise Pascal’s machine, known as Pascal’s Machine or the Pascaline, created in 1642. It consisted of a series of gears that allowed arithmetic operations to be performed. This machine was improved by Gottfried Leibniz in 1671, and with it the history of calculators began.

Human attempts at automation have continued ever since: Joseph Marie Jacquard invented a punched card system in 1802 to automate his looms, and in 1822 the Englishman Charles Babbage used these cards to create a difference engine.

Only twelve years later (1834), he improved on his design and conceived an Analytical Engine capable of the four arithmetic operations and of storing numbers in a memory (up to 1,000 numbers of 50 digits). For this reason, Babbage is considered the father of computing, since this machine represents a leap toward the world of computing as we know it.

Invention of the computer

The invention of the computer cannot be attributed to a single person. Babbage is considered the father of the branch of knowledge that would later become computing, but the first computer as such would not be built until much later.

Another key figure in this process was Alan Turing, creator of a machine capable of performing any calculation, which he called the “universal machine” or “Turing machine.” The ideas used to build it were the same ones that later gave birth to the first computers.

Another important case was that of ENIAC (Electronic Numerical Integrator and Computer), created by two professors at the University of Pennsylvania in 1943 and considered the grandfather of computers proper. It consisted of some 18,000 vacuum tubes and filled an entire room.

Invention of transistors

Transistors were essential for the manufacture of the first microchips.

The history of computers would not have taken the course it did without the invention of the transistor in 1947, the fruit of work at Bell Laboratories in the United States. These devices are electrical switches made of solid materials, with no need for a vacuum.

This discovery was essential for the manufacture of the first microchips, and allowed the transition from electrical to electronic devices. The first integrated circuits (that is, chips) appeared in 1958, the result of the efforts of Jack Kilby and Robert Noyce. The former received the Nobel Prize in Physics in 2000 for the discovery.

The first computer

The German Z3 was the first fully operational programmable computer.

The first computers emerged as logical calculating machines, driven by the needs of the Allies during World War II. Decoding the transmissions of the warring sides required fast, constant calculation.

For this reason, Harvard University, with the help of IBM, designed the first electromechanical computer in 1944, christened the Mark I. It was about 15 meters long and 2.5 meters high, enclosed in a glass and stainless steel case. It had 760,000 parts, 800 kilometers of cables and 420 control switches. It served for 16 years.

At the same time, in Germany, Konrad Zuse had developed the Z1 and Z2, test models of similar computers, and went on to complete his fully operational Z3, based on the binary system. It was smaller and cheaper to build than its American counterpart.

The first commercial computer

In February 1951 the Ferranti Mark 1 appeared, a commercially available version of the Manchester Mark 1. It was extremely important in the history of the computer, as it included index registers, which made it easier to read a set of words in memory.

As many as thirty-four different patents arose from its development. In later years it served as the basis for IBM computers, which were very successful industrially and commercially.

The first programming language

FORTRAN, an acronym for The IBM Mathematical Formula Translating System, appeared in 1953. It was developed as the first formal programming language, that is, the first language designed for writing computer programs, by IBM programmers led by John Backus.

It was initially developed for the IBM 704 computer and for a wide range of scientific and engineering applications, which is why it has gone through a long series of versions over half a century of use. It remains one of the most popular programming languages for the world’s supercomputers.

The first modern computer

Engelbart invented the mouse and the graphical user interface.

The first modern computer appeared in the fall of 1968, as a prototype presented by Douglas Engelbart. It featured, for the first time, a mouse or pointer and a graphical user interface (GUI), forever changing the way users and computer systems would interact.

Engelbart’s presentation lasted 90 minutes and included an on-screen connection to his research center, thus constituting the first videoconference in history. The Apple and, later, Windows systems were descendants of this first prototype.

Secondary storage devices

The 3 ½ inch floppy disks were rigid, colored, and much smaller.

The first devices for exchanging information between computers were floppy disks, created in 1971 by IBM. These were black squares of flexible plastic with a magnetizable disc in the middle that allowed information to be recorded and retrieved. There were several types of diskette:

  • 8 inches. The first to appear, bulky and with a capacity between 79 and 512 kbytes.
  • 5 ¼ inches. Similar to the 8-inch ones but smaller, they stored between 89 and 360 kbytes.
  • 3 ½ inches. Introduced in the 1980s, they were rigid, colored and much smaller, with a capacity between 720 and 1,440 kbytes.

There were also high- and low-density versions, as well as numerous cassette variants. At the end of the 1980s, the appearance and mass adoption of the compact disc (CD) completely displaced the format, increasing the speed and capacity of data retrieval.

Finally, at the turn of the century, all of these device formats became obsolete and were replaced by the pen drive, or removable flash memory, with varied (but much greater) capacity, high speed and extreme portability.

The first computer networks

The world’s first computer network was ARPANET, created in 1968 by the United States Department of Defense. It served as a fast platform for the exchange of information between educational and state institutions, probably for military purposes.

This network was developed and updated until it eventually became the backbone of the Internet, which was opened to the general public around 1990.

21st century computers

The rise of robotics promises to leave many workers unemployed.

Computers today are part of everyday life, to the point that for many it is already inconceivable to live in a world without them. They are found in our offices, in our cell phones, in various household appliances, in charge of automated installations, and performing countless operations automatically and independently.

This has many positive aspects, but it also raises many fears. For example, the emergence of robotics, a natural next step in computing, promises to leave many human workers unemployed, surpassed by an automation capacity that grows greater and faster every day.
