Electronic computers. History

Appendix 4

Test on the topic:

"History of the development of computer technology"

Choose the correct answer

1. An electronic computer is:

a) a complex of hardware and software information processing tools;

b) a complex of technical means for automatic information processing;

c) a model that establishes the composition, order and principles of interaction of its components.

2. A personal computer is:

a) a computer for an individual buyer;

b) a computer that provides dialogue with the user;

c) a desktop or personal computer that meets the requirements of general accessibility and universality.

3. Inventor of a mechanical device that allows you to add numbers:

a) P. Norton;

b) B. Pascal;

c) G. Leibniz;

d) D. Napier.

4. The scientist who combined the idea of a mechanical machine with the idea of program control:

a) C. Babbage (mid-19th century);

b) J. Atanasoff (1930s);

c) C. Berry (20th century);

d) B. Pascal (mid-17th century).

5. The world's first programmer is:

a) G. Leibniz;
b) C. Babbage;

c) J. von Neumann;

d) A. Lovelace.

6. The country where the first computer implementing the principles of program control was created:

b) England;

c) Germany.

7. Founder of domestic computer technology:

8. The city in which the first domestic computer was created:

b) Moscow;

c) Saint Petersburg;

d) Yekaterinburg.

9. Means of communication between the user and the second generation computer:

a) punched cards;

b) magnetic tokens;

c) magnetic tapes;

d) magnetic tokens.

10. First tool for counting

a) sticks;

b) pebbles;

c) human hand;

d) shells.

11. Number system in Russian abacus:

a) binary;

b) quinary;

c) octal;

d) decimal.

12. Scope of application of first generation computers:

a) design;

b) engineering and scientific calculations;

c) banking;

d) architecture and construction.

13. The computer generation during which high-level programming languages began to appear:

a) first;

b) second;

c) third;

d) fourth.

14. The generation of computers whose element base was transistors:

a) first;

b) second;

c) third;

d) fourth.

15. Programming language in first generation machines:

a) machine code;

b) Assembler;

c) BASIC

d) Fortran

Select all correct answers:

16. Elements of third generation computers:

a) integrated circuits;

b) microprocessors

c) CRT-based display

d) magnetic disks

e) mouse manipulator

17. Elements of Babbage's Analytical Engine

a) input block;

b) microprocessor;

c) output block;

d) store (warehouse);

e) mill;

f) result printing block;

g) arithmetic device;

h) memory;

18. Elements of a fourth generation computer:

a) integrated circuits;

b) microprocessors;

c) color display;

d) transistors;

e) joystick manipulator;

f) plotters.

19. The very first counting devices


Performance evaluation scale (points / grade)

As soon as humans grasped the concept of "quantity", they immediately began to devise tools that would optimize and simplify counting. Today, super-powerful computers, based on the principles of mathematical calculation, process, store and transmit information - the most important resource and engine of human progress. It is not difficult to form an idea of how computer technology developed by briefly reviewing the main stages of this process.

The main stages of the development of computer technology

The most popular classification distinguishes the main stages of the development of computer technology on a chronological basis:

  • The manual stage. It began at the dawn of human history and continued until the middle of the 17th century. During this period the basics of counting emerged. Later, with the formation of positional number systems, devices appeared (the abacus, counting frames, and later the slide rule) that made digit-by-digit calculation possible.
  • The mechanical stage. It began in the middle of the 17th century and lasted almost until the end of the 19th century. The level of science in this period made it possible to create mechanical devices that performed the basic arithmetic operations and automatically carried over the higher digits.
  • The electromechanical stage is the shortest of all the stages that make up the history of computer technology: it lasted only about 60 years. This is the period between the invention of the first tabulator in 1887 and 1946, when the very first computer (ENIAC) appeared. The new machines, whose operation was based on an electric drive and electric relays, made it possible to perform calculations with much greater speed and accuracy, but the counting process still had to be controlled by a person.
  • The electronic stage began in the second half of the last century and continues today. This is the story of six generations of electronic computers - from the very first giant units based on vacuum tubes to the ultra-powerful modern supercomputers, with huge numbers of processors working in parallel and capable of executing many commands simultaneously.

The division of the stages of development of computer technology by chronology is rather arbitrary: while some types of computing devices were still in use, the prerequisites for the next ones were already being actively created.

The very first counting devices

The earliest counting tool known to the history of the development of computer technology is the ten fingers on human hands. Counting results were initially recorded using fingers, notches on wood and stone, special sticks, and knots.

With the advent of writing, various ways of recording numbers appeared, and positional number systems were invented (decimal in India, sexagesimal in Babylon).

Around the 4th century BC, the ancient Greeks began to count using the abacus. Initially it was a flat clay tablet with lines scratched into it with a sharp object. Counting was carried out by placing pebbles or other small objects on these lines in a certain order.

In China, in the 4th century AD, a seven-bead abacus appeared - the suanpan. Wires or cords, nine or more, were stretched across a rectangular wooden frame. Another wire (cord), stretched perpendicular to the others, divided the suanpan into two unequal parts. In the larger compartment, called "earth", five beads were strung on each wire; in the smaller compartment, called "heaven", there were two. Each wire corresponded to a decimal place.
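To make the bead arithmetic concrete, here is a minimal sketch in Python (the function names are illustrative, not taken from any historical source) of how a decimal number maps onto suanpan rods, with each "heaven" bead counted as five and each "earth" bead as one:

```python
def suanpan_digit(digit: int) -> tuple[int, int]:
    """Return (heaven_beads, earth_beads) for one decimal digit.

    A 'heaven' bead counts as 5 and an 'earth' bead as 1,
    so 7 is shown as one heaven bead plus two earth beads.
    """
    if not 0 <= digit <= 9:
        raise ValueError("one rod holds a single decimal digit (0-9)")
    return digit // 5, digit % 5


def suanpan_number(n: int) -> list[tuple[int, int]]:
    """Encode a non-negative integer rod by rod, most significant digit first."""
    return [suanpan_digit(int(d)) for d in str(n)]


print(suanpan_number(1941))  # [(0, 1), (1, 4), (0, 4), (0, 1)]
```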

The traditional soroban abacus became popular in Japan from the 16th century, having arrived there from China. At about the same time, the counting frame (schoty) appeared in Russia.

In the 17th century, on the basis of the logarithms discovered by the Scottish mathematician John Napier, the Englishman Edmund Gunter invented the slide rule. This device was constantly improved and has survived to this day. It allows one to multiply and divide numbers, raise them to powers, and determine logarithms and trigonometric functions.
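The slide rule rests on a single identity: since log(ab) = log a + log b, multiplication reduces to adding two lengths on logarithmic scales. A minimal sketch of the same idea in Python (illustrative only, not a model of any particular rule):

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Add the logarithms of two positive numbers and read the result
    back off the log scale - exactly what sliding the scales does."""
    return math.exp(math.log(a) + math.log(b))

def slide_rule_divide(a: float, b: float) -> float:
    """Division is the same trick with the lengths subtracted."""
    return math.exp(math.log(a) - math.log(b))

print(slide_rule_multiply(3.0, 7.0))  # ~21.0, limited in precision like a real rule
print(slide_rule_divide(21.0, 3.0))   # ~7.0
```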

The slide rule became a device that completed the development of computer technology at the manual (pre-mechanical) stage.

The first mechanical calculating devices

In 1623, the German scientist Wilhelm Schickard created the first mechanical "calculator", which he called a counting clock. The mechanism of this device resembled an ordinary clock, consisting of gears and sprockets. However, this invention became known only in the middle of the last century.

A quantum leap in the field of computing technology was the invention of the Pascaline adding machine in 1642. Its creator, the French mathematician Blaise Pascal, began work on this device before he was even 20 years old. The Pascaline was a mechanical device in the form of a box with a large number of interconnected gears. The numbers to be added were entered into the machine by turning special wheels.

In 1673, the Saxon mathematician and philosopher Gottfried von Leibniz invented a machine that performed the four basic arithmetic operations and could extract square roots. The principle of its operation was based on the binary number system, specially devised by the scientist.
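For reference, a minimal sketch (plain Python, purely illustrative) of how a number is written in the binary system: divide by two repeatedly and collect the remainders.

```python
def to_binary(n: int) -> str:
    """Build the base-2 representation of a non-negative integer
    by repeated division by 2, collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n:
        bits.append(str(n % 2))
        n //= 2
    return "".join(reversed(bits))

print(to_binary(1673))  # '11010001001' - 1673, the year mentioned above
```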

In 1818, the Frenchman Charles (Karl) Xavier Thomas de Colmar, taking Leibniz's ideas as a basis, invented an adding machine that could multiply and divide. Two years later, the Englishman Charles Babbage began to construct a machine capable of performing calculations with an accuracy of 20 decimal places. That project remained unfinished, but in 1830 its author developed another one - an analytical engine for performing precise scientific and technical calculations. The machine was to be controlled by a program, and punched cards with different arrangements of holes were to be used for inputting and outputting information. Babbage's project anticipated the development of electronic computing technology and the problems that could be solved with its help.

It is noteworthy that the fame of the world's first programmer belongs to a woman - Lady Ada Lovelace (née Byron). It was she who created the first programs for Babbage's machine. One of the computer languages was subsequently named after her.

Development of the first computer analogues

In 1887, the history of the development of computer technology reached a new stage. The American engineer Herman Hollerith managed to design the first electromechanical computing machine - the tabulator. Its mechanism included a relay, as well as counters and a special sorting box. The device read and sorted statistical records made on punched cards. Subsequently, the company founded by Hollerith became the backbone of the world-famous computer giant IBM.

In 1930, the American Vannevar Bush created a differential analyzer. It was powered by electricity, and vacuum tubes were used to store data. This machine was capable of quickly finding solutions to complex mathematical problems.

Six years later, the English scientist Alan Turing developed the concept of a machine that became the theoretical basis of today's computers. It had all the main properties of modern computing technology: it could carry out, step by step, operations programmed in its internal memory.

A year after this, George Stibitz, a scientist from the USA, invented the country's first electromechanical device capable of performing binary addition. Its operation was based on Boolean algebra - the mathematical logic created in the mid-19th century by George Boole, which uses the logical operators AND, OR and NOT. Later, the binary adder would become an integral part of the digital computer.
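A minimal sketch of that idea in Python: a one-bit full adder is built from the operators AND, OR and NOT alone (XOR is composed from them), and chaining such adders adds whole binary numbers. The function names are illustrative, not Stibitz's own:

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry; return (sum_bit, carry_out)."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add_binary(x, y, width=8):
    """Ripple-carry addition of two small non-negative integers, bit by bit."""
    carry, bits = 0, []
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        bits.append(s)
    return sum(bit << i for i, bit in enumerate(bits))

print(add_binary(22, 19))  # 41
```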

In 1938, Claude Shannon, then at the Massachusetts Institute of Technology, set out the principles of the logical design of a computer that uses electrical circuits to solve Boolean algebra problems.

The beginning of the computer era

The governments of the countries involved in World War II were aware of the strategic role of computing in the conduct of military operations. This was the impetus for the development and parallel emergence of the first generation of computers in these countries.

A pioneer in the field of computer engineering was the German engineer Konrad Zuse. In 1941, he created the first computer controlled by a program. The machine, called the Z3, was built on telephone relays, and programs for it were encoded on punched tape. This device could work in the binary system and operate with floating-point numbers.

The next model of Zuse's machine, the Z4, is officially recognized as the first truly working programmable computer. Zuse also went down in history as the creator of the first high-level programming language, called Plankalkül.

In 1942, the American researchers John Atanasoff and Clifford Berry created a computing device that ran on vacuum tubes. The machine also used binary code and could perform a number of logical operations.

In 1943, in an English government laboratory, in an atmosphere of secrecy, the first computer, called "Colossus", was built. Instead of electromechanical relays, it used two thousand vacuum tubes for storing and processing information. It was intended to break and decrypt the code of secret messages transmitted by the German Enigma cipher machine, which was widely used by the Wehrmacht. The existence of this device was kept in the strictest confidence for a long time; after the end of the war, the order for its destruction was signed personally by Winston Churchill.

Architecture development

In 1945, the Hungarian-American mathematician John von Neumann (Janos Lajos Neumann) created the prototype of the architecture of modern computers. He proposed writing a program in the form of code directly into the machine's memory, implying the joint storage of programs and data in the computer's memory.
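The essence of the stored-program idea is that instructions and data share one memory, and instructions are fetched and executed one after another. A deliberately tiny illustration in Python - the three-instruction machine below is invented for this sketch and does not correspond to any real computer:

```python
def run(memory):
    """Fetch-decode-execute loop over one shared memory.

    Instruction cells hold (opcode, address) tuples; data cells hold plain integers.
    """
    pc, acc = 0, 0                          # program counter and accumulator
    while True:
        op, addr = memory[pc]               # fetch and decode the next instruction
        pc += 1
        if op == "LOAD":
            acc = memory[addr]              # read a data cell into the accumulator
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc              # write the accumulator back to memory
        elif op == "HALT":
            return memory

# program in cells 0-3, data in cells 4-6, all in the same memory
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(memory)[6])  # 5
```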

Von Neumann's architecture formed the basis of ENIAC, the first universal electronic computer, which was being created at that time in the United States. This giant weighed about 30 tons and occupied an area of 170 square meters. Eighteen thousand vacuum tubes were used in the machine. The computer could perform 300 multiplications or 5 thousand additions in one second.

Europe's first universal programmable computer was created in 1950 in the Soviet Union (in Ukraine). A group of Kyiv scientists led by Sergei Alekseevich Lebedev designed the small electronic calculating machine MESM. Its speed was 50 operations per second, and it contained about 6 thousand vacuum tubes.

In 1952, domestic computer technology was augmented by BESM, the large electronic calculating machine, also developed under Lebedev's leadership. This computer, which performed up to 10 thousand operations per second, was at that time the fastest in Europe. Information was entered into the machine's memory from punched paper tape, and data was output by photographic printing.

During the same period, a series of large computers under the general name "Strela" was produced in the USSR (the author of the development was Yuri Yakovlevich Bazilevsky). From 1954, serial production of the universal computer "Ural" began in Penza under the leadership of Bashir Rameev. The later models were hardware- and software-compatible with one another, and a wide selection of peripheral devices made it possible to assemble machines of various configurations.

Transistors. Release of the first serial computers

However, the vacuum tubes failed very quickly, making it very difficult to work with the machines. The transistor, invented in 1947, solved this problem. Using the electrical properties of semiconductors, it performed the same tasks as vacuum tubes, but occupied much less space and consumed far less energy. Together with the advent of ferrite cores for organizing computer memory, the use of transistors made it possible to significantly reduce the size of machines and to make them more reliable and faster.

In 1954, the American company Texas Instruments began mass-producing transistors, and two years later the first second-generation computer built on transistors, the TX-0, appeared in Massachusetts.

By the middle of the last century, a substantial proportion of government organizations and large companies were using computers for scientific, financial and engineering calculations and for working with large amounts of data. Gradually, computers acquired features familiar to us today. During this period plotters, printers, and storage media on magnetic disks and tape appeared.

The active use of computer technology led to an expansion of its areas of application and required the creation of new software technologies. High-level programming languages appeared (Fortran, Cobol and others) that made it possible to transfer programs from one machine to another and simplified the process of writing code. Special translator programs appeared that convert code written in these languages into commands the machine can perceive directly.

The emergence of integrated circuits

In 1958-1960, thanks to the United States engineers Robert Noyce and Jack Kilby, the world learned of the existence of integrated circuits. Miniature transistors and other components - sometimes up to hundreds or thousands of them - were mounted on a silicon or germanium crystal base. The chips, just over a centimeter in size, were much faster than discrete transistors and consumed much less power. The history of the development of computer technology links their appearance with the emergence of the third generation of computers.

In 1964, IBM released the first computer of the System/360 family, which was based on integrated circuits. The mass production of computers can be dated from this time. In total, more than 20 thousand copies of this computer were produced.

In 1972, the ES (Unified System) computers were developed in the USSR. These were standardized complexes for the operation of computing centers that had a common command system. The American IBM System/360 was taken as the basis.

The following year, DEC released the PDP-8 minicomputer, which became the first commercial project in this area. The relatively low cost of minicomputers made it possible for small organizations to use them.

During the same period, software was constantly improved. Operating systems were developed to support the maximum number of external devices, and new programs appeared. In 1964, BASIC was developed - a language designed specifically for training novice programmers. Five years later, Pascal appeared, which turned out to be very convenient for solving many applied problems.

Personal computers

After 1970, production of the fourth generation of computers began. The development of computer technology at this time is characterized by the introduction of large-scale integrated circuits into computer production. Such machines could now perform thousands of millions of computational operations per second, and their RAM capacity grew to 500 million bits. A significant reduction in the cost of microcomputers meant that the opportunity to buy one gradually became available to the average person.

Apple was one of the first manufacturers of personal computers. Its founders, Steve Jobs and Steve Wozniak, designed their first PC model in 1976, calling it the Apple I. It cost only $500. A year later the company's next model, the Apple II, was presented.

For the first time, the computer of this period resembled a household appliance: in addition to its compact size, it had an elegant design and a user-friendly interface. The proliferation of personal computers at the end of the 1970s meant that demand for mainframe computers fell markedly. This seriously worried their manufacturer, IBM, and in 1979 it brought its first PC to the market.

Two years later, the company's first microcomputer with an open architecture appeared, based on the 16-bit 8088 microprocessor manufactured by Intel. The computer was equipped with a monochrome display, two drives for five-inch floppy disks, and 64 kilobytes of RAM. At the request of the machine's creator, Microsoft developed an operating system specially for it. Numerous IBM PC clones appeared on the market, spurring the growth of industrial production of personal computers.

In 1984, Apple developed and released a new computer - the Macintosh. Its operating system was extremely user-friendly: it presented commands in the form of graphic images and allowed them to be entered with a manipulator - the mouse. This made the computer even more accessible, since no special skills were now required of the user.

Some sources date the fifth generation of computing technology to 1992-2013. Briefly, its main concept is formulated as follows: these are computers built on highly complex microprocessors with a parallel-vector structure, which makes it possible to execute dozens of sequential program commands simultaneously. Machines with several hundred processors working in parallel make it possible to process data even more accurately and quickly, and also to create efficient networks.

The development of modern computer technology already allows us to speak of a sixth generation of computers. These are electronic and optoelectronic computers running on tens of thousands of microprocessors, characterized by massive parallelism and modeled on the architecture of biological neural systems, which allows them to successfully recognize complex images.

Having consistently examined all the stages of the development of computer technology, we should note an interesting fact: inventions that proved themselves well at each stage have survived to this day and continue to be used successfully.

Classes of Computers

There are various ways of classifying computers.

By purpose, computers are divided into:

  • universal - those capable of solving a wide variety of mathematical, economic, engineering, technical, scientific and other problems;
  • problem-oriented - those that solve problems of a narrower range, associated, as a rule, with the management of particular processes (data recording, the accumulation and processing of small amounts of information, calculations according to simple algorithms). They have more limited software and hardware resources than the first group of computers;
  • specialized - computers that usually solve strictly defined tasks. They have a highly specialized structure and, with a relatively simple device and control system, are quite reliable and productive in their field. These include, for example, controllers and adapters that control a number of devices, as well as programmable microprocessors.

By size and computing power, modern electronic computing equipment is divided into:

  • ultra-large (supercomputers);
  • large computers;
  • small computers;
  • ultra-small (microcomputers).

Thus, we have seen that the devices man first invented to keep track of resources and valuables, and later to carry out complex calculations and computational operations quickly and accurately, were constantly developing and improving.

History of the development of computer technology

The development of computing technology can be broken down into the following periods:

  • Manual (6th century BC - 17th century AD)

  • Mechanical (17th century - mid-20th century)

  • Electronic (mid-20th century - present)

Although Prometheus in Aeschylus's tragedy declares: "Think what I did for mortals: I invented number for them and taught them to join letters," the concept of number arose long before the advent of writing. People learned to count over many centuries, passing on and enriching their experience from generation to generation.

Counting, or more broadly, calculation, can be carried out in various forms: there is oral, written and instrumental counting. Instrumental counting aids at different times had different capabilities and were called by different names.

Manual stage (6th century BC - 17th century AD)

The emergence of counting in ancient times - “This was the beginning of beginnings...”

The estimated age of the most recent generation of humanity is 3-4 million years. It was that long ago that man stood upright and picked up a tool he had made himself. However, the ability to count (that is, the ability to break the concepts of "more" and "less" down into a specific number of units) developed in humans much later, some 40-50 thousand years ago (the Late Paleolithic). This stage corresponds to the appearance of modern man (the Cro-Magnon). Thus, one of the main (if not the main) characteristics distinguishing the Cro-Magnon from the earlier stage of man is the presence of counting abilities.

It is not difficult to guess that man's first counting device was his fingers.

Fingers turned out to be an excellent computing machine. With their help one could count up to 5, and with two hands up to 10. And in countries where people went barefoot, it was easy to count to 20 on the fingers and toes. At the time, this was practically enough for most people's needs.

Fingers turned out to be so closely connected with counting that in ancient Greek the very concept of "counting" was expressed by a word meaning "to count by fives". And the Russian word for "five", pyat', resembles pyast' - the metacarpus, part of the hand (the word pyast' is rarely used now, but its derivative zapyast'e, "wrist", is still common). Among many peoples the hand, the metacarpus, is a synonym for and in fact the basis of the numeral "five". For example, the Malay lima means both "hand" and "five".

However, peoples are known whose unit of counting was not the fingers but their joints.

Having learned to count on their fingers up to ten, people took the next step forward and began to count in tens. And while some Papuan tribes could count only to six, others could count up to several tens - it was just necessary to invite many counters at once.

In many languages the words "two" and "ten" are consonant. Perhaps this is explained by the fact that the word "ten" once meant "two hands". And there are still tribes that say "two hands" instead of "ten" and "hands and feet" instead of "twenty". And in English the first ten numbers are called by a common name, "digits", i.e. fingers. This means that the English, too, once counted on their fingers.

Finger counting has survived in some places to this day. For example, the historian of mathematics L. Karpinski reports in his book "The History of Arithmetic" that at the world's largest grain exchange, in Chicago, offers and requests, as well as prices, were announced by brokers on their fingers without a single word being spoken.

Later, counting by moving pebbles appeared, and counting with the help of rosary-like beads... This was a significant breakthrough in human counting abilities - the beginning of the abstraction of number.

Who invented the computing machine

Complex modern radio systems and even many household appliances are unthinkable without computer technology, so readers of Radio will find it interesting to learn about the origins of the computer.

At the origin of this process stood the English mathematician Charles Babbage (1791-1871). His "analytical engine" anticipated the advent of computers by more than a hundred years. A man of varied interests, he also studied geology, archaeology and astronomy. Babbage's works on economics, political science and theology are well known. But in the annals of history he will forever remain the inventor of the world's first general-purpose digital computing machine. The scientist conceived the idea of creating it in 1833 and devoted the rest of his life to this cause.

Babbage's machine, unlike modern computers, worked not in the binary but in the decimal number system, yet it was based on essentially the same principles. For example, it contained logical elements.

Theoretically, Babbage's machine could perform any mathematical operations by storing sequences of commands in memory (in modern terms, a program) and using punched cards as a large-capacity storage device for mathematical tables, data entry and programs. Babbage borrowed the idea of punched cards from the textile industry, where they were used in the Jacquard loom.

In the technical work on the machine, Babbage was helped by the mathematically gifted daughter of the poet Byron, Ada Byron (Lovelace by marriage) - the world's first programmer. The programming language Ada is named in her honor. "The Analytical Engine," wrote Lady Lovelace, "weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves."

The central processor (in modern terminology) of the Analytical Engine contained fifty thousand wheels and a thousand axles.

Unfortunately, attempts to implement Babbage's ideas in mechanical devices could not succeed. Only with the advent of electronic devices did it become possible to realize the scientist's plans.

Who built the first computer? For a long time, the first computer was considered to be ENIAC (an abbreviation of the English name, "electronic numerical integrator and calculator"), built on more than 18,000 vacuum tubes during the Second World War at the University of Pennsylvania (USA) under the leadership of John W. Mauchly (1907-1980). However, priority for creating the first computer was finally awarded (literally!) in 1973 to the American scientist of Bulgarian origin John V. Atanasoff, born in 1903 in Hamilton, New York.

In the late 1930s, Atanasoff, a professor at Iowa State College, after attempts to create analog devices for complex calculations, began work on a "computer proper" - or, as we would say today, a digital computer based on the binary number system. The machine was built from electromechanical and electronic components. Atanasoff invented, in particular, a regenerative memory based on capacitors. With the help of graduate student Clifford E. Berry, he built prototype machines for solving systems of linear equations. The machine was called the ABC (Atanasoff-Berry Computer).

In 1941, Professor Mauchly, invited from the University of Pennsylvania, studied the Atanasoff-Berry machine and its documentation - 35 pages setting out its principle of operation. This documentation was needed to obtain funds for research work and was intended to serve as the basis for a patent application. But because of the war, the application was never filed. In 1942, Atanasoff was already working in one of the laboratories of the US Navy.

ENIAC was declassified in 1946, and shortly thereafter Mauchly and his assistant J. Presper Eckert (b. 1919) filed a number of patent applications related to ENIAC.

Atanasoff began to defend his priority only when the organization for which he worked entered into litigation with the owners of the Mauchly-Eckert patents. In 1973, a panel of the Minneapolis District Court ruled that Mauchly had "derived" the ideas that formed the basis of his and Eckert's patents from his long-ago visit to Atanasoff. The court recognized not ENIAC but the ABC as the "first electronic computer".

A court ruling cannot be considered a strict criterion in matters of priority, but in this case it was reached with the broad involvement of qualified specialists. Mauchly's fault was "only" that he did not refer to the ABC, the specialized computer on whose basis ENIAC was created.

"The Father of the Computer" J.V. Atanasov was awarded the medal of the Institute of Electrical and Electronics Engineers of the USA in 1983, and in 1985 - the Order of the People's Republic of Bulgaria, 1st degree.

What about Mauchly? The reader should not be left with the impression that he was a "patent pirate". This scientist's contribution to the development of computer technology is undeniable. The ABC remained an experimental device, while ENIAC served faithfully until 1955. Perhaps that is why Atanasoff was drawn into the trial only with difficulty?

Disputes over priority in outstanding discoveries and inventions run through the entire history of science and technology. Recall that both Isaac Newton (1643-1727) and Gottfried Wilhelm Leibniz (1646-1716) laid claim to the invention of mathematical analysis. Not only Benjamin Franklin (1706-1790) but also Prokop Diviš (1698-1765) is considered an inventor of the lightning rod. For decades, controversy has not subsided about the roles of Alexander Stepanovich Popov (1859-1905/06) and Guglielmo Marconi (1874-1937). Paradoxically, this question has occupied subsequent generations (especially in our country) more than it did Popov and Marconi themselves.

Benjamin Franklin really did not like disputes about priority. He said that it is better to spend time creating new experiments than arguing about existing ones.


In his research, the German astronomer Johannes Kepler often encountered extraordinary problems whose solution required a great deal of labor and time. Fortunately, he had a colleague who knew how to help: Wilhelm Schickard, professor of mathematics at Tübingen, invented the first documented gear-based calculating machine. But, alas, Kepler was never able to use the novelty - the model burned down in a fire. Only at the end of the 1950s was a copy of Schickard's machine built from the surviving drawings, proving that it worked.

Filial assistance

To help his father, a tax collector, with his tedious calculations, Blaise Pascal developed the Pascaline - a calculating machine capable of adding and subtracting eight-digit numbers while automatically carrying between decimal places. By the middle of the 17th century, about 50 such machines had been built, one of which became the property of the Swedish Queen Christina.
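A minimal sketch of the digit-wheel idea behind the Pascaline (plain Python, purely illustrative): each wheel holds one decimal digit, and when a wheel passes 9 it resets and nudges the next wheel - the automatic carry the machine mechanized.

```python
def pascaline_add(wheels, amount, place=0):
    """Add `amount` to the wheel at `place` (0 = units) and propagate carries.

    `wheels` is a list of decimal digits, least significant first,
    like the dials of the eight-digit Pascaline.
    """
    wheels[place] += amount
    for i in range(place, len(wheels)):
        if wheels[i] < 10:
            break
        carry, wheels[i] = divmod(wheels[i], 10)
        if i + 1 < len(wheels):
            wheels[i + 1] += carry          # the next wheel is nudged forward
    return wheels

dials = [7, 9, 9, 0, 0, 0, 0, 0]             # 997, least significant digit first
print(pascaline_add(dials, 5))               # [2, 0, 0, 1, 0, 0, 0, 0] -> 1002
```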

Helping humanity

The founder and first president of the Prussian Academy of Sciences in Berlin, Gottfried Wilhelm von Leibniz, not only invented differential and integral calculus but also presented to the scientific world, in 1673, an adding machine whose mechanism, with cylindrical rollers and a carriage, was much more advanced than those of Schickard and Pascal. In this machine Leibniz first used the binary number system he had devised, on which the work of future computers would be based.

Start of mass production

On the basis of Leibniz's adding machine, Charles Xavier Thomas de Colmar designed, in 1818, a calculating machine that could also extract square roots, raise numbers to powers and calculate the values of trigonometric functions. The Colmar adding machine was distinguished by its reliability and its accuracy to the twentieth decimal place. In 1821, the inventor began its mass production. In 1833, the British mathematician Charles Babbage designed the first program-controlled calculating machine, thus becoming the spiritual father of digital computers. However, more than 100 years passed before Konrad Zuse created the first modern computer.

  • 1853: Georg Scheutz built the first calculating machine with a printing device in Stockholm.
  • 1873: In Würzburg, the mechanical engineer Salling designed a calculating machine with a keyboard.
  • 1890: Herman Hollerith received a patent for a calculating machine using punched cards.
  • 1967: The Englishman Norman Kitz created the first desktop electronic calculator, the Anita MK VIII.