FIVE GENERATIONS OF COMPUTERS
Since the development of the Harvard Mark I, digital computing machines have progressed at a rapid pace. Computers are often divided into five generations according to a series of advances in hardware, mainly in logic circuitry. Each generation comprises a group of machines that share a common technology.
The First Generation (the 1940s – much of the 1950s)
ENIAC, along with other electronic computers built in the 1940s, marks the beginning of the so-called first-generation computers. These computers cost millions of dollars and filled entire rooms. They used thousands of vacuum tubes for calculation, control, and sometimes for memory as well. Vacuum tubes were bulky, unreliable, energy-consuming devices that generated large amounts of heat. The vacuum tubes of one machine consumed enough electricity to power a small town. As long as computers were tied down to vacuum tube technology, they could only be huge, heavy and expensive. Though their operations were very fast in comparison with manual calculations, they were slow by today's standards.
The Second Generation (the late 1950s – the early 1960s)
The invention of the transistor in 1947 resulted in a revolution in computer development. Germanium (and later silicon) transistors were smaller, more reliable and more efficient than the vacuum tubes that had been used in electronics up to that time. These semiconductor devices generated and controlled the electric signals that operated the computer. By the late 1950s and early 1960s, vacuum tubes were no longer used in computers.
Transistors led to the creation of smaller, more powerful and faster computers known as minicomputers. They were operated by specialized technicians, who were often dressed in white lab coats and usually referred to as the "computer priesthood1". The machines were expensive and difficult to use, and few people, not even their programmers, came into direct contact with them. The typical interaction was as follows: a programmer coded instructions and data on preformatted paper, a keypunch operator transferred the data onto punch cards, a computer operator fed the cards into a card reader, and, finally, the computer executed the instructions or stored the cards' information for later processing.
The so-called second-generation computers, which used large numbers of transistors, were able to reduce computational time from milliseconds to microseconds, or millionths of a second. At that time, there were two types of computers. There were room-sized mainframes, costing hundreds of thousands of dollars, built one at a time by companies such as International Business Machines Corporation and Control Data Corporation. There were also smaller (refrigerator-sized), cheaper (about $100,000), mass-produced minicomputers built by companies such as Digital Equipment Corporation and Hewlett-Packard Company for scientific research laboratories, large businesses and institutions of higher education.
Most people, however, had no direct contact with either type of computer, and the machines were popularly viewed as giant brains that threatened to eliminate jobs through automation. The idea that anyone would have his or her own desktop computer was generally considered far-fetched2.
The Third Generation (much of the 1960s – the 1970s)
The step forward in computer miniaturization came in 1958, when Jack Kilby, an American engineer, designed the first integrated circuit (IC). His prototype consisted of a germanium wafer that included hundreds of tiny transistors, diodes, resistors, and capacitors – the main components of electronic circuitry. The microchip itself was small enough to fit on the end of your finger (See Figure 1).
The invention of the IC marks the beginning of the third generation of computers. With integrated circuits, computers could be made smaller, less expensive and more reliable. They could perform many data processing operations in nanoseconds, or billionths of a second.
Figure 1. The Integrated Circuit
The next jump in the development of computer technology came with the introduction of large-scale ICs. Using less expensive silicon chips, engineers managed to place more and more electronic components on each chip. Whereas the older ICs contained hundreds of transistors, the new ones contained thousands or tens of thousands (modern microprocessors can contain more than 40 million transistors).
It was the large-scale ICs that made it possible to produce the microprocessor and the microcomputer. The price of computers then fell, and more and more small businesses and individuals could afford to buy them. Microcomputers – systems no larger than portable television sets, yet with large computing power – began to be called personal computers (PCs). All these developments have resulted in a microprocessor revolution, which began in the mid-1970s and for which there is no end in sight.
The Fourth Generation (1980s and beyond)
By the beginning of the 1980s, integrated circuitry had advanced to very large-scale integration (VLSI). This technology greatly increased the circuit density of microprocessor, memory and support circuitry – i.e. the circuits that serve to interface microprocessors with input-output devices. By the 1990s, some VLSI circuits contained more than 3 million transistors on a silicon chip less than 2 square cm in area.
Digital computers using VLSI technologies are frequently referred to as fourth-generation systems. These computers are hundreds of times smaller than those of the first generation, and a single chip is far more powerful than the whole of ENIAC. They are characterized by low cost, ease of use and large capabilities.
This fourth generation is the first in which computers are widely used in business, science, industry, medicine and education, as well as in the home. In addition to the familiar applications in digital watches, pocket calculators and personal computers, there are microprocessors in practically every machine in the home or office – from microwave ovens and cellular telephones to spacecraft and Global Positioning System3 (GPS) devices.
The Fifth Generation
The computer revolution is very dynamic. We are on the threshold4 of the fifth generation of computers. The term was devised by the Japanese to describe the powerful, intelligent computers they wanted to build by the mid-1990s. Since then it has become an umbrella term5, encompassing many research fields in the computer industry. Today researchers in the USA, Western Europe and Japan work on the problems of artificial intelligence, the application of natural languages for inputting data, ultra-large-scale integration (ULSI) technologies, etc.
Notes: 1priesthood – the elite caste; (slang) experts, aces, gurus;
2far-fetched – implausible, unrealistic;
3Global Positioning System – a global system for navigation and position determination;
4threshold – verge, eve;
5umbrella term – an all-encompassing term.
Ex. 22. Search the text for the English equivalents to the following phrases:
1. a series of advances in computer hardware;
2. so-called;
3. marks the beginning of;
4. generating large amounts of heat;
5. consumed enough electricity to power a small town;
6. in comparison with;
7. to generate and control electric signals;
8. vacuum tubes were no longer used in computers;
9. the operator responsible for transferring data onto punch cards or punched tape (a keypunch operator);
10. to feed punch cards into a card reader;
11. to store information for later processing;
12. mass-produced minicomputers;
13. a step forward;
14. to fit on the end of a finger;
15. one billionth of a second;
16. the next jump in the development of computer technology;
17. it was the large-scale integrated circuits that made it possible to create;
18. a revolution for which there is no end in sight;
19. low cost, ease of use, large capabilities;
20. the term was devised;
21. the application of natural languages for inputting data.