
New developments in electronic memories




 

I. The versatile capabilities that have made the computer the great success of our age are due to exploitation of the high speed of electronic computation by means of stored programs. This process requires that intermediate results be stored rapidly and furnished on demand for long computations, for which high speed is worthwhile in the first place.

Storage devices, or memories, must have capacities sufficient not only for intermediate results but also for the input and output data and the programs.

Once prepared, a program can be reused any number of times, which involves remembering.

Computers can "remember" and "recall", and their capacity to remember (that is, to store information) is virtually unlimited. Associated with the capacity of remembering is the capacity of recalling.

In the context of electronics "memory" (or, in British usage, "store") usually refers to a device for storing digital information. Storage ("write") and retrieval ("read") operations are completely under electronic control. The storage of auditory or visual information in analogue form is usually referred to as recording.

There is some overlap between analogue and digital recording. Described here is digital memory.

The most widely used digital memories are read/write memories, the term signifying that they perform read and write operations at an identical or similar rate.

The characteristics of primary importance for memories are storage capacity, cost per bit and reliability. Other important characteristics are speed of operation (defined in terms of access time), cycle time and data-transfer rate. Access time is simply the time it takes to read or write at any storage location.

The demand for fast access and large capacity has grown constantly. Never before has man possessed a tool comparable to the computer. Today there are memories accessible in tens of nanoseconds and memories with more than a billion bits. However, although the computer had long been a reality, the microprocessor appeared only in the 1970s. It is the microprocessor that helps to solve many problems.

Ideal would be a single device in which vast amounts of information could be stored in non-volatile form suitable for archival record-keeping and yet be accessible at electronic speeds when called for. So far there is no way to realize this ideal. Fortunately, the benefits of large capacity and rapid access can be obtained by use of a hierarchy of different types of storage devices of decreasing capacity and increasing speed.

A prime distinction between memories is the manner in which information is stored (written) and accessed (read). Random-access memories involve column and row matrices which allow information to be stored in any cell and accessed in approximately the same time. By contrast, "serial access" means that information is stored in column order, and access time depends on the storage location selected.
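
To make the contrast concrete, here is a minimal sketch in Python with purely illustrative timing constants (none of the figures come from the text): in a random-access store every address costs roughly the same, while in a serial store the cost grows with the distance to the wanted location.

# Toy model of random versus serial access; all timing constants are assumed.
RANDOM_ACCESS_TIME_NS = 100   # any cell in the row/column matrix costs the same
SERIAL_STEP_TIME_NS = 10      # advancing one position in a serial store

def random_access_cost(address):
    """Any address is reached in approximately the same time."""
    return RANDOM_ACCESS_TIME_NS

def serial_access_cost(address, current_position, capacity):
    """The cost depends on how far the wanted location is from the current one."""
    distance = (address - current_position) % capacity
    return distance * SERIAL_STEP_TIME_NS

print(random_access_cost(5), random_access_cost(9000))              # identical
print(serial_access_cost(5, 0, 16384), serial_access_cost(9000, 0, 16384))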

The main hierarchy today comprises, on the one hand, large-capacity magnetic recording devices, which are accessed mechanically and serially (reels of tape, disks, and drums), and on the other hand, fast electronic memories (the core memory and various types of transistor memories).

Random-access memories can complete read and write operations in a specified minimum period known as the cycle time. Serial-access and block-access memories have a variable and relatively large access time, after which the data-transfer rate is constant. The data-transfer rate is the rate at which information is transferred to or from sequential storage positions.
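
As a numerical illustration (the figures below are assumptions, not values from the text), the total time to fetch a block from a serial- or block-access device is the variable access time plus the block size divided by the constant data-transfer rate.

# Assumed figures for illustration only.
access_time_ms = 8.0           # time to reach the start of the block
transfer_rate_kb_per_ms = 1.2  # constant data-transfer rate once transfer begins
block_kb = 4.0

transfer_time_ms = block_kb / transfer_rate_kb_per_ms
total_ms = access_time_ms + transfer_time_ms
print(f"transfer {transfer_time_ms:.2f} ms, total {total_ms:.2f} ms")  # transfer 3.33 ms, total 11.33 ms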

The smallest block of information accessible in a memory system can be a single bit (represented by 0 or 1), a larger group of bits such as a byte or character (usually eight or nine bits), or a word (12 to 64 bits depending on the particular system). Most memories are location-addressable, which means that a desired bit, byte or word has a specified address or physical location to which it is assigned.
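
A small sketch of location addressing, assuming 8-bit bytes packed four to a 32-bit word (the text allows words of 12 to 64 bits, so these sizes are illustrative only): a byte address maps to the word that holds it plus an offset within that word.

BYTES_PER_WORD = 4  # assumed word size for the sketch

def locate(byte_address):
    """Return the word address and the byte offset inside that word."""
    return byte_address // BYTES_PER_WORD, byte_address % BYTES_PER_WORD

print(locate(1023))  # -> (255, 3): byte 1023 lives in word 255 at offset 3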

Of prime interest to the reader will be the development of memories.

One of the first electronic memories was a circulating delay line, a signal transmission device in which the output, properly amplified and shaped, was fed back into the input. Although it was economical, it had the inherent drawback of serial access: the greater the capacity, the longer the average access time. What was really needed was selective access to any stored data in a time that was both as short as possible and independent of the data address or any previous access. This is known as random access, so named to emphasize the total freedom of accessing and therefore of branching (following one or another part of a program). The first random-access memories (RAMs) were electrostatic storage tubes.
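
The capacity/access-time trade-off of a circulating delay line can be put in one line: on average, half of the stored bits must stream past the output before the wanted bit appears, so doubling the capacity doubles the average access time. A sketch with assumed figures:

def average_access_time_us(capacity_bits, bit_period_us):
    """Average wait for a wanted bit in a circulating (serial) store."""
    return capacity_bits / 2 * bit_period_us

print(average_access_time_us(1000, 1.0))  # 500.0 microseconds
print(average_access_time_us(2000, 1.0))  # 1000.0 microseconds: twice the capacity, twice the wait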

In the early 1950's the core memory replaced these early devices, providing a solution to the need for random access that truly fired the emerging computer industry.

The core memory became the main internal computer memory and was used universally until challenged recently by semiconductor memories. Typical are memories with 1 million words of 30 to 60 bits each, randomly accessible in 1 microsecond. The core memory has also been extended to very large capacities, of the order of 100 million words.

In the 1950's and 1960's electronic memories were arrays of cores, or rings, of ferrite material a millimeter or less in diameter, strung by the thousands on a grid of wires. Ferrite-core memories have now been largely succeeded in new designs by semiconductor memories that provide faster data access, smaller physical size and lower power consumption, all at significantly lower cost.

In the early 1970's semiconductor memory cells that served the same purpose as cores were developed, and integrated memory circuits began to be installed as the main computer memory.

In the 1980's new memory technologies involving magnetic bubbles, superconducting tunnel-junction devices and devices accessed by laser beams or electron beams come into play.

Semiconductor memories are extremely versatile and highly compatible with other electronic devices in both small and large systems and have much potential for further improvement in performance and cost. They are expected to dominate the electronic-memory market for at least another decade.

The most widely used form of electronic memory is the random-access read/write memory (RAM) fabricated in the form of a single large-scale-integrated memory chip capable of storing as many as 65,000 bits in an area less than half a centimeter on a side. A number of individual circuits, each storing one binary bit, are organized in a rectangular array. Access to the location of a single bit is provided by a binary-coded address presented as an input to address decoders that select one row and one column for a read or write operation. Only the storage element at the intersection of the selected row and column is the target for the reading or writing of one bit of information. A read/write control signal determines which of the two operations is to be performed. The memory array can be designed with a single input-output line for the transfer of data or with several parallel lines for the simultaneous input or output of four, eight or more bits.
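
A minimal sketch of the row-and-column selection described above, assuming a chip organized as a 256-by-256 array of one-bit cells (65,536 bits); the organization and the address width are assumptions for illustration, not specifications from the text.

ROWS = COLS = 256                      # 256 * 256 = 65,536 one-bit storage elements

def decode(address):
    """Split a 16-bit binary-coded address into a row select and a column select."""
    assert 0 <= address < ROWS * COLS
    return address >> 8, address & 0xFF   # upper bits -> row decoder, lower bits -> column decoder

memory = [[0] * COLS for _ in range(ROWS)]

def write_bit(address, bit):
    row, col = decode(address)
    memory[row][col] = bit             # only the cell at the selected intersection is targeted

def read_bit(address):
    row, col = decode(address)
    return memory[row][col]

write_bit(0x1234, 1)
print(read_bit(0x1234))                # -> 1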

II. Different categories of semiconductor memories and specific data-storage applications where they find primary use provide system engineers with a wide range of options. In general, metal-oxide semiconductor (MOS), erasable-programmable read-only memories (EPROMs) and dynamic random-access memories (RAMs) are extensively used in micro- and minicomputer applications. The slow electrically-alterable read-only memories (EAROMs) are at present most suitable for peripherals. In addition, dense dynamic MOS RAMs are used in large volume in small and large mainframe computers, and so on and so forth. Many laboratories are looking for new options.

However, we are still far from the ideal shoe-box device with 10¹² bits accessible in nanoseconds, and still farther from the capacities of 10¹⁵ bits needed for many already well-defined applications. Although much can still be expected from VLSI and magnetic techniques, these great goals may require radically new approaches.

Very high speed and very low power memories, rather than large capacity, may well be the benefits of some of these approaches.

Thus computers today use a hierarchy of large-capacity, relatively slow, mechanically accessed memories in conjunction with fast electronically accessed memories of relatively small capacity. It would be highly desirable to fill the gap with some device of sufficient capacity and speed.

Candidates for gap-filling memories include metal-oxide semiconductor (MOS) random-access memories (RAMs) made by large-scale integration (LSI); magnetic bubble devices based on cylindrical domains of magnetization; electron-beam-addressed memories; optical memories based on lasers, holography and electro-optical effects; and charge-coupled devices (CCDs).

One of the latest designs of a CCD serial-access memory has storage for 65,536 bits on a chip measuring about 3.5 by 5 millimeters.
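
From the figures quoted above, a rough back-of-the-envelope density follows directly:

bits = 65_536
area_mm2 = 3.5 * 5.0               # chip dimensions as given in the text
print(round(bits / area_mm2))      # roughly 3,745 bits per square millimeter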

The vast number of different types of semiconductor memories available to the system engineer is increasing steadily.

Radically new technologies, still at an early laboratory stage, are aimed at a more ideal solution than today's hierarchy.

Many laboratories are looking into basic principles. Memories based on the Josephson effect may be able to operate in picoseconds on small power. The boundaries within the walls of magnetic domains, exploited in the bubble lattice devices, are also used in a so-called cross-tie memory that may provide non-volatile memories on LSI chips.

One can foresee the development of cryoelectronic memories with extremely high component densities operating at speeds 10 to 100 times faster than today's fastest electronic memories.

Researchers now are looking forward to light particles (photons), which will permit performance to be made a thousand times faster. This would mean that in the future we can expect the emergence of photon computers and that computations will be done by means of light.

Any radical improvement in memory technology will ultimately greatly affect our way of life, as previous innovations have shown.

 
