
History of Computer and Its Generations (PDF)

File Name: history of computer and its generation.zip
Size: 1924Kb
Published: 24.05.2021

The first counting devices were used by primitive peoples.

Introduction: A computer is an electronic device that manipulates information or data. It has the ability to store, retrieve, and process data.

The history of computing hardware covers developments from early devices that aided simple calculation to the modern computer. Before the 20th century, most calculations were done by humans. Early mechanical tools that helped humans with digital calculations, like the abacus, were referred to as calculating machines or calculators (and by various proprietary names). The machine operator was called the computer.

Digital computer

A fuller account would also include discussion of mechanical, analog and digital computing architectures. As late as the 1960s, mechanical devices, such as the Marchant calculator, still found widespread application in science and engineering.

During the early days of electronic computing devices, there was much discussion about the relative merits of analog vs. digital computers. In fact, as late as the 1960s, analog computers were routinely used to solve systems of finite difference equations arising in oil reservoir modeling.

In the end, digital computing devices proved to have the power, economics and scalability necessary to deal with large scale computations. Digital computers now dominate the computing world in all areas ranging from the hand calculator to the supercomputer and are pervasive throughout society.

Therefore, this brief sketch of the development of scientific computing is limited to the area of digital, electronic computers. The evolution of digital computing is often divided into generations. Each generation is characterized by dramatic improvements over the previous generation in the technology used to build computers, the internal organization of computer systems, and programming languages.

Although not usually associated with computer generations, there has been a steady improvement in algorithms, including algorithms used in computational science.

The following history has been organized using these widely recognized generations as mileposts.

The Mechanical Era

The idea of using machines to solve mathematical problems can be traced at least as far back as the early 17th century. Mathematicians who designed and implemented calculators capable of addition, subtraction, multiplication, and division included Wilhelm Schickard, Blaise Pascal, and Gottfried Leibniz.

Charles Babbage began his Difference Engine in 1823 but never completed it. A more ambitious machine, and the first multi-purpose, i.e. programmable, computing device, was his Analytical Engine, designed in 1842; it too was only partially completed. Babbage was truly a man ahead of his time: many historians think the major reason he was unable to complete these projects was that the technology of the day was not reliable enough. In spite of never building a complete working machine, Babbage and his colleagues, most notably Ada, Countess of Lovelace, recognized several important programming techniques, including conditional branches, iterative loops and index variables.

A machine inspired by Babbage's design was arguably the first to be used in computational science. George Scheutz read of the difference engine in 1833, and along with his son Edvard Scheutz began work on a smaller version.

By 1853 they had constructed a machine that could process 15-digit numbers and calculate fourth-order differences. Their machine won a gold medal at the Exhibition of Paris in 1855, and later they sold it to the Dudley Observatory in Albany, New York, which used it to calculate the orbit of Mars. One of the first commercial uses of mechanical computers was by the US Census Bureau, which used punch-card equipment designed by Herman Hollerith to tabulate data for the 1890 census. In 1911 Hollerith's company merged with a competitor to found the corporation that in 1924 became International Business Machines.
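The method of differences that these engines mechanized is simple enough to sketch. Here is a minimal Python illustration with an arbitrary cubic polynomial standing in for the engines' tables (the polynomial and table length are invented for the example; the Scheutz machine worked to fourth-order differences). Once the initial differences are set, every further value follows by additions alone, the only operation a mechanical engine needs to repeat:

```python
# Tabulate p(x) = 2x^3 - 3x^2 + x + 5 by additions alone, the way a
# difference engine does: a degree-3 polynomial has constant third
# differences, so after priming the table no multiplication is needed.

def p(x):
    return 2 * x**3 - 3 * x**2 + x + 5

# Prime the engine: the initial forward differences at x = 0.
d = [p(0),
     p(1) - p(0),
     p(2) - 2 * p(1) + p(0),
     p(3) - 3 * p(2) + 3 * p(1) - p(0)]

table = []
for _ in range(10):
    table.append(d[0])            # current value of p
    for i in range(len(d) - 1):   # one addition per difference order
        d[i] += d[i + 1]

print(table == [p(x) for x in range(10)])  # True
```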

First Generation Electronic Computers

Three machines have been promoted at various times as the first electronic computers. These machines used electronic switches, in the form of vacuum tubes, instead of electromechanical relays.

In principle the electronic switches would be more reliable, since they would have no moving parts to wear out, but the technology was still new at the time and the tubes were comparable to relays in reliability. The earliest attempt to build an electronic computer was by John V. Atanasoff, a professor of physics and mathematics at Iowa State, in 1937. Atanasoff set out to build a machine that would help his graduate students solve systems of partial differential equations.

By 1941 he and graduate student Clifford Berry had succeeded in building a machine that could solve 29 simultaneous equations with 29 unknowns.
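For a sense of what that task involves, here is a minimal sketch of Gaussian elimination with partial pivoting, the standard method for dense linear systems of this kind. This is illustrative Python, not the ABC's actual procedure, and the random test system is invented for the example:

```python
# Gaussian elimination with partial pivoting on a 29 x 29 system,
# the problem size the Atanasoff-Berry Computer was built for.
# Illustrative sketch only, not the ABC's actual procedure.
import random

def solve(a, b):
    """Solve a x = b for a dense n x n system (a and b are modified)."""
    n = len(b)
    for k in range(n):
        # Partial pivoting: bring the largest entry in column k up.
        piv = max(range(k, n), key=lambda r: abs(a[r][k]))
        a[k], a[piv] = a[piv], a[k]
        b[k], b[piv] = b[piv], b[k]
        # Eliminate column k from all rows below row k.
        for r in range(k + 1, n):
            m = a[r][k] / a[k][k]
            for c in range(k, n):
                a[r][c] -= m * a[k][c]
            b[r] -= m * b[k]
    x = [0.0] * n
    for k in reversed(range(n)):  # back substitution
        s = sum(a[k][c] * x[c] for c in range(k + 1, n))
        x[k] = (b[k] - s) / a[k][k]
    return x

n = 29
a = [[random.uniform(-1.0, 1.0) for _ in range(n)] for _ in range(n)]
x_true = [float(i) for i in range(n)]
b = [sum(row[c] * x_true[c] for c in range(n)) for row in a]
x = solve(a, b)
print(max(abs(u - v) for u, v in zip(x, x_true)))  # small, e.g. ~1e-12
```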

The Atanasoff-Berry machine, however, was not programmable, and was more of an electronic calculator. A second early electronic machine was Colossus, built for the British military in 1943; Alan Turing contributed to the theoretical work behind the British code-breaking effort, though the engineering design of Colossus was led by Tommy Flowers. Turing's main contribution to the field of computer science was the idea of the Turing machine, a mathematical formalism widely used in the study of computable functions.

The existence of Colossus was kept secret until long after the war ended, and the credit due to Turing, Flowers, and their colleagues for building one of the first working electronic computers was slow in coming. The first general-purpose programmable electronic computer was the Electronic Numerical Integrator and Computer (ENIAC), built by J. Presper Eckert and John W. Mauchly at the University of Pennsylvania. The machine wasn't completed until 1945, but then it was used extensively for calculations during the design of the hydrogen bomb.

By the time it was decommissioned in 1955 it had been used for research on the design of wind tunnels, random number generators, and weather prediction. Eckert, Mauchly, and John von Neumann, a consultant to the ENIAC project, went on to design a successor machine, the EDVAC, whose central contribution was the stored-program concept. There is some controversy over who deserves the credit for this idea, but none over how important the idea was to the future of general-purpose computers. ENIAC was controlled by a set of external switches and dials; changing the program required physically altering the settings on these controls.

These controls also limited the speed of the internal electronic operations. Through the use of a memory that was large enough to hold both instructions and data, and using the program stored in memory to control the order of arithmetic operations, EDVAC was able to run orders of magnitude faster than ENIAC.

By storing instructions in the same medium as data, designers could concentrate on improving the internal structure of the machine without worrying about matching it to the speed of an external control. Regardless of who deserves the credit for the stored program idea, the EDVAC project is significant as an example of the power of interdisciplinary projects that characterize modern computational science. By recognizing that functions, in the form of a sequence of instructions for a computer, can be encoded as numbers, the EDVAC group knew the instructions could be stored in the computer's memory along with numerical data.
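The key sentence above, that instructions can be encoded as numbers and stored in memory along with data, can be made concrete in a few lines. Below is a toy stored-program machine with an invented four-instruction set (not EDVAC's actual instruction format): a single flat memory holds the program and its operands alike.

```python
# A toy stored-program machine: one flat memory holds both code and
# data, and instructions are just numbers. Invented instruction set,
# two cells per instruction: (opcode, address).
LOAD, ADD, STORE, HALT = 0, 1, 2, 3

def run(mem):
    acc, pc = 0, 0
    while True:
        op, addr = mem[pc], mem[pc + 1]   # fetch: the program is data
        pc += 2
        if op == LOAD:
            acc = mem[addr]
        elif op == ADD:
            acc += mem[addr]
        elif op == STORE:
            mem[addr] = acc
        elif op == HALT:
            return mem

# Cells 0-7 hold the program, cells 8-10 hold its data.
mem = [LOAD, 8,  ADD, 9,  STORE, 10,  HALT, 0,
       2, 3, 0]
print(run(mem)[10])  # 2 + 3 -> 5
```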

The notion of using numbers to represent functions was a key step used by Gödel in his incompleteness theorem in 1931, work with which von Neumann, as a logician, was quite familiar.

Von Neumann's background in logic, combined with Eckert and Mauchly's electrical engineering skills, formed a very powerful interdisciplinary team. Software technology during this period was very primitive. The first programs were written out in machine code, i.e. programmers directly wrote down the numbers that corresponded to the instructions they wanted to store in memory. By the 1950s programmers were using a symbolic notation, known as assembly language, then hand-translating the symbolic notation into machine code. Later, programs known as assemblers performed the translation task.
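The translation an assembler performs is, at its core, a table lookup from mnemonics to numbers. Here is a sketch using the same invented opcodes as the toy machine above (hypothetical mnemonics, not any historical assembly language):

```python
# A toy assembler: translating symbolic notation into machine code is
# a table lookup. Mnemonics and opcode numbers are invented, matching
# the toy machine sketched above.
OPCODES = {"LOAD": 0, "ADD": 1, "STORE": 2, "HALT": 3}

def assemble(source):
    code = []
    for line in source.strip().splitlines():
        mnemonic, *operand = line.split()
        code.append(OPCODES[mnemonic])
        code.append(int(operand[0]) if operand else 0)
    return code

program = assemble("""
    LOAD 8
    ADD 9
    STORE 10
    HALT
""")
print(program)  # [0, 8, 1, 9, 2, 10, 3, 0]
```

The output is exactly the numeric program the toy machine runs, which is the sense in which assembly language is only a notation for machine code.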

As primitive as they were, these first electronic machines were quite useful in applied science and engineering. Atanasoff estimated that it would take eight hours to solve a set of equations with eight unknowns using a Marchant calculator, and 381 hours to solve 29 equations for 29 unknowns.

The Atanasoff-Berry computer was able to complete the task in under an hour. The first problem run on the ENIAC, a numerical simulation used in the design of the hydrogen bomb, required 20 seconds, as opposed to forty hours using mechanical calculators.

Second Generation

The second generation saw several important developments at all levels of computer system design, from the technology used to build the basic circuits to the programming languages used to write scientific applications.

Electronic switches in this era were based on discrete diode and transistor technology with a switching time of approximately 0.3 microseconds. Important innovations in computer architecture included index registers for controlling loops and floating-point units for calculations based on real numbers.

Prior to this, accessing successive elements in an array was quite tedious and often involved writing self-modifying code (programs that modified themselves as they ran). Viewed at the time as a powerful application of the principle that programs and data were fundamentally the same, this practice is now frowned upon as extremely hard to debug, and it is impossible in most high-level languages.
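A minimal sketch of the self-modification trick, with an invented encoding: the address field of a LOAD instruction serves as the loop variable, so the program rewrites its own code on every pass. This is exactly what index registers made unnecessary.

```python
# Walking an array without an index register: the address field of the
# LOAD instruction itself is the loop variable, so the program rewrites
# its own code on every pass. Invented encoding; the point is only that
# instructions are mutable data.
memory = [0] * 16
memory[8:12] = [10, 20, 30, 40]     # the array lives at addresses 8-11

load = ["LOAD", 8]                  # one instruction: opcode, address
total = 0
while load[1] < 12:
    total += memory[load[1]]        # "execute" the LOAD at its operand
    load[1] += 1                    # self-modification: patch the operand
print(total)                        # 100

# With an index register (or a high-level language) the instruction
# stays fixed and only the index changes:
print(sum(memory[8:12]))            # 100
```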

Floating-point operations were performed by libraries of software routines in early computers, but were done in hardware in second-generation machines. Important commercial machines of this era include the IBM 704 and its successors, the 709 and 7094. The second generation also saw the first two supercomputers designed specifically for numeric processing in scientific applications.

Two machines of the 1950s deserve this title: the Livermore Atomic Research Computer (LARC) and the IBM 7030 (also known as Stretch).

Third Generation

The third generation brought huge gains in computational power. Innovations in this era include the use of integrated circuits, or ICs (semiconductor devices with several transistors built into one physical component); semiconductor memories, which began to be used instead of magnetic cores; microprogramming as a technique for efficiently designing complex processors; the coming of age of pipelining and other forms of parallel processing; and the introduction of operating systems and time-sharing.

Multilayered printed circuits were developed, and core memory was replaced by faster, solid-state memories. In 1964, Seymour Cray developed the CDC 6600, which was the first architecture to use functional parallelism. By using 10 separate functional units that could operate simultaneously and 32 independent memory banks, the CDC 6600 was able to attain a computation rate of 1 million floating-point operations per second (1 Mflops).

The CDC 7600, with its pipelined functional units, is considered to be the first vector processor and was capable of executing at 10 Mflops. The IBM 360/91, released during the same period, employed instruction look-ahead, separate floating-point and integer functional units, and a pipelined instruction stream.

Fourth Generation

The next generation of computer systems saw the use of large-scale integration (LSI, about 1,000 devices per chip) and very-large-scale integration (VLSI, about 100,000 devices per chip) in the construction of computing elements.

Gate delays dropped to about 1ns per gate. Semiconductor memories replaced core memories as the main memory in most systems; until this time the use of semiconductor memory in most systems was limited to registers and cache. Computers with large main memory, such as the CRAY 2, began to emerge. A variety of parallel architectures began to appear; however, during this period the parallel computing efforts were of a mostly experimental nature and most computational science was carried out on vector processors.

Microcomputers and workstations were introduced and saw wide use as alternatives to time-shared mainframe computers. Developments in software include very-high-level languages such as FP (functional programming) and Prolog (programming in logic).

These languages are not yet in wide use, but are very promising as notations for programs that will run on massively parallel computers (systems with over 1,000 processors). Compilers for established languages started to use sophisticated optimization techniques to improve code, and compilers for vector processors were able to vectorize simple loops (turn loops into single instructions that would initiate an operation over an entire vector).
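What it means to vectorize a simple loop is easiest to see side by side. Here is a sketch using NumPy's array arithmetic to stand in for vector instructions; it is illustrative only, since a vectorizing compiler performed this transformation at the machine-code level.

```python
# A scalar loop and its vectorized form. On a vector processor the
# compiler turned such a loop into single instructions over whole
# vectors; NumPy's array arithmetic stands in for that effect here.
import numpy as np

a = np.linspace(0.0, 1.0, 100_000)
b = np.linspace(1.0, 2.0, 100_000)

# Scalar version: one operation initiated per element.
c = np.empty_like(a)
for i in range(len(a)):
    c[i] = 2.0 * a[i] + b[i]

# Vectorized version: one expression over the entire vectors.
c_vec = 2.0 * a + b

print(np.allclose(c, c_vec))  # True
```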

Two important events marked the early part of the fourth generation: the development of the C programming language and the UNIX operating system, both at Bell Labs. This C-based UNIX was soon ported to many different computers, relieving users from having to learn a new operating system each time they changed computer hardware. An important event in the development of computational science was the publication of the Lax report in 1982.

The Lax Report stated that aggressive and focused foreign initiatives in high performance computing, especially in Japan, were in sharp contrast to the absence of coordinated national attention in the United States. The report noted that university researchers had inadequate access to high performance computers. One of the first and most visible of the responses to the Lax report was the establishment of the NSF supercomputing centers.

Phase I of this NSF program was designed to encourage the use of high-performance computing at American universities by making cycles and training on three (and later six) existing supercomputers immediately available.

In addition, the centers have provided many valuable training programs and have developed several software packages that are available free of charge.

Fifth Generation

The development of the next generation of computer systems is characterized mainly by the acceptance of parallel processing. Until this time parallelism was limited to pipelining and vector processing, or at most to a few processors sharing jobs.

The fifth generation saw the introduction of machines with hundreds of processors that could all be working on different parts of a single program. The scale of integration in semiconductors continued at an incredible pace, so that by 1990 it was possible to build chips with a million components, and semiconductor memories became standard on all computers. Other new developments were the widespread use of computer networks and the increasing use of single-user workstations.


A transistor computer, now often called a second-generation computer,[1] is a computer that uses discrete transistors instead of vacuum tubes. The first generation of electronic computers used vacuum tubes, which generated large amounts of heat and were bulky and unreliable. A second generation of computers, through the late 1950s and 1960s, featured circuit boards filled with individual transistors and magnetic-core memory. These machines remained the mainstream design into the late 1960s, when integrated circuits started appearing and led to the third-generation computer. The University of Manchester's experimental Transistor Computer was first operational in November 1953, and it is widely believed to be the first transistor computer to come into operation anywhere in the world.

History of Computer Generation PDF

A History of Computer Performance

Computer performance has historically been defined by how fast a computer system can execute a single-threaded program to perform useful work. Why care about computer performance?

Digital computer: any of a class of devices capable of solving problems by processing information in discrete form. It operates on data, including magnitudes, letters, and symbols, that are expressed in binary code, i.e., using only the two digits 0 and 1. By counting, comparing, and manipulating these digits or their combinations according to a set of instructions held in its memory, a digital computer can perform such tasks as controlling industrial processes and regulating the operations of machines, analyzing and organizing vast amounts of business data, and simulating the behaviour of dynamic systems (e.g., global weather patterns) in scientific research.
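A quick concrete view of that binary representation, using ASCII for the character codes (one common convention):

```python
# Magnitudes, letters, and symbols all reduce to the two digits 0 and 1.
n = 53
print(f"{n:08b}")           # the magnitude 53 -> 00110101
print(f"{ord('A'):08b}")    # the letter 'A' (ASCII 65) -> 01000001
print(f"{ord('+'):08b}")    # the symbol '+' (ASCII 43) -> 00101011
print(int("00110101", 2))   # and back again -> 53
```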

Computer Evolution: Computer History and Development

Nothing epitomizes modern life better than the computer.
