Friday, December 16, 2022

HISTORY OF COMPUTERS
Precomputers and Early Computers (before approximately 1946) 

The abacus is considered by many to be the earliest recorded calculating device; it was used primarily as an aid for basic arithmetic calculations. Other early computing devices include the slide rule, the mechanical calculator, and Dr. Herman Hollerith’s Punch Card Tabulating Machine and Sorter. This latter device was the first electromechanical machine that could read punch cards—special cards with holes punched in them to represent data. Hollerith’s machine was used to process the 1890 U.S. Census data and it was able to complete the task in two and one-half years, instead of the decade it usually took to process the data manually. 

First-Generation Computers (approximately 1946–1957) 

The first computers were enormous, often taking up entire rooms. They were powered by thousands of vacuum tubes—glass tubes that look similar to large light bulbs—which needed constant replacement, required a great deal of electricity, and generated a lot of heat. Usually paper punch cards and paper tape were used for input, and output was printed on paper. Two of the most significant examples of first-generation computers were ENIAC and UNIVAC. ENIAC could compute the optimal settings for a single weapon under a single set of conditions in less than two minutes, a task that took a person about 40 hours of manual calculation. UNIVAC, which was used to analyze votes in the 1952 U.S. presidential election, became the first computer to be mass produced for general commercial use.

Second-Generation Computers (approximately 1958–1963) 

The second generation of computers began when the transistor—a small device made of semiconductor material that acts like a switch to open or close electronic circuits—started to replace the vacuum tube. Transistors allowed second-generation computers to be smaller, less expensive, more powerful, more energy-efficient, and more reliable than first-generation computers. Typically, programs and data were input on punch cards and magnetic tape, output was on punch cards and paper printouts, and magnetic tape was used for storage. Hard drives and programming languages (such as FORTRAN and COBOL) were developed and implemented during this generation.

Third-Generation Computers (approximately 1964–1970)

The replacement of the transistor with integrated circuits (ICs) marked the beginning of the third generation of computers. Integrated circuits incorporate many transistors and electronic circuits on a single tiny silicon chip, allowing third-generation computers to be even smaller and more reliable than computers in the earlier computer generations. Instead of punch cards and paper printouts, keyboards and monitors were introduced for input and output; hard drives were typically used for storage. An example of a widely used third-generation computer was the IBM System/360.

Fourth-Generation Computers (approximately 1971–present)

A technological breakthrough in the early 1970s made it possible to place an increasing number of transistors on a single chip. This led to the invention of the microprocessor in 1971, which ushered in the fourth generation of computers. In essence, a microprocessor contains the core processing capabilities of an entire computer on one single chip. The original IBM PC and Apple Macintosh computers, and most of today’s traditional computers, fall into this category. Fourth-generation computers typically use a keyboard and mouse for input, a monitor and printer for output, and hard drives, flash memory media, and optical discs for storage. This generation also witnessed the development of computer networks, wireless technologies, and the Internet.

Fifth-Generation Computers (now and the future)

Fifth-generation computers are most commonly defined as those that are based on artificial intelligence, allowing them to think, reason, and learn. Some aspects of fifth-generation computers—such as voice and touch input and speech recognition—are in use today. In the future, fifth-generation computers are expected to be constructed differently than they are today, such as in the form of optical computers that process data using light instead of electrons, tiny computers that utilize nanotechnology, or as entire general-purpose computers built into desks, home appliances, and other everyday devices.

