
History of Computer Generations

Charles Babbage is often called the "father of the computer", as he was the first to conceive the basic design of a programmable computing machine.

Driven by the need for a device that could carry out calculations with absolute precision and eliminate miscalculations in navigation and engineering, he offered the world the design for his first computer, the Difference Engine.

The gradual development of computers over the years

Computers have undergone continual technical evolution over the course of their history. With advances in scientific knowledge and engineering, computers are growing "smarter" day by day.

Scientific progress has repeatedly spearheaded the improvements that allow computers to advance.

Future computers may even make their "own decisions" with the help of better artificial intelligence.

 

First Generation: ENIAC (1940-1956)

The ENIAC, or Electronic Numerical Integrator and Computer, and its first-generation contemporaries were fragile machines.

These first-generation machines used vacuum tubes, generated much heat, and consumed vast amounts of electricity.

The massive heat these machines generated would attract insects, or "bugs", which could cause errors in the calculations.

This is said to be how the word "bug" entered computing to describe a malfunctioning machine.

First-generation machines were used for tasks such as decrypting messages and calculating ballistic trajectories.

 

Second Generation Machines (1957-1963)

Second-generation computers were built from transistors, much of whose manufacturing was concentrated in the Santa Clara Valley in the USA.

The use of silicon in these transistors gave the region its famous name, "Silicon Valley".

These second-generation computers were comparatively accurate and carried out calculations at a rate of about 5,000 additions per second.

However, increasing the performance capacity of these machines required the addition of more transistors.

Thus, these machines kept demanding ever more additional parts.

 

Third Generation Machines (1964-1971)

Combining multiple transistors on a single chip brought forth the Integrated Circuit (IC), the backbone of third-generation computers.

According to Moore's Law, the number of transistors that can be fitted onto an IC doubles roughly every two years, cutting the cost per transistor roughly in half over the same period.
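
As a rough illustration of that doubling, here is a minimal Python sketch. It assumes a strict two-year doubling period, which is an idealization (the real cadence has varied), and starts from roughly 2,300 transistors, the widely cited count for Intel's 4004 chip of 1971; the function name is chosen purely for illustration.

# Minimal sketch of Moore's Law as exponential doubling.
# Assumes an idealized two-year doubling period; the starting count
# of 2,300 is the widely cited figure for the Intel 4004 (1971).
def transistor_count(year, base_year=1971, base_count=2300):
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001):
    print(year, round(transistor_count(year)))
# Prints: 1971 -> 2300, 1981 -> 73600, 1991 -> 2355200, 2001 -> 75366400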

Packing the circuits of an IC so closely together shortened the distance electrical signals had to travel, giving these machines swift calculation capabilities.

However, air conditioning remained mandatory for their proper functioning.

 

Fourth Generation Devices (1972-Present)

The invention of stamp-sized chips called microprocessors paved the way for fourth-generation computers.

A single tiny microprocessor carries all the core processing components needed to run the machine.

One of the oldest firms still involved in the development of integrated electronics is Intel.

These machines were known as microcomputers because of their greatly reduced size compared with earlier generations.

One of the most renowned fourth-generation personal computers, in the public spotlight at the time, was the Apple II from Apple Computer.

 

Fifth Generation Computers (Future Computers)

Fifth-generation computers may carry decision-making capabilities.

Through the use of advanced artificial intelligence and improved neural networks, they are expected to "learn" and "adapt".

One of the smallest such computers, about the size of a grain of rice, can measure temperature within clusters of cells.

These future computers have countless real-life applications, though they might prove difficult to control given their ability to interpret information and make decisions on their own.
 
