History of Computers

Pre-twentieth century

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers.[3][4] The use of counting rods is one example.

The Chinese Suanpan (算盘) (the number represented on this abacus is 6,302,715,408)

The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.

The ancient Greek-designed Antikythera mechanism, dating between 150 and 100 BC, is the world’s oldest analog computer.

The Antikythera mechanism is believed to be the earliest mechanical analog “computer”, according to Derek J. de Solla Price.[5] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later.

Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century.[6] The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer[7][8] and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235.[9] Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe,[10] an early fixed-wired knowledge processing machine[11] with a gear train and gear-wheels,[12] circa 1000 AD.

The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation.

The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage.
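
In modern software the same area-from-traced-boundary idea is usually expressed with the shoelace formula, which sums signed cross-products of successive points along the outline. The sketch below is only a discrete analogue of the planimeter’s continuous mechanism, assuming the boundary has been sampled as a list of points; the function name and sample polygon are purely illustrative.

```python
def polygon_area(points):
    """Area of a closed figure from points traced along its boundary
    (shoelace formula), a discrete analogue of what a planimeter
    accumulates mechanically as its arm follows the outline."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

print(polygon_area([(0, 0), (4, 0), (4, 3), (0, 3)]))  # 12.0 for a 4 x 3 rectangle
```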

A slide rule

The slide rule was invented around 1620–1630, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Aviation is one of the few fields where slide rules are still in widespread use, particularly for solving time–distance problems in light aircraft. To save space and for ease of reading, these are typically circular devices rather than the classic linear slide rule shape. A popular example is the E6B.
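
The slide rule works because sliding one logarithmic scale along another adds lengths, and adding logarithms multiplies numbers: log(ab) = log a + log b. A minimal Python sketch of that principle, with illustrative function names of our own choosing:

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply the way a slide rule does: add logarithms, then take the antilog."""
    return 10 ** (math.log10(a) + math.log10(b))

def slide_rule_divide(a: float, b: float) -> float:
    """Divide by subtracting logarithms."""
    return 10 ** (math.log10(a) - math.log10(b))

print(slide_rule_multiply(3.0, 7.0))  # approximately 21.0
print(slide_rule_divide(21.0, 3.0))   # approximately 7.0
```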

In the 1770s Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically “programmed” to read instructions. Along with two other complex machines, the doll is at the Musée d’Art et d’Histoire of Neuchâtel, Switzerland, and still operates.[13]

The tide-predicting machine invented by Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location.
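
Thomson’s machine summed a set of harmonic (cosine) terms, one per tidal constituent, using pulleys to add their motions. A rough software analogue is sketched below; the amplitudes and phases are made-up illustrative values, though the two periods correspond to the well-known lunar and solar semidiurnal constituents.

```python
import math

def tide_height(t_hours, constituents):
    """Predicted tide as a sum of harmonic terms A * cos(w * t + phase),
    the kind of sum Thomson's machine formed mechanically with pulleys."""
    return sum(A * math.cos(w * t_hours + p) for A, w, p in constituents)

# (amplitude in metres, angular speed in radians/hour, phase): illustrative values only
sample = [(1.2, 2 * math.pi / 12.42, 0.0),   # lunar semidiurnal constituent (~12.42 h period)
          (0.4, 2 * math.pi / 12.00, 1.0)]   # solar semidiurnal constituent (12 h period)
print(tide_height(6.0, sample))
```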

The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876 Lord Kelvin had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators.[14] In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers.
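
In modern terms, what a differential analyser did mechanically corresponds to accumulating an integral in small steps. A crude numerical sketch using Euler’s method follows; the step size and the example equation are our own choices and are not tied to any particular machine.

```python
def integrate(f, y0, t0, t1, dt=0.001):
    """Approximate y(t1) for dy/dt = f(t, y) with y(t0) = y0 by accumulating
    small increments, a software analogue of a wheel-and-disc integrator."""
    t, y = t0, y0
    while t < t1:
        y += f(t, y) * dt   # the integrator's output is fed back as its input
        t += dt
    return y

# Example: dy/dt = -y with y(0) = 1; the exact value of y(1) is e**-1, about 0.3679
print(integrate(lambda t, y: -y, y0=1.0, t0=0.0, t1=1.0))
```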

First computing device

Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the “father of the computer”,[15] he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary difference engine, designed to aid in navigational calculations, in 1833 he realized that a much more general design, an Analytical Engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[16][17]

The machine was about a century ahead of its time. All the parts for his machine had to be made by hand — this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage’s failure to complete the analytical engine can be chiefly attributed to difficulties not only of politics and financing, but also to his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine’s computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906.

Analog computers

Sir William Thomson’s third tide-predicting machine design, 1879–81

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.[18] The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the more famous Lord Kelvin.[14]

The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the mechanical integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious. By the 1950s the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (control systems) and aircraft (slide rule).

Digital computers

Electromechanical

By 1938 the United States Navy had developed an electromechanical analog computer small enough to use aboard a submarine. This was the Torpedo Data Computer, which used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II similar devices were developed in other countries as well.
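
The underlying trigonometry is the classic intercept triangle: if the target crosses the line of sight at a known angle and speed, the torpedo must be aimed ahead by a deflection angle whose sine equals the speed ratio times the sine of the track angle. The sketch below shows only this simplified relation, assuming straight courses and constant speeds; the real Torpedo Data Computer solved a richer, continuously updated version of the problem mechanically.

```python
import math

def torpedo_lead_angle(target_speed, torpedo_speed, track_angle_deg):
    """Deflection (lead) angle off the line of sight so torpedo and target
    meet, assuming both run straight at constant speed:
    sin(lead) = (target speed / torpedo speed) * sin(track angle)."""
    ratio = target_speed / torpedo_speed * math.sin(math.radians(track_angle_deg))
    return math.degrees(math.asin(ratio))

print(torpedo_lead_angle(target_speed=10, torpedo_speed=45, track_angle_deg=90))
# roughly 12.8 degrees of lead
```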

Replica of Zuse’s Z3, the first fully automatic, digital (electromechanical) computer.

Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. The Z2, created by German engineer Konrad Zuse in 1939, was one of the earliest examples of an electromechanical relay computer.[19]

In 1941, Zuse followed his earlier machine up with the Z3, the world’s first working electromechanical programmable, fully automatic digital computer.[20][21] The Z3 was built with 2,000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz.[22] Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage’s earlier design), using a binary system meant that Zuse’s machines were easier to build and potentially more reliable, given the technologies available at that time.[23] The Z3 was Turing complete.[24][25]
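
The attraction of binary for a relay machine is that each relay only has to be reliably open or closed, so any number can be held as a row of two-state devices. A small illustration (the helper function is our own, not Zuse’s scheme):

```python
def to_binary(n: int) -> str:
    """Write a non-negative integer using only the two symbols
    that a two-state relay can hold."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))
        n //= 2
    return "".join(reversed(bits))

print(to_binary(22))   # '10110': five two-state relays are enough for the value 22
```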

Vacuum tubes and digital electronic circuits

Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation 5 years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes.[18] In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942,[26] the first “automatic electronic digital computer”.[27] This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory.[28]

Colossus was the first electronic digital programmable computing device, and was used to break German ciphers during World War II.

During World War II, the British at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus.[28] He spent eleven months from early February 1943 designing and building the first Colossus.[29] After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944[30] and attacked its first message on 5 February.[28]

Colossus was the world’s first electronic digital programmable computer.[18] It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Colossus Mark I contained 1,500 thermionic valves (tubes); Mark II, with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process.[31][32]

ENIAC was the first electronic, Turing-complete device, and performed ballistics trajectory calculations for the United States Army.

The US-built ENIAC[33] (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US. Although the ENIAC was similar to the Colossus, it was much faster and more flexible. Like the Colossus, a “program” on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored-program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches.

It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5,000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and take square roots. High-speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC’s development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power, and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.

Definition of a Computer

A computer is a device that can be instructed to carry out an arbitrary set of arithmetic or logical operations automatically. The ability of computers to follow a sequence of operations, called a program, makes them very flexible and useful. Such computers are used as control systems for a very wide variety of industrial and consumer devices, from simple special-purpose devices like microwave ovens and remote controls, and factory devices such as industrial robots and computer-aided design systems, to general-purpose devices like personal computers and mobile devices such as smartphones. The Internet runs on computers, and it connects millions of other computers and their users.

Since ancient times, simple manual devices like the abacus have aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The speed, power, and versatility of computers have increased continuously and dramatically since then, to the point that artificial intelligence may become possible in the future.

Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.

 

Parts of a Computer

The basic parts of a computer system are:

  1. Monitor
  2. CPU (Central Processing Unit)
  3. Keyboard
  4. Mouse
  5. Speakers
  6. Printer

Let us find out about some more devices that can be connected to a computer.

Input Devices:

Devices that help us put data into the computer are called input devices. They help in giving instructions to the computer. Let us learn about a few input devices.

Keyboard:

The keyboard is used for entering data into the computer system. It can type words, numbers and symbols.

Mouse:

The mouse is a pointing device. You can give input to the computer with the help of the mouse.

Joystick:

A joystick makes computer games a lot more fun. When it is moved, it passes information to the computer.

Microphone:

A microphone, or mike, can be attached to a computer. It allows you to input sounds like speech and songs into the computer. You can record your voice with the help of a microphone.

Web Camera:

A web camera is used to take live photos and videos. You can save them on the computer.

Scanner:

A scanner copies pictures and pages, and turns them into images that can be saved on a computer.

Processing Device:

All the inputs are stored, sorted, arranged and changed by a computer. The device that helps a computer do so is called the processing device. The processing device in a computer is known as the Central Processing Unit (CPU).


Output Devices:

The parts of a computer that help us to show the results of processing are called output devices. Let us learn about a few output devices.

Monitor:

A monitor looks like a TV screen. It shows whatever you type on the keyboard or draw with the mouse.

Printer:

A printer prints the results of your work from the computer screen on a sheet of paper. This is called a printout.

Speakers:

The speakers are output devices that produce different types of sounds processed by the computer. You can listen to songs or speeches stored in the computer with the help of speakers.

Headphones:

You can listen to music or any sound from a computer with the help of headphones without disturbing others.

 

Storage Devices:

The parts of a computer that are used for storing data are called storage devices. They help in storing any work done on a computer permanently. Let us learn about a few storage devices.

Hard Disk:

Inside the CPU there is a hard disk. It is made up of one or more metallic disks. It stores a large amount of information.

Floppy Disk:

A floppy disk stores small amounts of information. It works when it is inserted into the floppy drive. The floppy drive is fixed in the CPU.

Compact Disc (CD):

 

A CD stores many times more information than a floppy disk. It works when it is inserted into the CD drive. The CD drive is fixed in the CPU. Note: Not handling a CD properly may result in loss of the data stored on it.

Modern Computer

Concept of modern computer

The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper,[35] On Computable Numbers. Turing proposed a simple device that he called the “Universal Computing Machine”, now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing’s design is the stored program, in which all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper.[36] Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
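
To make the “universal machine” idea concrete, here is a minimal Turing-machine step loop in Python. The transition-table format and the example machine, which simply appends a 1 to a block of 1s, are our own illustrative choices rather than Turing’s original notation.

```python
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
    """Run a one-tape Turing machine.
    transitions: {(state, symbol): (new_state, symbol_to_write, move)} with move in {-1, +1}.
    Returns the tape contents once the machine reaches the 'halt' state."""
    cells = dict(enumerate(tape))            # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Example machine: append one '1' to a block of 1s (unary increment).
rules = {
    ("start", "1"): ("start", "1", +1),   # scan right across the 1s
    ("start", "_"): ("halt", "1", +1),    # write a 1 on the first blank, then halt
}
print(run_turing_machine(rules, "111"))   # -> '1111'
```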

Stored programs


A section of the Manchester Small-Scale Experimental Machine, the first stored-program computer.

Early computing machines had fixed programs. Changing a machine’s function required re-wiring and re-structuring it.[28] With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid by Alan Turing in his 1936 paper. In 1945 Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report ‘Proposed Electronic Calculator’ was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945.[18]
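
The essence of the stored-program idea is that instructions sit in the same memory as the data and are fetched and executed one after another, so changing the program means changing memory contents rather than rewiring the machine. Below is a toy fetch-decode-execute loop; the instruction set is invented purely for illustration and does not correspond to any historical machine.

```python
def run(memory):
    """Execute a tiny stored program: each instruction is a tuple held in the
    same list ('memory') that also holds the data it operates on."""
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = memory[pc]             # fetch
        pc += 1
        if op == "LOAD":                 # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JUMP_IF_NEG":        # conditional branch: change the program counter
            if acc < 0:
                pc = arg
        elif op == "HALT":
            return memory

# Program: memory[7] = memory[5] + memory[6]; the data sits right after the code.
program = [
    ("LOAD", 5), ("ADD", 6), ("STORE", 7), ("HALT", 0),
    ("HALT", 0),          # padding
    2, 3, 0,              # data held at addresses 5, 6 and 7
]
print(run(program)[7])    # -> 5
```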

Ferranti Mark 1, c. 1951.

The Manchester Small-Scale Experimental Machine, nicknamed Baby, was the world’s first stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[37] It was designed as a testbed for the Williams tube, the first random-access digital storage device.[38] Although the computer was considered “small and primitive” by the standards of its time, it was the first working machine to contain all of the elements essential to a modern electronic computer.[39] As soon as the SSEM had demonstrated the feasibility of its design, a project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1.

The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world’s first commercially available general-purpose computer.[40] Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[41] In October 1947, the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951[42] and ran the world’s first regular routine office computer job.

Transistors

The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the “second generation” of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space.

At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves.[43] Their first transistorised computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955,[44] built by the electronics division of the Atomic Energy Research Establishment at Harwell.[44][45]

Integrated circuits

The next great advance in computing power came with the advent of the integrated circuit. The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952.[46]

The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[47] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[48] In his patent application of 6 February 1959, Kilby described his new device as “a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated”.[49][50] Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[51] His chip solved many practical problems that Kilby’s had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby’s chip was made of germanium.

This new development heralded an explosion in the commercial and personal use of computers and led to the invention of the microprocessor. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term “microprocessor”, it is largely undisputed that the first single-chip microprocessor was the Intel 4004,[52] designed and realized by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.[53]

 

Advantages and Disadvantages of Computer Education to Students

Technology has struggled to find its way into the classroom in all sorts of ways, from projectors and televisions to computer labs and student laptops. Along with improving the way students are taught, it is also vitally important that students learn to use computers to improve their own work and prepare for careers in a world where computers have become as common as the pencil and paper.

Modernizing Education

Education has benefited from the inclusion of technology and computers by making it easier for students to keep up while helping teachers by improving the way lessons can be planned and taught. Students who use computers learn to use word processors for work, and subsequently they learn computer jargon and strengthen grammatical skills. Students can also look up lessons on websites or through email rather than lugging heavy textbooks with them every day.

Improving Student Performance

Students who use computers have been shown to attend school more steadily and perform better than students who do not use computers. Along with getting higher grades on exams, students also stated they felt more involved with their lessons and work if they used a computer. Using computers gets students to become more focused on their work at home, in collaborative projects with other students and on their own.

Learning Job Skills

Computers play a vital role in the modern business world, and many of even the most basic jobs involve technology and computers. Teaching students how to use computers helps them prepare for any number of possible careers, and classes based on computer education can get even more specific. Many classes teach students to use office suite programs, create presentations and data sheets, and learn any number of programming languages such as C++ or Java.

Efficiency

Computers make the learning process a lot more simple and efficient, giving students access to tools and methods of communication unavailable offline. For example, students can check their grades or lesson plans online, and also communicate directly with their teachers via email or educational platforms such as Blackboard. Students can also send work to their teachers from home or anywhere else, letting them finish work outside the constraints of school hours and teaching them about procrastination and personal responsibility.

Research

Technology has made research far easier than in the past. Decades ago, students learned history by going to the library and thumbing through history books and encyclopedias. Today, many of those same books are available in digital format and can be accessed online. As the Internet has grown, so too have the available research options. Students can research topics in minutes rather than the hours it used to take.

The Disadvantages of Computers in Education

The limitless access to information provided to college students by computers can present challenges and disadvantages directly related to computer usage in institutions of higher education. Financial difficulties may make it difficult for some students to access important coursework, while other students may use computers to plagiarize or cheat.

Technical Problems

For online learning courses or classes requiring network access, technical issues can cause major problems. A lost or stolen computer might prohibit a student from logging onto a discussion forum. And, according to a 2012 article in “The U.S. News & World Report,” students who do not set up proper security settings on personal computers used for college education might be victims of identity theft.

Spelling and Handwriting Skills

When students replace paper and pen with a computer for education, handwriting skills may suffer. Adult learners benefit from increased brain activity when writing new information by hand, particularly in subjects such as math and chemistry. Most computer word processing programs include a spelling and grammar check, and students might rely too heavily on the computer to correct spelling and grammatical errors.

Cheating

Using computers to cheat is a widespread problem in universities and colleges, as reported in the article “Cheating in College Is Widespread — But Why?”, published on National Public Radio’s website in July 2010. Students might search online for answers to test questions or have answers sent to their computers by other students. Students are able to access huge amounts of information via computers and may present that information as their own. Plagiarizing may be difficult for universities to prove or identify because of the broad scope of the Internet and the difficulty of finding all possible sources of information.

Financial Problems

Financial problems may prohibit some students from owning a computer, placing them at a disadvantage. Low-income college students are less likely to have easy access to a computer and may not have learned basic computer skills that other students learned at a young age. This puts them at a disadvantage when college classes require heavy computer usage, according to a 2008 article published in the “American Academic,” a publication on the American Federation of Teachers’ website.