World of Computers


TRANSCRIPT

  • Slide 1/21

    Presented by:

    Varsha Sukhramani

    Shabeen Samnani

  • Slide 2/21

    Introduction

    History of Computing

    How and Where Computers Were Introduced

    The Abacus

    Charles Babbage

    Harvard Mark I

    ENIAC Computers

    Supercomputers

    Development of Computers

    How Computers Work

  • Slide 3/21

    A computer is a machine which manipulates data according to a list of
    instructions.

    The first devices that resemble modern computers date to the mid-20th
    century (around 1940-1945). Early electronic computers were the size of a large
    room, consuming as much power as several hundred modern personal
    computers. Today, simple computers may be made small enough to fit into a
    wrist watch and be powered from a watch battery. However, the most
    common form of computer in use today is by far the embedded computer.
    Embedded computers are small, simple devices that are often used to control
    other devices.

    Any computer with a certain minimum capability is, in principle, capable
    of performing the same tasks that any other computer can perform.

  • Slide 4/21

    It is difficult to identify any one device as the earliest computer, partly because the
    term "computer" has been subject to varying interpretations over time. Originally, the
    term "computer" referred to a person who performed numerical calculations (a human
    computer), often with the aid of a mechanical calculating device. In 1801, Joseph
    Marie Jacquard made an improvement to the textile loom that used a series of punched
    paper cards as a template to allow his loom to weave intricate patterns automatically.

    In 1837, Charles Babbage was the first to conceptualize and design a fully
    programmable mechanical computer, which he called "The Analytical Engine". During the
    first half of the 20th century, many scientific computing needs were met by increasingly
    sophisticated analog computers, which used a direct mechanical or electrical model of
    the problem as a basis for computation. However, these were not programmable and
    generally lacked the versatility and accuracy of modern digital computers. A succession
    of steadily more powerful and flexible computing devices was constructed in the 1930s
    and 1940s, gradually adding the key features that are seen in modern computers.

  • Slide 5/21

    The first stored-program computer to be demonstrated working was the Manchester
    Small-Scale Experimental Machine (SSEM). The technologies used in computers have
    changed dramatically since the first electronic general-purpose computers of the 1940s.
    Vacuum tube-based computers were in use throughout the 1950s, but were largely replaced
    in the 1960s by transistor-based machines, which were smaller, faster, cheaper, used
    less power and were more reliable. By the 1970s, the adoption of integrated circuit
    technology and the subsequent creation of microprocessors such as the Intel 4004 brought
    further improvements in size, speed, cost and reliability. The early and mid-1980s saw
    machines with a modest number of vector processors working in parallel become the
    standard. Typical numbers of processors were in the range of four to sixteen. By the
    1980s, computers had become sufficiently small and cheap to replace simple mechanical
    controls in domestic appliances such as washing machines. Around the same time, computers
    became widely accessible for personal use by individuals in the form of home computers
    and the now ubiquitous personal computer. In the later 1980s and 1990s, attention turned
    from vector processors to massively parallel processing systems with thousands of
    "ordinary" CPUs, some being off-the-shelf units and others being custom designs. In
    conjunction with the widespread growth of the Internet since the 1990s, personal computers
    are becoming as common as the television and the telephone, and almost all modern
    electronic devices contain a computer of some kind.

  • Slide 6/21

    The modern Abacus

    The old Abacus

    The Abacus was an early aid for mathematical computations. Its only value is
    that it aids the memory of the human performing the calculations. A skilled
    abacus operator can work on addition and subtraction problems at the speed
    of a person equipped with a hand calculator (multiplication and division are
    slower). The oldest surviving Abacus was used in 300 B.C. by the Babylonians.
    The Abacus is still in use today, principally in the Far East. A modern Abacus
    consists of rings that slide over rods, but the older one dates from the time
    when pebbles were used for counting. The modern Abacus is just a
    representation of the human fingers: the lower rings on each rod represent the
    5 fingers and the 2 upper rings represent the 2 hands.

  • Slide 7/21

    Small section of the type of mechanism employed in Babbage's Difference Engine

    By 1822 the English mathematician Charles Babbage was proposing a steam-
    driven calculating machine the size of a room, which he called the Difference
    Engine. This machine would be able to compute tables of numbers, such as
    logarithm tables. He obtained government funding for this project due to the
    importance of numeric tables in ocean navigation.

    It was Babbage who made an important intellectual leap regarding the punched
    cards. Babbage realized that punched paper could be employed as a storage
    mechanism, holding computed numbers for future reference. Babbage called the
    two main parts of his Analytical Engine the "Store" and the "Mill", as both terms are
    used in the weaving industry. The Store was where numbers were held and the
    Mill was where they were "woven" into new results. In a modern computer these
    same parts are called the memory unit and the central processing unit (CPU).

  • Slide 8/21

    The Harvard Mark I: an electro-mechanical computer

    One early success was the Harvard Mark I computer, which was built as a partnership
    between Harvard and IBM in 1944. This was the first programmable digital computer
    made in the U.S., but it was not a purely electronic computer. Instead the Mark I was
    constructed out of switches, relays, rotating shafts, and clutches.

    One of the primary programmers for the Mark I was a woman, Grace Hopper. Hopper
    found the first computer "bug". The word "bug" had been used to describe a defect since
    at least 1889, but Hopper is credited with coining the word "debugging" to describe the
    work to eliminate program faults.

    The Mark I operated on numbers that were 23 digits wide. It could add or subtract
    two of these numbers in three-tenths of a second, multiply them in four seconds, and
    divide them in ten seconds. Forty-five years later computers could perform an addition
    in a billionth of a second! This kind of speed is obviously impossible for a machine
    which must move a rotating shaft, and that is why electronic computers killed off their
    mechanical predecessors.

  • Slide 9/21

    "Electronic Numerical Integrator and Calculator"

    The title of forefather of today's all-electronic digital computers is usually
    awarded to ENIAC, which stood for Electronic Numerical Integrator and
    Calculator. ENIAC was built at the University of Pennsylvania between 1943
    and 1945 by two professors, John Mauchly and the 24-year-old J. Presper
    Eckert. Like the Mark I, ENIAC employed paper card readers obtained from
    IBM.

    To set up a computation on ENIAC you had to rearrange a large number of
    patch cords and then locate three particular knobs on that vast wall of knobs
    and set them to 3, 1, and 4.

    One of the most obvious problems was that the design required its 18,000
    vacuum tubes to all work simultaneously.

  • Slide 10/21

    The Cray-2 was the world's fastest computer from 1985 to 1989.

    A supercomputer is a computer that is considered, or was considered at the time of
    its introduction, to be at the frontline in terms of processing capacity, particularly
    speed of calculation.

    Supercomputers introduced in the 1960s were designed primarily by Seymour Cray
    at Control Data Corporation (CDC), and led the market into the 1970s until Cray
    left to form his own company, Cray Research. He then took over the supercomputer
    market with his new designs, holding the top spot in supercomputing for five years
    (1985-1990).

    The term supercomputer itself is rather fluid, and today's supercomputer tends to
    become tomorrow's normal computer. In the 1970s most supercomputers were
    dedicated to running a vector processor. Today, parallel designs are based on "off the
    shelf" server-class microprocessors, such as the PowerPC, Itanium, or x86-64, and
    most modern supercomputers are now highly-tuned computer clusters using
    commodity processors combined with custom interconnects.

  • Slide 11/21

  • Slide 12/21

    As time progressed, people found they were using adding machines and slide rules to
    perform more and more extremely tedious calculations. Howard Aiken developed the Mark I
    in 1944 to ease this calculating burden. During World War II, researchers made
    more advances to ease the burden of performing calculations. In 1946, they
    developed the ENIAC, Electronic Numerical Integrator and Calculator. The
    computer had 18,000 vacuum tubes which were used to perform calculations at a
    rate of 5,000 additions per second. In the next few years, a number of other "first
    generation" computers were built. All of these early computers used vacuum tubes to
    perform their calculations. In 1945, John von Neumann wrote a paper describing
    how a binary program could be electronically stored in a computer. In 1947, the
    EDVAC, Electronic Discrete Variable Automatic Computer, was built by Eckert
    and Mauchly. In 1951, Eckert and Mauchly built the UNIVAC for use at the
    Census Bureau. The UNIVAC used magnetic tape to store input/output rather
    than punched tape. In 1953, IBM continued to develop and expand its computer
    line, and within the next decade it was joined by companies such as Digital
    Equipment Corporation.

  • Slide 13/21

    Vanguard Motion Analyzer

    In 1947, Bell Laboratories invented the transistor. This creation sparked the
    production of a wave of "second generation" computers. By using transistors
    in place of vacuum tubes, manufacturers could produce more reliable
    computers. Using transistors was also less expensive than building a
    computer with vacuum tubes. The combination of smaller size, better
    reliability, and lower cost made these second generation computers very
    popular with buyers. For scientists and engineers, large powerful computers
    were built which were good at performing calculations. For banks and
    insurance companies, smaller and faster computers which were good at sorting
    and printing were built. Computer companies found that it was expensive to
    produce two different lines of computers, so they set to work to develop a
    computer which could perform both calculations and data processing equally well.

  • Slide 14/21

    Vectral General Computer

    In 1958, the first integrated circuit was made. This invention has led to the
    widespread use of computers today. Scientists found a way to reduce the size
    of transistors so they could place hundreds of them on a small silicon chip,
    about a quarter of an inch on each side. This enabled computer
    manufacturers to build smaller computers. In 1956, FORTRAN, the first
    programming language, was developed. The introduction of programming
    languages enabled this third generation of computers to contain something
    called an operating system. Another new aspect of computing in the third
    generation machines was the presence of multiprogramming, which enabled the
    computer to run a number of jobs simultaneously. The companies who
    manufactured the third generation computers tried to create computers which
    could successfully perform both calculations and sorts.

  • Slide 15/21

    Then, in 1971, Intel created the first microprocessor. The microprocessor was a
    large-scale integrated circuit which contained thousands of transistors. In 1976,
    Steve Jobs and Steve Wozniak built the first Apple computer in a garage in
    California. Then, in 1981, IBM introduced its first personal computer. The
    personal computer was such a revolutionary concept and was expected to have
    such an impact on society that in 1982 Time magazine named the computer its
    "Machine of the Year". Within a matter of years, computers spread from the
    workplace into the home. Personal computers have changed a great deal since
    the early eighties. The hardware has definitely changed: computers are faster
    now and have more memory. The increased processing speed and memory in
    computers has led to an increase in the quality of computer graphics.
    The introduction of the integrated circuit and its development into the very-large-
    scale integrated circuit started a technological revolution which caused computers
    to invade almost every aspect of our society.

  • Slide 16/21

    Control Unit

    Memory

    Input/output (I/O)

    Arithmetic/Logic Unit (ALU)

  • Slide 17/21

    The control unit directs the various components of a computer. It reads and interprets
    instructions in the program. Control systems in advanced computers may change the
    order of some instructions so as to improve performance. A key component common to
    all CPUs is the program counter. The control system's function is as follows (a minimal
    sketch of this cycle appears below); some of these steps may be performed concurrently
    or in a different order depending on the type of CPU:

    Read the code for the next instruction from the cell indicated by the program counter.

    Decode the numerical code for the instruction into a set of commands or signals for
    each of the other systems.

    Increment the program counter so it points to the next instruction.

    Read whatever data the instruction requires from cells in memory. The location of
    this required data is typically stored within the instruction code.

    Provide the necessary data to an ALU or register.

    If the instruction requires an ALU or specialized hardware to complete, instruct the
    hardware to perform the requested operation.

    Write the result from the ALU back to a memory location or to a register or perhaps
    an output device.
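
    To make the cycle above concrete, here is a minimal sketch of a toy CPU in Python that
    follows the same steps. The small instruction set (LOAD, ADD, STORE, HALT) and the
    memory layout are illustrative assumptions, not something taken from the presentation.

        # A toy fetch-decode-execute loop. The instruction set and memory layout
        # are hypothetical, chosen only to illustrate the steps listed above.

        def run(memory):
            accumulator = 0        # a single register holding the ALU's working value
            program_counter = 0    # points at the cell holding the next instruction

            while True:
                # 1. Read the next instruction from the cell the program counter indicates.
                opcode, operand = memory[program_counter]

                # 2./3. Decode the instruction and increment the program counter.
                program_counter += 1

                # 4.-7. Read any required data, use the ALU, and write results back.
                if opcode == "LOAD":
                    accumulator = memory[operand]                # memory -> register
                elif opcode == "ADD":
                    accumulator = accumulator + memory[operand]  # ALU operation
                elif opcode == "STORE":
                    memory[operand] = accumulator                # register -> memory
                elif opcode == "HALT":
                    return memory

        # A tiny program: add the numbers in cells 10 and 11, store the sum in cell 12.
        memory = {
            0: ("LOAD", 10),
            1: ("ADD", 11),
            2: ("STORE", 12),
            3: ("HALT", None),
            10: 7,
            11: 8,
            12: 0,
        }
        print(run(memory)[12])     # prints 15

    A real control unit does all of this in hardware and may overlap the steps (pipelining),
    but the order of operations is the same as in the list above.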

  • Slide 18/21

    Magnetic Core Memory

    Magnetic core memory was popular main memory for computers through the 1960s,
    until it was completely replaced by semiconductor memory.

    A computer's memory can be viewed as a list of cells into which numbers can be placed
    or read. The information stored in memory may represent practically anything: letters,
    numbers, even computer instructions. In almost all modern computers, each memory cell
    is set up to store binary numbers in groups of eight bits (called a byte). Each byte is
    able to represent 256 different numbers. A computer can store any kind of information in
    memory as long as it can be somehow represented in numerical form. Computer main
    memory comes in two principal varieties: random access memory or RAM and read-only
    memory or ROM. ROM is pre-loaded with data and software that never changes. The
    contents of RAM are erased when the power to the computer is turned off, while ROM
    retains its data indefinitely. Software that is stored in ROM is often called firmware,
    because it is notionally more like hardware than software. Flash memory blurs the
    distinction between ROM and RAM by retaining data when turned off but being
    rewritable like RAM. In more sophisticated computers there may be one or more RAM
    cache memories, which are slower than registers but faster than main memory.
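
    As a small illustration of the "list of cells" view described above, the sketch below
    models main memory as a Python bytearray. The size of the memory and the stored values
    are made-up examples; the point is that each cell holds one of the 256 values a byte can
    represent.

        # Model memory as a list of byte-sized cells. Each cell holds a value from 0 to 255.
        memory = bytearray(16)              # 16 cells, all initially 0

        memory[0] = 72                      # store a number; 72 is also the ASCII code for 'H'
        memory[1] = 105                     # ...and 105 is 'i' - the same cells can hold text or data
        memory[2] = 255                     # the largest value a single byte can represent

        print(memory[0], memory[1], memory[2])    # read cells back: 72 105 255
        print(bytes(memory[:2]).decode("ascii"))  # the same two cells interpreted as text: Hi

        try:
            memory[3] = 300                 # a single byte cannot represent 300
        except ValueError as err:
            print("out of range:", err)

    RAM, ROM and caches are not modelled here; the sketch only shows that every cell holds a
    small number and that the meaning of that number depends on how the program interprets it.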

  • Slide 19/21

    Common I/O Devices Used With Computers

    Hard disks are common I/O devices used with computers. I/O is the means by
    which a computer receives information from the outside world and sends results
    back. Devices that provide input or output to the computer are called
    peripherals. On a typical personal computer, peripherals include input devices
    like the keyboard and mouse, and output devices such as the display and printer.
    Hard disk drives, floppy disk drives and optical disc drives serve as both input
    and output devices. Computer networking is another form of I/O. Often, I/O
    devices are complex computers in their own right, with their own CPU and
    memory. A graphics processing unit might contain fifty or more tiny computers
    that perform the calculations necessary to display 3D graphics. Modern desktop
    computers contain many smaller computers that assist the main CPU in
    performing I/O.
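
    A short sketch of this idea in Python: the program below takes input from one peripheral
    (the keyboard), sends output to another (the display), and uses the hard disk as both an
    input and an output device. The file name greeting.txt is an arbitrary example.

        # Keyboard -> program -> disk -> display: data passing through different I/O devices.
        name = input("Type your name: ")         # input device: keyboard

        with open("greeting.txt", "w") as f:     # output device: the hard disk
            f.write(f"Hello, {name}!\n")

        with open("greeting.txt") as f:          # the disk now acts as an input device
            print(f.read(), end="")              # output device: the display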

  • Slide 20/21

    The ALU is capable of performing two classes of operations: arithmetic and
    logic. The set of arithmetic operations that a particular ALU supports may be
    limited to adding and subtracting, or might include multiplying or dividing.
    Any computer can be programmed to perform any arithmetic operation, although it
    will take more time to do so if its ALU does not directly support the operation.
    Logic operations involve Boolean logic: AND, OR, XOR and NOT. These can be
    useful both for creating complicated conditional statements and processing
    Boolean logic. Superscalar computers contain multiple ALUs so that they can
    process several instructions at the same time. Graphics processors and computers
    with SIMD and MIMD features often provide ALUs that can perform
    arithmetic on vectors and matrices.
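
    The sketch below is a minimal, hypothetical ALU in Python. It dispatches on an operation
    name and covers the arithmetic and Boolean operations named above; the function name and
    the particular set of operations are illustrative assumptions.

        # A toy ALU: two classes of operations, arithmetic and Boolean logic.
        # The logic operations work bit by bit on the integer operands.

        def alu(op, a, b=0):
            if op == "ADD":
                return a + b
            if op == "SUB":
                return a - b
            if op == "AND":
                return a & b                # bitwise AND
            if op == "OR":
                return a | b                # bitwise OR
            if op == "XOR":
                return a ^ b                # bitwise XOR
            if op == "NOT":
                return ~a & 0xFF            # bitwise NOT, kept within one byte
            raise ValueError(f"unsupported operation: {op}")

        print(alu("ADD", 6, 7))             # 13
        print(alu("XOR", 0b1100, 0b1010))   # 6 (binary 0110)
        print(alu("NOT", 0b00001111))       # 240 (binary 11110000)

    An ALU that lacked, say, multiplication could still be used to multiply by repeated ADDs,
    which is the point the slide makes: any arithmetic operation is possible, just slower
    without direct hardware support.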

  • Slide 21/21