
    Lecture 1: Introduction to Computer Architecture

The aim of this course unit is to present the nature and characteristics of modern-day computer systems. This task is challenging for several reasons:

1. There is a tremendous variety of products that can rightly claim the name computer, from single-chip microprocessors to supercomputers. This variety is exhibited in terms of cost, size, performance and application.

2. The rapid pace of change that has always characterised computer technology continues with no letup, e.g. the integrated circuits (ICs) used to construct computer components, the increased use of parallel organisation concepts in combining those components, and many more aspects of computer technology.

In spite of the variety and pace of change, certain fundamental concepts apply throughout. The application of these concepts depends on the current state of technology and the price/performance objectives of the designer.

All of the basic performance characteristics of computer systems, including processor speed, memory speed, memory capacity, and interconnection data rates, are increasing rapidly. This makes it difficult to design a balanced system that maximises the performance and utilisation of all elements. Thus, computer design is a game of changing the structure or function in one area to compensate for a performance mismatch in another area.

A computer system consists of an interrelated set of components. The system is best characterised in terms of structure (the way in which the components are interrelated) and function (the operation of the individual components).

Why Study Computer Organisation and Architecture?

The IEEE/ACM (Institute of Electrical and Electronics Engineers / Association for Computing Machinery) report (2001) lists computer architecture as one of the core subjects that should be in the curriculum of all students in Computer Science and Computer Engineering.

    The report says the following:

The computer lies at the heart of computing. Without it, most disciplines today would be a branch of theoretical mathematics. To be a professional in the field of computing today, one should not regard the computer as just a black box that executes programs by magic. All students of computing should acquire some understanding and appreciation of a computer system's functional interactions. There are practical implications as well. Students need to understand computer architecture in order to structure a program so that it runs more efficiently on a real machine. In selecting a system to use, they should be able to understand the tradeoff among various components, such as CPU clock speed versus memory size.
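As a hedged illustration of what "structuring a program so that it runs more efficiently on a real machine" can mean, the C sketch below sums the same array in two different orders. Both loops do identical arithmetic, but on most cache-based machines the row-major loop runs noticeably faster. The array size N is an arbitrary choice for illustration.

```c
/* A minimal, machine-dependent sketch: the same computation can run at very
 * different speeds depending on how well the program matches the memory
 * hierarchy of the machine it runs on. */
#include <stdio.h>

#define N 1024
static double table[N][N];

int main(void) {
    double sum = 0.0;

    /* Row-major traversal: consecutive accesses touch consecutive memory
     * locations, so each fetched cache line is used fully. */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += table[i][j];

    /* Column-major traversal: consecutive accesses are N elements apart,
     * which typically causes many more cache misses on real hardware. */
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += table[i][j];

    printf("sum = %f\n", sum);
    return 0;
}
```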

    The following examples are reasons for studying computer architecture:

1. Suppose a graduate enters the industry and is asked to select the most effective computer for use throughout a large organisation. An understanding of the implications of spending more for various alternatives, such as a larger cache or a higher processor clock rate, is essential to making the decision.

2. Many processors are used not in PCs or servers but in embedded systems. A designer may program a processor in C that is embedded in some real-time or larger system, such as an intelligent automobile electronics controller. Debugging the system may require the use of a logic analyser that displays the relationship between interrupt requests from engine sensors and machine code (a minimal sketch of this style of embedded C code follows this list).

3. Concepts used in computer architecture find application in other courses. In particular, the way in which the computer provides architectural support for programming languages and operating system facilities reinforces concepts from those areas.
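The following is a minimal, platform-neutral sketch of the embedded style of code referred to in example 2 above. The handler name, the flag, and the idea that hardware would invoke the handler on an engine-sensor interrupt are assumptions made purely for illustration; real interrupt registration is platform-specific, so the sketch simply calls the handler once to simulate the event.

```c
#include <stdio.h>

static volatile int sensor_event_pending = 0;   /* set by the handler, read by the main loop */

/* Hypothetical interrupt service routine: kept short, it only records the event. */
void sensor_interrupt_handler(void) {
    sensor_event_pending = 1;
}

int main(void) {
    /* On real hardware the interrupt controller would call the handler; here we
     * simulate a single engine-sensor interrupt so the sketch runs and terminates. */
    sensor_interrupt_handler();

    for (;;) {                                   /* main control loop */
        if (sensor_event_pending) {
            sensor_event_pending = 0;
            printf("handled engine sensor event\n");
            break;                               /* break only so this sketch terminates */
        }
    }
    return 0;
}
```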

    Computer Architecture and Organisation

    Definition of terms

    a) Computer Architecture



    Baer: The design of the integrated system which provides a useful tool to the programmer

    Hayes: The study of the structure, behavior and design of computers

    Abd-Alla: The design of the system specification at a general or subsystem level

    Foster: The art of designing a machine that will be a pleasure to work with

    Hennessy and Patterson: The interface between the hardware and the lowest level software

    Common themes: Design / structure

    Art

    System

    Tool for programmer and application

    Interface

Thus, computer architecture refers to those attributes of the system that are visible to a programmer, i.e. those attributes that have a direct impact on the execution of a program:

Instruction sets

Data representations: the number of bits used to represent various data types (e.g. numbers, characters, etc.); see the short C sketch after this list

Memory-addressing techniques

I/O mechanisms
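As a small, hedged illustration of the data-representation point, the C program below prints the bit widths that the machine and compiler it is built with assign to a few types. The printed values differ between architectures, which is exactly why data representation is an architecture-visible attribute.

```c
/* Minimal sketch: the widths of basic data types are visible to the
 * programmer and depend on the architecture/compiler (e.g. long is commonly
 * 32 bits on one platform and 64 bits on another). */
#include <stdio.h>
#include <limits.h>

int main(void) {
    printf("char : %zu bits\n", (size_t)CHAR_BIT * sizeof(char));
    printf("int  : %zu bits\n", (size_t)CHAR_BIT * sizeof(int));
    printf("long : %zu bits\n", (size_t)CHAR_BIT * sizeof(long));
    printf("INT_MAX on this machine: %d\n", INT_MAX);
    return 0;
}
```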

    NB

Computer architects have always been striving to increase the performance of their architectures.

Computer architecture is concerned with the structure and behavior of the computer as seen by the user. It includes the information formats, the instruction set, and techniques for addressing memory. The architectural design of a computer system is concerned with the specifications of the various functional modules, such as processors and memories, and structuring them together into a computer system.

Computer design is concerned with the hardware design of the computer. Once the computer specifications are formulated, it is the task of the designer to develop hardware for the system. Computer design is concerned with the determination of what hardware should be used and how the parts should be connected. This aspect of computer hardware is sometimes referred to as computer implementation.

b) Computer Organization

Computer organization refers to the operational units and their interconnections that realize the architecture specifications.

The term is synonymous with architecture in many uses and textbooks. We will use it to mean the underlying implementation of the architecture, transparent to the programmer.

    An architecture can have a number of organizational implementations:

    Control signals

    Technologies

    Device implementations

Computer organization is concerned with the way the hardware components operate and the way they are connected together to form the computer system. The various components are assumed to be in place, and the task is to investigate the organizational structure to verify that the computer parts operate as intended.
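A hedged software analogy (not an example from the lecture text itself): at the architecture level a programmer only sees that a multiply produces a product; at the organization level that result might come from a dedicated multiplier circuit or from repeated addition. The two C functions below give the same externally visible behaviour through different internal mechanisms, mirroring how one architecture can have several organizational implementations.

```c
#include <stdio.h>

/* "Organization A": rely on the hardware multiply operation. */
static unsigned mul_hw(unsigned a, unsigned b) {
    return a * b;
}

/* "Organization B": realize the same operation by repeated addition. */
static unsigned mul_repeated_add(unsigned a, unsigned b) {
    unsigned product = 0;
    while (b--) {
        product += a;
    }
    return product;
}

int main(void) {
    /* Same visible result either way: 42 42 */
    printf("%u %u\n", mul_hw(6, 7), mul_repeated_add(6, 7));
    return 0;
}
```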

    Structure & Function

    A hierarchical system is a set of interrelated subsystems, each of the latter, in turn,

    hierarchical in structure until we reach some lowest level of elementary subsystem.

    Structure is the way in which components relate to each other

    Function is the operation of individual components as part of the structure

    c) Computer



    A computer is an electronic machine that accepts information, stores it until the information is

    needed, processes the information according to the instructions provided by the user, and finally

    returns the results to the user. The computer can store and manipulate large quantities of data at

    very high speed, but a computer cannot think. A computer makes decisions based on simple

    comparisons such as one number being larger than another. Although the computer can help

    solve a tremendous variety of problems, it is simply a machine. It cannot solve problems on its

    own!!!

    Historically, a computer was a job title, not a piece of equipment!

    Requirements of a computer are:

    Process data

    Store data

    Move data between the computer and the outside world

    Control the operation of the above

    Figure showing functional view of a computer

History of Computers

TECHNOLOGICAL DEVELOPMENT

Computer technology has shown an unprecedented rate of improvement. This includes the development of processors and memories. Indeed, it is the advances in technology that have fueled the computer industry. The integration of numbers of transistors (a transistor is a controlled on/off switch) into a single chip has increased from a few hundred to millions. This impressive increase has been made possible by the advances in the fabrication technology of transistors. The scale of integration has grown from small-scale integration (SSI) to medium-scale (MSI) to large-scale (LSI) to very large-scale integration (VLSI), and currently to wafer-scale integration (WSI). Table 1 shows the typical numbers of devices per chip in each of these technologies.

    TABLE 1 Numbers of Devices per Chip

It should be mentioned that the continuous decrease in the minimum device feature size has led to a continuous increase in the number of devices per chip, which in turn has led to a number of developments. Among these is the increase in the number of devices in RAM memories, which in turn helps designers to trade off memory size for speed. The improvement in the feature size provides golden opportunities for introducing improved design styles.

    Mechanical Era (1600s-1940s)

Wilhelm Schickard (1623)

Astronomer and mathematician

    Automatically add, subtract, multiply, and divide

    Blaise Pascal (1642)

    Mathematician

    Mass produced first working machine (50 copies)



    Could only add and subtract

    Maintenance and labor problems

Gottfried Leibniz (1673)

    Mathematician and inventor

Improved on Pascal's machine

    Add, subtract, multiply, and divide

    Charles Babbage (1822)

Mathematician; regarded as the father of the modern computer

    Wanted more accuracy in calculations

    Difference engine

    Government / science agreement

    Automatic computation of math tables

    Analytic engine

    Perform any math operation

Punch cards

Modern structure: I/O, storage, ALU

    Add in 1 second, multiply in 1 minute

    Both engines plagued by mechanical problems

    George Boole (1847)

    Mathematical analysis of logic

    Investigation of laws of thought

    Herman Hollerith (1889)

Modern-day punched card machine

Formed the Tabulating Machine Company (which became IBM)

    1880 census took 5 years to tabulate

    Tabulation estimates

    1890: 7.5 years

    1900: 10+ years

Hollerith's tabulating machine reduced the 7.5-year estimate to 2 months

Konrad Zuse (1938)

Built the first working mechanical computer, the Z1

    Binary machine

German government decided not to pursue development, as W.W.II had already started

    Howard Aiken (1943)

    Designed the Harvard Mark I

Implementation of Babbage's machine

    Built by IBM

    Mechanical era summary

    Mechanical computers were designed to reduce the time required for calculations and

    increase accuracy of the results

    Two drawbacks

    Speed of operation limited by the inertia of moving parts (gears and pulleys)

    Cumbersome, unreliable, and expensive

    The Electronic Era

    Computer Generations

The history of computer development is often described in terms of computer generations.

John von Neumann and his von Neumann architecture: using a processing unit and a single, separate storage structure to hold both instructions and data.
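A minimal sketch of the von Neumann idea, assuming an invented toy instruction set (HALT/LOAD/ADD/STORE) purely for illustration: one memory array holds both the program and its data, and a loop fetches, decodes and executes instructions from that single store.

```c
#include <stdio.h>

enum { HALT, LOAD, ADD, STORE };               /* hypothetical opcodes */

int main(void) {
    /* Single store: cells 0-7 hold the program, cells 8 and up hold data. */
    int memory[16] = {
        LOAD,  8,   /* acc = memory[8]        */
        ADD,   9,   /* acc = acc + memory[9]  */
        STORE, 10,  /* memory[10] = acc       */
        HALT,  0,
        2, 3, 0     /* data: two operands and a result cell */
    };
    int pc = 0, acc = 0, running = 1;

    while (running) {                          /* fetch-decode-execute cycle */
        int opcode  = memory[pc];
        int operand = memory[pc + 1];
        pc += 2;
        switch (opcode) {
        case LOAD:  acc = memory[operand];   break;
        case ADD:   acc += memory[operand];  break;
        case STORE: memory[operand] = acc;   break;
        case HALT:  running = 0;             break;
        }
    }
    printf("result stored at memory[10] = %d\n", memory[10]);  /* prints 5 */
    return 0;
}
```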

Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate.



This resulted in smaller, cheaper, more powerful, more efficient and more reliable machines.

How many generations of computers do we have now?

    First Generation

    1940s~1956

    Vacuum Tubes

    Used vacuum tubes for circuitry and magnetic drums for memory

    Often enormous, taking up entire rooms

Reliability was a problem

Expensive, used lots of electricity, generated lots of heat

Examples: ENIAC (Electronic Numerical Integrator and Calculator), UNIVAC (UNIVersal Automatic Computer)

ENIAC: the first high-speed, purely electronic, Turing-complete, digital computer capable of being reprogrammed to solve a full range of computing problems.

It contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and 5 million hand-soldered joints. It weighed 27 tons and measured roughly 8.5 ft x 3 ft x 80 ft.

    Second Generation

1956-1963

Transistors

    The transistor was far superior to the vacuum tube, allowing computers to become

    smaller, faster, cheaper, more energy-efficient and more reliable than their first-

    generation predecessors

Still, the transistors generated lots of heat

High-level programming languages such as COBOL and FORTRAN were used

    Examples: IBM 1620, CDC 3600

    Sample Photo: The IBM 1620 Data Processing System



Nickname: CADET (Can't Add, Doesn't Even Try)

    Third Generation

    1964-1971

    Integrated Circuits (IC). Up to 100 devices per chip.

    Transistors were miniaturized and placed on silicon chips, called

    semiconductors, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system.

    CPL->BCPL->B

    Examples: IBM-360, ICL-1900, VAX-750

    Sample Photo: VAX - 750

    Fourth Generation

    1971-present

    Microprocessors

Very Large Scale Integration (VLSI) and Ultra Large Scale Integration (ULSI): millions of components could be fitted onto a single silicon chip

Due to the development of the microprocessor, it became possible to place a computer's central processing unit (CPU) on a single chip

    1981: IBM PC, 1984: Apple Macintosh

    GUIs, Mouse.

    Fifth Generation

    Present and Beyond

    Based on artificial intelligence, the fifth generation is still in development,

    though there are some applications, such as voice recognition, that are being used

    today

    The use of parallel processing and superconductors is helping to make

    artificial intelligence a reality

Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come

Any other ideas?



Do you possess any of these devices?
