Day 1 - CEAP Standards-Based Assessment


  • Aurora F. Fernandez, Ed. D.

    National Education Testing and Research Center

    Department of Education

  • Defining a Standard (Webster, 1994)

    basis of comparison in measuring or judging capacity, quality, etc.

    a measure of adequacy; meeting the requirements of a standard or role model


  • Assessment

    It is the gathering of information and evidence about the performance of individuals in tests and other measures.

    It facilitates broad communication and dialogue focused on outcomes or qualitative and quantitative data.

    On the national scale, results of assessment are means to assess fulfillment of goals and serve as bases for formulating and implementing policies that lead to upgrading performance standards of education in general.


  • Broadly, the objectives of assessment in education are to:

    1. assess readiness of learners for subsequent grade/year levels in the education ladder

    2. ensure that quality learning is being effected by the system

    3. assess the appropriateness, adequacy and timeliness of inputs and processes at each stage/phase of the system

    4. continuously monitor progress or positive change and improvement in a program

    5. identify strengths and weaknesses of a program, with focus on its components: inputs, processes and transactions

    6. identify gaps and/or duplications in processes, activities and efforts toward attaining the program goals

    7. reduce duplication of efforts and investments in material and human resource inputs and processes in the implementation of the program

    8. provide basis for decisions and policy toward sustenance and/or improvement to adapt to emerging needs of the program

    9. provide basis for feedback to all the stakeholders: policy makers, educators, teachers, et al.


  • Proposed Assessment Framework for K to 12 (National Level)

    Post Grade 12: University Admission Exam; COC-TESDA

    End of Stage 4 Assessment (Grade 12): National Basic Education Competency Assessment (NBECA); Senior High School Diploma, SY 2017-2018

    End of Stage 3 Assessment (Grade 10): National Achievement Test (NAT); Junior High School Certificate, SY 2015-2016

    National Career Assessment Examination (NCAE) (Grade 9)

    End of Stage 2 Assessment (Grade 6): National Achievement Test (NAT); Elementary School Certificate (will commence in SY 2017-2018)

    End of Stage 1 Assessment (Grade 3): National Achievement Test (NAT) (will commence in SY 2014-2015)

    Alternative Delivery System (ALS, via FLO*) and Formal Delivery System: Placement / Diagnostic Test Assessment - Philippine Educational Placement Test (PEPT) / Philippine Validating Test (PVT)

    Grade levels covered: Kindergarten, Grade 1, Grade 3, Grade 6, Grade 9, Grade 10, Grade 12

    *FLO is the acronym for Flexible Learning Options

  • Proposed Assessment Framework for K to 12 (School-Based Level)

    Post Grade 12: University Admission Exam; COC-TESDA

    End of Stage 1 Assessment (Grade 3): Early Grade Reading Assessment (EGRA) in Filipino & English and Early Grade Math Assessment (EGMA) (will commence in SY 2014-2015)

    Grade 1: Early Grade Reading Assessment (EGRA) in the Mother Tongue

    Kindergarten: School Readiness Yearend Assessment (SReYA) in the Mother Tongue

    Alternative Delivery System (ALS) and Formal Delivery System: Placement / Diagnostic Test Assessment - Philippine Educational Placement Test (PEPT) / Philippine Validating Test (PVT)

    Grade levels covered: Kindergarten, Grade 1, Grade 3, Grade 6, Grade 9, Grade 10, Grade 12

  • Test development flowchart:

    1. Planning the Test

    2. Developing the Table of Test Specifications

    3. Item Writing

    4. Test Assembly and Review of Test Items

    5. Pilot Testing or Try-Out of the Test (at least 2 forms of the final test)

    6. Item Analysis (useful items are retained; items that are not useful are rejected)

    7. Validity/Reliability

    8. Organize the Final Form of the Test

    9. Norming

    10. Preparation of the Test Manual

  • An example of a Table of Specifications for Grade III Science

    Behaviors (columns): (a) Knowledge of Specific Fact; (b) Understanding of Basic Concepts and Principles; (c) Observing and Describing Objects; (d) Comparing, Classifying Objects Based on Observable Characteristics; (e) Making Inferences from Observation

    Content             (a)   (b)    (c)    (d)    (e)   Total Items   Item Placement   Percent of Items
    People               2     3      3      3      2        13             1-13             32.5%
    Animals              2     3      3      3      3        14            14-27             35%
    Plants               2     3      3      3      2        13            28-40             32.5%
    Number of Items      6     9      9      9      7        40
    Percent of Items    15%  22.5%  22.5%  22.5%  17.5%     100%
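
    The bookkeeping behind such a table can be checked mechanically. The following is a minimal, illustrative sketch (not part of the source material; the counts are simply copied from the example above) that derives the row totals, item placement ranges, and percentages from the per-cell allocations.

```python
# Illustrative sketch: deriving the summary columns of a Table of Specifications
# from the per-cell item counts (values taken from the Grade III Science example).
behaviors = [
    "Knowledge of Specific Fact",
    "Understanding of Basic Concepts and Principles",
    "Observing and Describing Objects",
    "Comparing/Classifying Objects",
    "Making Inferences from Observation",
]
# Item counts per content area, in the same behavior order as above.
tos = {
    "People":  [2, 3, 3, 3, 2],
    "Animals": [2, 3, 3, 3, 3],
    "Plants":  [2, 3, 3, 3, 2],
}

grand_total = sum(sum(counts) for counts in tos.values())  # 40 items

# Row summaries: total items, item placement (running ranges), percent of items.
start = 1
for content, counts in tos.items():
    row_total = sum(counts)
    end = start + row_total - 1
    print(f"{content}: {row_total} items, placement {start}-{end}, "
          f"{row_total / grand_total:.1%}")
    start = end + 1

# Column summaries: number and percent of items per behavior.
for j, behavior in enumerate(behaviors):
    col_total = sum(counts[j] for counts in tos.values())
    print(f"{behavior}: {col_total} items ({col_total / grand_total:.1%})")
```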

  • Taxonomy of Thinking Skills (the slide also provides blank columns for Test # & Test Item # and % Rank of Use)

    1. Knowledge - Remember or recall facts, concepts, methods, procedures.

    2. Comprehension - Understand facts and principles; interpret charts, graphs, and verbal materials; translate verbal materials into mathematical formulas; estimate future consequences implied in the data.

    3. Application - Use material learned in an actual context; apply concepts/principles to new situations; solve mathematical problems; demonstrate correct usage of a method or process.

    4. Analysis - Understand the different elements and how they fit together; divide a whole into its component elements; distinguish facts from inferences; identify the organizational structure of a work (art, music, writing).

    5. Synthesis - Combine separate knowledge into one whole; put bits of information together; relate and integrate information.

    6. Evaluation - Judge the worth of an event, object, or idea against a given principle using definite criteria; express and defend a point of view.

  • ENGLISH

    Grammar/Language Skills

    1. Use verbs in the simple present tense

    2. Use reflexive pronouns

    Reading Comprehension Skills

    3. Predict outcomes based on the selection

    4. Sequence events in the selection

    5. Determine cause-and-effect relationships in a given selection

  • SCIENCE

    1. Differentiate physical from chemical changes/processes by giving examples

    2. Identify major parts of the circulatory system/and their functions

    3. Illustrate the interdependence of plants and animals for gases through the oxygen-carbon dioxide cycle

    4. Explain the effects of change in materials on health and the environment

    5. Describe characteristics of stars and how groups of stars are useful to people


  • 1. Congruency of the item with the competency/skill tested is of prime importance.

    2. The applicability of the variety of multiple-choice test types in item writing should always be considered.

    3. Textbooks are used as refreshers for concepts, principles, etc., but a good item is free from textbook jargon. Any item obviously taken from the book, even a nicely stated concept, example, or illustration, is categorized as a knowledge item.

    4. Cite the source of the text when lifting a paragraph or even capturing an essential part from a story.

    5. In choosing your selection material, pick topics which can impart great virtues or higher values at both the elementary and secondary levels. Values education is integrated into the curriculum.

    6. Adapt your vocabulary or the phrasing of the stem to the level of the group you will test.

    7. Avoid changing the names and places in an old item because this does not create a new item.

    8. Maximize the use of graphs, illustrations, and tables in the stem by developing two or three items out of each.

    9. Avoid unnecessary words/sentences in the stem. A simple and direct question is preferable.

    10. Formulate the distractors skillfully so that the options are homogeneous.

    11. Always avoid irrelevant clues to the correct answer such as:

    a. Repeating clue words found in both the stem and the options

    b. Making the correct answer the longest option (the one that fully explains the answer)

    12. When an item calls for the BEST answer, see to it that the wrong options are partially correct.

  • 1. Raw Scores

    The pupils' raw scores are transformed into mean raw scores. For instance, for standard-setting purposes, measures of central tendency are used in computing the NAT performance:

    Mean Raw Score (MRS). This refers to the average number of items correctly answered by pupils in each school structure using the formula:

    Mean Raw Score: X̄ = ΣX / N

    Where: X̄ = the mean (average) raw score

    ΣX = the summation of the individual scores

    N = the total number of examinees
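
    As a quick illustration (not taken from the slides; the scores below are made up), the MRS computation is just an arithmetic mean:

```python
# Minimal sketch of the Mean Raw Score (MRS): the average number of items
# answered correctly across all examinees.
def mean_raw_score(raw_scores):
    """MRS = (sum of individual raw scores) / (number of examinees)."""
    return sum(raw_scores) / len(raw_scores)

# Example: five examinees' raw scores on a 40-item test.
print(mean_raw_score([28, 31, 25, 30, 26]))  # 28.0
```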


  • 2. Mean Percentage Score (MPS)

    The pupils' raw scores are transformed into Mean Percentage Scores (MPS). The MPS indicates the ratio between the number of correctly answered items and the total number of items in a test.

    a. Subject Area MPS

    MPS = (Subject Mean Score Obtained / Subject Total Number of Items) x 100

    b. Aggregate or Total Score

    MPS = (Mean Aggregate or Total Test Score Obtained / NAT Total Number of Items, which is 200 for Grade 6) x 100
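
    A small sketch of the two MPS formulas (illustrative only; the figures follow the NAT Grade 6 case of 40-item subject tests and a 200-item total test):

```python
# Minimal sketch of the two MPS formulas described above.
def subject_area_mps(subject_mean_score, subject_total_items):
    """Subject Area MPS = (subject mean score obtained / subject total items) x 100."""
    return subject_mean_score / subject_total_items * 100

def total_mps(mean_total_score, total_items=200):
    """Aggregate MPS = (mean total test score obtained / NAT total items, 200 for G6) x 100."""
    return mean_total_score / total_items * 100

print(subject_area_mps(20, 40))  # 50.0 -> 20 of 40 items answered correctly
print(total_mps(120))            # 60.0 -> 6 of every 10 items answered correctly
```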


  • A 50 MPS in one subject area, for instance, would mean that an examinee correctly answered 20 of the 40 test items (NAT G6). Further, a 60 MPS in the total test would mean that an examinee correctly answered 6 out of 10 questions in the test.

    A mean percentage score of 75 in a 40-item test would mean that 30 items were answered correctly.

    The NAT scores are both Norm-Referenced, where performance is gauged against the average performance of a group (e.g., national, regional, and division levels), and Criterion-Referenced, where there is a prescribed competency level which should be attained.


  • 1. Normal Curve Distribution

    This is a statistical distribution wherein the top 16 percent of the continuum is considered high, the middle 68 percent average, and the bottom 16 percent low performance.
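
    Assuming the 16/68/16 split corresponds to scores more than one standard deviation above the mean, within one standard deviation of it, and more than one standard deviation below it (a property of the normal curve, not something spelled out on the slide), a rough classifier might look like this:

```python
# Hedged sketch: banding a score as High / Average / Low relative to its group,
# using the one-standard-deviation cutoffs implied by the 16/68/16 split.
from statistics import mean, pstdev

def normal_curve_band(score, group_scores):
    m, sd = mean(group_scores), pstdev(group_scores)
    if score >= m + sd:   # roughly the top 16 percent of a normal distribution
        return "High"
    if score <= m - sd:   # roughly the bottom 16 percent
        return "Low"
    return "Average"      # the middle 68 percent

scores = [28, 31, 25, 30, 26, 35, 22, 29]  # made-up raw scores
print(normal_curve_band(35, scores))       # High
```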


  • 2. The Mastery Level

    Percentage Descriptive Equivalent

    96% - 100% Mastered

    86% - 95% Closely Approximating Mastery

    66% - 85% Moving Towards Mastery

    35% - 65% Average

    16% - 34% Low

    5% - 15% Very Low

    0% - 4% Absolutely No Mastery


    NAT Standards
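
    A minimal sketch (hypothetical helper, not part of the NAT materials) that maps an MPS to the descriptive equivalents in the mastery-level table above:

```python
# Mastery-level lower bounds and labels, taken from the NAT standards table above.
MASTERY_LEVELS = [
    (96, "Mastered"),
    (86, "Closely Approximating Mastery"),
    (66, "Moving Towards Mastery"),
    (35, "Average"),
    (16, "Low"),
    (5,  "Very Low"),
    (0,  "Absolutely No Mastery"),
]

def mastery_level(mps):
    """Return the descriptive equivalent for a Mean Percentage Score (0-100)."""
    for lower_bound, label in MASTERY_LEVELS:
        if mps >= lower_bound:
            return label
    raise ValueError("MPS must be between 0 and 100")

print(mastery_level(50))  # Average
print(mastery_level(88))  # Closely Approximating Mastery
```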

  • 3. Quartile Distribution

    This is a distribution of scores into four equal parts: the 1st quarter, uppermost/superior (25%); the 2nd quarter, upper average (25%); the 3rd quarter, lower average (25%); and the 4th quarter, poor (25%).

    Quartile   Score Range   Descriptive Equivalent
    Q1         76 - 100      Superior
    Q2         51 - 75       Upper Average
    Q3         26 - 50       Lower Average
    Q4         0 - 25        Poor


  • 4. Proposed Standards of Achievement

    Standards of Achievement   Descriptive Equivalent
    90% - 100%                 Superior
    75% - 89%                  Meeting the Standard
    35% - 74%                  Below Standard
    0% - 34%                   Poor

    (DECS Order No. 46, s. 1983)

    75 is the passing score/cut-off score set by the Department of Education

    Based on the Mastery Levels of the NAT


  • NCAE Standards

    1. Percentile Rank (PR) and Standard Scores

    PR          Descriptive Rating
    99+         Excellent (E)
    98 - 99     Very High (VH)
    86 - 97     Above Average (AA)
    51 - 85     Average (A)
    15 - 50     Low Average (LA)
    3 - 14      Below Average (BA)
    1 - 2       Poor (P)
    0 - 0.99    Very Poor (VP)

    Standard Score   Descriptive Rating
    700 - 800        High
    400 - 600        Average
    200 - 300        Low

  • OIISSS* Level

    Levels of Preference for the Occupational Interest

    76% - 100% High Preference (HP)

    51% - 75% Moderate Preference (MP)

    26% - 50% Low Preference (LP)

    0% - 25% Very Low Preference (VLP)

    * Occupational Interest Inventory for Secondary School Students


  • The Normal Curve as the Point of Reference for the Standards in Assessment

    [Diagram: a normal curve with the following score scales aligned beneath it]

    a. Mastery Level (NAT): 0-4, 5-15, 16-34, 35-65, 66-85, 86-95, 96-100

    b. Quartile Distribution: 0-25 Poor; 26-50 Below Average; 51-75 Above Average; 76-100 Excellent/Superior

    c. Standards of Achievement: 0-34% Poor; 35-74% Below Average; 75-89% Meeting Standard; 90-100% Superior

    NCAE Percentile Rank (PR): 0-2, 3-14, 15-50, 51-85, 86-97, 98-99, 99+

  • DepEd Order No. 71, s. 2010

    DECS Order No. 46, s. 1983, Revised System of Rating and Reporting of Pupil Progress

    NETRC, The NAT Performance of Grade 6 Pupils Over the Years (2006-2012)

    Bloom, Benjamin (Ed.), et al., Taxonomy of Educational Objectives, David McKay Company, Inc., New York, 1956

  • Thank You!
