
Page 1

Field Notes
winter 2001

At times it seems that everything there is to say about testing and assessment in adult literacy has been said. By now, practitioners and administrators alike can cite the shortcomings of standardized tests using multiple-choice formats and are familiar with the inadequacy of grade levels as indicators of what adult learners know and are able to do. Yet, multiple-choice, pencil and paper tests continue to be used not only as placement instruments but as measures of learner gains and evidence of program success. Given current reporting requirements, their use is likely to increase, at least in the near future. From the perspective of programs, there seem to be few viable alternatives that would meet the information needs of funders interested in reliable data that indicate how a program is doing overall. Portfolio approaches, for example—considered the last great hope a few years back—have not quite matured to the level where they might be used as a means to report and aggregate learner gains by group (although they are invaluable as evidence of individual learner progress), largely because the field has not invested in the development of benchmarks and rubrics. Local approaches have remained just that, local approaches, primarily for two reasons: (1) there has not been enough field testing to establish the reliability of these measures and (2) there have not been sufficient efforts to implement alternative assessments across programs. It is easy to see how even programs that have been enthusiastic about developing an assessment system that captures worthwhile outcomes are becoming distressed about the prospects of an alternative system being able to rival the standardized tests currently in fashion.

Assessment and Accountability: A Modest Proposal
by Heide Spruck Wrigley

Continued on page 4

TABLE OF CONTENTS

Assessment and Accountability: A Modest Proposal, Heide Spruck Wrigley (cover)
Letter to the Editor, Bob Bickerton (p. 3)
The Uniform Portfolio System as a Standardized Assessment Tool, Jane J. Meyer (p. 8)
Involving Learners in Assessment, Justine Sadoff (p. 10)
Assessment and Learning Disability, Wallace M. Perkins (p. 11)
Why Do We Have to Do Assessment?, Shelley Bourgeois (p. 12)
Crosswalk—A Lesson in Comparison, Jeri Bayer (p. 13)
Active, Purposeful, and Contextual: Assessment in the EFF Classroom, Joan Benz (p. 14)
The Pareto Principle, Donna Curry (p. 15)
The TABE: Thoughts from an Inquiring Mind, Cathy Coleman (p. 16)
Assessment, Accountability, the National Reporting System: Who Is Driving the Bus? An Interview by Cathy Coleman (p. 20)
Ever Widening Circles: Involving Teachers in the Development of an Assessment System, Cynthia Gaede (p. 24)

formerly Bright Ideas

Page 2

Field Notes is a quarterly newsletter that provides a place to share innovative practices, new resources, and information within the field of adult basic education. It is published by SABES, the System for Adult Basic Education Support, and funded by Adult and Community Learning Services (ACLS), Massachusetts Department of Education.

The Central Resource Center of SABES is located at World Education, 44 Farnsworth Street, Boston, MA 02210.

Opinions expressed in Field Notes are those of the authors and not necessarily the opinions of SABES or its funders.

Permission is granted to reproduce portions of this newsletter; however, we request appropriate credit to the author and Field Notes.

Subscriptions are free to Massachusetts ABE practitioners. All others may subscribe for an annual fee of $8.00. To subscribe, contact Lenore Balliro, 44 Farnsworth Street, Boston, MA 02210. Make checks payable to World Education.

Submissions are welcome. If you have an idea for an article or wish to submit a letter to the editor, call Lenore Balliro at (617) 482-9485.

We do reserve the right to decline publication.

Guest Editor: Cathy Coleman
Layout: Marie Horchler
Subscriptions: Justine Sadoff

Advisory Board for 2000/2001: Bruce Dahlquist, Lee Haller, Marilyn Monteiro, Bonniebelle O’Neal, Susan Peltier, Nancy Tarriot

FOREWORD

How do you measure a sunset? Can you? Should you? These are the kinds of questions that leave me puzzled in this age of increased attention to measurable goals and deliverables. If I had the correct equipment, I could measure the length of the light waves. If I were an anthropologist, I could measure the way sunsets have been portrayed in artifacts over the years. If I were a psychologist, I could measure the effect of a sunset on a person’s rate of relaxation. But…would this really give me what I want? Would this truly measure the sunset? Would these measurements give me a true picture of the sunset?

No, I don’t think they would. I think, in the case of a sunset, the whole is greater than the sum of its parts, and I would venture to say that in some cases, we find ourselves in adult education in the challenging position of trying to measure the difficult to measure and answering the questions: How do we know what our students know? How do we know if we are doing a good job?

These are difficult questions, and this issue of Field Notes makes no claims of having the answer, but instead hopes to open up conversation and provide some stories of the various ways that people are trying to answer their questions. As guest editor, I have had the privilege to work with the authors here, who represent valuable voices from the field, both here in Massachusetts and around the country.

In this time of exploration in assessment, I’d like to propose that we keep in mind my favorite quote about assessment, which comes from former Massachusetts ABE practitioner Donna Curry: “Assessment is not something we do to our students or for our students, but rather with our students.” Perhaps if we keep this guiding principle in mind, we can create something that benefits us all.

Page 3

Letter to the Editor

I am writing to thank Field Notes editor Lenore Balliro and all the authors for the Fall 2000 issue (Vol. 10, No. 2) about reading. I truly enjoyed every article and learned a great deal as I worked my way through this very substantial issue. I was particularly struck by the vision and the care each author took to connect theory with the realities of our classes and our students. This distinguishes Field Notes from many other ABE publications, enabling it to serve as a useful stimulus and tool across our field.

I am writing with another purpose as well. While the careful reader can find many places where more structured approaches to teaching reading might apply, the almost exclusive focus of this issue is to highlight approaches that emphasize “making meaning” during the acquisition and practice of reading. I have no quarrel with such approaches—they are, in fact, why we care so much about reading in the first place. However, given the “holy wars” of the last decade between “whole language” and “decoding/phonics” based approaches to reading, I have become particularly well attuned to how these schools of thought do, and especially how they do not, work together. They do not work together well in this issue, unfortunately, since discussion and examples of structured approaches to reading are largely absent.

I believe we must base reading instruction on theoretical, research-based, and practical considerations. If the whole idea of reading is to convey and construct meaning, then we should never engage our students, particularly adult students, in approaches that minimize this dimension. The research is also clear that decoding skills are essential to fluency and essential to accessing the meaning we strive to acquire and build. As an adult educator who made ends meet for years doing carpentry work on the side (we all know about underpaid, part-time employment in adult education), I have a great deal of trouble understanding why some of my colleagues find contradictions rather than synergy in these two approaches. Carpenters show up at a job with all the tools they might need in their toolbox. I can’t imagine a carpenter leaving half her/his tools at home because someone might label them “unacceptable.”

We have many of the brightest, most creative, and most effective adult educators in the nation here in Massachusetts. I am finding, however, that these qualities can also present a challenge when it comes to letting go of polarizing theories of teaching and learning—what I refer to as the “holy war” between whole language and phonics is a great case in point. By the year 2000, almost everyone knows that with at least some audiences, you won’t get very far promoting one approach to the complete exclusion of the other. So often, today’s dialogue includes an acknowledgment of the “value” of both approaches while really promoting one to the exclusion of the other. The last issue of Field Notes promotes whole language with only token references to decoding/phonics. On the other hand, some of the reading courses developed by Massachusetts adult educators over the past few years promote decoding with only token references to making meaning/whole language. Every time one or the other side is faced with this dynamic, it only serves to reinforce the belief that they need to defend their “pole” of this unnecessary and unproductive dichotomy.

The Department of Education highly values the contributions of practitioners from these two schools of thought and will continue to encourage them to work together to build more comprehensive, effective, and complete approaches to teaching reading. Every adult educator teaching reading in Massachusetts needs to be equally adept at all proven approaches. In order to respond effectively to the abilities and the needs of each student, we will need to shift gears between and blend together these different approaches to teaching reading. Our students deserve nothing less.

Sincerely,
Bob Bickerton
MA Director of Adult Education


Page 4

Continued on page 5

All Is Not Lost

Yet, the picture is not as dim and grim as it might first appear. Indeed, it may be premature to give in to cynicism (“it’s all a sham and no one really cares”), paranoia (“next year, all funding will be tied to the results of standardized tests”), and paralysis (“in the end, no one will care about alternative assessment, so let’s just sit and wait to see what comes down the pike”). Since a Pollyanna attitude does not appear to be justified either, given recent legislation, perhaps it is time to take an existentialist perspective where we commit ourselves to forge ahead although (and even because) life in adult literacy does not always make sense, but what else are we going to do to stay sane? Let’s ask, then, if there is anything positive happening in assessment, and how we can help shape new directions on the national or state level, while continuing to strive for sane assessments within and across local programs.

The Federal Outcome Reporting System

You may have heard that the US Department of Education has mandated a uniform outcome-based reporting system that requires that all states send data for all programs funded under Adult Basic Education (ABE) to the Department of Education in Washington. Assessments for capturing outcomes must be “valid and reliable.” States (and the programs they fund) will be asked to report “learner gains” in reading, writing, speaking, and listening (and possibly additional skills related to workforce development) and show that learners are advancing across levels.

To understand the thinking behind the initiative, it is important to keep in mind that the primary focus is neither curriculum reform nor program improvement, but rather an accountability measure to bring adult literacy in line with the requirements of GPRA—the Government Performance and Results Act. GPRA requires that all federal agencies show that they, as well as the agencies and programs they fund, are achieving results or else risk loss of funding. Funders will want to know how a program is doing overall (i.e., whether it is positively affecting literacy skills), and they expect to see numbers in aggregate form. While in many ways documenting the kinds of outcomes required by the new reporting system is “doable,” two dangers loom as programs try to show gains and as results are increasingly tied to funding. There is a risk that programs will (1) be tempted to manipulate assessment results in their favor and (2) succumb to a practice known as “creaming.”

The Dangers of Creaming

It is an unfortunate fact of adult literacy that programs that help those “hardest to serve” (e.g., learners who are both new to English and new to literacy) have the greatest difficulties showing gains, not only because their learners need a great deal of time until progress is evident, but because the kind of progress they are making is not easily captured by standardized multiple-choice, paper and pencil tests. In addition, programs that serve these students don’t have the resources to set up testing alternatives appropriate for a low literacy population. There is a danger, then, that programs not fully committed to serving learners who need both special support and extended time will decide to focus their efforts instead on those students who most easily advance, since the incremental progress of “slower” students only makes the program “look bad.” Thinking along those lines, ESOL programs, for example, might decide to focus the curriculum on immigrants with higher levels of education, rather than serving ESOL literacy students. This process of focusing on participants who are easy to serve is known as “creaming” and has long been decried as an unintended outcome of programs that have signed performance-based contracts, where funds are linked to learner outcomes and program impacts, such as job placement.

Assessment and Accountability
Continued from page 1

How can we help shape new directions on the national or state level?

Page 5

So far, not many public debates have taken place around this issue in adult literacy on the state level, but concerns are sure to arise as programs realize the difficulties they face in reporting progress across levels in the time periods envisioned by the reporting system.

So Why Not Ask for an Exemption?

Two solutions to the problem of creaming seem possible: (1) set aside monies so programs can develop an alternative assessment for lower level students, or (2) ask that learners who have difficulty negotiating paper and pencil tests be exempted from testing. In my view, exemptions, as attractive as they may seem, are not the best solution in the long run, since we may end up marginalizing both this group and the programs that serve them. I believe that, rather than asking for exemptions for students who cannot cope with the standardized tests approved by a state, we are better off advocating for the development of an alternative assessment framework for this group. Once such an assessment is developed for one group, it is easier to acquire the resources to extend it to other levels and other populations.

Alternative Testing for Low Literacy Students

What might an assessment that measures the incremental changes occurring at the initial levels of language and literacy development look like? It is entirely possible to design a framework allowing learners to demonstrate what they can say and understand in English despite limited proficiency (in fact, the oral interview component of the BEST test does just that). It is also possible to design a “can-do” literacy assessment based on the kinds of texts and tasks that those new to literacy deal with every day.

If a program wants to create an assessment that works double duty (as a basis for program improvement and for accountability), a further step is necessary: the development of scales, rubrics, and benchmarks indicating the expectations for any given level and the degree to which learners are close to acquiring the kinds of knowledge, skills, and strategies that are a core part of our curriculum.

As funding for adult literacy increases, the old refrain of “there is no money to do this” no longer holds true. There are alternatives to multiple-choice tests, and we must advocate for their development and use if we are serious about documenting progress for all learners, including those who still struggle with basic literacy.

Building an Assessment Framework That Yields Worthwhile Results

Developing an assessment that captures gains at the lower levels is only the starting point in a larger effort to build a system that works. Other efforts are needed, at both the local and the state levels, so that we don’t end up with an accountability system driven in large part by what current standardized tests are able to measure. If we want the quality of adult literacy to increase, we need an approach that measures to what extent learners are acquiring the knowledge, skills, and strategies that matter in the long run. How can this be done? At the local level, a three-pronged approach might be necessary: (1) finding a way to live with the currently available standardized tests, selecting the “LOT”—the least objectionable test—and keeping in mind the principle of “first, do no harm” to students; (2) convincing the state that the data a program has provided over the years are at least as valid and reliable as standardized tests such as the TABE, and that the process should therefore continue; and (3) working with others to develop an assessment system that reflects the realities of adult learners’ lives and focuses on what participating programs have deemed to be the core sets of knowledge, skills, and strategies important enough to teach and test.

Components of an Alternative Assessment System

What might be the components of such a system? To start with, any program concerned about serving different groups of learners equally well needs to collect demographic information capturing the kinds of learner characteristics and experiences that may have a bearing on school success. After all, only by having rich descriptive information can we know what learners want and need to do with English and literacy, how much schooling they have had, and what the print and communication challenges are that they face in their everyday lives. Having descriptive information of this kind is

Continued on page 6

Assessment and Accountability
Continued from page 4

Page 6

invaluable, since it allows us to see which learners are succeeding in our programs, and which are languishing (or leaving) because their needs are not met. This information can be collected in the form of profiles that travel with the student and to which teachers and learners contribute on an ongoing basis. In addition to background variables such as age, employment status, years of schooling, country of origin, and languages spoken, these profiles can: (1) capture current literacy practices; (2) chart shifts in learner goals; and (3) record changes in life circumstances.

In these profiles, progress can be captured as it occurs. Profiles have the added advantage of encouraging teachers to create opportunities for learners to discuss what is happening in their lives, so teachers can spend some time observing. Profiles of this sort (also known as “running records”) can be connected with portfolios that demonstrate student progress through writing samples, reading inventories, and various types of performance tasks. If a standardized test is used, results can be included in the profile as well, helping to flesh out the general picture of achievements and struggles.

From Learner Success to Accountability

This must be said: while an approach that combines rich profiles and individual portfolios will produce important information on individual students and provide insights into the relative success of certain learner groups, it does not, in and of itself, yield the kind of data needed for accountability. After all, we cannot ship boxes of profile folders to funders to have them realize what a great job we are doing. To make profiles work for funders, a further step is needed, one that yields data in aggregate form so that policymakers can get a picture of the shape and size of the forest, not just a close-up of the trees.

To measure progress and report to funders, profiles need to include a broad set of language and literacy tasks accompanied by rubrics, scales, and benchmarks for transition. Rubrics are used to indicate what the expectations are for any given area (face-to-face communication, dealing with print, accessing resources, etc.) and what evidence of success might look like. The scales that accompany the rubrics allow us to document where learners fall on a continuum of proficiency, documenting what they can do with relative ease, where they succeed with some help, and where they are struggling. Since rubrics and scales can be designed for different skill domains (SCANS skills, communication strategies, navigating systems, etc.) and for various contexts (school, family, community), they can easily be matched to the goals of learners and adapted to the focus of a particular program.

Once rubrics and scales are in place, meeting accountability requirements calling for aggregate data becomes relatively easy. Since the descriptors on a scale can easily be numbered (from 1 for “struggles” to 6 for “no problem”), assessment results can readily be compiled, summarized, analyzed, and reported out. If matched with demographic profiles, they allow a program to see which groups of learners are being served well by the program and where program changes are in order because success is lacking.

The beauty is that this kind of approach fulfills the same function as standardized tests: learners are assessed on a variety of skills under standard conditions with common instruments on similar tasks. But unlike the standardized tests currently available, profile assessments do not rely on multiple-choice, paper and pencil items. Rather, they give learners the opportunity to demonstrate what they can do with language and literacy through more open-ended assignments. Furthermore, profile approaches to assessment can be adapted for certain learner groups and modified to match the focus of a particular program (e.g., workplace, family literacy, citizenship). Most importantly, they provide rich information that makes sense to teachers and

Continued on page 7

Assessment and Accountability
Continued from page 5

We can also work toward a system that measures effectiveness where it counts...

Page 7

learners, information that is useful to programs, not just funders.

Why, then, are we not seeing more of these kinds of assessments? While extremely worthwhile and high in validity, these types of assessment carry a significant burden: they require consensus building on what is worth teaching and learning, and a common understanding of what the evidence of success might look like for any given skill domain. To be successful, profiles and portfolios have to be integrated into the curriculum, and ongoing assessment must either be part of the day-to-day teaching we do, or time must be set aside at intake to establish a baseline and toward the end of a teaching cycle to document progress. If that means the end of open-entry/open-exit as we know it, and if it forces us into shorter instructional cycles that have a clear teaching/learning focus, so be it.

To give such a framework a chance, a significant amount of teacher orientation, training, and buy-in will be needed. Clearly, there are not many adult literacy programs that have the commitment, energy, and resources to embark on that endeavor, although some, like the Arlington Education and Employment Program in Virginia, are well on their way. But, given sufficient advocacy from local programs, along with a modicum of political will on the part of state directors and other funders, working groups and consortia could be set up to develop an assessment framework that, if not based on profiles, at least includes them. In fact, the National Institute for Literacy is moving in that direction, developing an assessment framework that combines the use of alternative assessments with standardized tests where appropriate, in order to capture the gains made by learners who are part of the Equipped for the Future initiative.

What, then, is the bottom line, given the current climate of accountability for accountability’s sake? We have several options: we can decide that cynicism is the only sane response to the current requirements, live with standardized tests as best we can, try to lie low, figuring “this too shall pass,” or commit ourselves to fighting for a saner system for our own sake and that of our students. On the local level, we must be prepared to work with others to decide on the focus of our programs, and be willing to map out a core set of knowledge, skills, and strategies that matter. At the federal level, we must push for an accountability system that is driven not by what the current standardized tests are able to assess (which is rather limited), but by outcomes that reflect what sound adult literacy programs should be all about. Furthermore, if we are asked to show accountability related to outcomes and impacts, we must be given the resources to document success in meaningful ways. Finally, while we may need to play the accountability game for the time being, we can also work toward a system that measures effectiveness where it counts: adult learners acquiring the kinds of knowledge, skills, and strategies that are important to them now and that matter in the long run. If we give up too soon, we will only marginalize adult literacy further.

“Not everything that counts can be counted, and not everything that can be counted counts.”
Albert Einstein, Physicist

“There is no one giant step that does it. It’s a lot of little steps.”
Peter A. Cohen, Banker

“Without a clear vision of what is important, measurement can become a sterilized exercise to come up with numbers to satisfy external agencies. What is counted usually becomes what counts.”
Juliet Merrifield, Researcher/Adult Educator

Heide Spruck Wrigley is a Senior Research Associate at Aguirre International in San Mateo, CA. She can be reached by email at [email protected]. This article was excerpted from Adventures in Assessment, Volume II (Winter 1998), SABES/World Education, Boston, MA, Copyright 1999. A complete version is available on the Web at <http://SABES.org/aia111.htm>.


Field Notes · SABES · winter 2001

The Uniform Portfolio System as a Standardized Assessment Tool
by Jane J. Meyer

The National Reporting System (NRS) has had the healthy effect of surfacing many questions about assessment and accountability in Adult Basic Education (ABE). The required use of standardized assessments both for placement and as measures of progress has caused concern at both the state and program level. Typical standardized tests simply do not capture all the positive outcomes gained from adult literacy instruction.

Reality shows us that even if these tests could measure everything we want to document, many adult students are not available for post-testing. These issues prompted adult educators in Ohio to explore types of standardized assessment beyond the standardized tests. This led to the development of Ohio's Uniform Portfolio System.

Ohio's Uniform Portfolio System (UPS) strives to integrate the positive aspects of portfolio assessment into a standardized assessment system through the use of uniform competency checklists. Working together, consultants from the Ohio ABE office, researchers, and practitioners created reading, writing, math, and ESOL checklists for each of the six NRS levels, based on the NRS educational functioning level descriptors. Students are placed into an NRS level for reporting purposes based on scores from a standardized test, but progress is measured both by a standardized post-test and by the percentage of items mastered on the competency checklists.

Although the actual portfolio may look different in various programs around the state, or even in different classrooms within a program, it is uniform in that it must contain student goals, an individual learning plan, and one or more of the competency checklists (depending on the student's goals). As students master competencies from the lists, they check them off and include documentation showing mastery (such as a test or work sample) in the portfolio. Teachers and students have freedom to decide how they will demonstrate mastery, but the items on the checklists are the same across the state, which is what makes the system standardized.

Concerns were raised early on that the checklists not become a mandated curriculum. Because success is measured by how many competencies on the list are checked off, care was taken to make sure the competencies are general enough to be taught in the context of a variety of individual student needs and interests. For example, one of the items on the level 4 reading log says "Draw conclusions based on details in the text." It is easy to see how you can vary the text to meet the student's needs and interests and yet still meet the checkoff. One student might be reading a parent information sheet from her child's preschool and drawing conclusions about what she needs to do at home to help her child, while another student might be reading an article on job hunting and drawing conclusions about the steps he needs to take to get the job he wants.

All teachers in the state are required to review the uniform portfolio with each student a minimum of every three months. At this review, each student's NRS level is also reassessed in one of two ways. Students who have completed enough hours to make standardized testing appropriate and meaningful take a post-test. For students who haven't completed enough hours for meaningful post-testing, levels are determined based on the percentage of items mastered on the checklists. The state has designated that when 75% of the items on the checklist are mastered, the student has progressed to the next level.
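The quarterly review rule above amounts to a simple branch. As a minimal sketch only (the function name and data shapes are invented for illustration; the 75% threshold and the "post-test when hours permit" preference come from the article):

```python
# Illustrative sketch of the UPS quarterly reassessment rule.
# Function name and arguments are hypothetical, not Ohio's actual system.

MASTERY_THRESHOLD = 0.75  # 75% of checklist items mastered -> next level

def reassess_level(current_level, mastered, total_items,
                   has_enough_hours, post_test_level=None):
    """Return a student's NRS level at a portfolio review."""
    if has_enough_hours and post_test_level is not None:
        # Enough instructional hours: the standardized post-test decides.
        return post_test_level
    # Otherwise fall back on the checklist mastery percentage.
    if total_items and mastered / total_items >= MASTERY_THRESHOLD:
        return current_level + 1
    return current_level

print(reassess_level(3, 18, 24, has_enough_hours=False))  # 18/24 = 75% -> 4
print(reassess_level(3, 12, 24, has_enough_hours=False))  # 50% -> stays 3
```

Either outcome is recorded at the review, which is what later lets the last recorded level stand in for a missing post-test.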

This system reduces the number of students who cannot be counted because they do not have a standardized post assessment (a requirement of the NRS), without the problem of giving standardized tests too often in order to be sure students don't leave before post-testing. Because the competencies on the checklists and the percent mastered for completion of a level


are standardized across the state, the level determined at the last portfolio review can be used as a standardized post assessment if the student leaves without a post-test.

The UPS supports research on best practices in assessment because it allows and encourages the use of multiple measures drawing on a variety of assessment tools. Use of more than one assessment to measure performance verifies the reliability of the assessment. Anxiety from former negative school experiences can affect the performance of many adult literacy students in traditional testing situations, but these same students may be able to demonstrate competence when measured with other forms of assessment. A standardized test usually shows a snapshot of what the student knows at one particular moment in time. A truer picture emerges by assessing skills over a period of time with several different measures. For example, although a standardized test alone is allowable to measure a student's math level, a better demonstration of what the student knows and can do with math could be documented by a portfolio including an end-of-chapter test on fractions, percents, and decimals; a personal budget illustrated with a pie graph; a week's menu with nutritional analysis; and a comparison of the cost of using credit cards, renting versus owning, or paying cash for appliances.

Portfolios can document attainment of a broader range of skills, and use of skills in context, than can be measured on a multiple-choice test of decontextualized skills. They link assessment closely with instruction and engage adult learners as active partners in the assessment process. This allows teachers to plan instruction and assessment around issues of interest and importance to students. For example, a student interested in a political election could demonstrate a writing competency by writing an essay comparing two candidates, while a young mother could compare and contrast two preschool programs she is considering for her child.

Implementing the UPS in time to meet the NRS deadline has produced several challenges. The portfolio checklists, which are the core of Ohio's system, need to be validated. Current use in the field will show whether the checklists measure what they are supposed to measure, that is, at what NRS level the student is functioning. Teachers are comparing the performance of students at NRS levels as identified by CASAS, TABE, and AMES with the items on the Ohio UPS checklists and with the diagnostic profiles of the standardized tests. Plans are already underway to adjust the checklists.

Reliability, or consistency of measure, is another issue that must be addressed. The UPS will only really become standardized when teachers around the state understand and agree on what each item on the checklist means and what its standard of mastery will be. For example, an item from the level 2 reading checklist states that the student will "interpret abbreviations commonly used in documents." As a field, Ohio teachers need to decide which abbreviations those are so that the criteria for mastery are uniform throughout the state. This does not mean that we intend to create Ohio standardized tests to measure each of the competencies, but that we need to agree on exactly what it is we are measuring and what mastery will look like, although students may demonstrate it in different ways.

A third challenge is staff training, to ensure that all teachers are using the UPS in a standardized fashion for external accountability purposes and to help teachers develop effective techniques in using portfolio assessment to meet the needs of students and staff. Staff development began last spring with videotapes produced by the state ABE office, which were required viewing for all ABE teachers in Ohio. The state is following up with a series of portfolio workshops for staff development. Local programs are encouraged to study the portfolio process and the UPS checklists and submit ideas for improvement.

Although Ohio's Uniform Portfolio System is still in the developmental stage, the idea shows great promise. Continued work to ensure validity and reliability, coupled with ongoing staff training on use of the system and alternative assessment techniques, will ensure that Ohio develops a system built on sound assessment strategies that meets the need for external accountability for the NRS and, at the same time, provides useful information for students and teachers to drive the continuous cycle of assessing, planning, and teaching/learning.

Jane J. Meyer is Coordinator of the Canton City Schools Adult Basic Literacy Education (ABLE) Program in Canton, Ohio. She can be reached by email at [email protected].



Involving Learners in Assessment
A Summary of Nicole Graves' Practitioner Research, by Justine Sadoff

“What do I learn about student learning when I use a joint process of assessment which utilizes different tools?”

This was the question Nicole Graves posed to both students and teachers at her community-based, nonprofit agency, which provides English to Speakers of Other Languages (ESOL), computer, and other classes to a rural section of western Massachusetts. In order to answer the question, Graves created a unique approach to assessment that collectively utilized more traditional tools, such as teacher-driven observation checklists and progress reports, as well as innovative tools, such as learners' logs and learner self-evaluations. By incorporating student feedback and writings into the assessment process, Graves obtained more accurate and satisfying results.

The initiative for this experiment came from a number of avenues. Already in place was a teacher-driven project to improve upon the existing curriculum, and assessment was seen as part of this process. Funders for Graves' agency had also made some changes in their requirements, and these changes were connected to the state ESOL Curriculum Framework. Thus Graves had an impetus both to enhance her agency's curriculum and to meet the new requirements that the state and her funders were putting into place. Her first action was to turn this project into her practitioner research for the National Center for the Study of Adult Learning and Literacy (NCSALL), and the groundwork she laid out included re-examining old progress reports, and then rearranging these reports to match the state ESOL Framework. She also cross-referenced the reports with Mainstream English Language Training (MELTS) levels, Comprehensive Adult Student Assessment System (CASAS) assessment instruments, The Secretary's Commission on Achieving Necessary Skills (SCANS) 2000 reference levels, and Equipped for the Future. The end result of this groundwork gave Graves a clearer picture of some themes common to all of them, which she could then apply to her own program. And the first step in this application process was creating the above question.

Although beginning learners were not able to evaluate their writing logs or probe deeply into assessing their own learning process, they were nonetheless able to evaluate their class on a weekly basis, and their teacher at the end of the session. Intermediate learners, however, had an opportunity to delve into their own writing and examine how their learning had progressed over the length of the class. Graves then took the students' progress reports and the class, teacher, and self-evaluations and recorded her findings, along with the responses to a survey she had given out during the final week of class. The culmination of these assessment tools pointed toward the learners' own sense of personal and social improvement. For example, many of the students reported being able to engage in activities they had not been able to do before taking classes and doing personal assessment, such as going to the bank by themselves, going to the doctor's alone, shopping by themselves, answering the telephone, and expressing themselves in general. Such increased independence likewise brought about feelings of heightened confidence and self-worth, and this, in turn, increased the learners' sense of connection with American culture at large. For instance, one student wrote that she could now “hear American voices.”

Such responses, of course, are positive on many levels. For one thing, these findings correlate with the standards of progress found in the Curriculum Frameworks and the Department of Education (DOE) strands. Furthermore, there is agreement between these progress reports and the Tennessee Longitudinal Study and the NCSALL study, which tested for improvements in socio-economic well-being (jobs, income, survival), social well-being (family and community), personal well-being, and physical well-being. Likewise, the broad categories of the ESOL


Continued on page 25



Continued on page 26

Assessment and Learning Disability: The LD Student in Quinsigamond's GED Program
by Wallace M. Perkins

People, young or old, who return for their GED after being out of school for a time are going to have difficulty because of the skills they have lost over the years, or perhaps never developed. However, those who have learning disabilities are going to struggle even more. Many of them left school because they were not successful due to their disabilities, or because of teachers who could not meet their needs. Now, in the GED program, it is imperative that instructors use the right strategies to keep these individuals motivated and acquiring the skills necessary for productivity in the workplace and in life.

In the Quinsigamond program, once the student has self-identified, she/he is referred for an evaluation. Quinsigamond employs a retired school psychologist who worked for over 20 years in the elementary school system.

The evaluation of the students involves testing in two areas. First, the WAIS-III, an intelligence test, is given. The results indicate what a person's intelligence is relative to academics. This test is essentially divided into two areas: Verbal and Performance (visual-motor). These two categories include four subdivisions: the Verbal Comprehension, Perceptual Organization, Freedom from Distractibility, and Processing Speed Indexes. Second, from the identified weaknesses on the WAIS-III, further testing usually occurs with the Woodcock-Johnson Psycho-Educational Battery. This test detects specific learning disabilities in areas such as auditory processing, attention, language, memory, visual perception, and visual-motor skills. The Woodcock-Johnson is an excellent test for determining a person's cognitive strengths and weaknesses as well as for checking levels in reading, writing, and arithmetic. If there are still questions or concerns in the evaluator's mind, parts of other tests, such as the Wechsler Memory Scale-Third Edition or the Detroit Tests of Learning Aptitude, are given to support or discredit the findings on the WAIS-III and Woodcock-Johnson.

When an individual is having difficulty in reading, writing, and/or mathematics, it is most likely that she/he has a weakness in at least one of the cognitive skills that is needed for success in the corresponding area. The Woodcock-Johnson can be helpful here. It offers four subtests that involve different skills to determine where a student's deficiency in reading lies. Memory for Sentences shows if one has difficulty with short-term auditory memory. Vocabulary is important for reading, and Oral Vocabulary is given to show how well one knows definitions by asking for synonyms and antonyms of words. Sound Blending is another subtest that checks a person's phonetic ability by requiring him/her to combine sounds to make words. Finally, visual perception, including speed, is tested in an activity that repeatedly requires finding the two numbers that are the same in an array of six. Similarly, for each of the other cognitive areas there are subtests that can focus on specific abilities that may be causing problems.

However, testing is just one of the early steps in the program to help the learning disabled student in the classroom. The results of the evaluation are discussed at a meeting that includes the learning support specialist, the teacher, the student, and the school psychologist. At that time, recommendations are presented for the teacher and the student. If the teacher can be encouraged to include one or two of these recommendations in her teaching style, not only the learning disabled student but the entire class will benefit. Examples of common suggestions include the presentation of new or important information at the beginning of each class or just after the break; the teaching of mnemonic devices for assisting students to memorize important facts; the teaching of specific procedures in solving problems; or the encouraging of students to explain the concepts that are being taught in class and to employ them in solving a specific problem.

In addition to teachers adjusting their programs to meet the needs of the LD student, at Quinsigamond



Why Do We Have to Do Assessment?
by Shelley Bourgeois

Why do we have to make our students take standardized assessment tests? We know what they need to learn, and we know how to teach. It's just extra work, and the students don't want to take the tests anyway. Many teachers and program directors see this as just one more thing we have to do.

There will never be one standardized test that can measure all that our students have learned, but having more than one way to measure progress is in the best interest of students, teachers, and programs. In-house assessments, portfolio assessments, and curriculum reviews are among the many other ways of measuring progress.

Developing alternative assessments is not an easy task, and the teachers here at the Jackson Mann Community Center have not wanted to take this on either, but we have had the benefit of being involved in a national project called the "What Works Literacy Partnership," sponsored by Literacy Partners Inc. in New York and funded by the Lila Wallace-Reader's Digest Fund. Through this project, 12 programs from around the country get together on a regular basis to learn about assessment and program evaluation and how we can use them to benefit our programs.

We at Jackson Mann are learning that we can look at assessment as just another thing we have to do, or as something we can learn from. It does take time; it does cost money; it does take training; and we have learned that we have a lot to learn. The biggest thing we've learned is that we can treat data collection as something that goes into a black hole that we give to the state because we have to, or we can use it to make our program better.

It all starts with asking questions. What do we want to know about our programs? One teacher gave an excellent example of this. She noticed that in the program there seemed to be a certain group of students who were not making progress. These were ESOL students who had reached an oral proficiency level in English but were not making progress in reading or in writing. We wondered if there were other students in the program who fell into this category. Through looking at data we had obtained from the BEST test, we realized that the teacher's instincts were correct: these were not the only students in that situation. The data backed up her intuition. Since this was a new funding year, we were able to create a class for these students.

Through this process, we have discovered that we have mixed feelings about standardized tests. The scores from standardized tests, while helpful, do not always reflect what skills a student really has. This is one of the reasons we resist using them. Few adults function in the world and come back with a GLE (grade level equivalent) of 1.3 on the TABE. There are many reasons why our students may not do well on a standardized test. They have never even seen one before. They are often nervous. Sometimes the proctor is not skilled in giving the test. However, even given all this, we know that our students need to learn how to take these kinds of tests, because they may need to take them to get into further training or higher education.

We have recognized that assessment of many different kinds, including tests, can be helpful to us as a program. There are many programs in Massachusetts and around the country that have developed wonderful assessment measures and have found that using them can only make their programs better. It's only a change in perception.

Shelley Bourgeois is a Teacher and the Director of the Jackson Mann Community Center and participated in the What Works Literacy Partnership.



Crosswalk—A Lesson in Comparison
by Jeri Bayer

Until recently, a crosswalk was something I instructed my children to be sure to use when going from one side of a busy street to the other. Traffic laws assure me that pedestrians poised at a curb before the painted bar on the pavement are guaranteed safe crossing.

Now, it seems, crosswalk has gone the way of eyeball and impact: it has become a verb, and a transitive one at that. Recently, ABE and adult ESOL practitioners needed to crosswalk the results of their students' assessments. Simply put, they needed to compare the results of their own assessments with the proficiency levels described by the National Reporting System (NRS). The comparison is the crosswalk, and by using it one can demonstrate compliance with federal regulations while at the same time evaluating learners' strengths and weaknesses in meaningful ways.

For both ABE and ESOL, the NRS describes six "educational functioning levels" in three categories. For ABE those categories are Basic Reading and Writing; Numeracy; and Functional and Workplace Skills. For ESOL, Speaking and Listening replace Numeracy. A chart describes each level in terms of a learner's general abilities and weaknesses. The levels range from Beginning ABE Literacy to High Adult Secondary Education, and from Beginning ESL Literacy to High Advanced ESL. For each level the chart also lists correlating scores for several standardized tests.
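In practice, the chart's correlating scores work like a range lookup from a test score to a level name. A minimal sketch, with one caveat: the level names follow the NRS range described above, but the score bands below are invented placeholders, not the real chart's values for any actual test:

```python
# Sketch of a "crosswalk" lookup from a standardized test score to an
# NRS educational functioning level. Score bands are hypothetical.
NRS_ABE_LEVELS = [
    # (level name, inclusive low score, inclusive high score)
    ("Beginning ABE Literacy", 0, 200),
    ("Beginning Basic Education", 201, 210),
    ("Low Intermediate Basic Education", 211, 220),
    ("High Intermediate Basic Education", 221, 235),
    ("Low Adult Secondary Education", 236, 245),
    ("High Adult Secondary Education", 246, 300),
]

def crosswalk(score):
    """Return the NRS level whose score band contains `score`."""
    for name, low, high in NRS_ABE_LEVELS:
        if low <= score <= high:
            return name
    raise ValueError(f"score {score} is outside all defined bands")

print(crosswalk(215))  # -> Low Intermediate Basic Education
```

A program's own assessment results can be reported against the NRS by running each learner's score through a lookup like this, using the bands the NRS chart publishes for that test.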

But assessment and the NRS don't have a copyright on crosswalks. In fact, the term has become useful with regard to a number of teaching elements. Recently, for example, Central SABES offered a series of workshops that crosswalked the Massachusetts Curriculum Frameworks to the national standards initiative, Equipped for the Future. Participants engaged in identifying and comparing the common ground between the two, as well as what is unique to each.

The next step in the assessment crosswalk adventure will be undertaken by the working group that the DOE will convene in January 2001. For 18 months the practitioners and other stakeholders in the group will thoroughly explore a range of standardized procedures that appropriately crosswalk to the NRS proficiency levels. The crosswalks of the individual programs in recent months were a preliminary stroll. Now ABE in Massachusetts stands poised on the curb of a busier street, determined to traverse safely to the other side with an assessment that enables funders to justify their investment, programs to continuously improve their services, and learners to observe their progress in a meaningful way. Controlling the traffic is the crosswalk talk. The time has come to walk the walk and talk the talk, the crosswalk talk.

Jeri Bayer is the Curriculum Coordinator for Northeast SABES. She can be reached by email at [email protected].

ABE Teacher's Certification Update

Substantial progress has been made toward the ABE teacher's certificate. Following submission of the Second Interim Report to the Commissioner, feedback was collected from the field, resulting in further changes to the statewide Advisory Committee recommendations. These recommendations are now being reviewed by Deputy Commissioner Sandra Stotsky, who appears to be largely supportive. Regulations and Guidelines are being drafted for review by the Commissioner and the Board of Education. With luck, the Regulations and Guidelines will be out for public review early in 2001. Meanwhile, the DOE is conducting a Needs Survey to determine the number of teachers who might pursue certification and the kinds of support they will need. In addition, two new "pilot courses" (45 PDPs, a stipend for completion, and extensive course evaluation) are in the works for Spring 2001 presentation: ESOL I and Curriculum & Methods. All referenced documents, including blank Needs Surveys, are online at www.sabes.org and www.doe.mass.edu/acls. Questions? Call Carey Reid (SABES), (617) 492-9485, or Mary Jayne Fay (DOE), (781) 338-3854.


Continued on page 26

Active, Purposeful, and Contextual: Assessment in the EFF Classroom
by Joan Benz

I am an instructor of adult basic skills and GED at the Bethel Family Learning Center in Eugene, Oregon. Over the last two years I have been involved with teaching and assessing using the Equipped for the Future (EFF) framework. EFF, an initiative of the National Institute for Literacy, was developed to answer the complex question: What do adults need to know and be able to do in order to carry out their roles and responsibilities as workers, parents and family members, and citizens and community members? (NIFL, Equipped for the Future Content Standards, 2000). EFF standards have been identified through a careful research process that began with adult learners and has included administrators, practitioners, tutors, and policy makers, as well as experts from adult education, literacy, workforce development, and other stakeholder systems. The 16 EFF standards represent the core skills needed for effective adult performance in the three major adult roles in today's rapidly changing world and are a new definition of literacy for the 21st century. The EFF standards framework includes:

• Four purposes of learning defined by adult students: Access to Information, Voice, Independent Action, and Building a Bridge to the Future.

• Role maps that define what effective adults need to know and do to carry out their responsibilities. The three role maps are worker, parent/family member, and citizen/community member.

• Common activities that cross all three roles.

• 16 standards that support effective performance in the three roles to achieve the four purposes.

EFF is very exciting to use in class because it is based on input from adult learners and is therefore very meaningful to my students. Learning in an EFF classroom is active, purposeful, and contextual. Students are very much in control of their own learning.

Classroom activities are developed around performance tasks. Performance tasks are real-life activities that allow students to demonstrate performance of one or more of the EFF standards. An example of a performance task would be a group of activities that students would do to be able to convey ideas in writing (an EFF standard) in a letter to their child's teacher. Learning activities would address components of performance, or skills needed to be able to use the standard for a meaningful purpose. Here are the components of the standard Convey Ideas in Writing (NIFL, 2000):

• Determine the purpose for communicating.

• Organize and present information to serve the purpose, context, and audience.

• Pay attention to conventions of English language usage, including grammar, spelling, and sentence structure, to minimize barriers to the reader's comprehension.

• Seek feedback and revise to enhance the effectiveness of the communication.

A well-structured performance task will address all of these components.

There is still a point to consider: How do I know that my students are learning? A paper and pencil test will not capture what these students know and are able to do. The EFF framework addresses assessment as movement along a continuum of learning. As people learn, they increase their knowledge, fluency, independence, and range in using a skill. EFF refers to these as the four dimensions of performance. Each dimension helps describe not only what people know, but also how well they can use what they know (NIFL, 2000).

This is where, for me, instruction and assessment combine. Using the four dimensions of performance allows me to think about what skills are needed to perform the task and to look at where student skill levels stand in relation to these dimensions before and after the task.

Active, Purposeful, and Contextual:Assessment in the EFF Classroomby Joan Benz


Field Notes · winter 2001

The Pareto Principle: Plotting Learner’s Growth

Vilfredo Pareto was an Italian sociologist who suggested that “80% of all wealth in this country is owned by 20% of the people.” (In our country today, the percentage is closer to 90%/10%.) This supposition was further developed by business and industry leaders, who found that most quality problems were confined to a small number of machines or workers. In other words, “80% of problems come from 20% of the equipment or workforce.”

The Pareto Principle is used by business and industry to work to continually improve quality, whether it be a product or a service. Quality improvement involves tackling one issue at a time. After all, there is rarely just one cause related to a problem. By addressing the one causing the most difficulty (the 20% causing 80% of the problem), improvements can be made and monitored for continuous progress. Bar charts, called Pareto charts, are used to decide what steps need to be taken for quality improvement.

A Pareto chart is simply a bar chart that sorts defects, errors, and issues in decreasing order. In doing so, it makes clear which problem is causing the greatest difficulty. Pareto charts can be used to see whether strategies used to correct problems have been effective.

What does the Pareto Principle have to do with education? And, more specifically, what does it have to do with assessment? Documenting a learner’s errors using Pareto charts is an interesting way for learners to

by Donna Curry

see evidence of growth, especially when they are working on discrete skills. Pareto charts can also be used to document overall improvement of a class.

Let’s imagine you are an English teacher. Sometimes it’s difficult to articulate to learners just how their writing has improved. Pareto charts can help you and your learners note progress.

Let’s say you want to know what types of errors in the conventions of English learners are making in order to figure out what areas to address. Begin by having each learner provide a writing sample. Analyze the mistakes that the learner makes. You can create a key so that you can code each error made. This makes it easier for the learner to see the types of errors he most typically makes. (See figure 1.)

figure 1

Have each learner create a Pareto chart showing the errors. The bars should be in descending order. The bar representing the most frequent type of error is often the one that the learner should focus on first.

After the learner has had a chance to work on a particular area (the one in which she had the most errors), do another analysis of a writing sample. Again, use the same coding. The learner again creates a Pareto chart showing the frequency of errors. The largest bar should now be different from the one on the earlier Pareto chart. By looking at this new chart, the learner can see what area to focus on next. This new chart serves as a “pre-assessment” for the next area of focus. (See figure 2.) (Notice that the assessment involves looking at how learners are applying their new learning in the context of a writing activity rather than simply documenting the completion of pages in a text or workbook.)
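The tallying-and-sorting step behind a Pareto chart is simple enough to sketch in a few lines of code. This is only an illustration: the single-letter error codes below (V for verbs, C for capitalization, R for run-ons, P for pronouns) are hypothetical stand-ins for whatever key you and your learners agree on, and the chart is plain text rather than the bar graphs in the figures.

```python
from collections import Counter

def pareto_order(error_codes):
    """Tally coded errors and sort them in descending order of frequency."""
    return Counter(error_codes).most_common()

def text_chart(error_codes):
    """Render the sorted tallies as a plain-text Pareto chart."""
    rows = []
    for code, count in pareto_order(error_codes):
        # Each row: code, a bar of '#' marks, and the count
        rows.append(f"{code:<4}{'#' * count} ({count})")
    return "\n".join(rows)

# Hypothetical codes marked on one learner's writing sample
sample = ["V", "V", "V", "C", "C", "R", "V", "C", "P"]
print(text_chart(sample))
```

The first row of the chart, the tallest bar, is the area to work on first; rerunning the tally on the next writing sample shows whether that bar has shrunk.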

If learners keep their charts in their portfolios they can clearly see how they have shown improvement over time. They are aware of what discrete skills they have learned and are able to apply in their writing. They also are getting an opportunity to see how to use math to communicate. And, they are learning a valuable tool used in business and industry.

Donna Curry is the Publications Coordinator for the EFF National Center and has done staff development training for adult educators for 10 years. She can be reached by email at [email protected].

figure 2

[Two Pareto charts of one learner’s coded writing-sample errors, with the number of errors (0-10) on the vertical axis. “Writing Sample: 11/5/99” shows bars in descending order for verbs, capitalization, commas, incompletes, run-ons, and pronouns. “Writing Sample: 12/5/99,” taken after work on verbs, re-sorts to capitalization, commas, run-ons, incompletes, pronouns, and verbs.]


A typical day in my life starts like this: I sit in a stack on the book shelf. Suddenly, I am whisked away and handed to someone who looks as startled by me as I am by them. I am a TABE (the Test of Adult Basic Education) and I have become, according to one adult education professional, the “industry standard.” Lots of people know my name. Some love me and some hate me. I guess you can’t please all the people all the time.

The TABE is a battery of multiple-choice tests. According to the publisher, the purpose of the battery is “not to test specific life skills, but to test basic skills in the context of life skills tasks” (CTB/McGraw-Hill, 1987). There is a vocabulary section and a reading comprehension section, which together give a composite reading score. A locator test is available, which consists of 25 multiple-choice vocabulary items and 25 multiple-choice computation items ranging from whole-number skills to decimals. The locator requires 37 minutes to administer (for both vocabulary and arithmetic sections). There are also two math sections and two writing sections.

Programs vary a great deal on which sections (or how many sections) of the TABE are given. The reading section is almost always one of the sections included.

You can’t please all the people all the time. The same could be said for any type of assessment. The question for me as an adult educator and staff development person is does this test meet my needs, the

The TABE: Thoughts From an Inquiring Mind
by Cathy Coleman

needs of my program, and the needs of my learners?

To begin to address these questions, I look at my own experience as an adult educator. I have been able to gain a fairly accurate, general idea of a learner’s reading comprehension level by using the TABE. Someone might come to my class on any given day with a TABE score of 5.5, for example. This gives me an idea of where to start.

Still, I have learned over the years to take that score with a rather large grain of salt. When I talk to my learners about their scores on the TABE (which they are almost always anxious to find out), I tell them that this score only gives us a ballpark figure and that we will both know better, after a few weeks of working together, at which “level,” for want of a better word, they are.

We also talk about the value of knowing a level. We discuss how it can give us a general idea about how far they might be from being able to tackle the GED (which is very often their first, and sometimes only, stated goal).

My career in adult education has almost always involved the wearing of at least two different

hats. One of the hats that I have worn is that of the Practitioner Inquiry Coordinator. In that role, I worked with teachers to see the worlds of our classrooms through the eyes of an anthropologist. We observed carefully. We tried to answer the question: What is going on in this classroom? We tried to identify and question the underlying assumptions in our teaching. Sometimes we made changes based on that.

A counselor I spoke with told me that the TABE is the “industry standard.” I asked how long the program had been using the TABE. He told me that they had been using the TABE “since 1973 when I got here.”

As a teacher who has taught in a number of different programs and settings, and as a staff developer who has contact with many different teachers, I then considered the questions: How is the TABE used? What is it used for?

Many programs seem to use the TABE as an initial assessment tool to determine placement in one of three (typically ABE, pre-GED, and GED) programs. Some programs administer the TABE on a regular basis to determine movement to higher levels.

Some funders do this, too. Some mandate intervals at which the TABE must be taken. In one such case, I had several students who needed to take the TABE after every 150 hours of class time (about every two to three months). Some of these students had taken the same form




of the test a few times even before they got to my class.

Some students told me that their goal for class was to reach an eighth grade level on the TABE so that they could enter a particular job training program. In some cases, a great deal was riding on this particular test score. For these students, passing the test (meaning scoring at the eighth grade level) became their priority (understandably so). So… I learned rather quickly in my career that the TABE could sometimes make or break a student’s potential career path.

A study published in 1995 by the National Center on Adult Literacy (NCAL) entitled When Less is More: A Comparative Analysis for Placing Students in Adult Literacy Classes concluded that “a test as brief as the TABE locator could predict placements as well as the complete group of reading tests.” The following sums up their recommendations:

“Attempts to achieve extremely high accuracy in placement should be tempered by a consideration of the small number of placement levels usually available.… Overall, it may be concluded that less testing may be more valuable to both students and adult literacy programs. Less time on testing means less cost for testing. Perhaps more importantly, learners often have distaste for and fear of standardized tests. By cutting back on testing and moving toward a self-assessment model, programs may stimulate greater motivation and satisfaction among the clients they serve.”

Based on my experience, I would recommend we consider the following questions:

• When and why did we all decide that the TABE was the “industry standard”?

• Does the TABE help us find out the information we are seeking to know?

• What do we seek to know from using the TABE?

• To what extent is the TABE successful in placing students in the correct classes?

• Is there flexibility in our programs when the TABE results are not successful in placing students in the correct classes?

• Are we using the TABE in a way that is consistent with the intended purposes of the test?

• Does the TABE help learners identify needs and/or levels?

It is possible that the TABE is indeed the very best test to use to determine this kind of information. If we take an inquiry approach to this issue, however, and examine the underlying assumptions, we may discover important information that can help us all better assess the needs of our learners and our programs.

If so much is going to ride on the results of a standardized test, perhaps we should take a moment to step back and think about the purpose of a standardized test, what it can and cannot tell us, and if indeed this is the most appropriate test to use. Inquiring minds want to know.

Cathy Coleman is a Staff Development Specialist at SABES/World Education. She can be reached by email at [email protected].


“If the only tool you have is a hammer, you tend to see every problem as a nail.”
Abraham Maslow

“The mind is not a vessel to be filled but a fire to be kindled.”
Plutarch

“The best way to get something done is to begin.”
Anonymous

“…have patience with everything that remains unsolved in your heart. Try to love the questions themselves…”
Rainer Maria Rilke


Adult Basic Learning Examination (ABLE)
Authors: Bjorn Karlsen, Eric F. Gardner
Publisher: Harcourt Educational Measurement, 1986
To order: (800) 211-8378

The ABLE is a battery of tests that measures the general grade level of adults who have not completed 12 years of schooling. There is also a version of the ABLE for adults who have had at least 8 years of school, but have not graduated. The ABLE tests learners for vocabulary, reading comprehension, spelling, number operations, and problem solving.

Adult Measure of Essential Skills (AMES)
Author: Riverside Publishing House
Publisher: Steck-Vaughn, 1998
To order: (800) 531-5015

The AMES is a battery of assessments designed to measure basic workplace and educational skills. Its focus is on adults who may or may not have graduated from high school. The multiple-choice questions, which are administered on five levels, are meant to reflect real-life experiences encountered at school, work, and within the community.

Diagnostic Assessment of Reading with Trial Teaching Strategies (DARTTS)
Authors: Florence G. Roswell, Jeanne S. Chall
Publisher: Riverside Publishing, 1992
To order: (800) 323-9540

The DARTTS diagnostic assessment is a kit packaged in a file box. There are two components. The first is the Diagnostic Assessment of Reading, or DAR, which provides diagnostic information on a learner’s ability to comprehend reading and language, including word recognition, oral reading, word analysis, silent reading comprehension, and spelling. The second component is the Trial Teaching Strategies, which identifies students’ needs through the use of microteaching sessions.

Basic English Skills Test (BEST)
Author: Dorry Kenyon
Publisher: Center for Applied Linguistics, 1986
To order: (202) 362-3740

The BEST test was initially developed by the Federal Office for Refugee Resettlement and its Mainstream English Language Training Project during the early eighties refugee influx. It is designed as a life skills, task-based assessment that has two sections. The first section is an oral interview, which assesses listening comprehension, pronunciation, communication, and fluency. The second part is a literacy section, which assesses reading and writing.

Test of Adult Basic Education (TABE)
Authors: John P. Sabatini and others
Publisher: CTB/McGraw-Hill, 1994
To order: (800) 538-9547

The TABE test is designed to be used in conjunction with ABE/GED classes, in order to determine a student’s initial functioning level, or grade equivalent, upon entry into the program. There is also a TABE Work-Related Problem Solving component, called Forms 7 and 8, and these can be administered either in conjunction with the TABE, or separately. Forms 7 and 8 are meant to provide employers, professional trainers, and educators with an assessment of how learners handle various problem-solving tasks. There is also a Spanish TABE, which is designed for assessing basic reading, math, and language skills in adults whose primary language is Spanish.

Beyond Good & Evil: Facts on Standardized Tests

Friedrich Nietzsche once commented, “whatever does not destroy me makes me stronger.” In the spirit of such courageous optimism, we offer a short introduction to some of the more widely used assessment tests on the market. Take a deep breath, relax, and remember the sage words of Nietzsche…


Compiled by Justine Sadoff


Assessment Definitions

Authentic Assessment
“Assessment is authentic when we directly examine student performance on worthy intellectual tasks.… Authentic assessments present the student with the full array of tasks that mirror the priorities and challenges found in the best instructional activities.”

Bias
“A situation that occurs in testing when items systematically measure differently for different ethnic, gender, or age groups. Test developers reduce bias by analyzing item data separately for each group, then identifying and discarding items that appear to be biased.”

Construct Validity
“The test measures the ‘right’ psychological constructs. Intelligence, self-esteem, and creativity are examples of such psychological traits. Evidence in support of construct validity can take many forms. One approach is to demonstrate that the items within a measure are inter-related and therefore measure a single construct. Inter-correlation and factor analysis are often used to demonstrate relationships among the items. Another approach is to demonstrate that the test behaves as one would expect a measure of the construct to behave. For example, one might expect a measure of creativity to show a greater correlation with a measure of artistic ability than with a measure of scholastic achievement.”

Content Validity
“Content validity refers to the extent to which the test questions represent the skills in the specified subject area. Content validity is often evaluated by examining the plan and procedures used in test construction. Did the test development procedure follow a rational approach that ensures appropriate content? How similar is this content to the content you are interested in testing?”

Norms
“The average or typical scores on a test for members of a specified group. They are usually presented in tabular form for a series of different homogeneous groups.”

Performance Assessment
“Performance assessment, also known as alternative or authentic assessment, is a form of testing that requires students to perform a task rather than select an answer from a ready-made list.”

“Performance assessment [is an] activity that requires students to construct a response, create a product, or perform a demonstration. Usually there are multiple ways that an examinee can approach a performance assessment and more than one correct answer.”

Predictive Validity
“In terms of achievement tests, predictive validity refers to the extent to which a test can be used to draw inferences regarding achievement. Empirical evidence in support of predictive validity must include a comparison of performance on the validated test against performance on outside criteria.”

Reliability
“The consistency of test scores obtained by the same individuals on different occasions or with different sets of equivalent items; accuracy of scores.”

“Fundamental to the evaluation of any instrument is the degree to which test scores are free from measurement error and are consistent from one occasion to another. Sources of measurement error, which include fatigue, nervousness, content sampling, answering mistakes, misinterpreting instructions, and guessing, contribute to an individual’s score and lower a test’s reliability.”

Rubrics
“These are specific sets of criteria that clearly define for both student and teacher what a range of acceptable and unacceptable performance looks like. Criteria define descriptors of ability at each level of performance and assign values to each level. The levels referred to are proficiency levels, which describe a continuum from excellent to unacceptable product.”


Authentic assessment, construct validity, equal-interval scale… reading a few assessment definitions can force a person to ask that great existential question Dionne Warwick posed back in the sixties, “What’s it all about, Alfie?” Given that many of the definitions listed below often come attached with varying perspectives on what exactly they should mean, we present them more in the spirit of “cooperative dialogue” than as definitive statements. At the very least, we hope it will benefit teachers to get acquainted with some of the concepts and terms currently popping up around the issue of assessment.

These definitions are taken from the ERIC Digest, <http://www.ed.gov/databases/ERIC_Digests/ed385607.html>, CTB/McGraw-Hill, <http://www.ctb.com/about_assessment/glossary.shtml>, and Downing, Chuck. (1995). Ruminating on Rubrics. (Online). Access Excellence. <http://www.accessexcellence.com/21st/SER/JA/rubrics.html>.

Comprehensive Adult Student Assessment System (CASAS)
Author: CASAS
Publisher: CASAS
To order: (800) 255-1036

CASAS is a broad-ranging, functional assessment system that measures literacy skills and their application in real-life situations, which they call “Competencies.” Some of the Competencies the CASAS measures are basic communication, occupational knowledge, community resources, health, and independent living skills. In total, there are over 4,000 items in the CASAS test bank, and this allows for the creation of customized tests for given objectives and difficulty levels.

FairTest: The National Center for Fair & Open Testing

FairTest is an advocacy organization devoted to ensuring the fair, accurate, and unbiased administering of standardized tests. They produce a quarterly newsletter, as well as a catalog of test information, for both K-12 and university educators. Their Web site, <http://fairtest.org>, is an excellent starting place for information on current educational policies and testing research.

Justine Sadoff is Project Coordinator at the SABES Central Resource Center at World Education.



Assessment, the SMARTT System, and the National Reporting System (NRS): How do these interact? Where do they intersect? Where is Massachusetts heading in terms of assessment? All of these issues are on the minds of ABE practitioners in Massachusetts, and they were the focus of my conversation with Bob Bickerton, State Director of Adult Education, and Donna Cornellier, Project Manager for the SMARTT (the Management Information System for ABE programs) Team. Our discussion fell into three broad categories:

1. Current performance accountability (including assessment) policy in Massachusetts and the impact of the NRS on it

2. The impact of the policy on teachers

3. The impact of the policy on learners

Current Performance Accountability Policy

CC: Can you describe Massachusetts’ current performance accountability policy?

BB: We believe that the primary purpose of performance accountability, including assessment, is program improvement. We believe that we are ultimately accountable to students and to the goals that they have set for themselves. Assessment is a way to measure progress towards goals, and in particular the goal of educational gain.

Massachusetts has not mandated any one assessment tool. What we have done is require that programs give an initial, a mid-year, and a final assessment for each enrolled student using any tool that the ABE program has chosen. In order to comply with new requirements under the federal “Workforce Investment Act” (Title II of which is “Adult Education and Family Literacy”), we have also asked that each program submit a “Crosswalk form” which documents how their current assessment tool aligns with grade level equivalents (GLEs) and student performance levels (SPLs) as defined in the National Reporting System.

CC: Can you say something about the crosswalk form? What is a crosswalk? What were programs asked to do and why?

BB: Currently we are allowing programs to explore their assessments and think about how to strengthen them. We’re not telling them to change the tools they are using as long as they can provide us with a crosswalk that says: we really do assess people in a thoughtful, systematic manner, and if someone scores this way on our assessment, they are presumed to be at a specific grade level equivalent or at a specific student performance level. The crosswalk is a form we asked programs to submit to document this.
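In programming terms, a crosswalk is just a lookup table from a program’s own score range to a presumed NRS level. The sketch below is purely illustrative: the thresholds and GLE bands are invented for the example and do not come from any actual crosswalk form or assessment.

```python
# Invented thresholds for illustration only; a real crosswalk form maps a
# program's chosen assessment to NRS grade level equivalents (GLEs).
CROSSWALK = [
    # (minimum raw score on the local assessment, presumed GLE band)
    (0, "GLE 0-1.9"),
    (20, "GLE 2-3.9"),
    (35, "GLE 4-5.9"),
    (50, "GLE 6-8.9"),
    (65, "GLE 9-12.9"),
]

def grade_level_equivalent(raw_score):
    """Return the presumed GLE band for a raw score on the local assessment."""
    band = CROSSWALK[0][1]
    for minimum, gle in CROSSWALK:
        # Keep the highest band whose minimum the score meets
        if raw_score >= minimum:
            band = gle
    return band

print(grade_level_equivalent(42))  # falls in the invented "GLE 4-5.9" band
```

Writing the mapping down in this explicit form is essentially what the state asks the form to document: the program’s judgment about score-to-level equivalence becomes visible and auditable rather than implicit.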

DC: The assessment crosswalks we now have on file for programs document our state’s current standardized assessment procedure. In January, we will be convening the task force to look at and decide on a process that will replace the crosswalks, starting with program year 2003.

CC: Some states are choosing to mandate one standardized test to use across the state. Why hasn’t Massachusetts chosen to do that?

BB: Because any time we consider the pros and cons of any of the commercially available tests, the cons outweigh the pros. With any standardized test, we need to look at the questions of how does this fit with the learning/content standards included in the Massachusetts ABE Curriculum Frameworks and how does it fit with what we’re teaching? This “alignment” between curriculum and assessment is the key ingredient in determining whether the assessment is “valid,” a requirement of the NRS and just plain good educational practice. That is why we are convening a task force to look at these issues.

Assessment, Accountability, the National Reporting System: Who Is Driving This Bus?
An Interview with Bob Bickerton and Donna Cornellier

by Cathy Coleman




CC: Can you tell us more about the Performance Accountability Working Group?

BB: The Department will be issuing a broad invitation to practitioners to join with us in developing a consistent, standardized system of assessment for our state. We would like people who have done work in the area of performance accountability, including assessment, to join us in this effort. The Performance Accountability Working Group will begin in January with the goal of having a standardized assessment procedure in place by July 2002.

CC: What determines whether an assessment is “valid” and “reliable”?

BB: Valid means the assessment measures what it is supposed to measure. Reliability measures the consistency and stability of assessment scores. When we talk of a valid and reliable measure of whether someone achieved a countable goal (e.g., “got a job”), that is a bit different. Whether that measure of the goal is valid and reliable is usually about clear definitions and verification. Then the issue becomes how good we are at reporting what students tell us they’ve achieved.

CC: How are the NRS and its requirements impacting Massachusetts assessment policy?

BB: We believe that NRS is a subset of our vision for the Massachusetts performance accountability system, that is, being held accountable to students and to the goals that they have for themselves. NRS is not the main motivator behind what we ask programs to document through SMARTT. NRS requires a standardized system of accountability and has sped up the timeline, but assessment and accountability are issues that we have been studying and working on for a long time.

A key question for me is who do we want to control and own accountability in adult ed? It is true that there is some information that we must collect because of the NRS, but in this state, we collect more than NRS requires, and what is driving that is what’s most important. It’s not NRS. In fact, people often ask me “Why are we collecting all this stuff?” If we collect only what NRS requires, then what we are doing is saying that is what’s most important in adult education: data on getting people diplomas, getting people into jobs, post secondary education, and/or training. This would de facto say that’s what’s most important, because that is what we are measuring; that is what we’re putting the spotlight on.

For years, we have been saying we don’t want our students to be reduced to one dimension. We want to support all the reasons that people come to us in adult ed. If we don’t ask the questions and record what’s happening in those domains, then we run the risk of those things disappearing. Things like “becoming a citizen,” “helping my kids with their homework,” etc. Do we want to allow a publication from a funding source to say this is what’s at the center of accountability, or do we let what comes from our students determine what’s at the center of accountability? Who is driving this bus? It is our belief that it is the students, not the feds, not our office.

CC: There seems to be some confusion around why the state is requiring three assessments per year. NRS actually only requires two assessments per year. Can you clarify this?

DC: Since we know that about 60% of ABE students across the state leave before the end of a fiscal year (about 30% because they accomplished their goal(s) and another 30% who drop out before accomplishing any goal), we know that for many students we would not be able to get a final assessment (the second assessment that is required by NRS and which we feel is important to document the success of our services). If we wait until the waning weeks of the fiscal year to conduct a second assessment, we will lose any measure of their educational progress. So what we’ve done is require a mid-year assessment, with the result that assessments need to be administered about once every four months for each student.

CC: How does the NRS relate to our Curriculum Frameworks in Massachusetts?

BB: The NRS says we must measure educational gain. The Curriculum Frameworks answer the question "Gain in what?" The Curriculum Frameworks are our attempt to define the universe of content, which is an issue the NRS is largely silent on. When you don't know the answer to what is the content and skills we are measuring gain in, it often becomes, by default, the content of what is contained on the GED test. What we have done in our state is say that there is life beyond the GED, that the table of contents of a GED textbook does not define our curriculum.

Impact on Teachers

CC: How do you think the National Reporting System will impact teachers in the adult education system?

DC: I think the key is that everyone understand that this is not about penalizing teachers or programs. It's about improving the system.

BB: I think this presents teachers with a choice. Will they be acted upon by the Massachusetts ABE performance accountability system, or will they take some control of the process? There are at least two places where they can do that. The most accessible one is when teachers begin to use the information that they put into the accountability (SMARTT) system to inform practice and to see how things are and aren't working with students. We are not suggesting that the accountability system is the whole universe of feedback. There are a lot of ways that teachers get feedback, but we need to get to a place where teachers include this as one of the tools they use. The second way is to become involved in the performance accountability working group we talked about before, or in their own peer groups (e.g., during staff meetings), so that they have some ownership over the direction this goes in.

CC: Teachers of literacy-level students and students with learning disabilities are concerned that NRS will have a negative impact on them and on their students. They are asking how they can show the subtle gains with literacy-level students. Can you speak to that concern?

BB: The standards that we set must be appropriate to the level and the circumstances of the students we serve. We will be using real data on how the system is working to help set benchmarks for different population subgroups, whenever there is a significant difference in the participation and performance characteristics of that group. This could turn out to be the case for students with certain disabilities, or for homeless adults, single parents, etc. This is why it is crucial to have data we can rely on. If people give us data that may be overly ambitious (e.g., "inflated") about how many grade-level equivalents students gain in a certain amount of time, and that data isn't accurate, we could end up setting unrealistic benchmarks, and then everyone suffers! But clearly we will take into account population subgroups when setting benchmarks.

DC: It is also important for people to understand that the performance system does NOT stipulate the classes people are placed in. In some places, people are setting up six classes to match the six levels of the NRS. The classes you set up should be based on the needs of your students and your programs; they should not be set up simply to mirror the reporting system. We don't expect students to move from one class to the next necessarily in one reporting cycle. What we do look at is their assessments and their progress on those.

Impact on Students

CC: How do you think the NRS will impact adult learners in Massachusetts?

BB: I think if we do a good job at putting students in the driver's seat of accountability, this could have a very positive impact. When we talk to students about these issues, as Anne Serino of Operation Bootstrap in Lynn has done in her program, we find out what students really want to know. They say things like, "I'd like to know more about how long this is going to take, and I'd like to know how I'm really doing.… I like hearing that I'm doing well, but give me a measure that says where am I compared to where I need to get." There are some people who believe that students will get a real benefit out of this. It depends on whether we keep them in the driver's seat.


winter 2001

CC: Will students end up penalized if they are not making progress fast enough? Some programs feel pressured to produce measurable gain, and, as a result, students who struggle academically or have social issues that interfere with attendance get dropped from the program. These students represent some of our neediest clients. How will we deal with this issue given the requirements of the accountability system?

BB: As long as students have the capacity to learn how to read and write and are working seriously toward meeting their goals, we should not set artificial time limits on their learning. We ALL learn at different rates! One issue that the performance standards taskforce addressed was retention, and we all agreed that the definition of retention should not be continuous attendance but should include what people refer to as "stopping out." What we had a harder time with was defining what stopping out was. Did it mean stopping out of the program for a couple of weeks, a few months, a year? It is clear that we need to acknowledge the issue of students who need to stop out of a program and don't define themselves as dropping out.

DC: One of the reports I do is on total attendance. If they came, dropped out, and came back, we take their total hours. The SMARTT system is set up to answer questions like that.

CC: Older learners sometimes come into adult education without employment goals or family literacy goals but simply to learn to read and write better for their own satisfaction. Will there be room for these students under the new accountability system?

DC: Our system in Massachusetts is broader than the NRS, and we think that what the NRS refers to as the secondary descriptors are just as important as the goals of employment and family literacy. Any goal can be a person's primary goal, whether it's on our list of goals or not.

BB: People (our students) set the education-related goal that is important to them, and we need to be ready to respond. The goals that they set are the ones we're accountable to. The list from NRS is a subset of a larger list that the students set themselves (the list of goals in the SMARTT database reflects seven years of collecting student-articulated goals for their learning). After they set a goal for themselves, they should be asked if they think they can achieve that within a year. Helping set shorter-term goals is an important part of the process so that the person can get a sense of progress and mobility. What are the things that we can do within a year that will let you know that you are making progress? That is the question we need to ask.

Final Questions

CC: What do you think is the biggest misconception about the National Reporting System?

BB: That it is driving accountability in Massachusetts. It isn't. Students are.

CC: What do you think is the biggest misconception about the SMARTT system?

BB: That it asks a lot of pointless questions. Actually, we thought hard about what questions it asks. For example, we ask if a person is a single parent not because we want to intrude on people's lives, but because we believe that single parents might be a group whose performance profile might significantly differ from other groups, which would result in setting different performance benchmark(s) for that group. It's important to ask it so that we don't "cream" the crop and simply take students who we think will make us look good. Creaming in our field would be disastrous and would set up what they call "perverse incentives," where we end up creating incentives to do the very things we know will hurt our students, like enrolling only those we know will quickly succeed. There really are reasons behind what we ask for in the SMARTT system.

CC: Do you have anything else to add that we didn't cover today?

BB: Yes, that we think it's great that Field Notes is doing this issue.

DC: That the most important thing is to get the information out so that we can clear up misconceptions. Robert Foreman and I are the NRS trainers for the state. I would be glad to answer questions from people on this. People are welcome to email me with questions at [email protected].


Cathy Coleman is a Staff Development Specialist at SABES/World Education. She can be reached by email at [email protected].


Ever Widening Circles: Involving Teachers in the Development of an Assessment System
by Cynthia Gaede

Washington state has embarked on a three-year project to create an ongoing assessment system that encourages the use of performance-based assessment tasks and is aligned with the Equipped for the Future (EFF) initiative.

Well into the second year of our effort to create an ongoing assessment system, the challenges involved in creating and using a system at the same time are coming into focus. Washington state Basic Skills Programs Administrator Brian Kanes defines the issue in this way: "We're developing while we're doing, and we're trying to use the medium of assessment to promote a totally different basic skills 'culture.'" Among those issues are:

• How will teachers be convinced to convert their experience and knowledge to a system that better serves students, but relies heavily on the concept of teachers as partners in learning?

• How will the new system continue the alignment we want to have with EFF and at the same time retain its own integrity?

• How will it accommodate the alignment we need with the National Reporting System (NRS) in light of the Workforce Investment Act (WIA) and performance funding?

• How can the emerging system withstand the pressures of being created very quickly and very publicly, with every potential flaw or inconsistency in each draft coming under scrutiny?

• How should feedback be incorporated or addressed, and by whom?

The emerging assessment system will rely on the use of performance tasks scored by holistic rubrics to demonstrate progress, making it possible for instructors to more closely align assessment and instruction. Washington state's adoption of EFF as the primary framework for curriculum, instruction, and assessment intends for instruction to be based "on the application of skills in real-life contexts which are meaningful to the learners." One hoped-for result would be the flexibility to allow for regional, programmatic, and individual differences by using the state competencies and assessment to guide instruction, and to more closely align instruction and assessment with learner goals. The use of holistic rubrics to assess level progression will not replace ongoing classroom assessment used to gauge effectiveness of instruction and learner mastery of individual competencies. Teachers will be encouraged to continue using the analytic rubrics to help guide planning and instruction.

Our clarity about the system and the process for developing it is greater now than when we began over a year ago. Like most states, Washington found itself scrambling to put an assessment system in place to comply with accountability aspects of WIA. In January 1999, a group of experienced basic skills practitioners met with state administrators to design a process by which to build a "toolbox" of assessment strategies that identifies level progression and meets WIA's requirements.

While the system now under construction in Washington was not the result of a "grassroots" movement, both the product and the process were inspired by that first group of practitioners and refined by subsequent work groups. The initial attempt in April 1999 to create rubrics for ABE and ESL writing was complicated, then stalled, when members of the work group concluded that the competencies needed revision before they could begin creating rubrics. The process of creating rubrics had revealed that the competencies weren't leveled equally and would not provide a sound basis from which to create rubrics. This realization led to a more intentional and time-consuming process last year of revision and creation, resulting in the foundation of our performance-based assessment system. The competencies were made more consistent in magnitude and aligned with the six levels defined by the US Department of Education. Then the corresponding holistic rubrics were created to assess level progress or completion.

During that first year of the "official" three-year process, 81 basic skills practitioners from 44 programs across the state met to revise the state's basic skills competencies and create content-area rubrics that will be used to reflect level completion. Their mission was successful, but thorny issues are inevitable in a project of this scope and importance. Washington's adoption of EFF as a primary framework provokes the need for clarification about what EFF looks like in Washington.

Teacher work groups this year will define quality criteria by which performance tasks will be guided; they will identify the components of performance tasks and create samples to be used as models; they will act as ambassadors for the system by training their program colleagues; and they will try using the rubrics with scoring tasks. By June 30, 2001, Washington will have quality criteria by which to assess performance tasks, a set of sample tasks that meet the quality criteria, as well as rubrics and competencies that have been piloted in programs statewide. Training in effective use of performance tasks to inform instruction, and in use of the rubrics to measure progress, will continue as the system develops.

Very little of consequence happens in Washington without consensus-building, and that is certainly true of the ongoing assessment project. Clearly, the system benefits from broad participation in the implementation of its vision through "buy-in" from the field. Washington's adult basic skills teachers are pragmatic, visionary, hard-headed, and smart. Their willingness to share their experience and knowledge is crucial to the success of the project. Only practitioners are able to "contextualize" the system by sharing what they know about their students' lives, hopes, and fears. The state can only build a base broad enough to support the emerging system by asking teachers to participate in ever-widening circles.

In return, the obvious value for teachers lies in a very real sense of ownership and professional recognition. There are rich opportunities for both direct and indirect professional development. Considering that virtually all participants in the development of this system are being asked to volunteer their expertise, the state promises that teachers will use and develop tremendous creative thinking, problem-solving, and negotiation skills. In addition, participants must use all their leadership skills to share the system vision and concepts with their colleagues. They become "insiders"; they are being asked to lead, facilitate, and negotiate as they seek collaboration in the research necessary to complete the project. It's a lot to ask.

When it seems like the state is asking "too much," it helps to remember that the vision driving the system puts learners where they belong: squarely at the center of both instruction and assessment.

Cynthia Gaede is a Training Coordinator for the Adult Basic Literacy Educators Network of Washington, charged with coordinating Washington's adult basic skills ongoing assessment project. She can be reached by email at [email protected]. The basic skills competencies and rubrics are available on the Web at <www.sbctc.ctc.edu/Board/Educ/ABE/assess.htm>.

Involving Learners…
Continued from page 10

Curriculum Framework in Massachusetts appear to correlate with the findings of the reports inasmuch as what the students are looking for in a program, the ESOL Framework provides. For example, one of the Framework's categories is "Navigating Systems," which Graves believes is of paramount importance to learners, who want to be able to communicate orally.

Only time will tell if more student-centered assessment is going to find a place within the boundaries of traditional standardized assessment. It should be noted that some of the teachers who participated in Graves' assessment experiment found her results to be "practical but not always effective at measuring students' progress." Others, however, felt there was no way teachers could know what exactly learners could do outside of the classroom unless they (the learners) reported it themselves. Thus, one of the conclusions that could be drawn is that it would do well to continue documenting learners' various modes of learning, and how they themselves view their own learning. Understanding the need for such continued research, Graves' agency recently applied for a DOE grant that would allow them to pursue this research, using the groundwork already done as a stepping stone to ultimately refine the progress reports. Such participatory approaches to assessment just might, in the final analysis, prove to be an essential element in the learning process.

Justine Sadoff is Project Coordinator at Boston SABES. Nicole Graves is an ESOL Program Coordinator at the Center for New Americans in Amherst. Justine can be reached by email at [email protected]. Nicole can be reached by email at [email protected].


Assessment and LD
Continued from page 11

the student meets one-on-one with a tutor once a week. During these sessions, the student tries to understand rules and procedures and applies them in both real and academic contexts. In addition, the student improves his/her comprehension skills by completing homework assignments in reading. (Homework usually entails at least three to four hours between sessions.) Special education teachers in the surrounding communities would be especially well-qualified tutors for the learning disabled.

At Quinsigamond, several students with major learning disabilities have earned their GEDs following the previous procedure. One student in particular had been working for eight years without success. After having the evaluation and following the resulting recommendations, which included much work on her part to compensate for her weaknesses, she passed.

It has been a challenge for all: the teachers have had to change their style to a degree; the students have had to spend many hours doing homework between sessions; and the tutors have had to spend much time preparing required work.

The end results for those who have persisted have been worthwhile. The students have their GED in spite of the odds against them.

Wallace M. Perkins is a licensed educational psychologist and has worked for 20 years as a school psychologist in the Shrewsbury school system. Other interested communities may contact the special education department in their local or surrounding school districts to learn if present school psychologists would be interested in an additional part-time position, or to secure the names of retired school psychologists. One can also write to the Massachusetts School Psychologists Association to obtain a list of its retired members.

Active, Purposeful…
Continued from page 14

Knowledge Base
• What vocabulary do learners have related to the skill?
• What content knowledge do the learners have related to the skill?
• What strategies do learners have for organizing and applying content knowledge?

Fluency
• How much effort is required?

Independence
• How much help is needed from others?

Range
• In how many different contexts can learners perform?
• How many different tasks can the learner do using the skill?

One way I learned to better understand these dimensions of performance was to think about something I was learning; in my case it was learning to swing dance.

When I first started out I really didn't have much of a knowledge base, just what was in the catalog. As I learned, I picked up vocabulary (basic step, loop pass, etc.) and got better at organizing what I was learning by being able to put these steps together. Fluency was a big problem when starting out. I had to count and concentrate on each step. As I got better, some of the steps became automatic. At first I had a hard time learning from watching and needed the instructor to demonstrate the steps by being my partner. My independence grew as I gained confidence and didn't need as much "hands on" support. My range of performance is still limited. I haven't danced anywhere other than the classroom. My goal is to be able to dance (and enjoy it) at my son's wedding next summer. I have a ways to go, but I can see that I am learning and improving.

This is a lot of information for a short article, and I have to say that I certainly don't have all the components working together smoothly in my classroom yet. However, using EFF as a framework for assessment and instruction has allowed me to become a more intentional and informed instructor and learner.

Joan Benz is an instructor of adult basic skills and GED at the Bethel Family Learning Center in Eugene, Oregon. She can be reached by email at [email protected]. EFF is an initiative of the National Institute for Literacy. For more information about EFF, visit their Web site at <www.nifl.gov/EFF>.


Click Here: Web Sites on Assessment
Compiled by Justine Sadoff

Sites on Standardized Tests and Testing

ABLE Test
<http://ericae.net/tc3/TC018651.htm>
This is another ERIC site, giving descriptive information on the Adult Basic Learning Examination, or ABLE.

Buros Institute of Mental Measurements
<www.unl.edu/buros/>
Another gigantic database. Tests, test locators, test resources, test publishers, and everything you could possibly want to know about tests.

Center for Applied Linguistics (CAL)
<www.cal.org>
Thorough resource for varied information on language, language testing, and current education policy. Click on Adult ESL Literacy and click on the link for Related Products and Publications. This will bring up a list of many helpful resources for ESOL teachers and learners.

Comprehensive Adult Student Assessment System (CASAS) Competency List
<www.casas.org>
Just about everything you could want to know about CASAS, including skill level descriptors, competency lists, and frequently asked questions about the test as well as how it is used.

ERIC Digest
<www.ericae.net/testcol.htm>
Massive database, probably the best resource for information on tests and everything associated with testing and education in general.

FairTest: The National Center for Fair & Open Testing
<http://fairtest.org>
FairTest is an advocacy organization devoted to ensuring the fair, accurate, and unbiased administering of standardized tests. Their Web site is an excellent resource for information on current educational policies and testing research.

Riverside Publishing
<www.riverpub.com/products/group/dartts.htm>
Riverside Publishing produces a wide variety of educational materials, including tests. They are part of the larger educational publishing house, Houghton Mifflin. This site is a good introduction to the Diagnostic Assessment of Reading with Trial Teaching Strategies and its cousin, the Diagnostic Assessment of Reading.

Sites on Alternative Assessment and Performance-Based Assessment

Access Excellence
<www.accessexcellence.com>
Another example of rubrics, this one focusing on the use of rubrics for teaching science. From the Access Excellence home page, click on "Classrooms of the 21st Century."

A Humorous Look at Performance-Based Assessment
<www.middleweb.com/gradexam.html>
We all need a little chuckle once in a while, and this site may prove helpful. It provides a tongue-in-cheek look at some authentic assessment tasks, such as in economics: "Develop a realistic plan for refinancing the national debt. Run for Congress." There is also valuable non-tongue-in-cheek information on this site. Don't be scared off by the middle school aspect. Many of the links under Assessment and Evaluation will prove useful to us in adult education as well.

Education World
<www.educationworld.com/a_curr/curr248.shtml>
This site offers information on rubrics, the educational background to rubrics, and how you can use them.

Guideline for Portfolio Assessment
<www.w-angle.galil.k12.il/call/portfolio/default.html>
Another tome of a piece, but just about everything you could think of concerning portfolios is laid out here in clear, thorough detail. Again, this site should not be missed by anyone interested in learning about portfolios.

Interactive Classroom
<www.interactiveclassroom.com>
This site has several articles that address issues of authentic assessment, including one that describes the process of "negotiable contracting," which actively brings learners into the process of assessment. Click on articles from the main page.

National Clearinghouse for Bilingual Education
<www.ncbe.gwu.edu/ncbepubs/pigs/pig9.htm>
Excellent, lengthy, and detailed site on performance and portfolio assessment for language minority students. This is heavy reading, but highly insightful for anyone interested in portfolio-based assessment.

Portfolio Assessment
<www.eduplace.com/rdg/res/literacy/assess6.html>
This site is operated by Houghton Mifflin, the educational publishing house. It's a good introduction to portfolio assessment, giving short but precise explanations for some of the key issues and themes in portfolio assessment.

Using Rubrics in the Washington State Assessment System
<www.sbctc.ctc.edu/Board/Educ/ABE/assess.htm>
Washington State has developed rubrics as part of their assessment system. This site lists several downloadable resources, including the rubrics in both Adobe PDF format and in Word.



Mark Your Calendar

January 12–13
Massachusetts Association of Teachers of English to Speakers of Other Languages (MATSOL), Miniconference
The Stakes of Assessment: Research, Policy and Practice
Newton, MA (Pine Manor College)
Contact: MATSOL, (617) 576-9865

January 22–24
Center for Study of Adult Literacy, Georgia State University & Centre for Literacy (Quebec), 3rd International Conference on Women and Literacy
Atlanta, GA
Contact: Sandy Vaughn, (404) 651-1400
Email: [email protected]

February 13
Mass Networks Education Partnership, CLASP (Curriculum Library Alignment and Sharing Project), Statewide Conference Series
Worcester, MA
Contact: (888) 638-1997
Web: <www.massnetworks.org>

February 28–March 3
Commission on Adult Basic Education (COABE), 2001 National Conference
Meet Me in Memphis
Memphis, TN
Contact: Peggy Davis, (901) 855-1101
Web: <http://206.82.75.28/conf.html>

March 14
Massachusetts County House of Corrections (CHOC), Annual Conference
Worcester, MA
Contact: Joanne Harrington, (508) 854-4476
Email: [email protected]

March 18–20
National Center for Family Literacy (NCFL), 10th Annual National Conference
Partners in Learning
Dallas, TX
Contact: NCFL, (502) 584-1133 x149
Web: <www.famlit.org/conference/conf2001.html>

March 30–April 4
Teachers of English to Speakers of Other Languages (TESOL), 35th Annual Convention
TESOL 2001: Gateway to the Future
St. Louis, MO
Contact: TESOL, (703) 836-0774
Web: <www.tesol.org/conv/2001.html>

April 10–14
American Educational Research Association (AERA), 82nd Annual Meeting
What We Know and How We Know It
Seattle, WA
Contact: AERA, (202) 223-9485
Web: <www.aera.net/meeting/am2001>

Information about upcoming conferences relating to adult literacy can be sent to: Lenore Balliro, Field Notes Editor, World Education, 44 Farnsworth Street, Boston, MA 02210.