This article was downloaded by: [York University Libraries] on 11 November 2014, at 12:51.
Publisher: Taylor & Francis. Informa Ltd, registered in England and Wales, Registered Number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

European Journal of Engineering Education. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/ceee20

Outcomes Assessment and Accreditation in US Engineering Formation. LANCE SCHACHTERLE. Published online: 27 Apr 2007.

To cite this article: LANCE SCHACHTERLE (1999) Outcomes Assessment and Accreditation in US Engineering Formation, European Journal of Engineering Education, 24:2, 121-131, DOI: 10.1080/03043799908923547

To link to this article: http://dx.doi.org/10.1080/03043799908923547

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions



European Journal of Engineering Education, Vol. 24, No. 2, 1999, pp. 121-131

Outcomes Assessment and Accreditation in US Engineering Formation

LANCE SCHACHTERLE

SUMMARY American higher education is characterized by enormous program and quality diversity among the 3842 institutions (1998 data) with two- or four-year programs (of which 319 institutions have one or more engineering programs). Sharp distinctions emerge along several axes: funding (private vs public), size (hundreds to tens of thousands of students) and mission (research vs teaching). Recently the accreditation organizations for universities broadly and for engineering specifically have addressed this diversity by moving to outcomes-based assessments. Rather than judging student performance in terms of classes passed, institutions must (1) define their distinctive mission, (2) design a curriculum to help students achieve these goals, (3) assess student learning outcomes according to both institutional and professional criteria, and (4) create a culture of continuous improvement to better align steps (1) and (2). Adoption of similar procedures may help European engineering institutions to measure programs across boundaries and to foster 'trans-national recognition'.

1. A Brief Overview of Engineering Higher Education in the US

US education is characterized by a rich mix of private and public institutions of varying sizes and with diverse missions. Historically, US educational programmes and standards have been organized at local and state levels. Significant political pressure resists mandating of national standards at any educational level, in part because financial support for education often derives from local real estate taxes. Thus, communities compete for superior educational resources to reinforce competitive economic advantages, which redound to the tax base continuing to support local education. The advantages of such a system are encouraging diversity, innovation and local public involvement; the disadvantages, inequities in quality and difficulties in comparing from area to area how successful graduates are in using their academic education as the foundation for professional practice. In short, the US system of regional control of education resembles in major ways the situation in European countries: different systems, different funding mechanisms, different expectations and different standards.

While the 50 state governments often impose their own unique core curriculum requirements upon all graduates of state universities, public engineering programmes are often left with a fair degree of room with which to experiment in differentiating their 'products' from those of competitors. Experimentation is even keener among private engineering institutions, which range from small technological universities (such as Worcester Polytechnic Institute (WPI) and Harvey Mudd) to medium-sized universities with several separate colleges (such as Rensselaer and Lehigh), to world-renowned large research technological universities (such as MIT and Stanford). All these institutions recognize that a convincing case can be made for closely linking economic and technological growth and thus compete for resources among both the private and public funders.

0304-3797/99/020121-11 © 1999 European Society for Engineering Education

Of the roughly nine million US citizens pursuing degrees at 4-year institutions, more than 60% attend publicly (usually state) funded programmes. The Chronicle of Higher Education Almanac (28 August 1998) reports (p. 18) 3842 institutions with 2- or 4-year programmes. The Accreditation Board for Engineering and Technology (ABET) Annual Report for 1997 (p. 51) indicates that 1551 different engineering programmes are accredited at 319 different institutions. In the absence of national standards, how are standards created and maintained, permitting comparisons among these numerous and diverse programmes and institutions?

2. Voluntary Peer Accreditation Procedures in the US

The broad answer to the question just raised about creating and maintaining standards is "through participation in private voluntary accreditation organizations". In the absence of any government controls over the details of the curriculum, in the US voluntary peer-review organizations have emerged to create and compare quality standards on engineering education. These organizations come in two forms: regional accreditation agencies which link to the federal government; and professional organizations which accredit professional programmes. In the US, virtually every engineering institution is accredited both by one of the six regional organizations (which review entire institutions) and a multiplicity of professional accrediting bodies which accredit specific disciplinary programmes under their jurisdiction.

The best known of the engineering accrediting bodies is the ABET, which, since 1932, has provided standards of student performance which are accepted at virtually all technological universities. Since the 1930s, ABET has stressed a body of knowledge in which students must demonstrate mastery by passing the appropriate courses. Recently, many engineering education observers have protested the lack of room in which to experiment as a result of ever-increasingly tight constraints concerning which courses students must pass. The response to newly emerging professional and scientific areas has often been to add more required courses to an already packed 4-year curriculum.

More important, educators throughout the world are coming to recognize that passing courses is far from a guarantee of professional mastery. Self-reflection, as well as professional studies, have convinced many educators that courses are most effective at conveying information to short-term memory, but that students often cannot relate or use materials when needed in upper-level courses. Many students find it even more difficult to synthesize materials learned in separate courses when required to apply such 'knowledge' in real-world projects. At best, courses prepare students effectively to relearn material when they really need it.

In response to concerns about measuring student performance entirely in terms of courses passed rather than knowledge applied effectively, in the last decade ABET has experimented with a new standard, Engineering Criteria 2000 (EC 2000), which places an emphasis on demonstrated student outcomes in actual performance rather than in obtaining pass grades. EC 2000 enumerates a set of outcome-based measures which fit on a single page, a far cry from the 30 pages of small-spaced, double-column requirements that previously mandated passing courses. The new criteria cover many of the same headings as the earlier ones, such as adequate resources, institutional organization and general expectations for background which characterize a beginning engineering professional. In addition, as with the earlier criteria, EC 2000 provides the opportunity
for each relevant professional society (such as ASME or IEEE) to spell out expectations specific to the different disciplines (the 'programme criteria').

At the heart of EC 2000 is criterion 3, 'Program outcomes and assessment', which enumerates the performance outcomes which students must demonstrate in order to secure a degree under the new requirements. These outcomes are as follows:

(a) an ability to apply knowledge of mathematics, science and engineering;
(b) an ability to design and conduct experiments, as well as to analyze and interpret data;
(c) an ability to design a system, component, or process to meet desired needs;
(d) an ability to function on multi-disciplinary teams;
(e) an ability to identify, formulate and solve engineering problems;
(f) an understanding of professional and ethical responsibility;
(g) an ability to communicate effectively;
(h) the broad education necessary to understand the impact of engineering solutions in a global and societal context;
(i) a recognition of the need for, and an ability to engage in, life-long learning;
(j) a knowledge of contemporary issues;
(k) an ability to use the techniques, skills and modern engineering tools necessary for engineering practice.

Operationally, the process for seeking accreditation under EC 2000 is quite different from previous ABET procedures, though close to recent developments among the regional accrediting societies. At the heart of the procedures is the need to assess student performance: what they can do, not the courses they have passed. To shift the focus of evaluation to 'student outcomes assessment', institutions must take the following four steps:

(1) Each institution, and appropriate subdivision (such as a college of engineering), as well as all departments or programmes coming up for EC 2000 accreditation, must define a specific mission for the programme. Of great importance is that the mission must be framed in terms which permit all constituencies interested in engineering education (students, faculty, alumni, employers and accreditation organizations) to assess the success of students and graduates in achieving the outcome.

(2) Institutions and programmes must specify their general curriculum requirements, in terms that indicate the curriculum is designed to support both the mission statements (step 1) and the goals of EC 2000.

(3) Each institution and programme must specify how students achieve the outcomes (a)-(k), or how other outcomes determined by the programme can be translated into these EC 2000 outcomes. Again, the emphasis is on specifying outcomes that can clearly be measured in terms of how students can demonstrate their capacity to achieve each of the subparts of criterion 3.

(4) Finally, and most importantly, each institution and programme undergoing EC 2000 accreditation must demonstrate in detail how faculty and others assess the outcomes of student performance based on the measurements, and how they revise the curriculum and mission as needed in order to institute continuous programme improvement.

These four steps essentially ask engineering educators at every level of the system to be clear about what they are trying to do in terms of student learning, about how they measure student success, and about how, when inevitably they find lapses, they review and improve the curriculum to bring student performance into better alignment with their curriculum, core values and mission.

Probably the greatest challenge for engineering educators is step 3, characterizing each of the 11 student outcomes by articulating goals for student achievement that can be measured. For example, the first outcome, demonstrating "an ability to apply knowledge of mathematics, science and engineering", must specify at what levels and in what contexts. US engineering educators have been working hard in the last several years to devise such appropriate metrics, to gather data about student outcomes and, when necessary, to revise curricula in attempts to enhance desirable outcomes.
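
Devising such metrics begins with a traceability exercise: every outcome (a)-(k) must be claimed by some measurable curricular activity. As a purely illustrative sketch (the curriculum components and outcome mappings below are hypothetical, not drawn from any institution's actual documentation), a programme could audit its coverage of the EC 2000 outcomes in a few lines:

```python
# Hypothetical coverage audit for EC 2000 criterion 3 (illustrative only).
# Maps curricular components to the outcomes (a)-(k) they claim to develop,
# then reports any outcome left without a curricular home.

EC2000_OUTCOMES = set("abcdefghijk")

# Assumed example mapping; a real programme would derive this from its
# own curriculum documentation.
curriculum = {
    "capstone design project": {"a", "c", "d", "e", "g", "k"},
    "laboratory sequence": {"a", "b", "k"},
    "humanities requirement": {"f", "g", "h", "j"},
    "interdisciplinary project": {"d", "h", "i", "j"},
}

covered = set().union(*curriculum.values())
missing = sorted(EC2000_OUTCOMES - covered)
if missing:
    print("Outcomes with no curricular home:", ", ".join(missing))
else:
    print("All EC 2000 outcomes (a)-(k) are covered.")
```

Such an audit only establishes that each outcome is addressed somewhere; the harder work, as the text notes, is specifying at what level and in what context each outcome is demonstrated.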

3. WPI as Experimental Site

To examine more closely this four-step process, especially step 3, let us take WPI as an example. In 1996, ABET encouraged institutions that believed their present curriculum met the new standards to apply for new or continuing accreditation through the experimental clause within ABET guidelines. Because of WPI's history from the late 1960s on as an experimental engineering institution, WPI was selected in 1996 as one of the first two institutions to undergo an accreditation visit under the new EC 2000.

In the late 1960s, the senior WPI faculty set out to reform WPI's existing, conventional engineering curriculum. In the place of lists of required courses distributed in various areas, the faculty created new degree requirements based on measures of competency primarily determined by capstone project experiences. The three major projects involve demonstrating the ability to synthesize and use previous course work in the areas of: humanities and the arts; interdisciplinary studies involving the interaction of science, technology and society; and the disciplinary field the student selects as the major.

Each of these demonstrations of student activity requires an appropriate degree of creative work (presumably based on previous classroom learning) and each culminates in a written project as well as some form of oral presentation. Thus, in 1996 WPI had 25 years of experience with an outcomes-based engineering curriculum; WPI also had available decades' worth of completed student project reports in all three of the areas the faculty regarded as important in its curricular renovation. These projects covered virtually all of the 11 student outcomes in EC 2000.

More important, 10 years into WPI's experimental programme, beginning in the early 1980s, WPI instituted faculty peer review of the student outcomes as embodied in the completed written project reports. These reviews were organized within every relevant department and programme, and brought together faculty (and sometimes external members of the engineering education community such as graduates and employers) to review the projects in terms of what outcomes the projects demonstrated. Of special interest to WPI was ensuring that the outcomes were in accord with the spirit of the new WPI curriculum. After several years of experience with the peer review process, faculties became bolder in using the results of the peer reviews within departmental and programme undergraduate review committees, to institute changes aimed at improving the fit between mission and student learning. Of special concern was to make sure that the conjunction between upper-level course expectations and project results was as close as possible.

4. Specific Goals and Metrics

ABET student outcomes (a)-(k) are general; faculty in specific programmes must define more specific goals and measurements to carry out assessments. For example, if a department felt that statistics was an important part of the curriculum and expected students to take such courses in the Mathematics Department, then it was important to observe how many projects in the disciplinary field used statistics at a level appropriate to beginning professionals. Thus, WPI faculty began to develop specific measurements to assess how well students succeeded in demonstrating through their various projects that they met the ABET-prescribed student outcomes (a)-(k) listed earlier. One set of such measurable outcomes is given hereafter, developed by the WPI Department of Chemical Engineering, for outcomes (b) and (e). Such outcomes can be 'scored' by faculty on a simple scale, such as 1-10.

ABET Outcome (b)

General. Students must demonstrate an ability to design and conduct experiments, as well as to analyze and interpret data.

Specific. When given a hypothesis for testing, or a problem suitable for laboratory investigation, students can:

• design a lab protocol including experimental objectives that will test the hypothesis or thoroughly study the problem;
• successfully and safely conduct the experiments in the appropriate laboratory;
• analyze the data, including error effects, sufficient to draw supported conclusions about the problem relative to the objectives;
• successfully defend their analysis and interpretation in written or oral form.

ABET Outcome (e)

General. Students must demonstrate an ability to identify, formulate and solve engineering problems.

Specific. When given an open-ended problem in their discipline, students can:

• perform background research of sufficient depth to identify important parts of the problem that must be solved;
• develop a methodology to solve the problem, including methods to evaluate/compare alternative solutions;
• successfully complete the problem solution and defend that solution in written or oral form.
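
Outcome statements like these become actionable once the 1-10 faculty scores are aggregated across projects. The sketch below is a hypothetical illustration (the scores, outcome letters and threshold are invented, and this is not WPI's actual tooling): it averages rubric scores per outcome and flags any outcome whose mean falls below a departmental bar, the kind of signal that would feed the curriculum-revision step of the four-step process:

```python
from statistics import mean

# Hypothetical faculty rubric scores (1-10) per project, keyed by
# EC 2000 outcome letter. Illustrative data only.
project_scores = [
    {"b": 8, "e": 7, "g": 9},
    {"b": 6, "e": 5, "g": 8},
    {"b": 7, "e": 4, "g": 7},
]

THRESHOLD = 6.0  # assumed departmental bar for an acceptable mean score

# Collect all scores per outcome across projects, then compute means.
by_outcome = {}
for scores in project_scores:
    for outcome, score in scores.items():
        by_outcome.setdefault(outcome, []).append(score)

for outcome in sorted(by_outcome):
    avg = mean(by_outcome[outcome])
    flag = "  <- review curriculum" if avg < THRESHOLD else ""
    print(f"outcome ({outcome}): mean {avg:.1f}{flag}")
```

With the invented data above, outcome (e) falls below the bar and would be routed to the departmental review committee described in section 3.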

A second example is from the WPI Department of Electrical and Computer Engineering.

The following paragraphs are intended as a guide to aid in consistently interpreting the 'Student major project demonstrates' portion of the project outcomes and assessment form. (Outcomes (a)-(k) as appropriate to disciplinary projects.)

(a) Appropriate Use of Math/Science/Engineering Knowledge

The project demonstrates that the student is aware of the phenomenology of the problem being solved at a depth greater than that of a layperson and is able to use this deeper understanding to determine an appropriate solution space for the problem. This understanding should be illustrated by discussing the problem and the proposed solution in appropriate mathematical or engineering terminology.


(b) Design of Experiments/Data Analysis

Tests and measurements are made in a structured, controlled manner with a specific purpose. Data obtained from measurements are analyzed in valid and meaningful ways. Correlations between expected and observed results and anomalies are explained appropriately.

(c) Design to Specifications

The design goals of the project are clearly stated. Discussions of design address specific design goals, and conclusions, experiments and analysis are provided to identify which specifications were satisfied. Explanations are provided to justify any specifications which were unattainable.

(d) Functioning Multi-disciplinary Team

The project team consists of people from more than one engineering or science discipline. For example, a multi-disciplinary team might consist of electrical engineering and computer science majors, not simply ECE and CS components.

(e) Identify/Formulate/Solve an Engineering Problem

The process of decomposing a problem into its components is clearly presented. The presentation includes a discussion of the problem, a presentation of sensible alternate solutions and a justification of the particular method selected to solve the problem. The solution demonstrates consistency with the system specifications.

(f) Ethics

Ethics are covered in other projects.

(g) Ability to Communicate Effectively

The project report describes the project clearly and completely. The report is constructed in such a manner that a person not associated with the project can understand the problem, the rationale behind the solution and the final results.

(h) Sensitivity to Global/Societal Context

The project report describes the technical aspects of the project and places the project into its appropriate context. For example, the report on a new robot arm controller design should place the new design in context with previous approaches by making comparisons on a basis of factors such as cost, utility, speed, etc.

(i) Need for Lifetime Learning

The report illustrates that the student is aware that some of the limitations or failures of the project are a consequence of the current state of technology, or of the student's lack of understanding or experience. In either case, it is acknowledged that part of an engineer's job is continually to update and expand his/her knowledge and experience.


(j) Sensitivity to Contemporary Issues

The report acknowledges that contemporary technical and non-technical issues may influence engineering design and shows how such issues were taken into account by the design. For example, a project involving automation might mention the associated need for retraining workers.

(k) Use of Modern Engineering Tools and Techniques

The project was done using tools and techniques reflective of the current state of the art. Where appropriate, computer simulations are performed to support analysis, the report is word-processed, figures and schematics are drawn using CAD tools, etc.

In addition, the Electrical and Computer Engineering faculty also expect their senior-level projects to address other issues of concern to ABET. The following paragraphs are intended as a guide to aid in consistently interpreting the 'Project shows acceptable evidence of' portion of the project outcomes and assessment form.

Economic Considerations

The project report illustrates an awareness of economic concerns. Economic awareness is demonstrated by presenting data such as estimates of material cost, estimates of labor costs, optimizing cost versus performance, or computing life cycle cost.

Safety Considerations

The report discusses any safety issues that arise relative to the use of the designed device. Proper use of the actual prototype for measurements or demonstrations without causing bodily injury to its operators is discussed. If the device is mass produced, any issues that affect the welfare of technicians or end-users are presented. Proper handling of any materials or chemical processes that may have adverse effects on the environment is documented. The need for Underwriters Laboratories (UL), (US) Federal Communications Commission (FCC), or other approvals on the final hardware is discussed.

Reliability Considerations

It is acknowledged in the report that a device must not only work, but it must work with a consistency that is reflective of the community expectations for the device. This goes hand in hand with testing (quality assurance, quality control) and analysis (reliability, availability, component tolerance). If the device is targeted at mass production, a set of tests to help eliminate defective units or unreliable units is presented. Mechanisms that could cause irreversible damage to the device are described.

Aesthetic Aspects

The report describes how aesthetic aspects such as power dissipation, weight, volume, form factor and user interface are taken into consideration as a part of the design process.


Analysis

The report supports design decisions with an appropriate combination of mathematical analysis and simulation.

Synthesis

The report shows specifically how the design was created. This normally involves decomposing a high-level system specification into progressively lower-level design specifications until a component level is reached.

Integration of Previous Course Work

The project demonstrates that the student is able to integrate concepts from earlier courses into the solution of an engineering problem.

Experimental Work

Analytical results, computer simulations and compliance with specifications are verified by experimental measurements, or certain aspects of the project's phenomenology are determined or verified through the use of well-formulated experiments.

5. Additional Considerations

Finally, the Electrical and Computer Engineering faculty also recognizes that not all reports will fully articulate in writing a consistently high degree of learning in all these numerous outcomes. Some learning may be implicit; such learning may not be verbalized in the report. Thus, the department requires the faculty advisor(s) to submit a brief written statement along with the final grade report in which the faculty can comment on outcomes which they believe the student achieved but which are not evident to someone assessing the project experience only from the written report. For example, the experience of the project may have called upon the student(s) to use experiments successfully, but the report may not require detailed written sections describing how students designed experiments. Faculty advisors can thus provide a statement for their peer reviewers and others concerning how students addressed this outcome in their work.

A final example of an on-going articulation of preparing for outcomes-based student assessments may be found on the web site of the WPI Civil and Environmental Department (CEE) at http://www.wpi.edu/Academics/Departments/CEE/ABET/salazarMQP report.

In addition to these faculty-generated student assessment tools, like many other universities WPI is looking seriously at student-generated assessments of their achieved outcomes. WPI is investigating for potential campus-wide adoption a portfolio system of student self-assessment. Many faculties believe such portfolios must be useful both to students (as part of their lifelong learning) and to faculty and external visitors. Such portfolios, at their best, provide important educational opportunities for students by encouraging them to articulate and become more self-conscious of their learning at a level higher than they often conceive. For example, ABET Criteria 2000 appropriately expects students to communicate effectively, in both written and oral forms. However, WPI faculty believes strongly that such expectations are not at all well met by required courses in communications which are divorced from engineering content. Instead, increased emphasis by faculty advising technical projects on oral and written communications, with a clear statement to the students as to why such learning activities are important educationally, will effectively address this need.

Also, by asking students regularly (but briefly) to reflect in writing on their success in meeting ABET outcomes (a) through (k), students come to recognize how many activities not receiving academic grades contribute richly to learning and to meeting the expectations of professionals as embodied in EC 2000. For example, relevant summer jobs and part-time work, internships, participation in programmes in leadership and entrepreneurship, as well as co- and extra-curricular athletics and clubs, all provide opportunities for growth in a number of areas important to EC 2000.

6. Some Conclusions and Suggestions

This paper has emphasized the student outcomes assessment process primarily as it applies to professional engineering accreditation. In the US, the regional accreditation organizations (which every decade reaccredit institutions desiring access to Federal funding) have also adopted outcomes assessments. The principal difference is that these regional agencies examine the entire institution, not (as ABET does) specific programmes; and they also look with care at areas beyond the academic, such as institutional finance, publicity and student services. Since few US institutions of higher education wish to forego any Federal funding (such as student loans), most of the 3842 institutions most recently reported seek membership in one of the six regional groups. WPI's regional accreditation organization, the New England Association of Schools and Colleges (NEASC), for example, moved to an outcomes-based assessment process in 1992. (A bibliography of readings recommended by NEASC on accreditation is attached.)

Of what value might student outcomes-based assessment be for European engineering institutions? Such a process might help provide the proverbial 'level playing field' among institutions, while still preserving appropriate local/national expectations and conventions (such as numbers of years of study). The US student-outcomes accreditation process tries to balance individual institutional differences and initiatives against reasonable professional standards and expectations for all graduates. Faculties which pursue valid and defensible student assessment within the framework of widely recognized, desired outcomes may use different means (such as curricular structures) to achieve common ends. If European institutions and professional organizations could agree to define the outcomes (what they want graduates to do professionally directly out of university) and also how to assess those outcomes, then the issues currently slowing down greater European cooperation (as articulated in Professor Augusti's paper, 'European engineering formation: the problem of trans-national recognition') might be addressed. Rather than stressing deeply embedded differences such as length of study and curriculum, examining greater European cooperation in terms of articulating what recently graduated European engineers are expected to do may provide a way of achieving 'trans-national recognition'.

Bibliography

Some key resources on student outcomes assessment: American Association of Higher Education (AAHE) Assessment Publications Bundle, a collection of assessment forum resources, including 'Principles of good practice for assessing student learning', 'Using assessment to strengthen general education', 'Time will tell: portfolio-assisted assessment of general education', 'Assessment programs and projects: a directory', 'Catching theory up with practice: conceptual frameworks for assessment', and 'Behind outcomes: contexts and questions for assessment'.

ASHCROFT, K. & PALACIO, D. (1996) Researching into Assessment and Evaluation in Colleges and Universities (London, Kogan Page).

ASSESSMENT UPDATE (Bimonthly journal) Progress, Trends and Practices in Higher Education (San Francisco, CA, Jossey-Bass).

BANTA, T.W. (Ed.) (1988) Implementing Outcomes Assessment: Promise and Perils (San Francisco, CA, Jossey-Bass).

BANTA, T.W. & ASSOCIATES (1993) Making a Difference: Outcomes of a Decade of Assessment in Higher Education (San Francisco, CA, Jossey-Bass).

BANTA, T.W., LUND, J.P., BLACK, K.E. & OBLANDER, F.W. (1996) Assessment in Practice: Putting Principles to Work on College Campuses (San Francisco, CA, Jossey-Bass).

DRISCOLL, A. & GELMON, S.B. (1998) Assessing the Impact of Service Learning: A Workbook of Strategies and Methods (Portland, OR, Portland State University, Center for Academic Excellence).

EWELL, P.T. (Ed.) (1985) Assessing Educational Outcomes, New Directions for Institutional Research, No. 47 (San Francisco, CA, Jossey-Bass).

FARMER, D.W. (1988) Enhancing Student Learning: Emphasizing Essential Competencies in Academic Programs (Wilkes-Barre, PA, King's College).

GAFF, J., RATCLIFF, J.L. & ASSOCIATES (1996) Handbook of the Undergraduate Curriculum: A Comprehensive Guide to Purposes, Structures, Practices, and Change (San Francisco, CA, Jossey-Bass).

GARDINER, L.F., ANDERSON, C. & CAMBRIDGE, B.L. (Eds) (1997) Learning Through Assessment: A Resource Guide for Higher Education (Washington, DC, American Association for Higher Education) (revised 1998).

HALPERN, D.F. (Ed.) (1987) Student Outcomes Assessment: What Institutions Stand to Gain (San Francisco, CA, Jossey-Bass).

MAKI, P.L. (1997) Summary of the Survey on Enhancing Institutional Effectiveness Through Student Outcomes Assessment at the Undergraduate Level (Bedford, MA, New England Association of Schools and Colleges, Commission on Institutions of Higher Education) (also available on CIHE's web site: http://www.neasc.org/neasc/cihe.htm).

NICHOLS, J.O. (1995) A Practitioner's Handbook for Institutional Effectiveness and Student Outcomes Assessment Implementation (New York, Agathon Press).

NICHOLS, J.O. (1995) Assessment Case Studies: Common Issues in Implementation with Various Campus Approaches to Resolution (New York, Agathon Press).

ROGERS, G.M. & SANDO, J.K. (1996) Stepping Ahead: An Assessment Plan Development Guide (Terre Haute, IN, Rose-Hulman Institute of Technology).

Web sites

http://www.ga.unc.edu/UNCGNAssessment/
Internet Resources for Higher Education Outcomes Assessment
Maintained by the University of North Carolina
Has links to many other sites about assessment

http://www.ericae.net/
ERIC Clearinghouse on Assessment and Evaluation
A project of the US Department of Education
Has general information about assessment, access to tests, information about alternative assessment, links to other sites

http://www.ohiou.edu/~inres/assessments/index.html
Assessment-related World Wide Web Sites
Maintained by the Ohio University Office of Institutional Research
Has links to campus-based assessment sites

http://www.ABET.org
