DOCUMENT RESUME
ED 197 117 CE 027 688
AUTHOR Pearsol, James A.; And Others
TITLE Assessment of the Career Planning Support System. Final Report.
INSTITUTION Ohio State Univ., Columbus. National Center for Research in Vocational Education.
SPONS AGENCY National Inst. of Education (DHEW), Washington, D.C.
PUB DATE Nov 79
GRANT OB-NIE-G-78-0211
NOTE 190p.; Some pages may not reproduce well due to light and broken print. For related documents see ED 143 866-877.
EDRS PRICE MF01/PC08 Plus Postage.
DESCRIPTORS Attitude Measures; *Career Development; Career Education; Career Guidance; *Educational Planning; *Program Development; *Program Effectiveness; Program Evaluation; Secondary Education; *Teacher Attitudes
IDENTIFIERS *Career Planning Support System; *Support Systems
ABSTRACT
The National Center for Research in Vocational Education conducted an assessment to report information about the impact of the Career Planning Support System (CPSS) on high school staffs' planning and career development activities for student career development. (CPSS is a package of materials designed to enable a high school staff to improve effectiveness of its career development program through systematic program planning.) The study used a pre-posttest, experimental and control group research design. Ten high schools used CPSS; eight did not. Two levels of measurement were used: (1) two data collection forms and a rating instrument to collect and rate information about each school's career development program and (2) a test of CPSS knowledge and an attitude scale related to systematic program planning to assess CPSS steering committee members in experimental schools. Findings indicated greater change in experimental than in control schools toward installation of a systematic plan for career guidance, higher quality career development activities in experimental schools, and increased knowledge of CPSS and favorable attitudes of committee members in experimental schools. (Appendixes, amounting to over 60% of the report, include assessment materials and forms and other materials related to the study.) (YLB)
***********************************************************************
Reproductions supplied by EDRS are the best that can be made
from the original document.
***********************************************************************
FINAL REPORT
Assessment of the Career Planning Support System
James A. Pearsol
Ann R. Nunez
H. Lawrence Hotchkiss
Vernon R. Padgett

The National Center for Research in Vocational Education
The Ohio State University
1960 Kenny RoadColumbus, Ohio 43210
November 1979
U.S. DEPARTMENT OF HEALTH, EDUCATION & WELFARE
NATIONAL INSTITUTE OF EDUCATION

THIS DOCUMENT HAS BEEN REPRODUCED EXACTLY AS RECEIVED FROM THE PERSON OR ORGANIZATION ORIGINATING IT. POINTS OF VIEW OR OPINIONS STATED DO NOT NECESSARILY REPRESENT OFFICIAL NATIONAL INSTITUTE OF EDUCATION POSITION OR POLICY.
THE NATIONAL CENTER MISSION STATEMENT
The National Center for Research in Vocational Education's mission is to increase the ability of diverse agencies, institutions, and organizations to solve educational problems relating to individual career planning, preparation, and progression. The National Center fulfills its mission by:
Generating knowledge through research
Developing educational programs and products
Evaluating individual program needs and outcomes
Installing educational programs and products
Operating information systems and services
Conducting leadership development and training programs
A Final Report on a Project Conducted UnderGrant No. OB-NIE-G-78-0211
The project presented or reported herein was performed pursuant to a grant from the National Institute of Education, Department of Health, Education and Welfare. However, the opinions expressed herein do not necessarily reflect the position or policy of the National Institute of Education, and no official endorsement by the National Institute of Education should be inferred.
U.S. DEPARTMENT OF HEALTH, EDUCATION AND WELFARE
National Institute of Education
FOREWORD
The challenge to assist youth in gaining the skills and knowledge to plan for and acquire meaningful careers is a major concern of the educational community, particularly high schools. In response to this challenge, The National Center developed and tested, under sponsorship of the National Institute of Education (NIE), the Career Planning Support System (CPSS). CPSS is a set of staff instructional materials that show high school staffs how to deliver improved, cost-effective career guidance services that meet student needs and fall within the resources of the individual school.
This report documents a controlled evaluation of the CPSS conducted between June 1, 1978 and November 30, 1979. By using ten experimental (used CPSS) and eight control (did not use CPSS) high schools, National Center staff conducted a national assessment of the effectiveness of the CPSS. The results of the national assessment show that the career development programs in the ten high schools that used CPSS were significantly better than those in the eight schools that did not use the CPSS. We are pleased to report that, in a controlled evaluation, CPSS works.
We are deeply grateful to the staff in the high schools and school districts who participated in the assessment study. Their cooperation and consistent enthusiasm were central to the successful completion of the project. The names of the staff, schools, and school districts involved in the national assessment are listed in Appendix L of this report.
Additional thanks go to the project review panel who were required to rate anonymously the career development programs of the high schools participating in the assessment. The panel consisted of fifteen experts in the fields of career education practice and research. Their names are listed in Appendix G of the report.
Special thanks go to Robert I. Wise, NIE Project Officer, who provided support and guidance during the CPSS evaluation, to Dr. Paul Raffeld, instrument consultant, for his advice in revising the rating instrument, and to Dr. James Altschuld, consultant.
Finally, we are grateful to the project staff who planned and conducted the assessment of CPSS: Mr. James A. Pearsol, Dr. Ann R. Nunez, Mr. Vernon Padgett, Dr. Donald C. Findlay, Mr. Drew Denton, Ms. Susan Klaiber, and Dr. H. Lawrence Hotchkiss, Project Director. Special thanks go to Ms. Nancy Robinson, Ms. Debbie Frederick, and Ms. Debbie Cantan for their help in preparing for and typing the final report.
Robert E. Taylor
Executive Director
The National Center for Research in Vocational Education
TABLE OF CONTENTS

FOREWORD iii
ABSTRACT vii
INTRODUCTION 1
    Description of CPSS 2
    Overview of the Study 6
METHODOLOGY 13
    Instrumentation 13
    Panel of Raters 21
    Experimental and Control School Selection 25
    Experimental and Control School Implementation 30
RESULTS 37
    Rating Instrument 37
    Knowledge and Attitude Tests 49
CONCLUSIONS 58
BIBLIOGRAPHY 61
APPENDICES 62
    A - Dissemination Plan 62
    B - Career Development Program Status Report 66
    C - Verification Checklist 89
    D - Career Development Program Rating Instrument and Directions 99
    E - Career Planning Support System Knowledge Test 124
    F - Perceptions of Program Planning for Career Development 132
    G - Panel of Expert Judges 140
    H - Original Target Cities and Cover Letter 142
    I - School Information Checklist 146
    J - Work Statement 153
    K - Joint Dissemination Review Panel Report 163
    L - Acknowledgements 174

LISTS OF FIGURES AND TABLES

Figures
1. CPSS Assessment Study Research Design 7
2. Forms and Instruments 14
3. ANOVA's, ANCOVA's, reliabilities and average confidence ratings for hypothesis one 43
4. ANOVA's, ANCOVA's, reliabilities and average confidence ratings for hypothesis two 46
5. Knowledge Test Scores 52
6. Attitude Scale Scores 54

Tables
1. Data Sources for the Assessment of CPSS 11
2. Experimental and Control Schools 28
3. Characteristics of Experimental and Control Schools in Each District 29
4. Key Characteristics of Test Sites 31
5. Mean Scores on CPSS Knowledge Test 51
6. Control Schools' Knowledge Scores 53
7. Mean Scores on CPSS Attitude Scale 56
8. Attitude Scores for Control Schools 57
ABSTRACT
The National Center for Research in Vocational Education
conducted the "Assessment of the Career Planning Support System"
to report information about the impact of the Career Planning
Support System (CPSS) on high school staffs' planning and career
development activities for student career development.
CPSS is a package of materials designed to enable a high
school staff to improve the effectiveness of its career develop-
ment program through systematic program planning.
The study used a pre-post, experimental and control group
research design. Ten high schools were assigned to the experi-
mental condition, using CPSS, and eight high schools were assigned
to the control condition and did not use CPSS. The high schools
were located in eight different school districts in Arizona,
Colorado, Florida, Illinois, Kentucky, Maryland, and Tennessee.
Each school provided a part-time coordinator who was respon-
sible for the preparation and completion of data collection forms
and served as the on-site contact person with National Center
staff. In the experimental schools, this contact person also
served as the CPSS coordinator and chaired a CPSS steering com-
mittee in the school.
Two levels of measurement were used in the study. The most
important, or primary, level involved the use of two data collection
forms and a rating instrument. The data collection forms were
completed at each school and were used to collect information
about each school's career development program. The rating instru-
ment was used by an external panel of experts to rate the informa-
tion collected from each school on the data collection forms,
both pre and posttest. The fifteen panel members represented pro-
fessional fields related to career development research and prac-
tice. The instrumentation developed for the primary level of
measurement was used to assess two major hypotheses: I- experi-
mental school staffs, using CPSS, would demonstrate greater change
toward the installation of a systematic plan for career guidance
than would control schools, and II-experimental school staffs
would develop a plan that contains higher quality career develop-
ment activities to improve student career development skills than
would control schools.
A secondary level of investigation involved assessments of
CPSS steering committee members in the experimental schools. A
test of CPSS knowledge and an attitude scale related to systematic
program planning were administered to these school staff. The
instruments were used to assess CPSS steering committee members'
knowledge of CPSS and attitudes toward planning for career devel-
opment programs both pre and posttest. This information was used
to support Hypothesis I above.
The findings reported for the primary level of measurement,
Hypothesis I, were highly significant. The experimental schools
displayed greater change toward the installation of a systematic
plan for career guidance than did control schools. The findings
for the primary level of measurement, Hypothesis II, were mixed;
however, a positive trend was observed in favor of the experimental
schools regarding the quality of career development activities.
The secondary level of measurement produced significant findings
related to increased knowledge of CPSS and favorable attitudes
toward systematic planning for career development programs among
CPSS steering committee members in the experimental schools.
A major product of the study was the preparation of a report
for the Joint OE-NIE Dissemination Review Panel. The Panel reviews
the products of federally sponsored research for dissemination funds.
The results of the Panel's evaluation of the report were not avail-
able at the time the final report was published.
INTRODUCTION
The National Center for Research in Vocational Education
conducted the Career Planning Support System (CPSS) Assessment
Study to report information about the impact of CPSS on high
school staffs' planning and programming for student career
development. The eighteen-month study (June 1, 1978 -
November 30, 1979) assessed the effects of CPSS in selected high
schools during one academic year (October 1978 - June 1979).
Project staff used the June - September 1978 months to conduct
site selection and instrument development activities, and the
July - November 1979 months to complete ratings of school data,
data analysis, and final reports.
This controlled assessment involved the use of ten experi-
mental (used CPSS) high schools and eight control (did not use
CPSS) high schools in eight different school districts throughout
the United States. The results of the assessment are presented
in this final report and were submitted in a condensed report to
the Joint Dissemination Review Panel (JDRP) of the U.S. Office
of Education and the National Institute of Education for review
and approval.
Description of CPSS
The Career Planning Support System (CPSS) is a package of
materials designed to enable a high school staff to improve the
effectiveness of its career development program through system-
atic program planning. "Career development" in the CPSS perspec-
tive is defined as the process by which an individual student
acquires the basic, non-technical skills to cope in the
world of work. CPSS is a support system that helps organize a
school's staff and resources to meet the career development needs
of students; therefore, CPSS is not a package of materials that
an individual student might use to explore careers. CPSS is a
set of program management procedures that are designed to promote
five basic student career development skill areas: 1) Personal-
Social Awareness Skills, 2) Career Exploration Skills, 3) Job
Acquisition Skills, 4) Education and Training Exploration Skills,
and 5) Education and Training Acquisition Skills. Over time,
student skills development in these five areas is the expected
outcome of the procedures suggested by CPSS.
CPSS is implemented by school staff and employs a package
of handbooks, reproducible forms and filmstrips that guide the
planning, implementation and evaluation of a high school's career
development program. The following list describes a complete
set of CPSS materials:
The Coordinator's Training Guide is a self-instructional training guide for the part-time CPSS coordinator.

The Coordinator's Handbook contains instructions that describe step-by-step procedures for managing and implementing CPSS in the high school.
Camera-Ready Forms are reproducible copies of each form needed for the questionnaires, instructions, CPSS Program Information File, etc.
Handbooks
The Advisory Committee Handbook defines the responsibilities and duties of Advisory Committee members (five copies).

Assessing Resources guides a resource leader in directing a task force to collect information on and account for the use of resources in the school and community.

Assessing Needs: Surveying provides instruction for preparing, administering, and collecting survey questionnaires for students, graduates, parents, and faculty/staff (five copies).

Assessing Needs: Tabulation contains instruction on manually tabulating data collected by questionnaires (five copies).

Analyzing Methods directs the methods specialist to report the availability of guidance methods and instructs him/her on how to integrate this knowledge into the construction and review of career development units.

The Manual for Writing Behavioral Objectives is a self-instructional guide for the behavioral objectives specialist.

Writing Behavioral Objectives informs the behavioral objectives specialist about the function of behavioral objectives in the construction of career development units.

Producing CDUs (Career Development Units) provides direction for developing career guidance/development activities.
Filmstrip/Audio Tape Presentations include:

AV-1: "An Orientation to CPSS"--designed to orient interested persons or special groups to CPSS.

AV-2: "Shaping Program Goals"--an overview of how the needs and resources assessments lead to goals for your school.

AV-3: "Behavioral Objectives"--used with the behavioral objectives manual.
AV-4: "Producing CDUs"--an overview of the career development unit process.
To accomplish the planning, implementation and evaluation
procedures, CPSS recommends that a school coordinator lead the
CPSS effort with the assistance of a working steering committee
comprised of students, teachers, counselors, and administrators.
The CPSS coordinator gives direction to the CPSS effort and
chairs the steering committee; however, much of the planning,
implementation and evaluation is performed by the steering com-
mittee with the help of other school faculty and members of the
community.
The key procedural steps prescribed by CPSS are:
1. Organization of school staff: The CPSS coordinator, using CPSS manuals as guidelines, organizes a steering committee of faculty, students, counselors and administrators who meet regularly to complete the following steps. For each of the steps below, CPSS provides complete instructions.

2. Assessment of resources: Steering committee members compile a record of resources available in the school and community that might be used in the career development program. Resources include equipment, space, personnel, funds, and materials. Resource assessment also includes collecting demographic data describing the school and community, information about current career guidance and career development instructional activities in the state, district, school, and feeder schools, and a record of resources expended. The CPSS package includes a list of important categories of resources and tabulation forms for recording the resources in each major category.

3. Assessment of student needs: Steering committee members, with the help of other faculty and students, survey students, faculty, recent high school graduates and parents in order to determine the career development needs of the student body. Camera-ready master copies of the necessary survey instruments are provided in the CPSS materials.
4. Writing goals and behavioral objectives: Steering committee members use the results of the needs assessment to specify career development goals derived from needs and to list the goals in order of priority. Priorities depend, in part, on available resources. After the goals have been listed in order of priority, the highest priority goals are translated into behavioral objectives for students.
5. Creating career development units (CDUs): Interested faculty, with guidance from steering committee members, design CDUs. A CDU is a sequence of activities that is designed to achieve related sets of behavioral objectives for students. CPSS manuals prescribe the components of successful CDUs. The CDUs are geared toward the five basic career development skill areas listed earlier in this section. An example of a CDU might be a career development course or a sequence of related field trips.
6. Annual program review: At the end of each year the steering committee members review the CPSS career development program, including CDU development and implementation, the status of career development resources, progress toward program goals in satisfying student needs, and plan for the following year's efforts.
7. Program reassessment: CPSS recommends that the needs assessment be administered every three years. After three years, the data from recent graduates, who were introduced to CPSS-derived activities in the previous three years, are compared with data from current students to determine if the program plans initiated through CPSS are working and if student needs in high priority areas have been diminished (over the three-year period).
The seven points listed above are the primary procedural
components of the CPSS. The seven points are systematically
interdependent in that steps one through four must occur first
and lead to steps five, six, and seven. Approximately three
academic years are required to install CPSS fully. The first
year is devoted mostly to initial, systematic career development
program planning, basically steps one through four above. After
the first year's foundation is laid, the CPSS steering committee
focuses on the production and implementation of CDUs (step five),
the heart of the school's career development program. This
Assessment Study focused primarily on steps one through four;
however, some data related to step five were collected.
CPSS, as a support system, is designed to provide the or-
ganizational framework and procedural steps required to install
an accountable, school-wide career development program. It does
not prescribe what specific career development activities a school
should use, but rather provides a means for a school to focus
its career development program on the unique needs of its own
students and within the bounds of its own resources.
Overview of the Study
Design
The CPSS Assessment Study was conducted to assess the ef-
fectiveness of CPSS. The study used a pre-post, experimental
and control group research design to determine the impact of
CPSS on selected high schools during one academic year. Eighteen
comparable high schools participated in the study. Ten high
schools were assigned to the experimental condition, implementing
CPSS, and eight high schools were assigned to the control condition.
The following chart shows the months when the pre and posttest
data were collected and when the CPSS treatment occurred.
Figure 1. CPSS Assessment Study Research Design.

[Timeline chart, September 1978 through June 1979: the ten experimental schools completed the pretest in October - November 1978, used CPSS from late 1978 through June 1979, and completed the posttest in June 1979. Six control schools completed the pretest in October - November 1978 and two in December 1978*; all control schools completed the posttest in June 1979.]

* Two control schools were unable to complete the pretest until 12/15/78.
Objectives
Two objectives guided the study. The objectives addressed
the capacity of CPSS to produce 1) a systematic plan for career
guidance and 2) career development activities that have a high
probability of improving student career development skills. The
objectives for the study are listed below:
1. School counselors, teachers, administrators, and students involved in CPSS for one academic year will develop a systematic plan for career guidance as judged by an external panel of reviewers.

a. CPSS steering committees in the experimental schools will demonstrate a working knowledge of the operation of the Career Planning Support System.

b. Attitudes of CPSS steering committee members in the experimental schools toward systematic planning for career development programs will either be more favorable by the end of the school year or will not be any less favorable than before the initiation of CPSS.
2. School counselors, teachers, administrators, and students involved in CPSS for one academic year will develop a plan that contains activities having a high probability of improving student career development skills as judged by an external panel of reviewers.
Objective One is based on the assumption that "a systematic
plan" is -derived from coherently-related planning components.
For the purposes of this study "a systematic plan for career guid-
ance" consists of the following planning elements:
An organizational structure facilitating a career development program, to include clearly designated leadership, permanent active committees and work groups, and administrative cooperation

An assessment of the career development needs of local students and use of the results of the needs assessment in the career development program

Creation of explicit career development goals reflecting assessed student career development needs

Creation of behavioral objectives designed to implement the goals

Creation of student activities to achieve the objectives and goals
Descriptive information related to the elements above was
collected from both experimental and control schools, pre and
posttest, and rated by expert raters to assess Objective One.
It was expected that experimental schools would produce greater
change toward a systematic plan for career guidance than would
control schools.
Objectives la and lb assume that demonstrated knowledge of
CPSS procedures and favorable attitudes toward systematic planning
will insure that experimental schools know how to develop a system-
atic career development program plan and display favorable attitudes
already or will develop favorable attitudes about systematic
planning efforts. Demonstrated knowledge of CPSS and supportive
attitudes lend credence to the expectation that the systematic
planning steps developed in Objective One can be implemented.
Objective Two is based upon the assumption that systematic
career development program planning leads to higher quality
activities to meet student needs for career development skills
than activities found in schools that do not use systematic
planning procedures. Although the one-year assessment period did
not permit full implementation of CPSS and full development of
CDUs, some data were collected allowing comparisons between experi-
mental and control school career development activities.
Measurement
Two types of instrumentation were developed to meet Objectives
One and Two. One type of instrumentation was the two data sources
which were used to report the status of career development program
planning efforts and career development activities in experimental
and control schools both pre and posttest. The first data source
was a Career Development Program Status Report (CDPSR) and the
other was a Verification Checklist (VCL).
The second type of instrumentation was the Career Development
Program Rating Instrument (RI) used by an expert panel of raters.
The raters completed the rating instrument using the CDPSR and
VCL as data sources. The purpose of the ratings (provided on
the rating instrument) was to assess the presence or absence of
key career development program planning components and the quality
of career development program activities.
The sub-objectives of Objective One, la and lb, were
assessed through a separate set of instrumentation. A knowledge
test of CPSS procedures, The Career Planning Support System
Knowledge Test, was developed by project staff to assess steering
committee members' knowledge of CPSS. An attitude scale, Percep-
tions of Program Planning for Career Development, developed by
project staff was designed to assess steering committee members'
attitudes toward systematic planning. The data sources, rating
instruments, knowledge test and attitude scale are described
fully in the METHODOLOGY section.
Table 1 displays the data sources and timing for the CPSS
Assessment Study relative to the objectives of the study.
Details about the progress of the Assessment Study and
subsequent results follow. Site selection, instrument develop-
ment, site monitoring, pre-post data collection, and the conduct
of the expert panel of raters' meetings are all described in the
METHODOLOGY section. The outcomes of the study are reported in
the RESULTS and CONCLUSIONS sections of this report.
Dissemination
Dissemination of the findings of this Assessment Study was
achieved through a three-step dissemination plan (see Appendix A).
Step one involved a minor revision of the CPSS package of materials
to correct typographical errors and to include information about
the Assessment Study. Step two required project staff to prepare
and submit two journal articles about the study and to coordinate
an informational seminar about CPSS at a national conference.
Step three involved mailing a complimentary set of CPSS materials
to each state and territory department of guidance.

TABLE 1. Data Sources for the Assessment of CPSS.

Objective 1 - Systematic career development program plans (experimental and control schools):
    At beginning of the year: Composite Status Reports generated by the schools (CDPSR); school records; completion of on-site Verification Checklist by National Center project staff (VCL)
    At interim points of the year: verification by project staff via telephone, mail, or site visit
    At end of the year: plans/reports generated by the schools (CDPSR); school records; completion of on-site Verification Checklist by National Center project staff (VCL)

Objective 1a - Knowledge of CPSS (experimental schools only):
    At beginning of the year: Career Planning Support System Knowledge Test (steering committee only)
    At end of the year: Career Planning Support System Knowledge Test (steering committee only)

Objective 1b - Attitudes toward systematic planning for career development programs (experimental schools only):
    At beginning of the year: Perceptions of Program Planning for Career Development Scale (steering committee only)
    At end of the year: Perceptions of Program Planning for Career Development Scale (steering committee only)

Objective 2 - Career development program plans containing activities having a high probability of improving students' career development skills (experimental and control schools):
    At beginning of the year: Composite Status Reports generated by the schools (CDPSR); school records; completion of on-site Verification Checklist by National Center project staff (VCL)
    At interim points of the year: verification by project staff via telephone, mail, or site visit
    At end of the year: plans/reports generated by the schools (CDPSR); school records; completion of on-site Verification Checklist by National Center project staff (VCL)
METHODOLOGY
Instrumentation
Two data collection forms and three assessment instruments
were developed to assess attainment of the project objectives.
An overall listing of the forms and instruments, who completed
them, the time and sequence of their completion, and each objec-
tive they address is presented in Figure 2.
Data Collection Forms
Two data collection forms were used to gather descriptive
information from the participating schools both pre and posttest.
These data sources constituted the inputs for executing the Career
Development Program Rating Instrument.
The Career Development Program Status Report (CDPSR). The
CDPSR organized information about the school and community and
about the ongoing and projected career development program planning
efforts at participating schools. The construction of the CDPSR
was based upon the kinds of descriptive information needed to
determine systematic planning (as exemplified by CPSS). The
CDPSR (see Appendix B) was completed by school personnel both pre
and posttest and consisted of four sections:
I. Site Description - community, school district and school

II. Career Development Program Description - staff organization, state and district goals, and an overview of activities and resources

III. Present1 Planning Efforts - personnel, assessment, and evaluation activities

IV. Projected1 Planning Efforts - personnel, assessment, and evaluation activities.

Figure 2. Forms and Instruments

Data Collection Forms:
    Career Development Program Status Report (CDPSR) - completed by high school staff (all schools); pretest completed 12/78, posttest completed 6/79; intended objectives 1 & 2
    Verification Checklist (VCL) - completed by National Center project staff (all schools); pretest completed 12/78, posttest completed 6/79

Assessment Instruments:
    Career Development Program Rating Instrument - completed by expert panel of raters; pretest data rated 8/79, posttest data rated 8/79; intended objectives 1 & 2
    Career Planning Support System Knowledge Test - completed by CPSS steering committee members (experimental schools only)1; pretest completed 12/78, posttest completed 6/79; intended objective 1a
    Perceptions of Program Planning for Career Development - completed by CPSS steering committee members (experimental schools only)1; pretest completed 12/78, posttest completed 6/79; intended objective 1b

1 In addition to the experimental school CPSS steering committee members, thirteen school staff in three control schools completed these instruments at posttest only (see pp. 50 & 54).
By completing the CDPSR, participating schools documented
the extent to which their existing career development program
planning reflected the basic components of systematic career
development program planning (staffing and personnel support,
needs assessment, goal selection, development of behavioral ob-
jectives, and career development activities).
A pilot tryout of the form was completed with the assistance
of two external consultants, both experienced career development
practitioners. They were told that the form would be used by high
school staff members to report information about the status of
their career development programs and the programs' suitability to
their communities. The consultants conducted their reviews of the
CDPSR from the viewpoint of a director of guidance who might be
asked to complete it for his or her high school. The consultants
were instructed to assess the clarity of the structure, terminology,
and phrasing of the report and the ease of retrieval of the requested
1 In the pretest version, sections III and IV referred to the 1977-78 and 1978-79 school years, respectively. In the posttest version, these changed to 1978-79 and 1979-80, respectively.
information as well as to comment on other problems encountered
in completing the report.
Subsequently, a revised version of the CDPSR was submitted
for final review to two additional external specialists in instru-
ment development and career guidance. The consultants' report
stated that the form provided information related to the goals of
a systematic career development program and that it appeared to
have content validity.
The Verification Checklist (VCL). This form provided a
means by which a National Center project staff person could corro-
borate, clarify, and expand the information recorded on a school's
Career Development Program Status Report. The design of the VCL
was based on the essential components of systematic career develop-
ment program planning as exemplified by CPSS (see Appendix C).
During the pre and posttest site visits, a National Center
project staff member completed the VCL based on the information
presented in the school's CDPSR. Subsequently, during a face-to-
face meeting with school personnel who completed the CDPSR, the
National Center staff member asked for any clarification, probed
for any missing information, and attempted to attain the most
complete and accurate information possible with respect to the
school's career development program planning efforts. Finally
both the staff member and school coordinator signed the VCL.
In this manner the VCL provided evidence supportive of, and sup-
plementary to, the self-reporting mechanism of the CDPSR.
Assessment Instruments
Career Development Program Rating Instrument (RI). The
Career Development Program Rating Instrument was used by a rating
panel with expertise in career development to rate the adequacy,
quality, and potential impact of a high school's career develop-
ment program planning as reported through the descriptive data
sources, the CDPSR and the VCL.
The panelists were instructed to read all information pro-
vided in a school's CDPSR and VCL before using the RI to rate the
school. The RI consisted of 33 items and was divided into seven
sections. Each section had directions that applied to the items
in that section and listed specific references in the data sources
for each section or for individual items in a section. The data
sources (CDPSR and VCL) were often accompanied by appendices that
were attached to the CDPSR as part of the reference.
The RI sections represented basic component areas of a
systematic career development program planning effort. Concep-
tually, sections I through V asked questions concerning specific
facts describing the school's career development program. Section
VI of the RI asked for the rater's estimate of the impact on
student career development skills of future plans for the school's
career development program. Section VII provided the raters an
opportunity to express their judgments about the overall quality
of career development program planning at a school relative to
Objectives One and Two. Sections I through V, thus, were designed
to familiarize each rater with the facts, in preparation for the
broad judgments requested in Sections VI and VII.
Both discrete variables (dichotomous and categorical) and
continuous variables comprised the RI items (see Appendix D).
Raters usually indicated their judgments on a four or five point
Likert scale or by a Yes-No response unless specific instructions
directed them otherwise. Each item was followed by a confidence
rating which enabled the rater to indicate his/her confidence that
the answer marked was correct. Therefore, in addition to the
reliability coefficients reported in the RESULTS section of this
report, another estimate of reliability was obtained when panelists
were asked to estimate their confidence in each rating they made.
Raters were asked to place a check along a scale from zero to
100 indicating their judgments regarding the likelihood that
their answers were accurate. The format of the confidence rating
is reproduced below.
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                complete confidence
A prototype of the present RI was used by the rating panel
in January, 1979 to rate the pretest data sources. However, due
to unacceptably low reliability measures among raters, the original
rating instruments were revised with the help of an external test
construction consultant and the pre and posttest data were rated
in August, 1979, using the revised RI.
Principal concerns in the RI revision were: (1) to include
only items that were clearly answerable given the CDPSR and VCL
information, and (2) to eliminate items that did not allow control
schools a fair opportunity to receive a high rating. Following
examination and review by project staff, the RI was pilot tested
by six National Center staff persons with expertise in instrument
design, evaluation, and career development. The recommendations
of these persons were incorporated in the final version of the RI.
Career Planning Support System Knowledge Test. A test of
CPSS knowledge was constructed to assess Objective la (see p. 7).
A prototype multiple-choice test was developed based upon the
contents of the Career Planning Support System Coordinator's
Handbook, the principal reference for describing the concepts and
procedures of CPSS. Since the test was intended to measure know-
ledge of CPSS planning procedures, an effort was made by project
staff to exclude items not related to the CPSS planning model.
Any items of this nature detected later were deleted or not
scored.
The prototype test was reviewed first by a two-person team
of outside technical consultants followed by a single outside
technical consultant. In response to the consultant recommenda-
tions, action was taken to increase the coverage of CPSS planning
steps, to eliminate items dealing with general planning procedures,
and to improve vaguely written items. On the advice of the
consultants, the multiple-choice format was converted to a short
answer (open-ended) one.
A small-scale tryout of the test was conducted using two
persons familiar with CPSS and two persons unfamiliar with it.
After further revision, the test was reviewed for clarity of
expression and ease of understanding by two school district
persons present at the National Center for training as CPSS
school coordinators. Their suggestions were incorporated in
the final revision of the test.
The completed knowledge test consisted of 21 items2 with
an average of three to four score points possible for each item
(see Appendix E). A total of 74 points was possible.
The 74 possible points of the test were distributed across
eight areas, with the number of items and points for each area
determined, for the most part, by the approximate amount of
time the steering committee was scheduled (by the Coordinator's
Handbook) to meet regarding the topic. The points by areas were:
career development units (17), needs assessment (12), resources
(9), annual review (9), goals (8), objectives (7), overall planning
coordination (7), student involvement (5). The results for the
knowledge test are reported in the RESULTS section, p. 49.
2 Originally, there were 27 items in the test. Six items (9, 11, 13, 14, 21B, 18) were not used because of defects found after the test was printed.
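As an arithmetic check on the distribution just described, the eight area weights do sum to the stated 74 possible points. A minimal sketch in Python (hypothetical names; not part of the original study materials):

    # Points assigned to each content area of the CPSS Knowledge Test,
    # as listed in the text above.
    POINTS_BY_AREA = {
        "career development units": 17,
        "needs assessment": 12,
        "resources": 9,
        "annual review": 9,
        "goals": 8,
        "objectives": 7,
        "overall planning coordination": 7,
        "student involvement": 5,
    }

    total = sum(POINTS_BY_AREA.values())
    assert total == 74  # matches the stated 74 possible points

    # Relative emphasis of each area as a share of the total score.
    for area, pts in POINTS_BY_AREA.items():
        print(f"{area}: {pts} points ({pts / total:.0%})")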
Perceptions of Program Planning for Career Development.
A prototype attitude instrument was constructed with the help of
an external consultant to assess the attitudes of CPSS steering
committee members toward systematic planning for career develop-
ment programs. This instrument was developed to meet Objective
lb (see page 7). Recommendations from a team of two external
technical consultants, a single external technical consultant,
and a National Center instrument design specialist were followed
in revising and improving the instrument.
The instrument consisted of 32 statements about program
planning for career development (see Appendix F). The statements
reflected the planning concepts and procedures of CPSS as presented
in the Career Planning Support System Coordinator's Handbook.
Each subject had the opportunity to select one of five responses
(disagree strongly = -2, disagree = -1, neutral = 0, agree = +1,
agree strongly = +2) for each statement. Scoring was accomplished
by coding the five responses (+2 through -2) and summing across
the 32 items. The possible range of scores was from -64 to +64.
The results on the attitude instrument are reported in the RESULTS
section, p. 53.
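A minimal sketch of the scoring rule just described (hypothetical function and names; the report does not describe any scoring software):

    # Score one respondent's Perceptions of Program Planning form.
    # Responses use the five-point coding given in the text:
    #   disagree strongly = -2, disagree = -1, neutral = 0,
    #   agree = +1, agree strongly = +2.
    CODES = {"disagree strongly": -2, "disagree": -1, "neutral": 0,
             "agree": 1, "agree strongly": 2}

    def attitude_score(responses):
        """Sum the coded responses to the 32 statements."""
        if len(responses) != 32:
            raise ValueError("expected responses to all 32 statements")
        return sum(CODES[r] for r in responses)

    # A respondent who agrees with every statement scores +32;
    # over 32 items the possible range is -64 to +64.
    print(attitude_score(["agree"] * 32))  # -> 32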
Panel of Raters
Selection of Panelists
A major part of the Assessment Study involved the selection
and recruitment of an external review panel of experts whose task
would be to rate the adequacy, quality, and potential impact of
career development program planning and activities at experimental
and control schools. A number of criteria for selection of review
panel members were listed: proponent of comprehensive career guid-
ance; national visibility; reputation within the profession; a
mix of practitioners, administrators, and methodologists; availa-
bility; and experience in evaluation. Other considerations in-
cluded minority representation, male to female ratio, and geographic
distribution. National Center staff with familiarity in the area
of career development were asked to suggest potential panel members
meeting these criteria. A list of over 30 persons was compiled
from which fifteen persons would be selected.
In the initial telephone contacts, prospective panel members
were asked if they would be willing to serve on a 15-member rating
panel consisting of persons like themselves with background and
experience in the areas of career development. They were informed
that the panel would be performing assessment activities on data
received from high schools throughout the nation.
The main reason for use of a panel of expert judges is related
to the nature of the subject matter. Few people would doubt that
efficient organization and planning comprise important aspects
of high school career development programs. Yet the important
features of efficient organization and planning remain uncodified
in sufficient detail to permit completely objective measurement.
In such instances, human judgments are essential. Hence, a
panel of individuals with the requisite experience, training, and reputation
was assembled to provide the most accurate judgments available.
Rating of Schools
The rating panel convened at the National Center for Research
in Vocational Education on January 24, 25, and 26, 1979 (see
Appendix G for list of panel members). The panel rated pretest
data collected from the eighteen high schools. The most signifi-
cant outcome of the first meeting was a subsequent revision of the
Career Development Program Rating Instrument.
A second meeting of the rating panel occurred August 20,
21, and 22, 1979. Following brief introductions by project staff,
the panelists were supplied sample copies of the revised Career
Development Program Rating Instrument (RI) with accompanying
definitions and directions. The RI and directions were reviewed
with the panel so that all panelists were sure of their assignments.
The task given the panelists was to use a Career Development
Program Rating Instrument to rate a school's career development
program planning as reported through two data sources, the Career
Development Program Status Report (CDPSR) and the Verification
Checklist (VCL). All panelists completed a rating instrument
for each data set they reviewed.
The panelists' (N=15) rating assignments were planned so
that each school (N=18, pretest; N=18, posttest) was rated by
three or four different and randomly-assigned raters. Assigning
more than one rater to each school permitted numerical assessment
of reliability of the ratings and yielded more accurate results
than could be obtained from a single rating per school. Each
panelist rated both the pre and posttest data from a school.
Each panelist rated eight different data sets (one pretest, one
posttest) from four different schools.
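The workload arithmetic is consistent: fifteen panelists rating four schools each yields sixty school assignments, and sixty assignments spread over eighteen schools gives three or four raters per school. A minimal sketch of one way such a balanced random assignment could be drawn (illustrative only; the report does not describe the exact assignment procedure used):

    import random

    def assign_raters(n_raters=15, n_schools=18, per_rater=4, seed=1):
        """Deal school slots to raters so each rater gets `per_rater`
        distinct schools and each school is rated 3 or 4 times
        (15 * 4 = 60 slots = 18 schools * 3 ratings + 6 extras)."""
        rng = random.Random(seed)
        while True:  # retry until the random deal gives no rater duplicates
            slots = list(range(n_schools)) * 3
            slots += rng.sample(range(n_schools), 6)  # schools rated 4 times
            rng.shuffle(slots)
            hands = [slots[i::n_raters] for i in range(n_raters)]
            if all(len(set(hand)) == per_rater for hand in hands):
                return hands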
Two-hour time periods were allotted for the rating of a single
data set. At the end of each rating period, each data set was
collected and then redistributed to the next appropriate panelist.
In order to eliminate rater bias, the panelists were not
informed that a CPSS model provided the framework for this study.
Neither were they advised of the pre-post, experimental-control
design of the study. They were told that the data sources were
reports taken at two different time periods from a set of high
schools and that some of the schools in the two time periods may
have been the same. However, all identifying information, e.g.,
state, city, school name, school address, dates, had been deleted
prior to the ratings.
In a final debriefing session, after all rating activities
had been completed, the panelists were told that they had parti-
cipated in an assessment study of the Career Planning Support
System. They were given CPSS materials, a study abstract, and
informed of all aspects of the study. The panelists indicated
that they had neither surmised the nature of the study nor
recognized that they had rated pre and posttest data from the
same schools.
A last task required of the panelists was the submission of
a two-page paper of recommendations or comments to assess the
rating procedure, rating instrument, implications for analysis,
and the adequacy of the Status Report and Verification Checklist
as sources of information about a school career development
program. These recommendations were used in preparing the JDRP
report.
Experimental and Control School Selection
The intent of the school selection process was to identify
a sample of urban high schools, ten experimental and ten control,
to participate in the CPSS Assessment Study. Project staff ex-
pected to select two volunteer high schools from each of ten
urban school districts. Once identified, each high school was
to be randomly assigned to either an experimental or control
condition. The purpose in selecting one experimental high school
and one control high school from the same district was to control
for district and geographic variables and to minimize national
travel.
The expected school selection process was difficult to
carry out. CPSS requires a staff commitment of one part-time
coordinator and a working steering committee. Some school dis-
tricts could not commit any staff to the project. Many school
districts were experiencing budget and subsequent staff cutbacks,
union/school board problems, and desegregation difficulties. Other
school district officials cited the passage of California's
Proposition 13 as a detriment to participation in a national
project of the scope of the CPSS Assessment Study during 1978-79.
These factors caused project staff to make compromises in obtain-
ing the desired number of schools within optimal research conditions.
School Selection
Project staff began the school selection process on July 21,
1978, by inviting superintendents of school districts in thirty-
five of the fifty largest cities in the United States to recommend
high schools in their districts to participate in the national
assessment of CPSS (see Appendix H). School district superinten-
dents were encouraged to nominate interested high schools. Each
interested school would complete a School Information Checklist
(SICL) which reported information about the school relative to
the number of faculty/staff, size of the student population,
distribution of ACT/SAT scores, drop-out rates, estimated average
family income of student population, racial/ethnic mix of student
population and other information (see Appendix I). Project staff
ultimately used the SICL to comparatively match schools within
districts. The fifteen city school districts which did not re-
ceive the initial mailing were known to be experiencing some of
the difficulties mentioned in the previous section. By selecting
high schools from districts in the largest cities in the United
States, it was expected that the results of the study would generalize
to the widest population of American high schools.
The initial mailing produced few candidate schools by mid-
September. To enlarge the sampling frame and to broaden geographic
distribution, additional school districts were invited to parti-
cipate in the study. These districts were located in the standard
metropolitan areas (SMAs) surrounding some of the large cities.
By mid-October eighteen high schools representing eight
school districts agreed to participate in the study. The final
two schools, each representing one of two adjacent school districts,
were located in a rural setting and thus expanded the domain of
representation. Table 2 shows the number of experimental and
control schools, the school districts, cities, and states that
participated in the CPSS Assessment Study.
Experimental and Control Assignment
Of the eighteen high schools, four schools were randomly
assigned to experimental and control conditions. Most local
school district administrators insisted that the determination
of experimental and control schools be decided at the local district
level, rather than through random assignment by the National Center;
therefore, fourteen schools were not randomly assigned to condi-
tions (see Table 2). Some of the local considerations for assign-
ment to conditions were: 1) staffing patterns in the experimental
schools allowed for easy assignment of a CPSS coordinator; 2) the
experimental school was ready to start with CPSS, the control
school wanted to wait; 3) a school dropped out at the last minute
and the remaining school was assigned experimental, until a control
school could be found. Nevertheless, schools within districts
appeared closely matched on the characteristics reported on the
School Information Checklist (see Table 3). Table 3 lists the
descriptive information obtained from the SICL about each of the
schools involved in the study.
TABLE 2
Experimental and Control Schools

District                          City and State   # Experimental   # Control    Assignment
                                                   High Schools     High Schools to Conditions
Chicago Public Schools1           Chicago, IL      2                2            Random
Baltimore City Public Schools1    Baltimore, MD    2                1            Nonrandom
Dade County Public Schools1       Miami, FL        2                1            Nonrandom
Jefferson County Public Schools   Louisville, KY   1                1            Nonrandom
Memphis City Schools              Memphis, TN      1                1            Nonrandom
Jefferson County Public Schools   Lakewood, CO     1                1            Nonrandom
Globe Public Schools2             Globe, AZ        1                -            Nonrandom
Miami Public Schools2             Miami, AZ        -                1            Nonrandom

1 To insure a sufficient number of experimental and control schools, project staff accepted all of the schools recommended by interested school districts.

2 Globe and Miami are adjacent districts with one high school in each district. For the purposes of the assessment study, Globe and Miami were matched; Globe is the experimental high school and Miami is the control high school.
TABLE 3
Characteristics of Experimental and Control Schools in Each District1

Columns: school code; treatment4; student population; teachers/counselors/administrators; racial/ethnic percentages2 (W / B / SP / Asian / Nat. Am); ACT composite or SAT; estimated average family income (in thousands); grade levels; student drop-out rate; state

01  X  1029   48/2/2   54 / - / 21 / - / 25   17.5         10-15  9-12   12%    AZ3
02  C   617   35/1/2   58 / - / 40 / 1.4      19.4         10-15  9-12   8.9%   AZ3

03  X  1278   72/5/3   100                    350/300      15-20  10-12  5%     MD
04  X  2006  101/7/4   10 / 90 / -            330/330      5-10   9-12   8%     MD
05  C  3000  120/7/5   30 / 70 / -            300/337      5-10   10-12  13%    MD

06  X  2049  110/6/5   - / 100 / -            12           5-10   9-12   2.9%   IL
09  C  1853   80/4/3   100                    10           5-10   9-12   16.2%  IL
07  X  3054  159/8/5   100 / -                12           10-15  9-12   7%     IL
08  C  2735  139/7/4   100 / -                14           10-15  9-12   8%     IL

10  X  1600   92/5/5   83 / - / 16 / 1        19.3         10-15  10-12  5%     CO
11  C  2258  101/6/5   86 / - / 13 / 1        19.3         15-20  9-12   10.9%  CO

12  X  1950   88/4/4   70 / 30 / -            16.5         5-10   9-12   7%     KY
13  C  1406   80/4/3   77 / 23 / -            18.2         10-15  9-12   5%     KY

14  X   893   63/3/4   72 / 28 / -            16.7         15-20  9-12   2.5%   TN
15  C  1399   55/3/3   85 / 15                18.4         15-20  9-12   5%     TN

16  X  2300  109/7/4   30 / 40 / 30 / -       N/A          10-15  10-12  14%    FL
17  X  3000  138/6/5   62 / 24 / 12 / 2       440 (mean)   10-15  9-12   11%    FL
18  C  2274  116/5/4   70 / 17 / 11 / 2       507 (mean)   15-20  10-12  13%    FL

1 Districts are shown by blank lines separating groups of schools.

2 Ethnic abbreviations: W = White, B = Black, SP = Spanish Surname, Nat. Am = Native American.

3 Schools 01 and 02 are the two rural, Arizona high schools located in separate, adjacent school districts.

4 X = experimental, C = control.
Whenever random assignment to treatment groups cannot be
achieved, observed differences between treatment groups in theory
can be due to nontreatment variables. The standard methodology
for handling objections of this sort is to introduce some type
of statistical control for a small group of variables that are
likely candidates to account for observed differences between
treatment groups. In the present case, the sample size is small
enough to render such procedures of dubious value. One may
observe, however, bivariate relationships between selected "control"
variables and the treatment variable.
In the present study the treatment variable is defined by
the two categories--used CPSS and did not use CPSS. Averages on
the following variables were compared statistically for users and
nonusers of CPSS: student population, faculty/student ratio,
academic test scores, drop-out rate, percentage of the student
population who were minority group members, and a rough estimate
of family income of the student body. As shown in Table 4, in
none of these six tests were statistically significant differences
observed. Hence, it is concluded that the differences between users
and nonusers of CPSS reported in the RESULTS section of this report
are not likely due to any of these six characteristics of schools.
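A minimal sketch of the kind of two-sample comparison summarized in Table 4 below (the report does not name the software or variance assumption used; this sketch assumes the conventional pooled-variance t-test and uses the student populations as transcribed in Table 3):

    from scipy import stats

    # Student populations of the eight control and ten experimental
    # schools, as transcribed from Table 3.
    control = [617, 3000, 1853, 2735, 2258, 1406, 1399, 2274]
    experimental = [1029, 1278, 2006, 2049, 3054, 1600, 1950, 893, 2300, 3000]

    # Pooled-variance two-sample t-test (scipy's default).
    t, p = stats.ttest_ind(control, experimental)
    print(f"t = {t:.3f}, p = {p:.3f}")  # |t| near zero -> no significant difference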
TABLE 4
Key Characteristics of Test Sites

                                     Average of        Average of
Characteristic                       Control Schools   Experimental Schools   t-value
Student population                   1916              1943                    .074
Ratio of faculty and staff
  to student population              19.49             17.64                  1.540
ACT/SAT scores*                      15.67             16.84                   .748
Estimated family income              $12,000           $13,125                 .607
Drop-out rate                        7.4%              10.0%                  1.375
Percent white                        38%               51%                     .814

NOTE: Table entries are averages over the control or experimental schools, as labeled. Experimental school refers to a school that used CPSS during the study, and control school refers to a school that did not use CPSS.

*Five schools made SAT scores available, and the remaining 13 submitted ACT averages. The five SAT scores were converted to the metric of ACT by dividing them by the ratio of the average over schools SAT to the average ACT.
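The footnote's SAT-to-ACT conversion amounts to rescaling each SAT score by the ratio of the two group averages. A minimal sketch (the score lists are illustrative, not the study's actual values):

    def sat_to_act_metric(sat_scores, act_scores):
        """Rescale SAT scores into the ACT metric by dividing each SAT
        score by the ratio of the mean SAT to the mean ACT, so the
        converted scores share the ACT scores' average level."""
        mean_sat = sum(sat_scores) / len(sat_scores)
        mean_act = sum(act_scores) / len(act_scores)
        ratio = mean_sat / mean_act  # SAT points per ACT point
        return [s / ratio for s in sat_scores]

    # Hypothetical example: five SAT averages and thirteen ACT averages.
    sat = [325, 330, 318, 440, 507]
    act = [17.5, 19.4, 12, 10, 12, 14, 19.3, 19.3, 16.5, 18.2, 16.7, 18.4, 15.0]
    print([round(x, 1) for x in sat_to_act_metric(sat, act)])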
Experimental and Control School Implementation

Contracts

Once schools were selected and assigned to experimental
and control conditions, project staff initiated a contractual
relationship with each of the school districts involved through
purchase of services agreements. These were used to help the
participating schools defray some of the costs related to the study.
$5,000.00 was set aside for each experimental school to reimburse
the school district for staff time, and $500.00 was set aside for
each control school for reimbursement of staff time related to data
collection.
Each school was required to provide a part-time coordinator.
In the experimental schools, the coordinator was responsible for
the preparation and completion of data collection forms and served
as coordinator of CPSS in the school. In the control schools,
the coordinator was responsible for the completion of the necessary
data collection forms and served as the on-site contact for the
study. A copy of the work statement which accompanied the purchase
of services agreement is appended (see Appendix J).
Pretest Data Collection
With purchase of services agreements initiated for each
experimental and control school, National Center project staff
began the on-site pretest data collection. As mentioned on p. 13,
each school completed the Career Development Program Status
Report (CDPSR) as a pretest report of the school's career develop-
ment program. Project staff, using the Verification Checklist
(VCL), reviewed the CDPSR on-site to insure that the CDPSR was
a complete self-report of the career development program in the
school. The purpose of the VCL was to highlight the strongest
aspects of the school's program. The VCL was signed by both the
school representative and the project staff member.
These pretest visits were conducted between mid-October and
mid-November in sixteen of the eighteen schools. Two control
schools were not able to complete the pretest data reporting
until mid-December (see Figure 1, p. 7).
At pretest, nine of the eighteen schools received on-site
visits and completed data collection activities prior to assign-
ment to experimental or control conditions. Of these nine, four
were randomly assigned and five were not. The other nine schools
completed pretest data collection activities after assignment to
experimental or control conditions.
The pretest data collected on the CDPSR and VCL provided
the information that the rating panel used to rate the eighteen
schools prior to the CPSS treatment.
In the experimental schools, additional pretesting was per-
formed. Each CPSS coordinator gave the steering committee members two timed pretests (45 minutes): the Career Planning Support System Knowledge Test, covering the members' knowledge of CPSS procedures, and Perceptions of Program Planning for Career Development, covering their attitudes toward systematic planning for career development programs. These tests were not given to control school staff (see pp. 19-21 for descriptions of the instruments).
Training
The ten CPSS coordinators from the ten experimental schools
were brought to the National Center in November, after the pretest
visits, to receive training in CPSS procedures. The three-day
training involved an introduction to the rationale and basis for the CPSS model, an in-depth review of each step in the CPSS process, and "hands-on" practice with the CPSS materials. Once
the CPSS coordinators returned to their schools, after training,
they began to install CPSS.
Site Monitoring
The experimental and control schools were monitored through-
out the year (November, 1978 - June, 1979) up to and including
posttesting. In the experimental schools, CPSS coordinators
completed project logs twice a month, describing the progress of CPSS in the school. In addition, National Center project staff
telephoned each school at least once per month and conducted
an interim on-site visit during February. Project staff completed
an on-site monitoring log for each school visited. The on-site
monitoring log described the nature and scope of the visit.
For the control schools, site monitoring involved telephone calls averaging one per month and an on-site visit in February. An on-site monitoring log was completed for each control school. The purpose of the on-site visit was to review the ongoing activities related to the career development program at each control school.
Technical assistance to experimental or control schools was
minimal. Although the experimental schools received three days
of training in CPSS, project staff did very little to assist
experimental schools in the installation of CPSS.
Posttest Data Collection
The final phase of experimental and control school activity
was the posttest data collection. Each school completed a post-
test version of the CDPSR. This form was exactly the same as
the pretest version except that dates were changed to reflect the
different reporting period. The information required on the form
was the same as the pretest. National Center staff conducted
posttest site visits to collect the CDPSR and reviewed it on-site
to insure that it was completed fully. The VCL was used to
insure that all possible information was reported in the CDPSR.
As during the pretest, the VCL was signed by both project staff
and the school coordinator.
In the experimental schools, additional posttesting occurred.
National Center staff administered to steering committee members
the posttest versions of the knowledge test of CPSS procedures
and the attitude test. Also, post-project interviews were con-
ducted with steering committee members to gather information about
the use of CPSS in the school. These interviews gave steering
committee members a chance to describe the strengths and weaknesses
of CPSS and to report their experiences with CPSS. In addition,
two career development units prepared in each experimental school as a direct result of CPSS were comparatively tested. This information was collected, in part, to address Objective Two (see p. 7). Each experimental school developed two career development units based on their students' needs. Each unit involved instruction for students in some career skill, e.g., career exploration or interviewing. The teacher-developed test of achievement for each career development unit was given to the class that had been taught the unit and also to a comparable class which had not been taught the unit. The result of these comparisons is reported in the RESULTS section, p. 48.
In the control schools, posttesting involved completion of
the CDPSR and cooperative completion of the VCL. Each of the
control schools was given a set of CPSS materials and an orien-
tation to CPSS once the posttest data collection was completed.
RESULTS
Rating Instrument
Major Hypotheses
There were two major hypotheses tested by use of the Career
Development Program Rating Instrument. The first major hypothesis
maintains that school staffs using CPSS will produce greater
change toward a systematic plan for career guidance than will
school staffs not using CPSS. This corresponds directly with
Objective One, p. 7. The second major hypothesis states that school staffs using CPSS will develop higher quality activities to improve student career development skills than will school staffs not using CPSS. This hypothesis relates to Objective Two, p. 7. The numbers used in the analyses for these hypotheses
are the averages over raters of scores for a given item on the
rating instrument.
The key items on the rating instrument related to the first
hypothesis were those taken from section VII of the rating
instrument, items 28 through 33. These were the rating items
that required the raters to make judgments about key components
of a systematic plan for career guidance: staff organization,
needs assessment, goal setting, preparation of behavioral objectives, and design of effective career development activities.
The main rating items for analysis of the first hypothesis are
reproduced below.
28. Estimate the extent to which the school staff was organized to plan systematically a comprehensive career development program by evidence of clearly designated leadership; administrative cooperation; and permanent, active groups and committees.

( )    ( )    ( )    ( )    ( )
(0)    (1)    (2)    (3)    (4)
Limited Extent              Great Extent

29. Estimate the extent to which a student career development needs assessment was conducted, tabulated, properly interpreted, and the data utilized for planning the career development program.

( )    ( )    ( )    ( )    ( )
(0)    (1)    (2)    (3)    (4)
Limited Extent              Great Extent

30. Estimate the extent to which a comprehensive set of ordered career development goals reflecting assessed student career development needs were developed and used in planning, implementation, and evaluation of the program.

( )    ( )    ( )    ( )    ( )
(0)    (1)    (2)    (3)    (4)
Limited Extent              Great Extent

31. Estimate the extent to which a set of behavioral objectives was developed reflecting specific goals and containing a clear statement of the intended audience, behavior, situation, and standard of mastery.

( )    ( )    ( )    ( )    ( )
(0)    (1)    (2)    (3)    (4)
Limited Extent              Great Extent

32. Estimate the extent to which career development activities were developed that reflect student needs, goals, and associated objectives, and that indicate methods, target student group, and outcome measures by referring to the two attached career development activities (yellow).

( )    ( )    ( )    ( )    ( )
(0)    (1)    (2)    (3)    (4)
Limited Extent              Great Extent

33. Based on the available information (including all yellow and pink career development activities), rate the overall quality of the school's career development program.

( )    ( )    ( )    ( )    ( )
(0)    (1)    (2)    (3)    (4)
Very Low Quality            Very High Quality
The rating items used to support the second hypothesis were
items 19, 27 and 32. These were the items that required the
expert raters to make judgments about the quality of career
development activities. The items for analysis of the second
hypothesis are reproduced below.
19.¹ What is your best estimate of the chance that each of the career development activities will improve student career development skills, irrespective of whether the career development skill is listed as an objective of the activity?

Activity A:  ( )     ( )     ( )      ( )
Activity B:  ( )     ( )     ( )      ( )
             Good    Some    Little   Very Little
             Chance  Chance  Chance   Chance
             (3)     (2)     (1)      (0)

27. If the plans reported on SR pp. 16-20 are carried out, what, in your experience, is the chance that student career development skills will be improved?

( )     ( )     ( )      ( )
Good    Some    Little   Very Little
Chance  Chance  Chance   Chance
(3)     (2)     (1)      (0)

¹Item 19 has answer choices A and B. If a school reported two career activities, both A and B were used; if a school reported one career activity, A was used. If none were reported, no activity was rated; therefore, more "cases" are reported for 19A (N=33) than for 19B (N=27).

32. Estimate the extent to which career development activities were developed that reflect student needs, goals, and associated objectives, and that indicate methods, target student group, and outcome measures by referring to the two attached career development activities (yellow).

( )    ( )    ( )    ( )    ( )
(0)    (1)    (2)    (3)    (4)
Limited Extent              Great Extent
Data Analysis Methods
Two statistical methods were used to assess the two hypotheses. The most straightforward method is a two-factor analysis of variance (ANOVA) in which one factor is time (pretest and posttest) and the other factor is experimental condition (used CPSS
and did not use CPSS). The research hypothesis for the ANOVA
is that a statistical interaction will be observed of the follow-
ing form: average change on rating scores toward a systematic
planning process or better career activities from pretest to
posttest for experimental school staffs will exceed that for
control school staffs. Thus, the F test for interaction in the
two-factor ANOVA will be the one on which to focus. Since the
form of the interaction is predicted in advance, it is equally
important to observe the means for each of the four cells
in the,design. The pattern of these means should conform to the
expectation that experimental schools exhibit greater gain than
control schools.
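As an illustration of this design (not the study's actual computation), the sketch below fits a two-factor model to fabricated school-level ratings on one 0-4 item, treating the pretest and posttest occasions as independent cells for simplicity; the C(condition):C(time) row of the resulting table is the interaction F test described above.

    # Two-factor ANOVA sketch: condition (E/C) x time (pre/post), on
    # fabricated 0-4 ratings for 10 experimental and 8 control schools.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rows = []
    for i in range(10):   # experimental schools: large pre-to-post gain
        rows += [("E", "pre", 1.0 + 0.1 * i), ("E", "post", 3.0 + 0.1 * i)]
    for i in range(8):    # control schools: small gain
        rows += [("C", "pre", 1.2 + 0.1 * i), ("C", "post", 1.5 + 0.1 * i)]
    df = pd.DataFrame(rows, columns=["condition", "time", "rating"])

    model = smf.ols("rating ~ C(condition) * C(time)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))                     # interaction row = key test
    print(df.groupby(["condition", "time"])["rating"].mean())  # the four cell means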
The second statistical method for assessing the two hypotheses is analysis of covariance (ANCOVA). The dependent variables for
the ANCOVA are posttest rating scores describing the planning
process or career activities of each school at the end of the
experiment. The ANCOVAs contain one categorical factor--experi-
mental condition defined by the use or nonuse of CPSS--and one
covariate defined as the pretest rating score corresponding
to the posttest dependent variable. Conceptually, the ANCOVAs
describe differences in posttest rating scores between school
staffs using CPSS and school staffs not using CPSS, under statis-
tical control for the pretest rating score.
Although it does not appear to be widely recognized, the
ANCOVA model can be viewed as a model of change, just as is the
two-factor ANOVA described above. Conceptually, the ANCOVA can
be viewed as expressing the following hypothesis: Change over
the period of the experiment is greater for school staffs using
CPSS than for school staffs not using CPSS, when statistical con-
trols for the effect of the starting point (pretest rating scores)
are applied.
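A matching sketch of the ANCOVA form, again on fabricated school means: the posttest score is regressed on the pretest score (the covariate) plus the condition factor, so the condition coefficient estimates the adjusted E-C difference discussed above.

    # ANCOVA sketch: posttest rating ~ pretest rating + condition, one row
    # per school; all values are fabricated.
    import pandas as pd
    import statsmodels.formula.api as smf

    pre  = [1.0 + 0.1 * i for i in range(10)] + [1.2 + 0.1 * i for i in range(8)]
    post = [3.0 + 0.1 * i for i in range(10)] + [1.5 + 0.1 * i for i in range(8)]
    cond = ["E"] * 10 + ["C"] * 8
    schools = pd.DataFrame({"condition": cond, "pre": pre, "post": post})

    # The C(condition) coefficient is the posttest difference between groups
    # under statistical control for the starting point (the pretest score).
    fit = smf.ols("post ~ pre + C(condition)", data=schools).fit()
    print(fit.params)
    print(fit.pvalues)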
In addition to the analysis of variance and covariance, re-
liability estimates of the rating scores are reported. The unit
of analysis for all statistical results is defined by schools.
A score describing each school on each variable is calculated
by forming the average over the three or four raters who rated
each school. The agreement among raters for a given school
indicates the reliability of the scores, and, conversely,
disagreement among raters indicates unreliability. The discre-
pancies among raters of a given school can be compared to differ-
ences in average ratings across schools. This basic idea forms
the conceptual basis for calculating reliability coefficients
based on an analysis of variance model (see Winer, 1971: 283 ff).
The idea is to compare a mean square within schools to the mean square between schools. Since the object of the design is to minimize pretest differences among schools, these calculations are based on posttest rating scores only. The calculations omit
consideration of "anchor points" (Winer, 1971: 289 ff), thus
yielding somewhat conservative estimates of reliability (see
Figures 3 and 4). The formula used approximates an unbiased es-
timate of reliability, assuming no anchor point differences among
raters (unlike correlational methods such as split half or coef-
ficient alpha).
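The mean-square comparison can be sketched as follows, on fabricated posttest ratings. The final line is a reliability-of-the-mean estimate from a one-way model, in which rater anchor-point differences count as error; this matches the conservative choice described above, though it illustrates the idea rather than reproducing Winer's exact notation.

    # Reliability sketch: compare between-school and within-school (rater)
    # mean squares; rows = schools, columns = raters. Values are fabricated.
    import numpy as np

    ratings = np.array([
        [3.2, 3.0, 3.5, 3.1],
        [1.1, 1.4, 0.9, 1.2],
        [2.6, 2.9, 2.4, 2.7],
        [0.5, 0.8, 0.6, 0.4],
        [3.8, 3.6, 3.9, 3.7],
    ])
    n, k = ratings.shape
    school_means = ratings.mean(axis=1)
    grand_mean = ratings.mean()

    ms_between = k * ((school_means - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((ratings - school_means[:, None]) ** 2).sum() / (n * (k - 1))
    reliability = (ms_between - ms_within) / ms_between  # reliability of the k-rater mean
    print(f"MSB = {ms_between:.3f}  MSW = {ms_within:.3f}  r = {reliability:.3f}")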
Findings
The major results obtained for Hypothesis I are summarized
in Figure 3. Each of the top five panels of the figure summarizes
the results for one element used to define a systematic plan for
career guidance. The sixth panel summarizes judgments of the
overall quality. The panels of the figure are numbered and
labeled to correspond to the key rating items reproduced on pp. 38 & 39.
The lefthand graphs plot subgroup means associated with the
two-factor ANOVA. The line for the schools using CPSS is labeled
E, for experimental group, and the line for schools not using
CPSS is labeled C, for control group. The horizontal axis
represents time, t = 1 for pretest, and t = 2 for posttest. The vertical axis represents the five-point scale (0-4) associated with each item. The slope of the lines thus gives the average
[Figure 3 paired, for each rating item, an ANOVA graph (pretest and posttest means for the E and C lines) with an ANCOVA graph (adjusted posttest means). The six panels covered organization, needs assessment, goals, objectives, activities, and overall quality; every test was marked p < .001. The reliabilities and average confidence ratings printed beside the panels were, in order: .852 (87.0%), .922 (91.9%), .932 (91.5%), .900 (90.7%), .850 (88.0%), and .829 (87.7%).]

FIGURE 3. ANOVAs, ANCOVAs, reliabilities, and average confidence ratings for Hypothesis One.
change from pretest to posttest for schools using CPSS (E) and
for schools not using CPSS (C). Lack of parallelism of these two
lines reflects the statistical interaction postulated for the
ANOVA.
In every case the slope is substantially steeper for experi-
mental schools than for control schools. As indicated on the
graph by the notation p < .001, the chance that the observed differences in slope are due to sampling error is less than one in 1,000. (These probabilities are reported for the interaction term in the ANOVA.)
The righthand column of graphs displays plots of mean dif-
ferences in posttest scores between experimental schools (E)
and control schools (C), as adjusted statistically by the analysis
of covariance for pretest scores on the dependent variable.
Alternatively, as noted above, these graphs may be interpreted
as differences in change from pretest to posttest, adjusted for
differences in starting point. The vertical axis of these graphs
matches those for the ANOVAs. The horizontal axis does not re-
flect a continuous slope. Rather, the lefthand point (labeled C) corresponds to the control group, and the righthand point (labeled E) corresponds to the experimental group. This positioning of E and C is arbitrary, but was selected so that a positive slope indicates support for the first hypothesis: that experimental schools show larger gains, when adjusted for starting point, than do control schools. All six graphs do show a substantial positive slope, thereby lending support to the hypothesis. As
with the ANOVAs, all statistical tests are highly significant,
with probabilities approaching zero. (Reported probabilities
are for the main effect of the experimental variable, after ad-
justment for the covariate).
The reliability coefficient associated with each dependent
measure is high, ranging from .829 to .932, and averaging .881.
Also, the average confidence ratings of panelists are reasonably
high, thus reinforcing the reliability calculations. In spite
of the need for approximate judgments, therefore, it is concluded
that available evidence is consistent with the view that the
measurements are accurate to within tolerable limits.
The results secured for Hypothesis II are summarized in Figure 4. Each of the three panels displays ratings for the three key items related to quality career development activities. The panels of the figure are numbered and labeled to correspond to the rating items reproduced on pp. 39 and 40.
Similar to Figure 3, the lefthand graphs plot subgroup means
associated with the two-factor ANOVA. The line for the schools
using CPSS is labeled E, and the line for schools not using CPSS
is labeled C. The horizontal axis represents time, t = 1 for
pretest, and t = 2 for posttest. The vertical axis represents the four-point scale (0-3) for items 19A, 19B, and 27 and the five-point scale (0-4) for item 32. The slope of the lines
thus gives the average change in the quality of career develop-
ment activities from pretest to posttest for schools using CPSS
(E) and for schools not using CPSS (C). Lack of parallelism for
[Figure 4 paired ANOVA and ANCOVA graphs for items 19A (Activity A), 19B (Activity B), 27 (Skills), and 32 (Activities). ANOVA probabilities: 19A, p < .15; 19B, p < .31; 27, p < .10; 32, p < .001. ANCOVA probabilities: 19A, p < .008*; 19B, p < .095; 27, p < .003*; 32, p < .001*. Reliabilities and average confidence ratings, in item order: .751 (.846), .645 (.823), .754 (.867), and .850 (.880).]

FIGURE 4. ANOVAs, ANCOVAs, reliabilities, and average confidence ratings for Hypothesis Two.
these two lines reflects the statistical interaction postulated
for the ANOVA.
In items 19A, 19B, and 27, the differences between experimental
and control schools were not statistically significant. However,
in each instance, a positive slope is observed, in favor of the experimental schools, thus lending some support to the research
hypothesis. The graph of item 32 indicates a high level of
significance. The probability that the observed differences
were due to chance is less than one in 1000. The positive slope
on this item indicates support for Hypothesis II: that experi-
mental schools may produce higher quality activities than do
control schools.
The righthand column of graphs is also similar to that in Figure 3. This column displays plots of mean differences in
posttest scores between experimental (E) schools and control (C)
schools, as adjusted statistically by the analysis of covariance
(ANCOVA) for pretest scores on the dependent variable. The ver-
tical axis of these graphs matches those for the ANOVA. The hori-
zontal axis does not reflect a continuous scale; rather, the left-
hand point (C) corresponds to the control group and the righthand
point (E) corresponds to the experimental group. Item 19B
displays nonsignificant findings. However, as pointed out in
the footnote on p. 39, item 19B is based on a smaller number of
cases (N=27) than item 19A (N=33). It may be that, due to this difference in N, larger gains would be needed to show
significant results. Items 19A, 27 and 32 all show significance
at the .01 level. These findings support Hypothesis II.
Reliability coefficients for items 19A, 19B, 27, and 32 are also reported in Figure 4. These reliability measurements are within tolerable limits, .75 and above, except for item 19B. Again, this may be due to the smaller N for the item. Average confi-
dence ratings appear to be reasonably high and reinforce the
reliability calculations.
Additional information, reported below, was collected from
experimental schools that reinforces the positive trend reported
in the analyses for Hypothesis II. As reported in the Posttest Data Collection section (p. 36), experimental schools conducted comparison tests
of two career development units (CDUs) developed as a result of
the use of CPSS. In every case, the achievement of the students
who had been taught the CPSS-derived unit was much greater than
for those students who had not been taught the unit. Each ex-
perimental school conducted these comparisons in a different way,
with different content, and on different populations; therefore,
no effort has been made to report "findings." It is simply noted
that differences were observed in tests of comparison between
students who had been taught certain career development units
based on CPSS and those who had not been taught the units.
Knowledge and Attitude Tests
Objectives la and lb
The knowledge and attitude instruments were designed to
assess Objectives la and lb and, ultimately, to produce evidence supportive of Hypothesis I. It is assumed that an increase in CPSS steering committee members' knowledge during one year
(as reported by scores on the Career Planning Support System
Knowledge Test) and no change or a positive change in attitudes
by these members toward systematic planning for career development
(as reported by scores obtained on the Perceptions of Program
Planning for Career Development scale) provide additional evidence
to support the expectation that experimental schools display
greater change toward the demonstration of systematic planning
for career guidance programs than do control schools.
Analysis Of Knowledge Test Scores
The "Career Planning Support System Knowledge Test" was
taken by 49 subjects at the pretest and 47 at the posttest. All
were members of the steering committees (except for the CPSS school coordinators) at the ten experimental school sites. Scores were also obtained from three control schools at posttest for comparison purposes. The knowledge test has acceptable reliability: correlation of split halves gave an estimated reliability coefficient of .75, corrected for length. The small number of test items (21) and the large number of unanswered items (the items were open-ended) probably account for the moderate size of this coefficient.
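The split-half procedure itself is simple; here is a minimal sketch on fabricated item scores (49 examinees, 21 items): correlate the two half-test totals, then apply the Spearman-Brown correction to project the correlation to full test length.

    # Split-half reliability sketch with Spearman-Brown length correction.
    import numpy as np

    rng = np.random.default_rng(0)
    ability = rng.normal(size=49)                                    # latent score per examinee
    items = ability[:, None] + rng.normal(scale=1.5, size=(49, 21))  # 21 noisy item scores

    half_a = items[:, 0::2].sum(axis=1)        # odd-numbered items
    half_b = items[:, 1::2].sum(axis=1)        # even-numbered items
    r_half = np.corrcoef(half_a, half_b)[0, 1]
    r_full = 2 * r_half / (1 + r_half)         # Spearman-Brown, doubled length
    print(f"half-test r = {r_half:.3f}, corrected for length = {r_full:.3f}")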
Three project staff members scored the test. All three
scored the five items that were viewed as most difficult (i.e.,
required more judgment) to score. Each of the remaining items
was assigned to individual raters, i.e., one rater-scored given
items on all response sheets. For the five items scored by all
three raters, the median score was used. Intercorrelations
between pairs of the raters on the five difficult items ranged
from .78 to .84.
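A sketch of that scoring rule, on fabricated scores: for the five judgment-heavy items, the final score is the median over the three scorers, and scorer agreement is checked with pairwise correlations (the study reported .78 to .84; random data will not reproduce that).

    # Median-of-three scoring and pairwise scorer agreement, fabricated data.
    import numpy as np

    rng = np.random.default_rng(1)
    scores = rng.integers(0, 4, size=(49, 5, 3)).astype(float)  # (people, items, raters)

    final = np.median(scores, axis=2)          # median over the three raters
    print(final[0])                            # final scores for the first respondent
    for a, b in [(0, 1), (0, 2), (1, 2)]:      # rater pairs
        r = np.corrcoef(scores[:, :, a].ravel(), scores[:, :, b].ravel())[0, 1]
        print(f"raters {a + 1} and {b + 1}: r = {r:.2f}")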
Descriptive statistics are presented in Table 5 and Figure 5.
As shown in Figure 5, the difference between pre and posttest
scores is significant not only in the statistical sense,
t (matched pairs) = 8.23, df = 9, p < .001, but also in an "edu-
cational" sense by showing that more than one standard deviation
difference exists between means of pre and posttest measures
(i.e., gains averaged over six points on the test from pre to
posttest).
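The matched-pairs test can be reproduced, approximately, from the ten school means listed in Table 5 below; the sketch reads the partly illegible school 16 pretest entry as 10.11, so its t-value comes out near, but not exactly at, the reported 8.23.

    # Matched-pairs t-test over the ten experimental schools (Table 5 means),
    # plus the gain expressed in pretest standard-deviation units.
    import numpy as np
    from scipy import stats

    pre  = np.array([11.15, 11.92, 16.46, 10.57, 9.37, 13.56, 8.28, 8.93, 10.11, 5.76])
    post = np.array([15.09, 20.91, 19.72, 19.67, 16.71, 21.31, 13.65, 21.03, 21.08, 14.46])

    t, p = stats.ttest_rel(post, pre)            # matched pairs, df = 9
    gain_sd = (post.mean() - pre.mean()) / 5.53  # pretest SD of raw scores, per Table 5
    print(f"t = {t:.2f}, p = {p:.5f}")
    print(f"mean gain = {post.mean() - pre.mean():.2f} points ({gain_sd:.2f} SD)")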
These findings support Objective la, which states that
steering committee members' knowledge of CPSS will increase.
Members of the school staff at three control schools also
took the knowledge test at posttest time only. The mean know-
ledge score for control schools was 8.7; the standard deviation,
5.28 (see Table 6). These control school scores are quite
similar to the pretest experimental school scores.
Table 5

Mean Scores on CPSS Knowledge Test of Steering Committee Members in Experimental Schools

School        Mean Score (over people within school)
Code          Pretest        Posttest
01            11.15          15.09
03            11.92          20.91
04            16.46          19.72
06            10.57          19.67
07             9.37          16.71
10            13.56          21.31
12             8.28          13.65
14             8.93          21.03
16            10.11          21.08
17             5.76          14.46

Pretest: mean = 10.26 (over schools); range of means = 5.76 to 16.46; range of raw scores = .5 to 28.7; n = 49; SD = 5.53
Posttest: mean = 18.13 (over schools); range of means = 13.65 to 21.31; range of raw scores = 2.5 to 34.5; n = 47; SD = 6.16
[Figure 5, a frequency polygon of individual pretest and posttest scores on the knowledge test, could not be recovered from the scan beyond its frequency axis (0-9).]

FIGURE 5. Knowledge Test Scores.
Table 6
Control Schools Knowledge Scores
School Code Mean
02 14.8
05 6.3
08 7.4
Grand mean = 8.7
Range of raw scores = 2 to 20
n = 13
SD = 5.28
Analysis of the Attitude Instrument Scores
All experimental school steering committee members (except
the CPSS coordinators) completed the attitude scale both pre
and posttest. Each of the 32 items on the "Perceptions of Program
Planning for Career Development" allowed five possible degrees
of agreement. Responses to these items were coded -2 through
+2 and were summed to give a total score. The possible range
of scores was from -64 to +64. Forty-nine response sheets were
scored and double-checked for accuracy for the pretest; 47 for
the posttest. This attitude scale also has acceptable reliability:
the coefficient of reliability between odd items and even items,
corrected for length, was .90.
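The scoring and the odd-even reliability check look like this in sketch form, on fabricated responses (32 items, five agreement levels recoded -2 through +2, totals bounded by -64 and +64):

    # Attitude-scale scoring sketch with odd-even split-half reliability.
    import numpy as np

    rng = np.random.default_rng(2)
    raw = rng.integers(1, 6, size=(49, 32))    # fabricated responses coded 1..5
    coded = raw - 3                            # recode to -2..+2
    totals = coded.sum(axis=1)                 # possible range: -64 to +64
    print(totals.min(), totals.max())

    odd, even = coded[:, 0::2].sum(axis=1), coded[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    print(f"odd-even r corrected for length = {2 * r / (1 + r):.3f}")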
As shown in Figure 6, the attitude scale produced a broad
range of scores, distributed in a fairly normal manner on both
the pre and posttest.

[Figure 6, a frequency distribution of pretest and posttest scores on the attitude scale, could not be reproduced from the scan.]

FIGURE 6. Attitude Scale Scores.

Table 7 shows the mean scores and other
descriptive statistics for each school on both pretest and
posttest.
A t-test for matched pairs showed a significant difference between pre and posttest attitude scores (t = 2.821, df = 9, p < .05), with the posttest significantly higher. The "educational significance" of this difference is less marked than for the knowledge test; the difference between pre and posttest means is only .4 of one standard deviation (i.e., the gain averaged about four points on the attitude scale from pretest to posttest). These findings support Objective lb, which predicted an increase or no change in positive attitude scores after the CPSS program.
Three control schools also took the attitude scale at
posttest time. The mean score for these was 37.2; the standard
deviation 9.5 (see Table 8). As with the knowledge tests,
these scores are very similar to the pretest experimental school
scores (Table 8).
Table 7

Mean Scores on CPSS Attitude Scale of Steering Committee Members in Experimental Schools

School        Mean Score
Code          Pretest        Posttest
01            43.5           47.8
03            42.0           49.0
04            51.3           44.8
06            33.2           43.3
07            29.4           35.8
10            39.7           46.7
12            28.6           35.7
14            42.2           41.8
16            38.0           39.3
17            38.1           46.0

Pretest: mean = 38.6; range of means = 28.6 to 51.3; range of raw scores = 15 to 61; n = 49; standard deviation of raw scores = 10.9
Posttest: mean = 42.8; range of means = 35.7 to 49.0; range of raw scores = 11 to 59; n = 47; standard deviation of raw scores = 11.1
Table 8
Attitude Scores for Control Schools
School Code Mean
02 36.3
05 32.8
08 42.0
Grand mean = 37.15
Range of raw scores = 26 to 60
n = 13
SD = 9.47
CONCLUSIONS
The Career Planning Support System Assessment Study was
designed to test the effectiveness of CPSS and was guided by two
objectives stated as hypotheses. The first hypothesis stated
that school staffs using CPSS for one year would develop a systematic plan for career guidance. A subset of related objectives addressed experimental school staffs' knowledge of CPSS and their
attitudes toward systematic planning. It was assumed that in-
creased knowledge and favorable attitudes would reinforce school
staffs' use of a systematic plan for career guidance.
The second hypothesis stated that school staffs involved in CPSS for one year would develop a plan that contained activities having a high probability of improving student career development skills.
The findings reported for Objectives One, la, and lb support the associated research hypotheses at the p < .001, p < .001, and p < .05 levels of significance, respectively. These results suggest
that during one year school staffs with school characteristics
similar to the schools and staffs involved in the Assessment
Study can successfully use the CPSS materials to generate a sys-
tematic plan for career guidance as defined by the successful
completion of key planning steps:
1. organization of staff
2. needs assessment
3. goal setting
4. preparation of student behavioral objectives
5. design of career development units
Additional evaluations related to other components of CPSS, i.e., resources assessment, annual program review, and program assessment, were not attempted, either because of difficulties experienced in trying to design valid measurement schemes or because of the time limits prescribed by the study. CPSS requires three school years to install completely; the Assessment Study focused only on the first-year products of the use of CPSS.
The results specified for Objective Two support a trend to-
ward higher quality career development activities in experimental
schools. Of the eight statistical tests of change (4 ANOVAs and 4 ANCOVAs) reported for the associated research hypothesis, four were significant at the p < .01 level and four were not. However, even in the instances where the findings were not significant, the direction of differences
between experimental and control schools favored the research
hypothesis (see Figure 4).
Although the support for Objective Two is not as strong as
that reported for Objectives One, la, and lb, the evidence re-
ported for all Objectives supports the research hypotheses.
The findings of this study were condensed into a Joint
Dissemination Review Panel submission (see Appendix K), a major
product of this assessment effort. The results of the Panel's
evaluation of the submission were not available at the time of
the final report publication.
BIBLIOGRAPHY
Campbell, R.E. "The Application of Systems Methodology to Career Development Programs." In Zalatino, S.D., and Sleeman, P.J., eds., A Systems Approach to Learning Environments. Roselle, New Jersey: Meded Projects, Inc., 1975.

Campbell, R.E., Dworken, E.P., Jackson, D.P., Hoeltzel, K.E., Parsons, G.E., and Lacey, D.W. The Systems Approach: An Emerging Behavioral Model for Career Guidance. Columbus, Ohio: The Center for Vocational and Technical Education, 1971.

Herr, E.L. Guidance and Counseling in the Schools: The Past, Present, and Future. Falls Church, Virginia: The American Personnel and Guidance Association, 1979.

Hosford, R.E., and Ryan, A.T. "Systems Design in the Development of Counseling and Guidance Programs." Personnel and Guidance Journal, 49: 221-230, 1970.

"Guidelines for a Quality Career Guidance Program." AVA Guidance Division Newsletter, 1979.
Winer, B.J. Statistical Principles in Experimental Design. NewYork, New York: McGraw-Hill, Inc., 1971.
APPENDIX A
Dissemination Plan
CAREER PLANNING SUPPORT SYSTEM (CPSS)ASSESSMENT STUDY 1978-79
Dissemination Plan
As specified in the Technical Proposal, p. 36, dissemination of CPSS and
the results of the assessment study is an important outcome of the CPSS project.
The following proposed dissemination plan has been prepared to meet the inten-
tions for dissemination of CPSS presented in the Technical Proposal.
DISSEMINATION PLAN
Goals
1. The CPSS package of materials will be reviewed and revised to correct
minor printing errors and to include new information regarding the assess-
ment study.
2. Information regarding the assessment of CPSS will be presented in journals
and at conferences; and information regarding the validity of the CPSS
materials will be provided to guidance professionals (federal, state,
local guidance staff, counselor educators, vocational educators and advi-
sory groups).
3. Each state department of guidance will receive a complimentary copy of the
CPSS package.
Objectives
1. A. Each printed manual will be reviewed for typographical errors. New information about sites and a new foreword will be prepared. The CPSS
Decision Guide and Brochure will be updated. A new binder for the
manuals and a container for the audio-visual materials will be designed.
2. A. Federal, state and local guidance professionals (5,000) will be invited
to participate in a pre-American Vocational Association (AVA) conven-
tion workshop on CPSS on November 30, 1979 in Anaheim, California.
Project staff and CPSS coordinators from three of the CPSS experimental
schools will present the results of the CPSS assessment study, review
the CPSS procedural system and give advice to participants on securing
funding for implementation of CPSS, including the National Diffusion
Network.
B. Journal articles will be written before the end of the project. Two
types of articles have been drafted, one geared toward the practical
aspects of the use of CPSS, i.e., what student needs were reported
in the study, what factors impeded or enhanced the progress of CPSS
in the school; and two, an article written for a research journal which
describes the methodology for the evaluation of CPSS, e.g., instruments
used, reliability, implications of rating procedures.
C. In addition to the pre-AVA convention workshop described in Objective 2A,
an executive summary of the CPSS Assessment Study will be prepared
and distributed to 5,000 guidance professionals and others throughout
the U.S. by November 30, 1979.
3. A. A complete package of the updated CPSS materials will be mailed to
state guidance directors in all the states and territories by
November 30, 1979.
Methods
The key audience will be those professionals associated with career
guidance at the local, state, regional and federal levels. Others who will
be contacted are vocational education professionals, counselor educators,
and school administrators. Other research and evaluation professionals may
receive notification of project results through the normal dissemination
procedures in operation at the National Center, i.e., the National Center
Clearinghouse, ERIC, and the Research Exchange.
Expected Outcomes
The expected outcomes of the dissemination plan are to report the practical
usefulness of CPSS as a result of the national assessment. Analysis of the
data is expected to provide evidence of the positive impact of CPSS in high
schools. The purposes of the dissemination plan are to: 1) increase educators'
awareness of CPSS and its benefits, 2) report JDRP approval of CPSS (provided
approval is secured), and 3) describe the methods used to assess the extent
of planning for career development in a school setting and the possible effects
of such planning on students.
APPENDIX B
Career Development Program Status Report
CAREER DEVELOPMENT PROGRAM STATUS REPORT
National Center for Research in Vocational Education
1960 Kenny Road
Columbus, Ohio 43210

Introduction

The career development program status report is a data collection instrument. Information will be collected from four distinct categories of data relative to a school's career development program planning:

I. Site Description - community, school district, and school

II. Career Development Program Description - staff organization, state and district goals, and an overview of activities and resources

III. Present Planning Efforts - personnel, assessment, and evaluation activities

IV. Projected Planning Efforts - personnel, assessment, and evaluation activities

Directions

PLEASE TYPE ALL INFORMATION IN THE REPORT

Please complete the report as fully as possible. Some questions request copies of documents, such as goal lists, staff lists, brochures about your school's career development program, audiovisual materials lists, program plans, or lesson plans. Whenever possible include copies of the documents in the appropriate "attachments" section. For example, item no. 4 on page 3 refers to your school's course offerings and calendar. A copy of each should be included in the section marked "attachment 1" because item no. 4 on page 3 is a part of Category I, SITE DESCRIPTION. The same procedure should be followed for any documents requested in the other categories: II, PROGRAM DESCRIPTION; III, PRESENT PLANNING; IV, PROJECTED PLANNING. If there is no information available for a question or section, please write N/A (not applicable) in the space.

A National Center staff member will pick up and review the completed report during the Spring visit to your school. You and the National Center staff member will review the report together to insure completeness.

If you have any questions about the report, please call James Pearsol, Program Associate, (614) 486-3655, ext. 406. Information included in this report will remain strictly confidential.
Name of School
Address
CAREER DEVELOPMENT PROGRAM STATUS REPORT
Please type all information in the report
Name of Principal
Person Completing this Report
Address (if different from above)
Phone Number
Position
Date of Report Completion
National Center for Research in Vocational Education
1960 Kenny Road
Columbus, Ohio 43210
CAREER DEVELOPMENT PROGRAM STATUS REPORT
I. SITE DESCRIPTION
A. Community
1. To the best of your knowledge, identify the primary sources of income for residents of your attendance area by ranking the four most prevalent sources of income from the following list (1=most prevalent, 2=next highest, etc.):
Professional, Technical, Managerial
Clerical and Sales
Service
Farming, Fishery, Forestry and Related Occupations
Processing
Machine Trades
Benchwork
Structural Work
Miscellaneous
2. Please list sources of information which your school has for general use which include descriptions of community resources available for career development activities. This should include such items as local postsecondary institutions, organizations available for job placement assistance, social or cultural service organizations, Federal agencies, business and industry, state and local agencies.
B. School District
1. The total enrollment of your school district is ________ students. This figure represents students in grades ________.

2. There are ________ junior high or middle schools which feed into your high school.

3. Please describe or attach information regarding career education goals for the feeder schools.
C. School
1. Enrollment

                                             Number of Students
                                             Male        Female
   Grade 9 (if appropriate for your school)
   Grade 10
   Grade 11
   Grade 12
   Total

2. Faculty/Staff

                                             Number
   Administrators
   Secretarial/Clerical
   Counselors
   Teachers
   Fulltime Specialists (please list)

3. Part Time Support Personnel

List the support personnel assigned to your school (such as psychologists, social workers, speech therapists, etc.), and the amount of time each spends in your school.

Type          No. of Days in School          Duties

4. Please attach a list of your school's course offerings and a 1978-79 school calendar.
II. CAREER DEVELOPMENT PROGRAM DESCRIPTION
A. Staff Organization
Using the most current information you can find, show (1) the school district's organizational chart and (2) your school's organizational chart, including lines of authority overseeing guidance/career education activities. If it is not clearly shown on the organizational chart, provide a list of primary decision-makers and their positions and a list of committees involved with career guidance, if appropriate.
B. State and/or District Career Education Goals
Attach, or write below, the career education goals for your state or district. Are these prescribed or suggested? If there are none, please write N/A.

C. Overview of Career Development Activities

Please include as many copies of the next page as are necessary to fully, yet briefly, provide an overview of present career development activities in your school. One form should be completed for each major activity (e.g., event, instructional unit, group of activities, or course). These should include both counselor and teacher-initiated activities. The following should help to clarify the type of information that is desired.

1. Name of Activity: Use only if the activity has an official and/or popular designation that most people will recognize. If not, use Identifying Title.

Identifying Title:

2. Goal(s): List the stated goal(s) for the activity, if available, and how it was determined that the activity was appropriate for career development in your school.

3. Objective(s): List the stated objectives for the activity, if available.

4. Activity Leader: List the individual(s) within and, if appropriate, outside the school who has prime responsibility for the activity.

5. Target Student Group:

6. Location: Identify where the activity is being conducted.

7. Schedule: Provide the approximate date(s) or time period(s) during which the activity is conducted and number of times presented.

8. Activity: Briefly describe the activity. If possible, include methods, materials, and the amount of time needed by students to complete the method.

9. Resources: Include the following, if appropriate and available.

People:
Materials:
Equipment:
Space:
Funds:

10. Evaluation: Briefly describe any evidence that indicates that the activity is effective and appropriate for continuation (both student evaluation and instructor evaluation).
NOTE: Make as many copies of this form as needed.

CAREER DEVELOPMENT ACTIVITIES
Please Type

1. Name of Activity:

2. Goal(s):

3. Objective(s):
4. Activity Leader(s):
5. Target Student Group:
6. Location:
7. Schedule:
8. Activity:
9. Resources:
People:
Materials:
Equipment:
Space:
Funds:
10. Evaluation:
D. Overview of School Resources
1. Career Development Print and Audio Visual Materials (include catalog or listing if available). Identify type of material and approximate number.

2. Equipment: Identify types and number of audio visual equipment available for use in your career development program.

3. Space: Identify any specific space that is available for career development activities.

4. Funds: List the funds available for career development activities; include sources, amounts, and any restrictions on their use.

III. PLANNING EFFORTS FOR YOUR SCHOOL'S CAREER DEVELOPMENT PROGRAM INITIATED DURING THE 1978-79 SCHOOL YEAR

NOTE: Indicate N/A for any items which do not apply.

A. Personnel Involvement in Planning

                              Number        Nature and Scope of Involvement
                              Involved      (include amount of time involved and
                                            committee structure, if appropriate)
Administrators
Counselors
Teachers
Students
Parents
Community Representatives
Others (please specify)
B. Assessment and Evaluation Activities

1. Briefly describe or attach copies of any needs evaluation activities or surveys for career development which were conducted in the 1978-79 school year. Be sure to include who was surveyed, methods used, and what the results were.

2. Briefly describe or attach copies of any community and school resource surveys which were conducted in the 1978-79 school year. List types of resources surveyed (i.e., people, equipment, materials, space, funds, external services).

3. Briefly describe any procedures which were used to review or evaluate the 1978-79 career development program at your school.

4. Do you have specific career development program goals established for your high school? Yes  No. If yes, answer the following questions.

a. How were the career development program goals for this school year (1978-79) determined?

c. Were priorities established for the goals? How were goal priorities established?

d. Were objectives for faculty/staff and students written for the goals? Yes  No. What was the procedure for writing objectives?

5. Indicate any change in career development program activities for the 1979-80 school year which may have resulted from a review or evaluation of the 1978-79 career development program.
IV. PROJECTED PLANNING EFFORTS FOR YOUR SCHOOL'S CAREER DEVELOPMENT PROGRAM TO BE INITIATED DURING THE 1979-80 SCHOOL YEAR

NOTE: Indicate N/A for any items which do not apply.

A. Personnel Involvement in Planning

                              Number        Nature and Scope of Involvement
                              Involved      (include amount of time involved and
                                            committee structure, if appropriate)

Administrators

Counselors

Teachers

Students

Parents

Community Representatives

Others (please specify)
B. Assessment and Evaluation Activities
1. Briefly describe any needs evaluation activities for your school's career development program which are planned for the school year (1979-80), including method and extent of evaluation.

2. Briefly describe any community and school career development resource surveys planned for the school year (1979-80) and describe method and extent.

3. What process will be used for determining goals for the school's 1979-80 career development program?

4. Briefly describe any new career development program activities which you hope to develop during the coming year (1979-80) for implementation. How was it determined that these would be appropriate career development activities?

5. What plans, if any, do you have for reviewing or evaluating your career development program at the end of the school year (1979-80)? How will the review be conducted?

V. CLARIFICATION AND COMMENTS

Please use this space to make any clarifying remarks, or to provide additional information about your school's career development program. Please indicate question number if remarks pertain to a specific section of this report. Attach extra page(s) as needed.
APPENDIX C
Verification Checklist
National Center for Research in Vocational Education
1960 Kenny Road
Columbus, Ohio 43210

VERIFICATION CHECKLIST

Name of School

Address                                                    Zip Code

Name of Principal

National Center Staff Member (person completing report)

Signature                         Date

Typed Name

I have reviewed the information in the checklist and verify the accuracy of the information.

Signature                         Date

Typed Name

Position
VERIFICATION CHECKLIST
Directions
The National Center staff member completing the form should base his/her
responses on information gained through documentation available at the site.
Site records and logs should be examined. Copies of documents such as ros-
ters of committee members, schedules, and agendas should be obtained to sup-
plement checklist responses. In addition, some interviewing of key personnel
may be considered.
It may be necessary for the staff member to exercise judgment about the
accomplishment of planning activities at some of the sites. For example, in
making judgments about the needs evaluation, it may be that a general needs
assessment study was conducted for the school or the school district. Whether
or not this study meets the intent of a needs evaluation for career develop-
ment depends upon such considerations as the recency of the study, the de-
gree of its focus upon career development, and the extent to which the study
conformed to accepted needs evaluation procedures.
The checklist provides space for comments to clarify checklist responses.
The staff member should use this space to record information which supports
his/her responses.
COMMITTEES
1. Was a Career Development Program Planning Committee selected?

2. Were selection criteria established for selecting Career Development Program Planning Committee members?

3. Was orientation to career development program planning provided for committee members?

4. Did the committee meet on a regular basis throughout the planning effort?

5. Did the committee coordinate the planning activities?

6. Was an advisory committee organized?

7. Were selection criteria established for selecting members of an advisory committee?

8. Was orientation to career development program planning provided for advisory committee members?

9. Did the advisory committee meet on a regular basis?

10. Were recommendations suggested by the advisory committee incorporated into the program planning whenever feasible and practical?
Comments:
YES NO
IDENTIFICATION OF RESOURCES
1. Has a committee been selected to identify resources?

2. Was orientation to career development planning provided to committee members?

3. Have current career development activities been identified?

4. Have school resources necessary for career development been identified?

5. Have community resources necessary for career development been identified?

6. Has a school and community description been prepared?

7. Have available resources been matched with specific career development activities?
Comments:
YES NO
IDENTIFICATION OF NEEDS
1. Has a committee been selected to identify needs?

2. Was orientation to career development program planning provided for committee members?

3. Has a comprehensive needs evaluation for career development been conducted?
a. Were students surveyed?
b. Were faculty/staff surveyed?
c. Were graduates surveyed?
d. Were parents surveyed?
4. Have needs data been tabulated and analyzed?
Comments:
YES NO
GOAL SELECTION
1. Was a comprehensive set of career development program goals devised as a result of identified needs?

2. Were priorities assigned to goals based upon the identified needs?

3. Were goals selected for implementation?

4. Was a career development program goal review conducted?
Comments:
YES NO
CAREER DEVELOPMENT PROGRAM ACTIVITIES
1. Have potential career development activities been identified?

2. Have instructors been selected?

3. Have instructors been trained/prepared?

4. Have program activities been written?

5. Have program activities been reviewed and approved?
6. Were program activities implemented?
7. Were program activities evaluated?
Comments:
YES NO
CAREER DEVELOPMENT PROGRAM REVIEW
1. Has a general, annual review of the career development program been conducted?

2. Has the extent of implementation of each career development program activity been determined?

3. Have goals not yet implemented been reviewed for potential implementation?

4. Have decisions been made regarding program revision and/or expansion for the ensuing years?
Comments:
YES NO
APPENDIX D
Career Development Program Rating Instrument
and Directions
CAREER DEVELOPMENT PROGRAM RATING INSTRUMENT

Definitions and Directions

Your task is to use a rating instrument to rate a school's career development program through two information references, "The Career Development Program Status Report" and "The Verification Checklist" (defined below). Before you begin the rating tasks, READ THE FOLLOWING DEFINITIONS AND DIRECTIONS.
Definitions
1. Reference - any information source describing a school's career development program which is identified and provided to the rater to answer an item on the rating instrument.

2. S.R. - Career Development Program Status Report - a reference and information source, usually green in color. This document is a report generally prepared by school personnel describing the school career development program.

3. V.C.L. - Verification Checklist - a reference and information source, blue in color. This document is a form prepared jointly by school personnel and project staff describing the school career development program.

4. Career Development Activities - two specific career activities used as sources of information for raters (yellow cover sheet).

5. Additional Career Activities - career activities used to answer the last item on the rating instrument (pink cover sheet).
Directions
1. READ ALL THE INFORMATION PROVIDED FOR THE SCHOOL YOU ARE TO RATE BEFORE YOU USE THE RATING INSTRUMENT TO RATE THE SCHOOL. This will familiarize you with the information available in the references.

2. The rating instrument consists of 33 items and is divided into 7 sections. Each section has directions which apply to the items in that section. When answering an item, place an "X" or "/" in the brackets [answer spaces] that in your judgment provide the best answer(s) to the rating items.

3. (a) References are listed in the rating instrument either at the beginning of each section in the directions or following each item stem. The references listed are the only sources of information to be used in rating an item. If a reference reads S.R. p. 11-B, 1, it refers to the following:

S.R. = Career Development Program Status Report - green
p. 11 = page eleven
B, 1 = section B, number 1

If a reference reads VCL p. 3-1, it means:

VCL = Verification Checklist - blue
p. 3 = page three
1 = number 1

(b) READ EACH REFERENCE THOROUGHLY BEFORE ANSWERING THE RATING ITEM. Next to each reference will be a number or phrase (either "1", "2", or "no priority"). The reference numbered "1" should be given primary significance. For example, given [Reference: 1. SR p. 13-4, a; 2. VCL p. 4-1], Status Report page 13, number 4a, should be considered most accurate if there is conflicting information. If there is no conflicting information, use your best judgment based on the information referenced.

In the instance when both references are of equal value, "no priority" is listed. This means that the rater should use all the information available to make the rating, with no special significance attached to either reference.

(c) On the reference sources listed above, the Status Report (SR) and the Verification Checklist (VCL), the rater will often find "attachments" listed. The rater should always consider the attachment as part of the reference when answering an item. For example, SR p. 5 might include as part of a response, "See Attachment I." The rater should then turn to Attachment I, read it, and consider it part of SR p. 5 when answering the item.

(d) In Sections IV and VII of the rating instrument, other references are listed either in place of or in addition to the Status Report (SR) and the Verification Checklist (VCL). In Section IV, the rater will use only the two attached career development activities (yellow cover sheet). In Section VII, the rater will use all the references provided. (The directions for each section or item will alert the rater to special reference considerations.)

4. Confidence Rating. After each item you will find a confidence rating. This rating is to be used to report the rater's confidence that the answer marked is correct. It reads as follows:

CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                     complete confidence
5. Goals and Objectives. When rating Sections II and III, whenever a reference uses the word "goal", consider it a goal. Whenever a reference uses the word "objective", consider it an objective, even though you may suspect that the person completing the form did not know the difference.

6. The information listed in the references is the best information available to answer each rating item. Please use the information to make your best estimate even when you are uncertain, then use the confidence rating to express your uncertainty.
School
Rater
CAREER DEVELOPMENT PROGRAM RATING INSTRUMENT

I. Directions: When answering items 1-4, consider the extent to which student needs for career development skills were identified through a needs assessment.

1a. Was an assessment of student career development needs conducted? [References: 1. SR p. 11; 2. VCL p. 3-3].

[ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO SECTION II, p. 3.

[ ] Yes (1)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                     complete confidence

1b. If yes, determine whether the needs assessment was: [References: 1. SR p. VCL
[ ] conducted only in the specific school (1)
[ ] conducted in a larger unit, such as a school district (0)
CONFIDENCE RATING:
0 10 20 30 40 50 60 70 80 90 100%no complete
confidence confidence
2. Was tIi nee(k ame!;,,nent de,;(ned to amess htiidh neeC:, for career de-velonnent skill!; or to deteredne sone other intornation, such as, a neeLkassos,.;ment lit couto;elon ro1cb n career devciopHont"!
[References: 1. SR p. 11; 2. VCL p. 3-3].
[ ] Student centered (1)
[ ] Non-student centered (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
3. Were the results of the needs assessment tabulated?
[References: 1. ___].
[ ] Yes (1)
[ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO SECTION II, p. 3.
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
4. Determine the range of respondents to the needs assessment by checking below each of the groups that completed a questionnaire for the needs assessment. [References: 1. VCL p. 2-2; 2. SR p. 11].
[ ] a. Students
[ ] b. Faculty
[ ] c. Recent graduates
[ ] d. Parents
[ ] e. Other
[ ] f. None
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
II. Directions: When answering items 5-7, consider the extent to which goals were established for improving student career development skills.
5. Are explicit goals for student career development reported? [References: (no priority) SR p. 5 and SR p. 13].
[ ] Yes (1)
[ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO SECTION III, p. 4.
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
-3-
6. Were these career development goals formulated from the results of a needs assessment? [References: 1. SR pp. 13 and 14-c; 2. VCL p. 4-1].
[ ] Yes (1)
[ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO SECTION III, p. 4.
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
7. Were the reported goals for student career development organized in order of priority based upon identified needs?
[References: (no priority) SR p. 14-c and VCL p. 4-2].
[ ] Yes (1)
[ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO SECTION III, p. 4.
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
III. Directions: When answering items 8-10, consider the extent to which objectives were written for student career development skills.
8. Are explicit objectives for student career development reported? [References: (no priority) SR p. 5 and SR p. 14-d].
[ ] Yes (1)
[ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO SECTION IV, p. 6.
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence

-4-
9. Use the following references for items 9a-d. [References: (no priority) SR p. 5 and SR p. 14-d].
9a. Do most of the objectives state who are the key actors (e.g., students, faculty)?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
9b. Do most of the objectives state what behavior is to be demonstrated?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
9c. Do most of the objectives state what are the conditions under which the behavior is to be demonstrated?

[ ] Yes (1)

[ ] No (0)
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
-5--
9d. Do most of the objectives state what degree of success is required to achieve the objective?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
10. Were the reported objectives written for, and designed to achieve, specific student career development goals? REVIEW ITEMS 5-9. [References: (no priority) SR p. 13, SR p. 5, and SR p. 14-d].
[ ] Always (3)
[ ] Most of the time (2)
[ ] Occasionally (1)
[ ] No (0)
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
IV. Directions: Rate the attached career development activities (yellow only) on the criteria listed in items 11-20. If you have two career development activities to rate, rate the first activity using column "A" and rate the second activity using column "B". If only one activity is attached, use column "A". If you have no activities to rate, check the box below.

[ ] No career development activities attached  SKIP TO SECTION V, p. ___.
11. Is there a goal listed for the career development activity?
A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO ITEM 13, p. 8.
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
12. Is the goal listed for the career development activity reported on SR pp. 5, 13, or 14?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING:
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
-7--
13. Is an objective (or objectives) listed for the career development activity?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO ITEM 15, p. 10.
CONFIDENCE RATING: Please estimate the probability that your answer iscorrect by placing a check at the appropriate location on the scale.
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
14. Answer the following questions (14a-d) for activity A in the column marked A
and for activity B in the column marked B.
14a. Do most of the objectives listed in the activity state who are the key actors (e.g., students, faculty)?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING:
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
14b. Do most of the objectives listed in the activity state what behavior is to be demonstrated?
A B
[ ] [ ] Yes (1)
[ ] [ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer iscorrect by placing a check at the appropriate location on the scale.
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
14c. Do most of the objectives listed in the activity state what are the conditions under which the behavior is to be demonstrated?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING:
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
14d. Do most of the objectives listed in the activity state what degree of success is required to achieve the objectives?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
15. Was a target student group reported for the activity? (e.g., the 284 members of the junior class, or the 77 sophomores in Biology 100)
A B
[ ] [ ] Yes (1)
[ ] [ ] No (0)
CONFIDENCE RATING:
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
-10-
16. Were explicit methods for instruction reported for the career development activity?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
17. Is there reported evidence that the effects of the career development activity are determined by means of a student outcomes measure?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING:
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
18. Based on your experience, what is the chance that the methods described for each of the activities will achieve the specific objectives listed for the activity? IF YOU MARKED "NO" TO EITHER ITEM 13 OR ITEM 16, MARK "NOT APPLICABLE" AND SKIP TO ITEM 19.

              Good      Some      Little    Very Little    Not
              Chance    Chance    Chance    Chance         Applicable
Activity A:   [ ]       [ ]       [ ]       [ ]            [ ]
Activity B:   [ ]       [ ]       [ ]       [ ]            [ ]
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
19. What is your best estimate of the chance that each of the career development activities will improve student career development skills, irrespective of whether the career development skill is listed as an objective of the activity?

              Good      Some      Little    Very Little
              Chance    Chance    Chance    Chance
Activity A:   [ ]       [ ]       [ ]       [ ]
Activity B:   [ ]       [ ]       [ ]       [ ]
CONFIDENCE RATING:
A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
-12-
20. Based on your experiences with the career development needs of high school students, does each activity meet a student career development need?

A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
21. Is each activity constructed to permit easy use by faculty and staff?
A B
[ ] [ ] Yes (1)
[ ] [ ] No (0)
CONFIDENCE RATING:

A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
22. Is each activity constructed to encourage student acceptance?
A B

[ ] [ ] Yes (1)

[ ] [ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

A:  0    10    20    30    40    50    60    70    80    90    100%
B:  0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
V. Directions: Items 23-26 refer to staff organization for the career development program. Rate the items as presented.
23. Was a committee selected to plan a program for student career development? [References: 1. VCL p. 1-1; 2. SR p. 10].
[ ] Yes (1)
[ ] No (0)  IF NO, CHECK THE CONFIDENCE RATING AND SKIP TO ITEM 26, p. 15.
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
24. Did the committee meet regularly during the planning period? [References: 1. VCL p. 1-4; 2. SR p. 10].
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
-14-
25. Did the committee coordinate planning activities?
[References: 1. ___; 2. SR p. 10].
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
26. When rating items 26a-g, refer back to items 23-25. [References: (no priority) VCL p. 1-4, numbers 1 and 5, and SR p. 10].

26a. Were school administrators involved in planning the career development program?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING:

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
If "Yes", rate the extent of involvement:
[ ] High (3)    [ ] Moderate (2)    [ ] Low (1)    [ ] Negligible (0)    [ ] Cannot Determine (9)
-15-
26b. Were school counselors involved in planning the career development program?

[ ] Yes (1)

[ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
If "Yes", rate the extent of involvement:
[ ] High (3)    [ ] Moderate (2)    [ ] Low (1)    [ ] Negligible (0)    [ ] Cannot Determine (9)
26c. Were school teachers involved in planning the career development program?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
If "Yes" rate the extent of involvement:
[ ] High (3)    [ ] Moderate (2)    [ ] Low (1)    [ ] Negligible (0)    [ ] Cannot Determine (9)
-16-
26d. Were school students involved in planning the career development program?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
If "Yes", rate the extent of involvement:
[ ] High (3)    [ ] Moderate (2)    [ ] Low (1)    [ ] Negligible (0)    [ ] Cannot Determine (9)
26e. Were community members involved in planning the career development program?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
If "Yes", rate the extent of involvement:
[ ] High (3)    [ ] Moderate (2)    [ ] Low (1)    [ ] Negligible (0)    [ ] Cannot Determine (9)
-17-
26f. Were parents involved in planning the career development program?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING: Please estimate the probability that your answer iscorrect by placing a check at the appropriate location on the scale.
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
If "Yes", rate the extent of involvement:
[ ] High (3)    [ ] Moderate (2)    [ ] Low (1)    [ ] Negligible (0)    [ ] Cannot Determine (9)
26g. Were others involved in planning the career development program?
[ ] Yes (1)
[ ] No (0)
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
If "Yes", rate the extent of involvement:
[ ] High (3)    [ ] Moderate (2)    [ ] Low (1)    [ ] Negligible (0)    [ ] Cannot Determine (9)
-18-
VI. Directions: Item 27 is a probability estimate of the impact of future plans for the career development program in the school. Rate the item as presented.
27. If the plans reported on SR pp. 16-20 are carried out, what, in your experience, is the chance that student career development skills will be improved?

[ ] Good Chance    [ ] Some Chance    [ ] Little Chance    [ ] Very Little Chance
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
VII. Directions: After completing the previous items, you probably will have formed judgments about the overall career development program of the school. Use SR pp. 5 and 10-20, VCL pp. 1, 3, and 4, and all previous items to rate the following items except item 32. Item 32 requires you to use only the yellow career development activities. Place a check in the response space that best represents your judgment on each item.
28. Estimate the extent to which the school staff was organized to plan systematically a comprehensive career development program, by evidence of clearly designated leadership; administrative cooperation; and permanent, active groups and committees.

[ ] (0)    [ ] (1)    [ ] (2)    [ ] (3)    [ ] (4)
Limited Extent                              Great Extent
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
29. Estimate the extent to which a student career development needs assessment was conducted, tabulated, properly interpreted, and the data utilized for planning the career development program.

[ ] (0)    [ ] (1)    [ ] (2)    [ ] (3)    [ ] (4)
Limited Extent                              Great Extent
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
30. Estimate the extent to which a comprehensive set of ordered career development goals reflecting assessed student career development needs were developed and used in planning, implementation, and evaluation of the program.

[ ] (0)    [ ] (1)    [ ] (2)    [ ] (3)    [ ] (4)
Limited Extent                              Great Extent
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
31. Estimate the extent to which a set of behavioral objectives was developed reflecting specific goals and containing a clear statement of the intended audience, behavior, situation, and standard of mastery.

[ ] (0)    [ ] (1)    [ ] (2)    [ ] (3)    [ ] (4)
Limited Extent                              Great Extent
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence

-20-
32. Estimate the extent to which career development activities were developed that reflect student needs, goals, and associated objectives, and that indicate methods, target student group, and outcome measures, by referring to the two attached career development activities (yellow).

[ ] (0)    [ ] (1)    [ ] (2)    [ ] (3)    [ ] (4)
Limited Extent                              Great Extent
CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
33. Based on the available information (including all yellow and pink career development activities), rate the overall quality of the school's career development program.

[ ] (0)    [ ] (1)    [ ] (2)    [ ] (3)    [ ] (4)
Very Low Quality                            Very High Quality
CONFIDENCE RATING:
0    10    20    30    40    50    60    70    80    90    100%
no confidence                                          complete confidence
APPENDIX E
Career Planning Support System Knowledge Test
Test 2
Career Planning Support SystemKnowledge Test
forSteering Committee Members
Instructions: Now that you have taken an attitude survey, we would like
to assess your technical knowledge of the activities involved in planning
for career development.
This test can only measure the approximate extent of your know-
ledge due to differences in your educational experiences and interests.
Do not write your name on the test. All tests will be sealed in
an envelope immediately after the test and mailed to Ohio State University.
Your individual anonymity is guaranteed.
Steering Committee Knowledge Test
1. Planning a career development program should take into account the available resources, which include people, programs, materials, equipment, and funds. State why it is important to assess each resource with one sentence.
a. People
b. Programs
c. Materials
d. Equipment
e. Funds
2. Performance objectives are a key ingredient in career development instructional units. What is a performance objective?

3. Define "resource accounting" in the sense the term is used in career development program planning.

4. When should continuously used career development units be evaluated?

5. Mention five ways in which students can be involved in planning a career development program.
a.
b.
c.
d.
e.
6. One method of achieving a career development program goal is to develop an instructional unit that aids students in attaining the goal. List the components of a comprehensive career development instructional unit.
7. A school's career development program should be reviewed annually. State five criteria that should be used in this evaluation.
8. Which groups of people should be involved in evaluating the successof a career development activity?
9. What function does the statement of the performance objectives servein the construction of a career development activity?
10. Conducting a needs assessment will yield information that is useful in developing a career development program. Describe the needs assessment procedures which you think would be appropriate for your school. Mention who should be assessed, when the assessment should occur, what should be assessed, and how results might be used.
11. List five criteria on which goals for career development programs might be arranged in priority order.
a.
b.
c.
d.
e.
12. After goals have been established for the career development program, they should be arranged in a priority order. State four reasons why emphasizing goals according to priority is more desirable than emphasizing all goals equally.
a.
b.
c.
d.
13. A school which assesses community resources should coordinate its assessment with that being done by other schools. Describe three benefits this might provide.
a.
b.
c.
14. Although instructional units for career development are written primarily for students, teachers find them to be very useful. Cite two ways in which teachers might find the units to be valuable.
a.
b.
15. Which school staff members should implement a career development unit?
16. List three criteria that are used for selecting career development goals
a.
b.
c.
17. Surveying graduates can yield information useful in planning a school's career development program. Which four kinds of information do you think are most important to collect from the graduates (and those who didn't graduate) and how could each be useful?
a.
b.
c.
d.
18. Explain the advantages or disadvantages of assessing the needs of students by studying a representative sample of students from all grades, compared to studying a specific group of students such as sophomores, potential dropouts, students taking biology, etc.
19. How often in a six-year period should a needs assessment survey of students be conducted? Give three arguments in support of your answer.

1.

2.

3.
20. List and describe four uses for the results of an annual review of a school's career development program.

1.

2.

3.

4.
21. A. In your opinion, who should participate in writing the behavioral objectives for a career development unit?

B. State at least two reasons for each person listed above.
22. Give three examples of how resource assessment results might be used in planning a career development program and putting it into effect.
23. What four criteria would you recommend to assess the effectiveness of a career development unit?
a.
b.
c.
d.
24. State five goals that are typically set for a school's career development program.
a.
b.
c.
d.
e.
25. State four bases for identifying goals for a school's career development program.
a.
b.
c.
d.
26. Newly prepared career development units might contain imperfections which should be corrected before the unit is made available for routine use. In your opinion, name the categories of people who should participate in the preliminary evaluation of such a unit.
27. Please respond briefly to A and B:

A. For each of the program phases randomly listed below, describe in each column who should be involved, the order in which each should occur, and the intended purpose.
Program Phases
faculty/staff implementation
needs assessment
evaluation of the effectivenessof each career development unit
program review
setting goals and performanceobjectives
resource accounting
writing career developmentunits
Who          When          Purpose
B. For each of the program phases randomly listed below, describe in each column a potential problem and practical remedy/advice you would give.
Program Phases          Problem          Remedy/Advice
faculty/staff implementation
needs assessment
evaluation of the effectivenessof each career development unit
program review
setting goals and performanceobjectives
resource accounting
writing career developmentunits
APPENDIX F
Perceptions of Program Planning for Career Development
The Ohio State University  1960 Kenny Road  Columbus, Ohio 43210
Tel: (614) 486-3655  Cable: CTVOCEDOSU/Columbus, Ohio
Dear Steering Committee Member:
We are pleased to have your participation in the CPSS assessment
project. Your involvement with CPSS demonstrates your commitment to the
development of career planning in your school. In addition, your contri-
bution is a vital part of a nationwide research effort.
The following is a two-part test. The first section assesses your
perceptions of planning for career development and the second assesses
your technical knowledge of the activities involved in planning for
career development.
You will have 45 minutes to complete these tests. Because these
are timed tests, please work quickly. If you do not know an answer,
simply go on to the next question. You may look back over any unanswered
items after you have completed all the questions.
Individual scores will not be used or reported. Your test will be
sealed in an envelope when you are finished to insure that your answers
remain anonymous. We sincerely thank you for your cooperation.
Sincerely,
Donald C. Findlay
Project Director
CPSS Assessment Study
Test I
PERCEPTIONS OF PROGRAM PLANNING FOR CAREER DEVELOPMENT
A. Directions: Based on your perceptions of planning for career programs,
will you please indicate your level of agreement with each question.
There are no "right" or "wrong" answers on this first evaluation.
Please use this scale to express your attitudes toward statements 1
through 17.

DS = Disagree Strongly    D = Disagree    N = Neutral    A = Agree    AS = Agree Strongly

For instance, if you strongly agree with a statement, you would circle
AS (agree strongly).
PERCEPTIONS OF PROGRAM PLANNING FOR CAREER DEVELOPMENT
1. Plans for career development programs
are best made by teachers, counselors, and
career specialists, working separately
from each other.
2. Information which could be supplied by
graduates has little use for career program
planners because of its datedness.
3. Knowing parent attitudes about career develop-
ment is desirable when planning local
programs.
4. Schools need to focus more effort on
planning for career development before
putting the program into effect.
DS D N A AS
DS D N A AS
DS D N A AS
DS D N A AS
5. The career development needs of students DS D N A AS
should be met entirely by counselors.
6. Industry, labor, and the professions should
work hand-in-hand with schools in planning
career development programs.
DS D N A AS
7. Business, industry, and labor naturally
induct students into careers; schools
need not be concerned about it.
8. Career development is a highly specialized
and critical part of the total school curriculum
and should be given appropriate priority,
staffing, and reserves.
9. In-school career development activities
should be conducted independently of
the community. Otherwise, career
aspirations of students will be unduly
influenced.
10. Since the career development of youth is
a universal problem, no special local
information about local student status is
necessary when planning individual school
programs.
11. Schools spend so much time planning for career
development that it hinders implementing
the program.
DS D N A AS
DS D N A AS
DS D N A AS
DS D N A AS
DS D N A AS
12. "Subject matter teachers" should
cooperatively participate in planning,
developing, and implementing career
development activities.
13. Developing units of instruction for career
development requires too many special skills
to be done by the average teacher.
14. The goals for a career development program
should be based largely on the identified
needs of students.
15. It is sufficient to review a school's
career development program once every
five years.
16. Career development program goals should
be arranged in priority to improve chances
of achieving them.
17. Limiting career development activities to
community opportunities is a good idea
because otherwise students would have
unrealistic aspirations.
DS D N A AS
DS D N A AS
DS D N A AS
DS D N A AS
DS D N A AS
DS D N A AS
B. Directions: Please use the following key to complete items 18 to 32.
VU = Very Unlikely    U = Unlikely    N = Neutral    L = Likely    VL = Very Likely
If comprehensive, systematic procedures for planning the career development
program were instituted at your school, how likely is it that such planning
procedures would:
18. result in better relationships among
faculty/staff and students?
VU U N L VL
19. result in more efficient use of resources VU U N L VL
for career development?
20. improve present career development planning VU U N L VL
procedures?
21. have little potential for directing students VU U N L VL
toward worthwhile career goals?
22. result in better student planning for career VU U N L VL
development?
23. better meet student needs? VU U N L VL
24. result in more relevant career
development program objectives?
VU U N L VL
25. increase faculty/staff support for a career VU U N L VL
development program?
26. require more work than can be handled by VU U N L VL
the existing faculty/staff?
27. identify goals for a career development VU U N L VL
program?
28. make me perceive career development VU U N L VL
more positively?
29. identify student needs which were not VU U N L VL
apparent before?
30. result in an annual career development VU U N L VL
program review?
31. result in narrower participation in the
implementation of the career development
program?
32. result in biased or stereotyped perceptions
of career development?
VU U N L VL
VU U N L VL
APPENDIX G
Panel of Expert Judges
CPSS ASSESSMENT STUDY REVIEW PANEL

Dr. Henry Borow
University of Minnesota
106 Nicholson Hall
Minneapolis, MN 55455

Dr. John Crites
10897 Deborah Dr.
Potomac, MD 20854

Dr. William Erpenbach
Wisconsin Dept. of Public Instruction
126 Langdon St.
Madison, WI 53702

Dr. Larry Gulick
1578 Chippewa Ct.
Grove City, OH 43123

Dr. David Jesser
1872 South Pierson Ct.
Lakewood, CO 80226

Ms. Elaine Jones
Ft. Steilacoom Community College
9401 Farwest Dr.
Tacoma, WA 98498

Dr. G. Brian Jones
American Institutes for Research
1791 Arastradero Rd.
P.O. Box 1113
Palo Alto, CA 94302

Dr. Robert Lathrop
Florida State University
Tallahassee, FL 32306

Ms. Nancy Losekamp
1950 N. Mallway
Columbus, OH 43221

Dr. Juliet Miller
Project Director
Rehab. Research Institute
School of Education
University of Michigan
Ann Arbor, MI 48109

Dr. Clayton Omvig
Dept. of Vocational Education
Room 15, Dickey Hall
University of Kentucky
Lexington, KY 40506

Dr. Samuel H. Osipow
575 Enfield
Columbus, OH 43209

Dr. Alice Scales
4H01 Forbes Quadrangle
University of Pittsburgh
Pittsburgh, PA 15260

Dr. Bruce Shertzer
Education Building
Purdue University
W. Lafayette, IN 47907

Ms. Jessie Teddlie
1952 Yucca Trail
Hurst, TX
APPENDIX H
Original Target Cities
and Cover Letter
ORIGINAL TARGET CITIES
ARIZONA: Phoenix, Tucson
CALIFORNIA: San Diego, San Jose
COLORADO: Denver
DISTRICT OF COLUMBIA: Washington
FLORIDA: Miami
GEORGIA: Atlanta
ILLINOIS: Chicago
INDIANA: Indianapolis
KENTUCKY: Louisville
LOUISIANA: New Orleans
MARYLAND: Baltimore
MICHIGAN: Detroit
MINNESOTA: Minneapolis
MISSOURI: Kansas City, St. Louis
NEBRASKA: Omaha
NEW JERSEY: Newark
NEW YORK: Brooklyn, Buffalo
NORTH CAROLINA: Charlotte
OHIO: Cincinnati, Toledo
OKLAHOMA: Oklahoma City, Tulsa
OREGON: Portland
PENNSYLVANIA: Philadelphia, Pittsburgh
TENNESSEE: Memphis
TEXAS: Dallas, Houston
UTAH: Salt Lake City
WASHINGTON: Seattle
WISCONSIN: Milwaukee
THE NATIONAL CENTER
FOR RESEARCH IN VOCATIONAL EDUCATION
The Ohio State University  1960 Kenny Road  Columbus, Ohio 43210
Tel: (614) 486-3655  Cable: CTVOCEDOSU/Columbus, Ohio
July 21, 1978
Dr. M. D. Thomas
Superintendent of Schools
Salt Lake City School District
440 East First South
Salt Lake City, Utah 84111
Dear Dr. Thomas:
We at the National Center for Research in Vocational Education are interested in identifying a set of high schools to participate in a controlled assessment study during the 1978-79 school year. The product to be tested is the Career Planning Support System (CPSS), a systems approach to upgrading career guidance services. CPSS is a comprehensive career guidance development tool that emphasizes program planning, implementation, and evaluation. A further description of CPSS will be found in the CPSS Decision Guide, copies of which are enclosed. We urge that it be read very carefully by interested high school principals. The assessment study design we are using calls for ten schools to use CPSS and ten to carry on with services as planned for the 1978-79 school year. Essentially, we are substantiating that schools using CPSS have improved planning for career guidance programs significantly more than those who do not use CPSS. The objectives for the assessment of CPSS are noted in Attachment 1.
It is important that the schools using CPSS and those not using it be fairly comparable. Two common characteristics are (1) that all schools participating in the study must be motivated and committed to improving comprehensive career guidance services to students, and (2) that the schools must be comparable on criteria such as stability of school staff, size of the guidance staff, number of students who enter college, drop-out rates, and socioeconomic status. Once a set of schools with these attributes is identified, 20 schools will be assigned randomly to user and non-user groups, ten schools in each. To protect the process of the study, user and non-user schools should not be in communication with each other regarding the study. For now, we need many schools volunteering who are motivated and committed to improving comprehensive career guidance services for students, and who meet the criteria for comparability. This is volunteering with the understanding that, if selected, only one school from your district could be assigned randomly to a user group and one to a non-user group. Enclosed are copies of a School Information Checklist which should be distributed to and completed by those high schools (three or more) in your district which may wish to participate in the assessment study. The completion and return of the School Information Checklist by a school's executive will be considered an application to participate in the study. Each applicant will be notified in writing of the outcome of the final selection.
To defray costs of participation in this study, monies will be set aside for each of the 20 schools involved. Those actually using CPSS will have a budget planned for approximately $5,000, while those in the control group will have a budget planned for approximately $500. A written agreement will be executed between each school and the National Center.
Page Two
The schools selected to use the product will be expected to commit a person to lead the implementation of CPSS and act as a liaison person with the National Center. This person will average at least a half-time schedule with CPSS, plus a training session at the National Center. Other school personnel will need to participate in the study but not as much time will be involved. User schools also will supply pre- and post-test data through data collection instruments provided by the National Center. A graphic description of a user school's role in this study is in Attachment 2, Design Structure for the Assessment of CPSS.
Similarly, those schools not using CPSS must also have a liaison person who, essentially, will supply information to National Center staff. This person will likely spend the equivalent of 10-15 days during the school year on data gathering and reporting. No time other than survey response (pre- and post-test data) will be needed from other staff.
We invite you to participate in the selection of schools once you have ascertained full commitment to such a study. To answer any questions you may have about the study, a member of the project staff will telephone within the next ten days. Otherwise, direct any comments or questions to James Pearsol, Program Associate, (614) 486-3655, x354. The deadline for applications is August 21, 1978.
We shall appreciate your thoughtful consideration and look forward to further communication with you regarding the proposed assessment study.
Sincerely,
Robert E. Taylor
Executive Director

RET/yy
Enclosures
APPENDIX I
School Information Checklist
THE NATIONAL CENTER
FOR RESEARCH IN VOCATIONAL EDUCATION
Dear Principal:
The Ohio State University  1960 Kenny Road  Columbus, Ohio 43210
Tel: (614) 486-3655  Cable: CTVOCEDOSU/Columbus, Ohio
July 20, 1978
We at the National Center for Research in Vocational Education are interested in identifying a set of high schools to participate in a controlled assessment study during the 1978-79 school year. The product to be tested is the Career Planning Support System (CPSS), a systems approach to upgrading career guidance services. CPSS is a comprehensive career guidance development tool that emphasizes program planning, implementation, and evaluation. A further description of CPSS will be found in the CPSS Decision Guide which accompanies this letter.
To identify those schools which might participate in the study, we have asked superintendents in several large cities throughout the United States to recommend those high schools in their districts which might be motivated to improve career programs and which are comparable among several variables such as size and socioeconomic status. Once interested schools have been identified, ten schools will be randomly assigned to a CPSS user group and ten will be randomly assigned to a non-user group (a control group) for the study.
The attached School Information Checklist is a means whereby we may select schools to participate in the study. If you would like to participate, please return the Checklist in the preaddressed envelope provided.
We shall appreciate your completion of the attached Checklist and look forward to your participation in the site selection process. To answer any questions you may have, a member of the project staff will telephone your office within the next week.
jel
Enclosures
Sincerely,
Dr. Robert E. Taylor
Executive Director
CPSS Assessment Study
School Information Checklist
Directions
We are interested in gathering information about your
school for the Career Planning Support System (CPSS) assessment
study. The attached checklist is a simple method to collect
certain information about your school. Most, if not all, of
the information requested may be obtained from school records,
for example, average daily attendance. When completing the
form, please approximate on those items where you lack suffi-
cient information.
Please complete items one through twenty-one (1-21) as
presented. If any of these items (1-21) require clarification
or are not applicable, please refer to item 22. Please include
any comments in item 22. Thank you for your assistance.
THE NATIONAL CENTER
FOR RESEARCH IN VOCATIONAL EDUCATION
The Ohio State University  1960 Kenny Road  Columbus, Ohio 43210
Tel: (614) 486-3655  Cable: CTVOCEDOSU/Columbus, Ohio
CPSS Assessment Study
School Information Checklist
1. High School:
2. Principal:
Zip Code
Area Code Phone Number
3. Number of years in position:
4. The average daily attendance during the 1977-78 school year:

5. Has the average daily attendance increased or decreased by more than ten percent during the past two years?

Yes  No.  If yes, please specify reasons:

6. The number of faculty/staff in the high school during the 1977-78 school year:
teachers counselors administrators
7. What are the average years of total experience for faculty/staff?
8. The racial/ethnic composition of the high school's students during the 1977-78 school year:
a. percentage of Black students
b. percentage of Native American students
c. percentage of students with Asian ancestry
d. percentage of students with Spanish surnames
e. percentage of other non-White students
f. percentage of White students
9. Percentage of students in various program (curriculum) areas:

College Prep ____    General ____    Vocational ____    Special ____    Other ____ (Specify "other")
10. What is the dropout rate for your high school?
11. Percentage of students who graduated in June 1977 andattended college:
12. Turnover rate among faculty at the school:
% per year
13. Is the school considered a leader in the adoption of educational innovations? (Typical indications that a school is a leader are that it has received state or federal funds for more than one project during the last three years or that the school has participated in field tests of product development.)

Yes  No.  If yes, briefly describe the educational innovations that have been adopted during the past three years.
14. How many feeder schools does your school have?
15. What is the mean test score(s) on the SAT or ACT for juniors and seniors in your school? If available, use the 1977-78 score reports. If not, use the 1976-77 score reports.

76-77 ___  77-78 ___  SAT  V ___  M ___  (Please refer to the SAT summary report for your school and check the appropriate year of the report.)

76-77 ___  77-78 ___  ACT Composite ___  (Please refer to the ACT High School Profile Report for your school and check the appropriate year of the report.)
16. What is the total population of the attendance area for your school?

17. What is the estimated average income within your attendance area? (check one)

$ 5,000 - 10,000
$10,000 - 15,000
$15,000 - 20,000
$20,000 - 25,000
$25,000 - above
18. What grade levels does your school have? 9-12
or 10-12
19. Does your school district or state have a comprehensive career guidance program model?
Yes No
20. If yes, is your school required to use the career guidance program model?
Yes No
21. Please list three unique aspects of your school:
22. Items you may wish to clarify or comments:
APPENDIX J
Work Statement
WORK STATEMENT FROM A PURCHASE OF SERVICE AGREEMENTWITH A COOPERATING SCHOOL DISTRICT
The contents of this work statement include the scope of services
provided by The National Center for Research in Vocational Education
and the Jefferson County Public Schools for the Assessment of the
Career Planning Support System. The effective dates for this proposed effort
are September 15, 1978 to June 30, 1979. Amendments to this agreement may be
effected through mutual consent in writing by all the signatories affected by
the changes.
1.0.0. The National Center for Research in Vocational Education will pro-
vide the materials, resources, and services listed in 1.1.0. through
1.6.0.
1.1.0. Two complete sets of the CPSS materials (Coordinator's Handbooks, eight guides, four audiovisual presentations, and one package of camera-ready masters). One set will be used by the experimental school for the purpose of the assessment study. The other set will be delivered to the control school once the assessment study is completed.
1.2.0. Three days of pre-service preparation at the National Center for Research in Vocational Education for the experimental school coordinator to familiarize him/her with the CPSS product and to enable him/her to accomplish the evaluation tasks associated with the assessment study.
1.3.0. Technical assistance to the experimental school coordinator as needed, as requested, or as determined by the National Center staff with the identification and resolution of problems associated with the assessment study and the use of the CPSS product. This assistance from the National Center staff will be provided to the site via telephone, mail, or travel to the site, depending on which method is deemed most appropriate by the National Center's staff.
1.4.0. National Center staff to conduct on-site monitoring and progress assessments and knowledge and attitude posttests.
1.5.0. Sufficient copies of status report forms and knowledge and attitude
tests.
1.6.0. Cost-reimbursable effort is not to exceed $5,500. Five thousand dollars will be set aside for the experimental school and $500 will be set aside for the control school (refer to budget, Attachment A). The payment schedule will be in two installments. The first payment date will be January 15, 1979, in an amount for which invoices have been received. The second payment date will be June 30, 1979, for the remaining invoices.
2.0.0. The Jefferson County Public Schools will provide the materials, resources, and services listed in 2.1.0. through 2.5.2., and will conform to the provisions incorporated in HEW Standard Form 315 (incorporated by reference only).
2.1.0. Monthly, one-page reports due on the first of each month (beginning with December 1, 1978) to include invoices and descriptions of expenses pertaining to the CPSS Assessment Study.
2.2.0. One central administration staff member (not a staff member at either the experimental or control school) to administer the knowledge and attitude pretests to the CPSS Steering Committee at the experimental school.
2.3.0. Experimental School
2.3.1. One permanent staff member released one-half time to coordinate CPSS in the school and to provide evaluative data for the assessment study. It is understood that this will be a regular assignment and not extra duty.

2.3.2. Resources and services to include:
a. Printing (offset, etc.). The total number of pages should not exceed 8.5 times the total number of students in the school (see the worked example after this list).
b. Duplication services (electrostatic, etc.). The total number of pages should not exceed two hundred (200).
c. Office supplies (excluding paper for printing and duplication). Supplies should include two reams of letterhead, 1,400 business-size envelopes imprinted with the name of the school, and postage, if the questionnaires are mailed.
d. Equipment. A filmstrip projector and a cassette tape recorder.
e. Telephone Service. Long distance telephone calls to the National Center as needed.
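To make the printing ceiling in 2.3.2(a) concrete, here is a worked example; the 2,000-student enrollment is hypothetical and not a figure from the agreement:

\[ 8.5 \times 2{,}000 = 17{,}000 \ \text{printed pages at most.} \]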
2.3.3. Time of professional staff, in addition to the school coordinator, as follows:

a. Five to seven faculty/staff, who will serve on the CPSS Steering Committee an average of one and one-half hours per week during the academic year, and who will complete 1 1/2 hours of pre- and posttests.

b. No more than 16-20 professional staff members who will serve on the Needs and Resources Assessment Task Forces at an average of two hours per week for an average of 8-10 weeks each.
c. Other professional staff as needed to implement CPSS in the school.
2.3.4. Primary outputs, to include demographic data about school and community, documentation of actual and potential career development program resources, statements of program goals, statements of behavioral objectives, proposals for career development activities, and reports of evaluations of career development activities. One copy of each output and any changes to that output will be provided to the National Center staff on specified deliverable dates.
2.3.5. Secondary outputs to include a report of career development program planning, pre- and posttest data, and project logs.
2.3.6. The school may publish and make available information concerning its participation in the assessment study once the assessment study is completed. A copy of each article submitted by the school shall be sent to the National Center after its publication or presentation.
2.4.0. Control School
2.4.1. One permanent staff member released for the equivalent of 10 to 15 days during the school year to gather data and complete reports.
2.4.2. Postage and long distance telephone calls to the National Center asneeded.
2.4.3. The outputs shall be a report of career development program planning, both pre and posttest.
2.5.0. Deliverables
2.5.1. Experimental and Control Schools
A. Career Development Program Status Report

1. Due Dates: 9/25/78 - 11/15/78. One Status Report to be completed by school personnel and delivered to Center project staff during Fall on-site visit at the school (a two-day visit to be arranged between 9/25/78 and 11/15/78 -- two weeks notice will be provided).

2. Due Dates: 5/20/79 - 6/30/79. One Status Report to be completed by school personnel and delivered to Center project staff during Spring on-site visit at the school (a two-day visit to be arranged between 5/20/79 and 6/30/79 -- two weeks notice will be provided).
2.5.2 Experimental Schools
A. CPSS Coordinator Training - Three days training provided by the National Center staff for the CPSS Coordinator, to familiarize him/her with the CPSS product and project deliverables (dates to be arranged prior to utilization of CPSS in the school).
B. Knowledge and Attitude Assessments
1. Due Date: 12/15/78 - to be administered by non-school personnel to the Steering Committee faculty/staff at the first committee meeting.
2. Due Dates: 5/20/79 - 6/30/79 - to be administered to the Steering Committee faculty/staff by Center project staff during Spring on-site visit at the school (a two-day visit to be arranged between 5/20/79 and 6/30/79 -- two weeks notice will be provided).
C. Results of Needs Assessment - Due Date: 3/1/79 - Tables 40, 41, 42, 43, 44, 45, 46, 47, 48 (see page 25, Coordinator's Handbook, "Results").
D. Results of Resource Assessment - Due Date: 3/1/79 - Tables 34, 35, 36, 37, 38 and 39 (see page 6, Coordinator's Handbook).
E. Goals Data - Due Date: 3/31/79 - Tables 50, 51, 52 and 53
(see page 35, Coordinator's Handbook, "Results").
F. Career Development Unit (CDU) Data - Due Dates: 5/20/79 - 6/30/79 - Copies of at least two CDUs developed and evaluated as a result of CPSS involvement. Form No. 21 and Tables 54 and 55 (see pages 52 and 57, Coordinator's Handbook). Copies to be delivered to Center project staff during Spring on-site visit at the school (a two-day visit to be arranged between 5/20/79 and 6/30/79 -- two weeks notice will be provided).
G. Project Log - Due Dates: Due every two weeks beginning November 15, 1978 and ending June 15, 1979. The logs are brief progress reports and are mailed directly to National Center staff.
Attachment A
ASSESSMENT OF THE CAREER PLANNING SUPPORT SYSTEM
Proposed Budget
September 15, 1978 - June 30, 1979
The following budget depicts the budget areas, in accordance with allowable costs, within which the Jefferson County Public Schools will request reimbursement.
I. Personnel - (This area refers to costs incurred for the time of permanent professional staff participating in the CPSS Assessment Study.) Items listed in this section may include costs for stipends, extra duty, substitute teachers, fringe benefits, and partial salaries.
Item                              Estimated Budget
a. Stipends and extra pay         $5,273.00
b.
c.
d.

Description: Stipends for steering committee members, needs assessment, and coordination.
II. Supplies - (This area refers to costs incurred through the purchase of supplies and telephone service for the CPSS Assessment Study, see 2.3.2. and 2.4.2. of agreement).

Item              Estimated Budget
a. Supplies       $47.00
b. Telephone      $50.00
c. Stamps         $130.00
d.
TOTAL             $227.00

Description:
NOTE: Items III, IV and V require specific requests for permission and prior approval before funds will be allocated.
III. Equipment - (This area refers to costs incurred through the purchase of equipment (direct and indirect costs) for participation in the CPSS Assessment Study).

Item              Estimated Budget
a.
b.
c.
d.

Description:
IV. Travel - (This area refers to travel costs, if any, incurred by school staff participating in the CPSS Assessment Study).

Item              Estimated Budget
a.
b.
c.
d.

Description:
V. Other Costs - (This area refers to other costs incurred by the school or school staff as a direct result of participation in the CPSS Assessment Study, i.e., expenses for a retreat for the CPSS Steering Committee).

Item              Estimated Budget
a.
b.
c.
d.

Description:
ASSESSMENT OF THE CAREER PLANNING
SUPPORT SYSTEM
Jefferson County Public Schools(School District)
Van Hoose Education Center(Address)
3332 Newburg Road
Louisville, Kentucky 40218
By: [signature]    Date: 12-1-78
(Signature)
E.C. Grayson(Typed Name)
Superintendent of Schools(Title)
APPENDIX K
Joint Dissemination Review Panel Report
Copy Submitted to NIE
PROGRAM AREA: Career Development Program Planning
I. TITLE OF PRODUCT: The Career Planning Support System (CPSS)
II. DEVELOPER: The National Center for Research in Vocational Education
III. FUNDING: National Institute of Education. Testing: $340,373
IV. DEVELOPMENT AND TESTING PERIOD: Development: 1971-1976; Testing: 1978-1979
V. BRIEF DESCRIPTION OF CPSS:
Among the priorities identified by the career education movement of the seventies were 1) a need to blend student career development into the mainstream of educational practice, and 2) a need to meet increased accountability demands in the delivery of instructional and counseling services in public schools. To meet these two needs, recent research activities have emphasized the importance of systems methodology in properly planning, implementing, and evaluating career development programs (Campbell, 1975; Campbell et al., 1971; Hosford and Ryan, 1970).
Mitchell and Gysbers (1979) reported that an emerging direction for career development and guidance in schools is the guidance system comprised of a series of interrelated planning, design, implementation, and evaluation components. Herr (1979) recommended that guidance at the local school level be based on student needs and planned as a total program with goals, objectives, activities, and student outcomes. A National Vocational Guidance Association Position Paper on Criteria for Career Guidance Programs (1979) stated, "in order to achieve lasting effectiveness, it is important that (career development) program planners follow a comprehensive student needs-based and evaluation-oriented approach to program development."
In response to the need for systematic program planning for student career development programs, the National Center for Research in Vocational Education developed and tested CPSS from 1971 to 1973. A two-year (1974-76) field test of CPSS resulted in important revisions of the materials. Thirty-eight individual high schools, ranging from rural schools of less than 100 to large urban and suburban schools of more than 2,000 students, participated in the field testing. This submission is based on data from an assessment study conducted in 1978 and 1979. The 1978-79 assessment of CPSS involved eighteen high schools in seven states. The purpose of the assessment study was to test the effectiveness of the CPSS materials as a high school career development program support system.
CPSS consists of handbooks, reproducible forms, and filmstrips that describe a comprehensive organizational framework and procedural steps a school staff can use to create an accountable, school-wide high school career development program. The following list describes the complete set of CPSS materials:
The Coordinator's Training Guide is a self-instructional training guide for the part-time CPSS coordinator.
The Coordinator's Handbook contains instructions that describe step-by-step procedures for managing and implementing CPSS in the high school.
Camera-Ready Forms are reproducible copies of each form needed for the questionnaires, instructions, CPSS Program Information File, etc.
Handbooks
The Advisory Committee Handbook defines the responsibilities and dutiesof Advisory Committee members (Five copies).
Assessing Resources guides a resource leader in directing a task force to
collect information on and account for the use of resources in the school
and community.
Assessing Needs: Surveying provides instruction for preparing, adminis-tering, and collecting survey questionnaires for students, graduates,
parents, and faculty/staff (five copies).
Assessing Needs: Tabulation contains instruction on manually tabulating data collected by questionnaires (five copies).
Analyzing Methods directs a methods specialist about the availability and application of guidance methods and how to integrate this knowledge during the construction and review of career development units.
The Manual for Writing Behavioral Objectives is a self-instructional
resource for a behavioral objectives specialist.
Writing Behavioral Objectives informs the behavioral objectives specialist
about the function of behavioral objectives in the construction of career
development units.
Producing Career Development Units (CDUs) provides direction for develop-
ing career guidance/development activities.
Filmstrip/Audio Tape Presentations include:
AV-1: "An Orientation to CPSS" -- orients interested persons to CPSS.
AV-2: "Shaping Program Goals" -- gives an overview of how the needs and resources assessments lead to goals for a school.
AV-3: "Behavioral Objectives" -- accompanies the behavioral objectives manual.

AV-4: "Producing CDUs" -- gives an overview of the career development unit process.
Claims of effectiveness. CPSS is intended as a set of tools to assist with institutional changes in planning for career development programs in high schools. It is assumed that the school staff using CPSS is motivated to plan for the school's career development program. The main claim of this submission is stated below.
Use of the CPSS materials for one academic year enables a high school staff to implement a systematic planning process for student career development programs.
For the purposes of this submission "systematic planning process" includes the following elements:

    Establishment of an organizational structure facilitating a career development program, to include clearly designated leadership and permanent active committees and work groups.

    Assessment of the career development needs of local students and use of the results of the needs assessment in the career development program.

    Creation of explicit career development goals reflecting assessed student career development needs, listed in order of importance.

    Creation of student behavioral objectives designed to implement the goals.

    Creation of student activities to achieve the objectives and goals.
Career development in the CPSS perspective is defined as the process by which an individual student acquires the basic, non-technical skills needed for functioning in the world of work. A career development program is a sequence of activities designed to help foster student career development.
Intended users of CPSS. High school personnel and students cooperate in the use of CPSS.
Costs to schools. Table 1 shows cost estimates for using CPSS during the first year and subsequent years. The figures could be converted to costs per learner by dividing by the number of student users, but this ratio does not seem like a useful statistic, since CPSS is designed to affect the institution directly, and the main claim of this submission refers to institutional change, not learner change. Because costs may vary among schools, a range is entered in the table.
TABLE 1. COST ESTIMATES PER SCHOOL (IN DOLLARS)

                          First Year              Subsequent Years
                          (Nonrecurring Costs)    (Recurring Costs)
    Personnel             2900--7250              2175--2900
    Staff training        0                       0
    Special facilities    0                       0
    Equipment             0                       0
    Consumables           123                     61
    Other costs           260                     60
    TOTAL COSTS           3283--7633              2296--3021

VI. EVIDENCE OF EFFECTIVENESS
Design of the field test. Data supporting the claim for effectiveness were gathered, using a pre-post, experimental-control group design, on 18 high schools. The high schools were located in Arizona, Maryland, Illinois, Kentucky, Tennessee, Florida, and Colorado. Table 2 displays descriptive statistics for the test sites. Ten of the 18 participating schools used CPSS for one academic year, and the remaining eight did not. In this document CPSS users frequently are referenced as experimental schools and nonusers are termed control schools. Measurements on all variables related to the main claim were taken before and after the school year in which experimental schools used CPSS.
TABLE 2. CHARACTERISTICS OF TEST SITES

    Average of Characteristic                    Control     Experimental    t-value
                                                 Schools     Schools
    Size of student population                   1916        1943            .074
    Ratio of faculty & staff to student pop.     19.49       17.64           1.540
    ACT/SAT scores*                              15.67       16.84           .748
    Estimated family income                      $12,000     $13,125         .607
    Drop-out rate                                7.4%        10.0%           1.375
    Percent white                                38%         51%             .814

NOTE: Table entries are averages over the control or experimental schools, as labeled. Experimental school refers to a school that used CPSS during the study, and control school refers to a school that did not use CPSS.

*Five schools made SAT scores available, and the remaining 13 submitted ACT averages. The five SAT scores were converted to the metric of ACT by dividing them by the ratio of the average SAT over schools to the average ACT.
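As an illustration of this conversion, the following minimal sketch in Python applies the rule described in the footnote: divide each SAT score by the ratio of the mean SAT to the mean ACT. All score values here are hypothetical; only the conversion rule comes from this report.

    # SAT-to-ACT rescaling as described in the footnote to Table 2.
    sat_scores = [950.0, 1010.0, 880.0, 990.0, 1040.0]        # hypothetical SAT school means
    act_scores = [15.2, 16.8, 14.9, 17.1, 16.0, 15.5, 16.2,
                  14.7, 17.4, 15.9, 16.6, 15.1, 16.3]         # hypothetical ACT school means

    # Ratio of the average SAT to the average ACT over schools.
    ratio = (sum(sat_scores) / len(sat_scores)) / (sum(act_scores) / len(act_scores))

    # Each converted score is now expressed on the ACT metric.
    converted = [round(s / ratio, 2) for s in sat_scores]
    print(converted)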
Each school provided a part-time coordinator who was responsible for the preparation and completion of data collection forms and who served as the on-site contact person with the National Center staff. In the experimental schools this contact person also served as the CPSS coordinator. The experimental school coordinators received three days of training in CPSS procedures in November 1978. The training was conducted at the National Center by project staff. Training normally is not necessary for use of CPSS; it was provided in this instance to help accelerate the normal process of creating a career planning system, in order to complete the study within the specified time period.
Both experimental and control schools were monitored by monthly telephone calls and one site visit in February 1979. This was in addition to pretest and posttest site visits to all schools in November or December of 1978 and May or June of 1979. The telephone calls and site visits included very little technical assistance. Experimental school coordinators were requested to complete project logs twice a month, describing the progress of CPSS in the school.
Participating schools volunteered in response to a national publicity campaign. The original intent was to assign participating schools at random to experimental and control conditions, but due to the insistence of local school administrators, random assignment occurred in only four instances. In the remaining cases, local school officials made the determination. Experimental schools were paid 5,000 dollars to defray expenses, mostly to pay for personnel time. Control schools were paid 500 dollars and given a set of CPSS materials at the end of the study.
Self-selection of schools into the study at first appears to threaten the external validity of the results but, on reflection, probably poses no such threat. Users of CPSS certainly will all be self-selected; therefore, the sample is drawn from the universe of probable users. Inability to control assignment of schools to experimental and control conditions poses some threat to the internal validity of the design. The pre-post nature of the design, the equivalence of the experimental and control schools on key variables (see Table 2), and the magnitude of the gains for experimental schools suggest, however, that the results are not likely due solely to nonrandom assignment. The main threat to the internal validity of the study is the interaction between selection into the experimental group and "maturation" (i.e., changes that would occur without the treatment, but only in experimental schools) (Campbell and Stanley, 1966). While interaction between "maturation" and selection cannot be entirely ruled out as a contributing factor in experimental school gains, the gains reported below, all over one standard deviation, are too large to be reasonably attributed solely to that interaction.
Measurement. Two data collection forms, the Career Development Program Status Report and the Verification Checklist, and one rating instrument, the Career Development Program Rating Instrument, were developed and used for the study. The Status Report and Verification Checklist were used to collect information from the field sites. The information was then reviewed and rated by a fifteen-member review panel with acknowledged expertise in career development related areas. The review panel members individually answered questions on the Rating Instrument by referring to information collected on the Status Report and Verification Checklist for each school. All analyses reported in this submission were based on data drawn from the Rating Instrument.
The Career Development Program Status Report and the Verification Checklist were developed by project staff. A review of the forms by external consultants indicated that the forms provide information related to the elements of a systematic career development program and have content validity.
The Status Report was completed by school personnel in all schools, who documented the extent to which their existing career development program planning reflected the basic components of systematic career development program planning. These data were collected before experimental school coordinators were trained. The completed Status Report was reviewed on-site by project staff, and missing data were obtained. Examples of the type of information collected through the Status Report include data about career-education goals, assessment and evaluation related to career education, and student career-development activities. Career development activities include, but are not limited to, curriculum units, visits to local businesses, and career days.
The Verification Checklist provided a means by which project staff could corroborate, clarify, and expand the information recorded on a school's Career Development Program Status Report. During the pretest and posttest site visits, a National Center staff member completed the checklist with school personnel assistance, and both persons signed the completed form indicating agreement on the accuracy of the information. Examples of information gathered on the Verification Checklist include data about career-education needs, career-education goals, and committee organization related to the career education of students.
The Rating Instrument was developed by project staff with the assistance of an external instrument design specialist. Two factors basic to the design of the rating instrument were: (1) inclusion of items that were clearly answerable given the descriptive information that was being rated, and (2) exclusion of items that did not allow control schools a fair opportunity to receive a high rating.
The Rating Instrument is divided conceptually into two major parts. Part One asks questions concerning specific facts describing the school's career development program planning. Detailed questions are asked about the conduct of needs assessment, goal formation, objective writing, student activities, and organizational structure. Part Two contains six summary questions asking raters to form broad judgments concerning each of the five elements of a systematic planning process for career development listed earlier in this submission. The sixth question in Part Two requests a judgment regarding the overall quality of the career development program. Part One of the Rating Instrument, thus, is designed to familiarize thoroughly each expert rater with the facts, in preparation for the judgments requested in Part Two.
A group of fifteen eminent persons in fields related to career development research and practice was assembled at the National Center to assist with interpretation of the information collected from the field sites. Panelists completed two twenty-one-page rating instruments for each participating high school. The first completion provided a description of all schools at the beginning of the 1978-79 school year, and the second completion described the career development program in all schools at the end of the school year. During the year the experimental schools used CPSS materials and the control schools did not. It should be noted that all identifying information, e.g., state, city, school, name, address, and dates, had been removed from the data sources prior to the ratings.
At least three panelists were assigned at random to rate each experimental and control school. Assigning more than one rater to each school permits numerical assessment of the reliability of the ratings and yields more accurate results than could be obtained from a single rating per school. Pretest and posttest ratings for each school were done by the same group of panelists. Panelists were given no information about the nature of the design prior to the rating session. In particular, experimental and control schools and the pre-post feature of the design were not identified to panelists. In a final debriefing session, after all rating activities had been completed, the panelists were told that they had participated in an assessment study of the Career Planning Support System. They were given copies of CPSS materials and a study abstract, and informed of all aspects of the study. The panelists indicated that they had neither surmised the nature of the study nor recognized that they had rated pretest and posttest data from the same schools.
The main reason for use of a panel of judges is related to the nature of the subject matter. Few people would doubt that efficient organization and planning comprise important aspects of high school career development programs. Yet the important features of efficient organization and planning remain uncodified in sufficient detail to permit completely objective measurement. In such instances, human judgments are essential. Hence, a panel of individuals was assembled with the experience, training, and reputation to provide the most accurate judgments available.
Because of their importance to the presentation, the six questions addressedby the panelists are reproduced verbatim below.
1. Estimate the extent to which the school staff was organized to plan systematically a comprehensive career development program by evidence of clearly designated leadership; administrative cooperation; and permanent, active groups and committees.

2. Estimate the extent to which a student career development needs assessment was conducted, tabulated, properly interpreted, and the data utilized for planning the career development program.

3. Estimate the extent to which a comprehensive set of ordered career development goals reflecting assessed student career development needs were developed and used in planning, implementation and evaluation of the program.

4. Estimate the extent to which a set of behavioral objectives was developed reflecting specific goals and containing a clear statement of the intended audience, behavior, situation and standard of mastery.

5. Estimate the extent to which career development activities were developed that reflect student needs, goals, and associated objectives, and that indicate methods, target student group and outcome measures by referring to the two attached career development activities.

6. Based on the available information (including all career development activities), rate the overall quality of the school's career development program.
To answer these questions, raters referred to all information on the Status Report and Verification Checklist from each school. Thus, raters had at their disposal data regarding schools' student career development needs and goals, career development activities designed for use with students, and organization of career-development program planning. Ratings for the first five of these items were recorded on a five-point scale ranging from "limited extent" (scored 0) to "great extent" (scored 4). Ratings on the overall quality were also recorded on a five-point scale ranging from zero to four, but the two extreme points were labeled "very low quality" and "very high quality."
The unit of analysis for all statistical results is the school. A score describing each school on each variable was calculated by forming the average over the three or four raters who rated each school. Agreement among raters for a given school, thus, indicates the reliability of the scores, and, conversely, disagreement among raters indicates unreliability. The discrepancies among raters of a given school can be compared to differences in average ratings across schools. This basic idea forms the conceptual basis for calculating reliability coefficients based on an analysis of variance model (see Winer, 1971: 283 ff). The idea is to compare a mean-square within schools to the mean-square between schools. Since the object of the design is to minimize pretest differences among schools, these calculations are based on posttest scores only. This procedure is quite analogous to calculation of reliability coefficients from student scores on a test following a curriculum unit, because a "floor" effect artificially deflates reliability calculations derived from pretest scores. The point is that there is very little variance between schools on the pretest; all schools score low. The calculations omit consideration of "anchor points" (Winer, 1971: 289 ff), thus yielding somewhat conservative estimates of reliability. The formula used approximates an unbiased estimate of reliability, assuming no anchor point differences among raters (unlike correlational methods such as split half or coefficient alpha, which are biased downward).
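To make the calculation concrete, the following is a minimal sketch in Python of the ANOVA-based reliability estimate described above, comparing the between-schools mean square to the within-schools mean square. The ratings array is hypothetical, not the study's data; the formula shown is the standard intraclass correlation for a k-rater average, without the anchor-point adjustment.

    import numpy as np

    # Hypothetical posttest ratings: one row per school, one column per rater.
    ratings = np.array([
        [3.0, 3.5, 3.0],
        [1.0, 1.5, 1.0],
        [2.5, 3.0, 2.5],
        [0.5, 1.0, 1.5],
        [3.5, 4.0, 3.5],
    ])
    n_schools, k_raters = ratings.shape

    school_means = ratings.mean(axis=1)
    grand_mean = ratings.mean()

    # Between-schools and within-schools mean squares from a one-way ANOVA.
    ms_between = k_raters * ((school_means - grand_mean) ** 2).sum() / (n_schools - 1)
    ms_within = ((ratings - school_means[:, None]) ** 2).sum() / (n_schools * (k_raters - 1))

    # Reliability of the k-rater average score for each school.
    reliability = (ms_between - ms_within) / ms_between
    print(round(reliability, 3))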
Reliability of these items is uniformly quite high. The numerical values rangefrom .829 to .932, and average .881 (see Figure 1).
FIGURE 1. ANCOVAs, RELIABILITY, AND CONFIDENCE RATINGS FOR SIX SUMMARY MEASURES. (Each panel plots adjusted posttest means for control (C) and experimental (E) schools on the 0-4 rating scale.)

    Panel 1, Organization:    p < .001; reliability = .852; average confidence rating = 87.0
    Panel 2, Needs:           p < .001; reliability = .922; average confidence rating = 91.9
    Panel 3, Goals:           p < .001; reliability = .932; average confidence rating = 91.5
    Panel 4, Objectives:      p < .001; reliability = .900; average confidence rating = 90.7
    Panel 5, Activities:      p < .001; reliability = .850; average confidence rating = 87.9
    Panel 6, Overall Quality: p < .001; reliability = .829; average confidence rating = 87.7
In addition to reliability coefficients based on agreement among different raters of the same schools, panelists were asked to estimate their confidence in each rating they made. The confidence rating was the same for each question. Raters were asked to place a check along a scale from zero to 100 indicating their judgments regarding the likelihood that their answers were accurate. The format of the confidence rating is reproduced below.

CONFIDENCE RATING: Please estimate the probability that your answer is correct by placing a check at the appropriate location on the scale.

    0    10    20    30    40    50    60    70    80    90    100%
    no confidence                                complete confidence

The average confidence ratings of panelists are quite high, ranging from 87.0 to 91.9 percent, thus reinforcing the reliability calculations. In spite of the need for approximate judgments, therefore, it is concluded that the available evidence is consistent with the view that the measurements are accurate to within tolerable limits.
Data analysis methods. The major hypothesis in this study is that school staffs using the Career Planning Support System will produce greater change toward implementing a systematic planning process for career development programming than will school staffs not using CPSS. The statistical method for assessing this hypothesis is analysis of covariance (ANCOVA). The dependent variables for the ANCOVA are posttest scores describing the planning process of each school at the end of the experiment. There are two independent variables including one categorical factor--experimental condition defined by the use or nonuse of CPSS--and one covariate defined as the pretest score corresponding to the posttest dependent variable. Conceptually, the ANCOVAs describe differences in posttest scores between schools using CPSS and schools not using CPSS, under statistical control for the pretest score.
Although it does not appear to be widely recognized, the ANCOVA model can be viewed as a model of change. Conceptually, the ANCOVA can be viewed as expressing the following hypothesis: change over the period of the experiment is greater for schools using CPSS than for schools not using CPSS, when statistical controls for the effect of the starting point (pretest scores) are applied.
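As an illustration only, the following minimal sketch shows how such an ANCOVA can be expressed as a regression of posttest scores on a treatment indicator plus the pretest covariate. The data values and column names are hypothetical, not the study's data, and the statsmodels library is assumed to be available.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical school-level data: one row per school.
    df = pd.DataFrame({
        "posttest":  [3.2, 3.5, 1.1, 3.0, 0.9, 1.4, 3.3, 1.2],
        "pretest":   [0.8, 1.0, 0.7, 0.9, 0.6, 0.8, 1.1, 0.7],
        "used_cpss": [1,   1,   0,   1,   0,   0,   1,   0],
    })

    # Posttest regressed on the treatment indicator with the pretest as a
    # covariate; the used_cpss coefficient is the adjusted treatment effect.
    model = smf.ols("posttest ~ used_cpss + pretest", data=df).fit()
    print(model.params["used_cpss"], model.pvalues["used_cpss"])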
Results. The major results of the study are summarized in Figure 1. Each of the first five panels of the figure summarizes the results for one element used to define a systematic career planning process, and the sixth panel summarizes judgments of the overall quality. The panels of the figure are numbered and labeled to correspond to the questions reproduced above.

The graphs display plots of mean differences in posttest scores between experimental schools (E) and control schools (C), as adjusted statistically by the analysis of covariance for pretest scores on the dependent variable. Alternatively, as noted above, these graphs may be interpreted as differences in change from pretest to posttest, adjusted for differences in starting point. The vertical axis of these graphs represents scores on the six items. The horizontal axis does not reflect a continuous scale. Rather, the lefthand point (labeled C) corresponds to the control group, and the righthand point (labeled E) corresponds to the experimental group.
This positioning of E and C is arbitrary, but was selected so that a positive slope indicates support for the main hypothesis: that experimental schools show larger gains, when adjusted for starting point, than do control schools. All six graphs do show a substantial positive slope, thereby lending support to the hypothesis. All statistical tests are highly significant, with probabilities less than .001. (Reported probabilities are for the main effect of the experimental variable, after adjustment for the covariate.)
Whenever random assignment to treatment groups cannot be realized, observed differences between treatment groups, in theory, can be due to nontreatment variables. The standard methodology for handling objections of this sort is to introduce some type of statistical control for a small group of variables that are likely candidates to account for observed differences between treatment groups. In the present case, the sample size is small enough to render such procedures of dubious value.
One may observe, however, bivariate relationships between selected "control" variables and the treatment variable. In the present study the treatment variable is defined by two categories--used CPSS and did not use CPSS. Averages on the following variables were compared statistically for users and nonusers of CPSS: student population size, ratio of faculty and staff to students, academic test scores, drop-out rate, percentage of the student body who were minority group members, and a rough estimate of family income of the student body. As shown in Table 2, in none of these tests were statistically significant differences observed. Hence, it is concluded that the differences between users and nonusers of CPSS on the six criterion variables are not due to any of these characteristics of schools.
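By way of illustration, a comparison of this kind reduces to a two-sample t-test on each school characteristic. The sketch below uses hypothetical enrollment figures, not the study's data, and assumes the scipy library is available.

    from scipy import stats

    # Hypothetical student population sizes for the two groups of schools.
    control = [1800, 2100, 1650, 2050, 1900, 1700, 2200, 1930]
    experimental = [1950, 2000, 1850, 2150, 1800, 2100, 1900, 1880, 2050, 1750]

    # Two-sample t-test; a nonsignificant result indicates no evidence that
    # the groups differ on this characteristic.
    t_stat, p_value = stats.ttest_ind(control, experimental)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")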
Educational importance. Two factors relate to the educational importance of the results. First, are the gains of sufficient magnitude to be nontrivial? Second, does CPSS address an important educational need?
Widespread application of standard score units renders them useful because of implicit standards, based on long experience, regarding the magnitude of change required to indicate substantive importance. Table 3, therefore, presents the results shown in Figure 1 in standard score units. The calculations were carried out with the mean and standard deviation of each variable calculated over pretest and posttest and over experimental and control groups. One might prefer using the pretest means and standard deviations because these values more accurately reflect the general population of schools, the vast majority of which have not used CPSS. Reliance on the overall mean and standard deviation shows the results in a conservative light, however, since the pretest standard deviation is, for every variable, considerably smaller than the overall standard deviation. Dividing by the smaller standard deviation would magnify differences between experimental and control schools. In summary, Table 3 gives each point on the corresponding graph in Figure 1 in standard score units, using the overall mean and standard deviation of each variable to calculate standard scores.
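The conversion itself is simply a z-score computed against the pooled mean and standard deviation. A minimal sketch follows, with hypothetical ratings in place of the study's data.

    import statistics

    # Hypothetical ratings pooled over pretest and posttest and over both
    # groups of schools, as described above.
    pooled = [0.5, 0.8, 1.0, 0.7, 3.4, 3.1, 0.6, 0.9, 3.6, 1.1]
    grand_mean = statistics.mean(pooled)
    overall_sd = statistics.stdev(pooled)

    # Express a hypothetical adjusted posttest group mean in standard score units.
    adjusted_group_mean = 3.4
    z = (adjusted_group_mean - grand_mean) / overall_sd
    print(round(z, 2))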
The results in Table 3 amplify the graphic presentation. The tabulations show that in every instance, after adjusting for pretest scores by analysis of covariance, the posttest experimental schools are over one standard deviation above the grand mean, whereas posttest control schools are one-third to three-fifths of a standard deviation below the mean. Effects of this magnitude due to a "treatment" are seldom observed in social research. It is concluded, therefore, that the magnitude of the standard scores indicates educationally important gains for the experimental schools.
TABLE 3. STANDARD SCORE MEANS FOR ANCOVAs

    Panel                            Did Not Use CPSS    Used CPSS
    1: Organizational structure      -.38                1.40
    2: Needs                         -.46                1.36
    3: Goals                         -.62                1.42
    4: Objectives                    -.42                1.35
    5: Activities                    -.48                1.19
    6: Overall quality               -.31                1.34
The second aspect of educational significance is the need addressed by CPSS. As noted in the opening paragraphs of this submission, the CPSS materials were developed in response to a need for improved career development planning in schools. This need has been expressed repeatedly in a variety of professional forums representing several professional specialties. Prior to the development of CPSS, a consensus had developed that systematic planning was an essential ingredient in improving career development programs. The CPSS materials are designed to instruct school staffs in the use of a systematic planning process and the development of associated products for building career development programs in high schools. The data in this submission demonstrate that the materials do enable staffs to create a systematic planning process.
REFERENCES
Campbell, R.E. "The Application of Systems Methodology to Career Development Programs." In Zalatino, S.D., and Sleeman, P.J., eds., A Systems Approach to Learning Environments. Roselle, New Jersey: Meded Projects, Inc., 1975.

Campbell, R.E., Dworken, E.P., Jackson, D.P., Hoeltzel, K.E., Parsons, G.E., and Lacey, D.W. The Systems Approach: An Emerging Behavioral Model for Career Guidance. Columbus, Ohio: The Center for Vocational and Technical Education, 1971.

Campbell, D.T., and Stanley, J.C. Experimental and Quasi-Experimental Designs for Research. Chicago: Rand McNally & Co., 1966.

"Guidelines for a Quality Career Guidance Program." AVA Guidance Division Newsletter, 1979.

Herr, E.L. Guidance and Counseling in the Schools: The Past, Present, and Future. Falls Church, Virginia: The American Personnel and Guidance Association, 1979.

Hosford, R.E., and Ryan, T.A. "Systems Design in the Development of Counseling and Guidance Programs." Personnel and Guidance Journal, 49: 221-230, 1970.

Mitchell, A., and Gysbers, N., issue paper referenced in E.L. Herr, ed., Guidance and Counseling in the Schools: The Past, Present, and Future. Falls Church, Virginia: The American Personnel and Guidance Association, 1979.

Winer, B.J. Statistical Principles in Experimental Design. New York, New York: McGraw-Hill, Inc., 1971.
APPENDIX L
Acknowledgements
ACKNOWLEDGEMENTS

The National Center wishes to thank the following people for their participation in the Career Planning Support System Assessment Study, 1977-79.

District Superintendents and/or Coordinators

ARIZONA
Lanny Luty, Superintendent, Globe Public Schools, Globe
Lawrence Lchim, Superintendent, Miami Public Schools, Miami

COLORADO
Arthur I. Luilon, Superintendent; David Minguru, Career Coordinator; Jefferson County Public Schools, Lakewood

FLORIDA
J. L. Jones, Superintendent; Ernest Upthegrove, Career Consultant; Dade County Public Schools, Miami

ILLINOIS
Joseph Hannon, Superintendent; Joyce Clark, Director, Department of Guidance Programs & Services; Chicago Public Schools, Chicago

KENTUCKY
E. C. Grayson, Superintendent; Hartona Prell, Director of Career Education; Jefferson County Public Schools, Louisville

MARYLAND
John Crew, Superintendent; Hilbert Stanley, Administrative Assistant; Baltimore City Public Schools, Baltimore

TENNESSEE
Willie Herenton, Superintendent; Wallace Wilson and Danny Hollingsworth, Guidance & Pupil Adjustment Division; Memphis City Schools, Memphis

School Principals and Coordinators

ARIZONA
John Vest, Principal; Mike Cruikshank, Coordinator; Globe High School, Globe
Richard Panagos, Principal; James Loll, Coordinator; Miami High School, Miami

COLORADO
Jack Jost, Principal; Rein Leetmite, Coordinator; Alameda Sr. High School, Lakewood
John Musciano, Principal; Fred Dyer, Coordinator; Pomona High School, Arvada

FLORIDA
Lonnie Coleman, Principal; Ruth Sedlik, Coordinator*; American Sr. High School, Hialeah
Miriam Stoodt, Principal; Alice Bryant, Coordinator; South Dade Sr. High School, Homestead
Nicholas Borota, Principal; Paul Duncan, Asst. Principal; North Miami Sr. High School, North Miami

ILLINOIS
Floyd Wyrick, Principal; Lois Gueno, Coordinator*; Calumet High School, Chicago
Jacqueline Simmons, Principal; Landon Cox, Coordinator; Paul Robeson High School, Chicago
Joseph Smith, Principal; Mildred Losee, Coordinator; J. M. Harlan High School, Chicago
Robert Saddler, Principal; Marjorie Spurlock, Coordinator; John Marshall High School, Chicago

KENTUCKY
Eugenia Lewis, Principal; Paul Clubb, Coordinator; Iroquois High School, Louisville
Joseph McPherson, Principal; John Pecli, Coordinator; Central High School, Louisville

MARYLAND
Edmonia Yates, Principal; Gwendolyn Brooks, Coordinator*; Forest Park Sr. High School, Baltimore
Kathleen Luchs, Principal; Myrna Goldberger, Coordinator*; Northwestern High School, Baltimore
John Mohamed, Principal; Jane Gates, Coordinator; Lake Clifton High School, Baltimore

TENNESSEE
Corbet Washington, Principal; Warren Morehart, Coordinator; Sheffield High School, Memphis
Bennett Hunter, Principal; Louis Wakefield, Coordinator; Raleigh-Egypt High School, Memphis

*Experimental school