

Monographs in Engineering Education Excellence
University of South Carolina College of Engineering and Information Technology
Gateway Engineering Education Coalition

Edward Ernst, University of South Carolina, Monographs Editor

A Continuous Quality Improvement System: An On-going Assessment Process within the College of Engineering and Information Technology at U.S.C.

Susan D. Creighton
Edward W. Ernst
Joseph H. Gibbons
Charles W. Brice
Francis A. Gadala-Maria
Jed S. Lyons
Anthony Steve McAnally

University of South Carolina

Number 4, December 2000


Monographs in Engineering Education Excellence
Edward Ernst, University of South Carolina, Monographs Editor

A Continuous Quality Improvement System: An On-going Assessment Process within the College of Engineering and Information Technology at U.S.C.

By:
Susan D. Creighton
Edward W. Ernst
Joseph H. Gibbons
Charles W. Brice
Francis A. Gadala-Maria
Jed S. Lyons
Anthony Steve McAnally

Published by the College of Engineering and Information Technology, University of South Carolina, Columbia, SC 29208. Address editorial correspondence to Edward Ernst, 3A12 Swearingen Engineering Center, University of South Carolina, Columbia, SC 29208; (803) 777-9017; [email protected].


Contents

Preface
Background
College Assessment Infrastructure
College-Wide System
Assessment Plan
Assessment Methods
Quality Review Process
Program Assessment Structures and Processes
    Mechanical Engineering Program
    Chemical Engineering Program
    Civil Engineering Program
    Electrical Engineering Program
    Computer Engineering Program

Appendices

A Assessment Plan
B Senior Survey
C Senior Survey reports (sample)
D Course Survey
E Course Survey reports (sample)
F Alumnae/Alumni Survey
G Alumnae/Alumni Survey reports (sample)
H Faculty/Staff Surveys
I Faculty/Staff Survey reports (sample)
J Entering Student Survey
K Entering Student Survey reports (sample)
L Performance Assessment Instrument
M Mid-Course Evaluation
N Education Outreach Survey
O Professional Communication Center Assessment Report
P Longitudinal Student Tracking Report (sample)
Q Bates House Project Report
R Template for Documenting Assessment Progress


Preface

Monographs in Engineering Education Excellence is a series of publications dealing with innovations in engineering education introduced at the University of South Carolina, with the support of the Gateway Engineering Education Coalition. The series seeks to make the information and ideas in the reports more accessible to engineering educators. It is hoped that other institutions will find the reports useful and adaptable to their own educational missions.

The Monographs in Engineering Education Excellence series includes a variety of genres (theses, dissertations, and technical reports), but all share the common objective of rethinking, reshaping, and revitalizing engineering education. This monograph, A Continuous Quality Improvement System: An On-going Assessment Process within the College of Engineering and Information Technology at U.S.C., discusses the college-wide assessment and CQI system developed to ensure that the educational programs of the college are achieving the expectations held for them. The monograph presents examples and details regarding the tools, policies, processes, and procedures that have been developed and implemented in the college. These assessment/CQI efforts have evolved with support from the Gateway Engineering Education Coalition.

Broad agreement exists within the engineering education community on the need for systemic educational reform so that programs can provide the activities necessary to develop graduates who meet the new standards for the 21st century. The reform movement encourages more diversity in classroom practices, moving instruction from the traditional lecture to structured activities that reflect what engineers do in the workplace. These initiatives promote changes in classroom practice to reflect the knowledge, skills, and abilities engineers require to conceptualize, articulate, and implement solutions to engineering problems. The reform movement also advocates that engineering curricula incorporate a variety of teaching methods to involve students in active learning, design projects, technology use, and multidisciplinary teams. Outcomes-based assessments, in the form of design projects, portfolios, and model construction, enable faculty to link student competencies with the expectations of the workplace.

Believing in the need for change and recognizing that engineering is part of the growing national trend toward increased accountability, many accrediting organizations, as well as national and state funding agencies such as the National Science Foundation, have taken leadership roles in defining new parameters for engineering education. The paradigm shift is clearly evident in the new criteria adopted by the Accreditation Board for Engineering and Technology (ABET), which promote the use of outcomes assessment as the measuring tool for institutional and program evaluation. The stated goals of ABET accreditation include: (1) providing graduates of accredited programs who are adequately prepared to enter the engineering profession; (2) stimulating the improvement of engineering education; and (3) encouraging new and innovative approaches to engineering education.

To achieve these objectives, the ABET Engineering Criteria 2000 stipulate that individual programs must have published educational objectives consistent with the mission of their institution. Programs must evaluate the success of students in meeting program objectives using appropriate assessment methodologies. The ABET criteria also require engineering programs to include a continuous quality improvement process. In this model, the program evaluation process documents progress towards achievement of objectives established by the engineering program and uses this information to improve the program.

Moreover, the criteria require that programs demonstrate student outcomes for such complex skills as the ability to design and conduct experiments and to analyze and interpret data, the ability to design a system, component, or process to meet desired needs, and the ability to communicate effectively. Types of evidence advocated by ABET to document these student outcomes include portfolios, design projects, nationally normed subject content examinations, and alumnae/alumni and employer surveys.

Criterion 2 of the ABET Engineering Criteria 2000 mandates a system that continually evaluates each program to determine whether its objectives are being met and whether they meet the needs of the program's constituencies. The college developed and implemented a college-wide infrastructure with supporting policies, procedures, personnel, and assessment tools to ensure the permanency and effective operation of the system.

The college-wide assessment system is linked with the continuous quality improvement processes initiated within each USC engineering program: Chemical, Civil, Computer, Electrical, and Mechanical engineering. Together, the college-wide assessment processes and the program assessment processes comprise the USC COEIT Continuous Quality Improvement System.

The college-wide infrastructure provides the coordination and collaboration efforts needed to facilitate: (1) continuous cycles of program improvement; (2) the attainment of college goals and objectives; and (3) the achievement of state-level and accreditation agency performance indicators. The structure supports the personnel and resources necessary to maintain the flow of data, information and evaluation results through the system. It also serves as the focus for the triangulation and synthesis of data from different constituencies and various reports.


Background

Numerous reports over the past ten years have outlined the attributes that engineering graduates need to possess in the 21st century workplace [1]. The engineering education culture is shifting from one emphasizing individual specialization, compartmentalization of knowledge and a research-based faculty reward structure to one that values integration and specialization, teamwork, educational research and innovation. Institutions of higher education now focus on student outcomes or performance-based models of instruction that strive to measure what students have learned and what they can do [2]. Outcomes assessment examines the results of the education process by asking to what extent students have accomplished the objectives of their discipline.

There is broad agreement on the need for systemic educational reform within the engineering community so that programs can provide the activities necessary to develop graduates who meet the new standards for the next century. The reform movement encourages more diversity in classroom practices, moving instruction from the traditional lecture to structured activities that reflect what engineers do in the workplace. These initiatives promote changes in classroom practice to reflect the knowledge, skills, and abilities engineers require to conceptualize, articulate, and implement solutions to engineering problems. The reform movement also advocates that engineering curricula incorporate a variety of teaching methods to involve students in active learning, design projects, technology use, and multidisciplinary teams. Outcomes-based assessments, in the form of design projects, portfolios, and model construction, enable faculty to directly link student competencies with the expectations of the workplace.

Believing in the need for change and recognizing that engineering is part of the growing national trend toward increased accountability, many accrediting organizations, as well as national and state funding agencies such as the National Science Foundation, have taken leadership roles in defining new parameters for engineering education. The paradigm shift is clearly evident in the new criteria adopted by the Accreditation Board for Engineering and Technology (ABET), which promote the use of outcomes assessment as the measuring tool for institutional and program evaluation. The stated goals of ABET accreditation include: (1) providing graduates of accredited programs who are adequately prepared to enter the engineering profession; (2) stimulating the improvement of engineering education; and (3) encouraging new and innovative approaches to engineering education [3].

To achieve these objectives, the ABET Engineering Criteria 2000 stipulate that individual programs must have published educational objectives consistent with the mission of their institution. Programs must evaluate the success of students in meeting program objectives using appropriate assessment methodologies. The ABET criteria also require engineering programs to include a continuous quality improvement process. In this model, the program evaluation process provides documentation of progress toward achievement of objectives established by the engineering program and uses this information to improve the program.

In addition, the criteria require that programs demonstrate student outcomes for such complex skills as the ability to design and conduct experiments and to analyze and interpret data, the ability to design a system, component, or process to meet desired needs, and the ability to communicate effectively. Types of evidence advocated by ABET to document these student outcomes can include portfolios, design projects, nationally normed subject content examinations, focus groups, and surveys of alumnae/alumni, students, and/or employers.


College Assessment Infrastructure

As engineering classroom practices change, the evaluation of student development and program effectiveness must align with the new ABET emphases. Criterion 2 of the Criteria 2000 specifies that programs must have published educational objectives that are consistent with the mission of the institution. It also mandates a system of continual evaluation to determine whether program objectives are met and whether they meet the needs of the program's constituencies. To this end, the University of South Carolina College of Engineering and Information Technology (COEIT) developed and implemented a college-wide infrastructure with supporting policies, procedures, personnel, and assessment tools to ensure the permanency and effective operation of the system.

The college-wide assessment system is linked with the continuous quality improvement processes initiated within each USC engineering program: Chemical, Civil, Computer, Electrical, and Mechanical engineering. Together, the college-wide assessment processes and the program assessment processes comprise the USC COEIT Continuous Quality Improvement System. Both parts of this system are integrated within the College Strategic Plan. As seen in Figure 1, this plan connects the College to its institution through the University of South Carolina's statement of vision, mission, and goals.

Figure 1. Overview of COEIT Continuous Quality Improvement System

The purpose of the continuous quality assessment system is to continually assess the needs of each program's various constituencies, to ensure that the programs are achieving the expectations described by their objectives, and to evaluate how effectively each program and the College have moved toward achieving their stated mission and goals. Assessment processes show faculty, staff, administrators, and others where improvements appear appropriate and guide the implementation of change within each program and within college-wide service areas. Changes are monitored and re-evaluated to determine what improvement has been realized. Thus, the system is an ongoing evaluation of the effectiveness of the College and its programs.


[Figure 1 elements: USC vision, mission, and goals; COEIT vision, mission, and goals; COEIT Strategic Plan; college-wide assessment system; program assessment systems.]


The following sections will discuss both the College-wide system and the program systems. Examples and details will be given regarding the tools, policies, processes, and procedures that have been developed and implemented at USC COEIT to ensure the institutionalization of the CQI System.

Note. This monograph is a snapshot of the status at the end of the 2000 Spring Semester. The CQI processes are relatively new and continue to change.


College-wide System

The College-wide infrastructure provides the coordination and collaboration efforts needed to facilitate: (1) continuous cycles of program improvement; (2) the attainment of college goals and objectives; and (3) the achievement of state-level and accreditation agency performance indicators. The structure supports the personnel and resources necessary to maintain the flow of data, information and evaluation results through the system. It also serves as the focus for the triangulation and synthesis of data from different constituencies and various reports.

The comprehensive character of the college-wide assessment structure is evident in the following diagram.

Figure 2. College-Wide Assessment Infrastructure

The diagram shows the integration of state and institutional parameters within the system. It also highlights the linking of college assessment processes to its departmental programs. A more comprehensive view of the departmental assessment processes within this continuous loop system is discussed in a later section.

The personnel and processes of the college-wide assessment infrastructure, however, are the focus of this diagram. The college-wide infrastructure consists of several formal, key components: the College Executive Committee, the Center for Engineering Education Excellence, the Center for Engineering Education Excellence Team, the Assessment Director, the departmental assessment/education committees, and the College's various constituencies.


[Figure 2 elements: mission and goals (university, college, department); ABET criteria; constituencies (students, alumni, employers, Industry Board, faculty); departmental committee; proposed program objectives and outcomes; department faculty; approved program objectives and outcomes; assessment of results (students, alumni, employers, faculty, staff, others); Assessment Director; College Executive Committee; Center for Engineering Education Excellence Team; plan; department chairs; department committee; faculty curricula.]


A brief overview will outline the responsibilities of each component and provide insight into how these personnel and committees interact to produce a continuous quality improvement process.

Executive Committee

The Executive Committee is composed of the Dean, the Associate Deans, the departmental chairpersons, and the Director of the Center for Engineering Education Excellence. This committee meets at two-week intervals and provides oversight and decision making for the College.

Center for Engineering Education Excellence

The Center for Engineering Education Excellence is an interdisciplinary organization of individuals who collaborate in the effort to promote self-study, innovation, and reform within the College. The staff and support personnel involved in the Center include the Director of the Center, a Program Coordinator, the Assessment Director, the Director of the Professional Communications Center, and the Ethics Coordinator.

The mission of the Center encompasses all the major parts of engineering education (undergraduate, graduate, and research) and promotes the meaningful integration of engineering education. The educational goal of the Center is to graduate students who understand the technology content of engineering as well as its social, political, ethical, environmental, and economic context.

The objectives for the Center have both an internal and an external thrust. These objectives include:

- Develop students as engineering professionals with the motivation, capability, and knowledge base for career-long learning
- Emphasize effective teaching/learning strategies for all types of students
- Promote effective and (time) efficient student/faculty interaction
- Enhance the continuous quality improvement (CQI) process within the College
- Serve the engineering education community by encouraging innovation and reform
- Increase the visibility of USC within the engineering education community
- Provide a channel for learning about innovation in engineering education at other schools

Center for Engineering Education Excellence Management Team

The Center for Engineering Education Excellence Team provides the opportunity for collaboration among the programs, discussion of issues, planning of activities, and making recommendations for college-wide initiatives. The committee consists of a Chairperson (the Director of the Center), the Assessment Director, the Associate Dean for Academic Affairs, the Director of the Professional Communications Center, the Ethics Coordinator, and one faculty representative from each of the Chemical, Civil, Computer, Electrical, and Mechanical programs. The biweekly committee meetings serve as one focal point for the distribution and discussion of report findings and information. Committee members then share this information with the appropriate committees within their individual departments.

The members of the Center for Engineering Education Excellence Team have been the primary personnel involved with the initial organization and maintenance of the assessment structure. Meeting on a weekly basis, the team addressed a range of issues relating to the implementation of a continuous improvement program. Substantive tasks accomplished by the Committee include:

- restatement of the College's mission
- articulation of an assessment process within each program
- development of educational objectives for each program
- development of objectives for each course within each program
- determination of assessment methods and metrics to measure the objectives and outcomes
- development of a faculty workload policy
- discussion of survey results (Senior Exit Survey, Course Survey, etc.)
- review of, and feedback on, each college-wide survey or assessment technique
- preparation of self-study reports for the ABET accreditation review
- participation in the ABET accreditation review

Program Assessment Committees

The Program Assessment Committees comprise three to five faculty members within each program and serve as the focus for problem solving, innovation, and program change. Each program has articulated an assessment structure and process to collect and/or review data and information related to its student outcomes and course objectives. In general, each department assigned responsibility for addressing assessment data and/or topics to one or more committees within the department. A more extensive discussion of the continuous quality improvement processes for the degree programs follows in later sections.

Director of Assessment

The Director of Assessment position was created to develop and implement the overall college-wide assessment infrastructure, processes, and procedures for maintaining a continuous quality improvement program, and to provide technical support to the faculty implementing assessment processes in each degree program. Having a full-time person to direct and support assessment activities was an important step because it increased the flow of information among faculty and staff across disciplines, resulting in increased ownership of student learning outcomes and a heightened sense of responsibility toward the College's graduates. The sharing of ideas, information, and evaluation results enhanced communication between the administration and the faculty and staff members.


Assessment Plan

The Director of Assessment developed a three-year plan to guide the implementation and evaluation of the continuous quality improvement process and to establish timeframes, action strategies and a budget for the system. The assessment program plan set objectives, outcomes, criteria and a timeframe that established the framework for a continuous quality review/improvement system. The goals of this program are fourfold:

1) to present conclusions regarding the overall outcomes of students' academic and extracurricular engineering performance for use in decision making by faculty, program chairs, and administration;

2) to present results about programs, activities, etc. in order to improve the programs;

3) to enhance understanding and appreciation of formative and summative evaluation; and

4) to contribute to the general body of knowledge with regard to evaluation of undergraduate engineering programs.

An example from this plan is given in the following section. Objective 1 provides for the overall assessment system for the College. See Appendix A for the complete Assessment Plan.

Assessment Plan Program Objectives and Strategies

Objective 1:

Develop and implement an assessment program that provides processes and procedures for the continuous evaluation of student performance and satisfaction, faculty performance and satisfaction and stakeholder input into the educational system.

Action Strategies & Timeframes:

1. Monitor the processes and procedures developed and implemented to evaluate assessment data provided to each program and the executive committee. (4/00; 4/01; 4/02; 4/03; 4/04)

2. On an annual basis, each department will review and make recommendations for improvement based on assessment data collected to address each program outcome as part of the continuous quality review program. (Center for Engineering Education Excellence Team) (6/00; 6/01; 6/02; 6/03; 6/04)

3. The Director of Assessment will prepare the annual Quality Review Program Report indicating the extent to which the action plans were implemented and achieved by each department, the feasibility of the time frames, and recommendations for improving the process. (10/00; 10/01; 10/02; 10/03; 10/04)


Outcomes:

A. Outlining each major step in the assessment process that will occur within the program, each program will submit written procedures to be reviewed by the Dean.

B. Each program will submit written procedures.

C. On an annual basis, each department will provide a written summary report of findings (outcomes), results, actions taken, consequences, and recommendations, verifying that the assessment process has completed the annual cycle and specifying problems and solutions.

D. The Director of Assessment will summarize the results and recommendations of the Center for Engineering Education Excellence Team and then prepare a synopsis of the annual review indicating the assessment measures analyzed, outcomes, recommendations, changes implemented, and the evaluation results of the changes.

E. The Executive Committee will discuss and prioritize action strategies recommended as a result of the annual program review.

Resources:

- The Director of Assessment position
- An educational research graduate assistant
- A work-study student assistant

The assessment plan provides a comprehensive outline of all of the tasks related to the Director of Assessment position. In addition, the plan details the College instruments to be implemented and the methodology to be used to ensure that ongoing assessment and evaluation are undertaken by the degree programs. Use of the Strategic Plan for the College of Engineering and Information Technology is one way in which the degree program assessment processes are continually monitored, revised, and evaluated. Departmental and college objectives and outcomes are modified annually to address new priorities or pursuits. The annual Quality Review Program Report is incorporated within the Strategic Plan.


Assessment Methods

The Director of Assessment has also identified and developed college-wide assessment tools for use in the continuous quality improvement system. A number of instruments, processes and procedures were developed and implemented to collect data that can be used to evaluate the effectiveness of the USC College of Engineering and Information Technology and its programs as well as student learning and growth. In addition, the Director of Assessment provided a Student Longitudinal Tracking System, coordinated the implementation of Employer Focus Groups, interviewed students and faculty members, assisted instructors with the evaluation of teaching/learning objectives for specific courses, and developed evaluation measures for examining the impact of the Professional Communications Center. A few of the important college-wide assessment instruments developed and utilized thus far in the assessment process are discussed in the following sections.

Senior Survey

Students graduating from the College of Engineering and Information Technology complete a survey requesting information about their undergraduate college experience and their judgment regarding specific engineering skills and abilities. The four-page survey obtains information in the following areas:

(1) overall ratings of students' engineering education
(2) life-long learning indicators
(3) assessment of specific college services
(4) opportunity for students to make recommendations
(5) evaluation of ABET skills and competencies
(6) useful experiences
(7) extracurricular activities
(8) plans for graduate education
(9) employment information
(10) demographic information, including transfer status

A copy of the survey is found in Appendix B. A Graduate Placement Sheet also accompanies the distribution of the survey; this form requests an address for future mailings as well as employment and/or graduate school information. Students are given separate envelopes in which to return the Placement Sheet so that, if they choose, their survey responses remain anonymous.

Administrative Procedures

Several methodologies have been utilized since the 1998 Spring Semester to administer the survey and the data sheet to graduating seniors. During the first three semesters, the College initiated a procedure in which graduating seniors from each program distributed and collected the surveys from students in their program. The use of paid student assistants encouraged participation and resulted in a return rate of approximately 80 percent. As a result of this more personalized approach, seniors began to learn of the importance of this type of information to the College. During recent semesters, the College has experimented with other distribution and retrieval methodologies. The one that appears most successful in producing the highest return rate and quality responses is administering the instrument during a particular course in each program. The Chemical, Civil, Computer, Electrical, and Mechanical programs each have a senior-level course composed of graduating seniors. Administered at the end of the semester, the survey captures an even greater percentage of the graduating seniors and assures a more uniform administration of the assessment instruments.

Reporting

The Director of Assessment prepares a tabular listing of responses giving frequencies and percentages for the total results and the breakdowns for each degree program. An additional summary report, giving an analysis of the overall results and a synopsis of program differences, if any, accompanies the listing of results. An example of each report is given in Appendix C.
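As a rough illustration of this kind of tabulation (a sketch only; the College's actual reports were produced with SAS, and the column names below are hypothetical), a per-program breakdown of item frequencies and percentages might be computed as follows:

    # Hypothetical sketch of the senior-survey tabulation; the actual
    # reports were produced with SAS, and all column names here are
    # illustrative assumptions.
    import pandas as pd

    # Each row represents one graduating senior's response to one
    # Likert-type item; 'program' identifies the degree program.
    responses = pd.DataFrame({
        "program": ["MECH", "MECH", "CHEM", "ELEC", "CHEM"],
        "q1": ["agree", "strongly agree", "agree", "neutral", "agree"],
    })

    # Frequencies and percentages for the college as a whole.
    totals = responses["q1"].value_counts()
    print(totals)
    print((100 * totals / totals.sum()).round(1))

    # Breakdown of the same item for each degree program.
    print(responses.groupby("program")["q1"].value_counts())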

Course Survey

The Course Survey instrument is administered to students enrolled in all undergraduate and graduate courses taught within the College each semester. Administration of the form is required for all courses enrolling five or more students, including APOGEE (distance education/continuing education) and other graduate courses.

The first seven items on the survey are those mandated by the state legislature. The wording and response options of these seven items are reproduced as required by state law. On a regular basis, the College's scores on these items are reported to the Office of Institutional Planning and Assessment; the data are then forwarded to the Commission on Higher Education. The other items on the survey were developed and approved by the Center for Engineering Education Excellence Team.

The Course Survey was administered for the first time at the end of the 1997 Fall Semester and has been revised several times to accommodate changes within the College and to improve the quality of the survey items. The revised survey is a two-sided Scantron sheet with four sections. Students provide course and instructor data in the first section. The second includes 23 items structured in a Likert-type format. Alternatives for most of the items follow a 5-point scale ranging from "strongly disagree" to "strongly agree," with "neutral" as the midpoint; two items use a "very poor" to "excellent" response pattern, and one item uses a 4-point scale with a "very dissatisfied" to "very satisfied" response pattern. The third section provides space for instructors to add up to 12 additional questions. The last section contains three short-answer questions giving students the opportunity to make their own observations and comments regarding the strengths and weaknesses of the course. A copy of this survey is given in Appendix D.

Administrative Procedures

Each faculty member receives packets that include course surveys and student and faculty instructions for survey completion. Memos to the students and faculty outline coding instructions for adding the instructor identification, course number, and section number to the scanning process, as well as survey dissemination, collection, and retrieval information. Surveys are received in the Student Services Office, where they are sorted, coded, counted, aligned, and sent to Computer Services for scanning. Student data are analyzed and reported using a database and program written with the SAS (Statistical Analysis System) statistical software.

Reporting

Each semester, a tabular report listing the frequencies, percentages, means and standard deviations for each item alternative is generated for each faculty member. In addition to listing the faculty member’s total for each section, the report lists the departmental and college totals. The Director of Assessment also prepares a brief summary of the overall college results. Both reports are distributed to all College instructors. A more comprehensive report is prepared for the Executive Committee and members of the Center for Engineering Education Excellence Team. This report contains the frequencies, percentages, means and standard deviations for each item alternative for each program and the college totals. A copy of each type of report is located in Appendix E.
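For illustration, a minimal Python sketch of the departmental and college-wide item statistics described above (the actual analysis used SAS; the variable names here are assumptions):

    # Hypothetical sketch of per-item course-survey statistics with
    # departmental and college totals; the real analysis was done in
    # SAS, so the structure and names below are illustrative only.
    import pandas as pd

    surveys = pd.DataFrame({
        "dept":  ["ECE", "ECE", "ME", "ME", "CHE"],
        "item1": [5, 4, 3, 4, 5],   # 5-point scale: 1 = strongly disagree,
        "item2": [4, 4, 5, 3, 4],   # 5 = strongly agree
    })
    items = ["item1", "item2"]

    # Counts, means, and standard deviations for each department ...
    print(surveys.groupby("dept")[items].agg(["count", "mean", "std"]))

    # ... and the college-wide totals for the same items.
    print(surveys[items].agg(["count", "mean", "std"]))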

Alumnae/Alumni Survey

During the 1998 fall semester, the College of Engineering developed an Alumnae/Alumni Survey to obtain information from graduates who have been attending school or working for the past three years. The survey asks alumnae/alumni to evaluate several aspects of their undergraduate program and their present career position. The five-page instrument obtains information regarding the following topics:

- Employment information
- Satisfaction with career, salary, etc.
- Continuing education
- Rating of the undergraduate experience
- Rating of competency level for particular skills
- Rating of the importance of particular skills
- Positive aspects of the engineering program
- Professors influential to professional development
- Recommendations for improvement of the educational experience
- Professional development
- Demographic information

A copy of the Alumnae/Alumni Survey (for graduates after three years) is included in Appendix F.

Administrative Procedures

The Assessment Director used the USC database of records to obtain student addresses for each mailing of the survey. The Alumnae/Alumni Survey is administered once a year to students who graduated three years prior to that date; this schedule was chosen because it allows graduates a typical time period to complete a graduate degree or to become established in the workplace. The first mailing of this survey, to students who graduated in 1995, was completed during March 1999; approximately 22 percent of the surveys were returned because of an insufficient or incorrect address. A second mailing, using alternative addresses where appropriate, was completed during the first week of May 1999. The second administration of the Alumnae/Alumni Survey took place in November 1999, with a follow-up mailed in March 2000; surveys were mailed to 1996 graduates. Inaccurate addresses continued to be a problem in reaching COEIT alumnae/alumni. The third administration of the Alumnae/Alumni Survey, for 1997 graduates, was completed during July 2000. Alumnae/alumni survey data have been entered and analyzed using SAS software.

Reporting

The Director of Assessment prepares a tabular listing of responses giving frequencies and percentages for the total results and the breakdowns for each program. An additional summary report, giving an analysis of the overall results and a synopsis of program differences, if any, accompanies the listing of results. An example of each report is given in Appendix G. Copies of each report are mailed to each Executive Committee member and each Center for Engineering Education Excellence Team member. Additional personnel receiving reports include representatives from the Development, Career Services, and Student Services departments.

Faculty and/or Staff Survey

An initial Faculty and Staff Survey was administered during May of the 1999 Spring Semester addressing the following areas: (1) College goals and planning; (2) College-industry interaction; (3) College administration/leadership and communication; (4) college-wide services; (5) funding priorities; and (6) awareness of programs at aspirant institutions.

In April 2000, an alternative Faculty Survey was administered within the college to capture data similar to information requested from seniors and alumnae/alumni. This revised faculty survey elicited responses to questions concerning:

(1) the amount of experience students received on 21 skills
(2) the level of competency achieved by USC engineering students on 21 skills
(3) the extent to which reform learning/teaching strategies are incorporated within the classroom
(4) the level of student input for course improvement
(5) the improvement of the engineering education experience
(6) the use of different assessment tools within a course
(7) the professional development activities for faculty

A copy of each survey is located in Appendix H.

Administrative Procedures

Faculty surveys were mailed to each full-time faculty member within the College of Engineering and Information Technology. A cover letter, containing instructions for the return of the survey and an explanation of the importance of the requested information, and a labeled return envelope were provided within the survey packet. At the end of two weeks, an email was sent to all professors reminding them to complete and return the survey as soon as possible; an electronic copy of the survey was attached to the email.

Reporting

A tabular report listing the frequencies and percentages for each item alternative is generated for each program as well as college totals. The Director of Assessment also prepares a brief summary of the overall college results. Both reports are distributed to the Executive Committee and the Center for Engineering Education Excellence team members. A copy of each type of report is located in Appendix I.

Entering Student Questionnaire

An Entering Student Questionnaire was developed to provide specific information for administrative personnel involved with student marketing and recruitment. The primary emphasis of this survey is determining why students chose to come to USC, what other colleges they applied to, and the reasons that were important in their decision to attend the College of Engineering. Students are also asked to provide information about their academic background in math, chemistry, physics, and writing. The survey also captures information about computer ownership, usage, and training. A copy of this survey is found in Appendix J.

Administrative Procedures

Entering Student Questionnaires are administered once per year in the fall semester. The surveys are distributed to each faculty member teaching one of the freshman engineering courses; these instructors administer and collect the surveys. Emails are sent to instructors alerting them in advance that the surveys are planned and reminding them when the surveys should be returned to the Assessment Office. Data are entered and analyzed using SAS software.

Reporting

A tabular report is prepared that lists frequencies and percentages where appropriate and provides student responses to open-ended questions. The Director of Assessment also writes a summary report that analyzes and summarizes significant trends, themes and findings from the student response data. Reports are distributed to the Executive Committee, Student Services, the Development Officer and the Center for Engineering Education Excellence team members.


Performance Assessment Instrument for an Oral Presentation

A number of other assessment instruments have been developed for use by faculty members within the classroom to evaluate specific instructional objectives. A performance assessment handout, listing course task expectations and providing an evaluation rubric for a senior-level course that uses oral presentations, was the first of several such instruments to be developed; it was implemented during the 1998 Spring Semester. A copy of the handout is given in Appendix L.

Midterm Evaluation

A copy of a midterm evaluation survey is found in Appendix M. This form was developed for use in the Electrical and Computer Engineering sections to provide immediate feedback to the instructors regarding student perceptions of their progress and the overall effectiveness of the faculty member in achieving course objectives.

Educational Outreach

A survey was developed for the "E2 – Everyday Engineering" program to elicit information about ways to evaluate the effectiveness of its presentations and to improve them. E2 is a school outreach effort that targets elementary, middle school, and high school students. The Coordinator of this program creates and presents science-based learning activities in South Carolina area classrooms. A copy of this survey is found in Appendix N.

Professional Communications Center – Data Base and Evaluation of Impact

The Director of Assessment, in collaboration with the Director of the Professional Communications Center, planned several qualitative and quantitative methodologies to assess the impact of the writing center upon the students and faculty within the College of Engineering. A computer database and computer programs have been developed and implemented to obtain a more accurate reflection of the student/faculty consultations during each semester. Reports are generated each semester and at the end of the year; the tabular report for 1999 is found in Appendix O. This data collection effort examines the number of contacts occurring within the Center and individual classrooms involving PCC personnel. The data input also indicates the types of writing issues for which students and faculty seek assistance and the amount of time personnel spend with clients.
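As a sketch of the kind of consultation log this implies (the monograph does not document the actual database schema, so every field name below is an assumption), each record might capture the client type, the writing issue, and the time spent, which the semester report then aggregates:

    # Hypothetical consultation-log records for the Professional
    # Communications Center; the actual database schema is not
    # documented here, so all field names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Consultation:
        semester: str      # e.g., "Fall 1999"
        client_type: str   # "student" or "faculty"
        issue: str         # writing issue the client sought help with
        minutes: int       # time spent with the client

    log = [
        Consultation("Fall 1999", "student", "report organization", 45),
        Consultation("Fall 1999", "faculty", "proposal editing", 30),
        Consultation("Fall 1999", "student", "grammar and usage", 20),
    ]

    # Semester report: number of contacts, total time, and issue mix.
    contacts = len(log)
    total_minutes = sum(c.minutes for c in log)
    issue_counts = {}
    for c in log:
        issue_counts[c.issue] = issue_counts.get(c.issue, 0) + 1
    print(contacts, total_minutes, issue_counts)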

Student Longitudinal Tracking System

In collaboration with the University's Institutional Planning and Assessment Office, the College of Engineering assisted with the design and implementation of a Longitudinal Student Tracking System that incorporates all of the necessary elements to study student trends from admission through graduation and beyond. The goal of this system is a college-wide mechanism that provides data enabling faculty and administrators to continuously monitor and improve the quality of their programs.

To initiate the creation of the Longitudinal Student Tracking System, the College of Engineering developed a set of research questions and companion tables to specify the variables requested in the database and to show how the relationships among these variables might be displayed. A total of 36 research questions were enumerated; some of these are listed below.

1. How many students were enrolled in each cohort (1990-91, 1991-92, 1992-93, 1993-94, 1994-95) for each of the following subgroups: total engineering students, first-time freshmen, and transfer students showing ethnicity and gender for each subgroup?

2. How many students in each cohort graduated as of June 1998 showing distributions for each of the following subgroups: total students, first-time freshmen, and transfer students with breakdowns by ethnicity and gender for each subgroup?

3. How many students in each cohort graduated in Engineering as of June 1998 showing distributions for each of the following subgroups: total students, first-time freshmen, and transfer students with breakdowns by ethnicity and gender for each subgroup?

4. What are the average cumulative GPAs of graduates within each cohort who received an Engineering degree, showing the distributions for the following subgroups: total engineering students enrolled, first-time freshmen, and transfer students, with breakdowns by ethnicity and gender?

Using the College of Engineering research request as a guide, Planning and Assessment personnel downloaded student data from various USC mainframe systems to compile the Longitudinal Student Tracking component. This new database includes student data from the 1990-91 cohort to the 1997-98 cohort and incorporates 677 variables of interest. Variables can be grouped into the following six categories: admissions data (SAT scores, rank, entry status, etc.); demographic information (gender, ethnicity, etc.); academic performance indicators (grades in courses, GPA, etc.); graduation statistics; retention; and withdrawal rates. A copy of a report developed using some initial longitudinal student tracking data is located in Appendix P.
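To illustrate how a research question such as question 2 might be answered against a database of this shape (a sketch only; the actual system was compiled from USC mainframe downloads, and the column names below are hypothetical):

    # Hypothetical sketch of answering research question 2 against the
    # tracking database: graduation counts per cohort with breakdowns
    # by entry status, ethnicity, and gender. Column names are assumed.
    import pandas as pd

    students = pd.DataFrame({
        "cohort":       ["1990-91", "1990-91", "1991-92", "1991-92"],
        "entry_status": ["first-time freshman", "transfer",
                         "first-time freshman", "first-time freshman"],
        "ethnicity":    ["White", "Black", "White", "Asian"],
        "gender":       ["F", "M", "M", "F"],
        "graduated_by_jun_1998": [True, False, True, True],
    })

    # Count graduates per cohort, broken down by subgroup.
    grads = students[students["graduated_by_jun_1998"]]
    print(grads.groupby(
        ["cohort", "entry_status", "ethnicity", "gender"]
    ).size())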

Evaluation of The Bates House Living-Learning Community

During the 1999 fall semester, freshman students in the College of Engineering and Information Technology were offered the opportunity to participate in a unique Living-Learning Community program developed in collaboration with the USC Housing Department. The Engineering Community in Bates House is an on-campus residential community designed to enrich the educational environment for first-year engineering students. Development of this concept was based on research documenting the benefits students gain from living in learning environments that foster student-faculty interaction and peer relationships strengthened by involvement with each other both in and out of the classroom.

21

Page 22: A college-wide assessment infrastructure was · Web viewIn this model, the program evaluation process documents progress towards achievement of objectives established by the engineering

More specifically, goals of the Engineering Community in Bates House are:

1) To increase the retention rate of these freshmen by creating a learning environment that maximizes their potential for success;

2) To incorporate active learning strategies and increased academic support to increase academic performance indicators such as the student’s grade point average (GPA);

3) To develop professional attitudes and to emphasize experiential learning by encouraging student involvement in the community and the professional engineering organizations;

4) To develop and implement the use of new technologies, such as laptop computers, that can be applied in the classroom to enhance education program delivery;

5) To provide early design and teamwork experience to enhance student motivation and learning and to develop leadership, communication and problem solving skills.

The increases in retention and academic performance are primarily long-term research questions. The Bates House project students will be tracked during their subsequent years at USC, with course grades and GPA data collected each semester. Retention figures for this group of students will be tabulated, with overall results available at the end of the first, second, and fourth years of the project.

A group of engineering students with similar academic backgrounds will be randomly selected for use as a control group to provide a criterion for judging program success. Retention rates, course grades, and GPA data will be collected for this group of students each semester from the 1999-2000 through the 2002-2003 academic years. The control and experimental groups will be compared to determine whether the additional academic support and activities given the Bates House students yield improved performance and retention within the College. A summary of the initial results of the project is located in Appendix Q.
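A minimal sketch of this planned comparison, assuming semester GPA and retention data for both groups (the monograph does not specify the statistical test, so the two-sample t-test below is purely illustrative):

    # Hypothetical comparison of Bates House participants with the
    # matched control group; the monograph does not name a test, so a
    # two-sample t-test on GPAs is shown only as an illustration.
    from statistics import mean
    from scipy import stats

    bates_gpa   = [3.1, 2.8, 3.5, 3.0, 3.3]   # illustrative values
    control_gpa = [2.9, 2.7, 3.2, 2.6, 3.0]

    print("Bates mean GPA:  ", round(mean(bates_gpa), 2))
    print("Control mean GPA:", round(mean(control_gpa), 2))

    t_stat, p_value = stats.ttest_ind(bates_gpa, control_gpa)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

    # Retention: fraction of each group still enrolled after a year.
    bates_retained   = [True, True, True, False, True]
    control_retained = [True, False, True, False, True]
    print("Bates retention:  ", sum(bates_retained) / len(bates_retained))
    print("Control retention:", sum(control_retained) / len(control_retained))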


Quality Review Process

Feedback from the departments to the ABET/Gateway Committee concerning improvements undertaken within programs as a result of evaluation information is a key feature of the continuous quality assessment loop within the College. Although college-wide efforts provide specific pieces of student and faculty feedback, departments are also responsible for determining the additional assessment activities needed to evaluate their individual objectives. Within each Department, the survey results and reports are analyzed and discussed within the formal assessment structure and procedures adopted for program improvement. Each departmental committee reports the changes, modifications, and/or strategies they expect to follow to accentuate positive findings and provide corrective measures for the areas in need of attention.

The review process is a key component in linking the College-wide System with the Program Assessment Systems. It is also the means for initiating modifications and the framework for reporting on those changes and the subsequent results. The following figure highlights the committees and procedures utilized within the COEIT CQI System.

Figure 3. CQI Review Process

The Director of Assessment prepares tabular and summary reports for each assessment tool utilized within the college-wide system. As indicated above, all reports are generated and distributed to the College Executive Committee and the Center for Engineering Education Excellence Team. Findings are discussed at meetings of both of these committees.


[Figure 3 elements: Executive Committee; Center for Engineering Education Excellence; Assessment Director, assessment plan, and assessment tools; feedback and data analysis; data collection; reports; programs; constituencies (seniors, alumnae/alumni); College Executive Committee; Center Team; program improvement plans/reports; annual assessment analysis/report; COEIT Strategic Plan; recommendations; college priorities; feedback/goodwill; program priorities.]


The Program Chairpersons provide each faculty member with an electronic or hard copy of the summary and the tabular report of the results. Within each program, the results are analyzed and discussed within the formal assessment structure adopted for program improvement. Each program committee makes recommendations and initiates changes within the curriculum. As part of the strategic planning function each year, the programs include a report explaining the changes, modifications, and/or strategies they followed to accentuate positive findings and provide corrective measures for the areas in need of attention. Members of the Center for Engineering Education Excellence Team report and discuss these conclusions at committee meetings throughout the year. The same procedures are followed for the findings from each survey administration.


Program Assessment Structures and Processes

The assessment and continuous quality improvement processes implemented within the COEIT are both college-wide and departmentally focused. The program assessment systems are driven by the college-wide infrastructure; it provides the foundation necessary to generate and disseminate findings and reports for the College. More important, this infrastructure generates the coordination and collaboration efforts needed to facilitate program improvement.

The departments have responsibility for the educational programs; thus, implementation of the assessment processes is focused on the departments and the education programs. The departmental systems consist of ongoing, institutionalized processes whose elements are repeated at regular intervals to assure fresh assessment data and appropriate improvement plans.

In preparation for the development of the individual program assessment plans, the Center for Engineering Education Excellence Team and the Executive Committee participated in the review and modification of the statements specifying the College vision, mission, goals, and objectives. The document adopted by the College on November 27, 1998 is reproduced in the following paragraphs.

Vision Statement

The College of Engineering and Information Technology will be a national model for innovation and responsiveness in addressing the engineering education, economic development and lifelong learning needs of the state.

Mission Statement

The mission of the College of Engineering and Information Technology is to serve the engineering and technology needs of South Carolina through our programs of education, research, and outreach.

Goals and Objectives of the College

1. Meet the educational needs of South Carolina industry, our students and the engineering profession.

2. Support the economic development of our state and create new opportunities.

3. Be recognized as a learning community of students, faculty, and staff that develops student motivation and capability for learning that enhances their careers and lives.

4. Provide an environment that encourages individual intellectual curiosity and freedom and motivates students to meet high academic and ethical standards.

5. Be recognized for research and scholarship and assist the university in its aspiration to become an AAU institution.

6. Develop a supportive climate that attracts and supports a diverse group of faculty, staff, and students.


7. Be recognized as a college committed to becoming better and more productive and to continuous improvement in its education, research, and outreach mission.

Training and Preparation

During the development process, faculty members from each program also attended a workshop conducted by Jack McGourty on October 9, 1998 to assist them in developing objectives and outcomes and in planning the strategies and actions needed to implement their program objectives. Templates of each step in the assessment process were distributed to attendees, who worked in groups to practice writing objectives and outcomes. In addition to the workbooks provided by the Gateway Coalition, members of the Center for Engineering Education Excellence Team also received the booklet entitled “Stepping Ahead: An Assessment Plan Development Guide,” written by Gloria M. Rogers and Jean K. Sando with funding from the National Science Foundation and the Foundation Coalition. Examples of the template used by the faculty members are located in Appendix R. The Assessment Director provided each department with guidelines to assist in the implementation of their individual systems; programs utilized the following outline in their preparations. This document is included in the following section.

Recommended Procedures for Articulating and Documenting Assessment Processes
August 1998

Overview

In preparation for our upcoming ABET accreditation visit in November 1999, we need to develop and implement a system of ongoing evaluation. This system must demonstrate that the outcomes important to the mission of the institution and the objectives of each program are being measured, analyzed, and reported. Most important, the system must document strategies for improvement, based on the assessment results, and provide methods for the evaluation of the implemented modifications. These are concurrent and continual processes occurring each semester. The ultimate goal of this system is to examine and enhance the College of Engineering’s effectiveness.

Four areas, as identified by the Commission on Higher Education, are key elements to this evaluation process:

(1) the improvement of teaching and learning
(2) the personal development of the students
(3) institutional improvement
(4) accountability

Departmental Assessment System

Each department must design and implement a structure and a process for managing the ongoing assessment of their programs. The design should incorporate a flexible system and a process that provides for a continuous flow of data collection, analysis and reporting. This feedback loop for quality improvement should involve broad and appropriate constituent groups in its process and should document all activities concerning its procedures. It is also important to document how results of the assessment instruments are used within each program.
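One possible shape for such a feedback loop is sketched below in illustrative Python; the stage functions, sample data, and threshold are assumptions made for the example, not a description of any department's actual system.

    # A minimal sketch of a departmental assessment feedback loop.
    # Each stage is a placeholder; a real system would attach survey
    # instruments, committee reviews, and documentation to every step.

    def collect():
        """Gather data from surveys, portfolios, grades, etc. (hypothetical)."""
        return {"senior_survey": [4, 5, 3], "course_survey": [4, 4, 5]}

    def analyze(data):
        """Reduce raw data to findings the faculty can discuss."""
        return {tool: sum(vals) / len(vals) for tool, vals in data.items()}

    def report(findings):
        """Distribute findings to constituent groups and document them."""
        for tool, score in findings.items():
            print(f"{tool}: mean = {score:.2f}")

    def recommend(findings, threshold=3.5):
        """Identify areas needing corrective action (threshold is invented)."""
        return [tool for tool, score in findings.items() if score < threshold]

    # One pass through the loop; in practice this repeats every semester.
    findings = analyze(collect())
    report(findings)
    print("action items:", recommend(findings))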


The assessment processes implemented within each area should address all levels and types of infrastructure that impact the program’s capability of meeting its outcomes. The following list of items and questions is intended to act as a starting point for each department to articulate a process for managing the ongoing assessment of their program objectives. Documentation of the policy/procedures, as well as the ongoing implementation of the assessment process is critical (committee minutes, written recommendations, written summaries of outcomes, etc.).

1. Create a management structure within the department to conduct program assessment. (Ex. Faculty committee, all faculty, etc.) Document the work of this structure. (Meeting minutes, agendas, progress reports, handouts, etc.)

2. Define a process by which the management structure (composed of all of the faculty or a committee of faculty) collects data, discusses data, makes recommendations and reports findings.

Who will receive information? Why? Articulate how/when findings will be discussed. What happens with this information at the program/department level?

3. Identify the assessment techniques, tools and strategies that will be used to collect the information for the evaluation.

Relate these to the program objectives. Provide time frames. What? How? When? Why?

4. Articulate and document the assessment outcomes.

What are the findings? How well do they measure the objectives? To what extent were objectives achieved?

5. Provide conclusions and recommendations.

Determine how these recommendations will be implemented and subsequently evaluated. What are the recommendations?

How will they be implemented? What are the follow-up procedures? When?

What was the outcome of the change that was implemented?

6. Articulate the process for feeding back information to the faculty members, constituencies, the program and the college. The feedback loop should also provide information regarding how actions will be taken and the steps that will be used to re-evaluate program progress each year.

College Level Activities

A number of college-wide instruments and processes will be developed/revised, implemented and maintained to collect data that will be used to evaluate our overall effectiveness, each degree program, and student learning and growth. Information from these studies will be provided to each program, but programs must also determine what additional assessment activities are needed to evaluate their individual objectives.

Utilizing the document given above as well as the other documents and training provided, departments developed Assessment Plans that would fit within the structures of their program discipline or created new structures to incorporate evaluation processes. Next, departments developed processes to collect or receive data, analyze and interpret findings, and determine recommendations for no change or for adjustments within the system. Departments were also expected to determine feedback channels and to develop performance criteria for determining if program objectives were met. The programs provide an annual evaluation of the continuous quality improvement process within their area that indicates the extent to which objectives were achieved, how this result was determined, and follow-up plans for monitoring each adjustment. The assessment plans developed by each program within the College are presented in the following sections.


Mechanical Engineering Program

Mission and Goals

The mechanical engineering curriculum provides a strong foundation in the basic and applied sciences and in the liberal arts, with increasing emphasis on mechanical engineering topics in the junior and senior years. A two-semester capstone senior design experience gives the student opportunities to integrate and apply the knowledge and skills learned throughout the mechanical engineering curriculum.

To support the university and college goals and include an emphasis on excellence among our regional peers, the Department of Mechanical Engineering has adopted the following mission statement:

The mission of the Department of Mechanical Engineering at the University of South Carolina is to provide students with a sound mechanical engineering education, advance the understanding and application of scientific principles, enhance economic development, and improve the quality of life of our citizens through teaching, research and outreach programs.

Consistent with this mission and to prepare students for successful careers in engineering, the Department of Mechanical Engineering maintains an academic program with the following program educational objectives:

(1) To educate students to apply mathematics, science and engineering principles to solve mechanical engineering problems;

(2) To develop the student's professional skills that enable a successful career; and

(3) To provide the student with the broad education necessary to practice engineering in a global and societal context.

Objectives and Outcomes

The department has made a deliberate connection between these three program objectives and 15 specific program learning outcomes so that our success at achieving the program objectives can be determined in part by assessing the degree to which the outcomes are reached by our graduates. Our objectives and outcomes are listed in Table 1 below.

Table 1
Objectives and Outcomes for the Mechanical Engineering Program

Objective 1: To educate the student to apply mathematics, science and engineering principles to solve mechanical engineering problems.

Supporting Outcomes:
(1.1) The graduates shall have the ability to analyze, design and realize mechanical and thermal systems.
(1.2) The graduates shall have the ability to use contemporary computation techniques and tools.


(1.3) The graduates shall have competence in design of experiments, experimental practices and data interpretation.
(1.4) The graduates shall have the ability to apply mathematics through linear algebra, multivariate calculus and differential equations.
(1.5) The graduates shall have the ability to apply statistical methods to analyze and interpret data.
(1.6) The graduates shall have an understanding of the chemistry and physics that are fundamental to mechanical engineering.

Objective 2: To develop the student's professional skills that enable a successful career.

Supporting Outcomes:
(2.1) The graduates shall have the ability to perform engineering economic analyses.
(2.2) The graduates shall have the ability to plan, schedule and execute engineering projects.
(2.3) The graduates shall have effective oral and written communication skills.
(2.4) The graduates shall have an understanding of professional and ethical responsibility.
(2.5) The graduates shall have the ability to function on multi-disciplinary teams.
(2.6) The graduates shall have an understanding of and the ability to engage in life-long learning.

Objective 3: To provide the student with the broad education necessary to practice engineering in a global and societal context.

Supporting Outcomes:
(3.1) The graduates shall have an appreciation for the role of engineering in modern society.
(3.2) The graduates shall have an appreciation for literature, fine arts and humanities.
(3.3) The graduates shall have, in one foreign language, the ability to comprehend the topic and main ideas on familiar subjects.

Program Management Structure

Each educational objective has associated with it a set of specific, measurable learning outcomes. This creates a two-part procedure for the continuous improvement of the program. The first part is to determine the program outcomes that are necessary and sufficient to achieve the program objectives. The second part of the continuous improvement process is to determine achievement of the various program outcomes. A management structure for the assessment process, as documented below, and a two-part process of outcomes assessment facilitate improvement of the curriculum.
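To make this two-part procedure concrete, the following illustrative Python sketch aggregates outcome-level achievement into an objective-level judgment. The outcome scores are invented, and the minimum-score rule is one possible aggregation choice, not the department's stated method.

    # Outcomes supporting each ME program objective (from Table 1).
    objective_outcomes = {
        "Objective 1": ["1.1", "1.2", "1.3", "1.4", "1.5", "1.6"],
        "Objective 2": ["2.1", "2.2", "2.3", "2.4", "2.5", "2.6"],
        "Objective 3": ["3.1", "3.2", "3.3"],
    }

    # Hypothetical assessed achievement of each outcome (0.0 - 1.0), e.g.,
    # the fraction of graduates meeting a performance criterion.
    outcome_scores = {"1.1": 0.85, "1.2": 0.90, "1.3": 0.80, "1.4": 0.75,
                      "1.5": 0.70, "1.6": 0.88, "2.1": 0.82, "2.2": 0.78,
                      "2.3": 0.86, "2.4": 0.91, "2.5": 0.84, "2.6": 0.72,
                      "3.1": 0.80, "3.2": 0.77, "3.3": 0.65}

    # Part two of the procedure: judge each objective by the achievement of
    # its supporting outcomes (here, the weakest one, so a single weak
    # outcome flags the whole objective for attention).
    for obj, outcomes in objective_outcomes.items():
        weakest = min(outcomes, key=outcome_scores.get)
        print(f"{obj}: weakest outcome {weakest} at {outcome_scores[weakest]:.2f}")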

The management of the undergraduate program assessment process is carried out by the department chair, three of the department's standing committees and three program area assessment teams as shown in the organizational chart diagrammed in Figure 4.


Figure 4. Management structure for Mechanical Engineering undergraduate program improvement

All faculty participate in one of three program area assessment teams: Thermal-Fluids and Engineering Analysis; Mechanics and Materials; and Design and Realization. Each team includes faculty members who teach courses in those general areas. These teams support the assessment process through course and curriculum evaluation and improvement.

The Undergraduate Committee is responsible for undergraduate courses and curriculum assessment and improvement, for coordinating the activities of the three program area teams (see Figure 4), and for handling undergraduate student petitions. This committee's membership is appointed by the Department Chair and consists of at least one member from each program area team.

The Computing Committee's responsibilities include ensuring that the computational resource infrastructure is adequate to support the delivery of the undergraduate program. The Facilities Committee manages teaching laboratory space. Additional departmental committees are involved on an as-needed basis. The management structure enables the program and course improvement processes shown schematically in Figure 5.



Figure 5. The continuous quality improvement process: (a) program outcomes assessment and program improvement, and (b) course objectives assessment and course improvement.

The first loop (a) shows the program assessment process. Program strategies are the curriculum and courses that lead to the student outcomes. The program indicators are the assessment measures and levels of performance desired. The second loop (b) summarizes the process of establishing and assessing course objectives, strategies and indicators and making course improvements. These loops are discussed below.

Program Improvement Process

The Department Chair reviews all results from the assessment instruments and determines the areas on each survey or report that are within the responsibilities of the departmental committees. The Chair asks the appropriate departmental committee to analyze the information and determine the appropriate response. The committee prepares a written response that is presented at a Department Faculty meeting. If changes are needed, the committee will prepare and make a motion at a called Department Faculty meeting. Recommendations requiring coordination at the College level will be reported to the Executive Committee by the Chair or his designee.

For example, the college-wide survey information from the student course evaluations and senior exit survey are distributed and addressed in the following manner. The Department Chair receives the departmental results. For results dealing with instructor performance, the Chair is responsible for counseling faculty to make improvements. All results dealing with course content go to the Undergraduate Committee. When appropriate, the Undergraduate Committee involves the appropriate Program Area Team of faculty to determine the proper course of action.

The committee prepares a written report for the Chair by the end of the following semester. Any recommendations needing faculty approval are to be made at a faculty meeting. The recommendations consist of the problem statement, the planned activity, personnel requested for the activity, resources needed and a time schedule for accomplishing the activity. The Chair is responsible for providing the appropriate resources to facilitate implementation of the recommendations. The Undergraduate Committee reexamines recommendations on an annual basis and provides the Chair with a report outlining the impact (what was implemented, how successful it was, what could not be implemented and why) of the activity.
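A recommendation of this form could be captured as a simple structured record, as in the illustrative Python sketch below; the field values are hypothetical examples, not actual committee output.

    from dataclasses import dataclass, field

    @dataclass
    class Recommendation:
        """The elements required of an Undergraduate Committee recommendation."""
        problem_statement: str
        planned_activity: str
        personnel: list
        resources: str
        schedule: str
        impact_notes: list = field(default_factory=list)  # filled at annual review

    # Hypothetical example record.
    rec = Recommendation(
        problem_statement="Seniors report low confidence in statistics (invented).",
        planned_activity="Add a statistical data-analysis module to the senior lab.",
        personnel=["lab coordinator", "one Program Area Team member"],
        resources="software licenses for the senior laboratory",
        schedule="implement next fall; review impact the following spring",
    )
    rec.impact_notes.append("Outcome recorded for the annual report to the Chair.")
    print(rec.problem_statement)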

Course Improvement Process

The second loop in Figure 5 shows the process by which individual courses are periodically reviewed. The Program Area Teams are primarily responsible for reviewing and improving individual courses in the curriculum. The primary assessment instruments for this process are the Course Portfolios and results from the Course Surveys that are provided by the course instructor. Changes to course objectives and strategies require approval of the Undergraduate Committee. If the Program Area Team finds that a curriculum or course description change is required, then faculty and university-level approval will be obtained in accordance with the process outlined in the first loop of Figure 5.

Outcomes Assessment

The Mechanical Engineering Department began formalizing its assessment plan in the 1997-1998 academic year. The assessment measures included in the plan and the status of their implementation are shown in Table 2 below. A summary of how the assessment measures used in 1999 related to the program outcomes is presented in Table 3.


Table 2
Methods used to ensure achievement of the program outcomes and to obtain results to improve the effectiveness of the program

Assessment measures (the original table marks the implementation year of each measure: prior to 1997, 1997-1998, 1998-1999, or 1999-2000):
1. Graded Coursework
2. USC Foreign Language Test
3. Course Surveys
4. Graduating Senior Exit Surveys
5. Alumni Survey
6. Senior Design Advisor Survey
7. Senior Design Student Survey
8. Senior Laboratory Student Survey
9. Course Portfolio Assessment
10. Employer Survey
11. Longitudinal Tracking Research

Table 3
Relationship between assessment measures used in 1999 and Mechanical Engineering program outcomes

Program assessment tools (table columns): Graded Coursework; USC Foreign Language Test; Course Survey; Senior Exit Survey; Alumni Survey; Senior Design Advisor Survey; Senior Design Student Survey; Senior Laboratory Student Survey.

Program outcomes (table rows): 1.1 analyze, design and realize; 1.2 computation techniques; 1.3 design and interpret experiments; 1.4 apply mathematics through calculus; 1.5 apply statistical methods to data; 1.6 understand chemistry and physics; 2.1 engineering economic analyses; 2.2 plan and execute projects; 2.3 oral and written communications; 2.4 professional responsibility; 2.5 multi-disciplinary teams; 2.6 life-long learning; 3.1 engineering in modern society; 3.2 literature, arts, humanities; 3.3 foreign language.

(The marks indicating which tools assess which outcomes appear in the original printed table.)
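Table 3 is essentially a coverage matrix of outcomes against tools. The illustrative Python sketch below uses a small, hypothetical subset of such a mapping (the actual marks are in the printed table) to show how a matrix of this kind supports an automatic check that every outcome is assessed by at least one tool.

    # Hypothetical outcome-to-tool mapping in the spirit of Table 3; the
    # real assignments appear only in the original printed table.
    coverage = {
        "1.4 apply mathematics": ["Graded Coursework", "Senior Exit Survey"],
        "2.3 communications": ["Senior Design Advisor Survey", "Alumni Survey"],
        "3.3 foreign language": ["USC Foreign Language Test"],
        "2.6 life-long learning": [],  # deliberately empty to trigger the check
    }

    for outcome, tools in coverage.items():
        if not tools:
            print(f"WARNING: no assessment tool covers outcome '{outcome}'")
        else:
            print(f"{outcome}: assessed by {len(tools)} tool(s)")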


Chemical Engineering Program

Mission and Goals

The mission statement of the Department of Chemical Engineering was developed with the aid of the Industrial Advisory Board in 1989, and was first published in the 1993-1994 Annual Report. The mission statement reads as follows:

“We will develop high quality chemical engineers by continuously improving our undergraduate and graduate programs. We will conduct world class research and innovative teaching, providing an environment for professional development, and be an effective resource for industry, government, and academia.”

Program educational objectives are the broad characteristics or features that describe the attributes that our Bachelor of Science graduates obtain through our program. Faculty members in the Department of Chemical Engineering established, reviewed and improved program objectives and published them in the University of South Carolina Undergraduate Studies Bulletin. The two educational objectives for the Chemical Engineering program are given below:

1. Provide the student with a thorough grounding in mathematics, chemistry, and in chemical engineering subjects.

2. Prepare the student for a professional career or graduate studies in chemical engineering and other fields.

Objective 1 refers primarily to that technical content of the curriculum that broadly defines Chemical Engineering education, as established by current chemical engineering practice in industry, and as articulated by the American Institute of Chemical Engineers (AIChE):

“The program must demonstrate that graduates have: thorough grounding in chemistry and a working knowledge of advanced chemistry such as organic, inorganic, physical, analytical, materials chemistry, or biochemistry, selected as appropriate to the goals of the program; working knowledge, including safety and environmental aspects, of material and energy balances applied to chemical processes; thermodynamics of physical and chemical equilibria; heat, mass, and momentum transfer; chemical reaction engineering; continuous and stage-wise separation operations; process dynamics and control; process design; and appropriate modern experimental and computing techniques.”

Objective 2 refers primarily to those additional skills, experiences, perspectives, and training that transcend and unify the undergraduate curriculum so that the student is prepared for a professional career after graduation. Specifically, our Bachelor of Science graduates should be able to succeed in a career in chemical engineering (including, for example, employment in a manufacturing plant, engineering design firm, or consulting firm). Furthermore, some of our students may wish to pursue graduate studies in chemical engineering, or to pursue other professional careers (such as medicine, business, or law).


Objectives and Outcomes

Table 4
List of program educational objectives and outcomes for Chemical Engineering

Educational Objective 1: Provide the student a thorough grounding in mathematics, chemistry, and chemical engineering subjects.

Supporting Outcomes:
1. Ability to apply knowledge of mathematics, chemistry, and engineering in chemical engineering practice.
2. Understanding of chemical engineering science fundamentals.
3. Ability to design a chemical engineering system, unit, or chemical process to meet desired needs.
4. Ability to design and conduct laboratory experiments, as well as to analyze and interpret data using factorial design methods.
5. Ability to use chemical process simulators and other techniques, skills, and modern engineering tools necessary for chemical engineering practice.

Educational Objective 2: Prepare the student for a professional career in the chemical process industries or graduate studies in chemical engineering and other fields.

Supporting Outcomes:
6. Ability to present technical material through oral presentations with visual aids.
7. Ability to present technical material, including analysis and conclusions, through technical reports.
8. Ability to work in multi-functional teams.
9. Ability to find information and to learn independently.
10. Understanding of professional and ethical responsibility.
11. Awareness of economic, political, and social issues.
12. Ability to comprehend the topics and ideas of familiar subjects in a foreign language.

Management Structure

The constituents and other stakeholders in the Department of Chemical Engineering monitor the results of our program and assist in the improvement of our undergraduate program through planning, assessment, and recommendations. Improving our program requires establishing processes and defining roles for the participants in those processes. The stakeholder organization for the Chemical Engineering program has evolved over the course of several years, principally since 1987. These individuals and committees, shown in Table 5, comprise the infrastructure necessary for the continuous quality improvement process within the Department of Chemical Engineering.


Table 5
Stakeholder Organization within the Department of Chemical Engineering

Department of Chemical Engineering: Chair; Faculty
Departmental committees and coordinators: Undergraduate Curriculum Committee (est. 1995); Faculty Search Committee; Laboratory Committee (est. 1994); T&P Committee; Co-op Coordinator (est. 1997)
External advisors: ECHE Industrial Advisory Board (IAB, est. 1987)
Students: Undergraduate chemical engineering students (Lower Division and Upper Division); AIChE Student Chapter

The Department Chair performs several important functions for program assessment and improvement. These activities are itemized below:

- Schedules all faculty meetings and prepares agenda items, including student petitions and suggestions for improving the undergraduate curriculum.
- Reviews all course evaluations and discusses these with individual faculty members as part of their written annual review.
- Appoints members to the various departmental committees.
- Keeps abreast of curriculum developments in the chemical engineering profession and at peer institutions.
- Meets each year with the department chairs from other Chemical Engineering departments in the Southeast.
- Prepares an annual report that is submitted to the Dean of the College of Engineering.

The Undergraduate Curriculum Committee is charged with the continuous improvement of the undergraduate curriculum. For example, faculty members on this committee consider such issues as rearranging the curriculum, modifying prerequisites, adding a new elective course, etc.

Program Improvement Processes

The specific processes for monitoring and improving our program objectives and the involvement of the various participants in those processes are outlined in Table 6. The table provides a summary of the specific processes we use, identifies the primary leader responsible for scheduling and assuring that the process takes place, the participants (including committees and offices as described above), and the primary documentation of the various processes.


Table 6
Monitoring and Improving Program Educational Objectives: Processes, Leaders, Participants, and Documentation

Process: IAB meetings (est. 1987). Leaders: Dean and Dept. Chair. Participants: Chair, faculty, students, industrial members of the advisory board. Documentation: IAB minutes; Annual Report; State of Dept. notes.

Process: Strategic planning day (est. 1987). Leader: Dept. Chair. Participants: Faculty, facilitator from industry. Documentation: Planning day minutes.

Process: Course review sessions (est. 1994). Leader: Dept. Chair. Participants: Faculty. Documentation: Files for courses; UG Student Handbook.

Process: Southeast dept. heads meeting. Leaders: SE Dept. Heads. Participant: Dept. Chair. Documentation: Dept. Chair report.

Process: Senior surveys (est. 1998). Leader: Assessment office/COE. Participants: Dept. Chair, faculty, students. Documentation: Summary report.

Process: AIChE initial placement survey. Leader: AIChE. Participants: Faculty, students. Documentation: Summary table for AIChE.

Process: Alumnae/Alumni survey (est. 1998). Leader: Assessment office/COE. Participants: Dept. Chair, faculty, alumni. Documentation: Summary report.

Process: Student advising (est. 1989). Leader: Dept. Chair. Participants: Faculty, students. Documentation: Student's files.

Process: AIChE, ASEE, or other professional society meetings/publications. Leaders: Faculty. Participants: Faculty, AIChE, ASEE, other professional societies. Documentation: Various minutes of the ECHE meetings.

Process: Faculty meetings. Leader: Chair. Participants: Faculty. Documentation: Faculty meeting minutes.

Strategic Planning Day - Established in 1987, the Strategic Planning Day is an annual retreat at which faculty members discuss long-range plans and strategies and address major issues facing the department. The meeting concludes with a set of action items and recommendations for further study and possible action. Members of the Industrial Advisory Board also attend this retreat, and one of them provides a professional staff person to facilitate the day's activities. Minutes of the meeting provide a permanent record that is kept on file in the department office.

Course Review Sessions - At least once each year, but typically twice each year, the Chemical Engineering faculty hold a “Course Review” meeting. These meetings were instituted in 1994. The purpose of these sessions is for the faculty to review the course syllabi, exchange ideas on teaching innovations, and to review the performance of students in our courses. At these meetings, the faculty members review the performance of students in the chemical engineering courses and gain an overall perspective on students’ understanding of chemical engineering science fundamentals. Because chemical engineering courses require knowledge of mathematics, science, and general engineering, these reviews provide the faculty an opportunity to assess whether the students are adequately prepared to perform well in chemical engineering courses. The Course Reviews also provide a forum for discussing prerequisites, performance of students in team settings, availability and suitability of computers and software, recent initiatives from industry and the chemical engineering profession, etc. The notes and materials from the course review sessions are kept on file in the department office.


Senior Surveys - Within ECHE, the faculty members receive a copy of the summary and the tabular report of results, and the faculty members analyze and discuss these results at faculty meetings, Advisory Board meetings, or Planning Days as necessary. The Department takes internal action and institutes corrective measures when improvements are needed. The Department also notes positive findings and endeavors to maintain positive processes and approaches.

AIChE Initial Placement Survey - The national headquarters of AIChE distributes a survey asking academic departments to report the initial placement of their Bachelor of Science graduates. Categories of placement include industrial (materials, biotechnology, chemicals, fuels, etc.), government (federal, state, local), graduate school, returned overseas, unknown employment, other employment, and unemployed. The data are collected from graduating students directly or through faculty and staff. This report provides the faculty with an ongoing record of the success of students in placement, and trends in placement and is kept on file in the department office.
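As a purely illustrative Python sketch (the records are invented), placement data of this kind can be tallied by category and year to expose the placement trends this report tracks:

    from collections import Counter

    # Hypothetical initial-placement records: (graduation year, category).
    placements = [
        (1998, "industrial"), (1998, "graduate school"), (1998, "industrial"),
        (1999, "industrial"), (1999, "government"), (1999, "graduate school"),
        (1999, "industrial"),
    ]

    # Tally placements per year to track trends across survey cycles.
    for year in sorted({y for y, _ in placements}):
        tally = Counter(cat for y, cat in placements if y == year)
        print(year, dict(tally))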

Alumnae/Alumni Survey – This survey was initially administered in the Fall 1998 semester to 1995 College of Engineering graduates. The Alumnae/Alumni survey is given to graduates who have been working or attending graduate/professional school for three years. A report analyzing and synthesizing the survey findings is distributed to each program by the COEIT Office of Assessment.

Student Advising - All Chemical Engineering undergraduate students are assigned a Department of Chemical Engineering faculty academic advisor. Students are required to meet with their advisor at least once per semester, during the two Advising Weeks scheduled by the COE. At this time advisors review the students' files, which contain an updated, unofficial transcript, a list of courses currently being taken, and forms for tracking the student's progress toward completing Lower Division, Upper Division, and humanities/social studies requirements. The faculty advisor checks grades for acceptable progress, including meeting course prerequisites. These meetings also provide an opportunity for the advisor and the student to discuss the profession in general, including possible co-op and summer intern opportunities, as well as the student's career objectives and the objectives and educational outcomes of the ECHE program. The results of this process, including transcripts, are documented in the student's file.

Additional Assessment Methodologies

For ease of reference, the following table summarizes the various assessment methods utilized to obtain quantitative and qualitative data for each outcome.


Table 7
Assessment Methods and Processes

Each entry lists the method of assessing and ensuring achievement of the objective, the data associated with the method, and the location of the data.

Method: COSM monitors math and science prerequisites and grades. Data: Student grades in all math and science courses; transcripts. Location: Student files/UG Student Services.

Method: Advising (monitoring prerequisites and progress). Data: Student transcripts, advising files, upper and lower division course requirements, degree checks. Location: Student files/UG Student Services.

Method: Individual course assignments and evaluation of homework, exams, projects, experiments, oral presentations, written reports, or design projects. Data: Student or team work, instructor grades and comments. Location: ECHE Department/faculty member records.

Method: Faculty Course Review Days. Data: Course-specific notes, including syllabi and recommendations for improvement. Location: ECHE Office.

Method: End-of-semester student course evaluations. Data: Tabulations and statistical summaries of student responses; student written comments. Location: ECHE Office.

Method: Senior Survey. Data: Student responses to directed questions. Location: ECHE Office.

Method: Student confidential evaluations of team members' performance. Data: Student confidential evaluation forms. Location: ECHE Department/faculty member records.

Improving Program Effectiveness

Quality improvement processes for the Department of Chemical Engineering have been utilized extensively and effectively for identifying and implementing changes that are designed to improve the overall effectiveness of the program in achieving its educational objectives. The following table lists many of the needs that have been identified by our constituents, the actions that have been taken, and the changes in our program that have resulted. The table also shows which processes have been involved in identifying and implementing changes. All of the items are documented in the minutes and summary reports mentioned above.


Table 8
Documented Changes and Improvements Resulting From Quality Improvement Processes

Improvement needed: Growth in number of faculty to maintain quality of program with increased research. Process(es): IAB, Strategic Planning Day. Action(s): Five-year plan written. Result(s): Faculty increased from 7 to 14 between 1987 and 1997; 1 new faculty hire for Fall 99.

Improvement needed: Increased practical experience and desirability of our graduates to industry. Process(es): IAB, Strategic Planning Day, faculty/student advising. Action(s): Appointed co-op coordinator; educated faculty on co-op program; increased emphasis on co-op during advising; developed curriculum flow sheets for co-op students. Result(s): Number of co-op participants increased from 8 (AY 96-97) to 26 (AY 97-98) and 18 (AY 98-99).

Improvement needed: Faculty to better understand the curriculum and the input/output skills of students; communicate expectations to students. Process(es): Course Review Day. Action(s): Input/output skills discussed and articulated; UG Handbook written and updated annually; established UG Curriculum Committee. Result(s): Faculty better understand the curriculum and expectations of students; better coordination across the curriculum.

Improvement needed: More flexibility in chemistry sequence. Process(es): Professional society; IAB; student advising. Action(s): Changed required 2-semester Physical Chemistry sequence to 2 semesters of advanced chemical electives; developed list of acceptable electives. Result(s): 2 semesters of elective chemistry available; students are taking electives.

Improvement needed: More engineering electives desired. Process(es): Student advising; professional society. Action(s): Changed Dynamics (ENGR 210) from required to elective course. Result(s): Students have one additional engineering elective course; students are taking additional electives.

Improvement needed: Prerequisite sequence causing heavy burdens in junior/senior year; interference with capstone design and safety. Process(es): Course review days; student advising; faculty meetings. Action(s): Revised ENGR prerequisites; rearranged courses; moved ECHE 550 to junior year. Result(s): More flexibility and fewer hours in the senior year.

Improvement needed: Lab courses need to provide reinforcement of fundamentals and a more structured approach to writing and oral presentations. Process(es): Course review days; professional societies; IAB; student advising. Action(s): Established UO Committee; re-wrote UO Lab Manual and restructured course. Result(s): Clearly defined educational, writing, and speaking objectives; more structured course; several new experiments added; UO Lab manager hired.

Improvement needed: Students interested in environmental issues. Process(es): Student advising; faculty meetings; professional society. Action(s): Created new course (ENGR 540) as allowed engineering elective. Result(s): Course offered twice since 1995.


Civil Engineering Program

Mission and Goals

The Department of Civil and Environmental Engineering developed its five-year strategic plan in November 1995. In order to support the University and College missions, the CEE Department adopted the following mission statement.

The mission of the Department of Civil and Environmental Engineering is to:

Provide quality and essential education to undergraduate and graduate students through formal classes, and support life-long learning through continuing education short courses and workshops.

Encourage and support research that will contribute to the competence and professional development of the faculty and broaden the body of engineering knowledge and methods.

Provide service to the college and university; to local, state, and federal governments; to private industry; and to professional organizations and society.

Objectives and Outcomes

Table 9
Civil Engineering Program Objectives and Outcomes
(Small letters in parentheses refer to ABET criteria; capital letters refer to ASCE program criteria.)

1. Provide an education in which the students will be able to integrate fundamental mathematics and science concepts to understand and solve civil engineering problems.

1.1 The graduates will have the ability to apply mathematics through vector calculus and differential equations to solve engineering problems. (a, e, k – A)

1.2 The graduates will have the ability to apply probabilistic and statistical methods to analyze and interpret data. (a, e, k – A)

1.3 The graduates will have the ability to apply an understanding of calculus-based physics and general chemistry to solve engineering problems. (a, e, k – A)

42

Page 43: A college-wide assessment infrastructure was · Web viewIn this model, the program evaluation process documents progress towards achievement of objectives established by the engineering

2. Provide an education in which the students acquire and apply broad-based knowledge of fundamental principles in a minimum of four discipline areas of civil engineering to the solution of complex practical engineering problems.

2.1 The graduates will have the ability to identify, formulate and solve engineering problems within the environmental, geotechnical, structural and water resources discipline areas. (a, e, k – A, B)

2.2 The graduates will have the ability to analyze and design civil engineering systems. (c, e, j, k – B, D)

2.3 The graduates will have the ability to design and conduct experiments, and to analyze and interpret data within the various civil engineering disciplines. (b, j, k – C, B, D)

3. Provide a broad education that prepares the students for the future challenges of the Civil Engineering profession.

3.1 The graduates will have an appreciation for the role of engineering in history and modern society. (h, j)

3.2 The graduates will have an appreciation for literature, fine arts, and humanities. (g, h, j)

3.3 The graduates will have the ability to comprehend the topics and ideas of familiar subjects in a foreign language. (h)

4. Provide an education that develops business and other professional skills necessary to practice engineering.

4.1 The graduates will have the ability to plan, schedule and execute engineering projects. (k, d, g – D, E)

4.2 The graduates will have the ability to perform engineering economic analyses. (k – D, E)

4.3 The graduates will have the ability to function on multi-disciplinary teams. (d – D, E)

4.4 The graduates will develop oral and written communication skills. (g – E)

4.5 The graduates will have an understanding of professional and ethical responsibility. (f – E)

4.6 The graduates will have the ability to engage in life-long learning. (i – E)

4.7 The graduates will have the ability to use modern tools and techniques to solve engineering problems. (k)

Management Structure

As shown in the following organizational chart (Figure 6), the management structure of the Department is facilitated by the Department Chair, an Undergraduate Program Director, and four sub-disciplinary Program Coordinators.


Figure 6. Organization chart (Department Chair; Undergraduate Program Director; Undergraduate Program Committee; Environmental, Geotechnical, Structures, and Water Resources Programs)

The Undergraduate Program Director is responsible for student advising, student enrollment, and awards and scholarships. The Undergraduate Program Committee is responsible for the undergraduate program, courses, and curriculum assessment and improvement. Except for the College-wide committees that have been described in previous sections, the following table outlines additional department and college offices and committees that support the Department’s operations and help facilitate the continuous quality improvement of the program.


Table 10
Supporting Assessment Infrastructure for the Civil Engineering Program

Advisory Committee (standing): Program coordinators advise the Department Chair on all issues pertaining to the operation of the department.

Undergraduate Program Director and Undergraduate Program Committee (standing): The director is responsible for student advising, student enrollment, and awards and scholarships. The committee is the caretaker of all assessment processes for the undergraduate curriculum.

Strategic Planning Committee (ad hoc): Charged to review and revise the Department's 1995 Strategic Five-Year Plan.

Graduate Program Director and Graduate Studies Committee (standing): The Director is responsible for graduate student advisement, evaluation of applications, clearing students for a degree, coordination of examinations, and nomination of graduate students for awards and fellowships. The committee is directly involved in all aspects of the graduate program.

Student Advisory Committee (standing): Advises the Department Chair about special concerns of students and provides input concerning the curriculum.

Coordinator of Community Activities (standing): The coordinator is responsible for facilitating service projects for undergraduate students.

Industrial Advisory Board (standing): Advises and helps the department improve the program. The board meets twice per year and provides expertise to improve the department in the areas of research, curriculum, placement, and fund raising.

Undergraduate Curriculum Committee (ad hoc): Charged to review, revise and update the curriculum to prepare CEE undergraduates for engineering practice in the 21st century.

Partnership Board (meets twice per year): Comprised of individuals who have achieved leadership roles in industry and engineering from around the state and nation. The board advises and helps the college improve its programs, providing expertise in the areas of research, curriculum, placement, and fund raising.

Engineering Career Services (two full-time College positions): This office is a branch of the University Career Services office. It serves as the liaison between our programs and prospective employers from industry and government, and is also responsible for locating companies that wish to employ co-op students and summer interns.

Assessment System Overview

The Civil Engineering Program Assessment System was adopted in the 1998-99 academic year. The assessment process captures the Two-Loop Model (shown below) proposed by ABET (EC2000) by applying continuous quality improvement to the development and assessment of program objectives and outcomes.


The left-hand loop describes the establishment, assessment and continuous quality improvement of the program objectives. A two-part evaluation procedure for the program objectives is based on the relationship of the measurable program outcomes to the program objectives. The first part is to determine the program outcomes that are necessary and sufficient to achieve the program objectives. This is an integral part of the process of establishing and periodically evaluating the program objectives described earlier and includes input from the program's constituencies.

The second part of the evaluation procedure for the program objectives is to determine achievement of the various program outcomes. The CEE Department has adopted assessment measures that include:

Instructor grades
Faculty evaluation of course portfolios
Course survey
Senior exit survey
Alumni survey
Employer focus groups/employer survey
Summary of the FE Exam Results (Report 5)
Transcript analysis and student advising
Industrial Advisory Board (IAB) input


The following assessment schedule has been adopted to evaluate data from these assessment measures.

Table 11
Schedule for assessment methodologies within the Civil Engineering Program

Senior Survey - Frequency: each semester. Source: College. Action: summarize, review by UGPC*, save.
Alumni Survey - Frequency: annual. Source: College. Action: summarize, review by UGPC, save.
FE Exam Summary - Frequency: semiannual. Source: State/College. Action: summarize, review by UGPC, save.
Course Survey (student evaluation of instructor and course) - Frequency: each semester. Source: College/students. Action: summarize, review by chairperson.
Course Portfolio (course evaluation and improvement) - Frequency: each semester. Source: Department/instructors. Action: review by UGPC, save.
Instructor Grades (evaluation of students) - Frequency: each semester. Source: Department/instructors. Action: save.

* UGPC = Undergraduate Program Committee
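The schedule in Table 11 can be read as a recurrence rule for each instrument. The Python sketch below is illustrative only; the frequency encodings, and the assumption that annual and semiannual items run in particular terms, are made up for the example.

    # Assessment instruments and how often they recur (from Table 11):
    # "semester" items run every term, "annual" once per year, and
    # "semiannual" twice per year.
    schedule = {
        "Senior Survey": "semester",
        "Alumni Survey": "annual",
        "FE Exam Summary": "semiannual",
        "Course Survey": "semester",
        "Course Portfolio": "semester",
        "Instructor Grades": "semester",
    }

    def due_this_term(term):
        """Return the instruments to administer in a given term.

        Assumes, for illustration, that annual items run in the fall and
        semiannual items run in both fall and spring.
        """
        due = []
        for instrument, freq in schedule.items():
            if freq in ("semester", "semiannual") or (freq == "annual" and term == "fall"):
                due.append(instrument)
        return due

    print("fall:", due_this_term("fall"))
    print("spring:", due_this_term("spring"))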

A schematic of the annual assessment processes within the Civil Engineering Program is presented below.

Figure 7. CEE Assessment Process. The original schematic shows, for each level, the input, the response, and the feedback:

Constituent level - Input: formal assessment input from the constituents.
Department Chair - Response: initial qualitative review.
Undergraduate Program Committee - Input: qualitative review, evaluation of assessment data, dissemination of data. Response: action as required. Feedback: evaluate in the context of the department.
Sub-disciplinary programs - Input: evaluation of assessment data. Response: action as required. Feedback: evaluate in the context of the programs.
Individual faculty - Input: evaluation of assessment data and informal anecdotal student input. Response: action as required. Feedback: modify the course portfolio.

Results Used to Improve Program Effectiveness

The process to annually review the program’s objectives and outcomes includes soliciting input from the IAB, students, alumnae/alumni and employers. Mechanisms for obtaining and analyzing this input are being phased in. The CEE Assessment System comprises the following steps:

1. Develop objectives with constituents
2. Publish objectives
3. Acquire data
4. Interpret data
5. Improve program/curriculum
6. Improve measurement tools
7. Modify program objectives and/or outcomes
8. Report to constituents
9. Review input from constituents

Formal assessment input is generated at a number of constituent sources and introduced to the department through the Chair. Initially, the Chair reviews this information qualitatively to ensure a level of continuity throughout the department and to maintain confidentiality when necessary. The Chair then disseminates the filtered assessment data to the Undergraduate Program Committee for qualitative review and dissemination to the appropriate sub-disciplinary Program Coordinators. Each Program Coordinator reviews and disseminates the data to the appropriate faculty members within the program for individual evaluation. The individual faculty may also receive anecdotal assessment information from students.

Evaluation of the assessment data as it passes through each level of the department may result in a record of recommended strategies and actions to be implemented at one of three levels. The Department Chair and/or the Undergraduate Program Committee may recommend an action based upon how the assessment data relate to the departmental program objectives. At another level, the Program Coordinators, program members, and individual faculty may recommend an action depending on how the assessment data relate to the sub-disciplinary program goals and objectives. The last level involves actions by the individual faculty based on how the assessment data and/or anecdotal student information relate to the course; an action at this level will result in modification of the course portfolio.

Annually, the Undergraduate Program Committee meets to review the assessment results and recommendations and provide an executive summary of the impact of the assessment judgments to the Department Chair. The Undergraduate Program Committee may also recommend changes to the educational objectives of the department, which will go to the Department Chair for consideration by the Department Faculty. Recommendations, such as course or curriculum changes, requiring coordination or approval at the college or university level, will be reported through the Department Chair to the appropriate committee according to University policies and procedures.

The achievement of the program objectives is assessed at various levels:

Each student’s achievement of the course objectives is evaluated by graded performance on examinations, homework, projects, presentations, etc. Course portfolios are maintained by the faculty. The portfolios contain information such as the course syllabi, course objectives, other administrative material, and examples of student work.

Using feedback from the students, instructors evaluate each course to verify that it meets the requirements within the curriculum and program. These evaluations are also available in the course portfolio.

Through the Undergraduate Program Committee, faculty members evaluate the curriculum to determine the extent to which program objectives have been achieved.

The evaluation by the various constituencies provides a measure of program effectiveness in meeting their individual needs.


Electrical Engineering Program

Mission and Goals

The Department of Electrical Engineering has, as its mission, to:

Provide undergraduate and professional education through programs that prepare students for the workplace, stress the development of the total person, and begin a process of lifelong learning.

Provide graduate education and training in the skills of advanced research.

Contribute to the base of technical knowledge by conducting research and scholarship and by disseminating the results of those programs.

Support the engineering professions by service in the appropriate professional organizations.

Serve the needs of the state and region by appropriate outreach programs and by support for industrial development.

The first bullet of the mission statement is the focus of the present discussion because it targets undergraduate education. The undergraduate component specifies three key areas: preparation for the workplace, development of the total person, and lifelong learning. Three broadly stated goals were derived from our mission statement. For each goal, faculty members developed program objectives and student learning outcomes.

Objectives and Outcomes

The program objectives are intended to drive the development of specific desired outcomes. The program goals, objectives, outcomes, and strategies and actions are listed in the following section. The letter(s) in parenthesis cross-indexes to the EC2000 Criterion 3 paragraph a-k.

GOAL 1: Broad Undergraduate Education

Objective 1: The student will develop an awareness of the world around us as necessary to practice engineering in a global economy.

Desired Outcome 1.1: The student will develop a career plan that recognizes current trends in engineering.

Strategies and Actions: The student will place a written discussion of how current events might affect her or his career plan in the career-planning portfolio (g,h,i,j)

Objective 2: The student will study arts, humanities, foreign language, science and mathematics

Desired Outcome 2.1: The student will successfully complete the required science and mathematics curriculum.

Strategies and Actions: The student will successfully complete the courses (a, h)


Desired Outcome 2.2: The student will successfully complete the humanities curriculum.

Strategies and Actions: The student will successfully complete the courses (h)

GOAL 2: Engineering Skills Showing Breadth and Depth

Objective 3: The student will actively participate in a broad educational experience in the fundamentals of engineering with emphasis on electrical engineering

Desired Outcome 3.1: The student will maintain a portfolio that documents his/her academic career. The student will maintain documentation in the portfolio that demonstrates a clear plan for successful negotiation of the curriculum.

Strategies and Actions: The portfolio will be examined each semester as part of the advisement process. A checklist will be maintained by the department to ensure that all requirements are met. The student is responsible for maintaining the portfolio. (g, h, i, j, k)

Desired Outcome 3.2: The student will demonstrate an ability to apply knowledge of mathematics, science and engineering.

Strategies and Actions: Many of the courses in engineering support this; however, each of the EE core courses (211, 212, 221, 222, 331, 351 and 371) is sufficient to demonstrate this ability. The student must successfully complete all these courses to graduate. (a)

Desired Outcome 3.3: The student will demonstrate an ability to design and conduct experiments, including analyzing and interpreting data.

Strategies and Actions: The EE laboratory courses (201, 301, 401, 402) require the student to design and conduct a wide variety of experiments, to analyze the results and to draw conclusions. The student who successfully completes 402 has demonstrated this ability. (b)

Desired Outcome 3.4: The student will demonstrate an ability to design a system, device or process to meet desired needs.

Strategies and Actions: The junior and senior EE laboratory courses (301, 302, 401, 402) require the student to perform elements of design. The capstone laboratory sequence 401 and 402 requires that two designs be put to the test in hardware. (c)

Desired Outcome 3.5: The student will demonstrate an ability to identify, formulate and solve engineering problems.

Strategies and Actions: The capstone laboratory sequence 401-402 requires the student to solve a problem, beginning at the level of identifying the nature of the problem, formulating a solution to it, constructing appropriate hardware and software, testing the system, and reporting results. (e, k)

Objective 4: The student will study, in depth, one or more areas of electrical engineering.

Desired Outcome 4.1: The student will plan which elective courses he or she will take.

Strategies and Actions: In the advisement process, the second-semester junior students will create a plan for elective courses and place it in their career-planning portfolios. The senior students will maintain a current plan document in their portfolios. (g, h, i)

Desired Outcome 4.2: The student will successfully complete the elective courses specified by the curriculum.

Strategies and Actions: The student will successfully complete the elective courses. (a, e, k)

GOAL 3: Professional Skills

Objective 5: The student will demonstrate abilities to communicate effectively and to work as a productive member of teams.

Desired Outcome 5.1: The student will demonstrate an ability to function on multi-disciplinary teams.

Strategies and Actions: The student will work on multidisciplinary teams in the following courses: EECE 221, EECE 401-402. In the latter two, the student will perform a multidisciplinary design on a team with Computer Engineering students. (d)

Desired Outcome 5.2: The student will demonstrate an ability to communicate effectively.

Strategies and Actions: Throughout the laboratory sequence, the student will be graded on communication (both written and oral skills). Students who successfully complete the laboratory sequence will demonstrate this ability. In particular, the ECE Writing Center introduces writing styles in EECE 201. Writing Center consultants, the laboratory Teaching Assistants, and the instructor meet at the end of each term to review reports and quiz results, and to make recommendations for changes the following semester. (g)

Desired Outcome 5.3: The student will demonstrate an understanding of professional and ethical responsibility.

Strategies and Actions: A reflective writing exercise on engineering professional ethics will be included in the senior laboratory sequence (f, g). An elective course, Ethics in Science and Engineering, offered by the Philosophy Department, may also be used to demonstrate this ability.

Objective 6: The student will demonstrate the ability to engage in career-long professional development.

Desired Outcome 6.1: The student will demonstrate the ability to build on previous experience and to begin work in a new field.

Strategies and Actions: The laboratory sequence requires the student to solve problems requiring knowledge beyond that covered in the curriculum course work. (i)
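Because the same a-k letters recur across many outcomes, it can help to keep the cross-index in a small machine-readable table and check Criterion 3 coverage automatically. The sketch below is illustrative only: the outcome numbers and letters are taken from the listing above, but the variable names and the coverage report are our own assumptions, not part of the department's documented process.

# Illustrative sketch: the outcome-to-EC2000 cross-index from the listing
# above, kept as a Python dict so Criterion 3 coverage can be checked
# mechanically. Outcome IDs and letters come from the listing; everything
# else is a hypothetical convenience.
EC2000_CROSS_INDEX = {
    "1.1": "ghij",  "2.1": "ah",  "2.2": "h",
    "3.1": "ghijk", "3.2": "a",   "3.3": "b",
    "3.4": "c",     "3.5": "ek",  "4.1": "ghi",
    "4.2": "aek",   "5.1": "d",   "5.2": "g",
    "5.3": "fg",    "6.1": "i",
}

# Invert the index: which desired outcomes support each Criterion 3 item?
coverage = {letter: [] for letter in "abcdefghijk"}
for outcome, letters in EC2000_CROSS_INDEX.items():
    for letter in letters:
        coverage[letter].append(outcome)

for letter in "abcdefghijk":
    supported_by = ", ".join(sorted(coverage[letter])) or "NOT COVERED"
    print(f"Criterion 3({letter}): {supported_by}")

Run against the listing above, every item a-k is supported by at least one desired outcome; the same check would flag any gap introduced by a future revision of the outcomes.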


Program Management Structure

Overview of the Assessment Process

The educational objectives are the primary forcing functions for desired outcomes and for strategies and actions. The desired outcomes flow directly from the objectives, the strategies and actions flow from the desired outcomes, the curriculum and other learning experiences flow from the strategies and actions, and the actual outcomes are produced by the students as they progress through the curriculum and other experiences. Finally, an assessment process is used to measure the actual outcomes, which are compared to the desired outcomes. Corrective actions are taken whenever there are serious differences between the desired and actual outcomes, thus closing the continual quality improvement loop.

The figure below shows a high-level view of the continual quality improvement loops. The loop on the left shows setting educational objectives using input from constituencies or stakeholders, while the loop on the right shows setting desired outcomes based on the objectives, designing curriculum, and measuring the actual outcomes. Note that the block in the middle has several functions, but its main function is to tie the two loops together so that the system works coherently.

[Figure 8. Diagram showing assessment loops. Blocks: Input from constituencies; Determine educational objectives; Determine outcomes required to achieve objectives; Determine strategies and actions to achieve desired outcomes; Formal instruction and other student experience; Measure actual outcomes; Evaluate/Assess.]

A graphical view of the process for setting desired outcomes and measuring actual outcomes is given in the figure below, which draws a formal analogy to a closed-loop control system. Of course, this analogy should not be stretched too far, since the students and faculty are people and not mechanisms, but it is helpful in presenting the concepts of desired and actual outcomes. The desired outcomes are compared, using some metrics, to the actual outcomes, and corrective actions are taken to make the actual outcomes track the desired outcomes.

[Figure 9. Analogy between continual quality improvement loop and closed-loop control system. Blocks: Mission; Goals & Objectives; Desired Outcomes; Desired Metrics; Strategies and Actions; Measures and Metrics; External Assessment Information; Measured Outcomes; Actual Outcomes.]
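To make the comparison step in Figure 9 concrete, the following sketch implements the error-signal idea in a few lines. It is a toy illustration under our own assumptions (a common 0-100 scale, invented outcome names and scores, and a fixed tolerance), not the program's actual metrics.

# Toy illustration of the Figure 9 loop: compare measured outcomes against
# desired outcomes and flag gaps that call for corrective action. The
# outcome names, scores, and tolerance are hypothetical.
desired = {"design ability": 80, "communication": 85, "teamwork": 75}
measured = {"design ability": 72, "communication": 86, "teamwork": 60}

TOLERANCE = 5  # gaps larger than this trigger a change in strategies/actions

for outcome, target in desired.items():
    gap = target - measured[outcome]  # the "error signal" in the analogy
    if gap > TOLERANCE:
        print(f"{outcome}: measured {measured[outcome]}, desired {target}; "
              f"gap {gap} -> revise strategies and actions")
    else:
        print(f"{outcome}: within tolerance (gap {gap})")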

Equally important is a conceptual framework for continual quality improvement. We have been using the Capability Maturity Model [Paulk et al., "Capability Maturity Model, Version 1.1," IEEE Software, Vol. 10, No. 4, July 1993, pp. 18-27], developed by the Carnegie Mellon Software Engineering Institute for assessing software developers' processes, as such a conceptual framework for the evaluation of our academic processes. The figure below shows the maturity levels of a hypothetical process. There is empirical evidence, at least in the case of software developers, that processes at a maturity level of 3 or above tend to stay at a high level, while those that are lower tend to fall back to a lower level. Thus, our goal is to improve the maturity level of all the key academic processes to level 3 or higher.

Figure 10. The Capability Maturity Model
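For readers without the figure at hand, the five maturity levels from Paulk et al. can be written out directly. The sketch below encodes them and applies the level-3 goal to some academic processes; the process names and ratings are invented for illustration, not actual departmental assessments.

from enum import IntEnum

# The five CMM maturity levels of Paulk et al. (1993). Applying them to
# academic processes, as the text proposes, is sketched here with
# hypothetical process names and ratings.
class Maturity(IntEnum):
    INITIAL = 1      # ad hoc; success depends on individual effort
    REPEATABLE = 2   # basic discipline; earlier successes can be repeated
    DEFINED = 3      # process documented and standardized
    MANAGED = 4      # process quantitatively measured and controlled
    OPTIMIZING = 5   # continuous improvement from quantitative feedback

GOAL = Maturity.DEFINED  # the text's target: level 3 or higher

processes = {
    "advisement": Maturity.REPEATABLE,
    "capstone design": Maturity.DEFINED,
    "course surveys": Maturity.MANAGED,
}

for name, level in processes.items():
    verdict = "meets goal" if level >= GOAL else "needs improvement"
    print(f"{name}: level {int(level)} ({level.name}) -> {verdict}")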


Assessment System Procedures

The department chair has the primary responsibility to collect and disseminate assessment data to the departmental faculty members. The department analyzes findings from the Course Survey, the Senior Survey, the Alumnae/Alumni Survey, Employer Focus Groups, Faculty Surveys, the Entering Student Questionnaire and information from Student Longitudinal Tracking Studies. Each semester, the students are asked to provide input by completing Course/Instructor Evaluations, which provide immediate input to faculty concerning course-level objectives. The senior students are surveyed just before graduation, which provides useful summary assessment data. Recent alumni are surveyed to provide a longer-range view of the program.

Each year, the Electrical Engineering Program Committee, consisting of all faculty members associated with the Electrical Engineering curriculum, reviews the Electrical Engineering curriculum. This committee meets every semester, and more often if the Program Chair calls a meeting, to discuss the curriculum and to ensure that faculty members allocate sufficient time to each subject area. This review determines the extent to which supporting outcomes have been achieved during that academic year.

The challenge is to make ideas of continual improvement work in an environment where one major constituency, the faculty itself, defends the idea of academic freedom with great vigor. It is our intention, then, to maintain fairly strict control over a subset of the required courses in our undergraduate curriculum, especially the laboratory sequences. In fact, this control already exists, and we merely exploit it for our purposes. Also included will be the required introductory sophomore and junior courses. This control will guarantee that the needed material will be covered, that all students will have the needed variety of experiences in and out of the classroom, and that faculty will still have great freedom in the advanced courses and electives. Notwithstanding this freedom, all faculty and all courses will be expected to use the quality improvement ideas; there will simply be less reliance on the elective courses to meet program objectives and more reliance on the controlled subset.

The Department Chair calls a meeting of the faculty to (initially) establish, review and revise the educational objectives. This meeting takes place at least once each academic year near the end of the Spring Semester, but each semester if more rapid changes are indicated (at the discretion of the chair). At this meeting the faculty and the chair may submit proposed changes to the educational objectives. The chair may inform the faculty of administrative constraints (e.g. budgetary constraints) and present results of student surveys, alumni surveys, senior exit surveys, and discussions with the industrial advisory board and information from recruiters, industrial contacts, and the state government. The proposed changes are discussed and approved by vote of the faculty.

In addition to the instruments mentioned previously, the Electrical Engineering Department also utilizes various other assessment methodologies. Some of these are discussed in the following paragraphs.


Career Planning Portfolio - The student maintains the portfolio, and it is reviewed during the advisement period by the department staff member assigned to advisement. We are just beginning to institute this portfolio system, and there will undoubtedly be changes, but those changes will flow from our outcomes assessment process in a natural way over time.

The EE Writing Center - The EE Writing Center is also an important mechanism for creating and maintaining assessment tools. Since its inception in the fall of 1995, the center has been actively involved in developing EECE 201 students' writing and communication skills. These assessment tools include: essay prompts that ask students to write about a learning experience; primary trait scoring sheets; and questionnaires designed to gather information about the writing instruction in the course.

Capstone Design Project - This is the rite of passage for the entire program. Each team (typically four students) is given the current IEEE specifications for the autonomous robot. They are responsible for: (1) managing a team, (2) designing and realizing a vehicle that meets specifications, (3) managing a budget, and (4) producing a formal report on the project. The teams will provide any parts and components required for the project. Teamwork, communications, and project management are stressed throughout the term. Project teams are required to have regularly scheduled meetings among themselves. Special meetings may be held with the laboratory instructors.


Computer Engineering Program

The Department of Computer Science and Computer Engineering has, as its mission, to:

Provide undergraduate and professional education through programs that prepare students for the workplace, stress the development of the total person, and begin a process of lifelong learning.

Provide graduate education and training in the skills of advanced research.

Contribute to the base of technical knowledge by conducting research and scholarship and by disseminating the results of those programs.

Support the engineering professions by service in the appropriate professional organizations.

Serve the needs of the state and region by appropriate outreach programs and by support for industrial development.

The first bullet of the mission statement targets undergraduate education and is the focus of the present discussion. The undergraduate component specifies three key areas: preparation for the workplace, development of the total person, and lifelong learning. Three broadly stated goals were derived from our mission statement. They are listed below.

GOAL 1: Broad Undergraduate Education

GOAL 2: Engineering Skills Showing Breadth and Depth

GOAL 3: Professional Skills

For each goal, faculty members developed program objectives. The Computer Engineering Program outcomes are listed below. Each educational objective is listed, followed by the desired outcomes associated with that objective, followed by the strategies and actions used to obtain that outcome. The letter(s) in parentheses cross-index to EC2000 Criterion 3, items a-k.

Objective 1: The student will develop an awareness of the world around us as necessary to practice engineering in a global economy.

Desired Outcome 1.1: The student will develop a career plan that recognizes current trends in engineering.

Strategies and Actions: The student will place a written discussion of how current events might affect her or his career plan in the career-planning portfolio (g, h, i, j)

Objective 2: The student will study arts, humanities, foreign language, science and mathematics

Desired Outcome 2.1: The student will successfully complete the required science and mathematics curriculum.


Strategies and Actions: The student will successfully complete the courses (a, h)

Desired Outcome 2.2: The student will successfully complete the humanities curriculum.

Strategies and Actions: The student will successfully complete the courses (h)

Objective 3: The student will actively participate in a broad educational experience in the fundamentals of engineering with emphasis on electrical and computer engineering

Desired Outcome 3.1: The student will maintain a portfolio that documents his/her academic career. The student will maintain documentation in the portfolio that demonstrates a clear plan for successful negotiation of the curriculum.

Strategies and Actions: The portfolio will be examined each semester as part of the advisement process. A checklist will be maintained by the department to ensure that all requirements are met. The student is responsible for maintaining the portfolio. (g, h, i, j, k)

Desired Outcome 3.2: The student will demonstrate an ability to apply knowledge of mathematics, science and engineering.

Strategies and Actions: Many of the courses in engineering support this; however, each of the ECE core courses (211, 212, 221, 222, 331, 351 and 371) is sufficient to demonstrate this ability. The student must successfully complete all these courses to graduate. (a)

Desired Outcome 3.3: The student will demonstrate an ability to design and conduct experiments, including analyzing and interpreting data.

Strategies and Actions: The ECE laboratory courses (201, 301, 403, 404) require the student to design and conduct a wide variety of experiments, to analyze the results and to draw conclusions. The student who successfully completes 404 has demonstrated this ability. (b)

Desired Outcome 3.4: The student will demonstrate an ability to design a system, device or process to meet desired needs.

Strategies and Actions: The junior and senior ECE laboratory courses (301, 403, 404) require the student to perform elements of design. The capstone laboratory sequence 403 and 404 requires that two designs be put to the test in hardware. (c)

Desired Outcome 3.5: The student will demonstrate an ability to identify, formulate and solve engineering problems.

Strategies and Actions: The capstone laboratory sequence 403-404 requires the student to solve a problem, beginning at the level of identifying the nature of the problem, formulating a solution to it, constructing appropriate hardware and software, testing the system, and reporting results. (e, k)

Objective 4: The student will study, in depth, one or more areas of computer engineering.

Desired Outcome 4.1: The student will plan which elective courses he or she will take.

Strategies and Actions: In the advisement process, the second-semester junior students will create a plan for elective courses and place it in their career-planning portfolios. The senior students will maintain a current plan document in their portfolios. (g, h, i)

Desired Outcome 4.2: The student will successfully complete the elective courses specified by the curriculum.

Strategies and Actions: The student will successfully complete the elective courses. (a, e, k)

Objective 5: The student will demonstrate abilities to communicate effectively and to work as a productive member of teams.

Desired Outcome 5.1: The student will demonstrate an ability to function on multi-disciplinary teams.

Strategies and Actions: The student will work on multidisciplinary teams in the following courses: EECE 221, EECE 403-404. In the latter two, the student will perform a multidisciplinary design on a team with Electrical Engineering students. (d)

Desired Outcome 5.2: The student will demonstrate an ability to communicate effectively.

Strategies and Actions: Throughout the laboratory sequence, the student will be graded on communication (both written and oral skills). Students who successfully complete the laboratory sequence will have demonstrated this ability. In particular, the EE Writing Center introduces writing styles in EECE 201. Writing Center consultants, the laboratory Teaching Assistants, and the instructor meet at the end of each term to review reports and quiz results, and to make recommendations for changes the following semester. (g)

Desired Outcome 5.3: The student will demonstrate an understanding of professional and ethical responsibility.

Strategies and Actions: A reflective writing exercise on engineering professional ethics will be included in the senior laboratory sequence (f, g). An elective course, Ethics in Science and Engineering, offered by the Philosophy Department, may also be used to demonstrate this ability.

Objective 6: The student will demonstrate the ability to engage in career-long professional development.

Desired Outcome 6.1: The student will demonstrate the ability to build on previous experience and to begin work in a new field.

Strategies and Actions: The laboratory sequence requires the student to solve problems requiring knowledge beyond that covered in the curriculum course work. (i)


Assessment System

The process for establishment, review and revision of the educational objectives is shown graphically as the left-hand loop in the following diagram.

[Figure 11. Diagram showing assessment loops: the left loop shows the process for setting educational objectives; the right loop shows the process for setting desired outcomes and measuring actual outcomes.]

The Department Chair calls a meeting of the faculty to (initially) establish, review and revise the educational objectives. This meeting takes place at least once each academic year near the end of the Spring Semester, but each semester if more rapid changes are indicated (at the discretion of the chair). The University Board of Trustees has given primary responsibility for the curriculum to the faculty; thus the faculty associated with the program must approve changes to the educational objectives. At this meeting the faculty and the chair may submit proposed changes to the educational objectives. The chair may inform the faculty of administrative constraints (e.g. budgetary constraints) and present results of student surveys, alumni surveys, senior exit surveys, and discussions with the industrial advisory board and information from recruiters, industrial contacts, and the state government. The proposed changes are discussed and approved by vote of the faculty.

Each semester, the students are asked to provide input by completing Course/Instructor Evaluations, which provide immediate input to faculty concerning course-level objectives. The senior students are surveyed just before graduation, which provides useful summary assessment data. Recent alumni are surveyed to provide a longer-range view of the program. The department chair has the primary responsibility to collect this information and present it to the faculty, as described above.


Program Management Structure

Overview of the Assessment Process

The educational objectives are the primary forcing functions for desired outcomes and for strategies and actions. The desired outcomes flow directly from the objectives, the strategies and actions flow from the desired outcomes, the curriculum and other learning experiences flow from the strategies and actions, and the actual outcomes are produced by the students as they progress through the curriculum and other experiences. Finally, an assessment process is used to measure the actual outcomes, which are compared to the desired outcomes. Corrective actions are taken whenever there are serious differences between the desired and actual outcomes, thus closing the continual quality improvement loop.

A graphical view of the process for setting desired outcomes and measuring actual outcomes is given in the figure below, which draws a formal analogy to a closed-loop control system. Of course, this analogy should not be stretched too far, since the students and faculty members are people and not mechanisms, but it is helpful in presenting the concepts of desired and actual outcomes. The desired outcomes are compared, using some metrics, to the actual outcomes, and corrective actions are taken to make the actual outcomes track the desired outcomes.

[Figure 12. Analogy between continual quality improvement loop and closed-loop control system. Blocks: Mission; Goals & Objectives; Desired Outcomes; Desired Metrics; Strategies and Actions; Measures and Metrics; External Assessment Information; Measured Outcomes; Actual Outcomes.]

Equally important is a conceptual framework for continual quality improvement. We have been using the Capability Maturity Model [Paulk et al., "Capability Maturity Model, Version 1.1," IEEE Software, Vol. 10, No. 4, July 1993, pp. 18-27], developed by the Carnegie Mellon Software Engineering Institute for assessing software developers' processes, as such a conceptual framework for the evaluation of our academic processes. The figure below shows the maturity levels of a hypothetical process. There is empirical evidence, at least in the case of software developers, that processes at a maturity level of 3 or above tend to stay at a high level, while those that are lower tend to fall back to a lower level. Thus, our goal is to improve the maturity level of all the key academic processes to level 3 or higher.


Figure 13. The Capability Maturity Model

Assessment System Procedures

The department chair has the primary responsibility to collect and disseminate assessment data to the departmental faculty members. The department analyzes findings from the Course Survey, the Senior Survey, the Alumnae/Alumni Survey, Employer Focus Groups, Faculty Surveys, the Entering Student Questionnaire and information from Student Longitudinal Tracking Studies. Each semester, the students are asked to provide input by completing Course/Instructor Evaluations, which provide immediate input to faculty concerning course-level objectives. The senior students are surveyed just before graduation, which provides useful summary assessment data. Recent alumni are surveyed to provide a longer-range view of the program.

Each year, the Computer Science and Computer Engineering Program Committee, consisting of all faculty members associated with the Computer Science/Computer Engineering curriculum, reviews the Computer Science and Computer Engineering curriculum. This committee meets every semester, and more often if the Program Chair calls a meeting, to discuss the curriculum and to ensure that faculty members allocate sufficient time to each subject area. This review determines the extent to which supporting outcomes have been achieved during that academic year.

The challenge is to make ideas of continual improvement work in an environment where one major constituency, the faculty itself, defends the idea of academic freedom with great vigor. It is our intention, then, to maintain fairly strict control over a subset of the required courses in our undergraduate curriculum, especially the laboratory sequences. In fact, this control already exists, and we merely exploit it for our purposes. Also included will be the required introductory sophomore and junior courses. This control will guarantee that the needed material will be covered, that all students will have the needed variety of experiences in and out of the classroom, and that faculty will still have great freedom in the advanced courses and electives. Notwithstanding this freedom, all faculty and all courses will be expected to use the quality improvement ideas; there will simply be less reliance on the elective courses to meet program objectives and more reliance on the controlled subset.

The Department Chair calls a meeting of the faculty to (initially) establish, review and revise the educational objectives. This meeting takes place at least once each academic year near the end of the Spring Semester, but each semester if more rapid changes are indicated (at the discretion of the chair). At this meeting the faculty and the chair may submit proposed changes to the educational objectives. The chair may inform the faculty of administrative constraints (e.g. budgetary constraints) and present results of student surveys, alumni surveys, senior exit surveys, and discussions with the industrial advisory board and information from recruiters, industrial contacts, and the state government. The proposed changes are discussed and approved by vote of the faculty.

Assessment Methodologies

In addition to the instruments mentioned previously, the Computer Science and Computer Engineering Department also utilizes various other assessment methodologies. Some of these are discussed in the following paragraphs.

Career Planning Portfolio - The student maintains the portfolio, and it is reviewed during the advisement period by the department staff member assigned to advisement. We are just beginning to institute this portfolio system, and there will undoubtedly be changes, but those changes will flow from our outcomes assessment process in a natural way over time.

The EE Writing Center - The EE Writing Center is also an important mechanism for creating and maintaining assessment tools. Since its inception in the fall of 1995, the center has been actively involved in developing EECE 201 students' writing and communication skills. These assessment tools include: essay prompts that ask students to write about a learning experience; primary trait scoring sheets; and questionnaires designed to gather information about the writing instruction in the course.

Capstone Design Project - This is the rite of passage for the entire program. Each team (typically four students) is given the current IEEE specifications for the autonomous robot. They are responsible for: (1) managing a team, (2) designing and realizing a vehicle that meets specifications, (3) managing a budget, and (4) producing a formal report on the project. The teams will provide any parts and components required for the project. Teamwork, communications, and project management are stressed throughout the term. Project teams are required to have regularly scheduled meetings among themselves. Special meetings may be held with the laboratory instructors.


Appendix A

Assessment Plan


College of Engineering and Information Technology Assessment Plan

Assessment Program Objectives and Strategies

Objective 1:

Develop and implement an assessment program that provides processes and procedures for the continuous evaluation of student performance and satisfaction, faculty performance and satisfaction and stakeholder input into the educational system.

Action Strategies & Timeframes:

1. Monitor the processes and procedures developed and implemented to evaluate assessment data provided to each department and the executive committee. (4/00; 4/01; 4/02; 4/03; 4/04)

2. On an annual basis, each department will review and make recommendations for improvement based on assessment data collected to address each program outcome as part of the continuous quality review program. (ABET/Gateway Committee) (6/00; 6/01; 6/02; 6/03; 6/04)

3. The Director of Assessment will prepare the annual quality review program report indicating the extent to which the action plans were implemented and achieved by each department, the feasibility of the time frames, and recommendations for improving the process. (10/00; 10/01; 10/02; 10/03; 10/04)

Outcomes:

A. Written procedures will be submitted by each department and the executive committee outlining each major step in the assessment process that occurs within the department.

B. On an annual basis, each department will provide a written summary report of findings (outcomes), results, actions taken, consequences, and recommendations verifying the assessment process has completed the annual cycle and specifying problems and solutions.

C. The Director of Assessment will summarize results and recommendations of the ABET/Gateway Committee; then prepare a synopsis of the annual review indicating assessment measures analyzed, outcomes, recommendations, changes implemented, and the evaluation results of the changes.

D. The Executive Committee will discuss and prioritize action strategies recommended as a result of the annual program review.

Resources:

The Director of Assessment position – college-funded
An educational research graduate assistant
A work-study student assistant

Objective 2:

Develop and implement a set of evaluation instruments that assess performance and satisfaction levels for all key stakeholders for the continuous quality improvement program (students, parents, alumni, faculty, staff, administration, industry, employers, partnership board, and other legislative bodies).


Action Strategies & Timeframes:

1. The Director of Assessment will administer the following instruments by June 2000: Alumnae/Alumni Survey (3 year), Employer Survey and/or Focus Groups, Partnership Board Evaluation (if needed), Faculty Survey, Staff Survey, Withdrawal Survey, and the Senior Survey. (6/00; 6/01; 6/02; 6/03; 6/04)

2. Refine and improve evaluation processes and procedures developed to administer and retrieve evaluation instruments. (10/00; 10/01; 10/02; 10/03; 10/04)

3. Analyze, summarize and provide written and oral reports of the results from the evaluation (assessment) instruments. (1/00-12/00; 1/01-12/01; 1/02-12/02; 1/03-12/03; 1/04-12/04)

Outcomes:

A. The Director of Assessment will distribute survey results to all of the appropriate constituencies. (See distribution list.)

B. The Director of Assessment will publish appropriate articles in newspapers and journals of survey outcomes.

C. The Director of Assessment will make presentations at designated conferences and seminars outlining lessons learned from the continuous quality improvement program.

Resources:

Envelopes, letterhead paper, paper, labels $700.00
Copies $600.00
Postage $400.00
Code sheet development $800.00
Code sheets $700.00

Objective 3:

Create, implement, and generate reports from the development and utilization of a ten-year student longitudinal tracking system based upon USC admissions, registration, and graduation data tapes.

Action Strategies & Timeframes:

1. Determine types of reports to be generated and the time frames for each; specify structure of each report. (2/00)

2. Pull data, verify accuracy, and compile designated reports. (9/00, 11/00)

Outcomes:

A. Distribute tables/reports of the Student Longitudinal Study Results to Executive Committee and Departments.

B. Write summary report describing findings within each report.

Resources:

Computer time at Computer Services $200.00
Paper and copies $100.00
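As a rough illustration of the kind of cohort report Objective 3 calls for, the sketch below computes per-cohort graduation and retention figures from flat (entry term, status) records. The record layout and values are hypothetical stand-ins, not the actual USC data-tape format.

from collections import Counter

# Hypothetical (entry_term, status) records standing in for the admissions,
# registration, and graduation data tapes named in Objective 3.
records = [
    ("F1994", "graduated"), ("F1994", "withdrew"), ("F1994", "graduated"),
    ("F1995", "enrolled"),  ("F1995", "graduated"), ("F1995", "withdrew"),
]

by_cohort = {}
for term, status in records:
    by_cohort.setdefault(term, Counter())[status] += 1

# One line per entering cohort: size, graduation rate, and status mix.
for term in sorted(by_cohort):
    counts = by_cohort[term]
    total = sum(counts.values())
    grad_rate = 100 * counts["graduated"] / total
    print(f"Cohort {term}: n={total}, graduated {grad_rate:.0f}%, "
          f"still enrolled {counts['enrolled']}, withdrew {counts['withdrew']}")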

Objective 4:

Design and implement, analyze and report results from Bates House Living and Learning Community project.


Action Strategies:

1. Design and implement assessment methodology for evaluating Engineering 101; report results.
2. Design and implement assessment methodology for analyzing the control and experimental group for the Bates House project; report results.

Outcomes:

A. Write a report analyzing and summarizing results of the Bates House initiative.

Resources:

Paper, copies, video tape, binders, etc. $300.00

Objective 5:

Provide technical support and assistance (assessment methodology, practices, techniques, etc.) to faculty within the College of Engineering.

Action Strategies:

1. Analyze and report end of course survey data, pre-post attitude data and skills/competencies matrix data for EMCH 467.

2. Analyze and report data for the EECE 201 Survey.
3. Analyze and report data for the ECHE 460 or 461 Survey.
4. Analyze and report data for other courses as needed.

Outcomes:

A. Provide a report analyzing and summarizing results of each assessment initiative.

Resources:

Paper, copies, video tape, binders, etc. $100.00

Objective 6:

Design and implement assessment methodologies to measure the impact of the Professional Communications Center.

Action Strategies:

1. Determine and prioritize four research projects to assess written and oral communications.
2. Obtain necessary faculty support and assistance with the research projects.
3. Collect and analyze data, summarize findings, and create a written report. (12/00)

Outcomes:

A. Produce an annual report that reviews progress in the implementation of speech and writing skills within the College curriculum.

B. Provide appropriate course survey statistics to the Institutional Planning and Assessment Office.


Resources:

Writing Center Consultants to grade pre- and post-surveys $300.00
Student Assistants to administer and collect surveys

Objective 7:

Devise, implement and evaluate a system for the continuous evaluation of course instruction.

Action Strategies:

1. Modify and administer the College of Engineering and Information Technology Course Survey.
2. Modify and refine the policies and procedures for the implementation of the course evaluation.
3. Generate and distribute faculty reports. (2/00; 7/00)
4. Analyze and report results of the college-wide course survey administration each semester. (2/00 and 7/00)

Outcomes:

A. Produce reports for faculty and administration regarding the results of each administration of the college-wide course survey.

B. Provide appropriate course survey statistics to the Institutional Planning and Assessment Office.

Resources:

Students to code and organize sheets before sending to Computer Services.

Purchase code sheets $165.44/M (total price $827.20)
Scanning of code sheets $200.00


Appendix B

Senior Survey


College of Engineering and Information Technology

Senior Survey: An Assessment of Students' Experiences and Opinions

Return surveys to:

College of Engineering Student Services
University of South Carolina
Swearingen Building
Columbia, SC 29208


Senior Survey
May 1999

1. Would you recommend a University of South Carolina engineering education to a friend or relative?

Yes No Maybe

2. How would you rate your overall satisfaction with your preparation to become an engineer? Please mark the box that best describes your opinion.

Not Satisfied    A Little Satisfied    Undecided    Satisfied    Very Satisfied

3. How would you rate your preparation to obtain a job after graduation? Please mark the box that best describes your opinion.

Not Satisfactory    Somewhat Satisfactory    Undecided    Satisfactory    Very Satisfactory

4. How would you rate your preparation to become a contributing member of society? Please mark the box that best describes your opinion.

Not Satisfactory    Somewhat Satisfactory    Undecided    Satisfactory    Very Satisfactory

5. What kinds of publications (besides textbooks) do you usually read? (for example, Newsweek, The State, Journal of Engineering Education, etc.)

How often do you read these materials? What kinds of news or information-type programs do you watch or listen to on a regular basis?

__________________________________________________________________________________________

6. Please indicate your degree of satisfaction with each of the following services or features of the College of Engineering and Information Technology. If any item listed is not relevant to your situation, circle the number six (6) for “Does Not Apply.”

Features (1 = Very Dissatisfied, 2 = Dissatisfied, 3 = Neutral, 4 = Satisfied, 5 = Very Satisfied, 6 = Does Not Apply):

Information on career/job opportunities in your area 1 2 3 4 5 6
Value of general advisement services received 1 2 3 4 5 6
Advisor’s knowledge of your program requirements 1 2 3 4 5 6
Value of assistance provided by Student Services staff 1 2 3 4 5 6
Comfort and appropriateness of classrooms 1 2 3 4 5 6
Overall conditions of laboratories 1 2 3 4 5 6
Availability and condition of computers 1 2 3 4 5 6
Availability and condition of lab equipment 1 2 3 4 5 6
Teaching Assistants treat students respectfully 1 2 3 4 5 6
Teaching Assistants display a clear understanding of the subject matter 1 2 3 4 5 6


7. Below are listed some skills and competencies that engineering graduates should have. Please provide us with your opinion about the amount of experience you received in your coursework regarding these skills. Also indicate your satisfaction with the level of competency you have achieved as a result of your USC education. For each item please circle the number in the column appropriate to your answer.

Competencies: for each item, rate Amount of Experience (1 = Too Little, 2 = Adequate, 3 = Too Much) and Level of Competency (1 = Completely Dissatisfied, 2 = Dissatisfied, 3 = Satisfied, 4 = Completely Satisfied).

An ability to apply:
Engineering terms, principles and theories 1 2 3 | 1 2 3 4
Advanced mathematics (calculus & above) 1 2 3 | 1 2 3 4
Chemistry and/or physics 1 2 3 | 1 2 3 4
Liberal Arts (English, history, economics, business, etc.) 1 2 3 | 1 2 3 4

An ability to:
Identify, formulate, and solve engineering problems 1 2 3 | 1 2 3 4
Design a system, component, or process to meet desired needs and quality 1 2 3 | 1 2 3 4
Use the computer as a tool for analysis & design 1 2 3 | 1 2 3 4
Function on multi-disciplinary or cross-functional teams 1 2 3 | 1 2 3 4
Function in culturally and ethnically diverse environments 1 2 3 | 1 2 3 4
Communicate orally, informally, and in prepared talks 1 2 3 | 1 2 3 4
Communicate in writing - technical reports, memos, proposals, etc. 1 2 3 | 1 2 3 4
Use computer software for professional communications 1 2 3 | 1 2 3 4
Design and conduct experiments 1 2 3 | 1 2 3 4
Analyze and interpret data 1 2 3 | 1 2 3 4

An understanding of:
Professional and ethical responsibilities 1 2 3 | 1 2 3 4
Environmental aspects of engineering practice 1 2 3 | 1 2 3 4
The practice of engineering on a global scale 1 2 3 | 1 2 3 4
The impact of engineering solutions in a global and societal context 1 2 3 | 1 2 3 4
The need for engaging in life-long learning 1 2 3 | 1 2 3 4
Basic knowledge of industry practices and standards 1 2 3 | 1 2 3 4
Contemporary issues 1 2 3 | 1 2 3 4


8. What courses, experiences, teachers, professional organizations, or learning activities did you find most useful in helping to prepare you for becoming an engineering professional?

9. What recommendations would you make to improve the educational experience for future engineering students at USC?

Extracurricular Activities or Service

10. Did you have an internship with an engineering company? _____ yes _____ no

If yes, where? ___________________________________________________

11. Did you participate in a Co-op program? _____ yes _____ no

If yes, where? ___________________________________________________

12. Did you work while going to school? _____ yes _____ no

If yes, how often?
_____ part-time (20 hours or less per week)
_____ part-time (20-30 hours per week)
_____ part-time or full-time (more than 30 hours per week)


Graduate Education

13. Are you planning to attend graduate school? _____ yes _____ no _____ maybe

If yes, in what field?______________________________________

If yes, in which University do you plan to enroll? ___________________________________________

Employment Information

14. Have you accepted a position at this time? _____ yes _____ no

If yes, what is the name of the company or organization? _____________________________________

If yes, what is your job title? ______________________________________________

15. Did you participate in career planning in the Career Services Office? ______ yes ______ no

Demographic Information:

16. When did you enroll at USC? ____________________

17. Did you transfer from another college or university? _____ yes _____ no

If yes, what was the transfer institution? _________________________________

18. What is your major? Please circle. Chemical Civil/Environmental Computer Electrical Mechanical

19. What is your cumulative GPA (grade point average)? __________________

20. What is your gender? Please circle. Male Female

21. What is your ethnicity? Please circle. Caucasian African-American Hispanic

Asian/Pacific Islander Native American Other

Thank you for completing this survey!


Appendix C

Senior Survey Reports

(sample)


University of South Carolina
College of Engineering and Information Technology
Senior Survey

May 1999

1. Would you recommend a University of South Carolina engineering education to a friend or relative?

College Yes 49 (65.3%) No 9 (12.0%) Maybe 17 (22.7%)

Chemical 8 (88.9%) 0 (0.0%) 1 (11.1%)
Civil 10 (77.8%) 0 (0.0%) 2 (16.7%)
Computer 4 (36.4%) 2 (18.2%) 5 (45.5%)
Electrical 5 (41.7%) 6 (50.0%) 1 (8.3%)
Mechanical 21 (70.0%) 1 (3.3%) 8 (26.7%)
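The per-program breakdowns in this report are straightforward cross-tabulations; a minimal sketch of how such a table can be produced from raw responses is shown below. The response list is invented for illustration, not the actual May 1999 data.

from collections import Counter

# Minimal cross-tabulation sketch: count (major, answer) pairs and report
# percentages within each major. The responses below are hypothetical.
responses = [
    ("Chemical", "Yes"), ("Civil", "Yes"), ("Electrical", "No"),
    ("Mechanical", "Maybe"), ("Mechanical", "Yes"),
]

counts = Counter(responses)                        # (major, answer) -> count
totals = Counter(major for major, _ in responses)  # responses per major

for major in sorted(totals):
    cells = []
    for answer in ("Yes", "No", "Maybe"):
        n = counts[(major, answer)]
        cells.append(f"{answer} {n} ({100 * n / totals[major]:.1f}%)")
    print(f"{major}: " + "  ".join(cells))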

2. How would you rate your overall satisfaction with your preparation to become an engineer? Please mark the box that best describes your opinion.

Not Satisfied    A Little Satisfied    Undecided    Satisfied    Very Satisfied

College 2 ( 2.7%) 7 ( 9.5%) 5 ( 6.8%) 54 (73.0%) 6 ( 8.1%)

Chemical 0 (0.0%) 0 (0.0%) 0 (0.0%) 7 (77.8%) 2 (22.2%)
Civil 0 (0.0%) 0 (0.0%) 0 (0.0%) 10 (83.3%) 2 (16.7%)
Computer 0 (0.0%) 2 (18.2%) 1 (9.1%) 7 (63.6%) 1 (9.1%)
Electrical 2 (16.7%) 1 (8.3%) 2 (16.7%) 7 (58.3%) 0 (0.0%)
Mechanical 0 (0.0%) 4 (13.8%) 2 (6.9%) 23 (79.3%) 0 (0.0%)

3. How would you rate your preparation to obtain a job after graduation? Please mark the box that best describes your opinion.

Not Satisfactory    Somewhat Satisfactory    Undecided    Satisfactory    Very Satisfactory

College 3 ( 4.1%) 6 ( 8.2%) 8 (11.0%) 40 (54.8%) 16 (21.9%)

Chemical 0 (0.0%) 0 (0.0%) 1 (11.1%) 4 (44.4%) 4 (44.4%)
Civil 0 (0.0%) 1 (9.1%) 0 (0.0%) 7 (63.6%) 3 (27.3%)
Computer 0 (0.0%) 1 (9.1%) 3 (27.3%) 5 (45.5%) 2 (18.2%)
Electrical 2 (16.7%) 2 (16.7%) 1 (8.3%) 5 (41.7%) 2 (16.7%)
Mechanical 1 (3.4%) 2 (6.9%) 3 (10.3%) 18 (62.1%) 5 (17.2%)

4. How would you rate your preparation to become a contributing member of society? Please mark the box that best describes your opinion.

Not Satisfactory    Somewhat Satisfactory    Undecided    Satisfactory    Very Satisfactory

College 1 ( 1.4%) 3 ( 4.1%) 3 ( 4.1%) 45 (60.8%) 22 (29.7%)

Chemical 0 (0.0%) 0 (0.0%) 0 (0.0%) 7 (58.3%) 5 (41.7%)
Civil 0 (0.0%) 0 (0.0%) 1 (11.1%) 7 (77.8%) 1 (11.1%)
Computer 0 (0.0%) 0 (0.0%) 0 (0.0%) 5 (45.5%) 6 (54.5%)
Electrical 1 (8.3%) 0 (0.0%) 1 (8.3%) 9 (75.0%) 1 (8.3%)
Mechanical 0 (0.0%) 3 (10.3%) 2 (6.9%) 20 (69.0%) 4 (13.8%)


5. What kinds of publications do you read? (for example, Newsweek, The State, Journal of Engineering Education, etc.) How often do you read these materials?

Chemical:

Engineering Textbooks (everyday)
The State, Sports Illustrated, Chemical Engineering Progress (twice a week)
The State, Maxim

The State, Sports Illustrated, Reader’s Digest (daily)
The State (daily)
Wall Street Journal (weekly)
Chemical Engineering Progress, Cosmopolitan, The State (weekly)
The State, Aiken Standard, Chemical Engineering Progress, AICHE Journal (daily or monthly depending on distribution)
Everything, all the time, and I’m not kidding. (Not so much hard-core engineering stuff, but lots of current events.) (weekly)

Civil:

Wall Street Journal, The State (once or twice a week)
Time, Journal of American Water Works (once a month)
The State (everyday)
Wall Street Journal (daily)
Newsweek, The State, Civil Engineering (about once a month)
Scientific American, Newsweek, National Geographic, Time (monthly)
The State (everyday), ASCE Engineering Journal (whenever I receive it)
ASCE News, The State, Popular Mechanics (once a month)
The State, Civil Engineering, P. O. B. (3 or 4 times a week)
Engineering News Record (weekly)
The State, Newsweek, Reader’s Digest (whenever published)
The State, The Gamecock (2-3 times a week)

Computer:

Potentials
The State, PC Gaming (weekly)
Network Magazine, Byte (weekly)
Augusta Chronicle, U. S. News, Newsweek (daily, weekly, per issue frequency)
Potential, Network Magazine, NT System, Enterprise Management, IT Professional, Kiplinger Report (every time a new subscription arrives)
Spectrum (once a month)
IEEE Spectrum, The State (3-7 days per week)
The State, Newsweek, IEEE Magazines, Gamecock (3 or more times a week)
The State, IEEE Potentials, Online Publications
None
Paper (weekly), books (when I need them)

Electrical:

Midnight Engineering, Circuit Cellar, Popular Electronics (as they are published)
IEEE Potentials, The State (once a week)
The State, Newsweek, Time (every week)
Blank
Time, Gamecock, Newsweek, Technology Related Systems (all the time)
Popular Science, IEEE Spectrum, PC Week (weekly or monthly)
Augusta Newspaper (everyday)
None
The State, National Geographic, Popular Science (weekly)
Blank
The State (weekly), Muscle Media, Muscle and Fitness (monthly)
None

Mechanical:


Time, Mechanical Engineering (every week)
The State, Wall Street Journal (daily)
The State (at least once a week)
Mechanical Engineering, The State, Time (once a month)
ASME, Gamecock (once a week)
The State, ASME Journal, USA Today (everyday)
The State, Mechanical Engineering Magazine (a few times a week)
ASME, Journal Pressure Vessels (monthly)
Newsweek, ASME Journals, Sports Illustrated (every week)
ASME Journal (monthly)
ASME (monthly), Fox News (daily)
Washington Post, Wall Street Journal, Various Magazines, Mechanical Engineering (every week or issue)
The State, USA Today, Sports Illustrated (almost daily)
Computer Magazine, The State (frequently)
None
The State, Time, Car and Driver (once per week)
Machine Design (biweekly)
The State (everyday)
Blank
The State, Wall Street Journal (once a day)
The State, Wall Street Journal, Popular Science, Soldier of Fortune, Playboy, Penthouse, The Gamecock, Air and Space, Omni (often)
Mechanical Engineering (daily)
The State, Reader’s Digest, Newsweek (everyday-monthly)
The State, The Gamecock, Easyrider (daily)
The State, The Gamecock, ASME Mechanical Engineering (daily)
Engineering textbook (non stop for the last four years)
Newspaper (daily), Sports Illustrated, Men’s Journal (monthly)
The State (2-4 times a week)
Technical Journals (weekly)
USA Today (daily), Newsweek (weekly)

What kinds of news programs do you watch or listen to on a regular basis?

Chemical:

News Radio, CNN, Nightly News
None
CNN, Dateline, MSNBC
NBC News, Paul Harvey
WIS News, NBC Nightly News, The Today Show
Evening News
NBC, CNN
Local News, National News, CNN, Weather Channel
National news

Civil:

20/20, Dateline
The Today Show

Local and National News, CNN
Local News
None
PBS Radio, NPR
TLC, Discovery, A & E, All Sports Programs
The Learning Channel
Headline News
Sci Fi Channel, Discovery Channel
Radio, News on various stations


Local News Channel

Computer:

CNN
MSNBC
CNN
History Channel, The Learning Channel
Newsweek, 60 Minutes
Daily News
Local News, CNN, Headline News, Dateline NBC
The News, Radio
MSNBC, CSPAN
Blank
Discovery Channel

Electrical:

Discovery Channel, TLC
ABC News, CNN, Hardball
CNN, Local News
Local and National News
All kinds of news from any source
Rush Limbaugh, Fox News Channel, Channel 10 Local News
Local News, MSNBC
None
CNN, Local and Network News
Blank
CNN
None

Mechanical:

20/20, 60 Minutes, Dateline, CBS Evening News
WIS News, NBC News, Dateline, CNN, Discovery Channel, History Channel
News, The Learning Channel, The Discovery Channel
Dateline, Local News
Nightline
Local and National News
CNN, NBC, CBS, ABC, All Local News
CNBC
48 Hours, Good Morning America, National Geographic Explorer, Dateline
60 Minutes
Rush Limbaugh, Oliver North
CNN
Headline News
Blank
None
Evening News
Blank
CNN, NBC News
Nightline, 20/20, AM Radio
Nightly News, CNN
ESPN News, CNN, Sports Radio
Nightly News
NBC News, WIS TV, CNN, Radio Clips
Sports Talk Radio, ESPN, CNN
Evening News
Blank
Local News


CNN, Nightline, MTV 15/15
Evening News
Discovery Channel, Science Programs

6. Please indicate your degree of satisfaction with each of the following “environmental” features of the College of Engineering. If any item listed is not relevant to your situation, circle the number six (6) for “Does Not Apply.”

In the tables below, each feature is reported with counts (and percentages) for the response categories Very Dissatisfied, Dissatisfied, Neutral, Satisfied, Very Satisfied, and Does Not Apply. Within each category, values are listed in the order College / Chemical / Civil / Computer / Electrical / Mechanical.

Information on career/job opportunities in your area
Very Dissatisfied: 1 (1%) / 0 (0%) / 0 (0%) / 0 (0%) / 1 (8%) / 0 (0%)
Dissatisfied: 13 (17%) / 1 (11%) / 2 (17%) / 3 (27%) / 3 (25%) / 4 (13%)
Neutral: 11 (15%) / 2 (22%) / 1 (8%) / 1 (9%) / 0 (0%) / 7 (23%)
Satisfied: 37 (49%) / 6 (67%) / 6 (50%) / 4 (36%) / 7 (58%) / 13 (43%)
Very Satisfied: 10 (13%) / 0 (0%) / 2 (17%) / 2 (18%) / 1 (8%) / 1 (3%)
Does Not Apply: 3 (4%) / 0 (0%) / 1 (8%) / 1 (9%) / 0 (0%) / 0 (0%)

Value of general advisement services received
Very Dissatisfied: 5 (7%) / 0 (0%) / 1 (8%) / 0 (0%) / 3 (25%) / 1 (3%)
Dissatisfied: 14 (19%) / 0 (0%) / 1 (8%) / 5 (46%) / 3 (25%) / 5 (17%)
Neutral: 14 (19%) / 0 (0%) / 1 (8%) / 5 (46%) / 3 (25%) / 5 (17%)
Satisfied: 33 (44%) / 7 (78%) / 6 (50%) / 1 (9%) / 3 (25%) / 15 (50%)
Very Satisfied: 8 (11%) / 2 (22%) / 2 (17%) / 0 (0%) / 0 (0%) / 4 (13%)
Does Not Apply: 1 (1%) / 0 (0%) / 1 (8%) / 0 (0%) / 0 (0%) / 0 (0%)

Advisor’s knowledge of your program requirements College Chemical Civil Computer Electrical Mechanical

1 ( 1%) 0 ( 0%) 0 ( 0%) 0 ( 0%) 0 ( 0%) 1 ( 3%)

6 ( 8%) 0 ( 0%) 0 ( 0%) 1 ( 9%) 3 (25%) 2 ( 7%)

14 (19%) 0 ( 0%) 2 (17%) 4 (36%) 3 (25%) 5 (17%)

36 (48%) 5 (56%) 8 (67%) 5 (46 %) 4 (33%)13 (43%)

18 (24%) 4 (44%) 5 (17%) 1 ( 9%) 2 ( 17%) 9 (30%)

0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%)

Value of assistance provided by Student Services staff College Chemical Civil Computer Electrical Mechanical

3 ( 4%) 0 ( 0%) 0 ( 0%) 0 ( 0%) 1 ( 8%) 2 ( 7%)

0 ( 0%) 0 ( 0%) 0 ( 0%) 0 ( 0%) 0 ( 0%) 0 ( 0%)

16 (21%) 0 ( 0%) 2 (17%) 7 (64%) 1 ( 8%) 6 (20%)

39 ( 52%) 4 ( 44%) 8 ( 67%) 3 ( 27%) 9 ( 75%)15 ( 50%)

17 (23%) 5 (56%) 2 (17%) 1 ( 9%) 1 ( 8%) 7 (23%)

0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%)

Comfort and appropriateness of classrooms College Chemical Civil Computer Electrical Mechanical

2 ( 3%) 0 ( 0%) 0 ( 0%) 1 ( 9%) 1 ( 8%) 0 (0%)

4 ( 5%) 1 (11%) 1 ( 8%) 0 ( 0%) 0 ( 0%) 2 ( 7%)

7 ( 9%) 0 ( 0%) 0 ( 0%) 4 ( 36%) 0 ( 0%) 3 (10%)

47 ( 63%) 6 ( 67%) 9 ( 75%) 5 ( 46%) 10 ( 83%)17 ( 57%)

15 (20%) 2 (22%) 2 (17%) 1 ( 9%) 1 ( 8%) 8 (27%)

0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%)

Overall conditions of laboratories

80

Page 81: A college-wide assessment infrastructure was · Web viewIn this model, the program evaluation process documents progress towards achievement of objectives established by the engineering

College Chemical Civil Computer Electrical Mechanical

6 ( 8%) 0 ( 0%) 0 ( 0%) 1 ( 9%) 3 (25%) 2 ( 7%)

17 (23%) 1 (11%) 0 ( 0%) 7 (64%) 4 (33%) 5 (17%)

12 (16%) 1 (11%) 1 ( 8%) 2 (18%) 2 (17%) 6 (20%)

33 (44%) 1 (11%) 8 (67%) 1 ( 9%) 2 (17%)15 (50%)

7 ( 9%) 6 (67%) 3 (25%) 0 ( 0%) 1 ( 8%) 2 ( 7%)

0 (0%) 1 (11%) 0 (0%) 0 (0%) 0 (0%) 0 (0%)

Availability and condition of computers College Chemical Civil Computer Electrical Mechanical

6 ( 8%) 0 ( 0%) 1 ( 8%) 1 ( 9%) 1 ( 8%) 3 (10%)

20 (27%) 2 (22%) 2 (17%) 3 (27%) 1 ( 8%)12 (40%)

13 (17%) 2 (22%) 2 (17%) 0 ( 0%) 3 (25%) 6 (20%)

27 (36%) 5 (56%) 6 (50%) 4 (36%) 5 (42%) 6 (20%)

9 (12%) 0 ( 0%) 1 ( 8%) 3 (27%) 2 (17%) 3 (10%)

0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%)

Availability and condition of lab equipment College Chemical Civil Computer Electrical Mechanical

5 ( 7%)0 ( 0%)0 ( 0%)1 ( 9%)2 (17%)2 ( 7%)

23 (31%) 0 ( 0%) 1 ( 8%) 7 (64%) 5 (42%)10 (33%)

10 (13%) 2 (22%) 1 ( 8%) 1 ( 9%) 2 (17%) 4 (13%)

29 (39%) 7 (78%)7 (58%)

1 ( 9%) 2 (17%)11 (37%)

8 (11%) 0 ( 0%) 3 ( 25%) 1 ( 9%) 1 ( 8%) 3 (10%)

0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%)

Teaching Assistants treat students respectfully College Chemical Civil Computer Electrical Mechanical

3 (4%) 0 (0%) 0 (0%) 0 (0%) 1 (8%) 2 (7%)

3 ( 4%) 0 ( 0%) 1 ( 8%) 0 ( 0%) 1 ( 8%) 1 ( 3%)

13 (18%) 3 (38%) 0 ( 0%) 1 ( 9%) 2 (17%) 7 ( 23%)

43 (58%) 4 (63%) 8 (67%) 8 (73%) 6 (50%)15 (50%)

11 (15%) 0 ( 0%) 3 (25%) 2 (18%) 1 ( 8%) 5 (17%)

1 (0%) 0 (0%) 0 (0%) 0 (0%) 1 (8%) 0 (0%)

Teaching Assistants display a clear understanding of the subject matter College Chemical Civil Computer Electrical Mechanical

5 ( 7%) 0 ( 0%) 0 ( 0%) 1 ( 9%) 2 (17%) 2 ( 7%)

2 ( 3%) 1 (11%) 0 ( 0%) 0 ( 0%) 0 ( 0%) 1 ( 3%)

12 (16%) 2 (22%) 1 ( 8%) 0 ( 0%) 5 (42%) 7 (23%)

45 (60%) 6 (67%)10 (83%) 7 (64%) 2 (17%)16 (53%)

10 (13%) 0 ( 0%) 1 ( 8%) 3 (27%) 1 ( 8%) 4 (13%)

1 (1%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%)

81

Page 82: A college-wide assessment infrastructure was · Web viewIn this model, the program evaluation process documents progress towards achievement of objectives established by the engineering

7. Below are listed some skills and competencies that engineering graduates should have. Please provide us with your opinion about the amount of experience you received in your coursework regarding these skills. Also indicate your satisfaction with the level of competency you have achieved as a result of your USC education. For each item please circle the number in the column appropriate to your answer.

For each row, the first three columns report Amount of Experience (Too Little / Adequate / Too Much) and the last four report Level of Competency (Completely Dissatisfied / Dissatisfied / Satisfied / Completely Satisfied), as counts (percentages).

Engineering terms, principles and theories
College      4 ( 5%)   66 ( 89%)   4 ( 5%) |  1 ( 1%)   3 ( 4%)   61 ( 81%)   10 (13%)
Chemical     0 ( 0%)    9 (100%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    8 ( 89%)    1 (11%)
Civil        0 ( 0%)   12 (100%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)   10 ( 83%)    2 (17%)
Computer     1 ( 9%)    8 ( 73%)   2 (18%) |  0 ( 0%)   1 ( 9%)    7 ( 64%)    3 (27%)
Electrical   1 ( 8%)   11 ( 92%)   0 ( 0%) |  1 ( 8%)   0 ( 0%)   10 ( 83%)    1 ( 8%)
Mechanical   2 ( 7%)   25 ( 86%)   2 ( 7%) |  0 ( 0%)   2 ( 7%)   26 ( 87%)    2 ( 7%)

Advanced mathematics (calculus & above)
College      5 ( 7%)   63 ( 85%)   6 ( 8%) |  1 ( 1%)   3 ( 4%)   58 ( 77%)   13 (17%)
Chemical     0 ( 0%)    9 (100%)   1 (13%) |  0 ( 0%)   0 ( 0%)    8 ( 89%)    1 (11%)
Civil        0 ( 0%)   10 ( 83%)   2 (17%) |  0 ( 0%)   0 ( 0%)   10 ( 83%)    2 (17%)
Computer     0 ( 0%)   10 ( 91%)   1 ( 9%) |  0 ( 0%)   0 ( 0%)    7 ( 64%)    4 (36%)
Electrical   1 ( 8%)   11 ( 92%)   0 ( 0%) |  1 ( 8%)   1 ( 8%)    8 ( 67%)    2 (17%)
Mechanical   4 (14%)   22 ( 76%)   3 (10%) |  0 ( 0%)   2 ( 7%)   25 ( 83%)    3 (10%)

Chemistry and/or physics
College      6 ( 8%)   63 ( 84%)   6 ( 8%) |  1 ( 1%)   6 ( 8%)   63 ( 84%)    5 ( 7%)
Chemical     1 (11%)    8 ( 89%)   1 (13%) |  0 ( 0%)   0 ( 0%)    8 ( 89%)    1 (11%)
Civil        1 ( 8%)   10 ( 83%)   1 ( 8%) |  0 ( 0%)   1 ( 8%)   11 ( 92%)    0 ( 0%)
Computer     0 ( 0%)    9 ( 82%)   2 (18%) |  0 ( 0%)   0 ( 0%)    8 ( 73%)    3 (27%)
Electrical   1 ( 8%)   10 ( 83%)   1 ( 8%) |  1 ( 8%)   3 (25%)    8 ( 67%)    0 ( 0%)
Mechanical   3 (10%)   25 ( 83%)   2 ( 7%) |  0 ( 0%)   2 ( 7%)   27 ( 90%)    1 ( 3%)

Liberal Arts
College      7 ( 9%)   50 ( 67%)  18 (24%) |  1 ( 1%)   5 ( 7%)   64 ( 85%)    5 ( 7%)
Chemical     1 (11%)    7 ( 78%)   1 (11%) |  0 ( 0%)   1 (11%)    6 ( 67%)    2 (22%)
Civil        1 ( 8%)    7 ( 58%)   4 (33%) |  0 ( 0%)   3 (25%)    9 ( 75%)    0 ( 0%)
Computer     0 ( 0%)    9 ( 82%)   2 (18%) |  0 ( 0%)   0 ( 0%)    9 ( 82%)    2 (18%)
Electrical   2 (17%)    5 ( 42%)   5 (42%) |  0 ( 0%)   0 ( 0%)   11 ( 92%)    0 ( 0%)
Mechanical   2 ( 7%)   22 ( 73%)   6 (20%) |  0 ( 0%)   1 ( 3%)   28 ( 93%)    1 ( 3%)

An ability to:

Identify, formulate, and solve engineering problems
College      7 ( 9%)   67 ( 89%)   1 ( 1%) |  1 ( 1%)   5 ( 7%)   57 ( 76%)   12 (16%)
Chemical     0 ( 0%)    9 (100%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    7 ( 78%)    2 (22%)
Civil        0 ( 0%)   12 (100%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)   12 (100%)    0 ( 0%)
Computer     2 (18%)    8 ( 73%)   1 ( 9%) |  0 ( 0%)   1 ( 9%)    6 ( 55%)    4 (36%)
Electrical   1 ( 8%)   11 ( 92%)   0 ( 0%) |  1 ( 8%)   1 ( 8%)    9 ( 75%)    1 ( 8%)
Mechanical   4 (13%)   26 ( 87%)   0 ( 0%) |  0 ( 0%)   3 (10%)   23 ( 77%)    4 (13%)

Design a system, component, or process to meet desired needs and quality
College     17 (23%)   57 ( 76%)   1 ( 1%) |  3 ( 4%)   9 (12%)   55 ( 73%)    8 (11%)
Chemical     1 (11%)    8 ( 89%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    8 ( 89%)    1 (11%)
Civil        0 ( 0%)   12 (100%)   0 ( 0%) |  0 ( 0%)   1 ( 8%)   11 ( 92%)    0 ( 0%)
Computer     3 (27%)    7 ( 64%)   1 ( 9%) |  0 ( 0%)   2 (18%)    5 ( 46%)    4 (36%)
Electrical   4 (33%)    8 ( 67%)   0 ( 0%) |  2 (17%)   1 ( 8%)    9 ( 75%)    0 ( 0%)
Mechanical   9 (30%)   21 ( 70%)   0 ( 0%) |  1 ( 3%)   5 (17%)   22 ( 73%)    2 ( 7%)

Use the computer as a tool for analysis and design
College     14 (19%)   55 ( 74%)   5 ( 5%) |  1 ( 1%)   8 (11%)   51 ( 68%)   15 (20%)
Chemical     2 (22%)    6 ( 67%)   1 (11%) |  0 ( 0%)   0 ( 0%)    7 ( 78%)    2 (22%)
Civil        3 (25%)    9 ( 75%)   0 ( 0%) |  0 ( 0%)   2 (17%)    9 ( 75%)    1 ( 8%)
Computer     3 (27%)    6 ( 55%)   2 (18%) |  0 ( 0%)   1 ( 9%)    7 ( 64%)    3 (27%)
Electrical   2 (17%)    9 ( 75%)   1 ( 8%) |  1 ( 8%)   1 ( 8%)    7 ( 58%)    3 (25%)
Mechanical   4 (14%)   24 ( 83%)   1 ( 3%) |  0 ( 0%)   4 (13%)   21 ( 70%)    5 (17%)

Function on multi-disciplinary or cross-functional teams
College     22 (29%)   48 ( 64%)   5 ( 7%) |  1 ( 1%)  12 (16%)   52 ( 69%)   10 (13%)
Chemical     4 (44%)    5 ( 56%)   0 ( 0%) |  0 ( 0%)   1 (11%)    7 ( 78%)    1 (11%)
Civil        2 (17%)   10 ( 83%)   0 ( 0%) |  0 ( 0%)   2 (17%)   10 ( 83%)    0 ( 0%)
Computer     2 (18%)    7 ( 64%)   2 (18%) |  0 ( 0%)   0 ( 0%)    9 ( 82%)    2 (18%)
Electrical   4 (33%)    8 ( 67%)   0 ( 0%) |  1 ( 8%)   1 ( 8%)    7 ( 58%)    3 (25%)
Mechanical   9 (30%)   18 ( 60%)   3 (10%) |  0 ( 0%)   7 (23%)   19 ( 63%)    4 (13%)

Function in culturally and ethnically diverse environments
College     15 (20%)   58 ( 77%)   2 ( 3%) |  2 ( 3%)   9 (12%)   53 ( 71%)   11 (15%)
Chemical     2 (22%)    7 ( 78%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    8 ( 89%)    1 (11%)
Civil        1 ( 8%)   11 ( 92%)   0 ( 0%) |  0 ( 0%)   2 ( 1%)   10 ( 83%)    4 ( 8%)
Computer     2 (18%)    8 ( 73%)   1 ( 9%) |  0 ( 0%)   2 (18%)    6 ( 55%)    3 (27%)
Electrical   4 (33%)    8 ( 67%)   0 ( 0%) |  1 ( 8%)   1 ( 8%)    9 ( 75%)    1 ( 8%)
Mechanical   6 (20%)   23 ( 77%)   1 ( 3%) |  1 ( 3%)   5 (17%)   19 ( 63%)    5 (17%)

Communicate orally, informally, & in prepared talks
College     11 (15%)   60 ( 80%)   4 ( 5%) |  1 ( 1%)   6 ( 8%)   52 ( 69%)   16 (21%)
Chemical     0 ( 0%)    9 (100%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    6 ( 67%)    3 (33%)
Civil        2 (17%)   10 ( 83%)   0 ( 0%) |  0 ( 0%)   2 (17%)    9 ( 75%)    1 ( 8%)
Computer     1 ( 9%)    8 ( 73%)   2 (18%) |  0 ( 0%)   1 ( 9%)    7 ( 64%)    3 (27%)
Electrical   5 (42%)    7 ( 58%)   0 ( 0%) |  1 ( 8%)   2 (17%)    6 ( 50%)    3 (25%)
Mechanical   3 (10%)   25 ( 83%)   2 ( 7%) |  0 ( 0%)   1 ( 3%)   23 ( 77%)    6 (20%)

Communicate in writing – technical reports, memos, proposals, etc.
College      8 (11%)   54 ( 72%)  13 (17%) |  1 ( 1%)   6 ( 8%)   50 ( 67%)   18 (24%)
Chemical     1 (11%)    8 ( 89%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    6 ( 67%)    3 (33%)
Civil        0 ( 0%)    8 ( 67%)   4 (33%) |  0 ( 0%)   0 ( 0%)   11 ( 92%)    1 ( 8%)
Computer     2 (18%)    5 ( 46%)   4 (36%) |  0 ( 0%)   1 ( 9%)    6 ( 55%)    4 (36%)
Electrical   2 (17%)    9 ( 75%)   1 ( 8%) |  1 ( 8%)   1 ( 8%)    6 ( 50%)    4 (33%)
Mechanical   2 ( 7%)   24 ( 80%)   4 (13%) |  0 ( 0%)   4 (13%)   20 ( 67%)    6 (20%)

Use computer software for professional communications
College      9 (12%)   60 ( 80%)   6 ( 8%) |  1 ( 1%)   5 ( 7%)   52 ( 71%)   15 (21%)
Chemical     1 (11%)    7 ( 78%)   1 (11%) |  0 ( 0%)   0 ( 0%)    6 ( 75%)    2 (25%)
Civil        2 (17%)    9 ( 75%)   1 ( 8%) |  0 ( 0%)   2 (17%)    8 ( 67%)    2 (17%)
Computer     3 (27%)    6 ( 55%)   2 (18%) |  0 ( 0%)   1 ( 9%)    7 ( 64%)    3 (27%)
Electrical   1 ( 8%)   11 ( 92%)   0 ( 0%) |  1 ( 8%)   1 ( 8%)    7 ( 58%)    3 (25%)
Mechanical   2 ( 7%)   26 ( 87%)   2 ( 7%) |  0 ( 0%)   1 ( 3%)   23 ( 79%)    5 (17%)

Design and conduct experiments
College     22 (29%)   48 ( 64%)   5 ( 7%) |  2 ( 3%)  14 (19%)   52 ( 69%)    7 ( 9%)
Chemical     3 (33%)    6 ( 67%)   0 ( 0%) |  0 ( 0%)   2 (22%)    7 ( 78%)    0 ( 0%)
Civil        2 (17%)    9 ( 75%)   1 ( 8%) |  0 ( 0%)   1 ( 8%)   10 ( 83%)    1 ( 8%)
Computer     4 (36%)    5 ( 46%)   2 (18%) |  0 ( 0%)   4 (36%)    4 ( 36%)    3 (27%)
Electrical   2 (17%)   10 ( 83%)   0 ( 0%) |  1 ( 8%)   1 ( 8%)    8 ( 67%)    2 (17%)
Mechanical  11 (37%)   17 ( 57%)   2 ( 7%) |  1 ( 3%)   6 (20%)   22 ( 73%)    1 ( 3%)

Analyze and interpret data
College      8 (11%)   63 ( 84%)   4 ( 5%) |  1 ( 1%)   7 ( 9%)   55 ( 73%)   12 (16%)
Chemical     0 ( 0%)    8 ( 89%)   1 (11%) |  0 ( 0%)   0 ( 0%)    5 ( 56%)    4 (44%)
Civil        0 ( 0%)   12 (100%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)   11 ( 92%)    1 ( 8%)
Computer     1 ( 9%)    8 ( 73%)   2 (18%) |  0 ( 0%)   2 (18%)    7 ( 64%)    2 (18%)
Electrical   1 ( 8%)   11 ( 92%)   0 ( 0%) |  1 ( 8%)   0 ( 0%)   11 ( 92%)    0 ( 0%)
Mechanical   6 (20%)   23 ( 77%)   1 ( 3%) |  0 ( 0%)   5 (17%)   20 ( 67%)    5 (17%)

An understanding of:

Professional and ethical responsibilities
College     21 (28%)   52 ( 69%)   2 ( 3%) |  3 ( 4%)   9 (12%)   50 ( 67%)   13 (17%)
Chemical     2 (22%)    7 ( 78%)   0 ( 0%) |  0 ( 0%)   1 (11%)    7 ( 78%)    1 (11%)
Civil        1 ( 8%)   11 ( 92%)   0 ( 0%) |  0 ( 0%)   1 ( 8%)    9 ( 75%)    2 (17%)
Computer     2 (18%)    8 ( 73%)   1 ( 9%) |  0 ( 0%)   1 ( 9%)    8 ( 73%)    2 (18%)
Electrical   6 (50%)    6 ( 50%)   0 ( 0%) |  2 (17%)   1 ( 8%)    7 ( 58%)    2 (17%)
Mechanical  10 (33%)   19 ( 63%)   1 ( 3%) |  1 ( 3%)   5 (17%)   18 ( 60%)    6 (20%)

Environmental aspects of engineering practice
College     32 (43%)   41 ( 55%)   2 ( 3%) |  3 ( 4%)  22 (29%)   43 ( 57%)    7 ( 9%)
Chemical     4 (44%)    5 ( 56%)   0 ( 0%) |  0 ( 0%)   3 (33%)    6 ( 67%)    0 ( 0%)
Civil        4 (33%)    8 ( 67%)   0 ( 0%) |  0 ( 0%)   4 (33%)    6 ( 50%)    2 (17%)
Computer     3 (27%)    7 ( 64%)   1 ( 9%) |  0 ( 0%)   3 (27%)    7 ( 64%)    1 ( 9%)
Electrical   6 (50%)    6 ( 50%)   0 ( 0%) |  2 (17%)   1 ( 8%)    9 ( 75%)    0 ( 0%)
Mechanical  15 (50%)   14 ( 47%)   0 ( 0%) |  1 ( 3%)  11 (37%)   14 ( 47%)    4 (13%)

The practice of engineering on a global scale
College     39 (52%)   34 ( 45%)   2 ( 3%) |  3 ( 5%)  27 (36%)   39 ( 52%)    5 ( 7%)
Chemical     7 (78%)    2 ( 22%)   0 ( 0%) |  0 ( 0%)   6 (67%)    3 ( 33%)    0 ( 0%)
Civil        3 (25%)    9 ( 75%)   0 ( 0%) |  1 ( 8%)   1 ( 8%)   10 ( 83%)    0 ( 0%)
Computer     6 (55%)    4 ( 36%)   1 ( 9%) |  0 ( 0%)   4 (36%)    3 ( 27%)    1 ( 9%)
Electrical   9 (75%)    3 ( 25%)   0 ( 0%) |  3 (25%)   5 (42%)    3 ( 25%)    1 ( 8%)
Mechanical  14 (47%)   15 ( 50%)   1 ( 3%) |  0 ( 0%)  11 (37%)   16 ( 53%)    3 (10%)

The impact of engineering solutions in a global and societal context
College     37 (49%)   36 ( 48%)   2 ( 3%) |  4 ( 5%)  25 (33%)   39 ( 52%)    7 ( 9%)
Chemical     5 (56%)    4 ( 44%)   0 ( 0%) |  1 (11%)   3 (33%)    5 ( 56%)    0 ( 0%)
Civil        4 (33%)    8 ( 67%)   0 ( 0%) |  0 ( 0%)   3 (25%)    7 ( 58%)    2 (17%)
Computer     4 (36%)    6 ( 55%)   1 ( 9%) |  0 ( 0%)   3 (27%)    7 ( 64%)    1 ( 9%)
Electrical   9 (75%)    3 ( 25%)   0 ( 0%) |  2 (17%)   5 (42%)    4 ( 33%)    1 ( 8%)
Mechanical  15 (50%)   14 ( 47%)   1 ( 3%) |  1 ( 3%)  11 (37%)   15 ( 50%)    3 (10%)

The need for engaging in life-long learning
College     13 (17%)   59 ( 79%)   3 ( 4%) |  2 ( 3%)  11 (15%)   51 ( 68%)   11 (15%)
Chemical     0 ( 0%)    9 (100%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    8 ( 89%)    1 (11%)
Civil        1 ( 8%)   11 ( 92%)   0 ( 0%) |  0 ( 0%)   1 ( 8%)    8 ( 67%)    3 (25%)
Computer     1 ( 9%)    9 ( 82%)   1 ( 9%) |  0 ( 0%)   1 ( 9%)    9 ( 82%)    1 ( 9%)
Electrical   3 (25%)    9 ( 75%)   0 ( 0%) |  1 ( 8%)   2 (17%)    8 ( 67%)    1 ( 8%)
Mechanical   8 (27%)   20 ( 67%)   2 ( 7%) |  1 ( 3%)   7 (23%)   18 ( 60%)    4 (13%)

Basic knowledge of industry practices and standards
College     32 (43%)   42 ( 56%)   1 ( 1%) |  5 ( 7%)  24 (32%)   41 ( 55%)    5 ( 7%)
Chemical     2 (22%)    7 ( 78%)   0 ( 0%) |  0 ( 0%)   0 ( 0%)    8 ( 89%)    1 (11%)
Civil        3 (25%)    9 ( 75%)   0 ( 0%) |  1 ( 8%)   2 (17%)    9 ( 75%)    0 ( 0%)
Computer     4 (36%)    6 ( 55%)   1 ( 9%) |  1 ( 9%)   3 (27%)    6 ( 55%)    1 ( 9%)
Electrical   7 (58%)    5 ( 42%)   0 ( 0%) |  2 (17%)   6 (50%)    4 ( 33%)    0 ( 0%)
Mechanical  16 (53%)   14 ( 47%)   0 ( 0%) |  1 ( 3%)  13 (43%)   14 ( 47%)    2 ( 7%)

Contemporary issues (welfare reform, irradiation, etc.)
College     50 (67%)   24 ( 32%)   1 ( 1%) | 10 (14%)  27 (37%)   33 ( 45%)    4 ( 5%)
Chemical     7 (78%)    2 ( 22%)   0 ( 0%) |  1 (11%)   5 (56%)    3 ( 33%)    0 ( 0%)
Civil        5 (46%)    5 ( 46%)   1 ( 9%) |  2 (17%)   3 (25%)    7 ( 58%)    0 ( 0%)
Computer     2 (40%)    3 ( 60%)   0 ( 0%) |  1 ( 9%)   2 (18%)    7 ( 64%)    1 ( 9%)
Electrical   9 (75%)    3 ( 25%)   0 ( 0%) |  3 (25%)   5 (42%)    4 ( 33%)    0 ( 0%)
Mechanical  22 (73%)    8 ( 27%)   0 ( 0%) |  3 (10%)  12 (41%)   11 ( 38%)    3 (10%)

8. What courses, experiences, teachers, professional organizations, or learning activities did you find most useful in helping to prepare you for becoming an engineering professional?

Chemical

Co-oping helped me get hands on experience in industry. This was extremely helpful when deciding a career path. Unit Operations was a great class because it helped with my presentation and writing skills. Dr. Gadala-Maria was an excellent professor who helped me build a strong engineering foundation. Dr. Van Brunt taught me to take responsibility for my actions as an engineer. (ECHE 467)

Dr. Van Brunt, AIChE

Professors Amiridis, Van Brunt, Ritter, and all of the other Chemical Engineering professors and their courses have been very helpful in learning about the field of Engineering. The AIChE has also been invaluable in providing information about plants and career opportunities in engineering.

Courses: kinetics, safety, mass transfer
Teachers: Van Brunt, Amiridis
Organizations: Tau Beta Pi, AICHE

My Co-op experience prepared me more than anything. Group projects helped as well.

Design classes, freshman English, and computer oriented classes will most likely be the most important.

Dr. Vincent Van Brunt, Dr. Karlene Hoo, All Chemical Engineering faculty; Safety ECHE 467; AIChE

Courses: Unit operations lab, Separations course, Process Safety and Health Course
Organizations: AIChE
Teachers: Dr. Vincent Van Brunt, Dr. Michael Matthews, Dr. Michael Amiridis
Experiences: Participant in AIChE regional and national conventions, internships

I am particularly glad I got to know Dr. Amiridis and Dr. Matthews. They are wonderful, cool-headed, helpful professors (& teachers) and assets to the department. Dr. Van Brunt should be commended for his out-of-class efforts to teach and advise students (and other zany antics). I think this was overall a very helpful, concerned and approachable department. I like the fact that interesting electives were offered. I’m glad engineers have to take so many liberal arts classes. It is necessary. I think that there was a lot of good teamwork, oral presentation, and student interaction. Theses were nice! I feel I received a good education and developed a stronger personality by studying engineering. I don’t always plan to be an engineer, but it has certainly prepared me for life.

Civil

470, fluids, statics, 535

As far as courses go, ECIV 470 (Senior Design taught by Dr. Meadows) was by far the best class I took. I learned how to work together in a team to produce a meaningful project. Also, I learned a lot of “practical” applications to things I’ve learned over the past 4 years. Working as an undergraduate research assistant was also extremely helpful in preparing me to become an engineering professional. I’ve gained more knowledge through that experience than I ever thought I would.

Design classes (pavement design, reinforced concrete, foundations), ASCE

ASCE, Hydraulics


I found our ASCE meetings really helped us prepare for real world problems. The civil professors are very helpful in working with students, teaching us, and in some, helping us find jobs. They are people we can rely on for advice and recommendations. The career service is also an excellent program to be involved in to get a chance for interviews. They not only help us look for jobs or co-op, but also assist us with mock interviews or info on how to succeed in interviews.

Engr. 101 and 102, Eciv 301

I really enjoyed and value my experience here at USC. I really feel that it is important to join your professional organization. I am a member of ASCE and feel that it is very important for others to join. I have a few professors that I would like to commend for their knowledge that they relayed to me: Dr. Petrou, Dr. Meadows, Dr. Baus, Dr. Sutton, and Dr. Kahn.

Courses: Statistics, Soil Mechanics, Hydraulics, Fluids, Foundation Design, Pavement Design, Auto CAD (not enough time here), Solid Mechanics, Rein. Concrete Design, ECIV 300. I learned the most useful information in these courses.
Teachers: Peters, Baus (most useful), Petrou, Pierce, Meadows, Kahn. These teachers were the best and I learned from them all, but I felt that Dr. Baus and Dr. Meadows were the most helpful.
ASCE—I loved it. It was a great experience.

I found that all of the core requirements prepared me for becoming an engineering professional.
ECIV 420—Senior Design (very important)—Meadows
ECIV 325—Steel Design—Bradburn
ECIV 530—Foundations—Baus
ECIV 327—Reinforced Concrete—Petrou
ECIV 320—Structural Analysis—Bradburn

ASCE—Great hands on experience with everything from design to leadership.
Courses: Senior design, ECIV 490B with Dr. Gassman (Environmental Geotechnics), ECIV 551 (water and wastewater treatment) with Dr. McAnally (because of the projects)
Other: my part-time job made a huge difference in showing me exactly what kinds of jobs I could have and what kinds of work I can do.

Senior design was most useful in that it taught the importance and necessity of teamwork in real life situations. Dr. Meadows was a professor who really motivated his students to learn in his classes through his excitement about the material and through his humor.

Electrical

My favorite courses were always the labs. Doing hands-on work seemed to give my studies a feeling of relevance.

In my five years as a student, the courses that I found most useful in helping me to prepare to become an engineering professional were my calculus courses, both circuits courses, E-mag, communications, microwave engineering, power systems, and the labs. The teachers who have given me the desire to continue, heightened my interests, pushed me to perform better, and just knew how to teach were Dr. Charlie Cook, professor of mathematics at the USC Sumter Campus, Dr. Jerry Hudgins, professor of engineering, and Dr. Ted Simpson, professor of engineering.

Nothing written

Working for the college’s computer support department. Cooperative education was extremely useful. Overall curriculum for electrical engineering.

Not fair. Nothing useful at all. Not interesting!! Waste of time.

401 and 402 labs were most helpful


Dr. Simpson is one of the better instructors here. 401 and 402 labs were the most useful.

NSBE; electronics; circuits; C ++ programming, all labs

Laboratory work, 500 level courses, Dr. Brice, Dr. Hudgins. Dr. Cokkinides is extremely smart but is a little too fast.

Definitely lab courses-the most practical and informative.

Prof. Sudarshan was very insightful in relating real world experiences to the classroom.

All of the labs and my advanced electronics class EECE 571

Computer

Labs, programming classes

Physics 212, Dr. Bob Nerbun, USC Sumter. He helped me conclude my decision to enter engineering school.

Professor Simpson and Professor Sudarshan had personal conversations and in-class references. Lab classes

Courses: labs 402 and 404, EECE 371
Instructors: Dr. Hudgins, Dr. Dougal
Teacher’s Assistants: D. W. (Scooter) Harris

EECE 534, EECE 503A, EECE 511, EECE 512

Interning helped me.

I found the two software engineering courses 351 and 352 taught by Mark Campbell and Chris King respectively, the most beneficial courses I took at college. These courses were challenging and presented practical material that I will use in my career.

Nothing written

IEEE, ECE 351-351, CSCI 575, Mark Campbell, Chris King, Mike Sechrest, ECE 502, ECE 503

National Society of Black Engineers

VTB! The Visual Test Bed was instrumental in helping me understand software engineering principles and practices.

Mechanical

Dr. Keating was one of the few if not the only teacher who really cared about the problems and decisions that you face as an engineer. In his class we found a voice and an opportunity to really speak of how to improve the level of learning and trust that is not present on our university. Many of the other professors seemed so wrapped up in “teaching” us class that our voice as students would never be heard.

Senior design, senior lab

ASME, TBP, Dr. Lyons—Manufacturing Processes, Dr. Young; Dr. Khan, Dr. Sutton, Dr. Dickerson, Dr. Peters. Great teachers who really care about student learning.


American Society of Mechanical Engineers, Research Assistant job with Dr. Stephen McNeill, Senior Lab with Dr. Ed Young, Society of Experiment Mechanics Conferences, open door policy that all professors have.

ASME was incredible under the leadership of Helen Sailer and Heather Stone.

Dr. Young, Dr. Reynolds, Dr. Khan, Dr. Peters, Dr. Gadala-Maria, ASME

Dr. Lyons—Manufacturing (should be required), Dr. Young—Thermo, Dr. Kahn—Fluids, Truly cares for his students, Dr. McNeil—Senior Design, Dr. Keating—Truly cares for his students, Dr. Sutton, Dr. Peters, ASME, Dr. Gibbons—the best person in the College of Engineering

ASME allowed me to meet others in Mechanical Engineering who helped me in homework problems and other learning issues. As a whole the ME professors act as a family and welcome students into their offices.

I found that our basic required courses prepared me a lot, but we need some more EMCH electives (no variety). I also feel that more team project oriented classes will provide that needed experience.

I found my co-op to be intensely useful and highly recommend that it be pointed out more to students. As far as courses, I found my senior lab and design courses to be the best. Through them I was able to relate the real world with the theory learned. Also through them I was able to learn more in the area of communications.

Organizations: ICAR, ASME
Courses: EMCH 394, 354, 507, 467, 527
Experiences: Designing experiments and training personnel on racing team
Teachers: E. Young, D. Keating, A. Bayomi, J. Morehouse

Dr. Keating’s classes—the only classes concerned with ethics and managerial issues.

I believe our Senior Design Course as well as Senior Lab were the most beneficial. Both of these helped with written and oral skills. ASME also helped (maturity, leadership). Dr. Khan should be an inspiration to us all. He is absolutely the greatest teacher that I have come in contact with.

Junior Design, manufacturing processes, ASME, Senior Design Project

I found the materials department to be very good at USC (Reynolds, Sutton, etc.).

Senior Lab presentation skills. Senior Design—working with industry contacts.

N/A

Dr. Sutton—solids, Dr. Keating - , Dr. Young—thermodynamics, Dr. Reynolds—materials

Dave Oberly, math instructor at Spring Valley High, was my calculus instructor in evening class. He was the best instructor I’ve had at USC and I learned more in his class than any other. Professor Clary in statics was a close second to Dave Oberly.

Upper division courses. Courses that emphasized oral presentations. Senior design.

The courses that used the computer as a tool to solve problems: EMCH 301, 507, 508, etc. As far as teachers go Dr. Kahn was the best. If I had not had him for fluids early on I might have quit engineering. Also Dr. Reynolds, Dr. Young, and Dr. Rocheleau were excellent.

Nothing written

Dr. Keating, Dr. Sutton—solids, Dr. Young, Dr. Reynolds


EMCH 520A, EMCH 527, EMCH 467, EMCH 427 and 428

Courses: early core courses, senior design
Teachers: Dr. Young, Dr. Schwartz, Prof. May at USC Aiken
Organizations: ASME, Tau Beta Pi

Junior design and senior design

EMCH 427, 428, 527

Dr. Ed Young, Dr. Wally Peters, Sylvia Therrell

Lab experiments.

EMCH 301 with Dr. Young was very helpful for work in other courses. EMCH 427/428 was very helpful to get an understanding about engineering design in industry. Dr. Sutton and Reynolds: outstanding in communications with students in class and getting students interested in the topic.

9. What recommendations would you make to improve the educational experience for future engineering students at USC?

Chemical

Our school should promote the co-op program more. Professors should talk about the importance of co-oping more often in class. I would even recommend making co-oping mandatory.

Don’t close a computer lab for maintenance when two others have classes in them at the same time.

Stop doing computer maintenance at the busiest times of the day. There are plenty of computers, but when one of the labs is closed for maintenance, and the others are either full or involved in class, it prevents students (mostly seniors) from doing work right after class. Instead of saying that the engineering building is open 24 hours a day, make it open 24 hours a day. Sometimes the most convenient time of day is not between 6 a.m. and midnight, but late night.

Better access to computers.

They need to co-op or at least intern!!!

Process redesign (change in industrial process). Quality management.

Improvements and expansions in courses offered, material covered.

Try to get more professors involved in the activities of discipline specific organizations. Also, try to have an organization for all engineering students. Most engineering students only know students in their specific engineering discipline. By having a non-discipline specific organization or even more social events, students would get to know more of their peers. This would be especially good for freshmen. A facility for copying and making transparencies in the College of Engineering would be a nice addition. It would save the students time in not having to go to Kinko’s, as well as possibly providing the College of Engineering added revenue. I know it is not educational, but a food establishment inside the College of Engineering would be greatly appreciated. As freshmen are required to have a meal plan and higher level engineering students live at the College of Engineering, a food establishment would be well used. Have a more dependable server. I cannot count the number of times that the system and email has been down. Also, add more computers. When classes are using the computer labs, there are not enough computers for everyone else to use. Try and have someone from computer services to have a help desk from 8 a.m. to 10 p.m. Now they are only open for a few hours in the early afternoon.


Considering that nearly all students pursuing undergraduate degrees in USC engineering will go to work in industry, it is important that the emphasis be on more practical, hands-on learning. The professors in the Chem - E dept are lovely people, but often their teaching comes across as overly erudite and caters to those pursuing advanced degrees. Some of the professors don’t even seem to have that much industrial/practical experience to bring to the classroom. I think that greater contact should be established with industrial contacts. A team-taught Process Control Design class would be great. I think professors might learn to be more generous about telling you when you have done a good job. It makes a difference. I think a point should be made about recognizing student leaders in Engineering. They do make a difference! I believe this was once a common-practice, but has declined in recent years. A new curriculum or course plan should be drawn up to incorporate co-oping. This should be offered to students as an option from the very beginning. Additionally, foreign language and business minor schedules (4-year) should be established and offered. The COE should work with a foreign university to set up an engineering exchange program. U of Leeds has one with Penn State. With a little work, it can happen. We don’t know who or what we are when we come here. It is up to you to offer us options. In the end, it comes down to how complacent you are with the current state of things and how much you are willing to change and better yourself in the process.

Civil

More actual hands on teaching in classrooms. Some people learn easier and faster when they actually see a process happening.

I would recommend more faculty and student interaction during the freshman and sophomore years.

Do away with ECON 421 and ECIV 405 or take them out of required classes.

None

I feel the electrical engineering department is horrible!! One professor is mean and only helps students in its own section outside the classroom and during test taking. That is completely unfair to the other sections! (This is about circuits.) Also USC electrical professors don’t seem to care what sections the students decide to sit in. They are basically allowed to sit in another section that is technically not their section. They are never there to help you except for one. And the TA’s are never there during office hours. These situations do not really help the electrical students and all those who must take Circuits I to improve on their educational experience. I find the mechanical and civil engineering departments have wonderful professors who are willing to help any student! If I know someone who is considering electrical engineering, I would say NO!

Increase the types of software for civil engineering applications.

I would first have to agree with the ideas of others and preach using more computer programs for solving problems. I believe that once you teach students the basic equations and ideas, then you should go along and provide a computer program to solve these problems. This would help prepare students more for the real world. I think that some classes like STAT 509, ECON 421, and ECIV 405 should be replaced by classes that will help in the future. I think you can combine 101 and ECIV 300 and make a class learning about MS Office, Mathcad, etc. I think you should also provide a class dealing strictly with surveying; this class should have a lab. Use the structures lab!!

Get rid of ECIV 405, ENGR 101, STAT 509, maybe ECIV 301 and economics. I felt that these classes took up time that could have been used on more important things like programs that solve engineering problems that are used in the workplace such as WATERCAD and EAGLEPOINT for example. I think that circuits and dynamics was a waste of time, for civil engineers anyway. I think that one semester of soils, foundation, and hydraulics is not enough and should be put in place of some of the above courses I have listed. I also felt that there should have been more time on surveying and less on engineering economics. (ECIV300)


I feel like the civil engineering department should offer Surveying I, Surveying II, and Highway Design. These are the three courses that I have to take at another educational facility. Without these three courses a graduate of USC (civil engineering) cannot take the L.S.I.T. and P.S. For that reason, USC should offer these courses. I feel that these courses would help everyone in the future.

I believe we need to have a course in Mathcad and Autocad. These are two relevant courses and both programs are used in many workplaces.

Get freshmen and sophomores involved in their organizations (AICHE, ASCE, etc.). Keep up the good work with faculty-student interaction. It’s great in the Civil/Environmental department. Put AutoCAD on the computers used by civil and mechanical students. Standardize lab reports.

There needs to be a more in depth teaching of computer software. ECIV 405 needs to be explained more clearly.

Computer

More computers that are not used as classrooms

Study, study, study!

Food station.
Advisement software (degree matrix)—program to take your transcript and form a path of where you are going and what prerequisites you need to get there. This would help students and advisors to better map a plan based on where the student is in their academic career.

More emphasis on technical writing

More courses emphasizing IT systems and design

Try to recruit more black professors so that the students have people that they can model. Provide more courses in things like networking, web page design, Java, etc. Get instructors who care about learning and excelling.

I believe the college should offer more computer and software courses. I felt my choices as a computer engineering major were too limited and courses were not offered in areas I would have liked to taken.

Nothing listed.

Hire more computer engineering faculty and offer more classes. Make sure that the current ones care (most don’t seem to). Have Engr 101 equivalent introduction to department and specific major.

More variety of courses and professors

Do not use TA’s which do not have sufficient knowledge to teach a course, i.e., a student who has taken the course they are going to teach only one semester before teaching it.

Electrical

1.) Better texts. Better texts. Better texts. I cannot stress this enough --the books used by most professors seemed to have been written on a level to impress the author’s peers. These books are not written with consideration for the way people learn and assimilate new information. Though the texts might make for good reference sources -- they most surely are not suitable for people trying to learn the material.

2.) Stop professors from outlining the text as their teaching method. We pay up to $100 for the book and $300 or more for a professor to clarify the text. However, most professors I had only outlined the text, sometimes word for word. I only had one professor who ever indicated how their teachings were used in a real world environment. That professor was sharing wisdom that could produce valuable engineers.

3.) Fire every last professor that rolls their eyes or holds contempt for students that don’t get it the first time through. Engineering lessons are tough and not everyone is going to understand all of the concepts the first time. If these professors have better things to do than to illuminate someone in trouble then they need to be elsewhere.

4.) Link every last lesson to how it is used in industry. The best classes were those where the professors told how the theory was used in a real-life problem. I know that the professors are experienced but they don’t share it with the rest of us. I am a young engineer by now, but I have little idea of what the industry expects of me or of what my proper place is in the job market.

As hard as it may possibly be, I think the electrical engineering department needs to do what it takes to place more emphasis on student education and less on research. I think if the electrical department wants to be as successful as the chemical department, more professors need to be hired for the electrical side of the house. Diversify the electrical program. Not everyone wants to work in power systems or power electronics. I think some of the professors come to class and teach off the top of their heads. It doesn’t seem like they had the time needed to look over the text to see how the author has prepared the information and problems. They come to class, look in the book to see what chapter or section we are supposed to cover, explain their interpretation of the subject and then assign the author’s problems. It becomes apparent within the first week that the professor hasn’t had the time to go over the material in the text but, because you have no choice in professors for a course in a semester, no one says anything and they just try to make it through the course.

Nothing written.

Get rid of outdated courses and/or professors. Technology is the driving force behind engineering, especially in the computer engineering field.

Stop discrimination!!! Stop cheating!! Stop giving the same homework and exams each semester! Stop competitiveness and hatress!

Allow students to choose a specific part of engineering such as power systems or semiconductors design and have more courses specific to those fields.

Clean up the labs and have more equipment. Have each person pick a discipline before their junior year. Have classes scheduled so they can work in that discipline, which means have a course schedule about two years before so juniors can determine their junior and senior years. I had so many companies ask what my discipline was. Electrical – (electronics, computer, power, etc.) Have instructors care more about students instead of research.

More teaching assistance for each class. Review sessions before each test. More student/faculty interaction. Recruiting of many African-American students.

Better lab equipment, easier access to computer labs for upper classmen, and more memory for students in computer classes (C++, etc.) underclassmen don’t need as much memory as upper classmen.

More presentations. More liberal art electives (public communication, economics, etc.)

None.

Faster network connections

Mechanical

Get rid of grades!! Many of the problems with cheating, lack of motivation, distrust and disillusionment that occurs with being an engineer occurs because of an inaccurate grading system. Too much emphasis is placed on what grade you get in class instead of what you really learned over your 4 years. We all know that grades have no bearing on how good an engineer will be after he graduates. All that grades have been linked to is how much money your parents have.

Get students involved in more hands on activities. I feel you can learn just as much if not more by hands on activities as you can from a book. I think it would be good to require a project for the students to do that relates to each course. Also I don’t know how ENGR 101 is set up now, but when I took it I was still trying to figure out what engineering was all about. Maybe ENGR 101 should expose students more to the practical applications of engineering. It might be good to bring in some working engineers and have them tell the class about what their duties include. Many times when freshmen are taking this course they’re still trying to decide what they want to major in. Maybe this could help them in their decision making process. In addition to this maybe they should also bring in upperclassmen to the class to help the students learn “the ropes.”

Computer staff. Poor teachers. Poor lab equipment. The computer staff is reprehensible. These students are paid to service the students, faculty, and labs, but they are extremely rude. They make me too embarrassed to tell prospective students that USC has good computer labs and computer services, things which are very important! One student in particular has been extremely rude to both students and faculty. He is < >. Most of the teachers I had were very good. A few were terrible. < > was a bigoted, rude teacher. He displayed racism, sexism and he didn’t teach us any of the material he was supposed to. He also graded unfairly, i.e., he taught material in one class before the exam and then tested us on it on the exam. < > is also a poor teacher. He is monotone, boring, hard to understand, and he treats the class like 6th graders. I feel sorry for students who will have him for senior design. The lab equipment is in disrepair. Many of the labs could not produce decent data and were a big waste of time.

< > was not as adequate as others for < >. I feel that my education is lacking because of the fact I took < > for < >. After talking to other engineering students from other schools, I have learned that what I did/learned in < > is lacking!

Better labs. Many things need updating to this decade.

Teach more ethics and industry practices and standards.

1.) More hands on labs 2.) More interaction with industry 3.) More interactions with professors

Make sure the professors are competent to teach the material the students are paying them for. There should be a better way of recognizing their downfalls and removing them from basic course curriculum. For electives, if you choose to have them fine, but for major classes like solid mechanics and kinematics, the only professor involved should be one who knows the material and how to teach it.

1.) Have more projects like senior design, but better projects 2.) Don’t need professors that don’t care about the students.3.) Change the policy about academic forgiveness.

Communications is the only overall subject I found lacking in my education experience. Until this, my final year, presentations and group work were at a minimum. I’m expected to be able to give presentations, tech and non-tech reports, yet we’re not encouraged to take any courses that will prepare us for this. I believe this is an area that definitely needs work.

Have a required engineering ethics class. Have more applied engineering possibilities. Stop the major push for research $$. Make the students the priority; make preparing the students a priority; make teachers teach and not just do research. Change the microprocessor curriculum. Do away with so much programming.

Computer—More supervision of the computer service personnel. They are consistently rude and arrogant to all that request their help. Too many computer problems and no one is willing to help correct them.Curriculum—more information about actual manufacturing process, machinery, practices.

I would recommend a little more reading. The verbal capacity of engineers is usually very low.


Fewer liberal arts classes and more regarding experimental design

Nothing written.

More challenging classes.

Computer lab service.

I would recommend that students should have to get more involved with their respective departments. For example, the Mechanical Dept. has the legends car, solar car, etc. Maybe you should give students a technical elective credit as being part of one of the required curriculum.

Get instructors who have experience in the real engineering world.

Nothing listed.

More computer integration into courses. Also should require control theory and a class to help prepare for the FE. Also, the career center < >. They need to do more to help the student. The professors should be more active in recruiting undergrads to help out with research. Also make EMCH special problems a requirement.

Do not require micro.

Micro Processors is useless course.

More on engineering ethics, environmental issues and design type of projects.

Eliminate microprocessors class. Replace with something more practical.

Apply in class theory to real world problems. More classes like junior and senior design.

More real world problem-solving classes.

1.) Make the computers work 2.) Improve quality of ECS staff (rather rude and often unhelpful) 3.) make certain classes mandatory (any dealing with ethics, sustainability, and environmental issues)

More hands on experience.

Nothing listed.

Extracurricular Activities or Service

10. Did you have an internship with an engineering company?

Yes 33 (45%) No 41 ( 55%)

             Yes         No
Chemical      4 (50%)     4 (50%)
Civil         4 (33%)     8 (67%)
Computer      5 (46%)     6 (55%)
Electrical    6 (50%)     6 (50%)
Mechanical   13 (43%)    17 (57%)


If yes, where?

Chemical
Department of Defense
International Paper—Hampton, SC
Westinghouse Savannah River Company
Milliken & Co., Barnewell; DuPont (UK) Ltd, England

Civil
Design South Professionals, Inc.
Qore Property Sciences
SCDOT
Milliken and Qore Property Sciences
Columbia Environmental Consulting Firm, BMW Manufacturing Co.

Computer
NCR
NCR Corporation
USC School of Medicine
College of Engineering
IBM, NCR

Electrical
Chavis Electric
Albermarle Corp.—Orangeburg, SC
Marathon Ashland Oil
Union Switch and Signal
R. E. Phelon

Mechanical
Worthington Custom Plastics
Duty Scientific
Dana Corporation
FTE, Inc.
JPL
Aircond., Inc.
AFCO—Anderson, SC
Westinghouse
Cartech
Westinghouse
Ambac International
Atlantic Coast Mechanical
Kemet Electronics
Consolidation System

11. Did you participate in a Co-op program?

Yes 21 (28.0%) No 54 ( 72.0%)

             Yes          No
Chemical      3 (33%)      6 ( 67%)
Civil         3 (25%)      9 ( 75%)
Computer      0 ( 0%)     11 (100%)
Electrical    6 (50%)      6 ( 50%)
Mechanical    8 (27%)     22 ( 73%)


If yes, where?

Chemical

Westinghouse Savannah River Site
Allied Signal/Oak-Mitsui
Union Camp Corporation

Civil

Westinghouse Savannah River Co
SCDOT

Electrical

Georgetown Steel Corporation
Pontiac Foods
Union Switch and Signal
Hubbell/Ohio Brass—Aiken, SC
R. E. Phelon

Mechanical

BellSouth
Thermal Ceramics—Augusta, GA
Georgetown Steel
GE Banyon
Union Switch and Signal
Bose Corporation
Cooper Power Tools
Santee Cooper

12. Did you work while going to school? Yes 60 ( 80%) No 15 (20%)

             Yes          No
Chemical      5 (56%)     4 (44%)
Civil         9 (75%)     3 (25%)
Computer     10 (91%)     1 ( 9%)
Electrical    8 (67%)     4 (33%)
Mechanical   27 (90%)     3 (10%)

If yes, how often?

             Part-time          Part-time           Part-time/Full-time
             (<20 hrs./wk)      (20-30 hrs./wk)     (>30 hrs./wk)
College      32 (53%)           12 (20%)            16 (27%)
Chemical      3 (60%)            1 (20%)             1 (20%)
Civil         6 (67%)            3 (33%)             0 ( 0%)
Computer      4 (40%)            2 (20%)             4 (40%)
Electrical    7 (88%)            0 ( 0%)             1 (13%)
Mechanical   11 (41%)            6 (22%)            10 (37%)


13. Are you planning to attend graduate school?

College Yes 20 (27%) No 21 (28%) Maybe 34 (45%)

             Yes          No          Maybe
Chemical      3 (33%)     0 ( 0%)     6 (67%)
Civil         4 (33%)     5 (42%)     3 (25%)
Computer      1 ( 9%)     2 (18%)     8 (73%)
Electrical    4 (33%)     4 (33%)     4 (33%)
Mechanical    8 (27%)    10 (33%)    12 (40%)

If yes, in what field? If yes, in which University do you plan to enroll?

Chemical

Chemical Engineering (NC State)
Chemical Engineering (unknown at this time)
Business (NA)
Chemical Engineering (University of Texas or Texas Tech)
Business Administration (unsure)
MBA (possibly USC)
Law or business (The best one that will have me.)

Civil

Business (USC)
Environmental Engineering (USC)
Geotechnical/Materials (Clemson)
Environmental Engineering (USC)
Mathematics/Physics (UNC)
Environmental Engineering (USC)

Computer

Business (U of M)
Information Systems
Computer Engineering or MBA (?)
Maybe Education or Psychology
Unknown (UT at Austin, U of Washington)
Computer Science (NC State)

Electrical

Communications (undecided)
Not in the engineering department at USC
Business (undecided)
Electrical Engineering (USC)
MBA (undecided)
Business Administration (undecided)
EE or BA (USC)

Mechanical

MBA (USC)
ME or Education


Mechanical Engineering (USC)
MBA (Too soon to tell)
Business (USC)
MBA or ME (USC)
Mechanical Engineering (not sure)
ME (USC or Penn State)
Mechanical Engineering (NC State)
Business (?)
Mechanical (USC)
Unknown (not USC)
Business (USC or Maryland)
Business (USC)
Business Administration (USC)
Mechanical Engineering (USC)

Employment Information

14. Have you accepted a position at this time? Yes 39 (53%) No 35 (47%)

             Yes          No
Chemical      5 (56%)     4 (44%)
Civil         5 (46%)     6 (54%)
Computer      5 (46%)     6 (55%)
Electrical    6 (50%)     6 (50%)
Mechanical   17 (57%)    13 (43%)

If yes, what is the name of the company or organization and your job title?

Chemical

International Paper Company
Allied Signal
Ingersoll-Rand (Chemical Project Engineer and Environmental Coordinator)
International Paper—Vicksburg, MS (Project Engineer)
Westinghouse Savannah River Company (Associate Engineer in Defense i.e. Tritium)

Civil

Qore Property Sciences (Geotechnical Engineer)
Southern Company (Engineer III)
SCDOT (Engineering Associate I)
Power Engineering
Jaderloon Co. Inc. (Structural Engineer)

Computer

SRS Westinghouse (Senior Engineer A)
Palmetto Health Alliance (Systems Analyst/Programmer)
Booz, Allen, and Hamilton (Consultant I)
Microsoft (NT Support Engineer)
IBM (Software Engineer)

Electrical

Bethlehem Steel (Engineer in planning and layout)


Microsoft Corporation (Support Engineer)
Marathon Ashland Oil (Engineer I)
Keyence (Applications Engineer)
USAF (Developmental Engineer)
R. E. Phelon (Product Engineer)

Mechanical

Solectron (Associate Design Engineer)
CP&L (Associate Engineer)
Milliken and Company (Assistant Plant Engineer)
FTE, Inc. (Assistant Engineering Manager)
Corning Asan. (Engineering Mold Design)
Cutler-Hammer (Engineering Design, Professional Management Program Employee)
Westinghouse Cnfd.
Allied Signal (Area Engineer)
Anderson Brass Co. (Staff Engineer)
ACM (Project Manager)
Best Auto Sales (owner/manager)
United States Navy (Officer, Nuclear Engineering)
Milliken (Process Improvement Engineer)
Vickers Aerospace Marine Defense (Manufacturing Engineer)
Engineered Systems (Engineer)

15. Did you participate in career planning in the Career Services Office?

             Yes           No
College      45 (61.6%)   28 (38.4%)
Chemical      7 (87.5%)    1 (12.5%)
Civil         5 (45.5%)    6 (54.5%)
Computer      5 (45.5%)    6 (54.5%)
Electrical   10 (83.0%)    2 (17.0%)
Mechanical   17 (57.0%)   13 (43.0%)

16. When did you enroll at USC?

             1979       1988       1991       1993       1994
College      1 ( 1%)    2 ( 2%)    1 ( 1%)    5 ( 7%)    26 (35%)
Chemical     0 ( 0%)    0 ( 0%)    0 ( 0%)    0 ( 0%)     4 (44%)
Civil        0 ( 0%)    0 ( 0%)    0 ( 0%)    0 ( 0%)     5 (42%)
Computer     0 ( 0%)    0 ( 0%)    1 ( 9%)    0 ( 0%)     3 (27%)
Electrical   0 ( 0%)    0 ( 0%)    0 ( 0%)    1 ( 9%)     8 (73%)
Mechanical   1 ( 3%)    1 ( 3%)    0 ( 0%)    4 (13%)     7 (23%)

             1995       1996       1997       1998
College      21 (28%)    6 ( 8%)    8 (11%)    3 ( 4%)
Chemical      3 (33%)    1 (11%)    1 (11%)    0 ( 0%)
Civil         3 (25%)    2 (17%)    2 (17%)    0 ( 0%)
Computer      4 (36%)    0 ( 0%)    2 (18%)    1 ( 9%)
Electrical    2 (18%)    0 ( 0%)    0 ( 0%)    0 ( 0%)
Mechanical    9 (30%)    3 (10%)    3 (10%)    2 ( 7%)


17. Did you transfer from another college or university?

Yes 27 (35%) No 47 (65%)

             Yes          No
Chemical      3 (33%)     6 (67%)
Civil         5 (42%)     7 (58%)
Computer      4 (  %)     8 (73%)
Electrical    3 (25%)     9 (75%)
Mechanical   12 (40%)    18 (60%)

If yes, what was the transfer institution?

Colleges

Midlands Technical College
Trident Technical College

1 Coastal Carolina
3 USC - Sumter
3 USC - Aiken
1 USC - Lancaster
1 USC - Union

11 Other universities

Chemical:
USC Sumter
USC Sumter

Civil:
Anderson College
Winthrop University
USC Union
Coastal Carolina University
Clemson, Midlands Tech, USC Salkahatchie

Computer:
USC Aiken
USC Aiken
MTC
Trident Technical College

Electrical:
Belmont Abbey College
The Citadel

Mechanical:
College of Charleston
Marine Maritime Academy
Clemson University
Greenville College of Chicago
Midlands Tech
Clemson University
USC Lancaster


University of Maine
USCS
USC Aiken
Francis Marion University

18. What is your major?

Chemical 9 (12.2%) Civil/Environmental 12 (16.2%) Computer 11 (14.9%) Electrical 12 (16.2%) Mechanical 30 (40.5%)

19. What is your cumulative GPA (grade point average)? range: 2.00 to 4.00; mean: 3.15 (SD = .43); median: 3.1; mode: 3.1

             2.0–2.4     2.5–2.9     3.0–3.4     3.5–4.0
College       3 ( 4%)    21 (29%)    31 (42%)    18 (25%)
Chemical      0 ( 0%)     0 ( 0%)     5 (56%)     4 (44%)
Civil         1 ( 8%)     7 (58%)     3 (25%)     1 ( 8%)
Computer      0 ( 0%)     5 (56%)     4 (36%)     2 (18%)
Electrical    1 ( 8%)     3 (25%)     5 (32%)     3 (25%)
Mechanical    1 ( 3%)     6 (21%)    14 (48%)     8 (28%)

20. What is your gender? Male 59 ( 80%) Female 15 (20%)

             Male          Female
Chemical      4 ( 44%)     5 (56%)
Civil         7 ( 58%)     5 (42%)
Computer     10 ( 91%)     1 ( 9%)
Electrical   12 (100%)     0 ( 0%)
Mechanical   26 ( 87%)     4 (13%)

21. What is your ethnicity?

             Caucasian   African-American   Hispanic   Asian/Pacific Is.   Native American   Other
College      56 (76%)     9 (12%)           4 ( 5%)    3 ( 4%)             0 ( 0%)           2 (3%)
Chemical      8 (89%)     0 ( 0%)           0 ( 0%)    1 (11%)             0 ( 0%)           0 (0%)
Civil        10 (83%)     1 ( 8%)           0 ( 0%)    0 ( 0%)             0 ( 0%)           1 (8%)
Computer      8 (73%)     3 (27%)           0 ( 0%)    0 ( 0%)             0 ( 0%)           0 (0%)
Electrical    6 (50%)     3 (25%)           1 ( 8%)    1 ( 8%)             0 ( 0%)           1 (8%)
Mechanical   24 (80%)     2 ( 7%)           3 (10%)    1 ( 3%)             0 ( 0%)           0 (0%)


College of Engineering & Information Technology Seniors:

An Assessment of Students’ Experiences and Opinions

May 1999 Senior Survey Analysis of Results

Goals/Objectives

Students graduating from the College of Engineering & Information Technology in May 1999 completed a survey requesting information about their undergraduate college experience and their judgment regarding specific engineering skills and abilities. The purposes of the survey are fourfold: (1) to present conclusions regarding the overall outcomes of students’ academic and extracurricular engineering performance for use in decision making; (2) to present results about programs, activities, etc., in order to improve those programs; (3) to enhance understanding and appreciation of formative and summative evaluation; and (4) to contribute to the general body of knowledge with regard to evaluation of undergraduate engineering programs.

Administration Procedures

The Director of Assessment administered the Senior Exit Survey to students within the EMCH 467 and EECE 402/404 classes during the third week in April. Civil Engineering distributed and collected surveys from their department office. A senior in Chemical Engineering distributed and returned surveys for their department.

Surveys distributed using these methodologies resulted in the following return rates:

Chemical    60% ( 9 of 15 surveys)
Civil       86% (12 of 14 surveys)
Computer    73% (11 of 15 surveys)
Electrical  86% (12 of 14 surveys)
Mechanical  97% (30 of 31 surveys)

A total of 74 surveys were collected, for an overall return rate of 83 percent for the May 1999 senior sample. Return rates for the Computer, Electrical, and Mechanical programs were higher than for the previous semester; however, the proportion of surveys collected for the Chemical Engineering department was significantly lower. The return rate for Civil Engineering was about the same as for the December 1998 survey administration.
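The return rates above are simple ratios of collected to distributed surveys. The short Python sketch below (dictionary names are illustrative; the counts are copied from the report) reproduces the per-program and overall figures.

    # Return-rate arithmetic for the May 1999 administration.
    # Counts are taken from the report; the names are illustrative only.
    distributed = {"Chemical": 15, "Civil": 14, "Computer": 15,
                   "Electrical": 14, "Mechanical": 31}
    returned = {"Chemical": 9, "Civil": 12, "Computer": 11,
                "Electrical": 12, "Mechanical": 30}

    for program, n in distributed.items():
        print(f"{program}: {returned[program] / n:.0%} ({returned[program]} of {n})")

    print(f"Overall: {sum(returned.values()) / sum(distributed.values()):.0%}")  # 83%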

Instrument

A three-page survey accompanied by a title page was developed to obtain information in the following areas:

(1) overall ratings of students' engineering education
(2) life-long learning indicators
(3) assessment of specific college services
(4) opportunity to make recommendations
(5) evaluation of ABET skills and competencies
(6) useful experiences
(7) extracurricular activities
(8) plans for graduate education
(9) employment information
(10) demographic information including transfer status

Sample Demographics

Demographic information requested from the graduating seniors included ethnicity, gender, cumulative GPA, major, transfer institution and year of USC enrollment. It should be noted that although surveys were returned, some students did not answer all items on the instrument. Totals given in the distribution of responses or discussed in this summary vary according to the number of students supplying the information.

According to the data analysis, 59 males (80%) and 15 females (20%) returned surveys. Surveys were returned by 56 Caucasians (76%), 9 African-Americans (12%), 4 Hispanics (5%), 3 Asian/Pacific Islanders (4%), and 2 "other" students (3%). The return sample was fairly representative of the gender and ethnicity distribution within the graduating class.

Length of Enrollment

Seniors in the May 1999 sample began their engineering coursework at the Columbia campus over a nineteen-year span (1979–1998). This lengthy time period suggests that some of these students attended part-time over a long period or stopped out during their academic career. Seniors, including the long-term students, required from one to twenty years to graduate from the College of Engineering. The numbers and percentages of students enrolling in each year are as follows:

1979   1 ( 1%)
1988   2 ( 2%)
1991   1 ( 1%)
1993   5 ( 7%)
1994  26 (35%)
1995  21 (28%)
1996   6 ( 8%)
1997   8 (11%)
1998   3 ( 4%)

Students who entered Engineering during 1994 to 1998, approximately 86% of the sample, completed their degree within a time frame of five years or less. Students completing their degree within four years or less totaled 51 percent of the sample.
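The five-year and four-year figures follow directly from the entry-year counts above. A small sketch of the arithmetic (assuming, as the report does, May 1999 graduation and the full sample of 74 as the denominator):

    # Time-to-degree shares from the entry-year counts above.
    # The report uses the full sample (74) as the denominator, although
    # 73 students answered the enrollment-year item.
    entries = {1979: 1, 1988: 2, 1991: 1, 1993: 5, 1994: 26,
               1995: 21, 1996: 6, 1997: 8, 1998: 3}
    sample_size = 74

    within_five = sum(c for year, c in entries.items() if 1999 - year <= 5)
    within_four = sum(c for year, c in entries.items() if 1999 - year <= 4)
    print(f"five years or less: {within_five / sample_size:.0%}")  # 86%
    print(f"four years or less: {within_four / sample_size:.0%}")  # 51%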

Departmental Results: The length of student enrollment varied by program. Computer Engineering had the largest percentage of students (64 percent) completing their degree in four years or less. Approximately 58 percent of the Mechanical and Civil Engineering seniors, 56 percent of the Chemical Engineering seniors, and 18 percent of the Electrical Engineering seniors enrolled in 1995 and finished in 1999.

Transfer Population

Information supplied by the seniors reveals that a significant percentage attended another college or university prior to their enrollment within the College of Engineering at USC. Transfer students represented approximately 35 percent of the cohort (27 students). Eighty-nine percent of the transfers are male and 85 percent are Caucasian. These demographics suggest that a slightly higher percentage of the transfer population is male and Caucasian than is found in the total sample.

Seniors transferred to USC from a variety of two- and four-year colleges and universities. Students attending a regional campus of USC accounted for a significant segment of this group, approximately 38 percent, or a total of nine students. An additional 17 percent of the transfer students attended Midlands or Trident Technical College, with the remainder of the group, approximately 45 percent, coming from various four-year colleges within the state and around the U.S.

The dates of first-time enrollment for transfer students covered a five-year period from 1994 to 1998. The following listing gives the number and percentage of students enrolling during each year:

1994  3 (11%)
1995  8 (31%)
1996  5 (19%)
1997  8 (31%)
1998  2 ( 8%)

These statistics indicate that all of the transfers graduated within five years and that 89 percent graduated within four years.

Departmental results: Frequency distributions indicate that each program area included at least three transfer students. The number and percentages of transfer students within each program are listed below:

Chemical     3 (11%)
Civil        5 (19%)
Computer     4 (15%)
Electrical   3 (11%)
Mechanical  12 (44%)

Comparison of transfer and non-transfer students: The distribution of responses was similar for transfer and non-transfer students on each of the following items: gender, internships, co-ops, working while in school, number of hours worked, and GPA. Significant differences in the response patterns were observed for several variables. A higher percentage of transfer students (42 percent) than non-transfer students (20 percent) indicated plans to attend graduate school at the time of the survey. There was a small difference in the ethnic distribution of the transfer and native groups of students who responded to the survey: Caucasians represented 79 and 67 percent of the two groups, respectively.
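The report flags these differences as "significant" without naming a test. One common way to check such a difference on a categorical item is a chi-square test of independence; the sketch below is illustrative only, with a 2x2 table reconstructed from the percentages quoted above (42 percent of 27 transfers versus 20 percent of 47 non-transfers, rounded to whole students).

    # Illustrative chi-square test of independence for graduate-school plans,
    # transfer vs. non-transfer. Counts are reconstructed from the quoted
    # percentages and rounded; the report does not state which test was used.
    from scipy.stats import chi2_contingency

    table = [[11, 16],   # transfers: ~42% of 27 planned on graduate school
             [9, 38]]    # non-transfers: ~20% of 47
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")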

General Performance Indicators

Four questions on the survey were designed to yield a measure of students' satisfaction with their overall undergraduate experience, with primary emphasis on the teaching/learning process within the College of Engineering. As one indicator, students were asked if they would recommend the program to a relative or friend. Approximately 65 percent of the graduating seniors replied affirmatively. Approximately twelve percent (9 students) said they would not recommend an engineering degree; students in this response category include six electrical and two computer engineering majors as well as one mechanical engineering senior. An additional 22 percent of the sample selected a "maybe" response to this question. At least one student from each program area selected this response, but a majority of these students were computer and mechanical engineering majors.

Students were also asked to rate their satisfaction regarding their preparation to become an engineer on a scale from “not satisfied” to “very satisfied.” Students indicating a “very satisfied” or “satisfied” response pattern totaled approximately 81 percent of the respondents. Seven percent were undecided and 12 percent expressed dissatisfaction with their engineering preparation. The distributions for each program show that the nine students who chose the “not satisfied” or “a little satisfied” responses included seniors from the Computer (18%), Electrical (25%), and Mechanical (14%) programs.

Students were asked to rate their preparation to obtain a job after graduation. Students describing their preparation for a job as “satisfactory” or “very satisfactory” totaled 77 percent of the senior sample. Three students rated their preparation as “not satisfactory” and six students, or eight percent of the seniors, rated their preparation as “somewhat satisfactory.” One or more seniors representing the Civil, Computer, Electrical and Mechanical programs expressed a less than satisfactory evaluation.

The final question in this section asked students: “How would you rate your preparation to become a contributing member of society?” Approximately 91 percent of the sample believe their preparation was “satisfactory” or “very satisfactory.” Four students (six percent) gave a “somewhat satisfactory” or a “not satisfactory” response and three students (four percent) were undecided regarding this issue.

Indicators of Life-long Learning

As one indicator of students' motivation to continue the education process, the survey included questions concerning the types of publications read and the news programs students listen to or watch. Of the students completing surveys, approximately 87% (66 students) indicated they read at least one publication other than a textbook on a regular basis. All but 16 of these students listed more than one publication in response to this inquiry. Newspapers (45 students) and engineering magazines (34 students) were cited most frequently as the publications students read, typically on a weekly or monthly basis. Very few students mentioned reading a magazine or newspaper every day. Engineering publications read by the students include Chemical Engineering Progress, the ASCE Newsletter, IEEE Spectrum, and the ASME Journal. Students indicated that engineering magazines are read "regularly" or "monthly." Students also read other types of magazines, including news publications (such as Newsweek) and hobby, science, and sports magazines (such as Popular Science, Sports Illustrated and Men's Health).

In response to the question about the kinds of news programs they watch or listen to, 87 percent indicated they follow a news program on a regular basis. Students most frequently listed two types of broadcasts: cable news (26 students) and national and/or local network news (35 students).

Assessment of College Services

Students rated ten different categories of services provided by the College of Engineering using a scale from "very dissatisfied" to "very satisfied." At least 48 percent of the students indicated they were "satisfied" or "very satisfied" with every college service listed in the table. Areas receiving the highest marks include: comfort and appropriateness of classrooms (83%); value of assistance provided by Student Services (75%); the respectfulness of the Teaching Assistants (73%); Teaching Assistants' display of a clear understanding of the subject matter (73%); and the advisor's knowledge of program requirements (72%). College services that a significant percentage of students evaluated as unsatisfactory (by selecting a "very dissatisfied" or "dissatisfied" response) include: availability and condition of computers (35%); availability and condition of lab equipment (38%); and overall condition of labs (31%). These student response patterns are also similar to those obtained from the previous survey in December 1998.

Departmental Results: Analysis of the college-wide programs and services findings suggests that student response patterns on several items varied according to department affiliation. A significant proportion of the Computer and Electrical Engineering majors were dissatisfied with the value of the advisement services (46% and 50%), the conditions of the labs (73% and 58%), and the availability and condition of lab equipment (73% and 58%), respectively. Approximately 25 percent of Electrical Engineering students were dissatisfied with their advisor's knowledge of their program requirements. Mechanical seniors expressed dissatisfaction with the availability and condition of computers (50%) and lab equipment (40%). The negative perception regarding the computer labs was also shared by students representing the other programs, but to a lesser extent: Chemical (22%); Civil (25%); Computer (36%); and Electrical (16%).

Ratings of Competencies

Seniors were asked to provide their opinion regarding the amount of experience and their satisfaction with the level of competency they achieved on 21 different skills and competencies as a result of their USC education. These skills are grouped into three major categories. The following paragraphs summarize these findings.

Category 1: An ability to apply engineering terms and principles, mathematics, chemistry, and liberal arts.


Amount of Experience: May 1999 seniors rated the amount of educational experience received in engineering terms, principles and theories, advanced mathematics, and chemistry concepts about the same, but the response pattern for liberal arts was substantially different. Overall, 89 percent of the students believe they received an "adequate" amount of experience in engineering terms and principles. The amount of experience in advanced mathematics and chemistry/physics was also rated as "adequate" by 85 and 84 percent of the seniors, respectively. Ratings for the application of liberal arts coursework followed a slightly different pattern, with 67 percent of the students stating that their college experience was adequate. Approximately 24 percent of the seniors believe that they received "too much" liberal arts coursework.

Departmental results: Regarding the application of engineering terms, principles and theories, advanced math, and chemistry/physics, the distribution of responses by department shows some variation on each of these items, but, in general, percentages for each alternative did not substantially deviate from the college totals. Unlike the other programs, Mechanical seniors were more diverse in their responses concerning advanced mathematics: although 76% believe their math preparation was adequate, 14% felt they received "too little" experience and 10% felt they received "too much" training in this area. For the ratings on the amount of experience received in liberal arts coursework, response patterns for the Electrical and Civil programs were somewhat different from the other programs: approximately 42 and 33 percent of those seniors, respectively, believe they received "too much" experience.

Level of Competency: Approximately 94 percent of the seniors who responded to the survey were satisfied or completely satisfied with their level of competency relating to engineering terms, principles and theories. Students also expressed satisfaction with their competency level in advanced math (94%) as well as chemistry/physics (91%) and liberal arts (92%). It is clear that seniors are satisfied with the level of competency they achieved in each of the four areas.

Category 2: An ability to identify and solve engineering problems; design a system to meet desired needs; use the computer as an analysis tool; function on multidisciplinary teams; function in culturally diverse settings; communicate orally, in writing and with computer software; design/conduct experiments; and analyze/interpret data.

Amount of Experience: Overall, most seniors believe they received an adequate amount of experience on each of the skills within this set of competencies. At least 64 percent of the seniors indicated that they received an adequate amount of experience with each of the competencies listed above. Those skills with the largest percentages of students rating their amount of experience as "adequate" include: identify/formulate/solve engineering problems (89%); analyze/interpret data (84%); use of computer software for communications (80%); and communicate orally (80%). Seniors also believe that an insufficient amount of experience was provided in certain areas. These competencies, and the percentages of students indicating "too little" experience in each area, are as follows: functioning on multi-disciplinary teams (29%), designing/conducting experiments (29%), designing a system or process (23%), and functioning in a culturally and ethnically diverse environment (20%).


Departmental Results: Departmental differences were evident for six of the ten items. A few are noteworthy because a substantial percentage of students in a particular program indicated that "too little" emphasis was given to some skills. For example, 44 percent of the Chemical seniors believe they received inadequate experience in functioning on a multi-disciplinary team, compared to 29 percent of the student sample as a whole and 17 and 18 percent, respectively, for the Civil and Computer Engineering programs. Electrical (33%) and Mechanical Engineering (30%) seniors also indicated insufficient experience with multidisciplinary teams. Response patterns also differed with regard to oral communication: a larger percentage of Electrical Engineering seniors, 42 percent, perceived a weakness in this area, compared to the college total of 15 percent and the totals for the other programs. Departmental differences are also suggested in the seniors' evaluation of analyzing and interpreting data, with 20 percent of the Mechanical Engineering seniors indicating there was insufficient experience provided in this area, whereas seniors in the other programs did not rate this as a weakness. It is interesting to note that 27 percent of the Computer Engineering seniors rated use of computer software for professional communications as inadequate, compared to the overall college total of 12 percent.

Level of Competency: Overall, the students were satisfied with the level of competency they achieved on each of these ten skills. Positive response rates ranged from 78 to 92 percent of the responding students. This finding indicates that students feel confident in solving engineering problems, designing systems, using computers, functioning in ethnically diverse environments, and analyzing experiments. Students also feel confident in their ability to communicate orally, in reports, and through the use of computer software.

Although most of the students feel confident in their capabilities as graduating seniors, a substantial segment identified areas of weakness. These include designing and conducting experiments (22%); analyzing and interpreting data (20%); functioning on multidisciplinary teams (17%); designing a system, component, or process to meet desired needs (16%); and functioning in a culturally diverse environment (15%).

Departmental Results: Student response patterns were fairly similar for a majority of the ten skills listed in this category. A few exceptions should be noted. Forty-three percent of the Electrical Engineering students were dissatisfied with their level of competency to design a system, component or process. Also, a third of the Civil Engineering students expressed dissatisfaction with their skills in using the computer as a tool for analysis and design, and in written communications such as reports, memos, etc.

Category 3: An understanding of professional and ethical responsibilities, environmental aspects of engineering, engineering on a global scale, impact of engineering solutions in global context, life-long learning, industry practices, and contemporary issues.

Amount of Experience: These characteristics were assessed with seven items on the survey. Compared to the other two categories of competencies, smaller percentages of students rated the amount of experience received as "adequate." Adequate ratings on college totals for these competencies ranged from a low of 32 percent to a high of 79 percent. An understanding of professional and ethical responsibilities and of the need for life-long learning were the highest-rated competencies in this category. According to the May 1999 seniors, the skills receiving insufficient instructional emphasis within the curriculum included contemporary issues (67%), the practice of engineering on a global scale (52%), the impact of engineering in a global context (49%), and industry practices and standards (43%).

Departmental Results: In general, response patterns for the Computer, Electrical and Mechanical programs were similar for over half of the competency items. Response patterns for the Chemical and Civil programs also followed a similar trend for six of the ten items. A small segment of the seniors within Computer, Electrical and Mechanical expressed dissatisfaction with the following competencies: identifying and solving engineering problems; designing a system, component or process; using the computer as a tool for analysis; written communication; and analysis and interpretation of data. None of the Civil or Chemical Engineering seniors indicated a negative rating of their competency level on these topics. Oral communications, functioning in an ethnically diverse environment, and use of computer software were all rated positively by the Chemical Engineering students, but one or more students in every other program were dissatisfied with their competency level in these areas.

Although Civil Engineering majors rated the amount of experience received in professional and ethical responsibilities as adequate, 18 to 50 percent of the seniors in the other programs felt their experience was inadequate. It is of interest for curriculum development that 50 percent of the Electrical Engineering seniors noted a deficiency in the curriculum in this area. A similar pattern was observed for the practice of engineering on a global scale: 25% of the Civil Engineering seniors rated this experience as inadequate, but 47 to 75 percent of the seniors in the other programs did so. Likewise, although only 22 and 25 percent of the Chemical and Civil Engineering seniors, respectively, indicated a lack of knowledge regarding industry practices and standards, 53 and 58 percent of the Electrical and Mechanical seniors, respectively, did so. While a majority of Civil Engineering seniors rated this competence as adequately incorporated into the curriculum, seniors in each of the other programs perceived an insufficient emphasis in this area.

Level of Competency: In general, students' satisfaction with their level of competency in these areas reflects their opinion regarding the amount of experience they received. Students expressing dissatisfaction with the level of competency ranged from 16 to 51 percent. The competencies, and the percentages of May graduates rating them as deficient, include: contemporary issues (67%); knowledge of industry practices and standards (43%); impact of engineering solutions in a global context (49%); the practice of engineering on a global scale (52%); and environmental aspects of engineering practices (43%).

Departmental Results: Response patterns for the five programs varied by topic, with no consistent overall pattern detected. Response patterns, by program, were somewhat similar for students' ratings of the environmental aspects of engineering practices and the impact of engineering solutions in a global context. Regarding professional and ethical responsibilities, Electrical and Mechanical majors exhibited lower satisfaction levels than the other programs. Satisfaction with the level of competency in the practice of engineering on a global scale was significantly higher for Civil, Computer and Mechanical majors than for the Chemical and Electrical majors. Chemical, Electrical and Mechanical majors expressed higher levels of dissatisfaction regarding their competency level relating to contemporary issues than Civil and Computer engineers. In addition, a larger proportion of Electrical and Mechanical engineers were dissatisfied with their competency levels regarding basic knowledge of industry practices and the need for life-long learning than seniors from the Chemical, Civil or Computer Engineering programs.

Most Useful Experiences and Activities

Students were asked what courses, experiences, teachers or activities they believe were most useful in helping to prepare for the engineering profession. Responses differed by area of concentration.

Electrical

Eleven students majoring in electrical engineering responded to this question and provided a variety of responses. One student indicated that working with computers was the most beneficial experience of his college career. Several professors, including Hudgins, Simpson, Brice, Cokkinides and Sudarshan, were noted for their roles in heightening students' interest in engineering. Eight of the eleven students said that labwork was the most "practical and informative" academic activity as well as one yielding a "feeling of relevance" to their experience.

Computer
Ten students majoring in computer engineering listed a response to this item. Three students listed EECE 351 and 352, taught by Campbell and King, as challenging and practical. Students also noted 500-level computer courses. Other beneficial activities included interning, labs, NSBE and professors Campbell, King, Sechrest, Hudgins, Dougal, Sudarshan and Simpson.

Civil
Seven students in civil engineering wrote a response for this item. Several students named professors who were helpful, and other students listed beneficial courses and organizations. Professors mentioned include Bradburn, Gribb, Harries, Meadows, Petrou and Sutton. One student noted ASCE as a helpful learning activity. Courses listed by students include ECIV 470, 520, 562, and 563.

Mechanical
There were 29 responses from Mechanical Engineering majors. The top four responses included a listing of outstanding professors, the senior design courses, ASME, and senior lab. Fourteen of the 29 students named particular professors as having a positive impact on their education. Mechanical professors recognized by the seniors include Young (11), Keating (6), Reynolds (6), Sutton (6), Khan (5), Peters (4), McNeill (2), Lyons (1), Morehouse (1), Rocheleau (1), Schwartz (1), and Bayoumi (1). Students also recognized professors from regional campuses and other departments. Twelve students highlighted the senior design sequence when listing useful courses. Ten students said that the ASME organization was useful in fostering interest in Engineering. Other beneficial activities noted by students included co-op and internship experiences, research work, the open-door policy of the professors, and the basic core courses in the curriculum.

Chemical
Nine Chemical Engineering majors provided comments regarding useful college activities, helpful courses or outstanding teachers. Two students suggested that their most influential learning activity was their co-op or internship experience. Students noted the following professors: Van Brunt (6), Amiridis (4), Matthews (2), Gadala-Maria (1), Hoo (1), and Ritter (1). Five students stated that participation in AIChE assisted with interpersonal and leadership skills. Group projects, design classes, computer classes, teamwork, oral presentations, student interaction, and freshman English were also listed as helpful activities in preparing students to become engineering professionals.

Extracurricular Activities

Internships
Only 45 percent of the seniors indicated that they held at least one internship with an engineering company during their academic career. The students with internships (33 students) listed 24 different companies as employers. These companies include, for example, NCR, SC DOT, Milliken, Westinghouse, Quore Property Science, Chavis Electric, Dana Corp and AFCO. A complete listing of all the companies is provided in the frequency distribution of results.

Departmental Results: Students from all majors participated, but levels of involvement varied among the program areas: Chemical (50%), Mechanical (43%), Computer (46%), Electrical (50%) and Civil (33%).

Co-ops
Approximately 28 percent, or 21 students, indicated that they participated in a co-op program. Fifteen different companies provided this work opportunity for these students, including Westinghouse, Allied Signal, Union Camp, SC DOT, Georgetown Steel, Pontiac Foods, Union Switch and Signal, Ohio Brass, R.E. Phelon, BellSouth, Thermal Ceramics, GE Banyon, Bose Corporation, Cooper Power Tools, and Santee Cooper.

Departmental Results: None of the Computer Engineering seniors enrolled in a co-op program; however, 50 percent (6 students) of the Electrical Engineering students and 33 percent of the Chemical Engineering students were participants. Similar proportions of Civil (25%) and Mechanical (27%) students also listed co-op experience.

Career Services

Students were asked if they participated in career planning through the Career Services Office. Approximately 63 percent of the seniors (45 students) indicated using services offered by this office. There was a difference in the participation of students from different programs: over 80 percent of the seniors from Chemical and Electrical Engineering, but less than half of the Civil and Computer seniors, utilized the Career Services Office.

Employment during School

Survey results show that 80 percent of the seniors held a part-time or full-time job while attending school. About half of the working students, 53 percent (32 students), were employed less than 20 hours per week. A sizable proportion, however, 27 percent (16 students), worked more than 30 hours per week, with an additional 20 percent (12 students) engaged for between 20 and 30 hours per week.

Departmental Results: All program areas had at least half of their students engaged in a job while attending college. The proportion of working students ranged from 56 percent of the Civil Engineering seniors to 91 percent of the Computer Engineering seniors. With the exception of the Computer and Mechanical Engineering seniors, a majority of the students (60 percent or more) indicated they worked less than 20 hours per week.

Graduate Education

Students were asked if they plan to attend graduate school. Only 27 percent of the seniors surveyed indicated they have plans to enroll in graduate school; however, approximately 45 percent were unsure. A significant proportion, approximately 27 percent, said they had no plans to attend graduate school.

Departmental Results: Survey data indicated that students in each program area plan to further their education, but a smaller percentage of the Computer Engineering students (only 9 percent) have definite plans to attend graduate school compared to seniors from the other programs.

Program of Study: Students were asked to indicate what program of study they would pursue during graduate school. The 20 students who definitely plan to attend graduate school, as well as some of those who indicated they might pursue further education, responded to this question. Student responses suggest that they are planning to pursue graduate degrees in an engineering field or a business degree. Other students indicated an interest in mathematics/physics, education and communications. Less than half of the students indicated USC as their possible choice of graduate school.

Recommendations

Students were given the opportunity to make recommendations for the improvement of the educational experience for future engineers at USC. Approximately 91 percent of the respondents (68 students) commented on this survey item. A variety of topics emerged from an analysis of the data with students making more than 50 independent suggestions. Seniors provided program specific suggestions as well as more global critiques applicable to college-wide services, etc. Recommendations common to all program areas are listed below:

1.) Redesign curriculum and/or add courses such as computer applications, ethics, business/industry standards, environmental issues.
2.) Add faculty. Provide more caring faculty. Hire competent faculty.
3.) Require or increase participation in co-ops and internship programs.
4.) Greater collaboration with business and industry within the classroom.
5.) Increase oral presentation opportunities.
6.) Increase real world, hands-on, practical activities/projects in courses.
7.) Increase access to computers – more computers – have computer labs available instead of tied up with classes.
8.) Improve instruction – have faculty focus more on course rather than research.
9.) Upgrade and increase maintenance of labs and equipment.
10.) Improved and increased faculty/student interaction.
11.) Improve computer technical support – increase support availability, provide more competent and courteous staff.

Some specific program recommendations, most of which are not covered in the global list, are also noteworthy. The more frequently mentioned suggestions for each area are given below.

Chemical:
1.) Encourage co-op participation.
2.) Improvements/expansions in courses offered, material covered.
3.) Faculty with increased industrial and practical experience.
4.) Increase faculty/industry contact (ex. – team-taught Process Control Design).

Civil:
1.) Add courses such as Surveying I and II, surveying lab, Highway Design.
2.) Increase use within instruction of computer programs for calculations and applications (ex. Mathcad, Autocad, Watercad, EaglePoint).
3.) Delete and/or combine ECIV 300, ECIV 301, ECIV 405, ECON 421 and STAT 509.

Computer:
1.) Better access to computers (computers not used as classrooms).
2.) Hire more computer faculty.
3.) Include more courses emphasizing IT systems, design, networking, web page design, JAVA, etc.

Electrical:
1.) Redesign curriculum: more diversity in courses offered. Delete outdated courses.
2.) Clean labs and provide better equipment in them.
3.) Improve teaching. Hire faculty members who know how to teach.

Mechanical:

1.) Delete the microprocessor course as a requirement.
2.) Implement more applications to real world and problem-solving.
3.) Expose students more to industry practices.
4.) Increase the teaching of ethics, industry practices and standards, environmental issues and design projects.
5.) Improve lab conditions and equipment.
6.) Improve instruction – terminate incompetent and bigoted professors.
7.) Increase faculty/student interaction.


Summary

The College of Engineering administered the Senior Survey to 89 graduating seniors during May 1999. Seventy-four surveys were completed and collected from students yielding an overall 83 percent return rate. This marks a substantial increase compared to the previous semester in which the return rate was approximately 65 percent. Return rates were highest for those departments in which the surveys were administered during class or by a department administrator.

Demographics for the sample of students who completed a survey indicate that 76 percent of the students are Caucasian and 80 percent are male. Approximately 45 percent of the students held an internship during the summer and 28 percent of the sample participated in a co-op program while attending USC. These figures represent a decrease compared to the December 1998 Senior Survey results. Survey findings also show that 80 percent of the seniors held a part-time or full-time job while attending school. A majority of the seniors who held jobs while attending USC, 53 percent, were employed for less than 20 hours per week.

About 27 percent of the seniors have definite plans to attend graduate school and another 45 percent indicate that they might enroll in the future. Only 45 percent of the students who indicated a possibility of pursuing further education stated that they plan to study engineering. A sizable proportion of the seniors, 40 percent, believes they will enroll in a business program in graduate school.

A total of 27 seniors, or approximately 35 percent of the sample, transferred to the USC-Columbia campus from a regional campus or another institution. Approximately 83 percent of these transfers were from a two or four-year college within South Carolina. A substantial segment of the transfers, 33 percent, came from a USC regional campus program. Technical colleges supplied approximately 17 percent of the transfer population. The Mechanical Engineering program had the largest proportion of the transfers with 12 students or approximately 44 percent of the transfer population.

Length-of-enrollment statistics for this semester were somewhat unusual because three students began their college careers in 1979 and 1988. Otherwise, 90 percent of the seniors who began in 1991 through 1998 took five years or less to graduate. Approximately 54 percent of the seniors who entered in 1991 or later graduated in four years or less.

Four questions on the survey asked students to rate their degree of satisfaction with their undergraduate experience within the College of Engineering. Approximately 65 percent of the seniors said they would recommend their program to a friend or relative. This is the same percentage as in previous semesters. A majority of students, 81 percent, indicated satisfaction with their preparation to become an engineer. Approximately 77 percent rendered a positive rating regarding their preparation to obtain a job. Overall, 91 percent of the seniors believe they are satisfactorily prepared to become contributing members of society.

Students assessed their satisfaction with ten categories of services provided by the College of Engineering. Although there is room for improvement, in general, students have a positive perception of these selected services. Areas of strength include: the comfort and appropriateness of classrooms (83%), the value of assistance provided by Student Services (75%), the respectful treatment of students by TAs (73%), the TAs' understanding of the subject matter (73%), and the advisor's knowledge of program requirements (72%). According to the following percentages of students, the areas needing the most improvement are: the availability and condition of computers (35%), the availability and condition of lab equipment (38%), the overall condition of labs (31%), and the value of general advisement services received (26%). These are the same weaknesses identified by students in the December 1998 survey administration.

Seniors were asked to give their opinion regarding the amount of experience they received and their satisfaction with the level of competency achieved on 21 different skills. Regarding the amount of experience received in coursework, at least 60 percent of the students believe they obtained "adequate" instructional experience in 16 of the 21 competency areas. Competencies rated the highest by 90 percent or more of the students are given below:

knowledge of engineering terms, principles and theories (94%)
knowledge of advanced math (94%)
knowledge of chemistry and/or physics (91%)
knowledge of liberal arts (92%)
identification, formulation and solving of engineering problems (92%)
ability to use computer software for communications (92%)
ability to communicate in writing (91%)
ability to communicate orally (90%)

Students also identified several skills and/or competencies that received “too little” instructional emphasis in their coursework. Competencies indicated by 30 percent or more of the students as needing improved coverage:

contemporary issues (51%)
practice of engineering on a global scale (41%)
basic knowledge of industry practices (39%)
impact of engineering solutions in a global context (38%)
environmental aspects of engineering practice (33%)

Students, as a whole, were also satisfied with their level of competency in each of the 21 skills. Satisfaction levels ranged from 50 to 94 percent. Tabulations indicate that 90 percent or more of the students gave the highest satisfaction ratings for the following competencies:

knowledge of engineering terms, principles, and theories (94%)
knowledge of advanced math (94%)
knowledge of liberal arts (92%)
identification and formulation of engineering problems (92%)
knowledge of chemistry and/or physics (91%)
written communications (91%)
use of computer software for communications (91%)
oral presentations (90%)


Students were asked what courses, experiences, teachers, or activities were most useful in helping them to prepare for the engineering profession. A majority of students who responded to this question listed particular courses, professors and professional organizations. A significant number of students indicated that the co-op and/or internship experiences were the most important learning activities during their academic career. Many of the students believe that membership and participation in an organization such as NSBE, AIChE, ASCE, and ASME fostered interest in Engineering and provided valuable information about the profession. Students in each discipline mentioned numerous faculty members who were especially helpful with the learning process. Particular courses within each department, such as labs and design courses, were also noted as providing useful experiences for the students.

Students made numerous recommendations regarding the improvement of the educational experience for future engineering students. The most frequently cited suggestions made by all students, regardless of major, include the following:

Redesign curriculum and/or add courses such as computer applications, ethics, business/industry standards, environmental issues.
Add faculty. Have more caring faculty. Hire competent faculty.
Have greater collaboration with business and industry within the classroom.
Increase real world, hands-on, practical activities/projects in courses.
Increase access to computers – more computers – have computer labs available instead of tied up with classes.
Improve instruction – have faculty focus more on course rather than research.
Upgrade and increase maintenance of labs and equipment.
Improved and increased faculty/student interaction.
Improve computer technical support – increase support availability, provide more competent and courteous staff.

Overall, the interpretation of the survey results suggests that students perceive their engineering experience in a positive light. In their evaluation of the ABET Criteria 2000 skills, students believe they received about the right amount of coursework experience and are satisfied with their level of competency on a majority of the specified skills.


Appendix D

Course Survey


Appendix E

Course Survey Reports

(sample)


Spring 2000 Course Survey Results
College and Program Totals

1. The instructor clearly stated the instructional objectives of the course.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 28 (1.3%) 54 (2.4%) 132 ( 6.0%) 1062 (48.0%) 937 (42.3%) 4.28 .79

CSCE    8 (1.9%)    22 (5.1%)    43 (10.0%)   238 (55.2%)   120 (27.8%)   4.02   .87
ECHE    0 (0.0%)     4 (1.4%)     9 (3.2%)    121 (42.6%)   150 (52.8%)   4.47   .63
ECIV    3 (0.7%)     8 (2.0%)    26 (6.4%)    178 (43.6%)   193 (47.3%)   4.35   .80
ELCT   12 (3.7%)    14 (4.3%)    41 (12.6%)   183 (56.1%)    76 (23.3%)   3.91   .93
EMCH    5 (0.7%)     6 (0.9%)    12 (1.8%)    325 (47.5%)   336 (49.1%)   4.43   .65
AIKEN   0 (0.0%)     0 (0.0%)     1 (1.4%)     15 (21.1%)    55 (77.5%)   4.75   .47
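The Mean and SD columns in these tables follow from the five response counts, scored 1 ("Strongly Disagree") through 5 ("Strongly Agree"). A minimal sketch using the college-wide totals for item 1 above; the reported SD matches a population-style calculation, and with samples this large the sample/population distinction is invisible at two decimals.

    # How the Mean and SD columns derive from the frequency counts.
    # College-wide totals for item 1; scores run 1..5.
    counts = [28, 54, 132, 1062, 937]
    n = sum(counts)  # 2213 responses
    mean = sum(score * c for score, c in zip(range(1, 6), counts)) / n
    var = sum(c * (score - mean) ** 2 for score, c in zip(range(1, 6), counts)) / n
    print(f"mean = {mean:.2f}, SD = {var ** 0.5:.2f}")  # mean = 4.28, SD = 0.79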

2. The instructor clearly stated the method by which your final grade would be determined.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 49 (1.9%) 90 (4.1%) 172 ( 7.8%) 933 (42.2%) 972 (43.9%) 4.21 .91

CSCE   14 (3.2%)    26 (6.0%)    42 (9.7%)    212 (49.1%)   138 (31.9%)   4.00   .98
ECHE    2 (0.7%)    13 (4.6%)    19 (6.7%)    108 (38.0%)   142 (50.0%)   4.32   .84
ECIV   12 (2.9%)    21 (5.1%)    35 (8.6%)    152 (37.3%)   188 (46.1%)   4.18   .99
ELCT   15 (4.6%)    21 (6.4%)    47 (14.4%)   156 (47.7%)    88 (26.9%)   3.86   1.03
EMCH    6 (0.9%)     9 (1.3%)    28 (4.1%)    282 (41.2%)   360 (52.6%)   4.43   .72
AIKEN   0 (0.0%)     0 (0.0%)     1 (1.4%)     22 (31.0%)    48 (67.6%)   4.65   .51

3. The instructor clearly explained any special requirements of attendance which differ from the attendance policy of the University.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 32 (1.5%) 54 (2.5%) 311 (14.1%) 961 (43.7%) 842 (38.3%) 4.15 .86

CSCE   13 (3.0%)    15 (3.5%)    74 (17.3%)   196 (45.9%)   129 (30.2%)   3.97   .94
ECHE    0 (0.0%)     8 (2.8%)    40 (14.1%)   121 (42.6%)   115 (40.5%)   4.21   .79
ECIV    6 (1.5%)     5 (1.2%)    55 (13.6%)   159 (39.3%)   180 (44.4%)   4.24   .84
ELCT    8 (2.5%)    19 (5.8%)    82 (25.2%)   141 (43.4%)    75 (23.1%)   3.79   .95
EMCH    5 (0.7%)     7 (1.0%)    57 (8.4%)    321 (47.3%)   289 (42.6%)   4.30   .73
AIKEN   0 (0.0%)     0 (0.0%)     3 (4.2%)     21 (29.6%)    47 (66.2%)   4.61   .57

4. The instructor graded and returned the students' written work (e.g., examinations and papers) in a timely manner.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 94 ( 4.3%) 146 ( 6.6%) 230 (10.4%) 854 (38.6%) 887 (40.1%) 4.04 1.07


CSCE   37 (8.6%)    47 (10.9%)   58 (13.5%)   156 (36.3%)   132 (30.7%)   3.70   1.25
ECHE    1 (0.4%)    18 (6.4%)    29 (10.3%)   128 (45.4%)   106 (37.6%)   4.13   .87
ECIV    6 (1.5%)    26 (6.4%)    39 (9.6%)    160 (39.3%)   176 (43.2%)   4.16   .94
ELCT   41 (12.5%)   38 (11.6%)   54 (16.5%)   123 (37.6%)    71 (21.7%)   3.44   1.29
EMCH    7 (1.0%)    15 (2.2%)    43 (6.3%)    266 (38.8%)   354 (51.7%)   4.38   .78
AIKEN   2 (2.8%)     2 (2.8%)     4 (5.6%)     21 (29.6%)    42 (59.2%)   4.38   .94

5. The instructor met the class regularly and at the scheduled times.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 26 (1.2%) 32 (1.5%) 101 (4.6%) 901 (40.9%) 1143 (51.9%) 4.41 .75

CSCE   12 (2.8%)     5 (1.2%)    38 (8.9%)    197 (46.0%)   176 (41.1%)   4.21   .87
ECHE    5 (1.8%)    11 (3.9%)    14 (4.9%)    111 (39.1%)   143 (50.4%)   4.32   .87
ECIV    1 (0.2%)     1 (0.2%)     9 (2.2%)    137 (33.6%)   260 (63.7%)   4.60   .57
ELCT    5 (1.6%)    13 (4.1%)    26 (8.1%)    175 (54.7%)   101 (31.6%)   4.11   .83
EMCH    3 (0.4%)     2 (0.3%)    14 (2.0%)    259 (37.9%)   405 (59.3%)   4.55   .60
AIKEN   0 (0.0%)     0 (0.0%)     0 (0.0%)     21 (29.6%)    50 (70.4%)   4.69   .46

6. The instructor scheduled a reasonable number of office hours per week.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 41 (1.9%) 66 (3.0%) 228 (10.3%) 882 (39.8%) 997 (45.0%) 4.23 .89

CSCE   22 (5.1%)    21 (4.9%)    62 (14.4%)   174 (40.5%)   151 (35.1%)   3.96   1.07
ECHE    1 (0.4%)     9 (3.2%)    20 (7.0%)    114 (40.1%)   140 (49.3%)   4.35   .78
ECIV    2 (0.5%)     2 (0.5%)    22 (5.4%)    149 (36.5%)   233 (57.1%)   4.49   .69
ELCT   11 (3.4%)    23 (7.0%)    60 (18.3%)   140 (42.8%)    93 (28.4%)   3.86   1.02
EMCH    4 (0.6%)    11 (1.6%)    60 (8.8%)    281 (41.0%)   329 (48.0%)   4.34   .75
AIKEN   1 (1.4%)     0 (0.0%)     2 (2.8%)     23 (32.4%)    45 (63.4%)   4.56   .69

7. Please indicate your satisfaction with the availability of the instructor outside the classroom by choosing one response from the scale above. (In selecting your rating, consider the instructor's availability via established office hours, appointments, and other opportunities for face-to-face interaction as well as via telephone, e-mail, fax and other means.)

Very Dissatisfied   Dissatisfied   Satisfied   Very Satisfied   Mean   SD

Total 36 (1.7%) 107 ( 5.0%) 998 (47.0%) 983 (46.3%) 3.38 .66

CSCE   17 (4.2%)    41 (10.2%)   201 (49.9%)   144 (35.7%)   3.17   .77
ECHE    3 (1.1%)     6 (2.2%)    131 (47.3%)   137 (49.5%)   3.45   .60
ECIV    2 (0.5%)     7 (1.8%)    166 (41.6%)   224 (56.1%)   3.53   .56
ELCT   13 (4.2%)    42 (13.6%)   173 (56.0%)    81 (26.2%)   3.04   .75
EMCH    1 (0.2%)    11 (1.7%)    303 (46.0%)   343 (52.1%)   3.50   .54
AIKEN   0 (0.0%)     0 (0.0%)     23 (33.3%)    46 (66.7%)   3.67   .47

8. The stated course objectives reflect what was actually taught.


Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 27 (1.2%) 77 (3.5%) 201 (9.1%) 1018 (46.1%) 887 (40.1%) 4.20 .84

CSCE    9 (2.1%)    34 (7.9%)    58 (13.5%)   218 (50.7%)   111 (25.8%)   3.90   .94
ECHE    0 (0.0%)     8 (2.8%)    14 (4.9%)    122 (43.0%)   140 (49.3%)   4.39   .71
ECIV    3 (0.7%)     9 (2.2%)    38 (9.3%)    164 (40.3%)   193 (47.4%)   4.31   .79
ELCT    9 (2.8%)    17 (5.3%)    55 (17.0%)   167 (51.7%)    75 (23.2%)   3.87   .92
EMCH    6 (0.9%)     9 (1.3%)    35 (5.1%)    323 (47.1%)   313 (45.6%)   4.35   .72
AIKEN   0 (0.0%)     0 (0.0%)     0 (0.0%)     22 (31.0%)    49 (69.0%)   4.68   .47

9. The assignments were meaningful, and contributed to my understanding of the subject.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 46 (2.1%) 103 (4.7%) 199 (9.0%) 938 (42.4%) 924 (41.8%) 4.17 .92

CSCE   18 (4.2%)    35 (8.2%)    50 (11.7%)   191 (44.6%)   134 (31.3%)   3.91   1.06
ECHE    2 (0.7%)    14 (4.9%)    15 (5.3%)    121 (42.6%)   132 (46.5%)   4.29   .83
ECIV    5 (1.2%)    11 (2.7%)    26 (6.4%)    162 (39.8%)   203 (49.9%)   4.34   .82
ELCT   13 (4.0%)    28 (8.6%)    53 (16.3%)   154 (47.4%)    77 (23.7%)   3.78   1.03
EMCH    8 (1.2%)    15 (2.2%)    52 (7.6%)    284 (41.4%)   327 (47.7%)   4.32   .80
AIKEN   0 (0.0%)     0 (0.0%)     3 (4.2%)     24 (33.8%)    44 (62.0%)   4.56   .60

10. The course was intellectually challenging.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 22 (1.0%) 50 (2.3%) 171 (7.7%) 918 (41.5%) 1049 (47.5%) 4.32 .79

CSCE    9 (2.1%)    19 (4.4%)    44 (10.2%)   200 (46.5%)   158 (36.7%)   4.11   .91
ECHE    0 (0.0%)     3 (1.1%)    14 (4.9%)    107 (37.8%)   159 (56.2%)   4.49   .64
ECIV    2 (0.5%)     6 (1.5%)    25 (6.1%)    161 (39.6%)   213 (52.3%)   4.42   .72
ELCT    6 (1.9%)    11 (3.4%)    39 (12.0%)   154 (47.5%)   114 (35.2%)   4.11   .87
EMCH    5 (0.7%)    11 (1.6%)    48 (7.0%)    272 (39.7%)   350 (51.0%)   4.39   .75
AIKEN   0 (0.0%)     0 (0.0%)     1 (1.4%)     22 (31.1%)    48 (67.6%)   4.65   .51

11. The course was well organized; course materials were well prepared and carefully explained.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 70 (3.2%) 133 (6.1%) 243 (11.1%) 929 (42.3%) 823 (37.4%) 4.05 1.01

CSCE   27 (6.3%)    44 (10.3%)   56 (13.1%)   196 (46.0%)   103 (24.2%)   3.71   1.13
ECHE    4 (1.4%)    16 (5.7%)    19 (6.7%)    114 (40.3%)   130 (45.9%)   4.24   .91
ECIV    7 (1.7%)    21 (5.2%)    43 (10.6%)   148 (36.5%)   187 (46.1%)   4.20   .94
ELCT   26 (8.0%)    34 (10.5%)   70 (21.7%)   133 (41.2%)    60 (18.6%)   3.52   1.15
EMCH    6 (0.9%)    17 (2.5%)    53 (7.8%)    310 (45.6%)   294 (43.2%)   4.28   .78
AIKEN   0 (0.0%)     0 (0.0%)     2 (2.8%)     27 (38.0%)    42 (59.2%)   4.56   .55

12. The required course readings were valuable.


Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 100 (4.6%) 143 ( 6.5%) 423 (19.3%) 845 (38.6%) 680 (31.0%) 3.85 1.07

CSCE   29 (6.8%)    37 (8.7%)     67 (15.8%)   185 (43.5%)   107 (25.2%)   3.72   1.14
ECHE    7 (2.5%)    18 (6.4%)     39 (13.9%)   109 (38.9%)   107 (38.2%)   4.04   1.00
ECIV   14 (3.5%)    13 (3.3%)     91 (22.8%)   141 (35.3%)   141 (35.3%)   3.96   1.01
ELCT   24 (7.4%)    41 (12.7%)   103 (31.8%)   111 (34.3%)    45 (13.9%)   3.35   1.10
EMCH   26 (3.8%)    33 (4.8%)    117 (17.2%)   268 (39.3%)   238 (34.9%)   3.97   1.03
AIKEN   0 (0.0%)     0 (0.0%)      6 (8.5%)     31 (43.7%)    34 (47.9%)   4.39   .64

13. The tests, projects, reports, and/or presentations were related to course objectives.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 24 (1.1%) 49 (2.2%) 161 ( 7.3%) 1007 (46.0%) 950 (43.4%) 4.28 .78

CSCE   10 (2.4%)    18 (4.3%)    45 (10.6%)   218 (51.5%)   132 (31.2%)   4.05   .89
ECHE    1 (0.4%)     1 (0.4%)    13 (4.6%)    121 (42.9%)   146 (51.8%)   4.45   .64
ECIV    1 (0.2%)     6 (1.5%)    28 (6.9%)    169 (41.9%)   199 (49.4%)   4.39   .70
ELCT    8 (2.5%)    18 (5.6%)    37 (11.5%)   177 (54.8%)    83 (25.7%)   3.96   .90
EMCH    4 (0.6%)     6 (0.9%)    35 (5.1%)    301 (44.2%)   335 (49.2%)   4.41   .68
AIKEN   0 (0.0%)     0 (0.0%)     3 (4.2%)     20 (28.2%)    48 (67.6%)   4.63   .57

14. The assessments used to determine the grade in this course were objectively or fairly scored by the instructor or TA.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD

Total 45 (2.1%) 95 (4.4%) 257 (11.8%) 932 (42.9%) 841 (38.8%) 4.12 .92

CSCE   18 (4.3%)    28 (6.6%)    66 (15.6%)   199 (47.2%)   111 (26.3%)   3.85   1.02
ECHE    3 (1.1%)    20 (7.1%)    32 (11.3%)   121 (42.9%)   106 (37.6%)   4.09   .93
ECIV    4 (1.0%)    10 (2.5%)    49 (12.3%)   153 (38.4%)   182 (45.7%)   4.25   .84
ELCT   17 (5.4%)    27 (8.5%)    56 (17.7%)   143 (45.3%)    73 (23.1%)   3.72   1.08
EMCH    3 (0.4%)    10 (1.5%)    52 (7.7%)    292 (43.3%)   318 (47.1%)   4.35   .73
AIKEN   0 (0.0%)     0 (0.0%)     2 (2.9%)     22 (32.4%)    44 (64.7%)   4.61   .55

15. Overall, how would you rate this course?

Very Poor   Poor   Average   Good   Excellent   Mean   SD

Total  48 (2.2%)   109 (5.0%)   226 (10.3%)   1006 (45.8%)   806 (36.7%)   4.10   .93

CSCE   24 (5.6%)    36 (8.5%)    65 (15.3%)   199 (46.7%)   102 (23.9%)   3.75   1.08
ECHE    3 (1.1%)    14 (4.9%)    23 (8.1%)    133 (47.0%)   110 (38.9%)   4.18   .86
ECIV    3 (0.7%)    14 (3.5%)    37 (9.1%)    177 (43.7%)   174 (43.0%)   4.25   .82
ELCT   15 (4.7%)    34 (10.6%)   52 (16.1%)   157 (48.8%)    64 (19.9%)   3.69   1.05
EMCH    3 (0.4%)    11 (1.6%)    46 (6.8%)    321 (47.1%)   300 (44.1%)   4.33   .71
AIKEN   0 (0.0%)     0 (0.0%)     2 (2.9%)     19 (27.5%)    48 (69.6%)   4.66   .54

16. The instructor made the objectives clear for each class.

Strongly Disagree   Disagree   Neutral   Agree   Strongly Agree   Mean   SD


Total 31 (1.4%) 83 (3.8%) 203 (9.3%) 1043 (47.9%) 819 (37.6%) 4.16 .85

CSCE    9 (2.2%)    27 (6.5%)    53 (12.8%)   216 (52.2%)   109 (26.3%)   3.94   .92
ECHE    2 (0.7%)    10 (3.5%)    25 (8.9%)    111 (39.4%)   134 (47.5%)   4.29   .83
ECIV    4 (1.0%)    16 (4.0%)    35 (8.6%)    185 (45.7%)   165 (40.7%)   4.21   .84
ELCT   10 (3.1%)    25 (7.8%)    53 (16.6%)   170 (53.3%)    61 (19.1%)   3.77   .95
EMCH    6 (0.9%)     5 (0.7%)    37 (5.4%)    333 (49.0%)   299 (44.0%)   4.34   .70
AIKEN   0 (0.0%)     0 (0.0%)     0 (0.0%)     27 (38.0%)    44 (62.0%)   4.63   .49

17. The instructor was prepared for each class session.

         Strongly Disagree   Disagree     Neutral       Agree          Strongly Agree   Mean   SD

Total     24 (1.1%)    60 (2.8%)    151 (6.9%)    956 (43.9%)   989 (45.4%)   4.30   .80
CSCE       9 (2.2%)    19 (4.6%)     49 (11.9%)   208 (50.4%)   128 (31.0%)   4.03   .90
ECHE       1 (0.4%)     4 (1.4%)     14 (4.9%)    115 (40.6%)   149 (52.7%)   4.44   .69
ECIV       3 (0.7%)    13 (3.2%)     30 (7.4%)    148 (36.5%)   211 (52.1%)   4.36   .81
ELCT       7 (2.2%)    18 (5.6%)     38 (11.9%)   167 (52.2%)    90 (28.1%)   3.98   .91
EMCH       4 (0.6%)     6 (0.9%)     19 (2.8%)    295 (43.4%)   356 (52.4%)   4.46   .65
AIKEN      0 (0.0%)     0 (0.0%)      1 (1.4%)     21 (29.6%)    49 (69.0%)   4.68   .50

18. The instructor made effective use of the available time.

         Strongly Disagree   Disagree     Neutral       Agree          Strongly Agree   Mean   SD

Total     39 (1.8%)    78 (3.6%)    186 (8.5%)    924 (42.4%)   951 (43.7%)   4.23   .88
CSCE      12 (2.9%)    27 (6.6%)     55 (13.4%)   198 (48.2%)   119 (29.0%)   3.94   .97
ECHE       8 (2.8%)    11 (3.9%)     18 (6.4%)    111 (39.2%)   135 (47.7%)   4.25   .94
ECIV       2 (0.5%)    14 (3.5%)     32 (7.9%)    145 (35.8%)   212 (52.8%)   4.36   .81
ELCT      12 (3.8%)    11 (3.4%)     47 (14.7%)   167 (52.2%)    83 (25.9%)   3.93   .94
EMCH       5 (0.7%)    15 (2.2%)     33 (4.9%)    282 (41.5%)   345 (50.7%)   4.39   .75
AIKEN      0 (0.0%)     0 (0.0%)      1 (1.4%)     20 (28.2%)    50 (70.4%)   4.69   .49

19. The instructor was enthusiastic about the subject.

         Strongly Disagree   Disagree     Neutral       Agree          Strongly Agree    Mean   SD

Total     19 (0.9%)    64 (2.9%)    158 (7.3%)    856 (39.3%)   1079 (49.6%)   4.34   .81
CSCE       7 (1.7%)    25 (6.1%)     44 (10.7%)   200 (48.8%)    134 (32.7%)   4.05   .91
ECHE       1 (0.4%)     5 (1.8%)     19 (6.7%)     95 (33.7%)    162 (57.4%)   4.46   .73
ECIV       2 (0.5%)     5 (1.2%)     24 (5.9%)    127 (31.4%)    247 (61.0%)   4.51   .71
ELCT       8 (2.5%)    23 (7.2%)     39 (12.2%)   155 (48.4%)     95 (29.7%)   3.96   .97
EMCH       1 (0.1%)     6 (0.9%)     31 (4.6%)    261 (38.4%)    381 (56.0%)   4.49   .64
AIKEN      0 (0.0%)     0 (0.0%)      0 (0.0%)     18 (25.4%)     53 (74.6%)   4.75   .44

20. The instructor illustrated basic concepts so that I could understand.

         Strongly Disagree   Disagree      Neutral       Agree          Strongly Agree   Mean   SD

Total     60 (2.8%)   127 (5.8%)    224 (10.3%)   881 (40.4%)   886 (40.7%)   4.10    .99
CSCE      22 (5.3%)    40 (9.7%)     56 (13.6%)   196 (47.6%)    98 (23.8%)   3.75   1.09
ECHE       7 (2.5%)    16 (5.7%)     28 (9.9%)     95 (33.7%)   136 (48.2%)   4.20   1.00
ECIV       2 (0.5%)    17 (4.2%)     40 (9.9%)    153 (37.9%)   192 (47.5%)   4.28    .84
ELCT      22 (6.9%)    41 (12.8%)    49 (15.3%)   142 (44.4%)    66 (20.6%)   3.59   1.15
EMCH       7 (1.0%)    12 (1.8%)     49 (7.2%)    269 (39.5%)   344 (50.5%)   4.37    .78
AIKEN      0 (0.0%)     0 (0.0%)      2 (2.8%)     25 (35.2%)    44 (62.0%)   4.60    .55

21. The instructor clearly answered questions asked by students.

         Strongly Disagree   Disagree      Neutral       Agree          Strongly Agree   Mean   SD

Total     38 (2.7%)   105 (4.8%)    193 (8.9%)    908 (41.7%)   911 (41.9%)   4.15    .96
CSCE      20 (4.9%)    39 (9.5%)     52 (12.6%)   189 (45.9%)   112 (27.2%)   3.81   1.09
ECHE       5 (1.8%)    15 (5.3%)     17 (6.0%)    105 (37.4%)   139 (49.5%)   4.27    .93
ECIV       1 (0.2%)    11 (2.7%)     36 (8.9%)    166 (41.1%)   190 (47.0%)   4.32    .77
ELCT      28 (8.8%)    28 (8.8%)     50 (15.6%)   136 (42.5%)    78 (24.4%)   3.65   1.19
EMCH       4 (0.6%)    11 (1.6%)     34 (5.0%)    287 (42.3%)   343 (50.5%)   4.40    .71
AIKEN      0 (0.0%)     0 (0.0%)      4 (5.6%)     25 (35.2%)    42 (59.2%)   4.54    .60

22. The instructor respected the students as individuals.

         Strongly Disagree   Disagree     Neutral       Agree          Strongly Agree    Mean   SD

Total     41 (1.9%)    48 (2.2%)    130 (6.0%)    856 (39.4%)   1095 (50.5%)   4.34    .84
CSCE      12 (2.9%)    17 (4.1%)     37 (9.0%)    186 (45.1%)    160 (38.8%)   4.13    .94
ECHE       2 (0.7%)     6 (2.1%)     12 (4.3%)    100 (35.6%)    161 (57.3%)   4.47    .74
ECIV       1 (0.2%)     2 (0.5%)     15 (3.7%)    142 (35.2%)    243 (60.3%)   4.55    .62
ELCT      23 (7.7%)    17 (5.4%)     39 (12.3%)   151 (47.6%)     87 (27.4%)   3.83   1.11
EMCH       3 (0.4%)     6 (0.9%)     24 (3.5%)    258 (38.1%)    387 (57.1%)   4.50    .65
AIKEN      0 (0.0%)     0 (0.0%)      2 (2.8%)     19 (26.8%)     50 (70.4%)   4.68    .53

23. Overall, I rate the performance of my instructor as:

         Very Poor     Poor         Average       Good          Excellent     Mean   SD

Total     29 (1.6%)    84 (4.6%)    125 (6.9%)    697 (38.3%)   883 (48.6%)   4.28    .90
CSCE      13 (3.8%)    31 (9.0%)     34 (9.9%)    155 (45.2%)   110 (32.1%)   3.93   1.06
ECHE       0 (0.0%)    11 (4.5%)     18 (7.3%)     80 (32.5%)   137 (55.7%)   4.39    .81
ECIV       1 (0.3%)    10 (3.2%)     22 (7.1%)     97 (31.1%)   182 (58.3%)   4.44    .79
ELCT      13 (5.0%)    25 (9.6%)     33 (12.6%)   122 (46.7%)    68 (26.1%)   3.79   1.09
EMCH       2 (0.3%)     7 (1.2%)     18 (3.1%)    222 (38.3%)   331 (57.1%)   4.50    .65
AIKEN      0 (0.0%)     0 (0.0%)      0 (0.0%)     20 (29.4%)    48 (70.6%)   4.71    .46


Appendix F

Alumnae/Alumni Survey


College of Engineering & Information Technology

Alumnae/Alumni Survey

An Assessment of Your Experiences and Opinions

College of Engineering & Information Technology      Susan Creighton
University of South Carolina                         Director of Assessment
Columbia, SC 29208                                   803/777-4423


Employment Information:

1. Please indicate which of the following statements is applicable to your situation. Mark all that apply.

_____ Employed full time (30 or more hours a week)
_____ Employed part time (less than 30 hours a week)
_____ I am enrolled in graduate school.
_____ Not employed, but seeking a position.
_____ Not employed and not seeking a position.
_____ Not employed and not attending graduate school.

If you are NOT employed, skip to question 3.

2a. What is your present position? ____________________________________________________________

2b. Where are you employed? _______________________________________________________________

2c. What is your primary business activity? (for example, design, research, sales, etc.) _______________

2d. If you are NOT employed in the engineering field, please indicate the reasons for this decision.

2e. Are you satisfied with your current position? Circle one. Yes No
    Please elaborate why or why not.

2f. Are you satisfied with your career progression? Why or why not?

2g. Are you satisfied with your salary level? Why or why not?

2h. Are you generally satisfied with your career choice? (such as engineering) Circle one. Yes No Please elaborate why or why not.

2i. Do you ever see yourself leaving engineering in the future to enter another field? Circle one. Yes No


If yes, which field?___________________________________________

First Time Employment

3. What was your first position after graduation? _____________________________________________

4. How long after graduation did you obtain an engineering-related job? _________________________

Continuing Education

5. Have you applied to graduate school? Circle one. Yes No

5a. If yes, were you accepted? Yes No

5b. Did you enroll in graduate school? Yes No

5c. If yes, in what field? __________________________________________

5d. Institution: __________________________________________________

5e. Have you completed an advanced degree? Yes No

Undergraduate Experience

6. How would you rate your overall satisfaction with your preparation to become an engineer? Please mark the box that best describes your opinion.

Not Satisfied   A Little Satisfied   Undecided   Satisfied   Very Satisfied
      □                  □               □            □             □

7. How would you rate your preparation to obtain a job after graduation? Please mark the box that best describes your opinion.

Not Satisfied   A Little Satisfied   Undecided   Satisfied   Very Satisfied
      □                  □               □            □             □

8. How would you rate your preparation to become a contributing member of society? Please mark the box that best describes your opinion.

Not Satisfied   A Little Satisfied   Undecided   Satisfied   Very Satisfied
      □                  □               □            □             □


9. Below are listed some skills and competencies that are expected of engineering graduates. Please provide us with your opinion about the importance of each skill as it relates to your engineering positions. Also indicate your satisfaction with the level of competency you achieved as a result of your USC education. For each item please circle the number in the column appropriate to your answer.

Competencies          Importance of Skills                  Level of Competency
                      (1 = Not Important, 2 = Important,    (1 = Completely Dissatisfied, 2 = Dissatisfied,
                      3 = Very Important)                   3 = Satisfied, 4 = Completely Satisfied)

An ability to apply:
  Engineering terms, principles and theories                                 1 2 3    1 2 3 4
  Advanced mathematics (calculus & above)                                    1 2 3    1 2 3 4
  Chemistry and/or physics                                                   1 2 3    1 2 3 4
  Liberal Arts (English, history, economics, business, etc.)                 1 2 3    1 2 3 4

An ability to:
  Identify, formulate, and solve engineering problems                        1 2 3    1 2 3 4
  Design a system, component, or process to meet desired needs and quality   1 2 3    1 2 3 4
  Use the computer as a tool for analysis & design                           1 2 3    1 2 3 4
  Function on multi-disciplinary or cross-functional teams                   1 2 3    1 2 3 4
  Function in culturally and ethnically diverse environments                 1 2 3    1 2 3 4
  Communicate orally, informally, and in prepared talks                      1 2 3    1 2 3 4
  Communicate in writing - technical reports, memos, proposals, etc.         1 2 3    1 2 3 4
  Use computer software for professional communications                      1 2 3    1 2 3 4
  Design and conduct experiments                                             1 2 3    1 2 3 4
  Analyze and interpret data                                                 1 2 3    1 2 3 4

An understanding of:
  Professional and ethical responsibilities                                  1 2 3    1 2 3 4
  Environmental aspects of engineering practice                              1 2 3    1 2 3 4
  The practice of engineering on a global scale                              1 2 3    1 2 3 4
  The impact of engineering solutions in a global and societal context      1 2 3    1 2 3 4
  The need for engaging in life-long learning                                1 2 3    1 2 3 4
  Basic knowledge of industry practices and standards                        1 2 3    1 2 3 4
  Contemporary issues (welfare reform, irradiation, etc.)                    1 2 3    1 2 3 4


10. Which aspects of your undergraduate or graduate engineering program (courses, experiences, instructors, professional organizations) have most contributed to your satisfaction working in engineering or your present career and why?

11. Of the professors in the College of Engineering, which one was the most influential in your professional development and why?

12. What recommendations would you make to improve the educational experience for future engineering students at USC?


Professional Development

13. Please indicate the following information by circling the appropriate response.

13a. Have you passed the Fundamentals of Engineering Examination? Yes No Haven’t taken it

13b. Have you completed 4 years of engineering practice as an EIT? Yes No Working toward it

13c. Have you successfully completed the Principles and Practice Examination? Yes No

13d. Are you a licensed professional engineer? Yes No

14. List your memberships in professional organizations and indicate any offices/positions you have held or are presently fulfilling.

15. List your involvement with any committees or other community organizations.

16. What conferences do you attend on a regular basis?

Demographic Information:

17. What year did you receive your engineering degree? ____________________

18. Did you transfer to USC from another college or university? Yes No

If yes, what was the transfer institution? _________________________________

19. What was your undergraduate major? Civil/Environmental Chemical Computer Electrical Mechanical

20. What was your cumulative GPA (grade point average) at the time of graduation? ________________

21. What is your gender? Please circle. Female Male

22. What is your ethnicity? Please circle. Caucasian African-American Hispanic

Asian/Pacific Islander Native American Other

Thank you for completing this survey!


Appendix G

Alumnae/Alumni Survey Reports

(sample)


Alumnae/Alumni Survey
Survey Results for the 1996 Graduates

Please note: Some capitalization, spelling, grammar, etc. errors have been corrected. Double underlines were not possible. Otherwise, information recorded here was typed as received!

Employment Information

1. Please indicate which of the following statements is applicable to your situation. Mark all that apply.

32   Employed full time (30 or more hours a week)
 4   I am enrolled in graduate school.

2a. What is your present position?
2b. Where are you employed?
2c. What is your primary business activity? (for example, design, research, sales, etc.)

Chemical
  Chemical Engineer – Intermediates Development; Carolina Eastman; R&D
  Self-employed; World Art Imports; Sales, marketing, e-commerce

Civil
  Environmentalist; State of New Mexico (Silver City); Everything
  Hydraulic Design Engineer; SCDOT; Design
  Engineer-in-Training; Consulting Firm; Design
  Transportation planner; Wilbur Smith Associates, Falls Church, VA; Analysis & Design
  Transportation Planner; Wilbur Smith Associates; Planning
  Project Engineer/Project Manager; Grant + Associates, LLC; Design/Management
  CATV Design Engineer; Horry Telephone Coop. Conway, SC; Design
  Engineer Tech III; SCDOT – Shop Rd in Columbia, SC; Lab Testing

Computer
  Software Engineer; Conita Technologies, Inc.; Design/Programming
  Lead Programmer; Acclaim Studios Austin; Software production
  Senior Associate; Cambridge Technology Partners; Consulting/SI
  Senior Applications Developer; Renaissance Interactive Inc.; Design/Development
  Platform Design Engineer; Dell Computer Corp. Round Rock, TX; Design, Research

Electrical
  Systems Engineer; Day + Zimmermann International Inc.; Programming
  Jr. Engineer; Mid-Carolina Electrical Cooperative; System design & GIS Coordinator
  Network Facilities Engineer; Chester Telephone Company; Utilities
  Controls Project Engineer; Yuasa-Exide Inc. Sumter, SC; Design

Mechanical
  Engineer/CAD technician; M.E.C.A.; Design
  Process Engineer; Becton Dickinson; Troubleshooter
  Project Engineer; Reverse Engineering, Inc.; Design
  Quality Engineer; Cutler-Hammer, Eaton Corp.; Manufacturing
  Quality Engineer / Coordinator CMM Measurement; Spartanburg Steel Products; Trouble-shooting
  Design Engineer – Model Build Coord.; GE Appliances; Project Engineer
  Coordinator Manufacturing Eng.; Eaton Corp. / Cutler-Hammer Puerto Rico; Manufacturing
  Process Engineer; Bridgestone/Firestone South Carolina; Design/process improvement
  Mechanical Engineer; Datex-Ohmeda R&D dept.; Design – new product development
  Engineer (process & project); Becton Dickinson; Manufacturing
  City Engineer – City of Winder, GA; City of Winder, GA; Design, management
  Assistant Engineer + Services Manager; Milliken + Co. – Spartanburg; Maintenance
  Process Engr; Siemens; Manufacturing

2d. If you are NOT employed in the engineering field, please indicate the reasons for this decision.

Chemical
  Limited opportunities

Civil
  Consultants for the mining industry are not always certain they will have work – NEED some stability
  Lack of experience, lack of confidence, lack of knowledge

2e. Are you satisfied with your current position? Circle one. Yes 28 (88%) No 4 (13%)

Please elaborate why or why not.

Chemical
  Yes – Everyday is a different experience.

Civil
  Yes – Not "stuck" with any one particular type of assignment
  Yes – It's what I want to do
  Yes – It has changing markets so there is always something new and improved.
  I am satisfied to be gaining practical experience in lab testing in soils. However I am not satisfied that I have not gained confidence. I feel I need to pursue engineering job + expected to be past this point at this time in my life.

Computer
  Yes – Pay is good, atmosphere is great, and good work is appreciated.
  Yes – I wanted to make games, and I am.
  Yes – I am working with new computer telephony integration products that appears to be a hot leading edge technology.
  Yes – Challenging Position, Cutting Edge development

Electrical
  Yes – Challenging, Technical, room to grow
  Yes – I enjoy working with the other engineers of Mid-Carolina. Mid-Carolina supplies us with some of the best technology to do our job.

Mechanical
  No – Need P.E. certification for engineering consulting.
  No – I do not think production is the appropriate setting for me.
  Yes – Challenging & interesting – projects are always different with new problems to tackle.
  No – Not enough information provided in school on different careers
  Yes – I am working on another promotion (into management)
  Yes – It is challenging
  Yes – Job is good… People who own the company are willing to invest, but foreign personalities hard to deal with
  No – I sometimes feel out of place because of my lack of practical experience.
  Yes – I ♥ Manufacturing

2f. Are you satisfied with your career progression? Yes 30 (94%) No 2 (6%)

Why or why not?

Chemical
  Yes; Eastman is a good company with lots of opportunities for dedicated workers.
  Yes, I've done more than I ever thought possible.

Civil
  Yes – lots of potential
  Yes. I take on as much as I can handle
  Yes. Opportunities for advancement
  Yes, I feel I have progressed quickly.
  Yes, I am gaining experience in a very up and growing market.
  No. I graduated 1996. I feel I needed experience I didn't get while in school to get the confidence + understanding at the engineering field I had hoped to get while in school + also to pinpoint the specific areas + engineering I want to pursue my life's work.

Computer
  Yes, because I am constantly learning which is why I went into engineering to begin with. New challenges every day.
  Yes. Fast-moving field.
  Yes – I'm making very good progress with salary and position promotions every year.
  Good Challenges, room to grow

Electrical
  Yes, I feel that I have progressed fairly well for a two year engineer.
  Yes. Mid-Carolina continues to give me the opportunity to expand my knowledge in the engineering field.
  Yes – I have increased in position responsibilities
  Yes, I think I am ahead of where I should be for my experience

Mechanical
  Yes, M.E. degree allows me to advance.
  Yes. I have worked in different aspects of engineering, so I am getting a lot of exposure.
  Yes, my position is satisfactory with opportunities for advancement in the future.
  Yes. I've managed to stay alert and ask questions.
  Yes – [I am working on another promotion (into management)]
  Yes – constant improvement in title and pay
  I have been given the mechanical lead position for my project
  No. I have changed from design to manufacturing which means I basically had to start over.
  Yes. This is just one more step towards my goal in management.

2g. Are you satisfied with your salary level? Yes 20 (63%) No 12 (38%)

Why or why not?

Chemical
  Yes; I would like to make more, but I feel that my salary is competitive.
  No, but it takes time when self-employed

Civil
  No, but good benefits (Have 3 kids – insurance is more important – two for braces on teeth)
  Yes. I have enough to live and have fun.
  No – Engineers as a whole are vastly underpaid.
  Yes, but it is getting better gradually.
  No. It is not at the engineering level because I have not attained an engineering position.

Computer
  Yes. I live comfortably.
  Yes. Higher than others.
  Yes. I'm keeping w/ market average and have doubled my entry level salary in just 2 years.
  Yes. I make plenty of money for my years in the field.

Electrical
  Yes, I am satisfied with my salary level, but I expect my salary to advance a little because my responsibilities have slightly advanced.
  Yes. My salary has increased yearly by more than what I expected.
  Yes, $10K above average for my experience

Mechanical
  No, HVAC design is a low salary level.
  No. I was underpaid in my first position, so I have not caught up yet.
  No – it is ok for a small company, but rather low for the engineering field.
  No. does not meet national average due to lack of expertise.
  No. I am paid less than male employees with MUCH less responsibility than myself.
  No – I believe I can always make more money
  No. Am paid overtime (not part of salary) to meet satisfaction level
  Yes. Slightly above avg.
  No. I am being paid less than my less experienced coworkers.

2h. Are you generally satisfied with your career choice? Yes 28 (88%) No 4 (13%)

Please elaborate why or why not.

Chemical
  I am doing what I planned to do.
  Yes – unlimited opportunity

Civil
  Yes + No. Yes because I like the background. No because no-one likes engineers (HaHa)
  Yes – I enjoy what I do.
  Yes – I enjoy the variation and the challenge.
  Yes – Engineering background gives a person a process of thinking
  Yes – I believe engineering is the correct field. The difficulty is pinpointing the area of engineering I want to pursue my life's work.

Computer
  I, very much, enjoy software design and implementation.
  Yes – No better career out there!
  Yes – It has provided me with confidence that I can understand a wide range of problems and the capability to solve them.
  Yes – It seems that there is huge demand for people in our industry
  Yes – It's always what I wanted to do.

Electrical
  Yes, I have to admit that when I started I did not have much of an idea at all what I was getting into, but it has turned out to be a choice that I am glad I made.
  Yes – I grew up around the utility organization so I knew this was a good choice.
  Yes – It fits my desires + goals.
  Yes – I am going to get my PE in 2000 and open my own controls company

Mechanical
  No – I do not think engineering is the field I need to work in. It is not a good fit for my personality.
  Yes – Allows me to be creative and use my problem solving skills.
  No – Needed more information in school.
  Yes – I enjoy what I am doing.
  Yes – I work on many different projects at once
  Yes – Flexibility, respect, mostly interesting field of work
  No – I didn't realize the amount of stress that comes with engineering.
  Yes – It is a field where there is always an opportunity for growth and employment
  Yes – I have great satisfaction with my job. I enjoy design & making important decisions.

2i. Do you ever see yourself leaving engineering in the future to enter another field?

Yes 14 (44%) No 18 (56%)

If yes, which field?

Chemical
  Management; Finance; Sales
  Already left

Civil
  Mercenary for hire
  Missions
  Music Management or Education
  Engineering Management

Computer
  I would stay in the technical area but more to a business management/executive position.

Electrical

Mechanical
  Law
  Business, maybe
  Something non-technical or education
  Management – probably associated with engineering

First Time Employment

3. What was your first position after graduation?
4. How long after graduation did you obtain an engineering-related job?


Chemical
  This one; Had the job 9 months before I graduated
  Grad School; Never

Civil
  Self-employed contract assignments; 18 months after grad. school
  Same [Hydraulic Design Engineer]; 1 ½ months
  Worked at Target; 2 ¼ years
  Traffic engineer; Immediately following graduate school
  Wilbur Smith Assoc. – Transportation Planner; After graduate school
  Project Designer; 1 month
  Construction Project Managing; 1 month
  Engineering Tech III; 3 yrs

Computer
  Programmer; Yes
  Programming; 2 months
  Internet developer – BellSouth.net; 0 – immediately
  Web Developer; Had one before leaving
  Performance Engineer; 1 mo.

Electrical
  Junior Project Engineer; Approx. 5 months (I was hired after 1 month but could not start until after 5)
  Same as current; Began week after graduation.
  Plant Electrical Engineer; 1 month
  Controls Engineer; before graduation
  Contract Engineer; 2 weeks

Mechanical
  Engineer; Advanced position @ current job
  Design engineer; Immediately after graduation
  Current position (project engineer); 3 years before graduation
  Engineer; 0 days.
  Process Engineer; 3 mos.
  2LT – US Army; 2 ½ yrs
  Manufacturing Eng; -3 months
  Applications Engineer (AutoCAD); (2) months
  Mechanical Engineer – Lockheed Martin; 2 months
  Design engineer; 2 weeks
  Design Engineer for consulting Eng. Firm; I was working with Eng. firm while I was in College.
  Production manager; 2 months
  Project Engineer; 1 month

Continuing Education

5. Have you applied to graduate school? Circle one. Yes 13 (41%) No 19 (59%)

Chemical     2 (100%)   0 ( 0%)
Civil        3 ( 38%)   5 (63%)
Computer     2 ( 40%)   3 (60%)
Electrical   2 ( 40%)   3 (60%)
Mechanical   4 ( 31%)   9 (69%)


5a. If yes, were you accepted? Yes 13 (87%) No 2 (13%)

Chemical     2 (100%)   0 ( 0%)
Civil        3 (100%)   0 ( 0%)
Computer     2 (100%)   0 ( 0%)
Electrical   2 (100%)   0 ( 0%)
Mechanical   4 (100%)   0 ( 0%)

5b. Did you enroll in graduate school? Yes 13 ( 72%) No 5 (28%)

Chemical     2 (100%)   0 ( 0%)
Civil        3 (100%)   0 ( 0%)
Computer     2 (100%)   0 ( 0%)
Electrical   2 (100%)   0 ( 0%)
Mechanical   4 (100%)   0 ( 0%)

5c. If yes, in what field? 5d. What institution?

Chemical
  MBA; USC
  MIB; USC

Civil
  Civil Engineering – Transportation; Penn State University
  Transportation Engineering; University of Washington
  Environmental; USC

Computer
  Computer Engineering; USC
  Software; USC

Electrical
  ME in Computer Engineering
  Electrical; MIT

Mechanical
  Business; Troy State University
  Management; Troy State University
  Mechanical; USC
  Mechanical; USC

5e. Have you completed an advanced degree? Yes 5 ( 26%) No 14 ( 74%)

Chemical     1 (50%)   1 ( 50%)
Civil        2 (67%)   1 ( 33%)
Computer     1 (33%)   2 ( 67%)
Electrical   1 (33%)   2 ( 67%)
Mechanical   0 ( 0%)   8 (100%)

Undergraduate Experience


9. How would you rate your overall satisfaction with your preparation to become an engineer? Please mark the box that best describes your opinion.

             Not Satisfied   A Little Satisfied   Undecided   Satisfied    Very Satisfied

College      4 (12%)         3 ( 9%)              3 ( 9%)     21 ( 64%)    2 ( 6%)
Chemical     0 ( 0%)         0 ( 0%)              0 ( 0%)      2 (100%)    0 ( 0%)
Civil        3 (38%)         0 ( 0%)              2 (25%)      2 ( 25%)    1 (13%)
Computer     0 ( 0%)         0 ( 0%)              0 ( 0%)      5 (100%)    0 ( 0%)
Electrical   0 ( 0%)         1 (20%)              0 ( 0%)      4 ( 80%)    0 ( 0%)
Mechanical   1 ( 8%)         2 (15%)              1 ( 8%)      8 ( 62%)    1 ( 8%)

10. How would you rate your preparation to obtain a job after graduation? Please mark the box that best describes your opinion.

             Not Satisfied   A Little Satisfied   Undecided   Satisfied    Very Satisfied

College      8 (24%)         2 ( 6%)              7 (21%)     15 (46%)     1 ( 3%)
Chemical     0 ( 0%)         0 ( 0%)              1 (50%)      1 (50%)     0 ( 0%)
Civil        4 (50%)         0 ( 0%)              0 ( 0%)      4 (50%)     0 ( 0%)
Computer     1 (20%)         0 ( 0%)              1 (20%)      2 (40%)     1 (20%)
Electrical   1 (20%)         0 ( 0%)              0 ( 0%)      4 (80%)     0 ( 0%)
Mechanical   2 (15%)         2 (15%)              5 (39%)      4 (31%)     0 ( 0%)

11. How would you rate your preparation to become a contributing member of society? Please mark the box that best describes your opinion.

             Not Satisfied   A Little Satisfied   Undecided   Satisfied    Very Satisfied

College      1 ( 3%)         2 ( 6%)              7 (21%)     18 ( 55%)    5 (15%)
Chemical     0 ( 0%)         0 ( 0%)              0 ( 0%)      2 (100%)    0 ( 0%)
Civil        1 (13%)         1 (13%)              1 (13%)      3 ( 38%)    2 (25%)
Computer     0 ( 0%)         0 ( 0%)              3 (60%)      0 (  0%)    2 (40%)
Electrical   0 ( 0%)         1 (20%)              0 ( 0%)      4 ( 80%)    0 ( 0%)
Mechanical   0 ( 0%)         0 ( 0%)              3 (23%)      9 ( 69%)    1 ( 8%)

12. Below are listed some skills and competencies that are expected of engineering graduates. Please provide us with your opinion about the importance of each skill as it relates to your engineering positions. Also indicate your satisfaction with the level of competency you achieved as a result of your USC education. For each item please circle the appropriate number in the column.

For each competency below, the first three columns give the Importance of Skills (Not Important / Important / Very Important) and the last four give the Level of Competency (Completely Dissatisfied / Dissatisfied / Satisfied / Completely Satisfied).

An ability to apply:

Engineering terms, principles and theories
  College      2 (6%)    13 (41%)   17 (53%)  |  1 (3%)    2 (6%)    26 (79%)    4 (12%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        1 (13%)    2 (25%)    5 (63%)  |  1 (13%)   2 (25%)    3 (38%)    2 (25%)
  Computer     0 (0%)     3 (60%)    2 (40%)  |  0 (0%)    0 (0%)     5 (100%)   0 (0%)
  Electrical   0 (0%)     3 (60%)    2 (40%)  |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Mechanical   1 (8%)     4 (31%)    8 (62%)  |  0 (0%)    0 (0%)    12 (92%)    1 (8%)

Advanced mathematics (calculus & above)
  College      7 (22%)   20 (63%)    5 (16%)  |  1 (3%)    3 (9%)    25 (76%)    4 (12%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        2 (25%)    5 (63%)    1 (13%)  |  0 (0%)    1 (13%)    5 (63%)    2 (25%)
  Computer     0 (0%)     4 (80%)    1 (20%)  |  0 (0%)    1 (20%)    4 (80%)    0 (0%)
  Electrical   0 (0%)     3 (60%)    2 (40%)  |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Mechanical   5 (39%)    7 (54%)    1 (8%)   |  1 (8%)    1 (8%)    10 (77%)    1 (8%)

Chemistry and/or physics
  College     11 (36%)   17 (55%)    3 (10%)  |  0 (0%)    4 (13%)   23 (72%)    5 (16%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        4 (50%)    4 (50%)    0 (0%)   |  0 (0%)    1 (13%)    6 (75%)    1 (13%)
  Computer     2 (50%)    1 (25%)    1 (25%)  |  0 (0%)    0 (0%)     2 (50%)    2 (50%)
  Electrical   1 (20%)    3 (60%)    1 (20%)  |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Mechanical   4 (31%)    8 (62%)    1 (8%)   |  0 (0%)    3 (23%)    9 (69%)    1 (8%)

Liberal Arts
  College      8 (25%)   16 (50%)    8 (25%)  |  0 (0%)    4 (13%)   24 (75%)    4 (13%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     4 (50%)    4 (50%)  |  0 (0%)    1 (13%)    5 (63%)    2 (25%)
  Computer     1 (20%)    2 (40%)    2 (40%)  |  0 (0%)    1 (25%)    2 (50%)    1 (25%)
  Electrical   1 (20%)    4 (80%)    0 (0%)   |  0 (0%)    1 (20%)    3 (60%)    1 (20%)
  Mechanical   6 (46%)    5 (39%)    2 (15%)  |  0 (0%)    1 (8%)    12 (92%)    0 (0%)

An ability to:

Identify, formulate, and solve engineering problems
  College      2 (6%)     3 (9%)    27 (84%)  |  1 (3%)    6 (18%)   22 (67%)    4 (12%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        2 (25%)    0 (0%)     6 (75%)  |  1 (13%)   3 (38%)    4 (50%)    0 (0%)
  Computer     0 (0%)     0 (0%)     5 (100%) |  0 (0%)    1 (20%)    2 (40%)    2 (40%)
  Electrical   0 (0%)     0 (0%)     5 (100%) |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Mechanical   0 (0%)     2 (15%)   11 (85%)  |  0 (0%)    2 (15%)   10 (77%)    1 (8%)

Design a system, component, or process to meet desired needs and quality
  College      1 (3%)     3 (9%)       –      |  1 (3%)    8 (24%)   20 (61%)    4 (12%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        1 (13%)    1 (13%)    6 (75%)  |  1 (13%)   3 (38%)    4 (50%)    0 (0%)
  Computer     0 (0%)     0 (0%)     5 (100%) |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Electrical   0 (0%)     0 (0%)     5 (100%) |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Mechanical   0 (0%)     1 (8%)    12 (92%)  |  0 (0%)    5 (39%)    6 (46%)    2 (15%)

Use the computer as a tool for analysis and design
  College      0 (0%)     6 (19%)   26 (81%)  |  1 (3%)    4 (12%)   17 (52%)   11 (33%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     2 (25%)    6 (75%)  |  1 (13%)   2 (25%)    3 (38%)    2 (25%)
  Computer     0 (0%)     0 (0%)     5 (100%) |  0 (0%)    0 (0%)     2 (40%)    3 (60%)
  Electrical   0 (0%)     0 (0%)     5 (100%) |  0 (0%)    0 (0%)     3 (60%)    2 (40%)
  Mechanical   0 (0%)     3 (23%)   10 (77%)  |  0 (0%)    2 (15%)    7 (54%)    4 (31%)

Function on multi-disciplinary or cross-functional teams
  College      1 (3%)    14 (44%)   17 (53%)  |  2 (6%)    5 (15%)   23 (70%)    3 (9%)
  Chemical     0 (0%)     0 (0%)     1 (100%) |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     4 (50%)    4 (50%)  |  2 (25%)   1 (13%)    5 (63%)    0 (0%)
  Computer     0 (0%)     2 (40%)    3 (60%)  |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Electrical   0 (0%)     2 (40%)    3 (60%)  |  0 (0%)    1 (20%)    4 (80%)    0 (0%)
  Mechanical   1 (8%)     6 (46%)    6 (46%)  |  0 (0%)    3 (23%)    8 (62%)    2 (15%)

Function in culturally and ethnically diverse environments
  College      4 (13%)   12 (38%)   16 (50%)  |  1 (3%)    6 (19%)   22 (71%)    2 (7%)
  Chemical     0 (0%)     0 (0%)     1 (100%) |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     2 (25%)    6 (75%)  |  1 (13%)   0 (0%)     6 (75%)    1 (13%)
  Computer     0 (0%)     2 (40%)    3 (60%)  |  0 (0%)    1 (25%)    2 (50%)    1 (25%)
  Electrical   0 (0%)     5 (100%)   0 (0%)   |  0 (0%)    1 (20%)    4 (80%)    0 (0%)
  Mechanical   4 (31%)    3 (23%)    6 (46%)  |  0 (0%)    4 (33%)    8 (67%)    0 (0%)

Communicate orally, informally, and in prepared talks
  College      0 (0%)     8 (25%)   24 (75%)  |  1 (3%)    8 (24%)   19 (58%)    5 (15%)
  Chemical     0 (0%)     0 (0%)     1 (100%) |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     1 (13%)    7 (88%)  |  1 (13%)   2 (25%)    4 (50%)    1 (13%)
  Computer     0 (0%)     2 (40%)    3 (60%)  |  0 (0%)    2 (40%)    2 (40%)    1 (20%)
  Electrical   0 (0%)     4 (80%)    1 (20%)  |  0 (0%)    2 (40%)    2 (40%)    1 (20%)
  Mechanical   0 (0%)     1 (8%)    12 (92%)  |  0 (0%)    2 (15%)    9 (69%)    2 (15%)

Communicate in writing – technical reports, memos, proposals, etc.
  College      0 (0%)     8 (25%)   24 (75%)  |  1 (3%)    6 (18%)   21 (64%)    5 (15%)
  Chemical     0 (0%)     0 (0%)     1 (100%) |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     1 (13%)    7 (88%)  |  1 (13%)   3 (38%)    3 (38%)    1 (13%)
  Computer     0 (0%)     2 (40%)    3 (60%)  |  0 (0%)    1 (20%)    3 (60%)    1 (20%)
  Electrical   0 (0%)     3 (60%)    2 (40%)  |  0 (0%)    1 (20%)    3 (60%)    1 (20%)
  Mechanical   0 (0%)     2 (15%)   11 (85%)  |  0 (0%)    1 (8%)    10 (77%)    2 (15%)

Use computer software for professional communications
  College      0 (0%)     6 (19%)   26 (81%)  |  1 (3%)    7 (21%)   16 (49%)    9 (27%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     0 (0%)     8 (100%) |  1 (13%)   2 (25%)    4 (50%)    1 (13%)
  Computer     0 (0%)     0 (0%)     5 (100%) |  0 (0%)    1 (20%)    1 (20%)    3 (60%)
  Electrical   0 (0%)     3 (60%)    2 (40%)  |  0 (0%)    2 (40%)    1 (20%)    2 (40%)
  Mechanical   0 (0%)     2 (15%)   11 (85%)  |  0 (0%)    2 (15%)    8 (62%)    3 (23%)

Design and conduct experiments
  College     10 (31%)   10 (31%)   12 (38%)  |  2 (6%)    9 (27%)   17 (52%)    5 (15%)
  Chemical     0 (0%)     0 (0%)     1 (100%) |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        3 (38%)    3 (38%)    2 (25%)  |  1 (13%)   2 (25%)    3 (38%)    2 (25%)
  Computer     1 (20%)    3 (60%)    1 (20%)  |  0 (0%)    3 (60%)    1 (20%)    1 (20%)
  Electrical   1 (20%)    3 (60%)    1 (20%)  |  0 (0%)    1 (20%)    3 (60%)    1 (20%)
  Mechanical   5 (39%)    1 (8%)     7 (54%)  |  1 (8%)    3 (23%)    8 (62%)    1 (8%)

Analyze and interpret data
  College      0 (0%)     8 (25%)   24 (75%)  |  0 (0%)    9 (27%)   18 (55%)    6 (18%)
  Chemical     0 (0%)     0 (0%)     1 (100%) |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     1 (13%)    7 (88%)  |  0 (0%)    3 (38%)    3 (38%)    2 (25%)
  Computer     0 (0%)     1 (20%)    4 (80%)  |  0 (0%)    2 (40%)    1 (20%)    2 (40%)
  Electrical   0 (0%)     4 (80%)    1 (20%)  |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Mechanical   0 (0%)     2 (15%)   11 (85%)  |  0 (0%)    4 (31%)    8 (62%)    1 (8%)

An understanding of:

Professional and ethical responsibilities
  College      0 (0%)     9 (28%)   23 (72%)  |  2 (6%)    5 (15%)   20 (61%)    6 (18%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     1 (12%)    7 (88%)  |  1 (13%)   2 (25%)    4 (50%)    1 (13%)
  Computer     0 (0%)     2 (40%)    3 (60%)  |  1 (20%)   1 (20%)    2 (40%)    1 (20%)
  Electrical   0 (0%)     3 (60%)    2 (40%)  |  0 (0%)    0 (0%)     4 (80%)    1 (20%)
  Mechanical   0 (0%)     2 (15%)   11 (85%)  |  0 (0%)    2 (15%)    8 (62%)    3 (23%)

Environmental aspects of engineering practice
  College      6 (19%)   11 (36%)   14 (45%)  |  1 (3%)   11 (34%)   16 (50%)    4 (13%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        1 (13%)    1 (13%)    6 (75%)  |  0 (0%)    3 (38%)    4 (50%)    1 (13%)
  Computer     4 (100%)   0 (0%)     0 (0%)   |  0 (0%)    3 (75%)    0 (0%)     1 (25%)
  Electrical   1 (20%)    3 (60%)    1 (20%)  |  0 (0%)    2 (40%)    3 (60%)    0 (0%)
  Mechanical   0 (0%)     6 (46%)    7 (54%)  |  1 (8%)    3 (23%)    7 (54%)    2 (15%)

The practice of engineering on a global scale
  College      7 (22%)   21 (66%)    4 (13%)  |  3 (9%)   15 (46%)   14 (42%)    1 (3%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        1 (13%)    6 (75%)    1 (13%)  |  2 (25%)   3 (38%)    3 (38%)    0 (0%)
  Computer     3 (60%)    2 (40%)    0 (0%)   |  0 (0%)    4 (80%)    0 (0%)     1 (20%)
  Electrical   0 (0%)     5 (100%)   0 (0%)   |  0 (0%)    3 (60%)    2 (40%)    0 (0%)
  Mechanical   3 (23%)    7 (54%)    3 (23%)  |  1 (8%)    5 (39%)    7 (54%)    0 (0%)

The impact of engineering solutions in a global and societal context
  College      7 (22%)   18 (56%)    7 (22%)  |  2 (6%)   14 (44%)   15 (47%)    1 (3%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     4 (50%)    4 (50%)  |  2 (25%)   3 (38%)    3 (38%)    0 (0%)
  Computer     2 (40%)    3 (60%)    0 (0%)   |  0 (0%)    3 (60%)    1 (20%)    1 (20%)
  Electrical   1 (20%)    4 (80%)    0 (0%)   |  0 (0%)    3 (60%)    2 (40%)    0 (0%)
  Mechanical   4 (31%)    6 (46%)    3 (23%)  |  0 (0%)    5 (42%)    7 (58%)    0 (0%)

The need for engaging in life-long learning
  College      1 (3%)    15 (47%)   16 (50%)  |  2 (6%)    6 (18%)   22 (67%)    3 (9%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        0 (0%)     2 (25%)    6 (75%)  |  1 (13%)   0 (0%)     6 (75%)    1 (13%)
  Computer     1 (20%)    1 (20%)    3 (60%)  |  0 (0%)    2 (40%)    2 (40%)    1 (20%)
  Electrical   0 (0%)     4 (80%)    1 (20%)  |  0 (0%)    1 (20%)    4 (80%)    0 (0%)
  Mechanical   0 (0%)     7 (54%)    6 (46%)  |  1 (8%)    3 (23%)    8 (62%)    1 (8%)

Basic knowledge of industry practices and standards
  College      1 (3%)    13 (41%)   18 (56%)  |  7 (21%)  11 (33%)   13 (39%)    2 (6%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        1 (13%)    3 (38%)    4 (50%)  |  3 (38%)   2 (25%)    3 (38%)    0 (0%)
  Computer     0 (0%)     2 (40%)    3 (60%)  |  1 (20%)   2 (40%)    1 (20%)    1 (20%)
  Electrical   0 (0%)     3 (60%)    2 (40%)  |  1 (20%)   3 (60%)    1 (20%)    0 (0%)
  Mechanical   0 (0%)     4 (31%)    9 (69%)  |  2 (15%)   4 (31%)    6 (46%)    1 (8%)

Contemporary issues (welfare reform, irradiation, etc.)
  College     19 (59%)   11 (34%)    2 (6%)   |  4 (13%)   9 (28%)   17 (53%)    2 (6%)
  Chemical     0 (0%)     1 (100%)   0 (0%)   |  0 (0%)    0 (0%)     2 (100%)   0 (0%)
  Civil        2 (25%)    5 (63%)    1 (13%)  |  2 (25%)   2 (25%)    3 (38%)    1 (12%)
  Computer     4 (80%)    1 (20%)    0 (0%)   |  1 (20%)   1 (20%)    2 (40%)    1 (20%)
  Electrical   4 (80%)    1 (20%)    0 (0%)   |  1 (20%)   2 (40%)    2 (40%)    0 (0%)
  Mechanical   9 (69%)    3 (23%)    1 (8%)   |  0 (0%)    4 (33%)    8 (67%)    0 (0%)
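Each cell in the grid above is a count followed by its within-group percentage: importance responses fall on the 3-point scale and competency responses on the 4-point scale, with percentages taken within each program row and rounded to whole percents. The following is a small illustrative sketch, not part of the original report; the function name and the choice of Python are ours, the input counts are the Civil row of "Design and conduct experiments" above, and half-up rounding is an assumption that matches this row (a few cells elsewhere in the source appear to round differently).

    # Minimal sketch: reproduce one row of the question 12 grid.
    # Counts are converted to within-row percentages, rounded half-up
    # to whole percents.

    def format_row(counts):
        n = sum(counts)
        return "  ".join(f"{c} ({int(100 * c / n + 0.5)}%)" for c in counts)

    # Civil row, "Design and conduct experiments":
    importance = [3, 3, 2]      # Not Important / Important / Very Important
    competency = [1, 2, 3, 2]   # Completely Dissatisfied ... Completely Satisfied

    print(format_row(importance), "|", format_row(competency))
    # -> 3 (38%)  3 (38%)  2 (25%) | 1 (13%)  2 (25%)  3 (38%)  2 (25%)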

13. Which aspects of your undergraduate or graduate engineering program (courses, experiences, instructors, professional organizations) have most contributed to your satisfaction working in engineering or your present career and why?

Chemical:
  The teamwork assignments were helpful and the quality of the professors was beneficial as well. The light atmosphere of the College of Engineering kept the days fun (cookouts, etc.)
  The fact that I accomplished one of the most difficult degrees possible in undergrad is good for self-confidence.

Civil:
  I feel that Dr. Steve McAnally's approach during my graduate research interim helped/encouraged me more than the entire previous 5 years. He allowed me to choose my own research, meet with clients, write reports, etc. with minimal "interference." The most important thing I learned, is how to "self-direct."
  The degree. I use very little of what I actually learned in school. All I use is the piece of paper that says I completed the program.
  Nothing – I was actually completely unsatisfied with my education through the USC College of Engineering. It is only because of my education at Penn State that I am still in the field of engineering
  At USC, my favorite instructor was Jack Jakubs in the Dept. of Geography. I did my transportation-planning Senior thesis with him. He is the only instructor I know of that has an interest in transportation planning.
  All of it. The complete experience.
  Solid mechanics class helped with understanding all objects and their reactions.
  The lab experiments particularly the environmental labs contributed greatly to my understanding the coursework in the classroom.

Computer:
  The most influence placed upon me was by the professors who understood the current state of engineering in the practical world and conducted class accordingly; classes with completely project based and independently earned grades were the most beneficial to my working career.
  The lab systems. I thought they were great for work ethic + team building.
  Having an understanding of programming, databases, and networks were the most valuable industry skills used from my engineering courses.
  Elective courses that I took in Management & accounting have been extremely valuable in applying technology solutions to business needs.
  Team Work!!!
  Software Engineering courses plus the Computer Engineering Labs were the basis for everything I know about software development.
  Senior and Junior Labs.

Electrical:
  I would have to say the close attention to students' computer skills. The programming techniques we all had to learn have become very useful. Also, through the writing of all of those many, many reports we had to turn in, I had to become a pretty good user of several software applications. Those skills have also been useful.
  Courses: Power Systems, Computer & design courses. Organizations: IEEE
  The computer skills I learned helped me greatly in fulfilling job requirements as did the team approach to solving problems + completing goals, which I learned in some of my classes and labs.
  The labs (301, 201, 401, 402) were the most educational for the real world.
  EECE: The classes 211 and 221 were best for learning concepts for understanding.
  My courses helped me to have the knowledge I needed, but IEEE is what encouraged me to become involved in the professional duties of Engineering.

Mechanical:
  Heat transfer & thermo are most related to Heating, Ventilation, + Air Conditioning Design.
  Statics & dynamics, heat transfer, statistics, engineering materials & metallurgy, public speaking, technical writing, computer skills, ASME, all of my professors.
  -ASME, being a part of it
  -Working in the machine shop. Actually making the parts that I designed
  Long term projects like senior design assisted greatly. The vehicle project teams in school gave me hands on knowledge that many engineers do not gain even after 6 years in industry
  Problem solving labs… Micro-processing…. Classes that give you a goal (a problem) that you must solve by studying and compiling data
  I think professors that incorporate real-life problems are the one's who best prepare their students. Senior projects are great to express the importance of teamwork. I do believe that projects involving multidisciplines are the most effective. In the real world, mechanical, civil, chemical, environmental and structural engineers all work on the same projects and most work well together.
  -working on teams in senior design with private sector companies
  -small size classes with first name relationships w/professors
  Dr. Jed Lyons – manufacturing processes + metallurgy
  Dr. Mike Sutton – all courses
  Dr. Jamil Khan – all courses

14. Of the professors in the College of Engineering, which one was the most influential in your professional development and why?

Chemical:
  Dr. Vincent Van Brunt. Dr. Vincent Van Brunt has a passion for teaching and this is contagious for his students. He gives you the kind of fire that you can build your career on. He shows that determination is the most important aspect that you can have as an engineer.
  Both Dr. Stanford and Dr. Gadala-Maria were influential teachers. They made learning interesting. They also encouraged the exploitation of science in developing new ideas.

Civil:
  Dr. McAnally – for letting me attempt the impossible
  Dr. Gribb – for staying mad at me
  Dr. Ray – for teaching me how to get dirty
  Dr. Baus – for always being there
  Dr. Bradburn – for getting it through my head that F=ma=0
  *Jo Wooley, Abby Cradock – Making me laugh when I wanted to cry
  Ms. Molly Gribb – She is one of the only professors at USC that invests a sincere effort in students & making sure that teaching takes priority over research & personal gain.
  Dr. Michael Meadows – put simply, he is a true Teacher.
  Dr. Ray showed me ways of analyzing things and breaking them down to solve a problem (generally).
  Dr. Ray + Dr. Meadows have been most influential. Even though I struggled with coursework + tests, I sensed that these individuals believed in my ability to performing engineering tasks.

Computer:
  Dr. Juan Vargas. He challenged me in every class I had with him and knew exactly what was needed to excel in the "real world." Unlike almost all other professors I had, he kept abreast of the latest advancements and techniques in software.
  Dr. Bailey, before he left. He actually cared whether or not we learned and prepared us well for life after college.
  Simpson – I realized that you don't have to be able to please everyone or fit the mold to succeed in this world.
  Prof. Vargas. He kicked our butts and helped challenge me with "modern" software development using commonly used tools.
  Dr. Pettus + Prof. Byrd.

Electrical:
  I don't believe he is still at the college of Engineering, but Professor Dan Bailey was the most influential. He was a good teacher in the classroom. He always had time for students. He always tried to make his tests fair. Also, he was young so I think us students felt more comfortable in his presence. Professor Byrd and Huggins were also great professors.
  Hard question to answer. I guess I would have to say professor Sam Hilborn was my most influential professor due to his emphasis on teamwork to complete assignments, which has really helped me in dealing with problems in industry.
  Ronald Bonnell, I want to learn about Databases for Data Acquisition, important or future controls work. My graduate degree is in Database Engineering.
  Dr. Sudarshan – who allowed me to participate in an Undergraduate Research program

Mechanical:
  Dr. Kahn + Dr. Rocheleau. Both Prof. Showed interests to the students' individual effort. However; Dr. Kahn would be more influential.
  The most influential professor on the positive side for me was Dr. Michael Sutton. He helped me to understand that my capabilities were not limited by my previous educational experience and that whatever I wanted to do, I could be successful at it. The most influential professor on the negative side was < >. He made a lot of us feel as though we were allowed in as a favor and not our merit and that we would not graduate without those same favors.
  Dr. Jamil Khan was the most influential – he was always willing to set time aside to assist students and ensure that they learned the material. He is very friendly and extremely knowledgeable.
  Dr. Stephen McNeill. He empowered students.
  Dr. Kahn. He was a tough but fair professor. He taught you that through hard work you can excel at anything.
  Dr. Young was the most influential since he was strictly dedicated to teaching and was not distracted by trying to complete research projects.
  Wally Peters – 1st lesson – brush your teeth
  Prof. Lyons – I retained more from his classes than any other. His subjects (materials & manufacturing) were the most applicable than any other.
  Dr. Michael Sutton was most influential because he showed us what the real world expected of us and how to handle it.
  Dr. Poole gave me a great understanding in statics and economics. I know students that were not as fluent in statics and vectors, resulting in a more difficult curriculum. I also thought Dr. Kahn and Professor Rocheleau made a great impact on my future decisions
  Dr. Kahn – challenged us to think + analyze more than any other professor. He also would help with any problems in any course, even if he was not the instructor.
  Dr. Sutton – he cared

15. What recommendations would you make to improve the educational experience for future engineering students at USC?

Chemical
  Put more $$ in the ChemE program. Don't stifle the faculty with politics. Let them do what they are good at and get out of the way.
  Encourage "persistence." Many times I thought of quitting and doing something easier, but the experience of studying engineering has made me mentally stronger and determined to do whatever I wish to pursue.

Civil
  1. Laptops mandatory
  2. Apply principles through "real life" application.
  3. More multi-disciplinary work-groups ie – Senior design class with a few liberal arts majors thrown in
  4. Do not have babies in the middle of graduate school
  Bring in professional engineers who can teach you exactly what you will use when you get into your field. Use AutoCad/Softdesk in class (more than just 2 semesters). Teach more practice as opposed to theory.
  I would be happy to discuss why I feel the University of South Carolina has a long way to go before the College of Engineering reaches any level of satisfactory performance. (name and number given)
  Establish some transportation-related engineering/planning courses (e.g. traffic engineering, transportation planning, etc.). There is much more to Civil Engineering than is presented at Carolina. Clemson and The Citadel both have transportation instructors and courses. Transportation is my area of interest, but I had to go to graduate school elsewhere to gain instruction in this area.
  I support ASCE in the idea that all engineering students work towards a Masters Degree. I would like to see USC no longer offer a BS in Engineering. The Masters Degree should be the first Professional Degree of Engineering.
  More involvement with actual industries such as communication field.
  The more practical and hands-on knowledge can be conveyed in the classroom so the more successful students will be in understanding the coursework and will be more successful in the practical application at this knowledge.

Computer
  Don't give grades away! If someone consistently shows that he or she does not have a desire or ability to be an engineer, do not pass them. The ones that pass respect their education more and the ones that fail will not end up doing something they will hate.
  1. Wider variety of courses to take. 2. Integrate with some of the Computer Science classes.
  I think the students need to have more opportunities to work on real business problems, whether these are solutions labs, internships or co-ops.
  More software classes using business tools.

Electrical
  I know this may be a topic that is too specific, but I know that it would have been very helpful for me if I would have learned about Programmable Logic Controllers (PLC's) and Ladder Logic.
  Even though theories are important, I would use more real world examples and problems in the classrooms. I have learned more from real world applications than I have from theory.
  I hope that future engineering students are given more real-world projects and classes based on industrial practices (as much as they can) to gain an understanding of how the real world works, since the majority of the graduates will be in some type of industry after graduation.
  More real life controls (Practical Experience)
  PLC's (at least more relay logic academics)
  Motion Control (VFD's, Servo & Stepper Drives)
  Schematic's Standards (being able to read and understand schematics)
  Allow the development of lab experiments by students.

Mechanical
  Need more labs + classes pertaining to HVAC design. I had very few labs + one class relating to HVAC design which conflicted with my schedule.
  The program needs to focus less on theory and more on real world applications. We need to know how to make things happen not how they are supposed to happen.
  The computer lab in 300 Main needed more functioning computers with up-to-date software. More CAD training would have been beneficial. Also, manufacturing engineering would be a valuable course (if there isn't one already).
  -Remove out of date Professors
  -Focus more attention on making sure students understand the core engineering classes. Statics, dynamics.
  -Consistency between information taught from different instructors.
  There should be much more exposure to the real working world. Programs that involve students and companies, so that students see 1st hand what is important, and what they should really apply their efforts to learn. Sometimes Professors' ideas of what the work-world expects of their employees is different from reality. I found that the things my professors really stressed were trivial, and the things necessary for survival were put on the back-burner. This caused me to require a lot of training, and not a whole lot to sell myself to a potential employer with. Without experience, topped with the lack of necessary skills, I think the first few companies I interviewed with, laughed as I walked out the door.
  Improve Sr. Design Projects. Incorporate cross-functional team members (at least as part of an advocacy team) such as marketing & finance.
  More hands on projects. More lab! -Project driven that uses theory and training
  More hands-on, practical experience is an absolute must. A course outlining a physical task to be accomplished in addition to design classes would be a great addition.
  -more projects/courses with private industry
  -courses on management/supervision/HR issues
  -courses like concurrent eng. EMCH 520 for undergrads

I am writing to you because I received your survey dealing with my satisfaction of my education at the University of South Carolina Mechanical Engineering Department. The survey came in a timely fashion because I had already been contemplating writing a letter to the College to express my disappointment with my education. I have been employed in the engineering field for three years now and have had an extremely frustrating experience, most of which I attribute to lack of preparedness to enter the design field. Please understand, this letter is not intended to be a complaint directed at USC’s engineering program. My intent is actually to offer you the opportunity to speak with me regarding my opinions of the program when I was enrolled as well as offer you some suggestions for program improvement. I have given much thought in the last three years about what types of courses or programs would have helped me transition into a design position more smoothly.

Overall, I would easily compare my education to schools such as Purdue, University of Colorado, University of Illinois, and others in terms of course curriculum. In speaking to other students from these schools, it is apparent to me that new graduates as a whole struggle with the same frustrations that I have experienced. Also, with employers increasing focus on speed to market with a “lean and mean” mentality, many times there is not time or available engineers to adequately train a new engineer through mentoring programs. I do not foresee this mentality changing, therefore, I believe it to be the responsibility of the university to prepare students for this incredibly fast paced, cut throat workplace. I understand that the ABET accreditation board has requirements that schools must meet. However, the fundamental approach to education seems to have remained stagnant for decades. I would be very proud to see the USC College of Engineering devise a truly innovative program to more aptly prepare students for the engineering fields. I have several ideas in which to accomplish this goal if you are interested. (name and number given)

Professional Development

15. Please indicate the following information by circling the appropriate response.


15a. Have you passed the Fundamentals of Engineering Examination?

            Yes        No        Haven’t taken it
College     24 (73%)   3 (9%)    6 (18%)
Civil        6 (75%)   1 (13%)   1 (13%)
Chemical     2 (100%)  0 (0%)    0 (0%)
Computer     1 (20%)   0 (0%)    4 (80%)
Electrical   3 (60%)   1 (20%)   1 (20%)
Mechanical  12 (92%)   1 (8%)    0 (0%)

15b. Have you completed 4 years of engineering practice as an EIT?

            Yes       No         Working toward it
College     2 (6%)    11 (34%)   19 (59%)
Chemical    0 (0%)     1 (50%)    1 (50%)
Civil       1 (13%)    2 (25%)    5 (63%)
Computer    0 (0%)     3 (75%)    1 (25%)
Electrical  0 (0%)     2 (40%)    3 (60%)
Mechanical  1 (8%)     3 (23%)    9 (69%)

15c. Have you successfully completed the Principles and Practice Examination?

            Yes       No          Not Applicable
College     1 (4%)    27 (96%)    0 (0%)
Chemical    0 (0%)     2 (100%)   0 (0%)
Civil       0 (0%)     7 (100%)   0 (0%)
Computer    0 (0%)     4 (100%)   0 (0%)
Electrical  1 (20%)    4 (80%)    0 (0%)
Mechanical  0 (0%)    10 (100%)   0 (0%)

15d. Are you a licensed professional engineer?

            Yes      No
College     0 (0%)   30 (100%)
Chemical    0 (0%)    2 (100%)
Civil       0 (0%)    8 (100%)
Computer    0 (0%)    4 (100%)
Electrical  0 (0%)    5 (100%)
Mechanical  0 (0%)   11 (100%)

16. List your memberships in professional organizations and indicate any offices/positions you have held or are presently fulfilling.


Chemical
AIChE
Association of International Business

Civil
Have 3 kids + I work, not time for any right now
ASCE
ASCE
Tau Beta Pi (Chapter Secretary); Chi Epsilon (Chapter President); Institute of Transportation Engineers (Chapter Vice-President); Intelligent Transportation Systems of America (Chapter Vice-President)
Institute of Transportation Engineers; Transportation Association of South Carolina (Member, Board of Directors)
ASCE
ASCE

Computer
IEEE
SCOA
IEEE, ACM

Electrical
Member of IEEE
Member of South Carolina Electrical Cooperative Engineering Association
IEEE-Columbia Section

Mechanical
ASHRAE
ASME, Pi Tau Sigma, Lexington Who’s Who
ASQ
ASME; PRO/E user group head – 1998, Datex-Ohmeda
ASME, NSPE
ASME

17. List your involvement with any committees or other community organizations.

Chemical
United Way

Civil
Work for the state – involved with all of them
The Jaycees of Northern Virginia
Local symphony orchestra – Fairfax County, Virginia; local church fellowship group – Vienna, Virginia
ITE Southern District Meeting Planning Committee
Boy Scouts of America
Church

Computer


S.C. Software dev. Group

Electrical
Church-based committees (if that counts)

Mechanical
Oakland Primary School Mentoring Program
SC Libertarian Party (Richland County Vice-Chair)
Active Church member
Knights of Columbus
Edgefield Youth Soccer
Young Life
High School Science & Math Tutor
Lions Club
USA Track + Field -> held @ Milliken Headquarters – National Championship Cross Country Planning Committee

18. What conferences do you attend on a regular basis?

Chemical
Eastman Technical Conference

Civil
State-sponsored
Transportation Research Board (Washington, D.C.)
ITE
ITS America
Transportation Research Board

Computer
Computer Game Developer’s Conference
Computer Telephony Expo
CTI Expo
M.S. Tech Ed.

Electrical
Gentry Systems User Conference – software
Stoner User Conference – software
S.C. Electric Cooperative Engineering Association meetings
TCI conferences (S.C. Telephone Association)

Mechanical
Design meetings
Automated Manufacturing Exposition (Greenville, SC)

Demographic Information


17. What year did you receive your engineering degree? 1996 (100%)

18. Did you transfer to USC from another college or university?

Yes 7 (21%)    No 26 (79%)

            Yes       No
Civil       1 (13%)   7 (88%)
Chemical    0 (0%)    2 (100%)
Computer    0 (0%)    5 (100%)
Electrical  2 (40%)   3 (60%)
Mechanical  4 (31%)   9 (69%)

If yes, what was the transfer institution?

Civil
USC – Coastal Carolina

Electrical
USC Spartanburg
Midlands Tech

Mechanical
Midlands Tech.
USC Spartanburg
Clemson University

19. What was your undergraduate major?

Chemical              2 (6%)
Civil/Environmental   8 (24%)
Computer              5 (15%)
Electrical            5 (15%)
Mechanical           13 (39%)

20. What was your cumulative GPA (grade point average) at the time of graduation?

            College    Chemical   Civil     Computer   Electrical   Mechanical
2.0 – 2.49   3 (10%)   0 (0%)     2 (29%)   1 (20%)    0 (0%)       0 (0%)
2.5 – 2.99  12 (40%)   0 (0%)     2 (29%)   2 (40%)    3 (60%)      5 (45%)
3.0 – 3.49   8 (27%)   2 (100%)   1 (14%)   1 (20%)    0 (0%)       4 (36%)
3.5 – 3.79   5 (17%)   0 (0%)     1 (14%)   1 (20%)    1 (20%)      2 (28%)
3.8 – 4.00   2 (7%)    0 (0%)     1 (14%)   0 (0%)     1 (20%)      0 (0%)


21. What is your gender? Female 8 (24%) Male 25 (76%)

            Female    Male
Chemical    0 (0%)    2 (100%)
Civil       4 (50%)   4 (50%)
Computer    0 (0%)    5 (100%)
Electrical  0 (0%)    5 (100%)
Mechanical  4 (31%)   9 (69%)

22. What is your ethnicity?

            Caucasian   African-American   Hispanic   Asian/Pacific Islander
College     29 (91%)    3 (9%)             0 (0%)     0 (0%)
Chemical     2 (100%)   0 (0%)             0 (0%)     0 (0%)
Civil        8 (100%)   0 (0%)             0 (0%)     0 (0%)
Computer     4 (80%)    1 (20%)            0 (0%)     0 (0%)
Electrical   5 (100%)   0 (0%)             0 (0%)     0 (0%)
Mechanical  10 (83%)    2 (17%)            0 (0%)     0 (0%)

College of Engineering and Information Technology

Alumnae/Alumni Survey
1996 Graduates

Summary of Survey Results


Survey Administration

During March 2000, alumnae/alumni surveys were mailed to 170 students who graduated in May, August or December of 1996. Fifty-six surveys, approximately 33 percent, were returned from the post office labeled as undeliverable. Twelve surveys were resent to graduates using an alternate address. A total of 126 surveys may have reached the graduates. The analysis sample consists of 33 surveys, or approximately 26 percent of the graduates who may have received a survey. Although the sample is not very large, the return rate is about what would be expected for surveys mailed to alumnae/alumni. The following table lists the return-rate data for each program.

Return Rates by Program

Program      Number of    Number of Graduates   Number of Surveys   Return Rate
             Graduates    Receiving Survey      Completed           (%)
Chemical     22           13                    2                   15
Civil        27           22                    8                   36
Computer     21           18                    5                   28
Electrical   41           30                    5                   17
Mechanical   56           40                    13                  33
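
The arithmetic behind these figures can be checked directly. The short Python sketch below is illustrative only (it is not part of the original study; all counts are copied from the paragraph and table above). Two quirks of the source data are worth noting: the per-program "receiving survey" counts sum to 123, slightly below the 126 surveys estimated to have reached graduates, and the Mechanical rate (13/40 = 32.5 percent) was evidently rounded up to 33.

    # Hedged sketch: reproduces the return-rate arithmetic reported above.
    mailed = 170
    undeliverable = 56   # returned by the post office
    resent = 12          # re-mailed to an alternate address

    reached = mailed - undeliverable + resent   # 126 surveys assumed delivered
    completed = 33
    print(f"Overall return rate: {100 * completed / reached:.0f}%")   # ~26%

    programs = {
        # program: (graduates, received survey, surveys completed)
        "Chemical":   (22, 13, 2),
        "Civil":      (27, 22, 8),
        "Computer":   (21, 18, 5),
        "Electrical": (41, 30, 5),
        "Mechanical": (56, 40, 13),   # 13/40 = 32.5%, reported as 33%
    }
    for name, (_grads, received, done) in programs.items():
        print(f"{name:<10} return rate: {100 * done / received:.0f}%")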

Demographics

Gender, ethnicity, GPA, major and transfer status were the demographic variables requested on the survey. Seventy-six percent of the respondents are male, slightly lower than the 81 percent of males among the graduates surveyed, which indicates that females returned surveys at a somewhat higher rate than their male counterparts. Ninety-one percent of the responding alumnae/alumni are Caucasian and nine percent are African-American; these figures suggest that a higher percentage of Caucasians than of minority-group members completed surveys. None of the Asian/Pacific Islanders among the 1996 graduates returned surveys.
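
A back-of-the-envelope check of the higher female return rate is sketched below. It is not part of the original report: it assumes the five program totals (167 graduates) represent the full 1996 group and takes the stated 76 percent and 81 percent male shares at face value.

    # Hedged sketch: implied return rates by gender from the stated shares.
    respondents = 33
    graduates = 22 + 27 + 21 + 41 + 56       # 167 graduates in 1996 (assumed)

    male_resp = round(0.76 * respondents)    # 25 male respondents
    female_resp = respondents - male_resp    # 8 female respondents
    male_grads = round(0.81 * graduates)     # ~135 male graduates (81% stated)
    female_grads = graduates - male_grads    # ~32 female graduates

    print(f"Male return rate:   {100 * male_resp / male_grads:.0f}%")      # ~19%
    print(f"Female return rate: {100 * female_resp / female_grads:.0f}%")  # ~25%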

Twenty-one percent of the alumnae/alumni (7 graduates) were transfers from another institution. Previous colleges include Midlands Technical College (3), USC-Spartanburg (2), Coastal Carolina (1), and Clemson (1).

Employment

All but one of the alumnae/alumni who returned surveys are employed full time; the one exception enrolled in graduate school and is not employed. One alumnus is not employed in an engineering-related field: this chemical engineering graduate is self-employed in the import business and cites limited opportunities as the reason. Another respondent, employed as an Engineer Technician III, indicates that “lack of experience, lack of confidence, lack of knowledge” is the reason she is not working as an engineer. Despite her job title, it is evident that this civil engineering graduate does not consider herself an engineer.


A range of companies throughout South Carolina and the United States employs the sample of 32 alumnae/alumni. At least 15 of the 32 respondents (approximately 47 percent) are employed within the state of South Carolina. A list of the employers and the discipline of the graduates is given below.

Chemical

Carolina Eastman
World of Art Imports

Civil

State of New Mexico
South Carolina Department of Transportation
Consulting firm
Wilbur Smith Associates
Grant and Associates
Horry County Telephone Cooperative

Computer

Acclaim Studios
Cambridge Technology Partners
Conita Technologies, Inc.
Dell Computer Corporation
Renaissance Interactive Inc.

Electrical

Day and Zimmermann International
Mid-Carolina Cooperative
Chester Telephone Company
Yuasa-Exide Inc.

Mechanical

Becton Dickinson
City of Winder
Cutler-Hammer, Eaton Corporation
Datex-Ohmeda
General Electric Appliances
Firestone
Milliken
MECA
Reverse Engineering
Spartanburg Steel Products
Siemens

Eighty-eight percent of the alumnae/alumni (28 respondents) are satisfied with their current employment, listing a variety of reasons for this feeling. Their reasons include the following quotes from alumnae/alumni representing each program:

“Everyday is a different experience.” (Chemical)
“It has changing markets so there is always something new and improved.” (Civil)
“Pay is good, atmosphere is great, and good work is appreciated.” (Computer)
“I enjoy working with the other engineers of Mid-Carolina. Mid-Carolina supplies us with some of the best technology to do our job.” (Electrical)
“Challenging and interesting – projects are always different with new problems to tackle.” (Mechanical)

Four graduates, however, are unhappy with their positions. All are mechanical engineering graduates. Different reasons for being dissatisfied with their positions include:

“Need P.E. Certification for engineering consulting.”
“I do not think production is the appropriate setting for me.”
“Not enough information provided in school on different careers.”
“I sometimes feel out of place because of my lack of practical experience.”

Are you satisfied with your career progression? All but two alumnae and alumni, approximately 94 percent, responded affirmatively to this question. Graduates cited opportunities for advancement, new challenges, and gaining experience as the primary reasons why they are satisfied with their career progression. A typical response, from a mechanical engineer: “My position is satisfactory with opportunities for advancement in the future.” Another respondent indicated: “Yes, because I am constantly learning which is why I went into engineering to begin with. New challenges everyday.”

Are you satisfied with your salary level? Why or why not? Only sixty-three percent of the respondents (20 alumnae/alumni) expressed satisfaction with their salary at the present time. One computer engineering alumnus explained the reason for his satisfaction: “I’m keeping w/market average and have doubled my entry level salary in just 2 years.” All of the computer graduates expressed satisfaction with their salary. A civil engineering alumnus said, “Yes. I have enough to live and have fun.”

Twelve alumnae/alumni, however, indicated that they are not happy with their salaries; these include 1 chemical, 3 civil and 8 mechanical engineering graduates. A civil engineering graduate unhappy with his salary said: “Engineers as a whole are vastly underpaid.” As noted, mechanical engineering graduates accounted for 67 percent of the negative responses regarding salary. A variety of reasons were given for their dissatisfaction. Two alumni cited lack of experience or expertise as the reason. Another believes the low salary is because he works for a small company. A female mechanical engineering graduate said, “I am paid less than male employees with MUCH less responsibility than myself.”


Are you generally satisfied with your career choice? All but four alumnae/alumni expressed satisfaction with engineering as a career choice. Most alumnae/alumni mentioned that they enjoy what they are doing. Reasons for liking engineering include: “I enjoy the variation and challenge” and “All want me to be creative and use my problem-solving skills.”

Another question on the survey asked alumnae/alumni: “Do you ever see yourself leaving engineering in the future to enter another field?” Fourteen respondents, or 44 percent of the sample, indicated plans to leave engineering. Some respondents are not sure what field they will enter but others indicated an interest in management, finance, music, missions, law and education.

First-Time Employment

A majority of the alumni/alumnae, approximately 70 percent, held a different position with the same or another company before moving to their present employment. Four alumni indicated that they did not begin their first job until three to eight months after graduation. The first positions held by the responding alumnae/alumni are listed below.

Self-employed contract assignments
Target employee
Traffic engineer
Performance engineer
Construction project managing
Design engineer
Project engineer
Engineering Technician II
Programmer
Web developer
Internet developer
Plant electrical engineer
Process engineer
Army
Applications engineer
Project designer
Production manager
Contract engineer

Nine alumnae/alumni still hold the same position they entered upon leaving USC. One student enrolled in graduate school and has not worked in another job. Most alumnae/alumni accepted a job prior to graduation or within three months of graduation. Three alumnae/alumni did not acquire an engineering-related position within this time period.

Continuing Education

Forty-one percent of the respondents (13 graduates) indicated that they applied to graduate school; all were accepted and enrolled in the graduate program for which they applied. Five alumnae/alumni completed an advanced degree. The degrees received by the alumnae/alumni include: Master’s in International Business Studies (1), Civil Engineering (1), Transportation Engineering (1), Computer Engineering (1), and Electrical Engineering (1). These degrees were obtained from MIT, Penn State University, the University of Washington, and the University of South Carolina (2).

Other alumnae/alumni who are enrolled in graduate school are attending the University of South Carolina and Troy State University. One graduate indicated that she could not finish her thesis in environmental engineering because of a lack of data. She has completed all other requirements for the master’s degree.

Academic Preparation

Students were asked to rate their satisfaction with their preparation to become an engineer. Seventy percent of the graduates said they are satisfied or very satisfied with their preparation. Seven graduates indicated they were only “a little satisfied” or “not satisfied” with their training; these negative responses came from civil, electrical and mechanical alumnae/alumni. Three alumnae/alumni, graduating from civil and mechanical, selected the undecided response category.

Alumnae/alumni also rated their preparation to obtain a job after graduation. Forty-nine percent (16 students) indicated a “satisfied” or “very satisfied” response. Ten students, approximately 30 percent of the sample, responded negatively to this item: eight said they were “not satisfied” and two said they were “a little satisfied” with their preparation to obtain a job. Students selecting these responses graduated from the Civil, Computer, Electrical and Mechanical programs. In addition, seven students from the Chemical, Computer and Mechanical programs selected the undecided response to this question.

Overall, students responded positively to the question regarding their preparation to become a contributing member of society. Approximately 76 percent selected a “satisfied” or “very satisfied” response. One Electrical and two Civil Engineering graduates indicated they were not satisfied with their overall educational preparation, and seven alumnae/alumni (Chemical, Computer, Civil and Mechanical) are undecided regarding this issue.

Engineering Skills and Competencies

Alumnae/alumni were asked to provide their opinion regarding the importance of skills and their satisfaction with the level of competency they achieved on 21 different competencies as a result of their College of Engineering education. The skills and competencies identified on the survey include those recommended and outlined in the EC 2000 Criteria. For discussion purposes in the following paragraphs, these competencies are grouped into three major categories.

Category 1: An ability to apply engineering terms, principles, mathematics, chemistry/physics and liberal arts.


Importance of Skills A majority of students rated mathematics, chemistry/physics, liberal arts and engineering terms, concepts and principles as “important” or “very important” skills to possess as they relate to their engineering positions. Positive ratings for each of these skill areas included: engineering principles (94%); advanced mathematics (79%); chemistry and/or physics (65%); and liberal arts (75%). Some engineering graduates, however, classified these skills as “not important.” Thirty-nine percent of the Mechanical engineers indicated that advanced math is not important for their engineering position. Approximately 36 percent of the alumnae/alumni classified chemistry and/or physics as “not important” for their positions; graduates from all programs except Computer Engineering selected this response. Overall, 25 percent of the alumnae/alumni rated liberal arts as unimportant; Mechanical engineers comprise the majority of this response category.

Level of Competency Eighty-eight percent or more of the students responded positively regarding their satisfaction with the level of competency achieved on the skills in this category. These findings indicate that alumni are satisfied with their competency levels in advanced mathematics, chemistry, physics, liberal arts and engineering principles and theories.

Category 2: An ability to identify and solve engineering problems, design a system to meet desired needs; use the computer as an analysis tool; function on multidisciplinary teams; function in culturally diverse settings; communicate orally, in writing and with computer software; design/conduct experiments; and analyze/interpret data.

Importance of Skills All but a few of the respondents believe that the competencies listed in Category 2 are “important” or “very important” skills to possess as they relate to their employment. Positive ratings on these skills are given in Table 1.

Table 1
Positive Ratings for Specific Competencies

Competency                                      % “important” &     % “satisfied” &
                                                “very important”    “very satisfied”
Identify and solve problems                            93                  79
Design a system                                        97                  73
Use computer as a tool                                100                  85
Function on multi-disciplinary team                    97                  79
Function in a culturally diverse environment           88                  78
Oral communications                                   100                  73
Written communications                                100                  79
Use of computer software for communication            100                  76
Design and conduct experiments                         69                  67
Analyze and interpret data                            100                  73


As indicated in the table, a majority of the respondents believe these skills to be important for their jobs. In one area, however (designing and conducting experiments), respondents indicated that the skills are less important in their present positions than the other skills in this category. Graduates from the Civil and Mechanical programs account for the largest percentage of these ratings. Half of the graduates in chemical engineering who returned surveys also indicated that designing and conducting experiments is not an important part of their job.

Level of Competency Graduates rated their satisfaction with the level of competency achieved in each of these skill areas. The percentage of students selecting the “satisfied” and “completely satisfied” response categories is listed in Table 1. Overall results suggest that a majority of alumni are satisfied with their competency level for each skill, with a minimum of 67 percent of the respondents indicating a positive rating. For all skills, the level-of-satisfaction totals are lower than the percentage of alumni who selected the competency as important or very important. Although still positive, the competency areas receiving the lowest satisfaction ratings include oral communications (73%), designing a system (73%), and designing and conducting experiments (67%). Dissatisfaction response patterns were fairly evenly distributed across the programs.

Category 3: An understanding of professional and ethical responsibilities, environmental aspects of engineering, engineering on a global scale, impact of engineering solutions in global context, life-long learning, industry practices, and contemporary issues.

Importance of Skills These characteristics were assessed with seven items on the survey. A majority of alumni/alumnae rated most skills in this category as important, but the response patterns were mixed; some competencies were rated more important than others. Response levels are given in Table 2. For their current positions, alumni rated life-long learning (97%), knowledge of industry practices (97%), and ethical responsibilities (100%) as the most important skills. As seen in Table 2, alumni/alumnae rated contemporary issues as the least important competency in this category, with only 40 percent of the alumnae/alumni indicating this skill as important in their current situation.

Table 2
Positive Ratings of Specific Competencies

Competency                                            % “important” &     % “satisfied” &
                                                      “very important”    “very satisfied”
Professional and ethical responsibilities                   100                  79
Environmental aspects of engineering practice                81                  63
Practice of engineering on a global scale                    79                  45
Impact of engineering solutions in a global
  and societal context                                       78                  50
Life-long learning                                           97                  76
Basic knowledge of industry practices                        97                  45
Contemporary issues                                          40                  59

On average, response patterns for the sample indicate that alumnae/alumni rank the skills in this category as important as the skills comprising the other two groups. Response patterns for each program are similar for most of the competencies in this category, indicating no significant differences in alumni/alumnae opinions across the five engineering programs. Several obvious exceptions should be noted. In contrast to the other program graduates, none of the computer engineering alumnae/alumni believes that the environmental aspects of engineering are important for their job. Sixty percent of the computer alumni also rated the practice of engineering on a global scale as unimportant.

Level of Competency As shown in Table 2, alumni/alumnae satisfaction with their level of competency for skills in this category ranged from a low of 45 percent to a high of 79 percent, indicating that roughly half or more of the alumnae/alumni gave positive ratings for these competencies. Highly rated competencies include professional and ethical responsibilities and life-long learning. Competency ratings for these skills, on the whole, are slightly lower than the ratings for the other skills listed on the survey. More important, however, is the fact that the proportion of alumnae/alumni who rated these competencies as important does not correspond to the proportion of graduates who are satisfied with their level of competency on these skills. For example, 97 percent of the alumnae/alumni believe that basic knowledge of industry practices is an important skill to possess, but only 45 percent feel satisfied with their expertise in this area. This trend holds for each program except Chemical Engineering.
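
The mismatch described above, with importance outrunning satisfaction, can be made explicit by computing the gap between the two percentages for each competency. The Python sketch below is illustrative only and is not part of the original analysis; the numbers are copied from Table 2.

    # Hedged sketch: importance-minus-satisfaction gaps for the Table 2 items.
    table2 = {
        "Professional and ethical responsibilities":        (100, 79),
        "Environmental aspects of engineering practice":    (81, 63),
        "Practice of engineering on a global scale":        (79, 45),
        "Impact of solutions in a global/societal context": (78, 50),
        "Life-long learning":                               (97, 76),
        "Basic knowledge of industry practices":            (97, 45),
        "Contemporary issues":                              (40, 59),
    }

    # Largest positive gap = rated important but under-prepared; industry
    # practices tops the list (97 - 45 = 52), matching the discussion above.
    for skill, (imp, sat) in sorted(table2.items(),
                                    key=lambda kv: kv[1][0] - kv[1][1],
                                    reverse=True):
        print(f"{skill:<50} gap = {imp - sat:+d}")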

Professional Development

Seven survey items elicited information concerning the alumnae’s/alumni’s involvement in professional and service organizations and the engineering licensing process. Seventy-three percent of the alumni indicate they have passed the Fundamentals of Engineering Exam. Three alumnae/alumni did not pass the test and the remainder of the respondents (6 graduates or 18 percent) said they have not taken it. Only one of the alumni/alumnae (4 percent) has completed the Principles and Practice Examination. Only two respondents have completed four years of engineering practice as an EIT, and fifty-nine percent of the survey respondents are working toward this credential. None of the alumnae/alumni is licensed as a professional engineer.

Eighteen of the 32 respondents (56%) are members of a professional organization including ASCE (4), IEEE (2), ASME (4) and AIChE (1) and other local and company sponsored groups.

Sixteen of the 32 respondents (50%) listed participation in a variety of community or service organizations. Three of the 16 alumni mentioned involvement with a church or a religious organization. Five alumni/alumnae responding to this item volunteer with national civic organizations such as Boy Scouts of America, Jaycees, Lions Club and Knights of Columbus. Several graduates participate in local schools as a mentor or a tutor.


Contributions to Engineering Success

Students were asked which aspects of their undergraduate engineering programs contributed to their satisfaction working in engineering or their present career. Twenty-seven of the 32 alumnae/alumni provided a variety of responses mentioning courses, competencies, teaching strategies, instructors, organizations and specific learning experiences.

Projects, labs, teamwork, computer skills and the quality of the faculty members were the most frequently cited responses to this question. Graduates emphasized different aspects of their relationship with faculty members. For example, one alumnae/alumni concluded: “I feel that Dr. Steve McAnally’s approach during my graduate research interim helped/encourage me more than the entire previous 5 years. He allowed me to choose my own research, meet with clients, write reports, etc. with minimal ‘interference.’ The most important thing I learned, is how to self-direct.” Another alumnae stated: “I think professors that incorporate real-life problems are the ones who best prepare their students.”

Another frequently cited response concerning the factors that contributed to their satisfaction with engineering careers was the opportunity to participate in “hands-on” or “real-world” activities such as the labs offered in each program. One alumnae/alumni expressed the opinion as follows: “The lab experiments particularly the environmental labs contributed greatly to my understanding the coursework in the classroom.” Another student said: “The labs (301, 201, 401, 402) were the most educational for the real world.”

According to alumnae/alumni, class projects were an important contribution in preparing them to work as engineers. A typical explanation included: “Classes with completely project based and independently earned grades were the most beneficial to my working career.” Another alumni response indicates the helpfulness of several teaching/learning techniques: “Long term projects like senior design assisted greatly. The vehicle project teams in school gave me hands on knowledge that many engineers do not gain even after 6 years in industry.”

Regarding the importance of computer skills one graduate stated: “I would have to say the close attention to student’s computer skills. The programming techniques we all had to learn have been very useful.”

Particular courses, professional organizations, industrial input, and oral and written communications were also mentioned by the respondents. One alumnae/alumni said: “Also, through the writing of all those many, many reports we had to turn in, I had to become a pretty good user of several software applications. Those skills have also been useful.”

Most Influential Faculty Member

Alumnae/alumni were asked to identify the most influential professor in their professional development at the College of Engineering and Information Technology. Twenty-six professors were mentioned by 28 alumnae/alumni. In several instances, graduates selected more than one professor. Faculty members who were acknowledged by the 1996 alumnae/alumni include: Dan Bailey, Ronald Baus, Ronald Bonnell, Hugh Bradburn, Joseph Byrd, Francis Gadala-Maria, Molly Gribb, Sam Hilborn, Jerry Hudgins, Jamil Khan, Jed Lyons, Steve McAnally, Stephen McNeill, Michael Meadows, Walter Peters, Robert Pettus, Richard Poole, Richard Ray, David Rocheleau, Ted Simpson, Thomas Stanford, Tangali Sudarshan, Michael Sutton, Vincent Van Brunt, Juan Vargas, and Edward Young.

Three of these faculty members will be highlighted in this summary: Jamil Khan, Molly Gribb and Richard Ray. Several insightful comments were written regarding Dr. Khan, including: “Dr. Jamil Khan was the most influential – he was always willing to set time aside to assist students and ensure that they learned the material. He is very friendly and extremely knowledgeable.” Alumnae/alumni appreciated Dr. Khan’s attention to students’ individual efforts. Other students commented on Dr. Khan’s ability to challenge students to think and analyze. Another alumnus commended Dr. Khan’s efforts to motivate students: “He was a tough but fair professor. He taught you that through hard work you can excel at anything.”

A wonderful tribute was written regarding Molly Gribb. The student said: “She is one of the only professors at USC that invests a sincere effort in students and making sure that teaching takes priority over research and personal gain.” Another professor frequently praised was Dr. Richard Ray. One of the most memorable quotes concerning Dr. Ray was: “Dr. Ray showed me ways of analyzing things and breaking them down to solve a problem (generally).” Another reference to Dr. Ray said: “Even though I struggle with coursework and tests, I sensed that these individuals believed in my ability to perform engaging tasks.”

Recommendations

The former College of Engineering students provided numerous recommendations to improve the educational experience for future engineering students; 30 respondents provided thoughtful feedback. Although 21 or more different suggestions were received, most of the comments can be grouped into three major categories.

The major theme throughout student responses to this question is the need for more “hands-on” or “real world” experiences within the classroom. This suggestion was reiterated by eleven of the responding alumnae/alumni. Graduates believe that if practical knowledge is conveyed in the classroom then students will be more successful in the coursework and in their future employment. Recommendations from some of the students regarding this theme are:

“I think the students need to have more opportunities to work on real business problems, whether these are solutions labs, internships or co-ops.” (Computer)

“Apply principles through “real-life” applications.” (Civil)

“Even though theories are important, I would use more real world examples and problems in the classroom. I have learned more from real world application than I have from theory.” (Electrical)

“There should be much more exposure to the real working world. Programs that involve students and companies, so that students see 1st hand what is important, and what they should really apply their efforts to learn. Sometimes professors’ ideas of what the work-world expects of their employees is different from reality. I found that the things my professors really stressed were trivial, and the things necessary for survival were put on the back-burner. This caused me to require a lot of training, and not a whole lot to sell myself to a potential employer with. Without experience, topped with the lack of necessary skills, I think the first few companies I interviewed with laughed as I walked out the door.” (Mechanical)

As indicated within several of the above quotes, a related theme emerging from the student responses is the recommendation for more involvement of business with student coursework. One mechanical graduate succinctly stated this recommendation for outside input: “More projects/courses with private industry.” Another graduate suggested: “Bring in professional engineers who can teach you exactly what you will use when you get into your field.” This suggestion links hands-on/real-life applications with increased industry involvement.

Another group of recommendations is associated with computer and software usage and instruction. Three or more alumnae/alumni suggested that additional software/applications instruction is needed before graduation. Recommendations related to this topic included: (1) more functioning computers with up-to-date software at 300 Main Street; (2) more AutoCAD/CAD training; (3) use AutoCAD/Softdesk in class; (4) integrate with some of the computer science classes; (5) more software classes using business tools; and (6) make laptops mandatory.

Other recommendations concerned proposed changes to the current curriculum; alumnae/alumni believe courses in HVAC design, transportation, manufacturing and business/management should be added to the catalogue. One alumnae/alumni also believes that a wider variety of courses would be beneficial for future engineers. Other alumnae/alumni stated that some courses could be integrated and that instruction should be more consistent among professors.

Other recommendations from alumnae/alumni involve some general observations such as “Encourage persistence” but also some specific ones like “Put more $$ in the ChemE program” and “Don’t give grades away!” In general, recommendations were shared in a positive manner and show the care and concern the former students gave to this endeavor. Two alumnae/alumni did not share their suggestions on paper but listed a name, phone number and email address to reach them for more specific comments.

Summary

Alumnae/alumni surveys were mailed to 170 students who graduated in May, August or December of 1996. Excluding 44 surveys returned because of an insufficient address, approximately 26 percent of the sample returned completed forms. The analysis sample consists of 33 surveys with return rates for each program as follows:

Program      Number of    Number of Graduates   Number of Surveys   Return Rate
             Graduates    Receiving Survey      Completed           (%)
Chemical     22           13                    2                   15
Civil        27           22                    8                   36
Computer     21           18                    5                   28
Electrical   41           30                    5                   17
Mechanical   56           40                    13                  33

Ethnic and gender characteristics of the respondents are fairly representative of the 1996 graduate sample. Seventy-six percent of the respondents are males and 91 percent are Caucasian. These figures suggest that a slightly higher percentage of females and Caucasians completed surveys compared to the total group. Computer and electrical alumnae/alumni are somewhat under-represented in the analysis sample. Approximately 21 percent of the respondents transferred to USC from another college or university.

All but one of the alumnae/alumni who returned surveys are employed full time; one alumnus enrolled in graduate school and is not employed. One alumnae/alumni is not employed in an engineering-related field. This chemical graduate is self-employed in the import business.

A range of companies throughout South Carolina and the United States employs the sample of 32 alumnae/alumni. At least 15 of the 32 respondents (approximately 47 percent) are employed within the state of South Carolina. Some of the companies employing 1996 graduates include: Carolina Eastman, Wilbur Smith Associates, Horry County Telephone Cooperative, Cambridge Technology Partners, Conita Technologies, Mid-Carolina Cooperative, Datex-Ohmeda, General Electric Appliances, Becton Dickinson, Reverse Engineering, and Spartanburg Steel Products.

Eighty-eight percent of the alumnae/alumni (28 respondents) are satisfied with their current employment, listing a variety of reasons for this feeling: challenging projects, good pay, working with other engineers, and a variety of tasks and experiences. Four graduates, however, are unhappy with their positions. All four are mechanical engineering graduates and listed different reasons for being dissatisfied with their positions.

All but four alumnae/alumni (88%) expressed satisfaction with engineering as a career choice. Most alumnae/alumni mentioned that they enjoy what they are doing. Approximately 94 percent of the respondents are satisfied with the progression of their career citing opportunities for advancement, new challenges, and gaining experience as the primary reasons for this opinion.

Sixty-three percent of the respondents (20 alumnae/alumni) expressed satisfaction with their salary at the present time. Some alumnae/alumni indicated they had received raises and that their salaries were in keeping with the market averages or competitive in their field.

Twelve alumnae/alumni, however, indicated that they are not happy with their salaries; these include 1 chemical, 3 civil and 8 mechanical engineering graduates. All of the computer graduates expressed satisfaction with their salary. As noted, mechanical engineering graduates accounted for 67 percent of the negative responses regarding salary. A variety of reasons were given for their dissatisfaction, including lack of experience or expertise, working for a small company and unequal pay for female engineers.


Fourteen respondents, or 44 percent of the sample, indicated plans to leave engineering. Some respondents are not sure what field they will enter but others indicated an interest in management, finance, music, missions, law and education.

Forty-one percent of the respondents (13 graduates) applied and enrolled in graduate school. At the present time, five alumnae/alumni have completed an advanced degree; all degrees were obtained in an engineering-related field except one.

Three questions on the survey were designed to measure graduates’ satisfaction with their undergraduate experience within the College of Engineering. Seventy percent of the alumnae/alumni said they are satisfied with their preparation to become an engineer. Approximately 49 percent of the respondents indicated they are satisfied with their preparation to obtain a job after graduation. Finally, 76 percent of the alumnae/alumni expressed satisfaction with their preparation to become a contributing member of society.

Alumnae/alumni were asked to provide their opinion regarding the importance of skills and their satisfaction with the level of competency they achieved on 21 different skills as a result of their College of Engineering education. Alumnae/alumni report that the 21 competencies listed on the survey are important skills for their engineering work; depending on the skill, from 40 to 100 percent of respondents rated these skills as “important” or “very important.” The skills or competencies rated as important by 100 percent of the alumnae/alumni are given below:

Use the computer as a tool for analysis and design
Communicate orally
Communicate in writing
Use computer software for professional communication
Analyze and interpret data
Professional and ethical responsibilities

Although positively rated, competencies receiving the lowest endorsements include: contemporary issues (40%), chemistry/physics (65%) and designing and conducting experiments (69%).

In general, alumnae/alumni were also satisfied with their level of competency in each of the 21 skills. Satisfaction levels ranged from 45 to 91 percent of the respondents. Skills in which 80 percent or more of the alumnae/alumni were “satisfied” or “completely satisfied” with the level of competency they achieved as a result of their USC education include:

Engineering terms, principles, and theories (91%)
Advanced math (88%)
Liberal arts (88%)
Chemistry/physics (88%)
Use the computer as a tool for analysis and design (85%)

Alumnae/alumni were asked which aspects of their undergraduate education contributed to their satisfaction working in engineering. Twenty-seven of the 32 alumnae/alumni provided a variety of responses mentioning courses, competencies, teaching strategies, instructors, organizations and specific learning experiences.

Projects, labs, teamwork, computer skills and the quality of the faculty members were the most frequently cited responses to this question. Graduates emphasized different aspects of their relationship with faculty members. For example, one alumnae/alumni concluded: “I think professors that incorporate real-life problems are the ones who best prepare their students.”

Another frequently cited response concerning the factors that contributed to their satisfaction with engineering careers was the opportunity to participate in “hands-on” or “real-world” activities such as the labs offered in each program.

According to alumnae/alumni, class projects were an important contribution in preparing them to work as engineers. A typical explanation included: “Classes with completely project based and independently earned grades were the most beneficial to my working career.”

Alumnae/alumni were asked to identify the most influential professor in their professional development at the College of Engineering. Twenty-six professors were mentioned by 28 alumnae/alumni. Faculty members acknowledged by the 1996 alumnae/alumni include: Dan Bailey, Ronald Baus, Ronald Bonnell, Hugh Bradburn, Joseph Byrd, Francis Gadala-Maria, Molly Gribb, Sam Hilborn, Jerry Hudgins, Jamil Khan, Jed Lyons, Steve McAnally, Stephen McNeill, Michael Meadows, Walter Peters, Robert Pettus, Richard Poole, Richard Ray, David Rocheleau, Ted Simpson, Thomas Stanford, Tangali Sudarshan, Michael Sutton, Vincent Van Brunt, Juan Vargas, and Edward Young.

The former College of Engineering students provided numerous recommendations to improve the educational experience for future engineering students; 30 respondents provided thoughtful feedback. Although 21 or more different suggestions were received, most of the comments can be grouped into three major categories.

The major theme throughout student responses to this question is the need for more “hands-on” or “real world” experiences within the classroom. This suggestion was reiterated by eleven of the responding alumnae/alumni. Graduates believe that if practical knowledge is conveyed in the classroom then students will be more successful in the coursework and in their future employment.

A related theme emerging from the student responses is the recommendation for more involvement of business with student coursework. One mechanical graduate succinctly stated this recommendation for outside input: “ More projects/courses with private industry.”

Another group of recommendations is associated with computer and software usage and instruction. Three or more alumnae/alumni suggest that additional software/applications instruction is needed before graduation.


Appendix H

Faculty/Staff Surveys


College of Engineering and Information Technology
Faculty and Staff Survey

1999 Spring Semester

Indicate your department affiliation __________________________ Check one: _____Faculty _____Staff

The survey consists of statements about policies and programs about which individuals within the College of Engineering will have different opinions or judgments. Indicate your opinion by placing a check (√) in the appropriate column. If you have no opinion, please leave the item blank.

Response scale: Strongly Disagree / Disagree / Agree / Strongly Agree

1. I am aware of the priorities of the University.
2. I am aware of the priorities for the College of Engineering.
3. The College plans and aspirations are aligned with the University goals.
4. I am aware of the vision and mission statements of the College.
5. There is a sense of shared interests within the College of Engineering.
6. The Senior Survey, Course Survey and the results of other college studies are reported to the department chairs.
7. Budget information is shared with faculty and staff members.
8. In general, the deans and department chairs provide effective leadership and advocacy.
9. The faculty and staff are involved in the strategic planning process.
10. There is faculty and staff involvement in important decisions about College programs and activities.
11. The Faculty and Staff Advisory Councils provide an effective forum for communication between faculty and staff and the administration.

Indicate your opinion by placing a check (√) in the appropriate column. If you have no opinion, please leave the item blank.

Response scale: Inadequate / Poor / Average / Good / Excellent

12. Indicate your perception of the quality of the undergraduate programs in the College.
13. Rate the effectiveness of communication that is exchanged among the administration, faculty, and staff.
14. Rate your overall impression of the College of Engineering’s collaboration with business and industries in the state.
15. Rate the level of information you have received about the ABET process and the new accreditation criteria.
16. Rate the process for improving the quality of your program.
17. Rate the effectiveness of the College’s economic development initiatives.
18. Rate the overall effectiveness of the Professional Communications Center in providing support for improving the quality of students’ oral and written communications.
19. Rate the overall effectiveness of the Career Services Center in providing internships, co-op opportunities and the placement of graduates.
20. Rate the undergraduate student recruiting efforts of the College of Engineering.
21. Rate your awareness of programs in your discipline at peer aspirant institutions.
22. Rate your perception of how aware peer institutions are of your program.
23. Rate your perception of the quality of the information technology infrastructure within the College.
24. Rate the effectiveness of the College’s efforts in creating better public awareness of what we do.
25. Rate the effectiveness of your industrial advisory board in effecting change in your curriculum.
26. Please indicate industries’ perception of the currency and relevancy of your undergraduate curriculum.

Please indicate your perception of the overall rank of the undergraduate program for each of the two peer groups given below. 1 represents the best program among the group.

Undergraduate Programs

CHE/University Peer Group                            Regional Peer Group

_____ Indiana (no engineering)                       _____ Auburn University
_____ University of Colorado – Boulder               _____ Clemson University
_____ University of Florida                          _____ Georgia Tech
_____ University of Iowa                             _____ N. C. State
_____ University of Kansas                           _____ Mississippi State
_____ University of North Carolina – Chapel Hill     _____ University of Kentucky
      (no engineering)
_____ University of South Carolina                   _____ University of North Carolina – Charlotte
_____ University of Virginia                         _____ University of South Carolina
_____ Vanderbilt University                          _____ Virginia Tech

Graduate Programs

CHE/University Peer Group                            Regional Peer Group

_____ Indiana (no engineering)                       _____ Auburn
_____ University of Colorado – Boulder               _____ Clemson
_____ University of Florida                          _____ Georgia Tech
_____ University of Iowa                             _____ N. C. State University
_____ University of Kansas                           _____ Mississippi State
_____ University of North Carolina – Chapel Hill     _____ University of Kentucky
_____ University of South Carolina                   _____ University of North Carolina – Charlotte
_____ University of Virginia                         _____ University of South Carolina
_____ Vanderbilt University                          _____ Virginia Tech


With the limited resources and the multitude of needs, please rank the following in order of priority for receiving new funds. Use 1 as the highest ranking. Please use each number only one time.

_____ Upgrading the college computer network
_____ Machine Shop
_____ Professional Communications Center
_____ Freshman Engineering Experience
_____ Classroom enhancements to improve the teaching environment
_____ Recruiting of undergraduate students
_____ Recruiting of graduate students
_____ Start-up funds for new faculty
_____ Hiring more computer support personnel
_____ Hiring more departmental support staff
_____ Instructional laboratory equipment
_____ Computer hardware and software in departments
_____ Computer hardware and software in College labs
_____ High performance research computing
_____ Other ___________________________________
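
One straightforward way to summarize this forced-ranking item once surveys are returned is to average the rank each respondent assigns to an item; a lower mean rank indicates a higher funding priority. The Python sketch below is hypothetical (the two sample responses are invented) and is not part of the survey instrument.

    # Hedged sketch: summarizing forced rankings by mean rank.
    from statistics import mean

    responses = [   # each dict: item -> rank given by one (invented) respondent
        {"Instructional laboratory equipment": 1,
         "Start-up funds for new faculty": 2,
         "Upgrading the college computer network": 3},
        {"Start-up funds for new faculty": 1,
         "Instructional laboratory equipment": 2,
         "Upgrading the college computer network": 4},
    ]

    items = {item for r in responses for item in r}
    for item in sorted(items,
                       key=lambda it: mean(r[it] for r in responses if it in r)):
        avg = mean(r[item] for r in responses if item in r)
        print(f"{avg:.1f}  {item}")   # lower mean rank = higher priority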


College of Engineering and Information Technology

Faculty Survey

2000 Spring Semester

Please indicate your program ______________________________

1. Below are listed some skills and competencies that an engineering graduate should have according to the Engineering Criteria 2000. Please indicate your perception of the amount of experience students received in your coursework regarding these skills. Use the one course that you most often teach as the basis for answering these questions. Also indicate your opinion of the level of competency students have achieved as a result of their USC engineering education.

Response scales:
  Amount of Experience: Too Little / Adequate / Too Much
  Level of Competency: Completely Dissatisfied / Dissatisfied / Satisfied / Completely Satisfied

An ability to apply:
  Engineering terms, principles and theories
  Advanced mathematics (calculus & above)
  Chemistry and/or physics
  Liberal arts (English, history, economics, business, etc.)

An ability to:
  Identify, formulate, and solve engineering problems
  Design a system, component, or process to meet desired needs and quality
  Use the computer as a tool for analysis & design
  Function on multi-disciplinary or cross-functional teams
  Function in culturally and ethnically diverse environments
  Communicate orally, informally, and in prepared talks
  Communicate in writing - technical reports, memos, proposals, etc.
  Use computer software for professional communications
  Design and conduct experiments
  Analyze and interpret data

An understanding of:
  Professional and ethical responsibilities
  Environmental aspects of engineering practice
  The practice of engineering on a global scale
  The impact of engineering solutions in a global and societal context
  The need for engaging in life-long learning
  Basic knowledge of industry practices and standards
  Contemporary issues

2. Please indicate the extent to which you incorporate the following teaching/learning strategies and topics into the major course you typically teach each year. Second, indicate whether you use these strategies more or less than in the last academic year by checking one box.

Activities in the classroom are rated on two scales:
Extent to which you use each activity: 1 (Never) to 5 (All the time)
Direction of change: Less than last year / More than last year

Use technology to deliver instruction
Use computer activities to enhance student learning
Use a variety of methods to accommodate differences in student learning styles
Integrate math and science into engineering courses
Use a variety of teaching strategies
Interact with students outside of class
Available for student appointments
Encourage students to read professional journals
Encourage students to visit professional websites

3. Other than the course survey administered at the end of the semester (green form), how often do you ask for student input on how to improve the courses you teach?

__________ Never
__________ Seldom (once a year)
__________ Once per semester in each course
__________ Twice per semester in each course
__________ Three or more times per semester


4. Are you aware that South Carolina state law requires the administration of the first seven items of the course survey for all courses (undergraduate and graduate) each semester?

Yes No

5. One of the goals of the ABET Engineering Criteria 2000 is to improve the educational experience for students in engineering. In what ways have you tried to improve the educational experience for students in your courses?

6. How can the College of Engineering and Information Technology do more to enhance undergraduate engineering education?

7. Examine the following list of classroom assessment instruments. Indicate the extent to which you rely on a given method within the course you typically teach during one semester (consider tests, finals, assignments, projects, homework, etc.). Rate each type of assessment given below.

Types of Assessments
Response options: Never use / Sometimes in some classes / Always in some classes / Always in all classes / Not applicable for the course

Multiple choice tests/quizzes
Short response tests/quizzes
Written solutions to math problems
Development of computer programs
Written papers
Oral presentations
Oral questioning
Peer ratings
Student self-evaluations
Portfolios
Design projects
Experiments
Team projects

Others (please specify: _____________________________________)


8. List the professional development activities that you have participated in during the past year.

9. What types of faculty development activities have you participated in during the past two years that focus on ways to improve the teaching/learning process? Check all that apply. If you attended more than one conference, etc., please indicate the number in the blank.

A. ________ Attending conferences
B. ________ Presenting at conferences
C. ________ Writing articles
D. ________ Presenting within the COEIT
E. ________ Attending in-house workshops
F. ________ Attending presentations by invited guest speakers within your program
G. ________ Serving on College-wide ABET-related committees
H. ________ Serving on ABET-related committees within your program
I. ________ One-on-one consultations
J. ________ Reading journal articles
K. ________ Visiting websites of other engineering institutions

L. ________ Did not participate in any such activities

10. ABET Engineering Criteria 2000 requires the implementation of a continuous quality improvement system within each program. Provide any comments you wish regarding issues, concerns, strengths and weaknesses of this assessment process within your program.


Appendix I

Faculty/Staff Survey Reports

(sample)


College of Engineering and Information TechnologyFaculty and Staff Survey

Summary of ResultsSpring 1999

The first annual Faculty and Staff Survey was distributed to full-time employees within the College of Engineering (COE) during the first week of June 1999. A total of 72 surveys were returned with partial or complete responses. Approximately 58 percent of the respondents, or 42 surveys, were received from faculty members. The remainder, 30 respondents, indicated positions as staff. Staff members included employees from the Dean’s area, Student Services, Institutional Services and each of the four departments. Breakdowns of the returned surveys by department are as follows:

ECHE: 12 (Faculty: 8, Staff: 4)
ECIV: 11 (Faculty: 10, Staff: 1)
EECE: 11 (Faculty: 9, Staff: 2)
EMCH: 16 (Faculty: 12, Staff: 3)
Administration: Staff: 30

Unknown: 1

The survey requested faculty and staff opinions regarding a variety of college objectives and functions. The first set of statements asked respondents to react to 11 Likert-type items. Response choices included: “strongly disagree,” “disagree,” “agree,” and “strongly agree.” Over half of the faculty and staff members responded positively to 7 of the 11 items. In general, faculty and staff are aware of the goals and priorities of the University of South Carolina and the College of Engineering and believe that the COE objectives are aligned with U.S.C.’s goals. Over 93% of the respondents know that reports are given to their Department Chairs. They also expressed the opinion that budget information is shared with them and that the COE has effective leadership and advocacy.

Response patterns for four of the items in the first set suggest that over half of the faculty do not have positive perceptions regarding their involvement in the planning or decision-making process. Faculty and staff believe that the Faculty and Staff Councils do not provide an effective communication medium (62%), nor do they believe that there is a sense of shared interests within the COE (72%).

In the second set of items, faculty and staff members were asked to rate the quality and effectiveness of 15 different areas, programs or services within the COE. These survey statements were given in a Likert-type format using the following alternatives: inadequate, poor, average, good and excellent. Overall, faculty and staff respondents rated 13 of the 15 items as average, good or excellent, indicating a degree of satisfaction with the quality of the College services. Faculty and staff members rated seven of these 15 topics as good or excellent. These include: quality of undergraduate programs, ABET information, the Professional Communications Center, their awareness of programs at peer institutions, public awareness and the Industrial Advisory Board’s perception of the curriculum. Areas of the College that were rated in a negative manner by more than a third of the respondents include communication (44%) and peer awareness of USC programs (47%).

The next section of the survey asked respondents to rank the list of aspirant peers adopted for use by the CHE and legislature in the performance funding evaluation. Some of the institutions do not have an engineering program while others have a very large engineering college. The ranking of the undergraduate programs of these universities by faculty and staff members, according to the frequency of first place votes, is as follows: UVA, Florida, Colorado, UNC, USC, and Indiana, Iowa and Kansas tied for the bottom places. UVA received 20 first place votes, followed by the University of Florida with 7 and UNC with 5. These schools were also most frequently ranked second among the CHE designated peer group.

Faculty and staff also ranked the universities designated as part of the regional peer group. Rankings for the 10 regional peer institutions were also consistent among the faculty responses. Georgia Tech received 26 first place and 10 second place votes. Virginia Tech was second in the frequency of top selections, with 8 first place and 15 second place votes. NC State was a close third, trailing by a frequency of only 5. The overall ranking, in order of the frequency of first place votes, was: Georgia Tech, Virginia Tech, Vanderbilt, and NC State. Auburn, Kentucky, Mississippi, UNCC and USC were given the lowest ratings within the group.

Faculty and staff members were also asked to rate the graduate programs of the CHE aspirant peer universities. Rankings for the top four graduate programs were the same as for the undergraduate programs, with UVA receiving the largest number of first place rankings followed by Florida, Colorado and UNC. The next rankings include Iowa, Indiana, Kansas and USC placing at the bottom of the list. Rankings of the regional peer group programs were somewhat similar to the undergraduate results indicated previously. Georgia Tech obtained the largest number of first place ranks, with Virginia Tech and NC State coming next. Unlike the undergraduate rankings, USC placed fourth among the universities listed, outranking Vanderbilt, Mississippi and Clemson. Auburn, Kentucky and UNCC were ranked last.

In the final section of the survey, faculty and staff members ranked a list of 14 items that could receive funds from the College budget. The survey directions asked employees to prioritize these 14 items or add their own selections. Examination of the frequencies for a first place ranking suggests that recruiting of undergraduate students is a top priority. Priorities ranked in the next four positions include hiring more departmental staff, providing start-up funds for new faculty, the Machine Shop and classroom enhancements.

When first, second and third place priorities are collapsed, the top five priorities are as follows: recruiting undergraduates, recruiting graduates, start-up funds for faculty, lab equipment and hiring computer support staff.
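The collapsing described above is a simple tally: an item scores one point each time a respondent places it in one of its top three ranks. A minimal sketch in Python, using hypothetical ballots (the item names follow the survey form, but the data are illustrative, not the actual responses):

    from collections import Counter

    # Hypothetical ballots, for illustration only: each respondent's ranking,
    # as a dict mapping budget item -> rank (1 = highest priority).
    ballots = [
        {"Recruiting of undergraduate students": 1,
         "Start-up funds for new faculty": 2,
         "Instructional laboratory equipment": 3},
        {"Recruiting of graduate students": 1,
         "Recruiting of undergraduate students": 2,
         "Hiring more computer support personnel": 3},
    ]

    def top_priorities(ballots, max_rank=3):
        """Count how often each item is ranked within the top max_rank places."""
        tally = Counter()
        for ballot in ballots:
            for item, rank in ballot.items():
                if rank <= max_rank:
                    tally[item] += 1
        return tally.most_common()

    print(top_priorities(ballots, max_rank=1))  # first-place votes only
    print(top_priorities(ballots, max_rank=3))  # collapsed 1st-3rd ranks

Counting top-three appearances, rather than summing rank values, keeps the aggregation usable even when respondents rank only a few items.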

Analysis of the rankings by department indicates some differences among the programs. The following tables show the top five priorities for each department. The first table shows priorities based only on the frequency of #1 rankings. The second table shows the top five priorities when first, second and third place rankings are combined.


Ranked as #1 Priority

Administration Staff | ECHE | ECIV | EECE | EMCH
Recruit Undergraduates | Recruit Undergraduates
Hire staff | Hire faculty | Recruit Undergraduates
Machine Shop | Start-up funds | Lab Equipment | Freshman Year | Hire staff | Classroom Enhancements | Start-up funds

Note. Only priorities that received a first place ranking are listed above.

Combined 1st – 3rd Ranks

Administration Staff | ECHE | ECIV | EECE | EMCH
Recruit undergraduates | Start-up funds | Hire staff | Recruit graduates & Start-up funds
Recruit undergraduate
Freshman Year | Recruit undergraduates
Recruit graduates | Recruit graduates
Classroom enhancements
Recruit graduates | Hire computer personnel
Recruit undergraduates and hire faculty
Start-up funds & lab equipment
Recruit graduates | Hire faculty and hire staff
Classroom enhancements, lab equipment & computer network
Lab equipment | Lab equipment and computer support personnel
Machine Shop, Computer network, and computer support personnel


College of Engineering and Information TechnologyFaculty Survey

Summary of Survey Results2000 Spring Semester

Overview

Numerous reports over the past ten years have outlined the attributes that engineering graduates need to possess in the 21st century workplace. Engineering is part of the growing national trend toward increased accountability and assessment, intended to provide feedback from multiple constituencies and to enhance student learning in and out of the classroom. There is broad agreement that systemic reform of engineering education is needed if colleges are to develop graduates who meet these criteria and to provide evidence to legislatures, parents and potential employers that programs are achieving their stated missions, goals and objectives.

As part of this reform, institutions of higher education now focus on student outcomes or performance-based models of instruction that strive to measure what students have learned and what they can do. This altered view of the teaching-learning process also requires a concomitant change in the way learning outcomes are assessed inside and outside of the classroom. Outcomes assessment examines the results of the education process by asking to what extent students have accomplished the objectives of their discipline.

Believing in the need for change, ABET and other accrediting organizations have taken leadership roles in defining the parameters of the reform movement. This paradigm shift is clearly evident in the new Engineering Criteria 2000. EC 2000 stipulates that programs must have published educational objectives that are consistent with the mission of the institution and that they must evaluate the success of students in meeting these program objectives. The ABET criteria also require engineering programs to include a continuous quality improvement process that documents progress toward achievement of these objectives.

To advance the criteria, ABET has promoted more diversity in classroom practices that move instruction from a traditional lecture to structured activities reflecting what engineers do in the workplace. The reform movement advocates that engineering curricula incorporate a variety of teaching and assessment methods to involve students in active learning, design projects, technology use, and multidisciplinary teams. Outcomes-based assessments, in the form of design projects, portfolios, and model construction, are more direct measures of student learning than multiple exams and are strongly advocated to enable faculty members to directly link student competencies with the expectations of the workplace.

Goals and Purposes

A major goal of this survey is to provide an opportunity for faculty members to identify the strengths and weaknesses of their students and to evaluate their use of ABET-recommended teaching/learning strategies. The survey represents one of multiple methods that assess program impact within the College’s continuous quality improvement program. As key stakeholders within this system, faculty members were asked to evaluate: (1) the skills and competencies of their students, (2) their involvement in professional development activities, and (3) their use of multiple teaching and learning strategies in the classroom. The Faculty and Staff Survey administered in the fall semester will address other issues such as space utilization, IT Services, Career Services, Student Services and other college-wide services.

Administration

Surveys were mailed to 81 College faculty members on April 19, 2000. This distribution included Computer Science faculty even though they are new to the College and are probably unfamiliar with the EC 2000 Criteria. A reminder was emailed to faculty during the following week. A total of 31 surveys were received for a return rate of 38 percent.

The following return rates were obtained for each program:

Chemical 47% (7 of 15 surveys)
Civil 33% (5 of 15 surveys)
Computer 26% (6 of 23 surveys)
Electrical 33% (3 of 9 surveys)
Mechanical 53% (10 of 19 surveys)
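The return rate arithmetic above is straightforward to reproduce. A minimal sketch in Python, using the mailed and returned counts reported in this section:

    # Return-rate computation using the mailed and returned counts reported above.
    mailed   = {"Chemical": 15, "Civil": 15, "Computer": 23, "Electrical": 9, "Mechanical": 19}
    returned = {"Chemical": 7,  "Civil": 5,  "Computer": 6,  "Electrical": 3, "Mechanical": 10}

    for program, n_mailed in mailed.items():
        rate = 100 * returned[program] / n_mailed
        print(f"{program} {rate:.0f}% ({returned[program]} of {n_mailed} surveys)")

    overall = 100 * sum(returned.values()) / sum(mailed.values())
    print(f"Overall {overall:.0f}%")  # 31 of 81, or 38%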

Instrument

A three-page survey was developed to obtain information in the following areas:

Ratings of student competency for each EC 2000 criterion
Ratings of student experience with each competency (EC 2000 criteria)
Use of teaching/learning strategies
Student input regarding courses
Improving engineering education
Use of classroom assessment techniques
Involvement in professional development

Survey results, consisting of frequencies and percentages for each survey item, are given by program in the accompanying tabular report. The following paragraphs summarize the general findings for each section of the survey.

Ratings of Competencies

Faculty were asked to provide their opinion regarding the amount of experience students received in engineering courses and their satisfaction with the level of competency students achieved as a result of their USC education. Ratings were requested for 21 different skills and competencies outlined in the Engineering Criteria 2000 published by ABET. These skills are grouped into three major categories. The following section summarizes the survey findings in each category.
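The tabular report referred to above reduces to counting responses per rating category and converting the counts to percentages, per item and per program. A minimal sketch in Python, with hypothetical responses (the rating labels follow the survey scale, but the data are illustrative, not the actual returns):

    from collections import Counter

    # Hypothetical responses, for illustration only: tuples of
    # (program, competency, experience rating) on the survey's scale.
    responses = [
        ("Civil", "Liberal arts", "too little"),
        ("Civil", "Liberal arts", "too little"),
        ("Civil", "Liberal arts", "adequate"),
        ("Mechanical", "Liberal arts", "adequate"),
    ]

    def item_summary(responses, competency, program=None):
        """Frequencies and percentages for one item, overall or for one program."""
        counts = Counter(rating for (prog, item, rating) in responses
                         if item == competency and (program is None or prog == program))
        n = sum(counts.values())
        return {rating: (count, round(100 * count / n)) for rating, count in counts.items()}

    print(item_summary(responses, "Liberal arts"))           # overall frequencies
    print(item_summary(responses, "Liberal arts", "Civil"))  # by program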


Category 1: An ability to apply engineering terms and principles, mathematics, chemistry and/or physics and liberal arts.

Amount of Experience A majority of faculty members believe that their students received an “adequate” amount of experience in coursework regarding skills in this category. Adequacy ratings include: engineering terms, etc. (97%); advanced math (96%); chemistry and/or physics (84%); and liberal arts (59%). Approximately 36 percent of the faculty members believe that engineering students received “too little” experience in the liberal arts courses.

Program Results Overall, no significant differences were observed among the programs regarding the students’ ability to apply engineering terms, principles, theories, etc., and advanced mathematics. With one exception, similar response patterns were also noted for each program concerning the application of chemistry and physics. Computer faculty members were equally divided, with one faculty member choosing each response category: “too little,” “adequate,” and “too much.” Engineering professors are less unanimous in their opinion regarding the application of liberal arts. Although a substantial proportion of the overall group (36%) believes students have not received sufficient liberal arts coursework, an even larger proportion (67%) of Civil engineering faculty members believe students need more training in this area.

Level of Competency At least half of the faculty members perceive that students have a sufficient level of competency in each of the four skills in this category. The following proportion of faculty members are satisfied or completely satisfied with students’ level of competency in the application of:

Engineering terms, principles and theories (82%)
Chemistry/physics (74%)
Liberal arts concepts (72%)
Advanced math (58%)

Compared to the other key areas, the figures suggest that a lower percentage of faculty members are comfortable with students’ expertise in advanced math. In fact, survey results indicate that approximately 41 percent expressed dissatisfaction with the level of competency students have achieved in math as a result of their USC education.

Program Results The distribution of responses by program indicates that there are some differences among faculty members’ perceptions of student competency in these areas. A larger percentage of Civil and Electrical professors are dissatisfied with students’ ability to apply engineering and chemistry/physics concepts. Approximately 67 and 50 percent, respectively, of the Electrical and Mechanical respondents believe that students are under-prepared in advanced math. For liberal arts, 50 percent of the Civil and 30 percent of the Mechanical professors are dissatisfied with student skills in this area.

Category 2: An ability to identify and solve engineering problems; design a system to meet desired needs; use the computer as an analysis tool; function on multidisciplinary teams; function in culturally diverse settings; communicate orally, in writing and with computer software; design/conduct experiments; and analyze/interpret data.


Amount of Experience Overall, engineering faculty expressed the opinion that students have an adequate amount of experience with each of the ten competencies listed in this category. At least 59 percent of the faculty indicated this viewpoint for every competency. The percentages of faculty members selecting adequate as their rating of student experience are as follows:

Identify, formulate and solve problems (90%)
Use computer software for communications (79%)
Communicate in writing (76%)
Communicate orally (75%)
Function in diverse environments (75%)
Design a system, component or process (75%)
Analyze and interpret data (70%)
Use the computer as a tool for analysis and design (63%)
Design and conduct experiments (59%)
Function on multidisciplinary teams (59%)

A substantial proportion of the faculty members think that students have received “too little” experience in several areas. These competencies, including the percentage of professors indicating this rating, are as follows:

Design and conduct experiments (41%)
Function on multidisciplinary teams (41%)
Use the computer as a tool for analysis and design (37%)
Analyze and interpret data (30%)

Program Results Differences were observed among the distribution of responses for the programs. In general, responses from the Civil engineering faculty members were fairly uniform but did not duplicate trends observed with the other programs. Half or more of the Civil engineering faculty members believe their students did not have an adequate amount of experience with 6 of the 10 topics in this category. Most notably, all of the Civil engineering respondents indicated that students had insufficient experience with multidisciplinary teams and functioning in culturally diverse environments, and 75 percent believe that students lack sufficient coursework in oral communications. Other differences among the programs concern the competencies for designing a system and using the computer as a tool for design. Chemical faculty members (43%) and Civil faculty members (50%) think students need more experience with designing a system, component or process. Chemical (50%), Civil (50%), and Electrical (67%) professors believe students lack experience in analysis and design using a computer.


Level of Competency The proportion of faculty indicating a “satisfied” or “completely satisfied” response ranged from 48 to 78 percent of the total. Faculty members believe that students exhibit a sufficient degree of competency in a number of areas. Some of these include:

Use of computer software for communication (78%)
Oral communication (78%)
Identify, formulate and solve problems (69%)
Design a system, component or process (68%)
Function in a culturally diverse environment (68%)
Communicate in writing (68%)

A significant segment of the respondents identified areas needing improvement; faculty gave “completely dissatisfied” or “dissatisfied” ratings to these competencies. They include:

Use the computer for analysis and design (52%)
Analyze and interpret data (50%)
Design and conduct experiments (45%)
Function on multi-disciplinary teams (41%)

Program Results Notable differences were also observed in the distribution of responses by program. Results indicate that a substantial portion of the Civil engineering faculty are dissatisfied with the level of competency of their students on the skills in this category. Dissatisfaction ratings ranged from 40 to 80 percent on the ten topics. Areas with the highest levels of dissatisfaction include oral (80%) and written (80%) communications, design and conduct experiments (80%), analyze and interpret data (80%), functioning in a culturally diverse environment (75%), and functioning on a multidisciplinary team (75%). These findings are not surprising because civil engineering faculty indicated that students had not received sufficient experience with these skills.

Electrical faculty also indicated overall dissatisfaction with the competency level of their students on various skills such as identifying engineering problems (67%), designing systems (67%), use of the computer as a tool for analysis (100%), designing and conducting experiments (67%), and analyzing and interpreting data (100%). These are areas in which faculty also indicated that students received an insufficient amount of experience. On the other hand, 100 percent of the Electrical professors were satisfied with students’ competency in functioning on multidisciplinary teams, functioning in culturally diverse environments, oral communication, written communications and use of computer software for professional communications.

Category 3: An understanding of professional and ethical responsibilities, environmental aspects of engineering, engineering on a global scale, impact of engineering solutions in a global context, life-long learning, industry practices, and contemporary issues.


Amount of Experience For all but one skill in this area, over half of the faculty members believe students received an adequate amount of experience. The proportion of faculty indicating a rating of “adequate” ranged from 46 to 68 percent of the respondents. Approximately 54 percent of the professors, however, believe that students do not have an adequate amount of experience regarding industry practices and standards.

Program Results There was a similar pattern of responses for each program regarding basic knowledge of industry practices and standards; 43 to 75 percent of the faculty members indicate that students have insufficient experience in this area. Response patterns for the remainder of the skills in this category were mixed. In general, however, the Chemical and Civil engineering professors were likely to agree that students’ experience was adequate on these particular topics. Electrical, Mechanical, and sometimes Computer faculty regarded student experience with the environment, engineering solutions in a global context, the need for life-long learning and professional and ethical responsibilities as inadequate.

Level of Competency Overall, 50 percent or more of the faculty are satisfied or completely satisfied with the competency levels achieved by their students in each of these skill areas. The positive ratings of skills in this category ranged from 50 to 82 percent of the total. Approximately 82 percent of the faculty members are satisfied with students’ ability to engage in life-long learning, and 71 percent believe students have a sufficient understanding of professional and ethical responsibilities. The two skills identified as weaknesses are basic knowledge of industry practices and standards (50%) and contemporary issues (44%).

Program Results There was not a discernible pattern of responses by program for the competencies in this category, and the programs differed in the magnitude of dissatisfaction. Overall, 30 to 40 percent of the faculty members expressed dissatisfaction with the level of student competency for each skill, but the rate for each program varied from 0 to 70 percent, indicating a wide range of perceptions. Dissatisfaction patterns for each program are given in the following table.

Dissatisfaction Ratings for Skills in Category 3

Competency                      Chemical  Civil  Computer  Electrical  Mechanical
Ethics                             0%      50%     50%        33%         30%
Environment                       17%      50%     25%        67%         40%
Global scale                      33%      50%     25%         0%         50%
Solution in a global context       0%      50%     33%        50%         30%
Life-long learning                 0%       0%      0%        33%         44%
Business practices                40%      25%     40%        50%         70%
Contemporary issues               40%      75%     25%         0%         50%

In general, 30 to 70 percent of the Mechanical faculty members are dissatisfied with student competencies on these skills. Results seem to indicate that the other programs have selected one or two areas to emphasize within the coursework.


Student Feedback

Faculty members were asked how often they ask for student input on course improvement. Respondents were told to exclude the COEIT Course Survey (green Scantron form) from their estimate. Approximately 13 percent of the respondents indicated that they never ask for student evaluation of the course. Twenty-three percent seldom ask for student feedback. Survey results indicate that the largest proportion of faculty members, 42 percent, seek feedback from students once per semester. An additional 23 percent of the respondents obtain student input two or three times each semester.
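The statement later in this report that roughly 65 percent of professors seek input at least once per semester follows directly from these figures. A minimal check in Python, using the percentages reported above:

    # Reported response percentages for how often faculty seek student input.
    feedback = {"never": 13, "seldom (once a year)": 23,
                "once per semester": 42, "two or three times per semester": 23}

    # Faculty seeking input at least once per semester: 42% + 23% = 65%.
    at_least_once = feedback["once per semester"] + feedback["two or three times per semester"]
    print(f"{at_least_once}%")  # 65%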

State Mandated Course Evaluation

Faculty members were asked if they are aware that South Carolina accountability law mandates seven of the items included on the course evaluation survey. Only 61 percent of the faculty members responded affirmatively, indicating that additional briefings or updates on this topic would be beneficial.

Activities in the Classroom

Faculty members were asked to indicate the extent to which they engage in various teaching and/or learning strategies within the classroom. Many of the techniques or strategies listed are those recommended by ABET and other engineering reform leaders. Possible responses covered a five-point continuum from “never” to “all the time.” A score of three on this scale would indicate an “average” use of the particular strategy with scores of four and five representing “above average” usage.

Overall, results suggest that faculty engage in the specific activities most of the time. The following percentages represent the proportion of the responding faculty members who rated their engagement as “above average.”

Available for student appointments 96%
Integrate math and science within courses 84%
Interact with students outside of class 80%
Use a variety of teaching strategies 61%
Use of technology to deliver instruction 52%
Use computer activities to enhance learning 52%
Use of a variety of methods to accommodate different student learning styles 51%
Encourage students to read professional journals 45%
Encourage students to visit professional websites 40%
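The "above average" proportions in this list come from treating scale points 4 and 5 as above-average usage, as defined in the preceding paragraph. A minimal sketch in Python, with hypothetical ratings for a single activity (the data are illustrative, not the actual responses):

    # Hypothetical ratings, for illustration only, on the survey's five-point
    # scale (1 = never, 5 = all the time) for a single classroom activity.
    ratings = [5, 4, 3, 5, 2, 4, 5, 4]

    # The text treats scores of 4 and 5 as "above average" usage.
    share = sum(1 for r in ratings if r >= 4) / len(ratings)
    print(f"{100 * share:.0f}% rated their engagement above average")  # 75%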

Faculty were also asked to indicate if they engaged in these activities more, less or about the same as the previous year. Over half of the faculty indicated that they used each of the specified classroom activities more than the year before. Activities noted most frequently by the professors include:


Use technology to deliver instruction 90%
Interact with students outside of class 76%
Use computer activities to enhance learning 75%

Improving Engineering Education

One of the goals of the ABET Engineering Criteria 2000 is to improve the educational experience for students in engineering and information technology. The faculty members were asked to relate the ways they have tried to improve the educational experience for students in their courses. Twenty-seven faculty members listed more than 34 different activities that were incorporated into the courses they taught. Responses to this question suggest a variety of methodologies and strategies in the effort to improve engineering education. Several activities were listed by more than one person. They include hands-on projects (3), more use of the computer (3), more interaction with students outside of class (2), formal use of the Professional Communications Center (PCC) (2), giving students more responsibility for design projects, team projects, more business/industry applications, more appropriate projects/homework, and relating topics to current/contemporary issues.

A few of the other activities faculty members listed include adding ethics, giving a workshop on teamwork, use of industry expertise within the classroom, use of internet resources, linking exam questions to course objectives and linking course materials to other courses and engineering topics.

An open-ended survey item also asked how the College of Engineering and Information Technology could enhance undergraduate engineering education. Twenty-seven faculty members listed 29 different suggestions for change and/or improvement. The most frequently cited responses from faculty are as follows:

Hire more faculty members (5)
Re-organize/restructure college computer support services (4)
Provide funds for teaching assistants (2)
Provide funds for the purchase of computer software (2)
Provide computer training for basic software (2)
Admit more qualified students (2)

Some of the other suggestions include more staff support, teaching awards, computer projection screens in all classrooms, faculty involvement in recruiting, technicians, better labs and equipment, and an endowed chair in education.

Professional Development

Respondents were asked to list the professional activities that they participated in during the past year. Attending and/or presenting at conferences and workshops were the most frequently cited responses to this question. Fourteen faculty members indicated that they attended or presented at one or more conferences during the academic year. Six respondents mentioned attending technical seminars given by the College or those sponsored by another department or university. Three faculty members attended meetings of a professional engineering organization. Two professors indicated that they served on various committees and regularly participated in meetings.


Additionally, faculty members wrote papers, edited a journal and/or reviewed papers for publication.

Summary

An objective of this survey was to provide faculty members an additional opportunity to contribute to the continuous quality improvement process within the College of Engineering and Information Technology. Respondents evaluated students’ exposure to or experience with 21 different skills and competencies necessary for graduates to function effectively as engineers in the workplace. They also assessed students’ competencies within their program. The survey also requested information regarding other concerns such as professional development, improving engineering education, use of teaching/learning/assessment strategies within the classroom and recommendations for improvement.

On April 19, 2000, surveys were mailed to 81 College faculty members, including Computer Science professors. A total of 31 surveys were returned, for a return rate of 38 percent. The following return rates were obtained for each program:

Chemical 47% (7 of 15 surveys)
Civil 33% (5 of 15 surveys)
Computer 26% (6 of 23 surveys)
Electrical 33% (3 of 9 surveys)
Mechanical 53% (10 of 19 surveys)

Faculty provided their opinion regarding the amount of experience students received on 21 different skills and competencies within their engineering courses. A majority of faculty rated their students’ experience as adequate for all skills except one. Ratings of adequate ranged from 46 to 97 percent of the total. Approximately 54 percent of the faculty members believe that students do not graduate with a basic knowledge of industry practices. A substantial proportion of faculty members (a third or more of the total group) believe that students receive “too little” experience in additional skill areas. These include Liberal Arts (36%), use of the computer as a tool for analysis and design (37%), function on multi-disciplinary teams (41%), design and conduct experiments (41%), professional and ethical responsibilities (37%), environmental aspects of engineering (41%), engineering on a global scale (33%), and the impact of engineering solutions in a global context (33%).

Faculty members provided their perception of the level of competency students achieved as a result of their USC education. Positive faculty ratings on the skills ranged from 48 to 82 percent of the total, indicating satisfaction with students’ competency levels. Competencies achieving the highest approval ratings include:

Engineering terms, principles, theories 82%
Need for engaging in life-long learning 82%
Communicate orally 78%
Use computer software for communication 78%
Ability to apply chemistry/physics 74%
Ability to apply liberal arts 72%


Professional and ethical responsibilities 71%

The faculty members also identified areas of weakness by indicating that they were dissatisfied with students’ level of competency. Areas in need of improvement (indicated by the percentage of faculty selecting a dissatisfied response) include:

Use computer as a tool for analysis and design 52%
Analyze and interpret data 50%
Basic knowledge of industry practices 50%
Design and conduct experiments 45%
Contemporary issues 44%
Function on multi-disciplinary teams 41%
Advanced math 41%
Practice engineering on global scale 39%
Environmental aspects 37%
Impact of engineering solutions in global context 36%

Faculty members were asked to name the ways in which they have tried to improve the educational experience for students in their course. A total of 27 respondents listed more than 34 activities that were incorporated into their courses. The activities mentioned most frequently by the professors include hands-on projects, increased use of the computer, more interaction with students outside of class, and formal use of the Professional Communications Center (PCC).

Faculty members were asked to indicate the extent to which they engage in various teaching/learning strategies within the classroom. Techniques or strategies used “all the time” or “nearly all the time” by engineering faculty members include:

Available for student appointments 96%
Integrate math and science within courses 84%
Interact with students outside of class 80%
Use a variety of teaching strategies 61%

Over half of the faculty indicated that they used each of the specified classroom activities more than the year before. Activities with the largest percentage of increased usage in the classroom are listed below.

Use technology to deliver instruction 90%
Interact with students outside of class 76%
Use computer activities to enhance learning 75%

Survey results indicate that approximately 65 percent or more of the professors seek student input regarding the course at least once during the semester. Only 61 percent of the respondents are aware that South Carolina law mandates the administration of at least seven evaluation items for all courses taught each fall and spring semester.


Respondents listed the professional activities that they participated in during the past year. Attending and/or making presentations at conferences and workshops were the most frequently cited responses to this question. Fourteen faculty members attended/presented at one or more conferences during the academic year.

Twenty-seven faculty members listed 29 suggestions as ways in which the College can enhance undergraduate engineering education. Some of these recommendations for improvement include:

Hire more faculty members (5)
Re-organize/restructure college computer support services (4)
Provide funds for teaching assistants (2)
Provide funds for the purchase of computer software (2)
Provide computer training for basic software (2)
Admit more qualified students (2)


Appendix J

Entering Student Survey


College of Engineering and Information Technology

Entering Student Questionnaire1999 Fall Semester

Name of your UNIV101-E instructor: ______________________________________________

Marketing and Recruiting Information

1. What was the primary reason why you decided to attend the University of South Carolina?

________________________________________________________________________

________________________________________________________________________

2. Was USC your first choice of colleges to attend? Yes No

3. What are some of the other important factors that influenced your decision to attend USC?(Such as cost, scholarship, close to home, academic reputation, friends, parental influence)

________________________________________________________________________

________________________________________________________________________

4. If USC was NOT your first choice, please indicate the reasons. What could we have done to make USC your first choice university?

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________

5. Other than USC, list the colleges to which you applied and indicate if you were admitted.

College Name: ________________________ Admitted: Yes No
College Name: ________________________ Admitted: Yes No
College Name: ________________________ Admitted: Yes No
College Name: ________________________ Admitted: Yes No
College Name: ________________________ Admitted: Yes No
College Name: ________________________ Admitted: Yes No
College Name: ________________________ Admitted: Yes No
College Name: ________________________ Admitted: Yes No

Engineering Information

6. How did you learn about the College of Engineering and Information Technology? Circle all the options that apply.

TV Radio Newspaper ads Newspaper stories State Fair

College sponsored special events Friends Relatives Admissions fairs

Through a high school program or counselor

Others: __________________________________________________________________________

7. Did you receive enough information about the College of Engineering and Information Technology before you enrolled in Engineering?

Yes No I don’t know

8. Did you have an opportunity to tour the college? Yes No

If yes, please tell us your opinion of the tour.

Engineering Website

9. Before enrolling at USC, did you visit the College of Engineering website? Yes No

10. Do you have any recommendations for improving the website?

______________________________________________________________________________

______________________________________________________________________________

11. If you have visited the College of Engineering and Information Technology website, please indicate your impression of the following characteristics:

Response options: Not Satisfactory / No Opinion / Satisfactory / Very Satisfactory

Ease of locating site
Organization of front page
Ease of finding specific information
Completeness of information
Currency of information


Student Demographics

12. Are you employed? Yes No

If yes, how many hours do you work per week? ___________________________

13. If you are not employed, do you plan to find a job during your freshman year? Yes No

14. Please indicate your gender. Female Male

15. Did you bring a computer with you when you came to USC? Yes No

(a) If no, have you purchased a computer since enrolling at USC? Yes No

(b) If yes, indicate the brand name and type of computer you own.
______________________________________________________________________________

(c) Is your computer a desktop PC or a laptop? ____________________________

(d) Have you upgraded your computer since arriving at USC? Yes No

(e) List the software you have installed on your system.

___________________________________________________________

(f) Was this your first computer purchase? Yes No

16. Prior to this class, have you had any computer instruction? Yes No

If yes, where did you receive your instruction about computers and software? If you learned on your own – by reading manuals, etc. – please indicate that experience too.

______________________________________________________________________________

______________________________________________________________________________

Academic Preparation

17. Did you take a calculus course in high school? Yes No
If yes, what grade did you receive? _______________

18. Did you take a physics course in high school? Yes No

If yes, what grade did you receive? _______________

19. Have you given an oral presentation in any of your high school classes? Yes No

20. Did you take AP English in high school? Yes No


21. Did you write reports or papers in science or math classes in high school? Yes No

22. What best describes your attitude toward writing?

_____ Avoid it if I can
_____ Don’t enjoy it, but do a pretty good job
_____ Enjoy writing
_____ Other: _________________________________________________________________


Appendix K

Entering Student Survey Reports

(sample)


College of Engineering and Information TechnologyEntering Student Questionnaire

1999 Fall Semester Results

Note. Student responses have been typed as written. Spelling and grammar have not been corrected.

UNIV101-E instructors and the number of students completing the questionnaire:

Bowles 23
Dougal 14
Gadala-Maria 22
Gribb 19
Lyons 24
McAnally 23
McNeill 22
Van Zee 13

Total 160

Marketing and Recruiting Information

1. What was the primary reason why you decided to attend the University of South Carolina?

Close to home 44
Scholarships 29
Good engineering program 26
Cost 19
Academic reputation 19
Family or friends 9
Location 9
Honor’s College 4
Sports 3
Other: 6
  Prepare for future
  Handicap accessible
  Pre-med
  Love Columbia
  Size of campus
  Personal attention
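Tallies like these are produced by assigning each free-text answer to a category and counting. A minimal sketch in Python, with hypothetical coded responses (the category labels follow the tally above; the coding of each answer is a manual step not shown here):

    from collections import Counter

    # Hypothetical coded answers, for illustration only: each free-text response
    # has already been assigned one category by a reader.
    coded = ["Close to home", "Scholarships", "Close to home",
             "Good engineering program", "Cost", "Close to home"]

    for category, count in Counter(coded).most_common():
        print(category, count)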

It is close to home and I felt that this University would prepare me the best for the future.
Because it is nice, very well handicap accessible,
The university is close to home
I wanted be an Engineering major and USC has a very accredited engineering dept.
Cost
To join the University of South Carolina Naval Reserve Officer Training Corps
I wanted to attend a good university close to home
USC has a great Engineering School and Career Placement program
The primary reason was to put distance between home and school.
I heard USC had a pretty good engineering program, and it was close to home.
Programs and scholarships
NROTC scholarship, high academic standards
Good Reputation For Engineering + Closer To Home
The University of South Carolina has a good teaching program.
Scholarships
Scholarships
Money – scholarships, work, research


I was impressed with the College of Engineering at U.S.C.
To get a good education
Male\Female ration; scholarships; in the south
B/c of the strong engineering department, and academic success
I love Columbia
Close to home, good overall
It made more sense economically to attend USC instead of my first choice.
It has a good engineering program.
I wanted to attend USC because it’s close to home and it has a great engineering program
I wanted to get involved in the engineering department + also planning on pursuing on to Med. School.
Had the highest academic reputation of the schools that I applied to.
My brothers go there and my father is a Professor at USC.
The ability to do Chemical Engineering with Pre-medicine and more to people to meet.
Best education closer to home
USC had my major which is Engineering and also it was very close to home.
I was impressed with the College of Engineering when I visited.
This school was the closest to my home.
To get an engineering degree
I thought I would like living in Columbia but I was wrong.
Not too far from home. In state school with my intended major
Academics
They were the first school to send me information on going to school here and they carried my major.
My father influenced me a lot because he went here. My top priority was to find a large school close to home, but also away from home.
Its location and engineering program that was involved in racing attracted me to USC.
Good school
The only college I sent in an application
My primary reason was to get a better education and USC had a great Engineering program
I wanted to further my education at a 4-year college.
Financial Purposes
Scholarships
USC’s accredited engineering program
The engineering program of USC is great and the college suits my needs as a student
College of Engineering, close to home
It was a combination of the reputation of the engineering program and the layout of the campus.
Close to home, good engineering program
Scholarships awarded, change in climate
The engineering program was seeing changes toward improvement and the University was close to home and affordable
USC is close to home and the best price for my major, electrical engineering.
Two reasons: My sister attends USC and I was looking for a personal learning environment like the one USC offers
Money – I received Scholarships.
Distance, location, and quality
The cost was primary.
Because it was close to home and it’s engineering program impressed me.
Close to home, but far enough away
Scholarship money
I liked the campus and the Honors College.
Brainwashing Since Birth
Scholarships and broad curriculum
The Honors program
Lower cost
Scholarship Money
Money
The personal attention I received during my visit to the school.
I thought it was a good place for education, and it was inexpensive.
Scholarship money, proximity to home.
I liked what I saw in the college of engineering and got scholarships to allow me to go here.
I was already familiar with it and the honors college has a good reputation
Financial constraints
Location and because I thought I was accepted under an environmental engineering major
Big school, close to home
Cheap, close to home


I receive my college at a low cost due to my father’s death. Only in SC USC has a great engineering program.
USC has a good program for engineering and it is close to home
Close to home
Opportunity at a better education and location
Because of the engineering program
Family tradition
It was closer to home and a well-known school
Close to home
They provided the largest scholarships to me
Close to home
Ranked highly in the Engineering program
Size of campus
4 year ROTC scholarship
I received an NROTC scholarship
It has a big school & in state (heard that it has a good academic environment
It was close to home
To earn a degree in Computer Engineering
Academics
Because I could not afford to go out of state
The university has great credibility and has a fine college of Engineering
Only school I applied to
Location – close to home
It was near home
Close to home and already had friends there.
Good School, Close to home
Because I got accepted here
I love football + Lou Holtz.
To get a Degree in Mech. Engineering
To play for the University’s Men’s Soccer Team.
Full Ride
I toured the Swearingen Engineering program in February 1998.
I attended USC because everyone I knew was going to SCSU and I wanted to be different.
My mother forced me into going to college.
To better myself as a person + increase my earning potential
To get an education
I heard they had a good engineering program and it wasn’t too far from home
My father attended USC and I have been raised around the University.
The University is known for a good engineering program.
Location
I wanted to major in Engineering and all the schools that were recruiting me for football didn’t have it.
Location
Only good college around with engineering that would accept me.
It’s close to home, and I received scholarships
Close to home, cost.
I liked the school and everything that surrounded it
Because of how close it was.
I didn’t really care where I went, and USC gave me $1500, so I came here
Convenient, cheap, plus it’s a good school
Financial situation
I decided to attend USC because they have a very good engineering program and success rate is high
I wanted to be in Columbia.
Close to home
Big school in a city – places nearby off campus close enough to home, but far enough to live on campus
The academic status is high at this college.
Offered major I wanted.
Location, reputation
Close to home, Parents decided for me
The Chemical Engineering Program.
Close to home + scholarships
Was close to home
I have my own home in Camden & I wanted to go somewhere close to home.
I’m from Columbia and liked the Engineering school.


It has a great academic reputation.
Exceptional academic profile
Good College ranking
Close to home & good engineering program
Pretty campus and * they are very friendly to out-of-staters *
They gave me money
Scholarships
It is in state and free.
Money, Pre-Med Program
Close to home and scholarship money
I chose USC because of scholarships and the college of engineering.
The university offered more comparative financial support than other universities I was interested in attending.
Location, Engineering Program
The engineering program and financial aid programs.
Scholarship Money
It was close to home + I got a good scholarship
I received a McNair Scholarship.
Admission to the honors college, it was my first choice

2. Was USC your first choice of colleges to attend?

No 101 (63%) Yes 59 (37%)

3. What are some of the other important factors that influenced your decision to attend USC?(Such as cost, scholarship, close to home, academic reputation, friends, parental influence)

I had a brother who attended USC.
Cost cause my first choice was out of state
Academic reputation, cost
Cost and closer to home
Close to home, choice of major
Scholarships
It was cheaper for me to attend an in-state university
Close to home, reasonable affordable tuition, and diverse community
Other important factors were cost and friends
Close to home and friends I have at the college.
Close to home; the helpfulness of the staff at USC; the tours
Scholarship, academic reputation
Close to home, located in city
Close to home
Academic reputation, cost, weather
Close to home also, but mostly scholarships.
Scholarships, engineering department
Scholarships, close to home
Meeting a lot of people
Constantly improving engineering school
Closeness to home
Good programs, scholarship, academic rep., I have a good job here outside of school.
Friends, cost, Lou Holtz, developing school
It is close to home, I know people here and for the cost, it is a very good school.
I received the University Scholars Scholarship, and another scholarship from College of Engineering, + it is close to home
My parents did want me to attend this school, and I received partial scholarships.
Academic reputation, close to home
Warm weather, not too far from home.
I was close to home and I got the LIFE scholarship. The academic reputation is also good
Cost, friends
Academic reputation, suggested by an Upward Bound counselor
Mainly because it was close to home
Academic reputation, nice campus + good location.


Cost, parental influence and USC reputation as a university
Academic Reputation
Nothing except I thought I would like Columbia
Costs, friends, close to home
Friends, parental influence
In-state tuition is cheaper; I like the “city” atmosphere. Scholarships helped too.
Scholarship, close to home, parents, friends
It was close to home, and my brother previously attended USC.
Scholarship, cost
Scholarship, Friends
My friends were an important factor, but it was basically not to far back home, but close enough that I could come back.
The cost, close to home, and my family influenced my decision.
Academic reputation as well as Ann
Close to home, reputation, on going programs, advancements
Cost, scholarships, closer to home, friends, & parental influence
I received two scholarships from the school, and one of my good friends attends USC as well
Cost, close to home
Friends, cost, distance from home
Scholarships awarded, change in climate
Academic reputation, close to home, scholarships, costs, friends
Scholarship, academic reputation, and a good engineering program.
Scholarships and the location
It cost too much to attend my 1st choice school
Reasonable cost
Scholarship, close to home, my brother is here
I have friends who attend here and it was close to home.
Friends
Cost, far from home, athletics, academics, size, location
I received scholarship money and it was a good distance away from home.
All of the above
Close to home, fun campus
Cost, scholarship, size, parental influence
Acceptance into Honors College, good reputation
Music Reputation, Family influence
Close to home, best of both worlds at the Honors College
Dorms, scholarships, nice campus
Cost, scholarship, close to home, academics, friends
Honors College, cost
Scholarships, my comfort with the engineering college, and academic reputation of the college of engineering
Cost, scholarship
Friends (to a small extent) – Honors College
Reputation, location, city-like environment – appearance of campus.
1st choice did not have my major
Scholarship, friends
My family has been gamecock fans forever.
Cost and academic reputation
Cost, location, reputation
It was cheaper and I received more money from USC
Close to home, friends, parental influence
Academic reputation, close to home
Friends
I could stay at home, scholarships, and low tuition
Reputation
Academic reputation was high, soccer
Friends
Choice of major, southern school, big school
Parental influence, ROTC reputation.
Parental influence, and the university is a well accredited school
Cost, marching band
Close to home
Scholarship, close to home
It was close to home and it also had good academic reputation


In state
Scholarship, close to home but not too close, cost
Close to home, brother attends, friends, plus best choice for my major.
I had a good deal of friends that go to USC.
Life scholarship
Cost, Close to home
Close to home
Cost, close to home, scholarship, friends, parental influence.
Well, I didn’t want to go to Clemson, and this was the best in-state institution for my major
Close to home, scholarship
Scholarship
I had several family members who graduated from USC. The college is well known. Location was also a factor.
USC being close to home influenced my decision because I can go home anytime I want.
The college is close to home and my friends go here.
Scholarships, friends, and it’s close to home
Cost + academic reputation
Friends, close to home and not too expensive
USC is the only school that I applied to, and I received a nice amount of scholarship money to attend.
Life scholarship, close to home, sister
Cost, scholarship, close to home.
Close to home and friends
Cost, close to home, many friends go there
It was farther away from home
My father attended USC, and I have many friends up here in Columbia.
Close to home, scholarship, costs.
I received 2 scholarships, its close to home, my parents went here, I have a lot of friends here
It was my cheapest option, and it was an ideal campus size
Life Scholarship; Friends attend USC; Parents wanted me close to home.
Cost
In the state and close to home. Very nice people
I was given enough scholarship to make it affordable to go here. I wanted to get away from home.
Scholarship, friends
Scholarship and semi-close to home, instate
My parents, teachers, & some friends recommended it to me. The cost was reasonable.
Close to home, good reputation, friends.
My sister went here, close to home
Cost, friends, Life Scholarship, parents
The NROTC, close to home, Dad attended
Parental influence, cost
Friends, scholarships
Scholarship, close to home, & academic reputation
It is known as a good University and my girlfriend lives here.
Close to home, academic reputation
Close to home, tuition paid through scholarship, prestigious academics.
Cost; academic reputation
Cost
Climate
Cost, academic rep
Close to home, variety of majors
Scholarships totally covered me, close to home, and in state
Scholarship, cost, friends
Scholarship, close to home
The scholarships were the deciding factors. But also, proximity to home and the reputation of the Honors College were major influences.
Scholarships were the most influential reasons
Cost, scholarship, friend
Friends and close location to home.
Nice campus, good Honors College
It was mainly because it was close to home + I got a scholarship.
I was also impressed with Columbia.
Scholarships, far enough from home to be away, but not too far


4. If USC was NOT your first choice, please indicate the reasons. What could we have done to make USC your first choice university?

Nothing really. I wanted to go to UNC but the only reason I didn’t is it was very poorly handicap accessible
Was not interested in 4 year University. Send more information about my particular major at USC, not just general info about the college.
There wasn’t any particular reason why USC wasn’t my first choice.
Academic Reputation. Make the University’s courses more rigorous + comprehensive.
I was interested in the total academic, military, & extracurricular environment that was offered at the Naval Academy. There was nothing you could do to make USC my first choice.
I liked the campus at Miami because it was isolated but near a city.
USC was a close second to Georgia Tech… simply because of engineering reputation.
Wanted to go out of state
If USC had higher national rankings
Nothing could’ve changed my mind except not being accepted to the college
Offered more scholarships to rising seniors, let them know what’s available.
A more popular engineering dept. All through high school I heard only about Clemson being the engineering school, I didn’t hear anything about USC. I know USC + Clemson both have good engineering programs.
NCA&T was my first choice because they have a real good Engineering program and they’re a majority black college.
USC wasn’t my first choice because it is 6 hours away from home. Can’t do much about that.
I didn’t hear of USC at the beginning.
USC didn’t have such a developed Engineering program as Clemson did. Maybe expand your Engineering program
Clemson has a slightly better engineering program based on statistics. Find ways to make USC a looked at college for engineering grads.
Clemson was also reputed for their engineering program, but I disliked the campus. Promotional videos outlining what the university has to offer (sent to prospective students) would have helped in making USC my first choice.
Nothing could have been done. I just love Florida State
It’s not what USC didn’t do… it’s just that my 1st choice school was closer to home.
Clemson was my first choice. Most people think that Clemson’s engineering program is better and so did I.
USC couldn’t have done anything to have made this my first choice because I wanted to attend the Citadel and would have been there if they offered Mechanical Engineering.
Establish better reputation
Distance from home, Degree offerings
Make the campus prettier; if it were more prestigious
A little too close to home
School I wanted to go to cost too much (Didn’t even apply)
Get an engineering dept. on par with GAIT or a large Big Ten school
My first choice was about 500 miles closer to home.
Nothing
Been better in football and Basketball. Plus closer to my hometown.
I wanted to go to school out of state.
Nothing
UNCW was ranked higher than USC; been a higher rank
My first choice was the United States Naval Academy. USC could not have been my first choice
I didn’t know that much about the school. Ya could have advertised that the women here are the best looking in the south.
Didn’t want to be close to home
They could have offered me some more financial assistance. My sister and I graduated at the same time and we both attend here, we really need some more assistance.
Nothing, you did everything you should have. I just wanted to go to a small school first.
Because my first choice of college has a bigger reputation on the major of my interest. Let yourselves known more
Never recruited, never thought about it
I was looking for a smaller college. Later I got a reality check that smaller is not always better.
I wanted to attend college out of state, but couldn’t afford it.
Better engineering school
I had no first choices. It would have been if I received more money from them.
Not much of a recognized engineering program.
USC was my first choice
Not #1 in Engineering
I want to get away from home (out of South Carolina). Other than that it would have been my top 3 choices.
Recruited at my H.S. more.
Be more prompt in your responses when dealing with international students.
Better academic rep


Better reputation for pre-med
Higher quality pre-med program
Prestige. I still feel, though, that I shall receive perhaps even a better education at USC than my other choices.
USC was not on par academically with the top two institutions I was considering attending
I originally wanted to go to a smaller school.

5. Other than USC, list the colleges to which you applied and indicate if you were admitted.

College Name              Frequency   Admitted Yes   Admitted No
Clemson                       56           54             2
College of Charleston         15           15
Georgia Tech                  12           11             1
UNC                            7            5             2
Virginia Tech                  7            6             1
NC State                       7            7
Charleston Southern            6            6
Benedict                       6            5             1
Furman                         6            6
Auburn                         5            5
Winthrop                       5            4             1
University of Tennessee        4            4
Duke                           4            2             2
S. C. State                    4            4
Florida State                  4            4
US Naval Academy               4            1             3
Hampton                        4            4
Francis Marion                 3            3
University of Florida          3            2             1
Johnson C. Smith               3            3
Wofford                        3            3
Vanderbilt                     3            3
Lander                         2            2
Newberry                       2            2
West Point                     2            1             1
Rose-Hulman                    2            2
Rutgers                        2            2
US Air Force Academy           2            2
University of Georgia          2            1             1
Johns Hopkins                  2            2
USC-Spartanburg                2            2
Appalachian State              2            2
UNC-Charlotte                  1            1

Engineering Information

6. How did you learn about the College of Engineering and Information Technology? Circle all the options that apply.

TV                                            4
Radio                                         2
Newspaper ads                                 5
Newspaper stories                            10
State Fair                                    4
College-sponsored special events             38
Friends                                      65
Relatives                                    46
Admissions fairs                             28
Through a high school program or counselor   63
Others                                       29

7. Did you receive enough information about the College of Engineering and Information Technology before you enrolled in Engineering?

Yes 86 (54%) No 47 (29%) I don’t know 27 (17%)

8. Did you have an opportunity to tour the college?

Yes 105 (66%) No 55 (34%)


If yes, please tell us your opinion of the tour.

I would have liked to tour a larger portion of the campus. All I really saw were the dorms + library
It was great
I had the opportunity but didn’t take advantage.
The tour was very nice and filled a lot of information
Ok, but the tour would have been more helpful if it was personalized to what the student wanted to major in.
It was an excellent tour.
The tour was a great intro to the college of Engineering, and clearly presented each aspect of engineering.
It was very informative
Acceptable.
It was a great tour and it influenced my choice to attend USC
Good
It was very good!
The tour was well organized and the information given was interesting.
Very informative
It was very informative and interesting
It was ok
It was a very good tour that represented all areas of engineering very well.
It was a very informative tour.
It was a very good tour.
I took my own tour, so I can’t grade yours.
Good, fun
I thought that it was very educational. I was able to tour all of the engineering labs and it helped me to decide on a major.
During orientation I had a brief tour of the college
GREAT!!
I think that the tour was well organized and very informative.
It was well planned and the staff was prepared to show us what we wanted to see.
More information for international student
Big Campus
It was very good
It was ok, didn’t show as much as I wish you guys would have.
I thought it was very informative
It gave a foundation of understanding
Well organized
I enjoyed my visit a lot and the campus was better than I thought
A lot of resources for students to use, very impressive
It was superb.
On a one to five: 3
Very good, in depth, appreciated 1 on 1 tour
The tour gave me the information I did not receive from mail and/or friends – great help
It was a help. It let me see where my classes were going to be.
It was very interesting.
It was nice
It was excellent.
I thought it was very interesting and beneficial.
Great
Very good
Well done and enlightening about the College
Tour was informative, but should have offered more info on Honors College
I thought the tour was great because I got a personal tour and I was able to talk to a few professors
I had a good time
Good overview of the college, with insight on all aspects of the college
Good
It was informative
It was good. I got a good feel for the campus.
My family and I enjoyed the tour. It was very well planned and very informational.
Didn’t see enough of the campus
The tour was very informative
Good


Didn’t go
Well rounded
Tom Ward gave me a great tour. He showed me a lot of the Engineering College.
Very nice
Satisfactory
I thought it was well organized, and very helpful.
It was fine
I was able to learn my way around
Well organized.
The tour was to my satisfaction
Too short, only showed a few things
Very Educational, Exciting
I had an opportunity but I didn’t take advantage of it.
Very well.
It was okay. I think I made my decision too early though.
The tour was very thorough. The demo w/ the PC camera was neat.
It was tiring
Very informative
Very good.
Great
It was good; there is a lot to offer
The facilities appeared to be top of the line, and everyone was very helpful.
Just like a tour should be – informative
I enjoyed the tour because I got to see all the different types of engineering here at USC
It was good, but should’ve covered the campus better
It was adequate, but could have been more detailed
It was great
Enjoyed it. Time limit was strongly enforced.
It was very good. Tom Ward was an excellent guide.
This campus is very large, and the people here so far are very nice.
Very good
It was well planned out, didn’t really tour
I really enjoyed the engineering orientation held over the summer
I enjoyed the tour, but hopping from building to building, upstairs & downstairs, back upstairs & back downstairs was absolutely ridiculous. Plan the areas to visit better this year.
It was very interesting and informative.
It was very educational and I felt confident that engineering was the right major for me.
It was not one provided by the college. A friend of ours went to school here and she showed me around.
Good
The tour was informative, and I felt familiar with the college afterwards.
Fair
Pretty good
I enjoyed it.
It was very informative.

Engineering Website

9. Before enrolling at USC, did you visit the College of Engineering and Information Technology Website?

Yes 45 (28%) No 115 (72%)

10. Do you have any recommendations for improving the Website?

It was good.
There ought to be a more helpful section on connecting to the engineering server from across campus because I’ve had a lot of trouble connecting from Maxcy.
No, just get the instructors to show us how to get logged on better.


More accessible to students off campus. Links to USC system
Have a shorter URL for the website.
Faster Access
To be able to be accessed from any computer on campus.
Students can’t get into the engineering account unless they’re in the engineering building; change that.
Homepage is not eye-catching. Maybe more colors or designs.
Show more activities / events that go on during the year.
Include typical course requirements
Common schedule for each degree w/requirements

11. If you have visited the College of Engineering and Information Technology Website, please indicate your impression of the following characteristics:

                                       Not Satisfactory   No Opinion   Satisfactory   Very Satisfactory
Ease of locating site                      2 ( 2%)          12 (12%)     51 (51%)        36 (36%)
Organization of front page                 0 ( 0%)           6 ( 6%)     46 (46%)        49 (49%)
Ease of finding specific information       6 ( 6%)          10 (10%)     64 (64%)        21 (21%)
Completeness of information                6 ( 6%)          14 (14%)     59 (59%)        21 (21%)
Currency of information                    4 ( 4%)          14 (14%)     57 (57%)        26 (26%)

Student Demographics

12. Are you employed?

Yes 42 (26%) No 117 (73%)

If yes, how many hours do you work per week?

5 - 10 hours    11 (28%)
11 - 15 hours    9 (23%)
16 - 20 hours    9 (23%)
24 - 32 hours   10 (26%)

Total 39

13. If you are not employed, do you plan to find a job during your freshman year?

Yes 48 (42%) No 67 (58%)

14. Please indicate your gender.

Female 34 (21%) Male 125 (79%)

15. Did you bring a computer with you when you came to USC?

Yes 63 (40%) No 94 (60%)


a) If no, have you purchased a computer since enrolling at USC?

Yes 12 (12%) No 90 (88%)

b) If yes, indicate the brand name of the computer you own.

Gateway 14 (21%) Compaq 12 (18%) Hewlett-Packard 10 (15%)

Packard-Bell 6 ( 9%) Dell 6 ( 9%) Toshiba 3 ( 4%)

Home/Custom-made 5 ( 7%) IBM 1 ( 1%) Other 11 (16%)

c) Is your computer a PC or a laptop?

PC 58 (75%) Laptop 17 (22%) Both 2 ( 3%)

d) Have you upgraded your computer since arriving at USC?

Yes 18 (19%) No 75 (81%)

e) List the software you have installed on your system.

Windows 98, Office 98
Office 97
Win 98, Office
A lot (everything required by CEAIS)
Windows 98
Windows 98 2nd edition
Windows 98
Office 2000, Games, MS Publisher 2000
Windows 95
Internet
Ethernet Card, HD
None
Microsoft Office, Windows 95, Adobe Photoshop, Netscape Composer
Win 98
Windows 98, MS Office 2000, AOL
I Don’t Know
Huh? Are you serious? Windows 98 + a bunch of other stuff
Win 98, Office 2000
Microsoft Office, Games, music related, office 2000, movies, C++, many more
Office 97, works suite, games
Win 98, Corel Quatro Pro 1
Win NT, Office 97 Professional, etc
Several Games
Microsoft Office, lotus
Basic downloads
Netscape 4.6; 500 mhz; 13 G hard drive
3D accelerator card + games
Office 2000, Win 98
Office 2000
Lots of games, voodoo 3000, Corel suite 8


Everything
MS Office, MS Works, Print Master gold, Print Shop
AOL, Office 97
Windows 98
MS 98
Windows 95
Windows 95, I need MS Office & Windows 98
MS Office
MS Office, Explorer
Microsoft Windows 98, Office, Word, Excel, Outlook
Microsoft Office, AOL, AutoCAD
Windows 98, Word 98, etc.
MS Office 97, Corel Word Perfect 8
Microsoft Office 97, Windows 98
Ms Office, AOL
Win 98, Office, AOL, Adobe Photoshop, etc.
Just the stuff that came with it
Windows 98, Office 2000 Professional, IE5.0
MS Office 2000, Netmeetings 3.0, Quicktime 4.0 pro
Microsoft Prof. Office, Microsoft Office 2000
Microsoft, Internet 4.0
Microsoft Office, Win 98
Microsoft Office 2000
Win 98, Office 97
MS Works, MS Word
GRIN, games, FTP searchers
Win 95, AutoCAD ver. 10
Too Much – Microsoft Office professional, Win 98 o/s, Paint Pro 5.0, Adobe, …
MS Office 97
Microsoft 2000
Microsoft office, various games + entertainment, etc…

f) Was this your first computer purchase? Yes 31 (37%) No 53 (63%)

16. Prior to this class, have you had any computer instruction? Yes 129 (83%) No 26 (17%)

If yes, where did you receive your instruction about computers and software? If you learned on your own – by reading manuals, etc. – please indicate that experience too.

I took a computer electronics class in high school + also worked at a local business for 1 year building, repairing, troubleshooting, + installing software on computers.
On my own by playing with them
By reading manuals, trial and error, and generally messing around.
Taking computer courses in high school.
High School, personal use, peer help
Through computer class in High School.
High school
High School Computer Tech I & II
I received instruction about computers and software in high school.
From my grandfather, and from making mistakes on my mom’s computer and trying to fix it
High school, C++ language
SHS (Socastee High School) – Typing instruction, Academy For Arts, Science, & Technology – Microsoft Word, Microsoft PowerPoint, and AutoCAD V. 12
High school
High School/on my own – bought programming books and read info on the Internet


High school, learning on my own, etc.
School, friends
Computer class in high school
High school classes
I knew a great deal about computers. Mostly self-taught; Two years of AP computer courses
Keyboarding class (high school)
Learned a lot on my own
Middle + high school Micro Computer courses. CAD programs, and work.
At Ridge view high I took drafting and a word processing class.
High School Classes
High school classes
From my brothers
Learned on my own, computer application classes in high school
Reading manuals, CPT 101
Took a class at Macon Technical Institute (Macon, GA), and I mostly learned on my own.
High school courses
Computer literacy course
I worked on them when I was in the Army. The little that I know I taught myself.
School, Internship
High School
I took a computer applications class my senior year in high school, I went to a summer computer camp a few years ago, and I learned a lot on my own.
I took a computer science course in high school, but learned a lot about software on my own or through friends.
I basically learned on my own and through friends.
School, reading manuals, going through the programs
Experience, classes in high school
Learned through some in one of my high school classes and I taught myself
I was taught by my parents + learned on my own
Learned on own
High school classes: Computer Business Applications I & II; self-teaching
GCHS – typing class, GCHS/Trident Technical College – 2 yrs. CADD class
A few computer classes in high school; computer manuals
Darlington high school – Darlington, SC Computer Technology I and II
Play + learn, High School, friends
R.B. Skill High School, learned on my own (reading manuals)
On my own by playing around on the computer and also at Dorman High School.
Middle School
School class / father / self / work
Classes in High School.
I mostly learned by trial and error but I did take computer science in high school.
High school computer science
High school – info. word processing + keyboarding
Self taught, manuals, school
Computer Tech at James Island High School
Learned on my own, high school
Family, computer science
I took computer programming in high school, including AP C++, but I knew how to use a computer well before that
Reading manuals / computer science class on C++, education programs
Reading manual about programming, tutoring from my father and school classmates
Self – taught, Computer Science in High school
Mostly self-taught by trial & error
Self interest
I have read Linux books and normal operating books. I took a Computer Electronics I & II at Sumter County Career Center
High school
High school and reading manuals
I took two computer courses in high school
High School Class
High school
Taught myself
@ Airport High School
School


Software Tool’s class
Reading by myself, Dad’s instruction, High School
School
Computer Software class in 10th grade.
High school
Learned on my own by reading manuals
Work – Savannah River Site
High School
Learned on my own and some classes in high school
High school and other college
Basically hands on, but some in high school.
Personal, high school
High School, reading Manuals
In High school
My 9th grade computer / typing class.
Aiken Tech. Col.
High School Applications class
Reading books and an Intro to Computers course in high school
I received instruction in high school
At High school during Business application classes
On my own, and in high school
High school
Computer class in high school
Manuals, previous computer experience, classes taken during high school.
In high school computer classes as well as an engineering summer program this past summer
School, friends, learned on own
On my own.
Own experience, school, + relatives.
Self-taught.
Reading manuals, relatives
High school
I learned a little in Keyboarding class in High School. I also figured out a little on my own
High school computer course.
Richland Northeast High School
In high school.
FAMU
School, and reading manuals. I know html, JavaScript, + parts of Java.
Learned on my own, 3 semesters of HS instruction
On my own and from my Dad
High school
Reading manuals & just playing around
Reading and taking computer-based classes.
Through high school keyboarding classes and use @ school & home for research projects
School and by reading manuals
Learned on my own & learned from family & friends
I learned on a clerical job and in a word processing class in high school.
High school typing, learned on own
My own, manuals
Some on my own from loans, some in middle and high school
High School and by reading manuals
High school / Middle School – Word processing, data entry, data manipulation, general programming, Internet browsing, web site creation and maintenance, hardware installation.
Basic computer class
Independent… C, C++ (four years), DOS (12 years), Windows (6 years), Excel, Word, etc…
On own, University of Cincinnati 1 semester of Computer Science, High School Personal keyboarding…
I learned it on my own and in English and Man classes at School.
High school, private study


Academic Preparation

17. Did you take a calculus course in high school? Yes 103 (65%) No 56 (35%)

If yes, what grade did you receive?

A 44 (45%) B 40 (41%) C 12 (12%) D 2 (2%)

18. Did you take a physics course in high school? Yes 113 (71%) No 46 (29%)

If yes, what grade did you receive?

A 61 (56%) B 36 (32%) C 10 (9%) F 1 (1%)

19. Have you given an oral presentation in any of your high school classes? Yes 152 (96%) No 6 (4%)

20. Did you take AP English in high school? Yes 48 (30%) No 111 (70%)

21. Did you write reports or papers in science or math classes in high school?

Yes 106 (67%) No 52 (33%)

22. What best describes your attitude toward writing?

30 (19%)   Avoid it if I can
81 (51%)   Don’t enjoy it, but do a pretty good job
30 (19%)   Enjoy writing
17 (11%)   Other


College of Engineering and Information Technology
Entering Student Questionnaire

Summary of Results
1999 Fall Semester

Goals/Objectives

The Entering Student Questionnaire is administered to freshmen during the first month of the fall semester each year. All students enrolled in UNIV 101-E are asked to complete the 22-item survey. The survey elicits information regarding several topics of interest including: 1) marketing of the College and the engineering programs; 2) recruitment; 3) public relations; 4) the College of Engineering and Information Technology (COEIT) website (http://www.engr.sc.edu); 5) student employment; 6) computer ownership, hardware, software and training; and, 7) academic preparation for college.

Survey Administration

During the 1999 fall semester, there were nine sections of UNIV 101-E enrolling 204 students. All but one section of the introductory course completed the survey. A total of 160 surveys were collected, yielding a return rate of approximately 78 percent. A few students from several sections were absent on the day of survey administration.
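
The return rate reported above follows directly from the two counts in this section; a minimal sketch of the arithmetic in Python (the variable names are illustrative):

```python
# Return-rate arithmetic for the 1999 fall administration.
enrolled = 204   # students enrolled across the nine UNIV 101-E sections
returned = 160   # completed surveys collected

return_rate = returned / enrolled
print(f"Return rate: {return_rate:.0%}")  # prints "Return rate: 78%"
```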

Description of the Respondents

The survey sample was composed of 160 students, of whom 79 percent are male. Approximately 26 percent of the respondents are employed, working from 5 to 32 hours per week. Approximately 74 percent of the employed students indicated that they worked 20 hours or less each week. Approximately 42 percent of the unemployed students said they would be looking for a job during their freshman year.

Academic Preparation

Students were asked questions relating to the math, science and English courses taken in high school. Survey results indicate that 65 percent of the freshmen took a calculus course. Forty-five percent of those students received a grade of A and 41 percent made a B. A total of 71 percent of the freshmen completed a physics course in high school. Students received a range of grades from A to F, with a majority, 56 percent, achieving an A. Almost all of the students, 96 percent, indicated that they had given an oral presentation, and 67 percent said they were required to write reports in science or math classes in high school. Most freshmen did not enroll in AP English; only 30 percent of the students took this course in high school.

The questionnaire included an item eliciting students’ attitudes toward writing. Seventy percent of the freshmen stated that they avoid writing or that writing is not something they enjoy.


Even though 51 percent do not like to write, they believe that they “do a pretty good job” when they must prepare a paper.

Recruitment

Students were asked to identify the primary reason why they decided to attend USC. There were 158 responses to this question. Over half of the respondents listed more than one reason, indicating that the decision to enroll may be a combination of several factors of roughly equal importance.

The most frequently cited reason for choosing USC is the proximity of the university to their home. Close to home was listed by 44 students, or approximately 28 percent of the students who completed this item. Students mentioned eleven different categories of responses to this question. Some of the other reasons listed by the freshmen include: good engineering program (26 students); scholarships (29 students); cost or financial aid (19 students); academic reputation (19 students), had major/an accredited engineering program (15); location (9); family (9); and the Honors College (4 students).
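
Tallies like those above require coding the open-ended responses into categories. The monograph does not describe the coding procedure that was used, but a minimal keyword-based sketch in Python conveys the idea; the category names and keyword lists below are illustrative assumptions, not the College’s actual coding scheme:

```python
# Illustrative sketch: tally free-text reasons into assumed categories.
from collections import Counter

CATEGORIES = {  # hypothetical keyword lists, not the actual scheme
    "close to home":        ["close to home", "near home", "closer to home"],
    "scholarship":          ["scholarship", "full ride"],
    "engineering program":  ["engineering program", "college of engineering"],
    "cost":                 ["cost", "cheap", "tuition", "afford"],
}

def code_response(text: str) -> list[str]:
    """Return every category whose keywords appear in a response."""
    text = text.lower()
    return [cat for cat, keys in CATEGORIES.items()
            if any(k in text for k in keys)]

responses = [
    "Close to home + scholarships",
    "Because of the engineering program",
]
tally = Counter(cat for r in responses for cat in code_response(r))
print(tally)
```

Because a single response can match several categories, the category counts can sum to more than the number of respondents, which is consistent with the multi-reason responses noted above.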

Only 37 percent of the students indicated that USC was their first choice among colleges they considered for enrollment. In a follow-up question, students were asked why USC was not their first choice. The most frequently cited response was that other colleges/universities have a better academic reputation than USC. Other reasons given by the 1999 freshmen include 1) USC was too close to home; 2) Clemson, NC State, Georgia Tech, and Florida State have more popular engineering programs; 3) liked another school or campus; 4) wanted to be out-of-state; and 5) wanted to attend a smaller school.

To assist in the recruitment efforts, the survey asked students to list all colleges to which they applied and to indicate if they were admitted to these colleges. The data suggest that students applied to numerous in-state and out-of-state colleges and universities. Some students applied to multiple colleges (as many as seven were listed), while other students applied to only one or two. A few students indicated that USC was the only school to which they applied. The colleges most frequently listed by the students include: Clemson (56 students); College of Charleston (15 students); Georgia Tech (12 students); UNC, Virginia Tech, and NC State (7 students each); Charleston Southern, Furman, and Benedict (6 students each); and Auburn and Winthrop (5 students each).

Marketing

Students responded to several items concerning the information they received about the College of Engineering. Most students indicated they learned of the College from friends (65), high school programs or counselors (63), and relatives (46). In addition, a substantial number of students also listed College-sponsored events (38) and admissions fairs (28) as information sources. It is noteworthy that few students indicated TV, radio, newspapers or the S.C. State Fair as sources of information about the College of Engineering. About half of the students (54 percent) believe they received sufficient information about the College before they enrolled in Engineering.


A guided tour was another way in which students learned about the College; approximately 66 percent of the freshmen toured the College of Engineering prior to enrollment. Students were asked to give their opinion of the college tour. The question was intended as an evaluation of the College of Engineering tours, but some students interpreted it to mean the University tour. Opinions regarding the Engineering tours were very favorable. Thirty-five students characterized the tour as excellent, great, very good or good. An additional 21 students believe the tour was very informative, helpful or educational. Ratings of acceptable, adequate, fine and satisfactory were expressed by 13 students. Other comments indicated that the tour was well organized and enjoyable.

COEIT Website

Students responded to seven items regarding the COEIT Website. Only 28 percent of the respondents visited the site before enrolling at USC. Students who visited the COEIT Website rated five different components on a scale from 1 to 4, selecting from the following choices: not satisfactory, no opinion, satisfactory and very satisfactory. Regarding ease in locating the site, approximately 87 percent of the students were satisfied with this characteristic. Ninety-five percent of the respondents believe the organization of the front page is satisfactory or very satisfactory; this was the highest rated item on the website section of the survey. At least 80 percent of the students also rated the following characteristics in a positive manner:

Ease of finding specific information (85%)
Completeness of information (80%)
Currency of information (83%)
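
A minimal sketch of the aggregation used here, in Python: a component’s positive rating is simply the sum of its “satisfactory” and “very satisfactory” percentages from the table in item 11.

```python
# Combine "satisfactory" and "very satisfactory" percentages per component.
ratings = {  # (satisfactory %, very satisfactory %) from item 11
    "Ease of locating site":                (51, 36),
    "Organization of front page":           (46, 49),
    "Ease of finding specific information": (64, 21),
    "Completeness of information":          (59, 21),
    "Currency of information":              (57, 26),
}
for component, (satisfactory, very_satisfactory) in ratings.items():
    print(f"{component}: {satisfactory + very_satisfactory}%")
```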

Computer Ownership

Forty percent of the freshmen (63 students) brought a computer with them to college. Even though the survey was administered within the first month of the semester, 12 percent of those who did not bring one said they had purchased a computer since enrolling at USC. This information indicates that 75 of the freshmen (or approximately 47 percent of the respondents) brought a computer to USC or purchased one after they arrived. Seventy-five percent of the student computers are desktop PCs and 22 percent are laptops; a few students (3 percent) have both. The brands of computers owned by the students are listed below.

Gateway           13
Compaq            12
Hewlett Packard    9
Dell               6
Packard Bell       6
Home built         5
Toshiba            3
IBM                2
Others             9


Approximately 19 percent of the respondents indicated that they have upgraded their computer after arriving for classes. When asked if this was their first computer purchase, only 37 percent of the students answered affirmatively.

Students were asked to indicate the type of software they installed on their computer. Students listed a wide variety of word processing, graphics and networking software. The most frequently cited software selections are given below:

Office (97, 98, 2000)                      37
Windows (95, 98)                           26
Games                                       9
AOL, Netscape, Internet                     8
Works Suite                                 3
Adobe Photoshop, Paint Pro, C++, Lotus,
  Quatro Pro, Netmeetings, Quicktime, etc.  7

Survey responses indicate that 83 percent of the freshmen had computer instruction prior to entering UNIV 101. Students gained knowledge and practice with computer software in a number of ways. Most frequently, students enrolled in a course in high school; approximately 81 percent learned a mix of basics and programming in a high school course. In addition, many freshmen (43 students) state that computer knowledge was gained by “playing with them” or “learning on my own.” In some cases computer manuals or books were used to acquire basic skills (25 students). Another resource cited by a substantial segment of the group is the help received from parents, relatives and friends (20 students). Finally, seven freshmen mentioned that their computer skills were enhanced on the job.


Appendix L

Performance Assessment Instrument


Oral Presentation of EMCH 467 Senior Project

As outlined in your course syllabus, EMCH 467 students are required to make a 20-minute oral presentation of their senior project. The presentation will include the use of PowerPoint graphics to display key ideas, data and results of your study. The presentation is the culmination of your work on the project. This is your opportunity to present your ideas, design and research results to the industry representatives who solicited assistance with their mechanical engineering problem. The industry representatives, your classmates and departmental instructors will be present during your presentation. All audience participants will be asked to critique your oral presentation using a specified format. You will receive the results in a tabulated, summarized form within a week of your presentation. In addition, you will complete a self-evaluation of your presentation. All details of this activity are outlined below.

Course Learning Outcomes for the Oral Presentation

· The student will display effective communication skills that would be expected in a job setting.

· The student will demonstrate the capability to prepare and utilize PowerPoint software when making an oral presentation.

· The student will include all expected components of a research design in the presentation and demonstrate an understanding of the content of each component and how they are interrelated.

Assistance with the Presentations

Staff in the Professional Communications Center are available to assist you with your speech and the PowerPoint software. Consultants can advise you regarding the organization, clarity, length, body language, speech patterns and other elements of your presentation. You are encouraged to practice your speech so that you become familiar with the details of your report, stay within the 20-minute time frame, and make an articulate and poised presentation to your colleagues, industry representatives and engineering faculty members.

Date of Presentation

Presentations will be made during the last three class periods of the fall semester. A sign-up sheet for choosing your presentation date and time will be circulated during the first part of November. On the list indicate the title of your presentation and any audio-visual equipment or supplies you will need for your presentation.

Setting


The presentations will be held in the Faculty Conference Room. A computer, projector and screen will be set up for your use. You are responsible for preparing the PowerPoint slides for the presentation; bring your disk for this purpose.

Performance Expectations

Your presentation should represent a critical analysis and synthesis of your research project. Elements that should be covered in your presentation include:

· Statement of the Problem What is the question addressed by the study? What are the goals of the research?

· Description of the Design Process This should include a brief summary of the relevant theoretical background and an overview of the methodology utilized to examine the problem.

· Findings Overview of the data analysis and results, giving the appropriate statistics used in the study. Provide a synthesis of the results.

· Conclusions and Recommendations Provide an evaluation of your work indicating your conclusions and the reasons for your particular recommendations. Cost and feasibility projections should be included if appropriate.

Grade and Grading Criteria

The oral presentation will be worth 85 points toward your total grade in the course. You will be evaluated on three components: the technical content, the use of PowerPoint in your presentation, and your communication skills. The grading rubric (checklist) is set up in tabular form and is provided for your information. This rubric will be used by your peers, faculty members and the industry representatives to evaluate your performance.

You will be evaluated on 17 different elements. These elements are listed in two tables on separate pages. The table of elements evaluating the content of your speech is entitled Evaluation Rubric for the Technical Content of the EMCH 467 Oral Presentation. An additional rubric is also given for the speech and graphics components, entitled Evaluation Rubric for Communication Skills and PowerPoint Graphics of the EMCH 467 Oral Presentation.

Each element will be rated on a scale from 0 to 5. The highest score is a 5 and represents an excellent performance on each element. The specific criteria that all raters will use are outlined below.

Analytical Rating Scale for the Technical Content of the Oral Presentation

Rating Description


5 Strong, organized and analytical focus. Evidence given of depth of understanding. Responds to all elements of the item. Uses convincing evidence to support the problem, goals and solutions. Shows signs of original thinking and creativity.

4 Presents concepts and processes in a meaningful manner. Cites elements appropriate to the item and clearly links these to the problem or goals. Discusses all major elements and issues. Lacks some clarity or understanding, or provides an incomplete description.

3 Demonstrates comprehension of pertinent concepts and processes. May contain some errors. Responds to only part of the item. Somewhat disorganized.

2 Weak or implausible coverage of the item. Information provided lacks depth or may contain factual errors. Information may be irrelevant to the problem or solution. Lack of understanding of content or process.

1 Attempts to respond to the item but fails to provide detail and sufficient coverage. Disconnected discussion. Few, if any, factual illustrations to support statements, or does not include relevant information.

0 Not present to give presentation at assigned date and time. No attempt to answer the item in any meaningful way.

Analytical Rating Scale for Use of PowerPoint Software and Communication Skills

Rating Description

5 Excellent - Very effective communication. Arouses interest. Directs attention to speech topic. Smooth transitions. Pleasing and natural movements which emphasize speech. Points meaningful and clear. Connection with audience present throughout presentation. Faced audience when speaking. Eye contact with audience. Spoke clearly and projected voice so all could hear. Effective use of time. Did not have to rush to finish. Allowed time for questions. Answered questions effectively.

Slides are effective and used to reinforce points in the presentation. A sufficient number of slides are presented that complement the presentation.

4 Very good - Effective communication. Interesting presentation. Needs a little polish. Pleasing and natural movements which emphasize speech. Transitions or flow of speech could improve. Points meaningful and clear. Connection with audience present through most of presentation. Maintained eye contact. Spoke clearly and projected voice. Effective use of time. Did not have to rush to finish or was a little hurried. Allowed time for questions. Answered questions effectively or could use a little improvement.


Slides were meaningful and used to reinforce points in the presentation. A sufficient number of slides are presented that complement the presentation.

3 Average - Makes adequate presentation. Needs more tonal inflection and fewer distracting mannerisms. Points made are not always clear. Connection with audience is noted only in part of the presentation. Spoke loud enough to hear most of the time. Two or more of the following elements might be missing. Not effective use of time. Rushed to finish. No time for questions. Questions not always answered completely.

Includes appropriate slides but lacking in number and quality of information. Or too many slides presented.

2 Below Average - Ineffective presentation. Several areas need strengthening. Three or more of the following elements were noted. Abrupt transitions. Visual aids not incorporated smoothly into presentation. No eye contact with audience. Does not project voice. Does not face audience when speaking. Has to rush to finish. Does not allow time for questions.

Visual aids not used to make important points. Not an organized slide presentation. Slides or graphics not produced correctly.

1 Unacceptable - Ineffective presentation. Several areas need strengthening. Five or more of the following elements were noted. Abrupt transitions. Visual aids not incorporated smoothly into presentation. No eye contact with audience. Does not project voice. Does not face audience when speaking. Has to rush to finish. Does not allow time for questions.

Visual aids not used or not in working order. Slides or graphics not produced correctly.

0 Not present for the presentation.


Evaluation Rubric for Technical Content of the Oral Presentation

0 1 2 3 4 5 Comments

Introduction:

Summarizes statement of problem

States goal

Design Process:

Provides relevant theoretical background

Selects appropriate methodology to analyze problem

Findings:

Data analysis presented

Synthesis of results

Conclusions/Recommendations:

Conclusions formulated

Reasons given

Recommendations given


Evaluation Rubric for Software Usage and Communication Skills
EMCH 467 Oral Presentation

0 1 2 3 4 5 Comments

Use of PowerPoint:

Quality of slides

Quantity of slides

Incorporation into presentation

Communication Skills:

Enthusiasm for subject

Eye contact with audience

Clear speaking; projection of voice

Effective use of time

Smooth transitions between components

Student’s Name: _______________________________________________________

Reviewer/Rater: _______________________________________________________

Total Points for Use of PowerPoint: __________

Total Points for Communication Skills: __________

Total Points for Technical Content: __________

Total Points for Oral Presentation: __________
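
The totals on this form are simple sums of the element ratings: 17 elements rated 0 to 5 give a maximum of 85 points, matching the presentation’s weight in the course grade. A minimal sketch of that arithmetic in Python follows; the element names shown are a subset, and averaging across raters is an assumption, since the instrument does not specify how multiple raters’ scores are combined.

```python
from statistics import mean

def presentation_total(ratings_by_rater: list[dict[str, int]]) -> float:
    """Average each rubric element across raters, then sum the element means."""
    elements = ratings_by_rater[0].keys()
    return sum(mean(rater[e] for rater in ratings_by_rater) for e in elements)

# Two raters scoring a two-element subset of the 17-element rubric.
raters = [
    {"States goal": 5, "Synthesis of results": 4},
    {"States goal": 4, "Synthesis of results": 4},
]
print(presentation_total(raters))  # 8.5 (out of 10 for this subset)
```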


Appendix M

Midterm Evaluation


Midterm Course Evaluation

Fall 2000

This evaluation form is provided so that you may express your views of this course and the way it is being taught. Please circle the number that corresponds to your selected response.

Please rate the following overall characteristics of the course to this point:

1. Instructor’s overall teaching effectiveness

Poor 1 2 3 4 5 Excellent

2. Overall quality of this course

Poor 1 2 3 4 5 Excellent

3. Statement of objectives and purposes

Poor 1 2 3 4 5 Excellent

4. Instructor’s attitude toward the students

Poor 1 2 3 4 5 Excellent

5. Amount of work required for the course

Poor 1 2 3 4 5 Excellent

6. Course materials (notes, copies, etc.)

Poor 1 2 3 4 5 Excellent

Please rate the instructor’s performance:

7. General course organization

Poor 1 2 3 4 5 Excellent


8. Instructor’s preparation for class

Poor 1 2 3 4 5 Excellent

9. Grading of homework and tests in a timely manner

Poor 1 2 3 4 5 Excellent

10. Instructor’s ability to present the class material

Poor 1 2 3 4 5 Excellent

11. Instructor’s knowledge of the subject

Poor 1 2 3 4 5 Excellent

12. Use of visual aids (chalkboard, overheads, etc.)

Poor 1 2 3 4 5 Excellent

13. Instructor’s interaction with students during class

Poor 1 2 3 4 5 Excellent

14. Availability during office hours for consultation

Poor 1 2 3 4 5 Excellent

Rate the following topics related to your quizzes or exams:

15. Length

Short 1 2 3 4 5 Long

16. Relevance to material covered

Poor 1 2 3 4 5 Excellent

17. Difficulty level

Easy 1 2 3 4 5 Hard

Please continue answering on the back.


18. Please provide any comments on course content.

19. What do you like best about the course to this point in the semester?

20. What do you like least about the course to this point in the semester?

21. Provide any comments on the instructor’s performance.


Appendix N

Education Outreach Survey


E2 Everyday Engineering
Participant Survey

Grade of the students ____________

1. Please indicate the activity (or activities) presented in your classroom.

2. Did you use the vocabulary sheets, introductory questions, or the introductory activities prior to the class presentation? Yes No

2.a. How helpful were the advance materials in preparing students for this activity?

3. Did the presentation/class activity meet your expectations? Why or why not?

4. Was the activity presented at the appropriate level? If not, please explain.

5. How does this activity assist you in meeting the learning objectives of the South Carolina Science Curriculum Standards?

6. Do you think this was a beneficial teaching/learning activity of the concepts discussed? Why or why not?

7. Rate the overall presentation of the activities by the instructor. Circle one.

Very Poor Poor Fair Good Very Good Excellent

8. How can this presentation be improved?

9. Has your awareness of the College of Engineering and Information Technology increased as a result of this program?

10. What are your suggestions for additional topics and/or activities that could be presented?

11. Would you recommend this program to another teacher? Why or why not?


Appendix O

Professional Communications Center Assessment


Professional Communications Center

Analysis of Activity
1999

Professional Communications Center

Number of Student and Faculty Consultations Per Month

Month        1999 # of Consults   Percent of 1999 Total   1998 # of Consults   Percent of 1998 Total
January             57                   10%                     34                    6%
February            97                   17%                     71                   13%
March               39                    7%                     35                    7%
April               65                   11%                     54                   10%
May                 10                    2%                      3                   .6%
June                30                    5%                     25                    5%
July                44                    8%                     13                    3%
August              36                    6%                     34                    6%
September           50                    9%                     73                   14%
October             46                    8%                     73                   14%
November            59                   10%                     73                   14%
December            36                    6%                     42                    8%
Total              570                                          530
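
The percent-of-total column above is each month’s consultations divided by the annual total; a minimal sketch of that calculation in Python, using the 1999 counts from the table:

```python
# Percent of annual total per month, rounded to whole percentages.
consults_1999 = {
    "January": 57, "February": 97, "March": 39, "April": 65,
    "May": 10, "June": 30, "July": 44, "August": 36,
    "September": 50, "October": 46, "November": 59, "December": 36,
}
total = sum(consults_1999.values())  # annual total
for month, count in consults_1999.items():
    print(f"{month}: {count / total:.0%}")
```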

In-Class Presentations By PCC Staff
January – December 1999

Month        Number of Classroom Visits   Number of Students
January                14                       177
February               10                       198
March                   1                        35
April                   1                        12
May                     0                         0
June                    2                        20
July                    3                         9
August                  5                        78
September               7                       118
October                 2                        50
November                7                       139
December                0                         0
Totals                 52                       836


Professional Communications Center

Number of Sessions Conducted for Each Type of PCC Service

Types                                                    1999 Totals   1998 Totals
In-class Presentation                                         52            20
Student Consultation                                         264           435
Faculty Consultation                                          80            34
English as a Second Language                                  74            47
Preparation Time (consultant’s time)                          26            30
Other Writings                                                52
Grading                                                       62
Instructor Preparation (consultant meetings
  with instructor)                                            12

Professional Communications Center

Student and Faculty Consultations By Month

Month        1999 Number of Student Visits   1999 Number of Faculty Visits
January                 19                               7
February                43                               3
March                   18                               8
April                   40                               1
May                      0                               5
June                     7                              11
July                    19                               9
August                   5                               7
September               26                               4
October                 31                               0
November                43                               2
December                12                              23
Totals                 263                              80


Professional Communications Center

Courses Related to Student Consultations

Courses            1999 Number of Students   1998 Number of Students
EECE 201                    139                      142
EECE 212                      0                       15
EECE 301                     31                        5
EECE 302                      1                       26
EECE 401                      6                       20
EECE 402/403/404              2                        6
EECE 553                      1
EECE 701                      2
ECHE 361                      1
ECHE 401                      5
ECHE 460                     14                        2
ECHE 461                     30                       12
ECHE 465                      1                        3
ECIV 303                      2
ECIV 303L                     2
ECIV 350                      1
ECIV 350L                     2                        3
ECIV 470                      1
ECIV 551                      3
ECIV 750                      6                        8
ECIV 750A                     1                        2
ECIV 790                     19
ECIV 797                      2
EMCH 361                     34                       20
EMCH 371                      1
EMCH 427                      8
EMCH 428                      3
EMCH 467                     41                       16
EMCH 527                      1
EMCH 561                      2
EMCH 790                      2
EMCH 797                      1
ENGL 101                      4                        9
ENGL 102                      6                       15
UNIV 101                     19                       94
Other courses                11                        6


Professional Communications Center

Frequency of Repeat Consultations

Number of Repeat    1999 Number of        1998 Number of
Consultations       Students & Faculty    Students & Faculty
One                       108                   53
Two                        43                   42
Three                      15                   17
Four                        3                   10
Five                        6                    5
Six                         2                    4
Seven                       2                    3
Eight                       0                    1
Nine                        0                    0
Ten                         0                    1
Eleven                      0                    1
Twelve                      2                    0
Thirteen                    0                    1
Eighteen                    1                    0

Length of Time Per Visit for Consultations*

Student Consultations
  Range: 3 minutes to 6 hours 40 minutes
  Median: 50 minutes
Faculty Consultations
  Range: 10 minutes to 4 hours 50 minutes
  Median: 55 minutes

Note. This table does not include Apogee students.


Professional Communications Center

Consultations for Reasons Other Than Coursework

Type of Writing                    1999 Frequency    1998 Frequency
Abstract                                 1                 1
Application                              6
Article                                  2
Brochure                                 9                 1
Conference Paper                         1
Chapter                                  2
Dissertation                            12                16
Editing                                  1
Essay                                    4                 6
Graduate School Application              2
Grant                                    2
Letter                                   3                 3
Memo                                     1
Newsletter (Innovations & PCC)          13                 2
Organizations (IEEE, SECWA)              2
Poster                                   1
Presentation                             2
Proposal                                27                 2
Resume                                   4                 7
Some type of lab                         4
Thesis                                  17                 7
Russian (SIC Project)                    8                 7


Appendix P

Longitudinal Student Tracking Report


College of Engineering and Information Technology

Student Longitudinal Tracking System

In collaboration with the University’s Institutional Planning and Assessment Office, the College of Engineering and Information Technology assisted with the design and implementation of a Longitudinal Student Tracking System that incorporates all of the elements necessary to study student trends from admission through graduation and beyond. The goal of this system is to provide a college-wide mechanism that supplies faculty and administrators with the data they need to continuously monitor and improve the quality of their programs.

The USC Student Longitudinal Tracking System provides data from the beginning of the 1990-1991 academic year (Fall, Spring, Summer I, and Summer II) through the end of the 1998-99 academic year. Each year the Student Longitudinal Tracking System will be updated to add another cohort to the database and to update graduation, grade point average (GPA), and retention information. Developing statistical tables, analyzing the data, and reporting the results will occur in stages. Reports will be generated for the following areas: enrollment, academic performance, graduation, transfer performance, and retention. The following tables and synopses give an overview of the initial data collected and analyzed regarding engineering students' progress toward degree completion. Each table addresses a specific research question, and these questions are given in bold lettering.
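At its core, a system of this kind needs only a compact per-student record tying demographic attributes to a semester-by-semester enrollment history; every retention, graduation, and GPA table in this appendix can be produced by filtering and aggregating records of that shape. The sketch below (in Python) illustrates one possible layout. The field names and types are hypothetical, since this monograph does not document the actual system's schema.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # Hypothetical record for one tracked student; illustrative only.
    @dataclass
    class StudentRecord:
        student_id: str
        cohort: str                     # e.g., "1990-91"
        entry_status: str               # "freshman" or "transfer"
        gender: str
        ethnicity: str
        semesters_enrolled: List[str] = field(default_factory=list)  # e.g., ["1990F", "1991S"]
        degree: Optional[str] = None    # "Engineering", "Other USC", or None
        final_gpa: Optional[float] = None

    def persisted_to(record: StudentRecord, semester: str) -> bool:
        """True if the student was still enrolled in the given semester."""
        return semester in record.semesters_enrolled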

Retention Statistics

The following table examines one aspect of student return rates, that is, how many students begin in the College of Engineering and Information Technology and re-enroll in subsequent semesters. It captures data for the freshman students who are the primary population of interest. As noted in the column headings, the tracking begins with the fall semester of each cohort. Data for the students' second semester at the College of Engineering are listed, followed by the enrollment figures for the next two fall semesters. This table, therefore, provides persistence rates from a student's first semester through the fall semester of the third year.

What percentage of the freshman engineering students from the 1990, 1991, 1992, 1993, 1994, and 1995 cohorts enrolled in the College of Engineering and Information Technology in subsequent semesters?


Table 1

Freshman Persistence Rates for the 1990-1995 Cohorts

Cohort      Cohort        First to Second    After One Year    After Two Years
            Enrollment    Semester           (Fall-to-Fall)    (Fall-to-Fall)
                            #      %           #      %          #      %
1990-91        297         264    89%         208    70%        155    52%
1991-92        302         263    88%         202    68%        145    48%
1992-93        247         225    91%         174    70%        114    46%
1993-94        265         236    89%         181    68%        111    42%
1994-95        245         209    85%         166    68%        111    45%
1995-96        231         204    88%         166    72%        142    61%

Table 1 indicates similar persistence rates for each cohort from 1990 to 1995. Across the cohorts, approximately 88 percent of the freshman students re-enroll in Engineering after their initial fall semester. The data also suggest that an average of 69 percent of the students return for their second year in engineering. After two years, the statistics show that only 49 percent of the original cohort enrolled for the fall semester of the third year.
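The quoted averages can be reproduced as unweighted means of the six cohort rates in Table 1 (pooling the raw counts across cohorts happens to give essentially the same figures). A minimal sketch of the arithmetic:

    # Table 1: (cohort, enrollment, second semester, after one year, after two years)
    table1 = [
        ("1990-91", 297, 264, 208, 155),
        ("1991-92", 302, 263, 202, 145),
        ("1992-93", 247, 225, 174, 114),
        ("1993-94", 265, 236, 181, 111),
        ("1994-95", 245, 209, 166, 111),
        ("1995-96", 231, 204, 166, 142),
    ]

    for label, col in [("first to second semester", 2),
                       ("after one year", 3),
                       ("after two years", 4)]:
        rates = [row[col] / row[1] for row in table1]
        print(f"Average persistence {label}: {sum(rates) / len(rates):.0%}")
    # Prints approximately 88%, 69%, and 49%, matching the figures above.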

Table 2 shows the persistence rates for the transfer students in each of the 1990 to 1994 cohorts.

Table 2

Transfer Persistence Rates for the 1990-1994 Cohorts

Cohort      Enrollment    First to       After One       After 4 Semesters    # of Engineering    # of Non-Engineering
            in Cohort     Second Sem.    Year (F-to-F)   (Graduates)          Graduates           Graduates
                            #     %        #     %          #        %          #     %              #     %
1990-91        63           53   84%       47   75%       38 (7)    71%        42   67%              6    10%
1991-92        45           40   89%       33   73%       26 (3)    64%        22   49%              4     9%
1992-93        52           45   87%       41   79%       34 (5)    75%        28   54%              4     8%
1993-94*       42           36   86%       31   74%       25 (2)    64%        16    -               4    10%
1994-95*       36           31   86%       27   75%       25        69%         5    -               -     -

* Note. Figures for the 1993 and 1994 Cohorts are incomplete because students will continue to graduate from Engineering and other USC programs.

Retention for Gender and Ethnic Categories

Tables 3 through 8 provide persistence rates for freshman engineering students, indicating the proportion of each gender and ethnic category within each cohort. For each cohort, what are the persistence rates for each gender and ethnic category within the College of Engineering and Information Technology?


Table 3

Persistence Rates by Gender and Ethnicity
1990 Cohort

                     1990 Fall      1991 Spring              1991 Fall                1992 Fall
                     M      F       M           F            M           F            M          F
Caucasian           164    39      144 (88%)   34 (87%)     113 (69%)   27 (69%)     85 (52%)   23 (59%)
African American     46    26       45 (98%)   22 (85%)      35 (76%)   16 (62%)     25 (54%)    9 (35%)
Asian                 6     5        5 (83%)    5 (100%)      4 (67%)    5 (100%)     4 (67%)    3 (60%)
Hispanic              5     -        5 (100%)    -            4 (80%)     -           3 (60%)     -
Other                 5     -        4 (80%)     -            4 (80%)     -           3 (60%)     -

Table 4

Persistence Rates by Gender and Ethnicity
1991 Cohort

                     1991 Fall      1992 Spring              1992 Fall                1993 Fall
                     M      F       M           F            M           F            M          F
Caucasian           155    36      130 (84%)   34 (94%)      97 (63%)   27 (75%)     74 (48%)   14 (39%)
African American     64    25       60 (94%)   22 (88%)      45 (70%)   19 (76%)     33 (52%)   16 (64%)
Asian                10     4        8 (80%)    4 (100%)      7 (70%)    3 (75%)      5 (50%)    2 (50%)
Hispanic              5     -        5 (100%)    -            4 (80%)     -           1 (20%)     -
Other                 -     -         -          -             -          -            -          -

Table 5

Persistence Rates by Gender and Ethnicity
1992 Cohort

                     1992 Fall      1993 Spring              1993 Fall                1994 Fall
                     M      F       M           F            M           F            M          F
Caucasian           137    15      122 (89%)   13 (87%)      95 (69%)    9 (60%)     72 (53%)    4 (27%)
African American     57    17       55 (96%)   15 (88%)      43 (75%)   10 (59%)     21 (37%)    6 (35%)
Asian                13     3       13 (100%)   2 (67%)      11 (85%)    2 (67%)      8 (62%)    1 (33%)
Hispanic              1     -        1 (100%)    -            1 (100%)    -            -          -
Other                 4     -        4 (100%)    -            3 (75%)     -           2 (50%)     -

Table 6

Persistence Rates by Gender and Ethnicity
1993 Cohort

                     1993 Fall      1994 Spring              1994 Fall                1995 Fall
                     M      F       M           F            M           F            M          F
Caucasian           137    20      128 (93%)   17 (85%)      97 (71%)   13 (65%)     56 (41%)    7 (35%)
African American     48    34       46 (96%)   28 (82%)      38 (79%)   20 (59%)     22 (46%)   16 (47%)
Asian                 9     3        9 (100%)   3 (100%)      7 (78%)    2 (67%)      5 (56%)    2 (67%)
Hispanic              2     1        2 (100%)   1 (100%)      1 (50%)    1 (100%)     1 (50%)    1 (100%)
Other                 1     -        1 (100%)    -            1 (100%)    -           1 (100%)    -


Table 7

Persistence Rates by Gender and Ethnicity
1994 Cohort

                     1994 Fall      1995 Spring              1995 Fall                1996 Fall
                     M      F       M           F            M           F            M          F
Caucasian           108    35       99 (92%)   33 (94%)      80 (74%)   27 (77%)     52 (48%)   20 (57%)
African American     46    23       43 (93%)   21 (91%)      31 (67%)    7 (30%)     18 (39%)   12 (52%)
Asian                 8     5        5 (63%)    5 (100%)      4 (50%)    4 (80%)      4 (80%)    3 (60%)
Hispanic              2     -        2 (100%)    -            2 (100%)    -           1 (50%)     -
Other                 1     -        1 (100%)    -            1 (100%)    -           1 (100%)    -

Table 8

Persistence Rates by Gender and Ethnicity
1995 Cohort

                     1995 Fall      1996 Spring              1996 Fall                1997 Fall
                     M      F       M           F            M           F            M          F
Caucasian           101    35      100 (99%)   32 (91%)      84 (83%)   27 (77%)     72 (71%)   24 (69%)
African American     43    28       38 (88%)   26 (93%)      29 (67%)   19 (68%)     23 (53%)   16 (57%)
Asian                 6     2        5 (83%)    0 (0%)        4 (67%)     -           5 (83%)     -
Hispanic              2     -        1 (50%)     -            1 (50%)     -           1 (50%)     -
Other                 1     -        1 (100%)    -            1 (100%)    -           1 (100%)    -

Persistence rates for ethnic groups

From the fall to the spring semester of the first year, minority persistence rates exceeded Caucasian return rates in each cohort. Persistence rates for the third semester (the beginning of the second year) indicate that minority re-enrollment was higher than the Caucasian rates for the 1990 through 1993 cohorts, with the reverse trend occurring in the last two years (1994 and 1995) of the tracking period. Overall ethnic group persistence rates for the second year show that similar proportions of minority and Caucasian students re-enrolled for each cohort. Comparison of African American and Caucasian persistence rates shows variation among the academic cohorts: in some years African American return rates for the second year exceed Caucasian rates, whereas in other semesters the Caucasian return rates are higher. The overall effect is that there is little difference in persistence rates between the two groups.

Persistence rates for gender groups

Tables 3 through 8 indicate that there are some variations in the persistence rates between males and females across the cohorts. However, a comparison of the overall averages for each semester suggests that the proportions of males and females returning are approximately equal. One noteworthy trend is the slightly higher retention rate of Asian and Hispanic females, but these figures are based on very low enrollment numbers.


Graduation Statistics

A key factor in determining the health of an academic program is the rate at which students graduate from that program. The following table gives overall graduation statistics for the first three cohorts of the longitudinal study. Information from the College of Engineering and Information Technology Senior Survey and other sources indicates that engineering students, particularly freshmen, often require longer than four years to complete their degree program. Given this fact, Table 9 stops with the 1992-1993 cohort because sufficient time has not elapsed for data from later cohorts to be comparable to the previous academic years. It should be noted that previous research has found that additional students are likely to graduate from the 1991 and 1992 cohorts, increasing the graduation rates slightly for those years.

Table 9

Graduation Statistics for the 1990, 1991 and 1992 Cohorts

                               Cohort        Total Graduates        Engineering Degrees      Non-Engineering Degrees
                               Enrollment    Freq.   % of Cohort    Freq.   % of Cohort      Freq.   % of Cohort
1990-1991 Cohort
  All Engineering Students        360         228        63%         152        42%            76        21%
  First-time Freshmen             297                                 111        37%            70        24%
  Transfer Students                63                                  41        65%             4         6%
1991-1992 Cohort
  All Engineering Students        347         183        53%         104        30%            79        23%
  First-time Freshmen             302                                  82        27%            75        25%
  Transfer Students                45                                  22        49%             4         9%
1992-1993 Cohort
  All Engineering Students        299         158        53%          97        32%            61        20%
  First-time Freshmen             247                                  69        28%            57        23%
  Transfer Students                52                                  28        54%             4         8%

Graduation figures for the three cohorts show a decline in the number of total graduates as well as in the number of Engineering degrees granted during the tracking period. Overall, approximately 56 percent of the students who began in the College of Engineering & Information Technology graduate at some point in their academic career, and approximately 35 percent of each cohort graduate with an Engineering degree. The percentage of students graduating with an Engineering degree declined from 42 percent to 32 percent over the three-year period. It is also noteworthy that approximately 21 percent of the students within each cohort graduate from USC with a degree in a discipline other than Engineering.

The proportion of first-time freshmen within each cohort who graduated with an Engineering degree equals approximately 37, 27, and 28 percent for the 1990-91, 1991-92, and 1992-93 cohorts, respectively; the overall average for freshmen is approximately 31 percent. The breakdowns for the non-engineering graduates suggest that a very low percentage of this population were transfer students.


Graduation Rates for Gender and Ethnic Categories

How many students in each cohort (1990-91, 1991-92, 1992-93) had graduated in Engineering as of June 1997? Distributions are shown for each of the following subgroups: total students, first-time freshmen, and transfer students, with breakdowns by ethnicity and gender for each subgroup.

Table 10

Graduation Rates for Students Receiving Engineering Degrees
Demographic Distributions for the 1990, 1991 and 1992 Cohorts

                                   F           M           Total
1990-1991 Cohort
All engineering students        34 (22%)   118 (78%)       152
  African-Americans              6 ( 4%)    15 (10%)        21
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders       3 ( 2%)     5 ( 3%)         8
  Caucasians                    24 (16%)    93 (61%)       117
  Hispanic                       1 ( 1%)     2 ( 1%)         3
  Other                             -          -           3 ( 2%)
First-time Freshmen             28 (25%)    83 (75%)       111
  African-Americans              6 ( 5%)    15 (14%)        21
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders       3 ( 3%)     3 ( 3%)         6
  Caucasians                    19 (17%)    61 (55%)        80
  Hispanic                          -          -           2 ( 2%)
  Other                             -          -           2 ( 2%)
Transfer Students                6 (15%)    35 (85%)        41
  African-Americans                 -          -             -
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders          -          -           2 ( 5%)
  Caucasians                     5 (12%)    32 (78%)        37
  Hispanic                          -          -           1 ( 2%)
  Other                             -          -           1 ( 2%)

1991-1992 Cohort
All engineering students        21 (20%)    84 (80%)       105
  African-Americans              7 ( 7%)    11 (11%)        18
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders       1 ( 1%)     2 ( 2%)         3
  Caucasians                    13 (13%)    69 (66%)        82
  Hispanic                          -          -           1 ( 1%)
  Other                             -          -             -
First-time Freshmen             17 (21%)    65 (79%)        82
  African-Americans              6 ( 7%)    11 (13%)        17
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders       1 ( 1%)     2 ( 2%)         3
  Caucasians                    10 (12%)    52 (63%)        62
  Hispanic                          -          -             -
  Other                             -          -             -
Transfer Students                4 (18%)    18 (82%)        22
  African-Americans                 -          -           1 ( 5%)
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders          -          -             -
  Caucasians                     3 (14%)    17 (77%)        20
  Hispanic                          -          -           1 ( 5%)
  Other                             -          -             -

1992-1993 Cohort
All engineering students        14 (14%)    83 (86%)        97
  African-Americans              3 ( 3%)     9 ( 9%)        12
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders       1 ( 1%)     5 ( 5%)         6
  Caucasians                     9 ( 9%)    68 (67%)        77
  Hispanic                          -          -           1 ( 1%)
  Other                             -          -           1 ( 1%)
First-time Freshmen              8 (12%)    61 (88%)        69
  African-Americans              2 ( 3%)     7 (10%)         9
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders       1 ( 1%)     4 ( 6%)         5
  Caucasians                     4 ( 6%)    49 (71%)        53
  Hispanic                          -          -           1 ( 1%)
  Other                             -          -           1 ( 1%)
Transfer Students                6 (21%)    22 (79%)        28
  African-Americans              1 ( 4%)     2 ( 7%)         3
  Am. Indian/Alaskan Native         -          -             -
  Asians/Pacific Islanders          -          -           1 ( 4%)
  Caucasians                     5 (18%)    19 (68%)        24
  Hispanic                          -          -             -
  Other                             -          -             -

Note. Percentages are of the subgroup total for each cohort. Where the source did not indicate gender, the count appears under Total only.

The proportion of female graduates in Engineering declined during this period from a high of 22 percent in 1990 to a low of 14 percent in 1992. The percentage of female graduates with an Engineering degree approximately equaled the proportion of females enrolled in the college during the 1990 and 1991 academic periods.

African-Americans totaled 14%, 17% and 12% of the graduates in Engineering for the 1990-91, 1991-92 and 1992-93 cohorts, respectively. These percentages suggest that a slightly smaller proportion of the African-Americans graduated in Engineering when compared to their enrollment figures. The cohort percentages represent an average graduation rate of approximately 14 percent over the three-year period.

Academic Years of Graduation For Engineering and Non-Engineering Graduates


What percentage of the students within the 1990, 1991, 1992, 1993 cohorts received their Engineering degrees at any time during each of the subsequent academic years?

Table 11

Academic Years of Graduation

Cohort          1993-94      1994-95      1995-96      1996-97      1997-98      1998-99      Total Number
                 #    %       #    %       #    %       #    %       #    %       #    %      of Graduates
1990 Cohort     39   26%     85   56%     23   15%      4    3%      5    3%      -    -          152
1991 Cohort      -    -      22   21%     55   52%     19   18%      6    6%      3    3%         105
1992 Cohort      -    -       4    4%      9    9%     64   66%     19   20%      1    1%          97
1993 Cohort      -    -       -    -       -    -      37   43%     36   41%     14   16%          87

For the 1990-1991 and 1991-1992 cohorts, approximately 26 and 21 percent, respectively, of the students graduated with an Engineering degree within four years. For the 1992-1993 cohort, the four-year figure drops to 13 percent, but it increases to 43 percent for the 1993 cohort. Five-year graduation rates for the four cohorts are as follows:

1990-1991    82%
1991-1992    73%
1992-1993    79%
1993-1994    84%
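These five-year figures appear to be cumulative: each is the sum of a cohort's yearly percentages in Table 11 through the fifth academic year after entry. A quick arithmetic check under that reading (an inference, since the derivation is not stated explicitly):

    # Accumulate each cohort's yearly percentages through its fifth year.
    five_year = {
        "1990-1991": 26 + 56,     # 1993-94 and 1994-95
        "1991-1992": 21 + 52,     # 1994-95 and 1995-96
        "1992-1993": 4 + 9 + 66,  # 1994-95 through 1996-97
        "1993-1994": 43 + 41,     # 1996-97 and 1997-98
    }
    print(five_year)  # {'1990-1991': 82, '1991-1992': 73, '1992-1993': 79, '1993-1994': 84}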

What percentage of the 1990, 1991, 1992, 1993 and 1994 cohorts received their USC degrees (thus far) in a discipline other than Engineering during each of the subsequent academic years?

Table 12

Cohort          1993-94      1994-95      1995-96      1996-97      1997-98      1998-99      Total Number
                 #    %       #    %       #    %       #    %       #    %       #    %      of Graduates
1990 Cohort     11   14%     36   47%     22   29%      3    4%      4    5%      -    -           76
1991 Cohort      -    -      19   24%     28   35%     13   16%     14   18%      5    6%          79
1992 Cohort      -    -      16   26%     27   44%     16   26%      2    3%      -    -           61
1993 Cohort      -    -       -    -       -    -      26   40%     32   49%      7   11%          65

Student tracking of these four cohorts indicates that a significant number of students who began in Engineering left the program but graduated from USC with another degree. Five-year graduation rates are slightly lower than those for the Engineering graduates:

1990-1991    61%
1991-1992    59%
1992-1993    70%
1993-1994    89%


Academic Performance

Grade point averages (GPAs) are used as one measure of a student's academic performance while attending college; they are also a useful tool for assessing college programs. Table 13 shows the overall averages for freshman and transfer students for the 1990, 1991, and 1992 cohorts. Also shown are the average GPAs for students receiving an Engineering degree, students who began in Engineering but graduated from another USC discipline, and students who began in Engineering but dropped out of USC or otherwise did not receive a degree.

Table 13

GPAs for Longitudinal Cohorts

                               Engineering    Other USC    No
                               Degree         Degrees      Degree
1990-1991 Cohort
  All Engineering Students        3.04          2.77        1.96
  First-time Freshmen             3.02          2.73        1.87
  Transfer Students               3.11          3.45        2.63
1991-1992 Cohort
  All Engineering Students        2.95          2.68        1.88
  First-time Freshmen             2.86          2.72        1.78
  Transfer Students               3.12          2.40        2.54
1992-1993 Cohort
  All Engineering Students        3.11          2.79        1.85
  First-time Freshmen             3.08          2.77        1.72
  Transfer Students               3.09          2.96        2.69

Note. Averages have been rounded and were calculated using a weighted sum.

The Table 13 statistics show that the average GPA for students with an Engineering degree is approximately 3.0 for the first three years of the longitudinal study. Transfer students tend to have about the same average GPA as freshman students. The GPAs of graduates from other USC programs tend to be slightly lower than those of the Engineering graduates. The overall GPA for non-engineering graduates is 2.75, and students who left without a degree averaged 1.90.
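The note to Table 13 says only that the averages "were calculated using a weighted sum." A minimal sketch of one plausible reading, weighting each subgroup mean by its head count; the choice of weights is an assumption for illustration, with the graduate counts taken from Table 9:

    # Weighted mean GPA: each subgroup average weighted by its head count.
    # The weights (subgroup sizes) are an assumption; the monograph does not
    # specify what was used in the weighted sum.
    def weighted_gpa(groups):
        """groups: list of (mean_gpa, n_students) tuples."""
        total = sum(n for _, n in groups)
        return sum(gpa * n for gpa, n in groups) / total

    # Combining the 1990-91 freshman (3.02, n=111) and transfer (3.11, n=41)
    # Engineering-degree means reproduces the cohort's overall 3.04:
    print(round(weighted_gpa([(3.02, 111), (3.11, 41)]), 2))  # 3.04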


Appendix Q

Bates House Living-Learning Project Report


The Engineering Community in Bates House
Summary of the First Semester Results

Overview and Goals

During the 1999 fall semester, freshman students in the College of Engineering and Information Technology were offered the opportunity to participate in a unique Living and Learning Community program developed in collaboration with the USC Housing Department. The Engineering Community in Bates House is an on-campus residential community designed to enrich the educational environment for first-year engineering students. Development of this concept was based on research documenting the benefits to students of living in learning environments that foster student-faculty interaction and student peer relationships, strengthened by involvement with each other both in and out of the classroom.

More specifically, the goals of the Engineering Community in Bates House are:

1) To increase the retention rate of these freshmen by creating a learning environment that maximizes their potential for success;

2) To incorporate active learning strategies and increased academic support to increase academic performance indicators such as the student's grade point average (GPA);

3) To develop professional attitudes and to emphasize experiential learning by encouraging student involvement in the community and the professional engineering organizations;

4) To develop and implement new technologies, such as laptop computers, that can be applied in the classroom to enhance education program delivery;

5) To provide early design and teamwork experience to enhance student motivation and learning and to develop leadership, communication and problem solving skills.

The increases in retention and academic performance are primarily long-term research questions. The Bates House project students will be tracked during their subsequent years at USC, with course grades and GPA data collected each semester. Retention figures for this group of students will be tabulated, with overall results available at the end of the first, second, and fourth years of the project.

A group of engineering students with similar academic backgrounds will be randomly selected for use as a control group to provide a criterion for judging program success. Retention rates, course grades, and GPA data will be collected for this group of students each semester from the 1999-2000 through the 2002-2003 academic years. The control and experimental groups will be compared to determine whether the additional academic support and activities given the Bates House students yield improved performance and retention within the College.

Progress toward meeting the project goals will be monitored each semester during the initial years of the project. It was decided to interview all experimental students in November to evaluate the effectiveness of the first semester of the project. First, the Bates House program is described below, followed by a brief description of the experimental group of engineering students; the remainder of this report analyzes and summarizes the results of the interview process.


Description of the Bates House Project

Students who participated in the Bates House Engineering Community had access to programs and services developed specifically with the engineering student in mind. Bates House students were enrolled in two sections of University 101-E instructed by Professors Molly Gribb and Steve McNeill. Required by the College during the 1999-2000 academic year, this course provided an introduction to engineering concepts and to the computer network and software utilized within the College and within some engineering programs. University 101-E also provided an introduction to the USC campus and some of its high-usage facilities, such as the library, as well as offering several programs regarding health issues, including drug abuse and sexually transmitted diseases. Special tutoring services were provided for the Bates House students in math, chemistry, and writing. Community dinners and cookouts, career development classes, plant tours, and other activities were arranged for the freshmen. Each Bates House participant also received a Gateway laptop computer purchased with grant monies received from the Department of Commerce; the laptops were leased for a period of two years, although students may not keep the computers for the entire period. All software available on the engineering network was installed on the laptops. Students were given special training on how to use the laptop, and two student assistants were employed to be available via email and in person to help students with computer-related problems.

Project Implementation

All students requesting a USC Housing Application were notified of the Bates House Living Learning option. Students who volunteered for this option were placed into two sections of UNIV 101-E. The instructors for this course, Professors Molly Gribb and Steve McNeill, agreed to participate in the special activities planned for these students, and each emphasized different engineering skills and competencies within his or her section. Professor McNeill's class took part in classroom discussions, summarized newspaper articles relating to current events in engineering, practiced with the computer network software, completed instruction and exercises with MathCAD, and carried out a team design project. Professor Gribb's section completed four or five essays, made PowerPoint and other oral presentations, worked problem-solving exercises using Excel, and participated in a team design project. After some housing and class scheduling adjustments at the beginning of the semester, 47 students remained in the Living Learning Community and comprise the experimental sample.

Sample Demographics

The sample of students included 36 males and 11 females. The ethnic distribution of the sample approximated the total undergraduate distribution of the College of Engineering, with 35 (74%) Caucasians, 8 (17%) African Americans, 3 (6%) Asians, and 1 (2%) Hispanic. Although first-semester freshmen, most of the students declared a major upon entry: computer (10); civil (5); mechanical (5); chemical (3); electrical (3); and undecided (6). Fifteen students were classified as conditional, indicating that their math SAT scores were below the entrance requirement of 600 or that they did not place into MATH 141.


Students in the experimental group represent a wide continuum of verbal and mathematical capabilities. SAT total scores ranged from 870 to 1420, with an average total score of 1142 for the group. SAT Verbal and Math scores ranged from 440 to 670 and from 410 to 800, respectively.

All students were interviewed at the end of the course. Students answered a series of questions concerning five target areas: 1) the teaching/learning process of UNIV 101-E; 2) written and oral communications and the PCC; 3) the use of laptops; 4) particular skills and competencies including teamwork, problem solving, leadership, and interpersonal skills; and 5) the Bates House Living Learning arrangement. The following narrative analyzes and summarizes the results of the interview process.

UNIV 101-E

The freshmen were asked what they liked “best” and “least” about their UNIV 101-E course. The availability and use of the laptops was the most frequently cited response, from 11 freshmen, regarding what they liked best about the course. Other items mentioned by the students include: design projects (5), the open class and relaxed atmosphere (4), computer programs (4), interactions with peers (3), teamwork (3), the teacher (2), the scavenger hunt (2), articles every day (1), living together (1), and fun (1). Students voiced 17 or more activities that they liked least about UNIV 101-E. The items cited most frequently include: writing essays (9), MathCAD (8), current event articles (4), and the workload (4). Students also mentioned other items they disliked, such as community service, Excel homework, night classes, lack of organization, and buying books that weren't used.

Students were asked to identify differences in the way UNIV 101-E was taught in comparison to their other freshman courses. Students provided 15 or more categories of responses, ranging from the comment that it was like high school to the observation that the course included more technology. Students indicated that one-on-one or more personal attention (12 students) and the more relaxed, self-paced classes (11 students) were the most noteworthy differences. Other observations mentioned less frequently included smaller class sizes, more application/hands-on activities, teacher instructional effectiveness, teamwork, workload, broader topics, discussion classes, and more technology. Three students said they did not perceive substantial differences in the teaching/learning processes between the engineering class and other freshman courses. Students stated that the advantages of a more personalized, relaxed environment include less stress, better writing, more help when needed, more group work, and getting to know everyone. Disadvantages were few, but some students cited personality differences, the long class period, lack of organization, lack of time to prepare for other classes, and some of the essays as reasons for their perceptions.

Tutoring

Twenty-two students utilized one or more of the tutoring activities provided during the semester. Three students indicated they would seek tutoring before the final exam. Eight students received Bates House tutoring and five students sought help in the math lab. Two other students attended a math class help session. One student used the NSBE tutoring services. Most of the students seeking math assistance were satisfied with the help they received, citing it as “good” or very helpful. However, four students who attended the class session or went to LeConte were not pleased with the assistance given at these places. Seven students said they were tutored in Chemistry and rated their experience as “good.” Two students received tutoring from other sources and were split on the level of help they received. Four freshmen from the experimental sections sought communications assistance from the PCC; all found this support to be very helpful.

Plant Tours

The following is a list of the plant tours attended by the students in the Bates House project:

Allied Signal          12
Cooper Tools            6
Pirelli Cable           5
SCANA/SCE&G             5
Safety Kleen            4
International Paper     4
Selectron               3
SMI Steel               3
Kryotech                2

All students said they learned what the companies do, the day-to-day activities of engineers, the types of jobs available, and/or about the different types of companies that hire engineers. Almost all students agreed the tours were interesting and informative. Students believe this is a worthwhile activity and that freshmen should be required to participate in at least one plant tour in the future. Students provided a range of comments about why the plant tour is an important activity. A few of these reasons are listed below:

Learned different aspects of engineering
Different kinds of engineers need to work together to get things done
Creativity is needed in the engineering world
See real world applications of engineering
Broadens horizons
Can change perceptions and help you determine what field to go into
Provides opportunities for learning resources or companies in the area

Dinner Programs

All freshmen were invited to attend the Dean's Cookout on August 30, 1999, and most of the Bates House students attended this function. All but three Bates House students attended a special dinner program held on September 29, 1999. The program was designed for engineering students and faculty to have an opportunity to interact and to listen to presentations by engineers in the workplace. Guest speakers were Scott Echerer (mechanical) and William Holder (civil). Two additional dinner programs were planned but cancelled because of conflicts in the scheduling of the Bates House facility. Students described the dinner/speaker program as worthwhile (7), interesting (9), informative (11), and enjoyable (2). Nine students noted, however, that while one speaker was well prepared and articulate, the second speaker did not capture the students' interest.


Grading

Students from both sections thought the grading system was fair. All but four students understood the point system used to assign their final grade.

Academic Skills and Competencies

Students were asked to describe how UNIV 101-E provided the opportunity to develop problem-solving skills, leadership skills, written and oral communication skills, and interpersonal skills as a team member. Students listed eight activities they believe enhanced their problem-solving skills. The tasks, with the number of students selecting each, are as follows: design project (23); teamwork (13); Excel homework (9); MathCAD exercises (7); brainstorming (3); problems to solve/class discussion (3); readings (2); and reflection papers (1). Three students said that there were no problem-solving opportunities in their class.

Most of the students interviewed (35 of 43, or 81 percent) cited the group projects as a means of developing leadership skills. Other students mentioned community service, presentations, class discussions, and professional organizations. Five students said the course did not offer opportunities to develop leadership skills.

During the interviews, freshmen offered various ways in which they developed their interpersonal skills as team members. The most frequent response concerned the need to learn to work with others and how to get a group to function. Students indicated other issues such as compromise (6), learning to listen (4), learning about different kinds of people (4), and being open to ideas (3). When asked about the productivity of their group, students were very positive in their responses. Fifteen students said their group was very productive, and ten students stated that their group “got along well.” Three rated their group as “fairly productive” and four rated it as “sometimes productive.” Only four students rated their group as “unproductive” or functioning “not so well.” All students indicated that the goal of the group, the team project, was completed as assigned.

The ability to communicate in writing is an important competency engineering students need to possess as graduates. Some sections of UNIV 101 incorporated essays, reports, and memos into the curriculum, while instructors of other sections chose not to address this competency. In Dr. Gribb's section, students completed five essays and several memos as part of their homework assignments; in addition, they made two oral presentations during the semester. In Dr. McNeill's section, weekly summaries of newspaper articles were required, but these were not graded for technical content or communication skills. Both sections required an oral presentation using PowerPoint slides. The PCC made two presentations to students in Dr. Gribb's section of UNIV 101-E: one on technical writing and one on oral presentations using PowerPoint software. Dr. McNeill's section did not utilize the PCC staff to present oral or written instruction.

Students in Dr. Gribb's section were encouraged to go to the PCC to seek help with their essays and oral presentations. Only eight students in Dr. Gribb's class reported that they received PCC assistance; these students indicated that the PCC was helpful with their problems. Twenty-two students stated that the PCC presentations made in their class were useful; ten students were satisfied with this instruction/learning process, and ten students indicated that the PCC did “a good job” in presenting the information and providing useful handouts.

Laptop Program Evaluation

Students were asked a series of questions relating to the effectiveness of the laptop component of the Bates House program. All students said that they enjoyed having the laptop to use during the semester, and they were very enthusiastic in their responses. Students were eager to list the advantages of having a laptop but reluctant to provide any disadvantages. Fifteen students emphasized the mobility advantage of the laptop, listing some of the places they used theirs: home, class, dorm rooms, other rooms, meetings, and in the car. Not having to go to the labs or use the Bates House computers was also mentioned by 15 students as an advantage, since computers in the labs and Bates House were not always available and were sometimes in disrepair; having a laptop therefore saved students time and a great deal of inconvenience. More importantly, students concluded that a laptop allowed them to get their assignments done on time and to be self-paced with homework and other assignments, which would not have been as true had they depended on the accessibility of a lab computer. Seven students also indicated that having email and Internet access at all times was an advantage, and six students noted that laptops aided communication among peers, instructors, and family. Finally, students noted that the laptops were used for all classes, not just engineering.

The primary disadvantage, mentioned by five students, was that the computers would frequently freeze or shut down. Several students indicated that the laptop needs a better processor and that printers and spare computers would be helpful. Students also mentioned that laptops were received late in the semester; having them at the beginning would allow more time for students to learn how to use them. Written guides or instructions were not available and several students believe this addition is needed to facilitate learning.

A majority of students believe that the laptops contributed to their overall learning experience: thirty-one (72 percent) agreed with this statement, seven students said they “think” it did, and one was unsure. Four students said that the laptop did not aid their learning experience. Fourteen respondents believe the laptops gave them more opportunity to learn about the software programs; with the extra time afforded by laptop usage they were able to expand their knowledge base and discover more about each program. Three students said having a laptop was an incentive to learn more. Nine stated that laptops made it much easier, faster, and more convenient to complete assignments, projects, and reports and to finish them by the due date. Six students stressed the importance of having access to the Internet for information and research. They also stated that self-paced learning was an important outcome of laptop usage, allowing for differences in the way students learn. Laptops provided students the opportunity to work together, either in groups or by communicating through email. Laptops also allowed easy access to course web sites and in-class instruction with software, and they permitted professors to personally assist students in class with problems (5 students). Three students said the laptop was useful in every aspect of their lives. Two students commented that their grades were higher because of the laptop, since they had extra time to revise and polish essays and reports for courses. One student commented that the laptop helped to ease the transition into college life, and another believes that the laptop program gives students an opportunity at USC that other schools do not offer. When asked, all students said that the laptop program for freshmen should continue next year.


Students were asked for suggestions to improve the laptop part of the UNIV 101-E course. The most frequent response, from 14 students, was to include more instruction and training on how to use the laptop and the software; the level of instruction was too high for students who had no computer experience. Some students said a printed instruction booklet would enhance the learning process, and two students suggested that laptop computer support needed to be available all day. Other suggestions included upgrading the processor within the unit, improving the speakers, and providing extra batteries. Three students said they did not use the video camera, and four said that video cameras would not be needed next year if the laptops are used for freshmen with the curriculum as presented in the fall semester. Five students think that AutoCAD should be part of the software included on the laptop. Eight students recommended that freshmen receive the laptop at the beginning of the semester. Five students would like to see the computers used more often in class, and three students believe there should be more use of the different types of software. In addition, four students stated that more projects and activities should incorporate the use of the laptop.

Bates House Living Arrangement

All but two of the 43 students interviewed said that they enjoyed the Bates House arrangement, in which engineering students were assigned engineering roommates and a group of engineering students were placed together in the same dorm. The two students displeased with the arrangement indicated personality conflicts with their roommates as the reason; otherwise, these two students were satisfied with the placement.

One objective of the Bates House project was to engender a feeling of belonging and to encourage students to become part of the engineering community. Approximately 70 percent of the students (30 students) said that they feel a part of the Engineering community. Six students (14 percent) indicated that they did not feel this way, and the remainder of the group indicated they could not say, since they had had little exposure to engineering at that point.

Students who indicated a sense of belonging to the Engineering community were asked how living in Bates House and being part of the project contributed to this feeling. The most frequent student responses involve the interactions among the students and the ways they benefited from this environment. Thirteen students said they were able to seek and give help to each other with academics and other problems. Nine students indicated that they had become good friends with several of the students in the dorm. Six students said it was great having other engineering students around who were in the same classes. Five students said living together helped them with their design projects. Eight students indicated that the Bates House arrangement was very convenient; by this the students meant that they had easy access to each other for support and assistance.

Students shared 20 or more different suggestions for improving the UNIV 101 course. The following is a list of the most frequently cited items:

More in-depth information about all engineering disciplines (7)
More projects in the course (7)
A standardized curriculum in all sections (5)
More diversity in the computer programs taught/used in the course (3)
More in-depth computer work (3)
More oral presentations (3)
Revise MathCAD instruction (3)
Too much emphasis on MathCAD (2)
Include more writing (2)


Appendix R

Quality Review Template


College of Engineering and Information Technology
Annual Quality Review Template

Mission and Purposes

1. Provide a statement of the mission of the College of Engineering and Information Technology and your program.

Statements should indicate large-scale areas of activity and include education, research, and service components. Describe your program's purpose(s), telling what the program is designed to accomplish and at what curriculum level.

2. Discuss how the program mission and purposes are related to the mission and goals of the College of Engineering and Information Technology and to the mission and goals of the University of South Carolina.

Example: The Electrical Engineering program will produce graduates who are committed to lifelong learning.

Program Objectives and/or Learning Outcomes

1. List your program objectives.

These objectives should be broad statements relating what is to be achieved as a result of graduating from the program. Program objectives and learning outcomes can be combined or stated separately. Each objective or outcome, however, should:

(1) be measurable; (2) be stated in terms of expected student behavior; (3) reflect a program emphasis, not an individual course; and (4) specify the skills and/or competencies you expect of a graduate of your program.

Performance Criteria, Practices/Procedures and Measures/Methods

1. Provide a listing of the performance criteria for each program objective.

Performance Criteria - A performance criterion indicates the level or standard required to meet your program objectives. The performance criteria must be explicit and measurable, although they can consist of cognitive or affective measurements. Each should be a standard that can be adjusted as the program improves. In some cases, criteria exist that are endorsed by professional or education organizations, and these should be identified and used (e.g., standards for software development).


2. Accompanying statements should indicate the classroom or program activity(ies) that teach the skill(s) or provide experience in that competency, how the student performance will be measured or observed, and a time frame for observing the performance. This can be accomplished in a short paragraph.

Practices/Procedures - The statements should outline the classroom practices or program procedures that will be used to achieve a specific performance.

For example, if teamwork is the program objective, then several practices might include: (1) teamwork training; (2) self-evaluation of team participation; (3) team exams; (4) course projects completed by teams; (5) readings on teamwork; (6) teamwork role play; and (7) an outside/industry guest speaker on the use of teams and teamwork in a particular field.

Measures/Methods - The assessment methods or tools used to measure each performance criterion should be identified within the paragraph.

Possible data collection methods include: Senior Exit Survey, Alumnae/Alumni Survey, portfolios, exam items, team projects, essays on the strengths and weaknesses of teamwork, reviews of literature, classroom observation by an outside evaluator, etc.

Data Collection, Analysis, and Reporting of Results

1. This part of the narrative should briefly discuss when the data were collected and how, when, and by whom they were analyzed. The discussion should also indicate how and when the results were shared with the faculty members (e.g., annual retreat, semester course review, etc.).

This is your assessment implementation plan: the structure of committees or committee members who create or receive the data, synthesize it for trends, strengths, and weaknesses, determine recommendations, and prioritize strategic plans for making program improvements. Each program should specify the responsibilities for the assessment tasks. For example, the undergraduate and graduate committees in your program may be responsible for this, or the entire faculty may meet once a semester to review and evaluate the data.

2. Determine whether or not the performance criteria were met and the program objectives were achieved. Justify or explain your reasoning for each program objective. Make recommendations for improvement and provide an indication of how this will be accomplished.

Use of Assessment Results

1. Indicate the changes and/or improvements that were made during the preceding year for each program objective. Provide a paragraph of explanation regarding the follow-up evaluation and the results.


For example, if student feedback indicated a desire for additional oral presentation experience, list the course(s) in which changes were implemented and indicate whether the change improved student evaluations of the course.

2. Were the program objectives changed during the 1999-2000 academic year? Why? What data or findings were received that justified a change?

3. At what time, place and with whom was there any discussion during the year concerning the relevance or content of the objectives?

Example of how objectives/outcomes/criteria can be written

As part of a small group project in a senior-level course, students will demonstrate the ability to search the web for relevant research data, effectively cooperate with group members to achieve project goals, write a journal-quality report incorporating appropriate data from engineering journals, and summarize and synthesize project findings in an oral presentation to faculty, industry representatives, and colleagues. To achieve this objective, students in ELCT 401 will (see the sketch following this list):

1.) provide citations of five web sites visited and researched which address their project topic;

2.) support group members in the effective performance of their roles (rating of 3 or better on part 1 of the team member evaluation form);

3.) initiate and participate in group activities (rating of 3 or better on part 2 of the team member evaluation form);

4.) execute a group-generated plan for development and production of the group project (rating of 3 or better on part 3 of the team member evaluation form);

5.) write a report that is concise, clear, content-relevant and meaningfully conveys a summary of the project and its findings (a score of 3 or better on the report rubric);

6.) present an oral presentation that is concise, clear, content-relevant and meaningfully conveys a summary and a synthesis of research data on the group project (a score of 3 or better on the oral presentation rubric).
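As a sketch of how such criteria might be tallied once ratings are in hand, the snippet below checks each instrument against the “3 or better” threshold from the example. The instrument names are shorthand and the student data are hypothetical; the actual evaluation forms and rubrics are not reproduced in this monograph.

    # Check each rubric-based criterion against the example's threshold.
    THRESHOLD = 3  # "a rating/score of 3 or better" on each instrument

    def criteria_met(ratings):
        """ratings: dict mapping an instrument name to a student's score."""
        return {name: score >= THRESHOLD for name, score in ratings.items()}

    student = {
        "team evaluation, part 1": 4,
        "team evaluation, part 2": 3,
        "team evaluation, part 3": 2,
        "report rubric": 5,
        "oral presentation rubric": 4,
    }
    for name, met in criteria_met(student).items():
        print(f"{name}: {'met' if met else 'not met'}")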
