assessment and teaching of science skills: whole of programme perceptions of graduating students

Assessment & Evaluation in Higher Education. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/caeh20

To cite this article: Yvonne Hodgson, Cristina Varsavsky & Kelly E. Matthews (2014) Assessment and teaching of science skills: whole of programme perceptions of graduating students, Assessment & Evaluation in Higher Education, 39:5, 515-530, DOI: 10.1080/02602938.2013.842539

To link to this article: http://dx.doi.org/10.1080/02602938.2013.842539

Published online: 11 Oct 2013.




Yvonne Hodgson(a)*, Cristina Varsavsky(b) and Kelly E. Matthews(c)

(a) Faculty of Medicine, Nursing and Health Science, School of Biomedical Sciences, Monash University, Melbourne, Australia; (b) Faculty of Science, Monash University, Melbourne, Australia; (c) Teaching and Educational Development Institute, University of Queensland, Brisbane, Australia

This study reports on science student perceptions of their skills (scientific knowledge, oral communication, scientific writing, quantitative skills, teamwork and ethical thinking) as they approach graduation. The focus is on which teaching activities and assessment tasks over the whole programme of study students thought utilised each of the six nominated skills. In this quantitative study involving two Australian research-intensive universities, the teaching activities identified by students as developing the broadest number of skills were laboratory classes and tutorials. Lectures were only effective for developing scientific knowledge and, to a limited extent, ethical thinking. Assessment tasks that students perceived to utilise the broadest range of skills were assignments and oral presentations. The findings of this study document the students’ perspective about their gains in skill sets, and the teaching activities and assessment tasks that require them to use and thus develop these skills. The findings provide an opportunity to evaluate the constructive alignment of skills development, teaching activities and assessment tasks from a student’s perspective. Further research is required to actually measure the skills that students gain over their whole programme of study.

Keywords: assessment; learning gains; science skills; student perceptions

Literature review

The place of learning outcomes in higher education

Ensuring that the learning outcomes of undergraduate degree programmes are achieved by those who undertake them has become a significant reporting and accountability feature of the higher education sector in a number of countries (e.g. the USA, Australia, the UK and other European countries). The advantages of an outcome-based approach to higher education include: (i) the provision of a framework for curriculum design, (ii) clarity for learners, (iii) quality assurance and (iv) national and international transparency (Adams 2004). The use of explicit learning outcomes also benefits governments of advanced economies that want to demonstrate the economic and social gain obtained by students who go to university.

The Assessment of Higher Education Learning Outcomes (AHELO) project was established by the Organisation for Economic Co-operation and Development to examine the feasibility of obtaining worldwide data about the capabilities of final-year bachelor degree students (Coates and Richardson 2011; Tremblay, Lalancette, and Roseveare 2012). This project is taking place in an environment of increasing student enrolment and diversity, new technologies (e-learning), changing student behaviour (anytime-anywhere) and increasing competition for jobs. In Australia, it is also occurring while financial resources are diminishing, the student-to-staff ratio is increasing, and academic staff are under increased pressure to perform highly in their research.

*Corresponding author. Email: [email protected]

© 2013 Taylor & Francis

Assessment & Evaluation in Higher Education, 2014, Vol. 39, No. 5, 515–530, http://dx.doi.org/10.1080/02602938.2013.842539

Assessing the skills and knowledge of graduating students

Many higher education institutions have systems in place for mapping where graduate attributes are included in the curriculum (Sumsion and Goodfellow 2004; Tariq et al. 2004; Spencer, Riddle, and Knewstubb 2012). However, assessing graduate attributes is complex and challenging (Yorke and Knight 2006; Hughes and Barrie 2010), and few institutions have been able to measure individual student acquisition of them. In Australia, a review of university quality assurance audits revealed that most universities had failed to incorporate learning outcomes for generic graduate attributes into the curriculum and that the assessment systems in these institutions had failed to affect the learning of these capabilities (Ewan 2009). In the USA, Kuh and Ewell (2010) found that three-quarters of institutions had established learning outcomes for their degree programmes, but the assessment used in these institutions, which was driven by quality assurance, was not used to enhance teaching and learning or to inform curriculum change, evidencing a disconnect between the learning outcomes of university programmes and assessment practices. In engineering and science graduate programmes, Borrego and Cutler (2010) also found a lack of ‘constructive alignment’ (Biggs 1999) between learning outcomes, assessment evidence and learning experiences.

Within the current environment of accountability, there is a need in the higher education sector to determine the gains in knowledge and skills of graduating students at a programme level (Yorke and Knight 2006). Assessment practices in higher education institutions, however, generally occur at the subject (unit of study) level (Fraser and Bosanquet 2006), and these can vary in number and scope from institution to institution (Gibbs and Dunbar-Goddet 2009). Furthermore, assessment at the subject level is undertaken by subject teachers, who have the expertise to determine knowledge gains by individual students, but do not always have the same confidence in undertaking the responsibility of evaluating broader institutional graduate attributes (de la Harpe and David 2012).

Self-reported learning gains

Surveys are widely used within the higher education sector to evaluate the student experience of teaching and learning within the university environment. Such surveys are commonly administered at the national level by government regulatory bodies, at the university level by central administrators, and at the faculty level by programme coordinators. Student evaluations at the faculty level have been found to be better predictors of overall satisfaction than prescribed questions delivered by institutions and external bodies (Denson, Loveday, and Dalton 2010). While measuring student satisfaction, these surveys often do not evaluate the gain in skills that students feel they have acquired during their degree programme.


Douglass, Thomson, and Zhao (2012) found that self-reported gains at the level of the programme or major were the best indicator for assessing the value-added effects of a student’s academic experience at a major research university in the USA. Anaya (1999) correlated self-reports of learning, college grade-point average (GPA) and performance on a standardised USA graduate record examination, and found that self-reported gains by students were valid measures of student learning outcomes. McNaught, Ng, and Chow (2012) and Benton, Duchon, and Pallett (2011) have also found a positive correlation between student self-ratings and performance in relevant course examinations. However, there is still some debate about the accuracy of self-reported student gains: Bowman (2011) and Herzog (2011) found weak correlations between self-reported gains and longitudinal gains in general education. Grades, however, are not the only measure of student success when evaluating the effectiveness of degree programmes. Bedggood and Donovan (2011) have argued for a student-centric approach using surveys that ask students to evaluate: (i) the set of skills they gain, (ii) the extent to which the programme stimulated, motivated and challenged them and (iii) their self-evaluated performance.

Self-reported gains have been used by McNeil et al. (2012) to ensure that students were acquiring the graduate attributes that were embedded into their medical curriculum. These authors used a specially designed survey to gather this information. A similar survey for science students, the Science Students Skills Inventory (SSSI), has been used by Matthews and Hodgson (2012) to obtain data about what students think they learn during their undergraduate biomedical science degree programme.

Surveying students about their perceived skills gain is attractive from a methodological perspective, in that it can cover a wide range of learning and developmental outcomes. It is also an inexpensive way to obtain data about how much students think they have learned or changed since entering university (Anaya 1999; Benton, Duchon, and Pallett 2011). Generally speaking, students in higher education are mature and, therefore, in a good position to make valid and reliable judgements about their learning experiences and actual learning gains (Scriven 1988). Students in their third year have experience of three years of higher education learning to draw on, and their perceptions of the importance of knowledge and skills are more aligned with staff perceptions than those of students in the first year (Leggett et al. 2004).

A focus on learning outcomes is essential to inform curriculum review, to improve teaching and assessment processes and to improve student learning (Biggs 1999). Alignment of complex learning outcomes, like graduate attributes, with learning and assessment activities requires a programme-level approach (Yorke and Knight 2006). It is important to know what happens at the degree programme level, to know how students perceive their knowledge and skills gains, and to know which activities undertaken during the study programme expose students to and develop these skills. A fundamental aspect of understanding these gains lies in understanding the teaching and assessment approaches that students experience during their undergraduate studies.

Aims of the study

The Australian Learning and Teaching Academic Standards (LTAS) project, in consultation with discipline communities, has defined the academic standards for degree programmes. Within the context of science, five science threshold learning outcomes (TLOs: Yates, Jones, and Kelder 2011) have been identified. These provide a foundation for articulating the capabilities expected of a science graduate. The skills surveyed by the SSSI were aligned with the Australian science TLOs. These state that science graduates will: (i) demonstrate a coherent understanding of science, (ii) exhibit depth and breadth of scientific knowledge, (iii) critically analyse and solve scientific problems, (iv) be effective communicators of science and (v) be accountable for their own learning and scientific work (Yates, Jones, and Kelder 2011). The SSSI (Matthews and Hodgson 2012), which is broadly aligned with four of the science TLOs, evaluates perceived skills gain in six areas: scientific knowledge, oral communication, scientific writing, quantitative skills, teamwork and ethical thinking.

This study uses the SSSI to examine the whole of programme teaching and assessment activities as experienced by science students close to graduation. It looks at how the intended science knowledge and skills are taught and assessed, and at the alignment of these at the degree programme level, asking specifically:

(1) To what extent do graduating science students believe they have developed the six skills – scientific knowledge, oral communication, scientific writing, quantitative skills, teamwork and ethical thinking – during their undergraduate degree?

(2) What are the student perceptions of the types of teaching activities thatinclude and develop these skills?

(3) What are the student perceptions of the types of assessment activities thatinclude and develop these skills?

Methods

Survey instrument

A quantitative study was designed to evaluate student perceptions of their science skills, and which teaching activities and assessment tasks they felt developed their knowledge and skills during their undergraduate science degree programme. The SSSI (Matthews and Hodgson 2012) was used as the survey instrument; it also collects demographic information including gender, age, participation in undergraduate research experiences, GPA and the plans students have for after graduation. The instrument asks students to select the teaching activities and assessment tasks they perceive as contributing to the development of six science graduate skills (see Appendix 1).

The teaching activities and assessment tasks that are commonly used to teach science at university were listed on the survey. Students were asked, ‘Throughout your entire science degree programme, which types of class contact required you to utilise the following …’ and ‘Throughout your entire science degree programme, which assessment tasks required you to utilise the following …’ The SSSI then allowed students to respond for the six learning outcomes across the two prompts. Students were informed they could select any number of contact types and assessment tasks that applied for each skill. Teaching activities or class contact (a phrase more familiar to students) included lectures, practical laboratory classes and tutorials, while assessment tasks included assignments, presentations, literature reviews, examinations and practical reports.


The survey then asked students to rate themselves on each of the six skills. The question asked was ‘Thinking about the skills and knowledge you have acquired during your science degree, how would you rate yourself using the following scale?’ The scale was a seven-point Likert scale (from 1 = very poor to 7 = very good).

Data collection and analysis

Students in the final year of their bachelor of science or biomedical science degree programme at two Australian research-intensive universities were invited to complete the SSSI. The questionnaire was administered online (SurveyMonkey) during the last weeks of the students’ final semester. Students were emailed a survey link along with an information sheet and were allowed two weeks in which to complete the survey. An incentive was offered (entry to win gift vouchers) to encourage survey completion; this is a common practice (Berk 2006) and was not viewed by the authors as a factor biasing students’ responses. Ethics clearance for the study was obtained from both universities. Descriptive statistics (number and percentage of positive responses) for the survey questions were obtained using PASW 20.0.
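The descriptive statistics reported below (mean ± SD self-ratings on the seven-point scale, and counts with percentages of positive responses) were computed in PASW 20.0. As an illustration only, the same two summaries can be sketched in a few lines of Python; the ratings and responses here are invented for the example and are not the study’s data:

```python
# Illustrative sketch only (hypothetical data, not the study's):
# the two kinds of descriptive statistic reported in this paper.
from statistics import mean, stdev

# Hypothetical 7-point Likert self-ratings (1 = very poor, 7 = very good)
ratings = [6, 5, 7, 4, 6, 5, 5, 6]

# Mean +/- SD, as reported for each of the six skills (Figure 1)
print(f"self-rating: {mean(ratings):.2f} +/- {stdev(ratings):.2f}")

# Hypothetical yes/no responses: did this task utilise the skill?
responses = [True, True, False, True, False, True, True, True]
n_positive = sum(responses)

# Count and percentage of positive responses, as reported in Table 1
print(f"positive: {n_positive} ({100 * n_positive / len(responses):.1f}%)")
# prints "positive: 6 (75.0%)"
```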

Results

Study participants

A total of 1207 students in the bachelor of science or biomedical science degree programmes at the two research-intensive universities were invited to complete the SSSI. A response rate of 56% was obtained, with 647 responses received. Responses that were incomplete and responses from students undertaking double degree programmes were excluded, as our intention was to explore the single science degree programmes without the confounding influence of double degree curricula. In total, 400 student responses (19–40 years; 57% female) across University A (60.5%) and University B (39.5%) were analysed. The majority of students across both universities were enrolled in a life sciences major (89.0%), followed by earth and space sciences (5.8%), physical sciences (5.0%) and computational and material sciences (0.3%). Almost half of the participants reported a GPA at the credit level (48.3%), 26.3% reported a lower GPA (failing or passing), while 25.5% reported a higher GPA (distinction or high distinction average). Almost half of participants planned on enrolling in a postgraduate programme for a professional job role after graduation (47.5%), followed by those with other or no set plans (28.0%), those planning to enter the workforce (19.0%) and those planning to enrol in a postgraduate research course (5.5%).

Self-rating of graduate skills

Figure 1 shows the descriptive statistics for student self-ratings of the graduate skills acquired during their undergraduate science degree programme. Students gave the highest self-ratings (scale of 1 = very poor to 7 = very good) to their gain in teamwork skills (mean ± SD, 5.76 ± 1.11, Figure 1). Similar, and slightly lower, self-ratings were given for scientific knowledge, written communication and oral communication (5.68 ± 0.96, 5.64 ± 1.07 and 5.42 ± 1.16, respectively, Figure 1). Students gave lower self-ratings for their gains in quantitative skills and ethical thinking skills (4.52 ± 1.48 and 4.98 ± 1.40, respectively, Figure 1).


Utilisation of graduate skills within teaching and assessment activities

Scientific knowledge

Students identified lectures as the teaching activity that most developed their scientific knowledge (83.3%, Table 1 and Figure 2), followed by practical laboratories (79.8%) and tutorials (67.5%). The majority of students agreed that all assessment tasks listed in the survey required their use of scientific knowledge (Table 1 and Figure 2). Examinations were rated highest (88.8%) for assessing scientific knowledge, followed by practical reports (86.5%), assignments (83.0%), literature reviews (76.7%) and presentations (67.3%).

Oral communication skills

Tutorials and practical laboratory classes received similar ratings from students for utilisation of oral communication skills (78.5 and 74.5%, respectively, Table 1 and Figure 2). Few students (11.3%) agreed that lectures utilised oral communication skills. The assessment tasks in which the majority of students agreed that oral communication skills were utilised were presentations (68.1%) and assignments (57.8%). A minority of students stated that practical reports (32.3%), literature reviews (21.8%) and examinations (7.8%) assessed oral communication skills.

Written communication skills

The teaching activity that students agreed included writing skills most was practical laboratory classes (80.3%), followed by tutorials (61.5%, Table 1 and Figure 2). Only 13.3% of students agreed that lectures included writing skills. The majority of students agreed that all assessment tasks assessed writing skills. The assessment tasks with the highest ratings were practical reports (92.2%), literature reviews (84.2%) and assignments (81.4%). Examinations and presentations were also rated as assessing writing skills, but by fewer students (54.8 and 53.4%, respectively).

Figure 1. Student self-rating of their graduate skills (scale of 1 = very poor to 7 = very good).

Quantitative skills

Teaching activities seen by students as most including quantitative skills were practical laboratory classes (86.3%) and tutorials (64.3%, Table 1 and Figure 2). Only 35.3% of students agreed that lectures included quantitative skills. Of the five nominated assessment tasks, only two were seen by students as assessing quantitative skills: assignments (78.5%) and examinations (61.5%). Few students rated the other assessment tasks – literature reviews, presentations and practical reports – as assessing quantitative skills (21.8, 19.5 and 16.0%, respectively).

Teamwork skills

Students utilised teamwork skills most during practical laboratory classes (93.0% agreed, Table 1 and Figure 2). Seventy per cent of students agreed that teamwork skills were also utilised during tutorial classes. Very few students stated that lectures utilised teamwork skills (4.0%). Assignments were rated by a high percentage of students as assessing their teamwork skills (83.5%). Presentations (70.4%) and practical reports (67.9%) were also rated highly for assessing these skills. Literature reviews (20.6%) and examinations (5.5%) were not seen as assessing teamwork skills.

Figure 2. Graphic representation of student perceptions of teaching activities (black) and assessment tasks (grey) according to utilisation of skill. (Lec = lecture; Lab = laboratory class; Tute = tutorial; Assign = assignment; Pres = oral presentation; Lit rev = literature review; Exam = examination; Prac = practical report.) In our institutions, tutorials are small-group classes (15–30 students) of 1–3 hours duration in which students work with a tutor to discuss and work through problems. Practical reports are reports written from results generated by students during a laboratory practical class.

Ethical thinking

Ethical thinking received relatively low ratings from students for inclusion in teaching activities and for being assessed in any of the listed assessment task types (Table 1 and Figure 2). A small majority of students agreed that ethical thinking skills were included in practical laboratory work (55.8%), tutorials (55.0%) or lectures (50.0%). Few students rated ethical skills as being assessed by the nominated assessment tasks: just over 50% of students agreed that practical reports assessed ethical thinking (Table 1), while assignments (46.5%), literature reviews (42.3%), examinations (23.3%) and presentations (15.0%) were seen by fewer students as assessing ethical thinking skills.

Discussion

This study examined the perceptions of graduating science students about their development of six science skills (scientific knowledge, oral communication, scientific writing, teamwork, quantitative skills and ethical thinking) during their degree programme. It examined which teaching activities and assessment tasks contributed to their development and looked to see whether there was constructive alignment between what we ask students to do, how we assess them and what they think they learn (Biggs 1999).

We asked the students, ‘Thinking about the skills and knowledge you have acquired during your degree programme, how would you rate yourself?’ Students were most positive about their gains in scientific knowledge. A recent Australian study involving more than 800 science graduates concluded, ‘Scientists valued their own knowledge and considered knowledge a fundamental component of a science education’ (Harris 2012, 9). As a discipline, science is organised around a hierarchical body of knowledge, leading to instruction and curricula focused on content knowledge (Lattuca and Stark 2011). It is therefore unsurprising that science students identify high gains in content knowledge.

Table 1. Student perceptions regarding where graduate skills were utilised within teaching and assessment activities [n (%)].

| Activity | Scientific content knowledge | Communication skills | Writing skills | Quantitative skills | Teamwork skills | Ethical thinking |
| Teaching activities(a) | | | | | | |
| Lecture | 333 (83.3%) | 45 (11.3%) | 53 (13.3%) | 141 (35.3%) | 16 (4.0%) | 200 (50.0%) |
| Practical (lab) | 319 (79.8%) | 298 (74.5%) | 321 (80.3%) | 345 (86.3%) | 372 (93.0%) | 223 (55.8%) |
| Tutorial | 270 (67.5%) | 314 (78.5%) | 246 (61.5%) | 257 (64.3%) | 281 (70.3%) | 220 (55.0%) |
| Assessment activities(a) | | | | | | |
| Assignment | 332 (83.0%) | 252 (57.8%) | 355 (81.4%) | 314 (78.5%) | 364 (83.5%) | 186 (46.5%) |
| Presentation | 269 (67.3%) | 297 (68.1%) | 233 (53.4%) | 78 (19.5%) | 307 (70.4%) | 60 (15.0%) |
| Literature review | 304 (76.0%) | 95 (21.8%) | 367 (84.2%) | 87 (21.8%) | 90 (20.6%) | 169 (42.3%) |
| Exam | 335 (88.8%) | 34 (7.8%) | 239 (54.8%) | 246 (61.5%) | 24 (5.5%) | 93 (23.3%) |
| Practical report | 346 (86.5%) | 141 (32.3%) | 402 (92.2%) | 64 (16.0%) | 296 (67.9%) | 203 (50.8%) |

Note: (a) Based on students who perceived the respective graduate skills were utilised in the respective activities.

Undergraduate science curricula, many believe, emphasise content at the expense of developing skills (Wieman 2007; Wood 2009). The students in this study, while highlighting the role of content knowledge, also recognised their development of skills. Students were positive about their gains in teamwork, oral communication and written communication skills, a reassuring observation, given that employers and universities want graduates who are good communicators and team workers (Yorke and Harvey 2005) and that two of the five science TLOs cover communication and teamwork (Yates, Jones, and Kelder 2011). Hughes and Barrie (2010) have reported on the difficulty of assessing graduate attributes, due largely to their complex nature and the challenge they pose to conventional assessment practices. Although we have not measured the gain in these skills, our study, in taking a student-centred approach, has found that students do feel that they develop these skills during their undergraduate science degree programmes, and studies have shown that student perceptions of skills gain are a reliable indicator of academic achievement (Benton, Duchon, and Pallett 2011; Douglass, Thomson, and Zhao 2012).

Students were less positive about their gains in quantitative and ethical thinking skills. Although the present study sampled students from only two universities, the literature does suggest that quantitative skills are an issue of concern for staff teaching in science degree programmes in Australia (Brown 2009). Our data, from the perspective of students, offer further support for this finding. The LTAS project has listed ‘the ability to critically analyse and solve scientific problems by collecting and interpreting scientific data’ as one of the five TLOs for science graduates (Yates, Jones, and Kelder 2011), stressing further the need for curriculum reform in the area of quantitative skills.

The low rating for ethical thinking by students in the survey can be explained by the fact that the science courses at the two universities in our study do not include specialised modules on ethics in the curriculum. Scientists often face complex situations that necessitate an ethical response, and ethical thinking and behaviour is embedded in the Australian science TLOs (Yates, Jones, and Kelder 2011). In this light, together with the data collected from our students, we highlight ethics as an area where there is a lack of alignment between the learning outcomes, teaching activities and assessment practices. Consequently, our study identifies ethics as an important area for science curriculum development. In computer science programmes in the USA, the lack of teaching and assessment of ethics in some programmes has been attributed to the lack of training of academic staff in ethics, and the lack of available time in the curriculum (Spradling, Soh, and Ansorge 2008). These factors would need to be addressed by Australian science educators when designing curricula to teach ethical thinking.

Teaching activities

We asked students to indicate which teaching activities they believed included and developed the six skills. Lectures were clearly seen by students (83%) as the teaching activity that developed scientific knowledge. This may reflect the didactic and passive nature of most lectures. In a study of university academic staff teaching science, Handal, Lauvhs, and Lycke (1990, 321) stated:

science teaching is regarded as a straightforward process of information transmission, in which the ‘backbone’ of the content is presented in an optimally structured way. Teaching quality seems to be judged on the basis of order, correctness and speed in the teachers’ presentation. Student activity is restricted to taking down notes.

More recently, Handelsman et al. (2004) have highlighted the lack of change in teaching methods in science, particularly in lectures. Our findings suggest that, on the whole, lecturers at our universities, as perceived by the students they teach, deliver content without developing other skills. At both universities, lectures are often delivered to large student cohorts, so it is not unusual for students to be sitting in theatres of 300 plus students. The research-intensive nature of the two participating universities may also contribute to the didactic nature of lecture delivery (Anderson et al. 2011).

An important finding of our study is the prominent role played by practical laboratory classes in skills development. The survey results showed that practical laboratory classes were perceived by students to utilise and develop all of the six nominated skills, with teamwork skills receiving the highest rating (93% agreed). Tutorials were also perceived to utilise six of the nominated skills, but the ratings for tutorials were lower for each skill than they were for practical laboratory classes, with the exception of communication skills. This finding is significant, because the value of laboratory classes is being questioned in these times of rationalisation of resources (Johnstone 1997; Hawkes 2004). Laboratory classes are expensive, place a heavy toll on staff time and some students have negative views about them. Our findings demonstrate that students believe that practical laboratory classes and small group work in tutorials are the teaching activities where they develop the essential scientific and graduate skills. Students in physics have also given a positive rating to laboratory work for their learning of physics theory and for developing teamwork skills (Hanif et al. 2009), providing further evidence that laboratory classes are aligned with learning outcomes, and that they are an important avenue for developing learning outcomes that cannot be met by lectures.

Assessment tasks

The SSSI asked students about their perceptions of assessment tasks in the utilisation of the six skills. Examinations are the principal form of assessment task for assessing declarative knowledge and typically involve written essay or multiple choice (MCQ) questions (Biggs and Tang 2007). At the two universities reported in this study, MCQs formed a substantial proportion of the end of semester examinations, typically making up between 33 and 100% of end of semester examination papers. The examination papers with the higher MCQ percentage (100%) are usually employed at lower year levels, where there are larger classes. These types of examinations, while they have the advantage of being quick and easy to mark, have weaknesses in both motivating and assessing quality learning (Scouller 1998; Nicol 2007). This is not to say that MCQ examination papers cannot be effective assessment tools. When they are designed appropriately and utilised in context, they can be a very effective assessment tool (Nicol 2007). However, achieving this requires time and assessment expertise from staff.

The students in the present study reported that examinations assessed predominantly their scientific knowledge. Given that scientific knowledge is an important science TLO (Yates, Jones, and Kelder 2011) and was a ‘skill’ highly rated by students in their self-rating, their indication that examinations assessed scientific knowledge demonstrates an alignment between examinations, content knowledge and students’ positive beliefs that their degree programmes built their content knowledge.

Much has been written about assessment in the literature, specifically the types of assessment tasks used, what they assess and the ability of assessment practices to drive student learning (Boud 1995; Rust 2002; Biggs and Tang 2007). The present study is unusual in that it asked students their views on what skills they believe different assessment tasks have utilised and developed at the programme level. The student perceptions of which assessment tasks utilised the six nominated skills showed that most assessment tasks, with the exception of assignments, utilised and developed only some of the nominated skills. Examinations were perceived to assess mainly scientific knowledge (89%), quantitative skills (62%) and writing skills (55%). Practical reports were agreed by the majority of students to utilise writing skills (92%), scientific knowledge (87%) and teamwork skills (68%).

A study by Aurora (2010) found that practical report writing enhanced students’ learning of scientific content, supporting the perception of gain in scientific knowledge expressed by our cohort of students. Students perceived strongly that practical classes, as a teaching activity, utilised and developed five of the six science skills in the study. These results demonstrate an alignment between the teaching activity and the assessment tasks (laboratory work and laboratory reports) for three skill areas – scientific content knowledge, teamwork and writing skills – but also demonstrate that practical reports do not assess all the skills that students learn in the laboratory practical class. This indicates that, in order to ensure constructive alignment in this area of the curriculum (Biggs 1999), staff need to re-evaluate the tools they use to assess laboratory work.

The students in our study perceived oral presentations to utilise and develop several skills, namely teamwork skills (70%), oral communication skills (68%) and scientific knowledge (67%). In other studies, students have described oral presentations to be more demanding than written assignments, requiring a deeper understanding and leading to better learning (Joughin 2007) and a professional identity (Huxham, Campbell, and Westwood 2012). Our results, although confirming a multidimensional aspect of student learning in oral presentations, did not confirm a greater learning of scientific knowledge in oral presentations compared with written assignments.

The students in our study gave a higher utilisation of scientific knowledge for written literature reviews (76%) and assignments (83%) than for oral presentations. Of all the assessment tasks included in the survey, assignments were perceived by students as utilising the largest number of skills. In addition to developing knowledge and writing skills, assignments were also perceived by students to develop quantitative skills and teamwork skills, highlighting the multiple ways in which students perceive a single academic assessment task. This observation has implications for the constructive alignment of learning outcomes and assessment tasks, indicating the complex nature of assessment and the importance of including student views in planning assessment.

The results of this study have implications for curriculum design. Two areas of weakness in the curriculum were identified: ethical thinking and quantitative skills. These skills have been highlighted by others to be of concern in the science curriculum and therefore present areas for further research. A limitation of this study is that it provides a broad-brush perspective of student perceptions. There has been little literature in this area for us to build on, and further research is required to actually measure the skills that students gain over their whole programme of study. Furthermore, our findings relate to science degree programmes at two research-intensive universities and may not apply to science degree programmes at other universities or to other degree programmes.

Conclusion

This study has examined perceptions of graduating science students about the teaching tasks and assessment items that developed a set of six nominated skills over their whole undergraduate science degree programme. There was constructive alignment in the teaching and assessing of scientific knowledge. However, in some areas of the curriculum there is a need to fine-tune how we assess some of the nominated skills. We have identified quantitative skills and ethical thinking as areas of the curriculum that require further development in both teaching and assessment.

Acknowledgments

We thank all the students who participated in our study and acknowledge the statistical contribution of Victoria Andrews. Funding was provided by the Faculty of Science, University of Queensland.

Notes on contributors

Yvonne Hodgson is the deputy director of Education in the School of Biomedical Sciences and teaches in the field of physiology. Her research interests include interdisciplinary curriculum development and the evaluation of subsequent student learning outcomes, peer learning and peer assessment.

Cristina Varsavsky is the deputy dean of the Faculty of Science. Her interests in scholarship of learning and teaching include broad areas of mathematics and science education, including technology in mathematics education, curriculum development, transferrable skills and the interface of mathematics education between secondary school and university.

Kelly E. Matthews is a lecturer in Higher Education. Her research involves practical applications into contemporary higher education issues, including undergraduate curriculum reform.

References

Adams, S. 2004. “Learning Outcomes: A Consideration of the Nature, Role, Application and Implications for European Education of Employing ‘Learning Outcomes’ at the Local, National and International Levels.” Towards the European Higher Education Area Bologna Process. www.scotland.gov.uk/Resource/Doc/25725/0028779.pdf.

Anaya, G. 1999. “College Impact on Student Learning: Comparing the Use of Self-Reported Gains, Standardised Test Scores, and College Grades.” Research in Higher Education 40 (5): 499–526.


Anderson, W. A. U., C. L. Banerjee, S. C. R. Drennan, I. R. Elgin, J. Epstein, G. F. Handelsman, R. Hatfull, et al. 2011. “Changing the Culture of Science Education at Research Universities.” Science 331: 152–153.

Aurora, T. S. 2010. “Enhancing Learning by Writing Laboratory Reports in Class.” The Journal of Faculty Development 2 (1): 35–36.

Bedggood, R. E., and J. D. Donovan. 2012. “University Performance Evaluations: What are We Really Measuring?” Studies in Higher Education 37 (7): 825–842.

Benton, S. L., D. Duchon, and W. H. Pallett. 2013. “Validity of Student Self-Reported Ratings of Learning.” Assessment & Evaluation in Higher Education 38 (4): 377–388.

Berk, R. A. 2006. Thirteen Strategies to Measure College Teaching. Sterling: Stylus.

Biggs, J. B. 1999. Teaching for Quality Learning at University. Buckingham: Open University Press.

Biggs, J. B., and C. Tang. 2007. Teaching for Quality Learning at University. 3rd ed. Maidenhead: Open University Press.

Borrego, M., and S. Cutler. 2010. “Constructive Alignment of Interdisciplinary Graduate Curriculum in Engineering and Science: An Analysis of Successful IGERT Proposals.” Journal of Engineering Education 99 (4): 355–369.

Boud, D. 1995. “Assessment and Learning: Contradictory or Complementary?” In Assessment for Learning in Higher Education, edited by P. Knight, 35–49. London: Kogan Page.

Bowman, N. 2011. “Validity of College Self-Reported Gains at Diverse Institutions.” Educational Researcher 40 (1): 22–24.

Brown, G. 2009. Review of Education in Mathematics, Data Science and Quantitative Disciplines: Report to the Group of Eight Universities. Canberra: Group of Eight.

Coates, H., and S. Richardson. 2011. “An International Assessment of Bachelor Degree Graduates’ Learning Outcomes.” Higher Education Management and Policy 23 (3): 51–69.

de la Harpe, B., and C. David. 2012. “Major Influences on the Teaching and Assessment of Graduate Attributes.” Higher Education Research & Development 31 (4): 493–510.

Denson, N., T. Loveday, and H. Dalton. 2010. “Student Evaluation of Courses: What Predicts Satisfaction?” Higher Education Research & Development 29 (4): 339–356.

Douglass, J. A., G. Thomson, and C. M. Zhao. 2012. “The Learning Outcomes Race: The Value of Self-Reported Gains in Large Research Universities.” Higher Education 64: 317–335.

Ewan, C. 2009. Learning and Teaching in Australian Universities: A Thematic Analysis of Cycle 1 AUQA Audits. Melbourne: Australian Universities Quality Agency and the Australian Learning and Teaching Council.

Fraser, S. P., and A. M. Bosanquet. 2006. “The Curriculum? That’s Just a Unit Outline, Isn’t It?” Studies in Higher Education 31 (3): 269–284.

Gibbs, G., and H. Dunbar-Goddet. 2009. “Characterising Programme-Level Assessment Environments that Support Learning.” Assessment & Evaluation in Higher Education 34 (4): 481–489.

Handal, G., P. Lauvhs, and K. Lycke. 1990. “The Concept of Rationality in Academic Science Teaching.” European Journal of Education 25: 319–332.

Handelsman, J., D. Ebert-May, R. Beichner, P. Bruns, A. Chang, R. DeHaan, J. Gentile, et al. 2004. “Scientific Teaching.” Science 304: 521–522.

Hanif, M., P. H. Snedden, F. M. Al-Ahmadi, and N. Reid. 2009. “The Perceptions, Views and Opinions of University Physics Students About Physics Learning During Undergraduate Laboratory Work.” European Journal of Physics 30: 85–96.

Harris, K. L. 2012. A Background in Science: What Science Means for Australian Society. University of Melbourne Commissioned Report Prepared for the Australian Council of Deans of Science. Melbourne: Centre for the Study of Higher Education.

Hawkes, S. J. 2004. “Chemistry is NOT a Laboratory Science.” Journal of Chemical Education 81 (9): 1257.

Herzog, S. 2011. “Gauging Academic Growth of Bachelor Degree Recipients: Longitudinal vs. Self-reported Gain in General Education.” New Directions for Institutional Research 150: 21–39.

Hughes, C., and S. Barrie. 2010. “Influences on the Assessment of Graduate Attributes in Higher Education.” Assessment & Evaluation in Higher Education 35 (3): 325–334.


Huxham, M., F. Campbell, and J. Westwood. 2012. “Oral Versus Written Assessments: A Test of Student Performance.” Assessment & Evaluation in Higher Education 37 (1): 125–136.

Johnstone, A. H. 1997. “Chemistry Teaching – Science or Alchemy? 1996 Brasted Lecture.” Journal of Chemical Education 74 (3): 262–268.

Joughin, G. 2007. “Student Conceptions of Oral Presentations.” Studies in Higher Education 32 (3): 323–336.

Kuh, G., and P. Ewell. 2010. “The State of Learning Outcomes in the United States.” Higher Education Management and Policy 22 (1): 1–20.

Lattuca, L. R., and J. S. Stark. 2011. Shaping the College Curriculum: Academic Plans in Context. San Francisco, CA: Jossey-Bass.

Leggett, M., A. Kinnear, M. Boyce, and I. Bennett. 2004. “Student and Staff Perceptions of the Importance of Generic Skills in Science.” Higher Education Research & Development 23 (3): 295–312.

Matthews, K., and Y. Hodgson. 2012. “The Science Students Skills Inventory: Capturing Graduate Perceptions of their Learning Outcomes.” International Journal of Innovations in Science and Mathematics Education 20 (1): 24–43.

McNaught, C., S. Ng, and H. Chow. 2012. “Literacies in the Humanities: The Student Voice.” Higher Education Research & Development 31 (2): 139–154.

McNeil, H. P., H. Scicluna, P. Boyle, M. Grimm, K. Gibson, and P. Jones. 2012. “Successful Development of Generic Capabilities in an Undergraduate Medical Education Program.” Higher Education Research & Development 31 (4): 525–539.

Nicol, D. 2007. “E-Assessment by Design: Using Multiple-Choice Tests to Good Effect.” Journal of Further and Higher Education 31 (1): 53–64.

Rust, C. 2002. “The Impact of Assessment on Student Learning – How Can the Research Literature Practically Help to Inform the Development of Departmental Assessment Strategies and Learner Centred Assessment Practices?” Active Learning in Higher Education 3 (2): 145–158.

Scouller, K. 1998. “The Influence of Assessment Method on Students’ Learning Approaches: Multiple-Choice Question Examination Versus Assignment Essay.” Higher Education 35 (4): 453–472.

Scriven, M. 1988. “The Validity of Student Ratings.” Instructional Research 92 (2): 5–18.

Spencer, D., M. Riddle, and B. Knewstubb. 2012. “Curriculum Mapping to Embed Graduate Capabilities.” Higher Education Research & Development 31 (2): 217–231.

Spradling, C., L. Soh, and C. Ansorge. 2008. “Ethics Training and Decision-Making: Do Computer Science Programs Need Help?” Proceedings of the 39th SIGCSE Technical Symposium on Computer Science Education, 153–157. Portland, OR: SIGCSE.

Sumsion, J., and J. Goodfellow. 2004. “Identifying Generic Skills Through Curriculum Mapping: A Critical Evaluation.” Higher Education Research & Development 23 (3): 329–346.

Tariq, V. N., E. M. Scott, A. C. Cochrane, M. Lee, and L. Rypes. 2004. “Auditing and Mapping Key Skills Within University Curricula.” Quality Assurance in Education 12 (22): 70–81.

Tremblay, K., D. Lalancette, and D. Roseveare. 2012. “Assessment of Higher Education Learning Outcomes. Feasibility Study Report.” Design and Implementation 1. http://www.oecd.org/edu/highereducationandadultlearning/AHELOFSReportVolume1.pdf.

Wieman, C. E. 2007. “Why Not Try a Scientific Approach to Science Education?” Change 39: 9–15.

Wood, W. B. 2009. “Revising the AP Biology Curriculum.” Science 325: 1627–1628.

Yates, B., S. Jones, and J. Kelder. 2011. Learning and Teaching Academic Standards Project: Science. Final Report. http://www.olt.gov.au/resource-learning-and-teaching-academic-standards-sciecne-2011.

Yorke, M., and L. Harvey. 2005. “Graduate Attributes and Their Development.” New Directions for Institutional Research 128: 41–58.

Yorke, M., and P. T. Knight. 2006. “Curricula for Economic and Social Gain.” Higher Education 51 (4): 565–588.


Appendix 1. The questions on the SSSI (Matthews and Hodgson 2012) used to provide the data for the study.

Q1. What is your overall (cumulative) grade point average (GPA)?
□ 0.00–0.49
□ 0.50–0.99
□ 1.00–1.99 (Pass average)
□ 2.00–2.99 (Credit average)
□ 3.00–4.00 (Distinction–high distinction average)
□ Don’t know

Q2. Throughout your entire science degree program, which types of class contact required you to utilise the following: (choose all that apply)
Response options for each skill: Lectures, Practicals, Tutorials, N/A
– Scientific content knowledge in your field(s) of study
– Communication skills (oral scientific presentations)
– Writing skills (scientific writing)
– Quantitative skills (mathematical & statistical reasoning)
– Teamwork skills (working with others to accomplish a shared task)
– Ethical thinking (ethical responsibilities and approaches)
– Research skills (practical skills, critical thinking and critiquing)

Q3. Throughout your entire science degree program, which assessment tasks required you to utilise the following: (choose all that apply)
Response options for each skill: Practical reports, Laboratory assignments, Quizzes, Posters, Literature reviews, Exams, N/A
The skills listed are the same as in Q2.

Q4. Thinking about the skills and knowledge you have acquired during your science degree program, how would you rate yourself using the following scale? 1 – Very poor to 7 – Very good
– My scientific content knowledge in my field(s) of study
– My communication skills (oral scientific presentations)
– My writing skills (scientific writing)
– My quantitative skills (mathematical & statistical reasoning)
– My teamwork skills (working with others to accomplish a shared task)
– Ethical thinking (ethical responsibilities and approaches)
– Research skills (practical skills, critical thinking and critiquing)
