The Summer 2014 Edition of Assess & Profess by the Office of Assessment & Research at Duke University

Page 1: Assess & Profess Summer 2014 Edition

Assessment and Research Issue 9, Summer 2014

ASSESS & PROFESS

THOUGHTS ON ASSESSMENT

As student affairs practitioners, we’ve got Mother Teresa’s idea in the bag. We focus our efforts on the immediate needs of the students with whom we work – and this is the right approach. None of this, however, diminishes the importance of assessment, because at the end of the day we know the “numbers” are important.

I think of her statement as a way to frame assessment efforts in student affairs effectively. The numbers aren’t always what matters most to you. At times, your work with a single student is the place where you most tangibly contribute to the mission of the division and the Duke community.

So, when it comes time to design and execute your assessment plan, why is it centered on numbers? Sometimes it seems like the easiest thing to do. Sometimes it’s all we think our stakeholders will accept. But those numbers often tell us little; they cannot reliably inform future decisions, and so they fail to improve our work.

My challenge to you is this: when confronted with a situation where counting the “numbers” makes little sense, take a step back and look for a more appropriate way to measure the outcomes of your work. I’m sure Mother Teresa still kept some measure to understand whether she had effectively helped each individual in front of her. Over time, doing so will help you, too, to report a number that says something meaningful about the ways your work has impacted students.

In this Issue:

1 A Message from the Office: Thoughts on Numbers in Assessment
2 Thinking Beyond the Box, by Jordan Hale
2 Assessment in a Culture of Connection, by Rebecca Hurst
3 Five Things NOT to Do When Developing Surveys
4 Assessment Summer Training for Division Staff
6 The Cure for the Event Planning Headache, by Kyle Fox and Jerrica Washington
8 Setting Up Your Baseline Account
9 Sampling: A Memo, by C. Colgate Taylor

Have something you would like to share in the next edition? Email Aisha Al-Qimlass: [email protected]

Page 2: Assess & Profess Summer 2014 Edition

Duke Student Affairs Issue 9, Summer 2014

Thinking Beyond the Box: Additional Methods of Data Collection
By Jordan Hale

Starting in the late 1990s, collecting data on, assessing, and reflecting on the student experience has become a central component of many university Student Affairs offices. In an effort to validate the need for financial and human resources, departments across the country began to actively assess the importance of their work. Subsequently, focus groups, task forces, and other teams assembled to decide the best ways to implement assessment components throughout a student’s experience. Nationally, efforts began as CAS Standards were developed, professional associations organized assessment resources, and universities across the country streamlined data collection methods.

In recent years, the methods for collecting data have remained standard: surveys and focus groups are still the dominant techniques. While these methods are effective, research in fields such as psychology, sociology, and anthropology has yielded a variety of qualitative and quantitative techniques our teams can use to collect data.

Using Photography to Tell a Story

How many times have we given a camera to our students and asked them to tell their story via photograph? Recent research (Birnbaum & Guido, 2011) has highlighted both the benefits and challenges of conducting research through student photography. One tangible benefit of using photography is the use of participant action research (PAR). When using PAR, the participant is an active component of the

Continued on page 5

Assessment in a Culture of Connection
By Rebecca Hurst

I am a firm believer that we are most energized and effective when we are connected to our own internal experiences and to those around us. Internal experiences may be curious thoughts like, “I wonder what a more effective way to do [blank] is,” or, “What are students really getting out of this?” Curiosity is informed by attending to our own intuitive sense that something important is happening here (or, alternatively, that something is lacking). It is also informed by really listening to students and colleagues and by engaging one another in open conversation.

The natural ebbs and flows of our professional responsibilities and personal lives certainly require that we move in and out of connection. However, in a campus culture where being busy can become the status quo, we run the risk of functioning in a state of “disconnected doing.” While this may be necessary (and even useful) at times, staying here shuts down the potential for reflection, dialogue, and intentional action.

What does this have to do with assessment, you ask? Everything. Meaningful assessment requires us to ask questions that are relevant to the experiences of our students, colleagues, campus, and community. It requires that

Continued on page 7

Page 3: Assess & Profess Summer 2014 Edition

Five Things NOT To Do When Developing Surveys for Assessment in Student Affairs

(Courtesy of the NASPA Research and Policy Institute Issue Brief) Summary by Bayley Garbutt

1. Don’t Lose Sight of What You Want to Know
• The latent variable is the phenomenon you are seeking to measure, which is not readily quantified.
• When we measure, we seek to approximate its true score, since it cannot be measured directly.
• The classical measurement model will allow you to approximate the true score for what you wish to know.

2. Don’t Just Ask, Measure
• When you wish to assess a student population, think scale instead of survey. Surveys simply gather information; scales measure, to get a sense of the true score associated with that population.
• Some redundancy among multiple items helps with accuracy when you are trying to measure your latent variable.

3. Don’t Create Your Own Survey Format
• If you create your own format, you have no proof, in the form of reliability and validity, that it will work.
• Scale formats such as the Likert scale have been developed and tested to capture true elicitations from respondents.
• The six-point Likert scale includes anchors of agreement; other similarly formatted scales offer standardized anchors for capturing scores associated with your latent variable.

4. Don’t Be Afraid of Validity
• Validity gets at the issue of whether your scale is actually measuring the latent variable you set out to understand.
• There are four types of validity you should be concerned about for assessments in student affairs: content validity, criterion-related validity, construct validity, and conclusion validity.
• Asking these questions about your scale can help you begin to determine how valid it is:
i. Are you measuring what you say you are measuring?
ii. Does your survey align well with other similar surveys?
iii. Do items in your survey "hang together" as they should?
iv. What conclusions can you draw from your results?

5. Don’t Ignore Reliability
• Recognizing that the true score of a latent variable is unknowable, reliability is as important as validity in determining whether a scale is accurate.
• Reliability pertains to how well items on your scale relate to the other items.
• Reliability is typically measured using Cronbach’s coefficient alpha, with scores ranging from 0 (totally inaccurate) to 1 (perfectly accurate).
• A Cronbach's alpha greater than 0.70 is good, greater than 0.80 is very good.
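For readers who want to see the computation behind those benchmarks, Cronbach’s alpha can be sketched in a few lines. This is our own illustration, not part of the NASPA brief; the function name and the item scores are invented for the example:

```python
# A minimal sketch of Cronbach's coefficient alpha, assuming `scores` is a
# list of respondents, each a list of numeric answers to the scale's items
# (no missing data).
def cronbach_alpha(scores):
    k = len(scores[0])                        # number of items on the scale
    def var(xs):                              # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Three items that move together across respondents give a high alpha;
# the 0.70 / 0.80 benchmarks above apply to the returned value.
responses = [[1, 2, 1], [2, 2, 2], [3, 4, 3], [4, 4, 5], [5, 5, 4]]
print(round(cronbach_alpha(responses), 2))  # → 0.96
```

The redundancy recommended in item 2 above is exactly what drives alpha up: when items track one another, the variance of the total score outpaces the summed item variances.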

Page 4: Assess & Profess Summer 2014 Edition


The Office of Assessment & Research is pleased to announce open registration for our summer assessment training series.

Be inspired to reconnect to your love (or strong dislike) of data collection methods and procedures.

In four sessions, participants will learn techniques, gain assessment skills, and leave with a project in hand.

1. WHAT’S YOUR AQ? (Assessment Question) • Conducting Literature Reviews • Foundations for Data Collection • Introduction to Baseline, Duke Groups, and Assessment at Duke University

2. HOW WILL I KNOW? • Process Measures and Evidence-Based Planning

3. RENEWING YOUR DATA VOWS • Responsibilities of Data Collection • Incorporating Students in Data Collection • Utilizing Existing Data

4. MAINTAINING YOUR AQ • Project Critique • Completing the Assessment Cycle • Ways to Visualize and Share Findings

July 15, 17, 22, 29 | 9am – 12noon

Look on PD Portfolio to register by June 15th

Questions? [email protected]

Page 5: Assess & Profess Summer 2014 Edition


Need Some Training in Assessment?

Join us: July 15, 17, 22, 29, 9am – 12noon

Register via PD Portfolio by June 15th

Questions? [email protected]

Thinking Beyond the Box (Continued)

research method, and this technique stretches the validity of the data collected (Creswell, 2013). The data you gather can be complemented with an accompanying story and, coupled with the photo, will bring your research to life.

Reviewing Professional Evaluations, Executive Summaries and Committee Reports

After we develop a research question, we often do not know where to begin other than developing an assessment instrument. Sometimes slowing down before taking a deep dive into our assessment can be beneficial. Reviewing professional evaluations, executive summaries, and committee reports can help our teams build assessment instruments and ensure we are asking the right questions.

Professional evaluations can be useful as you begin to look at questions of job satisfaction, clarify professional roles, or review workplace morale. Before asking your staff, “Do you like your job?”, review previously completed evaluations. They may already be telling you.

Another effective way to develop a quality assessment is to review committee reports or executive summaries before you begin. With good record keeping, these documents are natural starting points for developing research questions; understanding the work that has already been completed will ensure you are moving forward and not revisiting issues that have been previously resolved.

In conclusion, there is no doubt that assessment is a critical component of the work we do. The data drives what we seek to provide for students, faculty, and university staff as we develop our programs and services. However, to prevent survey fatigue and the overuse of focus groups, we should consider other methods for collecting data. Doing so will not only enhance the data collected through these newer methods, but also make the data collected through traditional methods richer and more useful, since our participants will not feel they are completing “another survey.” Take the time. Get creative. Ask questions. And find more comprehensive ways to collect the information we need to have great programs as we continue to do great work!

References

Birnbaum, M., & Guido, F. M. (2011). 21st century data: Using photography as a method of student affairs assessment. Journal of Student Affairs, 20, 14-26.

Creswell, J. W. (2013). Qualitative inquiry & research design: Choosing among five approaches (3rd ed.). Thousand Oaks, CA: Sage Publications.

Page 6: Assess & Profess Summer 2014 Edition


The Cure for the Event Planning Headache: DukeGroups
By Kyle Fox & Jerrica Washington

Did I book the right space? Who do I talk to about setting up a sound system? What do I do with this contract? Event planning from the student perspective can be an intimidating and often confusing process. The administrative experience is equally complex as we track and monitor the events of hundreds of student organizations. In an effort to better serve our students and our colleagues, the DukeGroups Event Registration tool officially launched on August 20th, 2013.

In a survey of student users, we found that 89% of respondents report having their events approved within 7 days. The DukeGroups Event Registration system has allowed for more efficiency by streamlining and aggregating many processes that used to send students on a scavenger hunt for various signatures. The new system allows staff to respond faster, communicate more clearly, and see more pertinent details than prior methods allowed.

The DukeGroups Event Registration system has seen 767 successfully approved and registered events. The site has experienced a 19% increase in traffic; those who visit spend 51% more time on DukeGroups and view 35% more pages per visit than they did prior to launch. The additional visitors and the increased interaction with the site tell us that we are attracting more students and that those students are browsing more content for longer durations. Though the numbers tell one story, we have made an effort to find out what the end-user experience is like. Over the past six months we have solicited feedback from student users, beta testers, and colleagues. The following excerpts provide some guidance as we move forward with improvements and changes.

“I like the new system. I think that it was very easy to navigate because it did not introduce any new technology & all of the boxes were pretty self-explanatory. I appreciated that the system gave me tips & suggestions based on the categories that I put the event into. I also liked that I was directed to resources such as where to make reservation requests, how to rent equipment, and how to contact DUPD for security needs. This type of information would definitely be helpful in planning events & I do plan to use it in the future…” – Student User Feedback

Continued on page 7

Page 7: Assess & Profess Summer 2014 Edition

Assessment in a Culture of Connection (Continued)

we carve out the time and take the risk to be curious with one another. It requires that we collaborate and form coalitions with colleagues across the division and capitalize on one another’s unique skills and strengths. And, it requires committed action—which is much more likely when we support one another in the process.

How do we cultivate a culture of connection? Certainly, I can do my part to work toward moving into connection—with myself and with others—by taking time to connect with what is important to me. I can intentionally take in the beauty of our campus on a lunchtime walk, sing loudly off-key on my drive home, and laugh with my loved ones. I can work toward really listening and staying present in interactions with students and colleagues. (I can also be kind to myself when instead I am thinking about my “to-do” list for the day.) I can have open conversations with colleagues about our experiences on campus and our curiosity about how we can best foster growth in our community. And, I can participate in efforts to advocate for and to develop a culture of connection within my department and across our division that will support assessment efforts.

Duke Groups (Continued)

“We did not really use DukeGroups to plan the event, just to register. Given the fact that no one uses it outside of student groups, it was not a great avenue for promotion.” – Student User Feedback

“I liked how DukeGroups went through each segment of the event and gave contact information and next steps.” – Student User Feedback

“(We need a) checklist of some sort after submission to ensure that we completed each step and contacted all the right people.” – Student User Feedback

“I believe the system was very efficient, especially in comparison to other systems we have worked with. Additional feedback was also provided in a timely fashion.” – Student User Feedback

“More transparence of the approval process would be helpful.” – Student User Feedback

While students have been the primary users of the new system, staff have been behind the scenes to deliver the services requested via

Continued on page 8

DukeGroups Event Registration. Of the administrators surveyed, 87% agree and 13% strongly agree that the new system reduces the amount of time spent securing details from students and provides an effective tool for communicating details with other staff members. All staff surveyed report that DukeGroups Event Registration is an effective tool. DukeGroups Event Registration gives University administrators and advisors access to details and information that were not readily available in the past. However, we must be careful not to focus only on the positive growth in users, as the feedback gathered from students makes it clear that not all students see the value provided by the system. Moving forward, it will be important to communicate the value of the DukeGroups Event Registration system more effectively,

Page 8: Assess & Profess Summer 2014 Edition


Steps to Create Your Baseline Account

1. Provide a signed receipt of the research packet to [email protected]

2. Complete the two introductory “How to Use Baseline” webinars referenced in the packet.
a. How to Use Baseline: An Introduction
b. How to Use Baseline: Reporting Tools
• A schedule of upcoming webinars can be found at: http://www.campuslabs.com/support/training/
• Archived / pre-recorded webinars can be found in the “Video” section at: http://baselinesupport.campuslabs.com/home

3. Complete the university requirement for CITI Certification by logging on at: https://www.citiprogram.org
a. The requirement is the Foundational Module, and then one module per year afterwards.
b. Once the CITI basics course is completed, send a screenshot of course completion to [email protected]

4. Please send your NetID and Unique ID to [email protected] so that we can complete the process and set up your Baseline account.

Duke Groups (Continued)

as well as to accommodate requested user features such as additional checklists, the ability to save forms, and an enhanced event promotion feature. There is certainly room for improvement and growth in the event registration process within DukeGroups, and we will continue to assess the effectiveness of our efforts as we strive to meet our users’ needs.

If you have any questions, please let us know. Thanks!


Page 9: Assess & Profess Summer 2014 Edition


Sampling in Student Affairs: A Memo by C. Colgate Taylor


* When is it best to send a survey to everyone, and when will random sampling be just as effective? For example, if we did random sampling for the XYZ survey, would the results be just as reliable? Could our data still be compared with national data?

Short Answer | We should almost always employ sampling.

Basic Concept | By definition, a survey is a questionnaire designed to retrieve information from a sample of a population. A questionnaire administered to an entire population is a census.

Survey vs. Census | It is best to send a survey to the entire student population only when a census is needed.

Census Example: The Career Center wants to know who all of the employers of the Class of 2013 were.

Reason for Approach: One would not be able to predict the unknown data needed by Employer Relations staff. An even better approach would be to build the method for collecting this information into existing systems; perhaps it could be a simple field in the application for graduation? Always ask, “Where else could this information be observed?”

Survey Example: National XYZ Residential Life Experiences Survey, designed to learn more about residential experiences, satisfaction with RAs, enjoyment of residential life programs, etc.

Reason for Approach: Here, a sample would be just as effective as a census. The results would be sound, and we could still compare to national data. The sample size needed is generally based on:
1) the size of the whole population (N = 6,500);
2) how confident one needs to be of getting the “true” answer (usually 90% or 95%);
3) the expected response rate (for this example I’ll set it at 10%,* which is very low relative to our historical response rates); and
4) tolerance for a margin of error (this depends on how varied I expect the population to be on the questions: if I thought the population would be polarized, a low margin would be needed; otherwise, I can accept a larger one. For this example, I’ll set my confidence interval at 5%).

With this conservative formula, I would only need 363 respondents and would send the survey to 3,630 students.

Confidence Level: 95% | Confidence Interval: 5% | Population Size: 5,461 [1] | Estimated Response Rate: 10%

Sample Needed: 359 (to be 95% sure results are reliable, with only a 5% margin of sampling error); send the survey to 3,590 students. Still, to be 90% confident, we would only need 259 responses. [2]

[1] Number of students living on campus.

[2] This whole process can change depending on the type of analysis to be run after data collection. For this example, 260+ responses would be sufficient to generalize to the campus population, assuming we do not plan to conduct a path analysis or another multivariate technique, which would require many more cases.

* Historically atypical for projects not executed by AR.
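The memo’s arithmetic can be reproduced with Cochran’s sample-size formula plus a finite population correction. The sketch below is our own illustration (the function names are invented, not an office tool), but under those standard assumptions it matches the 359 and 363 figures above:

```python
import math

# Sample size for estimating a proportion, with the finite population
# correction. z = 1.96 for 95% confidence; p = 0.5 is the most
# conservative assumption about how varied responses will be.
def sample_size(population, z=1.96, margin=0.05, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # Cochran's n0
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # correct for N

def invitations(population, response_rate, **kwargs):
    # How many students to send the survey to, given the expected rate.
    return math.ceil(sample_size(population, **kwargs) / response_rate)

print(sample_size(5461), invitations(5461, 0.10))  # → 359 3590
print(sample_size(6500))                           # → 363
```

Note how little the required sample grows between N = 5,461 and N = 6,500: past a few thousand students, population size barely moves the answer, which is why sampling is so economical.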

Page 10: Assess & Profess Summer 2014 Edition


Office of Assessment and Research

Duke University | Division of Student Affairs

Contact: Cole Taylor, Assistant Director | [email protected] | 919.684.4286

For More Updates and News, Like Our Facebook Page

* Wait! If I sample, what about bias? What if I survey the wrong students?

Short Answer | If you send surveys to all students, you are still sampling; it is called voluntary response sampling. And the worst thing we do to contribute to bias is to send so many surveys to all students.

Longer Answer | With voluntary response sampling, students self-select which instruments to complete (and fewer of them over time). Respondents tend to be people who are polarized on the issues, which keeps us from capturing a good picture of the campus community. The best way to limit sampling bias is to create random (or stratified) samples that reflect the campus population.
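One way to build such a sample is to draw from each stratum (say, class year) in proportion to its share of the campus population. The sketch below is a hypothetical illustration with invented field names, not a description of any office system:

```python
import random

# Hypothetical sketch: draw a stratified random sample so the invited
# group mirrors the population's make-up on a chosen attribute.
def stratified_sample(students, stratum_of, n_total, seed=2014):
    rng = random.Random(seed)      # fixed seed makes the draw reproducible
    strata = {}
    for s in students:
        strata.setdefault(stratum_of(s), []).append(s)
    sample = []
    for members in strata.values():
        # proportional quota; rounding can leave the total a student or
        # two off when strata are unevenly sized
        quota = round(n_total * len(members) / len(students))
        sample.extend(rng.sample(members, min(quota, len(members))))
    return sample

# e.g. 100 invitations spread across four equally sized class years
campus = [{"netid": f"s{i}", "year": 2014 + i % 4} for i in range(400)]
invited = stratified_sample(campus, lambda s: s["year"], 100)
print(len(invited))  # → 100
```

Because every stratum is represented in proportion, no single polarized group can dominate the invitation list the way it can under voluntary response.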

* If I “sample,” do I need to offer big incentives? Do they create sampling bias? Are there certain kinds of surveys that would be particularly susceptible? Is bias more likely if the prizes are larger?

Short Answer | Good for long surveys; use caution with surveys of a sensitive nature; prizes can hurt the integrity of the findings.

My Philosophy around Incentives | The best incentive for someone to complete a survey is a real promise that their voice will be heard and that action will be taken following a good analysis. When altruism doesn’t work (and it usually does, over time), the other reasons people respond to surveys are interest in the topic at hand or compensation.

Incentives are good when a survey is long; caution should be taken with surveys of a sensitive nature; large prizes can create bias; and incentives have only modest impacts on response rates. To provide an incentive, surveys traditionally must collect identifying information in order to send prizes to winners. So, on some surveys, incentives might increase response rates but weaken the validity of the responses given.

*Are incentives more or less likely to cause sampling bias if the administration is to everyone or a sample or is there no difference?

Short Answer | The sample should be chosen with the purpose and target audience of the survey in mind.

Longer Answer | Right now, many surveys are being sent across campus, so I cannot predict what the responses would initially look like with sampling. It is a relationship that needs to be built with the student population over time. Efforts have been made to brand Student Affairs instruments in order to increase response rates. When a student sees a survey request from the division, we want them to know it will be: 1) short, 2) worth their time, and 3) respectful of their privacy. Building strong instruments, piloting instruments, and reducing the number of instruments are all surely more impactful than a raffle.

Questions? [email protected]