Evaluation of eLearning

DESCRIPTION
Moving beyond Level 1 and Level 2 evaluations: using usability methods to improve eLearning.

TRANSCRIPT
Evaluation of eLearning
Michael M. Grant, PhD
Michael M. Grant, 2010
Kirkpatrick’s Levels
Level 5 (ROI): the investment in the training compared to its relative benefits to the organization and/or productivity/revenue

Share of organizations evaluating at each level (ASTD, 2005):
- Level 1 (Reaction): 91.3%
- Level 2 (Learning): 53.9%
- Level 3 (Behavior): 22.9%
- Level 4 (Results): 7.6%
- Level 5 (ROI): 2.1%
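Level 5, Phillips's ROI addition to Kirkpatrick's four levels, reduces to a simple calculation: net program benefits divided by program costs, expressed as a percentage. A minimal sketch, with hypothetical dollar figures:

```python
def roi_percent(benefits: float, costs: float) -> float:
    """Phillips-style ROI: net benefits as a percentage of program costs."""
    return (benefits - costs) / costs * 100

# Hypothetical program: $50,000 in costs, $80,000 in measured benefits
print(roi_percent(80_000, 50_000))  # → 60.0
```

The hard part in practice is not the arithmetic but isolating and monetizing the benefits, which is why so few organizations evaluate at this level.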
Kirkpatrick (& Phillips) Model
[Chart: 92% and 17.9% (ASTD, 2009)]
FORMATIVE EVALUATION
What’s the purpose?
A focus on improvement during development.
Level 2 Evaluations
Appeal
Effectiveness
Efficiency
Data Collection Matrix

Questions:
1. What are the logistical requirements?
2. What are user reactions?
3. What are trainer reactions?
4. What are expert reactions?
5. What corrections must be made?
6. What enhancements can be made?

| Method | Questions addressed (of the 6 above) |
| --- | --- |
| Anecdotal records | 5 |
| User questionnaires | 4 |
| User interviews | 4 |
| User focus groups | 3 |
| Usability observations | 4 |
| Online data collection | 2 |
| Expert reviews | 3 |
“Vote early and often.”
The sooner formative evaluation is conducted during development, the more likely that substantive improvements will be made and costly errors avoided. (Reeves & Hedberg, 2003, p. 142)
“Experts are anyone with specialized knowledge that is relevant to the design of your ILE [interactive learning environment].”
(Reeves & Hedberg, 2003, p. 145)
Expert Review
Interface Review Guidelines
from http://it.coe.uga.edu/~treeves/edit8350/UIRF.html
USER REVIEW
Observations from one-on-ones and small groups
What Is Usability?
“The most common user action on a Web site is to flee.”
— Edward Tufte
“at least 90% of all commercial Web sites are overly difficult to use….the average outcome of Web usability studies is that test users fail when they try to perform a test task on the Web. Thus, when you try something new on the Web, the expected outcome is failure.”
— Jakob Nielsen
Nielsen’s Web Usability Rules
1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Help users recognize, diagnose, and recover from errors
9. Help and documentation
10. Aesthetic and minimalist design
Ease of learning - How fast can a user who has never seen the user interface before learn it sufficiently well to accomplish basic tasks?
Efficiency of use - Once an experienced user has learned to use the system, how fast can he or she accomplish tasks?
Memorability - If a user has used the system before, can he or she remember enough to use it effectively the next time or does the user have to start over again learning everything?
Error frequency and severity - How often do users make errors while using the system, how serious are these errors, and how do users recover from these errors?
Subjective satisfaction - How much does the user like using the system?
Two Major Methods to Evaluate Usability
Heuristic Evaluation Process
1. Several experts individually compare a product to a set of usability heuristics.
2. Violations of the heuristics are rated for their severity and extent, and solutions are suggested.
3. At a group meeting, violation reports are categorized and assigned average severity ratings.
4. The resulting report lists average severity ratings, extents, heuristics violated, and a description of the opportunity for improvement.
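Steps 2 through 4 amount to pooling severity ratings across evaluators. A minimal sketch of that aggregation, with hypothetical violation IDs and a 0–4 severity scale:

```python
from statistics import mean

# Hypothetical reports from three evaluators; each maps a violation ID
# to that evaluator's severity rating (0 = not a problem, 4 = catastrophe)
reports = [
    {"V1": 3, "V2": 1},
    {"V1": 4, "V2": 2, "V3": 1},
    {"V1": 3, "V3": 2},
]

def average_severity(reports):
    """Average each violation's severity across the evaluators who rated it."""
    ratings = {}
    for report in reports:
        for violation, severity in report.items():
            ratings.setdefault(violation, []).append(severity)
    return {v: mean(s) for v, s in sorted(ratings.items())}

print(average_severity(reports))
```

Sorting by average severity then gives the group meeting a ready-made priority order for assigning fixes.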
Heuristic Evaluation Comparisons

Advantages
- Quick: no need to find or schedule users
- Easy to review problem areas many times
- Inexpensive: no fancy equipment

Disadvantages
- Validity: no users involved
- Finds fewer problems (perhaps 40–60% fewer)
- Getting good experts
- Building consensus with experts
Heuristic Evaluation Report
USER TESTING
User Testing
People whose characteristics (or profiles) match those of the Web site’s target audience perform a sequence of typical tasks using the site.
Examines:
– Ease of learning
– Speed of task performance
– Error rates
– User satisfaction
– User retention over time
Image from (nz)dave at http://www.flickr.com/photos/nzdave/491411546/
Elements of User Testing
Define target users
Have users perform representative tasks
Observe users
Report results
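The observe and report steps usually reduce to a handful of summary metrics per task: completion rate, time on task, and error counts. A minimal sketch over hypothetical session records:

```python
from statistics import mean

# Hypothetical observation log: one record per user per task
sessions = [
    {"user": "P1", "task": "enroll", "seconds": 48, "errors": 1, "completed": True},
    {"user": "P2", "task": "enroll", "seconds": 95, "errors": 3, "completed": False},
    {"user": "P3", "task": "enroll", "seconds": 52, "errors": 0, "completed": True},
]

def summarize(records):
    """Completion rate, mean time on task, and mean error count."""
    return {
        "completion_rate": mean(1.0 if r["completed"] else 0.0 for r in records),
        "mean_seconds": mean(r["seconds"] for r in records),
        "mean_errors": mean(r["errors"] for r in records),
    }

print(summarize(sessions))
```

Numbers like these feed directly into the data analysis and recommendations sections of the report outlined below; the qualitative observations explain why the numbers came out that way.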
Why Multiple Evaluators?
Single evaluator achieves poor results
– Only finds about 35% of usability problems
– 5 evaluators find more than 75%
Why only 5 Users?
(Nielsen, 2000)
Reporting User Testing
Overall goals/objectives
Methodology
Target profile
Testing outline with test script
Specific task list to perform
Data analysis & results
Recommendations
RECENT METHODS FOR USER TESTING
10-Second Usability Test
1. Disable stylesheets
2. Check for the following:
   1. Semantic markup
   2. Logical organization
   3. Only images related to content appear
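With stylesheets off, what remains is the raw markup, so similar checks can be roughed out programmatically. A minimal sketch using Python's standard html.parser; the tag sets and the sample snippet are my own illustration, not part of the original test:

```python
from html.parser import HTMLParser

# Hypothetical split between structural/semantic tags and generic containers
SEMANTIC = {"h1", "h2", "h3", "p", "ul", "ol", "main", "nav",
            "header", "footer", "article", "section"}

class QuickAudit(HTMLParser):
    """Rough counts inspired by the stylesheet-off test: semantic vs.
    generic containers, plus images with no (or empty) alt text."""
    def __init__(self):
        super().__init__()
        self.semantic = 0
        self.generic = 0
        self.images_without_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag in SEMANTIC:
            self.semantic += 1
        elif tag in ("div", "span"):
            self.generic += 1
        if tag == "img" and not dict(attrs).get("alt"):
            self.images_without_alt += 1

audit = QuickAudit()
audit.feed("<main><h1>Course</h1><div><img src='decor.gif'></div></main>")
print(audit.semantic, audit.generic, audit.images_without_alt)  # → 2 1 1
```

This is only a heuristic: a page can pass these counts and still read illogically without styling, so the manual 10-second scan remains the point of the exercise.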
ALPHA, BETA & FIELD TESTING
Akin to prototyping
References & Acknowledgements
American Society for Training & Development. (2009). The value of evaluation: Making training evaluations more effective. Author.
Follett, A. (2009, October 9). 10 qualitative tools to improve your web site. Instant Shift. Retrieved March 18, 2010 from http://www.instantshift.com/2009/10/08/10-qualitative-tools-to-improve-your-website/
Nielsen, J. (2000, March 19). Why you only need to test with 5 users. Jakob Nielsen’s Alertbox. Retrieved from http://www.useit.com/alertbox/20000319.html
Reeves, T.C. (2004, December 9). Design research for advancing the integration of digital technologies into teaching and learning: Developing and evaluating educational interventions. Paper presented to the Columbia Center for New Media Teaching and Learning, New York, NY. Available at http://ccnmtl.columbia.edu/seminars/reeves/CCNMTLFormative.ppt
Reeves, T.C. & Hedberg, J.C. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.