
Post on 20-Dec-2015


Six Facets of Instructional Product Evaluation

[Diagram: the six evaluation facets (Review, Needs Assessment, Formative, Maintenance, Effectiveness, Impact) arranged around the development activities]

Development Activities

Product Conceptualization

Design

Development

Project Re-conceptualization

Implementation

Institutionalization

Evaluation Functions

Review

Needs Assessment

Formative Evaluation

Maintenance Evaluation

Effectiveness Evaluation

Impact Evaluation

EMCC Design Document

• Urban Science Course Environment

Dimensions of effective technology enhanced learning environments:

Task-Oriented, Challenging, Collaborative, Constructionist, Conversational, Responsive, Reflective, Formative

Task-Oriented

The tasks faculty set for students define the essence of the learning environment. If appropriate, tasks should be authentic rather than academic.

Academic ↔ Authentic

Task-Oriented Example

Students in online instructional design courses are tasked with designing interactive modules for real clients.

Challenging

The notion that interactive learning is easy should be dispelled. Learning is difficult, and students should not be spoon-fed simplified versions of their fields of study.

Simple ↔ Complex

Challenging Example

In a Master of Public Health program, students confront problems as complex and difficult as the ones they’ll face in the real world.

Collaborative

Web-based tools for group work and collaboration can prepare students for teamwork in 21st-century work environments.

Unsupported ↔ Integral

Collaborative Example

Art, dance, and music students are collaborating to produce online shows with digital versions of their works and performances for critique by international experts.

Constructionist

Faculty should engage students in creating original knowledge representations that can be shared, critiqued, and revised.

Replication ↔ Origination

Constructionist Example

Students in fields ranging from aero-engineering to zoo management are producing digital portfolios as integral components of their academic programs.

Conversational

Students must have ample time and secure spaces for in-depth discussions, debates, arguments, and other forms of conversation.

One-way ↔ Multi-faceted

Conversational Example

New knowledge and insight are being constructed in conversation spaces such as the e-learning forums found in Blackboard, WebCT, Desire2Learn, and other online learning platforms.

Responsive

In learning communities, both faculty and students have a mutual responsibility to respond quickly, accurately, and with respect.

Superficial ↔ Genuine

Responsive Example

This is an area where R&D is needed. Some universities are seeking to establish supportive online networks that will continue throughout a career, indeed throughout a life.

Reflective

Both faculty and learners must engage in deep reflection and metacognition. These are not instinctive activities, but they can be learned.

Shallow ↔ Deep

Reflective Example

Teacher preparation students are keeping electronic journals to reflect on the children they teach and on their roles as advocates for children.

Formative

Learning environments can be designed to allow students to develop prototype solutions over time rather than to find one right answer that someone else has defined.

Fixed Assessment ↔ Developmental

Formative Example

Faculty should engage their students in ongoing efforts to evaluate and refine their work related to authentic tasks to encourage lifelong learning.

[Chart: each of the eight dimensions (Task-Oriented, Challenging, Collaborative, Constructionist, Conversational, Responsive, Reflective, Formative) rated on a continuum from Traditional Course to Online Course]

Heuristic Review

What is usability?

• The concern with designing software applications that people find easy to use and personally empowering.

• Usable computer programs are logical, intuitive, and clear to the people who use them.

Web Site Usability

• “The most common user action on a Web site is to flee.” (Edward Tufte)

• “At least 90% of all commercial Web sites are overly difficult to use… the average outcome of Web usability studies is that test users fail when they try to perform a test task on the Web. Thus, when you try something new on the Web, the expected outcome is failure.” (Jakob Nielsen)

Typical Web Usability Problems

• bloated page design
• internally focused design
• obscure site structures
• lack of navigation support
• writing style optimized for print

Jakob Nielsen, http://www.useit.com/

Key Usability Principles

• Structure - organize meaningfully

• Simplicity - make common tasks easy

• Visibility - all data needed for a task

• Feedback - keep users informed

• Tolerance - allow cancel, back

• Reuse - reduce the users' need to remember

Nielsen’s Web Usability Rules

• Visibility of system status
• Match between system and real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Help users recognize, diagnose, and recover from errors
• Help and documentation
• Aesthetic and minimalist design

Two Major Ways to Evaluate Usability

• Heuristic Review
– quick and relatively inexpensive
– based on expert analyses
– no user involvement

• Usability Testing
– finds more problems
– user involvement increases validity
– when designers see problems live, it has a huge impact

Heuristic Review

Several experts individually compare a product to a set of usability heuristics.

– Typical heuristic:
• Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

Heuristic Review

Violations of the heuristics are evaluated for their severity and extent.

Severity Scale:
1 Cosmetic: fix if possible.
2 Minor: fixing this should be given low priority.
3 Medium: fixing this should be given medium priority.
4 Major: fixing this should be mandatory before the system is launched. If the problem cannot be fixed before launch, ensure that the documentation clearly shows the user a workaround.
5 Catastrophic: fixing this is mandatory; no workaround possible.

Extensiveness Scale:
1 Single case
2 Several places
3 Widespread

Heuristic Review

At a group meeting, violation reports are categorized and assigned:
• Heuristics violated are identified.
• Average severity and extensiveness ratings are compiled.
• Opportunities for improvement are clarified.
• Feasible solutions are recommended.

Heuristic Review

• Example of an Opportunity for Improvement

Opportunity 1 (4 reports; Avg. Severity = 2.25; Avg. Extent = 2.34; Heuristics Used: 1, 3)

Consider providing more user feedback about where they are and what they should do next. Examples cited:
– No page progress indicator
– No indication of how to start

Suggestions:
– Provide a page-progress indicator, such as “page 3 of 12”
– Put a “Click a section below to start:” on the first screen, as a TOC header
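The compilation step above (grouping individual violation reports and averaging their severity and extent ratings) can be sketched in a few lines of Python. The report fields and data values below are invented for illustration; they are not taken from an actual review.

```python
from statistics import mean

# Hypothetical violation reports filed by individual expert reviewers.
# Severity uses the 1-5 scale and extent the 1-3 scale described above;
# the specific problems and numbers are invented for this sketch.
reports = [
    {"problem": "no page progress indicator", "severity": 2, "extent": 3, "heuristic": 1},
    {"problem": "no indication of how to start", "severity": 3, "extent": 2, "heuristic": 1},
    {"problem": "no indication of how to start", "severity": 2, "extent": 2, "heuristic": 3},
    {"problem": "no page progress indicator", "severity": 2, "extent": 2, "heuristic": 3},
]

def summarize(reports):
    """Compile the figures a group meeting would report for one opportunity."""
    return {
        "reports": len(reports),
        "avg_severity": round(mean(r["severity"] for r in reports), 2),
        "avg_extent": round(mean(r["extent"] for r in reports), 2),
        "heuristics_used": sorted({r["heuristic"] for r in reports}),
    }

print(summarize(reports))
```

The summary dictionary mirrors the header of an opportunity report: number of reports, average severity, average extent, and the heuristics violated.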

Heuristic Review

• Advantages
– Quick: do not need to find or schedule users
– Easy to review problem areas many times
– Inexpensive: no fancy equipment needed

• Disadvantages
– Validity: no users involved
– Finds fewer problems (50% fewer in some cases)
– Getting good experts can be challenging
– Building consensus with experts is sometimes difficult

Another Weakness

• Some people believe that heuristic evaluation is too subjective.

• Human judges are prone to poor judgment at times.

Usability Standards
http://www.astd.org/ASTD/marketplace/ecc/ecc_home

ASTD offers certification of e-learning courses, including 8 usability standards:
• Navigation
• Orientation
• Feedback cues
• Link cues
• Links labeled
• Help
• Legibility
• Text quality

Heuristics for E-Learning Evaluation

1. Visibility of system status: The e-learning program keeps the learner informed about what is happening, through appropriate feedback within reasonable time.
• red for a problem
• yellow for a warning
• green for OK
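As a minimal sketch of the color convention above: only the red/yellow/green mapping comes from the text; the status names themselves are invented for illustration.

```python
# Map system-status categories to the feedback colors named above.
# The status keys are invented names; only the colors come from the text.
STATUS_COLORS = {"problem": "red", "warning": "yellow", "ok": "green"}

def status_color(status: str) -> str:
    """Return the feedback color for a status, treating unknowns as warnings."""
    return STATUS_COLORS.get(status, "yellow")

print(status_color("ok"))
```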

2. Match between system and the real world: The e-learning program’s interface employs words, phrases, and concepts familiar to the learner or appropriate to the content, as opposed to system-oriented terms. Wherever possible, the e-learning program utilizes real-world conventions that make information appear in a natural and logical order.

3. Error recovery and exiting: The e-learning program allows the learner to recover from input mistakes and provides a clearly marked “exit” to leave the program without having to go through an extended dialogue.

4. Consistency and standards: When appropriate to the content and target audience, the e-learning program adheres to general software conventions and is consistent in its use of different words, situations, or actions.

5. Error prevention: The e-learning program is carefully designed to prevent common problems from occurring in the first place.

6. Navigation support: The e-learning program makes objects, actions, and options visible so that the user does not have to remember information when navigating from one part of the program to another. Instructions for use of the program are always visible or easily retrievable.

7. Aesthetics: Screen displays do not contain information that is irrelevant, and “bells and whistles” are not gratuitously added to the e-learning program.

8. Help and documentation: The e-learning program provides help and documentation that is readily accessible to the user when necessary. The help provides specific concrete steps for the user to follow. All documentation is written clearly and succinctly.

9. Interactivity: The e-learning program provides content-related interactions and tasks that support meaningful learning.

10. Message design: The e-learning program presents information in accord with sound principles of information-processing theory.

11. Learning design: The interactions in the e-learning program have been designed in accord with sound principles of learning theory.

12. Media integration: The inclusion of media in the e-learning program serves clear pedagogical and/or motivational purposes.

13. Instructional assessment: The e-learning program provides assessment opportunities that are aligned with the program objectives and content.

14. Resources: The e-learning program provides access to all the resources necessary to support effective learning.
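One way to put the heuristics above to work is as a simple review checklist. The sketch below uses an invented 1 (badly violated) to 5 (fully satisfied) rating convention to flag the heuristics a reviewer scored poorly; the labels are abbreviated from the text.

```python
# The fourteen e-learning heuristics above, abbreviated to short labels.
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "Error recovery and exiting",
    "Consistency and standards",
    "Error prevention",
    "Navigation support",
    "Aesthetics",
    "Help and documentation",
    "Interactivity",
    "Message design",
    "Learning design",
    "Media integration",
    "Instructional assessment",
    "Resources",
]

def flagged(ratings, threshold=3):
    """Return heuristics rated below the threshold: candidates for repair.

    `ratings` holds one score per heuristic; the 1-5 scoring convention
    is an assumption of this sketch, not part of the original heuristics.
    """
    return [name for name, score in zip(HEURISTICS, ratings) if score < threshold]
```

A reviewer who rated everything 5 except navigation support (rated 2) would get back just that one heuristic to fix.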

Review

• The purpose of review is to ensure that the development team is well-informed about previous work done in the area during the early stages of product conceptualization.

• Designers must avoid reinventing the wheel.


• The two primary methods used are reviewing the related literature and reviewing competing products.

• Regularly reviewing competing products is a great professional development practice.

“I can do better than this!”

Needs Assessment

• The purpose of needs assessment is to identify the critical needs that an instructional product is supposed to meet.

• Needs assessment provides essential information to guide the design phase of the development process.


• The primary methods are:
– task analysis,
– job analysis, and
– learner analysis.

• One of the most important results is a list of specific goals and objectives that learners will accomplish with the new product.

Formative Evaluation

• The purpose is to collect information that can be used for making decisions about improving interactive learning products.

• Formative evaluation starts with the earliest stages of planning and continues through implementation.

• Provided the results are used, formative evaluation usually provides the biggest payoff of all evaluation activities.

• Clients may be reluctant to accept the results of formative evaluation, especially as a program nears completion.

Effectiveness Evaluation

• The purpose is to estimate short-term effectiveness in meeting objectives.

• It is a necessary, but insufficient, approach to determining the outcomes of interactive learning.

• Evaluating implementation is as important as evaluating outcomes.

• If you don’t understand how instructional products were actually implemented, you can’t interpret results.

“A connection with the server could not be established?”

Impact Evaluation

• The purpose is to estimate the long-term impact on performance, both intended and unintended.

• It is extremely difficult to evaluate the impact of interactive learning products.

• Evaluating impact is increasingly critical because of the emphasis on the bottom line.

• More and more clients expect impact evaluation to include “return-on-investment” (ROI) approaches.

Maintenance Evaluation

• The purpose of maintenance evaluation is to ensure the viability of an interactive product over time.

• Maintenance is one of the weakest links of web-based learning environments.

• Document analysis, interviews, observations, and automated data collection are among the methods used in maintenance evaluation.

• Very few education and training agencies engage in serious maintenance evaluation.

Planning is the key to successful instructional product evaluation.

• Evaluation requires good planning, careful implementation, and systematic follow-up.

• A major challenge is getting clients to identify the decisions they face.

• Clear decisions drive the rest of the planning.

Heuristics for E-Learning Evaluation

15. Feedback: The e-learning program provides feedback that is contextual and relevant to the problem or task in which the learner is engaged.

