Evaluation Planning
General Overview
Evaluation Criteria | Explanation | Data Sources |
---|---|---|
Effectiveness: mastery of goals and success of WBI | Determine if learners are able to confidently identify the core meaning of a passage of scripture and apply it to their lives | Participant learning products |
Appeal: gain and maintain learner attention and interest; usability | Review user interface for clarity and ease of use; review content for interest; review media use for interest and ease of use | Participant survey on course content, ease of use, and interest |
Efficiency: delivered in a time-efficient manner | Investigate learner time requirements vs. learner interest and learning gains | Document learners' time on WBI activities; Participant survey; Participant learning products |
Evaluation Matrix
Evaluation Criteria and Categories | Specific Questions | Methods and Tools |
---|---|---|
Effectiveness | | |
Goals | Is the course goal accurate? Are the goals and objectives clear and achievable? | Expert review (SME, ID); End-user review; Participant survey; Course data (reflections, practice exercises, assessments, etc.) |
Content | Is the content coverage complete and accurate? Is there a match among content, objectives, activities, and assessment tools? Do the instructional activities promote learning? Do they promote thoughtful and reflective responses? | Expert review (SME, ID); End-user review; Course data (reflections, practice exercises, assessments, etc.) |
Technology | Do the technology applications function properly? Are there any violations of copyright or intellectual property? | Expert review (technical, ID); End-user review; Participant survey |
Message Design | Do supporting features enhance learning without creating distractions? Is the appropriate voice used in expressing the content to learners? Are directions clear? Is the time frame of the course appropriate? | Expert review (SME, ID); End-user review; Participant survey; Course data (reflections, practice exercises, assessments, etc.) |
Appeal | | |
Goals | Are the goals relevant to learners? | Expert review (SME, ID); End-user review; Participant survey |
Content | Is the content interesting and engaging? Do learners enjoy studying the content? | Expert review (SME, ID); End-user review; Participant survey |
Technology | Is the content free of typographical, spelling, grammar, and punctuation errors? Are coding errors minimal? Is navigation easy? | Expert review (SME, ID); End-user review |
Message Design | Are the message and the media pleasing? Are the writing level and tone appropriate for the content and the audience? Are the screens uncluttered, and do they use white space effectively? Are color, typeface, and emphasis used appropriately? Do supporting graphics and features enhance learning without creating distractions? Is the navigational design effective? | Expert review (SME, ID); End-user review |
Efficiency | | |
Goals | Are the purpose and goals stated clearly and concisely? Is there congruence among the goals, lesson objectives, and content? | Expert review (SME, ID) |
Content | Is the content clearly and concisely presented? | Expert review (SME, ID); End-user review; Participant survey |
Technology | Are materials easy for students to access? Are the materials easy for the instructor to modify? Do features work smoothly and properly? | Expert review (ID); End-user review; Participant survey |
Message Design | Are the organization and structure of the content coherent and clear? Is the time frame for the content appropriate? | Expert review (SME, ID); End-user review; Participant survey |
Stakeholders
Primary Stakeholders
- Instructional Designer/Instructor: Doug Wolfe. Doug is responsible for designing the course, developing the evaluation plans, and executing both the formative and summative evaluations. Doug will also serve as the initial course instructor.
- Subject Matter Experts (SMEs): Dave S. and Doug Wolfe. The SMEs are responsible for reviewing course content for accuracy. They will have an important role in the formative evaluation.
- End-Users (Learners). Learners will be directly impacted by the quality of the course, and their participation (or lack thereof) will determine whether this course and similar courses will be offered in the future. Select end-users will participate in a one-to-one tryout review of the course or a field trial. All end-users will be asked to complete a survey at the conclusion of the course.
Secondary Stakeholders
- Session (Board of Elders). The Covenant Presbyterian Church Session has authority over the adult education program of the church. They will give final approval to the course.
What Is Being Evaluated?
Materials to be examined:
- Design plans
- Prototype and website
Evaluators and Reviewers
- Evaluator/Instructional Designer/Instructor/Technology Reviewer: Doug Wolfe will serve as the main evaluator. Doug has 26 years of experience in instructional design (both technology and print products) and has completed seven courses toward a Master's in Educational Technology degree at Boise State University, including a class in evaluation methods. He has also taught adult education classes at his church and led many small-group Bible studies for adults.
- Expert reviewer (subject matter): Dave S. is a retired high school teacher (language and literature) and an elder at Covenant Presbyterian Church. He has extensive knowledge and experience with Bible study and Christian adult education. In addition, he co-wrote a major adult education curriculum on discipleship (including a module on Bible study) for the church.
- Peer reviewers: Fellow students in EdTech 512 will review the course and provide feedback.
- EdTech 512 Instructor: Glori Hinck will review the course as part of the assessment in EdTech 512.
- End-User reviewers: Members of the target audience will be asked to participate in end-user reviews during formative evaluation. One or two will be asked to review the design plans. Several will be asked to participate in a field trial.
Formative Evaluation Plan and Timing
The course designer and SME will review the course Learning Task Map, objectives, sample assessment items, and development plans prior to full-scale course development. (Note: due to the short development schedule, SME review may run concurrently with some of the course development.) The course designer and SME will also review the course lessons and assignments, motivational strategies, and assessment items as they are completed by the designer.
Fellow students in EdTech 512, along with the EdTech 512 course instructor, will review both the course planning documents and the course materials. They will check for alignment of the course with the planning documents, sound instructional design, appropriate graphic design, ease of navigation, clarity of directions, and fulfillment of EdTech 512 course requirements.
One or two end-users will be asked to participate in a one-to-one tryout, reviewing the navigation, the clarity of directions, and the appropriateness of the content and activities.
Several end-users will be asked to participate in a field trial of the course prior to full roll-out. The purpose of the field trial is to identify any needed adjustments to the course and any remaining errors.
Due to the relatively small target audience and the very short development schedule, as well as the desire to offer the class in the fall, a small-group tryout will not be used. The combination of the one-to-one tryout and the field test should be sufficient to gather the needed data.
Summative Evaluation Pre-Planning
Evaluation Criteria | Main Questions | Data Sources |
---|---|---|
Effectiveness | How well does the course meet the main objectives? Do learners successfully complete the course? | Participant learning products; Instructional team interviews; Participant surveys; LMS records |
Appeal | Do learners believe this is an effective means of providing Christian adult education? Are learners interested in taking more online Christian adult education courses? | Participant surveys |
Efficiency | How much time did learners spend on learning activities? How do designer and instructor time demands compare with traditional adult education? | LMS time-on-task data; Participant survey; Instructional team interviews |