Week Five

This week, you and your group members will develop a plan for evaluation to be presented in your curriculum design proposal. For the purposes of your curriculum design proposal, you will focus on developing a plan for summative evaluation. Consider the procedures for summative evaluation described in your learning resources. What might the goals of evaluation be for your group’s curriculum design? What should the indicators of success be, and what would be the best orientation and design for the evaluation? Return to your group wiki and review the evaluation plans posted by your group members. How do your ideas compare with those of your colleagues? Do some ideas overlap? Do the ideas of your colleagues cause you to have a different perspective? It is the responsibility of the Week 5 Facilitator to review each group member’s suggestion and create a final plan for evaluation that reflects the majority opinion.
**Group Project 1: Developing a Plan for Evaluation**
 * //By Monday//, the Week 5 Facilitator should create a page in the group wiki designated for "Evaluation."
 * //By Tuesday//, post to your group wiki a description of the evaluation plan you would recommend implementing in conjunction with your curriculum design proposal. Include a description of the indicators of success, the orientation of the evaluation, and how the evaluation would be conducted to gather the necessary data.
 * //By Wednesday//, each group member should post his or her suggestion for the group's final plan for evaluation. This suggestion should reflect a combination of what you consider to be the best ideas from all group members.
 * //By Thursday//, the Week 5 Facilitator should post the final plan for evaluation to the group wiki. The plan should include a description of the indicators of success, the orientation of the evaluation, and at least one specific example of how the evaluation would be conducted to gather the necessary data. Group members should visit the wiki, review the plan, and use the wiki or other means of communication to resolve any disagreements. In addition, the Facilitator should post the URL of this wiki content to the Week 5 area of the Group Project discussion board.

**Alexander**

**Molly** I suggest that we provide our client with a thorough formative evaluation to ensure a quality program. We should assist our client with creating a "review team" that consists of key players, including managers as well as volunteers/learners. During the design phase, we will conduct regular meetings with the review team, which will serve as our small-group evaluation and allow us to conduct field trials.

Once the training course has been implemented, we will continue with a summative evaluation. After all, we will want data showing that we created and implemented a successful training course, which we can use in our future advertising. Goals include:
 * Did the learners achieve the goals of the instruction? (Did learners pass the assessments accurately? Obtain this information through the museum's LMS.)
 * What is the return on investment of the instruction?
 * Have patron donations increased as a result of the training? (Obtained via the client's bookkeeping/reporting.)
 * Has volunteer turnover been reduced? (Obtained via the client's records. Perhaps volunteers keep quitting right now because they are frustrated with not having answers or knowing where to find them; once trained, they may be more relaxed and enjoy their work, reducing the turnover rate.)

We will obtain the information needed for our summative evaluation by keeping in contact with our museum contact person. If allowed, perhaps we can gain access to their LMS and reporting system so we can run reports ourselves. Better yet, we could sell the LMS and reporting software in conjunction with our training; that way, we are not entirely reliant upon the client to provide the information needed to conduct a thorough evaluation.

**Final Plan combining Cheri's and Molly's suggestions:** We will assist the client with the implementation of a small-group review team for formative evaluation. The small-group review will address the following questions:
 * Do the learners have the anticipated entry-level skills?
 * If so, did they succeed in the instruction? If they didn't succeed, what revisions are needed?
 * If they did not have predicted entry-level skills, did they succeed in the instruction?
 * If they did not succeed, what skills were they lacking?
 * Did the learners have additional skills that were not predicted?
 * How long does it take for the learners to complete the instruction?
 * How do the learners feel about the instruction?
 * If their feelings are negative, how do these feelings affect their performance?
 * What revisions are necessary to improve attitudes toward the instruction?
 * Are the revisions made as the result of one-to-one evaluations satisfactory?

Once the training course has been implemented, we will continue with a summative evaluation. Goals include:
 * Did the learners achieve the goals of the instruction? (Did learners pass the assessments accurately? Obtain this information through the museum's LMS.)
 * What is the return on investment of the instruction?
 * Have patron donations increased as a result of the training? (Obtained via the client's bookkeeping/reporting.)
 * Has volunteer turnover been reduced? (Obtained via the client's records. Perhaps volunteers keep quitting right now because they are frustrated with not having answers or knowing where to find them; once trained, they may be more relaxed and enjoy their work, reducing the turnover rate.)

We will obtain the information needed for our summative evaluation by keeping in contact with our museum contact person. If allowed, perhaps we can gain access to their LMS and reporting system so we can run reports ourselves. Better yet, we could sell the LMS and reporting software in conjunction with our training; that way, we are not entirely reliant upon the client to provide the information needed to conduct a thorough evaluation.

**Maria** Reading all of our posts in this week's discussions, I can see how we all feel about evaluation and its importance. I believe in a process very close to what Molly is suggesting: initially, work with people from the museum who can help us determine the content and its effectiveness as we design and develop the learning experience. This would allow us to make tweaks to the content and the learning experience to ensure that the learning and performance objectives are accurately met.

As you saw in the discussion, I am a big fan of field trials, and I hope we have the chance to pilot the learning experience to see whether it moves the needle on our objectives and whether learners retain the information presented in our lessons and courses. This evaluation would ask the following questions:
 * Can the instruction be implemented as it was designed?
 * What types of administration problems are encountered?
 * Does the teacher / trainer guide present the needed information in a form that can be easily used?
 * Do the learners have the expected entry-level skills?
 * Can the learners attain the objectives of the instruction?
 * Are the time estimates for completion of the instruction accurate?
 * How do the learners feel about the instruction?
 * Are the revisions made as a result of small-group evaluations effective?
 * How do the teachers / trainers feel about the instruction?
 * Do teachers and learners implement the instruction as it was designed?
 * What changes or adaptations do teachers / trainers make in the instruction?

I am not saying that I would change anything that you guys already have. I agree with both the formative and summative evaluation. I just wanted to throw my few cents in :)...

**Michael** As I reviewed the Week 5 comments in the wiki, there was discussion regarding including information on formative evaluation. As the instructions stated, "For the purposes of your curriculum design proposal, you will focus on developing a plan for summative evaluation," it is not my intent to include anything on formative evaluations. If you strongly disagree with this direction, please let me know. P.S. Molly, thank you for sending me your slides; they look great.

**Cheri**
 * I agree with Molly's statements above for our overall evaluation and the "value added" information to the client. I also think we need to "test" our instructional materials with current volunteers and solicit volunteer feedback via surveys and small-group interactions. We should collect and provide all of this data to the client so they can evaluate the effectiveness of the program. Molly, I like your idea about comparing volunteer turnover rates before and after the training. The small group should address the following questions:

 * Do the learners have the anticipated entry-level skills?
 * If so, did they succeed in the instruction? If they didn't succeed, what revisions are needed?
 * If they did not have predicted entry-level skills, did they succeed in the instruction?
 * If they did not succeed, what skills were they lacking?
 * Did the learners have additional skills that were not predicted?
 * How long does it take for the learners to complete the instruction?
 * How do the learners feel about the instruction?
 * If their feelings are negative, how do these feelings affect their performance?
 * What revisions are necessary to improve attitudes toward the instruction?
 * Are the revisions made as the result of one-to-one evaluations satisfactory?

(//Instructional Design//, 3rd ed., John Wiley & Sons, p. 332)

**Final Plan:** We will assist the client with the implementation of a small-group review team for formative evaluation. The small-group review will address the following questions:


 * Do the learners have the anticipated entry-level skills?
 * If so, did they succeed in the instruction? If they didn't succeed, what revisions are needed?
 * If they did not have predicted entry-level skills, did they succeed in the instruction?
 * If they did not succeed, what skills were they lacking?
 * Did the learners have additional skills that were not predicted?
 * How long does it take for the learners to complete the instruction?
 * How do the learners feel about the instruction?
 * If their feelings are negative, how do these feelings affect their performance?
 * What revisions are necessary to improve attitudes toward the instruction?
 * Are the revisions made as the result of one-to-one evaluations satisfactory?

Once the training course has been implemented, we will continue with a summative evaluation. Goals include:


 * Did the learners achieve the goals of the instruction? (Did learners pass the assessments accurately? Obtain this information through the museum's LMS.)
 * What is the return on investment of the instruction?
 * Have patron donations increased as a result of the training? (Obtained via the client's bookkeeping/reporting.)
 * Has volunteer turnover been reduced? (Obtained via the client's records. Perhaps volunteers keep quitting right now because they are frustrated with not having answers or knowing where to find them; once trained, they may be more relaxed and enjoy their work, reducing the turnover rate.)

We will obtain the information needed for our summative evaluation by keeping in contact with our museum contact person. If allowed, perhaps we can gain access to their LMS and reporting system so we can run reports ourselves. Better yet, we could sell the LMS and reporting software in conjunction with our training; that way, we are not entirely reliant upon the client to provide the information needed to conduct a thorough evaluation.