The purpose of this assignment was for students to create formative and summative evaluation plans for their courses. We were instructed to include information about how and when we would conduct each phase of evaluation, examples of specific questions we would ask, and how we think our course would be revised based on the information obtained through our evaluation.

For my report, I described plans for evaluating instruction developed as part of the redesign of FSHN 350. By planning and incorporating evaluation from the start, and continuously auditing and revising instructional materials throughout the redesign process, we will increase the likelihood our instructional objectives are met.

Formative and summative evaluations are described below.

Formative Evaluation

Formative evaluation will consist of evaluating and improving the efficacy of the instructional materials: identifying weaknesses, selecting alternatives, and/or making revisions until the materials are maximally effective and efficient. It is especially important that we perform a formative evaluation of FSHN 350 now, as new instructors and instructional designers are developing the course and integrating technologies that have not previously been used for it. Many of these features may also eventually be used in the development of the first online version of the course. Additionally, the timing for evaluating and revising the course is good, as TILT funding and assistance are available and the political will is high.

The evaluation process described will begin during planning stages, e.g. pretesting designs/media before they are fully developed and diffused, and will continue throughout development and delivery of instruction via the following stages/methods:

Prior to the development of instruction, the design and instructional materials must be reviewed to confirm the instructional goals reflect the student learning needs identified in task, environment, and learner analyses. For example, one of the student learning needs identified prior to beginning the course redesign was that students should be able to learn and retain important foundational concepts and apply them in future courses. Therefore, formative evaluation during design reviews should determine whether materials would promote this understanding and provide students with opportunities to apply it. This process occurred somewhat informally at the beginning of this semester through discussions with faculty, but it should be continued. Specific examples of questions asked during this stage would include the following:

  • Are course materials developed at a level that reflects students’ knowledge and skills, based on prerequisite coursework or pretesting?
  • Are course learning objectives and outcomes clearly stated and measurable?
  • Do lecture objectives promote understanding of larger course objectives?
  • Do objectives promote learning at all levels of Bloom’s taxonomy?
  • Do section activities promote objective-based learning outcomes?
  • Do section activities provide students opportunities to apply their learning?

Reflection. If the answer to any of these questions is “no,” the material should be revised prior to its diffusion. Review of instruction should also be an ongoing process, and materials should be revised if the profile of students taking the class changes over time and/or if the environment changes. For example, if an additional prerequisite course such as biochemistry is added, course materials should be reevaluated and revised accordingly.

Expert review will consist of asking course instructors to review the instructional materials (such as those submitted for the “content” and “learning assessment” assignments for EDAE 639) in draft form, to confirm the content is accurate, complete, consistent, up-to-date, and congruent with course and lecture objectives (which the experts will also need to review). Additionally, because we may be adopting a new textbook, we will need to ensure materials are also consistent with the content as it is covered in the book. Specific questions would include the following:

  • Is content in course outlines accurate, up-to-date, and congruent with other materials?
  • Do outlines cover the range of material learners must understand for section objectives?
  • Do learning activities and assessments match learning objectives for this section?

Reflection. If the answer to any of the questions is “no,” materials should be revised prior to diffusion.

One-to-one and small-group evaluation would consist of pretesting instruction with representative learners. Individuals and small groups of students would review the materials and provide feedback on any problems or difficulties and/or make suggestions for improvement. Students participating in student-led study groups may be interested in taking part in this process. These groups will be implemented for the first time next semester, and the number of students per group may vary. They will likely include current students in the course at different levels of understanding (below average, average, and above average) as well as student leaders who have already completed the course in a previous semester. Students would use the course redesign materials as study aids and then provide feedback, either orally or via a written survey. Specific questions would include the following:

  • Did supplementary content outlines match content taught in class?
  • Did using the content outlines improve your understanding of the material covered in class?
  • Did reviewing the outlines help you answer quiz and exam questions for this section?
  • Did you like the figures and graphics in this section compared to others you have seen?
  • How do you feel about the practice exercises? Were they difficult? Were they time-consuming?
  • Did completing practice exercises improve your understanding of the content for this section?
  • How difficult were practice questions compared with quiz and test questions in this section?
  • Do you think completing practice exercises for each section would improve your grade?
  • Do you think most students would complete these exercises if points were NOT provided?

Reflection. Additional feedback would be obtained from students through informal one-on-one and small-group testing and conversations, and materials would be revised. After revision, the materials could be implemented in the class and field tested on an ongoing basis with the larger audience. Although it may seem obvious that adding more course activities would be beneficial simply by increasing time on task, it is not clear how they should be implemented in this course in particular (e.g., should they be optional, or required as part of the grade?).

Once the innovations are diffused in the course (~75 students per section, two sections per semester), additional evaluations would be performed to measure their effectiveness and to identify any problems related to implementation. For example, the following questions may be directed to instructors:

  • Did the updated course outlines and graphics make it easier to teach the section?
  • Did you use section objectives to guide your lectures?
  • Did having section objectives necessitate changing the amount of content you covered?
  • Did you refer to objectives or practice exercises directly during lecture?
  • Did you find the changes beneficial?

Other questions may be directed to students in the class to determine how they feel about the objectives, outlines, or practice problems, similar to the questions asked of students during pretesting. If use of the materials is optional, we may ask students whether, how, and to what extent they are using them. For example, are students writing out answers to the objectives? Are they completing exercises in full or only partially?

Reflection. Student and instructor feedback would be used to revise materials, to add additional innovations, and to develop additional survey questions for continuous improvement of the course.

Summative Evaluation

After Instruction

Summative evaluations (collecting, analyzing, and summarizing data) will occur after instruction has been implemented to determine whether the instruction was effective in solving the problems it was intended to solve. If we meet the goals of the course redesign for FSHN 350, effectiveness would mean improving students’ understanding of key concepts and their ability to apply them in upper-level courses and careers. It would also mean fewer students earning C-F grades in the course. More specifically, we hope to determine whether particular innovations developed as part of the course redesign independently impact learning. Thus, each innovation will likely be diffused individually, one innovation per semester, keeping all other variables as consistent as possible, including each instructor.

For this assignment, only one innovation (lecture-level objectives) will be discussed. These are the detailed learning objectives that will be provided to students prior to each lecture topic taught in the course. If implemented as planned, the objectives will be used to help guide the development of lecture outlines, slides for class lectures, and assessments (quizzes and exams). Thus, the purpose of this summative evaluation is to determine whether implementing detailed, lecture-based objectives improves student learning, as assessed by the current assessment methods (quizzes and exams).

The design will be a case-comparison study. Assessment data from case sections (those receiving objectives) will be compared with assessment data from comparison sections, from previous semesters, prior to development of the objectives. The topics covered in case and comparison sections will essentially be the same, as will the assessments. However, in case sections, students will receive the objectives and (if implemented as planned) instructors will also have reviewed the objectives prior to classes and will likely deliver lecture material with them in mind. Evaluation will include a combination of objective and subjective questions. The instructor and I, using basic survey methods, will collect data during and after the course. Examples of evaluation questions and reflections on how responses to the questions may affect future planning are below.


  • What percent of objectives were included in instructional materials as planned?
    • Ideally, 100% of objectives will be provided to students. However, if they are not, it will be difficult to draw conclusions from the evaluation.
  • How do the case and comparison sections differ as far as grades (mean and distribution)?
    • My hypothesis is that including detailed lecture objectives will guide student studying and their performance will improve. If grades do improve following the use of the objectives, we would recommend including them in the course in future semesters, as well as continuing to refine them and develop additional exercises based on them.
  • How much time did instructors spend planning case and comparison lectures, respectively?
    • My hypothesis is that instructors will initially spend the same amount of time planning lectures, and over time the objectives should allow them to spend less time planning because they will help clarify which content is most important to cover.
  • What was the financial cost of the course redesign per student and per measure of performance improvement (e.g., in grades, if any), and how do any improvements compare with improvements attributed to other innovations of similar cost in other courses?
    • The goal of the redesign is to implement innovations that improve student learning and performance while, in the long term, reducing the cost and the amount of time instructors spend. If this is the case, we will recommend continued development.
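The grade comparison described in the questions above could be summarized with the simple descriptive statistics mentioned later in this plan. The sketch below uses only the Python standard library; the grade lists and section labels are hypothetical placeholders, not real FSHN 350 data.

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical final letter grades for one comparison section
# (pre-objectives) and one case section (with lecture objectives).
comparison_grades = ["A", "B", "B", "C", "C", "C", "D", "F", "B", "A"]
case_grades = ["A", "A", "B", "B", "B", "C", "C", "B", "A", "C"]

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def summarize(label, grades):
    """Report the mean grade, spread, distribution, and C-F rate for a section."""
    points = [GRADE_POINTS[g] for g in grades]
    dist = Counter(grades)
    print(f"{label}: mean={mean(points):.2f}, sd={stdev(points):.2f}, "
          f"distribution={dict(sorted(dist.items()))}")
    # Share of students earning C-F grades, one of the redesign's target metrics.
    c_to_f = sum(dist[g] for g in "CDF") / len(grades)
    print(f"{label}: C-F rate = {c_to_f:.0%}")

summarize("Comparison", comparison_grades)
summarize("Case", case_grades)
```

A comparison like this would be reported per section and per semester, alongside the distribution, so stakeholders can see whether any improvement in the mean reflects fewer low grades rather than a few very high ones.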


  • How do students use the objectives to guide their studies in the course?
    • It may be difficult for students to quantify their use of the objectives, but it would be important to know whether and how they are using them. If students are using the objectives, this would support positive outcomes. Otherwise, or if performance outcomes are negative, additional investigation would be required.
  • How do the students feel about the objectives?
    • My hypothesis is that students will have positive feelings about the objectives, because they will provide clear direction about the course content. However, if students indicate negative feelings or that they do not feel the objectives are congruent with the course content, the objectives will need to be revised.
  • Do instructors believe having the objectives influenced the clarity of instruction?
    • My hypothesis is that instructors will have positive views of the objectives if the objectives are provided to them in complete form and are consistent with the course content. If this is not the case, the objectives or course content will need to be revised.
  • Do instructors believe that having the objectives affected students’ abilities to answer application-based questions on quizzes and exams?
    • It is very important that instructors find the lecture objectives to be effective in this regard. This is one of the primary goals of the course redesign, and the objectives should be refined until students are able to apply key concepts.
  • Long-term: do instructors of upper-level courses in the FSHN department notice a difference in students’ ability to apply learning from FSHN 350 to their courses? Do graduating FSHN majors have improved scores on standardized tests or improved job placement in the field?
    • Using surveys of instructors and alumni, we will determine whether course modifications such as integrating objectives improve students’ long-term learning. If so, these data would be supportive of any improvements identified using quantitative research. If not, instructors should attempt to identify the reasons why.
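Responses to attitude questions like those above are often collected on a Likert scale and reduced to a few descriptive figures before being shared with stakeholders. The sketch below shows one way this could look; the survey item wording, the 5-point scale, and the response values are all hypothetical.

```python
from collections import Counter
from statistics import mean, median

# Hypothetical 5-point Likert responses (1 = strongly disagree,
# 5 = strongly agree) to an item such as "The lecture objectives
# gave me clear direction about the course content."
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

counts = Counter(responses)
print(f"n={len(responses)}, mean={mean(responses):.2f}, median={median(responses)}")

# Percent agreeing (responding 4 or 5), a common one-line summary.
agree = sum(1 for r in responses if r >= 4) / len(responses)
print(f"agree/strongly agree: {agree:.0%}")
print("distribution:", dict(sorted(counts.items())))
```

Reporting the full distribution alongside the mean matters here: a middling average could hide a split between students who find the objectives very helpful and those who ignore them, which would call for different revisions.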

Following the evaluations, data will be analyzed and, using simple descriptive statistics, presented to stakeholders, including:

  • FSHN 350 instructors. Depending on the outcomes of the evaluation, instructors may adopt or continue to use the materials. For example, if student performance and attitudes improve following inclusion of the objectives, instructors may choose to use/emphasize them in future sections of the course.
  • Instructional designers at TILT. Again, depending on the outcomes of the evaluation, instructional designers may be interested in continued participation in the course redesign in following semesters.
  • Administrators at TILT. These are individuals who determine whether to fund similar projects in the future. If there is evidence that student performance and attitudes improved, the likelihood of future investment in the course would also increase.
  • Administrators in the College of Health & Human Sciences who supported the project would be interested in knowing the results and supporting continued improvement of the course.
  • Other instructors in the department who may be interested in performing similar redesigns for their courses.

Beyond the evaluation:

I sincerely hope I’m able to continue working on this project as part of my graduate research and have the opportunity to see improvements in the course based on my work. We are really just getting started now and it’s exciting to think about all of the potential innovations that could be included in the course to help make the material more interesting for students and ultimately to help them learn.