Sener, J. (2006). Effectively Evaluating Online Learning Programs. eLearn Magazine. Retrieved May 15, 2007, from http://www.elearnmag.org/subpage.cfm?section=tutorials&article=23-1.
Summary
This article describes some common misconceptions (or 'frames') that instructors or institutions often bring to course evaluation. These frames are important to examine because, left unexamined, they prevent us from seeing the true potential of evaluation. The negative 'frames' described in the article were:
1. "Online Learning is a Different Universe"(para. 5)--Instead of seeing evaluating an online course as a whole new scenario, look at evaluation in the same way that you would for a face-to-face course. A lot of familiar tools can be used, just in an adjusted manner.
2. "Evaluation as Judgment" (para. 6)--Avoid making evaluations about judgment or accountability, as this just makes people defensive and may cause instructors to focus more on damage-control than on actually improving the course. Use evaluation to "make meaning" (para. 7) and consider how the program or course is working for those involved.
3. "Evaluation as Episode or Autopsy" (para. 8)--Don't allow evaluation to just be a one-time thing (usually at the end of the class). Make it continual and reflective--for both students as staff--by conducting several smaller evaluations throughout a course. This way, the data is actually useful to those students who provided it, and the course can be adjusted as it is going on, rather than revamping at the end for the next group.
4. "Content is King" (para. 13)--While evaluating how well content has been passed on is important, it is just as important to evaluate the processes that have been used. This can be especially important in evaluating online courses. This may require more non-traditional evaluation methods, such as journaling or response logs, in order to collect feedback as students work through various stages of the process of the course.
5. "The Comparison Trap" (para. 20)--Making comparisons between classes (or between online delivery vs face-to-face) to determine which is 'better' is "irrelevant and counterproductive"(para. 20). Instead, focus on making the class (whatever the format) better. This shift in focus can also keep the evaluation process from becoming adversarial.
Two approaches may allow evaluation to move in this direction. The Sloan-C Quality Framework focuses on five key pillars: access, student satisfaction, learning effectiveness, faculty satisfaction, and cost effectiveness. The CEIT Model for Evaluating Online Learning moves through four stages--comparisons, effectiveness, quality improvement, and transformation--while accommodating multiple progressions through these stages. What's important is that institutions find a system that works for them and makes evaluation more effective and rewarding.
Response:
I like the idea of making evaluation more useful and productive. I agree with the author that we have allowed evaluation to turn into something that often is not useful for instructors or students because it doesn't actually encourage any real reflection or improvement. Breaking it down into five simple areas is helpful because it creates concrete places where overhauling evaluation can start. Simply choosing to conduct a few smaller, perhaps more informal, evaluations over the course of a quarter or semester could change the whole meaning behind the process. Rather than simply 'judging' the course, the evaluators are providing feedback that may improve the class for them, not just for those who take it next time. This would also, I imagine, motivate them to take the evaluation more seriously and provide more helpful feedback. This is something I like to try to do in my class--although we don't do technical evaluations, I often ask my kids to write me a note at the end of each quarter telling me what is working for them and what isn't. I find it helpful to hear this while I can still make changes--if I waited until the end of the year, it would be too late.