I think I'll add this piece to the list of required reading for prospective clients. And yes, I purposely used the word "required." I harp on this whenever we propose X number of design hours on a project, and, almost reflexively, the first thing a client wants to take a scalpel to is the design time. Anyone else have this experience?
If you want the CliffsNotes version: the needs of the students at San Jose State were put on the back burner in favor of the wants of the business of education and of one company, Udacity.
After reading about the school's experiment, one quote stood out in my mind: "The courses were also put together in a rush." I hope, for the sake of the students, that the university makes the most of the pause in this relationship and takes a long, hard look at its business decisions as well as some of its instructional design choices in this matter.
The fact that the Udacity students fared significantly worse than their in-class peers is a red flag for sloppy and/or rushed instructional design. Indeed, digging deeper into the piece, you'll find the following gems: "faculty were building the courses on the fly... faculty did not have a lot of time to watch how students were doing in the courses because the faculty were busy trying to finish them."
In other words, it was a rush job without much formative evaluation of the course before the final rollout. When I was learning to design instruction, great attention was paid to conducting a formative evaluation of a course at every stage of development. This step is especially crucial once the instructional media, online interactions, and instructional strategy are baked into a draft course. For online courses, you sit a sample of likely members of your target audience in front of the draft and evaluate everything from their ability to navigate the course to how well they can perform the desired skills after completing it. The whole purpose of the exercise is to test the effectiveness of the instructional strategy, the course itself, and any associated materials.
When developing courses for Collabor8's clients, we ask a few test participants to go through each course and give us invaluable feedback using a punch list. Many times, we create a simple spreadsheet in Google Drive and share it with the folks who will be doing the testing. It doesn't need to be complicated; in fact, click here to see the format that we use.
Using this sheet, students can go in and provide us with feedback. We also use the sheet to note observations from our own internal quality reviews that point to useful edits.
To avoid a fiasco like this one, keep your learners' interests and needs above all else. And do yourself and your clients a favor: don't skip the formative evaluation of your courses. Remember, too, that online training courses are fantastic supplements to more traditional instructor-led training sessions. A little time invested in evaluation during development might have alerted the administration to the fact that some of the students enrolled in the online courses did not have reliable access to computers.
Additional source: Online Education Start-Up Gets 'F' From University
Alex is a co-founder and Managing Member of Collabor8 Learning, LLC, an instructional design and performance management consultancy. His firm collaborates with organizations to enhance the way they develop and train their people. To learn more about Collabor8 Learning, click here.
Alex can be reached at 786-512-1069, alex@collabor8learning.com, or via Twitter @collabor8alex.