When I took one of the newly offered online Stanford courses last fall, I had no idea that these courses would become central to a highly charged debate about elite institutions of higher education and online learning, not to mention a key locus of the debate over the forced resignation of my alma mater’s president.  I simply wanted to sharpen my command of MySQL.  Now, the Stanford courses (or, strictly speaking, not Stanford courses at all but courses offered by Stanford faculty as part of “A Stanford School of Engineering Initiative”; got that?) have been cited in the columns read by the anti-Sullivan faction at UVA: one by David Brooks, one by John Chubb and Terry Moe (“Stanford, for instance, offers a free online course on artificial intelligence that enrolls more than 150,000 students world-wide”), and one by Ann Kirschner (“Stanford University professor Sebastian Thrun’s free course in artificial intelligence drew 160,000 students in more than 190 countries”).  UVA alumnus and donor Jeffrey Walker cited “the hugely successful online course at Stanford” in an email to Visitor Mark Kington (who has now resigned his position), who sent it along to Rector Helen Dragas.

With that context in mind, I want to reflect on the online course I took, Introduction to Databases.  First, about those amazing enrollment numbers: though I have done some searching, I have not found a count of how many students completed the artificial intelligence class.  For my databases course, I remember hearing initial enrollment numbers in the 80,000-90,000 range; the professor, Jennifer Widom, later wrote, “This past fall my enrollment was a whopping 60,000. Admittedly, only 25,000 of them chose to submit assignments, and a mere 6500 achieved a strong final score.”  When we see statistics from these courses (and it’s clear the anti-Sullivan faction saw them repeatedly), we should keep in mind that the number of students completing a given course might be smaller than the number of enrollees by an order of magnitude: 6,500 strong finishers out of 60,000 enrollees is roughly 11 percent.  And in most educational environments, anything like a 10% retention rate for one semester is far from “hugely successful.”  I don’t mean that in a snarky way (if you want snark, see this tweet), but rather to note how weird our current thinking about “success” is when courses with substantial costs, no revenues, and little ability to keep the students they attract become the go-to model for emulation by elite universities.

That said, the Stanford course was successful for me, and I’m grateful for it.  I was motivated to succeed in the course: it offered almost exactly the skill set I wanted to develop; I needed those skills to accomplish larger goals for my job; I did not need Stanford-backed credit; and I enjoy situations where I am given information and am left to work through it on my own, at least in the introductory stages.  The course’s online lectures and quizzes, the latter cleverly designed to be repeatable with variations, along with a well-produced discussion board for peer-to-peer interactions, allowed me to work on roughly my own schedule.  That freedom was constrained by generous but real deadlines and aided by (usually) well-calibrated discussion board tips from fellow students.
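(For readers who have never seen what those introductory database skills look like in practice, here is a minimal sketch, written in Python with the standard sqlite3 module rather than MySQL. The tables, data, and query are my own hypothetical illustration, not drawn from the course materials, but a join-plus-aggregate exercise of this sort is roughly what an introductory databases course drills, with the quiz “variations” amounting to different tables, predicates, or aggregates.)

```python
import sqlite3

# Hypothetical illustration only: a tiny in-memory database and the sort of
# introductory join-and-aggregate query a beginning SQL course works up to.
# Table names, columns, and data are invented for this example.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE submission (student_id INTEGER, score REAL)")
cur.executemany("INSERT INTO student VALUES (?, ?)",
                [(1, "Ada"), (2, "Grace"), (3, "Edsger")])
cur.executemany("INSERT INTO submission VALUES (?, ?)",
                [(1, 92.0), (1, 88.5), (2, 75.0)])

# Average score per student who submitted anything, highest first.
cur.execute("""
    SELECT s.name, AVG(sub.score) AS avg_score
    FROM student AS s
    JOIN submission AS sub ON sub.student_id = s.id
    GROUP BY s.id
    ORDER BY avg_score DESC
""")
print(cur.fetchall())   # [('Ada', 90.25), ('Grace', 75.0)]
conn.close()
```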

Those comments from fellow students were crucial to the functioning of the course.  A handful of talented students became, in essence, volunteer TAs, combing the discussion board to find flailing fellow students and helping them out.  Like almost everyone else in the class, I was a consumer rather than a provider of this help.  The helpers’ spirit of volunteerism fit well with the tone Widom set for the course, which she described as a grass-roots experiment in online education.  I wonder how well these voluntary peer interactions would function in the venture-capitalistic frameworks now being developed for online courses.

I see other challenges as well.  There are the obvious ones: I don’t see a way for this model to work for humanities education, except in a very basic way, and even in technological fields, advanced undergraduate work requires a kind of interaction with peers and mentors that my course did not attempt to offer.  I came out of my course with no peers with whom to work (or study, or joke about the course), no faculty with whom to hash out ideas for new projects, no mentor to write a letter of recommendation.

Credentialing will pose a deeper challenge.  My course would have been extremely easy to cheat in, as Widom occasionally pointed out.  But cheating was not a big problem because the stakes were low: Stanford gave no credit for the course, and I doubt many organizations counted it for much, either.  If these courses become a means of awarding credit in a way that trades on the reputation of the sponsoring school, however, cheating may become a huge problem; for the introductory skills courses that work best online, back-channel networks can easily distribute answers that will earn credit, and such cheating could quickly devalue the credential, thus removing the incentive to pay for courses.

My conclusions come very close to those recently attributed to Teresa Sullivan: I see the best near-term potential in encouraging incremental, grass-roots efforts to test the possibilities and limits of online learning.  Contrary to some of the rhetoric surrounding the Stanford courses, the best parts of the course I took embodied that spirit of grass-roots creativity.
