Good information and bad coverage of college costs

We’ve gotten some rare good news about transparency in college costs: the U.S. Department of Education’s new College Scorecard, though limited in many ways, gives students and their families quick, easy ways to understand some of the realities of college costs normally hidden by simplistic discussions of sticker prices. But we need to understand what the tools do and don’t offer.

Today’s Chronicle of Higher Ed is not helping. Costs are at the center of Beckie Supiano’s “What Actual High Schoolers Think of the New College Scorecard.” The piece notes some of the advantages of the College Scorecard, but its pessimistic ending frets about students having too much information to process, and the final (and memorable) anecdote of a student using the site describes an important moment in learning about college costs:

Jimena [Alvarez, a high school sophomore] searched for the University of Miami, and was immediately presented with its $30,000 average annual cost. Her reaction? “Oh, no, I can’t go there,” she said. “Or maybe I can, but I’ll have to have a lot of student loans.”

The Scorecard provides further detail on what students might pay at each college, including information on typical debt, a breakdown of net price by income band, and a link to the college’s net-price calculator. But Jimena had a strong initial reaction, and it wasn’t clear she ever made it far enough into Miami’s data to realize she could get a more personalized price.

The moral of the story seems to be that poor Jimena Alvarez’s “strong initial reaction” prevented her from finding the important truth of the story: if only she had gone “far enough into Miami’s data” to find her personalized price, she would have gained a subtler and more valuable understanding. The curious omission of what she would have found leaves the reader to think that more information would have reassured her and perhaps maintained her interest in Miami.

But the condescension is unwarranted. In fact, Alvarez understood exactly what the College Scorecard most valuably conveys: Miami is an extremely expensive university. That average cost of $30,394 is almost double the mean, as Alvarez could see clearly on the chart. If she did dig deeper, she would find even more daunting news: the annual cost for families with incomes of $0-30,000 is a staggering $20,783. Florida State’s cost for such families is $11,542. Harvard’s is $3,897. The differences are just as stark in the other income brackets under $100,000.

As limited as the College Scorecard is in some ways, this anecdote presents one of its strengths: the Scorecard emphasizes costs rather than tuition prices, allowing it to convey a much more accurate sense of relative affordability than most conversations about higher ed do. The victory of the Scorecard, in fact, lies in an absence: Supiano’s article never uses the word “tuition.”

The questions we ask our students (and the ones they answer)

The accreditors are coming around to our campus again soon, so assessment is on the march. We held a two-day writing assessment workshop on campus over the summer, and I participated in scoring essays written by first-year students the previous fall. I came away just as skeptical about the quantitative assessment of college writing as I have always been, but I nonetheless found myself shaken by how much the exercise showed me about the pedagogy of college writing.

Recognizing the limitations of giving everybody the same prompt, detached from any connection to course content, the framers of our assessment project—a group of skilled and thoughtful people—gave the teaching faculty some directions about framing their writing prompts but left room for tailoring them to each class. This approach represented our effort to avoid the Scylla and Charybdis of writing assessment: the distorting artificiality of standard exercises, on the one hand, and, on the other, the inability of standardized questions to capture the kind of context-specific scholarship that we most want our students to practice. I was on my first committee trying to navigate those waters in about 2002; I haven’t yet seen anyone find safe passage.

In this latest assessment exercise, the variation among the faculty-written prompts was dizzying. Some were detailed, to the extent that they sounded like guidance for writing full-length scholarly articles. Some consisted of a single sentence inviting the student to analyze two writers, period. Some asked for summary followed by analysis. Some asked students to respond to passages that we faculty had trouble understanding out of context. My point is not that the prompts were bad but that they were so varied that it would be hard to imagine them producing writing that we could assess with a consistent set of criteria.

The real surprise came from reading the students’ essays. In crucial ways, their writing revealed that the students often had not read the prompts carefully, and they were right not to do so. The prompts asked for different kinds of writing, but the students responded in largely uniform ways. They understood the assessment exercise. Most of them have done similar things throughout their elementary and secondary educations: they knew they were supposed to write a short essay, conventionally structured, with some quoted evidence sprinkled in.

And indeed, that’s exactly what we assessed. With our rubrics and inter-rater reliability training in place, we were almost always able to score the essays in a straightforward way because the students knew to rely on the skills that had been praised and rewarded so often in their educations, no matter what their teachers tried to tell them on a given assignment.

The students’ ability to perform assessment-ready writing humbled me in two ways. First, it reminded me that students have often deduced my expectations when I have not explained everything that they need, even though I tend to explain a lot. The assessment exercise showed me how much we all lean on unstated expectations. Second, I gained a new way of thinking about how difficult I have found it to try new kinds of assignments, even with students who are curious, creative, and ambitious. Now I see such assignments in this light: every time I take a step away from an assignment that boils down to “Write an essay of length X on topic Y,” I remove some of my students’ confidence that they know what implicitly earns rewards in academic writing, even if the explicit requirements are incomplete or difficult to understand.

I still want to push my students and myself to break away from conventional essay assignments. I want them to become capable editors as well as readers, to give presentations that deploy ironic as well as explanatory slides, to work productively as members of creative teams that must evaluate their own work and choose how to share it. As I ask them to learn these skills, however, I will do so with a renewed awareness of how much I am requiring them to leave behind the techniques and assumptions that have gotten them to this college in the first place, and I need a similar sense of humility as I encourage colleagues to try new techniques and assignments. I have been thinking especially about the dynamics of classroom authority, race, gender, sexuality, class, and disability: it is easier for some of us than others to ask students to step away from expectations they know they can meet.

I am just beginning to turn from these thoughts to building a structured sense of how to respond constructively to them. From conversations I have had so far, I suspect that my thinking will draw heavily on the methods of my colleagues in the creative arts, for whom it is nothing new to ask students to express vulnerability, to judge one another’s work constructively, and to work in teams whose members have complementary skills. More to come.