Does patience pay off on the job market? Here’s an article that won’t tell you the answer.

Last week, I was in conversations with two groups of people seeking or soon to seek academic jobs. Though located at two different institutions and coming from a wide range of disciplines, the groups shared a new concern on top of the usual ones: the new Chronicle of Higher Education article called “On the Academic Job Market, Does Patience Pay Off?” Many readers seem to share the alarm of (currently) the first comment under the piece: “This is extraordinary information… more evidence of how merciless the academic job market has become. Graduate students need to be aware of these numbers from the moment they start a program.”

Seeing the impact of the article on the job seekers, I read the piece, and I found a problem: it does not answer the question it asks. “Does Patience Pay Off?” is an answerable question, at least at the level of statistical generality, and we can make it more precise by rephrasing it: “If a candidate stays on the job market for multiple years, does the probability of securing a job in any given year go up or down over time?”

The piece in the Chronicle, however, answers another question: of the jobs secured in a given year, how many go to people at each stage of their job search? The cited statistics reveal that, across many fields, about half of jobs go to applicants who are ABD or in their first year after completing the doctorate, and a strong majority of jobs go to candidates who are ABD or within four years of completion.

The shape of these numbers is entirely explicable by the nature of a competitive market: assuming a constant number of new applicants per year, a much lower number of new jobs per year, and an equal chance for every candidate to get a job, a mature market will award roughly half the jobs to applicants in their first two years on the market and, of course, many more to the group that also includes the next three classes of applicants.

That is: of course a lot of jobs go to the classes first hitting the market; that’s where the largest numbers of applicants are. Those classes are bigger than the more seasoned ones because some of the latter applicants will have gotten jobs already, and some of them will have dropped out of the market entirely. You can see these effects play out in a simple spreadsheet model that I made. My applicant-bots have the same chance of getting a job every year they apply, and as their market matures, it produces data similar to the Chronicle’s.
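The same model can be sketched in a few lines of Python. This is a minimal expected-value version of the spreadsheet; the cohort size, job count, and dropout rate here are illustrative assumptions, not figures from the Chronicle article or from my spreadsheet.

```python
# Expected-value model of a market where every active applicant has the
# same chance of being hired in any given year. All parameters are
# illustrative assumptions.

NEW_APPLICANTS = 100   # candidates entering the market each year
JOBS_PER_YEAR = 20     # positions awarded each year
DROPOUT = 0.2          # share of unhired applicants who leave before next year
YEARS = 60             # long enough to settle into a steady state

pool = []  # pool[k] = expected number of applicants in year k of their search

for _ in range(YEARS):
    pool = [NEW_APPLICANTS] + pool            # a new cohort enters at year 0
    hire_rate = JOBS_PER_YEAR / sum(pool)     # identical odds for everyone
    jobs_by_age = [hire_rate * n for n in pool]
    # unhired applicants return next year with probability (1 - DROPOUT)
    pool = [n * (1 - hire_rate) * (1 - DROPOUT) for n in pool]

share_first_two = sum(jobs_by_age[:2]) / JOBS_PER_YEAR
share_first_five = sum(jobs_by_age[:5]) / JOBS_PER_YEAR
print(f"jobs going to applicants in years 0-1: {share_first_two:.0%}")
print(f"jobs going to applicants in years 0-4: {share_first_five:.0%}")
```

With these parameters, the steady state sends roughly 40 percent of each year’s jobs to the two newest cohorts and roughly three-quarters to the five newest, even though every individual applicant faces identical odds every year.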

So does patience pay off in the academic job market? I’d still love to know.

Good information and bad coverage of college costs

We’ve gotten some rare good news about transparency in college costs: the U.S. Department of Education’s new College Scorecard, though limited in many ways, gives students and their families quick, easy ways to understand some of the realities of college costs normally hidden by simplistic discussions of sticker prices. But we need to understand what the tools do and don’t offer.

Today’s Chronicle of Higher Ed is not helping. Costs are at the center of Beckie Supiano’s “What Actual High Schoolers Think of the New College Scorecard.” The piece notes some of the advantages of the College Scorecard, but its pessimistic ending frets about students having too much information to process, and the final (and memorable) anecdote of a student using the site describes an important moment in learning about college costs:

Jimena [Alvarez, a high school sophomore] searched for the University of Miami, and was immediately presented with its $30,000 average annual cost. Her reaction? “Oh, no, I can’t go there,” she said. “Or maybe I can, but I’ll have to have a lot of student loans.”

The Scorecard provides further detail on what students might pay at each college, including information on typical debt, a breakdown of net price by income band, and a link to the college’s net-price calculator. But Jimena had a strong initial reaction, and it wasn’t clear she ever made it far enough into Miami’s data to realize she could get a more personalized price.

The moral of the story seems to be that poor Jimena Alvarez’s “strong initial reaction” prevented her from finding the important truth of the story: if only she had gone “far enough into Miami’s data” to find her personalized price, she would have gained a subtler and more valuable understanding. The curious omission of what she would have found leaves the reader to think that more information would have reassured her and perhaps maintained her interest in Miami.

But the condescension is unwarranted. In fact, Alvarez understood exactly what the College Scorecard most valuably conveys: Miami is an extremely expensive university. That average cost of $30,394 is almost double the national mean, as Alvarez could see clearly on the chart. If she did dig deeper, she would find even more daunting news: the annual cost for families with incomes of $0–30,000 is a staggering $20,783. Florida State’s cost for such families is $11,542. Harvard’s is $3,897. The differences are just as stark in the other income brackets under $100,000.

As limited as the College Scorecard is in some ways, this anecdote highlights one of its strengths: the Scorecard emphasizes costs rather than tuition prices, allowing it to convey a much more accurate sense of relative affordability than most conversations about higher ed do. The victory of the Scorecard, in fact, lies in an absence: Supiano’s article never uses the word “tuition.”