Linguistics as a field is different from how it was 20, 30, 40 years ago. There was a time when linguistics was dominated by a couple of sub-fields that relied overwhelmingly on similar analytical tools, working at roughly the same level of analysis. Things are different now. There are more people working in more areas, using more tools to ask questions at multiple levels of analysis.
The broadening of the field is clearly reflected in hiring trends for young faculty, and in the interests of students who are applying to join linguistics programs. The scientific demographics of the field have clearly evolved.
But curricula have not kept pace with the changes in the field. Almost all leading linguistics graduate programs in the US have a curriculum that closely follows the model that has dominated for the past 40 years, giving privileged status to some sub-fields and treating others as second class citizens (‘boutique’ or ‘Cinderella’ sub-fields). This is hurting a generation of young researchers, and I think it’s also missing a big opportunity for the field.
This post is a reaction to similar conversations that I have had with many people over the past couple of years.
Typical linguistics graduate programs in the US follow a model in which beginning students are expected to follow a prescribed set of courses. These consume most of a student’s attention in the first year, and are generally dominated by ‘theoretical linguistics’: phonology, syntax, and semantics. Courses in fields such as psycholinguistics and computational linguistics are more commonly offered as electives to be taken later in the student’s program, around the same time that students are also selecting research topics for their PhD qualifying projects (often called ‘generals’). This curriculum follows a model established at MIT, probably in the 1960s, one that has since been copied across the continent.
The fixed curriculum made perfect sense 30-40 years ago, when the field was small, and when the required courses covered the sub-fields that the faculty specialized in and that almost all of the students wanted to focus on in their research. But as the research in the field and the composition of departments have evolved, curricula have not kept pace.
Exploding the curriculum
My own department at the University of Maryland is an outlier. (I’m sure there are others — use the comments section to point to these.) In the early 2000s the department decided to throw out the traditional linguistics curriculum, and to have no required courses. Students must take courses, of course, but there’s no specific course that they are required to take. We offer a lot of different courses at the foundational graduate level, and students must take at least 6 of those, including two 2-course sequences, to encourage depth. But the choice and timing of these courses is entirely up to the student and their advisor(s), who are free to tailor the curriculum to fit the student’s individual needs.
At the time when we exploded the curriculum, we were also an outlier department in other ways, e.g., barely half of our faculty had a PhD in linguistics. This meant that we had more need to liberalize our curriculum than other programs had at the time. For example, at the time when we dropped all required courses, a major part of departmental funding was coming from computational linguistics, and faculty in that area were supporting their students through grants at a far higher rate than in other areas. It just didn’t seem right to recruit computational students, whose funding would be coming mainly from computational grants, and to then tell them that they should set aside their interests for the first year while they focused on other areas that were more important.
This approach has worked out really well for us. It allows students to take courses in the areas that they are most interested in as soon as they arrive. It has allowed students to take ownership of their own course of study. It certainly hasn’t led to our students being narrowly trained. On the contrary, they have gone far beyond the minimum that is required of them, within and beyond our department. Students are satisfied because they’re never taking something that they’re forced to take. Faculty are satisfied because they never have students in their classrooms under duress. The department has benefited in many other ways. It would be hard to argue that the department has suffered: applications, recruitment, job placement, funding, and pretty much any other measure that you can think of have looked better since the change. (Of course, correlation is not causation, and the curriculum wasn’t the only aspect of our program that changed.)
We expected that our curriculum change would raise some eyebrows when we first implemented it. It did. But we have been surprised at what has and has not happened over the intervening years. In terms of hiring and recruitment, we’re less of an outlier now than we were a dozen years ago, e.g., most leading linguistics programs now have one or more psycholinguists on their roster, and many also have computational linguists. They have diversified their faculty in other ways too. But our curriculum is still an oddball. In almost all other leading programs, the traditional curriculum is largely intact. And that’s the problem.
Many young faculty have been hired with the goal of diversifying departments’ research portfolios, broadening the tools and methods available, and possibly increasing external funding. New students have been attracted by the new opportunities. But these faculty feel like second class citizens when it comes to the curriculum. It’s hard for beginning students to take their courses. That makes it harder for the students to choose their area for their initial research projects and qualifying papers. And so these faculty face a disadvantage in training students. In contrast, students in psychology or computer science programs are more likely to be diving into research in their primary area right from the start of their graduate careers, making them more competitive by the time that they graduate. And since research in these fields depends so much on faculty-student collaborations, it also holds back the faculty’s research. The feeling of being second class citizens is made worse when some colleagues refer to sub-fields like phonology, syntax, and semantics as “core” areas of linguistics. (I think the current fashionable term for that is “micro-aggression”.)
The situation that I’m describing is not fabricated or isolated. I’ve heard the same story from faculty and students in many different institutions. [Edit: and since writing this post, I have heard from more young faculty who have shared their version of this story.]
I have heard various arguments for why the traditional curriculum should be preserved. I’ll list them here, with comments. You won’t be surprised that I’m not particularly convinced by them.
1. Students need training in ‘core’ areas before applying this knowledge in other domains.
Yes, it’s helpful to know some linguistics in order to do good work in computational linguistics or psycholinguistics. But you don’t need a whole year of graduate level course work in syntax, phonology, etc. For better or for worse, the state of the art in computational linguistics and psycholinguistics generally does not depend on the very latest results in theoretical linguistics. And beginning linguistics graduate students know more than they used to. Advanced linguistic expertise is definitely an asset, but it’s not a pre-requisite.
2. Required courses are needed, because students need breadth.
Yes, students benefit from breadth. But there are many different ways to be broad as a linguist. Our graduate students at Maryland have no required courses, yet their training is extremely broad. Different individuals are broad in different ways. So no need to worry.
3. If students haven’t taken courses in X, Y, or Z, they won’t be able to get an academic job. You have to know these things to teach an introductory course.
Not true. Any smart linguist can deal with an introductory course. And few are hired for that ability. If you really want to ensure that students are employable in academia, load them up on courses in computational linguistics and second language acquisition. That’s where the jobs are.
There is a variant on this concern that perhaps is true. Most hiring is still done by faculty who specialize in traditional sub-fields. Many of them are worried about the invasion of hot-headed youngsters whose work looks quite different from theirs. If you’re looking to get hired, but you give the impression that you don’t understand or appreciate what the people doing the hiring care about, then you’re at a disadvantage.
Another variant on this (thanks to Brian Dillon for raising it) is that many teaching jobs require individuals who are versatile enough to teach advanced undergraduate courses in multiple areas, especially the ones that dominate the traditional graduate curriculum. I’d agree that broad teaching skills are valuable, and that we need to do more to prepare students for this. But I don’t see how it follows from this that everybody needs exactly the same breadth. High demand undergraduate courses include second language acquisition, language and gender, historical linguistics, language and computers, language and advertising, and so on. The argument favors breadth, not uniformity.
4. How can you call somebody a linguist if they don’t know X?
The field has grown up now. It covers a broad space, and it’s just not possible to keep up with everything. We should expect different experts in our field to have different kinds of expertise. That’s a mark of a mature field.
5. It’s important for beginning students to go through a fixed set of courses, so that they build a cohort.
We agree with the goal, but in our experience this can be achieved without forcing all students into the same mould.
6. We need to make certain courses required, so that the courses reach minimum enrollments.
This reflects a harsh reality for many graduate programs. They’re expected to offer graduate courses, and those courses have to meet enrollment targets. But that is hard to do in the face of declining student support and declining student numbers. A compulsory curriculum helps to create some degree of certainty. In our PhD program we cannot meet enrollment targets based on beginning PhD students from our own program alone. But enrollments haven’t been a serious problem for us. This is in part because students take the foundational courses on different schedules, but also because we draw in many students from other departments (just as our own students often take courses in other departments), and because we draw in a small number of our best undergraduates. My own foundational course in psycholinguistics, which extends over a full year, serves beginning linguistics graduate students, but they are often in the minority in the course.
Why this matters to linguistics
But why complain about this? After all, I’m delighted with our non-curriculum, and it has helped us to attract excellent students and faculty. So why care about the curriculum elsewhere? In part, it’s a concern about the impact on individual students and individual faculty careers. When we hire faculty and recruit students we need to set them up to succeed. But beyond that, I think that the fossilization of the linguistics curriculum is holding back the field, certainly in psycholinguistics, and possibly in computational linguistics. Why so?
For a long time, researchers with linguistics training have been marginal constituencies in psychological and computational research on language. But that may be changing. In psychology departments, language is not a growth area nowadays. Other areas are hotter, and more lucrative, e.g., social cognitive neuroscience. There are a few psychology departments that have big language groups, e.g., Illinois, Connecticut, Edinburgh, but it’s rare for that to be a priority these days. When I go to psycholinguistics conferences like CUNY, I encounter vastly more linguists than I did 15-20 years ago. It’s conceivable that in the next few years a lot of psycholinguistic research will shift from psychology departments to linguistics departments. But fossilized curricula put the brakes on that. I’m less knowledgeable about the state of computer science departments, but there’s also a general sense that linguistic expertise is ascendant again, after many years in the wilderness. As brute force big data methods reach their limits in NLP, and as tools like Siri and Google Translate raise consumer expectations for the quality of language tools, there’s more and more need for a deeper understanding of how human language works. This creates an opportunity for linguistics programs to play a big role (… and one that can bring funding with it). But, again, fossilized curricula create a barrier to training computational linguists in linguistics programs.
If you have more arguments why the traditional curriculum is better for the field, I’m curious to hear them.