Thursday, April 16, 2015


After an extra-long stay on the west coast to celebrate my one and only brother's wedding, I'm finally returned and recovered from #shakeass15. This was my seventh SAA in nine years, and maybe it's time to give in and admit that, drama scholar or no, this really is my conference now.

This was the first year that I organized and ran a seminar of my own (a rather wee one, as it turned out, but with great papers and participants), and probably the second at which it seemed fully half of the seminars were run by friends, or at least friends-of-friends, or, anyway: people I know well enough to talk to for five minutes at the bar.

When I was at an earlier stage of my career, I think I longed for this moment as a sign that I'd "made" it, that I was some kind of an insider. And for at least a couple of hours on Thursday, it did feel that way: at the opening reception, after 10 hours of travel, not enough to eat, and (just possibly) more wine than I'd realized, I was possessed of the delusion that either I knew everyone or everyone knew me. This was a terrific feeling, and led to my crashing a lot of conversations: I'd see a knot of four or five people, recognize one of them, and decide that the whole group probably knew who I was and would be thrilled if I barged into their conversation. When the expected enthusiastic welcome wasn't forthcoming, I'd think, geez, those are some weird, uptight people--and move along to the next bunch.

As a strategy to overcome the social-awkwardness-that-reads-as-unfriendliness at academic conferences, this may not have been the worst approach: without the anxious, inhibiting voice in my head persuading me that I was the weird, rude one, I was free to be . . . well, a little weird and a little rude. But also charming and friendly! (I'm pretty sure!)

Looking back on the reception from the following day's luncheon, it was clear that I didn't know half the attendees. (Using a generous definition of "know," it's conceivable that I knew one-quarter.) And the people I don't know aren't just grad students or scholars emeriti: they're often people my own age, at my career stage, doing interesting and important work; we just haven't met yet.

This is, I think, the real sweet spot: being only two or three degrees of separation from everyone, but never feeling that one has reached the end or exhausted all the possible SAAs within any given SAA.

But no matter how many sub-conferences any conference contains, Ima try to crash every one of them.

Tuesday, March 31, 2015


So yeah: a lot of people are interested in moving jobs some day. There's nothing wrong with that. But here's a tip: when someone you barely know asks you--just conversationally--how you like your job at X, your response should not be, "it's a good first job."

Maybe you've absorbed the snobbery of your grad school cohort; maybe you're afraid of your interlocutor's condescension or pity. But I swear to God: I'm not even a job-seeker, and when I hear that I still want to punch you in the face.

Saturday, March 28, 2015

Mid-career mobility

In my last post I mentioned one of the ways that the precarious job market affects even tenured and tenure-line faculty; in this post I want to talk about another: mid-career mobility.

Just as many of us were told that there are always jobs for good people or that we'd be fine as long as we went on the market with a couple of publications, many of us were also told that there was such a thing as a first job: that if we weren't happy somewhere (or were perfectly happy for a time, but later wanted new opportunities), we'd be able to move if we were working hard and publishing well. At least, I was told this, and the careers of my grad school professors seemed to bear it out: although a few of the senior faculty were still on their first job, most of them--and usually the most accomplished--were on their third or fourth or fifth.

Now, I'm not expecting the plight of those seeking a second tenure-track job to wring tears from the eyes of those still a long way from that kind of stability. But this affects them, too: the scarcity of jobs means that most grad students and recent PhDs are advised to take any job they get offered--and then "write their way out." Obviously, it's foolish to turn down a decent job in the hope of a better one, but what about the job that sets a candidate's Spidey-sense a-tingle or that seems like it might be unworkable for a single person or a dual-career couple or a minority or LGBT applicant? What is the likelihood of moving elsewhere?

I don't have an answer to that. I do know at least a dozen people who moved before tenure, which leads me to believe that the odds of such a move are decent--but of course the nature of the game is that those who are on the market don't usually advertise it.

The mid-career move is even more of an open question. Just as the contracting job market means many tenure-line jobs are themselves worse than they used to be--fewer TT faculty means a heavier service burden on those who remain, which frequently comes alongside higher course caps and increased teaching loads--it also means mid-career moves are harder to pull off. The two together can lead to the kind of post-tenure malaise that Notorious Ph.D. has blogged about.

I haven't seen many mid-career moves, though it's possible that I'm just too early in my career. Maybe they too are a casualty of the job market, or maybe they're in a temporary lull--or maybe they were never as common as the careers of my grad school professors led me to believe.

Although lots of people at midlife and in midcareer experience some kind of a slump or wonder whether they can bear to be doing the same thing for another 20 or 25 years, most highly educated professionals can at least move companies or cities if their specific working conditions are displeasing. In academia, this is rarely possible.

And I think it's the possibility, more than the reality, that matters. I've never yet taken a job that I was eager to leave or one in which I didn't think I could be happy long-term. But I've also never wished to believe that any job was my last job; it's useful to believe that other opportunities lie ahead--and that in a hazy ten years or so, or after the next book or the next, I might make another move.

Whether such an opportunity actually presents itself is less important.

Thursday, March 19, 2015

The long goodbye

As you good people are all aware, a year ago I accepted a job at Cosimo's institution--starting, for sabbatical-repayment reasons, in August 2015. But even once I start that job, almost 18 months after acceptance, I won't have fully left my current one. Instead, I'm taking an unpaid leave.

This is partly a pragmatic decision; if the Steven Salaita case has taught academics anything, it's the wisdom of disaster insurance. My own inflammatory opinions are mostly confined to long-dead literary figures and are unlikely to piss off any trustees or Boards of Whatever.* Still, flukey things happen, and in a field with extreme employment precarity, it's better to be on the safe side.

But taking a leave is also about keeping my options open and making sure the move is the right one. In addition to being able to live with my spouse, my new job offers me several things that my current one doesn't, and I'm very much looking forward to those things. But my current job, in turn, has strengths that my new one does not.

This slo-mo, not-quite-letting-go is pretty standard in academia; I know lots of people who have taken leaves rather than resigning outright--some of whom eventually returned, most of whom did not. Still, it probably seems bizarre to people in other industries, and it feels a bit bizarre to me, too. In most areas of life, I'm the kind of person who wants to lock decisions down. I hate endless dithering and lack of closure (which is why so many meetings run by academics drive me insane).

But in most industries, the consequences of an employment slip-up or a bad decision aren't grievous; you just move to a third job or return to your previous employer. When my dad decided to return, after a year of working for my uncle, to the government job he'd held for more than a decade, he could do it. He was docked a GS rank (which he later regained), but he could do it.

Academia is different, and it's only gotten worse. Though I don't have many qualms about the broader effects of my delayed start at one job and delayed resignation from another (it's unlikely that my department would be able to replace me immediately, so I'm not "keeping" a position from a needy job-seeker), I don't have none; the security that allows me to try on a new job risk-free is exactly what's unavailable to most academics today.

However, it's that broader lack of security that makes those of us who have it cling to it. Jobs are so scarce that any screw-up, whether personal or institutional, can have devastating consequences, and no one is immune. These days it's not uncommon for junior faculty in very prestigious positions to have had only that one offer, after years on the market, and to have been a minute away from leaving the profession. Even extremely talented people who get denied tenure often can't find another job, and those who leave the tenure track can rarely get back on it.

I'm thankful that both institutions have been flexible enough to let me make a decision I'm comfortable with, in a way I'm comfortable with, and no larger good would be served by my hastening to closure. But the security from which I make that decision is a privilege. I wish there were more of it to go around.

*Since you asked: the Romantic poets are goddamn whiny, navel-gazing tree-huggers! (Except Byron; Byron's all right.)

Sunday, March 15, 2015


One of the difficult things about the early stages of one's career is never being sure how much you should be doing, what's normal, what's possible. This is one--though certainly not the only!--reason that academics feel they're never working hard enough and are haunted by vague feelings of guilt and idleness and shame.

In my experience, those feelings lessen after a while: you learn the rhythms of your job and your life, when you work best, what's normal and possible for you--and also you clear certain hurdles (reappointment, tenure, book publication, whatever). Now that I'm nearly a decade post-degree, I can also better identify the outliers.

In grad school, this is almost impossible. You don't know if the person who completes all his seminar papers early and his dissertation in five years is brilliant, disciplined, facile--or just really well prepared for graduate work. And your job-market competitor who has three articles to your one may have many more publications by percentage--but only two more publications, numerically. It's hard to know how to read that kind of data.

Things are a little clearer now. That person who already has fifteen articles when even her more serious peers have half that? Who has a third book out before most people have a second? She's working at a totally different level. And that's a relief to know. If I assumed her to be the norm, I might feel shitty about myself. But understanding her to be the scholarly equivalent of a fashion model--exceptional, admirable, even aspirational in some respects, but not a standard any sane person would expect me to meet--frees me to feel good about what I can do.

Although the conditions of one's employment certainly affect what's possible, there are outliers up and down the academic food chain. Any job that has some research expectations and gives research some time and some support is going to see a wide range of outcomes. Moreover, there are people at middling institutions who are outliers not just for that institution or their professional circumstances, but for their career cohort. I know people at institutions like my own who are dramatically outpublishing their peers with fancier jobs.

And here's where I say something a little controversial: while acknowledging both that professional circumstances shape what's possible and that most people's productivity ebbs and flows over the course of a career, I think that, on average, we work at the rates we work at. I do not seriously believe that if I had an R1 job my output would look materially different. Maybe I'd publish an additional article every two or three years or my books would come out slightly faster, but I don't believe the fundamental pace of my thinking and writing would change.

I like to believe that I would do just fine at a fancier job, but I have no illusions that suddenly I'd be able to publish a book every five years; even the outliers at R1s are lucky to do that, and if I'm not an outlier at my current institution, there's no reason to think I would be at another.

Friday, March 13, 2015

Now this is touching the future

Apparently I've been at this job long enough that not only are my former students marrying other former students, but they're opening their own cocktail bars.

Cha-Cha City has a modest wealth of appealing bars and restaurants, but not so many that there aren't dead spots in the week--nights when it seems like everything is either closed or shuts up early--and what with family schedules and teaching nights, my colleagues and I have struggled to find an appealing midweek standby. So we gave our old student's new place a whirl: to support an alum, try something different, and maybe add it to the rotation.

And oh, it's added. This bar is one straight-up, nerd-cool, English-major fantasy.

Every bar should have a rolling library ladder

I don't know what other professors fantasize about their students doing with their lives; I mostly just want mine to be happy and secure members of the middle class (who hopefully still derive pleasure from books and movies and plays). But if I had a self-serving fantasy, one that made me feel good about myself and my proximity to talent, it wouldn't be for my students to go on to get PhDs themselves or write critically acclaimed novels or work for the New York Times or get elected to the Senate. It would be for them to do something very much like this.

One of the pleasures of teaching at a regional institution is contributing to that region in a sustained and multi-layered way. My students teach in the urban and suburban school districts. They fix up old houses, work at local nonprofits, open their own businesses. Those are their achievements, not mine--but they benefit me. They make the place I live better. They enmesh me in a meaningful network of connections.

I may not be from here, or staying here, or have roots much of anywhere. But part of putting down roots in a place is knowing and supporting those who do.

Friday, March 06, 2015

Nothing perishes

So maybe the first thing to say about turning 40 is that I got a tattoo.

Not, like, across my forehead--but not tiny and not in a super-discreet location: capable of being concealed by professional wear (and in my currently northerly clime for up to seven months of the year regardless of what I'm wearing), but otherwise pretty visible. That was kind of the point.

I didn't get the tattoo for my fortieth, exactly; I'd been contemplating it for more than a year and my birthday just provided a convenient milestone. Still, getting a tattoo at all, and getting this one in particular, is intimately connected to my sense of aging and my desire to keep faith with my past selves as I move on to whatever I do move on to.

Anyone who's been reading this blog for any length of time will recognize that making sense of the past and unraveling the relationship between history and identity--whether personal or collective--is my only real interest, the thing that drives pretty much everything I do; indeed, twenty-five years' worth of journals and letters show that this is far from a recent obsession. (If I'm constant in anything, it may be in my search for continuity and my fear of finding it wanting.)

So I guess my tattoo is another reminder of who I am and what I value, a way of both staking myself to a moment in time and acknowledging the unknown. I'm not afraid to see the image change as my body also changes.

That, too, is kind of the point.

Tuesday, February 24, 2015

Watch this space

In theory I have a million things to say about the liminal moment in which I find myself: 40th birthday just past, a variety of 10-year anniversaries on the horizon, and a big professional move in the works. I'm busy enough and happy enough, and I've even had the time to write. It's just that my brain feels like it's gone silent.

Ordinarily, I move through life talking to myself. In the shower, I'll go into a spiel about a text I'm teaching. On my drive to work, I'll start composing a blog post. Sitting in my office, I'll hold an imaginary conversation with a friend. At the gym, I'll summarize, under my breath, an article I just read, as if talking to a colleague or a hiring committee. It's not about anxiety. My brain is busy, always, with hypothetical Facebook and Twitter posts, emails to friends, arguments with people I no longer speak to, tricky bits of scholarly prose, descriptions of what I did last weekend. In a very real way, I don't experience my life except through language.

But lately that chatter isn't there. I'm still writing to-do lists and lesson plans, taking notes toward my next book, and cursing aloud when someone cuts me off in the parking lot. But there's not the usual verbal processing of whatever I'm thinking and feeling. I'm not bored or impatient, but it's very. . . quiet. I have the sense that I'm waiting for something: a reply from the oracle, a transmission from outer space; something.

Until then, though, it may be as quiet around here as it is in my head.

Thursday, February 19, 2015


As of today, I have walked the earth for forty years.

Presents for everyone!

Saturday, February 14, 2015

Sorry: brain full. Try again later.

This semester I'm continuing my Italian study with private lessons. This initially seemed easier than what I was doing in the fall, and in most ways it is. I meet my professor once a week for two hours, which means that I get about as much instructional time but waste less time commuting; I also have less work to prepare in advance. And since it's just the two of us, it's all quality time: there are no moments when I'm zoning out or only half listening while one of my classmates is on the spot.

That's also the problem. Two hours is a lot of time. Just as my body is not ready to run 9-minute miles for two hours straight, my brain is not ready to speak Italian for two hours straight. This week I'd read a couple of articles in the Italian press about the refugee crisis in the Mediterranean, so after 45 minutes on grammar we turned to that. Then we broadened our discussion to EU immigration policies more generally and how one strikes a balance between border protection and humanitarian relief. This was hard, but I was pretty game for a while. Around the time that my instructor turned the conversation to Obama's executive action on immigration, however, and asked me to compare the American and Italian situations and outline the differences between the Democratic and Republican positions, my brain stalled out.

Partly it was the complexity of the material, but mostly it was just fatigue: at a certain point I was unable to access even the most basic vocabulary or pronounce words I'd been saying just fine twenty minutes earlier. In fact, I've never before felt quite this level of mental collapse--though those times I've been awake for 30 hours for a complicated transatlantic journey and then had to negotiate an unfamiliar municipal transportation system might come close.

I recovered, of course, but the experience has made me think a little harder about the way I schedule and manage instructional time in my own classes. I've always been mindful of the kind of fatigue produced by monotony (sitting too long in one place or doing exactly the same kind of work for 60 or 90 minutes), especially in lower-level classes or classes that meet only once a week, but I haven't thought much about the fatigue caused by brain overload. Maybe a student isn't staring off into space or typing on her phone beneath her desk because she's uninterested, but because she can't absorb any more information right now.

Not that the two are mutually exclusive.

Tuesday, February 10, 2015

Inside the snow-globe

Like much of the East, we've been snowed under for the past ten days. We didn't get even a third of the snow that Boston did--a fact I take great pleasure in pointing out--but we got enough that it's still heaped everywhere. Whatever surfaces aren't icy are slushy and salty and dirty and gross, so every venture out remains a minor expedition.

And you know, of all the things I hate about winter, the one I may hate the most is all the gear it requires. I hate putting on a coat just to take out the trash. I hate wearing snowboots to the gym. I hate the feeling of all those layers. I hate how grubby all my outerwear gets. I hate the monotony of always wearing the same things. And I really hate having to take all that crap off and put it back on eight times a day.

In this respect, this winter has been better than last. Last year, although I was on sabbatical, I was commuting downtown three days a week on public transportation, walking about a mile, and then wending my way through a Habitrail of skybridges between buildings to get to my Italian class. I dressed for the commute and for the fact that I wasn't teaching, so I wore lots of boring and practical layers. As soon as I entered the first building, I started peeling them off--first hat and gloves, then scarf, then coat, then vest, and finally I'd wind up at my classroom with a huge heap of clothes in my arms. I looked about as harassed and bedraggled as I felt.

This year, I'm commuting by car to MY VERY OWN OFFICE. My clothing choices aren't unlimited--I still have to plan for the walk to and from the parking lot and for the possibility that I might need to shovel out my car--but I have a reason to dress up and take pleasure in what I wear. And once I get to my office, I can throw all my outerwear in the corner, change into heels, and trot around free and unburdened, like a human being rather than a pack animal.

This is, for whatever reason, a huge psychological boost. And I need as many of 'em as I can get.


What small pleasures get you through the sloppy, dreary, ass-end of winter?

Saturday, January 31, 2015

Stepping it up

I know the first week of the semester isn't always predictive, but damn, I had a good one.

The greatest surprise was my M.A. class. It's a small group (though, fortunately, not this small), and I'd never met most of them. I also assigned a lot of reading before our first class--too much, probably. Last weekend I was plagued with visions of how badly things might go: what if half the class didn't get my email or didn't do it? What if three people decided to drop? What if they were just annoyed or confused?

Instead everyone showed up with the reading done and digested. They were eager to talk about it and had smart things to say. And if that weren't enough, two of them had also read an entire 450-page book I hadn't assigned (a recent biography of Donne) just because they thought it would be useful and interesting.

Okay, class: game on.

Thursday, January 29, 2015

Angry pretty girls

Because the amount of fiction reading I do is directly proportional to the amount of time I spend at the gym--and because the gym is not the place for complicated, experimental fiction--I've been whipping through novels lately. The one I finished most recently is Gillian Flynn's debut novel, Sharp Objects, which I picked up after having read and loved Gone Girl a couple of years back.

Sharp Objects isn't as strong a novel, though it's very good. Suspense/mystery/crime isn't my preferred genre, but both Gone Girl and Sharp Objects have stuck with me for reasons that are only loosely connected to genre. I guess the easiest way of putting it is that I can't stop thinking about Flynn's women. Both novels are narrated by the kind of women who are familiar from crime fiction, but who usually aren't given the chance to speak for themselves. You know: tough, beautiful, damaged, and dangerous to themselves or others. The kind of woman the male hero gets entangled with--and usually tries but fails to save.

But the women in Flynn's novels get to be more than just enigmas or objects of fascination; they show us heterosexual femininity under pressure. Some of her female characters are monstrous (Sharp Objects has a lot of these, from country-club backstabbers to suffocating mothers to mean-girl tweens), but even their monstrosity seems just a twisted and exaggerated version of types we all know. We know those types, because we live in a world where many women feel the pressures of femininity. And so they have coercive sex at 13; shun and shame other women for fear of losing status; transform themselves into perfect homemakers and spend their days shopping and decorating and drinking themselves into stupefaction.

Flynn's women are not tragic victims and they're far from feminist heroes. But in indirect and often self-serving ways, they make a feminist point about our social scripts for women. The famous "cool girl speech" from Gone Girl may have been delivered by an extremely unreliable narrator, but as its popularity suggests, it's a sentiment a lot of women relate to. As someone who was an awfully angry teen and twentysomething (though quiet and almost entirely unrebellious), I tend to believe there's a lot more female rage out there than we talk about. In Gillian Flynn, the fury of the pretty girl and the hostility of the good girl are all right there. It's a reality I appreciate seeing depicted.

Saturday, January 24, 2015

Library privatization

Over the past week I spent a few hundred dollars on books. That's a lot of money, but the expenditure itself isn't so remarkable; I might easily spend two hundred at a conference or at a used bookstore in a town I pass through only seasonally. What's giving me a bit of a twinge this time is that the books I bought are substitutes for things that in another life I might have expected to be held by my university library: one is a major Miltonic reference work and two are facsimile editions of early Donne volumes.

Now, I'm not complaining about my college library; we have a decent acquisitions budget and everything I've ever asked for has been acquired, including pricey multi-volume sets. It's possible that if I'd asked for these--all long since out of print but available on the used market--the library staff might have been able to acquire them. (Though they certainly couldn't have acquired original copies of the Donne volumes, which run more than $50,000.)

And maybe I'd have wanted these books even if RU had copies of its own; in grad school, I splurged on some complete sets and reference works even though I lived a ten-minute walk from one of the greatest research libraries in the country. I'm not as much of a bibliomane as some people, but I'm definitely on the acquisitive end of the readerly spectrum: cost permitting, I buy just about every book I read and every book I come across that seems like it might be useful in the future. Apart from the pleasure of ownership and the efficiency of having everything I want in a single location, I also like feeling I'm doing my small part to prop up the academic publishing economy--one $95 book at a time.

But though I don't regret the money I spend on books, in light of the limitations of my institutional library (and the similar, if not greater, limitations at the library of my future employer), building a private scholarly library sometimes feels like hoarding treasure for my personal use--or at least like a retreat from a commitment to institutional libraries as the cornerstone of the intellectual community.

And yes, I know that building a private library needn't mean neglecting institutional ones: in the nine years that I've been at RU, I've helped build up our early modern collection to the tune of a few hundred volumes and many thousands of dollars. I've ordered copies of things I already own, things too expensive for me to buy, things I don't need for my own research but that I imagine as valuable for future faculty and students. But knowing what I now know about acquisitions and deaccessioning policies, I realize that if I don't use a book I ordered, it's possible that no one else will--and in five years it could be gone.

So it's hard not to feel that building my personal library is, in fact, a hedge against disaster: not just a compensation for all the things I don't have access to now, but a preemptive move against the further destruction and degradation of whatever libraries I'll be associated with in the future. Straitened acquisitions budgets, deaccessioning, the move to more-easily-stored-but-less-easily-used digital formats, and the decision to warehouse books off-site (in order to turn libraries into student lounges, computer workspaces, or similar) all mean that I can't be sure I'll ever again have the same library experience I had in college or grad school. Ergo, the private library.

I like tending my own garden, and it makes me happy to be able to share it with my students and colleagues. But it's no substitute for those that are open and available to all.

Tuesday, January 20, 2015

A well-wrought urn

I don't want to brag or anything, but not only have I completed the most complicated syllabus of my entire life (a ground-up revision of my graduate Donne class, now structured so it's also a sort of methods class and a sort of review of 20th-century literary studies), but I've written all the assignments, too.

This is something I've never done before. I mean, sure: my syllabi always say what the assignments will consist of--a presentation, a close-reading paper, a research paper, a midterm, whatever--and I have a decent idea what they'll probably entail. But write them? No. Usually I do that at the last possible minute, either when a student asks whether they might be getting the assignment sheet soon or when I happen to glance at the syllabus and realize, shit! that thing is due in two weeks! I need to write it immediatamente!

But because this class is so complicated and the assignments build on each other, involve an interlocking set of skills, will overlap in time, and are largely unlike any assignments I've designed before, I felt I had to come up with detailed instructions now, just so I could get everything clear in my own head and make reasonable decisions about how to schedule their component parts. So with my syllabus doc and four other files all open, I moved back and forth among them, composing, revising, changing due dates, and altering the particulars in innumerable ways. Finally I arrived at a sequence that seems doable and makes sense.

Parts will still fail, I'm sure, and I'll undoubtedly have to make at least medium-sized changes between this instantiation of the class and the next one. But for now it all looks like a perfect and beautiful whole, complete, unshakable and enduring.

Now, if only I'd spent half as much energy on my writing projects. . .