Should the Revolution Have a Vegan Option?

French people don’t give a shit about lifestyle politics.

Okay, it’s a generalization, but—despite my short residence and limited language skills—there’s some truth to it. I saw it on May Day, among the anarchists selling Coca-Cola products and candy bars, and I saw it at a “zero waste” conference where the attendees couldn’t manage to get their paper plates into the recycling bin. But mostly, I see it every time I go to an activist event with food—and, being France, there’s always food—and am confronted by the wholesale lack of anything vegetarian. Even the incredibly low bar of having something that is not pork, in a country with a sizeable Muslim population, is rarely reached.

As someone coming from the Bay—the land of gluten-free, of reusable shopping bags, of farmers’ market parking lots crammed with Priuses—this has been a bit of a jolt. But being in France has challenged me to re-evaluate my politics: in the face of climate change, of billions of animals slaughtered, of global labor exploitation, does the way I live actually matter? I’m pretty sure the chain-smoking, sweatshop-buying French communists I know would say “no.” In fact, in my (usually disastrous) forays into the French language, I’ve learned that talking about “la politique” makes no sense without referencing political parties or the state. In French, “lifestyle politics” is a bit like talking about “non-political politics.”

Much like universal healthcare, what is taken for granted in France is up for debate in the U.S. In fact, hating on lifestyle politics is totally huge on the left right now.* There’s the old Derrick Jensen article, “Forget Shorter Showers,” and the more recent “Stop Worrying About Your Carbon Footprint” that I’ve been sent three times. And—thankfully—sociologists have finally added their ever-weighty opinions to the matter. Exhibit one is Samantha MacBride’s (fantastic) Recycling Reconsidered, which pretty much dismembers the idea that recycling has any impact beyond making us feel good. Her conclusion that we need to “relegate notions of personal commitment and responsibility…to the backburner” pretty much sums up the post-lifestyle zeitgeist. It’s not just that buying local is useless—it’s that it actively detracts from useful things, be it direct action or harassing our Congressperson.

Okay, I get it: the old premise behind my veganism—that each of us could save 95 animals a year, starting today, thanks to the magical power of supply and demand—is bunk. Regardless of what the economists tell us, the world is not an aggregate of individual consumptive choices. But I still want to be vegan, and I’d like a reason a bit more utilitarian than just saying it’s the right thing to do.

Fortunately, even France managed to furnish a justification. I was at an anti-waste event the other week, sitting with a group of dumpster divers on the margins (scoffing at the respectable people in suits and their PowerPoint presentations and, you know, real political program for change), and one of the caterers, on his break, sat next to us. “Do you see how many beer bottles they’re leaving half-full?” he observed. “And they’re telling me I need to waste less?” Whatever we think of lifestyle politics, we have to acknowledge that, even as political actors, we are increasingly judged on personal criteria. We live in an age of heightened scrutiny, and our consistency matters—not because being consistent changes the world in itself, but because inconsistency is an easy excuse to discredit us.

There are other reasons, too, to keep taking shorter showers. We need to acknowledge the profound disempowerment that most people—even privileged people—feel today, and recognize that the one area where people do feel they have some efficacy is in their consumptive choices. If we are serious about the movement-building maxim of “starting where people are at,” we need to acknowledge that most people approach problems in their lives by purchasing something. What’s more, glorying in the irrelevance of personal choices leaves me wondering how many activists actually want to live in the world they claim they’re trying to create through direct action and political organizing. Because guess what: you really are going to have to take shorter showers and eat less meat and buy less shit in our future utopia, even if you don’t see any of these as tactics to get us there.

Having a vegan option isn’t going to precipitate a revolution. But, to do a great injustice to Emma Goldman, it isn’t a revolution if I can’t have something ethical to eat.

- – - – -

* I mean this, of course, referring to the tiny leftist echo chambers in which I exist. It’s kind of like how my punk friends and I thought the Dropkick Murphys were a “huge” band in 2003 because more than two people at my high school had heard of them.

Freudian Shifts

Did you know that going on extremely long vacations and then forgetting about them used to be a socially legitimate—if not exactly socially acceptable—way to go crazy?

Maybe. It’s certainly not an original discovery of mine: it has come from reading Ian Hacking’s brilliant Mad Travelers, which explores the cultural niche, formed from anxiety about vagrancy and romantic celebration of tourism, within which “fugues”—long, aimless, unconscious journeys—flourished in 19th century France. My own version of madness (or maybe it’s just procrastination) has kept me from (re)embarking on any original research of my own. In the meantime, I’m content with methodically consuming the social-scientific literature on mental illness and vaguely imagining a future contribution to it in dissertation form.

In truth, I’ve never been as uncomfortable with social science as I am now. Perhaps that’s because I’ve previously always studied questions that, while important (“Will capitalism survive?” “Will we do anything about climate change?”), did not exactly bear on my day-to-day life. Reading about the “social construction” of mental illness, on the other hand, impinges on my own ongoing interpretation and processing of the last ten years of my life.

Prior to reading Hacking, I worked through Ethan Watters’ Crazy Like Us, which explores the globalization of American psychiatry through processes that range from clueless (telling post-tsunami Sri Lankans that they really must be suffering from PTSD, American-style) to sinister (convincing Japanese people that melancholy was a disease and that only U.S. pharmaceuticals could cure it). His most interesting vignette describes a more inadvertent sort of mental colonization. Anorexia virtually didn’t exist in Hong Kong—despite the long-running penetration of Western advertising and ideals of body type—until a 14-year-old collapsed and died in public. Despite the fact that she had exhibited virtually none of the symptoms that usually come alongside not-eating, such as an obsession with thinness or a fear of fatness, the press widely publicized her death as a case of “anorexia.” Almost overnight, an epidemic of anorexia emerged—not of people claiming to be anorexic, but of people who weren’t before and suddenly were.

The take-away lesson is that every culture opens up certain avenues for expressing distress and shuts down others. This notion worries me a bit, because it fits too easily into the don’t-talk-about-suicide-because-it-gives-people-the-idea-to-kill-themselves narrative (which, frustratingly, seems to have some sociological data to support it). More proximately, I am uncomfortable with this because it challenges the assumption—to which I cling rather dearly—that my own depression is biology, pure and simple, and that it was medication that pulled me out of it. Radical, Foucauldian critiques of psychiatry and “pharmaceuticalization” as forms of social control fit neatly in with my political worldview, but less easily with the fact that I believe—and, in a way, need to believe—that I have been saved by Big Pharma and Western medicine.

This week, I’ve been plodding through Freud. I had read Civilization and Its Discontents during my brief flirtation with anarcho-primitivism in college, but I doubt I would ever have seen reading hundreds of pages about phallic symbols and infantile sexuality as a good use of time had I not taken a year off. Most of what Freud said apparently is wrong—psycho-analysis has been shown to be no more effective than anti-depressants, which is to say, apparently not very effective—and yet I’m finding, just as the sociology of mental health tells me they should, that his theories about the manifestations of mental illness have a way of making themselves true.

When I was really depressed, I didn’t dream at all; sleep was a form of blissful oblivion, a well-earned respite prior to the mornings, which were inevitably the worst (the fact that I had read beforehand that depressed people don’t dream and feel worst in the mornings is, perhaps, not coincidental). As my waking life has improved, my nocturnal existence has deteriorated: I have nightmares almost every night. I usually don’t remember much from them, except that they are—surprise!—about being depressed again.

And then came this week, when I read Freud’s Interpretation of Dreams. All of a sudden, my dreams are full of symbols—of hints of repression and sublimation and transference, a horde of thoughts that day-to-day existence as a sane person requires I keep at bay. And the weird thing is, I’m convinced that it’s not just that I’m noticing these symbols now: I really think that my dreams were fairly meaningless before, and have now become meaningful, just as I have read that they should. And so I am reminded that studying oneself is a never-ending mindfuck, and that maybe it’d be more straightforward to crack capitalism than to crack my own brain.


Going back was the best of times, going back was the worst of times

Perhaps because the novelty—by which I mean an alcohol-accentuated tincture of horror and awe—has worn off, I’m not coming away from my fifth reunion with the same crazed list of stories as I had after, say, my Freshman year. There were no drunken alumni saving me from arrest at the hands of Mohawk-profiling P-safe officers; no rambling stories from Bill Fortenbaugh ’58 about the hookers we could expect at his 70th birthday party; no thieving of giant inflatable monkeys from the 35th (I’m still unclear about how that one happened).

Still, I think I “did” reunions pretty well. I went through the P-Rade with the band no less than three times and felt like I played my heart out, despite dancing too energetically to read the music for songs I had never played before. I ran into my thesis adviser while heavily inebriated on Poe Field. I managed a temporary coup d’état and convinced the percussion section to start “Children of Sanchez” for the umpteenth time. I swam in the fountain, got a 4:00 a.m. “Eggplant Parm without the Parm” from Hoagie Haven, and stayed up for a reunions sunrise (a first!). And my antics in the band office led one undergraduate officer—perhaps not realizing how much I would treasure the comment—to say that I really was the “hot mess” of band lore.

I list stories and antics and happenings because I always hope that, by adding them up, they will sum to three days of consistent and straightforward happiness. And, for most people, it seems like they do: my Facebook feed has been dominated for days by comments about the “best damn place of all” and the sheer joy of revisiting our alma mater. I imagine there’s a certain amount of posturing in that, but I more-or-less believe the sentiments are genuine. I wish I shared them, though.

Somewhere between the moments of blasting away on trumpet and catching up with my best friend on the deck of Terrace, there were what seemed like interminable periods of wandering around alone at the 5th, avoiding eye contact and fearing conversation. I hadn’t initially expected to spend the entire weekend with the band—not even most band alums do that—but then I realized that the alternative was walking around campus by myself, not sure if I did or didn’t want anyone to see me. It’s not that I’m not incredibly fortunate to have great friends from my class: only that interacting with them, with the attendant sense of “losing” them again as soon as the weekend was over, was hard for me to bear.

Depression is, in so many ways, all about struggling with your past. For some, it’s past trauma. For me, it’s an idealized sense of past happiness that I alternate between desperately wanting to relive—not in the “telling stories with old friends” sense, more in the “build a time machine” sense—and wanting to wipe from my mind. When I walk around Princeton, I’m not sad because I see the room where I used to cut myself, the health center where I had to check myself in Freshman year, or the street where my roommate had to pull me away from oncoming traffic. No: I’m sad because I’m constantly thinking about the sense of wonder and meaning and community that I had there and yet never really managed to appreciate and which, at Berkeley, seems so impossibly out of reach.

Being me, I told myself this was my last reunion. Not in the sense that it’ll actually be my last, but the last where I feel like I can actually have conversations with undergraduates, play with the band, or dance drunkenly until 4 a.m. It also feels like my last because I’ve chosen to make coming back a logistical absurdity, whether I’m in France or California or England or anywhere else. I feel jealous of the people who can maintain a connection to Princeton after they graduate, and I frequently fantasize about coming back for a road trip or two each football season, but I’ve realized that I burn my bridges with the past every two years because I probably couldn’t get by any other way.

For me, at least, there’s wisdom that comes from the experience, and not just angst, which makes writing about it on my 27th birthday seem less pathetic and more edifying. When I first started to recover, I followed a pretty rigidly Benthamite pleasure-maximizing strategy, avoiding anything that might make me feel bad. Now that I know that I can break down a bit without falling off the deep end, though, I am realizing that depression can be part of the normal flow of experience—that it’s okay to go back and laugh and dance like an idiot and play trumpet and bask in the warmth of good friends and, yes, cry a little bit.


I, Too

In retrospect, I’m lucky: I actually got called out for the most racist thing I did in college.

I’ve noted passing references to the “I, too” campaigns at various universities cropping up on my Facebook feed, but I only took the time to fully absorb the images when I saw a tumblr for “I, too, am Princeton.” The frustrations are the same as I’ve seen elsewhere: of being called upon to speak for an entire race, of being assumed to have gained entrance thanks to athletics or affirmative action, of having the validity of one’s experiences of racial discrimination constantly questioned.

They hit harder coming from Princeton students, though. I wonder—and I hope every white alumnus or alumna of Princeton is wondering—if I was one of those racist shitheads pontificating in seminar, mouthing off on the street, or whispering at a table in Frist. I don’t know if I ever said any of the things quoted on the “I, too” whiteboards. In a sense, that’s what it means to be white: to live in a maelstrom of racism to which you contribute without even being aware of it. White is when you wait until some people set up a tumblr five years after you graduate to reflect, “Shit, whether or not I said that, I probably heard someone say it and didn’t do a damn thing about it.”

Then again, I actually know I did some terribly racist shit in college, because someone told me. I count myself lucky for it because so many micro- and macro-aggressions pass unmarked and unacknowledged. My junior year, the Princeton Animal Welfare Society—of which I was the Vice-President—brought a PETA campaign called the “Animal Liberation Project” to campus. The project compared the justifications for abusing animals—generally, a variant of, “They’re different from us, so we can do whatever we want to them”—to the justifications for abusing humans—generally, a variant of, “They’re different from us, so we can do whatever we want to them.” There were pictures of animals and humans—dark-skinned humans, usually—to drive the point home.

Just writing the sentences above, with six years of hindsight, is cringe-worthy. But it gets worse. I had a sense that the campaign was going to spark some controversy—particularly within the African-American community—and so I reached out to every black student group I could find. I penned an explanatory editorial, entitled “Slaves and Slaughterhouses” (yes, really), and organized a panel to discuss the demonstration that included one (“1”) black woman, one (“1”) Indian woman, and two (“2”) white males.

Given the extraordinary sensitivity I had shown, I was shocked (shocked!) when no one attended the panel, the comments section of the editorial lit up with anger, and black student groups rejected my invitation to have a reasoned, dispassionate discussion and instead sent an e-mail to their membership denouncing what we were doing—and me, personally—as racist. I fought back, articulating as logically as I could why this really wasn’t offensive (something I have done subsequently, as well). I claimed that if only students recognized their own prejudices, they really wouldn’t mind the comparisons. “I’m not racist, but you are speciesist.” Numerous friends of color didn’t talk to me for months afterward.

There are various hackneyed lessons that I learned—eventually—from the experience. They are banal and should not have required denigrating hundreds of my peers to arrive at them. I realized that, as a white person, it’s really not my place to “debate” whether or not something is racist. I also discovered that gestures of conciliation for racist actions don’t make those actions less racist. Writing about this experience is hard because there’s always an element of cleansing one’s guilt for past actions, when in fact those actions should remain raw so as better to shape one’s own behavior in the future. These are lessons I’m still learning, but I don’t expect anyone to be “understanding” in the meantime when I fuck up again.

The first draft of this post contained a list of college-era racial misdeeds—half the jokes I participated in while in the band come to mind—and mitigating factors—I took classes on race! And I helped organize events around incarceration and immigrants’ rights! But sometimes our actions really shouldn’t be judged in context. One of the most mind-blowing things I’ve ever read on race is Sam Lucas’ Theorizing Discrimination in an Era of Contested Prejudice. He notes that most black people experience pervasive racial discrimination, but most white people claim not to be racist. As it turns out, the two claims are not incompatible. Even if we (generously) assume that only 5% of cops racially profile, or 5% of teachers think black students are inferior, the chance that a black individual will encounter a racist cop or teacher in their lifetime is extremely high. And, really, you only need one cop to stop you for your skin color to think the system is pretty fucking fucked.
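To make the arithmetic behind Lucas’ point concrete, here’s a minimal sketch; the 5% figure is the hypothetical from the paragraph above, and the encounter counts are invented for illustration, not drawn from his book:

    # Back-of-the-envelope illustration: even if only 5% of cops or teachers
    # are biased, the odds of encountering at least one grow fast. The
    # encounter counts below are made up for the example.
    def p_at_least_one(p_biased, n_encounters):
        """Chance of meeting at least one biased cop/teacher in n encounters."""
        return 1 - (1 - p_biased) ** n_encounters

    for n in (10, 50, 100):
        print(f"{n:>3} encounters: {p_at_least_one(0.05, n):.0%}")
    # ->  10 encounters: 40%
    # ->  50 encounters: 92%
    # -> 100 encounters: 99%

In other words, a system in which 95% of individual actors are “not racist” still delivers near-certain discrimination over a lifetime of encounters, which is exactly why both claims can be true at once.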

The same applies to racist actions, not just racist people. You can do only one racist thing in four years (I’m sure I did more, but for the sake of argument…) and still make a substantial contribution to a campus climate of oppression. In the end, what I’m trying to get at is that I, too, was Princeton, maybe too much so, because I, too, was part of the problem.

Free

Should a book on freegans—that is to say, people who try to live for “free” in the present through appropriating capitalism’s waste, while trying to build a future in which the things people need are provided for “free” through a gift economy—be free?

This is a purely academic question. My “book” on freegans—I’m going to call it that, even though at this point it’s just a really, really long Word or PDF document, for which this blog post is a shameless plug—is already free. Even were it to be picked up by a real live academic publisher, I still have no doubt that it would quickly be scanned and shared online, and I would make no effort to stop it.

Despite the fact that reality has gotten ahead of philosophy, I still feel like I increasingly need to think through my position on the question of “free.” I feel it both in general—with advocates for open access at my own university suggesting that publishing in pay-for-access journals is just dumb—and personally—as a number of voices have told me they assume that I would never try to sell a book on freegans. I’m thus starting to wonder about what it means that—as someone who expects his life’s work to consist mostly of reformatted Word documents—everything I produce is ultimately going to be free.

*          *         *

I should start by saying that the arguments for making academic products free to the public are, I think, particularly strong. We still have well-heeled institutions (universities and federal research councils) that are willing to pay (some of) us to produce knowledge and to contribute to journals as editors and reviewers. The open access advocates’ strongest argument (ironically made at the same time as we hear slogans like “information wants to be free”) is that we’ve already paid for research with our tax dollars, so we shouldn’t have to pay for it again. I’m excited about experiments like Sociological Science, the new open-access sociological journal, not so much because I’m sure their model is the wave of the future (author fees for graduate students are still scary-high) but because I believe that experimentation is the only way to find out.

But this is emphatically not the position that most producers of cultural goods—musicians, artists, or authors—are in. The other week, I read a New York Times editorial by Jeremy Rifkin, who rosily declared that “The inherent dynamism of competitive markets is bringing costs so far down that many goods and services are becoming nearly free, abundant, and no longer subject to market forces.” The “marginal cost revolution” (about which he is selling a book) has a fairly simple source. There is now a Napster for virtually anything that you can copy on a computer, and because copying a file doesn’t cost money, books, movies, and music can now be “free.”

I suppose I was most annoyed by Rifkin’s editorial because it conflated the “rise of free” with the “rise of anti-capitalism.” When I’ve been told that I really ought to make my book or anything else I write “free,” it’s usually couched in the assumption that “free” and “capitalism” are opposed to one another. But there is nothing inherently anti-capitalist about getting something for free. In fact, the “free” labor of the worker—that is, time spent producing things of value for which the worker is not commensurately paid—is at the root of all profits in a capitalist system.

So long as the things we need to survive—and I’m not talking about books on freegans here, although I do think my book is valuable, but food and housing and all that—are commodified and must be purchased, being told the things you produce are “free” is just another way of saying you are being exploited. And, unlike for academics, we don’t have any sort of public provisioning for the majority of cultural producers, and as such, for most of them, discovering that their products have “zero marginal cost” is not exactly a happy revelation.

And, of course, even as an academic, “free” sounds increasingly scary. When legislators see that students can now access Massive Open Online Courses for “free” (at least for the moment), it sounds like a great argument for further defunding public education. And when graduate students are expected to add more students to their sections without an increase in pay—an experience virtually any GSI at Berkeley can recount—they’re working for “free.” And I can’t help but think that the logical consequence of telling us that the books we write will be “free” is that eventually universities will feel they no longer have any obligation to pay us to produce them.

*          *          *

Admittedly, this is all a bit of a straw-person argument. For most of the activists I know—and, especially, the freegans—“free” has a very different meaning. It has nothing to do with price or with the “marginal costs” of production. As I came to understand it, “free” meant that some things are too valuable to have a price—whether necessities like food and shelter or public goods like transportation, the arts, or knowledge. Sure, there were always dumpster divers who thought that wasted food was “free,” but the wiser freegans I knew always recognized that these things had a cost—in human labor or natural resources—that was real. “Free” was, in effect, a way of recognizing that all things have a cost, albeit one that is often poorly captured by “price.”

I’m not against “free.” I’ve read enough anthropology to know that gift economies in which goods and services are shared freely are not a utopia, but a part of the human historical experience and an honest possibility for the future. It’s more an issue of timing, or, you might say, a collective action problem. I’m reluctant to say it’s fine for someone to have free access to everything I produce until I have free access to what others produce. It makes very little sense for some types of things to be “free” while others are commoditized. And, frankly, I’m far more concerned about “freeing” things that do have a marginal cost, like food or shelter. I don’t want to sound like those old commercials that said, “You wouldn’t steal a car—Piracy is not a victimless crime”; just that I’d like to be able to steal dinner along with my DVDs.

I didn’t find my book in a dumpster. It’s taken time and money and effort and love. Writing it has involved a great many lost opportunities and missed chances. It’s been made possible by the generosity of a host of people and institutions too numerous to name. But don’t worry, I’m not some dirty capitalist or Luddite who has yet to get on the digital freedom bus. My book is “free.”

Fool Me Twice

I had only been going into New York to hang out with freegan.info for a month when I featured in my first media story. A Dutch journalist was quite taken with the incongruity of a Princeton University student going through the garbage, and despite my protestations that there were way more knowledgeable people in the group for him to talk to, I wound up featuring in his article. In fact, I was the main character: the first sentence, according to a friend, approximately read, “Alex Barnard is wading through shit on the streets of New York City.”

Most reporters who visited freegan.info trash tours, though, were way more interested in the others involved in the group—the people who had organized their entire lives around freeganism. I was always in awe of how good the spokespeople for freegan.info were. Night after night, I saw them turn an aberrant activity like dumpster diving into a common-sense response to waste, and spin our abhorrence of people in the garbage into a surprisingly relatable critique of capitalism. I’ve been involved in a couple of different social movements now and, when it came to manipulating the media, freegan.info was good.

Then again, I never actually bothered to look at the stories that were getting published. After all, I figured, I had a front-row seat to freeganism. Whatever the media was showing had to just be a dimmer version of what I was seeing. It’s only been lately, in writing up my book, that I’ve gone back and looked at some of what was published in the halcyon days of 2007 to 2009.

It’s kind of hard to believe that the TV spots and newspaper articles are really about the same people that I spent two years with. There’s the ABC report with the “Psycho” sound effects when Cindy opens a trash bag; the Wall Street Journal reporter who cuts off Janet’s discussion of waste to say “I’m interested in the eating for free angle”; the blathering quotes from public health officials about food safety and the fawning praise for stores donating a trivial amount of food to charity. Sometimes I wonder: were they really there?

There’s something so seductive about the media, especially to anyone who’s used to seeing their views ignored by it. For what it’s worth, sans media attention to freeganism, food waste would never have become the “issue” it is today. And, because of this, there’s a certain persistent faith that if we just do a better job of “slipping in the message”, we’ll fool the corporate behemoths into turning the airwaves into a conduit of anti-capitalist propaganda.

At least, that’s what I tell myself. In a fit of arrogance, I’ve been doing media work again. I was lured by the promise of a long, investigative piece about food waste, of which dumpster diving was only to be a small part. I took the reporters on a dive and to a public re-distribution of food; I talked about over-production and commoditization; I argued that stores threw things out not because they were careless or negligent, but because wasting is profitable in a capitalist system. I told them that the issue wasn’t my lifestyle or my carbon footprint; that I didn’t expect the whole world to start dumpster diving; that I recognized the privilege that allowed me to engage in the act.

The piece aired a few weeks ago and, as these things go, it wasn’t bad. The reporters traveled to a town that mandated food donations; they interviewed distributors, managers, and activists; they played down the safety concerns around food waste. The part where I featured, though, was painful. I declared myself an “activist” against the “system”, but they cut out any explanation of what the system was or how what I was doing might change it. I spouted some platitudes about how great the food in the dumpster was, before launching into an (edited out) explanation of how it got there. As far as anyone watching is concerned, I was the guy who eats garbage.

Todd Gitlin writes that Students for a Democratic Society activists in the 1960s were alienated from their own representations, media products which “stood outside their ostensible makers…confronting them as an alien force.” I know that guy on TV, but I’m definitely not that stupid.

The Normal One

A few weeks ago, I spilled my coffee at the breakfast table three days in a row. Someone suggested that maybe I was guzzling too much caffeine, and I replied that, no, I’d been on an unhealthy-grad-student level of coffee consumption for some time. Curious, though, I went off it for a few days, but it didn’t change what I had first noticed this winter while wrapping Christmas presents: my fine motor skills are gone. That, and a partial erasure of my short-term memory (as well as the chunk of change I’m paying pharmaceutical companies for the privilege of both), is the bargain-basement price of happiness, for now.

But I’m not the one whose hands are supposed to shake. When I was little, I assumed the normalcy of elements of my relationship with my brother that now, looking back, I realize were distinctly formative. That my friends would be my brother’s friends (and that he would always be the “bad guy” with us); that, when we traveled, I would be entrusted with the plane tickets, despite being four years younger; that my brother would also always be taken out of the “normal” classroom, but for different reasons. Yet, for some reason, I mostly just remember that my brother had really messy handwriting and a shaky grip. It’s weird what kids notice.

My parents let me in gradually. I first remember a matter-of-fact explanation that my brother would not—following the assumed upper-middle-class pattern—go to college. But I was mostly shielded, and I hid myself, locking myself in my room any time there was shouting. It wasn’t until 9th grade, when my brother really fell apart, that I recall hearing the word “bipolar” and only later, with its growing popularity, “autism”. Such strange and inexplicable demons make everyone feel impotent, and I was no exception. The best I could contribute was to sleep outside his door a few times when he was manic, in the hope that he’d wake me up and not my parents.

Oh, and there’s one other thing I thought I could do: achieve. Relentlessly. It’s probably not a coincidence that high school was when I went into arrogant I’m-going-to-be-a-Senator-after-I-go-to-Princeton (I literally put this in the yearbook) overdrive. It wasn’t great timing, since shortly after I became acutely aware of my brother’s limitations, I had an unexpected and novel confrontation with my own. But even as it imposed itself on me, depression was not something I allowed myself. I was the normal one. Or maybe more than that: I was the one who was compensating, the one who was succeeding for two. Having a disabled brother was lumped in with other reminders of my “privilege” that served as good fodder for admissions essays and self-serving save-the-world fervor.

I don’t pretend I will ever understand the challenges my brother faces. I will only say that certain experiences have made me more or less empathetic towards them. I’m afraid I’ve tended towards the latter, which is why my most recent “episode”—our well-worn family parlance for mental illness—was in a way a good thing. Mental illness, I’ve realized, doesn’t fit well into my usual worldview. There’s no zero-sum class war; no structure or power to overthrow; no “privilege” to be negated and redistributed. There’s no one to be angry at except god, and in a sense, the very randomness of it all feels like an argument against him/her anyway.

As I was melting down last summer, a psychiatrist threw out a term I’m so ashamed of that I feel the need to unload the burden publicly. He said I suffered from “survivor guilt”. But I am no survivor. My brother lives a life full of vitality and meaning and community, things—despite the unfair apportionment of certain skills and capacities between us—that I’ve at times been sorely lacking. There is so much absurdity to placing lives on a continuum, to thinking there’s any measure by which one can ‘make up’ for another, or even that worth can be measured anyway. No one exists to be the subject of a college essay or an inspiration or a reminder of privilege to others or the stuff of a hackneyed blog-post conclusion. We just exist. And some of our hands shake.

Left-Wing Think Tank “United States Department of Agriculture” Concludes Capitalism Is The Cause of Food Waste

Nicholas Kristof has recently informed me that most of what I do is relatively useless, and that the only solution is to blog more (tweets are cool too). Ever since he opened my eyes to the fact that half of the world’s population has two X chromosomes, I’ve hung on his every word, so here goes.

That said, although he thinks sociologists are irrelevant because of their left-leaning (reality-leaning?) biases, I believe I can make a contribution through a somewhat different tactic. In particular: you wouldn’t know it (because of all the jargon we use!), but social movement scholars have identified a “radical flank effect” by which reformist, mainstream movements are helped by lunatics on the fringes who say crazy shit and thus make the aforementioned movements seem less threatening and therefore more likely to win concessions.

A few weeks ago, the USDA released a major new study quantifying “food waste” (well, technically “food loss”*) in America. It’s the first since 1997, which suggests that the issue is gaining some momentum, or that I’m deluding myself into thinking that other people care about the things I care about. Over at “Wasted Food”, Jonathan Bloom—the U.S.’s leading public intellectual on this issue—has some well-reasoned analysis. In the spirit of “radical flank effects”, though, I’m going to drop some completely unpalatable and politically DOA thoughts in the hope that they will help the well-reasoned efforts of others to move forward. Somehow.

What the Report Says

  • About 31% of the food available at the retail level doesn’t get eaten, which totals to some batshit-high figure like 133 billion pounds per year, or 429 pounds per person (a quick sanity check of these numbers follows this list). The important thing to note here is that this is readily acknowledged as a massive underestimate. It ignores crops that never get harvested because of low prices (~10% by some estimates), produce culled for aesthetic or cosmetic reasons (up to 50% depending on the product), and losses in processing or manufacturing (which, as documented by Tristram Stuart, are both huge and deliberately imposed on processors by powerful supermarket chains). It’s also an underestimate even within the report’s authors’ own ambit, since, as they note, their numbers suggest that more food gets eaten than is humanly possible (obesity epidemic notwithstanding). And it doesn’t include food that could feed humans, but which we instead feed to animals, which in turn convert it into a smaller number of calories of meat. You can debate whether this constitutes “waste”, but insofar as the food system exists to feed people (it doesn’t, really), it’s not a particularly efficient way of doing it, so it’s waste in my book, er, blog.
  • The food losses that get counted in the report sum up to about 141 trillion calories per year. This is a fun and media-friendly figure because it’s unfathomably large and implies something about hungry people in Africa. It’s also really, really meaningless. If you look at calories, about half of the “food” we “lose” consists of added fats and sweeteners, which raises some questions about the meaning of “loss” and, dare I say, “food”. Moreover, it perpetuates the myth that the solution to food waste and hunger is to have someone standing by the bin / dumpster / household trash receptacle capturing whatever is left and giving it to the homeless person down the street. It’s a good way to get kids to eat leftovers but, as I learned at the food bank, the relationship between what gets thrown out and what is needed is a weak one, and moving calories around is not the way you address hunger.
  • The total “value” of food waste is $161.6 billion. Of course, the “cost” of food waste is best measured in lost water, land, or labor. But even if we decide to attach a dollar figure to waste, we need to really ask ourselves who, exactly, bears the “cost” and why exactly it counts as a “cost” in the first place. As I’ve ranted previously, it’s no skin off Monsanto’s back if the seeds it sells don’t actually grow food to feed people. And it’s great news for farmers if distributors are purchasing 3,796 kcal/day from them, even if the average person (factoring in the elderly and children) only needs 1,900. And, to offer my favorite example (I think I’m showing my class background here…), grocery stores love it that you have to buy a big-ass bunch of cilantro that you can’t possibly use, because they can sell it for more than a small-ass bunch of cilantro. As far as I can tell, waste keeps the dollars flowing and the economy humming. If that’s what you care about, throwing food out is not much of a waste at all.
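The report’s headline figures do at least hang together arithmetically. Here’s a minimal sanity check; the ~310 million U.S. population figure is my assumption (the report covers 2010 data), not something stated in the report itself:

    # Sanity-checking the USDA report's per-capita figures. The population
    # number is my assumption, not the report's.
    POUNDS_LOST = 133e9      # lbs/year lost at the retail and consumer levels
    CALORIES_LOST = 141e12   # kcal/year
    DOLLARS_LOST = 161.6e9   # USD/year
    POPULATION = 310e6       # rough U.S. population circa 2010

    print(f"{POUNDS_LOST / POPULATION:.0f} lbs per person per year")             # -> 429
    print(f"{CALORIES_LOST / (POPULATION * 365):,.0f} kcal per person per day")  # -> 1,246
    print(f"${DOLLARS_LOST / POPULATION:.0f} per person per year")               # -> $521

The middle line is the one worth staring at: taken at face value, we “lose” roughly 1,246 kcal per person per day, not far off the ~1,900 kcal the average person actually needs.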

The Unhappy Conclusion

At this point, I’m fairly used to meaningless platitudes about how reducing food waste is a quick fix to the global food system. To its credit, the USDA report has a healthily realistic take on possibilities for major reductions in food waste. Quoting an older report from the General Accounting Office, they observe:

From a business standpoint, the value of food product saved for human use should be equal to, or greater than, the cost of saving it. To the extent that the costs exceed value, good business judgment dictates that the loss is an acceptable cost. In the course of preparing this report, no material has been found that would indicate that opportunities were knowingly overlooked by business owners to conserve food at an acceptable cost. The profit motive should dictate against such loss.

The long and the short of it: food waste happens because a business model that involves wasting food (through cosmetic standards, pre-packaged perishables, and rampant overproduction to avoid missing any sales) is more profitable than one that doesn’t. Capitalist firms waste food because they are doing their job: creating “value” not in the form of meals or satisfied stomachs, but of shareholder returns.

Bloom and his U.K. counterpart Tristram Stuart both write that reducing food waste is a “triple bottom line” solution that can feed people, protect the environment, and raise profits. But they need to give capitalism a bit more credit. If there were money to be made from reducing food waste, the thousands upon thousands of managers, engineers, and technicians whose livelihoods depend on squeezing every possible penny out of our food system would have found it long ago.

An Attempt to be Constructive

Okay, so revolution or nothing, right? That’d be a cheap way to end this post, so I’ll make an effort to be a bit less nihilistic. Even if we accept that “capitalism” is going to be our economic model for the foreseeable future, we can still acknowledge that solutions to problems within that system can come from outside of it. That is to say, the market doesn’t fix itself: people organized into networks, organizations, and movements do. So, since I’ve never bothered articulating what I actually think should be done about food waste, I’ll take a quick stab at a program that’s serious about reducing food waste without turning it into a band-aid that distracts us from the myriad other problems with our food system:

  • Reform Agricultural Subsidies. This one is obvious but not really being discussed by anyone talking about food waste. Crop insurance programs, as they currently exist, allow farmers to plant with almost zero financial risk without any regard for the market for the food-like substances they produce. Given that, despite our dire financial straits, we’ve somehow found $240 billion to spend mostly on subsidizing corn and soy to feed cows and displace Mexican peasants in the last decade, it seems completely reasonable that we could use subsidies to make organic, local, and ethical food cheap and available. More localized systems could go a long way in reducing waste.
  • Introduce Painfully High Landfilling Taxes. The E.U. has already done this, and many are crediting the E.U. landfilling directive with sparking new interest in food waste reduction initiatives and donations. And hey, maybe as an aftereffect, it would discourage stores from marching along with the worldwide trend towards locking and/or poisoning their dumpsters, since scavengers are—in the end—only leaving them a bit lighter. But, crucially, landfilling taxes have to be coupled with bans on shoving food waste onto others—like, for example, our food bank, which threw out Walmart’s surfeit of cakes for them.
  • De-Commodify Food. Food is a stupid thing to treat as a commodity. Demand for it is inelastic (you can only eat so much of it) and you can’t substitute other goods for it (because, well, you die if you don’t have it). So it doesn’t really work in a growth-based, capitalist economic model, unless you find other dumb things to do with ever-increasing food production, like converting it into bio-fuels or introducing “anaerobic digestion” (which creates demand for food waste, which is also, well, stupid). This is fairly off the deep end politically, but it’s not utopian: as the great E.P. Thompson documents, even in early capitalist England, people still saw food suppliers (mostly bakers) as public servants who worked for a fair allowance, not a profit. Our notion of food as a commodity is uniquely modern, utterly moronic, and the root of contemporary food waste. With all the talk on the left about a “universal basic income”, maybe we should start with “universal food stamps”?

Note the absence of calls for greater consumer awareness, “voting with your dollar”, and/or learning to eat your leftovers. These sorts of reforms—the targets of most campaigns—are good, but when you look at the size and power of the interests behind food waste, relying on individual, atomized consumers to change things is bringing a spork to a gunfight.

- – - – -

* The difference stems from whether you include “loss” from shrinkage, pests, peels, etc. versus “waste”, which goes unconsumed because of human action.

Happy, At Any Cost

Dates like these are always kind of arbitrary, but it’s been three months. Three months since I was last crumpled up on my parents’ couch, since I cried for no reason at all, since I could speak of “being depressed” in both a present and seemingly eternal tense. Three months, and it already seems so foreign, so distant, and so unfathomable that I sometimes wonder if I really was the same person. Unfortunately, I was, though I really, really wish I weren’t.

I was fifteen when I first realized that I was just sort of automatically sadder than most of the people around me. I told myself it was a good thing. I could focus on changing the world; giving myself to others; martyring myself for the cause, without being distracted by the petty business of actually enjoying life. In a weird way, I think being depressed made me a better person. I turned outwards for the first time in my life; I became more empathetic; having been forced to acknowledge my own imperfections, it became easier to accept those of others.

But there is a certain baseline level of happiness—a requisite amount of non-misery—that I’ve learned is necessary, even for being self-less. And so, this time, when I realized I was way, way sadder than the people around me, I replaced self-denial with a desperate sort of hedonism. For the last year, I’ve done whatever seemed likely to make me less-unhappy at the time, and figured that the consequences for others—who couldn’t possibly know what I was going through, after all—could be ignored. And so while I firmly believe mild depression made me a better person, I’m gradually realizing that major depression made me a far worse one.

Weirdly enough, when I was at my worst, the feelings of self-loathing and low self-esteem that have been my traveling companions since adolescence disappeared. Depression crowded them out; I felt so shitty that even my brain—trained to explain bad feelings as well-deserved punishment—couldn’t come close to rationalizing them. Stranger still, now that I’m feeling better, all these feelings are back. And, paradoxically, precisely those things that seem to have made me better have made me feel even less deserving of the happiness I have.

Maybe it’s a reflection of the particularly pharmaceutical nature of my recovery, but so far, I’ve experienced happiness as an absence: being happy is not being miserable, not being incapacitated, not feeling hopeless all the time, not being terrified of going to bed because you know that you’ll wake up in the morning. And, to some extent, that emptiness has been enough to get back to my life. But if you had told me three months ago how hollow happiness could feel, I never would have believed it.

Gleaning the Gleanings

As much as I like to think dumpster diving is in some ways inherently political, there are times when the whole thing can feel incredibly self-involved. And so, in the perpetually problematic desire to “give back”, I’ve been volunteering in food redistribution (again).

I like this charity more than many, even though it’s a charity and not a “movement”, because it continues a long European tradition—gleaning—and provides food that is actually healthy. Every Sunday, the “Gleaner’s Tent” takes the leftover produce from one open-air market in the 19th arrondissement and distributes it to an eclectic group of punks, retirees, and immigrants.

There’s one step I left out, though. After we get the food from the distributors, we sort it. The head of the tent is proud that the food we give out is (almost) as good as the food people are buying a few meters away. But it doesn’t come that way when we ask suppliers for their leftovers. On Sunday, we had a hyper-abundance of mangoes (hey, it’s better than cake), and I was assigned to cull the good from the not-so-good. And so I did, chucking the truly sorry-looking and inedible fruits into a rapidly-filling organic compost bin behind me.

When I thought I was nearly done, another volunteer—a migrant from West Africa—looked somewhat bemusedly at my work. She clearly knew more about mangoes than I did, and began grabbing fruits that I thought had made the cut. A split-second of contemplation determined that two-thirds of them were unfit for human consumption, and they joined the rest in the bin. I didn’t know what to think. There were hungry people, and we didn’t have nearly enough gleanings to feed them all.

When the line finally started moving, though, I had a better understanding. Just like at the food bank, people—that is, hungry and poor people—did not just take what they were offered. They reached for the brightest, the biggest, and the freshest, and haggled and traded to get something better than what we pushed onto them. There was a lot left over—so much, in fact, that I wound up gleaning the gleanings, reaching into that compost bin and taking a half-dozen mangoes that I had been convinced someone would want but which had been left behind.

I’ve been thinking a lot about questions of “value” as they relate to waste. Originally, like a good Marxist, I concluded that we waste because, under capitalism, food is a commodity valued based on its capacity to be exchanged, not its ability to be used. I’m ready to concede that this was a jejune and simplistic point. Sure, maybe we waste food because we don’t “value” labor, animals, the environment, or nutrition. But we also waste it because of what we do value: taste, appearance, convenience, abundance. Waste starts to seem more intractable when you look at it that way, as a “positive effort to organize the environment”, as anthropologist Mary Douglas puts it.

The mangoes were edible. They were probably even nutritious. But they tasted pretty bad. And maybe it’s only from a position of privilege that eating the crappy leftovers seems like a good idea.