The Death of the Critic
The fall 2011 edition of Dissent magazine brings us the latest in a long, storied line of complaints from the gatekeepers of critical taste. Author Charles Taylor writes:
I’ve been a film critic on and off for twenty-five years and have been lucky enough to take part in the tail end of the best era of print film criticism and the beginning of the Internet, when it seemed like the Web would be the new delivery system for the kind of writing that was starting to be imperiled in print. My experience tells me that not only was film criticism in better shape in the print era, but good work stood a greater chance of making an impact. Only a fool would say that there’s not good work being done on the Internet. But the nature of the medium, the way it has reshaped journalism and public discourse, makes it harder for that work to matter. In its contribution to the ongoing disposability of our cultural, political, and social life, in encouraging the cultural segregation that currently disfigures democracy, the Internet has to bear a great deal of responsibility for the present derangement of American life.
Now to be sure, the ability to close off any kind of opposing viewpoint is almost certainly a factor in the increasingly bitter political divide that threatens to rip the country apart. But it’s hard to argue that if the majority of people prefer The Green Lantern or Pirates of the Caribbean to The King’s Speech—to use some of his own examples—this somehow represents the defeat of democracy. After all, the internet makes possible the dissemination of movie information far beyond what was possible in the print era, granting instant access not just to established publications but to countless blogs and websites as well.
Taylor, however, is not convinced. He complains that many of these new writers lack sufficient rigor or boast exclusionary writing styles; worse, they are helping to create an unwieldy volume of criticism, which drowns out individual critical voices and allows a deluge of publicity-driven, lowest-common-denominator fare to overwhelm the market. From all this, he foresees “the probable death of movies as popular art, and the retreat of serious critics into contemplation cells.”
This is an ominous prediction, but fortunately, Taylor is only half right. For the internet will not result in the death of movies as popular art. It is, however, bringing about the death of the professional critic.
“There is currently no must-read critic,” Taylor laments. “No Pauline Kael or Andrew Sarris whose opinion can kick off a conversation or an argument…Part of the problem is the thing often cited to prove the strength of film criticism: the sheer number of people online who are doing it.”
The idea that the lack of a dominant critical force leads to a decline in the art form being criticized is not a new one. Ten years ago, in a now-famous Atlantic piece, B. R. Myers wrote:
The absence of a dominant school of criticism, we are told, has given rise to an extraordinary variety of styles, a smorgasbord with something for every palate…From a reader's standpoint, however, “variety” is the last word that comes to mind, and more appears to be “out” than ever before.
He then went on to explain how insightful, well-written prose has all but disappeared from the contemporary literary scene, from Annie Proulx to Cormac McCarthy to Don DeLillo to virtually every other highly acclaimed writer of the last thirty years. In the absence of some overarching high-culture tastemaker, it seems, writers have devolved into “a remarkably crude form of affectation: a prose so repetitive, so elementary in its syntax, and so numbing in its overuse of wordplay that it often demands less concentration than the average ‘genre’ novel.”
Myers believed the current critical establishment was partly to blame. “The critics’ admiration…reflects a growing consensus that the best prose is that which yields the greatest number of standout sentences, regardless of whether or not they fit the context,” he wrote. Regarding DeLillo’s White Noise: “The novel’s inflated reputation remains a clear signal that we should expect less from contemporary fiction than from books written in our grandparents' day.”
Myers doesn’t state it explicitly, but from his initial stab concerning “the absence of a dominant school of criticism,” it seems he views the decline of literature as a consequence of the decline of literary criticism, as though writers’ artistic standards have deteriorated without the guidance of a higher school of critical thought. Critics, meanwhile, either are making the best of a bad situation or have let their standards decline along with the material they report on. Neither explanation speaks well of the critic.
In predicting “the death of movies as popular art,” Taylor seems to believe a similar outcome is taking place in the world of film. Unlike Myers, however, he traces the source of the trouble to the advent of the internet. In doing so, he throws in his lot with other internet decriers like Andrew Keen and Jaron Lanier, two critics Rob Horning analyzed recently over at The New Inquiry.
Lanier—author of You Are Not a Gadget, a book that Taylor describes as “essential”—argues that the internet alters and ultimately limits the possibilities not only of cultural creation, but of existence in general. Taylor explicitly turns to him to bolster his thesis that the internet is destroying democracy, citing the effects of anonymous commenting, which has supposedly wrecked our ability to communicate with people who disagree with us. Horning’s critique, however, illuminates another similarity between the two.
“Lanier…regards genuine artists as entrepreneurs first and foremost; those who are not motivated by profit are dilettantes whose work is inherently bad,” Horning writes. Therefore, “Web 2.0 voluntarism…is an inauthentic form of expression, for in order to be authentic, it must have an unambiguous value assigned to it by the market, the proxy for real social recognition under capitalism.”
It is probably no coincidence that Taylor avoids this bit of Lanier’s theory. Lanier himself doesn’t seem to shy away from equating value with market valuation—he’s quoted more or less praising a system in which “everyone would have easy access to everyone else’s creative bits at reasonable prices…[an] arrangement [which] would celebrate personhood in full, because personal expression would be valued”—but for Taylor to plaintively argue that bloggers are bad because they do what he does for free would be more than a little crass. So instead we get subjective critiques about the quality of their work.
“Their film knowledge is broad and deep, but they wear that knowledge lightly,” he complains. “To read them is to read people grounded in the sensual response to movies.” Meanwhile, “the ones who have reacted against the shallowness of the current conversation [write]…articles that analyze sequences in terms of lighting and editing…presented with the deadly seriousness of a doctoral dissertation.” In other words, the bloggers are either too shallow, or too serious, or too boring…the list goes on. The true division between their work and his, however, is that he is paid and they, mostly, are not. Their supposed shortcomings do not justify that division; rather, the division produces the critique: his reasoning emerges from the vantage point of his paid position.
In his defense, Taylor doesn’t harp on the fact that he is “better” than the other writers (though it is certainly implied). Rather, he thinks the internet in itself is making it worse for all of them—for film criticism as a whole. One way it is doing so is through the taint of commercial interest. For critics’ jobs, he argues,
depend on securing advertiser-pleasing hits by lavishing coverage on the worst of what’s out there, especially the superhero and fantasy movies. Editors hope to attract hits by feeding into a movie’s prerelease hoopla. What a critic actually thinks about the movie is often drowned in the ongoing publicity deluge…Editors then point to the number of hits generated by this as proof that what the readers really want is coverage of the big movies—whether or not there’s been coverage of anything else to choose from.
This is not an ill-founded complaint, and it applies to criticism of other media as well. As a former contributor to the music website Pitchfork, I found it surprisingly difficult to convince a certain then-editor-in-chief to cover music outside already-established “indie-popular” genres and artists. Headline review space, meanwhile, would be given over to panning the latest offering from Asher Roth or some other flash-in-the-pan commercial success, at the expense of listen-worthy smaller artists. There can be little doubt that business-minded thinking played a role in such decisions. After all, why take the time to ridicule something—especially something that is more or less a bad novelty gag—if not to garner page clicks?
This is not a new phenomenon, however, and it is not a byproduct of the internet era. After all, print publications were hamstrung by advertisers, too. Page count is a direct reflection of advertising sales. The gratuitous top five lists that Taylor bemoans existed then, as now. Furthermore, the gradual symbiosis between criticism and publicity is a hallmark, not of the internet age, but of capitalism itself.
Meanwhile, those searching for new, unheard music end up turning elsewhere. Does anyone look to Rolling Stone to discover new music anymore? In the future, will they still read Pitchfork? In a nod to the growing number of listeners looking outside the critic-approved, commercially acceptable sphere of musical offerings, Pitchfork unveiled “Altered Zones” shortly after my departure: an amalgam of popular blog writers covering less commercial music on a single Pitchfork-affiliated website.
Such a move, however, is little more than the critics’ attempt to decelerate their own obsolescence. After all, if you can’t compete with blog writers, you might as well co-opt them.
For all his supposed concern about the death of democracy, Taylor is still, most importantly, a film critic, and it is his prediction regarding “the death of movies as popular art” that I am most concerned with. After all, it would certainly be a bad thing if the loss of critical authority resulted in a decline in the quality of film.
Fortunately, for those of us who aren’t writing professional criticism—and especially from the perspective of young filmmakers—the outlook should not seem so dour. First of all, there is the obvious fact that there have been, and always will be, certain people inclined to seek out entertainment more intellectually stimulating than the typical mass-market fare. A desire to seek out the new, either as spectator or creator: this is essentially how “art” evolves in the first place.

If Taylor is lamenting that he may no longer have a role in helping inform the populace of such works, then he should be relieved to know that, with the advent of the internet, such information is readily available to anyone who wishes to find it. And unlike in the world of politics, where the echo chamber of the internet leads to closed-mindedness with direct consequences for our ability to govern, there are no entrenched partisan perspectives instilled in the average filmgoer. If anything, he may be a bit lazy, turning to commercial fare as a default, but there is no reason to suspect that, when faced with an array of critical reviews and informative movie capsules, he will angrily turn his head from anything he hasn’t seen a commercial for. Anyone looking to watch a movie is already seeking out information.

Taylor derides “the attempts of Amazon and Netflix to steer your next purchase based on what you’ve already bought,” but even this broadens a customer’s cultural palette, perhaps in ways that the only half-heartedly engaged—those most influenced by the random grab-bag of newspaper reviews anyway—wouldn’t have bothered to seek out in the first place. And for those who aren’t looking to read film criticism—which, in any era, will be a significant portion of the population—the loss of the critic’s voice is essentially meaningless.
Meaninglessness seems to be the true crux of Taylor’s complaint. “There are too many critics writing too many pieces,” he writes, referring no doubt to the growing coterie of bloggers who’ve taken to doing his job pro bono across the web. Perhaps sensing the weakness of his earlier criticism, he takes a new tack, arguing that, besides being “sensuous” and intellectually lightweight, the blog world speaks to itself and itself alone, closing off entry to those “not already novitiates in the order of cinema.” “All the serious young cinematic men sound as if they’re writing for each other,” he argues. “Not showing off, but sealed off.”
There is some merit to these remarks. The internet does have a way of creating isolated, clique-like atmospheres which can distort perception of the world at large. Take, for example, the hilarious rantings of Psychedelic Horseshit frontman Matt Whitehurst, published—shockingly, though this was the time when all the major newspapers were getting wet over “the next coming of rock”—in a 2009 edition of the Washington Post. “Terminal Boredom is the new Rolling Stone,” he raved. “The bigger heads on Terminal Boredom are ruining music today.”
Don’t know what Terminal Boredom is? Don’t worry, I’m sure you’re not alone.
To those enveloped in the culture, however, such claims appear as truths. Certain websites take on the cachet of the old cultural trendsetters at large. And you can find one for every possible taste or style. Thus, a hegemonic consensus among the greater populace becomes increasingly difficult to achieve. In this way, Taylor is not completely off the mark concerning the increasing inability of any one critic to hold any kind of sway in the cacophonous din of voices echoing around the internet. To some extent, our ears do close off to those we don’t agree with, and we take comfort in like-minded communities where we can elect our own tastemakers and come to our own opinions regarding the value of certain artists, be they writers, musicians, filmmakers, or whoever else catches our fancy.
This doesn’t necessarily bode ill for the world of film, though, because no matter how novice-friendly online critics tend (or tend not) to be, the fact remains that, for those who wish to know, such access remains at their fingertips. You can get up to speed on anything unfamiliar; with music, literature, and now movies as well, access to the works themselves is more or less a click away. You can find whatever you want, and you can find out more about it from a vast array of sources. With profoundly easier access to a greater variety of work, cultural consumption has increased accordingly. This is not bad for the filmmaker. It does, however, diminish the importance of the critic, whose traditional job has been to help steer time- and money-pressed consumers towards what they deemed the most worthy endeavors. With the advent of the internet, the middleman is less and less needed.
With improving knowledge, it becomes easier for consumers to become tastemakers themselves. Natural impulses to share information and discuss it can find expression in community outlets at websites such as the aforementioned Terminal Boredom. As such, taste begins to emerge organically—making its way from a few blog writers simply espousing their opinions, gathering steam through online forum discussions, and eventually reaching wider audiences, at which point it gets picked up by mass media outlets like the New York Times or the Washington Post (which is no doubt how Psychedelic Horseshit made it into such a publication in the first place). To claim such processes are anti-democratic is to claim that black is white—but then, that’s not Taylor’s actual complaint. The critic is no longer needed to “discover” music (or art, or film). Rather than helping bestow attention and critical merit on selected artists, his role has evolved to become little more than a lagging indicator of popularity.
Such a role—and I imagine this gets to the heart of Taylor’s fears about “the death of movies as popular art”—is not without importance. And as far as film goes, it is especially relevant, for such indication helps direct the most important thing of all: money. For musicians and novelists, a bit of money helps, but is often unnecessary; it’s nice when artists can quit their day jobs and work on their craft full time, but you can play in a band on weekends or write your novel after work. The startup costs are generally quite low.
For film, this is not the case. Even notoriously low-budget films like The Blair Witch Project or Kevin Smith’s original Clerks cost $22,000 and $27,000 to make, respectively. The world of film has always been prohibitive to the kind of DIY ethos you can find in the worlds of music and print. Therefore, popularity must play a bigger role in its creation, and in this light, the critic’s role can be seen as almost noble: aiming to cultivate the tastes of those who might otherwise be disinclined to watch the films the critic cares about. She implicitly needs mass recognition of her favorite films to help ensure future financing is available for other like-minded filmmakers and projects.
Even assuming such ideal intentions, however, Taylor’s criticism falls flat. This is because the power of the internet overcompensates for the potential loss of the critic’s sway in guiding accolades—and with it, popularity and financing—the filmmaker’s way. Take, for example, the movie Paranormal Activity. Created for “only” $15,000, the movie debuted on September 25, 2009, in a mere 13 towns across the U.S. However, director Oren Peli turned to the web to get viewers to vote where he should take the movie next. This helped create buzz, and allowed him to roll out the film’s release in the most effective (and lucrative) way possible. By November it was showing worldwide, and it later went on to gross nearly $200 million. It also should be noted that the film has a favorability rating of 92% among Top Critics at RottenTomatoes.com—though, considering most are dated December 2009 or later, it seems they came after the initial internet success, further demonstrating that, at best, the critic seems to be performing the role of redundant identifier. This is not to say that critically acclaimed and crowd-pleasing movies should be one and the same; but if a film is truly merit-worthy, surely there will be enough like-minded individuals tied together across the internet to give it the attention it deserves. No critic is an island; now less so than ever.
This is especially true when you stop to think that the internet has changed not only our modes of viewership, but our ability to create as well. The music world has seen the rise of low-budget, DIY artists since the birth of punk in the ‘70s (which is to say nothing of the Hasil Adkinses of the world before then), and zine culture and other low-cost printing has ensured that writers who wanted to express themselves without going through the typical channels had avenues in which to do so. But advanced technology and the advent of the internet have made it easier than ever not just to create, but to reach an audience as well. Regarding writing, this is clear enough; for musicians, Myspace, SoundCloud, and other music hosting websites have become essential. In less obvious ways, too, the internet has been beneficial. Amazon.com, for instance, has begun publishing authors, allowing them to eschew traditional publishing houses whose editors have typically held the keys to the first gates of critical acceptance. Magazines such as this one exist solely on the internet. Even TV shows are being produced now with the internet in mind.
Some may argue that, in light of such an influx, it’s harder than ever to sort the wheat from the chaff. This may be true—and with the death of the critic, it becomes especially so (after all, sorting used to be a paying gig). However, the creation and consumption of art will not fade with the dissolution of the critic’s voice. Neither will our taste buds collectively shrivel and rot. Those seeking to push boundaries will continue to do so; the cognoscenti will continue to seek out thought-provoking means of entertainment, and the ignoscenti will continue to watch lions maul each other and CGI robots explode. Some, meanwhile, argue that the internet creates a culture of instant obsolescence, and that the dissolution of consensual critical authority will only accelerate this trend, adding to the sense of disposability attributed to much artistic work today. Such arguments, however, underestimate the power of consensus to emerge through online communities. Furthermore, just because the internet provides an open market for opinion doesn't mean that all opinions will be valued equally: the internet may be democratic, but by sheer self-segregation, those with like-minded tastes and interests will congregate, further inspiring those among them to create. This kind of passion is absent from the souls of casual moviegoers, who don’t spend their time arguing minute details in forums or on blogs. To the extent that Taylor complains that such discourse is a closed circuit, he’s right—but it is closed by inclination, not exclusion; anyone is free to enter it if they so choose. It is from these communities that the artists of tomorrow will emerge.
There can be no serious doubt that art will continue to prosper into the future, across all media. It has survived the shift from patronage-based monarchical society to market-based capitalist society; it has survived the advent of mass media; it will survive the internet age. If anything, the dissolution of critical authority and the decay of the canonization process bring with them increased possibility from the perspective of the artist. The internet allows for dissemination of information and expression unlike anything that has come before. This is not an unequivocally positive development. As a journalist, I confess that I, too, have often decried the devastation the internet has wreaked upon my employment possibilities. But as an artist, I find that the possibilities seem greater than ever.