Saturday, October 08, 2005

A List to End All Lists

This is the site's first significant post. I no longer agree with everything it says, or the way it says it. I offer it to the reader primarily in the spirit of archeology.

In the interest of providing a sense of my musical biases, here are 21 albums I have considered valuable over the years. At this point in my life, several of these records offer little more than nostalgia. Others continue to provide sustenance. Some are relatively recent interests, while others have been in my collection since I was seven or eight years old. I present them in alphabetical order.

As with any list, this one has the potential to become a rather hollow exercise. With that in mind, I’ve included some analysis below the list of albums.

A Love Supreme
John Coltrane

And His Mother Called Him Bill
Duke Ellington

Blank Generation
Richard Hell

The Blues and the Abstract Truth
Oliver Nelson

Blues and Roots
Charles Mingus

Conic Sections
Evan Parker

Eat a Peach
The Allman Brothers

Free Jazz
Ornette Coleman

Fun House
The Stooges

It’s Alive
The Ramones

Latin American Suite
Duke Ellington

Layla
Derek and the Dominos

Made in U.S.A.
The Beach Boys

Meditations
John Coltrane

Midnight Marauders
A Tribe Called Quest

Nevermind
Nirvana

Please Please Me
The Beatles

Rip, Rig, and Panic
Roland Kirk

Sgt. Pepper’s Lonely Hearts Club Band
The Beatles

Trout Mask Replica
Captain Beefheart

White Blood Cells
The White Stripes

I don’t like record lists. Most of them have ‘Buy It!’ links to Amazon or Insound next to the albums and are hardly more than electronic shopping catalogs. As consumer guides, they may be helpful -- there is a lot of crap out there, and the simplicity of numeric rankings certainly expedites the process of consumption. But for those of us who want more from music than our money’s worth, such lists may fall short. Lists with higher aspirations, meanwhile, are usually so adrift in generalizations that there’s little hope of perceiving the context they claim to present.

A good example is this list by the College Media Journal (CMJ) of the 25 Most Influential Artists of the Last 25 Years. Seriously, how on earth does one interpret the assertion that Ani DiFranco is the 22nd most influential artist of the last quarter century? I am told that she has had a “stable 15-year career” that is “proof positive that you can build a cottage industry out of tape-dubbing and CD-R-burning.” But hadn’t the Grateful Dead already proven this almost thirty years ago, when Ani DiFranco was still in grade school? Personally, I’m more inclined to assert the significance of DiFranco and her label, Righteous Babe Records, in terms of feminism, but to do so would require a master’s thesis, not a paragraph. In any case, the list isn’t really concerned with the particular merits of its artists; it wants only names and numbers. This is about inclusion and exclusion, which is why the instructions to CMJ’s list encourage us to debate whether certain artists have been left out unfairly. Rock ‘n’ roll, after all, is a popularity contest, every bit as crooked as the ones from middle school. The list is simply a posting of the ballot results.

Really, though, who cares that Metallica and the Cure were left off this list? We lose sight of the bigger issue when we worry over such petty distinctions. This list promises insight into the last twenty-five years of rock ‘n’ roll, so I want to know what’s made those last twenty-five years different and unique. Anything? All I see here are buzzwords (jangle, dugga-dugga, screamo) and undefined expressions (college rock, alternative rock, anti-music) that are either made up or stated with such reluctance that they require quotation marks. I mean, does CMJ believe in “alternative rock” or not? The whole exercise just seems staged and capricious.

And I get this feeling from just about every list I look at. Pitchfork has at least realized that the Internet offers space for sincere examination, even if it rarely offers much insight. Generally, Pitchfork compensates for its lack of depth with an excess of content. Consider the following: the Top 100 Albums of the 1970s, the Top 100 Albums of the 1980s, the Top 100 Albums of the 1990s.

Each of these lists could be considered thorough, if largely in a completist, buyer’s-guide sort of way. Yet I often feel like they miss the point entirely. Take the 1970s list. This list is not representative of the records that were important to people in the 1970s. That list would seem tacky and dated. This is a new list for modern people, augmented with all sorts of records that, thirty years ago, were remarkably obscure and/or didn’t resonate with many listeners, either critics or fans. In the same way, this list is also a terribly unreliable indication of which 1970s records will be considered important ten or twenty years from now (if any). This list belongs exclusively to a certain point in time (the early 2000s) and the value system of a particular group of people (indie-rock fans and the Pitchfork staff). Unfortunately, there is little acknowledgement of this in the explanations that accompany each album, most of which are written in the same confused, dubious language used by CMJ.

The introduction to the list does mention “casualties,” those musicians expected to make the list who didn’t: Bruce Springsteen, Bob Marley, Patti Smith, Ornette Coleman, etc. Here, with such prominent exclusions, I actually feel let down that Pitchfork doesn’t offer any speculation. Did these musicians simply not put out any single, great albums? Is their music no longer suited to the needs of today’s rock audience? What makes the albums that were included on the list more relevant than those that were not? No one can ascertain what the 1970s mean to us, today, without also considering what they meant to the people who were living through them, then. Why might certain parts of the culture continue to resonate with young people, as opposed to other parts? What sorts of distortions are being made in the service of our contemporary needs? Trying to construe the answers to such questions from Pitchfork’s list and its accompanying explanations can only lead to frustration.

Same goes for the 1980s and 1990s lists. Pitchfork has actually released two lists for the 1990s, one from (I presume) early 2000 and the more recent one. The introduction to the second 90s list says that “a lot has changed.” It refers to “revisionism” and includes another set of casualties: Sleater-Kinney, Cat Power, The Roots, Snoop Dogg, etc. But it doesn’t seem interested in what Pitchfork’s apparently great shift in perspective might mean, or why it took place so fast. The introduction suggests that part of the change may be due to a turnover in staff. What of the rest, though? My feeling is that, on the whole, the lists aren’t that different. Each list’s Top 10 has seven records in common with the other, and they both include the same two Pavement albums. In the Aeroplane Over the Sea by Neutral Milk Hotel shoots from #85 to #4. The Soft Bulletin by The Flaming Lips wasn’t on the original list and now it’s #3. Little Earthquakes by Tori Amos fell from #8 all the way off the list. These are fairly minor aesthetic changes, though. For the most part, the new list presents the same basic indie rock philosophy as the one from five years ago.

The subtlety of these changes is not totally insignificant. In fact, it’s necessary for the subculture to survive. This is why hipsters are always “over” whatever trend you just discovered. If they weren’t, there would never be any new trends. Pitchfork, then, must update its lists every few years if it’s going to remain relevant as a tastemaker, even if all it’s doing is spicing up a well-established flavor. (Just for kicks: the angle I can take on the two 90s lists, at least without a more scientific comparison, is that, between Bonnie “Prince” Billy and Neutral Milk Hotel, the revised Top Ten suggests a new emphasis on folky-type singer-songwriter guys who emote.)

If it seems I am asking too much of the list, I suppose that’s the point. The list is reductive by nature, and I may simply be fooling myself whenever I look to one with the hope that it will expand my knowledge. My own list, of course, is no better than the ones discussed above -- although I’ll save the discussion of its various faults for a later, follow-up post.


*I have yet to follow up on this post directly. But the trajectory of ideas, first hinted at here, may be pursued through the following entries:

Pitchfork and the Idiocy of Lists
Coming Attractions: More of the List, and More
Countdown to Ecstasy: 2005 Best-of Music Lists
Pitchfork, the 60s, and our Interminable Commodity Culture: Part I
The List is a Fucking T-Shirt

This Blows: YouTube, Mp3 Blogs, and How to Hype a New Band (as in, The Blow)


2 Comments:

Anonymous said...

john you forgot to include phish. you used to like phish a lot remember those days.

8:06 PM  
Anonymous said...

Phish. Yes, I used to like them a lot. I suppose Phish didn't make the list because none of their albums really resonated in my memory. Much of my devotion to Phish was due to the "live" experience. If I had to pick a Phish album, it would probably be "Junta," but as much as I once liked that record, it was never one of my all-time *favorite* albums. This is one reason why The List is such an insufficient mechanism for encapsulating the experience of listening to and appreciating music. One could try to compensate for this particular deficiency of The List by creating a supplementary list of, say, Great Concerts -- but concert lists are even worse than record lists. A list of records refers to a set of publicly available objects that potentially anyone could possess. A list of concerts refers to a set of experiences shared by a fixed number of people, which makes it harder to evaluate in the sort of general terms demanded by The List. In the end, the experience of going to concerts doesn't lend itself to quantifiable evaluation -- it's too personal, too totally subjective, too much a matter of memory. Trying to rate a concert as a statistic ultimately discounts the humanity of the concertgoing experience... that's my gut instinct, at least, but this subject actually deserves much more thought than I have space to give here. Anyway, I suppose I could have also just subconsciously not wanted to include Phish on the list because I don't like them anymore and am embarrassed to admit I once did.

12:17 AM  
