
Essays

 


"America the Bootylicious"
The sexualization of popular culture is startling proof that our nation's attitude toward fornication is changing for the raunchier
(Metroland, Aug. 23, 2001)

Return with us now, if you will, to those chaste days of yesteryear -- specifically, just two decades ago. The year is 1981, and popular culture is a vastly different arena from that which surrounds us today. At the movies, Kathleen Turner and William Hurt raise eyebrows with their discreetly steamy clinches in a thriller called "Body Heat," while teen model Brooke Shields keeps her clothes on throughout the overwrought romance "Endless Love." Outright sexuality, as seen in such 1981 releases as "Debbie Does Dallas 2" and "Emmanuelle in Soho," is mostly marginalized to porn theaters and cable channels. On TV, the sexiest things going are such tempestuous nighttime soaps as "Dynasty" and "Falcon Crest," which rarely get raunchier than shots of women covering their breasts with bedsheets. And on the pop charts, previously wholesome pop princess Olivia Newton-John smuts up her image with the single and video "Physical," a coy double entendre that's ostensibly about the early-'80s fitness craze.

Cut to the present, and look how things have changed. At the movies, "Scary Movie 2" features an explicit scene of starlet Tori Spelling fellating an invisible penis, as well as a character being propelled against a wall by the power of an orgasm that's depicted as a tidal wave of white goo. Hardcore pornography is readily available at the click of a mouse, so anyone with an Internet connection can view or download images that used to be the exclusive purview of the trench coat crowd. Television is a hotbed of sexuality: Couples on "Temptation Island" and "Big Brother," reality shows broadcast on network TV, stop just short of actually screwing; nudity is a regular fixture on "NYPD Blue"; and daytime soaps are filled with blush-worthy scenes of heavy petting. As for the pop charts, playfully unwholesome pop princess Britney Spears bumps and grinds in singles and videos such as "Oops! . . . I Did It Again," which features the curvy teen cooing the line "I'm not that innocent." Sex is such a given in pop culture that Spears brazenly strips to a bra in an ad for Pepsi, all while ancient Viagra spokesman (and former presidential candidate) Bob Dole watches with lascivious interest.

What the hell happened in the last 20 years to bring sex from the periphery to the forefront of American popular culture? It's not as if we spent the previous decades being bashful about the dirty deed -- the careers of Larry Flynt, Betty Grable, Hugh Hefner, Marilyn Monroe, Raquel Welch, and many others are proof of that -- but sex is so intrinsic to today's movies, music, television, and online content that it's as if all Americans do these days is consume porn, or semi-porn. As "The New York Times" recently reported, trade in X-rated products -- inclusive of Internet porn, videos, magazines, books, phone sex, and other adult fare -- brings in at least $10 billion annually. "Pornography is a bigger business than professional football, basketball, and baseball put together," Frank Rich wrote in the "Times." "People pay more for pornography in America in a year than they do on movie tickets, more than they do on all the performing arts combined."

Yet, as the examples of contemporary sexualized movies, TV shows, and pop music prove, the revenue generated by out-and-out porn is only half the story. People in this country may be more open than ever before about their consumption of "adult" products, but they're also more willing than ever before to consume non-pornographic products with pornographic elements.

Pop music is smothered in smut -- there aren't too many ways to interpret Ricky Martin's "She Bangs," for instance, and blink-182 went out of their way to let folks know that the chick on the cover of their album "Enema of the State" is skin-flick performer Janine. Numerous arthouse filmmakers are pushing boundaries by including unsimulated sex in "legitimate" movies such as "Romance" and "Baise-Moi." Even stranger is the trend of dirty-movie fetishism: Walk around an average shopping mall these days, and chances are you'll encounter at least one teenager wearing a shirt emblazoned with Playboy's famous bunny logo or the words "porn star" or "pimp." And a visit to any newsstand will reveal the depth of the public's current infatuation with all things bootylicious: There's Jennifer Lopez, naked except for hotpants, on the cover of "FHM"; there's Helena Bonham Carter, of all people, naked except for lingerie on the cover of "Maxim."

These days, if it swells, it sells.

* * *

To gauge just how far the public tolerance for raunchy entertainment has come in recent years, consider the careers of two omnipresent sex symbols, Madonna and Pamela Anderson. The Material Girl first made a dent in pop culture with her 1983 single "Burning Up," in which she proclaimed, "I'm not like the others/ I have no shame." Madonna proved her point the next year by dressing in a bustier-topped bridal ensemble, then writhing her way down a giant wedding cake on an MTV awards show. Her gyrations to "Like a Virgin" were tame, however, compared to the carnal content of her later work.

In 1992, after nearly a decade of envelope-pushing, Madonna let it all hang out in her explicit coffee-table book "Sex" and her libidinous album "Erotica." A year later, she did it again in the movie "Body of Evidence," which featured convincing scenes of the star masturbating and receiving oral sex. Yet by this point in her career, people had become so jaded about her tendency toward sleaze that her exploits were met not with outrage, but with boredom. The "Sex" book won some notoriety and remains a collector's item, but none of the three projects was a blockbuster. Oddly, though, three flops weren't enough to derail the dirty diva's career: Today, Madonna is back atop the charts with, among other things, a massive tour that's selling out across the country despite ticket prices as high as $2,500. Far from ostracizing Madonna for reveling in raunch, America seems to have embraced her for doing so.

Around the same time that Madonna released her troika of smutty projects, bubble-breasted starlet Pamela Anderson became an icon by starring as lifeguard/flotation device C.J. Parker on "Baywatch," the syndicated program beloved by millions of horny couch-dwellers for its fetishistic montages of bouncing body parts. As had myriad wannabe movie stars before her, Anderson launched her career by posing nude in "Playboy." She broke with tradition, however, by maintaining her relationship with the magazine even after she became famous. Her career has survived such scandals as her tumultuous relationship with rock star Tommy Lee and the illicit release of not one but two home videos featuring Anderson having sex with boyfriends, and now she's got the dubious honor of appearing on more "Playboy" covers than anyone in history. As Bo Derek, Ursula Andress, and Barbi Benton can attest, stars in previous generations who refused to cut their ties with the sex trade were shunned by the general public, yet Anderson's popularity has been boosted, not diminished, by her nude (and sometimes explicit) cavorting.

Today, it seems, there's no such thing as an "innocent" sex symbol. Whereas '50s and '60s youths had Annette Funicello -- the Mouseketeer who filled out to become the star of such tame romps as "Beach Blanket Bingo" -- the closest contemporary youths have to a family-friendly starlet is Jennifer Love Hewitt, of "I Know What You Did Last Summer" fame. Hewitt's films are loaded with ogling shots of her figure, but because she hasn't actually stripped for the camera or played any explicit love scenes, she's downright puritanical by today's standards. It's depressing to think what sort of message America's current obsession with nubile flesh sends to young women, but an indication can be found in comments that Britney Spears and other 21st-century pop tarts have made: They couldn't do what they do, these singer-dancer-pinups say, had Madonna not paved the way.

* * *

The roots of our country's present hot-and-bothered entertainment trace back at least to the 1960s. The rise of the counterculture -- think drugs, rebellion, and free love -- led the youth of that era to experiment with relaxed standards of social behavior, as seen in such spectacles as the unabashed nudity at Woodstock. Music and movies, accordingly, got more suggestive, and American arthouses began exhibiting films that were sexier than anything being made in the United States. In 1969, for instance, the Swedish flick "I Am Curious (Yellow)," which features fairly explicit sex, became the center of a freedom-of-speech debate when U.S. Customs officials refused to allow prints of the movie to enter America.

Within a few years of that conflict, however, Americans had homegrown sex movies to enjoy, because the sexual revolution (a phenomenon emboldened by the emergence of the birth-control pill) led to a boom period in pornography. Pictures such as "Deep Throat" and "Behind the Green Door" -- sexfests shot on 35-millimeter film, just like "real" movies -- were embraced by young adults curious to see what filmed fornication was all about. For a brief moment, porn was accepted as the outer edge of mainstream culture.

The trickle-down of this changing morality was felt throughout the '70s, and seen in such pop-culture artifacts as Farrah Fawcett's famous swimsuit poster, which prominently features her erect nipples beneath form-fitting fabric, and disco songs like "Push, Push, in the Bush," which matches carnal rhythms with vulgar wordplay. Sex was everywhere in the pop culture of the '70s, in part because sex was everywhere in private life. It was the height of the swinger era, and mainstream entertainment reflected the pervasiveness of sexual experimentation.

AIDS pushed porn back underground, just as it had a chilling effect on sexual activity. The boomers who had spent the Me Decade swinging settled down, and Gen Xers coming of age in the '80s were confronted by stern warnings from the conservatives who took over American politics. It says everything about the period that a genre of movies about teens trying to get laid flourished in the '80s: People still wanted to screw and watch other people screwing, but for the time being, it seemed safer just to talk about screwing in hushed tones.

Yet certain sexualized facets of pop culture bubbled underneath the seemingly chaste mainstream. As noted earlier, Madonna began creating her provocative art in the early '80s, and around the same time, porn found a new niche with the popularization of home video. By the early '90s, when Madonna reached the apex of her amorousness, pornography -- from softcore "erotic thrillers" to hardcore orgy flicks -- was part of the bedrock of the home-video business.

Throughout the last decade of the 20th century, Americans became bolder about their support of the skin trade. Shock jock Howard Stern proclaimed his love of strippers on his top-rated radio show, actor Charlie Sheen survived the scandalous revelation that he was a client of "Hollywood Madam" Heidi Fleiss, and singer George Michael turned his arrest for indecent exposure into a self-deprecating music video. Meanwhile, Michael and other entertainers -- notably Melissa Etheridge -- turned revelations of homosexuality into record sales. Just as America was more embracing of sex products than ever before, the country was more embracing of alternative lifestyles than ever before. But as the tragic stories of Brandon Teena, Matthew Shepard, and others prove, such tolerance wasn't all-inclusive.

* * *

Teena and Shepard were both slaughtered for transgressing sexual and gender norms, even at a time when Michael, Etheridge, and others were celebrated for revealing their sexuality -- but other aftershocks of the changes in America's attitude toward sexuality are less overt. Madonna begat Britney, but what's the next step in pop's sexualization? Do the little girls who shimmy along with "Oops! . . . I Did It Again" know what Spears means when she says she's not innocent? And if wannabes who emulated Madonna in the '80s did so by wearing buckles emblazoned with the words "boy toy," how will fans of Spears, Jennifer Lopez, Destiny's Child, Christina Aguilera, and other sexy stars emulate their heroes? It's creepy enough to watch 20-year-old Aguilera sashay around in a bustier and panties during her "Lady Marmalade" video, but it would be far creepier if a 10-year-old fan said to her mom that she wanted to dress like Aguilera for Halloween or a school talent show.

The company line spoken by today's pop tarts is that Madonna gave female singers the ability to "take ownership" of their sexuality, but as some pundits have pointed out, how can women who aren't yet adults understand their sexuality well enough to take ownership of it? Spears, for instance, has said in interviews that she's a virgin -- which would make it unlikely that she understood exactly what she was doing when she notoriously stripped down to a bra and see-through pants at the 2000 MTV Video Music Awards.

And it's not just young girls who are likely to be confused by the messages in today's libidinous entertainment. What's a young boy to think when he sees blink-182 not only cavorting with porn stars, but streaking through their videos? And what are youths to make of stars including Kid Rock, who dates Anderson and has been open about his affection for porn, or Snoop Dogg, who actually has his own line of porn videos? It's true that kids today are growing up faster than their predecessors, but is that because we're evolving as a species, or because the prevalence of sexy content -- on TV, on the radio, and especially on the Internet -- is making the idea of youthful innocence obsolete?

The sticking point here, of course, is that the most sexualized movies, music, TV shows, and online content are supposed to be for adults only: Janet Jackson's latest disc, "All for You," which is loaded with explicit lyrics about sex, has a parental-advisory sticker; "American Pie 2," the latest teen-sex comedy, is rated R and therefore supposedly off-limits to unaccompanied youths; porn sites require surfers to affirm that they are old enough to see raunchy content; and artists including Snoop Dogg often use interviews to indicate that their most salacious products are not for kids. Yet permissive parenting, lax supervision at record stores and theaters, and the inventiveness of Net-savvy kids hungry for porn make these barriers easily surmountable. Plus, the suggestive content of, say, a Britney Spears video is totally unrestricted. Anyone with a remote can turn on MTV and watch the teen queen shake her groove thang.

The sexualization of pop culture will only get more severe if changes brewing in certain art forms come to fruition. Movies, in particular, have been poised on a lewd precipice for several years. When Paul Verhoeven made "Basic Instinct" in 1992, he very nearly broke a taboo by showing an erect penis in a mainstream movie, even going so far as to have a prosthetic device made for star Michael Douglas to wear; the director compromised by including a peek-a-boo shot up costar Sharon Stone's skirt. Notwithstanding the vile double standard that says it's OK for women, but not men, to be naked in movies, the point is that Verhoeven wanted to cross a line by putting explicit sexual imagery into a studio flick.

The director tried again in his infamous 1995 flop "Showgirls," which luridly peddled female flesh but discreetly hid male private parts. So far, no Hollywood director has gone further than Verhoeven in depicting sexuality, but several European directors have made his work seem G-rated by comparison.

In 1986's "Devil in the Flesh," Italian director Marco Bellocchio featured a scene of unsimulated fellatio, which earned his movie an X rating during its American release. More recently, Danish auteur Lars von Trier took his no-fakery aesthetic to its logical extreme by having actors copulate on camera in 1998's "The Idiots." And a trio of recent French films -- "Romance," "Baise-Moi" (released in English as "Rape Me"), and "Intimacy" -- feature behavior never seen on American screens outside of porn movies. "Romance" and "Baise-Moi" were shown in American theaters without ratings, and "Intimacy" is due for Stateside release later this year.

With provocateurs like Verhoeven, Brian De Palma, and David Cronenberg (whose "Crash" combined semi-explicit sex with scenes of masochists causing auto accidents) in the mix, it's just a matter of time before an American movie breaks the long-standing taboo against including real sex in a real movie. This transition could be beneficial in one regard -- by forcing the movie industry to adjust the hypocritical standards that allow graphic violence onscreen, but not graphic intimacy -- but it also might represent one more slide toward amorality. If nothing else, the debates that are sure to fill talk shows and op-ed pages as soon as an American director shows two actors doing the horizontal bop will be enlightening.

* * *

For now, the big question seems to be this: Does our desire for ever-increasing sexuality in pop culture mean we're growing up -- or growing perverse? Alarmists likely would embrace the latter answer, saying that 21st-century Americans are like Romans before the fall of their empire, debauched libertines drowning in pointless, dehumanized pleasure. Yet there's something to the former answer, to the idea that a more sexualized popular culture reflects a more enlightened culture in general.

It's certainly hard to argue that an enlightened culture revels in watching 19-year-old Spears mimic sexual behavior that (according to her) she has not actually experienced. It's equally difficult to posit that Madonna and Pamela Anderson are avatars of spiritual growth, no matter how much either of them says she has grown by experiencing motherhood. And it's ludicrous to suggest that an advanced populace is the target audience for the lurid men's magazines adorned with shots of Lopez's posterior or Bonham Carter's décolletage. On the surface, the trend of pornified pop culture indicates that we're on a steady slide toward damnation, or at least decadence.

But perhaps our affection for porn and pseudo-porn is like an awkward phase in our collective maturation. Americans have long been more uptight about sex than, say, Europeans; "Maxim" and "FHM" are innocuous compared to the nudity-drenched periodicals pervasive in several European countries, and even the veddy proper Brits have lurid traditions such as the "Page 3 Girls" who appear topless in London's "Sun" tabloid.

So it could be argued that Americans are finally learning to accept sexuality as a part of everyday human behavior -- we're indulging a little too much right now because we've dug into a cornucopia of forbidden fruit, but once we sate ourselves, sex won't seem as novel or as naughty. From this perspective, we're headed in the right direction, because a greater acceptance of the full spectrum of sexual behavior could lead to greater tolerance, increased sexual education, and, ultimately, improved national health: Young women who grow up unashamed of their sexuality, for instance, might be less inclined to seek desperate measures upon becoming pregnant, and might feel freer employing various methods of birth control.

But as easy as it is to put a positive spin on the current fancy for fornicating, it's just as easy to acknowledge the myriad negative aspects of this phenomenon. Women are inarguably treated like garbage in pop culture, whether or not they "take ownership" of their sexuality. We're still a society that loves looking at breasts, but shies away whenever a penis gets whipped out. Until the gender imbalance in sexualized pop culture changes, other progress is impaired.

Nonetheless, we're at an interesting juncture. We've opened Pandora's box -- and taken up-close-and-personal pictures of it to post on easily accessible websites -- so now we have to decide what to do with all this sexual freedom. We can keep going as we are, spending billions of dollars every year to watch hot chicks get poked and prodded, or we can take it to the next step by turning sexualized popular culture into an expression of something joyous and human. Right now, carnal entertainment is all about coming. But in the future, it could be all about coming of age.

 

"Let Us Entertain You . . . Selectively"
(Metroland, Sept. 20, 2001)

There's no mitigating the horror of last week's terrorist attacks in New York and Washington. Words also fail to capture the dignity of the thousands of people who sprang into action to help with the rescue effort at the site of the World Trade Center. And the deep feelings of sadness, anger, and patriotism that swelled in hearts across the nation are almost assuredly genuine. But some of the reactions to last week's cruelty are puzzling at best, and cowardly at worst. In particular, the spectrum of behavior exhibited by those in the entertainment field has included everything from nobility to jingoism.

Immediately after the reality of what had happened set in, the entertainment industry went into a holding pattern. Sitcoms and TV dramas were replaced by round-the-clock news broadcasts. Broadway theaters went dark. And while movie theaters stayed open, production on myriad films and TV shows halted. But this was just the beginning.

Radio stations quickly added patriotic numbers such as Lee Greenwood's "God Bless the U.S.A." to their playlists, even if the country song didn't fit their formats. More peculiar was the wall-to-wall broadcasting of Bruce Springsteen's "Born in the U.S.A.," even though the lyrics are critical of how Americans are treated by politicians and corporations. The use of Springsteen's song, while probably well-intentioned, recalled how the number was appropriated by right-winger Ronald Reagan back in the '80s, much to the Boss' consternation.

By Monday morning, radio conglomerate Clear Channel had taken the topical programming approach to a bizarre extreme. Clear Channel issued a list of 150 "lyrically questionable" songs that DJs were urged not to play for fear of offending people touched by the tragedy. One hopes that DJs have the sensitivity to postpone broadcasts of such songs as Steve Miller's "Jet Airliner," Queen's "Another One Bites the Dust," R.E.M.'s "It's the End of the World As We Know It," and the Gap Band's "You Dropped a Bomb on Me," all of which are on the Clear Channel list. But one also hopes that listeners have the intelligence to differentiate, say, the Gap Band's 20-year-old disco hit from recent events.

Yet songs with lyrics that could conceivably seem insensitive at this time didn't comprise most of the Clear Channel list: It appears that the folks at the conglomerate scoured their archives for every tune with a reference to death, combat, or explosions. But come on -- Pat Benatar's "Love Is a Battlefield"? The list also warned against playing songs that encourage social consciousness, such as Jackson Browne's "Doctor My Eyes," or that reference hardship, like Simon and Garfunkel's "Bridge Over Troubled Water." Probably the most appalling item on the appalling list is Cat Stevens' "Peace Train," presumably included because the singer, now called Yusuf Islam, has in the past associated himself with extremist Islam. If it's OK to shun a song by a Muslim because Muslims apparently were responsible for the attacks, what's next -- putting the Cure's "Killing an Arab" into heavy rotation?

And, as former Youngbloods front man Jesse Colin Young noted on a Tuesday-morning National Public Radio broadcast, Clear Channel's aversion to broadcasting his song "Get Together" is worrisome, because "Get Together" -- like "Peace Train" and John Lennon's "Imagine," another no-no by Clear Channel standards -- is pacifistic. So was the conglomerate's message to make war, not love?

The paranoia of Clear Channel's hit list was echoed throughout the entertainment industry. The releases of a Tim Allen comedy called "Big Trouble" and an Arnold Schwarzenegger actioner called "Collateral Damage," among others, were scuttled because the pictures contain terrorism-themed storylines. Scheduled broadcasts of "Independence Day" (on Fox) and "The Peacemaker" (on ABC) were canceled because "Independence Day" shows New York City being attacked and "The Peacemaker" is about terrorism. The premiere episode of a new CBS show called "The Agency" may be pulled because it contains references to Osama bin Laden, the leading suspect in last week's attacks. A planned five-hour crossover between the various "Law & Order" series may go unfilmed, because the story concerns terrorism in New York City.

The theme running through all of this reactionary behavior is that people in the entertainment industry apparently believe Americans aren't intelligent or mature enough to separate fact from fiction. And while it would be wrong to generalize about an often-insensitive industry's attempts at sensitivity, there seems to be something hypocritical about shelving topical violence as if temporarily averting America's eyes from onscreen bloodshed (or its ears from pseudo-violent music) will do much good. Particularly if one kind of violence is replaced by another.

The same week that Clear Channel determined which songs were going to offend a nation in mourning, the company put a public service announcement on the air. While subtly avoiding actual warmongering language, the PSA reflected the thirst for vengeance that has permeated American political discourse since the attacks. After making clear that the message is addressed to the culprits behind the terrorism, a somber announcer says the following words over intense, patriotic music: "We are the sleeping giant that you have awakened. . . . You don't know what you just started. Signed, We the People."

An interesting wrinkle to this story is that almost immediately after the "lyrically questionable" list was disseminated, Clear Channel President/CEO Mark P. Mays issued a statement, dated Sept. 18, denying the list's existence -- sort of. "Clear Channel Radio has not banned any songs from any of its radio stations," he wrote. "Clear Channel strongly believes in the First Amendment and freedom of speech." The sentiment is all well and good, but no one accused Clear Channel of banning anything. The list that was distributed "recommended" that DJs not play the cited songs. While it would be unfortunate if Clear Channel were taken to task for something it didn't do, the wiggle room in Mays' statement seems like a classic case of corporate backpedaling.

Yes, some of the entertainment-industry reactions were genuine attempts to respect the feelings of people affected by horrific events, such as the decision not to air "Independence Day's" images of American landmarks exploding. But some seem utterly shallow, such as the shelving of the Allen and Schwarzenegger pictures. If these movies are offensive this week, will they somehow not be two or three months from now? And the worst of the entertainment-industry reactions were pure censorship. Even if an argument can be made that hearing "Jet Airliner" would jar people right now, taking "Peace Train" off the air because it's sung by a Muslim -- who, interestingly, was not a Muslim when he recorded the song -- is worse than reactionary. It's racist. And, contrary to what the progenitors of knee-jerk jingoism would have you believe, it's un-American.

 

"Seduced by Savagery"
How audience bloodlust has turned movie violence perverse
(Metroland, April 2, 1998)

As a whispering synthesizer note drifts through the air, the first image fills the screen. It's a sleek, long shaft glistening in light from who knows where. The camera studies the shaft and where it leads. It's a gun barrel. Next comes the chamber, and we see a bullet -- looking huge and potent as it fills the screen -- while it slips gently into the chamber. A moment later, we're looking at the whole scene: a massive, immaculately polished service revolver studied in surreal limbo a moment before Jamie Lee Curtis eases it into the holster of her police officer's uniform.

This seductive sequence, which opens Kathryn Bigelow's 1989 thriller "Blue Steel," shows how some films synthesize sex and violence to prod a primal bloodlust many moviegoers seem to share. Yet while movies like "Blue Steel" make self-conscious allusions to this bloodlust, most pictures that sate viewers' appetite for brutality do so with the blasé regularity of grocers filling customers' orders. In fact, as a long stream of hit films proves, audiences have spent the last half-century demanding -- and receiving -- a steady diet of screen gore depicted with escalating detail and barbarity.

This trend suggests an alarming dichotomy in American life. O.J. Simpson's acquittal sparked rage, and Timothy McVeigh's conviction spread relief, but the same public that labeled these men monsters pays money to see ultraviolent movies. Americans condemn violence in real life and endorse it in reel life. The explanation for this dichotomy has everything to do with how Hollywood packages violence for mass consumption -- directors wrap sensual gloss around brutality to lure viewers in, then assault viewers with images that would make most of us nauseous if we encountered them in our everyday lives.

But the blame for the proliferation of violent imagery can't be placed entirely on the movie industry, because audiences tell the industry what to do by picking what movies they see. And by habitually patronizing violent movies, audiences are in essence asking for more of the same. Why? For the same reason swarthy men in trench coats skulked into porno theaters to watch "Behind the Green Door" and "Deep Throat" in the '70s. Violence is taboo in the real world, and there's no human habit older than sneaking a taste of forbidden fruit.

Violent movies have become the new pornography in the last decade or so, and the way we indulge our taste for this new porn is different from how those swarthy men got their fixes 20 years ago. In the '70s, people risked embarrassment by visiting triple-X theaters. Today, moviegoers breeze into ultraviolent movies like "Starship Troopers" and "The Lost World" without seeming demented for wanting to see explicit decapitations and dismemberments. Surely there's greater perversion in wanting to see people killed than in wanting to see them fornicate, yet ultraviolence is part of the mainstream while sexuality is treated with kid gloves in movies like "Boogie Nights," which featured more graphic violence than explicit sex.

The most startling image of "Boogie Nights" isn't the closing shot of Mark Wahlberg's prosthetically enhanced penis; instead, it's Don Cheadle's reaction to a shooting, as chunks of the victim's brain drip down Cheadle's forehead like beads of sweat. Director Paul Thomas Anderson lingers on pieces of an exploded brain with passionate relish, but if he'd given the same screen time to a man ejaculating -- an image that's a staple of the pornography industry Anderson depicts -- his film would have gotten an NC-17.

And "Boogie Nights" isn't the only recent movie to show exploding heads. Quentin Tarantino's "Jackie Brown" features a close-quarters murder that sends bits of a man's head all over a car windshield, recalling a similar scene in his "Pulp Fiction"; "Alien Resurrection" includes the memorable image of character actor Dan Hedaya reaching behind his head to remove a strand of his brain from a hole punched into his skull by a monster.

The easiest -- and most disturbing -- explanation for these increasingly graphic bloodbaths is that violence has replaced sex as a national turn-on. The primal thrill to be had in watching bloodletting is so ingrained into our moviegoing psyche that it was the subject of a riff spoken by faded screen bloodsucker Bela Lugosi (Martin Landau) in "Ed Wood": "Horror both repels and attracts women because in their collective unconscious, they have the agony of childbirth," he says. "The blood is horror. Take my word for it. If you want to make out with a young lady, take her to see 'Dracula.'"

But there's a world of difference between the neck-biting in "Dracula" and the horrific sexual violence in modern movies like Paul Verhoeven's "Basic Instinct" and Tarantino's "Pulp Fiction." These movies take the connection between violence and sexuality into the realm of pornography. "Basic Instinct's" portrayal of murderous author Catherine Tramell (Sharon Stone) equates promiscuity with psychotic behavior; "Pulp Fiction" sent weak-hearted viewers squealing to the popcorn stand during the scene in which two rednecks rape Marcellus Wallace (Ving Rhames).

People have enough hangups about sexuality without reading violence into it, so when filmmakers sully sex by connecting it with brutality, the connection hits viewers where they live. And though Verhoeven and Tarantino raised cinematic sexual violence to a queasy new level of photorealism, they merely followed in the tracks of Alfred Hitchcock, Brian De Palma, and Michael Powell. Hitchcock's "Psycho" remains the most famous marriage of sex and violence on film, and De Palma has spent entire movies -- namely "Dressed to Kill" and "Body Double" -- trying to out-psycho "Psycho."

Powell, though, added a level that even "Psycho" didn't touch: moviegoers' complicity in cinematic sexual violence. His "Peeping Tom" tells the story of a filmmaker who kills women just to record their death throes. As if Powell's message weren't clear enough, the killer's weapon is a phallic blade affixed to the front of his camera lens: He fucks women to death with his camera.

If Powell were the only filmmaker to portray a phallic instrument of death in a movie, it would be easy to dismiss him as a pervert who let his fetish escape into his art, as critics did when "Peeping Tom" was released in 1960. But Powell wasn't the only one. In "Psycho," Hitchcock depicts a meek hotel clerk (Anthony Perkins) whose only escape from his psychic shell comes when he dresses as his dead mother and stabs a hotel guest (Janet Leigh) with a butcher knife. To underline the sexuality of the scene, Hitchcock fixes his camera on patches of Leigh's naked flesh as she takes a shower, then lets Bernard Herrmann's chilling score give the murder scene a sexual rhythm while the killer plunges the knife in and out of the victim's body.

De Palma perpetuated this imagery with "Body Double," wherein a killer impales a woman with an electric drill. And it isn't the exclusive province of men. When Stone concludes her opening-credits sexual romp in "Basic Instinct" by killing her sex partner, she does it with several orgasmic thrusts of an ice pick.

The creepy things "Basic Instinct" says about audiences can be laughed away because the picture is so unbelievable, but when it's put on a timeline with pictures like "Psycho" and "Body Double," it suggests that American audiences continually embrace films that depict sexualized violence. Yet moviegoers who want to rest assured that they're not endorsing perversion can still dismiss this evidence by writing off "Psycho" and its descendants as entries in a subgenre of perverse thrillers.

To find evidence that violence in mainstream films is anything more than it seems, it's necessary to look past films that feature obviously sexual violence. "Terminator 2: Judgment Day" seems like a conventional shoot-em-up on the surface -- albeit one with a distaff protagonist -- yet there's a sexual undercurrent beneath its action. During one of the picture's impressive set pieces, Arnold Schwarzenegger is surrounded by a huge ring of heavily armed police officers. He lines them up in his sights and opens fire, kneecapping dozens of victims while surviving their return fire. As Brad Fiedel's pulsating music sets the tempo and Schwarzenegger's "mini-gun" automatic rifle keeps the beat, Schwarzenegger moves from one victim to the next with the sharp focus and insistent motion of a sex act. When the gunplay stops, Schwarzenegger stands like a man satisfied, smoke billowing out of the barrel of his spent gun. Audiences cheered that scene, which means they were applauding one of two things -- the maiming of a row of cops or the superhuman virility of Schwarzenegger's character.

The substitution of violence for sex showed up again in last year's "G.I. Jane," the story of a career Navy officer (Demi Moore) who becomes the first female Navy S.E.A.L. Moore develops a love-hate relationship with her drill instructor (Viggo Mortensen), and from the leering comments he makes when he meets Moore to the lingering glance he takes at her nude body in a shower, it's clear the tension between them is at least partly sexual. But instead of consummating their relationship, they have a fistfight, which director Ridley Scott stages with an abundance of grunting, grappling, and heavy breathing. The fight is a staying-power contest cut from the same cloth as the epic "Basic Instinct" tryst between Stone and Michael Douglas, which is less a love scene than a game of one-upmanship.

Sexually charged depictions of violence like those in "Terminator 2" and "G.I. Jane" are part of a venerable Hollywood tradition that rides shotgun with the trend of overtly sexual violence in films. For a sense of how mainstream sexualized scenes of violence have become, one needs merely revisit how controversial films like "Bonnie and Clyde" and "The Wild Bunch" were in the late '60s. These dramas, with their slow-motion bloodbaths and squirting jets of fake blood, skirted X ratings and prompted accusations that their directors were "glamorizing" violence.

"Wild Bunch" director Sam Peckinpah turned orgiastic death scenes into his trademark with subsequent pictures like "Straw Dogs" and "Bring Me the Head of Alfredo Garcia," while the balletic execution of the title characters in "Bonnie and Clyde" is echoed in James Caan's "Godfather" death scene, featuring the actor being perforated by nearly 100 fake bullet hits. These '70s scenes created what Pauline Kael once called an "aesthetic of cruelty"; they turned bloodletting into high art, and in so doing brought a level of savagery to popular entertainment not seen since the Romans fed Christians to lions for amusement.

Director Kathryn Bigelow has been an articulate advocate of filmmakers' right to put violent imagery into the world. When her "Blue Steel" opened in 1989, she made these remarks: "I think violence in a cinematic context can be very seductive. I think an audience can be titillated by violence. It's wonderful in the safe confines of a movie theater to experience certain kinds of violent tendencies, where you can live out that side of your imagination or subconscious."

But where does catharsis end and perversion begin? Can viewers draw a line between the perverse killings in "Psycho" and its ilk and the graphic violence in so many action movies? Following Bigelow's thesis, it makes sense that watching Bruce Willis blast a "Die Hard" bad guy helps moviegoers vent everyday angst, but do we need to see a bad guy get an icicle jammed into his eye, as happened in "Die Hard 2: Die Harder"?

The answer is no, we don't -- but we seem to want to. This brand of audience bloodlust is adroitly satirized in Lawrence Kasdan's dramedy "Grand Canyon," in which Steve Martin's character is loosely based on "Die Hard" producer Joel Silver. When Martin is shown the rough cut of a scene from his newest action flick, in which a shootout takes place on a city bus, Martin watches attentively until a machine gun is pointed at the bus driver's head. Just as the trigger is pulled, the film cuts from the gunshot victim to a reaction from a bus passenger.

"Where's the money shot?" Martin exclaims.

Since the cinematic medium was invented, directors have sought ways to make their illusions more convincing. Today, uber-auteurs like James Cameron use tools such as computer-generated imagery and THX 24-track sound to persuade viewers they're on the deck of the sinking "Titanic." But during the climax of this year's Oscar winner for Best Picture, Cameron takes the time to gross us out with a shot of a passenger falling off the deck and bouncing -- with an audible sound -- off one of the "Titanic's" giant propellers. The gruesome shot made such an impression on audiences that it even earned a mention in Oscar telecast host Billy Crystal's annual song about the year's Best Picture nominees.

The majestic realism of "Titanic's" special effects represents 100 years of advances in cinematic technology, but the wince-inducing carnage at the end of the picture is the price we pay for those advances. When audiences demanded more realism from Hollywood by abandoning silent films for talkies and then black-and-white pictures for color ones, they unlocked a door that let out all of our Freudian demons. The violence in movies like "Die Hard," "Terminator 2," and "G.I. Jane" is filled with unnerving subtexts because audiences put them there.

 

"Skin Deep"
(The Source, March 1, 1995)

What makes a movie beautiful? Lately, the word "beautiful" has been used to describe everything from the autumnal glamour of "Legends of the Fall" to the wintry crispness of "Little Women" to the European splendor of "Immortal Beloved." But what are the critics and moviegoers who use that adjective so freely really saying?

"Legends of the Fall" features pretty cinematography, fetching actors, and picturesque locations. It's fair to call the movie's surface beautiful. But is the film logical? Are its characters believable? Are the relationships between them convincing? Does "Legends" satisfy any criteria we've established for analyzing movies other than filling our eyes? "Legends" director Ed Zwick is using superficial beauty to help his otherwise weak film woo viewers, and they're letting him get away with it. By showing us hints of an epic movie (characters aging, scenes of the seasons changing, shots of Brad Pitt gazing wistfully across the horizon), Zwick convinces some viewers they've seen an epic movie. But it's easy to reveal "Legends" as a poseur by comparing it to 1962's "Lawrence of Arabia," the textbook definition of a cinematic epic.

Whereas Pitt's "Legends" character Tristan slips into petty debauchery during his quest across the globe, the title character of "Lawrence" (Peter O'Toole) becomes a player in a desert war with hundreds of lives at stake. Both films are about men journeying into their souls, but Tristan is shallow and cruel in his narcissism. Lawrence is more difficult to define, and therein lies the distinction between the true epic and the impostor. Both films have the style, but only "Lawrence" has the substance.

That "Legends" has become a surprise blockbuster suggests style is enough for modern audiences. The tragedy here is that "Legends" is succeeding at the expense of smaller movies like "Before Sunrise."

"Beautiful" seems like the best word to describe "Before Sunrise," but here the word means something. Richard Linklater's independent film follows handsome, listless college grad Jesse (Ethan Hawke) on the last day of his train trip across Europe. While on the train, he's beguiled by Céline (Julie Delpy), a stunning French girl his age. With a mix of self-deprecating charm, witty references, and flattery, Jesse invites Céline to explore Vienna with him on his last night in Europe before flying home to America.

At first, Linklater presents Jesse and Céline's one-day courtship as an engrossing romance. Then he uses their story to air the simultaneously naive and worldly chatter peculiar to postadolescents. Jesse is cynical, but he's also in love with love. Céline is described in the movie as "an old soul." She's precociously mature, but she responds to Jesse's advances with the restraint of a schoolgirl.

Jesse and Céline talk and talk and talk; Linklater and cowriter Kim Krizan fill their conversation with tension, humor, and ideas. Céline changes from moment to moment: she's cutting when she hears Jesse bemoan romance and asks him, "so who just broke up with you?"; she's vulnerable when she tries to explain her idea of the divinity in communication. Jesse personifies America and Céline France, but they carry the burden of representation gracefully. "Yeah, I'm the dumb, vulgar American who has no culture," Jesse snaps when Céline discovers he speaks only one language; Céline expresses her disgust at being told by Americans that she's "so French" in her mannerisms.

That's the movie -- two gorgeous youths walking around gorgeous Vienna. But "Before Sunrise" isn't conventionally pretty. The poster shot of Hawke stroking Delpy's hair as they recline against a gold fountain has a grace of color and composition, but more typical is a scene of the two seated on wooden pallets in a grimy, underlit alleyway. Delpy's close-up in the scene would never suffice in a Hollywood movie. Her skin is discolored to a greenish-gray, and in the failing light we can't see the details of her features. She doesn't have the halo backlight that usually illuminates a leading lady in her big dramatic moments.

But this ugly moment is beautiful. The awkward light around Céline is the same by which we behold so many of the essential moments in our lives, from tentative good-night kisses to spiritual epiphanies during sleepless nights. The imperfection of this moment creates an echo of our lives, because our lives aren't picture-perfect. In real life, we stammer and sweat when we speak our hearts, and we see acne or double chins when we look in the mirror.

Movies like "Legends of the Fall" paint every scene with the same glossy strokes, creating an illusion of perfect beauty. "Legends" betrays perception by showing events the way we wish they had happened, not as they might have happened in real life. "Before Sunrise" shows us life as we live it. Delpy spends the movie in a trendy T-shirt and dress ensemble that will seem dated a year from now; Hawke's goatee isn't shaved evenly. "Before Sunrise" doesn't improve on reality; it is an artifact of one imperfect moment.

Yet "Before Sunrise" stars two pretty actors. By casting Hawke and Delpy, Linklater brought as much baggage to his movie as Zwick brought to "Legends" when he cast Pitt, because how many postadolescents look like Hawke and Delpy? By casting attractive actors, Linklater turned "Before Sunrise" into a story about attractive people. Could Jesse have enticed Céline off the train if he was overweight with greasy skin, even if he spoke the same charming words? Would Jesse have noticed Céline if she was wider in the hips and in need of a dermatologist's services, even if she had the same soft smile?

Zwick made "Legends" palatable by featuring attractive actors and pretty cinematography: His film was doubly false. Linklater made less of a compromise between an idealized world filled with beauty and this world filled with ugliness. But perhaps both directors are exploring the ways in which fantasy and reality intertwine. Zwick found beauty in an ugly story about death and heartache; Linklater found mundane details in a beautiful story of love and communion.

Zwick coated his movie with sugar and Linklater spiced his with salt. They made peace between what they saw and what they wanted to see. If we could all look through those eyes, wouldn't that be beautiful?

 

"Something Happened"
(The Source, Dec. 6, 1995)

Louis Malle's film "Au Revoir, Les Enfants" taught me how to grieve. So when I learned that Malle had died on Thanksgiving, the sensation I felt was loss, but it was mixed with déjà vu -- I had already mourned Malle, or rather I had mourned his first death.

"Au Revoir, Les Enfants" ("Goodbye, Children") recalls an experience Malle had while a student at a Catholic school near Fountainbleu, France, in 1944. The 12-year-old Malle befriended one of several new students, then discovered his friend was a Jew given sanctuary by the priests. On a cold January morning, the Nazis found Malle's friend and took him away. The boy was killed at Auschwitz. The sting of that day was sharper because Malle may have helped the Nazis discover his friend with one frightened look across a classroom.

When "Au Revoir" opened in the spring of 1988, Malle told the press he'd spent much of his adult life trying to tell this story, but every attempt was stifled by feelings of guilt and loss. In the spring of 1988, I was midway through my first semester of film school at New York University. I knew a handful of big words and camera tricks, and my arrogant brandishing of those things won me little favor with my peers.

Then I heard about "Au Revoir." I read about the story behind the film, and I knew that Malle was bringing his career full circle after a tenure directing American films, including "Pretty Baby" and "Atlantic City." I read that "Au Revoir" was painful to watch. This made me want the movie more -- I deeply needed a humbling experience. So one day that March, Nicole, a film student from California, and I ventured to Lincoln Center for a matinee of "Au Revoir."

Something peculiar happened while we watched the movie. Nicole started shaking and crying as the story fell into place. By the end of the movie, she was sobbing, as were most in the theater. But I was smiling, so widely that my cheeks ached. This wasn't a perverse thrill at witnessing tragedy. This was a transcendent feeling of beholding perfection.

Not one instant of filmmaking takes place in "Au Revoir." We feel the crack of branches beneath the feet of two boys running through autumn-brown woods. We feel the soft heat of a Jewish boy's prayer candle. We feel the gentle communion of childhood friendship. We feel a sudden stop in our throats when an S.S. officer calmly enters a classroom and hunts his prey. We feel. We don't watch, see, or hear -- we feel. That was the elation that lifted me when I watched "Au Revoir" the first time, even as Nicole cried through a pocketful of concession-stand napkins in the next seat. For the first time, I saw everything film could do.

I memorized every image in "Au Revoir" that first time, especially the flushed complexions of the young actors and their sunken, innocent eyes. I could not name what the film had given me, but I knew it had given me everything. Yet the gift was slow to mature. When I shot 16-millimeter film for the first time the next semester, I framed shots as Malle had, and tried to light sets as he had. On two or three occasions, magic happened in the collision between camera and reality, but I still hadn't learned. I still tried to say I had made the magic.

I returned to "Au Revoir" many times, studying the film and the published screenplay. Sometimes I reacted as I had before, beaming with my ability to recognize this film's heart. Sometimes I reacted like Nicole, recoiling from the poignancy of the climax like the victim of a gunshot. Hard as I looked, I could not find the machine in "Au Revoir." Being who I was, it was not sufficient to know what the film conveyed -- I needed to know how it conveyed. I wanted to become the master of this experience.

A year later, I was at a midtown screening room with Jim, a film student from Wisconsin. We had just watched a dull comedy, and were heading for the exit when I saw a sweet-faced man in a raincoat, driver's cap, and thick glasses. He was little more than five feet tall, and looked all of his fiftysomething years. I grabbed Jim's sleeve. "Oh my God, that's Louis Malle," I blurted, trying to stay discreet and quiet. "Go over and say hello," Jim said, chuckling at my exuberance.

As Malle politely held open a stairwell door for his companion, I approached and extended my hand. "Pardonnez-moi -- Monsieur Malle?" He turned and looked closely, as if having to peer through my half-remembered high-school French. "Oui?" he said. "Je m'appelle Peter, et je suis un . . . I'm a film student, et . . ." His brows arched over the rims of his glasses as he listened to my bilingual stammering.

"'Au Revoir,' c'est ma favori . . ." I kept talking until the look on his face told me I had gotten my point across, despite my inarticulate speech. He looked to the floor shyly, with a little smile. "Thank you," he said. He bowed his head two or three times, softly repeating "Thank you," then the handshake ended and he went on his way.

During my school years in New York, I wore composure and eloquence like another self, and here, suddenly and unexpectedly, came the humbling experience I had sought that first time watching "Au Revoir." Being made small by a film, by an accomplishment -- that was something I understood. But this was new. Here was a man -- a short, middle-aged, quiet man -- and he overwhelmed me.

This man and his film changed me.

I have "Au Revoir" with me everywhere I go. It is the benchmark against which I measure accomplishment, and it is the height to which I strive. When I am confused, which is virtually all the time, "Au Revoir" is truth and solidity. "Au Revoir" is beauty, and pain, and fate, and accident.

Tonight, writing these words, I regard "Au Revoir" with feelings of dread and promise. The next time I watch the film the sting will be sharper, because the eyes that saw those events no longer shine. But in that sadness, I will smile again, as I did the first time. The sadness in remembering those eyes will be tinged with the joy of having looked into those eyes, however briefly. That is part of what I learned from Louis Malle and "Au Revoir, Les Enfants." To know life is to know death, because each only has meaning as it relates to the other.

Before "Au Revoir," I thought a person was complete upon birth and life was a series of challenges and tests. After "Au Revoir," I learned we are reborn every minute of every day, and the dread of becoming a new person is joined with the promise of becoming a new person. I learned that in each ending lives a beginning.

So au revoir, Monsieur Malle. Au revoir, et merci.

 

"That's Cybertainment!"
(Metroland, Jan. 6, 2000)

Working in their father's photographic factory in February 1895, brothers Auguste and Louis Lumière maximized the potential of several previous inventions by crafting the world's first motion-picture projector. Previously, viewers peered into Kinetoscopes and other peephole devices to look at still pictures that, when seen in quick succession, created the illusion of moving images. With the Lumières' invention, a new kind of entertainment was born.

Cut to a century later. Throughout the '90s, viewers across the world marveled at the mercury-fast growth of computer-generated imagery, in which two existing technologies -- computer animation and motion-picture photography -- were combined to create images that could not have been made by any other means. At the close of the 20th century, mind-blowing images such as meteors racing toward the Earth, dinosaurs chasing after Range Rovers, and cyber-alien Jar Jar Binks interacting with flesh-and-blood Jedi Knights were commonplace.

In a way, today's revolutionary computer imagery is just a continuation of the experiment begun by the Lumieres. Every innovator involved in the cinema from the 1890s to the 1990s pursued the same goal: to create moving images no one had seen before.

But as the 21st century draws near, a new goal is replacing the old one. Instead of trying to overhaul the movie experience as we know it, innovators in the year 2000 are working to create entirely new entertainment experiences. Even the most technologically advanced of modern movie theaters -- IMAX houses in which patrons view images through mechanized goggles that amplify 3-D visual effects -- offer just a hint of things to come.

At this moment, newspaper, magazine, and TV stories about technology are filled with predictions of what shape the cinema and other forms of entertainment will take in the 21st century. The concept of going to a record store and purchasing the Foo Fighters' new album is endangered by MP3, the digital compression process that allows fans to download near-perfect copies of songs from Internet sites. Once record-industry execs figure out how to protect their assets (copyrighted songs) and profits (through download fees), store-bought CDs will give way to mixes that fans make from the Net. And as consumer technology advances to the point that all the elements of a home-entertainment system are joined in one unit, fans will be able to purchase music instantly. As soon as they hear a catchy tune on their radio (or whatever digital music-delivery device replaces radio), they will be able to press a button and secure a crisp download.

Similarly, television as we know it will change dramatically. Smart-TV machines already exist that let viewers tape without tape: The devices continuously record whatever their owner is watching, automatically dumping old footage and storing new footage on a fixed cycle. So if you get a phone call during a new episode of "Friends," you can "pause" live TV and catch up later. Because this method of viewing threatens the means by which networks make money -- given the choice, who would watch commercials? -- new methods of marketing will be required in the brave new world. The most pervasive prediction suggests that all broadcasts will contain interactive product placements, so that if you see a TV actor wearing a jacket you like, you'll be able to click on the jacket and order one through your TV (or whatever digital content-delivery device replaces TVs).

And because so many opportunities exist for conglomerates to market their wares directly to consumers at home, movie-industry executives will want to get in on the profitable fun as well. Based on the most widespread predictions, the traditional moviegoing experience -- gathering in a room with strangers to enjoy a collective thrill -- will survive at least for a while. After celluloid gives way to digital projection, the wiring of movie theaters is the next logical step. Expect cineplexes to install IMAX-type goggles so that each viewer can customize his or her movie experience. Want the sound louder or softer? Adjust a control. Want to blur nude scenes on your child's goggles? Hit a button. Want to order a refill on your popcorn so it will be waiting for you at the concession stand? Type some commands into a keyboard.

But because the people who make movies don't earn money on concessions, they have no incentive to preserve the traditional moviegoing experience. Steven Spielberg makes his dough when you buy a ticket to "Jurassic Park IV"; the cash you spend on a 32-oz. Coke goes straight to the theater owner. So if the means existed to pipe new flicks directly into consumers' homes, movie moguls would happily cut out the middleman. Compared to the expense of shipping copies of films to theaters -- a model that carries the risk of trusting theater owners to accurately report ticket sales -- delivering films directly to paying customers makes a whole lot more business sense.

So in 20, 30, 40 years, it's likely that you'll be able to tap a keyboard on your all-purpose digital-entertainment-delivery device -- which will, of course, be portable -- and order "Jurassic Park X" on opening night. The sound and picture will be spectacular because you'll see and hear the 3-D experience through a headset that will let you control volume, brightness, and other aspects of the show. You'll be able to pause the movie to use the bathroom, rewind scenes, and cancel the show if you don't like it. Chances are you'll still get stuck with the full "ticket" price even if you don't watch the whole flick, and chances are that an additional fee will be added every time you run that sex scene again.

If even half of these predictions come true, you will truly be a master of your cybertainment domain in the future. And this is without even mentioning the truly interactive aspects of cybertainment; it stands to reason that in the future, so much cinematic content will be computer-generated that you'll be able to pick how plots advance. If you don't want to see a likable character die, tap a button and the character lives; if you want a spaceship ride to go longer, hit a key and it will. The parameters of the entertainment experience will be defined by how many alternate possibilities are built into the script -- which, in the future, probably will resemble a video-game program more than a screenplay.

The downsides to all of these technological goodies are immediately evident. As it has in the last several decades, spectacle will continue to rule in mainstream movies; although indie filmmakers will fight the good fight as long as possible, crap is king when the conglomerates take over. And when communal events such as movies and concerts give way to experiences that people have in the privacy of their homes, the worrisome movement from community living to insulated e-life will continue. There's spontaneity and danger in watching a real person play a real instrument, for instance, but neither of those qualities is evident when listening to digitally recorded music through a computer. If cybertainment takes root as deeply as pundits expect, the flesh and blood of yesterday's entertainment will give way to the silicon and electric signals of tomorrow's. The sad day on which viewers can guide the art they experience -- as opposed to experiencing what the artist intended -- looms in the future like a black cloud on the horizon.

Parallel with this dark development is the back-to-basics work of today's underground artists, and a perfect irony for the state of modern entertainment can be found in the story of last year's biggest cinematic sleeper, "The Blair Witch Project." The low-tech horror flick scared the bejesus out of viewers without benefit of opulent sets, glossy cinematography, or flashy computer effects. Its success was enough to give hope that not everyone in the entertainment industry is thinking in terms of technology . . . until one takes into account that the most important element of the film's marketing campaign was a Web site. For those who worry that technology is pushing organically grown entertainment aside, "Blair Witch" was inspiring and frightening at the same time.

And for those who think every piece of bigger, better technology is good, keep in mind the other lesson of "Blair Witch": Much of the picture comprised simplistic, black-and-white images captured by an old-fashioned movie camera. Even in today's cybertainment-minded universe, tools extrapolated from the Lumieres' invention of a century ago are still being employed. So maybe there's room for the old and the new after all.

 

"Thrill Has Gone"
(The Source, Feb. 1, 1995)

This week, the marquee of the Madison Theater in Albany, New York, went blank. With its cathedral ceiling, ornate woodwork, and glorious curved screen, the Madison had been a shrine to the moving picture since 1929, but now the Madison has gone the way of most single-screen movie houses in the United States. With multiplexes and megaplexes replacing older theaters in America's cities and small towns, the traditional moviegoing experience has become a thing of the past.

Today, people choose movies from menus of five, ten, or even 20 titles; this makes individual films as interchangeable as items on the menu at a McDonald's drive-through.

My generation was the last one to enjoy the traditional moviegoing experience. I recall vividly a day in May 1980, when, a week from my eleventh birthday, I joined a horde of schoolchildren and teenagers who formed a line three bodies deep that snaked around the entire warehouse-sized exterior of the Fox Theater on Wolf Road in Colonie, New York. Our young eyes and ears had not yet been made blind and deaf by MTV and other overwhelming media; we didn't think we had seen it all. The opening day of a big movie was almost unbearably exciting.

Sitting with a friend in the back seat of his father's station wagon while we drove to the Fox had been a torture of anticipation, but it wasn't until we ran from the barely braked car to claim a place in line that the reality of the day hit us. This was the opening weekend of "The Empire Strikes Back," the first sequel to "Star Wars." Three years earlier, "Star Wars" had captured the imagination of nearly every kid I knew. I saw it seven times in its first run. But on that day in May 1980, it seemed as if "Star Wars" existed just to make children like me reel with desire for "The Empire Strikes Back."

After a restless half-hour in line, a message floated back toward us on a wave of disappointed groans and laments -- the first show was sold out. If we wanted to see "Empire" today, we would have to wait on line another two hours. In 1980, we didn't have the options of seeing "whatever's starting in a few minutes" or waiting for the video. To our young minds, it seemed as if the movie would disappear from existence if we didn't see it right now.

We spent the next two hours watching people gather in line behind us like cars in gridlocked traffic, then the first show let out and the line began moving toward the door. As each patron purchased a ticket, our pulses quickened -- did they get the last ticket? Luck saw my party through the Fox's great glass double doors, and we were in. The line in the lobby was no more forgiving than the one outside, and a harried usher strained to keep pace tearing tickets, but none of that mattered, because we were in.

Hints of the movie were everywhere. Posters, pins on the ushers' uniforms, lobby cards depicting scenes from the movie. We saw the cards and said to each other "Don't look, you'll spoil the surprise!" But we ignored our own warnings and peeked, which made us yelp, "Did you see that? This movie's gonna be great!" Best of all was watching the faces of the people leaving the theater after the first show. We hollered questions at them, then basked in their declarations that the movie was "a-mazing!"

With a snap of the usher's wrist, I was handed the proverbial Golden Ticket, and like Charlie rushing into the Chocolate Factory, I soared to an empty seat near the front of the theater; my companions and I settled into the cushioned chairs, secure in the knowledge that we were in. As our fatigued chaperone absorbed our shouted demands for candy and soda and popcorn and ice cream and everything, we joined our voices into the nervous cacophony of nearly a thousand people sharing an experience.

The press of a button in the projection booth brought the house lights down, and our cacophony became a shattering roar -- it's starting!

The Fox's curtain, an impossibly wide veil of velvet crepe, parted with the inaudible whir of unseen pulleys, elongating our excited howling until our throats were sore. Then came that revelation, that sudden, tangible-seeming stab of light that splashed the Twentieth Century-Fox logo across our 70-foot horizon and cued a blaring trumpet-and-drums fanfare which told us all the "it's coming," all the "we're here," and all the "it's starting" was done. The show had begun.

This terrific thrill has been felt by audiences for a hundred years. Cinema's first unsuspecting viewers felt it at the turn of the century, a Scarlett-fevered America felt it in 1939 while anticipating "Gone With the Wind," and my generation felt it in 1980 waiting for "The Empire Strikes Back." We've been addicted to this terrific thrill for a hundred years, but now the thrill has gone.

People no longer grab a newspaper on Friday, pick a movie, enjoy dinner, and then drive to a particular theater to see a particular movie. Now it's one-stop shopping. Let's go see "Dumb and Dumber" at the mall, and if we miss it, we'll just see something else. That's not an intoxicating night out. It's a hopelessly practical one.

Yet the eradication of single-screen theaters isn't the only thing that killed the simple pleasure of a night at the movies. Home video, a financial safety net that has rescued many careers and companies, has taken the impetus away from theatrical attendance. Except for movies that cater to the big screen's scale (epics like "Dances With Wolves") and rare pictures that become part of the national conversation (zeitgeist meters like "Forrest Gump"), any movie risks being demoted to "renter" status.

Though movie purists see the erection of clusters of pillbox cinemas as a travesty, those pillboxes are safer investments than even a spectacular single-screen house like the Ziegfeld, Manhattan's best movie theater. If the Ziegfeld books a movie for six weeks and the picture subsequently bombs, the theater faces nearly two months of red ink. Multiplexes and megaplexes spread the risk by showing several movies concurrently.

Understanding the reasons why doesn't soften the blow. Every year, going to the movies becomes less special. For those of us to whom the cinema is like a religion, the cheapening of our places of worship is a sacrilege we mourn every time we pass places like the Madison Theater, which is being sliced into a five-screen multiplex, and the Fox. Fifteen years ago, the Fox's huge marquee was crammed with foot-high letters spelling out "The Empire Strikes Back." Today, the tattered marquee has foot-high letters spelling out "This Space For Rent."

 

"Trivial Pursuit"
(The Source, July 2, 1997)

Kevin Smith learned his lesson the hard way. After making a splash on the independent film scene with his acerbic 1994 comedy "Clerks," which was produced for a reported $27,000, Smith was handed $6 million to make "Mallrats" (1995), a pointless movie about attractive slackers roaming through a shopping mall while they sort out romantic troubles. Watching the movie was a surreal experience, as most suburban movie theaters are located in shopping malls; viewers sat and watched a film that showed them exactly what they'd walked past on the way to buy their tickets.

When "Mallrats" flopped, Smith realized there's a fine line between details and trivia. "Clerks" was filled with references to pop culture and rambling conversations, but the way the characters complained about their dead-end lives offered commentary on how postadolescents live in the '90s; the conversations in "Mallrats" merely mimicked the way postadolescents speak. Smith rebounded with "Chasing Amy," an acclaimed relationship comedy about a young man who falls in love with a lesbian. By showing how the rise of alternative lifestyles transformed the dating scene, Smith returned to his forté of observing modern life.

Not too many filmmakers have indulged themselves as completely as Smith did with "Mallrats," but the same tunnel vision that motivated Smith to make a picture about malls, comic books, and movie references has infected American filmmaking in the '90s. Ever since audiences walked out of "Pulp Fiction" (1994) repeating the bizarre opening conversation between two hit men about what Quarter Pounder hamburgers are called in France, filmmakers have pursued trivia with the fervor of prospectors digging for gold.

Trivia has become a hallmark of American screenwriting because it's an easy way to convince young moviegoers the film they're watching reflects their sensibilities. When postadolescents heard Winona Ryder and her costars sing along to the Knack's MTV-era hit song "My Sharona" in "Reality Bites" (1994), it gave them a chance to sing along with the movie; by featuring a song the audience was sure to know, the filmmakers bought the audience's confidence.

After "Pulp Fiction" and "Reality Bites" set a precedent that was reinforced by pictures like "Clerks" and "Clueless" (1995) -- in which the lead character says she recognizes a "Hamlet" plot point because "I know my Mel Gibson" -- references to pop culture infiltrated mainstream movies. The submarine thriller "Crimson Tide" (1995) featured dialogue about comic-book characters and movie trivia; the megabudget sequel "Batman Forever" (1995) included a reference to the Caped Crusader's '60s television series. In "Crimson Tide," the references were unexpected and funny; in "Batman Forever," the references were distracting and obvious.

As the trivia trend gathered momentum, a pair of writers turned trivia into their trademark. Quentin Tarantino, who cowrote and directed "Pulp Fiction," wrote some of the pop-culture dialogue in "Crimson Tide." But when he directed a segment of the anthology film "Four Rooms" (1995), he indulged himself as completely as Smith had with "Mallrats." Tarantino's segment was a riff on an old "Alfred Hitchcock Presents" episode, "Man From the South." Not only did Tarantino duplicate the plot of "Man From the South," he had his characters describe watching "Man From the South" on television. Tarantino's "Four Rooms" segment was so steeped in references that it didn't have any original content.

The other writer who turned trivia into a trademark was Scott Rosenberg, whose screenplay for the thriller "Things To Do In Denver When You're Dead" (1995) copied Tarantino's formula of mixing black comedy with bloody action. "Things To Do In Denver" spent several months on a shelf before it flopped in theaters. Rosenberg regrouped with "Beautiful Girls" (1996), an ensemble comedy about young love in the '90s. The film contains a handful of provocative, interesting conversations, but it drowns in its own presumed cleverness. Despite the shortcomings of his first two screenplays, Rosenberg parlayed "Beautiful Girls" into a viable writing career -- he wrote the action picture "Con Air" for superproducer Jerry Bruckheimer.

Smith and Tarantino have already felt the backlash of fickle audiences. When Smith dove too deeply into pop culture with "Mallrats," audiences avoided the movie; when Tarantino ripped off an old TV episode for his "Four Rooms" segment, critics eviscerated him.

Rosenberg felt the same backlash when "Things To Do In Denver" flopped; the picture was lambasted as a "Tarantino rip-off." The reason critics hated "Things To Do In Denver" is the same reason the trivia trend may fizzle as quickly as it began: People who traffic in trivia lose touch with reality. "Things To Do In Denver" only made sense in relation to other movies, "Man From the South" was a pointless exercise in pop-culture recycling, and "Mallrats" was simply pointless.

Smith, Tarantino, and Rosenberg are commercially viable because they make films that speak to young audiences, but today's 20-year-old is tomorrow's 30-year-old. And when today's postadolescents become tomorrow's adults, they'll put away childish things. Tarantino and Smith already know what it feels like to go out of fashion as quickly as a clothing trend, and Smith is arguably the first of the Gen-X auteurs to grow up in public. Unless Tarantino and Rosenberg follow his example, their careers will be as short-lived as a mouthful of Pop Rocks.

 

"What You Don't Know"
(The Source, Aug. 2, 1995)

We have become carnivorous. We tear down everything we build, and that's true of nothing so much as our entertainment. We're so well-versed in the machine of stardom that we see endings in every beginning. As we watch Sandra Bullock become America's new sweetheart, we snicker at how easily she deposed Julia Roberts. The same trendsetters who trumpeted "Pulp Fiction" and called writer-director Quentin Tarantino a genius were the first to lambaste Tarantino for his acting performances in movies like "Destiny Turns on the Radio." No, we liked him last week. Catch up.

The blame for starting this kind of vicious gossip falls on whoever first let the word "shot" slip into the public vocabulary, as in, "that shot of the sun reflected in the window made a nice metaphor." Proving the cliché that a little knowledge is a dangerous thing, we all feel comfortable discussing movies on what we think to be the movies' own terms, yet the language we're using is one we only partly understand.

People aren't afraid to admit ignorance about music -- "I liked that part with the guitar" is about as technical as most folks get -- yet when it comes to movies, everybody has something to say: "I didn't think that last scene should have been in slow motion," or, "I didn't like how they used different actors to play the same character at different ages." Amateur criticism is often comparative -- "I thought Michael Douglas was better in 'Basic Instinct' than he was in 'Disclosure'" -- and the most dubious amateur critics are those who talk the best game, armed with the latest gossip about which popular male star wears lifts to stand taller than his female costars, or about which new film has actually been sitting on a shelf for a year.

But has the gossip machine spun out of control? Backstage rumors from the set of the colossally overbudget "Waterworld" were inescapable for a year preceding the movie's release. We're more interested in anticipating, criticizing, and making predictions about movies than we are in seeing them.

Our curiosity has done us in. For most of its first century, the cinema closely guarded its secrets -- director Alfred Hitchcock was famously coy about how he made films like "Vertigo" (1958) and "The Birds" (1963). But at the beginning of the blockbuster era, studios realized that telling people how the flying scenes in "Superman: The Movie" (1978) were faked and how the monster in "Jaws" (1975) got its bite could be part of a movie's marketing campaign.

Special effects magazines like "Cinefex" and "Cinefantastique" emerged, as did "Fangoria," which runs features about how makeup effects in horror movies were achieved. Mainstream media like the television show "Entertainment Tonight" and, later, "Entertainment Weekly" magazine showed fans "how they did it." As the public became more aware of special effects technology, filmmakers who used special effects became competitive with each other.

The studios behind the third "Star Wars" movie, "Return of the Jedi" (1983), and the live action/animated fantasy "Who Framed Roger Rabbit?" (1988) bragged in publicity campaigns about the number of plates used in complex effects shots. Each element added to a shot counts as one plate -- one for Luke Skywalker's lightsaber, say, and another for Darth Vader's. Both "Jedi" and "Roger" feature shots with more than one hundred plates. This information has nothing to do with whether a movie is good or bad; all it tells the public is that the movie is expensive.

By the late '80s, audiences knew more about movies before they were released than any previous generation of moviegoers could have imagined. And today, we not only know about special effects, we know how much money was spent, which actor was replaced during shooting, and which major sequence was cut. We even know about movies' marketing strategies.

With the new publicity outlets of the last decade and a half, movies are under discussion from the day production begins to the day of release, and sometimes even longer. Science-fiction fans are already spreading rumors about the next three "Star Wars" movies, the first of which won't hit theaters until 1999. By giving up its secrets, Hollywood has won free, nearly limitless publicity. But there's a price, of course. Filmmakers lost the element of surprise, because it's difficult to pull a fast one on educated viewers. We used to see a commercial for a movie once, then want to see it again to catch what we missed. But now, with our MTV-speed vision, we see an ad once and notice technical flaws.

We think we're "in the know," but we really don't know a thing: The information that leaks from a movie set is as carefully spun as news from a political campaign. We think we're driving, but we're being taken for a ride. And the only way to get off this ride is to give up our modern-day illusions and reclaim our old-fashioned ones. We need to admit we're not a step ahead of our entertainment, and we need to recover the naiveté that let us buy into the movies in the first place. We're looking at the cinema more closely than ever before, but we're doing it with blinders on.

 

"X Marks the Generation"
(Delmar Spotlight, May 27, 1998)

I'm not even 30 yet, but I already feel years older than most of my peers. And sometimes, when I spend time with teenagers, I barely feel part of their species. I could explain away my inability to relate to adolescents' interests if I, in turn, fit comfortably with adults, but the truth is that my personality falls somewhere between those extremes. Allow me to introduce myself -- I'm a Gen Xer.

Now let me clarify a few things. First, I have no business being the spokesman for the children of baby boomers, whom author Douglas Coupland dubbed "Generation X." I'm not hip and apathetic or carnivorously ambitious, so I don't fit either of the prevailing stereotypes of today's twentysomethings. That said, I was raised on "The Brady Bunch," disco music, and the cynical social climate that followed Watergate, so I was exposed to all the requisite Gen-X stimuli.

So why have I become a writer who listens to country music and loathes crude television shows like "South Park"? Why have I never wanted to attend a Lollapalooza concert, hook a beeper on my belt, or wear back-in-fashion bell bottoms around my hips like the millions of suburban white kids who pretend to be urban black kids? If I'm a member of Generation X, why don't I look the part?

The answer is I do, just not in ways that everyone can see. Like most of my peers, I have a grotesquely expansive knowledge of trivia gleaned from years spent rotting in front of the boob tube. I can tell you which actors played Lenny and Squiggy on "Laverne & Shirley," and I can sing the lyrics to the "Love Boat" theme.

I have the same ignorant contempt for everything and everyone around me that most of my Gen-X acquaintances have, but I struggle to remind myself that the ironic stance Gen Xers hold dear is little more than a timid façade used to ward off human contact.

I struggle to remind myself that many Gen Xers are the products of broken homes, so they have good reason to be gun-shy about people. Thanks to women's lib, the Pill, and free love, the divorce rate in the '70s was astronomical, so the story of how my parents broke up in 1976 is just one more entry in a litany of unsuccessful boomer marriages.

I struggle to remind myself that the obsessive nostalgia Gen Xers feel for the '70s and '80s is, in part, an attempt to reclaim happier times. In this aspect, I don't have to struggle very hard -- I see myself trying to reclaim innocence constantly. Just last week, I bought a book I'd spent 10 years trying to find, and when the shopkeeper handed it to me, I touched the dog-eared pages and the cracked cover with genuine affection. The book is a collection of superhero comic strips published in 1976, and any excuse I give to explain why I spent $15 on it is a lie unless I say the book is a part of my youth.

A lone $15 splurge seems innocent enough, right? Wrong. I waste trifling amounts of money regularly on nonsense like that book, and if I ticked off a list of every Gen Xer I know, I could name what they buy compulsively, whether it's kitschy lunch boxes, toys, or "Star Wars" paraphernalia. And they aren't all couch potatoes stuck in menial jobs. Harry, for instance, a college acquaintance who collects '70s lunch boxes featuring the likes of the Bee Gees, just directed a feature film for Paramount Pictures.

Then there's Walter, who has built an encyclopedic collection of '80s pop songs on hundreds of CDs; Michael, whose array of movie posters and action figures is voluminous; and Margaret, who still goes weak in the knees at the sight of Jon Bon Jovi.

There's no common thread that binds all of us other than our age. Many come from broken homes and others from happy ones; some are old enough to remember the Jonestown tragedy and some barely recall the "Challenger" explosion.

The common parlance of Gen Xers is trivia. I can't recall the number of times I've hit it off with someone merely because we struck upon a pop-culture touchstone in casual conversation, only to discover no real bond was made when I attempted to pursue the friendship. Similarly, I've rebuffed people who thought they were my friends just because we talked about eating Boo Berry cereal while watching "The Challenge of the Superfriends" on Saturday morning television in the '70s.

As my generation starts to outgrow its collective prolonged adolescence, we'll discover if the interpersonal crutches Gen Xers use in superficial relationships also impede their progress in substantial ones. I think about Kim, a friend who got married in her 20s, grew bored with her husband, and then started dating without annulling or ending her marriage, and I wonder if her attitude is typical among Gen Xers.

But then I think about other friends, serious people like Dave, a graduate student who has been devoted to the same girl since his first year of college. He can blather about television and kitschy '80s music as well as anyone, but he's also a responsible worker and a fiercely dedicated student.

When I think about people like Dave, I realize maybe I'm not such an anomaly among my peers after all. There are times when I talk to Dave and hear hints of that Gen-X malady -- prolonged adolescence -- but more often than that, I hear reason and intelligence.

But even as I realize I have kindred spirits my own age, I see we're not turning into the kind of grownups I remember looking up to. Our parents put away childish things, but we Gen Xers are holding onto childish things for dear life. We're keeping the memories of our childhoods alive because we saw what happened to our parents in the '70s. Some of them got cynical, some got divorced, and all of them got burned in the supernova explosion of their innocent '60s ideals.

As a group, Gen Xers don't have ideals to lose. We have our trivia, our toys, and our immaturity, and through them we remember what it felt like when life was about instant gratification. As years go by, we'll get older, smarter, and more responsible, and maybe someday, we'll let down our guard and feel things the way real people do. But we'll get to that point kicking and screaming, because when our parents wore their hearts on their sleeves 30 years ago, their hearts got broken. Gen Xers hide feelings because we're scared of making the same mistake.

 
