The Unknown Known Review  


The Unknown Known
Directed by Errol Morris
Released in 2014

Donald Rumsfeld is a genius who found his calling in politics, which explains why he is utterly empty inside. Such is the infuriating thesis at the heart of Errol Morris’ new documentary on the former Secretary of Defense, and Rummy does not break once while staring down the barrel of Morris’ Interrotron. More than once, he flashes a terrible grin, the grin of a skeleton or, as Morris sees it, of Lewis Carroll’s Cheshire Cat. In a four-part, must-read New York Times series on his research on, and time with, Rumsfeld, Morris concludes, “I was left with the frightening suspicion that the grin might not be hiding anything. It was a grin of supreme self-satisfaction and behind the grin might be nothing at all.”

The Unknown Known will madden those who think Morris lobbed softballs at his subject. As far as Rumsfeld goes, what you see is what you get. We get a sense of his intelligence and of the shameful ends to which he put it, but not much more. In place of the catharsis Vietnam SecDef Robert McNamara croaked through in Morris’ similar, Oscar-winning The Fog of War, we get an essay on the weaponization of words and the tenuous justifications for modern warfare. To appreciate this film is to unpack it. For that reason it is a far more intellectually demanding film than The Fog of War, and thus, in my view, a superior one.

Save for an early recounting of the events of 9/11, Morris structures his film around a chronological run-through of Rumsfeld’s career. Set against a black backdrop, Rumsfeld addresses Morris’ camera in his self-described “cool, measured” way. While Morris smothers a photo montage of Rumsfeld and his wife in sappy music, Rummy retells his marriage proposal in laughably clinical terms: “I was correct. It was a good decision. It just wasn’t part of my plan.” He lights up when talking about himself, as when he recounts his behind-the-scenes machinations in the Nixon and Ford administrations. In the former’s case, he ducked out soon enough to avoid Watergate; in the latter’s, Rumsfeld criticizes his old boss’ weaknesses in leadership. In tandem with a fellow named Dick Cheney, Rumsfeld resigned in protest as Chief of Staff, spurring the “Halloween Massacre,” in which Ford reshuffled his Cabinet. Rumsfeld got a promotion, to Secretary of Defense, out of it all.

The way Morris cuts together the Halloween Massacre sequence clues us in on his complex take on his subject. Superimposed over footage of Rumsfeld’s swearing-in ceremony, newspaper headlines whiz by, all praising, through so-called objective language, Rumsfeld’s ruthlessness in getting what he coveted. They all ostensibly fuel his ego as he strides down the red carpet, honor guard in tow. The media’s love affair with political drama can be held accountable for incubating a man like Rumsfeld, whose indisputable intelligence benefitted only himself, when all is said and done. The jokes he cracks with the press in the lead-up to the Iraq War humanize him, to an extent, but they disturb more than anything else, because we notice a collusion between interviewee and interviewer, as frustration with Rumsfeld’s nonsense evasions cools into inappropriate camaraderie. If this reading needs further evidence, consider that the only other time Morris uses this flying newsprint-over-archival-footage approach is with Osama bin Laden, who descends a mountain with his walking stick as headlines express confusion over his whereabouts. They are both boogeymen made stronger by the noise they leave in their wake.

Throughout the film, Rumsfeld reads aloud a handful of the thousands upon thousands of memos, called “snowflakes,” he wrote during his tenure at the Pentagon. “Subject: Terminology,” he begins, before boring into three terms — “unconventional warfare,” “guerrilla,” “insurgency” — with which he sought to define the Iraq War, precisely because they are vague euphemisms. He boasts of how he purged unwanted words from the conversation, oblivious to how that approaches Orwell’s Newspeak. He clings to his infamous “unknown unknowns” — things “we don’t know we don’t know,” “The absence of evidence is not the evidence of absence,” et al. — as grounds for invading Iraq, sounding like Fred Leuchter, the Holocaust denier Morris interviewed for Mr. Death. Leuchter said, after a trip to Auschwitz, “It’s not what I found that convinced me. It was what I didn’t find.”

Rumsfeld gets it right, once. He reminds Morris how, back in 2008, Obama opposed the Patriot Act, indefinite detention and Guantanamo, yet they remain with us to this day. “That validates the decisions made by George W. Bush,” he says in what may be his most humble statement. The bigger and more connected our world becomes, the more grounds there are for suspicion, for actionable “intelligence.” Rumsfeld may be loathed more than most, but his breed will continue to occupy the highest offices. Morris cannot shake the moral void behind those eyes, as when he exclaims, “Wouldn’t it have been better not to go there [Iraq] at all?” and all he gets back is a smile and “I guess time will tell.” But this film stops short of excoriating Rummy for 106 minutes, and that is a wise choice, since such polemics are easy and self-evident at this point. What Morris does do is open up this focused but failed probe of a man to capture the rest of America in silent consent. Rumsfeld acts on our country’s worst tendencies, with more intellectual arrogance than anyone else, but he does so because such violence is kind of part of our deal.

Final Verdict:
3.5 Stars Out of 5

This article was written for The Cornell Daily Sun and can be viewed at its original location here.

Squirming Through the Classics  


Over break, I watched 26 feature-length films. Aside from a few mediocre new releases in theaters and on Netflix, these movies were classics, either from the art house tradition or the Golden Age of Hollywood. Before you question my sanity, know that I really enjoy these kinds of movies and I had some company — Sam Bromer ’16 dedicated his column last week to praising the intellectual value of some Criterion Collection films we watched. I agree with him that you feel good after sitting through demanding, “more nutritional,” as he puts it, fare from the olden days. That is, you do until a movie’s age begins to show, and not through cheesy special effects.

If “theme” is the artistic essence of narrative film — a reader of film cares how plot, aesthetics and cinematic form bring out a theme or question — then “representation” is its thorny by-product. A man filmed in a medium shot is more than just a man: He is a character, the actor playing that character and, whether the filmmakers intended him to be or not, a symbol. For what is anyone’s guess, though if that man wears a cowboy hat and talks, walks and looks like John Wayne, you can bet he, the man on-screen, stands in for ideals of honor, chivalry and masculinity. Of course, the same close reading should be applied to female characters as well, and it is there where things get awkward, especially when diving into the classics. 

John Ford was a master of his craft, winner of a record four Oscars for Best Director and a go-to textbook for formal nuance — he told stories through images and saw a script as just a “skeleton.” As a pioneer of the Western genre, Ford included a lot of American Indians in his films, most of them silent, savage antagonists. That’s the case in Stagecoach, the 1939 hit that made a star out of Wayne (and viewable now on Hulu Plus). While he atoned, somewhat, for past racism in his morally gray 1956 masterpiece The Searchers, his depiction of women remains interesting for the notes he struck right as well as those that were off.

In Stagecoach, Wayne’s character falls in love with a “woman of ill repute,” a common archetype in the lawless Old West of myth. There is a moment when Dallas, the prostitute, exposes her leg while climbing into the stagecoach, to the catcall of one old bastard, but for the rest of the film she keeps herself covered, even conservative in appearance. She seems ashamed less of her line of work than of the reaction she spurs from others, like the ladies behind the town’s Law and Order League, who are reminiscent of 1920s Temperance activists. Wayne’s Ringo smiles at her and prods his male peers to treat her with the same respect they automatically afford the pregnant aristocrat in their midst. Dallas appreciates Ringo’s kindness but hesitates, at first, to validate his romance. As the film progresses, Dallas acts as compassionate midwife, skilled homemaker and an increasingly vocal presence.

Ford builds sympathy for Dallas by moving her away from her past and toward respectability — he is far from a feminist. He is more a Catholic than a misogynist, subtly coding prostitution as bad, yet he also satirizes the hypocrisy of drunk, stupid men who look down on Dallas and then hope for the moment she will show some skin. Dallas is an admirable character, though not a very strong or self-made one. We like her because Ringo does, because this Male Gaze finds her appealing. Today, we find Dallas’ characterization flawed but, if we put the film in context, we recognize that she, and Ford’s framing of her, oppose prejudice, to an extent.

So it is awkward to fast-forward nearly 30 years to Sergio Leone’s 1968 Once Upon a Time in the West. The film is incredible in many respects, from its tense wordless opening to Henry Fonda’s uncanny bad guy performance. Yet there is a dimension or two missing from Claudia Cardinale’s character, a voluptuous widow whom crooks and vigilantes fight to control, both for her land estate and other, obvious assets. She is at the center of the conflict, and yet Leone does not afford her much empathy. She stays tight-lipped through most of the film, not airing her grievances, while falling into positions of increasing undress. Cardinale is a beautiful actress, so the men are probably not complaining, but one wonders why the sole female character is so used and abused throughout this canonical film. Perhaps a degree of irony is lost on me; if so, its subtlety is too refined.

These two films stop short of the pretty shameful “slut shaming,” as we call it today, that can be seen in Hitchcock’s Strangers on a Train. Early on, a gregarious psychopath murders the promiscuous wife of our protagonist, who had told the killer in confidence that he seeks a divorce. Prior to her death, the killer follows the wife as she giggles, licks an ice cream cone and pulls around anonymous, extramarital lovers. Just before he wraps his hands around her neck, she gives him a seductive glance, as if she wants to fool around with him too. Her subsequent murder strikes us not as awful but as deserved — she was asking for it. I do not believe in that conclusion one bit, but I am given little choice by the way Hitchcock, who never was known for being gracious to women, orders and frames the scene.

I still take something from Strangers on a Train because, you know, Hitchcock did it. Young filmmakers can learn their craft just from breaking down how he orchestrates any given sequence. Yet I do not disown the film just because I find its politics wrong. Critic Peter Labuza wrote last month how “Dated films are vital to our understanding of the past.” Hitchcock is a legend who lives on, but his time has passed. A filmmaker could and should steal from him today, if only to fix where the master failed.

This article was written for The Cornell Daily Sun and can be viewed at its original location here.

Noah Review  


Noah
Directed by Darren Aronofsky
Released in 2014

My favorite episode of HBO’s Curb Your Enthusiasm is called “Palestinian Chicken.” In it, Juliette, the wife of one of Larry David’s so-called friends, gives Larry a mission: Keep her away from dessert, “no matter what.” She lost 65 pounds through a careful diet, so her request sounds logical, disciplined. After the meal, Juliette tiptoes to the dessert table, peers side to side and reaches for a cake. Larry comes out of nowhere to grab it from her hands, and when she tries to laugh off her earlier charge, Larry whines, “But you said, ‘No matter what.’ This is the what. That’s why you asked me and not these other people, because you knew I wouldn’t let you!” He refuses to relent and the two tackle each other to the floor.

Replace Larry with the Biblical Noah, Juliette’s request with the word of God and the cake with the lives of Noah’s wife and children and you have the conflict at the heart of Noah, the new film by Darren Aronofsky (Black Swan, Requiem for a Dream). I realize that is a rather flippant analogy to set beside an adaptation of a sacred, 2,500-year-old text, but A) this is 2014, “God Is Dead,” yada yada yada, and B) Aronofsky has no intention of making a sanctimonious Cecil B. DeMille film. This is a film that rebukes blind faith, esteems free will and, through meticulous time-lapse sequences, promotes evolution. And like Larry David, Noah is less a hero than a dogmatic asshole. Aronofsky secularizes the story of Noah to the point that you, whatever your beliefs, should glean a provocative message or two regarding faith, human violence, love and so on. You will just have to fight against an unfocused screenplay and a truly erratic visual style to appreciate the film beyond a superficial, hey-look-it’s-Emma-Watson level.

The film opens with a montage of the events between the Garden of Eden and Noah’s time, covering 10 generations, thousands of years and a whole lot of bloodshed. It’s a trite way to open a movie tackling the dilemmas of human existence, since it reminds us of Lord of the Rings or any “Previously On ...” TV recap. But it introduces us to the Watchers, angels cast from heaven and doomed to lumber about the earth as rock giants, whose disfigured appearance I’d like to call “Doom Rococo.” They partake in some CGI-heavy battles later on, so you sense their presence is motivated by blockbuster expectations more than any narrative or thematic necessity. Thankfully, Aronofsky and co-writer Ari Handel justify their silly creatures with a subtle, melancholy conflict regarding the afterlife that comes to a head at a hectic scene of warfare right before the Flood. With Frank Langella voicing a prominent Watcher who helps Noah on his task, these beasts are more human than you would expect, which is a quietly impressive achievement.

Noah’s task is, of course, to build an ark in order to spare “the innocents” (a.k.a. animals) from the wrath of “the Creator” (not one use of the word “God”). He receives his mission through a pair of wordless, expensive-looking dreams that realize the terrifying image, “The waters of the heavens will meet the waters of the earth.” He has the assistance of not only the Watchers but also his immediate family, including his wife, Naameh (Jennifer Connelly), his put-upon son, Ham (Logan Lerman) and his adopted daughter turned daughter-in-law, Ila (Emma Watson). There is a lot of incest, implied and otherwise, in this film, and Aronofsky gives us no comment on it, which is weird. What he does stress is the ignominy of infertility, which a childhood wound inflicted upon Ila. And yet to be “barren” might be part of the Creator’s will, given his plans to wipe his finest creation off the face of the earth. That question — should I kill my family? — taunts Noah for the second half of the film, as they pass time in their ugly brick of a boat.

Yet Noah dilutes the potency of that central question. For one, the screenplay wanders. As gnarly as the scene-stealing Ray Winstone (The Departed) may be, his character Tubal-Cain serves as a standard-issue villain who distracts us from the meat of this story. Which is ironic, since he is the carnivorous exploiter of the earth’s resources to Noah’s vegetarian, forager family. In a very Game of Thrones-esque sequence at a miserable village, the film conflates meat-eating with pollution, prostitution, even cannibalism — its environmentalist message is as sensationalist as Elysium’s health care politics were naive.

Then we get to the film’s visuals, which oscillate between spectacular and nondescript. Aronofsky delivers his signature hallucinatory montage (remember the eye-opening one from Requiem?), here a triptych of Eve plucking the apple, the serpent and Cain’s raised fist over Abel. It’s awesome. Nor will I forget the sight of Noah’s bare feet standing on a blood-soaked sea of volcanic ash, or the sight of green water lapping against the urine-yellow skeleton of some mammal whose species, presumably, was snuffed out before the rain began. The most upsetting image calls back to European art, in particular Francis Danby’s The Deluge, as it depicts wailing human survivors clinging to a mountaintop besieged by waves. There is beautiful stuff here.

And there’s the rest of the movie. For whatever reason, Aronofsky opted for handheld close-ups of not just Noah but everyone. It looks like The Hunger Games — the first one. He ignores the production design and restricts his mise-en-scène (lighting, costume, blocking, symbolic geometry) to his actors’ faces. I mean, they’re good looking, but come on. Perhaps Aronofsky sought to empower his female characters by allotting them more screen time, for Jennifer Connelly owns every second. Such is a noble aim, yet in this case it leaves Noah, perhaps the most troubled protagonist in any recent blockbuster, a puzzle unsolved. It’s one thing to plunge into his psyche and emerge with no sure diagnosis; it’s another matter when an analogy between Noah and Larry David sticks.

Final Verdict:
3 Stars Out of 5

This article was written for The Cornell Daily Sun and can be viewed at its original location here.

The Wind Rises Review  


The Wind Rises
Directed by Hayao Miyazaki
Released in 2013

Any aspiring screenwriter has read Pixar’s “22 Rules of Storytelling” by now. Rule Six reads, “What is your character good at, comfortable with? Throw the polar opposite at them. Challenge them. How do they deal?” Pixar enjoys a sterling reputation because it tells tight, satisfying stories, wherein an unlikely protagonist braves a mountain of intensifying conflict and emerges victorious. Up fits all it’s got into a three-act structure, the Hollywood standard, and somehow makes it unforced, even sparse.

Only Hayao Miyazaki and the powerhouse he co-founded, Studio Ghibli, rival Pixar in the international market for acclaimed animated films. Yet Miyazaki tells stories that do not conform to Hollywood structure. That explains why the meandering Spirited Away proved so jarring to my nine-year-old brain, in addition to all its weird ghosts and pigs. His films use ma, the Japanese word for “space” or “pause,” to contemplative, disarming effect. The best of them are flat-out art films. So it is awkward to critique The Wind Rises, Miyazaki’s latest and potentially last film, for it conforms to a straightforward biopic formula that plays against Miyazaki’s strengths. Of course, everything on screen still brims with beauty and rewards symbolic reading.

For the first time, Miyazaki dramatizes the life of a historical figure, with the same name, look and all. We meet the young Jiro Horikoshi in a dream, in which he flies a plane of his own invention high in the sky until a monolithic flying fortress emerges from the clouds and sends Jiro and shrapnel plummeting to the ground. The Icarus myth recurs throughout the film, for while Jiro’s poor eyesight precludes a piloting career, he dedicates his life instead to designing the world’s sleekest and fastest airplanes, just as World War II beckons. That most prototypes snap and burst into flame on test runs only motivates Jiro to try harder, yet he must balance his perfectionism — exercised within a military sphere willing to sacrifice speed and safety if it means bolting a machine gun onto a wing — with compassionate, grounding human relationships.

Unfortunately, this conflict — between career and life, war and grace, male superiors and female loved ones — fails to reach us with Jiro at its center. From the start, Jiro is perfect: He saves a woman and child from the Great Kanto Earthquake of 1923, fastens a splint on the former’s broken leg like a Boy Scout, carries them to their families and ducks out before they can ask his name. Voiced by Joseph Gordon-Levitt in the only occasionally awkward English dub, Jiro could not be sweeter, what with his command of etiquette and mastery of kenjōgo (“polite language”). His romance with Naoko (Emily Blunt), a sickly girl with a mature outlook on life, offers only overwhelming sentiment, effective as it may be. Their love colors the second half of the film, and Miyazaki strains to connect it with his larger questions. Naoko can whisper into Jiro’s ear how great he is only so many times before they both flatten into cardboard. To quote Reverse Shot critic Eric Hynes on Dallas Buyers Club, “Never trust a film that applauds its own protagonist.”

The undisciplined narrative disappoints, since Miyazaki works best in looser, more radical genres than the standard biopic. But count on Miyazaki to trot out the weird and fantastic, in spite of all else. Werner Herzog lends his Bavarian tenor to a watercress-loving German whose quivering pupils resemble black, cartoon suns. Out of his mouth slither omens of impending war or wishes of health and happiness, and nothing in between. He haunts a resort in midland Japan more like an apparition than a human, as does the Felliniesque inventor Giovanni Caproni (Stanley Tucci), whom Jiro encounters multiple times in high-flying dreamscapes. In both Jiro’s dreams and reality, struggling aircraft emit moaning, guttural sounds. Miyazaki refrains from flooding the soundtrack with ambient particulars (think of all you hear during one establishing crane shot from, say, Pirates of the Caribbean), so this aural motif stands out as it humanizes machines that, to Jiro, serve a higher purpose than as weapons to kill.

Miyazaki has sustained attacks in his homeland charging that The Wind Rises communicates an “anti-Japanese” message in its depiction of the military as droning thugs and the war effort as misguided, at best, and sinful, at worst. I admire the film’s stance, even though it could have gone further by, say, mentioning the anti-Korean violence that followed the Kanto earthquake. But now and then the film looks out from its bubble. When Jiro and his best friend Honjo (John Krasinski) visit Berlin, they catch a glimpse of Gestapo agents chasing fleeing Jews through streets bathed in German Expressionist shadows. The Nazis stop to shove their flashlights in Jiro and Honjo’s faces, gritting their teeth to round them up too. Jiro saves them both, of course, but the uneasy alliance amongst Axis nations casts a more permeating pall over the film than any material gains Japanese engineers enjoy in their collaboration with the Nazis. That is a good thing.

Obviously, there is a whole lot I like about The Wind Rises. Miyazaki mulls over the outrageous paradox that only in times of war do governments support artists, provided they sacrifice all humane values they hold dear. Yet as pretty as every frame is, the film proves far more stimulating after you watch it, as you try to fish out significance from Miyazaki’s sincere intentions. Jiro is simply too good, and thus too boring, a protagonist for a film so concerned with mortality, compromise and geopolitical tension. He stands above the breakers of global tumult, undampened by its waves of red as he floats away on a raft of saccharine fantasy, one that he sees as the world entire.

Final Verdict:
3 Stars Out of 5

This article was written for The Cornell Daily Sun and can be viewed at its original location here.

O Brother, Where Art Thou?  


For the first time in what feels like forever, I had a real conversation with my brother. His name is Nick, he is a freshman at UCLA and he is just over a year younger than me, though you would not think it if you put the two of us back-to-back. By that, I mean he frequents the gym a lot, and it is there, back home, that he would advise me on what exercise to do and precisely how to do it. He makes a great trainer, and I thank him for helping his little older brother, but in our house or driving around, at dinner or just lazing about with nothing better to do, I sensed a distance between the two of us.

So it was a pleasure to call him Wednesday evening, after missing two of his calls, and hear nothing about video games, money or fitness — our go-to topics for half-hearted discussion over the last few years. Instead, he could hardly contain his excitement as he ran through everything that’s lately been on his mind: student government, campus activism, careers and the Westboro Baptist Church. Regarding that last one: Apparently members protested near UCLA a few days prior, brandishing their infamous “God Hates Fags” signs. I told him to save his bile for evils less fringe and trollish than that dinosaur, but I knew his head was in the right place.

Nick is starting to look at the bigger picture. It took me until second semester freshman year to do the same, to see college as a means not only to read books, have fun or get a good job but to change: First internally and then, you hope, out in the open. I never thought I’d talk divestment, the Israel-Palestine conflict and diversity in college admissions with my brother, but there he was, chewing through these issues and more with a passion that tells me further research and even action await in his future. In all likelihood, he will best my knowledge on these subjects, regardless of whether they are tied to his college curriculum or not. That is what is special about this kind of awakening: Important questions, regarding geopolitics, racism, faith and so on, leave the classroom and colonize your downtime, breeding lifelong pursuits and realigning your priorities.

I have used this line before, and I’m sure I stole it from somewhere, but the way I see it, you go to college to become a person. Sure, you may have scored a 2400 on the SATs, led your debate team to the championships and written a tear-jerking college essay that, together, proved to an Ivy League admissions office how much of a hot commodity you are. But the real test comes after, away from parents, class rankings and other ruthless motivators. In fact, college offers so many avenues for distraction that it can undo all that high school overachieving, which may not be a bad thing for some. For the rest of us, however, bridging the obligation (studying, worksheets, etc.) with the distraction (music, writing, activism, etc.) becomes the newest, most mind-blowing possibility.

I realize now how little I thought before. Maybe I was also a little happier, on the whole, back then — me in my ignorance. Then one day, months into the college experience, away from home and immersed in ideas that I didn’t totally understand, I came to a deflating realization: I do not matter. I was taking an astronomy course at the time, so the verdict may have been closer to “None of this matters.” If we keep this to comprehensible earthbound terms, the ramifications are the same: All those superlatives on your transcript amount to nothing in the grand scheme of things. It’s a rather depressing subject of consideration and it will always be, since once that internal switch turns on it cannot be turned off.

The only viable response to that humbling epiphany is as follows: But I want to matter! Whether you scream it aloud or never summon those exact words, therein lies the reason you get excited at anything more substantial than ice cream, from here on out. You think humanitarianism motivates fracking protesters or U.S. presidents? It does, of course, but so does reputation, self-importance, ego. We are too narcissistic a species to base our lives solely on the needs of others, and anyone trying to do good, by writing some preposterous novel or building the most efficient solar panel, knows this, deep down.

My brother is coming to grips with the effort, fueled from within, it will take to realize his dreams in life. His idealistic, go-getter oratory spells a future politician, entrepreneur or physician. Whatever path he chooses, he will outearn me, that’s for sure. I look forward to being there as witness and, now, confidant.

This article was written for The Cornell Daily Sun and can be viewed at its original location here.

Need for Speed Review  


Need for Speed
Directed by Scott Waugh
Released in 2014

Now, here is a video game movie. Typically, Hollywood buys the rights to a game like Prince of Persia: The Sands of Time for typical, Hollywood reasons: an exotic, albeit totally depoliticized, setting; a nifty time travel conceit; a male lead who can look good while swinging a sword. Need for Speed, a 20-year-old series of racing games, has no core locale, no human characters and no story. Back in fourth grade, I played Need for Speed: Hot Pursuit 2 on GameCube because it had a McLaren F1 LM that went so fast I could lap the cops chasing me. That simple pleasure motivates this adaptation, a film so poorly written and devoid of any self-awareness that its fundamental, thematic emptiness makes it a fascinating text, as well as a superficially, stupidly enjoyable one.

As if to prove its commitment to The Real and the spirit of Americana, Need for Speed spends its first 30 minutes in Westchester County. Because when we think muscle cars and blue collar roots, we think Westchester and, to be specific, Mount Kisco. I have visited that town before and found it surprising how the production transformed a town of 10,000 into an urban center 20 times as large. Turns out it filmed those scenes in Columbus, Georgia. Why didn’t the movie just start there?

I am at a loss, and so is Breaking Bad’s Jesse Pinkman, né Aaron Paul — here a mechanic named Tobey Marshall, who has spent his whole life in the town and never once boasts about a local restaurant or expresses any sense that he lives in an actual place that he either longs to break free from or hopes to never leave. The most he can summon is “Are you still allergic to Mount Kisco?” to an ex (Dakota Johnson) at a yellow-tinted, Drive-esque drive-in theater. Even the thespian who could break down at the sight of a vial of ricin has little clue what to do with a line like that.

Tobey gets his diverse — sans Asian dude, unlike The Fast and the Furious movies — band of bros pumped for the initial conflict when he mutters, “I’m, uh … behind … on the loan.” When Benny (Scott Mescudi, a.k.a. Kid Cudi) interjects about “last time,” Tobey, via the pen of screenwriter extraordinaire George Gatins, says, “This time is different,” and, “If you guys don’t show up tomorrow, we lose this place.” That place is an auto shop/man cave they run now that Tobey’s dad has passed — just before the movie starts, so that his cause of death can remain perpetually and pointlessly cryptic. It takes another death — this time of Tobey’s closest friend, after schoolyard bully-cum-racing millionaire Dino Brewster (Dominic Cooper) bashes his fast car with his own fast car and instigates a fiery, though undeniably pretty, slow-mo inferno — and the framing of Tobey, with Dino going scot-free, to set up the barest outline of a conflict: Tobey must win the De Leon, a secret race of modified supercars, to somehow prove his innocence and reassert his masculinity in the process.

This may not be obvious so far, but every character in Need for Speed is a terrible human being. Dino Brewster kills people, sure, with his pride and fake name and all. But the supposed good guys are sexist, unfunny idiots, too: Little Pete (Harrison Gilbertson) flirts with Tobey’s British love interest, Julia Maddon (Imogen Poots), with winning lines like, “I really like Piers Morgan.” The only discernible arc in Tobey’s character is his eventual acceptance of Julia as an actual person, and only after she drives well and helps him evade police custody. Meanwhile, violence against pedestrians or civilian cars is ignored or even glorified: When Pete hits a homeless man’s shopping cart during an earlier street race, he smiles and laughs as the man screams, “My house!” During every race, Tobey manages to cause at least one half-dozen-car pileup by driving on the opposite side of the road and cutting off SUVs and even school buses.

Naturally, the film never indicts its characters’ behavior. Whereas Transformers 2 can be easily lambasted for its offensive stereotyping, director Scott Waugh maintains a weird, remarkably open visual style that is either lazy assembly-line craftsmanship or sly, subversive commentary. The host behind the De Leon is none other than a nutty Michael Keaton, going by “The Monarch.” He spews pop philosophy into his microphone and webcam, like “Racing is art. Racing with passion — that’s high art.” Everyone in this film knows him and thus reveres him, and you wonder if his marked isolation, in a circular room with a long-suffering swivel chair, clues us in to his questionable sanity. Is he any different from The Joker and his home videos in The Dark Knight? Through the grammar of film, he is not, or not by much.

Then there is Benny, who commandeers a news helicopter and eventually a U.S. Army helicopter for reconnaissance during Tobey’s cross-country trek. He could bring down the whole American military with his smile and gift of gab, which Waugh shows us whenever he can, whether on-screen or through isolated intercom. That all these dudes get away with their reckless, irresponsible behavior and never even reflect on their violence could be just brainless filmmaking, or perhaps a super-ironic treatment of machismo and other harmful byproducts of exclusively homosocial relationships. I mean, given that one of the last shots is Tobey looking up at a white lighthouse, framed askew so it juts about 45 degrees across the screen, is it wrong to think a queer reading of this entire thing is in order? This is one of those films that is all surface, and, inadvertently or not, the motivation behind all that surface-level violence lies underneath, if you are willing to look. It’s fun, dumb and sexless enough that it already feels like a camp classic.

Final Verdict:
2.5 Stars out of 5

This article was written for The Cornell Daily Sun and can be viewed at its original location here.

More Than a Feeling  


William Blake's illustrations to Milton's "L'Allegro" and "Il Penseroso" (1816-1820)
So last week, the College Board announced big changes for the SATs, an awful and borderline anti-intellectual institution most of us — in college and beyond — survived and have since tried to forget. Who cares, right? Well, unfortunately, I do, because, effective in 2016, the College Board will no longer require students to write an essay. If you shoved any of my SAT essays (I took the test thrice) into my face today, I would hurl expletives, and maybe my lunch, back at you, because I’m sure they were platitudinous, benign and boring. But, goddammit, did I ace them. Removing the essay component from the SAT puts our academic priorities in all the wrong places, away from the written word, the value of a good argument and the process of creation.

I swear I’m not going to spend this whole column talking about SATs — I’ll get to movies shortly — but humor me for a little while as I reminisce about that time of so much undue stress. On test day, a Saturday, I woke up around 6:15 a.m., stood outside in the cold and waited on a slow-moving line just to flash my pass and student I.D. to some underpaid teacher. It was a miserable migraine of a so-called academic experience, yet the mood shifted once the SAT actually started.

I had 25 minutes to fill two pages with the best points, vocabulary and gerunds I could muster. There was no guessing, process of elimination or wasted seconds. You had to just go at it, and that’s what I did. Some impotent, probably underpaid knockoff of the Muse that Milton invoked so religiously in his poetry visited me in that high school classroom, for the thrill of besting my peers and the crunch of time inspired some … I wouldn’t call it literature, but it was some pretty good bullshit. And what surprised me, reading it over before the proctor called time, was how I packed all this ephemera into a discernible structure, with a shape to my argument and, most crucially, some evidence backing it up.

The rest of my SAT experience sucked, of course, but I value its essay component for reminding me what I excelled at, and how writing mattered as much as doing a bunch of math problems. The College Board only introduced the writing section — and the 800 points that came with it — in 2005. Whatever its motivations then, the dismissal of SAT writing now pushes the narrative that the humanities are on their way out — that numbers and filled-in circles equip prospective patrons of higher education better than an inspired, never-before-seen arrangement of words. This is a big problem in academia right now, one The Sun will dedicate a “Dialogue” to tomorrow in Ives Hall.

The delusion governing this administrative decision-making, in favor of STEM fields and against the liberal arts, is that the latter is not “practical” or even “rigorous.” This world, the thinking goes, needs more problem-solvers and fewer man-children pouring their feelings onto a page or piece of canvas. While the belittling of art bothers me, I take issue with the fundamental dichotomy being drawn. The worst English essay abandons “practicality” just as the laziest scientific paper tosses out the scientific method. A misguided student may ignore form, coherence and citations when writing about To Kill a Mockingbird, and instead lapse into solipsism, asserting how touched he or she was by the book and why that emotional response is so precious. A thousand-plus words later, the reader of this essay learns nothing and wonders how someone forced through the crucible of college essays and SAT writing could so thoroughly forget the lessons they were supposed to learn.

Art should never be devoid of feeling, but those evaluating it must keep that side of themselves in check. University humanities education focuses more on critical and analytical engagement with texts, whether they be books, paintings, films or songs, than on the process of creating them. Whereas the former requires schooling and immersion in a medium’s theory and history, the latter depends on shakier, unteachable tenets like vision, originality and, again, the Muse. Great criticism is an art on its own, for the author tests and engages with those three things during the act of writing. But in order for an analysis to carry absolutely any import, a critic must follow some form and move past his or her initial emotional reaction: Okay, you like this movie. Now, what evidence can you share?

That word “form” matters. Even the most perplexing film, like David Lynch’s Mulholland Dr., hits you at a rational, analytical and thematic level. Part of the thrill of watching that movie depends on an ineffable engagement with it, yet I know it is truly great because I detect a rigorously constructed chassis of ideas and storylines underneath all the superficial and beautiful obfuscation. If I can glean order from a surrealist nightmare such as Mulholland Dr., what can I learn, and subsequently teach, from the crisis in Ukraine or our ongoing economic imbroglio? I’m not sure, since I have not invested much time investigating those issues and am content sticking with the arts, thank you very much.

But the point is that criticism, when done right, is inherently constructive. The act itself, of putting pen to paper or fingertips to keyboard, constructs ideas, as if from whole cloth. Of course, we borrow and steal thoughts and turns of phrase more than we even know, but the balanced critic has come to terms with this. And when I use the word “critic,” I don’t mean the professional pundits who write for newspapers. Someone who just finished an SAT essay may find the cogs in his brain whirring at an unusually brisk speed, surprised that such a stupid exam with such a stupid prompt can inspire such elevated, almost automatic thought. He continues to think; he continues to write; he wonders how to channel an awakened passion for good.

This article was written for The Cornell Daily Sun and can be viewed at its original location here.
