6.12.2015

this is the braindead megaphone

I was lying in bed last night feeling braindead, as one might feel after wandering aimlessly through a dull but overstimulating kiddie carnival. That kiddie carnival was my phone. I suspected that I had spent too much time staring at it, and I’m certain that the lucidity of my mind is inversely correlated with the amount of time spent on said device. It’s dangerously easy to fill in the crevices of my day—the moments in transit, in boredom, in waiting—by refreshing and scrolling through my phone, a habit not unlike chain-smoking.

The mobile phone possesses traits common to both a good companion and a bad addiction: dependable, comforting, responsive, and easily integrated into one’s routine, which makes our attention to it both insistent and habitual. It becomes the first point of contact with much of the world, a means of both private and mass communication, and a medium through which we come to understand people, events, and ideas.

I typically fall into a failed dieter's remorse after any binge on technology: I berate myself and strive to enforce stricter disciplines. This rarely augurs long-term success, but last night my lamentation led to two actions: first, I turned off my phone completely, and second, I began working my way through the stack of papers I had brought with me to Los Angeles—I print everything out because I don't read well on my computer.

I started with David Grann’s horrifying and thrilling piece about a prison gang called the Aryan Brotherhood and then continued on to George Saunders’ remarkable essay “The Braindead Megaphone.” Though vastly different in form and content, both are examinations of the baseness and derangement of the human spirit—the former much more obvious, the latter more normalized, subversive, and arguably more pervasive.

“The Braindead Megaphone” (I’m referring to the essay, the first in a book by the same name) is a brilliant piece of writing from a decade ago, but it still applies. It’s a sharp and piercing comment on mass media, whose plethora of messages are fired at us relentlessly through a machine-gun-like contraption, with literally mind-numbing consequences.

Saunders begins with this premise: our mental experience differs from that of a man living in the year 1200 in “the number and nature of conversations we have with people we’ve never met.” He describes a media landscape that not only fosters but is ruled by “braindead megaphones”—voices whose rhetoric is unavoidable because of their loudness and dominance, not because of their intelligence. We’ve arrived at a point, Saunders argues, where we’re hardly aware of the dumbness and coarseness of the most blaring messages. Even more regrettably, because these megaphones rule our eyes and ears, their messages become our own: “what we hear changes the way we think.”

I see the dominance of these messages extending beyond the politicized opinions of traditional news outlets to the cultural and aesthetic ideals of beauty, coolness, and wit propagated through social media—on Twitter, Instagram, Facebook, especially—which continue to multiply themselves in a homogeneous fashion, like horny rabbits in captivity. While the development of mass media is not new, Saunders points out that we’re in “an hour of special danger if only because our technology has become so loud, slick, and seductive, its powers of self-critique so insufficient and glacial.” As a result, we’re often unknowingly consuming propaganda—dumbed-down information with an agenda, which in turn shapes and distorts our worldview.

A loud message, Saunders says, doesn’t actually require much intelligence to spread widely (read: Real Housewives of Orange County; also, cat memes). One’s message only has to be viable or watchable, which also often means simple (complexity and nuance don’t lend themselves well to loudness), shocking, entertaining, controversial, and flashy. The methods by which we parse these messages are weak—we're suckers for drama, conflict, and gossip. In fact, we might actually enjoy listening to people go on for 10 hours a day about (in Saunders' words) “a piece of dog crap in a bowl” (think: blue dress controversy).

While we’re caught up in the morbid, scandalous, and sensational details deemed so strangely urgent by the powers that be, we forget to think about things in terms of their morality, intelligence, impact, etc. “Where was our sense of agonized wondering, of real doubt?” asks Saunders. He’s not condemning silliness, only asking that we recognize silliness for what it is, and that we question whether our idiocy is really worth indulging in all the time.

By no means is Saunders criticizing our intelligence either; rather it’s because he assumes that we are bright and intelligent beings that he sees our increasing tolerance for stupidity as particularly tragic:
“Is human nature such that, under certain conditions, stupidity can come to dominate, infecting the brighter quadrants, dragging everybody down with it?”
Here’s what Saunders says is a good story:
“The best stories proceed from a mysterious truth-seeking impulse that narrative has when revised extensively; they are complex and baffling and ambiguous; they tend to make us slower to act rather than quicker. They make us more humble, cause us to empathize with people we don’t know, because they help us imagine these people, and when we imagine them—if the storytelling is good enough—we imagine them as being, essentially, like us. If the story is poor, or has an agenda, if it comes out of paucity of imagination or is rushed, we imagine those other people as essentially unlike us: unknowable, inscrutable, inconvertible.”
He ends with a simple antidote to the problem of the braindead megaphone, which isn’t to legislate against Stupidity, however tempting (this quote is great: “Can we legislate against Stupidity? I don’t think we’d want to. Freedom means we have to be free to be Stupid, and Banal, and Perverse”), but rather, simply to become aware of “the Megaphonic tendency” and to discuss it openly. In action, this translates to:
“Every well thought-out rebuttal to dogma, every scrap of intelligent logic, every absurdist reduction of some bullying stance is the antidote. Every request for the clarification of the vague, every poke at smug banality, every pen stroke in a document under revision is the antidote.”
I say, lend an ear to the naysayers, permit disagreement, leave room for the meek and quiet, read longer and more thoughtful articles; read books. Ask many questions, accept ambiguity, recognize complexity, understand that there are rarely catch-all solutions to huge problems; determine what is valuable and worthy, and spend time with those ideas.

And to myself I give this advice: take a break from all the media, from all the noise. Silence the megaphones once in a while, whenever you can; absent yourself, even if that means letting people down or being unavailable. Shut off your phone completely and disappear, even if that’s not the way of the world. It’s okay not to believe what everyone else says.
“We still have the ability to rise up and whip our own ass, so to speak: keep reminding ourselves that representations of the world are never the world itself. Turn that Megaphone down, and insist that what’s said through it be as precise, intelligent, and humane as possible.”

Amen, George Saunders, amen.

(Go read “The Braindead Megaphone” now!)

6.07.2015

this is how to impress people

A friend of mine is obsessed with impressing people—whether ironically or in earnest, I still have not figured out. “My personal brand of faking being impressive is being cocktail party interesting,” he writes, on a website called How To Seem Impressive. From what I gather, impressing people in this sense means piquing the interest of strangers or acquaintances so that they think highly of your persona and/or facade. Though I admit there’s at least a trace of this particular human proclivity in all of us, the conscious and primary pursuit of impressing people seems to me a total assault on the formation of one’s own mental faculties. One cannot fully realize the potential of her own brain—a potential I believe to be prodigious—if her thoughts and subsequent actions are shaped by what viscerally pleases other people.

This idea became clear to me when I began to think about not who impresses me, but who has impressed upon me—the latter being a much deeper admiration and respect that comes through strong, vivid, and often repeated impressions, rather than fleeting ones. When a stamp is impressed upon fabric, it leaves a mark, and so do the people who impress upon me, rather than merely impress me as a glittery parade float might.

The people (thinkers, writers, friends) who have impressed upon me share a few qualities. But the most salient are these:

Each thinks for herself, regardless of whatever is contemporary or mainstream or popular. In fact, each seems to disregard popular culture entirely.

Each is confident in her authority and voice. She believes—or at least acts as though she believes—in the significance of her own work, which arises from an unparalleled, inimitable existence. Without such a belief, the work crumbles—no one else can believe in its significance either. She trusts that if something is weighted with personal gravity and relevance, it must matter to many more people too. This is by far the hardest quality to come by. Self-doubt is crippling and much more common than assured but not bombastic confidence. Believing in the significance of one’s work helps one to persevere; perseverance begets completion. This confidence does not preclude humility, or admission of fault when applicable. Constant awareness of one’s own fallibility is essential to the progression of mind.

Each grapples with difficult subjects and ideas, not merely regurgitating what has already been said, or what has merely been taught. The work of grappling, whether intellectual or political or social, looks different for each person.

Each works hard. Genius may appear to be inherent or inherited, but the fruits of such genius are never without effort, and genius is only realized through labor. Each pursues her ideas to their furthest limits, beyond the limits set by those who preceded her.

Each exercises her distinct and idiosyncratic mental faculties to strive for truth, rightness, and lucidity of thought. The questions and answers in this endeavor may be neither popular nor impressive. However, a mind that exercises and strives in this way reaches its apotheosis. A unique consciousness working in the service of goodness—not merely capitalizing on its mimetic tendencies, which is much easier, much more primal, much more convenient, and much more feeble—is a noble and beautiful thing. (But of course, overcoming the majority of our mimetic tendencies requires much imitation and deep immersion in other people’s ideas and philosophies before we’re even able to muster anything new. The examination of ideas, events, and language that may be difficult to comprehend, taxing to extrapolate, and time-consuming to dwell on is crucial in building the foundational layers of one’s mind.) The person who does this understands the gravity and gift of human agency, which demands action. One does not accidentally stumble upon strength and confidence of thought. One is not born with mental faculties that are both fortified and humble, untroubled by passing winds but still open to their own betterment—one must consciously strive for them.

6.03.2015

this is the most human human award

When people ask what the difference is between this blog and my newsletter, I usually say that this blog contains content either too mundane or too long for my newsletter. The latter seems to be composed more of material I actually want to communicate to people, rather than private musings made public. Both are writing exercises, but any act of communication gives more weight to the audience and the receiver, whereas, when I write on this blog, I allow myself the gratuitous indulgences of the trivial and long-winded, knowing that this space is primarily for me, and that whoever reads this is merely privy to the space, and not the person for whom this space exists. But I am finding increasingly that some of the content I write for the newsletters is material that I would like to keep on this blog too, where there is a chronological continuity by which I can track what I think and write. Now and then, I'll be posting excerpts from my newsletters here.

After I came home from watching Ex Machina last week (highly recommend), I began researching the Turing Test, which tests a computer’s ability to produce responses indistinguishable from a human’s. The annual Loebner Prize competition is the most famous public display of the Turing Test, in which artificial intelligence programs (“chatbots”) compete for the “Most Human Computer Award.” Computer programs are paired with humans (“confederates”) for five-minute conversations, and the conversations are scrutinized by judges. If a computer can fool the judges at least 30% of the time, then it passes the Turing Test (note: many humans cannot pass the Turing Test). The test centers on the natural language abilities of participants, which are supposed to demonstrate one’s intelligence—rational, emotional, aesthetic, and otherwise.
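(For the programmer-minded, here is a toy sketch in Python of that popular 30% criterion. The function name, verdict labels, and sample numbers are my own inventions for illustration, not the competition’s actual machinery.)

```python
# Toy sketch of the popular 30% reading of the Turing Test: each judge
# chats with a hidden partner for five minutes, then renders a verdict,
# "human" or "machine". A chatbot "passes" if it is judged human by at
# least 30% of the judges. All names here are hypothetical.

def passes_turing_test(verdicts: list[str], threshold: float = 0.30) -> bool:
    """verdicts: one judgment per judge about the hidden participant."""
    fooled = sum(1 for v in verdicts if v == "human")
    return fooled / len(verdicts) >= threshold

# A bot judged human by 2 of 5 judges (40%) passes by this criterion;
# a confederate judged machine by most judges would, ironically, "fail".
print(passes_turing_test(["human", "machine", "human", "machine", "machine"]))  # True
print(passes_turing_test(["machine"] * 5))                                      # False
```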

The more interesting part of the Loebner Prize, I discovered, is the “Most Human Human Award,” which is given to the human confederate who is most convincing as a human, according to the same criteria applied to the competing computers. It seems both farcical and ironic to me that a human being would be tested for his human-ness; this test raises the question of what it means to be human, how we create criteria for human-ness, and perhaps more alarmingly, how the definition of “human-ness” changes as technology advances. Writer Brian Christian, who won the Most Human Human Award in 2009, wrote a book about his experience as a confederate in the competition (read his excellent article on the same subject here) and asks this question: “How, in fact, do we be the most human we can be—not only under the constraints of the test, but in life?”

Though Christian is told “Just be yourself” as advice to win the Most Human Human Award, he spends months researching, training, and preparing to be “the most human.” He examines the history of the computer and our relationship to it, which is a strange one: the original computers were actually humans; “computer” was in fact a job title for women who performed calculations and numerical analyses at financial firms. A long time ago, digital computers sought to imitate human computers; now, when we encounter a genius or math whiz, we say that his or her brain is “like a computer.” Christian remarks, “It’s an odd twist: we’re like the thing that used to be like us. We imitate our old imitators, in one of the strange reversals in the long saga of human uniqueness.”

In his research, Christian brings up human characteristics that we used to consider unique, like the abilities to use language and tools or do math, but that are no longer considered as such (because computers can do these things too!). “Is it appropriate to allow our definition of our own uniqueness to be, in some sense, reactive to the advancing front of technology?” he asks. “And why is it that we are so compelled to feel unique in the first place?” What he means is: Are we less human because machines are becoming more human? Do we determine our human-ness based on the abilities and limitations of computers? Perhaps we humans are becoming more like machines, he suggests.

Christian ultimately finds that the questions the Turing test elicits are also the most central questions of being human: “How do we connect meaningfully with each other, as meaningfully as possible, within the limits of language and time? How does empathy work? What is the process by which someone enters into our life and comes to mean something to us?”

In thinking about this question of what constitutes human-ness, I’ve become convinced that the Turing test is limited and flawed not only because—as other critics have noted—some human behavior is unintelligent, and some intelligent behavior is inhuman, but also because intelligence, and our verbal demonstration of it, seems to be only one facet of our humanity. This may be obvious, but it’s worth thinking about. Most of my days are consumed by judgments of my abilities and their resulting productivity, and much of this week has been spent criticizing my failures in language, writing, and communication. At the moment, I would probably fail the Turing Test if I had to take it. But when I think about what makes me human, I think not of my output but of my interior life and the complex terrain in me that is constantly seeking meaning, that yearns to connect and share with others the common experiences that make us feel less alien, less alone.

In reflecting on human-ness, I fixate on our capacity to feel pain, our inability to articulate our deepest suffering, the silent awe we experience in the face of overwhelming beauty, the complex and sometimes paradoxical interplay of emotions like jealousy and love, confusion and certainty, sorrow and joy. We are human because we can bear the contradictions of this life and because we are not constant, not steady, not predictable. We change with time and effect. We surprise one another. Hava Siegelmann once described intelligence as “a kind of sensitivity to things,” and I likewise see our sensitivities—and our reactivities—to barely detectable phenomena and nuances as crucial to “human-ness.” And also: our faith and our doubt, our search for meaning, our moral judgments, our conscience, our confrontation of the incomprehensible, our creation and imposition of narratives, our belief and our disbelief; the accumulation of wisdom over time; the way people imprint on us; the inexplicability of love and heartbreak.

See the newsletter in full here.