Wednesday, August 1, 2012

"ABC (aka Human Communication), It's Easy as 1, 2, ..."


            “It’s not cool to hate contemporary country music.” So goes the explicit thesis of Chuck Klosterman’s essay “Toby Over Moby,” in his pop-criticism collection Sex, Drugs, and Cocoa Puffs. While I don’t know if I can completely agree with his argument (of the small sample of current country music I hear in passing [radio, shopping center background noise, CMT Sunday morning music video flip-throughs, etc.], the overwhelming majority of it seems to promote a consciousness dominated by what many conservatives refer to as the “3-G paradigm” of guns, God, and gays [the first two entities being fervently supported and the last being adamantly opposed]. This paradigm does not lead to me “hating” country music or the people who make it, per se, but it makes me uncomfortable in a way similar to how a vegetarian would feel if (s)he took a part-time job working at a butcher’s shop. It provokes me to accept an attitude of “live and let live, and maybe I’ll even periodically take part in your aesthetic norms while we cohabitate, but when it comes to aligning myself with your tastes, I’ll pass”), Klosterman makes an interesting closing remark about “high-” and “low-”brow art and how it affects/reflects the intellectual lives of its connoisseurs:

“But whenever I go back to my hometown and see the people I grew up with—many of whom are still living the same life we all had twelve years ago as high school seniors—I realize that I was very much the exception. Lots of people (in fact, most people) do not dream about morphing their current life into something dramatic and cool and metaphoric. Most people see their life as a job that they have to finish; if anything, they want their life to be less complicated than it already is. They want their life to only have one meaning. So when they imagine a better existence, it’s either completely imaginary (i.e., Toby [Keith]’s nineteenth-century Lone Ranger fantasy) or staunchly practical (i.e., [Trisha] Yearwood’s description of the girl who just wants to get married without catching static from her old man). The reason Garth Brooks and Shania Twain have sold roughly 120 million more albums than Bob Dylan and Liz Phair is not because record buyers are all a bunch of blithering idiots; it’s because Garth and Shania are simply better at expressing the human condition. They’re less talented, but they understand more people.” (184)

What fascinates me most about this declaration is not what it says about country music and the people who make and listen to it. In fact, I think this statement has little to nothing to do with music at all. Instead, Klosterman is drawing a philosophical line in the Jersey Shore sand; he’s saying that there are roughly two kinds of people in the world.

  1. Those who see reality/living/the world as something inherently simple that should be enjoyed and/or toughed out.
  2. Those who see reality/living/the world as something inherently complex that should be analyzed and/or suffered (and enjoyed if there is any time left over after the analyzing and suffering. And there’s never any time left over [trust me]).

Klosterman uses music to categorize these types of people (People in Group 1 listen to contemporary country, and maybe bluegrass and folk, and maybe Top 40 in their cars. People in Group 2 listen to classical and jazz, and maybe classic and/or progressive rock, and maybe blues while they ride their bikes to work), but who these people are as thinking and feeling human beings is more important to me than what kind of music they listen to in their free time. Klosterman doesn’t give us any hints as to what contributes to the psycho-social makeup of people in Groups 1 and 2, and thus we are left to speculate on our own.
           
I suppose I have to preface my following thoughts by saying that I am a Group 2 person who openly wants to be a Group 1 person. I’ll call myself a Group -2 person (since I’m in Group 2 but don’t find it a positive character trait). For the most part I find reality/living/the world to be incredibly complicated and filled with unforeseeable joys and sorrows, most of which come and go without the input of human beings. I think too much, I overanalyze everything, and, perhaps proving Klosterman a visionary, I like listening to classical and jazz music because their frenetic structures and executions remind me of my thought life. When I listen to country and most popular music, I find myself agitated that more isn’t “getting done”; I find myself reaching the end of a country or pop song and wondering why the artist didn’t leave room in the 3-minute track for an impassioned improvisation. In the end, I suppose I listen to the lovely simplicity of much good country music and quietly think to myself “This is all too good to be true.”

While many contemporary writers and critics (almost all of them undoubtedly Group 2 people) write to bolster the identity and power of Group 2 (and, at the same time, express complete contempt for Group 1), this is not my purpose. Many Group 2 people are intellectual and cultural snobs who think Group 1 people are philistines because they didn’t go to a liberal arts college somewhere on a coast. In short, many Group 2 people think Group 1 people would be Group 2 people if they had more intelligence and/or taste and/or class. I find many Group 2 people (including myself) to be disillusioned, bitter, depressed quasi-narcissists. On the flipside, I find many Group 1 people to be hopeful (even though the hope often seems misguided), happy, ambitious quasi-narcissists (but is my perception of Group 1 and 2 people skewed because I’m a Group 2 person?).
            
Third-tier (I think that’s the lowest) academia trained me to be an analytical, critically thinking, socially-conscious human being, and for this I am grateful (pre-college my ideology was soaked in Protestant heteronormativity, Midwestern racism, and Coca-Cola). But in the process of aiding my intellectual and artistic growth, academia also trained me to be a staunch Group 2 person. It said, “You need to be analytical, and people who are truly analytical overanalyze [just in case regular analysis isn’t enough]. You need to be a critical thinker, and the best critical thinkers are those who most brutally criticize themselves and everything they hold dear. You need to be socially-conscious, and the people who are most conscious of others are unconscious of themselves.”
            
As much as I want to be a Group 1 person sometimes, I have recently realized that one of the most traumatizing parts of being a Group 2 person is that there is no going back (I don’t like the notion that going from Group 1 to Group 2, as Klosterman kind of alludes to, means going “forward”, while going from Group 2 to Group 1 means going “backward”). Once someone tells you that reality/life/the world is fractured, relativistic, and meaninglessly flawed, and you even remotely believe it (maybe because you have to get a good grade in a course or complete a certain advanced degree program), it seems impossible to coherently make sense of things like a Group 1 person. As a wise colleague of mine once said, “School [the heart/brain of Group 2 culture] teaches you many ways to take things apart, but it never teaches you how to put them back together again.”
            
I want a way to put things back together again. I think most Group 2 people who are happy being Group 2 people take a certain kind of joy in intellectual/political/social cacophony; they look at a box of scattered puzzle pieces and say, “So what? Maybe they weren’t meant to be put together.” I see the same puzzle pieces and say, “What’s the bigger picture?” Group 2 culture says there is no bigger picture, or that each piece is its own bigger picture, or that only Group 1 people search for a bigger picture because they’re too stupid/afraid/dependent-on-the-system to accept the individual pieces in all their random, incomplete glory. These ideas aren’t working for me any more than my childhood militant-Group-1 upbringing did [i.e., The scattered puzzle pieces aren’t actually scattered at all. They form a complete picture that you can’t see because you aren’t looking correctly or you misbehaved or you’ve been over-influenced by evil members of Group 2]. Group 2 culture took my peace of mind and I want it back (preferably without having to leave my small niche in Group 2; adapting to a Group 1/Group 2-hybrid could be even lonelier than living in Group 2).
           
In the end, I know that people can never be broken down into two simple groups (That’s too Group 1 of a thing to do. But does recognizing the flaws of dualism confirm my status as a member of Group 2?). Human beings always exist on a continuum of infinite possibilities, and this makes for infinite frustration as well as infinite potential. Hip-hop artist Mos Def was on point when he called his life a “beautiful mess.” But what do we do when we want to clean up the mess (even just a little)? Do we risk disrupting the beauty? And all of this gets me wondering: is there a Klosterman-numbered group for people who prefer hip-hop music to country and jazz?

Friday, July 27, 2012

Why We Need to Go to the Movies

For my fifth birthday, my mother took me to Indian Springs Mall. It was 1989, and this Kansas City mall was well known for its monthly shootings. That was Kansas City in the eighties and nineties. When you grow up in a ghetto, you don't really notice the dysfunction around you until you're old enough to remember and reflect from another location.

In my kindergarten class, a fellow classmate asked me, quite seriously: "Are you a Blood or a Crip?" I didn't know what either was so I asked him to explain himself, while I pretended to make pancakes on a Fisher Price stove set. "I don't know," he admitted. "But I know my brother's a Blood." When I went home that day, I asked my mother "what we were." I remember the troubled look she wore on her face. I never got a confirmation on my gang status.

But that's what we were working with. I lived in a city where you could get gunned down any minute, and all I wanted for my fifth birthday was to see Batman, in one of the most dangerous malls in the city. Well, my mother took me to see the non-stop action film about good vs. evil. The obvious good guy was Batman/Bruce Wayne, who was rich, had a sense of humor, and was still humble, if not vengeful, because the Joker had made him an orphan at an early age.

Some would say that five is rather young to subject your child to such a violent film. (There was a little sexiness in it too, if I remember. I know the Prince soundtrack was pretty sexy and I was allowed to listen to that cassette from start to finish.) My mother didn't have the best filter for adult films, but I am thankful that she allowed me to experience this film, specifically.

I say this because of the ongoing violent turmoil Kansas City was in. In a world where she and I had very little control of our surroundings, it was nice to escape into a film, based loosely on Any-Urban-City, USA, where the crime was high and only a man in a bat costume could save the people. According to Gerard Jones, author of Killing Monsters: Why Children Need Fantasy, Superheroes and Make Believe Violence:

"Children can benefit from exposure to fictional violence because it makes [them] feel powerful in a 'scary, uncontrollable world.' The child's fascination with mayhem has less to do with the fighting and more to do with how the action makes her feel. Children like to feel strong. Those committing violence are strong. By pretending to be these violent figures, children take on their strength and with it negotiate daily dangers."

Since that night in 1989, I've seen every single Batman film in theaters (even the George Clooney one). I've done it with the same amount of enthusiasm because Batman is the kind of hero that cities need. He's not Superman; he wasn't born with magical powers. He's a "regular guy" who is obligated to secretly protect a city. Not everyone appreciates the work he does, but he does it anyway. Because it's his destiny.

Fast-forward to today. In 2012, The Dark Knight Rises came to theaters at the time I really needed it: near my 28th birthday. I'm a lot older than the girl who came from Kansas City, but with a whole new set of fears. I'm a college grad working a dead-end minimum-wage job, worrying about her future in an economically and politically unstable time.

And to top it off, a lunatic stormed a movie theater on the Dark Knight's opening night. With a small army's arsenal, he shot some 70 people, killing 12. It was Indian Springs all over again, but on a much larger scale. It could have been enough to keep me from seeing the summer's biggest blockbuster, but I couldn't miss this. Right then, I needed Batman more than anything.

My husband and I sat in a theater two days after the Colorado movie theater massacre amongst others who were taking the same chance. We were relieved to see Bruce Wayne take control of his destiny and save Gotham City from herself.

And I was just as relieved as 5-year-old Charish. That Charish finally got to spend time with a single mother who worked so much that she barely saw her. They didn't have the disposable income to go to the movies often, but her mother made time that July night. They both, like the viewers today, needed a movie to tell them that they were in control of their lives.

I Am Not You and I Can Prove It


            
Often, Facebook is touted as the social networking mega-tool that ultimately shows us how similar we all are as human beings. We all have friends. We all have things that happen to us. We all feel the desperate need to tell the universe that we exist; that the moments in our lives should be memorable not only to us, but to those we consider ourselves close to.
            
But the more time I spend on Facebook (admittedly trying to promote Y.N.F.P. in the Internet language-sphere), the more I realize that human beings (as they present themselves online) are almost nothing alike. Or, I should say, almost no one I am friends with on Facebook is anything like me.
            
This realization came two days ago while I was scouring Facebook in an attempt to distract myself from work. I saw on my newsfeed that an old friend from high school had posted photos from her wedding. I perused the photos, spotting old acquaintances here and there throughout the wedding party and congregated matrimonial onlookers. The wedding seemed to go smoothly and I was happy and hopeful for all involved. But in viewing photo after photo, I found myself asking, “How do these seemingly joyous people have any time to party with all the scholarly reading that needs to get done? Isn’t this wedding getting in the way of hawking their creative work to unknown literary magazines? Does marital status show up on CVs, and if not, why bother with marriage in the first place? What is there to show for it professionally?”
            
These questions left me feeling cold. I clicked back to my married-old-high-school-friend’s main page and checked out what she is doing with her life. She is newly married, has a good full-time job, lives in a no-name (at least to me) town in the American Midwest, participates in the occasional arts and crafts project with her mother, and keeps in contact with old friends because she understands the communication up-keep to be a reflection of good character. In short, she is a happy, healthy young woman with excellent prospects for a successful future and likely no interest in contemporary American literature, let alone experimental poetry. Holy shit.
            
The title of this post could be “How a Facebook friend taught me that there is life outside academia/the arts/an artistic community/Word-Land” (it might actually have this title, as I have not yet titled the piece). In seeing my old friend living a nice, normal life (I guess “normal” meaning a type of life pursued by many college-educated, white, retirement-planning young people between the ages of 18 and 24ish) I completely understood how abnormal my language-dominated life is (not to mention the lives of my word-colleagues and word-friends!). To demonstrate this abnormality, I offer the reader the following account of my day thus far:
o   9:00 a.m. — Wake up and drink tea while reading Time, Juxtapoz, and National Geographic, and thinking about writing.
o   10:00 a.m. — Check various email accounts for updates about publishing opportunities.
o   10:15 a.m. — Search Internet for journals to publish in. Read other writers’ blogs and writers’ blog responses to other writers’ blogs, etc.
o   11:00 a.m. — Read and highlight sections of essay entitled “Can Poetry Matter?” for future Y.N.F.P. post.
o   12:00 p.m. — Rest eyes by watching No Reservations on Netflix. Think about Anthony Bourdain’s style of writing. Think about how many different ways there are to make a living with writing.
o   1:00 p.m. — Head into town to write at a local coffee shop. Think about the writing I will have to do later in the afternoon to make money. Think about Ralph Waldo Emerson and how I would rather be reading his Essays than writing to make money.

And that is it. All of it. My whole day — from the moment I woke up to this very moment — has been completely dominated by language/writing/literature/communicated-ideas.
           
I recount today’s events not to prove my dedication to my field. I recount them to show what I, in some twisted way, think passes for normalcy in the life of a graduate student. It’s not sick that I insulate myself in the written word (even though it kind of is), it’s sick that I think my life-behaviors are somewhat normal; it’s sick that I assume everyone else my age is as hyper-literate as I am until I am shown otherwise on Facebook.
            
But beyond the sick feeling in my stomach at the understanding of my life’s path (and the nasty taste in the back of my mouth from too much tea), there is a certain sadness that comes with knowing that most, if not nearly all, of the people I “know” (as much as anyone can really know anyone else in a Facebook “relationship”) spend roughly 0% of their lives thinking about the things I spend roughly 80% of my life thinking about. And they are probably 100% fine with this.
            
Until two days ago I believed writing was the easy way out of life challenges. I studied education as an undergraduate in college. It was a lot of work that ultimately left me feeling tired and hopeless. I dropped out of the program. My parents said I had to graduate college; they told me to pick something I was “good” at and get a degree. “You’ve always been good at writing,” they said, “why not get an English degree?” So I did, and when graduation came and I had no idea what to do with myself (the working world seeming to me to be, like my previous major, a lot of work) a professor said, “Why not get a Master’s degree in English?” So I did, and when graduation came and I had no idea what to do with myself (the working world seeming to me to be, like my undergraduate major, a lot of work) a professor said, “Why not get a PhD in English?” So I am, and it all seems like it came about without much thought on my part. In fact, it seems like the academic life was destined to be, since at every major transition point in my academic career someone said to me, “Well what else could you possibly do outside of writing/academia? You’re not good at anything else!” And, I suppose, that’s as good a reason as any to keep doing what you’re doing.
           
But in seeing my married-old-high-school-friend’s wedding photos, in seeing that she has a nice job in the rural or urban Midwest (both terms kind of blur into each other when describing Midwestern civilizations), I realized that when writers write, they are always drawing boundaries for their lives; they are always choosing writing instead of something else. I always assumed that I write because I can’t do anything else (I guess I similarly assumed that writers in general write because they are somehow unable to complete any other life[money-making]-activities with any degree of competence). But now I understand that I write because I can do other life-activities but, for some reason or another, don’t.
            
Words are a lot of work, even when they don’t feel like it. Writers are more than insurance salespeople that don’t sell insurance or farmers who don’t know how to grow or harvest crops. They/we are more than people who do stuff with words in a vacuum of all other career opportunities. Word-people take language seriously, probably knowing (maybe even “in spite of knowing”) that the majority of “normal” people don’t give a shit about what they say or think. I don’t know if writers’ consistent work in the face of an indifferent public is honorable or pathetic. Maybe it’s a little bit of both. But I suppose I am writing this, so that must mean something.
            
Thanks, Facebook.      

Wednesday, July 25, 2012

On Not Forgetting (Or, Remembering)


            I am, admittedly, on a Josef Pieper kick. In his essay “Remembrance: Mother of the Muses,” the author laments, “There are, indeed, large areas of reality in danger of being…forgotten” (Only the Lover Sings 62). What are these alleged “large areas of reality” that are slipping through the collective memory/imagination of humankind? Well, for one thing, can anyone remember what happens when one stands in a meadow at dusk?


            Although a bit dramatic, Jason Schwartzman’s character in I Heart Huckabees makes a strong point. The metaphorical “strip malls” of our high-tech, high-stress reality are making it almost impossible for human beings to remember the “meadows at dusk” of reality. In a vein similar to the thesis of Pieper’s previous essay, it is important to note that the human mind/imagination is not infinite; it is limited in the amount of data it can store. With so many external stimuli being thrown at human beings daily (and limited opportunities to duck/get out of the way), it is only natural that certain information gets accepted into our thought life while other information gets rejected/discarded.
            Since mental storage space is limited, it’s important for human beings to make it a discipline to hold on to the good stuff in reality (things that make them more compassionate, enlightened, and forward-thinking human beings) and reject the useless stuff (things that make them lazy, ignorant, and unproductive/destructive human beings). Deciphering what is good for one’s psyche, and is thus cleared for mental storage, from what is bad for one’s psyche, and should thus be denied mental entrance, can be difficult. Since this deciphering is not the focus of this essay (and since it is a topic the author struggles with personally and doesn’t feel ready to fully comment on at length), suffice it to say: if external stimuli are difficult, make one question his/her preconceived notions, push one to be a more compassionate person, and/or stretch one to previously unknown ethical and intellectual limits, it is likely stuff that should be stored in the mind. If the stimuli make one feel cold, apathetic, bored, and slightly gassy, one would be wise to reserve one’s mental storage space for a more efficacious brand of data.
            What Pieper is most interested in in “Remembrance” is not how one should decide what to remember and what to forget (in terms of vital and expendable Knowledge), but how one can go about remembering vital Knowledge that was at one time forgotten or unknowingly replaced. To use the Huckabees example once more, Pieper would ask, “What should one do once one has forgotten what it feels like to stand in a meadow at dusk?” The obvious answer is: stand in a meadow at dusk and feel to one’s heart’s content. But there is a suburban-sprawl-sized roadblock to this solution: the meadow has been destroyed and replaced with a strip mall, remember?
            Now the would-be meadow stander/feeler is in deep existential shit. She desires to stand in a meadow and feel whatever the experience has to offer. If she recognizes that the strip mall has destroyed her chance to physically stand in the meadow, she at least desires to remember what it feels like to stand in the meadow. But the passage of time and the accruing of expendable knowledge has left the meadow-memory in the dust. She can’t experience the meadow directly; she can’t experience the meadow viscerally. What is she to do?
            Pieper says, “Enter the artist, the great rememberer/reminder!” For Pieper, the artist’s job is to remember life/reality as it can/should/will be and remind her audience of her own visions. The artist looks at reality (sees it in an intense and critical way) and then creates work that displays this vision. The resulting work is shown to an audience who can then not only remember reality as it was before expendable knowledge (sports statistics, stock exchange figures, world news headlines, 401(k) dividends, GPA points, etc.) took over their mental real estate, but also be encouraged to clear more creative landscapes for future vital Knowledge intake.
The artist must say, “Here is my work. Here is my vision of reality. Take a look at it, brother and/or sister. If it challenges you to be a better human being, make space for it in your mind. Perhaps this space can be found by clearing out the junk-data that makes you feel like shit. Maybe my vision will remind you of your own vision, even if you think your vision is long forgotten. Welcome your vision back. See reality again for the first time.”
             The artist must not be afraid to remember what it means to be human. In remembering, and creating from her memory, she can remind others. And this reminding can uplift and enliven the masses. 

Sunday, July 22, 2012

On Seeing


In his “Learning How to See Again” from the collection of essays entitled Only the Lover Sings: Art and Contemplation, 20th century Catholic philosopher Josef Pieper makes the ominous observation that “Man’s [sic] ability to see is in decline.” It is important to note that when Pieper talks about “seeing,” he does not refer to the physiological act of vision. Instead, he means “the spiritual capacity to perceive the visible reality as it truly is.” In the end, Pieper concludes, “the average person of our time loses the ability to see because there is too much to see,” this overabundance of stimuli totaling up to a phenomenon the author calls “visual noise” (30).
When I consider Pieper’s idea of “visual noise,” I am reminded of a man interviewed in the graffiti documentary Bomb It. The interviewee, who I believe is/was a resident of Buenos Aires, comments on the plethora of high-tech visual stimuli (the majority of it corporate advertising) he is bombarded with on a daily basis. The man almost desperately relays to the audience his disgust at the words and images constantly forced into his line of sight, visuals he “didn’t ask to see.” In an interesting commentary on the effects of visual rhetoric, the man compares his unintentional prolonged sexual arousal at the sight of a female swimsuit model advertising perfume to a type of harassment or abuse. He is not in a relationship with this two-dimensional woman, and yet because she has forced her scantily clad self on him in the form of a 50-foot billboard, he cannot stop thinking about her for the rest of his afternoon. She has forced her way into his unconscious mind without his permission. He finds himself thinking about her body instead of his work. He has become deafened by an instance of too much “visual noise,” to the point that he can no longer hear himself think.
Pieper thinks “visual noise” has much greater political and psychological implications than unwanted erections caused by bikinied sirens hawking Chanel fragrances. He sees our global economy’s reliance on 24/7 hyper-marketing, and our global culture’s love jones for constant sensual stimulation, as the potential downfall of critical thinking at large and individuality in particular. He states “at stake here is this: How can man [sic] be saved from becoming a totally passive consumer of mass-produced goods and a subservient follower beholden to every slogan the managers may proclaim?” (33). In short, if human beings are constantly being told what to think and feel and do and buy (by external powers/institutions/conglomerates etc.), at what point will they give up the language and images produced in their own minds and simply go along with what they’re being told/given? Are we in danger of relinquishing our very wills to the Powers that Be, simply because an external voice convinces us that it is easier to be spoken for than to speak for ourselves?
I am not a doomsday believer. I cringe at the bumper stickers on the backs of pickup trucks in my hometown that warn viewers: “Don’t Believe the Liberal Media.” I don’t, for the most part, believe that human beings are being brainwashed into mindless automatons that open their wallets the moment a corporate commercial tells them to. All this being said, I would be lying if I said I’m not concerned for the critical-thinking faculties of members of my generation (including myself). In a technologically-stylized world that offers more and more opportunities for individuals (specifically young individuals) to gain access to information, it seems that it has become cooler and cooler for people to be uneducated and mis/un-informed (I direct the reader’s attention to current pop culture icons, and to these icons’ fascination with doing nothing and desiring nothing more than being who they are [which is often a person living blissfully within their own ignorance]).
            Pieper offers two solutions to global critical-thinking’s demise (thank goodness). The first, as one might stereotypically expect from a Catholic philosopher, comes in the form of abstinence. The world is cranking out hot air bent on robbing us of our individuality? We would do well to simply turn off our televisions, computers, radios (do people still listen to radio?), put down our unreliable newspapers and gossipy magazines, etc. If the world is drowning in “visual noise,” one way to float is to close our eyes.
            While limiting the intake of hyper-language/advertising is a great start to regaining one’s sense of self-language (see no evil, hear no evil, etc.), it is a negative solution to a positive problem (it tells us not to do something [consume] with hopes that what’s being done to us [forceful hyper-produced visuals/messages] will ultimately subside). It’s a passive solution that provides its practitioner with little ground to stand on (see no evil, hear no evil doesn’t make evil go away. In fact, it shows evil we will not do anything to retaliate when it attacks our minds/hearts/bodies. It shows evil who we hope not to be [consumers], rather than who we are/want to be.)
            I prefer Pieper’s second solution: “To be active oneself in artistic creation, producing shapes and forms for the eye to see” (35). If words are being forced on us and pushing us towards intellectual/emotional/spiritual complacency, it is not enough to close our eyes. We must combat the onslaught of language with language of our own. In a world focused on our consumption, we must create not only to show that we will not be mindless consumers, but also to regain our vision (in Pieper’s sense of the word) of what the world is and can be, of who we are and can be.
            Consumers see the world as it is given to them by external producers. They have no faculties to see it otherwise. They take what they are given and use it to make sense of their reality. This receptive-centric behavior can go well until the consumer decides she wants to be free, that is, until she decides her given reality is no longer working for her; then there is no hope for her because she has nothing to assist her in her liberation. She is forced to see the world as her captors tell her to see it. If she wants to “see” for herself again, she has to reclaim her own vision.
            But how does artistic creation lead to reclaimed vision? To answer this, it is necessary to quote Pieper at length:

“Before you can express anything in tangible form, you first need eyes to see. The mere attempt, therefore, to create an artistic form compels the artist to take a fresh look at the visible reality; it requires authentic and personal observation. Long before a creation is completed, the artist has gained for himself [sic] another and more intimate achievement: a deeper and more receptive vision, a more intense awareness, a sharper and more discerning understanding, a more patient openness for all things quiet and inconspicuous, an eye for things previously overlooked. In short: the artist will be able to perceive with new eyes the abundant wealth of all visible reality, and, thus challenged, additionally acquires the inner capacity to absorb into his mind such an exceedingly rich harvest. The capacity to see increases” (35).
           
If one wants to “see” reality (the world, her/his-self, etc.) one must not rely solely on the observations, and the subsequent recounting, of others. One must look for oneself. But it is difficult for one to focus on one’s own vision when other ways of seeing bombard her. In creating, the artist must look harder and truer at the things that are around her and inside her. In this harder and truer looking, the artist “sees” more clearly. But this new vision is not valuable in and of itself. It is only advantageous when it is used to help the artist navigate the treacherous landscapes of reality, where she will encounter (and hopefully help to heal) the poor-sighted brothers and sisters who live there.




Friday, July 20, 2012

From Confrontation to Retreat to Confrontation


            When I was seventeen years old I dated a headstrong (if not stubborn and melodramatic) young woman who would break your neck if you crossed her or any member of her tight-knit community of family and friends. I still talk to her on occasion and enjoy the fact that I am probably still a member of this community (tight-knit communities, in the minds and hearts of such women as the one I speak of, die hard). One weekend evening, during one of those crucial we-just-got-together-is-this-gonna-work-out high-school dating moments, my date and I were perusing the shelves of a local video rental store when I came across the film Life is Beautiful (I think it might’ve been the year’s Oscar-winner for Best Film, but don’t fact-check me on that). I called my date over from the nearby stacks she was sifting through and offered my selection for her approval. She stopped cold, immediately began crying, and informed me that the actor who played the male lead in the film, and whose face was prominently displayed on the video’s front cover, closely resembled her uncle who had just passed away. We never, of course, watched the film, and we broke up soon after the incident (there is probably something of value in this anecdote about how the arts have influenced my relationships with others, but I’m choosing to overlook it in favor of a meditation on movie-titling, rap records, writing, and, in some sense, hope. Bear with me).
In recent years I have sometimes thought about Life is Beautiful (which I believe I at one time learned was titled to promote a sense of irony amongst its viewers), not so much as a film but as a declaration, almost a way of life. What does it mean when one says “life is beautiful”? What worldview or ideology would one have to subscribe to in order to make such an audacious claim? How could one hold such a worldview or ideology in spite of all the ugliness that occurs daily?
For my metaphysical money, I prefer hip-hop artist Talib Kweli’s title-tracked, bold-voiced affirmation that “life is a beautiful struggle.” This notion pairs life’s potential elations with its unavoidable miseries, its natural health with its incurable illnesses, its sought-after victories with its crushing defeats. It is a notion tailor-made for artists and general-thinkers alike. I constantly and consciously write with it in the back and front of my mind. If I were the type of person who got tattoos across his belly (a la Tupac Shakur), it would be tattooed across my belly.
Writing is a perfect art form because it embraces, in a way that is tangible to its practitioners and consumers, both the beauty and struggle of human existence (and perhaps the existence of all living beings that come into contact with humans). It is easy for an average art-viewer to look at a painting and say to herself “that is beautiful” even if she on some level recognizes that the painter probably went through quite a struggle to create his/her piece. Visual arts often project only beauty or struggle to the average viewer who takes the act of seeing/viewing a piece as a natural (read: easy), rather than a critical (read: difficult), practice. Literature seems different.
Writing is work. Anyone who has seriously struggled over how to express a feeling or image or point of view with something as fragile as human language already knows this. Reading is work. Anyone who has read something and seriously struggled over how to make sense of it already knows this. (I would like to mention here that the creation and reception of visual arts are also work, very hard work indeed. Unfortunately, it seems many average contemporary viewers/witnesses of visual art experience it on a surface level that can easily be compared with entertainment rather than critical work. Such viewers, when approached by, say, a painting they don’t understand immediately, often quickly move on to the next piece in hopes of acquiring beauty or struggle more easily. Paradoxically, in our visually-dominated American TV consumer-culture, it seems many people are more willing to put in emotional and intellectual work over a poem they don’t understand than an image they don’t understand. Perhaps because deciphering a poem traditionally offers more cultural capital than examining the intentions behind an image? [I guess by this I mean that one seems more cool/smart to one’s peers if he/she “gets” what a poet is trying to say than if he/she “gets” what McDonald’s is trying to rhetorically accomplish with their most recent Big Mac billboard advertisement.])
Language’s inherent slipperiness makes it the ideal medium to express and receive life’s illogically balanced truths. Even the words “beautiful” and “struggle” bring so many contrasting and conflicting notions and images to mind. I like that: while we’ve already discussed the work that writers and readers do, language itself seems to participate in a type of arduous meaning-making (or at least meaning-trying) work. Every participant in a written linguistic event (writer, reader, language) is working, struggling, to find something beautiful or tangible or meaningful/sustaining. Writing and reading use language to encourage their participants to hold on to something (anything) permanent in our world of impermanence; this is a beautiful struggle in itself, a lovely and complicated microcosm of the destruction/confusion-obsessed reality we nobly live in/through.

Tuesday, July 17, 2012

April 29th 1992

"Riot in LA. Where the fuck were you?"


Well, I'll tell you where I was, Ozomatli. Exactly 20 years ago, I was a seven-year-old, trapped in Omaha, NE, feeling depressed about squirrel poetry. But across the Rocky Mountain range, in a little place called Los Angeles, people were rioting in the streets over Rodney King (remember him? "Can't we all just get along?"). In fact, other cities around the US were also up in arms against the "establishment."

As this song (originally composed by the ska band Sublime) suggests, the riots of '92 may have started out being about Rodney King, but like most riots they ended up being about people sick of authority. No one in government was especially interested in helping the little man, the middle class. Poor people were just poor and there was no helping the fact. Not after middle America grew fearful of Reagan's so-called "Welfare Queen." So no, no one was offering much in the way of "assistance." And in the end, people had just had enough.

They literally toppled "order," taking to the streets, burning buildings, and looting. You'd be surprised to know that a lot of what was looted was necessities. Pampers and whatnot.

Does any of this sound familiar? I suppose the closest we've gotten to those '92 riots are the Occupy camp-outs. But it's all centered around the same thing, right? Government still doesn't seem particularly interested in the little man and how sick and tired he is of being sick and tired.

I bring this song to your attention because, for one thing, it's a damn good song. It's a bombastic cacophony of beats and brass. It's also an accurate archive of history. Songs can be like that, poetry as well. The creative written word can be just as useful as the Library of Congress.

Just think of Marvin Gaye's Mercy Mercy Me (The Ecology). Like Rachel Carson, author of Silent Spring, Gaye was a little ahead of the curve in prophesying our nation's environmental demise. Even back then, the fish were stuffed to their gills with mercury and Gaye was writing about it. He sang about it and we can still sing about it today.

I can almost guarantee that Sublime's song will still be relevant a decade from now, just as Mercy Mercy Me has stood the test of time. What beautiful, eloquent piece of history will you archive today? Will some kid in the future take your artifact, brush the metaphorical dust off of it, and find it relevant to her life? I hope so.