more on tom mccarthy and the international necronautical society

Symbolic Remainder


Tom McCarthy


Interviewed by Jeffrey Inaba


On behalf of the International Necronautical Society, novelist Tom McCarthy and philosopher Simon Critchley recently released their ‘Interim Report on Recessional Aesthetics’ to President Obama in the pages of Harper’s Magazine. Among their suggestions to the US leader was to read the recession allegorically, as ‘the intimate space at the heart of all economics, its muted truth’, and celebrate it ‘as you would the revelation of godhead itself’. Volume spoke with McCarthy about representing crisis and trauma – whether assaults against the economy or the body – and the death-driven compulsion to repeat these moments of intensity in seeking catharsis.


Jeffrey Inaba Can you explain the process of creating Remainder?


Tom McCarthy
Well, in a way the writing of the book came about by happy accident. I was just passively looking at a crack in the wall and had this moment of déjà vu during which I remembered a similar room with a similar crack. I remembered a building, or I kind of half-remembered it – it was like the composite memory Proust describes, in which you can remember a staircase in a house that never existed because you make a collage in your head from other houses you’ve known – and I thought it would be good to reconstruct this moment: to make the house and to put the crack in the wall.

So that’s what happened in the book. The hero, or antihero, starts by reconstructing a building he’s remembered, and by making everyone – all of his neighbors who he’s remembered – move to the rhythms he’s created as they cook liver or play piano. Then he expands the parameters of that reenactment zone until he’s reenacting shoot-outs in the street and bank heists. By the end he’s making planes fall out of the sky.


JI The hero/anti-hero of Remainder goes into a coma as a result of an object falling from the sky and hitting him on the head. How did you arrive at this device as a departure point for the novel? Was it immediately apparent that this was how the novel should start out?


TM
No, initially I had to consider that if the hero’s going to do all this stuff, he needs a lot of money to pay for it. So he could win the lottery, or inherit lots of money from an uncle like the character Jean Des Esseintes in that wonderful Huysmans novel Against Nature, which was definitely an influence on Remainder, but I just wasn’t convinced. Then I looked into compensation culture, trauma and post-traumatic stress disorder, and it tallied perfectly with his whole reenactment compulsion. For Freud, and for almost all psychologists, trauma is always linked to repetition afterwards: the reenactment and repetitive behavior. And so, yeah, it just kind of made sense. The idea of something falling from the sky is just straight Blanchot. One of the first things he points out in The Writing of the Disaster is that the word comes from ‘des astres’, literally, ‘from the stars’. It’s the Fall. You can read that as the death of god, the collapse of metaphysics, or in a Newtonian way, in the sense of gravity: things fall. And in Remainder you have lots of things falling – not just airplane parts or bits of technology, but also undisclosed matter and the share prices of stocks. He’s somehow reacting against this entropic universe and trying to delay the inevitable, but of course he ultimately fails.

He does get his memory back, but what’s lost is a sense of authenticity. I conducted a long interview with someone who’d been in a very serious accident resulting in motor-neuron damage, and he had to relearn how to do everything – from walking to lifting a glass.

And interestingly, he said, ‘I can do it now, I can lift up the glass and walk, but it seems fake. It seems like I’m simulating.’ Warhol said the same thing after he was shot. He said he felt like he was watching TV for the rest of his life.


JI In this issue of Volume we think about how narratives of crisis are told: what structures are employed to convey our experience of a world in flux? It seems that Remainder is not about narrative per se; rather, it’s about constant confrontations with the elements of storytelling, and in particular the objects that percolate as confrontations within a larger symbolic order.


TM
Yeah, the character keeps on going on about a carrot that won’t stay still. That’s a metonym for the whole material world: this thing that cannot be controlled. And I suppose, you know, objects are really important. They’re always really important in Freud.


JI Remainder is about all of these encounters with estranged objects. During moments of crisis, while we might obsess over how we construct logical explanations of the situation, it seems that crisis is really when things can’t be explained. It’s when there’s a breakdown of a given symbolic order. We question the relationship between the things we experience in the world and the way that the world is described. In that sense, do you see the post-traumatic reencountering of objects the protagonist goes through as analogous to crisis moments?


TM
Yes. He has to reprogram himself not only in terms of kinetic stuff and movement, but also in terms of language. He has a large staff, and he keeps having them look up words in the dictionary and text him the definitions. That informs his behavior. By the end, he’s more or less killing people because of dictionary definitions. So all of that is born out of crisis, out of catastrophe. As he’s moving away from the catastrophe he’s trying to remaster the symbolic order. But what for him is the happy ending – the euphoric, orgiastic ending – comes not through resolution, but through provoking an ultra-crisis. It’s when everything goes wrong, spectacularly wrong, when people are dying all around him and planes are crashing. At that moment, everything comes together. He’s at one with catastrophe.


Trauma studies report that only trauma is real. The trauma is the moment-in-time. It’s always excluded from narratives and histories of time because it’s always censored: the actual kernel of the disaster is always withheld from consciousness or narratable memory. And yet it’s the only moment which is true, which is real. Therefore trauma victims often try to recover that moment, as if it were some lost nirvana. The whole of Remainder is less a movement away from – or resolution of – crisis than it is an attempt to reenter crisis and retrigger it. In that respect it’s successful. I mean, in the end, he gets his disaster.

–from Volume Magazine

the philosophy of boredom: the boredom of philosophy


boredom as a philosophical problem

Svendsen’s conclusion: “Boredom is life’s own gravity.”

As a philosopher, from time to time one must attempt to address big questions. If one fails to do so, one loses sight of what led one to study philosophy in the first place. In my opinion, boredom is one such big question, and an analysis of boredom ought to say something important about the conditions under which we live. We ought not – and are actually unable to – avoid considering our attitude towards the question of being from time to time. There may be many initial reasons for reflecting on one’s life, but the special thing about fundamental existential experiences is that they inevitably lead one to question one’s own existence. Profound boredom is one fundamental existential experience. As Jon Hellesnes has asked: ‘What can possibly be more existentially disturbing than boredom?’


The big questions are not necessarily the eternal questions, for boredom has only been a central cultural phenomenon for a couple of centuries. It is of course impossible to determine precisely when boredom arose, and naturally it has its precursors. But it stands out as being a typical phenomenon of modernity. On the whole, the precursors were restricted to small groups, such as the nobility and the clergy, whereas the boredom of modernity is wide-ranging in its effect and can be said to be a relevant phenomenon today for practically everyone in the Western world.


Boredom is usually considered as something random in relation to the nature of man, but this is based on highly dubious assumptions regarding human nature. One could just as well claim that boredom is embodied in human nature, but that would also presuppose that there is anything at all that can be called ‘human nature’ – a presupposition that seems problematic to me. Postulating a given nature has a tendency to put an end to all further discussion. For, as Aristotle points out, we direct our attention first and foremost to that which is capable of change. By postulating a nature we are claiming that it cannot be changed. It can also be tempting to postulate a completely neutral human nature, where man has just as great a potential to experience sadness as happiness, enthusiasm as boredom. In that case, the explanation of boredom is exclusively to be found in the individual’s social environment. I do not believe, however, that a clear distinction can be made between psychological and social aspects when dealing with a phenomenon such as boredom, and a reductive sociologism is just as untenable as a psychologism. So I choose to approach the matter from a different angle, adopting a perspective based partly on the history of ideas and partly on phenomenology. Nietzsche pointed out that the ‘hereditary fault of all philosophers’ is to base themselves on man at a particular period of time and then turn this into an eternal truth. So I will make do with stating that boredom is a very serious phenomenon that affects many people. Aristotle insisted that virtue is not natural, but that it is not unnatural either. The same applies to boredom. Moreover, an investigation of boredom can be carried out without presupposing any anthropological constants, i.e., anything given independently of a specifically social and historical space. We are dealing here with an investigation of man in a particular historical situation. It is us I am writing about, living in the shadow of Romanticism, as inveterate Romantics without the hyperbolic faith of Romanticism in the ability of the imagination to transform the world.


Even though all good philosophy ought to contain an important element of self-knowledge, it does not necessarily have to take the form of a confession modelled on Augustine’s Confessions. Many people have asked me if I undertook this project because I suffered from boredom, but what I personally feel ought not to be of any interest to readers. I do not conceive philosophy as being a confessional activity, rather one that labours to gain clarity – a clarity that is admittedly never more than temporary – in the hope that the small area one feels one has shed light on will also be of relevance to others. From a philosophical point of view, my private conditions are irrelevant, even though they are naturally important to me.


I carried out a small, unscientific survey among colleagues, students, friends and acquaintances that revealed that they were on the whole unable to say whether they were bored or not, although some answered in the affirmative or the negative – and one person even claimed that he had never been bored. To those readers who have possibly never been bored I can say by way of comparison that deep boredom is related, phenomenologically speaking, to insomnia, where the I loses its identity in the dark, caught in an apparently infinite void. One tries to fall asleep, takes perhaps a few faltering steps, but does not gain sleep, ending up in a no man’s land between a waking state and sleep. In The Book of Disquiet, Fernando Pessoa wrote:


Certain sensations are slumbers that fill up our mind like a fog and prevent us from thinking, from acting, from clearly and simply being. As if we hadn’t slept, something of our undreamed dreams lingers in us, and the torpor of the new day’s sun warms the stagnant surface of our senses. We’re drunk on not being anything, and our will is a bucket poured out onto the yard by the listless movement of a passing foot.


Pessoa’s boredom is obvious – it is distinct in all its formlessness. It is, however, in the nature of things that very few people indeed can come up with an unequivocal answer as to whether they are bored or not. First, moods, generally speaking, are seldom intentional subjects as far as we are concerned – they are precisely something one finds oneself in, not something one consciously looks at. And second, boredom is a mood that is typified by a lack of quality that makes it more elusive than most other moods. Georges Bernanos’s village priest provides us with a fine description of the imperceptibly destructive nature of boredom in The Diary of a Country Priest:


So I said to myself that people are consumed by boredom. Naturally, one has to ponder for a while to realise this – one does not see it immediately. It is like some sort of dust. One comes and goes without seeing it, one breathes it in, one eats it, one drinks it, and it is so fine that it doesn’t even scrunch between one’s teeth. But if one stops up for a moment, it settles like a blanket over the face and hands. One has to constantly shake this ash-rain off one. That is why people are so restless.


It is perfectly possible to be bored without being aware of the fact. And it is possible to be bored without being able to offer any reason or cause for this boredom. Those who claimed in my small survey that they were deeply bored were as a rule unable to state accurately why they were bored; it wasn’t this or that that plagued them, rather a nameless, shapeless, object-less boredom. This is reminiscent of what Freud said about melancholy, where he began by stressing a similarity between melancholy and grief, since both contain an awareness of loss. But whereas the person who grieves always has a distinct object of loss, the melancholic does not precisely know what he has lost.


Introspection is a method that has obvious limitations when investigating boredom, so I decided to look critically at a number of texts of a philosophical and literary nature. I regard literature as excellent source-material for philosophical studies, and for the philosophy of culture it is just as indispensable as scientific works are for the philosophy of science. As a rule, literature is a great deal more illuminative than quantitative sociological or psychological studies. This applies not least to our subject, where much research has focused on how the deficiency or surplus of sensory stimuli causes boredom without this always being particularly illuminative when considering such a complex phenomenon as boredom. As Adam Phillips, a psychoanalyst, has expressed it: ‘Clearly, we should speak not of boredom, but of boredoms, because the notion itself includes a multiplicity of moods and feelings that resist analysis.’


—from Lars Svendsen, A Philosophy of Boredom (1999)

“we never experience an affect for the first time; every affect contains within it an archive of its previous objects.”

 

Is dwelling on loss necessarily depressing? Jonathan Flatley argues that embracing melancholy can instead be a road back to connecting with others, enabling one to productively remap one’s relationship to the world. Aesthetic activity can give one the means to comprehend and change one’s relation to loss.

Flatley’s argument shares with Freud an interest in understanding the depressing effects of difficult loss and with Walter Benjamin the hope that loss itself can become a means of connection and the basis for social transformation. The affective maps artists like Henry James produce can make possible the conversion of a depressive melancholia into a way to be interested in the world (cribbed from Flatley’s publisher).

Affective Mapping

 

The decisively new ferment that enters the taedium vitae and turns it into spleen is self-estrangement.

 

—Walter Benjamin, Central Park

 

In his influential 1960 book The Image of the City, Kevin Lynch explored the ways residents internalize maps of their cities. These cognitive maps give one a sense of location and direction, and enable one to make decisions about where one wants to go and how to get there.1 A later scholar helpfully defined cognitive mapping as “a process composed of a series of psychological transformations by which an individual acquires, stores, recalls and decodes information about the relative locations and attributes of the phenomena in his everyday spatial environment.”2 Lynch studied three different cities—Boston, Los Angeles, and Jersey City—and found that some cities are more “legible” to their residents than others. That is, “the ease with which [the city’s] parts can be recognized and can be organized into a coherent pattern” varies from city to city.3 In a nongrid city like Boston, with notable points of reference like the Charles River, Boston Common, and Boston Harbor, residents were quite able to assemble usable cognitive maps of the city through repetitive experience of it. Jersey City, on the other hand, organized by an incomplete grid, was found to be more undifferentiated and thus less legible. Many of its residents, Lynch found, had only fragmented or partial images of the city. Since an image of the total system in which one is located is of course a crucial element in establishing one’s confidence in one’s ability to live in the world—see friends, get to the hospital, buy groceries, go out to dinner, arrive at the train station on time—the lack of such an image can produce a sense of anxiety and alienation.

 

In his essay “Cognitive Mapping,” Fredric Jameson expanded the use of the term to suggest that just as one needs a cognitive map of city space in order to have a sense of agency there, one requires a cognitive map of social space for a sense of agency in the world more generally.4 Such a map’s function is “to enable a situational representation on the part of the individual subject to that vaster and properly unrepresentable totality which is the ensemble of society’s structures as a whole.”5 In other words, in its negotiation of the gap between local subjective experience and a vision of an overall environment, the cognitive map is an apt figure for one of the functions of ideology, which is, in Althusser’s now classic formulation, “the representation of the subject’s imaginary relationship to his or her real conditions of existence.”6 We all need such representations, no matter how imaginary, in order to make sense and move through our everyday lives. By the same token, “the incapacity to map socially is as crippling to political experience as the analogous incapacity to map spatially is for urban experience.”7

 

The difference with the social map is that where the totality of Boston is quite representable, the “totality which is the ensemble of society’s structures as a whole,” conversely, is not. And the socioeconomic systems we all must negotiate on a daily basis are becoming ever less representable.8 Increasingly, Jameson argues, the distance between the structures that order everyday life and the phenomenology and datum of that life itself has become unbridgeable.9 Cognitive mapping in this context would be an essential part of “a pedagogical political culture which seeks to endow the individual subject with some new heightened sense of its place in the global system.”10 Without such a picture, insights remain partial and fragmented; we remain mired in the logic of the system as it exists.

 

*

 

So then what is this thing I have been calling affective mapping? In the context of geography and environmental psychology, the term affective mapping has been used to indicate the affective aspects of the maps that guide us, in conjunction with our cognitive maps, through our spatial environment.11 That is, we develop our sense of our environments through purposive activity in the world, and we always bring with us a range of intentions, beliefs, desires, moods, and affective attachments to this activity. Hence our spatial environments are inevitably imbued with the feelings we have about the places we are going, the things that happen to us along the way, and the people we meet, and these emotional valences, of course, affect how we create itineraries. For instance, I live in downtown Detroit, and when I am in the suburbs around Detroit, I often get the sense that some people in the suburbs who have not crossed over the city limits for years carry around with them a map on which Detroit is a large, hazily defined space, but a space clearly marked by some mixture of fear, anxiety, sorrow, and nostalgia. They avoid Detroit not because of poor urban planning or a lack of landmarks but because of the emotions they have associated with the city space of Detroit.

 

Thus, by way of analogy, I would suggest that social maps are also marked with various affective values. To return to the example regarding the suburban resident who avoids Detroit, this is an affective map of social space as well, in a way that parallels ideology. For in all likelihood the person from the suburbs of whom I write is white, and Detroit is largely African American, and this split is of course overwritten by a class divide, so emotions about Detroit as a space are, for these suburban residents, inevitably also emotions about class and “race” and racism. In short, it is not just ideologies or cognitive maps that shape our behavior and practices in the world but also the affects we have about the relevant social structures of our world. The term affective map in this sense is meant to indicate the pictures we all carry around with us on which are recorded the affective values of the various sites and situations that constitute our social worlds.

 

I should perhaps reemphasize here that “map” is meant in a particular, metaphorical sense, a metaphorics that I hope does not too seriously limit the concept. The affective map, like Deleuze and Guattari’s rhizomatic map, is neither fixed nor stable: “The rhizome refers to a map that must be produced or constructed, is always detachable, connectable, reversible, and modifiable, with multiple entrances and exits, with its lines of flight. The tracings are what must be transferred onto the maps and not the reverse.”12 Such maps must be able to incorporate new information as one has new experiences in new environments; but this does not mean they are entirely self-invented. Rather the maps are cobbled together in processes of accretion and palimpsestic rewriting from other persons’ maps, first of all those defined in infancy by one’s parents, and later the maps that come to one by way of one’s historical context and the social formations one lives in.


in which I create another workplace contretemps—and emerge even further ahead!


last week i distributed in a meeting what I thought was a well-reasoned argument for engaging in more quantitative, metrics-based evaluation of fund managers, as well as some qualitative research informed by the academic discipline known as “depth psychology.” (none of that dr. phil crap for us. we’re serious over here). It seemed to me that bernie madoff provides the perfect example of the highly-driven portfolio manager with whom we should be investing over the short-term – in other words, a manager for the investor looking for the quick “in-and-out” market opportunity. Much of my analysis was based on recent allegations made against madoff regarding his person and his character. remember – we’re here to make money, so one’s own ethical reservations should be held in check.

i began the meeting by reading statements made by madoff’s former mistress, sheryl weinstein, in her recent book, intriguingly entitled madoff’s other secret, which contains much crucial material, salacious though it may seem to those not as market-minded as me, about her alleged affair with the ponzi-scheming madoff. mrs weinstein’s words, courtesy of the n.y. daily news, as read aloud in my own beautiful irish tenor voice (although I see now I should have attempted to mimic weinstein’s “new yawker” timbre):

 

this man was not well-endowed… bernie had a very small penis. not only was it on the short side, it was small in circumference. that he was now pointing it out to me was telling. it clearly caused him great angst. i wanted to be careful how i responded. men and their penises have a strange and unique relationship… i liked this man and didn’t want to emasculate him. his tiny penis hadn’t prevented me from climaxing… when we made love, i was on fire…

 

incredibly, no one in the room could realize the brilliance of what I was proposing! i guess they were all tired from watching so you think you can dance or whatever shit they do at night.

i explained in laborious detail that all we had to do was assign an investigator to research the private lives of some of our prospective portfolio managers, interview their ex-girlfriends, ex-wives, even their mothers … maybe offer the manager’s g.p. or urologist a small cash honorarium for divulging a couple of facts about said manager’s anatomy, and soon we would be in a position to assemble a roster of highly motivated managers adept at the art of the quick turn-around, the so-called “in-and-out” market manager.

 

remember, i told my slow-on-the-uptake colleagues, in freud’s world, everybody is always compensating for something

 

based on the venerable english legal maxim that “silence implies consent,” i figured that i had the go-ahead to begin getting quotes from private investigators and corporate security firms. i mean, i am dealing with people who think that the quality of street musicians or attractiveness of waitresses in mid-priced restaurants are reliable economic indicators… who think that mutual fund fees are reasonable!

 

alas, long story short, my boss approached me with the news that this initiative of mine was considered “a non-starter – some senior people think it is at odds with our brand values of ethics and professionalism… but surprisingly a number of the folks in design liked it.”  she then broached the idea of a “working vacation — and the company will pay for it! there’s a conference in ibiza we’d like you to attend. take a friend and stay a few extra days. take your dog if you want.”

freud on melancholia and hamlet

 

If my cigar was that small I wouldn’t let people take pictures of it!

 

In his great essay “Mourning and Melancholia,” Freud makes a sustained comparison between normal sadness (associated with grieving for the loss of a loved one) and the dispirited mood states and self-hatred of the disturbed self (which he associates with the clinical condition of melancholia). By introducing the concepts of object-relations theory, projective identification and introjection, the essay informed most psychoanalytic thought on melancholia and depression in the twentieth century. Whereas previous thinkers had considered melancholia to be a state of imbalance or a mood of despondency, Freud recasts it as a frame of mind characterized by the loss of something. Indeed, melancholia is properly characterized by loss of an object of which its subject may be unconscious. In this respect it seems to mimic the earlier characterizations of melancholy as a nebulous mood state of fear and sadness without cause. And by choosing the character of Hamlet as an exemplar, and admitting that the melancholic “has a keener eye for the truth than others who are not melancholic,” Freud seems to allow that melancholia may have a glamorous aspect. Despite his scientific ambitions, Freud concedes that the categories of melancholy and melancholia elude formal definition: “the definition of melancholia is uncertain; it takes on various clinical forms . . . that do not seem definitely to warrant reduction to a unity.”

 

In her anthology The Nature of Melancholy: From Aristotle to Kristeva (2000), Jennifer Radden highlights the technical contents of Freud’s great essay as follows:

 

Three aspects of “Mourning and Melancholia” distinguish it from earlier writing: the theme of loss, the emphasis on self-accusation and self-loathing in melancholic subjectivity, and the elaborate theory of narcissism, identification, and introjection it introduces. Melancholia represents loss of the “object,” that is, the beloved parent whose love has been perceived to be withdrawn. Self-accusation and self-hatred, which Freud describes as central characteristics of the melancholic patient, are a form of rage redirected from the loved object to the self.

 

Such redirected rage can occur because the self is deeply identified with the other. (This identification is so strong that Freud speaks of the other person as incorporated by the self. Introjection is Freud’s term for this process of incorporation.) In developmental terms, the infant’s love energy is at first directed exclusively upon the ego; later it turns to the other, a loved person with whom the infant is intimately identified. That identification allows the fantasy that the ego has incorporated the mother, or “object.” In those suffering from melancholia, some adult sorrow or slight reignites those infantile experiences. Now with the characteristic ambivalence of the oral phase, the ego attacks the loved, introjected “object” in self-accusations whose curious quality of indifference, Freud believes, proves their true object to be not the self but the incorporated other. (282)

 

 

Sigmund Freud, “Mourning and Melancholia” (1917)

 

Now that dreams have proved of service to us as the normal prototypes of narcissistic mental disorders, we propose to try whether a comparison with the normal emotion of grief, and its expression in mourning, will not throw some light on the nature of melancholia. This time, however, we must make a certain prefatory warning against too great expectations of the result. Even in descriptive psychiatry the definition of melancholia is uncertain; it takes on various clinical forms (some of them suggesting somatic rather than psychogenic affections) that do not seem definitely to warrant reduction to a unity. Apart from those impressions which every observer may gather, our material here is limited to a small number of cases the psychogenic nature of which was indisputable. Any claim to general validity for our conclusions shall be forgone at the outset, therefore, and we will console ourselves by reflecting that, with the means of investigation at our disposal to-day, we could hardly discover anything that was not typical, at least of a small group if not of a whole class of disorders.

 

A correlation of melancholia and mourning seems justified by the general picture of the two conditions.1 Moreover, wherever it is possible to discern the external influences in life which have brought each of them about, this exciting cause proves to be the same in both. Mourning is regularly the reaction to the loss of a loved person, or to the loss of some abstraction which has taken the place of one, such as fatherland, liberty, an ideal, and so on. As an effect of the same influences, melancholia instead of a state of grief develops in some people, whom we consequently suspect of a morbid pathological disposition. It is also well worth notice that, although grief involves grave departures from the normal attitude to life, it never occurs to us to regard it as a morbid condition and hand the mourner over to medical treatment. We rest assured that after a lapse of time it will be overcome, and we look upon any interference with it as inadvisable or even harmful.

 

The distinguishing mental features of melancholia are a profoundly painful dejection, abrogation of interest in the outside world, loss of the capacity to love, inhibition of all activity, and a lowering of the self-regarding feelings to a degree that finds utterance in self-reproaches and self-revilings, and culminates in a delusional expectation of punishment. This picture becomes a little more intelligible when we consider that, with one exception, the same traits are met with in grief. The fall in self-esteem is absent in grief; but otherwise the features are the same. Profound mourning, the reaction to the loss of a loved person, contains the same feeling of pain, loss of interest in the outside world—in so far as it does not recall the dead one—loss of capacity to adopt any new object of love, which would mean a replacing of the one mourned, the same turning from every active effort that is not connected with thoughts of the dead. It is easy to see that this inhibition and circumscription in the ego is the expression of an exclusive devotion to its mourning, which leaves nothing over for other purposes or other interests. It is really only because we know so well how to explain it that this attitude does not seem to us pathological.

 

We should regard it as a just comparison, too, to call the temper of grief “painful.” The justification for this comparison will probably prove illuminating when we are in a position to define pain in terms of the economics of the mind.2

 

Now in what consists the work which mourning performs? I do not think there is anything far-fetched in the following representation of it. The testing of reality, having shown that the loved object no longer exists, requires forthwith that all the libido shall be withdrawn from its attachments to this object. Against this demand a struggle of course arises—it may be universally observed that man never willingly abandons a libido-position, not even when a substitute is already beckoning to him. This struggle can be so intense that a turning away from reality ensues, the object being clung to through the medium of a hallucinatory wish-psychosis. The normal outcome is that deference for reality gains the day. Nevertheless its behest cannot be at once obeyed. The task is now carried through bit by bit, under great expense of time and cathectic energy, while all the time the existence of the lost object is continued in the mind. Each single one of the memories and hopes which bound the libido to the object is brought up and hyper-cathected, and the detachment of the libido from it accomplished. Why this process of carrying out the behest of reality bit by bit, which is in the nature of a compromise, should be so extraordinarily painful is not at all easy to explain in terms of mental economics. It is worth noting that this pain3 seems natural to us. The fact is, however, that when the work of mourning is completed the ego becomes free and uninhibited again.

 

Now let us apply to melancholia what we have learnt about grief. In one class of cases it is evident that melancholia too may be the reaction to the loss of a loved object; where this is not the exciting cause one can perceive that there is a loss of a more ideal kind. The object has not perhaps actually died, but has become lost as an object of love (e.g. the case of a deserted bride). In yet other cases one feels justified in concluding that a loss of the kind has been experienced, but one cannot see clearly what has been lost, and may the more readily suppose that the patient too cannot consciously perceive what it is he has lost. This, indeed, might be so even when the patient was aware of the loss giving rise to the melancholia, that is, when he knows whom he has lost but not what it is he has lost in them. This would suggest that melancholia is in some way related to an unconscious loss of a love-object, in contradistinction to mourning, in which there is nothing unconscious about the loss.

 

In grief we found that the ego’s inhibited condition and loss of interest was fully accounted for by the absorbing work of mourning. The unknown loss in melancholia would also result in an inner labour of the same kind and hence would be responsible for the melancholic inhibition. Only, the inhibition of the melancholiac seems puzzling to us because we cannot see what it is that absorbs him so entirely. Now the melancholiac displays something else which is lacking in grief—an extraordinary fall in his self-esteem, an impoverishment of his ego on a grand scale. In grief the world becomes poor and empty; in melancholia it is the ego itself. The patient represents his ego to us as worthless, incapable of any effort and morally despicable; he reproaches himself, vilifies himself and expects to be cast out and chastised. He abases himself before everyone and commiserates his own relatives for being connected with someone so unworthy. He does not realize that any change has taken place in him, but extends his self-criticism back over the past and declares that he was never any better. This picture of delusional belittling—which is predominantly moral—is completed by sleeplessness and refusal of nourishment, and by an overthrow, psychologically very remarkable, of that instinct which constrains every living thing to cling to life.

 

Both scientifically and therapeutically it would be fruitless to contradict the patient who brings these accusations against himself. He must surely be right in some way and be describing something that corresponds to what he thinks. Some of his statements, indeed, we are at once obliged to confirm without reservation. He really is as lacking in interest, as incapable of love and of any achievement as he says. But that, as we know, is secondary, the effect of the inner travail consuming his ego, of which we know nothing but which we compare with the work of mourning. In certain other self-accusations he also seems to us justified, only that he has a keener eye for the truth than others who are not melancholic. When in his exacerbation of self-criticism he describes himself as petty, egoistic, dishonest, lacking in independence, one whose sole aim has been to hide the weaknesses of his own nature, for all we know it may be that he has come very near to self-knowledge; we only wonder why a man must become ill before he can discover truth of this kind. For there can be no doubt that whoever holds and expresses to others such an opinion of himself—one that Hamlet harboured of himself and all men—that man is ill, whether he speaks the truth or is more or less unfair to himself. Nor is it difficult to see that there is no correspondence, so far as we can judge, between the degree of self-abasement and its real justification. A good, capable, conscientious woman will speak no better of herself after she develops melancholia than one who is actually worthless; indeed, the first is more likely to fall ill of the disease than the other, of whom we too should have nothing good to say. Finally, it must strike us that after all the melancholiac’s behaviour is not in every way the same as that of one who is normally devoured by remorse and self-reproach. Shame before others, which would characterize this condition above everything, is lacking in him, or at least there is little sign of it. One could almost say that the opposite trait of insistent talking about himself and pleasure in the consequent exposure of himself predominates in the melancholiac.

The essential thing, therefore, is not whether the melancholiac’s distressing self-abasement is justified in the opinion of others. The point must be rather that he is correctly describing his psychological situation in his lamentations. He has lost his self-respect and must have some good reason for having done so. It is true that we are then faced with a contradiction which presents a very difficult problem. From the analogy with grief we should have to conclude that the loss suffered by the melancholiac is that of an object; according to what he says the loss is one in himself.

 

 

1. “Abraham, to whom we owe the most important of the few analytic studies on this subject, also took this comparison as his starting-point. (Zentralblatt, Bd. II., 1912.)”

 

2. “The words ‘painful’ and ‘pain’ in this paragraph represent the German Schmerz (i.e., the ordinary connotation of pain in English) and not Unlust, the mental antithesis of pleasure, also technically translated as ‘pain’.—Trans.”

 

3. “The German here is Schmerzunlust, a combination of the two words for pain.—Trans.” 


robbe-grillet on film: “reality… is problematic. we run up against it as against a wall of fog”

 


The history of cinema is still rather short, yet it is already characterized by discontinuities and reversals. The majority of contemporary films that now pass for masterpieces would have been rejected by Eisenstein (and rightly so) as altogether worthless, as the very negation of all art.

 

We should reread today the famous manifesto Eisenstein and Pudovkin wrote in the 1920s on the sound film. At a time when, in Moscow, a brand new American invention was being announced that would permit the actors on the screen to speak, this prophetic text warned vigorously and with extraordinary clarity of vision against the fatal abyss into which cinema was in danger of sliding: Since the illusion of realism would be considerably strengthened by giving the characters a voice, cinema could let itself be led down the cowardly path of glib superficiality (a temptation that never stops menacing us) and from then on, the better to please the multitudes, could remain content with an allegedly faithful reproduction of reality. It would thus surrender all claims to the creation of genuine artworks: works in which that reality would be challenged by the very structures of the cinematic narrative.

 

Now, what Eisenstein demanded, with his customary vehemence, was that sound be used to create, on the contrary, new shocks: To the shocks between sequences created by montage (which links, according to relations of harmonic resonance or of opposition, the sequences to one another) should be added the shocks between the various elements of the sound track and still others between sounds and simultaneously projected images. As one may have expected, good Marxist-Leninist that he was, he called upon the sacrosanct "dialectic" in order to support this thesis.

 

But Communist ideology (alas!) could not save the Soviet cinema (which today is one of the worst in the world) from falling into the snares of glibness. In fact, good old "bourgeois realism" triumphed everywhere in the West as well as the East, where they simply rebaptized it "socialist." Eisenstein and his friends were rapidly subjected to the new universal norm: The montage of the visual sequences of their films (¡Que viva México! for example) was redone by the right-thinking bureaucracy, and all the sounds were made to follow obediently the recorded images.

 

Even in France, it was a theoretician of the extreme Left, André Bazin, who, merrily letting the dialectic go by the board, became the spokesman of illusionist realism, going so far as to write that the ideal film would entail no montage whatsoever, "since in the natural reality of the world there is no montage"! Thus, the numerous and fascinating forms of expression created in Russia and elsewhere during the silent era were summarily repudiated as if they were nothing but childish stammerings born of a merely rudimentary technique. Sound, wide screens, deep focus, color, long-duration reels: all of these have allowed us to transform cinema today into a simple reproduction of the world, which, in the final analysis, is tantamount to forcing cinema as an art to disappear.

 

If today we want to restore its life, its former power, and its ability to give us veritable artworks, worthy of vying with fiction or painting of the modern era, then we must bring back to film work the ambitiousness and prominence that characterized it in the days of silent film. And so, as Eisenstein urges, we need to take advantage of every new technical invention, not in order to subject ourselves even further to the ideology of realism but, quite the opposite, to increase the possibilities of dialectical confrontation within film, thereby intensifying the "release of energy" that is just what such internal shocks and tensions allow for.

 

From this point of view, the alleged realism of contemporary commercial films, whether they be signed by Truffaut or by Altman, appears as a flawless totalitarian system, founded on hackneyed, stereotyped redundancy. The least detail in every shot, the connections between sequences, all the elements of the sound track, everything, absolutely everything must concur with the same sense and meaning, with a single sense and meaning, and with good old common sense. The immense potential richness that is concealed in this stuff of dreams (these discontinuous, sonorous images) must be utterly reduced, subjected to the laws of normative consciousness, to the status quo, so that, at any cost, meaning may be prevented from deviating, swarming, bifurcating, going off in several directions at once, or else getting completely lost. The technicians on the set or in the various recording studios are there precisely to see to it that no imperfections and divergences ever occur.

 

But what is the significance of this will-to-reduction? What it all means, in the final analysis, is that reality (and a living reality at that) is reduced to a reassuring, homogeneous, unilinear story line, a reconciled and compromised, entirely rational story line from which any disturbing roughness has been purged. Plainly put, realism is by no means the expression of the real, of what is real. But rather, the opposite. Reality is always ambiguous, uncertain, moving, enigmatic, and endlessly intersected by contradictory currents and ruptures. In a word, it is "incomprehensible." Without a doubt, it is also unacceptable, whereas the first and foremost function of realism is to make us accept reality. Realism, therefore, has a pressing obligation not only to make sense but to make one and only one sense, always the same, which it must buttress tirelessly with all the technical means, all the artifices and conventions, that can possibly serve its ends.

 

Thus, for example, prevailing film criticism may blame a certain detective film for lack of realism, ostensibly because the murderer’s motives are not clear enough, or because there are contradictions in the scenario, or because there remain lacunae in the causal chain of events. And yet, what do we actually know about nonfictional attempts to solve real crimes? Precisely that uncertainties (at times essential ones) always persist until the end, as do unsettling absences, "mistakes" in the protagonist’s behavior, useless and supernumerary characters, diverging proofs, a piece or two too many in the puzzle that the preliminary investigation in vain tries to complete.

 

Reality, then, is problematic. We run up against it as against a wall of fog. Meanwhile, our relation to the world becomes still more complicated because, at every moment, the world of realism presents itself to us as if it were familiar. We become so used to it that we hardly see it: It is our habitat, our cocoon. Yet, actually, we stumble against what’s real with a violence we never get used to, a violence that no amount of previous experience can ever assuage, so that reality remains for us irremediably foreign and strange. The German words heimlich and unheimlich, which both Freud and Heidegger have used, though in different but here overlapping contexts, give indeed an idea of this lived opposition (fundamental because it is inescapable) between the strange and the familiar. Both the psychoanalyst and the philosopher insist that the familiarity we think we have with the world is misleading (i.e., ideological, socialized). To acknowledge and explore (even to the point of anguish) the world’s strangeness constitutes the necessary starting point for creating a consciousness that is free. And one of the essential functions of art is precisely that it assumes this role of revealing the world to us. This explains why art does not attempt to make the world more bearable (which undoubtedly is what realism does), but less so: because its ultimate ambition is not to make us accept reality but to change it.

 

the iconic imagery of Last Year at Marienbad


nabokov slaps freud around a bit!

 

Let the credulous and the vulgar continue to believe that all mental woes can be cured by a daily application of old Greek myths to their private parts. I really do not care. 

—Vladimir Nabokov, Strong Opinions (1973)

 

 

 

 

harold bloom on literary genius and the self

 

Where does the self begin? The Freudian answer is that the ego makes an investment in itself, which thus centers a self. Shakespeare calls our sense of identity the "selfsame"; when did Jack Falstaff become Falstaff? When did Shakespeare become Shakespeare? … Our recognition of genius is always retroactive, but how does genius first recognize itself?

 



 

What Is Genius?

Harold Bloom

 

In employing a Kabbalistic grid or paradigm in the arrangement of this book, I rely upon Gershom Scholem’s conviction that Kabbalah is the genius of religion in the Jewish tradition. My one hundred figures, from Shakespeare through the late Ralph Ellison, represent perhaps a hundred different stances towards spirituality, covering the full range from Saint Paul and Saint Augustine to the secularism of Proust and Calvino. But Kabbalah, in my view, provides an anatomy of genius, both of women and of men; as also of their merging in Ein Sof, the endlessness of God. Here I want to use Kabbalah as a starting-point in my own personal vision of the name and nature of genius.

 

Scholem remarked that the work of Franz Kafka constituted a secular Kabbalah, and so he concluded that Kafka’s writings possess "something of the strong light of the canonical, of that perfection which destroys." Against this, Moshe Idel has argued that the canonical, both scriptural and Kabbalistic, is "the perfection which absorbs." To confront the plenitude of Bible, Talmud, and Kabbalah is to work at "absorbing perfections."

 

What Idel calls "the absorbing quality of the Torah" is akin to the absorbing quality of all authentic genius, which always has the capacity to absorb us. In American English, to "absorb" means several related processes: to take something in as through the pores, or to engross one’s full interest or attention, or to assimilate fully.

I am aware that I transfer to genius what Scholem and Idel follow Kabbalah in attributing to God, but I merely extend the ancient Roman tradition that first established the ideas of genius and of authority. In Plutarch, Mark Antony’s genius is the god Bacchus or Dionysus. Shakespeare, in his Antony and Cleopatra, has the god Hercules, as Antony’s genius, abandon him. The emperor Augustus, who defeated Antony, proclaimed that the god Apollo was his genius, according to Suetonius. The cult of the emperor’s genius thus became Roman ritual, displacing the two earlier meanings, of the family’s fathering force and of each individual’s alter ego.

 

Authority, another crucial Roman concept, may be more relevant for the study of genius than "genius," with its contradictory meanings, still can hope to be. Authority, which has vanished from Western culture, was convincingly traced by Hannah Arendt to Roman rather than Greek or Hebrew origins. In ancient Rome, the concept of authority was foundational. Auctoritas derived from the verb augere, "to augment," and authority always depended upon augmenting the foundation, thus carrying the past alive into the present.

 

Homer fought a concealed contest with the poetry of the past, and I suspect that the Redactor of the Hebrew Bible, putting together his Genesis through Kings structure in Babylon, struggled to truncate the earliest author that he wove into the text, in order to hold off the strangeness and uncanny power of the Yahwist or J writer. The Yahwist could not be excluded, because his (or her) stories possessed authority, but the disconcerting Yahweh, human-all-too-human, could be muted by other voices of the divine. What is the relationship of fresh genius to a founded authority? At this time, starting the twenty-first century, I would say: "Why, none, none at all." Our confusions about canonical standards for genius are now institutionalized confusions, so that all judgments as to the distinction between talent and genius are at the mercy of the media, and obey cultural politics and its vagaries.

 

Since my book, by presenting a mosaic of a hundred authentic geniuses, attempts to provide criteria for judgment, I will venture here upon a purely personal definition of genius, one that hopes to be useful for the early years of this new century. Whether charisma necessarily attends genius seems to me problematic. Of my hundred figures in this book, I had met three—Iris Murdoch, Octavio Paz, Ralph Ellison—who died relatively recently. Farther back, I recall brief meetings with Robert Frost and Wallace Stevens. All of them impressive, in different ways, they lacked the flamboyance and authority of Gershom Scholem, whose genius attended him palpably, despite his irony and high good humor.

 

William Hazlitt wrote an essay on persons one would wish to have known. I stare at my Kabbalistic table of contents, and wonder which I would choose. The critic Sainte-Beuve advised us to ask ourselves: what would this author I read have thought of me? My particular hero among these hundred is Dr. Samuel Johnson, the god of literary criticism, but I do not have the courage to face his judgment.

 

Genius asserts authority over me, when I recognize powers greater than my own. Emerson, the sage I attempt to follow, would disapprove of my pragmatic surrender, but Emerson’s own genius was so large that he plausibly could preach Self-Reliance. I myself have taught continuously for forty-six years, and wish I could urge an Emersonian self-reliance upon my students, but I can’t and don’t, for the most part. I hope to nurture genius in them, but can impart only a genius for appreciation. That is the prime purpose of this book: to activate the genius of appreciation in my readers, if I can.

 

These pages are written a week after the September 11, 2001, terrorist triumph in destroying the World Trade Center and the people trapped within it. During the last week I have taught scheduled classes on Wallace Stevens and Elizabeth Bishop, on Shakespeare’s early comedies, and on the Odyssey. I cannot know whether I helped my students at all, but I momentarily held off my own trauma, by freshly appreciating genius.

 

What is it that I, and many others, appreciate in genius? An entry in Emerson’s Journals (October 27, 1831) always hovers in my memory:

 

Is it not all in us, how strangely! Look at this congregation of men;—the words might be spoken,—though now there be none here to speak them,—but the words might be said that would make them stagger and reel like a drunken man. Who doubts it? Were you ever instructed by a wise and eloquent man? Remember then, were not the words that made your blood run cold, that brought the blood to your cheeks, that made you tremble or delighted you,—did they not sound to you as old as yourself? Was it not truth that you knew before, or do you ever expect to be moved from the pulpit or from man by anything but plain truth? Never. It is God in you that responds to God without, or affirms his own words trembling on the lips of another.

It still burns into me: "did they not sound to you as old as yourself?" The ancient critic Longinus called literary genius the Sublime, and saw its operation as a transfer of power from author to reader:

 

Touched by the true sublime your soul is naturally lifted up, she rises to a proud height, is filled with joy and vaunting, as if she had herself created this thing that she has heard.

 

Literary genius, difficult to define, depends upon deep reading for its verification. The reader learns to identify with what she or he feels is a greatness that can be joined to the self, without violating the self’s integrity. "Greatness" may be out of fashion, as is the transcendental, but it is hard to go on living without some hope of encountering the extraordinary.

 

Meeting the extraordinary in another person is likely to be deceptive or delusionary. We call it "falling in love," and the verb is a warning. To confront the extraordinary in a book—be it the Bible, Plato, Shakespeare, Dante, Proust—is to benefit almost without cost. Genius, in its writings, is our best path for reaching wisdom, which I believe to be the true use of literature for life.

 

James Joyce, when asked, "Which one book on a desert island?", replied, "I would like to answer Dante, but I would have to take the Englishman, because he is richer." The Joycean Irish edge against the English is given adequate expression, but the choice of Shakespeare is just, which is why he leads off the hundred figures in this book. Though there are a few literary geniuses who approach Shakespeare—the Yahwist, Homer, Plato, Dante, Chaucer, Cervantes, Molière, Goethe, Tolstoy, Dickens, Proust, Joyce—even those dozen masters of representation do not match Shakespeare’s miraculous rendering of reality. Because of Shakespeare we see what otherwise we could not see, since we are made different. Dante, the nearest rival, persuades us of the terrible reality of his Inferno and his Purgatorio, and almost induces us to accept his Paradiso. Yet even the fullest of the Divine Comedy’s persons, Dante the Poet-Pilgrim, does not cross over from the Comedy’s pages into the world we inhabit, as do Falstaff, Hamlet, Iago, Macbeth, Lear, Cleopatra.

 

The invasion of our reality by Shakespeare’s prime personages is evidence for the vitality of literary characters, when created by genius. We all know the empty sensation we experience when we read popular fiction and find that there are only names upon the page, but no persons. In time, however overpraised, such fictions become period pieces, and finally rub down into rubbish. It is worth knowing that our word "character" still possesses, as a primary meaning, a graphic sign such as a letter of the alphabet, reflecting the word’s likely origin in the ancient Greek character, a sharp stylus or the mark of the stylus’s incisions. Our modern word "character" also means ethos, a habitual stance towards life.

 

It was fashionable, quite recently, to talk about "the death of the author," but this too has become rubbish. The dead genius is more alive than we are, just as Falstaff and Hamlet are considerably livelier than many people I know. Vitality is the measure of literary genius. We read in search of more life, and only genius can make that available to us.

 

What makes genius possible? There always is a Spirit of the Age, and we like to delude ourselves that what matters most about any memorable figure is what he or she shared with a particular era. In this delusion, which is both academic and popular, everyone is regarded as being determined by societal factors. Individual imagination yields to social anthropology or to mass psychology, and thus can be explained away.

 

I base this book, Genius, upon my belief that appreciation is a better mode for the understanding of achievement than are all the analytical kinds of accounting for the emergence of exceptional individuals. Appreciation may judge, but always with gratitude, and frequently with awe and wonder.

 

By "appreciation" I mean something more than "adequate esteem." Need also enters into it, in the particular sense of turning to the genius of others in order to redress a lack in oneself, or finding in genius a stimulus to one’s own powers, whatever these may emerge as being.

 

Appreciation may modulate into love, even as your consciousness of a dead genius augments consciousness itself. Your solitary self’s deepest desire is for survival, whether in the here and now, or transcendentally elsewhere. To be augmented by the genius of others is to enhance the possibilities of survival, at least in the present and the near future.

 

We do not know how and/or why genius is possible, only that—to our massive enrichment—it has existed, and perhaps (waningly) continues to appear. Though our academic institutions abound in impostors who proclaim that genius is a capitalistic myth, I am content to cite Leon Trotsky, who urged Communist writers to read and study Dante. If genius is a mystery of the capacious consciousness, what is least mysterious about it is an intimate connection with personality rather than with character. Dante’s personality is forbidding, Shakespeare’s elusive, while Jesus’ (like the fictive Hamlet’s) seems to reveal itself differently to every reader or auditor.

What is personality? Alas, we use it now as a popular synonym for celebrity, but I would argue that we cannot give the word up to the realm of buzz. When we know enough about the biography of a particular genius, then we understand what is meant by the personality of Goethe or Byron or Freud or Oscar Wilde. Conversely, when we lack biographical inwardness, then we all agree that we are uncertain as to Shakespeare’s personality, an enormous paradox since his plays may have invented personality as we now most readily comprehend it. If challenged, I could write a book on the personality of Hamlet, Falstaff, or Cleopatra, but I would not attempt a book upon the personality of Shakespeare or of Jesus.

 

Benjamin Disraeli’s father, the man of letters Isaac D’Israeli, wrote an amiable volume called The Literary Character of Men of Genius, one of the precursors to this book, Genius, together with Plutarch’s Parallel Lives, Emerson’s Representative Men, and Carlyle’s On Heroes and Hero-Worship. Isaac D’Israeli remarks that "many men of genius must arise before a particular man of genius can appear." Every genius has forerunners, though far enough back in time we may not know who they are. Dr. Johnson considered Homer to have been the first and most original of poets; we tend to see Homer as a relative latecomer, enriching himself with the phrases and formulas of his predecessors. Emerson, in his essay "Quotation and Originality," slyly observed, "Only an inventor knows how to borrow."

 

The great inventions of genius influence that genius itself in ways we are slow to appreciate. We speak of the man or woman in the work; we might better speak of the work in the person. And yet we scarcely know how to discuss the influence of a work upon its author, or of a mind upon itself. I take that to be the principal enterprise of this book. With all of the figures I depict in this mosaic, my emphasis will be on the contest they conducted with themselves.

 

That agon with the self can mask itself as something else, including the inspiration of idealized forerunners: Plato’s Socrates, Confucius’s Duke of Chou, the Buddha’s earlier incarnations. Particularly the inventor of the Hebrew Bible as we know it, the Redactor of the sequence from Genesis through Kings, relies upon his own genius at reimagining the Covenant even as he honors the virtues (and failings) of the fathers. And yet, as Donald Harman Akenson argues, the inventor-redactor or writer-editor achieved a "surpassing wonder," utterly his own. This exile in Babylon could not have thought that he was creating Scripture; as the first historian he perhaps believed only that he was forwarding the lost cause of the Kingdom of Judah. And yet he seems too cunning not to have seen that his invention of a continuity and so of a tradition was largely his own.

 

With the Redactor, as with Confucius or with Plato, we can sense an anxiety in the work that must have communicated itself to the man. How can one be worthy of the fathers with whom Yahweh spoke, face-to-face, or of the great Duke of Chou, who gave order to the people without imposing it upon them by violence? Is it possible to be the authentic disciple of Socrates, who suffered martyrdom without complaint, in order to affirm his truth? The ultimate anxiety of influence always may be, not that one’s proper space has been usurped already, but that greatness may be unable to renew itself, that one’s inspiration may be larger than one’s own powers of realization.

 

Genius is no longer a term much favored by scholars, so many of whom have become cultural levelers quite immune from awe. Yet, with the public, the idea of genius maintains its prestige, even though the word itself can seem somewhat tarnished. We need genius, however envious or uncomfortable it makes many among us. It is not necessary that we aspire after genius for ourselves, and yet, in our recesses, we remember that we had, or have, a genius. Our desire for the transcendental and extraordinary seems part of our common heritage, and abandons us slowly, and never completely.

 

To say that the work is in the writer, or the religious idea is in the charismatic leader, is not a paradox. Shakespeare, we happen to know, was a usurer. So was Shylock, but did that help to keep The Merchant of Venice a comedy? We don’t know. But to look for the work in the writer is to look for the influence and effect of the play upon Shakespeare’s development from comedy to tragicomedy to tragedy. It is to see Shylock darkening Shakespeare. To examine the effects of his own parables upon the figure of Jesus is to conduct a parallel exploration.

 

There are two ancient (Roman) meanings of the word "genius," which are rather different in emphasis. One is to beget, cause to be born, that is to be a paterfamilias. The other is to be an attendant spirit for each person or place: to be either a good or evil genius, and so to be someone who, for better or for worse, strongly influences someone else. This second meaning has been more important than the first; our genius is thus our inclination or natural gift, our inborn intellectual or imaginative power, not our power to beget power in others.

 

We all learn to distinguish, firmly and definitively, between genius and talent. A "talent" classically was a weight or sum of money, and as such, however large, was necessarily limited. But "genius," even in its linguistic origins, has no limits.


We tend now to regard genius as the creative capacity, as opposed to talent. The Victorian historian Froude observed that genius "is a spring in which there is always more behind than flows from it." The largest instances of genius that we know, aesthetically, would include Shakespeare and Dante, Bach and Mozart, Michelangelo and Rembrandt, Donatello and Rodin, Alberti and Brunelleschi. A greater complexity ensues when we attempt to confront religious genius, particularly in a religion-obsessed country like the United States. To regard Jesus and Muhammad as religious geniuses (whatever else they were) makes them, in that regard only, akin not only to one another but to Zoroaster and the Buddha, and to such secular figures of ethical genius as Confucius and Socrates.

 

Defining genius more precisely than has yet been done is one of my objectives in this book. Another is to defend the idea of genius, currently abused by detractors and reductionists, from sociobiologists through the materialists of the genome school, and on to various historicizers. But my primary aim is both to enhance our appreciation of genius, and to show how invariably it is engendered by the stimulus of prior genius, to a much greater degree than it is by cultural and political contexts. The influence of genius upon itself, already mentioned, will be one of the book’s major emphases.

 

My subject is universal, not so much because world-altering geniuses have existed, and will come again, but because genius, however repressed, exists in so many readers. Emerson thought that all Americans were potential poets and mystics. Genius does not teach how to read or whom to read, but rather how to think about exemplary human lives at their most creative.

 

It will be noted in the table of contents that I have excluded any living instances of genius, and have dealt with only three recently dead. In this book I am compelled to be brief and summary in my account of individual genius, because I believe that much is to be learned by juxtaposing many figures from varied cultures and contrasting eras. The differences between a hundred men and women, drawn from a span of twenty-five centuries, overwhelm the analogies or similarities, and to present them within a single volume may seem the enterprise of an overreacher. And yet there are common characteristics to genius, since vivid individuality of speculation, spirituality, and creativity must rely upon originality, audacity, and self-reliance.

Emerson, in his Representative Men, begins with a heartening paragraph:

 

It is natural to believe in great men. If the companions of our childhood should turn out to be heroes, and their condition regal, it will not surprise us. All mythology opens with demigods, and the circumstance is high and poetic; that is, their genius is paramount. In the legends of Gautama, the first men ate the earth, and found it deliciously sweet.

 

Gautama, the Buddha, quests for and attains freedom, as though he were one of the first men. Emerson’s twice-told tale is a touch more American than Buddhist; his first men seem American Adams, and not reincarnations of previous enlightenments. Perhaps I too can only Americanize, but that may be the paramount use of past geniuses; we have to adapt them to our place and our time, if we are to be enlightened or inspired by them.

 

Emerson had six great or representative men: Plato, Swedenborg, Montaigne, Shakespeare, Napoleon, and Goethe. Four of these are in this book; Swedenborg is replaced by Blake, and Napoleon I have discarded with all other generals and politicians. Plato, Montaigne, Shakespeare, and Goethe remain essential, as do the others I sketch. Essential for what? To know ourselves, in relation to others, for these mighty dead are among the otherness that we can know, as Emerson tells us in Representative Men:

 

We need not fear excessive influence. A more generous trust is permitted. Serve the great.

 

And yet this is the conclusion of his book:

 

The world is young: the former great men call to us affectionately. We too must write Bibles, to unite again the heavens and the earthly world. The secret of genius is to suffer no fiction to exist for us; to realize all that we know.

 

To realize all that we know, fictions included, is too large an enterprise for us, a wounded century and a half after Emerson. The world no longer seems young, and I do not always hear the accents of affection when the voices of genius call out to me. But then I have the disadvantage, and the advantage, of coming after Emerson. The genius of influence transcends its constituent anxieties, provided we become aware of them and then surmise where we stand in relation to their continuing prevalence.

Thomas Carlyle, a Victorian Scottish genius now out of fashion, wrote an admirable study that almost nobody reads anymore, On Heroes, Hero-Worship and the Heroic in History. It contains the best remark on Shakespeare that I know:

 

If called to define Shakespeare’s faculty, I should say superiority of intellect, and think I had included all under that.

 

Adumbrating the observation, Carlyle characteristically exploded into a very useful warning against dividing any genius into its illusory components:

 

What indeed are faculties? We talk of faculties as if they were distinct, things separable; as if a man had intellect, imagination, fancy, etc. as he had hands, feet and arms.

 

"Power of Insight," Carlyle continued, was the vital force in any one of us. How do we recognize that insight or force in genius? We have the works of genius, and we have the memory of their personalities. I use that last word with high deliberation, following Walter Pater, another Victorian genius, but one who defies fashion, because he is akin to Emerson and to Nietzsche. These three subtle thinkers prophesied much of the intellectual future of our century that has just passed, and are unlikely to fade as influences during the new century. Pater’s preface to his major book, The Renaissance, emphasizes that the "aesthetic critic" ("aesthetic" meaning "perceptive") identifies genius in every era:

 

In all ages there have been some excellent workmen, and some excellent work done. The question he asks is always: In whom did the stir, the genius, the sentiment of the period find itself? Where was the receptacle of its refinement, its elevation, its taste?

 

"The ages are all equal," says William Blake, "but genius is always above its age." Blake, a visionary genius almost without peer, is a superb guide to the relative independence that genius manifests in regard to time: it "is always above its age."

 

We cannot confront the twenty-first century without expecting that it too will give us a Stravinsky or Louis Armstrong, a Picasso or Matisse, a Proust or James Joyce. To hope for a Dante or Shakespeare, a J. S. Bach or Mozart, a Michelangelo or Leonardo, is to ask for too much, since gifts that enormous are very rare. Yet we want and need what will rise above the twenty-first century, whatever that turns out to be.

 

The use of my mosaic is that it ought to help prepare us for this new century, by summoning up aspects of the personality and achievements of many of the most creative who have come before us. The ancient Roman made an offering to his genius on his birthday, dedicating that day to "the god of human nature," as the poet Horace called each person’s tutelary spirit. Our custom of a birthday cake is in direct descent from that offering. We light the candles and might do well to remember what it is that we are celebrating.

 

I have avoided all living geniuses in this book, partly so as to evade the distractions of mere provocation. I can identify for myself certain writers of palpable genius now among us: the Portuguese novelist José Saramago, the Canadian poet Anne Carson, the English poet Geoffrey Hill, and at least a half-dozen North and Latin American novelists and poets (whom I forbear naming).

 

Pondering my mosaic of one hundred exemplary creative minds, I arrive at a tentative and personal definition of literary genius. The question of genius was a perpetual concern of Ralph Waldo Emerson, who is the mind of America, as Walt Whitman is its poet, and Henry James its novelist (its dramatist is yet to come). For Emerson, genius was the God within, the self of "Self-Reliance." That self, in Emerson, therefore is not constituted by history, by society, by languages. It is aboriginal. I altogether agree.

 

Shakespeare, the supreme genius, is different in kind from his contemporaries, even from Christopher Marlowe and Ben Jonson. Cervantes stands apart from Lope de Vega and Calderón. Something in Shakespeare and Cervantes, as in Dante, Montaigne, Milton, and Proust (to give only a few instances), is clearly both of and above the age.

 

Fierce originality is one crucial component of literary genius, but this originality itself is always canonical, in that it recognizes and comes to terms with precursors. Even Shakespeare makes an implicit covenant with Chaucer, his essential forerunner at inventing the human.

 

If genius is the God within, I need to seek it there, in the abyss of the aboriginal self, an entity unknown to nearly all our current Explainers, in the intellectually forlorn universities and in the media’s dark Satanic mills. Emerson and ancient Gnosticism agree that what is best and oldest in each of us is no part of the Creation, no part of Nature or the Not-Me. Each of us presumably can locate what is best in herself or himself, but how do we find what is oldest?

 

Where does the self begin? The Freudian answer is that the ego makes an investment in itself, which thus centers a self. Shakespeare calls our sense of identity the "selfsame"; when did Jack Falstaff become Falstaff? When did Shakespeare become Shakespeare? The Comedy of Errors is already a work of genius, yet who could have prophesied Twelfth Night on the basis of that early farce? Our recognition of genius is always retroactive, but how does genius first recognize itself?

 

The ancient answer is that there is a god within us, and the god speaks. I think that a materialist definition of genius is impossible, which is why the idea of genius is so discredited in an age like our own, where materialist ideologies dominate. Genius, by necessity, invokes the transcendental and the extraordinary, because it is fully conscious of them. Consciousness is what defines genius: Shakespeare, like his Hamlet, exceeds us in consciousness, goes beyond the highest order of consciousness that we are capable of knowing without him.

 

Gnosticism, by definition, is a knowing rather than a believing. In Shakespeare, we have neither a knower nor a believer, but a consciousness so capacious that we cannot find its rival elsewhere: in Cervantes or Montaigne, in Freud or in Wittgenstein. Those who choose (or are chosen) by one of the world religions frequently posit a cosmic consciousness to which they assign supernatural origins. But Shakespearean consciousness, which transmutes matter into imagination, does not need to violate nature. Shakespeare’s art is itself nature, and his consciousness can seem more the product of his art than its producer.

 

There, at the end of the mind, we are stationed by Shakespearean genius: a consciousness shaped by all the consciousnesses that he imagined. He remains, presumably forever, our largest instance of the use of literature for life, which is the work of augmenting awareness.

 

Though Shakespeare’s is the largest consciousness studied in this book, all the rest of these exemplary creative minds have contributed to the consciousness of their readers and auditors. The question we need to put to any writer must be: does she or he augment our consciousness, and how is it done? I find this a rough but effectual test: however I have been entertained, has my awareness been intensified, my consciousness widened and clarified? If not, then I have encountered talent, not genius. What is best and oldest in myself has not been activated.

 

—from Harold Bloom, Genius: A Mosaic of One Hundred Exemplary Creative Minds

“even the most spiritual of autobiographies is necessarily a song of the self”: harold bloom

harold bloom on gnosticism, poetry, knowing the self and our contemporary religion:


SELF-RELIANCE OR

MERE GNOSTICISM

 

I am to invite men drenched in Time to recover themselves and come out of time, and taste their native immortal air.

 

—RALPH WALDO EMERSON

 

If you seek yourself outside yourself, then you will encounter disaster, whether erotic or ideological. That must be why Ralph Waldo Emerson, in his central essay, “Self-Reliance” (1841), remarked that “Traveling is a fool’s paradise.” I am sixty-five, and it is past time to write my own version of “Self-Reliance.” Spiritual autobiography in our era, I thought until now, is best when it is implicit. But the moment comes when you know pretty much what you are going to know, and when you realize that more living and reading and brooding will not greatly alter the self. I am in my fortieth consecutive year of teaching at Yale, and my seventh at NYU, and for the last decade I have taught Shakespeare almost exclusively. Shakespeare, aside from all his other preternatural strengths, gives me the constant impression that he knows more than anyone else ever has known. Most scholars would call that impression an illusion, but to me it seems the pragmatic truth. Knowing myself, knowing Shakespeare, and knowing God are three separate but closely related quests.

 

Why bring God into it?

 

Seeking God outside the self courts the disasters of dogma, institutional corruption, historical malfeasance, and cruelty. For at least two centuries now most Americans have sought the God within rather than the God of European Christianity. But why bring Shakespeare into all this, since to me he seems the archetype of the secular writer?

 

You know the self primarily by knowing yourself; knowing another human being is immensely difficult, perhaps impossible, though in our youth or even our middle years we deceive ourselves about this. Yet this is why we read and listen to Shakespeare: in order to encounter other selves; no other writer can do that for us. We never encounter Shakespeare himself, as we can encounter Dante or Tolstoy in their work. Whether you can encounter God himself or herself depends upon yourself; we differ greatly from one another in that vital regard. But to return to the self: we can know it primarily through our own solitude, or we can know representatives of it, most vividly in Shakespeare, or we can know God in it, but only when indeed it is our own self. Perhaps the greatest mystics, poets, and lovers have been able to know God in another self, but I am skeptical as to whether that possibility still holds at this late time, with the Millennium rushing upon us.

 

Even the most spiritual of autobiographies is necessarily a song of the self. At sixty-five, I find myself uncertain just when my self was born. I cannot locate it in my earliest memories of childhood, and yet I recall its presence in certain memories of reading, particularly of the poets William Blake and Hart Crane, when I was about nine or ten. In my instance at least, the self came to its belated birth (or second birth) by reading visionary poetry, a reading that implicitly was an act of knowing something previously unknown within me. Only later could that self-revelation become explicit; Blake and Hart Crane, like some other great poets, have the power to awaken their readers to an implicit answering power, to a previously unfelt sense of possibilities for the self. You can call it a sense of “possible sublimity,” of “something evermore about to be,” as the poet William Wordsworth named it. Emerson, advocating self-trust, asked: “What is the aboriginal Self, on which a universal reliance may be grounded?” His answer was a primal power, or “deep force,” that we discover within ourselves. In the eloquence of certain sermons, Emerson found his deep force; for me it came out of exalted passages in Blake and Crane that haunt me still:

 

God appears & God is Light

To those poor Souls who dwell in Night,

But does a Human Form Display

To those who Dwell in Realms of Day.

 

– WILLIAM BLAKE,

“Auguries of Innocence”

 

And so it was I entered the broken world

To trace the visionary company of love,

its voice

An instant in the wind (I know not whither

hurled)

But not for long to hold each desperate choice.

 

– HART CRANE,

“The Broken Tower”

 

These days, in our America, so many go about proclaiming “empowerment,” by which actually they mean “resentment,” or “catering to resentment.” To be empowered by eloquence and vision is what Emerson meant by self-reliance, and is the start of what I mean by “mere Gnosticism,” where “mere” takes its original meaning of “pure” or “unmixed.” To fall in love with great poetry when you are young is to be awakened to the self’s potential, in a way that has little to do, initially, with overt knowing. The self’s potential as power involves the self’s immortality, not as duration but as the awakening to a knowledge of something in the self that cannot die, because it was never born. It is a curious sensation when a young person realizes that she or he is not altogether the child of that person’s natural parents. Freud reduced such a sensation to “the changeling fantasy,” in which you imagine you are a faery child, plucked away by adoptive parents who then masquerade as a natural mother and father. But is it only a fantasy to locate, in the self, a magical or occult element, older than any other component of the self? Deep reading in childhood was once the norm for many among us; visual and auditory overstimulation now makes such reading very rare, and I suspect that changeling fantasies are vanishing together with the experience of early, authentic reading. At more than half a century away from the deep force of first reading and loving poetry, I no longer remember precisely what I then felt, and yet can recall how it felt. It was an elevation, a mounting high on no intoxicants except incantatory language, but of a rather different sort than contemporary hip-hop. The language of Blake and Hart Crane, of Marlowe and Shakespeare and Milton, transcended its rush of glory, its high, excited verbal music, and gave the pleasures of excited thought, of a thinking that changed one’s outer nature, while opening up an inner identity, a self within the self, previously unknown…

 

We live now, more than ever, in an America where a great many people are Gnostics without knowing it, which is a peculiar irony…  

 

I recall that the ancient Gnostics denied both matter and energy, and opted instead for information above all else. Gnostic information has two primary awarenesses: first, the estrangement, even the alienation of God, who has abandoned this cosmos, and second, the location of a residuum of divinity in the Gnostic’s own inmost self. That deepest self is no part of nature, or of history: it is devoid of matter or energy, and so is not part of the Creation-Fall, which for a Gnostic constitutes one and the same event. . .

 

Our current angel worship in America is another debased parody of Gnosticism…

 

Gnosticism… in my judgment rises as a protest against apocalyptic faith, even when it rises within such a faith, as it did successively within Judaism, Christianity, and Islam. Prophetic religion becomes apocalyptic when prophecy fails, and apocalyptic religion becomes Gnosticism when apocalypse fails, as fortunately it always has and, as we must hope, will fail again. Gnosticism does not fail; it cannot fail, because its God is at once deep within the self and also estranged, infinitely far off, beyond our cosmos. Historically, Gnosticism has always been obliterated by persecution, ranging from the relatively benign rejections of normative Judaism through the horrible violence of Roman Catholicism against the Christian Gnostics throughout the ages, wherever and whenever the Church has been near allied to repressive secular authorities. The final organized Western Gnosticism was destroyed in the so-called Albigensian Crusades, which devastated southern France in the thirteenth century, exterminating not only the Cathar Gnostic heretics but also the Provençal language and its troubadour culture, which has survived only in the prevalent Western myth and ideal of romantic love. It is yet another irony that our erotic lives, with their self-destructive reliance upon the psychic disease called “falling–or being–in love,” should be a final, unknowing heritage of the last organized Gnosticism to date…

 

Our rampantly flourishing industries of angel worship, “near-death experiences,” and astrology–dream divination networks–are the mass versions of an adulterated or travestied Gnosticism. I sometimes allow myself the fantasy of Saint Paul redescending upon a contemporary America where he still commands extraordinary honor, among religions as diverse as Roman Catholicism and Southern Baptism. He would be bewildered, not by change, but by sameness, and would believe he was back at Corinth and Colossae, confronted again by Gnostic myths of the angels who made this world. If you read Saint Paul, you discover that he was no friend of the angels.

 

There is his cryptic remark in 1 Corinthians 11:10 that “a woman ought to have a veil on her head, because of the angels,” which I suspect goes back to the Book of Enoch’s accounts of angelic lust for earthly women. In the Letter to the Colossians, the distinction between angels and demons seems to be voided, and Christians are warned against “worship of angels,” an admonition that the churches, at the moment, seem afraid to restate. The “near-death experience” is another pre-Millennium phenomenon that travesties Gnosticism; every account we are given of this curious matter culminates in being “embraced by the light,” by a figure of light known to Gnostic tradition variously as “the astral body,” “the Resurrection Body,” or Hermes, our guide in the land of the dead. Since all of life is, in a sense, a “near-death experience,” it does seem rather odd that actual cases of what appear to be maldiagnoses should become supposed intimations of immortality. The commercialization of angelology and of out-of-the-body shenanigans properly joins the age-old history of mercantilized astrology and dream divination.

 

As mass-audience omens of Millennium, all of these represent what may be the final debasement of a populist American Gnosticism. I am prompted by this to go back to the great texts of a purer Gnosticism and their best commentators.

 

The anarchistic Brethren of the Free Spirit in the fifteenth century, like the Provençal Cathars in the twelfth, join the Manichaeans as the three large instances of Gnostic movements that transcended an esoteric religion of the intellectuals. Ancient Gnosticism, like Romantic and modern varieties, was a religion of the elite only, almost a literary religion. A purified Gnosticism, then and now, is truly for a relative handful only, and perhaps is as much an aesthetic as it is a spiritual discipline.

 

—from Harold Bloom, Omens of Millennium: The Gnosis of Angels, Dreams and Resurrection (1996), pp 13 – 33

james graham ballard, 15 november 1930 – 19 april 2009, RIP

the audacity of j.g. ballard

he took cues and inspiration from william s. burroughs, 1950s sci-fi pulps, joseph conrad, sigmund freud (and his grandson lucian freud), the surrealist painters and poets, medical journals… and created a body of fiction that once seemed outlandish and now seems uncannily—and unfortunately—prophetic.


WHY I WANT TO FUCK RONALD REAGAN

RONALD REAGAN AND THE CONCEPTUAL AUTO DISASTER. Numerous studies have been conducted upon patients in terminal paresis (GPI), placing Reagan in a series of simulated auto crashes, e.g. multiple pileups, head-on collisions, motorcade attacks (fantasies of Presidential assassinations remained a continuing preoccupation, subject showing a marked polymorphic fixation on windshields and rear trunk assemblies). Powerful erotic fantasies of an anal-sadistic character surrounded the image of the Presidential contender.

Subjects were required to construct the optimum auto disaster victim by placing a replica of Reagan’s head on the unretouched photographs of crash fatalities.

In 82% of cases massive rear-end collisions were selected with a preference for expressed fecal matter and rectal hemorrhages. Further tests were conducted to define the optimum model-year. These indicate that a three-year model lapse with child victims provides the maximum audience excitation (confirmed by manufacturers’ studies of the optimum auto disaster). It is hoped to construct a rectal modulus of Reagan and the auto disaster of maximized audience arousal.

Motion picture studies of Ronald Reagan reveal characteristic patterns of facial tones and musculature associated with homoerotic behavior. The continuing tension of buccal sphincters and the recessive tongue role tally with earlier studies of facial rigidity (cf., Adolf Hitler, Nixon). Slow-motion cine films of campaign speeches exercised a marked erotic effect upon an audience of spastic children. Even with mature adults the verbal material was found to have a minimal effect, as demonstrated by substitution of an edited tape giving diametrically opposed opinions…

INCIDENCE OF ORGASMS IN FANTASIES OF SEXUAL INTERCOURSE WITH RONALD REAGAN. Patients were provided with assembly kit photographs of sexual partners during intercourse. In each case Reagan’s face was superimposed upon the original partner. Vaginal intercourse with "Reagan" proved uniformly disappointing, producing orgasm in 2% of subjects.

Axillary, buccal, navel, aural, and orbital modes produced proximal erections. The preferred mode of entry overwhelmingly proved to be the rectal. After a preliminary course in anatomy it was found that the caecum and transverse colon also provided excellent sites for excitation. In an extreme 12% of cases, the simulated anus of post-colostomy surgery generated spontaneous orgasm in 98% of penetrations. Multiple-track cine-films were constructed of "Reagan" in intercourse during (a) campaign speeches, (b) rear-end auto collisions with one and three year model changes, (c) with rear exhaust assemblies…

SEXUAL FANTASIES IN CONNECTION WITH RONALD REAGAN. The genitalia of the Presidential contender exercised a continuing fascination. A series of imaginary genitalia were constructed using (a) the mouth parts of Jacqueline Kennedy, (b) a Cadillac, (c) the assembly kit prepuce of President Johnson… In 89% of cases, the constructed genitalia generated a high incidence of self-induced orgasm. Tests indicate the masturbatory nature of the Presidential contender’s posture. Dolls consisting of plastic models of Reagan’s alternate genitalia were found to have a disturbing effect on deprived children.

REAGAN’S HAIRSTYLE. Studies were conducted on the marked fascination exercised by the Presidential contender’s hairstyle. 65% of male subjects made positive connections between the hairstyle and their own pubic hair. A series of optimum hairstyles were constructed.

THE CONCEPTUAL ROLE OF REAGAN. Fragments of Reagan’s cinetized postures were used in the construction of model psychodramas in which the Reagan-figure played the role of husband, doctor, insurance salesman, marriage counselor, etc.

The failure of these roles to express any meaning reveals the nonfunctional character of Reagan. Reagan’s success therefore indicates society’s periodic need to re-conceptualize its political leaders. Reagan thus appears as a series of posture concepts, basic equations which reformulate the roles of aggression and anality. Reagan’s personality. The profound anality of the Presidential contender may be expected to dominate the United States in the coming years. By contrast the late JFK remained the prototype of the oral subject, usually conceived in pre-pubertal terms. In further studies sadistic psychopaths were given the task of devising sex fantasies involving Reagan. Results confirm the probability of Presidential figures being perceived primarily in genital terms; the face of LB Johnson is clearly genital in significant appearance–the nasal prepuce, scrotal jaw, etc. Faces were seen as either circumcised (JFK, Khrushchev) or uncircumcised (LBJ, Adenauer). In assembly-kit tests Reagan’s face was uniformly perceived as a penile erection. Patients were encouraged to devise the optimum sex-death of Ronald Reagan.

================================================
“WHY I WANT TO FUCK RONALD REAGAN” [1967] by J.G. Ballard [excerpt from The Atrocity Exhibition]

[At the 1980 Republican Convention in San Francisco a copy of the Reagan text, minus its title and the running sideheads, and furnished with the seal of the Republican Party, was distributed by some puckish pro-situationists to the RNC delegates. It was accepted for what it resembled: a psychological position paper on the candidate’s subliminal appeal, commissioned by some maverick think-tank.]

================================================

Annotation & Commentary by the author, J.G. Ballard, to "Why I Want to Fuck Ronald Reagan", published in The Atrocity Exhibition, 1990:

"Why I Want to Fuck Ronald Reagan " prompted Doubleday in 1970 to pulp its first American edition of The Atrocity Exhibition. Ronald Reagan’s presidency remained a complete mystery to most Europeans, though I noticed that Americans took him far more easily in their stride. But the amiable old duffer who occupied the White House was a very different person from the often sinister figure I described in 1967, when the present piece was first published. The then-novelty of a Hollywood film star entering politics and becoming governor of California gave Reagan considerable air time on British TV. Watching his right-wing speeches, in which he castigated in sneering tones the profligate, welfare-spending, bureaucrat-infested state government, I saw a more crude and ambitious figure, far closer to the brutal crime boss he played in the 1964 movie, The Killers, his last Hollywood role. In his commercials Reagan used the smooth, teleprompter-perfect tones of the TV auto-salesman to project a political message that was absolutely the reverse of bland and reassuring. A complete discontinuity existed between Reagan’s manner and body language, on the one hand, and his scarily simplistic far-right message on the other. Above all, it struck me that Reagan was the first politician to exploit the fact that his TV audience would not be listening too closely, if at all, to what he was saying, and indeed might well assume from his manner and presentation that he was saying the exact opposite of the words actually emerging from his mouth. Though the man himself mellowed, his later presidency seems to have run the same formula."