primo levi on (anti)-happiness

 

happiness studies, circa 1943

 

Sooner or later in life everyone discovers that perfect happiness is unrealizable, but there are few who pause to consider the antithesis: that perfect unhappiness is equally unattainable. The obstacles preventing the realization of both these extreme states are of the same nature: they derive from our human condition which is opposed to everything infinite. Our ever-insufficient knowledge of the future opposes it: and this is called, in the one instance, hope, and in the other, uncertainty of the following day. The certainty of death opposes it: for it places a limit on every joy, but also on every grief. The inevitable material cares oppose it; for as they poison every lasting happiness, they equally assiduously distract us from our misfortunes and make our consciousness of them intermittent and hence supportable.

 

—from Primo Levi, Survival in Auschwitz

 

 

 

thoughts on the self from simone weil

from simone weil’s essay “human personality,” written in the last year of her life (she was just thirty-four), apparently intended as a summation of her deepest views:

 



At the bottom of the heart of every human being, from earliest infancy until the tomb, there is something that goes on indomitably expecting, in the teeth of all experience of crimes committed, suffered, and witnessed, that good and not evil will be done to him. It is this above all that is sacred in every human being. . . .

 

This profound and childlike and unchanging expectation of good in the heart is not what is involved when we agitate for our rights. The motive which prompts a little boy to watch jealously to see if his brother has a slightly larger piece of cake arises from a much more superficial level of the soul. The word justice means two very different things according to whether it refers to the one or the other level. It is only the former one that matters.

                

Every time that there arises from the depths of a human heart the childish cry which Christ himself could not restrain, “Why am I being hurt?”, then there is certainly injustice. For if, as often happens, it is only the result of a misunderstanding, then the injustice consists in the inadequacy of the explanation.

 

Those people who inflict the blows which provoke this cry are prompted by different motives according to temperament or occasion. There are some people who get a positive pleasure from the cry; and many others simply do not hear it. For it is a silent cry, which sounds only in the secret heart. . . . In those who have suffered too many blows, in slaves for example, that place in the heart from which the infliction of evil evokes a cry of surprise may seem to be dead. But it is never quite dead; it is simply unable to cry out any more. It has sunk into a state of dumb and ceaseless lamentation.

 

And even in those who still have the power to cry out, the cry hardly ever expresses itself, either inwardly or outwardly, in coherent language. Usually, the words through which it seeks expression are quite irrelevant. That is all the more inevitable because those who most often have occasion to feel that evil is being done to them are those who are least trained in the art of speech. Nothing, for example, is more frightful than to see some poor wretch in the police court stammering before a magistrate who keeps up an elegant flow of witticisms.

 

Apart from the intelligence, the only human faculty which has an interest in the public freedom of expression is that point in the heart which cries out against evil. But as it cannot express itself, freedom is of little use to it. What is first needed is a system of public education capable of providing it, so far as possible, with means of expression; and next, a régime in which the public freedom of expression is characterized not so much by freedom as by an attentive silence in which this faint and inept cry can make itself heard; and finally, institutions are needed of a sort which will, so far as possible, put power into the hands of men who are able and anxious to hear and understand it.

 

Clearly, a political party busily seeking, or maintaining itself in, power can discern nothing in these cries except a noise. Its reaction will be different according to whether the noise interferes with or contributes to that of its own propaganda. But it can never be capable of the tender and sensitive attention which is needed to understand its meaning.

 

The same is true to a lesser degree of organizations contaminated by party influences; in other words, when public life is dominated by a party system, it is true of all organizations, including, for example, trade unions and even churches.

 

Naturally, too, parties and similar organizations are equally insensitive to intellectual scruples. So when freedom of expression means in fact no more than freedom of propaganda for organizations of this kind, there is in fact no free expression for the only parts of the human soul that deserve it. Or if there is any, it is infinitesimal; hardly more than in a totalitarian system.

 

—from Simone Weil, “Human Personality,” in The Simone Weil Reader, ed. George A. Panichas (Mt. Kisco, N.Y.: Moyer Bell, 1977), 315–17.

harold bloom on literary genius and the self

 


 



 

What Is Genius?

Harold Bloom

 

In employing a Kabbalistic grid or paradigm in the arrangement of this book, I rely upon Gershom Scholem’s conviction that Kabbalah is the genius of religion in the Jewish tradition. My one hundred figures, from Shakespeare through the late Ralph Ellison, represent perhaps a hundred different stances towards spirituality, covering the full range from Saint Paul and Saint Augustine to the secularism of Proust and Calvino. But Kabbalah, in my view, provides an anatomy of genius, both of women and of men; as also of their merging in Ein Sof, the endlessness of God. Here I want to use Kabbalah as a starting-point in my own personal vision of the name and nature of genius.

 

Scholem remarked that the work of Franz Kafka constituted a secular Kabbalah, and so he concluded that Kafka’s writings possess "something of the strong light of the canonical, of that perfection which destroys." Against this, Moshe Idel has argued that the canonical, both scriptural and Kabbalistic, is "the perfection which absorbs." To confront the plenitude of Bible, Talmud, and Kabbalah is to work at "absorbing perfections."

 

What Idel calls "the absorbing quality of the Torah" is akin to the absorbing quality of all authentic genius, which always has the capacity to absorb us. In American English, to "absorb" means several related processes: to take something in as through the pores, or to engross one’s full interest or attention, or to assimilate fully.

I am aware that I transfer to genius what Scholem and Idel follow Kabbalah in attributing to God, but I merely extend the ancient Roman tradition that first established the ideas of genius and of authority. In Plutarch, Mark Antony’s genius is the god Bacchus or Dionysus. Shakespeare, in his Antony and Cleopatra, has the god Hercules, as Antony’s genius, abandon him. The emperor Augustus, who defeated Antony, proclaimed that the god Apollo was his genius, according to Suetonius. The cult of the emperor’s genius thus became Roman ritual, displacing the two earlier meanings, of the family’s fathering force and of each individual’s alter ego.

 

Authority, another crucial Roman concept, may be more relevant for the study of genius than "genius," with its contradictory meanings, still can hope to be. Authority, which has vanished from Western culture, was convincingly traced by Hannah Arendt to Roman rather than Greek or Hebrew origins. In ancient Rome, the concept of authority was foundational. Auctoritas derived from the verb augere, "to augment," and authority always depended upon augmenting the foundation, thus carrying the past alive into the present.

 

Homer fought a concealed contest with the poetry of the past, and I suspect that the Redactor of the Hebrew Bible, putting together his Genesis through Kings structure in Babylon, struggled to truncate the earliest author that he wove into the text, in order to hold off the strangeness and uncanny power of the Yahwist or J writer. The Yahwist could not be excluded, because his (or her) stories possessed authority, but the disconcerting Yahweh, human-all-too-human, could be muted by other voices of the divine. What is the relationship of fresh genius to a founded authority? At this time, starting the twenty-first century, I would say: "Why, none, none at all." Our confusions about canonical standards for genius are now institutionalized confusions, so that all judgments as to the distinction between talent and genius are at the mercy of the media, and obey cultural politics and its vagaries.

 

Since my book, by presenting a mosaic of a hundred authentic geniuses, attempts to provide criteria for judgment, I will venture here upon a purely personal definition of genius, one that hopes to be useful for the early years of this new century. Whether charisma necessarily attends genius seems to me problematic. Of my hundred figures in this book, I had met three—Iris Murdoch, Octavio Paz, Ralph Ellison—who died relatively recently. Farther back, I recall brief meetings with Robert Frost and Wallace Stevens. All of them impressive, in different ways, they lacked the flamboyance and authority of Gershom Scholem, whose genius attended him palpably, despite his irony and high good humor.

 

William Hazlitt wrote an essay on persons one would wish to have known. I stare at my Kabbalistic table of contents, and wonder which I would choose. The critic Sainte-Beuve advised us to ask ourselves: what would this author I read have thought of me? My particular hero among these hundred is Dr. Samuel Johnson, the god of literary criticism, but I do not have the courage to face his judgment.

 

Genius asserts authority over me, when I recognize powers greater than my own. Emerson, the sage I attempt to follow, would disapprove of my pragmatic surrender, but Emerson’s own genius was so large that he plausibly could preach Self-Reliance. I myself have taught continuously for forty-six years, and wish I could urge an Emersonian self-reliance upon my students, but I can’t and don’t, for the most part. I hope to nurture genius in them, but can impart only a genius for appreciation. That is the prime purpose of this book: to activate the genius of appreciation in my readers, if I can.

 

These pages are written a week after the September 11, 2001, terrorist triumph in destroying the World Trade Center and the people trapped within it. During the last week I have taught scheduled classes on Wallace Stevens and Elizabeth Bishop, on Shakespeare’s early comedies, and on the Odyssey. I cannot know whether I helped my students at all, but I momentarily held off my own trauma, by freshly appreciating genius.

 

What is it that I, and many others, appreciate in genius? An entry in Emerson’s Journals (October 27, 1831) always hovers in my memory:

 

Is it not all in us, how strangely! Look at this congregation of men;—the words might be spoken,—though now there be none here to speak them,—but the words might be said that would make them stagger and reel like a drunken man. Who doubts it? Were you ever instructed by a wise and eloquent man? Remember then, were not the words that made your blood run cold, that brought the blood to your cheeks, that made you tremble or delighted you,—did they not sound to you as old as yourself? Was it not truth that you knew before, or do you ever expect to be moved from the pulpit or from man by anything but plain truth? Never. It is God in you that responds to God without, or affirms his own words trembling on the lips of another.

It still burns into me: "did they not sound to you as old as yourself?" The ancient critic Longinus called literary genius the Sublime, and saw its operation as a transfer of power from author to reader:

 

Touched by the true sublime your soul is naturally lifted up, she rises to a proud height, is filled with joy and vaunting, as if she had herself created this thing that she has heard.

 

Literary genius, difficult to define, depends upon deep reading for its verification. The reader learns to identify with what she or he feels is a greatness that can be joined to the self, without violating the self’s integrity. "Greatness" may be out of fashion, as is the transcendental, but it is hard to go on living without some hope of encountering the extraordinary.

 

Meeting the extraordinary in another person is likely to be deceptive or delusionary. We call it "falling in love," and the verb is a warning. To confront the extraordinary in a book—be it the Bible, Plato, Shakespeare, Dante, Proust—is to benefit almost without cost. Genius, in its writings, is our best path for reaching wisdom, which I believe to be the true use of literature for life.

 

James Joyce, when asked, “Which one book on a desert island?”, replied, “I would like to answer Dante, but I would have to take the Englishman, because he is richer.” The Joycean Irish edge against the English is given adequate expression, but the choice of Shakespeare is just, which is why he leads off the hundred figures in this book. Though there are a few literary geniuses who approach Shakespeare—the Yahwist, Homer, Plato, Dante, Chaucer, Cervantes, Molière, Goethe, Tolstoy, Dickens, Proust, Joyce—even those dozen masters of representation do not match Shakespeare’s miraculous rendering of reality. Because of Shakespeare we see what otherwise we could not see, since we are made different. Dante, the nearest rival, persuades us of the terrible reality of his Inferno and his Purgatorio, and almost induces us to accept his Paradiso. Yet even the fullest of the Divine Comedy’s persons, Dante the Poet-Pilgrim, does not cross over from the Comedy’s pages into the world we inhabit, as do Falstaff, Hamlet, Iago, Macbeth, Lear, Cleopatra.

 

The invasion of our reality by Shakespeare’s prime personages is evidence for the vitality of literary characters, when created by genius. We all know the empty sensation we experience when we read popular fiction and find that there are only names upon the page, but no persons. In time, however overpraised, such fictions become period pieces, and finally rub down into rubbish. It is worth knowing that our word "character" still possesses, as a primary meaning, a graphic sign such as a letter of the alphabet, reflecting the word’s likely origin in the ancient Greek character, a sharp stylus or the mark of the stylus’s incisions. Our modern word "character" also means ethos, a habitual stance towards life.

 

It was fashionable, quite recently, to talk about “the death of the author,” but this too has become rubbish. The dead genius is more alive than we are, just as Falstaff and Hamlet are considerably livelier than many people I know. Vitality is the measure of literary genius. We read in search of more life, and only genius can make that available to us.

 

What makes genius possible? There always is a Spirit of the Age, and we like to delude ourselves that what matters most about any memorable figure is what he or she shared with a particular era. In this delusion, which is both academic and popular, everyone is regarded as being determined by societal factors. Individual imagination yields to social anthropology or to mass psychology, and thus can be explained away.

 

I base this book, Genius, upon my belief that appreciation is a better mode for the understanding of achievement than are all the analytical kinds of accounting for the emergence of exceptional individuals. Appreciation may judge, but always with gratitude, and frequently with awe and wonder.

 

By "appreciation" I mean something more than "adequate esteem." Need also enters into it, in the particular sense of turning to the genius of others in order to redress a lack in oneself, or finding in genius a stimulus to one’s own powers, whatever these may emerge as being.

 

Appreciation may modulate into love, even as your consciousness of a dead genius augments consciousness itself. Your solitary self’s deepest desire is for survival, whether in the here and now, or transcendentally elsewhere. To be augmented by the genius of others is to enhance the possibilities of survival, at least in the present and the near future.

 

We do not know how and/or why genius is possible, only that—to our massive enrichment—it has existed, and perhaps (waningly) continues to appear. Though our academic institutions abound in impostors who proclaim that genius is a capitalistic myth, I am content to cite Leon Trotsky, who urged Communist writers to read and study Dante. If genius is a mystery of the capacious consciousness, what is least mysterious about it is an intimate connection with personality rather than with character. Dante’s personality is forbidding, Shakespeare’s elusive, while Jesus’ (like the fictive Hamlet’s) seems to reveal itself differently to every reader or auditor.

What is personality? Alas, we use it now as a popular synonym for celebrity, but I would argue that we cannot give the word up to the realm of buzz. When we know enough about the biography of a particular genius, then we understand what is meant by the personality of Goethe or Byron or Freud or Oscar Wilde. Conversely, when we lack biographical inwardness, then we all agree that we are uncertain as to Shakespeare’s personality, an enormous paradox since his plays may have invented personality as we now most readily comprehend it. If challenged, I could write a book on the personality of Hamlet, Falstaff, or Cleopatra, but I would not attempt a book upon the personality of Shakespeare or of Jesus.

 

Benjamin Disraeli’s father, the man of letters Isaac D’Israeli, wrote an amiable volume called The Literary Character of Men of Genius, one of the precursors to this book, Genius, together with Plutarch’s Parallel Lives, Emerson’s Representative Men, and Carlyle’s On Heroes and Hero-Worship. Isaac D’Israeli remarks that "many men of genius must arise before a particular man of genius can appear." Every genius has forerunners, though far enough back in time we may not know who they are. Dr. Johnson considered Homer to have been the first and most original of poets; we tend to see Homer as a relative latecomer, enriching himself with the phrases and formulas of his predecessors. Emerson, in his essay "Quotation and Originality," slyly observed, "Only an inventor knows how to borrow."

 

The great inventions of genius influence that genius itself in ways we are slow to appreciate. We speak of the man or woman in the work; we might better speak of the work in the person. And yet we scarcely know how to discuss the influence of a work upon its author, or of a mind upon itself. I take that to be the principal enterprise of this book. With all of the figures I depict in this mosaic, my emphasis will be on the contest they conducted with themselves.

 

That agon with the self can mask itself as something else, including the inspiration of idealized forerunners: Plato’s Socrates, Confucius’s Duke of Chou, the Buddha’s earlier incarnations. Particularly the inventor of the Hebrew Bible as we know it, the Redactor of the sequence from Genesis through Kings, relies upon his own genius at reimagining the Covenant even as he honors the virtues (and failings) of the fathers. And yet, as Donald Harman Akenson argues, the inventor-redactor or writer-editor achieved a “surpassing wonder,” utterly his own. This exile in Babylon could not have thought that he was creating Scripture; as the first historian he perhaps believed only that he was forwarding the lost cause of the Kingdom of Judah. And yet he seems too cunning not to have seen that his invention of a continuity and so of a tradition was largely his own.

 

With the Redactor, as with Confucius or with Plato, we can sense an anxiety in the work that must have communicated itself to the man. How can one be worthy of the fathers with whom Yahweh spoke, face-to-face, or of the great Duke of Chou, who gave order to the people without imposing it upon them by violence? Is it possible to be the authentic disciple of Socrates, who suffered martyrdom without complaint, in order to affirm his truth? The ultimate anxiety of influence always may be, not that one’s proper space has been usurped already, but that greatness may be unable to renew itself, that one’s inspiration may be larger than one’s own powers of realization.

 

Genius is no longer a term much favored by scholars, so many of whom have become cultural levelers quite immune from awe. Yet, with the public, the idea of genius maintains its prestige, even though the word itself can seem somewhat tarnished. We need genius, however envious or uncomfortable it makes many among us. It is not necessary that we aspire after genius for ourselves, and yet, in our recesses, we remember that we had, or have, a genius. Our desire for the transcendental and extraordinary seems part of our common heritage, and abandons us slowly, and never completely.

 

To say that the work is in the writer, or the religious idea is in the charismatic leader, is not a paradox. Shakespeare, we happen to know, was a usurer. So was Shylock, but did that help to keep The Merchant of Venice a comedy? We don’t know. But to look for the work in the writer is to look for the influence and effect of the play upon Shakespeare’s development from comedy to tragicomedy to tragedy. It is to see Shylock darkening Shakespeare. To examine the effects of his own parables upon the figure of Jesus is to conduct a parallel exploration.

 

There are two ancient (Roman) meanings of the word "genius," which are rather different in emphasis. One is to beget, cause to be born, that is to be a paterfamilias. The other is to be an attendant spirit for each person or place: to be either a good or evil genius, and so to be someone who, for better or for worse, strongly influences someone else. This second meaning has been more important than the first; our genius is thus our inclination or natural gift, our inborn intellectual or imaginative power, not our power to beget power in others.

 

We all learn to distinguish, firmly and definitively, between genius and talent. A "talent" classically was a weight or sum of money, and as such, however large, was necessarily limited. But "genius," even in its linguistic origins, has no limits.


We tend now to regard genius as the creative capacity, as opposed to talent. The Victorian historian Froude observed that genius "is a spring in which there is always more behind than flows from it." The largest instances of genius that we know, aesthetically, would include Shakespeare and Dante, Bach and Mozart, Michelangelo and Rembrandt, Donatello and Rodin, Alberti and Brunelleschi. A greater complexity ensues when we attempt to confront religious genius, particularly in a religion-obsessed country like the United States. To regard Jesus and Muhammad as religious geniuses (whatever else they were) makes them, in that regard only, akin not only to one another but to Zoroaster and the Buddha, and to such secular figures of ethical genius as Confucius and Socrates.

 

Defining genius more precisely than has yet been done is one of my objectives in this book. Another is to defend the idea of genius, currently abused by detractors and reductionists, from sociobiologists through the materialists of the genome school, and on to various historicizers. But my primary aim is both to enhance our appreciation of genius, and to show how invariably it is engendered by the stimulus of prior genius, to a much greater degree than it is by cultural and political contexts. The influence of genius upon itself, already mentioned, will be one of the book’s major emphases.

 

My subject is universal, not so much because world-altering geniuses have existed, and will come again, but because genius, however repressed, exists in so many readers. Emerson thought that all Americans were potential poets and mystics. Genius does not teach how to read or whom to read, but rather how to think about exemplary human lives at their most creative.

 

It will be noted in the table of contents that I have excluded any living instances of genius, and have dealt with only three recently dead. In this book I am compelled to be brief and summary in my account of individual genius, because I believe that much is to be learned by juxtaposing many figures from varied cultures and contrasting eras. The differences between a hundred men and women, drawn from a span of twenty-five centuries, overwhelm the analogies or similarities, and to present them within a single volume may seem the enterprise of an overreacher. And yet there are common characteristics to genius, since vivid individuality of speculation, spirituality, and creativity must rely upon originality, audacity, and self-reliance.

Emerson, in his Representative Men, begins with a heartening paragraph:

 

It is natural to believe in great men. If the companions of our childhood should turn out to be heroes, and their condition regal, it will not surprise us. All mythology opens with demigods, and the circumstance is high and poetic; that is, their genius is paramount. In the legends of Gautama, the first men ate the earth, and found it deliciously sweet.

 

Gautama, the Buddha, quests for and attains freedom, as though he were one of the first men. Emerson’s twice-told tale is a touch more American than Buddhist; his first men seem American Adams, and not reincarnations of previous enlightenments. Perhaps I too can only Americanize, but that may be the paramount use of past geniuses; we have to adapt them to our place and our time, if we are to be enlightened or inspired by them.

 

Emerson had six great or representative men: Plato, Swedenborg, Montaigne, Shakespeare, Napoleon, and Goethe. Four of these are in this book; Swedenborg is replaced by Blake, and Napoleon I have discarded with all other generals and politicians. Plato, Montaigne, Shakespeare, and Goethe remain essential, as do the others I sketch. Essential for what? To know ourselves, in relation to others, for these mighty dead are among the otherness that we can know, as Emerson tells us in Representative Men:

 

We need not fear excessive influence. A more generous trust is permitted. Serve the great.

 

And yet this is the conclusion of his book:

 

The world is young: the former great men call to us affectionately. We too must write Bibles, to unite again the heavens and the earthly world. The secret of genius is to suffer no fiction to exist for us; to realize all that we know.

 

To realize all that we know, fictions included, is too large an enterprise for us, a wounded century and a half after Emerson. The world no longer seems young, and I do not always hear the accents of affection when the voices of genius call out to me. But then I have the disadvantage, and the advantage, of coming after Emerson. The genius of influence transcends its constituent anxieties, provided we become aware of them and then surmise where we stand in relation to their continuing prevalence.

Thomas Carlyle, a Victorian Scottish genius now out of fashion, wrote an admirable study that almost nobody reads anymore, On Heroes, Hero-Worship and the Heroic in History. It contains the best remark on Shakespeare that I know:

 

If called to define Shakespeare’s faculty, I should say superiority of intellect, and think I had included all under that.

 

Adumbrating the observation, Carlyle characteristically exploded into a very useful warning against dividing any genius into its illusory components:

 

What indeed are faculties? We talk of faculties as if they were distinct, things separable; as if a man had intellect, imagination, fancy, etc. as he had hands, feet and arms.

 

"Power of Insight," Carlyle continued, was the vital force in any one of us. How do we recognize that insight or force in genius? We have the works of genius, and we have the memory of their personalities. I use that last word with high deliberation, following Walter Pater, another Victorian genius, but one who defies fashion, because he is akin to Emerson and to Nietzsche. These three subtle thinkers prophesied much of the intellectual future of our century that has just passed, and are unlikely to fade as influences during the new century. Pater’s preface to his major book, The Renaissance, emphasizes that the "aesthetic critic" ("aesthetic" meaning "perceptive") identifies genius in every era:

 

In all ages there have been some excellent workmen, and some excellent work done. The question he asks is always: In whom did the stir, the genius, the sentiment of the period find itself? Where was the receptacle of its refinement, its elevation, its taste?

 

"The ages are all equal," says William Blake, "but genius is always above its age." Blake, a visionary genius almost without peer, is a superb guide to the relative independence that genius manifests in regard to time: it "is always above its age."

 

We cannot confront the twenty-first century without expecting that it too will give us a Stravinsky or Louis Armstrong, a Picasso or Matisse, a Proust or James Joyce. To hope for a Dante or Shakespeare, a J. S. Bach or Mozart, a Michelangelo or Leonardo, is to ask for too much, since gifts that enormous are very rare. Yet we want and need what will rise above the twenty-first century, whatever that turns out to be.

 

The use of my mosaic is that it ought to help prepare us for this new century, by summoning up aspects of the personality and achievements of many of the most creative who have come before us. The ancient Roman made an offering to his genius on his birthday, dedicating that day to "the god of human nature," as the poet Horace called each person’s tutelary spirit. Our custom of a birthday cake is in direct descent from that offering. We light the candles and might do well to remember what it is that we are celebrating.

 

I have avoided all living geniuses in this book, partly so as to evade the distractions of mere provocation. I can identify for myself certain writers of palpable genius now among us: the Portuguese novelist José Saramago, the Canadian poet Anne Carson, the English poet Geoffrey Hill, and at least a half-dozen North and Latin American novelists and poets (whom I forbear naming).

 

Pondering my mosaic of one hundred exemplary creative minds, I arrive at a tentative and personal definition of literary genius. The question of genius was a perpetual concern of Ralph Waldo Emerson, who is the mind of America, as Walt Whitman is its poet, and Henry James its novelist (its dramatist is yet to come). For Emerson, genius was the God within, the self of "Self-Reliance." That self, in Emerson, therefore is not constituted by history, by society, by languages. It is aboriginal. I altogether agree.

 

Shakespeare, the supreme genius, is different in kind from his contemporaries, even from Christopher Marlowe and Ben Jonson. Cervantes stands apart from Lope de Vega and Calderón. Something in Shakespeare and Cervantes, as in Dante, Montaigne, Milton, and Proust (to give only a few instances), is clearly both of and above the age.

 

Fierce originality is one crucial component of literary genius, but this originality itself is always canonical, in that it recognizes and comes to terms with precursors. Even Shakespeare makes an implicit covenant with Chaucer, his essential forerunner at inventing the human.

 

If genius is the God within, I need to seek it there, in the abyss of the aboriginal self, an entity unknown to nearly all our current Explainers, in the intellectually forlorn universities and in the media’s dark Satanic mills. Emerson and ancient Gnosticism agree that what is best and oldest in each of us is no part of the Creation, no part of Nature or the Not-Me. Each of us presumably can locate what is best in herself or himself, but how do we find what is oldest?

 

Where does the self begin? The Freudian answer is that the ego makes an investment in itself, which thus centers a self. Shakespeare calls our sense of identity the "selfsame"; when did Jack Falstaff become Falstaff? When did Shakespeare become Shakespeare? The Comedy of Errors is already a work of genius, yet who could have prophesied Twelfth Night on the basis of that early farce? Our recognition of genius is always retroactive, but how does genius first recognize itself?

 

The ancient answer is that there is a god within us, and the god speaks. I think that a materialist definition of genius is impossible, which is why the idea of genius is so discredited in an age like our own, where materialist ideologies dominate. Genius, by necessity, invokes the transcendental and the extraordinary, because it is fully conscious of them. Consciousness is what defines genius: Shakespeare, like his Hamlet, exceeds us in consciousness, goes beyond the highest order of consciousness that we are capable of knowing without him.

 

Gnosticism, by definition, is a knowing rather than a believing. In Shakespeare, we have neither a knower nor a believer, but a consciousness so capacious that we cannot find its rival elsewhere: in Cervantes or Montaigne, in Freud or in Wittgenstein. Those who choose (or are chosen) by one of the world religions frequently posit a cosmic consciousness to which they assign supernatural origins. But Shakespearean consciousness, which transmutes matter into imagination, does not need to violate nature. Shakespeare’s art is itself nature, and his consciousness can seem more the product of his art than its producer.

 

There, at the end of the mind, we are stationed by Shakespearean genius: a consciousness shaped by all the consciousnesses that he imagined. He remains, presumably forever, our largest instance of the use of literature for life, which is the work of augmenting awareness.

 

Though Shakespeare’s is the largest consciousness studied in this book, all the rest of these exemplary creative minds have contributed to the consciousness of their readers and auditors. The question we need to put to any writer must be: does she or he augment our consciousness, and how is it done? I find this a rough but effectual test: however I have been entertained, has my awareness been intensified, my consciousness widened and clarified? If not, then I have encountered talent, not genius. What is best and oldest in myself has not been activated.

 

—from Harold Bloom, Genius: A Mosaic of One Hundred Exemplary Creative Minds

the rest of loren eiseley’s the invisible pyramid, chapter one

“When I lie in bed now and await the hastening of Halley’s comet, I would like to find my way back just once to that single, precise instant when the star dragon thrust out its tongue. Perhaps the story of all dragons since comes from that moment. Men have long memories when the memories are clothed in myth. But I drowse, and the train whistle mingles and howls with the heaven-sweeping light in my dream. It is 1910. I am going back once more.”

 

 

III

 

It may now appear that I have been wandering mentally amidst irrelevant and strange events—time glimpsed through a blowing curtain of dust, and, among fallen stones and badland pinnacles, bones denoting not just the erosion of ages but the mysterious transformation of living bodies.

 

Man after man in the immediately post-Darwinian days would stare into his mirror at the bony contours of a skull that held some grinning secret beyond the simple fact of death. Anatomists at the dissecting table would turn up odd vestigial muscles and organs. Our bodies held outdated machinery as strange as that to be found in the attics of old houses. Into these anatomical depths few would care to probe. But there were scholars who were not averse to delving among fossils, and the skulls they found or diagnosed would multiply. These would be recognized at last for what they were, the dropped masks of the beginning of Nature’s last great play—the play of man.

 

Strangely, it is a different play, though made partly of old ingredients. In three billion years of life upon the planet, this play had never been acted upon the great stage before. We come at a unique moment in geological history, and we ourselves are equally unique. We have brought with us out of the forest darkness a new unprophesiable world—a latent, lurking universe within our heads.

 

In the world of Charles Darwin, evolution was particulate; it contained and traced the history of fins, claws, wings, and teeth. The Darwinian circle was immersed in the study of the response of the individual organism to its environment, and the selective impact of the environment upon its creatures. By contrast, just as biological evolution had brought the magic of the endlessly new in organic form, so the evolving brain, through speech, had literally created a superorganic structure unimaginable until its emergence.

 

Alfred Russel Wallace, Darwin’s contemporary, perceived that with the emergence of the human brain, man had, to a previously inconceivable degree, passed out of the domain of the particulate evolution of biological organs and had entered upon what we may call history. Human beings, in whom the power of communication had arisen, were leaving the realm of phylogeny for the realm of history, which was to contain, henceforth, our final destiny. After three billion years of biological effort, man alone had seemingly evaded the oblique trap of biological specialization. He had done so by the development of a specialized organ—the brain—whose essential purpose was to evade specialization.

 

The tongue and the hand, so disproportionately exaggerated in his motor cortex, were to be its primary instruments. With these he would elude channelized instinct and channelized organic development. The creature who had dropped from some long-ago tree into the grass had managed to totter upright and free the grasping forelimb. Brain, hand, and tongue would henceforth evolve together. Fin, fur, and paw would vanish into the mists of the past. Henceforth it would be the brain that clothed and unclothed man. Fire would warm him, flint would strike for him, vessels would carry him over dangerous waters.

 

In the end, with the naked body of an awkward and hastily readjusted climber, he would plumb the seas’ depths and mount, with wings spun in his brain, the heights of air. Enormous computations upon the movements of far bodies in space would roll in seconds from his computers. His great machines would leap faster at his bidding than the slower speed of his own nerves.

 

Because of speech, drawn from an infinitesimal spark along a nerve end, the vague, ill-defined surroundings of the animal world would be transformed, named, and categorized. Mind would reach into a past before its becoming; the misty future experienced by dim animal instinct would leap into sudden, clear perspective. Language, whose constituents have come down the long traverse of millennia as rolled and pounded by circumstance as a flint ax churned in a river bed, leaves no direct traces of its dim beginnings. With the first hieroglyph, oral tradition would become history. Out of a spoken sound, man’s first and last source of inexhaustible power, would emerge the phantom world which the anthropologist prosaically calls culture. Its bridges, its towers, and its lightnings lie potential in a little globe of gray matter that can fade and blow away on any wind. The novelty of evolutionary progression through time has begotten another novelty, the novelty of history, the evolutionary flow of ideas in the heads of men.

 

The role of the brain is analogous in a distant way to the action of mutation in generating improbabilities in the organic realm. Moreover, the human brain appears to be a remarkably solitary product of this same organic process which, in actuality, it has transcended. In this sense life has produced a newly emergent instrument capable of transmitting a greatly speeded-up social heredity based not upon the gene but instead upon communication. In its present technological phase it has brought the ends of the world into conflict and at the same time is reaching outward into space.

 

About ourselves there always lingers a penumbral rainbow—what A. L. Kroeber termed the superorganic—that cloud of ideas, visions, institutions which hover about, indeed constitute human society, but which can be dissected from no single brain. This rainbow, which exists in all heads and dies with none, is the essential part of man. Through it he becomes what we call human, and not otherwise.

 

Man is not a creature to be contained in a solitary skull vault, nor is he measurable as, say, a saber-toothed cat or a bison is measurable. Something, the rainbow dancing before his eyes, the word uttered by the cave fire at evening, eludes us and runs onward. It is gone when we come with our spades upon the cold ashes of the campfire four hundred thousand years removed.

 

Paradoxically, the purpose of the human brain is to escape physical specialization by the projections of thought. There is no parallel organism with which to compare ourselves. The creature from which we arose has perished. On the direct hominid line there is no twilight world of living fossils which we can subject to examination. At best we are forced to make inferences from less closely related primates whose activities lie below the threshold of speech.

 

The nineteenth century, in the efforts of men like Hughlings Jackson, came to see the brain as an organ whose primary parts had been laid down successively in evolutionary time, a little like the fossil strata in the earth itself. The centers of conscious thought were the last superficial deposit on the surface of a more ancient and instinctive brain. As the roots of our phylogenetic tree pierce deep into earth’s past, so our human consciousness is similarly embedded in, and in part constructed of, pathways which were laid down before man in his present form existed. To acknowledge this fact is still to comprehend as little of the brain’s true secrets as an individual might understand of the dawning of his own consciousness from a single egg cell.

 

The long, slow turn of world-time as the geologist has known it, and the invisibly moving hour hand of evolution perceived only yesterday by the biologist, have given way in the human realm to a fantastically accelerated social evolution induced by industrial technology. So fast does this change progress that a growing child strives to master the institutional customs of a society which, compared with the pace of past history, compresses centuries of change into his lifetime. I myself, like others of my generation, was born in an age which has already perished. At my death I will look my last upon a nation which, save for some linguistic continuity, will seem increasingly alien and remote. It will be as though I peered upon my youth through misty centuries. I will not be merely old; I will be a genuine fossil embedded in onrushing man-made time before my actual death.

 

  

IV

 

"There never was a first man or a first primate," Dr. Glenn Jepsen of Princeton once remarked iconoclastically. The distinguished paleontologist then added that the "billions of genetic filaments in our ancestral phyletic cord are of many lengths, no two precisely the same. We have not had our oversized brain very long but the pentadactyl pattern of our extremities originated deep in . . . the Paleozoic." Moreover, we have, of late, discovered that our bipedal, man-ape ancestors seem to have flourished for a surprisingly long time without any increase in their cranial content whatever—some four or five million years, in fact.

 

It used to be thought that the brain of proto-man would have had to develop very early to enable him to survive upright upon the ground at all. Oddly, it now appears that man survived so well under these circumstances that it is difficult to say why, in the end, he became man at all. His bipedal pre-man phase lasted much longer—five or six times at least—than his whole archaeological history down to this very moment. What makes the whole story so mystifying is that the expansion of his neurocranium took place relatively rapidly during the million years or so of Ice Age time, and has not been traced below this point. The supposed weak-bodied creature whom Darwin nervously tried to fit into his conception of the war of nature on the continents is thought to have romped through a longer geological time period than his large-brained descendants may ever see.

 

We know that at least two million years ago the creature could make some simple use of stones and bones and may possibly have fashioned crude windbreaks. He was still small-brained in human terms, however, and if his linguistic potentialities were increasing there remains no satisfactory evidence of the fact. Thus we are confronted with the question why man, as we know him, arose, and why, having arisen, he found his way out of the green confines of his original world. Not all the human beings even of our existing species did. Though their brains are comparable to our own, they have lingered on, something less than one per cent of today’s populations, at the edge of a morning twilight that we have forgotten. There can thus be no ready assertion that man’s departure from his first world, the world of chameleon-like shifts and forest changes, was either ordained or inevitable. Neither can it be said that visible tools created brains. Some of the forest peoples—though clever to adapt—survive with a paucity of technical equipment.

 

As to why our pygmoid ancestors, or, more accurately, some group of them, took the road to larger brains we do not know. Most of the suggestions made would just as readily fit a number of non-human primate forms which did not develop large brains. Our line is gone, and while the behavior of our existing relatives is worth examination we cannot unravel out of another genetic strand the complete story of our own.

 

Not without interest is the fact that much of this development is correlated with the advances and recessions of the continental ice fields. It is conceivable, at least, that some part of the human stock was being exposed during this time to relentless genetic pressures and then, in inter-glacial times, to renewed relaxation of barriers and consequent genetic mixture. A few scattered finds from remote portions of the Euro-Asiatic land mass will never clarify this suspicion. For hundreds of thousands of years of crucial human history we have not a single bone as a document.

 

There is another curious thing about the Ice Age. Except for the emergence of genuinely modern man toward the close of its icy winter, it is an age of death, not a birth time of species. Extinction has always followed life relentlessly through the long eras of earth’s history. The Pleistocene above all else was a time of great extinctions. Many big animals perished, and though man’s hunting technology was improving, his numbers were still modest. He did not then possess the capacity to ravage continents in the way he was later to do.

 

The dinosaurs vanished before man appeared on earth, and their disappearance has caused much debate. They died out over a period many millions of years in extent and at a time when the low warm continents lapped by inland seas were giving way to bleaker highlands. The events of the Ice Age are markedly different. First of all, many big mammals—mammoth, mastodon, sloth, long-horned bison—survived the great ice sheets only to die at their close. It is true that man, by then dispersing over the continents, may have had something to do with their final extermination, but there perished also certain creatures like the dire wolves, in which man could have taken little direct interest.

 

We are thus presented, in contrast to the situation at the close of the age of reptiles, with a narrowly demarcated line of a few thousand years in which a great variety of earth’s northern fauna died out while man survived. Along with the growing desiccation in Southwest Asia, these extinctions gave man, the hunter, a mighty push outside his original game-filled Eden. He had to turn to plant domestication to survive, and plants, it just happens, are the primary road to a settled life and the basic supplies from which cities and civilizations arise. A half-dying green kingdom, one might say, forced man out of a relationship which might otherwise have continued down to the present.

 

But, the question persists, why did so many creatures die in so little time after marching back and forth with the advancing or retreating ice through so many thousand years? Just recently the moon voyage has hinted at a possible clue, though it must be ventured very tentatively when man’s observational stay upon the moon has been so short.

 

The Apollo 11 astronauts observed and succeeded in photographing melted or glazed droplets concentrated on points and edges of moon rock. Dr. Thomas Gold, director of Cornell University’s Center for Radio Physics, has suggested that these glasslike concretions are evidence of melting, produced by a giant solar flare activated for only a few moments, but of an unexpected intensity. Giant storms are known to lick outward from the sun’s surface, but a solar disturbance of the magnitude required to account for such a melting—if it was indeed sun-produced—would have seemed from earth like the flame of a dragon’s breath. Most of the ultraviolet of the sun-storm, generated perhaps by a comet hurtling into the sun’s surface, would have been absorbed by the earth’s atmosphere. A temperature effect on earth need not have been pronounced so long as the flare was momentary. The unprotected surface of the moon, however, would have received the full impact of the dragon’s tongue.

 

Dr. Gold has calculated by various means that the event, if actually produced by a solar flare, lies somewhere close to thirty thousand years from us in time and is therefore unrecorded in the annals of man. But here is the curious thing. The period involved lies in the closing Ice Age, in the narrow time zone of vast extinctions in the northern hemisphere. Was the giant flare, an unheard-of phenomenon, in some way involved with the long dying of certain of the great mammals that followed? Seemingly the earth escaped visible damage because of its enveloping blanket of air. No living man knows what the flicking tongue of a dragon star might do, however, or what radiation impact or atmospheric change might have been precipitated upon earth. Some scholars are loath to accept the solar-flare version of the moon glaze because of the stupendous energy which would have to be expended, and the general known stability of the sun. But men are short-lived, and solar catastrophes like the sunward disintegration of a comet would be exceedingly rare. Until more satisfactory evidence is at hand, most scientists will probably prefer to regard the glazed rock as splashed by the heat of meteoritic impact.

 

Nevertheless, the turbulent outpouring of even ordinary solar flares is on so gigantic a scale as to be terrifying in a close-up view. Until there is further evidence that ours is not a sleepy dragon star, one may wonder just what happened thirty thousand years ago, and why, among so many deaths, it was man who survived. Whatever occurred, whether by ice withdrawal or the momentary penetration of the ultraviolet into our atmosphere, man’s world was changed. Perhaps there is something after all to the story of his eviction from the green Garden.

 

When I lie in bed now and await the hastening of Halley’s comet, I would like to find my way back just once to that single, precise instant when the star dragon thrust out its tongue. Perhaps the story of all dragons since comes from that moment. Men have long memories when the memories are clothed in myth. But I drowse, and the train whistle mingles and howls with the heaven-sweeping light in my dream. It is 1910. I am going back once more.

 

“at four i had been fixed with the compulsive vertigo of vast distance and even more endless time”

…echoes of Beckettian existentialism in Loren Eiseley:

 

 

"I think we are now well across the last ice, toward the beginning. There is no fire of any sort but we do not miss it. We are far to the south and the climate is warm. We have no tools except an occasional bone club. We walk upright, but I think we are now animals. We are small — pygmies, in fact. We wear no clothes. We no longer stare at the stars or think of the unreal. The dead are dead. No one follows us at nightfall. Do not repeat this. I think we are animals. I think we have reached beyond the bridge. We are happy here. Tell no one."

 


one: THE STAR DRAGON

 

 

Already at the origin of the species man was equal to what he was destined to become.

 

 —JEAN ROSTAND

 

  

 

THE STAR DRAGON

 

In the year 1910 Halley’s comet—the comet that among many visitations had flared in 1066 over the Norman invasion of England—was again brightening the night skies of earth. "Menace of the Skies," shrieked the more lurid newspapers.

 

Like hundreds of other little boys of the new century, I was held up in my father’s arms under the cottonwoods of a cold and leafless spring to see the hurtling emissary of the void. My father told me something then that is one of my earliest and most cherished memories.

 

"If you live to be an old man," he said carefully, fixing my eyes on the midnight spectacle, "you will see it again. It will come back in seventy-five years. Remember," he whispered in my ear, "I will be gone, but you will see it. All that time it will be traveling in the dark, but somewhere, far out there"—he swept a hand toward the blue horizon of the plains—"it will turn back. It is running glittering through millions of miles."

 

I tightened my hold on my father’s neck and stared uncomprehendingly at the heavens. Once more he spoke against my ear and for us two alone. "Remember, all you have to do is to be careful and wait. You will be seventy-eight or seventy-nine years old. I think you will live to see it—for me," he whispered a little sadly with the foreknowledge that was part of his nature.

 

"Yes, Papa," I said dutifully, having little or no grasp of seventy-five years or millions of miles on the floorless pathways of space. Nevertheless I was destined to recall the incident all my life. It was out of love for a sad man who clung to me as I to him that, young though I was, I remembered. There are long years still to pass, and already I am breathing like a tired runner, but the voice still sounds in my ears and I know with the sureness of maturity that the great wild satellite has reversed its course and is speeding on its homeward journey toward the sun.

 

At four I had been fixed with the compulsive vertigo of vast distance and even more endless time. I had received, through inherited temperament and inclination, a nostalgic admonition to tarry. Besides, I had given what amounted to a desperate promise. "Yes, Papa," I had said with the generosity of childhood, not knowing the chances that men faced in life. This year, after a visit to my doctor, I had written anxiously to an astronomer friend. "Brad," I had asked, "where is Halley’s comet reported on the homeward track? I know it must have turned the elliptic, but where do you calculate it now, how far—and how long, how long—?"

 

I have his answer before me. "You’re pushing things, old man," he writes. "Don’t expect us to see it yet—you’re too young. The orbit is roughly eighteen astronomical units or one billion six hundred and fifty million miles. It headed back this way probably in nineteen forty-eight."

 

Nineteen forty-eight. I grope wearily amidst memories of the Cold War, Korea, the Berlin blockade, spies, the impossible-to-be-kept secrets of the atom. All that time through the black void the tiny pinpoint of light has been hurrying, hurrying, running faster than I, thousands of miles faster as it curves toward home. Because of my father and the promise I had made, a kind of personal bond has been projected between me and the comet. I do not think of what it heralded over Hastings in 1066. I think it is racing sunward so that I can see it stretched once more across the heavens and momently restore the innocence of 1910.

 

But there is inner time, "personal, private chronometry," a brain surgeon once told me. There is also outer time that harries us ruthlessly to our deaths. Some nights in a dark room, staring at the ceiling, I can see the light like a mote in my eye, like a far-off train headlight glimpsed long ago as a child on the prairies of the West. The mournful howl of the train whistle echoes in my head and mingles with the night’s black spaces. The voice is that of the comet as I hear it, climbing upward on the arc of space. At last in the dark I compose myself for sleep. I pull the blanket up to my chin and think of radar ceaselessly sweeping the horizon, and the intercontinental missiles resting in their blast-hardened pits.

 

But no, I dream deeper, slipping back like a sorcerer through the wood of time. Life was no better, not even as safe, proportionately, in the neolithic hill forts whose tiny trenches can be seen from the air over the British downs. A little band of men, with their families beside them, crouched sleepless with ill-made swords, awaiting an attack at dawn. And before that, the caves and the freezing cold, with the ice creeping ever southward autumn by autumn.

 

The dead we buried in red ochre under the fire pit, the red standing for blood, for we were quick in analogies and magic. The ochre was for life elsewhere and farewell. We tramped away in our furred garb and the leaves and snow washed over the place of our youth. We worked always toward the south across the tundra following the long trail of the mammoth. Someone saw a vast flame in the sky and pointed, but it was not called Halley’s comet then. You could see it glinting through the green light and the falling snow.

 

Farther backward still across twin ice advances and two long interglacial summers. We were cruder now, our eyes wild and uncertain, less sure that we were men. We no longer had sewn garments, and our only weapon was a heavy pointed stone, unhafted and held in the hand. Even our faces had taken on the cavernous look of the places we inhabited. There were difficulties about making fire, and we could not always achieve it. The dead were left where they fell. Women wept less, and the bands were smaller. Our memories consisted of dim lights under heavy sockets of bone. We did not paint pictures, or increase, by magic, the slain beasts. We talked, but the words we needed were fewer. Often we went hungry. It was a sturdy child that survived. We meant well but we were terrifyingly ignorant and given to frustrated anger. There was too much locked up in us that we could not express.

 

We were being used, and perhaps it was against this that we unconsciously raged the most. We were neither beast nor man. We were only a bridge transmitting life. I say we were almost animals and knew little, but this we felt and raged against. There were no words to help us. No one could think of them. Sometimes we were stalked by the huge cats, but it was the inner stalking that was most terrible. I saw a star in the sky with a flaming tail and cowered, shaking, into a bush, making uncouth sounds. It is not laughable. Animals do not do this. They do not see the world as we do—even we.

 

I think we are now well across the last ice, toward the beginning. There is no fire of any sort but we do not miss it. We are far to the south and the climate is warm. We have no tools except an occasional bone club. We walk upright, but I think we are now animals. We are small— pygmies, in fact. We wear no clothes. We no longer stare at the stars or think of the unreal. The dead are dead. No one follows us at nightfall. Do not repeat this. I think we are animals. I think we have reached beyond the bridge. We are happy here. Tell no one.

 

I sigh in my sleep but I cannot hold to the other side of the bridge—the animal side. The comet turns blazing on its far run into space. Slowly I plod once more with the furred ones up the ladder of time. We cross one ice and then another. There is much weeping, too much of memory. It is all to do over again and go on. The white-robed men think well in Athens. I heard a man named Pindar acclaim something that implied we have a likeness to the immortals. "What course after nightfall," he questioned, "has destiny written that we must run to the end?"

 

What course after nightfall? I have followed the comet’s track returning and returning while our minds and our bodies changed. The comet will appear once more. I will follow it that far. Then I will no longer be part of the bridge. Perhaps I will be released to go back. Time and space are my inheritance from my father and the star. I will climb no further up the ladder of fiery return. I will go forward only one more rung. What will await me there is not pleasant, but it is in the star’s destiny as well as mine. I lie awake once more on the dark bed. I feel my heart beating, and wait for the hurrying light.

 

  

II

  

In 1804, well over a century and a half ago, Captain William Clark recorded in his diary far up the unknown Missouri that ahead of the little expedition that he shared with Meriwether Lewis hung a formidable curtain of blowing dust through which they could not see.

 

"Tell us what is new," the few savants in the newborn American republic had advised the explorers when they departed westward. Men continued to have strange expectations of what lay hidden in the still uncharted wilds behind the screen of the great eastern forest. Some thought that the mammoth, whose bones had been found at Big Bone Lick, in Kentucky, might still wander alive and trumpeting in that vast hinterland. The "dreadful curtain" through which the youthful captains peered on that cold, forbidding day in January could have hidden anything. Indeed the cloud itself was symbolic. It represented time in inconceivable quantities—time, not safe, not contained in Christian quantity, but rather vast as the elemental dust storm itself.

 

The dust in those remote regions was the dust of ice ages, of mountains wearing away under the splintering of frost and sun. The Platte was slowly carrying a mountain range to the sea over giant fans of gravel. Fremont’s men would later report the strange and grotesque sculptures of the wind in stone. It was true that a few years earlier the Scottish physician James Hutton had philosophically conceived such time as possible. His views had largely proved unwelcome and had been dismissed in Europe. On the far-western divide, however, amid the roar of waters falling toward an unknown western ocean, men, frontiersmen though they were, must have felt with an increasing tinge of awe the weight of ages unknown to man.

 

Huge bones bulked in the exposed strata and were measured with wonder. No man knew their names or their antiquity. New things the savants had sought surrounded the explorers, not in the sense of the living survival of great elephants but rather in the sense of a vaster novelty —the extension of time itself. It was as though man for the first time was intruding upon some gigantic stage not devised for him. Among these wastes one felt as though inhuman actors had departed, as though the drama of life had reached an unexpected climax.

 

One catches this same lost feeling in the remarks of another traveler, Alexis de Tocqueville, venturing into the virgin forest far from the pruned orchards of France. "Here," he said, "man seems to enter life furtively. Everything enters into a silence so profound, a stillness so complete that the soul feels penetrated by a sort of religious terror." Even in the untouched forest, time had taken on this same American quality: "Immense trees," de Tocqueville wrote in awe, "retained by the surrounding branches, hang suspended in the air and fall into dust without touching the earth."

 

It is perhaps a significant coincidence that man’s full recognition of biological novelty, of the invisible transformations of the living substance itself, came close upon the heels of the discovery of the vast wilderness stage which still held the tumbled bones of the former actors. It was a domain which had remained largely unknown to Europeans. Sir Charles Lyell, who, in the 1830s, successfully revived Hutton’s lost doctrines of geological antiquity, visited the United States in the 1840s and lectured here to enthralled thousands. Finally, it was Charles Darwin, the voyager-naturalist, who, as a convinced follower of Lyell, had gazed upon a comparable wilderness in South America and had succeeded, in his mind’s eye, in peopling the abandoned stage with the creatures of former epochs. It was almost as though Europe, though rife with speculation since the time of the great voyagers, could not quite escape its man-centeredness or its preoccupation with civilized hedgerows and formal gardens. Its thinkers had still to breathe, like Darwin, the thin air of Andean highlands, or hear the falling of stones in mountain cataracts.

 

To see his role on the world stage, Western man had twice to revise his conception of time: once from the brevity of a few thousand years to eons of inconceivable antiquity, and, a second time, with far more difficulty, to perceive that this lengthened time-span was peopled with wraiths and changing cloud forms. Time was not just aged rocks and trees, alike since the beginning of creation; its living aspect did not consist merely of endless Oriental cycles of civilizations rising and declining. Instead, the living flesh itself was alterable. Our seeming stability of form was an illusion fostered by the few millennia of written history. Behind that history lay the vast and unrecorded gloom of ice ages inhabited by the great beasts which the explorers, at Thomas Jefferson’s bidding, had sought through the blowing curtain of the dust.

 

Man, but not man in the garb we know, had cracked marrow bones in those dim shadows before his footprints vanished amidst the grass of wild savannahs. For interminable ages winged reptiles had hovered over the shores of ancient seas; creatures still more strange had paddled in the silence of enormous swamps. Finally, in that long backward range of time, it was possible to emerge upon shores which no longer betrayed signs of life, because life had become mere potential.

 

At that point one could have seen life as the novelty it truly is. "Tell us what is new," reiterated the eager scientists to the explorers. Past mid-century, an answer could be made. It was life itself that was eternally, constantly new. Dust settled and blew the same from age to age; mountains were worn down to rise again. Only life, that furtive intruder drifting across marsh and field and mountain, altered its masks upon the age-old stage. And as the masks were discarded they did not come again as did the lava of the upthrust mountain cores. Species died as individuals died, or, if they did not perish, they were altered beyond recognition and recall. Man cannot restore the body that once shaped his mind. The bird upon the bough cannot, any more than a summer’s yellow butterfly, again materialize the chrysalis from which it sprang.

 

Indeed, in the end, life can be seen not only as a novelty moving through time toward an endlessly diverging series of possible futures but also as a complete phantom. If we had only the scattered chemicals of the cast-off forms and no experience in ourselves of life’s existence, we would not be able to identify its reality or its mutability by any chemical test known to us. The only thing which infuses a handful of dust with such uncanny potential is our empirical knowledge that the phenomenon called life exists, and that it constantly pursues an unseen arrow which is irreversible.

 

Through the anatomical effort and puzzle-fitting of many men, time, by the mid-nineteenth century, had become gigantic. When On the Origin of Species was published, the great stage was seen not alone to have been playing to remote, forgotten audiences; the actors themselves still went masked into a future no man could anticipate. Some straggled out and died in the wings. But still the play persisted. As one watched, one could see that the play had one very strange quality about it: the characters, most of them, began in a kind of generous latitude of living space and ended by being pinched out of existence in a grimy corner.

 

Once in a while, it is true, a prisoner escaped just when all seemed over for him. This happened when some oxygen-starved Devonian fish managed to stump ashore on their fins and become the first vertebrate invaders of the land. By and large, however, the evolutionary story had a certain unhappy quality.

 

The evolutionary hero became a victim of his success and then could not turn backward; he prospered and grew too large and was set upon by clever enemies evolving about him. Or he specialized in diet, and the plants upon which he fed became increasingly rare. Or he survived at the cost of shutting out the light and eating his way into living rock like some mollusks. Or he hid in deserts and survived through rarity and supersensitive ears. In cold climates he reduced his temperature with the season, dulled his heart to long-drawn spasmodic effort, and slept most of his life away. Or, parasitically, he slumbered in the warm intestinal darkness of the tapeworm’s eyeless world.

 

Restricted and dark were many of these niches, and equally dark and malignant were some of the survivors. The oblique corner with no outlet had narrowed upon them all. Biological evolution could be defined as one long series of specializations—hoofs that prevented hands, wings that, while opening the wide reaches of the air, prevented the manipulation of tools. The list was endless. Each creature was a tiny fraction of the life force; the greater portion had died with the environments that created them. Others had continued to evolve, but always their transformations seemed to present a more skilled adaptation to an increasingly narrow corridor of existence. Success too frequently meant specialization, and specialization, ironically, was the beginning of the road to extinction. This was the essential theme that time had dramatized upon the giant stage.

  

“busy shopping centre… middle of the throng… staring into space… mouth half-open as usual”

"Not I . . . is an aural mosaic of words, which come pell-mell but not always helter-skelter, and that once it is over, a life, emotions, and a state of mind have been made manifest, with a literally stunning impact upon the audience.”

 

Two reviews of Samuel Beckett’s Not I


Edith Oliver, The New Yorker 2 December 1972, p. 124:

The nearest I can come to describing ‘Not I’ is to say that it is an aural mosaic of words, which come pell-mell but not always helter-skelter, and that once it is over, a life, emotions, and a state of mind have been made manifest, with a literally stunning impact upon the audience. Even then, much of the play remains, and should remain, mysterious and shadowy. It opens in total darkness. A woman’s voice is heard (but so quietly that it almost mingles with the rattling of programs out front) whispering and crying and laughing and then speaking in a brogue, but so quickly that one can barely distinguish the words. Then a spotlight picks out a mouth moving; that is all the lighting there is, from beginning to end. The words never stop coming, and their speed never slackens; they are, we finally realize, the pent-up words of a lifetime, and they are more than the woman can control. She refers to her own ‘raving’ and ‘flickering brain,’ and to her ‘lips, cheeks, jaws, tongue, never still a second.’ Yet something of great power and vividness—tatters of incidents and feelings, not a story but something—comes through from a dementia that is compounded of grief and confusion. We hear of a sexual episode that took place on an early April morning long ago, when she was meant to be having pleasure and was having none. There is talk of punishment for her sins, and of being godforsaken, with no love of any kind. She is obsessed with the idea of punishment. There was a trial of some kind, when all that was required of her was to say ‘Guilty’ or ‘Not guilty,’ and she stood there, her mouth half open, struck dumb. Since then (or maybe not since then), she has been unable to speak, except for once or twice a year, when she rushes out and talks to strangers—in the market, in public lavatories, only to see their stares and almost die of shame. She has ‘lived on and on to be seventy.’ The light slowly fades, the gabble slides off to whispers and to silence.
All the while, a man in monk’s garb has been standing in the shadows, listening and occasionally bowing his head. Miss Tandy gives an accomplished performance in what must be an extremely difficult role. Henderson Forsythe is the listener. This production of ‘Not I’ (I have no idea what the title means) lasts around fifteen minutes. They are about as densely packed as any fifteen minutes I can remember.


Benedict Nightingale, New Statesman, 26 January 1973, pp. 135–6:

When I was a boy, in the 1940s and 1950s, one of the most famous sights of the West Kent countryside was a woman in a rough brown smock with string round her waist, body bent forwards, arms working like pistons as she bustled towards Tunbridge Wells station. There she was planning to meet her husband, who had been killed in the first world war. In time, her walk lost its fever and became a sort of doleful trudge, and she disappeared from the roads. I don’t know if she may conceivably still be found in some geriatric ward, staring out of the window and wondering when the war will end; but I do know that her image came forcefully back to me when I saw ‘Not I’. If the spot that lit up the speaker’s mouth, and that only, had spread to reveal the whole of her body, I would have expected to see much the same hump and rags: if the old woman of Kent had spoken, I daresay much the same anguished gabble would have poured from her. All Beckett’s plays may be seen as threnodies to wasted lives; but ‘Not I’ is more concrete in its characterisation than most, and as starkly visual as any in its evocation of the all-but-invisible piece of human driftwood whose monologue it is. It is also unusually painful—tearing into you like a grappling iron and dragging you after it, with or without your leave.

The mouth belongs to Billie Whitelaw; and, for some 15 minutes, she pants and gasps out the tale of the character to whom it belongs, her broken phrases jostling each other in their desperation to be expressed. It is a performance of sustained intensity, all sweat, clenched muscle and foaming larynx, and one which finds its variety only upwards: a frantic cackle at the idea that there might be a merciful God; a scream of suffering designed to appease this uncertain deity. But it must be admitted that the breathless pace combines with the incoherence of the character’s thoughts to make the piece hard to follow: which is why I’d suggest either that it be played twice a session (though this might prove too much even for Miss Whitelaw’s athletic throat), or that spectators should first buy and con the script, which Faber is publishing this week at 40p. After all, one of the many assumptions which Beckett’s work challenges is that a play should necessarily strip and show its all (or even much of itself) at first encounter. Like good music, ‘Not I’ demands familiarity, and is, I suspect, capable of giving growing satisfaction with each hearing. Meanwhile, let me piece together a crib for those too poor or proud to get the score proper.

‘Mouth’, as Beckett calls her, was born a bastard, deserted by her parents, brought up in a loveless, heavily religious orphanage. She became a lonely, frightened, half-moronic adult, forever trudging round the countryside and avoiding others.

busy shopping centre…supermart…just hand in the list…with the bag…old black shopping bag… then stand there waiting…any length of time… middle of the throng…motionless…staring into space…mouth half-open as usual…till it was back in her hand… the bag back in her hand…then pay and go…not as much as goodbye.

Once she appeared in court on some unnamed charge, and couldn’t speak; once and only once, she wept; occasionally, ‘always winter for some reason’, she was seen standing in the public lavatory, mouthing distorted vowels. But otherwise ‘nothing of note’ apparently happened until a mysterious experience at the age of 70. The morning sky went dark, a ray of light played in front of her. Her reaction (‘very foolish but so like her’) was that she was about to be punished for her sins, and she tried to scream. Yet neither did she feel pain, nor could she make a sound; nor hear anything, except a dull buzzing in the head. Then, suddenly, her mouth began to pour out words, so many and fast that her brain couldn’t grasp them, though she sensed that some revelation, some discovery, was at hand. And ‘feeling was coming back… imagine… feeling coming back’—to her mouth, lips and cheeks, if not yet to her numb heart. It is that feeling, those words, which we are presumably hearing in the theatre; that mouth, bulging and writhing in its spotlight like some blubbery sea-creature on the hook, which is now virtually all that is left alive of the speaker after decades of dereliction.

Or could it be, as some suspect, that the mouth is talking, not of itself, but of someone else? I don’t think so. True, the story is told entirely in the third person, and the play is baldly called ‘Not I’. But Beckett helpfully provides a stage direction which seems to explain that. At key moments, the speaker repeats with rising horror, ‘What? Who? No! SHE!’: which is, we’re told, a ‘vehement refusal to relinquish third person’. In other words, she can’t bring herself to utter the word ‘I’, and that, I’d suggest, is because she dare not admit that this wilderness of a life is hers and hers alone. Whenever she gets near the admission, we get instead that cry of ‘no’ and howl of ‘she’, as if she was denying any possibility so awful. Things like that happen to other people: they cannot happen to ‘me’. Again, she seems to show symptoms of what psychiatrists call ‘depersonalisation’, the condition in which the sufferer has lost nearly all capacity for emotion and is left with the sensation, not only of not being himself, but of scarcely being human at all. Thus she thinks of herself in the third person and, on two occasions, talks of her body as a ‘machine’, disconnected from sense and speech. But it is, of course, quite inadequate to argue that Beckett is offering a clinical study of a schizophrenic: her predicament is much more representative. Which of us doesn’t shut his eyes to his failures, and who wouldn’t rather say ‘he’ or ‘she’ of much of his own irrecoverable life? Who isn’t guilty of both evasion and waste?

The play’s resonance is typical. Beckett commonly takes a particular character, pares it down to the moral skeleton, and leaves us with the pattern, the archetype: he refines individuals into metaphors in which we can all, if we’re honest, see bits of ourselves. What distinguishes ‘Not I’ from most of his work is the extent to which ‘mouth’ is individualised and the relative straightforwardness of its implications. Once the code is cracked, the stream of consciousness channelled, it isn’t a hard play, nor is it as stunningly pessimistic as some critics believe. In ‘Endgame’, for instance, Hamm’s room is Hamm’s room, a dying man’s skull, the family hearth, society and the planet Earth, forcing the spectator to spread his poor, bewildered wits over four or five levels at once; the stage of ‘Not I’ is a barrenly furnished human mind, and that only. Again, I can think of few gloomier plays than ‘Happy Days’, which equates happiness with gross stupidity, or the one-minute ‘Breath’, which defines life as two faint cries and the world as a rubbish-heap. Invocations of God notwithstanding, ‘Not I’ has nothing definite to say about the society, world or universe in which ‘mouth’ spins out her existence. It could be that some self-fulfilment is possible there for those who don’t evade life by crying ‘not I’: that might be the revelation that tantalises but eludes her. Unlikely, knowing Beckett; but conceivable. We should seize hopefully on the slightest chink in such a man’s determinism, the barest scratch on the dark glasses through which he surveys us all.

It’s an entirely self-sufficient play, but not without echoes from earlier ones: the omnipresence of irrational guilt; the idea that love causes only suffering; and a shape and tone that owes something to ‘Krapp’s Last Tape’, which is presumably why that piece is also on the programme, with Albert Finney poised over the recording machine, spooling his way through yet another null past. Finney proves a bit cavalier with the stage directions, but achieves a good deal with a voice that markedly thickens and coarsens over the years, and with a face that scarcely has to move to suggest fear, bewilderment, a sudden raddled tenderness. I would recommend the production; but it’s ‘Not I’ that lingers in my mind, not because it’s more exquisitely written, but because it is, I think, even more deeply felt. At any rate, the old woman’s predicament strikes me as more moving than the old man’s. Perhaps this is because he is cleverer, and she more fragile and vulnerable, and less responsible for her failures; perhaps not. Whatever the reason, it is hard not to identify with the bent, cowled figure Beckett calls the ‘auditor’, who stands half-invisible in the murk of the stage watching the mouth and, finally, raising his arms ‘in a gesture of helpless compassion’. Compassion is indeed and exactly what ‘Not I’ provokes, and more powerfully than anything I’ve yet seen by Beckett.

—from L. Graver and R. Federman, editors, Samuel Beckett: The Critical Heritage, Routledge, 1979, pp. 368-373.

“in a universe whose size is beyond human imagining… men have grown inconceivably lonely”

". . . the burden of consciousness has grown heavy upon us. We watch the stars, but the signs are uncertain. We uncover the bones of the past and seek for our origins. There is a path there, but it appears to wander. The vagaries of the road may have a meaning however; it is thus we torture ourselves.”

Darwin saw clearly that the succession of life on this planet was not a formal pattern imposed from without, or moving exclusively in one direction. Whatever else life might be, it was adjustable and not fixed. It worked its way through difficult environments. It modified and then, if necessary, it modified again, along roads which would never be retraced. Every creature alive is the product of a unique history. The statistical probability of its precise reduplication on another planet is so small as to be meaningless. Life, even cellular life, may exist out yonder in the dark. But high or low in nature, it will not wear the shape of man. That shape is the evolutionary product of a strange, long wandering through the attics of the forest roof, and so great are the chances of failure, that nothing precisely and identically human is likely ever to come that way again.

[. . .]

In a universe whose size is beyond human imagining, where our world floats like a dust mote in the void of night, men have grown inconceivably lonely. We scan the time scale and the mechanisms of life itself for portents and signs of the invisible. As the only thinking mammals on the planet—perhaps the only thinking animals in the entire sidereal universe—the burden of consciousness has grown heavy upon us. We watch the stars, but the signs are uncertain. We uncover the bones of the past and seek for our origins. There is a path there, but it appears to wander. The vagaries of the road may have a meaning however; it is thus we torture ourselves.

Lights come and go in the night sky. Men, troubled at last by the things they build, may toss in their sleep and dream bad dreams, or lie awake while the meteors whisper greenly overhead. But nowhere in all space or on a thousand worlds will there be men to share our loneliness. There may be wisdom; there may be power; somewhere across space great instruments, handled by strange, manipulative organs, may stare vainly at our floating cloud wrack, their owners yearning as we yearn. Nevertheless, in the nature of life and in the principles of evolution we have had our answer. Of men elsewhere, and beyond, there will be none forever.

Loren Eiseley, “Little Men and Flying Saucers”