Monthly Archives: August 2006

Sleep inducers!

What’s the best sleep inducer in an unfamiliar environment? What makes your tired limbs and drooping eyes instantaneously relax in a new place, a new atmosphere?

For me, for better or for worse, it’s been an arm. Yes, I am not kidding, it’s an arm; the part of the human body which is used the most often except maybe the legs, the part that sticks out and comes in pairs; an arm.

Why an arm of all things? It’s understandable if it’s a pillow, or a bolster, or a toy but an arm?? Well, that’s the way life is 🙂

Let me explain my wacko logic (it’s too bad that it’s as unorthodox as an arm, so I might as well try and come up with a plausible logic and explain it). An arm is the most malleable piece among all the other options available. You can make the arm soft, hard, squishy, smooth, rough, and basically anything reasonable, without too much trouble on your part. Ever tried making a toy softer or warmer without too much effort on your part?

Secondly, an arm is always just nice and warm. It’s never hot (well, usually not), and almost never too cold. Thus, there is no heating required, no temperature measurements needed before hitting the bed. It’s a ready-made heater just waiting to be used.

Thirdly, an arm has the innate ability to put you to sleep. How, you say? It’s simple. A normal arm ends with five tendril-like stick-outs called fingers. These fingers are attached to the arm via an extremely flexible piece of flesh usually called the wrist. Get it now? When you’re sleepy, all your sleep inducer needs to do is move the wrist and activate those fingers to give you a gentle pat on the head. And you’re in dreamland in no time 🙂

Can you find me a better sleep inducer?


The Age of Female Computers

Today, mathematics and computer science often appear as the province of geniuses working at the very edge of human ability and imagination. Even as American high schools struggle to employ qualified math and science teachers, American popular culture has embraced math, science, and computers as a mystic realm of extraordinary intellectual power, even verging on madness. Movies like A Beautiful Mind, Good Will Hunting, and Pi all present human intelligence in the esoteric symbolism of long, indecipherable, but visually captivating equations. One has to think of such prosaic activities as paying the mortgage and grocery shopping to be reminded of the quiet and non-revelatory quality of rudimentary arithmetic. Which is not to put such labor down. Adding the price of milk and eggs in one’s head is also brain work, and we should never forget the central place of mere calculation in the development of more sophisticated areas of human knowledge.

Long before the dawn of calculators and inexpensive desktop computers, the grinding work of large problems had to be broken up into discrete, simple parts and done by hand. Where scads of numbers needed computing—for astronomical purposes at the Royal Observatory in Greenwich, England, or to establish the metric system at the Bureau du Cadastre in Paris—such work was accomplished factory-style. In his book When Computers Were Human, a history of the pre-machine era in computing, David Alan Grier quotes Charles Dickens’s Hard Times to capture the atmosphere of such workplaces: “a stern room with a deadly statistical clock in it, which measured every second with a beat like a rap upon a coffin-lid.” The most famous modern example of such work is probably Los Alamos, where scientists’ wives were recruited in the early stages to compute long math problems for the Manhattan Project.

The social history of pre-machine computing is also interesting in light of contemporary debates about gender and scientific achievement, and here Grier’s reconsideration of the past sheds useful light on the present. Lawrence Summers, who resigned as president of Harvard, became an academic outcast after speculating that there might be an “intrinsic” basis for the unequal numbers of men and women engaged in science and engineering at the university level. The idea that men and women are different creatures, with distinct drives and ways of thinking, is apparently so radical that even to raise it leads to the academic guillotine. And yet only a few decades ago, it was assumed by even the most civilized societies that women were not fit for serious intellectual pursuits, especially scientific ones. The occasional female endowed with truly extraordinary talent occupied the unfortunate position of the George Eliot character who tells her son: “You may try—but you can never imagine what it is to have a man’s force of genius in you, and yet to suffer the slavery of being a girl.” Note that even this extraordinary character, created by an intellectually accomplished, great female novelist, refers to genius as something particularly male.

In the history of computing, the humbler levels of scientific work were open, even welcoming, to women. Indeed, by the early twentieth century computing was thought of as women’s work and computers were assumed to be female. Respected mathematicians would blithely approximate the problem-solving horsepower of computing machines in “girl-years” and describe a unit of machine labor as equal to one “kilo-girl.” In this light, one can surely understand the desire to correct past orthodoxies about the female mind with new ones. But even as we rightly decry a past when even the most talented women were prevented from pursuing math and science in the most prestigious posts, we should remember—and honor—the crucial role of women in advancing mathematical and scientific knowledge one detailed calculation at a time.

At the beginning of the long line of women who made their marks as human computers was Nicole-Reine Lepaute. Like many women featured in Grier’s book, Lepaute enjoyed a personal connection to the intellectual world, allowing her to gain experience with scientific matters in spite of conventions that warned women away from science. She owed her education to the forbearance of understanding parents; her freedom to pursue an intellectual career to an obliging husband; and her professional position to Joseph-Jérôme de Lalande, her longtime scientific collaborator.

In a book published in 1705, using Isaac Newton’s new calculus, the English gentleman-astronomer Edmond Halley identified and predicted the return of the comet eventually named after him. But it was the French mathematician Alexis-Claude Clairaut, along with Lalande and Lepaute, who first computed the date of the comet’s perihelion with any precision in 1757, predicting it would occur in the spring of the following year. Sitting “at a common table in the Palais Luxembourg using goose-quill pens and heavy linen paper,” writes Grier, the three friends slowly computed the course of Halley’s Comet along a parabola-shaped orbit, reducing the math to an extraordinary series of baby steps.

Lalande and Lepaute focused on the orbits and gravitational pulls of Jupiter and Saturn (the three-body problem), while Clairaut focused on the comet’s orbit. “With the perspective of modern astronomy,” Grier writes, “we know that Clairaut did not account for the influences of Uranus and Neptune, two large planets that were unknown in 1757.” Still, the result of their number-crunching was a tenfold improvement in accuracy over Halley’s prediction, if still not perfect. When the comet reached its perihelion just a couple of days shy of the two-month window in which Clairaut and colleagues said it would, Clairaut’s computing method was ridiculed by one of the great intellectuals of the day, Jean d’Alembert, one of the editors of the Encyclopédie and himself an astronomer, who called the calculations more “laborious than deep.” But this has not been the verdict of history. “Beyond the simple accuracy of his result,” writes Grier, “Clairaut’s more important innovation was the division of mathematical labor, the recognition that a long computation could be split into pieces that could be done in parallel by different individuals.”
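(To make the principle vivid in modern terms, here is a minimal sketch, purely illustrative and drawn from nothing in Grier’s book beyond the idea itself: a long calculation is cut into independent chunks, each chunk is handed to a separate worker, and the partial results are combined at the end. The toy series and every name in the code are my own assumptions, not anything historical.)

```python
# Illustrative sketch only: Clairaut's "division of mathematical labor" restated as
# a parallel computation. Each worker process sums its own chunk of a toy series.
from concurrent.futures import ProcessPoolExecutor


def partial_sum(chunk):
    # One human computer's share of the labor: the terms 1/n^2 assigned to it.
    return sum(1.0 / (n * n) for n in chunk)


if __name__ == "__main__":
    N, step = 1_000_000, 250_000
    chunks = [range(start, min(start + step, N + 1)) for start in range(1, N + 1, step)]
    with ProcessPoolExecutor() as pool:  # independent workers, like Clairaut's three friends
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # approaches pi^2 / 6, about 1.6449
```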

Mme. Lepaute was central to this effort, if largely unrewarded with professional position and prestige. Lalande hired her as his assistant when he became the editor of Connaissance des Temps, an astronomical almanac, where together they prepared tables predicting the positions of various celestial bodies. She performed valuable but largely unappreciated work.

Half a century later and an ocean away, Maria Mitchell would play the next part of the willing female computer supporting the bold designs of male scientists. In the 1840s, as American manufacturing swelled to claim some 25 percent of the economy and American pride vis-à-vis Europe launched a new era of economic and political competition, a movement took hold to establish an American nautical almanac. Lacking such a publication, claimed one supporter, “our absent ships could not find their way home nor those in our ports grope to sea with any certainty of finding their way back again.” The almanac’s chief mathematician was Harvard professor Benjamin Peirce, while the computing staff consisted of several students and amateurs. Mitchell was the only woman in the group. The daughter of a banker and amateur astronomer, she was not some anonymous savant: her discovery of a new comet in 1847 brought her fame and a medal from the king of Denmark. Mitchell herself felt no need to announce her discovery, mentioning it only to her father, who quickly checked to see if the comet had been claimed by anyone else and then insisted on publicizing her accomplishment.

Mitchell proved an able computer, not out of place among the gentlemen who filled this minute trade. She went on to become the first female professor of astronomy at Vassar College, gaining some of the recognition and opportunities that Lepaute never did. The tide was indeed slowly turning in women’s favor, though far from decisively. In the two decades following the Civil War, Grier reports, women went from holding one out of six hundred office jobs to one in fifty. The Harvard Observatory in particular found women to be especially desirable computers, since they accepted payment equal to half the going rate for men.

By the end of the nineteenth century, astronomy was no longer driving the science of computing. New scientific interests—from Darwinist anthropological investigation to modern mathematical economics to war production—would come to require and ultimately redirect the aims of computing. In this period, the discipline of statistics as we know it was born, reshaping the character of all kinds of social inquiry. Computing followed the growth of the social sciences: the effort to move away from broad ideas and conceptual investigations toward empirically-based methodologies in pursuit of a scientific knowledge of human affairs.

Francis Galton looked to mathematics to help prove Darwin’s theory of natural selection. In one investigation, he gathered crude data on African women “endowed,” he wrote to his brother, “with that shape which European milliners so vainly attempt to imitate.” Returning to England, Galton worked for the “Committee for Conducting Statistical Inquiries Into the Measurable Characteristics of Plants and Animals,” where such efforts to support Darwinism came under the powerful influence of Karl Pearson, who introduced a breakthrough formula for correlation. Pearson was also an unusual character, a man of far-flung intellectual interests and progressive social opinions. Grier, who passes up few opportunities to enliven his history, describes Pearson’s Hampden Farm House project, where women and men worked together in an egalitarian atmosphere studying plants. On Fridays, the workers would break for what were called “biometric teas,” while calculation and number-crunching took place on weekends. One of Pearson’s larger projects collated data on some 4,000 children and parents in an attempt to demonstrate that “moral qualities” of character and intelligence were hereditary. It was a fine example of how rigorous calculation in service to misguided theories is error masquerading as a thousand facts—a problem that obviously has not gone away.
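(For reference, and in notation that is mine rather than Grier’s, the product-moment correlation Pearson introduced is still computed today as

$$
r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^2}},
$$

where $\bar{x}$ and $\bar{y}$ are the means of the two measured quantities; values near +1 or −1 signal a strong linear relationship, values near 0 almost none.)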

World War I shifted the focus of computing to two kinds of questions: military problems concerning artillery trajectories and atmospheric drag, and economic problems concerning production, as the United States strove to outfit, feed, and arm the American Expeditionary Force. England’s Ministry of Munitions relied heavily on Pearson’s Biometrics Laboratory for help calculating ballistics for anti-aircraft munitions. In the United States, such work was handled at the Aberdeen Proving Ground in Maryland. The main task on both sides of the Atlantic was revising Francesco Siacci’s theory of ballistics trajectory, which worked well enough for the artillery of the nineteenth century but needed significant revision in the age of aerial warfare. Human computers struggled to calculate trajectories and end points for aerial bombs, anti-aircraft artillery, and the weaponry of aerial combat.

The Aberdeen Proving Ground was the Manhattan Project of its day. “For many years after the First World War,” said the mathematician Norbert Wiener, voicing conventional wisdom on this point, “the overwhelming majority of significant American mathematicians was to be found among those who had gone through the discipline of the Proving Ground.” But Aberdeen was not the United States’ only wartime home to computers. Nearby in Washington at the Experimental Ballistics Office, Elizabeth Webb Wilson, the top mathematics student in her class at George Washington University, found employment with several other women converting the raw data from Aberdeen into tables usable at the warfront. After the war, she looked in vain for another computing job, eventually becoming a high school mathematics teacher instead.

Wilson’s story confronts us with a paradox of social progress. In a post-feminist world, a distinguished young talent like Wilson would easily find employment working with numbers. Meanwhile, high schools go begging for anyone of Wilson’s ability—male or female—to teach mathematics. That the old system was unjust is indisputable; that the new system is better at raising up the next generation of mathematicians is a complicated question.

Economists also used computing to track domestic productivity. The punched-card tabulator, which the Census Bureau first used for the 1890 census, became an increasingly important tool for tracking retail pricing data mailed in to the Food Administration by thousands of correspondents scattered nationwide. Washington was not exactly converted overnight to such numerical representations of American economic life. As Grier describes: “The notion that the sprawling agricultural economy could be described with differential equations or probed with statistics calculations was not widely accepted in 1917-18.” The work of Henry C. Wallace (an editor and future Secretary of Agriculture) and his son Henry A. Wallace (a writer and future Vice President) during this period foreshadowed the future use of statistics to calculate everything from consumer confidence to inflation to the productivity of American manufacturing. Using carefully crunched numbers, the Wallaces tried to convince U.S. Food Administrator Herbert Hoover to guarantee a price for corn in order to shore up the related price of swine, but to no avail. Hoover feared meddling in the private sector—a stance that has of course become harder for American leaders to maintain, in part because mathematical models of the economy have grown increasingly sophisticated and thus ever more inviting to political intervention.

By the early twentieth century, the machine was catching up with the human computer, as suggested by the presence of those punched-card tabulators in Washington. While computing went from merely supporting astronomy to an essential tool of social science, the technology of computing went from a series of unreliable contraptions to more sophisticated adding machines and cash registers. Both science and business came to rely on the rapid numerical calculations that machines alone could efficiently produce.

The last hurrah for pre-machine computing was a product of the New Deal, the Mathematical Tables Project. Its task was to produce mathematical tables for use, “not only by mathematicians and astronomers, but also by surveyors, engineers, chemists, physicists, biometricians, statisticians, etc.” The Work Projects Administration (W.P.A.) required this large-scale computing operation to use labor-intensive methods and limit the number of female hires to 20 percent of staff. Few of the human computers they hired had completed high school. As one of the early computers recounted, “arrested TB cases, epileptics, malnourished persons abounded.”

Arnold Lowan, a physicist who had fled anti-Semitic pogroms in Europe but could not find a regular teaching position in the United States, was the director of the Mathematical Tables Project. His first lieutenant was Gertrude Blanch, another Eastern European immigrant who could not find academic employment, despite a doctorate in mathematics. Blanch proved to be a true leader. While the regulations of the W.P.A. seemed well-designed for a make-work project of endless mediocrity, she and Lowan worked overtime to check calculations and ensure high-quality products free of errors. Blanch even organized a lunch-hour math curriculum for willing workers that took them from elementary arithmetic through high school algebra, trigonometry, all the way to college calculus and, finally, matrix calculations, the theory of differences, and special functions. It was the most successful mathematical tables project in history.

The arrival of World War II sounded the death knell for work-relief projects, but the Mathematical Tables Project was certified as an urgent wartime program, granting it a reprieve and a degree of respect Lowan had otherwise sought in vain. Grier notes an interesting moment of contact between Lowan and John Brainerd at the University of Pennsylvania, where a team was struggling to build what would become ENIAC, an electrical analyzer that was being developed to calculate ballistics for the Aberdeen Proving Ground. Brainerd was looking for highly skilled human computers, but Lowan’s group was not what he had in mind. Lowan used machines to facilitate the work of human computers; Brainerd wanted human computers to aid the work of his machine. Brainerd then met his own Nicole-Reine Lepaute figure, Adele Goldstine, the wife of a ballistics officer who had done graduate work in mathematics. Goldstine set up a classroom program to educate their own team of computers and promptly hung a “women only” sign on the door of their lab.

At the time, there were almost no researchers whose primary interest was computing, still seen as a mere handmaiden to other, more substantial scientific interests. But this was changing fast as machines began to outperform human computers. Up until World War II, human computers had the advantage. As Grier writes: “A punched-card tabulator could work much faster than a human being, but this advantage was lost if the operator had to spend days preparing the machine.” Richard Feynman, then a junior staff member at Los Alamos, arranged a showdown between man and machine, pitting a group of human computers against the Los Alamos IBM facility with both performing a calculation for the plutonium bomb. For two days, the human computers were able to keep up with the machines. “But on the third day,” recalled one observer, “the punched-card machine operation began to move decisively ahead, as the people performing the hand computing could not sustain their initial fast pace, while the machines did not tire.” Shortly after the war, the machines took over; their human accompanists were now “operators” and “programmers.”

When Computers Were Human tells an important story. Interesting for its insights into science and computing, Grier’s book is also an impressive work of economic and social history. With the discovery of binary logic, the simplest parts of long problems became both too voluminous and too simple for human hands. Yet in slightly more complicated form, the simple number-crunching of long problems made ideal work for the attentive and moderately educated, and it was sometimes the only work available to well-educated women. That scientists often had the benefit of highly talented and under-rewarded female minds who could not stake claim to better-paid academic positions was an important boost to many serious intellectual enterprises. That women of the capacities of Elizabeth Webb Wilson or Gertrude Blanch are now much freer to pursue their own interests is an even greater boost to the sciences, though not without its costs.

David Skinner, “The Age of Female Computers,” The New Atlantis, Number 12, Spring 2006, pp. 96-103.

Plato’s Republic

If any book has changed the world, Republic has a good claim to first place. It is commonly regarded as the culminating achievement of Plato as a philosopher and writer, brilliantly poised between the questioning and inconclusive earlier dialogues and the less compelling cosmological speculations and doubts of the later ones. Over the centuries it has probably sustained more commentary, and been subject to more radical and impassioned disagreement, than almost any other of the great founding texts of the modern world. Indeed, the history of readings of the book is itself an academic discipline, with specialist chapters on almost every episode in the story of religion and literature for the past 2,000 years and more. To take only the major English poets, there are entire books on Platonism and Chaucer, Spenser, Shakespeare, Milton, Blake, Shelley and Coleridge, to name but a few, and there are many others on whole movements and times: Plato and Christianity, Plato and the Renaissance, Plato and the Victorians, Plato and the Nazis, Plato and us. The story of Plato’s direct influence on philosophy is another study in itself, one peppered with names such as Philo Judaeus, Macrobius, Porphyry, Pseudo-Dionysius, Eriugena, as well as the better-known Plotinus, Augustine or Dante. Sometimes the Plato in question is the author of other texts, notably the inspirational dialogue Symposium and the theologically ambitious Timaeus. But Republic is seldom far away.

Anyone who stays very long in the vast silent mausoleums lined with works about Plato and his influence runs the risk of suffocating. Anyone writing on this topic must be conscious of an enormous and disapproving audience, dizzying ranks of ghosts overseeing and criticising omissions and simplifications. Many of these ghosts belong to the most brilliant linguists, scholars, philosophers, theologians and historians of their day. They do not take kindly to the garden to which they devoted their lives being trampled over by outsiders and infidels. And Republic is the shrine at the very centre of the sanctuary, since for centuries it has been the one compulsory subject in the philosophy syllabus, so these same scholars will have been educated with it as the centrepiece and inspiration.

Plato wrote his philosophy in dialogues, a form that requires different voices, and the ebb and flow of argument. It was already noted in antiquity that the Socrates who is the hero of these dialogues, and Plato himself, are shifting figures, readily admitting different interpretations: “It is well known that Socrates was in the habit of concealing his knowledge, or his beliefs; and Plato approved of the habit,” said Saint Augustine. One way of taking this is that Plato, and presumably Socrates, really did have doctrines to teach, but that for some irritating reason they preferred to unveil them only partially, one bit at a time, in a kind of intellectual striptease. This line has occasionally been taken by weak-minded commentators in love with the idea of hidden, esoteric mysteries penetrated only by initiates, among whom they are pleased to imagine themselves.

The right way of interpreting Augustine’s remark is that Plato felt philosophy was more a matter of an activity than of absorbing a static body of doctrine. It is a question of process, not product. Socrates remains the great educator, and those who came to him would be listeners and interrogators, participants in conversation, and would have to throw themselves into the labyrinths of thought. Passive reception of the word would count for nothing – this was one of the mistakes made by Plato’s opponents, the sophists, who charged fees for imparting what they sold as practical wisdom (one might think of the witless piles of “wisdom” and “self-help” literature that now choke bookshops). At the end of Plato’s dialogue Phaedrus, Socrates makes a speech despising reading philosophy as a poor second to doing it. Many people have made the same point subsequently. Schopenhauer describes reading as a mere surrogate for thinking for yourself, and in turn quotes the German polymath Goethe: “What you have inherited from your forefathers, you must first win for yourself if you are to possess it.” Robert Louis Stevenson argued that literature is but the shadow of good talk. “Talk is fluid, tentative, continually in further search and progress; while written words remain fixed, become idols even to the writer, found wooden dogmatisms, and preserve flies of obvious error in the amber of the truth.”

The insistence on engagement chimes with Plato’s adoption of the dialogue form, in which different voices get a hearing, and it is the twists and turns of the processes of argument rather than any set conclusion that help us to expand our minds as we read. Philosophy, in this view, is about discovering things in dialogue and argument (“dialectically”); anything read later could at best be a reminder of the understanding achieved in this process.

This dramatic conception of what Plato is about makes him harder to criticise. One can reject a conclusion, but it is much harder to reject a process of imaginative expansion, and if we take the link with drama seriously, it might seem as silly as “rejecting” King Lear or Hamlet. In fact, the parallel does not cut off criticism, but encourages it. In the course of Plato’s dramas, theses do get stated and defended, arguments are made, and people are persuaded. Sometimes the drama comes to an end with an apparent conclusion. And in all these cases it is appropriate to ask whether the theses, arguments and conclusions are in fact acceptable. Doing this is doing no more than taking part in the drama or entering the dialectical arena, the very activity that Socrates and Plato commend.

But Plato and his Republic have their detractors. In Raphael’s painting The School of Athens, Plato and Aristotle together hold centre-stage, but while Aristotle points to the Earth, Plato points upwards to the Heavens. Coleridge made the same contrast, saying that everyone was born either a Platonist or an Aristotelian, meaning that Plato is otherworldly, a dealer in abstractions, while Aristotle is the plain empirical man who faces things as they are in the world as we find it. Coleridge continued: “I don’t think it possible that anyone born an Aristotelian can become a Platonist, and I am sure no born Platonist can ever change into an Aristotelian.”

Much of Republic can be read as Plato Lite. These parts can be read regardless of our attitude to the heavy-duty metaphysics of the central chapters, notably the part that everyone remembers, the Myth of the Cave. On its best interpretation, it is far from suggesting an airy-fairy, visionary picture of divine raptures and illuminations. In fact, we can tame it, and see it as no more than a sensible plea for just the kind of understanding of the actual world that science and mathematics offer us two millennia later. Perhaps Plato has been horribly betrayed by Platonists – not an uncommon fate for a great philosopher.

But there are other, less doctrinal reasons why the sovereignty of Republic ought to be surprising. The work is long, sprawling and meandering. Far from holding water, its arguments range from ordinarily leaky to leaky in that zany way which leaves some interpreters unable to recognise them as ever intended to hold water at all. Its apparent theory of human nature is fanciful, and might seem inconsistent. Its apparent political implications are mainly disagreeable, and often appalling. In so far as Plato has a legacy in politics, it includes theocracy or rule by priests, militarism, nationalism, hierarchy, illiberalism, totalitarianism and complete disdain of the economic structures of society, born in his case of privileged slave-ownership. In Republic he managed to attach himself both to the most static conservatism and to the most wild-eyed utopianism. On top of all that, the book’s theory of knowledge is a disaster. Its attempt to do what it seemingly sets out to do – which is to show that the moral individual, and only the moral individual, is happy – is largely a sequence of conjuring tricks.

More insidiously, to the extent that there is now an aesthetic tone associated with Plato, it is not one to which we easily succumb, unless we have absorbed too much of it to escape. Plato’s high summer, in England at least, lay in the golden glow of the late Victorian and Edwardian age – the vaguely homoerotic, vaguely religious, emotionally arrested, leisured, class-conscious world of playing fields, expensive schools and lazy universities, the world of Walter Pater, or EM Forster, of half-forgotten belletrists and aesthetes like John Addington Symonds or Goldsworthy Lowes Dickinson, or golden boy-poets like Rupert Brooke. This is not the world around us. It is not quite a world of slave-ownership, but capitalism throws up its own drones.

An equally shocking thing about it in some people’s eyes is that, in writing Republic, Plato utterly betrayed his teacher Socrates. Socrates is the first and greatest liberal hero and martyr to freedom in thought and speech. For writers like John Stuart Mill and George Grote – practical, liberal, utilitarian thinkers – this was the real Socrates, the eternal spirit of reflection, criticism and potentially of opposition to the state itself. But in Republic he is an out-and-out dogmatist, rather than the open-minded, patient, questioning spirit his admirers love. He is shown as the spokesman for a repressive, authoritarian, static, hierarchical society in which everything up to and including sexual relations and birth control is regulated by the political classes, who deliberately use lies for the purpose. He presents a social system in which the liberal Socrates would have been executed much more promptly than he was by the Athenian democracy. In Republic the liberal Socrates has become the spokesman for a dictatorship. In presenting this figure, Plato even betrayed his own calling: once a poet himself, he now calls for the poets to be banned.

A work may have many defects yet be forgiven if the author comes through as a creature of sweetness and light, just as Plato’s literary creation, the Socrates of the earlier dialogues, does. But there is not much help here. True, there must have been enough sweetness and light in Plato to create the figure of the heroic, liberal Socrates in the first place. But if that figure evaporates, as it does in Republic, there is not much else to go into the balance. We know very little about Plato, and what there is to know is not generally appealing. If he is put in historical context, we may find an archetypal grumpy old man, a disenchanted aristocrat, hating the Athenian democracy, convinced that the wrong people are in charge, with a deep fear of democracy itself, constantly sneering at artisans, farmers and indeed all productive labour, deeply contemptuous of any workers’ ambition for education, and finally manifesting a hankering after the appalling military despotism of Sparta.

But as so often with Plato, there is a complication to that picture, nicely brought out in Nietzsche’s reaction to the fact that, on Plato’s deathbed, he turned out to have been reading the comic writer Aristophanes: “there is nothing that has caused me to meditate more on Plato’s secrecy and sphinx-like nature, than the happily preserved petit fait that under the pillow of his deathbed there was found no Bible, nor anything Egyptian, Pythagorean, or Platonic – but a book of Aristophanes. How could even Plato have endured life – a Greek life which he repudiated – without an Aristophanes?”

We are told that Jesus wept, but not that he ever laughed. With Plato, as with Socrates, laughter is often nearer than it seems. This is a good sign. Perhaps the grumpy old man was not quite so grumpy after all. But this does not really matter, for it is the concrete, enduring book that concerns us, not its shadowy and departed author. And it is a good dictum that while many books are wrongly forgotten, no book is wrongly remembered. So we need to work harder to come to terms with the unquestioned staying power of Republic. We need to understand something of the hold this book has had and continues to have on the imagination of readers.

This is an edited extract from Simon Blackburn’s Plato’s Republic: A Biography, part of the Books That Shook the World series published by Atlantic Books; taken from The Guardian.