Sunday, March 31, 2013


Sterling Hayden was an actor. He played Captain McCluskey, the corrupt police chief shot in the head by Al Pacino's Michael Corleone in The Godfather. But he was a lot more: an adventurer, a sailor, an OSS agent, and many other things. He defied a court order and sailed to Tahiti with his four children, Christian, Dana, Gretchen and Matthew. He hated acting and always regretted naming names to the House Un-American Activities Committee.

He wanted to be free:

   "They are enmeshed in the cancerous discipline of 'security'. And in the worship of security we fling our lives beneath the wheels of routine - and before we know it our lives are gone.
   What does a man need - really need? A few pounds of food each day, heat and shelter, six feet to lie down in - and some form of work activity that will yield a sense of accomplishment. That's all - in the material sense. And we know it. But we are brainwashed by our economic system until we end up in a tomb beneath a pyramid of time payments, mortgages, preposterous gadgetry, playthings that divert our attention from the sheer idiocy of the charade.
   The years thunder by. The dreams of youth grow dim where they lie caked in dust on the shelves of patience. Before we know it, the tomb is sealed.
   Where, then, lies the answer? In choice. Which shall it be: bankruptcy of purse or bankruptcy of life? ...
   ... Somehow it is the male's duty to put the best years of his life into work he doesn't like in order that he may 'retire' and enjoy himself as soon as he is too old to do so. This is more than just the system - it is the credo. It is the same thing that prompted Thoreau to say, in 1839: 'The mass of men lead lives of quiet desperation.'"

His autobiography is quite interesting. He never seemed to know where he belonged. "You were strong enough to rebel - not strong enough to revolt."


An interesting article on Nobel Prize-winning geniuses!

The Atlantic

How Nobel Prizewinners Get That Way

By Mitchell Wilson
When Julian Schwinger came to the Columbia Graduate School of Physics in 1935 at the age of seventeen—five years younger than the youngest of us—he was shy and pudgy, with a schoolboy’s broken complexion; but he had already gone through the most advanced treatises on theoretical physics, quantum theory, and relativity all by himself, as easily and avidly as the rest of us had once gone through Two Years Before the Mast. By comparison, we were illiterates. There was even a rumor that he had published his first scientific paper in the Physical Review at fifteen when he was at Townsend Harris High School. He was at once so obviously in a class by himself that no one bothered to envy him. One thing, each of us assured the others: eventually he would earn a Nobel Prize.
When I say “we,” I mean the group of about a dozen graduate students studying and doing research toward our doctorates, along with a handful of postdoctoral fellows and instructors also in their early or middle twenties. We made up the laboratory population of the department. As it turned out, we were right about Julian. In 1965, he was awarded the Nobel Prize for work in quantum electrodynamics. Also, as it turned out, we proved to have been very poor judges of Nobel Prize material. Sitting right there among us all the time, taking part in our talk and gossip, were three others whom we had passed over completely. The first was one of our research chiefs, I. I. Rabi, who was to win a Nobel Prize in 1944. The second was Polycarp Kusch, a young experimentalist from the Middle West, with large angular movements and a loud assertive voice. He was the Nobel laureate in 1955. The third was Willis Lamb, a tall, thin Californian with a slight squint and a quiet erudition, both in physics and out. In the thirties, Lamb considered himself only as a theoretician—although certainly not then in Schwinger’s class, as far as anyone thought.
Four Nobel laureates out of a group as small as that, at a time when the world population of physicists was over ten thousand, was a remarkably high proportion indeed. All these prizes, though, were still decades in the future. We didn’t know what a genuine Nobel Prizewinner looked like, or even what he did once he had been awarded the prize. From time to time, a few such exalted beings as Harold Urey, Arthur Compton, and Robert Millikan would drop in on us for a public evening lecture, but then they took off again with their radiance unpenetrated.
Our first real contact—certainly my first contact—with a living, breathing, close-enough-to-touch Nobel laureate came in 1938 when Enrico Fermi left Italy with his family, ostensibly to go to Sweden to receive the prize for his work in artificial radioactivity. Instead of returning to Mussolini’s Rome, he kept on going until he came to us at Columbia. He was in his middle thirties at the time. I hoped only that when he’d start giving his lecture on atomic and nuclear physics I wouldn’t open my mouth and make a fool of myself in his seminars. I glimpsed him with awe as he hurried through the Pupin corridors, labs, and offices: a short, quick, long-armed man. His gray eyes looked patient, when they were really only polite. To me, he was already half a god.
About a week after Fermi’s arrival, I was called to Rabi’s office. In those days, Rabi liked to whittle at a small piece of wood as he talked. I had recently finished an apprentice research for him in his molecular-beam techniques, and had passed all my qualifying exams. I came, hoping that he was finally going to put me to work on my doctoral assignment. Instead, he told me he was releasing me from his research group so that I could be free to become Fermi’s assistant. It was as if I had been told I was to report to heaven to sit at the right hand of God. But there was also a nightmare side to all this splendor, and that was my feeling that at that particular point of my career I was no more capable of carrying on research physics on the Fermi level and up to the Fermi standard than I was able to walk onstage at the Metropolitan Opera House in the middle of a performance of Tannhäuser and take over the main role. It was the greatest opportunity I had ever had; it was also the most appalling invitation to disaster.
Fermi got to the point the moment I appeared in his office. He asked me what I knew about cosmic rays. I said I knew nothing. He said, no matter, neither did anyone else. He went to the blackboard then and outlined the theory of the experiment he wanted performed, that he wanted us to perform. For the first few minutes, he was remarkably clear. How marvelous it felt to be one of the talented people up here At the Top where life shone! Then everything darkened. He was speaking brilliantly, lucidly, but really to himself, because I no longer understood anything. I kept nodding though; it never occurred to me to ask him to repeat any of the points that I lost. At last, he finished with theory and began to discuss the apparatus I would have to build: pulse-counting circuits, giant Geiger tubes, and appropriate vacuum systems. I felt a little better. I had never made any of the things he asked for, but I knew that I would be able to find out how. Physics had always come more easily to my hands than to my head.
Fermi turned out to be the most active, the most competitive man I have ever known, not only intellectually but physically as well, even with men twice his size and half his age. If it was a matter of mountain climbing, he had to be the one in the lead. If it was swimming, he proved to be the one with the greatest endurance. As his tennis partner, I never had anything to do but hold my racket and squint against the sun. He played both courts, the net and the backcourt as well. Shortly after his arrival in America, he bought a long shining black Packard with part of his prize money. When a minor adjustment had to be made one Sunday, he insisted on doing it himself—and lost a piece of his finger. In the laboratory, sometimes I literally had to wrestle pieces of equipment out of his hand, because while I never saw him lose his temper or even show impatience, he wanted things done his way, by him. I was freed of his furious energy only when the news of nuclear fission came along, and he threw himself into that.
The discovery of nuclear fission was a direct personal challenge to Fermi. In the early 1930s, Fermi had remarked to his old professor in Rome, Corbino, that even though it might take another fifty years to work out all the details of the wave theory of atomic structure, the main outlines were already clear. It was time he moved on to where the next big questions were. Then he and his young Italian co-workers plunged into research on neutron-induced artificial radioactivity, and ranged like wolves through the entire periodic table of elements, and beyond—to the so-called “transuranic” elements, those made heavier than uranium by the nuclear capture of the bombarding neutrons. In 1938, once again Fermi found himself in a field where the general outlines had been cleared. The next advanced position for him to attack was the question of the nature of the very high energy particles found in cosmic rays; and this is what he planned to be doing in America.
The announcement, a short time after he arrived in the United States with the prize, that neutron-bombarded uranium sometimes split into much smaller fragments along with massive emissions of energy meant to Fermi that his “transuranic” elements had been called into question. A portion, at least, of his Nobel award rested on shaky ground. The man who reveled in being first had been first in the area where fission took place, but he had walked blindly past it, leaving to others one of the most startling discoveries in physics. To him, there was no choice but to go back into nuclear physics, re-establish his lead, and prove all over again—if anyone had any questions about it—that he deserved the prize. He then went on to build, eventually, the first chain-reacting nuclear pile.
Not only was he the Columbia physics department’s only Nobel laureate at the time; he also became the busiest physicist in the building. What we didn’t know was that Fermi, who was usual in nothing, was also an unusual Nobelist.
Of all the bizarre effects which winning the prize turns out to have on scientists, the one least often seen is heightened creativity. Only Fermi and a handful of others during the past seventy years fulfilled Alfred Nobel’s original dream. A very different pattern was set by the first man ever to win the award. In 1895 Wilhelm Konrad Roentgen, an obscure physics professor at the University of Würzburg, completed a series of modest but typically meticulous experiments that had been initiated by a chance observation. He worked for about eight concentrated weeks, then his results were described one evening to a small group of Würzburg medical men. He had come across a mysterious new radiation which was actually able to penetrate a variety of materials opaque to the eye. If the hand were held between the source of the radiation and a fluorescent screen, he told them, “The dark shadow of the bones is visible within the less dark shadow of the hand. . . . For brevity’s sake, I should like to use the expression rays; and to distinguish them from other rays, I will call them X-rays, X for the unknown.”
Roentgen’s X-ray photographs of the bones in his wife’s hand (she was wearing a heavy wedding band) were printed all over the world and created a furor that verged on panic. Women were afraid to go out on the street for fear that men with X-ray glasses would see them nude through their clothes. Bankers were afraid that thieves with X-ray vision could see what was hidden in their vaults. In the public mind, for the moment, Roentgen was considered the greatest wizard who had ever lived. He was granted the award in 1901, the first year of its existence, but for the rest of a long, increasingly isolated life, he never made another contribution to science. Behind the silence was a local scandal: Roentgen was accused of taking credit for what one of his students had really done. He was so embittered by the intensity of the vituperation and the unfairness of the charge that he turned more and more in on himself until he became available to hardly anyone. He was the first, but certainly not the last, Nobelist to become involved in an ugly struggle for credit; and to have his entire style of living and working wrenched into some other shape by the most prestigious award the modern world has ever known.
By and large, Nobel science laureates are really exceptional men. If one can measure such things, they must be about twenty to forty times as creatively productive as the average scientist, whose output over an average lifetime is only about five published papers. The men who become Nobel Prizewinners, according to a study made by Harriet Zuckerman, the Columbia sociologist, publish almost that many in a year! She matched (in terms of age, specialization, and conditions of research) the performance of the American laureates in science with an equal number of excellent scientists—active but nonlaureate—selected from the roster of American Men of Science. This is what she found: the average American laureate publishes about four papers a year; the others publish about three papers every two years. The statistics also show that the output of the laureates fell off after the award was made, by an average of a third within five years. Actually, the falloff for the laureates is about three times as severe as that for their less eminent colleagues of the same age.
Why this falloff?
One answer is that their new celebrity makes so many demands on them that they have less time for research. But research men make their own time, and the only ones who accept too many invitations are those who want to accept them; and since they know what the price of distraction is, their very acceptance is part of the falloff pattern, not the cause.
Another quick answer is that once these men have attained success, there is no further reason to work so hard. But a drive for “success” was never the force that kept them going. By and large, men work at research because that, more than anything else, is what they want to be doing.
Recently, in Paris, I was visiting the Pasteur Institute, and in a talk with Jacques Monod, the 1965 laureate in medicine and physiology, he happened to mention that during the war his research, absorbing as it was, had to be used as a cover for underground activities during the German occupation. While he was hard at work on his experiment, behind the apparatus in neighboring rooms were illegal printing presses, forbidden newspapers, and weapons. One day, catastrophe struck: one of the men in his group was killed, another captured by the SS. Monod was ordered to go underground at once, which meant walking out of the Sorbonne, not returning to his apartment, taking another name, and staying away from any part of Paris where he might be known.
He became a full-time underground worker. Every day, he faced the danger of being shot. Yet he missed his research so severely that in whatever time he could find, he smuggled himself into the Pasteur Institute to continue his bacteriological experiment in a corner of Lwoff’s lab. At this point in Monod’s story, I interrupted.
“Even if you had finished the research, you couldn’t have published it.”
“Naturally not,” he said.
“Nor did you have any idea that you would live long enough to finish the research, did you?”
“Your idea of a rest from risking your life twenty-four hours a day was to run an even greater risk for a few hours by going where you were known—without the slightest chance you’d ever get anything out of it in terms of prestige or recognition. Why did you do it?”
Monod is a man with a finely proportioned, highly expressive Gallic face. My question astonished him; but there was something I wanted him to put into words, and so I waited.
“Because,” he said at last, almost helplessly. “That’s what I wanted to be doing—that’s what my life was all about!”
“You know, I could make $2000 a week, if I wanted,” Poly Kusch remarked to me one day at lunch at the Columbia Faculty Club some years after he had won the Nobel Prize. He had finally grown into his angular face and was an impressive-looking man. He had also become a brilliant teacher. “That’s more money than my father ever made in a year, but I’d rather stay here and teach.”
Now, $2000 a week is a lot of money for a professor, but literally thousands of American men today—in industry, advertising, finance, fashion, and entertainment—make $2000 a week, and scarcely one of them is a man of any distinction whatsoever, while Kusch, to be worth that much money, had to attain the highest prize in the world’s most difficult science. Any man seeking “success” in the general sense of the word would have to be a fool even to think of picking the life of a research scientist as the road. No, “success” is all very pleasant, but it cannot be the spur for the really creative man whose mind is a churning sea where fragments of ideas, half-perceptions, and partial insights keep welling up to the surface of consciousness. He can neither turn the flow on nor turn it off. All he can do is pick and choose among the ones that seem most fruitful to follow. He works because he can’t not work. When he does stop working, it is because something very deep within him has been turned off, either shattered or put to rest.
In some laureates, the change is so palpable that they become almost different men. Ernest Lawrence, who invented the cyclotron in 1929 at the age of twenty-eight, very quickly became famous. He was a hard-driving, round-the-clock worker who gathered about himself an army of assistants and graduate students on whom he continually rode herd to see that tempo was maintained. Once, in impatience, he fired someone on the spot who had been moving too languidly, only to find that it was a telephone repairman sent in to do a job. Everyone under Lawrence had to work for Lawrence or in the direction of his ideas. Yet once he had won the award in 1939 at the age of thirty-eight, the change in him was so marked that it was possible for a newcomer to the lab, Emilio Segrè, to say: “Lawrence? Why he left me strictly alone to my work!”
Segrè himself is a man who was to undergo the identical metamorphosis. He was a former student and brilliant collaborator of Fermi’s from the Rome days. In 1938, he came to the United States as an anti-Fascist, and in the world of American science very quickly got himself a reputation as a man of high energy, drive, and contentiousness, along with a low threshold for excitability. After the war years at Los Alamos, he returned to Berkeley to join and help lead the work on the big new high-energy accelerator. He discovered the antiproton. I remember remarking to him once that the current availability of funds must make for independent research for young men. I made him laugh. “Money, yes! Independence, no! At least not in high-energy physics. An ambitious young scientist has got to get himself into someone else’s group and work on his boss’s problems. There’s too much competition for machine time. Nobody’s going to take a chance on a young fellow and then have to say that a million dollars was wasted!”
Segrè the dynamo was awarded the prize in 1959. Some ten years later, when I was in England at the Rutherford High Energy Laboratory at Harwell, a young British scientist who had spent time as a visiting researcher at Berkeley only the year before said to me: “I was in the Segrè group out there. I’m told he was quite a tough cookie in his younger days, but since he’s won the Nobel Prize, he’s become positively benevolent. . . . A competitive atmosphere out there?” He laughed at my question. “I tell you not! . . . I found it all very dead . . . I was able to move in with my own ideas, take hold of things, and come out with a very successful experiment.”
On receiving the telegram which the Nobel committee sends out to each award winner before the announcement in the press, the new laureate can feel many things. Exultation, certainly; but very often something else. Rabi told me that T. D. Lee, the Chinese-born scientist who shared the 1957 prize with his countryman C. N. Yang when they were both in their early thirties, received the news with acute terror. “My God!” he said. “What happens now to the rest of my life? What comes after this?” One of the things that happened was that between him and Yang, who had been his childhood friend in China, then devoted collaborators in the Institute for Advanced Study at Princeton, there developed a coldness that has never been explained to any outsider, and they stopped working together.
It was very different for Maria Goeppert Mayer, laureate for nuclear physics in 1963, the only woman theoretical physicist ever to be honored. “To my surprise, winning the prize wasn’t half as exciting as doing the work itself,” she said to me with some perplexity. “That was the fun—seeing it work out!” Even the memory of the lack of elation seemed to sadden her; yet her achievement was all the more remarkable because she had done her work when she was well into her forties and she had only recently come into the field of physics from chemistry, and most of all because she was a woman.
For Lee, terror; for Goeppert Mayer, sadness; for Frederick Soddy, pain—because the prize was going to someone else. As far back as 1898, the young New Zealand physicist Ernest Rutherford was working at McGill University in Montreal on the recently discovered world of radioactivity, which was one of wonder and confusion. He was twenty-seven. Two years later he collaborated with another McGill scientist, a brilliant English chemist of twenty-three, Frederick Soddy. The two young men published a series of papers of fundamental importance resulting in the general theory of radioactive disintegration, which attracted immediate attention by its almost sensational statement that chemical transmutation of the elements was an actuality that had been going on since the beginning of the world. The papers of Rutherford and Soddy were quoted everywhere. Soddy finished his term of appointment at McGill and returned to England to help Sir William Ramsay, the discoverer of helium, experimentally establish the crucial fact that the mysterious alpha ray given off by radioactive substances was really ionized helium. Ramsay and Soddy proved the identity. Ramsay received the Nobel Prize in 1904 for his discovery of the so-called “noble” gases: helium, argon, krypton, and neon—with no mention made of Soddy’s contribution. Shortly after, in 1908, Soddy’s other collaborator, Rutherford, now back in England too, also received the prize—again with no mention of Soddy’s part in the work. Soddy was deeply wounded. He was not the sort of man to consider himself the junior partner in the McGill work, and actually had in his possession a testimonial written on his behalf by Rutherford in 1904 that listed all the important advances made in the collaboration and added, “The work published by us was joint work in the full sense of the term.” Soddy in the beginning had to teach Rutherford the chemical techniques that were required.
Also, he felt that he had been the one who had first thought of transmutation. Disappointed as he was, he continued work in the nuclear field. The years passed. In 1913, Soddy was finally able to clarify many problems by inventing the idea of chemical isotopes. In 1921, the prize was finally given to him, and yet it was for the early work on radioactive transmutation with Rutherford that he wanted recognition. “In the old days, it had always been Rutherford and Soddy—Rutherford and Soddy—but now it’s just Rutherford, wherever you go!” he said bitterly.
Soddy had great ability, and he would have looked even more gifted if it weren’t for the blinding glow given off by his contemporary Rutherford, who had that magic combination of luck, vitality, and brilliance which makes certain men seem destined for achievement and recognition the instant they achieve manhood. They are always at the right place at the right time with the right talent. “Rutherford, you’re a lucky man, always at the crest of the wave!” his biographer, A. S. Eve, once said to him, and Rutherford’s retort was, “Well, I made the wave, didn’t I?”
Rutherford was such a man that neither Nobel Prize nor earthquake could diminish or even halt his effusive creativity. He was the first to realize the nuclear nature of the atom, the first to show that nuclear transmutation could be induced. He was big, raw-boned, loud-voiced. His capacity for enjoyment was prodigious. He loved scientific ideas that worked out; he loved his laboratory; he loved recognition; he laughed when the Nobel Prize was awarded to him at the age of thirty-seven because the citation was for “work in chemistry”; and he loved being made a lord—Lord Rutherford of Nelson. And his “boys” were his too, because, literally, he turned out Nobel laureates by the dozen.
In 1932, his “boy” James Chadwick barely beat Frédéric Joliot and his wife, Irène Curie, of the Institut du Radium to the discovery of the neutron. Rutherford, now in his sixties, insisted that Chadwick get the Nobel Prize for it. “But what about Joliot? Shouldn’t they share the prize?”
Rutherford pounded the table: “I want Jimmy to have it—unshared!”
“And what are we to do about Joliot? Just ignore him?”
Rutherford waved his pawlike hands. “That boy? Let me tell you, Joliot’s so brilliant that before this year is out, he’ll discover something so new and remarkable that you’ll be able to give him a prize for that!”
Rutherford proved to be right. Within months, the Joliots discovered that artificial radioactivity could be induced by alpha-particle bombardment. In 1935, therefore, “Jimmy” Chadwick was awarded the prize for physics—unshared; while Irène and Frédéric Joliot were given the award in chemistry—“for their synthesis of new radioactive elements.” To Rutherford, even politicking and arranging the dispensation of Nobel Prizes were all great fun. Right up to his death, though, he believed that all the talk of eventual production of nuclear energy was “all moonshine.” He died in 1937, just two years before that one great miscalculation of his scientific life was revealed by the experiment of a former student, a man whom he himself had introduced to nuclear chemistry back in the early days at McGill—Otto Hahn of Germany.
Einstein was another Nobel laureate who did not believe in the possibility of the release of nuclear energy until the experimental evidence was incontestable; but it was one of the few ways in which Einstein was not unique. Not in our time has there been a creativeness so supremely rich. What is remarkable is that the university where he took his first degree didn’t even consider him promising enough to offer him a minor post on graduation. He had to work in the Patent Office in Bern to earn a living; and while there, in his early twenties, he began his prodigious inventiveness. In 1905, at the age of twenty-six, he published three different papers in three different fields of physics, each so profoundly original that each one is considered among the germinal papers in the fields he treated. The special theory of relativity was one of the three papers. On publication, no one reacted, no one responded. Absolute silence and indifference. Not until four years later, in 1909, did any university offer him an opening, and true recognition started to explode only in 1913. Still, the Nobel Prize was not given to him until 1922 (for the year of 1921), and then not for his theory of relativity.
If science was “fun” to Rutherford, to Einstein it was exaltation. The first insight into relativity was said to be such a piercing experience for him that when he was finished with his calculations, he had a nervous collapse for a few weeks. “Well-being and happiness are such trivial goals in life that I can imagine them being entertained only by pigs.” Like Rutherford, he was already so celebrated and decorated by the time the Nobel Prize was given to him that it could not possibly affect that creativeness that came from so deep a source and flowed with such majestic strength. Only time and the physical subversions of age could dim him. His last years at Princeton made the Institute for Advanced Study a sort of shrine for physicists. At lunch one day, when Julian Schwinger was in his mid-thirties, he told me of his first meeting with Einstein, who was his idol.
Since leaving Columbia, Schwinger had matured and attained the celebrity we had all predicted for him. During the war, he had developed powerful mathematical tools for radar, and afterward he had been made full professor of physics at Harvard at twenty-nine, the youngest man ever to have achieved that position. But our once shy, carelessly dressed fellow graduate student was now jolting the sensibilities of his colleagues and students at Harvard with a very un-Cambridge Cadillac convertible and a taste for suits more smartly tailored than the shapeless, unwaisted, narrow-shouldered style affected by university types. To listen to some of them talk about him, one would have thought that a young George Raft had come to town, but Schwinger was still self-effacing in his manner.
“I had always dreamed of meeting Einstein ever since I was about twelve years old,” he told me. “But I wanted to be introduced to him only after I had done something he would know about; something important enough for him to respect. Rabi kept asking me to go down to Princeton with him whenever he went, and I kept making excuses. Finally, though, I did that piece of work on the self-energy of the electron; and Rabi told me that I was to be given the first Einstein Award for it, to be granted by Einstein himself! So there it was. The $10,000 grant that went with it was fine, but more important than the money was that I would finally be presented to Einstein on terms more dramatic than I had ever dared dream about. Well, the day came, and I got down to Princeton only just in time for the ceremonies, so I went directly to the auditorium. Rabi made the introductory speech, outlining the work I had done, and at last came the moment of the actual presentation of the award, the moment I had awaited for more than twenty years. Einstein rose slowly, waiting for me to approach, and when I went up to him, I saw it was all too late. He was too old! He hadn’t understood a word Rabi had said. He didn’t know who I was; or why I was standing there; nor was he at all clear about what was happening around him. I was shaking hands with a sick, bewildered, empty old man. It was heartbreaking to see him in such a state. The man I had wanted to meet, the man I had revered, must have died quite a while before. As soon as I could, I got off by myself and just walked. I suppose for the first time I had a true sense of the tragedy of age. And perhaps that’s why I went out and blew part of the money on that car. I had always thought vaguely in the back of my mind that it might be fun to have one like it someday, and suddenly there I was asking myself: why wait? For what? All the life there is, is now!”
Still, why the disastrous falloff in production on the part of the most creative men in their fields? According to the sociological study referred to before, there does appear to be at least one answer, which is this: a man’s life is distorted by the award of a Nobel Prize in direct proportion to the extent to which he has not achieved eminence up to that time. If a man’s accomplishments are already fully recognized by his peers, the Nobel Prize generally comes as only the most lustrous of an already large number of honors. When I was recently in Heidelberg, I asked J. H. D. Jensen, who won the Nobel Prize in 1963, if the award changed his life at all. He shrugged off the question, and said: “By the time it came, it didn’t really matter very much. The big moment for me had come years before when I learned that Fermi had put my name in nomination. I didn’t get it that year, but I didn’t really care. It was Fermi’s regard that was the ultimate honor for me, not the medal.”
On the other hand, if, before winning the prize, the man has received very few, if any, of the signs of the scientific world’s recognition of the worth of his work, the sudden rise to stardom can completely distort the pattern of the rest of his life. Yet while the statistics plainly back up this assertion, it must be true only on the average for men of comparatively slender creativity who may in the course of a lifetime achieve only one brilliant breakthrough. Men like Einstein, Rutherford, Fermi, and other giants, who are bigger than the prize, can win it at any time of their lives, take it in their stride, and go on continuing to be fruitful; while Roentgen and others like him who are smaller than the prize are overwhelmed by it—a heavy crown is only for very strong kings.

Saturday, March 30, 2013



Steve Jobs Commencement Address 2005-Stanford University
I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I've ever gotten to a college graduation. Today I want to tell you three stories from my life. That's it. No big deal. Just three stories.
Photo: Associated Press. Steve Jobs speaks at graduation ceremonies at Stanford University, in Palo Alto, Calif., on June 12, 2005.
The first story is about connecting the dots.
I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?
It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: "We have an unexpected baby boy; do you want him?" They said: "Of course." My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.
And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents' savings were being spent on my college tuition. After six months, I couldn't see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn't interest me, and begin dropping in on the ones that looked interesting.
It wasn't all romantic. I didn't have a dorm room, so I slept on the floor in friends' rooms, I returned coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:
Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn't have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating.
None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.
Again, you can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.
My second story is about love and loss.
I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents' garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.
I really didn't know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down - that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.
I didn't see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.
During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world's first computer animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple's current renaissance. And Laurene and I have a wonderful family together.
I'm pretty sure none of this would have happened if I hadn't been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don't lose faith. I'm convinced that the only thing that kept me going was that I loved what I did. You've got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven't found it yet, keep looking. Don't settle. As with all matters of the heart, you'll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don't settle.
My third story is about death.
When I was 17, I read a quote that went something like: "If you live each day as if it was your last, someday you'll most certainly be right." It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: "If today were the last day of my life, would I want to do what I am about to do today?" And whenever the answer has been "No" for too many days in a row, I know I need to change something.
Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure - these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.
About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn't even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor's code for prepare to die. It means to try to tell your kids everything you thought you'd have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.
I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I'm fine now.
This was the closest I've been to facing death, and I hope it's the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:
No one wants to die. Even people who want to go to heaven don't want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life's change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.
Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma — which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.
When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960's, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.
Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: "Stay Hungry. Stay Foolish." It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.
Stay Hungry. Stay Foolish.
Thank you all very much.
Copyright 2012 Dow Jones & Company, Inc. All Rights Reserved


Ayn Rand was a brilliant woman. I have always loved this quote.

"My philosophy, in essence, is the concept of man as a heroic being, with his own happiness as the moral purpose of his life, with productive achievement as his noblest activity, and reason as his only absolute. (appendix to 'Atlas Shrugged')"

Friday, March 29, 2013


This Korean artist is fantastic. He makes his work from recycled tires. Check out his website or look him up in Google Images. Yow!


This is an interesting article about an interesting man. Another type of genius.

New York Times
March 27, 2013

Is Giving the Secret to Getting Ahead?


Just after noon on a Wednesday in November, Adam Grant wrapped up a lecture at the Wharton School and headed toward his office, a six-minute speed walk away. Several students trailed him, as often happens; at conferences, Grant attracts something more like a swarm. Grant chatted calmly with them but kept up the pace. He knew there would be more students waiting outside his office, and he said, more than once, “I really don’t like to keep students waiting.”
Grant, 31, is the youngest-tenured and highest-rated professor at Wharton. He is also one of the most prolific academics in his field, organizational psychology, the study of workplace dynamics. Grant took three years to get his Ph.D., and in the seven years since, he has published more papers in his field’s top-tier journals than colleagues who have won lifetime-achievement awards. His influence extends beyond academia. He regularly advises companies about how to get the most out of their employees and how to help their employees get the most out of their jobs. It is Grant whom Google calls when “we are thinking about big problems we are trying to solve,” says Prasad Setty, who heads Google’s people analytics group. Plenty of people have made piles of money by promising the secrets to getting things done or working a four-hour week or figuring out what color your parachute is or how to be a brilliant one-minute manager. But in an academic field that is preoccupied with the study of efficiency and productivity, Grant would seem to be the most efficient and productive.
When we arrived at Grant’s office on the Philadelphia campus, five students were waiting outside. The first was a student trying to decide between Teach for America and a human-resources job at Google. Grant walked her through some other possibilities, testing her theories about potential outcomes. Although she was aware of the crowd, she seemed to be in no hurry to leave, in part because Grant was so clearly engaged. A second student came in. Then a third. Someone dropped off a bottle of wine to say thank you; another asked for a contact (Grant pledges to introduce his students to anyone he knows or has met, and they shop his LinkedIn profile for just that purpose). For every one of them, Grant seemed to have not only relevant but also scientifically tested, peer-reviewed advice: Studies show you shouldn’t move for location, since what you do is more important than where you do it. Studies show that people who take jobs with too rosy a picture get dissatisfied and quit. If you truly can’t make a decision, consider delegating it to someone who knows you well and cares about you. Is there anything else I can help you with? How else can I help? He was like some kind of robo-rabbi.
Grant might not seem so different from any number of accessible and devoted professors on any number of campuses, and yet when you witness over time the sheer volume of Grant’s commitments, and the way in which he is able to follow through on all of them, you start to sense that something profoundly different is at work. Helpfulness is Grant’s credo. He is the colleague who is always nominating another for an award or taking the time to offer a thoughtful critique or writing a lengthy letter of recommendation for a student — something he does approximately 100 times a year. His largess extends to people he doesn’t even know. A student at Warwick Business School in England recently wrote to express his admiration and to ask Grant how he manages to publish so often, and in such top-tier journals. Grant did not think, upon reading that e-mail, I cannot possibly answer in full every such query and still publish so often, and in such top-tier journals. Instead, Grant, who often returns home after a day of teaching to an in-box of 200 e-mails, responded, “I’m happy to set up a phone call if you want to discuss!” He attached handouts and slides from the presentation on productivity he gave to the Academy of Management annual conference a few years earlier.
For Grant, helping is not the enemy of productivity, a time-sapping diversion from the actual work at hand; it is the mother lode, the motivator that spurs increased productivity and creativity. In some sense, he has built a career in professional motivation by trying to unpack the puzzle of his own success. He has always helped; he has always been productive. How, he has wondered for most of his professional life, does the interplay of those two factors work for everyone else?
Organizational psychology has long concerned itself with how to design work so that people will enjoy it and want to keep doing it. Traditionally the thinking has been that employers should appeal to workers’ more obvious forms of self-interest: financial incentives, yes, but also work that is inherently interesting or offers the possibility for career advancement. Grant’s research, which has generated broad interest in the study of relationships at work and will be published for the first time for a popular audience in his new book, “Give and Take,” starts with a premise that turns the thinking behind those theories on its head. The greatest untapped source of motivation, he argues, is a sense of service to others; focusing on the contribution of our work to other peoples’ lives has the potential to make us more productive than thinking about helping ourselves.
“Give and Take” incorporates scores of studies and personal case histories that suggest the benefits of an attitude of extreme giving at work. Many of the examples — the selfless C.E.O.’s, the consultants who mentor ceaselessly — are inspiring and humbling, even if they are a bit intimidating in their natural expansiveness. These generous professionals look at the world the way Grant does: an in-box filled with requests is not a task to be dispensed with perfunctorily (or worse, avoided); it’s an opportunity to help people, and therefore it’s an opportunity to feel good about yourself and your work. “I never get much done when I frame the 300 e-mails as ‘answering e-mails,’ ” Grant told me. “I have to look at it as, How is this task going to benefit the recipient?” Where other people see hassle, he sees bargains, a little work for a lot of gain, including his own.
The message sounds terrific: Feel good about your work, and get more of it done, and bask in the appreciation of all the people you help along the way. Nice guys can finish first! (Now there’s research to prove it.) But I couldn’t help wondering, as I watched Grant race through his marathon day (even one of his mentors admitted, “He can be exhausting”), about the cost of all this other-directedness. If you are devoted to being available to everyone, all the time, how do you relax? How can you access the kind of creativity that comes from not being on task every waking moment? How do you make time for the more important relationships in your life?
As Grant’s office hours came to an end four and a half hours later, he patiently continued offering help until he finally had to close the door and tell a student to try him by phone; he would squeeze him in on his commute or by e-mail. But he would not say no.
The study of job design in the middle- and late-20th century focused on how to improve the drudge work of manufacturing: Grant is credited with reviving the field, shifting the thinking toward the more modern conditions of a service and knowledge economy. He first realized that his ideas about giving at work might actually yield quantifiable results when he was a 22-year-old graduate student at the University of Michigan, and he proposed a study set in a university fund-raising call center. Call centers, even on college campuses, are notoriously unsatisfying places to work. The job is repetitive and can be emotionally taxing, as callers absorb verbal abuse while also facing rejection (the rejection rate at that call center was about 93 percent).
The manager, Howard Heevner, did not have a lot of faith that Grant would be able to motivate his student-employees. He had already tried, in a previous job at a call center, the usual incentives — cash prizes, competitive games — and was generally unimpressed with the results. But Grant had a different idea. When he was an undergraduate at Harvard, he took a job selling advertisements for the travel guide series “Let’s Go,” but he was terrible at it. “I was a pushover,” he says in “Give and Take,” “losing revenues for the company and sacrificing my own commission.” Then he met another undergraduate whose job at “Let’s Go” was helping her pay her way through college. Suddenly the impact of his role became clear to him: without advertising revenues, the company could not make money, which in turn meant it couldn’t provide jobs to students who needed them. With that in mind, he was willing to make a harder sell, to take a tougher line on negotiations. “When I was representing the interests of students, I was willing to fight to protect them,” he writes. It would not be a mass-market psychology book if every anecdote did not have a dramatic ending: Grant eventually sold the largest advertising package in company history and less than a year later, at 19, was promoted to director of advertising sales, overseeing a budget of $1 million.
As a psychology major, Grant always hoped to do a study on the “Let’s Go” staff, in which the books’ editors and writers would meet with or read letters by people whose travels had been enhanced by their work. Would knowing how the books benefited others inspire them to work harder? Now, at the call center, Grant proposed a simple, low-cost experiment: given that one of the center’s primary purposes was funding scholarships, Grant brought in a student who had benefited from that fund-raising. The callers took a 10-minute break as the young man told them how much the scholarship had changed his life and how excited he now was to work as a teacher with Teach for America.
The results were surprising even to Grant. A month after the testimonial, the workers were spending 142 percent more time on the phone and bringing in 171 percent more revenue, even though they were using the same script. In a subsequent study, the revenues soared by more than 400 percent. Even simply showing the callers letters from grateful recipients was found to increase their fund-raising draws.
When Grant went back and talked to the callers about their improvement, many actively discounted the possibility that the brief encounter with a scholarship student helped. “Several of them were stunned,” Grant said. “Their response was, ‘Yeah, I knew I was more effective, but that was because I had more practice,’ or, ‘That was because I had a better alumni pool in that period — I got lucky.’ ” Eventually, having replicated the test five times, Grant was confident that he had eliminated other explanations. It was almost as if the good feelings had bypassed the callers’ conscious cognitive processes and gone straight to a more subconscious source of motivation. They were more driven to succeed, even if they could not pinpoint the trigger for that drive.
The study quickly raised Grant’s profile in his field, partly because it relied on hard data: dollars, as opposed to manager assessments or self-reports. “I don’t know the last time there was a study in our field that had such striking results,” says Stuart Bunderson, a professor of organizational behavior at Washington University. “In terms of an intervention that has practical significance and moves the needle on employee behavior — you don’t see them that often.” The intervention was also a manager’s dream: fast and practically free.
Over the years, Grant has followed up that study with other experiments testing his theories about prosocial motivation — the desire to help others, independent of easily foreseeable payback. In one study, Grant put up two different signs at hand-washing stations in a hospital. One reminded doctors and nurses, “Hand hygiene prevents you from catching diseases”; another read, “Hand hygiene prevents patients from catching diseases.” Grant measured the amount of soap used at each station. Doctors and nurses at the station where the sign referred to their patients used 45 percent more soap or hand sanitizer.
These studies, two of Grant’s best known, focus on typically worthy beneficiaries: needy students and vulnerable patients. But some of his other research makes the case that prosocial behavior is as applicable in corporate America as it is in a hospital or a university. “Think of it this way,” he said. “In corporate America, people do sometimes feel that the work they do isn’t meaningful. And contributing to co-workers can be a substitute for that.”
Take, for example, Grant’s study of workers at Borders who contributed to an employee-beneficiary fund managed by the staff, with Borders matching donated funds. The money was set aside for employees in need — someone facing a pregnancy that would put a strain on their finances, for example, or the funeral of a loved one. Interestingly, Grant found that it was not the beneficiaries who showed the most significant increase in their commitment to Borders; it was the donors, even those who gave just a few dollars a week. Through interviews and questionnaires, Grant determined that “as a result of gratitude to the company for the opportunity to affirm a valued aspect of their identities, they developed stronger affective commitment to the company.”
The study is uplifting and troubling at the same time: even Grant acknowledges the possibility of corporations playing off their employees’ generous impulses, as a sop to compensate for other failings — poor pay or demeaning work. (After all, if the employees at Borders had better benefits and pay, they might not have needed the emergency fund.) Jerry Davis, a management professor who taught Grant at the University of Michigan and is generally a fan of his former student’s work, couldn’t help making a pointed critique about its inherent limits when they were on a panel together: “So you think those workers at the Apple factory in China would stop committing suicide if only we showed them someone who was incredibly happy with their iPhone?”
Grant’s answer to these questions is academic: he tries to understand how these mechanisms function but does not necessarily advocate implementation. “I am also skeptical about the motivations of corporations,” he said. “My concern is ultimately for the success and well-being of people in organizations. To the extent that individual and group accomplishments and quality of work life contribute to profits, I’m happy, but that’s not my primary goal.”
For all his general interest in psychology, Grant doesn’t seem interested in digging too deeply into the origins of his own psyche. About his all-consuming desire to help, he says simply: “My mother has what she calls the fix-it gene. Maybe I just inherited it.”
He grew up in the suburbs of Detroit, raised by a lawyer, his father, and a teacher, his mother. He was an upbeat boy, though socially awkward and burdened by numerous food allergies and strong aversions — to haircuts, to bluejeans, to chocolate. He felt things deeply; those aversions were matched by equally consuming passions. An aspiring basketball player, he would not allow himself to go inside until he made 23 consecutive free throws, even if it meant missing dinner. (That he never made the high-school team is the one failure that still pains him.) On weekends, he played video games for so many consecutive hours — 10 was not unusual — that his mother called the local paper to complain about what the paper called, in the subsequent article, “The Dark Side of Nintendo.”
Grant started significantly losing his hair in his 20s, as if his head were trying to keep pace with his overall precociousness. Now almost entirely bald, he has a striking, monklike look. Though he comes across as charming and agreeable, there are still traces of the awkward boy he says he once was, a hint of discomfort in the smile he gives a student he runs into unexpectedly, a longstanding dread of parties (“unless they like psychology or magic tricks, in which case I’d come alive,” he said). He is aware of his own introverted tendencies, and some of his research involves the strengths of introverts at work.
For the most part, Grant has more than compensated for the shyness he felt growing up. Once phobic about speaking in public, he forced himself to lecture as much as he could as a graduate student, handing out feedback forms so he could methodically learn from his weaknesses. He developed strategies for socializing comfortably, even though, he said, “I feel uncomfortable when I’m in a situation and I don’t know what people want or expect of me.” Giving, he eventually realized, was a reliable way of mediating social interactions.
On the day I followed Grant as he hurried to his office hours at Wharton, I read something on his face that registered as more than just busyness; he seemed anxious. I wondered whether Grant was driven by the desire to help or a deep fear of disappointing someone.
“That is one astute observation!” Grant said when I asked him about that by e-mail. (With Grant, every observation is an astute one.) Grant often starts his research with observations about himself — “me-search,” they call it in the field — and he had conducted a study trying to determine which of those two impulses was more motivating. The answer turned out to be a combination of the two. “Givers motivate themselves to avoid complacency by focusing on the benefits to others if they succeed and worrying about disappointing them if they fail,” Grant wrote.
One of Grant’s roommates, he went on, once joked that he had a productive form of O.C.D. “He noticed that when I was anxious about something, I had a habit of throwing myself single-mindedly into tasks in which I felt responsible to others,” he said. “A few days later, my mentor, Brian Little, sent me an article by Ian McGregor, one of his doctoral students, who studied ‘compensatory conviction’: anxiety in one domain motivates people to dive into passionate pursuit in another. It was one of those crystallizing moments that triggered a ‘Yes, I want to be a psychologist!’ reaction — I was fascinated by how closely his theory and findings mapped onto my own experience.”
It’s not hard to imagine a pop-psych interpretation of Adam Grant: that his generosity might have its roots in some kind of need — maybe a need he feels, even more than the rest of us, to be liked. Or perhaps that he is channeling his extreme ambition into a feel-good form of achievement. Productive and happy, Grant could even be seen as a paradigm of Freud’s definition of mental health: aggression sublimated into work.
But he has never put much stock in psychoanalysis — if the work is not data-driven, he’s skeptical. “I think a lot of it is baggage that goes back to Freud, and Freud would always say that whatever is going on with you can be traced back to something that happened early in childhood with your mother,” he told me, by phone, as he was driving to work one day. “You can either accept that or be in denial. You can’t win!” He would rather simply understand himself as someone who gets a lot out of giving, then harness that feeling, study it and see how the mechanisms involved can inspire others to succeed.
One night Grant forwarded me a grateful e-mail from a student whose life, the student said, changed because of some advice Grant gave her. I commented that most people would be thrilled to receive one note like that in a lifetime. “I get several dozen a week,” Grant said. He agreed to send some my way. That evening, at around 8:30, the e-mails started coming — Thank you for our conversation the other day and for your genius. . . . I couldn’t have done this without you. . . . I cannot thank you enough for your time and insight. . . . I’m thrilled. And I have you to thank. . . . After the first 10, I was impressed; when they kept arriving, I was surprised. On and on, until almost 11, my e-mail kept pinging; when I awoke the next morning, I saw that he had forwarded me 41 e-mails from the preceding week, each one of them numbered for my convenience.
Was this compulsive behavior? “Not really,” Grant said. “I would see it as goal-oriented and focused.” He said the question had generated a new research idea for him: “How Prosocial Behavior Can Mitigate O.C.D. Tendencies.”
Grant’s book, incorporating several decades of social-science research on reciprocity, divides the world into three categories: givers, matchers and takers. Givers give without expectation of immediate gain; they never seem too busy to help, share credit actively and mentor generously. Matchers go through life with a master chit list in mind, giving when they can see how they will get something of equal value back and to people who they think can help them. And takers seek to come out ahead in every exchange; they manage up and are defensive about their turf. Most people surveyed fall into the matcher category — but givers, Grant says, are overrepresented at both ends of the spectrum of success: they are the doormats who go nowhere or burn out, and they are the stars whose giving motivates them or distinguishes them as leaders. Much of Grant’s book sets out to establish the difference between the givers who are exploited and those who end up as models of achievement. The most successful givers, Grant explains, are those who rate high in concern for others but also in self-interest. And they are strategic in their giving — they give to other givers and matchers, so that their work has the maximum desired effect; they are cautious about giving to takers; they give in ways that reinforce their social ties; and they consolidate their giving into chunks, so that the impact is intense enough to be gratifying. (Grant incorporates his field’s findings into his own life with methodical rigor: one reason he meets with students four and a half hours in one day rather than spreading it out over the week is that a study found that consolidating giving yields more happiness.)
The studies are elaborate, the findings nuanced — but it is easy to walk away from the book forgetting the cautionary tales about people who give too much and remembering only the wash of stories about boundless generosity resulting in surprising rewards: a computer programmer who built a Web site at no cost for music fans (one of whom turns out to be an influential figure in Silicon Valley); a financial adviser who travels to take on a client thought to be impoverished (only to find that person sitting on a significant fortune); the writers who start out working free on a project for a friend (and somehow end up among the most successful in Hollywood).
I had assumed that Grant, and the other examples of extreme givers in his book, were simply superhuman in one way or another — not only in the acute empathy that makes giving so rewarding for them but also in their unusual focus and stamina and mental-processing speed, traits that allow them to bend time and squeeze in more generosity than the rest of us. Grant, clearly, has some advantages beyond his propensity to help: more than one of his colleagues told me, for example, that when they cannot find the citation for a particular paper, they simply e-mail Grant directly, who is more reliable than Google and almost as fast (his childhood friends called him Mr. Facts).
But Grant believes that in terms of giving, we all have the same muscle; it’s just that he and the other givers in his book have exercised it more. In “Give and Take,” he cites a study that found that most people lose physical strength after enduring a test of will, like resisting chocolate-chip cookies when they are hungry. Typically, the study’s subjects could squeeze a handgrip for only 25 seconds after an exercise in willpower. But one group distinguished itself, squeezing the grip for 35 seconds after the test of will. They were people who were on the giving end of the other-directedness scale. “By consistently overriding their selfish impulses in order to help others, they had strengthened their psychological muscles, to the point where using willpower for painful tasks was no longer exhausting,” writes Grant of the study, conducted by researchers at Northwestern University. It seems too simple to assume that Grant just happens to be capable of great discipline across all facets of his life; all those exercises in will, he would argue, feed each other, with one making the others possible.
I like to think I am a typically helpful person, but after reading Grant’s book, I found myself experimenting with being more proactive about it. I started ending e-mails by encouraging people to let me know if I could help them in one way or another. I put more effort into answering random entreaties from students trying to place articles. I encouraged contacts seeking work or connections to see me as a resource.
And I did notice that simply avoiding the mental lag of deciding whether to help or not was helpful. At a minimum, Grant’s example presents a bright-line rule: Unless the person on the other end is a proven taker, just do it — collaborate, offer up, grant the favor.
The first time I exchanged those e-mails, I usually felt good; after the second exchange on a given topic, I thought perhaps I had done my duty. But I noticed that every offer of help I initiated or granted engendered four or five e-mails, at the end of which I sometimes felt surly and behind on my work — and then guilty for feeling that way. Worse, those exchanges often even ended with the person on the other end wanting to meet for coffee. Coffee! Now I struggled to find a way to say, gracefully, that there was no way I could meet for coffee — not this week or next or the week after that, because there are only so many hours in the day, and if I do not get home in time to make dinner, my children will dine on Pirate’s Booty and Smarties, which would not make me feel helpful or productive or good.
Children. It must be said that in the middle of a national debate about flexible hours and telecommuting, there is precious little in Grant’s book about work and family balance. The division of labor in Grant’s own marriage is very traditional; his wife, who has a degree in psychiatric nursing, does not work outside the home, devoting her time to the care of their two young daughters and their home. Grant would be an extraordinary giver under any circumstances; but it can only help that he doesn’t have to worry about running to the grocery store or renewing the car registration.
“Sometimes I tell him, ‘Adam — just say no,’ ” his wife, Allison, told me, referring to the hundreds of requests he gets every day. “But he can’t say no. That’s what he is. That’s his way.”
Grant is devoted to his family — he has dinner most nights at home and takes his daughter to a preschool activity on many afternoons. But he also works at least one full day on the weekend, as well as six evenings a week, often well past 11. Once, when Grant was asked to give a talk on productivity, he confessed to a mentor that for all his research, he was still not sure what he did that was any different from anyone else. It wasn’t exactly a mystery, his mentor told him: He worked more. “I made a commitment to talk about that more,” Grant said. He did not mean to suggest that everyone should work on weekends; he wanted them to be aware that they were making a choice, maybe even one they felt good about.
“The way I see it, I have several different roles,” he told me: teacher, scholar, adviser, friend, to name a few. “I’d be concerned if any of those roles took more of my time than my family.” Grant, of course, has conducted a study investigating whether giving behaviors at work translate into happiness at home. He found that people who felt they had contributed to others’ well-being at work did not always feel great at the end of the workday; but they usually did by bedtime, especially if they had reflected about their contribution in the intervening hours. It turns out that bringing your work home with you can be beneficial after all — if you’re thinking about it the right way.
A skeptic might read Grant’s book and conclude that extreme givers are just matchers who are in it, maybe even subconsciously, for the long run. Eventually, in ways that are predictable and unpredictable, the bounty returns to them. Grant’s giving instincts might be reflexive, but they do clearly contribute to his success. “The entire world feels like it owes him a favor — including me,” says Justin Berg, a doctoral candidate who studies creativity at Wharton and who has collaborated with Grant. “People rush at the opportunity to work with him.” And one round of giving enables another: when Grant calls on a work contact and asks her to meet with an undergraduate seeking work, chances are that contact is more than happy to enable Grant’s favor, because she has already been the beneficiary of more than one from him herself. The path to success is filled with people helping to clear the way.
From the standpoint of creativity, Grant’s undiscriminating helpfulness also reaps professional benefits, Berg says. “The best ideas occur to people who are touching multiple worlds and domains. And in our field, he’s at the nexus of a lot of them.”
Because one study found that old friends and connections can be even more valuable as resources than current ones — because they intersect with different worlds and therefore have more fresh ideas — Grant has a tickler built into his calendar reminding him, once a month, to get in touch with a contact he likes but with whom he has temporarily lost touch. And he is highly efficient about his giving: he virtually never says no to the five-minute favor, something that will help someone out — an introduction, a quick suggestion — but cost him very little, relative to impact.
We were sitting in Grant’s office one afternoon talking about efficiency, when he said: “The truth is, I don’t care how many articles I publish or how many words I write. Productivity is an imperfect way of indexing how much I’m contributing, how I’m using my limited time to make the most difference.”
It wasn’t until I was transcribing the conversation a few days later that I realized that when he referred to his limited time, he wasn’t just talking about a busy schedule; there was a more existential tug in the phrase. I brought it up with him by phone.
“It’s the kind of thing I almost never talk about,” Grant said. “But my responsibility is to be open.” Mortality, he said, was the one subject that gave him something like panic attacks. He had always felt that way, since he was a brainy, sensitive kid playing basketball in his driveway, staring at the sun, suddenly terrified of what would happen when it burned out. That was why he first wanted to be a scientist — before he realized biology bored him and he would never reinvent physics — so he could help figure out how to extend life, or at least design the spacecraft that he is sure, even now, will take us to safer planets if this one runs dry. Mortality, he said, is “something I can’t fix. I can’t do anything with or about it.” He can’t let himself think about it too much; he has lost days at a time to his anxiety, “to the point that it’s the equivalent of extreme physical pain.”
It struck Grant as odd that no one had ever tried to figure out how the awareness of death motivates people’s behavior at work, and in 2009, he published a paper trying to understand the link between mortality and productivity: “The Hot and Cool of Death Awareness at Work: Mortality Cues, Aging and Self-Protective and Prosocial Motivations.” The study walks the reader through the fascinating field of death awareness, which measures how people respond to reminders of death, like a news clip about a deadly car crash. When and how, he asked, does the prospect of death become relevant to employees at work? Grant argued that when people’s reactions to reminders of death are “hot” — anxious and panicked — those workers tend to withdraw. But when they are “cool” — more reflective, as in response to chronic reminders, the kind, for example, that firefighters face — those workers would be more likely to “reflect on the meaning of life and their potential contributions.”
Grant wrote the paper, in part, to try to sort out his own hot and cool feelings on the subject. Contemplating the meaning of life doesn’t make him want to relax and work less. “I always go back to William James,” he said. “ ‘The greatest use of a life is to spend it on something that will outlast it.’ A big part of it is being remembered.” Besides, relaxing stresses him out. “For me, in my moments of idleness, I experience the most existential anxiety, so I like that every moment is scheduled, even when it’s having on my calendar that I’m going to watch a television show with my wife. It means my brain is engaged in other things, and it’s not going to be a terrifying evening.”
Grant would be the first to say that he is not purely altruistic — that pure altruism, giving without regard for one’s self-interest, perhaps does not even exist. When he writes those 100 student recommendations, he says, he gets the satisfaction of helping them succeed. But there are other happy byproducts of that work as well: he might end up the beneficiary of those students’ good will later on and possibly inspire them to try to do right by those who will eventually ask them for help. He will also have kept himself busy enough that he won’t have much time to spend agonizing over what happens when he can’t give anymore.
As he left the office after one of our meetings, Grant headed for his car, carrying another gift of gratitude: a twiggy box filled with organic jellies and dried fruit from the Environmental Defense Fund, to which he had recently spoken about how to motivate their fund-raisers.
On the way to the garage, Grant told me the story of a time that someone asked quite a lot from him. “So I got an e-mail out of the blue from a recent Ph.D. who wanted career advice,” Grant said. “And I spoke to him for a while on the phone — twice. But then, after that, he asks me if I could give him comments on his dissertation, and he sends me this thing that was like 300 pages long. It was one of those moments — yikes!”
Grant did not know this academic and was not an expert on the subject. This, I thought, was the long-awaited last straw, an occasion when Grant not only said no but also perhaps found the request itself galling. Surely he did not shun his family, his students, his ultimate Frisbee game, his research and his never-ending list of e-mail requests for the hours that it would have taken him to analyze a 300-page dissertation. Even Adam Grant must say no sometimes.
Grant said that he rarely feels resentful of such requests. “It’s on me if I want to say no,” he said. “I own my guilt.”
He did decide that in this case, the time it would take to read the paper would be excessive — and that indulging the impulse to read it all would be tantamount, in the logic of Grant’s thinking, to letting himself down, flouting his own rules of efficient giving.
“So I just skimmed it for the most important parts,” he said, and gave general feedback on those points. The author then reworked the paper completely and sent it back to Grant to read again. Grant, of course, complied.
“And guess what?” Grant said, breaking out in a smile. “The paper was great!”