Wednesday, April 3, 2013


Another military genius. He was tough, compulsive, abrasive, and driven. I love this guy. I learned of him when I read a biography called The Rickover Effect. The following comes from a biographical sketch:

Hyman George Rickover was born on 27 January 1900 (1898 according to school records) in the village of Makow. At the time this was in the Russian Empire, some 50 miles north of Warsaw. His father was a tailor who emigrated to New York at the turn of the century; several years later he sent for his family to join him in the United States. Hyman Rickover attended school in the Chicago area after his family had moved there in 1908.
In 1918 he entered the US Naval Academy. During the Second World War he served as head of the electrical section of the navy's Bureau of Ships. In 1946 he was assigned to the atomic submarine project at Oak Ridge, Tennessee. He was a major factor in convincing the US Navy that nuclear sea power was feasible, and directed the planning and construction of the world's first atomic-powered submarine, the USS Nautilus, launched in 1954. He was also involved in many more of the US Navy's nuclear-powered ship projects.
Although no Anglophile, he gave great assistance to the Royal Navy as it developed its own nuclear submarine programme. Rickover later became chief of the Naval Reactors Branch of the Atomic Energy Commission and was in charge of the nuclear propulsion division of the Navy's Bureau of Ships.
His extraordinary naval career was marked by controversy. Rickover never commanded a ship - at a time when this was thought essential for people destined for senior rank. He held outspoken opinions and would not tolerate dissent. His single-minded purpose was to drive through his own ideas for nuclear-powered vessels and to block alternative ideas. Ultimately this provided the US Navy with a powerful fleet of nuclear-powered surface vessels and submarines, but probably at the expense of more cost-effective innovative designs.
He was promoted to rear admiral (1953), vice admiral (1958), and finally admiral (1973). In a situation unlikely ever to be repeated, he was listed as on active duty until 1981. He retired the following year.


The devil is in the details, and everything we do in the military is a detail.
Unless you can point the finger at the man who is responsible when something goes wrong then you never had anyone really responsible.
Be ever questioning. Ignorance is not bliss. It is oblivion. You don't go to heaven if you die dumb. Become better informed. Learn from others' mistakes. You could not live long enough to make them all yourself.
Good ideas are not adopted automatically. They must be driven into practice with courageous patience.
Great minds discuss ideas, average minds discuss events, small minds discuss people.

Monday, April 1, 2013


Andreas Gruentzig was a wild, dynamic maverick in medicine. His work changed the face of cardiology and cardiac surgery! His work paved the way for non-surgical treatment of heart disease. You can imagine how this was received by cardiac surgeons! He was also a great teacher and left behind a legacy of acolytes. His life is beautifully documented in "Journey into the Heart" by Monagan.

From the Society for Cardiovascular Angiography and Interventions:
Twenty-seven years ago (1977), in Zurich, Switzerland, Dr. Andreas Gruentzig performed the first coronary angioplasty on an awake human patient. In doing so, he forever altered the role of the cardiologist in treating heart and vascular disease.
As part of a special tribute to “Legends in Invasive/Interventional Cardiology,” SCAI asked Dr. Gruentzig’s colleague and friend Spencer B. King, III, M.D., FSCAI, MACC, to reminisce about the years when he and Dr. Gruentzig collaborated. Here we share some of Dr. King’s recollections.

Dr. King met Dr. Gruentzig in 1976 at a medical meeting in Miami, where Dr. Gruentzig was presenting his animal work. The poster that Dr. Gruentzig showed at the conference is still in Dr. King’s office, a cherished remembrance of a good friend. The two cardiologists began talking about Dr. Gruentzig coming to the United States in 1980 as they traveled by train outside of Zurich. Dr. Gruentzig said that he would like to return to Germany or perhaps move to the United States, perhaps to join the Cleveland Clinic.
When I asked him what he was interested in doing, he said he wanted to foster the technique and that he wanted to be a professor. I pointed out that the Cleveland Clinic was not a medical school and he could not be a professor there. This was a revelation to him, and it kind of set him back.
Shortly thereafter, Dr. Gruentzig visited Atlanta, where Dr. King introduced him to J. Willis Hurst, M.D., chief of medicine at Emory University. What followed was a long and convoluted recruitment process that required obtaining a visa and a state license for someone who did not have the other usual credentials to practice medicine in the United States. Eventually all of these obstacles were hurdled and Dr. Gruentzig decided to join Dr. King and his partner, John Douglas, M.D., FSCAI, at Emory.
Andreas was an incredibly bright and intense guy and very committed to what he was doing, but at the same time he was always open and encouraging of others. That’s what all of his courses that he initially began were about—to share the knowledge. He was active in all aspects of life—work, play, anything. His motto was that if you really wanted something, you should pay the price for it and not worry. He would not have been very good at economizing on anything. He drove fast, he lived fast, and he accomplished a lot in a short period of time.
In 1980, Dr. Gruentzig joined Dr. King’s group as a full professor.
He loved teaching. He attracted students to Emory from all over the world for angioplasty. Probably half of the fellows who were trained in the early 1980s came out of our program. There weren’t other programs that large until about 1983 or 1984.
Dr. Gruentzig spent nearly five years at Emory, from 1980 until his death in 1985. During those years, Drs. King and Gruentzig and their team forged ahead, developing the specialty of interventional cardiology. Following his death, their group conducted the first randomized trial in the field, comparing angioplasty to surgery.
Andreas would be proud of the technical advances the field has accomplished. He would think these were logical steps forward. He was interested in, and had worked on, many of these things. As early as 1985, he was involved with ideas of laser technology and stents, although they were not used until the year after his death.
If he were here today, I think he would be very excited about all of the medical breakthroughs in cardiology because his principal interest was the long-term outcome and ensuring that it would benefit patients. However, I suspect he would be a little depressed about the in-fighting among the subspecialties. To Andreas, it made no difference what discipline you came from. If you were interested and involved and could master the field and techniques, then you should be able to do it.
He would also be disappointed in the lack of reflection on indications. He was always opposed to the “see it–do it” kind of thinking. He felt that you should have careful documentation as proof that what you were doing was going to be beneficial. And he probably would be disappointed that we still have not proved definitively that angioplasty has extended life. He would probably think that by this time we should have been able to do that. These, I think, would be his concerns.
Outside of work, Dr. Gruentzig was a great entertainer, a trait that was passed on to his daughter, who is a stage actress in Europe.
My wife and I went to his cabin in the foothills of Zurich when his daughter was still very young. It was great to get together with his family. His mother would be there as well as his aunt and his wife and daughter. He could do even simple things extremely well. He would concoct a spaghetti dinner, make a salad, and serve some wine. Then he would get everybody together to play music. They would play flutes and other instruments. It was great fun. Andreas had a great sense of humor. At any party, he was always having fun and dancing.
When Dr. Gruentzig first came to Emory, he and Dr. King directed demonstration courses together. The catheterization labs were wired directly into the large auditorium via closed-circuit television. Either Dr. Gruentzig or Dr. King performed the case while the other moderated. Dr. King recalled how quickly Dr. Gruentzig could assess a situation and act in response.
One day during the lunch break of the course, Andreas was leading a group of people who asked him if he would show them the cath labs. At the same time John Douglas was doing a case on a very obese woman. During the procedure, the woman had fibrillated and John defibrillated her. She literally bounced off the table and onto the floor where John was kneeling, carefully keeping the catheter in place. At that moment, Andreas opened the door and came into the lab with about 10 people trailing behind him. Within a second he assessed the situation and, before anyone else could see, he quickly turned the group around, saying “Nothing happening here,” and ushered everyone out.
Dr. Gruentzig enjoyed living in Georgia and eventually owned two homes there, one in Atlanta and another in Sea Island. He also liked flying and owned a twin-engine plane that took him back and forth between Atlanta and Sea Island. It was on one of these trips that Dr. Gruentzig was killed. He was 46 years old.
Andreas once said, “No matter what happens to the technique, I have made one contribution, and that is allowing physicians to work within the coronary arteries of the awake, alert patient.” In other words, no one had ever conceived that you could do any of the things that followed. No one had imagined that intravascular ultrasound or pressure measurements or any of the therapeutic things we do within interventional cardiology were possible. Nobody had conceived of balloons, stents, or lasers being possible. Andreas enabled the entire field of interventional cardiology to be developed.

Sunday, March 31, 2013


Sterling Hayden was an actor. He played the corrupt police chief who was shot in the head by Al Pacino in The Godfather. But he was a lot more: an adventurer, sailor, OSS agent, and much else besides. He defied a court order and sailed to Tahiti with his four children, Christian, Dana, Gretchen and Matthew. He hated acting and always regretted naming names to the House Un-American Activities Committee.

He wanted to be free:

   "They are enmeshed in the cancerous discipline of 'security'. And in the worship of security we fling our lives beneath the wheels of routine - and before we know it our lives are gone.
   "What does a man need - really need? A few pounds of food each day, heat and shelter, six feet to lie down in - and some form of work activity that will yield a sense of accomplishment. That's all - in the material sense. And we know it. But we are brainwashed by our economic system until we end up in a tomb beneath a pyramid of time payments, mortgages, preposterous gadgetry, playthings that divert our attention from the sheer idiocy of the charade.
   "The years thunder by. The dreams of youth grow dim where they lie caked in dust on the shelves of patience. Before we know it, the tomb is sealed.
   "Where, then, lies the answer? In choice. Which shall it be: bankruptcy of purse or bankruptcy of life? ...
   "Somehow it is the male's duty to put the best years of his life into work he doesn't like in order that he may 'retire' and enjoy himself as soon as he is too old to do so. This is more than just the system - it is the credo. It is the same thing that prompted Thoreau to say, in 1839: 'The mass of men lead lives of quiet desperation.'"

His autobiography is quite interesting. He never seemed to know where he belonged: "You were strong enough to rebel - not strong enough to revolt."


Interesting article on Nobel Prize-winning geniuses!

The Atlantic

How Nobel Prizewinners Get That Way

By Mitchell Wilson
When Julian Schwinger came to the Columbia Graduate School of Physics in 1935 at the age of seventeen—five years younger than the youngest of us—he was shy and pudgy, with a schoolboy’s broken complexion; but he had already gone through the most advanced treatises on theoretical physics, quantum theory, and relativity all by himself, as easily and avidly as the rest of us had once gone through Two Years Before the Mast. By comparison, we were illiterates. There was even a rumor that he had published his first scientific paper in the Physical Review at fifteen when he was at Townsend Harris High School. He was at once so obviously in a class by himself that no one bothered to envy him. One thing, each of us assured the others: eventually he would earn a Nobel Prize.
When I say “we,” I mean the group of about a dozen graduate students studying and doing research toward our doctorates, along with a handful of postdoctoral fellows and instructors also in their early or middle twenties. We made up the laboratory population of the department. As it turned out, we were right about Julian. In 1965, he was awarded the Nobel Prize for work in quantum electrodynamics. Also, as it turned out, we proved to have been very poor judges of Nobel Prize material. Sitting right there among us all the time, taking part in our talk and gossip, were three others whom we had passed over completely. The first was one of our research chiefs, I. I. Rabi, who was to win a Nobel Prize in 1944. The second was Polycarp Kusch, a young experimentalist from the Middle West, with large angular movements and a loud assertive voice. He was the Nobel laureate in 1955. The third was Willis Lamb, a tall, thin Californian with a slight squint and a quiet erudition, both in physics and out. In the thirties, Lamb considered himself only as a theoretician—although certainly not then in Schwinger’s class, as far as anyone thought.
Four Nobel laureates out of a group as small as that, at a time when the world population of physicists was over ten thousand, was a remarkably high proportion indeed. All these prizes, though, were still decades in the future. We didn’t know what a genuine Nobel Prizewinner looked like, or even what he did once he had been awarded the prize. From time to time, a few such exalted beings as Harold Urey, Arthur Compton, and Robert Millikan would drop in on us for a public evening lecture, but then they took off again with their radiance unpenetrated.
Our first real contact—certainly my first contact—with a living, breathing, close-enough-to-touch Nobel laureate came in 1938 when Enrico Fermi left Italy with his family, ostensibly to go to Sweden to receive the prize for his work in artificial radioactivity. Instead of returning to Mussolini’s Rome, he kept on going until he came to us at Columbia. He was in his middle thirties at the time. I hoped only that when he’d start giving his lecture on atomic and nuclear physics I wouldn’t open my mouth and make a fool of myself in his seminars. I glimpsed him with awe as he hurried through the Pupin corridors, labs, and offices: a short, quick, long-armed man. His gray eyes looked patient, when they were really only polite. To me, he was already half a god.
About a week after Fermi’s arrival, I was called to Rabi’s office. In those days, Rabi liked to whittle at a small piece of wood as he talked. I had recently finished an apprentice research for him in his molecular-beam techniques, and had passed all my qualifying exams. I came, hoping that he was finally going to put me to work on my doctoral assignment. Instead, he told me he was releasing me from his research group so that I could be free to become Fermi’s assistant. It was as if I had been told I was to report to heaven to sit at the right hand of God. But there was also a nightmare side to all this splendor and that was my feeling that at that particular point of my career I was no more capable of carrying on research physics on the Fermi level and up to the Fermi standard than I was able to walk onstage at the Metropolitan Opera House in the middle of a performance of Tannhäuser and take over the main role. It was the greatest opportunity I had ever had; it was also the most appalling invitation to disaster.
Fermi got to the point the moment I appeared in his office. He asked me what I knew about cosmic rays. I said I knew nothing. He said, no matter, neither did anyone else. He went to the blackboard then and outlined the theory of the experiment he wanted performed, that he wanted us to perform. For the first few minutes, he was remarkably clear. How marvelous it felt to be one of the talented people up here At the Top where life shone! Then everything darkened. He was speaking brilliantly, lucidly, but really to himself, because I no longer understood anything. I kept nodding though; it never occurred to me to ask him to repeat any of the points that I lost. At last, he finished with theory and began to discuss the apparatus I would have to build: pulse-counting circuits, giant Geiger tubes, and appropriate vacuum systems. I felt a little better. I had never made any of the things he asked for, but I knew that I would be able to find out how. Physics had always come more easily to my hands than to my head.
Fermi turned out to be the most active, the most competitive man I have ever known, not only intellectually but physically as well, even with men twice his size and half his age. If it was a matter of mountain climbing, he had to be the one in the lead. If it was swimming, he proved to be the one with the greatest endurance. As his tennis partner, I never had anything to do but hold my racket and squint against the sun. He played both courts, the net and the backcourt as well. Shortly after his arrival in America, he bought a long shining black Packard with part of his prize money. When a minor adjustment had to be made one Sunday, he insisted on doing it himself—and lost a piece of his finger. In the laboratory, sometimes I literally had to wrestle pieces of equipment out of his hand, because while I never saw him lose his temper or even show impatience, he wanted things done his way, by him. I was freed of his furious energy only when the news of nuclear fission came along, and he threw himself into that.
The discovery of nuclear fission was a direct personal challenge to Fermi. In the early 1930s, Fermi had remarked to his old professor in Rome, Corbino, that even though it might take another fifty years to work out all the details of the wave theory of atomic structure, the main outlines were already clear. It was time he moved on to where the next big questions were. Then he and his young Italian co-workers plunged into research on neutron-induced artificial radioactivity, and ranged like wolves through the entire periodic table of elements, and beyond—to the so-called “transuranic” elements, those made heavier than uranium by the nuclear capture of the bombarding neutrons. In 1938, once again Fermi found himself in a field where the general outlines had been cleared. The next advanced position for him to attack was the question of the nature of the very high energy particles found in cosmic rays; and this is what he planned to be doing in America.
The announcement, a short time after he arrived in the United States with the prize, that neutron-bombarded uranium sometimes split into much smaller fragments along with massive emissions of energy meant to Fermi that his “transuranic” elements had been called into question. A portion, at least, of his Nobel award rested on shaky ground. The man who reveled in being first had been first in the area where fission took place, but he had walked blindly past it, leaving to others one of the most startling discoveries in physics. To him, there was no choice but to go back into nuclear physics, re-establish his lead, and prove all over again—if anyone had any questions about it—that he deserved the prize. He then went on to build, eventually, the first chain-reacting nuclear pile.
Not only was he the Columbia physics department’s only Nobel laureate at the time; he also became the busiest physicist in the building. What we didn’t know was that Fermi, who was usual in nothing, was also an unusual Nobelist.
Of all the bizarre effects which winning the prize turns out to have on scientists, the one least often seen is heightened creativity. Only Fermi and a handful of others during the past seventy years fulfilled Alfred Nobel’s original dream. A very different pattern was set by the first man ever to win the award. In 1895 Wilhelm Konrad Roentgen, an obscure physics professor at the University of Würzburg, completed a series of modest but typically meticulous experiments that had been initiated by a chance observation. He worked for about eight concentrated weeks, then his results were described one evening to a small group of Würzburg medical men. He had come across a mysterious new radiation which was actually able to penetrate a variety of materials opaque to the eye. If the hand were held between the source of the radiation and a fluorescent screen, he told them, “The dark shadow of the bones is visible within the less dark shadow of the hand. . . . For brevity’s sake, I should like to use the expression rays; and to distinguish them from other rays, I will call them X-rays, X for the unknown.”
Roentgen’s X-ray photograph of the bones in his wife’s hand (she was wearing a heavy wedding band) was printed all over the world and created a furor that verged on panic. Women were afraid to go out on the street for fear that men with X-ray glasses would see them nude through their clothes. Bankers were afraid that thieves with X-ray vision could see what was hidden in their vaults. In the public mind, for the moment, Roentgen was considered the greatest wizard who had ever lived. He was granted the award in 1901, the first year of its existence, but for the rest of a long, increasingly isolated life, he never made another contribution to science. Behind the silence was a local scandal: Roentgen was accused of taking credit for what one of his students had really done. He was so embittered by the intensity of the vituperation and the unfairness of the charge that he turned more and more in on himself until he became available to hardly anyone. He was the first, but certainly not the last, Nobelist to become involved in an ugly struggle for credit; and to have his entire style of living and working wrenched into some other shape by the most prestigious award the modern world has ever known.
By and large, Nobel science laureates are really exceptional men. If one can measure such things, they must be about twenty to forty times as creatively productive as the average scientist, whose output over an average lifetime is only about five published papers. The men who become Nobel Prizewinners, according to a study made by Harriet Zuckerman, the Columbia sociologist, publish almost that much in a year! She matched (in terms of age, specialization, and conditions of research) the performance of the American laureates in science with an equal number of excellent scientists—active but nonlaureate—selected from the roster of American Men of Science. This is what she found: the average American laureate publishes about four papers a year; the others publish about three papers every two years. The statistics also show that the output of the laureates fell off after the award was made, by an average of a third within five years. Actually, the falloff for the laureates is about three times as severe as for their less eminent colleagues of the same age.
Why this falloff?
One answer is that their new celebrity makes so many demands on them that they have less time for research. But research men make their own time, and the only ones who accept too many invitations are those who want to accept them; and since they know what the price of distraction is, their very acceptance is part of the falloff pattern, not the cause.
Another quick answer is that once these men have attained success, there is no further reason to work so hard. But a drive for “success” was never the force that kept them going. By and large, men work at research because that, more than anything else, is what they want to be doing.
Recently, in Paris, I was visiting the Pasteur Institute, and in a talk with Jacques Monod, the 1965 laureate in medicine and physiology, he happened to mention that during the war his research, absorbing as it was, had to be used as a cover for underground activities during the German occupation. Although hard at work on his experiment, behind the apparatus in neighboring rooms were illegal printing presses, forbidden newspapers, and weapons. One day, catastrophe struck: one of the men in his group was killed, another captured by the SS. Monod was ordered to go underground at once, which meant walking out of the Sorbonne, not returning to his apartment, taking another name, and staying away from any part of Paris where he might be known.
He became a full-time underground worker. Every day, he faced the danger of being shot. Yet he missed his research so severely that in whatever time he could find, he smuggled himself into the Pasteur Institute to continue his bacteriological experiment in a corner of Lwoff’s lab. At this point in Monod’s story, I had interrupted.
“Even if you had finished the research, you couldn’t have published it.”
“Naturally not,” he said.
“Nor did you have any idea that you would live long enough to finish the research, did you?”
“Your idea of a rest from risking your life twenty-four hours a day was to run an even greater risk for a few hours by going where you were known—without the slightest chance you’d ever get anything out of it in terms of prestige or recognition. Why did you do it?”
Monod is a man with a finely proportioned, highly expressive Gallic face. My question astonished him; but there was something I wanted him to put into words, and so I waited.
“Because,” he said at last, almost helplessly. “That’s what I wanted to be doing—that’s what my life was all about!”
“You know, I could make $2000 a week, if I wanted,” Poly Kusch remarked to me one day at lunch at the Columbia Faculty Club some years after he had won the Nobel Prize. He had finally grown into his angular face and was an impressive-looking man. He had also become a brilliant teacher. “That’s more money than my father ever made in a year, but I’d rather stay here and teach.”
Now, $2000 a week is a lot of money for a professor, but literally thousands of American men today—in industry, advertising, finance, fashion, and entertainment—make $2000 a week, and scarcely one of them is a man of any distinction whatsoever, while Kusch, to be worth that much money, had to attain the highest prize in the world’s most difficult science. Any man seeking “success” in the general sense of the word would have to be a fool even to think of picking the life of a research scientist as the road. No, “success” is all very pleasant, but it cannot be the spur for the really creative man whose mind is a churning sea where fragments of ideas, half-perceptions, and partial insights keep welling up to the surface of consciousness. He can neither turn the flow on nor turn it off. All he can do is pick and choose among the ones that seem most fruitful to follow. He works because he can’t not work. When he does stop working, it is because something very deep within him has been turned off, either shattered or put to rest.
In some laureates, the change is so palpable that they become almost different men. Ernest Lawrence, who invented the cyclotron in 1929 at the age of twenty-eight, very quickly became famous. He was a hard-driving, round-the-clock worker who gathered about himself an army of assistants and graduate students on whom he continually rode herd to see that tempo was maintained. Once, in impatience, he fired someone on the spot who had been moving too languidly, only to find that it was a telephone repairman sent in to do a job. Everyone under Lawrence had to work for Lawrence or in the direction of his ideas. Yet once he had won the award in 1939 at the age of thirty-eight, the change in him was so marked that it was possible for a newcomer to the lab, Emilio Segrè, to say: “Lawrence? Why he left me strictly alone to my work!”
Segrè himself is a man who was to undergo the identical metamorphosis. He was a former student and brilliant collaborator of Fermi’s from the Rome days. In 1938, he came to the United States as an anti-Fascist, and in the world of American science very quickly got himself a reputation as a man of high energy, drive, and contentiousness, along with a low threshold for excitability. After the war years at Los Alamos, he returned to Berkeley to join and help lead the work on the big new high-energy accelerator. He discovered the antiproton. I remember remarking to him once that the current availability of funds must make for independent research for young men. I made him laugh. “Money, yes! Independence, no! At least not in high-energy physics. An ambitious young scientist has got to get himself into someone else’s group and work on his boss’s problems. There’s too much competition for machine time. Nobody’s going to take a chance on a young fellow and then have to say that a million dollars was wasted!”
Segrè the dynamo was awarded the prize in 1959. Some ten years later, when I was in England at the Rutherford High Energy Laboratory at Harwell, a young British scientist who had spent time as a visiting researcher at Berkeley only the year before said to me: “I was in the Segrè group out there. I’m told he was quite a tough cookie in his younger days, but since he’s won the Nobel Prize, he’s become positively benevolent. . . . A competitive atmosphere out there?” He laughed at my question. “I tell you not! . . . I found it all very dead . . . I was able to move in with my own ideas, take hold of things, and come out with a very successful experiment.”
On receiving the telegram which the Nobel committee sends out to each award winner before the announcement in the press, the new laureate can feel many things. Exultation, certainly; but very often something else. Rabi told me that T. D. Lee, the Chinese-born scientist who shared the 1957 prize with his countryman C. N. Yang when they were both in their early thirties, received the news with acute terror. “My God!” he said. “What happens now to the rest of my life? What comes after this?” One of the things that happened was that between him and Yang, who had been his childhood friend in China, then devoted collaborators in the Institute for Advanced Study at Princeton, there developed a coldness that has never been explained to any outsider, and they stopped working together.
It was very different for Maria Goeppert Mayer, laureate for nuclear physics in 1963, the only woman theoretical physicist ever to be honored. “To my surprise, winning the prize wasn’t half as exciting as doing the work itself,” she said to me with some perplexity. “That was the fun—seeing it work out!” Even the memory of the lack of elation seemed to sadden her; yet her achievement was all the more remarkable because she had done her work when she was well into her forties and she had only recently come into the field of physics from chemistry, and most of all because she was a woman.
For Lee, terror; for Goeppert Mayer, sadness; for Frederick Soddy, pain—because the prize was going to someone else. As far back as 1898, the young New Zealand physicist Ernest Rutherford was working at McGill University in Montreal on the recently discovered world of radioactivity, which was one of wonder and confusion. He was twenty-seven. Two years later he collaborated with another McGill scientist, a brilliant English chemist of twenty-three, Frederick Soddy. The two young men published a series of papers of fundamental importance resulting in the general theory of radioactive disintegration, which attracted immediate attention by its almost sensational statement that chemical transmutation of the elements was an actuality that had been going on since the beginning of the world. The papers of Rutherford and Soddy were quoted everywhere. Soddy finished his term of appointment at McGill and returned to England to help Sir William Ramsay, the discoverer of helium, experimentally establish the crucial fact that the mysterious alpha ray given off by radioactive substances was really ionized helium. Ramsay and Soddy proved the identity. Ramsay received the Nobel Prize in 1904 for his discovery of the so-called “noble” gases: helium, argon, krypton, and neon—with no mention made of Soddy’s contribution. Shortly after, in 1908, Soddy’s other collaborator, Rutherford, now back in England too, also received the prize—again with no mention of Soddy’s part in the work. Soddy was deeply wounded. He was not the sort of man to consider himself the junior partner in the McGill work, and actually had in his possession a testimonial written on his behalf by Rutherford in 1904 that listed all the important advances made in the collaboration and added, “The work published by us was joint work in the full sense of the term.” Soddy in the beginning had to teach Rutherford the chemical techniques that were required.
Also, he felt that he had been the one who had first thought of transmutation. Disappointed as he was, he continued work in the nuclear field. The years passed. In 1913, Soddy was finally able to clarify many problems by inventing the idea of chemical isotopes. In 1921, the prize was finally given to him, and yet it was for the early work on radioactive transmutation with Rutherford that he wanted recognition. “In the old days, it had always been Rutherford and Soddy—Rutherford and Soddy—but now it’s just Rutherford, wherever you go!” he said bitterly.
Soddy had great ability, and he would have looked even more gifted if it weren’t for the blinding glow given off by his contemporary Rutherford, who had that magic combination of luck, vitality, and brilliance which makes certain men seem destined for achievement and recognition the instant they achieve manhood. They are always at the right place at the right time with the right talent. “Rutherford, you’re a lucky man, always at the crest of the wave!” his biographer, A. S. Eve, once said to him, and Rutherford’s retort was, “Well, I made the wave, didn’t I?”
Rutherford was such a man that neither Nobel Prize nor earthquake could diminish or even halt his effusive creativity. He was the first to realize the nuclear nature of the atom, the first to show that nuclear transmutation could be induced. He was big, raw-boned, loud-voiced. His capacity for enjoyment was prodigious. He loved scientific ideas that worked out; he loved his laboratory; he loved recognition; he laughed when the Nobel Prize was awarded to him at the age of thirty-seven because the citation was for “work in chemistry”; and he loved being made a lord—Lord Rutherford of Nelson. And his “boys” were his too, because, literally, he turned out Nobel laureates by the dozen.
In 1932, his “boy” James Chadwick barely beat Frédéric Joliot and his wife, Irène Curie, of the Institut du Radium to the discovery of the neutron. Rutherford, now in his sixties, insisted that Chadwick get the Nobel Prize for it. “But what about Joliot? Shouldn’t they share the prize?”
Rutherford pounded the table, “I want Jimmy to have it—unshared!”
“And what are we to do about Joliot? Just ignore him?”
Rutherford waved his pawlike hands. “That boy? Let me tell you, Joliot’s so brilliant that before this year is out, he’ll discover something so new and remarkable that you’ll be able to give him a prize for that!”
Rutherford proved to be right. Within months, the Joliots discovered that artificial radioactivity could be induced by alpha-particle bombardment. In 1935, therefore, “Jimmy” Chadwick was awarded the prize for physics—unshared; while Irène and Frédéric Joliot were given the award in chemistry—“for their synthesis of new radioactive elements.” To Rutherford, even politicking and arranging the dispensation of Nobel Prizes were all great fun. Right up to his death, though, he believed that all the talk of eventual production of nuclear energy was “all moonshine.” He died in 1937, just two years before that one great miscalculation of his scientific life was revealed by the experiment of a former student, a man whom he himself had introduced to nuclear chemistry back in the early days at McGill—Otto Hahn of Germany.
Einstein was another Nobel laureate who did not believe in the possibility of the release of nuclear energy until the experimental evidence was incontestable; but it was one of the few ways in which Einstein was not unique. Not in our time has there been a creativeness so supremely rich. What is remarkable is that the university where he took his first degree didn’t even consider him promising enough to offer him a minor post on graduation. He had to work in the Patent Office in Bern to earn a living; and while there, in his early twenties, he began his prodigious inventiveness. In 1905, at the age of twenty-six, he published three different papers in three different fields of physics, each so profoundly original that each one is considered among the germinal papers in the fields he treated. The special theory of relativity was one of the three papers. On publication, no one reacted, no one responded. Absolute silence and indifference. Not until four years later, in 1909, did any university offer him an opening, and true recognition started to explode only in 1913. Still, the Nobel Prize was not given to him until 1922 (for the year of 1921), and then not for his theory of relativity.
If science was “fun” to Rutherford, to Einstein it was exaltation. The first insight into relativity was said to be such a piercing experience for him that when he was finished with his calculations, he had a nervous collapse for a few weeks. “Well-being and happiness are such trivial goals in life that I can imagine them being entertained only by pigs.” Like Rutherford, he was already so celebrated and decorated by the time the Nobel Prize was given to him that it could not possibly affect that creativeness that came from so deep a source and flowed with such majestic strength. Only time and the physical subversions of age could dim him. His last years at Princeton made the Institute for Advanced Study a sort of shrine for physicists. At lunch one day, when Julian Schwinger was in his mid-thirties, he told me of his first meeting with Einstein, who was his idol.
Since leaving Columbia, Schwinger had matured and attained the celebrity we had all predicted for him. During the war, he had developed powerful mathematical tools for radar, and afterward he had been made full professor of physics at Harvard at twenty-nine, the youngest man ever to have achieved that position. But our once shy, carelessly dressed fellow graduate student was now jolting the sensibilities of his colleagues and students at Harvard with a very un-Cambridge Cadillac convertible and a taste for suits more smartly tailored than the shapeless, unwaisted, narrow-shouldered style affected by university types. To listen to some of them talk about him, one would have thought that a young George Raft had come to town, but Schwinger was still self-effacing in his manner.
“I had always dreamed of meeting Einstein ever since I was about twelve years old,” he told me. “But I wanted to be introduced to him only after I had done something he would know about; something important enough for him to respect. Rabi kept asking me to go down to Princeton with him whenever he went, and I kept making excuses. Finally, though, I did that piece of work on the self-energy of the electron; and Rabi told me that I was to be given the first Einstein Award for it, to be granted by Einstein himself! So there it was. The $10,000 grant that went with it was fine, but more important than the money was that I would finally be presented to Einstein on terms more dramatic than I had ever dared dream about. Well, the day came, and I got down to Princeton only just in time for the ceremonies, so I went directly to the auditorium. Rabi made the introductory speech, outlining the work I had done, and at last came the moment of the actual presentation of the award, the moment I had awaited for more than twenty years. Einstein rose slowly, waiting for me to approach, and when I went up to him, I saw it was all too late. He was too old! He hadn’t understood a word Rabi had said. He didn’t know who I was; or why I was standing there; nor was he at all clear about what was happening around him. I was shaking hands with a sick, bewildered, empty old man. It was heartbreaking to see him in such a state. The man I had wanted to meet, the man I had revered, must have died quite a while before. As soon as I could, I got off by myself and just walked. I suppose for the first time I had a true sense of the tragedy of age. And perhaps that’s why I went out and blew part of the money on that car. I had always thought vaguely in the back of my mind that it might be fun to have one like it someday, and suddenly there I was asking myself: why wait? For what? All the life there is, is now!”
Still, why the disastrous falloff in production on the part of the most creative men in their fields? According to the sociological study referred to before, there does appear to be at least one answer, which is this: a man’s life is distorted by the award of a Nobel Prize in direct proportion to the extent to which he has not achieved eminence up to that time. If a man’s accomplishments are already fully recognized by his peers, the Nobel Prize generally comes as only the most lustrous of an already large number of honors. When I was recently in Heidelberg, I asked J. H. D. Jensen, who won the Nobel Prize in 1963, if the award changed his life at all. He shrugged off the question, and said: “By the time it came, it didn’t really matter very much. The big moment for me had come years before when I learned that Fermi had put my name in nomination. I didn’t get it that year, but I didn’t really care. It was Fermi’s regard that was the ultimate honor for me, not the medal.”
On the other hand, if, before winning the prize, the man has received very few, if any, of the signs of the scientific world’s recognition of the worth of his work, the sudden rise to stardom can completely distort the pattern of the rest of his life. Yet while the statistics plainly back up this assertion, it must be true only on the average for men of comparatively slender creativity who may in the course of a lifetime achieve only one brilliant breakthrough. Men like Einstein, Rutherford, Fermi, and other giants, who are bigger than the prize, can win it at any time of their lives, take it in their stride, and go on continuing to be fruitful; while Roentgen and others like him who are smaller than the prize are overwhelmed by it—a heavy crown is only for very strong kings.

Saturday, March 30, 2013



Steve Jobs Commencement Address 2005-Stanford University
I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I've ever gotten to a college graduation. Today I want to tell you three stories from my life. That's it. No big deal. Just three stories.
[Photo: Associated Press. Steve Jobs speaks at graduation ceremonies at Stanford University, in Palo Alto, Calif., on June 12, 2005.]
The first story is about connecting the dots.
I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?
It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: "We have an unexpected baby boy; do you want him?" They said: "Of course." My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.
And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents' savings were being spent on my college tuition. After six months, I couldn't see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn't interest me, and begin dropping in on the ones that looked interesting.
It wasn't all romantic. I didn't have a dorm room, so I slept on the floor in friends' rooms, I returned Coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:
Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn't have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can't capture, and I found it fascinating.
None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it's likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.
Again, you can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.
My second story is about love and loss.
I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents' garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.
I really didn't know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down - that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.
I didn't see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.
During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world's first computer animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple's current renaissance. And Laurene and I have a wonderful family together.
I'm pretty sure none of this would have happened if I hadn't been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don't lose faith. I'm convinced that the only thing that kept me going was that I loved what I did. You've got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven't found it yet, keep looking. Don't settle. As with all matters of the heart, you'll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don't settle.
My third story is about death.
When I was 17, I read a quote that went something like: "If you live each day as if it was your last, someday you'll most certainly be right." It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: "If today were the last day of my life, would I want to do what I am about to do today?" And whenever the answer has been "No" for too many days in a row, I know I need to change something.
Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure - these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.
About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn't even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor's code for prepare to die. It means to try to tell your kids everything you thought you'd have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.
I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I'm fine now.
This was the closest I've been to facing death, and I hope it's the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:
No one wants to die. Even people who want to go to heaven don't want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life's change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.
Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma — which is living with the results of other people's thinking. Don't let the noise of others' opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.
When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960's, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.
Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: "Stay Hungry. Stay Foolish." It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.
Stay Hungry. Stay Foolish.
Thank you all very much.
Copyright 2012 Dow Jones & Company, Inc. All Rights Reserved


Ayn Rand was a brilliant woman. I have always loved this quote.

"My philosophy, in essence, is the concept of man as a heroic being, with his own happiness as the moral purpose of his life, with productive achievement as his noblest activity, and reason as his only absolute." (appendix to Atlas Shrugged)

Friday, March 29, 2013


This Korean artist is fantastic. He makes his work from recycled tires. Check out his website or google his name in Google Images. Yow!