Imagine what it must have felt like to be the parent of a child who contracted diabetes before the discovery of insulin. Children with the disease would typically start feeling tired and would then find themselves unable to focus at school, before moving on to stranger symptoms, like frequent urination paired with constant thirst, and drastic weight loss despite ravenous eating. If you took a child with these symptoms to the doctor in, say, 1920, a urine test would be administered, and if glucose levels were found to be elevated, you would be told that your child had diabetes, a fatal condition for which little could be done. You would watch your child waste away over an agonizing stretch of weeks or months, enduring the only treatment—a starvation diet that could delay but not prevent the inevitable.
It is also not difficult to imagine what it must have felt like for the parents of diabetic children, in October 1922, to read a New York Times article under the headline “Serum Proves Boon in Fighting Diabetes.” A subheading proclaimed, “Ravages of Disease Checked by Insulin Discovered by Canadian Doctors.” A paragraph at the bottom of the column described the effect of this new treatment:
A boy of 12 in extreme illness through diabetes became free from sugar after twenty-four hours of treatment with insulin [and] has remained free although his diet has been increased to practically normal. This boy is gaining weight at the rate of half a pound a day and is leading the type of life that any normal active child would lead.
The article added that the results in “other cases have been equally astonishing.”
The four “Canadian doctors” who made this discovery had begun their research a little more than a year earlier and had isolated a new medicine, a purified hormone from the pancreas of cows, that would soon be available to diabetic patients around the world. News of the breakthrough treatment was met with worldwide acclaim, including from the Nobel Committee, which awarded its prize in medicine to two of the Canadian physicians soon after their results were first published. Insulin therapy was the first medical treatment that could rapidly and consistently rescue moribund patients and return them to health. It is still viewed as one of the greatest advances in the history of medicine. It would save millions of lives.
It is not as easy to conceive how Israel Kleiner, the dean of the New York Homeopathic Medical College, would have reacted to the article in the Times, but he would probably have had mixed feelings. On the one hand, he must have been thrilled to learn that a lifesaving treatment for a disease he had studied for years was now available for desperately ill patients. But he must also have reflected that had circumstances been different, he might have been the one to make this discovery, and it might have come years earlier.
In 1913 and 1914, preliminary studies undertaken by Kleiner suggested the existence of a pancreatic hormone that could lower blood glucose, and he had spent the next five years proving it. But in 1919, after writing a definitive paper showing the hormone could treat diabetes in animals, Kleiner disappeared from the world of research science altogether. The mystery of why this happened has intrigued me since I first learned about it.
A decade ago, reading Michael Bliss’s book The Discovery of Insulin, I came across a reference to Kleiner and his little-known studies, carried out at what was then the Rockefeller Institute for Medical Research in New York. As a professor at the institute’s current incarnation, Rockefeller University, I immediately wanted to know more about Kleiner and his story, especially given my own interest in hormone research. (My laboratory identified another hormone in 1994, leptin, which regulates body weight and metabolism.) I called Bliss, who told me he’d always been curious himself, and that if I looked into it, I should let him know what I found.
So began a personal odyssey to understand the forces that both shaped and limited Kleiner. As I pieced his story together over the years, it became clear that several factors had led to his vanishing.1 The first was the great war that ended one hundred years ago this month—indeed, the delay in insulin’s discovery could reasonably be considered a hidden cost of World War I. But there were other forces at work, too, small and large, from Kleiner’s personality, and those of his colleagues, to a worldwide transformation in science and medicine.
Kleiner, a research scientist who missed out on a great discovery, is not the sort of person about whom stories are often written. But an understanding of what thwarted him offers insights into the nature of discovery, insights relevant to challenges and opportunities facing science today.
Not much is known about Kleiner the person, certainly far less than about the four Canadians who followed. His grandsons, Andrew and Kage Glantz, both now in their late sixties, remember him as gentle and soft-spoken, and Kage describes him as “the mildest man in the world.” That’s certainly how he comes off in his letters, and in the few accounts of him from contemporaries.
Kleiner, the grandson of Jewish immigrants, grew up in New Haven, Connecticut, where his father owned a tailor shop and expected his son to join him there. Israel, however, was a promising student, and instead enrolled at Yale as an undergraduate. Finding himself drawn to biochemistry and the science of metabolism, he went on to study both as a graduate student at the university, researching the metabolism of glucose and other nutrients under Lafayette Mendel, a pioneer in the science of nutrition.
Kleiner received his PhD in 1909, and a year later he was hired at the laboratory of the prominent physiologist Samuel Meltzer, at the new Rockefeller Institute. The appointment was an extraordinary opportunity for him. The institute had been founded in 1901 by John D. Rockefeller in collaboration with the great physician and bacteriologist William Welch, later known as “the dean of American medicine.” It was one of the first American institutions focused on biomedical investigation, and, at a time when government support was unavailable, it was one of very few places that provided adequate funding for research scientists. By 1910, when Kleiner arrived, it was already known as one of the world’s leading research centers.
The institute’s first director, Simon Flexner, had by then assembled a stellar and international group of scientists to take on the greatest medical problems of the age, providing them with needed resources and minimal direct supervision. Alexis Carrel, a Frenchman, was pioneering surgical methods and would soon be the first scientist working in America to receive a Nobel Prize. Phoebus Levene, a Lithuanian biochemist, studied the structure of DNA. Hideyo Noguchi, a bacteriologist from Japan, identified the organism that caused syphilis. Rufus Cole, an expert in the diagnosis and treatment of pneumonia, was put in charge of the Rockefeller Hospital, which became the first research hospital in the country. And Kleiner’s new supervisor, Meltzer, a Russian-born, German-trained physiologist, was the inventor of what he called an insufflation device, one of the earliest respirators.
The contrast between the Rockefeller Institute and nearly every other medical institution in the United States at the time is hard to overstate. Before 1893, when Welch cofounded the research-oriented Johns Hopkins School of Medicine, which was based on the model prevalent in Europe, biomedical research had barely existed in the United States, and medical education, too, was severely deficient. The year Kleiner arrived at the institute, the prominent educator Abraham Flexner (Simon’s brother) issued a scathing and influential report decrying the state of most American medical schools, even those affiliated with universities, which in many cases were little more than profit centers for physicians, reeling in fee-paying apprentices. The principal criterion for acceptance was often simply that an applicant be able to pay; many enrollees had not even graduated from high school.
The situation in Europe was very different. Until the last third of the nineteenth century, the practice of medicine had barely advanced anywhere in the world since the time of Hippocrates. In 1860, Oliver Wendell Holmes Sr. had offered this summation: “I firmly believe that if the whole materia medica, as now used, could be sunk to the bottom of the sea, it would be all the better for mankind and all the worse for the fishes.” But the French and German governments, unlike the American, had been paying to train their country’s best scientific minds at specialized institutions and universities for several decades. And just a few years after Holmes’s remark, a revolution in biomedical science began in France and Germany.
The most dramatic transformations had been in the diagnosis, treatment, and prevention of infectious disease, which was by far the largest cause of human mortality. It had long been explained by the “miasma” theory (which blamed toxins in the air) or by the vague theory of “filth.” The competing germ theory wasn’t widely accepted until the last third of the nineteenth century, when Louis Pasteur in France and Robert Koch in Germany, both supported by generous government funding, showed that the pathogenic agents for many infectious diseases were bacterial organisms that could be viewed under a microscope and isolated. These discoveries had been instrumental in raising life expectancy in the developed world, by as much as a decade, between 1850 and 1900, and would go on to contribute to a further substantial increase by 1930, even before antibiotics were widely available.
The Rockefeller Institute, some of whose faculty had trained in Germany or France, was in part an effort to import some of the knowledge and expertise that was taking hold across the Atlantic. When Kleiner arrived, the medical revolution was still very much in progress, and researchers were chasing breakthroughs in a wide array of fields. Meltzer, the head of the department of pharmacology and physiology, wanted to uncover basic processes that regulate the function of organ systems, and he was overseeing a wide variety of physiological investigations. Kleiner had previously studied how the body eliminates glucose from the blood while working with Mendel at Yale, and he continued this line of research in Meltzer’s lab.
Although diabetes was not as prevalent then as it is now, before the discovery of insulin it had one particularly gruesome aspect that made it as vivid in the public imagination as other more rampant diseases. Type 2 diabetes, the more common form that generally afflicts adults and is often associated with obesity, was—as it still is—often fatal in the long run but not acutely life-threatening. Rather than leading to a quick demise, it caused serious health problems over time, including heart disease and disorders of the kidney, eye, and peripheral nerves. But type 1 diabetes, which typically afflicts children, was another matter. In that form of the disease, glucose metabolism is severely impaired, to the point where the body burns fat relentlessly, leading to extreme hunger, emaciation, and increased acid levels in the blood—diabetic ketoacidosis. Without proper treatment, this condition causes death in fairly short order, but not so short as to spare the patients and their families a nightmare of suffering and decline.2
Kleiner may or may not have had those horrors in mind when he began his studies, but diabetes was widely understood to be a major problem. Both forms of the disease had been recognized since antiquity, when physicians noticed that ants were attracted to the sweet urine of certain patients, some of whom were wasting away. By the eighteenth century, it was known that the sweetness was from glucose that had not been properly metabolized by the body. But beyond this, and the later finding that the glucose in urine was a result of elevated levels of it in the blood, little of enduring value was learned for millennia, until 1889. That year, the physiologist Oskar Minkowski, working with a colleague, Joseph von Mering, at the University of Strasbourg, discovered by chance that removing a dog’s pancreas caused it to become diabetic.3
Minkowski’s finding had established that diabetes was a disease of the pancreas, but did not explain how the organ, when properly functioning, controlled blood glucose. Many, often exotic, explanations had been considered: the healthy pancreas made an enzyme that degraded glucose, or consumed glucose; or it removed an agent from the blood that interfered with the use of glucose; or, possibly, it made a hormone (a class of chemical messengers that had only recently been identified) that stimulated the metabolism of glucose, thus reducing its levels in the blood. Many scientists subscribed to this last theory, but it remained controversial, and as late as 1910, no one had produced definitive evidence that diabetes was caused by the deficiency of a pancreatic factor. It was this proposition that Kleiner, who was driven by an academic curiosity about the mechanism responsible for the elevated glucose levels, began to study when he moved to the institute.
Kleiner and Meltzer first compared the fate of glucose after it was injected into the blood of both diabetic and nondiabetic dogs. In 1913, they found that one and a half hours after an injection, glucose concentration in the blood of a diabetic dog was three times higher than in the blood of a healthy dog. They then showed that adding a simultaneous injection of an extract of the pancreas, prepared by incubating pieces of the organ’s tissue in water, could prevent the abnormal rise of blood glucose in diabetic animals. Meltzer presented the findings at the National Academy of Sciences in Washington, and they attracted considerable attention when a front-page New York Times headline announced “Find Diabetes Cause; Now Seek a Remedy.”
The pancreatic extract studies involved small numbers of animals, and larger studies would be needed to confirm them. And, to convince possible skeptics that he had found the long-sought antidiabetic hormone, Kleiner would need to show that the extract was lowering blood glucose by increasing its metabolism, which required sophisticated methods that were being used in Europe. In June 1914, he was making a series of visits to laboratories in Germany, England, and Denmark to learn those methods when Archduke Franz Ferdinand was shot in Sarajevo. In late July, with war on the horizon, Kleiner’s trip was curtailed, and he and his family made their way to Paris. They checked into “a ‘quiet’ hotel,” as he later told the Brooklyn Daily Eagle, but soon felt themselves besieged: raucous crowds chanted for war day and night, and strangers constantly demanded to know their nationality. “It soon became so dangerous in Paris that we thought of nothing but getting out,” he told the paper. “If the people of Berlin were any more war-crazy than those of Paris, I pity the people marooned there.” The Kleiners heard the town of Cherbourg erupt into celebratory cries as they made their way to their New York–bound steamer; war had just been declared.
The impact of the war on the institute was initially minimal. Still, there was no avoiding the news from Europe, and some of the faculty were soon directly involved in one way or another. Alexis Carrel, the Nobel Prize winner, had been in France when the war began, and, with support from the Rockefellers, he stayed on to direct a war hospital for the treatment of wounded soldiers. And Meltzer, the director of Kleiner’s laboratory, delivered a lecture in 1915 decrying war as “wholesale murder” and calling on physicians to play a central role in raising moral standards. The response was so positive that he published his speech in several scientific journals and launched an antiwar organization, the Medical Brotherhood for the Furtherance of International Morality, which would eventually grow to 16,000 members, a sizable proportion of the medical community.
Woodrow Wilson also took an antiwar stance of sorts, committing the country to neutrality. But he and the rest of the government knew from the start that this might change, and they began to prepare for that possibility well before the United States entered the conflict, in the spring of 1917.
There was a lot of preparing to do. In 1914, the US military had only 150,000 troops, fewer than Serbia or Belgium. In 1916, as tensions with Germany rose, Wilson initiated plans to begin drafting young men, a policy that would ultimately mobilize more than 4.5 million US troops, nearly all of whom would be sent to crowded camps for training.
This planned concentration of large numbers of soldiers in cramped quarters was a serious concern for another eminent physician of the age, William Gorgas, the surgeon general of the US Army. Having served as the chief sanitary officer in US-occupied Havana in 1898, after the Spanish–American War, he was acutely aware that many more soldiers had died from disease than from combat during that brief conflict (historians have estimated as many as eight to one), as was true in virtually every previous war.
Gorgas’s tasks now included preventing epidemics from breaking out in the new military camps and, more urgently, preventing their spread from one camp to another.4 He was determined that adequate sanitation measures be taken at the bases (and on the front lines, should it come to that), and that soldiers benefit from the newest advances in diagnosis and treatment of infectious disease—in particular a remedy that had been developed by Emil von Behring in Germany using antiserum (a component of blood from animals infected with low doses of bacteria that contains antibodies) to treat bacterial infections.
Gorgas was also responsible for the massive mobilization of medical personnel, and for determining how to train them in the new lifesaving treatments. The Army Medical Department would need to grow from about 800 physicians to at least 25,000, at a time when there were only 140,000 in the entire country, most of them poorly trained. Even more nurses would be needed, along with new hospitals totaling 300,000 beds, and a system for procuring and delivering enough medical supplies for an army of millions.
However, because the government’s overwhelming priority was to mobilize and later transport the Army, there was little funding available for the Medical Department. And since there was also almost no governmental health care infrastructure for Gorgas to turn to (the small government Hygienic Laboratory was then focused mainly on developing better treatments for venereal diseases), he began to lean heavily on the few American institutions able to help him address these monumental challenges. Aware of this pressing need, in 1917 Flexner suggested that the Rockefeller Institute be incorporated, in its entirety, into the Army as an auxiliary post. When presented with the offer, Gorgas didn’t hesitate.
Even before that year, Flexner was overseeing the institute’s efforts to supply the Allied Powers with therapeutic antisera for treating meningitis, dysentery, and pneumonia. Teams of scientists isolated and characterized different bacterial strains, making vaccines and immunizing dozens of horses, from which hundreds of liters of horse serum were then harvested. During the run-up to the war, the institute also began supplying American forces, and established and staffed a “war demonstration hospital” on its campus, where newly mobilized doctors were trained in the latest treatments for bacterial diseases; a new, transformative method for disinfecting wounds; the use of Meltzer’s respirator; and the management of fractures.
Other scientists at the institute worked to develop a way to store blood, synthesized the sedative barbital as well as other chemicals previously obtained through trade with Germany, and sought means for dealing with poison gas and vermin-proofing fur. The institute outfitted railway cars as mobile laboratories for use at Army camps around the country, and nearly all its physicians, including Flexner, traveled to the camps at one point or another to help quell outbreaks, having been made Army officers. Flexner was a lieutenant colonel, Welch a brigadier general, and even Meltzer, who finally disbanded his Medical Brotherhood once America entered the war, served as a major.
Because Kleiner’s training and research had no direct bearing on the war, he was left to continue his studies of the disposition of glucose in diabetic animals in his corner of Meltzer’s lab. He worked methodically, but was increasingly isolated, as the head of his laboratory devoted time first to an antiwar crusade and then to war work, and the institution around him attended to a monumental war effort in which he apparently had no part.
In a 1916 paper that was authored by him alone—unusual for research conducted in a more senior scientist’s laboratory—Kleiner reported the results of a more comprehensive study than that of 1914, with a larger number of animals, showing that glucose injected into the blood of a normal dog was rapidly cleared. This extension of his earlier work was meant to set the stage for definitive studies of the fate of glucose in diabetic animals, and of the effect of a pancreatic extract on it—the key issue. But, in 1917, the war found Kleiner too: he was dispatched to Yale to teach, filling in for Mendel, who had been called into service.
Much later, Kleiner would write of this period, with characteristic restraint: “In the meantime, the first World War had come about and interfered with” his pursuit of the glucose-regulating hormone.
Kleiner did not take up his research again until 1918. After the armistice in November, the institute began to settle back into its old routines, and he likely expected that his own work would now go forward uninterrupted. But Simon Flexner had other plans.
By late 1918, Flexner had persuaded Meltzer, whose health was in decline (as a result of type 2 diabetes, as it happened), to plan his retirement. Owing in part, perhaps, to past tensions over Meltzer’s Medical Brotherhood, Flexner also decided to let go of the researchers in Meltzer’s laboratory. Meltzer sent several letters urging that his associates be kept on at the institute, including Kleiner, but Flexner refused. In a note to Meltzer from that year, he made his feelings about Kleiner in particular clear. “His work is not essential to you or the Institute’s war program,” Flexner wrote, adding, “I believe it would be well for Kleiner to go into teaching and this might be the time to make the change. He is not a man the Institute wishes to attach itself to permanently.”
Toward the end of 1918, Kleiner was informed that his term at the institute would be over by the following June, which created a potential crisis for him. He had no other source of income and was supporting not only a wife and a young daughter but also his widowed mother-in-law. As the deadline approached, he had difficulty finding a position elsewhere, and in a desperate-sounding letter written on June 25, he implored the administration to provide a summer salary while he continued his search. Flexner agreed, for which Kleiner was immensely grateful. At no point during all of this, it seems, did Kleiner, Flexner, or Meltzer bring up the research or its obvious importance (despite the several newspaper articles that had previously highlighted its therapeutic implications).
Kleiner stayed at the institute until September, leaving just as he submitted one final paper on his diabetes research. Published in the fall of 1919, again with Kleiner as sole author, it was an extraordinary piece of work—a masterpiece, even. At a time when most scientific papers reported their data through anecdotes, in a way that would be regarded as unconvincing or even slapdash today, Kleiner showed, by precise evidentiary standards, and in a large group of animals, that the pancreas (but not other tissues) made a hormone that regulated blood glucose. And rather than dwell on how this finding answered the esoteric question of why glucose was retained in the blood of diabetics, Kleiner now went directly to the core of the medical problem with a focus not evident in his earlier papers. He repeatedly emphasized the clinical implications of his findings, writing that they “suggest a possible therapeutic application,” and also speculated about whether an extract of the pancreas from another species would have the same effect, thus predicting the use of extracts prepared from animals (cows and, later, pigs, in the case of the Canadian team).5 Kleiner closed by writing that the “effective agent or agents, their purification, concentration and identification are suggested as promising fields for further work.” Indeed they were.
If Flexner was the reason Kleiner’s research ground to a halt, the obvious question is why. Why would someone so accomplished and discerning, universally recognized as a shrewd judge of talent and good science, fail to appreciate the historic discovery that was within Kleiner’s, and the institute’s, reach?
Flexner was himself something of a mystery to many on his faculty. A small, intense man in his mid-fifties, he was intimidating to some and supportive of others, but taciturn with everyone. He was a longtime friend and protégé of William Welch, in whose lab he’d trained before Welch became the founding dean of the Johns Hopkins medical school (and eventually, in the eyes of many, the most influential person in the history of American medicine). Flexner had gone on to become an eminent bacteriologist in his own right, developing an effective treatment for meningitis, identifying Shigella flexneri as a cause of dysentery and showing that polio was a viral disease, among other achievements. He was thirty-eight when Welch, the first president of the Board of Scientific Directors at the Rockefeller Institute, persuaded him to run its laboratories.
Flexner’s focus on infectious diseases may well have contributed to his lack of appreciation for Kleiner’s research. Diabetes may have seemed to him almost a boutique illness, affecting patients whose deaths were no more tragic than those of the many more people dying of pneumonia, meningitis, and diphtheria. And the urgency surrounding such illnesses would have been at the front of his mind after his deep involvement in the institute’s wartime work on infection, which had been very successful. Disease accounted for about half of military deaths in World War I, a significant decrease from previous conflicts. (This shift could be attributed partly to the augmented killing power of the new machines of war, but the vast medical machinery set up by the Allies, of which the institute had been a vital part, also made a major difference.) Even with the war over, the Spanish flu epidemic that had been raging for more than a year was, if anything, amplifying concerns about infectious disease.6
Flexner’s commitment to the institute’s war program also colored his view of Meltzer, Kleiner’s vociferously antiwar patron, and by extension, perhaps, the members of Meltzer’s laboratory. The Medical Brotherhood had drawn the ire of people with influence over Flexner, including a close adviser to the Rockefellers and the institute’s business manager, both of whom wrote to Meltzer—as did Flexner—chastising him for allowing his activism to distract from his research and the war effort.7 Kleiner may to some extent have been caught in the institutional crossfire.
But apart from whatever problems he had with Meltzer, Flexner also seems to have had a low personal opinion of Kleiner himself, as evidenced by the finality of that 1918 note about him to Meltzer. Kleiner’s deep talent is clearly evident in his scientific papers, especially the one from 1919, but their brilliance may have been obscured by his lack of assertiveness. The personal reticence described by his grandsons and others likely worked against him in the same way. (As Meltzer put it in a 1918 letter, “Kleiner has to pay the penalty for being a gentleman and of a modest retired disposition.”) Flexner appears to have been attracted to established stars, and his vision for the institute was to find them and bring them there.8
In any case, as his days at the institute drew to a close, Kleiner felt lucky to have been offered a position as a professor of physiologic chemistry at the New York Homeopathic Medical College and Flower Hospital, despite the considerable step down he would be taking from the exalted world of the institute. While homeopathy was then regarded by much of the public as a viable alternative to research-based medicine, it was recognized as a pseudoscience by many others, including the members of the institute he was preparing to leave. In addition, given the lack of federal funding for research at the time, and the rarity of institutional funding, he knew there would be no opportunity to continue the work he had been doing.
He was quickly drawn into administrative duties at the college, and was serving as dean by the early 1920s, although he still followed research in diabetes. In 1921, he attended a meeting of the American Physiological Society in New Haven, where he heard a presentation by Frederick Banting, a Canadian surgeon, and John MacLeod, a Scottish physiologist who oversaw the University of Toronto laboratory where Banting worked. Banting stammered out an account of his recent efforts, together with a student named Charles Best, to replicate Kleiner’s finding that the pancreas made a factor that could suppress diabetes.
Banting, then just thirty, had served as a medical officer on the front lines in France, and after the war had set up a surgical practice in London, Ontario, that had never taken off. But he’d had an idea about diabetes and the pancreas after reading a surgical case report in a journal, and he’d shared it with several colleagues, one of whom had referred him to MacLeod. MacLeod had been willing to give him an opportunity, providing him with space and resources and assigning Best to work with him. Their early results were presented at the New Haven conference and then published in February 1922, in a paper in which they concluded that pancreatic extracts “do have a reducing effect on blood sugar, thus confirming Kleiner.”
Around this time, James Collip, a young biochemist in Toronto who had been following Banting and Best’s progress, had remarked to a friend that it would take him two weeks to purify the hormone whose existence Banting and Best (and Kleiner before them) had demonstrated. Shortly thereafter, Banting invited Collip to join the group, and within weeks, Collip developed a purification scheme. Banting rushed ahead and tested an extract on Leonard Thompson, a desperately ill and emaciated diabetic boy of fourteen. While there appeared to be a beneficial effect on glucose, the material still had too many impurities, and Thompson developed an abscess. Collip refined the procedure and less than two weeks later the boy was reinjected, this time successfully. Soon after that, the purer material, now named insulin, was given to six other patients, and in March 1922 the group reported that all had improved dramatically. They distributed insulin to several diabetes clinics around the country, all of which replicated the results. By October of that year, newspapers around the world were trumpeting the breakthrough. In 1923, the group filed for a patent on insulin, which they immediately assigned to the University of Toronto; the university licensed production rights to the Eli Lilly company, which continues to sell it in the United States. Later that year, Banting and MacLeod were awarded the Nobel Prize. The interval between the discovery and this recognition is still among the shortest in the history of the prize.
Kleiner never complained publicly or, according to family members, even privately about his lost opportunity. But when he learned that insulin had been patented, he was “furious,” his grandson Andrew Glantz said, since he felt that the discovery to which he had contributed “was for the good of man,” and feared that it would not be fully available to patients who needed it.9 He wrote Flexner a trenchant letter objecting to the patent, but dropped the issue when Flexner advised him that “it would be well not to pursue the matter.”
Kleiner spent the rest of his career at the New York Homeopathic College, ultimately succeeding in removing homeopathy from the curriculum. Shortly after the discovery of insulin, he resigned as dean, and he later wrote a well-regarded biochemistry textbook.
Had he been given the opportunity to pursue his research, as Banting was by MacLeod, would Kleiner have discovered insulin? There’s no way to be sure, but it does not seem unlikely. He had clearly articulated the therapeutic potential of his work in his 1919 paper, in which he also referred to previous efforts by others to purify the factor using alcohol, the method Collip successfully used three years later. Given Kleiner’s biochemistry training at Yale, he presumably had the knowledge and skill to do the same.
There’s no question that Flexner made a serious error by denying Kleiner the opportunity to continue his work, which might have led to the development of insulin two years earlier, saving the lives of several thousand diabetic children (and associating the institute with a world-changing discovery). As all-consuming as the war effort was, and as urgent as infectious diseases remained afterward, he could easily have decided that Kleiner’s work was important enough to justify continued support during and after the war, at minimal cost to the institute. Instead, the exigencies of the war, and those of the Spanish flu outbreak, seem to have led him to an approach at odds with his original vision for the institute: giving talented scientists the opportunity to follow their interests, whether or not those interests were in the mainstream, and regardless of where their research might (or might not) lead. It is this approach, rather than a more prescriptive one, that has driven the extraordinary progress in science and medicine over the past hundred and fifty years.
In fact, it was this very recipe that provided the foundation for the advances in treating infectious disease that Gorgas, Welch, Flexner, and others relied on in their wartime work. Developments like von Behring’s antiserum treatments or the later introduction of antibiotics would not have been possible without the more open-ended and exploratory work that preceded them, work funded largely by European governments and animated by the curiosity of scientists like Koch in Berlin, Pasteur in Paris, and others, many of whom may not even have been thinking of infectious disease when they began their research.
Discoveries are delicate things. To make a discovery, a scientist needs to be familiar enough with prior research to formulate an important question that is neither too easy nor too difficult to answer. (If it is too easy, someone will have answered it. If it is too difficult, no one will.) But formulating an appropriate question and having the technical skills to answer it are not sufficient. Scientists also need the resources to conduct their work freely, pursuing their curiosity uninterrupted wherever it leads.
What Flexner had apparently lost sight of, at least in this instance, was the inestimable value of pursuing knowledge for its own sake. Scientific inquiry is an arc of knowledge, a series of steps on a path toward a deeper understanding of the unknown, and the breakthroughs only come because of the body of knowledge that previous observations have built.
This is clearly true of the discovery of insulin. The Canadians’ milestone was the easiest to recognize (certainly by the public), but their discovery depended on the work of many predecessors besides Minkowski and Kleiner.10
And the arc did not end in Toronto. New findings about insulin and its actions have since helped explain the very different causes of type 1 and type 2 diabetes and have led to the development of new treatments supplementing insulin or, in the case of type 2, even replacing it. Insulin is lifesaving in the short term but of limited efficacy in preventing the long-term consequences of diabetes.
There are similar arcs across all areas of science, and the rate of their movement is directly correlated with the level of investment in science. In 1810, Wilhelm von Humboldt, a Prussian official distressed over his state’s inability to resist Napoleon’s invasion, introduced a radical scheme for German advancement that called for the government to fund scientific research at universities rather than in a very limited number of small institutes. It was a brilliant plan, but even von Humboldt could not have imagined the extent of the progress it would unleash over the next century. In addition to transforming Germany from a second-tier European state into a preeminent scientific, economic, and military power, and creating entirely new industries, it transformed medical science.
And since World War II, something similar has been playing out in the United States. Generous federal funding for basic research, in part a result of policies enacted on the recommendation of Vannevar Bush, the wartime head of the US Office of Scientific Research and Development, has fueled another scientific revolution. This has given rise to advances such as recombinant-DNA technology; the DNA sequencing of entire genomes; new, often curative therapies for cancer; new means for preventing heart disease and managing it when it develops; improved treatments for both bacterial and viral diseases, including HIV and hepatitis; lifesaving treatments for rare orphan diseases; and countless other breakthroughs. Other branches of science have yielded noninvasive diagnostic procedures that enable views deep inside our bodies. Each of these developments is the product of its own arc, and arguably every one was made possible by a series of often arcane investigations by scientists pursuing their curiosity. Like von Humboldt, Bush could not possibly have foreseen precisely where the vast research program he advocated would lead, but he understood that he didn’t need to.11
This is not an argument against applied research, or what medicine sometimes calls “translational” research, aimed at a particular target. Gorgas and Flexner’s specific objective—to translate the discoveries of Pasteur, Koch, von Behring, and others into improved health care for American troops—was crucial to both the war effort and medical progress. But focusing too much on mainstream notions of what is important or useful carries the risk that the very discoveries that make translational research possible will never be made.12 It also presupposes that we know what will be important in the future. Infectious disease was Flexner’s priority, and he would likely have been shocked to learn that diabetes, perhaps a boutique illness in his day, has since become a massive worldwide public-health crisis.
Still, Flexner’s intense focus on the catastrophes unfolding in front of him is easy to understand, and while great discoveries are obvious in hindsight, it is considerably more difficult to discern one in the making. Identifying the person who will make that kind of advance can be similarly difficult.
The personal traits required for success in science are highly varied. Some scientists pursue a particular goal single-mindedly, as von Behring did with antisera, while others are gifted experimentalists who can discern and eventually explain the unexpected, like Minkowski or Pasteur. Some are intense and outwardly confident, like Flexner, while others are more timid and easier to overlook, like Kleiner.
The closest thing to an indispensable characteristic is a passion for the work and a near-obsessive interest in seeing it through, as embodied by Banting. It is perhaps here that Kleiner may have fallen short. Should he have fought harder to persuade Flexner to give him an opportunity to finish, as Banting surely would have? (For most scientists, missing the opportunity to make a discovery is their greatest nightmare, and I can say with certainty that under similar circumstances neither I nor most of the ambitious scientists I know would have maintained Kleiner’s apparent equanimity about his missed opportunity.) Maybe he concluded that Flexner had already decided his fate, and that writing his brilliant 1919 paper was the most he could do. In any case, he went on to live a contented life, by all accounts, perhaps satisfied with having made his contribution, even though few fully appreciated it.
If he felt that he had played a pivotal role in the story of insulin, he was not alone. Donald Van Slyke, one of the fathers of clinical chemistry and a former colleague of Kleiner’s at the institute, wrote him to this effect in 1955, on the occasion of Kleiner’s seventieth birthday. His note recalled the moment when Kleiner “pointed out clearly the probable therapeutic application of pancreatic extract in human diabetes.”
“After your paper in the 1919 Journal of Biological Chemistry one may say that the next steps . . . were inevitable,” Van Slyke continued. “The honor of clearly showing the way remains yours.”