How World War I Spurred the Invention of Blood Banks
Blood from blood banks is routinely used for life-saving transfusions and procedures. And, like many medical advances, the technology for blood transfusions and banking was developed during wartime—when every minute counts and lives are on the line.
Though military medicine had improved in the later years of the 19th century, nothing could have prepared the armed forces for the absolute carnage of World War I.
“The amount of battlefield casualties in a single day overwhelmed military medical organizations,” explains Frederick Schneid, a history professor specializing in military history at High Point University.
The development of blood banks and an increase in successful transfusions helped put a small dent in the war’s fatalities and improved recoveries for wounded soldiers.
Understanding Blood Types
Blood grouping, or blood typing, is a system that categorizes human blood into different types based on the presence or absence of specific markers on red blood cells. Though the concept was discovered in the early 1900s, the knowledge wasn’t widely applied until World War I. When surgeons did not test blood for compatibility before a transfusion, the result could be fatal if the patient’s immune system attacked the new blood cells.
The war also spurred advances in anticoagulants and short-term blood storage, both vital to setting up effective and safe blood banks.
“Blood transfusions, which we now view as routine, were still experimental at the start of the war,” notes Lora Vogt, vice president of education and interpretation at the National WWI Museum and Memorial.
From Vein-to-Vein Transfusions to Blood Banks
The first blood transfusions were done in France in 1914 through a direct vein-to-vein method, from donor to patient, Schneid explains. “The problem was that there was no way to preserve the blood after it was taken, so the transfusion had to be immediate,” he says. It was also difficult to find enough available donors and surgeons when multiple patients required a transfusion at the same time.
Then, in the spring of 1917, a Canadian military doctor named Lawrence Bruce Robertson began performing “indirect” blood transfusions on the Western Front. In these procedures, blood was transferred from donors using syringes and narrow tubes to prevent clotting. By November 1917, he had described 36 cases using his indirect transfusion method in an article in The Lancet, writing that “in the cases of severe primary hemorrhage accompanied by shock, blood transfusion frequently produces an immediate and almost incredible improvement.”
Around the same time, Oswald Hope Robertson (no relation to Lawrence), a U.S. Army doctor, established the first blood depot: an ice chest stocked with flasks of blood. Robertson was sent to France to help the British army establish similar systems. He collected O negative blood, the universal donor type, and treated it with anticoagulants. It was then poured into one-liter glass bottles that were packed in straw in ammunition boxes, Schneid explains. The first successful transfusion from this early blood bank model took place in 1917.
Many soldiers fighting in World War I experienced massive blood loss from gaping wounds caused by shrapnel. In fact, artillery, including shrapnel, shells, fragments and explosion debris, caused more than 60 percent of battlefield casualties, according to Vogt. The introduction of blood banks and transfusions meant some of these injured men had a better chance of survival.
“The availability of blood for transfusion meant that army doctors could stabilize the patients from the field to the rear area hospitals,” Schneid says.
Blood Banking in World War I
Eventually, Allied medical forces were issued standardized transfusion kits, invented by British doctor Geoffrey Keynes, to carry into the field. These kits allowed doctors to administer blood to an injured patient even before transferring them to a casualty clearing station.
Blood kept in blood banks and used in transfusions during World War I often came from unwounded service members, civilian volunteers or medical professionals themselves, Vogt explains. Additionally, according to Pierce, “convalescing troops often volunteered as donors for more seriously wounded comrades.”
Blood Banking After World War I
After World War I, blood banking gradually became accepted practice and was extended to civilian use. Some of the first blood banks in the United States were founded at the Mayo Clinic in Rochester, Minnesota (1935), Cook County Hospital in Chicago (1937), Mt. Sinai Hospital in New York City (1938) and the Hamilton County Chapter of the American Red Cross in Cincinnati (1938).
Blood banking then saw major advances during World War II. In 1941, African American surgeon and researcher Charles R. Drew invented a safe way to store, process and transport blood plasma. At the outbreak of the war, Drew worked with others to set up a national blood bank for the American Red Cross, which provided blood to the U.S. Army and Navy. The network incorporated new safety protocols and standardized production techniques, and launched mobile blood donation stations, later known as “bloodmobiles.”
Not every development represented progress, however. Misguided protocols during World War II barred blood donated by African Americans from being given to white soldiers. “There were similar guidelines in Britain and France with non-white and white donors,” Schneid adds. The American Red Cross’s policy of segregating donated blood ended in 1948.
By the end of the Second World War, the Red Cross had collected more than 13 million pints of blood. The scale of blood banking, and particularly the development of plasma collection and transfusion techniques pioneered by Drew, likely saved thousands of lives.
In a sign of the life-saving potential of blood banks in wartime, blood type was added to American dog tags in 1940 and remains on military identification tags to this day.