
The history of the first atomic bombs is one of audacious scientific strides, fateful wartime decisions, and profound moral questions, all unfolding in the shadow of the most destructive war the world had yet experienced. The chain of events began long before World War II, with the discovery of uranium in 1789. For decades it was simply another element on the periodic table, until early 20th-century pioneers like Marie Curie, Ernest Rutherford, and Lise Meitner began peeling back the mysteries of radioactivity and the atom’s inner workings.

The breakthrough came in late 1938, when Otto Hahn’s experiments in Berlin, interpreted and named by Lise Meitner after she fled Nazi Germany, revealed nuclear fission, the splitting of the uranium nucleus. Scientists like Leo Szilard and Enrico Fermi soon grasped the staggering implication of a chain reaction: it could release energy on a scale the world had never known, and in wartime, that meant a possible weapon of unimaginable power.

As Europe descended into war, scientists made their way to the United States, bringing not just their knowledge but dire warnings. In 1939, Szilard drafted and Albert Einstein signed a letter to President Franklin D. Roosevelt warning that Germany might be developing atomic weapons. That message prompted the U.S. government’s first steps into nuclear research, beginning with the Advisory Committee on Uranium.

Following the attack on Pearl Harbor in December 1941, the effort shifted from scientific inquiry to a full-scale military program. Led by General Leslie Groves and physicist J. Robert Oppenheimer, the Manhattan Project became one of the largest and most secret undertakings in history, employing more than 100,000 workers at a cost of roughly $2 billion at the time, equivalent to tens of billions of dollars today.

The project proceeded along two concurrent paths: one bomb fueled by uranium-235, the other by plutonium-239. Obtaining uranium-235 was slow and technically demanding; it had to be separated from the far more plentiful uranium-238 in enormous plants such as those at Oak Ridge, Tennessee. The resulting design, named Little Boy, used a “gun-type” assembly, firing one piece of uranium into another to start the chain reaction. Plutonium, bred in reactors at Hanford, Washington, posed a different problem: reactor-produced plutonium contained enough plutonium-240, with its high rate of spontaneous fission, that a gun-type weapon would pre-detonate. The material instead had to be imploded, with precisely shaped explosive charges compressing the plutonium core uniformly. This design was far more complex and untested, so Oppenheimer pushed for a live test.

That test, code-named Trinity, was conducted in the New Mexico desert on July 16, 1945. The blast, produced by the plutonium “gadget,” lit up the pre-dawn sky and sent a mushroom cloud miles high. It proved that the implosion design worked, assuring the United States that it could field more than one kind of atomic weapon. Strategically, it also made it possible to strike Japan twice in rapid succession, demonstrating not a single bomb but an ongoing capability.

With the bomb ready for use, President Harry Truman faced a momentous decision. Japan had endured months of bombing and blockade but showed no sign of surrendering. Truman’s advisors weighed four alternatives: continue conventional bombing, invade, stage a demonstration, or drop the bomb directly on a city. Each option carried immense costs. Conventional bombing was already claiming hundreds of thousands of lives without breaking Japan’s resolve. Invasion would mean horrific casualties on both sides. A demonstration risked a dud, or a display that failed to impress. In the end, Truman and his advisors concluded that bombing a populated target would have the greatest psychological effect.

The targets were carefully selected. Military significance mattered, but so did the condition of the city: it had to be relatively undamaged so the bomb’s effects would be unmistakable. Cultural and political considerations also played a role; Kyoto was removed from the list because of its cultural importance and the intervention of Secretary of War Henry Stimson. On the day of the Hiroshima mission, the weather was clear. Three days later, the crew assigned to bomb Kokura found the city obscured by smoke and cloud. They diverted to the secondary target, Nagasaki, where a brief break in the clouds sealed the city’s fate.

The B-29 Enola Gay released Little Boy over Hiroshima on August 6, 1945. The uranium bomb detonated roughly 1,900 feet above the city, releasing energy equivalent to about 15,000 tons of TNT. The destruction was nearly total: temperatures reached thousands of degrees, and tens of thousands of people died in an instant. By the end of the year, some 140,000 had perished from the blast, burns, and radiation. Three days later, Fat Man, the more powerful plutonium bomb, struck Nagasaki. Its explosion was even larger, but the city’s hills confined the area of destruction; an estimated 70,000 people had died there by year’s end.

The double shock of the bombings and the Soviet Union’s declaration of war shattered the deadlock among Japan’s leaders. Emperor Hirohito himself intervened, ordering surrender to end the nation’s suffering. World War II ended within weeks, sparing Japan a prolonged invasion and the looming threat of mass starvation as its supply lines and infrastructure collapsed.

The legacy of the atomic bomb remains one of history’s most contested subjects. Some argue the bomb saved countless lives by shortening the war; others emphasize the immense human cost and the moral weight of the decision. What is clear is that the Manhattan Project ushered in the nuclear age, triggering an arms race and an enduring global debate over the morality of weapons of mass destruction. The choices made during those critical years still shape military strategy and foreign policy today.