The only constant is change. At least, that’s what the Greek philosopher Heraclitus is credited with saying. And while science and philosophy don’t always go hand in hand, there is some truth to Heraclitus’ notion. Change is inevitable and, in some cases, necessary for our species to evolve. While some change happens automatically, like the tides going in and out, other changes have bloomed from scientific discoveries.
Using fire to cook food and keep warm propelled our ancestors toward the foundations of early settlements and continued the growth of civilization. Using fire to shape metals for weapons and building materials led to more and more discoveries and more and more advancements. While many advances shaped humanity, we’ve focused on ten significant scientific discoveries that changed the world.
The discovery of DNA didn’t so much change the world as it did our understanding of it — or rather, our understanding of life. The term DNA only came into use in the 20th century, though the molecule’s initial discovery dates back decades earlier, into the 19th century.
(Credit: Connect World/Shutterstock)
DNA is the molecule that encodes genetic information for all living things. It plays a key role in passing traits from parents to offspring and is the primary component of chromosomes in the cell nuclei of complex organisms.
Many people think scientists James Watson and Francis Crick discovered DNA in the 1950s. Nope, not so fast. DNA was actually first discovered in 1869 by Swiss physician Friedrich Miescher, who identified what he referred to as “nuclein” in blood cells. Several other researchers continued working to characterize DNA in the decades between Miescher and Watson and Crick.
The term nuclein eventually evolved into what we know as DNA, the shorthand for deoxyribonucleic acid. German biochemist Albrecht Kossel, who would later go on to win the Nobel Prize, is often credited with the name.
Other scientists, such as Phoebus Levene, built on Miescher’s work over the years. Levene didn’t know how DNA’s nucleotide components were arranged. He proposed the polynucleotide model, correctly suggesting that nucleic acids are chains of nucleotides, each with a base, a sugar, and a phosphate group.
What Watson and Crick’s groundbreaking “discovery” in the field of genetics accomplished was accurately identifying DNA’s double-stranded helix structure, its strands connected by hydrogen bonds. For it, the pair won a Nobel Prize in 1962 and worldwide acclaim.
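That pairing rule (A with T, C with G, held together by hydrogen bonds) means the sequence of one strand fully determines the other. A minimal sketch in Python illustrates the idea; the example sequence here is invented:

```python
# Watson-Crick base pairing: A pairs with T, and C pairs with G.
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement_strand(strand: str) -> str:
    """Return the base-paired partner of a DNA strand."""
    return "".join(PAIRS[base] for base in strand)

print(complement_strand("ATGC"))  # -> TACG
```

Given either strand of the double helix, the cell (or this little function) can reconstruct the other, which is what makes DNA copying possible.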
Though Watson and Crick won a Nobel Prize, years later, we’ve learned that the duo likely took research without permission from chemist Rosalind Franklin. Thanks to her research, the double helix structure was realized, though her Nobel Prize was not.
In 2014, Watson auctioned off his Nobel Prize medal for over $4 million. The buyer was a Russian billionaire who returned it to Watson a year later. In 2019, Watson was stripped of his honorary titles because of racist comments.
While it may be common knowledge that Earth spins on an axis and revolves around the sun, at one point, this idea was extremely outlandish. How could the planet move and we not feel it? Thanks to a few clever scientists, the Earth in Motion theory became more than a wild idea.
Earth in motion refers to the understanding that Earth is not stationary but moves in different ways. Earth rotates on an axis and revolves around a star.
Earth rotates on its axis, which is an imaginary line running from the North Pole to the South Pole. This rotation is responsible for the day-night cycle, with one complete rotation taking about 24 hours.
Earth revolves around the Sun, completing one orbit approximately every 365 days. This revolution, combined with the tilt of the Earth’s axis, leads to the changing seasons.
The discovery and acceptance of Earth’s motion was a gradual process involving several key figures in the history of science.
An ancient Greek astronomer, Aristarchus of Samos, was one of the first to suggest that Earth orbits the Sun. This view was not widely accepted in his time, as it was believed that Earth was the center of the universe and that the stars, planets, and sun all revolved around our planet.
Mathematician and astronomer Nicolaus Copernicus is often credited with proposing the first heliocentric model of the universe. In 1543, he published his great work, On the Revolutions of the Heavenly Spheres, which explained his theories.
Among them was the idea that day and night were created by the Earth spinning on its axis. Copernican heliocentrism replaced the conventionally accepted Ptolemaic theory, which asserted that the Earth was stationary. Copernicus’ work was largely unknown during his lifetime but later gained support.
Galileo Galilei agreed with Copernicus’ theory and supported it through his telescopic observations. In 1610, he observed the phases of Venus, which showed that Venus orbits the sun, as well as the moons of Jupiter. Both were strong evidence against the Earth-centered model of the universe.
German mathematician Johannes Kepler formulated a series of laws detailing the orbits of planets around the Sun. These laws, which remain relevant today, provided mathematical equations for accurately predicting planetary movements in line with the Copernican theory.
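One of those laws, Kepler’s third, says the square of a planet’s orbital period is proportional to the cube of its average distance from the Sun. A minimal sketch (distances rounded; Mars’ orbit is approximated):

```python
def orbital_period_years(semi_major_axis_au: float) -> float:
    """Kepler's third law: T^2 = a^3 when T is in Earth years
    and a is in astronomical units (AU), so T = a ** 1.5."""
    return semi_major_axis_au ** 1.5

# Mars orbits at roughly 1.52 AU from the Sun.
print(round(orbital_period_years(1.52), 2))  # about 1.87 years
```

The predicted period of about 1.87 years matches the observed Martian year of roughly 687 days, which is the kind of agreement that won the Copernican model its support.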
According to researchers at the California Institute of Technology (Caltech), Earth spins smoothly and at a consistent speed. If Earth were to change speeds at any time, we’d feel it.
It’s a common misconception that Ben Franklin discovered electricity with his famous kite experiment. But his 1752 experiment, which used a key and kite, instead demonstrated that lightning is a form of electricity. Another myth is that Franklin was struck by lightning. He wasn’t, but the storm did charge the kite.
Back in 600 B.C.E., it was the ancient Greek philosopher Thales of Miletus who first observed static electricity when fur was rubbed against fossilized tree resin, known as amber.
British scientist and doctor William Gilbert coined the word “electric,” derived from the Greek word for amber. Regarded as the “father of electricity,” Gilbert was also the first person to use the terms magnetic pole, electric force, and electric attraction. In 1600, his six-volume book set, De Magnete, was published. Among other ideas, it included the hypothesis that Earth itself is a magnet.
Advancements in the field of electricity continued to build on these discoveries. In 1800, Italian physicist Alessandro Volta created the first voltaic pile, an early form of the electric battery.
Germ theory is a scientific principle in medicine that attributes the cause of many diseases to microorganisms, such as bacteria and viruses, that invade and multiply within the human body. This theory was a significant shift from previous beliefs about disease causation.
Louis Pasteur provided key evidence for germ theory when he demonstrated that living microorganisms caused fermentation, which could make milk and wine turn sour. From there, his experiments revealed that these microbes could be destroyed by heating — a process we now know as pasteurization.
This advance was a game changer, saving people from getting sick from the bacteria in unpasteurized foods, such as eggs, milk, and cheeses. Before Pasteur, everyday people and scientists alike believed that disease arose from inside the body.
Pasteur’s work showed instead that disease was the result of microorganisms attacking the body. Because of him, attitudes gradually shifted toward accepting germ theory.
The German physician and microbiologist Robert Koch played a crucial role in establishing a systematic methodology for proving the causal relationship between microbes and diseases.
He formulated Koch’s postulates and applied these principles to identify the bacteria responsible for tuberculosis and cholera, among other diseases.
Together, Pasteur and Koch laid the foundation for bacteriology as a science and dramatically shifted the medical community’s understanding of infectious diseases. Their work led to improved hygiene, the development of vaccines, and the advancement of public health measures.
Illustration: Alison Mackey/Discover, Apple: Thinkstock
Isaac Newton didn’t really get hit on the head with an apple, as far as we know. But seeing an apple fall from a tree did spark an idea that would lead the mathematician and physicist to formulate his theory of gravity at the age of just 23.
He pondered why the force pulls objects straight to the ground, as opposed to along a curved path, like a fired cannonball. Gravity was the answer — a force that pulls objects toward each other.
The greater an object’s mass, the greater its gravitational pull; the farther apart two objects are, the weaker the force between them. Newton’s work and his understanding of gravity are used to explain everything from the trajectory of a baseball to the Earth’s orbit around the sun. But Newton’s discoveries didn’t stop there.
In 1687, Newton published his book Principia, which expanded on his laws of universal gravitation and his three laws of motion. His work laid the foundation for modern physics.
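In modern notation, Newton’s law of universal gravitation is F = G · m1 · m2 / r². A minimal sketch of the relationship, using rounded textbook values for Earth’s mass and radius:

```python
G = 6.674e-11  # gravitational constant, in N*m^2/kg^2

def gravitational_force(m1_kg: float, m2_kg: float, distance_m: float) -> float:
    """Newton's law of universal gravitation: F = G * m1 * m2 / r^2.
    Force grows with mass and falls off with the square of distance."""
    return G * m1_kg * m2_kg / distance_m ** 2

# Force between Earth (about 5.97e24 kg) and a 1 kg object at Earth's
# surface (radius about 6.371e6 m): roughly 9.8 N, the object's weight.
print(gravitational_force(5.97e24, 1.0, 6.371e6))
```

The same formula, with different masses and distances plugged in, covers both the falling apple and the Moon’s orbit, which is exactly the universality Newton was pointing at.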
In 1915, Einstein proposed the theory of general relativity. This theory redefined gravity not as a force but as a curvature of spacetime caused by the presence of mass and energy.
According to Einstein, massive objects cause a distortion in the fabric of space and time, similar to how a heavy ball placed on a trampoline causes it to warp. Other objects move along the curves in spacetime created by this distortion.
Both Newton and Einstein significantly advanced our understanding of gravity. Their theories marked critical milestones in the field of physics and have had far-reaching implications in science and technology.
Read More: 5 Eccentric Facts About Isaac Newton
Much like germ theory revolutionized modern medicine, so too did the discovery of antibiotics, which would go on to save countless lives.
According to the Microbiology Society, humans have used some form of antibiotics for millennia. But only in recent history did we realize that bacteria cause certain infections, and only then could we develop readily available treatments for them.
In 1909, German physician Paul Ehrlich noticed that certain chemical dyes did not stain some bacterial cells as they did others. Because of this, he believed it would be possible to kill certain bacteria without harming the cells around them. Ehrlich went on to discover a treatment for syphilis, which many in the scientific community consider the first antibiotic. Ehrlich himself, however, called his discovery chemotherapy, because it used chemicals to treat a disease. For his discoveries, Ehrlich is referred to as the “Father of Immunology.”
Ukrainian-American microbiologist Selman Waksman coined the term “antibiotic” about 30 years later, according to the Microbiology Society.
One of the most recognizable antibiotics today is penicillin, which health professionals prescribe to millions of patients each year. Yet this well-known antibiotic was discovered by accident.
In 1928, after some time away from the lab, Alexander Fleming — a Scottish microbiologist — discovered that the mold Penicillium notatum had contaminated a culture plate of Staphylococcus bacteria. Fleming noticed that the mold had created bacteria-free areas on the plate. After multiple trials, Fleming successfully showed that P. notatum prevented the growth of Staph. Soon the antibiotic was ready for mass production and helped save many lives during World War II.
Penicillin is used to treat infections caused by bacteria. The medication works by stopping the growth of the bacteria.
The Big Bang Theory is one of the most widely accepted explanations for the beginning of the universe. The theory holds that about 13.7 billion years ago, all the matter in the universe was condensed into a single small point. From that point, the contents of the universe burst forth and expanded, and they continue to expand today.
The first version of the idea came from Georges Lemaître, a Belgian cosmologist and Catholic priest. In 1927, Lemaître published a paper on general relativity and solutions to its equations, though it mostly went unnoticed. Many scientists at the time did not believe that the universe was expanding, but a group of cosmologists was beginning to go against the grain. After Edwin Hubble noticed that galaxies farther from our own seemed to be moving away faster than those closer to us, the idea of an expanding universe started to make more sense. Lemaître’s 1927 paper gained recognition, and in 1931 he proposed that the universe began from a “primeval atom,” the concept now known as the Big Bang; the name itself was coined years later by astronomer Fred Hoyle.
Edwin Hubble’s discovery that galaxies recede faster the farther away they are, dubbed Hubble’s Law, is on a long list of his many discoveries. Though it helped add evidence to the Big Bang Theory, the work was hindered by the same thing that had hampered telescopes since their inception: Earth’s atmosphere. According to NASA, Earth’s atmosphere distorts light, limiting how far a telescope can see, even on a clear night.
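Hubble’s Law is usually written v = H0 × d: recession velocity equals the Hubble constant times distance. A minimal sketch, assuming a Hubble constant of 70 km/s per megaparsec (measured values fall roughly between 67 and 74):

```python
HUBBLE_CONSTANT = 70.0  # km/s per megaparsec (assumed round value)

def recession_velocity_km_s(distance_mpc: float) -> float:
    """Hubble's Law: v = H0 * d. Farther galaxies recede faster."""
    return HUBBLE_CONSTANT * distance_mpc

# A galaxy 100 megaparsecs away recedes at about 7,000 km/s.
print(recession_velocity_km_s(100.0))  # -> 7000.0
```

The linear relationship is the whole point: doubling a galaxy’s distance doubles its recession speed, which is exactly what an expanding universe predicts.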
Because of this, researchers, specifically Lyman Spitzer, suggested putting a telescope in orbit, above Earth’s atmosphere. After a few attempts in the 1960s and ’70s, NASA, with contributions from the European Space Agency (ESA), launched a space telescope on April 24, 1990. The Hubble Space Telescope, named for the pioneering astronomer, was the most powerful telescope available to humankind until the 2021 launch of the James Webb Space Telescope.
The Big Bang emitted large amounts of primeval light, according to the ESA. Over time, this light “cooled” and was no longer visible. However, researchers are able to detect what is known as Cosmic Microwave Background (CMB), which is, according to the ESA, the cooled remnant of the first light to travel through the universe. Some researchers even refer to CMB as an echo of the Big Bang.
Read More: Did the Big Bang Happen More Than Once?
“An ounce of prevention is worth a pound of cure,” Benjamin Franklin once said. At the time, the statement applied to making towns safer against fires, but it holds just as true for health and wellness. The advent of vaccines has helped prevent several serious diseases and keep people safe. Thanks to vaccines, people rarely get diseases like polio, and smallpox has been eradicated.
According to the Centers for Disease Control and Prevention (CDC), a vaccine is a method of protection that introduces a small amount of a disease-causing agent to the human body so that the body can form an immune response should that disease try to enter the body again.
Basically, through a vaccine, the human body is exposed to a small amount of a disease so that the immune system can build a defense against it.
According to the World Health Organization (WHO), Dr. Edward Jenner created the first vaccine in 1796 using infected material from a cowpox sore — cowpox being a disease similar to smallpox. He inoculated an 8-year-old boy named James Phipps with the material and found that the boy, though he didn’t feel well at first, recovered from the illness.
A few months later, Jenner tested Phipps with material from a smallpox sore and found that Phipps did not get ill at all. From there, the smallpox vaccine prevented countless deaths in the centuries to come.
From 1796 to 1945, doctors and scientists worked hard to create vaccines for other serious illnesses, like yellow fever and influenza. One of these doctors was Jonas Salk. After Salk helped develop an influenza vaccine in 1945, he began working on a polio vaccine. Between 1952 and 1955, Salk finished the vaccine, and clinical trials began. Salk’s vaccination method required a needle and syringe, though by 1960, Albert Sabin had created a different delivery method for the polio vaccine: his version could be administered as drops or on a sugar cube.
Read More: The History of the Polio Vaccine
Evolution is the theory that organisms change and adapt to their environment at the genetic level from one generation to the next. The process can take millions of years and works through mechanisms such as natural selection. An animal’s color or beak may change over time with shifts in its environment, helping it hide from predators or better capture prey.
After studying animals in the Galapagos, particularly the finches, the naturalist Charles Darwin determined that the birds — which resided on different Galapagos islands — were the same or similar species but had distinct characteristics. Darwin noted that the finches from each island had different beaks, suited to foraging for the main food source on their specific island. Some had larger beaks for cracking open nuts and seeds, while others had smaller, narrower beaks for finding insects.
These observations earned Charles Darwin the title of the Father of Evolution. Though the theory of evolution has changed since Darwin published On the Origin of Species in 1859, he helped lay the framework for modern scientists.
For thousands of years, the long-held belief was that the world and all of its organisms were created by a single power. But as science has advanced, clear evidence has emerged to argue against that.
So is evolution a fact or a theory? The answer is complicated, because it is both. According to the National Center for Science Education, scientific understanding needs both theories and facts: there is proof that organisms have changed, or evolved, over time, and scientists now have the means to study and identify how those changes happen.
(Credit: Nathan Devery/Shutterstock)
According to the National Human Genome Research Institute, CRISPR stands for Clustered Regularly Interspaced Short Palindromic Repeats. Researchers use this technology to modify the DNA of a living organism.
The discovery of CRISPR involved several people and decades of research. They include Yoshizumi Ishino, Francisco Mojica, and the duo who won the 2020 Nobel Prize in Chemistry for their work on CRISPR, Jennifer Doudna and Emmanuelle Charpentier.
CRISPR is a technology that can edit genes or even turn a gene “on” or “off.” Researchers have described CRISPR as molecular scissors that clip apart DNA, then replace, delete, or modify genes. According to a 2018 study, scientists can use this technology to help replace certain genes that may cause diseases such as cancer or heritable conditions like Duchenne muscular dystrophy — a degenerative disorder that can cause premature death.
In short, scientists use CRISPR technology to find specific pieces of DNA inside a cell, then alter that piece of DNA or replace it with a different sequence. Paired with a gene drive, CRISPR can also help ensure that the changed gene passes on to offspring.
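As a loose analogy, that find-and-replace step can be sketched in a few lines of Python. Real CRISPR editing involves guide RNA, the Cas9 enzyme, and far more biological machinery; the sequences below are invented for illustration:

```python
def edit_gene(genome: str, target: str, replacement: str) -> str:
    """Toy model of CRISPR-style editing: locate a target DNA
    sequence in a genome and swap in a replacement sequence."""
    position = genome.find(target)
    if position == -1:
        return genome  # target sequence not present; nothing to edit
    return genome[:position] + replacement + genome[position + len(target):]

genome = "ATCGGATTACAGGCT"
print(edit_gene(genome, "GATTACA", "GATCACA"))  # -> ATCGGATCACAGGCT
```

The hard part in the lab is not the replacement itself but finding the one matching site among billions of base pairs, which is the job of the guide RNA.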
This article was originally published on Oct. 22, 2021 and has since been updated with new information from the Discover staff.