
The Last 1000 Years of Scientific Breakthroughs - A Historical Overview

 
A Thousand Years of Discovery

Introduction to A Thousand Years of Discovery

Try to imagine what your world would be like if not for the scientific discoveries of the past millennium. If you are reading this indoors, you wouldn't have artificial lights. A thousand years ago, the best you could have done was some candles or torches. Is the building you are sitting in heated or air-conditioned? You wouldn't have those luxuries, either. A thousand years ago the only heat people had to stave off the cold of winter came from an inefficient fireplace, and in the summer they could find relief from the heat only by fanning themselves.

Look at your clothes. Anything that isn't cotton, wool, silk, or leather is the result of the chemical industry that came into being in the 1900's. And the food you eat is the result of enormous advances in our understanding of agriculture. Finally, if things today were still the way they were 1,000 years ago, you would be lucky to live past age 25. Medical science and public health are both developments of the last millennium.

It's hard for most people to imagine a world without constant progress in science and technology. Society has become so used to constant progress that it is now taken for granted, and we forget the long, hard path our predecessors followed to give us the modern world. As we approach a new millennium, it may be instructive to pause and consider how science developed and grew. Doing so might give us a stronger appreciation for the contributions that science has made to everyday life.

Science In the Ancient World

Science is not entirely a recent development. In the ancient world, Greek and Roman civilization reached a high level of scientific sophistication and spread its influence throughout most of southern Europe and the Middle East. For example, around A.D. 150, the Greek astronomer Ptolemy, who lived and worked in Alexandria, Egypt, took centuries of astronomical observations and combined them into a single view of the universe. Ptolemy's scheme envisioned the planets moving on crystal spheres around a stationary Earth. Although this idea was completely wrong, it helped ancient astronomers predict eclipses and the positions of the planets. In fact, Ptolemy's theory explained the universe as well as it could be comprehended at that time, and it was accepted as authoritative throughout Europe until the mid-1500's.

Other Greeks made advances in the fields of mathematics and medicine. Most of the geometry that is taught today in high school was systematized in the works of the Greek mathematician Euclid in about 300 B.C. In medicine, the great Greek physician Hippocrates showed in the 400's B.C. that disease had natural, rather than divine, causes. And another famous Greek physician, Galen, who practiced medicine in Rome in the A.D. 100's, experimented on animals to develop theories on anatomy, disease, and physiology that influenced doctors for nearly 1,500 years.

Despite the influence of Ptolemy and other Greek thinkers, the Romans, who dominated the Mediterranean world from the 200's B.C. to the A.D. 400's, were less interested in pure science than in architecture and engineering. Concerned mostly with pragmatic (practical) problem solving, the Romans became known as great builders of roads and aqueducts.

Internal power struggles began to weaken the Roman Empire in the late A.D. 300's, causing it to be split into the West Roman Empire and the East Roman Empire. By the mid-400's, Germanic peoples had invaded much of the West Roman Empire, which they divided into several kingdoms. The decline of the West brought an end to scientific inquiry, and intellectual pursuits were soon confined to the study of theology.

Keeping the Flame Alive

As scientific curiosity stagnated in Europe during these times—often called the Dark Ages—the Arab world kept the flame of learning alive, especially in the areas of astronomy and mathematics. Arabs in the Middle East preserved much of the ancient Greco-Roman science and accurately translated many Greek and Roman texts into Arabic. In the 800's, the Arab mathematician al-Khowarizmi organized and expanded algebra to include a number system developed in India. This system used place values and also introduced the concept of zero.

Arabs also made advances in the field of medicine. One prominent Arab physician, Avicenna, who lived from 980 to 1037, accurately described meningitis, tetanus, and other diseases and published his findings in a medical encyclopedia titled the Canon of Medicine.

In the Far East, China pursued its own course in science and technology. Observations made by Chinese astronomers over hundreds of years proved invaluable to Western astronomers more than 1,000 years later in tracing long-term occurrences of such celestial phenomena as comets, eclipses, meteor showers, and sunspots. During the Song dynasty, which ruled from 960 to 1279, the Chinese invented gunpowder, the magnetic compass, and movable type for printing.

Meanwhile, in Europe, the Dark Ages were not a period of complete ignorance. Despite the stagnation of science, several technological advances did occur. The moldboard plow, which allowed the cultivation of the heavy soils of northern Europe, the mechanical clock, and the widespread use of water wheels and windmills were all developments of the Middle Ages.

The Rebirth of Scientific Inquiry

In the 1100's, Europeans began to rediscover the works of the ancient Greeks and Romans that had been preserved by the Arabs and by Western monasteries. Many Arabic texts—which included original works as well as Arabic versions of ancient writings—were translated into Latin, the language of learning in the West. As a result, the writings of such scholars as Ptolemy once again became known in the West after an absence of hundreds of years.

Many historians credit the Spanish King Alfonso X, known as Alfonso the Wise, with promoting the rebirth of scientific thought in Europe in the 1200's. Among the scientific advances made by Alfonso's court scholars was a table of planetary motions based on Arabic sources but updated by Spanish astronomers.

The intellectual climate of Europe had changed radically by the 1400's with a revival of learning called the Renaissance, a cultural movement that began in Italy during the early 1300's and spread to other European nations by the late 1400's. During this period, a new breed of researchers—engineers in the mold of the ancient Romans—who were less interested in abstract knowledge than in solving specific problems, began to appear. If one had to point to a particular place and person that characterized this rebirth of pragmatism, a good case could be made for Milan, Italy, in the early 1500's and the work of an engineer and mathematician named Niccolo Tartaglia.

In the early 1500's, artillery was coming into widespread military use in Europe, and the Duke of Milan asked a simple question: “At what angle should I set the barrel of a cannon to get maximum range for a given amount of powder?” To find an answer, Tartaglia did not pore over ancient texts in search of information that might help him solve the mathematical aspects of the problem. Rather, he opted for a more direct approach. Tartaglia took a cannon into a field and fired it at a number of different angles. After analyzing the results, Tartaglia determined that the maximum range was attained when the cannon barrel was at an angle of 45 degrees. The experiment was one of the first in which physical evidence, rather than the authority of ancient texts, was used to settle a question. At a time when the rediscovered teachings of the Greek and Roman philosophers were unquestioned by most Europeans, Tartaglia's experiment represented an important first step toward modern scientific investigation.
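In modern terms, the drag-free range of a projectile launched at speed v and angle θ is v²·sin(2θ)/g, which peaks at exactly 45 degrees. A quick numerical sketch confirms Tartaglia's result (the muzzle speed of 300 m/s is purely an illustrative assumption):

```python
import math

def range_m(speed, angle_deg, g=9.81):
    """Drag-free projectile range on level ground, in meters."""
    theta = math.radians(angle_deg)
    return speed**2 * math.sin(2 * theta) / g

# Try every whole-degree barrel angle with the same muzzle speed.
speed = 300.0  # m/s, an illustrative value
best = max(range(1, 90), key=lambda a: range_m(speed, a))
print(best)                          # → 45
print(round(range_m(speed, best)))   # → 9174 (meters)
```

Real cannonballs experience air resistance, so the true optimum angle is somewhat below 45 degrees; Tartaglia's field measurement was nonetheless a remarkable approximation.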

Galileo and Copernicus

The rebirth of science in Europe continued to grow in the mid-1500's with the work of Nicolaus Copernicus, one of the great names in the history of astronomy. Copernicus was an administrative official at the Cathedral of Frauenburg in Poland, and he also pursued astronomy as a hobby. He had a small observatory in which he was able to study the night sky, but his most important work was using Ptolemy's records of the varying positions of the planets and stars to develop a new theory of the universe. In the Copernican theory, the sun, and not the Earth, was at the center of the universe. His book, "On the Revolutions of the Heavenly Spheres" (1543), was itself to cause a revolution. But revolutionary or not, there was no denying the fact that the Earth did indeed revolve about the sun. By the early 1600's, astronomers such as Tycho Brahe and Johannes Kepler had built on Copernicus's work to explain the motions of the planets.

One of the greatest scientists of that period, the Italian astronomer and physicist Galileo Galilei, studied the heavens with the newly invented telescope and also became convinced that Copernicus was right. Galileo, enlarging on the new approach pioneered by Tartaglia, also conducted many painstaking experiments in physics, for which he is referred to as the father of experimental science.

The systematic approach he pioneered for acquiring knowledge about the natural world developed into the scientific method, the cornerstone of modern science. Following the scientific method, if we want to learn something about how the world works, we must either make detailed observations of a natural phenomenon or conduct laboratory experiments. Once we have accumulated a sufficient body of information about the question being investigated, it should be possible to summarize the information in a compact set of statements, or laws, from which an overall theory—usually based on mathematics—can be formulated. The final step in the cycle is to use this theory to make predictions about things that have never been measured, and then to conduct further experiments and see if those predictions are correct. Thus, the scientific method is a way of approaching the truth and disproving incorrect notions about the world.

Galileo employed this questioning attitude in his investigation of the motion of objects being accelerated by gravity. Before the 1600's, people believed that if two objects of different weights were dropped from the same height at the same moment, the heavier object would fall faster and hit the ground first. Galileo conducted various experiments and concluded that all objects fall at the same speed (disregarding air resistance), and as they fall their speed increases at a regular rate. His results became known as the law of falling bodies.
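In modern notation, Galileo's result says that, ignoring air resistance, a falling object's speed grows in proportion to time (v = gt) and the distance fallen grows with the square of time (d = gt²/2). A short sketch, assuming the standard value g ≈ 9.81 m/s² near Earth's surface:

```python
g = 9.81  # m/s^2, acceleration due to gravity near Earth's surface

for t in range(1, 5):            # seconds after release
    speed = g * t                # speed increases at a regular rate
    distance = 0.5 * g * t**2    # distance grows as the square of time
    print(f"{t}s: speed {speed:.1f} m/s, fallen {distance:.1f} m")
```

Note that the object's weight appears nowhere in these formulas, which is exactly Galileo's point: heavy and light objects fall alike.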

The Contributions of Newton

Although Galileo laid the foundation for the scientific method, it was the English physicist and mathematician Isaac Newton who developed this investigative method to its complete form. For this reason, historians of science consider Newton one of the greatest scientists who ever lived. Thanks to Newton, when scientists in the 1800's began investigating the properties of electricity, magnetism, and heat, they didn't have to start from scratch—they knew how to proceed.

Newton's work included extensive experiments with light, in which he demonstrated that white light is a mixture of a spectrum of colors. But his most important discovery was that the forces and laws that operate on Earth also govern the motions of heavenly bodies.

Newton recounted that he came to this understanding quite suddenly while he was walking in an apple orchard and saw an apple fall from a tree. In the daylight sky beyond, he saw the moon. He realized that for the moon to stay in orbit about the Earth, a force had to be acting upon it, and that for the apple to fall, a force had to be acting on it as well. His conclusion, that in both cases the same force—gravity—was involved, was the foundation of what became known as Newton's law of universal gravitation. Newton's discoveries, which also included laws explaining other forms of motion, were published in 1687 in a book called the Principia Mathematica, which is considered one of the greatest single contributions to the development of science.

Newton's insight that a single force could explain the motions of both a falling apple and the orbiting moon was the beginning of our understanding of the solar system. Scientists for the first time could explain what keeps the planets in their orbits—the gravitational pull of the sun. If the sun's gravity were somehow turned off, the planets would fly off into space.
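Newton's law (F = Gm₁m₂/r²) can be checked against the moon itself: setting the gravitational pull equal to the centripetal force needed for a circular orbit predicts the moon's orbital speed and period. A rough sketch using commonly quoted values for the constants (treating the orbit as perfectly circular is a simplifying assumption):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of Earth, kg
R_MOON = 3.844e8     # mean Earth-moon distance, m

# For a circular orbit, gravity supplies the centripetal force:
# G*M*m/r^2 = m*v^2/r, so v = sqrt(G*M/r).
v = math.sqrt(G * M_EARTH / R_MOON)
period_days = 2 * math.pi * R_MOON / v / 86400

print(round(v))               # → 1018 m/s (observed: about 1,020 m/s)
print(round(period_days, 1))  # → 27.5 days, close to the observed lunar month of 27.3
```

That a formula inferred from falling apples reproduces the lunar month is precisely why Newton's unification was so persuasive.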

Newton's Legacy


Newton's legacy to science was a picture of a very ordered and predictable universe. This view spread through the Western culture of the 1700's, an era known as the Enlightenment or the Age of Reason, and it influenced not only the scientific world but the world at large. For example, when the framers of the United States Constitution debated how best to found a just and lasting nation, they worked within the Newtonian framework. They believed that there were natural, God-given laws governing the actions of human beings, just as there were laws controlling the motions of the planets, and they fashioned the Constitution to take those laws into account.

The Quest For Knowledge Continues

The 1700's were marked by a growing thirst for knowledge and scientific discovery. Using the scientific method, researchers in a number of fields, including such emerging disciplines as biology, geology, and chemistry, continued to enlarge science's understanding of the natural world.

In biology, researchers began cataloguing the incredible diversity of living things. The Swedish naturalist and botanist Carolus Linnaeus developed the first modern system for naming and classifying plants and animals. The scientific study of the Earth got its start with James Hutton, a Scottish geologist who is often called the father of modern geology. In 1795, Hutton published a theory that the Earth was immensely old and that the planet's features were constantly, though very gradually, changing. His theory challenged a belief that the Earth was just a few thousand years old and that only huge natural disasters, particularly the Great Flood recorded in the Bible, altered its appearance.

In chemistry, the English chemist Joseph Priestley and Swedish chemist Carl Scheele independently discovered the element oxygen. The French chemist Antoine Lavoisier conducted experiments on combustion, proving that it involved the combination of a substance with oxygen. Lavoisier's Elementary Treatise on Chemistry, published in 1789, was the first modern chemistry textbook.

Advances In the 1800's

Chemists made great strides in the 1800's. In 1803, the British chemist John Dalton proposed a theory that matter is made up of tiny units called atoms. Unlike the ancient Greek philosophers who had proposed the idea but made no attempt to demonstrate it experimentally, Dalton based his version of the atomic theory on observation. He and other chemists knew that most materials could be broken down by chemical reactions, such as burning, but that once a material was reduced to a certain point, it could be broken down no further. So although burning wood would yield charcoal, which is nearly pure carbon, no chemical reaction could break the carbon down further. Carbon was thus recognized as an elemental substance.

Dalton theorized that each chemical element—of which about 16 had then been identified—was made of a specific kind of atom. In his theory, atoms were indivisible but could be put together in different ways to make all of the materials in the world.

Chemists in the late 1800's, using Dalton's atomic theory, tried to discover the laws that govern the interactions of atoms. In 1869, the Russian chemist Dmitri Mendeleev and the German chemist Julius Lothar Meyer independently announced the discovery of a periodic law. They observed that when elements were listed according to their atomic weights, elements with similar properties appeared at regular intervals, or periods. From this finding, the chemists organized the elements into a periodic table of the elements that soon became a standard fixture in the world of chemistry. The table imposed order on the growing number of known elements, but why the elements could be arranged in such an orderly way was something the chemists of the 1800's could not explain. This problem would not be solved until the 1900's.

Meanwhile, great strides were made in the biological sciences. Beginning in the 1870's, Louis Pasteur, a French chemist, and Robert Koch, a German physician, established a new theory of disease. Through their studies, Pasteur and Koch proved that infections and many diseases are caused by microscopic organisms. This "germ theory" of disease revolutionized medicine and public health. Pasteur benefited humanity by applying this basic discovery to important practical problems. Among his major accomplishments was the use of heat to kill microbes in wine, beer, milk, and food, in a process that became known as pasteurization. Pasteur also demonstrated the value of vaccination against microbes and developed several vaccines, including a rabies vaccine.

Darwin and the Theory of Evolution

An event important to humanity's understanding of nature—and of itself—occurred in 1859. This was the publication of the book “On the Origin of Species” by the British naturalist Charles Darwin. Darwin theorized that the multitude of species on Earth had evolved from earlier species by means of a mechanism he called natural selection. He began his argument by noting that people could produce diversity within a given population of animals by selective breeding. Darwin, who was also a pigeon breeder, discussed at great length the different kinds of pigeons that breeders had produced by selective breeding of their stock over a period of just a few hundred years. He called this process artificial selection, and he argued that if human beings could produce such major changes over relatively short periods of time, then nature, acting over enormously longer spans of time, could produce even greater changes.

This process of natural selection, Darwin said, depends on two things: first, the fact that in any population there will always be natural variations among individuals and, second, that there is always competition among members of a species for food, mates, and other things necessary to survive and reproduce. Darwin argued that in any given environment, some members of a species will have certain genetically inherited traits that give them a slight advantage in surviving and reproducing. Because these individuals will tend to have more surviving offspring than the average for their species, over many generations the advantageous traits will come to be shared by more and more of the population. Through natural selection—often called the survival of the fittest—the entire population will eventually be made up of individuals that have the advantageous traits. If enough changes accumulate, then a new species will emerge.
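The arithmetic behind this argument is simple but powerful. A toy model, with an entirely illustrative 5 percent reproductive advantage, shows how quickly even a small edge can carry a rare trait through a whole population:

```python
# Deterministic toy model of natural selection. The numbers are
# illustrative assumptions, not measured values.
p = 0.01          # initial fraction of the population carrying the trait
advantage = 1.05  # carriers leave 5% more offspring per generation

generations = 0
while p < 0.99:
    # Carriers' share of the next generation, weighted by fitness.
    p = p * advantage / (p * advantage + (1 - p))
    generations += 1

print(generations)  # fewer than 200 generations, even for a 5% edge
```

Over the geological spans of time Darwin invoked, which run to millions of generations, even far smaller advantages suffice.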

Darwin's theory of evolution provided a coherent unifying theme for all the life sciences. Before Darwin, a researcher studying the distribution of butterfly species in Argentina and a scientist studying the distribution of fish in the Arctic Ocean had no common framework in which to discuss their work. The theory of evolution showed that all organisms are to some degree related because they all arose from the process of natural selection, a mechanism that continues to dominate life on Earth.

The idea of natural selection had a major impact on Western culture. It implied that human beings were produced by the same evolutionary process as other animals. In fact, some biologists said, humans are almost certainly descended from apes. Such a theory shocked many people in the 1800's. The majority of Europeans at that time believed that each species had been created by a separate divine act in the Garden of Eden and that human beings had been made in God's image. Although many theologians accepted the theory of evolution, some denounced it. Nonetheless, its acceptance spread quickly in the scientific community and among most educated lay people.

The Pace of Discovery Accelerates

By the late 1800's, scientific inquiry was exploding. The Scottish physicist James Clerk Maxwell extended the known laws of electricity and magnetism and combined them into a single unified theory. His equations, published in 1865, described how electric and magnetic fields arise and interact. In 1897, the British physicist Joseph John Thomson discovered that atoms contain negatively charged particles, which he called electrons. The following year, the Polish-born French physicist Marie Curie and her husband, the French physicist Pierre Curie, discovered the element radium, thereby advancing the study of radioactivity. In the late 1890's, astronomers were busily cataloguing stars through ever-larger telescopes, and biologists were developing an increasingly coherent view of life on Earth. Nevertheless, no one living at the close of the 1800's could have predicted the advances in science that would change the world in the century to come.

One of the first major scientific achievements of the 1900's came in 1905, when the German-born physicist Albert Einstein published his special theory of relativity. The theory, a major conceptual event in physics, is based on the notion that the laws of physics are the same in all frames of reference—that is, no matter where you are or how you are moving, you will see the same laws of physics operating. Einstein's theory was based on the idea that different observers of the universe would see a given event differently, depending on how they were moving, and it made predictions that ran against human intuition. For example, the theory predicted that a moving clock will appear to run slower to a stationary observer than to a person traveling with the clock. As Einstein's predictions passed scientific tests, his theory of relativity became as essential a part of physics as Newton's laws of motion and changed the way scientists looked at the universe.
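The time-dilation prediction can be stated precisely: a clock moving at a fraction v/c of the speed of light appears slowed by the Lorentz factor 1/√(1 − (v/c)²). A small sketch of the arithmetic:

```python
import math

def dilation_factor(v_fraction_of_c):
    """Factor by which a moving clock appears slowed
    to a stationary observer (special relativity)."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c**2)

# Each second on a clock moving at half the speed of light takes
# about 1.155 seconds on a stationary observer's clock.
print(round(dilation_factor(0.5), 3))  # → 1.155
print(round(dilation_factor(0.1), 3))  # → 1.005, barely noticeable at 10% of c
```

At everyday speeds the factor is indistinguishable from 1, which is why the effect defies intuition and went unnoticed before Einstein.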

Investigating the Atom

At the same time that Einstein was doing his most famous work—which also included a general theory of relativity, explaining gravity as a warping of space—other researchers were learning more about the atom. One important development took place in 1911 when the British physicist Ernest Rutherford performed an experiment that revolutionized scientists' understanding of atomic structure. Thomson had pictured atoms as tiny balls with electrons interspersed all through them. Rutherford showed, however, that an atom has a well-defined structure, with most of its mass locked inside a very tiny, positively charged nucleus around which the negatively charged electrons whirl at tremendous speeds.

Rutherford's experiments did not explain how electrons were arranged around the nucleus. It was not until 1913 that the Danish physicist Niels Bohr suggested that electrons are confined to particular orbits, or “shells” around the nucleus. Bohr also theorized that when an electron jumps from an outer orbit to an inner one, it emits a quantum, or unit, of energy. His theory became a part of quantum mechanics, a field of physics that explains the behavior of atoms and the subatomic particles of which they are made. Quantum mechanics explained why the elements could be arranged in an orderly way in the periodic table. According to quantum theory, similarities in the behavior of a group of elements result from similarities in the structure of the atoms of those elements.

From then on, much of the research in physics in the 1900's involved attempts to understand how an atom's nucleus is structured and how it works. Scientists established in the early 1900's that positively charged particles called protons are among the particles that make up the nucleus. Another particle, the uncharged neutron, was discovered in 1932. Beginning in the 1930's, researchers learned to conduct experiments in which they smashed fast-moving particles into nuclei and examined the debris to learn more about what's at the heart of an atom. They discovered many other kinds of subatomic particles besides protons and neutrons, and found evidence that these supposedly elementary particles are made of even smaller constituents. In 1964, the American physicists Murray Gell-Mann and George Zweig independently proposed that all nuclear particles are composed of smaller building blocks called quarks. By 1999, investigators had discovered evidence for six kinds of quarks, which combine in different ways to form all of the nuclear particles known to science.

New Knowledge About the Universe

Equally dramatic changes occurred in the 1900's to alter our perceptions of the universe. Astronomers had known since the first direct distance measurements of the 1830's that the stars are very distant, but they still had many questions about the heavens. In the 1920's, American astronomer Edwin Hubble, working at the Mount Wilson Observatory in California, was able to solve an old problem in astronomy. The puzzle concerned whether our own galaxy, the Milky Way, constitutes the entire universe or whether other "island universes" exist. By showing that many hazy "nebulae" are actually huge collections of stars far beyond the boundaries of the Milky Way, Hubble determined that our own galaxy is just one of billions in the universe.

Hubble also ascertained that these many other galaxies are moving away from us and from one another. This finding revealed that the universe is expanding, which indicated that it began in an explosive event—a “big bang”—at a specific time in the past. Astronomers later calculated the age of the universe to be between 10 billion and 20 billion years.
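The age estimate follows from the expansion itself: if the expansion rate (the Hubble constant, H₀) has been roughly constant, the universe's age is about 1/H₀. A back-of-the-envelope sketch, assuming the commonly quoted late-1990's value of about 70 km/s per megaparsec:

```python
# Rough "Hubble time" estimate: if the expansion rate has been
# roughly constant, the age of the universe is about 1/H0.
H0 = 70.0               # Hubble constant, km/s per megaparsec (illustrative value)
KM_PER_MPC = 3.086e19   # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

age_years = KM_PER_MPC / H0 / SECONDS_PER_YEAR
print(round(age_years / 1e9, 1))  # → 14.0 billion years
```

The result lands comfortably inside the 10-billion-to-20-billion-year range quoted above; the spread in that range reflected the era's uncertainty in measuring H₀ and in how the expansion rate has changed over time.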

Understanding the Molecular Basis of Life

Other scientific advances led to changes in our understanding of living systems. Even after Darwin offered a general framework within which to understand life, biologists continued well into the 1900's to be concerned with examining and cataloging organisms. But beginning in the late 1950's, that focus changed abruptly as life scientists realized that they could at last understand the molecular basis of life. Biologists knew, based on discoveries made by Gregor Mendel, an Austrian monk, in the mid-1800's, that physical characteristics are produced by basic hereditary units that transmit traits from generation to generation. But the event that led to a new turn in biology was the 1953 discovery of the structure of DNA (deoxyribonucleic acid—the molecule that genes are made of) by biologists James Watson of the United States and Francis Crick of Great Britain. Science's understanding of human genes exploded in the following years.

Advances in biology and medicine have increased the average life span by 50 percent since the beginning of the century, and by 1999, a new day in medicine—the era of genetic medicine—was in the offing. Genetic medicine, which focuses on the molecular basis of disease, may lead to gene therapy, or the replacement of faulty genes with new genes to cure many diseases. Discoveries in all the major fields of science have transformed human society during the 1900's. Physics, for example, has given us aviation, computer technology, nuclear power, space travel, and telecommunications, not to mention such everyday amenities as electricity.

Concerns and Hopes

Of course, not every scientific advance has been viewed as beneficial by all parts of society. For example, though scientists and engineers found ways to harness nuclear energy as a power source, critics of science contend that nuclear plants are an environmental hazard. Furthermore, they say, the development of the atomic bomb in the 1940's and the hydrogen bomb in the 1950's created a threat of global annihilation that existed for nearly 50 years. In the late 1990's, critics of science expressed considerable opposition to the newfound ability to clone (make exact genetic duplicates of) living creatures. This astonishing technology, though it promised great benefits in agriculture, could presumably be used to create human clones, a prospect viewed with horror by many people.

Despite such valid concerns, few people would deny that science has brought humanity great progress in the last millennium. And it can undoubtedly carry us to even greater heights in the next millennium. However, as history has shown, continued progress in science and technology is not guaranteed. Some observers in 1999 warned that science was being threatened by a rising tide of irrational beliefs among the public. Proponents of science argue that the quest for knowledge is necessary for the survival of civilization, and that humanity cannot take for granted that the future will be a continuation of the present. Only science, they say, can ensure that there will be no new Dark Age.