Scientific method

The scientific method is how scientists investigate phenomena and acquire new knowledge. It is based on observable, empirical, measurable evidence, subject to the principles of reasoning[1]. Scientists propose hypotheses to explain phenomena, and test those hypotheses by examining the evidence from experimental studies. Scientists also formulate theories that encompass whole domains of inquiry, and which bind hypotheses together into logically coherent wholes.

"Science is a way of thinking much more than it is a body of knowledge." (Carl Sagan)[2]

If the purpose of scientific methodology is to prescribe or expound a system of enquiry or even a code of practice for scientific behavior, then scientists seem to be able to get on very well without it. Most scientists receive no tuition in scientific method, but those who have been instructed perform no better as scientists than those who have not. Of what other branch of learning can it be said that it gives its proficients no advantage; that it need not be taught or, if taught, need not be learned?

Peter Medawar[3]

Elements of scientific method

According to Charles Darwin,

"...science consists in grouping facts so that general laws or conclusions may be drawn from them."[4]

Darwin's simple account raises many questions. What do we mean by ‘facts’? How much can we trust our senses to enable us to believe that what we see is true? How do scientists ‘group’ facts? How do they choose which facts to pay attention to, and is it possible to do this in an objective way? And having done this, how do they draw any broader conclusions? How can we know more than we observe directly?

We live in a world that is not directly understandable. We sometimes disagree about the ‘facts’ we see around us, and some things in the world are at odds with our understanding. What we call the “scientific method” is an account of how scientists gather and report observations in ways that will be understood by others and accepted as valid evidence, and how they construct explanations that will be consistent with the world, withstand logical and experimental scrutiny, and provide the foundations for further increases in understanding.

The English philosopher Francis Bacon is often credited as the pioneer of the modern scientific method. He proposed that scientists should "empty their minds" of self-evident truths and, by 'observation and experimentation', should generate hypotheses by a process known as induction.[5] Bacon described many of the now accepted principles, underscoring the importance of theory, empirical results, data gathering, experiment, and independent corroboration. He nevertheless recognised that interpreting nature needs something more than observation and reason:

...the universe to the eye of the human understanding is framed like a labyrinth, presenting as it does on every side so many ambiguities of way, such deceitful resemblances of objects and signs, natures so irregular in their lines and so knotted and entangled. And then the way is still to be made by the uncertain light of the sense, sometimes shining out, sometimes clouded over, through the woods of experience and particulars; while those who offer themselves for guides are (as was said) themselves also puzzled, and increase the number of errors and wanderers. In circumstances so difficult neither the natural force of man's judgement nor even any accidental felicity offers any chance of success. No excellence of wit, no repetition of chance experiments, can overcome such difficulties as these. Our steps must be guided by a clue... [6]


The success of science, as measured by the technological achievements that have changed our world, has led many to conclude that this must reflect the success of some methodological rules that scientists follow. However, not all philosophers accept this conclusion; notably, the philosopher Paul Feyerabend denied that science is genuinely a methodological process. In his book Against Method he argued that scientific progress is not the result of applying any particular rules.[7] Instead, he concluded that 'anything goes', or almost so: for any particular 'rule' there are abundant examples of successful science that have proceeded in a way that seems to contradict it.[8] To Feyerabend, there is no fundamental difference between science and other areas of human activity characterised by reasoned thought. A similar sentiment was expressed by T.H. Huxley in 1863: "The method of scientific investigation is nothing but the expression of the necessary mode of working of the human mind. It is simply the mode at which all phenomena are reasoned about, rendered precise and exact." [9]

Nevertheless, in the Daubert v. Merrell Dow Pharmaceuticals [509 U.S. 579 (1993)] decision, the U.S. Supreme Court accorded a special status to 'The Scientific Method', in ruling that "… to qualify as 'scientific knowledge' an inference or assertion must be derived by the scientific method. Proposed testimony must be supported by appropriate validation - i.e., 'good grounds', based on what is known." The Court also stated that "A new theory or explanation must generally survive a period of testing, review, and refinement before achieving scientific acceptance. This process does not merely reflect the scientific method, it is the scientific method."[10]

Hypotheses and theories

The man of science must work with method. Science is built up of facts, as a house is built of stones; but an accumulation of facts is no more a science than a heap of stones is a house. Most important of all, the man of science must exhibit foresight.

Henri Poincaré[11]

A hypothesis is a proposed explanation of a phenomenon. It is an “inspired guess”, a “bold speculation”, embedded in current understanding yet going beyond it to assert something that we do not know for sure, as a way of explaining something not otherwise accounted for. Scientists use many different means to generate hypotheses, including their own creative imagination, ideas from other fields, and induction. Charles Sanders Peirce described the incipient stage of inquiry, in which the "irritation of doubt" prompts a plausible guess, as abductive reasoning. The history of science is full of stories of scientists claiming a "flash of inspiration" which motivated them. One of the best known is Kekulé's account that the structure of benzene came to him in a dream, in which rows of atoms wound like serpents before him; one of the serpents seized its own tail and "the form whirled mockingly before my eyes. I came awake like a flash of lightning. This time also I spent the remainder of the night working out the consequences of the hypothesis".[12]

A scientific hypothesis is something that has consequences: it leads to predictions, and these can be tested by experiments. If the predictions prove wrong, the hypothesis is discarded; otherwise it is put to further test. If it resists determined attempts to disprove it, then it might come to be accepted, at least for the moment, as plausible.

Popper and Kuhn

The philosopher Karl Popper, in The Logic of Scientific Discovery, a book that Sir Peter Medawar called "one of the most important documents of the 20th century", argued that this 'hypothetico-deductive' method was the only sound way by which science makes progress. He argued that the alternative 'Baconian' process of induction - of gathering facts, considering them, and inferring general laws - is logically unsound, as many mutually inconsistent hypotheses might be consistent with any given facts. Popper concluded that for a proposition to be considered scientific, it must, at least in principle, be possible to make an observation that would show it to be false. Otherwise the proposition has, as Popper put it, no connection with the real world.

For Popper, theory is profoundly important in science; a theory encompasses the preconceptions by which the world is viewed, and defines what we choose to study, and how we study and understand it. He recognised that theories are therefore not discarded lightly, and that a theory might be retained long after it has been shown to be inconsistent with many known facts (anomalies). However, the recognition of anomalies drives scientists to adjust the theory, and, if the anomalies continue to accumulate, to develop alternative theories.

Popper proposed that the content of a theory should be judged by the extent to which it inspires testable hypotheses; while theories always also contain many elements that are not falsifiable, Popper argued that these should be kept to a minimum. However, falsifiability was not the only criterion in choosing a theory; scientists also seek theories that are "elegant": a theory should yield clear, simple explanations of complex phenomena that are intellectually satisfying in being logically coherent, rich in content, and involving no miracles or other supernatural devices.

Popper's views on falsifiability were in marked contrast to those of his contemporary, the historian of science Thomas Kuhn. Kuhn's own book "The Structure of Scientific Revolutions" was no less influential than Popper's, but its message was very different. Kuhn analysed times in the history of science when one dominant theory was replaced by another - such as the replacement of Ptolemy's geocentric model of the Universe with the Copernican heliocentric model, and the replacement of Newtonian laws of motion with Einstein's theory of Relativity. In many respects Popper was asserting his rules for 'good science', whereas Kuhn considered himself to be reporting what scientists actually did, although he believed that, as what they did was successful, there was probably merit in it. Kuhn concluded that falsifiability had played almost no role in the 'scientific revolutions' where one paradigm was replaced by another. He argued that scientists working in a field resisted the alternative interpretations of 'outsiders', and tenaciously defended their world view by continually elaborating their shared theory; "normal science often suppresses fundamental novelties because they are necessarily subversive of its basic commitments". According to Kuhn, most progress is made in a field when one theory is dominant; progress occurs by the "puzzle solving" of scientists who are trying not to challenge the theory but to extend its scope and explanatory power, bringing theory and fact into closer agreement by a "strenuous and devoted attempt to force nature into the conceptual boxes supplied by professional education".[13]

Experiments and observations

Werner Heisenberg, in a remark that he attributed to Albert Einstein, stated:[14]

The phenomenon under observation produces certain events in our measuring apparatus. As a result, further processes take place in the apparatus, which eventually and by complicated paths produce sense impressions and help us to fix the effects in our consciousness. Along this whole path—from the phenomenon to its fixation in our consciousness—we must be able to tell how nature functions, must know the natural laws at least in practical terms, before we can claim to have observed anything at all. Only theory, that is, knowledge of natural laws, enables us to deduce the underlying phenomena from our sense impressions.

Reductionism

For much of the 20th century, the dominant approach to science was reductionism – the attempt to explain all phenomena by basic laws of physics and chemistry. This principle has ancient roots - Francis Bacon (1561-1626) quotes Aristotle as declaring "That the nature of everything is best seen in his smallest portions." [15] In many fields, however, reductionist explanations are impractical, and all explanations involve 'high level' concepts. Nevertheless, the reductionist belief has been that the role of science is to progressively explain high level concepts by concepts closer and closer to basic physics and chemistry. For example, to explain the behaviour of individuals we might refer to motivational states such as hunger. These reflect features of brain activity that are still poorly understood, but we can investigate, for example, the 'hunger centres' of the brain that house these drives. These centres involve many neural networks – interconnected nerve cells – and each network can be probed in detail. These networks in turn are composed of specialised neurons that can be analysed individually. These nerve cells have properties that are the product of a genetic program that is activated in development – and so are reducible to molecular biology. However, while behaviour is thus in principle reducible to basic elements, explaining the behaviour of an individual in terms of the most basic elements has little predictive value, because the uncertainties in our understanding are too great.

Measurement

The reductionist approach assigned particular importance to the measurement of quantities. Measurements may be tabulated, graphed, or mapped, and statistically analysed; often these representations of the data use tools and conventions that are, at a given time, accepted and understood by scientists within a given field. Measurements may need specialized instruments such as thermometers, microscopes, or voltmeters, whose properties and limitations are familiar within the field, and scientific progress is often intimately tied to their development. Measurements also provide operational definitions: a scientific quantity is defined precisely by how it is measured, in terms that enable other scientists to reproduce the measurements. Scientific quantities are often characterized by units of measure which can be described in terms of conventional physical units. Ultimately, this may involve internationally agreed ‘standards’; for example, one second is defined as exactly 9,192,631,770 oscillations or cycles of the cesium atom's resonant frequency [6]. The scientific definition of a term sometimes differs substantially from its natural-language use; mass and weight overlap in meaning in common use, but have different meanings in physics. All measurements are accompanied by the possibility of error, so their uncertainty is often estimated by repeating measurements and seeing by how much these differ. Counts of things, such as the number of people in a nation at a given time, may also have an uncertainty: counts may represent only a sample, with an uncertainty that depends upon the sampling method and the size of the sample.
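
As a minimal illustration of how such an uncertainty estimate might be made (the readings below are invented purely for the example, and the calculation shown is the standard mean and standard error, not a procedure prescribed by this article):

    # Estimating a quantity and its uncertainty from repeated measurements.
    # The readings are invented solely to illustrate the calculation.
    import math

    readings = [9.81, 9.79, 9.83, 9.80, 9.82]  # e.g. repeated measurements of g, in m/s^2

    n = len(readings)
    mean = sum(readings) / n

    # Sample standard deviation (n - 1 in the denominator) and standard error of the mean.
    variance = sum((x - mean) ** 2 for x in readings) / (n - 1)
    std_dev = math.sqrt(variance)
    std_error = std_dev / math.sqrt(n)

    print(f"mean = {mean:.3f}, standard error = {std_error:.3f}")

The smaller the spread of the repeated readings, and the more readings taken, the smaller the estimated uncertainty in the mean.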

The scientific method in practice

The UK research charity Cancer Research UK gives an outline of the scientific method as practised by its scientists [7]. The quotes that follow are from this outline.

[Scientists] start by making an educated guess about what they think the answer might be, based on all the available evidence they have. This is known as forming an hypothesis. They then try to prove if their hypothesis is right or wrong. Researchers carry out carefully designed studies, often known as experiments, to test their hypothesis. They collect and record detailed information from the studies. They look carefully at the results to work out if their hypothesis is right or wrong…

From Einstein's theory of General Relativity several predictions can be derived. According to this theory, light will appear to 'bend' in a gravitational field, by an amount that depends on the strength of the field. Arthur Eddington's observations made during a solar eclipse in 1919 supported General Relativity rather than the Newtonian theories that it replaced.
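
For illustration, the standard first-order general-relativistic result for this deflection (quoted here as a well-known formula; it is not derived or cited in this article) is:

    \delta\theta \approx \frac{4GM}{c^{2}\,b}

where G is the gravitational constant, M the deflecting mass, c the speed of light, and b the closest approach of the light ray to the mass. For light grazing the Sun this gives about 1.75 seconds of arc, roughly twice the value obtained from a simple Newtonian corpuscular calculation, and it was this difference that the 1919 eclipse measurements were designed to test.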

In his 1958 book, Personal Knowledge, chemist and philosopher Michael Polanyi (1891-1976) criticized the view that the scientific method is purely objective and generates objective knowledge. Polanyi considered this to be a misunderstanding of the scientific method, and argued that scientists do and must follow personal passions in appraising facts and in choosing which questions to investigate. He concluded that a structure of liberty is essential for the advancement of science -- that the freedom to pursue science for its own sake is a prerequisite for the production of knowledge.

Peer review

…Once they have completed their study, the researchers write up their results and conclusions. And they try to publish them as a paper in a scientific journal. Before the work can be published, it must be checked by a number of independent researchers who are experts in a relevant field. This process is called ‘peer review’, and involves scrutinising the research to see if there are any flaws that invalidate the results…

Manuscripts submitted for publication in scientific journals are normally sent by the editor to other scientists (usually one to three) for evaluation. These 'expert referees' advise the editor about the suitability of the paper for publication in the journal. They also report, usually anonymously, on its strengths and weaknesses, pointing out any errors or omissions that they notice and offering suggestions for how the paper might be improved by revision or by further experiments. With this advice, the editor may reject the paper, or decide that it could be acceptable if appropriately revised.

Peer review has been widely adopted by the scientific community, but it has weaknesses. In particular, it is easier to publish data consistent with a generally accepted theory than to publish data that contradict it. This helps to ensure the stability of the accepted theory, but it also means that the extent to which a current theory appears to be supported by evidence might be misleading - boosted by poorly scrutinised supportive work while being protected against criticism. The biologist Lynn Margulis encountered great difficulty in publishing her theory that the eukaryotic cell is a symbiotic union of primitive prokaryotic cells. In 1966, she wrote a theoretical paper entitled The Origin of Mitosing Cells; it was "rejected by about fifteen scientific journals," as Margulis recalled. Finally accepted by The Journal of Theoretical Biology, it is now considered a landmark in modern endosymbiotic theory.[16] In 1995, Richard Dawkins said, "I greatly admire Lynn Margulis's sheer courage and stamina in sticking by the endosymbiosis theory, and carrying it through from being an unorthodoxy to an orthodoxy." [17]

The scientific literature

…If the study is found to be good enough, the findings are published and acknowledged by the wider scientific community…

Sir Peter Medawar, Nobel laureate in Physiology or Medicine, in his article “Is the scientific paper a fraud?”, argued that "The scientific paper in its orthodox form does embody a totally mistaken conception, even a travesty, of the nature of scientific thought."

In scientific papers, the results of an experiment are interpreted only at the end, in the discussion section, giving the impression that those conclusions are drawn by induction or deduction from the reported evidence. However, explains Medawar, it is the expectations that a scientist begins with that provide the incentive for the experiments, determine their nature, and determine which observations are relevant and which are not. Only in the light of these initial expectations do the activities described in a paper have any meaning at all. The expectation, the original hypothesis, according to Medawar, is not the product of inductive reasoning but of inspiration, educated guesswork.

Confirmation

…But, it isn’t enough to prove a hypothesis once. Other researchers must also be able to repeat the study and produce the same results, if the hypothesis is to remain valid…

Sometimes scientists make errors in the design, execution or analysis of their experiments. Consequently, it is common for other scientists to try to repeat experiments, especially when the results were surprising.[18] Accordingly, scientists keep detailed records of their experiments, to provide evidence of their effectiveness and integrity and to assist others in reproducing them. However, a scientist cannot record everything about an experiment; he (or she) reports what he believes to be relevant. This can cause problems if some supposedly irrelevant feature is questioned, but the accepted theory itself often defines what a scientist expects to be relevant. For example, Sidney Ringer's experiments with isolated frog hearts first led him to declare that the heart could continue to beat if kept in a simple saline solution. However, he later discovered that the solution had been made up not with distilled water but with London tap water, which contained a significant amount of dissolved calcium carbonate. He retracted his first reports, and is now known as the scientist who demonstrated the importance of calcium for the contractile activity of the heart.[19]

Scientific ethics

It is rare for scientists to deliberately falsify their results, although there have been well publicised examples of this. Any scientist who does so takes an enormous risk, because if the claim is important it is likely to be subjected to close scrutiny, and if it is wrong then the reputation of the scientist will suffer regardless of whether his mistakes were honest or not. Honor in Science, published by Sigma Xi, quotes C.P. Snow (The Search 1959): "The only ethical principle which has made science possible is that the truth shall be told all the time. If we do not penalise false statements made in error, we open up the way, don’t you see, for false statements by intention. And of course a false statement of fact, made deliberately, is the most serious crime a scientist can commit."[20]

Statistics

…If the initial study was carried out using a small number of samples or people, larger studies are also needed. This is to make sure the hypothesis remains valid for bigger group and isn't due to chance variation…

Statistical analysis is a standard part of hypothesis testing in many areas of science. This formalises the criteria for disproof by allowing statements of the following form: "If a given hypothesis is true, the chance of getting results like those we observed is (say) only 1 in 20 or less (P < 0.05); this is so unlikely that we reject the hypothesis."

This notion of a hypothesis is quite different to Popper's. For instance, we might predict that a certain chemical will produce a certain effect. However, what we test is often not this, but the complementary null hypothesis - that the chemical will have no effect[8]. The reason for this is that if our original hypothesis tells us that there will be an effect but is vague about its expected magnitude, we can still logically disprove the null hypothesis (by showing an effect), even though we cannot disprove the hypothesis that the chemical is effective, as we cannot exclude the possibility that the effect is smaller than we can measure reliably. The best answer might be to choose hypotheses that give precise predictions, but in many areas of science this is unrealistic. In medicine, for example, we might expect a new drug to be effective in a particular condition from our understanding of its mechanism of action, but might not know how big an effect to expect because of many uncertainties - for example, how many people in a genetically variable population will be resistant to the drug, and how quickly will tolerance to the drug develop in people who respond well?
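
As a rough sketch of the procedure described above (the measurements, the group sizes and the 5% threshold are illustrative assumptions, not values taken from this article), a null hypothesis of 'no effect' might be tested like this:

    # Sketch of a null-hypothesis significance test: compare measurements from a
    # treated group and a control group, and reject the null hypothesis of 'no
    # effect' if the P value falls below a pre-chosen threshold (here 0.05).
    # All numbers are invented for illustration.
    from scipy import stats

    control = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]
    treated = [5.6, 5.4, 5.7, 5.3, 5.5, 5.8, 5.4, 5.6]

    # Two-sample t-test of the null hypothesis that both groups have the same mean.
    t_stat, p_value = stats.ttest_ind(treated, control)

    if p_value < 0.05:
        print(f"P = {p_value:.4f}: reject the null hypothesis of no effect")
    else:
        print(f"P = {p_value:.4f}: the data do not let us reject the null hypothesis")

This particular test assumes roughly normally distributed measurements with similar variances in the two groups; other tests are used when those assumptions do not hold.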

In fact, this is not hypothesis testing in Popper's sense, because this test does not put the original hypothesis at any hazard of disproof. Verification of this type is something that Popper considered to be, at best, weak corroborative evidence, partly because it is impossible to measure the degree of support that such evidence gives to a hypothesis.[21]

An important school of statistics, Bayesian statistics, seeks to provide a statistical basis for support by induction, and some areas of science use these approaches. However, this approach is often not tenable, because of the difficulty of attaching a priori probabilities in any meaningful way to the alternative predicted outcomes of an experiment.
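
A minimal sketch of the Bayesian calculation (all of the probabilities below are invented assumptions, chosen only to show the arithmetic of Bayes' theorem):

    # Minimal Bayesian update: P(H | data) = P(data | H) * P(H) / P(data).
    # The prior and the likelihoods are illustrative assumptions.

    prior_h = 0.5                # a priori probability assigned to hypothesis H
    p_data_given_h = 0.8         # probability of the observed result if H is true
    p_data_given_not_h = 0.2     # probability of the observed result if H is false

    # Total probability of the observed result, under H and under not-H.
    p_data = p_data_given_h * prior_h + p_data_given_not_h * (1 - prior_h)

    posterior_h = p_data_given_h * prior_h / p_data
    print(f"posterior probability of H = {posterior_h:.2f}")   # 0.80 with these numbers

The difficulty referred to above is that in real experiments there is often no principled way of choosing the prior probability or the likelihoods, which is why many areas of science avoid relying on this approach.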

Progress in science

…Over time, scientific opinion can change. This is because new technologies can allow us to re-examine old questions in greater detail.

Kuhn argued that scientific opinion does not change easily in fundamental things. In particular, one theory or world view is not replaced by another because many scientists are "converted" to the new world view. Instead, a new theory begins as an unfashionable alternative that is often derided, but it gains adherents as its advantages become apparent to new scientists entering the field, while the adherents of the old view fight a 'rear guard action' to defend it. Barbara McClintock's work on regulatory elements that control gene expression won her the Nobel Prize in Physiology or Medicine in 1983, but in 1953 she had decided to stop trying to publish detailed accounts of her work, because of the puzzlement and hostility of her peers. In 1973 she wrote:

"Over the years I have found that it is difficult if not impossible to bring to consciousness of another person the nature of his tacit assumptions when, by some special experiences, I have been made aware of them. ...One must await the right time for conceptual change"[22]



Notes and references

  1. Isaac Newton (1687, 1713, 1726) "[4] Rules for the study of natural philosophy", Philosophiae Naturalis Principia Mathematica, Third edition. The General Scholium containing the 4 rules follows Book 3, The System of the World on pp 794-796 of I. Bernard Cohen and Anne Whitman's 1999 translation, University of California Press ISBN 0-520-08817-4
  2. Sagan C (1987) The fine art of baloney detection. Parade Magazine, Feb 1, pp 12-13
  3. Medawar P (1982) Pluto's Republic, Oxford University Press; see [1]
  4. From the autobiography of Charles Darwin [2]
  5. Bacon, Francis (1620) Novum Organum (The New Organon)
  6. From the Preface to The Great Instauration; 4.18, quoted in Pesic P (2000) The Clue to the labyrinth: Francis Bacon and the decryption of nature. Cryptologia. Francis Bacon should not be confused with Roger Bacon, a Franciscan friar who also has claims to be a pioneer of observation and experiment, and who was imprisoned when his work challenged the dogma of the Church.[3]
  7. Feyerabend PK (1975) Against Method, Outline of an Anarchistic Theory of Knowledge Reprinted, Verso, London, UK, 1978
  8. Feyerabend's 'anything goes' argument explained at the Galilean Library. Criticisms such as his led to the strong programme, a radical approach to the sociology of science.
  9. Huxley TH (1863) From an 1863 lecture series aimed at making science understandable to non-specialists
  10. Text of the opinion, LII, Cornell University; Daubert-The Most Influential Supreme Court Decision You've Never Heard of
  11. Henri Poincaré (1905) Science and Hypothesis
  12. cited in Bargar RR, Duncan JK (1982) Cultivating creative endeavor in doctoral research J Higher Educ 53:1-31 doi:10.2307/1981536
  13. Kuhn TS (1961) The Function of Measurement in Modern Physical Science ISIS 52:161–193
    • Kuhn TS (1962)The Structure of Scientific Revolutions University of Chicago Press, Chicago, IL. 2nd edition 1970, 3rd edition 1996
    • Kuhn TS (1977) The Essential Tension, Selected Studies in Scientific Tradition and Change University of Chicago Press, Chicago, IL
    • A Synopsis from the original by Professor Frank Pajares, From the Philosopher's Web Magazine
    • Moloney DP (2000) First Things 101:53-5
  14. Heisenberg, Werner (1971) Physics and Beyond, Encounters and Conversations, A.J. Pomerans (trans.), Harper and Row, New York, NY pp.63–64
  15. Francis Bacon 'The Advancement of Learning' [4]
  16. Sagan L (1967) On the origin of mitosing cells. J Theor Biol 14:255-74 Abstract
  17. John Brockman, The Third Culture, New York: Touchstone 1995, 144
  18. Georg Wilhelm Richmann was killed by lightning (1753) when attempting to replicate the 1752 kite experiment of Benjamin Franklin. Krider P (2006) Benjamin Franklin and lightning rods Physics Today 59:42 [5]
  19. Carafoli E (2002) Calcium signalling: a tale for all seasons PNAS USA 99:115-22
  20. Under Federal regulations (the Federal Register, vol 65, no 235, December 6, 2000), a finding of 'research misconduct' requires that there be a significant departure from accepted practices of the relevant research community; that the misconduct be committed intentionally, knowingly, or recklessly; and that the allegation be proven by a preponderance of evidence.
  21. In appendix ix to The Logic of Scientific Discovery, Popper states: "As to degree of corroboration, it is nothing but a measure of the degree to which hypothesis h has been tested...it must not be interpreted therefore as a degree of the rationality of our belief in the truth of h...rather it is a measure of the rationality of accepting, tentatively, a problematic guess."
  22. McClintock B (1987) The discovery and characterization of transposable elements: the collected papers of Barbara McClintock ed John A. Moore. Garland Publishing, Inc. ISBN 0-8240-1391-3. (Introduction)
