New Technique To Control Fluids Using Specially Fabricated - Seas: Fill & Download for Free


How to Edit The New Technique To Control Fluids Using Specially Fabricated - Seas Quickly and Easily Online

Start editing, signing, and sharing your New Technique To Control Fluids Using Specially Fabricated - Seas online by following these easy steps:

  • Click the Get Form or Get Form Now button on the current page to open the PDF editor.
  • Wait a moment until the New Technique To Control Fluids Using Specially Fabricated - Seas has loaded.
  • Use the tools in the top toolbar to edit the file; the added content will be saved automatically.
  • Download your completed file.

The Best-Rated Tool to Edit and Sign the New Technique To Control Fluids Using Specially Fabricated - Seas

Start editing a New Technique To Control Fluids Using Specially Fabricated - Seas immediately


A quick guide on editing New Technique To Control Fluids Using Specially Fabricated - Seas Online

It has become quite simple these days to edit your PDF files online, and CocoDoc is a great app to use for editing your file and saving it. Follow our simple tutorial to start!

  • Click the Get Form or Get Form Now button on the current page to start modifying your PDF.
  • Add, change, or delete your text using the editing tools on the toolbar above.
  • After altering your content, add the date and a signature to finish it off.
  • Look over your form again before you save and download it.

How to add a signature on your New Technique To Control Fluids Using Specially Fabricated - Seas

Though most people are used to signing paper documents by hand, electronic signatures are becoming more common. Follow these steps to sign documents online for free!

  • Click the Get Form or Get Form Now button to begin editing the New Technique To Control Fluids Using Specially Fabricated - Seas in the CocoDoc PDF editor.
  • Click the Sign tool in the toolbar at the top.
  • A window will pop up; click the Add new signature button and you'll have three choices: Type, Draw, and Upload. Once you're done, click the Save button.
  • Drag, resize, and position the signature inside your PDF file.

How to add a textbox on your New Technique To Control Fluids Using Specially Fabricated - Seas

If you need to add a text box to your PDF and customize your own content, follow these steps to do it.

  • Open the PDF file in CocoDoc PDF editor.
  • Click Text Box on the top toolbar and move your mouse to place it wherever you want.
  • Type in the text you need to insert. After you've written the text, you can select it and use the text editing tools to resize, color, or bold it.
  • When you're done, click OK to save it. If you're not happy with the text, click the trash can icon to delete it and start over.

A quick guide to editing your New Technique To Control Fluids Using Specially Fabricated - Seas on G Suite

If you are looking for a solution for PDF editing on G Suite, the CocoDoc PDF editor is a recommended tool that can be used directly from Google Drive to create or edit files.

  • Find CocoDoc PDF editor and set up the add-on for Google Drive.
  • Right-click on a PDF document in your Google Drive and choose Open With.
  • Select CocoDoc PDF on the popup list to open your file with it, and give CocoDoc access to your Google account.
  • Modify the PDF document (add text and images, edit existing text, highlight, erase, or black out text) in CocoDoc PDF editor before saving and downloading it.

PDF Editor FAQ

What are some examples of wrong scientific beliefs that were held for long periods?

Some examples from the Edge website:

NEIL SHUBIN
Evolutionary Biologist; Robert R. Bensley Distinguished Service Professor, University of Chicago; Author, Your Inner Fish

One wrong idea in my field was that the map of the earth was fixed... that the continents stayed in one place over time. This notion held despite the fact that anyone, including small children, could see that the coasts of Africa and South America (like many places) fit together like a jigsaw puzzle. Evidence for moving continents piled up (fossils from different places, similar rocks...), but still there was strong resistance. Part of the problem is that nobody could imagine a mechanism for the continents to move about... did they raft like icebreakers through the ocean, mushing the sea bottom as they did so? Nobody saw how this could possibly happen.

GARRETT LISI
Independent Theoretical Physicist; Author, "An Exceptionally Simple Theory of Everything"

One wrong scientific belief held by cosmologists until recently was that the expansion of the universe would ultimately cease, or even that the universe would re-contract. Evidence now shows that the expansion of the universe is accelerating. This came as quite a shock, although the previous belief was held on scant evidence. Many physicists liked the idea of a closed universe, and expressed distaste at the idea of galaxies accelerating off to infinity, but nature often contradicts our intuition.

PETER SCHWARTZ
Futurist, Business Strategist; Cofounder, Global Business Network, a Monitor Company; Author, Inevitable Surprises

There are several things we believed untrue that we now believe to be true. For example, prions were thought not to exist and are now a major field of study, and quantum entanglement was thought impossible, even by Einstein ("spooky action at a distance," he called it), yet it is now the basis of quantum computing.

DAVID DEUTSCH
Quantum Physicist, Oxford; Author, The Fabric of Reality

Surely the most extreme example is the existence of a force of gravity.

It's hard to say when this belief began, but it surely predates Newton. It must have existed from the time when concepts such as force were first formulated until the general theory of relativity superseded it in 1915.

Why did scientists hold that belief for so long? Because it was a very good explanation and there was no rival theory to explain observations such as heavy objects exerting forces on whatever they were resting on. Since 1915 we have known the true explanation, namely that when you hold your arm out horizontally, and think you are feeling it being pulled downwards by a force of gravity, the only force you are actually feeling is the upward force exerted by your own muscles in order to keep your arm accelerating continuously away from a straight path in spacetime. Even today, it is hard to discipline our intuition to conceive of what is happening in that way, but it really is.

HAIM HARARI
Physicist; former President, Weizmann Institute of Science; Author, A View from the Eye of the Storm

The earth is flat and the sun goes around it for the same reason that an apple appears to be more strongly attracted by the earth than a leaf, the same reason that when you add 20% and then subtract 20% you return to the same value, and the same reason that the boat is heavier than water. All of these statements appear to be correct at first sight, and all of them are wrong. The length of time it takes to figure it out is a matter of history and culture. Religion gets into it, psychology, fear of science, and many other factors.
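Harari's percentage example is easy to verify. A minimal arithmetic sketch in Python (the starting value of 100 is arbitrary):

```python
# Adding 20% and then subtracting 20% does not return you to the start,
# because the second 20% is taken of a larger base.
start = 100.0
up = start * 1.20    # +20% of 100 -> 120.0
down = up * 0.80     # -20% of 120 (not of 100) -> 96.0
print(down)          # 96.0, a 4% net loss

# In general, x * (1 + r) * (1 - r) = x * (1 - r**2),
# which is strictly less than x for any nonzero r.
```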
I do not believe that there is one parameter that determines how these things are found to be wrong.

The guy who sold me a carpet last month truly insisted that people in Australia are standing on their heads and could not understand how they manage to do it. He still believes that the earth is flat and is ashamed of his belief, but refuses to accept my explanations. I know a union that got a substantial pay raise because a politician did not understand that adding and then subtracting 20% gets you to a different result from the one you started with. Religious people of all religions believe even more ridiculous things than all of the above. These are examples from the last 10 years, not from the middle ages.

Part of the problem is that, in order to find the truth in all of these cases, you need to ask the right question. This is more important, and often more difficult, than finding the answer. The right questions in the above cases are of different levels of complexity.

ALUN ANDERSON
Senior Consultant (and former Editor-in-Chief and Publishing Director of New Scientist); Author, After the Ice: Life, Death, and Geopolitics in the New Arctic

The Great Chain of Being is another great example of a long-held, still not fully displaced, false view, and it also stems from the same kind of "wrongly centered" thinking. Essentially the view is that humans stand at the pinnacle of creation (or just below God) and all other life forms are less perfect to a varying degree. Evolutionary theory teaches that all creatures are equally adapted to the niches in which they live; every branch of the tree is thus in a sense equally perfect.

There was a critical moment in the early 1970s when the new view swept into psychology. I was a student at the time, looking at so-called comparative psychology. The dominant view, put forward by M. E. Bitterman, was that you could classify "learning ability" and arrange animals according to the level they had reached, e.g., fish were incapable of "reversal learning" but rats were, or some such. A paper was then published (by Hodos and Campbell, 1969) on the false notion of the Great Chain of Being in psychology, arguing that every animal's learning ability fitted the particular use it made of it (e.g., honey bees are brilliant at learning the time of day at which particular flowers produce nectar, a subject I later researched).
This change in the way of thinking also reflects a move away from the US Skinnerian school of lab studies of animals to the European ethological school (pioneered by Nobel prize winner Niko Tinbergen, with whom I worked) of studying animals in their own environments. The view also fits Native American conceptions of a Creator who does not favour any particular one of his creations, but it is at odds with the Christian view, which is why it lingers on in the US.

IRENE PEPPERBERG
Psychologist; Research Associate, Harvard University; Author, Alex and Me

That all birds were stupid.

It was believed to be true because (a) early neurobiologists couldn't find anything in the avian brain that looked like the primate cortex (although the more enlightened did argue that there was a 'striatal' area that seemed to work in a somewhat comparable manner for some birds), and (b) many studies on avian intelligence, using operant conditioning, focused on pigeons — which are not the most intelligent birds — and the pigeon generally never did even as well as the rat on the type of tasks used.

A corollary: that parrots were not only stupid, but also could never learn to do anything more than mimic human speech. It was believed to be true because the training techniques initially used in laboratories were not appropriate for teaching heterospecific communication.

JOHN HOLLAND
Professor of Psychology, Computer Science and Engineering, University of Michigan, Ann Arbor; Author, Emergence: From Chaos to Order

From the time of Aristotle onward, natural philosophers believed that the basic law underlying motion was that all objects (eventually) come to rest. It took Newton to lay aside the myriad details (friction, etc.) in order to build an idealized model that requires 'forces' to change direction or velocity. Subsequently, everything from models of fluid flow to slinging satellites to the outer solar system used Newton's model as a starting point.

DEREK LOWE
Medicinal Chemist

My nominees are:

(1) The "four humours" theory of human physiology. That one, coming most prominently from Galen, persisted for centuries. Although the pure humoural theory gradually eroded, it lived on in the shape of contra-therapeutic bloodletting until the 19th century, and you'd have to think that in the vast majority of those cases it was harmful. Why it persisted so long is the tough part. My guess is that it was hard (it still is!) for both physicians and patients to realize or admit that very little could be done for most physical ailments. Bloodletting might not always work, but it had to be better than just standing there doing nothing, right? And in those cases susceptible to it, bloodletting must have had a pretty strong placebo effect, as dramatic as it is.

(2) The "bad air" theory of infectious disease. This is another one that you can find persisting for centuries. I'd say that it lasted for several reasons: there were indeed such things as poisonous vapors which could make a person feel sick, for one thing. And the environments that were felt to have the worst vapors were often ones that had higher rates of disease due to the real factors (standing water, poor hygiene, overcrowded dwellings, and so on). Finally, there's the factor that's kept all sorts of erroneous beliefs alive — lack of a compelling alternative.
The idea of strange-looking living creatures too small to see being the cause of infections wouldn't have gotten much of a hearing, not in the face of more tangible explanations.

That last point brings up another reason that error persists — the inability (or unwillingness) to realize that man is not the measure of all things. Unaided human perceptions on the human scale don't take you very far on the macro-scale of astronomy, or the micro-scale of cell biology (much less that of subatomic physics). To me, the story of science has been the story of augmenting our perceptions, and realizing that they had to be augmented in the first place.

CHARLES SIMONYI
Computer Scientist, International Software; Former Chief Architect and Distinguished Engineer, Microsoft Corporation

One short answer is this: the Peripatetic mechanics of Aristotle was probably the longest-running wrong scientific idea, which lasted from Greek times up until practically Newton. The reason for the longevity was that it corresponded well to the crude and complicated world around us: with two horses the heavy cart does indeed move faster (without careful measurements we could easily say: two times faster) than with just one horse. When you give a shove to something, it will start moving and then soon stop. Heavy things move down; light things (feathers, smoke) move up. The normal world is just not friendly to the kind of abstraction that allows the setting up of general natural laws like Newton's.

I am of course aware of the currently popular belief that "flat earth" was somehow a widely held "scientific" idea, but I do not know what evidence supports this belief. It was certainly not part of the Antique inheritance (which included pretty good estimates for the diameter of the earth and excellent estimates for the ratio of the Earth's and Moon's diameters); it was not part of Aristotle, or Aquinas, or any of the authorities that the Church relied on. No doubt there were some creation myths or fanciful publications that might have illustrated the world as being flat, but it is a stretch to call these "scientific" even by the standards of the age, when learned men would have been able to refute such a thesis easily — and probably did as part of their exams.

With the geocentric world it is a different matter — geocentrism was indeed scientifically held (with Ptolemy being the best proponent) and it is indeed false — but not to the same extent as the Peripatetic mechanics. The real issue was precision of prediction — and the complicated system of Ptolemy gave excellent results, indeed better results than Copernicus (which made the breakthrough idea of Copernicus a difficult sell — just put yourself into the shoes of someone in his time).

Real improvement in precision came only with Kepler and the elliptical orbits, which were arrived at in part by scientific genius, by being a stickler for accuracy, and in part by mad superstition (music of the spheres, etc.). From his point of view, putting the coordinate system around the sun simplified his calculations. The final significance of putting the sun into the center was to be able to associate a physical effect — gravitation — with the cause of that effect, namely with the sun.
But this did not really matter before Newton.

In all of these cases a common thread seems to be that the "wrong" scientific ideas were held as long as the difference between "wrong" and "right" did not matter, or was not even apparent given the achievable precision, or, in many cases, the differences actually favored the "wrong" theory — because of the complexity of the world, the nomenclature, the abstractions.

I think we are all too quick to label old theories "wrong," and with this we weaken the science of today — people say, with some justification from the facts as given to them, that since the old "right" is now "wrong," the "right" of today might also be tainted. I do not believe this — today's "right" is just fine, because yesterday's "wrong" was also much more nuanced, "more right," than we are often led to believe.

NATHAN MYHRVOLD
CEO, Managing Director, Intellectual Ventures; Former Director, Microsoft Research, and Chief Technology Officer, Microsoft

Here is a short list:

1. The stress theory of ulcers — it turns out they are due to infection with Helicobacter pylori. Barry Marshall won a Nobel Prize for that.
2. Continental drift was proposed in the 1920s-30s by Alfred Wegener, but was totally dismissed until the 1960s, when it ushered in plate tectonics.
3. Conventional belief was that the eye evolved many, many times. Then they discovered the PAX genes that regulate eyes and are found throughout the animal kingdom — eyes evolved ONCE.
4. Geoffroy Saint-Hilaire was a French scientist who had a theory that invertebrates and vertebrates shared a common body plan. He was widely dismissed until the HOX genes were discovered.

LAWRENCE KRAUSS
Physicist; Director, Origins Initiative, Arizona State University; Author, Hiding in the Mirror

Intelligent design... special creation... The reason: the age of the earth is so long that people didn't realize that evolution could occur.

STEVEN STROGATZ
Applied Mathematician, Cornell University; Author, Sync

Another classic wrong belief is that light propagates through a medium, the "ether," that pervades the universe. This was believed to be true until the early 1900s because all other waves known at that time required a medium in which to propagate. Sound waves travel through air or water; the waves on a plucked guitar string travel down the string itself. Yet on the face of it, light seemed to need no medium — it could travel through seemingly empty space. Theorists concluded that empty space must not really be empty — it must contain a light-bearing medium, the "luminiferous ether."

But the ether was always a very problematic notion. For one thing, it had to be extremely stiff to propagate a wave as fast as light — yet how could empty space be "stiff"? The existence of the ether was disproved experimentally by the Michelson-Morley experiment, and theoretically by Einstein's special theory of relativity.

CÉSAR A. HIDALGO
Assistant Professor, MIT Media Lab; Faculty Associate, Harvard Center for International Development

The age of the earth... which was believed to be only a few thousand years old, due to biblical calculations, until Charles Lyell (who was a good friend of Darwin) began to come up with estimates of millions of years based on erosion... The advanced age of the world was heavily refuted by scientists, particularly by Lord Kelvin, who made calculations of the rate at which the earth must have cooled down and concluded that this could only have happened in a few thousand years...
He did not know about the radioactive decay taking place at the earth's core...

The model that was used to explain mountains was based not on tectonic plates, but rather on a shrinking earth, by assuming that as the earth cooled down it shrank and creased up.

The humours theory of disease vs. the germ theory of disease.

Basically... any change of paradigm that went on during the 19th century in England...

ERIC TOPOL
Cardiologist; Director, Scripps Translational Science Institute, La Jolla

In medicine there are many of these wrong scientific beliefs (so many it is frankly embarrassing). Here are a couple:

We were taught (in med school and as physicians) that when cells in the body differentiate to become heart muscle or nerve tissue/brain, they can never regenerate and there is no natural way for new cells/tissue to form. Wrong!! Enter the stem cell and regenerative medicine era.

Until the mid-1980s, a heart attack was thought to be a fait accompli: there was nothing that could ever be done to stop the damage from occurring... just give oxygen, morphine, and say prayers. Then we discovered that we could restore blood supply to the heart and abort the heart attack or prevent much of the damage. The same is now true for stroke. It took almost 80 years for that realization to be made!

CHRISTIAN KEYSERS
Neuroscientist; Scientific Director, Neuroimaging Center, University Medical Center Groningen

For a long time the brain was thought to contain separate parts designed for motor control and visual perception. Only in the 1990s, through the discovery of mirror neurons, did we start to understand that the brain does not work along such divisions, but instead uses motor areas also for perception and perceptual areas also for action.

I believe that this wrong belief was so deeply ingrained because of AI, in which there is no link between what a computer sees a human do and the computer's routines for moving a robot. In the human brain the situation is different: the movements we program for our own body look exactly the same as those other humans make. Hence, our motor programs and body are a match for those we observe, and so afford a strong system for simulating and perceiving the actions of others. I call this the computer fallacy: thinking of the brain as a computer turned out to harm our understanding of the brain.

SIMONA MORINI
Philosopher; Dipartimento delle Arti e del Disegno Industriale, IUAV University, Venice

My preference goes to Euclidean geometry. Its axioms were considered true for centuries on the basis of intuition (shall we say prejudice?) about space.

ROSS ANDERSON
FRS; Professor, Security Engineering, Cambridge Computer Laboratory; Researcher in Security Psychology

In the field of security engineering, a persistent flat-earth belief is 'security by obscurity': the doctrine that security measures should not be disclosed or even discussed.

In the seventeenth century, when Bishop Wilkins wrote the first book on cryptography in English in 1641, he felt the need to justify himself: "If all those useful Inventions that are liable to abuse, should therefore be concealed, there is not any Art or Science which might be lawfully profest." In the nineteenth century, locksmiths objected to the publication of books on their craft; although villains already knew which locks were easy to pick, the locksmiths' customers mostly didn't.
In the 1970s, the NSA tried to block academic research in cryptography; in the 1990s, big software firms tried to claim that proprietary software is more secure than its open-source competitors.

Yet we actually have some hard science on this. In the standard reliability growth model, it is a theorem that opening up a system helps attackers and defenders equally; there's an empirical question whether the assumptions of this model apply to a given system, and if they don't, then there's a further empirical question of whether open or closed is better. Indeed, in systems software the evidence supports the view that open is better. Yet the security-industrial complex continues to use the obscurity argument to prevent scrutiny of the systems it sells. Governments are even worse: many of them would still prefer that risk management be a matter of doctrine rather than of science.

JAMES CROAK
Artist

The first wrong notion that comes to mind, one that lasted centuries, is from Thales of Miletus, regarded as the "father of science" because he rejected mythology in favor of material explanations. He believed everything was water, a substance that in his experience could be viewed in all three forms: liquid, solid, gas. He further speculated that earthquakes were really waves and that the earth must be floating on water because of this.

The idea that matter is one thing in different appearances is regarded as true even today.

ROB KURZBAN
Psychologist, UPenn; Director, Penn Laboratory for Experimental Evolutionary Psychology (PLEEP); Author, Why Everyone (Else) Is a Hypocrite

I'm guessing you'll get some of the more obvious ones, so I want to offer an instance a little off the beaten path. I came across it doing research of my own into this issue of closely held beliefs that turn out to be wrong.

There was a court case in New York in 1818 surrounding the question of whether a whale was a fish or a mammal. Obviously, we now know not only that there is a correct answer to this question (for a time this wasn't obvious) but also what that answer is (mammal, obviously). Even after some good work in taxonomy, the idea that a whale was a fish persisted. Why?

This one is probably reasonably clear. Humans assign animals to categories because doing so supports inferences. (There's great work by Ellen Markman and Frank Keil on this.) Usually, shared physical features support inferences about categorization, which then support inferences about form and behavior. In this case, the phylogeny just happens to violate what is usually a very good way to group animals (or plants), leading to the persistence of the incorrect belief.

LEWIS WOLPERT
Biologist, University College; Author, Six Impossible Things to Do Before Breakfast

That force causes movement — in fact it causes acceleration. That heavy bodies fall faster than lighter ones.

HOWARD GARDNER
Psychologist, Harvard University; Author, Changing Minds

Among cognitive psychologists, there is widespread agreement that people learn best when they are actively engaged with a topic and have to actively problem-solve — as we would put it, 'construct meaning.' Yet among individuals young and old, all over the world, there is a view that is incredibly difficult to dislodge. To wit: education involves a transmission of knowledge/information from someone who is bigger and older (often called 'the sage on the stage') to someone who is shorter, younger, and lacks that knowledge/information.
No matter how many constructivist examples and arguments are marshaled, this view — which I consider a misconception — bounces back. And it seems to be held equally by young and old, by individuals who succeeded in school as well as by individuals who failed miserably.

Now this is not a scientific misconception in the sense of flat earth or six days of creation, but it is an example of a conception that is extraordinarily robust, even though almost no one who has studied cognition seriously believes it holds water.

Let me take this opportunity to express my appreciation for your many contributions to our current thinking.

ED REGIS
Science Writer; Author, What Is Life?

Vitalism, the belief that living things embody a special, and not entirely natural, animating force or principle that makes them fundamentally different from nonliving entities. (Although rejected by scientists, I would hazard the guess that vitalism is not entirely dead today among many members of the general public.) This belief's persistence over the ages is explained by the obvious observable differences between life and nonlife.

Living things move about under their own power; they grow, multiply, and ultimately die. Nonliving objects like stones, beer bottles and grains of sand don't do any of that. It's the overwhelming nature of these perceptible differences that accounts for the belief's longevity. In addition, there is still no universally accepted scientific explanation of how life arose, which only adds to the impression that there's something scientifically unexplainable about life.

ROBERT TRIVERS
Evolutionary Biologist, Rutgers University; Coauthor, Genes in Conflict: The Biology of Selfish Genetic Elements

For more than 100 years after Darwin (1859), people believed that evolution favored what was good for the group or the species — even though Darwin explicitly rejected this error.

Probable cause: the false theory was just what you would expect people to propagate in a species whose members are concerned to increase the group-orientation of others.

FRANK TIPLER
Professor of Mathematical Physics, Tulane University; Author, The Physics of Christianity

I myself have been working on a book on precisely the same topic, but with a slightly different emphasis: why did scientists not accept the obvious consequences of their own theories?

Here are three examples of false beliefs long accepted:

(1) The false belief that stomach ulcers were caused by stress rather than bacteria. I have some information on this subject that has never been published anywhere. There is a modern Galileo in this story, a scientist convicted of a felony in criminal court in the 1960s because he thought that bacteria caused ulcers.

(2) The false belief that the continents do not move. The drifting continents were an automatic mathematical consequence of the fact that the Earth was at least 100 million years old, and the fact that the Earth formed by the gravitational collapse of a gas and dust cloud. One of Lord Kelvin's students pointed out the essential idea in a series of papers in Nature. This was long before Wegener.

(3) The false belief that energy had to be a continuous variable. James Clerk Maxwell, no less, realized that this was a false belief.
The great mathematician Felix Klein, of Klein bottle fame, discussed with Erwin Schrödinger the question of why the fact of quantized energy was not accepted in the 19th century.

JOAN CHIAO
Assistant Professor, Brain, Behavior, and Cognition; Social Psychology; Northwestern University

Early pioneering cultural anthropologists, such as Lewis Morgan, who penned the influential 1877 work Ancient Society, were heavily influenced by Darwinian notions of biological evolution to consider human culture as itself evolving linearly in stages.

Morgan in particular proposed the notion that all human cultures evolved in three basic stages, from savagery to barbarism to, finally, civilization, and that technological progress was the key to advancing from one stage to the next. Morgan was by no means an armchair academic; he lived with Native Americans and studied their culture extensively. Through these first-hand experiences, Morgan sought to reconcile what he observed to be vast diversity in human cultural practices, particularly between Native Americans and Europeans, with emerging ideas of Darwinian biological evolution.

Morgan was one of several anthropologists at the time who proposed various forms of unilinear cultural evolution, the idea that human culture evolved in stages from simple to more sophisticated and complex, which ultimately became tied to colonialist ideology and social Darwinism.

Such dangerous ideas then became the catalyst for Franz Boas and other 20th-century anthropologists to challenge Morgan's ideas with concepts such as ethnocentrism. By arguing that belief in the superiority of one's own culture, rather than scientific objectivity per se, had guided anthropological theories of unilinear evolution, Boas and his colleagues exposed an important human and scientific bias in the study of human culture that later gave way to revised theories of cultural evolution, namely multilinear evolution, and the emergence of cultural relativism.

JEREMY BERNSTEIN
Professor of Physics, Stevens Institute of Technology; Author, Nuclear Weapons: What You Need to Know

It was generally believed until the work of Hubble that the universe was static and that the Milky Way was everything.

MATTHEW RITCHIE
Artist

An example of a correct theory that was extensively accepted by the public, then displaced by an alternate interpretation, which has since been problematized without resolution:

Although the 19th-century idea that the fourth dimension was an extra dimension of space was in many senses correct, it was invalidated in the cultural imagination by Minkowski and Einstein's convincing and influential representation of time as the fourth dimension of space-time. For example: the polychora in Picasso and Duchamp's early cubist works were far more directly influenced by Hinton's essays "What Is the Fourth Dimension?" and "A Plane World" than by Minkowski and Einstein's work — but the general acceptance of Einstein's theory encouraged art historians to interpret cubist work as being directly influenced by the theory of relativity — which was entirely inaccurate.
(This is discussed in depth in Henderson's definitive work The Fourth Dimension and Non-Euclidean Geometry in Modern Art.)

Overall, the cultural displacement of the theory of 4-D space has required a series of re-statements of the idea of the fourth dimension — which have so far failed to properly define the nature of the fourth dimension, either in time or space, to the larger public.

CLAY SHIRKY
Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP); Author, Cognitive Surplus

The existence of ether, the medium through which light (was thought to) travel.

Extra credit: it was believed to be true by analogy — waves propagate through water, and sound waves propagate through air, so light must propagate through X, and the name of this particular X was ether.

It's also my favorite because it illustrates how hard it is to accumulate evidence for deciding something doesn't exist. Ether was both required by 19th-century theories and undetectable by 19th-century apparatus, so it accumulated a raft of negative characteristics: it was odorless, colorless, inert, and so on.

When Michelson and Morley devised an apparatus sensitive enough to detect characteristic differences in the behavior of light based on the angle at which it traveled through the ether (relative to the earth's motion), and could detect no such differences, they spent considerable time and energy checking their equipment, so sure were they that ether's existence-by-analogy operated as something like proof. (Danny Kahneman calls this 'theory-induced blindness.')

And of course, the failure of ether to appear opened the intellectual space in which Einstein's work was recognized.

ROGER C. SCHANK
Psychologist; Computer Scientist; AI Researcher; Author, The Future of Decision-Making

The obvious candidate for failed theory in the world of learning is the stimulus-response theory (called behaviorism) that dominated psychology for many years. Yes, my dog gets excited when I make coffee, because that is when he knows he will get a treat, but that kind of learned behavior is hardly all there is to learning.

In my first year on the faculty at Stanford, Ken Colby offered to share his time in front of the first-year computer science graduate students. At that time each professor in Artificial Intelligence would have a week of an introductory course in which he would say what he was doing. In their second quarter, students would choose one of those professors to work with in a seminar. Ken invited me to share his time and his seminar.

It took a while to get the "results." The next quarter we met with the students who had signed up for our seminar. While other seminars given by other professors had attracted one or two students, we had gotten about 20. Boy, was my ego fired up. I was succeeding at this new game. At least that was what I thought, until all the students went around the room to say why they were there. They were all there because of Ken — none were there because of me.

I wondered what had happened. Ken had given a very glib, funny speech without much content. He seemed to be a lightweight, although I knew he wasn't. I, on the other hand, had given a technical speech about my ideas about how language worked and how to get computers to comprehend it.

I asked Ken about this and he told me: if you can say everything you know in an hour, you don't know much.

It was some of the best advice I ever got.
You can't tell people everything you know without talking way too fast and being incomprehensible. Ken was about hoping to be understood and to be listened to. I was about being serious and right. I never forgot his words of wisdom. These days I am much funnier.

And I realize that I do know a lot more than can fit into an hour-long speech. Maybe back then I actually didn't know all that much.

So I learned a great deal from Ken's just-in-time advice, which I then had to think about. That is one kind of learning. And then that experience became one of my stories and thus a memory (which is another aspect of learning). Learning is also about constructing explanations of events that were not predicted so that you can predict them better next time. And learning is about constructing and trading stories with others to give you nuances of experiences to ponder, which is a very important part of learning.

Learning has more aspects to it than just those things, of course. Stimulus-response doesn't cover much of the turf.

GARY KLEIN
Research Psychologist; Founder, Klein Associates; Author, The Power of Intuition

Here are some of my favorites:

1. Ulcers are created by stress. Some research on monkeys seemed to bear this out, and it fit a comforting stereotype that Type A individuals were burning up inside.

2. Genes are made of protein. This was more reasonable — the complexity of protein molecules matched the complexity of the proteins that genes were building.

3. Yellow fever is caused by miasma and filth. I think this was sustained by a natural repugnance when entering homes that smelled bad — a feeling of "wow, that can't be good — I need to get out of here as soon as possible." Plus a class judgment that poor people live in less sanitary conditions and are more susceptible. Plus a belief that the mosquito theory had been discredited. (In fact, the study on mosquitoes failed to take into account a 12-day incubation period.)

4. Cholera is caused by miasma and filth. Ditto. Now, the part I can't really understand is why John Snow was so effective in changing this mindset about cholera in England, and why his views quickly spread to the U.S., whereas 50 years later, even after Walter Reed and his staff eliminated yellow fever from Cuba, his subordinate Gorgas (who was in charge of eliminating yellow fever in Havana) was so unsuccessful in convincing the authorities when he was subsequently posted to Panama to control yellow fever during the building of the canal.

GREGORY COCHRAN
Consultant, Adaptive Optics; Adjunct Professor of Anthropology, University of Utah; Coauthor, The 10,000 Year Explosion: How Civilization Accelerated Human Evolution

Educated types in the Western world have known the shape of the Earth for a long time — about 2,500 years; the idea that they believed in a flat Earth is a canard. The notion that the Earth was the center was popular for much longer, largely because parallax was too small to measure, since distances to the stars are enormous compared with the radius of the Earth's orbit. Many people will undoubtedly tell you this.

One favorite is Helicobacter pylori as the main cause of stomach ulcers. This was repeatedly discovered and then ignored and forgotten: doctors preferred 'stress' as the cause, not least because it was undefinable. Medicine is particularly prone to such shared mistakes. I would say this is the case because human biology is complex, experiments are not always permitted, and MDs are not trained to be puzzle-solvers — instead, to follow authority.
A lot of this traces back to medical traditions, which developed over long periods during which medicine was an ineffective pseudoscience. Freudian analysis was another such madness of crowds.

I would guess that most basic anthropological doctrine is false — for example, the 'psychic unity of mankind' — but then most practitioners don't really pretend to do science.

One could go on and on!

ERIC R. WEINSTEIN
Mathematician and Economist; Principal, Natron Group

The modern textbook example of groupthink within fundamental physics is likely the so-called Tau-Theta puzzle of the 1950s. The Tau and Theta particles were seen to be as physically indistinguishable as Clark Kent and Superman, except for the ways in which they disintegrated. Yet to suggest that they were the same particle required the mental leap needed to assert that natural law carries a kind of asymmetric beauty mark which could be used to distinguish processes in the real world from their reflections in a pristine mirror. After experimenters at Columbia finally indicated in 1956 that the Tau and Theta were indeed the same particle, physicists came to see that for decades no one had really bothered to check whether something as profoundly dramatic as an asymmetric universe was hiding within plain sight and easy reach.

An even more compelling example of group blindness, drawn from engineering, is the bizarre case of the Rollaboard suitcase. In the dawning age of jet travel, it seemed no one could find a way to create stable wheeled luggage. Typical early designs featured leashes and tiny external casters on which horizontal luggage would briefly roll before tipping over. It was only in 1989 that Northwest Airlines pilot Robert Plath solved this problem for travelers with the now ubiquitous vertical design of the Rollaboard, with built-in wheels and telescoping handles. What is fascinating about this example of groupthink is that every recent scientific genius who struggled with luggage while on the lecture circuit had missed this simple, elegant idea, as it required no modern technological advance or domain-specific expertise.

LANE GREENE
Journalist, The Economist

I assume someone might already have written in to suggest "the belief that physical traits acquired during one's lifetime could be passed on to children" — e.g., that a person who became fat through overeating would thereby have fat children (and not because he had genes for obesity). This was apparently even believed by Darwin, I just read, before the discovery and understanding of genes.

JAMES O'DONNELL
Classicist; Provost, Georgetown University; Author, The Ruin of the Roman Empire

As a classicist, I feel I know too many examples! Ancient medicine and ancient astronomy in particular were full of truths, quite true and valid within the framework within which they were constructed, that now appear as utter nonsense. I would put at the top of my list, however, the science of astronomy — not for the Ptolemaic mathematical workings-out, but for the solid, serious, scientific astrological content. That is to say, it's a beautiful example of a paradigm, in Kuhnian terms, that made perfect sense at the time, that was the foundation for many further advances, that led to undoubtedly serious science, that validated itself by, e.g., the way it allowed you to predict eclipses (how could it not be science?), and that just fell apart at the touch of a serious thought.
To compare large with small, I would put it next to the science of ulcer medicine 60 years ago, which made similar perfect sense, was all driven by diet and stress, and was a continually refining science — falling apart more or less instantaneously, what, 25 years ago, with the discovery of the link to H. pylori. What the two have in common is that a focus on phenomena (that is, the things that appear, the surface data) produces science, but each time you go a step beneath phenomena to mechanisms, new science happens. That's when the impossible becomes possible.

GEOFFREY CARR
Science Editor, The Economist

Believing that people believed the Earth was flat is a good example of a modern myth about ancient scientific belief. Educated people have known it was spherical (and also how big it was) since the time of Eratosthenes. That is pretty close to the beginning of any system of thought that could reasonably merit being called scientific...

One that was long thought to be true, but isn't, is the spontaneous generation of life. I've never quite understood how that squared with life being divinely created. But the whole pre-Pasteur thing was definitely a widely held, incorrect belief...

JONATHAN HAIDT
Psychologist, University of Virginia; Author, The Happiness Hypothesis

The closest thing to a persistent flat-earth belief in psychology is probably the view that experiences in the first five years of life largely shape the personality of the adult. (The child is father to the man, as Freud said.) It's now clear that experiences that affect brain development, such as some viral diseases or some head injuries, can indeed change adult personality. Also, extreme conditions that endure for years or that interfere with the formation of early attachments (e.g., an abusive parent) can have lasting effects. But the idea that relatively short-lived experiences in the first few years — even traumatic ones, and even short-lived sexual abuse — will have powerful effects on adult personality... this just doesn't seem to be true. (Although such events can leave lasting traces on older children.) Personality is shaped by the interaction of genes with experience; psychologists and lay people alike long underestimated the power of genes, and they spent too much time looking at the wrong phase of childhood (early childhood) instead of at the developmental phases that matter more (i.e., the prenatal period and adolescence).

Why is early childhood such a draw when people try to explain adult personalities? I think it's because we think in terms of stories, and it's almost impossible for us NOT to look back from Act III (adulthood) to early childhood (Act I) when we try to explain how someone turned out to be a hero or a serial killer. In stories, there's usually some foreshadowing in Act I of events to come in Act III. But in real life there is almost never a connection.

JUAN ENRIQUEZ
Managing Director, Excel Medical Ventures; Chairman and CEO, Biotechonomy LLC; Author, As the Future Catches You

We have acted, with good reason, as if human beings are all alike. And given the history of eugenics, this has been a good and rational position and policy. But we are entering an era where we recognize that there are more and more differences in how a particular medicine affects particular groups of people. The same goes for foods, pollutants, viruses, and bacteria. We are beginning to recognize that we react differently, and are at differential risk of catching diseases like AIDS, malaria, and anemias.
And just this month we began to get a glimpse of the first thousand human genomes. These will soon number in the hundreds of thousands. Are we ready, should these initial gene maps show that there are real and significant differences between groups of human beings?

SCOTT ATRAN
Anthropologist; Visiting Professor of Psychology and Public Policy, University of Michigan; Presidential Scholar in Sociology, John Jay College of Criminal Justice, New York City; Author, Talking to the Enemy

Anglo-American empiricists and communists alike believed that human minds were almost infinitely malleable, and learned the structure and content of thoughts and ideas based on the frequency of events perceived and on the nearness of events to one another (if one kind of event frequently precedes a second kind of event, then the first is likely the cause of the other). Rewards and punishments ("carrots and sticks") supposedly determine which events are attended to.

Many Continental thinkers and Fascists believed that fundamental ideas of science, art and the "higher thoughts" of European civilization were either innate or inherently easy to learn only for a biologically privileged set of human beings. As with most earlier views of human cognition and learning, both of these philosophies and their accompanying pseudo-sciences of the mind were based on social and political considerations that ignored, and indeed effectively banned, reasoned inquiry and evidence as to the nature of the human mind.

That is why, after centuries of science, study of the mind is still in a foetal stage, and actual progress has been limited to fundamental discoveries that can be counted on one hand (for example, that human linguistic competence — and thus perhaps other fundamental cognitive structures — is universally and innately fairly well-structured; or that human beings do not think like Markov processors, logic machines, or as rational economic and political actors ought to).

RUPERT SHELDRAKE
Developmental Biologist; Author, The Sense of Being Stared At

In the nineteenth century, many scientists were convinced that the course of nature was totally determinate and in principle predictable in every detail, as in Laplace's famous fantasy of scientific omniscience: "Consider an intelligence which, at any instant, could have a knowledge of all the forces controlling nature, together with the momentary conditions of all the entities of which nature consists. If this intelligence were powerful enough to submit all these data to analysis, it would be able to embrace in a single formula the movements of the largest bodies in the universe and those of the lightest atoms; for it nothing would be uncertain; the past and future would be equally present for its eyes."

T. H. Huxley even imagined that the course of evolution was predictable: "If the fundamental proposition of evolution is true, that the entire world, living and not living, is the result of the mutual interaction, according to definite laws, of the forces possessed by the molecules of which the primitive nebulosity of the universe was composed, it is no less certain that the existing world lay, potentially, in the cosmic vapour, and that a sufficient intellect could, from a knowledge of the properties of the molecules of that vapour, have predicted, say, the state of the fauna of Great Britain in 1869."

With the advent of quantum theory, indeterminacy rendered the belief in determinism untenable, and in the neo-Darwinian theory of evolution (which T. H. Huxley's grandson, Julian, did so much to promote) randomness plays a central role through the chance mutations of genes.

EMANUEL DERMAN
Professor, Industrial Engineering and Operations Research, Columbia University; Partner, Prisma Capital Partners; Author, My Life as a Quant

1. For years, running shoe companies have assumed without evidence that more is better, that thicker padded soles are better at preventing injuries in runners. In the 70s, shoe soles grew Brobdingnagian. Now recent research confirms that running barefoot and landing on your forefoot, on any surface, even one as hard as the road to hell, produces less shock than running and unavoidably landing on your heels in rigid, padded, stabilized shoes.

2. For years optometrists have given small children spectacles at the first hint of nearsightedness. But ordinary unifocal lenses modify not only their accommodation to distance vision but to near vision too, where they don't need help. Now there is evidence that giving near-sighted kids bifocals that correct only their distance vision and not their close-up vision seems to make their nearsightedness progress less rapidly.

CHARLES SEIFE
Professor of Journalism, New York University; Author, Proofiness

Caloric, phlogiston, and ether immediately come to mind, but I'm particularly fond of one consequence of Aristotelian mechanics: the assertion that there is no such thing as a vacuum.

The concept of the void conflicted with the way that Aristotle conceived of motion; admitting a void into his universe quite simply broke all of his models about the nature of matter and the way objects move. (A rock, say, suspended in a vacuum, would not be able to fall to its proper place at the center of the world, as his laws said it must.) In the West, the consequent misconception — that nature so abhors a vacuum that it cannot exist under any circumstance — lasted until Torricelli and Pascal disproved it in the 17th century.

MILFORD H. WOLPOFF
Professor of Anthropology and Adjunct Associate Research Scientist, Museum of Anthropology, University of Michigan; Author, Race and Human Evolution

Creationism's stepsister, intelligent design, and allied beliefs have been held true for some time, even as the mountain of evidence supporting an evolutionary explanation for the history and diversity of life continues to grow. Why has this belief persisted? There are political and religious reasons, of course, but history shows that neither politics nor religion requires a creationist belief in intelligent design.

I think the deeper answer lies elsewhere: in the way children categorize the world into a hierarchy of types of inanimate and living things (and, for that matter, types of people), and the rigid categorization this leaves in adults, which stands in the way of accepting biological explanations showing that the hierarchy can develop from natural laws, including randomness, and that categories may originate and change by natural laws within a hierarchical structure. Could a draw poker hand improve without divine intervention? Could Plato's precept of ideals have survived a trip to Art Van's?

ROBERT SHAPIRO
Professor Emeritus of Chemistry and Senior Research Scientist, New York University; Author, Planetary Dreams

For many centuries, most scientists and philosophers believed that dead or inanimate matter could quickly transform itself into living beings, just as the reverse can occur quite rapidly. This belief, rapid spontaneous generation, was supported by simple observation of common events.
Fireflies emerged from the morning dew, bacteria appeared in sterilized broths, and small animals arose from mud at the bottom of streams and ponds. In Shakespeare's "Antony and Cleopatra," Lepidus told Antony: "Your serpent of Egypt is born of the mud, by the action of the Sun, and so is your crocodile." Among the notables who endorsed this theory were Aristotle, Thomas Aquinas, Francis Bacon, Galileo and Copernicus. Many carefully controlled experiments, culminating in the work of Louis Pasteur, were needed to negate this idea.

JUDITH HARRIS
Author, No Two Alike

The apple doesn't fall far from the tree. In other words, people tend to resemble their parents. They resemble their parents not only in physical appearance but also, to some degree, in psychological characteristics.

The question is: why? Two competing answers have been offered: nature (the genes that people inherit from their parents) and nurture (the way their parents brought them up). Neither of these extreme positions stood up to scrutiny, and they eventually gave way to a compromise solution: nature + nurture. Half nature, half nurture. This compromise is now an accepted belief, widely held by scientists and nonscientists alike.

But the compromise solution is wrong, too. Genes do indeed make people turn out something like their parents, but the way their parents brought them up does not. So nature + nurture is wrong: it's nature + something else.

The evidence has been piling up since the 1970s; by now it's overwhelming. And yet few people outside of psychology know about this evidence, and even within psychology only a minority have come to terms with it.

You asked for "examples of wrong scientific beliefs that we've already learned were wrong." But who is "we"? A few thousand people have learned that the belief in nature + nurture is wrong, but most people haven't.

JORDAN POLLACK
Computer Science and Complex Systems Professor, Brandeis University

A persistent belief is that human symbolic intelligence is the highest form of intelligence around. This leads directly to both creationism and good old-fashioned AI, which seeks to model cognition using Lisp programs.

Evolution can design machines of such great complexity that the space shuttle, with half a million parts, looks like a Tinkertoy construction. In order to explain the design intelligence of evolution, most Republicans are convinced that a superintelligent creator was involved. Developmental intelligence, which manufactures machines with 10 billion moving parts without any factory supervisors, is another area where nature outstrips the best human performance. Immunological intelligence, telling self from non-self, is another AI-complete problem. And human intelligence itself is so vastly complex that we've made up stories of conscious symbol processing, like logic and grammar, to try to explain what goes on in our heads.

The mind, like the weather, envelops the brain like a planet and requires dynamical and integrated explanations rather than just-so stories.

SUE BLACKMORE
Psychologist and Ex-Parapsychologist; Author, Consciousness: An Introduction

My favourite example is the hunt for the "élan vital," or life force.
People seemed to think that, given living things behave so very differently from non-living things, there must be some special underlying force or substance or energy or something that explains the difference, something that animates a living body and leaves the body when it dies.

Of course many people still believe in various versions of this, such as spirits, souls, subtle energy bodies and astral bodies, but scientists long ago gave up the search once they realised that being alive is a process that we can understand and that needs no special force to make it work.

I think this was believed to be true for two reasons:

1. Explaining how living things work is not trivial; it has required understanding heredity, homeostasis, self-organisation and many other factors.

2. (Perhaps more important) Human beings are natural dualists. From an early age children begin thinking of themselves not as a physical body but as something that inhabits a physical body or brain. We feel as though we are an entity that has consciousness and free will, even though this is all delusion. I suggest that this delusion of duality is also the underlying cause of the hopeless hunt for the life force.

NICHOLAS G. CARR
Author, The Shallows

I think it's particularly fascinating to look at how scientific beliefs about the functioning of the human brain have progressed through a long series of misconceptions.

Aristotle couldn't believe that the brain, an inert grey mass, could have anything to do with thought; he assumed that the heart, hot and pulsing, must be the source of cognition, and that the brain's function was simply to cool the blood. Descartes assumed that the brain, with its aperture-like "cavities and pores," was, along with the heart, part of an elaborate hydraulic system that controlled the flow of "animal spirits" through the "pipes" of the nerves. More recently, there was a longstanding belief that the cellular structure of the brain was essentially fixed by the time a person hit the age of 20 or so; we now know, through a few decades' worth of neuroplasticity research, that even the adult brain is quite malleable, adapting continually to shifts in circumstances and behavior. Even more recently there's been a popular conception of the brain as a set of computing modules running, essentially, genetically determined software programs, an idea that is now also being chipped away by new research.

Many of these misconceptions can be traced back to the metaphors human beings have used to understand themselves and the world (as Robert Martensen has described in his book The Brain Takes Shape). Descartes' mechanistic "clockwork" metaphor for explaining existence underpinned his hydraulic brain system and also influenced our more recent conception of the brain as a system of fixed and unchanging parts. Contemporary models of the brain's functioning draw on the popular metaphorical connection between the brain and the digital computer. My sense is that many scientific misconceptions have their roots in the dominant metaphors of the time.
Metaphors are powerful explanatory tools, but they also tend to mislead by oversimplifying.

LEE SMOLIN
Founding and Senior Faculty member at Perimeter Institute for Theoretical Physics in Waterloo, Canada; Adjunct Professor of Physics at the University of Waterloo; Author, The Trouble With Physics

Perhaps the most embarrassing example from 20th-century physics of a false but widely held belief was the claim that von Neumann had proved, in his 1930 textbook on the mathematical foundations of quantum mechanics, that hidden variables theories are impossible. These would be theories that give a complete description of individual systems, rather than the statistical view of ensembles described by quantum mechanics. In fact de Broglie had written down a hidden variables theory in 1926 but abandoned work on it because of von Neumann's theorem. For the next two decades no one worked on hidden variables theories.

In the early 1950s David Bohm reinvented de Broglie's theory. When his paper was rejected on the grounds that von Neumann had proved what it claimed to be impossible, Bohm read the proof and easily found a fallacy in von Neumann's reasoning. Indeed, there had been at least one paper pointing out the fallacy in the 1930s, but it was ignored. The result was that progress on hidden variables theories in general, and on de Broglie and Bohm's theory in particular, was delayed by several decades.

An example in economics is the notion that economic markets can usefully be described as having a single, unique, and stable equilibrium, to which they are driven by market forces. As described by neoclassical models of markets, such as the Arrow-Debreu model of general equilibrium, equilibrium is defined as a set of prices at which demand for all goods equals supply, as a result of each consumer maximizing their utility and each producer maximizing their profit. A basic result is that such equilibria are Pareto efficient, which means that no one's utility can be increased without decreasing somebody else's. Furthermore, if the economy is in equilibrium there are no path-dependent effects; moreover, it can be argued that market prices in equilibrium are perfectly rational and reflect all relevant information.

If equilibrium were unique, then one could argue that the most ethical thing to do is to leave markets free and unregulated so that they can find their points of equilibrium, where efficiency and utility are maximized. This kind of thinking to some extent motivated choices about leaving financial markets under-regulated, resulting in the recent economic crisis and current difficulties.

However, it was learned in the 1970s that even if efficiency and equilibrium are useful notions, the idea that equilibria are unique is not true in generic general equilibrium models. The Sonnenschein-Mantel-Debreu theorem of 1972 implies that equilibria are in general highly non-unique, and it is not difficult to invent models in which the number of equilibria scales with the number of producers. But if there are multiple equilibria, most will not be stable. Moreover, supply and demand are balanced in each of the many equilibria, so market forces do not suffice to explain which equilibrium the market is in or to pick which would be preferred. The theoretical consequence is that path-dependent effects, which determine which of the many equilibria the market is in, must be important; the political consequence is that there is no ethical argument for leaving markets unregulated.
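To make the non-uniqueness point concrete, here is a minimal sketch in Python. It is a contrived toy model of my own, not the Arrow-Debreu or Sonnenschein-Mantel-Debreu construction itself: a one-good market whose excess-demand function crosses zero at three prices, only two of which are stable under simple tatonnement price adjustment.

```python
# Hypothetical toy market, for illustration only: excess demand z(p) = demand - supply
# is a contrived cubic with three zeros (p = 1, 2, 3), so supply equals demand at
# three different prices. Under tatonnement (raise the price when excess demand is
# positive, lower it when negative), only the equilibria where z is decreasing
# through zero are locally stable; the middle one is not.

def excess_demand(p):
    return -(p - 1.0) * (p - 2.0) * (p - 3.0)

def find_equilibria(lo=0.0, hi=4.0, steps=4001, tol=1e-10):
    """Scan for sign changes of excess demand, then refine each root by bisection."""
    roots = []
    h = (hi - lo) / steps
    for i in range(steps):
        a, b = lo + i * h, lo + (i + 1) * h
        if excess_demand(a) * excess_demand(b) < 0:
            while b - a > tol:
                m = 0.5 * (a + b)
                if excess_demand(a) * excess_demand(m) < 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
    return roots

def is_stable(p, eps=1e-6):
    # Locally stable under tatonnement only if excess demand is falling through zero.
    slope = (excess_demand(p + eps) - excess_demand(p - eps)) / (2 * eps)
    return slope < 0

for p in find_equilibria():
    print(f"equilibrium price {p:.3f}  locally stable: {is_stable(p)}")
```

Running this reports equilibria near 1, 2, and 3, with the middle one unstable. All three prices clear the market, so "supply equals demand" alone cannot say which one prevails; history and path dependence must do that work, which is the point of the passage above.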
Since then, some of the more interesting work in economics has studied issues of path dependence and multiple equilibria.

I cannot comment on why economists made the mistake of thinking about market equilibrium as if it were unique. I do think I have some insight into why a false belief about the possibility of alternatives to quantum mechanics could persist for more than two decades. During this period there was rapid progress in the application of quantum mechanics to a wide set of phenomena, from astrophysics to nuclear and solid-state physics. Meanwhile the most popular interpretation of quantum mechanics was Bohr's, which is now hardly taken seriously by anyone. Those who concentrated on the foundations of the subject were left behind, especially as it was convenient, for the progress that was being made, to believe that the foundations were surer than they in fact were. Perhaps there are periods in science when it makes sense for most scientists to sweep foundational worries under the carpet and make progress on applications, postponing the inevitable reckoning with the inconsistencies to a time when there are better hints from experiment.

MARTI HEARST
Associate Professor in the School of Information at UC Berkeley, with an affiliate appointment in the Computer Science Division

As a computer scientist, I find there isn't all that much in my field that applies, but I do have one item (below). The real action, in my view, though, is in the many counter-intuitive findings in psychology: how memory works, what we perceive and don't perceive, findings on child rearing, etc., etc.

In the early days of the field of Artificial Intelligence, researchers thought that it would not be terribly difficult to implement a vision recognition or language understanding program. Although there is an apocryphal quote from Minsky saying he assigned solving vision as a summer research project, more reliable quotes, taken from a well-researched Wikipedia article, are below:

"AI's founders were profoundly optimistic about the future of the new field: Herbert Simon predicted in 1965 that 'machines will be capable, within twenty years, of doing any work a man can do' and Marvin Minsky agreed, writing that 'within a generation ... the problem of creating "artificial intelligence" will substantially be solved'."

What these misperceptions reveal is an underestimation of the complexity of how the brain works.

GINO SEGRE
Professor of Physics and Astronomy at the University of Pennsylvania; Author, Ordinary Geniuses

I would not count the flat earth as a wrong theory believed to be true by everybody, since the ancient Greeks, for example, thought the Earth was a sphere and had even measured its curvature.

A classic example of a wrong theory is that of phlogiston, namely the existence of a substance that is released in combustion. There were also variations going by the name of caloric. A second wrong theory is that of a luminiferous aether, a substance through which light is transmitted. Chemical experiments disproved the first, and the Michelson-Morley experiment, among others, disproved the second.
There are of course also numerous wrong theories and beliefs regarding the spontaneous generation of life, disproved in the 17th century by Francesco Redi and ultimately by Louis Pasteur in the 19th.

I have a small favorite: the belief that body temperature varied with climate, disproved by the invention of the thermometer in the early 17th century.

CARL ZIMMER
Science Writer; Author, Soul Made Flesh

"This laxe pithe or marrow in man's head shows no more capacity for thought than a Cake of Sewet or a Bowl of Curds."

This wonderful statement was made in 1652 by Henry More, a prominent seventeenth-century British philosopher. More could not believe that the brain was the source of thought. These were not the ravings of a medieval quack, but the argument of a brilliant scholar who was living through the scientific revolution. At the time, the state of science made it very easy for many people to doubt the capacity of the brain. And if you've ever seen a freshly dissected brain, you can see why. It's just a sack of custard. Yet now, in our brain-centered age, we can't imagine how anyone could think that way.

GREGORY PAUL
Independent Researcher; Author, Dinosaurs of the Air

Richard Thaler seems to think that the concept of a flat earth was widely held for a long time. This is not really correct. Mariners have long understood that the earth is strongly curved and possibly a sphere. Ships disappear down over the horizon (I once saw this effect on the Chesapeake Bay and was shocked at how fast the hull of a giant container ship dropped out of sight while the top of the superstructure was still easily visible). Polaris gets lower on the horizon as one sails south and eventually disappears, and so on. Over 2,000 years ago the circumference of the planet was pretty accurately calculated by Eratosthenes using some clever geometry and sun angle measurements (a back-of-the-envelope version of his calculation appears after this passage). This knowledge may have been lost in the West in the Dark Ages, but it was well known to the European elites after improved communications with Constantinople, Alexandria, and so on following the Crusades.

When Columbus was trying to get a government to cough up the money for his trip west, he was not trying to convince patrons that the planet was a sphere. The problem was that the experts told the people with the money that the distance from Europe to Asia across the super ocean separating them was 14,000 miles, with no visible means of logistical support during the voyage (the perfect Bible did not mention extra continents being in the way). However, some works had come out saying that Eratosthenes had messed up and the planet was much smaller (I've heard this was based on Biblical passages, and Columbus was very devout, but am not sure about that). Columbus figured it was 3,000-4,000 miles to the west, a skip and a hop compared to the horrendous around-Africa route. When the Spanish monarchs finally kicked the last Muslims out of Iberia and were having fun picking on Jews, they decided, what the heck, to see what this Columbus fellow could do; the cost was just three little cargo vessels and their crews. The story about the crews getting upset about sailing off the edge of the earth is probably a myth, since they knew better.
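As an aside on the Eratosthenes measurement mentioned above, here is the arithmetic as a small sketch. The figures (a 7.2-degree shadow angle at Alexandria and roughly 5,000 stadia between Alexandria and Syene) are the commonly cited reconstruction, not numbers from this essay:

```python
# Eratosthenes' method: at the summer solstice the Sun was directly overhead
# at Syene, but a vertical stick at Alexandria cast a shadow at an angle of
# about 7.2 degrees. That angle is the arc between the two cities as seen
# from the Earth's center, so the full circumference is (360 / 7.2) = 50
# times the Alexandria-Syene distance.

shadow_angle_deg = 7.2    # commonly cited value, about 1/50 of a full circle
distance_stadia = 5000    # commonly cited Alexandria-to-Syene distance

circumference = distance_stadia * 360.0 / shadow_angle_deg
print(circumference)      # 250000.0 stadia, within a few percent of the
                          # modern ~40,000 km figure on plausible stade lengths
```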
The notion that Columbus was fighting the false knowledge of a flat earth apparently was invented in the late 1800s, in an effort to make him a great American symbol of the progress of science over superstition, in connection with the 1892 celebrations.

As for a distinct example of lots of people believing in something that is scientifically wrong, the best I can think of is the various creation myths. This belief occurred not only before the advent of modern science but continues in the various forms of creationism.

ALISON GOPNIK
Psychologist, UC Berkeley; Author, The Philosophical Baby

There is interesting evidence that many once popular and evidence-resistant scientific belief systems are also developed spontaneously by many children. For example, children seem to develop a "vitalistic" theory of intuitive biology, rather like the Chinese concept of "chi," at around age 5, independently of what they are taught in school. Similarly, school-age children, even those with an explicitly atheist upbringing, develop ideas about God as an explanatory force at about 7, as part of an "intuitive teleology" that explains events in terms of agency.

The psychologist Tania Lombrozo has shown that even Harvard undergraduates who endorse evolution consistently interpret evolutionary claims in a teleological rather than mechanistic way (e.g., giraffes try to reach the high leaves and so develop longer necks). And we have shown that six-year-olds develop a notion of fully autonomous "free will" that is notoriously difficult to overturn. There is also a lot of evidence that scientific theories are built out of these everyday intuitive theories.

If, as we think, children use Bayesian techniques to develop intuitive theories of the world, based on the evidence they see around them, then it might, in some sense, be rational to hold on to these beliefs, which have the weight of accumulated prior experience (see the numerical sketch after the final entry below). Other scientific beliefs, without a history of everyday confirmation, might be easier to overturn on the basis of scientific evidence alone.

GEORGE DYSON
Science Historian; Author, Darwin Among the Machines

Many (but not all) scientists assumed the far side of the moon would turn out to look much the same as the side we are familiar with. "I was very enthusiastic about getting a picture of the other side of the moon," Herbert York, former advisor to President Eisenhower, told me in 1999. "And there were various ways of doing it, sooner or later. And I argued with Hornig [Donald Hornig, Chairman of the President's Science Advisory Committee] about it and he said, 'Why? It looks just like this side.' And it turned out it didn't."
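Gopnik's remark about the weight of accumulated prior experience can be made concrete with a toy Bayesian update. This is a hypothetical illustration of mine, with arbitrary numbers, not anything from her research: a belief confirmed by years of everyday observation needs far more disconfirming evidence to overturn than one held with a weak prior.

```python
# Toy illustration: two observers update the same hypothesis H on the same
# disconfirming evidence, but one starts with a prior built from years of
# everyday confirmation. Each piece of evidence is assumed to have likelihood
# 0.2 under H and 0.8 under not-H (arbitrary numbers, chosen for illustration).

def posterior(prior_h, n_disconfirmations, p_e_given_h=0.2, p_e_given_not_h=0.8):
    """Repeated Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    p = prior_h
    for _ in range(n_disconfirmations):
        p = (p_e_given_h * p) / (p_e_given_h * p + p_e_given_not_h * (1 - p))
    return p

weak_prior, strong_prior = 0.6, 0.999  # a casual hunch vs. a lifetime of confirmation
for n in (1, 3, 5, 10):
    print(f"{n:2d} disconfirmations: weak prior -> {posterior(weak_prior, n):.3f}, "
          f"strong prior -> {posterior(strong_prior, n):.3f}")
```

After five disconfirming observations the weak-prior believer has essentially abandoned the hypothesis, while the strong-prior believer is still near even odds, which is why beliefs backed by everyday confirmation are the hardest to overturn.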
