Institute for Ethics and Emerging Technologies




The Scientific Method is a Scientific Idea that is Ready for Retirement


By Melanie Swan
The Edge

Posted: May 24, 2015

The scientific idea that is most ready for retirement is the scientific method itself. More precisely, it is the idea that there is only one scientific method, one exclusive way of obtaining scientific results. The problem is that the traditional scientific method as an exclusive approach is no longer adequate to the new situations of contemporary science, such as big data, crowdsourcing, and synthetic biology.

Hypothesis-testing through observation, measurement, and experimentation made sense in the past, when information was scarce and costly to obtain, but this is no longer the case. In recent decades we have been adapting to a new era of information abundance that has facilitated experimental design and iteration. One result is that there is now a field of computational science alongside nearly every discipline, for example computational biology and digital manuscript archiving. Information abundance and computational advances have propelled the evolution of a scientific model that is distinct from the traditional scientific method, and three emerging areas are advancing it even more.

Big data, the creation and use of large and complex cloud-based data sets, is one pervasive trend that is reshaping the conduct of science. The scale is immense: organizations routinely process millions of transactions per hour into hundred-petabyte databases. Worldwide annual data creation is currently doubling and estimated to reach 8 zettabytes in 2015.
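As a back-of-envelope illustration, the claimed growth rate compounds quickly; this is a toy sketch in Python that simply takes the article's figures (8 ZB in 2015, one doubling per year) at face value:

    # Back-of-envelope projection of the article's doubling claim:
    # roughly 8 zettabytes created in 2015, doubling every year.
    volume_zb = 8.0
    for year in range(2015, 2021):
        print(f"{year}: ~{volume_zb:,.0f} ZB created")
        volume_zb *= 2  # one doubling per year, per the article's stated rate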

Even before the big data era, modeling, simulation, and prediction had become key computational steps in the scientific process, and the new methods required to work with big data make the traditional scientific method increasingly less relevant. Our relationship to information has changed with big data. Previously, in the era of information scarcity, all data was salient. In a calendar, for example, every data element, or appointment, is important and intended for action.

With big data, the opposite is true: 99 percent of the data may be irrelevant (immediately, over time, or once processed into higher resolution). The focus becomes extracting points of relevance from an expansive whole, separating signal from noise and looking for anomalies and exceptions, for example genomic polymorphisms. The next level of big data processing is pattern recognition. High sampling frequencies allow not only point-testing of phenomena (as in the traditional scientific method) but their full elucidation over multiple time frames and conditions.

For the first time, longitudinal baseline norms, variance, patterns, and cyclical behavior can be obtained. This requires thinking beyond the simple causality of the traditional scientific method into extended systemic models of correlation, association, and episode triggering. Some of the prominent methods used in big data discovery include machine learning algorithms, neural networks, hierarchical representation, and information visualization.
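As a minimal illustration of this signal-extraction step, here is a toy anomaly scan in Python; the data are synthetic and the simple z-score threshold is a simplistic stand-in for the machine-learning methods named above:

    import numpy as np

    rng = np.random.default_rng(1)

    # A long, mostly unremarkable signal: noisy baseline plus a few rare spikes
    signal = rng.normal(0.0, 1.0, size=1_000_000)
    signal[rng.choice(signal.size, 50, replace=False)] += 8.0  # injected anomalies

    # Establish the longitudinal baseline and variance, then flag points far outside them
    z = (signal - signal.mean()) / signal.std()
    anomalies = np.flatnonzero(np.abs(z) > 5)

    print(f"{anomalies.size} anomalous points out of {signal.size:,} samples "
          f"({anomalies.size / signal.size:.4%} of the data)")

Nearly everything is discarded; the handful of flagged points is the scientific payload.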

Crowdsourcing is another trend reshaping the conduct of science. This is the coordination of large numbers of individuals (the crowd) through the Internet to participate in some activity. Crowd models have led to the development of a science ecosystem that includes the professionally-trained institutional researcher using the traditional scientific method at one end, and the citizen scientist exploring issues of personal interest through a variety of methods at the other. In between are different levels of professionally-organized and peer-coordinated efforts.

The Internet, and the trend toward connecting everyone (roughly 2 billion people now, estimated to reach 5 billion by 2020), enables very-large-scale science. Not only are existing studies cheaper and quicker with crowdsourced cohorts, but studies 100x the size and detail of previous studies are now possible. The crowd can provide volumes of data by automatically linking quantified-self tracking gadgets to data-commons websites. Citizen scientists participate in light information-processing and other data collection and analysis activities through websites like Galaxy Zoo.

The crowd is engaged more extensively through crowdsourced labor marketplaces (initially general-purpose, like Mechanical Turk, and now increasingly skill-targeted), data competitions, and serious gaming (like predicting protein folding and RNA conformation). New methods for the conduct of science are also being pioneered through DIY efforts, the quantified self, biohacking, 3D printing, and collaborative peer-based studies.

Synthetic biology is a third widespread trend reshaping the conduct of science. Lauded as the potential ‘transistor of the 21st century’ given its transformative possibilities, synthetic biology is the design and construction of biological devices and systems. It is highly multidisciplinary, linking biology, engineering, functional design, and computation.

One of the key application areas is metabolic engineering: working with cells to greatly expand their usual production of substances that can then be used for energy, agricultural, and pharmaceutical purposes. The nature of synthetic biology is the proactive creation of de novo biological systems, organisms, and capacities, the opposite of the spirit of passive characterization of phenomena for which the original scientific method was developed.
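To make the design-oriented workflow concrete, here is a minimal forward-modeling sketch of the kind a synthetic biologist might run before building anything; the two-equation gene-expression model and all rate constants are hypothetical illustrative choices, not taken from the article:

    import numpy as np
    from scipy.integrate import odeint

    # Hypothetical two-state model of an engineered gene: transcription
    # produces mRNA (m), translation produces an enzyme (p). All rate
    # constants are made-up illustrative values.
    def gene_circuit(state, t, k_tx=2.0, k_tl=1.5, d_m=0.5, d_p=0.1):
        m, p = state
        dm = k_tx - d_m * m        # transcription minus mRNA decay
        dp = k_tl * m - d_p * p    # translation minus enzyme decay
        return [dm, dp]

    t = np.linspace(0, 50, 500)
    m, p = odeint(gene_circuit, [0.0, 0.0], t).T
    print(f"approach to steady state: mRNA ~ {m[-1]:.2f}, enzyme ~ {p[-1]:.2f}")
    # Analytic check: m* = k_tx/d_m = 4.0, p* = k_tl*m*/d_p = 60.0

The workflow runs design-first: simulate the circuit, tune the parameters, then construct the organism, inverting the observe-then-explain order of the traditional method.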

While it is true that optimizing genetic and regulatory processes within cells can be partially construed under the scientific method, the overall scope of activities and methods is much broader. Innovating de novo organisms and functionality requires a significantly different scientific methodology than that supported by the traditional scientific method, and includes a re-conceptualization of science as an endeavor of both characterizing and creating.

In conclusion, we can no longer rely exclusively on the traditional scientific method in the new era of science emerging through areas like big data, crowdsourcing, and synthetic biology. A multiplicity of models must be employed for the next generation of scientific advance, supplementing the traditional scientific method with new methods that are better suited and equally valid.

Not only is a plurality of methods required; it also opens up new tiers for the conduct of science. Science can now be carried out downstream, at increasingly detailed levels of resolution and permutation, and upstream, with broader systemic dynamism. Temporality and the future become more knowable and predictable as all processes, human and otherwise, can be modeled with continuous real-time updates.

Epistemologically, how we know, and what counts as truth about the world and reality, is changing. In some sense we may be at a current intermediary ‘dark ages node’ from which the multiplicity of future science methods can pull us into a new era of enlightenment, just as surely as the traditional scientific method pulled us into modernity.


Melanie Swan, MBA, is an Affiliate Scholar of the IEET. Ms. Swan, principal of the MS Futures Group, is a philosopher, science and technology futurist, and options trader.


COMMENTS


I, by all rights, should disclose that I am a 70-year-old curmudgeon. I have lived, planned, and validated my life through rigorous application of the scientific method. I don’t understand why crowdsourcing, massive hordes of super data, and synthetic biology are in any way contrary to the basic principle of the scientific method. Is the conclusion verifiable and repeatable? Is there any way possible to disprove the theory? Is it peer-reviewable? I did not see where Ms. Swan’s suggestions proposed anything else. On the contrary, her points are true and well taken. In my mind they make for a more robust experimental environment.





Certainly the recent discovery that “the world is complicated” (and both people and nature unusually *inventive*) does expose a deep flaw in the idea that nature follows simple scientific rules and models. That seemed plausible only because some of the simple rules of physics are so amazingly reliable. Those still exist, and most likely others are yet to be found, but the question is: “What then do we think of them?”

I think we probably should not throw out the scientific method… particularly not just because we’ve been misusing it. The common flaw in our use of science as I see it, and as I have actually studied since the 1970s, is its “misrepresentation problem”. The world is not a model, and we’ve been treating it as one.

The world is not made of numbers, not made of quantitative relationships. It’s made of organizations of separate things, often found in “improper sets”, with the parts of one thing also often taking independent part in others. That makes things in nature *highly individualistic*, held together by some kind of “organizational glue” we’ve hardly begun to study. This presents not only a wonderfully interesting “mismatch in VARIETY” but also several wonderfully interesting “mismatches in KIND”. It may not be ‘neat’, but it’s very ‘lifelike’, and it opens all sorts of new doors!

So what I think we need to retire is not so much “science” as “the representation of scientific models as nature”. The article points to a number of the discrepancies that have become too big to ignore, but where does that take us? One place it takes us is back to the age-old “million dollar question” of how science is to refer to nature at all. What is it we CAN define that DOES NOT misrepresent what we are studying? I think a quite simple place to start (and an obvious solution once you recover from the shock, I guess) is to treat models not AS nature, but AS “our limits of measurable uncertainty about nature”. Yes, Popper and Bohr will turn in their graves… but models understood as representing upper and lower bounds within which we expect nature to operate, independently, will also be found to be much more useful.

If you actually look closely at natural behaviors you readily see that the paths nature takes are always individualized, and we can understand them much better having some information from past events to suggest what to expect. It gives you a straight and clear view of the all-important “discrepancies”. To relieve science of its century (or more) of seriously false thinking about nature being theory, what you then need are ways for science to refer to nature as “individual phenomena & organizations”, to identify the stuff of nature that science studies. In our century or more of trusting abstraction by itself, that’s what I think science has been missing: a natural object of study.

So, in a fairly direct way I’m calling for an “object oriented science” to correspond to the “object oriented programming” that has become such a big help for giving order to computer coding and the web.  My main two tools for that are what I call a “dual paradigm” view (alternating between attention to ‘theory’ and ‘things’), and a “pattern language” view (the emerging scientific method of describing natural organization based on Christopher Alexander’s work). 

Alexander’s pattern language is evolving to become a versatile general method for working with ‘recurrent patterns of design’ as ‘whole sets of working relationships’ found in ‘problems’, ‘solutions’ & ‘environments’.  My new work describing how these fit together is being presented at PURPLSOC and PLoP this year, built on a lot of fundamentals.  If interested, do a search for “dual paradigm”, “pattern language” & “Christopher Alexander”, both on the web and in my journal:
http://synapse9.com/signals





I always love Melanie’s posts, but before we get carried away with ourselves, here’s a counter-argument from Peter Coveney, director of the Centre for Computational Science at University College London:

People have to go around measuring things. There’s no escape from that for most of that type of work. There’s a deep relationship between the two. No one’s going to come up with a model that works without going and comparing with experiment.

http://edge.org/

And here’s computer scientist Michael Jordan (he must have an easy time getting a reservation at a restaurant) on the current limits of big data for science:

“Now, if I start allowing myself to look at all of the combinations of these features—if you live in Beijing, and you ride bike to work, and you work in a certain job, and are a certain age—what’s the probability you will have a certain disease or you will like my advertisement? Now I’m getting combinations of millions of attributes, and the number of such combinations is exponential; it gets to be the size of the number of atoms in the universe.

Those are the hypotheses that I’m willing to consider. And for any particular database, I will find some combination of columns that will predict perfectly any outcome, just by chance alone. If I just look at all the people who have a heart attack and compare them to all the people that don’t have a heart attack, and I’m looking for combinations of the columns that predict heart attacks, I will find all kinds of spurious combinations of columns, because there are huge numbers of them.”

http://spectrum.ieee.org/robotics/artificial-intelligence/machinelearning-maestro-michael-jordan-on-the-delusions-of-big-data-and-other-huge-engineering-efforts

Isn’t it more rather than less likely that we’ll need the old-fashioned sort of scientific method (along with its skills) to sort out the valid hypotheses from the merely spurious correlations?
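A quick toy simulation (made-up cohort size, purely random data) shows how easily chance alone produces a “perfect” predictor when the feature table dwarfs the cohort:

    import numpy as np

    rng = np.random.default_rng(0)

    n_patients, n_columns = 10, 5000            # tiny cohort, huge feature table
    outcome = rng.integers(0, 2, n_patients)    # "heart attack" labels, pure noise
    features = rng.integers(0, 2, (n_columns, n_patients))  # random attributes

    # Columns that "predict" every outcome exactly, by luck alone
    perfect = [j for j in range(n_columns)
               if np.array_equal(features[j], outcome)]
    print(f"{len(perfect)} of {n_columns} random columns predict all "
          f"{n_patients} outcomes perfectly")  # expect ~ n_columns / 2**n_patients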





I don’t see anything that violates the scientific method in big data, regardless of whether it is crowdsourced, or in the other points raised. They greatly extend the reach (and potency) of the method when integrated into it. But if you’re referring to the academic/professional shibboleth of the capitalized Scientific Method, I can agree: the meta-phenomenal aspect of authoritarian systems crumbles rapidly when reality forcibly intrudes, and the three issues stated have hyper-realistic effects (not all of them positive; we’re going to have a VERY rough time with unforeseen consequences from synthetic biology and its associated engineering).





Of course scientific progress can be sped up by using virtual data.  What I am most concerned with is psychological, and it ought not to interfere with scientific progress regardless of whether the proof is virtual or not.  It has now gotten to the point where scientists and scholars “own” certain theories, and they spend all their time being skeptical of anything that threatens their territory.  Some say that Albert Einstein would never get his revolutionary paper on the theory of relativity published in a peer-reviewed journal in the current environment.

What I am particularly concerned about is LENR.  Pons and Fleischmann’s widely publicized and now widely discredited claims about cold fusion were backed up by very-difficult-to-reproduce experimental data.  I personally wouldn’t even call it “cold fusion,” but regardless, it is a proven physical phenomenon now; nevertheless, it is controversial in the scientific community.  How on earth can a provable physical phenomenon be controversial given the scientific method?

Imagine a clean, incredibly inexpensive, and superabundant energy technology that is stifled simply because the scientific community can’t surmount its psychological reaction to a proven and reproducible physical phenomenon!  This isn’t just rhetorical: there is no other way that I am aware of to practically remove the excess carbon from the air and water in our environment.  Also, if the price of clean energy were very near zero, it would supercharge our economy.

Frankly, what I’ve just written seems so amazing that I feel defensive and want to give a small amount of verification in the form of NASA quotes, and (an example of) a doubly third-party-verified commercial application:

“LENR has the demonstrated ability to produce excess amounts of energy, cleanly, without hazardous ionizing radiation, without producing nasty waste.” - Dennis Bushnell, Chief Scientist at NASA Langley Research Center

“Total replacement of fossil fuels for everything but synthetic organic chemistry.”—Dr. Joseph M. Zawodny, NASA

http://www.opednews.com/articles/Low-Energy-Nuclear-Reactio-by-Christopher-Calder-Andrea-Rossi_Energy-Policy_Industrial-Heat-Llc_Lenr-141013-530.html

“There are many companies now racing to bring Low Energy Nuclear Reaction products to the marketplace. One notable company is Solar Hydrogen Trends, which claims to have accidentally discovered a way to use LENR to produce hydrogen gas from water at the energy equivalent of producing pollution free oil for about $5.00 a barrel. Their hydrogen gas producing reactor has been independently tested by two well known companies, AirKinetics, Inc. and TRC Solutions. Both companies found that the reactor works as promised, and the TRC Solutions PDF report is quite shocking.”




