Ed Gerck, Ph.D.
 
ABOUT

Physics and Mathematics are different, although both are made up, created by people like you and me. Mathematicians deny that (Kronecker) and physicists too (Nature), but that's where the fun is, I think, so we can change them.

I see cybersecurity as a new area in physics (Gerck, 1997), where "trust is that which is essential to a communication channel, but cannot be transferred using that channel."

This trust definition provides a framework for understanding human trust (as expected fulfillment of behavior) and for bridging trust between humans and machines (as qualified information based on factors independent of that information). The trust definition also directly creates a mathematical framework for cybersecurity, with a fresh, quantitative approach to help solve the vexing information security problems we see every day in homes, offices, cloud providers, and in the most advanced agencies of the most advanced nation states.

Understanding trust in terms of a communication process motivates the design and development of large-scale, secure and reliable Internet-based infrastructure services where users (including their machines, operating systems and software) are not initially trusted to any extent. In other words, one can now introduce trust as an explicit part of the Internet design, with as few changes as possible, trust that was implicit when the Internet (prior to commercial operation) was based on an honor system for the users and their machines (see INTERNET MODEL).

In particular, the trust definition shows that trusting user intervention (even simply to update software) is a very weak assumption, leading to interest in solutions that can solve current security and network problems without trusting user intervention.

For references and related work on trust, use the search keyword: gerck trust

My work has received extensive worldwide press coverage from The New York Times, Le Monde, O Globo, Forbes, CBS, CNN, Business Week, Wired, the San Jose Mercury News, Aftonbladet, and USA Today. In 1999, I was a member of the Registry Advisory Board of Network Solutions, Inc. (NSI). I am also the founder of Gerck Research (a scientific cooperative) and of the Meta-Certificate Group (MCG), chairman of the board of the Internet Voting Technology Alliance (IVTA), founder of NMA, Inc., and founder and CEO of Safevote, Inc.

Finding a good question...

Finding a good question first has often led me to more exciting results — possibly so new as to be out of mind's reach had I tried to formulate a hypothesis first.

Looking for more creativity and improved critical thinking? Looking to cut some Gordian Knots? Finding a good question is not just about asking. It may be as easy as changing "is" to "could be", looking for an answer rather than "the" answer.

How much better? By testing what happens when people think in terms of conditionals instead of absolutes, social psychologist Ellen Langer of Harvard shows that critical thinking and problem-solving can be made 13x more productive (from 3% to 40% success rate) by people focusing on "could" instead of "is". Our own observation is that in scientific research, finding a good question (what COULD be?) at the start seems to be more productive than formulating a hypothesis (this IS) first. It also carries less bias.

The goal of a good question is to lead to one or more fruitful scientific hypotheses, each providing broad avenues of research. A good question is a veritable factory of fundamental discoveries.

Education
• Postdoctoral in Physics, Max-Planck-Institute for Quantum Optics, Garching bei Muenchen, Germany, 1982-83
• Ph.D. Physics, maximum (sehr gut) thesis grade, Ludwig-Maximilians University, Muenchen, Germany, 1983
• Doctoral Research in Physics, Max-Planck-Institute for Quantum Optics, Garching bei Muenchen, Germany, 1979-82
• M.Sc. Physics, Instituto Tecnologico de Aeronautica, Sao Jose dos Campos, SP, Brazil, 1978
• Electronic Engineer, Instituto Tecnologico de Aeronautica, Sao Jose dos Campos, SP, Brazil, 1977

Awards
• MPG Germany Postdoc Grant
• DAAD Germany Doctorate Grant
• CNPq Brazil Doctorate Grant
• Fapesp Brazil Research Grant (Mathematics, Physics)
• ITA/CTA 5-year Engineering Grant

Profile and Publications: https://www.researchgate.net/profile/Ed_Gerck


New Frontiers in Physics: The Scientific Method Revisited


© Ed Gerck, 2015.
All rights reserved, free copying and citation allowed with source and author reference.


ABSTRACT
This work explores the possibility of new frontiers in modern physics, brought by a revisited scientific method, showing how and why science can safely expand its current frontiers beyond Popper's falsifiability criterion, which we report can produce false positives and false negatives in deciding whether a theory is scientific. The work eliminates common conceptual conflicts, narrowly defines the scientific hypothesis, introduces a specialization of refutability, and presents a new demarcation boundary for science, while still subjecting science to potential observational evidence that could persuasively decide whether a theory is wrong and lead one to abandon it as a scientific theory. A new narrative is also included, for applications in the philosophy of science. Popper's logic of scientific discovery is shown not to be a fitting model for scientific discovery.

KEYWORDS: scientific method, quantum physics, modern physics, formal sciences, big data, refutability, falsifiability, trustworthiness

DISCLAIMER: This work is presented as a draft, may change frequently, and does not intend to cover all the details of the research reported, or all the variants thereof. Its coverage is limited to providing support and references to work in progress and to unifying references, concepts and terminology. No political or country-oriented criticism is to be construed from this work, which respects all the apparently divergent efforts found today on the subjects treated. Individuals or organizations are cited as part of the fact-finding work needed for this work, and their citation constitutes neither a favorable nor an unfavorable recommendation or endorsement.

Introduction

It is known that fundamental constants of nature, such as the fine-structure constant and the cosmological constant, have values that lie in the small range that allows life, as we know it, to exist. This could be considered highly improbable according to current physics theories, the probability being the product of already low probabilities. The universe started out 13.8 billion years ago with predicted equal amounts of matter and antimatter, which science also predicts should have totally annihilated each other, and yet life, as we know it, exists. The measured cosmological constant is bizarrely small, some 120 orders of magnitude below the value predicted by quantum field theory.

Successful physics theories, including quantum mechanics and relativity, are not failing to provide answers to these and other puzzles, but are failing to provide answers that can be — at the very least — scientifically discussed [1,2,3].

There is more trouble at the lab. Experiment outcomes, reportedly following the scientific method, seem to be lacking in trustworthiness, causing apprehension in society and governments [4,5].

The advent of BIG DATA and fast pattern-learning analysis has also called into question the very need for the scientific method to exist [5,6].

Many blame the scientific method itself — that the empirical test requirement can make the search for solutions impossible, too time-consuming, or too expensive, and that Popper's falsifiability criterion can be "too definitive" to reliably define what is or is not scientific.

Rather, considering the success of science in the last centuries, these puzzles and troubles suggest that there is today no fitting scientific method that can at the same time deal with the needs of past and modern physics, support the increasing role of science in society and government, and continue to use Nature for acceptance or rejection.

We present these considerations in order to motivate the need for a revisited scientific method using Nature as a guide, but looking further than just requiring empirical testing and Popper's falsifiability criterion.

The revisited scientific method is presented here first, as a testable, scientific subject in the area of natural science and physics, including more channels of verification and a new definition of refutability, rejecting Popper's falsifiability criterion — as we show, it may flag false negatives in deciding whether a theory is scientific, or accept false positives. We then apply this work to new frontiers in physics, hoping to provide a sound framework for scientific evaluation in past and modern physics.

The work also presents a new narrative, a philosophical subject, mainly regarding how acceptance or rejection of a theory can be based on Nature, with fewer limitations than current philosophy-of-science concepts, helping to bridge natural science, mathematics, and philosophy.

Outline

Section 1 discusses the intuition behind this work, including a Gedankenexperiment: although the conventional scientific method provides the desirable aspect of a deterministic outcome, it needs to be revisited in order to address the needs of modern physics as well as to qualitatively improve an experiment's trustworthiness in the presence of faults and lack of sufficient information. This Section also includes scientific, philosophical and other supporting arguments, as well as possible counterarguments.

Section 2 provides the revisited scientific method, presented as a testable, scientific subject in the area of natural science and physics.

Section 3 applies the revisited scientific method to new frontiers, including the experimentally problematic science areas of string theory, multiverse, and parallel realities, and how they can be considered in physics.

In the Conclusion, we discuss the results, limitations, extensions and applications.

1. Intuition - The Scientific Method

What if perfect human assistants, scientists, peer-reviewers, publishers, and government agencies, as well as perfect computers and software, all ideally honest and error-free, designed, ran, and verified an experiment? Would the result be trustworthy?

No, not necessarily. Trustworthiness of the experiment's outcome would still depend on whether a number of requirements are met, such as that no one factor (e.g., perceptual, conceptual) was left out or mistakenly used. However, while there are unknowns that we know we do not know, and may try to account for, certainly there are also things we cannot account for because we ignore that we ignore them.

The above Gedankenexperiment refutes:
  1. the public and often repeated idea that if an experiment's published results reveal themselves to be wrong one day, then there is "trouble at the lab" [5];
  2. that scientists — including peer-reviewers and publishers — were not careful or honest enough;
  3. that although scientists like to think of science as self-correcting, it is not.
It should become obvious, therefore, that even if the scientists "at the lab" and reviewers are ideally honest and careful, and all use flawless devices with flawless software, the trustworthiness of an experiment's results still remains elusive.

An experiment's trustworthiness does not depend only on the competence and honest behavior of science's immediate ecosystem of scientists, reviewers, and publishers. It also depends on other actors, including society and government as consumers of science. Everyone, not just scientists, needs to understand the limitations of a scientific experiment, and to be able to have its scope as well as its trustworthiness represented accurately and reliably in any use they make of the results.

What if an experiment is not run by such ideally perfect, honest scientists, and perfect computers, in a perfect environment? What if one includes financial interests, market failure, political gain, collusion, coercion, dishonesty, egotism, corporativism, ethnicity and religious or atheistic values, bias, ignorance, human error, faults, attacks and threats by adversaries? Then, additional requirements have to be imposed in terms of the trustworthiness of the experiment's outcome. The conventional scientific method requires peer-review, and that experiments must be repeatable over time and in different places, for example. But how about those things that we ignore that we ignore? Would more peer-review and experiment duplication, with more time and places, provide "enough" risk reduction to allay trustworthiness concerns? What do we mean by "enough"?

Adding falsifiability and other criteria has not considerably improved this picture [2]. The needs of modern physics have also moved beyond this picture, seemingly out of the reach of science — for example, in realms far removed from everyday experience, where the connection to experiment may not even be possible.

Now, if we want to keep making progress in physics, and in the increasing reliance of society and governments on physics, we must also make progress in the scientific method we use, better defining requirements, what we mean by "enough", and how to reach it.

Some recognize that there is yet no good model to be used in designing these important additional requirements [2,3], to comprehensively define "enough" in terms of risk of not reaching a scientifically defensible result, and to avoid opening a door in the direction of spurious claims. George Ellis and Joe Silk [3], for example, expressed the concern that theoretical physics risks becoming a no-man's-land between mathematics, physics and philosophy, which does not truly meet the requirements of any.

The conventional scientific method can be described by "guess, experiment, then verify with Nature". Most physicists add to the conventional requirements the "falsifiability" criterion proposed by the philosopher Karl Popper in the 20th century, that a theory is scientific if and only if it makes clear predictions that can be unambiguously falsified.

However, as we pursue in items 1.1 and 1.2, the above Gedankenexperiment shows:
  1. that there are missing verification channels in Popper's falsifiability criterion;
  2. that Popper's falsifiability criterion can produce false positives (calling "scientific" a theory that is not);
  3. that Popper's falsifiability criterion can produce false negatives (calling "non-scientific" a theory that is).
The above Gedankenexperiment also clearly opposes Popper's logic of scientific discovery, which removes the demand for empirical verification in science in favor of empirical falsification. Because we cannot account for things we ignore that we ignore, the Gedankenexperiment's outcome may not be falsified within the Universe's lifetime, and yet one can argue now (from a metasystem viewpoint) that the outcome COULD BE false. Popper's logic of scientific discovery is also contradicted by the history of physics, with many textbook examples showing that experience can determine theory (i.e., we can indeed argue or infer from observation to theory, as Newton discovered the law of gravitation and as Max Planck found the quantum), and that falsification may fail to indicate which theories are false and which are true, considering modern physics.

Popper's logic of scientific discovery falls apart; scientific theories can be, and have been, inductively inferred from experience.

Before we sharpen this picture, let us review the context. Science is about exploring. As we explore and then stop to focus our attention on new phenomena, we have relied on the scientific method — a free, open, self-correcting, 100% recycling, social machine that has been evolving for more than 3,000 years.

The fuel for this machine has evolved in time from observation to the scientific hypothesis, both subjects of much experimentation and discussion.

In this evolution, scientific knowledge has remained firmly defined as empirical, with its phenomena and relationships, despite centuries of scientific and technological progress. In natural science, our starting and ending points have remained physically sensorial, while we have multiplied manifold the power of our means of observation (direct and indirect), reasoning, and mathematics.

There is trouble at the lab, however, as reported in the Introduction. The puzzles and troubles facing physics and other natural sciences seem to stem from the conventional scientific method, used in hundreds of years of "modern scientific era" research. The conventional approach, although beautifully providing a deterministic outcome, fails to provide guidance for improving the trustworthiness of the experiment's outcome beyond what we see today, and fails to satisfactorily include modern areas of physics research.

What seems to be missing is an improved model, as motivated above and further described next, offering a more comprehensive narrative, while still subjecting it to Nature's guiding hand, to potential observational evidence that could persuasively decide whether a theory is wrong and lead one to abandon it as a scientific theory.

The Gedankenexperiment supports this new narrative and a specialization of refutability, not a synonym for the "falsifiability" criterion proposed by Popper, and not including Popper's logic of scientific discovery.

1.1. What is a scientific hypothesis?

We start by revisiting and simplifying the first step of the scientific method, the hypothesis. We understand that a scientific hypothesis is not a guess, not an educated guess, not a prediction, and need not include cause, cause and effect, or a model outside of what is provided by the hypothesis itself. A hypothesis is no more and no less than a "testable relationship". It matters not how the hypothesis, so defined, is formulated; as is well known, the hypothesis could have come from a dream, as cited by Kekulé in regard to the source of his hypothesis for the structure of benzene; it could also have come from an educated guess, or from an exhaustive search; and it may include cause and effect if so desired.

Our motivation is that to necessarily prescribe more than what is needed, a "testable relationship", is to accept dead weight, which may restrict investigation, introduce bias, or lead to an error. We do not, however, eliminate Nature (i.e., the requirement of testability), and continue to use the past litmus test of science.

Definition: A scientific hypothesis is a testable relationship.

In observing that test, we understand that "YES" means "NOT YET FALSE" and "NO" means "COULD BE TRUE". We specialize this concept as refutability, defined below.

1.2. Refutability, Refutable

Definition: Refutability (being Refutable) is the undeniable possibility that a statement is false.

A "YES" result could be false, which means that it should be understood as "NOT YET FALSE"; a "NO" result could be false, which means that it should be understood as "COULD BE TRUE". Nature remains the ultimate possibility for acceptance or rejection, and its judgement is deferred for as long as desired.

It is important to note that refutability is not a synonym for the "falsifiability" criterion proposed by Karl Popper, to separate science from non-science (the "demarcation problem"). Refutability includes more channels of verification and can happen for a number of reasons, for example:
  1. Popper's falsifiability (that it is possible to conceive an observation or an argument for which the statement in question is false); or
  2. a YES or a NO result may at one time no longer fit the class of other results; or
  3. a YES or a NO result may be seen as incomplete; or
  4. a YES or a NO result may be outright invalid; or
  5. a combination of such reasons.
The list above shows that Popper's falsifiability criterion can produce both false positives and false negatives in deciding whether a theory is scientific.

In science, a NO is an answer meaning "possibly NOT this way". The value of publishing a NO result is well recognized. For experimentation and reasoning purposes, refutability means that we respect a NO answer because it came to us by following the same path that can lead to a YES, the path of experimentation, of questioning. On the personal side, for researchers, refutability means never telling yourself off, never giving yourself zero marks, and never ignoring mistakes. Instead, researchers can be motivated by refutability to see any errors as beautiful failures, pregnant with opportunities, by focusing on "could" instead of "is": a "NO" means "COULD BE TRUE".

1.3. Demarcation Problem

The refutability criterion can be applied as a new, consistent solution to the "demarcation problem", deciding whether a theory is scientific:

Definition: A theory is scientific if and only if it includes the undeniable possibility that it is false.

This definition applies not only to an experimental science such as physics ("A single experiment can prove me wrong", A. Einstein), but also to a formal science such as mathematics. Early mathematicians viewed negative numbers with suspicion. The number 2 was not always considered a prime. The symbol "=" is now known to have at least five different definitions.

Recognizing that the final judgment over YES or NO is also deferred in conventional natural science and physics, we will continue the "demarcation problem" discussion in item 1.8.

1.4. Science cannot prove or establish truth

Many people may be surprised here, as this is another current conceptual conflict. As we honestly pursue the truth in science, we must also consider that this truth is actually a COULD and not an IS. It is actually a NOT YET FALSE, not a truth, not a YES. Therefore, to preserve the strength of argument and be honest, we understand that we have nothing to gain and everything to fear from the so-called "honest pursuit of truth". Science is not the pursuit of truth, even when done honestly. To think otherwise is a trap.

For it is impossible for science to pursue the truth, because we do not know where truth is and so we have to base ourselves on experiments and observations, both of which can be flawed. This, of course, can lead to flawed conclusions, so even the most secure scientific statements can never be confused with truth.

This creates a problem for statements such as "avoid fooling yourself into thinking something is true that is not or that something is not true that is." The problem is that science cannot define objectively what "true" is. Not in experimental science, as Nature has countered so often, and also not in formal science. Logic and mathematics have no objective notion of "true". Science, in how and why it works, requires refutability. Science cannot prove or establish truth.

Scientific discoveries, no matter how old and verified they are, should not be confused with objective reality, which should also not be confused with reality "as it is" (usually meaning what cannot be measured).

Objective reality is a social agreement that changes with time, as with the Sun going around the Earth.

What if nearly everyone in a certain society agrees with the interpretation of an observation; could this be an objective reality? Yes, but this does not remove the double layer of "observation then interpretation" that stands between a phenomenon and its description, through the observer (even if a device) and the interpretation (even if done by software). Even in this simplified two-layer view, one cannot eliminate precision and reliability errors at each layer, nor do away with quantum uncertainty conditions at each layer, nor with layer interdependency. Additional layers can be added, for a more realistic view, such as financial interests, political gain, collusion, coercion, dishonesty, ethnicity and religious or atheistic values, bias, ignorance, and other factors including human error, all leading to further errors in what a certain society may call "objective reality" at a given time.

While the scientific method has been able, historically, to overcome many of the sources of errors besetting what a certain society may call "objective reality" at a given time, it cannot claim to reduce all such errors to zero — and it does not. Scientific discoveries are accepted because they work and while they work, continuously re-evaluated in themselves and in coherence with all other discoveries.

How about the value of pi, or that the Earth is round? Can these later be found to be false?

Yes, and we already know that. Although the value of mathematically (i.e., arbitrarily; see item "1.5. Mathematics and Physics" below) defined constants may not change, the ratio of the circumference to the diameter of a circle in non-Euclidean geometry, as used in physics, can be more or less than pi. However, that ratio is very nearly pi over the small distances of ordinary measurements on Earth. The Earth is not really round; we thought it was more accurately an oblate spheroid, but that was wrong too — it is wider toward the south pole, more like a potato. The Earth's shape will also eventually change in the future.
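
As a minimal numerical sketch of that point (an illustration added here; the sphere radius R and geodesic radius r are assumed symbols, not part of the text above), a circle of geodesic radius r drawn on a sphere of radius R has circumference 2*pi*R*sin(r/R), so its circumference-to-diameter ratio is pi*sin(r/R)/(r/R), which is always below pi and approaches pi as r/R goes to zero:

    # Sketch: circumference-to-diameter ratio of a circle drawn on a sphere.
    import math

    def ratio(r, R=1.0):
        """C / (2r) for a geodesic circle of radius r on a sphere of radius R."""
        return math.pi * math.sin(r / R) / (r / R)

    for r in (0.001, 0.1, 1.0, math.pi / 2):
        print(f"r/R = {r:5.3f}  ratio = {ratio(r):.6f}")
    # Every printed ratio is below pi (about 3.141593) and tends to pi for small r/R.

For r/R = pi/2 (a circle as large as the equator, measured from a pole), the ratio is exactly 2; for everyday distances on Earth, r/R is tiny and the ratio is indistinguishable from pi in ordinary measurements.
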
Like arrows in a war, words do not know how to avoid hitting undesired targets. Many respectable people, institutions, reviewers, and students often become undesired targets and believe prima facie — as sufficient to establish a truth — what scientists say, what otherwise trusted sources of science may peer-review and publish, what government agencies with scientific missions may promote for the meritorious purpose of informed governance. Lacking scientific training, most people do not understand that in science "YES" means "NOT YET FALSE" and "NO" means "COULD BE TRUE".

Let our conversations, thus, begin by affirming clearly that in science we have nothing to gain and everything to fear from the so-called "honest pursuit of truth". We need to consider, and explain, that a fact should not be understood as truth, but as something that we were willing to believe.

The principle here is that yesterday's facts and truth are today's bias. Accepting this principle is a paradigm shift. It is similar to viewing an optical illusion image; your own bias (facts are true, I have no bias) does not let you see it at first but, once you have seen it, you can always see it. Bias is the enemy, and facts are bias.

In science, we do not affirm, we discuss; we do not accept, we discuss; we do not expound the Truth, we inquire. Unless we are willing to boldly call wrong or incomplete, or to see in a different way, what was known and thought to be well understood before, we cannot advance toward Truth.

All this makes a shaky foundation for science as an enterprise dedicated to discovering the truth about the Universe, or as a modern-day sharashka where truth is something assigned to be researched by low-paid postdocs, or as an expounder to governments and society of facts that are the truth.

Conversely, the discussion supports the idea that we are more creative when we change from "is" to "could be", looking for an answer rather than "the" answer. Social psychology studies (Ellen Langer et al.) show that critical thinking and problem-solving can be made an order of magnitude more productive by people focusing on "could" instead of "is".

1.5. Mathematics and Physics

In mathematics, even if just in number theory or arithmetic, there are statements that are true and yet cannot be proved (cf. Goedel's incompleteness theorems). In physics, we also accept the idea that an accurate mathematical representation of the dynamics of Nature may not be possible.

However, mathematics and physics are developed differently.

Pure mathematics is entirely made up. Mathematics is created by human imagination, even the natural numbers. But, rather than always offering no limits and more freedom, as one might think, human imagination is quite limited in number, experience, and history. However individually soaring it may be, human imagination can also harness mathematics. Still, mathematics preserves the role of intuition and abstraction: to try to reach the absolute, the invariant without instances, to eliminate all unnecessary hypotheses, and to show the solution of old problems arising out of its own inward-looking mysteries.

Physics, on the other hand, while preserving intuition and abstraction, is somewhat freer from the limits of human imagination and even history. Physics can surprise our most active imagination — "imagination tires before Nature," as Pascal told us.

This motivates the skeptic to think that the current manmade division between mathematics and physics is not based on the nature of things, is not necessary, may hide areas of work in a meandering "no man's land", and may even leave disconnected "islands" unexplored, areas that may perhaps one day be reached by chance.

Thus, it is possible that physics in general and quantum mechanics in particular, being a system based not on human imagination alone but on Nature, can play a role where mathematics cannot — and be informative to mathematics, pure and applied, including computation, cybersecurity, and cryptography, as well as other disciplines. The reverse is also possible. Next, we exemplify and discuss these ideas further.

1.6. Do-It-Yourself (DIY)

Mathematics and physics can be much broader once you question them and discover that what we call mathematics and physics were made up by people, by people like you and me; you can change them, you can DIY (do-it-yourself) them, you can hack them, you can create your own definitions and theories, knowledge that other people can use.

Oftentimes, students give up on mathematics and science because they do not seem to make sense. The student may be right, and that is why the DIY case is important.

For example, when 5-year-old Christopher looks at different results on a calculator and concludes that zero is not really a number, Christopher is stating that zero belongs to a different class than other numbers. This is evidenced in division, where you can divide a number by any other number except zero. Perhaps seeing zero as the "absence of a number" rather than a number can lead to simplifications, if not to new results or, at least, to peace for one's soul. Likewise, ca. 1950, some mathematicians got tired of writing "any prime number except 1" in theorems and decided, one by one in DIY fashion, that 1 would no longer be a prime number — although 1 satisfied the conditions then used to define a prime number.
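
A minimal sketch of the division example (an illustration added here, assuming ordinary arithmetic as on Christopher's calculator; the helper function is hypothetical): zero behaves like other numbers under addition and multiplication, but not as a divisor.

    # Sketch: as a divisor, zero falls outside the class of all other numbers.
    def divide(a, b):
        try:
            return a / b
        except ZeroDivisionError:
            return "undefined"  # zero is the one number you cannot divide by

    print(divide(10, 2))  # 5.0
    print(divide(10, 0))  # undefined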

1.7. The search for the absolute -- new frontiers in science

A fault of physics and natural science is exactly that of being a science of testable relationships, bound to establish relationships between phenomena, although possibly conveyed in a mathematically exact form. We multiply the power of our means of observation and reasoning, but our starting and ending points remain sensorial, the phenomena and relationships. In that, as said by Max Planck, "Our task is to find in all these [relative] factors and data, the absolute, the universally valid, the invariant, that is hidden in them." Of course, we understand that the search for the absolute in science, as mentioned by Max Planck, still includes the possibility of new phenomena and new realizations, unexplained or in refutation (i.e., showing refutability) of what was once thought to be the universally valid, the invariant.

Conventionally, a natural science theory is ultimately judged by its ability to account for the data we can observe, although the steps along the way can be indirect. Counterpointing physics with mathematics, using a more abstract view, may help modern physics in realms far removed from everyday experience, where the connection to experiment may not even be possible.

Mathematics shows to physics the power of the abstract view, the invariant, while physics shows that even the most original creation of human spirit is more limited than Nature.

Thus, physics, not being limited by human history and imagination, and mathematics, not being limited by phenomena, could play a joint role in modern scientific investigation by extending our exploration beyond that which we can observe or imagine.

Most physicists are willing to accept the existence of things that cannot be proven. Most mathematicians are willing to accept the existence of things without any relation to phenomena — e.g. things that we, at present, cannot observe or construct in the physical world. Whether or not we can observe something directly, contemplating its possible existence may allow us to understand how it might play a role in how the world works.

1.8. The Demarcation Problem in New Scenarios

Consider the new "Demarcation Problem" definition in item 1.3, and new scenarios, with things that we, at present, cannot observe or construct in the physical world.

Karl Popper's criterion of "falsifiability" can no longer be used. It was shown in items 1.1 and 1.2 to produce both false positives and false negatives in deciding whether a theory is scientific, and is not applicable, in principle, to these new scenarios.

However, the refutability criterion can be applied in these new scenarios, both to physics and mathematics, changing the current demarcation criteria for science.

By recognizing that, as in the revisited scientific method, the final judgment over YES or NO is also deferred in conventional physics and natural science, we consider using refutability as a new guide into Nature's mysteries, allowing researchers to securely expand the frontiers of what is nowadays considered scientific into new scenarios, going into otherwise "forbidden" science areas between physics and mathematics.

2. The Scientific Method Revisited — New Narrative

In the fourth century B.C., Aristotle affirmed that to be scientific, one must have apodictic certainty (capable of demonstration, clearly provable and logically certain), and deal with causes or reasons.

The requirement to include causal or reason considerations in the scientific method was revisited from the XVII century onward, and is no longer necessary for a result to be considered scientific. In the second (1713) edition of the Principia, Isaac Newton famously disagreed with the need to deal with causes or reasons as a sine qua non condition to be scientific. Unable to discover the cause for the properties of gravity from phenomena, Newton refused to speculate ("Hypotheses non fingo"). In 1827, when describing the jittery movement in water of particles ejected from the pollen of plants (Brownian motion), the botanist Robert Brown was able to deny that the particles were alive, but did not present a cause for the motion. Refusal to speculate about a cause, cause-effect relationship, or reason for phenomena increased the clarity and precision of statements and reduced the expression of bias.

The requirement to include a falsifiability criterion in the scientific method was formalized in the 20th century, when philosopher of science Karl Popper proposed an additional test in the scientific method — that a theory is scientific if and only if it makes clear predictions that can be unambiguously falsified. A statement is falsifiable if it is possible to conceive an observation or an argument which proves the statement in question to be false. Popper's falsifiability criterion introduced a new demarcation between science and non-science, which is still being discussed.

Today, natural science again faces a need to revisit the scientific method, as referenced herein. In our search for a revised scientific method that can deal, at the same time, with past and modern natural science, our task is to show how and why the current frontiers of science can be expanded beyond Popper's falsifiability criterion — shown in this work to produce both false positives and false negatives in deciding whether a theory is scientific — while still subjecting natural science to potential observational evidence that could persuasively decide whether a theory is wrong and lead one to abandon it as a scientific theory.



(to be added, formalization of the intuition in Section 1)

3. The Scientific Method Revisited — New Frontiers in Natural Science

(to be added)

Conclusion

This work introduces a revisited scientific method, as a testable, scientific subject in the area of natural science and physics. In terms of its own definition of refutability, it should be abandoned, modified, or accepted based on observational evidence.

The revisited scientific method, including its new narrative, eliminates common conceptual conflicts, narrowly defines the scientific hypothesis, and presents a new demarcation boundary for science with the specialization of refutability, showing how and why science can safely expand its current frontiers beyond Karl Popper's falsifiability criterion — shown to produce both false positives and false negatives in deciding whether a theory is scientific — while still subjecting science to potential observational evidence that could persuasively decide whether a theory is wrong and lead one to abandon it as a scientific theory.

The new narrative is also presented as a philosophical subject, mainly regarding how acceptance or rejection of a theory can be based on Nature, with fewer limitations than current methods, bridging science and philosophy.

Applications, observation and validation are also discussed in terms of the formal sciences and psychology, including creativity and critical thinking. The wide scope of this proposal as an abstract idea is evidenced by these and other fields of application, showing different instantiations of the search for the absolute, the universally valid, the invariant, that is possibly hidden in whatever exists as contingent and conditional.

We note that this work is applicable not only to the sciences but also to other aspects of the formal and informal (DIY, do-it-yourself) pursuit of education and knowledge.

REFERENCES

[1]

[2]

[3]

[4] Lehrer, J. 2010, "The Truth Wears Off, Is there something wrong with the scientific method?", The New Yorker, Dec 13. Retrieved from http://cbbp.thep.lu.se/~henrik/mnxa09/Lehrer2010.pdf

[5] Trouble at the lab. (2013, October 19) The Economist, 409 (8858), 20-23. Retrieved from http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble




INTERNET MODEL

The original, and current, Internet design has been mostly based on an honor system for the end points. The model was that the connection is less trusted than the end points, since access to the end points was granted under an honor system and usage rules were effectively enforceable.

Reality showed that this model was upside down for commercial operation. The end points are less trusted than the connection. In fact, even if usage rules are enforceable at some connection points, the end points cannot be controlled. Anyone can connect to the network. There is no honor system. Usage rules are in fact not enforceable; users can hide and change their end points.

The solution is to introduce trust as an explicit part of the design, trust that was implicit when the Internet was based on an honor system. Of course, updating the Internet design to fit its current operating conditions is useful not only to stop spam. Social engineering and spoofing attacks also rely on the old honor system where users are trusted. "Trust no one" should be the initial state under the new Internet paradigm.

The bottom line is that trust depends on corroboration with multiple channels (see Trust, above), while today we have neither (a) the multiple channels nor (b) the corroboration mechanisms. So, we lack trust because we cannot communicate it. Current work by Ed Gerck and team includes these topics, with proposals and tests to combat spam, spoofing, and denial of service, as well as information-theoretic secure authentication integrated with authorization for access control.
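
As a minimal sketch of such a corroboration mechanism (a hypothetical illustration added here; the function name, report format, and threshold are assumptions, not a published design of this work), a value received over one channel is qualified only when enough independent channels agree on it:

    # Sketch: qualify a received value by corroboration across independent channels.
    from collections import Counter

    def corroborate(reports, threshold):
        """reports: values independently received over distinct channels.
        Returns (value, True) if at least `threshold` channels agree on a value,
        otherwise (None, False)."""
        if not reports:
            return (None, False)
        value, count = Counter(reports).most_common(1)[0]
        return (value, True) if count >= threshold else (None, False)

    # Example: the same key fingerprint arrives over three of four channels.
    print(corroborate(["ab12", "ab12", "ff00", "ab12"], threshold=3))  # ('ab12', True)
    print(corroborate(["ab12", "ff00"], threshold=2))                  # (None, False)

The point is only that trust in the value does not come from the value itself but from channels independent of it; with a single channel there is nothing to corroborate against, which is the lack of trust described above.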


Current Research Areas
Quantum physics, complex and quaternion analysis, Clifford algebra, cybersecurity, data sans matter, information theory, trust theory, reality modes, cognitive processes (natural and artificial/AI), and effective scientific methods.

Contact Information
ed at gerck • com

© Ed Gerck, 2001-2015.