Friday, May 8, 2015

Abuse of Statistics - When does a Theory Fail? (Sleeping with the Fishes)

"Godfather Horse" with Last Fish Supper
In future posts I'm going to review the scientific literature for the criteria used to declare new physics, the predictive precision of a theory, and various other metrics.

Really, what I am looking for is how science determines the quality of a theory. How well does observation match theory, and theory match observation?

Once that is defined, that is, once we see how science uses sigma or other statistics to decide whether new physics is at play, we will compare that use of statistics with how statistics are used for quality assurance in the semiconductor industry's high-volume production flow: the well-publicized Six Sigma ($6\sigma$) program.
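As a rough illustration (my own sketch, not taken from any one paper), the two conventions can be put side by side: particle physics declares a discovery at a one-sided 5$\sigma$ tail probability, while manufacturing Six Sigma quotes defects per million opportunities after assuming a conventional 1.5$\sigma$ drift of the process mean.

```python
# Compare a physics "discovery" threshold (5 sigma, one-sided) with the
# manufacturing Six Sigma convention (6 sigma minus a 1.5-sigma mean shift).
# Uses only the standard library; numbers are illustrative.
import math

def one_sided_tail(sigma):
    """One-sided tail probability of a standard normal beyond `sigma`."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

# Particle-physics discovery threshold: 5 sigma, one-sided.
p_discovery = one_sided_tail(5.0)

# Six Sigma as practiced: the process mean is assumed to drift by
# 1.5 sigma, so the effective tail sits at 6 - 1.5 = 4.5 sigma.
dpmo = one_sided_tail(6.0 - 1.5) * 1e6  # defects per million opportunities

print(f"5-sigma discovery p-value: {p_discovery:.2e}")
print(f"Six Sigma DPMO (with 1.5-sigma shift): {dpmo:.2f}")
```

Run as written, this gives a 5$\sigma$ p-value near $2.9\times10^{-7}$ and the familiar Six Sigma figure of about 3.4 defects per million.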

Really, what I am getting at is error analysis. A whole combination of things has to go into a TOTAL error analysis: everything from the fundamental and simple absolute vs. relative measurement, ground references, measurement error, "theoretical error", and PVT-induced error, to other things most have never thought of...
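A minimal sketch of what a total error budget looks like when the sources are independent: the components combine in quadrature. The component names and values below are illustrative assumptions, not a standard taxonomy.

```python
# Combining independent 1-sigma error sources in quadrature.
# Component names (measurement, reference, model, PVT) and values
# are illustrative only.
import math

def total_error(*components):
    """Root-sum-square of independent 1-sigma error components."""
    return math.sqrt(sum(c * c for c in components))

# Hypothetical error budget for one measurement, all in mV:
measurement = 0.30   # instrument measurement error
reference   = 0.10   # ground/reference offset
model       = 0.25   # "theoretical" (model) error
pvt         = 0.20   # process/voltage/temperature induced error

sigma_total = total_error(measurement, reference, model, pvt)
print(f"total 1-sigma error: {sigma_total:.3f} mV")
```

Note the quadrature sum (0.45 mV here) is well below the 0.85 mV a naive straight addition would give; that difference is exactly why the independence assumptions in an error budget matter.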

Never a dull moment with...                                       

The Surfer, OM-IV


  1. Hiya +phxmarker mark,

    I love you! You raise tough points on the question, and your conclusions aim straight at what would count as proof of a new, better atomic theory, one that is more scientifically appropriate a priori because it comes from a greater unification. One can only do statistics on data fed into connected formulas. The larger the data basis (that is, the measurement of the field of mutual induction between real objects), the greater the correlation and significance coefficients; but one has to refer to a list of terms and data from a technical construction. In any case, no one can handle the data in a timely formula until an apparatus is designed for that question.

    I have thought for a long time about logic and the unification of the sciences, where people argue in the abstract instead of building a realistic and more convenient theory for those philosophies. Unfortunately even the best of them are false, restricted from the outside by infinity (even in functions). Maybe the word "infinity" has lost some of its meaning ;). In a few words: it is simple to show that "infinity" is only "a point in time depending on the real motion of an object", where the forms of the functions do not match the type of logic the operation requires, and, for the measurement [m], the [units] concerning "the object of the small" are not unfolded and selected (which is the matter of the quantum field), where the dynamics of the functions change what must be known first. Those mathematical circumstances are not known, and scientifically there is no reason why such an implementation of semiconductor techniques should serve our newer view of scientific unification.

    ... see U

    It's hard to realize that from an unknown and even folded "horizon"

  2. Thank you +Mithrahee Zor-El. I design high-volume integrated circuits for a living and use statistics to deliver a quality product (low PPM-to-PPB defect rates). So I know statistics, dynamics, stochastics, and whatever else is necessary to create value for the investors.

    While doing my research, I found that the "scientists" appear to be "grading" their work quite differently from how high-volume production uses statistics, or at least that's how it appears to me so far; this is the driving force behind this new angle.
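    The PPM-to-PPB language above can be made concrete with a simple binomial yield model: each die has some number of independent defect opportunities, and the per-opportunity defect rate sets the fallout. The numbers below are illustrative assumptions, not production data.

```python
# A minimal sketch mapping a per-opportunity defect probability to die
# yield and PPM fallout in high-volume production. Numbers are illustrative.

def yield_fraction(defect_prob, opportunities):
    """Yield assuming independent defect opportunities (binomial model)."""
    return (1.0 - defect_prob) ** opportunities

p_defect = 3.4e-6   # per-opportunity defect rate (the Six Sigma DPMO figure)
n = 1000            # defect opportunities per die (illustrative)

y = yield_fraction(p_defect, n)
fallout_ppm = (1.0 - y) * 1e6
print(f"yield: {y:.4%}, fallout: {fallout_ppm:.0f} PPM")
```

    Even at a Six Sigma per-opportunity rate, a thousand opportunities per die leaves fallout in the thousands of PPM, which is why per-die quality targets drive per-opportunity rates down toward PPB.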

  3. Of significant note: when the going gets tough and data analysis overloads the computational power presently harnessed, how does one get more processing power? What, or who, do you turn to?

    When the problems get tough, there is nothing more powerful than a human's processing capability.