malament-hogarth wrote:
Reply to comment by SubtlySubbing in For the émigré philosopher Imre Lakatos, science degenerates unless it is theoretically and experimentally progressive by ADefiniteDescription
“Theories have always outpaced experimental advancements.”
No. Take the CMB, for example. It was discovered by Penzias and Wilson while they were not looking for it: a totally accidental Nobel Prize.
String theory is an interesting discussion, because it is more closely woven with the serendipitous (especially in the extraction of dynamics), the nature of a substrate, and the nature of operational distinguishability, with a very simple rule set. Like Newton's theory, it is a special use of general relativity that requires a displacement. Newton's theory is not really a new paradigm; it is just a subset of Einstein's. Kuhn is wrong in this regard. He is right about the expanse of conceptual understanding.
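As a concrete rendering of that "subset" claim, the standard weak-field, slow-motion limit of general relativity reduces to Newton's theory (a textbook derivation, sketched here, with Φ the Newtonian potential):

```latex
g_{00} \approx -\Bigl(1 + \tfrac{2\Phi}{c^2}\Bigr), \qquad
\nabla^2 \Phi = 4\pi G \rho, \qquad
\frac{d^2 x^i}{dt^2} \approx -\partial_i \Phi
```

Every Newtonian prediction survives as a limiting case; what changes is the conceptual expanse around it, which is the part Kuhn gets right.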
Displacement is the core of the Calabi-Yau manifold: how to approximate distances. Nonlocality is the core assumption, which degenerates into separable fermionic and bosonic interactions. DM comes into play because of how the Lagrangian drives the modern theory; the jig is not up just because the anomalous magnetic dipole moment (the amplitude for an electron to absorb and emit a photon) is slightly off from Dirac's prediction of g = 2. There is a means of correction with coupling constants to a vast array of conformal field theories. This is what it means for a field to be renormalized, which some interpret as a means of rescaling, others as a means of quantization. The constant of constants. Either way, the electron mass must be measured and plugged into the theory by hand, past the Standard Model. The Standard Model represents the core of the fracturing of symmetry, and shows the use of symmetry to conserve features independent of dimensionality.
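For reference, the leading quantum correction to Dirac's g = 2, Schwinger's one-loop term, shows just how "slightly off" the moment is (standard QED, nothing specific to this thread):

```latex
a_e \;=\; \frac{g-2}{2} \;=\; \frac{\alpha}{2\pi} \;+\; O(\alpha^2) \;\approx\; 0.00116
```

The higher-order terms pull the couplings of the rest of the theory into that number, which is the sense in which the correction reaches into a vast array of field theories.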
You talk about safely ignoring the ratio between an electrical force and a gravitational one, but AdS is all about translational invariance in holography. The unification of the fundamental forces implies a conservation.
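That ratio is worth seeing in the raw. A back-of-the-envelope sketch (rounded textbook constants; the separation r cancels because both forces fall off as 1/r²) puts it around 4 × 10^42 for two electrons:

```python
# Ratio of Coulomb to gravitational attraction between two electrons.
k_e = 8.988e9      # Coulomb constant, N m^2 / C^2
G   = 6.674e-11    # gravitational constant, N m^2 / kg^2
e   = 1.602e-19    # elementary charge, C
m_e = 9.109e-31    # electron mass, kg

ratio = (k_e * e**2) / (G * m_e**2)
print(f"F_electric / F_gravity ~ {ratio:.2e}")  # ~ 4.2e+42
```

"Safely ignoring" gravity at particle scales is a statement about that number, not about translational invariance.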
Did nature cause decoherence for the fracturing of unitarity groups within the early universe? Was such a fracturing timeless, or “all at once” across all of time and space? Or is gravity a thermal emergence that occurs and is conserved in DM and DE, hence the fracturing of symmetry must have happened “all at once”? There are models for both, and very real philosophical interpretations of the nature of truth we ought to expect. And yet why would unitarity choose one over the other? The self-referential problem is that these principles seem to imply multiple decompositions we must take on the road to reality, conditional on how we define an action and on which features we find ‘anomalous’ to reality. Parity, charge, and time all seem like very real things, but classical theory does not allow us to use all of them without including a degree of freedom, and yet we will never have all three. This is where Popperian thought collapses: what is the null hypothesis of handedness? Dimensionless features are no problem to project onto some density functions, yet there is no architecture, no experiment, for such a thing. We require the mapping to a degree of freedom (or the implications of spin-1/2, for that matter).
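For context, the standard statement of the discrete symmetries (textbook QFT, not the commenter's own formulation): P, T, and C can each be violated individually, and CP is violated in the weak sector, but any local Lorentz-invariant theory conserves the combination:

```latex
P:\; \mathbf{x} \to -\mathbf{x}, \qquad
T:\; t \to -t, \qquad
C:\; \psi \to \psi^{c}, \qquad
[\,\mathcal{H},\, CPT\,] = 0
```

It is exactly this trade, losing the individual symmetries while keeping the composite, that makes a null hypothesis of handedness so hard to state.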
Lakatos is wrong because the philosophy of science begins on the degenerative. Should we use classical or conditional probability? We should use both. The program is not degenerative just because some aspect of subjectivity may be intractable for causal conditional probabilities; we simply switch to classical probability. Because classical special relativity is timeless, we ought to be neurotic? Well, good thing we are. But the classics can continue if we but assume one hidden layer (of course any more, forget it) and work with what covariance affords: a context.
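A toy sketch of the "use both" stance (illustrative numbers and hypothesis set only, nothing from the thread): the same data read classically as a frequency and conditionally via Bayes' rule:

```python
from fractions import Fraction

# Toy data: 7 heads in 10 flips.
heads, flips = 7, 10

# Classical reading: probability as observed long-run frequency.
p_classical = Fraction(heads, flips)

# Conditional reading: Bayes' rule over two hypotheses,
# a fair coin vs. a coin biased to P(heads) = 0.8.
def likelihood(p, k, n):
    # Binomial likelihood up to the shared binomial coefficient,
    # which cancels when the posterior is normalized.
    return p**k * (1 - p)**(n - k)

prior = {0.5: 0.5, 0.8: 0.5}
unnorm = {h: prior[h] * likelihood(h, heads, flips) for h in prior}
z = sum(unnorm.values())
posterior = {h: unnorm[h] / z for h in unnorm}

print(float(p_classical))                              # 0.7
print({h: round(p, 3) for h, p in posterior.items()})  # {0.5: 0.368, 0.8: 0.632}
```

Neither reading is degenerate; they answer different questions about the same flips.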
Aberration addiction is a problem of philosophy. It is a shortcut of the mind, useful to an extent, a distortion we can recognize, perhaps something we can even offload to some topological defect, or always make excuses for in Platonism where formalism fails. Heck, even intuitionism, where such lazy eliminativism is disallowed, fails.
What is so interesting is that within poetic naturalism, we can recognize the “mob psychology” and its need for change, but also the human limitations in piercing reality. A healthy psychology has the slightly new. Maybe we just need more conditional probability, as Bayesianism can commensurate the scientific method, yet Feyerabend will always be a great insurance policy. Some part of the pathos will keep the ensemble eloquent. Some part of the reduction will assume the collectively exhaustive. Some part of orthonormality will continue to be observable in the Bell test. The use of capital-T Truth and lowercase-t truth does not make model-based realism, or the surrealism, any less interesting. David Deutsch has a good take, where we should work toward what things are “possible”.
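On orthonormality staying observable in the Bell test: a minimal sketch of the standard CHSH algebra (textbook angles for a spin singlet, not anything specific to this thread), where the quantum value exceeds the classical bound of 2:

```python
import math

# CHSH correlator for a spin singlet: E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# Measurement angles that maximize the violation.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ~2.828 = 2*sqrt(2), above the local-realist bound of 2
```

Whatever insurance policy Feyerabend provides, that number keeps getting measured.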
That being said, I welcome what any imagination can bring to the absurdity of unitarity and yet the seeming rigidity of isometries. There is a mystery that will continue in the debatable ambiguity of coverage, or the rationale hiding within the indistinguishable and operationally equivalent. Our evolution is finitistic as the eon is defined, an extension of ourselves so familiar, yet alien. As auxiliary as the multiply realized, metastable claims are not sane; they are ahead in their thinking and behind for our kinesthetic senses. We can assume past the intractable, but we cannot state a reality wholly deterministic or ceramic. We are blessed with opportunity and knowledge, broken as a walk in three dimensions that seeks coverage.