A Research and Editorial project in progress about Interdisciplinary Modeling
Each scientific domain, and even each field, develops its own methods and epistemology. Hence the need for a General Epistemology, ensuring consistency between domains, or even between fields such as quantum mechanics and general relativity within physics, becomes increasingly recognized as further scientific integration is pursued, from cosmology to society through biology and a fast-growing new “Integrated Environmental Modelling”.
Physical world and boundary issues
‘Our’ universe is assumed to have started at the ‘Big Bang’ and then to have gone through several eras after initial inflation: 1) the radiation era, 2) the matter-dominated era and 3) the most recent dark-energy or cosmological era, with its ‘late-time’ accelerated expansion of the universe.
This ‘first’ evolution is increasingly well modeled and tested by a set of cosmological models built on physical foundations and principles, such as homogeneity and isotropy, which themselves rely on assumptions about dimensionality (the numbers and types of dimensions, starting with space and time), numbers of degrees of freedom, geometries and metrics, hence measurements. As an example, angles are often key measurements and parameters in physics, and spherical coordinates are equivalent to Cartesian ones; yet angles embed no spatial length, while conversely the Cartesian xi alone do not carry the concept of angle.
Yet, despite the fast-progressing experimental precision of the ‘cosmological parameters’ of the consensus ‘ΛCDM’ model, major ‘cosmological issues’ remain unsolved at its borders, especially concerning the transitions between dominant energy sources, whether radiation, matter or the current ‘dark energy’, usually linked to the energy of a ‘vacuum’ which therefore appears anything but empty.
Borders, conditions, horizons, ‘closures’ and related transitions, such as the physical-to-biological one, are of critical importance in our work, where our hypothesis [1], applied to cosmological issues and derived cosmological parameters, seems consistent with recent observational surveys [2,3], residual discrepancies [4] and a “sharp acceleration at low redshift” [5].
Biological era
A bang of sorts is also assumed in biology, concerning how life started, whether here or everywhere.
Linking such event(s) to some geometry, through this ‘late expansion’, should not astonish once one observes how deeply biology is rooted in physics and physics in geometry, starting with the geometry of this ‘vacuum’, where the dark energy or cosmological constant effect is hypothesized to be sourced. Furthermore, a linkage of life with the order level of three-dimensional chemical structures has already been pinpointed [6], but the mechanism of a connection with universe geometries is still to be submitted [7].
Toward this goal, recent conclusions about evolution modeling, hence its long-term deterministic aspects, call for some kind of “new general concepts to predict it” [8].
Human, sociolinguistic era
This same dimensionality issue becomes critical to the progress of cheminformatics and of computing, where grammars, starting with Chomsky’s, question the roots of the divergence between formal languages and ‘natural’ ones, i.e. human languages and semantics.
Linking these to the vacuum again, both smooth and quantized, is more acceptable now that the EPR (Einstein-Podolsky-Rosen) effect is experimentally well proven.
Our General Epistemology [9], hence General Predictability, a metamodel of an integral evolutionary universe progressing toward an ever-increasing complexity density, or equivalently toward its future, already appears to fit requirements from a variety of observations ranging from the cosmological to the semantic.
At this point in our research program, the ‘integrative geometry’ also seems to resolve several issues about defining time [10], as well as the “three worlds” issue developed by a line of logicians and physicists ranging from Leibniz to Frege, Bolzano and Popper, lately emphasized by Penrose [11], and somehow related to Peirce [12].
This issue asks how a ‘mental world’, apparently embedded within a ‘spatial world’ or universe, can itself embed a ‘truth world’ which would in turn contain the universe, in an apparently Escherian, illogical circularity.
For a consistent evolution of the universe from cosmological to sociological eras
The time dimension, already relativized by Einstein, is ever more questioned by physicists as well as at the other edge of the complexity spectrum, i.e. in psychology; whether as the “Ultimate Quest” [13] or as a puzzle [14], the need for a more comprehensive, cosmological concept of time was expressed in the respective concluding talks of the Cosmo 11 (J.P. Uzan) and Cosmo 12 (A. Stebbins) conferences.
Meanwhile, the irreversibility of a universal time, as opposed to the time-reversible laws of physics, was emphasized in references [9, 10], among many others, leading us to submit a draft model, now to be more strictly presented and tested, of an equivalence between the future and the line of progress, or efficiency, or knowledge, computable as a maximal ‘FCP’ complexity density. The term ‘FCP’ summarizes, as ‘Fourfold Co-necessity Principle’, the framework from which we now derive the type of extended four-dimensionality needed by this geometry.
This principle, one foremost basis of which is to replace a debatable measure of time by a generalized, oriented, more relativistic and experimental one, is beginning to deliver the seeds of an interdisciplinary cartography from the manners and orders in which the co-necessity expresses itself, if only by observing that a grammar, as one level, needs another grammar to be signified.
Predictive power from interdisciplinary integration
The chart that spans page 4 of Millis’ “Progress in Revolutionary Propulsion Physics” [15] illustrates over a hundred interdisciplinary links and even routes, from a short list of fundamental physics domains upward to “goal-driven (propulsion) visions”. However, there are actually tens of thousands of fields, each full of and crossed by projects, progress and results, and even more evolving visions that dynamically and permanently interact; no integrative predictive power may be reached unless each piece of trajectory exhibits a superior and integrative consistency with sufficient past and potentially past ‘real’ or ‘realizable’ trajectories in the hypergraph.
Other complexity levels and related graphs are questioned about biological evolution, as mentioned above, and related complexities, for instance by Amaral, who observes that “our understanding of biomedical systems has fallen behind our ability to gather new data… [16]”.
He then questions the measure of human biological complexity from an interactome with as many as 650,000 interactions, although a neuronal dynamical hypergraph would then be a better level for addressing the gap from “unveiling the working of single neuron…” to… “provide an understanding of consciousness”. Indeed, it would still be useful to reach the natural-linguistic or semantic power set pinpointed in [24], from the communities that project these onto hierarchies of concepts, and together with them into spaces appropriate to the coarse graining they manipulate.
In his ‘Foundation’ series, Isaac Asimov pictures a remote future in which humanity follows the rule of a centralized galactic empire entering a decadence that a scientist predicts, and strives to make less devastating, by using a new, interdisciplinary science, ‘psychohistory’, from which a better future for humanity can be predicted from appropriate initial conditions plus ongoing, enhanced, general, predictive follow-up. Interestingly, there are two foundations: a first one rooted in physics and more generally the ‘natural sciences’, and a second one, warden of the psycho-historical equations and of the subsequent fate; but this split is finally overcome when a wholly integrated, assuredly more environmentally friendly and holistic approach reveals a winning solution.
How can so powerful an integrative, interdisciplinary predictive capability be reached?
To compute the future from the interactions between anticipations, goals and existing scientific fields and technologies, the step of picturing links such as those mentioned above, or even arrows (the ‘productions’ of grammars or the ‘morphisms’ of category theory), is useful but far from sufficient. Knowledge flows and trajectories, generally speaking, must be modeled through a process allowing and predicting how dedicated models intertwine, interact and encompass one another once projected into appropriate spaces, which themselves depend on them.
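As an illustrative sketch only (the field names and links below are hypothetical, chosen merely to echo the Millis-style chart discussed above), picturing such links as a directed graph already allows enumerating candidate routes between fundamental domains and goal-driven visions, even if, as argued here, that step alone is far from sufficient:

```python
from collections import deque

# Hypothetical toy graph of interdisciplinary links ("arrows"):
# each key points to the fields it feeds into, upward toward visions.
links = {
    "quantum mechanics": ["quantum computing", "chemistry"],
    "general relativity": ["cosmology"],
    "chemistry": ["biology"],
    "biology": ["neuroscience"],
    "cosmology": ["propulsion visions"],
    "quantum computing": ["propulsion visions"],
}

def find_route(start, goal):
    """Breadth-first search for one shortest route between two fields."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        for nxt in links.get(path[-1], []):
            if nxt in seen:
                continue
            if nxt == goal:
                return path + [nxt]
            seen.add(nxt)
            queue.append(path + [nxt])
    return None  # no route in this toy graph
```

Such a search only enumerates static routes; the point made above is precisely that the real hypergraph of fields, projects and visions evolves dynamically, so that the spaces of projection themselves change with the trajectories.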
One difficulty comes from the considerable differences in order levels between the hyper-dense semantic layers, which therefore lie in the future of the under-dense structural, computational, chemical and, moreover, physical objects. These, conversely, require expansive space to take shape and place, and to operate, in order to become the present and repel the past, where the degrees of freedom, hence the number of free paths, are more limited, which is what makes it the past.
Interdisciplinary Integration versus Statistics and ‘Big Data’
Science is about predictions, often drawn and checked from statistics, where typical events occur more often, more repeatedly in a ‘large number’ of trials than under the default assumptions of randomness and homogeneity, i.e. equal probability of occurrence and acceptance. A pattern of ‘YES’ versus ‘NO’ becomes scientific when reproducible from a formal model or language L, and technological when systematically so from a process, effective through external change S, yet delegated (S(L)).
There is a difficulty with the idea that such hazard, chance or change takes place ‘spontaneously’, a word whose meaning and testability are debatable, somehow redundant: first principles from Parmenides, Heraclitus, Democritus, Socrates, Plato and Aristotle appear, after deep scrutiny, interwoven, interdependent and even co-necessary [17] with one another. This was progressively guessed by these pillars of our philosophical and scientific tradition, even if their Logos are seen from diverse angles depending on whether S(L) or L(S) (such as NP and P in computational complexity, cf. below) is prioritized in their co-necessity because of an ill-defined context.
The manner of interaction between space and an extended concept of oriented time or horizon, with change and form, such as language, thereafter characterizes a portion of what we call an integral, hence interdisciplinary, universe, driven by dimensional types and numbers. This is observed at opposite edges of the complexity ladder, from EPR phenomena to natural linguistics, the realm of meaning describable with Turing’s oracles and the more recent ‘provers’.
Oracles, provers and objects
Randomness is hard to define; for instance, key definitions in computational complexity oppose deterministic polynomial time (P) to ‘non-deterministic’ (NP) powers, the non-deterministic class encompassing problems for which solutions are guessed and then checked in polynomial time, as opposed to the deterministic algorithms, for which solutions are calculated in polynomial time.
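The guess-and-check asymmetry can be made concrete with the classic subset-sum problem, a minimal sketch (the function names and instance are illustrative, not part of any cited formalism): verifying a guessed certificate takes time linear in its size, while the deterministic search below runs through exponentially many subsets in the worst case.

```python
from itertools import combinations

def verify_subset_sum(numbers, certificate, target):
    """The NP side: check a guessed certificate in polynomial time."""
    return all(c in numbers for c in certificate) and sum(certificate) == target

def solve_subset_sum(numbers, target):
    """The deterministic side: exhaustive search, exponential in len(numbers)."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None
```

For example, the certificate [3, 7] for target 10 over {3, 5, 7, 11} is checked immediately, whereas the solver must in principle enumerate all 2^4 subsets to find or rule out a solution.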
Throughout the evolution of the domain, this ‘guesser’ has been replaced by a ‘prover’, eventually itself a Turing machine complementary to the first one, somewhat similarly to the unavoidable role of the observer in physics, particularly emphasized by Geroch for cosmological “domains of dependence” [18].
The ‘guessing’ of a non-deterministic language [19] is an extraction of some knowledge from a space which has so many (more) degrees of freedom that they enforce an orientation, which we call the future. Hence one definition of chance as “the measure of our ignorance” [20]; but then the replacement of this guess always comes with the equivalent generalized domain of dependence or acceptance that encompasses it, such as ‘R’ or ‘Ck’, with their power of the continuum, hence the null probability of drawing any given number from them and yet the certainty of finding it there.
The Cosmological example
Today the Galaxy seems to be becoming reachable through the quest for Earth-like environments, while the physical evolution of the cosmos from the Big Bang to our days appears quite well modeled by the consensual “ΛCDM” model, whose boundaries are, however, now at stake.
This model uses a limited set of cosmological parameters, but still leaves critical issues unanswered, whether about the Λ cosmological constant (or its ‘Dark Energy’ effect) and CDM (‘Cold Dark Matter’), the main components of the physical universe, or about its leaps from the physical to the biological complexity levels and from life to intelligence, particularly the so far human, semantic, ‘psycho-historical’ intelligence that we may also call knowledge.
The power of the sciences comes from their tested predictability, restricted to a limited environment yet effective enough to anticipate natural or technological phenomena: a research field, progressing toward its assumed and targeted research object, succeeds when it predicts what becomes its effective object.
A powerful scientific model may predict a diversified range of behaviors, depending on the environmental constraints on the object of research once it has become experimental or even applied. However, its predictive power is limited to the reproduction of an existing, tested form or formalism or, equivalently, to the production of an effect as described and calculated from its model. In other words, science predicts the future inasmuch as it resembles the past, not the future inasmuch as it introduces radical novelty, breaking from the past.
The proposed model results in a three-dimensional trigger reached at some point, which may be interpreted spatially, according to which the dark energy density ratio is limited to 0.74048 (from the Hales-Kepler theorem), hence the matter density ratio to roughly 0.26, both kept at these levels since z in the circa 0.2 to 0.3 range, this precise za then to be checked.
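The 0.74048 figure quoted above is the densest packing ratio of equal spheres in three dimensions, π/√18, conjectured by Kepler and proved by Hales; the following minimal check recovers it, with the complementary matter ratio shown as an illustration of the model's "roughly 0.26" (the identification of these geometric ratios with cosmological density parameters is the hypothesis of the present work, not an established result):

```python
import math

# Densest packing of equal spheres in 3D (Kepler conjecture, proved by Hales):
kepler_density = math.pi / math.sqrt(18)   # = pi / (3 * sqrt(2)) ≈ 0.74048

# Complement, read here as the hypothesized matter density ratio:
matter_ratio = 1.0 - kepler_density        # ≈ 0.26

print(f"dark energy density ratio ≤ {kepler_density:.5f}")
print(f"complementary matter density ratio ≈ {matter_ratio:.2f}")
```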
Saturated complexity density
Several principles and meta-models derive from, or at least appear consistent with, the FCP, such as the Heisenberg uncertainty principle and the principles of thermodynamics, including the second law, according to which entropy grows within a closed system, and a complementary principle according to which systems’ complexities grow relative to one another toward the future, yielding a model of the future in which a system becomes saturated across its order levels.
At one spatial edge of the universe, an opposition endures between ‘cosmic voids’ and their over-dense dark matter boundaries [21], where, conversely, most galaxies concentrate.
Recent, considerable sets of observations have resulted in ever sharper constraints upon a deterministic model of the evolution of the universe, which appears to govern it quite precisely; yet there remains a discrepancy between the ‘inferred’ and observed [22] Hubble constant H0, suggesting that something is missing about some of the universe’s boundaries… possibly ‘ours’.
This is where a new low-redshift, hence recent, constraint and dimensionality is proposed [23].
How to predict a future that does not replicate the past?
Knowledge has flowed from researchers and inventors into spaces of available resources, from where it has always revealed the most favorable paths to a future thereafter always ‘new’. This happened through an apparently non-refuted pattern, whether the hypothesized paths were refuted, hence discarded, or on the contrary confirmed, possibly extended to new limits. It also applied whether the origin was to be found primarily in past experiments, along an empiricist perspective, or rather in innate categories used toward hypothesizing, or through an intricate Kantian mechanism.
A model of scientific and technical discovery and invention therefore does not seem able to easily dispense with researchers and innovators; let us regroup them as Authors, whether they follow formal L(S) modeling routes toward potential experiments ‘à la Maxwell’ or inventive, perhaps more pragmatic trials and the resulting data gathering ‘à la Faraday’, from which appropriate models could arise.
These authors, Geroch’s observers, are questioned by Penrose, who looks for their objective equivalent, for instance in some ‘OR’, Objective Reduction, to account for the quantum reduction exhibited everywhere in physics. About this he presents the interpretation that any action upon an entangled particle A by a particle B triggers paths going backward toward their common past, to the place, date or rather formalism where they became entangled, in order to impact B.
But then shouldn’t we conclude that this event, to which an action on particle A, or on particle B, will eventually carry some influence, actually lies in the future of these isolated particles, even though it also lies in the past of its authors’ knowledge about it?
Our model of the future naturally encompasses both types and levels of phenomena; that is to say, a common conclusion can be drawn about when and how much a phenomenon is located in the relative future or past of another one, with most phenomena positioned in our future because that is where the diversity of paths and degrees of freedom is so incredibly richer that it makes it, mathematically, the future. As mentioned above, it is from there and then that authors convincingly and regularly select and project them, through one of the types of routes that ‘authorize’ them, just because the space of items and structures going to fill the free, intermediate complexity levels between humans and less complex beings is immense, requiring what appears to us as a fast-expanding knowledge space with appropriate, for example electronic, room.
Likewise, at other dimensional combinations, the space of items and structures, up to the spatial ‘Large Scale Structure’, filling the physical universe has become ever better filled, especially within over-dense regions, which, however, only become so relative to their under-dense counterparts.
All these examples help us better see, and from there model, the relative positioning of horizons, whether event or cosmological ones, or otherwise oracle or authoring ones, whether seen from one side as scattered microstates or, from the other, as a simple four-parameter set.
To conclude, a model of the future may not be drawn from statistics, however useful they may be to uncover patterns hidden in data, and ever more powerful ones, yet limited to the power and boundaries of the research project projected onto them.
These will be useful steps in nearby futures, but may not trace the trajectories, especially interdisciplinary ones, such as those powerful enough to embed downward into technologies, economic and business models, whose boundaries are semantic, particularly the mathematical part of the semantic continuum, from its power set [24].
This explains why research and inventiveness have succeeded in their optimal environments, those where the degrees of freedom allow enough appropriate trajectories and the free evolution and selection of winning research projections and communities.
From the Research program to current editorial and educational projects
The road to a model of the future, one which seeks how models of the future will succeed and embed into one another, is long to wander. It goes through both L(S) and S(L) routings and always involves some experimental, hence somehow spatial, resource consumption.
The first one is the series of papers and the editorial project already initiated, to be presented in the next pages and publications, from which an editorial extension is planned.
The second one has already resulted in some early prototypes, such as ex.revuer.org, soon to be replaced by a new set of dynamical scientific mapping prototypes and, once sufficient resources are gathered, by an envisioned interdisciplinary interactive mapping, primarily made for and through research communities, goals, networks, institutes and agencies.
From there it would become very useful for all kinds of stakeholders in need of better, more powerful, hence more precise and therefore interdisciplinary scientific-to-technological, industrial and psycho-socio-economical mappings.
A selection of published and communicated papers, books and patents is listed on the next page, and some unpublished papers are mentioned.
[1] Research paper to be submitted
[2] G. Hinshaw et al., Nine-Year WMAP Observations: Cosmological Parameter Results, arXiv:1212.5226, 2013
[3] Planck Collaboration, Planck 2015 results. XIV. Dark energy and modified gravity, arXiv:1502.01590, 2015
[4] E. Aubourg et al., Cosmological implications of baryon acoustic oscillation (BAO) measurements, arXiv:1411.1074v3
[5] J. L. Bernal, L. Verde, A. G. Riess, The trouble with H0, arXiv:1607.05617 [astro-ph.CO]
[6] J. Monod, Chance and Necessity, Seuil, 1970
[7] To be submitted as part of the editorial project about “Integral Universe”
[8] V. Orgogozo, Imagine another life evolution, CNRS quarterly journal, #284, Spring 2016
[9] A. Aspect, J. Dalibard & G. Roger, Experimental test of Bell’s inequalities using time-varying analyzers, Physical Review Letters 49, n°25, 1982
[10] S. Carroll, From Eternity to Here: The Quest for the Ultimate Theory of Time, Dutton, 2010
[11] R. Penrose, The Road to Reality, A.E. Knopf ed. NY, 2004
[12] C. Thiercelin, Peirce et le pragmatisme, Puf, 1993
[13] S. Carroll, Infra
[14] R. Penrose, infra
[15] M.G. Millis, IAC-10-C4.8.7 https://arxiv.org/abs/1101.1063, 2010
[16] L. A. Nunes Amaral, A truer measure of our ignorance, PNAS, vol. 105, n°19, 2008
[17] http://dx.doi.org/10.1063/1.2737004
[18] R Geroch & G. Horowitz, Global Structure of Space-time in General Relativity, An Einstein Centenary Survey, edited by S. Hawking and W. Israel, Cambridge University Press, 1979
[19] S. Goldwasser, S. Micali, C. Rackoff, The Knowledge Complexity of Interactive proof-Systems, ACM 1985
[20] Diverse references to begin with Poincaré’s “Last thoughts”, 1913
[21] R. Wojtak, D. Powell, T. Abel, Voids in cosmological simulations over cosmic time, arXiv:1602.08541, 2016
[22] A. Riess et al., A 2.4% Determination of the Local Value of the Hubble Constant, arXiv:1604.01424, 2016
[23] To be submitted, infra and list of papers
[24] J. Hopcroft & J. Ullman, Formal Languages and their Relation to Automata, Addison-Wesley, 1969
