Dimensional Consequences 1

This begins a new series of blog posts about a Comprehensive Universe that, logically, includes a Cosmological Universe as one of its components. It follows the recall of the “three worlds” issue in ‘On the REVUER project, toward a General Predictability’, and answers suggestions that this series be written in simple, plain English, without maths.

As a first goal we address remarks on the readability of some papers, such as [1 – 4] or the even more recent potential-impact-of-fcp-to-fundamental-physics-and-cosmology, now to be updated.

This new blog series will therefore return to the incriminated sentences – whether their obscurity came from scarce room, fuzzy neurons or both – and hopefully deliver acceptable reformulations.

This update follows a Cosmological Introduction citing recent observational results that lend some credibility to our past predictions.

Some comments have since questioned the focus on the TT rather than the TE and EE modes of the referenced October ACTPol publication, but the choice was consistent with known issues regarding the polarized modes and with the sharper profile of TT, and above all it fits well with another, conclusive observational summary [5].

In this page the focus is on the ‘A’ dimension of the Fourfold Co-necessity Principle (FCP), from which derives [3] what looks like a 3D space, or four-dimensional space-time.

‘A’ was introduced as an extended time dimension, a dimension of the Future, or of Consequence, which the reversible t of the laws of physics does not carry. The explanation is that such a t depends on radiation phenomena – hence on a constant speed of light – whereas recording an irreversible time requires an ‘A’, i.e. an observer’s end point, with enough differentiating power relative to (in this case) light.

More detailed and formal proofs are given in dedicated chapters of the Book series introduced in our next blogs.

The assumption underlying the Robertson-Walker metric side of FLRW cosmology is that of a homogeneous and isotropic universe – a pattern, or rather absence of pattern, that the CMB experimentally confirms through instruments such as WMAP, Planck, etc. This may be taken as one edge of an evolution toward large-to-small-scale structures exhibiting utmost local heterogeneity, from voids to filaments, clusters, galaxies, arms, stellar systems, habitability, life and intelligence – hence all kinds of end domains, or observers.

Now, one classical way to frame the Future is to oppose a set of events whose probability – if any – of happening is close to one, to a set whose probability is close to zero, hence effectively null, as in the classical case of the cup broken into many tiny pieces. Defining the Future this way equivalently describes Observer power, hence a surface of observation.

A universe with a great number of degrees of freedom, compared to its number of constraints, has many ways to be ‘satisfiable’, hence potentially has a future, or is in the future, and is then consistent with the current universe – as opposed to a universe with too many constraints, from local to global, as introduced in Cosmological Coincidence (abstract).

The past – hence the present, hence the past of the future – therefore accumulates the most exact next satisfiable, hence consistent, ‘real’, spatialized, or FCP organization (meaning that its fourfold co-necessity has become explicit) of large-to-small-to-quantum structure. It integrates the available dimensions – ranging, for the spatial ones, from epsilon to Planck to cosmological-universe-radius lengths – and therefore questions the most critical, even central, “observer at infinity” of Lorentzian, Euclidean (as pointed out by Hawking) and Riemannian (as concluded by Penrose) geometries.

Other chapters will however be devoted to highlighting how the scheme seems to apply equally well to other domains, from biology to computability, linguistics and economics. This would assume what may be seen as a ‘wave function’ extended to a broader geometry than the one already extended to the gravitational type of metrics.

For geometry comes with the introduction of metrics, assumed to allow measurements along a dimension called space, and yet it uses a non-trivial formal angular dimension, as well as change and finally symbolic synthesis. If one assumes geometry to be fundamental – for instance for a universe starting from nothing, i.e. with no content – then it is the intrinsic, non-reducible gap between the Euclidean and Riemannian universes that matters, with the “point at infinity” and the closing power embedded in the latter.

This is where we find the observing power deemed necessary by Geroch, who states that any ‘domain of dependence’ requires some “agent who gathers the initial data and actually makes a claim about the future”, as well as by Dowker and Kent [8] and by Gell-Mann and Hartle [9] with their “Information Gathering and Utilizing Systems” (IGUS) – seen as other ways of implying this ‘A’ side of the FCP in Cosmology.

One such power level is, for instance, the ‘Objective Reduction’ (‘OR’) decoherence that Penrose conjectures and links to gravitation itself. Such a wave-function collapse is doubted by Hawking – as opposed to gravitational collapse – in the same reference, but we will argue for the universality of the actualization, the ‘S’ collapse, of which each of these is a very particular example and instance, linked to very specific, even though historical, Comprehensive Universe configurations, as concluded at the end of this blog page.

And to such necessity belong, again, Hawking’s ‘observers at infinity’ for the Riemannian, compact sub-universes of Euclidean 4D glued to Lorentzian future(s), as quoted in [2].

There is much to derive from his conjecture, particularly as summarized in [6, 7], the latter reference displaying synthetic and picturesque content. His Wick rotation may look like a timed event but may also, like the historical Higgs, be considered a field, especially by concluding that an event may equivalently be considered as a point on a path from some ‘real’ complexity layer to an upper real complexity that is mathematically its consequence – hence making the Future by ‘gathering’ a very great number of potential paths, many of them a priori free.

This might be compared to Everett’s proposal, but it does not require many worlds, which exist only in the Future – of which the Lorentzian (1,3), with its 3D space, opens only a limited potentiality, since it is still restricted to the gravitational phenomenon.

The FCP configuration needed to open future(s) with life and even more complexity is not reducible to this gravitational wave function, as disruptive to the quantum ones as it may be: closing, observing (hence acting) fields are still necessary – and indeed seem part of Hawking’s scheme – but they imply, hence come with, or even define, such power levels that we call… life.

These derivations however need dedicated pages, which are part of the task for which we describe our Research program in the next pages, as part of the “Comprehensive Universe” project.

In a few sentences, however, it operates as follows:

  • within a defined typology of dimensions, the Euclidean appears as the frontier between the Lorentzian and Riemannian universes, about which Penrose already opposed, when speaking of the Hartle-Hawking scheme, the Euclidean rotation group O(4) as “compact, so of finite volume whereas the relativistic Lorentz group O(3,1) is non-compact and of infinite volume”.
  • such a split a priori allows the future to be found on the Lorentzian ‘side’, or conversely the Euclidean to be its past, or at least its present, hence ever flat – assuming observing power, i.e. convergence points, can be found in this future, from which, conversely, past objects and more generally the satisfiable overall structure are considered.
  • this is indeed what the cosmology community’s quest somehow assumes and concludes in its growing integration of models about all kinds of universalities – large and small, earlier and later, from quantum to cosmological levels – all having to be ‘really’ (or FCP), i.e. extensively rather than solely formally, consistent.
  • meanwhile, there is a limit to a consensus seeking boundaries solely from the integration of contents that are themselves assumed to be born from boundary conditions: it implies that the evolution of the boundaries, i.e. of the universe, may be modeled without recourse to the contents, even though their survey is useful for checking.

This is the approach retained in the papers quoted above, according to which the universe has to be, and remain, flat for intrinsic, extended-geometry reasons, thereby answering, or at least relativizing, questions such as some of the known fine-tuning issues.

As a summary, in the wake of approaches focusing on the geometrical side rather than on the energy content of the universe (again deemed to derive from the former), it appears logical that fields such as the Higgs, matter and energetic fields, but also life and other fields exhibiting local and eventually non-local power to emulate one another – to start with the entanglement [10] and decoherence cases – result, in a most precise and predictable manner, from the requirement of increasing complexity that is part of the definition of the Future: hence along the direction, which we have come to describe as FCP, of the fastest overall efficient actualization compatible with the available space and structure.

Conclusion

While gravitation as a geometry – and, derived from it, matter as a realized category, the Higgs field and process once satisfiable with previously aggregated and clusterized complexity – has the power to entangle and, conversely, pastward, to disentangle and decohere, similarly life and further actualized real complexity levels (FCP, that is to say consistent along all dimensions) exhibit a specific capacity to emulate matter and, conversely, to decohere into it.

The emergence of each next complexity level seems compulsory from the mere incommensurability of any immediate potential future to the previous state of the universe, to begin with a cosmological universe.

The evolving geometry that releases the related additional degrees of freedom – immediately closed through the ‘A’ dimensions by the overall consistency of the new and past constraints – seems at stake, but it is the immediate consequence of its (A, L, S) characteristics, which seem to fit the expectations often projected onto a quantum vacuum that is all but empty.

And again, from there, the saturation limit of the consistency of the spherical category – or of what is interpreted as such – triggers the transition to a more complex observing capacity of the universe, which is what we label life.

These features are sketched in [11] and result in [2], and in what is usually attributed to Dark Energy from the quantum vacuum: as an obvious Planck-length-cell subspace of the Comprehensive Universe, it is the same from any point of the Cosmological one and is therefore often hastily interpreted as ubiquitous, as summarized in [13].

Let us add that the critical role of the Riemann sphere in quantum mechanics is fairly well emphasized in diverse writings of Penrose, particularly summarized in [12], and its role at the cosmological, gravitational level by Hawking, as quoted above; but it was useful to embed them in the most diverse ‘spatialization’, and then saturation, of the category, upon which the next (FCP) complexity levels may therefore start to clusterize.

References

[1] http://dx.doi.org/10.1063/1.2737004

[2] http://dx.doi.org/10.1063/1.2947668

[3] http://dx.doi.org/10.1063/1.4728011

[4] P. Journeau, Emergence of Dimensions in Cosmology,  New Advances in Physics, vol.4, 2010

[5] A.G. Riess et al., A 2.4% Determination of the Local Value of the Hubble Constant, arXiv:1604.01424 [astro-ph.CO], 5 Apr 2016

[6] J. Hartle & S. Hawking, Wave function of the Universe, Phys. Rev. D 28, 2960, 1983

[7] S. Hawking & R. Penrose, The Nature of Space and Time, Princeton University Press, 1996, Ed. Gallimard for French Translation by F. Balibar, Presentation M. Lachièze-Rey.

[8] F. Dowker & A. Kent, On the Consistent Histories Approach to Quantum Mechanics, arXiv:gr-qc/9412067v2 25 Jan 1996

[9] M. Gell-Mann & J. Hartle, Equivalent Sets of Histories and Multiple Quasiclassical Domains, arXiv:gr-qc/9404013, 1994

[10] A. Aspect, J. Dalibard & G. Roger, Experimental Test of Bell’s Inequalities Using Time-Varying Analyzers, Phys. Rev. Lett. 49, 1804, 1982

[11] P. Journeau, potential-impact-of-fcp-to-fundamental-physics-and-cosmology, 2016

[12] R. Penrose, The Road to Reality, A.E. Knopf ed., New York, 2004

[13] Cosmological Constant issue (abstract), later in this blog series

Science Culture

Is Culture a condition for Science? Or even: which culture for which science? Astronomy is known to have developed well starting thousands of years BC, in most ancient cultures and in regions as diverse as the Middle and Far East, Western Europe and the first American civilizations. Moreover, since the 20th century, prominent scientists are increasingly born in every part of the world.

And yet Kuhn [1] pinpoints the fact that “The bulk of scientific knowledge is a product of Europe in the last four centuries”, even though most observers would concur on earlier foundations, probably to be traced at least to its first universities, and to the specific and exceptional contribution of ancient Greek civilization to the inception of science as such, roughly from the 6th century BC. Kant’s comments about Plato and Aristotle, as an example, leave no doubt about this.

One may highlight cultures more geared toward fundamental science – such as the Greeks’ – as opposed to engineers’ cultures – like the Romans’; others optimally integrating both, so far mostly the Western world, yet now going global; versus others good neither at theorizing nor at experimenting.

Could a scientific world therefore never have happened – for instance in possible extra-terrestrial civilizations, which would then remain at a middle-age level of advancement? Are there cultural dimensions, possibly such as those envisioned by Koyré, required for Science to make its leap and to permeate society, thereafter becoming the integrated fundamental-to-industrial intertwining that we currently know?

These are among the questions planned to be scrutinized, further modeled, represented and tested in a dedicated series of works, including works about reconstructing and visualizing the historical development of science and knowledge, and more generally culture, across the ages – centuries behind and beyond.

One line of action to achieve this, involving as wide a diversity of cultures as possible, is Cinema – hence some linked projects, as sketched on www.staroad.pictures and in some of our blogs about cosmology as well as about this long feature-film series.

As an example, it was once suggested [2] that France was first in Mathematics in Europe, Germany in Chemistry and possibly applied Physics, and the United Kingdom in Life sciences, even though it seemed that, from the first part of the 20th century to the second, global scientific leadership – among others – had moved from Europe to the US, where it has, so far at least, been much more efficiently shared and integrated with entrepreneurial spirit and financial resources.

Another major question pertains to former so-called ‘third world’ countries, of which several are now most advanced at least in engineering and applied science, especially in Asia, and gave birth to prominent fundamental scientists. It is not yet known to what extent the comprehensive, most integrated cycle socializing and intertwining them with entrepreneurship, capital and markets, is or will be reached, especially when new and possibly more frequent disruptions should be expected, possibly requiring new levels of cultural, societal, political and organizational flexibility and, should we say, adaptation.

In short, the SCIENCE CULTURE question could be set as follows: will the current context – governments and foundations funding often international, quite open fundamental research, with, at the other edge, proprietary applied research as integrated knowledge – evolve toward more open and shared knowledge, though possibly less rewarding to researchers and to further knowledge, or toward less openness? And if pieces of knowledge, such as scientific journals, remain monetized one way or another, how will this ultimately benefit the overall increase of knowledge?

There are diverse models for measuring knowledge, some of which concur on its impact in making society – to begin with its economy – more efficient, hence cost-efficient, which yields a measure of knowledge so applied.

Anticipating this, and measuring the value of potential or applicable knowledge – hence bringing new ways to assess, compare and select priorities and relative values along defined horizons – is of utmost interest for research inceptors, teams, conveyors, funders and repositories, particularly depositaries of portfolios of projects. Such portfolios have value only relative to others and to experimental, and even applied, knowledge in potential new uses – hence relative to, say, industrial economics or, more broadly, societal improvement.

It is also of genuine value for cultures that are most active in fundamental and experimental, published science, yet not always, not enough, or not far enough downstream, connected with potential markets and dissemination.

This is part of the experimental project for REVUER, as summarized on recent and next blogs and on www.revuer.org.

[1] T.S. Kuhn, The Structure of Scientific Revolutions, International Encyclopedia of Unified Science, The University of Chicago Press, 1962, 1970

[2] References to be recovered (stats about European publications and prizes)

On the REVUER project, toward a ‘General Predictability’

An overview of a visual concept in one click

It is easy to glance at some of REVUER’s very visual basic concepts by going to www.revuer.org, tab EXPERIMENT, or more directly to http://ex.revuer.org.

Input a research field by entering a search string in the Search box at the top of the “Last Updated Clusters” list – preferably one of the clusters already being filled, such as ‘COSMO’, ‘ALGAE’, ‘RUBBER’, ‘LASER’, ‘RAY’, ‘OBFUSCATION’… – to get a first MAP of the field along the 2 DIMENSIONS, X and Y, prompted.

The graph has controls for selecting the X-axis and Y-axis dimensions, as well as a choice of linear (Lin) and Log scales: feel free to click on any of these buttons to change the 2D-VISU.

The column on the right of this service platform lists “State of the Art” research experiments as well as, with proper rights, projected results for any ongoing experiment.

Clicking any of them gives the experiment name, date, description, experiment status and authors (who can be anonymous).

Various constraints in the research projects or results can be found by clicking the “Categories” button.

The info button is intended to hold a brief description of what the research cluster is about: this depends on whether the researchers in charge of the field ‘cluster’ have felt it necessary to describe it or not.

Ex.revuer.org is planned to be replaced by the new n/REVUER, which will use similar basic concepts but a renewed approach and ease of use, with the primary goal of solving the following issue: while a lot of semantics can be consumed about research projects and results, their holders (the researchers), teams and partners, funders and potential users want to know – preferably in seconds or minutes of VIEWS rather than days or months of READING – what quantitative difference these make to what they currently have, or know, or guess, or expect, or fear, or plan.

Short term n/REVUER project goal

REVUER has over the years experimented with diverse novel ways and contexts to solve, or at least alleviate, a wide series of issues in Research, to begin with the ‘Researcher’s burden’ under which all suffer: working months or even years on new research projects and papers submitted for consideration, funding or publication, only to see it all rejected [0].

This frustration is matched on the side of all those seeking to understand and assess what research produces, where it is heading, and whether it may impact their activities positively or negatively.

The experimentation, easily glanced at through the www.revuer.org visual Proof of Concept, has tested or revealed several facts about the behavior, and the types of representation, of what we have come to call “the dynamics of research fields”.

A specific goal of n/REVUER is to enable researchers, project holders or teams to exhibit these dynamics in a most secure, and a priori anonymous, manner, through visualization concepts of roughly similar power but remodeled, and easier to manipulate and present.

As explained below, REVUER will carry this as an international, mixed non- and for-profit consortium, with the primordial concern that cultures and processes differ from one country to another, as do data management regulations; it will therefore be applied on a multi-regional or even national basis, even though, when so parameterized by researchers, the related data, at least anonymized, should circulate globally.

Our goal, and the experimental protocol it constrains, drive the consortium to offer this new power first to researchers themselves, and also to enable research funders and organizations, as we anticipate it, to leverage funds and secure investments much more easily.

n/REVUER through a REVUER Consortium, toward d/REVUER and i/REVUER

So, n/REVUER will dramatically facilitate the sharing of research progress, first for the benefit of researchers; but the dynamics of research fields is then meant to be carried by a much more powerful system dubbed d/REVUER, still (multi)disciplinary but primarily dynamical, i.e. exhibiting the trends, and even part of the trajectories, of and in the field – that is to say, where its object, or one of its intermediate objects (for instance the next technological conclusion(s)), should land; in other words, with which numerical and qualitative characteristics.

However, our ultimate, grand project is to get to an interdisciplinary, integrative representation of the dynamics of research fields, again with and for the researchers most involved in them. It is called i/REVUER, and our first attempt will be to test a sample of focused ‘interdisciplinary (predictive) models’ – that is, models of interdisciplinary predictability – for a list of issues in Energy, Climate, Environment, Cosmology, Materials, Chemistry and some in the social sciences as well.

These will come from applying a more general model, from which we anticipate discovering, or rather exhibiting, an operational “General Predictability” logically deriving from interdisciplinary predictability.

It is summarized below, and we consider n/REVUER and d/REVUER, useful to researchers by themselves, as steps toward the compulsory purpose of i/REVUER.

Long term overall Goal

This part relates to our next blog about “Cosmological Modeling”, and particularly to its reference to Popper’s “Three Worlds” issue, as summarized by Clavier [1], about the consistent set of epistemologies of Leibniz, Bolzano and Frege, grounded in the Greek philosophers’ quest and extendable to recent reflections of Penrose, from which the figure below was adapted.

[Figure: the three worlds – True, Physical and Mental – adapted from Penrose]

About time, and together with Smolin [0], he emphasizes the time reversibility of the laws of physics, which fits with their truth being independent of time and with their predictive power.

This is pictured here with an example of a (True) model, represented by a small circle, about a (Physical) set of observations, itself presumed an object for a (Mental) observer.

The set of theories and constraints and the set of experimental data here seem to belong to distinct worlds, each of which can be expected to be circumscribed so as to result in a finally quite faithful “consensus” model, resilient to a most diverse series of occurrences for the mental – yet embodied – observer, hence systematically recognizing, as well as predicting, the physical ‘reality’ met or produced.

Through the FCP framework of our other blogs, and its derived geometrical model, the distinction is resolved and it becomes clearer to speak of truth than of an increasingly evanescent reality.

The trajectories to these objective and objectal conclusions – such as optimal future technologies or models of nature’s objects – may seem easily predictable from the integration of past trends, as pursued by data or even Big Data science(s).

Unfortunately, Kuhn conversely demonstrates systematic “scientific revolutions” coming to disrupt well-established and apparently consensual models that have become ‘dominant paradigms’, so that we cannot rely on the simple convergence of field-level data or metadata to conclude on the point(s) where a research field will become, even provisionally, final.

It is reasonable to conjecture that the set of models covers truth, or that its consistent subset is included in truth and tends toward a General Predictability. This will need a more detailed explanation in the wake of Etchemendy’s “proper understanding of model-theoretic semantics and its relation to the pre-theoretic notions of logical consequence and logical truth” [2]; cf. next blog.

Premises

Researchers need to promote and nevertheless to secure their projects, at least until publication, which is expected to provide at least recognition, if only following the ‘publish or perish’ survival pattern.

This need, and the conflict it creates, is not specific to Research, but it is direr there, because research has less chance of being funded by direct users, considering the uncertainty of outcomes; conversely, it may discover, through fields assumed to fit predictive models, brand new objects with new predictive behavior – which is what distinguishes scientific research from, as an example, poetry, fashion or many other products and services, nonetheless highly respectable.

 

Organizational and technological features

A key aspect of the REVUER project, consortium and experimental protocol pertains to organizational theory, deemed a critical component, to be progressively integrated in a manner allowing a diversity of experiments – including market, use and relational processes – which may prove effective here and not there, especially as success depends on cultural factors and on diverse features of research organizations.

The great openness or range of parameters open to users, to begin with researchers, should maximize the chances of success.

Open versus not open data, sources, etc.

The diversity of remarks, suggestions or recommendations about how to handle this project over the last years has been amazing, and yet it compels us to answer with the following quote [2]:

“This paper is not meant to be a reply to my critics. Such a reply would be of very little interest to any one reader, inasmuch as the critics themselves disagree so sharply on fundamental points, and so the lines of criticism are often at odds with one another.”

Some of them have ranged from the imperative that REVUER should be open source and/or open data from scratch – some discussions pointing to the intricate fate of sources, themselves ranging from algorithms to oracles, and of data, eventually similar – to those intimating that these be concealed in the most secure, if not definitive, manner.

The answer comes down to the logical consequence of the heterogeneous organizational plan, even though provisional choices and limitations are made now for practicality, with widened options coming later on.

 

Intellectual Property Protection

So, as opposed to the indeed always desirable openness of access, data, sources, etc., the fact is that Research needs to be compensated, and that this comes from Intellectual Property preservation and even optimization – therefore part of REVUER’s commitments.

Testing Reproducibility, Field Trajectories and General Predictability

How will research fields become objects of science themselves? Kuhn may again be considered the father of such an idea, to which REVUER plans to bring the related experimental apparatus, from the patterns of paradigms, crises, disruptions and trends that he discovered – although their quantitative and predictive modeling, which is our goal through REVUER, could not be derived from his works. The reproduced patterns and consequences are precisely those for which REVUER experiment data will be gathered.

On Data science and Big Data

These are recent concepts with at least commercial success in spite of uncertain definitions – which therefore matter less than the commercial use. Whether this has been a concept without an object, or an object without a concept, remains to be concluded.

The REVUER view is quite the reverse – consistently with the scheme above – that is to say, it observes that data exist primarily from and for those researchers projecting into, and back from, experiment what they expect to find, indeed with quite some success. This is where reality finally emerges, under the sole condition of the consistency and resilience of the research field: that is to say, the extent of reproducibility and common truth (and their semantics) throughout a sufficiently extended space-time field (with its own definitions of space and time), and ultimately through necessary, unlimited interdisciplinary sharing.

These research fields, as common spaces of researchers’ projections, have proven quite successful considering what they have delivered – otherwise the reader of these lines would rather be hunting, fishing or harvesting…

Financing

This will be one of our next blogs about REVUER.

[0] The case of Bose’s papers, finally published with Einstein’s support, is only emblematic of a generality inherent to a pattern modeled by Kuhn and more recently exemplified by Lee Smolin in “The Trouble with Physics” about String theory (which sometimes finds itself in the position of the dominant paradigm, or conversely).

[1] P. Clavier about ‘interactions of the three worlds’ in « Le Concept de Monde », PUF, 2000

[2] J. Etchemendy, “Reflections on Consequence”, in New Essays on Tarski and Philosophy, Douglas Patterson, ed., New York and Oxford: Oxford University Press, 2008, 263-299

Communicate across the universe

“The ‘Pooncho’ was served in distinctive earthenware mugs, glazed dark green.
Agent Shant watched as the two raised the mugs and tested the contents.
‘Well then, what is your verdict?’
Wingo coughed and cleared his throat:
‘This is a drink of several dimensions. It should not be judged in haste.’”
             Jack Vance, Lurulu, p. 260, Tom Doherty ed. 2004

Communicating instantaneously at a distance

As expeditions are envisioned to start wandering far across the solar system and beyond, to the stars, the issue of the most efficient communication process is soon going to be critical for decision makers: it is much more tolerable to be bounded by light speed for matter and radiation – at least until “beam me up” becomes a reality – if it is possible to communicate instantaneously, no matter the distance.

We have all read and dreamed about humanity spreading throughout the universe, presumably starting with the solar system (pending), next reaching the closest habitable planets – cf. Heinlein’s “Farmer in the Sky” and “Time for the Stars” – after the fast-progressing discovery of exoplanets. A next phase would see a second wave, as described by Asimov in the “Foundation” series, by Vance with his ‘Gaean Reach’, and by other authors.

Back to instant communication: leaving aside for now the telepathic option of “Time for the Stars”, the scientific issue has lately been shaken by the experimental results of what is usually summarized as “EPR experiments”, as initiated by Aspect from the late 1970s.

These experiments and their successors have established the “non-local” character of quantum entanglement. Aspect himself concluded that we should get rid of “local realism”, yet not of realism as such [1], the characterization of which has, however, been the goal of many [2].

Some conclusions drawn from these experiments range from “Bohr vs. Einstein, Bohr wins”, or sometimes “Bohr 1, Einstein 0”, to “faster than light”. Penrose answers, in his compendium on reality [2], by showing that entanglement actually does not carry information – in that case instantaneously; this apparently suggests that instantaneous communication is impossible.

Penrose being right – either by definition or as a theoretical, yet refutable, statement – the case could be settled by concluding, from the impossibility of any instantaneous transfer of information, that the hope of any efficient communication must be abandoned.

Unless we find a practical way to distinguish the respective operations of communication and of transfer of information.

The theory

So we wish to be able to communicate faster than light, in fact instantaneously, yet under the constraint that we cannot transfer information faster than light.

The solution is that information need not travel because it is already there: in other words, spaceships and remotely inhabited planets will share a common in-formation – that is to say, material commonly informed at the time of entanglement, then spread among them – out of which effective communication will appropriately be disentangled to be shared.

Now this EPR effect, an experimentally well-proven fact, does indeed not allow anything faster than light, but rather derives from a universe – such as implied and describable by our Fourfold Co-necessity Principle (FCP) universe, or reality – in which spatiality itself is relative, with a spatial dimension going from 0 to 1 for a one-Planck-cell-dimension universe, or to a finite quantity such as the radius of the universe, or to infinity for a universe, or some of its spatial dimensions, being infinite, depending on the phenomenologies involved.

For it is obvious that for a phenomenon, or rather a phenomenology – taken as a real class, as made precise in the next blogs – whose spatial dimensionality is one Planck cell, any communication with the other edge of the corresponding universe is instantaneous and yet not faster than light.

Features of such universes, or edges of our universe, are already proposed by cosmologists as wormholes; the FCP merely implies their universality.

As a conclusion, the issue only arises from the common perspective according to which ‘the’ universe should primarily, intrinsically and solely be four-dimensional (1, 3), with three spatial dimensions.

The related device may already have been tested, although we are not aware of it; otherwise it could quite easily be realized and tested when needed.

[1] Private discussion

[2] Starting with Plato’s ideal realism, which was clearly distinct from Aristotle’s, although we may interpret Kant – who refers to them both – as reconciling these perspectives within his proposed equivalence between transcendental idealism and empirical realism; and lately such as d’Espagnat, and Penrose’s “Road to Reality” (A.E. Knopf, 2004)

Cosmological introduction

The term “cosmological coincidence” usually refers to the fact that the Dark Energy era seems to succeed the matter-dominated era quite precisely at ‘our’ time, that is to say, to coincide with a particular cosmological time of ridiculously low probability – although [9], for instance, discusses the way this ‘chance’ is usually interpreted, hence the relevance of the coincidence problem.

The range of admissible numerical values for the cosmological parameters has been reduced, step by step, from the 2000s to the latest 2016 results, thanks to the increasing power of instruments such as WMAP, Planck, the Sloan Digital Sky Survey II (SDSS-II) and many others, and to the colossal theoretical and experimental work achieved by considerable teams of astrophysicists, cosmologists, engineers and others.

For the Hubble constant H0, i.e. the value at redshift z = 0 of H(z) = ȧ/a, where a = 1/(1+z), a ~9% gap has appeared between two types of observations: i) an indirect one, mostly rooted in predictions derived from measurements of the CMB, giving a Hubble value H0 ~ 67 km/s/Mpc [10] when applying the consensus ΛCDM cosmological model, which combines baryonic matter and cold (or, more recently, possibly warm) dark matter (CDM) with a cosmological constant playing the role of Dark Energy; and ii) a more directly measured value, H0 ~ 73.24 km/s/Mpc, resulting from ‘local’ observations, i.e. on quite nearby galaxies.

The issue is important because it determines the extent of validity of the Standard Model of Cosmology, somehow framed by energy density ratios summing to 1 by definition, including an Ωk accounting for the energy density embedded in the universe’s average curvature – hence null for a quasi-Euclidean, or ‘flat’, universe.

The value H0 = 73.24 km/s/Mpc [7] gives ΩM,0 = Ωb,0 + ΩDM,0 ~ 0.26, hence ΩΛ,0 ~ 0.74, assuming Ωk ~ 0, when using the h²Ωb and h²ΩDM usually found from Planck, WMAP and other instruments [10, 11].
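
This arithmetic is easy to reproduce. Here is a minimal sketch (the h²Ω values below are illustrative, Planck/WMAP-era numbers, not quotes from any single release):

```python
# Minimal sketch: Omega_M,0 and Omega_Lambda,0 follow from H0 and the
# physical densities omega_x = h^2 * Omega_x (illustrative values, assumed).

OMEGA_B_H2 = 0.0223    # h^2 * Omega_baryon (assumed, illustrative)
OMEGA_DM_H2 = 0.1188   # h^2 * Omega_dark_matter (assumed, illustrative)

def density_ratios(H0):
    """Return (Omega_M0, Omega_Lambda0) for a flat universe (Omega_k = 0)."""
    h = H0 / 100.0                                # dimensionless Hubble parameter
    omega_m0 = (OMEGA_B_H2 + OMEGA_DM_H2) / h**2  # matter density ratio today
    return omega_m0, 1.0 - omega_m0               # flatness: ratios sum to 1

for H0 in (67.0, 73.24):
    om, ol = density_ratios(H0)
    print(f"H0 = {H0:6.2f} km/s/Mpc -> Omega_M0 ~ {om:.3f}, Omega_L0 ~ {ol:.3f}")
```

With H0 = 73.24 this reproduces ΩM,0 ~ 0.26 and ΩΛ,0 ~ 0.74 as quoted above; the lower, CMB-based H0 yields a larger ΩM,0, which is one face of the gap discussed here.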

Another recent paper [12] concludes with the following results: ΩM,0 (or ΩM) ~ 0.266 ± 0.016, ΩΛ ~ 0.740 ± 0.018 and w = −1.15 (+0.123, −0.121), from the observation of 581 SNe Ia up to z ~ 0.5.

A very recent paper [13] has proposed that the gap could be offset by recalculating the effect of the fact that the observers (‘we’, i.e. the Earth, the Milky Way and the Local Group) are currently assumed to be located in an under-dense region of the universe – a subvoid – through modified calculations of the way this fact was already accounted for in the consensus ‘local’ determination [7] of H0 ~ 73.24. Its argument is that observations beyond z = 0.6 should be taken into account so as to consider supernovae outside the local under-density. However, [8] reconstructs an H(z) function, using previous results [7], with z up to ~1.3.

It is not known, at the time of publishing the present paper, what the final impact of contributions such as this one will be, but they may not erase the fact of the direct, local measurements and their implications, such as the questions about the precise factors underlying the Dark Energy and Dark Matter effects.

A still more recent observational paper [14] emphasizes a difference between the TT, TE cross-correlation and EE polarization modes, with results concentrating the discrepancy on these modes: in summary, a central value of H0 = 73.4 and ΩM = 0.2593, hence ΩΛ ~ 0.7407, while the authors’ preferred TE mode gives H0 = 63.4, hence ΩM = 0.3684 and therefore ΩΛ ~ 0.63. Actually the TT results of [11] give Ωb = 0.0458 and ΩDM = 0.21345, and even though their own authors prefer the TE results, the manner of integrating somewhat disparate results, such as the TT, TE and EE modes, seems debatable; cf. [7, 8].

All these results are provisional and the object of ongoing complementary analyses and further tests, thanks to increasingly precise and diversified observational results. However, they suggest it may be useful to update predictions once submitted, and to provide more details about our approach to the two following questions: what may play the role of Dark Energy, and why this coincidence with ‘our’ times.

These proposals come in sections 3 and 4, after an introductory perspective, in section 2, on their particular way of looking at the cosmological issues.

References

  1. A. Montanari, F. Ricci-Tersenghi, G. Semerjian, Clusters of solutions and replica symmetry breaking in random k-satisfiability, arXiv:0802.3627, 2008
  2. Conferences IEEE CCCA’12, Marseille 2012 & ISC-Complex Systems, Orleans University, June 2013, Future = Complexity
  3. M. Mezard, Optimization and Physics: On the satisfiability of random Boolean formulae, arXiv:cond-mat/0212448, 2002
  4. M. Mezard, G. Parisi, M. Tarzia, F. Zamponi, On the solution of a ‘solvable’ model of an ideal glass of hard spheres displaying a jamming transition, Journal of Statistical Mechanics: Theory and Experiment, stacks.iop.org/JSTAT/2011/P03002
  5. P. Journeau, New Concepts of Dimensions and Consequences, AIP n°1018, http://dx.doi.org/10.1063/1.4728011, 2008
  6. J.B. Hartle, S. Hawking, Wave Function of the Universe, Phys. Rev. D 28, 2960, 1983
  7. A.G. Riess et al., A 2.4% Determination of the Local Value of the Hubble Constant, arXiv:1604.01424 [astro-ph.CO], 5 Apr 2016
  8. J.L. Bernal, L. Verde, A.G. Riess, The trouble with H0, arXiv:1607.05617 [astro-ph.CO], 2016
  9. E. Bianchi, C. Rovelli, Why all these prejudices against a constant?, arXiv:1002.3966 [astro-ph.CO]
  10. Planck 2015 results XIII. Cosmological parameters, arXiv:1502.01589 [astro-ph.CO], Feb. 2015
  11. C.L. Bennett et al., Nine-Year Wilkinson Microwave Anisotropy Probe (WMAP) Observations: Final Maps and Results, The Astrophysical Journal Supplement Series, 208:20 (54pp), October 2013, doi:10.1088/0067-0049/208/2/20
  12. Campbell, M. Fraser, G. Gilmore, How SN Ia host-galaxy properties affect cosmological parameters, arXiv:1602.02596v1 [astro-ph.CO], 8 Feb 2016
  13. A.E. Romano, Hubble trouble or Hubble bubble?, arXiv:1609.04081v1 [astro-ph.CO], 14 Sep 2016
  14. T. Louis et al., The Atacama Cosmology Telescope: Two-Season ACTPol Spectra and Parameters, arXiv:1610.02360v1 [astro-ph.CO], 7 Oct 2016

Know the Future

Are you serious?

Is it something impossible? Or is it not rather a pleonasm? The extent of the past in time, the depth of our instruments and our increasing knowledge of the past all grow ‘our’ common knowledge every day. Through science we also increasingly know the laws according to which what has happened will happen again in the future. Our ability to plan, predict and document grows so much that the future might be just plain knowledge – and to know the future… knowing knowledge.

Structures, systems and variables

The Universe is assumed to have started homogeneous and isotropic, hence with few variables, themselves not very variable – or not variable at all, were it not for the Heisenberg Uncertainty Principle (HUP), which would derive from a co-necessity principle such as previously proposed; cf. our first blog.

This means that new variables have ‘emerged’ over time, allowing for instance events and phenomena as constraints – clauses ranging from nucleosynthesis to ‘Large Scale Structure’, life and finally ‘us’… not bad after all, assuming we are happy to be on board.

So new variables have come with time – or rather time, and more precisely the future, has come with new variables – which, by introducing new degrees of freedom, develop a time-oriented universe in their direction, as opposed to the direction of the existing, time-reversible variables. The latter is somehow their past, because that is where the limited number of degrees of freedom makes its probability null, as opposed to the direction of the future.

This spares the cosmological coincidence problem any recourse to the Anthropic principle. Homogeneous and huge were the datasets and their successive complexity layers; fewer and heterogeneous, nevertheless, were those that make the future – and yet, being more complex and fragile, they decay with entropy growth, going to the past.

Now, the goal here is not to enter and wander into debates, however important [1] they may be, among diverse attempts at realism – reductionism, positivism, Platonism, Aristotelianism, naturalism… – which seem to come from looking at the same thing from diverse angles.

Our purpose today is to focus on the sort of phase transition, around this coincidence, that saw life – and, much more decisively and recently, the human – take over the role of driving the future, even to the point that some geologists speak of an ‘Anthropocene’, as a world of pioneers.

Is this human future predictable?

Quite amazingly, it would seem that the future of our planet and the universe would be predictable if it were wholly natural – hence following the laws of a Nature questioned as Natura naturans versus Natura naturata by Spinoza, and in a not so different duality by Kant in his last Critique. There, humans appear on the ‘naturans’ side, and it is therefore striking that this part of the flow toward the future may be the very one that humans do not predict!

Hence one way to reconcile reductionism with emergentism from the article in reference: the first comes from the future and the second from the past, and they need one another, because there is no future without a past, and conversely.

Indeed, it seems that we constantly fail at predicting the future inasmuch as it radically differs from the past: it does not reproduce it but always, even systematically, creates it through disruptions, breakthroughs and, even more, the kind of revolutions that Kuhn concludes are unavoidable for Science – but what may still expect to escape the reach of Science? – and that Schumpeter and Chandler conclude are unavoidable for innovation, entrepreneurship and management… presumably also for art, not to speak of societal, social revolutions.

So here is some good news after all:

  • the future will always surprise us;
  • it comes increasingly from or through us (who are indeed surprising), and this is the future organization of a better world, the much sought-after world of knowledge;
  • it adds new levels of variables and constraints… and oracles to close them up, so that the filling of increasing complexity levels optimally, hence most efficiently, fills the layers of increasing complexity toward us as future – or, conversely, knowledge as past from us.

Reductionism plays it with names and formulas, hence from a mathematical world with its smooth, dense, proving power set – which is also the power of semantics, or natural language – and it projects this onto the discrete, finite lattice (as huge as it seems) of experimental data and verification.

[Figure: complexity reduced to a formula]

Emergence is Big, with Data, but Reduction to a formula is Powerful, with Meta-complexity, as summarized in the somewhat reductive formula of emergence above: the key point is something that physicists fight in order to reduce, removing it from their incommensurable (hence future) part and keeping just what actually emerges, under the name of renormalization.

This might be the greatest among the many remarkable contributions of Computational Complexity discoveries to Science: the reduction of the incommensurable to a commensurable.

Conclusion on predicting the future

You would be surprised by the extent to which the future is both always so surprising and yet so systematically predicted – or, more exactly, anticipated, ‘caused’ as told, one way or another. Before so many plans – especially when it comes through no plan – it is written or evoked, but the surprise comes when, where and inasmuch as it is effected, actualized and emerges.

These are the projections from the dense, smooth semantic and mathematical future onto our discrete lattices of quantized and entangled variables, which we need to renormalize, in a much wider sense, to predict which piece of the future will appear or emerge – when, how and how much.

This is what the interdisciplinary project i/REVUER (cf. www.revuer.org) plans to reach, upon the premises of its ongoing and incoming n/REVUER and of the next, disciplinary but already disruptive, d/REVUER major step.

The predictive power of disciplinary fields is already summarized in our previous blogs and has been the bread and butter of epistemologists for decades, if not centuries: the new challenge is now the plan to develop and reach the General Predictability inherent to what would become an effective, interdisciplinary… field of fields.

[1] G. Johnson, Challenging Particle physics as Path to Truth, New York Times, 12/4/2001

Cosmological Constant issue (abstract)

The Cosmological Constant problem usually refers to the circa 10^120 magnitude gap between the so-called ‘observed’ value of the related energy density, estimated [1] at less than “(10^-12 GeV)^4 ~ 2×10^-10 erg/cm³”, and the value estimated from “Quantum Field Theory all the way up to the Planck scale M_Pl ~ 10^18 GeV”, giving a related energy density of circa “2×10^110 erg/cm³”.

Another presentation rather emphasizes the gap between the Planck length and the cosmological horizon length scale, the radius of the universe: the ‘observed’ value is given by 3H²/(8πG), derived from equating the hypothetical energy density of Dark Energy, or equivalently of the Cosmological Constant, to a negative pressure, with a universe length scale equated to 1/H. The ratio (L_Λ/L_P)² then has a similar order of magnitude, 10^123, hence the apparent requirement of “enormous fine tuning” [2].
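
For readers who want to check this order of magnitude, here is a minimal sketch of the length-scale arithmetic (SI constants; the choice H0 = 70 km/s/Mpc is an assumption for illustration):

```python
import math

# Minimal sketch of the order-of-magnitude arithmetic above (SI units).
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
Mpc = 3.086e22     # one megaparsec, m

H0 = 70e3 / Mpc                         # Hubble constant in s^-1 (assumed value)
L_Lambda = c / H0                       # cosmological horizon scale ~ 1/H
L_Planck = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m

ratio_sq = (L_Lambda / L_Planck) ** 2
print(f"(L_Lambda / L_Planck)^2 ~ 10^{math.log10(ratio_sq):.0f}")
```

This prints roughly 10^122, matching the ~10^123 order of magnitude quoted above to within the rounding and convention choices.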

The Problem disappears, however, if the magnitude gap is just left for what it is – the gap to a radius taken as some history of the universe – and the concept of density is looked at from another viewpoint, where the Zero point is a singularity that does not, therefore, have to be spread over this size, because it is the same Planck-size cell everywhere: whether you are on the Earth or at the other edge of the universe.

This cell has a name, the Big Bang, usually taken as a singular event but which may as well be considered as the common non-local dimension that the EPR experiments help figure out, and be seen as a layer of the universe – or, more precisely, a collection of layers of Planck-size spatial dimensionality, hence so compactified.

Conversely to “all the way up”, Black Holes give examples of “all the way down” to a singularity – presumably the same point through which all ways go.

This means that the cause of the expansion should also be considered from another angle, according to which the energy is there, but the universe expands exactly as needed for new complexity layers, or equivalently eras, to take place as we observe them – hence because its own future determines it, from entropy growth requirements; in other words, because of its boundaries.

As recalled in our blog about the Cosmological Coincidence, many pieces of the mechanism have already been communicated, but the comprehensive, detailed and integrated model still needs to be – hence the paper projects devoted to this problem.

[1] S. Carroll, Living reviews in Relativity, http://livingreviews.org/lrr-2001-1

[2] T. Padmanabhan, Emergent gravity and Dark Energy, arXiv:0802.1798 [gr-qc] 2008

Cosmological Coincidence (abstract)

The shift of the energy density ratio of total matter (dark + baryonic), ΩM(z), from 1 to 0 – or so far to ~0.26 at least – is so abrupt a transition from the matter-dominated era to the current era, according to the standard model of cosmology, that the quite null probability of its coinciding roughly with ‘now’ – at cosmological scale – has become known as the ‘cosmological coincidence’.
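
A minimal sketch, assuming flat ΛCDM with ΩM,0 = 0.26 as quoted above, shows how late and how abrupt this transition is:

```python
# Minimal sketch (flat LambdaCDM assumed, Omega_M,0 = 0.26 from the text)
# of how abruptly Omega_M(z) falls from ~1 to its present value.

OM0 = 0.26  # present-day total matter density ratio

def omega_m(z):
    """Matter density ratio at redshift z in a flat LambdaCDM universe."""
    m = OM0 * (1 + z) ** 3       # matter density scales as (1+z)^3
    return m / (m + (1 - OM0))   # the Lambda term stays constant

for z in (10, 3, 1, 0.5, 0):
    print(f"z = {z:4}: Omega_M ~ {omega_m(z):.3f}")
```

ΩM stays near 1 for most of cosmic history (z ≳ 3) and drops toward 0.26 only in the recent past – the ‘coincidence’ discussed here.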

Another type of abrupt shift occurs when computing the k-SAT problem, about the ‘satisfiability’ of a set of M Boolean clauses linked through a Conjunctive Normal Form (CNF) formula: each clause is an OR combination of Boolean variables taking the value True or False – translated into 1 or 0 – and each clause is AND-linked to the others, all clauses or constraints using variables randomly taken from a pool of size N.

The ratio of clauses to variables is usually denoted a(k) = M(k)/N, and experimentation has revealed that the satisfiability abruptly shifts from 1 to 0 when a(k) rises above a measured threshold as N goes to infinity.
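
A minimal brute-force sketch of such an experiment follows (small N only, for illustration; real studies use much larger N and dedicated solvers, and the shift sharpens as N grows):

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Each clause: 3 distinct variables, each negated with probability 1/2."""
    return [[(v, rng.random() < 0.5)                  # (variable index, negated?)
             for v in rng.sample(range(n_vars), 3)]
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Exhaustive check over all 2^n_vars assignments (tiny N only)."""
    for bits in itertools.product((False, True), repeat=n_vars):
        if all(any(bits[v] != neg for v, neg in clause) for clause in clauses):
            return True
    return False

rng = random.Random(0)
N = 12
for alpha in (3.0, 4.0, 4.5, 5.5):
    sat = sum(satisfiable(N, random_3sat(N, int(alpha * N), rng))
              for _ in range(20))
    print(f"a = {alpha}: {sat}/20 random formulas satisfiable")
# The satisfiable fraction collapses around a ~ 4.27, the threshold a_s(3)
# mentioned just below.
```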

Both issues seem a priori unrelated. However, while a(2) = 1 has been proven, regularly improved experimentation has found a_s(3) ~ 4.267, and has even detected a “clustering phase transition” a_d(3), observed at a value [1] circa 3.86, with then even another, intermediary “condensation phase transition” a_c(k) for k > 3.

We have previously [2] introduced the clause density ratio β = (1 − N/M) = (1 − 1/a), and proposed that β_d = π/√18 ~ 0.74048 ~ (1 − 1/3.853), in connection with the Hales-Kepler theorem about the maximal sphere packing density ratio, giving a_d(3) = 1/(1 − π/√18) ~ 3.853, very close to the observed value.
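
As a quick numerical check of the identities just quoted:

```python
import math

# Numerical check of the proposed identification with the sphere packing bound.
beta_d = math.pi / math.sqrt(18)   # Hales-Kepler maximal sphere packing density
a_d = 1 / (1 - beta_d)             # implied clustering threshold a_d(3)

print(f"beta_d = pi/sqrt(18) ~ {beta_d:.5f}")  # ~0.74048
print(f"a_d(3) = 1/(1 - beta_d) ~ {a_d:.3f}")  # ~3.853, vs the observed ~3.86
```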

It is useful to observe a relationship so established between the geometrical and logical worlds, and to consider extending it into the physical one, using the fact that any logical proposition or sentence can be written as a CNF formula – of which k-SAT only brings an admittedly important restriction, through the defined dimensionality of the clauses.

The logical world alone has no spatiality, as opposed to the other extreme, a non-framed Euclidean ‘flat’ space alone, no less theoretical. Energy functions have however been introduced, relating the randomly allocated variables of 3-SAT to spin-glass physics [3], and more recently modeling a type of sphere stacking [4].

At the cosmological level, another type of link, interpreted as formal sphere stacking, has been introduced [5], where the cosmological energy density ratio – written ΩΛ to encompass the dark energy role, whether caused by a cosmological constant or a somehow equivalent phenomenon – was proposed to take this specific value, i.e. π/√18.

Scheme [5] also referred to the ‘no-boundary’ proposal [6] of a universe in transition from a compact geometry, “part of Euclidean four-sphere of radius 1/H”, to a mostly Lorentzian one of unlimited volume.

Recent observations [7, 8] seem to suggest a universe lately departing, at the least, from the ΛCDM consensus, but where today’s energy density ratios – or whatever takes their roles – fit well with the values proposed in [5].

The mechanism sketched in [5] is therefore detailed further in the new, full paper now to follow, where, for these roles, the contributions of the vacuum energy and of the degrees of freedom able to drive the future through the present era are integrated from a hopefully consistent, and surely unusual, angle.

[1] A. Montanari, F. Ricci-Tersenghi, G. Semerjian, Clusters of solutions and replica symmetry breaking in random k-satisfiability. arXiv:0802.3627, 2008

[2] Conferences IEEE CCCA’12, Marseille 2012 & ISC-Complex Systems, Orleans University, June 2013, Future = Complexity

[3] M. Mezard, Optimization and Physics: On the satisfiability of random Boolean formulae, arXiv:cond-mat/0212448, 2002

[4] M. Mezard, G. Parisi, M. Tarzia, F. Zamponi, On the solution of a ‘solvable’ model of an ideal glass of hard spheres displaying a jamming transition, Journal of Statistical Mechanics : Theory and Experiment, stacks.iop.org/JSTAT/2011/P03002

[5] P. Journeau, New Concepts of Dimensions and Consequences, AIP n°1018, http://dx.doi.org/10.1063/1.4728011, 2008

[6] J.B. Hartle, S. Hawking, Wave Function of the Universe, Phys. Rev. D 28, 2960, 1983

[7] A.G. Riess et al., A 2.4% Determination of the Local Value of the Hubble Constant, arXiv:1604.01424 [astro-ph.CO], 2016

[8] J.L. Bernal, L. Verde, A.G. Riess, The trouble with H0, arXiv:1607.05617, astro-ph, 2016

A General Epistemology to Model the Future

A Research and Editorial project in progress about Interdisciplinary Modeling

Each scientific domain, or even field, develops its own methods and its own epistemology; therefore the need for a General Epistemology, to ensure consistency between domains or even between fields – such as quantum mechanics and general relativity within Physics – becomes increasingly recognized, as further scientific integration is widely acknowledged, from cosmology to society, through biology and the fast-growing new “Integrated Environmental Modelling”.

Physical world and boundary issues

‘Our’ universe is assumed to have started at the ‘Big Bang’ and then gone through several eras after the initial inflation: 1) the radiation era, 2) the matter-dominated era, and 3) the most recent dark energy, or cosmological, era, with its ‘late-time’ accelerated expansion of the universe.

This ‘first’ evolution is increasingly well modeled and tested by a set of cosmological models based on physical foundations and principles, such as homogeneity and isotropy, themselves relying on assumptions about concepts of dimensionality: numbers and types of dimension(s), to begin with space and time; numbers of (degrees of) freedom; geometries and metrics; hence measurements. As an example, angles are often key measurements and parameters in Physics, found in spherical coordinates equivalent to Cartesian ones; yet angles do not embed any spatial length, while, conversely, the Cartesian x_i do not by themselves carry the concept of angle.
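
A minimal sketch of that remark about coordinates (plain Python, purely illustrative):

```python
import math

# Spherical coordinates split a point into one length (r) and two pure
# angles (theta, phi); Cartesian coordinates carry lengths only, with the
# angles left implicit.

def spherical_to_cartesian(r, theta, phi):
    """r in any length unit; theta (polar) and phi (azimuth) in radians."""
    return (r * math.sin(theta) * math.cos(phi),
            r * math.sin(theta) * math.sin(phi),
            r * math.cos(theta))

x, y, z = spherical_to_cartesian(1.0, math.pi / 2, 0.0)
print(x, y, z)  # ~(1, 0, 0) up to floating-point error: the two angles alone
                # fix the direction, the single length r fixes the scale.
```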

Yet, in spite of the fast-progressing experimental precision of the ‘cosmological parameters’ of the consensus ‘ΛCDM’ model, major ‘cosmological issues’ remain unsolved at its borders, especially about the transitions between dominant energy sources – whether radiation, matter or the current ‘dark energy’, usually linked to the energy of a ‘vacuum’ which therefore appears all but empty.

Borders, conditions, horizons, ‘closures’ and related transitions, such as Physical to Biological, are of critical importance in our work, where our hypothesis [1], applied to cosmological issues and the derived cosmological parameters, seems consistent with recent observational surveys [2,3], residual discrepancies [4] and a “sharp acceleration at low redshift” [5].

Biological era

A bang of sorts is assumed in Biology too, around the question of how life started, whether here or everywhere.

Linking such event(s) to some geometry, through this ‘late expansion’, should not astonish once it is observed how deeply biology is rooted in physics and physics in geometry, to begin with the geometry of this ‘vacuum’, where the dark energy or cosmological constant effect is hypothesized to be sourced. Furthermore, a linkage of life with the order level of three-dimensional chemical structures has already been pinpointed [6], but the mechanism of a connection with universe geometries is still to be submitted [7].

Toward this goal, recent conclusions about evolution modeling, hence about its long-term deterministic aspects, call for some kind of “new general concepts to predict it” [8].

Human, sociolinguistic era

This same dimensionality issue becomes critical to the progress of cheminformatics and of computing, where grammars, starting with Chomsky’s, question the roots of the divergence between formal linguistics and ‘natural’ linguistics, the label given to human languages or semantics.

Linking these to the vacuum again, both smooth and quantized, is more acceptable now that the EPR (Einstein-Podolsky-Rosen) effect is experimentally well proven.

Our General Epistemology [9], hence General Predictability, a metamodel of an integral evolutionary universe progressing toward an ever increasing complexity density, or equivalently toward its future, already appears to fit requirements from a variety of observations ranging from the cosmological to the semantic.

At this point in our research program the ‘integrative geometry’ also seems to solve several issues about defining Time [10], as well as the “three worlds” issue developed by a line of logicians and physicists ranging from Leibniz to Frege, Bolzano and Popper, lately emphasized by Penrose [11], and somehow related to Peirce [12] along the way.

This issue concerns a ‘mental world’ apparently embedded within a ‘spatial world’, or universe, which itself embeds a ‘truth world’ that would in turn contain the universe, in an apparently Escherian, illogical circularity.

For a consistent evolution of the universe from cosmological to sociological eras

The time dimension, already relativized by Einstein, is ever more questioned by physicists as well as at the other edge of the complexity spectrum, i.e. in psychology; whether as the “Ultimate Quest” [13] or as a puzzle [14], the need for a more comprehensive, cosmological concept of time has been expressed in the respective concluding talks of both the Cosmo 11 (J.P. Uzan) and Cosmo 12 (A. Stebbins) conferences.

Meanwhile the irreversibility of a universal time, as opposed to the time-reversible laws of physics, was emphasized in references [9, 10] among many others, leading us to submit a draft model, now to be presented and tested more strictly, of an equivalence between the future and the line of progress, or efficiency, or knowledge, computable as a maximal ‘FCP’ complexity density, where ‘FCP’ summarizes, as the ‘Fourfold Co-necessity Principle’, the framework from which we now derive the type of extended four-dimensionality this geometry needs.

This principle, one foremost basis of which is to replace a debatable measure of time by a generalized, oriented, more relativistic and experimental one, is beginning to deliver the seeds of an interdisciplinary cartography from the manners and orders in which the co-necessity expresses itself, if only once it is observed that a grammar, at one level, needs a grammar in order to be signified.

Predictive power from interdisciplinary integration

The chart that spans page 4 of Millis’ “Progress in Revolutionary Propulsion Physics” [15] illustrates over a hundred interdisciplinary links, and even routes, from a short list of fundamental physics domains upward to “goal-driven (propulsion) visions” and back downward. However there are actually tens of thousands of fields, each full of and crossed by projects, progress and results, and even more evolving visions that dynamically and permanently interact; no integrative predictive power may be reached unless each piece of trajectory exhibits a superior, integrative consistency with enough past, and potentially past, ‘real’ or ‘realizable’ trajectories in the hypergraph.

Other complexity levels and related graphs are questioned regarding biological evolution, as mentioned above, and its related complexities, for instance by Amaral, who observes that “our understanding of biomedical system has fallen behind our ability to gather new data” [16].

He then questions the measure of human biological complexity through an interactome with as many as 650,000 interactions, although a neuronal dynamical hypergraph would be a better level for the gap from “unveiling the working of single neuron…” to “provide an understanding of consciousness”. Indeed it would still be necessary to reach the Natural Linguistic or semantic Power set pinpointed in [24], from the communities that project these onto hierarchies of concepts, and together with them into spaces appropriate to the coarse graining they manipulate.

In his ‘Foundation’ series, Isaac Asimov pictures a remote future in which humanity follows the rule of a centralized galactic empire entering a decadence that a scientist predicts, and strives to make less devastating, by using a new, interdisciplinary science, ‘psychohistory’, from which a better future for humanity can be predicted from appropriate initial conditions plus ongoing, enhanced, general, predictive follow-up. There are, interestingly, two foundations: a first one rooted in physics and more generally the ‘natural sciences’, and a second one, warden of the psycho-historical equations and of the subsequent fate; but this split is finally overcome when a wholly integrated, assuredly more environmentally friendly and holistic approach reveals a winning solution.

How to reach so powerful an integrative, interdisciplinary predictive power?

To compute the future from the interactions between anticipations, goals and existing scientific fields and technologies, the step of picturing links such as those mentioned above, or even arrows – the ‘productions’ of grammars or the ‘morphisms’ of Category theory – is useful but far from sufficient. Knowledge flows, and trajectories generally speaking, must be modeled through a process allowing and predicting how dedicated models intertwine, interact and encompass one another once projected into appropriate spaces, which themselves depend on them.
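Before the harder modeling of that intertwining, the mere ‘links and arrows’ step can at least be fixed in code. Below is a minimal sketch, in Python, of fields as nodes, productions or morphisms as directed edges, and candidate trajectories as composed, acyclic paths; every field name and link in it is a hypothetical placeholder, not data taken from [15].

```python
# Minimal sketch: fields as nodes, 'productions'/'morphisms' as directed
# edges, candidate trajectories as composed, acyclic paths.
# All field names and links are hypothetical placeholders.
links = {
    "fundamental physics": ["cosmology", "quantum computing"],
    "cosmology": ["habitability studies"],
    "quantum computing": ["cheminformatics"],
    "cheminformatics": ["biology"],
    "biology": ["linguistics"],
}

def trajectories(graph, start, goal, path=()):
    """Enumerate every acyclic route from one field to another --
    the 'pieces of trajectory' whose mutual consistency the text
    says must then be checked against past realized ones."""
    path = path + (start,)
    if start == goal:
        yield path
        return
    for nxt in graph.get(start, []):
        if nxt not in path:  # keep routes acyclic
            yield from trajectories(graph, nxt, goal, path)

for route in trajectories(links, "fundamental physics", "linguistics"):
    print(" -> ".join(route))
```

Even this toy makes the scaling problem visible: with tens of thousands of fields the number of composable routes explodes, which is exactly why picturing arrows alone yields no predictive power.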

One difficulty comes from the considerable differences of order levels between the hyper-dense semantic layers, which therefore lie in the future of the under-dense structural, computational, chemical and, moreover, physical objects. These, conversely, require expansive space to take shape and place, and to operate, in order to become the present and repel the past, where the degrees of freedom, hence the number of free paths, are more limited, which is what makes it the past.

Interdisciplinary Integration versus Statistics and ‘Big Data’

Science is about predictions, often drawn and checked from statistics in which typical events occur more often, more repeatedly, in a ‘large number’ of trials than under the by-default assumptions of randomness and homogeneity, i.e. equal probability of occurrence and acceptance. A pattern of ‘YES’ versus ‘NO’ becomes scientific when reproducible from a (formal) model or Language L, and technological when systematically so from a process, effective through external change S yet delegated (S(L)).

There is a difficulty with the idea that such hazard, chance or change takes place ‘spontaneously’, a word whose meaning and testability are debatable, and somewhat redundant: the first principles of Parmenides, Heraclitus, Democritus, Socrates, Plato and Aristotle appear, after deep scrutiny, interwoven, interdependent and even co-necessary [17] to one another. This was progressively guessed by these pillars of our philosophical and scientific tradition, even if their Logos are seen from diverse angles depending on whether S(L) or L(S) – such as NP and P in computational complexity, cf. below – is prioritized in their co-necessity because of an ill-defined context.

The manner of interaction between space and an extended concept of oriented time or horizon, with change and form, such as language, thereafter characterizes a portion of what we call an integral, hence interdisciplinary, universe, driven by dimensional types and numbers. This is observed at opposite edges of the complexity ladder, from EPR phenomena to natural linguistics, the realm of meaning, describable with Turing’s oracles and their more recent ‘provers’.

Oracles, provers and objects

Randomness is hard to define; for instance, key definitions in computational complexity oppose deterministic Polynomial Time (P) to ‘Non-deterministic’ Polynomial Time (NP) powers, the non-deterministic language class encompassing problems for which solutions are guessed and then checked in polynomial time, as opposed to the deterministic algorithms for which the solutions are calculated in polynomial time.
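A small illustration of that guess-then-check split, using Boolean satisfiability as the canonical NP example; the three-clause formula is an arbitrary toy, and the deterministic `solve` below stands in for the absent ‘guesser’ by exhaustive enumeration.

```python
# Guess-then-check: (x1 or not x2) and (x2 or x3) and (not x1 or not x3),
# encoded as (variable index, required truth value) pairs.
from itertools import product

clauses = [[(0, True), (1, False)],
           [(1, True), (2, True)],
           [(0, False), (2, False)]]

def check(assignment):
    """Polynomial-time verifier: the 'checking' half of NP."""
    return all(any(assignment[i] == want for i, want in clause)
               for clause in clauses)

def solve(n_vars):
    """Deterministic stand-in for the 'guesser': exhaustive search,
    hence exponential in the number of variables."""
    for bits in product([False, True], repeat=n_vars):
        if check(bits):
            return bits
    return None

print(solve(3))  # (False, False, True) satisfies all three clauses
```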

Throughout the evolution of the domain this ‘guesser’ has been replaced by a ‘Prover’, eventually itself a Turing machine complementary to the first one, somewhat similarly to the unavoidable role of the observer in physics, particularly emphasized by Geroch for cosmological “domains of dependence” [18].

The ‘guessing’ of a Non-deterministic language [19] is an extraction of some knowledge from a space which has so many (more) degrees of freedom that they enforce an orientation, which we call the future. Hence one definition of chance as “the measure of our ignorance” [20]; but the replacement of this guess always comes with the equivalent generalized domain of dependence or acceptance that encompasses it, such as ‘R’ or ‘Ck’, with their power of the continuum, hence the null probability of drawing any given number in them and yet the certainty of finding it there.
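That last point is the standard textbook fact about continuous distributions; for an X drawn uniformly on [0, 1]:

```latex
% Null probability of any single draw versus certainty over the whole set:
P(X = x) = 0 \quad \text{for every } x \in [0,1],
\qquad \text{yet} \qquad
P\bigl(X \in [0,1]\bigr) = \int_{0}^{1} \mathrm{d}t = 1
```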

The Cosmological example

Today the Galaxy seems to be becoming reachable through the quest for Earth-like environments, while the physical evolution of the cosmos from the Big Bang to our days appears quite well modeled by the consensual “ΛCDM” model, whose boundaries, however, are now at stake.

This model uses a limited set of cosmological parameters, but still leaves critical issues unanswered, whether about the Λ cosmological constant (or its ‘Dark Energy’ effect) and CDM (‘Cold Dark Matter’), the main components of the physical universe, or about its leaps from the physical to the biological complexity level, and from life to intelligence, particularly the so far human, semantic, ‘psycho-historical’ intelligence that we may also call knowledge.

The power of sciences comes from their tested predictability, restricted to a limited environment, yet effective enough to anticipate natural or technological phenomena: a research field, in progress toward its assumed and targeted research object, succeeds when it predicts what becomes its effective object.

A powerful scientific model may predict a diversified range of behaviors, depending on the environmental constraints on the object of research once it becomes experimental or even applied. However, its predictive power is limited to the reproduction of an existing, tested form or formalism or, equivalently, to the production of an effect as described and calculated from its model. In other words, science predicts the future inasmuch as it resembles the past, not the future inasmuch as it introduces radical novelty, breaking from the past.

The model proposed results in a three-dimensional trigger reached at some point, which may be interpreted spatially, according to which the Dark Energy density ratio is capped at 0.74048 (from the Hales-Kepler theorem), hence the matter density ratio at roughly 0.26, both kept at these levels since z in the circa 0.2 to 0.3 range, this precise z remaining to be checked.
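Spelling out the arithmetic behind that cap: the densest packing fraction of equal spheres in three dimensions, proven by Hales for the Kepler conjecture, is π/(3√2), and the model above identifies the Dark Energy density ratio with this bound (a restatement of the claim, not an independent derivation):

```latex
% Kepler conjecture (Hales): densest packing fraction of equal spheres in 3D.
\Omega_{\Lambda} \le \frac{\pi}{3\sqrt{2}} = \frac{\pi}{\sqrt{18}} \approx 0.74048,
\qquad
\Omega_{m} \approx 1 - 0.74048 \approx 0.26
```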

Saturated complexity density

Several principles and meta-models derive from, or at least appear consistent with, the FCP: the Heisenberg Uncertainty Principle; the principles of thermodynamics, including the second principle, according to which entropy grows within a closed system; and a complementary principle according to which systems’ complexities grow relatively to one another toward the future, yielding a model of the future in which a system becomes saturated across its order levels.

At one spatial edge of the universe an opposition endures between ‘cosmic voids’ and their over-dense dark matter boundaries [21], where, conversely, most galaxies concentrate.

From recent, considerable sets of observations have resulted ever sharper constraints upon a deterministic model of the evolution of the universe, which appears to govern it quite precisely; and yet there remains a discrepancy between the ‘inferred’ and the observed [22] Hubble constant H0, suggesting that something is missing about some of the universe’s boundaries… possibly ‘ours’.
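For concreteness, the headline figures of the cited 2016 papers (quoted as reported there, and worth re-checking at the source):

```latex
% Local measurement [22] versus the value inferred from Planck under \Lambda CDM:
H_{0}^{\mathrm{local}} = 73.24 \pm 1.74 \ \mathrm{km\,s^{-1}\,Mpc^{-1}},
\qquad
H_{0}^{\mathrm{inferred}} \approx 67.8 \pm 0.9 \ \mathrm{km\,s^{-1}\,Mpc^{-1}}
```

a gap that [22] quantifies at roughly 3.4σ.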

This is where a new, low-redshift, hence recent, constraint and dimensionality is proposed [23].

How to predict a future that does not replicate the past?

Knowledge has flowed from researchers and inventors into spaces of available resources, from where it has always revealed the most favorable paths to a future thereafter always ‘new’. This happened through an apparently non-refuted pattern, whether the hypothesized paths were refuted, hence discarded, or on the contrary confirmed, possibly extended to new limits. It also applied whether the origin was to be found primarily in past experiments, along an empiricist perspective, or rather in innate categories used toward hypothesizing, or through an intricate Kantian mechanism.

A model of scientific and technical discovery and invention therefore does not seem easily to get rid of researchers and innovators; let us regroup them as Authors, whether they follow rather the L(S) formal modeling routes toward potential experiments ‘à la Maxwell’, or inventive, perhaps more pragmatic trials and the resulting data gathering ‘à la Faraday’, from which appropriate models could arise.

These authors, Geroch’s observers, are questioned by Penrose, who looks for their objective equivalent, for instance in some ‘OR’, Objective Reduction, to account for the quantum reduction exhibited everywhere in physics, about which he presents the interpretation that any action upon a particle A entangled with a particle B triggers paths going backward toward their common past, to the place, date or rather formalism where they became entangled, so as to impact B.

But then shouldn’t we conclude that this event, toward which an action on particle A, or on particle B, will eventually send some influence, actually lies in the future of these isolated particles even though it also lies in the past of its authors’ knowledge about it?

Our model of the future naturally encompasses both types and levels of phenomena; that is to say, a common conclusion is drawn about when and how much a phenomenon is located in the relative future or past of another one, with most phenomena positioned in our future because that is where the diversity of paths and degrees of freedom is so incredibly richer that it makes it, mathematically, the future. As mentioned above, it is from there and then that authors convincingly and regularly select and project them, through one of the types of routes that ‘authorize’, precisely because the space of items and structures going to fill the free, intermediate complexity levels between the human and less complex beings is immense, requiring what appears to us as a fast-expanding knowledge space with appropriate, for instance electronic, room.

Likewise, at other dimensional combinations, the space of items and structures filling the physical universe, up to the spatial ‘Large Scale Structure’, has become ever better filled, especially within over-dense regions, which, however, only become so relatively to their under-dense counterparts.

All these examples help us better see, and from there model, the relative positioning of horizons, whether event or cosmological ones, or otherwise oracle or authoring ones, seen from one side as scattered microstates or, from the other, as a simple four-parameter set.

To conclude, a model of the future may not draw from statistics alone, however useful they may be to uncover patterns hidden in data, even ever more powerful patterns, which yet remain limited to the power and boundaries of the research project projected onto them.

These will be useful steps in nearby futures, but they may not trace the trajectories, especially the interdisciplinary ones – such as those powerful enough to embed downward into technologies, economic and business models – whose boundaries are semantic, particularly the mathematical part of the semantic continuum, from its Power set [24].

This explains why research and inventiveness have succeeded in their optimal environments, those where the degrees of freedom allow enough appropriate trajectories and the free evolution and selection of winning research projections and communities.

From the Research program to current editorial and educational projects

The road to a model of the future, one that seeks how models of the future will succeed and embed into one another, is long to wander. It goes through both L(S) and S(L) routings, and always through some experimental, hence somehow spatial, resource consumption.

The first of these routings corresponds to the series of papers and the editorial project already initiated, to be presented in the next pages and publications, from which an editorial extension is planned.

The second has already resulted in some early prototypes, such as ex.revuer.org, to be replaced by a new set of dynamic scientific-mapping prototypes and, once sufficient resources are gathered, by an envisioned interdisciplinary interactive mapping, primarily made for and through research communities, goals, networks, institutes and agencies.

From there it would become very useful for all kinds of stakeholders in need of better, more powerful, hence precise and therefore interdisciplinary scientific-to-technological, industrial and psycho-socio-economical mappings.

A selection of published and communicated papers, books and patents is listed on the next page, and some unpublished papers are mentioned.

[1] Research paper to be submitted

[2] G. Hinshaw et al, Nine-Year WMAP Observations: Cosmological Parameter Results, arXiv:1212.5226, 2013

[3] Planck Collaboration, Planck 2015 results, XIV, Dark energy and modified gravity, arXiv:1502.01590, 2015

[4] E. Aubourg et al., Cosmological implications of baryon acoustic oscillation (BAO) measurements, arXiv:1411.1074v3

[5] J. L. Bernal, L. Verde, A. G. Riess, The trouble with H0, arXiv:1607.05617 [astro-ph.CO], 2016

[6] J. Monod, Chance and Necessity, Seuil, 1970

[7] To be submitted as part of the editorial project about “Integral Universe”

[8] V. Orgogozo, Imagine another life evolution, CNRS quarterly journal, #284, Spring 2016

[9] A. Aspect, J. Dalibard & G. Roger, Experimental test of Bell’s inequalities using time-varying analyzers, Physical Review Letters 49, n°25, 1982

[10] S. Carroll, From Eternity to Here: The Quest for the Ultimate Theory of Time, Dutton, 2010

[11] R. Penrose, The Road to Reality, A.A. Knopf ed., NY, 2004

[12] C. Tiercelin, Peirce et le pragmatisme, PUF, 1993

[13] S. Carroll, supra [10]

[14] R. Penrose, supra [11]

[15] M.G. Millis, Progress in Revolutionary Propulsion Physics, IAC-10-C4.8.7, https://arxiv.org/abs/1101.1063, 2010

[16] L. A. Nunes Amaral, A truer measure of our ignorance, PNAS, vol. 105, n°19, 2008

[17] http://dx.doi.org/10.1063/1.2737004

[18] R. Geroch & G. Horowitz, Global Structure of Space-time, in General Relativity: An Einstein Centenary Survey, edited by S. Hawking and W. Israel, Cambridge University Press, 1979

[19] S. Goldwasser, S. Micali, C. Rackoff, The Knowledge Complexity of Interactive Proof-Systems, ACM, 1985

[20] Diverse references to begin with Poincaré’s “Last thoughts”, 1913

[21] R. Wojtak, D. Powell, T. Abel, Voids in cosmological simulations over cosmic time, arXiv:1602.08541, 2016

[22] A. Riess et al., A 2.4% Determination of the Local Value of the Hubble Constant, arXiv:1604.01424, 2016

[23] To be submitted, infra and list of papers

[24] J. Hopcroft & J. Ullman, Formal Languages and their Relation to Automata, Addison-Wesley ed., 1969
