Immortal… Human

As I was about to leave our dear Dutch neighbors a few days ago, I answered an earlier point about our all having to die by mentioning my latest tentative book, this one about the proven immortality of Human beings. Hans’ possibly somewhat skeptical request for the related equations now compels the following pieces of summary, too long delayed if not overdue but presently required, if only considering the superb cooking with which Ljam and Hans entertained us that evening (a statement from a Frenchman).

The required equation, although an inequality might be preferred, could be stated as follows:

N_H = ∞

where N_H quantifies the shared, both common and individual, Negentropy, or Entropic potential, of the Human.

This being said, there come all the questions of measuring this quantity, of its dimensions, of the underlying physics and biology, and of justifying that we may not die when all experiments seem to prove precisely the contrary; one key point being that these are never one’s own death but always external and somehow ‘intermediary’ experiments, in a sense those of participants in the apparatus.

As a starter, let’s mention the invaluable light shed on this part of the discussion by the great Niels Bohr himself in his “Atomic Physics and Human Knowledge”, which I strongly recommend and which, incidentally, will give me sufficient time to go further into the details of the reasoning, and proof, that we, the Human species, inasmuch as coming from the edge as the End of the world, may not die into it.

More details about the proof of our immortality soon to come… 🙂

UNSATisfied Universe

https://madridge.org/journal-of-cosmology-astronomy-and-astrophysics/early-online

Above is the Internet address from which the open-access research article, technically more precise about the “Song of the Universe” announced in my previous Post, may be retrieved from the International Journal of Cosmology, Astronomy and Astrophysics (IJCAA), where it was published on July 30th under the title “SATisfiable Cosmologies”.

A preliminary concern may arise in your mind, dear reader (‘O Reader, where art thou?’), from the apparent conflict, if not inconsistency, between Cosmologies (as Cosmological models) being required to be SATisfiable, hence consistent, and the Universe being asserted, or possibly demonstrated, UNSATisfiable, and at least so far UNSATisfied, hence its lament.

The conclusion of the article, hopefully, answers the issue.

And the song of the universe is…

Making us wonder whether co-authors Keith Richards and Mick Jagger will someday be Nobelized in Physics, like Bob Dylan in Literature for his beautiful songs.

For “I can’t get no…o… Satisfactio….on!”, the chorus of their famous song, may turn out to be the lament of our universe, whose cosmological problem is… to be SATisfied.

Why that? Because SATisfaction, more precisely random SATisfaction (as when you don’t know whether to go surfing the sea waves or snowboarding the slopes), is proven in Complexity theory to compute Consistency. Wouldn’t the universe therefore be content to be consistent?

Actually, long ago, the universe was perhaps content before even having content; a sort of being before having, if you will. So, in the ‘Satisfiable Cosmological model’ initiated in works and papers already published or being submitted for publication, but mostly still to be published (perhaps even before the Big Crunch, if any), the expansion resulting from its primarily spatial, minimally constrained random variables appears to result in a behavior matching quite well what is observed, cf. below.

A word for our so far happy reader: random K-SATisfiability is a Constraint Satisfaction Problem technique which has proven efficient for modeling diverse Statistical Physics phenomena. It is here, more precisely than in our previous papers and other works, applied to Cosmology, cf. past and future Posts here or there.

Consistency of the Universe versus a ‘Human Singularity’

Theoretical issues

In section 2.5, “Implications”, of their cosmological survey, George Ellis and Henk van Elst emphasize the “key issue” of the general consistency of the constraints with the evolution equations of a “1+3 covariant”, General Relativity-grounded universe. More broadly, is the universe (logically) consistent? Then, after Kurt Gödel, how could it also be complete? And yet Quantum Mechanics was proven so, and so should a Quantum Gravity-integrated universe, e.g. from a path integral formulation such as the Hartle-Hawking model, be understood.

The completeness of the Copenhagen interpretation however comes at a price: Niels Bohr’s complementarity. Meanwhile, could the model be extended up to the impact of an observer of the universe (taken as the quantum system with its cosmological wave function) upon which a measurement is performed? This is scrutinized by John Barrow and Frank Tipler from chapter 7 of their masterwork onward, where they study this interpretation and extend it to the cosmos, along with others, such as Hugh Everett’s Many-worlds interpretation. There, the role of what we model as a “Human Singularity” is indirectly at stake when they discuss “the Final Observation by the Ultimate Observer”, who should stand “either at the final singularity or at future timelike infinity”. But would the universe then appear, currently so termed and then so terminated, consistent? That would require it to be de-terminated, which is for instance what the path integral might yield at the final time T of the formula for a Wave function, at this end where, say, Penrose’s U (Unitary, functional) versus R (Quantum Reduction) should be reconciled.

The question is indirectly considered by Jean-Yves Girard from a mathematical logic (consistency) perspective, referring to a “transcendence of the next axiom”. Again this seems to call for an impossible reconciliation: “a definition, whatsoever, of the next axiom contradicts the incompleteness”: indeed not only a final observation but also an unpredictable next axiom, i.e. one outside whatever wave function has accrued so far, is required. The natural idea is to choose at random, were randomness somehow definable.

Call it, otherwise, another clause in an unlimited series of constraints, which would not, at the edges, be limited to those of the “1+3 covariance”; and there comes the use of Constraint Satisfaction Problems (CSPs), and even of Computational Complexity’s well-known random CSPs. A particularly efficient reduction of these to random K-SATisfiability, commented on and formulated in previous and forthcoming posts and papers, was justified by Stephen Cook, later refined in particular by Fortnow, and then successfully applied by researchers, such as those quoted below, to a widening range of Physical problems.

In these approaches, random sets of N Boolean variables are constrained by random sets of M clauses (each linking K variables drawn from the pool of N) in order to consider whether consistency arises easily (SATisfiability regime) or not (UNSATisfiability). So, considering the issue of the consistency of the universe as a whole, whether entirely evolving at random and yet under a growing set of constraints and new axioms, or much more framed, could such an approach tell us something new about the universe currently observed and measured, with its cosmological parameters? In other words, is the universe consistent, and as such SATisfiable, including with its final observers so far… us?
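To make the setup concrete, here is a minimal sketch in Python (my own illustration, not code from the quoted papers) that generates random 3-CNF formulas over N variables and M clauses and estimates by brute force how often they are SATisfiable as the clause density α = M/N grows; the sharp drop around α ≈ 4.27 is the transition the following sections discuss.

```python
# A minimal sketch (illustration only): random 3-SAT instances and the
# fraction found SATisfiable as the clause density alpha = M/N grows.
import itertools
import random

def random_ksat(n_vars, n_clauses, k=3, rng=random):
    # Each clause draws k distinct variables; each literal is negated
    # with probability 1/2.  A clause is a list of (variable, negated?) pairs.
    return [[(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), k)]
            for _ in range(n_clauses)]

def satisfiable(formula, n_vars):
    # Brute-force check, exponential in n_vars: fine for toy sizes only.
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[v] != neg for v, neg in clause) for clause in formula):
            return True
    return False

N, TRIALS = 12, 100
for alpha in (2.0, 3.0, 4.0, 4.27, 5.0):   # 3-SAT threshold: alpha_c ~ 4.27
    M = int(alpha * N)
    hits = sum(satisfiable(random_ksat(N, M), N) for _ in range(TRIALS))
    print(f"alpha = {alpha:4.2f}   P(SAT) ~ {hits / TRIALS:.2f}")
```

At these toy sizes the transition is smeared out, but the trend from almost-always SATisfiable to almost-never is already visible.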

Observational hints

Diverse series of observations appear to have recently driven the scientific community toward the idea that the so-called ‘ΛCDM consensus’ model was not able, or at least not sufficient, to account for the observed and measured behaviors and parameters at its edges, meaning for instance the largest scales or the latest rhythm of expansion.

This is particularly the result of a consistent, determined effort by Adam Riess et al. to stick to observation, through increasingly precise measurements of H(z) over the last two decades, from which came their conclusion of an enduring discrepancy with the CMB-rooted, hence more ΛCDM-model-based, calculation of the then assumed ‘constant’ for Now, H0. The difference was then envisaged as a ‘late’ late acceleration, with H(z) apparently nearly constant since z ≈ 0.2, which would seem the kind of behavior of a universe already entering a de Sitter exponential expansion. However, this would at first only be expected when the energy density ratio Ω_Λ, whether sourced in a cosmological constant or in a sort of ‘dark energy’, comes closer to 1, if it ever does. Unless neither Λ nor dark energy is needed for a constant and consistent… ‘flatness’ (this seems to be the intuition of Roger Penrose, for instance as he summarizes some aspects of his Twistor theory at the end of his comprehensive survey of Physics). Hence the question: what might take up the role interpreted as a repulsive energy?

Another stream of observations and simulations has focused through recent decades on the evolution of the ‘cosmic voids’, with a series of conclusions about what currently appears to make up the majority of the volume of the universe. Still expanding, and in fact carrying the expansion of the universe as opposed to the rather contracting clusters, the largest voids build a sort of configuration of gigantic, quite spherical empty bubbles with flat cores and then a “sharp transition” to dense domain walls. These walls appear even “overdense” according to Hamaus et al. and Sutter et al. (who pinpoint the coincident development of large cosmic voids with the dark energy effect), so that the landscape looks like “separate universes”. All this might after all seem quite consistent with the scheme mentioned above (AIP # 1018).

Finally, the third and unfortunately still less publicly acclaimed stream of observations and simulations, but the one of primary interest here, has over recent decades related Computational Complexity, itself a fascinating scientific domain owing particularly to Alan Turing, to Condensed Matter and Statistical Physics.

This is particularly well summarized in a paper by Marc Mézard about the amazing power of the random K-SAT approach, which models the consistency of a typical combinatorial problem: that of (logically, or consistently) SATisfying a growing series of constraints, in this case clauses of a fixed dimensionality K, all linked through ANDs while, within each clause, the K variables are linked through ORs, in the case of a Conjunctive Normal Form (CNF), as opposed to the converse Disjunctive Normal Form (DNF).

The theoretical value traces back to the seminal paper of Stephen Cook, where the reducibility of Non-deterministic problems to DNF, and even 3-DNF (and later to 3-CNF, hence K = 3), was proven through two theorems. The practical value then arose with the observation [10] of an effect labeled a “sharper acceleration” at low redshift, hence in very recent cosmology, while a recent “sharper transition between core and boundaries” of cosmic voids is being analyzed.

Meanwhile, in Complexity/Statistical Physics, a ‘sharp transition’ was observed at certain thresholds of the ratio α = M/N of the number of M random clauses to the N random binary variables allocated to them: several papers have since modeled this behavior in terms of energy and entropy global cost or goal functions, interpreted it in terms of hypergraphs with clustering, sphere-jamming and condensation effects, and narrowed down the values α_d, α_c, α_s… where the transitions occur. This was mentioned in a previous post, together with the interest of looking at a ratio β = 1 - 1/α, which ranges within [0, 1] when α spans [1, +∞[: the interest of a threshold at β_d ≈ π/√18, fitting the Hales-Kepler ‘orange stacking’ threshold, was pinpointed in previous posts and communications.
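As a quick numerical aside (my own check, not taken from the quoted papers), mapping the Hales-Kepler packing density back through β = 1 - 1/α indeed lands close to the commonly reported 3-SAT clustering threshold:

```python
# beta = 1 - 1/alpha maps alpha in [1, +inf[ onto beta in [0, 1[.
import math

beta_d = math.pi / math.sqrt(18)   # Hales-Kepler sphere-packing density
alpha_d = 1 / (1 - beta_d)         # clause density mapped back from beta_d

print(f"beta_d  = pi/sqrt(18) ~ {beta_d:.4f}")     # ~ 0.7405
print(f"alpha_d = 1/(1-beta_d) ~ {alpha_d:.3f}")   # ~ 3.853, near the reported
                                                   # 3-SAT clustering value ~ 3.86
```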

The quoted authors conclude that there is ‘easy’ SATisfiability, meaning that the typical 3-CNF formula linking the M 3-clauses through ANDs with a random allocation of the N variables easily evaluates to ‘True’, when α is below the first threshold, then hard SATisfiability, and finally UNSATisfiability.

Now the question may have become, for the sake of an overall consistency: ‘How might this be of any relevance to the behavior of the universe as a whole?’ And to the Human phenomenon (as also emphasized by Barrow & Tipler)?

Satisfying a basic universe and more… up to the Human Singularity

Applying this to the universe itself appears to deliver a picture of a naturally flat universe, hence one requiring neither a cosmological constant nor dark energy, and in fact, at first glance, no more dark matter per se. Arguably the technical details, still to be published, will need to be shaken out (there are probably already countless such models in the air), but the resulting landscape would seem to evolve as observed, and particularly attuned to the very brief summary of observations above. A first test might be that the ‘equivalent’ Ω_Λ, in the years or decades to come, converges to the π/√18 sphere-packing threshold of the Hales-Kepler theorem.
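For scale, an illustrative comparison (mine; the Planck value is quoted from memory and should be checked against the source) of the proposed asymptote against the currently inferred Ω_Λ:

```python
import math

omega_lambda = 0.685                   # approx. Planck 2018 best-fit Omega_Lambda
threshold = math.pi / math.sqrt(18)    # ~ 0.7405, the proposed asymptote

print(f"proposed asymptote : {threshold:.4f}")
print(f"current inference  : {omega_lambda:.3f}")
print(f"gap to close       : {threshold - omega_lambda:+.3f}")
```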

Then what about its links with us, the ‘Homo Saps’, as summarized by Arthur C. Clarke when envisioning that the heirs of our good old transistors, once become Robots and possibly a sort of Terminators, send us to join ‘Dino’ in the dust-bins of (by then their) history? Well, assuming the gravitational universe proven consistent (hence satisfied, hence our proposed model) and accordingly a matter- and life-filled universe, even only locally, what are the odds that Humans be themselves consistent when they seem so often UNSATisfied?

This drives us back to the position of any observer, in the diverse (Copenhagen or Many-worlds, physical) senses considered by Barrow & Tipler, but also of any author of a ‘next axiom’, even if, in both cases, each in their local space-time: extending a cosmological SAT into matter and life seems natural, even promising, but the fascinating potential consequences come when it is pushed into the realm of Natural Language, and of Consciousness.

Conclusion

Anyway, this post is a poor and mere hint at what the full and detailed model requires, but its unfolding seems to require neither dark energy nor dark matter as specific components. It remains to be seen whether its quantifiable prediction is confirmed or refuted by incoming observations: the energy density ratios usually interpreted as dark energy or cosmological constant (Ω_Λ), or as dark + baryonic matter (Ω_M), becoming nearly constant, as strange as that may appear (technical content awaiting publication).

More widely, the question of a universe both consistent and complete remains, and we, Homo Saps, are so far the terms wherefrom it is de-terminate. Will some piece of logical quantum gravity satisfy us? Otherwise it will fall as another bit of quantum logic into the grave. But if it brings us SATisfaction, then its system of random, and then perhaps integral, paths should somehow reach us… the ‘Human Singularity’.

(The) Human Singularity

This is the main title of a new Book to be published (initially written in French, copyright © 2020, then in English © 2021), hopefully in 2022, about what makes the Human so special, hence whether those beings labeled “Homo Saps” by Arthur C. Clarke, aka ‘us’, will prevail over fast-improving Robots linked through networking Artificial Intelligence, the ‘RobAIs’. This is just starting to be a global issue of a magnitude comparable to, or even greater than, climate change, especially since a “Technological Singularity” has for decades been predicted, or even advocated, by a stream of computationalists and other gurus as the incoming point where Robots/AI might reach this specific Human power, self-consciousness, and even exceed it. Many books have been written about or around this issue, but apparently very few are interdisciplinary enough to address what the eventual match ‘Homo Sap versus RobAI’ involves, from Linguistics and Computational Complexity to Mathematics, Physics and Cosmology (with singularities and the Anthropic principle), not ignoring what the scrutinizers of our psyche and brains have come to tell us.

#Colonized

Colonization appears to be a hot topic these days, and this is good. Actually, one constructive purpose of this post is to contribute to global empathy toward this key issue by suggesting to the U.N. a worldwide day dedicated to the glorious colonizer.

Obviously that great day should start with the greatest colonizer ever, I mean the African ‘Homo sap’ (as shortened by A.C. Clarke), who colonized all the rest of the world; thanks to them!!

This is a colonization that will remain unmatched in history, at least unless RobAI comes in a few decades to conclude that, as Clarke says, it is time for us to join Dino, whether on their own, remotely isolated Jurassic island or in another, dedicated, anthropocenic zoo.

Of course, I might from time to time have some Neanderthal feelings, a bit frustrated by those Sapiens sapiens who came to invade our shores, slaughtered part of us, and possibly enslaved, ate or raped others. And yet I also see some benefits… let me explain.

France, as an example, has a proud history of being colonized over and over. The Paleolithic artists of, say, Lascaux and other places were, after the events summarized above, among those pre-pre-Indo-Europeans who survived the last ice age (together with a few deserving squirrels, as we know)…. (to be continued)

A research paper (as I recall) recently summarized, from anthropological studies, that these populations had moved to below the Garonne river (I’ll add: much later to become the ‘Vascons’, then the ‘Gascons’, well known as proud musketeers such as d’Artagnan), a region also known as Aquitania (though with more wine than water now).

The same study saw the populations between the Garonne and the Loire (much wider then) come from South-East Europe and the Neolithic Middle East, while the populations north of the Loire valley came primarily from the Neolithic east/north-east… much later as Celts, Gauls, … each wave eventually happily slaughtering and/or enslaving the people met along the journey.

The Gauls settled in northern Italy, saw Hannibal’s elephants cross the Alps and the Romans later cross them the other way, until, much later, Julius Caesar went to help them fight one another (ever willing to help). I’ll pass for now over his successful story with the Aedui and the Allobroges, and summarize the conclusion: the Aedui roughly told the other lazy Gallic tribes, “guys, if you are annoyed by nasty neighbours, just call our friend/ally Caesar and he will deal with the troublemaker.”

This Julius soon did with the German Ariovistus, who had crossed the Rhine westward (many peoples got into the tradition of coming there westward, especially the Germans, probably because coming eastward would mean people arriving from America, or otherwise a comeback of the Atlanteans).

Still, to make it short, Julius finally thought to satisfy every tribe by conquering all of Gaul and bringing it the famous “Roman peace”… should we complain? The Romans brought Latin and many monuments, techniques and an openness to the Roman world, so that the Gauls finally became strong Roman supporters and eventually emperors.

As kinds of stubborn colonizers, a few centuries later came again all sorts of Germanic tribes, some just crossing, as usual, others settling here or there, such as the Visigoths, and finally the Franks, from the north-east, while the Angles and Saxons were colonizing the British shores, themselves colonized a few centuries later by the Danes, who also came to France, finally settling in what was to become Normandy.

As you can guess, all this very briefly summarizes the history of the World as one of colonizers moving to shores just previously colonized by others, so that anyone complaining about having eventually been colonized simply forgets that he was himself a colonizer the day before.

Now, one of the colonizations that currently seems to keep some minds busy (why not, after all) is the story of the Spanish conquistadores, again westward, to what was to be known as the Americas (hence, to be continued under the next title, #Colon).

Innovation, Welfare and Democracy, Part II

This blog page follows https://journeau.net/2017/01/28/socialize-innovation-and-knowledge/ and Innovation, Welfare and Democracy, Part I, where we observe and discuss the societal conditions for the emergence of disruptive new knowledge, production and dissemination processes, able to overcome the natural inertia of established structures and streams.

The growth of welfare implies conditions, for which we seek a quasi-physical equivalent and a mathematical model, enabling enduring deviations from the “circular flow”, as expressed by Schumpeter, which may not only erase profits but even shrink the related (such as autarkic) economic welfare.

In this approach, the subsequent question of ‘sharing the cake’ is considered not as a preamble or even an axiom but as a variable, if not a subordinate aspect, since sharing a non-growing, or relatively or even absolutely shrinking, ‘cake’ may be seen as of little interest given the premises above. Let’s emphasize that in a global economy a local one may apparently grow although it shrinks relative to others, or may preserve its share of the general welfare by eating into its reserves or potential.

In the previous blog page we discussed Kurz’s [1] focus, in its 4th section, on “The ‘natural course of things’ vs. the ‘circular flow’”, presented as Schumpeter’s “counterfactual reasoning, because it contemplates an economic system in which there is no change whatsoever” while “…’new combinations’, innovations, continually invade the actual economy and make the system grow and undergo structural transformations.”

And yet the “natural course” appears not so natural, not so easy, because societies, in their quite natural quest for optimal complexity, power and welfare, tend to accumulate unnecessary structure, which then ‘naturally’ resists change, so that innovations only effectively grow and disseminate in the rare, most agile societies, if any; only from there do they acquire the power to disturb the less adaptive ones and make them undergo their “structural transformations”. The value generated may be low, since Kurz adds that “in the circular flow, Schumpeter contended, there will be neither profits nor interest”.

Profits and interest, as traces of the disruption in the flow, may then imply what Kurz concludes about “Schumpeter between Walras and Marx”, after Mas-Colell, i.e. that “the relationship between the two ratios [wage to rate-of-profits versus labor/capital] can have any shape whatsoever”.

The second article on which we focus here, by Egidi [2], leaps, with Schumpeter and through Langlois, from the need of an economy to innovate, cf. above, to the role of the entrepreneur, and from there to the characterization of the knowledge, or even cognition, and of the other conditions and factors enabling a sufficient entrepreneurial innovation process.

Egidi scrutinizes the implications drawn by Schumpeter and other authors about the individual role of the entrepreneur and the cognitive process, which are then considered from the perspective of the generalization, otherwise called socialization, of cognition, with the limits of ‘conscious rationality’ versus advertising and persuasion, in or onto Society.

The scheme discussed here seeks to go one step further, i.e. from Egidi’s analysis toward a more structured and potentially predictive, hence refutable, model of such cognitive behavior.

It will go through three parts: a summary of the need for better models of cognition built from its “components”, with a more precise model of this ‘conscious rationality’; a presentation of the model; and a general conclusion about next steps and about the consequences for democracy.

Our model, in progress, shows and quantifies the gap with the “‘slow growth’ law”, about which [3] cites Bennett [4] to claim that “an evolutionary system T(t) cannot have its logical depth LD(T(t)) that grows suddenly”. Our model of fast growth depicts the entrepreneur’s impact, while slow growth applies to societies, although obviously at a rate that accelerates in this early 21st century as compared to previous decades.

This model of the extent of the complexity [5] leap involved here, as compared to other definitions of complexity, allows a comprehensive complexity ladder, on which the relative complexity depths of individuals versus societies might be compared.

We will, as an example, quantify the relative complexities of Simon’s chess and other game players, invoked by Egidi, as compared to Deacon’s “teleodynamic systems”, where he emphasizes the “higher order intrinsic constraint (that) prevents the disruption of the synergy between the component morphodynamic processes that determines its unity” [6], and to diverse attempts at formalizing intuition, from its role at the roots of some constructivism in mathematics to Goldwasser et al.’s [7] observation that “each formalization (..) cannot entirely capture our original and intuitive notions, exactly because they are intuitive”, made when introducing the model of Interactive Proof-Systems, in more than one way seminal to ours.

Egidi’s analysis of “the Cognitive roots of Schumpeter’s picture of democracy”

Egidi starts by recalling some of Langlois’ comments on Schumpeter about the conditions of the entrepreneurial leap out of the “circular flow”, where “there is nothing fundamentally new” but with “the impossibility of surveying exhaustively all the effects and counter-effects of the projected enterprise”. The importance of this last remark will be emphasized below, as it means that neither competitors, who may want to kill the enterprise in the bud, nor even provisional supporters, ranging from shareholders to the public environment, will have this capacity, as opposed to the Entrepreneur, however single he or she may be, to an extent that remains to be proven and/or, better, quantified.

We will have to conclude that, in effect, both of them are not only unable to “survey” but also that, as a result, they turn out to be the main threat to the survival and success of innovative ventures, an observation that has led some Societies to allow protections, as opposed to some others.

His best extract of Langlois on Schumpeter follows, with the observation that “success… depends upon intuition, the capacity of seeing things in a way which afterwards proves to be true, even though it cannot be established at the moment, and grasping the essential fact, discarding the unessential, even though one can give no account of the principles by which this is done” (Schumpeter (1934) 85) [2].

Let’s observe that this is very close to some kinds of scientific process, although the latter aims at delivering a predictive model, scientifically valid à la Popper inasmuch as refutable.

We follow Egidi in his next paragraphs, dedicated to scrutinizing further, after Schumpeter, this capacity and process, so important as to enable the Entrepreneur to make the right decisions that much more powerful groups, such as established firms, may not.

Because of these apparently ex-post rational entrepreneurial decisions, Schumpeter seems to widen the concept of rationality, although Egidi points to the concepts of “creative response” and “discovery” where, to account for what is “no longer a process of optimization”, since disruptive, he introduces Herbert Simon and the analogy to the “chess playing activity”, where “the winning strategy, which already exists, is not practically computable.”

Our provisional conclusion is that this computability issue indeed characterizes and underlies the ladder between the entrepreneurial “teleodynamic” quest and a societal ‘slower growth’, and yet that the intractability of chess playing is still nothing compared to the quantity at stake here, to be scrutinized in the next blog pages on this question.


[1] H. D. Kurz, Is there a “Ricardian Vice”? And what is its relationship with economic policy ad“vice”?, J Evol Econ, 2017; 27(1): 91–114. Published online 2016 Jul 9. doi: 10.1007/s00191-016-0468-2. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5253719/

[2] M. Egidi, Schumpeter’s picture of economic and political institutions in the light of a cognitive approach to human behavior, J Evol Econ, 2017; 27(1): 139–159. Published online 2015 Sep 4. doi: 10.1007/s00191-015-0421-9. www.ncbi.nlm.nih.gov/pmc/articles/PMC5253716

[3] J-P. Delahaye & C. Vidal, Organized Complexity: is Big History a Big Computation?, https://arxiv.org/abs/1609.07111

[4] C. H. Bennett, Logical Depth and Physical Complexity, in The Universal Turing Machine: A Half-Century Survey, edited by R. Herken, 227–257, Oxford University Press, 1988. http://www.research.ibm.com/people/b/bennetc/UTMX.pdf

[5] P. Journeau, Comprehensive Complex Universe

[6] T. Deacon & S. Koutroufinis, Complexity and Dynamical Depth, www.mdpi.com/2078-2489/5/3/404/pdf

[7] S. Goldwasser, S. Micali & C. Rackoff, The Knowledge Complexity of Interactive Proof-Systems, ACM, 1985. https://groups.csail.mit.edu/cis/pubs/shafi/1985-stoc.pdf


Innovation, Welfare and Democracy, Part I

This page follows the previous blog page https://journeau.net/2017/01/28/socialize-innovation-and-knowledge/, which observes that the recent innovations that in effect most disseminate, hence socialize, knowledge, whether for-profit or not-for-profit (e.g. Wikipedia), but all for free or nearly so, in effect grew in the places reputedly most capitalist.

Revisiting the literature on innovation, its consequences for welfare growth but also its constraints, the next series of pages will summarize some aspects of our purported thesis and attempted model, a description of the organizational (hence also societal, political, cultural) conditions allowing the emergence and success of innovation, or conversely hampering it and treating it as a threat.

This last occurrence is more frequent than commonly believed, and even than the standard model suggests, explaining why innovation has so few chances to reach its potential markets. The effectively most “anti-innovation” societies, in spite of heavily preaching support for its cause, then end in the impoverishment of their citizens, members or shareholders.

This somehow naturally derives from Schumpeter’s “Creative Destruction”, which focuses on its benefits, as opposed to the previous focus on its drawbacks by Marx, discoverer of a distinct but arguably comparable underlying process when wondering: “how does the bourgeoisie get over these crises? On the one hand by enforced destruction of a mass of productive forces; on the other, by the conquest of new markets, and by the more thorough exploitation of the old ones. That is to say, by paving the way for more extensive and more destructive crises, and by diminishing the means whereby crises are prevented”. [3]

Obviously the farm tractor destroyed millions of manual harvesters’ jobs, the computer destroyed as many jobs of secretaries and accounting clerks, and the Internet brings a swarm of innovations still both destroying and creating jobs; think of the reactions to Uber and Airbnb, for instance. But this was less “by the conquest of new markets” than by innovative, more efficient, new “productive forces”, which of course then captured the markets.

However, there is still a lack of understanding and of predictive modeling about the type of society enabling innovation, hence where the new added-value jobs will be concentrated, as opposed to those where traditional jobs will ultimately also be lost but without, or with much less of, the gain of numerous highly paid jobs, typically driven to where innovation is sourced.

In the next pages of this blog series we will focus on the state of the art, where it is useful to highlight as a starter a most recent publication about Schumpeter and Schumpeterians on Economic Policy Issues [1].

A summary about the value of innovation is extracted from its article by Kurz [2], who revisits a critique of Ricardo by Schumpeter by recalling “Ricardo’s ‘fundamental law of distribution’, according to which a rise in the real wage rate implies a fall in the general rate of profits, given the system of production in use”, and then “the circular flow”, with “no technical progress”, recalling that “rents would rise and the general rate of profits [but possibly also the real wage rate] would tend to fall.”

This helps understand the distance between the quite pessimistic conclusion, and even misinterpretation, of 19th-century Marx and his disciples, as opposed to 20th-century Schumpeter and his followers: with a limited rhythm of innovation from the applied sciences of their time, the former primarily observed the strategies of capitalists to prevent overproduction, born from competition and its rush to the lowest marginal cost, through the destruction of excess production when oligopolistic arrangements were not possible or not even legal.

The latter got much more opportunity and historical depth to observe that, beyond the adjustments within the circular flows, on which the former were focusing, hence possibly inclined toward monopolies, there was also, and more importantly, in each creation of a novel production organization a better “system of production”, whatever its improved components, from organizational ones to new technologies from other scientific fields.


The picture, opposite to Marx’s, comes as follows: depending on the rhythm of innovation and its growth rate, the ‘winners’ are the new categories, people and countries or regions that most innovate and disseminate, and that therefore precisely undercut their insufficiently efficient, typically monopolistic, or most often oligopolistic or protected, markets.

Most importantly, the established “circular flow” owners, which may count those so-called ‘capitalists’ or ‘bourgeois’, but also, and much more, state-owned production systems, will fight innovation, considered as a threat, or at least need to control it and compel its impact on their markets to fit their existing organization. They will obviously, by doing so, hamper it and only contribute to preventing the society where they have such capacity, and possibly the world, from benefiting from the innovation.

While this is for the most part well known, or should be, it is useful to recall and emphasize it in this introduction to our goals, which are the modeling of the societal, political and organizational conditions, if any, under which innovation, especially the so-called ‘disruptive’, that is to say most ‘creative-destructive’, kind, will still be allowed to emerge.

A corollary might come back to the Marxian conclusion about the generalized, asymptotic decrease of the rate of profits. This kind of ‘heat death’ might derive from innovation becoming treated as a threat at the global level; otherwise the decrease rather falls on wages, in those places and those categories of people and societies not yet, or no longer, allowed or endowed or incited to contribute, by leaping, to the overwhelming innovative civilization.

One critical point for Schumpeter was the capacity, characteristics and environment of the entrepreneur. These are important, and at stake in our next pages, but they must be relativized to the cultural, political and regulatory environment where they may, or may not, operate.

[1] http://unice.fr/laboratoires/gredeg/contenus-riches/actualites/seminaires/numero-special-du-journal-of-evolutionary-economics, Vol.27, 2017

[2] H.D. Kurz, Is there a Ricardian Vice?, Infra

Socialize Innovation and Knowledge

“Maloof chuckled. ‘We also pursue glorious goals, such as profit, survival, and the sheer joy of wringing revenue from parsimonious passengers’.”
           Jack Vance, Ports of Call, © 1998, Tom Doherty Associates, NY

We seek sponsors for a thesis about the causes and conditions of the emergence and dissemination of Innovation (hence new knowledge) and of Knowledge generally speaking, which we call their socialization, although the phenomenon does not appear to occur more efficiently, in fact apparently rather the contrary, in societies organized around the concept (for instance universalized as ‘socialist’, although this may not solely or entirely coincide with the political entities so labeled).

In some way this relates to, and somehow generalizes to a societal and political extent, a previous doctoral thesis about the interactions between organizational and market structures [1] at firm level, which used major references such as Chandler’s seminal “Strategy and Structure” as well as the latest discoveries in industrial economics and strategy at the time.

The new work in progress, presented hereunder, uses the diversified material accumulated since then, more recent works in industrial economics, ‘dynamics’ and strategies, and a potentially applicable predictive model, still to be refined.

Modeling the conditions to reflect enabling as opposed to disabling environments

This question has been introduced in previous blog pages as the typical epistemological one, with for instance a remark of Kuhn’s [2] about the role of Europe in the modern era, yet with roots probably as far back as the middle-age inception of universities, and perhaps even more ancient, in the political conditions of universal access to education.

It is here fed by the observation that the entities having in effect most disseminated knowledge and innovation throughout recent decades are American, to some extent even Californian, whether for-profit or not-for-profit, but with similar processes: switched to the users’ time as the measurement unit and rooted in highly reactive, experience-based mechanisms.

Considerably more have tried, and many have gone far enough, but it has only quite recently appeared to observers that nearly all complete successes were born there and not elsewhere, leading to the obvious question: why? And then: how?

As an example, recent articles worry that Europe, and more specifically France [3], seem to lag behind the United States (U.S.), and that diverse and powerful firms, activities and sectors so far ‘at rest’ now feel threatened by innovative entities such as the “GAFAs” [4], to which are added an Airbnb and an Uber (whence even a concept of ‘Uberization’) and possible revolutions in the Entertainment sector, where the U.S. already and again set the pace.

Arguably everyone wants better-shared wealth, shown to be possible firstly through better disseminated and effective operational knowledge; hence a postulate that has led, in some countries more than others, to the conclusion that this would require more ‘res publica’, which in turn may require more ‘public economy’, finally translated into a more ‘state-owned’, hence state-controlled, economy, as opposed to the type of ‘public’ ownership resulting from scattered capitalism through the most diverse intermediating processes, where needed.

The thesis uses cases from several epochs [5] and countries, although it goes more in depth into the cultural, and as a result regulatory, roots of a critical, most recent difference between France (and to some extent Europe) and the U.S., even though the diverse sets of policies and tools are studied for each case through the complexity of their consequences.

Apart from its expected value for general political economy, one purpose of the thesis is to provide mechanisms, and even predictive tools, for decision makers on how to organize knowledge dissemination and innovation entities, including disruptive ones, given the critical role, emphasized in the thesis, that they play in the advancement of society and welfare.

Pieces of State of the Art

A Bostonian researcher [6] had concluded, circa 2000, that the depth of the national venture capital industry, and more precisely the average invested amount, played the leading role in this American leadership; a conclusion that seems correlated to the size and results of Silicon Valley venture capital. Several countries, such as France, have reacted by boosting their national ‘risk’ funding capacity and have focused on measuring the size and progress of fund allocation as compared to other countries.

The present thesis is skeptical about such a policy alone and will hopefully be found useful by future political decision makers, there or elsewhere and wherever they come from. In fact, recent observations from the civil servants in charge appear to concur with this analysis, as do papers such as [9], but the goal here is to model and measure the impact of all kinds of factors and actors, as the only way to conclude about the extent and sign of their contribution, if any.

Cases used in the thesis are mostly drawn from France versus the U.S., because this is where most of the experimental material was accumulated, but the model is believed applicable to any Society, for which it has the ambition to deliver predictive power. A recent Industrial Policy research paper [7] observes that the French manufacturing share of GNP is the lowest in the European Union (14 countries, apart from Luxembourg and before the extension eastward), together with the lowest rate of robotic equipment.

This combination of the lowest equipment rate and, conversely, the highest construction level seems contradictory with a strong public discourse and action toward innovation; it is already much debated, but as far as we know not so much modeled.

The extent to which this failure is linked to a higher fraction of ‘public’ economy, with ‘public’ here taken in the sense of state-owned, or directly or indirectly controlled or framed or supported [8], is considered and integrated within the proposed model. This model focuses on the knowledge economy and knowledge disruption, but the paper quoted above [7], which scrutinizes manufacturing economies and their policies, already distinguishes several drawbacks that we also observe. In fact, the contrary would be surprising, since the knowledge of yesterday, applied to innovation today, may result in the better, new and more efficiently equipped factories of tomorrow. Our analysis, and as a result our model, however, considers its directly counterproductive (from biased to discriminatory, detrimental and even damaging) aspects as only one piece of a wider organizational, cumulatively regulatory and even cultural problem.

Another correlation comes from economic agents, such as households, deciding to discard innovative entrepreneurship and prefer other kinds of investments, including non-innovative entrepreneurship. The model concludes that this is less a cause than a consequence of the diversified, but quantifiable, stronger ‘reaction’ of a fiercely opposing environment, more than proportionate to the mass of displaced liquidity (such as the amount of Venture Capital), in the manner of Archimedes if the phenomenon were so simple, i.e. were transitions to crystallizing and otherwise gaseous phases and objects not so prevalent.

Obviously liquidity rivers matter, but the economy is only as fluid as its ability to let all kinds of resources, capital as well as human/knowledge, easily and swiftly flow and transit from one structure to new ones, as opposed to rigid contexts of high viscosity and crystallized blocks, whether regulatory and/or contractually and/or culturally and socially constrained.

For the (growing) portion of central interest to us, i.e. the knowledge economy with its mostly immaterial value, traditional industrial economics and even dynamics may no longer apply, or apply less, requiring new models. Adaptation is more difficult for bigger and more complex structures and some kinds of matrices; hence a possibly fierce opposition between a ‘Public’ discourse praising innovation, progress and shared knowledge, and a public practice that rather forbids or even kills them through its inertia and set of constraints, through which no disrupter may survive.

Stories

Cases alone obviously don’t make a thesis, although their generality appears symptomatic of the disease. About the GAFAs and the wider Californian range, with for instance the SpaceX, Tesla, Airbnb, Uber, etc. disrupters mentioned above: a two-day session in Brussels a few years ago drew a dramatic contrast between about 2 European digital champions on one’s left and about 20 US ones on the right, with the resolution, presented by an E.U. director, that billions appropriately scattered should result in a better balance. The question however came to our mind: is it solely, or even mainly, a matter of funding size, or even, presumably, of better combined funding and timing?

In France, among the cases experienced, a bunch of known candidates, including our own case, have gone the rounds of funding competitions; we detail, among others, the case endured by ‘Miloc’, warned that a sufficient entanglement of public endorsement and strong strategic alliances was a must to hope to become one of the happy few winners… a status it finally did not reach, driving it back to ‘classical’ Venture Capital entities, which then told the team that the net of strategic alliances tying them up would have to be dismantled as a starter.

A recent survey and research paper [9], comparing the results of ‘Independent Venture Capital’ (IVC) with those of more ‘Corporate’ VC, including more Public (or state-owned) VC, and bringing some correlation with the different U.S. versus European mixes, corroborates the picture, and contributes to a painting, and tentative modeling, of more independent and fluid, or less structured, societies versus more intricate, ‘clusterized’ and integrated ones, which our model precisely investigates.

It seems that the U.S. champions have been able, and allowed, to focus on simple, easy, most widely disseminated and least entangled and controlled disruptive products or services, while the old world, mostly because of the organizational and societal brakes scrutinized and modeled in the thesis, up to a ‘geometrical’ model in progress, was rather busy tying knots, presumably intended for the best but in effect rather binding, and efficiently destroying potential global stars.

Hints at Industrial Dynamics predictive modeling

Industrial Dynamics, quite recently grown out of Industrial Economics, or reborn since the 90s [10], suggests images of flows, viscosity and fluid dynamics, together with its already customary elasticity, and the concepts of tissue and structure deformations and ruptures borrowed from physics. A recent article [11] prompts figures of what one may see as black-hole, or at least giant-star, kinds of gravitational impacts, when observing the close-to-duopolistic market share acquired by Google-Facebook on the French, and probably other similar, advertising markets. Such intuitions inspire directions toward the quantitative modeling of the deformations of industrial economies and of their propagation, from single-firm, disruption-sourced effects to sector and finally multi-sector waves and global economy shaking.

Modeling and testing

Many models have, until recently, represented classical industries ever better; these are now however impacted by more technology-based or even purely high-tech players breaking barriers. We focus on the kind of digital-based disruptions allowing the rocketing growth of firms such as those listed above. We have come to issue a currently still provisional and, by far, underdeveloped quantitative model that needs to be significantly refined and tested internally first, i.e. in a mostly retro-dictive fashion, and then on external, real cases, on which to test predictions of economic and, for that matter, long-term financial consequences. The displacement of the dominant value base and unit, with its related complexity and complexity density and their linked anthropology, in a comprehensive economy, shakes the dominant measurement axes and explains why even dynamical structures, apparently adaptive in their environment, don’t survive the decisive and disruptive, knowledge-based pressure from neighboring environments. Hence the dismay of regulators, all the more so as they find themselves entangled both downward and widely into the economic landscape.

Conclusion

Our conclusion, tested through a diversity of experiments, including some of our own, and apparently consistent with recent papers, is that, and how much, the explanation by the size of the available Venture Capital is far from sufficient, although indeed a part of the comprehensive mechanism through which Societies will lead or follow innovation, and therefore drive global welfare, or conversely see their wealth and impact on the world decrease, whether slowly or rapidly, depending on factors scrutinized and envisioned to be integrated in our model.

To the question “how is innovation possible?” the amazing answer is that it is not, in the sense that it is not allowed to disturb the existing market structure, which would otherwise, and will, have to adapt to it; and that the most thought-through and eventually generous Keynesian mechanisms, served by most admirable people with the best will to be helpful and to produce the so much praised and so desired champions, in effect kill them most surely, while the presumably most awful, Schumpeterian and egoistic jungle finally lets at least some of these ‘disrupters’ survive, those through which the economy, particularly the most innovative, flexible, even fluid and therefore knowledge-based, leaps into the Future.

One of the key factors comes as follows: while the jungle economy may rely on bets, acknowledged as all the more successful when the disruption eventually proves itself fast, hard, and in unexpected ways and places, the planned economy by definition needs to assess any potential impacts more rationally, and moreover as a preamble, from which, then and therefore, they may never be authorized. To the extent that the relative powers of the NP versus P complexity classes may here be applied as a measure of speed, the environmental ‘oracles’ will predict the efficiency gap of the related economies.

Some economic thermodynamics may contribute to explaining surges of knowledge where and when flows come with heterogeneities and with ups and downs of revenues, fortune and structure, rather than a deadly erasure, but with, as yet, a difficult understanding of what causes what; hence the ever-questioned right political and regulatory ‘mix’.

It is therefore a major goal of the thesis to deliver an operational model, verifiable through a wealth of available data and hoped to be especially useful for current issues in Europe and in France. At a more general level, let’s recall that this is meant to relate to the ‘Comprehensive Universe’ project, for which the question of the dissemination and growth of (a) civilization, or the contrary, and of its conditions, matters. One of the side aspects is the growing question of the uniqueness of our (human) civilization in the universe, or of its probability, to the extent that this concept might apply.

Current status and call for sponsors

At this time, a composite experimental material has been gathered, with as much focus as possible on cases other than our own, to minimize the risk of a subjective perspective. The predictive model does not yet much exceed the State of the Art, still under investigation. Its next step, expected to bring a more useful leap, should start to compute the propagation of the deformation of some existing economic structure (amplitude, geometry and speed), but the target is obviously to extend this to the direction and speed of the impacts on neighboring and on remote sectors.

Independence Time and Universality of Access

Last but not least, what is at stake is too important to be concealed or restricted by or to a specific private entity, when our whole plea, on the contrary, is geared toward knowledge dissemination and the widest sharing. Hence the call for pure sponsors, even though we don’t at all despise for-profit organizations; but Independence is, today and for much longer, compelling for our Comprehensive Universe purpose.

Therefore experimental, ‘local’ activities or sectors or business ‘fields’, upon which we plan to be able to test the model in due time, are welcome. Obviously this ‘local’ is becoming only infinitesimally spatial in the knowledge economy.

Liquefying versus clustering

The goal of “wringing revenue” to make a profit is not much advertised in our days, when “survival” is, for disrupters, the horizon of all their hopes upon discovering that their innovation, a vector of tension toward economic incumbency and much wider inertia, and of reactions from the overall system of forces, will hardly make it as far as market independence.

The question of the causes of innovation is therefore about the model that may minimize its death rate, hence the eventually considerable weight, varying with parameters and far exceeding classical transaction costs, borne by this combination of forces, knowing however that it reflects a cumulated societal and political reality, not the contrary.

[1] P. Journeau, Interdépendances entre structures de marché et structures organisationnelles. CRG Polytechnique. 1984

[2] T. S. Kuhn, The Structure of Scientific Revolutions, MIT, 1962

[3] Diverse newspapers to be specified, January 2017, about figures for 2016 in trade deficit, unemployment, manufacturing, GNP growth.

[4] Google, Apple, Facebook and Amazon, a set of firms having most drastically shaken practices at global scale within one or a few decades

[5] P. Lévêque, Empire et Barbaries, Augé, Gillon, Hollier-Larousse, Moreaux et Cie, 1968. Excerpt: “provincials only with difficulty cumulate reimbursement of debt with payment of taxes. Profits of Roman bankers are so scandalous that a breaking point is reached: a crash of unprecedented magnitude marks the end of the Republic.”

[6] Presented in 2002 at the Capitole, Washington DC; exact reference to be recovered: US venture capital about tenfold that of a major European country for a similar issue, quite remarkably observed a posteriori in at least one case (microalgae-based biofuels).

[7] P-A. Buiges, Quelle politique industrielle pour éviter le décrochage industriel ? GREDEG WP No. 2016-35 http://www.gredeg.cnrs.fr/working-papers.html

[8] ‘Supported’ entities would for instance benefit from some kind of ‘public’ appreciation reaching a level of practical market distortion.

[9] D. Dufour, E. Nasica & D. Torre, Clusters et efficacité du capital-risque: une analyse des stratégies différenciées des fonds indépendants et des fonds industriels, GREDEG Working Paper No. 2016–33

[10] J. Krafft, Industrial Dynamics, in W. Lazonick (ed.), IEBM Handbook of Economics, Thomson Learning, pp. 187–194, 2002.

[11] Le Figaro / B. Ferran / A. Fontelli, Google et Facebook vampirisent le marché publicitaire en ligne, 26–27 January 2017.

Organizing and financing REVUER

The global project of interactive and predictive interdisciplinary mapping of Research, known as REVUER (cf. www.revuer.org and previously introduced in this blog, cf. Discover, Predict, Reproduce… Science, On the REVUER project, toward a ‘General Predictability’ and Know the Future), requires a well-crafted organization and financing, as shown by the reactions of mostly US and European Research executives so far.

REVUER’s mapping of Research, from projects to results, stems from its visualizing power for users ranging from researchers themselves to their ‘funders’ and to potential markets.

From there we plan to exploit at REVUER novel, and even still mostly in-progress, research field representation models to show and/or compute field trends and trajectories toward future results, effects or even technologies.

The ultimate goal of i/REVUER’s “General Predictability” potential capability, naturally deriving from the long-term targeted, effective interdisciplinary and integrative mapping framework, once projected into effectively interacting models, as summarized in other blog pages, will come into perspective once and where sufficiently dense, evolving and complementary fields are already mapped and tested.

REVUER comes amid an evolution of research communication and practices meant to make them more efficiently shared, starting with scientific data, as led by the Research Data Alliance (RDA), cf. https://www.rd-alliance.org, and moving toward easier access to publications.

The risk of a non-optimal organizational framework for such endeavors has resulted in RDA’s initial 3 rules: ‘non-governmental’, ‘non-profit’ and ‘non-disciplinary’, while in science fiction it had driven I. Asimov to the choice of 2 Foundations, at the two edges of the galaxy.

Without going that far, the fact that the intersection of the three spaces defined by the 3 rules is close to empty has led REVUER to the organizational choices below.

Multi-regional and even national organization focused on independence

The REVUER consortium is driven by Re-Vuer Co, a US, MD-based entity, and is meant to gather users, teams and groups, as well as potential partners from the scientific world, i.e. from universities, institutes, agencies and foundations to publishers and networks.

Discussions have started with executives from some major universities and institutes in the US and in Europe, and to some extent also in Asia. From there, and from previous organizational studies, we have concluded that, once the operational conditions are gathered, REVUER should become based in each region of the world, and in some cases even in each country, with their own hubs and licenses.

Strict independence is compulsory for REVUER, for reasons too long and cumbersome to detail here beyond its demanding, consistent and global experimental protocol.

The ‘non-profit’ issue is then taken less as a constraint than as a guideline, since the conditions above require the Consortium to ensure sustainable revenues in order to warrant independence.

Hence the rules summarized below, to be further detailed in future blog pages.

More about universality, hence autonomy, for the sake of the epistemological protocol

Recall that our foremost scientific, or more precisely epistemological, experimental goal requires that the real dynamics of research fields be reflected as faithfully as possible, with limited “governmental”, “for-profit” and “disciplinary” distortions, while still enabling and acknowledging the role of each of these actor types: research funding agencies, foundations, publishers, universities and all kinds of domain and field subsets.

This sufficiently summarizes REVUER’s operational, or equivalently “business”, model, which enables its users to define and parameterize extensively what they want to share and communicate most efficiently, to whom and how, through the common display screenplays for which n/REVUER is going to replace ex/REVUER. A minimal sketch of such a sharing policy follows.
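As a purely hypothetical illustration of that user-side parameterization, the sketch below models a sharing policy as a small data structure: what to expose and to which audiences. None of these names, fields or audience categories come from REVUER documentation; they are assumptions for the example.

```python
# Hypothetical sketch of a user-defined sharing policy; field names
# and audience categories are assumptions, not REVUER's actual API.
from dataclasses import dataclass, field
from enum import Enum

class Audience(Enum):
    RESEARCHERS = "researchers"
    FUNDERS = "funders"
    PUBLISHERS = "publishers"
    PUBLIC = "public"

@dataclass
class SharingPolicy:
    owner: str
    shared_items: set[str] = field(default_factory=set)   # e.g. {"project", "results"}
    audiences: set[Audience] = field(default_factory=set)

    def allows(self, item: str, audience: Audience) -> bool:
        """True if this item may be shown to this audience."""
        return item in self.shared_items and audience in self.audiences

if __name__ == "__main__":
    policy = SharingPolicy(
        owner="researcher-42",
        shared_items={"project", "results"},
        audiences={Audience.RESEARCHERS, Audience.FUNDERS},
    )
    print(policy.allows("results", Audience.FUNDERS))   # True
    print(policy.allows("results", Audience.PUBLIC))    # False
```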

The related commitment to users is that REVUER shares no interest with any specific business and is dedicated entirely to its users, primarily researchers, whose goals and value it focuses on and cares to promote.

Observe, however, that i) non-profitability alone is neither the entire solution nor enough of a warrant for users, and ii) interactions with institutions and with actors such as publishers, scientific societies and networks are needed: here the retained solution is to involve, enable and let users and researchers choose from all.

Intellectual Property (IP) management

REVUER users are invited to cite content sources appropriately, unless these are their own and may remain anonymous. Conversely, REVUER will bring a growing number of novel ways to interweave hierarchies of contents, goals and trajectories with oriented maps and paths.

We expect that this will help preserve, identify, optimize and feed back IP rights, then much better acknowledged, categorized and related, but above all that it will facilitate users’ identification of potential links through which they could further develop their IP.

Moreover, current IP is not directly involved by REVUER, since its metadata, mostly quantitative, do not tell how an experiment, process or even model delivers the ‘critical’ effect or measure that it displays.

Both free and subscription based 

The mechanism ensuring the independence/focus couple, required to activate, maintain and preserve so ambitious and long-term an experimental protocol, combines free access to first-level results with low-cost subscriptions for more sophisticated computations, as in the toy tiering sketched below.
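For illustration only, the toy function below gates a requested operation by tier; the tier names, the operation catalogue and the rule that only first-level results are free are assumptions made for the example, not REVUER pricing.

```python
# Toy access-tier check; tier names, operations and rules are
# illustrative assumptions, not REVUER's actual pricing model.
from enum import Enum

class Tier(Enum):
    FREE = 0          # first-level results only
    SUBSCRIBER = 1    # sophisticated computations included

# Hypothetical catalogue of operations and the tier each requires.
REQUIRED_TIER = {
    "view_field_map": Tier.FREE,
    "basic_search": Tier.FREE,
    "trend_trajectory": Tier.SUBSCRIBER,
    "interdisciplinary_simulation": Tier.SUBSCRIBER,
}

def can_run(operation: str, user_tier: Tier) -> bool:
    """True if the user's tier covers the requested operation."""
    needed = REQUIRED_TIER.get(operation)
    return needed is not None and user_tier.value >= needed.value

if __name__ == "__main__":
    print(can_run("view_field_map", Tier.FREE))          # True
    print(can_run("trend_trajectory", Tier.FREE))        # False
    print(can_run("trend_trajectory", Tier.SUBSCRIBER))  # True
```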

Again, our commitment to users and to external partners combines openness, equality, no or low costs, extended flexibility and free parameterizing, so that researchers and related entities may all get as much as possible of what they need to maximize exposure for what, and to whom, they want to see impacts on research fields and their evolution.

Conclusion for Value Organization and Financing cycle

The goal is clear: invite researchers, starting as early as their student years, to try, taste and like a novel, postulated most efficient manner to make their projects and results easily grasped and positioned by anyone potentially, directly or indirectly, impacted.

One of them once summarized his viewpoint as “what’s in it for me”, although younger ones more rapidly embraced the potential, at conferences on both the Orlando and San Diego edges of the US, with an “I could pay to use it”.

As a reminder, the main value offered to them is a kind of marketing benchmark, under their full control, through which they could expect to gain wide traction and hopefully maximize funding and partnering opportunities, as well as impact factor for themselves and for the journals to which these maps will direct all kinds of watchers and potential readers.

So this is it: a future framework should enable all of them to join at will and, assuming they effectively contribute, earn credits from which they can obtain shares of the national or regional place, or some specific added value from those rather eager to monetize their contributed pieces, or from funders’ portfolios and publishers’ and other providers’ products or services.

It is planned that these shares be transferable over time to the institutions with a proven track record of carrying and developing knowledge across centuries: the universities, whether through donations, acquisition by their foundations or other means.

This does not fully solve the initial financing, although sponsoring and donations are anticipated, but it at least explains the exit strategy and commitment, and paves the way to the sustainable and independent model required for the successive phases of 1) dissemination and sharing of the simplest paths, 2) increasing visualization and simulation feedback where and when the fields’ contents and dynamics allow it, and 3) growing interdisciplinary experimentation where and when fields of fields allow it.

Conclusion

This organization should fit the goal of motivating users, to begin with researchers, well empowered to maximize benefits and minimize risks, burden and costs, the latter kept low even compared to usual journal and conference fees, or even at zero.

It even appears optimal, and therefore to command the schedule of next steps and phases for the next decade, but it remains today very much open to critique and suggestions toward the purpose of gathering, organizing, sharing and then integrating the kind of metadata from which efficient, hence widely disseminated, knowledge mapping and predictions are tested.
Discover, Predict, Reproduce… Science

Science is often seen as the locus of Discovery, the place and people where concepts, effects and things never recorded before come into being, or at least into knowledge.

The term ‘discover’ seems to suggest that these novelties were somehow already there but ‘covered’, not yet separated from the already known reality from which they had not been isolated and reduced to the exact piece about to be characterized, hence the customary practice of reduction in Science.

The value of the then ‘new’ stuff is to be cognized and recognized, categorized as extended in, or rather with, space and time, or space with time known as space-time, found elsewhere and elsewhen with efficiency, effectively, in effect.

This means a model, pattern or language, terms often used as synonyms although they are only modes of access to knowledge, and a process to operate it. They dictate the places and conditions where and when the occurrence, example or case of the category, or the object in or as effect, is to be found or produced again, hence re-produced from what was therefore a pre-diction.

All this is well known about Science, but the following consideration requires attention: how will Science re-produce what has never been produced, predict what has never been dictated, discover what does not even seem to be covered?

This question is not merely philosophical: everyone would like to predict the future, but if the future is not the reproduction of the past, then the value lies precisely in what makes the difference between the two, in a space of still-possibles, or potentials even beyond the refutable, which modern ‘developed’ civilization, in the sense of technological, applied science, seems particularly prone to isolate and circumscribe, and only thereafter, from such writing and recipe, may come to predict as reproducible.

Reproduction therefore characterizes technology and industry rather than science per se, and it is the efficiency of systematic and ultimately ubiquitous reproduction in space and time that enables them to validate science by repelling its quest for bits, cases, occurrences of refutation, of ill-reproduction, from where science will discover again.

In conclusion, there is a conflict, beyond the apparent opposition in time, in predicting the past, which is what science applies itself to doing and testing through ever more efficient technological and industrial reproduction, whereby it asserts that some pattern and processed effect in a result will make a past go on, be re-produced (hence as past), and keep its consistency into the unknown that we call the future.

This has applications and implications at all scales of course, and particularly at the diverse edges of the universe, called its cosmology, where the limited consistency of the past, at least preserved into what we call the present, takes patterns such as discussed in our previous blog pages: Dimensional Consequences 1, Cosmological introduction, Know the Future, Cosmological Constant issue (abstract) and Cosmological Coincidence (abstract).

Back to earth, or rather to immediate next steps, since earth might harbor some edges of the universe after all, such as ourselves: the practical conflict is about predicting the apparently unpredictable, that is to say the future inasmuch as it is never seen, never made and possibly never to come into actuality or even thought before long, even unthinkable, and yet, as we can observe, always happening!

Science Fiction, which was sometimes called ‘anticipation’, a term not foreign to the concept of prediction, is one place where this kind of future is eventually found, although its most common location is the university, where researchers are typically invited to output a Science that would not be fiction, or not anymore.

A conjecture developed by the author over these last decades, among others, is that of a quite deterministic future, in the sense that the Fiction (assumed scientific) so imagined, shaped and, if this means more precisely… modeled, must be consistent with the past aggregated into the ongoing present and further future, to which it therefore adds more integrated consistency, hence complexity, in order to become and make the real.

This suggests ways to determine this particular future that we have characterized above as non-determinable, non-predictable and not reproducing the past, but on the contrary introducing a never-met piece of new reality… unless the determinism holds from some point in the future rather than from any point in the past, from where it would be non-deterministic. Such points in the future do exist, however: they are, for instance, the fictions mentioned above, including all the scientific models, such as the 10^500 string-theory models or variants mentioned by Smolin.

Finally, this is where we have come to propose the project now known as REVUER, cf. On the REVUER project, toward a ‘General Predictability’, where we anticipate that many paths from, to and through more or less reproducible pieces will cross, leave or join others toward a better producible, novel future, and only thereafter a possibly re-producible future past within it.

This explains the importance of the protocol and organization upon which the REVUER global experiment must be able to rely toward these horizons, as discussed in the next blog pages.