
When Transacting Associates or Associative Liquidity


OpenTokenomics © 2022 – Version: alpha – Author: John F. Robert


Introduction

This post complements Connectivity via Transacting; together they form a preamble to a white paper describing the realm and roadmap of the OpenTokenomics eR&D. Reading Connectivity via Transacting first is recommended.

Both posts offer propaedeutic considerations, kept light on technical detail so as to remain accessible to a broad public. This post leads the reader to the reality of associative liquidity and to the symbolization of its identification. Associative liquidity is well illustrated by the GAFAM: these listed companies, actors of a connected world, witness connectivity via transacting that implicitly converts into connectivity via association for each user. The added value of these implicit conversions backs huge capitalizations.


To identify associative liquidity, the post introduces the reader to the algebra required to play that 1 + 1 > 2, to play that 1 + 1 = 2 + associative value. Of course, arithmetic as algebra is not what will play the symbolic game of identifying associative liquidity. This reality is played as an invariant of connectivity, whatever the nature of the involved agents; it is not a question of numbers, weights, or any other magnitudes.

Associative liquidity results from the fact that transacting associates without any additional contract for the association. The algebras playing this invariant of connectivity operate on the prototypal forms of the transaction, simply introduced in the Connectivity via Transacting post and enriched for handling openness to transacting. This enrichment will be formally described in a white paper. The white paper, which includes more finalized strategic content and research results, awaits the choice of a license model before publication.

Enjoy your reading!

About Association


Association is most often a transparent reality for citizens. In the economic domain, most people experience transacting when being hired for a job, delivering a service, or shopping. In doing so, they borrow from immobilized associations: a company, a foundation, a shop, an academic institution, or a state service – all actors standing as associative capital that sustains transacting. The more developed a society, the more it must sustain its associative capital, corporate or not, and this goes with a growing tertiary sector.

If we take a less common experience than working and shopping, founding a new initiative that exhausts the service of one person, incorporated as a startup for instance, the involved actors will face the problem of association. Association needs will be at the core of the roadmap to aggregate financial means and expertise. For the actors concerned, the association will no longer be fully sustained by background players or employees.

From a quantitative perspective, a mass unit, a currency unit, is used to price a transaction; for an association, shares, parts of a whole, are used, and pricing the shares is another question, correlated but formally different. Associating and transacting are two distinct fundamentals of the economy, and they have implicitly fed the ideological opposition between capital and labor. However, let's move away from human ideologies, following our non-anthropomorphic symbolization of economy as a connective present via transacting.


When Transacting Associates or Associative Liquidity


How can we symbolically play association from connectivity via transacting? How can we extend our connectivity via transaction into connectivity via transaction and association, without any additional contract?

The proposed answer is short: by superposition, i.e., transacting associates.

Let's first play it with our non-anthropomorphic symbolization of economy, and then let's go back to business-as-usual.


We will use here the formal expression:


Y ← A . F ( X )


symbolizing a transaction, introduced in Connectivity via Transacting.


This symbolization stands for a published exchange, immobilizing F, accessible from A for converting X into Y.
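
To fix ideas for readers who think in code, here is a minimal, purely illustrative Python sketch of this prototypal form; the class name Transaction, its field layout, and the execute method are assumptions made for this example, not part of the OpenTokenomics symbolization.

from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Transaction:
    A: Any                     # access domain publishing the exchange
    F: Callable[[Any], Any]    # immobilized converter, accessible from A
    X: Any                     # operant to convert

    def execute(self) -> Any:
        # Y <- A . F ( X ): converting X into Y through F, accessed from A
        return self.F(self.X)

tx = Transaction(A="some access domain", F=lambda x: x.upper(), X="x")
Y = tx.execute()
print(Y)   # "X"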

In the flowing transaction, we also have to consider that the operants {Y, A, F, X} pass through an association to enable the transaction. Therefore, transacting intrinsically puts the operants into association. We can't do 1 + 1 = 2, or equivalently, if we rewrite the addition under our prototypal form, perform


2 ← A . operator+ ( 1,1 )


without associating the operants {2, A, operator+, (1,1)}.
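
As a naive illustration of this claim, the following Python sketch executes the addition as a transaction and leaves behind an explicit record of the operants it has put into association; the transact function and the associations list are hypothetical devices for this example only, not the algebraic identification discussed next.

from typing import Any, Callable

associations: list[dict] = []    # implicit associations left behind by transacting

def transact(A: Any, F: Callable, X: Any) -> Any:
    Y = F(X)    # the exchange: Y <- A . F ( X )
    # executing the transaction already puts the operants into association;
    # here that association is merely recorded as a dictionary for illustration
    associations.append({"Y": Y, "A": A, "F": F.__name__, "X": X})
    return Y

def operator_add(operants):
    return sum(operants)

Y = transact("some access domain", operator_add, (1, 1))
print(Y)                  # 2
print(associations[0])    # the operants {2, A, operator+, (1, 1)} stand associated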

How can we identify this associative value intrinsic to any transaction, implicit because it results from the transaction contract itself, without the need to add one more contract between the associates, namely the operants?


It's not a quantitative problem. Everybody knows that 1 + 1 is more than 2, or that the whole is always greater than the sum of its parts. But we can't play it with arithmetic as algebra. A very weak point for the quantitative economy, a great opportunity for the development of the tokenized economy!

Algebras operating on prototypal forms, completed to handle openness to transacting, can play the required symbolization. They play how transacting is superposed on associating in any transaction, how the flowing present of the transaction prototypes the service of persistency sustaining the association of the operants, making them differentiable via their cross-convertibility in each instant. They will be described in the white paper. These algebras operate type conversions, superposing the roles of buyer, seller, and staker (a chosen postmodern terminology for investor) for each actor in any transaction. Any actor, human being or not, buying, selling, or staking, formally taking respectively the X, Y, and F positions and so dropping the economic ontology, also fills the two other roles, the two other positions. This results from an intrinsic pattern of the sustainability of the present within the circulation of the operants.
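
A hedged, minimal sketch of the role superposition just described, under the assumption that it can be caricatured as every actor carrying the three roles whatever its entry position; the Role enum and the roles_of helper are illustrative only and do not stand for the type conversions of the algebras.

from enum import Enum

class Role(Enum):
    BUYER = "X"     # entering the transaction through the X position
    SELLER = "Y"    # entering through the Y position
    STAKER = "F"    # entering through the F position

def roles_of(entry_position: Role) -> set:
    # whatever the entry position, the actor also fills the two other roles
    return set(Role)

print(roles_of(Role.BUYER))    # {Role.BUYER, Role.SELLER, Role.STAKER}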

This pattern will deeply exhibit how much numbers and the associated beliefs, up to idolatry, fail to reflect a sustainable present. In OpenTokenomics eR&D, the research handles the question of sustainability far beyond the current idea that life and its economy need a specific ontology, an idea moreover invalidated by our engineering. Sustainability is played at the level of the circulation of the operants. From their indiscernibility in their positions, we play the broken symmetry that opens and serializes them into a transactional exchange, delivering the persistency of an association, delivering the alterity required for the transaction under the unicity of the association.

This symbolically plays how the transactional exchange of a look for an image shows agents in association in the visible. We are far from the geometric and topological symbolization of the visible, far from the conception of the visible from an observer's position.


Let’s come back to business-as-usual

Economically, identifying the associative value in any transaction is highly strategic. Only identifiable actors in the game of the economy can become assets, and this holds before any quantitative symbolization. So a robust algebraic game identifying the superposition of roles in any transaction opens for the tokenized economy the new domain of associative liquidity, which is in some way what DAOs target. The associative liquidity in any transaction can't be symbolized by cash coins. It points to a specific domain of codification by code, unreachable by weighting currencies, digital or not, for identification reasons.

Let's take the GAFAM example. These listed companies have all accumulated huge capitalizations in a short time. We can think in an ontological way, making information an asset, and point to the huge volume of data they collect and generate, crawling with AI. States can answer with laws to prevent this appropriation without the consent of the user, which brings an unbalanced distribution of wealth, and with laws to protect startups against the financial whales generated by these accumulated data, which compromise the essence of a free market. New entrants can promote a web that preserves ownership of data, allowing anybody to exchange its value; the latter approach belongs to the blockchain trends trying to fix the web. However, these correctives do not address the identification of the association; they miss the point that transacting associates. Indeed, the first reality backing these capitalizations is the associative connectivity resulting from the transactional connectivity of each connected user.

Of course, we can count the number of connected user accounts to size the association. Musk faced an issue with this indicator when buying Twitter. However, it is too late: in the contextuality of a closed cardinality, when the flow has been expanded into an extension that enables counting, it is too late to identify the associative value, to identify that 1 + 1 is greater than 2. In a civilization running algebras operating on the prototypal forms of the transaction, the indicator identifying the associative value would result implicitly from the code design, before any counting or quantitative valuation per user. The Musk issue with Twitter would lose its context. With such algebras shaping the code design of any access operator, we are equipped to symbolize a distributed capitalization and implement associative liquidity. I don't blame the GAFAM for accumulating huge capitalizations, even if sometimes, as citizens, we feel threatened by their monopolistic powers. At least they have the merit, in their pioneer developments, of unveiling deep economic and epistemic realities of our connective present, and the game they will play is not yet written...

Perspectives for algebras operating on prototypal forms in a tokenized economy

Also, if we envision a distributed and distributive economy via tokenization, it is welcome to revisit the design patterns for identifying the assets, to hook them to the prototypal forms of the executed flows connecting us. It goes with a new design of the code to properly identify contributions to a sustainable present, a design of the code not relying on the state-machine concept, which asks physicists to create stable states to program. Such a design will help to conscientize how an access domain to transacting computes (in a revisited sense) a present sustaining the alterity of its instantiated inhabitants for any open type. It is a starting point where codesis will offer other potentials than controlling via mathesis and statistics on ontologies presupposed to follow laws for the horizon of the possible.

Quantum computing plays here the role of a singular incubator, where mathematical physics and software symbolization nurture a historical meeting. I must point out on this subject that the algebras operating on the prototypal forms of the transaction, playing how transacting associates, meet the Pauli group, which today symbolizes a qubit, without any phenomenal hypothesis and without the context of a linear Hilbert space to represent quantum states. It will help to understand how the world is designed to compute for us, how it is designed to sustain a present without building any computer! Without building any state machine, because we don't symbolically play morphology on phenomena in quest of stable states to conceptualize persistency. We symbolize that action as state change always goes with the flowing transparency of the act to sustain a present of any nature.
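
For readers who want to check the group structure mentioned here, the following sketch shows the standard matrix presentation of the single-qubit Pauli group with NumPy; only the group itself is exhibited, without any Hilbert-space state vector, and no claim is made about how the algebras operating on prototypal forms reach it.

import numpy as np

I = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

phases = [1, -1, 1j, -1j]
pauli_group = [p * m for p in phases for m in (I, sx, sy, sz)]   # 16 elements

def in_group(a):
    return any(np.allclose(a, g) for g in pauli_group)

# closure under the matrix product: the product of any two elements stays in the group
print(all(in_group(a @ b) for a in pauli_group for b in pauli_group))   # True
print(np.allclose(sx @ sy, 1j * sz))   # sigma_x . sigma_y = i sigma_z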

Blockchain designs today remain attached to business-as-usual when it comes to identifying the assets, e.g., we need to write contracts to build associations. Identifying associations in transactions must be transparent and shouldn't require any explicit agreement based on a shared dictionary. That's why it is relevant to design the identification of the assets at the level of the prototypal forms, which contrasts with a modern approach envisioning dictionaries, ledgers of ontological instances, built with AI or not.


With algebras operating on the prototypal forms of the transaction, the economy reaches a deeper epistemology than any other scientific discipline of our time. We can symbolically play that we are associated in a common world without any common language, without any requirement for a learning process against common semantics. The presence of a natural universe is symbolically played via prototypal forms of instantiating, with open ontologies. Identifying the assets operates as a built-in feature in the design of the flow sustaining a present. This identification is the root of generic gamification for reflection – an electron, a stone, a mathematical object, and so on, do not speak any human language, but they are all agents of transactions, as is any instance, human beings included. Getting their identification by exchanging their alterity is not a question of discursive language or semantics, but of immersion.


Some epistemic considerations

In OpenTokenomics, we don't expand an iterated present into an extension, namely time. In more conceptual words, cardinality and variadicity, i.e., closed and open cardinalities, are alterities in openness to transacting, and tomorrow is not handled via predictions along a time extension. Duration is not first an interval of time but a service of persistency, which factors into a couple of reading and writing services. Instances are paths of openness to transactions, interfacing paths serializing reflectors that return a callback to a call. (Everybody knows that tomorrow will be a present, up to the apocalypse.) In a ramified flow, when we drop causality as a principle to derive morphology on phenomena, perceptive and actualizing functions are played free from their physiology, and the episteme about physical and physiological implementations is not trapped in magnitudes for identifying objects. The episteme plays identification from their prototypal forms of reflection, evaluating how call and callback reach their proper name. The episteme of an iterated present is to harmonize action and perception: an episteme of full reflection, of full consciousness for human beings within the associative connectivity of a transactional flow open to any ontology. It is an episteme prototyping phenomena for open ontologies instead of modeling phenomena for given ontologies.
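
A speculative reading aid for two of these sentences, in Python: persistency modeled as a pair of writing and reading services, and a reflector that returns a callback to a call; the names Persistency, write, read, and reflect are assumptions for this illustration, not definitions from the eR&D.

from typing import Any, Callable

class Persistency:
    """Persistency as paired writing and reading services, not as a time interval."""
    def __init__(self):
        self._store: list[Any] = []

    def write(self, value: Any) -> int:
        self._store.append(value)
        return len(self._store) - 1     # handle for a later reading

    def read(self, handle: int) -> Any:
        return self._store[handle]

def reflect(call: Any, persistency: Persistency) -> Callable[[], Any]:
    # serialize the call and return the callback that re-reads it:
    # the call reaches its callback through the persistency it has written
    handle = persistency.write(call)
    return lambda: persistency.read(handle)

p = Persistency()
callback = reflect("a call", p)
print(callback())   # "a call"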

Currently, under the horizon of the sciences, theorizing and modeling to support our consciousness of the possible, we think, for the economy, of more data and more computing power, to run after assets with a statistical morphology designed by AI on an information ontology driven by a causal phenomenality. Once we can play a sustainable present freely giving the identity of the assets without requiring a learning process, this quest for power, nurtured by the imaginary of causality as an instrument of control by prediction, will most probably unveil itself as very weak, even counterproductive, 'self-destructive' – annihilating the alterity between the environment and its inhabitants, whatever their nature, whatever the ontological morphology of statistics.



For more about the operative symbolization, visit OpenTokenomics.io

