Seven years ago, the president of the European Association for Cognitive Systems, together with an Oxford professor, surveyed 550 AI experts.
The results were surprising: the median estimate assigned a 90% probability to high-level machine intelligence, and with it the Singularity, arriving before 2075.
That would be a turning point: an artificial general intelligence able to match, and eventually surpass, human cognitive capabilities.
This means that the next capitalist cycle (the current one appears to be ending now, with a financial crisis that will create a huge incentive to reduce the human workforce within production processes) will probably be the final one: the definitive transition from a society based on labor to a society based on capital (technologies).
This phase of human history began with 100% of production generated by human workers and will end with 100% of production generated by capital.
The process is not linear, and we are missing the big picture of why this is happening now: the pace of change is much faster because the population is much larger than in the past and people are far more interconnected, thanks to the instantaneous communication of information via the internet.
This allows humanity to develop in the next 50 years what it could not develop in the previous 1,000, and our ancestral brain cannot project this scenario because it is not accustomed to the exponential evolution of events (the spread of COVID-19 is a perfect example of the underestimation of an exponential risk).
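To make the exponential intuition concrete, here is a tiny, purely illustrative comparison (the starting value, increment, and doubling period are invented for illustration, not epidemiological data): a quantity that doubles every three days quickly dwarfs one that grows by a fixed amount over the same interval.

```python
# Purely illustrative: why exponential growth defeats linear intuition.
# The starting value, the +300 increment, and the 3-day doubling period
# are arbitrary assumptions chosen only to show the divergence.
linear = exponential = 100
for day in range(0, 31, 3):
    print(f"day {day:2d}: +300 every 3 days -> {linear:6d} | doubling every 3 days -> {exponential:7d}")
    linear += 300
    exponential *= 2
```

By day 30 the linear series has only reached 3,100, while the doubling series has passed 100,000: the same gap our intuition tends to miss.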
Looking at how society is structured, a big question comes to mind: how will the majority of people (the working class, for whom work is the only or main source of income) survive once they become the useless class?
One of the most frequently proposed “solutions” is a universal basic income (UBI): the government would collect, through taxes, the money accumulated by the companies that use these technologies and redistribute it equally among the population, for the sole merit of being alive.
This is a completely suboptimal answer: taxing companies would only disincentivize them from adopting technologies, slowing development and innovation, and the cost would be passed on to consumers through higher prices.
The government’s responsibility here is not to act as a passive entity that merely collects money, but to protect the country’s long-term interests by structuring a financial singularity strategy.
If the Technological Singularity occurs (a non-linear risk), why should a universal basic income be assumed to be the optimal, rational solution? It is not. It would be far more effective to structure a gradual, guided shift toward an automatic and intelligent system for investing and redistributing wealth, eventually using the Singularity itself: an automatic mechanism, phased in at a rate that mirrors the rate of automation within the supply chain, that reinvests the money of the people at highest risk of long-term technological unemployment so as to maximize their self-sufficiency, complemented by educational incentives and diversification of income sources.
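As a purely hypothetical sketch of what "mirroring the automation rate" could mean in practice (the names, the linear scaling rule, and the 15% cap below are assumptions of this example, not part of any existing scheme):

```python
# Minimal sketch (illustrative only): a reinvestment rate that "mirrors" the
# automation rate of a supply chain. The contribution grows with both the
# share of the supply chain that is automated and the worker's individual
# displacement risk, so the shift from labor income to capital income is gradual.
from dataclasses import dataclass


@dataclass
class Worker:
    monthly_income: float     # current labor income
    displacement_risk: float  # estimated risk of long-term technological unemployment (0..1)


def reinvestment_contribution(worker: Worker, automation_rate: float,
                              max_contribution_share: float = 0.15) -> float:
    """Monthly amount routed into the worker's diversified investment account."""
    share = max_contribution_share * automation_rate * worker.displacement_risk
    return worker.monthly_income * share


if __name__ == "__main__":
    w = Worker(monthly_income=2500.0, displacement_risk=0.8)
    for rate in (0.1, 0.4, 0.7):  # supply chain 10%, 40%, 70% automated
        print(f"automation {rate:.0%}: reinvest {reinvestment_contribution(w, rate):.2f} / month")
```

The point of the sketch is the design choice, not the numbers: the more automated production becomes, the larger the slice of income that is automatically converted into capital for the people most exposed to that automation.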
Decentralization and open-source technology will be critical to the success of this strategy, ensuring that no one has to rely purely on trust in government decisions.
Finally, since the process will be subject to friction, maximizing sustainability without eroding the incentives for innovation and production, and above all preventing the social and financial crises that growing inequality would generate, is the main social challenge of the next 50 years; whoever leads this evolution will determine what life will be like for everyone in the decades to come.