AI capabilities are advancing faster than the hardware needed to power them.
Can decentralization close the gap?
Over the past two years, artificial intelligence (AI) capabilities have skyrocketed, making generative AI tools like ChatGPT, DALL-E, and Midjourney indispensable.
Generative AI programs are writing emails, creating graphics from simple prompts, composing music, and drafting marketing copy as you read this post.
The pace at which people and businesses are adopting AI is even more astounding. According to a recent McKinsey survey, the share of businesses using generative AI in at least one business function jumped from 33% at the beginning of 2023 to 65% less than a year later.
Like other technological developments, this new field of invention is not without its difficulties.
Centralization is chief among them: training and operating AI models demands enormous resources, and big tech currently appears to have the upper hand.
The computational barrier to AI advancement
The World Economic Forum reports that demand for AI compute is accelerating, with the computing capacity needed to sustain AI development currently growing at an annual rate of between 26% and 36%.
This trajectory is supported by a recent study from Epoch AI, which projects that training and operating AI models will soon cost billions of dollars.
According to Epoch AI staff researcher Ben Cottier, “the cost of the largest AI training runs is growing by a factor of two to three per year since 2016, and that puts billion-dollar price tags on the horizon by 2027, maybe sooner.”
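Cottier's claim is easy to sanity-check with compound growth. As a back-of-the-envelope sketch (the ~$100 million 2023 baseline here is an assumption for illustration, not a figure from the article), doubling or tripling annually puts costs in the billions by 2027:

```python
# Back-of-the-envelope projection of frontier training-run costs.
# Assumes (hypothetically) a ~$100M run in 2023, compounding at the
# 2-3x annual growth rate Epoch AI describes.
def projected_cost(base_cost, base_year, target_year, growth_factor):
    """Cost after compounding `growth_factor` annually between the two years."""
    return base_cost * growth_factor ** (target_year - base_year)

for growth in (2, 3):
    cost_2027 = projected_cost(100e6, 2023, 2027, growth)
    print(f"{growth}x/year -> ${cost_2027 / 1e9:.1f}B by 2027")
# 2x/year -> $1.6B by 2027
# 3x/year -> $8.1B by 2027
```

Even the conservative 2x growth rate lands well past the billion-dollar mark within four years, which is why the "by 2027, maybe sooner" framing is plausible.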
I think we've already arrived at this stage. Last year, Microsoft invested $10 billion in OpenAI. More recently, reports suggested the two companies plan to build a data center housing a supercomputer driven by millions of specialized processors. The price? A staggering $100 billion, ten times the initial investment.
It turns out that Microsoft is not the only major tech company going all out to increase its AI computing capabilities. Google, Nvidia, and other corporations involved in the AI arms race are also allocating significant financial resources to AI research and development.
While the results may yet prove commensurate with the sums invested, it is hard to deny that AI development is currently a "big tech" sport. Only these firms have the financial wherewithal to pour tens or even hundreds of billions of dollars into AI initiatives.
This raises the question: what can be done to prevent AI from falling into the same trap as Web2, where a small number of firms came to control innovation?
James Landay, Faculty Director of Research and Vice Director of HAI at Stanford, is among the specialists who have already weighed in on this topic.
According to Landay, the competition for GPU resources and the big tech companies’ emphasis on using their AI processing capacity internally will drive up demand for computing power and force players to create less expensive hardware solutions.
After chip-export battles with the US cut Chinese companies off from easy access to vital processors, the Chinese government is already stepping in to assist AI startups. Earlier this year, local governments in China announced incentives promising AI businesses computing vouchers worth between $140,000 and $280,000, an effort aimed at lowering the cost of processing power.
Distributing the cost of AI computing
One thing consistently stands out when examining the state of AI computing today: the sector is centralized. Most processing power and AI programs are controlled by large tech companies, and that status quo rarely shifts.
Encouragingly, things might actually change for the better this time around, thanks to decentralized computing infrastructures such as the Qubic Layer 1 blockchain. Instead of the energy-intensive proof-of-work (PoW) hashing Bitcoin uses to secure its network, this L1 employs a mechanism Qubic calls useful proof of work (uPoW), which channels miners' computational resources into valuable AI tasks such as neural network training.
In simpler terms, Qubic is decentralising the sourcing of AI computational power, moving away from the current paradigm in which innovators are limited to the hardware they own or rent from big tech.
Instead, this L1 taps its network of miners, which could number in the tens of thousands, to provide computational power.
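The article does not detail Qubic's actual protocol, but the general idea of farming one large job out to many nodes can be illustrated with a simple sketch. Everything below (the `shard`, `worker_compute`, and `distribute` functions) is hypothetical, standing in for a coordinator that splits a workload across miner nodes and aggregates their partial results:

```python
# Purely illustrative sketch, NOT Qubic's actual protocol: a coordinator
# splits a workload into shards, sends each to a worker, and aggregates.
from concurrent.futures import ThreadPoolExecutor

def shard(data, n_workers):
    """Split `data` into `n_workers` near-equal contiguous shards."""
    k, r = divmod(len(data), n_workers)
    shards, start = [], 0
    for i in range(n_workers):
        end = start + k + (1 if i < r else 0)
        shards.append(data[start:end])
        start = end
    return shards

def worker_compute(chunk):
    # Stand-in for real work a node would do (e.g. a gradient over its shard).
    return sum(x * x for x in chunk)

def distribute(data, n_workers=4):
    # Threads here stand in for remote miner nodes.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(worker_compute, shard(data, n_workers))
    return sum(partials)  # aggregate the partial results

print(distribute(list(range(10))))  # matches computing the job centrally
```

The key property is that the aggregated result equals what a single machine would compute, so the work can be spread across however many nodes happen to be available.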
Although somewhat more technical than leaving the backend to big tech, a decentralised approach to sourcing AI computing power is more economical. More importantly, it is fairer for AI innovation to be driven by many stakeholders rather than the handful of players the industry currently relies on.
What happens if all of them go down?
To make matters worse, these tech companies have proven untrustworthy custodians of life-changing technology.
Today, most people are up in arms over data privacy violations, not to mention related issues such as societal manipulation. Decentralised AI makes it easier to audit developments while lowering the cost of entry.
Conclusion
AI innovation is just getting started, but access to computational power remains a headwind. Worse, big tech currently controls most of the resources, which constrains the rate of innovation and could leave these same companies with even more power over our data, the digital gold.
However, with the advent of decentralised infrastructures, the entire AI ecosystem stands a better chance of reducing computational costs and eliminating big tech's control over one of the most valuable technologies of the 21st century.