Biden Prediction of More Technological Progress in The Next 10 …

President Biden recently made the extraordinary claim that, largely due to advancements in artificial intelligence (AI), “we’ll see more technological change in the next 10 years than we’ve seen in the last 50 years and maybe even beyond that.” If AI could deliver such progress, it would represent an incredible leap forward for humanity, and policymakers should do everything in their power to realize its potential. But even with AI’s dizzying possibilities, it is highly unlikely that technological advancements in the next decade will outpace the achievements of the past five decades, given how much innovation that period produced. This matters because if policymakers assume robust innovation is on autopilot, efforts to promote it can slacken.
The past 50 years have seen extraordinary innovations (see table 1), yet progress takes time. Martin Cooper, the head of communications at Motorola, made the first cellphone call in New York City on April 3, 1973, yet it took another decade before mobile phones were commercially available. Even after the technology became feasible, it was not until 2016 that more U.S. adults were living in households with cell phones than with landlines, nearly a decade after Apple released the iPhone. By suggesting that the upcoming decade will hold more progress than the past five, President Biden has overstated how quickly so many advancements can occur. Packing half a century’s worth of groundbreaking innovations into just ten years is an impossible task.
Further, in assuming the inevitability of significant technological progress, President Biden fails to consider how innovation becomes harder over time. Past innovators have already made the most accessible breakthroughs, such as creating foundational concepts and basic technologies, leaving future innovators to spend significant effort refining and extending that earlier work. For example, researchers have known the foundational principles behind quantum computing for decades, yet developing practical, scalable quantum computers has proven to be a monumental challenge.
Future innovators must also develop more complex technologies. The first digital camera, built in 1975, was an incredible innovation that showed it was possible to make an all-electronic camera with no consumable parts such as film. But it was a relatively simple device compared to the digital cameras in today’s mobile devices. Likewise, while the invention of the first microchip in 1958 marked a significant breakthrough, today’s computers need vastly more advanced chips that require extensive knowledge, specialized facilities, and considerable investment to produce. As technologies evolve, they often become more intricate and require more time and resources to develop than their predecessors. This added complexity tempers the pace of discovery.
Many impressive technological advancements will likely occur in the next ten years; however, Biden’s hyperbolic claim about what to expect is wrong, and policymakers should not take future advancements for granted. Recognizing the impact of past achievements sheds light on the efforts behind them and underscores the importance of continuing to support research and development into new technologies.
Image credit: Flickr user Gage Skidmore
Morgan Stevens is a Research Assistant at the Center for Data Innovation. She holds a J.D. from the Sandra Day O’Connor College of Law at Arizona State University and a B.A. in Economics and Government from the University of Texas at Austin.
