On Predicting the Future in an Exponentially Growing World

Eric J. Seader
8 min read · Dec 26, 2019
Photo Credit: U.S. Army illustration

When I applied, early decision, to university in late 1996, I chose for my essay the option to write what would be the lead article of a ‘major [imaginary] newspaper’ on January 1st, 2010. As editor and sole journalist for this fantastical publication thirteen years in the future, I made the executive decision to lead off with my own think piece. The editorial declared that our widely distributed daily would no longer be making New Year’s predictions due to a rather rocky track record of being so very wrong.

There was the forecast in (fictional) 2002 that the groundbreaking electric car company, Päazen-Gazs, would soar in popularity and effectively end sales of internal combustion engines, but that, er, backfired and they never ended up producing a single vehicle for the consumer market. In 2008, ‘we’ predicted that every household in America would own a C.A.T.T. — a combination computer, ATM, television, and telephone (which, in retrospect, sounds like I might have imagined a very crude prototype for the first iPhone).

What I did not realize until now, though I had seemingly grasped it then, is just how difficult it is to predict the future in a rapidly changing technological landscape. As our various devices and services, and the research and innovation that go into furthering our convenience and efficiency, continue to advance at an aggressive rate year over year, it becomes progressively harder to predict what's next in the pipeline. As 5G networks and infrastructure begin to take shape and amazing achievements in quantum computing are realized, while our hunger for consumption hurtles toward insatiable, we are traveling at ludicrous speed into the unknown.

While researching for this piece, I stumbled upon an interesting Business Insider article from 2012, poking fun at the naysayers who, ahead of Apple's iPhone launch in 2007, conjectured that it would fail to be a game-changer or would be an outright defeat for the Cupertino company (now the world's most valuable). The co-CEO of Research in Motion, Jim Balsillie, said of the upcoming device, "It's kind of one more entrant into an already very busy space with lots of choice for consumers, but in terms of a sort of sea-change for BlackBerry, I would think that's overstating it." Steve Ballmer, who was CEO of Microsoft from 2000 to 2014, confidently stated, "There's no chance that the iPhone is going to get any significant market share. No chance. It's a $500 subsidized item. They may make a lot of money. But if you actually take a look at the 1.3 billion phones that get sold, I'd prefer to have our software in 60% or 70% or 80% of them than I would to have 2% or 3%, which is what Apple might get."

Even seven years into the 21st century, we were still largely living and thinking in a late-20th-century mindset. We were carrying multiple devices for various forms of entertainment and communication, and most of us could not even fathom what the next generation of tech would look like. Wedbush Morgan Securities analyst Michael Pachter thought it highly unlikely that Apple would ever compete with Nintendo and other mobile gaming companies. Responding to a New York Post article stating that Apple was secretly plotting to take over the mobile gaming market, Pachter told GamesIndustry.biz, "Rather, I think that they're trying to establish the iPhone as an all-encompassing entertainment device. The markets are different. The iPhone is an expensive toy for the wealthy and self-indulgent, while the [Nintendo] DS is an inexpensive toy for everyone." He concluded that he could not "conceive of their gaining significant share of the handheld market. The iPhone is unlikely to ever compete effectively, as it is unlikely to attract the level of third-party developer support afforded the DS."

Left to right: BlackBerry Curve 8300, Nokia N95, Palm Treo 650, Samsung BlackJack

Ahead of the iPhone's release, the most advanced smartphones of the time (notably, the BlackBerry Curve 8300, Nokia N95, Palm Treo 650, and Samsung BlackJack) featured a tactile QWERTY keyboard, except for the Nokia, which tortured users by having them cycle through uppercase and lowercase letters and special characters via a 12-button numbered keypad. The Treo had a VGA camera with 640x480 resolution, the BlackJack had a 1.3MP camera with resolution up to 1280x960, the BlackBerry featured a 2MP camera, and the Nokia had a (rather impressive for the time) 5MP camera. While the first-gen iPhone's specs weren't all that impressive by comparison (320x480 resolution, a 2MP camera, and a hefty base price of $500), it broke new ground with its multi-touch Gorilla Glass interface, lack of physical keyboard, superior on-board storage, built-in WiFi, and a truly usable 3.5" screen for web browsing and media playback.

Although all of these phones sound ancient compared to today's iPhone 11 Pro Max or Samsung Galaxy Note 10, they were practically supercomputers compared to 1994's Simon Personal Communicator, made by IBM in a partnership with Bell Atlantic (Verizon's mama). At a monstrous eight inches long by two-and-a-half inches wide by one-and-a-half inches thick, and weighing over a pound, it dwarfs even the largest smartphones currently on the market. Yet it was revolutionary for the early 90s, giving the user a monochromatic LCD touchscreen with access to email, an address book, a calculator, a calendar, note and sketch pad functions, and to-do lists, plus the ability to send and receive faxes and even an early iteration of predictive typing. Not to mention, you could also plug it into a standard phone jack…in the wall…to call more reliably via a landline [gasp!]. And this game-changing technology could be had for a mere $1,100, which is not terrible by today's standards, particularly considering an IBM ThinkPad at the time cost $7,600 (nearly $13,000 adjusted for inflation). But hey, at least you got a 2x internal CD-ROM drive for that jaw-dropping price.

Still, even the Simon was a veritable NASA satellite compared to the world's first mobile phone, built by Motorola in 1973. The prototype had no display screen, no internal memory, and gave a user only thirty minutes of talk time after a ten-hour charge. From it, Motorola researcher and executive Martin Cooper called his rival, Dr. Joel Engel of Bell Labs, on April 3, 1973, to basically tell him "game on, bruh." When Motorola released the DynaTAC 8000X (commonly known as the Zack Morris phone), the battery specs hadn't improved, but it did feature an LED display and internal memory to store up to 30 numbers, and it cost a wallet-combusting $4,000 (slightly north of $10,000 today).
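For the curious, the inflation conversions above (the $7,600 ThinkPad landing at "nearly $13,000," the $4,000 DynaTAC at "slightly north of $10,000") can be sanity-checked with a simple CPI ratio. Here is a minimal sketch in Python; the CPI-U annual averages below, and the assumption of a 1983 price point for the DynaTAC, are my own inputs, not figures from this article:

```python
# Sanity-check the inflation-adjusted prices quoted above using the
# standard CPI ratio: price_then * (CPI_now / CPI_then).
# CPI-U annual averages (1982-84 = 100) -- assumed values, not from the article.
CPI = {1983: 99.6, 1994: 148.2, 2019: 255.7}

def adjust(price, from_year, to_year=2019):
    """Scale a historical price by the ratio of CPI index values."""
    return price * CPI[to_year] / CPI[from_year]

print(round(adjust(7600, 1994)))  # IBM ThinkPad: roughly $13,100 in 2019 dollars
print(round(adjust(4000, 1983)))  # DynaTAC 8000X: roughly $10,300 in 2019 dollars
```

Both results land close to the figures in the text, with any daylight between them down to which year's index you pick.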

Source: Euromonitor International

A device once thought of as indulgent or for emergencies only has now become an indispensable lifestyle accessory for most of the developed world. While researching this piece, I found a 2017 story from Vox chronicling how the iPhone has changed the world since its introduction. Among the more interesting statistics was one regarding sales of chewing gum. According to market research firm Euromonitor International, gum sales dropped fifteen percent between 2007 and 2017. Experts point to consumers no longer being enticed by impulse buys in supermarket checkout lines, instead being absorbed by their social media feeds to pass the time. While my imaginary newspaper might have correctly predicted the meteoric rise of the iPhone (had it been on 'our' radar in 1996), we most likely would not have had the foresight to predict what it would mean for the Trident, Juicy Fruit, and Hubba Bubba market.

As metrics seem to be key to virtually every industry, futurists have more recently been projecting trends rather than specific technologies. Increased adoption of RPA (Robotic Process Automation), expansion of 5G networks, democratization of AI, growth in remote work, advanced encryption, VR/AR/MR integration, wider acceptance of blockchain, and increased consumer demand for platform governance will greatly impact and reshape the way business gets done, in the U.S. and abroad. As C-level executives continue to see the value of developers and infosec professionals, the tech world will help bring the human augmentation revolution to corporate infrastructure.

As rosy as the picture might be of a future in which our menial, ministerial tasks are handled by AI, freeing up more of our time to create and innovate while also giving us more time to luxuriate in recreation with family and friends, perhaps it also makes sense to predict the negatives that might come from such a world. With tech professionals gaining more traction in the corporate world, supply and demand will inevitably drive salaries and benefits upward, and the world's governments will most likely fail to compete. Unable to attract top talent, governments will see modernization and cybersecurity protections stifled at the local, state, and federal levels, putting our personal information and security, and that of our children, at risk, not to mention threatening our electoral systems. Attacks on our electoral infrastructure beyond what we have already experienced will hinder our ability to usher in the next generation of diversified politicos who can reshape our education systems, both for the youth and for adult populations who will need to be retrained for an ever-advancing society. Additionally, exponential technological growth could very well have a devastating impact on industries that can't or won't modernize, and on employees who are resistant to retraining.

Perhaps that is a rabbit hole better left unexplored for now. But although that is a more extreme example than the decline of chewing gum sales, it behooves us to consider the unintended consequences of a cutting-edge tomorrow to better prepare ourselves for those possibilities.

To swing the pendulum back toward the positive, one can get blissfully lost in a sea of daydreams thinking about what the future has in store for us. As we currently live in an age where virtually anyone can learn how to do practically anything with a simple YouTube search, it is possible to imagine, years from now, a Matrix-style reality in which we can simply have new skills added to our brains and bodies instantaneously. We can imagine a world with no communication barriers, democratized access to technology and education, governments that truly serve the will of the people, and the processing power to handle it all. So when it comes to predicting the future, it may seem logical to leave it up to the professionals, but we can see that sometimes they get it wrong. Better to leave the innovation to the scientists and technicians, the oversight to independent (and apolitical) nonprofits, and the daydreaming to all of us.


Eric is a legal technology consultant, music hobbyist, and politics junkie. He resides in New Jersey with his wife, daughters, cat, and way too many guitars.