As Mohamed Elmasry, emeritus professor of computer engineering at the University of Waterloo, watched his 11- and 10-year-old grandchildren tapping away on their smartphones, he posed a simple question: “What’s one-third of nine?”
Instead of taking a moment to think, they immediately opened their calculator apps, he writes in his book “iMind: Artificial and Real Intelligence.”
Later, fresh from a family vacation in Cuba, he asked them to name the island’s capital. Once again, their fingers flew to their devices, “Googling” the answer rather than recalling their recent experience.
With 60 percent of the global population—and 97 percent of those under 30—using smartphones, technology has inadvertently become an extension of our thinking process.
However, this convenience comes at a cost. Cognitive outsourcing, the practice of relying on external systems to collect or process information, may increase one’s risk of cognitive decline.
Habitual GPS (global positioning system) use, for example, has been linked to a significant decrease in spatial memory, reducing one’s ability to navigate independently. And as AI applications such as ChatGPT become a household norm, with 55 percent of Americans reporting regular AI use, recent studies have linked this reliance to impaired critical thinking, dependency, diminished decision-making, and laziness.
Experts emphasize cultivating and prioritizing innate human skills that technology cannot replicate.
Neglected Real Intelligence
Referring to his grandkids and their overreliance on technology, Elmasry explains that they are far from “stupid.”
The problem, he argues, is that they are not using their real intelligence.
They, and the rest of their generation, have grown so accustomed to apps and digital devices that they unconsciously default to search engines such as Google rather than thinking a question through.
Just as physical muscles atrophy without use, so too do our cognitive abilities weaken when we let technology think for us.
A telling case is the “Google effect,” also known as digital amnesia, documented in a 2011 study from Columbia University.
Betsy Sparrow and her colleagues found that people readily forget information they know is available on the Internet.