We may be the masters of the devices we create, but as we become more and more dependent on them, we are literally rewiring our brains.
Watching the Apple Maps fiasco slowly unfold across Twitter over the past week, a stray thought struck us like a lightning bolt: humans don’t just use technology — we adapt to it.
In the dark days before the first iPhone (say, 2006), people who wanted to catch a train wandered down to the station, flipped open a brochure or stared at a route map, and figured out how to get where they needed to be the old-fashioned way. A mere six years later, we’ve become so used to Google Maps’ excellent transit directions that their omission from Apple’s homebrewed Maps has caused a universal conniption fit. Some people simply don’t know how to find the right bus without Google’s help.
It’s a fascinating concept: If people can rewire themselves so thoroughly in just six years, imagine what our lives — and bodies — will be like sixty years from now.
What might be coming down the pike as we increasingly rely on technology? Because let’s face it, we ain’t getting any less of it going forward.
Damned dirty apes
Physically evolving to match the technology of the times isn’t anything new. In fact, humanity wouldn’t be humanity as we know it if the earliest cave squatters failed to adapt to their surroundings.
According to Richard Wrangham, author of Catching Fire, humanity would still be swinging in the trees if it weren’t for a primitive form of technology: cooking. Cooked food is softer than raw food. Wrangham argues that eating cooked food favored a mutation that reduced the size of our ancestors’ jaws, which gave them the capacity for speech and freed up space in their skulls for bigger, more advanced brains.
Chew on that as we take a speculative look at where humanity is headed.
The future, today
Actually, physiological changes are already occurring in humans who drink deeply from the technological well.
As computers and the Internet become ever more ubiquitous, our brains are changing to keep pace with new reading and memory habits. In fact, the sheer amount and availability of information out there is changing the way we remember.
In a study titled “The Google Effect,” Columbia University researchers discovered that humans are actually starting to use the gigantic knowledge repository that is the Internet as a personal memory bank, rather than a simple information resource. In a nutshell, we’re getting better at remembering where and how to find online information — such as through a simple Google search — rather than the actual information itself.
UCLA neuroscientist Gary Small conducted a study of 24 people aged 55-plus, half of whom were “Net-savvy.” He had all of them conduct basic Web searches for information; MRI testing showed that the Netizens had twice as much neural activity as non-Netizens.
We process brief bytes of information in bursts now, and the world moves to compensate. Witness the Twitter-esque feeds on news and sports shows, or the rise of the tl;dr meme.
Is this rewiring of our base structure a bad thing? Only if you’re not one of the Connected Ones. Most of the information in the world is literally at our fingertips, and our brains need to compensate for the deluge. Human brains can’t store as much as near-infinite server hard drives, so adjusting our physiology to remember how to find data is evolutionary efficiency at its finest — as long as you don’t stumble into a rural area with poor broadband connectivity, that is.
How else will we change?
There’s no reason to assume our minds will be the only aspects of humanity altered in the coming years. Our cultural behavior is already changing, as well; NPR reports that high school reunions are in serious decline now that everyone can keep up with their best buds through Facebook.
In general, online connections aren’t as strong as face-to-face connections, but I’d argue that keeping tabs on your buddies year ’round rather than once every five or ten years is a vast improvement.
Consider further the way we see, smell and hear. Historically, hearing aids and glasses have been used to help people with below-average senses reach normal levels. Now we’re starting to break that paradigm.
Augmented reality headsets capable of displaying info at a glance have long been the purview of pioneers like Steve Mann. No longer. Nokia’s Lumia 920 features robust augmented reality support. The awe-inspiring Google Glasses could be combined with Jelly Bean’s Google Now to engulf us in an enhanced, Matrix-like world where the information you need is there when you need it. Then, our brains wouldn’t even need to remember how to find information. The engine handles it all.
Speaking of the Matrix, current technology is limited by modern-day battery life. That barrier should fall in the future; scientists are working on ways to turn the human body into a battery, tapping kinetic energy generated by specialized clothing, or even heartbeats. The technology is still in its infancy, but once your heart powers your Google Glasses, there will never be any reason to take them off.
Our very bodies could change shape and form with the help of technology; “body architect” Lucy McRae has already developed a pill that makes you smell like perfume when you sweat, while geneticists have discussed engineering children to be smaller in order to reduce their ecological footprint in an increasingly crowded world. Surgeon Anthony Atala has demonstrated a proof-of-concept 3D printed kidney. Got a bum liver? Just pass the bottle and print out a new one.
Advances in hassle-removing technologies could have a profound effect on human society, but it’s still too early to tell how the changes will play out. How will our bodies, and our society, respond when all labor is handled by robotic workers, 3D printers provide our food (and body parts), and our cars drive themselves?
The Singularity
As man and machine grow ever more co-dependent and Moore’s Law makes technology ever more powerful, futurists anticipate the arrival of an event they call “The Singularity”: the day that superhuman intelligence slips the shackles of our flesh-and-blood confines, either through brain augmentation or through advanced AI and brain uploads.
Different theorists predict different dates, but noted futurist Ray Kurzweil expects the Singularity to occur as early as 2045.
Will the Singularity ever occur? We won’t know for a while yet. One thing is for certain, however: the more we use technology, the more it shapes humanity’s very core — and the more powerful we become.
And before any Luddites bemoan the prospective loss of our sense of self in a sea of ones and zeros, consider this: Socrates thought the simple written word would obliterate individual knowledge and self-identity, as well. From Plato’s Phaedrus:
This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves.
Haters gonna hate, but you can’t stop progress… unless, apparently, you ditch Google Maps.
Source: digitaltrends.com