Last week I really thought that people like Francis Fukuyama and Jürgen Habermas had been right all along. Both have claimed, in different writings, that modern (and especially future) technology will cause our fragile human nature to deteriorate and, in effect, dehumanize us and our societies.
What got me thinking was an article published in my local newspaper, Helsingin Sanomat. It was based on a post on www.everydayhealth.com penned by Keith L. Black, MD. The writer asks a serious question about a long-lingering claim that modern communication technology actually has a negative effect on our brain. Namely, it is affecting our memory and our ability to feel. This is because we don’t have to use our brains so much, and because modern (social) communication technology has conditioned us to expect fast and immediate gratification.
In this text I will present some objections to the myth of dehumanization and claim that the speculative writings I got so worried about should not be used to draw hasty conclusions in support of the dehumanization claim. In the end I’ll speculate about the need for a pro-integration culture in the technoprogressive community.
The danger of social media is not a new idea, and www.prdaily.com has gathered a small sample of similar discussions. While technology is having an impact on culture and our social order, the alteration of the way humanity thinks and feels is a much bigger problem for the technoprogressive community. Outsourcing cognitive functions and routinely “feeling” the “like” button may already have set humanity on a downward trajectory.
The ability to remember things is often thought to be a hallmark of being human. It is also difficult to fathom complex mental processes without the ability to handle huge amounts of data in a very complex memory system. Therefore, losing or hindering the ability to remember may in fact wreak havoc on human nature.
Is this really the case? Or is a more productive question perhaps: how can we continue harvesting the benefits of technology even while there may be some integration issues? This is an important question, especially if you, like me, are waiting to get your hands on things like the upcoming Google Glass device.
The monster in the attic
Although the speeding up of communication has played a major part in what is called modernization, the precise “bad guy” here is social media.
It is known that memory deteriorates if not used. According to the ongoing ‘neuroplasticity revolution’, brain functions like memory and cognitive abilities can be molded all through our lives. As the black box starts to open, it seems we have to rethink how to handle the interaction between the brain and its environment.
Obviously there are two sides to this coin. On the one hand, we can expect to keep our brains healthy and functional by allowing them suitable exercise now and then. All we have to do is use our brains the old way: do some maths, solve word puzzles and, in general, do real things in a real world. On the other hand, our lifestyle and our environment may also increase the “damage” our brain may suffer.
It is toward this latter part of the story that communication technology is now believed to be heading. By using life-simplifying features like search engines, we are actually moving the “processing” from our brains to a computer. And when the brain is not used, it grows weak, like muscles without exercise.
Breaking down the argument, we arrive at some basic assumptions.
1. The use of task-specific systems, like search engines and automated error-correcting algorithms, leaves our brain unused.
2. As our emotional ability is woven into technical systems, the scope of our emotional depth decreases.
There are at least two possible conclusions here. One is that we are in fact doing harm to ourselves as humanity. The other is that we are adapting.
Dehumanization is a well-told story
The outcome of these premises is that our biological brain loses effectiveness in memory and in menial cognitive tasks like counting, grammar and handling information. But is this what the critics of Transhumanism call dehumanization?
To be exact, there is no common ground in understanding what dehumanization actually is. As a general rule, it denotes something that weakens or removes the “x-factor” from individuals and/or human culture. Very often the term also implies moral corruption “in human nature”. Often, as in H. G. Wells’s The Time Machine, we see both of these dimensions meshed into one.
Are we talking about culture or biology here? Technology will continue to change our cultural habits and even cultural forms. Whether this is good or bad depends on some moral judgements. Conservative thinkers in all ages have claimed the latter. The dehumanization argument is all about opposing the “new” since it threatens the “old”. There are even some very cultivated theories based on this view. A kind of tome of all cultural pessimism is Oswald Spengler’s The Decline of the West.
The other way is to look at dehumanization as a personal or ‘biological’ process. In this case, the MRI scan is the method that shows ‘objective’ changes supporting the argument of “the fall of man”. The idea is to link concrete brain analysis with what is happening to the “subject” in the real world. This field is in itself a source of much inspiration for Transhuman philosophy, but it also puts that philosophy to an empirical test.
So, either social media alters our “culture” or our brain. The outcome of, and conclusions about, this are what matter. The pessimist conservative now claims that we are getting dumber as our technology gets smarter. And I claim that, so far, dehumanization is a story with no validity whatsoever.
So, what’s a brain anyway?
At this point it is well worth noting that we still don’t have the full picture of the true effects – if any – that social media technologies have. However, a fairly recent study conducted on London taxi drivers shows that even adult brains can be drastically restructured. Such findings are possible thanks to modern medical research methods like the MRI scan, which enable scientists to see how individual neurons and nerve pathways develop and function. But is this the whole truth? No way. And this is why.
If we look at the history of almost any popular technology, there have always been critical voices. Of course, open discussion in society is a necessary part of technological progress, but it also opens the door for pessimism and unfounded fear of “the new”. Social media is controversial. It holds many of the promises the entire “internet revolution” had, such as an open society, freedom of expression and the diffusion of political and even economic power.
When television took over the world there was a strange backlash that involved moral arguments about “people who only watch television”. It was claimed that television makes us passive, exerts bad influences and breaks down “culture”. It was even feared that kids would not develop properly if they stayed indoors and watched television instead of doing sports out in the open. And it was widely believed that watching television affects our brain. At the time it was not possible to do an MRI scan, but still, the culprit had been named. In general, the mass media during and after the Second World War have always aroused suspicion among the elite. Ominously, many of our best-loved conspiracy theories revolve around the “brainwashing” done on television.
And then there was the case of video game violence. There is no evidence that playing violent video games makes the ordinary person more violent, but people who are prone to violence do tend to gravitate towards violent video games. And still, they make up only a small margin of a billion-dollar market.
But the speculation about the harm of modern communication technology has a deeper undertone. It is not, for instance, merely claimed that social media as such makes us dumber. The argument is one of physical harm, almost as if we needed seatbelts or UV protection against it.
Is there a way out?
Yes, there is. I’m not going to call into question the validity of Keith L. Black’s claims here, since I really am not qualified to do so. I will, however, point out some ideas that at least open a way out, even if our brain gets “damaged” by the use of modern technology.
At this point we know the “dehumanization” story being told here. And it really is just a story. Sometimes the outcome of technology is bad and tragic. But put DDT aside for a second. What I think, and many others may share this idea, is that dehumanization is just a story interpreted from somewhat vague facts. Facts that are visible only because the people who “find” them don’t really want a different explanation. Sounds like the yellow press to me. Any court would throw such evidence out in a minute.
Technology does not pop out from nothingness, and it does not occupy an area of reality by simply falling onto something. Technology is negotiated. That means people and cultures incorporate it. This is well known from marketing research, where marketers struggle to figure out how a particular application or service will actually be used by consumers.
People will respond, societies will respond and yes, technology will respond since it is built by the culture and the social order it exists in.
And that’s the way out.
We can always ask: how severe is this “damage”? Sure, it’s bad to have one’s short-term memory diminish in efficiency, but can we calculate a minimal accepted level? And what’s the trade-off?
What does the “memory loss” argument actually mean? For one, it is too soon to say whether or not the effects are in fact negative. So far we (claim to) know that the brain can change radically when interacting with the environment. Does that then mean we turn into idiots – or, using Fukuyama’s terminology, get dehumanized?
What seems to be happening is that the brain is shutting down neurons it no longer needs. Okay, this is a layman’s view, but still, that’s the basic idea. Does this mean that the brain is getting “worse”, or is it that the social media environment is not yet advanced enough to take over these processes?
For instance, I use a search engine to find out how difficult English words are spelled. I just type the word and Google corrects it. Foreign names especially are sometimes very difficult to remember. I also use different apps to keep track of news and world events. Sure, I find out the “facts” in seconds, even though I know that true knowledge is attained through days or weeks of research. And yes, I try not to use my memory, since I can upload any important stuff to a cloud service and use tags to retrieve it if needed.
But wait. What’s the problem here? This kind of “error correcting” and these “search capabilities” have been around in word processors for over a decade. Have professional researchers lost some important abilities? In fact, many productive people have found computer-assisted data or time management a great tool for getting things done.
And of course there is the radical vision of the human–computer hybrid. I consider myself a cyborg, since I have outsourced some mental processes to international corporations like Google and Facebook. I believe that if there ever is a transition from human to cyborg, there will be a few missing links that have to live with the downsides of both worlds. It seems we need to make sure we don’t let go of being an active subject, even while our “abilities” may well be simulated outside of our bodies.
Integration is the Next Big Thing – accept it.
When my great-grandparents worked on the farms of their time, their brains were probably wired much differently than mine. There may even be differences in how they and I perceive the world (this is purely speculative, though). My three-year-old daughter does not understand what a telephone is. She does not recognise the “feel” of the machine, and when we play “call daddy” she does not understand that hanging up is actually a very physical thing to do.
Our brain is not a black box but is in constant flux with external reality. This means that the locus of “me” or “human” is not a fixed point either. And that means there is no fixed “person” to get dumber or smarter. We may argue that social media is a system of non-personal intelligence that, from an Enlightenment point of view, adds to the value of humanity.
A person is an instance, a self-conscious instance, that can achieve happiness and understanding now and in the future. A neuron here or there is not something to worry about too much. Instead we should ready ourselves and our societies to leap beyond the body and beyond the person or the “I”.
The question is not what is happening to our brain (if anything) but how we integrate it into social media so that the end result is the same as, or better than, what we are now.
Ilkka Vuorikuru is a PhD student in sociology of science and technology at the University of Turku, Finland. He works as a Technoculture Adviser, journalist, coach and motivational speaker.