Technology, Transhumanism, and Perennial Wisdom

Technology and the future of the human race

Fresno Bee, November 26, 2016

The world’s religious and philosophical traditions teach that a good life requires lifelong spiritual practice. Daily effort is needed to develop virtue and character. Eat your veggies and get some exercise. Care for the weak. And come to terms with suffering and death.

Some view such old-fashioned wisdom as complacent and defeatist. Rather than embracing perennial wisdom, so-called “transhumanists” want technological shortcuts to longevity, morality and happiness.

Some advocate using drugs to improve cognitive capacity, to create happiness or to facilitate empathy. Others want to edit the human genome to eliminate diseases. Robotic surrogates could care for the sick. Medical technology can extend our lifespans. And consciousness could be uploaded to computers, creating virtual immortality.

This may sound like science fiction, but technology is rapidly advancing. When physicist Michio Kaku spoke at the San Joaquin Valley Town Hall Lecture Series this fall, he sketched a transformed future enhanced by nanotechnology, robotics and computing power.

Audience members asked questions about the moral implications of this brave new world. But Kaku brushed those concerns aside. Indeed, at one point he claimed that he couldn’t hear those questions due to a problem with the microphone.

And herein lies a problem. Our low-tech gadgets give us headaches. High-tech solutions could have terrifying side-effects. Plastic surgeries go awry. Today’s wonder drug is tomorrow’s health crisis. Genetic engineering could produce monsters. And so on.


Technophiles trust that the engineers will fix emerging problems. For example, if fossil fuels cause climate change, let’s respond with nuclear energy or geo-engineering. A much simpler solution is to consume less and reduce pollution. But restraint and self-control are antiquated values in a world of instant gratification.

Our culture encourages us to seek material solutions for spiritual problems. In a world of hammers, everything looks like a nail. And in a world of computers, we suspect that there must be an app for wisdom, virtue and happiness.

Hammers are useful. So are computers. But the most important human problems require spiritual responses. A pill cannot make you happy. A robot cannot provide love. Suffering and death can be deferred, but they cannot be permanently defeated.

Last week at Fresno State, a bioethicist from Florida, Dr. Melinda Hall, gave a presentation on her new book, “The Bioethics of Enhancement” as part of the Leon S. Peters Ethics Lecture Series. Hall is a critic of the transhuman vision. She warns that emerging biotechnologies devalue the lives of the disabled by treating disability as a problem that needs a technological fix.

Hall suggests that “disability” is often created by disabling social circumstances. For example, short people are disabled in a culture that puts everything on the top shelf. But rather than engineering bodies to make them taller, we could change our social world so that short people are not disadvantaged by their stature. And we could also learn to value short people as much as we value tall people.

Consider what Stephen Hinshaw and Richard Scheffler call “The ADHD Explosion.” According to their recent book of that title, one in nine American kids is diagnosed with ADHD. Seventy percent of those diagnosed are prescribed medication.


Medication can be a game changer for some kids. But we might consider alternative school structures, weaning kids from electronic gadgets, making sure kids get enough sleep and nutritious food, and other low-tech fixes.

The same is true with the obesity epidemic. We can fix obesity with bariatric surgery. But a different focus would change the social world so that we ate better and got more exercise. We might also develop a more welcoming attitude toward obese people.

We are not all the same. Some are fat. Some are thin. Some are short. Others are tall. But all human beings deserve love and a chance for happiness.

Transhumanists have a fairly narrow conception of what makes for a good and happy life. They forget that death, disability and dependence are part of the human condition. Suffering is a part of life. Good people find meaning in caring for others. And no one gets out of this life alive.

We really can do amazing things with technology. But technological fixes often float on the surface. A deeper approach embraces our differences, our dependence and our mortality.

http://www.fresnobee.com/living/liv-columns-blogs/andrew-fiala/article116931038.html

Moral Brain-Hacking and Moral Education

Science not enough, ideas and thought needed

Fresno Bee, May 16, 2014

Perhaps the solution to crime and other social problems is to fix people’s brains or dose them with love drugs. Moral brain-hacking might be a cheap and effective way to produce moral people.

Moral behavior appears to depend upon chemicals such as serotonin, dopamine and oxytocin acting in our brains. Paul Zak argues in his book, “The Moral Molecule,” that oxytocin levels are correlated with empathy, trust and love. A squirt of oxytocin can make people kinder and more trusting.

Brain structure also matters. Magnetic resonance imaging suggests that a sense of justice is located in the part of the brain associated with higher-level cognition. Antisocial behavior is linked to brain defects.

Locating moral behavior in the brain — and not as the free choice of an immaterial soul — may require us to rethink traditional ideas about guilt and responsibility, punishment and reward, praise and blame. If we follow the neuroscience, it might make sense to “punish” people by requiring them to take drugs or have brain surgery. Locking criminals in prisons with other people who have similarly defective neurochemistry may eventually seem, well, medieval.

Spiritually inclined people may be dismayed by this materialistic focus. Brain-based discussions ignore the soul and the moral conscience. Neuroscience dusts the angels and demons off our shoulders, focusing our attention on the space between our ears.

Those who think that consciousness is distinct from the brain have to explain how Prozac, Ritalin, marijuana, and St. John’s wort are able to change experience, mood and focus. The attitude adjustment provided by a glass of wine or a cup of coffee can make you wonder whether there is anything more to the mind than the brain and its chemistry.

Some may feel that this materialistic focus misses the really big picture of why morality matters. If moral experience is reduced to brain science, traditional metaphysical notions of good and evil may be lost. A brain-based view of personality rules out punishment and reward in the afterlife. The move from the soul to the brain involves a radical reassessment of the meaning of morality and of life itself.

The focus on brains does, however, overlook the importance of ideas and education. Even if we admit that experience is based in the hardware of the brain, we still need the software of consciousness — ideas and theories — that allows us to interpret our experience. A dose of oxytocin may be able to stimulate empathy. But empathetic emotional responses are devoid of content.

Ideas and ethical theories tell us how to act on our emotional responses to the world. Does caring for a loved one mean I should pull the plug and let them die — or keep them on life support? Does empathy for murder victims mean that criminals should be executed — or should empathy extend to criminals?

To answer those kinds of questions we need ideas. Pills, potions and powders can only take us so far. The brain’s capacities and responses are channeled by the stuff of thought: ideas about right and wrong, theories of the good life, models and heroes, and the whole range of issues that arise in the context of moral education.

Ideas cannot simply be reduced to chemical signals in the brain. Does that mean that ideas float freely in a world apart from physical reality? There is a deep mystery here. What is an idea like “good” or “evil” made of? Where do ideas dwell? And how do we know them? Those kinds of questions can really blow your mind (or brain or soul?).

Neurochemical enhancement can’t entirely replace moral education as traditionally understood. Religion, philosophy and literature fill the brain with ideas that guide, bewilder and inspire. Neuro-ethical hacking may make moral education easier. But the neurotransmitters cannot tell us whether brain hacking is a good idea. For that we need moral argument and critical thinking.

Neuroscientific enthusiasm may lead us to miss the moral forest as we gaze in fascination at the neurological trees. Some of us could benefit from a chemically induced compassion boost. But a compassionate brain without moral ideas is empty. A moral person is both a brain and its ideas. And those ideas come from good old-fashioned moral education.

Read more here: http://www.fresnobee.com/2014/05/16/3930743/science-not-enough-ideas-and-thought.html#storylink=cpy

 

Technology is not to blame for evils of society

Fresno Bee

October 18, 2013

http://www.fresnobee.com/2013/10/18/3560214/technology-is-not-to-blame-for.html

We live in a culture of mass distraction. It is easy to tune out and look the other way. The ability to ignore things is a useful adaptation. We can’t respond to all of the inputs that assail us. We’ve got work to do and our own concerns to attend to. And mostly, we want to be left alone.

But detachment and dissociation can be dangerous.

In San Francisco recently, a student, Justin Valdez, was murdered on a crowded train. Passengers, engrossed in tablets and phones, failed to notice the murderer brandishing his weapon in plain sight. The San Francisco District Attorney said that bystanders were “completely oblivious to their surroundings.” The police chief warned that people absorbed in technology are vulnerable to crime.

The Valdez murder brings to mind Kitty Genovese, who was murdered in 1964 while bystanders ignored her calls for help. This case is frequently cited in ethics and psychology textbooks as an example of diffusion of responsibility and the bystander effect: individuals in groups assume that others will act, and so no one does. The new problem is distracted bystanders, who don’t even notice threats.

But we should be careful about assigning blame. Technology is not to blame for the Valdez murder, nor are the bystanders — the shooter is. And while we might like people to be more aware of their surroundings, we have a right to tune out. It’s the criminals who are wrong to take advantage of the vulnerability this creates.

Electronic technologies make it a bit easier to ignore our immediate surroundings. But there is nothing new about zoning out. Before cellphones, there were books, magazines and crossword puzzles. And in crowded places, it is polite to ignore others. We avert our eyes in hallways and on elevators, respecting the privacy of others.

Some fret that high tech makes it too easy for us to be “alone together,” as MIT social scientist Sherry Turkle put it in a book with that title. Turkle worries that virtual reality and communication destroy real intimacy and human empathy. I share that concern. But there are lots of things that destroy intimacy and empathy: racism, sexism, alcoholism, etc. Virtual reality has no corner on the market of callousness.

Intimacy and empathy are important. But they are also hard work. We can’t be empathetic and aware all the time. Tuning out is a coping mechanism in a hectic, crowded world. Sometimes we need to retreat to solitude, disconnect, and disengage. We nap. We daydream. We meditate or pray. And sometimes we poke around on our cellphones.

As with most issues, the context matters. It’s rude to check email in the middle of a face-to-face conversation or to surf the web in a business meeting. And texting while driving can kill. But public transportation should be a safe place for tuning out. We ride the bus or take the train, expecting to have the freedom to read, nap or listen to music.

The world might be a better place if we were all constantly engaged with one another, if we all acted as good Samaritans all the time. But a world of good Samaritans could also be oppressive. Imagine a world where everyone is watching everyone else, looking for opportunities to help. Imagine a world of incessant empathy, where everyone is trying to connect — even in elevators, on buses, or in other crowded spaces. That world would be exhausting. And it would lack zones of privacy and places where we can be alone, even when we’re together.

A broader culture of intimacy and empathy may prevent random violence. But there is no easy answer or high-tech solution here. There is no app for foiling murder — or finding love. We tend to blame technology and hope for technological solutions to the perennial problems of being human. Our obsession with technological issues may be the biggest distraction of all. We blame our tools or hope for a better tool, while ignoring the persons who use them.

We can’t blame technology for malice or alienation. Nor can we blame technology for making us clueless and oblivious to our surroundings. Evil and obliviousness are human problems. And they existed long before the iPhone was invented.


 

Ethics of Brain Hacking

Brain hackers raise social questions about learning, understanding

June 28, 2013

 

Is it ethical to use “smart drugs” to improve cognitive function?

Legal concoctions of vitamins, herbs and nutrients are advertised as improving memory, focus and mental acuity. Makers of some of these supplements claim they can produce lucid dream states and lessen the need for sleep. And prescription drugs are being used in illegal ways as mental stimulants, aimed at enhancing memory and concentration.

So-called “brain hackers” claim that cognitive function can be enhanced by sending mild electrical current through the brain. At least one company is marketing a transcranial electrical stimulation device to video game players as an upgrade for the gamer’s brain.

Assuming that these things really work, one obvious ethical issue is health and safety. But if we assume that neuro-enhancers can be used safely, another ethical issue is fairness. It doesn’t seem fair for people to artificially enhance performance in school or in business, especially if these enhancements are not widely available to everyone.

One might also worry that the learning that occurs through brain hacking doesn’t really count. It seems like cheating. Of course, these products won’t do the learning for you. They help you focus and retain information better and faster. But you still have to do the studying. If it is acceptable to drink coffee during a cram session, is it also acceptable to use another, more powerful chemical that can help you focus even better?

If learning is primarily about creating pathways in the brain, resulting in new skills and abilities, then there is nothing inherently wrong with brain upgrades that help build those pathways more quickly. Flashcards help and so might a drug. Result-oriented learning will encourage the use of the most efficient tools. From a result-oriented standpoint, it doesn’t matter that you took a chemical shortcut so long as you actually end up knowing the thing you set out to learn.

But learning and thinking are not only a means to an end. They are also ends in themselves. Aristotle suggested this when he said that learning gives us the liveliest pleasure. One source of the pleasure of learning is the resultant mastery, the ability to perform or do something as a result of learning. But there is also pleasure in the very process of practicing and working at mastery. Is the road of learning enjoyable for its own sake, or is the point to achieve mastery as quickly as possible?

The brain hackers want to shorten the process, perhaps underestimating the pleasures of practice and study. They are primarily focused on performance and achievement. If a shortcut can be found, why not take it?

But Aristotle and others would argue that the road matters as much as the destination. Learning and thinking are also deeply social activities, which build connections with other people through the shared effort of the process. There is no mechanical or pharmaceutical shortcut to building community and developing relationships.

In a culture of high-stakes testing and dog-eat-dog economic struggle, it makes sense that people would want to hack their brains, looking for a competitive advantage.

In our culture, there are tangible rewards for those who can process and recall information quickly and accurately. Quick thinkers get better grades, bigger scholarships, and higher-paying jobs. Slow thinkers are left in the dust.

But quick processing and recall skills are merely mechanical: machines can process and recall information much faster than we can.

Machines cannot, however, evaluate what is worth thinking about. The brain hackers are focused on the question of “how fast?” But they forget to ask “how come?”

There is no quick answer for the deeply human question of what matters and why it matters.

Existential questions require unhurried contemplation. But our caffeinated, video-game culture has no time for ruminating and mulling things over.

We spike our brains, filling them with images and words from dawn to dusk.

We are competitive thinkers, looking for an edge in a world that has little patience for the poets and dreamers who pause to wonder about the point of the hustle.

In the end, we may find that the faster we arrive at our destination, the less we understand why we wanted to get there in the first place.