I have some difficulties with transhumanism.
With regard to physical changes -- increasing life span, making bodies more durable, getting rid of genetic diseases, etc. -- I am mostly in agreement. But many transhumanists go far beyond this. They advocate tampering with intelligence: not just preventing genetic conditions that cause intellectual disability (such as Down syndrome), but engineering our genetics to boost the intelligence of 'normal' people.
And some transhumanists go even further, into issues of ethics and morality: finding a way to program people to be more 'moral', to rid society of those elements that transhumanists find undesirable (racists, people who enjoy hurting others, etc.).
When it comes to simply tweaking our physical bodies, I don't have big problems, and would likely support the vast majority of such transhumanist goals.
When it comes to tweaking our brains, I have a bigger problem. Making changes to a person's body doesn't really change the essence of 'who they are' (a person who gets plastic surgery, for example, may look physically somewhat different, but mentally is still the same person). But once we start to play with people's minds, it is a different story. Even relatively minor changes in the brain can have drastic effects on a person's behavior, personality, etc. I'd have significant concerns that experiments in this realm aren't just 'tweaking' us to make our lives better; they involve fundamental changes that will have a far more sweeping impact on humanity as a whole.
Given that, the question is: who has the right to determine what is 'right' or 'wrong' in such a scenario? What is desirable, and what is undesirable?
This becomes an even more obvious problem when you start talking about attempting to directly change how people think, or to control or program their moral and ethical beliefs and behaviors.
This is where that old story of Frankenstein becomes so very, very relevant.
Sure, most transhumanists will argue that such technology and knowledge would be used very carefully, within a democratic system, based on individual choice, etc. But that's a fairy tale -- the vision of someone so dissociated from reality that I really do not want them making the decisions in a project like this.
What would happen in reality is that, inevitably, some people would use this technology in far worse ways. China, for example, might use it to produce a population that is more intelligent, but also more passive and obedient to authority, creating a nation of super-intelligent slaves. Or North Korea might use it to create super-soldiers -- soldiers with incredible physical abilities, endurance, and increased intelligence, but also bred to have little or no empathy for others, perhaps even to enjoy killing, creating a true 'warrior race'.
Anyone who thinks they can create such technology but somehow avoid such abuses is living in a world that has no connection to our own. So my question is this:
Granted that transhumanism could, potentially, bring many benefits that improve our quality of life, would those benefits equal or outweigh the suffering and pain created by the ways it could be abused?
Would being able to live longer, healthier lives outweigh the dangers of an enemy creating super-soldiers to conquer your country, for example? Because once the former is possible, so is the latter.