stamenflicker
Muse
Joined: Apr 8, 2004
Messages: 869
Any takers on a "philosophy" of the month? I stumbled across this website during an exchange with Tricky last week. Here's a summary of the "philosophy," if it is fair to use the word:
http://www.transhumanism.org/index.php/WTA/more/transhumanist-values/
It seems worth debating whether their view of the "Space of Possible Modes of Being" is philosophically sound (see Figure 1 in the link). The idea speaks to the concept of "limits," touched on by Wittgenstein and others. But does it hold up logically?
Second, it seems to be an interesting place where science, postmodernism, and theology could work together, should they choose to:
(from the site)
Transhumanism does not entail technological optimism. While future technological capabilities carry immense potential for beneficial deployments, they also could be misused to cause enormous harm, ranging all the way to the extreme possibility of intelligent life becoming extinct. Other potential negative outcomes include widening social inequalities or a gradual erosion of the hard-to-quantify assets that we care deeply about but tend to neglect in our daily struggle for material gain, such as meaningful human relationships and ecological diversity. Such risks must be taken very seriously, as thoughtful transhumanists fully acknowledge.
Also, in the realm of ethics, to what extent should the "future" play into our current value structure? I mean more than just environmentalism or world hunger; I mean specifically the values with which we apply knowledge, scientific understanding in particular. Here's a clip:
According to Lewis’s theory, something is a value for you if and only if you would want to want it if you were perfectly acquainted with it and you were thinking and deliberating as clearly as possible about it. On this view, there may be values that we do not currently want, and that we do not even currently want to want, because we may not be perfectly acquainted with them or because we are not ideal deliberators.
The idea that there are values we do not currently "want to want" is of interest to me, considering how such unwanted values may shape the future of technological discovery. Asking what we do not "want to want" also seems valuable for determining whether those values will matter down the road. I have thoughts, but I'm more curious about yours.
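As an aside, Lewis's biconditional has a fairly clean logical shape. Here is a rough sketch in counterfactual notation; the predicate names (Val, Acq, Ideal, Want) are my own shorthand, not Lewis's or the site's:

$$
\mathrm{Val}(A,x) \;\iff\; \Big[\big(\mathrm{Acq}(A,x) \wedge \mathrm{Ideal}(A)\big) \;\square\!\!\rightarrow\; \mathrm{Want}\big(A,\,\mathrm{Want}(A,x)\big)\Big]
$$

Read: x is a value for agent A if and only if, were A perfectly acquainted with x and deliberating as clearly as possible, A would want to want x. The box-arrow is the counterfactual conditional, and the nested Want captures the second-order "want to want." On this reading, the situation I'm pointing at is just the case where Val(A,x) holds even though, here and now, neither Want(A,x) nor Want(A,Want(A,x)) does, because the antecedent conditions don't actually obtain.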
Flick