mumblethrax
Species traitor
Joined: Apr 5, 2004
Messages: 4,991
"The results aren't the same. We don't have anything nearly as reliable or general as HAL."

Again, nobody cares if the results are the same.
I think how the results are arrived at also matters. If NASA carefully calculates the correct place for the spacecraft descent and Glumbo the Chimp throws a dart at the globe and they both hit on the same spot, the results are the same. Does it matter? It does to Glumbo's employment prospects, and likely also to the peace of mind of the returning astronauts and their insurance companies.
"Nobody has made that claim."

Then we've looped back to the start.
If AIs can't ever do it, what are we worried about?
Are the results consistently the same?
If so, then no, the "how the gears are turning behind the scenes" doesn't matter in this context.
Because they are doing nothing beyond symbolic manipulation.
Long story short, we've been replacing/augmenting the "Doers" with machines since basically we became human, and now we're having this big moral freakout over the possibility of the "Thinkers" being replaced/augmented, and I don't know if I buy it as much as others do.
The world didn't end when Steve no longer had to stand on an assembly line screwing in the screw that holds the rearview mirror to the door of the Corvette and a robot arm started doing it. Sure, we worried about what Steve was gonna do now in a work sense, but we saw it as progress.
I question whether it's really all that different because an Algorithm can spit out a basic commercial jingle in 10 seconds instead of having Steve do it.
If there is value to "the high arts" they'll survive based on their inherent worth, and if there isn't, oh well. If it can be destroyed just by having competition I'm not gonna weep for it, provided, as I said, at the end of the day we can get the same functional things. If the Rembrandts and Hemingways of the future can't compete with an algorithm in a double-blind sense of the term, I don't know what problem we can be expected to solve there.
There is an unpleasant air of "Okay, I mean, it was one thing when blue collar workers got replaced with machines, but we're different, we're ARTISTS!" to some of this.
Really have no idea how you get "moral panic" out of "No, we don't have anything like HAL yet."
Dr.Sid said: "Do people do anything more?"

Yes, people are capable of understanding what a color or a prime number is, conveying that understanding to others, attaching meaning to the referents of language, etc.
"Questioning seems kind of crucial to actually understanding what's happening here. There's nothing 'intentionally obtuse' about not letting you shut that down by decree."

Jesus ******* Christ, whatever. The handwringing, the worrying, the questioning, whatever you want to call it. This. This discussion we're having right now. Whatever it is we're doing NOW, we didn't do it then.
When Oog cracked a rock in two to make the first crude knife, nobody cared if the knife had a soul or could really think or understood the internal process of carving a mammoth hide. There was no "Is the knife cutting, or just performing an action it doesn't understand that is exactly like cutting?" question.
Making a crude knife made butchering the mammoth you just took down easier. That's all anybody cared about.
Counterpoint: We privilege thinking beings above all others. We put animals to work without seeking their consent or valuing their freedom, but consider human slavery a moral horror.
Yes we worry about if we understand it. We don't care if the tool does or not.
Nobody, in short, has ever been curious, and if they are, they should cut it out. There's no reason to wonder how a car engine or a microchip works--it either makes your life easier or it doesn't.
"I'm pretty sure what people want for AI is Commander Data."

No, what people actually want are machines that do the boring stuff while we live a life of leisure.
Be so reliable that it can claim to make no errors without making everyone around it burst into laughter.
I just asked ChatGPT which is more likely to be green—a purple hat or a blue smoothie. The answer: the blue smoothie.
Current tools are bad at anything that requires them to understand language as representational, because they don’t.
The hard part isn’t understanding what is being said by reading lips, but understanding what it means.
"Blue is closer to green than purple and you asked which is more likely. So it was as correct as your question allowed it to be."

Except it gave me its "reasoning" and this wasn't it.
I feel like people don't get that AI is like all other tech: it's going to get better at an exponential rate.
"Close" means something different in this context. If AIs are getting the broad, conceptual strokes of something now and 99% screwing up the practical application of it... that's actually pretty close.
Anything AI can do in a "Funny LOL I see what you were trying to do but look at how much you messed it up" way NOW, it's going to be doing very, very, very well 18 months, 36 months, 72 months down the road. Like we're not talking AI perfecting this on the time scale of some detached far point in the future.
Like a few months back the big "tell" was that AI couldn't draw human hands.
1) Rob Liefeld couldn't draw feet and he was the most successful comic artist of an entire decade.
2) Half of cartoonists joke about how they can't draw hands.
3) Seen AI art in the last few weeks? That's not that much of a problem anymore.