
Dumb Square Root Ques.

phildonnia said:
There is no way to define 'i' in such a way that '-i' doesn't also fit the definition. We might all be using different 'i's.

As long as they all work the same way...
 
Robin said:
Wow, you really are picky, aren't you?
Sorry, I should have included a smilie, but I couldn't think of which one would be appropriate.

A computational function may do whatever the software developers damn well want it to do, so an Informix 4GL function may return more than one value.
The idea that I was really driving at, and which I didn't express very well, is that the concept of a function includes the idea of being well-defined: for a given input, there isn't a choice between different possible outputs. There are no judgment calls, no decisions about which answer to give: there's only one answer. With a computer, the same code can result in different outputs, but one can harmonize that with the concept of a function by conceptualizing the peripherals as variables that can take on different values, meaning that the input was not truly the same. Even when a program gives an output consisting of several values, the computer still doesn't choose between values; it gives them all. Computer programs, like functions, have no choice as to what answer to give, and in that sense they follow the same rules.
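A minimal toy sketch of that point (the names here are just made up for illustration): a routine that reads a peripheral looks non-deterministic, but if you count the peripheral's state as part of the input, it's a perfectly well-defined function.

```python
import time

# Looks non-deterministic: the same call can give different results on
# different runs, because it secretly reads the system clock (a "peripheral").
def greeting():
    hour = time.localtime().tm_hour
    return "Good morning" if hour < 12 else "Good afternoon"

# Well-defined version: the hidden input is made an explicit argument,
# so identical inputs always give identical outputs.
def greeting_pure(hour):
    return "Good morning" if hour < 12 else "Good afternoon"

print(greeting())         # depends on when you run it
print(greeting_pure(9))   # always "Good morning"
print(greeting_pure(15))  # always "Good afternoon"
```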

Matabiri said:
Eh? i is defined as "the number which, when multiplied by itself, gives -1" isn't it? No square root in there at all...

Mind you, -i also fits that definition, I suppose...
You are confusing "definition" with "description". If two different things satisfy a description, it's not a definition. There is no such thing as "the number which, when multiplied by itself, gives -1".
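Spelled out, the point is that both candidates genuinely satisfy the description:

```latex
% both numbers satisfy the purported "definition":
i^2 = -1, \qquad (-i)^2 = (-1)^2 \, i^2 = i^2 = -1
% equivalently, x^2 + 1 = (x - i)(x + i) has the two roots x = i and x = -i
```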
 
Art Vandelay said:
Even when a program gives an output consisting of several values, the computer still doesn't choose between values; it gives them all. Computer programs, like functions, have no choice as to what answer to give, and in that sense they follow the same rules.

I agree with what you wrote.

However, I'll use this opportunity to mention that sometimes it is useful to pretend that the computer doesn't do that.

That's the idea of non-deterministic computation: suppose that the computer could guess the correct answer and we had only to prove that the guess is correct. If a problem has more than one solution, then we get any one of the answers.

Some programming languages do include support for defining non-deterministic choices. Of course, in a practical real-life implementation the computer has to try all possible choices (well, a naive algorithm does; better algorithms can reject some bad choices immediately), so you don't save any real computational effort. And the answer we get is the one that the computer finds first while going through the search tree of choices.
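A minimal Python sketch of that "guess, then check" view (the name nondet_solve is my own, not from any particular language's non-determinism support): the "guessing" is simulated by naively walking the whole tree of choices and returning the first guess the check accepts.

```python
from itertools import product

def nondet_solve(domains, accept):
    """Pretend the machine can "guess" one value per choice point and we only
    have to check the guess.  This naive version simply walks the whole
    search tree of choices (via itertools.product) and returns the first
    guess that accept() approves."""
    for guess in product(*domains):
        if accept(guess):
            return guess   # the first acceptable guess in search order
    return None            # no combination of choices is acceptable

# Toy use: "guess" x and y in 0..9 with x * y == 12 and x < y.
print(nondet_solve([range(10), range(10)],
                   lambda g: g[0] * g[1] == 12 and g[0] < g[1]))
# -> (2, 6): one of several valid answers; we get whichever is found first.
```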

Non-determinism is useful when creating programs for some classes of problems. For example, many "hard" problems have a very simple and natural non-deterministic representation. So you can very quickly throw together a prototype implementation of the problem using non-determinism and then check whether that solution is efficient enough for practical use. In many cases it is. In those cases where it isn't, you have to write a conventional, heavily optimized program for it and hope that will be good enough. (It often isn't when we are talking about problems that are truly hard.)
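As a made-up example of such a throwaway prototype, here is the "guess a solution, check it" shape for graph colouring, a classically hard problem; the instance and the brute-force search are purely illustrative.

```python
from itertools import product

# Toy instance: colour a small graph with 3 colours so that no edge joins
# two same-coloured vertices.  The non-deterministic description is one
# line: "guess a colour for every vertex, then check every edge."
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n_vertices = 4

def three_colouring():
    # Prototype via naive "guess and check" over all colour assignments.
    for assignment in product(range(3), repeat=n_vertices):
        if all(assignment[u] != assignment[v] for u, v in edges):
            return assignment
    return None

print(three_colouring())  # (0, 1, 2, 0) -- fine for 4 vertices, hopeless for 400
```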
 
