
Fuzzy Logic

Brief definition for those unfamiliar:

Fuzzy Logic
An extension of two-valued logic such that statements need not be true or false, but may have a degree of truth between 0 and 1. Such a system can be extremely useful in designing control logic for real-world systems such as elevators.
Although I've never used it directly myself (unless you count QM), I also think it makes sense if used correctly. In the real world, things are rarely black and white. Fuzzy logic is a system that allows for shades of gray.
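For anyone who wants to see the 'shades of gray' in action, the common Zadeh operators (min for AND, max for OR, complement for NOT) can be sketched in a few lines of Python; the example predicates and truth values here are made up purely for illustration:

```python
# Fuzzy truth values live in [0, 1]; these are the classic Zadeh operators.

def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

# "The room is warm" might be 0.7 true, "the room is humid" 0.4 true.
warm, humid = 0.7, 0.4

print(fuzzy_and(warm, humid))        # 0.4
print(fuzzy_or(warm, humid))         # 0.7
print(round(fuzzy_not(warm), 2))     # 0.3
```

With truth values restricted to exactly 0 and 1, these operators collapse back to ordinary Boolean AND/OR/NOT, which is why fuzzy logic is called an extension of two-valued logic.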

MPO, is this in relation to anything, or just a spontaneous statement you felt the need to share?
 
I'm a fan of the view that logically there are three types of statement- True, False and Meaningless. This takes care of all paradoxical statements at a stroke.

Part true/ part false is something else. Surely it just implies that you have not sufficiently refined the question?

As for the electronic applications- all the appliances I ever owned involved fuzzy logic, starting with the reason for buying them.
 
It's not in relation to anything; I was just reading a book titled 'Fuzzy Logic', but after reading about three chapters I was still unenlightened as to how it was this 'radical, world-changing new power' that the book seemed to be constantly implying it was. It's just common sense.
 
Soapy Sam, I have never read anything about "True, False and Meaningless" logic. It sounds interesting; could you tell me some reference, please? I took a peek at google but didn't find anything interesting.
 
Soapy Sam said:
I'm a fan of the view that logically there are three types of statement- True, False and Meaningless. This takes care of all paradoxical statements at a stroke.

Part true/ part false is something else. Surely it just implies that you have not sufficiently refined the question?

As for the electronic applications- all the appliances I ever owned involved fuzzy logic, starting with the reason for buying them.

Well, if you have an apple and take a bite out of it, is it still 'an apple'? The question is not meaningless. If 'no', why is it not, and what is it if not an apple? If 'yes', then how about if you take another bite, and another, until you are left with the core? Is it still an apple? If you say 'No, it's a core', but said 'yes' to the first question, when did it stop being an apple and become a core? I suppose though you could answer the first question by saying 'It's an apple with a bite taken out of it'.

So a better example would be the heap of sand grains?
 
edited by Upchurch:
I've moved this thread to the appropriate board. Nothing really philosophical going on here.
 
It is radical in the context of its application in computer software. At the hardware level, the computer knows only 0 or 1, set or cleared, on or off. For all its basic functions, it is stuck within the realm of two-valued logic.

Making software aware, making it capable of learning over time, is a relatively neat topic, and one that was revolutionary at the time of its inception. These days, we almost expect our software to have a certain degree of "intelligence", however we define it.
 
Xouper- yes that is a fair example.

My contact with the idea is from general reading of books like Hofstadter's "Gödel, Escher, Bach", George Spencer Brown's "Laws of Form", and A. J. Ayer's "Language, Truth and Logic", plus general experience of life.

Peskanov- See the above paragraph, but I'm no student of formal logic; if there is a "school" which sees things this way, I have no idea what it is called. Interesting Ian might know, or Bill Hoyt.

Metropolis- It's an apple till whoever is calling it an apple decides not to. Apple is a descriptive label, used for convenience; it is not a property of the fruit. The sort of meaningless situation I mean is where a language or other formal system produces an inherently self-contradictory or nonsensical statement, such as the one Xouper linked to. I tend to believe imaginary (complex) numbers are an example in a very different formal system, but I'm in a minority of one on that. (Grabs tin hat and dives for cover).

NB. I think this is close to hijacking the point of your thread anyway, so I'll shut up, except to say some of my electronics-type mates were enchanted with FL at least ten years ago, so it can hardly be considered new. As for radical- seems like the digital aficionados are slowly realising that a lot of the world is, how can I put this... analog? (Retreats under table)
 
Commander Cool said:
It is radical in the context of it's application in computer software. At the hardware level, the computer knows only 0 or 1, set or cleared, on or off. For all its basic functions, it is stuck within the realm of logic.


To me though this is just an example of why Boolean logic suffices for almost everything you can do with a computer. I've never run into a computer problem I couldn't eventually solve by using booleans and nested "if then" statements.

As far as the apple I figure the routine would run something like:

Procedure "Apple?"
..If Red then
....If Round then
......If Shiny then
........If sweet then
.......... If whole then "Yes"
..........Else "HowMuch"
........Else "No"
......Else "No"
....Else "No"
..Else "No"

Procedure "HowMuch"
..If amount >= 50% then "Yes"
..Else "No"

Procedure "Yes"
..Lprint "Yes, it is an apple."

Procedure "No"
..Lprint "No, not an apple."


So all this says to me that any statement that is mostly true or mostly false can probably be broken down into a series of smaller statements that are true or false. It's just that more of those smaller statements are true than false, or vice versa.
 
Andonyx- given the binary nature of digital computers and processors, a PC is the last place you might expect to find a non-binary problem. The point (I think) is that the "real" world is less easy to digitise. Paradox exists. Shades of doubt exist. We expect that in real life.

I think why FL seems radical to some is precisely because we have grown NOT to expect such analog shading in electronics, having conditioned young engineers to see problems from a digital point of view. Had someone proposed "fuzzy logic" in 1950, I suspect it would have seemed mainstream rather than revolutionary.
 
Andonyx - of course, fuzzy logic as implemented on computers must ultimately break down to true/false, as the algorithms are implemented on binary-logic silicon.

However, the question is how easy it is to encode information or decision making using binary logic versus fuzzy logic.

Fuzzy logic is quite useful in many areas, including control theory. For example, modern PID controllers incorporate fuzzy logic to determine optimal values for the P, I, and D gains.
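To make the control-theory point concrete, here is a tiny sketch of the usual fuzzify / apply-rules / defuzzify cycle. It is purely illustrative: the membership breakpoints and rule outputs are invented, not taken from any real PID product.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heater_power(error):
    """Map a crisp temperature error (degrees C) to a heater setting in [0, 1]."""
    # Fuzzify: degrees of membership in "cold", "ok", "hot".
    cold = tri(error, 0, 5, 10)
    ok   = tri(error, -5, 0, 5)
    hot  = tri(error, -10, -5, 0)
    # Rules: cold -> full power (1.0), ok -> half (0.5), hot -> off (0.0).
    # Defuzzify by weighted average of the rule outputs.
    num = cold * 1.0 + ok * 0.5 + hot * 0.0
    den = cold + ok + hot
    return num / den if den else 0.5

print(heater_power(5))    # fully "cold" -> 1.0
print(heater_power(0))    # fully "ok"   -> 0.5
print(heater_power(2.5))  # in between   -> 0.75
```

Notice that an input of 2.5 is partly "cold" and partly "ok" at the same time, and the output blends the two rules smoothly; that graded blending between rules is the part that is awkward to express with nested if/then on crisp booleans.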

For an analogy, consider sets of linear equations. Many different systems, such as neural nets, simulated annealing, finite element analysis, to name three very different beasts, all end up being expressed as sets of linear equations which are then solved (yes, that statement is ridiculously oversimplified, but let's not quibble). Imagine trying to solve a local optimization problem only using sets of linear equations without using the concepts of simulated annealing (for example). Or trying to write a learning algorithm w/o the theoretical structure provided by neural nets. Can it be done? Of course, and it has been. Nonetheless, it is much easier to understand and tweak a learning algorithm by modifying your neural net, rather than adjusting coefficients on a set of equations.

Abstraction results in cognitive power, not more powerful processing.
 
I understand what you're saying, Soapy, and Roger; it just seems to me that as we can store more data and make faster calculations, even in purely binary systems, our fidelity to descriptions of reality gets ever finer.

Just as real life may not always be easily digitized in a binary system, audio didn't used to lend itself to discrete samples.

In real life, of course, waves are best captured in an analog medium, since they are obviously described by curves, not step patterns.

But with more samples over finer units of time, we became able to describe analog signals in a digital fashion in ways that the vast majority of people cannot tell the difference.

In the case of computers, I bet the further we break down a given circumstance into its binary components, the more accurate our depiction of it gets, even if it's not binary to begin with.
 
Soapy Sam said:
The sort of meaningless situation I mean is where a language or other formal system produces an inherently self contradictory or nonsensical statement, such as the one Xouper linked to.


The Theory of Types (Logical Types/Logical Levels). It answered Russell's Paradox in set theory.

http://plato.stanford.edu/entries/russell-paradox/

Also important in understanding the double-bind theory of schizophrenia:

http://www.goertzel.org/dynapsyc/1997/Koopmans.html

The traditional line of argument in double bind theory is that such interactions involve a confusion of communicative levels, or logical types, and that the participant(s) in the double bind interaction gradually internalize this confusion.
 
metropolis_part_one said:
What's the big deal? It makes sense to me, but I don't see it as radical or anything.

Why should it be a big deal? It's interesting and useful. It also encompasses some common ideas about reasoning, e.g. a chain is only as strong as its weakest link.

The only conflict I can see is between fuzzy logic and classical probability. While this was maybe a big deal twenty years ago, I don't see that it's a big deal now.
 
Andonyx said:

To me though this is just an example of why Boolean logic suffices for almost everything you can do with a computer. I've never run into a computer problem I couldn't eventually solve by using booleans and nested "if then" statements.

That something can be done using a formal system doesn't mean it should be done that way.

For example, pure lambda-calculus captures all computable problems. That is, if something can be done with a computer, then it can be done with lambda-calculus.

The formal grammar of pure lambda-calculus is simple: a term is either a variable x, application of two simpler terms (M_1 M_2), or a lambda-term (lambda x . M) where x is a variable and M is a term.

This syntax combined with a few rules for term substitution, variable conversion, and term reduction is enough to compute anything that can be computed.
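That grammar is small enough to sketch directly. Here is an illustrative Python encoding of my own devising (variables as strings, applications as ('app', M, N) tuples, lambda-terms as ('lam', x, M) tuples) with a naive one-step beta reduction. Note that the substitution ignores variable capture, so it only handles simple terms:

```python
def subst(term, var, value):
    """Replace free occurrences of var in term with value (no capture avoidance)."""
    if isinstance(term, str):                      # a variable
        return value if term == var else term
    if term[0] == 'app':                           # application (M N)
        return ('app', subst(term[1], var, value), subst(term[2], var, value))
    if term[0] == 'lam':                           # lambda x . M
        if term[1] == var:                         # var is rebound here; stop.
            return term
        return ('lam', term[1], subst(term[2], var, value))

def reduce_step(term):
    """One leftmost beta-reduction step, or None if no redex is found."""
    if isinstance(term, tuple) and term[0] == 'app':
        f, arg = term[1], term[2]
        if isinstance(f, tuple) and f[0] == 'lam':
            return subst(f[2], f[1], arg)          # (lambda x . M) N -> M[x := N]
        r = reduce_step(f)
        if r is not None:
            return ('app', r, arg)
        r = reduce_step(arg)
        if r is not None:
            return ('app', f, r)
    return None

# (lambda x . x) y  reduces in one step to  y
identity = ('lam', 'x', 'x')
print(reduce_step(('app', identity, 'y')))  # prints y
```

Repeating reduce_step until it returns None is, in principle, a complete evaluator; the gap between this toy and a usable programming language is exactly LW's point.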

Of course, no one uses pure lambda-calculus for anything in real life (applied lambda-calculus that extends the syntax with constants and special forms is used in functional programming). I once coded the RSA encryption algorithm using pure lambda-terms, and the resulting program worked reasonably fast for up to 10-bit integers. In real life, RSA is used with at least 1024-bit integers...

Now, coming back on topic, fuzzy logic is nice in that there are many domains where it is easy to construct a formal model of a system using it. Control theory has already been mentioned as one such domain. Then you can implement a digital controller by systematically (and automatically) translating the fuzzy logic model into digital circuits, or into a normal computer program that uses boolean logic. In many cases this is faster and less error-prone than programming the equivalent circuit from scratch, and you quite likely end up with a more efficient design.

However, fuzzy logic is not a panacea. It is good in some domains but not-so-good in others.

In practice, you should always use the correct tool for a job. If your problem can be naturally expressed using classical logic, there is no reason to use anything else. If your problem involves probabilities, use probability theory. If it involves "fuzzy" stuff, use fuzzy logic. And so on.
 
Suggestologist said:
The traditional line of argument in double bind theory is that such interactions involve a confusion of communicative levels, or logical types, and that the participant(s) in the double bind interaction gradually internalize this confusion.

For a jolly fine practical example of this, study Interesting Ian's thread on free will in the philosophy area, ignoring my erroneous input and carefully monitoring Paul C. Anagnostopolous' increasing frustration.

PS: Is "double bind" correct, or should it be "blind"?
 
LW said:


That something can be done using a formal system doesn't mean it should be done that way.

......

In practice, you should always use the correct tool for a job. If your problem can be naturally expressed using classical logic, there is no reason to use anything else. If your problem involves probabilities, use probability theory. If it involves "fuzzy" stuff, use fuzzy logic. And so on.

That's just it for me really; I'm not much of a programmer (obviously), and as such, when the only tool one has is a hammer, every problem is a nail.

I mean, look at the way I laid out my fake program! That should give you some idea of what year it was the last time I formally learned a language.
 
I work for a computer security company, and I've designed risk-analysis systems. Basically, the system can decide whether a specific situation is critical or not, depending on various inputs.

Fuzzy logic is an integral part of our approach.

Because a computer attack can happen very, very fast, the system must sometimes decide whether a particular situation is "normal" or really part of a "real attack" without having the complete picture. For example, port probes happen routinely on large networks, so it is impossible to investigate every single instance of them. On the other hand, some worms can be detected by an abnormal level of port probes coming from the same IP, for example.

So the problem is to decide when the system should alert the user of a potential network problem and when it shouldn't, and it must do it in real time.

The system is also designed to identify network addresses that belong to "critical" computers. Again, the term "critical" can mean many things. A production DB server is certainly more "critical" than a simple workstation, and the CFO's workstation may be considered more "critical" than an administrative assistant's (for example).

All these attributes are converted into values between 0 and 1. The system can then proceed to compute the "importance" of a particular network event and offer recommendations.

There's nothing groundbreaking in this kind of approach; thousands of different systems also use these concepts. But it is surprisingly efficient, much more so than simple boolean heuristics.
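For illustration only, the kind of scoring described above might be sketched like this. Everything here (the attribute names, the weights, the saturation point, the alert threshold) is invented for the example; it is not the actual product's logic.

```python
def probe_rate_score(probes_per_min, saturation=100.0):
    """More port probes from one IP -> a score closer to 1.0."""
    return min(probes_per_min / saturation, 1.0)

# Hypothetical asset-criticality table, values in [0, 1].
ASSET_CRITICALITY = {
    "production_db": 0.9,
    "cfo_workstation": 0.7,
    "workstation": 0.2,
}

def event_importance(probes_per_min, target):
    """Combine probe intensity and target criticality into one importance value."""
    rate = probe_rate_score(probes_per_min)
    crit = ASSET_CRITICALITY.get(target, 0.2)
    return rate * crit          # simple product; a real system would tune this

def recommend(importance, threshold=0.5):
    return "alert" if importance >= threshold else "log only"

score = event_importance(80, "production_db")   # 0.8 * 0.9, roughly 0.72
print(score, recommend(score))
```

The point of the fuzzy framing is visible even in this toy: the same probe rate against a workstation falls below the threshold and is merely logged, while against the production DB it triggers an alert, with no hard cutoff hand-coded per machine.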
 
