The Liar's Paradox - Resolved

I got that, but I read it this way: "S is false or undefined" is equivalent to: "S barometer starbucks."

But it's not.

In particular, you understand the predicate " ... is false or undefined" to be meaningful; if I said that "Vivien barometer starbucks" is false or undefined, you would both understand me and agree with me.

But if you understand the meaning of that predicate, and if you understand which string the symbol S refers to, then you understand the meaning of "S is false or undefined" through simple compositional semantics.

Neither S nor the predicate loses its meaning because someone put them together.


But then, you would say, "Ah, but to know it is undefined means you have to make sense of it. Without understanding it, you can't say it is nonsense."

More or less. That and the fact that the parts are meaningful, which means that simple composition gives you a meaningful whole. (It's not like an idiom where "kick" and "bucket" mean something altogether different when put together.)
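
To make the compositional step concrete, here is a toy sketch -- entirely my own illustration, not a formal theory; the function and the little denotation table are made up, and it deliberately leaves out the self-referential case the whole thread is about. The predicate's meaning is just a function, and the meaning of the whole sentence is that function applied to whatever the subject denotes.

Code:
# Toy illustration of compositional semantics (my own sketch, not a standard
# theory). The predicate "... is false or undefined" denotes a function; the
# meaning of the whole sentence is that function applied to the subject's
# denotation.

def is_false_or_undefined(value):
    """Meaning of the predicate '... is false or undefined'."""
    return value is False or value is None

# Hypothetical denotations for a few subjects (None marks "undefined"):
denotations = {
    '"Vivien barometer starbucks"': None,   # nonsense string, so undefined
    '"Snow is white"': True,
    '"2 + 2 = 5"': False,
}

for subject, value in denotations.items():
    print(subject, "is false or undefined:", is_false_or_undefined(value))

# The interesting case -- a subject S that denotes this very sentence --
# is exactly what the thread is arguing about; this sketch doesn't settle it.

That works for any subject whose denotation you can look up; the fight is over what happens when the subject is the sentence itself.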

That objection I cannot get past yet. So, for instance, "Left of center, reducto plum wheel" is nonsense, but do I have to understand its meaning to know there is a lack of meaning?

In a sense, you do. You know the meanings of the individual parts, and you also know that the meanings don't fit smoothly together -- which is to say that you do understand its meaning, and you know that simple composition won't give you a meaning for the whole phrase.
 
Semantics is an essential part of sentential logic. Without it, one cannot evaluate the terms or discern why (A & ~A) is logically meaningless.

Let A equal "squircle."

Bad example. Assuming you mean to define a "squircle" as something that simultaneously is both a square and a circle... and assuming the usual geometric definitions (i.e. a square is a rectangular rhombus and a circle is a set of points equidistant from the center point), then not only is A not logically meaningless, but I can even draw you (an ASCII representation of) one:

/\
\/

The trick is that the points are equidistant in the taxicab, Manhattan, or L_1 metric.
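
Spelled out, with nothing exotic hidden (this is just the standard definition, added for anyone who wants to check it):

In the taxicab metric, d((0,0), (x,y)) = |x| + |y|,
so the "circle" of radius r about the origin is { (x,y) : |x| + |y| = r },
which is exactly the square with vertices (r,0), (0,r), (-r,0), (0,-r)
-- a perfectly ordinary Euclidean square, just rotated 45 degrees, every
point of which sits at taxicab distance r from the center.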

You may consider this cheating, but in fact this is the general rule of paradoxes. Paradoxes "in the real world" arise because you're making an unwarranted and incorrect assumption -- in this case, you're assuming that we're using a particular kind of space.

Simpler paradoxes can be solved this way, too.

E.g:
Person A : What time is it?
Person B : It's 11:30.
Person C : No, it's 12:30!
Person D : No, it's 2:30!

All three statements are true.

There are two obvious resolutions; the first is that this is a conference call and persons B, C, and D are in different time zones. The second is that this conversation doesn't take place in real time.

Sentential logic is unable to operate on A (a thing that is simultaneously both a circle and a square) because A is semantically null.

Absolutely and unequivocally false. "Sentential logic" (if there were such a thing; I think you mean "propositional" logic) can deal quite comfortably with the idea of a squircle.
 
And:

I realize I am probably out of bounds here, philosophically. My point is that, using symbolic language, any message can be constructed.

"The door is open and closed"

is just a sentence. It is grammatically correct and fully comprehensible, but it does not convey any meaning.

Certainly it does.

However, a real paradox would be a door that was open and closed.

A Dutch door?

[Image: a Dutch door with the top half open]
 
You know the meanings of the individual parts, and you also know that the meanings don't fit smoothly together -- which is to say that you do understand its meaning, and you know that simple composition won't give you a meaning for the whole phrase.

This is attractive, but I'm not sure it describes the whole picture. It seems what you are saying is that there is a hierarchy. The words have meanings as units and then those units (and meanings) are combined logically to give a compound meaning.

That's fine, but I think it glosses over a kind of pattern checking we do in language that we might not in mathematics. For instance, order is important; context is important. Unlike math, where there is correct and incorrect, language seems to have a default of correct -- in the sense that I will mentally strive to make something meaningful.

What it "feels like in my head" is that I am juggling possibilities to come up with a meaning that sits well and doesn't cause discordance. It is as if language were a type of music and my brain detects notes that are out of tune.

I wouldn't do this with math or straight logic though. I wouldn't say, 1 + 3 + 5 makes more sense than 3 + 1 + 5 because the sequential ordering is more pleasing, or has more meaning.

So, for example, in this one: "This sentence, no verb." I automatically insert the missing verb 'has' to make it fit in my brain. Further, it seems like this reading between the lines and treating language as more (and sometimes less) than a formal system is critical.

This is why I think I work backwards in the case of "S is either false or undefined." Because I do not get the uber-feeling of paradox, I can twist meanings in a way that counteracts what would happen if I worked bottom up in the additive way you correctly described. When I go top down, I already have an answer that fits my default mechanism and puts the "proper" meaning on it.

I think this mirrors the sort of ambiguity resolution we do all the time when we hear something like, "FIRE!" and either shoot the prisoner or get a bucket of water.
 
This is attractive, but I'm not sure it describes the whole picture. It seems what you are saying is that there is a hierarchy. The words have meanings as units and then those units (and meanings) are combined logically to give a compound meaning.

Yes, that's what compositional semantics means. Not all semantics is compositional, but most of it is -- and formal logic is almost entirely compositional.

That's fine, but I think it glosses over a kind of pattern checking we do in language that we might not in mathematics. For instance, order is important; context is important. Unlike math, where there is correct and incorrect, language seems to have a default of correct -- in the sense that I will mentally strive to make something meaningful.

Yes. Also well understood. Look up the Gricean maxims sometime. But that's something mathematicians do as well; if a student or lecturer writes something obviously false, you look for a typo or other simple mistake instead of simply assuming the person is wrong. It's related more to "communication" than to any specifically linguistic property.


I wouldn't do this with math or straight logic though. I wouldn't say, 1 + 3 + 5 makes more sense than 3 + 1 + 5 because the sequential ordering is more pleasing, or has more meaning.

You might if you were trying to fit the formula to a narrative -- there are eight people in the car: a grandparent, some parents, and some (grand)children. 1 + 3 + 5 = 8 suggests that there are five children, not five parents.

And notice the Gricean maxim here. You're going to assume I made a simple arithmetic mistake (or even a typo) since that properly sums to 9. You're not going to assume that we're operating in some weird nonstandard meanings of the words where the grandparent also counts as a parent.
 
(I snipped to ask a question) You're not going to assume that we're operating in some weird nonstandard meanings of the words where the grandparent also counts as a parent.

By the way, thank you for explaining that to me. As you can tell, I have no background in linguistics.

I am curious to know if there is such a thing as "Non-Euclidean" in language?
 
By the way, thank you for explaining that to me. As you can tell, I have no background in linguistics.

I am curious to know if there is such a thing as "Non-Euclidean" in language?

Oh, sure. Words and phrases take on specialized and non-standard meanings all the time depending upon context.

Case in point (which I stole from the late, great, and brilliant Asimov), how many syllables are there in the word "unionized"? In a chemistry class (or in a chemistry journal), four. At a labor relations board meeting, three. And "Cambridge" means someplace completely different depending on whether you're at Harvard or the University of London.

For that matter, words can take on different syntactic roles. For example, does it make sense to talk about "three soups"? Normally "soup" is a mass noun (we'd talk about "three bowls of soup"). But in the right context and to the right person, "three soups" makes perfect sense. To a restaurant reviewer, "three soups" means "three types of soup" -- "They have a salad bar with at least three soups, plus chili, every day." To a waitress, it means "three bowls of soup" or "three people who ordered soup" -- "Table six has three soups, two salads, and a cheese fries." To a publisher, it might be three copies of a record or book named "Soup."

But to me, it's largely ungrammatical because (to me) "soup" is a mass noun.

ETA you've given an example yourself. What you do when someone shouts "Fire!" depends on whether you're holding a rifle or a bucket of water.....
 
I think I meant something a bit more radical than ambiguity. More along the lines of a coherent language that seemed nonsensical because of some structural/framing/axiomatic difference.

I can't think of one exactly (hence my question to you) but have in mind something like "dolphin" or one based on synesthesia or maybe something two psychotics might speak to each other.

Now that I think about it, it might relate to how a particular brain works, since it is based on communication, or a "sharing of thoughts."

I don't even know if the pheromone language of ants would count; my instinct is to think of something even more radical than this.

Let me ask it a different way (related to the thread): Does language have to be grounded in reality (and hence, in some fashion, limited by that)?
 
Actually, it has four words and one number. ;P

Still, yes, one cannot argue logical consistency based upon linguistic ambiguity or syntactical ineptitude, as these are imperfections of inference. The entire idea of symbolic notation is that it encapsulates idealistic factors of some underlying important principles. Math is not powerful because it simply abstracts counting. It is powerful because the abstraction has transcended counting to include notions which give sight to our deepest understandings of reality. Therefore, the principle has meaning and must be fundamental in some respect. When such an abstraction can be incontrovertibly shown to be flawed (or consistently imperfect), then it is not just a quibble over perspective (syntax, linguistics, semantics, context). It is a fundamental flaw which, as of yet, has not been shown to be circumventable. It is a paradox. Maybe our greatest achievement in existence is to solve solely one paradox at a time until we have some fundamental system that is incorruptible? I doubt it, but it is an interesting idea of what knowledge is and how it evolves.

That very problem was a big topic of discussion back in the 1920s, but it was proven in a number of different ways that the goal is impossible to achieve. Any logical system we construct that is powerful enough to be useful will either be inconsistent or incomplete. There are problems for which we can prove a definite answer exists, yet we can also prove we will never be able to determine it.
 
I think I meant something a bit more radical than ambiguity. More along the lines of a coherent language that seemed nonsensical because of some structural/framing/axiomatic difference.

Sure, there's lots of them. Try puzzling out Swahili or Malagasy sometime, especially if the only language you know is English. Huge structural differences, and a completely separate vocabulary.

"Speaks an entirely different and unrelated language" is just the far side of a continuum that starts with "uses a form of jargon or slang that I'm not familiar with," extends through "speaks with a funny accent" and later through "speaks a language that's kind of like mine and I can puzzle it out if I work really hard."

The point is that it's not "ambiguity." The person who understands what's going on does not consider it ambiguous. It's simply that that person is situated in a context where he might understand something that wouldn't make sense if it were decontextualized.
 
...as I recall, it has something to do with "language" and "meta-language".
 
What Godel's original proof of the Incompleteness Theorem showed is that there is no way to eliminate or rule out meaningfully self-referential strings in any reasonably powerful formal system.

So the argument that statements like 'this statement is false' must always be invalid or simply meaningless, fails. They can arise not only in casual imprecise language, but in the most rigorously defined formal systems.

Since we cannot disallow them as invalid nor dismiss them as meaningless, we must instead accept them as statements that are neither true nor false. Being neither true nor false, they of course cannot be proven either true or false.

Furthermore, strings representing the statement 'this statement cannot be proven in system X' can also be validly constructed in any sufficiently powerful system X (including arithmetic). Unlike 'this statement is false', which is unprovable because it is not true (nor false), 'this statement cannot be proven in system X' is both unprovable and true in system X.
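
For anyone who wants the compressed textbook form of that second construction (standard notation, nothing specific to this thread): the diagonal lemma gives, for any sufficiently strong system X, a sentence G such that

X proves: G <-> ~Prov_X('G')

where Prov_X is X's provability predicate and 'G' stands for the code (the Godel number) of G. If X is consistent, then X cannot prove G -- and since G asserts exactly its own unprovability in X, G is true on the standard reading even though X cannot prove it.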

It is possible to create formal systems in which "meta-language" is ruled out entirely, but that is analogous to creating a computer language in which it's impossible to write any program that creates an infinite loop. Systems that meet the constraint are so limited as to be practically useless.

Respectfully,
Myriad
 
What Godel's original proof of the Incompleteness Theorem showed is that there is no way to eliminate or rule out meaningfully self-referential strings in any reasonably powerful formal system.

So the argument that statements like 'this statement is false' must always be invalid or simply meaningless, fails. They can arise not only in casual imprecise language, but in the most rigorously defined formal systems.

Since we cannot disallow them as invalid nor dismiss them as meaningless, we must instead accept them as statements that are neither true nor false. Being neither true nor false, they of course cannot be proven either true or false.

Furthermore, strings representing the statement 'this statement cannot be proven in system X' can also be validly constructed in any sufficiently powerful system X (including arithmetic). Unlike 'this statement is false', which is unprovable because it is not true (nor false), 'this statement cannot be proven in system X' is both unprovable and true in system X.

It is possible to create formal systems in which "meta-language" is ruled out entirely, but that is analogous to creating a computer language in which it's impossible to write any program that creates an infinite loop. Systems that meet the constraint are so limited as to be practically useless.

Respectfully,
Myriad
As always, precise and comprehensive.

Respectfully, Hans
 
Wonderful reply. I took a course "Theory of Knowledge" in college to fill out some empty credits in my pre-med courses. This is icing on that logical cake.

In fact, that course was of more use to me than several of the dreary courses that were required in pre-med, e.g. physical chemistry. Never even thought about that again, never used it. Knowing why we know what we know has been vastly more entertaining and useful.

We did study Godel's proof, though we never made the connection to logical fallacies.

Thanks!
 
