
The truth: Propositional logic

In many areas of computer science, mathematics, philosophy, law and sociology, logic plays an important role in describing ideas and arguments, making them valid and sensible. In my last article, I talked about arguments made by Shakespeare’s characters in his plays.

Those arguments were phrased and evaluated with categorical logic. The reason we could describe them with this logic is that the logic of categories describes categories: humans, classes, items, animals.

The second logic

You may or may not know another form of logic: propositional logic. Before digging into the logic itself, a common motivation for introducing yet another logical form is the incompleteness of categorical logic. If I claim that if it rains, I will be taking an umbrella to work, and that it is raining, it should be inevitable that I am taking my umbrella to work; after all, it’s raining!

But when we try to describe this with categories, we run into problems. There is no category of rain, nor a category of umbrellas, and even if we made them, saying that umbrellas belong in the category of rain does not describe our problem. To actually capture this argument with categorical logic alone, we would have to make it very artificial.

This is a common problem in logic: if our arguments aren’t both general and natural, either we haven’t phrased them correctly, or our logic does not represent what we are trying to represent.

The solution to this is propositional logic (what a surprise!). Unlike categorical logic, it focuses on assigning a truth value to every statement. It must either be raining or not, so the statement “it is raining” is either true or false.

This means we can give this statement not only a truth value but also a sign: R. And we can give the second statement, “I am bringing an umbrella”, a truth value too: U. Then, our logic here is simple: R implies U is our first fact, and R is our second fact. Note that what we are doing here is only writing down statements that are true. I didn’t write “R is true”; I can just write R, and the fact that it is written already implies that it is true.

Then, adding even more signs, we can write implication as >. So R>U, and given that R is true, we can infer U. This leads us to rules of inference: we’ve inferred U from R. In the most abstract sense, these rules can be doubted, but for any everyday matter, they are true. Other rules of inference include:

* --R implies R: if -R means “not R”, as in R is false, then --R means R: if it is not the case that R is false, then R is true.

* P^Q implies P: if P^Q means “P and Q”, then both P and Q are true; therefore, P is true.

* PvQ, -P implies Q: if PvQ means “P or Q”, then at least one of them must be true; it cannot be the case that both are false. So if P is false (-P), Q must be true; otherwise both would be false and PvQ would be false, giving us a contradiction.
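Since every statement is just true or false, we can convince ourselves that these patterns are valid by checking every possible truth assignment. Here is a minimal Python sketch (the `valid` helper is my own name, not a standard one):

```python
from itertools import product

# An inference is valid when every truth assignment that makes all
# premises true also makes the conclusion true.
def valid(premises, conclusion, n_vars):
    return all(
        conclusion(*vals)
        for vals in product([True, False], repeat=n_vars)
        if all(prem(*vals) for prem in premises)
    )

print(valid([lambda r: not (not r)], lambda r: r, 1))    # --R implies R: True
print(valid([lambda p, q: p and q], lambda p, q: p, 2))  # P^Q implies P: True
print(valid([lambda p, q: p or q, lambda p, q: not p],
            lambda p, q: q, 2))                          # PvQ, -P implies Q: True
```

Each check prints True, because no assignment makes the premises true while leaving the conclusion false.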

This idea of contradiction is also very important. If P^-P, then P is both true and false. This clearly makes no sense: it is not both raining and not raining. So we can say that this contradiction is always false. To say this, we simply negate it: -(P^-P). The parentheses just group the expression: we take (P^-P) as a whole to be either true or false.
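In Python terms, we can watch this negated contradiction come out true for both possible values of P, which is what makes it a tautology:

```python
# The law of non-contradiction: -(P^-P) holds for every value of P.
for p in (True, False):
    print(p, not (p and not p))  # prints True for both values of p
```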

The whole alphabet

This alphabet is composed of:

* Capital letters P, Q, R, S, which stand for statements

* v, for PvQ, as in either P or Q

* ^, for P^Q, as in both P and Q

* -, for -P, as in not P

* >, for P>Q, as in P implies Q

* (), for (P): anything inside the parentheses is treated as a single unit, just like one of the capital letters
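To sketch how this alphabet maps onto something executable, each connective can be mirrored by one of Python’s boolean operators; implication has no built-in operator, so it gets its own function (all the names here are my own choices):

```python
def neg(p):        # -P
    return not p

def conj(p, q):    # P^Q
    return p and q

def disj(p, q):    # PvQ
    return p or q

def impl(p, q):    # P>Q: false only when P is true and Q is false
    return (not p) or q

# The umbrella argument: R>U and R are our two facts; with both true, U follows.
r, u = True, True
print(impl(r, u) and r)      # True: both premises hold
print(neg(conj(r, neg(r))))  # True: -(R^-R), the parentheses are just grouping
```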

Then, these symbols form our language. To actually speak in this language, we need to use logical rules: things like “--R is the same as R”.

Languages have rules

These rules are actually quite numerous, but each one is necessary. To save your time, and because they are actually quite simple, I will list them here, so you can dig into each by yourself. These are:


  1. Modus ponens
  2. Modus tollens
  3. Hypothetical syllogism
  4. Disjunctive syllogism
  5. Constructive dilemma
  6. Simplification
  7. Conjunction
  8. Addition

They seem hard, but are actually quite easy.

-> Modus ponens is the simplest one: given that “p implies q” and “p” are both true, we can conclude q. This is very basic: what it means for p to imply q is that when p is true, q will also be true.

-> Next up is modus tollens: if “p implies q” and “not q”, then we can conclude not p: if p were true, q would have to be true. But wait, q is already false!

-> Hypothetical syllogism is as simple as modus ponens, but with a little twist: if p implies q and q implies r, then p implies r. Assume that p implies q, that q implies r, and that p is true. Then q will be true via modus ponens, and q will make r true as well. Thus, whenever p is true, r will be true.

-> Disjunctive syllogism is a bit trickier, but quite useful. Given p or q, and not p, q must be true: if at least one of p and q is true, and p is false, q must be true.

-> Constructive dilemma may be the hardest: if p implies q, r implies s, and p or r is true, then q or s follows from the premises. We know this because either p or r is true, and whichever one it is triggers its corresponding implication, so either q or s must be true.

-> Simplification is quite a silly one, it's almost a definition: if p and q are true, you can conclude that p is true.

-> Conjunction is quite like simplification, but the other way around: if p is true, and q is true, we can say that p and q is true. Note that I said is and not are: we no longer refer to p and q separately, but conjoin them into a single and statement.

-> At last, addition is simple: if p is true, then no matter the truth value of q, for any q, p or q will be true, because, even if q is false, p will still be true.
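All eight rules can be checked mechanically: a rule is valid exactly when no truth assignment makes all its premises true while leaving its conclusion false. A brute-force Python sketch (the helper names and layout are my own):

```python
from itertools import product

def implies(a, b):
    # "a implies b" is false only when a is true and b is false
    return (not a) or b

def valid(n, premises, conclusion):
    # Check every assignment of n variables; the rule is valid if the
    # conclusion holds whenever all premises hold.
    return all(conclusion(*v)
               for v in product([True, False], repeat=n)
               if all(p(*v) for p in premises))

rules = {
    "modus ponens": (2, [lambda p, q: implies(p, q), lambda p, q: p],
                     lambda p, q: q),
    "modus tollens": (2, [lambda p, q: implies(p, q), lambda p, q: not q],
                      lambda p, q: not p),
    "hypothetical syllogism": (3,
        [lambda p, q, r: implies(p, q), lambda p, q, r: implies(q, r)],
        lambda p, q, r: implies(p, r)),
    "disjunctive syllogism": (2, [lambda p, q: p or q, lambda p, q: not p],
                              lambda p, q: q),
    "constructive dilemma": (4,
        [lambda p, q, r, s: implies(p, q), lambda p, q, r, s: implies(r, s),
         lambda p, q, r, s: p or r],
        lambda p, q, r, s: q or s),
    "simplification": (2, [lambda p, q: p and q], lambda p, q: p),
    "conjunction": (2, [lambda p, q: p, lambda p, q: q],
                    lambda p, q: p and q),
    "addition": (2, [lambda p, q: p], lambda p, q: p or q),
}

for name, (n, prem, concl) in rules.items():
    print(name, valid(n, prem, concl))  # each rule prints True
```

A nice side effect of this style of check is that invalid patterns, like concluding p from “p implies q” and q, come out False.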

So! What at first seemed hard is now manageable and simple, but it isn’t the complete picture. To have a full system for proofs, we must also learn the rules of replacement. These allow us to replace some statement x by some y, given that they always have the same truth value. These are:


  1. De Morgan’s laws
  2. Commutativity
  3. Associativity
  4. Distribution
  5. Double negation
  6. Transposition
  7. Material implication
  8. Material equivalence
  9. Exportation
  10. Tautology


-> So, let’s start. De Morgan’s law is simple: if it is not the case that p and q is true, -(p^q), then at least one of them must be false; if neither were false, p^q would be true. So we can replace an expression like -(p^q) with -pv-q, since at least one of p and q must be false.

-> Commutativity is barely a rule, due to its simplicity. In expressions like p^q or pvq, we can swap the two elements: p^q is the same as q^p, and pvq is the same as qvp. This does not work for things like implication, though: p>q does not mean q>p, since p>q leaves q free to be true while p is false.

-> Associativity is very useful when we need to say things like p or q or r: we can write (pvq)vr, which is true when at least one of the three is true. We could just as well write pv(qvr); we would be saying the same thing. Similarly, with and statements, (p^q)^r is only true when all three are true, and this is equal to p^(q^r).

-> Distribution is used all over math and algebra, and you’ve likely seen it before. a and (b or c), written a^(bvc), is the same as (a and b) or (a and c), written (a^b)v(a^c). These are equivalent because a^(bvc) is true only when a is true together with b, or a is true together with c: the bvc makes the right-hand side of the and true, and a makes the left-hand side true. So we can just say that either a^b or a^c holds: a^(bvc) ⇔ (a^b)v(a^c). This works in a similar manner with or and and swapped: av(b^c) is true when either a is true, or both b and c are true, so we can write it as (avb)^(avc). If a is false, b and c must both be true to make both or statements true; and if a is true, it won’t matter whether b and c are true, since a already makes the whole expression true.
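Both distribution equivalences can be confirmed by brute force over the eight possible assignments of a, b and c:

```python
from itertools import product

for a, b, c in product([True, False], repeat=3):
    # a^(bvc) <=> (a^b)v(a^c)
    assert (a and (b or c)) == ((a and b) or (a and c))
    # av(b^c) <=> (avb)^(avc)
    assert (a or (b and c)) == ((a or b) and (a or c))

print("distribution holds for all 8 assignments")
```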

-> Double negation is by far the simplest rule of all: the opposite of the opposite of a is a itself. Not not a is a. If a is true, not a is false, and not not a is true. Similarly, if a is false, not a is true, and not not a is false.

-> Then, transposition is just another way to write modus tollens: from modus tollens we know that if p implies q, and q is false, then p must also be false. So we can say that if p implies q, then not q implies not p.

-> Material implication turns the very first connective we used, implication, into or and not: p>q is the same as -pvq. If p is true, then p>q forces q to be true, so the q side of the or holds; and if p is false, the -p side holds. Either way, -pvq is true. Conversely, if -pvq holds while p is true, -p is false, so q must be true, which is exactly what p>q says.

-> Material equivalence is the very definition of if and only if. p if and only if q is equivalent to p implies q and q implies p: (p<=>q)<=>((p>q)^(q>p)). This can also be rephrased as “p and q, or not p and not q”: either both are true or both are false: (p<=>q)<=>((p^q)v(-p^-q)).

-> Exportation is when a and b together imply c: (a^b)>c. This is equivalent to a implying that b implies c: a>(b>c). Either way, once a is true and b is true, we can conclude c.

-> Lastly, tautology is simple: ava is equivalent to a, trivially, and likewise a^a is equivalent to a: ava<=>a and a^a<=>a. This is simply a byproduct of simplification, conjunction and addition.
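Since each replacement rule asserts that two expressions always share a truth value, every one of them can be verified by comparing truth tables. A Python sketch covering the main ones (the `implies` helper is my own name):

```python
from itertools import product

def implies(a, b):
    # "a implies b" is false only when a is true and b is false
    return (not a) or b

for p, q, r in product([True, False], repeat=3):
    # De Morgan's laws
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))
    # double negation
    assert (not (not p)) == p
    # transposition
    assert implies(p, q) == implies(not q, not p)
    # material implication
    assert implies(p, q) == ((not p) or q)
    # material equivalence
    assert (p == q) == ((p and q) or ((not p) and (not q)))
    # exportation
    assert implies(p and q, r) == implies(p, implies(q, r))
    # tautology
    assert ((p or p) == p) and ((p and p) == p)

print("all replacement rules check out")
```

Commutativity and associativity would pass the same kind of check, but they mirror Python’s own operators so directly that there is little to see.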

What next?

Next month, I will present to you what it means for us to speak in propositional logic: in summary, how propositional logic can be applied to prove things true or false.


___________________________________

Gabriel Tonini
