CSC 28 -- Discrete Structures
-----------------------------

Note: these notes are my guide as to what to lecture on.  They are not
intended as a substitute for lecture, nor should they be considered a
primary teaching tool.  I make them available simply to remind you of
what we talked about.

==========================
Boolean Algebra -- CSC 28
-------------------------

A Boolean Algebra is
  - a set S with special elements named "0" and "1"
  - operations on the set: . + '
where, for all x,y,z \in S

  (x+y)+z = x+(y+z)        Associative
  (xy)z   = x(yz)

  x+y = y+x                Commutative
  xy  = yx

  x(y+z) = xy + xz         Distributive
  x+(yz) = (x+y)(x+z)

  x+0 = x                  Identity
  x1  = x

  x+x' = 1                 Complement
  xx'  = 0

Hence, to show that some B = (S, +, ., ', 0, 1) is a Boolean algebra,
one only needs to show that these five properties hold for all elements
in S.

Ex -- Given a set U, let

  S = P(U),  0 = { },  1 = U
  + = set union
  . = set intersection
  ' = set complement

We need only show that for all X,Y,Z \in P(U)

  (X \union Y) \union Z = X \union (Y \union Z)

etc.  All of which we saw how to do when we studied sets.

Deriving New Laws
-----------------
These five basic laws are enough to derive many new laws.

Ex -- For any x,y \in S, if (x+y = 1 and xy = 0) then y = x'.
      (eg. sets)

Proof:  Assume x,y are arbitrary elements of S and that x+y = 1 and
xy = 0.

  y = y1            Identity
    = y(x+x')       Complement
    = yx + yx'      Distributive
    = xy + yx'      Commutative
    = 0 + yx'       Assumption
    = xx' + yx'     Complement
    = x'x + yx'     Commutative
    = x'(x+y)       Distributive
    = x'1           Assumption
    = x'            Identity

This theorem proves that there is only one complement for any x.  By
definition of complement, x + x' = 1 and xx' = 0, and we just showed
that if x+y = 1 and xy = 0, then y is none other than x' itself.

Duals
-----
The dual of a theorem is created by toggling all 0/1 and ./+.  One
interesting thing about Boolean algebras is that whenever a theorem is
true, so is its dual.
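As a sanity check of the power-set example earlier, the five laws can be
verified by brute force for a small U.  This is only a check over a few
hundred triples, not a proof, and all helper names here are my own:

```python
# Brute-force check that P(U) with union, intersection, and complement
# satisfies the five Boolean algebra laws, for a small illustrative U.
from itertools import combinations

U = frozenset({1, 2, 3})

def powerset(u):
    """All subsets of u, each as a frozenset."""
    items = list(u)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

S = powerset(U)
ZERO, ONE = frozenset(), U
plus = frozenset.union            # + is set union
times = frozenset.intersection    # . is set intersection
comp = lambda x: U - x            # ' is set complement

for x in S:
    assert plus(x, ZERO) == x and times(x, ONE) == x              # Identity
    assert plus(x, comp(x)) == ONE and times(x, comp(x)) == ZERO  # Complement
    for y in S:
        assert plus(x, y) == plus(y, x)                           # Commutative
        assert times(x, y) == times(y, x)
        for z in S:
            assert plus(plus(x, y), z) == plus(x, plus(y, z))     # Associative
            assert times(times(x, y), z) == times(x, times(y, z))
            assert times(x, plus(y, z)) == plus(times(x, y), times(x, z))  # Distributive
            assert plus(x, times(y, z)) == times(plus(x, y), plus(x, z))
print("all five laws hold on P(U)")
```

With |U| = 3 this is 8^3 = 512 triples, so the loop finishes instantly.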
Ex -- Show x = x+x

  x = x+0             Identity
    = x+(xx')         Complement
    = (x+x)(x+x')     Distributive
    = (x+x)1          Complement
    = x+x             Identity

The dual of this law is x = xx

  x = x1              Identity
    = x(x+x')         Complement
    = (xx)+(xx')      Distributive
    = (xx)+0          Complement
    = xx              Identity

This works because all of the initial laws are true for their duals as
well.  So, any proof of a law P using the laws of Boolean algebra can be
rewritten using the duals, which proves the dual of P.

Logic
-----
Notice that a subset of propositional logic forms a Boolean algebra:

  S = {T,F}
  1 = T
  0 = F
  . = and
  + = or

[Verify the five laws]

At a stroke, this gives us a very rich environment in which we can
manipulate logical propositions.  In fact, and/or/not is all we need
because

  a --> b   ==  (not a) or b
  a <--> b  ==  (a --> b) and (b --> a)

So, Boolean algebra can be used to express logical expressions.

Why do we care about Boolean algebra?
-------------------------------------
We care about Boolean algebra because it can represent logical
functions.  Electronic devices use logic to do their computation, and
Boolean algebra gives designers tools to design and analyse solutions.

For the next couple of weeks, we will look at designing electronic
circuits that make simple computations.  For example:

[Draw black box doing two-bit multiplication]

To design such a circuit we will need the following skills:
  - Design a truth table to represent input/output behavior
  - Convert the truth table into a Boolean function
  - Simplify the Boolean function
  - Convert the Boolean function into a circuit

Converting from boolean expression to circuit
---------------------------------------------
We will start with the easiest of these.  Electronic devices are made up
of gates:

[Draw and/or/not]

These gates can be combined into circuits with any number of input wires
and a single output wire.  Every and/or/not in a boolean expression
represents a gate in the circuit.

To convert a boolean expression into a circuit:
  1. Choose the last operation evaluated (the lowest-precedence
     operation, unless parentheses change the order of evaluation) that
     does not yet have a gate drawn.
  2. Draw a gate for this operation and hook up its output
     appropriately.
  3. Goto 1 until all operations have associated gates.
  4. Attach the expression inputs appropriately.

The key is to work through the operations in the opposite direction from
the one you would use when evaluating the expression.

Ex -- Draw a circuit for xor:  x1x2' + x1'x2
      Draw a circuit for the biconditional:  (a'+b)(b'+a)

Converting from circuit to boolean expression
---------------------------------------------
The other direction is easy too.  Convert a circuit into a boolean
expression by:
  1. Pick a wire whose boolean value is known.
  2. Write on the wire a boolean expression representing its value.
  3. Goto 1 until all wires have a boolean expression written on them.
  4. The expression for the circuit is the one written on the circuit's
     output wire.

[Ex -- Draw a random circuit and convert to expression.]

Creating an arbitrary circuit
-----------------------------
We've seen how to convert between boolean expressions and circuits in a
manner that maintains a one-to-one correspondence between gates in the
circuit and operators in the expression.  Given an arbitrary logic
table, how do we realize a circuit for it?  Simple: we look at the
inputs that make it true, and write them out in an expression using
or's.

Example (addition mod 2):

  x1 x2 | out
  ------+----
   0  0 |  0
   0  1 |  1
   1  0 |  1
   1  1 |  0

We want a circuit that is true when (x1 = T and x2 = F) or (x1 = F and
x2 = T).  In the form of an equation, this is

  out = x1x2' + x1'x2

Def:
----
A literal is a Boolean variable or its complement.  A minterm of Boolean
variables x1, x2, ..., xn is a Boolean product y1y2...yn, where yi = xi
or yi = xi'.  Hence, a minterm is a product of n literals, with one
literal for each variable.  An equation written only as the "or" of
minterms is in disjunctive normal form (or sum-of-products form).
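Reading a sum-of-products expression off a truth table can be sketched
in a few lines of Python.  The table encoding (a list of
(inputs, output) pairs) and the function name are my own conventions,
not something from the notes:

```python
# Build a DNF expression from a truth table: one minterm per 1-row.
def dnf_minterms(table, names):
    """Return the DNF as a list of minterm strings, one per 1-row."""
    terms = []
    for inputs, out in table:
        if out == 1:
            # A variable appears plain if its bit is 1, primed if 0.
            term = "".join(n if bit else n + "'"
                           for n, bit in zip(names, inputs))
            terms.append(term)
    return terms

# The addition-mod-2 complement: y = 1 on rows (0,0) and (0,1)
table = [((0, 0), 1), ((0, 1), 1), ((1, 0), 0), ((1, 1), 0)]
print(" + ".join(dnf_minterms(table, ["x1", "x2"])))
# prints: x1'x2' + x1'x2
```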
Alg:
----
To find a circuit for any logic table:
  - Find the rows that indicate a 1 for output
  - Write a minterm for each such row that is 1 on that row's inputs
  - "Or" all the minterms together

This results in a DNF equation for the logic table.  The equation can
then be converted into a circuit.

  x1 x2 | y
  ------+--
   0  0 | 1
   0  1 | 1
   1  0 | 0
   1  1 | 0

  y = x1'x2' + x1'x2

We can use Boolean algebra to simplify the equation:

  y = x1'x2' + x1'x2
    = x1'(x2' + x2)     Distributive
    = x1'(1)            Complement
    = x1'               Identity

Ex:
---
Multiplier:  z1z2z3z4 = x1x2 * y1y2

[Draw table and z1, z2, z3, z4 as separate circuits]

Use the distributive law to reduce z3 to

  z3 = x1y1(x2'+y2')

Functional Completeness
-----------------------
We can construct a circuit for any boolean expression using and/or/not.
This means the set of gates {and, or, not} is functionally complete.

So is {and, not}, because x+y = (x'y')' by De Morgan's law, and so any
expression that can be written using and/or/not can be written using
just and/not by converting all the or's into a combination of and's and
not's.  Also, {or, not} is functionally complete because xy = (x'+y')'.

Are any of the sets {and}, {or}, {not} functionally complete?  In other
words, can and/or/not all be converted into a single type of gate?  No.

{nand} is functionally complete, where x nand y = (xy)'

  x  y | x nand y
  -----+---------
  0  0 |    1
  0  1 |    1
  1  0 |    1
  1  1 |    0

To show {nand} functionally complete, we need to show that and/or/not
can be implemented using nand:

  x'  = (xx)'   = x nand x
  x+y = (x'y')' = x' nand y' = (x nand x) nand (y nand y)
  xy  = (x'+y')'

This last can be converted to nand gates by (1) eliminating all
complements and then (2) eliminating all additions (or's).  This
results in a large number of nand gates.

Karnaugh Maps
-------------
A literal is a boolean variable or its complement.  A minterm is a
product of literals, one for each variable.  A Karnaugh map is a visual
tool to help see relations between minterms.
A K-Map for n variables is a grid of 2^n squares where every possible
minterm of n variables is represented, arranged so that every adjacent
pair of squares represents two minterms which differ in exactly one
literal.  (Note that "adjacent" includes wrap-around across the top and
sides.)

[Draw 2, 3, 4 variable K-Maps]

Notice that any rectangle of 4 squares indicates four minterms in
which, if all are included, the values of two of the variables don't
matter with respect to the other variables in the system.

Ex -- ab'c' + a'b'c' + ab'c + a'b'c == b'

A K-Map gives us a quick way to see such relationships.  We can make a
simplified sum-of-products expression for a function by following
these steps:
  1. Mark the squares of a K-map corresponding to the function.
  2. Select a minimal set of rectangles where
     i.   Each rectangle has a power-of-two area and is as large as
          possible,
     ii.  Each rectangle covers only marked squares, and
     iii. Every marked square is covered.
  3. Translate each rectangle into a single simple product of literals.
     Sum these products.

There is no magic way to do step 2.  Look and play around until you
find the answer.

[Do examples]

Note that this does not necessarily produce the best
expression/circuit, because all expressions made this way are sums of
products, and some expressions can be made simpler.  a(b+c) is the same
as ab+ac, but uses fewer gate inputs.

Don't Care
----------
Sometimes when we are designing circuits, we don't really care what
output the circuit generates for some combinations of inputs.

Ex -- Say we want to guarantee that the output of a circuit is 1 if
both inputs are 1, and 0 when both are 0, but otherwise we do not care.
This can be written as the following truth table, where * means "don't
care":

  x y | z
  ----+--
  0 0 | 0
  0 1 | *
  1 0 | *
  1 1 | 1

We construct a Karnaugh map for this table as usual, except that we
mark the K-map squares corresponding to don't-care outputs specially
(with a *, for example).
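One rectangle-of-four relation can be confirmed by brute force: the
four minterms in which b appears complemented sum to just b', so the
values of a and c don't matter.  The function name below is my own:

```python
# Check that a rectangle of four minterms collapses to one literal.
from itertools import product

def minterm_sum(a, b, c):
    """Or of the four minterms ab'c', a'b'c', ab'c, a'b'c."""
    nb = 1 - b
    return ((a & nb & (1 - c)) | ((1 - a) & nb & (1 - c))
            | (a & nb & c) | ((1 - a) & nb & c))

for a, b, c in product((0, 1), repeat=3):
    assert minterm_sum(a, b, c) == 1 - b   # the whole sum is just b'
```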
Then, when outlining blocks of marked squares, we can, at our
convenience, consider the don't-care squares as either 0 or 1.  Since
we want to make the largest outlines possible, we will sometimes
consider a don't care to be true, and sometimes false.

[Draw K map]

Functional Completeness
-----------------------
We saw that AND, OR, NOT is enough to express every logical statement.
This means that {AND, OR, NOT} is a functionally complete set of
logical operators.

By De Morgan's law,

  P OR Q == NOT(NOT P AND NOT Q)

which means we could do with just {NOT, AND}.  Similarly, we could do
with just {NOT, OR}.

It turns out that we can do even better.  P NAND Q is defined as
NOT (P AND Q), and P NOR Q is defined as NOT (P OR Q).  Both NOT and
AND can be constructed using just NAND operations, and NOT and OR can
be constructed using just NOR operations.  This means that {NAND} and
{NOR} are functionally complete sets of operators.  In fact, many
fabrication processes use only NAND or NOR gates.

[Remaining time spent on multiplication example]
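The NAND and NOR constructions claimed above can be checked
exhaustively over all input pairs.  The helper names below are my own:

```python
# NOT/AND from NAND alone, and NOT/OR from NOR alone: the checks that
# make {NAND} and {NOR} functionally complete sets of operators.
def nand(p, q): return 0 if (p and q) else 1
def nor(p, q):  return 0 if (p or q) else 1

def not_n(p):    return nand(p, p)                 # p nand p == not p
def and_n(p, q): return not_n(nand(p, q))          # not(p nand q) == p and q
def or_n(p, q):  return nand(not_n(p), not_n(q))   # p' nand q' == p or q

def not_r(p):    return nor(p, p)                  # p nor p == not p
def or_r(p, q):  return not_r(nor(p, q))           # not(p nor q) == p or q
def and_r(p, q): return nor(not_r(p), not_r(q))    # p' nor q' == p and q

for p in (0, 1):
    for q in (0, 1):
        assert not_n(p) == not_r(p) == 1 - p
        assert and_n(p, q) == and_r(p, q) == (p & q)
        assert or_n(p, q) == or_r(p, q) == (p | q)
```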