“What is the value of 8 ÷ 2(2 + 2)?” and other Internet confusions

I received the email reproduced below a few days ago. (I have made a couple of minor presentation edits for publication here.) I get a lot of emails from laypersons and journalists every time one of those social media viral exchanges comes along. This particular one began by referencing a video of a television interview I had done on that title question back in 2019 when I was a visiting professor in my native UK; here it is.

This email was interesting because it highlighted something I long ago realized: that many people — among them some school mathematics teachers — have a perception of algebra that mathematicians abandoned in the seventeenth century. I’ll come to that presently. First, here is that email.

The (first) email

START

Re: The viral maths equation ‘8 ÷ 2(2 + 2)’

Hello Dr. Devlin,

I saw you featured in a YouTube video on the subject of the viral maths equation ‘8 ÷ 2(2 + 2),’ and I have a few thoughts about it.

Every few years, it seems, someone writes another of this type of mathematical statement, claiming that the definitive answer is such-and-such — even though there is some dispute as to how to solve the statement. There really is nothing confusing or ambiguous in the way these statements are being written. Here’s why.

A few years back, there was this statement: 6÷2(1+2).

Let me pose a word problem to illustrate how this is properly solved:

Let’s say you have a half-dozen cupcakes. Sitting at each of the 2 tables in the room is 1 girl and 2 boys. You’re giving the cupcakes to the children. How many cupcakes does each child get?

6÷2(1+2)

2(1+2) is a term, specifically a monomial — like 2x. A monomial has a single value of the PRODUCT of the coefficient multiplied by the variable (factor). Since a monomial has a singular value (indicated by juxtaposition), no additional brackets are ever necessary around a monomial.
 
Now let’s say that the cupcakes came in 2 packages, with each pack containing 1 vanilla cupcake and 2 chocolate cupcakes. The statement of “cupcakes divided by children” can now be written as:

2(1+2)÷2(1+2)

or with a slash…

2(1+2)/2(1+2)
 
Now replace what’s inside the parentheses [1+2] with the variable “x,” making the statement: 2x/2x

Obviously, the quotient is not “x squared,” because you’re dividing one monomial by another monomial — which, as you know, is a FRACTION (whether written vertically as something-over-something, or written horizontally on a single line using a slash or obelus). With monomials, you don’t get to peel off the coefficient & do some other operation with it as if it had no bearing on the variable. Again, there is never a need for additional brackets around a monomial because of its inherent singular value (indicated by juxtaposition).

This is correct, isn’t it?

END

The email exchange

I won’t reproduce my reply and the brief exchange that followed. Let me just say that after two exchanges I suspected my correspondent was not interested in learning anything; rather, they wanted to explain to me why I was wrong, both in my video interview and in my initial reply. I get a lot of emails like that. In any event, the heart of my initial reply was this.

Yes, if we regard that symbolic expression “8 ÷ 2(2 + 2)” as simply a formal representation of the cupcake scenario, then what my correspondent wrote was correct. But that is to impose a semantics on the expression, whereas a big part of algebra, as it developed from the sixteenth century to the late nineteenth century, was to provide an axiomatically-defined, precise “language of mathematics” (and by extension “language of science”), built on an abstract ontology, that can be used in many domains — a “mathesis universalis”, to use a classical term. And in that framework, that expression is not well-formed, and as a result is ambiguous. [Only later in the exchange did I point out that the puzzler they quoted is not an equation but a formal expression (one that can be part of an equation).]

I sent my initial reply and then kept the exchange going beyond that because I suspected my correspondent was a math teacher, and wanted to help them bring their thinking up-to-date for the sake of their students. But when their second email did little more than rephrase the same argument, this time using scholastic-Latin words such as vinculum, solidus, and obelus, I feared the worst: a dogmatic math teacher trapped in ancient Greece. Oh dear. But after a couple of further emails from them I realized they really were trying to understand what the issue was. Email is a crude form of communication, particularly with someone you know nothing about. I was glad I had (as is my practice) not let my suspicions get in the way of trying to answer questions as helpfully as I can. It did appear, though, that my correspondent did not have sufficient background in modern mathematics to appreciate the points I was trying to make. In that, they are by no means alone. Read on.

By the time I sent my final reply, I’d got enough written down in my three reply emails to turn it into a short essay on the (significant) issues that lie behind those Internet “debates”. The fact that they keep arising shows the degree to which society as a whole has yet to take on board the profound revolution that took place over the past five hundred years, not just in the nature of mathematics in general, and not just in algebra, but in the very concept of numbers.

What is algebra and how did it begin?

Much of the powerful elegance of algebra as understood today is that its grammar and calculational rules apply whatever the letters or symbols in its expressions are taken to refer to. That power is particularly important in calculus, which helps explain why calculus inventor Isaac Newton played a role in the creation of the number concept required for today’s algebra. (Until then, numbers were quantity-species pairs, much like our present-day currency, with “numbers” such as $5:45c.) Any college calculus instructor will tell you that when a student says they find calculus hard, the problem invariably lies with an inadequate understanding of algebra.

[ASIDE: So great is the difference between the algebra that was practiced for many centuries prior to the time of Newton, and the subject as understood today, that some historians of mathematics refer to the former as “pre-modern algebra” to avoid confusion.]

It is because algebra as understood today has so many applications, that when a mathematician is asked

“Does 1 + 1 = 2?”

they will likely reply “It depends.” 

When the expression “1 + 1” is interpreted in terms of everyday arithmetic, it is indeed true. But equally, “1 + 1 = 10” can be true, being an identity that lies at the heart of every digital device in today’s world. So it really does “depend”.
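As a quick illustrative aside of my own (not part of the original remark): in base 2 the numeral “10” denotes the number two, which is exactly why the identity holds there. A few lines of Python make the point:

```python
# In binary notation, the two-digit numeral "10" denotes the number two,
# so the identity 1 + 1 = 10 is true when "10" is read in base 2.
assert int("10", 2) == 1 + 1   # the binary numeral "10" is the number 2
assert bin(1 + 1) == "0b10"    # and two renders as "10" in base 2
```

Same symbols, different interpretation, different truth value: that is the sense in which the answer “depends”.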

Pretty well all of those Internet debates about algebraic expressions boil down to an expression that is grammatically incorrect in today’s algebra, crafted to be inherently ambiguous. The various classroom rules for parsing algebraic expressions (going by mnemonics such as PEMDAS, BODMAS, FOIL, etc.) all try to force algebra back into the rigid box of pre-modern algebra, from which it emerged over 350 years ago. But when you do that, you are no longer doing algebra, you are doing arithmetic. Which is fine; it’s just not algebra. And with our arithmetic being done for us today by machines, we educators should focus on the powerful mental framework called algebra that we have developed to understand, and to act in, the world.
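To make the ambiguity concrete, here is a short sketch of my own (the variable names are mine, purely for illustration) computing the two competing readings of 8 ÷ 2(2 + 2):

```python
# Reading 1: division and multiplication taken strictly left to right,
# the way PEMDAS/BODMAS is usually taught: (8 ÷ 2) × (2 + 2).
left_to_right = 8 / 2 * (2 + 2)          # (8 / 2) * 4 = 16.0

# Reading 2: multiplication by juxtaposition binds tighter than division,
# the "monomial" reading my correspondent argued for: 8 ÷ [2(2 + 2)].
juxtaposition_first = 8 / (2 * (2 + 2))  # 8 / 8 = 1.0

print(left_to_right, juxtaposition_first)  # -> 16.0 1.0
```

Both readings follow a defensible convention; the expression itself does not say which one to use. That, and not anyone’s arithmetic, is the source of the “debate”.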

It is however, easy to understand how the confusion arose. Algebra began as an arithmetic tool, developed and used by artisans to solve real-world arithmetic problems, as far as we can tell sometime in the last millennium before the Current Era, somewhere between Europe and India.

Images of al-Khwārizmī you can find on the Web are all artists’ creations. But this cover page from a copy of his book Algebra is for real.

Major advances were made in ancient Greece and First-Millennium India. (Clear antecedents to algebra have been found in Mesopotamian clay tablets dating back some 4,000 years.) By the ninth century algebra had been used by merchants and others in Baghdad, and a local scholar, al-Khwārizmī, helped turn it into an academic discipline by writing a practical algebra textbook. (He tells us in the introduction that his purpose in writing the book was not scholarship, but to help future artisans learn the method.)

[It is sometimes erroneously claimed that al-Khwārizmī was the first person to invent algebra. That’s clearly not the case. For one thing, several hundred years earlier, Diophantus of Alexandria had written an algebra book (Arithmetica); for another, al-Khwārizmī explains in his preface that he was simply collecting together in a book what was common knowledge at the time. What is the case is that his book, in addition to presenting a systematic theory of polynomial equations up to degree two, was the first in the chain of scholastic texts that led to today’s algebra — a chain that, not long after al-Khwārizmī’s death, incorporated the algebra of Diophantus, when his Arithmetica was translated into Arabic.]

The heart of the method of algebra (more precisely, premodern algebra) was that, when you want to find a number that solves a particular arithmetical problem, you perform four steps: 

1. Start by giving the unknown number a name (in today’s school algebra, typically the letter x).

2. Formulate an equation that involves the named unknown.

3. Simplify that equation (if possible).

4. Solve the simplified equation to determine the value of the unknown.

Step 1 is trivial, and Step 2 is translation from one language into another. No calculations there. It’s in Step 3 that you might find yourself doing symbol manipulations that today we call “doing algebra”. Simplification of the equation was required in large part because pre-modern algebraists did not have negative numbers, and at best a sort of pseudo-subtraction. The goal of Step 3 was to put the equation into a form for which Step 4 is purely numerical calculation.

Step 3 was, then, the key, and usually the hardest. And it is from Step 3 that the (entire) method got its name, algebra. The Arabic terms for the two main rules for simplifying an equation were al-jabr (“restoration”), which amounted to the transposition of subtracted terms to the other side of the equation, and al-muqābala (the word means “confrontation”, in the sense of being put face-to-face to achieve a result, much like a game of chess), which amounted to the cancellation of like terms on opposite sides of the equation. The entire phrase al-jabr wal-muqābala was used to refer to the overall technique.
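The two rules can be mimicked in a few lines of Python. This is a hypothetical illustration of mine — the pair-of-coefficients representation and the sample equation are not from any historical source — but it shows the mechanics of each rule on the linear equation 5x − 4 = 2x + 8:

```python
from fractions import Fraction

# Represent each side of a linear equation a*x + b as the pair (a, b).
# Sample equation (mine, for illustration): 5x - 4 = 2x + 8.
lhs, rhs = (5, -4), (2, 8)

# al-jabr ("restoration"): transpose the subtracted term -4 to the other
# side, restoring it as an addition: 5x = 2x + 12.
lhs, rhs = (lhs[0], 0), (rhs[0], rhs[1] - lhs[1])

# al-muqabala ("confrontation"): cancel the like term 2x appearing on
# both sides: 3x = 12.
lhs, rhs = (lhs[0] - rhs[0], 0), (0, rhs[1])

# Step 4 of the method is now pure arithmetic: x = 12 / 3.
x = Fraction(rhs[1], lhs[0])
print(x)  # -> 4
```

Note that after the two simplification steps, nothing symbolic remains to be done — exactly the point of Step 3 in the four-step method.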

Al-Khwārizmī used that Arabic phrase in the title of his book, and our modern word “algebra” derives from the first word of the phrase. 

How did we get from al-Khwārizmī’s algebra to today’s “language of science”?

Following al-Khwārizmī, a succession of developments in the Islamic mathematical world steadily increased the power of algebra, and in the early thirteenth century, aided by scholars such as Leonardo Fibonacci in Italy, the method crossed the Mediterranean into Europe. 

Despite the input of scholars, it remained, however, primarily a practical tool used by merchants, civil engineers, and (in a more restricted way) lawyers handling inheritance matters. But that started to change in the sixteenth century.

The algebra revolution — for so it would become — that began in the sixteenth century was set in motion by the work of the Frenchman François Viète, who developed a variant of pre-modern algebra to operate not on numbers but on geometric magnitudes (lines, rectangles, cubes, etc.). An important side-effect of that work was that it marked the first time letters were used to denote both unknown and known quantities. (Viète had no other choice. How else do you refer to geometric magnitudes, whether known or unknown?)

In the following century, mathematicians such as René Descartes and Pierre de Fermat in France, Thomas Harriot in England, and others built on Viète’s work, producing what we can today recognize as a system that has many of the elements of present-day algebra. (As noted above, Newton was one of several mathematicians who revised the concept of number to better reflect the way the “new algebraists” were using them.) This was when algebra finally started to unmoor itself from arithmetic.

Finally, in the late nineteenth century, today’s abstract algebra emerged, as an axiomatically defined “universal language” for mathematics that could be applied to many different domains, not just numbers.
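One way to see the “many domains” point concretely (a sketch of my own, not part of the historical account): the same distributive law, stated once, holds verbatim for ordinary integers and for arithmetic modulo 5, even though the two systems are quite different:

```python
from itertools import product

def distributes(add, mul, elements):
    """Check the law a*(b + c) == a*b + a*c over every triple of elements."""
    return all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
               for a, b, c in product(elements, repeat=3))

# Ordinary integer arithmetic (checked on a small sample of elements).
assert distributes(lambda x, y: x + y, lambda x, y: x * y, range(-3, 4))

# Arithmetic modulo 5: a different domain, governed by the same law.
assert distributes(lambda x, y: (x + y) % 5, lambda x, y: (x * y) % 5, range(5))
```

The abstract law is written without saying what the elements are; each domain simply supplies its own meanings for “add” and “multiply”.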

Today’s algebra is a dazzling human invention, a pinnacle of human thought that took many centuries — and the contributions of peoples spread across the globe — to develop. “Algebra” went from being a special tool of arithmetic (in the form of premodern algebra) to a rich framework (today’s abstract algebra) within which arithmetic can be viewed as just one special case. The child became the parent — a parent that has many other offspring in addition to arithmetic.

Yet I fear that not all school teachers convey just what a powerful tool it is — in part, I suspect, because they are constrained by a curriculum centuries out of date. Indeed, much of current K-12 “algebra instruction” seems to go no further than the pre-modern algebra used prior to the seventeenth century.

No surprise then, that the population at large views “algebra” as something different from the discipline that is taught universally in colleges and universities. Those Internet debates result from that division.