The crucial importance of digital literacy in 21st Century math education

A common theme among the articles and videos on this website is the regular use of online resources in developing and using mathematics. The image shown here (which you will find in several of my videos and linked articles on the site) presents some of the digital tools professional mathematicians use most commonly when working on a problem.

This particular list is based on my own experience over several decades of working on problems for industry and various branches of the US government, and of conducting academic research projects. It also includes two tools (MATLAB and the graphing calculator) that I do not use but many other mathematicians do; I list them as a nod towards others’ toolboxes. I typically use the tools in the order they appear in the display, reading left to right.

Whenever I show this image to an audience, I inevitably remark that the use of Google and YouTube, in particular, requires a degree of sophistication in order to (1) find the most useful sites and (2) assess the reliability of the information provided on those sites. Item (1) requires sufficient knowledge to enter keywords and phrases that pull up useful resources; item (2) depends on a skillset generally referred to as digital literacy.

Given the central role of digital tools and online resources in contemporary mathematical praxis, these two skillsets are absolutely critical to doing mathematics in the 21st Century. That means they have to be part of every student’s mathematics education.

The first skillset, being able to make effective use of search engines to navigate to relevant information and online resources, has to be developed in the mathematics class. It is only by having a good, broad overview of the range of mathematical concepts, facts, methods, procedures, and tools available that a student can select keywords and phrases that yield a productive search. Such an overview can be acquired only through experience, over several years.

Since there is no longer any need for a student to spend many hours practicing the hand execution of mechanical procedures for solving math problems in order to achieve accuracy and speed, the vast amount of time freed up from what used to be a massive time sink can be devoted to working on a wide variety of different kinds of problems.

[CAVEAT: Whether an individual teacher has the freedom to follow this strategy is another matter. Sensible implementation of mathematics education along the lines of the Common Core should, in theory, make this possible; indeed, the CCSSM were developed — as standards, not a curriculum — to facilitate this. But I frequently see complaints that various curricula and local school districts still insist on a lot of rote practice of skills that will never be used. Other than use my bully pulpit to try to change that, as I not infrequently do, I cannot remove that obstacle, I’m afraid.]

Turning to the second skillset, assessing the reliability of the information provided by a Web resource: in today’s world, that needs to be a major component of almost every classroom subject.

Wikipedia is high on my list of mathematical tools. For post-secondary mathematics, it is an efficient and highly reliable resource for finding out about any particular mathematical definition, concept, or technique; its reliability is a consequence of the fact that only knowledgeable professional mathematicians are able to contribute at that level. Unfortunately, the same cannot be said for K-12 mathematics.

For example, a quick look as I was writing this post showed that the Wikipedia entry for multiplication is highly misleading. In fact, it used to be plain wrong until I wrote a series of articles for the Mathematical Association of America a few years ago. [See the section on Multiplication in this compilation post.] However, while the current entry is not exactly wrong, its misleading nature is a pedagogic disaster in the making. It therefore provides a good example of why the wise teacher or student should use Wikipedia with extreme caution as a resource for K-12 mathematics.

Ditto for Google, YouTube, and any other online resource. “Buyer beware” needs to be the guiding principle.

Unfortunately, a recent report from Stanford’s Graduate School of Education (discussed below) indicates that, for the most part, America’s school system is doing a terrible job of ensuring students acquire the digital literacy skills that are of such critical importance to everyone in today’s world.

Note: I am not pointing a finger at any one person or any one group here. It is the education system that’s failing our students. And not just at the K-12 level. Higher education too needs to do a lot more to ensure all students acquire the digital literacy that is now an essential life skill.

The Stanford report focuses on Civics, a domain where the very functioning of democracy and government requires high levels of digital literacy, as highlighted by the massive growth of “fake news” in the period leading up to the 2016 U.S. election, and subsequently. But the basic principles of digital literacy apply equally to mathematics and pretty well any discipline where online resources are used (which is pretty well any discipline, of course). So the Stanford report provides an excellent pointer to what needs to be done in all school subjects, including mathematics.

The report is readily available online: Students’ Civic Online Reasoning, by Joel Breakstone, Mark Smith, & Sam Wineburg, The Stanford History Education Group, 14 November 2019. (Accessing it provides an easy first exercise in applying the reliability-assessing skills the report points to!)

While I recommend you read the whole thing (there is a PDF download option), it is 49 pages in length (including many charts), so let me provide here a brief summary of the parts particularly pertinent to the use of online resources in mathematics.

First, a bit of general background to the new report. In November 2016, the Stanford History Education Group released a study showing that young people lacked basic skills of digital evaluation. In the years since then, a whole host of efforts—including legislative initiatives in 18 states—have tried to address this problem. Between June 2018 and May 2019, the Stanford team conducted a new, deeper assessment on a national sample of 3,446 students, chosen to match the demographic profile of high school students across the United States.

The six exercises in the assessment gauged students’ ability to evaluate digital sources on the open internet. The results should be a wake-up call for the nation. The researchers summed them up in a single, dramatic sentence: “The results—if they can be summarized in a word—are troubling.”

The nation’s high school students are, it seems, hopelessly ill-equipped to use the Internet as a source for information. The report cites the following examples:

• Fifty-two percent of students believed a grainy video claiming to show ballot stuffing in the 2016 Democratic primaries (the video was actually shot in Russia) constituted “strong evidence” of voter fraud in the U.S. Among more than 3,000 responses, only three students tracked down the source of the video, even though a quick search turns up a variety of articles exposing the ruse.

• Two-thirds of students couldn’t tell the difference between news stories and ads (set off by the words “Sponsored Content”) on Slate’s homepage.

• Ninety-six percent of students did not consider why ties between a climate change website and the fossil fuel industry might lessen that website’s credibility. Instead of investigating who was behind the site, students focused on superficial markers of credibility: the site’s aesthetics, its top-level domain, or how it portrayed itself on the About page.

The assessment questions used by the Stanford team were developed after first looking at the ways three groups of Internet users evaluated a series of unfamiliar websites: Stanford freshmen, university professors from four different institutions, and fact checkers from some of the country’s leading news outlets.

Of particular relevance, the fact checkers’ approach differed markedly from that of the undergraduates and the professors.

When fact checkers landed on an unknown website, they immediately left it, and opened new browser tabs to search for information about the trustworthiness of the original source. (The researchers refer to this approach as lateral reading.)

In contrast, both the students and the academics typically read vertically, spending minutes examining the original site’s prose, references, About page, and top-level domain (e.g., .com versus .org). Yet these features are all easy to manipulate. 

Fact checkers’ first action upon landing on an unfamiliar site is, then, to leave it. The result of this seemingly paradoxical behavior is that they read less, learn more, and reach better conclusions in less time. Their initial goal is to answer three questions about the resource: (1) Who is behind the information? (2) What is the evidence? (3) What do other sources say? 

While the fact-checkers’ approach is critical for navigating today’s online sources of news and current affairs, it is no less critical when using online resources in the STEM disciplines.

For instance, imagine the initial stage of collecting data for a mathematical analysis of problems about climate change, the effectiveness of vaccines, or diet — all topics that students find highly engaging, and which thus provide excellent projects to learn about a wide variety of mathematical techniques. In all three cases, there is no shortage of deliberate misinformation that must be filtered out. 
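To make that filtering step concrete, here is a minimal Python sketch of how a class might keep track of which candidate data sources have already been vetted. Everything in it (the vetted-domain list, the URLs, the function names) is my own illustrative assumption, not something taken from the Stanford study; note in particular that the script cannot judge trustworthiness itself. All it does is flag the sources that still require the human lateral-reading step described above.

```python
# Minimal sketch: flag data sources that still need lateral reading.
# The vetted-domain list and URLs are illustrative assumptions only; a real
# class project would build its own list by investigating each source.
from urllib.parse import urlparse

# Domains the class has already vetted by reading laterally about them.
VETTED_DOMAINS = {"noaa.gov", "cdc.gov", "ourworldindata.org"}

def base_domain(url: str) -> str:
    """Extract the base domain of a URL (naive last-two-labels heuristic)."""
    host = urlparse(url).netloc.lower()
    labels = host.split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

def needs_lateral_reading(url: str) -> bool:
    """True if we have not yet asked the fact checkers' three questions
    about this source: who is behind it, what is the evidence, and what
    do other sources say?"""
    return base_domain(url) not in VETTED_DOMAINS

candidates = [
    "https://www.noaa.gov/data/global-temps.csv",  # hypothetical URL
    "https://co2science.org/reports/trend.html",   # site discussed later in this post
]
for url in candidates:
    status = "investigate first" if needs_lateral_reading(url) else "vetted"
    print(f"{status:17} {url}")
```

The pedagogic point of this design is that the code merely enforces the discipline: the vetted list can only grow when students have done the lateral reading by hand.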

Here, then, is a summary of the six assessment tasks the Stanford team developed, listed with the fact-checkers’ initial questions being addressed in each case, together with the specific example task given to the students:

  • Evaluating Video Evidence (What’s the evidence? Who’s behind the information? Evaluate whether a video posted on Facebook is good evidence of voter fraud.)
  • Webpage Comparison (Who’s behind the information? Explain which of two websites is a better source of information on gun control.) 
  • Article Evaluation (What do other sources say? Who’s behind the information? Using any online sources, explain whether a website is a reliable source of information about global warming.)
  • Claims on Social Media 1 (What’s the evidence? Who’s behind the information? Explain why a social media post is a useful source of information about background checks for firearms.)
  • Claims on Social Media 2 (What’s the evidence? Who’s behind the information? Explain how a social media post about background checks might not be a useful source of information.)
  • Homepage Analysis (Who’s behind the information? Explain whether tiles on the homepage of a website are advertisements or news stories.)

The remainder of this post focuses on the study’s results of particular relevance to mathematics education. It is taken, with minimal editing, directly from the original report.

Overall, the students in the high school study struggled on all of the tasks. At least two-thirds of student responses were at the “Beginning” level for each of the six tasks. On four of the six tasks, over 90% of students received no credit at all. Out of all of the student responses to the six tasks, fewer than 3% earned full credit.

Claims on Social Media 1 had the lowest proportion of Mastery responses, with fewer than 1% of students demonstrating a strong understanding of the civic online reasoning (COR) competencies measured by the task.

Evaluating Video Evidence had the highest proportion of Mastery responses, with 8.7% earning full credit.

The Article Evaluation task (which is of particular significance for the use of online resources in mathematics) had the highest proportion of “Beginning” scores, with 96.8% of students earning no points. The question assessed whether students could engage in lateral reading—that is, leaving a site to investigate whether it is a trustworthy source of information. Students were provided a link to the homepage of CO2 Science (co2science.org), an organization whose About page states that their mission is to “disseminate factual reports and sound commentary” on the effects of carbon dioxide on the environment. Students were asked whether this page is a reliable source of information. Screen prompts reminded students that they were allowed to search online to answer the question. The few students who earned a Mastery score used the open internet to discover that CO2 Science is run by the Center for the Study of Carbon Dioxide and Global Change, a climate change denial organization funded by fossil fuel companies, including ExxonMobil. The Center also has strong ties to the American Legislative Exchange Council, an organization that opposes legislative efforts to limit fossil fuel use.

A student from a rural district in Oregon wrote: “I do not believe this is a reliable source of information about global warming because even though the company is a nonprofit organization, it receives much of its funding from the “largest U.S. oil company–ExxonMobil–and the largest U.S. coal mining company–Peabody Energy” (Greenpeace). Moreover, Craig Idso, the founding chairman of the Center for the Study of Carbon Dioxide and Global Change, was also a consultant for Peabody Energy. It is no wonder this organization advocates for unrestricted carbon dioxide levels; these claims are in the best interest of the Center for the Study of Carbon Dioxide and Global Change as well as the oil and mining companies that sponsor it.”

Another student from suburban Oklahoma responded: “No, it is not a reliable source because it has ties to large companies that want to purposefully mislead people when it comes to climate change. According to USA TODAY, Exxon has sponsored this nonprofit to pump out misleading information on climate change. According to the Seattle Post-Intelligencer, many of their scientists also have ties with energy lobbyists.”

Both students adeptly searched for information on the internet about who was behind the site. Both concluded that the site was unreliable because of its ties to fossil fuel interests.

Unfortunately, responses like these were exceedingly rare. Fewer than two percent of students received a “Mastery” score. Over 96% of student responses were categorized as “Beginning”. Instead of leaving the site, these students were drawn to features of the site itself, such as its top-level domain (.org), the recency of its updates, the presence or absence of ads, and the quantity of information it included (e.g., graphs and infographics). 

A student from suburban New Jersey wrote: “This page is a reliable source to obtain information from. You see in the URL that it ends in .org as opposed to .com.” This student was seemingly unaware that .org is an “open” domain; any individual or group can register a .org domain without attesting to their motives. A student from the urban South was taken in by CO2 Science’s About page: “The ‘about us’ tab does show that the organization is a non-profit dedicated to simply providing the research and stating what it may represent. In their position paper on CO2, they provide evidence for both sides and state matters in a scientific manner. Therefore, I would say they are an unbiased and reliable source.” As the Stanford team note in their report, “Accepting at face value how an unknown group describes itself is a dangerous way to make judgments online.”

In fact, students often displayed confusion about the meaning of top-level domains such as .org. While there are many .org’s that work for social betterment, the domain is a favorite for political lobby groups and for groups that cast themselves as grassroots efforts but are actually backed by powerful political or commercial interests. For-profit companies can also be listed as .org’s. Craigslist, a corporation with an estimated $1 billion in revenue in 2018, is registered as craigslist.org. Nor is nonprofit status a dependable marker of an organization’s credibility. Obtaining recognition by the IRS as a public charity is an extremely easy thing to do. Of the 101,962 applications the IRS received in 2015, 95,372 were granted tax-deductible status: an approval rate of 94%.

Food for thought, don’t you agree?

For a discussion of the other items in the study, I’ll refer you to the report itself. 

Let me end by reiterating that the specific findings listed above are all highly relevant to developing good skillsets for using online resources in mathematics.