
Media Literacy Examples

A guide to statistical literacy in the classroom – MEDIA LITERACY


How can students understand, appreciate and perhaps question statistics in news media? Here are some real-life examples to get you and your students started!

Being able to decipher, decode and understand these statistics is integral to becoming a media-literate citizen.

In this article, we suggest a range of activities for middle school students that use actual examples from news media to help students understand how statistics are used in reporting.

Through the critical questions outlined below, which build on the earlier introduction, we hope you will see the depth of questioning required the next time your students, or even you, come across statistics in news articles or reports. Many of the examples have contexts related to areas of the Australian Curriculum.

Getting Started

Built on the practice of statistics, which involves students actually asking a question, collecting and analysing data, and reaching a conclusion that acknowledges uncertainty, statistical literacy is the application of this understanding in other contexts, where one has to rely on a report of another person's investigation.

Recall the definition of statistical literacy: it is the critical thinking needed when encountering claims anywhere based on statistics or data. It requires understanding of:

• the terminology and representation used – what does it mean statistically?

• the context – what does the terminology and representation mean in the context where it is presented?

• critical thinking – what is the precise claim being made and is it reasonable?

Critical questions for students to ask of statistical claims in news media

The following list of questions provides a good start for any story with a flashy claim.

1. Problem
• Is the question asked clear in relation to the “answer” provided?

2. Plan
• Is it clear how the data was collected? The “sample”? The sample size?
• Is the sample voluntary in nature? Why would this be important?
• For surveys, who paid for them? Could this be significant for the results reported?

3. Data
• Does the report actually detail what data has been used and the related context?

4. Analysis
• If there is a statistic presented, does it make sense in the context? Is it accurate?
• Are any presented graphs misleading?
• Are graphs labelled correctly and meaningfully?
• Would this graph tell the same story if it were in a different form (e.g. bar or pie graph) or if the scale were changed?

5. Conclusion
• What conclusions are drawn on the basis of the statistics? How strong are the adjectives that describe the results? Are they exaggerated to attract a headline?
• Is any uncertainty acknowledged?
• Is the claim “too good to be true”?

6. General critical questions
• What questions would you ask the investigator or the reporter about the article?
• Does the article present enough information about how the study was carried out to reach a conclusion?

A question that arises is: how can we be confident that students will be able to answer these questions by the time they reach middle school? The Australian Curriculum is helpful here in suggesting activities for the early years (Foundation and Year 1) that use children’s natural curiosity to start investigating the practice of statistics.

A sample activity for Foundation and Year 1

Step 1: A question that involves variation

A class could be asked: What fruit is most popular in our lunch boxes today?

Step 2: A plan is devised to collect the data required to answer the question

Discuss what “popular” means for the class, leading to the idea of counting to find the “most”. There might then be a discussion about how to keep track of the “data”, which may mean introducing this new vocabulary. You could discuss whether all the data are the same, or note that some children may have to decide between two kinds of fruit in their lunch boxes.

Step 3: Collecting, recording and storing the data

Maybe children will stand in lines behind pictures of the fruit in their lunch boxes. Dilemma: what if some children have no fruit today? Perhaps they choose what they would like to have from what are already mentioned, or a “no fruit today” category is created.

Step 4: Analysing the data

Children can then be asked to draw any representation or story of the class data they would like. These are likely to vary greatly! It should be possible to make a decision about the most popular fruit today (or maybe there is a tie if two fruits have the same total count). For this class, today, there should be no question of the certainty of the answer.
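For teachers who want to record the tallies digitally, the counting and the “most popular” decision in this step can be sketched in a few lines of Python (the fruit names and counts below are invented for illustration, not from a real class):

```python
from collections import Counter

# Hypothetical lunch-box data for one class (invented for illustration)
fruits = ["apple", "banana", "apple", "orange", "banana",
          "apple", "no fruit today", "banana", "apple"]

counts = Counter(fruits)

# The most popular fruit today; a tie would put several fruits at the top count
top_count = max(counts.values())
most_popular = sorted(f for f, c in counts.items() if c == top_count)

print(counts)        # tallies for each category
print(most_popular)  # the answer is certain for this class, today
```

Handling a tie simply means `most_popular` contains more than one fruit, mirroring the classroom discussion above.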

Step 5: Raising more questions

For these young children, hypothetical questions can still be asked, such as: “Will this be the same tomorrow? How certain are we?” or “Will our result be the same for the class next door? How could we find out? What about the whole school?” Even though these are questions about samples and populations, this statistical terminology is not introduced at these year levels. The purpose of these questions is not necessarily to elicit statistical answers or to collect more data (although some students may want to go and ask the class next door!) but to develop the idea of uncertainty for these questions compared with the certainty for the class today. It is uncertainty that underlies the practice of statistics.

The primary years

As students progress through the primary years and learn more types of representations, examples from the media can be used to reinforce skills as they are introduced in mathematics lessons.

The graph below could be used when students are learning about bar graphs. The class might be asked for their favourite seasons, and after creating their own bar graphs, asked to critique this graph. After a discussion, they could recreate the graph appropriately and then discuss where the sample might have come from to result in the displayed data. How similar is it to their own class results?

Given the students’ recent experiences with COVID-19, the following graph offers a similar opportunity to critique a data representation about people’s reactions to viruses. Which stands out more in the figure below: the numbers or the shaded rectangles? Obviously, putting a scale on the vertical axis would be very difficult! Perhaps ask students to redraw the graph with an appropriate vertical scale and decide if it changes their view of the message.

Once students have been introduced to pie graphs, the graph below should be very obviously inappropriate. Students should speculate on how the percentage results could have occurred in the study. For example, perhaps three questions reflecting “getting the virus”, “the family getting the virus” and “the effect on the economy” were among a long list of questions about worries associated with COVID-19. Other worries might have been “not being able to access essential food in the supermarket”, “not being able to celebrate my birthday with friends” or “spreading around the world”, and the three in the pie chart were the questions with the highest percentage responses.

A pie chart gone wrong (source: posted by u/McFlash64 on Reddit)

Once it is decided that the representation shown is totally inappropriate, perhaps the students could be given the task of developing a different visual representation that accurately represents the data.
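One quick check students can automate: a pie chart is only appropriate when the percentages partition a whole, i.e. sum to 100. A minimal sketch (the multi-response figures below are stand-ins, not the values from the Reddit chart):

```python
def pie_chart_valid(percentages, tolerance=0.5):
    """A pie chart only makes sense if the slices sum to (about) 100%."""
    return abs(sum(percentages) - 100) <= tolerance

# In a multi-response survey each person can tick several worries,
# so the figures need not sum to 100 -- illustrative values only
multi_response = [91, 84, 56]
single_response = [50, 30, 20]

print(pie_chart_valid(multi_response))   # these cannot be slices of one pie
print(pie_chart_valid(single_response))  # a genuine partition of the whole
```

The `tolerance` parameter allows for small rounding errors in published percentages.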

Sample activities for Years 7 to 10

Sampling and reporting

Moving into Years 7 to 10, more opportunities arise for connecting media literacy to contexts in the curriculum. As part of a science class, students could be asked to consider the results from a poll on global warming reported by Fox News, as shown below. Could 94 per cent of Americans believe it is at least somewhat likely that climate scientists falsify their data? Questions could also be asked about the sample size and who was sampled. Students might not initially query the question asked, but you can suggest it seems quite direct and ask how easy they would find it to answer themselves. They could even collect some data from their class.

In fact, the graphic does not show the question as asked. It was: “In order to support their own theories and beliefs about global warming, how likely is it that some scientists have falsified research data?” Have students discuss the difference between the two questions and how the respondents might have responded from different viewpoints. Hopefully students will also ask about the sample size, who was sampled and perhaps if they were asked other questions about their views on climate change.

Ask students to explain how the figures represent the sample and hence the claim for the population, perhaps inferred to be all people in the United States. Hopefully, by summing the percentages and obtaining 120 per cent, they will question how this can be so. (This is a good reminder that 100 per cent represents the whole sample, and hence 120 per cent is meaningless.) How could this addition error have been made? Hopefully, students will ask how many options were available to choose from in responding to the question. There were five possibilities, with the original percentages in parentheses:

• Very likely (35%)

• Somewhat likely (24%)

• Not very likely (21%)

• Not likely at all (5%)

• Unsure (15%)

Get students to work in pairs or groups to figure out the relationship between these results and the presentation in the graphic.

Data such as this is often presented in pie graphs. Have students create two different pie graphs to show the different stories from the two sets of figures (this can be a bit tricky for the results adding to 120 per cent but again a nice application of basic maths skills).
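Working out the slice angles is itself a nice application of the maths involved: each category's angle is its share of the total multiplied by 360°. A short sketch of the calculation, using the five original percentages above (normalising by the total is exactly what makes the tricky 120-per-cent set drawable as well):

```python
def pie_angles(values):
    """Convert a list of values to pie-slice angles in degrees.
    Dividing by the total (rather than assuming it is 100) means
    this also copes with figures that wrongly sum to more than 100."""
    total = sum(values)
    return [round(v / total * 360, 1) for v in values]

original = [35, 24, 21, 5, 15]   # the five poll percentages, summing to 100
angles = pie_angles(original)
print(angles)  # five angles that together make the full 360 degrees
```

Students can check by hand that 35 per cent of 360° is 126°, and that the five angles sum to 360°.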

Graphical representations

With the sophisticated techniques now available, graphs are the most common way of representing data in the media. Most graphs in the media are appropriate, and we want students to appreciate this and take the time to interpret their messages. There are, however, exceptions, as shown above, and students need to be critical in their initial engagement with graphs, particularly in relation to the scales used.

Have students compare and contrast the following two graphs on the heights of females and males in two different contexts, perhaps in a geography or history class. The two graphs are classic examples of the impression given when the vertical axis does not begin at zero. With the graph on the right, one redeeming feature is that it has an indication ( // ) at the bottom of the vertical axis to indicate the scale has been truncated.

Is it possible that the average woman in Latvia could be five times as tall as the average woman in India? Or that the average 21-year-old British male in the 1870s was half as tall as the average 21-year-old British male in the 1970s? What other misconception is exemplified in the plot for females?
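The distortion from a truncated axis can be quantified: the bars are drawn in the ratio of (value − axis start), not in the ratio of the values themselves. A sketch with illustrative heights in centimetres (the exact published figures are not reproduced here):

```python
def apparent_ratio(a, b, axis_start):
    """Ratio of two bar heights as drawn when the
    vertical axis starts at axis_start instead of zero."""
    return (a - axis_start) / (b - axis_start)

# Illustrative average heights in cm (invented, not the published data)
latvia, india = 170, 152.5

true_ratio = latvia / india
drawn_ratio = apparent_ratio(latvia, india, axis_start=149)

print(round(true_ratio, 2))   # about 1.11: barely a 12% difference
print(round(drawn_ratio, 2))  # 6.0: the truncated axis makes one bar 6x taller
```

Starting the axis at 149 cm turns a modest real difference into a sixfold visual one, which is exactly the impression the critical questions above are designed to catch.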

Or returning to climate change and science, consider the graph below, which was published under the heading “You can’t deny global warming after seeing this graph”.

Have students discuss how those who do not accept the evidence for climate change would reconstruct this graph, which shows the average global temperature change over the decades to 2010.

The importance of labelling the vertical axis is shown in comparing the above graph with one from the Australian Bureau of Meteorology for a similar time period with Australian data. Have students discuss the message and presentation of the two data sets.

For another variation on reporting the temperature change, use this climate change article, which provides an interactive graph showing the average temperatures for every year since 1880 compared to the 20th-century average. Because years are represented against a single average value, change is more easily seen, but of course the vertical scale must still be noted.

Whether it is more appropriate to tell a story in a graph that starts at zero on the vertical axis or is truncated depends on the point the author wishes to make. The graphs below illustrate profoundly the importance of paying close attention to the scale when reading the messages in graphs. They are from an article on climate change. Each graph is presented in a manner to make a point on a particular issue.

In the world today, students and all citizens need to be ready to read graphs carefully, not just glance at the slope going up or down. What are the end points, and what are their meanings in the contexts of the information provided?

example graph climate 1

The ABC has many types of graphs that came into prominence during the COVID-19 pandemic that can be used for this kind of study.

Data analysis – Cats

Below, on the left, is a newspaper article that provides quite a bit of information about a study but also the opportunity to ask some questions. Students can be asked to read and critique the article as part of a social-science lesson, perhaps including class debate about its merits, as well as their own anecdotal experience with cats and the prey they bring home.

Questions might focus on how the survey was carried out and what it could be used for. The sample size is mentioned but there is no indication how households were selected, and should we assume that they all had cats? How was the data on captures collected? Students might suggest possibilities from their own experiences of what prey their cats have brought home and how reliable the data might be.

Although the word “average” is not used, students should be able to identify readily which measure was used, the mean, and how meaningful it is in terms of what it represents (perhaps students could draw humorous sketches of the average prey). Is the title of the article appropriate if half of the prey were in fact vermin? Students could suggest alternative titles.

The article on the right is recent and contains many numbers and claims that can be intimidating. Some checking is needed to see if the report is accurate or perhaps omitting one significant figure because of the point it wants to make about domestic cats and their impact on native animals.

First, students can check the initial claim of 230 million native animals being killed by 3.7 million domestic cats, which gives a mean of 62.2 native animals killed per domestic cat. An inconsistency occurs later, however, when it is claimed that domestic cats are killing 110 native animals per cat! Checking the totals for the three categories of native animals (66.9 million mammals, 79.7 million birds and 82.9 million reptiles), the sum agrees with the 230 million native animals killed. So why is the basic average inconsistent?

Reading a later paragraph, we see that 30 per cent of domestic cats are “contained” (in enclosed areas), which reduces the number of domestic cats potentially killing native animals to 2.59 million. Now the mean number of native animals killed per domestic cat rises to 88.8, but not to 110.

Finally, an interesting number is omitted from the article. It reports that there are 2.1 million feral cats in Australia and that they each kill on average 576 native animals per year. This figure may be lost in the rest of the information on domestic cats, but when we calculate the total number of native animals killed by feral cats, it is 1,209 million, which is more than five times the toll from domestic cats! It would appear that keeping domestic cats inside is not going to solve the problem.
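All of the checking above is simple arithmetic that students can verify; here it is in a few lines of Python, using the figures quoted in the article:

```python
domestic_cats = 3.7   # millions of domestic cats
native_killed = 230   # millions of native animals killed per year

# Do the three category totals agree with the headline figure?
category_sum = 66.9 + 79.7 + 82.9   # mammals + birds + reptiles
print(round(category_sum, 1))        # 229.5 million, reported as 230

# Mean kills per domestic cat, before and after excluding contained cats
mean_all = native_killed / domestic_cats
uncontained = domestic_cats * (1 - 0.30)    # 30% of cats are contained
mean_uncontained = native_killed / uncontained
print(round(mean_all, 1))            # 62.2, not the claimed 110
print(round(mean_uncontained, 1))    # 88.8, still not 110

# The omitted feral-cat total: 2.1 million cats at 576 kills each
feral_total = 2.1 * 576
print(round(feral_total, 1))         # about 1,210 million native animals
```

The calculations confirm both the inconsistency in the per-cat figure and the striking feral-cat total the article leaves out.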

Also notice, in comparing the two articles, the enormous change in the results reported, and that there is no distinction between domestic and feral cats in the original article. Students could discuss the tremendous change in the collection and reporting of data over the years. What are some reasons for the change in the populations?

Data analysis – Increase of interest in craft

The following data was reported in a story about the increased interest in crafts during the coronavirus pandemic. Again, this might have relevance in a social-science class where there is a discussion about how people spend their leisure time.

example data - craft 1

This table can present students with a challenge. How do the ranking numbers on the left for the Top 5 craft hobbies relate to the percentages on the right? Not only is the percentage for sewing projects (300 per cent) less than that for knitting projects (400 per cent), it is also less than those of the three hobbies listed below it. Similarly, the top-ranked knitting projects, at 400 per cent, has a lower percentage than the bottom two. This seems different from the list for Top 5 emerging craft trends, where the percentages decrease in order of ranking.

The only clue in the article as to the meaning of the data presented in the table is a comment on another product: “The shortage of PPE for frontline workers has also galvanised skilled and unskilled stitchers with searches for mask-making materials such as cotton fabric and elastic up more than 500% and 700% on the Hobbycraft site.”

Students could be asked to work in pairs or groups to devise a way of showing the top five craft hobbies data in a way that agrees with the ordering. What kinds of data would be required for the initial sales?
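One plausible resolution students often reach is that the Top 5 ranking reflects absolute search volumes while the percentages are increases on a previous period, so the two orderings need not match. A sketch with invented search counts (not from the article) showing how this can happen:

```python
# Invented before/after search counts (illustrative only) showing how a
# ranking by current volume can disagree with the percentage-increase order
hobbies = {
    # name: (searches last year, searches this year)
    "knitting projects": (2000, 10000),   # +400%
    "sewing projects":   (1500, 6000),    # +300%
    "crochet projects":  (100, 900),      # +800%
}

def pct_increase(before, after):
    return (after - before) / before * 100

# Rank by this year's absolute volume, largest first
ranking_by_volume = sorted(hobbies, key=lambda h: hobbies[h][1], reverse=True)

print(ranking_by_volume)  # knitting tops the list despite a smaller % rise
print({h: pct_increase(*hobbies[h]) for h in hobbies})
```

Here crochet has the largest percentage increase but the smallest current volume, so a volume-based Top 5 puts it last, just as in the published table.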

Conclusion and inference

The first report below suggests a menu for avoiding heart ailments. This could be discussed in a unit of health literacy.

Does the report provide enough background for students to change their diets? The article offers a chance for speculation on research design. Students can discuss what questions they would ask of the researchers behind the report. For example, how were the 584 patients divided into the two groups? Were there half in each group? How different were their previous heart conditions that made them eligible for the study? What were the average ages of the two groups? Were they equally male and female? What were the general living conditions like? What about family history of heart attacks? Did they keep track of what other activities the two groups might have engaged in during the 27 months, such as exercise? Obviously this could not have been a “blind” study, so how might this have affected the results?


That’s Life (source: The Mercury, p.13, 4/9/93)

In recent years, there have been many studies on the Mediterranean diet, often quite complex. The report from 2020 shown below gives some results for heart disease for people with diabetes. It also summarises some other studies related to non-diabetic populations. The results are reported in terms of percentage reduction of symptoms. Students could compare the claimed reduction in heart attacks for people with diabetes (38 per cent) and non-diabetic people (65 per cent). Looking at the earlier article, what percentage reduction could be claimed for the French study?
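The percentage reductions quoted in such reports are usually relative reductions, computed from the event rates in the two groups. A sketch with hypothetical counts (the articles shown do not report the raw figures, so the numbers below are invented to illustrate the calculation):

```python
def relative_reduction(events_treatment, n_treatment, events_control, n_control):
    """Percentage reduction in the event rate for the treatment group
    relative to the control group."""
    rate_t = events_treatment / n_treatment
    rate_c = events_control / n_control
    return (rate_c - rate_t) / rate_c * 100

# Hypothetical heart-attack counts for two equal groups of 292
# (half of the 584 patients), invented for illustration
reduction = relative_reduction(14, 292, 40, 292)
print(round(reduction, 1))  # a 65.0 per cent relative reduction
```

Students could discuss why a "65 per cent reduction" can sound dramatic even when the underlying event counts, and hence the absolute difference in risk, are small.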

Get critical!

Use ABC Education’s Media Literacy Week resources on the representation of statistics!

As well as using the tips above, a good starting point for students would be to look at the claims made by the ABC itself in its own News Habits Survey 2018.

Many questions could be asked in relation to the figure below, for example, about the circular graph and how it fits with the claims elsewhere, including for the number of participants. How many primary students were happy about the good news they heard? Could you make a pie chart from the data on the key ways teachers discover news? Why or why not? There could be some speculation on the use of large fonts and whether they are meant to highlight large or small numbers or percentages.

Professor Emerita Jane Watson has been a tertiary mathematics tutor, pre-service teacher educator, and mathematics education researcher across 48 years at the University of Tasmania. Her main research specialisation has been statistics education and she has written 50 articles for teachers on various topics for the classroom, including lbw decisions in cricket, fishy statistics, division by zero, estimating the height of a tree, and what’s typical in the Melbourne Cup. She is a Fellow of the Academy of the Social Sciences in Australia.

Adjunct Associate Professor Rosemary Callingham is a mathematics educator at the University of Tasmania. She has an extensive background in mathematics education in Australia, at school, system and tertiary levels, including mathematics curriculum development and implementation, large-scale testing, and pre-service teacher education. In 2020, she became a Member of the Order of Australia for services to mathematics education.

Further reading:

Watson, J.M. (2004). Quantitative literacy in the media: An arena for problem solving. Australian Mathematics Teacher, 60(1), 34-40.

Watson, J. (2015). Statistical literacy in action: Should all graphs start at 0? Australian Primary Mathematics Classroom, 20(4), 26-30.

Watson, J., & Callingham, R. (2020). Covid-19 and the need for statistical literacy. Australian Mathematics Education Journal, 2(2), 16-21.
