I have suggested that fake news is a consequence of a triad of factors:
1. The miserly mind grabs easy but inaccurate information.
2. The social mind herds together with others who have accepted the same opinion.
3. The structural design of social media feeds funnels this herd together into the same digital space, such that it becomes an echo chamber of the wrongly accepted opinion.
So what can we do about it? Let’s take each in turn:
1. The miserly mind grabs easy but inaccurate information. Five years ago, the debate about Google in classrooms was about whether there was any point in remembering facts at all. Now that it has become clear how much fake news (images, videos and more) we will be coping with online, discrimination, in the form of fact-checking, is back in vogue.
Right now, fact-checking is the main response to misinformation online. The challenge, however, as Paul Resnick of the University of Michigan puts it, is that "corrections do not spread well", whilst rumours do. News services which rate stories that have been fact-checked can help.
The miserly mind prefers Amazon Echo's answers to digging for ourselves. We regress at our peril. Combating fake news involves doing what the human mind does best and worst: discriminating between what we can and cannot trust. It is a daily choice to which we must commit.
2. The social mind herds together with others who have accepted the same opinion. We need to consider whether we have reached a point where avoidance of some online spaces is necessary. Diversity and debate are easier to facilitate in real spaces, and much harder online. Opportunities for diversity and debate within the school curriculum, as disciplines of social and learning practice, are becoming ever more essential. It is extraordinary that, at just the moment we need our universities to maintain such spaces, they are collapsing into campuses of safe spaces where such challenge is eliminated.
But schools can also teach pupils how the mind learns. The model of the mind that informed internet pioneers was that if we share knowledge, the mind will process it all fairly and rationally. This was a mistake. The human mind is not a rational, straight-line engine; it is a steering car. It does not drive straight. It is biased by the opinion-road on which it, and those around it, are driving.
STEER has created an animation to help schools educate children with a more accurate understanding of the mind. The first three and a half minutes are particularly relevant; please feel free to use this in assemblies and PSHE lessons to start the discussion with your young people.
3. The structural design of social media feeds funnels this herd together into the same digital space, such that it becomes an echo chamber of the wrongly accepted opinion. Ultimately, the structure of the internet will need to be redesigned. If search engines model how you steer online, and then build a world that mirrors back your biases, fake news outcomes are inevitable.
As long as internet giants make their money from aggregating large populations of like-minded people in order to sell them a product, we will remain vulnerable to fake news, manipulation and distortion. The internet itself needs to price truth differently.
The real hope of blockchain may lie not in eliminating fraudulent transactions, but in removing the power of massive search engines to control digital space.
STEER IS DELIGHTED TO ANNOUNCE THE FIRST FT WEEKEND OXFORD LITERARY FESTIVAL EDUCATIONAL LEADERS DAY
Educating the Human Mind in a Robotic Age
April 1st 2019, Oxford
We are delighted to announce the launch of Educating the Human Mind in a Robotic Age, a new whole-day event for educational leaders and policy makers at the 2019 FT Weekend Oxford Literary Festival. The event will address the changes required to educate the human mind in a robotic age.
- The morning session will focus on the effects of social media and digital technologies on the human mind, our ability to learn, and our mental health.
- The afternoon session will focus on the unique cognitive capabilities required by graduates to succeed in an economy of machine learning and AI.
KEYNOTE SPEAKERS INCLUDE:
• Professor John Bargh, Director of the Automaticity in Cognition, Motivation, and Evaluation Lab at Yale University. John has led global research into cognitive priming for the past three decades. John is uniquely positioned to explain the unconscious impacts of the real and digital environments on the minds of young people.
• Professor Stephen Roberts, Professor of Machine Learning in Information Engineering at the University of Oxford. Stephen has pioneered the development of intelligent algorithms to analyse big datasets. Stephen will clarify both the power and limits of machine learning, identifying the uniquely human cognitive capacities which will remain critical to educate in a robotic age.
• The UNESCO Programme lead for Digital Technology and Education, sharing global perspectives on technology in education.
• The day will be hosted by Dr Simon Walker, Co-founder of STEER. Simon has led STEER’s pioneering work in reducing mental health risks, signposting learning-to-learn skills and improving employability in students across more than 100 schools.
OTHER HIGHLIGHTS INCLUDE:
- Data from an ongoing study of the development of adolescent social cognition between the ages of 8 and 18, involving 30,000 students.
- An extended panel interview and Q&A with keynote speakers.
Event places are limited to 100 and are available to headteachers, deputies and policy makers in educational trusts and UK government on a first-come, first-served basis.