It's still uncertain how these chips will be used, what kind of information will be transferred from them, or where that information will go, so I think it's important that the security and ethical implications are considered fully at every step of this technology's development.

These videos reminded me of another video from last year in which futurist and technology expert Scott Klososky talks about augmented intelligence and claims it won't be long before implantables are very normal, and that schools may have to run separate classes for augmented kids (with a brain-computer interface and retinal projection) and non-augmented kids. Scott claims we will be the last un-augmented generation. He talked about this when he came to ASB Un-Plugged earlier this year as well. Personally I find this very scary, especially when you consider what will happen when students leave school and go for jobs. Clearly, if some people are augmented, those people will get the best jobs and the highest salaries, increasing inequalities. If you start to think about the consequences, it's really frightening!
I've been thinking about data and student achievement recently. I was reading about soft data in Daniel Sobel's book Narrowing the Attainment Gap: A Handbook for Schools, where Daniel argues that the attainment gap is mostly to do with soft data (motivations and barriers for students). At the same time I'm seeing more and more schools focusing on big data. A couple of weeks ago I was talking to Consilience's data scientist, Sujoy, who claims that in fact the power is in "small data": looking at individuals and what helps each child to progress. And as Tricia Wang says, "Relying on big data alone increases the chance we'll miss something, while giving us the illusion we know everything." (She recently gave a TED Talk about the human insights missing from big data that is worth watching.)
Let's pull these thoughts together. Currently many details of our lives are captured and traded by data-mining companies. You only have to pause a little over someone's Facebook post, and related ads seem to flow into your email and messages. For example, a couple of days ago I paused over an ad for microblading (I didn't have a clue what it was) and suddenly I'm getting ads all the time about beautiful eyebrows! Data is collected from the websites we browse, what we buy, our social media posts, loyalty cards, the music we listen to, the movies we watch online and so on. I've stopped giving out my real phone number when asked while making purchases, as I'm flooded with spam messages afterwards from companies targeting me with marketing for their products and services.
But can "big data" really be useful in education? Does it allow schools to better understand how students learn and how best to support them? Around six months ago the IB published an article about data entitled Big Data, Big Problems? The question this article addresses is: do the numbers tell the whole story? I was interested to read a comment by Allison Littlejohn, Professor of Learning Technology at the Open University in the UK, who claims, "We can look at trends ... and connect that with employment within countries. Depending on what the future job opportunities might be, schools can then adapt the curriculum." Really? Good lord, I'd have thought it would take a little longer than that to adapt, write and implement a new curriculum! However, I do agree with something else she writes: "We need to be sure that students are properly prepared so that when they do leave school, they're able to aim for jobs that still exist, and later change careers, which they're very likely to do throughout their lives." I think it is true that schools may be able to target the support students need when everything is more transparent, but perhaps the question then is what is being measured (is it just what is easy to measure in schools? What about the environmental factors outside of school that motivate students?). Littlejohn argues further that the success of collecting the data depends on how well coders, teachers and people who understand learning can work together. She states, "It's very difficult to actually gather the data that you need to come to the conclusions that you want to reach ... a lot of what we measure and analyse is an approximation of what people's actual ability is."
At school I work closely with our iCommons teacher/librarian, and recently we've been having a big push with our Grade 4s about bias. In fact we came across a great online resource called Checkology; it's a bit old for our Grade 4s, but I'm sharing it here because it provoked interesting discussions about bias in media. At the same time we are aware (and letting students know) that Google's autocomplete feature can produce biased results while searching, perpetuating gender and racial biases that reinforce rather than eliminate discrimination, and that can ultimately have a negative effect on development. I'm thinking about this now in terms of smart speakers as well, such as Echo and Google Home, which are "always listening" to us and which pretty soon are going to start talking to each other. What information are they giving us, knowing what we want to hear? And here is a question that Scott left us with in one of his Un-Plugged presentations: when machines can learn faster than humans, what will be the most valuable areas of learning for humans? Something to think about as teachers, right?