Critical thinking is one way to look at it. In regard to journalists, Hemingway called it "crap detection." Like search, LLMs are an extraordinarily powerful way to summon information and knowledge, with the caveat that like Internet search, it is up to the seeker to distinguish the real info/knowledge from the misinfo and disinfo. In addition to a critical mindset, learners will need to learn specific strategies and tactics. For Internet search, it can start with doing another search on the author. How to crap detect LLMs is a skill that has yet to emerge. Educational institutions didn't do such a great job with Internet search crap detection. Can educators get an early start on this one?
💯 I wholeheartedly agree. I've been thinking about how the five literacies you outlined in Net Smart apply even more now than they did 13 years ago when you wrote it. ;)
I appreciate that ChatGPT Search cites its sources and provides links for the user to follow & understand where the information came from... But that relies on the user being the responsible "seeker" who wants to "distinguish the real info from the misinfo," like you mentioned.
Any other "crap detection" strategies and tactics from Net Smart that might apply to interacting with LLMs?
I am certain that one of the first tactics is to check each of the citations from ChatGPT or any other LLM, because they are known to sometimes invent them. I wonder if other educators have suggestions for crap-detecting LLMs?
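For anyone who wants to automate a first pass at this tactic, here is a minimal sketch in Python that pulls the links out of an LLM answer and checks whether each one resolves at all. The regex and the `citation-check` user-agent string are my own illustrative choices, and a live link is not proof the source actually supports the claim — it only catches links that are outright invented or dead; verifying the content still takes a human reader.

```python
import re
import urllib.request


def extract_urls(text):
    """Pull http(s) links out of an LLM answer."""
    return re.findall(r'https?://[^\s)\]"]+', text)


def link_resolves(url, timeout=5):
    """Return True if the URL answers with an HTTP status below 400."""
    try:
        req = urllib.request.Request(
            url, method="HEAD", headers={"User-Agent": "citation-check"}
        )
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        # DNS failure, timeout, 4xx/5xx, malformed URL: treat all as suspect.
        return False


# Hypothetical LLM answer used only to demonstrate the check.
answer = "According to one study (https://example.com/paper), students improved."
for url in extract_urls(answer):
    status = "resolves" if link_resolves(url) else "dead or possibly invented"
    print(f"{url}: {status}")
```

Even this crude check would have flagged many of the fabricated citations reported in early LLM use, since invented references often point at URLs that simply do not exist.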
Great stuff! I'm writing about AI and higher ed here. https://hollisrobbinsanecdotal.substack.com/p/ai-aced-your-states-gen-ed-now-what
Interesting take.
A couple of add-ons.
Learning and 'critical thinking' are important, yes. However, in times of rapid change (like now), unlearning can be just as important, along with multiple other ways of 'thinking' (beyond just 'critical').
Anything in particular you think students (or educators) need to "unlearn"?
A Notes comment simply doesn't have enough space for that list, Mamie. ;)
Here are a few to consider:
• Fear of failure
• Siloed disciplines
• Outdated hierarchies
• Static learning models
• Inflexible methodologies
• One-dimensional thinking
• The “right answer” mindset
• Rigid educational structures
• Overemphasis on standardization
• Resistance to unconventional ideas
Practice, practice, practice. As a teacher, I want students using AI in everything we do. They need to become familiar with how it works, with how to use it, with its shortcomings.
But it’s not just practice that’s important. It’s also reflecting on the experience so they can break down and analyze what went right and what went wrong.
As a teacher I need to be the coach/guide. I need to create a safe space to experiment and fail and learn. I need to practice and master the tool(s).
Students are overwhelmed. We can help them navigate the what, why, and how of using these tools so we fulfill our responsibilities to prepare them for a career.
I am a senior researcher based in Seattle who studies the social implications of AI for K-12 students and teachers. My lab has interviewed and surveyed hundreds of students and teachers about their experiences with GenAI in the classroom. Something that technologists largely miss is that the majority of teachers and students we have talked with do not want AI integration in the classroom. Several teachers have banned technology altogether and returned to paper-and-pencil assessment. Several students question whether these products can be used ethically and have turned away from using them altogether.
Thanks for sharing. And I hear you, Jen. I think the question from the post still applies, even if teachers (or students) do not want AI in the classroom. Are there skills or mindsets educators can foster to help their students navigate a future with AI?
Literally the only two skills I currently still use are reading (so students must learn to read and to love getting information in text form) and whatever you call that skill you develop when you have to sit down and painstakingly debug code (it's the same skill you use to debug any problem in life).
IF a human child has a true desire to be valuable and/or genuine curiosity, and they possess these two aforementioned skills, then AI will take them the rest of the way, with ever-increasing speed.
Would you call "that skill you develop when you have to sit down and painstakingly debug code" grit or resilience? Or comfortable being uncomfortable?
I'm not sure what I would call it, but it does require you to be able to maintain a sustained flow state and be comfortable in it. Additionally, you need to be highly receptive to the dopamine burst you get when you solve a problem after thinking about it for a long time... Maybe resilience.