So, the shape of things to come, I suppose. We just finished up with March Break, and we send a report card (marks & comments) home with the students as they head off on holiday. As they came back for the last term, I thought I would do a temperature check with them (if I get to write an evaluation of them, it only seems fair that they get to evaluate me!)
This time, rather than using Microsoft Forms (like here), I used a chatbot (an automated conversation): the students would talk with the chatbot and then sentiment analysis would be performed on the data. Not only would the students' written content be considered, but also how they wrote it and what words or phrasings they used.
I had all three sections interact with the chatbot (the link was posted in their OneNote) and I stepped out of the room to give them some privacy. I did mention to them that this was experimental, that they didn't have to do it, and that it was anonymous (modulo the usual "nothing is really anonymous on the internet"). They had no issue getting onto the chatbot, and no one had trouble with how it interacted with them or with what to do once they clicked the link.
I used https://hubert.ai, which is presently in beta -- but given the rate at which Microsoft is improving their chatbot framework, I can see school districts developing their own tools in the future. PowerBI, after all, does do sentiment and text analysis.
Once the students were done, I logged in to see the results of how I did. I've only blurred out the overall score because our school could use it as part of my own evaluation, and I don't think I'm comfortable with that yet.
You can download a CSV of the actual conversations the students had, so you can treat it like a Microsoft Forms (or, umm, paper) course evaluation, or do your own PowerBI analysis of the text (I haven't had time to do that yet).
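If you do want to dig into the raw text yourself, here's a rough sketch of the kind of analysis I mean, in Python with pandas and NLTK's VADER sentiment scorer. (The filename and the "message" column are my guesses -- check the headers in your actual export.)

```python
# A rough sketch, not Hubert's actual pipeline: load the exported
# conversations and score each message with NLTK's VADER analyzer.
import pandas as pd
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time download of the lexicon

# Hypothetical filename and column name -- match them to your export.
df = pd.read_csv("hubert_conversations.csv")
sia = SentimentIntensityAnalyzer()

# "compound" runs from -1 (very negative) to +1 (very positive).
df["sentiment"] = df["message"].astype(str).map(
    lambda text: sia.polarity_scores(text)["compound"]
)

# Class-level picture, rather than obsessing over one comment...
print(df["sentiment"].describe())
# ...but still surface the five most negative messages to read in context.
print(df.nsmallest(5, "sentiment")[["message", "sentiment"]])
```

VADER is tuned for short, informal, social-media-style text, which is about what a chat transcript looks like.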
One caveat to be aware of: the jokes that popped up! Now, I had tested the chatbot myself and didn't discover this, but trust a student to push the envelope. They are terrible dad jokes -- and aimed at a post-secondary crowd rather than early high school (alcohol was mentioned in one joke).
And out of the 47 students, two did loudly vocalize how "creepy" they found it to be doing this.
What I do look forward to is having the AI dig deeper once it notices a pattern, or when it picks up on avoidance or suggestive messages. And I'd love to see us be able to have it focus on particular topics (I use #VNPS, vertical non-permanent surfaces, and would like to ask questions specifically about that, for example).
Do I think they were any more honest than with Microsoft Forms or paper? Scanning through the Excel sheet of actual conversations, I don't think so. But I likely got MORE content than I would off of a Form or paper -- and the added layer of sentiment & textual analysis helped avoid focusing so much on any individual written comment (the kind we always obsess over). And it definitely SAVED TIME -- the students took maybe ten minutes to do the chat and then, one click later, I had the results.
I will point out that there are two sessions on AI during the LearnTeams conference in a few days!
https://www.learnteamsconference.com/ (yes, I'm doing one related to Education).