It’s difficult to give every student the support they need when you’re fielding hundreds, or even thousands, of questions a day. In response, many institutions have turned to AI (artificial intelligence) or rule-based chatbots to help manage the volume of incoming requests. This has freed up faculty and made their jobs a little easier, though we can’t help but wonder—has it done the same for students?
College students need quick and accurate answers when they’re seeking support. Institutions are responsible for providing tools that will give them those answers in the most efficient and effective way possible. AI might seem like the best solution because it’s innovative and data shows it works well in certain situations. But there are certain jobs AI simply cannot fill—and student support might be one of them.
It’s imperative to ask: What do students want when they need support? Students don’t need a friendly robot. They don’t want to waste time exchanging pleasantries with a system. They don’t want to jump through hoops either. It’s frustrating to answer a bunch of questions in a chat box only to be directed to an online document that doesn’t help.
—Art Markman (Professor of Psychology at the University of Texas) on automated tech support
Universities offer value through their people, and that’s what students need—a true human-to-human connection. Students feel like they matter and their needs matter when a faculty member is attentive to their questions. What’s more, students need opportunities to build relationships with faculty members who might otherwise seem unapproachable.
AI vs. Automation
Before going any further, it’s important to distinguish between AI and automation; the two are often confused. Put simply, here’s the difference:
AI uses machine learning and natural language processing (NLP) to receive data, learn from it, adapt, and generate a proactive response.
Automated chatbots (or rule-based chatbots) perform only according to predefined rules. They don’t learn, and they can’t answer questions that fall outside those rules, so the questions they field are more likely to be passed on to a human support agent. It’s one thing to use an automated chatbot as a supplement to a help center, helping students find information quickly without scrolling through pages of articles. But if it’s where students are directed for any and every support need, it won’t be sustainable.
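To make the distinction concrete, here’s a minimal Python sketch of the two approaches. It’s purely illustrative: the rules, intents, and training examples are made up, and the “AI” side is just a tiny intent classifier rather than anything a real vendor ships. The point is the difference in how each one handles a question it hasn’t seen before.

```python
# Illustrative only: a rule-based chatbot vs. a (very small) ML/NLP-style one.
# The rules, intents, and training data below are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

RULES = {
    "financial aid": "Visit the financial aid page or contact the aid office.",
    "transcript": "Order transcripts through the registrar's online portal.",
}

def rule_based_reply(question: str) -> str:
    """Automated (rule-based) chatbot: answers only if a predefined keyword matches."""
    q = question.lower()
    for keyword, canned_answer in RULES.items():
        if keyword in q:
            return canned_answer
    # Anything outside the rules has to be handed off to a person.
    return "Let me connect you with a support agent."

# An AI-style chatbot instead learns from labeled examples.
training_questions = [
    "How do I apply for a scholarship?",
    "When is tuition due this semester?",
    "How do I get a copy of my transcript?",
    "Can I see my grades from last semester?",
]
training_intents = ["financial_aid", "billing", "registrar", "registrar"]

intent_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
intent_model.fit(training_questions, training_intents)

def ml_reply(question: str) -> str:
    """AI-style chatbot: predicts the intent of an unseen question from learned patterns."""
    intent = intent_model.predict([question])[0]
    return f"Routing you to {intent} resources."
```

The rule-based function only matches the keywords it was given; the learned model can generalize to phrasings it has never seen, which is what “learn and adapt” means in practice.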
A lot of products disguise themselves as AI, but they’re just automated chatbots bloated with information. Forbes AI expert Ron Schmelzer writes, “Many firms are claiming to be AI-enabled when all they have done is put some thin capability provided by a third-party library or API that doesn’t really transform their existing product into something that is inherently different with that new intelligent technology.”
Colleges and universities must be careful to avoid false advertising in this area. It’s easy to label something as “AI” for the sake of marketability. That’s why it’s crucial for institutions to truly understand what AI is and isn’t. It could make all the difference when it comes to the student experience.
Things to Consider if You’re Using or Plan to Use an AI Support Tool
- Data is key. AI relies on data to learn and adapt, so consider whether your institution can collect and maintain enough quality data for the tool to function successfully.
- Understand and measure results. Knowing how many questions your AI answers isn’t enough; you need to know whether those answers actually help students. An AI chatbot that fields 5,000 questions a day has succeeded only if students walk away with accurate, usable answers (see the sketch after this list for one way to frame those metrics).
- Does your institution have the IT staff to manage and maintain AI systems?
- Do you trust AI to get it right? What happens if the AI tool gives a student a misleading or incorrect response? If the student acts on inaccurate information, there could be significant consequences—missed deadlines, a failing grade, late applications, etc.
- An AI support tool might be beneficial for faculty (less time spent answering student questions), but does it help students? Or are faculty freed up at the students’ expense?
- How experimental is it? Schools often test new innovations in the classroom and on campus, and it’s students who bear the risk. Be careful how and with whom you test new tools. You don’t want to discover that an experiment failed only after a semester or a year has passed and students have suffered for it.
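To put the “measure results” bullet above in more concrete terms, here’s a hypothetical sketch of the kind of metrics worth tracking. The session fields and the “did this solve your problem?” signal are invented for illustration; the takeaway is simply that volume means little without a resolution signal alongside it.

```python
# Hypothetical sketch: volume alone says little; pair it with a resolution signal.
from dataclasses import dataclass

@dataclass
class ChatSession:
    question: str
    answered_by_bot: bool            # did the bot produce an answer at all?
    student_confirmed_helpful: bool  # e.g., from a "Did this solve your problem?" prompt
    escalated_to_human: bool

def support_metrics(sessions: list[ChatSession]) -> dict[str, float]:
    total = len(sessions)
    if total == 0:
        return {"volume": 0.0, "resolution_rate": 0.0, "escalation_rate": 0.0}
    resolved = sum(s.answered_by_bot and s.student_confirmed_helpful for s in sessions)
    escalated = sum(s.escalated_to_human for s in sessions)
    return {
        "volume": float(total),               # how many questions were fielded
        "resolution_rate": resolved / total,  # how many were actually resolved
        "escalation_rate": escalated / total, # how many still needed a human
    }

sessions = [
    ChatSession("When is the FAFSA deadline?", True, True, False),
    ChatSession("My aid was reduced mid-semester, what do I do?", True, False, True),
]
print(support_metrics(sessions))
# {'volume': 2.0, 'resolution_rate': 0.5, 'escalation_rate': 0.5}
```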
Is AI Equitable?
Equitable access to support is a persistent challenge for college students. An AI support tool can inadvertently threaten equity by answering some questions but not others, especially the more specific, unique questions that only a human can truly understand.
Imagine a student has a problem that could be easily addressed by a faculty member, but instead, they’re directed to a chatbot that doesn’t quite understand their problem. The chatbot provides an answer, but it only answers part of their question, or worse—it offers a completely unrelated solution. The student either exits the chatbot with insufficient information, or the chatbot connects them to a faculty member, which is what should have happened in the first place.
Students are more likely to give up and become discouraged when they can’t get the help they need. This is especially true among first-generation or low-income students. First-generation students with no parental experience to lean on are more likely to have trouble knowing where to find help. When they have to jump through hoops to get connected with a faculty member, it makes the university system even more confusing to navigate. Low-income students are more likely to drop out when their college experience isn’t on par with the time and money they’ve invested. In other words, is college really worth it if they can’t get the help they need to succeed while balancing school, work, and finances?
Equitable student support means quick and easy access to the right person at the right time for all students, regardless of background, status, demographic, life stage, or income level. If an AI support tool hinders that, it’s either not the right tool for your university, or it’s being used in the wrong place.
A Place for Everything
Technology is transformational when it’s in the right place. AI isn’t a bad thing. It continues to transform industries all over the world by empowering humans to do their jobs better, faster, and more effectively. The problem is when we try to replace humans with technology in areas where the human touch is necessary.
—Dr. Luis Perez-Breva, Faculty Director of MIT Innovation Teams
AI has its place. It’s up to each institution to determine if student support is that place, or if AI is better used elsewhere.
EVAN360 and AI
EVAN360 isn’t necessarily a replacement for AI student support. It is, however, a tool that’s designed to help improve the student support experience with the right balance of technology and human interaction.
Here’s how EVAN360 is changing the support experience:
- There are no hoops to jump through. It connects students to a human upfront.
- With EVAN360, students know exactly where they can get the right answer fast.
- It improves equity. All students have the same access to faculty and staff whenever they need help.
- There are no general responses. Because students are always connected with a human faculty member, they can get personalized answers to their specific, nuanced questions in ways they never could with AI.
As you can probably see, EVAN360 is all about human connection. We believe that’s the most important part of student support. Faculty cannot be replaced by a machine, a system, or any kind of technology. As mentioned above, universities offer value through their people, and that’s one thing that can never be replaced.