Edward Conroy
Senior Policy Manager, Education Policy Program
In a search for imagined cost savings, the so-called Department of Government Efficiency has suggested AI could replace the staff who help students with thousands of FAFSA and related financial aid questions every day.
The most frequent response you are likely to get from a financial aid professional when asking "how much help will I get to pay for college?" is "it depends."
That answer might annoy a student, but it's one driven by the reality that financial aid is really complicated. Part of the U.S. Department of Education's job is to help students navigate that complexity. However, instead of embracing that role, the Trump administration wants to make it harder for students and families to get the assistance they need.
Answering questions about financial aid is the job of hundreds of call center staff members contracted by the Education Department. Those staffers talk to students and families about everything from what loans and grants are available, to the Free Application for Federal Student Aid (FAFSA), to repayment options after graduation. But Elon Musk's so-called Department of Government Efficiency wants to turn over those critical responsibilities to a generative artificial intelligence system.
This would be a disaster for students.
College is one of the most significant financial investments most students and families will ever make. Getting accurate and timely information is vital for students to make decisions about where to go to school and how to pay for it. But they're only likely to get incorrect and unreliable answers from an AI chatbot.
Students and families tend to ask questions about financial aid in ways that make sense to them, but that often require much more information to answer usefully. For example: "What grants and scholarships will I be eligible for if my parents earn $50,000?"
Because of the financial aid system's complexity, there are many situations where it is difficult or impossible to give a good answer based on one or two pieces of information. Working with an actual human allows students to explain their situation, so call center staff have enough context to provide accurate information. Getting the answers to financial aid questions right is vitally important to students and families, who need to know how to complete the FAFSA correctly, understand what help they will get paying for college, and have their post-graduation loan repayment options clearly and accurately explained.
When the , it found "significant issues with over half of the responses," and 19 percent of the answers contained factual errors, a delightfully different way of saying "making things up."
Students relying on financial aid cannot afford guidance that's no more reliable than a coin flip.
While writing this piece, I asked ChatGPT: "my parents earn $54,000, will I get a Pell Grant?" Students pose this kind of question all the time. ChatGPT's answer provided completely wrong information about the current criteria used to determine Pell Grant eligibility. It cited the previous eligibility scale, known as the Expected Family Contribution, which the government has not used in two years.
Chatbots can help students in certain circumstances. When I worked in the University of California, Los Angeles financial aid office, I helped implement a chatbot to answer student questions. But that chatbot only provided information the office had vetted.
But again, generative AI systems do not know when they're wrong, which introduces the risk they'll give students flawed information. Call center staff can make mistakes too, but the important difference is that when that occurs, the error can be traced to its source and corrected.
And eventually, a human would probably have to help a student who received inaccurate AI-supplied information. I knew immediately that ChatGPT's answer to my Pell Grant question was wrong, but only because I'm an expert on the issue. Students and families who file a FAFSA are not policy experts or former financial aid administrators, nor should they have to be.
The question I posed was not complicated. What happens when a student whose parent has just died asks a Department of Education-endorsed AI whether that changes their aid eligibility? What about the student who gets married, or the one who has a baby, or loses a job? These circumstances arise for students on a regular basis, and how they affect financial aid is a topic that requires deep knowledge of the financial aid system, as well as empathy to support students when they are in crisis.
The financial aid system is undoubtedly too complex, and there is a lot that can be done to make it simpler and easier for students and families to navigate on their own. This is especially true in the wake of last year's changes to the FAFSA, which were intended to simplify it. For instance, the number of questions was reduced from over 100 down to 46. But delays and errors in the rollout made a theoretically simpler process much more challenging. Thankfully, those issues are now largely resolved.
Introducing AI into an already complicated process increases the likelihood of families receiving incorrect information, and they may not realize it until they get their financial aid offer, or until a financial aid officer from their college or university calls them about the issue.
The Education Department must maintain its role supporting students as they navigate FAFSA, not punt it to AI in service of faux government efficiency.