Student Complaint Against Professor for AI Usage Emphasizes Need for Educational Agencies to Provide Clear Guidance

Despite concerns among educators regarding students’ use of AI, educators themselves are increasingly relying on AI tools. A recent incident at Northeastern University and the resulting fallout serve as a reminder that the absence of clear, comprehensive AI policies or guidance can lead to conflicts between educators and students.  As generative AI becomes increasingly sophisticated and accessible, educational leaders must proactively address these emerging issues before they lead to formal complaints or become part of the news cycle.

The conflict at Northeastern began in February 2025, when an undergraduate student noticed that a presentation by one of her business professors contained egregious misspellings and photos of people with extra body parts, among other issues. She became suspicious that the professor had generated the presentation using AI and believed that he had not been transparent about his use of it. The fact that the professor forbade students from using AI tools in the class heightened her frustration. In other words, the professor prohibited students from using AI while surreptitiously using it in his own presentation to them.

The student filed a formal complaint against the professor, alleging undisclosed use of AI, and demanded a tuition refund exceeding $8,000 for the business course as a remedy. She argued that, given Northeastern’s high cost of tuition and reputation, she expected a higher quality of education than an AI-generated slide presentation with visibly incorrect information. University officials held a series of meetings with the student and subsequently rejected her refund claim. Several months later, Northeastern adopted a formal AI policy setting forth requirements for AI usage. Among other things, the policy requires users to include attribution and to review AI-generated output for accuracy and appropriateness. The policy is accompanied by express standards for its implementation, including “Standards for the Use of Generative AI in Administrative Work,” “Standards for the Use of AI in Research at Northeastern,” and “Standards for the Use of Generative AI in Teaching.”

In this situation, the professor explained that he had intended to use AI to create an aesthetically pleasing presentation, but that he did not closely review the output. In preparing its article on the incident, the New York Times contacted dozens of professors, the majority of whom agreed that the professor’s use of AI was “perfectly fine.” This suggests that many instructors view AI as an acceptable tool for student instruction but lack awareness of the precautions its use requires.

This incident illustrates why educational agencies urgently need to develop agency-wide policies, or at the very least provide guidelines to staff members, before problems arise. Districts should consider working with stakeholders to gain their perspectives on various AI topics including, but not limited to, academic integrity, student privacy, and ethical implications. These conversations will assist in establishing AI board policies or other guidance documents that clearly outline expectations for faculty and students. Several educational agencies have prepared comprehensive guidance documents that include a list of approved generative AI tools, permitted and prohibited uses of AI in the workplace, and information about integration of AI into the curriculum, among other things. The alternative, reactive policymaking in response to controversy, risks negative publicity, erosion of trust, and potential exposure to legal challenges. Educational agencies should act now to adopt AI policies or guidance documents in anticipation of complaints similar to the one described above.

Thank you to law clerk Anastacia Son for her contribution to this Blog.

This AALRR publication is intended for informational purposes only and should not be relied upon in reaching a conclusion in a particular area of law. Applicability of the legal principles discussed may differ substantially in individual situations. Receipt of this or any other AALRR publication does not create an attorney-client relationship. The Firm is not responsible for inadvertent errors that may occur in the publishing process.

© 2025 Atkinson, Andelson, Loya, Ruud & Romo

