# Northeastern University AI policy record
security_review: Northeastern Policy 125 says faculty or staff seeking to use an AI system in University Operations must submit the system and use case for AI Review Committee and Office of Information Security approval when the system processes confidential information, personal information, or restricted research data, or may affect legal rights or physical safety.
Evidence (en, 7260a81fde1e): If the AI System either (i) involves the processing of Confidential Information, Personal Information, or Restricted Research Data or (ii) takes actions that may impact the legal rights or physical safety of an individual: Submit the AI System and its use case for approval by the AI Review Committee; and Submit the AI System and its use case for approval by the Office of Information Security review process.
ai_tool_treatment: Northeastern Policy 125 says faculty or staff seeking to use an AI system in University Operations or covered outside professional activities must provide required attribution, check AI outputs for accuracy and appropriateness, and validate anti-bias testing when the system processes personal information or affects legal rights or physical safety.
Evidence (en, 7260a81fde1e): Any faculty or staff member seeking to incorporate the use of an AI System in University Operations or Outside Professional Activities must: Provide appropriate attribution ... Regularly check the AI System's output for accuracy and appropriateness ... If the AI System involves the processing of Personal Information or takes actions that may impact the legal rights or physical safety of an individual, validate that it is regularly tested.
research: Northeastern research standards say members of the Northeastern community conducting research are expected to follow the university AI Policy, complete AI Review Committee review when research uses an AI system to process confidential information, restricted research data, or personal information, and communicate permitted generative AI uses within project teams based on research activity and sponsor guidelines.
Evidence (en, a883efa74b1b): The University expects all members of the Northeastern community conducting research to follow the requirements set forth in the university AI Policy and to: Complete the AI Review Committee review process if your research involves using an AI System to process Confidential Information, Restricted Research data or Personal Information ... Follow guidelines set by the funding agency or publisher ... Communicate with fellow lab and project team members about the permitted uses of generative AI.
security_review: Northeastern teaching and learning standards state that using generative AI to grade open-ended student responses requires AI Review Committee review because it could affect student legal rights and may involve sensitive personal information and risk of illegal bias or discrimination.
Evidence (en, a8a3517b7302): Because it could impact the legal rights of students and may involve sensitive personal information and risk of illegal bias and discrimination, any use of generative AI to grade open-ended student responses, including written or multimodal work products, requires review by the AI Review Committee.
teaching: Northeastern teaching and learning standards say instructors should clearly communicate permitted generative AI uses to students in the syllabus, in assignment guidelines, and verbally in class.
Evidence (en, a8a3517b7302): Instructors should clearly communicate to students the permitted uses of generative AI in coursework. This communication should be written in the syllabus, assignment guidelines, and conveyed verbally in class.
privacy: Northeastern administrative AI standards say administrative users should use approved AI environments for data or use cases requiring AI Review Committee approval and should not enter confidential, restricted research, or personal information into an AI system that has not been reviewed and approved by the AI Review Committee and the Office of Information Security.
Evidence (en, 5c98f7503392): Use approved AI environments only when processing data or involving use-cases that require AI Review Committee approval ... Do not enter any confidential, restricted research, or personal information into an AI system that has not been reviewed and approved by the AI Review Committee and the Office of Information Security.
academic_integrity: Northeastern's student AI guide advises students to check each syllabus, ask professors when an AI policy is unclear, use AI as a study partner rather than a substitute for their own thinking, and avoid using AI when assignment guidelines direct independent work.
Evidence (en, 5511f65033fe): Every class is different ... Check each syllabus carefully. Ask your professors if you're unsure or if an AI policy is unclear ... The key is using AI as a study partner, not as a substitute for your own thinking ... Resist the temptation to use it when assignment guidelines direct you to work independently.