Change log

Northeastern University

Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.

Change summary

Current public record freshness and review state.

Northeastern University currently has 7 source-backed claim records and 5 official source attributions. Latest tracked change date: May 16, 2026.

This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.

Claim/evidence diff preview

Diff-style preview built from current public claim/evidence records. Full old/new source diffs require paired historical snapshots.
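Because no paired historical snapshot exists yet, every current record renders as an insertion. A minimal sketch of how such a preview could be generated from the current records alone (the record tuples and field layout here are illustrative assumptions, not the tracker's actual schema):

```python
import difflib

# Hypothetical current claim/evidence records: (topic, claim text, snapshot id).
records = [
    ("security_review", "Northeastern Policy 125 says ...", "7260a81fde1e"),
    ("teaching", "Instructors should clearly communicate ...", "a8a3517b7302"),
]

def preview_lines(records):
    """Flatten records into the claim/evidence lines shown in the preview."""
    lines = []
    for topic, claim, snapshot in records:
        lines.append(f"{topic}: {claim}")
        lines.append(f"Evidence (en, {snapshot}): ...")
    return lines

# With an empty "old" side, unified_diff marks every current line as added
# (+). A genuine old/new diff would need both historical snapshots.
old, new = [], preview_lines(records)
for line in difflib.unified_diff(old, new, lineterm=""):
    print(line)
```

Once a prior snapshot is captured, the same call with a non-empty `old` list would show real removals and modifications rather than a pure insert block.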

Northeastern University current policy evidence

Inserted lines represent current public claim and evidence records in the source-backed dataset.

+14 −0
1 1 # Northeastern University AI policy record
2+security_review: Northeastern Policy 125 says faculty or staff seeking to use an AI system in University Operations must submit the system and use case for AI Review Committee and Office of Information Security approval when the system processes confidential information, personal information, restricted research data, or may affect legal rights or physical safety.
3+Evidence (en, 7260a81fde1e): If the AI System either (i) involves the processing of Confidential Information, Personal Information, or Restricted Research Data or (ii) takes actions that may impact the legal rights or physical safety of an individual: Submit the AI System and its use case for approval by the AI Review Committee; and Submit the AI System and its use case for approval by the Office of Information Security review process.
4+ai_tool_treatment: Northeastern Policy 125 says faculty or staff seeking to use an AI system in University Operations or covered outside professional activities must provide required attribution, check AI outputs for accuracy and appropriateness, and validate anti-bias testing when the system processes personal information or affects legal rights or physical safety.
5+Evidence (en, 7260a81fde1e): Any faculty or staff member seeking to incorporate the use of an AI System in University Operations or Outside Professional Activities must: Provide appropriate attribution ... Regularly check the AI System's output for accuracy and appropriateness ... If the AI System involves the processing of Personal Information or takes actions that may impact the legal rights or physical safety of an individual, validate that it is regularly tested.
6+research: Northeastern research standards say members of the Northeastern community conducting research are expected to follow the university AI Policy, complete AI Review Committee review when research uses an AI system to process confidential information, restricted research data, or personal information, and communicate permitted generative AI uses within project teams based on research activity and sponsor guidelines.
7+Evidence (en, a883efa74b1b): The University expects all members of the Northeastern community conducting research to follow the requirements set forth in the university AI Policy and to: Complete the AI Review Committee review process if your research involves using an AI System to process Confidential Information, Restricted Research data or Personal Information ... Follow guidelines set by the funding agency or publisher ... Communicate with fellow lab and project team members about the permitted uses of generative AI.
8+security_review: Northeastern teaching and learning standards state that using generative AI to grade open-ended student responses requires AI Review Committee review because it could affect student legal rights and may involve sensitive personal information and risk of illegal bias or discrimination.
9+Evidence (en, a8a3517b7302): Because it could impact the legal rights of students and may involve sensitive personal information and risk of illegal bias and discrimination, any use of generative AI to grade open-ended student responses, including written or multimodal work products, requires review by the AI Review Committee.
10+teaching: Northeastern teaching and learning standards say instructors should clearly communicate permitted generative AI uses to students in the syllabus, assignment guidelines, and verbally in class.
11+Evidence (en, a8a3517b7302): Instructors should clearly communicate to students the permitted uses of generative AI in coursework. This communication should be written in the syllabus, assignment guidelines, and conveyed verbally in class.
12+privacy: Northeastern administrative AI standards say administrative users should use approved AI environments for data or use cases requiring AI Review Committee approval and should not enter confidential, restricted research, or personal information into an AI system that has not been reviewed and approved by the AI Review Committee and Office of Information Security.
13+Evidence (en, 5c98f7503392): Use approved AI environments only when processing data or involving use-cases that require AI Review Committee approval ... Do not enter any confidential, restricted research, or personal information into an AI system that has not been reviewed and approved by the AI Review Committee and the Office of Information Security.
14+academic_integrity: Northeastern's student AI guide advises students to check each syllabus, ask professors when an AI policy is unclear, use AI as a study partner rather than a substitute for their own thinking, and avoid using AI when assignment guidelines direct independent work.
15+Evidence (en, 5511f65033fe): Every class is different ... Check each syllabus carefully. Ask your professors if you're unsure or if an AI policy is unclear ... The key is using AI as a study partner, not as a substitute for your own thinking ... Resist the temptation to use it when assignment guidelines direct you to work independently.

Claim changes

7 claim records

academic_integrity

Northeastern's student AI guide advises students to check each syllabus, ask professors when an AI policy is unclear, use AI as a study partner rather than a substitute for their own thinking, and avoid using AI when assignment guidelines direct independent work.

Review: Agent reviewed · Confidence 78% · Evidence: 1 · Languages: en

privacy

Northeastern administrative AI standards say administrative users should use approved AI environments for data or use cases requiring AI Review Committee approval and should not enter confidential, restricted research, or personal information into an AI system that has not been reviewed and approved by the AI Review Committee and Office of Information Security.

Review: Agent reviewed · Confidence 90% · Evidence: 1 · Languages: en

security_review

Northeastern teaching and learning standards state that using generative AI to grade open-ended student responses requires AI Review Committee review because it could affect student legal rights and may involve sensitive personal information and risk of illegal bias or discrimination.

Review: Agent reviewed · Confidence 93% · Evidence: 1 · Languages: en

teaching

Northeastern teaching and learning standards say instructors should clearly communicate permitted generative AI uses to students in the syllabus, assignment guidelines, and verbally in class.

Review: Agent reviewed · Confidence 92% · Evidence: 1 · Languages: en

research

Northeastern research standards say members of the Northeastern community conducting research are expected to follow the university AI Policy, complete AI Review Committee review when research uses an AI system to process confidential information, restricted research data, or personal information, and communicate permitted generative AI uses within project teams based on research activity and sponsor guidelines.

Review: Agent reviewed · Confidence 93% · Evidence: 1 · Languages: en

ai_tool_treatment

Northeastern Policy 125 says faculty or staff seeking to use an AI system in University Operations or covered outside professional activities must provide required attribution, check AI outputs for accuracy and appropriateness, and validate anti-bias testing when the system processes personal information or affects legal rights or physical safety.

Review: Agent reviewed · Confidence 94% · Evidence: 1 · Languages: en

security_review

Northeastern Policy 125 says faculty or staff seeking to use an AI system in University Operations must submit the system and use case for AI Review Committee and Office of Information Security approval when the system processes confidential information, personal information, restricted research data, or may affect legal rights or physical safety.

Review: Agent reviewed · Confidence 96% · Evidence: 1 · Languages: en

Source snapshots

5 source attributions

Policy on the Use of Artificial Intelligence Systems

official_policy_page checked May 16, 2026

Snapshot hash
7260a81fde1ee53c2673f8574cd62b4016b4f0a7e777112c4246c90016a0aa48
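The 64-hex-digit snapshot hash is consistent with a SHA-256 digest of the snapshotted page content. A sketch of verifying a saved or re-fetched copy against the recorded value (the assumption that the digest is SHA-256 over UTF-8 text, with no canonicalization step, is ours, not the tracker's documented scheme):

```python
import hashlib

# Recorded hash for the Policy 125 snapshot, copied from the tracker.
RECORDED_HASH = "7260a81fde1ee53c2673f8574cd62b4016b4f0a7e777112c4246c90016a0aa48"

def snapshot_hash(text: str) -> str:
    """SHA-256 hex digest of snapshot text (UTF-8 encoding assumed)."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def matches_recorded(text: str, recorded: str = RECORDED_HASH) -> bool:
    """True only if the copy hashes to the recorded snapshot value."""
    return snapshot_hash(text) == recorded
```

If the tracker normalizes the page (whitespace, HTML stripping) before hashing, a raw re-fetch would not reproduce the digest even when the policy text is unchanged, so a mismatch alone does not prove the source changed.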