Change log

Northwestern University

Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.

Change summary

Current public record freshness and review state.

Northwestern University currently has 6 source-backed claim records and 6 official source attributions. Latest tracked change date: May 10, 2026.

This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.

Claim/evidence diff preview

Diff-style preview built from current public claim/evidence records. Full old/new source diffs require paired historical snapshots.
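To illustrate what a paired-snapshot diff would involve, here is a minimal sketch. It assumes the 12-character evidence IDs (e.g. 39cbe001cfa3) are truncated content hashes of the source text; that is a guess for illustration, not the tracker's documented scheme:

```python
import difflib
import hashlib


def snapshot_id(text: str) -> str:
    """Short content identifier for a source snapshot.

    Assumption: a 12-hex-character truncated SHA-256 of the page text,
    mirroring the shape of the evidence IDs shown in this tracker.
    """
    return hashlib.sha256(text.encode("utf-8")).hexdigest()[:12]


def snapshot_diff(old: str, new: str) -> str:
    """Unified diff between two historical snapshots of one source page."""
    return "".join(
        difflib.unified_diff(
            old.splitlines(keepends=True),
            new.splitlines(keepends=True),
            fromfile=snapshot_id(old),
            tofile=snapshot_id(new),
        )
    )
```

With only a single current snapshot per source, `snapshot_diff` has nothing to compare against, which is why the preview below can only show inserted lines.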

Northwestern University current policy evidence

Inserted lines represent current public claim and evidence records in the source-backed dataset.

+12 −0
 1    # Northwestern University AI policy record
 2  + privacy: Northwestern guidance states that faculty, staff, students, and affiliates should not enter institutional data into any generative AI tool unless the tool has been validated by the University for appropriate use and has the explicit permission of the data provider.
 3  + Evidence (en, 39cbe001cfa3): University faculty, staff, students, and affiliates should not enter institutional data into any generative AI tools that have not been validated by the University for appropriate use and have explicit permission of the data provider.
 4  + teaching: Northwestern provides instructors with three course-level AI policy options: Open (GAI permitted), Conditional (GAI permitted when explicitly authorized), and Closed (GAI prohibited).
 5  + Evidence (en, 19e5d246a33e): As with other course features, instructors are free to integrate generative artificial intelligence (GAI) to support their teaching goals. Instructors should carefully consider whether and how they will allow the use of GAI in their courses. All instructors should clearly communicate their expectations or requirements to students to avoid confusion. It may help to consider three main options: Open: The use of GAI is permitted. Conditional: The use of GAI is permitted when explicitly authorized by the instructor. Closed: The use of GAI is prohibited.
 6  + privacy: Northwestern classifies data into four levels. Only Level 1 (non-confidential, public) data may be uploaded to publicly available generative AI tools. Data above Level 1 requires tools approved through Northwestern IT procurement and security review.
 7  + Evidence (en, 39cbe001cfa3): To determine whether your data requires special attention, consult Northwestern Data Classification Policy. If your data is Level 1 (non-confidential and public data), uploading it to generative AI tools is permissible. To process data above Level 1, any generative AI tool must have been approved through Northwestern IT procurement and security review processes.
 8  + ai_tool_treatment: Microsoft Copilot is Northwestern's primary approved generative AI tool. All students, faculty, and staff have access to free Copilot Chat, with full Copilot for Microsoft 365 available as a paid subscription.
 9  + Evidence (en, 5892fd4ad2d0): Microsoft Copilot is the primary, general-use AI tool available at Northwestern. All Northwestern students, faculty, and staff have access to the free Copilot Chat, and faculty and staff can use the full Copilot for Microsoft 365 Copilot capabilities with an add-on paid subscription. When you are logged into these services with your Northwestern NetID, your data is protected and operates fully within Northwestern security, compliance, and data boundaries.
10  + academic_integrity: Unauthorized use of ChatGPT or other Generative AI tools is considered cheating and/or plagiarism per Northwestern Academic Integrity guidelines.
11  + Evidence (en, 0e694d977fb7): Unauthorized use of ChatGPT or other Generative AI tools is considered cheating and/or plagiarism in Academic Integrity: A Basic Guide.
12  + ai_tool_treatment: Microsoft Copilot, when signed in with a Northwestern Microsoft account, is approved for Level 2 and generally Level 3 data. Publicly available AI tools (ChatGPT, Gemini, MidJourney) may only be used with Level 1 public data.
13  + Evidence (en, 39cbe001cfa3): The following table outlines Northwestern current services posture based on data classification: Conversational/Interactive Mode - Public Data (Level 1): Use of publicly available tools (e.g., ChatGPT, Copilot, Bard/Gemini, MidJourney, etc.). Sensitive/Regulated Data (Level 2, Level 3, Level 4): Microsoft Copilot, when signed in with a Northwestern Microsoft account for Level 2 and Level 3 data.

Claim changes

6 claim records

privacy

Northwestern guidance states that faculty, staff, students, and affiliates should not enter institutional data into any generative AI tool unless the tool has been validated by the University for appropriate use and has the explicit permission of the data provider.

Review: Agent reviewed · Confidence: 98% · Evidence: 1 · Languages: en

teaching

Northwestern provides instructors with three course-level AI policy options: Open (GAI permitted), Conditional (GAI permitted when explicitly authorized), and Closed (GAI prohibited).

Review: Agent reviewed · Confidence: 97% · Evidence: 1 · Languages: en

privacy

Northwestern classifies data into four levels. Only Level 1 (non-confidential, public) data may be uploaded to publicly available generative AI tools. Data above Level 1 requires tools approved through Northwestern IT procurement and security review.

Review: Agent reviewed · Confidence: 97% · Evidence: 1 · Languages: en

ai_tool_treatment

Microsoft Copilot is Northwestern's primary approved generative AI tool. All students, faculty, and staff have access to free Copilot Chat, with full Copilot for Microsoft 365 available as a paid subscription.

Review: Agent reviewed · Confidence: 96% · Evidence: 1 · Languages: en

academic_integrity

Unauthorized use of ChatGPT or other Generative AI tools is considered cheating and/or plagiarism per Northwestern Academic Integrity guidelines.

Review: Agent reviewed · Confidence: 95% · Evidence: 1 · Languages: en

ai_tool_treatment

Microsoft Copilot, when signed in with a Northwestern Microsoft account, is approved for Level 2 and generally Level 3 data. Publicly available AI tools (ChatGPT, Gemini, MidJourney) may only be used with Level 1 public data.

Review: Agent reviewed · Confidence: 95% · Evidence: 1 · Languages: en

Source snapshots

6 source attributions