Open, evidence-backed AI policy records for public reuse.
Change log
Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.
Current public record freshness and review state.
The London School of Economics and Political Science (LSE) currently has 8 source-backed claim records and 6 official source attributions. Latest tracked change date: May 12, 2026.
This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
Diff-style preview built from current public claim/evidence records. Full old/new source diffs require paired historical snapshots.
Inserted lines represent current public claim and evidence records in the source-backed dataset.
8 claim records
LSE states that making generative AI tools available does not endorse unrestricted use, and users should check their specific course, programme, or department policy.
LSE research guidance tells researchers not to share personal, sensitive, or confidential data with third-party AI tools unless the tools meet LSE privacy and security standards, and it strongly encourages use of Microsoft Copilot on privacy and security grounds.
LSE requires departments or course convenors to classify authorised generative AI use in assessment as no authorised use, limited authorised use, or full authorised use, and to communicate the position to students.
LSE research guidance applies to LSE staff and students undertaking research and frames generative AI as a supportive tool while keeping researchers accountable for outputs.
LSE legal and regulatory guidance tells users not to put personal data, confidential or commercially sensitive data, or certain copyrighted content into external AI tools.
For 2025/26, LSE asked departments to add assessment safeguards, including observed assessment methods, to help assure degree integrity and prevent unfair competitive advantage from generative AI.
LSE lists Claude for Education as available to academic staff and students and Copilot with Commercial Data Protection as available to all staff and students.
LSE Claude for Education guidance says use is optional, student and staff data will not be used to train Anthropic models, and personal, operational, or confidential data must not be shared through Claude.
6 source attributions
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026