ai_tool_treatment
The University of Manchester does not ban generative AI. The university's position is that when used appropriately, AI tools have the potential to enhance teaching and learning, and can support inclusivity and accessibility.
Open, evidence-backed AI policy records for public reuse.
Change log
Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.
Current public record freshness and review state.
The University of Manchester currently has 12 source-backed claim records and 6 official source attributions. Latest tracked change date: May 10, 2026.
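The "source snapshot hashes" mentioned in the change log can be produced by fingerprinting each saved source page. As a minimal sketch (the tracker's actual hashing scheme is not published here; `snapshot_hash` and the normalisation step are assumptions), a SHA-256 digest over normalised text gives a stable identifier for detecting when a source changes:

```python
import hashlib

def snapshot_hash(snapshot_text: str) -> str:
    """Return a SHA-256 fingerprint for a source snapshot.

    Line endings are normalised first so that otherwise-identical
    snapshots saved on different platforms hash the same.
    """
    normalised = "\n".join(snapshot_text.splitlines())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

h = snapshot_hash("The University of Manchester does not ban generative AI.\n")
print(h[:12])  # a short prefix is enough for a change-log display
```

Comparing the stored hash against a fresh fetch's hash is enough to flag a changed source; it does not reveal what changed, which is why paired historical snapshots are needed for full diffs.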
This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
Diff-style preview built from current public claim/evidence records. Full old/new source diffs require paired historical snapshots.
Inserted lines represent current public claim and evidence records in the source-backed dataset.
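The all-insertions preview described above can be sketched with the standard library's `difflib`: diffing an empty "previous snapshot" against the current claim records marks every record as inserted. The function name and file labels below are illustrative, not the tracker's actual implementation:

```python
import difflib

def diff_preview(old_lines, new_lines):
    """Unified-diff preview of claim records.

    Without a paired historical snapshot, old_lines is empty, so
    every current record appears as an insertion ('+' line).
    """
    return list(difflib.unified_diff(
        old_lines, new_lines,
        fromfile="previous_snapshot",
        tofile="current_record",
        lineterm="",
    ))

claims = [
    "The University of Manchester does not ban generative AI.",
    "Manchester has adopted five core principles for AI use.",
]
for line in diff_preview([], claims):
    print(line)
```

Once a paired historical snapshot exists, passing its lines as `old_lines` yields a true old/new diff with context, removals, and insertions rather than the insertion-only preview.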
12 claim records
The University of Manchester does not ban generative AI. The university's position is that when used appropriately, AI tools have the potential to enhance teaching and learning, and can support inclusivity and accessibility.
Manchester has adopted five core principles for AI use: transparency, accountability, competence, responsible use, and respect. All staff and students using or developing AI are personally responsible for adhering to these.
The University of Manchester is the world's first university to deploy equitable Microsoft 365 Copilot access and training across its entire community, with 65,000 colleagues and students receiving the full M365 Copilot suite.
Copilot Chat is available to everyone with a University of Manchester account. The university has a contractual agreement with Microsoft ensuring prompts and uploaded files are private, protected by the same security and encryption as emails and OneDrive. The AI system does not learn from user prompts or data.
Submitting work created by generative AI as one's own, or misrepresenting understanding of the subject, is plagiarism at Manchester and will be dealt with in accordance with the University's Academic Malpractice Procedure.
Tools that detect AI-generated content are unreliable and biased, and cannot be relied on to identify academic malpractice in summative assessment at Manchester. Output from such tools cannot currently be used as evidence of malpractice.
Students at Manchester must cite or acknowledge the outputs of generative AI tools when they use them in their work, including quoting, summarising, paraphrasing, editing, translating, data processing, re-writing, and idea generation.
University-approved enterprise AI tools (Microsoft Copilot Chat for all, M365 Copilot via licences) must be used whenever there is a risk of inappropriate disclosure. Free-to-use public AI services should only be used with extreme caution due to data disclosure risks.
With approval at School level, Manchester's default AI position may be broadened or narrowed for specific course units or assignments. Students must be given detailed information explaining the rationale and what is and is not allowed.
Using an AI tool to correct grammar or spelling is acceptable at Manchester, but students should ensure that use of the tool does not result in substantive changes to the content or meaning of their work.
Manchester recognises the potential of AI to power research and innovation. Any use of AI to generate data should be completely transparent. Using AI to fabricate or manipulate data without clear declaration constitutes research misconduct.
The University recommends Microsoft Copilot for AI-related work. It is GDPR-compliant and protects University and personal data. Staff should always carefully consider whether adding personal information into an AI tool is necessary.
6 source attributions
official_guidance checked May 10, 2026
official_guidance checked May 10, 2026
official_guidance checked May 10, 2026
official_guidance checked May 10, 2026
official_guidance checked May 10, 2026
official_guidance checked May 10, 2026