Manchester, United Kingdom

The University of Manchester

The University of Manchester is listed as QS 2026 rank 35. The University of Manchester has 12 source-backed AI policy claim records from 6 official source attributions. The public record preserves original-language evidence snippets, source URLs, snapshot hashes, confidence, and review state.

Citation-ready overview

v1 public contract

Reviewed claims: 12 · Candidate claims: 0 · Official sources: 6

Candidate claims are source-backed records pending review. They are not final policy conclusions and are not legal or academic integrity advice.

Reviewed claims

12 reviewed public claims

AI Tool Treatment

The University of Manchester does not ban generative AI. The university's position is that when used appropriately, AI tools have the potential to enhance teaching and learning, and can support inclusivity and accessibility.

Review: Agent reviewed · Confidence: 98%

Original evidence

Evidence 1
The University position is that, when used appropriately, AI tools have the potential to enhance teaching and learning and can support inclusivity and accessibility. Output from AI systems must be treated in the same manner by staff and students as work created by another person or persons, used critically and with permitted license, and cited and acknowledged appropriately.

Teaching

Manchester has adopted five core principles for AI use: transparency, accountability, competence, responsible use, and respect. All staff and students using or developing AI are personally responsible for adhering to these.

Review: Agent reviewed · Confidence: 98%

Original evidence

Evidence 1
The University has adopted the following principles, building on existing frameworks for academic integrity and emerging guidance on AI. All staff and students using or developing AI are personally responsible for adhering to these.

AI Tool Treatment

The University of Manchester is the world's first university to deploy equitable Microsoft 365 Copilot access and training across its entire community, with 65,000 colleagues and students receiving the full M365 Copilot suite.

Review: Agent reviewed · Confidence: 98%

Original evidence

Evidence 1
The University has announced a strategic collaboration with Microsoft, making Manchester the world's first university to deploy equitable Microsoft 365 Copilot access and training across its entire community. This means 65,000 colleagues and students will benefit from the full Microsoft 365 Copilot suite.

Privacy

Copilot Chat is available to everyone with a University of Manchester account. The university has a contractual agreement with Microsoft ensuring prompts and uploaded files are private, protected by the same security and encryption as emails and OneDrive. The AI system does not learn from user prompts or data.

Review: Agent reviewed · Confidence: 98%

Original evidence

Evidence 1
Copilot Chat is an AI-powered assistant available to everyone with a University account... Your prompts and uploaded files are private – no one else can see them and they are protected by the same security and encryption techniques as your emails and the contents of your OneDrive or SharePoint Sites. Copilot Chat's AI system does not learn from your prompts or data.

Academic Integrity

Submitting work created by generative AI as one's own, or misrepresenting understanding of the subject, is plagiarism at Manchester and will be dealt with in accordance with the University's Academic Malpractice Procedure.

Review: Agent reviewed · Confidence: 98%

Original evidence

Evidence 1
Submitting work created by Generative AI as their own, or to misrepresent their understanding of the subject, is plagiarism and will be dealt with in accordance with the University's Academic Malpractice Procedure.

Academic Integrity

Tools to detect AI-generated content are unreliable and biased and cannot be relied on to identify academic malpractice in summative assessment at Manchester. Output from such tools cannot currently be used as evidence of malpractice.

Review: Agent reviewed · Confidence: 98%

Original evidence

Evidence 1
Tools to detect AI-generated content are unreliable and biased and cannot be relied on to identify academic malpractice in summative assessment. Output from such tools cannot currently be used as evidence of malpractice.

Academic Integrity

Students at Manchester must cite or acknowledge the outputs of generative AI tools when they use them in their work, including quoting, summarising, paraphrasing, editing, translating, data processing, re-writing, and idea generation.

Review: Agent reviewed · Confidence: 98%

Original evidence

Evidence 1
You must cite or acknowledge the outputs of generative A.I. tools when you use them in your work. Quotating, summarising and paraphrasing, editing, translating, data processing, re-writing your work and the generation of ideas.

AI Tool Treatment

University-approved enterprise AI tools (Microsoft Copilot Chat for all, M365 Copilot via licences) must be used whenever there is a risk of inappropriate disclosure. Free-to-use public AI services should only be used with extreme caution due to data disclosure risks.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
Many generative AI tools are available as either free-to-use public services or paid-for enterprise services. Free-to-use public services often store user input, potentially disclosing the content to third parties. They should only be used with extreme caution... University-approved enterprise AI tools must be used whenever there is a risk of inappropriate disclosure.

Teaching

With approval at School level, Manchester's default AI position may be broadened or narrowed for specific course units or assignments. Students must be given detailed information explaining the rationale and what is and is not allowed.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
With approval at School level, this position may be broadened or narrowed for specific Course Units or assignments to encourage, require, or disallow specific uses of AI. In such cases students must be given detailed information that explains the rationale for the variation from the default position, as well as what is and is not allowed.

Teaching

Using an AI tool to correct grammar or spelling is acceptable at Manchester, but students should ensure that use of the tool does not result in substantive changes to the content or meaning of their work.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
Using an AI tool to correct grammar or spelling is acceptable, but where a student uses an AI tool for proofreading work submitted for assessment, they should ensure that use of the tool does not result in substantive changes to the content or meaning of their work.

Research

Manchester recognises the potential of AI to power research and innovation. Any use of AI to generate data should be completely transparent. Using AI to fabricate or manipulate data without clear declaration constitutes research misconduct.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
The University recognises the potential of AI to power research and innovation... Any use of AI to generate data should be completely transparent... Using AI to fabricate or manipulate data such as experimental measurements, interview texts or research images, without clear declaration, constitutes research misconduct.

AI Tool Treatment

The University recommends Microsoft Copilot for AI-related work. It is GDPR-compliant and protects University and personal data. Staff should always carefully consider whether adding personal information into an AI tool is necessary.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
Microsoft Copilot: The University recommends using Microsoft Copilot for AI-related work. It is GDPR-compliant and protects University and personal data. Even though Copilot is GDPR compliant, you should ALWAYS carefully consider whether adding personal information into an AI tool is necessary.

Candidate claims

0 machine or needs-review claims

Candidate claims are not final policy conclusions. They preserve source URL, source snapshot hash, evidence, confidence, and review state so the record can be audited before review.
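The audit fields listed above could be modeled roughly as the following record type. This is an illustrative sketch only: the field names, the confidence scale, and the review-state values are assumptions, not the site's actual schema.

```typescript
// Hypothetical shape of a claim record; field names and the reviewState
// values are assumptions based on the fields listed above, not a real API.
interface ClaimRecord {
  sourceUrl: string;      // URL of the official source page
  snapshotHash: string;   // hash of the archived source snapshot
  evidence: string;       // original-language evidence snippet
  confidence: number;     // reviewer/agent confidence, 0-100
  reviewState: "machine" | "needs-review" | "agent-reviewed";
}

// Example record mirroring one of the reviewed claims above.
const example: ClaimRecord = {
  sourceUrl: "https://example.org/ai-guidance",
  snapshotHash: "sha256:0000",
  evidence: "Tools to detect AI-generated content are unreliable and biased",
  confidence: 98,
  reviewState: "agent-reviewed",
};
```

Keeping the snapshot hash alongside the URL is what makes the record auditable even if the source page later changes.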

Official sources

6 source attributions
