London, United Kingdom

King's College London

King's College London is ranked 31st in the QS 2026 rankings. It has 12 source-backed AI policy claim records drawn from 6 official source attributions. The public record preserves original-language evidence snippets, source URLs, snapshot hashes, confidence scores, and review state.

Reviewed claims: 12
Candidate claims: 0
Official sources: 6

Candidate claims are source-backed records pending review. They are not final policy conclusions and are not legal or academic integrity advice.

Reviewed claims

12 reviewed public claims

AI Tool Treatment

King's College London does not ban the use of generative AI tools by students.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
No, King's does not ban the use of any type of AI. It is increasingly part of the wider world and is changing the nature of many aspects of life including the jobs you are in or will progress into.

Academic Integrity

At King's College London, inappropriate use of generative AI without attribution is considered academic misconduct and can result in penalties ranging from formal warnings to expulsion.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
Inappropriate use without attribution is considered academic misconduct. For further details please see the Academic Misconduct Policy and Academic Misconduct Procedure. Potential penalties can range from formal warnings, resubmitting the coursework, suspension or expulsion.

Academic Integrity

King's College London does not require students to reference generative AI as an authoritative source in the reference list, but does require explicit acknowledgement of AI tool use in coursework.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
King's College London, unlike some other universities, does not require students to reference generative AI as an authoritative source in the reference list for much the same reason you would not be expected to cite a search engine, a student essay website or be over-dependent on synoptic, secondary source material. However, as we learn more about the capabilities and limitations of these tools and as we work together to evolve our own critical AI literacies, we do expect you to be explicit in acknowledging your use of generative AI tools such as Microsoft Copilot (available via your KCL account), Google Gemini, ChatGPT or any other media generated through other generative AI tools.

Academic Integrity

At King's College London, submitting AI-generated text as one's own without written departmental permission is considered misconduct under third-party involvement or text manipulation offences.

Review: Agent reviewed · Confidence: 95%

Original evidence

Evidence 1
Submitting text generated by technology/artificial intelligence as their own, without written permission from their department is considered misconduct under the offences of third-party involvement or text manipulation, if it provides undue advantage or interferes with assessment of the student's own understanding.

Academic Integrity

King's College London has disabled the AI detection feature in Turnitin due to concerns about reliability and false positives.

Review: Agent reviewed · Confidence: 90%

Original evidence

Evidence 1
King's took the decision not to enable the AI detection % in Turnitin due to concerns about its reliability and potential for false positives. As things stand AI detection is not an option that is available to us.

Teaching

King's College London supports considered use of generative AI and is open to evolving teaching, assessment and feedback practices according to need and disciplinary differences.

Review: Agent reviewed · Confidence: 90%

Original evidence

Evidence 1
King's College London supports considered use of generative AI, and is open to evolving teaching, assessment and feedback practices according to need and disciplinary differences.

AI Tool Treatment

Microsoft Copilot is available to all King's College London students via their KCL Microsoft account and comes with commercial data protection under the university's enterprise license.

Review: Agent reviewed · Confidence: 90%

Original evidence

Evidence 1
Our Enterprise license means that use of Copilot comes with commercial data protection and is therefore a more secure alternative to other generative AI tools.

Teaching

King's College London subscribes to the Russell Group's five principles on generative AI in education, including supporting AI literacy, adapting teaching and assessment, and ensuring academic integrity.

Review: Agent reviewed · Confidence: 90%

Original evidence

Evidence 1
King's contributed to and subscribes to the Russell Group's five principles on the use of generative AI tools in education: Universities will support students and staff to become AI-literate; Staff should be equipped to support students to use generative AI tools effectively and appropriately in their learning experience; Universities will adapt teaching and assessment to incorporate the ethical use of generative AI and support equal access; Universities will ensure academic rigour and integrity is upheld; Universities will work collaboratively to share best practice as the technology and its application in education evolves.

Research

King's College London permits doctoral students to use generative AI tools in their thesis writing processes for assistive purposes such as clarifying writing, provided use is declared and consistent with guidance.

Review: Agent reviewed · Confidence: 90%

Original evidence

Evidence 1
Yes, students are permitted to make use of generative AI tools in their thesis writing processes. However, where a student's use of generative AI tools in their thesis writing exceeds the circumstances permitted by this guidance, they will risk breaching the Academic Misconduct Policy.

Research

King's College London doctoral examiners must not upload any part of a student's thesis into a generative AI tool or use external AI detection software when assessing the thesis.

Review: Agent reviewed · Confidence: 90%

Original evidence

Evidence 1
Must not upload any part of a student's thesis into a generative AI tool, or make use of external generative AI detection software when assessing the thesis.

Teaching

King's College London defines four broad levels of acceptable AI use in assessments: minimal, limited/selective, open, and embedded, with programme and module leaders adjusting to assessment specifics.

Review: Agent reviewed · Confidence: 90%

Original evidence

Evidence 1
We have suggested four broad levels that your Programme and Module leaders may adjust to the specifics of an assessment: 1. Minimal - Includes routine and established use of tools such as auto transcription, spell checkers, grammar check. 2. Limited/selective - Use for clearly delineated tasks as appropriate/allowed/recommended. 3. Open - No specific restrictions but with requirement to track key stages/tools utilised. 4. Embedded - AI use is a feature of the assessment itself.

AI Tool Treatment

Microsoft Copilot is the primary institutional generative AI tool available to all King's College London students and staff via KCL Microsoft login credentials.

Review: Agent reviewed · Confidence: 85%

Original evidence

Evidence 1
Please note that Microsoft Copilot is available to all King's students with your KCL Microsoft log in credentials. Make sure you are logged into your KCL account and switch 'safe search' to 'moderate' to use it.

Candidate claims

0 machine or needs-review claims

Candidate claims are not final policy conclusions. Each preserves its source URL, source snapshot hash, evidence snippet, confidence score, and review state so the record can be audited before review.

Official sources

6 source attributions
