AI tool treatment
King's College London does not ban the use of generative AI tools by students.
Open, evidence-backed AI policy records for public reuse.
Change log
Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.
Current public record freshness and review state.
King's College London currently has 12 source-backed claim records and 6 official source attributions. Latest tracked change date: May 10, 2026.
This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
This diff-style preview is built from the current public claim and evidence records; full old/new source diffs require paired historical snapshots. Inserted lines represent the current public claim and evidence records in the source-backed dataset.
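As a loose illustration of the mechanism described above, the snapshot-hash and diff-preview steps could be sketched as follows. This is a minimal sketch under stated assumptions, not the tracker's actual implementation; the function names, file labels, and choice of SHA-256 are all hypothetical.

```python
import difflib
import hashlib

def snapshot_hash(text: str) -> str:
    """Content hash identifying one source snapshot (SHA-256 hex digest)."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def diff_preview(old: str, new: str) -> str:
    """Unified diff between two paired snapshots.

    A full old/new diff needs both sides; with only the current
    snapshot, pass old="" and every line shows as inserted.
    """
    return "\n".join(difflib.unified_diff(
        old.splitlines(), new.splitlines(),
        fromfile="old_snapshot", tofile="new_snapshot", lineterm=""))
```

With only current records on hand, `diff_preview("", current_text)` renders every claim line as an insertion, which matches the "inserted lines" presentation described above.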
12 claim records
King's College London does not ban the use of generative AI tools by students.
At King's College London, inappropriate use of generative AI without attribution is considered academic misconduct and can result in penalties ranging from formal warnings to expulsion.
King's College London does not require students to reference generative AI as an authoritative source in the reference list, but does require explicit acknowledgement of AI tool use in coursework.
At King's College London, submitting AI-generated text as one's own without written departmental permission is considered misconduct under third-party involvement or text manipulation offences.
King's College London has disabled the AI detection feature in Turnitin due to concerns about reliability and false positives.
King's College London supports considered use of generative AI and is open to evolving teaching, assessment and feedback practices according to need and disciplinary differences.
Microsoft Copilot is available to all King's College London students via their KCL Microsoft account and comes with commercial data protection under the university's enterprise license.
King's College London subscribes to the Russell Group's five principles on generative AI in education, including supporting AI literacy, adapting teaching and assessment, and ensuring academic integrity.
King's College London permits doctoral students to use generative AI tools in their thesis writing processes for assistive purposes such as clarifying writing, provided use is declared and consistent with guidance.
King's College London doctoral examiners must not upload any part of a student's thesis into a generative AI tool or use external AI detection software when assessing the thesis.
King's College London defines four broad levels of acceptable AI use in assessments: minimal, limited/selective, open, and embedded, with programme and module leaders adjusting to assessment specifics.
Microsoft Copilot is the primary institutional generative AI tool available to all King's College London students and staff via KCL Microsoft login credentials.
6 source attributions
Official policy page, checked May 10, 2026
Official guidance, checked May 10, 2026
Official guidance, checked May 10, 2026
Official guidance, checked May 10, 2026
Official guidance, checked May 10, 2026
Official guidance, checked May 10, 2026