Open, evidence-backed AI policy records for public reuse.
Birmingham, United Kingdom
University of Birmingham is listed as QS 2026 rank 76. University of Birmingham has 14 source-backed AI policy claim records from 10 official source attributions. The public record preserves original-language evidence snippets, source URLs, snapshot hashes, confidence, and review state.
v1 public contract
As of this public record, University AI Policy Tracker lists University of Birmingham as an agent-reviewed AI policy record last checked on May 13, 2026 and last changed on May 13, 2026. The record contains 14 source-backed claims, including 14 reviewed claims, from 10 official source attributions. Original-language evidence snippets and source URLs remain canonical, with public JSON available at https://eduaipolicy.org/api/public/v1/universities/university-of-birmingham.json. The entity-level confidence is 96%. This tracker is not legal advice, not academic integrity advice, and not an official university statement unless the linked source is the university's own official page.
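The v1 public JSON contract above can be consumed programmatically. A minimal sketch of summarizing such a record, assuming field names like "claims", "review_state", and "sources" that mirror the fields this page describes; the actual schema should be checked against the published JSON:

```python
import json

def summarize_record(record: dict) -> dict:
    """Summarize a v1 public record dict: claim counts and source count.

    Field names here ("claims", "review_state", "sources") are
    illustrative assumptions, not the published schema.
    """
    claims = record.get("claims", [])
    reviewed = [c for c in claims if c.get("review_state") == "reviewed"]
    return {
        "total_claims": len(claims),
        "reviewed_claims": len(reviewed),
        "source_count": len(record.get("sources", [])),
    }

# Hypothetical payload shaped like the record described on this page.
sample = json.loads("""
{
  "university": "University of Birmingham",
  "claims": [
    {"review_state": "reviewed", "confidence": 0.96},
    {"review_state": "reviewed", "confidence": 0.94}
  ],
  "sources": [{"domain": "birmingham.ac.uk"}]
}
""")

summary = summarize_record(sample)
```

Because the record preserves review state per claim, a consumer can distinguish reviewed claims from machine candidates before reuse.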
This reference record summarizes visible public data only. Official sources and original-language evidence remain canonical; confidence is separate from review state.
This page is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
Deterministic source-backed dimensions derived from this record's public claims.
Policy profile rows are machine-candidate-derived metadata. They are not final policy conclusions; inspect the linked claim evidence before reuse.
Analysis page-quality metadata is available at /api/public/v1/analysis/page-quality.json.
University of Birmingham has 5 source-backed public claims for policy presence; deterministic analysis status: unclear.
No source-backed public claim about AI disclosure or acknowledgement is present in this profile.
The current public tracker record does not contain claim evidence about disclosing, acknowledging, citing, or declaring AI use.
University of Birmingham has 5 source-backed public claims for coursework; deterministic analysis status: restricted.
University of Birmingham has 5 source-backed public claims for exams; deterministic analysis status: restricted.
University of Birmingham has 5 source-backed public claims for privacy and data entry; deterministic analysis status: restricted.
University of Birmingham has 3 source-backed public claims for academic integrity; deterministic analysis status: restricted.
University of Birmingham has 5 source-backed public claims for approved tools; deterministic analysis status: conditionally_allowed.
University of Birmingham has 5 source-backed public claims for named ai services; deterministic analysis status: restricted.
University of Birmingham has 3 source-backed public claims for teaching guidance; deterministic analysis status: recommended.
University of Birmingham has 5 source-backed public claims for research guidance; deterministic analysis status: restricted.
University of Birmingham has 4 source-backed public claims for security and procurement; deterministic analysis status: conditionally_allowed.
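The deterministic statuses above are derived from each dimension's claims. A hypothetical sketch of one way such a derivation could work, resolving mixed claims by precedence so the most restrictive status wins; the status labels match this page, but the rule itself is an assumption, not the tracker's published algorithm:

```python
# Most restrictive status first. Labels come from this page; the
# precedence rule is an illustrative assumption.
PRECEDENCE = [
    "not_permitted",
    "restricted",
    "conditionally_allowed",
    "recommended",
    "unclear",
]

def dimension_status(claim_statuses: list[str]) -> str:
    """Return the highest-precedence status present, or 'unclear'
    when a dimension has no source-backed claims."""
    if not claim_statuses:
        return "unclear"
    return min(claim_statuses, key=PRECEDENCE.index)

# Mixed claims for one dimension resolve to the strictest present.
status = dimension_status(["conditionally_allowed", "restricted", "recommended"])
```

Under this rule a dimension with both permissive and restrictive claims reports "restricted", which matches how most rows above resolve.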
Coverage score measures breadth of public, source-backed coverage only. It is not a policy quality, strictness, legal adequacy, safety, or compliance score.
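Since the coverage score measures breadth only, it can be sketched as the share of tracked dimensions that hold at least one source-backed claim. The dimension names and counts below are taken from this record's rows; the formula itself is an illustrative assumption:

```python
def coverage_score(claims_per_dimension: dict[str, int]) -> float:
    """Breadth only: fraction of tracked dimensions with >= 1
    source-backed claim. Says nothing about policy quality,
    strictness, legal adequacy, safety, or compliance."""
    if not claims_per_dimension:
        return 0.0
    covered = sum(1 for n in claims_per_dimension.values() if n > 0)
    return covered / len(claims_per_dimension)

# Counts from this record's dimension rows; "disclosure" has no claims.
counts = {
    "policy_presence": 5, "disclosure": 0, "coursework": 5, "exams": 5,
    "privacy_data_entry": 5, "academic_integrity": 3, "approved_tools": 5,
    "named_ai_services": 5, "teaching_guidance": 3,
    "research_guidance": 5, "security_procurement": 4,
}
score = coverage_score(counts)  # 10 of 11 dimensions covered
```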
14 reviewed, evidence-backed public claims
Academic Integrity
Normalized value: assessment_genai_not_permitted_unless_explicit
Original evidence
Evidence 1: Unless explicitly stated otherwise, students should assume that the use of generative AI within an assessment or assignment is not permitted.
Security Review
Normalized value: ai_detection_tools_not_allowable_currently
Original evidence
Evidence 1: Tools designed to detect the use of generative AI are currently known to produce both false positives and false negatives. At present, the use of any such tools within the University is not allowable and no student work should be uploaded to generative AI detection software.
Academic Integrity
Normalized value: ai_alone_not_allowed_to_allocate_grades
Original evidence
Evidence 1: All decisions, outcomes and feedback must be reviewed first by an academic member of staff before they are released to students. The use of generative AI tools on their own to allocate marks and student grades is not allowed.
Privacy
Normalized value: research_ai_personal_confidential_sensitive_data_controls
Original evidence
Evidence 1: Personal, confidential, or sensitive data must not be entered into AI tools without: clear justification (including consideration of locally-hosted versus cloud-based tools), data minimisation, Data Protection Impact Assessments (where applicable).
AI Tool Treatment
Normalized value: study_aids_allowed_but_not_submission_as_own_work
Original evidence
Evidence 1: The University's framework does allow you to use Generative AI tools as study aids for your personal learning and in your research. You are permitted to use these tools in this context, as long as you do not submit the actual AI-generated output as your own work for assessment.
Research
Normalized value: research_ai_guidance_scope_and_human_accountability
Original evidence
Evidence 1: This guidance applies to all researchers at the University of Birmingham who engage with Artificial Intelligence (AI) in the context of research, whether by using existing tools, developing new models, or deploying AI systems in real-world environments. A human researcher must be accountable for every substantive claim, interpretation, and output.
Teaching
Normalized value: university_framework_for_teaching_learning_assessment_support
Original evidence
Evidence 1: This guidance provides a framework for the implementation and use of generative AI models within teaching, learning, assessment, and support at the University of Birmingham.
Teaching
Normalized value: assessment_permissions_should_be_clearly_communicated
Original evidence
Evidence 1: Within all modules, academic staff should clearly articulate if, and to what extent, the use of generative AI tools is permitted within assessments or assignments by students: This should be detailed within the course outline and all assessment and assignment briefs.
Teaching
Normalized value: ai_supported_grading_requires_approval_and_academic_responsibility
Original evidence
Evidence 1: From the 1 September 2024, and upon the appropriate approval being first received, academic staff can utilise AI systems to support the assessment, grading and moderation of student work along with the provision of individualised student feedback. Where such tools are used, academic staff remain responsible for the academic judgements made on submitted student work and for any feedback they provide for learners.
Academic Integrity
Normalized value: pgt_dissertation_ai_generated_content_requires_school_permission
Original evidence
Evidence 1: Students should not submit, within any part of their PGT dissertation, material or content that has been generated by AI tools unless their use has been specifically permitted by the School. Where the use of generative AI tools is permitted, the University's Framework for the Introduction and Use of Generative Artificial Intelligence within Teaching, Learning and Assessment must be followed and students required to appropriately reference its use.
Security Review
Normalized value: researchers_should_use_university_endorsed_ai_tools
Original evidence
Evidence 1: Researchers should use only University-endorsed AI tools to ensure compliance with licensing, data protection, and information security requirements. Use of unapproved or externally hosted tools should be justified and recorded in project documentation such as Data Management Plans or ethics submissions.
Procurement
Normalized value: enterprise_copilot_access_and_data_protection_indicator
Original evidence
Evidence 1: The University provides approved access to the Enterprise version of Microsoft Co-Pilot which you can access via your University account. Ensure the green shield labelled Enterprise data protection applies to this chat is showing so that you know you are safely using the Enterprise version.
Procurement
Normalized value: ai_tool_terms_review_before_registration
Original evidence
Evidence 1: Before registering for a new tool, it is crucial to review the Terms and Conditions (licence), which form the legal agreement between you and the supplier. If issues arise, such as non-compliance with accessibility or data protection standards, indemnification requirements, or copyright concerns, seek further guidance from your local licensing or IT procurement teams before registering for the service.
Privacy
Normalized value: student_notice_should_cover_ai_marking_privacy_and_human_oversight
Original evidence
Evidence 1: Where AI-supported marking and feedback practices are used, clear information should be provided to students on their use that details why, and how, AI tools are being used; emphasises that there remains human oversight; and addresses any privacy concerns that students might have about their data or work being uploaded to AI tools.
0 machine or needs-review claims
Candidate claims are not final policy conclusions. They preserve source URL, source snapshot hash, evidence, confidence, and review state so the record can be audited before review.
10 source attributions
birmingham.ac.uk
intranet.birmingham.ac.uk
birmingham.ac.uk
birmingham.ac.uk
birmingham.ac.uk
birmingham.ac.uk
birmingham.ac.uk
intranet.birmingham.ac.uk
intranet.birmingham.ac.uk
birmingham.ac.uk
Source-check timeline and diff-style claim/evidence preview.
View the public change record for this university, including source snapshot hashes, claim review states, and a diff-style preview of current source-backed evidence.
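The change record turns on source snapshot hashes: a changed digest for a source page flags the record for diff-style review. A minimal sketch, assuming SHA-256 (the tracker's actual hash algorithm is not stated on this page):

```python
import hashlib

def snapshot_hash(content: bytes) -> str:
    """SHA-256 hex digest of a fetched source page. SHA-256 is an
    assumption here; the tracker does not name its algorithm."""
    return hashlib.sha256(content).hexdigest()

def source_changed(old_hash: str, new_content: bytes) -> bool:
    """A differing digest means the source moved on and the stored
    claims should be re-checked against a fresh snapshot."""
    return snapshot_hash(new_content) != old_hash

# A stored snapshot digest versus a re-fetched page body.
old = snapshot_hash(b"Unless explicitly stated otherwise ...")
changed = source_changed(old, b"Updated guidance text ...")
```

Hashing the snapshot rather than storing only the URL lets the record detect silent edits to a source page even when the URL is unchanged.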
Corrections create review tasks and do not directly change this public record.
If an official source is missing, stale, moved, blocked, or incorrectly summarized, submit a source URL, policy change report, or institution correction for review. Corrections must preserve source URLs, source language, original evidence, review state, and audit history.