Sheffield, United Kingdom

The University of Sheffield

The University of Sheffield is listed at QS 2026 rank 92. Its record contains 14 source-backed AI policy claims drawn from 6 official source attributions. The public record preserves original-language evidence snippets, source URLs, snapshot hashes, confidence scores, and review state.

Citation-ready summary

As of this public record, University AI Policy Tracker lists The University of Sheffield as an agent-reviewed AI policy record last checked on May 14, 2026 and last changed on May 14, 2026. The record contains 14 source-backed claims, including 14 reviewed claims, from 6 official source attributions. Original-language evidence snippets and source URLs remain canonical, with public JSON available at https://eduaipolicy.org/api/public/v1/universities/the-university-of-sheffield.json. The entity-level confidence is 96%. This tracker is not legal advice, not academic integrity advice, and not an official university statement unless the linked source is the university's own official page.

Claim coverage: 14 reviewed
Source language: en
Public JSON: /api/public/v1/universities/the-university-of-sheffield.json
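The public JSON record above can be consumed programmatically. A minimal sketch of filtering a record down to its reviewed claims follows; the field names (`claims`, `review`, `confidence`) are assumptions based on what this page displays, not a documented schema, and a hand-written sample record stands in for the live endpoint:

```python
import json

# Hypothetical sample mirroring the v1 public JSON; the real schema may differ.
sample_record = json.loads("""
{
  "slug": "the-university-of-sheffield",
  "claims": [
    {"category": "privacy", "review": "agent_reviewed", "confidence": 0.96},
    {"category": "teaching", "review": "candidate", "confidence": 0.80}
  ]
}
""")

def reviewed_claims(record):
    """Return only the claims whose review state is agent-reviewed."""
    return [c for c in record["claims"] if c["review"] == "agent_reviewed"]

print(len(reviewed_claims(sample_record)))  # → 1
```

In practice the record would be fetched from the public JSON URL shown above and passed to the same filter.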

Policy signals in this record

Evidence includes claims in the following categories:

  • Source status
  • Research
  • Privacy
  • Security review
  • Academic integrity
  • AI tool treatment
  • Teaching
  • Other policy
Policy status: Reviewed evidence-backed record
Review: Agent reviewed
Evidence-backed claims: 14 (14 reviewed, 0 candidate)
Official sources: 6

This reference record summarizes visible public data only. Official sources and original-language evidence remain canonical; confidence is separate from review state.

This page is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.

Policy profile

Deterministic source-backed dimensions derived from this record's public claims.

Coverage score: 100/100
Coverage label: broad public coverage
Review: Machine candidate
Analysis confidence: 80%

Policy profile rows are machine-candidate derived metadata. They are not final policy conclusions; inspect the linked claim evidence before reuse.

Analysis page-quality metadata is available at /api/public/v1/analysis/page-quality.json.

Coverage score measures breadth of public, source-backed coverage only. It is not a policy quality, strictness, legal adequacy, safety, or compliance score.

Evidence-backed claims

14 reviewed evidence-backed public claims

Source Status

StudySkills@Sheffield guidance says GenAI detection tools are not used at the University of Sheffield because of error-rate and false-positive or false-negative concerns.

Review: Agent reviewed · Confidence: 96%

Normalized value: genai_detection_tools_not_used_due_to_error_rate_concerns

Original evidence

Evidence 1
GenAI detection tools are not used at the University of Sheffield. This is due to concerns over their error rates and the potential for both false positives and false negatives when scanning for potential use of GenAI.

Research

Sheffield PGR guidance says postgraduate researchers' use of generative AI must align with the University's expectations for responsible research and academic integrity.

Review: Agent reviewed · Confidence: 96%

Normalized value: pgr_genai_use_must_align_responsible_research_academic_integrity_expectations

Original evidence

Evidence 1
PGRs, as researchers, produce original knowledge for an assessment (thesis and viva) that leads to the PhD, and as such, you must be mindful of the principles of research integrity and academic integrity, and your use of generative AI must align with the University's expectations for responsible research and academic integrity.

Privacy

The PGR guidance says researchers must ensure confidential, proprietary, or personally identifiable information is never uploaded to any GenAI platform.

Review: Agent reviewed · Confidence: 96%

Normalized value: researchers_must_not_upload_confidential_proprietary_or_identifiable_information_to_genai

Original evidence

Evidence 1
Researchers must ensure that confidential, proprietary, or personally identifiable information is never uploaded to any GenAI platform.

Security Review

IT Services productivity principles say only University-approved AI tools should be used for official University business, and staff have access to Google Gemini as the institutionally supported GenAI tool.

Review: Agent reviewed · Confidence: 96%

Normalized value: official_university_business_uses_university_approved_ai_tools

Original evidence

Evidence 1
Only University-approved AI tools should be used for official University business. All staff have access to Google Gemini as the institutionally-supported GenAI tool.

Privacy

IT Services productivity principles say AI use must comply with GDPR and University data-protection policies, and sensitive or confidential information must not be entered into public or unregulated AI models.

Review: Agent reviewed · Confidence: 96%

Normalized value: ai_use_must_comply_with_gdpr_and_no_sensitive_data_in_public_unregulated_models

Original evidence

Evidence 1
All use of AI must comply with GDPR and the University's data protection policies. Sensitive or confidential information must not be entered into public or unregulated AI models.

Academic Integrity

Student academic-integrity guidance tells students to check school or department guidance and module assessment criteria before using GenAI, because use may be prohibited on some modules or assessments.

Review: Agent reviewed · Confidence: 95%

Normalized value: students_check_module_assessment_criteria_before_genai_use

Original evidence

Evidence 1
The golden rule: always check your school / department guidance and the specific module assessment criteria as the use of GenAI may be specifically prohibited on certain modules or assessments.

Academic Integrity

Student academic-integrity guidance says any GenAI-produced content should always be acknowledged with a full disclosure, and that passing off such content as one's own work counts as academic misconduct.

Review: Agent reviewed · Confidence: 95%

Normalized value: genai_content_disclosure_required_passing_off_counts_academic_misconduct

Original evidence

Evidence 1
A full disclosure of any content produced by GenAI should always be acknowledged in your work. Attempts to pass off content as your own work is counted as academic misconduct and may lead to action being taken against you.

AI Tool Treatment

IT Services productivity principles say GenAI use is optional except where a team has adopted it for specific tasks, and it is a supportive resource rather than a mandatory requirement for any role.

Review: Agent reviewed · Confidence: 95%

Normalized value: staff_productivity_genai_optional_supportive_not_mandatory

Original evidence

Evidence 1
The use of GenAI, except where your team may have adopted use for specific tasks, is optional. This technology is intended to be a supportive resource, not a mandatory requirement for any role.

AI Tool Treatment

The University of Sheffield identifies Google Gemini as the institutionally supported GenAI tool for learning and teaching, and says Gemini should be used where possible to support those activities.

Review: Agent reviewed · Confidence: 94%

Normalized value: google_gemini_institutionally_supported_genai_tool_for_learning_and_teaching

Original evidence

Evidence 1
All students and staff have access to Google Gemini as the institutionally supported GenAI tool. Where possible, Gemini should be used to support learning and teaching activities.

Teaching

The University of Sheffield says every undergraduate programme integrates GenAI literacy through structured teaching activities and formal assessment, while giving staff and students clarity about acceptable AI use across assessments.

Review: Agent reviewed · Confidence: 93%

Normalized value: undergraduate_programmes_integrate_genai_literacy_and_assessment_clarity

Original evidence

Evidence 1
The University has developed a Common Approach to ensure every undergraduate programme integrates GenAI literacy through structured teaching activities and formal assessment, while providing clarity to staff and students about acceptable AI use across all assessments.

Research

The PGR guidance permits students to use generative AI in thesis writing, provided the use is consistent with the guidance and properly declared, and warns that exceeding the guidance risks breaching the Academic Misconduct Policy.

Review: Agent reviewed · Confidence: 92%

Normalized value: pgr_thesis_genai_use_permitted_if_consistent_and_declared

Original evidence

Evidence 1
Yes, students are permitted to make use of generative AI tools in their thesis writing processes. You should declare your use of GenAI tools and take full responsibility for the content of your submitted thesis.

Academic Integrity

Assessment guidance says that when GenAI use is allowed, students may be asked to provide full disclosure using an Acknowledge, Describe, Evidence template, while noting that some schools may use a different process.

Review: Agent reviewed · Confidence: 91%

Normalized value: allowed_assessment_genai_use_may_require_acknowledge_describe_evidence_disclosure

Original evidence

Evidence 1
In assignments where you are sure that you are allowed to use GenAI, you may be asked to provide a full disclosure of how you have done so. You can provide this information by completing an Acknowledge, Describe, Evidence template.

Security Review

For student-facing tools outside Google Suite, the University says the New IT Solution Request Process is followed to check data-protection and information-security compliance before those tools are made available on a use-case basis.

Review: Agent reviewed · Confidence: 90%

Normalized value: student_non_google_ai_tools_use_case_it_solution_review_for_data_protection_security

Original evidence

Evidence 1
Where tools other than Google Suite are made available to students on a use case basis, the New IT Solution Request Process has been followed to ensure they comply with data protection and information security policies.

Candidate claims

0 machine or needs-review claims

Candidate claims are not final policy conclusions. They preserve source URL, source snapshot hash, evidence, confidence, and review state so the record can be audited before review.
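The audit fields that every claim record must preserve can be checked mechanically. A minimal sketch, using hypothetical field names that mirror the fields listed above rather than the real v1 schema:

```python
# Hypothetical audit fields; the real v1 schema may use different names.
REQUIRED_AUDIT_FIELDS = {
    "source_url", "source_snapshot_hash", "evidence", "confidence", "review",
}

def missing_audit_fields(claim: dict) -> set:
    """Return the audit fields a claim record fails to preserve."""
    return REQUIRED_AUDIT_FIELDS - claim.keys()

# Example candidate claim missing its review state.
claim = {
    "source_url": "https://example.org/policy",
    "source_snapshot_hash": "sha256:...",
    "evidence": ["..."],
    "confidence": 0.80,
}
print(sorted(missing_audit_fields(claim)))  # → ['review']
```

A record that passes this check can be audited end to end: each claim can be traced from its normalized value back to a hashed snapshot of the source page.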

Official sources

6 source attributions

Change log

Source-check timeline and diff-style claim/evidence preview.

View the public change record for this university, including source snapshot hashes, claim review states, and a diff-style preview of current source-backed evidence.

Last checked: May 14, 2026 · Last changed: May 14, 2026 · Open change log

Corrections and missing evidence

Corrections create review tasks and do not directly change this public record.

If an official source is missing, stale, moved, blocked, or incorrectly summarized, submit a source URL, policy change report, or institution correction for review. Corrections must preserve source URLs, source language, original evidence, review state, and audit history.
