Open, evidence-backed AI policy records for public reuse.
Hamilton, Canada
McMaster University is listed as QS 2026 rank =173. McMaster University has 8 source-backed AI policy claim records from 7 official source attributions. The public record preserves original-language evidence snippets, source URLs, snapshot hashes, confidence, and review state.
v1 public contract
As of this public record, University AI Policy Tracker lists McMaster University as an agent-reviewed AI policy record last checked on May 15, 2026 and last changed on May 15, 2026. The record contains 8 source-backed claims, including 8 reviewed claims, from 7 official source attributions. Original-language evidence snippets and source URLs remain canonical, with public JSON available at https://eduaipolicy.org/api/public/v1/universities/mcmaster-university.json. The entity-level confidence is 96%. This tracker is not legal advice, not academic integrity advice, and not an official university statement unless the linked source is the university's own official page.
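The public JSON endpoint above can be consumed programmatically. A minimal sketch in Python follows; the field names (`entity_confidence`, `claims`, `review_state`) are assumptions for illustration, since the record schema is not documented on this page — inspect the live JSON before relying on them.

```python
import json
import urllib.request

RECORD_URL = "https://eduaipolicy.org/api/public/v1/universities/mcmaster-university.json"

def summarize_record(record: dict) -> str:
    # Field names here are assumptions; verify against the live JSON schema.
    claims = record.get("claims", [])
    reviewed = [c for c in claims if c.get("review_state") == "reviewed"]
    confidence = record.get("entity_confidence")
    return f"{len(claims)} claims ({len(reviewed)} reviewed), confidence {confidence}"

def fetch_record(url: str = RECORD_URL) -> dict:
    # Live network fetch; add error handling before production use.
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Offline usage with a record shaped like the summary on this page:
sample = {
    "entity_confidence": 0.96,
    "claims": [{"review_state": "reviewed"}] * 8,
}
print(summarize_record(sample))  # → 8 claims (8 reviewed), confidence 0.96
```

For a live check, `summarize_record(fetch_record())` would summarize the current record, assuming the schema matches.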
This reference record summarizes visible public data only. Official sources and original-language evidence remain canonical; confidence is separate from review state.
Deterministic source-backed dimensions derived from this record's public claims.
Policy profile rows are machine-derived candidate metadata. They are not final policy conclusions; inspect the linked claim evidence before reuse.
Analysis page-quality metadata is available at /api/public/v1/analysis/page-quality.json.
McMaster University has 2 source-backed public claims for policy presence; deterministic analysis status: unclear.
McMaster University has 3 source-backed public claims for AI disclosure; deterministic analysis status: recommended.
McMaster University has 5 source-backed public claims for coursework; deterministic analysis status: restricted.
McMaster University has 4 source-backed public claims for exams; deterministic analysis status: required.
McMaster University has 5 source-backed public claims for privacy and data entry; deterministic analysis status: restricted.
McMaster University has 2 source-backed public claims for academic integrity; deterministic analysis status: required.
McMaster University has 2 source-backed public claims for approved tools; deterministic analysis status: restricted.
McMaster University has 1 source-backed public claim for named AI services; deterministic analysis status: restricted.
McMaster University has 3 source-backed public claims for teaching guidance; deterministic analysis status: recommended.
McMaster University has 1 source-backed public claim for research guidance; deterministic analysis status: recommended.
McMaster University has 3 source-backed public claims for security and procurement; deterministic analysis status: restricted.
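The dimension rows above are a flattened table of claim counts and deterministic statuses. As a sketch, they can be held as a simple mapping for downstream filtering; the values are transcribed from this record, while the snake_case keys are an illustrative naming choice.

```python
# Dimension -> (source-backed claim count, deterministic analysis status),
# transcribed from this record's public claim summary.
DIMENSIONS = {
    "policy_presence": (2, "unclear"),
    "ai_disclosure": (3, "recommended"),
    "coursework": (5, "restricted"),
    "exams": (4, "required"),
    "privacy_and_data_entry": (5, "restricted"),
    "academic_integrity": (2, "required"),
    "approved_tools": (2, "restricted"),
    "named_ai_services": (1, "restricted"),
    "teaching_guidance": (3, "recommended"),
    "research_guidance": (1, "recommended"),
    "security_and_procurement": (3, "restricted"),
}

def by_status(status: str) -> list[str]:
    # Return the dimensions whose deterministic status matches.
    return [dim for dim, (_, s) in DIMENSIONS.items() if s == status]

print(by_status("restricted"))
```

This kind of mapping makes it easy to ask, for example, which dimensions are "restricted" versus merely "recommended" before drilling into the claim evidence.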
Coverage score measures breadth of public, source-backed coverage only. It is not a policy quality, strictness, legal adequacy, safety, or compliance score.
8 reviewed evidence-backed public claims
Teaching
Normalized value: course_outlines_should_state_genai_expectations
Original evidence
Evidence 1: Undergraduate and graduate course outlines should include a statement on the acceptable and unacceptable use of generative artificial intelligence in the course... If no syllabus statement is included, students should ask the educator for clarification on expectations, and if generative AI use is permitted, receive written confirmation before using generative AI in the course.
Academic Integrity
Normalized value: students_ask_first_disclose_and_cite
Original evidence
Evidence 1: Ask first: Don't use generative AI unless your course syllabus or professor explicitly allows it. Remember, different courses have different rules... Be Transparent: If you're allowed to use AI, make sure you disclose how and where. Don't forget to correctly cite all your sources, including any AI-generated content.
Teaching
Normalized value: genai_no_grades_conditional_feedback
Original evidence
Evidence 1: Generative AI tools should not be used to provide grades (letter or numeric) for student work. Generative AI tools may be used to provide feedback on student work, provided the following conditions are met: When generative AI tools are used to provide feedback on student work, this use must be explicitly included in the course syllabus. Students should have the ability to opt out of AI-generated feedback.
Privacy
Normalized value: employee_sensitive_data_assessment_copilot_exception
Original evidence
Evidence 1: Do not upload or share confidential, personal, personal health or proprietary information with a generative AI tool unless a data security and risk assessment and a privacy and algorithmic assessment have been completed for the specific tool... McMaster has an enterprise licence for Microsoft Copilot (but not 365) which ensures that, when logged in using McMaster credentials, data used is not shared with either Microsoft or McMaster.
Research
Normalized value: researchers_verify_and_acknowledge_genai
Original evidence
Evidence 1: Generative AI tools may create false, misleading, or biased outputs. Critically evaluate and personally verify any outputs used in the research process. Researchers are personally accountable for the accuracy and integrity of their work... Researchers who use generative AI in any context should cite or acknowledge its use drawing on McMaster Libraries' LibGuide and follow any publication/granting specific instructions.
Academic Integrity
Normalized value: turnitin_ai_detector_not_activated
Original evidence
Evidence 1: At this time McMaster has not activated the Turnitin AI Detector. Ongoing work to assess privacy and security impacts of this tool, as well as exploring its reliability, is underway. McMaster may activate the AI Detector, which allows for retroactive submissions.
Security Review
Normalized value: algorithmic_impact_assessment_for_automated_functions
Original evidence
Evidence 1: For any tools that include automated functions, including artificial intelligence, the university must conduct an Algorithmic Impact Assessment (AIA). An AIA is a risk assessment process that determines the impact level of an automated decision-making system on the risks of harm to individuals. The privacy office is responsible for conducting the AIA process.
Other
Normalized value: library_genai_limitations_and_privacy_cautions
Original evidence
Evidence 1: Generative AI tools can generate incorrect, outdated or biased content. Confirm the quality, reliability and accuracy of AI-generated content with reputable sources before including it in your work. Generative AI tools are also known to fabricate or hallucinate sources that do not exist... The information you share with generative AI tools can compromise your privacy and security.
0 machine or needs-review claims
Candidate claims are not final policy conclusions. They preserve source URL, source snapshot hash, evidence, confidence, and review state so the record can be audited before review.
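Because each claim preserves a source snapshot hash, an auditor can re-hash a fetched snapshot and compare it with the recorded value. The sketch below assumes the tracker stores a hex-encoded SHA-256 of the snapshot body; the actual hashing scheme is not stated on this page and is an assumption here.

```python
import hashlib

def audit_snapshot(snapshot_bytes: bytes, recorded_hash: str) -> bool:
    # Assumes the record stores a hex-encoded SHA-256 of the snapshot body;
    # the tracker's real hashing scheme is an assumption, not documented here.
    return hashlib.sha256(snapshot_bytes).hexdigest() == recorded_hash

# Usage with a stand-in snapshot body:
snapshot = b"Example snapshot body"
recorded = hashlib.sha256(snapshot).hexdigest()
print(audit_snapshot(snapshot, recorded))        # → True
print(audit_snapshot(b"tampered body", recorded))  # → False
```

A mismatch would indicate the live source has changed since the snapshot, which is exactly the situation the correction workflow below is meant to capture.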
7 source attributions
provost.mcmaster.ca
provost.mcmaster.ca
libguides.mcmaster.ca
provost.mcmaster.ca
provost.mcmaster.ca
provost.mcmaster.ca
provost.mcmaster.ca
Source-check timeline and diff-style claim/evidence preview.
View the public change record for this university, including source snapshot hashes, claim review states, and a diff-style preview of current source-backed evidence.
Corrections create review tasks and do not directly change this public record.
If an official source is missing, stale, moved, blocked, or incorrectly summarized, submit a source URL, policy change report, or institution correction for review. Corrections must preserve source URLs, source language, original evidence, review state, and audit history.