Hamilton, Canada

McMaster University

Short answer

v1 public contract

McMaster University is listed as QS 2026 rank =173. McMaster University has 8 source-backed AI policy claim records from 7 official source attributions. The public record preserves original-language evidence snippets, source URLs, snapshot hashes, confidence, and review state.

Citation-ready summary

As of this public record, University AI Policy Tracker lists McMaster University as an agent-reviewed AI policy record last checked on May 15, 2026 and last changed on May 15, 2026. The record contains 8 source-backed claims, including 8 reviewed claims, from 7 official source attributions. Original-language evidence snippets and source URLs remain canonical, with public JSON available at https://eduaipolicy.org/api/public/v1/universities/mcmaster-university.json. The entity-level confidence is 96%. This tracker is not legal advice, not academic integrity advice, and not an official university statement unless the linked source is the university's own official page.

  • Claim coverage: 8 reviewed
  • Source language: en
  • Public JSON: /api/public/v1/universities/mcmaster-university.json
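The public JSON endpoint listed above can be consumed programmatically. A minimal Python sketch, assuming only the documented URL pattern (the helper names here are illustrative, and any fields inside the returned record should be inspected rather than assumed):

```python
import json
from urllib.request import urlopen

BASE = "https://eduaipolicy.org"

def record_url(slug: str) -> str:
    """Build the public v1 record URL for a university slug."""
    return f"{BASE}/api/public/v1/universities/{slug}.json"

def fetch_record(slug: str) -> dict:
    """Fetch and parse the public JSON record (requires network access)."""
    with urlopen(record_url(slug), timeout=10) as resp:
        return json.load(resp)

print(record_url("mcmaster-university"))
```

Because the record preserves source URLs, snapshot hashes, confidence, and review state, a consumer should treat those fields as the audit trail and re-check the linked official sources before reuse.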

Policy signals in this record

  • Evidence includes Teaching claims.
  • Evidence includes Academic integrity claims.
  • Evidence includes Privacy claims.
  • Evidence includes Research claims.
  • Evidence includes Security review claims.
  • Evidence includes Other policy claims.
  • Named AI services detected in public claims: Microsoft Copilot.
  • Disclosure, acknowledgment, citation, or attribution language appears in the public claim text.
  • Policy status: Reviewed evidence-backed record
  • Review: Agent reviewed
  • Evidence-backed claims: 8
  • Reviewed: 8
  • Candidate: 0
  • Official sources: 7

This reference record summarizes visible public data only. Official sources and original-language evidence remain canonical; confidence is separate from review state.

This page is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.

Policy profile

Deterministic source-backed dimensions derived from this record's public claims.

  • Coverage score: 100/100
  • Coverage label: broad public coverage
  • Review: Machine candidate
  • Analysis confidence: 80%

Policy profile rows are machine-candidate derived metadata. They are not final policy conclusions; inspect the linked claim evidence before reuse.

Analysis page-quality metadata is available at /api/public/v1/analysis/page-quality.json.

Coverage score measures breadth of public, source-backed coverage only. It is not a policy quality, strictness, legal adequacy, safety, or compliance score.
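As a purely illustrative sketch of how a breadth-only score could work (this is an assumption about the method, not the tracker's actual formula), one could count which tracked claim categories appear among a record's public claims:

```python
# Illustrative only: one plausible breadth-of-coverage calculation.
# The category set and formula are assumptions, not the tracker's method.
TRACKED_CATEGORIES = {
    "teaching", "academic_integrity", "privacy",
    "research", "security_review", "other",
}

def coverage_score(claim_categories) -> int:
    """Score 0-100: share of tracked categories with at least one claim."""
    present = TRACKED_CATEGORIES & set(claim_categories)
    return round(100 * len(present) / len(TRACKED_CATEGORIES))

# This record's eight claims span all six tracked categories:
cats = ["teaching", "academic_integrity", "teaching", "privacy",
        "research", "academic_integrity", "security_review", "other"]
print(coverage_score(cats))  # → 100
```

Note that such a score says nothing about how strict or adequate any individual policy is, which is exactly the caveat the coverage-score description makes.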

Evidence-backed claims

8 reviewed evidence-backed public claims

Teaching

McMaster's teaching and learning generative AI guidelines say undergraduate and graduate course outlines should state acceptable and unacceptable generative AI use, and students should seek clarification and written confirmation if no syllabus statement is included.

Review: Agent reviewed · Confidence: 96%

Normalized value: course_outlines_should_state_genai_expectations

Original evidence

Evidence 1
Undergraduate and graduate course outlines should include a statement on the acceptable and unacceptable use of generative artificial intelligence in the course... If no syllabus statement is included, students should ask the educator for clarification on expectations, and if generative AI use is permitted, receive written confirmation before using generative AI in the course.

Academic Integrity

McMaster's student generative AI coursework tip sheet tells students not to use generative AI unless a course syllabus or professor explicitly allows it, and to disclose and cite AI-generated content when allowed.

Review: Agent reviewed · Confidence: 96%

Normalized value: students_ask_first_disclose_and_cite

Original evidence

Evidence 1
Ask first: Don't use generative AI unless your course syllabus or professor explicitly allows it. Remember, different courses have different rules... Be Transparent: If you're allowed to use AI, make sure you disclose how and where. Don't forget to correctly cite all your sources, including any AI-generated content.

Teaching

McMaster's teaching and learning guidelines say generative AI tools should not provide letter or numeric grades for student work, while AI-generated feedback may be used only under stated conditions such as syllabus inclusion and an opt-out ability.

Review: Agent reviewed · Confidence: 95%

Normalized value: genai_no_grades_conditional_feedback

Original evidence

Evidence 1
Generative AI tools should not be used to provide grades (letter or numeric) for student work. Generative AI tools may be used to provide feedback on student work, provided the following conditions are met: When generative AI tools are used to provide feedback of student work this use must be explicitly included in the course syllabus. Students should have the ability to opt-out of AI generated feedback.

Privacy

McMaster's operational excellence generative AI guidelines tell employees not to upload confidential, personal, personal health, or proprietary information to a generative AI tool unless required security/privacy assessments have been completed, while noting an enterprise Microsoft Copilot exception when logged in with McMaster credentials.

Review: Agent reviewed · Confidence: 95%

Normalized value: employee_sensitive_data_assessment_copilot_exception

Original evidence

Evidence 1
Do not upload or share confidential, personal, personal health or proprietary information with a generative AI tool unless a data security and risk assessment and a privacy and algorithmic assessment have been completed for the specific tool... McMaster has an enterprise licence for Microsoft Copilot (but not 365) which ensures that, when logged in using McMaster credentials, data used is not shared with either Microsoft or McMaster.

Research

McMaster's research generative AI guidelines say researchers are personally accountable for the accuracy and integrity of their work, should critically evaluate and verify AI outputs, and should cite or acknowledge generative AI use according to McMaster Library guidance and publication or granting instructions.

Review: Agent reviewed · Confidence: 95%

Normalized value: researchers_verify_and_acknowledge_genai

Original evidence

Evidence 1
Generative AI tools may create false, misleading, or biased outputs. Critically evaluate and personally verify any outputs used in the research process. Researchers are personally accountable for the accuracy and integrity of their work... Researchers who use generative AI in any context should cite or acknowledge its use drawing on McMaster Libraries' LibGuide and follow any publication/granting specific instructions.

Academic Integrity

McMaster's Provost academic integrity page says McMaster has not activated Turnitin's AI Detector; work to assess the tool's privacy/security impacts and reliability is underway, and if activated, the detector may allow retroactive submissions.

Review: Agent reviewed · Confidence: 94%

Normalized value: turnitin_ai_detector_not_activated

Original evidence

Evidence 1
At this time McMaster has not activated the Turnitin AI Detector. Ongoing work to assess privacy and security impacts of this tool, as well as exploring its reliability, are underway. McMaster may activate the AI Detector, which allows for retroactive submissions.

Security Review

McMaster's privacy considerations page says that for any tools that include automated functions, including artificial intelligence, the university must conduct an Algorithmic Impact Assessment, with the Privacy Office responsible for the AIA process.

Review: Agent reviewed · Confidence: 93%

Normalized value: algorithmic_impact_assessment_for_automated_functions

Original evidence

Evidence 1
For any tools that include automated functions, including artificial intelligence, the university must conduct an Algorithmic Impact Assessment (AIA). An AIA is a risk assessment process that determines the impact level of an automated decision-making system on the risks of harm to individuals. The privacy office is responsible for conducting the AIA process.

Other

McMaster University Libraries' generative AI citation guide warns that generative AI can generate incorrect, outdated, or biased content, can fabricate sources, and that information shared with generative AI tools can compromise privacy and security.

Review: Agent reviewed · Confidence: 91%

Normalized value: library_genai_limitations_and_privacy_cautions

Original evidence

Evidence 1
Generative AI tools can generate incorrect, outdated or biased content. Confirm the quality, reliability and accuracy of AI-generated content with reputable sources before including it in your work. Generative AI tools are also known to fabricate or hallucinate sources that do not exist... The information you share with generative AI tools can compromise your privacy and security.

Candidate claims

0 machine or needs-review claims

Candidate claims are not final policy conclusions. They preserve source URL, source snapshot hash, evidence, confidence, and review state so the record can be audited before review.

Official sources

7 source attributions

Change log

Source-check timeline and diff-style claim/evidence preview.

View the public change record for this university, including source snapshot hashes, claim review states, and a diff-style preview of current source-backed evidence.

Last checked: May 15, 2026 · Last changed: May 15, 2026

Corrections and missing evidence

Corrections create review tasks and do not directly change this public record.

If an official source is missing, stale, moved, blocked, or incorrectly summarized, submit a source URL, policy change report, or institution correction for review. Corrections must preserve source URLs, source language, original evidence, review state, and audit history.
