Open, evidence-backed AI policy records for public reuse.
Durham, United States
Duke University is listed as QS 2026 rank 62. Duke University has 8 source-backed AI policy claim records from 6 official source attributions. The public record preserves original-language evidence snippets, source URLs, snapshot hashes, confidence, and review state.
v1 public contract
This reference record summarizes visible public data only. Official sources and original-language evidence remain canonical; confidence is separate from review state.
This page is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
Deterministic source-backed dimensions derived from this record's public claims.
Policy profile rows are machine-derived candidate metadata, not final policy conclusions; inspect the linked claim evidence before reuse.
Analysis page-quality metadata is available at /api/public/v1/analysis/page-quality.json.
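As a minimal sketch of consuming that endpoint (the path comes from the line above; the response schema and the base URL are assumptions, not documented here), the page-quality JSON could be fetched like this:

```python
import json
from urllib.request import urlopen
from urllib.parse import urljoin

def page_quality_url(base: str) -> str:
    """Join a site base URL with the documented page-quality endpoint path."""
    return urljoin(base, "/api/public/v1/analysis/page-quality.json")

def fetch_page_quality(base: str) -> dict:
    """Fetch and decode the page-quality JSON.

    The response schema is not guaranteed by this sketch; inspect the
    decoded payload before relying on any field names.
    """
    with urlopen(page_quality_url(base)) as resp:  # hypothetical live call
        return json.load(resp)
```

The URL join is the only part verifiable without network access; the fetch itself is a plain stdlib call with no assumptions beyond the endpoint returning JSON.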
Duke University has 5 source-backed public claims for policy presence; deterministic analysis status: unclear.
Duke University has 1 source-backed public claim for AI disclosure; deterministic analysis status: recommended.
Duke University has 5 source-backed public claims for coursework; deterministic analysis status: restricted.
Duke University has 5 source-backed public claims for exams; deterministic analysis status: conditionally_allowed.
Duke University has 2 source-backed public claims for privacy and data entry; deterministic analysis status: restricted.
Duke University has 2 source-backed public claims for academic integrity; deterministic analysis status: conditionally_allowed.
Duke University has 3 source-backed public claims for approved tools; deterministic analysis status: restricted.
Duke University has 4 source-backed public claims for named ai services; deterministic analysis status: restricted.
Duke University has 5 source-backed public claims for teaching guidance; deterministic analysis status: recommended.
Duke University has 1 source-backed public claim for research guidance; deterministic analysis status: recommended.
Duke University has 1 source-backed public claim for security and procurement; deterministic analysis status: restricted.
Coverage score measures breadth of public, source-backed coverage only. It is not a policy quality, strictness, legal adequacy, safety, or compliance score.
8 reviewed evidence-backed public claims
AI Tool Treatment
Normalized value: Student AI use depends on instructor permission; unauthorized generative AI use is academic misconduct.
Original evidence
Evidence 1: The answer is: it depends! Per the Duke Community Standard, the unauthorized use of generative AI is considered academic misconduct. However, your instructor may permit or at times ask you to use generative AI in certain circumstances and assignments.
Localized display only
Duke student AI use depends on instructor permission and the Duke Community Standard.
Academic Integrity
Normalized value: Unauthorized AI software use is listed in Duke academic-dishonesty guidance.
Original evidence
Evidence 1: Cheating is the act of wrongfully using or attempting to use unauthorized materials, information, study aids, or the ideas or work of another. It includes, but is not limited to: using, consulting, and/or maintaining unauthorized shared resources including, but not limited to, test banks, solutions materials and/or unauthorized use of artificial intelligence (AI) software
Localized display only
Duke academic-dishonesty guidance explicitly includes unauthorized AI software use.
AI Tool Treatment
Normalized value: Duke offers a suite of secure and accessible AI platforms for students, staff, and faculty.
Original evidence
Evidence 1: Duke University offers a suite of secure and accessible AI platforms to students, staff, and faculty. Whether you are involved in AI research and development, or want to learn how to start using AI to power your work, Duke is here to help.
Localized display only
Duke provides an AI tools suite for students, staff, and faculty.
Teaching
Normalized value: Instructors should provide clear syllabus guidance and define permitted AI use.
Original evidence
Evidence 1: Under the Duke Community Standard, unauthorized use of generative AI is treated as cheating. This means you have the discretion to define how, if, and when generative AI may be used in your courses. All instructors should update their syllabi to include clear guidance on the use of generative AI in their courses.
Localized display only
Duke CTL advises clear syllabus guidance for generative AI use.
Security Review
Normalized value: Duke's ChatGPT guidance permits sensitive data (excluding PHI) under an institutional agreement.
Original evidence
Evidence 1: ChatGPT is designed for a broad audience and can be used for general chatbot tasks to streamline your workflow. This includes answering questions, writing, editing, and synthesizing data and ideas. Duke University faculty, staff, and students. Prepaid licenses are provided at no cost to all undergraduate students, as well as faculty, staff, and graduate students in participating units. All other Duke users can purchase a license at a significantly discounted rate. Sensitive (no PHI), (governed by institutional agreement).
Localized display only
Duke labels ChatGPT for sensitive data without PHI under an institutional agreement.
Teaching
Normalized value: AI assignments should have course and assignment-specific AI policies aligned with learning goals.
Original evidence
Evidence 1: Note, it is critical to develop AI policies for your course along with policies for specific AI assignments. In the development of AI assignments, the primary consideration is whether the use of AI will help your students achieve the learning goals of the course.
Localized display only
Duke CTL recommends course and assignment-specific AI policies aligned with course learning goals.
Research
Normalized value: Researchers should document AI decision-making and authenticate chatbot-summarized information before citing it.
Original evidence
Evidence 1: Do document and publish your decision-making alongside your research. Don'ts: Don't cite information found or summarized by a chatbot that you haven't authenticated. Don't use AI tools that collect, store, or train on user data. Check a tool's privacy policy; if you're not sure, contact the tool's creator.
Localized display only
Duke research guidance emphasizes documentation, authentication, and privacy checks.
Privacy
Normalized value: Personal information should not be shared with AI tools in assignment contexts.
Original evidence
Evidence 1: There are ethical issues to using AI beyond questions of plagiarism, copyright and academic integrity that should be considered. First, to minimize threats to the privacy of your students and yourself, personal information should not be shared.
Localized display only
Duke CTL warns against sharing personal information in AI assignment use.
0 machine or needs-review claims
Candidate claims are not final policy conclusions. They preserve source URL, source snapshot hash, evidence, confidence, and review state so the record can be audited before review.
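The audit fields named above (source URL, snapshot hash, evidence, confidence, review state) can be sketched as a record type. The field names, types, and the choice of SHA-256 are assumptions for illustration, not this site's actual schema:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class ClaimRecord:
    """Minimal sketch of an auditable claim record (field names are assumed)."""
    source_url: str
    snapshot_sha256: str  # hex digest of the snapshotted source body
    evidence: str         # original-language evidence snippet, kept verbatim
    confidence: float     # model confidence, tracked separately from review state
    review_state: str     # e.g. "machine_candidate" or "reviewed" (assumed labels)

    def snapshot_matches(self, snapshot_body: bytes) -> bool:
        """Verify a stored source snapshot against the recorded hash."""
        return hashlib.sha256(snapshot_body).hexdigest() == self.snapshot_sha256
```

Keeping the hash alongside the source URL is what makes the record auditable after the fact: a reviewer can re-hash the archived snapshot and detect whether the evidence still matches what was originally captured.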
6 source attributions
dukecommunitystandard.students.duke.edu
ai.duke.edu
ctl.duke.edu
ctl.duke.edu
ai.duke.edu
myresearchpath.duke.edu
Source-check timeline and diff-style claim/evidence preview.
View the public change record for this university, including source snapshot hashes, claim review states, and a diff-style preview of current source-backed evidence.
Corrections create review tasks and do not directly change this public record.
If an official source is missing, stale, moved, blocked, or incorrectly summarized, submit a source URL, policy change report, or institution correction for review. Corrections must preserve source URLs, source language, original evidence, review state, and audit history.