Melbourne, Australia

Deakin University

Short answer

v1 public contract

Deakin University is listed at QS 2026 rank =207 (a tied rank). The record holds 6 source-backed AI policy claim records from 5 official source attributions and preserves original-language evidence snippets, source URLs, snapshot hashes, confidence values, and review state.

Citation-ready summary

As of this public record, University AI Policy Tracker lists Deakin University as an agent-reviewed AI policy record, last checked on May 15, 2026 and last changed on May 15, 2026. The record contains 6 source-backed claims, all 6 reviewed, from 5 official source attributions. Original-language evidence snippets and source URLs remain canonical, with public JSON available at https://eduaipolicy.org/api/public/v1/universities/deakin-university.json. The entity-level confidence is 95%. This tracker is not legal advice, not academic integrity advice, and not an official university statement unless the linked source is the university's own official page.

Claim coverage: 6 reviewed
Source language: en
Public JSON: /api/public/v1/universities/deakin-university.json
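If the public JSON contract above were consumed programmatically, a consumer might tally reviewed claims along these lines. This is a sketch only: the endpoint URL is taken from this page, but the record shape (field names like "claims", "review", and "confidence") is an assumption, not a published schema.

```python
# Hedged sketch of consuming this record's public JSON contract.
# Only the endpoint URL comes from this page; the field names below
# are illustrative assumptions, not a documented schema.

PUBLIC_JSON_URL = (
    "https://eduaipolicy.org/api/public/v1/universities/deakin-university.json"
)

# A downloaded record might look roughly like this (hypothetical shape):
sample_record = {
    "university": "Deakin University",
    "claims": [
        {"category": "privacy", "review": "agent_reviewed", "confidence": 0.95},
        {"category": "ai_tool_treatment", "review": "agent_reviewed", "confidence": 0.94},
        {"category": "research", "review": "agent_reviewed", "confidence": 0.94},
    ],
}

def count_reviewed(record):
    """Count claims whose review state is agent reviewed."""
    return sum(1 for c in record["claims"] if c.get("review") == "agent_reviewed")

print(count_reviewed(sample_record))  # 3 for the sample above
```

In practice the record would be fetched from PUBLIC_JSON_URL with any HTTP client and parsed as JSON before counting; review state should be read separately from confidence, as the record notes.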

Policy signals in this record

  • Evidence includes Privacy claims.
  • Evidence includes AI tool treatment claims.
  • Evidence includes Research claims.
  • Evidence includes Academic integrity claims.
  • Evidence includes Teaching claims.
  • Named AI services detected in public claims: Microsoft Copilot.
  • Disclosure, acknowledgment, citation, or attribution language appears in the public claim text.
  • Teaching, assessment, coursework, or syllabus-related language appears in the public claim text.
Policy status: Reviewed evidence-backed record
Review: Agent reviewed
Evidence-backed claims: 6 (6 reviewed, 0 candidate)
Official sources: 5

This reference record summarizes visible public data only. Official sources and original-language evidence remain canonical; confidence is separate from review state.

This page is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.

Policy profile

Deterministic source-backed dimensions derived from this record's public claims.

Coverage score: 100/100
Coverage label: broad public coverage
Review: Machine candidate
Analysis confidence: 79%

Policy profile rows are machine-candidate derived metadata. They are not final policy conclusions; inspect the linked claim evidence before reuse.

Analysis page-quality metadata is available at /api/public/v1/analysis/page-quality.json.
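The page-quality path above is relative; it could be resolved against the tracker's host, which is assumed here from the public JSON URL given earlier on this page:

```python
from urllib.parse import urljoin

# Assumed base host, taken from the public JSON URL on this page.
BASE = "https://eduaipolicy.org"

# Resolve the relative page-quality path against that host.
page_quality_url = urljoin(BASE, "/api/public/v1/analysis/page-quality.json")
print(page_quality_url)
```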

Teaching guidance

Deakin University has 1 source-backed public claim for teaching guidance; deterministic analysis status: recommended.

Status: Recommended (Machine candidate)
Confidence: 77% · Evidence: 1 · Sources: 1

Security and procurement

Deakin University has 1 source-backed public claim for security and procurement; deterministic analysis status: recommended.

Status: Recommended (Machine candidate)
Confidence: 77% · Evidence: 1 · Sources: 1

Coverage score measures breadth of public, source-backed coverage only. It is not a policy quality, strictness, legal adequacy, safety, or compliance score.

Evidence-backed claims

6 reviewed evidence-backed public claims

Privacy

Deakin research guidance says commercial genAI platforms may store data externally and researchers should use only non-sensitive, non-confidential content suitable for public or external use.

Review: Agent reviewedConfidence95%

Normalized value: external AI tools restricted to non-sensitive, non-confidential research content

Original evidence

Evidence 1
Commercial generative AI platforms like ChatGPT, Bing and others store data externally. Once entered, your information may be retained, reused or accessed beyond your control, even if the platform offers options to delete or restrict access. To minimise risk, use only non-sensitive, non-confidential content that is suitable for public or external use.

Localized display only

Deakin warns that commercial genAI platforms can store data externally and tells researchers to use only non-sensitive, non-confidential content suitable for public or external use.

AI Tool Treatment

Deakin identifies a set of approved genAI and digital learning tools for learning, including Deakin GEM, Studiosity+, Microsoft Copilot for web, and FeedbackFruits.

Review: Agent reviewedConfidence94%

Normalized value: approved learning tools: Deakin GEM, Studiosity+, Microsoft Copilot for web, FeedbackFruits

Original evidence

Evidence 1
Deakin provides a set of endorsed generative AI (genAI) and digital learning tools to support teaching and learning in ways that are safe, ethical, and educationally sound. These tools have been formally evaluated and approved to align with Deakin's academic values, privacy and security obligations, and learning and teaching goals.

Localized display only

Deakin lists endorsed genAI and digital learning tools that it says have been formally evaluated and approved for learning contexts.

Research

Deakin research guidance says HDR thesis use of generative AI is limited to copyediting and proofreading, with generated images allowed only where part of an approved methodology and fully disclosed.

Review: Agent reviewedConfidence94%

Normalized value: HDR thesis AI use limited to copyediting/proofreading except approved disclosed methodology for image generation

Original evidence

Evidence 1
Generative AI may be used for copyediting and proofreading thesis text only. Refer to the Institute for Professional Editors guidelines for what qualifies as copyediting and proofreading support. Using generative AI to create or manipulate images is not permitted in your thesis, unless image generation is part of your approved research methodology and fully disclosed.

Localized display only

For HDR thesis writing, Deakin limits generative AI to copyediting and proofreading, with image generation allowed only when it is part of approved methodology and disclosed.

Academic Integrity

Deakin guidance says students should acknowledge genAI use where it contributed to developing assessment work and should include tool, access date, prompts, output, and where it was used.

Review: Agent reviewedConfidence93%

Normalized value: acknowledge genAI use in assessment development

Original evidence

Evidence 1
Where you have used generative AI (genAI) in developing your assessment (for example, in the development of ideas, problem solving, data analysis, significant writing feedback) you should acknowledge your use of genAI. It is essential that you provide details about where and how you have used it.

Localized display only

Deakin says students should acknowledge genAI use in assessment development and explain where and how it was used.

Academic Integrity

Deakin student guidance says genAI may be used as a starting point for some study tasks, but not to write the final assessment or do the work being assessed.

Review: Agent reviewedConfidence92%

Normalized value: students may use genAI supportively but remain responsible for own assessment work

Original evidence

Evidence 1
You might use genAI as a starting point for writing (for example, a writing template) but never to write your final assessment. Never use genAI to do the work that you are being assessed on. It is your responsibility to write and create your own assessments.

Localized display only

Deakin tells students genAI can be a starting point for writing support, but not a way to write the final assessment or complete the assessed work.

Teaching

Deakin has institution-level genAI principles and a genAI Steering Group coordinating guidance across learning and teaching, research, enterprise use, and digital services.

Review: Agent reviewedConfidence91%

Normalized value: institutional genAI principles and steering group

Original evidence

Evidence 1
Deakin's genAI Steering Group was established in 2023 to: oversee the uptake of genAI tools and associated activity; coordinate institutional guidelines for the use of genAI, with cross-reference to specific guidance for learning and teaching, research and enterprise use; consolidate advice from relevant teams in the Academic and Research and Innovation Portfolios and Digital Services on current practice, emerging issues and technologies.

Localized display only

Deakin says its genAI Steering Group coordinates institutional guidelines and connects learning and teaching, research, enterprise, and digital-services advice.

Candidate claims

0 machine or needs-review claims

Candidate claims are not final policy conclusions. They preserve source URL, source snapshot hash, evidence, confidence, and review state so the record can be audited before review.

Official sources

5 source attributions

Change log

Source-check timeline and diff-style claim/evidence preview.

View the public change record for this university, including source snapshot hashes, claim review states, and a diff-style preview of current source-backed evidence.

Last checked: May 15, 2026
Last changed: May 15, 2026

Corrections and missing evidence

Corrections create review tasks and do not directly change this public record.

If an official source is missing, stale, moved, blocked, or incorrectly summarized, submit a source URL, policy change report, or institution correction for review. Corrections must preserve source URLs, source language, original evidence, review state, and audit history.
