Open, evidence-backed AI policy records for public reuse.
Change log
Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.
Current public record freshness and review state.
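The snapshot hashes mentioned above can be produced with a content digest over the fetched source text. A minimal sketch, assuming snapshots are stored as UTF-8 text (the function name and sample input are illustrative, not part of the tracker's actual code):

```python
import hashlib

def snapshot_hash(snapshot_text: str) -> str:
    """Return a SHA-256 hex digest identifying a fetched source snapshot.

    The same snapshot text always yields the same digest, so a changed
    digest signals that the underlying source page was modified.
    """
    return hashlib.sha256(snapshot_text.encode("utf-8")).hexdigest()

# Example: digest for a hypothetical snapshot body
digest = snapshot_hash("Humanizar la Inteligencia, orienting guide text...")
print(digest)
```

Because the digest is deterministic, comparing stored and freshly computed hashes is enough to flag a source for re-review without keeping the full old text online.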
Universidad de Chile currently has 5 source-backed claim records and 2 official source attributions. Latest tracked change: May 15, 2026.
This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
Diff-style preview built from current public claim/evidence records. Full old/new source diffs require paired historical snapshots.
Inserted lines represent current public claim and evidence records in the source-backed dataset.
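When paired historical snapshots are available, a full old/new diff like the one previewed above can be generated from the two snapshot texts. A minimal sketch using Python's standard `difflib`; the snapshot lines here are invented placeholders, not real tracked source text:

```python
import difflib

# Hypothetical old and new snapshot lines for one tracked source
old_snapshot = [
    "Guide suggests declaring AI use.",
    "AI tools may not be credited as authors.",
]
new_snapshot = [
    "Guide suggests declaring the type, timing and purpose of AI use.",
    "AI tools may not be credited as authors.",
]

# unified_diff emits +/- prefixed lines, matching the diff-style preview
diff_lines = list(difflib.unified_diff(
    old_snapshot, new_snapshot,
    fromfile="snapshot_old", tofile="snapshot_new", lineterm="",
))
for line in diff_lines:
    print(line)
```

Lines prefixed with `+` correspond to inserted (current) claim evidence, and `-` lines to superseded text from the earlier snapshot.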
5 claim records
For health research publications, the Faculty of Medicine guide suggests declaring the type, timing and purpose of AI use, never attributing authorship to an AI tool, and validating AI-assisted production through human authors.
The Faculty of Medicine guide proposes five principles for AI initiatives in that faculty: transparency and traceability, human supervision and non-delegation of critical judgment, equity and technological justice, academic integrity and responsible authorship, and participatory governance with continuous updating.
The Faculty of Medicine's Humanizar la Inteligencia document describes itself as an orienting guide and not as binding regulation.
The FCFM guidance warns teaching teams that inadequate use of generative AI can create ethics risks such as plagiarism and copyright issues, and security/privacy risks such as insufficient data protection or data use without consent.
The FCFM guidance recommends that course teaching teams build a transparency policy with students around possible generative AI uses, including defining when and how the AI tool used should be cited.
2 source attributions
official_pdf checked May 15, 2026
official_pdf checked May 15, 2026