Open, evidence-backed AI policy records for public reuse.
Change log
Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.
Current public record freshness and review state.
University of Münster currently has 7 source-backed claim records and 6 official source attributions. Latest tracked change: May 16, 2026.
This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
Diff-style preview built from current public claim/evidence records. Full old/new source diffs require paired historical snapshots.
Inserted lines represent current public claim and evidence records in the source-backed dataset.
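The snapshot-hash and diff-preview mechanics described above can be sketched as follows. This is an illustrative assumption of how such a tracker could work, not the tracker's published implementation; the function names and record shapes are invented for the example.

```python
import difflib
import hashlib

def snapshot_hash(page_text: str) -> str:
    """Hash a fetched source page so a later source check can detect changes.
    (Illustrative; the tracker's actual hashing scheme is not published.)"""
    return hashlib.sha256(page_text.encode("utf-8")).hexdigest()

def diff_preview(old_lines: list[str], new_lines: list[str]) -> list[str]:
    """Build a diff-style preview between two snapshots of claim records."""
    return list(difflib.unified_diff(old_lines, new_lines, lineterm=""))

# Without a paired historical snapshot, the "old" side is empty, so every
# current claim record shows up as an inserted ("+") line -- matching the
# note above that full old/new diffs require paired snapshots.
current_claims = [
    "UniGPT terms distinguish local and external models ...",
    "Chat data is not used to train models ...",
]
for line in diff_preview([], current_claims):
    print(line)
```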
7 claim records
UniGPT terms distinguish local and external models: local models may process confidential V3 information but not strictly confidential V4 information, while external models may receive only public V1 information.
The UniGPT privacy page states that user-generated chat data is not used to train models; logs and metrics are automatically deleted after 14 days, and other data is automatically deleted after six months without a login.
University of Münster provides UniGPT as a chatbot service for research, teaching, and administration, offering on-premise models in the Uni Cloud and external models; the service page says on-premise model data remains within the university.
University of Münster central guidance states that whether generative AI systems may be used in examination work depends on the competency being assessed and is determined by the examination regulations and the examiners; where expression, translation, or code creation is itself the assessed competency, use is generally not permissible.
University of Münster central guidance states that it follows the DFG statement on generative models for text and image creation in research, including disclosure of whether, which, why, and how extensively generative models were used whenever results are made publicly accessible.
For Faculty 06, the dean's office recommends that generative AI content in examination work be clearly cited, that an explanation of tool use be added, and that full prompts and AI responses be appended; instructors may adapt or reduce these documentation and citation duties for individual exams or courses.
The University and State Library Münster advises users to ask instructors or their university before using generative AI for academic work and warns that AI use in examination documents can be evaluated as attempted deception.
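The local/external model rule in the first claim record above can be expressed as a simple permission check. The level names follow the claim text (V1 public, V3 confidential, V4 strictly confidential); the intermediate V2 level and the function itself are assumptions for illustration, not UniGPT's actual terms or code.

```python
# Routing rule from the claim record: local models may process data up to
# confidential (V3) but not strictly confidential (V4); external models
# may receive only public (V1) data. V2 is an assumed intermediate level.
ALLOWED_LEVELS = {
    "local": {"V1", "V2", "V3"},
    "external": {"V1"},
}

def may_process(model_kind: str, classification: str) -> bool:
    """Return True if a model of this kind may receive data at this level."""
    return classification in ALLOWED_LEVELS.get(model_kind, set())
```

Unknown model kinds fall through to an empty set, so the check denies by default rather than permitting unclassified routes.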
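The retention rules in the privacy-page claim above (logs and metrics after 14 days, other data after six months without a login) can be sketched as deletion-date arithmetic. The function and the 183-day reading of "six months" are assumptions for illustration only.

```python
from datetime import date, timedelta

LOG_RETENTION = timedelta(days=14)    # logs and metrics: 14 days
DATA_RETENTION = timedelta(days=183)  # "six months" approximated as 183 days

def deletion_due(kind: str, last_event: date) -> date:
    """Date on which a record becomes due for automatic deletion.
    'log' counts from creation; anything else counts from the last login."""
    if kind == "log":
        return last_event + LOG_RETENTION
    return last_event + DATA_RETENTION

# A log entry created on the latest tracked change date:
print(deletion_due("log", date(2026, 5, 16)))  # → 2026-05-30
```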
6 source attributions
official_guidance checked May 16, 2026
official_guidance checked May 16, 2026
official_guidance checked May 16, 2026
official_guidance checked May 16, 2026
official_guidance checked May 16, 2026
official_guidance checked May 16, 2026