Open, evidence-backed AI policy records for public reuse.
Change log
Source-check timeline, source snapshot hashes, claim review state, and a diff-style preview of current source-backed claim evidence.
Current public record freshness and review state.
Brown University currently has 9 source-backed claim records and 6 official source attributions. Most recent tracked change: May 12, 2026.
This tracker is not legal advice, not academic integrity advice, and not an official university statement unless a linked source is the university's own official page.
Diff-style preview built from current public claim and evidence records. Full old/new source diffs require paired historical snapshots.
Inserted lines represent current public claim and evidence records in the source-backed dataset.
9 claim records
Brown OIT guidance says users should not enter Level 2 or 3 Brown data into publicly available or vendor-enabled AI tools unless Brown has a contract for a specific service that protects the data.
Brown OIT guidance says Google Gemini Chat and NotebookLM are accessible at no cost to Brown and can be used with data classified as Risk Level 3, unlike consumer AI services named on the page with which Brown does not have agreements.
Brown Provost guidance says any unapproved use of AI to complete assignments would be covered by Brown’s Academic Code and Graduate Student Edition Academic Code.
Brown University Provost guidance says the University is not prescribing specific AI policies, and that faculty should give clear, unambiguous information about what AI use is and is not allowed in their courses.
Brown OIT guidance says Gemini and NotebookLM are optional tools available to Brown students, Brown staff, Brown-paid faculty, and Brown clinical/medical faculty.
Brown Sheridan Center guidance says instructors should be explicit with students about expectations for generative AI use, including how students should, might, or cannot engage with it.
Brown OIT research guidance says researchers should deeply review AI-generated code for quality and efficiency.
Brown OIT guidance says AI tool use is subject to the same policies as other information technology resources, including acceptable use, copyright, conduct, and contract review policies.
Brown University Communications guidance for Brown communicators says not to input identifying personal information or proprietary information into AI tools.
6 source attributions
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026
official_guidance checked May 12, 2026