Review workflow

Contribution review before publication

Review keeps the public database useful without letting submissions become unsupported facts. Every contribution is evaluated for source provenance, source language, evidence, rights, privacy, moderation risk, and consistency with the claim/evidence contract.
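
As an illustration only, these checks can be modeled as a per-contribution checklist. Every field name below is an assumption, not the tracker's actual schema; the point is that a contribution clears review only when every check holds.

    from dataclasses import dataclass

    # Hypothetical checklist mirroring the review criteria above.
    # All field names are illustrative, not a real schema.
    @dataclass
    class ContributionChecks:
        provenance_confirmed: bool = False    # source is public and attributable
        source_language_labeled: bool = False
        evidence_attached: bool = False       # short original-language excerpt
        rights_reviewed: bool = False         # copyright caveats recorded
        privacy_cleared: bool = False         # no private personal data
        moderation_cleared: bool = False      # no doxxing, attacks, accusations
        contract_consistent: bool = False     # matches claim/evidence contract

        def passes_review(self) -> bool:
            # A contribution clears review only when every check holds.
            return all(vars(self).values())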

Review queues

Each queue has a different publication gate. Passing a queue does not bypass source-evidence requirements or review-state labeling.

Source discovery review

Verify that suggested URLs are public, attributable, labeled with their source language, and relevant to university AI policy.

A source can be staged only after reviewers confirm provenance, rights caveats, source language, and crawlability.

Queue: source_discovery_review
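
A minimal sketch of the staging gate described above, assuming a set-based confirmation model; the names are illustrative, not the queue's real API.

    # Staging gate sketch for source_discovery_review (names assumed).
    REQUIRED = {"provenance", "rights_caveats", "source_language", "crawlability"}

    def can_stage_source(confirmations: set[str]) -> bool:
        # All four reviewer confirmations must be present before staging.
        return REQUIRED <= confirmations

    assert not can_stage_source({"provenance", "source_language"})
    assert can_stage_source(REQUIRED)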

Crawl failure review

Separate inaccessible, blocked, redirected, no-policy, and weak-source cases before extraction.

Failure records can be published only as labeled status metadata, not as policy conclusions.

Queue: crawl_failure_review
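
One way to keep failure records as labeled status metadata is sketched below, assuming a simple enum of failure labels; the label strings are illustrative.

    from enum import Enum

    # Hypothetical failure labels for crawl_failure_review.
    class CrawlFailure(Enum):
        INACCESSIBLE = "inaccessible"
        BLOCKED = "blocked"
        REDIRECTED = "redirected"
        NO_POLICY = "no_policy"
        WEAK_SOURCE = "weak_source"

    def to_status_metadata(url: str, failure: CrawlFailure) -> dict:
        # Only the labeled status ships; no policy fields exist to publish.
        return {"url": url, "crawl_status": failure.value}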

Claim/evidence review

Check whether a proposed claim is supported by short original-language evidence and source attribution.

Claims remain candidate records until evidence, confidence, review state, and citation fields pass review.

Queue: claim_evidence_review
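
A sketch of the candidate lifecycle, assuming the reviewed fields named above; field and state names are placeholders.

    from dataclasses import dataclass

    # Illustrative candidate record for claim_evidence_review.
    @dataclass
    class CandidateClaim:
        claim_text: str
        evidence_excerpt: str | None = None  # short, original-language
        citation: str | None = None          # source attribution
        confidence: str | None = None        # kept separate from review state
        review_state: str = "candidate"

        def promote_if_reviewed(self) -> None:
            # The record leaves the candidate state only when evidence,
            # citation, and confidence have all passed review.
            if self.evidence_excerpt and self.citation and self.confidence:
                self.review_state = "reviewed"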

Translation review

Review localized display summaries without replacing source-language evidence.

Translation changes can affect helper display only; original evidence remains canonical.

Queue: translation_review
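
This separation can be sketched as one canonical evidence field plus a per-language map of display summaries; the shape is an assumption, not the actual data model.

    from dataclasses import dataclass, field

    # Illustrative record for translation_review: translations touch only
    # the display map, never the canonical original-language excerpt.
    @dataclass
    class EvidenceRecord:
        original_language: str
        original_excerpt: str                        # canonical
        display_summaries: dict[str, str] = field(default_factory=dict)

        def set_display_summary(self, lang: str, summary: str) -> None:
            self.display_summaries[lang] = summary   # helper display only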

Institution correction review

Handle official corrections, metadata fixes, attribution disputes, and canonical page adjustments.

Corrections must preserve audit history and cite the official or attributable evidence used.

Queue: institution_correction_review
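
One hedged reading of the audit requirement is an append-only correction log that refuses uncited changes; everything below is illustrative.

    from dataclasses import dataclass, field

    # Sketch for institution_correction_review: corrections append to the
    # audit history instead of overwriting it, and each must cite evidence.
    @dataclass
    class InstitutionRecord:
        canonical_page: str
        audit_history: list[dict] = field(default_factory=list)

        def apply_correction(self, new_page: str, cited_evidence: str) -> None:
            if not cited_evidence:
                raise ValueError("a correction must cite official or attributable evidence")
            self.audit_history.append(
                {"from": self.canonical_page, "to": new_page, "evidence": cited_evidence}
            )
            self.canonical_page = new_page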

Course submission review

Moderate course-level AI policy evidence, privacy concerns, copyright limits, and source context.

Course records reuse claim/evidence and remain pending until moderation and rights checks pass.

Queue: course_submission_review
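
As a small sketch, the pending rule above reduces to a two-check gate; the state strings are assumptions.

    # Illustrative gate for course_submission_review: a course record stays
    # pending until both checks pass (state names assumed).
    def course_record_state(moderation_passed: bool, rights_passed: bool) -> str:
        return "publishable" if moderation_passed and rights_passed else "pending"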

Abuse and moderation review

Reject personal attacks, private data, doxxing, unsupported accusations, and full copyrighted materials.

Rejected or unsafe submissions are not converted into public facts.

Queue: abuse_moderation_review
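
A minimal sketch of the rejection rule, assuming flag names that mirror the categories above; a rejected submission never proceeds toward publication.

    # Illustrative filter for abuse_moderation_review (category names assumed).
    REJECTION_CATEGORIES = {
        "personal_attack", "private_data", "doxxing",
        "unsupported_accusation", "full_copyrighted_material",
    }

    def moderate(flags: set[str]) -> str:
        # Any match with a rejection category stops the submission; it is
        # never converted into a public fact.
        return "rejected" if flags & REJECTION_CATEGORIES else "forwarded_for_review"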

Safeguards

The first review layer is intentionally stricter than a normal issue tracker's.

Privacy

  • Do not submit private student information.
  • Do not submit non-public instructor personal data.
  • Course submissions should include only public course context or short evidence excerpts needed for review.
  • Institutional role claims may be noted, but public review still requires source evidence.

Copyright

  • Do not paste full copyrighted syllabi, PDFs, LMS pages, or source pages.
  • Use short original-language excerpts only when necessary for evidence review.
  • Tracker metadata can be open-licensed, but source documents retain their original rights.
  • Raw source materials should not be added to Git unless explicitly approved.

Moderation

  • No doxxing, harassment, personal attacks, or unsupported accusations.
  • No requests for legal advice or academic integrity advice.
  • No requests to bypass login walls, paywalls, robots.txt, or other access controls.
  • Candidate records must stay labeled until reviewed.

Publication gates

Review state and confidence remain separate in every contribution path.

  1. Submission creates a review task, not a canonical fact.
  2. Source review checks officialness, source language, accessibility, and rights caveats.
  3. Claim/evidence review checks short original-language evidence and attribution.
  4. Course submissions pass moderation before any course entity can be published.
  5. Institution corrections preserve audit history and cite supporting sources.
  6. Only reviewed outputs can graduate into public claim/evidence records.
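
The gates above can be sketched as an ordered pipeline in which a submission halts at the first failing gate; the gate names and the result interface are assumptions for illustration.

    # Hypothetical walk through the publication gates listed above.
    GATES = ["source_review", "claim_evidence_review", "moderation",
             "correction_audit"]

    def publication_state(results: dict[str, bool]) -> str:
        for gate in GATES:
            if not results.get(gate, False):
                return f"held_at:{gate}"   # still a review task, not a fact
        return "public_claim_evidence_record"

    # Example: a submission that has only cleared source review stays held.
    print(publication_state({"source_review": True}))  # held_at:claim_evidence_review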

Review policy JSON

This endpoint serves review-policy metadata for contributors and agents. It is not a write API.
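
A read-only fetch might look like the sketch below. The URL is a placeholder, since the real endpoint path is not given here.

    import json
    from urllib.request import urlopen

    # Placeholder URL: substitute the tracker's documented endpoint.
    POLICY_URL = "https://example.org/review-policy.json"

    # Read-only by design: the endpoint is metadata, not a write API.
    with urlopen(POLICY_URL) as resp:
        policy = json.load(resp)
    print(sorted(policy))  # inspect which queues and gates are described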