Safety first: shared results are anecdotal user reports only and never a substitute for manufacturer guidance.

Takedown and Review

These in-app policy pages keep the moderation, trust, and public-sharing rules close to the workflows they govern.

What can be reported

Users may report abuse, harassment, misleading public data, suspicious copied tables, or other content that appears unsafe for the shared layer.

Moderators may also receive automated watchlist cues from the system, flagging duplicate recipes, unusually high posting velocity, or public entries submitted with little context.
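The watchlist cues above could be sketched as a simple rule-based scorer. Everything here is illustrative: the `Entry` shape, the `watchlist_cues` helper, and the thresholds are assumptions, not the product's actual detection logic.

```python
from dataclasses import dataclass

@dataclass
class Entry:
    """A public shared entry (hypothetical shape)."""
    recipe_hash: str       # normalized hash of the recipe content
    posts_last_hour: int   # author's recent posting velocity
    note_length: int       # characters of context the author provided

def watchlist_cues(entry: Entry, seen_hashes: set[str]) -> list[str]:
    """Return the watchlist cues an entry would trigger.

    Thresholds are illustrative assumptions, not policy.
    """
    cues = []
    if entry.recipe_hash in seen_hashes:
        cues.append("duplicate-recipe")
    if entry.posts_last_hour > 20:
        cues.append("high-velocity")
    if entry.note_length < 15:
        cues.append("thin-context")
    return cues
```

An entry may trigger several cues at once; a cue only routes the entry to a moderator, who still reviews before any action is taken.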

Initial handling

Reported content may stay visible while queued, be limited during triage, or be removed quickly if the risk is obvious.

The moderation workspace should preserve reviewer notes, queue ownership, and resolution history for accountability.

Resolution goals

Potential copyright or safety issues should be reviewed quickly, with a bias toward limiting questionable public content while preserving private user history where appropriate.
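The resolution bias above, limiting the questionable public copy while keeping the owner's private history, could be expressed as a small helper. The `SharedItem` fields and the `resolve_questionable` function are hypothetical names for this sketch.

```python
from dataclasses import dataclass

@dataclass
class SharedItem:
    public_visible: bool     # shown on the shared layer
    private_copy_kept: bool  # owner retains their own history

def resolve_questionable(item: SharedItem, risk_confirmed: bool) -> SharedItem:
    """Apply the stated bias: limit public exposure, preserve private history.

    A sketch of the policy's intent, not the actual takedown pipeline.
    """
    if risk_confirmed:
        item.public_visible = False  # remove or limit the shared copy
    item.private_copy_kept = True    # private history survives where appropriate
    return item
```

Note the asymmetry: a confirmed risk only affects the public copy, so an erroneous takedown never destroys the owner's private data.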

Reporters should be able to provide enough detail for reviewers to understand why the content may be abusive, copied, or unsafe.