Content Moderation Policy

Last Modified March 27, 2026

SoulPal uses a combination of automated systems and human review to moderate content and platform activity.

1. Goals

Our moderation program is designed to:

  • reduce unlawful or prohibited content;
  • enforce platform rules consistently;
  • respond to user reports;
  • protect minors and vulnerable persons; and
  • limit fraud, abuse, harassment, and exploitation.

2. Moderation Methods

We may use:

  • automated classifiers and keyword detection;
  • risk scoring and behavioral signals;
  • queue-based manual review;
  • account-level enforcement tools; and
  • escalation to specialized reviewers where needed.

3. Limits of Moderation

No moderation system is perfect. Content may be missed, actioned only after a delay, or incorrectly flagged. We retain discretion in how we investigate reports and enforce platform rules.

4. Enforcement Outcomes

Possible enforcement actions include warnings, content removal, visibility limits, feature restrictions, temporary suspension, permanent suspension, or referral to law enforcement where justified.

5. User Cooperation

Users must not interfere with or attempt to circumvent moderation systems.