Content Moderation & Safety

Last Updated: December 5, 2025

At BluePanther, user safety is our top priority. This page explains how we moderate user-generated content, handle reports, and maintain a safe environment for our community.

Our Commitment to Safety

BluePanther is committed to providing a safe platform where community members can connect, share, and support each other. We employ multiple layers of protection to ensure user safety:

  • Proactive content filtering and detection
  • User reporting mechanisms
  • Human review of reported content
  • Clear enforcement of Community Guidelines
  • User safety controls (block, mute, privacy settings)

How We Moderate Content

Automated Moderation

We use automated systems to help identify potentially harmful content:

  • Abusive Language Filter: Automated detection of profanity, slurs, and offensive language
  • Spam Detection: Identification of repetitive content and suspicious posting patterns
  • Link Screening: Checking links for malicious or phishing content

Automated systems flag content for review, but in complex cases they do not make the final removal decision; those cases go to human moderators.
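To illustrate the idea (this is a simplified sketch, not BluePanther's actual filter, and the word list is hypothetical), a keyword-based flagger routes matching posts to human review rather than removing them automatically:

```python
import re

# Hypothetical term list for illustration only; a real system would use a
# maintained lexicon plus trained classifiers.
FLAGGED_TERMS = {"badword", "spamlink"}

def flag_for_review(text: str) -> bool:
    """Return True if the post should be queued for human review."""
    words = re.findall(r"[a-z']+", text.lower())
    # Flag, never delete: final decisions on complex cases stay with moderators.
    return any(word in FLAGGED_TERMS for word in words)
```

The key design choice mirrored here is that the automated layer only *flags*; removal is a separate, human step.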

Human Review

Our moderation team reviews:

  • All user reports
  • Content flagged by automated systems
  • Appeals from users
  • Complex cases requiring context and judgment

Community Reporting

Our users play a vital role in keeping the platform safe. Reports from community members help us identify content that may have been missed by automated systems.

How to Report Content or Users

Reporting a Post

  1. Tap the three-dot menu (⋮) on the post you want to report
  2. Select "Report Post"
  3. Choose the reason for your report:
    • Hate speech or discrimination
    • Harassment or bullying
    • Violence or threats
    • Spam or misinformation
    • Adult content
    • Other violation
  4. Add any additional details (optional)
  5. Submit your report

Reporting a User

  1. Go to the user's profile
  2. Tap the three-dot menu (⋮)
  3. Select "Report User"
  4. Choose the reason and provide details
  5. Submit your report

Blocking a User

  1. Go to the user's profile
  2. Tap the three-dot menu (⋮)
  3. Select "Block User"
  4. Confirm your choice

Blocked users cannot see your profile or posts, and cannot send you messages.

What Happens After You Report

  1. Report Received: Your report is logged and queued for review. You'll receive confirmation that we received it.
  2. Initial Assessment: Our system categorizes the report and assigns priority based on severity.
  3. Human Review: A moderator reviews the reported content against our Community Guidelines.
  4. Action Taken: We remove the content, issue a warning, suspend the account, or take no action if guidelines weren't violated.
  5. Notification: You may receive notification about the outcome of your report.

Response Times

  • Urgent reports (violence, imminent harm): Within 24 hours
  • High priority (harassment, hate speech): Within 48 hours
  • Standard reports (spam, other violations): Within 7 days
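As a rough illustration of the severity-based triage described above (category names and the lookup itself are hypothetical, not our production system), the published response windows can be modeled as a simple priority table:

```python
# Maps a report category to (priority, maximum hours before review),
# following the published response times. Names are illustrative only.
RESPONSE_WINDOWS = {
    "violence": ("urgent", 24),
    "imminent_harm": ("urgent", 24),
    "harassment": ("high", 48),
    "hate_speech": ("high", 48),
    "spam": ("standard", 168),  # 7 days
}

def triage(category: str) -> tuple[str, int]:
    """Return (priority, review window in hours); unknown categories
    fall back to the standard 7-day window."""
    return RESPONSE_WINDOWS.get(category, ("standard", 168))
```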

Safety Features in the App

  • 🚫 Block Users: Prevent specific users from viewing your profile or contacting you.
  • 🔇 Mute Users: Hide content from specific users without blocking them.
  • 🔒 Privacy Controls: Control who can see your profile and content.
  • 🛡️ Content Filters: Automated filtering of abusive language and inappropriate content.
  • 🚨 Report System: Easy-to-use reporting for posts, comments, and users.
  • 👤 Account Security: Email verification and secure authentication through Clerk.

Enforcement Actions

When content or behavior violates our guidelines, we may take the following actions:

  • Content Removal: Content that violates guidelines is removed from the platform
  • Warning: First-time or minor violations; user is educated about guidelines
  • Feature Restriction: Temporary limitation on posting, commenting, or messaging
  • Temporary Suspension: Account suspended for 24 hours to 30 days for repeated or serious violations
  • Permanent Ban: Account permanently terminated for severe violations or continued misconduct

Transparency

We believe in transparency about our moderation practices:

  • Users are notified when their content is removed (with reason)
  • Users can appeal moderation decisions
  • We regularly review and update our policies
  • Our Community Guidelines are publicly available

Contact Our Safety Team

For urgent safety concerns or to report serious violations, contact our safety team:

Email: support@bluepanther.in

Use the subject line "Safety Report" for priority handling.

For general questions about our moderation policies, please review our Community Guidelines.

© 2026 BluePanther Technologies. All rights reserved.