At Coverstar, safety is our highest priority. We’ve built Coverstar as a platform where people can foster meaningful, authentic connections while expressing their creativity in a positive environment. We are dedicated to cultivating a community rooted in kindness and respect, and we take our responsibility to protect our users seriously. Our safety features are continuously improved to keep pace with the ever-changing online world.
Coverstar has developed a robust ecosystem of security tools, safety infrastructure, community guidelines, and user policies, positioning us as a leader in user safety among our peers. To help you personalize and control your Coverstar experience, we provide features such as:
- Blocking users who violate your boundaries.
- Reporting inappropriate or harmful behavior.
In addition to these tools, our dedicated Community Team works around the clock to monitor user reports and support tickets, ensuring our community is safe and supported 24/7. No issue is too small—if you have questions or concerns, reach out to us at safety@coverstar.app, and we’ll be happy to assist.
To strengthen our efforts, Coverstar uses AI Computer Vision and Natural Language Moderation as the first line of defense. This advanced technology scans for inappropriate content in real time, helping to filter harmful language or behavior before it reaches users.
Explore our Safety Center to learn more about these resources and tips for ensuring a safe and enjoyable experience on Coverstar.
Moderation
We take all instances of bullying, hate, or violence on our platform very seriously. Our moderation system combines human expertise with cutting-edge AI technology to maintain the safety and integrity of the Coverstar community.
AI Computer Vision and Natural Language Moderation serves as the backbone of our automated system, leveraging deep learning technology to scan and filter out harmful or inappropriate content. Its advanced algorithms are capable of:
- Detecting inappropriate or harmful content in user-submitted media, such as photos and videos, to prevent violations of our Community Guidelines.
- Deciphering the meaning of emojis in context.
- Keeping up with modern slang and evolving language.
- Scraping the web for real-life examples to refine its accuracy.
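In spirit, this automated first pass acts like a classifier that scores each piece of content and then filters it, escalates it, or lets it through. The sketch below is a deliberately simplified keyword-based stand-in for that idea — the pattern list, thresholds, and function name are illustrative assumptions, not Coverstar's actual deep-learning system:

```python
import re

# Hypothetical placeholder patterns -- a real system would use trained
# models over text and media, not a fixed keyword list.
BLOCKED_PATTERNS = [r"\bhate\b", r"\bbully(?:ing)?\b"]

def moderate_text(text: str) -> str:
    """Return 'blocked', 'review', or 'allowed' for a piece of text."""
    lowered = text.lower()
    hits = sum(bool(re.search(p, lowered)) for p in BLOCKED_PATTERNS)
    if hits >= 2:
        return "blocked"   # clear violation: filtered before it reaches users
    if hits == 1:
        return "review"    # ambiguous: escalated to human moderators
    return "allowed"

print(moderate_text("have a great day"))  # allowed
```

The key design point the sketch captures is the hybrid split described above: unambiguous violations are filtered automatically, while borderline cases are routed to the Community Team for a nuanced human decision.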
Alongside our AI technology, our skilled Community Team brings a human touch to moderation, using their experience and understanding to make nuanced decisions. Together, this hybrid system enables us to strike a balance between protecting our users and preserving freedom of expression.
Our goal is to create an inclusive, respectful space where diverse voices can thrive while ensuring Coverstar remains free from hate, bullying, or abuse.
Reporting
If you come across content on Coverstar that is harmful, offensive, or inappropriate, you can report it directly within the app. Here’s how:
- Locate the video or comment.
- For a video, tap the More button (•••); for a comment, swipe right.
- Tap Report Abuse.
- Select a reason for your report to flag the content for review.
Our moderation team carefully evaluates each report against our Community Guidelines and Terms of Service. We consider the reported content, its potential impact on others, and any relevant context. If the content is found to violate our policies, we take appropriate action, which can range from blocking the offending user to permanently banning them from the platform.
By reporting harmful content, you play a vital role in maintaining a safe and respectful community. Your actions help protect yourself and others from harassment, bullying, and other forms of online abuse. If you come across content that violates our guidelines, don’t hesitate to report it—your voice matters.
At Coverstar, we’re building a platform where creativity flourishes and users feel secure. With your support, we can continue to grow a positive, empowering space for the next generation of creators.