Coverstar is committed to providing a safe, positive space for all our users, especially children. We take child safety extremely seriously. This policy outlines our strict standards against Child Sexual Abuse and Exploitation (CSAE) and Child Sexual Abuse Material (CSAM), and explains how we detect, remove, and report such content. It aligns with Google Play's child safety requirements and reflects our existing safety approach and community guidelines. Our goal is to foster a creative community rooted in kindness, with zero tolerance for any content or behavior that harms a child.
Definitions of CSAE and CSAM
- Child Sexual Abuse and Exploitation (CSAE): Any act or content that sexually abuses or exploits a minor. This includes grooming, sexual harassment of or by minors, solicitation of sexual acts from a minor, sexual extortion (sextortion), trafficking of children for sexual purposes, or any other behavior that sexualizes a child. In essence, any attempt to prey on or sexualize a minor is considered CSAE and is strictly banned.
- Child Sexual Abuse Material (CSAM): Any content that depicts or involves a minor in a sexually explicit situation. CSAM can be photos, videos, live streams, chat messages, illustrations, or any other material showing child sexual abuse or exploitation. CSAM is illegal and has no place on Coverstar or anywhere else; it is universally condemned and will be removed and reported immediately. Even one instance of CSAM will result in swift action, as detailed below.
Zero Tolerance for Child Exploitation
Coverstar maintains a zero-tolerance policy toward CSAE and CSAM. We explicitly prohibit any content or activity that exploits or endangers children. This is clearly stated in our Community Guidelines: for example, “Sexualization of Minors: Any content or behavior involving the sexualization of minors is strictly prohibited and will be reported to the proper authorities.” In practice, this means:
- No Exploitative Content or Behavior: Any attempt to sexualize a minor or engage a minor in sexual content is forbidden. This includes seemingly “harmless” behaviors that facilitate abuse, such as befriending a child online to gain trust for predatory purposes (grooming) or encouraging a child to share personal or intimate content.
- No Tolerance, No Warnings: Users who create, share, or even attempt to distribute CSAM or engage in CSAE will be banned immediately. Content is removed without warning as soon as we become aware of it. There are no second chances when it comes to child safety violations.
Protective Measures and Moderation
We have designed Coverstar from the ground up with young users’ safety in mind. Here are some of the proactive measures and features we use to prevent abuse before it happens and to detect problematic content swiftly:
- Verified Parental Consent: Users under 13 must have a parent or guardian complete a verification and consent process before the account can be created. This ensures parents are aware and involved in their child’s use of the app. We comply with COPPA (Children’s Online Privacy Protection Act) and other age-related privacy laws to safeguard younger users’ data and privacy.
- No Private Messaging: To significantly reduce the risk of predatory behavior, Coverstar has no direct messaging (DM) feature. All user interactions happen through public comments that everyone can see and that our team can monitor. By eliminating private chats, we remove a common channel for grooming or inappropriate contact with minors. This gives parents peace of mind that strangers cannot contact their child in secret.
- Content Moderation with AI and Humans: We use advanced moderation tools to scan all user-generated content (videos, comments, etc.) in real time. Our AI systems (computer vision and natural language processing) actively monitor uploads and comments to filter out any inappropriate or harmful material. This technology can recognize explicit imagery, detect suspicious or predatory language, and understand context (like emojis or slang) to catch potential CSAE content early. Because no automated system is perfect, a dedicated Community Safety team also provides human oversight 24/7, reviewing flagged content and user reports around the clock so that nothing slips through the cracks. (A simplified sketch of how automated scoring hands off to human review appears after this list.)
- Strict Content Rules and Filters: We have strict rules against nudity, sexual content, violence, and other adult themes on a platform that welcomes children. For example, users are prompted to agree they will not post explicit or sexual material (like videos in underwear or bathing suits) when they first create content. Our content filters and moderators will block or take down any content that is overly mature or unsafe for our young audience. This includes not only sexual content but also excessive violence, dangerous activities, hate speech, and other content forbidden by our Community Guidelines.
- Age-Appropriate Design: We continually consider age-appropriate safety in our features. Profiles can be made private (so only approved friends can see content) if additional privacy is desired. While the user can choose this setting, we encourage parents of younger kids to keep their child's account private and monitor their activity. Additionally, certain community features (like joining school groups on Coverstar) have safeguards: for example, school communities require an admin's approval and do not publicly list underage members' school affiliations. These measures help prevent strangers from finding or targeting minors.
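To make the moderation flow above concrete, here is a minimal, illustrative sketch of how an automated score might route a piece of content to auto-allow, human review, or outright blocking. Everything in it is hypothetical: the classify function is a stand-in for a real NLP model, and the thresholds are placeholder values, not Coverstar's actual systems or tuning.

```python
from dataclasses import dataclass

@dataclass
class ModerationResult:
    score: float   # 0.0 (benign) .. 1.0 (almost certainly violating)
    action: str    # "allow", "human_review", or "block"

# Illustrative thresholds; real values would be tuned against labeled data.
REVIEW_THRESHOLD = 0.4
BLOCK_THRESHOLD = 0.9

def classify(text: str) -> float:
    """Stand-in for an NLP model that scores predatory or explicit language."""
    flagged_terms = {"flagged-term-a", "flagged-term-b"}  # placeholder vocabulary
    hits = sum(term in text.lower() for term in flagged_terms)
    return min(1.0, 0.5 * hits)

def moderate_comment(text: str) -> ModerationResult:
    """Route a comment: auto-allow, queue for human review, or block outright."""
    score = classify(text)
    if score >= BLOCK_THRESHOLD:
        return ModerationResult(score, "block")
    if score >= REVIEW_THRESHOLD:
        return ModerationResult(score, "human_review")
    return ModerationResult(score, "allow")

if __name__ == "__main__":
    print(moderate_comment("hello world"))  # -> ModerationResult(score=0.0, action='allow')
```

The key design point is the middle band: anything the model is unsure about goes to the 24/7 human review queue rather than being silently allowed or removed.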
How to Report Concerning Content
Keeping Coverstar safe is a shared responsibility. If you see something, say something – we make it easy for users to report any content or behavior that seems harmful, especially anything involving potential child exploitation or abuse.
- In-App Reporting: Every video and comment on Coverstar has an option to report abuse. To report a piece of content or a user in the app:
  - Find the problematic video or comment.
  - If it's a video, tap the “•••” (More) button on the post. If it's a comment, swipe left or long-press (depending on your device) to reveal the Report option.
  - Select “Report Abuse” or the closest relevant category.
  - Provide a reason and any details. We have a category specifically for sexual content or exploitation involving minors; choosing it flags the issue as urgent.
  - Submit the report. Our moderation team will be alerted immediately for review.
You can report content for many reasons (bullying, hate, etc.), but reports related to child safety are treated with the highest priority. Our team carefully reviews each report against our guidelines and policies and will take action swiftly if a violation is confirmed.
- Reporting via Email: If you cannot report in-app or need to provide additional information, you can email us directly at safety@coverstar.app. This email is monitored by our Trust & Safety team. We encourage you to use this contact especially for urgent child safety issues or to report something outside the app (for example, if someone contacts you outside of Coverstar claiming to be a Coverstar user and behaving inappropriately). When emailing, please include as much detail as possible (usernames, descriptions, screenshots, or links to the content in question). Do not attach or send CSAM images or videos; simply describe the content, as the law prohibits redistributing such material. We will respond promptly and take appropriate action.
- Anonymous and Confidential: Reports can be made confidentially. We will never reveal the identity of a reporter to the user being reported. Your report will be handled with care and discretion. What’s most important is that the content is brought to our attention so we can deal with it.
- External Reporting (Law Enforcement): If you believe a child is in immediate danger or being actively harmed, please contact your local law enforcement right away. You can report suspected child exploitation to your national cyber tipline (for example, the NCMEC CyberTipline in the U.S.) or local authorities. Then, please also notify us via the in-app tool or email so we can take action on our platform while authorities address the wider situation. We fully support users reaching out to proper authorities in emergency situations – your safety and the child’s safety come first.
By reporting problematic content or users, you are playing a vital role in protecting the community. No issue is too small to report – we would much rather investigate and find no serious problem than miss a chance to stop abuse. Your vigilance helps keep Coverstar safe.
Enforcement and Law Enforcement Collaboration
When it comes to confirmed cases of CSAM or CSAE, Coverstar takes decisive action in accordance with our zero-tolerance policy and legal obligations:
- Immediate Removal: Content that is confirmed as CSAM or otherwise exploitative of a child is removed immediately and permanently deleted from our platform. We also take steps to ensure it cannot be re-uploaded: our systems may use hashing technology to recognize and block known illegal material proactively (for example, by matching against known CSAM hashes shared by organizations like NCMEC; a simplified sketch follows this list).
- Account Suspension or Ban: The user(s) responsible will have their accounts suspended or permanently banned on the first offense. This includes any account that uploads such content, posts it in comments, or attempts to solicit sexual content from minors. Offenders will be barred from creating new accounts. We also reserve the right to ban devices or take other measures to prevent circumvention by bad actors.
- Internal Investigation: Our Trust & Safety team will conduct an internal review to identify if the offending user had other accounts, if they have been reported before, and if they uploaded any other content that needs checking. We will also preserve evidence (such as account information, content metadata, chat logs) as needed to support any law enforcement investigation, consistent with our Privacy Policy and Law Enforcement Request Guidelines.
- Cooperation with Law Enforcement: We cooperate fully with law enforcement requests and investigations. If police or other authorities reach out to us (with proper legal process) for information about a user involved in child exploitation, we respond promptly and provide all requested data within the bounds of the law. Our Law Enforcement Request Guidelines explain how we handle such requests while respecting privacy and due process. In emergency situations involving danger to a child, we may also disclose information to law enforcement without delay as allowed by law.
- No Safe Haven: We want to make it crystal clear – Coverstar is not a place for predators. We will take every measure to identify, stop, and report anyone misusing our app to harm children. Offenders are not only banned from our service but will likely face criminal investigation and prosecution as a result of our reporting. We are unyielding on this point to protect our community.
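As an illustration of the hash-matching idea mentioned under Immediate Removal, here is a minimal sketch. It is hypothetical in every detail: production systems rely on perceptual hashes (such as PhotoDNA) that still match after re-encoding or resizing, and on curated industry hash lists, rather than the plain SHA-256 digests and hard-coded blocklist shown here.

```python
import hashlib

# Hypothetical blocklist of hex digests. Real deployments use curated
# industry hash lists (for example, lists distributed through NCMEC
# programs), not a hard-coded set.
KNOWN_ILLEGAL_HASHES = {
    # "hex digest of a known illegal file would appear here",
}

def matches_blocklist(file_bytes: bytes) -> bool:
    """Return True if the upload's digest appears on the blocklist."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES

def screen_upload(file_bytes: bytes) -> str:
    """Reject a matching upload before it ever reaches the platform."""
    if matches_blocklist(file_bytes):
        # A real pipeline would also quarantine the file, suspend the
        # account, and file the legally required report at this point.
        return "blocked"
    return "accepted"

if __name__ == "__main__":
    print(screen_upload(b"example upload bytes"))  # -> accepted
```

Because a cryptographic hash only catches byte-identical copies, hash matching complements, rather than replaces, the classifier-plus-human-review pipeline described earlier.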
Compliance with Laws and Best Practices
Coverstar’s child safety approach is designed to meet or exceed all relevant laws, regulations, and industry best practices:
- Google Play Child Safety Standards: As a social platform, we adhere to Google Play’s Child Safety Standards policy requirements. This document itself serves as our published standards against CSAE, and we maintain in-app reporting tools for user feedback, as described above. We have robust procedures to address CSAM (detection, removal, reporting) and a designated contact point for child safety issues (see Contact Us below). By fulfilling these standards, we ensure our app remains in good standing on app stores and, more importantly, safe for our users.
- Legal Compliance (COPPA and beyond): We comply with the U.S. Children’s Online Privacy Protection Act (COPPA) by obtaining verifiable parental consent for users under 13 and by implementing age-appropriate privacy settings. Our policies will evolve as laws change, but our commitment remains constant: we follow all applicable child protection laws wherever we operate.
- Content Policies and Community Guidelines: Our Terms of Service and Community Guidelines explicitly forbid child exploitation. We reference this Child Safety Policy in those documents to ensure all users and guardians are aware of our rules. Together, these policies create a protective framework: users are on notice that CSAE/CSAM is banned, and they understand the consequences. We regularly train our moderators and staff on these guidelines so that enforcement is consistent and effective.
Transparency and Continuous Improvement
Our commitment to child safety is not a one-time effort – it’s ongoing. We believe in being transparent about our progress and challenges in this area:
- Policy Updates: We will update this Child Safety Policy as needed to address new risks or to improve clarity. The “Last updated” date at the top will reflect changes. Major changes may also be communicated through the app or email. We encourage our community to read these updates and stay informed.
- Regular Reviews: We continuously review our safety measures to ensure they are effective. This includes auditing our moderation systems, testing our reporting flows, and gathering feedback from users and experts. If we find a gap, we work quickly to fix it. Our safety team meets regularly to discuss trends and how to adapt – whether it’s new tactics that predators use or new technology we can adopt to enhance protection.
- Transparency Reporting: As our platform grows, we are exploring ways to share more data about our enforcement actions. In the future, we may publish transparency reports with statistics (for example, the number of CSAM reports handled, how many accounts were banned for CSAE, and average response times). Even before these numbers are shared publicly, we track them internally to measure our success and identify areas to improve. Our ultimate goal is to demonstrate openly to our community and partners that we are effectively keeping abuse off Coverstar.
- Education and Awareness: We aim not only to enforce rules but also to educate our users. Through in-app safety tips, notifications, and help center articles, we remind users (both kids and parents) about online safety. For example, we teach kids to guard their personal information and to report anything uncomfortable, and we remind parents to stay engaged with their child's online activities. An informed community is a safer community.
- Community Involvement: We welcome input from our users and the wider community on improving safety. If you have suggestions or concerns about child safety features, let us know (contact info below). Many of our best improvements come from listening to you. We are all stakeholders in creating a secure platform.
Contact and Further Support
If you have any questions about this Child Safety Standards Policy or need to reach us for a safety-related issue, please contact us at safety@coverstar.app. This is our dedicated child safety point of contact and is monitored by our Trust & Safety team. Whether you are a user, a parent, or even a law enforcement representative, this contact is the proper channel for urgent notifications or inquiries about child safety on Coverstar.
You can also find more information in our other safety resources on the Help Center, including [Our Approach to Safety] and [Community Guidelines], which complement this policy. We’re here to help and we take every safety inquiry seriously.
Coverstar’s Mission: Safe Creativity for All – By enforcing these standards and constantly striving for improvement, we reaffirm that Coverstar is a place for creativity, fun, and positive connections free from abuse or exploitation. We are fully committed to legal compliance and to the trust that families place in us. Thank you for being a part of the Coverstar community and for helping us keep it safe for everyone – especially the kids and teens who deserve a safe online space to express themselves. Together, we can create a better world through positivity and play, without compromise on safety.