Standards on Child Sexual Abuse and Exploitation (CSAE)

At Pikoo AI, we are committed to creating a safe environment for all users, especially children and teenagers. We maintain a strict zero-tolerance policy against any form of child sexual abuse material (CSAM) or exploitation. This includes any content that depicts, promotes, solicits, or attempts to normalize sexual harm involving minors, whether through images, text, video, in-game behavior, or communication. Such content is not only morally reprehensible but also illegal in virtually every jurisdiction worldwide.

To enforce this standard, we use a combination of automated and manual safety mechanisms. Our systems are designed to detect and block known CSAM using industry tools such as PhotoDNA and CSAI Match. In addition to automated filters, we maintain a trained moderation team that reviews flagged content, suspicious user behavior, and user reports. We prioritize swift action: reported content is reviewed within 24 hours, and confirmed violations lead to immediate consequences, including account bans and reports to law enforcement.

We cooperate fully with authorities in all matters involving confirmed or suspected CSAE. Where required, we report such incidents to the National Center for Missing & Exploited Children (NCMEC) or other appropriate law enforcement agencies based on the jurisdiction of the users involved. We also inform relevant platform partners such as cloud hosting providers or app stores, in line with our duty of care and contractual obligations.

Pikoo AI incorporates age-appropriate design choices to reduce the risk of grooming and exploitation. These include limiting direct communication between minors and other users, offering parental controls, and enforcing age restrictions on certain features. Where necessary, we verify age or require consent from a guardian. Our goal is to balance user creativity and social interaction with robust safety barriers, especially for our younger users.

To ensure effective handling of sensitive issues, our Trust & Safety team receives ongoing training in identifying signs of grooming, coercion, and other high-risk behaviors. We also retain certain user metadata in accordance with legal obligations and platform policies to assist law enforcement in the event of a confirmed investigation.

We are part of a broader ecosystem working to combat online exploitation. To that end, we align our practices with guidance from child protection organizations, government regulators, and platform policy teams. We are committed to evolving these safeguards over time. Our CSAE policy is reviewed and updated twice a year, or more frequently if required by law or emerging threats.

Anyone can report suspected abuse, exploitation, or violations of this policy by contacting our Trust & Safety team directly at contactpikoo2000@gmail.com. We take every report seriously and investigate thoroughly in accordance with this policy.

Pikoo AI exists to foster creativity, play, and learning—never harm. We recognize the responsibility that comes with building digital spaces for young users, and we take that responsibility seriously.