
Child Safety Standards

Effective: April 4, 2026 · Essentia Memoria

This document describes how Essentia Memoria addresses child safety and child sexual abuse and exploitation (CSAE), including child sexual abuse material (CSAM), in line with the Google Play Child Safety Standards and applicable law. It is published as a publicly accessible web page (HTML), not as a PDF.

1. Scope and age

Essentia Memoria is not directed to children under 13. Our Privacy Policy states that we do not knowingly collect personal data from children under 13, and users must meet the minimum age requirement set out in our Terms of Service.

2. Zero tolerance for CSAE / CSAM

We maintain zero tolerance for child sexual abuse and exploitation, including the upload, solicitation, or distribution of CSAM, and any grooming or sexual exploitation of minors. Such content and behaviour violate our policies and may violate criminal law.

We prohibit users from using the service to sexualise minors, to seek inappropriate contact with minors, or to share or request illegal imagery. Moderation and enforcement apply to forum posts, collective content, direct messages, marketplace listings, profiles, and uploads where our systems and team can act.

3. Reporting from inside the app (in-app channel)

Users can report child safety concerns without leaving the product:

  • Settings → Privacy: open Settings and follow the Child safety report link, which opens the same form described below.
  • Dedicated contact form: Report a child safety concern — the form is labelled for priority handling. You may include URLs, usernames, timestamps, and a factual description. Do not attach illegal images; describe what you saw.

4. Email and developer contact

You may also email contact@essentiamemoria.com with the subject line Child safety report. The Google Play developer account email on file may be used for official notices; operational triage uses the channels above.

5. How we handle reports

When we receive a credible child safety report, we prioritise review, take steps to remove violating content and restrict accounts as appropriate, and preserve relevant information where the law requires. We do not allow CSAM on our platform.

Where mandatory under applicable law, we will report to competent authorities (for example law enforcement or, where applicable, recognised child-safety reporting bodies in the relevant jurisdiction). The exact authority depends on the country and nature of the report.

6. Law enforcement

We cooperate with valid legal process from authorities with jurisdiction. Requests should identify the account or content and include contact details for the agency. Emergency matters involving imminent danger should be directed to local emergency services first.

7. Related policies

This page supplements our Privacy Policy, Terms of Service, and in-product safety tools (block, mute, and report on social features where available).