Safeguarding and Online Safety Statement

At Chatter Labs, children’s safety, privacy, and wellbeing come first. Our mission is to make speech and language games fun, engaging, and accessible while keeping every child safe in every click, tap, and voice interaction. Safeguarding and online safety are at the heart of our business.

Safeguarding means protecting children, young people, and vulnerable adults from harm, abuse, or exploitation. For us, it’s not just a rule — it’s a core part of how we design, build, and manage everything on our platform.

Every child deserves a digital space that is safe, transparent, and respectful — and that’s exactly what Chatter Labs provides.

1. Who This Policy Covers

This policy applies to:

  • All Chatter Labs users: children, caregivers, educators, and speech-language pathologists (SLPs).
  • All features: games, screeners, AI chat tools, voice features, and dashboards.
  • All contexts: play testing in schools and clinics conducted by Chatter Labs employees.
  • All staff and partners: developers, clinicians, designers, and researchers.

Wherever a child interacts with Chatter Labs, our safeguarding promise stands.

2. Our Legal and Ethical Foundation

We take our duty of care seriously. Our safeguarding work is guided by:

Laws and Frameworks

  • COPPA (U.S.) – requires verified parental consent before collecting data from children under 13.
  • GDPR (EU/UK) – gives children and parents clear rights over their personal data.
  • HIPAA (U.S.) – protects sensitive health information shared during therapy.
  • Children Act (UK) – ensures that the welfare of the child is always the top priority.
  • ASHA Code of Ethics – outlines how SLPs must protect client privacy and wellbeing.

Values

  • Transparency: clear communication about how data and tools work.
  • Accountability: every team member understands their role in child protection.
  • Respect: every child’s dignity and privacy are protected.

3. Our Safeguarding Team

We have a Designated Safeguarding Lead (DSL) who oversees all safety processes.

Their main duties include:

  • Reviewing and responding to all safety reports.
  • Coordinating with schools, SLPs, and authorities if needed.
  • Conducting regular staff training on safeguarding and data protection.
  • Reviewing the platform for emerging online safety risks.

All Chatter Labs employees and contractors:

  • Complete annual safeguarding and child protection training.
  • Undergo background checks before joining, including identity verification, criminal records checks, education verification, past-employer verification, and sex offender registry checks.
  • Follow our Code of Conduct and Acceptable Use Policy.

4. How We Keep Children Safe Online

We use multiple layers of protection — from secure design to human review — to create a safe online space for kids.

4.1 Data Privacy and Security by Design

  • We only collect the information needed to design worksheets, set up games, or track individual progress.
  • All data is encrypted during transfer and storage.
  • Identifiable data (like names or email addresses) is stored separately from session data.
  • Parents and SLPs can view, edit, or delete data anytime.
  • We do not sell or share user data for marketing.

When you sign up, we:

  • Ask for verified parental or guardian consent.
  • Explain clearly what information is collected and why.

You can read more detailed information about Chatter Labs’ web app security and tablet app security measures in our help centre.

4.2 Child-Friendly Accounts

  • Children use restricted profiles with limited settings.
  • No in-app messaging, no public posts, no stored audio recordings, and no social features.
  • Parents and SLPs can see activity and progress.
  • Children can’t share personal information such as their name, address, or photos.

4.3 Parental and SLP Controls

  • Parent dashboards let caregivers monitor sessions, privacy settings, and usage.
  • SLP accounts include secure tools for assigning and reviewing worksheets and games.
  • All access is role-based, so only approved adults can see identifiable data.

4.4 Safe and Responsible AI

We use artificial intelligence (AI) to generate content, including words, sentences, and images. In our tablet games, we also use AI for gameplay interactions.

AI is never used to replace professionals or their professional judgment.

Here’s what that means:

  • AI tools never make medical or diagnostic decisions; our focus is purely on educational games.
  • We curate and review all AI-generated content to ensure it’s safe, accurate, and age-appropriate.
  • Audio recordings are turned off by default and require verifiable parental consent.
  • Audio recordings are temporarily processed, not stored indefinitely.
  • We use human oversight for all automated systems.

AI models are regularly reviewed for:

  • Bias (fairness for diverse users)
  • Appropriateness (no unsafe or adult content)
  • Accuracy (correct feedback and suggestions)

AI features are turned off by default; a parent or SLP can turn them on in the account settings with verifiable parental consent.

5. Recognising and Reporting Concerns

We make it easy for anyone to report something that doesn’t feel right.

You can report a concern using the secure web form on our contact us page.

Reports go straight to the Safeguarding Lead. We review them within 24 hours and take action as needed.

If there’s an urgent risk, we’ll contact the proper authorities, such as:

  • Local child protection services
  • School safeguarding leads
  • Police (in serious or immediate-risk cases)

We handle all reports with:

  • Confidentiality
  • Sensitivity
  • Non-retaliation (no one is punished for raising a concern)

6. Preventing Abuse and Exploitation

Chatter Labs has strict safeguards against online abuse. Our platform design includes:

  • No private messaging between children and unknown users.
  • No public posting or comments.
  • Automatic monitoring to detect unsafe inputs or language.
  • Immediate account suspension if any misuse is detected.

Our team reviews all flagged activity and follows escalation procedures for any suspected harm or misconduct.

7. Staff Training and Culture

Everyone who works at Chatter Labs is responsible for child safety.

We make sure every team member:

  • Completes mandatory training in safeguarding, data protection, and online child safety.
  • Understands how to recognize signs of risk and how to report them.
  • Knows how to design and build technology that puts safety first.

We regularly review our internal communication practices, design guidelines, training, and engineering and quality assurance standards.

This ensures every update or new feature meets our safeguarding promise.

Please view our artificial intelligence statement for more information.

8. Data Rights and Transparency

You have the right to know, access, and control how your or your child’s data is used.

You can:

  • Ask for a copy of all stored data.
  • Request correction or deletion.
  • Withdraw consent at any time.
  • File a complaint with your regional data authority (e.g., ICO in the UK or FTC in the U.S.).

We respond to all data requests within 30 days. We also notify users immediately if any data breach ever occurs.

Please view our privacy policy for more information.

9. Reviewing and Improving Our Safeguarding Practices

Safety is never “done.” It’s ongoing.

We review our safeguarding and privacy practices:

  • Every 12 months, or sooner if laws or risks change.
  • After any major incident or system update.
  • With external experts, including clinicians and legal advisors.

Each review looks at security system audits, staff training, quality assurance reports, and user feedback.

We publish the date of last review and contact information right on this page for full transparency.

Last reviewed: [Sept 2025]

10. How We Build Ethical, Safe Technology

Chatter Labs follows a “Safety by Design” approach, which means:

  • Safety and privacy are built into the code from the start.
  • Features are tested for child comprehension and safety before launch.
  • Developers follow strict internal checklists covering game design principles, responsible AI, and safeguarding.
  • We work with SLPs and design experts to test new products.

Before any game or AI feature goes live, it passes checks for:

  1. Data protection
  2. Age-appropriate design
  3. Bias and fairness
  4. Accessibility
  5. Clinical relevance

11. Contact Us

We’re here to listen and respond quickly to any concern about safety, privacy, or conduct. Please reach out to us using the secure web form on our contact us page.

For urgent safety matters involving a child, contact your local child protection service or emergency number immediately.

12. External References and Support

For anyone wanting to learn more about child safety online, we recommend:

13. Our Promise

At Chatter Labs, we believe technology should empower children — never expose them to risk.

We promise to:

  • Keep every child’s information private and protected.
  • Be transparent about how our tools and AI work.
  • Act fast if something goes wrong.
  • Work closely with parents, schools, and professionals to create the safest possible experience.

Child safety isn’t just a policy — it’s the heart of who we are.
