
How Little Narratives Protects Children's Data — NQF AI Policy Compliance

Little Narratives is purpose-built to meet every requirement of the NQF Artificial Intelligence Policy — so your centre can use AI with confidence, not risk.

By the Little Narratives team · Published 4 March 2026 · Updated 20 April 2026

Since 1 September 2025, the Education and Care Services National Regulations explicitly require services to have policies for the safe use of digital technologies — including AI tools. For a deeper look at what the regulation says and what your service needs to do, read our plain-English guide to Regulation 168.

This page is different. It's our commitment, in plain English, to exactly how Little Narratives is architecturally designed to meet those requirements. Every decision documented below was made before writing a single line of code — data safety is not a feature we bolted on. It's the foundation we built on.

Why centres need an AI-safe tool

The amended Regulation 168 requires centres to have policies for the safe use of digital technologies and online environments — and ACECQA's September 2025 guidance makes clear this applies to AI tools that handle children's information.[1]

Generic AI tools like ChatGPT, Claude, and Gemini are not designed for children's data. Under typical free-tier default settings, prompts can be used for model improvement, data is stored on overseas servers, and there is no audit trail. An educator who pastes a child's name, behaviour, and age into a free-tier chatbot is likely breaching both Regulation 168 and Australian Privacy Principle 6.[3]

No child photos uploaded to AI

The NQF AI Policy explicitly prohibits staff from uploading children's photographs to AI platforms. Little Narratives complies by using a trait-based character builder — educators select visual traits (hair colour, skin tone, eye colour) and our AI generates watercolour illustrations from text descriptions only. No photograph ever touches the AI.
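
To make the trait-based approach concrete, here is a minimal sketch of the idea: traits chosen by an educator become a plain-text description, so no photograph is ever part of the request. The trait names and prompt wording are illustrative only, not our actual implementation.

```python
# Sketch: visual traits selected by an educator are turned into a
# text-only prompt for the illustration model. No image data is
# involved at any point. Trait names and wording are hypothetical.

def build_character_prompt(traits: dict) -> str:
    """Join selected traits into a plain-text illustration prompt."""
    description = ", ".join(
        f"{name.replace('_', ' ')}: {value}" for name, value in traits.items()
    )
    return f"Watercolour illustration of a child character with {description}."

prompt = build_character_prompt({
    "hair_colour": "curly brown",
    "skin_tone": "olive",
    "eye_colour": "green",
})
```

Because the prompt is assembled entirely from selected trait text, there is no code path through which a photograph could reach the AI.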

Parent-controlled, guardian-consented

Parents can optionally generate watercolour avatars of themselves in the Family App — these are adults making autonomous decisions about their own images, outside the scope of the NQF staff restrictions. Parents also control whether their child participates in AI story generation at all.

Every parent action is logged with explicit consent. Parents can disable AI story participation for their child at any time. Centres never upload family photos — only parents control their own images.

Your data is never used for AI training

We use a paid enterprise API tier, governed by a Data Processing Addendum that contractually prohibits the provider from using your data to train, fine-tune, or improve AI models. Your prompts are not reviewed by humans for product improvement.

Australian data residency

All user data — child profiles, stories, media, documents — is stored on Australian servers in the Sydney region. AI processing is performed via a paid enterprise AI API under a Data Processing Addendum providing contractual privacy protections.

Data location by component

  • Database: Australia (Sydney region)
  • File Storage: Australia (Sydney region)
  • Server-side Processing: Australia (Sydney region)
  • AI Processing: Secured under Data Processing Addendum

Data minimisation

We send only the minimum data needed to generate personalised content. We encourage educators and parents to use first names, nicknames, or pet names only — never full names, addresses, or other identifying details.

What is sent to the AI — and why

  • First name / nickname — to personalise story text
  • Age (year of birth only) — for age-appropriate language
  • Interests — for tailored themes
  • Visual traits (text) — for character illustrations
  • Educator observations — for story context and goals

No photographs, no full names, no addresses, no medical records, no family details. The principle is simple: send the least possible data to produce the best possible result.
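
As an illustration of that principle, a story request can be pictured as a payload restricted to the five fields above, with anything else rejected before it leaves the server. This is a sketch with hypothetical field names, not our actual API.

```python
# Sketch of the data-minimisation rule above: a story request carries
# only the permitted fields, and any extra field raises an error
# rather than being silently forwarded. Field names are illustrative.

ALLOWED_FIELDS = {
    "first_name",     # first name or nickname only
    "birth_year",     # year of birth, never a full date of birth
    "interests",      # e.g. ["dinosaurs", "painting"]
    "visual_traits",  # text descriptions only, never photographs
    "observation",    # the educator's observation text
}

def build_story_request(profile: dict) -> dict:
    """Keep only allowed fields; refuse anything over-shared."""
    disallowed = set(profile) - ALLOWED_FIELDS
    if disallowed:
        raise ValueError(f"Refusing to send fields: {sorted(disallowed)}")
    return {k: v for k, v in profile.items()}

request = build_story_request({
    "first_name": "Ava",
    "birth_year": 2021,
    "interests": ["dinosaurs"],
    "observation": "Ava spent outdoor play sorting leaves by size.",
})
```

Rejecting unexpected fields outright, rather than filtering them out quietly, means an accidental paste of an address or surname fails loudly instead of being transmitted.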

Educator review & professional judgment

The NQF AI Policy requires that AI is "not used as a substitute for an educator's professional knowledge." Little Narratives is a drafting assistant — every generated story, song, and observation is presented for educator review before being shared with families. Educators verify, personalise, and approve all content.

Compliance checklist for centre directors

For centre directors and NQS assessors — here's how Little Narratives meets each requirement of the NQF AI Policy:

  1. "Staff must not upload personal information or photographs to AI platforms." — Educators use a trait-based character builder (text-to-image). No photos are uploaded by staff. Parents optionally manage their own images in the Family App.
  2. "AI tools must not store or process data in non-compliant ways." — Our paid enterprise API operates under a Data Processing Addendum. Data is not used for model training. All user data is stored on Australian servers.
  3. "Staff must review and personalise AI-generated content." — Every story, song, and observation is presented for educator review before sharing. Educators approve, edit, or reject all AI outputs.
  4. "AI tools must be formally approved via documented risk assessment." — This page, our privacy policy, and our data processing documentation support your centre's risk assessment.
  5. "Records of AI usage must be kept for regulatory inspection." — Every AI-generated story includes timestamps, the educator who requested it, the children involved, and the EYLF outcomes mapped — a complete audit trail.
  6. "AI content must be checked for bias and appropriateness." — Our AI system is tuned for Australian ECEC contexts, uses inclusive language, reframes "problems" as "situations to navigate," and never implies a child is "naughty" or "wrong."
  7. "Families must be informed about AI usage and data handling." — Parents receive clear information about AI at sign-up, can view our full privacy policy, and can toggle AI story participation on or off for their child at any time.
  8. "AI must comply with the Privacy Act 1988 (Cth)."[4] — Built in accordance with Australian Privacy Principles. Data minimisation, purpose limitation, and cross-border disclosure requirements are all addressed.
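
The audit trail described in point 5 can be pictured as one record per generation, capturing who requested it, when, for which children, and which EYLF outcomes were mapped. This is a sketch with hypothetical field names, not our actual schema.

```python
# Sketch of the audit record from point 5: each AI generation is
# logged with the requesting educator, the children involved, the
# mapped EYLF outcomes, and a UTC timestamp. Names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AiUsageRecord:
    educator_id: str
    child_ids: tuple[str, ...]
    eylf_outcomes: tuple[str, ...]
    requested_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

record = AiUsageRecord(
    educator_id="educator-042",
    child_ids=("child-17",),
    eylf_outcomes=("Outcome 4: Confident and involved learners",),
)
```

Making the record immutable (`frozen=True`) and timestamping it in UTC reflects the inspection use case: an audit entry should never change after it is written.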

Ready to use AI compliantly? Register your centre for free or request a compliance pack to include in your QIP.

References & further reading

  1. ACECQA. (2025). Strengthened NQF child safety and protections: changes taking effect 1 September 2025. ACECQA official guidance.
  2. Education and Care Services National Regulations — Regulation 168. NSW consolidated regulation.
  3. Office of the Australian Information Commissioner. (2024). Guidance on privacy and the use of commercially available AI products. OAIC guidance.
  4. Privacy Act 1988 (Cth), Schedule 1 — Australian Privacy Principles (APP 6 and APP 11).