Social Impact Policy

How Binary Refinery Contributes to a Safer, More Resilient Digital Future

Binary Refinery helps organisations harness AI with clarity, purpose, and responsibility.

As a firm working in a rapidly evolving technology landscape, we recognise our responsibility to support the communities most affected by the risks AI can create.

This policy outlines our commitments to safety, education, harm reduction, and responsible practice, and the governance processes that support those commitments.

1. Our Principles

People First

Human dignity, wellbeing, and safety guide every decision we make.
Technology should expand human potential and strengthen trust.

Care and Common Sense

We take a practical, people-led approach.
We prioritise clarity, minimise harm, and support informed, confident decision-making.

Independence and Integrity

We provide vendor-independent advice and uphold strong governance, transparency, and ethical standards in all AI-related work.

2. Our Social Impact Commitments

A. Education for Vulnerable Communities

Binary Refinery provides free, accessible educational sessions designed to improve digital resilience and safety for:

  • Teachers and parents supporting young people exposed to deepfakes, misinformation, and non-consensual synthetic imagery.

  • Seniors and caregivers at heightened risk of AI-enabled scams, voice cloning, impersonation fraud, and financial exploitation.

These sessions use plain language and focus on practical, empowering steps to keep people safe.

B. Financial Support for Organisations Reducing Digital Harm

We contribute a portion of revenue each year to organisations working to:

  • Combat online abuse and image-based harm.

  • Protect children and young people from digital exploitation.

  • Support seniors vulnerable to fraud and impersonation.

  • Strengthen public resilience against misinformation.

  • Advance thoughtful, transparent, and accountable AI governance.

We publish a list of supported organisations annually.

C. Responsible Use of AI in Our Practice

We commit to using and recommending AI in ways that:

  • Prioritise safety, accuracy, and responsible outcomes.

  • Protect privacy and respect human dignity.

  • Avoid hype, exaggeration, and fear-based narratives.

  • Provide clients with clear, pragmatic guidance and realistic expectations.

  • Incorporate strong governance, risk management, and quality assurance.

We model the standards we advocate for.

3. How We Uphold This Policy

Transparency

We communicate openly about our practices, the limitations of AI systems, and the risks of misuse.

Continuous Learning

We monitor emerging risks and update our guidance, training, and internal processes accordingly.

Collaboration

We work with community organisations, educators, safety groups, and industry bodies to ensure our efforts are relevant and impactful.

Annual Review and Reporting

This policy is reviewed every 12 months.
We publish an annual summary of:

  • Community education delivered

  • Organisations supported

  • Policy updates

  • Key learnings and emerging risks

4. Governance & Oversight

Roles and Responsibilities

Director (Policy Owner)

  • Owns this policy and ensures it reflects Binary Refinery’s values and commitments.

  • Approves updates, contributions, partnerships, and public statements.

  • Oversees delivery of community education initiatives.

  • Ensures responsible AI practice within the business.

Binary Refinery Team Members and Contractors

  • Uphold this policy in all client engagements.

  • Identify emerging risks and escalate concerns promptly.

  • Maintain confidentiality, professionalism, and ethical standards.

External Partners

  • Are selected based on alignment with our values, safety standards, and positive social impact.

  • Must not engage in practices that undermine digital safety or human dignity.

5. Policy Management

Version Control

  • Current Version: 1.0

  • Date of Last Review: 3 December 2025

  • Next Scheduled Review: December 2026

  • Policy Owner:
    Kat Mac – Director, Binary Refinery

Revision History

  • 03 Dec 2025 (v1.0) — Initial release of Social Impact Policy, including commitments to education, donations, responsible AI practice, and governance framework.
    Author: K. Mac

Document Governance

  • This policy is approved by the Director of Binary Refinery.

  • Amendments must be authorised by the Policy Owner.

  • Updates are documented in the Revision History for transparency and traceability.

  • Superseded versions are archived and retained for reference for a minimum of 3 years.

  • This document forms part of Binary Refinery’s governance framework and complements our responsible AI practices and strategic planning processes.

6. Our Belief

AI can help people work smarter, make better decisions, and build stronger organisations, but only when it is guided by clarity, care, and responsibility.

Binary Refinery is committed to shaping a digital future that is safer, more informed, and more resilient for all New Zealanders.

If you’d like to partner with us, host a seminar, or learn more about our social impact work, we’d love to hear from you.

"You can’t protect what you don’t understand. Security starts with awareness."

Michele Guel, Distinguished Engineer & Cybersecurity Expert, Cisco