Transparency

We show our work.

Radical transparency is a core value at KLASSIC. We publish regular reports on safety, moderation, and how our platform operates.

Safety-first design
AI-powered creativity
No algorithms

Our Transparency Commitment

Social platforms have operated as black boxes for too long. Users deserve to know how the platforms they use actually work, what content is removed, and how safety decisions are made. We publish this information quarterly.

What we publish

Our quarterly transparency reports include:

Safety Metrics

Content removed, accounts suspended, and safety interventions.

Moderation Data

Volume of content reviewed, the split between AI and human moderation, and outcomes.

Government Requests

Legal requests received and how we responded.

User Appeals

Appeals received, how they were reviewed, and how many decisions were overturned.

Policy Changes

Updates to our community guidelines and why.

Data Practices

What data we collect and how it is used.

Transparency Reports

Download our published reports.

Coming Soon

First report: Q1 2026

Our first transparency report will be published after launch, and we will publish quarterly thereafter.

Safety Advisory Board

Our Safety Advisory Board consists of independent experts in child safety, digital wellness, and online harm prevention. They review our policies, advise on difficult decisions, and hold us accountable. Board members and meeting summaries will be published here.

Board members coming soon

Questions about our transparency practices?

We are happy to discuss our approach to transparency and accountability.