Código Não Binário

Press Release – Brazilian LGBTQIA+ organization creates anti-hate Artificial Intelligence and sues social media platforms for R$ 100 million

For release: March 2026

Contents:

1 – Suggested text
2 – Useful materials
3 – Directors’ statements
4 – Directors’ biographies
5 – Código Não Binário organization

1. SUGGESTED TEXT

Brazilian LGBTQIA+ organization builds anti-hate AI and sues social media platforms for R$ 100 million

Código Não Binário turned the wave of hate from the Boyceta case, linked to its podcast Entre Amigues, into unprecedented solutions: an AI to counter anti-LGBTQIA+ hate speech and a coalition seeking legal accountability and collective moral damages from Meta, Google, X and TikTok for omission and systemic failure in content moderation.

São Paulo, Brazil, March 2026 – Brazilian organization Código Não Binário has launched TybyrIA, the first fully open AI to detect anti-LGBTQIA+ hate speech in Portuguese, and announced it has filed a Public Civil Action (ACP) against tech giants Meta (Instagram, Facebook, WhatsApp), Google (YouTube), X and ByteDance (TikTok). The Public Civil Action is a Brazilian legal instrument used to defend collective and diffuse interests.

The ACP, with the participation of IBRAT, Fonatrans and AzMina, marks an important shift: historically marginalized groups are not only reporting digital violence but creating cutting-edge technology and leading accountability processes against Big Tech, bringing the community’s motto “nothing about us without us” into AI and strategic litigation.

Recently, the São Paulo State Prosecutor’s Office (MPSP) expressed support for the ACP to proceed, stating that the debate on platforms’ duty of care, content moderation and prevention of hate speech has “unequivocal social and legal relevance”. This stance is a significant institutional first step: an adverse opinion could have weakened the action from the outset, whereas the Prosecutor’s Office instead reinforced the case’s public and legal relevance.

FROM PODCAST TO TECHNOLOGICAL AND LEGAL RESPONSE

The action is the result of an unprecedented response in Brazil with international impact. In May 2024, within its first 90 days, Código Não Binário’s podcast Entre Amigues went viral with 3.4 million views, 2.1 million on TikTok alone, where the clip on “boycetas” exceeded 1.6 million views. The content was among the most discussed topics in Brazil on X (formerly Twitter) for two days, with much of the discussion consisting of anti-LGBTQIA+ hate speech. First TikTok’s algorithm pushed the content to millions; then far-right accounts, including a congressperson with 26 million followers, amplified the attacks.

With support from national and international philanthropic funds, the group preserved their mental health during the attacks and responded with technical capacity: they developed their own data extraction methodology (platforms do not provide adequate tools), manually labeled 2,000 comments, and used them to train a specialized AI model that scaled the analysis to over 12,000 comments. The result is the technical report “Anatomy of a Wave of Hate,” whose robust analysis and key findings now underpin the Public Civil Action with evidence of omission and systemic failure.
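The workflow described above (hand-label a small set of comments, train a model, then classify many more automatically) can be illustrated with a minimal sketch: a tiny Naive Bayes text classifier trained on invented toy examples. This is purely an illustration of the general technique and is not TybyrIA’s actual code, architecture, or data; the real model and dataset are published on the organization’s Hugging Face and GitHub channels.

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

def train(labeled_comments):
    """labeled_comments: list of (text, label) pairs hand-labeled by reviewers."""
    word_counts = defaultdict(Counter)  # label -> word frequency table
    label_counts = Counter()            # label -> number of examples
    vocab = set()
    for text, label in labeled_comments:
        label_counts[label] += 1
        for word in tokenize(text):
            word_counts[label][word] += 1
            vocab.add(word)
    return word_counts, label_counts, vocab

def classify(text, model):
    """Pick the label with the highest log prior + log likelihood
    (add-one smoothing so unseen words do not zero out a class)."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in tokenize(text):
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy hand-labeled examples (invented for illustration only)
labeled = [
    ("great episode congrats to the hosts", "neutral"),
    ("loved the discussion thank you", "neutral"),
    ("you people are disgusting go away", "hostile"),
    ("disgusting freaks should disappear", "hostile"),
]
model = train(labeled)
print(classify("thank you for the great discussion", model))
print(classify("you freaks are disgusting", model))
```

Once trained, the same `classify` call runs over any number of new comments, which is what lets a small hand-labeled set scale to an analysis of tens of thousands, entirely on ordinary hardware.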

TYBYRIA: AI AGAINST ANTI-LGBTQIA+ HATE

The AI’s name, “TybyrIA,” honors Tybyra do Maranhão, recognized as the first official victim of LGBTphobia during the colonization of Brazilian territory. It is rich in symbolism, as the technology is also a practical example of resisting digital colonialism, a concept used by researchers in the Global South to describe extractive practices of data and wealth by large tech conglomerates from the Global North, especially the United States.

TybyrIA stands out for being fully open: its code, model, dataset and methodology are all public and free of charge. It also shows that small language models (SLMs), which do not carry the environmental impact of large language models (LLMs), can address concrete problems while running on ordinary hardware such as laptops. Since the first public tests were released in late 2025, the model has reached 244 downloads on Hugging Face and the dataset 438, showing growing adoption by the global technical community.

The tool can be found on the organization’s website (www.codigonaobinario.org/tybyria) or on its Hugging Face and GitHub channels (platforms for AI and software developers). In late 2025 Código Não Binário presented TybyrIA at events focused on technology and digital and human rights, including at the Office of the Attorney General of the Union, Mozilla Festival in Barcelona, and the Cooperative AI Conference in Istanbul, where the technology received the Du Bois Prize (Special Mention).

Experts and organizations in Turkey, the Netherlands and the United Kingdom are already studying the technology, including a class on Código Não Binário’s case at the University of Westminster in London. The organization also won a selection by the Elas+ Fund, which enabled moving the AI from prototype to final product. The organization is now seeking further funding to improve the tool and replicate it internationally in other languages.

PUBLIC CIVIL ACTION: COALITION FOR ACCOUNTABILITY AND COMPENSATION

The ACP challenges the platforms’ business model and profit architecture, demonstrating with technical evidence and dozens of national and international studies that content moderation is selective and driven by commercial interests. Platforms remove copyright violations promptly and systematically, yet repeatedly fail to remove hate speech, even when it is reported again and again. Research indicates that until 2024, 97% of Meta’s moderation actions were proactive; with recent changes to the platform’s policies, that share may drop sharply, potentially leaving hundreds of millions of hateful content items online. Technical analysis shows that algorithms are designed to amplify content that drives engagement (around 70% of watch time on YouTube comes from algorithmic recommendation), including hate, and that platforms profit from this ecosystem (politicians paid to boost 124 anti-trans posts in a single year), while also removing specific protections against LGBTphobia from their policies, allowing pathologization and discrimination. There is no effective counter-structure; there is a systemic failure that generates ongoing incentives and harm.

The action requests: a Human Rights Due Diligence Plan specific to LGBTQIA+ people, journalists and communicators; review of the algorithms that amplify hate; dedicated reporting channels; semi-annual public transparency reports; independent audits; and partnerships with civil society. It also seeks R$ 100 million in collective moral damages, to be allocated to the LGBTQIA+ cause, as well as reversal of the burden of proof and adjudication with a gender perspective.

The organization prepared the ACP for over a year and chose a strategic moment to file it: in 2024 the Supreme Court reinterpreted the Civil Rights Framework for the Internet, requiring active diligence from platforms, and in 2025 the Digital ECA showed that regulating foreign Big Tech in Brazil is feasible, creating a favorable legal landscape for structural accountability.

2. USEFUL MATERIALS

Links:

Código Não Binário website:

www.codigonaobinario.org

TybyrIA and anti-LGBTQIA+ hate dataset:

www.codigonaobinario.org/tybyria

Report “Anatomy of a Wave of Hate”:

www.codigonaobinario.org/anatomia-onda-de-odio

Public Civil Action:

codigonaobinario.org/acp

Contact:

Veronyka and Amanda

contato@codigonaobinario.org

+55 11 98109-6572 (WhatsApp/Signal)

Hashtags:

#TybyrIA #AI #TransVisibility #DigitalSovereignty #FeministAI #TransTech #LGBTQIAInnovation #LGBTQIA

3. DIRECTORS’ STATEMENTS

Veronyka:

“This launch on Trans Visibility Day has political meaning. We want visibility not only of the violence we suffer but of our capacity to build civilizing solutions, with cutting-edge technology, data production and leadership in regulation processes. Who better to speak about digital violence than the people who are under attack every day? We have shown that when we go through these situations we gain a privileged view and can build effective solutions for the serious problems of social media today, and the concept of digital colonialism captures some of that.”

“We are able to take AI as a tool for liberation. We gathered evidence that shows how the architecture of these platforms privileges profit over human dignity. We turned violence into a tool for defense and made it freely available to the world. This is AI as counter-hegemonic, transparent practice, centered on the community.”

“The far right would be irrelevant today if it weren’t for social media. One example was behavioral targeting (the Facebook and Cambridge Analytica case) that enabled Brexit and Trump’s first election. Here in Brazil we had the case of illegal WhatsApp messaging that benefited Bolsonaro in his first election. But the main example of the symbiosis between Big Tech and the far right is how disinformation and hate speech, elementary and frequent practices of that political camp, circulate freely and are amplified by social media algorithms.”

“Social media, which we now confuse with the internet, are products of Global North corporations we call Big Tech. They are capitalist tech conglomerates with monopolistic and imperialist plans. One of Silicon Valley’s gurus, the creator of PayPal, says it openly in one of his books: if you want to succeed you must create your monopoly. The plans of Big Tech and the far right are the same: extract wealth under a supremacist view of race, gender, and so on. With the AI race and the threat to U.S. supremacy from China, Big Tech has come out of the closet for good and is betting trillions on maintaining and deepening digital colonialism.”

“In this second Trump term, we can no longer be naive about what is happening and who is behind it. The kind of people in far-right movements and behind Big Tech is the same: cisgender, heterosexual, white men from the Global North. Their worldviews too: extract wealth and win at all costs, preserving the exploitation that enables their desires, which necessarily involves oppression by gender/sexual orientation, race, and so on. That is evident today when Trump invades Venezuela for oil and rare earths, essential for the AI race, but also when Elon Musk says the U.S. can stage a coup in any country, does a Nazi salute in public, or Zuckerberg calls for more testosterone in companies.”

“The AI field is decades old and has had ups and downs. It originated in the 1950s. People are confusing AI with this monopolistic phase of large language models and generative AI that we have been in since the pandemic. At its origins, AI was in the research arena and was largely publicly funded. It is possible to work with AI without the serious problems of some current initiatives, such as algorithmic bias and high environmental and social impact. But we cannot wait for white heterosexual men from the Global North to build technologies with those concerns. Their fantasies and desires are different. And as governments we need to set limits on their practices that threaten not only the planet but humanity itself.”

“Many see LGBTQIA+ people, especially trans people, as scapegoats, that is, used by the far right for their projects, as a kind of springboard. But the issue goes deeper: we actually have an existential conflict with these actors because our lives call into question the very foundation of their lives and of the project of domination of these cis heterosexual white men from the Global North. We have seen in history the central role of racism and the invention of cisnormativity in achieving classic colonialism. If you unmask hegemonic masculinity and its false realism, you attack the roots of what we know as capitalism and imperialism. That is roughly what we do when we show that masculinity is fragile.”

“To understand digital colonialism we need to understand colonialism. The Spanish philosopher Preciado puts it well when he says that without bodies segmented by species, sex, gender, class and race, neither fossil extractivism nor the organization of the world capitalist economy would be possible, and that in that regime the body recognized as human, classified as male in sex/gender at birth, marked as white, able-bodied and national, holds the monopoly on the use of techniques of violence. That is what we see with Trump and with Big Tech. And that is how colonialism works, and its most updated version: the digital one.”

Amanda:

“We filed this action when Meta, Microsoft and Grok are under investigation in Brazil, when platforms are rolling back moderation policies globally (Meta tolerates pathologization of trans identities, X removed protection against misgendering, YouTube removed gender identity from its hate speech policies), and when digital sovereignty is at the center of the trade wars that have become part of our daily lives. Big Tech plays a key role in deepening the main global crises, concentrates more wealth than most countries in the world and grows more powerful every day, which demands courage from us to confront them. The Civil Rights Framework reinterpreted by the Supreme Court and the Digital ECA have shown that regulating Big Tech is feasible in Brazil. We are leading strategic litigation with evidence, technology and coordination that is already winning. Our ACP is a stand against the lack of accountability of Big Tech and against algorithm-driven hate.”

4. DIRECTORS’ BIOGRAPHIES

Veronyka Gimenes

Short: Travesti/transfeminine. Ethical hacker, activist and communicator, working with the public sector, third sector and social movements for nearly 20 years. Co-founder and Director of Código Não Binário and lead developer of TybyrIA.

Long: Ethical hacker and travesti. She leads Código Não Binário in the defense of rights and creation of decolonial and LGBTQIA+ technologies and hosts the podcast Entre Amigues. She is multidisciplinary, working in areas such as software development, web design, management, activism, diversity and inclusion, communication and art. Since 2008 she has developed open and free digital technologies for democracy and digital sovereignty, including participatory master plan and budget platforms for São Paulo (Haddad administration), an intelligent chatbot and viral memes for Boulos, and blockchain anti-fake news tools for Lula.

Amanda Claro

Short: Cis bi woman, lawyer, doctoral candidate in organizational studies at FGV EAESP; Co-founder and Director of Projects at Código Não Binário, coordinates the strategic litigation front.

Long: Cisgender bisexual woman. Lawyer with a law degree from USP, a master’s in Management and International Business from the University of Westminster in London, UK, and doctoral candidacy in Organizational Studies at FGV EAESP, where she conducts research grounded in Latin American decolonial feminism with a CNPq grant. Co-founder and Director of Projects at Código Não Binário, she coordinates the strategic litigation and operations fronts, contributed to the development of the TybyrIA AI and hosts the podcast Entre Amigues.

5. ABOUT CÓDIGO NÃO BINÁRIO

Código Não Binário is a nonprofit organization founded in 2023 that works to put diversity, inclusion and equity at the heart of technology and policy practice. It seeks to ensure that solutions in these fields are developed and implemented in ways that protect and defend the rights and freedoms of all people, including marginalized and vulnerable communities, so that technology fulfills its transformative potential rather than serving the profit and private interests of a few. Although recent, the NGO is the result of the partnership between Veronyka Gimenes and Amanda Claro, who together have nearly 20 years of experience developing digital technologies for the public and third sectors, in feminist legal advocacy and in the defense of LGBTQIA+ rights. The organization is a regular presence at national and international forums on technology and digital and human rights, such as ENAP Innovation Week (Brasília), Creative Commons Summit (Mexico City), RightsCon (Costa Rica and Taipei), Cooperative AI Conference (Istanbul) and Mozilla Festival (Barcelona).

Awards: Synergia Access Change (Latin America, 2024), Elas+ Digital Citizenship (Brazil, 2025), Du Bois Prize – Special Mention for TybyrIA (Turkey, 2025)