Red teaming is a semi-structured testing approach to assess and improve the safety and effectiveness of AI models and systems by identifying vulnerabilities, limitations, and potential areas for improvement. Red teaming is a great first point of entry to AI evaluations for people and organizations with deep subject matter expertise in a given domain or problem space, to help foster democratic oversight and smarter, evidence-based standards and regulations.

As a paid service, Humane Intelligence designs and hosts red teaming events for partner organizations. However, we understand that not all organizations are in a position to pay for our services. That’s why we’re proud to have published the free Red Teaming Artificial Intelligence for Social Good Playbook with UNESCO.
Humane Intelligence has pioneered red teaming for identifying sociotechnical issues and with broader, more inclusive participation than traditional red teaming events. In collaboration with Seed AI and AI Village, Humane Intelligence developed this novel approach to red teaming by hosting the first public red teaming event for closed-source AI models at DEF CON 2023, which drew a record breaking 2500+ participants. Since then, Humane Intelligence has hosted more than 15 red teaming events for industry, international civil society, governments, and academic institutions in 8+ countries. We provide red teaming services to find unintended consequences and malicious attacks, including different methods for prompt injection.
Small group assessments by invited experts in non-technology fields to test narrow harms and identify domain-specific issues. Expert red teaming can supplement internal red teaming efforts with external expertise in domains such as medical care or legal advice.
At-scale challenges conducted by invitation or fully open to a wide range of individuals. Public red teaming is used to diffuse harms and gather data en-masse to identify systemic issues.
Humane Intelligence built a web based application for the data collection aspects of AI red teaming.