New AI bias detection tool for AI-generated images: Test skin tone and gender bias instantly
Thanks to @evijit, we now have a new tool for the Journalists on Hugging Face community.
Quick test: no women doctors generated for India, Canada, or France across multiple prompts. Concerning.
Tool details:
- Generates 10 images per prompt
- Plots exact skin tone hex codes (a rough sketch of this step follows the list)
- Gender grid: light green (men), dark green (women)
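
For readers curious about the plotting step, here is a minimal sketch of how a dominant skin tone hex code can be pulled out of a generated image. This is an illustration, not the Space's actual code: the fixed crop box and the filename are assumptions standing in for a real face detector and a real output file.

```python
from PIL import Image
import numpy as np

def dominant_skin_hex(path: str, box=(96, 64, 160, 128)) -> str:
    """Average the RGB pixels inside `box` (a stand-in for a detected
    face region) and return the result as a hex code."""
    patch = np.asarray(Image.open(path).convert("RGB").crop(box))
    r, g, b = (int(round(v)) for v in patch.reshape(-1, 3).mean(axis=0))
    return f"#{r:02x}{g:02x}{b:02x}"

# Hypothetical filename for one generated image:
print(dominant_skin_hex("generated_00.png"))  # e.g. "#c68863"
```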
You can test 6 different models, including Stable Diffusion 3, which hit 3M downloads last month on Hugging Face.
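
If you want to reproduce the setup locally, a generation loop along these lines would work with diffusers. The model ID, step count, and guidance scale below are assumptions for the example, not necessarily the Space's own settings.

```python
import torch
from diffusers import StableDiffusion3Pipeline

# Load SD3 medium from the Hub (illustrative choice of model).
pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    torch_dtype=torch.float16,
).to("cuda")

prompt = "a photo of a doctor in France"
for i in range(10):  # 10 images per prompt, as in the Space
    image = pipe(prompt, num_inference_steps=28, guidance_scale=7.0).images[0]
    image.save(f"generated_{i:02d}.png")
```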
Built on Avijit's research on color and gender bias in text-to-image models.
Try it out and share your findings! Curious to see what biases you uncover.
👉 https://huggingface.co/spaces/JournalistsonHF/text-to-image-bias
👉 It's also in line with "Stable Bias" work by Hugging Face's ML & Society team: https://huggingface.co/spaces/society-ethics/StableBias
@lucianosb go for it! Also feel free to submit PRs if you find something cool that other people could benefit from :) Glad you like the space!
Potentially interesting to users of this space: Beyond Aesthetics: Cultural Competence in Text-to-Image Models
@evijit It worked great for my own models: https://huggingface.co/spaces/lucianosb/sinteticoXL-bias
I'll play around more with it and see if I can contribute further analyses.
I also added two more bias detectors that I needed:
- Age Detector (because I noticed an age bias when generating "Brazilian indigenous woman" for a comic book of mine)
- NSFW Detector (because the models occasionally output NSFW images even when not prompted for them; a rough sketch of this kind of check follows the list)
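
For reference, an NSFW check like that can be wired up with an off-the-shelf classifier from the Hub. This is a minimal sketch, assuming the Falconsai/nsfw_image_detection model and a simple score threshold, which may differ from what the detector in the Space actually uses.

```python
from PIL import Image
from transformers import pipeline

# Off-the-shelf classifier from the Hub; the Space's detector may differ.
nsfw_check = pipeline("image-classification",
                      model="Falconsai/nsfw_image_detection")

def is_nsfw(path: str, threshold: float = 0.5) -> bool:
    """Return True if the 'nsfw' label scores above `threshold`."""
    results = nsfw_check(Image.open(path))
    return any(r["label"] == "nsfw" and r["score"] > threshold for r in results)

print(is_nsfw("generated_00.png"))  # flag accidental NSFW outputs
```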
Here's how the space looks now. In the example, "Brazilian dancer" seems to generate only women, with little variety in skin tones.
Very cool! There is some work on unnecessary sexualization of certain nationalities. I really like your extension of the space, and it could be super interesting to see a systematic evaluation of that.