1 paper accepted at ACL 2026 (main conference)!

We have a short paper accepted at ACL 2026! The paper is titled Calibrated? Not for Everyone: How Sexual Orientation and Religious Markers Distort LLM Accuracy and Confidence in Medical QA. In this paper, we show that current LLMs exhibit consistently worse accuracy and degraded uncertainty calibration for patients who are homosexual and/or religious. We also show that intersectional identities (e.g., a patient who identifies as homosexual and Catholic) lead to harms that exceed the sum of their parts, even for frontier models like GPT-5.1.