Friends don’t let friends use an AI STI test


Picture the scene: Your date has gone well, and it looks like the two of you might sleep together. Like any safety-conscious adult, you assume there will be a conversation about STI status and the use of protection. Now imagine how you would feel if your partner asked to take a photo of your penis and upload it to a website you’ve never heard of. That’s the future of intimacy, as imagined by Calmara, a new service launched by “men’s health” startup HeHealth.

Image: HeHealth website

Its press release suggests users take a picture of their partner’s penis so it can be run through a deep learning model that checks for visual signs of sexually transmitted infections. And while the website suggests users should wear protection, a banner atop the HeHealth site describes the app as “Your intimate bestie for unprotected sex.” Mixed messages aside, you may notice some major issues with the pitch: it only covers infections that present visually, and it’s designed to work only with penises.

But even if that use case applies, you might not feel you can trust its conclusions once you’ve looked at the data. The Calmara website claims its scans are up to 90 percent accurate, saying its AI has been “battle-tested by over 40,000 users.” That figure doesn’t match its press release, which says accuracy reaches 94.4 percent (a figure cited in this NSFW preprint paper submitted a week ago), while its FAQ says the accuracy ranges “from 65 percent to 96 percent across various conditions.” We’ve reached out to the company to ask about the apparent discrepancy.

Image of the Calmara website showing its accuracy claim. (Calmara)

It’s not impossible for models to categorize visual information; I’ve reported on how systems like these look at images of cells to aid drug discovery. But there are plenty of reasons why visual information isn’t going to be reliable enough for an STI test. After all, plenty of conditions don’t have visual symptoms, and carriers can often be asymptomatic long after infection. The company admits as much in its FAQ, saying that the app is a “first line of defense, not a full-on fortress.” Not to mention that other factors, like the “lighting, the particular health quirks you’re scouting for and a rainbow of skin tones might tweak those [accuracy] numbers a bit.” Even more alarming, the unpublished paper (which is riddled with typos) admits that a full 40 percent of its training dataset consists of “augmented” images, created, for instance, by “extracting specific visually recognizable disease patterns from the existing clinical image dataset and layering those patterns on top of images of health (sic) penises.”

Image from the Calmara FAQ highlighting the variability of its tests. (Calmara)

The Calmara website’s disclaimer says that its tools are for the purpose of “promoting and supporting general wellness and a healthy lifestyle and are not to be used to diagnose, cure, treat, manage or prevent any disease or condition.” Of course, if it really were intended as a general wellness tool, it probably wouldn’t describe itself as “Your intimate bestie for unprotected sex,” would it?

It doesn’t help that this is a system asking users to send pictures of their own, or their partner’s, genitalia. Issues around consent and, as writer Ella Dawson raised on Bluesky, age verification don’t seem to have been considered. The company’s promise that the data is locked in a “digital stronghold” lacks specifics about its security approach or how the data it obtains may be shared. But that hasn’t stopped the company from suggesting that it could, in the future, be integrated “directly into dating apps.”


Fundamentally, there are so many red flags here, from potential vectors for abuse to the false sense of confidence it gives users, that nobody should try using it.


