The Basque Country is implementing Quantus Skin in its health clinics after an investment of 1.6 million euros. Specialists criticise the artificial intelligence developed by the Asisa subsidiary due to its “poor” and “dangerous” results. The algorithm has been trained only with data from white patients.

  • D4MR0D@lemmy.world · 26 days ago

    If someone with dark skin gets a real doctor to look at them, because it’s known that this thing doesn’t work at all in their case, then they are better off, really.

    • ryannathans@aussie.zone · 26 days ago

      Doctors are best at diagnosing skin cancer in people of the same skin type as themselves; it’s just a case of familiarity. Black people should have black skin doctors for the highest success rates, and white people should have white doctors for the highest success rates. Perhaps the next generation of doctors will show broader success, but that remains to be seen in research.

  • Phoenixz@lemmy.ca · 26 days ago

    Though I get the point, I would caution against calling “racism!” on AI not being able to detect moles or cancers well on people with darker skin; it’s harder to see darker areas on darker skin. That is physics, not racism.

    • zout@fedia.io · 26 days ago

      The racism is in training on white patients only, not in the abilities of the AI in this case.

      • Hardeehar@lemmy.world · 26 days ago

        It’s still not racism. The article itself says there is a lack of diversity in the training data. Training data will consist of 100% “obvious” pictures of skin cancers, and in most books and online images I’ve looked into, that seems to mean a majority of fair-skinned individuals.

        “…such algorithms perform worse on black people, which is not due to technical problems, but to a lack of diversity in the training data…”

        Calling things racist really works to mask what a useful tool this could be for screening for skin cancers.

        • xorollo@leminal.space · 26 days ago

          Why is there a lack of robust training data across skin colors? Could it be that people with darker skin colors have less access to cutting edge medical care and research studies? Would be pretty racist.

          There is a similar bias in medical literature for genders. Many studies only consider males. That is sexist.

        • Revan343@lemmy.ca · 26 days ago

          Training data will consist of 100% “obvious” pictures of skin cancers

          Only if you’re using shitty training data

    • TimewornTraveler@lemmy.dbzer0.com · 25 days ago (edited)

      if only you read more than three sentences you’d see the problem is with the training data. instead you chose to make sure no one said the R word. ben shapiro would be proud

    • Leon@pawb.social · 23 days ago

      It is a direct result of structural racism, as it’s a product of the treatment of white men as being the default. You see it all the time in medicine. There are conditions that disproportionately affect black people that we don’t know enough about because time and money haven’t been spent studying them.

      Women face the same problem. Lots of conditions present differently in women, which is part of why women have historically been underrepresented in e.g. autism diagnoses: autism presents differently, so for a while the assumption was that women simply couldn’t be autistic.

      I don’t think the people who perpetuate this problem necessarily do so out of malice; they probably don’t think of women or black people as lesser (hell, many probably are women and/or black), but that doesn’t change the fact that structural problems require awareness and conscious effort to correct.

      • Phoenixz@lemmy.ca · 5 days ago

        Again, no.

        There are actual normal reasons that can explain this. Don’t assume evil when stupidity (or in this case, physics) explains it. Darker patches on darker skin are harder to detect, just as facial features in the dark, on dark skin are harder to detect because there is literally less light to work with.

        Scream racism all you want but you’re cheapening the meaning of the word and you’re not doing anyone a favor.

        • Leon@pawb.social · 5 days ago

          Don’t assume evil when stupidity

          I didn’t, though? I think that perhaps you missed the “I don’t think necessarily that people who perpetuate this problem are doing so out of malice” part.

          Scream racism all you want but you’re cheapening the meaning of the word and you’re not doing anyone a favor.

          I didn’t invent this term.

          Darker patches on darker skin are harder to detect, just as facial features in the dark, on dark skin are harder to detect because there is literally less light to work with

          Computers don’t see things the way we do. That’s why steganography can be imperceptible to the human eye, and why adversarial examples work when the differences cannot be seen by humans.

          If a model is struggling at doing its job, it’s because the data is bad, be it the input data or the training data. Historically one significant contributor has been that the datasets aren’t particularly diverse, and white men end up as the default. It’s why all the “AI” companies popped “ethnically ambiguous” and other words into their prompts to coax their image generators into producing people who weren’t white, and subsequently why those image generators gave us ethnically ambiguous memes and German Nazi soldiers who were black.
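          The steganography point above can be made concrete. Below is a minimal least-significant-bit sketch (hypothetical, plain Python, not from the article or the Quantus Skin system): hiding a message changes each pixel value by at most 1 out of 255, invisible to a human eye but exactly recoverable by a machine.

```python
# Minimal LSB steganography sketch: hide a message in the
# least-significant bits of pixel values. The per-pixel change
# is at most 1/255, imperceptible to humans.

def embed(pixels, message):
    """Write each bit of `message` into the LSB of one pixel value."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    out = list(pixels)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & ~1) | bit  # overwrite only the lowest bit
    return out

def extract(pixels, n_bytes):
    """Read the LSBs back out and reassemble the hidden bytes."""
    data = []
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

pixels = [128] * 256                  # a flat grey "image", one channel
stego = embed(pixels, b"hi")
print(extract(stego, 2))              # b'hi'
print(max(abs(a - b) for a, b in zip(pixels, stego)))  # 1
```

          The same asymmetry is why adversarial examples work: a model keys on low-level numerical differences that human perception discards.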

    • Melvin_Ferd@lemmy.world · 26 days ago

      Think more about the intended audience.

      This isn’t about melanoma. The media has been pushing yellow journalism like this regarding AI since it became big.

      It’s similar to how right wing media would push headlines about immigrant invasions. Hating on AI is the left’s version of illegal immigrants.