AI-infused hiring programs have drawn scrutiny, most notably over whether they end up exhibiting biases based on the data they’re trained on.

  • Odusei@lemmy.worldOP · 1 year ago

    I figure you’d audit it by examining the results, and if bias isn’t detectable there, then I’d argue that’s still at the very least better than the human-based systems we’ve been relying on until now.
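    A rough sketch of what “examining the results” could look like in practice: collect hire/no-hire outcomes by demographic group and compare selection rates. The 0.8 cutoff below is the common EEOC “four-fifths” heuristic; the data and function names here are made up for illustration.

    ```python
    from collections import defaultdict

    def selection_rates(outcomes):
        """outcomes: iterable of (group, was_hired) pairs.
        Returns each group's hire rate."""
        totals, hires = defaultdict(int), defaultdict(int)
        for group, hired in outcomes:
            totals[group] += 1
            hires[group] += int(hired)
        return {g: hires[g] / totals[g] for g in totals}

    def adverse_impact(outcomes, threshold=0.8):
        """Flag groups whose hire rate falls below `threshold` times
        the best-off group's rate (the "four-fifths rule")."""
        rates = selection_rates(outcomes)
        best = max(rates.values())
        return {g: r / best for g, r in rates.items() if r / best < threshold}

    # Hypothetical audit data: (group, hired?)
    results = [("A", True), ("A", True), ("A", False),
               ("B", True), ("B", False), ("B", False)]
    print(selection_rates(results))  # roughly {'A': 0.67, 'B': 0.33}
    print(adverse_impact(results))   # {'B': 0.5} -> flagged, ratio under 0.8
    ```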

      • BraveSirZaphod@kbin.social · 1 year ago

        Not inherently, but things can be tested.

        If you have a bunch of otherwise identical résumés, with the only difference being the racial connotation of the name, and the AI gives significantly different results, there’s an identifiable problem.
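
        A minimal sketch of that kind of paired-name test, assuming the hiring system exposes some scoring function you can call repeatedly; the name lists and the `fake_model` stand-in are made up for illustration, and a real audit would use many résumé templates plus a proper statistical test rather than two names per group.

        ```python
        import statistics

        # Hypothetical name pools with different racial connotations.
        NAME_GROUPS = {
            "group_1": ["Emily Walsh", "Greg Baker"],
            "group_2": ["Lakisha Washington", "Jamal Jones"],
        }

        def name_swap_audit(resume_template, score_fn):
            """Score the *same* résumé under different names and return
            the average score per name group; a large gap on otherwise
            identical résumés points at a problem."""
            return {
                group: statistics.mean(
                    score_fn(resume_template.format(name=name)) for name in names
                )
                for group, names in NAME_GROUPS.items()
            }

        # Toy stand-in for the real model: counts skill keywords, so it
        # ignores the name entirely and both groups should score the same.
        fake_model = lambda text: 10.0 * text.lower().count("python")

        template = "Name: {name}\n10 years of Python experience; BSc in CS."
        print(name_swap_audit(template, fake_model))  # identical group means
        ```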