A new study says Amazon facial-detection technology often misidentifies women as men, particularly when they have darker skin.
Similar News: You can also read similar news to this that we have collected from other news sources.
Amazon facial-identification software used by police falls short on tests for accuracy and bias, new research finds. The research is raising concerns about how biased results could tarnish the artificial-intelligence technology's exploding use by police and in public venues, including airports and schools.
29-year-old whose company makes millions reselling on Amazon calls this his biggest money mistake. Via CNBCMakeIt.