The British police's AI keeps confusing deserts with nude images (really)
Sure, artificial intelligence has advanced in leaps and bounds, but it's far from perfect. Case in point: the London Metropolitan Police's reliance on the technology.
The Met uses AI to detect incriminating images on seized electronic devices like phones and laptops. However, according to UK newspaper the Telegraph, the AI struggles to tell the difference between nude photos and, strangely, deserts.
Mark Stokes, the Met's head of digital and electronic forensics, told the Telegraph that their current software is capable of detecting photos of guns, drugs, and money on seized computers and phones. However, it keeps mistaking images of sandy deserts for pornography or indecent images.
According to the newspaper, the Metropolitan Police scanned 53,000 different devices for incriminating evidence last year. The work is incredibly taxing for staff, so being able to hand this role to AI is advantageous.
AI continues to progress at a frightening pace. Just yesterday, Google's artificial intelligence identified two new planets that had been missed by human eyes.
Stokes believes the technology will eventually progress enough to identify child abuse images on its own, a step he expects “within two to three years.”