It makes for funny news when an ad for onions gets flagged as overtly sexual. Things stop being cute when similar errors have serious consequences for people's lives, and yet nothing is known or done about it, mainly because the nature of these errors is such that they rarely produce an outcome as obviously wrong as this banned onion ad.
The company appealed the decision, and a Facebook Canada spokeswoman confirmed the ad's rejection was an algorithm error.
"We use automated technology to keep nudity off our apps, but sometimes it doesn't know a walla walla onion from a, well, you know," the spokeswoman, Meg Sinclair, told The National Post. "We restored the ad and are sorry for the business' trouble."
A person could be flagged as a credit risk by an algorithmic error, for example, and face far more than business trouble. Job applicants are already being screened in or out by AI, and that process is not necessarily fair or unbiased.