Facebook’s and Google’s artificial intelligence for checking inappropriate images is failing

Mike Hall's photo of cows which Facebook considered inappropriate

I hate to say it, because I do not want to aggravate Facebook and Google, the two megalithic online businesses which tend to dominate our lives and which have powers beyond those that should be bestowed upon them. But there is no doubt in my mind that they have introduced artificial intelligence to spot what they consider to be inappropriate content, such as ‘overtly sexual content’ as they describe it. And it screws up sometimes.

I have suffered from this myself. If it is not the artificial intelligence that goes wrong, it is the threshold that is set incorrectly. For example, if I publish a picture of a hand which has been scratched by a cat, Google does not allow it because it is too violent and disgusting. It is going to harm people if they see it! The fact that it is reality, and happens millions of times all over the planet every day, is neither here nor there.

Facebook has the same problem. There is a recent story about Mr Hall’s digital photo gallery on Facebook, showing pictures of cows, the England cricket team and buildings, which was blocked by Facebook for containing ‘overtly sexual content’. Clearly Facebook’s algorithm, supported by artificial intelligence, has gone dramatically wrong.

Regrettably it doesn’t stop there, because when Mr Hall tried to rectify the obvious problem he met a brick wall. He says there was nothing risqué about the 400 images on his business page, and at one point his account was suspended from placing any adverts at all. Google also suspends adverts on pages where it considers the images inappropriate but which, in all honesty, are appropriate and perfectly acceptable.

Mr Hall appealed the decision but had not received a response, and as usual was unable to make contact with anybody at Facebook. This is another issue that a lot of people have with these companies: it is impossible to speak to a person to rectify what is obviously an error. You have to go through all kinds of hoops and hurdles, including forums. When you raise your problem on a forum, a so-called volunteer expert responds, but in my experience they are quite aggressive in their responses and they invariably blame the person posting the question.

They are biased. They presume that Google is correct and that the person making the enquiry is incorrect and has done something wrong. These forums are not helpful or good places to visit.

The BBC contacted Facebook about Mr Hall’s problems, whereupon Facebook rectified them. The adverts were reinstated. That also tells a story. Facebook should not ride roughshod over customers and be so disdainful of their presence, only to respond the moment a big organisation gets in touch. It’s unpleasant. Apparently it doesn’t stop there. Mr Hall says that when he set up his account with Facebook he had to verify his business, but after that “it fell into an abyss”.

He says that Google has offered him a one-to-one clinic on how to navigate its advertising options. My advice to Mr Hall is to be cautious about that, because dealing with Google AdSense is not easy, I can tell you that. And if a Google employee reads this, then please don’t punish me for writing about it. I’m just expressing an opinion, and if you digest what I say and respond properly you will improve your artificial intelligence (AI) rather than reacting in a negative manner.

Returning to Facebook, the tech giant blamed a ‘technical error’ (a euphemism) when it blocked thousands of campaign adverts taken out by both Joe Biden’s and Donald Trump’s teams during the presidential election. So it happens quite a bit, and I think Facebook and Google are struggling with AI. They need it because they cannot manage their businesses without it. They are probably frantically developing artificial intelligence software to make improvements, but in the meantime we get this kind of mess.