Facebook has developed Liam Bot to help employees answer questions from close friends and family about controversial company decisions.
Many developers and AI experts around the globe dream of joining a giant tech company. But when such a company compromises on its stated community standards, its existing staff find it hard to defend the company in close circles like family and friends.
Facebook's employees now face exactly that awkward situation. They told their line managers that they struggle to answer questions, asked by relatives at gatherings, that relate directly or indirectly to those compromised community standards. It might have been better to address this in a more human way, but Facebook has instead built the AI-based Liam Bot to help employees field queries from family or friends, especially around the holidays, as reported by The New York Times.
For example, if a relative asked how Facebook handles hate speech, the chatbot, a simple piece of software that uses artificial intelligence to carry on a conversation, would instruct the employee to answer with these points:
- Facebook consults with experts on the matter.
- It has hired more moderators to police its content.
- It is working on A.I. to spot hate speech.
- Regulation is important for addressing the issue.
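As described, the bot's behavior amounts to matching the topic of a question to a canned list of talking points. Facebook has not published Liam Bot's implementation, so the following Python sketch is purely illustrative; the function names and keyword-matching approach are assumptions, and only the talking points themselves come from the report:

```python
# Hypothetical sketch of a talking-points chatbot in the style of Liam Bot.
# This is NOT Facebook's implementation; it simply maps a keyword found in
# a question to the canned answers reported by The New York Times.

TALKING_POINTS = {
    "hate speech": [
        "Facebook consults with experts on the matter.",
        "It has hired more moderators to police its content.",
        "It is working on A.I. to spot hate speech.",
        "Regulation is important for addressing the issue.",
    ],
}

def answer(question: str) -> list:
    """Return the talking points for the first known topic in the question."""
    q = question.lower()
    for topic, points in TALKING_POINTS.items():
        if topic in q:
            return points
    # Fallback when no topic matches the question.
    return ["No talking points are available for that topic."]

if __name__ == "__main__":
    for point in answer("How does Facebook handle hate speech?"):
        print("-", point)
```

A real system could swap the keyword lookup for an intent classifier, but the overall shape, a fixed table of approved answers keyed by topic, matches what the report describes.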
“Our employees regularly ask for information to use with friends and family on topics that have been in the news, especially around the holidays,” a Facebook spokeswoman said. “We put this into a chatbot, which we began testing this spring.”
According to Glassdoor, the employee-review site, Facebook has slipped from the top spot to seventh place in its rankings. The decline is attributed to the company's recent policies, which have left its top officials facing unhappy moments.
Updated at 8:57 pm PKT | Mallahpost