The firm hired to moderate Facebook posts in East Africa has said that, with hindsight, it would not have taken on the work.
Former Sama employees based in Kenya have said they were left traumatised by exposure to graphic posts.
Some are now bringing legal action against the company in the Kenyan courts.
Wendy Gonzalez, Sama's chief executive, said the company would no longer take on work involving the moderation of harmful content.
Warning: this article contains distressing content.
Former employees have said they were traumatised after watching videos of beheadings, suicide and other graphic content at the moderation hub, which the company ran from 2019.
Daniel Motaung, a former moderator, told the BBC that the first graphic video he encountered was "a live video of someone being beheaded".
Mr Motaung is taking legal action against both Sama and Facebook's owner, Meta. Meta has said it expects all the firms it works with to provide round-the-clock support, while Sama has maintained that certified wellness counsellors were available at all times.
In January, Sama announced it was dropping the contract, which represented less than 4% of the firm's business, bringing its partnership with Meta to a close.
Asked whether she wished she had acted differently, Ms Gonzalez said that in hindsight, knowing the sheer amount of resources and effort the work would take away from the rest of the business, she would not have agreed to it.
She said the company had learned lessons from the experience and now has a policy of not taking on work that involves moderating harmful content. It will also no longer do artificial intelligence (AI) work "which aids in weapons of mass destruction or law enforcement surveillance".
Citing the ongoing legal proceedings, Ms Gonzalez declined to comment on claims by workers that they had been harmed by viewing graphic material. Asked whether moderation work could be damaging more generally, she said it is "an uncharted field which absolutely requires investigation and funding".
Sama was founded as an outsourcing business with a difference: its purpose is to lift people out of poverty by providing digital skills and paid work on outsourced computing tasks for tech companies.
In 2018, the BBC visited the firm and saw workers from impoverished parts of Nairobi earning $9 (£7) a day doing "data annotation" - identifying items such as people and lamps in videos of driving, which is then used to train AI systems. Those interviewed said the income had helped them escape poverty.
In 2018 the BBC visited Sama's operation in Nairobi.
She said the company continues to focus primarily on computer-vision AI projects, which do not expose employees to harmful content.
Ms Gonzalez said she was proud that the firm had lifted 65,000 people out of poverty.
She believes it is critical for Africans to take part in the digital economy and in the design of AI systems.
Throughout the interview, Ms Gonzalez repeatedly stressed that she had taken on the work for two reasons: the importance of moderation in protecting social media users from harm, and the importance of having African content moderated by African teams.
"She noted that it would not be reasonable to anticipate somebody from Sydney, India, or the Philippines to be competent in moderating languages native to Kenya, South Africa, or any other place,".
She also said that she had done the moderation work herself.
Sama's moderators were paid about 90,000 Kenyan shillings ($630) a month, a salary Ms Gonzalez said was competitive with those of nurses, firefighters and bank officials.
Asked whether she would do the job herself for that salary, she responded: "I completed the task, but that is not my role in the organisation."
Sama also took on a contract with OpenAI, the creator of ChatGPT.
Richard Mathenge, an employee whose job was to read through vast quantities of text the chatbot was learning from and flag anything offensive, told the BBC's Panorama programme that he was exposed to distressing content.
Sama said it cancelled the contract when staff in Kenya raised concerns about requests for image-based content that was not covered in the agreement. "We concluded this task right away," Ms Gonzalez said.
OpenAI said it has its own "ethical and welfare protocols" for its data labellers, and is conscious that this is challenging work for its researchers and annotation workers in Kenya and around the world.
Ms Gonzalez considers this AI work to be another form of moderation, and the firm has decided to no longer take it on.
She said the firm's focus was on computer-vision applications that are not harmful, such as driver safety, drones, identifying fruit and detecting crop diseases.
It is essential, she argues, that Africa is included in the discussion and design of AI in order to avoid perpetuating existing prejudices, and input must come from across the world so that the technology can be developed on a global scale.