Evidence of potential human rights violations may be lost after being deleted by tech companies, the BBC has found.
Platforms remove graphic videos, often using artificial intelligence, but footage that could help criminal prosecutions can be taken down without being archived.
Meta and YouTube say they aim to balance their duty to provide evidence with the need to shield users from harmful material.
The industry has been "overcautious" in its moderation, according to Alan Rusbridger, a member of Meta's Oversight Board.
Yet when the BBC tried to upload footage documenting attacks on civilians in Ukraine, it was quickly deleted, despite the platforms saying they have exemptions for graphic material in the public interest.
Artificial intelligence (AI) can remove harmful and illegal content at scale. But when moderating violent images from war, machines lack the nuance to identify evidence of human rights violations.
Former travel journalist Ihor Zakharenko experienced this in Ukraine. Since the Russian invasion, he has been documenting attacks on civilians.
The BBC met him in a suburb of Kyiv where, a year earlier, Russian troops had shot and killed men, women, and children trying to flee the occupation.
He filmed at least 17 bodies and burnt-out cars.
He wanted to publish the videos online so the world could see what had happened and to counter the Kremlin's version of events. But when he posted them to Facebook and Instagram, they were swiftly removed.
The Russians themselves said the videos were fakes, Ihor said, claiming they had only engaged the Ukrainian army.
Using dummy accounts, we uploaded Ihor's videos to YouTube and Instagram.
Three of the four videos were removed from Instagram within a minute.
The same three were initially age-restricted on YouTube, but ten minutes later all of them had been removed as well.
We tried again, but the uploads failed entirely. An appeal to restore the videos on the grounds that they contained evidence of war crimes was rejected.
Influential figures in the industry say there is an urgent need for social media companies to stop this kind of information from disappearing.
"You can see why they developed and trained their machines to take it down the moment they see something that looks difficult or traumatic," Mr. Rusbridger told the BBC. He serves on Meta's Oversight Board, which Mark Zuckerberg set up as a kind of independent "supreme court" for the company that owns Facebook and Instagram.
The next question, said Mr. Rusbridger, a former editor-in-chief of the Guardian, "is how do we develop the machinery, whether that's human or AI, to then make more reasonable decisions."
Beth Van Schaack, the US Ambassador-at-Large for Global Criminal Justice, says no one would contest tech companies' right to moderate content. "I think where the concern happens is when that information suddenly disappears," she says.
Do not delete for war crimes.
Social media posts documenting wartime atrocities are common, and this material can serve as evidence in war crimes prosecutions. But people affected by violent conflict told the BBC that the major social media platforms have removed such content.
Currently available on BBC iPlayer (UK only).
YouTube and Meta say that under their exemptions for graphic war footage in the public interest, content that would normally be removed can be kept online, restricted to adult viewers. Our experiment with Ihor's videos suggests otherwise.
According to Meta, it responds to "legitimate legal requests from law enforcement agencies around the world" and "continues to explore additional avenues to support international accountability processes... in accordance with our legal and privacy obligations".
YouTube says that while it has exemptions for graphic material in the public interest, its platform is not an archive. It advises human rights organizations, activists, human rights defenders, researchers, citizen journalists, and others documenting human rights abuses (or other potential crimes) to observe best practices for securing and preserving their content.
The BBC also spoke to Imad, who ran a pharmacy in Aleppo, Syria, until a barrel bomb dropped by Syrian government forces hit the area in 2013.
He remembers the explosion filling the room with smoke and dust. Following cries for help out to the market, he saw blood-covered hands and legs, and dead bodies.
Local TV crews filmed the scene. The footage was published on Facebook and YouTube but has since been removed.
Syrian journalists told the BBC that, in the chaos of the conflict, bombing raids also destroyed their own copies of the original footage.
Years later, when Imad applied for asylum in the EU, he was asked to provide evidence that he had been at the scene.
"I was sure my pharmacy had been filmed. When I went online, however, it led me to a deleted video."
Organizations such as Mnemonic, a Berlin-based human rights group, have stepped in to archive such footage before it is lost.
Mnemonic developed a tool to automatically download and preserve evidence of human rights violations, first in Syria and now also in Yemen, Sudan, and Ukraine.
It has saved more than 700,000 images from war zones before they were removed from social media, among them three videos showing the attack near Imad's pharmacy.
Each image could hold a crucial clue, helping investigators establish where and when an atrocity took place, or who committed it.
Organizations like Mnemonic, however, are unable to cover every region of conflict in the world.
Proving that war crimes have been committed is extremely difficult, so gathering as many sources as possible is essential.
Olga Robinson of BBC Verify explains that verification is similar to piecing together disparate pieces of information to create a more comprehensive picture of what transpired.
The task of archiving this openly accessible, open-source content often falls to ordinary people on social media trying to help loved ones caught up in violent conflict.
Rahwa lives in the United States but has family in Ethiopia's Tigray region, which has recently seen heavy violence and where the government tightly controls the flow of information.
Social media has made it possible to document conflicts that might otherwise go unnoticed by the public.
"It was our duty," Rahwa says. "I spent hours conducting research. When this content starts to trickle in, you try to verify it using every piece of open-source intelligence you can get your hands on, but you have no idea if your family is safe."
Human rights activists say a formal system to collect and securely store deleted content is urgently needed. This would include preserving metadata, to help verify the material and show that it has not been tampered with.
Ms. Van Schaack, the US Ambassador-at-Large for Global Criminal Justice, says: "We need to develop a mechanism whereby that data can be stored for potential future accountability exercises. Social media platforms should be open to negotiating agreements with global accountability systems."
Additional reporting by Hannah Gelbart and Ziad Al-Qattan.
Read more about BBC Verify in "Defining the 'How' of BBC Verify's Launch".