Deepfake Task Force: Danger of disinformation requires new collaboration

Surrounded by a seemingly endless whirlwind of digital media, today’s consumers often face the dangers of disinformation without the knowledge and tools to combat them. To meet this growing challenge, we must adopt technological solutions that help consumers verify the veracity of digital content. Such a solution will require close collaboration among government, media and tech companies, which together can help build a consumer base that is not so easily swayed by the content it consumes.

That’s why we at Adobe strongly support the Deepfake Task Force Act (S. 2559), which would establish a National Deepfake and Digital Provenance Task Force made up of members of the private sector, the federal government and academia to address the problem of disinformation. We are encouraged to see the government leading the charge and bringing together the country’s collective expertise to find a solution, one that we believe must be technology-based and championed across all sectors.

While the concept of disinformation has been around for centuries, recently those keen to spread it have taken advantage of social media and easy-to-use editing technologies to do so at an alarming rate. In the past year alone, misinformation has eroded confidence in the election and spread deadly untruths about the effectiveness of the COVID-19 vaccine.

And as artificial intelligence (AI) continues to advance, it will become even easier to manipulate all types of media, and even harder to detect that manipulation when it occurs. Think of altered photos, videos and audio, all created with the intent to mislead.

The Deepfake Task Force Act focuses on an important aspect of the solution to disinformation: the provenance of digital content. The bill defines this as “the verifiable chronology of the origin and history of an item of digital content, such as an image, video, audio recording or electronic document”. In other words, it is the ability to see where content came from and what has happened to it along the way. The stated aim of the task force is to explore how the development and implementation of digital provenance standards could help verify information online and reduce the proliferation and impact of disinformation.

It’s the same approach we took at Adobe when we founded the Content Authenticity Initiative, a provenance-based tool that attaches tamper-proof attribution data such as name, location and edit history to a piece of media, allowing creative professionals to get credit for their work while giving consumers a new level of transparency about what they see online.
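
To make the idea of a “verifiable chronology” concrete, here is a minimal, hypothetical sketch of a provenance record as a hash-linked chain of edit entries. This is not the Content Authenticity Initiative’s actual format or SDK, and the class and field names are illustrative assumptions only; the real standard is far richer and relies on cryptographic signatures rather than bare hashes.

```python
# Illustrative sketch only: a toy provenance chain, NOT the CAI/C2PA format.
# Each entry binds an author and an action to the hash of the asset after that
# action, and to the hash of the previous entry, so tampering with the history
# or the file breaks the chain.
from dataclasses import dataclass, field
from hashlib import sha256


@dataclass
class ProvenanceEntry:
    author: str           # who made this change (hypothetical field)
    action: str           # e.g. "captured", "cropped", "color-corrected"
    asset_hash: str       # hash of the asset bytes after this action
    prev_entry_hash: str  # hash of the previous entry, forming a chain


@dataclass
class ProvenanceChain:
    entries: list[ProvenanceEntry] = field(default_factory=list)

    def append(self, author: str, action: str, asset_bytes: bytes) -> None:
        prev = self._entry_hash(self.entries[-1]) if self.entries else ""
        self.entries.append(ProvenanceEntry(
            author=author,
            action=action,
            asset_hash=sha256(asset_bytes).hexdigest(),
            prev_entry_hash=prev,
        ))

    def verify(self, current_bytes: bytes) -> bool:
        """Check the chain is intact and ends at the asset we actually have."""
        prev = ""
        for entry in self.entries:
            if entry.prev_entry_hash != prev:
                return False  # an entry was altered, removed or reordered
            prev = self._entry_hash(entry)
        return bool(self.entries) and \
            self.entries[-1].asset_hash == sha256(current_bytes).hexdigest()

    @staticmethod
    def _entry_hash(entry: ProvenanceEntry) -> str:
        record = f"{entry.author}|{entry.action}|{entry.asset_hash}|{entry.prev_entry_hash}"
        return sha256(record.encode()).hexdigest()
```

In this toy model, anyone holding the file and its chain can call verify() to confirm that the recorded history is intact and actually ends at the bytes in front of them, which is the kind of check a provenance standard aims to make routine.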

A provenance-based solution removes some of the barriers social media platforms face in fighting disinformation, barriers that detection tools alone cannot overcome. As we have seen in current practice, tagging content with “disinformation” labels is often futile: by the time it is labeled as such, millions of users have already seen it. Blocking or removing content entirely is also problematic, as it can erode users’ trust in platforms to defend freedom of expression. Get it wrong once, and no one trusts your judgment anymore.

But with provenance technology, the decision-making is left to the consumer, not to a publisher, a social media platform or the government. An empowered consumer base could identify disinformation based on the characteristics of the content before it spreads, without waiting for an intermediary to label it. So rather than taking on the impractical (and, frankly, impossible) task of catching every bad actor, a provenance-based solution creates a place of trust for good actors. And it provides an essential backstop when AI-powered detection tools can’t keep up with AI-powered creations.

Tech companies have an important role to play here. We have extensive user networks and intimate knowledge of how our tools are used to create and share content, which makes us a natural home for a provenance-based solution. But we cannot do it alone.

For a provenance-based solution to truly work, we need investment and research initiatives to continue to find the best ways to deliver these tools to consumers. We need government action to encourage the entire ecosystem to integrate these tools. And we need industry standards that companies can follow. Most importantly, we need educational efforts that anchor an awareness of disinformation in the public’s core understanding of media and the internet.

There is a lot of work to be done, but the creation of a National Deepfake and Digital Provenance Task Force is a critical step in bringing together the knowledge, perspectives and influence of the private and public sectors to address these challenges. We encourage the Senate to take up and pass this bill so that we can work together in our collective fight against the dangers of disinformation.

Dana Rao is Executive Vice President, General Counsel and Corporate Secretary of Adobe. He directs Adobe’s efforts around content authenticity.
