Adanna Taylor, Chloe Zhu, Jared Rosales, Lilah Durney, Ryan Brunswick, Samip Phuyal, Selina Song
We have entered an age of information disorder: with the current design of the internet, it has become increasingly difficult for users to access, identify, and trust authentic information. Editing tools have made the alteration or fabrication of image and video content dangerously easy, contributing to the vast amount of misleading and false information available online. Misinformation online has jeopardized the public’s trust in news media and the free press. Furthermore, disinformation is now used more than ever for information warfare, which has had measurable effects on the political, social, and economic climate of nations worldwide.
The imminent transition to Web 3.0 opens up an opportunity to redesign the internet using new technologies. Looking at the Starling Lab’s work on image verification, for instance, we see cryptography’s ability to secure and verify the history of a piece of visual content as a powerful tool for ensuring authenticity. In this paper, we explore and demonstrate how Web 3.0 technology can be applied to address information disorder; we ask the question “How might we design for authenticity? And how might we visualize that design?”
All of our results can be found in this folder.
To understand Starling Lab’s work, it’s important to consider how we arrived at this time of so much information disorder. Since its invention in 1989, the internet has gone through three distinct phases: Web 1, Web 2, and Web 3. Web 1 gained popularity in 1991, and its structure was decentralized, with no single authority or group of authorities dictating the content that could be published. Next, around 2004, Web 2 emerged with the arrival of the social-mobile web and companies like Facebook. Built on user interaction and the spread of information via posting, commenting, and messaging, this phase of the web saw a centralization of power in the hands of a few large companies, many of which went on to sell users’ data and information. These characteristics of Web 2 have led to the spread of misinformation and disinformation, two terms we explore later in this paper. We are now on the cusp of Web 3, a new phase of the internet that strives to fix the unintended consequences of Web 2, such as centralization and monopoly power. The goal of Web 3 is to create a system that removes power from bad actors, decentralizes control, and creates transparency. Beyond the Web 3 technologies that have gained popularity in recent years, like cryptocurrencies, we believe Web 3 can reduce “information uncertainty” and help bolster trust in digital content.
Blockchain & Tech
We can use cryptographic technologies to verify and authenticate information. The verification roadmap starts when the creator, Person A, takes a photo and saves it as a data file. They run the file through a cryptographic hashing program, software that returns a hash. They then sign the hash with their private key, verifying their identity and ownership. Next, Person A registers their signature on a distributed blockchain ledger, a decentralized database that can be seen across different sites and is not controlled by a single, centralized company like Google or Facebook. Person A then stores the file on a distributed storage network, which splits the data across multiple storage locations, or servers; if one server is hacked, the others are not compromised, a unique security benefit of decentralized technology. Person B verifies the signature with the public key and recovers the hash. Person B then compares the hash from the digital signature with a freshly computed hash of the data file. If the hashes match, the data is verified and authentic. If the hashes differ, someone has changed the data file and it is not authentic.
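The sign-and-verify roadmap above can be sketched in a few lines of Python. This is a toy illustration only: it uses textbook RSA with tiny hand-picked primes so it runs with the standard library alone, and reduces the SHA-256 hash modulo the toy key so it fits. A real pipeline would use a vetted cryptographic library with full-size keys.

```python
import hashlib

# Toy RSA keypair (illustrative only -- real systems use keys generated
# by a vetted cryptographic library, never hand-picked small primes).
p, q = 61, 53
n = p * q      # public modulus (3233)
e = 17         # public exponent
d = 413        # private exponent: inverse of e mod lcm(p-1, q-1) = 780

def sha256_int(data: bytes) -> int:
    """Hash the file bytes, reduced into the toy key's range."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(data: bytes) -> int:
    """Person A: sign the file's hash with the private key."""
    return pow(sha256_int(data), d, n)

def verify(data: bytes, signature: int) -> bool:
    """Person B: recover the hash with the public key and compare it
    against a freshly computed hash of the file."""
    return pow(signature, e, n) == sha256_int(data)

photo = b"original image bytes"
sig = sign(photo)
assert verify(photo, sig)                    # untouched file: hashes match
assert not verify(b"altered image bytes", sig)  # edited file: hash mismatch
```

Any change to the file produces a different hash, so the signature made over the original hash no longer checks out; this is the hash comparison Person B performs in the roadmap.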
The history of changes made to a data file can be found on the ledger, stored as metadata. Information on the ledger is immutable: it cannot be changed or removed once saved. Data explaining what changes were made, at what time, and by whom can all be accessed.
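A minimal sketch of this edit history, assuming a simple in-memory log (a real ledger would be replicated across a distributed network): each entry records what changed, when, and by whom, and commits to the hash of the previous entry, so altering any saved record breaks every hash that follows it.

```python
import hashlib
import json
import time

class ProvenanceLedger:
    """Append-only log of edits to a data file. Each entry includes the
    previous entry's hash, so history cannot be rewritten undetected."""

    def __init__(self):
        self.entries = []

    def record(self, file_hash: str, editor: str, change: str) -> dict:
        """Append an entry describing one change to the file."""
        prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        body = {"file_hash": file_hash, "editor": editor,
                "change": change, "time": time.time(), "prev": prev}
        # The entry's own hash covers all of its fields, including "prev".
        body["entry_hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)
        return body

    def is_intact(self) -> bool:
        """Recompute every hash; any edit to a stored entry (or a broken
        prev-link) is detected."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "entry_hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if body["prev"] != prev or entry["entry_hash"] != expected:
                return False
            prev = entry["entry_hash"]
        return True

ledger = ProvenanceLedger()
ledger.record("ab12...", "Person A", "original capture")
ledger.record("cd34...", "Person A", "cropped image")
assert ledger.is_intact()
ledger.entries[0]["change"] = "tampered"   # rewrite history...
assert not ledger.is_intact()              # ...and the chain breaks
```

Even if a tamperer recomputed the altered entry’s hash, the next entry’s `prev` field would no longer match, which is why later entries pin down earlier ones.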
Misinformation & Disinformation
Misinformation is one of the main issues that has accompanied the evolution of the web. Misinformation is false information spread by someone unaware of its origin; it is not necessarily intended to cause harm. Researchers have categorized seven types of problematic content, ranging from satire, which involves little manipulation, to fabricated content, along with manipulated, false, misleading, and imposter content. For example, some websites post false content under headings or logos similar to those of credible, trustworthy news organizations.
In between these categories lie further variations that only serve to confuse the public. Disinformation, by contrast, is false information deliberately spread to cause harm. Some politicians, for example, overuse the term “fake news,” undercutting all news, including reliable journalism and information.
How might we better verify digital content and make clear to the public that what they read and see has been authenticated, thereby bolstering trust in journalism and the media? That question is on the minds of many journalists and photographers who worry about how and where readers and viewers get their information and whether they can trust it. These days, some say, we are all suffering from “information disorder.”
Players & Analysis of Problem
Many developers have been working together to re-architect the web by looking for upstream solutions using provenance and cryptographic tools. There is a world of open-source players working in this ecosystem, which allows various groups to collaborate using publicly available tools to combat misinformation and disinformation. We have spoken to representatives of many of the organizations spearheading this movement, including the Starling Lab for Data Integrity, led by Stanford University and the University of Southern California; the Content Authenticity Initiative (CAI), spearheaded by Adobe; and the News Provenance Project, an experiment led by the New York Times Research and Development team.
The CAI collaborates with groups across several industries to fight misinformation, using technology and tools that bolster digital content provenance and put data on a decentralized, unalterable, and transparent distributed network.
Similarly, the Starling Lab looks to establish trust in records of digital media using provenance, various cryptographic methods, and widely accessible collaborative tools. The Lab follows a three-step framework to capture, store, and verify digital content, ensuring that information about the content’s origins is authentic and viewable.
The New York Times Research and Development team has also experimented with provenance, working with IBM on the News Provenance Project, an experiment with technical solutions that combat the spread of misinformation by allowing readers to verify the validity of news online.
However, there is still a severe lack of awareness about the host of issues that misinformation and disinformation have brought about, as well as the steps that are being taken to address these problems. The work our team has done sheds light on the consequences of misinformation and disinformation, as well as solutions for them.
Methods and Materials
When discussing the most effective way to inform others about the utility and journey of Web 3.0, we advocated for visual aids. Our primary criteria were comprehension, precision, and engagement. Creating these visuals involved various editing software, combining informative text with striking images. Photoshop and Canva were our primary tools, allowing creative flexibility in layering text over images. Furthermore, iMovie played a significant role in expanding our reach to a media-centric audience through video.
Throughout the summer, we focused on multimedia forms of creation. We created visuals and infographics to explain various Web3 concepts, spanning from the history of the Internet to the technologies behind data authentication, such as hashing, signatures, and public/private keys. In addition, under the supervision of media professional Aaron Huey, we created a video to present our methods and findings from the program.
Our video is viewable through this link: https://www.canva.com/design/DAFH2usxdPs/5C41l5REW2ct0jKNHPC24Q/watch?utm_content=DAFH2usxdPs&utm_campaign=designshare&utm_medium=link2&utm_source=sharebutton
Some examples of our work:
The following is a compilation of all visuals and infographics we created on the various topics, technical and conceptual, of Web3:
We can further increase the scope and quality of our work to make knowledge about cryptography’s value in content verification more accessible. To measure the effectiveness of our work, we can gather feedback through surveys in our local communities: after presenting the infographics, we can track how much information was retained through a follow-up quiz. In addition, to create a more professional video with greater human interaction, we can produce a video interview series gauging strangers’ starting familiarity with Web3 and how it changes after a brief discussion and video playback. On the more technical side, we can learn more about the product development process for Web3 applications and potentially create our own dApp, or decentralized app. Overall, there is an abundance of opportunity to explore the variety of uses of Web3, use it to create change, and continue to design for authenticity.
References
Wikipedia contributors. (2022, July 21). World Wide Web. Wikipedia. https://en.wikipedia.org/wiki/World_Wide_Web
Wikipedia contributors. (2022, July 3). Web 2.0. Wikipedia. https://en.wikipedia.org/wiki/Web_2.0
Web3: in a nutshell. (2021, September 9). Mirror. https://eshita.mirror.xyz/H5bNIXATsWUv_QbbEz6lckYcgAa2rhXEPDRkecOlCOI
Johnson, S. (2021, September 3). Beyond the Bitcoin Bubble. The New York Times. https://www.nytimes.com/2018/01/16/magazine/beyond-the-bitcoin-bubble.html
IBM. (2021, March 5). Digital signatures. https://www.ibm.com/docs/en/ztpf/184.108.40.206?topic=concepts-digital-signatures
Blockgeeks. (2019, November 8). Blockchain Infographics: The Most Comprehensive Collection. https://blockgeeks.com/blockchain-infographics/
Koptyra, K., & Ogiela, M. R. (2020). Imagechain: Application of Blockchain Technology for Images. Sensors (Basel, Switzerland), 21(1), 82. https://doi.org/10.3390/s21010082
CAI. (2022). Secure Mode Enabled. Content Authenticity Initiative. https://contentauthenticity.org/case-study
Wardle, C. (2021, August 3). Understanding Information Disorder. First Draft. https://firstdraftnews.org/long-form-article/understanding-information-disorder/
Starling Lab. (2022). Starling Lab. https://www.starlinglab.org/
NYT R&D. (2022). The New York Times R&D. https://rd.nytimes.com/