The Technium

Truth vs Trust

The dream of having a system of any sort that can label or identify what is true, what is fake, and what is misinformation remains just that: a dream, and an unattainable one. We have no reliable method of knowing with any certainty whether a statement is true simply by inspecting it. That uncertainty applies to anything in a news cycle, and it applies to text, images, audio, and video alike. We cannot tell from looking at some words whether they are true; nor can we tell by looking at a photo or by watching a video. Text, photos, and video are no longer evidence of anything.

As historians will tell you, it can take years, decades, maybe even a century before all the facts about an event can be compiled, compared, and evaluated. Events have usually long since passed before we can judge their veracity. A report less than a day old is very difficult to verify in any meaningful way.

So what hope do we have of preventing rumors, lies, deception, misinformation, hoaxes, fakes, superstitions, and disinformation from spreading far and wide, and permeating a society?

Rather than focusing on ascertaining truth, we have to ascertain trust. Trust works where truth is hard. We can’t make technologies that surface only the truth but we can make technologies that surface trust.

Imagine a system in which the source of every bit of information is embedded immutably into the information itself. (Maybe blockchain is used.) Every statement carries a record of who asserts the claim, so each statement would end up having four parts: subject, verb, object, and assertion, that is, who claims it is true. If a claim is quoted or forwarded, the chain of provenance, the full sequence of sources, is forwarded along with it, embedded in the statement. This embedded chain of sources enables people to begin to trust some sources over others.
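The four-part statement and its forwarded chain of sources could be sketched in code. This is a minimal illustration only; the class and the source names are invented for the example, not part of any real provenance protocol.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Statement:
    """The four parts described above, plus the chain of earlier sources."""
    subject: str
    verb: str
    object: str
    assertion: str        # who currently claims this is true
    provenance: tuple = ()  # earlier sources, oldest first, carried along

    def forwarded_by(self, source: str) -> "Statement":
        # Forwarding appends to the chain rather than replacing it,
        # so the full sequence of sources travels with the statement.
        return Statement(self.subject, self.verb, self.object,
                         source, self.provenance + (self.assertion,))

# Hypothetical sources, for illustration only.
claim = Statement("the bridge", "collapsed", "at noon", assertion="Reporter A")
quoted = claim.forwarded_by("Blogger B").forwarded_by("Aggregator C")
print(quoted.provenance)  # ('Reporter A', 'Blogger B')
print(quoted.assertion)   # 'Aggregator C'
```

Because the class is frozen, a forwarder cannot silently rewrite the chain; quoting always produces a new statement with a longer provenance.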

Most of what we say we know, we only “know” because we believe and trust others. Almost everything I might know about physics or chemistry I know because I believe what others (scientists) say. I’ve done some experiments myself, but many concepts I take to be true because I have come to trust the scientific consensus. The effects of radiation, for instance, I believe because of the trust I have in nuclear experts; I have no direct experience of them myself. Everything I know about WW2 comes from other sources that I must trust or not. And of course most of those sources are not direct sources: they are quoting and relying on others.

But we can imagine a system in which it is easy to filter these chained sources (footnotes to the footnotes to the footnotes) and to assign trustworthiness to them based on their reputations over time. This source filtering does not eliminate fraud or incorrect claims, but it makes them harder.

We might come to insist that we not look at anything more than four hops away from an original source (say, an author who found a statement made by a news reporter who quotes source A). And each of those hops might need a very high trustworthiness score. We might discount anything whose route passed early on through blacklisted sources, sources we have come to distrust. Those trust scores could be calculated many ways, and there might be more than one way to figure them; we would also have to trust the accounting process and agent. There might be different agencies and filters for evaluating trust that we subscribe to, the way we subscribe to a newspaper. Another indication of trustworthiness is the speed and eagerness with which a source corrects its mistakes. Sources that handle corrections and mistakes well can gain more trust than those that never offer corrections or admit mistakes. In these ways the scoring process becomes yet another layer, another source, that must be trusted or not.
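A toy filter can show how these rules might combine: reject items more than four hops from an original source, reject anything routed through a blacklisted source, and require every hop to clear a trust threshold. The threshold, scores, and source names are all invented assumptions for the sketch.

```python
MAX_HOPS = 4       # hypothetical limit on distance from the original source
THRESHOLD = 0.7    # hypothetical minimum trust score for every hop
BLACKLIST = {"KnownFabricator"}

# Invented reputation scores; a real system would compute these over time.
trust_scores = {"Reporter A": 0.9, "Wire Service": 0.95, "Blogger B": 0.6}

def passes_filter(chain):
    """chain lists every source, from the original to the latest forwarder."""
    if len(chain) - 1 > MAX_HOPS:
        return False  # too many hops from the original source
    if any(source in BLACKLIST for source in chain):
        return False  # routed through a source we have come to distrust
    return all(trust_scores.get(source, 0.0) >= THRESHOLD for source in chain)

print(passes_filter(["Reporter A", "Wire Service"]))  # True
print(passes_filter(["Reporter A", "Blogger B"]))     # False: low score
```

Note that an unknown source defaults to a score of zero here; a different accounting agent we subscribe to could make the opposite choice, which is exactly why the scoring process itself must be trusted or not.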

Science is in the business of sorting out claims and counterclaims, and it is pretty good at generating a consensus about what is true, but it takes a long time. This “trust” method does not rely on consensus as much, and it is much faster, geared to news cycles. However, in many parts of the world today we see polarization in media, where different “tribes” of people trust different sources. Some of those media outlets tend to hide the actual sources of their information, and their audiences are happy with that arrangement. An alternative system of trust may not diminish the polarization of news if some tribes don’t care about a different way to trust. But having at least one system with unalterable provenance of sources would increase trust and confidence in fast-moving information. Imagine if Wikipedia extended its policy of using only “published” sources to cover only “published and chain-sourced” sources. That would increase its trustworthiness, even if some people ignored it.

The need for embedded chain-sourcing is amplified a million-fold by the advent of AI and deep fakes. As with text before it, no one can tell by inspecting a photo, audio clip, or video whether it is true or not. A well-trained AI can generate realistic images that are complete fiction from subject to background, yet appear real in all respects. Applied to news, no consumer can tell from looking whether an item is true. We can therefore rely only on the source to determine its status. The source should be compelled to reveal its nature, and if it does not reveal it, or lies about it, it should be penalized by the system with a downgraded reputation that follows its products. Reputations in this system are sticky; they take a long time to change in either direction, although they can lose points faster than they gain them. A source that has consistently misrepresented items will need a long time to overcome that reputation, even if it starts issuing fair items. Note that some sources, say a Hollywood special-effects company that generates superior deep fakes, can earn a great reputation precisely because their work is labeled as fiction, and that assertion is embedded in the item so it can be inspected wherever it winds up on the net. This builds them up as a source that can be trusted, even though they make deep fakes.
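The “sticky” reputation rule, slow to climb, quicker to fall, could be sketched as a simple update function. The specific rates here are arbitrary assumptions chosen to illustrate the asymmetry, not parameters of any real system.

```python
GAIN_RATE = 0.02  # assumed: slow credit for an honestly labeled item
LOSS_RATE = 0.10  # assumed: larger penalty for a misrepresented item

def update_reputation(score: float, honest: bool) -> float:
    """Nudge a 0..1 reputation score; losses outpace gains."""
    if honest:
        # Gains shrink as the score approaches 1.0, so trust builds slowly.
        return min(1.0, score + GAIN_RATE * (1.0 - score))
    # Losses are proportional and larger, so misrepresentation costs more.
    return max(0.0, score * (1.0 - LOSS_RATE))

score = 0.5
penalized = update_reputation(score, honest=False)  # about 0.45
rewarded = update_reputation(score, honest=True)    # about 0.51
# One misrepresented item erases the gains of several honest ones.
```

Under these assumed rates, a single misrepresentation costs roughly five honest items' worth of reputation, which is the stickiness the system wants: labeled fiction keeps earning trust, while unlabeled fakes dig a hole that is slow to climb out of.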

In short, for media consumers, don’t aim for truth; aim for trust.


© 2023