I’ve just read a fascinating article in The Guardian on the diffusion of emotions through social media and, subsequently, the work of Stefan Stieglitz and Linh Dang-Xuan entitled Emotions and Information Diffusion in Social Media – Sentiment of Microblogs and Sharing Behavior, published in 2013. It immediately resonated with me, as I’ve considered our understanding of truth throughout my own research, and am intrigued by our relationship with it.
In the first instance, let’s talk about the recent Marina Joyce matter. Joyce is a YouTube blogger whose behaviour recently changed so dramatically, or so it seemed to her fans, that a rumour began to spread via Twitter and other social media forums that she had been kidnapped by Islamic State and was being coerced through physical violence. The hysteria reached such a pitch that the police were forced to investigate, and even Joyce’s subsequent video, in which she attempted to reassure her followers that all was well, was scrutinised and disbelieved. The #SaveMarinaJoyce hashtag trended worldwide. Ultimately, and with depressing inevitability, the very fans who had worried and reacted to the point of panic turned on Joyce when it appeared that she was possibly suffering from some form of mental illness. This turnaround in fans’ thoughts and feelings towards Joyce also came with its own hashtag: #BoycottMarinaJoyce.
As The Guardian article reminds us, this sort of hysteria is not unknown, and has historical precedent: the Salem Witch Trials are the example cited by the newspaper, but there are others.
What is interesting is the speed with which the issue was picked up on by followers, and the evidence that was seemingly gleaned from the videos in support of their supposition.
But a report also cited in The Guardian suggested what many of us already knew: that the young are not always discerning users of the internet, often unable to differentiate truth from fiction, and “too often influenced by information that they should probably discard. This makes them vulnerable to the pitfalls and rabbit holes of ignorance, falsehoods, cons and scams.”
The answer is not greater censorship or a tighter control over internet content. The task is to ensure that young people can make careful, skeptical and savvy judgments about the internet content they will, inevitably, encounter. This would allow them to identify outright lies, scams, hoaxes, selective half-truths, and mistakes, and better navigate the murkier and greyer waters of argument and opinion.
Truth, Lies & the Internet: A report into young people’s digital fluency, Bartlett & Miller 
It therefore becomes ever more important that what we produce and publish online is truthful and honest. This is often very difficult when we’re talking about the reproduction of cultural artefacts, as digitisation does, for some people, undermine the authenticity of an original artefact. Walter Benjamin was telling us as far back as the 1930s that the replication of an original artefact as a photographic image “is lacking one element: its presence in time and space, its unique existence at the place where it happens to be…the presence of the original is the prerequisite to the concept of authenticity.”
Bartlett & Miller suggested that “decisions about information quality [are] based on site design, rather than more accurate checks…15% [of 12-15 year olds] don’t consider the veracity of results but just visit the sites ‘they like the look of'” [Bartlett & Miller, 2011].
This might not prevent the mass hysteria that sometimes occurs in cases like that of Marina Joyce, but the internet is certainly a breeding ground for inaccuracies, and it is only by perpetuating truth that we can attempt to counter them.
I firmly believe in the democratisation of knowledge. But at the moment, democratisation often gives rise to reams of data and information that are unchecked and lack gatekeepers. “In a complex, specialised and esoteric world, we must trust in experts. John Hardwig calls this the ‘novice/expert problem’. An important and fundamental strut of epistemology today is therefore the application of ‘pedigree criteria'” [Bartlett & Miller, 2011]. In seeking those gatekeepers and experts, we are perpetuating the same old hierarchies, relying on people that we assume have intellectual authority. It is a complex issue. What is the answer? It seems trite to say, tell the truth. Because how can that be managed in a world with 200 million Twitter accounts [Stieglitz & Dang-Xuan, 2013], particularly when social media is increasingly used for political debate?
Our artefacts must be truthful. And for the most part, they are, enhancing our understanding and providing a route for those who may never get access to an original manuscript.
What happens when the truth is manipulated? It can feed into emotional contagion and fan hysteria: it trends, with its own hashtag; it creates reflections in the eye of the beholder that don’t exist; the gun in the corner of the room. When we digitise, we risk undermining truth, so we need to ensure the consumers of digital artefacts are digitally fluent, aware that when they traverse the internet they are often negotiating Balkanised states of information in which discussion is bounced around groups of like-minded individuals, and thus distorted.
What’s your truth? Mine is somewhere between seeing, and believing.