At the end of 2013 (I apologise for the delay), I conducted a research study on misinformation and disinformation on online news and journalism sites in Singapore. Having explained the background, motivations, and methodology, I would like to share an excerpt of the policy recommendations (the bracketed numbers refer to the qualitative responses to the survey):
What, then, are the relevant policy recommendations to be contemplated? More significantly, who should shoulder the burden of crafting and executing these proposals? Based on the findings, government intervention is not preferred, mainly because of preconceptions about its roles and intents. Attempts at control could be construed by the public as a sign of repression (#1), or of bureaucrats acting as meddlesome, paternalistic Big Brothers (#22 and #67). Regulation is interpreted as a threat to online freedoms. This, however, does not preclude the government’s right to “correct false information … on its own platform” (#20). Such clarifications can counter the spread of misinformation and disinformation, and provide Singaporeans with the sources to draw fair conclusions. Minimally, there should be government statements to separate fact from fiction (#60). Some scholars have maintained that “[c]alls for regulation of the Internet cannot be divorced from the wider debate over the boundaries of free expression” (Harris et al., 2009, p. 176), and that regulation or censorship stymies constructive socio-political discourse; Netenal (2000) opines that the Internet community – like other mass media platforms – is capable of self-regulation. The onus is on the websites and aggregators to check themselves, and to check one another.
Punitive policies were mentioned too. Publishers could be held accountable through the imposition of nominal fines, and “a study on [the] cost of inaccurate publishing may be conducted to propose a suitable level of fine[s] to deter irresponsible publishing” (#73).
Websites and Aggregators
Websites should be aware of the harm associated with misinformation and disinformation: readers could come to treat their publications less seriously. Given that the incidents involving TOC, TRS, and TRE led respondents to trust those sites very much less, it is reasonable to posit that repeated journalistic contraventions would render this distrust even more pronounced. “Fact checks”, “repeated verification of sources”, and “due diligence throughout the information[-]gathering process” (#76) were cited as ways for websites and aggregators to be more “responsible” and “reliable”. If misinformation has been perpetuated, the response to the oversight matters too. While the controversies were poorly received, respondents were more satisfied with the follow-ups from TOC and TRE, whereas TRS’s reply was met with greater consternation. Many spoke of the need for apologies or corrective notes (#11, #13, and #61), so that the integrity of the writers would not be tarnished.
As a community of socio-political content providers, there is also the option of “call[ing] out peer sites for misinformation and disinformation” (#73). Active policing generates healthy competition, and gradually weeds out irresponsible websites, editors, or writers who publish content in a cavalier manner. The level of discourse is thereby enriched. In response to problematic hate speech that is “persecutorial, hateful, and degrading” (Tsesis, 2001, p. 819) in the United States, the concept of a “neighbourhood watch” allows netizens to expose “deceitful and false content” (Wolf, 2010, pp. 551–2). As users become more experienced, tolerance and sensitivity will become norms. The same principles can apply to the aggregators, who would benefit from being more selective about the websites they link to. Sensational content might yield the desired web traffic in the present moment, but would greatly undermine an aggregator’s future standing as readers become more selective news and information consumers.
Ultimately, the endeavours of the websites and aggregators will bear fruit only if readers become more demanding. Ideally, websites with the deliberate aim of presenting disinformation will lose favour with the general populace, since content providers and aggregators – who serve the needs of their audience – will consistently choose to run pieces that appeal to the supposed preferences and biases of their readers. Information education in schools (#59) will provide avenues for ethical and practical discussions. Results from these sustainable pedagogies will take a long time to materialise, but “[w]ith the Internet as possibly the closest manner in which we can exercise free speech, it falls upon the readers to act as the filters” (#69). They should give due credit and attention to pieces that truly deserve them.
Readers can become more educated and read more critically, but this healthy brand of scepticism should be expanded and translated into tangible outcomes. Media literacy pedagogies have been explored in tandem with the growing accessibility of the Internet; Coiro (2003) and Livingstone (2004) have both argued for the need to extend traditional media literacy to information and communications technology. Given the lower levels of trust consumers might have in web-based sources, with the perception that the information is “potentially inaccurate and biased” (Flanagin & Metzger, 2000), more effort is necessary. Once individuals become more capable of distinguishing facts from fabrications – through education and greater exposure to online information – they should be willing to correct baseless misconceptions (#63). The potential for improvement is reflected in the collated responses to the question of countering instances of false information, which were marked by a degree of apprehension. Overcoming that hurdle of lethargy will create a culture of rigour that is often missing on online news sites.
Likewise, readers should take responsibility for spreading misinformation and disinformation. “[I]f they had previously posted … inaccurate information, they should take down the post, or edit the original post” (#12). This retrospective practice directly halts the circulation of untrue information, alerts others, and reminds them of the importance of ensuring veracity. This assumption of responsibility will also encourage others to be more conscientious.