Fact or Fake News? The “Role of Law” for Data Accuracy

False information is an unfortunate by-product of rapid advances in infocomm technology. The problem has been brought to the forefront recently by the impact of 'fake news' on socio-political developments. It has been identified as a dilemma for which the law may provide some solution. The appropriate approach should focus on prevention and deterrence, and involve all segments of society (in particular news and social media platforms). The answer may already be found in existing laws that can be updated to deal with the issue.

“Political language ... is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.”
George Orwell
Politics and the English Language, 1946

A Rising Problem – “Alternative Facts” and Convenient Untruths
2016 was a watershed year for western politics and for online news and information dissemination, both of which were intricately linked in what is becoming a protracted battle for informational integrity. The problems of informational overload and data accuracy were already apparent even before 'fake news' became a catch-phrase amid the political rhetoric surrounding the Brexit Referendum and the U.S. Presidential Elections, which was fraught with untruths. However, the sheer amount of false information and the audacity of its purveyors in 2016–17, as well as the rise of ‘fact-checking’ practices by news and other media outlets, have highlighted the problem as never before.1 Fake news continues to be a political weapon, such as in the recent French elections.
New Media and the Viral Effect
There are multiple reasons for this sorry state of affairs, most of which are exacerbated by rapid progress in information and communications technology, particularly advances in storage technology, high-speed electronic transmission and ease of access to digital information. The reasons for the rise of the ‘fake news’ problem include: online anonymity; modern forms of ‘viral dissemination’ (e.g. through social media platforms); the ease of cloaking falsehoods (e.g. by obfuscating lies with a dash of truth or masquerading opinions as facts); the human tendency to accept as truth data that affirms personal beliefs (or that aligns with personal interests); the age of User-Generated Content, in particular citizen ‘journalism’; a culture of instant gratification and sensationalized news (i.e. news as entertainment); the race between news outlets to report even before verification; and the blurring of lines between credible news outlets and tabloid ‘journalism’.2
Singapore and “Fake News”
Recently, there has been an increasing number of statements and opinions emerging from Singapore on the threats that fake news poses to Singapore as a society and to its political landscape, as well as the possibility of new laws to curb, if not eliminate, the creation of or access to such news.3 Local incidents include fake Facebook accounts of Members of Parliament, false identification by cyber-vigilantes and other fake news relating to various events (including potentially sensitive issues that could sow racial and religious discord).
On the one hand, government officials and private sector “experts” have called for new laws and public-private cooperative efforts to deal with such problems; on the other hand, proponents of freedom of information (particularly on the Internet), led by citizen journalism website editors and contributors, are decrying what they perceive as another opportunistic clampdown on non-traditional media platforms and sources.4
In fact, the government is already reviewing the feasibility of new laws to deal with fake news, but has not provided any details on what form they could take.5 The Minister for Law and Home Affairs, K. Shanmugam, has declared that current laws are inadequate and provide only limited remedies to deal with falsehoods after the damage has already been done (damage that is, in reality, often irreparable), referring only to an offence under the Telecommunications Act for knowingly transmitting a false message.6
Legal Developments in Other Jurisdictions
The concern over false information, especially in the political field, has already led to the introduction or proposal of laws to combat the problem in other jurisdictions. In the United States, former U.S. President Barack Obama signed the National Defense Authorization Act for Fiscal Year 2017 (“NDAA”) into law. That Act incorporates the Countering Foreign Propaganda and Disinformation Act of 2016 to tackle the problems of disinformation and foreign propaganda.7 The focus was specifically on false information relating to politics and on foreign manipulation of information to influence politics in the U.S.
In Europe, European Parliament President Martin Schulz has proposed a harmonised regional solution to the problem. Meanwhile, there are already state-sanctioned fact-checking websites set up for the region, and some Member States also have their own fact-checking websites (e.g. factcheckeu.org for the EU and pagellapolitica.it in Italy respectively). Some Member States have gone ahead with their own plans to deal with the issue. For instance, Germany’s coalition federal government is considering new legislation that provides for hefty fines for social media platforms such as Facebook and Twitter that ‘publish’ fake news and fail to react fast enough to remove such posts that are brought to their attention.8 Also, the Justice Ministers in three German states have reportedly proposed anti-botnet legislation to deal with automated social-media accounts that spread fake news.
The Root of the Problem
The law is an appropriate tool to fix the fake news problem. After all, the law and justice are based on truth and the search for truth. Existing traditional controls are not entirely suitable in the digital paradigm. However, there are already mechanisms that remain effective in dealing with the problem in an arguably balanced manner. Any changes to the regulatory regime should start with these laws and work seamlessly with the existing measures.
The overarching challenge that the authorities face is to put in place a system of legal checks and balances taking into consideration conflicting policy interests such as the smart nation initiative and the freedom of speech and expression on the one hand, and the need for informational integrity and accuracy on the other. Also, the final solution cannot solely be a legislative or executive one, but rather a concerted effort by both the public and private sectors (including society at large). 
The pen is mightier than the sword, and the person who controls the narrative wields real power. When considering legal measures, we must first identify the parties whose actions will have the strongest impact on the creation and dissemination of false information. First, the gatekeepers of online information, including Internet intermediaries and Content Hosts such as search engines, news aggregators and social network platforms, are an important key to the solution. Second, Internet Content Providers, specifically the authors, will have to be held to some co-regulatory standard of responsible speech and an independent fact-checking mechanism.
Preemptive measures should be the main focus of a legal solution, as false information and news can inflict lasting damage after it has been disseminated, even if it is subsequently disproved. Where damage or harm is caused, appropriate remedies, on top of punitive and non-punitive damages, should also be made available. Thus, any legal measure should focus on the following main objectives: The first is deterrence, which is difficult given jurisdictional limits and digital anonymity. Thus, punishment will have a limited effect and technological measures to sieve out ‘fake news’ will be more effective. The second is prevention, which can be in the form of restriction of access or removal of content. The incapacitation of fake news purveyors is more justified in recalcitrant and egregious cases. The third is correction and the reduction of harm or negative effects, such as through a mechanism that requires a retraction (and apology) or that allows for a right of response.
Existing Laws and Regulations
At the base level, the Infocomm Media Development Authority (the “IMDA” or the “Authority”) already sieves out publications based on several factors, presumably including publications containing false information, via a suite of regulations requiring licences for publications, whether in print or digital format.
For example, a permit is required to print or publish newspapers in Singapore and for the sale or distribution of a defined class of newspapers published overseas.9 Licences are required to operate printing presses and also for online news publications. Separate licences are also required for other forms of communication like radio and television broadcasts. Conditions must be met for such permits to be issued or licences granted.10
In relation to the provision of online news and other programmes on the World Wide Web through the Internet, Internet content and service providers must adhere to broad guidelines relating to “public interest”, “public order” and “national harmony”, as well as those based on moral (“good taste and decency”), racial and religious grounds.11 The ambiguity of these guidelines is cushioned by the standard required of licensees, which is based on “best efforts” to comply with the Regulation through the taking of “reasonable steps in the circumstances” of each case. Further guidance by the relevant government agencies will of course be helpful in this regard.
Meanwhile, there are other laws that provide for criminal offences for speech or expression, based on the type and nature of the content and/or its effects on the subject.12
Proposed “Source and Gateway” Approach Under the Internet Content Regulations
The Broadcasting Act (Cap 28) (the “BA”) and its subsidiary legislation for Internet content regulation, in particular the “class licensing framework”, allow the Authority to demand the “take down” or removal of websites that host content on a wide range of bases. These can be interpreted to extend to false news, although it is unclear to what extent they cover falsehoods generally, that is, the different forms they may take and their impact (which can range from the innocuous to the seriously damaging or harmful). The regulations are drafted widely enough to cover both websites as a whole, such as news outlets, and specific pages of a site (i.e. not the entire website). Hence, there is no compelling need for new laws to deal with the problem of “fake news”.
Rather than promulgate new laws to deal with “fake news”, the Authority should refine the existing Internet content regulation framework instead. The Internet content regulations consist primarily of the Broadcasting (Class Licence) Notification (Cap 28, N1) (the “Notification”) and the Internet Code of Practice (the “Code”).13 Individual licences are also issued in some cases as determined by the Authority.14
This is the best approach for several reasons. First, this approach will draw the least criticism as it merely clarifies that existing regulations that have worked well are also suitable for, and applicable to, the “fake news” problem. Second, this is an opportune time to update the existing provisions and provide much-needed clarity to the current framework, especially on the interpretation of “Internet Content Provider” and “Internet Service Provider”, the parties subject to the regulations. Third, the current regulations provide an arguably “light touch” approach that primarily targets access to information rather than censorship of the source (which is in any event less effective if it originates overseas); hence the framework already includes Internet intermediaries (which, as will be made apparent, play an important and integral role in the battle against fake news). Fourth, there is no need to differentiate between “fake news” and other forms of false information in general, which can fall within one of the generous exceptions to free speech and expression that are already provided for under the current regulations.
In relation to “Internet Content Provider” (the “ICP”) in particular, the current definition under the Class Licensing Scheme is confusing and requires more clarification through amendment. Under notification 2(a), it refers to “any individual in Singapore who provides any programme, for business, political or religious purposes, on the World Wide Web through the Internet”, which seems to limit the class licensing regime to websites that deal with any one of those three forms of transaction or content (as the case may be). However, notification 2(b) further includes “any corporation or group of individuals (including any association, business, club, company, society, organization or partnership, whether registrable or incorporated under the laws of Singapore or not) who provides any programme on the World Wide Web through the Internet”, which is much wider in scope based on a literal reading of the definition, since it now includes websites that cover any type of content. Indeed, the entire scheme seems to focus on content beyond “business, political or religious” programmes since the type of exclusions – in the Notification, in particular under the Schedule which specifies the conditions of the class licence; and as described in the list of “prohibited material” in para 4 of the Code – extend beyond those topics. 
Moreover, notification 2(b) further “includes any web publisher and any web server administrator”. In relation to this, it is pertinent to note that this provides leeway for the inclusion of “Internet Content Hosts” under the definition as “content providers” in the wider sense of the word. For example, para 14 of the Schedule to the Notification provides that “[a]n Internet Content Provider who provides a webpage on the World Wide Web through the Internet to which other persons are invited to contribute or post programmes shall use its best efforts to ensure that such programmes conform with such applicable Codes of Practice as the Authority may issue from time to time”, which appears to include content hosts for third-party user-generated content such as social media platforms, citizen journalism websites and encyclopaedias that allow anyone to contribute content (e.g. Wikipedia).
Notification 2(a) may in fact be a subset of notification 2(b) on the basis that “group of individuals” can also refer to an individual.15 Moreover, it also makes sense to read notification 2(a) to cover groups of individuals as well, since the focus of the regulation under this definition is on the type of content rather than the provider as such.16 However, if that argument is not accepted, then at least on a practical application of the regulation, it will apply to websites that are published or administered by more than a single individual, which is likely to be the case. 
A possible reason for the dual definition of an ICP is to highlight the sensitive nature of certain types of content and to give special notice to its providers of their obligations.17 In relation to businesses, a class licence is facilitative as it explicitly obviates the red tape of registration for such sites to do business in or from Singapore. Certain businesses, in particular those that provide “on-line newspapers for a subscription or other consideration”, may be notified in writing to register with the relevant authorities,18 while others, such as an on-line service in or from Singapore that provides “Singapore news programmes” within the scope of regulation 3A, are carved out of the automatic class licensing regime and subject to specific licences and conditions, including more onerous ones like the putting up of a bond, over and above those provided for in the Code, which remains applicable. Websites providing political or religious content, which the government has often determined to be sensitive in nature, are clearly flagged under the scheme as content whose providers have to comply with registration requirements under the Schedule to the Notification.19
Hence, it will be useful if the definition of an ICP is made clear, as it determines the scope and coverage of the Regulation, and also its usefulness as a tool to fix the “fake news” problem. Making it clear that the regime covers news aggregators, social media platforms that provide news feeds and other similar content hosts or sources of news will have the effect of ensuring compliance, as well as allowing for concomitant amendments to the Regulations to include specific conditions for operation, such as the measures that deal specifically with the problem of “fake news” and false information proposed below.
Once relevant Internet intermediaries are clearly brought into the Internet content regulatory framework, the actual measures that they need to take specifically in relation to “fake news” can be set out (in a similar way that specific broadcasting licence conditions can be made). This can be done within or outside of the class licensing scheme; that is, under section 9 of the Broadcasting Act (with an amendment to the Notification) or under section 8 of the same Act (specific licences), which was incidentally the way that citizen journalism outlets for Singapore news have been dealt with under the current scheme since 1 June 2013.20 In either context, the proposed two-level approach should deal with the source and the gateway (or access provider) as set out below.
Information Sources or Content Providers: Categorising and Fact-Checking 
First, there is a need to distinguish between news sites on the one hand, and opinions (including satire) on the other. According to the Oxford Dictionary, “news” is “newly received or noteworthy information, especially about recent events”. In the context of this article, it refers to factual information that is of interest to members of the public and goes beyond traditional broadcasts or publications. As false information as a whole is as much of a problem as fake news, and based on the above definition, “news” and “information” shall be used interchangeably in this part of the article; and shall be distinguished from opinions which are subjective views from a perspective that is not necessarily fact-based (e.g. based on false premises or half-truths).21 Certainly, “news” itself extends beyond politics. 
The current content regulatory regime places the responsibility and power only on the IMDA. More measures can be put in place to involve civil society, in particular credible interest groups - such as an association of librarians, academics, media outlets and civil rights organisations - to work with the Authority and Internet intermediaries.22 Civil society groups can, for example, form a committee to assist Internet intermediaries in meeting their fact-checking obligations and to identify fake news websites or platforms for the purpose of reporting them to the Authority (which may issue a takedown notice pursuant to the Notification) and to intermediaries (which may take the necessary measures described below). They can also be a credible source of accurate information and provide for the correction of misinformation for readers, particularly in relation to highly disputed information or news from high-profile sources and in relation to popular topics.23
In short, setting up civil society groups to determine the facts, and to separate fact from fiction or half-truths online, is an important self- and co-regulatory approach. The exact role and make-up of such groups in the scheme will have to be considered in detail by the IMDA. In the meantime, some Internet intermediaries are already tweaking their algorithms and enlisting the help of more “neutral” media groups to fact-check news, identify recalcitrant ‘fake news’ proponents and flag biased sources.
The above proposal deals with “fake news” without determining its merits or the gravity or importance of such news (which is impossible to determine with precision).24 Whether “trivial” or important, the legal outcome can be dealt with elsewhere based on the type of content and the impact or harm caused, as the case may be. In other words, where “harm” is caused, there is criminal and civil recourse through existing content-specific laws such as the law on defamation, harassment, obscene publications, sedition, and so on.25 For example, the Sedition Act has been used successfully to prosecute offences under that Act, including the online publication, distribution and reproduction of seditious publications.26 Persons can also be charged under the Protection from Harassment Act (the “POHA”) for providing false information that amounts to harassment.27 Similarly, there is criminal and civil recourse for defamatory statements or publications in Singapore law, including the making of an apology (and correction) under section 10 of the Defamation Act. Section 15 of the POHA provides recourse to natural persons to prevent the publication of false statements of fact and for the removal of such publications; and although it does not extend to the government,28 the latter (and other private legal entities) can still utilize their own channels of communication (and the measures recommended below) to dispel and correct such statements as well as to prevent future access to websites that perpetuate them.
Gateway Organisations, Content Hosts or Access Providers: Tagging, Flagging and Ranking
As stated above, the current Internet content regulations should be amended to clearly and unequivocally include relevant Internet intermediaries and bring them into the scheme.29 
The main focus here should be on such intermediaries, including search engines that provide news aggregation services and social network platforms that provide news feeds, which are the main disseminators of information. Internet intermediaries that host and/or provide access to information can be required by regulation to retool their interfaces to categorise news separately from opinions, with anything that is a mix of news and opinion to be categorized as the latter. This can be done by putting them in separate frames as opinion or review, by colour-coding them, or both. Currently, a news search under news aggregators such as Google News and Yahoo! News produces results from traditional and social news sites without distinction.
Another measure that can be considered is to include directions for ad blocking for websites (e.g. Google, YouTube, Facebook and Twitter) to reduce or eliminate the financial incentives to generate traffic through sensationalizing news or fake news.
Furthermore, relevant intermediaries can be required to put in place reasonable measures to reduce search result rankings or to tag/red-flag contentious news or news sources, so as to prevent and deter the proliferation of fake news and of the purveyors producing and disseminating it.30 They can be guided by, and work with, credible fact-checking groups or committees appointed by the Authority and/or by the intermediary through private arrangements. The earlier proposed fact-check committee, network or organization, whatever form or role it takes, can also advise these intermediaries on categorization and other measures; and utilise the amended content regulations to take down sites (in egregious cases),31 or to tag/red-flag them as unreliable sources of information.32
For tagged or red-flagged sources, perhaps the search results (of search engines or news aggregators) or news feeds (of social media platforms) can also provide a new mechanism that will feature/suggest a “reply”, alternative sources or a fact-check post (or website) for the reader to visit. This will be better received than outright blocking of sites, encourage digital literacy and develop discerning readers in society (and in the long term, obviate heavy reliance on third parties or the Authority to ensure accuracy of information). Alternative sources can come from individuals or fact checking sources, both public and private. In Singapore, for example, there is a government run website, “Factually”, that seeks to dispel fake news and false impressions; in the U.S., there are various fact checking sites that are privately run such as the International Fact-Checking Network (or “IFCN”) by the Poynter Institute (which also provides fact-checking training and certification),33 and Factcheck.org. These websites can themselves be the subject of rejoinder.34
In summary, the current notice-and-take-down procedure for intermediaries for false news sites and the proposed mandatory processes for counter-falsehood technology will suffice. The key Internet intermediaries can provide other (more nuanced) remedies besides site blocking, take-down notices and the provision of alternative sources. These include measures such as tagging/flagging, ranking and warnings attached to the source of (or news in) contention. Fact-checking (providing access to fact-check sites), tagging/red-flagging and/or lowering the rank of disputed sites, and separating factual news narratives from opinion websites, will be more credible and acceptable measures to the public than outright censorship.
Finally, the above-mentioned proposals are consistent with some measures that several prominent gateway organisations are already doing. For example, Facebook is developing its strategy and technological mechanisms to deal with fake news presented in its news feeds. Google’s approach is to provide a fact check tag/flag with fake news in its search results. Both are working with fact-checkers independent of their organisation.35

Final Words
An all-or-nothing or entirely top-down approach is not feasible. A multipronged approach and one that engages and involves all segments of society (that are the stakeholders in the demand and supply chain of information, including news), and in particular, those that hold the key to information access, is integral to a good and effective strategy. What is clear is that this will be a long and protracted battle for information integrity and factual accuracy.

Warren B. Chik
     Associate Professor of Law
     Associate Dean (External Relations)
     Singapore Management University
     E-mail: [email protected]
1 Digital data is dominant, and it has been reported that most of the data in the world was created in the last few years, an interval that will grow increasingly short given the rapid advances in information storage technology. See, SINTEF, Big Data, for Better or Worse: 90% of World’s Data Generated Over Last Two Years (22 May 2013), available at: https://www.sciencedaily.com/releases/2013/05/130522085217.htm.

2 Also, the data economy and monetization of information through traffic and readership (e.g. through pay-per-click advertising models) alone can lead to, for instance, the temptation for sensationalisation of information and hyper-partisan political news, to draw traffic and attract readership.
3 Rachel Au-Yong, Experts Welcome Laws to Curb Fake News (17 April 2017, Straits Times), available online at: http://www.straitstimes.com/singapore/experts-welcome-laws-to-curb-fake-news.

4 For example, Mothership.sg, The Middle Ground, The Online Citizen, The Independent.sg, All Singapore Stuff (ASS), among others, have since posted their opinions, critiques and analyses of the government’s intention to deal with fake news as well as predictions on the form it will take. See e.g, Nyela Zannia, Former ISD Detainee Responds to Government’s Intention to Curb “Fake News” (18 April 2017, TOC), available at: https://www.theonlinecitizen.com/2017/04/18/former-isd-detainee-responds-to-govts-intention-to-curb-fake-news/; TOC, The Impending Regulations on Fake News and Why There Should be a Line (10 April 2017), available at: https://www.theonlinecitizen.com/2017/04/10/the-impending-regulations-on-fake-news-and-why-there-should-be-a-line/; Belmont Lay, K. Shanmugam: All Singapore Stuff and States Times Review are Fake News (3 April 2017), available at: http://mothership.sg/2017/04/k-shanmugam-all-singapore-stuff-states-times-review-are-fake-news/

5 Rachel Au-Yong, Keeping Fake News at Bay (8 April 2017, Straits Times), available online at: http://www.straitstimes.com/singapore/keeping-fake-news-at-bay; Valerie Koh, Government Review Underway to Deal with Fake News (3 April 2017), available online at: http://www.todayonline.com/singapore/government-review-underway-deal-fake-news; CNA, Government ‘Seriously Considering’ How to Deal with Fake News: Shanmugam (3 April 2017), available online at: http://www.channelnewsasia.com/news/singapore/government-seriously-considering-how-to-deal-with-fake-news/3647556.html; Ronald Loh, Law may be Reviewed to Counter Fake News (4 April 2017, The New Paper), available online at: http://www.tnp.sg/news/singapore/shanmugam-current-law-limited-dealing-fake-news.

6 Rachel Au-Yong, Fake News: Current Laws “Offer Limited remedies” (4 April 2017, Straits Times) available online at: http://www.straitstimes.com/singapore/fake-news-current-laws-offer-limited-remedies. The provision is section 45 of the Telecommunications Act (Cap 323) which makes it an offence to knowingly transmit or cause to be transmitted a false or fabricated message.

7 Available at: https://www.congress.gov/bill/114th-congress/house-bill/5181/all-info.

8 Among the proposed changes are provisions for the swift removal of false information and prominent corrections, which necessitates an accessible process for complaints to be made and acted upon, and regular audits or reports on their actions and progress. In order to meet such obligations and to avoid sanction as well as to show that it is serious in tackling this issue, it is likely that such platforms will, whether legally mandated or not, provide or enhance the manpower and/or electronic processes to deal with screening or verification and complaints or reports of false news.

9 Newspaper and Printing Presses Act (Cap 206).

10 See the Broadcasting Act (Cap 28) and its subsidiary legislation.

11 Notification 15 of the Broadcasting (Class Licence) Notification (Cap 28, N1) only restricts those conditions contained therein to services that do not include computer on-line services that are provided by Internet Content or Service Providers, and hence, they are excluded from their guidelines.

12 E.g., the Undesirable Publications Act (Cap 338), the Films Act (Cap 107), the Sedition Act (Cap 290) and the Defamation Act (Cap 75). However, the truth or falsity of the information is only relevant in some cases, and is not itself the focus of the legislation.

13 See section 9 and section 6 of the BA, which empower the Authority to determine class licenses and issue codes of practice, respectively.

14 Section 8(2) of the BA. Under section 3A of the Notification, computer on-line services in or from Singapore that fulfil the conditions under the provision providing for “Singapore news programme[s]” are excluded from the automatic class licensing scheme in a 2013 amendment (G.N. No. S 330/2013). Under the Schedule to the Notification, Internet Content Providers dealing with some types of content that meet the requirements must also register with the Authority in accordance with the conditions stated therein (see paras. 3-5 of the Schedule).

15 Under section 2 of the Interpretation Act (Cap 1), “words in the singular include the plural and words in the plural include the singular”.

16 Elsewhere, under the Schedule, the Regulation also refers to “a body of persons” (4) and “an individual” (5(b)) for the purpose of requiring registration in order to deal with political or religious content. 

17 At paras. 3 to 5. The same applies to ICPs in the business of providing an on-line newspaper through the Internet. The provider of such content may be an individual, a corporation or a group of individuals (as the case may be), and the Authority will determine whether they fall within the definition and requirements of these provisions. See also fn. 14.

18 The setting up and operation of newspapers and printing presses are regulated pursuant to the Newspaper and Printing Presses Act (Cap 206), and as such this is an effort to attain regulatory consistency between physical papers and digital ones.

19 The extension of these regulations to political and religious materials, including online versions, is found in other laws such as the Parliamentary Elections Act (Cap 218) (e.g. rules on political campaigning, which extend to online platforms) and the Films Act (Cap 107) (“party political films”), and the Sedition Act (Cap 290), respectively.

20 It is of interest to note that section 46 of the Broadcasting Act provides for an offence relating to the contravention of section 8(1), which requires a specific licence to provide broadcasting services in or from Singapore. Section 9 grants a class licence automatically and hence carries no such offence, but section 16(3) makes it an offence to broadcast a programme that is prohibited by a direction from the Authority.

21 In this regard, if this proposal is taken up, the definition of “Singapore news programme” for the purpose of the individual licensing scheme put in place by section 3A of the Notification will have to be clarified so as not to exclude websites that provide both news and opinions, as most (if not all) citizen journalism sites tend to do.

22 Due to some criticism of the use of fact-checking networks that include news organisations, perhaps the International Federation of Library Associations and Institutions (IFLA), the Library Association of Singapore and local university research centres dedicated to this form of research can be considered as alternative or additional resources (and be funded or otherwise supported by the government). This can take the form of a ‘fact check committee’ consisting of representatives from a reputable cross-segment of society (such as librarians, academics, etc.). Their credibility must be backed by research and evidence, which can be resource-intensive; some form of government support is therefore important and will be helpful. Certainly, the final make-up of the fact-finders, who will have a role to play in the regulation and in the measures and practices of access providers and hosts, should constitute a non-partisan body or network. The government should also rely on public feedback (i.e. social engagement) in addition to the ‘fact check committee’.

23 The feedback and complaints from these sources can be channelled to a dedicated pool of officers who can utilise the current notice and take down regime more effectively and efficiently in dealing with fake news sites, where appropriate. Alternatively or additionally, the committee itself (and even individuals) can be empowered to some extent (and given a role) to respond to disputed facts at the point of access, so as to give notice to potential readers, in relation to sites that are not recommended for take down or blocking (e.g. those not determined to be providing only or predominantly “fake news”).

24 There should not be a distinction made between important and unimportant falsehoods, as it would be difficult and unnecessary for the decision-maker to make that additional determination or judgment. However, the gateway organisation or host/access provider will have to find a way to prioritise its reaction to news items as they develop, so as to react sufficiently swiftly to the dissemination of, and access to, disputed news items.

25 Section 54 of the Broadcasting Act clearly states that “[n]othing in this Act shall prevent any person from being prosecuted under any other written law for any act or omission which constitutes an offence under that law, or from being liable under that other written law to any punishment or penalty higher or other than that provided by this Act, but no person shall be punished twice for the same offence.”

26 See section 4 of the Sedition Act.

27 E.g. the use of threatening words or behaviour, heard, seen or otherwise perceived by the public, which is likely to cause harassment, alarm or distress. For example, wielding a fake gun or depositing fake bombs or what look like chemical weapons can constitute an offence under the POHA. There have been such incidents in Singapore where the perpetrator was convicted under the Act.

28 Attorney-General v Ting Choon Meng and another appeal [2017] SGCA 6.

29 I.e., the regulation should be updated to clarify the subjects of the regulations (esp. “Internet Content Providers” and, to some extent, “Internet Service Providers”), so as to include news aggregators, search engines, social networking platforms, etc.

30 This approach is not new. An example of additional measures besides the “notice and take down” process to empower and educate Internet users is the obligation, added to the Regulation in 2012, for ISPs to inform subscribers of, make available, and provide reasonable technical support for Internet content filtering arrangements.

31 These proposed new and additional measures will supplement, and in a way soften, the hard-line ‘notice and take down’ mechanism, and maintain the IMDA's “soft touch” approach to Internet content regulation.

32 Using the current Internet content regulation mechanism obviates the problem of proving mens rea, such as intention or knowledge, that is commonly required under criminal provisions. See Kelly Ng, Tackling Question of Intent the Way to Root Out Fake News: Experts (5 April 2017), available online at: http://www.todayonline.com/singapore/tackling-question-intent-way-root-out-fake-news-expert.

33 See Poynter, International Fact-Checking Network, available at: https://www.poynter.org/tag/international-fact-checking-network/.

34 This is a form of a “right of reply” on top of the “notice and take down” mechanism that is currently available. It is another example of a “light touch” approach.

35 For example, Google provides access to a fact-checker site in response to a contested subject; the site must meet a set of criteria and internal guidelines set by Google (for integrity and credibility, as well as standards for transparency and accountability) and be an authoritative source of information. Some current fact-checking sites, such as PolitiFact and Snopes, are operated by publishers. A similar fact check system was introduced for Google News on its news aggregator site.