
Transcript │ AM Agenda, Sky │ 30 August 2023

Interview with Laura Jayes, AM Agenda
Subjects: Meta ends partnership with RMIT FactLab, Labor’s dangerous misinformation laws

LAURA JAYES: Joining me live now is the Shadow Minister for Home Affairs and Cyber Security, James Paterson. James, it was always kind of heading in this direction, wasn’t it? Just take us through how this fact checking arrangement actually works with Meta.

JAMES PATERSON: Well, good morning, Laura. Understandably, a foreign-headquartered tech company doesn’t want to be seen weighing in too much on the political debate, and certainly not a constitutional debate in Australia. But they are concerned about the rise of misinformation and disinformation on their platform, and so what they’ve done is appoint a series of so-called independent fact checkers to do that work for them, to tell them what is true and what is not true on their platform, and to give them advice about how to deal with it.

The problem that Meta has run into is that RMIT FactLab is not an independent fact checker. In fact, it’s a self-appointed fact checker. It didn’t even have the international accreditation it was supposed to have. In relation to the Voice, in my view they were conducting very partisan, very biased fact checks that consistently favoured the Yes side and consistently targeted the No side, and on very dubious grounds they started labelling things in the Voice debate as misinformation. Now, I think Meta has come to its senses and realised this is an untenable situation. It puts them in a very difficult position. And I really welcome their decision to end their relationship with RMIT. I think any other organisation that has the same kind of relationship with RMIT should equally be reviewing it right now, because this is going to be a very important six weeks in the history of our country.

When we are considering the most radical change to our Constitution since federation, we cannot have the sincerely held opinions of Australians being censored on social media platforms as part of that debate.

JAYES: This also coincides with legislation the current government wants to pass as well, when it comes to misinformation on social media. They’re trying to really clamp down on misinformation, but also putting it in the hands of someone and something with a lot of subjectivity?

PATERSON: That’s exactly right, Laura. This perfectly highlights the dangers of laws like Labor’s proposed misinformation laws, because these fact checkers are not always independent. They’re not always dispassionate. They sometimes have a perspective, and can be guilty, in my view, of pushing that perspective at the expense of their independence and the fairness with which they’re supposed to administer these things. The truth is, we don’t all agree on what constitutes misinformation. There are some things that are very clearly misinformation, which everyone would agree on. There are other things that are subject to contentious political debate and can’t just be dispassionately fact checked by a group of experts to arrive at a clean decision. I mean, even Paul Barry, the host of Media Watch, found this particular fact check, in relation to Peta Credlin’s editorial about the length of the Uluru Statement from the Heart, to be really questionable. He said it should have been labelled ‘disputed’ rather than ‘false information’. Well actually, everything in a political and constitutional debate is disputed. That’s the nature of it. We’re having a political debate about it, and this is a highly partisan, charged debate. So I don’t think these self-appointed fact checkers should be wading into that, saying these things are true and these things are not true, when Australians can form their own views.

JAYES: Ok. So, what about the misinformation legislation and this government’s aim to crack down on it? Do you think this bill should be scrapped altogether? Or is there a way we can actually do it without, you know, what we’ve just seen happen as a really bad example?

PATERSON: There’s no question in my mind that this bill should be thrown in the bin straight away. It is an absolutely friendless bill. It’s been attacked by the Media, Entertainment and Arts Alliance, by the Human Rights Commission, by civil liberties bodies, by the social media platforms themselves, by media organisations. I mean, you cannot find a single person who would defend this bill other than the Minister, Michelle Rowland. Even ACMA, which is responsible for helping the government draft the bill, has emphasised that this is only a draft bill, that it is out for consultation, and its Chair, Nerida O’Loughlin, has described the criticisms of the bill as very valid. Now, when the body that’s supposed to be administering it thinks that the criticisms are valid, I think you’ve got a very serious problem with the bill. I’m surprised that the government hasn’t walked away from it already. The fact that they haven’t, I think, is deeply disturbing, because it would seriously curtail free speech in this country. It would seriously restrict what Australians can say online, but also what they can read and what they can hear online. That shouldn’t be happening in a 21st-century liberal democracy like Australia. The government needs to scrap it.

JAYES: Yeah, but something needs to be in its place because you can’t just let misinformation run rife. So, what’s the solution here?

PATERSON: Well, I do think there are problems, particularly with foreign state-sponsored disinformation, which is different from misinformation. I think that’s part of the problem the government has run into. They’re treating misinformation, which is your uncle innocently saying something on Facebook that’s wrong, and disinformation, which is a Chinese bot factory pumping Twitter full of fake accounts trying to game the algorithm and drive content that divides our society, as the same problem. They’re very clearly different problems, and they have very different solutions. The Senate Select Committee on Foreign Interference through Social Media, which I chaired, handed down its final report a few weeks ago, and it had a series of recommendations that go to that other, very serious problem. The way we deal with those problems is through transparency, not censorship. We should require the platforms to be transparent about any directions they receive from foreign governments. We should require them to label state-affiliated media on their platforms…

JAYES: But sometimes that’s so opaque, isn’t it? The way I see it, and I don’t know whether you would agree or disagree with me here, is that what Elon Musk and Mark Zuckerberg could do straight away, which would stamp this out, is not allow bots, for a start, and also require some form of identification to have an account.

PATERSON: Look, some platforms are better than others at dealing with this problem, Laura. Some of them are more proactive. Others started to do some good things and then walked away from them. That’s why I really do think we need an enforceable set of transparency standards, where we as Australians say: these are the minimum conditions we think you should meet if you want to operate a social media platform in Australia. It goes to things like having a legal and physical presence here in Australia, so that you can be directly subject to our laws and to the oversight of our Parliament. Famously, WeChat even refused to appear before our inquiry. It goes to being transparent about how your platform operates, and about when you choose to take content off that platform and why. It goes to opening yourself up to independent, third-party researchers. Often the bot networks that have been identified have been found by researchers and academics because the platforms opened themselves up to that scrutiny. But there are a lot of platforms, like TikTok and WeChat, that don’t do that and are very opaque in their operations. I think as Australians we should insist: if you want to operate in this country, if you want to serve hundreds of thousands if not millions of users, well, these are the criteria you have to meet.

JAYES: Before I let you go, it is worth reminding our viewers what this fact check unit was all about, and how much money it potentially stood to gain from Meta: close to $1 million a year. The unit at RMIT was, essentially, run by the opinion, I guess we could say, of a few? Is that fair?

PATERSON: Look, I think your colleague Jack Houghton has really ably demonstrated that with his research, where he found that employees of the fact check unit at RMIT, while they were giving supposedly independent fact checks on the Voice to Parliament debate, were on their own personal social media accounts tweeting support for the Yes side of the debate, and in fact using some pretty incendiary language to describe people on the No side, including the Opposition Leader, Peter Dutton. Now, you cannot be both an activist and an impartial, independent fact checker, but it seems the FactLab at RMIT has tried to merge the two. So it’s no wonder that they have disproportionately and unfairly targeted the No side of the case, and frankly, I don’t think they’re well equipped for this task. I think they’ve very clearly demonstrated their bias, and it’s wise that Meta has got out of the business of doing this with them. But the ABC has a very significant relationship with RMIT, and I think they should be reviewing that relationship in light of Meta’s decision.

JAYES: James Paterson, good to talk to you. Thanks so much. A really important issue.

PATERSON: Thanks, Laura.

ENDS
