On Friday 11 October, ACCAN CEO Carol Bennett and Deputy CEO Dr. Gareth Downing appeared before the Environment and Communications Legislation Committee. The Committee was hearing views on the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024.

The full transcript of their appearance can be found below, or on the Parliament of Australia website.

Transcript:

BENNETT, Ms Carol, Chief Executive Officer, Australian Communications Consumer Action Network

DOWNING, Dr Gareth, Deputy Chief Executive Officer, Australian Communications Consumer Action Network

O'SHEA, Ms Elizabeth (Lizzie), Founder and Chair, Digital Rights Watch [by video link]

[12:36]

Senator CANAVAN: Yes, I'm happy to go first. I might start with Digital Rights Watch. I believe that in your submission you've outlined that you continue to have concerns about the broad nature of how 'serious harm' is defined in the proposed bill. Could you just outline what your concerns are about that broad definition?

Ms O'Shea : Yes. The definition of 'serious harm' still includes 'imminent harm to the Australian economy'. That remains a concern for us. We don't think that aligns with international human rights law and best practice. Of course we understand that freedom of speech is a right and that it's appropriate at times to impose limitations on that right, but any limitation has to be necessary and proportionate. There is a body of law, built through various mechanisms in international settings, that sets out what that right means and when the appropriate limitations apply, and in general that has not included things like harm to the economy. For that reason, we think the definition of 'serious harm' should be narrowed, and harm to the economy should not be a part of it.

Senator CANAVAN: I think you mentioned the leg of this stool which mentions the efficacy of preventative health measures. Do you think it's appropriate that that be in the specific definition of 'serious harm'?

Ms O'Shea : I don't believe our submission made specific recommendations about what should or should not have been included in that definition at this stage of the drafting. I would say there is utility in thinking about public health initiatives coming from a place of trust, and in those circumstances limitations of speech need to be considered in that context. I think it warrants some careful consideration.

Senator CANAVAN: Your submission says:

We also note that certain kinds of public health initiatives can be contested …

With that in mind, do you think debate about public health measures should be moderated somehow by a government enforced misinformation regime?

Ms O'Shea : I would say that debates about public health are already moderated through a variety of different mechanisms. There are laws that limit speech in Australia that exist already. There are also algorithms developed by social media companies that are used to amplify certain kinds of content based on what is profitable for them; that is a concern. We are not in a world in which there's unfettered freedom of speech in Australia to begin with. We may be in agreement here that we need a human rights act that will create the right to freedom of speech, because that is the best way to protect it and to ensure, in legislative reforms, that those considerations are taken into account.

Senator CANAVAN: I agree with that.

Ms O'Shea : I'm glad to hear it!

Senator CANAVAN: How would you define 'misinformation', if you don't want it to be as broad as defined in the current bill?

Ms O'Shea : I believe we were talking about serious harm rather than the definition of 'misinformation'. I'm not sure the definition of 'misinformation' and 'disinformation' is particularly problematic in this setting, but how 'serious harm' is defined has to be carefully considered.

Senator CANAVAN: Just to clarify my question: my understanding of how this bill operates is it's only misinformation that could cause serious harm that will be subject to censorship—so it's linked to the definition.

Ms O'Shea : Yes, they're related; I wasn't sure if you were asking a different question.

Senator CANAVAN: It is a practical definition of 'misinformation'. I'll rephrase my question: how would you specifically define in an act of parliament the kinds of misinformation that should be either banned or deprioritised on a social media platform?

Ms O'Shea : The first thing I would say about that question is I'm not sure it necessarily results in banning or deprioritisation; I'd query that as a consequence of including certain criteria in the definition of 'serious harm'. If I was to take a drafting pen and start this process from scratch, I would import human rights considerations into the test of what constitutes serious harm—so considering a balancing exercise between what the content is that might give rise to serious harm and what kinds of infringements on human rights, including the right to freedom of speech but other rights as well, might be impacted by defining certain kinds of content as serious harm. It may not be that you have to include a list in order to create a definition of 'serious harm' that would work within this architecture. I think there is the capacity to introduce human rights considerations that might improve the definition of 'serious harm' and improve the processes for developing codes for moderating content.

Senator CANAVAN: The bill, I think, makes a cursory attempt: it seeks to say that any misinformation frameworks should not impinge on the constitutional protections the High Court has found around political discussion in Australia. By the sounds of it, you don't think those provisions adequately protect the human right to free speech.

Ms O'Shea : The implied right of freedom of political communication, which is contained in the Constitution, which I believe is what you're talking about—

Senator CANAVAN: Yes.

Ms O'Shea : is manifestly inadequate, in my opinion, for the protection of speech from a human rights perspective. We fall far short of most other comparable liberal democracies around the world in not having those protections in place. It is currently not enough that we only have access to an implied right within the Constitution. We absolutely need a human rights act that gives proper protection for all human rights, including freedom of speech.

Senator CANAVAN: Thank you for that; well said. Do you, here, want to add anything to that, or anything else? I have some questions for you as well. No? Just turning to your submission, then, I believe you're putting to us that there needs to be a lot more transparency around any decision-making that ACMA takes. Could you explain what extra information you'd like to see them be made to provide?

Ms Bennett : Thanks, I appreciate the question. From our perspective, the bill does provide an opportunity to better understand the accuracy and content of information that purports to be factual or accurate. But we also have concerns about its capacity to address mis- and disinformation. It's a good opportunity for engagement on this issue, to prevent harm and maintain social cohesion, and I take Lizzie's point around the definition of harms and the need to have that in place.

To your question, our experience with the Telecommunications Consumer Protections Code is that it has been quite ineffective in its application. From our perspective, this framework would be better placed as a disallowable instrument, one that enables parliament to have more of a say and to step in if it doesn't work, than as a code process.

The code, in our experience, doesn't provide the kinds of consumer protections that it should. It's a completely ineffective process for addressing the kinds of issues that we're seeing occur when breaches of consumer protection happen in the telecommunications code.

Senator CANAVAN: That raises an interesting point. My understanding of how this works—correct me if I'm wrong—is that the bill would require the social media companies to impose a code, or develop and impose one, to deal with misinformation and disinformation. Obviously, then, the codes developed by those social media companies are not subject to being disallowed by parliament. That's right, isn't it, because they're not government instruments?

Ms Bennett : That's right.

Senator CANAVAN: Are you suggesting there should be a government ministerial regulatory instrument made that would then apply to social media companies and would also be subject to parliamentary scrutiny?

Ms Bennett : Yes, similar to the current—the thing is, it's up to the digital companies to now put in place their own codes. It's an industry-led self-regulatory process, I guess. We haven't found that particularly useful, on the telecommunications side. Given the importance of this, in terms of protecting the community and ensuring that there isn't significant harm, we think that there would be some merit in having a more robust process than we've seen and greater powers for intervention than we see under the ACMA processes, with regard to the TCP Code.

Senator CANAVAN: One thing that does get reported to me at the moment, when people are subject to discipline on social media platforms, is that there's almost no right of appeal, nor even a way to understand why they may have been banned or had their posts deleted. There's no recourse to understand what they've done wrong.

Are you suggesting, through this process, that there should be some consumer right here for Australian users of these social platforms, that they deserve the right to understand why they've been banned and maybe have a right of appeal et cetera? Digital Rights Watch are welcome to respond to any of these questions as well.

Ms Bennett : What I'd say is this bill would enable some greater transparency around what the platforms do and don't accept, in relation to mis- and disinformation. That, then, gives consumers greater sight of the kinds of information and the validity of that information, in terms of its factual representation. For us that's the value of this. It does enable engagement in that process. Do you want to add to that?

Dr Downing : Yes, certainly, and thank you for the question, Senator. From our perspective, we would be very supportive of a complaints and dispute resolution process being established. That's not just with respect to mis- and disinformation but more generally. Consumers have extremely limited protections at the moment. If you can imagine being disconnected from your Google account, that is a really significant issue for consumers. If you lose access to some of these very critical digital communication platforms, it really curtails your ability to operate in the economy, live your life and do your work. From our perspective, we're very supportive of expanded protections in that space, and we certainly think that there should be dispute resolution processes in place.

With respect to digital platforms, there was a recommendation from the ACCC some years ago, as part of their Digital Platform Services Inquiry—which I think handed down its most recent report a few weeks ago—particularly on the need to establish a Digital Platforms Ombudsman. From our perspective, you could expand the Telecommunications Industry Ombudsman. That would be very much focused on those consumer protection matters about access.

Ms O'Shea : I would say, as well, that I fully support the idea that there should be greater transparency for researchers, say, but also generally for reporting by ACMA—if this bill were to pass—about how platforms amplify certain kinds of content and the decision-making behind that. There are very few tools available to know what kind of content is popular on social media. The few that exist have been deprecated, so there's almost no line of sight other than through very granular journalism or very laborious projects by researchers to understand the information economy on social media platforms, particularly on how the advertising business model affects and influences content generally as it's shared on the platform.

The last item on that front is that there's a significant role for privacy reform and improving these issues as well. Because, if you can limit the kinds of data that social media platforms can collect, it does then have an impact on the business model, which is largely data extractive. Greater transparency around that would improve people's choices as consumers. It would also feed into whether these companies have a social licence to operate, and what would constitute that social licence to operate and make them work harder to win public trust for their business model. At the moment, I think that is at a very serious low.

Senator DAVID POCOCK: Ms O'Shea, in your last answer you raised much bigger questions about this bill. On that, I was interested in your thoughts on the approach this bill is taking—this sort of Whac-A-Mole approach to misinformation that's popping up—versus actually pushing for more transparency around the kind of information that the algorithms are promoting. What are your thoughts on this legislation versus, say, the Digital Services Act and the Digital Markets Act in the EU—two very different approaches?

Ms O'Shea : Yes, that's true, and I think it is also worth thinking about this proposal in the context of privacy reform. I mentioned that just then, but there has been a years-long process of reviewing the Privacy Act, and we have only seen recently the very initial stages of that translating into legislative reform. There's no timeframe for the rest of it, and I think it is very important that the government commit to that.

To think about regulation of the information ecosystem, it is very important to consider how digital platforms engage in strategic optimisation of features designed to attract new users, retain attention and increase interaction with the platform. These features create revenue streams that give content creators financial incentives to prioritise engaging content. That can mean the kind of content that gets amplified by platforms tends to be extremist, polarising and, often, misinformation and disinformation. That kind of content is very profitable, not just for the social media platforms, because it keeps users engaged, but also for the people who use those platforms, who can monetise it through virality.

For those reasons, it is very important to think about this as an information ecosystem, rather than a single policy intervention that will solve this particular problem. I prefer to think about it as an ecosystem, because it does align with how we might think about the physical ecosystem—the environment—as well. I think we should take the same approach there. We don't just focus on planning for extreme weather events that might result from climate change; we also look at avoiding burning further fossil fuels in order to stop the extreme weather events. The same approach needs to be taken here. We need to look at interventions that can limit misinformation and disinformation and the algorithmic amplification of it, but we also need to look at the source of the problem, which is data extractive business models that are focused on engaging content to continue that extraction of personal information, with immense consequences to users but also to our democracy as well. The European approach has been much more proactive, regulating platforms based on their size, asking them to proactively explain whether they're compliant with various policies and values encoded in law. I think there's real utility in looking at how we might apply some of those models in Australia.

Senator DAVID POCOCK: In raising this with the government and department, I've been told that we can't take a European approach due to our Constitution and potential international trade rules. I'm interested in your views on that. I don't understand how it would be unconstitutional to force companies to be more transparent.

Ms O'Shea : Yes, I query that as well. I don't pretend to be a deep constitutional scholar, though, so I might need to take that on notice. I also may need to take on notice the question about trade. I know there is an issue around free trade agreements with the United States that might limit our capacity to do certain things in law that relate to that. I think it's worth thinking about why that's the case and whether that's a justifiable position because that is something many Australians would not appreciate and would feel greatly troubled by.

The other component of why we may not be able to proceed with European-style regulation is that Europeans have access to the GDPR, the General Data Protection Regulation. We are decades out of date with our Privacy Act, and we need root-and-branch reform to get even close to where the Europeans are, which may set us up for greater regulation of platforms.

Senator DAVID POCOCK: On the Privacy Act reform, which I think touches on a lot of the things that Australians are concerned about—social media companies scraping our data to train their AIs—we've seen the Prime Minister make the call to kick that part of privacy reform beyond the next election. How urgent is it that Australians have better protections and similar protections to Europeans to be able to opt out of that large-scale data harvesting?

Ms O'Shea : I would argue that it is absolutely critical, in part because privacy reform touches on many different fields of policy: the digital identity system, mis- and disinformation of course, the rights of children to be able to go online and be safe, as well as lots of other concerns that people might have, whether that's being targeted by predatory industries or other similar experiences. It is absolutely critical. I join many other civil society organisations and experts in this field in calling for the government to commit to a timeframe of no longer than six months after the election if they are re-elected, and I would hope that even a government of the conservative side of politics would also commit to this, because it seems like a reform that ought to be bipartisan.

We are miles behind and completely out of date, and, as a result, some of the worst technology ends up being tested here. We have a much worse information ecosystem when compared to our colleagues abroad. That's a terrible shame not just for people who like to use the internet for community organising and talking to their friends but also for a democracy that is reliant on high-quality information being shared among voters and policymakers, which is not what we currently have.

Senator DAVID POCOCK: Just finally, Ms Bennett, you raised concerns earlier about self-regulation. It seems pretty ill-advised to me to be asking the now most powerful companies in the world to self-regulate when we know that they're absolute black boxes with very little transparency. Can you maybe explain why we're taking this approach and what the dangers are?

Ms Bennett : I think I've described it as the fox in charge of the henhouse. At the end of the day, what we see in the Telecommunications Consumer Protections Code is that, effectively, industry sets its own code. They call it co-regulation; in the act it's actually described as self-regulation, and that's the intention. We find that ACMA has very few powers. Its powers involve a two-step process: if there's a breach of the code, it's up to ACMA to direct the company to address that issue. ACMA can't impose any fines or take any other action until a breach of that direction occurs; then it can fine up to $250,000, which is a drop in the ocean for these large corporations. So, effectively, it doesn't work, and we're calling for direct regulation because, when direct regulation occurs, we see that good consumer outcomes result. That's what we're here to protect: consumer outcomes. So, yes, it's effectively a very unworkable system.

Senator DAVID POCOCK: Thank you very much.

Dr Downing : If I could provide slightly more detail for the committee, just for context: with respect to telecommunications, we commissioned a research report based on ACMA enforcement actions taken over a 13½-year period. That research identified that the ACMA had imposed only $6.1 million in penalties against telecommunications companies over that period: $6.1 million in infringement notices, with just over a million dollars in civil penalties sought. So what we're trying to say is that, if this framework were adopted in this context, you would expect to see very similar outcomes, and that is likely to be extremely ineffective. A few million dollars in fines over more than a decade is obviously not sufficient to drive behaviour in the telco sector, and it certainly wouldn't be sufficient for digital platforms, given their transnational scale.

CHAIR: If I can ask a clarifying question there: you're saying that you would gauge the effectiveness of policies based on how many fines are issued?

Dr Downing : Not at all, though it is a factor I would consider. With respect to consumer protections, it's quite straightforward for us: we have a variety of indicators that point to poor outcomes. For example, we've had more than a million complaints to telcos for many, many years. That's starting to decline because, when we and other organisations survey them, people tell us they are giving up. We also have extensive case studies, input from the community and input from community organisations about the failures of that particular framework, which are provided to us and which indicate to us that it is ineffective. There's an admission from some parts of the industry that the two-step framework is ineffective. So, in that context, I have a degree of comfort in saying there are different points of evidence I could refer to. But I do think that the penalties, in the context of a sector that is worth tens of billions of dollars, are clearly out of step with those for equivalent companies in, say, the energy sector, where the penalties are substantially greater.

CHAIR: I wonder if you could unpack that for us on notice. I'm interested in how you gauge—

Ms Bennett : If I could just answer that, Senator: the ACMA itself has indicated that it doesn't have the powers it needs to be able to enforce the code, and it said as much more recently in relation to Telstra issuing notices to relatives of deceased persons. Also, consumer organisations that have been engaged with the TCP Code have recently walked away from the process. They've given up because, over many years, they haven't achieved any outcomes. So there are a number of indicators of its failure, and it is something that we think would not be effective in this particular case.

CHAIR: If you could unpack the fines and sanctions and how you gauge that in your experience, that would be really helpful.

Ms Bennett : Sure, we're very happy to take that on notice.

CHAIR: Senator Roberts.

Senator ROBERTS: Thank you for appearing today, and thank you for your submissions. Ms O'Shea, my questions are almost exclusively to you. Your recommendation 1 is to enact a comprehensive federal human rights act. We already have one. As well, there are multiple international conventions to which Australia is a signatory. Is the problem that Australians have no human rights or is the problem that our human rights are being ignored?

Ms O'Shea : We don't have an enforceable bill of rights in the same way that other jurisdictions around the world do. We do have, technically, a human rights act, but it doesn't give rise to all the rights contained within the key conventions being enforceable for individuals. It does also result in a scenario whereby policymakers consider human rights implications but aren't necessarily bound by them or courts aren't given the opportunity to make findings about how laws have been drafted and, potentially, rely on enforceable human rights to curb their interpretation of how those laws might apply.

So, in that sense, we do need a federal bill of rights that is enforceable. It is currently a gap in Australian law, and we remain the only liberal democracy in the world without one. That has all sorts of associated consequences. It means that rights aren't taken into account in drafting a bill like this, for example, and they aren't taken into account in a range of other services delivered by governments and policies implemented by governments. It also means that limitations on rights are not something we discuss in Australia as a society in terms of where they might be appropriate. Of course, as you may know, there are very few rights that are absolute and not subject to any limitations. So there are circumstances where rights can be limited, but there needs to be a discussion and a dialogue about whether that's necessary and proportionate, as well as appropriately limited and legal. We are miles behind in Australia because we don't even have these kinds of discussions, and that's a problem. So we absolutely need a federal bill of rights.

Senator ROBERTS: Do you consider that the balance between free speech and misinformation is fundamentally a human rights issue?

Ms O'Shea : Fundamentally, I think lots of digital policy issues are human rights issues, but there are other components to it, of course. There are issues around competition and the size of online platforms, for example, and how they're regulated. There are issues around the domicile and status of companies that are headquartered abroad—that have, for all intents and purposes, including for tax purposes, headquarters abroad—in terms of how you regulate them in Australia and what you might do about it. There's corporations regulation as well. Mis- and disinformation touches on multiple different fields of policy endeavour. Many of these have human rights components and, if we think about the right to freedom of speech, of course we think that should be protected, and we also think that people should be able to move in online spaces without surveillance by governments and private corporations. So there's a human rights lens to it, but there are multiple different factors that might need to be taken into account in regulating for the problem of mis- and disinformation.

Senator ROBERTS: Thank you. That's very comprehensive. You say in your submission:

These issues—

misinformation issues—

are exacerbated by the fact that commercial platforms are becoming the de facto distribution system for all forms of media content, including public service media and other outlets that have a mandate to take into account public interest obligations, accuracy, and truthfulness. The result is that, for all practical purposes, the media ecosystem is increasingly shaped by a commercial model that profits from the amplification of polarising, sensationalist content regardless of its accuracy. This is not a model compatible with democracy's need for well-informed citizens and meaningful political deliberation.

Are you proposing limiting or removing private sector involvement in social media?

Ms O'Shea : I'm not sure exactly if I understand the question—

Senator ROBERTS: Are you saying we should be removing private sector involvement in social media, restricting it in some way, limiting it or opposing it?

Ms O'Shea : Social media companies are private sector companies. Am I proposing that they should be regulated? Yes, I think they should be subject to Australian laws. That can include laws around content moderation. We do that in relation to online safety, for example—

Senator ROBERTS: So you're not saying we should remove the private sector companies?

Ms O'Shea : I'm not even sure how that would be practically possible, I must say, Senator. Maybe I've misunderstood the question.

Senator ROBERTS: So you're suggesting more regulation?

Ms O'Shea : I think there's lots of regulatory reform that the parliament could engage in that would be effective in improving people's fundamental rights, including as directed to social media companies, yes.

CHAIR: Last question, Senator Roberts.

Senator ROBERTS: Could that involve understanding the algorithms and the behind-the-scenes activities of these companies, such as shadow banning?

Ms O'Shea : That, indeed, is one of the proposals in this bill that I think is encouraging. We talked earlier in this hearing about how there would be greater transparency around how content moderation decisions occur and how amplification occurs on social media platforms. At the moment there's almost no line of sight on that except through very conscientious efforts by journalists and academic researchers. So we do need greater transparency in how social media platforms make the kinds of decisions that give rise to algorithmic amplification of all sorts of content, such as mis- and disinformation but also polarising and extreme content. I think it would be a worthwhile activity to include greater transparency in reporting to give people an idea of whether these businesses have earned the social licence to operate.

Senator ROBERTS: Thank you.

CHAIR: Senator Rennick.

Senator RENNICK: Hi, guys. Given that the government is effectively forcing foreign social media companies to censor Australians, do you think the government should also provide an avenue for those Australians to appeal a decision of the foreign social media companies if they are censored? Or is the government effectively going to allow social media companies to be judge and jury, with no right of appeal for Australian citizens? That's effectively what this bill is doing.

Ms O'Shea : I am happy to try to answer that question. I'm not entirely convinced by the premise, I must say. My understanding of the process is that ACMA requires, or has the capacity to require, the creation of codes that then determine how content might be moderated, with a view to regulating for mis- and disinformation. That is a process that would involve engagement between the regulator and the industry. I would agree with my colleagues from ACCAN: in these processes, there's rarely sufficient representation for consumers, or for people who advocate for human rights or civil society generally, in the creation of those codes. That is a problem because, in general, civil society is under-resourced, and it becomes a situation where industry is very well resourced, and the regulator, at times, is as well, and there isn't enough of a role for civil society to play in being a voice for people who use these services.

In that sense, I think it is worth recognising that that process of co-creation needs to have in it some accommodation for citizen voice that is properly resourced to do that work. If you're asking whether there should be an appeal mechanism for people who may have had content taken down on the basis that it doesn't meet the obligations—

Senator RENNICK: Yes. That's what I'm asking.

Ms O'Shea : or stipulations of the code, I think there is a role for the existence of that kind of appeal process. At the moment, on these platforms, people are deplatformed all the time—and often that's for reasons that fall on both sides of the political spectrum—and that is a problem, so I think there is some role to be played in that kind of appeal mechanism. But if you are going to follow through on this process and draft the codes right, you'd minimise the creation of that problem and the need for that appeal mechanism.

Senator RENNICK: That leads to my next, and last, question. Should those codes and the algorithms used to censor people be more transparent?

Ms Bennett : Yes. We would certainly suggest that, yes. The situation at the moment is that there is no clear sign of how decisions are made by those companies—

Senator RENNICK: Yes, exactly.

Ms Bennett : in relation to what is happening.

Senator RENNICK: Yes. Fact checkers. I agree.

Ms Bennett : What's in? What's out? What are the algorithms? What are the decision-making points? We want greater transparency on that. This bill does potentially elevate that. We perhaps suggest it's not—

Senator RENNICK: Many of these fact checkers are actually contracted by the social media companies. They aren't Australian either.

Ms Bennett : Yes.

Senator RENNICK: They're in another country, and it's often on another topic or another context.

Ms Bennett : Yes. Agreed.

Senator RENNICK: Thanks. That's all.

Senator HANSON-YOUNG: A lot of my questions have already been asked, but I do want to get to the crux of this: obviously, it seems both the organisations think that this bill needs improvement and specific amendments. That notwithstanding, there's a lot of other things, like the lack of transparency on algorithms across the board and the protection of individual users from having their data scraped, sold and monetised. I'm particularly concerned in relation to children. We hear the government and the opposition talking about kicking young people off social media. How about protecting their data from being sold by these parasitic companies? How important do you think it is that we deal with the data issue alongside any further regulations?

Ms Bennett : It's absolutely fundamental. The economic drivers of mis- and disinformation can't be ignored, and that's why we'd agree with Digital Rights Watch around the importance of all of the privacy review recommendations being implemented in full. Fundamentally, if you remove those economic drivers, you remove the incentives for that kind of collation of information and that kind of incentive to on-sell to promote particular forms of information and incentivise that. That would be our view.

Senator HANSON-YOUNG: Today, the government has made some announcements in relation to the social media reforms relating to young people—that platforms might be able to keep young people on the platforms if they have an age-specific interface. Do you think minors should be protected from having their use data, their behaviours, scraped, monitored and monetised, regardless of whether they're on a youth platform or on an all-ages platform?

Ms O'Shea : I'd answer your question in two ways. It's a bit tricky, but what I would also say about the youth version of platforms is this. We have looked at how this has worked in other settings; we are still working through it in Australia. There are examples abroad where they include settings that many adults would benefit from—including, for example, removing push notifications, collating and things like that, but also the use of things like dark patterns. So there are a lot of ways in which reforms that we have seen that are directed at children would also benefit adults. I think that is worth thinking about, because we don't just let people come into a toxic environment when they hit a particular age and assume they will be fine. I think we do need to clean up the ecosystem as a whole.

More generally, I think we should have a prohibition on the commercial exploitation of children's data, absolutely. That's an initiative we should look into. Now, there may be some need for certain kinds of exemptions, and I think we should consider that, but the imperatives, the incentives, created by data-extractive business models are very harmful for individuals and for society at large. For children, who are specifically and particularly vulnerable, those kinds of reforms take on a greater urgency and weight, and we should be exploring those options. Until we tackle that underlying business model that is focused on commercial exploitation of personal information, we are only looking at the symptom and not the cause and we absolutely have to prioritise that with urgency.

Senator HANSON-YOUNG: When it comes to this piece of legislation—because we're going to have another piece of legislation, presumably, to deal with the youth social media engagement—I am sceptical. Unless we protect young people's data, this is just another platform for big media to make money from our kids, really, and pretend that somehow we are looking after them. A complacency from government will set in if we don't actually protect them from that business model, as you say.

Going back to this piece of legislation: at the end of the day, our committee is going to have to recommend whether this passes or not. Is it worthwhile as it is, if there were no change, or is it a pretty useless piece of wet lettuce?

Ms Bennett : That's a really good question! You've heard our reservations about it. We don't think it's going to particularly work. We feel it needs to be strengthened, and the way you would do that is to create a disallowable instrument rather than a code process for doing that. We think that, fundamentally, the best approach would be to address some of the fundamental economic drivers, which would be privacy protections and implementing those in full rather than trying to address this on the fringes. If you can stop the problem at the source, it's always going to be better than trying to create—

Senator HANSON-YOUNG: The whack-a-mole?

Ms Bennett : Yes, absolutely. So that would be my thought. Did you want to add something, Gareth?

Dr Downing : Yes. As it's currently drafted, it would be ineffective, in our view. Industry codes, from our experience, have not been an effective mechanism for driving compliance or any change in behaviour. I mentioned some of that penalty data earlier, and I won't go through that again, but, fundamentally, if this is a self-regulatory code, industry will draft this to minimise their obligations. There's a really strong commercial incentive to be narrowly regulated, and, if you hold the pen, you will absolutely seek to have rules that you can comply with or, at the least, that a competent member of your industry can comply with. From that perspective, you are likely to end up with codes that are not terribly effective. They would be difficult for the ACMA to enforce, even with the powers they have under this framework. In addition to that, you add in the powers that are not terribly strong and the penalties that are not terribly strong either. This is likely to lead to an outcome where the ACMA will have very weak enforcement powers. We will see quite a lot of directions and quite a lot of letters to industry saying, 'Please comply.' I don't think that that's going to be effective in addressing the substantive issues.

Ms Bennett : That's what we've had under the Telecommunications Consumer Protections Code. How many directions were issued in that 13-year period? There were thousands of them, but very few actually—

Dr Downing : It's hundreds.

Senator HANSON-YOUNG: ACMA tends to be a bit of a toothless tiger, doesn't it, in these other spaces?

Ms Bennett : Yes.

Senator HANSON-YOUNG: Ms O'Shea?

Ms O'Shea : It's hard to recommend that this bill be endorsed without the changes that we've identified in our submission. We are in a situation where it is a bit too much but also not enough at the same time. I remain very concerned about some of the potential for overreach, in ways that could be easily accommodated in amendments, including putting limitation, transparency and reporting requirements on ACMA. But I also remain concerned that this method of regulating for misinformation and disinformation is not going to deal with the fundamental problem that creates it.

I think there's real utility in creating spaces for people to research and understand how these business models work and how these content moderation decisions are made by platforms. I wouldn't want to underemphasise that, and I wouldn't want to underemphasise that I also think it is very important for the government to be taking action on this and putting forward regulatory proposals. But it would be remiss of me not to say that changes should be made in order for it to be not just effective but also safe and human-rights compliant.

Senator HANSON-YOUNG: I met with the minister yesterday about this piece of legislation. It was put to me that this legislation—as it is—would be 'world leading'. Is that correct?

Ms O'Shea : That's a good question. I may need to take that on notice. 'World leading' is an elastic term, shall we say. I'm not exactly sure what might be meant by that. There are other places where misinformation and disinformation are regulated. It comes under an e-safety regime in the UK, for example. I must admit that I haven't considered how that might relate to or translate across into this context and this proposal. I might need to take that on notice, although I would caution that I'm not entirely sure I know what is meant by 'world leading'.

Ms Bennett : I agree with that. I'm not sure what the definition of that is. If it means the first of its type, perhaps. But then: is it effective? I guess that's the question we're all interested in answering. I can't say more than that.

Senator HANSON-YOUNG: As we've seen, there are other regulations in other parts of the world that deal with some of the underlying issues. If we are not doing that, you might be running your own race, but you are not necessarily leading the race that needs to be won.

Senator DARMANIN: This question is to Ms Bennett or Dr Downing. Noting what you have just said, with some amendments and adjustments there are some benefits to be gained from this proposal. One of ACCAN's recommendations is to include further examples of consumer harm in relation to markets, such as:

… 'preventing or responding to misinformation or disinformation on digital communications platforms that is likely to cause consumer harms or undermine consumer trust in the digital economy'.

What sort of harms did you have in mind, and how would a government ensure that a definition is not too broad?

Dr Downing : In terms of ensuring the definition is not too broad, I would encourage a consultation and engagement process on any draft language in the usual way. In terms of the particular harms we are talking about, we are probably starting to touch on some of the issues around dark patterns that have been mentioned. The way in which information is presented can distort your perception of whether something is a good product or service. Obviously, there are restrictions around this with misleading and deceptive conduct, but there is a whole level of lower-order conduct that is quite pervasive in terms of online marketing. It is touched on in other regimes in the UK, Europe and so on. It's about really touching on what is verifiable, accurate information and the standards around that. The ACCC does a really good job, with respect to misleading and deceptive conduct, but it can only do so much with the tools that it's got available. So I think it will be going back to some of those digital platform inquiries and looking at some of the recommendations from there.

Senator DARMANIN: This is probably for both organisations, because I think you've both touched on improvements around reporting and review as a way to strengthen the proposals. One of the recommendations, I think, in the Digital Rights Watch submission is that there be formal mechanisms for the review of power exercised by ACMA, by the parliament, on a regular basis.

Given the bill already has in mind a requirement that ACMA prepare a report every year, to the minister and to the parliament, there is the ability for parliament to disallow codes registered by ACMA or standards made by ACMA, the ability for certain decisions of ACMA under the platform to be reviewable and general accountability of ACMA through parliament and Senate estimates et cetera. What did you have in mind, in terms of how you would strengthen that? This is for both of you.

Dr Downing : In terms of strengthening, I think the ordinary process, with respect to any decisions made, with respect to registering the code, would be articulating the basis for accepting or rejecting the code. That would form the basis of any determination about whether the exercise of power has been legitimate. I note that is pretty standard practice in other contexts. The provision of reasons is really important, from a legal standpoint, for that accountability. I think that's one aspect that could be touched upon.

I'm conscious that those reasons are probably available to government through various processes and accountability mechanisms but they're not publicly available, generally speaking. They are for different types of tribunals. The ACMA doesn't generally publish those written reasons, to my understanding. Given the nature of these particular provisions, this is something you would want to be looking at, because it has a very important human rights implication. That transparency, therefore, is very important.

Senator DARMANIN: Ms O'Shea, do you want to add anything?

Ms O'Shea : What I would add is not necessarily about the code-making process, although I would reiterate what I said before, which is that I think these kinds of processes, if they are where we end up, are only effective if there is a proper voice for people who are users and participants, in these ecosystems, and need to have a say in these processes.

The second point I want to make is that I don't think the transparency request, on our part at least, was just about code-making functionality or reporting on that kind of activity. It was also on the information-gathering powers that ACMA has that may justify how you create a code but also may justify inquiries about noncompliance with the code or whatever it may be.

I'll repeat that we have almost no information about how social media business models operate to amplify certain content. The technical tools that were available for this no longer exist. We are instead reliant on expensive and difficult efforts by various people to do this work, which is extremely valuable. But it seems a terrible shame, and also practically silly, that we don't have greater access to this kind of information about how these decisions are made.

If a regulator were to do that work, the regulator should be under an obligation to share that with either trusted researchers or, I would think, even the public, some form of that for the public, so that the public can know what they're dealing with when they use the platforms and also be better informed when governments put forward proposals for altering how companies are required to make these decisions or impose regulations and, ultimately, make a decision about whether they want to be on there, whether they want to let their children on there, for example.

I think that kind of transparency is something that might be available under this bill. With some improvements, I think we could get to a situation where ACMA is required to publish that information, and that would be of benefit to many people.

Ms Bennett : I would add that I agree with Lizzie's point that consumer involvement is absolutely fundamental and critical. That is one of our key concerns in relation to the TCP code: the very direct involvement of consumers, in all their diversity, on the issues that affect them, and their capacity to have a say about how the system works for them. That is what the system is supposed to serve, not just industry wishes, which is pretty much where we find ourselves at the moment.

Senator DARMANIN: Thank you.

CHAIR: Thank you. I know we've kept you over. I do appreciate your time. It's been very, very helpful. You have taken some questions on notice, and we would urge you to return those by 18 October. Also, you are going to send us through your opening statements so that we may publish those and so that the committee can use them for their considerations.

Proceedings suspended from 13:30 to 14:15
