The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> KATIE DREW: Hi, good afternoon, everyone, I hope you had a good lunch. This is the open forum on protecting refugees. I hope everyone is on the right channel. Very reassuring nods. That's great. Thanks. Hello, everyone, thank you so much for your time today. I am excited to have a great panel with me and hopefully a great panel discussion coming up. Today we're going to examine information risks through the lens of forced displacement. My name is Katie, I work for UNHCR. I work for the digital service, on a workstream called Information Integrity, which is looking at how we can strengthen information integrity to mitigate the challenges of information risks online.
These risks directly impact the lives of forcibly displaced people, people who have been displaced within their own countries, and people who are stateless. We also address the impact that information integrity risks, or information risks, sorry, have on humanitarian operations, and obviously this is a very challenging space. So hopefully today we can actually focus on some of the positives and some of the solutions that we've been working on when we look at addressing information risks. So today we are going to look at how we can strengthen digital protection, how we can improve access to reliable information, how we can uphold freedom of expression and, importantly, how we can foster social cohesion and inclusion. So our key question is: how can we do this collectively? And I think that this really speaks to the purpose of the IGF. How can we really strengthen multilateral partnerships, engage with different community members, different groups within societies, to address this challenge? Hopefully this is the sort of exciting panel where we can talk about some of these solutions as well. But just to get started, and I'm going to ask my colleague online to help us with this process, we have a little bit of forced audience participation. There may be some resistance to this after a heavy lunch, but please give us your views and ideas. We're going to do a Menti. This might be a familiar process to you.
If not, go to menti.com and enter the code, and hello to participants joining online. If you can't scan the code with your phone, go to the website and enter the code, and a couple of questions should start coming up. I will just wait for everyone to take a photo of the code. Please wave your hands if you're still taking photos of the code. If not, we'll move to the first Mentimeter question. In your own words, can you tell us what you think we mean when we talk about digital resilience for forcibly displaced communities? Hopefully we'll have a lovely word cloud up there. We'll give a few minutes for those online and those off. Thank you to whoever was the first person to have the courage to hit send.
Brilliant. We'll wait for a few more answers to come through. I see access is coming across quite strongly there. That's very interesting. Protection. Regulation, we'll hopefully touch on a lot of these topics today. Freedom of expression, safety. Safe care. It keeps jumping around. Access to information. I think that's brilliant. Financial security. Safety online. Rights and duties. That's also something that we'll speak to today. Authenticity. So, brilliant. When we talk about digital resilience we're talking about the ability of displaced communities to have access to an information ecosystem that is robust, that meets their needs, that allows them to express their concerns, to tell their stories, to have a voice, in a place where they feel that they have access to information securely and safely. That's what we are talking about when we talk about digital resilience. That sounds great, but we will move to the next question. Obviously there are some challenges when we talk about digital resilience for forcibly displaced people. What might be some of the barriers to that sort of safety, security, freedom of expression and access to the information ecosystem? So just spend a little time thinking about the barriers. Oh, that's interesting. We've got vulnerability coming up. Capitalism. I'm not sure we're going to be able to knock that one on the head today. But xenophobia, that's definitely something we will be able to touch on. Lack of network. Fear and censorship and surveillance. I think that's also coming across quite strongly here. Knowledge. Yeah, untrusted. Trust. Trust and safety and access and reliability as well. Okay. So, not wanting to focus too heavily on the challenges, and hopefully moving quite quickly to the solutions, I wanted to introduce Therese for our opening remarks.
So Therese Marie Uppstrom Pankratov is the head of the Humanitarian Innovation Programme at Innovation Norway, our key donor, with whom we've been working on the case study we'll present today.
Innovation Norway supports humanitarian organisations to enter into innovative partnerships with the private sector, and so we also have our private sector partner on the panel today, and I'm quite excited to talk about the collaboration we've had in South Africa. Therese is a strong believer in the concept. She previously worked with the Permanent Mission of Norway, the Norwegian Refugee Council, Save the Children and the refugee agency, so she is quite familiar with some of the challenges forcibly displaced and stateless persons experience. Therese, it would be great to hear from you some of the reasons why you were interested in supporting multistakeholder partnerships. Thank you.
>> THERESE MARIE UPPSTROM PANKRATOV: I'm looking forward to this session. I think already this week I have attended quite a lot of sessions where information integrity has been the topic, so I think this conversation will fall well within that discourse and help us talk about how people who are forcibly displaced, and vulnerable populations more broadly, are affected by bad information. This focus is important. I think we established this week that safeguarding information integrity is one of the key challenges of our times. It's augmented by technological development that has made it easier to produce and spread misinformation. In humanitarian operations we talk about information as protection. It sounds a bit like humanitarian jargon, but it makes a lot of sense: we know that when we have access to quality information it helps keep us safe, and when we don't, it poses a significant risk. So as Katie said, I work for the Humanitarian Innovation Programme, a programme financed by the Norwegian Ministry of Foreign Affairs and managed by Innovation Norway.
We're set up to de-risk innovation partnerships between humanitarian organisations and private sector actors that would like to design and develop solutions to humanitarian challenges. We have an annual call for proposals, and about two years ago UNHCR responded to that call, having identified a lack of solutions for combating misinformation and hate speech targeting or affecting forcibly displaced people.
So they wanted to design an innovation process and find partners from other sectors to test the use of prebunking strategies that could proactively counter false and potentially harmful narratives before they take hold.
Safeguarding information integrity in crises has become one of those areas of work referred to as a wicked problem. I think that was illustrated really well just now by the word cloud and all the challenges that you listed. We call it a wicked problem not because the problem is evil but because it is really complex, has a lot of interdependent factors, and there's not one clear-cut fix that can help us address it. So when we want to solve key challenges like this, we need innovation and we need partnerships. We cannot go it alone. Solving problems like this requires a deep understanding of the stakeholders involved and deep insight into various technologies or other possible solutions. It requires an innovative approach, characterized by dialogue with actors from various sectors and areas of expertise. And it requires trust-based partnerships and collaboration, along a process shaped by design thinking. It's really complex, and in this case the partnership was shaped by the needs of people affected by crisis who are forcibly displaced; by deep insight into social media, artificial intelligence and other technologies that are used to spread misinformation; and by deep insight into the behavioral sciences that can help us understand how people establish trust in the information they seek or receive. One actor alone will not be able to master all these skills, so a good way forward is to design a multisectoral innovation partnership to address the challenge in an appropriate way. I think traditionally we often thought of innovation as a fairly linear process where you have the problem and then it is solved, and this has caused a lot of disappointment, because it normally doesn't look like that. I have seen a few in my work on humanitarian innovation, and normally an innovation process is a lot messier than that. They go in loops and circles. They require multiple rounds of testing, redevelopment and so on.
And when it comes to wicked challenges, the solution is also most likely not one shiny new thing. It is often multiple processes, partnerships and technologies that, when they come together, help us address the challenge at hand. So this means that in addition to developing various solutions we also need to develop an ecosystem of partnerships and solutions that can come together, and this is not an easy task. What I think is particularly inspiring about the initiative that we'll hear more about today is how these various partners have come together around a common challenge, bringing their various expertise and asking: how can we strengthen the digital resilience of people affected by crisis, keeping people safe and protected from harm? We will hear from across the sectors involved, and together their insights create a unique basis that can help us develop solutions. I think the panel discussion today will help us understand both how we can support the digital resilience of people affected by crisis and how we can shape innovation partnerships to solve big challenges. So I very much look forward to hearing from the panelists.
>> KATIE DREW: I will do my best to introduce our speakers. I will start with Mbali. She works for a UNHCR multi-country office and she's based in Pretoria, where she supports social cohesion work in South Africa.
She's a strong advocate for the rights of refugees and helps to address the protection challenges that they face, really ensuring a community-based and community-led approach, so hopefully Mbali is going to touch on some of the ways in which we tried to bring the community voice into the project. On her immediate left is Michael Power. Michael is a public interest lawyer. He's a managing director and cofounder of ALT Advisory in South Africa. He serves as the chairperson. He specializes in policy development, with a focus on technology law, information rights and digital governance, and he works to advance constitutional rights and good governance in the digital age. At the end of the table, to his left, we have Likho Bottoman, an official within the South African Department of Basic Education. He leads on social cohesion and equity in education, and he's dedicated to advancing inclusive, equitable and socially cohesive schooling in South Africa. And to my right I have a technology and innovation leader currently serving as a managing partner at Co-creation Hub, where he oversees the design lab and supports several of the hub's workshops. He has a master's in public policy, with a focus on technology policy, from the KDI School of Public Policy and Management, and is a Ph.D. researcher in creative technologies at a university of technology. So to start our conversation today, I am going to pass to Mbali to provide us with an understanding of some of the information risks and digital protection risks that we're talking about.
Specifically in South Africa, as we highlight that case study to begin with. Mbali, would you like to give us an overview of those challenges?
>> MBALI MUSHATAMA: Thank you so much. In my experience working with refugees in South Africa, we have observed over the years a rise in misinformation and hate speech particularly targeted at foreign nationals, and we see a rise in this especially towards election periods.
But we also have to look at the context of South Africa. South Africa is 31 years into its democracy, and during the early stages of its democracy a number of commitments were made around equality, inclusion and access to resources. While South Africa has made significant strides in achieving this, we must recognize the gaps: unemployment stands at around 32%, and public resources are limited. We find that where there are limited public resources it can create a sense of competition, and that can result in a lot of social tension, which is what we have observed within the host communities where refugees particularly reside. And so foreign nationals, including forcibly displaced persons, are oftentimes used as scapegoats for socioeconomic problems in South Africa, and we have seen a rise in many online groups such as Put South Africa First as well as Operation Dudula. Dudula is a term that means to push out or to push away, so these are groups that will use trendy words
and trendy phrases to gain momentum and gain traction, and they have used these online platforms to incite violence and looting in the host communities. Just to highlight the seriousness of the hate speech that these groups perpetuate, Operation Dudula was recently taken to court by civil society organisations
so they can be held accountable for some of the actions they are perpetrating. So we've seen a direct correlation between their online incitement to violence and its manifestation in the host communities. Unfortunately this also trickles down to young people, who are especially sensitive to this type of rhetoric because they encounter it both in the online space and in their communities. You will find that refugee students, for example, have reported cases of xenophobically motivated bullying and targeting in schools. These are refugee children who were born in South Africa, or perhaps fled to South Africa at a very young age; they identify as South African and even speak the language, yet when they get to school they find they're being bullied because they're considered as not from South Africa. We also have comments or speeches made by prominent figures,
and these tend to be influential in inciting violence within the communities. So I think the context of South Africa made it a good case study, because there's a lot of research done on this,
but we also have a lot of real-life examples. Thanks, Katie.
>> KATIE DREW: Thanks, Mbali. Likho, maybe we can talk about the social cohesion challenges in schools and the role of digital information risks in that environment. Thanks, Likho.
>> LIKHO BOTTOMAN: I want to start with the fact that some of the issues we find in South Africa relate to South Africa itself being a very multicultural, multiracial and multilingual country, and that on its own brings a set of innate challenges of diversity management. When you add foreign nationals into that whole compound of diversity elements, you find that there are already existing complications, and the anti-foreigner narratives then place themselves at the center of all of those complications. The second thing we find is that even though as a country we have agreed that we will use schools as centers of life, where young minds are being molded and prepared for an inclusive society, school alone, the curriculum alone, is not going to solve the problem, because the curriculum is only delivered in the classroom during school hours. This child then goes back home, where these narratives are perpetuated, and also into cultural and religious spaces in the community where it is advocated very strongly that anti-foreigner narratives are actually religiously correct. So our curriculum alone is not able to help us shift the mindset, and not just on issues of people of foreign nationality, but even on other issues related to HIV prevention or unintended pregnancies; this issue is not immune to the dynamics we find in South Africa. So we've got a greater task as the basic education sector to think about education beyond the classroom, and to understand ourselves as playing a role in educating not just the child in the classroom in front of the teacher, but the country, because we are a basic education department.
And that is going to take a while, because on the one hand there is the creative thinking around positioning ourselves as a public education entity, but on the other hand there's this view that says it's not your role to guide the value systems of the country, it's not your role to guide the belief systems of the country, and so you need to start and end with the core business of education, which is literacy and other school subjects. So we find ourselves in a tug of war quite a lot, because we've got to play this role of helping the country move forward while at the same time understanding how far we can go and what our limits are as a sector.
>> KATIE DREW: Thanks, Likho. Maybe I can follow up on the point about digital resilience. When we said we wanted to work on the concept of strengthening digital resilience, why did you think that was valuable, and how did you think it might support the national action plan? Maybe, for the purposes of the audience, just outline that a little bit for us. Thank you.
>> LIKHO BOTTOMAN: Well, there has long been a belief in South Africa that we have a digital divide, we have inequality and so on, but people who have actually undertaken research about access to technology in our country are coming up with some very interesting data that says even the most rural people have access to technology, in spaces where we never thought there was access to technology. So for a very long time as a government we didn't think that we needed to address technology-facilitated discrimination of any kind. But what is happening now is that our population is growing ahead of us. Because they've got access to technology, they are already absorbing the misinformation and disinformation. So it is up to us as government to rethink how we see ourselves and how we see our country, and to begin, perhaps intentionally, to work on the disinformation and misinformation that exists in digital spaces, because children are already there. South Africans are already there, and if we don't act, by the time we reach them we will have lost them to the misinformation and disinformation.
>> KATIE DREW: Great. Thank you, Likho. Michael, I will pass to you now, and for the technicians in the back, I believe we'll have a couple of slides. We've been talking about this project in South Africa and we need to outline a little bit of what we mean. Could you walk us through some of the approaches we've been testing together? Thanks, Michael.
>> MICHAEL POWER: So thank you, Katie, and to the entire UNHCR team as well as Innovation Norway for hosting us on this panel.
And maybe thank you to the technical team in the back. This is my first ever silent seminar and you're doing a wonderful job, so thank you for that. You know, flowing from what Likho said, we've been working on this for 18 months, and we were simply asked: how do you change children's perspectives? This was ultimately the macro question we were asked, within the purview of anti-foreigner sentiment in South African schools, and we really went to the drawing board. As the parents in the room will know, changing a perception is not easy, particularly when someone is in an echo chamber and the beliefs they hold are being reinforced within the community, by their parents, and potentially by their educators as well. As was said, we have a long history of xenophobia for multiple socioeconomic reasons, and the fact that xenophobia is present in our communities cannot, I think, be dispelled at this stage. So we went to work. Talking about internet penetration, I think context is always really important to understanding any situation: while internet penetration rates in South Africa are increasing, the biggest challenge in schools is access to the technology and to state-funded technology. So our initial brief was to determine how you prebunk anti-foreigner narratives.
Can I pause there? Just by a show of hands, I know it's after lunch, who is aware of the concept of prebunking? Okay. So it's relatively novel, and it's really what it says on the box.
It's the opposite of debunking. In the online space, information comes out that may be misleading or incorrect, and you go and debunk it. Prebunking is the opposite of that: you try to get ahead of the narrative before it takes hold.
So you really have two types of prebunking. The first deals with common narratives.
In this context, a common narrative in South Africa is that foreign nationals are criminals. Right? It's a narrative we've heard in a lot of national political discourse, and this is the target of narrative-based prebunking. You then have logic-based prebunking, where you're attacking the manipulation technique itself, for example emotional manipulation for political purposes. So these are the two underpinnings, and a lot of prebunking work has been pioneered by Google's Jigsaw project. The approach, in its basic form, has three parts.
You have a warning, where you point out narratives that are occurring in online spaces. You then have a preemptive refutation, where you explain why those narratives are incorrect. And then lastly you micro-dose: a small, controlled exposure designed to counterbalance the narrative. The micro-dose is meant to be just that; it's not meant to further perpetuate the harm.
So you're really and simply trying to get ahead of the story, to get ahead of the narrative. When we turned to the South African context, we had hoped to start digitally.
You know, a lot of this programme is around digital, but we found in consultations with learners that penetration rates were too low,
and access to technology was not available at that stage. So we got thinking and came up with the concept of a board game. I'm not certain what slide is on the screen, but we can flip the slide. Our game is called Mzansi Life. For those who have played snakes and ladders before, it's a similar type of game.
You become a character, and throughout the course of the game we gently micro-dose anti-foreigner narratives. You have the ups, you have the downs, and you are rewarded for good behaviour: you go up the board, you move towards your future, while for problematic behaviour you move backwards. The difficulty and the challenge is calibrating the narratives. If you're too blunt about the situation, you lose your audience.
It's about subtlety, and about ensuring that before children are exposed to these types of narratives they already have some type of information to counter them. We learned today that Norway has a critical thinking day, where critical thinking is promoted in schools throughout the country. We don't have that day.
And it's something for our department most certainly to think about. But this type of educational material, the board game coupled with the facilitation guide, is the approach we've taken to prebunking the anti-foreigner sentiment that we're seeing online.
If you go to the last slide: for us, what has been most telling is that by adopting a multistakeholder approach and ensuring extensive consultation, the early results of testing this game have somewhat exceeded our expectations. When we initially started piloting, we thought it would be just another game, something thrown in the cupboard and not used, but when we started rolling it out in our test groups we started to see really fantastic results. The most important result is the one at the bottom, and I will work back from there. On the question of perception, learners were surveyed after playing the game and going through a facilitated discussion on anti-foreigner sentiment. For the statement, "Some people online are trying to influence me by using emotional or shocking messages that spread quickly," after a three-hour engagement with learners, agreement changed from 43% to 86%. I can't personally remember when my perception last changed by that much. So gamifying a concept combined with facilitated learning seems to be the magic ingredient. The traditional notion of prebunking, watching a 50-second TikTok-style video, is still the mechanism most often rolled out.
We've seen this more substantive approach yielding significantly better results. Katie, I hope that gives a bit of an overview of the pilot project, and I look forward to continuing the conversation.
>> KATIE DREW: If you have questions we'll come to them at the end, or please do grab Michael or myself to talk a little bit more about this case study. I have one of the older iterations of the game with me, so if you want to see some of the play cards and things like that, it's worth having a look if you would like a little more information. I would like to bring us back to the engagement we can have with different stakeholders. Michael mentioned the Google Jigsaw approach, but Mbali, can you talk to us about engaging the tech platforms themselves?
>> MBALI MUSHATAMA: One thing we need to take into consideration, can you hear me? Okay, is the fact that refugees are on these platforms, these various platforms. We can't run away from that. The one thing we wanted to ensure is that they feel empowered to use these platforms, that they feel safe enough to use these platforms as well. And since there is an existing narrative, we wanted them to put out their own stories and lived experiences.
We've been closely engaging TikTok. We recently hosted a webinar dedicated to helping us learn more about how users can stay safe while using the platform.
The session focused on equipping participants with tools and knowledge to navigate the platform safely, and it also served as a platform for open dialogue.
So we were afforded the opportunity to directly ask the TikTok team about some of the challenges that we're seeing, because there are certain trends that TikTok might not be aware of. There's a certain kind of misinformation or hate speech that is being spread using coded language: they may not use a particular word, they might change a word into another language. There are eleven official languages in South Africa that can be used to perpetuate hate speech. The TikTok team was very helpful in offering insights into their community guidelines and their different reporting mechanisms. We're also looking at empowering the refugee community to create their own content: how can they start making content to counteract what is already out there?
And how can they do this safely, in a way where they can see themselves in it? One of the things that kept coming up was freedom of expression. Why shouldn't refugees have it? They are contributing members of society.
They have their own lived experiences. For us, it is really about engaging platforms such as TikTok and platforms such as Meta, where we're able to say: we oftentimes have people left behind; how do we bring them to the table so they can tell their own stories, and so they can access safe information in a manner that does not endanger them? Thank you, Katie.
>> KATIE DREW: Thanks, Mbali. I would love to hear more about the reporting mechanisms that the platforms have, if it does come to someone saying they have been directly a recipient of hate online. Do you feel that displaced communities use these reporting mechanisms, and what are some of the barriers that can stand in their way when it comes to reporting?
>> Thanks, Katie. I will start by saying that 80% of the reporting mechanisms from the big techs are afterthoughts. Having worked in technology, I can say that for most of the social media platforms, reporting mechanisms for hate speech or digital violence were afterthoughts, introduced after pressure was mounted by civil society organisations and technology activists. So it's always challenging to even create awareness about some of these tools among the people using the platforms. But also, as technology creators, we sometimes tend to create one-size-fits-all types of technology, which does not really help vulnerable people use these platforms effectively, particularly when we talk about internally displaced people.
There are refugees as well, and some of the reporting mechanisms do not really cater to the different classifications that someone might fall under. In the work that we do every day, we've increasingly seen that some of these people are not even aware of these reporting mechanisms. But more important still is familiarity with what counts as hate speech or an offensive opinion in the first place. We've worked with people who are so emotionally battered that even when hate speech is used against them, they are not aware of it; they are no longer emotionally sensitive to some of these things, and this adds complexity to helping and supporting internally displaced people and refugees, starting from their emotional state. Number two is the language and localization of some of the reporting mechanisms. That is why we begin to see organisations creating help desks offline, where people can come and make reports, and these help desks then escalate to the platforms where those things have been perpetrated. I think it speaks to what you were saying: foreigners, internally displaced people and refugees come into a new country, and sometimes they don't even understand the language being used for hate speech against them. We've also seen in our work fear of retaliation: a lot of people who have experienced hate speech or digital violence don't want to report because of fear of retaliation, or because of the power play involved with those who have used those words against them. A particular situation we've seen, and this is a co-created story with some organisations we worked with lately:
in a particular IDP camp in Nigeria, one of the wardens, the people in charge, was making sexually violent comments towards a particular young lady. And it comes with intimidation; they don't want them to speak out. So it took another IDP
to report to law enforcement, and all the law enforcement did was ask whether the lady had been suggestive in her behaviour. You know? And that discouraged the lady from reporting. So it also comes down to the responsibility of those we have appointed to be in charge of refugees or IDPs in the first place. There's a wide range of challenges facing refugees and internally displaced people, but I think the most important is the localization of the platforms. For example, we addressed a situation of revenge porn, or rather, image-based digital violence, on a particular platform, and we had to work with two other organisations to escalate the situation to the platform, because the person didn't want to use the platform's own reporting tools, given a long history of things being reported in the past and nothing being done about them. So it shouldn't be a case of afterthought. These mechanisms have to be built with integrity, they must build trust in people, and they must be contextual in the way that people use them. In the kind of work we do, we have a common center where people who trust us can come to make reports, we escalate to the platforms, and those who need it can get psychological support as well.
>> KATIE DREW: I was going to ask about some of the practical steps to improve these reporting mechanisms, along with localization, and I heard you talk about the partnerships you're working with as well. Is there any other way the tech sector can do better? Is it by engaging local actors?
>> I think they need to engage the local actors. The trick is that when there's a breakdown in trust, people don't want to use these platforms to report directly, and if people have a structure they already trust, either in the community or with other civil society organisations that they are comfortable speaking with, I think big tech should be able to come down and work with those organisations to address these issues better. There's a lot of underreporting happening because of these trust issues, and also because of the way law enforcement in different contexts is addressing some of the issues. So there should be a community approach, a report-the-offender kind of situation in those communities. We organize people on WhatsApp, and people can reach out via other platforms as well. But the truth is that in every local government area there are at least 25 civil society organisations or local actors who are genuinely interested in some of these issues, and they are a great gateway for the big techs to get some of the reporting. And we should not limit this to what is happening on platforms online; there are offline situations as well that can be escalated via independent platforms outside of social media, and I think we should build more platforms that encourage people to come out and speak about these things. On social media, either you resort to counseling people or you keep quiet, but independent platforms can take offline situations, and I think we have more offline situations: physical words, violent words, actions spoken against displaced refugees. Those should be escalated, because when you go out to work digitally, you see that some of these people are not even on Facebook or on X or on any social media. But the offline side is even bigger in the work that we do every day.
>> KATIE DREW: Great, thank you, thank you so much. So we talked about reporting mechanisms, and Michael, I'm going to put you on the spot with a long question. We talked about digital resilience and how we can create an environment that supports the digital resilience of refugees and asylum seekers. How up to the challenge is the policy and regulation environment when we look at that question? Are they protected currently when it comes to policy and regulation?
>> MICHAEL POWER: Thanks, Katie. I mean, it's a complex question, and I would really welcome a conversation with colleagues in the room who work on this and have different perspectives. From the context we're in, I think the practice is really the challenge, and I will start there. The practice in supporting vulnerable groups who are subjected to hate speech and disinformation is wholly inadequate. It's often revictimization. We've heard a series of lessons learned, and it's not only the platforms that are the culprits: from police stations, whether it's a refugee or a survivor of the sharing of a non-consensual image, the practice is simply not to involve people, or to revictimize and victim-blame them through a series of processes.
So given that the platforms have increased their dominance over an extended period of time, the state response has been practically wholly insufficient, and I think that is informed by the regulatory landscape. Regulating hate speech is difficult, right? We're seeing as we speak that Twitter is challenging new laws, sort of, you know, hiding behind the hate act. In our jurisdiction in South Africa we do have legislation. It is somewhat enforced, but I think that secondary vulnerability that a refugee has, the ability and the enabling space to come forward and report in the first instance, is something that we haven't got past at this stage. So on policy, at least from the South African perspective, there's no specific policy that looks to refugees or internally displaced people on these particular questions. The broader framework is emergent, but it's very much whack-a-mole and scattershot. We're dealing with contributions here; we're trying to look into platform power through competition policy. There's nothing harmonized that is really there to create the supportive environment, and for me, I think the state needs to play a far bigger role. Colleagues have referenced independent mechanisms that are being used for reporting. One of the partners on our project, Media Monitoring Africa, runs a platform called Real411, which is an independent platform that you can report to. Again it's a question of scale, and it's that scale where I think the platforms, when it comes to content moderation or the erstwhile concept of fact checking, which is a big problem we have at the moment, coupled with these independent platforms, just don't have the capacity to get through the volume. So while from a legal standpoint, at least in the South African context, there are safeguards in place, those safeguards are severely impeded by the unwillingness of those empowered to pursue the rights within them.
>> KATIE DREW: I will turn back to you, because I know you also work on policy and regulation and governance actions. Are there any recommendations that you would make to try to address some of these challenges? I know we spoke about the challenge of reporting and revictimization, but at the policy and governance level do you see any positive steps that could be taken to build that resilience?
>> Recently we had invitations from a number of organisations, but also some local actors, on how we effectively make policies around anonymous reporting that work. I think there's a lot of fear, given the level of violence and how deep the situation is. And I like what you said about revictimization; we've seen that over and over again. You know, a process that started with rape threats made toward a particular person. And they don't report. The rape eventually happened, and then they eventually reported it, or somebody supported and encouraged them to report, and from the law enforcement side, it starts with a derogatory comment. This person was shamed right from the police station. What do you expect to happen next? So in most of these cases the question is: can we make anonymous reporting effective? And can we also introduce policies on accountability for people in charge of vulnerable people? In most parts of the world, not just Africa, this is very contextual and subjective. We've also seen situations where the people who are supposed to be in charge have their own independent way of seeing these situations, because we don't have a lot of policy frameworks to address some of these issues, so you judge based on what you feel: well, that is not rape, or that is not violence, I don't believe it. A lot of people feel that way. So, taking lessons from some of these particular issues: number one, make anonymous reporting very easy and effective; number two, accountability for the people in charge of addressing some of these issues in government and law enforcement; and number three, the localization of the platforms. We've seen reporting platforms that are only in English, but if somebody only speaks a particular language in a part of Nigeria, how do they report? And independent organisations should have different mediums of reporting as well. I think policy in these three areas can go a long way to get us started.
>> KATIE DREW: Let's pass back to you, as we heard about the practical real-life case study, and we also heard about the role of reporting mechanisms and regulatory policy. But you're engaged directly with refugees themselves. What particular ideas have they given you to strengthen their own digital resilience?
>> MBALI MUSHATAMA: I think working with them on a daily basis has shown me how resilient they are. They are traumatized because of the different things they have had to go through to find a country of asylum where they can be accepted and safe, and oftentimes you try to avoid retraumatizing them. You don't want them to tell their story over and over again, because that's their lived experience, and no one wants to relive those events. What they want is safe spaces to tell their stories. They want safe spaces to have an open dialogue. They want safe spaces to report any activities or incidents where they feel they may have been violated, where someone may have violated their human rights. What I have heard over and over again is that it's great that we meet on these platforms to discuss these issues, but more than ever, we cannot make decisions for them and about them without them. They want to be included in these conversations when policies are being drafted. They want to be brought in, because unless you are a refugee with that lived experience, we can't really dictate what works and what doesn't work. So for me, in my conversations with refugees, the number one thing is to create a safe space where they can bring in their ideas, because they've got a lot to share. They have a lot to say. To also echo my colleague: accessing reliable information in a language that they understand. So many times we have seen policy changes communicated in ways that refugees don't understand, and so they find themselves on the wrong side of the law, sometimes simply because they genuinely did not understand what was being communicated to them. So I think, also: how can they access reliable information in a language that they understand? Lastly, ongoing digital literacy that is context-specific and uses real-life examples. This is different from Nigeria.
How can they protect themselves in the context of South Africa? How do they stay safe in the host communities in the context of South Africa? So these are some of the few examples I will give. Thanks, Katie.
>> KATIE DREW: Thanks. Maybe I will pass one final question back to Likho, if we can. We heard about the challenges, maybe different ways of working in partnerships, and some of the approaches that you have seen, and I know you played the game as well. If you were to give advice or guidance to government counterparts, both in South Africa but also in other countries, thinking about whether there is something they could adapt to build digital resilience, what advice would you give, or what practical caution would you advise?
>> LIKHO BOTTOMAN: I would say, when it comes to making use of digital platforms, and this is again like the board game that we have on this, we need to understand that when we say yes to a fully digital approach, who are we saying no to? When we say yes, we are saying no to the ability to reach those that Michael is talking about who still don't have access to digital platforms. Therefore, if you want to drive a prebunking agenda, you need to use a combination approach.
The one is not replacing the other; they should be complementary to one another. That's the first thing. The second thing I want to say is that the conversation about protecting the rights of refugees is probably not a conversation that should be had only by the country where they are, because there are other international influences that need to be taken into consideration. We need to have a global conversation about it as a global community. The third and last thing I want to say is that, yes, the Fourth Industrial Revolution has put pressure on us to get onto technological and digital platforms, but we also have a responsibility to ensure that when we push children into those spaces, where they need to access information about misinformation and disinformation and prebunking and so forth, we have another responsibility: we have to protect them when they're online.
>> KATIE DREW: Thanks, Likho, thank you so much, and I would like to say thanks to the panelists. We have a few moments now for some questions. I think this is the microphone people should come to to ask questions. Don't be shy; there's one there. While people are hopefully making their way to the microphone, I will ask Ondine online: are there any questions in the chat?
>> ONDINE: No questions yet, and I will read them out for you if there are any.
>> KATIE DREW: We'll come back to you. Thanks for your question.
>> AUDIENCE: Hello, my name is Olivia, and I'm coming from The London Story; we work in the context of India.
It's a little bit different, so I don't know if you can answer my questions, but I would also like to hear your experience. We work a lot on refugee protection and we document cases. India is not a party to the Refugee Convention, and there are a lot of different groups of refugees which are also treated differently, especially in terms of hate speech online. We document that refugees in India are systematically targeted by disinformation. They're being labeled as terrorists, and these narratives are often pushed both by state actors and by non-state actors. They're being accused of different things, that they want to grab land, all sorts of things, and this also results in arbitrary detention and communal violence. I would like to ask if UNHCR has taken specific steps to target this disinformation online by state and non-state actors in India. More broadly, how does UNHCR respond when harmful narratives are generated or tolerated by state actors? And I would also like to know what you do in a context where the state did not ratify the convention and UNHCR has a limited mandate, like in India. Thank you.
>> KATIE DREW: Do we have some more questions that have come through? Ondine, are there any on the screen?
>> ONDINE: No, nothing.
>> KATIE DREW: Any further questions? I heard an "um"; please do ask questions afterwards. Thank you for your questions. So, personally, I've been working on the South Africa project, so obviously I'm not able to speak or comment on the India example on this occasion. I would say that when it comes to working in countries where we don't have as enabling an environment as in South Africa, a lot of it is about being seen as a trusted entity, being seen to have access and engagement. A lot of it comes down to making sure that we are able to operate and have information available on channels about what services are available. A lot of what we've been doing at a global level as well is really trying to make sure that people understand what a refugee is, and how we tell these narratives around solidarity. But I think the examples you were describing speak to the examples in South Africa: what are the narratives, and what is the behavioural science behind this? What fear is being exploited here? Some of the narratives we identified in South Africa are really deep-seated; there are hooks and narratives that are easy for people to manipulate, because that's where the fear is. I think my advice would be to bring it back to a behavioural science approach and identify what these grand narratives are and what fears and levers people are pulling, and not to run after the debunking piece, but to try to allow people to recognize that maybe their fear is being manipulated. This is why prebunking has that warning piece at the very beginning: the moment you say to someone "warning", and this has been tested in a number of different languages, not just English, psychologically that person is more receptive. Your fear might be being manipulated, and you can have a conversation that maybe opens up, so we can start to address the issue that in this context refugees are always aligned with criminalization. Michael, I don't know if you want to build a little bit more on that point.
>> MICHAEL POWER: Yeah. I mean, I don't want to speak to a context I don't know enough about, invariably. But when looking at these challenges, and we see it in the U.S., where there's often political alignment with a lack of safeguards, shall we say, on the platforms, I think you need to look for mechanisms in the state that may be supportive. If there's executive support for what's going on, for example in the Indian context I believe the competition authority has handed down a ruling against Android TV, you may need to look at somewhat radical strategies to test these types of questions. Our competition authority is working on this; we've got people in the Department of Education who value the need for this, but then you do have other state departments which simply are not interested in these regulatory questions. So it's really about trying to find those partners at the right time, particularly within the state. I think prebunking plays a really important role, as does civil society. The specifics I can't speak to, but that broader alignment between potentially strategic litigation and policy reform is key.
>> KATIE DREW: I'm just letting the online colleagues know that was a comment from the floor, not on the microphone. There's one more question, and then we can move to Therese for closing remarks.
>> ONDINE: There's a question in the chat.
>> KATIE DREW: Do you want to read the question from the chat?
>> ONDINE: There's a question: how do we ensure that digital initiatives do not overshadow urgent needs like access to food, water and shelter, with many displaced communities lacking basic services? How can we do both without trade-offs?
>> KATIE DREW: Great, we'll take that, and then we can go to the question on the mic.
>> AUDIENCE: Hi, everyone. A very informative discussion.
So I personally worked for one of the big tech companies before, and before that I worked at an NGO in China on educational assistance for refugee children. I actually found that in addition to misinformation, which is inaccurate information, there is some information about refugees that may be true, for example some negative news, maybe some crime or violence that happened, because it could happen with anyone, but that information easily gets spread because of, I guess, the algorithms and also maybe people's cognitive biases. It reinforces this negative image. So I think it's neither misinformation nor disinformation, and also not really hate speech; it's just that such facts may spread faster than the opposite kind. I'm curious, because I think that actually does huge damage to the public's perception of this vulnerable group, whether people on the stage have worked on or thought about an approach that could address this issue.
>> KATIE DREW: Great, thanks. I think we have one last question, and then we can come back to the panel.
>> AUDIENCE: Hi. I do a lot of work around disinformation in South Africa, and I have worked on a couple of projects around foreign influence operations. Xenophobia in South Africa has been a big concern of mine for a long time, so I'm glad to see this kind of work taking place, but what does it look like in the next couple of months? Is there something that's going to continue? Because right now I think it's not as busy as it can be, especially in the online space, but heading into a local government election it's going to start. And the thing is that it doesn't remain online like other kinds of disinformation campaigns; this spreads offline and results in violence and death. So what does it look like going forward? Thanks.
>> KATIE DREW: I'm going to give the panelists one minute each to answer the questions. Can somebody take the question on prioritization? Michael, what does this look like next, as the next steps? And then do you want to take the piece around amplification and the algorithms that maybe run away with some of the bad content as opposed to the positive content? So one, two, three.
And then we'll pass to Therese, thank you.
>> MBALI MUSHATAMA: Can you remind me of the prioritization question?
>> KATIE DREW: Why do we focus on digital initiatives when we need to focus on basic needs?
>> MBALI MUSHATAMA: I think, for me, in the context of South Africa we are fortunate in that our legislation is very progressive: refugees are afforded the right to work, and they have the right to education regardless of their documentation status. They have access to basic services, health care, and social grants as well. So I think the number one problem that we're seeing in South Africa is really just xenophobia, whereby, as much as there is access, there are limited resources, as I previously said, and because of this we have a lot of the host communities saying: we don't have jobs because foreigners are here taking our jobs; our children don't have spaces in school because foreigners are here taking all the spaces of our children in schools. So for us the main priority is not necessarily access to basic services for refugees in South Africa, but rather to ensure that in the country of asylum they are protected in various ways.
We have a huge problem with access to documentation. A lot of the time refugees will try to get themselves documented to access such services and legalize their stay in the country; however, there are a lot of systemic issues with the Department of Home Affairs. This perpetuates the narrative that we have foreigners who don't care to get documented, which further incites violence. This is a problem.
>> KATIE DREW: Michael, one minute. What next?
>> MICHAEL POWER: There are a few things going on. Just speaking briefly to our project, we are into our second phase, which we hope to conclude by August. The second phase is the last round of testing, and we're going to start printing this game as a guide, and potentially a digital game; we're still testing to see if we can pull it off in time. So from the social prebunking approach we are hoping to move this relatively quickly. Intersecting with that, in the South African context there are two things. Our competition authority, in its provisional findings and some of the recent reporting, is likely to recommend an amendment to the act to create a degree of platform liability for the amplification of hate speech. A lot of people are not supportive of that amendment, and this is quite contested in the South African space, and I know our national human rights institution is looking into these questions as well. But there's a lot afoot, both social and regulatory, so we are trying to move cognizant of concerns. It is also a process, but we're alive to this. Thank you.
>> KATIE DREW: I realize that we are strictly out of time, so Therese, I'm sorry, we're just going to skip over the last question, but maybe we can find you after to discuss the point about the algorithms that amplify narratives that spin out and drown out positive content. So I will ask you to stay behind. Therese, sorry, I think you probably have minus minutes, but it would be great to hear a wrap-up summary. I think we have a couple of minutes to hear from you.
>> THERESE MARIE UPPSTROM PANKRATOV: It's great to see the trust-based relationship that has been created amongst the partners, and I think that's really key and essential to an innovation process with impact. It's equally inspiring to hear the fundamental understanding of the need, that you're designing the innovation process around deep insight into the challenges around Information Integrity in the context of displacement. We heard about the rise in online misinformation and hate speech and the challenges, and you emphasized the importance of multistakeholder engagement.
We heard about the participatory process and the solutions, and the importance of localization of digital platforms. We've also heard about the potential of prebunking, which was new to most of the participants, so that's really encouraging. Now, I said in the beginning that an innovation process to solve challenges rarely leads to a shiny new thing, and what we heard about today is exactly this: multiple smaller solutions that come together and create impact. But we also heard about a game, and shiny new things are always welcome, and about the significant impact that game seems to have had already. I look forward to seeing how it is further rolled out. We also heard about the need for safe spaces for people who are forcibly displaced to have their voices heard and share their stories. I hope that's something we take with us as we move forward. I hope you are inspired and that we will have a future opportunity to collaborate with all of you. Thank you.
>> KATIE DREW: Therese, thank you for summing up; I always think that's the hardest task of the panel. I would like to say a huge thank you to Likho, Michael, Mbali and Therese for their participation today. Thank you all for attending, and sorry we didn't have time for every question.
(Applause)