
CSIR develops framework to spot fake election news

By Simnikiwe Mzekandaba, IT in government editor
Johannesburg, 20 Mar 2024
Fake news spread through social media is a worldwide tactic used to sway elections.

With the 2024 elections in sight, the Council for Scientific and Industrial Research (CSIR) is in the final stages of developing a human-centric framework to detect fake election news.

This will serve as an additional layer to the existing online tools, revealed Dr Zubeida Dawood, research group leader in the information and cyber security centre at the CSIR.

Yesterday, the CSIR, in partnership with the University of Pretoria, hosted a media briefing on combatting fake news and misinformation during the elections.

Misinformation, disinformation and fake news are spread online, typically via social media platforms.

With over 60 countries, including South Africa, going to the polls this year, there are mounting concerns about the prevalence of disinformation and fake news, fuelled by the growing use of deepfakes and generative artificial intelligence (AI).

In its 2024 Global Risks Report, the World Economic Forum identified AI-generated misinformation and disinformation as the second most significant global risk, after extreme weather.

Dawood said the CSIR is working on projects focused on curbing fake news during the elections, including the framework, which is nearly complete.

The framework will be available for public use, most likely in a month or two, she noted. “It’s research at this stage, which is also important for us as researchers to package it in a way that your average South African can consume it. Once the work is published, we’ll work on creating it for public consumption.

“We do recognise that technological tools – right now – may not accurately portray the South African context, in terms of various cultural backgrounds, different languages, trends, etc. We want to look at a human-centric framework to provide outreach to some people who don’t have technology. This fosters conversation between communities to verify information.”

Alongside the framework, the CSIR also has the Mzansi Advanced Cyber Security Learning Factory (MACS LF), she said.

“South African elections rely heavily on digital infrastructure, so the MACS LF can empower our IEC workers and officials by equipping them with training on how they can prevent phishing attacks and malware infections that might manipulate the election data.

“Building a skilled cyber security workforce is key, and the MACS LF can assist with this by safeguarding our elections.”

Tale as old as time

Dawood pointed out that fake news dates back to the 1800s. However, in today’s world of vast amounts of data, fake news is distributed quickly via the internet.

Additionally, the surge of AI tools such as ChatGPT is both good and bad. “The good is that there are AI-based tools to check fake news, but the bad is that the same tools can be used to create fake news.

“While AI is there to detect fake news, it’s also harder to detect because these tools can be used to create prompts that generate articles. Previously, we could identify fake news via spelling errors and grammar. Now, everything is perfect and written out for you.”

In terms of the different types of fake news, Dawood noted that within the South African context, this is characterised by satire and parody; for example, when the public makes fun of ministers. “However, there are false connections, misleading content and even fabricated content.”

Faking it

Deepfakes, which are highly realistic fabricated videos or images, are also on the rise and being used maliciously, said Dawood, adding that cyber criminals use deepfakes to spread lies.

“Fabricating deepfakes is time-consuming, but we do think it’s going to get easier as more tools become available. There has been increased use of deepfakes and we need to continue to be active in creating the relevant policies to make sure this is regulated.”

The researcher highlighted that disinformation through social media is now a worldwide tactic used to sway elections. Locally, the Democratic Alliance is seeking advice after one of its MPs fell prey to a deepfake voice recording.

“The video is on TikTok and contains a voice recording of the MP making all sorts of claims. Although we now know that this is a deepfake, there is still the likelihood of lasting reputational damage for the party.”

Looking at country policies, Dawood said South Africa is doing well in some regards. “We’ve got policy work around hate speech and task forces, but we are lacking policy work around elections and misinformation, which is problematic.

“The good news is that we do have some sanctions in place. There will be penalties out there for people who do malicious things.

“To empower our voters, we need to teach people how to be vigilant and discern fact from fake. We need to be aware of common tactics, such as doctored audio messages, and so forth. Social media plays a role in this.

“To be resilient, we need to address this through social media platforms and strengthen our cyber security measures. This will only be possible through collaboration between our government, tech companies and research institutes, as well as universities, to create a workforce against disinformation.

“All in all, you really need to develop a critical mindset and use online tools to spot fake news.”

Other initiatives the CSIR is looking at use AI and machine learning to analyse patterns, identify fake news and perform sentiment analysis, she noted.
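The CSIR has not published the details of these models, but as a purely illustrative sketch (not the CSIR's actual pipeline), the toy Python example below shows how a basic machine-learning text classifier might flag suspicious headlines using TF-IDF word features and logistic regression; the headlines and labels are invented for demonstration only.

```python
# Illustrative only: a toy text classifier for flagging suspicious headlines.
# This is NOT the CSIR's framework; the data and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, hypothetical training set: 1 = likely fake, 0 = likely genuine.
headlines = [
    "Ballot papers secretly destroyed overnight, insider claims",
    "Electoral commission announces final list of polling stations",
    "Leaked audio proves election results decided in advance",
    "Voter registration weekend attracts record turnout",
]
labels = [1, 0, 1, 0]

# TF-IDF turns each headline into word-frequency features;
# logistic regression then learns which words correlate with each label.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, labels)

# Score a new headline: the output is the probability it resembles the "fake" examples.
new_headline = ["Secret recording shows votes were decided in advance"]
probability_fake = model.predict_proba(new_headline)[0][1]
print(f"Estimated probability of being fake: {probability_fake:.2f}")
```

A real system would need far larger, human-verified training data in multiple South African languages, which is precisely the local-context gap Dawood says the human-centric framework is meant to address.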

“We are continuously learning to adapt and evolve our tactics. We also recognise the collaborative role that needs to occur with global and local entities. This is to empower our voters and educate them on identifying and reporting fake news, and to promote media literacy.”
