Nina Jankowicz: AI’s new target is women in power


Jason Stevens
Contributor

Misogynistic tactics, including deepfake pornography, are being weaponised in elections to discredit women in power and create discord, according to global disinformation expert Nina Jankowicz.

“Over 10 per cent of content targeting eSafety Commissioner Julie Inman Grant is threatening,” she said in an interview with InnovationAus.com, highlighting the severe harassment faced by women leaders.

Ms Jankowicz notes that in a single 24-hour period, Ms Inman Grant was targeted with nearly 80,000 tweets.

Global disinformation expert Nina Jankowicz. Credit: US Embassy Vienna/Flickr

While crediting Australia for leading with its Online Safety Act, she warns that these disinformation tactics aim to undermine credibility and safety, influencing public perception and voter behaviour.  

Ms Jankowicz will explore these issues and findings as a headline speaker at Something Digital 2024, sharing further insights from her upcoming paper on tech-facilitated gender-based violence with Columbia University’s Institute for Global Politics and Vital Voices.

Something Digital runs as part of Something Fest, taking place in Brisbane from August 26-30. The festival is run in collaboration with Advance Queensland, the Queensland government’s commitment to building a leading, sustainable, world-class innovation economy.

The theme of Something Digital for 2024 is how digital innovation can be used to ‘Empower Every Human’. The event’s headliners include national and international talent, who will appear in person to share the thought-provoking work, concepts and technologies that will enable us to think differently about the digital futures we are creating. 

Ms Jankowicz continues, “I know that Julie’s children and her husband were doxxed. She had enough credible threats against her life that the Australian Federal Police had to get involved.”

As the former Disinformation Governance Board leader at the United States Department of Homeland Security, Ms Jankowicz herself has been stalked, doxxed, and threatened. Doxers invade people’s privacy by sharing their private information, like addresses or phone numbers, online without consent.  

“I had to change my name on Uber for safety. I still have a cyber stalker who lists 350 officials he wants to ‘take care of.’ It’s exhausting, and no one deserves this for simply doing their job.” 

Deepfake pornography and doxxing are often used together to maximise intimidation.

Ms Jankowicz faced further harassment with trolls creating fake explicit videos of her using generative artificial intelligence. She is frustrated that US laws lack real regulatory consequences for such actions. 

She has advised governments, testified before US, UK and EU parliaments, and conducted disinformation research as a Wilson Center Fellow on themes, including those above, that bear on national security.

She raised eyebrows at a recent AI security and defence conference, warning that deepfake pornography is “an easy way that any of our adversaries who want to widen the rifts in our societies can do so to discredit and destabilise women in power”.

She compliments Australia’s progressive approach to dealing with both forms of harassment, including the Online Safety Act and the Criminal Code Amendment (Deepfake Sexual Material) Bill, which remains before Parliament.

She believes the Online Safety Act is smartly set up regarding content enforcement, for example, around child sexual abuse material: “It gives the eSafety Commissioner transparency powers to query them and ask questions that she already has the answer to from the analysis they’ve done and kind of catch them in lies”. 

The approach challenges social media platforms by presenting evidence of child sexual abuse material already identified on their sites. This forces the platforms to confront their claims and take corrective action. 

Conversely, she doesn’t think the regulatory system in Australia would take hold in the US. “Still, I’m interested in learning more and hearing from Australians about how they think it’s functioning, particularly from tech folks.” 

In addition to her talk at Something Digital, Ms Jankowicz hopes to engage government officials in Canberra, Sydney and Melbourne about balancing safety and freedom in the digital world. 

“The issues around speech in Australia are quite interesting, given the lack of a right to freedom of expression in the constitution,” she said, noting significant interest from Australian political parties and activists in finding practical solutions to get this right.

“It’s just shocking to me that here in the US in particular, we have no rules on the books yet about how AI can be used in elections,” she laments. “We recently saw Elon Musk sharing a deepfake of Kamala Harris saying and doing things she never said or did, claiming she was incompetent and unfit for office.”

X, formerly Twitter, is a global flashpoint in debates over information and transparency. The social media giant no longer shares information about disinformation and harmful content in Australia.

The changes in data access policies have been criticised for limiting researchers’ ability to study user behaviour, information operations, and the spread of disinformation on the platform. 

“Right now, they’re not answering to anybody,” she said, adding that the platform’s rules apply to people like Musk only selectively. “He holds regular users to a different standard than he holds himself, as we saw with the deepfake he shared recently.”

On her first visit to Australia, Ms Jankowicz believes the Something Digital event will be crucial for sharing human-centred approaches to technology as AI reshapes political narratives, particularly for women in power.

“We’re not going to solve everything before upcoming elections,” she said when asked about a recent study with 33,000 participants, which found that warning labels and digital literacy training improved the ability to judge true or false headlines by only five to ten per cent. 

She points out that while the study’s results are modest, incremental improvements are still valuable, and simple awareness helps people more effectively identify truth versus fiction. 

She uses an approach focusing on critical thinking and emotional awareness with her students: “One tactic of disinformation is to make you feel emotional or generate an emotional reaction. When people know this, they can be more deliberate in their consumption.”

Ms Jankowicz recently founded the non-profit American Sunlight Project to emphasise that combating disinformation isn’t partisan; it affects all our lives.  

“That’s our main goal,” she said. “It’s not exciting for many people, and we will only solve the problem partially. Yet I’m realistic about what we can accomplish.” 

This article was produced by InnovationAus.com in partnership with Something Digital, part of Something Fest.

