
Emmanuelle Saliba
Chief Investigative Officer
What was your path to joining GetReal Labs? Did anything surprise you along the way?
I first met Hany Farid when I interviewed him at Berkeley for a segment on deepfakes and content credentials for ABC's This Week. At the time, generative AI had just taken a major leap forward, and hyperrealistic images were flooding social media. There was growing unease: could we still trust what we were seeing? Our collective ability to separate fact from fiction was being tested more than ever.
At the end of our interview, off camera, he mentioned he was working on a solution to detect manipulated media. But he wasn't ready to share any details just yet. A real cliffhanger.
Fast forward a few months, and Hany rings. He’s got something for us to start testing.
ABC News became a design partner, helping stress-test the platform and providing active feedback. Over the course of the next year, I worked closely with Hany and his team, watching the technology evolve.
For most of my journalism career, I have focused on helping audiences navigate between real and fake, trustworthy and manipulated. When the opportunity came to join GetReal, it felt like a natural alignment with the work I'd been doing all along.
What are the biggest breakthroughs or challenges in this area?
The technology isn't just evolving rapidly; it is also being used in ways no one originally anticipated. People are layering different AI tools, one for music, another for audio, and another for video, to create a single piece of content. And beyond AI, they may also be digitally manipulating media directly on their phones.
I also predict that AI-driven editing will become second nature to most people, just like adding a filter before posting a photo. How do we differentiate all these different types of content and explain them?
This is not a straightforward problem, which makes it all the more exciting to tackle. It requires a multidisciplinary approach, combining machine learning, forensic research, and human investigations.
How could your work change the way people interact with digital media?
My mission has always been to inform and equip people, whether audiences or clients, to better understand what they're seeing online. Awareness is key.
People need to develop a natural skepticism, asking themselves a few critical questions whenever they encounter an image, audio clip, or video.
When bad actors, whether state-sponsored or cybercriminals, weaponize this technology, we've seen devastating consequences. In the former case, it can manipulate public opinion and, in turn, geopolitical decisions, impacting millions globally. In the latter, we've seen it lead to financial ruin or even inflict deep psychological harm on individuals. The stakes are high, and it's only going to get more complicated in the coming months.