30 Years of Digital Forensics: From Airbrushing to AI to the Birth of GetReal

10 minute read

Author

Hany Farid

Co-Founder and Chief Science Officer

Date

26/3/2025

It started with a book

It’s 1997, and I’m a postdoctoral researcher at MIT, standing in line at the library. I have no cell phone, no email, no social media posts to keep me distracted, so I notice a book in the return cart: the Federal Rules of Evidence.

I flip it open and land on a section about introducing photographs into evidence in court. I’m curious, so I keep reading, and I stumble upon a clause stating that courts planned to treat digital images with the same weight as analog/film ones.

Now, I didn't know much at the time, but that seemed like a bad idea. 

The digital revolution was still very much in its early days, but digital cameras were already fundamentally shifting the photographic medium.

Photographs have been manipulated for as long as photography has been around, but digital photography was going to make manipulation far easier, in ways that would be much more difficult to detect.

This passage stuck with me, and I couldn’t stop thinking about how digital images could be authenticated.

Photoshop enters the room… 

Two years go by. I’m now an Assistant Professor at Dartmouth College, goofing around in Photoshop, splicing the head of a friend onto the body of another person. My friend's head was too small to fit on the body, so I increased its size.

As I performed this simple operation, I realized that, mathematically, I had just done something very interesting: to make the head bigger, Photoshop had to synthesize pixels that didn't exist in the first place, interpolating them from their neighbors.

I remember thinking… “I should be able to detect that."

One of my graduate students at the time and I started writing code and quickly made progress. Interpolated pixels are correlated with their neighbors in a specific, periodic way; we found a way to quantify and detect these correlations, and submitted the work for publication.
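The core observation can be sketched in a few lines. This is a toy illustration under simplified assumptions (1-D signal, exact 2x linear interpolation), not the published detection method: when a signal is upsampled, the synthesized samples are exactly the average of their neighbors, a periodic predictability that the original recorded samples do not share.

```python
import numpy as np

# Toy sketch (not the published method): upsample a 1-D "scanline" by 2x
# with linear interpolation, roughly what an image resize does.
rng = np.random.default_rng(0)
row = rng.normal(size=64)                 # stand-in for recorded pixels

up = np.interp(np.arange(127) / 2.0, np.arange(64), row)

# Predict each interior sample as the average of its two neighbors.
pred = (up[:-2] + up[2:]) / 2.0
residual = np.abs(up[1:-1] - pred)

# Synthesized (odd-position) samples equal their neighbors' average, so
# their residual vanishes; original samples remain unpredictable.
synth = residual[0::2]                    # interpolated positions
orig = residual[1::2]                     # original positions
```

In a real image the correlations are measured statistically across the whole image rather than checked exactly, but this periodic pattern is the fingerprint that resizing leaves behind.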

A reviewer of our paper made an insightful comment that provided the fodder for our second forensic technique. It exploited the fact that digital cameras do not record all of the pixels needed to generate a complete color image: the sensor records only a subset of the required pixels and reconstructs the rest from their recorded neighbors, leaving behind a distinctive pattern that we could measure.
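Again as a toy sketch, assuming a simple checkerboard sampling layout and neighbor-average reconstruction (real cameras use a Bayer mosaic and more sophisticated demosaicing, and this is not the exact published technique): the reconstructed pixels are perfectly predictable from their neighbors at fixed lattice positions, while the recorded pixels are not.

```python
import numpy as np

# Toy sketch: record values at only half the pixel sites (checkerboard),
# then "demosaic" the missing sites from their four recorded neighbors.
rng = np.random.default_rng(1)
full = rng.normal(size=(32, 32))                      # true scene values
mask = (np.indices(full.shape).sum(axis=0) % 2) == 0  # recorded sites
sampled = np.where(mask, full, 0.0)

demosaiced = sampled.copy()
for i in range(1, 31):
    for j in range(1, 31):
        if not mask[i, j]:                            # fill missing site
            demosaiced[i, j] = (sampled[i - 1, j] + sampled[i + 1, j] +
                                sampled[i, j - 1] + sampled[i, j + 1]) / 4.0

# The same neighbor-average predictor has ~zero residual exactly on the
# interpolated lattice sites, and not on the recorded ones.
pred = (demosaiced[:-2, 1:-1] + demosaiced[2:, 1:-1] +
        demosaiced[1:-1, :-2] + demosaiced[1:-1, 2:]) / 4.0
residual = np.abs(demosaiced[1:-1, 1:-1] - pred)
interp_sites = ~mask[1:-1, 1:-1]
```

When a region is spliced in or re-saved through editing software, this lattice-aligned pattern is disturbed or destroyed, which is what makes it forensically useful.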

A controversial photo of presidential hopeful Senator John Kerry and actress and anti-war activist Jane Fonda inspired our third forensic technique. The pattern of illumination on the faces of Kerry and Fonda seemed inconsistent, leading us to develop a series of forensic techniques for measuring the properties of the environmental illumination to detect photo composites.
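The lighting idea can be sketched under a simple Lambertian shading model (a toy version, not the full published technique): observed intensity is approximately the dot product of the surface normal with the light direction, plus an ambient term, so the light direction for each person or object falls out of a linear least-squares fit, and mismatched directions within one photo suggest a composite. The normals, light direction, and ambient value below are all made up for illustration.

```python
import numpy as np

# Toy Lambertian model: intensity ≈ N·L + ambient, with N a unit surface
# normal and L the light direction. Given normals and intensities from
# one object, recover L (and the ambient term) by least squares.
rng = np.random.default_rng(2)
normals = rng.normal(size=(200, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

true_L = np.array([0.6, 0.8, 0.0])       # hypothetical light direction
intensity = normals @ true_L + 0.2       # 0.2 = hypothetical ambient

# Solve [N | 1] x = intensity for x = (L, ambient).
A = np.hstack([normals, np.ones((200, 1))])
x, *_ = np.linalg.lstsq(A, intensity, rcond=None)
est_L, est_ambient = x[:3], x[3]
```

Repeating this fit separately for each face in a photo and comparing the estimated light directions is the essence of the consistency check: two people photographed together should be lit from the same direction.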

Over the following decade, my research group developed a suite of photo forensic techniques, each based on characterizing and quantifying irregularities that arise because of different types of manipulation. 

The field of digital forensics was born, but I could not have imagined what was still to come.

When the landscape started to shift

By 2014, I felt like we had a pretty good handle on the problem of photo (and video) authentication. A few years later, however, I started to hear rumblings about a new form of AI-powered manipulation under the moniker of deepfakes. Over the next decade, our nascent and niche field of digital media forensics rapidly expanded in scope and application, tackling everything from daily press fact-checks to small- and large-scale fraud, debunking viral online hoaxes, and helping courts wrestle with the thorny issues of authenticating visual evidence.

Today, deepfakes (rebranded as generative AI) are both powering new waves of creativity and wreaking havoc on truth and trust. 

Leaving the Ivory Tower

When Ted Schlein from Ballistic Ventures approached me in 2019, he saw what few others did: a new cyber threat from deepfakes that we, as individuals, organizations, and governments, weren't prepared for. We started to talk about how we could protect the world from deepfakes, but COVID disrupted everything. 

When we re-emerged in 2022 – with the threat of deepfakes even bigger than before – I knew it was time to start working on transitioning our two decades of digital forensic technologies into the real world. Having talked with just about everyone in this space, I knew that Ted and Ballistic Ventures were the right partners.

So we created and started to build GetReal.

Where we are today

And here we are, two and a half years later: GetReal is home to 40+ employees, led by a star-studded executive team including Matt Moynahan, Jim Brennan, and Rob Van Es, and a diverse and brilliant research, engineering, and investigative team. As we close our Series A, backed by a powerful team of investors and partners, we are on a mission to help individuals, organizations, and governments protect against malicious, manipulated media.

By developing a suite of digital forensic and cyber-security tools that asynchronously and synchronously analyze images, audio, and video, we are empowering organizations across industries to figure out what’s real and what’s not and restore confidence in times of critical decision-making. 

Deepfakes, real consequences…

is more than just our tagline; it’s our day-to-day reality. For the past two and a half years, we have woken to new threats and attacks every day, and every day we develop new and increasingly powerful tools for distinguishing the real from the fake.

My former students will tell you that – because it is both true and inspirational – I am particularly fond of this quote by Margaret Mead: 

“Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.” 

Truer words could not be said of the GetReal team – a collection of digital forensics experts, cybersecurity leaders, engineers, journalists, and investigators brought together by a shared passion and mission to rebuild trust in digital communications. We are the world’s leading authorities in digital media forensics, and together, we seek to protect everyone from a new generation of attack vectors.