How to identify manipulated images



We’re in the midst of a digital image manipulation revolution. That is to say, image retouching and manipulation apps have taken a quantum leap beyond previous generations, letting users produce seamless edits with almost no effort or skill.

For years, social media influencers and trendsetters have shaped social perceptions of beauty and fashion. In the process, they have pushed users toward standards of style and refinement that are nearly unattainable, yet treated as the norm. Following that thread brings us to the modern age of perfection: flawless beauty can now be achieved with only a few taps on our smart devices, as users fix their imperfections and, ultimately, alter their appearances with ease.

These apps are now abundant and readily available to anyone, not just users who want to smooth out a forehead wrinkle, conceal a mole, or experiment with different colored hair and cosmetics. You’re probably aware of the deceptive practice of using fake profiles to lure people into scams. This is usually done to appeal to individuals looking for love, and also as a tool for stalking.


In the grand scheme of things, I must admit I'm quite a formidable catfisher myself. One could argue that as technology progresses, so does our capacity to use it to deceive others, if not ourselves.

While my photos are my own, I have used image-retouching techniques for years in tactical social engineering campaigns: catching online predators preying on children during Operation Child Safety, gathering evidence against online stalkers, and unmasking scammers. With that in mind, let's do a deep dive into the techniques I use to identify composite images.

Many users feature pictures taken with Snapchat filters, but where I am going to take you is far less conspicuous. If you didn’t realize that photo retouching apps could be so egregiously abused, you might overlook the artifacts that give them away.

[Image: example of a catfishing image, showing the progression from the filtered original to the final composite]

For this example, I used a simple Snapchat filter to generate the first image on the left. For context, I am a bearded 39-year-old man who looks like a Viking. However, this Snapchat filter smoothed out my skin and gave me curly hair, among a slew of other alterations. I then uploaded that image into FaceApp, a feature-rich photo and video editing app.

Using the Face Swap feature, I swapped the face from the first image onto itself, which generated a much more realistic image. Then, using various skin and cosmetic features, I produced the final image of a young adult woman. However, several artifacts serve as clues that this final image is the product of heavy manipulation.

Using OSINT to reveal artifacts in photo manipulation

Open Source Intelligence (OSINT) refers to the use of tools and techniques for collecting and analyzing information from the public domain. If you suspect you are being catfished, there are simple steps you can take to uncover the truth behind an image and the profile using it. Let’s go over some obvious methods.


Reverse Image Search: One of the first OSINT techniques I run when messaged by a questionable profile is a Reverse Image search. This allows you to locate other instances where the photo appears on the internet, and it can also help you identify the original image before it was altered. Online services such as PimEyes, TinEye, Yandex, and Google Images are incredibly useful for this.

I’ve listed multiple Reverse Image services because they do not all operate the same way. For example, Yandex does not use facial recognition but instead relies on machine learning. PimEyes and Google Images, on the other hand, analyze uploaded images for distinctive points, colors, lines, and textures. After the analysis, a query is generated and compared against the billions of images the service can access.
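To get a feel for how an altered photo can still be matched back to its source, here is a minimal local sketch using perceptual hashing with the Python Pillow and imagehash libraries. This is not what Google, Yandex, or PimEyes actually run internally, and the file names are placeholders, but it illustrates how two versions of the same photo stay "close" even after resizing, recompression, or light retouching.

```python
# Illustrative local analogue of image fingerprinting: perceptual hashing.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Hypothetical file names for a suspect profile photo and a candidate original.
suspect = imagehash.phash(Image.open("suspect_profile.jpg"))
candidate = imagehash.phash(Image.open("candidate_original.jpg"))

# Subtracting two hashes gives the Hamming distance between the 64-bit fingerprints;
# small values (roughly 10 or less) suggest both files derive from the same source photo.
distance = suspect - candidate
print(f"Hamming distance: {distance}")
if distance <= 10:
    print("Likely the same underlying image, possibly edited or recompressed.")
else:
    print("Probably different source images.")
```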

Facial recognition isn’t used by Google Images unless Face Grouping is turned on. Once that’s enabled, algorithms create a representation of facial images known as a facial model. These algorithms can estimate the similarity between different facial images and determine whether they portray the same person.

Positive hits from Reverse Image searches let users connect the dots, find other accounts where the image is used, and learn more about the person operating the profile.

Eyes: It’s said that the eyes are the windows to the soul. Examine the irises and pupils in an image and determine whether they are clearly visible rather than blurry or pixelated. In authentic photos, the eyes are generally sharp. If they are noticeably fuzzy, it’s a strong indication that you’re examining a composite image, one created by combining multiple images into a new one.
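If you want to put a rough number on that "fuzzy eyes" check, here is a hedged sketch using OpenCV's variance-of-Laplacian blur metric. The crop coordinates and file name are placeholders you would adjust by hand (or with a face landmark detector); the idea is simply that an eye region markedly blurrier than the rest of the face hints at blending or upscaling in that area.

```python
# Quantify eye-region blur with the variance of the Laplacian (a common sharpness metric).
# Requires: pip install opencv-python
import cv2

img = cv2.imread("suspect_profile.jpg", cv2.IMREAD_GRAYSCALE)

def sharpness(region):
    """Variance of the Laplacian: higher = sharper, lower = blurrier."""
    return cv2.Laplacian(region, cv2.CV_64F).var()

# Hypothetical bounding boxes (rows y1:y2, columns x1:x2) for the eyes and the whole face.
eyes = img[220:260, 180:320]
face = img[150:450, 120:380]

print(f"Eye-region sharpness: {sharpness(eyes):.1f}")
print(f"Whole-face sharpness: {sharpness(face):.1f}")
# A dramatically lower eye score than face score suggests the eyes may have been
# blurred, upscaled, or pasted in from another image.
```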

[Image: catfishing example image used for the hair, skin, and filter analysis below]

Hair: In the above image, there is a halo along the edges of the right side of her head, caused by an unnatural contrast between where the hair ends and the background begins. The lighting around the ends of the hair is also not organic. Overall, the hair is inconsistent in texture and lacks dimension: some portions are curly, others seem smudged together. There are also “ghost locks” where the hair fades into translucence, then reappears past the shoulder with the ends melting into the shirt.
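One way to make that halo easier to see is to render the image's gradient magnitude as an edge map; this is my own quick sketch rather than a standard forensic tool, and the file names are placeholders. Natural hair tends to fade gradually into the background, while a composited head often leaves a continuous bright rim where it meets the backdrop.

```python
# Visualize edges via Sobel gradient magnitude to make blend halos stand out.
# Requires: pip install opencv-python numpy
import cv2
import numpy as np

img = cv2.imread("suspect_profile.jpg", cv2.IMREAD_GRAYSCALE)

gx = cv2.Sobel(img, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(img, cv2.CV_64F, 0, 1, ksize=3)
magnitude = np.sqrt(gx**2 + gy**2)

# Normalize to 0-255 so the edge map can be viewed like a normal image.
edge_map = cv2.normalize(magnitude, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("edge_map.png", edge_map)
# Inspect edge_map.png: an unnaturally bright, uniform outline along the hairline
# or shoulders often corresponds to the halo left behind by compositing.
```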

Skin: The lighting and texture of the skin on the face don’t match the skin of the neck and arms.

Filters and Obfuscation: Heavy filters and overlays can be used to obscure unnatural shadows, broken light contrasts, and melted or smudged objects that occur during the amalgamation process. Grayscale and black-and-white filters are often applied in an attempt to conceal unnatural light and shadows so that the composite image appears more cohesive.
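A quick heuristic for spotting the grayscale trick is to check whether the color channels are nearly identical across the image. This is only a rough flag that a desaturating filter was applied, not proof of manipulation; the threshold and file name below are assumptions for illustration.

```python
# Flag effectively grayscale images, which may be hiding lighting artifacts.
# Requires: pip install pillow numpy
import numpy as np
from PIL import Image

rgb = np.asarray(Image.open("suspect_profile.jpg").convert("RGB"), dtype=np.float64)
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

# Mean absolute difference between channels; near zero means effectively grayscale.
channel_spread = (np.abs(r - g) + np.abs(g - b) + np.abs(r - b)).mean() / 3

print(f"Average channel difference: {channel_spread:.2f}")
if channel_spread < 2.0:
    print("Image is effectively grayscale; a filter may be masking lighting inconsistencies.")
```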

Error Level Analysis: If you still can’t visually tell that an image has been manipulated, the next step is to employ Error Level Analysis (ELA) and delve into a deeper forensic analysis. Online tools like FotoForensics and Forensically are especially helpful for determining whether something was added to an image.


Oftentimes we see memes with text or objects added to images, and sometimes it’s difficult to determine with absolute certainty that they’re not real. These online tools compare the image in question to a recompressed version of itself, which exposes added or manipulated objects through their higher compression or noise rate. The below image is a perfect example of a composite: while the lighting artifacts should be obvious, the compression rate of the computers in the background is noticeably higher than that of the individual.

[Image: catfishing example image showing the composite described above]

This is especially useful when meeting a stranger online and you suspect they might be a scammer. People usually ask the stranger to take a picture holding a piece of paper with today’s date written on it, or provide some other form of authentication, which is pointless because the process can be faked with ease. Just get them on video, because that’s much harder to falsify.

ELA works by recompressing the image as a JPEG at a known quality level (typically 95%) and then examining the disparities between the resulting compressed version and the original image. The manipulated regions of a photo become readily apparent in the ELA depiction due to their distinctive error levels.
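For the curious, here is a minimal do-it-yourself version of that step using the Python Pillow library. Tools like FotoForensics and Forensically do considerably more, and the file names here are placeholders, but the core idea of resaving at a known JPEG quality and differencing against the original looks roughly like this.

```python
# Minimal Error Level Analysis sketch: resave at quality 95, difference, amplify.
# Requires: pip install pillow
from PIL import Image, ImageChops, ImageEnhance

original = Image.open("suspect_image.jpg").convert("RGB")

# Resave the image as a JPEG at quality 95, then reload it.
original.save("resaved_q95.jpg", "JPEG", quality=95)
resaved = Image.open("resaved_q95.jpg")

# The per-pixel difference is the error level; amplify it so it becomes visible.
ela = ImageChops.difference(original, resaved)
max_diff = max(channel_max for _, channel_max in ela.getextrema()) or 1
ela = ImageEnhance.Brightness(ela).enhance(255.0 / max_diff)
ela.save("ela_result.png")
# Regions that were pasted in or re-edited tend to show a markedly different
# brightness (error level) than the rest of the image in ela_result.png.
```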

Avatars: In Image 1, I created the image using a Snapchat filter, then combined that image with itself using FaceApp’s Face Swap feature, which generates a realistic face. I was able to use this face to create an avatar using Prequel.

I used this same face as a base image to Face Swap with an AI-generated image, also created by the Prequel app. It’s important to note that an online profile using an avatar doesn’t necessarily mean the image is based on a fake person. But it also doesn’t guarantee that the avatar was generated from an image of a real person.

Analyzing embedded EXIF and metadata

Every piece of digital media carries its own digital fingerprint in the form of Exchangeable Image File Format (EXIF) metadata. This data may contain GPS coordinates (geotags) as well as exposure level, user settings, camera type, timestamps, file size, file format, et cetera. This matters because an original photo will typically contain EXIF data in its image container, whereas a fake often will not. Bear in mind that EXIF metadata can also be manipulated, which can be detected through careful evaluation of all the forensic elements a photo contains.

The online tools mentioned above, as well as ImageEdited, will also extract and examine EXIF data and analyze uploaded images for pixels that aren’t cohesive with the rest of the image.
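If you prefer to check locally rather than upload a sensitive image to a third-party site, here is a short sketch that dumps whatever EXIF metadata a file still carries, using Pillow. The file name is a placeholder, and an empty result often just means a platform stripped the data, so treat absence as a clue rather than proof of forgery.

```python
# Dump EXIF metadata (including GPS tags, if present) from an image file.
# Requires: pip install pillow
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

img = Image.open("suspect_image.jpg")
exif = img.getexif()

if not exif:
    print("No EXIF metadata found (it may have been stripped or edited).")

for tag_id, value in exif.items():
    tag_name = TAGS.get(tag_id, tag_id)
    print(f"{tag_name}: {value}")

# GPS data lives in a nested IFD; decode it separately if present.
gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the GPSInfo tag
for tag_id, value in gps_ifd.items():
    print(f"GPS {GPSTAGS.get(tag_id, tag_id)}: {value}")
```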

It’s important to note that many social media and messaging platforms strip out metadata, making it difficult to investigate potentially manipulated images. WhatsApp, Facebook, Messenger, and Instagram strip EXIF data from user-uploaded media for privacy purposes. However, Discord, Telegram, LinkedIn, and several others retain the metadata, which can expose civilians and bad actors alike to doxing.


Importantly, iMessage retains EXIF metadata. On Android devices, the metadata remains intact as long as the media files aren’t altered; but since Android devices typically resize large media files, investigating their metadata is often virtually useless.

Knowledge is power

Remember, everything we create digitally has a fingerprint. Even though digital information can be obfuscated, it’s important to use good judgment when engaging with people over the internet whom we do not know.

Rather than succumbing to anxiety or even paranoia over the credibility of a person’s identity, we can put these tools and skills to use and defeat any attempt a bad actor might make to deceive us. This is empowering: as digital deception increases, we can be confident in our knowledge. It is now much harder for catfishers and scammers to convince us to accept or interact with things we can prove are false.