Looksery developed an application that modifies facial features in real time. Snapchat acquired the company in September 2015 for 150 million dollars and used its technology to launch the feature we now call filters. We will get to why this was a sound investment shortly. Although Snapchat officially calls them Lenses, that name hasn’t caught on with the public. Using the app looks effortless, but a lot of complexity hides behind the scenes. In this post, we will decipher the engineering behind Lenses and see how Snapchat filters work.
Rediscovering an old discovery
Face detection is not a new technology. Facebook has had it for a long time, and even the point-and-shoot digital cameras you owned before your first DSLR could pick out faces. So it’s common knowledge that it’s not rocket science. But the extent to which Snapchat filters map the face is a step beyond what those cameras can do.
When a human sees a face, the brain processes a huge amount of information, memory kicks in, and you instantly recall whether you know the person and react accordingly.
When a computer sees a face, or any image, it sees only a grid of numbers: a color code for each pixel, something like the picture below.
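To make that concrete, here is a tiny, purely illustrative Python sketch of what an image looks like to a program: nothing but a matrix of numbers (the values here are made up for the example).

```python
# A grayscale image is just a 2-D array of brightness values (0 = black, 255 = white).
import numpy as np

# A tiny 4x4 "image"; a real selfie is the same idea with millions of entries,
# and a colour photo has three such matrices (red, green, blue).
image = np.array([
    [ 12,  40,  38,  10],
    [ 45, 210, 205,  42],
    [ 44, 198, 201,  40],
    [ 11,  39,  41,   9],
], dtype=np.uint8)

print(image.shape)   # (4, 4) -> height x width
print(image[1, 1])   # 210 -> the brightness of a single pixel
```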
How does Snapchat recognize a face?
- This large matrix of numbers is a set of codes, and each combination of numbers represents a different color.
- The face detection algorithm goes through this code and looks for color patterns that would represent a face.
- Different parts of the face give away various details. For example, the bridge of the nose is lighter than its surroundings, the eye sockets are darker than the forehead, and the center of the forehead is lighter than its sides (see the sketch after this list).
- Scanning for those patterns from scratch could take a lot of time, so Snapchat relies on a statistical model of a face, built by manually marking the borders of facial features on many training images. When your face appears on the screen, these predefined points align themselves with areas of contrast to pinpoint exactly where your lips, jawline, eyes, eyebrows, etc. are (a landmark-fitting step sketched in code further below). This statistical model looks something like this.
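Snapchat’s detector is proprietary, but the light/dark contrast patterns described above are exactly what classic Haar-cascade detectors look for. Here is a minimal, hedged sketch using OpenCV’s bundled cascade (an illustrative stand-in, not Snapchat’s actual pipeline; "selfie.jpg" is just a placeholder path).

```python
# Contrast-pattern ("Haar feature") face detection with OpenCV.
# Illustrative only -- a stand-in for the kind of detector described above.
import cv2

# The cascade encodes light/dark patterns such as "nose bridge lighter than cheeks".
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("selfie.jpg")                 # placeholder test photo
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # detection only needs brightness

# Scan the image at several scales and keep regions whose contrast pattern
# matches a face consistently enough.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("detected.jpg", frame)
```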
Once these points are located, the face is modified in any way that seems suitable.
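Locating those predefined points is what computer-vision folks call facial landmark detection. Here is a hedged sketch of the same idea using dlib’s publicly available 68-point shape predictor (not Snapchat’s own model; the .dat file has to be downloaded separately from dlib’s site).

```python
# Fit a pre-trained statistical model of facial landmarks to a photo.
# This mirrors the "predefined points align to areas of contrast" idea above,
# using open-source dlib rather than Snapchat's proprietary model.
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

img = dlib.load_rgb_image("selfie.jpg")          # placeholder test photo

for rect in detector(img, 1):                    # rough face boxes first
    shape = predictor(img, rect)                 # then 68 precise landmark points
    points = [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]
    print(points[:5])                            # a few jawline coordinates
```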
What else can Snapchat do?
From making you look like a puppy to turning you into an old truck driver. But that’s not all Snapchat filters do. A later update added a feature for swapping faces with a friend, either in real time or using faces from your gallery (yes, you allowed Snapchat to look in your Camera Roll). Notice how the face outlines are visible; that’s where the statistical model sits. It helps Snapchat quickly align your face with your friend’s and swap the features.
After locating all your features, the application creates a mesh over your face. This mesh sticks to each point, frame by frame, and can then be edited and deformed however the Lens requires.
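One common way to build such a mesh (a sketch of the general technique, not necessarily Snapchat’s exact method) is to triangulate the landmark points; each triangle can then be warped or recolored independently, frame by frame.

```python
# Build a triangle mesh over facial landmark points.
# 'landmarks' would normally come from a landmark detector like the one
# sketched earlier; these few points are made up just to show the construction.
import numpy as np
from scipy.spatial import Delaunay

landmarks = np.array([
    [120, 200], [180, 195], [150, 240],   # rough eye, eye, nose positions (invented)
    [110, 300], [190, 300], [150, 330],   # rough mouth corners and chin (invented)
])

mesh = Delaunay(landmarks)

# Each row is one triangle, given as indices into the landmark array.
# Warping these triangles to new positions is what deforms the face.
print(mesh.simplices)
```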
Some Lenses go further, asking you to raise your eyebrows or open your mouth. Here’s how they know you did:
- The inside of the mouth is relatively dark, so a sudden dark region appearing between the lip landmarks gives away an open mouth (see the sketch after this list).
- The position of the eyebrow landmarks relative to the other facial features is tracked; when the brow-to-eye distance jumps above its usual value, the app concludes that you have raised your eyebrows.
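In code, checks like these reduce to simple geometry on the landmark points. The sketch below assumes the common 68-point landmark layout (dlib-style) and made-up thresholds; it illustrates the general idea rather than values Snapchat is known to use.

```python
# Rough "mouth open?" and "eyebrows raised?" checks from landmark geometry.
# Assumes the standard 68-point landmark layout; indices and thresholds are
# illustrative guesses, not Snapchat's actual parameters.
import numpy as np

def mouth_open(points, threshold=0.35):
    """points: (68, 2) array of landmark coordinates for one face."""
    top_lip = points[62]                          # centre of the upper inner lip
    bottom_lip = points[66]                       # centre of the lower inner lip
    left, right = points[48], points[54]          # outer mouth corners
    opening = np.linalg.norm(top_lip - bottom_lip)
    width = np.linalg.norm(left - right)
    return opening / width > threshold            # a tall dark gap => open mouth

def eyebrows_raised(points, neutral_ratio, threshold=1.15):
    """Compare the current brow-to-eye distance against a neutral-face baseline."""
    brow = points[19]                             # a point on one eyebrow
    eyelid = points[37]                           # a point on the matching upper eyelid
    face_height = np.linalg.norm(points[27] - points[8])   # nose bridge to chin
    current_ratio = np.linalg.norm(brow - eyelid) / face_height
    return current_ratio > neutral_ratio * threshold
```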
Now, as mentioned before, this technology is not new. But performing all of these steps in real time on a mobile device takes a lot of processing power and careful optimization. That’s why Snapchat decided it was better to pay 150 million dollars for Looksery than to build the technology itself.
Ghost faces on Snapchat: What’s this about?
Recently, some snappers have publicly posted snaps claiming to show a ghost. In these snaps, the Lenses were being applied to random objects or a wall. You don’t need to be frightened; there’s no ‘ghost’ around you (fingers crossed). The glitch occurs because the filter is trying hard to find a face, and the random object it latches onto simply passes the detector’s criteria for a face: a false positive.
Though this is one heck of a coincidence, there is no way to fix it just yet; only more data will help. Google’s image recognition algorithms were recently reported to be about 93% accurate, and the credit goes to the massive amount of data the company has collected over the years. Likewise, the more faces Snapchat can process, the more reliably its Lenses will be applied.
How does Snapchat make money?
Snapchat reported revenue of over 200 million dollars in 2015 and projected between 500 million and 1 billion dollars for 2017. That is pretty impressive for a company that launched in mid-2011. Right now, the money comes mainly from On-Demand Geofilters.
The special filters you see during major events like the Olympics or the Euros are all paid content. Word of mouth is the fastest and most reliable form of advertising, and Snapchat packaged exactly that for publishers. Smart. Beyond Geofilters, there is the Discover section a swipe to the left, where major publishers post promotional content that users read in Snapchat’s built-in browser.
Was Looksery worth the investment?
There have been some alarmingly expensive acquisitions in tech. So when a company worth billions picks up Looksery for 150 million dollars and builds its flagship feature on top of it, it looks like a pretty good deal. For Snapchat, of course.
Using its newly acquired technology, Snapchat is looking to monetize the app even further. In July 2016, it filed a patent that takes image recognition to another level: when a user snaps a picture of, say, a delicious cheesy burger, the photo’s content triggers special filters, perhaps one promoting a well-known burger chain. Because such a filter is different from the everyday set, it plays on our need to stand out from the crowd, nudging the user to apply the “special” filter and send it to their friends.
That’s word of mouth advertising in a nutshell.
Snapchat believes this is entirely feasible: it means developing more recognition algorithms and storing an enormous catalogue of filters, Lenses, and products in its database. Well, at least they don’t have to store the images people send every day.