Monday, April 22, 2013

Image Comparison in a Mobile App

An app called "Sleep if you can", which is available for both Android and iOS, is very interesting. I learned about it from a friend on a social network. The app's introduction page for Android is: https://play.google.com/store/apps/details?id=droom.sleepIfUCan&hl=en

Briefly, here is how it works. It is an alarm app. Like other alarm apps, it sounds the alarm when the preset time is up. The difference, and the interesting part, is how you turn the alarm off: the user has to take a photo similar to one they took when they set the alarm. If the app decides the new picture matches the registered one, it turns the alarm off. This forces the user to get up and take the picture, which is more effective than regular alarms that can be dismissed just by swiping the screen.

I am interested in how the app compares the two pictures. I found one method on the internet: comparing the pixels of the two images. The author of this blog post (http://jeffkreeftmeijer.com/2011/comparing-images-and-creating-image-diffs/) mentions this method.
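
To make this concrete, here is a minimal sketch of a pixel-by-pixel comparison in Python using the Pillow library. This is only my illustration of the idea, not the app's actual code; the file paths and the 64x64 working size are arbitrary choices:

    from PIL import Image

    def pixel_diff_ratio(path_a, path_b, size=(64, 64)):
        # Downscale both photos to the same size so pixels can be
        # compared position by position.
        a = Image.open(path_a).convert("RGB").resize(size)
        b = Image.open(path_b).convert("RGB").resize(size)

        # Count positions where the RGB values are not exactly equal.
        differing = sum(1 for pa, pb in zip(a.getdata(), b.getdata())
                        if pa != pb)
        return differing / (size[0] * size[1])

An exact-equality test like this is very fragile for photos, because lighting and sensor noise change almost every pixel, which is why a tolerant color-difference measure is needed.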

Further, how do we compare two pixel colors? My thought is that if the colors of two corresponding pixels are different enough, the pixels should be counted as different. The blog mentions a method for measuring this color difference, called "Delta E" (http://en.wikipedia.org/wiki/Color_difference#CIE76).

Colors are defined within a color space, and Delta E uses "Lab" (http://en.wikipedia.org/wiki/L*a*b*). This space has three components, shown in the picture on Wikipedia: L runs from black (low) to white (high), a from green (negative) to magenta (positive), and b from blue (negative) to yellow (positive).

Then the difference between two colors is calculated with the CIE76 formula from the wiki page:
Delta E = sqrt((L1 - L2)^2 + (a1 - a2)^2 + (b1 - b2)^2)
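
Here is a sketch of that formula in plain Python. Getting from a camera's RGB pixel to Lab first goes through the XYZ color space; the constants below are the standard sRGB/D65 ones from the Wikipedia pages. Again, this is my own illustration, not code from the app:

    import math

    def srgb_to_lab(rgb):
        # Undo the sRGB gamma curve to get linear RGB in [0, 1].
        def linearize(c):
            c /= 255.0
            return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

        r, g, b = (linearize(c) for c in rgb)

        # Linear RGB -> XYZ (standard sRGB matrix, D65 white point).
        x = 0.4124 * r + 0.3576 * g + 0.1805 * b
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        z = 0.0193 * r + 0.1192 * g + 0.9505 * b

        # XYZ -> Lab, normalized by the D65 reference white.
        def f(t):
            d = 6.0 / 29.0
            return t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d ** 2) + 4.0 / 29.0

        fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
        return (116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz))

    def delta_e_cie76(lab1, lab2):
        # Euclidean distance in Lab space -- exactly the formula above.
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))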

I think that if the Delta E value for a pair of pixels is too high, those two pixels are different, and if enough pixels are different, the pictures are probably different.
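
Putting the pieces together, a naive matcher could flag a pixel as "different" when its Delta E is above some tolerance, and declare the pictures different when too many pixels are flagged. The sketch below reuses srgb_to_lab and delta_e_cie76 from above; the tolerance of 20 and the 10% fraction are arbitrary guesses for illustration, not values from the app:

    from PIL import Image

    def images_match(path_a, path_b, size=(64, 64),
                     de_tolerance=20.0, max_diff_fraction=0.10):
        a = Image.open(path_a).convert("RGB").resize(size)
        b = Image.open(path_b).convert("RGB").resize(size)

        # Count pixels whose color difference exceeds the tolerance.
        differing = sum(
            1 for pa, pb in zip(a.getdata(), b.getdata())
            if delta_e_cie76(srgb_to_lab(pa), srgb_to_lab(pb)) > de_tolerance)

        return differing / (size[0] * size[1]) <= max_diff_fraction

Downscaling before comparing keeps this cheap and also smooths out small misalignments between the two shots.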

I also searched the internet, and there are many posts discussing how to compare two images. Do you guys have any ideas about how this app works?


2 comments:

  1. That's funny. What should we do if we take a picture of the moon at night? Will the app consider other things similar to the moon during the daytime?

  2. I'm going to assume that since phone cameras do not handle light too well (as far as fluctuations and user control), it does something with edges. By edges, I mean converting the picture to be essentially colorless, with highlights making up the edges of the items in the picture. To take it a step further, I'm going to also assume that it analyzes regions instead of the whole picture to increase the accuracy. As far as handling the angle at which the photo was taken, there might be code in place to handle transformations in relation to the gyroscope of the phone, but that seems unlikely. Who knows. Too bad I only have a Windows phone.
