http://www.developer.com/ws/android/programming/face-detection-with-android-apis.html
Through two main APIs, Android provides a simple way for you to identify the faces of people in a bitmap image, with each face carrying basic location information. This tutorial focuses on utilizing these APIs to accomplish the face detection task, which can be extended for many other interesting applications. As we work through the APIs, we will develop a simple working project; the entire source package is available for download as a reference.

One thing to note: face detection is a computer technology that determines the locations and sizes of human faces in arbitrary images. Do not confuse it with face recognition. A facial recognition system is a computer application for automatically identifying or verifying a person from a digital image, for example by comparing selected facial features from the image against a facial database. Simply put, face detection extracts people's faces from images, while face recognition tries to find out who they are.

As mentioned before, the two main APIs introduced in this tutorial are android.media.FaceDetector and android.media.FaceDetector.Face. There is no installation necessary, since they come with the base Android APIs rather than from optional packages.

You can construct a generic Android activity for the task. We extend the base class ImageView to MyImageView, which we use as our main view to display the image as well as the face feature markers. Note that at the moment the bitmap containing the faces must be in 565 format for the APIs to work correctly, and a detected face must have a confidence measure above the threshold defined in android.media.FaceDetector.Face.CONFIDENCE_THRESHOLD.
The most important method is setFace(), implemented in the activity shown below: it instantiates the FaceDetector object, calls findFaces(), stores the results in faces, and passes the detected feature points to MyImageView for display.

Figure 1: Single Face Detected in Android

You can specify the maximum number of faces to be detected through the FaceDetector constructor; modify the MAX_FACES variable for this purpose. The API documentation does not specify whether an upper limit exists, so you can try to detect as many faces as possible, then use the count returned by findFaces() to obtain all the results from the list. Figure 2 is one example showing multiple markers centered on the respective midpoints of the detected faces.

Figure 2: Multiple Faces Detected in Android

The Android face detector returns other information as well that lets us fine-tune the results a little: eyesDistance, pose, and confidence. We can use eyesDistance to estimate where the eye centers are. This time we also run setFace() on a background thread inside doLengthyCalc(), because face detection can take long enough to cause the "Application Not Responding" error when dealing with big images or images with many faces to detect. Figure 3 is one example showing multiple markers centered on the respective eyes of the detected faces.
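The eye estimate just described is plain arithmetic on the values the detector reports: each eye sits half of eyesDistance() to either side of the reported midpoint. A minimal sketch, free of any Android classes (EyeEstimate and eyePositions are illustrative names, not part of the tutorial source):

```java
public class EyeEstimate {
    // Estimate eye positions from the point between the eyes and the
    // eye-to-eye distance; returns {leftX, leftY, rightX, rightY}.
    static float[] eyePositions(float midX, float midY, float eyesDistance) {
        return new float[] {
            midX - eyesDistance / 2, midY,   // left eye
            midX + eyesDistance / 2, midY    // right eye
        };
    }

    public static void main(String[] args) {
        float[] p = eyePositions(100f, 80f, 40f);
        System.out.println(p[0] + "," + p[1] + "," + p[2] + "," + p[3]);
        // prints 80.0,80.0,120.0,80.0
    }
}
```

The same two offsets appear later in the activity's setFace() when it fills fpx and fpy for display.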
Figure 3: Eyes Detected in Android

Generally speaking, face detection is mostly achieved by searching for high-contrast areas that resemble facial features, so results from grayscale images are usually not far off those from color images, although some researchers are still trying to improve the accuracy of face detection in color images. In reality, other factors such as lighting and occlusion have an even bigger impact on accuracy. We ran some sample grayscale and color images through the Android APIs and got similar results, so the APIs appear to ignore the different color channels. One example is shown below in Figure 4.

Figure 4: Grayscale Face Detected in Android

In this tutorial, we introduced the simple face detector in the Android APIs and worked through a real example. The entire software package is available for download; you can import it into Eclipse by selecting "Create project from existing source." If you are interested in exploring Android face detection further, here are some helpful considerations:

Chunyen Liu has been a software professional for many years. Some of his applications were among the winners at programming contests administered by Sun, ACM, and IBM. He has co-authored software patents, written over 20 articles, reviewed books, and created numerous hobby apps at Androidlet and The J Maker. He holds advanced degrees in Computer Science with coursework from over 20 graduate-level courses.
On the non-technical side, he is a tournament-ranked table tennis player, certified umpire, and certified coach of USA Table Tennis.
Face Detection with Android APIs
April 18, 2012
How To Install Android Face Detection APIs
Constructing An Android Activity For Face Detection
setFace() instantiates the FaceDetector object and calls findFaces(). The result is then stored in faces, and the face midpoints are passed on to MyImageView for display.

public class TutorialOnFaceDetect1 extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 1;
    private static String TAG = "TutorialOnFaceDetect";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);

        // perform face detection and set the feature points
        setFace();
        mIV.invalidate();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face [] faces = new FaceDetector.Face[MAX_FACES];
        PointF midpoint = new PointF();
        int [] fpx = null;
        int [] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detect any faces
        if (count > 0) {
            fpx = new int[count];
            fpy = new int[count];

            for (int i = 0; i < count; i++) {
                try {
                    faces[i].getMidPoint(midpoint);
                    fpx[i] = (int)midpoint.x;
                    fpy[i] = (int)midpoint.y;
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }

        mIV.setDisplayPoints(fpx, fpy, count, 0);
    }
}

In the following code, we add setDisplayPoints() to our MyImageView to render markers at the detected face features. Figure 1 shows a marker centered on the midpoint of the detected face.

// set up detected face features for display
public void setDisplayPoints(int [] xx, int [] yy, int total, int style) {
    mDisplayStyle = style;
    mPX = null;
    mPY = null;

    if (xx != null && yy != null && total > 0) {
        mPX = new int[total];
        mPY = new int[total];

        for (int i = 0; i < total; i++) {
            mPX[i] = xx[i];
            mPY[i] = yy[i];
        }
    }
}
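setDisplayPoints() hands the detected coordinates to MyImageView, which copies them defensively, since findFaces() can leave the arrays null when nothing is detected. The same guard-and-copy pattern in plain Java (PointStore is an illustrative name, not part of the tutorial source):

```java
import java.util.Arrays;

public class PointStore {
    private int[] px, py;

    // Copy the coordinates only when both arrays are present and non-empty,
    // mirroring the null/total guard in setDisplayPoints().
    void setPoints(int[] xx, int[] yy, int total) {
        px = null;
        py = null;
        if (xx != null && yy != null && total > 0) {
            px = Arrays.copyOf(xx, total);
            py = Arrays.copyOf(yy, total);
        }
    }

    int count() { return px == null ? 0 : px.length; }
}
```

Without the guard, a detection pass that finds no faces would crash the view the next time it redraws.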
Android Face Detection: Detecting Multiple Faces
private static final int MAX_FACES = 10;
Android Face Detection: Approximating Eye Center Locations
public class TutorialOnFaceDetect extends Activity {
    private MyImageView mIV;
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 10;
    private static String TAG = "TutorialOnFaceDetect";
    private static boolean DEBUG = false;

    protected static final int GUIUPDATE_SETFACE = 999;

    protected Handler mHandler = new Handler(){
        // @Override
        public void handleMessage(Message msg) {
            mIV.invalidate();
            super.handleMessage(msg);
        }
    };

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);
        mIV.invalidate();

        // perform face detection in setFace() in a background thread
        doLengthyCalc();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face [] faces = new FaceDetector.Face[MAX_FACES];
        PointF eyescenter = new PointF();
        float eyesdist = 0.0f;
        int [] fpx = null;
        int [] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detect any faces
        if (count > 0) {
            fpx = new int[count * 2];
            fpy = new int[count * 2];

            for (int i = 0; i < count; i++) {
                try {
                    faces[i].getMidPoint(eyescenter);
                    eyesdist = faces[i].eyesDistance();

                    // set up left eye location
                    fpx[2 * i] = (int)(eyescenter.x - eyesdist / 2);
                    fpy[2 * i] = (int)eyescenter.y;

                    // set up right eye location
                    fpx[2 * i + 1] = (int)(eyescenter.x + eyesdist / 2);
                    fpy[2 * i + 1] = (int)eyescenter.y;

                    if (DEBUG) {
                        Log.e(TAG, "setFace(): face " + i + ": confidence = " + faces[i].confidence()
                            + ", eyes distance = " + faces[i].eyesDistance()
                            + ", pose = ("+ faces[i].pose(FaceDetector.Face.EULER_X) + ","
                            + faces[i].pose(FaceDetector.Face.EULER_Y) + ","
                            + faces[i].pose(FaceDetector.Face.EULER_Z) + ")"
                            + ", eyes midpoint = (" + eyescenter.x + "," + eyescenter.y +")");
                    }
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }

        mIV.setDisplayPoints(fpx, fpy, count * 2, 1);
    }

    private void doLengthyCalc() {
        Thread t = new Thread() {
            @Override
            public void run() {
                Message m = new Message();
                try {
                    setFace();
                    m.what = TutorialOnFaceDetect.GUIUPDATE_SETFACE;
                    TutorialOnFaceDetect.this.mHandler.sendMessage(m);
                } catch (Exception e) {
                    Log.e(TAG, "doLengthyCalc(): " + e.toString());
                }
            }
        };
        t.start();
    }
}
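The listing above pairs a worker Thread with a Handler so the slow findFaces() call never blocks the UI thread. Outside Android, the same shape can be sketched with an ExecutorService and a completion callback (BackgroundCalc and its method names are illustrative, not part of the tutorial source):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

public class BackgroundCalc {
    // Run slow work off the calling thread, then fire a completion callback --
    // the plain-Java shape of doLengthyCalc() plus Handler in the activity,
    // where the callback stands in for mHandler.sendMessage().
    static void runInBackground(Runnable slowWork, Runnable onDone, ExecutorService pool) {
        pool.submit(() -> {
            slowWork.run();   // e.g. setFace()
            onDone.run();     // e.g. post GUIUPDATE_SETFACE back to the UI thread
        });
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        AtomicBoolean done = new AtomicBoolean(false);
        runInBackground(() -> {}, () -> done.set(true), pool);
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("done = " + done.get()); // prints done = true
    }
}
```

On Android the callback must still be marshalled back to the UI thread, which is exactly what the Handler in the activity does.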
Android Face Detection: Color vs. Grayscale
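As the tutorial notes, the detector keys mostly on luminance contrast, which is why grayscale input performs about as well as color. A grayscale test image can be produced with the standard BT.601 luma weights; a minimal plain-Java sketch (Luma is an illustrative name, not part of the tutorial source):

```java
public class Luma {
    // Convert a packed ARGB pixel to its BT.601 luma (0-255); a grayscale
    // image built this way preserves exactly the luminance contrast that
    // the detector searches for.
    static int toGray(int argb) {
        int r = (argb >> 16) & 0xFF;
        int g = (argb >> 8) & 0xFF;
        int b = argb & 0xFF;
        return (int) Math.round(0.299 * r + 0.587 * g + 0.114 * b);
    }

    public static void main(String[] args) {
        System.out.println(toGray(0xFFFFFFFF)); // white, prints 255
        System.out.println(toGray(0xFF000000)); // black, prints 0
    }
}
```

Applying toGray() to every pixel of a color bitmap yields the kind of grayscale input used for the comparison in Figure 4.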

Conclusion
Android Face Detection Code Download
About the Author