Since iOS 5, Apple has shipped a face-detection API that can locate faces in an image, including the positions of the left eye, the right eye, and the mouth, each returned as a point. In iOS 7, Apple added the ability to detect whether a face is smiling. All of this is exposed through CIDetector; a small demo follows:
#import <CoreImage/CoreImage.h>   // first, import the required header

/** Holds the detected face features */
@property (nonatomic, strong) NSArray *features;

// The detection code itself:
UIImage *image = [[UIImage alloc] initWithContentsOfFile:self.imagePath];
NSLog(@"imagePath = %@", self.imagePath);

CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh
                                                                                  forKey:CIDetectorAccuracy]];
self.features = [faceDetector featuresInImage:[CIImage imageWithCGImage:image.CGImage]];

for (CIFaceFeature *feature in self.features) {
    if (feature.hasLeftEyePosition) {
        CGPoint leftEyePos = feature.leftEyePosition;
        NSLog(@"leftX = %f leftY = %f", leftEyePos.x, leftEyePos.y);
    }
    if (feature.hasRightEyePosition) {
        CGPoint rightEyePos = feature.rightEyePosition;
        NSLog(@"rightX = %f rightY = %f", rightEyePos.x, rightEyePos.y);
    }
    if (feature.hasMouthPosition) {
        CGPoint mouthPos = feature.mouthPosition;
        NSLog(@"mouthX = %f mouthY = %f", mouthPos.x, mouthPos.y);
    }
}
Here is what CIFaceFeature exposes:
@interface CIFaceFeature : CIFeature
{
    CGRect bounds;
    BOOL hasLeftEyePosition;
    CGPoint leftEyePosition;
    BOOL hasRightEyePosition;
    CGPoint rightEyePosition;
    BOOL hasMouthPosition;
    CGPoint mouthPosition;

    BOOL hasTrackingID;
    int trackingID;
    BOOL hasTrackingFrameCount;
    int trackingFrameCount;

    BOOL hasFaceAngle;
    float faceAngle;

    BOOL hasSmile;
    BOOL leftEyeClosed;
    BOOL rightEyeClosed;
}
As you can see, the hasSmile flag tells you whether the detected face is smiling.
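Note that hasSmile (and the eye-blink flags) are only populated when you explicitly request them via featuresInImage:options:. A minimal sketch, reusing the faceDetector and image from the demo above:

// Request smile and eye-blink detection (iOS 7+); without these options
// hasSmile / leftEyeClosed / rightEyeClosed stay at their default values.
NSArray *features = [faceDetector featuresInImage:[CIImage imageWithCGImage:image.CGImage]
                                          options:@{CIDetectorSmile: @YES,
                                                    CIDetectorEyeBlink: @YES}];
for (CIFaceFeature *feature in features) {
    NSLog(@"smiling = %d leftEyeClosed = %d rightEyeClosed = %d",
          feature.hasSmile, feature.leftEyeClosed, feature.rightEyeClosed);
}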
Finally, one thing to watch out for:
In the returned coordinates, the y axis starts at the bottom of the image. For example, if the image is 300 points tall and the left eye's y value is 100, the eye is 100 points above the bottom edge; converted to the top-down coordinate system we are used to, that is 200 points from the top.
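So if you want to draw the detected points (or the bounds rect) over a UIImageView, you need to flip the y value yourself. A minimal sketch, assuming imageHeight is the height of the image that was analysed (the helper name is mine):

// Convert a point from Core Image coordinates (origin at the bottom-left)
// to UIKit coordinates (origin at the top-left).
static CGPoint UIKitPointFromCIPoint(CGPoint ciPoint, CGFloat imageHeight) {
    return CGPointMake(ciPoint.x, imageHeight - ciPoint.y);
}

// Inside the loop from the demo above, e.g. for the left eye:
// imageHeight = 300, leftEyePosition.y = 100  ->  y = 200 from the top.
CGPoint leftEyeInUIKit = UIKitPointFromCIPoint(feature.leftEyePosition, image.size.height);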