• Cropping a specified region out of a camera image


    Reposted from http://blog.csdn.net/whf727/article/details/14522635

    I've recently been working on something like a QR code scanner, which likewise needs to grab the image inside a specific region of the frame. Let's go straight to the most important parts of the code.

    The code below initializes the AV capture part, so the camera image can be displayed in the view. First, a quick rundown of the problems I hit along the way and how I solved them.

    1. Punching a "hole" in a layer, which is the actual crop region, is done with a CAShapeLayer: using the even-odd fill rule (kCAFillRuleEvenOdd), the shape layer can be applied as a mask to the coverLayer that sits on top of the previewLayer. A minimal sketch of the idea follows.
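
    For reference, here is that even-odd mask trick in isolation (the overlay and cropRect names are illustrative; the real implementation is in setCropRect: and initOtherLayers below):

        // Punch a transparent "hole" in a dimmed overlay via an even-odd mask.
        CAShapeLayer *mask = [CAShapeLayer layer];
        CGMutablePathRef path = CGPathCreateMutable();
        CGPathAddRect(path, NULL, cropRect);        // inner rect: the hole
        CGPathAddRect(path, NULL, overlay.bounds);  // outer rect: the full layer
        mask.fillRule = kCAFillRuleEvenOdd;         // fill only between the two rects
        mask.path = path;
        CGPathRelease(path);
        overlay.mask = mask; // overlay now covers everything except cropRect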

    2. Getting the whole frame image is easy: it can be taken from the sampleBuffer in the delegate callback. I use AVCaptureVideoDataOutput here, which keeps delivering the sampled frame stream continuously.
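
    In sketch form, the continuous capture is just this (essentially what initAVCapture below sets up):

        // A serial queue plus AVCaptureVideoDataOutput delivers every frame to the delegate.
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.alwaysDiscardsLateVideoFrames = YES; // drop frames if we fall behind
        dispatch_queue_t queue = dispatch_queue_create("sample_queue", DISPATCH_QUEUE_SERIAL);
        [output setSampleBufferDelegate:self queue:queue];
        // frames then arrive in -captureOutput:didOutputSampleBuffer:fromConnection: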

    3. Extracting the crop region's image from the whole frame. This is where most of the time and effort went; for a long while I simply could not get the cropped region right. I first tried CGImageCreateWithImageInRect, but the resulting image's position and size were wrong; then I switched to a CGContext-based approach, and it was still not quite right. After a lot of googling and pondering, the cause became clear: the layer's presentation mode was an aspect fill/fit mode, so the actual image size is not the same as the screen size. Once that was certain, the fix followed: for each videoGravity mode, compute where the crop region actually falls within the image. That is what the calcRect method does; it maps the "hole" cut out of the screen to the corresponding position and size in the image. A worked example follows.
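
    As a sanity check of that mapping (the numbers are illustrative): suppose the preview layer is 320×480 points with aspect-fit gravity and the captured image is 360×480. The image ratio (480/360 ≈ 1.33) is smaller than the screen ratio (480/320 = 1.5), so the image is shown 320 points wide and about 427 points tall, letterboxed with roughly 27 points above and below, at a display scale of 320/360 ≈ 0.89. An on-screen cropRect of (60, 100, 200, 200) then maps to about (60/0.89, (100 − 27)/0.89, 200/0.89, 200/0.89) ≈ (68, 82, 225, 225) in image coordinates, which is exactly the computation calcRect performs below.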



    It finally works. Take a look if you're interested.


        //
        //  ScanView.m
        //  xxoo
        //
        //  Created by Tommy on 13-11-6.
        //  Copyright (c) 2013 Tommy. All rights reserved.
        //

        #import "ScanView.h"
        #import <AVFoundation/AVFoundation.h>

        static inline double radians (double degrees) {return degrees * M_PI/180;}

        @interface ScanView()<AVCaptureVideoDataOutputSampleBufferDelegate>

        @property AVCaptureVideoPreviewLayer* previewLayer;
        @property AVCaptureSession* session;
        @property AVCaptureDevice* videoDevice;
        @property dispatch_queue_t camera_sample_queue;
        @property CALayer* coverLayer;
        @property CAShapeLayer* cropLayer;
        @property CALayer* stillImageLayer;
        @property AVCaptureStillImageOutput* stillImageOutput;

        @property UIImageView* stillImageView;
        @property UIImage* cropImage;

        @property BOOL hasSetFocus;

        @end

        @implementation ScanView

        - (id)initWithFrame:(CGRect)frame
        {
            self = [super initWithFrame:frame];
            if (self) {
                // Initialization code
                self.hasSetFocus = NO;
                [self initAVCapture];
                [self initOtherLayers];
            }
            return self;
        }

        /*
        // Only override drawRect: if you perform custom drawing.
        // An empty implementation adversely affects performance during animation.
        - (void)drawRect:(CGRect)rect
        {
            // Drawing code
        }
        */
        - (void)layoutSubviews
        {
            [self.previewLayer setFrame:self.bounds];
            [self.coverLayer setFrame:self.bounds];
            // Re-apply the crop mask whenever layout changes.
            self.coverLayer.mask = self.cropLayer;
        }

        // Set up the capture session, video data output, and preview layer.
        - (void) initAVCapture{

            self.cropRect = CGRectZero;

            self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            AVCaptureDeviceInput* input = [[AVCaptureDeviceInput alloc] initWithDevice:self.videoDevice error:nil];

            // Deliver sampled frames to the delegate on a serial queue.
            AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
            output.alwaysDiscardsLateVideoFrames = YES;
            self.camera_sample_queue = dispatch_queue_create("com.scan.video.sample_queue", DISPATCH_QUEUE_SERIAL);
            [output setSampleBufferDelegate:self queue:self.camera_sample_queue];

            NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
            NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
            NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
            [output setVideoSettings:videoSettings];

            self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            NSDictionary* outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG};
            [self.stillImageOutput setOutputSettings:outputSettings];

            self.session = [[AVCaptureSession alloc] init];
            self.session.sessionPreset = AVCaptureSessionPresetMedium;

            if ([self.session canAddInput:input])
            {
                [self.session addInput:input];

                if ([self.session canAddOutput:output])
                {
                    [self.session addOutput:self.stillImageOutput];
                    [self.session addOutput:output];

                    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
                    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

                    [self.layer addSublayer:self.previewLayer];

                    return; // success
                }
            }

            self.session = nil; // configuration failed
        }

        - (void)setCropRect:(CGRect)cropRect
        {
            _cropRect = cropRect;
            if(!CGRectEqualToRect(CGRectZero, self.cropRect)){

                self.cropLayer = [[CAShapeLayer alloc] init];
                CGMutablePathRef path = CGPathCreateMutable();

                // Two rects plus even-odd fill: the area between them is filled,
                // while the inner cropRect stays clear and becomes the "hole".
                CGPathAddRect(path, NULL, self.cropRect);
                CGPathAddRect(path, NULL, self.bounds);

                [self.cropLayer setFillRule:kCAFillRuleEvenOdd];
                [self.cropLayer setPath:path];
                [self.cropLayer setFillColor:[[UIColor whiteColor] CGColor]];
                CGPathRelease(path); // the layer retains the path; avoid leaking it

                [self.cropLayer setNeedsDisplay];

                //[self setVideoFocus];

            }

            [self.stillImageLayer setFrame:CGRectMake(100, 450, CGRectGetWidth(cropRect), CGRectGetHeight(cropRect))];
        }

        // Focus once on the center of the crop region.
        - (void) setVideoFocus{

            NSError *error;
            CGPoint focusPoint = CGPointMake(CGRectGetMidX(self.cropRect), CGRectGetMidY(self.cropRect));
            if([self.videoDevice isFocusPointOfInterestSupported]
               && [self.videoDevice lockForConfiguration:&error] && !self.hasSetFocus){
                self.hasSetFocus = YES;
                [self.videoDevice setFocusPointOfInterest:[self convertToPointOfInterestFromViewCoordinates:focusPoint]];
                [self.videoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
                [self.videoDevice unlockForConfiguration];
            }
        //    [self.videoDevice setFocusMode:AVCaptureFocusModeAutoFocus];
            if (error) {
                NSLog(@"error:%@", error);
            }

        }

        // Map a point in view coordinates to a camera point of interest in the
        // (0,0)-(1,1) space, accounting for the preview layer's videoGravity.
        - (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates
        {
            CGPoint pointOfInterest = CGPointMake(.5f, .5f);
            CGSize frameSize = self.frame.size;

            AVCaptureVideoPreviewLayer *videoPreviewLayer = self.previewLayer;

            if ([self.previewLayer isMirrored]) {
                viewCoordinates.x = frameSize.width - viewCoordinates.x;
            }

            if ( [[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize] ) {
                pointOfInterest = CGPointMake(viewCoordinates.y / frameSize.height, 1.f - (viewCoordinates.x / frameSize.width));
            } else {
                CGRect cleanAperture;
                for (AVCaptureInputPort *port in [[[[self session] inputs] lastObject] ports]) {
                    if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
                        cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
                        CGSize apertureSize = cleanAperture.size;
                        CGPoint point = viewCoordinates;

                        CGFloat apertureRatio = apertureSize.height / apertureSize.width;
                        CGFloat viewRatio = frameSize.width / frameSize.height;
                        CGFloat xc = .5f;
                        CGFloat yc = .5f;

                        if ( [[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect] ) {
                            if (viewRatio > apertureRatio) {
                                CGFloat y2 = frameSize.height;
                                CGFloat x2 = frameSize.height * apertureRatio;
                                CGFloat x1 = frameSize.width;
                                CGFloat blackBar = (x1 - x2) / 2;
                                if (point.x >= blackBar && point.x <= blackBar + x2) {
                                    xc = point.y / y2;
                                    yc = 1.f - ((point.x - blackBar) / x2);
                                }
                            } else {
                                CGFloat y2 = frameSize.width / apertureRatio;
                                CGFloat y1 = frameSize.height;
                                CGFloat x2 = frameSize.width;
                                CGFloat blackBar = (y1 - y2) / 2;
                                if (point.y >= blackBar && point.y <= blackBar + y2) {
                                    xc = ((point.y - blackBar) / y2);
                                    yc = 1.f - (point.x / x2);
                                }
                            }
                        } else if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                            if (viewRatio > apertureRatio) {
                                CGFloat y2 = apertureSize.width * (frameSize.width / apertureSize.height);
                                xc = (point.y + ((y2 - frameSize.height) / 2.f)) / y2;
                                yc = (frameSize.width - point.x) / frameSize.width;
                            } else {
                                CGFloat x2 = apertureSize.height * (frameSize.height / apertureSize.width);
                                yc = 1.f - ((point.x + ((x2 - frameSize.width) / 2)) / x2);
                                xc = point.y / frameSize.height;
                            }

                        }

                        pointOfInterest = CGPointMake(xc, yc);
                        break;
                    }
                }
            }

            return pointOfInterest;
        }

        // Create the dimmed cover layer, the crop mask, and the debug views.
        - (void) initOtherLayers{
            self.coverLayer = [CALayer layer];

            self.coverLayer.backgroundColor = [[[UIColor blackColor] colorWithAlphaComponent:0.6] CGColor];
            [self.layer addSublayer:self.coverLayer];

            if(!CGRectEqualToRect(CGRectZero, self.cropRect)){

                self.cropLayer = [[CAShapeLayer alloc] init];
                CGMutablePathRef path = CGPathCreateMutable();

                CGPathAddRect(path, NULL, self.cropRect);
                CGPathAddRect(path, NULL, self.bounds);

                [self.cropLayer setFillRule:kCAFillRuleEvenOdd];
                [self.cropLayer setPath:path];
                [self.cropLayer setFillColor:[[UIColor redColor] CGColor]];
                CGPathRelease(path); // the layer retains the path; avoid leaking it
            }

            self.stillImageLayer = [CALayer layer];
            self.stillImageLayer.backgroundColor = [[UIColor yellowColor] CGColor];
            self.stillImageLayer.contentsGravity = kCAGravityResizeAspect;
            [self.coverLayer addSublayer:self.stillImageLayer];

            self.stillImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 300, 100, 100)];
            self.stillImageView.backgroundColor = [UIColor redColor];
            self.stillImageView.contentMode = UIViewContentModeScaleAspectFit;
            [self addSubview:self.stillImageView];

            self.previewLayer.contentsGravity = kCAGravityResizeAspect;

        }

        // Called for every captured frame on camera_sample_queue.
        - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{

            [self setVideoFocus];

            UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
            self.cropImage = [self cropImageInRect:image];

            dispatch_async(dispatch_get_main_queue(), ^{

               [self.stillImageView setImage:image];
              // [self.stillImageLayer setContents:(id)[self.cropImage CGImage]];
            });

        }
        // Create a UIImage from the sample buffer data.
        - (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
        {
            // Get the CMSampleBuffer's Core Video image buffer for the media data
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            // Lock the base address of the pixel buffer
            CVPixelBufferLockBaseAddress(imageBuffer, 0);

            // Get the base address of the pixel buffer
            void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

            // Get the number of bytes per row of the pixel buffer
            size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
            // Get the width and height of the pixel buffer
            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);

            //NSLog(@"%zu,%zu",width,height);

            // Create a device-dependent RGB color space
            CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

            // Create a bitmap graphics context with the sample buffer data
            CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                         bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

            // Create a Quartz image from the pixel data in the bitmap context
            CGImageRef quartzImage = CGBitmapContextCreateImage(context);
            // Unlock the pixel buffer
            CVPixelBufferUnlockBaseAddress(imageBuffer,0);

            // Release the context and color space
            CGContextRelease(context);
            CGColorSpaceRelease(colorSpace);

            // Create a UIImage from the Quartz image
            //UIImage *image = [UIImage imageWithCGImage:quartzImage];
            UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1.0f orientation:UIImageOrientationRight];

            // Release the Quartz image
            CGImageRelease(quartzImage);

            return (image);
        }


        // Map the on-screen cropRect to the corresponding rect in image coordinates,
        // taking the preview layer's videoGravity into account.
        - (CGRect) calcRect:(CGSize)imageSize{
            NSString* gravity = self.previewLayer.videoGravity;
            CGRect cropRect = self.cropRect;
            CGSize screenSize = self.previewLayer.bounds.size;

            CGFloat screenRatio = screenSize.height / screenSize.width;
            CGFloat imageRatio = imageSize.height / imageSize.width;

            // The rect (in view coordinates) that the image actually occupies on screen.
            CGRect presentImageRect = self.previewLayer.bounds;
            CGFloat scale = 1.0;

            if([AVLayerVideoGravityResizeAspect isEqual:gravity]){

                CGFloat presentImageWidth = imageSize.width;
                CGFloat presentImageHeight = imageSize.height;
                if(screenRatio > imageRatio){
                    presentImageWidth = screenSize.width;
                    presentImageHeight = presentImageWidth * imageRatio;

                }else{
                    presentImageHeight = screenSize.height;
                    presentImageWidth = presentImageHeight / imageRatio;
                }

                presentImageRect.size = CGSizeMake(presentImageWidth, presentImageHeight);
                presentImageRect.origin = CGPointMake((screenSize.width-presentImageWidth)/2.0, (screenSize.height-presentImageHeight)/2.0);

            }else if([AVLayerVideoGravityResizeAspectFill isEqual:gravity]){

                CGFloat presentImageWidth = imageSize.width;
                CGFloat presentImageHeight = imageSize.height;
                if(screenRatio > imageRatio){
                    presentImageHeight = screenSize.height;
                    presentImageWidth = presentImageHeight / imageRatio;
                }else{
                    presentImageWidth = screenSize.width;
                    presentImageHeight = presentImageWidth * imageRatio;
                }

                presentImageRect.size = CGSizeMake(presentImageWidth, presentImageHeight);
                presentImageRect.origin = CGPointMake((screenSize.width-presentImageWidth)/2.0, (screenSize.height-presentImageHeight)/2.0);

            }else{
                NSAssert(0, @"dont support:%@", gravity);
            }

            scale = CGRectGetWidth(presentImageRect) / imageSize.width;

            // Translate the crop rect into the displayed image's coordinate space,
            // then divide by the display scale to get image coordinates.
            CGRect rect = cropRect;
            rect.origin = CGPointMake(CGRectGetMinX(cropRect)-CGRectGetMinX(presentImageRect), CGRectGetMinY(cropRect)-CGRectGetMinY(presentImageRect));

            rect.origin.x /= scale;
            rect.origin.y /= scale;
            rect.size.width /= scale;
            rect.size.height /= scale;

            return rect;
        }

        #define SUBSET_SIZE 360

        - (UIImage*) cropImageInRect:(UIImage*)image{

            CGSize size = [image size];
            CGRect cropRect = [self calcRect:size];

            // Shrink so the crop's shorter side is at most SUBSET_SIZE; never upscale.
            float scale = fminf(1.0f, fmaxf(SUBSET_SIZE / cropRect.size.width, SUBSET_SIZE / cropRect.size.height));
            CGPoint offset = CGPointMake(-cropRect.origin.x, -cropRect.origin.y);

            size_t subsetWidth = cropRect.size.width * scale;
            size_t subsetHeight = cropRect.size.height * scale;

            // Render into a grayscale bitmap context.
            CGColorSpaceRef grayColorSpace = CGColorSpaceCreateDeviceGray();

            CGContextRef ctx =
            CGBitmapContextCreate(nil,
                                  subsetWidth,
                                  subsetHeight,
                                  8,
                                  0,
                                  grayColorSpace,
                                  kCGImageAlphaNone|kCGBitmapByteOrderDefault);
            CGColorSpaceRelease(grayColorSpace);
            CGContextSetInterpolationQuality(ctx, kCGInterpolationNone);
            CGContextSetAllowsAntialiasing(ctx, false);

            // adjust the coordinate system
            CGContextTranslateCTM(ctx, 0.0, subsetHeight);
            CGContextScaleCTM(ctx, 1.0, -1.0);

            // Draw the whole image shifted so that only the crop region
            // lands inside the subset-sized context.
            UIGraphicsPushContext(ctx);
            CGRect rect = CGRectMake(offset.x * scale, offset.y * scale, scale * size.width, scale * size.height);

            [image drawInRect:rect];

            UIGraphicsPopContext();

            CGContextFlush(ctx);

            CGImageRef subsetImageRef = CGBitmapContextCreateImage(ctx);

            UIImage* subsetImage = [UIImage imageWithCGImage:subsetImageRef];

            CGImageRelease(subsetImageRef);

            CGContextRelease(ctx);

            return subsetImage;
        }


        - (void) start{

            dispatch_sync(self.camera_sample_queue, ^{
                [self.session startRunning];
            });

        }
        - (void) stop{
            if(self.session){
                [self.session stopRunning];
            }

        }

        @end
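
    For completeness, a minimal usage sketch (this assumes cropRect is a public property declared in ScanView.h, which the post does not show):

        // Hypothetical wiring inside a view controller.
        ScanView *scanView = [[ScanView alloc] initWithFrame:self.view.bounds];
        scanView.cropRect = CGRectMake(60, 100, 200, 200); // the on-screen "hole"
        [self.view addSubview:scanView];
        [scanView start];
        // ... and later, e.g. in viewWillDisappear:
        [scanView stop];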
• Original post: https://www.cnblogs.com/allanliu/p/4207794.html