• A High-Performance UITableView Built on Core Text


    1. Drawing text

    With Core Text, text can be drawn into a CGContextRef; the finished context is then turned into an image via UIGraphicsGetImageFromCurrentImageContext(), and that image is assigned to cell.contentView.layer. This keeps the cell's view hierarchy flat and reduces the number of sublayers and subviews per cell.
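    As a rough sketch of that overall pattern (the FeedModel class, its drawInContext: method, and the background-queue dispatch are illustrative assumptions, not the demo's exact code), a cell can render its bitmap off the main thread and then hand it to the layer:

    // Hypothetical cell method: render everything into one bitmap, then set it as the layer contents
    - (void)asyncRenderWithModel:(FeedModel *)model {
        dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
            UIGraphicsBeginImageContextWithOptions(model.cellSize, YES, 0);
            CGContextRef context = UIGraphicsGetCurrentContext();
            [model drawInContext:context];                       // Core Text drawing goes here
            UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            dispatch_async(dispatch_get_main_queue(), ^{
                // One bitmap replaces many labels/subviews in the cell
                self.contentView.layer.contents = (__bridge id)image.CGImage;
            });
        });
    }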

    Drawing plain text (for example, the user's nickname) into the context; the relevant comments are in the code:

    - (void)drawInContext:(CGContextRef)context withPosition:(CGPoint)p andFont:(UIFont *)font andTextColor:(UIColor *)color andHeight:(float)height andWidth:(float)width lineBreakMode:(CTLineBreakMode)lineBreakMode {
        CGSize size = CGSizeMake(width, height);
        // Flip the coordinate system (Core Text uses a bottom-left origin)
        CGContextSetTextMatrix(context, CGAffineTransformIdentity);
        CGContextTranslateCTM(context, 0, height);
        CGContextScaleCTM(context, 1.0, -1.0);

        NSMutableDictionary *attributes = [StringAttributes attributeFont:font andTextColor:color lineBreakMode:lineBreakMode];

        // Create the drawing area (path)
        CGMutablePathRef path = CGPathCreateMutable();
        CGPathAddRect(path, NULL, CGRectMake(p.x, height - p.y - size.height, size.width, size.height));

        // Create the attributed string
        NSMutableAttributedString *attributedStr = [[NSMutableAttributedString alloc] initWithString:self attributes:attributes];
        CFAttributedStringRef attributedString = (__bridge CFAttributedStringRef)attributedStr;

        // Create the frame and draw it
        CTFramesetterRef framesetter = CTFramesetterCreateWithAttributedString(attributedString);
        CTFrameRef ctframe = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, 0), path, NULL);
        CTFrameDraw(ctframe, context);
        CGPathRelease(path);
        CFRelease(framesetter);
        CFRelease(ctframe);
        [[attributedStr mutableString] setString:@""];
        // Restore the coordinate system
        CGContextSetTextMatrix(context, CGAffineTransformIdentity);
        CGContextTranslateCTM(context, 0, height);
        CGContextScaleCTM(context, 1.0, -1.0);
    }




    Drawing the Moments post text (which may contain links) into the context. I have not yet implemented collapsing of long text. Unlike the plain text above, this requires building an attributed string with link highlighting and drawing line by line with CTLineRef:

    - (NSMutableAttributedString *)highlightText:(NSMutableAttributedString *)coloredString {
        // Build the attributed string with link highlighting
        NSString *string = coloredString.string;
        NSRange range = NSMakeRange(0, [string length]);
        NSDataDetector *linkDetector = [NSDataDetector dataDetectorWithTypes:NSTextCheckingTypeLink error:nil];
        NSArray *matches = [linkDetector matchesInString:string options:0 range:range];

        for (NSTextCheckingResult *match in matches) {
            [self.ranges addObject:NSStringFromRange(match.range)];
            UIColor *highlightColor = UIColorFromRGB(0x297bc1);
            [coloredString addAttribute:(NSString *)kCTForegroundColorAttributeName
                                  value:(id)highlightColor.CGColor range:match.range];
        }

        return coloredString;
    }
      
    - (void)drawFramesetter:(CTFramesetterRef)framesetter
           attributedString:(NSAttributedString *)attributedString
                  textRange:(CFRange)textRange
                     inRect:(CGRect)rect
                    context:(CGContextRef)c {
        CGMutablePathRef path = CGPathCreateMutable();
        CGPathAddRect(path, NULL, rect);
        CTFrameRef frame = CTFramesetterCreateFrame(framesetter, textRange, path, NULL);

        CGFloat contentHeight = CGRectGetHeight(rect);
        CFArrayRef lines = CTFrameGetLines(frame);
        NSInteger numberOfLines = CFArrayGetCount(lines);

        CGPoint lineOrigins[numberOfLines];
        CTFrameGetLineOrigins(frame, CFRangeMake(0, numberOfLines), lineOrigins);

        // Iterate over every line
        for (CFIndex lineIndex = 0; lineIndex < numberOfLines; lineIndex++) {
            CGPoint lineOrigin = lineOrigins[lineIndex];
            CTLineRef line = CFArrayGetValueAtIndex(lines, lineIndex);

            CGFloat descent = 0.0f, ascent = 0.0f, lineLeading = 0.0f;
            CTLineGetTypographicBounds((CTLineRef)line, &ascent, &descent, &lineLeading);

            CGFloat penOffset = (CGFloat)CTLineGetPenOffsetForFlush(line, NSTextAlignmentLeft, rect.size.width);
            CGFloat y = lineOrigin.y - descent - self.font.descender;

            // Position and draw each line
            CGContextSetTextPosition(c, penOffset + self.xOffset, y - self.yOffset);
            CTLineDraw(line, c);

            // A CTRunRef is a span of the line with uniform attributes (color, font, ...);
            // the runs are used here to locate the highlighted links
            CFArrayRef runs = CTLineGetGlyphRuns(line);
            for (int j = 0; j < CFArrayGetCount(runs); j++) {
                CGFloat runAscent, runDescent, lineLeading1;

                CTRunRef run = CFArrayGetValueAtIndex(runs, j);
                NSDictionary *attributes = (__bridge NSDictionary *)CTRunGetAttributes(run);
                // Is this run a link? (its foreground color differs from the normal text color)
                if (!CGColorEqualToColor((__bridge CGColorRef)([attributes valueForKey:@"CTForegroundColor"]), self.textColor.CGColor)) {
                    CFRange range = CTRunGetStringRange(run);
                    float offset = CTLineGetOffsetForStringIndex(line, range.location, NULL);

                    // Compute the CGRect of the link run
                    CGRect runRect;
                    runRect.size.width = CTRunGetTypographicBounds(run, CFRangeMake(0, 0), &runAscent, &runDescent, &lineLeading1);
                    runRect.size.height = self.font.lineHeight;
                    runRect.origin.x = lineOrigin.x + offset + self.xOffset;
                    runRect.origin.y = lineOrigin.y;
                    runRect.origin.y -= descent + self.yOffset;

                    // Because the coordinate system is flipped, the link's rect in UIKit coordinates
                    // has to be computed with a CGAffineTransform
                    CGAffineTransform transform = CGAffineTransformMakeTranslation(0, contentHeight);
                    transform = CGAffineTransformScale(transform, 1.f, -1.f);
                    CGRect flipRect = CGRectApplyAffineTransform(runRect, transform);

                    // Store the link's rect
                    NSRange nRange = NSMakeRange(range.location, range.length);
                    self.framesDict[NSStringFromRange(nRange)] = [NSValue valueWithCGRect:flipRect];

                    // Store every rect belonging to the same link, used for the highlight background on touch
                    for (NSString *rangeString in self.ranges) {
                        NSRange range = NSRangeFromString(rangeString);
                        if (NSLocationInRange(nRange.location, range)) {
                            NSMutableArray *array = self.relationDict[rangeString];
                            if (array) {
                                [array addObject:NSStringFromCGRect(flipRect)];
                                self.relationDict[rangeString] = array;
                            } else {
                                self.relationDict[rangeString] = [NSMutableArray arrayWithObject:NSStringFromCGRect(flipRect)];
                            }
                        }
                    }

                }
            }
        }

        CFRelease(frame);
        CFRelease(path);
    }






    Putting the methods above to use:

    - (void)fillData:(CGContextRef)context {
        [self.nickname drawInContext:context withPosition:(CGPoint){kTextXOffset, kSpec} andFont:kNicknameFont
                        andTextColor:UIColorFromRGB(0x556c95) andHeight:self.nicknameSize.height
                            andWidth:self.nicknameSize.width lineBreakMode:kCTLineBreakByTruncatingTail];
        [self.drawer setText:self.contentString context:context contentSize:self.contentSize
             backgroundColor:[UIColor whiteColor] font:kContentTextFont textColor:[UIColor blackColor]
                       block:nil xOffset:kTextXOffset yOffset:kSpec * 2 + self.nicknameSize.height];
    }

    - (void)fillContents:(NSArray *)array {
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(self.size.width, self.size.height), YES, 0);
        CGContextRef context = UIGraphicsGetCurrentContext();
        [UIColorFromRGB(0xffffff) set];
        CGContextFillRect(context, CGRectMake(0, 0, self.size.width, self.size.height));

        // Fill the background color of the link rects that need to be highlighted
        if (array) {
            for (NSString *string in array) {
                CGRect rect = CGRectFromString(string);
                [UIColorFromRGB(0xe5e5e5) set];
                CGContextFillRect(context, rect);
            }
        }

        [self fillData:context];

        UIImage *temp = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        self.contentView.layer.contents = (__bridge id _Nullable)(temp.CGImage);
    }




    That completes the text rendering.
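    For context, the drawing is driven from the table view's data source roughly like this (FriendCircleCell and the layout property are assumed names, not necessarily those used in the demo):

    - (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
        FriendCircleCell *cell = [tableView dequeueReusableCellWithIdentifier:@"cell" forIndexPath:indexPath];
        cell.layout = self.layouts[indexPath.row];   // precomputed sizes, attributed strings and link ranges
        [cell fillContents:nil];                     // nil: no link is pressed, so no highlight rects to fill
        return cell;
    }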

    2. Displaying images

    Images include the user avatar and the photos in a post. Here a CALayer is simply added to contentView.layer; concretely, CALayer is subclassed and a small amount of functionality is implemented on it.

    Displaying an image from a URL:

    - (void)setContentsWithURLString:(NSString *)urlString {

        self.contents = (__bridge id _Nullable)([UIImage imageNamed:@"placeholder"].CGImage);
        @weakify(self)
        SDWebImageManager *manager = [SDWebImageManager sharedManager];
        [manager downloadImageWithURL:[NSURL URLWithString:urlString]
                              options:SDWebImageCacheMemoryOnly
                             progress:nil
                            completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL) {
                                if (image) {
                                    @strongify(self)
                                    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                                        if (!_observer) {

                                            // Set the layer contents from a main run loop observer
                                            // (fires when the run loop is about to sleep or exit)
                                            _observer = CFRunLoopObserverCreateWithHandler(kCFAllocatorDefault, kCFRunLoopBeforeWaiting | kCFRunLoopExit, false, POPAnimationApplyRunLoopOrder, ^(CFRunLoopObserverRef observer, CFRunLoopActivity activity) {
                                                self.contents = (__bridge id _Nullable)(image.CGImage);
                                            });

                                            if (_observer) {
                                                CFRunLoopAddObserver(CFRunLoopGetMain(), _observer, kCFRunLoopCommonModes);
                                            }
                                        }
                                    });
                                    self.originImage = image;
                                }
                            }];
    }
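    Used roughly like this from the cell; ImageLayer stands in for the CALayer subclass, whose real name is not shown in the post, and the frame values are assumptions:

    // Hypothetical usage: ImageLayer is the CALayer subclass that implements setContentsWithURLString:
    ImageLayer *avatarLayer = [ImageLayer layer];
    avatarLayer.frame = CGRectMake(kSpec, kSpec, 40, 40);      // kSpec as used above; 40x40 is an assumed avatar size
    avatarLayer.contentsGravity = kCAGravityResizeAspectFill;
    avatarLayer.masksToBounds = YES;
    [self.contentView.layer addSublayer:avatarLayer];
    [avatarLayer setContentsWithURLString:model.avatarURLString];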




    3. Playing short videos

    An earlier post briefly covered how to build a simple video player, and it comes in handy here. The CALayer that shows the video's cover image can be reused to display the video frames themselves.

    An NSOperationQueue is used to keep video playback smooth. The relevant code of VideoDecodeOperation, an NSOperation subclass, is shown below (a sketch of how it might be enqueued follows the listing):

    - (void)main {
      
        @autoreleasepool{
      
            if(self.isCancelled) {
                _newVideoFrameBlock = nil;
                _decodeFinishedBlock = nil;
                return;
            }
      
            AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[[NSURL alloc] initFileURLWithPath:self.filePath] options:nil];
            NSError *error;
            AVAssetReader* reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
            if(error) {
                return;
            }
      
            NSArray* videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
            AVAssetTrack* videoTrack = [videoTracks objectAtIndex:0];
            // For playback, m_pixelFormatType = kCVPixelFormatType_32BGRA
            // For other uses such as video compression, m_pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
            int m_pixelFormatType = kCVPixelFormatType_32BGRA;
            NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:(int)m_pixelFormatType]
                                                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];
            AVAssetReaderTrackOutput* videoReaderOutput = [[AVAssetReaderTrackOutput alloc]
                    initWithTrack:videoTrack outputSettings:options];
            [reader addOutput:videoReaderOutput];
            [reader startReading];
            // Make sure nominalFrameRate > 0; we have seen 0-fps videos recorded on Android
            if(self.isCancelled) {
                _newVideoFrameBlock = nil;
                _decodeFinishedBlock = nil;
                return;
            }
      
            while([reader status] == AVAssetReaderStatusReading && videoTrack.nominalFrameRate > 0) {
                if(self.isCancelled) {
                    _newVideoFrameBlock = nil;
                    _decodeFinishedBlock = nil;
                    return;
                }
      
                CMSampleBufferRef sampleBuffer = [videoReaderOutput copyNextSampleBuffer];
                CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
      
                // Lock the base address of the pixel buffer
                CVPixelBufferLockBaseAddress(imageBuffer, 0);
      
                // Get the number of bytes per row for the pixel buffer
                size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
      
                // Get the pixel buffer width and height
                size_t width = CVPixelBufferGetWidth(imageBuffer);
                size_t height = CVPixelBufferGetHeight(imageBuffer);
      
                // Generate an image to edit
                unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);
      
                CGColorSpaceRef colorSpace=CGColorSpaceCreateDeviceRGB();
                CGContextRef context=CGBitmapContextCreate(pixel, width, height, 8, bytesPerRow, colorSpace,
                                                           kCGBitmapByteOrder32Little|kCGImageAlphaPremultipliedFirst);
                if(context != NULL) {
                    CGImageRef imageRef = CGBitmapContextCreateImage(context);
      
                    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
                    CGColorSpaceRelease(colorSpace);
                    CGContextRelease(context);
      
                    // Decode (decompress) the image
                    size_t width = CGImageGetWidth(imageRef);
                    size_t height = CGImageGetHeight(imageRef);
                    size_t bitsPerComponent = CGImageGetBitsPerComponent(imageRef);
      
                    // CGImageGetBytesPerRow() calculates incorrectly in iOS 5.0, so defer to CGBitmapContextCreate
                    size_t bytesPerRow = 0;
                    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
                    CGColorSpaceModel colorSpaceModel = CGColorSpaceGetModel(colorSpace);
                    CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
      
                    if(colorSpaceModel == kCGColorSpaceModelRGB) {
                        uint32_t alpha = (bitmapInfo & kCGBitmapAlphaInfoMask);
    #pragma clang diagnostic push
    #pragma clang diagnostic ignored "-Wassign-enum"
                        if(alpha == kCGImageAlphaNone) {
                            bitmapInfo &= ~kCGBitmapAlphaInfoMask;
                            bitmapInfo |= kCGImageAlphaNoneSkipFirst;
                        } else if (!(alpha == kCGImageAlphaNoneSkipFirst || alpha == kCGImageAlphaNoneSkipLast)) {
                            bitmapInfo &= ~kCGBitmapAlphaInfoMask;
                            bitmapInfo |= kCGImageAlphaPremultipliedFirst;
                        }
    #pragma clang diagnostic pop
                    }
      
                    CGContextRef context = CGBitmapContextCreate(NULL, width, height, bitsPerComponent,
                                                                 bytesPerRow, colorSpace, bitmapInfo);
      
                    CGColorSpaceRelease(colorSpace);
      
                    if(!context) {
                        if(self.newVideoFrameBlock) {
                            dispatch_async(dispatch_get_main_queue(), ^{
                                if(self.isCancelled) {
                                    _newVideoFrameBlock = nil;
                                    _decodeFinishedBlock = nil;
                                    return;
                                }
                                self.newVideoFrameBlock(imageRef,self.filePath);
                                CGImageRelease(imageRef);
                            });
                        }
                    }else{
      
                        CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, width, height), imageRef);
                        CGImageRef inflatedImageRef = CGBitmapContextCreateImage(context);
      
                        CGContextRelease(context);
                        if(self.newVideoFrameBlock) {
                            dispatch_async(dispatch_get_main_queue(), ^{
                                if(self.isCancelled) {
                                    _newVideoFrameBlock = nil;
                                    _decodeFinishedBlock = nil;
                                    return;
                                }
                                self.newVideoFrameBlock(inflatedImageRef,self.filePath);
      
                                CGImageRelease(inflatedImageRef);
                            });
                        }
                        CGImageRelease(imageRef);
                    }
      
                    if(sampleBuffer) {
                        CMSampleBufferInvalidate(sampleBuffer);
                        CFRelease(sampleBuffer);
                        sampleBuffer = NULL;
      
                    }else{
                        break;
                    }
                }
      
                [NSThread sleepForTimeInterval:CMTimeGetSeconds(videoTrack.minFrameDuration)];
            }
      
            if(self.isCancelled) {
                _newVideoFrameBlock = nil;
                _decodeFinishedBlock = nil;
                return;
            }
            if(self.decodeFinishedBlock) {
                self.decodeFinishedBlock(self.filePath);
            }
        }
    }
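    A rough sketch of how the operation might be enqueued (the decodeQueue property, the placement inside VideoPlayerManager, and maxConcurrentOperationCount = 1 are assumptions; the block and filePath properties match the listing above):

    // Hypothetical enqueueing code inside VideoPlayerManager
    VideoDecodeOperation *operation = [[VideoDecodeOperation alloc] init];
    operation.filePath = filePath;
    operation.newVideoFrameBlock = frameBlock;        // delivers each decoded CGImageRef on the main queue
    operation.decodeFinishedBlock = finishedBlock;    // called once the file has been fully read
    self.decodeQueue.maxConcurrentOperationCount = 1; // decode one video at a time to keep scrolling smooth
    [self.decodeQueue addOperation:operation];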




    The frames are force-decoded because UIImage normally defers decoding until the image is about to be displayed, which can stall the main thread; decompressing on a background thread avoids that.
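    The same trick applies to ordinary downloaded images; a minimal, self-contained sketch of forcing decompression off the main thread (not code from the demo):

    // Force-decode a UIImage by drawing it into a bitmap context on a background queue
    static UIImage *DecompressedImage(UIImage *image) {
        CGImageRef imageRef = image.CGImage;
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(NULL,
                                                     CGImageGetWidth(imageRef),
                                                     CGImageGetHeight(imageRef),
                                                     8, 0, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGColorSpaceRelease(colorSpace);
        if (!context) return image;                  // fall back to the original image
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(imageRef), CGImageGetHeight(imageRef)), imageRef);
        CGImageRef decoded = CGBitmapContextCreateImage(context);
        CGContextRelease(context);
        UIImage *result = [UIImage imageWithCGImage:decoded scale:image.scale orientation:image.imageOrientation];
        CGImageRelease(decoded);
        return result;
    }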

    Here is how it is used:

    - (void)playVideoWithFilePath:(NSString *)filePath_ type:(NSString *)type {
        @weakify(self)
        [[VideoPlayerManager shareInstance] decodeVideo:filePath_
                                  withVideoPerDataBlock:^(CGImageRef imageData, NSString *filePath) {
                                      @strongify(self)
                                      if ([type isEqualToString:@"video"]) {
                                          if ([filePath isEqualToString:self.filePath]) {
                                              [self.sources.firstObject
                                                      setContents:(__bridge id _Nullable)(imageData)];
                                          }
                                      }
                                  } decodeFinishBlock:^(NSString *filePath) {
                                      [self playVideoWithFilePath:filePath type:type];
                                  }];
    }
     
     


    4. Miscellaneous

    1. Touch interaction is implemented by overriding the following methods (a hit-testing sketch follows the list):

    - (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
    - (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
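    A minimal sketch of how link hit-testing might look inside those overrides, using the relationDict built during drawing; the touchesBegan: override, the touchedRangeString property, and calling fillContents: from here are assumptions about code not shown in the post:

    // Hypothetical touch handling on the object that owns relationDict and fillContents:
    - (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
        CGPoint point = [touches.anyObject locationInView:self];
        for (NSString *rangeString in self.relationDict.allKeys) {
            // A link may span several rects, e.g. when it wraps onto the next line
            for (NSString *rectString in self.relationDict[rangeString]) {
                if (CGRectContainsPoint(CGRectFromString(rectString), point)) {
                    self.touchedRangeString = rangeString;               // remember which link was hit
                    [self fillContents:self.relationDict[rangeString]];  // redraw with grey highlight rects
                    return;
                }
            }
        }
        [super touchesBegan:touches withEvent:event];
    }

    - (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
        if (self.touchedRangeString) {
            // Open the link here, then clear the highlight
            self.touchedRangeString = nil;
            [self fillContents:nil];
        } else {
            [super touchesEnded:touches withEvent:event];
        }
    }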










    2. The on-screen FPS readout uses YYFPSLabel from the YYKit project.

    Demo download:
    http://www.code4app.com/forum.php?mod=viewthread&tid=9469&extra=page%3D1%26filter%3Dsortid%26sortid%3D1

  • Original article: https://www.cnblogs.com/kengsir/p/5670075.html