GPUImage Source Code Analysis and Usage (Part 2): Sources and Using GPUImage

An introduction to Sources, Filters, Outputs, and the Pipeline

Sources

GPUImageOutput: the GPUImage class that anything sending frames down the chain (sources and filters) inherits from

GPUImageInput: the GPUImage protocol adopted by anything that receives frames (filters and final outputs)

Filter chain: input (image, video file, texture, raw binary data, etc.) -> output (video, view, etc.)

The start of the filter chain: the input

Whichever kind of input you start from, what ultimately gets filtered is texture data.

  1. GPUImagePicture: processes static images; essentially decompress the image -> texture -> filter processing
  2. GPUImageRawDataInput: raw binary data -> texture; the bytes are uploaded according to the specified pixel format
  3. GPUImageTextureInput: a texture; texture data that has already been decompressed from a compressed file
  4. GPUImageUIElement: UIView/CALayer; renders the content to be drawn into a Core Graphics context to obtain image data -> texture; used for snapshots or capturing the current contents of a UIView
    • Image watermark: blend a watermark texture at the desired position
    • Text watermark (UILabel): GPUImageUIElement can be used to add a text watermark (see the sketch after this list)
  5. GPUImageMovie: video file -> read frame by frame with AVAssetReader -> frame data converted into textures -> filter processing; AVAssetReaderOutput -> CMSampleBufferRef -> CVImageBufferRef -> CVOpenGLESTextureRef -> texture
  6. GPUImageVideoCamera: video recording; AVFoundation captures video -> didOutputSampleBuffer callback -> CVImageBufferRef -> CVOpenGLESTextureRef -> texture
  7. GPUImageStillCamera: a subclass of GPUImageVideoCamera, for taking photos; AVFoundation captures the image -> didOutputSampleBuffer callback -> CVImageBufferRef -> CVOpenGLESTextureRef -> texture
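As a minimal sketch of the text-watermark case above (the filter and filterView objects are assumed to come from an ordinary camera or movie pipeline, and the label setup is illustrative), a GPUImageUIElement wrapping a UILabel can be blended over the filtered frames with a GPUImageAlphaBlendFilter:

//Sketch: blend a UILabel as a text watermark over the filtered output
UILabel *watermarkLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, 240, 60)];
watermarkLabel.text = @"watermark";
watermarkLabel.textColor = [UIColor whiteColor];
watermarkLabel.backgroundColor = [UIColor clearColor];

GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:watermarkLabel];
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0; //show the watermark at full opacity where the label is opaque

[filter addTarget:blendFilter];     //first input: the filtered video frames
[uiElement addTarget:blendFilter];  //second input: the rendered label texture
[blendFilter addTarget:filterView]; //display the blended result

//Re-render the label for every processed frame so the watermark stays in sync
__weak GPUImageUIElement *weakElement = uiElement;
[filter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
    [weakElement updateWithTimestamp:time];
}];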

Filters: GPUImage provides over two hundred of them

The filter base class is GPUImageFilter; custom filters must subclass GPUImageFilter.

The end of the filter chain: the output

  1. GPUImageMovieWriter: for recorded video; takes the rendered texture data from the framebuffer and saves each frame to a file at the given path via AVAssetWriter, after which the saved file can be uploaded (e.g. with resumable upload) to a server (a sketch follows this list)
  2. GPUImageRawDataOutput: for uploading while recording; reads the raw binary data from the filter's framebuffer so it can be sent to a server
  3. GPUImageTextureOutput: outputs a texture; once rendering finishes you get the new texture
  4. GPUImageView: a UIView subclass; the texture is rendered onto its layer
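To make the movie-writer case concrete, here is a minimal sketch of an offline pipeline that reads a file, filters it, and writes the result back to disk. The URLs, output size, and choice of filter are illustrative, and audio handling and error checks are omitted:

//Sketch: GPUImageMovie -> filter -> GPUImageMovieWriter
NSURL *inputURL  = [[NSBundle mainBundle] URLForResource:@"input" withExtension:@"mp4"]; //illustrative
NSURL *outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"filtered.m4v"]];

GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:inputURL];
GPUImageSaturationFilter *filter = [[GPUImageSaturationFilter alloc] init];
GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL size:CGSizeMake(720.0, 1280.0)];

[movieFile addTarget:filter];
[filter addTarget:movieWriter];

__weak GPUImageMovieWriter *weakWriter = movieWriter;
[movieWriter setCompletionBlock:^{
    [weakWriter finishRecording]; //the filtered file is now at outputURL
}];

[movieWriter startRecording];
[movieFile startProcessing];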

Using GPUImage

Adding a Filter to Video

  1. Use AVFoundation to capture video and set up the capture device
- (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition; 
{
    if (!(self = [super init]))
    {
        return nil;
    }
    
    //AVFoundation video capture
    //1. Set up the video processing queue, the audio processing queue, and the GCD semaphore
    cameraProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH,0);
    audioProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW,0);

    frameRenderingSemaphore = dispatch_semaphore_create(1);

    //2. Initialize the frame rate, flags, and other related properties
    _frameRate = 0; // This will not set frame rate unless this value gets set to 1 or above
    _runBenchmark = NO;
    capturePaused = NO;
    outputRotation = kGPUImageNoRotation;
    internalRotation = kGPUImageNoRotation;
    captureAsYUV = YES;
    _preferredConversion = kColorConversion709;
    
    // Grab the back-facing or front-facing camera
    //3. Get the front or back camera device (back-facing by default)
    _inputCamera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) 
    {
        if ([device position] == cameraPosition)
        {
            _inputCamera = device;
        }
    }
    
    //4. Return nil if no camera was found
    if (!_inputCamera) {
        return nil;
    }
    
    // Create the capture session
    //5. Create the AVCaptureSession
    _captureSession = [[AVCaptureSession alloc] init];
    
    [_captureSession beginConfiguration];
    
    // Add the video input
    //6. Add the camera as the video input device
    NSError *error = nil;
    videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_inputCamera error:&error];
    if ([_captureSession canAddInput:videoInput]) 
    {
        [_captureSession addInput:videoInput];
    }
    
    // Add the video frame output
    //7. Add the video data output
    videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoOutput setAlwaysDiscardsLateVideoFrames:NO];
    
//    if (captureAsYUV && [GPUImageContext deviceSupportsRedTextures])
    //8. Decide which YUV pixel format to capture
    /**
     supportsFastTextureUpload: available since iOS 5, a mapping between CVOpenGLESTextureCacheRef and CVImageBufferRef. Through this mapping the CVPixelBufferRef can be reached directly, instead of reading the data back with glReadPixels, which performs much better.
     */
    if (captureAsYUV && [GPUImageContext supportsFastTextureUpload])
    {
        BOOL supportsFullYUVRange = NO;
        //Get all supported video pixel formats
        NSArray *supportedPixelFormats = videoOutput.availableVideoCVPixelFormatTypes;
        for (NSNumber *currentPixelFormat in supportedPixelFormats)
        {
            //If kCVPixelFormatType_420YpCbCr8BiPlanarFullRange is among the supported pixel formats, set supportsFullYUVRange to YES
            if ([currentPixelFormat intValue] == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
            {
                supportsFullYUVRange = YES;
            }
        }
        
        //9. The kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format is supported
        if (supportsFullYUVRange)
        {
            //Set the video output format to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = YES;
        }
        else
        {
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
            isFullYUVRange = NO;
        }
    }
    else
    {
        [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    }
    
    runSynchronouslyOnVideoProcessingQueue(^{
        
        if (captureAsYUV)
        {
            [GPUImageContext useImageProcessingContext];
            //            if ([GPUImageContext deviceSupportsRedTextures])
            //            {
            //                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForRGFragmentShaderString];
            //            }
            //            else
            //            {
            if (isFullYUVRange)
            {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVFullRangeConversionForLAFragmentShaderString];
            }
            else
            {
                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForLAFragmentShaderString];
            }

            //            }
            
            if (!yuvConversionProgram.initialized)
            {
                [yuvConversionProgram addAttribute:@"position"];
                [yuvConversionProgram addAttribute:@"inputTextureCoordinate"];
                
                if (![yuvConversionProgram link])
                {
                    NSString *progLog = [yuvConversionProgram programLog];
                    NSLog(@"Program link log: %@", progLog);
                    NSString *fragLog = [yuvConversionProgram fragmentShaderLog];
                    NSLog(@"Fragment shader compile log: %@", fragLog);
                    NSString *vertLog = [yuvConversionProgram vertexShaderLog];
                    NSLog(@"Vertex shader compile log: %@", vertLog);
                    yuvConversionProgram = nil;
                    NSAssert(NO, @"Filter shader link failed");
                }
            }
            
            yuvConversionPositionAttribute = [yuvConversionProgram attributeIndex:@"position"];
            yuvConversionTextureCoordinateAttribute = [yuvConversionProgram attributeIndex:@"inputTextureCoordinate"];
            yuvConversionLuminanceTextureUniform = [yuvConversionProgram uniformIndex:@"luminanceTexture"];
            yuvConversionChrominanceTextureUniform = [yuvConversionProgram uniformIndex:@"chrominanceTexture"];
            yuvConversionMatrixUniform = [yuvConversionProgram uniformIndex:@"colorConversionMatrix"];
            
            [GPUImageContext setActiveShaderProgram:yuvConversionProgram];
            
            glEnableVertexAttribArray(yuvConversionPositionAttribute);
            glEnableVertexAttribArray(yuvConversionTextureCoordinateAttribute);
        }
    });
    
    //10. Frames captured by the camera arrive in this delegate method:
    /**
     - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
     */
    
    [videoOutput setSampleBufferDelegate:self queue:cameraProcessingQueue];
    if ([_captureSession canAddOutput:videoOutput])
    {
        [_captureSession addOutput:videoOutput];
    }
    else
    {
        NSLog(@"Couldn't add video output");
        return nil;
    }
    
    _captureSessionPreset = sessionPreset;
    [_captureSession setSessionPreset:_captureSessionPreset];

// This will let you get 60 FPS video from the 720p preset on an iPhone 4S, but only that device and that preset
//    AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
//    
//    if (conn.supportsVideoMinFrameDuration)
//        conn.videoMinFrameDuration = CMTimeMake(1,60);
//    if (conn.supportsVideoMaxFrameDuration)
//        conn.videoMaxFrameDuration = CMTimeMake(1,60);
    
    [_captureSession commitConfiguration];
    
    return self;
}
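With the initializer above, wiring the camera into a live filtered preview only takes a few lines. A minimal sketch (the filter choice is illustrative, and the GPUImageView is assumed to be added to a view controller's view hierarchy):

//Sketch: camera -> filter -> on-screen preview
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720
                                        cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSaturationFilter *filter = [[GPUImageSaturationFilter alloc] init];
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:filterView];

[videoCamera addTarget:filter];
[filter addTarget:filterView];
[videoCamera startCameraCapture];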
  2. The captured video data comes back in the delegate callback
#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate
//AVFoundation delegate callback after capturing video/photos; this is where the video/image data is obtained
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    //Bail out if the capture session is not running
    if (!self.captureSession.isRunning)
    {
        return;
    }
    else if (captureOutput == audioOutput)
    {
        //Process the audio
        [self processAudioSampleBuffer:sampleBuffer];
    }
    else
    {
        if (dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0)
        {
            return;
        }
        
        CFRetain(sampleBuffer);
        runAsynchronouslyOnVideoProcessingQueue(^{
            //Feature Detection Hook.
            if (self.delegate)
            {
                [self.delegate willOutputSampleBuffer:sampleBuffer];
            }
            
            //Process the video; the sampleBuffer will be converted into a CVImageBufferRef
            [self processVideoSampleBuffer:sampleBuffer];
            
            CFRelease(sampleBuffer);
            dispatch_semaphore_signal(frameRenderingSemaphore);
        });
    }
}
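Note the DISPATCH_TIME_NOW in dispatch_semaphore_wait: if the previous frame is still being rendered on the video processing queue, the new sample buffer is simply dropped rather than queued. This keeps the capture callback from backing up when the filter chain cannot keep pace with the camera's frame rate.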
  3. Process the video data
- (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;
{
    if (capturePaused)
    {
        return;
    }
    
    CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
    //Convert the sampleBuffer into a CVImageBufferRef
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
    //Get the width and height
    int bufferWidth = (int) CVPixelBufferGetWidth(cameraFrame);
    int bufferHeight = (int) CVPixelBufferGetHeight(cameraFrame);
    //Read the YCbCr matrix color attachment
    CFTypeRef colorAttachments = CVBufferGetAttachment(cameraFrame, kCVImageBufferYCbCrMatrixKey, NULL);
    if (colorAttachments != NULL)
    {
        if(CFStringCompare(colorAttachments, kCVImageBufferYCbCrMatrix_ITU_R_601_4, 0) == kCFCompareEqualTo)
        {
            //Check whether the data is full range
            if (isFullYUVRange)
            {
                _preferredConversion = kColorConversion601FullRange;
            }
            else
            {
                _preferredConversion = kColorConversion601; //color-space conversion matrix (YUV <-> RGB); RGB takes more memory
            }
        }
        else
        {
            _preferredConversion = kColorConversion709; //color-space conversion matrix (YUV <-> RGB)
        }
    }
    else
    {
        if (isFullYUVRange)
        {
            _preferredConversion = kColorConversion601FullRange;
        }
        else
        {
            _preferredConversion = kColorConversion601;
        }
    }
    ......
    //Turn the image into textures and process the texture data
    luminanceTexture = CVOpenGLESTextureGetName(luminanceTextureRef);
    glBindTexture(GL_TEXTURE_2D, luminanceTexture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    ......
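The elided portion maps the pixel buffer's Y and CbCr planes straight into OpenGL ES textures through the context's Core Video texture cache instead of copying pixels. A rough sketch of the luminance-plane case (error handling trimmed, and assuming GPUImageContext's coreVideoTextureCache accessor; the chrominance plane is handled the same way with GL_LUMINANCE_ALPHA, half the width/height, and plane index 1):

//Sketch: map plane 0 (luminance) of the camera frame directly into a GL texture
CVOpenGLESTextureCacheRef textureCache = [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache];
CVOpenGLESTextureRef luminanceTextureRef = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, cameraFrame, NULL,
                                                            GL_TEXTURE_2D, GL_LUMINANCE,
                                                            bufferWidth, bufferHeight,
                                                            GL_LUMINANCE, GL_UNSIGNED_BYTE,
                                                            0, &luminanceTextureRef);
if (err != kCVReturnSuccess)
{
    NSLog(@"Error creating luminance texture: %d", err);
}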

Adding a Filter to an Image --- Saturation Filter

GPUImageSaturationFilter.h

@interface GPUImageSaturationFilter : GPUImageFilter
{
    GLint saturationUniform; //the saturation value is passed to the shader as a uniform
}
/** Saturation ranges from 0.0 (fully desaturated) to 2.0 (max saturation), with 1.0 as the normal level
 */
@property(readwrite, nonatomic) CGFloat saturation; //the normal value is 1.0

GPUImageSaturationFilter.m

  • The saturation filter's fragment shader
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE //simulator or iOS device
NSString *const kGPUImageSaturationFragmentShaderString = SHADER_STRING //shader source as an NSString
( //fragment shader; images use the default vertex shader
 varying highp vec2 textureCoordinate; //texture coordinate, passed in from the vertex shader
 
 uniform sampler2D inputImageTexture; //input texture sampler
 uniform lowp float saturation; //saturation
 
 // Values from "Graphics Shaders: Theory and Practice" by Bailey and Cunningham
 const mediump vec3 luminanceWeighting = vec3(0.2125, 0.7154, 0.0721); //luminance weights W used for the saturation adjustment
 
 void main()
 {
    lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate); //sample the texture color
    lowp float luminance = dot(textureColor.rgb, luminanceWeighting); //dot product of the texture color and the weights W gives the luminance
    lowp vec3 greyScaleColor = vec3(luminance); //expand the luminance into a grayscale vec3
    
    gl_FragColor = vec4(mix(greyScaleColor, textureColor.rgb, saturation), textureColor.w); //blend the grayscale and original colors by saturation
     
 }
);
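In GLSL, mix(a, b, t) returns a * (1 - t) + b * t, so saturation = 0.0 yields the pure grayscale color, 1.0 returns the original color unchanged, and values up to 2.0 extrapolate past the original for an over-saturated result.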
  • Initializing the filter with the default vertex shader
- (id)init;
{
    //If you don't want the default vertex shader you must pass one in at init time; here the default kGPUImageVertexShaderString is used
    if (!(self = [super initWithFragmentShaderFromString:kGPUImageSaturationFragmentShaderString]))
    {
        return nil;
    }
    
    saturationUniform = [filterProgram uniformIndex:@"saturation"];
    self.saturation = 1.0;

    return self;
}
  • The default vertex shader
// Hardcode the vertex shader for standard filters, but this can be overridden
NSString *const kGPUImageVertexShaderString = SHADER_STRING
(
 attribute vec4 position; //vertex position
 attribute vec4 inputTextureCoordinate; //texture coordinate
 
 varying vec2 textureCoordinate; //texture coordinate passed on to the fragment shader
 
 void main()
 {
     gl_Position = position; //assign the vertex position to the built-in gl_Position
     textureCoordinate = inputTextureCoordinate.xy; //pass the texture coordinate along
 }
 );
  • Applying the saturation filter to an image
  1. Get the image
//1. Get the image
    _jingImage = [UIImage imageNamed:@"jing.jpg"];
  2. Choose the filter you need (e.g. saturation) and initialize it
  3. Get the data source: a static image
  4. Retrieve the processed image and display it
//2. Choose the filter
    if (_disFilter == nil) {
        //Only create and initialize the filter when _disFilter is nil
        _disFilter = [[GPUImageSaturationFilter alloc] init];
    }
    
    //Set the saturation value; the default is 1.0
    _disFilter.saturation = 1.0;
    
    //Force the filter to process at the image's size
    [_disFilter forceProcessingAtSize:_jingImage.size];
    
    [_disFilter useNextFrameForImageCapture];
    
    //Adjust the filter's saturation from the slider
    _disFilter.saturation = sender.value;
    
    //3. Get the data source: a static image
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:_jingImage];
    
    //Attach the filter to the image source
    [stillImageSource addTarget:_disFilter];
    //Process the image
    [stillImageSource processImage];
    
    //4. Retrieve the processed image
    UIImage *newImage = [_disFilter imageFromCurrentFramebuffer];
    
    //5. Display the new image in the image view
    _jingImageView.image = newImage;
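Note the call order here: useNextFrameForImageCapture is sent to the filter before processImage, which tells the filter to hold on to its framebuffer for the upcoming frame; without it, imageFromCurrentFramebuffer would try to read from a framebuffer that may already have been returned to the cache.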

Adding a Filter When Taking Photos --- Grayscale Filter

  • The grayscale (luminance) filter's fragment shader
NSString *const kGPUImageLuminanceFragmentShaderString = SHADER_STRING //shader source as an NSString
(
 precision highp float; //default float precision
 
 varying vec2 textureCoordinate; //texture coordinate
 
 uniform sampler2D inputImageTexture; //input texture sampler
 
 const highp vec3 W = vec3(0.2125, 0.7154, 0.0721); //luminance weights
 
 void main()
 {
     lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate); //sample the texture color
     float luminance = dot(textureColor.rgb, W); //dot product of the color and the weights gives the luminance
     
     gl_FragColor = vec4(vec3(luminance), textureColor.a); //expand the luminance into a vec4 grayscale color
 }
);
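A minimal sketch of capturing a filtered photo with GPUImageStillCamera, using the grayscale filter whose shader is shown above (names are illustrative and error handling is omitted):

//Sketch: still camera -> grayscale filter -> preview, then capture a processed photo
GPUImageStillCamera *stillCamera =
    [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                        cameraPosition:AVCaptureDevicePositionBack];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
GPUImageView *previewView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:previewView];

[stillCamera addTarget:grayscaleFilter];
[grayscaleFilter addTarget:previewView];
[stillCamera startCameraCapture];

//When the shutter button is tapped:
[stillCamera capturePhotoAsImageProcessedUpToFilter:grayscaleFilter
                               withCompletionHandler:^(UIImage *processedImage, NSError *error) {
    if (processedImage)
    {
        UIImageWriteToSavedPhotosAlbum(processedImage, nil, nil, nil); //e.g. save to the photo library
    }
}];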