iOS video composition: eliminating the black frames in between

This post only covers video composition (merging multiple videos and the like). As for adding audio after the video and such, please go Google it yourselves. Without further ado, here's the code:

- (void)combineVideos{
    NSString *firstVideo = _currentMovieURL.path;
    NSString *secondVideo = [[NSBundle mainBundle] pathForResource:@"trailVideo" ofType:@"mp4"];
    
    NSDictionary *optDict = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVAsset *firstAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:firstVideo] options:optDict];
    AVAsset *secondAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:secondVideo] options:optDict];
    
    AVMutableComposition *composition = [AVMutableComposition composition];
    // Add a video track to the composition
    AVMutableCompositionTrack *compositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    // CMTimeRangeMake specifies the start time and duration
    CMTimeRange firstTimeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);
    CMTimeRange secondTimeRange = CMTimeRangeMake(kCMTimeZero, secondAsset.duration);
    
    AVAssetTrack *track1 = [secondAsset tracksWithMediaType:AVMediaTypeVideo][0];
    AVAssetTrack *track2 = [firstAsset tracksWithMediaType:AVMediaTypeVideo][0];
    CMTime duration1 = track1.timeRange.duration;
    CMTime duration2 = track2.timeRange.duration;
    // Insert the trailing video's track first, then the first video's track, both at kCMTimeZero.
    // The later insertion pushes the earlier media back, so the first video ends up at the front.
    // Use each track's own duration rather than asset.duration to avoid black frames (see below).
    [compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration1) ofTrack:track1 atTime:kCMTimeZero error:nil];
    [compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration2) ofTrack:track2 atTime:kCMTimeZero error:nil];
    
    // Merging only the video tracks means the exported file has no sound,
    // so the audio must be inserted into the composition as well.
    // Adding other local music works the same way as the video.
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    // Same reverse insertion order as the video tracks (this assumes both assets actually contain an audio track).
    [audioTrack insertTimeRange:secondTimeRange ofTrack:[secondAsset tracksWithMediaType:AVMediaTypeAudio][0] atTime:kCMTimeZero error:nil];
    [audioTrack insertTimeRange:firstTimeRange ofTrack:[firstAsset tracksWithMediaType:AVMediaTypeAudio][0] atTime:kCMTimeZero error:nil];
    
    NSString *name = [NSString stringWithFormat:@"%ldcomp.mp4", (long)time(NULL)];
    NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) lastObject];
    NSString *filePath = [cachePath stringByAppendingPathComponent:name];
    AVAssetExportSession *exporterSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    exporterSession.outputFileType = AVFileTypeMPEG4;
    exporterSession.outputURL = [NSURL fileURLWithPath:filePath]; 
    exporterSession.shouldOptimizeForNetworkUse = YES; // optimize the output for network/streaming delivery
    [exporterSession exportAsynchronouslyWithCompletionHandler:^{
        switch (exporterSession.status) {
            case AVAssetExportSessionStatusUnknown:
                NSLog(@"exporter Unknow");
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"exporter Canceled");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"exporter Failed");
                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"exporter Waiting");
                break;
            case AVAssetExportSessionStatusExporting:
                NSLog(@"exporter Exporting");
                break;
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"exporter Completed");
                [self saveRecordingVideo:filePath];
                break;
        }
    }];
}
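
The saveRecordingVideo: method called in the Completed case isn't shown here. Below is a minimal sketch of what it might look like, assuming you just want to write the exported file into the photo library with PHPhotoLibrary; the implementation is my placeholder, not the original one:

#import <Photos/Photos.h>

// Hypothetical placeholder for saveRecordingVideo: (not the original implementation).
// Assumes photo-library authorization has already been granted.
- (void)saveRecordingVideo:(NSString *)filePath {
    NSURL *fileURL = [NSURL fileURLWithPath:filePath];
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        // Create a new photo-library asset from the exported video file.
        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:fileURL];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        NSLog(@"save video success: %d, error: %@", success, error);
    }];
}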

In the code above, pay attention to this part:

 AVAssetTrack *track1 = [secondAsset tracksWithMediaType:AVMediaTypeVideo][0];
 AVAssetTrack *track2 = [firstAsset tracksWithMediaType:AVMediaTypeVideo][0];
 CMTime duration1 = track1.timeRange.duration;
    
 [compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration1) ofTrack:track1 atTime:kCMTimeZero error:nil];

The key call when composing video is insertTimeRange:. The answers you find on Baidu and Jianshu all pass CMTimeRangeMake(kCMTimeZero, asset.duration) for the time range, and with that parameter the merged video shows 1-2 seconds of black screen before the appended video starts. The fix I use came from an answer on Stack Overflow: take the AVAssetTrack of the video you are appending, read the duration from its timeRange, and pass that to insertTimeRange:. The asset's duration can be slightly longer than its video track's duration (for example when the audio track runs longer, or when precise duration/timing is not requested), and that extra time gets rendered as black frames; the track's own duration has no such gap. With this change the merge completes with no black screen. Perfect!
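For contrast, this is the pattern those answers suggest, the one that produces the black gap (a sketch of that pattern, not code from my project):

 // The widely copied version: asset.duration can be longer than the video track's
 // own duration, and the difference shows up as black frames in the composition.
 [compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                           ofTrack:track2
                            atTime:kCMTimeZero
                             error:nil];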

The link to the fix for the 1-2s black screen is here: the solution from Stack Overflow.

My mental model of this merge API is this: say I want to merge video segments 1 through 5. I set up a pool, throw segment 5 in first, then segment 4, and every time I insert at kCMTimeZero. Whatever is inserted later ends up in front, much like pushing onto an iOS navigation stack. Insert them one by one, the segment thrown in last plays first, and the merge is done in reverse order.
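To make that concrete, here is a minimal sketch of the idea. The method name and loop are mine, not from the project; it assumes `assets` holds the segments in playback order and that every segment contains a video track:

// A minimal sketch of the "insert everything at kCMTimeZero, in reverse order" idea.
// Assumptions (mine, not from the original post): `assets` is in playback order,
// every segment has a video track, and errors are ignored for brevity.
- (AVMutableComposition *)compositionFromAssets:(NSArray<AVAsset *> *)assets {
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    // Walk the segments from last to first; each insert at kCMTimeZero pushes the
    // previously inserted media later, so the final playback order is 1,2,3,4,5.
    for (AVAsset *asset in [assets reverseObjectEnumerator]) {
        AVAssetTrack *track = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        if (track == nil) { continue; }
        // Use the track's own duration, not asset.duration, to avoid black frames.
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, track.timeRange.duration)
                            ofTrack:track
                             atTime:kCMTimeZero
                              error:nil];
    }
    return composition;
}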

I only needed this feature for a project and there isn't much material on it, so I'm not sure whether this understanding is correct. If you know better, please leave a comment and correct me. Thanks.

 

For other articles, see my personal blog: http://zhangqq166.cn/
