How do I play multiple videos with AVQueuePlayer or AVMutableComposition without any gaps or freezes?


I know this question has been asked a few times before, and I have read those answers. But none of the approaches seems to do what I need: several videos queued up one after another in an AVQueuePlayer.

I tried both ways of adding the items that other posts mention:

AVPlayerItem *item1 = [AVPlayerItem playerItemWithURL:url1];
AVPlayerItem *item2 = [AVPlayerItem playerItemWithURL:url2];

NSArray *playerItems = [[NSArray alloc] initWithObjects:item1, item2, nil];
avPlayer = [[AVQueuePlayer alloc] initWithItems:playerItems]; 

And also like this:
    avPlayer = [[AVQueuePlayer alloc] init];

    AVURLAsset *asset1 = [[AVURLAsset alloc] initWithURL:url1 options:nil];
    NSArray *keys = [NSArray arrayWithObject:@"playable"];

    [asset1 loadValuesAsynchronouslyForKeys:keys completionHandler:^()
        {
            dispatch_async(dispatch_get_main_queue(), ^
                           {
                               AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset1];
                               [avPlayer insertItem:playerItem afterItem:nil];
                           });

        }];
AVURLAsset *asset2 = [[AVURLAsset alloc] initWithURL:url2 options:nil];
[asset2 loadValuesAsynchronouslyForKeys:keys completionHandler:^()
         {
             dispatch_async(dispatch_get_main_queue(), ^
                            {
                                AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:asset2];
                                [avPlayer insertItem:playerItem afterItem:nil];
                            });

         }];
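
One caveat with this second approach: the two completion handlers can finish in either order, and insertItem:afterItem:nil appends to the tail of the queue, so item 2 can end up in front of item 1. A minimal sketch of one way to keep the order deterministic with a dispatch group (reusing asset1, asset2, keys and avPlayer from above):

// Load both assets in parallel, but only enqueue them -- in order --
// once every "playable" key has finished loading.
dispatch_group_t loadGroup = dispatch_group_create();
NSArray *assets = @[asset1, asset2];

for (AVURLAsset *asset in assets) {
    dispatch_group_enter(loadGroup);
    [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        dispatch_group_leave(loadGroup);
    }];
}

dispatch_group_notify(loadGroup, dispatch_get_main_queue(), ^{
    for (AVURLAsset *asset in assets) {
        AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
        [avPlayer insertItem:item afterItem:nil]; // nil appends at the tail
    }
    [avPlayer play];
});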

But neither of these gets rid of the black screen when switching to the next item. There is a gap of roughly one second before the next item starts playing. How can I get rid of that gap?
Update: I also tried AVMutableComposition. The gap is reduced significantly, but it is still noticeable. Is there any way to remove these gaps completely?
The AVMutableComposition code:
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

NSMutableArray *arrayInstruction = [[NSMutableArray alloc] init];

AVMutableVideoCompositionInstruction * MainInstruction =
[AVMutableVideoCompositionInstruction videoCompositionInstruction];
AVMutableCompositionTrack *audioTrack;

audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                         preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime duration = kCMTimeZero;

for(int i = 0; i <= 5; i++)
{
    AVAsset *currentAsset;
    currentAsset = [self currentAsset:i]; // helper elsewhere in the class that returns the asset for index i
        AVMutableCompositionTrack *currentTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [currentTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];

        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];

        AVMutableVideoCompositionLayerInstruction *currentAssetLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:currentTrack];
        AVAssetTrack *currentAssetTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        ALAssetOrientation currentAssetOrientation  = ALAssetOrientationUp;
        BOOL  isCurrentAssetPortrait  = YES;
        CGAffineTransform currentTransform = currentAssetTrack.preferredTransform;

        if(currentTransform.a == 0 && currentTransform.b == 1.0 && currentTransform.c == -1.0 && currentTransform.d == 0)  {currentAssetOrientation= ALAssetOrientationRight; isCurrentAssetPortrait = YES;}
        if(currentTransform.a == 0 && currentTransform.b == -1.0 && currentTransform.c == 1.0 && currentTransform.d == 0)  {currentAssetOrientation =  ALAssetOrientationLeft; isCurrentAssetPortrait = YES;}
        if(currentTransform.a == 1.0 && currentTransform.b == 0 && currentTransform.c == 0 && currentTransform.d == 1.0)   {currentAssetOrientation = ALAssetOrientationUp; isCurrentAssetPortrait = NO;}
        if(currentTransform.a == -1.0 && currentTransform.b == 0 && currentTransform.c == 0 && currentTransform.d == -1.0) {currentAssetOrientation = ALAssetOrientationDown; isCurrentAssetPortrait = NO;}

        // Render size and source size are both 640x640 here, so the
        // scale-to-fit ratio is 1.0 in both branches.
        CGFloat currentAssetScaleToFitRatio = 640.0 / 640.0;
        CGAffineTransform currentAssetScaleFactor = CGAffineTransformMakeScale(currentAssetScaleToFitRatio, currentAssetScaleToFitRatio);
        if (isCurrentAssetPortrait) {
            [currentAssetLayerInstruction setTransform:CGAffineTransformConcat(currentAssetTrack.preferredTransform, currentAssetScaleFactor) atTime:duration];
        } else {
            [currentAssetLayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(currentAssetTrack.preferredTransform, currentAssetScaleFactor), CGAffineTransformMakeTranslation(0, 0)) atTime:duration];
        }
        duration=CMTimeAdd(duration, currentAsset.duration);
        [arrayInstruction addObject:currentAssetLayerInstruction];
}

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, duration);
MainInstruction.layerInstructions = arrayInstruction;
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];

MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30);
MainCompositionInst.renderSize = CGSizeMake(640.0, 640.0);

NSString *filename = @"mergedVideo.mp4";
pathForFile = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
NSFileManager *fileManager = [NSFileManager defaultManager];
BOOL deleted = [fileManager removeItemAtPath:pathForFile error:NULL];
NSLog(@"Deleted old output file: %d", deleted);

NSURL *url = [NSURL fileURLWithPath:pathForFile];
NSLog(@"\n\nurl ::::::::::: %@\n\n",url);
NSError *err;
if ([url checkResourceIsReachableAndReturnError:&err] == NO)
    NSLog(@"\n\nFINEEEEEEEEEEEEE\n\n");
else
    NSLog(@"\n\nERRRRRORRRRRRRRRRRRRR\n\n");

AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeMPEG4; // match the .mp4 extension chosen above
exporter.videoComposition = MainCompositionInst;
exporter.shouldOptimizeForNetworkUse = YES;
[exporter exportAsynchronouslyWithCompletionHandler:^
 {
     switch (exporter.status)
     {
         case AVAssetExportSessionStatusCompleted:
         {
             NSURL *outputURL = exporter.outputURL;

             ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
             if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
                 [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error)
                  {
                      NSLog(@"Asset URL: %@", assetURL);
                      if (error) {
                          NSLog(@"Error: %@", error);
                      } else {
                          NSLog(@"Video saved");
                      }
                  }];
                 NSLog(@"Video merge successful");
                 currentFile++;
             }
         }
             break;
         case AVAssetExportSessionStatusFailed:
             NSLog(@"Failed:%@", exporter.error.description);
             break;
         case AVAssetExportSessionStatusCancelled:
             NSLog(@"Canceled:%@", exporter.error);
             break;
         case AVAssetExportSessionStatusExporting:
             NSLog(@"Exporting!");
             break;
         case AVAssetExportSessionStatusWaiting:
             NSLog(@"Waiting");
             break;
         default:
             break;
     }
 }];

I think the first solution is better, but where does this code live — in viewDidLoad? Please post your complete code here. - Acácio Veit Schneider
@AcácioVeitSchneider With AVQueuePlayer, neither of the two approaches looks good. There is a gap of 1-1.5 seconds when the track changes. It is called from viewDidLoad. - blancos
Try replacing your duration with mixComposition.duration. - Abhi
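
For reference, Abhi's suggestion amounts to deriving the instruction's time range from the composition itself instead of the hand-accumulated duration — a one-line sketch of that substitution:

// Replace the manually accumulated duration with the composition's own:
MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration);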
1 Answer

For Ultravisual we used AVMutableComposition. As long as you build the composition first, and only then build the player that plays it, we were able to get seamless, gap-free playback everywhere except across loops.
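
A minimal sketch of that order of operations, reusing mixComposition and MainCompositionInst from the question and playing the composition directly instead of exporting it first (assuming this runs in a view controller, as the question's viewDidLoad does):

// Build the composition (and its video composition) first...
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:mixComposition];
playerItem.videoComposition = MainCompositionInst;

// ...then build the player and start playback.
AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];
[player play];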
Check all the tracks in your AVMutableComposition and verify there are no gaps in them. Don't forget the audio tracks. Sometimes audio and video have different timestamps, and you may need to add another track to the AVMutableComposition to deal with that.

It's impossible to tell just by looking at the code. Iterate over all the AVCompositionTracks in mixComposition.tracks; for each track, walk its AVCompositionTrackSegments via track.segments and print their start and end times to the debug console. Copy that into a text editor and sort it by time until you can see the gaps between the numbers. It may help to distinguish video tracks from audio tracks. - damian
I don't see any property on AVCompositionTrackSegment that would print the timestamps. Could you elaborate on those properties and their names? - blancos
The property you need is timeMapping. It is defined on AVAssetTrackSegment, the superclass of AVCompositionTrackSegment. Apple's documentation covers all of these classes in great detail and will tell you everything you need. - damian
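
A minimal sketch of the inspection described above, assuming the mixComposition from the question — it logs each segment's placement on the composition timeline so that gaps show up as jumps between one segment's end and the next segment's start:

// Print every track segment's target range on the composition timeline.
// A gap appears as a segment whose start is later than the previous
// segment's end, or as a segment flagged empty.
for (AVCompositionTrack *track in mixComposition.tracks) {
    NSLog(@"Track %d (%@)", (int)track.trackID, track.mediaType);
    for (AVCompositionTrackSegment *segment in track.segments) {
        CMTimeRange target = segment.timeMapping.target;
        NSLog(@"  empty=%d start=%.3f end=%.3f",
              segment.isEmpty,
              CMTimeGetSeconds(target.start),
              CMTimeGetSeconds(CMTimeRangeGetEnd(target)));
    }
}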
I just noticed you're building the AVVideoComposition yourself. What happens if you use AVVideoComposition* mainCompositionInst = [AVVideoComposition compositionWithPropertiesOfAsset:mixComposition]? - damian
I use it to assign the videoComposition property of the exporter — the instructions, frame duration and so on. - blancos
