iOS background video merging


Task: merge the flyer image into the flyer video.

Situation:

  • Create a flyer [add emoji images / text, etc.]
  • Create a video

Case 1

  • Press the back button [the user lands on the app's flyer list screen]; during that time we merge flyerSnapShoot into flyerVideo, and it works perfectly.
  • Going into the phone's photo album, we can see the updated video there.

Case 2

  • Press the iPhone home button; I do the same thing as above, but I get the following error

FAIL = Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17266d40 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x172b3920 "The operation couldn’t be completed. (OSStatus error -16980.)", NSLocalizedFailureReason=An unknown error occurred (-16980)}

Code:

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

- (void)modifyVideo:(NSURL *)src destination:(NSURL *)dest crop:(CGRect)crop
              scale:(CGFloat)scale overlay:(UIImage *)image
         completion:(void (^)(NSInteger, NSError *))callback {

    // Get a pointer to the asset
    AVURLAsset* firstAsset = [AVURLAsset URLAssetWithURL:src options:nil];

    // Make an instance of AVMutableComposition so that we can edit this asset.
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    // Add tracks to this composition
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    // Audio track
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    // The image video is always MAX_VIDEO_LENGTH seconds, so use that unless the background video is shorter.
    CMTime inTime = CMTimeMake( MAX_VIDEO_LENGTH * VIDEOFRAME, VIDEOFRAME );
    if ( CMTimeCompare( firstAsset.duration, inTime ) < 0 ) {
        inTime = firstAsset.duration;
    }

    // Add to the video track.
    NSArray *videos = [firstAsset tracksWithMediaType:AVMediaTypeVideo];
    CGAffineTransform transform = CGAffineTransformIdentity; // fallback if the asset has no video track
    if ( videos.count > 0 ) {
        AVAssetTrack *track = [videos objectAtIndex:0];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:track atTime:kCMTimeZero error:nil];
        transform = track.preferredTransform;
        videoTrack.preferredTransform = transform;
    }

    // Add the audio track.
    NSArray *audios = [firstAsset tracksWithMediaType:AVMediaTypeAudio];
    if ( audios.count > 0 ) {
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:[audios objectAtIndex:0] atTime:kCMTimeZero error:nil];
    }

    NSLog(@"Natural size: %.2f x %.2f", videoTrack.naturalSize.width, videoTrack.naturalSize.height);

    // Set the mix composition size.
    mixComposition.naturalSize = crop.size;

    // Set up the composition parameters.
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, VIDEOFRAME );
    videoComposition.renderSize = crop.size;
    videoComposition.renderScale = 1.0;

    // Pass through parameters for animation.
    AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, inTime);

    // Layer instructions
    AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

    // Set the transform to maintain orientation
    if ( scale != 1.0 ) {
        CGAffineTransform scaleTransform = CGAffineTransformMakeScale( scale, scale);
        CGAffineTransform translateTransform = CGAffineTransformTranslate( CGAffineTransformIdentity,
                                                                          -crop.origin.x,
                                                                          -crop.origin.y);
        transform = CGAffineTransformConcat( transform, scaleTransform );
        transform = CGAffineTransformConcat( transform, translateTransform);
    }

    [passThroughLayer setTransform:transform atTime:kCMTimeZero];

    passThroughInstruction.layerInstructions = @[ passThroughLayer ];
    videoComposition.instructions = @[passThroughInstruction];

    // If an image is given, then put that in the animation.
    if ( image != nil ) {

        // Layer that merges the video and image
        CALayer *parentLayer = [CALayer layer];
        parentLayer.frame = CGRectMake( 0, 0, crop.size.width, crop.size.height);

        // Layer that renders the video.
        CALayer *videoLayer = [CALayer layer];
        videoLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height );
        [parentLayer addSublayer:videoLayer];

        // Layer that renders the flyer image.
        CALayer *imageLayer = [CALayer layer];
        imageLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height );
        imageLayer.contents = (id)image.CGImage;
        [imageLayer setMasksToBounds:YES];

        [parentLayer addSublayer:imageLayer];

        // Setup the animation tool
        videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    }

    // Now export the movie
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = videoComposition;

    // Export the URL
    exportSession.outputURL = dest;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.shouldOptimizeForNetworkUse = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        callback( exportSession.status, exportSession.error );
    }];
}
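
For reference, a hypothetical call site might look like the following (the file URLs, crop rectangle, and image name are illustrative, not from the original project):

NSURL *srcURL  = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"flyerVideo.mov"]];
NSURL *destURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"flyerMerged.mov"]];
UIImage *overlay = [UIImage imageNamed:@"flyerSnapShoot"]; // assumed asset name

[self modifyVideo:srcURL
      destination:destURL
             crop:CGRectMake(0, 0, 640, 640) // illustrative render size
            scale:1.0
          overlay:overlay
       completion:^(NSInteger status, NSError *error) {
    if (status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished: %@", destURL);
    } else {
        NSLog(@"Export failed (%ld): %@", (long)status, error);
    }
}];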

I am calling this function from AppDelegate.m.
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    bgTask = [application beginBackgroundTaskWithName:@"MyTask" expirationHandler:^{
        // Clean up any unfinished task business by marking where you
        // stopped or ending the task outright.
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];

    // Start the long-running task and return immediately.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{

        // Do the work associated with the task, preferably in chunks.
         [self goingToBg];

        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    });

    NSLog(@"backgroundTimeRemaining: %f", [[UIApplication sharedApplication] backgroundTimeRemaining]);
}
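
Note that in the snippet above, endBackgroundTask: runs as soon as goingToBg returns, so if goingToBg merely kicks off the asynchronous export, the background-task assertion is released before the export completes. A minimal sketch of the safer pattern, assuming a hypothetical goingToBgWithCompletion: variant that reports back from the export session's completion handler:

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    bgTask = [application beginBackgroundTaskWithName:@"MyTask" expirationHandler:^{
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];

    // goingToBgWithCompletion: is hypothetical; it should invoke its block
    // from the AVAssetExportSession completion handler.
    [self goingToBgWithCompletion:^{
        // Release the background-task assertion only after the export is done.
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];
}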

Here the user can reverse a video and play it back with audio in the player. Video merge link: https://github.com/mehulparmar4ever/ConvertVideoReverse Note: 1. The user can reverse the original video. 2. Extract the audio from the video. 3. Combine the audio and video files and play them together. - Mehul
Well, right now I'm not in a position to rework my code; I'm looking for a workaround rather than switching libraries :) - Abdul Rauf
2 Answers


I did a lot of research on this problem but could not find a solution.

I'd like to share some links in the hope that they help anyone in the Stack community who hits the same problem [requirement].

Link 1: AVExportSession to run in background

Quote relevant to the question [copied from Link 1 above]:

Unfortunately, because AVAssetExportSession uses the GPU to do some of its work, it cannot run in the background if you are using an AVVideoComposition.

Link 2: Starting AVAssetExportSession in the Background

Quote relevant to the question [copied from Link 2 above]:

You can start AVAssetExportSession in the background. The only restriction on doing background work in AVFoundation is the use of AVVideoCompositions or AVMutableVideoCompositions. AVVideoCompositions use the GPU, and the GPU cannot be used in the background.
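
A practical consequence of both quotes is to avoid starting a GPU-backed export while the app is backgrounded. A minimal sketch, assuming the pending export can be captured in a block and replayed once the app becomes active again (runExportWhenForegrounded: is a hypothetical helper, not part of the original code):

- (void)runExportWhenForegrounded:(dispatch_block_t)startExport
{
    // If we are already in the foreground, run the export immediately.
    if ([UIApplication sharedApplication].applicationState == UIApplicationStateActive) {
        startExport();
        return;
    }
    // Otherwise wait for the app to become active, then run it exactly once.
    __block id observer = [[NSNotificationCenter defaultCenter]
        addObserverForName:UIApplicationDidBecomeActiveNotification
                    object:nil
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
            [[NSNotificationCenter defaultCenter] removeObserver:observer];
            startExport();
        }];
}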

URLs for background tasks:

APPLE DEV URL

RAYWENDERLICH URL

Stack question


Late to the party, but if you update the "Background Modes" setting in your project's Capabilities to include Audio, it will allow the export.
This is meant for playing music in the background.
It worked for me.
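
For reference, the setting this answer refers to is the UIBackgroundModes key that the Capabilities checkbox writes into Info.plist (shown here for illustration; note the App Review caveats in the comments below):

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>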

Did you find an alternative? Our app was rejected, apparently because it wasn't actually using background audio. - DennisA
Our app got rejected too. In Info.plist, "audio" is described as "App plays audio or streams audio/video using AirPlay". I showed App Review a screenshot of the AirPlay icon in my app's built-in video player, and they accepted it. - Jan Ehrhardt
On iOS 13.5.1 you have 5 minutes (295 seconds, actually) to handle a BGProcessingTaskRequest. I managed to complete an AVAssetExportSession from the background in 178 seconds. I will post my code later. - Jan Ehrhardt
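
The commenter's code was never posted, but the general BGTaskScheduler pattern on iOS 13+ looks roughly like this (the task identifier and the export helper are illustrative, not the commenter's actual code; the identifier must also be listed under BGTaskSchedulerPermittedIdentifiers in Info.plist):

#import <BackgroundTasks/BackgroundTasks.h>

// In application:didFinishLaunchingWithOptions: register the launch handler.
[[BGTaskScheduler sharedScheduler]
    registerForTaskWithIdentifier:@"com.example.flyer.export" // hypothetical
                       usingQueue:nil
                    launchHandler:^(BGTask *task) {
    task.expirationHandler = ^{
        // Cancel the export if the system reclaims the time budget.
    };
    // exportFlyerVideoWithCompletion: is a hypothetical wrapper around
    // the AVAssetExportSession work shown in the question.
    [self exportFlyerVideoWithCompletion:^(BOOL success) {
        [task setTaskCompletedWithSuccess:success];
    }];
}];

// When entering the background, schedule the processing task.
BGProcessingTaskRequest *request =
    [[BGProcessingTaskRequest alloc] initWithIdentifier:@"com.example.flyer.export"];
NSError *error = nil;
if (![[BGTaskScheduler sharedScheduler] submitTaskRequest:request error:&error]) {
    NSLog(@"Could not schedule export task: %@", error);
}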
