Cropping an AVAsset video with AVFoundation

I'm recording video with AVCaptureMovieFileOutput. I display the preview layer using AVLayerVideoGravityResizeAspectFill, which zooms in slightly. The problem is that the final video contains extra image content that was never visible on screen during the preview.
Here are the preview and the final video: [screenshot of the preview] [screenshot of the final video] Is there a way to specify a CGRect to crop the video with AVAssetExportSession?
Edit ----
When I apply a CGAffineTransformScale to the AVAssetTrack, it zooms into the video, and with the AVMutableVideoComposition's renderSize set to view.bounds it crops off the ends. Great, just 1 problem left: the width of the video doesn't stretch to the correct width; it just gets filled with black.
Edit 2 ---- The suggested question/answer is incomplete...
Part of my code:
In my - (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error method, I have the following code to crop and resize the video:
- (void)flipAndSave:(NSURL *)videoURL withCompletionBlock:(void(^)(NSURL *returnURL))completionBlock
{
    AVURLAsset *firstAsset = [AVURLAsset assetWithURL:videoURL];

    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // 2 - Video track
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];

    // 2.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(CMTimeMakeWithSeconds(0, 600), firstAsset.duration);

    // 2.2 - Create an AVMutableVideoCompositionLayerInstruction for the first track
    AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    AVAssetTrack *firstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation firstAssetOrientation_  = UIImageOrientationUp;
    BOOL isFirstAssetPortrait_  = NO;
    CGAffineTransform firstTransform = firstAssetTrack.preferredTransform;
    if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ = UIImageOrientationRight;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ =  UIImageOrientationLeft;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) {
        firstAssetOrientation_ =  UIImageOrientationUp;
    }
    if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {
        firstAssetOrientation_ = UIImageOrientationDown;
    }
//    [firstlayerInstruction setTransform:firstAssetTrack.preferredTransform atTime:kCMTimeZero];
//    [firstlayerInstruction setCropRectangle:self.view.bounds atTime:kCMTimeZero];

    CGFloat scale = [self getScaleFromAsset:firstAssetTrack];

    firstTransform = CGAffineTransformScale(firstTransform, scale, scale);

    [firstlayerInstruction setTransform:firstTransform atTime:kCMTimeZero];

    // 2.4 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction,nil];
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

//    CGSize videoSize = firstAssetTrack.naturalSize;
    CGSize videoSize = self.view.bounds.size;
    BOOL isPortrait_ = [self isVideoPortrait:firstAsset];
    if(isPortrait_) {
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    NSLog(@"%@", NSStringFromCGSize(videoSize));
    mainCompositionInst.renderSize = videoSize;

    // 3 - Audio track
    AVMutableCompositionTrack *AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [AudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];

    // 4 - Get path
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"cutoutput.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *manager = [[NSFileManager alloc] init];
    if ([manager fileExistsAtPath:outputPath])
    {
        [manager removeItemAtPath:outputPath error:nil];
    }
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL=outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status])
        {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@ : %@", [[exporter error] localizedDescription], [exporter error]);
                completionBlock(nil);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");
                completionBlock(nil);
                break;
            default: {
                NSURL *outputURL = exporter.outputURL;
                dispatch_async(dispatch_get_main_queue(), ^{
                    completionBlock(outputURL);
                });
                break;
            }
        }
    }];
}
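
The getScaleFromAsset: helper referenced above isn't included in the post. A plausible sketch (an assumption, not the author's actual implementation: it mirrors what AVLayerVideoGravityResizeAspectFill does by aspect-filling the track into the view's bounds) might look like this:

// Hypothetical helper (not from the original post): returns the scale that
// makes the video track fill self.view.bounds, mimicking
// AVLayerVideoGravityResizeAspectFill.
- (CGFloat)getScaleFromAsset:(AVAssetTrack *)videoTrack
{
    // Portrait tracks report naturalSize in landscape, so swap the dimensions.
    CGAffineTransform t = videoTrack.preferredTransform;
    BOOL isPortrait = (t.a == 0 && fabs(t.b) == 1.0);
    CGSize natural = videoTrack.naturalSize;
    CGSize oriented = isPortrait ? CGSizeMake(natural.height, natural.width) : natural;
    CGSize bounds = self.view.bounds.size;
    // Aspect fill: take the larger of the two axis scales so nothing is letterboxed.
    return MAX(bounds.width / oriented.width, bounds.height / oriented.height);
}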

Possible duplicate of How to crop a video with AVFoundation. - matt
I've already looked at that code. Even the comments on the answer say it's incomplete. - Darren
OK, sorry. You might want to show your actual code... - matt
Code added. I'm wondering if maybe I should use AVCaptureVideoDataOutput and crop the frames as they come in. The default iOS camera app records only what's on screen, so it must be possible. I feel like I'm missing something simple. - Darren
Are you recording the video yourself, or adding it from the camera library? Where is the code that records the video? - Muruganandham K
2 Answers

Here's my understanding of your issue: you are capturing video on a device with a 4:3 screen ratio, so your AVCaptureVideoPreviewLayer is 4:3 as well, but the video input device captures in 16:9, so the resulting video is "larger" than what you see in the preview.
If you are just looking to crop off the extra pixels that were not shown in the preview, check out this article: http://www.netwalk.be/article/record-square-video-ios. It shows how to crop a video to a square, and it only takes a few modifications to crop to 4:3 instead. I've tested this; here are the changes I made:
Once you have the AVAssetTrack for the video, you need to calculate a new height.
// we convert the captured height i.e. 1080 to a 4:3 screen ratio and get the new height
CGFloat newHeight = clipVideoTrack.naturalSize.height/3*4;

Then modify these two lines using newHeight:
videoComposition.renderSize = CGSizeMake(clipVideoTrack.naturalSize.height, newHeight);

CGAffineTransform t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, -(clipVideoTrack.naturalSize.width - newHeight)/2 );

So what we're doing here is setting the renderSize to a 4:3 ratio whose exact dimensions are based on the input device. We then use a CGAffineTransform to translate the video's position so that what we saw in the AVCaptureVideoPreviewLayer is what is rendered to our file.
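
To make the arithmetic concrete (assuming a 1920×1080 capture, which is an assumption, not stated in the answer): newHeight = 1080 / 3 * 4 = 1440, so the renderSize becomes 1080×1440 (a 3:4 portrait canvas), and the translation's second component, -(1920 - 1440) / 2 = -240, recenters the rotated frame so that 240 pixels are trimmed from each end of the original width.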

Edit: If you want to put it all together and crop the video to your device's screen ratio (3:2, 4:3, 16:9) while taking video orientation into account, we need to add a few things.

First, here is the revised sample code with a few key changes:

// output file
NSString* docFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString* outputPath = [docFolder stringByAppendingPathComponent:@"output2.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputPath])
    [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];

// input file
AVAsset* asset = [AVAsset assetWithURL:outputFileURL];

AVMutableComposition *composition = [AVMutableComposition composition];
[composition  addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

// input clip
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// crop clip to screen ratio
UIInterfaceOrientation orientation = [self orientationForTrack:asset];
BOOL isPortrait = (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationPortraitUpsideDown) ? YES: NO;
CGFloat complimentSize = [self getComplimentSize:videoTrack.naturalSize.height];
CGSize videoSize;

if(isPortrait) {
    videoSize = CGSizeMake(videoTrack.naturalSize.height, complimentSize);
} else {
    videoSize = CGSizeMake(complimentSize, videoTrack.naturalSize.height);
}

AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = videoSize;
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30) );

// rotate and position video
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

CGFloat tx = (videoTrack.naturalSize.width-complimentSize)/2;
if (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationLandscapeRight) {
    // invert translation
    tx *= -1;
}

// t1: rotate and position video since it may have been cropped to screen ratio
CGAffineTransform t1 = CGAffineTransformTranslate(videoTrack.preferredTransform, tx, 0);
// t2/t3: mirror video horizontally
CGAffineTransform t2 = CGAffineTransformTranslate(t1, isPortrait?0:videoTrack.naturalSize.width, isPortrait?videoTrack.naturalSize.height:0);
CGAffineTransform t3 = CGAffineTransformScale(t2, isPortrait?1:-1, isPortrait?-1:1);

[transformer setTransform:t3 atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject: transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];

// export
exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ;
exporter.videoComposition = videoComposition;
exporter.outputURL=[NSURL fileURLWithPath:outputPath];
exporter.outputFileType=AVFileTypeQuickTimeMovie;

[exporter exportAsynchronouslyWithCompletionHandler:^(void){
    NSLog(@"Exporting done!");

    // added export to library for testing
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]]) {
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
             if (error) {
                 NSLog(@"Error saving to album: %@", error);
             } else {
                 NSLog(@"Saved to album");
             }
         }];
    }
}];

Here we added a call that gets the new render size for the video, based on cropping its dimensions to the screen ratio. Once we crop the size down, we need to translate the position to recenter the video, so we grab its orientation to move it in the proper direction. This fixes the off-center issue we saw with UIInterfaceOrientationLandscapeLeft. Finally, the transforms t2 and t3 mirror the video horizontally: translating the origin to the far edge of the mirrored axis and then applying a -1 scale on that axis flips the frame while keeping it inside the render rect.

Here are the two new methods that make this happen:

- (CGFloat)getComplimentSize:(CGFloat)size {
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    CGFloat ratio = screenRect.size.height / screenRect.size.width;

    // we have to adjust the ratio for 16:9 screens
    if (ratio == 1.775) ratio = 1.77777777777778;

    return size * ratio;
}

- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
    UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];

    if([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;

        // Portrait
        if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortrait;
        }
        // PortraitUpsideDown
        if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortraitUpsideDown;
        }
        // LandscapeRight
        if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            orientation = UIInterfaceOrientationLandscapeRight;
        }
        // LandscapeLeft
        if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            orientation = UIInterfaceOrientationLandscapeLeft;
        }
    }
    return orientation;
}

These are pretty straightforward. The only thing to note is that in the getComplimentSize: method we have to adjust the 16:9 ratio manually, because the resolution of the iPhone 5+ is mathematically shy of a true 16:9 ratio.
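
As a concrete check (the numbers are an assumption based on the iPhone 5's 568×320-point screen): 568 / 320 = 1.775, while a true 16:9 is about 1.7778. Without the adjustment, a 1080-pixel-high track would get a complement of 1080 × 1.775 = 1917 instead of the expected 1920, leaving a thin black edge in the export.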


Wow, thanks. This gets me very close to what I need. But there are two problems with your code. 1. It works great on iPad in portrait, but on an iPhone 5 the screen isn't 4:3, so the video comes out smaller. Can we use the screen size rather than the ratio? 2. When recording in landscape the video gets rotated, meaning it always plays back sideways. - Darren
@Darren I've extended my original answer to include the alternate screen ratios and the orientation. - Matt
Don't forget to update this: [transformer setTransform:t2 atTime:kCMTimeZero]; - Matt
Unfortunately I'm not well versed in transforms either, so I can't offer a quick fix. I tried a few things but they didn't solve it. My guess is that once we apply the CGAffineTransformTranslate and rotate the video, the subsequent position and scale changes are affected by the new anchor/matrix. Try tweaking it, or post it as a fresh question. I'm sure someone with a better grasp of transforms can help; I'm just stabbing in the dark. - Matt
Updated my answer. It fixes the off-center issue you were seeing in UIInterfaceOrientationLandscapeLeft, and adds a couple of transforms to mirror the video horizontally. That should do it :) - Matt


AVCaptureVideoDataOutput is a concrete subclass of AVCaptureOutput that you use to process uncompressed frames from the video being captured, or to access compressed frames.

An instance of AVCaptureVideoDataOutput produces video frames that you can process using other media APIs. You can access the frames with the captureOutput:didOutputSampleBuffer:fromConnection: delegate method.
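
A minimal setup sketch (assumptions for illustration: an existing AVCaptureSession named session, and self adopting AVCaptureVideoDataOutputSampleBufferDelegate; the queue label is made up):

// Attach a video data output so frames arrive at the delegate callback,
// where they could be cropped before being written out.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
dispatch_queue_t frameQueue = dispatch_queue_create("videoFrameQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];
if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}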

Configuring a session: you use a preset on the session to specify the image quality and resolution you want. A preset is a constant that identifies one of a number of possible configurations; in some cases the actual configuration is device-specific:

https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html

For the actual values these presets represent, see "Saving to a Movie File" and "Capturing Still Images".

If you want to set a size-specific configuration, you should check whether it is supported before setting it:

if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
}
else {
    // Handle the failure.
}
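
Worth noting (this goes beyond the quoted documentation): if the device supports a preset whose aspect ratio matches the preview layer, for example the 4:3 AVCaptureSessionPreset640x480 against a 4:3 screen, the captured frames should already match what the preview shows, so no post-export cropping would be needed, at the cost of recording resolution.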
