How to flip a video using AVFoundation


I recorded a video with the front camera, but the output is mirrored...

I tried flipping the video with an AVMutableComposition and layer instructions, but had no success.

Searching Google and Stack Overflow turned up nothing, so I think a simple, straightforward example would help solve this.


If you are recording through an AVCaptureConnection, I suggest you fix the problem there by setting the video orientation with setVideoOrientation. - Lefteris
4 Answers


Since you haven't said what you use to record the video, I'll assume AVCaptureSession + AVCaptureVideoDataOutput:

lazy var videoFileOutput: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()
let v = videoFileOutput.connectionWithMediaType(AVMediaTypeVideo)
v.videoOrientation = .Portrait
v.videoMirrored = true
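
For more context, here is a minimal modern-Swift sketch of where that connection configuration fits (the snippet above uses pre-Swift 3 API names; the session and device setup below are assumptions, not from the original answer):

import AVFoundation

let session = AVCaptureSession()

// Front-camera input (assumed; use whichever device you record with).
if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

let videoOutput = AVCaptureVideoDataOutput()
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}

// A connection only exists once the output has been added to the session.
if let connection = videoOutput.connection(with: .video) {
    connection.videoOrientation = .portrait
    if connection.isVideoMirroringSupported {
        connection.automaticallyAdjustsVideoMirroring = false
        connection.isVideoMirrored = true   // deliver un-mirrored front-camera frames
    }
}

session.startRunning()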

Can you provide more context on how to use this? I've been using let videoRecorded = outputURL! as URL, but it doesn't seem to fit here. - user10497264

You can use -[AVMutableVideoCompositionLayerInstruction setTransform:atTime:].
CGAffineTransform transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
[videoCompositionLayerInstruction setTransform:transform atTime:videoTime];

// then append video tracks
// [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error];

// apply instructions
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];

videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize);
videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate);
videoComposition.instructions = @[videoCompositionInstruction];
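
For reference, here is a minimal Swift sketch of the same horizontal flip (inputURL is an assumed recording URL; note that self.config.videoSize above is a CGFloat — the side length of the square render size — not a CGSize):

import AVFoundation

let asset = AVAsset(url: inputURL)   // inputURL: URL of the recorded (mirrored) file — assumed
guard let videoTrack = asset.tracks(withMediaType: .video).first else { fatalError("no video track") }

// Shift right by the video width, then mirror the x axis — same math as the transform above.
let videoWidth = videoTrack.naturalSize.width
var transform = CGAffineTransform(translationX: videoWidth, y: 0)
transform = transform.scaledBy(x: -1.0, y: 1.0)

let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
layerInstruction.setTransform(transform, at: .zero)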

https://github.com/ElfSundae/AVDemo/tree/ef2ca437d0d8dcb3dd41c5a272c8754a29d8a936/AVSimpleEditoriOS

Export the composition:

AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:presetName];
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = outputURL;
exportSession.shouldOptimizeForNetworkUse = YES;
// videoComposition contains transform instructions for video tracks
exportSession.videoComposition = videoComposition;
// audioMix contains background music for audio tracks
exportSession.audioMix = audioMix;

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    AVAssetExportSessionStatus status = exportSession.status;
    if (status != AVAssetExportSessionStatusCompleted) {
        // exportSession.error
    } else {
        // exportSession.outputURL
    }
}];

Thanks for the example - so how do you export this composition as a video? - PinkFloydRocks
@PinkFloydRocks I use AVAssetExportSession - answer updated. - Elf Sundae
CGAffineTransformMakeTranslation(self.config.videoSize, 0); expects a CGFloat here, not a CGSize. No idea how you got this code to work; it shouldn't be posted here. - user5890979


Swift 5. AVCaptureSession:

let movieFileOutput = AVCaptureMovieFileOutput()
let connection = movieFileOutput.connection(with: .video)

if connection?.isVideoMirroringSupported ?? false {
    connection?.isVideoMirrored = true
}

The same applies to AVCapturePhotoOutput.
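
For example, a minimal sketch for the photo side (assuming photoOutput has already been added to the session):

let photoOutput = AVCapturePhotoOutput()
// ... session.addOutput(photoOutput) must have happened before a connection exists ...

if let connection = photoOutput.connection(with: .video),
   connection.isVideoMirroringSupported {
    connection.isVideoMirrored = true
}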



Transform your video after you get the output:

func mirrorVideo(inputURL: URL, completion: @escaping (_ outputURL: URL?) -> ()) {
    let videoAsset: AVAsset = AVAsset( url: inputURL )
    let clipVideoTrack = videoAsset.tracks( withMediaType: AVMediaType.video ).first! as AVAssetTrack

    let composition = AVMutableComposition()
    composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: CMPersistentTrackID())

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = CGSize(width: clipVideoTrack.naturalSize.height, height: clipVideoTrack.naturalSize.width)
    videoComposition.frameDuration = CMTimeMake(1, 30)

    let transformer = AVMutableVideoCompositionLayerInstruction(assetTrack: clipVideoTrack)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30))
    var transform:CGAffineTransform = CGAffineTransform(scaleX: -1.0, y: 1.0)
    transform = transform.translatedBy(x: -clipVideoTrack.naturalSize.width, y: 0.0)
    transform = transform.rotated(by: CGFloat(Double.pi/2))
    transform = transform.translatedBy(x: 0.0, y: -clipVideoTrack.naturalSize.width)

    transformer.setTransform(transform, at: kCMTimeZero)

    instruction.layerInstructions = [transformer]
    videoComposition.instructions = [instruction]

    // Export

    let exportSession = AVAssetExportSession(asset: videoAsset, presetName: AVAssetExportPreset640x480)!
    let fileName = UniqueIDGenerator.generate().appending(".mp4")
    let filePath = documentsURL.appendingPathComponent(fileName)
    let croppedOutputFileUrl = filePath
    exportSession.outputURL = croppedOutputFileUrl
    exportSession.outputFileType = AVFileType.mp4
    exportSession.videoComposition = videoComposition
    exportSession.exportAsynchronously {
        if exportSession.status == .completed {
            DispatchQueue.main.async(execute: {
                completion(croppedOutputFileUrl)
            })
            return
        } else if exportSession.status == .failed {
            print("Export failed - \(String(describing: exportSession.error))")
        }

        completion(nil)
        return
    }
}
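
A brief usage sketch (recordedFileURL and the playback step are assumptions, not part of the original answer):

// e.g. in fileOutput(_:didFinishRecordingTo:from:error:), once recording has finished:
mirrorVideo(inputURL: recordedFileURL) { outputURL in
    guard let mirroredURL = outputURL else {
        print("Mirroring failed")
        return
    }
    // Play or upload the mirrored file, e.g. AVPlayer(url: mirroredURL).
    print("Mirrored video written to \(mirroredURL)")
}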

This looks like a wall of code without any explanation. It would be better if you could elaborate a bit. - atereshkov
