How can I reduce the file size of a video created with UIImagePickerController?

59

I have an app that lets users record a video with UIImagePickerController and then upload it to YouTube. The problem is that the file UIImagePickerController creates is huge even when the video is only 5 seconds long; a 5-second video is, for example, 16-20 megabytes. I want to keep the video at 540 or 720 quality, but I want to reduce the file size.

I've been experimenting with AVFoundation and AVAssetExportSession to try to get a smaller file. I've tried the following code:

AVAsset *video = [AVAsset assetWithURL:videoURL];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:video presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = [pathToSavedVideosDirectory URLByAppendingPathComponent:@"vid1.mp4"];
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"done processing video!");
}];

But this doesn't reduce the file size at all. I know what I'm trying to do is possible, because in Apple's Photos app, when you select "share on YouTube", the video file is automatically processed so it's small enough to upload. I want to do the same thing in my app.

How can I accomplish this?


Does the Photos upload keep quality and resolution intact? I suspect it reduces both to make the video smaller. - davidfrancis
No, it preserves the video as uploaded. YouTube supports 1080p video. - zakdances
Would setting the output file type to AVFileTypeQuickTimeMovie reduce the file size? Or try the yourPickerController.videoQuality property to lower the video quality, and with it the file size? - Just a coder
As I mentioned in my post, I want to keep the quality at 720 or 540. I'll try converting to a MOV, but as far as I know it's a much larger format than MP4. - zakdances
The title is misleading since you don't use UIImagePickerController anywhere; you should change it to avoid confusing future readers. - thibaut noah
13 Answers

68

With AVCaptureSession and AVAssetWriter, you can set compression settings like these:

NSDictionary *settings = @{AVVideoCodecKey:AVVideoCodecH264,
                           AVVideoWidthKey:@(video_width),
                           AVVideoHeightKey:@(video_height),
                           AVVideoCompressionPropertiesKey:
                               @{AVVideoAverageBitRateKey:@(desired_bitrate),
                                 AVVideoProfileLevelKey:AVVideoProfileLevelH264Main31, /* Or whatever profile & level you wish to use */
                                 AVVideoMaxKeyFrameIntervalKey:@(desired_keyframe_interval)}};

AVAssetWriterInput* writer_input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];

EDIT: I guess, if you insist on using UIImagePicker to create the movie in the first place, you'll have to use AVAssetReader's copyNextSampleBuffer and AVAssetWriter's appendSampleBuffer methods to do the transcode.


Wow... this is great. It's really frustrating that the documentation is missing or hard to find. Why do you have to use copyNextSampleBuffer for a video created with UIImagePicker? Can't you just designate the generated mp4/mov as an AVAsset and feed it directly into an AVAssetWriter? - zakdances
When you say "copy all the samples", do you mean using copyNextSampleBuffer? - zakdances
@jgh I'm new to AVFoundation. I'm running into the same video-compression problem; could you share a sample project for this? - Suresh
@Infaz No offense, but you should learn how to read the documentation. That goes for everyone else asking me the same question, too. - jgh
Hi @jgh, exporting a 60-second video currently takes about 10-20 seconds. Besides playing with the presetName property, what other values would you suggest trying to minimize export time while preserving video quality? - Crashalot

22

Your friend zak is right: setting cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow; is not the solution. The solution is to reduce the data rate, or bit rate, which is what jgh suggested.
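To get an intuition for what a given bit rate means for upload size, the encoded video size is roughly bit rate × duration ÷ 8. A minimal sketch in plain Swift (no AVFoundation needed; the 1,250,000 bps figure matches the compression settings used below):

```swift
// Rough output-size estimate for an encoded video track:
// bits/second × seconds ÷ 8 gives bytes; divide by 1,048,576 for MB.
func estimatedSizeMB(bitsPerSecond: Double, seconds: Double) -> Double {
    return bitsPerSecond * seconds / 8.0 / 1_048_576.0
}

// A 21-second clip at 1,250,000 bps comes out near 3 MB (video track only,
// audio and container overhead excluded).
print(estimatedSizeMB(bitsPerSecond: 1_250_000, seconds: 21))
```

Audio (e.g. AAC at 128 kbps) and container overhead add a little on top, so treat this as a lower bound.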

I have three methods. The first handles the UIImagePicker delegate method:

// For responding to the user accepting a newly-captured picture or movie
- (void) imagePickerController: (UIImagePickerController *) picker didFinishPickingMediaWithInfo: (NSDictionary *) info {

// Handle movie capture
NSURL *movieURL = [info objectForKey:
                            UIImagePickerControllerMediaURL];

NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:[self randomString]] stringByAppendingString:@".mp4"]];

// Compress movie first
[self convertVideoToLowQuailtyWithInputURL:movieURL outputURL:uploadURL];
}

The second method converts the video to a lower bit rate, not to smaller dimensions:

- (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL
                               outputURL:(NSURL*)outputURL
{
//setup video writer
AVAsset *videoAsset = [[AVURLAsset alloc] initWithURL:inputURL options:nil];

AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

CGSize videoSize = videoTrack.naturalSize;

NSDictionary *videoWriterCompressionSettings =  [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:1250000], AVVideoAverageBitRateKey, nil];

NSDictionary *videoWriterSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, videoWriterCompressionSettings, AVVideoCompressionPropertiesKey, [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey, [NSNumber numberWithFloat:videoSize.height], AVVideoHeightKey, nil];

AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                         assetWriterInputWithMediaType:AVMediaTypeVideo
                                         outputSettings:videoWriterSettings];

videoWriterInput.expectsMediaDataInRealTime = YES;

videoWriterInput.transform = videoTrack.preferredTransform;

AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:nil];

[videoWriter addInput:videoWriterInput];

//setup video reader
NSDictionary *videoReaderSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];

AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoReaderSettings];

AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];

[videoReader addOutput:videoReaderOutput];

//setup audio writer
AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeAudio
                                        outputSettings:nil];

audioWriterInput.expectsMediaDataInRealTime = NO;

[videoWriter addInput:audioWriterInput];

//setup audio reader
AVAssetTrack* audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

AVAssetReaderOutput *audioReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];

AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:videoAsset error:nil];

[audioReader addOutput:audioReaderOutput];    

[videoWriter startWriting];

//start writing from video reader
[videoReader startReading];

[videoWriter startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue1", NULL);

[videoWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:
 ^{

     while ([videoWriterInput isReadyForMoreMediaData]) {

         CMSampleBufferRef sampleBuffer;

         if ([videoReader status] == AVAssetReaderStatusReading &&
             (sampleBuffer = [videoReaderOutput copyNextSampleBuffer])) {

             [videoWriterInput appendSampleBuffer:sampleBuffer];
             CFRelease(sampleBuffer);
         }

         else {

             [videoWriterInput markAsFinished];

             if ([videoReader status] == AVAssetReaderStatusCompleted) {

                 //start writing from audio reader
                 [audioReader startReading];

                 [videoWriter startSessionAtSourceTime:kCMTimeZero];

                 dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue2", NULL);

                 [audioWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:^{

                     while (audioWriterInput.readyForMoreMediaData) {

                         CMSampleBufferRef sampleBuffer;

                         if ([audioReader status] == AVAssetReaderStatusReading &&
                             (sampleBuffer = [audioReaderOutput copyNextSampleBuffer])) {

                            [audioWriterInput appendSampleBuffer:sampleBuffer];
                                    CFRelease(sampleBuffer);
                         }

                         else {

                             [audioWriterInput markAsFinished];

                             if ([audioReader status] == AVAssetReaderStatusCompleted) {

                                 [videoWriter finishWritingWithCompletionHandler:^(){
                                     [self sendMovieFileAtURL:outputURL];
                                 }];

                             }
                         }
                     }

                 }
                  ];
             }
         }
     }
 }
 ];
}

On success, the third method, sendMovieFileAtURL:, is called; it uploads the compressed video at outputURL to a server.

Note that ARC is enabled in my project, so you'll have to add some release calls if yours doesn't have ARC turned on.


This solution is complete and works great. I compressed a qHD (960x540), 21-second video from 80 MB down to 3 MB with the settings above. Just make sure your outputURL is a file URL [NSURL fileURLWithPath:], and put your cleanup code right after [audioWriterInput markAsFinished]. I couldn't get the code in the following "if" statement to execute, but the video comes out great with very few artifacts. - jbcaveman
Correction: "put your cleanup code right after [videoWriter finishWritingWithCompletionHandler:^(){ }. I couldn't get code inside that completion handler to execute…" (couldn't edit after 5 minutes) - jbcaveman
Besides retaining videoWriter (my previous comment), you might also consider calling endSessionAtSourceTime:, as Mr. T mentions in an answer in the same thread at https://dev59.com/HWMk5IYBdhLWcg3w-Cmq. I haven't hit the intermittent problem Mr. T saw, but calling endSessionAtSourceTime: before finishWritingWithCompletionHandler: shouldn't hurt, even though the docs suggest it isn't required. - Scott Carter
I put the same code in my project, but when I fetch the video from the documents folder I can see the file is there, yet it won't play. Am I doing something wrong? I just pasted the code above and commented out sendMovieFileAtURL: because I wanted to test first. Any suggestions? - Soniya
In my case this crashed on iOS 8 with uncaught exception 'NSInvalidArgumentException': AVAssetReader startReading() cannot be called again after reading has already started. - dev.nikolaz

19

UIImagePickerController has a videoQuality property of type UIImagePickerControllerQualityType, and it is applied both to recorded movies and to movies picked from the library (that happens during the transcoding phase).

Alternatively, if you have to deal with an existing asset (a file) that doesn't come from the library, you might look at these presets:

AVAssetExportPresetLowQuality
AVAssetExportPresetMediumQuality
AVAssetExportPresetHighestQuality

AVAssetExportPreset640x480
AVAssetExportPreset960x540
AVAssetExportPreset1280x720
AVAssetExportPreset1920x1080
and pass one of them to the initializer of the AVAssetExportSession class. I'm afraid you'll have to experiment for your particular content, because there's no precise description of what the low or medium quality presets do, or which settings the 640x480 or 1280x720 presets use. The only useful information in the docs is this:

"Export preset names for device-appropriate QuickTime files: You use these export options to produce QuickTime .mov files with video size appropriate to the current device. The export will not scale the video up from a smaller size. Video is compressed using H.264; audio is compressed using AAC. Some devices cannot support some sizes."

Apart from that, I don't recall AVFoundation offering precise control over quality, such as the frame rate or free-form dimensions

I was wrong: there is a way to tweak all the parameters you mention, and it is indeed AVAssetWriter: How do I export UIImage array as a movie?

By the way, here's a link to a similar question with a code sample: iPhone: programmatically compressing recorded video to share?
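As a hedged sketch (not from the original answer), passing one of the presets above to AVAssetExportSession looks roughly like this in Swift; the input and output URLs are placeholders:

```swift
import AVFoundation

// Hedged sketch: export an existing file with a size/quality preset.
// inputURL and outputURL are placeholder file URLs, not from the original post.
func export(_ inputURL: URL, to outputURL: URL,
            preset: String = AVAssetExportPreset960x540) {
    let asset = AVAsset(url: inputURL)
    guard let session = AVAssetExportSession(asset: asset, presetName: preset) else {
        return // the preset isn't supported for this asset
    }
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.shouldOptimizeForNetworkUse = true
    session.exportAsynchronously {
        print("export status: \(session.status.rawValue)")
    }
}
```

Unlike the passthrough preset in the question, a sized preset like AVAssetExportPreset960x540 actually re-encodes, so the output shrinks.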


I've been experimenting with AVAssetExport, but as you mentioned, its quality settings don't seem to achieve what UIImagePickerControllerQualityType already does. AVAssetExportPresetMediumQuality and UIImagePickerControllerQualityType = medium produce very low quality 360p video, while the high quality setting seems to produce an almost uncompressed 720p video with an unreasonably large file size. I'm sure the answer to my question involves using AVAssetWriter to change the frame rate and bit rate of a 720p video. - zakdances
I'm hoping someone with AVAssetWriter experience can shed some light on it. - zakdances
I was wrong, there is indeed a way to tweak all the parameters you mention, and it's AVAssetWriter: https://dev59.com/questions/5G865IYBdhLWcg3wlvmq#3742212 - Sash Zats

15

Swift 5, good-quality approach:

The code below is taken from this link. The problem with that link is that it only works with a .mov file output; if you want a .mp4 output, it crashes. The code below gets you a .mp4. It is tried, tested, and works. A 15-second video that was originally 27 MB, for example, shrinks to 2 MB. If you want better quality, raise the bitrate. I have it set to 1250000.

Copy and paste this code:

import AVFoundation

// add these properties
var assetWriter: AVAssetWriter!
var assetWriterVideoInput: AVAssetWriterInput!
var audioMicInput: AVAssetWriterInput!
var videoURL: URL!
var audioAppInput: AVAssetWriterInput!
var channelLayout = AudioChannelLayout()
var assetReader: AVAssetReader?
let bitrate: NSNumber = NSNumber(value: 1250000) // *** you can change this number to increase/decrease the quality. The more you increase, the better the video quality but the the compressed file size will also increase

// compression function, it returns a .mp4 but you can change it to .mov inside the do try block towards the middle. Change assetWriter = try AVAssetWriter ... AVFileType.mp4 to AVFileType.mov
func compressFile(_ urlToCompress: URL, completion:@escaping (URL)->Void) {
    
    var audioFinished = false
    var videoFinished = false
    
    let asset = AVAsset(url: urlToCompress)
    
    //create asset reader
    do {
        assetReader = try AVAssetReader(asset: asset)
    } catch {
        assetReader = nil
    }
    
    guard let reader = assetReader else {
        print("Could not iniitalize asset reader probably failed its try catch")
        // show user error message/alert
        return
    }
    
    guard let videoTrack = asset.tracks(withMediaType: AVMediaType.video).first else { return }
    let videoReaderSettings: [String:Any] = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB]
    
    let assetReaderVideoOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    
    var assetReaderAudioOutput: AVAssetReaderTrackOutput?
    if let audioTrack = asset.tracks(withMediaType: AVMediaType.audio).first {
        
        let audioReaderSettings: [String : Any] = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 2
        ]
        
        assetReaderAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: audioReaderSettings)
        
        if reader.canAdd(assetReaderAudioOutput!) {
            reader.add(assetReaderAudioOutput!)
        } else {
            print("Couldn't add audio output reader")
            // show user error message/alert
            return
        }
    }
    
    if reader.canAdd(assetReaderVideoOutput) {
        reader.add(assetReaderVideoOutput)
    } else {
        print("Couldn't add video output reader")
        // show user error message/alert
        return
    }
    
    let videoSettings:[String:Any] = [
        AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: self.bitrate],
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoHeightKey: videoTrack.naturalSize.height,
        AVVideoWidthKey: videoTrack.naturalSize.width,
        AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill
    ]
    
    let audioSettings: [String:Any] = [AVFormatIDKey : kAudioFormatMPEG4AAC,
                                       AVNumberOfChannelsKey : 2,
                                       AVSampleRateKey : 44100.0,
                                       AVEncoderBitRateKey: 128000
    ]
    
    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: audioSettings)
    let videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoSettings)
    videoInput.transform = videoTrack.preferredTransform
    
    let videoInputQueue = DispatchQueue(label: "videoQueue")
    let audioInputQueue = DispatchQueue(label: "audioQueue")
    
    do {
        
        let formatter = DateFormatter()
        formatter.dateFormat = "yyyy'-'MM'-'dd'T'HH':'mm':'ss'Z'"
        let date = Date()
        let tempDir = NSTemporaryDirectory()
        let outputPath = "\(tempDir)/\(formatter.string(from: date)).mp4"
        let outputURL = URL(fileURLWithPath: outputPath)
        
        assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mp4)
        
    } catch {
        assetWriter = nil
    }
    guard let writer = assetWriter else {
        print("assetWriter was nil")
        // show user error message/alert
        return
    }
    
    writer.shouldOptimizeForNetworkUse = true
    writer.add(videoInput)
    writer.add(audioInput)
    
    writer.startWriting()
    reader.startReading()
    writer.startSession(atSourceTime: CMTime.zero)
    
    let closeWriter:()->Void = {
        if (audioFinished && videoFinished) {
            self.assetWriter?.finishWriting(completionHandler: { [weak self] in
                
                if let assetWriter = self?.assetWriter {
                    do {
                        let data = try Data(contentsOf: assetWriter.outputURL)
                        print("compressFile -file size after compression: \(Double(data.count) / 1048576) mb")
                    } catch let err as NSError {
                        print("compressFile Error: \(err.localizedDescription)")
                    }
                }
                
                if let safeSelf = self, let assetWriter = safeSelf.assetWriter {
                    completion(assetWriter.outputURL)
                }
            })
            
            self.assetReader?.cancelReading()
        }
    }
    
    audioInput.requestMediaDataWhenReady(on: audioInputQueue) {
        while(audioInput.isReadyForMoreMediaData) {
            if let cmSampleBuffer = assetReaderAudioOutput?.copyNextSampleBuffer() {
                
                audioInput.append(cmSampleBuffer)
                
            } else {
                audioInput.markAsFinished()
                DispatchQueue.main.async {
                    audioFinished = true
                    closeWriter()
                }
                break;
            }
        }
    }
    
    videoInput.requestMediaDataWhenReady(on: videoInputQueue) {
        // request data here
        while(videoInput.isReadyForMoreMediaData) {
            if let cmSampleBuffer = assetReaderVideoOutput.copyNextSampleBuffer() {
                
                videoInput.append(cmSampleBuffer)
                
            } else {
                videoInput.markAsFinished()
                DispatchQueue.main.async {
                    videoFinished = true
                    closeWriter()
                }
                break;
            }
        }
    }
}

If you're compressing from a URL, here's how to use it. The compressedURL is returned in the callback:

@IBAction func buttonTapped(sender: UIButton) {

    // show activity indicator

    let videoURL = URL(fileURLWithPath: "...") // a file URL; URL(string:) would be optional

    compressFile(videoURL) { (compressedURL) in

       // remove activity indicator
       // do something with the compressedURL such as sending to Firebase or playing it in a player on the *main queue*
    }
}

A heads-up: I've noticed the audio slows things down. You can try running this in a background task to see if it runs faster. If you add anything like an alert inside the compressFile function itself, you have to present it on the main queue or the app will crash:

DispatchQueue.global(qos: .background).async { [weak self] in

    self?.compressFile(videoURL) { (compressedURL) in

        DispatchQueue.main.async { [weak self] in
            // also remove activity indicator on mainQueue in addition to whatever is inside the function itself that needs to be updated on the mainQueue
        }
    }
}

If you're compressing a mix composition, follow the steps below. You'll need an AVMutableComposition, an AVAssetExportSession, and the compressFile(:completion:) function above:

@IBAction func buttonTapped(sender: UIButton) {

    // show activity indicator

    let mixComposition = AVMutableComposition()
    // code to create mix ...

    // create a local file
    let tempDir = NSTemporaryDirectory()
    let dirPath = "\(tempDir)/videos_\(UUID().uuidString).mp4"
    let outputFileURL = URL(fileURLWithPath: dirPath)

    removeUrlFromFileManager(outputFileURL) // check to see if the file already exists, if it does remove it, code is at the bottom of the answer

    createAssetExportSession(mixComposition, outputFileURL)
}

// here is the AssetExportSession function with the compressFile(:completion:) inside the callback
func createAssetExportSession(_ mixComposition: AVMutableComposition, _ outputFileURL: URL) {
    
    // *** If your video/url doesn't have sound (not mute but literally no sound, my iPhone's mic was broken when I recorded the video), change this to use AVAssetExportPresetPassthrough instead of HighestQulity. When my video didn't have sound the exporter.status kept returning .failed *** You can check for sound using https://stackoverflow.com/a/64733623/4833705
    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else {
        // alert user there is a problem
        return
    }
    
    exporter.outputURL = outputFileURL
    exporter.outputFileType = AVFileType.mp4
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously {
        
        switch exporter.status {
        case .completed:
            print("completed")
            // view the AssetExportSession file size using HighestQuality which will be very high
            do {
                let data = try Data(contentsOf: outputFileURL)
                print("createAssetExportSession -file size: \(Double(data.count) / 1048576) mb")
            } catch let err as NSError {
                print("createAssetExportSession Error: \(err.localizedDescription)")
            }
        case .failed:
            print("failed:", exporter.error as Any)
            DispatchQueue.main.async { [weak self] in
                // remove activity indicator
                // alert user there is a problem
            }
            return
        case .cancelled:
            print("cancelled", exporter.error as Any)
            DispatchQueue.main.async { [weak self] in
                // remove activity indicator
                // alert user there is a problem
            }
            return
        default:
            print("complete")
        }
        
        guard let exporterOutputURL = exporter.outputURL else {
            // alert user there is a problem
            return
        }

        DispatchQueue.main.async { [weak self] in
            
            self?.compressFile(exporterOutputURL) { (compressedURL) in
               // remove activity indicator
               // do something with the compressedURL such as sending to Firebase or playing it in a player on the *main queue*
            }
        }
    }
}

When you're done with the compressed URL, be sure to remove it from the file system, for example before dismissing the view controller:

func dismissVC() {

    removeUrlFromFileManager(compressedURL)
    // dismiss vc ...
}

func removeUrlFromFileManager(_ outputFileURL: URL?) {
    if let outputFileURL = outputFileURL {
        
        let path = outputFileURL.path
        if FileManager.default.fileExists(atPath: path) {
            do {
                try FileManager.default.removeItem(atPath: path)
                print("url SUCCESSFULLY removed: \(outputFileURL)")
            } catch {
                print("Could not remove file at url: \(outputFileURL)")
            }
        }
    }
}

No problem, we're all here to help each other. Cheers! - Lance Samaria
@LanceSamaria It's just a camelCasing issue; the file declares bitRate rather than bitrate. - David
@YogeshPatel I've never handled an audio-only case before, but I think you can use exactly the same code above and exclude anything video related, e.g. videoFinished, videoTrack, videoReaderSettings, assetReaderVideoOutput, videoSettings, videoInput, and videoInputQueue. If you remove those variables from the file it should still work. If not, leave me a comment. - Lance Samaria
@YogeshPatel You must be doing something wrong. I just did what I described, excluding everything video related, and tested the audio file africa-toto.wav from https://www.ee.columbia.edu/~dpwe/sounds/music/. It was 12.0 MB before compression and 3.0 MB after. I used the exact code from the answer, minus everything I told you to exclude. - Lance Samaria
@Bagusflyer Lower the bitrate as much as you can; there's a trade-off between fast compression at poor quality and slow compression at high quality. I've noticed the audio slows things down; you could also try running it on a background task, but if there's anything like an alert inside the function itself, make sure to update it on the main queue. By the way, if the answer works, please upvote it; if it doesn't and you find one that gives fast compression with great results, please post it, I'd use it myself. - Lance Samaria

14

Erik's answer may have been correct at the time he wrote it, but now on iOS 8 it crashes constantly; I've spent several hours on it myself.

You need a PhD to work with AVAssetWriter; it's non-trivial: https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/05_Export.html#//apple_ref/doc/uid/TP40010188-CH9-SW1

There's an amazing library that does exactly what you need: a drop-in replacement for AVAssetExportSession with more important features like changing the bit rate: https://github.com/rs/SDAVAssetExportSession

Here's how you use it:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{

  SDAVAssetExportSession *encoder = [SDAVAssetExportSession.alloc initWithAsset:[AVAsset assetWithURL:[info objectForKey:UIImagePickerControllerMediaURL]]];
  NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
  NSString *documentsDirectory = [paths objectAtIndex:0];
  self.myPathDocs =  [documentsDirectory stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"lowerBitRate-%d.mov",arc4random() % 1000]];
  NSURL *url = [NSURL fileURLWithPath:self.myPathDocs];
  encoder.outputURL=url;
  encoder.outputFileType = AVFileTypeMPEG4;
  encoder.shouldOptimizeForNetworkUse = YES;

  encoder.videoSettings = @
  {
  AVVideoCodecKey: AVVideoCodecH264,
  AVVideoCompressionPropertiesKey: @
    {
    AVVideoAverageBitRateKey: @2300000, // Lower bit rate here
    AVVideoProfileLevelKey: AVVideoProfileLevelH264High40,
    },
  };
  encoder.audioSettings = @
  {
  AVFormatIDKey: @(kAudioFormatMPEG4AAC),
  AVNumberOfChannelsKey: @2,
  AVSampleRateKey: @44100,
  AVEncoderBitRateKey: @128000,
  };

  [encoder exportAsynchronouslyWithCompletionHandler:^
  {
    int status = encoder.status;

    if (status == AVAssetExportSessionStatusCompleted)
    {
      AVAssetTrack *videoTrack = nil;
      AVURLAsset *asset = [AVAsset assetWithURL:encoder.outputURL];
      NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
      videoTrack = [videoTracks objectAtIndex:0];
      float frameRate = [videoTrack nominalFrameRate];
      float bps = [videoTrack estimatedDataRate];
      NSLog(@"Frame rate == %f",frameRate);
      NSLog(@"bps rate == %f",bps/(1024.0 * 1024.0));
      NSLog(@"Video export succeeded");
      // encoder.outputURL <- this is what you want!!
    }
    else if (status == AVAssetExportSessionStatusCancelled)
    {
      NSLog(@"Video export cancelled");
    }
    else
    {
      NSLog(@"Video export failed with error: %@ (%d)", encoder.error.localizedDescription, encoder.error.code);
    }
  }];
}

This is a great solution for compressing an existing video. However, it's missing the AVVideoWidthKey and AVVideoHeightKey keys in encoder.videoSettings. To use the current dimensions, use: AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil]; NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo]; AVAssetTrack *track = [tracks objectAtIndex:0]; encoder.videoSettings = @ { .... AVVideoWidthKey : @(track.naturalSize.width), AVVideoHeightKey: @(track.naturalSize.height), .... } - Thibaud David
You can scale down the width/height (keeping the aspect ratio by using the same factor), or reduce AVSampleRateKey to fit your needs. - Thibaud David
Hi @ThibaudDavid, I tried reducing the file size by multiplying the width and height by 0.75 and dropping the bitrate from 2300000 to 1960000, but the exported file grew from 2175522 bytes to 3938850 bytes. :( - Mayur Shrivas
What's the bitrate of your input file? If you specify a lower bitrate during conversion, the file should come out with fewer bytes. Try passing [track estimatedDataRate] / 2 as the bitrate, for example, to be sure your value really is lower. - Thibaud David
@ThibaudDavid But then the video loses its full width and height, because black bars appear on the left/right and top/bottom. - Mayur Shrivas

9

Erik Wegener's code rewritten for Swift 3:

class func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, onDone: @escaping () -> ()) {
            //setup video writer
            let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
            let videoTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
            let videoSize = videoTrack.naturalSize
            let videoWriterCompressionSettings = [
                AVVideoAverageBitRateKey : Int(125000)
            ]

            let videoWriterSettings:[String : AnyObject] = [
                AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
                AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
                AVVideoWidthKey : Int(videoSize.width) as AnyObject,
                AVVideoHeightKey : Int(videoSize.height) as AnyObject
            ]

            let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
            videoWriterInput.expectsMediaDataInRealTime = true
            videoWriterInput.transform = videoTrack.preferredTransform
            let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileTypeQuickTimeMovie)
            videoWriter.add(videoWriterInput)
            //setup video reader
            let videoReaderSettings:[String : AnyObject] = [
                kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
            ]

            let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
            let videoReader = try! AVAssetReader(asset: videoAsset)
            videoReader.add(videoReaderOutput)
            //setup audio writer
            let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
            audioWriterInput.expectsMediaDataInRealTime = false
            videoWriter.add(audioWriterInput)
            //setup audio reader
            let audioTrack = videoAsset.tracks(withMediaType: AVMediaTypeAudio)[0]
            let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
            let audioReader = try! AVAssetReader(asset: videoAsset)
            audioReader.add(audioReaderOutput)
            videoWriter.startWriting()

            //start writing from video reader
            videoReader.startReading()
            videoWriter.startSession(atSourceTime: kCMTimeZero)
            let processingQueue = DispatchQueue(label: "processingQueue1")
            videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                while videoWriterInput.isReadyForMoreMediaData {
                    let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
                    if videoReader.status == .reading && sampleBuffer != nil {
                        videoWriterInput.append(sampleBuffer!)
                    }
                    else {
                        videoWriterInput.markAsFinished()
                        if videoReader.status == .completed {
                            //start writing from audio reader
                            audioReader.startReading()
                            videoWriter.startSession(atSourceTime: kCMTimeZero)
                            let processingQueue = DispatchQueue(label: "processingQueue2")
                            audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                                while audioWriterInput.isReadyForMoreMediaData {
                                    let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                                    if audioReader.status == .reading && sampleBuffer != nil {
                                        audioWriterInput.append(sampleBuffer!)
                                    }
                                    else {
                                        audioWriterInput.markAsFinished()
                                        if audioReader.status == .completed {
                                            videoWriter.finishWriting(completionHandler: {() -> Void in
                                                onDone();
                                            })
                                        }
                                    }
                                }
                            })
                        }
                    }
                }
            })
        }

Crashes when changing to MP4 :\ - Yair hadad
You need to add audio settings, since you changed the video type. - Bajrang Sinha
This code crashes for videos longer than 8 minutes. - Bajrang Sinha
125000 is far too blurry for video. - Ning

8
You can set the video quality when you open the UIImagePickerController to any of the following:

UIImagePickerControllerQualityType640x480
UIImagePickerControllerQualityTypeLow
UIImagePickerControllerQualityTypeMedium
UIImagePickerControllerQualityTypeHigh
UIImagePickerControllerQualityTypeIFrame960x540
UIImagePickerControllerQualityTypeIFrame1280x720

Try this code to change the quality type when opening the UIImagePickerController:

if (([UIImagePickerController isSourceTypeAvailable:
      UIImagePickerControllerSourceTypeCamera] == NO))
    return NO;
UIImagePickerController *cameraUI = [[UIImagePickerController alloc] init];
cameraUI.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraUI.mediaTypes = [[NSArray alloc] initWithObjects: (NSString *) kUTTypeMovie, nil];

cameraUI.allowsEditing = NO;
cameraUI.delegate = self;
cameraUI.videoQuality = UIImagePickerControllerQualityTypeLow;//you can change the quality here
[self presentModalViewController:cameraUI animated:YES]; 

I've tried UIImagePickerControllerQualityType, but it doesn't work for me because setting the quality to medium or low changes the video's aspect ratio... I want a way to reduce the size of a 720p video, not convert it down to 360p. - zakdances

4

Swift 4:

func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, completion: @escaping (Bool) -> Void) {

    let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
    let videoTrack = videoAsset.tracks(withMediaType: AVMediaType.video)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
        AVVideoWidthKey : Int(videoSize.width) as AnyObject,
        AVVideoHeightKey : Int(videoSize.height) as AnyObject
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov)
    videoWriter.add(videoWriterInput)
    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    var videoReader: AVAssetReader!

    do {
        videoReader = try AVAssetReader(asset: videoAsset)
    }
    catch {
        print("video reader error: \(error)")
        completion(false)
        return // bail out here; otherwise the nil videoReader below is force-unwrapped and crashes
    }
    videoReader.add(videoReaderOutput)
    //setup audio writer
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)
    //setup audio reader
    let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.add(audioReaderOutput)
    videoWriter.startWriting()

    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSession(atSourceTime: kCMTimeZero)
    let processingQueue = DispatchQueue(label: "processingQueue1")
    videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
        while videoWriterInput.isReadyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .reading && sampleBuffer != nil {
                videoWriterInput.append(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSession(atSourceTime: kCMTimeZero)
                    let processingQueue = DispatchQueue(label: "processingQueue2")
                    audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                        while audioWriterInput.isReadyForMoreMediaData {
                            let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .reading && sampleBuffer != nil {
                                audioWriterInput.append(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .completed {
                                    videoWriter.finishWriting(completionHandler: {() -> Void in
                                        completion(true)
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}

The code works fine at the moment, but it crashes on videos that have no audio, because of this part: let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]. Any idea how to fix it so it also works on videos without audio? - Toto

3

Erik Wegener's code rewritten in Swift:

class func convertVideoToLowQuailtyWithInputURL(inputURL: NSURL, outputURL: NSURL, onDone: () -> ()) {
    //setup video writer
    let videoAsset = AVURLAsset(URL: inputURL, options: nil)
    let videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings,
        AVVideoWidthKey : Int(videoSize.width),
        AVVideoHeightKey : Int(videoSize.height)
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)
    videoWriter.addInput(videoWriterInput)
    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    let videoReader = try! AVAssetReader(asset: videoAsset)
    videoReader.addOutput(videoReaderOutput)
    //setup audio writer
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.addInput(audioWriterInput)
    //setup audio reader
    let audioTrack = videoAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.addOutput(audioReaderOutput)
    videoWriter.startWriting()

    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSessionAtSourceTime(kCMTimeZero)
    let processingQueue = dispatch_queue_create("processingQueue1", nil)
    videoWriterInput.requestMediaDataWhenReadyOnQueue(processingQueue, usingBlock: {() -> Void in
        while videoWriterInput.readyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .Reading && sampleBuffer != nil {
                videoWriterInput.appendSampleBuffer(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .Completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSessionAtSourceTime(kCMTimeZero)
                    let processingQueue = dispatch_queue_create("processingQueue2", nil)
                    audioWriterInput.requestMediaDataWhenReadyOnQueue(processingQueue, usingBlock: {() -> Void in
                        while audioWriterInput.readyForMoreMediaData {
                            let sampleBuffer:CMSampleBufferRef? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .Reading && sampleBuffer != nil {
                                audioWriterInput.appendSampleBuffer(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .Completed {
                                    videoWriter.finishWritingWithCompletionHandler({() -> Void in
                                        onDone();
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}

3
Use exportSession.fileLengthLimit = 1024 * 1024 * 10 //10 MB

10 MB here is a hard-coded number; pick a limit that matches the bitrate you need.

fileLengthLimit

Indicates the file length that the output of the session should not exceed. Depending on the content of the source asset, the length of the output file may slightly exceed the file length limit. If you require a strict limit to be observed before making use of the output, test the length of the output file. See also maxDuration and timeRange.

developer.apple.com/documentation/avfoundation/avassetexportsession/1622333-filelengthlimit
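Building on that: instead of hard-coding 10 MB, you can derive the limit from a target bitrate and the clip duration. A back-of-the-envelope sketch (all numbers are illustrative assumptions, not values from the answer above):

```swift
// Estimated output size in bytes ≈ (video bitrate + audio bitrate) * duration / 8.
// Example numbers only: 2 Mbit/s video, 128 kbit/s audio, 30-second clip.
let videoBitrate = 2_000_000.0   // bits per second
let audioBitrate = 128_000.0     // bits per second
let duration = 30.0              // seconds
let fileLengthLimit = Int64((videoBitrate + audioBitrate) * duration / 8.0)
print(fileLengthLimit)           // bytes; roughly 8 MB for these inputs
```

The resulting value is what you would assign to exportSession.fileLengthLimit (which takes an Int64).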


1
This should be ranked higher. - förschter
It worked, thanks! Where does the number 1048576 come from? - kuzdu
1
1048576 bytes = 1 MB. - Kumar

Content provided by Stack Overflow.