How can I reduce a video's file size before uploading it to a server in Swift/iOS?


I am using UIImagePickerController to pick a video in my Swift iOS app. I'm saving the video's URL and now want to convert it to data to send to my server for storage, like this:

let messageVideoData = NSData(contentsOfURL: chosenVideoURL)
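
(In current Swift this load is usually written with Data; the line below is just the modern spelling of the call above, assuming chosenVideoURL is the file URL returned by the picker.)

// Modern equivalent of the NSData call; loads the whole file into memory at once.
let messageVideoData = try? Data(contentsOf: chosenVideoURL)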

The problem is that the file size is very large. For a 7-second video shot on my iPhone 6s at 1280x720 and 30 fps, the file is over 4 MB. I've noticed that the same clip sent through WhatsApp and other chat apps is reduced to a few hundred KB.
What is the best way to reduce the file size for external storage? The video is mainly meant to be watched on phones, so dropping the resolution to 800 or lower would be fine.
I tried setting the UIImagePickerController quality with:
picker.videoQuality = UIImagePickerControllerQualityType.Type640x480

but that only brings the file size down to about 3.5 MB.
Using:
picker.videoQuality = UIImagePickerControllerQualityType.TypeLow

the resolution drops far below what I'd like.
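
(For what it's worth, there is also a preset that sits between the VGA and low settings; below is a minimal sketch of the picker configuration using modern Swift naming. The exact output size per preset varies by device.)

let picker = UIImagePickerController()
picker.sourceType = .photoLibrary
picker.mediaTypes = ["public.movie"]   // kUTTypeMovie
// .typeMedium sits between .type640x480 and .typeLow in output size.
picker.videoQuality = .typeMedium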

Is there another approach I can take to reduce the video file size for storage on my server?


Can you show how exactly you solved it? - Farid Al Haddad
Try this: https://dev59.com/yWgt5IYBdhLWcg3w3xPC#62862102 - Lance Samaria
2 Answers

Try this answer for compressing the video. From jojaba's answer:

If you want to compress the video for remote sharing while keeping the original quality for local storage on the iPhone, you should look into AVAssetExportSession or AVAssetWriter.

Compress video without low quality

Note that this approach is Objective-C based, though.

You should also consider reading up on how iOS manages assets.
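
A minimal Swift sketch of that idea, re-exporting the picked file through AVAssetExportSession with a lower-quality preset (the preset choice and the temporary output path here are placeholder decisions to adapt):

import AVFoundation

func compressForUpload(_ inputURL: URL, completion: @escaping (URL?) -> Void) {
    let asset = AVURLAsset(url: inputURL)
    // A medium-quality preset trades resolution/bitrate for a much smaller file;
    // fixed-size presets such as AVAssetExportPreset1280x720 are also available.
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetMediumQuality) else {
        completion(nil)
        return
    }
    let outputURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("mp4")
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.shouldOptimizeForNetworkUse = true
    export.exportAsynchronously {
        completion(export.status == .completed ? outputURL : nil)
    }
}

Upload the contents of the returned URL instead of the original file; AVAssetExportPresetLowQuality pushes the size down further at a visible cost in quality.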


// Use the SDAVAssetExportSession library with a specific bitrate, chosen per your requirements.
// Input video file size here is roughly 10-15 MB.

func aVodzLatestVideoCompressor(inputURL: URL, aOutputURL: URL, aStartTime: Float, aEndTime: Float) {
          
            let startTime = CMTime(seconds: Double(aStartTime), preferredTimescale: 1000)
            let endTime = CMTime(seconds: Double(aEndTime), preferredTimescale: 1000)
            let timeRange = CMTimeRange(start: startTime, end: endTime)
            
            let anAsset = AVURLAsset(url: inputURL, options: nil)
            guard let videoTrack = anAsset.tracks(withMediaType: AVMediaType.video).first else { return }
            var aQuality: Float = 0.0
            let duration = anAsset.duration
            let totalSeconds = Int(CMTimeGetSeconds(duration))
            print("duration -\(duration) - totalSeconds -\(totalSeconds)")

            // Source file size in MB; it decides which compression tier is used below.
            let sizeVideo = inputURL.verboseFileSizeInMB()
            var bitrate = videoTrack.estimatedDataRate
            let landscap = self.isLandScapVideo(afileURL: inputURL )
            var originalWidth = videoTrack.naturalSize.width
            var originalHeight  = videoTrack.naturalSize.height
            print("originalWidth -\(originalWidth) originalHeight- \(originalHeight) ")
            // Halve the dimensions until both sides drop below 1920.
            while (originalWidth >= 1920 || originalHeight >= 1920) {
                originalWidth = originalWidth / 2
                originalHeight = originalHeight / 2
            }
    
            var setWidth = Int(originalWidth)
            var setlHeight = Int(originalHeight)
            
            if  sizeVideo < 10.0 {
                // COMPRESS_QUALITY_HIGH:
                setWidth = Int(originalWidth)
                setlHeight = Int(originalHeight)
                aQuality = Float(setWidth * setlHeight *  20)
                bitrate = min(aQuality, videoTrack.estimatedDataRate)
            }else if sizeVideo < 20.0 {
                //COMPRESS_QUALITY_MEDIUM:
                if totalSeconds > 35{
                    setWidth = Int(originalWidth /  2.7)
                    setlHeight = Int(originalHeight / 2.7)
                }else if totalSeconds > 25 {
                    setWidth = Int(originalWidth / 2.3)
                    setlHeight = Int(originalHeight / 2.3)
                }else{
                    setWidth = Int(originalWidth / 2.0)
                    setlHeight = Int(originalHeight / 2.0)
                }
                aQuality = Float(setWidth * setlHeight *  10)
                bitrate = min(aQuality, videoTrack.estimatedDataRate)
            }else if sizeVideo < 30.0 {
                //COMPRESS_QUALITY_MEDIUM:
                if totalSeconds > 35{
                    setWidth = Int(originalWidth / 3)
                    setlHeight = Int(originalHeight / 3)
                }else if totalSeconds > 20 {
                    setWidth = Int(originalWidth / 2.5)
                    setlHeight = Int(originalHeight / 2.5)
                }else{
                    setWidth = Int(originalWidth / 2.0)
                    setlHeight = Int(originalHeight / 2.0)
                }
                aQuality = Float(setWidth * setlHeight * 10)
                bitrate = min(aQuality, videoTrack.estimatedDataRate)
            }else{
                if totalSeconds > 35{
                    setWidth = Int(originalWidth / 3.0)
                    setlHeight = Int(originalHeight / 3.0)
                }else if totalSeconds > 25 {
                    setWidth = Int(originalWidth / 2.5)
                    setlHeight = Int(originalHeight / 2.5)
                }else{
                    setWidth = Int(originalWidth / 2.0)
                    setlHeight = Int(originalHeight / 2.0)
                }
                aQuality = Float(setWidth * setlHeight * 10)
                bitrate = min(aQuality, videoTrack.estimatedDataRate)
            }
    
            print("aQuality")
            print(Float(aQuality))
            print("bitrate")
            print(Float(bitrate))
            let encoder = SDAVAssetExportSession(asset: anAsset)
            encoder?.shouldOptimizeForNetworkUse = true
     
            encoder?.timeRange = timeRange
            encoder?.outputFileType = AVFileType.mp4.rawValue
            encoder?.outputURL = aOutputURL
            // Output dimensions (common options: 960x540, 1280x720, 1920x1080); smaller means more size reduction.
            encoder?.videoSettings = [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey:  landscap ? NSNumber(value:1280) : NSNumber(value:720) ,
                AVVideoHeightKey:  landscap ? NSNumber(value:720) : NSNumber(value:1280),
                AVVideoCompressionPropertiesKey: [
                    AVVideoAverageBitRateKey: NSNumber(value: bitrate),
                    AVVideoProfileLevelKey: AVVideoProfileLevelH264High40
                ]
            ]
            encoder?.audioSettings = [
                AVFormatIDKey: NSNumber(value: kAudioFormatMPEG4AAC),
                AVNumberOfChannelsKey: NSNumber(value: 2),
                AVSampleRateKey: NSNumber(value: 44100),
                AVEncoderBitRateKey: NSNumber(value: 128000)
            ]
            
            encoder?.exportAsynchronously(completionHandler: {
                if encoder?.status == .completed {
                    print("Video export succeeded")
                    DispatchQueue.main.async {
                        appDelegate.hideLoader()
                        //NotificationCenter.default.post(name: Notification.Name("getMediaEffect"), object: "3")
                        //self.sendCompletion?(UIImage(), aOutputURL)
                        let text = "Original video-  \(inputURL.verboseFileSizeInMB()) \n and Compressed video \(aOutputURL.verboseFileSizeInMB()) "
                        let alertController = UIAlertController.init(title: "Compressed!!", message: text , preferredStyle: .alert)
                        alertController.addAction(UIAlertAction.init(title: "share to server!", style: .default, handler: { (action) in
                            // Completion block
                            NotificationCenter.default.post(name: Notification.Name("getMediaEffect"), object: "3")
                            self.sendCompletion?(UIImage(), aOutputURL)
                        }))
                        alertController.addAction(UIAlertAction.init(title: "Save", style: .default, handler: { (action) in
                            // Completion block
                            DispatchQueue.main.async {
                                appDelegate.hideLoader()
                                // aOutputURL is already a non-optional URL, so no cast is needed.
                                self.shareVideo(aUrl: aOutputURL)
                            }
                        }))
                        alertController.addAction(UIAlertAction.init(title: "cancel!", style: .default, handler: { (action) in
                        }))
                        self.present(alertController, animated: true, completion: nil)
                    }
                    
                } else if encoder?.status == .cancelled {
                    print("Video export cancelled")
                    DispatchQueue.main.async {
                        appDelegate.hideLoader()
                        self.view.makeToast("error_something_went_wrong".localized)
                    }
                } else {
                    print("Video export failed with error: \(encoder!.error.localizedDescription) ")
                    DispatchQueue.main.async {
                        appDelegate.hideLoader()
                        self.view.makeToast("error_something_went_wrong".localized)
                    }
                }
            })
        }
    
     func isLandScapVideo(afileURL: URL) -> Bool{
            let resolution = self.resolutionForLocalVideo(url: afileURL)
            guard let width = resolution?.width, let height = resolution?.height else {
                return false
            }
            // Landscape if the encoded width exceeds the height.
            return abs(width) > abs(height)
        }
    extension URL {
        func verboseFileSizeInMB() -> Float{
            let p = self.path
            
            let attr = try? FileManager.default.attributesOfItem(atPath: p)
            
            if let attr = attr {
                let fileSize = Float(attr[FileAttributeKey.size] as! UInt64) / (1024.0 * 1024.0)
                print(String(format: "FILE SIZE: %.2f MB", fileSize))
                return fileSize
            } else {
                return Float.zero
            }
        }
    }
    
    
    // If you run into issues with the SDAVAssetExportSession library itself, apply the changes below in its .m file (adjust as per your requirements):

    // Change 1:
    CGAffineTransform matrix = CGAffineTransformMakeTranslation(transx / xratio, transy / yratio - transform.ty);

    // Change 2: fix orientation
        UIImageOrientation videoAssetOrientation = UIImageOrientationUp;
        BOOL isVideoAssetPortrait = NO;
        CGAffineTransform videoTransform = videoTrack.preferredTransform;
        if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
            videoAssetOrientation = UIImageOrientationRight;
            isVideoAssetPortrait = YES;
        }
        if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
            videoAssetOrientation =  UIImageOrientationLeft;
            isVideoAssetPortrait = YES;
        }
        if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
            videoAssetOrientation =  UIImageOrientationUp;
        }
        if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
            videoAssetOrientation = UIImageOrientationDown;
        }
       // [passThroughLayer setTransform:transform atTime:kCMTimeZero];
        if (videoAssetOrientation == UIImageOrientationDown || videoAssetOrientation == UIImageOrientationLeft) {
            [passThroughLayer setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];
        }else{
            [passThroughLayer setTransform:transform atTime:kCMTimeZero];
        }
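
A rough call-site sketch for the Swift function above (pickedVideoURL, the output path, and the trim range are placeholder values to adapt):

// Hypothetical call site: compress the whole clip into a temp file.
let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("compressed.mp4")
try? FileManager.default.removeItem(at: outputURL)   // remove any stale file; the underlying writer won't overwrite
let clipSeconds = Float(CMTimeGetSeconds(AVURLAsset(url: pickedVideoURL).duration))
aVodzLatestVideoCompressor(inputURL: pickedVideoURL,
                           aOutputURL: outputURL,
                           aStartTime: 0,
                           aEndTime: clipSeconds)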

I ended up using NextLevelSessionExporter, but it's very similar. Thanks! - landnbloc
