Extracting frames from a video in Swift


I'm trying to extract frames from a video as UIImages in Swift. I've found several Objective-C solutions, but I'm having trouble finding anything in Swift. Assuming the approach below is correct, could someone help me convert it to Swift, or share their own approach?

Source: Grabbing the first frame of a video from UIImagePickerController?

- (UIImage *)imageFromVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);

    AVAssetImageGenerator *assetIG = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetIG.appliesPreferredTrackTransform = YES;
    assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    CGImageRef thumbnailImageRef = NULL;
    CFTimeInterval thumbnailImageTime = time;
    NSError *igError = nil;
    thumbnailImageRef = [assetIG copyCGImageAtTime:CMTimeMake(thumbnailImageTime, 60)
                                        actualTime:NULL
                                             error:&igError];

    if (!thumbnailImageRef)
        NSLog(@"thumbnailImageGenerationError %@", igError);

    UIImage *image = thumbnailImageRef
        ? [[UIImage alloc] initWithCGImage:thumbnailImageRef]
        : nil;
    // The copied CGImageRef follows the Create/Copy rule and must be released.
    CGImageRelease(thumbnailImageRef);

    return image;
}

I've never tried this myself, but I hope this link helps you. - UsamaMan
4 Answers


It actually does work.

import AVFoundation
import UIKit

func imageFromVideo(url: URL, at time: TimeInterval) -> UIImage? {
    let asset = AVURLAsset(url: url)

    let assetIG = AVAssetImageGenerator(asset: asset)
    assetIG.appliesPreferredTrackTransform = true
    assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels

    // copyCGImage(at:actualTime:) is synchronous and throws on failure.
    let cmTime = CMTime(seconds: time, preferredTimescale: 60)
    let thumbnailImageRef: CGImage
    do {
        thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
    } catch let error {
        print("Error: \(error)")
        return nil
    }

    return UIImage(cgImage: thumbnailImageRef)
}

But keep in mind that this function is synchronous, so it's best not to call it on the main queue.

You can go one of two ways:

DispatchQueue.global(qos: .background).async {
    let image = self.imageFromVideo(url: url, at: 0)

    DispatchQueue.main.async {
        self.imageView.image = image
    }
}

Or use generateCGImagesAsynchronously instead of copyCGImage.
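
For example, a minimal sketch of that asynchronous route might look like this (the single requested time and the handler body are illustrative, not part of the original answer):

let assetIG = AVAssetImageGenerator(asset: AVURLAsset(url: url))
assetIG.appliesPreferredTrackTransform = true

let times = [NSValue(time: CMTime(seconds: 0, preferredTimescale: 60))]
assetIG.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, error in
    // The handler runs on a background queue and is also called for failed requests.
    guard result == .succeeded, let cgImage = cgImage else {
        print("Error: \(String(describing: error))")
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = UIImage(cgImage: cgImage)
    }
}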


@Dmitry I'm using this, but the image I get is brighter than the video. Can anyone help? - Zღk

Here's a Swift 5 alternative that saves you from worrying about which queue you're on:

public func imageFromVideo(url: URL, at time: TimeInterval, completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .background).async {
        let asset = AVURLAsset(url: url)

        let assetIG = AVAssetImageGenerator(asset: asset)
        assetIG.appliesPreferredTrackTransform = true
        assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

        let cmTime = CMTime(seconds: time, preferredTimescale: 60)
        let thumbnailImageRef: CGImage
        do {
            thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
        } catch let error {
            print("Error: \(error)")
            return completion(nil)
        }

        DispatchQueue.main.async {
            completion(UIImage(cgImage: thumbnailImageRef))
        }
    }
}

And here's how to use it:

imageFromVideo(url: videoUrl, at: 0) { image in
   // Do something with the image here
}


Here's an async/await version of @Dmitry's answer, for those who don't like completion handlers:

func imageFromVideo(url: URL, at time: TimeInterval) async throws -> UIImage {
    try await withCheckedThrowingContinuation({ continuation in
        DispatchQueue.global(qos: .background).async {
            let asset = AVURLAsset(url: url)

            let assetIG = AVAssetImageGenerator(asset: asset)
            assetIG.appliesPreferredTrackTransform = true
            assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

            let cmTime = CMTime(seconds: time, preferredTimescale: 60)
            let thumbnailImageRef: CGImage
            do {
                thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
            } catch {
                continuation.resume(throwing: error)
                return
            }
            continuation.resume(returning: UIImage(cgImage: thumbnailImageRef))
        }
    })
}

Usage:

let vidUrl = <#your url#>
do {
    let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
    // do something with image
} catch {
    // handle error
}

And if you're in an async throwing function, you can write it like this:

func someThrowingFunc() async throws {
    let vidUrl = <#your url#>
    let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
    // do something with image
}

Instead of the deprecated copyCGImage(at:actualTime:) method, use its async replacement image(at:); that way you can avoid writing the withCheckedThrowingContinuation block altogether.
Thanks for the heads-up! I'll update the answer to use that approach as soon as I can.
@Suprafen I've added the new method as an iOS 16+ solution (since it's only available on iOS 16+). I think it's worth keeping the old solution too, because iOS 15 isn't yet as old as, say, iOS 10 is today (Xcode doesn't even support that anymore), and some developers may still need it.
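
For reference, a minimal sketch of the iOS 16+ route those comments describe, using AVAssetImageGenerator's async image(at:); the function name and timescale simply mirror the answer above and are only for illustration:

@available(iOS 16.0, *)
func imageFromVideo(url: URL, at time: TimeInterval) async throws -> UIImage {
    let asset = AVURLAsset(url: url)

    let assetIG = AVAssetImageGenerator(asset: asset)
    assetIG.appliesPreferredTrackTransform = true

    // image(at:) is async and throwing, so no continuation or manual queue hopping is needed.
    let cmTime = CMTime(seconds: time, preferredTimescale: 60)
    let (cgImage, _) = try await assetIG.image(at: cmTime)
    return UIImage(cgImage: cgImage)
}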


You can do this easily on iOS. Here's a code snippet showing how to do it in Swift.

let url = Bundle.main.url(forResource: "video_name", withExtension: "mp4")
let videoAsset = AVAsset(url: url!)

let t1 = CMTime(value: 1, timescale: 1)
let t2 = CMTime(value: 4, timescale: 1)
let t3 = CMTime(value: 8, timescale: 1)
let timesArray = [
    NSValue(time: t1),
    NSValue(time: t2),
    NSValue(time: t3)
]

let generator = AVAssetImageGenerator(asset: videoAsset)
generator.requestedTimeToleranceBefore = .zero
generator.requestedTimeToleranceAfter = .zero

generator.generateCGImagesAsynchronously(forTimes: timesArray) { requestedTime, image, actualTime, result, error in
    // The handler is also called for failed or cancelled requests, where image is nil.
    guard result == .succeeded, let cgImage = image else {
        print("Frame generation failed: \(String(describing: error))")
        return
    }
    let img = UIImage(cgImage: cgImage)
    // Do something with img here.
}

You can find sample code here, as well as a Medium article here.

