How to use AVCapturePhotoOutput


I have been working on a custom camera and recently upgraded to the Xcode 8 beta along with Swift 3. I originally had this:

var stillImageOutput: AVCaptureStillImageOutput?

However, I now get the warning:

'AVCaptureStillImageOutput' was deprecated in iOS 10.0: Use AVCapturePhotoOutput instead

Since this is fairly new, I have not seen much information on it. Here is my current code:

var captureSession: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

func clickPicture() {

    if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {

        videoConnection.videoOrientation = .portrait
        stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

            if sampleBuffer != nil {

                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProvider(data: imageData!)
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)

                let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)

            }

        })

    }

}

I have tried looking at AVCapturePhotoCaptureDelegate, but I am not sure how to use it. Does anyone know how to use this? Thanks.


You need to watch the WWDC 2016 session 511 video. - LC 웃
Okay! I'll watch the video and post an answer if I can figure it out. Thanks! - Pranav Wadhwa
Looking at the documentation might help. - rickster
7 Answers


Updated to Swift 4. Hi, it's really easy to use AVCapturePhotoOutput.

You need an AVCapturePhotoCaptureDelegate, which returns a CMSampleBuffer.

You can also get a preview image if you tell the AVCapturePhotoSettings the preview format.

    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {

          let settings = AVCapturePhotoSettings()
          let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
          let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                               kCVPixelBufferWidthKey as String: 160,
                               kCVPixelBufferHeightKey as String: 160]
          settings.previewPhotoFormat = previewFormat
          self.cameraOutput.capturePhoto(with: settings, delegate: self)

        }

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {                        
            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
              print("image: \(UIImage(data: dataImage)?.size)") // Your Image
            }   
        }
    }

For more information, visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: You must add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. Something like: session.addOutput(output), and then: output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations.
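As a minimal Swift 4 / iOS 11 sketch of that note (the class and method names here are illustrative, not from the original answer), the output is attached to a running session before any capture is requested:

```swift
import AVFoundation

// Hypothetical minimal wrapper around AVCapturePhotoOutput.
final class PhotoCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let output = AVCapturePhotoOutput()

    func setUpSession() {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input),
              session.canAddOutput(output) else { return }
        session.addInput(input)
        // Attaching the output BEFORE capturing is what avoids the
        // "No active and enabled video connection" error.
        session.addOutput(output)
        session.startRunning()
    }

    func takePhoto() {
        output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // iOS 11+ delegate callback.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Handle photo.fileDataRepresentation() here.
    }
}
```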


Getting the error: "[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] No active and enabled video connection". Could you please provide a full iOS 10 / Swift 3 example? - Tuomas Laatikainen
@TuomasLaatikainen, did you ever figure out your issue? I'm running into the same problem. - SRMR
@mobibob, did you find an answer on how to solve this? I have the same problem. - SRMR
What if I want to capture the image in JPG format? - Nikhil Manapure
@TuomasLaatikainen you must add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture, so something like: session.addOutput(output), and then: output.capturePhoto(with: settings, delegate: self) - BigHeadCreations


Here is my full implementation:

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

var captureSesssion : AVCaptureSession!
var cameraOutput : AVCapturePhotoOutput!
var previewLayer : AVCaptureVideoPreviewLayer!

@IBOutlet weak var capturedImage: UIImageView!
@IBOutlet weak var previewView: UIView!

override func viewDidLoad() {
    super.viewDidLoad()
    captureSesssion = AVCaptureSession()
    captureSesssion.sessionPreset = AVCaptureSessionPresetPhoto
    cameraOutput = AVCapturePhotoOutput()

    let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

    if let input = try? AVCaptureDeviceInput(device: device) {
        if (captureSesssion.canAddInput(input)) {
            captureSesssion.addInput(input)
            if (captureSesssion.canAddOutput(cameraOutput)) {
                captureSesssion.addOutput(cameraOutput)
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSesssion)
                previewLayer.frame = previewView.bounds
                previewView.layer.addSublayer(previewLayer)
                captureSesssion.startRunning()
            }
        } else {
            print("issue here : captureSesssion.canAddInput")
        }
    } else {
        print("some problem here")
    }
}

// Take picture button
@IBAction func didPressTakePhoto(_ sender: UIButton) {
    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [
         kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
         kCVPixelBufferWidthKey as String: 160,
         kCVPixelBufferHeightKey as String: 160
    ]
    settings.previewPhotoFormat = previewFormat
    cameraOutput.capturePhoto(with: settings, delegate: self)
}

// callBack from take picture
func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let error = error {
        print("error occurred: \(error.localizedDescription)")
    }

    if  let sampleBuffer = photoSampleBuffer,
        let previewBuffer = previewPhotoSampleBuffer,
        let dataImage =  AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer:  sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
        print(UIImage(data: dataImage)?.size as Any)

        let dataProvider = CGDataProvider(data: dataImage as CFData)
        let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
        let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)

        self.capturedImage.image = image
    } else {
        print("some error here")
    }
}

// Use this method wherever you need to know the camera permission state
func askPermission() {
    print("here")
    let cameraPermissionStatus =  AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo)

    switch cameraPermissionStatus {
    case .authorized:
        print("Already Authorized")
    case .denied:
        print("denied")

        let alert = UIAlertController(title: "Sorry :(" , message: "But  could you please grant permission for camera within device settings",  preferredStyle: .alert)
        let action = UIAlertAction(title: "Ok", style: .cancel,  handler: nil)
        alert.addAction(action)
        present(alert, animated: true, completion: nil)

    case .restricted:
        print("restricted")
    default:
        AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo, completionHandler: { [weak self] (granted: Bool) -> Void in
            if granted {
                // User granted
                print("User granted")
                DispatchQueue.main.async {
                    // Do whatever you need on the main thread
                }
            } else {
                // User rejected
                print("User Rejected")
                DispatchQueue.main.async {
                    let alert = UIAlertController(title: "WHY?", message: "Camera is the main feature of our application", preferredStyle: .alert)
                    let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
                    alert.addAction(action)
                    self?.present(alert, animated: true, completion: nil)
                }
            }
        })
    }
}
}

How did you set the flashMode with this? - coolly
Works on iOS 10.0.2. To turn the flash on, use settings.flashMode = .on - Rajamohan S
Why UIImageOrientation.right? On an iPad it is simply the wrong orientation. - Makalele
Works like a charm :) - productioncoder
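Building on the flashMode comments above, a small sketch: flash is requested per capture through AVCapturePhotoSettings (this assumes a `cameraOutput` set up as in the answer, on a device that actually has a flash):

```swift
// Flash is configured per capture on the settings object,
// not on the output or the device directly.
let settings = AVCapturePhotoSettings()
if cameraOutput.supportedFlashModes.contains(.on) {
    settings.flashMode = .on
}
cameraOutput.capturePhoto(with: settings, delegate: self)
```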


In iOS 11, "photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {}" is deprecated.

Use the following method instead:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    let imageData = photo.fileDataRepresentation()
    if let data = imageData, let img = UIImage(data: data) {
        print(img)
    }
}


I took @Aleksey Timoshchenko's excellent answer and updated it to Swift 4.x.

Note that, for my use case, I allow the user to take multiple photos, which is why I save them in the images array.

Note that you need to hook up the takePhoto @IBAction via your storyboard or in code. In my case, I use a storyboard.

As of iOS 11, the AVCapturePhotoOutput.jpegPhotoDataRepresentation that is used in @Aleksey Timoshchenko's answer is deprecated.

Swift 4.x

class CameraVC: UIViewController {

    @IBOutlet weak var cameraView: UIView!

    var images = [UIImage]()

    var captureSession: AVCaptureSession!
    var cameraOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startCamera()
    }

    func startCamera() {
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSession.Preset.photo
        cameraOutput = AVCapturePhotoOutput()

        if let device = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: device) {
            if (captureSession.canAddInput(input)) {
                captureSession.addInput(input)
                if (captureSession.canAddOutput(cameraOutput)) {
                    captureSession.addOutput(cameraOutput)
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.frame = cameraView.bounds
                    cameraView.layer.addSublayer(previewLayer)
                    captureSession.startRunning()
                }
            } else {
                print("issue here : captureSesssion.canAddInput")
            }
        } else {
            print("some problem here")
        }
    }

    @IBAction func takePhoto(_ sender: UITapGestureRecognizer) {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
            kCVPixelBufferWidthKey as String: 160,
            kCVPixelBufferHeightKey as String: 160
        ]
        settings.previewPhotoFormat = previewFormat
        cameraOutput.capturePhoto(with: settings, delegate: self)   
    }
}

extension CameraVC : AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

        if let error = error {
            print("error occured : \(error.localizedDescription)")
        }

        if let dataImage = photo.fileDataRepresentation() {
            print(UIImage(data: dataImage)?.size as Any)

            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImage.Orientation.right)

            /**
               save image in array / do whatever you want to do with the image here
            */
            self.images.append(image)

        } else {
            print("some error here")
        }
    }
}

This is the best answer. It focuses on the core aspects to make it work!!! - eharo2
Great answer. But note that fileDataRepresentation() requires iOS 11. - Fraser
Thanks, this solved it for me. Still works even in 2022. - Confuseious


The capture delegate function has been changed to photoOutput. Here is the updated function for Swift 4.

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {            
        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print("image: \(String(describing: UIImage(data: dataImage)?.size))") // Your Image
        }
}


Exactly the same as the answer given by @productioncoder, but I had to move startCamera() to under viewDidLoad() instead of viewDidAppear().

