I am using AVCaptureSession to capture video and get live frames from the iPhone camera, but how do I multiplex the frames and the sound and send them to a server, and how can this be done with ffmpeg? If there are any tutorials or examples on ffmpeg, please share them here.
Here is some code to get you started:
#import <AVFoundation/AVFoundation.h>

// make input device
NSError *deviceError;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];

// make output device
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
[outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

// initialize capture session
AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];

// make preview layer and add it so that the camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];

// go!
[captureSession startRunning];
The delegate of the output device (here, self) then has to implement the callback:
-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer( sampleBuffer );
    CGSize imageSize = CVImageBufferGetEncodedSize( imageBuffer );
    // also in the 'mediaSpecific' dict of the sampleBuffer
    NSLog( @"frame captured at %.fx%.f", imageSize.width, imageSize.height );
}
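The snippet above only captures video. Since you also want to mux sound into the stream, you can attach a microphone input and an AVCaptureAudioDataOutput to the same session. Here is a minimal sketch, assuming the same delegate object and a hypothetical background queue name (none of this is in the original code):

// Assumed addition: capture audio from the default microphone on the same session.
AVCaptureDevice *micDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *micInput = [AVCaptureDeviceInput deviceInputWithDevice:micDevice error:&deviceError];
if (micInput && [captureSession canAddInput:micInput])
    [captureSession addInput:micInput];

AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
// Audio buffers arrive in the same captureOutput:didOutputSampleBuffer:fromConnection:
// callback; compare the captureOutput argument (or the connection's media type)
// to tell audio buffers apart from video buffers.
[audioOutput setSampleBufferDelegate:self queue:dispatch_queue_create("audio.capture.queue", NULL)];
if ([captureSession canAddOutput:audioOutput])
    [captureSession addOutput:audioOutput];

In the callback you would then hand the video buffers to your video encoder and the audio buffers to the audio track of whatever muxer you end up using (ffmpeg or AVAssetWriter).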
Try capturing the video with the AVFoundation framework and streaming it to the server over HTTP.
Also have a look at the Stack Overflow post below.
You probably already know...
1) How to get compressed frames and audio from iPhone's camera?
2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?
Yes, it is. But you will have to use libx264, which gets you into GPL territory, and that is not exactly compatible with the App Store.
For efficiency reasons, I would suggest AVFoundation and AVAssetWriter instead.
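To make that last suggestion concrete, here is a minimal sketch of an AVAssetWriter setup that hardware-encodes the captured buffers to H.264 video and AAC audio. The output URL, dimensions, channel count and bitrate are placeholders, and the writer produces an MPEG-4 file rather than a network stream, so segmenting and uploading it to the server is still up to you:

// Assumed sketch: compress captured sample buffers to H.264/AAC with AVAssetWriter.
NSError *writerError;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL  // placeholder file URL
                                                 fileType:AVFileTypeMPEG4
                                                    error:&writerError];

NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @480,
                                 AVVideoHeightKey : @320 };
AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES;  // we are feeding live capture data
[writer addInput:videoInput];

NSDictionary *audioSettings = @{ AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
                                 AVNumberOfChannelsKey : @1,
                                 AVSampleRateKey       : @44100.0,
                                 AVEncoderBitRateKey   : @64000 };
AVAssetWriterInput *audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                     outputSettings:audioSettings];
audioInput.expectsMediaDataInRealTime = YES;
[writer addInput:audioInput];

[writer startWriting];
// Start the session once, with the timestamp of the first captured buffer:
// [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(firstSampleBuffer)];

// Then, inside captureOutput:didOutputSampleBuffer:fromConnection:, append each buffer:
// if (videoInput.readyForMoreMediaData) [videoInput appendSampleBuffer:sampleBuffer];
// (and the same for audioInput when the buffer comes from the audio data output).

Because AVAssetWriter uses the hardware encoder, this avoids both the CPU cost and the GPL/libx264 licensing problem of encoding with ffmpeg on the device.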