Converting an incoming NSStream to a view

I have successfully sent a series of NSData packets. The delegate method below receives the stream and appends the bytes to an NSMutableData, self.data. How do I turn this data into a UIView / AVCaptureVideoPreviewLayer (to display the video)? I feel like I'm missing another conversion: AVCaptureSession > NSStream > MCSession > NSStream > ?

- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable:
        {
            if (!self.data) {
                self.data = [NSMutableData data];
            }
            uint8_t buf[1024];
            // read:maxLength: returns NSInteger; -1 signals an error,
            // so don't store the result in an unsigned type
            NSInteger len = [(NSInputStream *)stream read:buf maxLength:sizeof(buf)];
            if (len > 0) {
                [self.data appendBytes:(const void *)buf length:len];
            } else {
                NSLog(@"no buffer!");
            }

            // Code here to take self.data and convert the NSData to UIView/Video
            break;
        }
        default:
            break;
    }
}
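
As a reference for the missing step: what arrives here are raw pixels copied out of a CVPixelBuffer, so on the receiving side they can be wrapped in a CGImage and handed to a layer. The following is only a minimal sketch, not the asker's code. It assumes the capture output was configured for kCVPixelFormatType_32BGRA, and that the receiver already knows width, height, and bytesPerRow (the protocol in the question transmits neither those values nor frame boundaries); self.previewView is a hypothetical UIView.

- (void)displayFrameFromData:(NSData *)frameData
                       width:(size_t)width
                      height:(size_t)height
                 bytesPerRow:(size_t)bytesPerRow
{
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)frameData);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // 32BGRA corresponds to little-endian, alpha-first bitmap info in Core Graphics.
    CGImageRef image = CGImageCreate(width, height,
                                     8,            // bits per component
                                     32,           // bits per pixel
                                     bytesPerRow,
                                     colorSpace,
                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst,
                                     provider, NULL, NO, kCGRenderingIntentDefault);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);

    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewView.layer.contents = (__bridge id)image; // the layer retains it
        CGImageRelease(image);
    });
}

Pushing one uncompressed CGImage per frame like this is only workable as a proof of concept; raw 32BGRA is several megabytes per frame, so a real pipeline would compress the frames before writing them to the stream.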

I send the stream with the following code:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
//    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    NSError *error;
    self.oStream = [self.mySession startStreamWithName:@"videoOut" toPeer:[[self.mySession connectedPeers] objectAtIndex:0] error:&error];
    self.oStream.delegate = self;
    [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                            forMode:NSDefaultRunLoopMode];
    [self.oStream open];

    [self.oStream write:[data bytes] maxLength:[data length]];

    CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
    // also in the 'mediaSpecific' dict of the sampleBuffer

    NSLog(@"frame captured at %.fx%.f", imageSize.width, imageSize.height);
}
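
One thing worth pointing out in the code above: it starts a brand-new MCSession stream for every captured frame. A minimal sketch of the more usual structure, where the stream is opened once (for example, when the peer connects) and every frame just writes to it; the startVideoStreamToPeer: name is hypothetical:

// Hypothetical helper: open the outgoing stream once, then reuse it for every frame.
- (void)startVideoStreamToPeer:(MCPeerID *)peer
{
    NSError *error = nil;
    self.oStream = [self.mySession startStreamWithName:@"videoOut"
                                                toPeer:peer
                                                 error:&error];
    self.oStream.delegate = self;
    [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                            forMode:NSDefaultRunLoopMode];
    [self.oStream open];
}

With that in place, captureOutput:didOutputSampleBuffer:fromConnection: only needs the write: call. Note also that -[NSOutputStream write:maxLength:] returns the number of bytes actually written, which can be less than the frame size, so a robust sender has to loop or buffer the remainder.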

You might want to see whether you can use OpenGL: convert your data to a GL texture and then use GL to display it. There may be a higher-level API for this. Isn't the data in some standard format? - nielsbot
Do you need to lock/unlock the pixel buffer on every frame? I wonder whether that gets expensive time-wise. - nielsbot
I didn't realize I was doing that. Where do you see it in the code? - Eric
The problem with creating a file is that the whole file has to be written before it can be displayed. I want to stream the video over the MCSession. I can't believe I'd need a custom API for this. AVCaptureSession > NSStream > MCSession > NSStream > ? I can't get the stream back into AV. - Eric
Again, you're just passing raw image data... Here's a similar question: http://stackoverflow.com/questions/20150337/multipeer-connectivity-for-video-streaming-between-iphones. It links to a GitHub repo where the author has started a similar project. From a quick look at the code, it doesn't seem to render video; it displays images. Adding the missing OpenGL rendering code isn't a complicated task. - MDB983
2 Answers

I think you need to use AVCamCaptureManager (from Apple's AVCam sample code); see whether the code below works for you.
AVCamCaptureManager *manager = [[AVCamCaptureManager alloc] init];
[self setCaptureManager:manager];

[[self captureManager] setDelegate:self];

if ([[self captureManager] setupSession]) {
    // Create the video preview layer and add it to the UI
    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[[self captureManager] session]];
    UIView *view = self.videoPreviewView; // add a view in the XIB where you want to show the video
    CALayer *viewLayer = [view layer];
    [viewLayer setMasksToBounds:YES];
    CGRect bounds = [view bounds];

    [newCaptureVideoPreviewLayer setFrame:bounds];

    [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];

    [self setCaptureVideoPreviewLayer:newCaptureVideoPreviewLayer];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [[[self captureManager] session] startRunning];
    });
}

And implement the capture manager delegate methods:
- (void)captureManager:(AVCamCaptureManager *)captureManager didFailWithError:(NSError *)error
{
}

- (void)captureManagerRecordingBegan:(AVCamCaptureManager *)captureManager
{
}

- (void)captureManagerRecordingFinished:(AVCamCaptureManager *)captureManager outputURL:(NSURL *)url
{
}

- (void)captureManagerStillImageCaptured:(AVCamCaptureManager *)captureManager
{
}

- (void)captureManagerDeviceConfigurationChanged:(AVCamCaptureManager *)captureManager
{
}
Hope this helps.


The captureManager delegate methods don't give you any way to handle the video. Am I missing something? - Eric
@Eric See if this link helps you: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112-Intro-DontLinkElementID_2 - iphonic

You can create a UIImageView when you handle the event, like this: UIImageView *iv = [[UIImageView alloc] initWithImage:[UIImage imageWithData:self.data]]; You could also allocate it only once and just reinitialize it.
Each time you receive data from the socket, set the UIImageView's image and add the UIImageView to a UIView to display it.
Sorry for my English; I'm not sure whether I understood your question correctly.
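
A minimal sketch of that idea, with one caveat: +[UIImage imageWithData:] only understands encoded formats such as JPEG or PNG, so with the raw pixel bytes sent in the question it returns nil; the sender would have to encode each frame first. self.imageView here is a hypothetical, reused UIImageView already in the view hierarchy.

// Hypothetical: call this once self.data holds one complete encoded (JPEG/PNG) frame.
- (void)showReceivedFrame
{
    UIImage *frame = [UIImage imageWithData:self.data];
    if (frame) {
        self.imageView.image = frame;     // reuse one UIImageView instead of allocating per frame
        self.data = [NSMutableData data]; // reset the buffer for the next frame
    }
}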
