How do I write an H.264 stream to a video file with AVAssetWriter?

4

I want to turn an H.264 stream coming from a server into a video file, but when I call the asset writer's finishWriting, Xcode reports the following error:

Video /var/mobile/Applications/DE4196F1-BB77-4B7D-8C20-7A5D6223C64D/Documents/test.mov cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12847 "This movie format is not supported." UserInfo=0x5334830 {NSLocalizedDescription=This movie format is not supported.}

Here is my code. data is a single H.264 frame, which may be either an I-frame or a P-frame.
- (void)_encodeVideoFrame2:(NSData *)data time:(double)tm
{
    CMBlockBufferRef videoBlockBuffer = NULL;
    CMFormatDescriptionRef videoFormat = NULL;
    CMSampleBufferRef videoSampleBuffer = NULL;
    CMItemCount numberOfSampleTimeEntries = 1;
    CMItemCount numberOfSamples = 1;
    CMVideoFormatDescriptionCreate(kCFAllocatorDefault, kCMVideoCodecType_H264, 320, 240, NULL, &videoFormat);
    OSStatus result;
    result = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, data.length, kCFAllocatorDefault, NULL, 0, data.length, kCMBlockBufferAssureMemoryNowFlag, &videoBlockBuffer);
    result = CMBlockBufferReplaceDataBytes(data.bytes, videoBlockBuffer, 0, data.length);
    CMSampleTimingInfo videoSampleTimingInformation = {CMTimeMake(tm * 600, 600)};
    size_t sampleSizeArray[1];
    sampleSizeArray[0] = data.length;
    result = CMSampleBufferCreate(kCFAllocatorDefault, videoBlockBuffer, TRUE, NULL, NULL, videoFormat, numberOfSamples, numberOfSampleTimeEntries, &videoSampleTimingInformation, 1, sampleSizeArray, &videoSampleBuffer);
    result = CMSampleBufferMakeDataReady(videoSampleBuffer);
    [assetWriterInput appendSampleBuffer:videoSampleBuffer];
}

Maybe the parameters to CMSampleBufferCreate are wrong? Thanks.
2 Answers

2
In order to mux already-compressed H.264 buffers with AVAssetWriter, you need to specify a nil outputSettings, which is not what the documentation tells you.
So instead of doing this:
videoInput = [[AVAssetWriterInput alloc] 
    initWithMediaType:AVMediaTypeVideo outputSettings:@{
        AVVideoCodecKey: AVVideoCodecH264,
    }
    sourceFormatHint:myVideoFormat
];

you should be doing this:
videoInput = [[AVAssetWriterInput alloc] 
    initWithMediaType:AVMediaTypeVideo
    outputSettings:nil
    sourceFormatHint:myVideoFormat
];

This gives you pass-through of video or audio data, without AVAssetWriterInput trying to encode/transcode anything for you.
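For the pass-through input to work, the myVideoFormat passed as sourceFormatHint has to describe the compressed stream, which usually means building it from the stream's SPS and PPS rather than creating it with NULL extensions as in the question. A minimal sketch, assuming iOS 7 or later (where CMVideoFormatDescriptionCreateFromH264ParameterSets is available) and that the SPS and PPS NAL units have already been parsed out of the incoming stream; the spsData and ppsData buffers and the helper name are hypothetical:

#import <CoreMedia/CoreMedia.h>

// Sketch only: spsData/ppsData are assumed to hold the raw SPS and PPS NAL
// units (without start codes) extracted from the H.264 stream.
static CMFormatDescriptionRef CreateFormatHint(NSData *spsData, NSData *ppsData)
{
    const uint8_t *parameterSets[2] = { (const uint8_t *)spsData.bytes,
                                        (const uint8_t *)ppsData.bytes };
    const size_t parameterSetSizes[2] = { spsData.length, ppsData.length };

    CMVideoFormatDescriptionRef format = NULL;
    OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
        kCFAllocatorDefault,
        2,                   // number of parameter sets (SPS + PPS)
        parameterSets,
        parameterSetSizes,
        4,                   // NAL unit length-field size (AVCC layout)
        &format);
    if (status != noErr) {
        NSLog(@"Could not create format description: %d", (int)status);
        return NULL;
    }
    return format;           // pass as sourceFormatHint to the writer input
}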

-2

Try this code:

- (IBAction)createVideo:(id)sender {

    ////////////// setup OR function definition, if we move this to a separate function ////////////
    // this should be moved to its own function, which could take an imageArray, videoOutputPath, etc...
    // - (void)exportImages:(NSMutableArray *)imageArray
    //        asVideoToPath:(NSString *)videoOutputPath
    //        withFrameSize:(CGSize)imageSize
    //      framesPerSecond:(NSUInteger)fps {

    NSError *error = nil;

    // set up the file manager and the videoOutputPath, deleting "test_output.mp4" if it already exists...
    //NSString *videoOutputPath = @"/Users/someuser/Desktop/test_output.mp4";
    NSFileManager *fileMgr = [NSFileManager defaultManager];
    NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *videoOutputPath = [documentsDirectory stringByAppendingPathComponent:@"test_output.mp4"];
    //NSLog(@"-->videoOutputPath= %@", videoOutputPath);
    // delete any existing mp4...
    if ([fileMgr removeItemAtPath:videoOutputPath error:&error] != YES)
        NSLog(@"Unable to delete file: %@", [error localizedDescription]);

    CGSize imageSize = CGSizeMake(400, 200);
    NSUInteger fps = 30;

    //NSMutableArray *imageArray;
    //imageArray = [[NSMutableArray alloc] initWithObjects:@"download.jpeg", @"download2.jpeg", nil];
    NSMutableArray *imageArray;
    NSArray *imagePaths = [[NSBundle mainBundle] pathsForResourcesOfType:@"jpg" inDirectory:nil];
    imageArray = [[NSMutableArray alloc] initWithCapacity:imagePaths.count];
    NSLog(@"-->imageArray.count= %i", imageArray.count);
    for (NSString *path in imagePaths) {
        [imageArray addObject:[UIImage imageWithContentsOfFile:path]];
        //NSLog(@"-->image path= %@", path);
    }

    ////////////// end setup ///////////////////////////////////

    NSLog(@"Start building video from defined frames.");

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:videoOutputPath]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInt:imageSize.width], AVVideoWidthKey,
        [NSNumber numberWithInt:imageSize.height], AVVideoHeightKey,
        nil];

    AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                              outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                   sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];

    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer = NULL;

    //convert uiimage to CGImage.
    int frameCount = 0;
    double numberOfSecondsPerFrame = 6;
    double frameDuration = fps * numberOfSecondsPerFrame;

    //for(VideoFrame * frm in imageArray)
    NSLog(@"****************************");
    for (UIImage *img in imageArray) {
        //UIImage * img = frm._imageFrame;
        buffer = [self pixelBufferFromCGImage:[img CGImage]];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData) {
                //print out status:
                NSLog(@"Processing video frame (%d,%d)", frameCount, [imageArray count]);
                CMTime frameTime = CMTimeMake(frameCount * frameDuration, (int32_t)fps);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            }
            else {
                // the input is not ready yet; wait briefly and retry
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        frameCount++;
    }

    //Finish the session:
    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];
    NSLog(@"Write Ended");
}

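The code above calls a pixelBufferFromCGImage: helper that the answer does not show. A minimal sketch of such a helper, assuming a 32-bit ARGB pixel buffer sized to the source image (the caller is responsible for releasing the returned buffer with CVPixelBufferRelease):

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    // Create a pixel buffer the size of the source image.
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          CGImageGetWidth(image),
                                          CGImageGetHeight(image),
                                          kCVPixelFormatType_32ARGB,
                                          NULL,
                                          &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        return NULL;
    }

    // Draw the CGImage into the pixel buffer's backing memory.
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 CGImageGetWidth(image),
                                                 CGImageGetHeight(image),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                       image);
    CGContextRelease(context);
    CGColorSpaceRelease(rgbColorSpace);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer; // caller releases with CVPixelBufferRelease()
}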
This code encodes images, not H.264 sample buffers; the OP is asking for the latter. - nevyn
