Playing stacked videos


I have multiple imageView subviews that are stacked based on my input data. Basically, each of these subviews is set to either an image or a video layer depending on my input data. My problem is playing the videos. I can play the first video in the stack, but every video after that only plays the sound of the first video. How do I play each video one by one?

The views are navigated through tap events, similar to Snapchat. Please see below:

@interface SceneImageViewController ()

@property (strong, nonatomic) NSURL *videoUrl;
@property (strong, nonatomic) AVPlayer *avPlayer;
@property (strong, nonatomic) AVPlayerLayer *avPlayerLayer;

@end

@implementation SceneImageViewController

- (void)viewDidLoad {

[super viewDidLoad];

self.mySubviews = [[NSMutableArray alloc] init];
self.videoCounterTags = [[NSMutableArray alloc] init];

int c = (int)[self.scenes count];
c--;
NSLog(@"int c = %d", c);
self.myCounter = [NSNumber numberWithInt:c];


for (int i=0; i<=c; i++) {

    //create imageView
    UIImageView *imageView =[[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
    [imageView setUserInteractionEnabled:YES]; // <--- This is very important
    imageView.tag = i;                        // <--- Add tag to track this subview in the view stack
    [self.view addSubview:imageView];
    NSLog(@"added image view %d", i);


    //get scene object
    PFObject *sceneObject = self.scenes[i];


    //get the PFFile and filetype
    PFFile *file = [sceneObject objectForKey:@"file"];
    NSString *fileType = [sceneObject objectForKey:@"fileType"];



    //check the filetype
    if ([fileType  isEqual: @"image"])
    {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        //get image
        NSURL *imageFileUrl = [[NSURL alloc] initWithString:file.url];
        NSData *imageData = [NSData dataWithContentsOfURL:imageFileUrl];
            dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = [UIImage imageWithData:imageData];
            });
        });

    }

    //its a video
    else
    {
        // the video player
        NSURL *fileUrl = [NSURL URLWithString:file.url];

        self.avPlayer = [AVPlayer playerWithURL:fileUrl];
        self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;

        self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
        //self.avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playerItemDidReachEnd:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:[self.avPlayer currentItem]];

        CGRect screenRect = [[UIScreen mainScreen] bounds];

        self.avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
        [imageView.layer addSublayer:self.avPlayerLayer];

        NSNumber *tag = [NSNumber numberWithInt:i+1];

        NSLog(@"tag = %@", tag);

        [self.videoCounterTags addObject:tag];

        //[self.avPlayer play];
    }



}



UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];

[self.view bringSubviewToFront:self.screen];

[self.screen addGestureRecognizer:tapGesture];


}


 - (void)viewTapped:(UIGestureRecognizer *)gesture{

NSLog(@"touch!");

[self.avPlayer pause];

int i = [self.myCounter intValue];
NSLog(@"counter = %d", i);



for(UIImageView *subview in [self.view subviews]) {

    if(subview.tag== i) {

        [subview removeFromSuperview];
    }
}

if ([self.videoCounterTags containsObject:self.myCounter]) {
    NSLog(@"play video!!!");
    [self.avPlayer play];
}

if (i == 0) {
    [self.avPlayer pause];
    [self.navigationController popViewControllerAnimated:NO];
}


i--;
self.myCounter = [NSNumber numberWithInt:i];


NSLog(@"counter after = %d", i);





}
2 Answers

Look at how you are setting the myCounter variable. It is set once and never changes until a view is tapped, at which point it is set to the scene count minus 1.
Also, look at how you are setting the _avPlayer pointer variable. It is set over and over again; it seems that in the loop you want to be storing references instead of just updating the same pointer to the latest value in the collection of scenes.
Also, from Apple's documentation:
"You can create arbitrary numbers of player layers with the same AVPlayer object. Only the most recently created player layer will actually display the video content on-screen."
So, since you are creating all of these AVPlayer layers with the same AVPlayer object, you will likely never see more than one actual video layer work.
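To illustrate the point, here is a minimal sketch of keeping one reference per video inside the question's for loop instead of reusing a single property; the videoPlayers array is a hypothetical NSMutableArray property, not something from the original code:

    // reusing one property means each pass of the loop overwrites the previous player,
    // and only the most recently created AVPlayerLayer will render video:
    // self.avPlayer = [AVPlayer playerWithURL:fileUrl];

    // hypothetical alternative: keep a separate reference per video
    // (assumes an NSMutableArray *videoPlayers property created in viewDidLoad)
    AVPlayer *player = [AVPlayer playerWithURL:fileUrl];
    [self.videoPlayers addObject:player];   // store the reference so it is not lost

    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = imageView.bounds;
    [imageView.layer addSublayer:layer];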

If you can get to a solution with code, then I'll give you that bounty. - ian

What Brooks Hanes said is correct: you keep overwriting the avPlayer. Here is what I suggest you do:
  1. Add the tap gesture to the imageView instead of the screen (or for a cleaner approach use UIButton instead):

    UIImageView *imageView =[[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
    [imageView setUserInteractionEnabled:YES]; // <--- This is very important
    imageView.tag = i;                        // <--- Add tag to track this subview in the view stack
    [self.view addSubview:imageView];
    NSLog(@"added image view %d", i);
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
    [imageView addGestureRecognizer:tapGesture];
    
  2. In your viewTapped: method you can get the tag of the tapped image via gesture.view.tag instead of using myCounter.
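A minimal sketch, assuming the gesture's view is the tapped imageView (the NSInteger tag has to be boxed with @(...) before looking it up among the stored NSNumber tags):

    NSInteger tappedTag = gesture.view.tag;   // tag of the imageView that was tapped
    if ([self.videoCounterTags containsObject:@(tappedTag)]) {
        // this subview has a video associated with it
    }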
  3. To get the videos working, you could create a new AVPlayer for each video, but that could get quite expensive memory-wise. A better approach is to use AVPlayerItem and switch out the AVPlayer's AVPlayerItem when changing the video.
So in the for loop do something like this, where self.videoFiles is an NSMutableDictionary property:
    // the video player
    NSNumber *tag = @(imageView.tag);   // same tag as the imageView, so gesture.view.tag can be used for lookup later
    NSURL *fileUrl = [NSURL URLWithString:file.url];
    // save the video file url paired with the tag of the imageView it belongs to
    [self.videoFiles setObject:fileUrl forKey:tag];

    // you only need to initialize the player once
    if (self.avPlayer == nil) {
        AVAsset *asset = [AVAsset assetWithURL:fileUrl];
        AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
        self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
        self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playerItemDidReachEnd:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:[self.avPlayer currentItem]];
    }

    // you don't need to keep the layer as a property
    // (unless you need it for some reason)
    AVPlayerLayer *avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
    avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    CGRect screenRect = [[UIScreen mainScreen] bounds];
    avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
    [imageView.layer addSublayer:avPlayerLayer];

    NSLog(@"tag = %@", tag);
    [self.videoCounterTags addObject:tag];

Now in your viewTapped:
    if ([self.videoCounterTags containsObject:@(gesture.view.tag)]) {
        NSLog(@"play video!!!");
        AVAsset *asset = [AVAsset assetWithURL:[self.videoFiles objectForKey:@(gesture.view.tag)]];
        AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
        [self.avPlayer replaceCurrentItemWithPlayerItem:item];
        [self.avPlayer play];
    }

Or just use self.videoFiles directly, and then you don't need self.videoCounterTags at all:

    NSURL *fileURL = [self.videoFiles objectForKey:@(gesture.view.tag)];
    if (fileURL != nil) {
        NSLog(@"play video!!!");
        AVAsset *asset = [AVAsset assetWithURL:fileURL];
        AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
        [self.avPlayer replaceCurrentItemWithPlayerItem:item];
        [self.avPlayer play];
    }

That's the gist of it.
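One assumption in the snippets above is that self.videoFiles actually exists before the loop runs. A minimal sketch of how the property might be declared and initialized (the exact placement in viewDidLoad is an assumption; the property names follow the answer and the question's existing code):

    @property (strong, nonatomic) NSMutableDictionary *videoFiles;   // tag -> video file NSURL

    // in viewDidLoad, before the for loop:
    self.videoFiles = [[NSMutableDictionary alloc] init];
    self.videoCounterTags = [[NSMutableArray alloc] init];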

Ok, I'll give this a try. The only thing that makes me hesitant is whether the tap gesture can only be added once, like last time when I tried adding it to all of the image views. - ian
You did it! Awesome! I never even thought of using AVAsset. Is there still a way to give you the bounty even though it has expired? - ian
