I'd like to grab a sample frame (as a JPEG) from a video file (MOV) using Java. Is there an easy way to do this? When I search on Google, I only find ways to turn a series of JPGs into a MOV. I don't know, maybe I just can't find the right keywords.
I know the original question has been answered, but I'm posting this answer in case anyone else gets stuck the way I did.
Since yesterday I have tried everything, and I mean really everything, to get this working. All of the available Java libraries are either outdated, no longer maintained, or missing any useful documentation (seriously?!).
I tried JMF (ancient and useless), JCodec (no documentation), JJMpeg (looks promising, but hard to use because the Java classes are undocumented), the automatic Java builds of OpenCV, and a few other libraries I can't remember.
In the end I decided to look at the classes of JavaCV (GitHub link). It provides FFmpeg bindings with reasonably detailed documentation.
<dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>javacv</artifactId>
    <version>1.0</version>
</dependency>
It turns out that extracting video frames from a video file and converting them to a BufferedImage, or even a JPEG file, is very easy. The class FFmpegFrameGrabber makes it simple to grab a single frame and convert it to a BufferedImage. Here is a code sample:
FFmpegFrameGrabber g = new FFmpegFrameGrabber("textures/video/anim.mp4");
g.start();

Java2DFrameConverter converter = new Java2DFrameConverter();
for (int i = 0; i < 50; i++) {
    // It is important to use grabImage() to get a frame that can be
    // turned into a BufferedImage
    Frame frame = g.grabImage();
    BufferedImage bi = converter.convert(frame);
    ImageIO.write(bi, "png", new File("frame-dump/video-frame-" + System.currentTimeMillis() + ".png"));
}
g.stop();
This code essentially extracts the first 50 frames of the video and saves them as PNG files. The advantage is that the internal seeking functionality works on actual frames rather than only on keyframes (which was the problem I ran into with JCodec).
You can check the JavaCV homepage for the other classes available for capturing frames from webcams and similar devices. I hope this helps :-)
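As a follow-up, with the JavaCV 1.x API the grabber signals end-of-stream by returning null, so you can also read every frame without knowing the frame count in advance. This is a minimal sketch, assuming the same 1.0 artifact as above; the input path is a placeholder and the frameName helper is just for zero-padded output names:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.Java2DFrameConverter;

public class DumpAllFrames {
    // Zero-padded file names keep the dumped frames sorted correctly on disk.
    static String frameName(int index) {
        return String.format("frame-%05d.png", index);
    }

    public static void main(String[] args) throws Exception {
        // "anim.mp4" is a placeholder path; substitute your own video file.
        FFmpegFrameGrabber g = new FFmpegFrameGrabber("anim.mp4");
        g.start();
        Java2DFrameConverter converter = new Java2DFrameConverter();
        int i = 0;
        Frame frame;
        // grabImage() returns null once the end of the video is reached.
        // Store the result in a variable before testing it; calling
        // grabImage() a second time would silently skip a frame.
        while ((frame = g.grabImage()) != null) {
            BufferedImage bi = converter.convert(frame);
            ImageIO.write(bi, "png", new File(frameName(i++)));
        }
        g.stop();
    }
}
```

The same loop shape works with grabKeyFrame() if you only want a few representative frames instead of every frame.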
Comments on this answer:

- grab() now returns an object of type Frame, which has no getBufferedImage() method. There is also a known bug here. If you need FFmpegFrameGrabber, use version 0.10 for now. - Aprel
- There is a FrameConverter class that allows converting from a Frame to a BufferedImage or other image formats. - RealSkeptic
- Create an object with new org.bytedeco.javacv.Java2DFrameConverter and call its getBufferedImage method: Java2DFrameConverter c = new Java2DFrameConverter(); c.convert(g.grab()); - 0__
- I also found two things particularly useful: 1. If you prefer to get a few representative frames instead of every frame, you can use g.grabKeyFrame() instead of g.grab(). 2. @oberger Once you have reached the end of the video, grab() or grabKeyFrame() will return null. Just use an if statement and break the loop when null is returned. (Be sure to store the result of grab() in a variable, otherwise you will skip a frame.) - Alexander Jank

I have modified the code from this link so that it saves only the first frame of the video.
import javax.imageio.ImageIO;
import java.io.File;
import java.awt.image.BufferedImage;
import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.xuggler.Global;
/**
* @author aclarke
* @author trebor
*/
public class DecodeAndCaptureFrames extends MediaListenerAdapter
{
private int mVideoStreamIndex = -1;
private boolean gotFirst = false;
private String saveFile;
private Exception e;
/** Construct a DecodeAndCaptureFrames which reads and captures
* frames from a video file.
*
* @param videoFile the name of the media file to read
* @param saveFile the file the first frame is written to
*/
public DecodeAndCaptureFrames(String videoFile, String saveFile)throws Exception
{
// create a media reader for processing video
this.saveFile = saveFile;
this.e = null;
IMediaReader reader = ToolFactory.makeReader(videoFile);
// stipulate that we want BufferedImages created in BGR 24bit color space
reader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);
// note that DecodeAndCaptureFrames is derived from
// MediaReader.ListenerAdapter and thus may be added as a listener
// to the MediaReader. DecodeAndCaptureFrames implements
// onVideoPicture().
reader.addListener(this);
// read out the contents of the media file, note that nothing else
// happens here. action happens in the onVideoPicture() method
// which is called when complete video pictures are extracted from
// the media source
while (reader.readPacket() == null && !gotFirst);
if (e != null)
throw e;
}
/**
* Called after a video frame has been decoded from a media stream.
* Optionally a BufferedImage version of the frame may be passed
* if the calling {@link IMediaReader} instance was configured to
* create BufferedImages.
*
* This method blocks, so return quickly.
*/
public void onVideoPicture(IVideoPictureEvent event)
{
try
{
// if the stream index does not match the selected stream index,
// then have a closer look
if (event.getStreamIndex() != mVideoStreamIndex)
{
// if the selected video stream id is not yet set, go ahead an
// select this lucky video stream
if (-1 == mVideoStreamIndex)
mVideoStreamIndex = event.getStreamIndex();
// otherwise return, no need to show frames from this video stream
else
return;
}
ImageIO.write(event.getImage(), "jpg", new File(saveFile));
gotFirst = true;
}
catch (Exception e)
{
this.e = e;
}
}
}
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.FrameGrabber.Exception;
import org.bytedeco.javacv.Java2DFrameConverter;

public class Read {
    public static void main(String[] args) throws IOException, Exception
    {
        FFmpegFrameGrabber frameGrabber = new FFmpegFrameGrabber("C:/Users/Digilog/Downloads/Test.mp4");
        frameGrabber.start();
        try {
            // In recent JavaCV versions grab()/grabImage() return a Frame,
            // which must be converted to a BufferedImage before saving.
            Frame frame = frameGrabber.grabImage();
            BufferedImage bi = new Java2DFrameConverter().convert(frame);
            ImageIO.write(bi, "png", new File("D:/Img.png"));
            frameGrabber.stop();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Here's how with BoofCV:
String fileName = UtilIO.pathExample("tracking/chipmunk.mjpeg");
MediaManager media = DefaultMediaManager.INSTANCE;
ImageType<GrayF32> imageType = ImageType.single(GrayF32.class);

SimpleImageSequence<GrayF32> video = media.openVideo(fileName, imageType);

while (video.hasNext()) {
    GrayF32 nextFrame = video.next();
    // Now do something with it...
}
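To actually save those frames as image files (the original goal), SimpleImageSequence also exposes a getGuiImage() accessor that returns the current frame as a BufferedImage, at least in recent BoofCV versions; treat that method name and the paths below as assumptions. A hedged sketch (outName is just an illustrative naming helper):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

import boofcv.io.MediaManager;
import boofcv.io.image.SimpleImageSequence;
import boofcv.io.wrapper.DefaultMediaManager;
import boofcv.struct.image.GrayF32;
import boofcv.struct.image.ImageType;

public class BoofcvFrameDump {
    // Simple sequential output names: frame-0.png, frame-1.png, ...
    static String outName(int index) {
        return "frame-" + index + ".png";
    }

    public static void main(String[] args) throws Exception {
        MediaManager media = DefaultMediaManager.INSTANCE;
        // Placeholder path; substitute your own video file.
        SimpleImageSequence<GrayF32> video =
                media.openVideo("chipmunk.mjpeg", ImageType.single(GrayF32.class));

        int index = 0;
        while (video.hasNext()) {
            video.next(); // decode the next frame
            // getGuiImage() returns the decoded frame as a BufferedImage
            // (assumption: this accessor exists in your BoofCV version)
            BufferedImage gui = video.getGuiImage();
            ImageIO.write(gui, "png", new File(outName(index++)));
        }
    }
}
```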
public class MediaFileExample implements Runnable{
private MarvinVideoInterface videoAdapter;
private MarvinImage videoFrame;
public MediaFileExample(){
try{
// Create the VideoAdapter used to load the video file
videoAdapter = new MarvinJavaCVAdapter();
videoAdapter.loadResource("./res/snooker.wmv");
// Start the thread for requesting the video frames
new Thread(this).start();
}
catch(MarvinVideoInterfaceException e){e.printStackTrace();}
}
@Override
public void run() {
try{
while(true){
// Request a video frame
videoFrame = videoAdapter.getFrame();
}
}catch(MarvinVideoInterfaceException e){e.printStackTrace();}
}
public static void main(String[] args) {
MediaFileExample m = new MediaFileExample();
}
}
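The loop above only requests frames; if the goal is the original question's JPEG sample, a grabbed frame can be written out with MarvinImageIO. A hedged sketch, assuming the marvin.video/marvin.io package layout and that MarvinImageIO.saveImage is available in your Marvin version (outPath is just an illustrative naming helper):

```java
import marvin.image.MarvinImage;
import marvin.io.MarvinImageIO;
import marvin.video.MarvinJavaCVAdapter;
import marvin.video.MarvinVideoInterface;
import marvin.video.MarvinVideoInterfaceException;

public class SaveFirstFrame {
    // Build the output path for a saved frame; purely illustrative.
    static String outPath(String dir, String name) {
        return dir + "/" + name + ".jpg";
    }

    public static void main(String[] args) throws MarvinVideoInterfaceException {
        // Same placeholder resource as the example above.
        MarvinVideoInterface videoAdapter = new MarvinJavaCVAdapter();
        videoAdapter.loadResource("./res/snooker.wmv");

        // Grab a single frame and write it out as a JPEG.
        MarvinImage frame = videoAdapter.getFrame();
        MarvinImageIO.saveImage(frame, outPath("./res", "firstFrame"));
    }
}
```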
Maybe this will help you:
Buffer buf = frameGrabber.grabFrame();

// Convert the frame to a buffered image so it can be processed and saved
Image img = new BufferToImage((VideoFormat) buf.getFormat()).createImage(buf);
BufferedImage buffImg = new BufferedImage(img.getWidth(this), img.getHeight(this), BufferedImage.TYPE_INT_RGB);
buffImg.getGraphics().drawImage(img, 0, 0, null);

// Save the buffered image, e.g. as a JPEG
ImageIO.write(buffImg, "jpg", new File("frame.jpg"));
For more information, see: