Streaming images from Android using FFmpeg


I currently receive byte arrays of images from an external source and want to send them as raw video through FFmpeg to a stream URL, where an RTSP server accepts the RTSP stream (a similar but unanswered question exists). However, I haven't used FFmpeg from Java, so I couldn't find an example of how to do it. I have a callback that copies the image bytes into a byte array, as follows:

public class MainActivity extends Activity {
    final String rtmp_url = "rtmp://192.168.0.12:1935/live/test";
    private int PREVIEW_WIDTH = 384;
    private int PREVIEW_HEIGHT = 292;
    private String TAG = "MainActivity";
    String ffmpeg = Loader.load(org.bytedeco.ffmpeg.ffmpeg.class);
    final String command[] = {ffmpeg,
            "-y",  //Add "-re" for simulated real-time streaming.
            "-f", "rawvideo",
            "-vcodec", "rawvideo",
            "-pix_fmt", "bgr24",
            "-s", (Integer.toString(PREVIEW_WIDTH) + "x" + Integer.toString(PREVIEW_HEIGHT)),
            "-r", "10",
            "-i", "pipe:",
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",
            "-preset", "ultrafast",
            "-f", "flv",
            rtmp_url};

    private UVCCamera mUVCCamera;

    public void handleStartPreview(Object surface) throws InterruptedException, IOException {
        Log.e(TAG, "handleStartPreview:mUVCCamera" + mUVCCamera + " mIsPreviewing:");
        if (mUVCCamera == null) return;
        Log.e(TAG, "handleStartPreview2 ");
        try {
            mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, 0, UVCCamera.DEFAULT_BANDWIDTH, 0);
            Log.e(TAG, "handleStartPreview3 mWidth: " + mWidth + " mHeight:" + mHeight);
        } catch (IllegalArgumentException e) {
            try {
                // fallback to YUV mode
                mUVCCamera.setPreviewSize(mWidth, mHeight, 1, 26, UVCCamera.DEFAULT_PREVIEW_MODE, UVCCamera.DEFAULT_BANDWIDTH, 0);
                Log.e(TAG, "handleStartPreview4");
            } catch (IllegalArgumentException e1) {
                callOnError(e1);
                return;
            }
        }
        Log.e(TAG, "handleStartPreview: startPreview1");
        int result = mUVCCamera.startPreview();
        mUVCCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_RGBX);
        mUVCCamera.startCapture();
        Toast.makeText(MainActivity.this, "Camera Started", Toast.LENGTH_SHORT).show();
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);
        Process process = pb.start();
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        OutputStream writer = process.getOutputStream();
        byte img[] = new byte[192*108*3];
        for (int i = 0; i < 10; i++)
        {
            for (int y = 0; y < 108; y++)
            {
                for (int x = 0; x < 192; x++)
                {
                    byte r = (byte)((x * y + i) % 255);
                    byte g = (byte)((x * y + i*10) % 255);
                    byte b = (byte)((x * y + i*20) % 255);
                    img[(y*192 + x)*3] = b;
                    img[(y*192 + x)*3+1] = g;
                    img[(y*192 + x)*3+2] = r;
                }
            }

            writer.write(img);
        }

        writer.close();
        String line;
        while ((line = reader.readLine()) != null)
        {
            System.out.println(line);
        }

        process.waitFor();
    }

    public static void buildRawFrame(Mat img, int i)
    {
        int p = img.cols() / 60;
        img.setTo(new Scalar(60, 60, 60));
        String text = Integer.toString(i+1);
        int font = Imgproc.FONT_HERSHEY_SIMPLEX;
        Point pos = new Point(img.cols()/2-p*10*(text.length()), img.rows()/2+p*10);
        Imgproc.putText(img, text, pos, font, p, new Scalar(255, 30, 30), p*2);  //Blue number
    }
}

Also: "Capturing Android camera frames with FFmpeg" captures images frame by frame from the local Android camera, but instead of pushing over RTMP it produces a video file as output, and it does not show how the images are passed to FFmpeg.
frameData is my byte array, and I would like to know how to write the necessary FFmpeg commands, using ProcessBuilder, to send the images over RTSP to a given URL.
Here is an example of what I'm trying to do; in Python 3 I can do it easily:
import cv2
import numpy as np
import socket
import sys
import pickle
import struct
import subprocess

fps = 25
width = 224
height = 224
rtmp_url = 'rtmp://192.168.0.13:1935/live/test'
    
    
    
command = ['ffmpeg',
           '-y',
           '-f', 'rawvideo',
           '-vcodec', 'rawvideo',
           '-pix_fmt', 'bgr24',
           '-s', "{}x{}".format(width, height),
           '-r', str(fps),
           '-i', '-',
           '-c:v', 'libx264',
           '-pix_fmt', 'yuv420p',
           '-preset', 'ultrafast',
           '-f', 'flv',
           rtmp_url]

p = subprocess.Popen(command, stdin=subprocess.PIPE)

while True:
    frame = np.random.randint(0, 255, size=(224, 224, 3))
    frame = frame.astype(np.uint8)
    p.stdin.write(frame.tobytes())

I would like to do the same on Android.

Update: Although I can reproduce @Rotem's answer in NetBeans, on Android executing pb.start() throws a NullPointerException:

    Process: com.infiRay.XthermMini, PID: 32089
    java.lang.NullPointerException
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
        at com.infiRay.XthermMini.MainActivity.handleStartPreview(MainActivity.java:512)
        at com.infiRay.XthermMini.MainActivity.startPreview(MainActivity.java:563)
        at com.infiRay.XthermMini.MainActivity.access$1000(MainActivity.java:49)
        at com.infiRay.XthermMini.MainActivity$3.onConnect(MainActivity.java:316)
        at com.serenegiant.usb.USBMonitor$3.run(USBMonitor.java:620)
        at android.os.Handler.handleCallback(Handler.java:938)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loopOnce(Looper.java:226)
        at android.os.Looper.loop(Looper.java:313)
        at android.os.HandlerThread.run(HandlerThread.java:67)
2022-06-02 11:47:20.300 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.304 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.308 32089-1049/com.infiRay.XthermMini E/libUVCCamera: [1049*UVCPreviewIR.cpp:505:uvc_preview_frame_callback]:receive err data
2022-06-02 11:47:20.312 32089-32089/com.infiRay.XthermMini E/MainActivity: onPause:
2022-06-02 11:47:20.314 32089-32581/com.infiRay.XthermMini I/Process: Sending signal. PID: 32089 SIG: 9
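For what it's worth, a NullPointerException from ProcessBuilder.start() commonly means one of the command array elements is null, e.g. if Loader.load(org.bytedeco.ffmpeg.ffmpeg.class) failed to resolve the ffmpeg binary on the device. A minimal, hedged guard (the buildCommand helper is my naming; this is a diagnostic sketch, not a fix for Android's exec restrictions):

```java
public class FfmpegCommand {
    // Fails fast with a clear message instead of letting
    // ProcessBuilder.start() throw a NullPointerException later.
    public static String[] buildCommand(String ffmpegPath, String... args) {
        if (ffmpegPath == null || ffmpegPath.isEmpty()) {
            throw new IllegalStateException("ffmpeg executable path was not resolved");
        }
        String[] cmd = new String[args.length + 1];
        cmd[0] = ffmpegPath;
        System.arraycopy(args, 0, cmd, 1, args.length);
        return cmd;
    }
}
```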

Do you mean writing to the stdin pipe of an FFmpeg sub-process in JAVA? - Rotem
I'm not sure whether that's possible in Java; from what I've seen it can be done with ProcessBuilder, but it would be great if Java had an stdin pipe. - xnok
I don't know much about JAVA, or about the restrictions on Android. Is it possible to create a sub-process like this: p = Runtime.getRuntime().exec(cmd)? (with an argument list when cmd = "ffmpeg"...), then use p_stdin = DataOutputStream(p.getOutputStream()) and write the data to stdin with p_stdin.write(arr)? - Rotem
This is possible on Android as long as your app uses Android API below 29; otherwise you get "permission denied", see https://github.com/bytedeco/javacv/issues/1127#issuecomment-643700534. - Samuel Audet
2 Answers


Here is a JAVA implementation that resembles the Python code:

The sample writes raw video frames (byte arrays) to the stdin pipe of an FFmpeg sub-process:

 _____________             ___________                  ________ 
| JAVA byte   |           |           |                |        |
| Array       |   stdin   | FFmpeg    |                | Output |
| BGR (format)| --------> | process   | -------------> | stream |
|_____________| raw frame |___________| encoded video  |________|

Main stages:

  • Initialize FFmpeg command arguments:

     final String command[] = {"ffmpeg", "-f", "rawvideo", ...}
    
  • Create ProcessBuilder that executes FFmpeg as a sub-process:

     ProcessBuilder pb = new ProcessBuilder(command);
    
  • Redirect stderr (required for reading FFmpeg messages; without it, the sub-process halts):

     pb.redirectErrorStream(true);
    
  • Start FFmpeg sub-process, and create BufferedReader:

     Process process = pb.start();
     BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
    
  • Create OutputStream for writing to stdin pipe of FFmpeg sub-process:

     OutputStream writer = process.getOutputStream();
    
  • Write raw video frames to stdin pipe of FFmpeg sub-process in a loop:

     byte img[] = new byte[width*height*3];
    
     for (int i = 0; i < n_frames; i++)
     {
         //Fill img with pixel data
         ...
         writer.write(img);
     }
    
  • Close stdin, read and print stderr content, and wait for sub-process to finish:

     writer.close();
    
     String line;
     while ((line = reader.readLine()) != null)
     {
         System.out.println(line);
     }        
    
     process.waitFor();
    

Code sample:
The following code sample writes 10 raw video frames of size 192x108 to FFmpeg.
For testing, we write the result to a test.flv file instead of streaming to RTMP.
The sample uses hard-coded strings and numbers (for simplicity).

Note:
The code sample assumes the FFmpeg executable is in the execution path.

package myproject;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;

public class FfmpegVideoWriter {
    public static void main(String[] args) throws IOException, InterruptedException {
        final String rtmp_url = "test.flv"; //Set output file (instead of output URL) for testing.
        
        final String command[] = {"ffmpeg",
                                  "-y",  //Add "-re" for simulated real-time streaming.
                                  "-f", "rawvideo",
                                  "-vcodec", "rawvideo",
                                  "-pix_fmt", "bgr24",
                                  "-s", "192x108",
                                  "-r", "10",
                                  "-i", "pipe:",
                                  "-c:v", "libx264",
                                  "-pix_fmt", "yuv420p",
                                  "-preset", "ultrafast",
                                  "-f", "flv",
                                  rtmp_url};
        
        ProcessBuilder pb = new ProcessBuilder(command);    //Create ProcessBuilder
        pb.redirectErrorStream(true); //Redirect stderr
        Process process = pb.start();               
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        
        //Create OutputStream for writing to stdin pipe of FFmpeg sub-process.
        OutputStream writer = process.getOutputStream();
        
        byte img[] = new byte[192*108*3];   //Dummy image 
        
        //Write 10 video frames to stdin pipe of FFmpeg sub-process
        for (int i = 0; i < 10; i++)
        {
            //Fill image with some arbitrary pixel values
            for (int y = 0; y < 108; y++)
            {
                for (int x = 0; x < 192; x++)
                {
                    //Arbitrary RGB values:
                    byte r = (byte)((x * y + i) % 255); //Red component
                    byte g = (byte)((x * y + i*10) % 255); //Green component
                    byte b = (byte)((x * y + i*20) % 255); //Blue component
                    img[(y*192 + x)*3] = b; 
                    img[(y*192 + x)*3+1] = g;
                    img[(y*192 + x)*3+2] = r;
                }
            }
            
            writer.write(img);  //Write img to FFmpeg
        }
        
        writer.close();  //Close stdin pipe.

        //Read and print stderr content
        //Note: there may be cases when FFmpeg keeps printing messages, so it may not be the best solution to empty the buffer only at the end.
        //We may consider adding an argument `-loglevel error` for reducing verbosity.
        String line;
        while ((line = reader.readLine()) != null)
        {
            System.out.println(line);
        }        
       
        process.waitFor();
    }
}

The code was tested on my PC (with Windows 10); I'm not sure it's going to work on Android...

The above sample is simplistic and generic; in your case, you may use the rgba pixel format and write the FrameData inside the onFrame method.
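As a hedged sketch of that rgba suggestion (the FrameWriter class and its wiring are my assumptions, not tested on Android; UVCCamera's frame callback delivers a ByteBuffer, and the FFmpeg input side would need "-pix_fmt rgba" with 4 bytes per pixel):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

public class FrameWriter {
    private final OutputStream stdin; // FFmpeg stdin: process.getOutputStream()
    private final byte[] scratch;     // reusable buffer: width * height * 4 for rgba

    public FrameWriter(OutputStream stdin, int width, int height) {
        this.stdin = stdin;
        this.scratch = new byte[width * height * 4];
    }

    // Call this from the camera frame callback (e.g. IFrameCallback.onFrame(ByteBuffer)).
    public void onFrame(ByteBuffer frame) {
        frame.rewind();
        frame.get(scratch, 0, Math.min(frame.remaining(), scratch.length));
        try {
            stdin.write(scratch); // one raw rgba frame per call
        } catch (IOException e) {
            e.printStackTrace();  // FFmpeg exited or the pipe was closed
        }
    }
}
```

Note that a blocking write in the callback can stall the camera thread if FFmpeg falls behind; a queue between the callback and a dedicated writer thread may be safer.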

Sample video frame ("arbitrary pixel values"):


Update:

The following code sample uses OpenCV (Java bindings) for writing Mat data to FFmpeg:

package myproject;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;

import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import org.opencv.core.Point;
import org.opencv.imgproc.Imgproc;

public class FfmpegVideoWriter {
    static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }
    
    //Build synthetic "raw BGR" image for testing
    public static void buildRawFrame(Mat img, int i)
    {
        int p = img.cols() / 60;    //Used as font size factor.
        img.setTo(new Scalar(60, 60, 60));  //Fill image with dark gray color
        String text = Integer.toString(i+1);
        int font = Imgproc.FONT_HERSHEY_SIMPLEX;
        Point pos = new Point(img.cols()/2-p*10*(text.length()), img.rows()/2+p*10);
        Imgproc.putText(img, text, pos, font, p, new Scalar(255, 30, 30), p*2);  //Blue number
    }
    
    public static void main(String[] args) throws IOException, InterruptedException {
        final int cols = 192;
        final int rows = 108;
        
        final String rtmp_url = "test.flv"; //Set output file (instead of output URL) for testing.
        
        final String command[] = {"ffmpeg",
                                  "-y",  //Add "-re" for simulated real-time streaming.
                                  "-f", "rawvideo",
                                  "-vcodec", "rawvideo",
                                  "-pix_fmt", "bgr24",
                                  "-s", (Integer.toString(cols) + "x" + Integer.toString(rows)),
                                  "-r", "10",
                                  "-i", "pipe:",
                                  "-c:v", "libx264",
                                  "-pix_fmt", "yuv420p",
                                  "-preset", "ultrafast",
                                  "-f", "flv",
                                  rtmp_url};
        
        ProcessBuilder pb = new ProcessBuilder(command);    //Create ProcessBuilder
        pb.redirectErrorStream(true); //Redirect stderr
        Process process = pb.start();               
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        
        //Create OutputStream for writing to stdin pipe of FFmpeg sub-process.
        OutputStream writer = process.getOutputStream();
        
        //Dummy image (BGR pixel format).
        Mat img = new Mat(rows, cols, CvType.CV_8UC3, Scalar.all(0));
        
        byte buffer[] = new byte[cols*rows*3]; //Byte array for storing img data    
        
        //Write 10 video frames to stdin pipe of FFmpeg sub-process
        for (int i = 0; i < 10; i++)
        {
            buildRawFrame(img, i); //Build image with blue frame counter.
                       
            img.get(0, 0, buffer); //Copy img data to buffer (not sure if this is the best solution).  
            
            writer.write(buffer); //Write buffer (raw video frame as byte array) to FFmpeg
        }
        
        writer.close(); //Close stdin pipe.

        //Read and print stderr content
        String line;
        while ((line = reader.readLine()) != null)
        {
            System.out.println(line);
        }        
       
        process.waitFor();
    }
}

Sample output frame:


I managed to reproduce it with RTSP on NetBeans; now I'm trying to do the FFmpeg part with JavaCV, but there are still some errors, so I'm working on fixing them. I'll update in a comment if I manage to solve it. - xnok
I updated my answer with a code sample that uses JavaCV. I don't know if it's what you are looking for, but it's more interesting than writing arbitrary values... - Rotem
For some reason it deadlocks on the Android device. - xnok
Does the simple example with the "arbitrary pixel values" and the short loop also deadlock? Try adding the "-re" argument (it may free up some CPU utilization). - Rotem
This approach works for Android 9 and below. Apparently, on Android 10 or above you cannot invoke an executable during code execution, so it won't load javacv's ffmpeg. - xnok
We already have Samuel Audet's warning comment: "This is possible on Android as long as your app uses Android API below 29, otherwise you get 'permission denied'". I guess the post may still be useful for other platforms. There are probably other methods that apply to Android 10, but I have no experience with Android development. - Rotem


I found that using ffmpeg-kit on Android is somewhat more convenient than invoking the ffmpeg binary with ProcessBuilder.

The simplest way to pass arbitrary data (such as images as byte arrays) to ffmpeg is to make use of named pipes.

The pipe is created in {app_data}/cache/pipes/fk_pipe_1 and can be accessed with a FileOutputStream, like any Unix file.

Note that the ffmpeg command is only executed once data can be read from the pipe. More information can be found in the ffmpeg-kit wiki.

In some cases it may be easier to access the camera directly; this is supported for API level 24 and above.

String rtmp_url = "rtp://127.0.0.1:9000";

String pipe1 = FFmpegKitConfig.registerNewFFmpegPipe(context);

FFmpegKit.executeAsync("-re -f rawvideo -pixel_format bgr24 -video_size 640x480 -i " + pipe1 + " -f rtp_mpegts" + " " + rtmp_url, new FFmpegSessionCompleteCallback() {

    @Override
    public void apply(FFmpegSession session) {
        SessionState state = session.getState();
        ReturnCode returnCode = session.getReturnCode();
        // CALLED WHEN SESSION IS EXECUTED
        Log.d(TAG, String.format("FFmpeg process exited with state %s and rc %s.%s", state, returnCode, session.getFailStackTrace()));
    }
}, new LogCallback() {

    @Override
    public void apply(com.arthenica.ffmpegkit.Log log) {
        // CALLED WHEN SESSION PRINTS LOGS
    }
}, new StatisticsCallback() {

    @Override
    public void apply(Statistics statistics) {
        // CALLED WHEN SESSION GENERATES STATISTICS
    }
});

byte img[] = new byte[640*480*3];   // dummy image
FileOutputStream out = new FileOutputStream(pipe1);
try {
    for (int i=0; i<100; i++) { // write 100 empty frames
        out.write(img);
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    out.close();
}
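One caveat (my addition, not from the answer): opening a FileOutputStream on a named pipe blocks until FFmpeg attaches as a reader, so the write loop above should not run on the Android main thread. A plain-Java sketch of feeding the pipe path from a background thread (the PipeFeeder name is mine):

```java
import java.io.FileOutputStream;
import java.io.IOException;

public class PipeFeeder {
    // Writes `count` copies of `frame` to `path` on a background thread,
    // so opening the named pipe cannot block the caller.
    public static Thread feedAsync(String path, byte[] frame, int count) {
        Thread t = new Thread(() -> {
            try (FileOutputStream out = new FileOutputStream(path)) {
                for (int i = 0; i < count; i++) {
                    out.write(frame);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        t.start();
        return t;
    }
}
```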
