Three.js fragment shader with a recycled frame buffer

I'm trying to build an app that simulates long-exposure photography. The idea is to grab the current frame from the webcam and composite it onto a canvas. Over time, the photo "exposes," getting brighter and brighter. (See http://www.chromeexperiments.com/detail/light-paint-live-mercury/?f=)

I have a shader that works perfectly, like the "Add" blend mode in Photoshop. The problem is that I can't get it to recycle the previous frame.

I thought this would be as simple as renderer.autoClear = false;, but that option appears to do nothing in this context.

Here is the code that applies the shader with THREE.EffectComposer.
        onWebcamInit: function () {    
            var $stream = $("#user-stream"),
                width = $stream.width(),
                height = $stream.height(),
                near = 0.1,
                far = 10000;

            this.renderer = new THREE.WebGLRenderer();
            this.renderer.setSize(width, height);
            this.renderer.autoClear = false;
            this.scene = new THREE.Scene();

            this.camera = new THREE.OrthographicCamera(width / -2, width / 2, height / 2, height / -2, near, far);
            this.scene.add(this.camera);

            this.$el.append(this.renderer.domElement);

            this.frameTexture = new THREE.Texture(document.querySelector("#webcam"));
            this.compositeTexture = new THREE.Texture(this.renderer.domElement);

            this.composer = new THREE.EffectComposer(this.renderer);

            // same effect with or without this line
            // this.composer.addPass(new THREE.RenderPass(this.scene, this.camera));

            var addEffect = new THREE.ShaderPass(addShader);
            addEffect.uniforms[ 'exposure' ].value = 0.5;
            addEffect.uniforms[ 'frameTexture' ].value = this.frameTexture;
            addEffect.renderToScreen = true;
            this.composer.addPass(addEffect);

            this.plane = new THREE.Mesh(new THREE.PlaneGeometry(width, height, 1, 1), new THREE.MeshBasicMaterial({map: this.compositeTexture}));
            this.scene.add(this.plane);

            this.frameTexture.needsUpdate = true;
            this.compositeTexture.needsUpdate = true;

            new FrameImpulse(this.renderFrame);

        },
        renderFrame: function () {
            this.frameTexture.needsUpdate = true;
            this.compositeTexture.needsUpdate = true;
            this.composer.render();
        }

Here is the shader. Nothing fancy.

        uniforms: {
            "tDiffuse": { type: "t", value: null },
            "frameTexture": { type: "t", value: null },
            "exposure": { type: "f", value: 1.0 }
        },

        vertexShader: [
            "varying vec2 vUv;",
            "void main() {",
            "vUv = uv;",
            "gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );",

            "}"
        ].join("\n"),

        fragmentShader: [

            "uniform sampler2D frameTexture;",
            "uniform sampler2D tDiffuse;",
            "uniform float exposure;",
            "varying vec2 vUv;",

            "void main() {",
            "vec4 n = texture2D(frameTexture, vUv);",
            "vec4 o = texture2D(tDiffuse, vUv);",
            "vec3 sum = n.rgb + o.rgb;",
            "gl_FragColor = vec4(mix(o.rgb, sum.rgb, exposure), 1.0);",
            "}"

        ].join("\n")
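Since mix(o, sum, exposure) expands to o * (1.0 - exposure) + sum * exposure, and sum = n + o, the whole blend reduces to o + n * exposure: each frame adds exposure-scaled light to the accumulator, which is exactly the long-exposure behavior. A quick sanity check of that identity in plain JavaScript (the helper names here are mine, not part of the shader):

```javascript
// GLSL's mix(): linear interpolation from x to y by a.
function mix(x, y, a) {
  return x * (1 - a) + y * a;
}

// One color channel of the "Add" pass:
// n = new webcam frame, o = accumulated output so far.
function addBlendChannel(n, o, exposure) {
  const sum = n + o;
  return mix(o, sum, exposure); // simplifies to o + n * exposure
}

console.log(addBlendChannel(0.4, 0.2, 0.5)); // ~0.4, i.e. 0.2 + 0.4 * 0.5
console.log(addBlendChannel(0.4, 0.4, 0.5)); // ~0.6 - the image keeps brightening
```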

This discussion seems to suggest you should create the WebGLRenderer like this: new WebGLRenderer({ preserveDrawingBuffer: true }), and set renderer.autoClearColor to false. - voithos

That works great for creating effects on the geometry, like motion blur, but it seems to have no effect on the actual texture on the geometry, which is what I'm trying to target. - posit labs

3 Answers

Essentially this is equivalent to posit labs' answer, but I use a more streamlined solution - I create an EffectComposer containing only the ShaderPass to be recycled, and then swap that composer's renderTargets on every render.

Initialization:

THREE.EffectComposer.prototype.swapTargets = function() {
    var tmp = this.renderTarget2;
    this.renderTarget2 = this.renderTarget1;
    this.renderTarget1 = tmp;
};

...

composer = new THREE.EffectComposer(renderer,  
    new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat })
);

var addEffect = new THREE.ShaderPass(addShader, 'frameTexture');
addEffect.renderToScreen = true;
composer.addPass(addEffect);

Render:

composer.render();
composer.swapTargets();

Either of the two renderTargets can then be pushed to the screen, or transformed further, with a second EffectComposer.
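The swap itself has nothing three.js-specific about it - it is the classic ping-pong pattern: read the previous result from one target while writing the new result to the other, then exchange the two references. A minimal sketch with plain objects standing in for render targets (the additive "pass" is simplified to a sum):

```javascript
// Ping-pong accumulator: two buffers alternate between the "read" role
// (previous frame's result) and the "write" role (current output).
function makePingPong() {
  let read = { value: 0 };   // plays renderTarget2: last frame's result
  let write = { value: 0 };  // plays renderTarget1: this frame's output

  return {
    step(input) {
      // the "shader pass": new output = previous output + new input
      write.value = read.value + input;

      // swapTargets(): what was just written becomes next frame's input,
      // so the previous frame is recycled instead of overwritten
      const tmp = read;
      read = write;
      write = tmp;

      return read.value;
    }
  };
}

const pp = makePingPong();
console.log(pp.step(1)); // 1
console.log(pp.step(2)); // 3 - the previous result survived the swap
console.log(pp.step(3)); // 6
```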

Also note that 'frameTexture' is declared as the textureID when the ShaderPass is initialized. That is how the ShaderPass knows to update the frameTexture uniform with the result of the previous pass.
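For reference, the mechanism behind that textureID argument can be sketched like this: before the pass draws its full-screen quad, the uniform named by textureID is pointed at whatever buffer the previous pass rendered into. This is a paraphrase of the ShaderPass behavior with stand-in objects, not the actual three.js source:

```javascript
// Sketch of ShaderPass-style uniform wiring: the uniform whose name
// matches pass.textureID receives the previous pass's output buffer.
function wireTextureID(pass, readBuffer) {
  if (pass.uniforms[pass.textureID] !== undefined) {
    pass.uniforms[pass.textureID].value = readBuffer;
  }
  // ...the pass would then draw its full-screen quad here...
}

const pass = {
  textureID: 'frameTexture',
  uniforms: { frameTexture: { value: null }, exposure: { value: 0.5 } }
};
const previousResult = { name: 'renderTarget2' }; // stand-in for a WebGLRenderTarget

wireTextureID(pass, previousResult);
console.log(pass.uniforms.frameTexture.value === previousResult); // true
```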


From https://threejsfundamentals.org/threejs/lessons/threejs-post-processing.html it looks like a RenderPass can be marked "needsSwap" to do this automatically. - Elias Hasle

To achieve this kind of feedback effect, you have to alternate writing to separate instances of WebGLRenderTarget. Otherwise the frame buffer gets overwritten. I'm not entirely sure why this happens... but here is the solution.

Initialization:

    this.rt1 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });
    this.rt2 = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });

Render:

    this.renderer.render(this.scene, this.camera);
    this.renderer.render(this.scene, this.camera, this.rt1, false);

    // swap buffers
    var a = this.rt2;
    this.rt2 = this.rt1;
    this.rt1 = a;
    this.shaders.add.uniforms.tDiffuse.value = this.rt2;

Try this:

this.renderer = new THREE.WebGLRenderer( { preserveDrawingBuffer: true } );

Thanks for the reply, doob. Unfortunately that didn't work for me, though. The render-to-texture technique I posted as an answer solves it perfectly... but I'm still not sure why it works. - posit labs
