
Three.js Particle Notes 03: FBO Technique

Building a GPGPU particle system from scratch


Table of contents

  1. Intro
  2. Frame Buffer Object
  3. Mouse Interaction
  4. Math Behind the Effect
  5. Demo

Intro

This article continues from the template built in the first article and the Data Texture concept introduced in the second article. If those ideas are still unfamiliar, I recommend reading those notes first.

In this article, you will learn:

  1. How to use the FBO technique to run non-rendering calculations on the GPU.
  2. How to use the result produced by FBO computation.
  3. How to build a simple mouse interaction for a particle system.

Frame Buffer Object

What is a frame buffer?

In simple terms, a frame buffer is a chunk of memory that stores pixel data; it is an intermediate step in the rendering process. When a 3D scene is rendered, the graphics pipeline goes through geometry transformation, rasterization, pixel shading, and other stages, and at some point the rendered result has to be stored before it is displayed on the screen. A Frame Buffer Object (FBO) wraps such a buffer so that rendering can be redirected into it, for example into a texture, instead of going straight to the screen.

What is the FBO, or Frame Buffer Object, technique?

The simple explanation is that we write data into a texture so the GPU can perform calculations on that data. The core idea is to keep updating the texture so the data evolves over time; a texture passed into a fragment shader once will never update by itself. The workflow in this article is:

  1. Create a new scene and capture it with an orthographic camera.
  2. Create a new geometry and point the camera exactly at that geometry.
  3. Give the geometry a new fragment shader.
  4. Put the data calculation logic inside the fragment shader and output the result to gl_FragColor. This gives us one computed result.
  5. Output what the camera sees as a new texture, then feed that texture back into the fragment shader. The logic is similar to recursion.
  6. Pass the texture output from the camera into another shader to read the computed data.
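A WebGL texture cannot be read and written in the same render pass, so the feedback in steps 5 and 6 is usually implemented by ping-ponging between two render targets that swap roles each frame. Here is a minimal sketch of just that swap logic, with plain objects standing in for THREE.WebGLRenderTarget (the names are illustrative, not the article's code):

```javascript
// Ping-pong sketch: two buffers alternate between the "read" and "write" roles.
// Plain objects stand in for THREE.WebGLRenderTarget; in the real setup the FBO
// shader samples readTarget.texture while the renderer draws into writeTarget.
function createPingPong() {
  let readTarget  = { texture: 'textureA' };
  let writeTarget = { texture: 'textureB' };
  return {
    read:  () => readTarget,   // texture the FBO shader samples this frame
    write: () => writeTarget,  // target the renderer writes into this frame
    // After rendering, swap roles so this frame's output is next frame's input.
    swap:  () => { [readTarget, writeTarget] = [writeTarget, readTarget]; },
  };
}

const pp = createPingPong();
const firstInput = pp.read().texture;  // frame 1 reads 'textureA'
pp.swap();
const secondInput = pp.read().texture; // frame 2 reads frame 1's output, 'textureB'
```

In Three.js terms, `write()` would be what you pass to `renderer.setRenderTarget()`, and `read().texture` would supply the position-texture uniform.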

1. Create a Data Texture for the Position Buffer

This step was already explained in the previous article, so here I am only recording the implementation process.

createPositionBuffer(){
    this.size = 32;                       // the data texture is size x size texels
    this.number = this.size * this.size;  // one particle per texel

    this.positions = new Float32Array(this.number * 4); // RGBA per texel
    for(let i = 0; i < this.size; i++){
      for(let j = 0; j < this.size; j++){
        let index = i * this.size + j;
        this.positions[index * 4]     = i / (this.size - 1) - 0.5; // x
        this.positions[index * 4 + 1] = j / (this.size - 1) - 0.5; // y
        this.positions[index * 4 + 2] = 0;                         // z
        this.positions[index * 4 + 3] = 1;                         // w
      }
    }
    this.positionTexture = new THREE.DataTexture(this.positions, this.size, this.size, THREE.RGBAFormat, THREE.FloatType);
    this.positionTexture.needsUpdate = true;
  }
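The nested loop maps a size × size grid onto coordinates centered on the origin: i / (size - 1) spreads the indices evenly over [0, 1], and subtracting 0.5 shifts them to [-0.5, 0.5]. The mapping can be checked standalone, without Three.js:

```javascript
// Rebuild the same grid mapping as createPositionBuffer, without Three.js.
function gridPositions(size) {
  const positions = new Float32Array(size * size * 4);
  for (let i = 0; i < size; i++) {
    for (let j = 0; j < size; j++) {
      const index = i * size + j;
      positions[index * 4]     = i / (size - 1) - 0.5; // x in [-0.5, 0.5]
      positions[index * 4 + 1] = j / (size - 1) - 0.5; // y in [-0.5, 0.5]
      positions[index * 4 + 2] = 0;                    // z
      positions[index * 4 + 3] = 1;                    // w (a free slot)
    }
  }
  return positions;
}

const p = gridPositions(32);
// The first particle sits at the corner (-0.5, -0.5) ...
console.log(p[0], p[1]); // -0.5 -0.5
// ... and the last one at the opposite corner (0.5, 0.5).
console.log(p[p.length - 4], p[p.length - 3]); // 0.5 0.5
```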

2. Set Up the FBO Scene

This step creates the minimum elements required for the GPU computation unit:

  1. A scene.
  2. An OrthographicCamera.
  3. A plane mesh.
  4. A fragment shader.
setupFBO(){
    this.sceneFBO = new THREE.Scene();
    this.cameraFBO = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
    this.cameraFBO.position.z = 1;
    this.cameraFBO.lookAt(new THREE.Vector3(0, 0, 0));

    // A 2x2 plane exactly fills the camera's [-1, 1] frustum, so every texel
    // of the data texture maps to one fragment of the FBO pass.
    this.geoFBO = new THREE.PlaneGeometry(2, 2);
    this.matFBO = new THREE.ShaderMaterial({
      uniforms: {
        uMousePos: {value: this.uMousePos},
        uPosTexture: {value: this.positionTexture},      // current positions, updated every frame
        uOriginPosTexture: {value: this.positionTexture} // rest positions, never updated
      },
      vertexShader: fboVertex,
      fragmentShader: fboFragment,
    });
    this.meshFBO = new THREE.Mesh(this.geoFBO, this.matFBO);
    this.sceneFBO.add(this.meshFBO);
    ...
    }

3. RenderTarget

What is a RenderTarget?

In Three.js, a render target is an object used to capture the output of the rendering process. It represents an area where the scene will be rendered, and it is usually a texture or a framebuffer object.

This step captures the fragment shader output, texel for texel, as a new texture. In Three.js, the process of outputting rendered data through a render target is:

  1. Declare the image format of the render target.
  2. During rendering, specify which camera and scene the render target should render.
  3. Output the temporary data stored in the render target as a texture.
setupFBO(){
    ...
    // NearestFilter reads each texel exactly, with no interpolation between
    // neighboring particles. FloatType stores full-precision positions instead
    // of the default 8-bit THREE.UnsignedByteType; the type must match the format.
    this.renderTarget = new THREE.WebGLRenderTarget(this.size, this.size, {
      minFilter: THREE.NearestFilter,
      magFilter: THREE.NearestFilter,
      type: THREE.FloatType
    });
    }

4. Pass the Data Texture into the Shader

The texture produced by the FBO pass is passed into the shader that actually renders the particles.

addObject(){
    this.geometry = new THREE.PlaneGeometry(10, 10, 50, 50);

    this.time = 0;
    this.material = new THREE.ShaderMaterial({
        uniforms: {
            time: {value: this.time},
            uTexture: {value: this.positionTexture} // each uniform must be wrapped in {value: ...}
        },
        vertexShader: vertexShader,
        fragmentShader: fragmentShader,
    });
    this.mesh = new THREE.Points(this.geometry, this.material);
    this.scene.add(this.mesh);
    }
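The material above only does something useful if its vertex shader actually reads positions out of uTexture; the article does not show that shader. Here is a hedged sketch of what it typically looks like for a Points mesh (the per-particle `uv` attribute and the point-size falloff are assumptions, not the article's code):

```glsl
// Sketch of a Points vertex shader that reads positions from the data texture.
// `uv`, `modelViewMatrix`, and `projectionMatrix` are built-ins that Three.js
// injects into every ShaderMaterial.
varying vec2 vUv;
uniform sampler2D uTexture;

void main() {
    vUv = uv;
    // The texel addressed by this vertex's uv stores its current position,
    // as written by the FBO pass.
    vec3 pos = texture2D(uTexture, vUv).xyz;
    vec4 mvPosition = modelViewMatrix * vec4(pos, 1.0);
    gl_PointSize = 10.0 * (1.0 / -mvPosition.z); // shrink points with distance
    gl_Position = projectionMatrix * mvPosition;
}
```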

Mouse Interaction

1. Convert Mouse Position from Screen Space to 3D Space

To map the mouse position into 3D space, we need these steps:

  1. Listen to mouse movement events in real time.
  2. Get the mouse position on the page or inside the container.
  3. Convert the mouse coordinate into the 2D coordinate system used by Three.js.
  4. Use a raycaster to find the object intersected by the ray.
  5. Read the intersection point.
setupMouseEvent(){
    this.uMousePos = new THREE.Vector3(0, 0, 0);
    this.pointer = new THREE.Vector2();
    this.raycaster = new THREE.Raycaster();

    // An invisible plane for the ray to hit; the intersection point is the
    // mouse position in 3D space.
    this.raycastMesh = new THREE.Mesh(
      new THREE.PlaneGeometry(10, 10),
      new THREE.MeshBasicMaterial()
    );
    window.addEventListener('pointermove', (e)=>{
      this.pointer.x = (e.clientX / this.width) * 2 - 1;
      this.pointer.y = -(e.clientY / this.height) * 2 + 1;
      this.raycaster.setFromCamera(this.pointer, this.camera);
      const intersects = this.raycaster.intersectObjects([this.raycastMesh]);
      if (intersects.length > 0){
        this.uMousePos = intersects[0].point;
      }
    });
  }
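The two conversion lines in the handler map screen pixels linearly into Three.js's normalized device coordinates: [-1, 1] on both axes, with y flipped because screen y grows downward. The mapping can be checked in isolation:

```javascript
// Screen-space pixel -> normalized device coordinate, as in the pointermove handler.
function toNDC(clientX, clientY, width, height) {
  return {
    x: (clientX / width) * 2 - 1,   // left edge -> -1, right edge -> 1
    y: -(clientY / height) * 2 + 1, // top edge -> 1, bottom edge -> -1 (y flipped)
  };
}

console.log(toNDC(0, 0, 800, 600));     // { x: -1, y: 1 }  top-left corner
console.log(toNDC(400, 300, 800, 600)); // { x: 0, y: 0 }   center of the screen
```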

2. Pass the Position into the Shader

Inside the render loop, update the uniform every frame so the FBO shader always reads the latest intersection point:

 this.matFBO.uniforms.uMousePos.value = this.uMousePos;

Math Behind the Effect

Move Direction: Force

Subtracting the mouse position from the particle position gives a vector pointing away from the mouse; this is the direction in which the particle will be pushed:

vec4 pos = texture2D(uPosTexture, vUv);
vec3 originPos = texture2D(uOriginPosTexture, vUv).xyz;
vec3 force = pos.xyz - uMousePos;

Speed: Force Factor

I recommend using desmos.com here. It lets you draw functions as graphs, which makes it much easier to visually understand the formulas.

Once we have a decreasing formula, we can add it to the fragment shader; here it is 1.0 / max(0.1, len * 50.0), where len is the distance to the mouse. Particles farther from the interaction point then receive a gradually smaller force.
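The falloff used in the shader clamps at 1 / 0.1 = 10 near the mouse and decays as 1 / (50 · len) beyond that. Mirrored in plain JavaScript for a quick sanity check:

```javascript
// Mirror of the shader's falloff: forceFactor = 1.0 / max(0.1, len * 50.0).
function forceFactor(len) {
  return 1 / Math.max(0.1, len * 50);
}

// At the mouse the factor is clamped to 1 / 0.1 = 10 ...
console.log(forceFactor(0));   // 10
// ... and it decays with distance as 1 / (50 * len).
console.log(forceFactor(0.1)); // 0.2
console.log(forceFactor(1));   // 0.02
```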

Revert to the Original Position

As the force becomes weaker, each particle gradually moves back toward its original position.

vec3 posToGo = originPos + normalize(force)*forceFractor;
pos.xy += (posToGo.xy - pos.xy)*0.005;
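The 0.005 factor is an exponential ease: every frame the particle closes 0.5% of the remaining gap to posToGo, so movement is fast while the displacement is large and slows as the particle settles. The same update in plain JavaScript:

```javascript
// Per-frame update: move 0.5% of the remaining distance toward the target.
function ease(pos, target, factor = 0.005) {
  return pos + (target - pos) * factor;
}

let x = 0;
const target = 1;
for (let frame = 0; frame < 500; frame++) {
  x = ease(x, target);
}
// After 500 frames the remaining gap is (1 - 0.005)^500, roughly 8% of the original.
console.log(x.toFixed(3));
```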

Full Fragment Shader

varying vec2 vUv;
uniform sampler2D uPosTexture;
uniform sampler2D uOriginPosTexture;
uniform vec3 uMousePos;

void main() {
    vec4 pos = texture2D(uPosTexture, vUv);
    vec3 originPos = texture2D(uOriginPosTexture, vUv).xyz;

    // Push away from the mouse with a strength that falls off with distance.
    vec3 force = pos.xyz - uMousePos;
    float len = length(force);
    float forceFractor = 1. / max(0.1, len * 50.);

    // Ease toward the displaced target, which drifts back to originPos
    // as the force weakens.
    vec3 posToGo = originPos + normalize(force) * forceFractor;
    pos.xy += (posToGo.xy - pos.xy) * 0.005;

    gl_FragColor = vec4(pos.xyz, 1.0);
}

Full Code

Full code: link

FAQ

What is the FBO technique used for?

It writes computed data into a texture so the GPU can update particle state over time and feed the result back into another shader.

What does this article implement?

It implements the core pieces of a GPGPU particle workflow: data textures, an FBO scene, render targets, mouse interaction, and the fragment shader logic.
