Implementing Bloom Effect with Mapbox and Three.js

Bloom effects are typically used to enhance the brightness and lighting of an image or scene, for example to intensify light sources or to draw glowing edges. In many game engines and libraries, such as Cesium, bloom is implemented as a post-processing technique. Simply put, post-processing is a second pass applied to the whole canvas after the scene has been rendered.

This article will introduce how to achieve bloom effects with Mapbox and Three.js. First, let’s take a look at the final result:

The bloom processing workflow typically consists of the following steps:

1. Set a brightness threshold and extract the areas that should glow.
2. Blur the extracted areas. Gaussian blur is commonly used, and several blurs with different radii are usually combined for a more natural falloff.
3. Overlay the blurred texture onto the original image to produce the final blended effect.

Mapbox does not provide bloom effects out of the box, but its CustomLayerInterface offers a very flexible extension point that makes it possible to drive Three.js from within Mapbox.

Three.js ships an Unreal Engine-inspired bloom post-processing pass, UnrealBloomPass. Official example: https://threejs.org/examples/?q=bloom#webgl_postprocessing_unreal_bloom

This article combines the two to extend the ability to create bloom effects in Mapbox.

First, let’s implement a glowing line effect. Let’s consider the problems we are about to face:

1. Three.js native lines render only 1 pixel wide, but we need lines with configurable width.
2. The Three.js camera must stay synchronized with the Mapbox camera.
3. Three.js geometry must be positioned from longitude/latitude coordinates.

For lines with width, Three.js provides another addon: Line2.

For camera synchronization and coordinate conversion, thanks to the great open-source community, there is Threebox. The original Threebox is no longer maintained, so keep an eye on this fork instead: https://github.com/jscastro76/threebox
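
For reference, the snippets below assume imports roughly like the following. The exact paths depend on your three.js version and on how you vendor Threebox, so treat them as an assumption rather than a prescription:

import * as THREE from 'three';
import { Line2 } from 'three/examples/jsm/lines/Line2.js';
import { LineGeometry } from 'three/examples/jsm/lines/LineGeometry.js';
import { LineMaterial } from 'three/examples/jsm/lines/LineMaterial.js';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/examples/jsm/postprocessing/UnrealBloomPass.js';
import { OutputPass } from 'three/examples/jsm/postprocessing/OutputPass.js';
// The paths below follow the layout of the Threebox fork's source tree (an assumption).
import CameraSync from 'threebox-plugin/src/camera/CameraSync.js';
import utils from 'threebox-plugin/src/utils/utils.js';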

Adding a Three.js Line to Mapbox

Add a custom layer:

map.addLayer({
  id: 'custom_layer',
  type: 'custom',
  onAdd: function (map, gl) {
  },
  render: function (gl, matrix) {
  },
});

For ease of debugging and to avoid issues with blending with Mapbox textures, we will first create a separate canvas as the container for Three.js. This canvas will have the same size as the map container and perfectly cover it.

onAdd: function (map, gl) {
  container = map.getCanvas();
  const w = container.clientWidth;
  const h = container.clientHeight;
  const mapContainer = map.getContainer();
  let bloomContainer = mapContainer.querySelector('#_THREE_EFFECTS_CONTAINER_');
  if (!bloomContainer) {
    bloomContainer = document.createElement('canvas');
    bloomContainer.id = '_THREE_EFFECTS_CONTAINER_';
    bloomContainer.style.position = 'absolute';
    bloomContainer.style.zIndex = '99999';
    bloomContainer.style.pointerEvents = 'none';
    bloomContainer.style.width = '100%';
    bloomContainer.style.height = '100%';
    bloomContainer.width = w;
    bloomContainer.height = h;
    mapContainer.appendChild(bloomContainer);
  }
}

Initialize the Three.js renderer and camera in the onAdd method:

renderer = new THREE.WebGLRenderer({
  alpha: true,
  antialias: true,
  canvas: bloomContainer,
});

renderer.setPixelRatio(window.devicePixelRatio);
renderer.autoClear = false;
camera = new THREE.PerspectiveCamera(map.transform.fov, w / h, 0.1, 1e21);

Be sure to enable the alpha channel for transparency and to disable autoClear. Also set the clear alpha to zero so that the Three.js scene's background does not cover the map.

renderer.setClearAlpha(0.0);
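
One detail not covered here is resizing. If the map container can change size, the overlay canvas, renderer, and camera should be kept in sync; a minimal sketch, assuming the variables created in onAdd above:

map.on('resize', () => {
  const w = map.getCanvas().clientWidth;
  const h = map.getCanvas().clientHeight;
  renderer.setSize(w, h, false); // false keeps the CSS size at 100%
  camera.aspect = w / h;
  camera.updateProjectionMatrix();
});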

Next, synchronize the camera:

new CameraSync(map, camera, group);

Camera synchronization is crucial: it ensures that the Three.js camera and the Mapbox camera always look at the same area during map interactions. Threebox provides the CameraSync class along with coordinate-transformation utilities, which are essential here. I won't dig into their internals in this article; interested readers can refer to the Threebox source code.
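
One small gap in the snippets so far: the scene and group passed to CameraSync and used below are never shown being created. They are assumed to be set up in onAdd, roughly like this:

scene = new THREE.Scene();
group = new THREE.Group(); // the world group that CameraSync repositions
scene.add(group);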

Next, we create a Three.js line: given an array of longitude/latitude points, we generate the corresponding mesh with custom styles.

Creating a Three.js line:

function createLine2(obj) {
  // Geometry
  var straightProject = utils.lnglatsToWorld(obj.geometry);
  var normalized = utils.normalizeVertices(straightProject);
  var flattenedArray = utils.flattenVectors(normalized.vertices);
  var geometry = new LineGeometry();
  geometry.setPositions(flattenedArray);

  // Material
  let matLine = new LineMaterial({
    color: obj.color,
    linewidth: obj.width,
    dashed: false,
    opacity: obj.opacity,
  });

  matLine.resolution.set(obj.containerWidth, obj.containerHeight);
  matLine.isMaterial = true;
  matLine.transparent = true;
  matLine.depthWrite = false;

  // Mesh
  let line = new Line2(geometry, matLine);
  line.position.copy(normalized.position);
  return line;
}

The two important steps are converting the longitude/latitude coordinates into Three.js world coordinates with lnglatsToWorld, and normalizing the result to obtain the mesh's position.
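
For intuition, the normalization step essentially recenters the projected vertices around their centroid and keeps that centroid as the mesh position, so the geometry stays small and close to the origin. A simplified sketch of the idea (not the actual Threebox source):

function normalizeVerticesSketch(vertices) {
  const center = new THREE.Vector3();
  vertices.forEach((v) => center.add(v));
  center.divideScalar(vertices.length);
  return {
    vertices: vertices.map((v) => v.clone().sub(center)),
    position: center,
  };
}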

The usage of Line2 follows the official three.js "fat lines" example: https://threejs.org/examples/?q=line#webgl_lines_fat

Add the line to the scene:

line = createLine2({
  geometry: lngLatPoints, // array of [lng, lat] points for the line (illustrative name)
  color: 0x00bfff,
  width: 4,
  opacity: 1,
  containerWidth: w,
  containerHeight: h,
});
group.add(line);

With this, we have successfully achieved the first goal: providing an array of latitude and longitude coordinates along with custom styles to generate a Three.js line, and then drawing it onto Mapbox.

Implementing Bloom Effect for Three.js Lines

Before continuing, it is worth getting familiar with the basic usage of the bloom effect in Three.js: https://threejs.org/examples/?q=bloom#webgl_postprocessing_unreal_bloom

Set up the rendering passes using EffectComposer:

const renderScene = new RenderPass(scene, camera);
const bloomPass = new UnrealBloomPass(new THREE.Vector2(w, h), params.strength, params.radius, params.threshold);
const outputPass = new OutputPass();

composer = new EffectComposer(renderer);
composer.addPass(renderScene);
composer.addPass(bloomPass);
composer.addPass(outputPass);
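
The params object above holds the bloom tuning values and is not defined in the snippet. Reasonable placeholder values (an assumption, adjust to taste) map directly onto the three workflow steps listed at the beginning of the article:

const params = {
  threshold: 0,  // step 1: brightness above which pixels start to glow
  radius: 0.4,   // step 2: how far the Gaussian blur spreads
  strength: 1.5, // step 3: how strongly the blurred texture is overlaid
};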

In the render method of the custom layer, render the Three.js scene and update the bloom effect:

composer.render();
renderer.resetState();
renderer.render(scene, camera);
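
Note that a custom layer's render method only runs when the map repaints. If the glow needs to animate continuously (for example a moving dash), it is common to request the next frame explicitly; for purely static geometry this should be skipped to save GPU time:

map.triggerRepaint();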

However, we quickly discovered that the background of the bloom effect was black, covering the map.

Upon analyzing the source code related to the bloom effect, we found the following line in UnrealBloomPass.js:

gl_FragColor = vec4(diffuseSum/weightSum, 1.0);

The shader's alpha channel is always 1. Some references suggest also sampling the alpha value alongside the color and outputting it with the same weighting, but the result is not ideal.

Ultimately, I found that capping the maximum alpha value significantly improves the result.

void main() {
  float weightSum = gaussianCoefficients[0];
  vec3 diffuseSum = texture2D( colorTexture, vUv ).rgb * weightSum;
  float alphaSum = 0.0;
  for( int i = 1; i < KERNEL_RADIUS; i ++ ) {
    float x = float(i);
    float w = gaussianCoefficients[i];
    vec2 uvOffset = direction * invSize * x;
    vec4 sample1 = texture2D( colorTexture, vUv + uvOffset );
    vec4 sample2 = texture2D( colorTexture, vUv - uvOffset );
    diffuseSum += (sample1.rgb + sample2.rgb) * w;
    alphaSum += (sample1.a + sample2.a); // sum of alpha values
    weightSum += 2.0 * w;
  }

  alphaSum /= weightSum;          // normalize the alpha sum
  alphaSum = min(alphaSum, 0.15); // cap the value of alphaSum
  gl_FragColor = vec4(diffuseSum / weightSum, alphaSum);
}

However, even so, the final result is still unsatisfactory, with noticeable boundaries in the glow areas, especially on maps with bright colors.

Nevertheless, we have essentially achieved the second goal: overlaying the glow effect on the map. Besides the imperfect visuals, though, the glow is global and cannot be controlled per object.

Therefore, further optimization is needed.

Selective Glow and Further Effect Optimization

Glow on individual objects can be achieved by following another official Three.js example: https://threejs.org/examples/?q=bloom#webgl_postprocessing_unreal_bloom_selective

The principle is to split the objects into different layers. Before the brightness pass, objects that should not glow have their materials temporarily swapped to black, so only the objects on the bloom layer contribute brightness to the subsequent processing. The original materials are then restored for the normal render, and the two results are blended, as sketched below.
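
A condensed sketch of that approach, adapted from the official selective-bloom example (the bloomComposer/finalComposer names follow that example, not my project code):

const BLOOM_SCENE = 1;
const bloomLayer = new THREE.Layers();
bloomLayer.set(BLOOM_SCENE);
const darkMaterial = new THREE.MeshBasicMaterial({ color: 'black' });
const materials = {};

// Temporarily swap non-glowing objects to black so they add no brightness.
function darkenNonBloomed(obj) {
  if (obj.isMesh && bloomLayer.test(obj.layers) === false) {
    materials[obj.uuid] = obj.material;
    obj.material = darkMaterial;
  }
}

// Put the original materials back before the final composite pass.
function restoreMaterial(obj) {
  if (materials[obj.uuid]) {
    obj.material = materials[obj.uuid];
    delete materials[obj.uuid];
  }
}

// Per frame: render the bloom pass with darkened materials, then composite.
scene.traverse(darkenNonBloomed);
bloomComposer.render();
scene.traverse(restoreMaterial);
finalComposer.render();

Objects that should glow are simply enabled on the bloom layer, e.g. line.layers.enable(BLOOM_SCENE).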

After modifying according to the three.js official example, I successfully achieved the same effect as before.

Regarding optimization of the effect, I found an issue on the three.js GitHub repository: https://github.com/mrdoob/three.js/issues/14104
The alpha-channel problem appears to be a hard one and has been open since 2018. At the end of the discussion, a contributor suggests not modifying UnrealBloomPass at all, but instead handling transparency in the shader that blends the source texture with the bloom texture.

void main() {
  vec4 base_color = texture2D(baseTexture, vUv);
  vec4 bloom_color = texture2D(bloomTexture, vUv);

  float lum = 0.21 * bloom_color.r + 0.71 * bloom_color.g + 0.07 * bloom_color.b;
  gl_FragColor = vec4(base_color.rgb + bloom_color.rgb, max(base_color.a, lum));
}

This solution essentially addresses the transparency issue, but on certain machines (in my testing, large external displays show the problem while laptops are fine), noticeable color boundaries may still appear.

So, on top of this solution, we tweak the alpha value further to smooth out the color transition.

void main() {
  vec4 base_color = texture2D(baseTexture, vUv);
  vec4 bloom_color = texture2D(bloomTexture, vUv);

  float lum = 0.21 * bloom_color.r + 0.71 * bloom_color.g + 0.07 * bloom_color.b;
  vec3 blendedColor = base_color.rgb + bloom_color.rgb;
  float alpha = max(base_color.a, lum);

  alpha = mix(alpha, 0.05, 0.1);
  gl_FragColor = vec4(blendedColor, alpha);
}

The key is the following line; the mix target and blend factor can be adjusted to suit different situations.

alpha = mix(alpha, 0.05, 0.1);

The problem is resolved.

At this stage, the effect basically meets the requirements. However, this approach essentially overlays a Three.js canvas on top of the Mapbox canvas, which inevitably leads to layer-ordering problems: Mapbox layers can never be drawn above the glow layer.

In fact, the current solution doesn't really rely on Mapbox's custom layers at all; it only needs the Map object.

So, let’s continue optimizing!

Blending the Glow Layer with Mapbox

The Three.js container is, at its core, a canvas, and a canvas can be uploaded as a WebGL texture. So we can take the contents of the Three.js container, draw them onto the Mapbox canvas, and blend them with the original texture; this is exactly the basic usage pattern of Mapbox custom layers.

We add the shader-related code to the custom layer's onAdd and render methods.

Method onAdd:

const vertexShaderSource = `
  attribute vec2 a_position;
  attribute vec2 a_texCoord;
  uniform vec2 u_resolution;
  varying vec2 v_texCoord;
  void main() {
    vec2 zeroToOne = a_position / u_resolution;
    vec2 zeroToTwo = zeroToOne * 2.0;
    vec2 clipSpace = zeroToTwo - 1.0;
    gl_Position = vec4(clipSpace * vec2(1, -1), 0, 1);
    v_texCoord = a_texCoord;
  }
`;
const fragmentShaderSource = `
  #ifdef GL_ES
  precision mediump float;
  #endif
  uniform sampler2D u_image;
  varying vec2 v_texCoord;
  void main() {
    gl_FragColor = texture2D(u_image, v_texCoord);
  }
`;

const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertexShaderSource);
gl.compileShader(vertexShader);
if (!gl.getShaderParameter(vertexShader, gl.COMPILE_STATUS)) {
  console.error(gl.getShaderInfoLog(vertexShader));
  gl.deleteShader(vertexShader);
  return;
}

const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragmentShaderSource);
gl.compileShader(fragmentShader);
if (!gl.getShaderParameter(fragmentShader, gl.COMPILE_STATUS)) {
  console.error(gl.getShaderInfoLog(fragmentShader));
  gl.deleteShader(fragmentShader);
  return;
}

program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
  console.error(gl.getProgramInfoLog(program));
  gl.deleteProgram(program);
  return;
}

// attrib
positionLocation = gl.getAttribLocation(program, 'a_position');
texcoordLocation = gl.getAttribLocation(program, 'a_texCoord');
resolutionLocation = gl.getUniformLocation(program, 'u_resolution');

// buffer
positionBuffer = gl.createBuffer();
texcoordBuffer = gl.createBuffer();

// texture
texture = gl.createTexture();

Method render:

gl.useProgram(program);

gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
setRectangle(gl, 0, 0, container.width, container.height);

gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
gl.bufferData(
  gl.ARRAY_BUFFER,
  new Float32Array([0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0]),
  gl.STATIC_DRAW,
);

gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, bloomContainer);

gl.enableVertexAttribArray(positionLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);

gl.enableVertexAttribArray(texcoordLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, texcoordBuffer);
gl.vertexAttribPointer(texcoordLocation, 2, gl.FLOAT, false, 0, 0);

gl.uniform2f(resolutionLocation, gl.canvas.width, gl.canvas.height);
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);

gl.drawArrays(gl.TRIANGLES, 0, 6);
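
One helper used above but not defined in the article is setRectangle. A minimal version, following the common WebGL tutorial pattern of filling the position buffer with two triangles that cover the given rectangle (treat this as an assumption about the author's helper):

function setRectangle(gl, x, y, width, height) {
  const x1 = x;
  const x2 = x + width;
  const y1 = y;
  const y2 = y + height;
  gl.bufferData(
    gl.ARRAY_BUFFER,
    new Float32Array([x1, y1, x2, y1, x1, y2, x1, y2, x2, y1, x2, y2]),
    gl.STATIC_DRAW,
  );
}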

At this point, however, the bloom effect looks too faint.

gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);

The line above is the current blending mode. We need to change it, and at the same time set the alpha of the Three.js output to 1.0:

void main() {
  vec4 base_color = texture2D(baseTexture, vUv);
  vec4 bloom_color = texture2D(bloomTexture, vUv);
  vec3 blendedColor = base_color.rgb + bloom_color.rgb;
  gl_FragColor = vec4(blendedColor, 1.0);
}

Then set the blendFunc to simple additive blending:

gl.blendFunc(gl.ONE, gl.ONE);

Resolved.

To test the layer order, draw a red line with Mapbox: it can now cover the glow effect, which confirms that the glow is blended correctly into the Mapbox layer stack.

Adding Event Handling

Finally, we need to handle events. Mapbox custom layers cannot emit feature-level events, but since our effect lives in the Three.js scene graph, we can use raycasting instead.

var raycaster = new THREE.Raycaster();
var mouse = new THREE.Vector2();

function onMouseClick(event) {
  const w = container.width / window.devicePixelRatio;
  const h = container.height / window.devicePixelRatio;

  mouse.x = (event.clientX / w) * 2 - 1;
  mouse.y = -(event.clientY / h) * 2 + 1;
  raycaster.setFromCamera(mouse, camera);
  var intersects = raycaster.intersectObjects(scene.children, true);
  if (intersects.length > 0) {
    console.log('Object clicked!');
    intersects[0].object.material.color.set(0xff0000);
  }
}

window.addEventListener('click', onMouseClick, false);

It looks simple, right? Just copy a demo from the official website. But in reality, I encountered a major issue here.

At first, intersectObjects never hit anything, returning an empty array without any error. I even stepped through the Three.js source but couldn't find the cause. Eventually I discovered it was due to incorrect camera parameters, the ones mentioned earlier in this article.

At the beginning, I set it up like this:

new THREE.PerspectiveCamera(28, container.innerWidth / container.innerHeight,0.000000000001, Infinity);

Did that solve the problem?

The click event now works, but only part of the area responds…

Upon closer inspection, the fov was hard-coded to 28; to fully synchronize with Mapbox, we should use the Mapbox camera's fov directly (along with sensible near/far planes), as set up earlier:

camera = new THREE.PerspectiveCamera(map.transform.fov, w / h, 0.1, 1e21);

The event issue is finally resolved.

With this, we have a case study that can be applied in practice: camera synchronization, coordinate-system synchronization, bloom, selective bloom control, Mapbox layer-order control, and event handling are all covered. Note, however, that bloom does have a performance cost. In real projects, further encapsulation is needed, such as keeping the bloom container as a singleton, separating dynamic effects from static ones, and avoiding continuous refreshes for purely static effects.

GitHub repository: https://github.com/ethan-zf/mapbox-bloom-effect-sample. Feel free to star it if you find it helpful!
