angular - How do I display a GLSL fragment shader in a WebGL canvas with Angular (9)?
Problem description
This is my first time working on a project built with Angular, so I'm still getting used to many of the practices specific to it and to Webpack.
I want to load a custom fragment shader (.frag/.glsl) in a full-screen canvas to use as a component's background. In past prototypes and other projects that didn't use Angular, I did this easily with the help of libraries like GlslCanvas, which handle setting up the full-screen quad and the basic uniforms, but now I can't figure out some of the errors thrown at me when I try to build my Angular app.
After a few hours of digging, I found out how to successfully import my shader code using glsl-shader-loader, with a custom webpack config added via @angular-devkit/build-angular and @angular-builder/custom-webpacks:
// my-custom-webpack.config.js
module.exports = {
  module: {
    rules: [{
      test: /\.(frag|vert|glsl)$/,
      use: [
        {
          loader: 'glsl-shader-loader',
          options: {}
        }
      ]
    }]
  }
};
I also learned that I have to stop TypeScript from complaining about importing non-TS modules by adding the required declarations, i.e.:
// my-declarations.d.ts
declare module '*.glsl';
declare module '*.frag';
declare module '*.vert';
At this point the fragment shader's code is imported correctly (or so I think?), and I can log it or print it out (e.g. with {{ myShaderCode }}):
// glsl-bg.component.ts
import { Component, OnInit, ElementRef, ViewChild } from '@angular/core';
import frag from './myShader.frag';
import * as GlslCanvas from 'glslCanvas';

@Component({
  selector: 'app-glsl-bg',
  templateUrl: './glsl-bg.component.html',
  styleUrls: ['./glsl-bg.component.css']
})
export class GlslBgComponent implements OnInit {
  @ViewChild('bgCanvas', { static: true })
  public bgCanvas: ElementRef<HTMLCanvasElement>;

  myShaderCode = frag;

  constructor() { }

  ngOnInit() {
    console.log(this.myShaderCode);
  }
}
However, this is where I'm stuck: I've tried a couple of lightweight libraries (nothing big like three.js) without success in getting the shader code to run in the canvas.
When building with glslCanvas it compiles fine, but nothing shows up in the canvas and I get this in the console:
ERROR TypeError: glslCanvas__WEBPACK_IMPORTED_MODULE_2__ is not a constructor
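(For context: from what I've read, this kind of error usually means the namespace import hands you the module object rather than the constructor itself. A sketch of the import forms that might resolve it; that glslCanvas's constructor is the default export is an assumption on my part:)

```typescript
// Default import instead of `import * as`: requires
// "esModuleInterop": true (which implies allowSyntheticDefaultImports)
// in tsconfig.json.
import GlslCanvas from 'glslCanvas';

// Alternative that bypasses ES-module interop entirely:
// const GlslCanvas = require('glslCanvas');

// Then, inside ngOnInit() (bgCanvas is the @ViewChild from above):
const sandbox = new GlslCanvas(this.bgCanvas.nativeElement);
sandbox.load(this.myShaderCode);
```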
Whereas if I use glsl-canvas-js (a TypeScript port of the former), it fails to compile, giving me this log:
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './buffers' in '[...]\node_modules\glsl-canvas-js\dist'
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './common' in '[...]\node_modules\glsl-canvas-js\dist'
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './context' in '[...]\node_modules\glsl-canvas-js\dist'
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './iterable' in '[...]\node_modules\glsl-canvas-js\dist'
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './logger' in '[...]\node_modules\glsl-canvas-js\dist'
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './subscriber' in '[...]\node_modules\glsl-canvas-js\dist'
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './textures' in '[...]\node_modules\glsl-canvas-js\dist'
ERROR in ./node_modules/glsl-canvas-js/dist/glsl-canvas.js
Module not found: Error: Can't resolve './uniforms' in '[...]\node_modules\glsl-canvas-js\dist'
* [...] = full paths removed for brevity
Any help or hint would be greatly appreciated!
Solution
I'm not an Angular developer, but here is a TypeScript/Webpack setup from the https://darvin.dev Webpack Boilerplate:
GLSL loader for Webpack:
// https://github.com/unic/darvin-webpack-boilerplate/blob/master/webpack/settings/addon-glsl/index.js
// setup with raw-loader and glslify
rules: [
  {
    test: /\.(glsl|frag|vert)$/,
    exclude: /node_modules/,
    use: [
      'raw-loader',
      {
        loader: 'glslify-loader',
        options: {
          transform: [
            ['glslify-hex', { 'option-1': true, 'option-2': 42 }]
          ]
        }
      }
    ]
  }
]
I recommend creating the WebGL instance outside the reactive framework and driving it via pub/sub, so you don't incur extra polling. Below is a helper with efficient frame-time handling; you can find it as a demo example in Darvin 2.0.
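The pub/sub idea can be sketched in a few lines (the class and topic names here are illustrative, not part of the boilerplate):

```typescript
// Minimal pub/sub bus: the Angular component publishes events, and the
// WebGL module (living outside the framework) subscribes to them, so no
// change detection or polling is involved.
type Handler = (payload?: unknown) => void;

class PubSub {
  private topics = new Map<string, Set<Handler>>();

  subscribe(topic: string, handler: Handler): () => void {
    if (!this.topics.has(topic)) this.topics.set(topic, new Set());
    this.topics.get(topic)!.add(handler);
    // return an unsubscribe function so callers can clean up in ngOnDestroy
    return () => { this.topics.get(topic)?.delete(handler); };
  }

  publish(topic: string, payload?: unknown): void {
    this.topics.get(topic)?.forEach(h => h(payload));
  }
}

// Usage: the WebGL side listens, the component side publishes.
const bus = new PubSub();
const received: unknown[] = [];
const unsubscribe = bus.subscribe('webgl:setFps', fps => received.push(fps));
bus.publish('webgl:setFps', 30);   // received is now [30]
unsubscribe();
bus.publish('webgl:setFps', 60);   // no longer delivered
```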
/**
* @author tobias.frei@unic.com
*
* @module glsl uniform demo
*
* https://github.com/unic/darvin-webpack-boilerplate/blob/master/.cli/.preview/.demo/.templates/.njk/templates/modules/m03-background/index.ts
*/
const Tweakpane = require('tweakpane');
// @ts-ignore
import vertexWobble from '@scripts/glsl/demo.glsl.vert';
// @ts-ignore
import fragmentWobble from '@scripts/glsl/demo.glsl.frag';
// Parameter object
let PARAMS: any;
const DEFINE_FPS = 35;
const DEFINE_RES = 800;
const DEFINE_RES2 = 800;
const deviceRatio = 1;
const resX = DEFINE_RES * deviceRatio,
resY = DEFINE_RES2 * deviceRatio,
verts = [-1, 1, -1, -1, 1, -1, 1, 1];
let canvas: HTMLCanvasElement,
gl: WebGLRenderingContext,
fpsInterval: number,
twodContext: CanvasRenderingContext2D,
now: DOMHighResTimeStamp,
then: DOMHighResTimeStamp,
elapsed: DOMHighResTimeStamp,
resFrame1: Promise<string>;
const imageDatas: ImageData[] = [],
textures: any[] = [],
textureLocationDarvin: WebGLUniformLocation[] | any[] = [];
// webgl uniforms
let pos: any,
program: WebGLProgram,
buffer: any,
ut: WebGLUniformLocation | null,
ures: WebGLUniformLocation | null,
ucenter: WebGLUniformLocation | null,
ushake: WebGLUniformLocation | null,
upulse: WebGLUniformLocation | null,
ublink: WebGLUniformLocation | null,
ulight: WebGLUniformLocation | null;
const createShader = (type: number, source: string) => {
const shader = gl.createShader(type);
if (!shader || !source) {
console.error('> cannot create shader');
return;
}
gl.shaderSource(shader, source);
gl.compileShader(shader);
const success = gl.getShaderParameter(shader, gl.COMPILE_STATUS);
if (!success) {
gl.deleteShader(shader);
return false;
}
return shader;
},
createProgram = (vertexShaderString: string, fragmentShaderString: string) => {
// Set up vertex and fragment shaders
const vertexShader = createShader(gl.VERTEX_SHADER, vertexShaderString);
const fragmentShader = createShader(gl.FRAGMENT_SHADER, fragmentShaderString);
// Setup Program and Attach Shader functions
const newProgram: WebGLProgram | null = gl.createProgram();
if (newProgram && vertexShader && fragmentShader) {
gl.attachShader(newProgram, vertexShader);
gl.attachShader(newProgram, fragmentShader);
gl.linkProgram(newProgram);
} else {
console.error('#dv> webgl program error');
}
return newProgram;
},
createGraphics = (vertexShader: string | null, fragmentShader: string | null) => {
if (!vertexShader || !fragmentShader) {
console.error('> shader missing');
return;
}
createTextureObject(imageDatas);
// Create the Program //
const newProgram = createProgram(vertexShader, fragmentShader);
if (!newProgram) {
console.error('#dv> webgl create graphics error');
return;
}
program = newProgram;
// Create and Bind buffer //
buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(
gl.ARRAY_BUFFER,
new Float32Array(verts),
gl.STATIC_DRAW
);
pos = gl.getAttribLocation(program, 'pos');
gl.vertexAttribPointer(
pos,
2, // size: 2 components per iteration
gl.FLOAT, // type: the data is 32bit floats
false, // normalize: don't normalize the data
0, // stride: 0 = move forward size * sizeof(type) each iteration to get the next position
0 // start at the beginning of the buffer
);
gl.enableVertexAttribArray(pos);
importProgram();
},
updateUniforms = (time: DOMHighResTimeStamp): Promise<string> => {
return new Promise(resolve => {
gl.useProgram(program);
importUniforms(time);
gl.drawArrays(
gl.TRIANGLE_FAN, // primitiveType
0, // Offset
4 // Count
);
resolve('resolved');
});
},
importProgram = () => {
if (program) {
ut = gl.getUniformLocation(program, 'time');
ures = gl.getUniformLocation(program, 'resolution');
ucenter = gl.getUniformLocation(program, 'center');
ushake = gl.getUniformLocation(program, 'shake');
upulse = gl.getUniformLocation(program, 'pulse');
ublink = gl.getUniformLocation(program, 'blink');
ulight = gl.getUniformLocation(program, 'light');
}
imageDatas.forEach((_imgData, i) => {
const temp = gl.getUniformLocation(program, 'uTexture' + i);
if (temp) {
textureLocationDarvin.push(temp);
}
});
},
importUniforms = (time: DOMHighResTimeStamp) => {
gl.uniform1f(ut, time / 1000);
gl.uniform2f(ucenter, ((((window.innerWidth) / 2)) / (resX / 100) / 100) * deviceRatio, ( (((window.innerHeight) / 2)) / (resY / 100) / 100) * deviceRatio);
gl.uniform2f(ures, resX, resY);
gl.uniform1i(ushake, PARAMS.shake);
gl.uniform1i(upulse, PARAMS.pulse);
gl.uniform1i(ublink, PARAMS.blink);
gl.uniform1f(ulight, PARAMS.light);
// Set each texture unit to use a particular texture.
textureLocationDarvin.forEach((textureLocation, i) => {
gl.uniform1i(textureLocation, i); // texture unit i
gl.activeTexture(gl['TEXTURE' + i]);
gl.bindTexture(gl.TEXTURE_2D, textures[i]);
});
},
resizeCanvasToDisplaySize = (): boolean => {
const glCanvas = <HTMLCanvasElement>gl.canvas;
const width = glCanvas.clientWidth * deviceRatio;
const height = glCanvas.clientHeight * deviceRatio;
const needResize = glCanvas.width !== width ||
glCanvas.height !== height;
if (needResize) {
glCanvas.width = width;
glCanvas.height = height;
}
return needResize;
},
startRenderLoop = (fps: number) => {
fpsInterval = 1000 / fps;
then = Date.now();
renderLoop(then);
},
renderLoop = async (time: DOMHighResTimeStamp) => {
requestAnimationFrame(renderLoop);
// calc elapsed time since last loop
now = Date.now();
elapsed = now - then;
if (elapsed > fpsInterval) {
// Get ready for next frame by setting then=now, but also adjust for your
// specified fpsInterval not being a multiple of RAF's interval (16.7ms)
then = now - (elapsed % fpsInterval);
// begin call and store promise without waiting
resFrame1 = updateUniforms(time);
// @ts-ignore
const actualFrame = [await resFrame1];
}
},
startShaderItems = ({vertex, fragment}: any) => {
createGraphics(vertex, fragment);
},
initCanvas = () => {
resizeCanvasToDisplaySize();
gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
startRenderLoop(DEFINE_FPS);
},
resize = () => {
requestAnimationFrame(() => {
requestAnimationFrame(() => {
resizeCanvasToDisplaySize();
requestAnimationFrame(() => {
gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
});
});
});
},
addDomListener = () => {
window.addEventListener('resize', resize);
},
removeDomListener = () => {
window.removeEventListener('resize', resize);
},
setImageData = (svgPathsArray: any[]) => {
const canvas2D = <HTMLCanvasElement> document.getElementById('background-canvas2d');
twodContext = <CanvasRenderingContext2D>canvas2D.getContext('2d');
svgPathsArray.forEach((svgPaths) => {
twodContext.clearRect(0, 0, canvas2D.width, canvas2D.height);
svgPaths.forEach((svgPathNode) => {
twodContext.fill(new Path2D(svgPathNode));
});
const imageData = twodContext.getImageData(0, 0, DEFINE_RES, DEFINE_RES2);
imageDatas.push(imageData);
});
},
createTextureObject = (imgDatas: ImageData[]) => {
// create one texture per image
for (let i = 0; i < imgDatas.length; i++) {
const texture: any = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
// Set the parameters so we can render any size image.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
// Upload the image into the texture.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, imgDatas[i]);
// add the texture to the array of textures.
textures.push(texture);
}
},
addTweakPane = () => {
// get settings from storage
const stor = localStorage.getItem('darvindoc-params');
if (stor) {
try {
PARAMS = JSON.parse(stor);
} catch {
PARAMS = {
light: -30.0,
pulse: true,
shake: false,
blink: true
};
}
} else {
PARAMS = {
light: -30.0,
pulse: true,
shake: false,
blink: true
};
}
// params panel
const pane = new Tweakpane();
pane.addInput(PARAMS, 'pulse').on('change', () => {
localStorage.setItem('darvindoc-params', JSON.stringify(PARAMS));
});
pane.addInput(PARAMS, 'shake').on('change', () => {
localStorage.setItem('darvindoc-params', JSON.stringify(PARAMS));
});
pane.addInput(PARAMS, 'blink').on('change', () => {
localStorage.setItem('darvindoc-params', JSON.stringify(PARAMS));
});
pane.addInput(PARAMS, 'light', {
min: -50.,
max: -1.,
}).on('change', () => {
localStorage.setItem('darvindoc-params', JSON.stringify(PARAMS));
});
};
/**
* Change framerate
*
* @param {number} fps - Set new fps, e.g. 55
*/
const setFps = (fps: number) => {
fpsInterval = 1000 / fps;
};
/**
* destroy all instances
*
*/
const destroy = () => {
removeDomListener();
};
/**
 * Initialize module
 */
const init = () => {
const svgTetureObjects: NodeListOf<HTMLElement> | null = document.querySelectorAll('svg.texture-import');
const svgTexturePathStrings: any[] = [];
if (!svgTetureObjects) {
console.error('> webgl: missing svg icons');
return;
}
// import texture paths
svgTetureObjects.forEach((svgTextureObject) => {
const svgPaths: NodeListOf<HTMLElement> | null = svgTextureObject.querySelectorAll('.darvinIconPath');
const svgPathsString: any[] = [];
svgPaths.forEach((svgPathNode) => {
let svgPath: string | undefined;
// tslint:disable-next-line:no-non-null-assertion
svgPath = svgPathNode!.getAttribute('d') || undefined;
svgPathsString.push(svgPath);
});
svgTexturePathStrings.push(svgPathsString);
});
// init canvas
canvas = <HTMLCanvasElement>document.getElementById('background-canvas');
if (!canvas) {
console.error('#dv> no canvas found');
return;
}
const glContext = canvas.getContext('webgl', {
preserveDrawingBuffer: false
});
if (!glContext) {
console.error('#dv> error on webgl context');
return;
}
gl = glContext;
setImageData(svgTexturePathStrings);
gl.enable(gl.BLEND);
gl.blendFunc(gl.ONE, gl.ONE_MINUS_SRC_ALPHA);
startShaderItems({
vertex: vertexWobble,
fragment: fragmentWobble
});
initCanvas();
addDomListener();
addTweakPane();
};
export default {
init,
destroy,
setFps
};
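To wire the module above into the Angular component from the question, a minimal sketch (assuming the module is saved as `background.ts`; the canvas ids are the ones the module looks up, and everything else here is hypothetical glue):

```typescript
// glsl-bg.component.ts (sketch, not part of the boilerplate)
import { AfterViewInit, Component, OnDestroy } from '@angular/core';
import background from './background'; // the helper module above

@Component({
  selector: 'app-glsl-bg',
  // background.ts queries these elements by id, so they must be in the DOM
  template: `
    <canvas id="background-canvas"></canvas>
    <canvas id="background-canvas2d" hidden></canvas>
  `
})
export class GlslBgComponent implements AfterViewInit, OnDestroy {
  ngAfterViewInit() {
    // AfterViewInit rather than OnInit, so the canvases exist in the DOM
    background.init();    // creates the WebGL context and starts the loop
  }
  ngOnDestroy() {
    background.destroy(); // removes the resize listener
  }
}
```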
Have fun