Post-process shaders in glfx

Pushed an update to glfx to allow for post-process shading. When a post-process shader is defined, the scene is rendered to a screen-space quad (the size of the viewport), and that quad is then rendered to the viewport with the post-process shader applied.

The shader is loaded (asynchronously) like any other:

glfx.shaders.load('screenspace.fs', "frag-shader-screenspace", glfx.gl.FRAGMENT_SHADER);

Once loaded, we create the shader program and get locations for whatever variables are used. The vertex shader isn’t anything special; it simply transforms each vertex by the model-view and projection matrices and passes along the texture coordinates.

glfx.whenAssetsLoaded(function() {

    var postProcessShaderProgram = glfx.shaders.createProgram(
        [glfx.shaders.buffer['vert-shader-basic'], glfx.shaders.buffer['frag-shader-screenspace']],
        function(_shprog) {

            // Setup variables for shader program
            _shprog.vertexPositionAttribute = glfx.gl.getAttribLocation(_shprog, "aVertexPosition");
            _shprog.pMatrixUniform = glfx.gl.getUniformLocation(_shprog, "uPMatrix");
            _shprog.mvMatrixUniform = glfx.gl.getUniformLocation(_shprog, "uMVMatrix");
            _shprog.textureCoordAttribute = glfx.gl.getAttribLocation(_shprog, "aTextureCoord");

            _shprog.uPeriod = glfx.gl.getUniformLocation(_shprog, "uPeriod");
            _shprog.uSceneWidth = glfx.gl.getUniformLocation(_shprog, "uSceneWidth");
            _shprog.uSceneHeight = glfx.gl.getUniformLocation(_shprog, "uSceneHeight");

            glfx.gl.enableVertexAttribArray(_shprog.vertexPositionAttribute);
            glfx.gl.enableVertexAttribArray(_shprog.textureCoordAttribute);

        });

...

We then tell glfx to apply our post-process shader program:

glfx.scene.setPostProcessShaderProgram(postProcessShaderProgram);

This call results in a different rendering path: the scene is rendered to a texture, that texture is applied to a screen-space quad, and the quad is rendered with the post-process shader.

Here is the shader for screenspace.fs, used in the demo shown above:

precision mediump float;

uniform float uPeriod;
uniform float uSceneWidth;
uniform float uSceneHeight;
uniform sampler2D uSampler;
varying vec2 vTextureCoord;

void main(void) {

    vec4 sum = vec4(0.);
    float blurSampleOffsetScale = 2.8;
    float px = (1.0 / uSceneWidth) * blurSampleOffsetScale;
    float py = (1.0 / uSceneHeight) * blurSampleOffsetScale;

    vec4 src = texture2D(uSampler, vTextureCoord);

    sum += texture2D(uSampler, vTextureCoord + vec2(-px, 0));
    sum += texture2D(uSampler, vTextureCoord + vec2(-px, -py));
    sum += texture2D(uSampler, vTextureCoord + vec2(0, -py));
    sum += texture2D(uSampler, vTextureCoord + vec2(px, -py));
    sum += texture2D(uSampler, vTextureCoord + vec2(px, 0));
    sum += texture2D(uSampler, vTextureCoord + vec2(px, py));
    sum += texture2D(uSampler, vTextureCoord + vec2(0, py));
    sum += texture2D(uSampler, vTextureCoord + vec2(-px, py));
    sum += src;

    sum = sum / 9.0;

    gl_FragColor = src + (sum * 2.5 * uPeriod);
}
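To see what the kernel computes, here is the same averaging-and-brightening arithmetic in plain JavaScript over a single grayscale grid. This is an illustrative sketch, not part of glfx; the edge clamping stands in for the CLAMP_TO_EDGE texture wrap used elsewhere in the code.

```javascript
// 3x3 box blur + brighten, mirroring the fragment shader's math.
// `grid` is a 2D array of grayscale values; out-of-range samples
// are clamped to the nearest edge texel.
function blurBrighten(grid, x, y, uPeriod) {
    const clamp = (v, lo, hi) => Math.min(Math.max(v, lo), hi);
    const sample = (sx, sy) =>
        grid[clamp(sy, 0, grid.length - 1)][clamp(sx, 0, grid[0].length - 1)];

    let sum = 0;
    for (let dy = -1; dy <= 1; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
            sum += sample(x + dx, y + dy);   // 8 neighbors + the center tap
        }
    }
    const src = sample(x, y);
    const blurred = sum / 9.0;               // average of the 9 taps
    return src + blurred * 2.5 * uPeriod;    // source plus scaled blur term
}
```

When uPeriod is 0 the blur term vanishes and the source pixel passes through unchanged; as uPeriod grows the blurred, brightened copy is layered on top.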

Note that the shader requires a few uniforms to be supplied. We use the glfx.scene.onPostProcessPreDraw() callback to set up these variables before the post-processed scene is drawn:

var timeAcc = 0;
glfx.scene.onPostProcessPreDraw = function(tdelta) {

    timeAcc += tdelta;
    var timeScaled = timeAcc * 0.00107;

    if(timeScaled > 2.0*Math.PI) {
        timeScaled = 0;
        timeAcc = 0;
    }

    var period = Math.cos(timeScaled);
    glfx.gl.uniform1f(postProcessShaderProgram.uPeriod, period + 1.0);

    glfx.gl.uniform1f(postProcessShaderProgram.uSceneWidth, glfx.gl.viewportWidth);
    glfx.gl.uniform1f(postProcessShaderProgram.uSceneHeight, glfx.gl.viewportHeight);
};

We use the scene rendering time deltas to generate a periodic, sinusoidal wave, which produces the pulsing brightness/fading effect in the scene. The brightness effect itself comes from adding the source pixel to a blurred and brightened version of itself; the blurring allows for the soft fade in and fade out.
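The waveform driving uPeriod can be sketched on its own. This is a standalone illustration; it uses a modulo wrap in place of the callback's reset-to-zero, which is equivalent for cos() since cosine is 2π-periodic.

```javascript
// The accumulated frame time (in ms) is scaled, wrapped at 2*PI,
// and cos() of it is shifted up by 1.0, so uPeriod sweeps 0..2.
function periodAt(timeAccMs) {
    let timeScaled = timeAccMs * 0.00107;
    timeScaled = timeScaled % (2.0 * Math.PI); // wrap, as the callback does
    return Math.cos(timeScaled) + 1.0;
}
```

Because the result never goes negative, the additive blur term in the shader fades smoothly in and out rather than inverting the image.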

glfx – WebGL basis

The base code for my WebGL experiments has been pretty sloppy thus far. I recently took some time to clean up the code in order to have a more solid basis to work from, and I’m presenting it here as a primer for anyone looking for a simple bootstrap or a code-heavy intro to WebGL.

A walk-through of the base code (glfx) and sample code to generate the demo shown below follows. The code is also available via the glfx bitbucket repository.

Dependencies

For matrix and vector operations, the glMatrix library is required.

Also window.requestAnimationFrame needs to be defined. For older browsers the following shim can be used:

window.requestAnimationFrame = (function(){
    return window.requestAnimationFrame ||
           window.webkitRequestAnimationFrame ||
           window.mozRequestAnimationFrame ||
           window.oRequestAnimationFrame ||
           window.msRequestAnimationFrame ||
           function(callback) {
               window.setTimeout(callback, 1000 / 60);
           };
})();

glfx

glfx is the crux of the rendering interface and encapsulates the WebGL context, functionality to load assets (shaders, textures, models), and functionality to setup and render the scene.

// glfx object wraps everything necessary for the rendering interface
var glfx = { };

// echo function to output debug statements to console
glfx.echo = function(txt) {
    if(typeof console !== 'undefined' && typeof console.log === 'function') {
        console.log(txt);
    }
};

// WebGL context
glfx.gl = null;

// reference count for assets needed before rendering
glfx.assetRef = 0;
// function called when all assets are loaded; set by user via glfx.whenAssetsLoaded, reset internally
glfx.onAssetsLoaded = function() { };
// function to schedule a callback for when all assets are loaded, set by user
glfx.whenAssetsLoaded = function(_callback) {
    if(typeof _callback !== 'undefined') {
        if(glfx.assetRef === 0) {
            _callback();
        }
        else {
            glfx.onAssetsLoaded = _callback;
        }
    }
};
// function to increment asset ref count
glfx.incAssetRef = function() {
    glfx.assetRef++;
    if(glfx.assetRef === 0) {
        glfx.onAssetsLoaded();
        glfx.onAssetsLoaded = function() { }; // reset
    }
};
// function to decrement asset ref count
glfx.decAssetRef = function() {
    glfx.assetRef--;
};

// Shaders class
glfx.shaders = { };
// associative buffer of loaded shaders, keyed by name
glfx.shaders.buffer = { };

// Function to load a shader from an external file
// _url = path to shader source
// _name = name under which to store the compiled shader in glfx.shaders.buffer
// _type = gl.VERTEX_SHADER / gl.FRAGMENT_SHADER
// _callback = function to call after the shader is created; the shader object is passed if it compiled successfully, null otherwise
glfx.shaders.load = function(_url, _name, _type, _callback) {
    glfx.decAssetRef();

    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {

            var shaderSrc = xmlhttp.responseText;
            var shader = glfx.gl.createShader(_type);

            glfx.gl.shaderSource(shader, shaderSrc);
            glfx.gl.compileShader(shader);

            if (!glfx.gl.getShaderParameter(shader, glfx.gl.COMPILE_STATUS)) {
                shader = null;
            }

            if(typeof _callback !== 'undefined') {
                _callback(shader);
            }

            glfx.shaders.buffer[_name] = shader;

            glfx.incAssetRef();
        }
    };

    xmlhttp.open("GET", _url, true);
    xmlhttp.send();
};


// Textures class
glfx.textures = { };
// associative buffer of loaded textures, keyed by name
glfx.textures.buffer = { };
// Method to load texture from file
glfx.textures.load = function(_path, _name) {

    glfx.decAssetRef();

    glfx.textures.buffer[_name] = glfx.gl.createTexture();

    var tex = glfx.textures.buffer[_name];
    tex.image = new Image();
    tex.image.onload = function() {

        var tex = glfx.textures.buffer[_name];
        glfx.gl.bindTexture(glfx.gl.TEXTURE_2D, tex);
        glfx.gl.pixelStorei(glfx.gl.UNPACK_FLIP_Y_WEBGL, true);
        glfx.gl.texImage2D(glfx.gl.TEXTURE_2D, 0, glfx.gl.RGBA, glfx.gl.RGBA, glfx.gl.UNSIGNED_BYTE, tex.image);

        glfx.gl.texParameteri(glfx.gl.TEXTURE_2D, glfx.gl.TEXTURE_MAG_FILTER, glfx.gl.LINEAR);
        glfx.gl.texParameteri(glfx.gl.TEXTURE_2D, glfx.gl.TEXTURE_MIN_FILTER, glfx.gl.LINEAR);

        // required for non-power-of-2 textures
        glfx.gl.texParameteri(glfx.gl.TEXTURE_2D, glfx.gl.TEXTURE_WRAP_S, glfx.gl.CLAMP_TO_EDGE);
        glfx.gl.texParameteri(glfx.gl.TEXTURE_2D, glfx.gl.TEXTURE_WRAP_T, glfx.gl.CLAMP_TO_EDGE);

        glfx.gl.bindTexture(glfx.gl.TEXTURE_2D, null);

        glfx.incAssetRef();
    };

    tex.image.src = _path;
};


// Model class
glfx.model = function() {
    this.vertexBuffer = null;
    this.indexBuffer = null;
    this.texcoordBuffer = null;
    this.normalBuffer = null;
};

// Models class
glfx.models = { };
// associative buffer of loaded models, keyed by name
glfx.models.buffer = { };
// Method to load a model from a JSON file
glfx.models.load = function(_url, _name, _callback) {

    glfx.decAssetRef();

    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {

            var data = JSON.parse(xmlhttp.responseText);

            var mdl = new glfx.model();

            mdl.vertexBuffer = glfx.gl.createBuffer();
            glfx.gl.bindBuffer(glfx.gl.ARRAY_BUFFER, mdl.vertexBuffer);
            glfx.gl.bufferData(glfx.gl.ARRAY_BUFFER, new Float32Array(data.verts), glfx.gl.STATIC_DRAW);
            mdl.vertexBuffer.itemSize = 3;
            mdl.vertexBuffer.numItems = data.verts.length / 3;

            mdl.indexBuffer = glfx.gl.createBuffer();
            glfx.gl.bindBuffer(glfx.gl.ELEMENT_ARRAY_BUFFER, mdl.indexBuffer);
            glfx.gl.bufferData(glfx.gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(data.indices), glfx.gl.STATIC_DRAW);
            mdl.indexBuffer.itemSize = 1;
            mdl.indexBuffer.numItems = data.indices.length;

            if(data.texcoords.length > 0) {
                mdl.texcoordBuffer = glfx.gl.createBuffer();
                glfx.gl.bindBuffer(glfx.gl.ARRAY_BUFFER, mdl.texcoordBuffer);
                glfx.gl.bufferData(glfx.gl.ARRAY_BUFFER, new Float32Array(data.texcoords), glfx.gl.STATIC_DRAW);
                mdl.texcoordBuffer.itemSize = 2;
                mdl.texcoordBuffer.numItems = data.texcoords.length / 2;
            }

            if(data.normals.length > 0) {
                mdl.normalBuffer = glfx.gl.createBuffer();
                glfx.gl.bindBuffer(glfx.gl.ARRAY_BUFFER, mdl.normalBuffer);
                glfx.gl.bufferData(glfx.gl.ARRAY_BUFFER, new Float32Array(data.normals), glfx.gl.STATIC_DRAW);
                mdl.normalBuffer.itemSize = 3;
                mdl.normalBuffer.numItems = data.normals.length / 3;
            }

            glfx.models.buffer[_name] = mdl;

            glfx.incAssetRef();
        }
    };

    xmlhttp.open("GET", _url, true);
    xmlhttp.send();
};


// Scene class
glfx.scene = { };
// Scene last render time
glfx.scene.ptime = 0;
// Model-View matrix
glfx.scene.matModelView = null;
// Perspective matrix
glfx.scene.matPerspective = null;
// Scene graph
glfx.scene.graph = [];

// Class for scene (world) objects
// _base = object with vertex buffer, index buffer, texture coordinate buffer, etc.
glfx.scene.worldObject = function(_base, _shaderProgram) {
    this.base = _base;
    this.shprog = _shaderProgram;
    this.position = vec3.create();
    this.rotation = vec3.create();
    this.scale = vec3.create([1.0, 1.0, 1.0]);
    this.update = function() { };
};

// method to add an object to the scene graph
glfx.scene.addWorldObject = function(_wo) {
    glfx.scene.graph.push(_wo);
};

// set field of view
glfx.setFOV = function(_fov) {
    mat4.perspective(_fov, glfx.gl.viewportWidth / glfx.gl.viewportHeight, 0.1, 100.0, glfx.scene.matPerspective);
};

// set clear color
glfx.setClearColor = function(_color) {
    glfx.gl.clearColor(_color[0], _color[1], _color[2], _color[3]);
};

// Initialization function
// _canvas = DOM canvas element
// _onInitComplete (optional) = callback after init is complete
glfx.init = function(_canvas, _onInitComplete) {

    glfx.gl = _canvas.getContext("experimental-webgl", {antialias: true});
    if (!glfx.gl) {
        glfx.echo("No WebGL support.");
        return false;
    }

    // Set viewport width, height based on dimensions of canvas element
    glfx.gl.viewportWidth = _canvas.width;
    glfx.gl.viewportHeight = _canvas.height;

    // Set clear color
    glfx.setClearColor([1, 1, 1, 1]);

    // Enable depth buffer
    glfx.gl.enable(glfx.gl.DEPTH_TEST);

    // Setup scene matrices
    glfx.scene.matPerspective = mat4.create();
    glfx.scene.matModelView = mat4.create();
    glfx.setFOV(90);

    // Reset render target
    glfx.gl.bindTexture(glfx.gl.TEXTURE_2D, null);
    glfx.gl.bindRenderbuffer(glfx.gl.RENDERBUFFER, null);
    glfx.gl.bindFramebuffer(glfx.gl.FRAMEBUFFER, null);

    // Execute callback if one was passed
    if(typeof _onInitComplete !== 'undefined') {
        _onInitComplete();
    }

    // Begin rendering
    glfx.render(0);

    return true;
};

// Render loop function
glfx.render = function(time) {

    requestAnimationFrame(glfx.render);

    if(glfx.assetRef < 0) {
        return;
    }

    // Reset framebuffer
    glfx.gl.bindFramebuffer(glfx.gl.FRAMEBUFFER, null);

    // Clear viewport
    glfx.gl.viewport(0, 0, glfx.gl.viewportWidth, glfx.gl.viewportHeight);
    glfx.gl.clear(glfx.gl.COLOR_BUFFER_BIT | glfx.gl.DEPTH_BUFFER_BIT);

    // Calculate frame time delta
    var tdelta = 0;
    if(glfx.scene.ptime > 0) {
        tdelta = time - glfx.scene.ptime;
    }
    glfx.scene.ptime = time;

    // Render all models in scene
    for(var i = 0; i < glfx.scene.graph.length; i++) {

        mat4.identity(glfx.scene.matModelView);

        glfx.scene.graph[i].update(tdelta, glfx.scene.graph[i]);
        var objpos = glfx.scene.graph[i].position;
        var objrot = glfx.scene.graph[i].rotation;
        var objscale = glfx.scene.graph[i].scale;

        mat4.scale(glfx.scene.matModelView, objscale);
        mat4.translate(glfx.scene.matModelView, objpos);
        mat4.rotate(glfx.scene.matModelView, objrot[0], [1, 0, 0]);
        mat4.rotate(glfx.scene.matModelView, objrot[1], [0, 1, 0]);
        mat4.rotate(glfx.scene.matModelView, objrot[2], [0, 0, 1]);

        glfx.scene.graph[i].render(tdelta, glfx.scene.graph[i], glfx.scene.matModelView, glfx.scene.matPerspective);
    }
};

Initializing glfx

Initializing glfx simply involves calling the glfx.init() function with the canvas element that’s going to be used to render on.

var canvasElem = document.getElementById('wgl-canvas');
glfx.init(canvasElem);

This will set up the rendering interface, which will begin rendering frames, but as there is nothing in the scene, only a clear is done when a frame is rendered. The clear color is set to white (1,1,1,1) and the field of view to 90° by default; these can be changed with the glfx.setClearColor() and glfx.setFOV() methods, respectively.

Loading assets

Assets (shaders, textures, and models) are loaded asynchronously via AJAX requests. As there may be dependencies on multiple assets for rendering and scene creation, a simple semaphore is used, glfx.assetRef.

  • glfx.assetRef is decremented when a new request for an asset is issued and incremented once the AJAX call succeeds and the asset has been created.
  • When glfx.assetRef < 0, it indicates a pending asset for the scene and no rendering is done.
  • A callback can be scheduled for when glfx.assetRef = 0 (i.e. all pending assets loaded) via the glfx.whenAssetsLoaded() method.

// Load basic shaders for rendering
glfx.shaders.load('basic.vs', "vert-shader-basic", glfx.gl.VERTEX_SHADER);
glfx.shaders.load('basictex.fs', "frag-shader-tex", glfx.gl.FRAGMENT_SHADER);

// Load necessary textures
glfx.textures.load('img/test.png', 'test-tex');

// Load models used in scene
glfx.models.load('cube.json', 'cubemdl', glfx.models.jsonParser);
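The reference-counting behavior behind these calls can be reduced to a standalone sketch. This is illustrative only; it mirrors the logic of glfx.decAssetRef, glfx.incAssetRef, and glfx.whenAssetsLoaded rather than reusing them.

```javascript
// Minimal asset reference counter, mirroring glfx's semaphore:
// decrement when a request is issued, increment on completion,
// and fire the scheduled callback when the count returns to zero.
function makeAssetTracker() {
    var ref = 0;
    var onLoaded = function() { };
    return {
        begin: function() { ref--; },            // asset request issued
        end: function() {                        // asset created
            if (++ref === 0) {
                var cb = onLoaded;
                onLoaded = function() { };       // reset, as glfx does
                cb();
            }
        },
        pending: function() { return ref < 0; }, // rendering skipped while true
        whenLoaded: function(cb) {
            if (ref === 0) { cb(); } else { onLoaded = cb; }
        }
    };
}
```

A count below zero means at least one request is still outstanding, which is exactly the condition glfx.render() checks before drawing a frame.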

Note that all the asset load methods take a URL as the first argument and a name as the second. The name is an identifier by which to look up the asset in the buffer it’s stored in. Also, glfx.models.jsonParser is the only model parser available; it loads models corresponding to the JSON data produced by my Wavefront OBJ to JSON converter.
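For reference, here is the shape of model data the loader consumes, as inferred from the parsing code in glfx.models.load: flat arrays of vertex positions, triangle indices, texture coordinates, and normals. The quad below is illustrative sample data, not actual converter output.

```javascript
// A single quad in the JSON shape glfx.models.load expects.
const quad = {
    verts:     [-1, -1, 0,   1, -1, 0,   1, 1, 0,   -1, 1, 0], // xyz per vertex
    indices:   [0, 1, 2,   0, 2, 3],                           // two triangles
    texcoords: [0, 0,   1, 0,   1, 1,   0, 1],                 // uv per vertex
    normals:   [0, 0, 1,   0, 0, 1,   0, 0, 1,   0, 0, 1]      // one normal each
};

// Buffer bookkeeping, derived the same way glfx.models.load does it:
const numVerts   = quad.verts.length / 3;     // itemSize 3
const numIndices = quad.indices.length;       // itemSize 1
const numUVs     = quad.texcoords.length / 2; // itemSize 2
```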

Building a scene

After assets are loaded, we can create shader programs and world objects, then add them to the scene.

glfx.whenAssetsLoaded(function() {

    // Create shader program from loaded shaders
    var shprog = glfx.gl.createProgram();
    glfx.gl.attachShader(shprog, glfx.shaders.buffer['vert-shader-basic']);
    glfx.gl.attachShader(shprog, glfx.shaders.buffer['frag-shader-tex']);
    glfx.gl.linkProgram(shprog);

    if (!glfx.gl.getProgramParameter(shprog, glfx.gl.LINK_STATUS)) {
        alert("Could not create shader program");
        return false;
    }

    // Setup variables for shader program
    shprog.vertexPositionAttribute = glfx.gl.getAttribLocation(shprog, "aVertexPosition");
    glfx.gl.enableVertexAttribArray(shprog.vertexPositionAttribute);

    shprog.pMatrixUniform = glfx.gl.getUniformLocation(shprog, "uPMatrix");
    shprog.mvMatrixUniform = glfx.gl.getUniformLocation(shprog, "uMVMatrix");
    shprog.samplerUniform = glfx.gl.getUniformLocation(shprog, "uSampler");

    shprog.textureCoordAttribute = glfx.gl.getAttribLocation(shprog, "aTextureCoord");
    glfx.gl.enableVertexAttribArray(shprog.textureCoordAttribute);


    // Add some cubes to the scene graph
    var cubeA = new glfx.scene.worldObject(glfx.models.buffer['cubemdl'], shprog);
    cubeA.position = vec3.create([-1.6, 0.0, -25.0]);
    cubeA.rotation = vec3.create([0.0, 0.0, 0.0]);
    cubeA.scale = vec3.create([0.70, 1.0, 1.0]);
    cubeA.render = function(tdelta, wobj, matModelView, matPerspective) {
        // Setup shader program to use
        var shprog = wobj.shprog;
        glfx.gl.useProgram(shprog);

        // Bind texture to texture unit 0
        var tex = glfx.textures.buffer['test-tex'];
        glfx.gl.activeTexture(glfx.gl.TEXTURE0);
        glfx.gl.bindTexture(glfx.gl.TEXTURE_2D, tex);
        glfx.gl.uniform1i(shprog.samplerUniform, 0);

        // Bind vertex and texture coordinate buffers
        glfx.gl.bindBuffer(glfx.gl.ARRAY_BUFFER, wobj.base.vertexBuffer);
        glfx.gl.vertexAttribPointer(shprog.vertexPositionAttribute, wobj.base.vertexBuffer.itemSize, glfx.gl.FLOAT, false, 0, 0);

        glfx.gl.bindBuffer(glfx.gl.ARRAY_BUFFER, wobj.base.texcoordBuffer);
        glfx.gl.vertexAttribPointer(shprog.textureCoordAttribute, wobj.base.texcoordBuffer.itemSize, glfx.gl.FLOAT, false, 0, 0);

        // Pass matrices and draw the indexed triangles
        glfx.gl.uniformMatrix4fv(shprog.pMatrixUniform, false, matPerspective);
        glfx.gl.uniformMatrix4fv(shprog.mvMatrixUniform, false, matModelView);

        glfx.gl.bindBuffer(glfx.gl.ELEMENT_ARRAY_BUFFER, wobj.base.indexBuffer);
        glfx.gl.drawElements(glfx.gl.TRIANGLES, wobj.base.indexBuffer.numItems, glfx.gl.UNSIGNED_SHORT, 0);
    };

    cubeA.update = function(tdelta, wobj) {

        // Move cubeA toward the viewer, then hold at z = -5.0
        if(wobj.position[2] < -5.0) {
            wobj.position[2] += 0.022 * tdelta;
        }
        else {
            wobj.position[2] = -5.0;
        }

        // Spin about the y-axis, wrapping after a full revolution
        wobj.rotation[0] = 0.35;
        wobj.rotation[1] += -(75 * tdelta) / 50000.0;
        if( Math.abs(wobj.rotation[1]) >= 2.0*Math.PI ) {
            wobj.rotation[1] = 0.0;
        }
    };
    glfx.scene.addWorldObject( cubeA );


    // Add another cube to the scene
    var cubeB = new glfx.scene.worldObject(glfx.models.buffer['cubemdl'], shprog);
    cubeB.position = vec3.create([1.6, 0.0, -25.0]);
    cubeB.rotation = vec3.create([0.0, 0.0, 0.0]);
    cubeB.scale = vec3.create([0.70, 1.0, 1.0]);
    cubeB.update = function(tdelta, wobj) {
        // Move cubeB in once cubeA is most of the way in, then hold at z = -5.0
        if(cubeA.position[2] > -15.0) {
            if(wobj.position[2] < -5.0) {
                wobj.position[2] += 0.022 * tdelta;
            }
            else {
                wobj.position[2] = -5.0;
            }
        }

        // Spin about the y-axis, wrapping after a full revolution
        wobj.rotation[0] = 0.35;
        wobj.rotation[1] += -(75 * tdelta) / 50000.0;
        if( Math.abs(wobj.rotation[1]) >= 2.0*Math.PI ) {
            wobj.rotation[1] = 0.0;
        }
    };

    cubeB.render = function(tdelta, wobj, matModelView, matPerspective) {
        // Setup shader program to use
        var shprog = wobj.shprog;
        glfx.gl.useProgram(shprog);

        // Bind texture to texture unit 0
        var tex = glfx.textures.buffer['test-tex'];
        glfx.gl.activeTexture(glfx.gl.TEXTURE0);
        glfx.gl.bindTexture(glfx.gl.TEXTURE_2D, tex);
        glfx.gl.uniform1i(shprog.samplerUniform, 0);

        // Bind vertex and texture coordinate buffers
        glfx.gl.bindBuffer(glfx.gl.ARRAY_BUFFER, wobj.base.vertexBuffer);
        glfx.gl.vertexAttribPointer(shprog.vertexPositionAttribute, wobj.base.vertexBuffer.itemSize, glfx.gl.FLOAT, false, 0, 0);

        glfx.gl.bindBuffer(glfx.gl.ARRAY_BUFFER, wobj.base.texcoordBuffer);
        glfx.gl.vertexAttribPointer(shprog.textureCoordAttribute, wobj.base.texcoordBuffer.itemSize, glfx.gl.FLOAT, false, 0, 0);

        // Pass matrices and draw the indexed triangles
        glfx.gl.uniformMatrix4fv(shprog.pMatrixUniform, false, matPerspective);
        glfx.gl.uniformMatrix4fv(shprog.mvMatrixUniform, false, matModelView);

        glfx.gl.bindBuffer(glfx.gl.ELEMENT_ARRAY_BUFFER, wobj.base.indexBuffer);
        glfx.gl.drawElements(glfx.gl.TRIANGLES, wobj.base.indexBuffer.numItems, glfx.gl.UNSIGNED_SHORT, 0);
    };

    glfx.scene.addWorldObject( cubeB );

});

Shaders for programs are pulled from the glfx.shaders.buffer[] associative array, referenced by the name specified when they were loaded.

Once we have a shader program and a model, we can create items for the scene by constructing glfx.scene.worldObject objects:

  • Construct the glfx.scene.worldObject object by specifying a model from the glfx.models.buffer[] associative array and the shader program as arguments to the constructor.
  • The worldObject.position, worldObject.rotation, and worldObject.scale vectors can be set as desired.
  • The worldObject.update() method can be overridden to describe how to manipulate the object in each frame.
  • The worldObject.render() method can be overridden to render the objects making use of the underlying buffers in worldObject.base: worldObject.base.indexBuffer, worldObject.base.vertexBuffer, worldObject.base.normalBuffer, worldObject.base.texcoordBuffer, as well as textures from the glfx.textures.buffer[] associative array.
  • Note that transformation on the model-view matrix (matModelView) is done within glfx.render() and should not be done within worldObject.render().
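The update contract can be exercised in isolation. The sketch below drives a plain stand-in object with the same per-frame logic as cubeA.update; it is illustrative and independent of WebGL.

```javascript
// A plain object standing in for a worldObject; only the fields the
// update logic touches are present.
var wobj = { position: [-1.6, 0.0, -25.0], rotation: [0.0, 0.0, 0.0] };

// Per-frame update, mirroring cubeA.update: slide toward the viewer
// until z reaches -5.0, and spin about y, wrapping after a revolution.
function update(tdelta, wobj) {
    if (wobj.position[2] < -5.0) {
        wobj.position[2] += 0.022 * tdelta;
    } else {
        wobj.position[2] = -5.0;
    }

    wobj.rotation[0] = 0.35;
    wobj.rotation[1] += -(75 * tdelta) / 50000.0;
    if (Math.abs(wobj.rotation[1]) >= 2.0 * Math.PI) {
        wobj.rotation[1] = 0.0;
    }
}

// Simulate ten seconds of ~60fps frames (tdelta in milliseconds,
// as supplied by glfx.render).
for (var i = 0; i < 600; i++) {
    update(16.7, wobj);
}
```

After enough frames the object settles at z = -5.0 while the y-rotation keeps cycling within a single revolution, which is exactly the motion seen in the demo.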

This callback is not ideal. I’m exposing a lot of rendering code that would best be abstracted away to glfx. However, without a strict definition of how a model should be textured or what variables are to be passed over to the vertex and fragment shaders, abstracting further is premature.