
0.1 → 2.0 Migration Guide

Spark 2.0 was designed with backward compatibility in mind. We expect most 0.1 apps to continue to work with minimal changes. Here we list some important considerations.

Update Spark dependencies

  • Install the latest version using NPM: npm install @sparkjsdev/spark@2.0.0-preview, or update your project's package.json to the latest Spark 2.0 release: "@sparkjsdev/spark": "2.0.0-preview".

  • The minimum THREE.js version is r179 in order to support 2D array textures when rendering to multiple targets (THREE.js #31476). Make sure your project's package.json includes THREE.js using "three": "0.179.0" or higher.

      "dependencies": {
        "three": "0.180.0",
        "@sparkjsdev/spark": "2.0.0-preview"
      },
    

  • If you're linking directly to a hosted SparkJS build, update your importmap to use the new build:

    <script type="importmap">
      {
        "imports": {
          "three": "https://cdnjs.cloudflare.com/ajax/libs/three.js/0.180.0/three.module.js",
          "@sparkjsdev/spark": "https://sparkjs.dev/releases/spark/experimental/2.0.0/spark.module.js"
        }
      }
    </script>
    
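Once the importmap is in place, a module script can import both packages by their bare specifiers. A minimal sketch (the renderer setup is illustrative):

<script type="module">
  import * as THREE from "three";
  import { SparkRenderer } from "@sparkjsdev/spark";

  // Both specifiers resolve through the importmap above.
  const renderer = new THREE.WebGLRenderer();
  const spark = new SparkRenderer({ renderer });
</script>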

Multiple viewpoints and renderers

In Spark 0.1 you created multiple viewpoints from a single SparkRenderer instance by calling .newViewpoint() on it, which returned a SparkViewpoint. This design stored one global set of RGBA splats plus N splat rendering orders, one per viewpoint, for correct back-to-front rendering and blending. It made it impossible to have differently colored splats (for example, RGBA splats alongside depth-colored ones), to render different sets of splats with different shader effects applied, or to compute different directional lighting from spherical harmonics at different vantage points.

Spark 2.0 generalizes this model by allowing N separate SparkRenderers, each with its own set of splats and sort order. The rendering loop of each renderer is also independent, so the main canvas-attached renderer can update the screen continuously while other renderers run on demand, one frame at a time for video frame bake-outs, or continuously every frame for multiple updating render panes. Since additional renderers are triggered explicitly, you can adjust scene parameters, splats, or shader effects before and after each render to achieve custom outputs per renderer.

If your code used to say:

const viewpoint = spark.newViewpoint({
  target: { width, height, doubleBuffer: true },
  onTextureUpdated: (texture) => {
    useRenderedViewpoint(texture);
  },
});

renderer.setAnimationLoop(() => {
  viewpoint.renderTarget({ scene, camera: viewCamera });
  renderer.render(scene, camera);
});

With Spark 2.0 it would be:

const viewpoint = new SparkRenderer({
  renderer,
  target: { width, height, doubleBuffer: true },
});

renderer.setAnimationLoop(() => {
  viewpoint.renderTarget({ scene, camera: viewCamera });
  useRenderedViewpoint(viewpoint.target.texture);
  renderer.render(scene, camera);
});

Splat sorting

Spark 0.1 sorting was specified per-viewpoint, with the default canvas viewpoint set using new SparkRenderer({ renderer, view: { sortRadial: false } }) or adjusted with sparkRenderer.defaultView.sortRadial = false.

Spark 2.0 viewpoints are 1-to-1 with a SparkRenderer, so the sort options are set directly on the renderer with new SparkRenderer({ renderer, sortRadial: false }) or sparkRenderer.sortRadial = false.

Most Spark 0.1 sorting options have been made obsolete; the only remaining option is sortRadial, where true sorts by distance from the viewpoint origin and false sorts by Z-depth. Sorting by Z-depth is more common and is what most splat optimization techniques use. However, radial sorting has the advantage that it is stable under viewpoint rotation and keeps all splats around the viewer, including those behind. When turning quickly, most splat renderers with Z-depth sorting show black bars at the periphery, while Spark with radial sort avoids this problem.
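For example, the two ways to select Z-depth sorting shown above, as a minimal sketch:

// Construct a renderer that sorts by Z-depth instead of radial distance.
const spark = new SparkRenderer({ renderer, sortRadial: false });

// The same option can be toggled at runtime.
spark.sortRadial = true;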

Spark 0.1 had an option to sort with 16-bit or 32-bit precision with sort32, trading off speed for precision. To simplify the logic, increase the quality, and take advantage of the efficiency of rendering to multiple targets at once, Spark 2.0 only supports 32-bit sorting. The option sort360 allowed you to keep splats 360 degrees around you, but this is always enabled in Spark 2.0 when sortRadial is true.

Large scene support

In Spark 0.1, for large scenes we recommended creating a SparkRenderer and adding it as a child of your camera in the scene graph, so that splats near the camera would be rendered with higher precision, avoiding quantization artifacts when the camera is far from the origin.

Spark 2.0 obviates the need for this in two ways: 1) splats are automatically rendered relative to the render camera, so the SparkRenderer can be added anywhere in the scene, and 2) you can optionally enable ExtSplats encoding for intermediate accumulator values with the new option new SparkRenderer({ renderer, accumExtSplats: true }), though this is generally not necessary because of 1).
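A minimal sketch of the 2.0 setup for a large scene (the accumExtSplats flag is optional, as noted above):

// No need to parent the SparkRenderer to the camera; splats are
// automatically rendered relative to the render camera.
const spark = new SparkRenderer({
  renderer,
  accumExtSplats: true, // optional extra precision for accumulator values
});
scene.add(spark);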

SparkRenderer.getRgba() / .readRgba()

Spark 0.1 provided utility functions to extract the RGBA values of the intermediate rendered splats (including the results of shader effects). In Spark 2.0 these values can be generated directly from the SplatMesh's generator:

import { RgbaArray, dyno } from "@sparkjsdev/spark";

const rgba = new RgbaArray();
rgba.render({
  renderer,
  count: splatMesh.splats.getNumSplats(),
  // Evaluate the mesh's splat generator at each index and extract the
  // final RGBA value from the resulting Gsplat.
  reader: dyno.dynoBlock({ index: "int" }, { rgba8: "vec4" }, ({ index }) => {
    const { gsplat } = splatMesh.generator.apply({ index });
    return { rgba8: dyno.splitGsplat(gsplat).outputs.rgba };
  }),
});

Stochastic sort-free rendering

Spark 0.1 had an experimental "sort-free" stochastic rendering mode, using randomness and Z-buffering to render splats without needing to sort. This has been deprecated in Spark 2.0.

Splat Texture

Spark 0.1 had the ability to render each splat as an RGBA texture, allowing arbitrary splat profiles, including shapes and non-Gaussian curves. This has been deprecated in Spark 2.0.

VRButton

VRButton has been replaced by SparkXr in Spark 2.0. To migrate to SparkXr, follow the Basic XR example.

PackedSplat SplatEncoding

Spark 0.1 introduced the SplatEncoding interface with min/max ranges for RGB values, scale, and min/max SH1/2/3 ranges. In Spark 2.0 we removed the SH1/2/3 min values and encode these values symmetrically using the SH1/2/3 max. We also added the option lodOpacity: boolean, which controls whether the extended 0..2 opacity encoding is used in order to represent more solid LoD splats.
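As an illustration of the shape of a 2.0-style encoding (all field names here except lodOpacity are hypothetical placeholders, not the actual interface):

// Hypothetical sketch: field names other than lodOpacity are placeholders.
const encoding = {
  rgbMin: 0.0, rgbMax: 1.0,               // min/max range for RGB values
  scaleMax: 2.0,                          // scale range
  sh1Max: 1.0, sh2Max: 0.5, sh3Max: 0.25, // symmetric SH ranges (no min values in 2.0)
  lodOpacity: true,                       // extended 0..2 opacity for solid LoD splats
};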

SOGS file support

Spark 2.0 supports SOGS files via a .zip file containing manifest.json and the referenced .webp images, and also via the new .sog extension. However, the new Spark 2.0 file loaders don't support referencing manifest.json via a remote URL and fetching the images at relative URLs. Please download and ZIP the files together and reference that archive directly in Spark 2.0.
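For example, loading a bundled archive through the SplatMesh URL loader (a sketch; the file path is illustrative):

// Reference the bundled SOGS .zip (or .sog) file directly by URL.
const splats = new SplatMesh({ url: "./assets/scene.sog" });
scene.add(splats);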

Temporary fallback: OldSparkRenderer

If all else fails, the original SparkRenderer class from Spark 0.1 has been renamed to OldSparkRenderer (and other classes have been renamed similarly, such as OldSparkViewpoint). Rename your new SparkRenderer() calls to new OldSparkRenderer(), and make sure to explicitly add your OldSparkRenderer to your scene, because only the new SparkRenderer is automatically injected.
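A minimal sketch of the fallback:

// Unlike the new SparkRenderer, the old renderer is not auto-injected,
// so it must be added to the scene explicitly.
const spark = new OldSparkRenderer({ renderer });
scene.add(spark);

// 0.1-style viewpoints still hang off the renderer instance.
const viewpoint = spark.newViewpoint({
  target: { width, height, doubleBuffer: true },
});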

We expect to unwind and remove this support over time and hope you will be able to migrate to the new renderer!

Need more help?

Join the Spark Discord to connect with other users and developers.