The Nodal Scene Interface (NSI) is a simple yet expressive API for describing a scene to a renderer. From geometry declaration to instancing, attribute inheritance and shader assignment, everything fits in 12 API calls. The following subsections demonstrate how to achieve the most common manipulations.

Geometry Creation

Creating geometry nodes is simple. The content of each node is filled using the NSISetAttribute call. 

/**
	Polygonal meshes can be created minimally by specifying "P".
	NSI's C++ API provides an easy interface to pass parameters to all
	NSI API calls through the ArgumentList class.
*/
const char *k_poly_handle = "simple polygon"; /* avoids typos */

nsi.Create( k_poly_handle, "mesh" );

NSI::ArgumentList mesh_args;
float points[3*4] = { -1, 1, 0,  1, 1, 0, 1, -1, 0, -1, -1, 0 };
mesh_args.Add(
	NSI::Argument::New( "P" )
		->SetType( NSITypePoint )
		->SetCount( 4 )
		->SetValuePointer( points ) );
nsi.SetAttribute( k_poly_handle, mesh_args );


Specifying normals and texture coordinates follows the same logic. Constant attributes can be declared in a more concise form:

/** Turn our mesh into a subdivision surface */
nsi.SetAttribute( k_poly_handle,
	NSI::CStringPArg("subdivision.scheme", "catmull-clark") );
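For example, per-vertex normals and UV coordinates can be passed alongside "P" using the same ArgumentList pattern. The sketch below assumes the conventional "N" and "st" attribute names; treat them as illustrative rather than exhaustive.

NSI::ArgumentList extra_args;
float normals[3*4] = { 0,0,1,  0,0,1,  0,0,1,  0,0,1 };
float uvs[2*4] = { 0,1,  1,1,  1,0,  0,0 };
extra_args.Add(
	NSI::Argument::New( "N" )
		->SetType( NSITypeNormal )
		->SetCount( 4 )
		->SetValuePointer( normals ) );
extra_args.Add(
	NSI::Argument::New( "st" )
		->SetArrayType( NSITypeFloat, 2 ) /* each value is a float pair */
		->SetCount( 4 )
		->SetValuePointer( uvs ) );
nsi.SetAttribute( k_poly_handle, extra_args );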

Transforming Geometry

In NSI, geometry is rendered only if it is connected to the scene's root (which has the special handle ".root"). It is possible to connect a geometry node (such as the simple polygon above) directly to the scene's root, but that wouldn't be very useful. To place or instance geometry anywhere in the 3D world, a transform node is used, as in the code snippet below.

const char *k_instance1 = "translation";

nsi.Create( k_instance1, "transform" );
nsi.Connect( k_instance1, "", NSI_SCENE_ROOT, "objects" );
nsi.Connect( k_poly_handle, "", k_instance1, "objects" );

/*
	Matrices in NSI are in double format to allow for greater
	range and precision.
*/
double trs[16] =
{
	1., 0., 0., 0.,
	0., 1., 0., 0.,
	0., 0., 1., 0.,
	0., 1., 0., 1. /* translate 1 unit in Y */
};

nsi.SetAttribute( k_instance1,
	NSI::DoubleMatrixArg("transformationmatrix", trs) );

Instancing is as simple as connecting the same geometry to different transforms (yes, instances of instances are supported).

const char *k_instance2 = "more translation";
trs[13] += 1.0; /* translate in Y+ */

nsi.Create( k_instance2, "transform" );
nsi.Connect( k_poly_handle, "", k_instance2, "objects" );
nsi.Connect( k_instance2, "", NSI_SCENE_ROOT, "objects" );

/* We now have two instances of the same polygon in the scene */
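Since transforms are nodes like any other, they can themselves be connected under other transforms, which is what makes instances of instances possible. A sketch, with an illustrative handle name:

const char *k_group = "group"; /* a parent transform; name illustrative */

nsi.Create( k_group, "transform" );
nsi.Connect( k_group, "", NSI_SCENE_ROOT, "objects" );

/*
	Also connect the first translation transform under the group. It is
	now visible twice: once through its direct connection to the root,
	and once through the group, whose own matrix applies on top.
*/
nsi.Connect( k_instance1, "", k_group, "objects" );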

Assigning Shaders

Shaders are created like any other node, using the NSICreate API call. They are not assigned directly to geometry but through an intermediate attributes node. This extra indirection allows for more flexible export, as we will see in the following chapters.

/**
	Create a simple shader node using the standard OSL "emitter" shader.
	Set its parameters to something other than the defaults.
*/
nsi.Create( "simpleshader", "shader" );
float red[3] = {1,0,0};
nsi.SetAttribute( "simpleshader",
	(
		NSI::CStringPArg("shaderfilename", "emitter"),
		NSI::ColorArg( "Cs", red),
		NSI::FloatArg( "power", 4.f )
	) );

/** Create an attributes node and connect our shader to it */
nsi.Create( "attr", "attributes" );
nsi.Connect( "simpleshader", "", "attr", "surfaceshader" );

/* Connecting the attributes node to the mesh completes the assignment */
nsi.Connect( "attr", "", k_poly_handle, "geometryattributes" );

Creating shading networks uses the same NSIConnect call as the rest of the scene description.

/** We can inline OSL source code directly */
const char *sourcecode = "shader uv() { Ci = emission() * color(u, v, 0); } ";
nsi.Create( "uvshader", "shader" );
nsi.SetAttribute( "uvshader", NSI::CStringPArg("shadersource", sourcecode) );

/** We can now connect our new shader node into our simple emitter */
nsi.Connect( "uvshader", "Ci", "simpleshader", "Cs" );


Multi-Camera Output in One Render