Natural Neighbor is an interpolation scheme suitable for scattered data. It is based on a weighted-average approach and uses the Voronoi diagram of the known data points to determine their relative contributions. The weight of each known data point is defined as the ratio of the area it loses ("stolen" from its Voronoi cell) when the interpolated point is inserted into the diagram, to the total area assigned to the inserted point.
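Stated a bit more formally (this is just a restatement of the definition above, with x denoting the inserted point): if inserting x creates a new Voronoi cell with total area A(x), and A_i(x) denotes the part of that cell taken from the cell of data point p_i, then the weight of p_i is w_i(x) = A_i(x) / A(x), and the interpolated value is f(x) = Σ_i w_i(x) · f(p_i).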
The following picture contains a volumetric rendering of the 3D Voronoi diagram for a set of 4 points in a 100³ cube*. The volumes marked yellow, blue, green and orange indicate the regions whose nearest data point is, respectively, point 1, 2, 3 and 4.
Adding one more point to the Voronoi diagram creates a new region (marked pink in the picture below). This new region covers the fragment of space that is closer to the new point than to any of the original four points.
You can notice that a large chunk of the blue region and smaller bits of the green and orange regions have been “stolen” by the pink one. This means that point 2 will have the biggest weight, whereas points 3 and 4 will contribute less to the final interpolated value at the inserted point.
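As a concrete illustration with made-up numbers (the real areas depend on the exact point positions in the figure): if the pink cell took 60 units of volume from the blue cell, 25 from the green one and 15 from the orange one, the value interpolated at the inserted point would be (60·v2 + 25·v3 + 15·v4) / 100, where v2, v3 and v4 are the values attached to points 2, 3 and 4.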
To interpolate the whole volume, the procedure above needs to be repeated for all 100³ points in the volume (or more if we increase the resolution). This process can be very time consuming, depending on the final number of evaluations and the number of known data points.
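To put a rough number on that: the 100³ grid above already amounts to 100 × 100 × 100 = 1,000,000 evaluations, each of which requires inserting a point into the Voronoi diagram and measuring the areas it steals from the neighboring cells.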
In 2006, in the paper “Discrete Sibson Interpolation”, Sung W. Park and co-authors outlined a method for approximate natural neighbor interpolation on a pixel/voxel grid using rasterization rather than an analytical approach.
Their idea boils down to iterating over all points of a 2D/3D/N-d grid and determining, for each of them, the closest known data point (i.e. the Voronoi cell in which the grid point lies, in the static Voronoi diagram defined by the known data points). The distance to that data point becomes the radius of a hypersphere centered at the grid point; this hypersphere contains exactly the grid points that receive a contribution from that data point via this location. The set of all grid points thus defines many overlapping hyperspheres. For each output point we accumulate the values of the corresponding data points and count the number of times the output point is contained in any hypersphere. After all grid points have been evaluated, the expression (accumulated value) / (hypersphere count) approximates, at each output point, the value produced by the natural neighbor algorithm.
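Here is a rough, unoptimized CPU sketch of that scatter scheme. It is neither the paper's GPU implementation nor code from the repository below; the function name, its signature and the use of scipy's cKDTree (instead of the repository's OpenCL KD-Tree) are assumptions made purely for illustration.

# Minimal CPU sketch of discrete Sibson interpolation on a 3-D grid.
import numpy as np
from scipy.spatial import cKDTree

def discrete_sibson(pts, values, res=64):
    """pts: (n, 3) coordinates in [0, 1); values: n data values."""
    values = np.asarray(values, dtype=np.float64)
    tree = cKDTree(pts * res)                     # work in voxel units
    grid = np.indices((res, res, res)).reshape(3, -1).T.astype(np.float64)
    dist, idx = tree.query(grid)                  # nearest data point per voxel
    accum = np.zeros((res, res, res))
    cnt = np.zeros((res, res, res))
    for (x, y, z), r, i in zip(grid.astype(int), dist, idx):
        # Rasterize the sphere of radius r centered at this voxel and scatter
        # the value of its nearest data point to every voxel inside it.
        ri = int(np.ceil(r))
        xs = np.arange(max(x - ri, 0), min(x + ri + 1, res))
        ys = np.arange(max(y - ri, 0), min(y + ri + 1, res))
        zs = np.arange(max(z - ri, 0), min(z + ri + 1, res))
        X, Y, Z = np.meshgrid(xs, ys, zs, indexing='ij')
        inside = (X - x) ** 2 + (Y - y) ** 2 + (Z - z) ** 2 <= r * r
        accum[X[inside], Y[inside], Z[inside]] += values[i]
        cnt[X[inside], Y[inside], Z[inside]] += 1
    return accum / cnt, accum, cnt                # every voxel lies in its own sphere, so cnt >= 1

This plain-Python loop is far too slow to be practical at 100³, which is exactly why the repository versions push both the nearest-neighbor queries and the sphere rasterization to OpenCL.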
The exact mechanics of this approach are formally described in the paper. Intuitively speaking, each increment of the counter means one more “stolen” pixel (from whichever Voronoi cell). Why? By the definition of the sphere with that radius: every point contained in the hypersphere, if added to the diagram, would be closer to (or at the same distance from) the output point than its original nearest data point is. Effectively, by rasterizing a hypersphere centered at a given output point we are saying: look, all these rastered points would steal this output point from its original Voronoi cell. Therefore we increment their “theft” counters by 1 and add the corresponding data value to their value accumulators.
On the other hand, no point outside the hypersphere would steal that particular output point, because, again by definition, it lies further from the output point than the original data point does. The whole concept is illustrated in the figure below. It is a great observation and essentially amounts to a somewhat implicit construction of a discrete Voronoi diagram, with data values weighted by the relative number of stolen pixels.
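To make the counting argument concrete, here is a toy 1-D version of the scheme with made-up data: two known points, value 0 at x = 0 and value 100 at x = 10, on an 11-cell grid. In 1-D the “spheres” are just intervals:

# Toy 1-D discrete Sibson example with made-up data.
import numpy as np

data_x = np.array([0, 10])
data_v = np.array([0.0, 100.0])
accum = np.zeros(11)
cnt = np.zeros(11)
for x in range(11):
    d = np.abs(data_x - x)
    i = d.argmin()                 # nearest data point of cell x
    r = d[i]                       # distance to it = "sphere" radius
    lo, hi = max(x - r, 0), min(x + r, 10)
    accum[lo:hi + 1] += data_v[i]  # cells in [x - r, x + r] would steal cell x
    cnt[lo:hi + 1] += 1
print(accum / cnt)                 # a monotone ramp between the two data values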
The performance of this approach stems from the fact that, for a large number of data points, analytical construction of the Voronoi diagram becomes increasingly complex, so a voxel-wise approximation can help. At the same time, as the density of data points increases, the typical hypersphere radius decreases, which further reduces the sphere rasterization work. Furthermore, this algorithm can be implemented on GPU hardware (e.g. the authors discuss their shader implementation).
In this ALGOholic entry, I would like to share a couple of variations on the above algorithm. In the repository below you will find:
1) nn3d2.py – a reasonably streamlined OpenCL version that uses an OpenCL-based KD-Tree to compute the nearest neighbor (and its distance) for each voxel, and then uses OpenCL again to rasterize spheres of the resulting radii at all voxels, accumulating values and counts into separate arrays;
2) nn3d.py – a first approach to OpenCL acceleration that aimed to do the nearest neighbor search and the sphere rendering on the OpenCL side in one go, but which proved not to be the preferred way to implement this;
3) nn_discr_ocl.cpp – a completely naive, classical CPU-only implementation that does not even bother with a KD-Tree, which causes a hard performance drop for many data points, and whose rasterization is also slow when data points are few (their spheres are large), so it works tolerably only for a certain range of data point counts. This version is obviously the easiest one to read through in order to understand the algorithm.
In short, the recommended production version is of course nn3d2.py and its example usage could look like this:
from nn3d2 import nn3d
from nibabel import nifti1
import numpy as np

n = 10
pts = np.random.random((n, 3)).astype(np.float32)
values = list(range(n))
(accum, cnt, val, radius) = nn3d(pts, values, res=np.ones((3,), dtype=np.int32) * 64)
nifti1.save(nifti1.Nifti1Image(accum / cnt, np.eye(4)), 'natural.nii')
nifti1.save(nifti1.Nifti1Image(accum, np.eye(4)), 'accum.nii')
nifti1.save(nifti1.Nifti1Image(cnt, np.eye(4)), 'cnt.nii')
nifti1.save(nifti1.Nifti1Image(val, np.eye(4)), 'val.nii')
nifti1.save(nifti1.Nifti1Image(radius, np.eye(4)), 'radius.nii')
The snippet above will run the NN algorithm on a 64x64x64 grid and should produce 5 volumes containing, respectively:
1) natural.nii – the result of natural neighbor interpolation of the points pts with values values;
2) accum.nii – the accumulator value at each voxel;
3) cnt.nii – the counter value at each voxel;
4) val.nii – the value of the nearest input data point for each voxel;
5) radius.nii – the distance to the nearest input data point for each voxel.
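As a quick sanity check on how these volumes relate (this snippet is an addition of mine and simply assumes the files produced above exist), the interpolated volume should be the element-wise ratio of the accumulator and the counter:

import numpy as np
from nibabel import nifti1

accum = nifti1.load('accum.nii').get_fdata()
cnt = nifti1.load('cnt.nii').get_fdata()
natural = nifti1.load('natural.nii').get_fdata()
assert np.allclose(natural, accum / cnt)  # natural = accum / cnt, voxel-wise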
Cross-section images of the final natural neighbor interpolation and its color-mapped 3D volume rendering are presented in the figure below**:
Once again, the algorithm above is a well-performing approximate implementation of the natural neighbor method. It handles interpolation as well as extrapolation of values, although extrapolation is limited to weighting between the values of the outermost input points rather than using gradients to estimate values outside of the hull defined by the input data. Depending on your use case, this might be a useful behavior. You will find the source code in the following GitHub repository as well as in the snapshot below. Please enjoy! 🙂
DOWNLOAD: nn3d-master.zip
* Data (Nifti) generated in MATLAB and rendered using MRIcroGL.
** SPM used to render cross-sections. MRIcroGL used for volume rendering.