In our framework, we first partition a given 3D volumetric dataset into subblocks, called unit blocks, of size 16 x 16 x 16. For each unit block, a three-dimensional wavelet transform is applied twice. We use a simple wavelet based on the Haar basis, whose eight-band filter bank can be expressed as in Figure 2, where the c_ijk (i, j, k = 0, 1) on the right side are the eight coefficients in each 2 x 2 x 2 subregion of a unit block. On the left side, c_lll represents their average, and the remaining coefficients are the detail values corresponding to the filter sequences (for example, c_hlh is obtained by applying the high-pass filter, the low-pass filter, and then the high-pass filter). Two applications of the 3D wavelet transform per unit block are enough: a smaller number of transforms results in faster reconstruction, and after two applications most of the data, 63/64 (= 1 - 1/8^2), have already been decomposed into detail coefficients. The decomposition process converts each unit block into a decomposed version that can be stored in an array with the same number of elements, using a proper ordering of the coefficients (Figure 2).
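Since the filter bank itself is given only in Figure 2, the following is a minimal sketch of a standard two-level, eight-band 3D Haar decomposition consistent with the description above. It is not the authors' code: the function names, the NumPy dependency, and the octant-based packing of subbands are illustrative assumptions (the paper's own coefficient ordering may differ).

```python
import numpy as np

def haar3d_step(block):
    """One eight-band Haar analysis step on a cubic block.

    A 1D Haar pair (low = (even + odd) / 2, high = (even - odd) / 2)
    is applied separably along each axis, so the lll band (the averages
    of the 2x2x2 subregions) lands in the low-index octant and the
    seven detail bands fill the remaining octants.
    """
    e = block.astype(np.float64)
    for axis in range(3):
        even = np.take(e, range(0, e.shape[axis], 2), axis=axis)
        odd = np.take(e, range(1, e.shape[axis], 2), axis=axis)
        e = np.concatenate(((even + odd) / 2, (even - odd) / 2), axis=axis)
    return e

def decompose_unit_block(unit):
    """Two-level decomposition of a 16x16x16 unit block.

    The second application is restricted to the 8x8x8 lll octant,
    so 63/64 of the coefficients end up as detail values.
    """
    out = haar3d_step(unit)
    out[:8, :8, :8] = haar3d_step(out[:8, :8, :8])
    return out
```

In this packing the subbands occupy the octants of the output array, which is one way to realize the "proper ordering of the coefficients" mentioned above.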
A typical wavelet compression algorithm has three basic components: transform, quantization, and encoding. After the transform, the wavelet coefficients are quantized to restrict their values to a limited number of possibilities; note that usually all of the information loss occurs in this stage. The encoding stage then takes the string of symbols coming from the quantizer and attempts to represent the data stream as efficiently as possible without loss. Popular variable-length coders, such as Huffman or arithmetic coders, work well. However, such techniques are not appropriate when an individual data item must be quickly reconstructed in an arbitrary sequence ([5], [18]).
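As a concrete illustration of the quantization stage, the sketch below applies a uniform scalar quantizer with step size delta; the step size, the function names, and the choice of uniform quantization are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def quantize(coeffs, delta):
    """Map each wavelet coefficient to an integer index.

    The rounding here is where all of the information loss occurs.
    """
    return np.round(coeffs / delta).astype(np.int32)

def dequantize(indices, delta):
    """Recover approximate coefficient values from the indices."""
    return indices.astype(np.float64) * delta
```

The quantized indices would then be passed to a lossless coder; as noted above, a generic variable-length coder compresses them well but sacrifices the ability to decode an arbitrary item directly.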
Since the wavelet transform is applied twice for decomposition, each 4 x 4 x 4 subregion of a unit block can be regarded as being represented by the octree in Figure 5. The reconstruction process is the reverse of decomposition; for the Haar wavelets we use, the reconstruction formulae are given in Figure 3.
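For completeness, here is a sketch of the inverse of the Haar step sketched earlier. It is a generic inversion under the same assumptions; the authors' actual reconstruction formulae are those of Figure 3.

```python
import numpy as np

def haar3d_inverse_step(coeffs):
    """Invert one eight-band Haar analysis step on a cubic block."""
    e = np.asarray(coeffs, dtype=np.float64)
    for axis in range(3):
        n = e.shape[axis] // 2
        low = np.take(e, range(n), axis=axis)
        high = np.take(e, range(n, 2 * n), axis=axis)
        out = np.empty_like(e)
        idx_even = [slice(None)] * 3
        idx_odd = [slice(None)] * 3
        idx_even[axis] = slice(0, 2 * n, 2)   # even samples along axis
        idx_odd[axis] = slice(1, 2 * n, 2)    # odd samples along axis
        out[tuple(idx_even)] = low + high     # even = l + h
        out[tuple(idx_odd)] = low - high      # odd  = l - h
        e = out
    return e

def reconstruct_unit_block(coeffs):
    """Undo the two-level decomposition of a 16x16x16 unit block."""
    out = np.array(coeffs, dtype=np.float64)
    out[:8, :8, :8] = haar3d_inverse_step(out[:8, :8, :8])
    return haar3d_inverse_step(out)
```

With these definitions, reconstruct_unit_block(decompose_unit_block(b)) returns b up to floating-point rounding, since each inverse step exactly undoes the corresponding analysis step.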
Shapiro developed an encoding scheme, called zerotree encoding, which greatly improved the performance of the wavelet encoder but was much slower ([20]). Variations of the zerotree encoding scheme were subsequently proposed with better performance ([14], [15], [16], [17]). In [9] and [10], we proposed an effective encoding scheme for very large volume data and showed, through performance analysis, that it achieves high compression ratios and fast random access. Note that we used the L2 norm for CT and RGB data compression.