Fig. shows the final texture-mapped geometric fish models obtained from the images in Fig. . Our animation results show that, in the artificial fishes application, our texture extraction and mapping technique largely avoids the rendering artifacts typical of deformable-object animation, such as `texture swimming', where parts of the texture slide back and forth over the object's surface.
Figure: Texture mapped 3D fish models.
In conclusion, deformable meshes are an easy-to-use interactive tool for obtaining texture coordinates, and they perform well when mapping textures from 2D images onto 3D fish shapes. However, the technique could be improved further. A certain degree of texture mismatch is evident when comparing Fig. with Fig. . Fortunately, the mismatch does not detract much from the realistic appearance of the fishes on the whole. Note that the mismatch arises largely because the 3D geometric models do not accurately follow the original fish outline, whereas the deformable mesh does. The problem occurs because the process of generating the 3D geometric models is currently separate from that of generating the texture coordinates. With additional work, it should be possible to unify the process of defining the contour control points for the NURBS surfaces with the texture contour generation process.
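As a minimal illustration of the final texture-coordinate step (the function name and conventions here are assumptions for the sketch, not taken from this work), once the deformable mesh has relaxed over the fish image, each mesh node's pixel position can be normalized into a (u, v) texture coordinate for the corresponding surface point:

```python
def mesh_nodes_to_uv(node_positions, image_width, image_height):
    """Convert fitted deformable-mesh node positions, given in image
    pixel coordinates, into normalized (u, v) texture coordinates.
    The v axis is flipped because image rows grow downward, while
    texture coordinates conventionally grow upward from the bottom."""
    return [(x / image_width, 1.0 - y / image_height)
            for (x, y) in node_positions]
```

For example, on a 512 x 256 image, a node fitted at pixel (256, 128) maps to the texture-space center (0.5, 0.5).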
Another observation is that although texture mapping using scanned digital images leads to photo-realistic appearance, it has several limitations. First, if we want to render a large school of fish of the same species, each with a slightly different texture, we would have to store as many digital images as there are fish. Second, many animals, including fishes, may change their colorations over time for behavioral purposes. For example, most male fish develop exceptionally colorful markings during mating seasons to attract female fish. Chameleons can change their colors according to the color of their immediate environment to camouflage themselves. Yet others may change their colors according to their moods. Hence, it may be useful to model the variation of colors and textures across different animals (of the same species) or for an individual animal over time. However, it is impractical to capture, scan, and segment all the natural images necessary to do so. An alternative approach would be to modify a single captured texture procedurally, or to generate purely synthetic textures procedurally. The procedural function that defines the texture (and color) of an animal may depend on multiple factors, such as the animal's size, identity, currently engaged behavior, mood, and the colors of its environment.
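The first alternative above, procedurally varying one captured texture, might be sketched as follows. This is a hedged illustration, not a method from this work: the function, its parameters, and the particular color operations (a seeded per-individual tint, a mood-dependent brightening, a season-dependent saturation boost) are all hypothetical choices standing in for the behavioral factors discussed above.

```python
import random

def vary_texture(base_pixels, identity, mood=0.0, season=0.0):
    """Hypothetical sketch: derive an individual texture from a single
    base image, so a whole school needs only one stored digital image.
    base_pixels: list of (r, g, b) floats in [0, 1].
    identity:    integer seeding a repeatable per-fish color cast.
    mood, season: values in [0, 1] that brighten and saturate the
                  colors, standing in for behavioral color change."""
    rng = random.Random(identity)                 # repeatable per individual
    tint = [1.0 + 0.1 * (rng.random() - 0.5) for _ in range(3)]
    bright = 1.0 + 0.2 * mood                     # e.g. brighter when courting
    out = []
    for pixel in base_pixels:
        c = [min(1.0, ch * t * bright) for ch, t in zip(pixel, tint)]
        gray = sum(c) / 3.0                       # seasonal saturation boost:
        c = [min(1.0, max(0.0,                    # push channels away from gray
                 gray + (ch - gray) * (1.0 + season))) for ch in c]
        out.append(tuple(c))
    return out
```

Because the per-fish variation is a deterministic function of the identity seed, each fish keeps a stable appearance from frame to frame while still differing from its school-mates.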
Xiaoyuan Tu, January 1996