Skeleton and Weight Creation

Some methods facilitate the process of generating a skeleton, or of creating the skin weights for a character that already has a skeleton. Au et al. (2008) use Laplacian smoothing based on mesh contraction to extract a skeleton from a mesh. They iteratively contract the surface geometry in order to obtain a thin skeletal shape. A “surgery” step is then applied to the contracted shape to remove redundant connectivity and obtain a 1D structure. All of these steps are carried out using constrained Laplacian geometry smoothing and mesh simplification. In addition to producing a curve skeleton, this method also generates the skeleton-to-vertex correspondence and a local “thickness”. The method is limited to meshes with fine geometry (more than 5,000 vertices), as it cannot generate detailed skeletons from very coarse meshes. De Aguiar et al. (2008) use a deforming mesh sequence (mesh animation) with constant surface connectivity to extract a kinematic bone hierarchy.
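
To make the contraction step concrete, the following is a minimal sketch (not the authors' implementation) of one constrained Laplacian contraction iteration: vertex positions are updated by solving a least-squares system that balances a smoothing term against an attraction term to the current positions. It assumes a precomputed sparse mesh Laplacian L and scalar weights w_L and w_H; in the original method these weights are adjusted per iteration and per vertex.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

def contract_once(V, L, w_L=1.0, w_H=1.0):
    """One contraction iteration: minimize ||w_L * L V'||^2 + ||w_H * (V' - V)||^2.

    V: (n, 3) vertex positions; L: (n, n) sparse mesh Laplacian (assumed given).
    """
    n = V.shape[0]
    # Stack the smoothing constraints on top of the position-attraction constraints.
    A = sp.vstack([w_L * L, w_H * sp.identity(n)]).tocsr()
    b = np.vstack([np.zeros((n, 3)), w_H * V])
    # Solve the over-determined system column by column (x, y, z).
    return np.column_stack([lsqr(A, b[:, k])[0] for k in range(3)])
```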

However, it is not possible to locate a joint if there is no relative animation between adjacent body parts (e.g. in feet and hands, due to the insignificant relative motion of the feet or the fingers). Some methods, like those by He et al. (2009) and Le and Deng (2014), require a set of example poses of a source mesh to compute the skeleton and/or the skin weights. He et al. (2009) take several poses of a given mesh as input. They use a harmonic function defined on the given example poses of the mesh and construct a skeleton-like Reeb graph from it. The initial locations of the joints are then computed by examining the changes in mean curvature. Finally, they refine the joint locations by solving a constrained optimization problem. Yet, this method cannot handle all types of characters. Le and Deng (2014) compute the skeleton-based Linear Blend Skinning (LBS) model, including the skeletal structure and skin weights, that corresponds to a set of example poses of a character mesh. The main limitations of this method are its low computational efficiency and the artifacts resulting from the LBS model. Overall, all of these example-based approaches share the same limitation of requiring various example poses of a source mesh as input, whose preparation is often a time-consuming and complex process in itself.
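
For reference, the standard linear blend skinning model referred to above deforms each vertex as a convex combination of bone transformations (written here in the usual notation, not the exact formulation of the cited papers):

v_i' = \sum_{j=1}^{m} w_{ij} \, T_j \, v_i , \qquad w_{ij} \ge 0 , \quad \sum_{j=1}^{m} w_{ij} = 1 ,

where T_j is the transformation of bone j from the bind pose to the current pose and w_{ij} is the skin weight of vertex i with respect to bone j. The LBS artifacts mentioned above stem from blending these rigid transformations linearly.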

There are some methods which specifically generate skin weights. In the Geodesic Voxel Binding method by Dionne and de Lasa (2013), given a skeleton and a mesh, the skin weights are derived by first voxelizing the input geometry and then computing the binding weights. Although this method works for production meshes that may contain non-manifold geometry or be non-watertight, its results may require post-process user interaction to correct artifacts. The Bone Heat (Heat Map Binding) method by Baran and Popović (2007) models the weight assignment for each bone as a heat diffusion system on the surface of the mesh. However, in many cases the results show artifacts which again require further user interaction to become acceptable. Example-based methods to extract the skin weights are also available, such as those by Le and Deng (2012), Li and Lu (2011), and Wang and Phillips (2002). Le and Deng (2012) extract the skin weights given a set of example poses of a character; requiring such a set of example poses is one of the limitations of this method. Li and Lu (2011) are able to automatically animate a model, but they require the skeleton and an animation (motion) of the skeleton, which is a limitation in itself.
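
As an illustration of the heat-diffusion idea behind Bone Heat, the sketch below solves one sparse linear system per bone, in the spirit of the formulation by Baran and Popović (2007). It is only a sketch under stated assumptions, not their implementation: it takes as given a positive semidefinite mesh Laplacian L, and precomputed per-vertex nearest visible bone indices and distances (the inputs nearest_bone and dist are hypothetical names for data computed elsewhere).

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def bone_heat_weights(L, nearest_bone, dist, num_bones, c=1.0):
    """Solve (L + H) w_i = H p_i for each bone i, then normalize per vertex.

    L: (n, n) positive semidefinite mesh Laplacian (assumed given).
    nearest_bone[j]: index of the closest visible bone to vertex j (precomputed).
    dist[j]: distance from vertex j to that bone (precomputed).
    """
    n = len(nearest_bone)
    H = sp.diags(c / (dist ** 2))              # heat contribution of the nearest bone
    W = np.zeros((n, num_bones))
    for i in range(num_bones):
        p = (nearest_bone == i).astype(float)  # vertices whose nearest bone is i
        W[:, i] = spsolve((L + H).tocsc(), H @ p)
    return W / W.sum(axis=1, keepdims=True)    # weights sum to 1 at each vertex
```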

In summary, the methods used for generating skeletons and skin weights are not able to provide the same level of quality as competent artists, and they generally require a post-process of user interaction to correct artifacts. Furthermore, most of them require a set of example poses or animations of a character as input, which is itself a limitation.

Skeleton and Weight Retargeting

Some methods address retargeting a skeletal structure from one mesh to another in order to ease the process of setting up a new character. The methods proposed by Poirier and Paquette (2009a) and He et al. (2009) retarget a skeleton from a source mesh to a target mesh using Reeb graphs to select the joint positions. Poirier and Paquette (2009a) adapt the given skeleton to the given character by matching topology graphs between them. He et al. (2009) present skeleton transfer as an application of cross-parametrization. They compute consistent harmonic 1-forms on the source and target meshes, and then obtain a one-to-one correspondence between the isocurves of both. The skeleton can be transferred to the target by taking advantage of the fact that each joint is associated with a unique isocurve. In the skeleton sketching method by Poirier and Paquette (2009b), the user interactively positions the joints, which facilitates the retargeting of the skeleton. The results obtained from these methods are limited to the medial axis. Furthermore, the skin weights on the target are not assigned automatically by these methods, so an automatic binding method must then be used. They rely on the Bone Heat method by Baran and Popović (2007), which produces artifacts and requires polishing.

Table of Contents

INTRODUCTION
CHAPTER 1 LITERATURE REVIEW
1.1 Skeleton and Weight Creation
1.2 Skeleton and Weight Retargeting
1.3 Geometric Correspondence
CHAPTER 2 ANIMATION SETUP TRANSFER
CHAPTER 3 SKELETON TRANSFER
3.1 Joint Position
3.1.1 Energy Minimization
3.1.2 Procrustes Analysis
3.1.3 Spine Alignment
3.1.3.1 Energy Minimization
3.1.3.2 Procrustes Analysis
3.1.4 Mirroring
3.2 Joint Orientation and Rotation
3.3 Pose Normalization
CHAPTER 4 RESULTS AND COMPARISONS
4.1 Artist Work Preservation
4.2 Joint Position (PA vs. Energy Minimization)
4.3 Spine Alignment
4.4 Mirroring
4.5 Pose Normalization
CHAPTER 5 DISCUSSION
5.1 Limitations
CONCLUSION
