
Announcement

18 October 2024

Generative Models for Garment Mesh


Category: Internship


Geometric deep learning has emerged in the fields of computer graphics and computer vision, enabling deep learning models to operate on geometric data such as graphs, meshes, manifolds, and point clouds. Some notable models in this area include Graph Convolutional Networks (GCNs), PointNet, Geodesic Neural Networks (GNNs), and specialized architectures for 3D meshes, such as MeshNet and MeshCNN.

Motivated by these recent successes, we will explore and develop geometric deep learning models for a 3D mesh dataset. In particular, we are interested in 3D garment mesh data representing garment shapes in motion. Our specific focus will be on generative models capable of performing various downstream tasks, such as sequence inpainting and conditional generation.


We will proceed with the following tasks:

  1. Shape representation: The first step is to investigate how to represent varied mesh data in a uniform manner, irrespective of their topological structure. While the representation should be invariant to rigid transformations and to vertex/triangle ordering, it must also remain sensitive to geometric characteristics in order to effectively encode shape changes during motion (a small illustrative sketch follows this list).
  2. Unconditional generation: We will deploy a Transformer or diffusion model for the temporal encoding of garment meshes in motion. A semi-supervised approach will be developed by incorporating loss terms that ensure the physical faithfulness of the generated 3D meshes (see the loss sketch below).
  3. Conditional generation: The developed generator will be extended to support conditional generation. Class labels, partial sequences, or the canonical form of the target garment mesh will be considered as conditioning signals (see the conditioning sketch below).
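
To make the first task concrete, here is a minimal sketch (not part of the announcement) of per-triangle intrinsic features, edge lengths and areas, which are invariant to rigid transformations and to vertex/triangle ordering while still reflecting how the surface deforms during motion. The array names `verts` and `faces` and both function names are illustrative assumptions.

```python
# Minimal sketch, assuming numpy arrays `verts` of shape (N, 3) and
# integer `faces` of shape (M, 3); names are illustrative only.
import numpy as np

def per_face_intrinsic_features(verts, faces):
    """Edge lengths and area per triangle: invariant to rotation/translation
    and to how vertices are indexed, since only pairwise distances within
    each face are used."""
    v0, v1, v2 = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
    e0 = np.linalg.norm(v1 - v0, axis=1)
    e1 = np.linalg.norm(v2 - v1, axis=1)
    e2 = np.linalg.norm(v0 - v2, axis=1)
    # Sort edge lengths inside each face so the feature does not depend on
    # the arbitrary vertex order stored in the face.
    edges = np.sort(np.stack([e0, e1, e2], axis=1), axis=1)
    # Heron's formula for the triangle area.
    s = edges.sum(axis=1) / 2.0
    area = np.sqrt(np.maximum(
        s * (s - edges[:, 0]) * (s - edges[:, 1]) * (s - edges[:, 2]), 0.0))
    return np.concatenate([edges, area[:, None]], axis=1)  # (M, 4)

def pooled_shape_descriptor(verts, faces):
    """Mean/std pooling over faces also removes any dependence on the order
    in which triangles are listed."""
    f = per_face_intrinsic_features(verts, faces)
    return np.concatenate([f.mean(axis=0), f.std(axis=0)])
```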
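For the second task, one possible form of the physics-motivated loss terms is sketched below, assuming PyTorch, a simple cloth-inextensibility proxy (predicted edge lengths pulled toward rest-pose edge lengths), and a temporal smoothness term. The tensor shapes, weights, and function names are assumptions, not specifics from the offer.

```python
# Minimal sketch, assuming PyTorch; `pred`/`target` are vertex trajectories
# of shape (T, N, 3), `edges` a (E, 2) long tensor of mesh edges, and
# `rest_verts` the canonical garment pose of shape (N, 3).
import torch

def edge_lengths(verts, edges):
    # verts: (..., N, 3); edges: (E, 2) -> (..., E)
    return (verts[..., edges[:, 0], :] - verts[..., edges[:, 1], :]).norm(dim=-1)

def garment_loss(pred, target, edges, rest_verts,
                 w_rec=1.0, w_stretch=0.1, w_smooth=0.01):
    # Supervised reconstruction term on vertex positions.
    rec = (pred - target).pow(2).mean()
    # Physical-faithfulness proxy: cloth should barely stretch, so predicted
    # edge lengths are pulled toward the rest-pose edge lengths.
    stretch = (edge_lengths(pred, edges) - edge_lengths(rest_verts, edges)).pow(2).mean()
    # Temporal smoothness: penalise large frame-to-frame accelerations.
    accel = pred[2:] - 2 * pred[1:-1] + pred[:-2]
    smooth = accel.pow(2).mean()
    return w_rec * rec + w_stretch * stretch + w_smooth * smooth
```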
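For the third task, a common way to expose the conditioning signals is to embed the class label as an extra token and to mark the frames of a partial sequence with a mask; missing frames can then be filled in by the sequence model (inpainting). The sketch below illustrates this interface with a standard Transformer encoder; all module and parameter names are hypothetical.

```python
# Minimal sketch, assuming PyTorch and per-frame latent shape codes.
import torch
import torch.nn as nn

class ConditionalSequenceModel(nn.Module):
    def __init__(self, latent_dim=128, n_classes=10, n_layers=4, n_heads=8):
        super().__init__()
        self.class_emb = nn.Embedding(n_classes, latent_dim)
        self.mask_token = nn.Parameter(torch.zeros(latent_dim))
        layer = nn.TransformerEncoderLayer(latent_dim, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, frame_latents, class_label, known_mask):
        # frame_latents: (B, T, D) per-frame shape codes;
        # class_label: (B,) long; known_mask: (B, T) bool (True = observed frame).
        x = torch.where(known_mask[..., None], frame_latents,
                        self.mask_token.expand_as(frame_latents))
        cond = self.class_emb(class_label)[:, None, :]     # (B, 1, D) class token
        out = self.encoder(torch.cat([cond, x], dim=1))    # (B, 1 + T, D)
        return out[:, 1:]                                  # predicted frame codes
```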

https://mlms.icube.unistra.fr/img_auth_namespace.php/5/50/Stage-Generative_Models_for_Garment_Mesh-2024_En.pdf
