
Awesome-Hair-and-Fur-Modeling-Papers

A Collection of Papers and Codes about Hair and Fur Modeling

Catalogue

  • SIGGRAPH 2023
  • ECCV 2022

SIGGRAPH 2023

Sag-free Initialization for Strand-based Hybrid Hair Simulation

CT2Hair: High-Fidelity 3D Hair Modeling using Computed Tomography

  • Code: https://github.com/facebookresearch/CT2Hair
  • Project Website: http://yuefanshen.net/CTHair
  • Abstract:

    We introduce CT2Hair, a fully automatic framework for creating high-fidelity 3D hair models using computed tomography. Our approach takes real-world hair wigs as input and can reconstruct hair strands for a wide range of hair styles. The resulting 3D hair models are suitable for use in downstream graphics applications.

Interactive Hair Simulation on the GPU Using ADMM

A Practical Wave Optics Reflection Model for Hair and Fur

NeRF-Texture: Texture Synthesis with Neural Radiance Fields

  • Paper: [Coming Soon]
  • Code: https://github.com/yihua7/NeRF-Texture
  • Project Website: https://yihua7.github.io/NeRF-Texture-web/
  • Abstract:

    We propose NeRF-Texture, a novel texture synthesis method based on NeRF. Using a coarse-fine disentangled representation, it effectively models real-world textures that contain both meso-scale geometry and view-dependent appearance. It synthesizes NeRF textures of arbitrary size via patch matching, which can then be applied to new surfaces to add rich detail.

ECCV 2022

Neural Strands: Learning Hair Geometry and Appearance from Multi-View Images

  • Paper: https://arxiv.org/pdf/2207.14067.pdf
  • Project Website: https://radualexandru.github.io/neural_strands/
  • Abstract:

    We present Neural Strands, a novel learning framework for modeling accurate hair geometry and appearance from multi-view image inputs. The learned hair model can be rendered in real-time from any viewpoint with high-fidelity view-dependent effects. Our model achieves intuitive shape and style control unlike volumetric counterparts. To enable these properties, we propose a novel hair representation based on a neural scalp texture that encodes the geometry and appearance of individual strands at each texel location. Furthermore, we introduce a novel neural rendering framework based on rasterization of the learned hair strands. Our neural rendering is strand-accurate and anti-aliased, making the rendering view-consistent and photorealistic. Combining appearance with a multi-view geometric prior, we enable, for the first time, the joint learning of appearance and explicit hair geometry from a multi-view setup. We demonstrate the efficacy of our approach in terms of fidelity and efficiency for various hairstyles.
