Paper — International journal (SCI-level): Porous Models for Enhanced Representation of Saturated Curly Hairs: Simulation and Learning
- Journal type: International journal (SCI-level)
- Publication date: 2025-05
- Journal name: IEEE ACCESS
- Country of publication: Overseas
- Paper language: Foreign language (English)
- Total number of authors: 2
- Paper download link (external): https://ieeexplore.ieee.org/document/11002486
- Research field: Engineering > Computer Science
- Keywords: #neural network #fluid simulation #physically-based simulation
Abstract
Simulating the cohesion, adhesion, stiffness, and exaggeration of curls of wet curly hair or fur, expressed through the saturation-hair interaction in physics-based simulations, remains a challenging problem. Wet hair or fur tends to clump and stiffen at the ends, a phenomenon commonly observed in both human hair and animal fur. Additionally, while wet hair should exhibit adhesion when in contact with solids, the uneven distribution of forces in wet curly hair, manifested as noise, complicates an accurate representation of adhesion. Research into detailed porous models for wet curly hair, driven by saturation-hair interaction, has not yet been extensively explored. Previous methods have represented wet hair manually or used static hairstyles for wet curly hair and fur, maintaining shape but resulting in unnatural movement due to the lack of simulation. This paper proposes methods for representing wet curly hair features: 1) curl exaggeration using locally transformed helices, 2) deformation-based cohesion that remains stable in wet curly hair, 3) level-set-based adhesion for efficiently depicting the sticky and elongated forms of wet hair, 4) dynamic stiffness for improved simulation stability, and 5) collecting a detailed synthetic dataset of curly hairs and extending the solver to represent particle movements in wet strands through learning. Experiments in various scenes demonstrate that the proposed methods represent the saturation-hair interaction more realistically than previous wet curly hair simulations. Unlike previous methods, in which saturation caused cohesion or curls to tangle, our method stably represents porous flow at the strand level. Additionally, we propose to extend the learning representation solver through both numerical simulation algorithms and AI-based approaches.
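To make the first listed feature concrete, the sketch below illustrates one plausible way a "locally transformed helix" could exaggerate curls as saturation increases. This is a minimal illustration, not the paper's actual method: the function name `wet_curl_helix`, the parameterization, and the saturation coefficients are all assumptions chosen for clarity.

```python
import numpy as np

def wet_curl_helix(n=100, radius_dry=1.0, pitch_dry=0.5, saturation=0.0):
    """Hypothetical sketch: exaggerate curl by tightening a strand's helix
    as saturation increases (parameters are illustrative, not from the paper)."""
    t = np.linspace(0.0, 4.0 * np.pi, n)
    # Wet hair clumps and curls more toward the tip: weight the radius
    # reduction by arc-length position (0 at the root, 1 at the tip).
    tip_weight = t / t[-1]
    r = radius_dry * (1.0 - 0.5 * saturation * tip_weight)
    # Saturation also shortens the effective pitch, pulling curls together.
    z = pitch_dry * (1.0 - 0.3 * saturation) * t
    return np.stack([r * np.cos(t), r * np.sin(t), z], axis=1)

# A wet strand ends up shorter along its axis with a tighter tip radius.
dry = wet_curl_helix(saturation=0.0)
wet = wet_curl_helix(saturation=1.0)
```

Under this toy parameterization, the wet strand's tip radius and axial extent both shrink relative to the dry strand, mimicking the clumped, tightened curls the abstract describes.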


