
Real-Time Fiber-Level Cloth Rendering | Two Minute Papers #132

December 15, 2019


Dear Fellow Scholars, this is Two Minute Papers with Károly Zsolnai-Fehér. This piece of work shows us how to render a piece of cloth down to the level of individual fibers. This is a difficult problem, because we need to be able to handle models built from potentially over a hundred million fiber curves.

This technique supports a variety of goodies. One: level of detail. The closer we get to the cloth, the more detail appears, and the algorithm is highly optimized so that it doesn't render details when they are not visible. This means a huge performance boost when we are zoomed out.

Two: optimizations are introduced so that fiber-level self-shadows are computed in real time, which would normally be an extremely long process. Note that we're talking about millions of fibers here!

And three: the graphics card in your computer is amazingly effective at computing hundreds of things in parallel. However, its weak point is data transfer, at which it is woefully slow, to the point that it is often worth recomputing multiple gigabytes of data right on the card just to avoid uploading it to its memory again. This algorithm generates the fiber curves directly on the graphics card to minimize such data transfers, and hence it maps really effectively to the hardware.

The result is a remarkable technique that can render a piece of cloth down to the tiniest details, with multiple different kinds of yarn models, in real time. What I really like about this piece of work is that it is not a stepping stone: it could be used in many state-of-the-art systems as-is, right now. The authors also made the cloth models available for easier comparisons in follow-up research works.

Thanks for watching and for your generous support, and I'll see you next time!
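The level-of-detail idea from point one can be pictured with a tiny sketch. This is not the paper's actual scheme; the function name, thresholds, and falloff rule below are all illustrative assumptions. The point is simply that the number of fibers generated per yarn ply can shrink with camera distance, so a zoomed-out view does far less work:

```python
import math

def fiber_lod(distance, full_detail_dist=1.0, min_fibers=4, max_fibers=64):
    """Pick how many fibers per ply to generate, based on camera distance.

    Hypothetical parameters: at `full_detail_dist` or closer we generate
    all `max_fibers`; beyond that, the count is halved each time the
    distance doubles, never dropping below `min_fibers`.
    """
    if distance <= full_detail_dist:
        return max_fibers
    # Number of distance doublings past the full-detail range.
    level = int(math.log2(distance / full_detail_dist))
    return max(min_fibers, max_fibers >> level)

# Close up: full detail. Far away: only a few representative fibers.
print(fiber_lod(0.5), fiber_lod(2.0), fiber_lod(100.0))  # → 64 32 4
```

In a real renderer this decision would typically also account for screen-space size rather than raw distance, but the shape of the trade-off is the same.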
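Point three, generating fiber curves directly on the graphics card, rests on the observation that individual fibers can be evaluated procedurally from a much smaller yarn description. Here is a toy CPU-side sketch of that idea, not the paper's shader code: the yarn center is simplified to a straight line, and every parameter name and value is an illustrative assumption. Each fiber is a helix twisting around the yarn center, so only the sparse center curve would ever need to be uploaded:

```python
import math

def fiber_point(t, fiber_index, n_fibers=32, yarn_radius=0.1, twist=20.0):
    """Evaluate one point of one fiber curve procedurally.

    Instead of storing millions of fiber curves in memory, each fiber is
    generated on the fly as a helix around the yarn's center curve. For
    simplicity the center curve here is the x-axis; in the real technique
    this evaluation happens in a shader, per frame, on the GPU.
    """
    # Center of the yarn at parameter t (a straight yarn along x).
    cx, cy, cz = t, 0.0, 0.0
    # A phase offset spreads the fibers around the yarn cross-section.
    phase = 2.0 * math.pi * fiber_index / n_fibers
    angle = twist * t + phase
    # Displace the fiber helically around the center curve.
    return (cx,
            cy + yarn_radius * math.cos(angle),
            cz + yarn_radius * math.sin(angle))
```

Because each point depends only on `t` and a handful of yarn parameters, the full fiber geometry can be recomputed every frame cheaper than transferring it, which is exactly the data-transfer trade-off described above.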


31 Comments

  • Reply Bei Zhang March 1, 2017 at 5:07 pm

    Wow, this is crazy if it is in realtime.

  • Reply Neoshaman Fulgurant March 1, 2017 at 5:16 pm

    Wow looks a like something I have to look into to get better afro hair shader asap, thanks!

  • Reply TheFloatingSheep March 1, 2017 at 5:20 pm

    omg an actual two minute one

  • Reply meegul304 March 1, 2017 at 5:25 pm

    For the millionth time, thanks for making/working on this amazing channel. It's the needle of real science in the haystack of clickbait.

  • Reply Eric Sneider March 1, 2017 at 5:29 pm

    That's amazing. The results really look phenomenal.

  • Reply eastern_BANDIT March 1, 2017 at 5:31 pm

    how can we get ahold of these bits of code?

  • Reply Mr Calligraphy March 1, 2017 at 5:42 pm

    I wonder how the techniques used here could be used for different applications other than cloth.

  • Reply THTerra March 1, 2017 at 5:56 pm

    lovely technique 😀

  • Reply Sameer Hoosen March 1, 2017 at 6:11 pm

    I wonder how long it'll be until we see this used in some upcoming AAA game. Perhaps we will have Nvidia Clothworks or something akin to that.

  • Reply m ・ ́ω・ March 1, 2017 at 6:12 pm

    Deer fellou SKALARS.

  • Reply Justin Jensen March 1, 2017 at 6:35 pm

    Hey! I was at the University of Utah just this past weekend, being recruited for their Computing PhD program. Cem Yuksel will (hopefully) be my advisor! Glad to see I picked a good school.

  • Reply HAL NineOoO March 1, 2017 at 7:28 pm

    What's the estimated time to market for this kind of research in average?
    Thanks for the high quality channel !

  • Reply TheMNTK March 1, 2017 at 8:01 pm

    Amazing!

  • Reply Máté Nagy March 1, 2017 at 8:04 pm

    It would be interesting to get into more specifics. Frame time given a graphics card, so that we know how much of this is possible in real-world scenarios like game engines, etc. Otherwise great videos! Thanks

  • Reply c64cosmin March 1, 2017 at 9:24 pm

    The exact progress that is happening using older CPUs, mainly referring to the demoscene(where people achieve 3d graphics on slow machines), is happening on modern systems. This is a very good example, common hardware used for techniques that would need datacenters to compute. Fantastic!

  • Reply Hail Sagan March 2, 2017 at 2:50 am

    This is a really excellent channel and you deserve a lot more subscribers!

  • Reply Ryan N March 2, 2017 at 3:21 am

    That's such a good idea. It does seem, now that it's mentioned, that GPUs are perfect for this sort of application.

  • Reply Pengu March 2, 2017 at 4:38 am

    Very nice episode!

  • Reply Robert Ralph March 2, 2017 at 6:25 am

    Simply astounding.

    This is swiftly becoming one of my favorite channels.

  • Reply Kram1032 March 2, 2017 at 7:54 am

    nice, looking great!

  • Reply Aakash Kalaria March 2, 2017 at 1:45 pm

    This channel is underrated.

  • Reply Bálint Áts March 2, 2017 at 2:49 pm

    Like, I understand that this is procedural, but are these polygons? Parallax mapping? Stacked polygons? Ray tracing?

  • Reply Marcel Winklmüller March 3, 2017 at 2:51 am

    I am thinking about taking your course this semester at the TU, but I am just now doing Computergraphic LVA so this might be a bit over my head..

    But I really hope you will repeat your lectures ws17?

  • Reply Bo Dodge March 3, 2017 at 6:00 am

    This is my favorite channel for CG technology news. I'm always amazed at what cutting edge info can be found here.

  • Reply orenong March 4, 2017 at 11:49 am

    are you hungarian?

  • Reply Christian Siegert March 4, 2017 at 2:46 pm

    Is “graphical card” a common term? I only ever hear it called “graphics card”.

  • Reply Martin March 6, 2017 at 9:11 am

    WOW can't wait for this technology to make it into games. 😀

  • Reply Tritoon710 April 4, 2017 at 4:34 pm

    Now include this in Maya!

  • Reply Guilty Potato August 8, 2017 at 3:41 pm

    So how do I go about acquiring the software that allows this?

  • Reply kobi2187 December 28, 2017 at 6:40 pm

    "the matrix is near" *shudders*. beautiful work, VR will look amazing and how will people know the difference if they wake up to a simulation. creeps. let's hope it's used for good, and that God is controlling everything

  • Reply jojolafrite90 June 2, 2018 at 11:39 am

    But how can we implement all those techniques in our modelers?! I want to use the open source thing about hairs and water, but it's for programmers… They don't want to compile a version of something, I don't know themselves. Implement it in blender! NOW! I need it!!! I NEED my fix!
