UE and Unity demo videos


neilmcbean


Real-time tools are advancing so fast that traditional raytracing will become a niche in the short term, for sure.

 

Take these demos with massive grains of salt.

 

Here's the rub: as scenes get more complex, the required RAM goes up, and as RAM on the card goes up, the price skyrockets. Look at the XPU paper Pixar wrote about hybrid CPU and GPU rendering. Their minimum RAM requirement for Coco was 27 gigs. A $2,500 Titan RTX can't handle that. Factor in that building a rig around a Titan RTX will set you back $4,000 minimum, and that's kind of off-putting to big render houses. With that $4k I can build three machines that can chew through scenes with 64+ gig RAM requirements.
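To make that arithmetic concrete, here's a rough back-of-the-envelope sketch. The prices and RAM figures are just the estimates quoted above, not vendor specs:

```python
# Rough back-of-the-envelope comparison using the figures quoted above.
# All prices and RAM numbers are estimates from this post, not vendor specs.

COCO_MIN_RAM_GB = 27          # minimum scene RAM cited from Pixar's XPU paper
TITAN_RTX_VRAM_GB = 24        # a Titan RTX tops out at 24 GB of VRAM
TITAN_RTX_RIG_COST = 4000     # rough cost of a workstation built around one
CPU_NODE_COST = 4000 / 3      # ~3 CPU render nodes for the same money
CPU_NODE_RAM_GB = 64          # each node specced with 64+ GB of system RAM

print(f"Scene fits on the Titan RTX: {COCO_MIN_RAM_GB <= TITAN_RTX_VRAM_GB}")      # False
print(f"Scene fits on a CPU node:    {COCO_MIN_RAM_GB <= CPU_NODE_RAM_GB}")        # True
print(f"CPU nodes per GPU-rig budget: {TITAN_RTX_RIG_COST / CPU_NODE_COST:.0f}")   # 3
```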

 

All of the Quixel stuff is highly optimized to perform well on the GPU. It's not a ball-of-shite Revit file you got from a consultant. They also had a team of people working on it. It's not just you and a 2-day deadline.

 

I'm all for real-time and such, but the reality is that it's further off for average production requirements than you think. They have got to get the prices down. When a single GPU costs more than a high-end CPU machine as a total build, that's a bit of a disparity.


I'm not sure anyone suggested it didn't take time to produce - anything that is well composed, well paced, detailed, and has an iterated design is going to take time and effort. Also, Coco has a massive scene load - from experience, people who work on films don't optimize throughout their workflow the way people in games do.

 

Optimizing for GPU doesn't mean 'bad'; it means that labour and money are directed to different areas. To me, that's like saying films are 'optimized for compositing'. It's a workflow designed for a result. I'd expect the number of people, the time, and the cost are similar to what you'd spend rendering those same films in a CPU raytracer - but that's the point. It's getting less expensive.

 

To me it presents alternatives for workflow. You and a 2-day turnaround have value, but realtime immersive experiences may also have value in a different context, and bridges might open up to speed up the process. 'Production requirement' is very much a matter of perspective. I know people who are testing Unity as a broadcast production tool, and it's performing pretty well, as well as reducing the need to outsource.

 

When the Adam demo was released I was talking to a VR game AD, and he said 'they used smoke and mirrors'. The director for that video was Neill Blomkamp, and all they really did was use Unity as a replacement for Nuke. To me, that's not smoke and mirrors; that's filmmaking on a more efficient platform.

 

The only real limitation is memory (but that's always been a limitation). Hybrid renderers like Redshift and Cycles provide a good point of comparison - it's cheaper to add GPUs than to assemble new machines, maintain render farms, etc. Also, Unity and Unreal both have scene optimization tools.
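If memory really is the gating factor, one practical approach is to check free VRAM per scene and only go GPU when the footprint fits, otherwise fall back to CPU or a hybrid/out-of-core path. A minimal sketch, assuming the pynvml NVIDIA bindings are installed and the scene's memory estimate comes from your renderer or asset pipeline:

```python
# Minimal sketch: pick GPU rendering only if the scene's estimated footprint
# fits in free VRAM, otherwise fall back to CPU (or a hybrid/out-of-core path).
# Assumes the `pynvml` NVIDIA bindings are installed; the footprint estimate
# itself would come from your renderer or asset pipeline.

import pynvml

def choose_render_device(estimated_scene_bytes: int, gpu_index: int = 0) -> str:
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # Leave ~10% headroom for framebuffers, textures in flight, etc.
        if estimated_scene_bytes < mem.free * 0.9:
            return "GPU"
        return "CPU"
    finally:
        pynvml.nvmlShutdown()

# e.g. a 27 GB scene like the Coco figure mentioned above:
print(choose_render_device(27 * 1024**3))
```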

 

The real downside of realtime rendering is controlling the experience across different devices, but streaming services are being built that can solve that problem.
