Correct way to handle Files with Workstation and Render Nodes


Recommended Posts

Continued from this other Thread, Post Number 3:

http://forums.cgarchitect.com/73301-recommendation-i7-render-node-configuration-500-600-budget.html

 

I use 3ds Max and V-Ray on Windows XP, but will move to Windows 7 soon.

 

Regarding what Benjamin Steinert mentioned in the above post (link) about optimizing the render NODE by having all the resources (textures, proxies, etc.) stored locally on each render node... How does that syncing work? What software does that automatically?

 

Right now I have all the project files on my workstation, along with textures, proxies, etc., but for each render job I MANUALLY copy the project file, along with any NEW texture or proxy file I have created...

 

I just bought a small QNAP NAS, and I was thinking that the best setup would be to have all the project files, textures, proxies, etc. stored on the NAS (just as if it were a file server), and nothing stored on my workstation or the render nodes, so all the machines can access all the files in just ONE LOCATION... Would this be a bad idea? Is it better to have everything locally, as he mentioned?

 

Thanks for your help.


I have to mention that mine is a small, very small studio setup: 2 workstations, and 2 more PCs dedicated mostly to rendering (render nodes)...

 

My equipment is outdated and I want to renew it soon, but I also want to optimize the way work and files are handled.

 

That said, if you have any thoughts on the PROS and CONS of the different setups for managing project files, I would be very thankful to hear from you.

 

These are the setups I have seen; I don't know about other possibilities:

 

A) Have project files and resources (textures, proxies, IES files, HDRIs, etc.) stored on each computer independently. That is, work on the project on one workstation, then, when I want to render, copy the needed files to each PC that is going to be used for RENDERING... I use this setup, but I cannot compare, since it is the only one I have tried so far... I have heard that things render faster this way...

 

B) Have project files and resources all stored on a SERVER or NAS, so each computer can access the same files, with no need to copy them to each computer... I guess this is a smoother way of working, but I have heard there could be bottlenecks because of the local network...

 

C) Have project files on a server, and "resource files" stored locally on each computer. A forum member in another post suggested doing it this way, but using SYNCING SOFTWARE to manage the files... I would like to know more about this.

Edited by unrinoceronte

We have used the freeware DSynchronize to manually tell the folders to sync whenever new assets are added and needed on the nodes for rendering. I know there are many options for syncing software, but I found this one very simple and lightweight.
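For anyone curious what this kind of one-way "mirror" sync does under the hood, here is a minimal sketch in Python (the folder paths in the example are hypothetical; DSynchronize itself is a GUI tool, this just illustrates the idea):

```python
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> list[str]:
    """One-way sync: copy files that are new or newer in src into dst.
    Returns the relative paths of the files that were copied."""
    copied = []
    for f in src.rglob("*"):
        if f.is_file():
            rel = f.relative_to(src)
            target = dst / rel
            # Copy if the file is missing on the node, or the source is newer
            if not target.exists() or f.stat().st_mtime > target.stat().st_mtime:
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, target)  # copy2 preserves timestamps
                copied.append(str(rel))
    return copied

# Example (hypothetical paths): mirror the shared resources folder
# to a render node's local copy.
# mirror(Path(r"\\workstation\resources"), Path(r"C:\resources"))
```

Running it a second time copies nothing, since the preserved timestamps match; that incremental behavior is what keeps the sync traffic small once the initial copy is done.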

 

As I said in the other thread, we only really do this for our most commonly used resources, so it is a hybrid of options B and C. If a texture is very specific to a project we typically store it with the project on the network, but if it is somewhat generic or could easily be used for other projects, it goes into the synced local resources folder.

 

Also as Dean Punchard mentioned in the other thread, the synchronization software will put a load on your network during the synchronization process, and possibly even if you have the software monitoring the folder(s) and syncing automatically. I am utilizing a synchronized local folder setup in our office (3 workstations, 3 render nodes) and have opted to invoke the sync manually rather than have the software monitor and sync in real-time, automatically, or on a schedule.

 

There will be a small amount of time involved in making sure everything is running smoothly, but I do think the effect on render time pays off, especially when those local resources are being hosted from an SSD.

 

Also, as a side note, I think people get confused about the speed of a gigabit network. A gigabit is one eighth of a gigabyte, so 1 Gb/s works out to roughly 125 megabytes per second. That is the best-case transfer rate for resources over the network. A local SSD can, depending on its performance, outpace the transfer rate of a gigabit network by 3 times on read speed alone.
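To put numbers on that comparison, here is the arithmetic spelled out (the SSD read speed and the 2 GB asset size are illustrative figures, not measurements):

```python
# Back-of-the-envelope comparison of gigabit Ethernet vs. a local SSD.
# Decimal units: 1 gigabit/s = 1000 megabits/s = 125 MB/s.

GIGABIT_MBPS = 1_000 / 8   # 125 MB/s, theoretical best case for GbE
SSD_READ_MBPS = 500        # illustrative SATA SSD sequential read speed

def transfer_seconds(size_mb: float, rate_mbps: float) -> float:
    """Time to move size_mb megabytes at rate_mbps megabytes per second."""
    return size_mb / rate_mbps

assets_mb = 2_000  # e.g. 2 GB of textures and proxies for a job
print(f"Over the network: {transfer_seconds(assets_mb, GIGABIT_MBPS):.0f} s")
print(f"From local SSD:   {transfer_seconds(assets_mb, SSD_READ_MBPS):.0f} s")
```

So pulling 2 GB of assets takes at least 16 seconds per node over gigabit (and in practice more, with protocol overhead and several nodes sharing the link), versus about 4 seconds from the local disk.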

 

EDIT: Here is some info from the Autodesk Help docs on the behind-the-scenes process of network rendering. It shows that even with a good network connection there is considerable dependency on the read/write rates of the local disk, via the temp folder, especially if you have "Include Maps" checked on render submission (which you would not need if you are syncing assets to a local folder, saving a lot of prep and transfer time for the network and computers):

 

 

Following is a step-by-step description of the sequence of events when you use network rendering:

 

 

  1. The user submits a job to the network Manager.
  2. On the submitting machine, the MAX file gets zipped up. If the user turned on Include Maps, all maps and XRefs are also zipped up.
  3. Once the file is zipped up, the ZIP file is copied to the Manager machine's Backburner\Network\Jobs\ folder. In the folder is an XML file describing the job itself, specifying frame size, output filename, frame range, render settings, etc.
  4. Once the Manager receives the ZIP and XML files, it looks to see which servers are sitting idle and can render jobs. It assigns the job to four servers at a time. (This is the Max Concurrent Assignments setting on the Manager General Properties dialog. See Starting Network Rendering ).
  5. Each Server machine receives the ZIP and XML files into the Backburner\Network\jobtemp folder.
  6. The MAX file gets unzipped, along with the maps and XRefs if they were included.
  7. 3ds Max is launched and loads the MAX file. If the maps and XRefs were not included, the Server searches for them as they are defined in the MAX file. For instance, if an XRef is in d:\foo\xref.max, the Server will look for xref.max in d:\foo\ on the local machine. If there are additional map paths set in the 3dsmax.ini file on the rendering server, it will search in those paths as well. If it does not find the maps and XRefs, the server fails for that particular job. This is why it is important to use UNC paths for all maps and XRefs in your scene file, so that all render servers can find them. However, if the maps and XRefs were included, then 3ds Max will get the ones that were unzipped into the \jobtemp folder.
  8. ...
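The submit side of those steps (2 and 3: zip the MAX file, optionally bundle the maps, and drop the result in the Manager's jobs folder) can be sketched like this. The folder layout follows the docs quoted above; the function name and everything else is illustrative, not Backburner's actual code:

```python
import zipfile
from pathlib import Path

def package_job(scene: Path, maps: list[Path],
                jobs_dir: Path, include_maps: bool) -> Path:
    """Zip the scene file (plus its maps, if Include Maps is on)
    and place the archive in the Manager's jobs folder."""
    jobs_dir.mkdir(parents=True, exist_ok=True)
    job_zip = jobs_dir / (scene.stem + ".zip")
    with zipfile.ZipFile(job_zip, "w") as zf:
        zf.write(scene, scene.name)
        if include_maps:  # mirrors the "Include Maps" checkbox
            for m in maps:
                zf.write(m, m.name)
    return job_zip
```

With Include Maps off, only the small scene file crosses the network and each server must resolve the map paths itself, which is exactly why the docs stress UNC paths (or, per the approach in this thread, a synced local folder).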

Edited by beestee

Is there any reason not to just keep your files stored on the main workstation and make sure you set up relative paths to the files? I have all of my textures, proxies, models, etc. on my main workstation. All of the nodes can see the shared folders. I never have any problems with computers referencing the files. I suppose storing the files locally would be a little faster, but it also increases your maintenance. Synchronizing folders is a good solution, but that is just one more thing to go wrong. I should qualify my statements by saying that I use V-Ray and Max and stay far, far away from Backburner. While I appreciate the added function that BB provides, I enjoy the simplicity of DR through V-Ray. Just my two cents.
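For that shared-folder approach to work on every node, the scene's asset paths need to resolve identically everywhere, which in practice means UNC paths rather than mapped drive letters. A small hypothetical helper shows the rewrite (the server and share names are made up):

```python
def to_unc(path: str, drive_map: dict[str, str]) -> str:
    """Rewrite a mapped-drive path (e.g. Z:\\textures\\wood.jpg) to its
    UNC equivalent so every render node can resolve it.
    drive_map holds each drive letter's UNC root (hypothetical values)."""
    drive = path[:2].upper()
    if drive in drive_map:
        return drive_map[drive] + path[2:]
    return path  # already UNC, or an unmapped local path: leave it alone

# Hypothetical mapping: drive Z: is the workstation's shared projects folder
shares = {"Z:": r"\\workstation\projects"}
print(to_unc(r"Z:\textures\wood.jpg", shares))
# -> \\workstation\projects\textures\wood.jpg
```

A drive letter like `Z:` only exists on machines where that mapping was made, while `\\workstation\projects\...` means the same thing on every node, so no per-node setup is needed beyond share permissions.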


BB allows a person to keep working while the rendering task is offloaded to other computers, and I would contend that it is just as simple to use. DR can also be utilized through BB, and if you go to the trouble of setting it up properly, it ensures you are utilizing the full processing potential of your render nodes for every frame they have to render. Not using your workstation? Just launch the spawner. I do feel that DR depends much more on the quality and congestion of the network you are running it on, though, which are not always consistent and measurable in my experience.

 

I work in a fairly large office though, so network performance is subjective.


  • 2 weeks later...

My setup is the same as Jason's: all files are kept on a NAS, and I use DR every second of every day with absolutely no problems. I can't really see any major advantage to having your files on each machine; in fact, it seems like more of a hassle than anything else. I have a render farm of over 100 machines and they all pull files from this one server; all my nodes are rendering within a minute or two, so I don't see it as a bottleneck.

