another backburner issue


Dave Buckley

Guys, sorry to raise this question again. It's similar to another post I made a few months back, but it's driving me mad.

 

The problem we have is that we are using ‘Alienbrain’ asset management software, which stores projects locally on each PC and references them through a pseudo drive letter (S:). This data is obviously not centrally located, so it is not available to every PC on the render farm. To get around this we send all maps with the job, which packs all xrefs/maps/proxy objects etc. into a zip file that is then distributed and unpacked locally on each render slave.

The maps etc. that are now present on each slave's local HD are, however, only used if Max can't find them in their original locations on the network. This is no problem for slaves which do not have Alienbrain (an S: drive), but where the slave is also used as a workstation, and therefore has Alienbrain installed, Backburner will by default grab the referenced files from its S: drive. These files are rarely all at the same stage of development.

Is there any way to force Backburner to use the submitted local files by default, rather than their original network location? This would completely solve our problem.


I don't use asset management software, so I can't answer your question directly, but your setup sounds like it is a pain in the ass.

 

My first inclination as a workaround would be to use the Breidt script to save out a copy of your file. That script will re-path everything in your Max Asset Tracker to read from the directory of your choice. So instead of the paths being "s:\...", they would simply be ".\" or something like that. Now when you send to the farm, Max will always render with the files you sent, because it doesn't know where else to look for them.

 

At least I think that is what will happen.

 

You can find the script at http://scripts.breidt.net ("MB Resource Collector v0.1 beta").
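
For what it's worth, the collect-and-repath idea boils down to something like the following. This is only a conceptual Python sketch of that approach, not the actual script; the collect_and_repath name and the example paths are made up for illustration:

```python
import os
import shutil

# Conceptual sketch only: gather every referenced file into one folder and
# return relative references, so the scene no longer points at the S: drive.
def collect_and_repath(asset_paths, target_dir):
    os.makedirs(target_dir, exist_ok=True)
    repathed = []
    for src in asset_paths:
        name = os.path.basename(src)
        shutil.copy2(src, os.path.join(target_dir, name))  # copy next to the scene
        repathed.append(os.path.join(".", name))           # store ".\name" instead of "s:\..."
    return repathed

# Usage (hypothetical paths):
# collect_and_repath([r"S:\project\maps\brick_diffuse.jpg"], r"C:\render_jobs\job_001")
```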

 

But overall, it sounds like this is a recipe for things to go wrong.


I have an answer :)

 

This could be useful to somebody out there, so I thought I'd post it.

 

The way 3ds Max works is that when it needs to load a bitmap, it will first check the path that is saved in the scene file (not the local zip file); if that fails, it will use the map paths to try to find the bitmap (therefore finding it locally). There isn't a way to tell 3ds Max to always ignore the saved bitmap paths.
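
In other words, the lookup order is roughly the following. This is a simplified Python sketch of the behaviour described above, not Max's actual code, and the names are just for illustration:

```python
import os

def resolve_bitmap(saved_path, map_search_paths):
    # 1) Try the absolute path saved in the scene (e.g. the S: drive).
    if os.path.isfile(saved_path):
        return saved_path
    # 2) Otherwise search the configured map paths for the bare filename,
    #    which is where the local copies unpacked from the zip live.
    filename = os.path.basename(saved_path)
    for folder in map_search_paths:
        candidate = os.path.join(folder, filename)
        if os.path.isfile(candidate):
            return candidate
    return None  # missing map

# On a slave that also has Alienbrain installed, the S: path in step 1
# exists, so the local copy sent with the job never gets used.
```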

 

What you can do, once the project is ready to be rendered, is use the Bitmap/Photometric Paths utility to strip out all paths ('Edit Resources', 'Strip All Paths'). That way the bitmaps won't have a path saved in the scene, so Backburner will have to use the local copies.
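
The effect, in terms of the sketch above, is essentially this. Again just an illustration; strip_paths is a made-up stand-in for the utility's 'Strip All Paths' action:

```python
import ntpath  # Windows-style paths, as used by the S: drive here

# Keep only the filenames: with no directory stored in the scene, step 1 of
# the lookup above can never hit the S: drive, so the map-path search (and
# therefore the local copies) is all that's left.
def strip_paths(saved_paths):
    return [ntpath.basename(p) for p in saved_paths]

print(strip_paths([r"S:\project\maps\brick_diffuse.jpg"]))
# -> ['brick_diffuse.jpg']
```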

 

Woohoo


The only thing I don't like about this solution is that Max can take a while to search paths, or at least it does when the paths are not correct. Maybe it is faster if there is no path at all.

 

This might not be that big a deal for still images, but if you are running an animation, the amount of time spent searching can really add up. If I have bad paths in my file, it can sometimes take an extra three minutes to open.


I agree, and I used to face the same problem.

 

Stripping the paths is one solution, but why not fix the problem once and for all by having a central server (or mapped drive) containing all of the maps and xrefs, with all workstations and slaves pointing to that server/drive?

 

Adding the maps to every render job and stripping paths can as much as double your rendering time. Your current setup is very inefficient, and you could benefit greatly by centralizing all your source files, xrefs, maps and output render files on one dedicated server (also running the Backburner Manager and Monitor) - sort of like the dealer at a poker table.
