
Update farm via Backburner?


Crazy Homeless Guy

So, I was running some updates on the render farm today and a concept that may or may not be possible dawned on me. Can I run updates to the farm via Backburner? ...in other words, send a batch file to the farm to run on each machine via 3dsmaxcmd.exe. Or something along those lines.

 

I know you can render After Effects through Backburner, so I am guessing there is a possibility to execute multiple things. Though I am no coder, so if it is too technical, I am out of luck.

 

Maybe I am just being too hopeful. It wouldn't be the first time.

Edited by Crazy Homeless Guy

How successful you would be may depend on what kind of update you want to run. If the update requires user interaction (accepting a EULA or just clicking "Next"), you are stuck unless there is a flag you can pass to force a silent install.

One way you can do it is by sending a Max scene to render. It doesn't have to be anything at all, just a throwaway scene that each machine on the farm can grab a frame of; the point is to set the scene to run a pre-render script (stored somewhere they can all see, of course).

That script would simply be:

ShellLaunch "path\to\the\update\to\run" ""

 

I use this all the time as a post-render script; it runs a Python script that sends me a text message so I know when a job is done when I am rendering over the weekend or something.
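In case it's useful, here is a rough sketch of a notifier along those lines: the text message is sent as an email to a carrier's SMS gateway. The sender, gateway address, and SMTP host below are placeholders, not a real setup.

```python
# Rough sketch of a render-done notifier: the text message is sent as
# an email to a carrier's SMS gateway. The sender, gateway address,
# and SMTP host are placeholders, not a real setup.
import smtplib
from email.mime.text import MIMEText

def build_notification(job_name):
    """Build the 'job finished' message for a Backburner job."""
    msg = MIMEText("Render job '%s' is done." % job_name)
    msg["Subject"] = "Render complete"
    msg["From"] = "farm@example.com"            # placeholder sender
    msg["To"] = "5551234567@txt.example.com"    # placeholder SMS gateway
    return msg

def send_notification(msg, smtp_host="mail.example.com"):
    """Hand the message to the mail server (placeholder host)."""
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)

msg = build_notification("weekend_beauty_pass")
# send_notification(msg)  # uncomment on a machine that can reach the mail server
```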

 

Also, you can use this method to set the *.ini file for the EXR settings, as in your other post.

From the MaxScript Help:

setINISetting

 

In that case, the MaxScript should be something like:

setINISetting "c:/path/to/openexr.ini" "ExrExportInternalDefaultParams" "Dflt_RGBABitDepth" "HALF"

 

Although I haven't tried it myself.
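For reference, the same INI tweak can also be done outside of Max with Python's configparser, if you'd rather push the setting from a deploy script than from a pre-render MaxScript. This is only a sketch: the file name is a placeholder, and the section and key names are the ones from the snippet above.

```python
# Sketch of the same INI tweak done outside of Max with Python's
# configparser. "openexr.ini" is a placeholder path; the section and
# key names are taken from the setINISetting example above.
import configparser

def set_ini_setting(path, section, key, value):
    """Mimic MaxScript's setINISetting: write one key, keep the rest."""
    config = configparser.ConfigParser()
    config.optionxform = str  # preserve key case, as Max expects
    config.read(path)         # a missing file is treated as empty
    if not config.has_section(section):
        config.add_section(section)
    config.set(section, key, value)
    with open(path, "w") as f:
        config.write(f)

set_ini_setting("openexr.ini", "ExrExportInternalDefaultParams",
                "Dflt_RGBABitDepth", "HALF")
```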

Edited by luckymutt

Michael,

 

I think this is definitely the direction I want to go; however, I am still having a few problems. I run the updates via .bat scripts. I set up a very simple batch script for testing that simply copies a test file.

 

If I run the script by itself, it places the text file on the desktop as instructed. If I try to run the batch file by calling it via MaxScript, it does not seem to work. Maybe I am missing something simple.

 

Right now my MaxScript looks like this...

 

ShellLaunch "\\group\####\RESOURCES\Render Farm\Configuration\Testing.bat" ""

 

Where the pounds are replaced with the proper DFS pathing.

 

Do I need to call the command prompt first, and then tell it to run the batch file?

 

Thanks for your help.


I guess it really depends on what you are trying to update. If you have a decent in, talk to the IT heads; there are some nice tools for that sort of thing (SMS) that I'm sure a big company like yours would have, without having to resort to our backwards ways.

 

But we love backwards ways, right?

 

There are a lot of fun things you can do with the cmdjob.exe that comes with Max. This thread gave a few interesting ideas: http://forums.cgsociety.org/showthread.php?f=98&t=692735&highlight=ffmpeg

 

Personally, most of the time I just build a Max deployment and remotely trigger it from here, using some DOS trickery to run it under the remote machine's local login credentials:

 

psexec \\remotecomputer -i -d -u domain\login -p pass msiexec /i \\path\to\installfile.msi /quiet

 

or something along those lines.
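A rough sketch of scripting that PsExec push across a whole node list, in Python. The node names, account, and installer path are all placeholders, and it assumes psexec.exe is on the PATH.

```python
# Rough sketch of pushing one command to every node with PsExec.
# Node names, the account, and the installer path are placeholders;
# assumes psexec.exe is on the PATH.
import subprocess

NODES = ["rendernode01", "rendernode02"]  # placeholder node names

def psexec_cmd(node, user, password, command):
    """Build the PsExec argument list for one remote node."""
    return (["psexec", r"\\%s" % node,
             "-d",                      # -d: don't wait for it to finish
             "-u", user, "-p", password]
            + command)

def push_to_farm(user, password, command):
    """Fire the same command at every node in the list."""
    for node in NODES:
        subprocess.run(psexec_cmd(node, user, password, command))

# Example: a silent MSI install on every node (placeholder paths):
# push_to_farm("domain\\login", "pass",
#              ["msiexec", "/i", r"\\server\share\installfile.msi", "/quiet"])
```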

 

let me know if you find anything else good!



 

IT handles all software installs, but I tend to tweak the farm for workflow, plug-ins, etc.

 

Right now I am just looking to deploy plug-in updates, shader updates, INI file modifications, and so on.

 

I find it is far easier to handle this myself than to involve IT every time I need a change. I am looking for the most streamlined method that isn't over my head. In the past I simply ran a batch that had a shortcut in the Startup folder; however, the machines I am using for rendering now are dedicated, so I am looking to switch my workflow.

 

I figured if I could just send a command to launch a batch file through Backburner whenever I wanted to add a shader or change something, then that would be fairly easy.

 

I will have to read through that thread. Some time ago I was messing with 3dsmaxcmd.exe and figured out how to send Max files to Backburner through basic scripting, without the need to launch Max. That was fun, though I haven't used it since. It looks like I may be able to take what I learned from there and apply it to this "project," so I can update without launching Max, via Backburner. ...I think. Maybe?


I guess you could do that; maybe I'm just not seeing the benefit of going via Backburner instead of plain old DOS. For example, if it's a straight-up copy, you are probably just as well off doing a network copy: xcopy /s /y \\path\to\server\filesDeployment\*.* \\farmnode\c$\3dsmax2010\

 

The c$ is the default hidden admin share. If that doesn't work, you can create a file share and set which users it is limited to.
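If you wanted that same straight-up copy scripted, here is a sketch in Python that mirrors a deployment folder the way xcopy /s /y does. All paths and node names below are placeholders.

```python
# Sketch of the same straight-up copy in Python: mirror a deployment
# folder out to each node, the way xcopy /s /y does. All paths and
# node names below are placeholders.
import os
import shutil

def deploy(src, dest):
    """Recursively copy everything under src into dest, overwriting."""
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = os.path.join(dest, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            shutil.copy2(os.path.join(root, name), os.path.join(target, name))

# On the farm this would be something like:
# for node in ["rendernode01", "rendernode02"]:
#     deploy(r"\\server\filesDeployment", r"\\%s\c$\3dsmax2010" % node)
```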


Hmmm... Using the PsExec tool seems like a good idea, but it prompts me for the password to log in, even though I included it. Then, when I manually type the password, it says it couldn't access the computer.

 

I have tried connecting to machines through the command prompt in the past, and have had much the same luck. I am probably doing something simple wrong.

 

To get around this in the past, I simply placed a batch file in the Startup directory that only called another batch file located on the server. I would modify the batch file on the server if I needed an update. Since the machines in my old farm were all floor machines used during the day, it was easy to update: they would be started up at night, which caused the update to run.

 

Now I have a dedicated farm and floor machines. It feels to me like it would be cleaner to simply have a MaxScript that calls a batch file. I would use this to update my dedicated machines, but I would also include it as a pre-render script when I was adding floor machines.

 

I think it would save clicks when configuring new machines for the farm: I would only have to log in, launch Backburner, and walk away. As of now, I log in, run the batch scripts manually, and then launch Backburner.


OK. I managed to get it to work, but I had to do a workaround.

 

Instead of calling the batch file directly off the server, I called a shortcut on the desktop that pointed at the batch file on the server. It worked like a charm. I am going to try to find some time this weekend to streamline this method more.

 

For some reason, when I tried to call the script directly off of the server, it didn't understand what to do; it didn't understand it as an argument.

 

So I guess now it is very similar to my old method, only instead of the batch in the startup directory calling a batch on the server, I now call a shortcut on the desktop, which calls a batch on the server.

 

I owe you guys a beer.

Edited by Crazy Homeless Guy


 

Wow, you just described our situation at HMC exactly (from IT, to managing the software installs, to using a startup shortcut for updates). I'm following this thread enthusiastically, as I'm tired of not having a better way to manage the farm/floor nodes.


Hah, at least you're allowed to use the floor nodes; that's just an ungodly amount of unused power here.

 

I'm always curious to hear other ways to work it out.

 

I've also had some marginal luck using the 'at' command to schedule tasks on remote systems; that was a while back, though, and I don't recall why I abandoned it at the time. I do remember the one advantage was that you could trigger the task remotely from the Network Neighborhood explorer window.


The ideal setup, I believe (Maxer?) described at one point: they have a login script and simply reboot all floor machines into a certain render user during late-night hours.

 

I remember that post. It got me started on a process to take advantage of all the machines our firm uses.

 

Backburner Server and raysat run automatically under an admin account. Because all of our paths are UNC, the machines don't have to be logged on, only running. We can wake most of them (those on the same subnet as the issuing machine) via WOL. You can then use Backburner Monitor to set up the hours that the machines are available to render. Ours won't render before 7PM, and they are supposed to stop at 6AM. I say supposed because, out of the 50 or so I have used at one time, typically one or two will render past 6 (and I get an angry call). This is due to the refresh of Backburner: you can have all machines going past your 6AM wall, and as soon as you hit refresh they stop.
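For what it's worth, the WOL side of that is simple enough to script yourself: a "magic packet" is just six 0xFF bytes followed by the target MAC repeated sixteen times, broadcast over UDP. A minimal sketch (the MAC address below is a placeholder):

```python
# Sketch of the WOL side of that setup: a "magic packet" is six 0xFF
# bytes followed by the target MAC repeated sixteen times, broadcast
# over UDP. The MAC address below is a placeholder.
import socket

def magic_packet(mac):
    """Build the 102-byte magic packet for a MAC like 'AA:BB:CC:DD:EE:FF'."""
    raw = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    return b"\xff" * 6 + raw * 16

def wake(mac, broadcast="255.255.255.255", port=9):
    """Broadcast the magic packet on the local subnet."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(magic_packet(mac), (broadcast, port))
    sock.close()

# wake("AA:BB:CC:DD:EE:FF")  # placeholder MAC of a render node
```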

 

Pushing updates through Backburner sounds awesome.

