filipgavril

Members · Posts: 3


  1. Hello everyone, I have a problem that's been bugging me for the past few weeks, and I can't find any answers on what might be the cause or how it could be fixed, so I turn to you for aid.

     In short: I use V-Ray RT GPU quite a lot, and everything works exceptionally well until I add a couple of extra sticks of RAM to my build. Up until now I've used a Titan X and a 980 Ti on an Asus X99 Deluxe motherboard with an i7-5820K CPU and 4x4 GB RAM at 2666 MHz. Normally, I start the RT engine and the GPUs kick in at ~100% utilization. All is fine and well until I add another IDENTICAL set of RAM sticks (4x4 GB, 2666 MHz) to the build. With the new RAM, as soon as I start the RT engine, the CPU kicks in at about ~80% utilization and the GPUs stay at 20-30% utilization. This happens on ALL scenes regardless of size or complexity, and ONLY after I add the new RAM. If I remove the new RAM sticks, everything goes back to normal. This is the second set of RAM sticks I've tried; I thought the first set was faulty and had it replaced.

     I have absolutely no idea what causes this or how it can be solved. Any ideas? Could it be the hardware? Drivers? Any thoughts or speculations on this issue would be highly appreciated, as it's really starting to scratch at my brain.

     All the best, Filip
  2. Thank you, Dimitris and Kaiser, for the answers and help. On Nvidia's site I only see two drivers for the Titan X (347.88 and 350.12), and I tried both multiple times with no luck. Also, while working on my projects I noticed the Titan shows similar behavior in other software as well (idling with an open Photoshop file, the Titan is still "pedal to the metal"; same with other programs).

     I played with the fan curve a bit and got temperature and noise levels I think I can live with. I just wanted to know if this was a software issue and whether I can hope to see it resolved in some future driver update. I'll probably try going back and forth between the current drivers when I get some extra time on my hands and keep you posted on any changes I might experience. THANKS!
  3. Hello to everyone! I would very much appreciate any comments or advice that might ease my heart on the issue I describe below.

     I recently upgraded my build to an X99 platform running a 5820K and a Titan X in a Define R5 case packed with a couple of extra Noctua fans. Everything runs fine and dandy, with the Titan X sitting at 30-35 degrees C at idle, until I open 3ds Max 2015. As soon as I do, the GPU clock jumps through the roof and so does the temperature, to about 60-65 degrees, with no GPU usage. This feels rather odd to me, as the card is doing nothing: it just sits there with an empty scene while the temperature rises from the 30s to the 60s. I know this is because of the clock jump, but WHY does the clock jump? Previously I used a 560 Ti and a 2600K, also in 3ds Max 2015, and the temperatures were never above 35 even with complex scenes open.

     So the noob questions here are: WHY does the Titan X have its clock jump when I open 3ds Max? Is it from the Nvidia driver? Is it from 3ds Max settings? AND, are 60-65 degrees safe to work with on a daily basis? (~10 hours per day in 3ds Max, plus the occasional render left running over a night or two.)

     As I said above, I would very much appreciate anything that might shed some light on the matter. Thank you all!
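(Editor's aside for anyone trying to diagnose the GPU behavior described in these posts: the clock, utilization, and temperature figures mentioned can be logged from the command line with `nvidia-smi`, which ships with the NVIDIA driver. The Python sketch below is a hypothetical helper, not something from the posts themselves; it assumes `nvidia-smi` is on the PATH and uses its standard `--query-gpu` CSV output. The parsing function can be tried on sample text without a GPU present.)

```python
import csv
import io
import subprocess

# Standard nvidia-smi query-gpu properties for the values discussed above.
FIELDS = ["utilization.gpu", "clocks.sm", "temperature.gpu"]

def parse_smi_csv(text):
    """Parse nvidia-smi output produced with --format=csv,noheader,nounits
    into a list of dicts, one dict per GPU line."""
    rows = []
    for line in csv.reader(io.StringIO(text)):
        if not line:
            continue  # skip blank lines
        values = [v.strip() for v in line]
        rows.append(dict(zip(FIELDS, values)))
    return rows

def query_gpus():
    """Run nvidia-smi once and return parsed per-GPU stats.
    Assumes the NVIDIA driver (and thus nvidia-smi) is installed."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=" + ",".join(FIELDS),
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_smi_csv(out)

# Example: a two-GPU machine (say, a Titan X plus a 980 Ti) yields two lines.
sample = "20, 1000, 63\n30, 1075, 58\n"
print(parse_smi_csv(sample))
```

Running `query_gpus()` in a loop (or `nvidia-smi -l 1` directly) while opening 3ds Max or starting the RT engine would show exactly when the clock steps up and which GPU is underutilized.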