Welcome to Scifi-Meshes.com!

3DS Max + "Enable SSE"

Yaric Posts: 0 Member
I'm kinda new to 3ds Max and I'm rendering out an animation, 260 frames. I have a Core 2 Duo, an E6600. When I render it with no motion blur with the scanline renderer, it takes about 2-3 hours. When I render it with motion blur on, it takes about 40-50 hours. I've been running it for 40 hours now and it's almost done. Anyway, I was looking at the little menu that shows what it's doing and noticed an "Enable SSE" option. I looked it up and saw that it can make rendering faster if you have the proper processor, which I do, seeing as it's an Intel.

So does anyone know how much faster it'll make it run? It currently says I have 17 hours left on my animation, but that's probably more like 10 because of the scene: the object disappears soon, and that'll eliminate a lot of the motion blur. But I'm not sure if I should stop it, only to find out it's just as slow and waste even more time, or just let it finish.

Any suggestions?

Posts

  • DeadlyDarkness Posts: 0 Member
    I did a few tests with my version of Max (version 7, but still pretty much the same). Rendered a scene: 1:03 without SSE, 1:00 with, so it shaved about 3 seconds off the time on average. I'm on a dual-core Athlon X2 clocked at 2.4 GHz, but I'm sure they use the same instruction sets, minus the video-encoding ones. Basically, I believe it makes a very small difference, but you might as well leave it on.
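For what it's worth, the timing test above works out to only about a 5% saving. A quick sketch of the arithmetic (the 63 s and 60 s figures are just the 1:03 and 1:00 times from the post above):

```python
def speedup(t_before, t_after):
    """Percentage reduction in render time between two timed runs."""
    return (t_before - t_after) / t_before * 100

# DeadlyDarkness's scanline test: 1:03 without SSE, 1:00 with it
without_sse = 63  # seconds
with_sse = 60     # seconds
print(f"SSE saved about {speedup(without_sse, with_sse):.1f}% of render time")
# SSE saved about 4.8% of render time
```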
  • RedEye Posts: 0 Member
    I think a properly set up mental ray will render motion blur a lot faster than scanline. It's definitely not the easiest renderer to learn, though.
  • DeadlyDarkness Posts: 0 Member
    I myself never use motion blur in my animations, since if an object is moving fast enough, it'll blur naturally. Motion blur is really for stills in my opinion, but it could probably exaggerate certain effects in specialist cases (bet it'd be good for a BSG-style camera zoom effect).
  • Yaric Posts: 0 Member
    I would use mental ray, but for some reason the textures on the mesh I'm using don't work well with it and come out with all these weird specular errors. Anyway, I ended up having to use scanline. I guess I'll just let this render finish and then turn on SSE for future renders.

    The Athlon X2 has the same original SSE set as the Intels, but they lag behind on the newer extensions like SSE2, SSE3 and SSE4. I believe 3ds Max only uses the first SSE as of now, though.
  • DeadlyDarkness Posts: 0 Member
    My family of chips supports SSE, SSE2 and SSE3, but lacks SSE4 support. If you want faster render times, I'd try overclocking your processor: a 10% overclock cuts render time by roughly 10% (strictly about 9%, since render time scales with the inverse of clock speed). I had mine running at 2x 2.82 GHz, which is basically an 840 MHz overclock, and it reduced render times by a great deal, but my memory let me down: over the course of a day, severe corruption would set in and I'd have to do a cold shutdown. The Core 2s are famously good overclockers with a good motherboard, though; even with stock cooling you could probably do at least a 20% overclock with no problems. Just remember to look up memory dividers, HyperTransport frequencies and memory timings; those are the main causes of instability.
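The clock-scaling rule of thumb above can be sketched in a few lines, assuming the render is CPU-bound and time scales inversely with clock speed (an idealization: real scaling also depends on memory and cache, and the 45-hour figure here is just illustrative):

```python
def render_time(base_time_h, base_clock_ghz, new_clock_ghz):
    """Estimate render time, assuming it scales inversely with clock speed."""
    return base_time_h * base_clock_ghz / new_clock_ghz

# A hypothetical 45-hour motion-blur render on a stock 2.4 GHz E6600,
# re-estimated at a 2.82 GHz overclock:
print(round(render_time(45, 2.4, 2.82), 1))  # 38.3 (hours)

# Note the asymmetry: a 10% overclock gives 1/1.1 of the original time,
# i.e. about a 9% reduction, not a full 10%.
print(round((1 - 1 / 1.1) * 100, 1))  # 9.1
```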
  • havgunwilltravel Posts: 0 Member
    If you have pennies to spend, look into this; it might help a little bit...

    Supermicro MB H8QME-2-O -Socket F-Quad Opteron Dual-Core (Supermicro MBD-H8QME-2-B) | UpgradeNation.com {evil little grin}
  • Yaric Posts: 0 Member
    I actually did have mine overclocked when I was rendering before; had it at 3.11 GHz. But it hit some sort of error when I rendered something for days last time, so I decided to just put it back to the stock 2.4 and let it do its thing. Hopefully no errors. I could probably just overclock to around 3.0 and be alright, but I dunno.

    Oh, and I wouldn't bother with that quad Opteron system; pretty sure Intel already blows it out of the water, and in about 6 months to a year things will be ridiculous: 8 cores on one chip, and then a while after that the Nehalem processors with 16 cores.

    I hope AMD gets their act together and cranks out some good stuff soon, or else they're going to be in trouble.
  • havgunwilltravel Posts: 0 Member
    :shiner: AMD all the way, dude! If only the quad-core Intels weren't so damn expensive... And you're right, technology is changing all the time. I recently read an article about a chip manufacturer experimenting with micro-lasers to replace the on/off switches within a processor core, which would give it exponentially more processing power than anything Intel, AMD or anyone else can do...
  • Zardoz Posts: 0 Member
    The Athlon 64 family (64, X2 and FX) supports:
    MMX, Extended 3DNow!, SSE, SSE2, AMD64, Cool'n'Quiet, NX bit

    And the most recent versions of these CPUs also support SSE3.
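If you want to check what your own CPU advertises, feature lists like the one above correspond to flags the OS exposes (on Linux, the flags line of /proc/cpuinfo, where SSE3 historically shows up as pni). A minimal sketch that parses a hypothetical flags line:

```python
def sse_levels(flags_line):
    """Return which SSE generations a CPU flags line advertises."""
    flags = set(flags_line.lower().split())
    found = [name for name in ("sse", "sse2", "sse3", "sse4_1") if name in flags]
    if "pni" in flags and "sse3" not in found:
        found.insert(2, "sse3")  # Linux labels SSE3 as "pni"
    return found

# A hypothetical Athlon 64-style flags line, matching the feature list above
sample_flags = "fpu mmx 3dnowext 3dnow sse sse2 lm nx"
print(sse_levels(sample_flags))  # ['sse', 'sse2']
```

On a real Linux box you would feed in the actual flags line from /proc/cpuinfo instead of the sample string.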