Licensing Agreement | Terms of Service | Privacy Policy | EULA
© 2024 Daz Productions Inc. All Rights Reserved.
Comments
After puzzling over it for about an hour, I think I finally hit on the 'why'. I had chosen my z600 server as the main PC, but I had left the Render Node program actively running on that same PC while I was rendering with the main Carrara 8.5 app on the same machine. As you can see in the 2nd screenshot, it shows both the z600 node and the local host as 'working'. So it looks like if you're running a render node on the same machine as the Carrara main app, it will actually count that as even more render cores. Of course, it's only subdividing the work your existing render cores are doing, so I don't think there's much advantage to it, but it's fascinating that this can happen. Actually, I take that back: there can be an advantage, in that you can break the scene work up into even more buckets, which means fewer render buckets are likely to get stuck on a particularly complex part of the scene. So if you use the lowest bucket size, this will likely give a tiny render speed increase.
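The bucket reasoning above can be sketched numerically. This is a toy illustration only, with an assumed HD frame size and square buckets, not Carrara's actual scheduler:

```python
# Toy illustration of the bucket-size reasoning: smaller buckets mean more,
# finer-grained work units, so one pathologically slow region of the frame
# ties up a smaller share of the total work.
import math

def bucket_count(width, height, bucket):
    """Number of render buckets for a frame at a given square bucket size."""
    return math.ceil(width / bucket) * math.ceil(height / bucket)

frame_w, frame_h = 1920, 1080  # assumed frame size for the example
for size in (64, 32, 16):
    n = bucket_count(frame_w, frame_h, size)
    # Fraction of the total work stalled if a single bucket gets stuck:
    print(f"bucket {size:>2}px -> {n:>5} buckets, "
          f"one stuck bucket = {1 / n:.4%} of the frame")
```

With more cores (and render-node "cores") pulling from that queue, the finer subdivision is what lets the last complex bucket hold up less of the render.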
After figuring this out, I closed the Render Node application I had running on the main Carrara machine and tried again. More mystery! This time I was able to use all of my Nodes... but not all of the render cores that should have been usable. Keep in mind every Node is an i7, so each should have 8 render cores, which means if all cores on all the Nodes are working there should be 72 Node cores, but I topped out at 68. I'm so confused by this and trying to figure out what is happening. I really thought we'd hit a 64 render core ceiling on the Nodes, but now I'm clearly getting to 68... but then what happened to the other 4 cores on that last node? I borrowed my brother's laptop to run the test, and it suddenly occurs to me (just now as I'm writing this) that maybe for some unknown reason he has hyperthreading disabled? I already gave the laptop back, but I'll have to check this tomorrow; maybe that's the culprit...
BTW, I also had a good education in D&D. Just to put my nerd creds out there, I spent many an evening reading through the Gazetteer series or the various Monster Manuals. Was it 3rd edition that went d20? I know I played a lot of 3rd edition, but I hear they're up to 5th edition these days.
Yes, 5th edition, while also reissuing older versions as well. I really like 3rd edition, and I've bought most of the core rulebooks for 4th edition, which is kind of a cool change. I also had a really cool game from Games Workshop: Warhammer Quest, which was a much simpler approach to the rules, using a randomized deck to determine which dungeon tile to lay next, with other cards, tables and die rolls to determine the encounters. I loved it. I collected the whole thing, painted the minis to a professional level, bought a huge supplement of other Warhammer monsters and such which went along with the bestiary section of the advanced rules, etc.
My nephew always loved that game so I ended up giving it to him when he went to college. I really do miss it though. That was my big pull towards E-Bay - looking for Warhammer Quest as well as their SciFi game: Space Hulk.
That's the same thing that's happening to me... It's like it is counting machines. But according to the Grid documentation, it should handle 100 CPUs and 50 render nodes. If that's the case, and CPUs are cores, then render nodes might be machines. But we are WAY below 50 machines, so it should still be working.
Boojum
...and even below 100 cores
Ok, so I checked my brother's laptop this morning and realized it's powered by an i7-3520M, which is one of those i7s that only has 2 cores hyperthreaded to 4 (in other words, false advertising; they really ought to call this an i5, not an i7). That explains why I was only getting 4 render cores from it, but now I don't know how to experiment to break through above 70 render cores on my nodes. I would have to buy another computer just to run the experiment, but at the same time I don't want to buy another machine if Grid won't support it, so I'm conflicted. Hmm, maybe I can make my little Asus T100 tablet a node; it's got 4 (very weak) cores...
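For what it's worth, the core math in this thread checks out once the laptop is counted as a 2-core part. A quick sketch, with node names and counts assumed from the posts above:

```python
# Sanity check of the core math in this thread. Physical cores x SMT threads
# gives the logical cores a node contributes; the i7-3520M is a 2-core
# hyperthreaded part, which is why that laptop only contributed 4.
nodes = {
    # name: (physical_cores, threads_per_core) -- assumed from the posts
    **{f"i7 desktop {i}": (4, 2) for i in range(1, 9)},  # eight 8-thread i7s
    "i7-3520M laptop": (2, 2),                           # only 4 threads
}

total_logical = sum(phys * smt for phys, smt in nodes.values())
print(f"expected render cores across nodes: {total_logical}")  # 8*8 + 4 = 68
```

So the mysterious 68 is exactly what nine "i7" nodes add up to when one of them is a dual-core mobile chip; no hidden GRID ceiling is needed to explain it.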
Sorry for the OT again, but after some digging, I've found the image I made for the fourth Neverwinter Con, which was celebrating the new NWN 2 - two games, infinite possibilities!
I ran a room during the convention, showing off my "Adventure Gear" supplement, which allowed users to add backpacks and other assorted goodies directly during gameplay. One was a simple backpack item, another was a backpack with a torch and a bow strapped to the back, there was a medieval-style bathrobe open in the front, and one that overwrote the appearance of the character's arms and made them look like they were carrying a fallen companion... it was really popular, putting me as an artist into the NW Vault Hall of Fame! ;) which inevitably landed me a position with C.R.A.P. and then the CEP. What a fun ride it was.
We had a pack that allowed us to play a Firefly (TV show) RPG before the Serenity RPG (by Margaret Weis) came out. I ran a game as DM called "Out in the Black".
We had a Knights of the Old Republic conversion for playing Star Wars, complete with Darth Vader, C-3PO, R2-D2, battle droids, lightsabers, blasters, starships, plenty of sound effects and music, etc.
There was (is?) a d20 Modern team making all of that stuff work: cars, guns, modern buildings, etc. Another team made Planescape, and I was part of the Spelljammer team (an AD&D 2nd edition supplement). There were prestige classes galore, and just about every monster, creature and class available to D&D, including the many age categories of dragons from wyrmling to Ancient Wyrm. One fellow even made a dragon-riding system... we truly were a force to be reckoned with, now that I think about it!
So with all of that history, even though I gladly purchased every upgrade and supplement available on disc as soon as they were officially available, I also bought the whole works over again in DRM-Free form from GOG when that came out. After a certain time, BioWare removed the need to have a disc in the drive to play - basically officially making it DRM-Free anyways... the GOG version allows me to install the whole works without the need for a disc drive (like on my tiny netbook-like laptop, which plays NWN like a dream!) which is super handy!
All of that jazz is what eventually led me to Carrara. This image was done in Poser with Victoria 3, I think, as Rosie who, as a matter of fact, defeats the Blue Dragon she faces here! ;)
This image helped me to realize that I really want to animate, to make fantasy movies, simply because there were hardly any to choose from then. My, have the times changed since... but it doesn't change my need to create them anyway!
So I guess that brings us full circle to my interest in you getting my render nodes working for me before you send them to me! :) (hint hint!) LOL
Well, I got my test render done. This is it with no post processing. I don't know how much time I saved because I kept pausing things, making tweaks to machines or home environment, and then resuming the render. You can have a peek at http://www.nekoden.org/video/2016-10-17RocksAndSnow.MP4
For my NEXT render, I will be trying to make a Halloween video.
Very nice! If I hadn't known it was a render, I would have thought it was a real shot from a helicopter flying over/beside a mountain range (actually looked very similar to where I live).
Too many hot pops/noise and too little ambient atmosphere... but it's a start. :D I can fix most of that in post, but I probably won't bother. I'm going to dive right in on my Halloween animation!
Ok! I did a bit more antialiasing on the video during processing of the individual frames. You can see it at http://www.nekoden.org/video/2016-10-18RocksAndSnow.mp4
Boojum
So either the second one is just too taxing on the playback software (I get that, due to my rendering to full frames) or the higher settings have caused jitters in the render, possibly due to the various cores calculating the higher setting slightly differently.
I love the animation, though. Very breathtaking!
I agree that the first one could be fixed pretty nicely in post with the slightest bit of motion blur, just to equalize each frame with the next. That's at least the approach I'd start with ;)
Great Job Boojum!
Oh, I forgot to update things here. I contacted tech support, told them what was happening, and their response was "It must be the fault of your CPUs, because we tested Grid on other CPUs," then they closed it as resolved. Yay... support...
*sigh*
Boojum
Ok, that post makes it sound like they closed it right after the message saying it must be my CPUs' fault. They actually closed it a couple of days ago, after the conversation had gone quiet for months, so I don't blame them for closing it, though saying it's resolved is not particularly true.
Boojum
Hi Boojum, very nice renders - although the second one stutters for me too. First one looks fine though.
I have a feeling that when the last iteration of GRID was written, nobody at DAZ had actually tested it with anything like 100 cores. Considering this was probably around 2005, the hardware they'd have needed to test GRID with would have been extremely expensive. Back in those days multi-core CPUs were just coming out, and no average user would have had a workstation with 8 cores, let alone 10 of those lying around. Logic would dictate that someone in the programming department needs to pull their finger out and update GRID... but it's probably at the bottom of the priority list, right after "releasing Carrara 9".
It almost sounds to me as if Carrara is evaluating how many cores there are per machine, and in seeing "the next machine has 8 available", it creeps out and says "yeah, that's too much - I can only support another 4, so I'm stopping here". Or perhaps those hexacore CPUs are something that Carrara is afraid of?
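Just to make that guess concrete, here's a toy version of the whole-machine admission policy being described. The budget number and the `admit_nodes` helper are hypothetical illustrations, not anything from Carrara:

```python
# Toy sketch of the guessed-at admission behavior -- NOT Carrara's actual
# logic. A scheduler with a hard core budget that only admits whole machines
# will stop short as soon as the next machine doesn't fit.
def admit_nodes(core_budget, machine_cores):
    """Greedily admit whole machines until the next one would overshoot."""
    used = 0
    admitted = []
    for cores in machine_cores:
        if used + cores > core_budget:
            break  # "that's too much -- I can only support another few"
        used += cores
        admitted.append(cores)
    return used, admitted

# Nine 8-core nodes against a hypothetical 68-core budget:
used, admitted = admit_nodes(68, [8] * 9)
print(f"admitted {len(admitted)} machines, {used} cores; "
      f"{68 - used} budgeted cores left stranded")
```

Note that this policy would actually top out at 64 here, not 68, and the 68 was accounted for elsewhere in the thread by the 4-thread laptop, so take the cap theory with a grain of salt.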
Threads like these are really helpful, I find. GRID has always been such a mystery; I'm so glad you guys are sharing your experiences.