I'm thinking of building a 4P server; need to learn more about distributed rendering...
Hello!
I have done some limited testing with Bryce Lightning on two Windows 8 machines, so I have a tiny tiny bit of ... well, let's just call it "conceptual understanding".
I am thinking of building a 2P or 4P server system sometime in early 2014. It will be an addition to my network, not a replacement. It will be based on either an AMD 6xxx (Opteron) or Intel E5 (Xeon) processor.
My tentative thinking is to go heavy on CPU cores, such as with a 16-core Opteron or an 8-core Xeon (16 threads via Hyperthreading). Over time, I may also add two or more dual GPU cards such as those based on the AMD HD7990. I do other work that will take advantage of many CPU and GPU cores when not rendering, so this system would be running 24/7 anyhow.
My problem lies with the choice of operating system for this "rendering server".
1. If I go with a 2P (2-socket) system, I can easily run Windows 8. As I understand it, Windows 8 will run with up to 2 CPU sockets, and I'm already familiar with it.
2. If I go with a 4P (4-socket) system, then I would not be able to take advantage of all 4 CPU sockets with Windows 8; instead, I would have to buy two licenses of Windows Server 2012 R2 Standard, which would put my OS costs into the $1,600 to $1,800 range.
That's pretty rich, so I'm wondering if I can effectively use Linux instead of Windows for this server and still do everything I need to do. I know I can use Linux for many other tasks such as web browsing, office-type software, graphic arts, and even music (making and playing). I know there are other animation rendering software products that run on Linux, but I'm more interested in speeding up my current Windows-based Bryce, DAZ Studio, and Carrara renders.
Can a new Linux server handle remote rendering work from DAZ's Windows-based products for the three I mentioned? If so, what software/plugins would I need on both the client (my current Windows workstation) and the server (the proposed new box)?
I saw Luxus for Carrara and it looks like maybe it would run on Linux, yes? Are there other possibilities?
Are there any basic hardware requirements over and above the typical needed for any Linux-based server?
Here are the other features I'm considering:
1200 watt power supply
32 GB RAM or more
802.11n (or ac) connectivity to my network
512 GB SSD
A couple of hard drives for media data and a huge hard drive for network backups of the other computers in my home
Thanks in advance!
Comments
I use an array of Dell SC1425 servers, bought second-hand off eBay, as a small render farm for Carrara. They run WinXP 64-bit and only have the Carrara render node software installed. A standard PC, also running WinXP 64-bit, has a full version of Carrara, and all scenes are stored on that machine. The scenes are rendered from there using the built-in network render function of Carrara.
So long as you have the licences, it's straightforward to set up.
If you buy second-hand servers, which are usually designed to run Windows Server software (typically 2003), make sure you check the manufacturer's website to confirm that the server setup software is available to download. I know that the 64-bit drivers for 2003 are compatible with other Windows XP and 7 64-bit versions.
Thank you for sharing your method. Since my original post, I see that Intel is planning to release a new set of server Xeon CPUs, some with as many as 12 cores/24 threads per CPU socket! Expensive, yes. But this is bound to get interesting!
I hope you can handle the electric bill!
If I recall correctly, Boojum set up a render farm in his garage a few years ago, using mainly separate motherboards, CPUs, RAM, and 250-watt power supplies, and I believe he ran them with VNC so he wouldn't need graphics cards for them. I think he said he figured out how much power they would use before he decided on their final number. I know from running space heaters in the winter that high wattages can really bump up your electric bill, and some of the servers you are mentioning can really suck up the watts. You might want to search/ask around to find out how efficient they are, as I often see 1,000 or 1,200 watt power supplies mentioned in specs. Maybe Boojum will post more accurate specs than what I can provide.
Carrara Pro has a core limit I think, besides a CPU limit, but you can add on with GRID (Render Node provides 10 nodes and 20 CPUs out of the box; GRID offers up to 50 nodes and up to 100 CPUs, and additional GRID licenses further increase the maximum count by 50/100). DAZ Studio uses a version of 3Delight that has no core limit on one PC.
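To make that licence math concrete, here is a small sketch that just restates the figures quoted above (10 nodes/20 CPUs out of the box, 50 nodes/100 CPUs per GRID license). How the base and GRID limits combine is my reading of the wording, so treat this as an illustration, not an official licensing calculator:

```python
# Carrara render-node limits as quoted in this thread (illustrative only).
BASE_NODES, BASE_CPUS = 10, 20    # Render Node, out of the box
GRID_NODES, GRID_CPUS = 50, 100   # per GRID license

def render_farm_limits(grid_licenses: int) -> tuple[int, int]:
    """Return (max nodes, max CPUs) for a given number of GRID licenses,
    reading the quoted figures as: no GRID = 10/20, each GRID license = 50/100."""
    if grid_licenses == 0:
        return BASE_NODES, BASE_CPUS
    return grid_licenses * GRID_NODES, grid_licenses * GRID_CPUS

for licenses in range(3):
    nodes, cpus = render_farm_limits(licenses)
    print(f"{licenses} GRID license(s): up to {nodes} nodes / {cpus} CPUs")
```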
I'm not entirely sure you should start with hardware specs for something like this. I think first you should look at how you intend to render --meaning what render engine you are going to use. If you're talking about Carrara's native renderer, you will have different considerations than if you decide to use Octane Render. If you choose Octane Render, then the AMD HD 7990 is pretty much useless because it doesn't support CUDA, which is required for Octane Render --it supports OpenCL. OpenCL is supported in LuxRender, so that AMD card would help in that instance, but not necessarily for other render methods. In a network render environment, you could pretty much content yourself with getting your hands on a bunch of old P4s from an auction site, off-lease machines, an older blade server, or something like that.
In this instance you're going to have to decide how you want to render first --then do your research into how best to allocate your project budget to facilitate that. That AMD card is going to run you from $600 to $800, and it would be a shame for you to spend that kind of money only to find that it's unsupported in the render method you've chosen. I have a dual Xeon system with 64GB of RAM, and while it does provide advantages, it's not going to be the hyperspeed improvement I think you're looking for. Just start with this --are you rendering animations, still images, or a combination of both? Then move to deciding which method of rendering you're going to use. Once you know that, picking the best hardware to help you becomes easier to do.
Hello and thank you for your input! Yes, I know the electric bill will be one thing to consider. I read about one small animation company in the UK whose rack of servers was adding about a thousand a month to the electric bill. Pounds, Euros, or converted to US Dollars, I can't remember, but it was a thousand, which is a lot no matter which denomination you carry in your bank account!
I plan to control electricity costs somewhat in these ways:
1. The AMD CPUs have a lot of cores and are fairly efficient with electricity. The new Intel CPUs I mentioned in my more recent post are Ivy Bridge. Regardless of whether I go AMD or Intel, they are technically not "low power", but they are more efficient, even when they are working, and when they are not working they virtually go to sleep.
2. I want to do this with one single system. I don't really want a bunch of computers running in my house drawing power and generating heat. One workstation and one server. A BIG server. But just one, primarily to simplify maintenance, keep the heat from getting out of control, and limit all of the power-sucking parts.
3. I already use efficient power supplies. Yes, you are right that there are 1,000 and 1,200 watt power supplies, and I even have an old 1,200 watt unit in a box of spare parts, but for the rendering server I think I might want to buy one of the new, more efficient ones, and I'll need a new UPS to match, of course.
4. Yep, I know about the Carrara limits on rendering cores; I'm assuming that I would have to budget the added cost of GRID licenses if I continue to use Carrara's renderer for remote work on the server.
The cost of electricity could be the prohibitive factor if I'm not careful.
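As a rough sanity check on the power cost, here is a small back-of-the-envelope sketch. The average draw and the electricity rate are illustrative assumptions, not measured figures, so the real numbers would have to be plugged in:

```python
# Back-of-the-envelope monthly power cost for a server running 24/7.
# The average draw and the rate below are illustrative assumptions only.

avg_draw_watts = 800        # assumed average draw under load (well below a 1,200 W PSU's label)
hours_per_month = 24 * 30   # running around the clock
rate_per_kwh = 0.15         # assumed electricity rate per kWh (varies a lot by locale)

kwh_per_month = avg_draw_watts / 1000 * hours_per_month
monthly_cost = kwh_per_month * rate_per_kwh

print(f"{kwh_per_month:.0f} kWh per month, roughly {monthly_cost:.2f} per month at {rate_per_kwh}/kWh")
```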
This is excellent advice, thank you.
Yes, the HD series does not support CUDA; it supports OpenCL. I had originally chosen the HD 7990 because it is the most economical dual-GPU card out there, because most of them are only dual-slot size (so I could possibly fit up to three in a single enclosure), and because I have had very good experience and reliability with the HD 7970 already, which gets a lot of use as a compute processor (though not for rendering). It's holding up so well, I figured I'd just stick with the same family.
But you raise a good point, so I need to learn more about the differences between rendering with CUDA and rendering with OpenCL and what my options might be. To be clear, we are talking about using the card (in the server) mostly for rendering, and only a little bit for driving a monitor.
As for the eBay idea, I'm not planning to go that route. I want to stick with one system for the server, and I don't think those cases have room for multiple GPU cards; cooling is also more of a challenge. I think I want to just build this as a new environment and put it in a floor-standing case (like a tower, or a short, squat system of "R2D2" dimensions) if possible, and maybe do it over time rather than all at once.
By the way, I have tons of time on this project; I'm still learning Carrara! :)
So, the thing to remember is that Carrara's internal renderer is completely CPU-based. LuxRender is supposed to get either GPU-enhanced or full GPU render abilities, from what I've read. I can't speak to all the other geek speak. ;-)
So the question I would ask myself is: what do I want to render? Is it still images or animations? If it's animations, then LuxRender would seem to be out of the question, as the render times are horrific at the moment. Things could change with the GPU rendering capabilities.
If I had the coin to do what you're wanting to do, my advice, based on what I like to do (a mix of stills and animation), would be to load up on multi-core CPUs, save money by going with stock or cheap graphics cards, but make sure the computer can accept graphics card upgrades later, of the type I would need for GPU rendering using LuxRender or whatever else may come Carrara's way.
This is just my opinion, of course. It's always easier to spend another person's money than your own. ;-)
EP, your post made me laugh; thanks!
I already have some workloads (not related to rendering) that would benefit from many CPUs with many cores, and also from many GPUs and even dual GPUs (because I can get even more processing power per PCIe slot), so I could move ahead as soon as I'm able to fund my urges here! :-)
Ain't that the truth! Thank you for the points; more to think about...
Here is a system spec based on an E5 Xeon processor:
• Processor: Intel Xeon E5-2660 Sandy Bridge-EP 2.2GHz (3GHz Turbo Boost) LGA 2011 95W 8-Core Server Processor BX80621E52660
• Processor: Intel Xeon E5-2660 Sandy Bridge-EP 2.2GHz (3GHz Turbo Boost) LGA 2011 95W 8-Core Server Processor BX80621E52660
• Cooler: Rosewill ROCC-12001 AIOLOS 120mm Long Life Sleeve CPU Cooler Compatible Intel Core i5 & Core i7
• Cooler: Rosewill ROCC-12001 AIOLOS 120mm Long Life Sleeve CPU Cooler Compatible Intel Core i5 & Core i7
• Motherboard: ASUS Z9PE-D8 WS Dual LGA 2011 Intel C602 SATA 6Gb/s USB 3.0 SSI EEB Intel Motherboard
• Video Card: NVIDIA® Quadro® K4000 VCQK4000-PB 3GB GDDR5 PCI Express 2.0 x16 Workstation Video Card
• Memory: PNY XLR8 16GB (2 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Desktop Memory Model MD16384KD3-1600-X9
• Memory: PNY XLR8 16GB (2 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Desktop Memory Model MD16384KD3-1600-X9
• Memory: PNY XLR8 16GB (2 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Desktop Memory Model MD16384KD3-1600-X9
• Memory: PNY XLR8 16GB (2 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Desktop Memory Model MD16384KD3-1600-X9
• Power Supply: Rosewill LIGHTNING-1300 1300W Continuous @ 50°C, Intel Haswell Ready, 80 PLUS GOLD, ATX12V v2.31 & EPS12V v2.92, SLI/CrossFire ...
• Case: Rosewill BLACKHAWK-ULTRA Gaming Super Tower Computer Case, support up to HPTX, come with Eight Fans,Top HDD docking
• SSD: PNY XLR8 PRO SSD9SC480GCDA-RB 2.5" 480GB SATA III Internal Solid State Drive (SSD)
• SSD: PNY XLR8 PRO SSD9SC480GCDA-RB 2.5" 480GB SATA III Internal Solid State Drive (SSD)
• HDD: Western Digital WD Black WD4001FAEX 4TB 7200 RPM 64MB Cache SATA 6.0Gb/s 3.5" Internal Hard Drive
• Blu-ray: ASUS Black Blu-ray Burner SATA BW-12B1ST/BLK/G/AS
• Card Reader: Rosewill RCR-IC001 40-in-1 USB 2.0 3.5" Internal Card Reader w/ USB Port / Extra Silver Face Plate
I am going to build basically this system, but the 2660 processors are about $1,500 each, so I am going to go with the 2630s at only $650 each. Yes, it is slower, but the motherboard will handle the 12-core processors that you mentioned, so a few years down the road I can pick those up as an upgrade for (most likely) the same $650 each that I spent on the initial processors.
Plus, the other parts of the system would still be great. The motherboard is top notch and will handle up to 4 video cards. It takes advantage of an SSD as a swap file (that's why the spec calls for two), so this gives you a large and fast swap file area for large renders, and the 64 GB of RAM is already great.
I also looked at the new Ivy Bridge, which can initially use faster memory, so look toward 2166 (always go faster than spec, because everything will work at slower speeds, and some change will come along allowing you to go faster).
The mid-range dual processors will still do great when rendering, and the ability to upgrade the system is priceless...
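Just to spell out the cost math behind that plan, here is a tiny sketch using the rough prices I mentioned above (ballpark figures, not actual quotes, and the assumption that a future 12-core eventually drops to a similar price is just a guess):

```python
# Rough cost comparison for the "buy mid-range now, upgrade later" plan.
# All prices are ballpark figures from this thread, not actual quotes.

PRICE_E5_2660 = 1500    # per CPU, roughly
PRICE_E5_2630 = 650     # per CPU, roughly
CPUS_PER_BOARD = 2      # dual-socket board

buy_2660_now = CPUS_PER_BOARD * PRICE_E5_2660
buy_2630_now = CPUS_PER_BOARD * PRICE_E5_2630
# Guess: future 12-core chips eventually fall to roughly the 2630's price.
later_upgrade = CPUS_PER_BOARD * PRICE_E5_2630

print(f"2x E5-2660 up front:             ${buy_2660_now}")
print(f"2x E5-2630 up front:             ${buy_2630_now}")
print(f"2x E5-2630 now + 12-core later:  ${buy_2630_now + later_upgrade}")
```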