Comments
One important lesson I learned in a lifetime of troubleshooting is not to overthink things. It is easy to think through a problem and end up saying "well, that couldn't be it," only to find out that, for some bizarre reason, it was. This really got driven home by a particular tech I worked with who would come up with the craziest things, and I would say "that has nothing to do with the problem, there's no point wasting our time even checking that," only to have him go off, test it, and find that it did, through some convoluted backdoor reason. For that reason, my troubleshooting sometimes seems like throwing darts at a wall. It didn't use to be that way; it just became like that after a lifetime of seeing that approach work. We still need to keep up on technology and how and why things work, we just can't let that knowledge get in the way of troubleshooting. ;)
Try saving the default cube as two test files (on your lappy): once with BI and once with Cycles, at a small resolution like 64 by 64. Then drop them on your box and command-line render them. If either renders, then it's gotta be the video driver. As nonesuch noted, Blender has already built its environment and is just starting on the interface at the moment of death. That's suggestive, right there.
Of course, I'd recommend Joe's test first, and if that doesn't work, disabling one card before going to the command-line render test. Or just disable one card and see what happens.
That's a good idea, SB; I didn't think of that. That's why I love brainstorming in groups. :)
On a totally separate topic: since we (mostly me :p ) were so religious about not using ngons and booleans when this thread started, I thought I should post an update at some point. Things have changed, and that is no longer the case (as much). Both still give issues at times when subdividing, when used in a deformable (organic vs. hard-surface) mesh, etc., but they have come a long way. Both are valid tools that can speed up our workflow in the right circumstances, and ngons even transfer decently between different environments now, as most applications have become much better at handling them. As for booleans and transferring, tools now generate much better topology when the boolean is applied, which makes them much more useful in hard-surface modeling where the mesh isn't going to deform (at least in that area). So keep flexible and move with the times, I guess, and realize they can still cause issues, but the issues they cause can be managed, and if necessary we can go back and fix the geometry much more easily than we used to be able to. Remember, it's not just about being able to do something, it's about doing it efficiently, and in some cases that now includes ngons and booleans. Right tool for the right job. :)
I forgot the link to the Blender command-line arguments, although I think you must have it yourself already:
https://docs.blender.org/manual/ja/dev/advanced/command_line/arguments.html
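For reference, the background render I had in mind would look roughly like this (assuming you named the test files test_bi.blend and test_cycles.blend; adjust the paths to wherever you dropped them on the box):

blender -b test_bi.blend -f 1
blender -b test_cycles.blend -f 1

-b runs Blender without opening the interface and -f 1 renders frame 1, so if those finish on the box it points even more strongly at the display driver.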
Hi Joe:
That last item will do what you suggest (make Blender run on one card), and it worked (more in a subsequent post). I can't tell Blender to use only one card because I can't get to the UI, but if Windows can't see the card, then Blender surely won't see it either.
Nonesuch, I just tried those things. No change in behavior.
Thank you for doing a log on a successful install; that gives us more clues. It's something happening at or right before menu build time! Sheesh, I should have thought of doing that on my laptop since it's working! D'oh!
Be cool to have this in DAZ Studio natively!
We have a winner!
>>> Disable the second graphics card from Windows Device Manager. <<<
Looking now for the option (in the Blender preferences) to restrict to 1 GPU. After that, I'll reactivate the 2nd GPU in Device Manager and make sure it works.
I can't seem to find a way to force Blender to use only one card. The only place I think would govern this is the "System" tab, but I'm not sure how I can "force to 1 GPU card" when only one was active at the time I started Blender. It would be less of a hassle if I could use Blender without having to remember to disengage that other card. Any ideas?
Under Preferences/System/Cycles Compute Device, does it list two cards? If so, can't you de-select one of them?
Hi and thanks. No, because in order to start Blender, I had to disable one of the cards in Device Manager.
Edit: I wonder, is there a way to manually edit userpref.blend and try to put the second card in there as disabled? I tried looking at it with Notepad, but it looks like it's got some special formatting.
If it turns out that I must use Blender with only one GPU card active, then can I write a script that will disable the 2nd card before invoking Blender, and enable it when exiting? Remembering to do it manually will just become an annoyance over time.
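Along the lines of the userpref idea above, I'm also wondering whether a little Python run through Blender itself (something like blender --background --python set_one_gpu.py, where set_one_gpu.py is just whatever I name the file) could write the preference for me. This is roughly what I have in mind; I'm guessing at the exact attribute names for my Blender version, and I don't even know if it would help, since the crash seems to happen before Cycles gets involved:

import bpy

# Cycles add-on preferences (older builds use user_preferences rather than preferences)
prefs = bpy.context.user_preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()  # populate the device list

# enable only the first CUDA device, turn everything else off
cuda = [d for d in prefs.devices if d.type == 'CUDA']
for i, d in enumerate(cuda):
    d.use = (i == 0)

bpy.ops.wm.save_userpref()  # write userpref.blend so the setting sticks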
Well, now that you've definitively identified it as the 2nd video card, you need to gather the driver versions in use for both cards, the hardware versions of both cards, and the Blender log, and take the diagnostics directly to Blender support. Those are the type of problems they like to resolve, and they more than likely have a resolution in their bug database already.
Thank you for that advice. I may do that.
Both cards are GTX 980, one from Asus, the other from Zotac. Both were purchased on the same day and have pretty much all the same features and specifications. In GPU-Z, I see only minor differences reported (less than 5% difference in speed of GPU, speed of memory, etc).
I am pricing 1080 Ti cards right now. At least I know what to look for now, and I would definitely want to see those cards work correctly with Blender.
Well, I don't know how much hassle you're willing to go through to narrow the problem down, but you could swap the two cards between slots and see whether the problem follows the card or stays with the slot, and so by extension the motherboard. Some slots intentionally don't function the same as others, though.
Yes, in my X99 board, different slots have different bandwidth characteristics. I thought I had chosen samsies when I built it, but maybe not.
I have a couple other cards in there too (a firewire card for my Focusrite audio interface and a UAD-2 DSP card for audio effects), so right now I can't swap the GPUs to other slots. Compounding matters is the fact that the Zotac takes up 3 slots of width, so right now he's in slot 6, which lets him hang off the end of the motherboard.
I do realize that these limitations basically mean that this problem may not go away unless/until I can eliminate some PCI cards and/or move my GPUs around.
I see that GTX 1080 Ti cards are going for $700 to $750. $1500 for two cards is kind of rich for me right now, so I'll probably wait another few months. I'm looking for prices down around $1200 or so for the two.
You want to run cards from the same manufacturer/model when running dual cards, because timing issues between different models or manufacturers can cause problems. I didn't think to ask this because I pretty much take it for granted. Also, as nonesuch hinted at, some cards don't run well with some system boards, even when both are high end (again, timing issues), so research this also.
Right you are, Joe. I found that out the hard way when I first tried dual video cards.
Before I get into my (mild) disagreement, does anybody know of a way to disable a 2nd card in Blender before enabling that card in Device Manager? I sure would like to have that second card online for other applications (which work just fine), but I can't access userprefs if Blender has crashed.
So here's why I disagree about the dual cards thing. First, I get what you're saying, and I would agree in a perfect world. But for a number of years now, I've been able to run non-homogeneous video cards without any trouble at all, even with Blender (until now). These two 980s are the CLOSEST I've ever gotten to having "perfectly homogeneous" cards (which I'm sure doesn't really happen even with cards from the same manufacturer). I never had a problem before, not even with Blender. If I were running SLI, I could see what you're saying. But I'm running them separately.
Maybe Blender now has a more stringent check. Whatever is happening under the covers, this problem has appeared only in Blender. In fact, it showed up only in the STARTUP of Blender, even before I asked the cards to actually do any rendering work.
Everything else works fine with both cards. Except for Blender.
I'd rather not, but if necessary, I can live with this limitation in Blender. And of course, the fact is that Blender is the exception here, not the rule. (edit: Wow, I guess I really beat that dead horse, didn't I? )
Anyway, thanks for your patience. You did turn me on to the ultimate issue, and for that I'm thankful!
On the Render Settings tab in Blender: https://blender.stackexchange.com/questions/68257/does-blender-support-dual-gpu-rendering-if-the-two-gpus-arent-the-same/68269
Also, in the Nvidia Control Panel, under Manage 3D Settings on the Program Settings tab, you can assign applications to a GPU. I actually have dual video cards in my notebook, but I'm running them separately: the integrated GPU for most programs and the Nvidia one for specific programs like Photoshop, Unreal/Unity, Blender, etc. You may have been able to just add Blender in the Nvidia Control Panel and set it to one card and then the other to troubleshoot which one was causing the issue; I didn't think of that sooner. Missing things like that happens more often when troubleshooting remotely, unfortunately. Btw, if you have an onboard video chip on your system board, you can set basic apps to that GPU and save the discrete cards for intensive things. I'm running a 4K monitor on my integrated GPU without a hiccup.
Is there a wiki or written documentation source for Blender?
I just can't with video stuff
https://docs.blender.org/manual/en/dev/
It's glorious.
I still find the UI design inscrutable, but not spending $800 for a mid-range modeling package appeals to me.
When it changed to 2.5, the UI became what it is now; prior to that, I felt about it as you do now. Since then it has made sense to me and I've loved it, but shortcuts are really the key to efficient use. It is certainly possible to do everything without touching a shortcut, but it is so much faster using them.
Will, if you can figure out shaders (and that mind-boggling node system), then the Blender UI should be a breeze for you. I just pick the bits I need and read up (or watch a video) on that aspect. First off, though, follow something that explains basic navigation - those shortcuts will come in handy no matter what you do in Blender. I'd guess I know less than 10% of the total UI, but it is enough to get me started on Sculpting, Mesh Editing and even Video Editing. I have no clue about texturing or baking normals/displacement etc. One day, maybe, one day.
Marble, I giggled when you mentioned that you thought ZBrush's UI was confusing, but then most folks think that Blender's is! Including me! lmao
Oh yes - ZBrush was probably a step beyond Blender for me when it comes to hard-to-navigate UIs. I don't think I ever found out how to close the program without clicking the X in the corner. Also, I had no idea how to save a preferred layout, so I would initialise on every startup and that would bring me back to something I could work with. Then it would bork with some products I was trying to morph, so sending them back to DS would result in an error about changed vertices (I think it had to do with ngons, but I don't have a clue what those are). That's when I started using Blender because a) I knew that my access to ZBrush was temporary (it was on a computer owned by the company I worked for and I had to return it when I retired), and b) Blender didn't screw up the vertex count if I told it not to.
...yeah, still waiting for my Super Secret Blender Decoder Ring to come in the post (egads, I hate breakfast cereals but bagels don't come with box tops).
I am always fascinated by how different people react to different programs.
I must be the only person on the planet who thinks Blender is very smartly laid out and brilliantly flexible.
I watched one ten-minute video that described what each window type did and where the basic options were, and everything made sense after that. Gradually learning less than a dozen shortcuts let me do things in seconds that take forever in other programs. Admittedly, before I found that video I thought it looked very intimidating, but everything makes perfect sense once you get the logic, and it's easy to find things via the menus. Hexagon, on the other hand, which is often held up as an example of a clean, clear, easy-to-understand layout, just boggles my mind - I did tutorial after tutorial with it, would go away for a few weeks, then come back and not be able to figure out how to do anything. Same with Poser back when I tried that, and I still find things in DS a bit awkward to use, although it has features I love too.
I guess people's brains just work that differently, but it's weirdly frustrating to me that people constantly bring up how hard Blender is to learn, when that was the exact opposite of my experience.
MDO: Almost every app I use, I didn't read any instructions beforehand -- I just dove in and started doing stuff.
With Blender, I can't even figure out how to move the camera, and everyone suggests weeks of study.