why quadro?


medaq

Building my next computer and looking at graphics cards. At work we have an RTX 4000 and at home I have an RTX 3080 Ti. I cannot tell the difference between the two, and sometimes I feel the 3080 Ti might be doing better. Is there something I'm missing? Why does everyone recommend Quadro cards and such for Mastercam?


The RTX 3080 Ti is twice the card the RTX 4000 is. I would expect you'd see better performance from the 3080 Ti.

I'm still of the mindset that Quadro doesn't matter in Mastercam, especially when compared to the RTX 30xx and 40xx cards...until there is some sort of comparison to show otherwise.

 

21 hours ago, medaq said:

I get stuck waiting for this a lot on our files. 

And this is where the GeForce cards will eventually come up short...the OpenGL instruction set for the Quadro cards is tuned specifically for CAD/CAM/CAE and heavy-duty graphics rendering.

I won't argue that a GeForce can't get many users by; it can.

Once you start working with complex models, complete simulation build outs, full setup builds, complex 3D & 4th & 5th axis cuts, this is where a Quadro will eventually fail.

I did Mastercam support for several years, and we always recommended Quadro cards, the level of card depending on the work the user was doing. Why? Because from the support side, you want to help the customer eliminate any chance of issues that you can. Customers can and do have plenty of other problems and questions; not having to deal with graphics issues is good for all involved.
 


 

In CAD (computer-aided design), tessellation is used to subdivide mesh surfaces into smaller primitive shapes. In real-time computer graphics, the surfaces of 3D objects are defined as tessellated strips of triangles, which are processed efficiently by the GPU.

Hardware tessellation is supported by graphics APIs, including OpenGL, DirectX, Vulkan, and Metal.

 

https://www.computerhope.com/jargon/t/tessellation.htm
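As a rough illustration of what that quote describes (this is not Mastercam's actual pipeline, and the function name and parameters below are made up for the sketch), here is how a curved surface gets reduced to the flat triangles a GPU actually rasterizes. More segments means smoother-looking circles and cylinders, but more triangles for the card to push:

```python
# Minimal sketch of tessellation: subdivide a parametric surface
# (a cylinder wall, like the demo file's cylinders) into the
# triangle list a GPU would draw.
import math

def tessellate_cylinder(radius, height, segments):
    """Return (vertices, triangles) approximating a cylinder wall.

    Each of the `segments` rectangular strips around the circumference
    is split into two triangles, so the triangle count is 2 * segments.
    """
    vertices = []
    for i in range(segments + 1):
        theta = 2.0 * math.pi * i / segments
        x, y = radius * math.cos(theta), radius * math.sin(theta)
        vertices.append((x, y, 0.0))      # bottom ring vertex
        vertices.append((x, y, height))   # top ring vertex

    triangles = []
    for i in range(segments):
        b0, t0 = 2 * i, 2 * i + 1         # this strip's left corners
        b1, t1 = 2 * i + 2, 2 * i + 3     # this strip's right corners
        triangles.append((b0, b1, t0))    # lower triangle of the strip
        triangles.append((t0, b1, t1))    # upper triangle of the strip
    return vertices, triangles

verts, tris = tessellate_cylinder(radius=1.0, height=2.0, segments=64)
print(len(tris))  # 128 triangles for a 64-segment cylinder
```

In a real viewport this subdivision happens per surface, per solid body, which is why triangle counts (and GPU load) climb quickly as files fill up with complex models.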

 

16 hours ago, mwearne said:

The RTX A4000 and RTX 30xx/40xx cards all show use of OpenGL 4.6. What is Quadro doing differently?

It lies in "how" they utilize the OpenGL instruction set. It's also the hardware: there are more processing pipelines available on the Quadro than there are on the GeForce line.

Just because they both have it doesn't mean they both can do the same things....

 

16 hours ago, gcode said:

Was this a typo?

Yes.

That should be "the GeForce will eventually fail."

 

https://www.gpumag.com/nvidia-quadro-vs-geforce/

 

Processing power is another area where Quadro clearly wins.

 

6 hours ago, JParis said:

It lies in "how" they utilize the OpenGL instruction set. It's also the hardware: there are more processing pipelines available on the Quadro than there are on the GeForce line.

Just because they both have it doesn't mean they both can do the same things....

 

Yes.

That should be "the GeForce will eventually fail."

 

https://www.gpumag.com/nvidia-quadro-vs-geforce/

 

Bit of an unfair comparison in that article: a $10,000 Quadro vs. a $1,000 GeForce. It is a good point, though; there are no $10k GeForce cards, so if that is the area you're working in, Quadro is the only answer.

If you compare a similarly priced, or even cheaper, GeForce, the specs are tilted towards GeForce. These are just numbers and don't reflect the real world, where drivers and software come into play, but I'm of the opinion that GeForce shouldn't be written off. I am more than willing to be proven wrong, though; I just have a hard time with "trust me, this is better." Weren't Xeons seen as the best CPU option at one point?

On 1/15/2023 at 10:48 AM, JParis said:

Once you start working with complex models, complete simulation build outs, full setup builds, complex 3D & 4th & 5th axis cuts, this is where a GeForce will eventually fail.

I do these sorts of things frequently with a GeForce and have never had anything fail. Any example of what would push the card to the limit? I don't notice the GPU getting used during toolpath calculation, only when graphical items on screen are being displayed/hidden or manipulated.

 

25 minutes ago, mwearne said:

I do these sorts of things frequently with a GeForce and have never had anything fail.

Good luck to you...I don't recall ever saying ANYTHING about toolpath creation, ever.

 

25 minutes ago, mwearne said:

only when graphical items on screen are being displayed/hidden or manipulated.

And there you have it...keep stacking stuff on. You're working to what you see as complex...I can only guess that your level isn't the same as others'.

3 hours ago, mwearne said:

If you compare a similarly priced, or even cheaper, GeForce, the specs are tilted towards GeForce.

 

If you really want to be pissed off, compare the hardware specs of the cards, and then the price for a similarly hardware-specced Quadro vs. GeForce.

On the upside, the outrageous prices of the Quadro cards help pay for R&D, so...

4 hours ago, JParis said:

Good luck to you...I don't recall ever saying ANYTHING about toolpath creation, ever.

 

My bad, I interpreted "complex 3D & 4th & 5th axis cuts" as referring to toolpath creation. Still, the point is that the card is not doing anything but graphics in Mastercam. Do you see the same with a Quadro?

So, technically, you can calculate the most complex of toolpaths with no video card. Not that you would, but if we are talking about performance of anything other than model manipulation, the card doesn't matter?

4 hours ago, JParis said:

And there you have it...keep stacking stuff on. You're working to what you see as complex...I can only guess that your level isn't the same as others'.

Yeah, my day-to-day stuff is not on the highly complex end of the spectrum, but when I go down these benchmarking rabbit holes, what I'm doing is probably more taxing than what most would ever put their machines through. With the files that bring my current setup to its knees, I'm curious whether the Quadro would make more of a difference, or whether more RAM would be a better investment (this is what I'm testing now).

 

 

 

17 minutes ago, neurosis said:

 

If you really want to be pissed off, compare the hardware specs of the cards, and then the price for a similarly hardware-specced Quadro vs. GeForce.

On the upside, the outrageous prices of the Quadro cards help pay for R&D, so...

I have compared, and ya, it makes ya wonder.

 

19 minutes ago, mwearne said:

but if we are talking about performance of anything other than model manipulation, the card doesn't matter?

The ONLY performance I am referring to is graphics...and GeForce will fall down eventually.

In my mind, any trainer or reseller who does not know the level of work a customer is doing, chasing, and/or apt to get is doing them a disservice by recommending a GeForce card. I've been on the support side.

If you know someone is doing mostly 2D, some 3D, heck, even some light 5 axis, a GeForce card can and does work.

When you start programming with hundreds and hundreds of operations, and hundreds of solid bodies...you will wish you had invested in a proper CAD card. JM2C, YMMV.

 

59 minutes ago, JoshC said:

If you do a lot of simulations of any kind, that is where I think you will notice the biggest impact on computer performance. So if you're not doing a ton of Machine Simulations, or don't plan on it, I think a more budget-friendly PC is fine.

Machine Sim has little effect on my GPU load. Full 5X cuts with 2D and 3D thrown in, using the Haas UMC machine, and the GPU doesn't get above 25%. I will admit I haven't thrown the kitchen sink at this, though, but my first thought is that anything else is just loading extra toolpath data and would have little effect on graphical performance. If the sim models were very detailed or there were a lot of bodies attached to an axis/fixture, that could increase the load.

What I find most taxing to my GeForce GPU is simply skewing in the graphics area. Here's a file and a short demo. Circles and Cylinders.zip

If GeForce and Quadro are the same graphically with this, maybe a different type of test is needed. I could put a couple hundred 5x toolpaths in a file and do the same? Open to other ideas.
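For anyone who wants numbers instead of impressions while running a file like this on an NVIDIA card (Quadro or GeForce alike), one way is to poll nvidia-smi for utilization. A minimal sketch: the helper name is mine, but the nvidia-smi query flags are standard; the script falls back to a canned sample line on machines without an NVIDIA driver:

```python
# Poll nvidia-smi for GPU utilization and memory use while running a
# benchmark file; loop this once a second to build a simple log.
import shutil
import subprocess

CMD = ["nvidia-smi", "--query-gpu=utilization.gpu,memory.used",
       "--format=csv,noheader,nounits"]

def parse_gpu_line(line):
    """Turn one nvidia-smi CSV line like '25, 2048' into (util_pct, mem_mib)."""
    util, mem = (int(field.strip()) for field in line.split(","))
    return util, mem

if shutil.which("nvidia-smi"):
    # Real reading: first line corresponds to GPU 0.
    first = subprocess.check_output(CMD, text=True).splitlines()[0]
    print(parse_gpu_line(first))
else:
    # No NVIDIA driver here; demonstrate the parsing on a sample line.
    print(parse_gpu_line("25, 2048"))  # -> (25, 2048)
```

Logging both cards through the same zoom/rotate session on the same file would give a more concrete Quadro-vs-GeForce comparison than specs alone.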

 

 

 

 

10 hours ago, mwearne said:

Machine Sim has little effect on my GPU load. Full 5X cuts with 2D and 3D thrown in, using the Haas UMC machine, and the GPU doesn't get above 25%. I will admit I haven't thrown the kitchen sink at this, though, but my first thought is that anything else is just loading extra toolpath data and would have little effect on graphical performance. If the sim models were very detailed or there were a lot of bodies attached to an axis/fixture, that could increase the load.

What I find most taxing to my GeForce GPU is simply skewing in the graphics area. Here's a file and a short demo. Circles and Cylinders.zip

If GeForce and Quadro are the same graphically with this, maybe a different type of test is needed. I could put a couple hundred 5x toolpaths in a file and do the same? Open to other ideas.

 

 

 

 

Thanks Mike, and good to know. I honestly always thought it was my graphics card getting bogged down with machine sims, but it must be something else, so I might be wrong on that. I definitely see the biggest performance issues with machine simulations on my PCs, though; not many Mastercam-related tasks bog my systems down except that.

1 hour ago, JoshC said:

Thanks Mike, and good to know. I honestly always thought it was my graphics card getting bogged down with machine sims, but it must be something else, so I might be wrong on that. I definitely see the biggest performance issues with machine simulations on my PCs, though; not many Mastercam-related tasks bog my systems down except that.

What CPU do you have? On my system, that's usually the bottleneck. I would buy the fastest i7 or i9 before I would spend a lot of money on a GPU for Mastercam.

1 hour ago, Simon Kausch said:

What CPU do you have? On my System, that's usually the bottleneck. I would buy the fastest i7 or i9 before I would spend a lot of money on a GPU for Mastercam.

My system has an Intel(R) Core(TM) i9-9880H CPU @ 2.30GHz, so it's not a bad CPU, but the regular Mill and Mill-Turn simulations I run can sometimes get pretty involved.

15 hours ago, JoshC said:

My system has an Intel(R) Core(TM) i9-9880H CPU @ 2.30GHz, so it's not a bad CPU, but the regular Mill and Mill-Turn simulations I run can sometimes get pretty involved.

It's not bad, but compared to the newest desktop CPUs there is a noticeable performance difference.

It's not a real-world test, but check this comparison:

https://cpu.userbenchmark.com/Compare/Intel-Core-i9-9880H-vs-Intel-Core-i9-13900K/m750169vs4129

