2D Contour Compensation Selection


mackenzieruiter

As far as I know, Compensation Type should be reset every time I make a new 2D Contour path. Is that a correct assumption?

If so, why does it stay set to my last chosen setting (i.e. 'Off'), rather than resetting back to 'Computer'? I didn't think the modal settings in Machine Properties included this.

If not, is there a way to make it reset back to 'Computer' every time a new Contour path is created?

Just dicked myself for not watching the compensation closely enough in backplot.

2 hours ago, mackenzieruiter said:

As far as I know, Compensation Type should be reset every time I make a new 2D Contour path. Is that a correct assumption?

If so, why does it stay set to my last chosen setting (i.e. 'Off'), rather than resetting back to 'Computer'? I didn't think the modal settings in Machine Properties included this.

If not, is there a way to make it reset back to 'Computer' every time a new Contour path is created?

Just dicked myself for not watching the compensation closely enough in backplot.

This is a simple option in the System Configuration File.

File > Configuration > Toolpaths Page > "Get defaults from previous operation".

Uncheck that checkbox, and when you create a new Contour Operation, the Operation Defaults will be used, for whatever type of Operation you are creating.

You can always re-enable the checkbox, if you ever want the behavior to go back to 'from the previous Op'.

Also, while you are in there, do these two things:

  • Disable the 'automatically calculate the HST Defaults' option. This way, when you switch tools in an HST Operation, it won't upset all your settings. You can always use the 'Reload parameters from defaults file' button, at the top of the Operation Dialog Box, to reset to the Op defaults.
  • Set the Memory Buffering to '80%', from the default of '50%'. You'll likely never hit this threshold, but you might as well configure your system as well as it can be set up.

Now, there is another checkbox whose behavior you should be aware of. Then you can decide whether you'd like to work with it enabled or not.

  • There is a checkbox for 'Lock Feedrates'. This box tells Mastercam: 'once you have finished generating an operation, do not change the SFM/RPM/Feed values, even if the Tool changes'.
  • The reason for this box is to allow you to have 'Rough Ops' and 'Finish Ops' using the same tool. In these cases, you'll typically want to set up your Tool Definition with the 'roughing' speeds and feeds, and manually change the Finish values for those particular Ops yourself.
  • When 'Lock Feedrates' is enabled and you start a new Operation, that Op will be 'open': as you select a different tool in the Tool List, the Speed/Feed values will update. Once you have pressed the 'Apply' button, or generated the Operation, those Speed and Feed values are 'Locked' to that Operation.
  • Even with a Locked Operation, you can still 'reset' to a Tool's Speed/Feed values by Right-Clicking on that Tool (in the Op Tool List) and selecting 'Re-Initialize feeds & speeds'. This will reset all the Feed, Plunge, Retract, and RPM values to whatever is set in the Tool Definition.

I personally always leave 'Lock Feedrates' enabled, and just choose to 'Re-Initialize feeds & speeds' when I want to use a Tool's speed and feed. That way I don't get an Operation's values changing when I least expect it. I work from the premise of 'I want my Operations to never change, unless I am consciously attempting to change the Operation.' But that's just a personal preference. If you work in a shop where you 100% always 'Use Tool's feeds & speeds', and always have a Roughing and/or Finishing tool with Feeds/Speeds that are dialed in (and stored in the Tool Definition File or Tool Library), then you wouldn't want to use that option. I'm a big fan of trying to figure out how the whole system 'works'; that way I can pick and choose how I want to use the tools that are available to me.


Also,

I don't mean to criticize at all, as I've missed plenty of mistakes myself.

But you should be using the Verify > Compare function, to be sure your cutter is only removing material where you expect to remove it.

  • Whatever 'geometry' you have visible on the screen will automatically be brought into Verify as the 'Workpiece'. This is the part that will be 'compared' against your stock.
  • In the 'Simulator Options' dialog box, you get to select your 'Stock' for Verify and your 'Fixture' for Verify. I typically setup a Stock Model to act as my 'initial stock', and also can use a Stock Solid. (Note: you have to select it on the screen when using the Solid option.)
  • When you load Verify, there are Checkboxes for 'Stock', 'Workpiece', and 'Fixture'.
  • If you want to detect 'collisions' with the Fixture and/or Stock, you have to enable the 'collisions' option, under the 'Stop Op' options (pop-out menu). Also, set the Collision Checking options in the configuration menu for Verify.
  • After you have 'played through' all of your cutting Operations, you can go to the 'Verify' tab, and use the 'Compare' Function.
  • The Compare Function Panel will display (usually on the right side of the screen). Here you can enter a 'Stock amount', which is your "Stock to Leave". This will make an 'offset' from your Workpiece, which is now the 'zero target stock amount'.
  • This can easily help you catch mistakes that you miss in Backplot.
On 12/27/2020 at 8:44 PM, Colin Gilchrist said:

But you should be using the Verify > Compare function, to be sure your cutter is only removing material where you expect to remove it.

I use stock models for this purpose, and it helps me keep track of what state the part is in as I program. This works well for my palm-of-hand sized parts; it may not be practical for very large and complex parts.

41 minutes ago, Matthew Hajicek - Conventus said:

I use stock models for this purpose, and it helps me keep track of what state the part is in as I program. This works well for my palm-of-hand sized parts; it may not be practical for very large and complex parts.

This is also an excellent practice. I do find it to work better on "smaller scaled parts". However, it also depends heavily on the "Tolerance" in addition to the Scale of the part.

For example: lately I've been working on a bunch of "Micro-Mold" type parts. The tolerances I'm using are around 0.0001-0.00001 mm. At those scales, many of the functions in Mastercam are "input limited", to where I can't enter 6 decimal places in Metric. (Which is what I really need, for the scale I'm working at on a Yasda YMC650.) To make this clearer: I model my parts at 0.000001 mm tolerances, because the accuracy I need is so tight. I find that you always want to be sure your "input geometry" is as accurate as possible. I like to go for 2 orders of magnitude "more accurate" for my geometry than the resolution of the machine's least input amount. Why is that so? Why do I want my models 100 times more accurate than my NC least step increment? Because the Toolpath Algorithm has an easier time calculating the "tool offsets" if your input geometry is highly accurate in the first place. I want to be sure all the lines are perfect, that all the arc sweeps are accurate, and that endpoints land on a "machine grid point".
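That "two orders of magnitude" rule of thumb is simple arithmetic; a minimal sketch (the function name and the example increment are mine, not from the post):

```python
# Rule of thumb from the discussion above: model CAD geometry ~100x tighter
# than the machine's least programmable input increment, so the toolpath
# algorithm computes tool offsets from near-exact geometry.

def model_tolerance(least_increment_mm: float, safety_factor: int = 100) -> float:
    """Return a CAD modeling tolerance `safety_factor` times tighter
    than the machine's smallest programmable step."""
    return least_increment_mm / safety_factor

# A hypothetical control with a 0.0001 mm least increment
# calls for modeling at roughly 0.000001 mm.
print(model_tolerance(0.0001))
```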

One huge thing to be aware of: when you enter a 'Tolerance' in the Stock Model dialog box, the Stock Model Algorithm will actually "offset" the model that is calculated, by this amount.

  • What I really want with a "meshing tolerance", is "accurate stock edges", but also the ability to specify the "size of the mesh" by giving it a tolerance.
  • I find that for "the most accuracy", you are better off running large roughing Ops through Verify, and saving the STL Model out, with several different Mesh Tolerances. Then, I check the Saved Folder, to see the actual physical size "on disk", to get an idea of "how big is the model".
  • I will then perform a "File > Merge", and change the File Handler to ".STL Files" (Stereolithography). I check the 'Options' to make sure it is set to 'Mesh' only, and not 'Stitch'.
  • Select your STL model, and when the Function Panel appears on the left side of the screen, set the Radio Button option to "Active Level".
  • Finish the Import with the Green Check Mark.
  • Now, click on the P-Mesh entity (you just Merged), and click F4. (Analyze Entity Properties.) Note the # of triangles.
  • You can then perform that same set of steps, to Merge in several different P-Mesh Models, that have been 'exported' from Verify, with different Tolerance Values.
  • Note: when you export the STL from Verify (as long as the "Precision Slider" is advanced all the way to Precision), the Mesh will be calculated on the actual 'cut edges' of the Stock that is rendered in Verify. The 'Export Tolerance' in this case dictates the actual size of the triangles which are generated.
  • What you are looking for is 'acceptable quality' (renders without visible edges, and the surface smoothness checks out), while also being the most compact "size on disk" possible.
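The "size on disk" check works because a binary STL file has a fixed layout: an 80-byte header, a 4-byte triangle count, then exactly 50 bytes per facet. A small sketch (function names are mine) for converting between file size and facet count:

```python
# Binary STL layout: 80-byte header + 4-byte uint32 count + 50 bytes/facet
# (12 little-endian floats for normal + 3 vertices, plus a 2-byte attribute).
# File size and triangle count are therefore interchangeable density measures.

def stl_size_bytes(n_triangles: int) -> int:
    """Predicted binary STL file size for a given facet count."""
    return 80 + 4 + 50 * n_triangles

def triangles_from_size(size_bytes: int) -> int:
    """Invert the formula: estimate facet count from a file's size on disk."""
    return (size_bytes - 84) // 50

print(stl_size_bytes(100_000))         # 5_000_084 bytes, roughly 4.8 MB
print(triangles_from_size(5_000_084))  # 100000
```

This only holds for binary STL; ASCII STL files are far larger per facet, so compare like with like when judging mesh density by file size.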

When you use a 'Stock Model' in the Ops tree, especially one that is tied to one or more Operations, that Stock Model will require regeneration every time you make any change to the parent operations.

For me, the cutoff is typically a few minutes of generation time. If the Stock Model takes longer than that to generate, then I really start to question "do I need a Stock Model here, which is tied to a set of Operations?" Because I know that every time I have to generate the path, I'm going to incur that computational overhead.

So, if I'm doing "less complex work", where I can run 30 Ops through a 'Stock Model', and it generates in 10-30 seconds, then I keep using Stock Models, and everything works great.

Once those generation times get above 2-3 minutes, that is when I start considering the "Export STL > Merge P-Mesh on Level > Tie Stock Model to P-Mesh" scenario.

I find it takes about 2-3 minutes to do that process, so in many cases, it can save considerable time to not create a "Stock Model Op > Tied to a Toolpath Operation".

This is my life lately:

[attached image]

3 minutes ago, Colin Gilchrist said:

For example; lately I've been working on a bunch of "Micro-Mold" type parts. The tolerances I'm using is around 0.0001-0.00001 mm. At those scales, many of the functions in Mastercam are "input limited", to where I can't enter 6 decimal places in Metric. (Which is what I really need, for the scale I'm working at on a Yasda YMC650.)

I wonder if it would be practical to work at 1000X scale, and have your post shift the decimal place?

6 minutes ago, Matthew Hajicek - Conventus said:

I wonder if it would be practical to work at 1000X scale, and have your post shift the decimal place?

I've heard of this before, but I'm using CAMplete, so that would require some real investigating.

In Mastercam, the MP Post File (.PST) works in conjunction with MP.DLL to process the code into NC Formatted ASCII output. So the scaling would take place once the NCI variables are already read into MP.

I'm not exactly sure where I would make the change to get CAMplete to scale the NCI coordinates. Or conversely, do what Mastercam does, and scale the 'output variables' by the scale factor.

I can tell you that 4:30 pm on the last day before I start my vacation is not the time to be investigating a new feature. :)

Lol, I think this discussion kind of went off the rails from where the OP started...


I changed my Yasda post to output 6 places, and it accepts values of .00001 inches in the misc drill parameters. With surfacing, it was posting 6-place values after I set it to linearize with a super-tight step setting. I ran it off the card because it would have taken hours getting it onto the data server.

I made no other changes than that.


Nanometer, or even picometer, tolerances are not possible with anything normally used in any CAM software using conventional methods. Colin, we have talked about this many times over the years, and yes, scaling everything up by a factor of 1000 or 100000, then converting the code to the correct output through the post, is your most accurate way to get what you are after. Take a .2 x .2 block with 4000 surfaces and then try to hold .000003 tolerance: not practical using standard processes. We have to adjust the process to support the technology. By using a scale of 1000, we now make the block 200 x 200 in size and work to a correspondingly scaled tolerance in the CAM software; the post then moves the decimal back for the output. Then when we output the code, we get exactly the tolerance and code we need. The CAV will also support this, and not be overloaded, since the STL triangles cannot go that small.
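The scale-up-and-shift idea above can be sketched in a few lines. This is an illustrative Python mock-up, not real MP or CAMplete post logic; the regex, the word list, and the six-decimal formatting are all my assumptions:

```python
# Sketch of "program at 1000x scale, shift the decimal in the post":
# coordinate words in each NC block are divided back down by the scale
# factor and re-emitted with six decimal places.
import re

SCALE = 1000.0          # geometry was modeled 1000x oversize
AXIS_WORDS = "XYZIJK"   # words assumed to carry scaled coordinates

def descale_block(line: str, scale: float = SCALE) -> str:
    """Divide each coordinate word in an NC block by the scale factor."""
    def repl(m: re.Match) -> str:
        word, value = m.group(1), float(m.group(2))
        return f"{word}{value / scale:.6f}"
    return re.sub(rf"([{AXIS_WORDS}])(-?\d+\.?\d*)", repl, line)

# Feedrates (F) and arc radii (R) would need the same treatment in practice.
print(descale_block("G1 X200.003 Y-45.25 Z12.0 F500."))
# G1 X0.200003 Y-0.045250 Z0.012000 F500.
```

A real implementation would belong in the post processor itself (or wherever the NCI coordinates are read in), since text-mangling finished G-code is fragile; this only shows the arithmetic involved.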

When you come back you will see the light, and then you will make the impossible happen to hit what is needed on your project.

18 hours ago, Leon82 said:

They gave up on that I think. They ran the wire but I guess decided not to use it

Ignorance is bliss.

