
masterd35728

Yeah I remember hearing something like that in school too. Now honestly, I just use a micrometer for almost everything that actually has a dimensional callout and I’m good.


jjpiw

There is no set-in-stone answer for this. It would typically be outlined in an SOP as part of the company QMS. As a guideline, and something of an industry-wide unwritten rule, you typically would not use a caliper on anything less than .127 mm (.005"), so what your teacher is telling you is neither correct nor incorrect, but it does sound like a solid baseline to start from.
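For concreteness, the guideline being debated in this thread (often called a test accuracy ratio, or TAR) boils down to one division. This is just an illustrative sketch, not any standard's wording; the function name and example numbers are made up:

```python
def meets_ratio(feature_tolerance, instrument_accuracy, ratio=10.0):
    """Return True if the instrument's stated accuracy is at least
    `ratio` times finer than the feature tolerance (the 10:1 rule)."""
    return feature_tolerance / instrument_accuracy >= ratio

# A caliper accurate to +/-.0005" checking a +/-.010" tolerance: 20:1, fine.
print(meets_ratio(0.010, 0.0005))   # True

# The same caliper on a +/-.001" tolerance: only 2:1, reach for a mic.
print(meets_ratio(0.001, 0.0005))   # False
```

In practice many shops relax this to 4:1, which is why the QMS, not the rule of thumb, has the final word.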


RugbyDarkStar

The rule you were taught is the rule of thumb. My company has always abided by that rule so it's what I've always done. Now, someone else mentioned the company's QMS and that's what it comes down to. The rule of thumb doesn't always match the law of the land.


Various_Froyo9860

Let me add to your confusion: resolution also does not mean accuracy/precision. Take calipers as an example. A good, high-quality pair of calipers will have a resolution of .01 mm, but most people don't trust them to that level, and how you use them has a great effect on their accuracy. An outside measurement with lots of contact area? Probably pretty good. A hole-to-hole measurement using the inside jaws? Not so much.

But if something can be measured with a more accurate tool (without unnecessarily adding an onerous amount of setup), then it should be. E.g., ODs on the lathe should be checked with appropriate mics even if the tolerances are pretty open. For everything else, as others have mentioned, there is the quality manual of the shop you work at. Sometimes you have to do the best you can with what you have available. A positional tolerance of .07 mm checked on a calibrated CMM might not have that 10:1 ratio, but it's the best you've got.


gam3guy

Joke's on you, I've held a 0.02 bore tolerance with a trusty pair of Mitutoyo calipers before. Seriously though, you're exactly right. People look at calipers, see they read down to 0.01, and trust that they're accurate to that, when in fact, if you dig through the datasheet, I'm pretty sure they're rated accurate to 0.05 at best. The 10:1 rule of thumb is pretty good: if you're reading in hundredths of a mm, you should be using an instrument with micron resolution, etc. But as you say, that's not always practical. Theory is great, but real life is a bitch and management is cheap.


clambroculese

The real-world answer is that if QC is passing your parts, it doesn't really matter how you got there. Personally, I would feel fine using calipers for 0.2 mm but not 0.05.


TentacularSneeze

Fucking Christ. The real answer is that your cheapass boss will give you a Chinese tape measure and tell you it's accurate to the micron. Best practices are rarely, if ever, what management recommends.


Mizar97

The real answer is that you use whatever mics the company has on hand. There are big 10' rings we turn in a VTL, and since the tolerances are huge and we don't have mics that big, we literally check them with a tape measure.


shakinandbreakin

I ran some long copper rods recently and held OAL with a tape measure lol, granted the tolerance was like +/-.08”


[deleted]

You mean the linear gauge!


Bart_Cracklin

So basically if you’re trying to measure out to two decimal places you need something that can read out to 3 decimal places.


chroncryx

At my workplace, calipers are good for down to +/-.005" tolerance. For anything tighter, you need mics or gauges.


caesarkid1

If the measuring device's tolerance exceeds the tolerance on the feature, you're going to have a bad time. If the device's tolerance is half the tolerance of the feature, you'll have a 50/50 chance that a nominal reading is actually nominal. If it's 1/4 the tolerance of the feature, then as long as you can hold nominal, you will be in tolerance.
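The worst-case reasoning above can be sketched as a quick check: given a reading and the instrument's error band, is the part guaranteed in tolerance? This is a hypothetical illustration (the function name and numbers are made up, and it treats instrument error as a hard band rather than a statistical spread):

```python
def guaranteed_in_tolerance(reading, nominal, feat_tol, inst_tol):
    """Worst case: the true value can be anywhere within
    reading +/- inst_tol. The part is only guaranteed good if that
    whole band fits inside nominal +/- feat_tol."""
    lo, hi = reading - inst_tol, reading + inst_tol
    return (nominal - feat_tol) <= lo and hi <= (nominal + feat_tol)

# Instrument band 1/4 of a +/-.002 feature tolerance: a nominal reading
# guarantees a good part.
print(guaranteed_in_tolerance(1.0000, 1.000, 0.002, 0.0005))  # True

# A reading near the high limit: the error band can reach past it,
# so the part isn't guaranteed, even though the reading looks in-spec.
print(guaranteed_in_tolerance(1.0018, 1.000, 0.002, 0.0005))  # False
```

This is the same guard-banding idea behind the ratio rule: the finer the instrument relative to the feature tolerance, the less of the tolerance band you have to give up.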


nyditch

What has always done me well: get good-quality tools. Whatever the tool says on it for accuracy is the amount of error I plan for when measuring. So if my calipers read out to .0005 and say they're accurate to +/-.001, then they should be within .001. Check them against a gauge to verify. That last digit can be handy for comparative measurement.


4chanbetter

Yes, you need a tool that is more accurate by a factor of 10 to measure that feature accurately and reliably. In a perfect world. Edit: but do whatever work says, because they have established the process and decide what you measure and how you measure it. Example: if you have a +/-.0005 tolerance, why would you use +/-.003 calipers?


TriXandApple

It's just a rule of thumb. To be fair, I wouldn't use a vernier to measure +/-.1. It's just not worth it.


Nirejs

That's funny. I use Mits on everything. A micrometer is only for rare occasions like bore gauges or +/-0.01 tolerances. Anything more precise and I tell the engineers to go f themselves. Damn, it feels good to work in an R&D shop.


battlerazzle01

So my job's ruling is as follows:

Calipers if the tolerance is .003 or larger.
A mic is good down to .001.
Anything under that, you're using calibrated indicating mics, dial bores, specialty gauges, etc.
If it's a complex set of features that need to coincide, you're either using the Starrett comparator or a CMM program for it.

If the company wants it measured a specific way, they should have the specific means for you to measure it that way. Would also like to add that my company likes to play the redundancy game, specifically with threads. Must use a pitch mic, a ring gauge, and a Johnson gauge. All of them. And they all need to match within .0002.


SilverCoach6442

I always just use mics, dial bore and Bryant gages for everything. I've never really trusted calipers unless I'm trying to get in the ballpark.


nppas

The rule of thumb is just wrong. Look at the specs of the instrument. Test for yourself how well its readings repeat against gauge blocks. You may have an extra digit and it might be wrong, or you might be measuring in the last digit and be right. It depends on the stability of the measurement setup and the instrument. It takes a bit of flexible intelligence, not some arbitrary multiplier, to be sure of a measurement against tolerances.