I spent a couple of years as an ISO compliance officer at a switchgear manufacturer. I spent long hours checking measuring tapes against a standard test gauge, and we had to throw out approximately 80% of the tapes we received in the shop, so I know all about the disparity in measurements from so-called measuring devices. Funny thing is that the wood shop was used exclusively for crating product, but since they were part of the company, they had to be held to the same strict standards as the fab shop, and our crates were the most dimensionally accurate in the business…lol
Wow, I never imagined the number would be that high. I found that about 30–40% of measuring devices were off, and 100% of those were the low-budget ones.
Yep, an old thread, but since someone bumped this and since the topic is close to my heart…
Agreed 100%.
This applies to other tasks that require repeatable and precise measuring as well, of course.
Well, that is usually very easy to answer.
The most common reason is that there was no need.
The second usually is (or at least was) that accurate measuring devices tend to be rather expensive.
There’s also the ever-present point of diminishing returns lurking just around the corner: measuring or cutting too accurately can lead to other kinds of headaches, not to mention expense.
Re the rejection percentage:
That always depends on the required accuracy, which is determined by the need, or in this case, by the standard used.
There’s a very good reason why any quality system should be used through the whole manufacturing process, not just on parts of it.
On the same batch of tape measures, for example, with looser parameters all of them would pass, and with stricter parameters all of them would fail.
The reality always falls between those two.
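To make that concrete, here’s a toy Python sketch (the deviation numbers are made up, not from any real batch) showing how the very same tapes can all pass under a loose tolerance and all fail under a strict one:

```python
# Hypothetical deviations from a reference gauge, in inches.
deviations = [0.020, 0.035, 0.055, 0.070, 0.090, 0.110]

# Loose, middling, and strict acceptance tolerances (inches).
for tolerance in (1/8, 1/16, 1/64):
    passed = sum(abs(d) <= tolerance for d in deviations)
    print(f"tolerance {tolerance:.4f} in: {passed}/{len(deviations)} pass")
```

With the loose 1/8″ limit every tape passes, with the strict 1/64″ limit every tape fails, and the 1/16″ limit lands somewhere in between, which is the whole point: the rejection rate says as much about the standard as it does about the tapes.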
A percentage that high would indicate that the person responsible in the purchasing department has been lied to or misled by the supplier, or has made some sort of an error.
Neither will matter much, since the accuracy was checked, and usually the rejects can be returned or used for less precise measurements.
At a previous job, we used a lot of tape measures, and we had both “calibrated” and “for reference” types. The reference tapes didn’t need to pass any scrutiny and were never to be used in a production capacity. The calibrated tapes were from a controlled source and 100% checked (and periodically rechecked) per our metrology lab standard. I can’t remember exactly what the standard was, but I know most tape measures were not accurate enough at the longer calibration distances, maybe 10 feet or something similar to that. I think they needed to be within 1/16" at any point under 10 feet. Some failed within 12", others failed further out.
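For what it’s worth, the check itself is simple enough to sketch in a few lines of Python (the 1/16" tolerance matches my memory above, but the gauge points and readings here are just assumptions for illustration):

```python
TOLERANCE_IN = 1 / 16            # allowed deviation, inches (as I recall)
CHECK_POINTS_FT = range(1, 11)   # hypothetical gauge points, 1 ft to 10 ft

def tape_passes(readings):
    """readings maps gauge point (ft) -> tape reading (inches)."""
    for ft in CHECK_POINTS_FT:
        true_inches = ft * 12
        if abs(readings[ft] - true_inches) > TOLERANCE_IN:
            return False, ft     # tape fails at this distance
    return True, None

# A tape whose error grows with distance: fine close in, fails at 8 ft.
readings = {ft: ft * 12 + 0.008 * ft for ft in CHECK_POINTS_FT}
ok, failed_at = tape_passes(readings)
print("pass" if ok else f"fail at {failed_at} ft")
```

That growing-error pattern is also why some tapes failed within 12" and others only further out.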
I know my favorite 12’ Stanley “inch decimal” tape could not pass calibration, so they gave me a (I think) Mitutoyo digital, but it was so enormous that I never used it. The Stanley lived a quiet and unassuming life on my belt, next to my Leatherman, never bringing attention to himself, but always getting things done.
I only built prototypes/samples/exhibition pieces/specials, so I got away with some things regular production personnel would not.