I've been trying to determine the appropriate size difference between the inside of tubes and the parts that get pressed into them. I have a generic digital caliper that reads to three decimal places, and a 5 appears as a fourth digit when the reading falls between steps on the third digit; I don't know how the rounding logic that produces that 5 is set up. I realize the caliper I'm using is not terribly accurate. I tried to put the same amount of pressure on each part when I measured it, and I believe the large number of parts I measured helps smooth out variability introduced by the tool and the operator.
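If I had to guess, the caliper just snaps each reading to the nearest half-thousandth, which would be where that trailing 5 comes from. Something roughly like this, though that's purely my guess and not anything from the manufacturer:

```python
# My guess at the caliper's display logic: snap the raw reading to the
# nearest .0005" so the fourth digit is always either 0 or 5.
def display_reading(raw_inches):
    return round(raw_inches * 2000) / 2000  # 1/2000" = .0005" steps

print(display_reading(0.24872))  # -> 0.2485
print(display_reading(0.24880))  # -> 0.249
```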
I measured 5 each of 5 different platings on slims; I believe they came from a variety of sources. After deburring the ends, all the tube IDs measured .243 to .244. All the transmissions measured .248 to .249. One outlier nib measured .246; all the rest measured .2485 to .2505. The caps measured .2485 to .251. On average, the parts being pressed into the tubes were .00555, or 2.28%, larger than the tube.
I took the same type of measurements for 5 each of 5 platings of cigars; there are four parts pressed into the cigar tubes during assembly. On average, the parts to be pressed in were .004, or 1.11%, larger than the tubes.
Should I consider .004 to .0055 to be the acceptable range? Or should I consider 1.1% to 2.28% to be the acceptable range? I would have expected the percentage difference to be smaller on the smaller-diameter tube, but it was the opposite.
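For anyone checking my arithmetic, this is roughly how I got the averages and percentages above. The .2435 tube ID is the midpoint of the range I listed, and the .24905 part diameter is just a representative value that reproduces my averages, not a specific reading:

```python
# Interference of a pressed-in part, absolute and as a percentage of tube ID.
def interference(tube_id, part_od):
    diff = part_od - tube_id
    return diff, 100 * diff / tube_id

# Slim example, using the midpoint of the measured tube IDs and a
# representative part diameter.
diff, pct = interference(0.2435, 0.24905)
print(f"{diff:.5f} in, {pct:.2f}%")  # -> 0.00555 in, 2.28%
```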
The purpose of all of this is that I'm having something made, and I've asked for the part to be pressed in to be .005 larger than the tube, with a tolerance of ±.0005. The person making the item tells me I'm asking for an impossibly tight tolerance. I only asked for ±.0005 because that is the smallest increment my caliper will display. What is the appropriate tolerance to ask for?