I’ve heard several of you say this but I’m afraid I don’t understand. I do understand that things are, in theory, “deterministic and predictable” with discrete numbers of teeth on belts and pulleys and motor steps/revolution but – in practice – I routinely see that slight adjustments are necessary to get truly “calibrated” axis movement.
I love your Test Pattern generator, Jamie. And I’ve always been fond of printing/lasering rulers with the machines I build… those are just as important to me as the “MPCNC crown” for giving me confidence that my machine is operating properly.
Only a couple of days ago, I was messing with the laser on my FoamRipper machine… and using the X and X/Y rulers from your generator to calibrate the X and Y axes. Note all the rulers, printed on this piece of cereal box cardboard. Obviously, 160 is the “determined” and “predictable” number for steps/mm… but 160.00, 160.80, 160.50, and 160.60 all give “measurable” differences when compared with an accurate metal ruler.
Over the 100mm I used here, I watched the rulers adjust to “dead on” with 160.60. If I really want/need to refine that value further, I do this over a larger range, say 900mm… or as much as the axis will physically allow.
Anyway, I use the following calculation to zero in on the steps/mm value for my particular machine’s axes:

new steps/mm = (commanded distance / actual measured distance) * current steps/mm setting
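In code form it’s just a ratio. A minimal sketch in Python (the 99.63 mm measurement below is a made-up example, not one of my actual readings):

```python
def calibrated_steps_per_mm(commanded_mm, measured_mm, current_steps_per_mm):
    """Scale the current steps/mm by the ratio of commanded to measured travel.

    If the axis under-travels (measured < commanded), the result goes up;
    if it over-travels, the result goes down.
    """
    return (commanded_mm / measured_mm) * current_steps_per_mm


# Hypothetical example: commanded 100 mm, metal ruler reads 99.63 mm,
# firmware currently set to 160.00 steps/mm.
new_value = calibrated_steps_per_mm(100.0, 99.63, 160.00)
print(f"{new_value:.2f} steps/mm")
```

The longer the measured distance, the smaller the relative error in reading the ruler, which is why I repeat this over as much travel as the axis allows.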
I’m all ears if someone can tell me why this isn’t the right way to calibrate an axis. I’ve seen it on virtually every machine I’ve built over the last several years… and there have been a number of them. Why do they all seem to be more accurate with, say, 160.57 than with 160? To me, this is the difference between “close” and “calibrated”.