OK, this has been intriguing me for some time. When someone says they are cutting something at a 26.4° angle, or something like that, how do they know that's really the angle they are using?
Yes, you can use a Wixey, but that only gives the angle relative to true vertical, not necessarily 90° to the saw table, since few floors are absolutely level. How do you know that your blade is truly at 90°, and not 90.2° or 90.3°? I use an engineer's square and eyeball the gap along the blade. I know that others, like Rolf, use a printed gauge, and others use the "cut into the wood, then test the kerf against the back of the blade" method. But how precise can anything be that relies on the eye? Parallax can really mess things up, as can the ordinary reading errors we make with any gauge.
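That said, there is a trick that sidesteps the unlevel-floor problem, at least in principle: if your digital gauge has a zero or reference button (most of them, the Wixey included, do), you can zero it on the saw table first and then take the blade reading. Both readings share the same gravity reference, so the table's tilt drops out of the difference. A minimal sketch of the arithmetic, with made-up readings, and assuming both tilts are along the same axis:

```python
# Made-up readings, just to illustrate the cancellation:
reading_table = 0.4    # gauge flat on the saw table; floor is 0.4 deg off level
reading_blade = 90.7   # gauge held against the blade, same gravity reference

# Subtracting the table reading removes the floor's tilt entirely;
# this is exactly what the gauge's zero button does for you.
blade_to_table = reading_blade - reading_table
print(blade_to_table)  # 90.3; the 0.4 deg floor error is gone
```

The catch on a scroll saw is getting the gauge to register against a thin blade at all, which is probably why most of us fall back on squares and test cuts.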
The reason this gets to me is that, practically speaking, it doesn't really matter as long as you come pretty close. Most folks make test cuts for inlay, and other projects, like baskets and bowls, don't require that degree of precision. A fraction of a degree usually doesn't matter. Yet I've seen insistence on a level of precision, especially for bowls, that makes things seem far more complex than they are.
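To put a number on "a fraction of a degree": across the thickness of the stock, the far side of the cut lands off-target by roughly thickness × tan(error). A quick back-of-the-envelope check, with my numbers, just for illustration:

```python
import math

thickness_mm = 19.0  # 3/4" stock
error_deg = 0.3      # blade off by three tenths of a degree

# Offset of the cut's exit point caused by the angle error:
shift_mm = thickness_mm * math.tan(math.radians(error_deg))
print(f"{shift_mm:.3f} mm")  # ~0.099 mm, about four thousandths of an inch
```

That is less than the kerf of a typical scroll saw blade, which is why a test cut or two gets you close enough for almost anything.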
Does anyone know of a way to be sure that your starting point is dead-on accurate? Something that digitally determines the angle between the blade and the table? That would be really neat to have, if it exists at a price a home woodworker could afford.
Thanks for letting me vent. I've been sitting on this for a while.