In 3D Sliced Mode with an 8-bit or 16-bit image, does LightBurn’s internal resampling introduce spatial aliasing or quantization errors when the source DPI does not match the Line Interval — for example, a 1200 DPI source against a 0.035mm interval (~726 DPI)?
Specifically, to ensure the slice boundaries are mathematically locked to the scan lines and to maintain the integrity of the 8-bit or 16-bit gradient, is it technically superior to provide a 1:1 pixel-to-interval map (726 DPI) externally rather than letting the software resample ‘on-the-fly’?
TLDR: Is it better to resample the image to the final target DPI in a photo-editing program (like GIMP/Photoshop) to bypass internal scaling math and ensure every pixel aligns 1:1 with a laser scan line?
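To make the mismatch concrete, here is a minimal sketch (plain Python, values taken from the post) of how a line interval maps to a 1:1 DPI and how far the 1200 DPI source sits from it. The function name is mine, just for illustration:

```python
# Convert a line interval to the DPI at which one pixel row maps to
# exactly one laser scan line, then compare against the source DPI.
MM_PER_INCH = 25.4

def interval_to_dpi(line_interval_mm: float) -> float:
    """DPI at which one pixel row corresponds to exactly one scan line."""
    return MM_PER_INCH / line_interval_mm

target_dpi = interval_to_dpi(0.035)        # ~725.7 DPI for a 0.035 mm interval
source_dpi = 1200.0
rows_per_line = source_dpi / target_dpi    # ~1.65 source rows per scan line

print(round(target_dpi, 1), round(rows_per_line, 2))
```

Anything other than a whole-number ratio here means the software has to blend fractional source rows into each scan line, which is exactly where resampling artifacts can creep in.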
If you wanted to be absolutely certain the inbuilt resampling wasn't going to introduce unwanted effects like this edge ringing/ghosting, you could definitely prepare the image yourself at the output resolution.
Thanks, Nicholas. That ‘edge ringing’ is exactly the technical variable I suspected.
I’m curious how these artifacts compound over high layer counts in 3D Sliced Mode. Since ‘ghost’ pixels translate to physical depth, rotating the scan angle likely ‘smears’ this resampling noise and impacts the final surface finish.
It seems like providing a 1:1 pixel-to-interval map (726 DPI for 0.035mm) would be the best way to bypass the ‘on-the-fly’ math and maintain the integrity of the gradient. Appreciate the insight!
How I understand it is that if you are engraving at a 0.1mm line interval in a 20mm space, you would ideally want a 200px high image, since 1px would then correspond perfectly to one line from the laser.
But if you have a 1234px image, you will have far more information than needed, and automatically condensing roughly 6.17 rows of information into 1 line may produce a final result that is not as good as it could be if you resampled the image yourself externally and then gave LightBurn an image at the exact size you are happy with.
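The arithmetic in that example can be sketched in a few lines of Python (the function name is mine, for illustration only):

```python
# A 20 mm tall engraving at a 0.1 mm line interval needs 200 scan lines,
# so a 1234 px tall image carries ~6.17 image rows per scan line that
# the software has to condense into one line of laser output.

def scan_lines(height_mm: float, line_interval_mm: float) -> int:
    """Number of laser scan lines that fit in the given height."""
    return round(height_mm / line_interval_mm)

lines = scan_lines(20.0, 0.1)     # 200 scan lines
rows_per_line = 1234 / lines      # ~6.17 image rows per scan line

print(lines, round(rows_per_line, 2))
```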
This is somewhat similar to a passthrough process, where somebody does the dithering work externally. In this case, since LightBurn still needs to do the threshold work for the layering, passthrough is not an option, but some external preprocessing may make a tangible improvement, so it's worth investigation and testing at least.
Since I’m chasing detail under a microscope for 3D engravings, handling the scaling externally is maybe the better path. LightBurn is a great motion controller, but it’s not really meant to be a high-end image engine, though it's fine for most use cases.
Pre-matching the DPI to the line interval (726 DPI for 0.035mm) lets me use nonlinear interpolation / bilinear interpolation and specific Gaussian blurs to get the data exactly where it needs to be before it’s imported. This keeps the data “locked” across rotated passes so the software isn’t trying to calculate depth from interpolated pixels on the fly. It’s an extra step, but for 16-bit 3D work, it seems like a good way to maintain total control over the finish.
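As a toy illustration of the external resampling step, here is a stdlib-only sketch that condenses a 16-bit gradient row down to one sample per scan line with a simple box average. A real workflow would use GIMP/Photoshop (or a library like Pillow) with bilinear/Lanczos filtering and an optional Gaussian pre-blur; this only shows the principle of averaging the surplus rows yourself, at full 16-bit precision, before the file is imported:

```python
# Box-average a 1234-sample 16-bit ramp down to 200 samples,
# i.e. one value per scan line, keeping 16-bit precision until
# the final rounding step.

def box_downsample(row: list[int], target_len: int) -> list[int]:
    """Average equal-width source windows into target_len samples."""
    src_len = len(row)
    out = []
    for i in range(target_len):
        start = i * src_len // target_len
        stop = (i + 1) * src_len // target_len
        window = row[start:stop]
        out.append(round(sum(window) / len(window)))
    return out

# Synthetic 16-bit gradient, 1234 samples wide (0 .. 65535).
ramp = [round(i * 65535 / 1233) for i in range(1234)]
resampled = box_downsample(ramp, 200)
print(len(resampled), resampled[0], resampled[-1])
```

Doing this averaging externally is what keeps the full tonal range intact: each output sample is computed from all of its source rows before quantization, rather than leaving that decision to whatever filter the engraving software applies on import.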
Thanks for the clarification, Nicholas. Your answer confirms exactly what I needed to know.