I can't find an option to show where the laser head actually is. In LaserGRBL this function is enabled by default (a moving cross), but in LightBurn I can't find this option.
Please see:
Thank you, but that's not what I wanted. The solution provided makes the laser head follow the mouse: it moves to wherever I click on the work area. What I want is a marker on the work area showing where the laser head is currently located. If the laser head moves, these movements are not visible on the work area. Best regards!
Thanks! Will this option show me live position tracking of the laser head, or only its last position? I'm looking for an option that always shows where the laser head is. Best regards!
Not sure, try it and see. I don't think it reports live, though.
Because the controller does not provide a stream of data detailing the laser head's position, LightBurn cannot report where it is while it's in motion without sending a specific request.
From GRBL's doc on Status Reporting:
Developers can use this data to provide an on-screen position digital-read-out (DRO) for the user and/or to show the user a 3D position in a virtual workspace. We recommend querying Grbl for a ? real-time status report at no more than 5Hz. 10Hz may be possible, but at some point, there are diminishing returns and you are taxing Grbl's CPU more by asking it to generate and send a lot of position data.
Remember that lasers move much faster than CNC spindles, so interrupting the command flow with position queries will affect the controller's ability to control the smooth motion required for good cutting & engraving.
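For illustration only (this is not how LightBurn works internally), here is a minimal sketch of what host-side polling of GRBL's ? status report looks like. It assumes a GRBL 1.1-style controller on /dev/ttyUSB0 at 115200 baud and the pyserial package; the port, baud rate, and report contents (the $10 setting) may differ on your machine, and the reported MPos is only where GRBL thinks it has commanded the head to be.

```python
# Sketch: poll a GRBL controller for its reported machine position.
# Assumptions: pyserial installed, controller on /dev/ttyUSB0 at 115200 baud.
import re
import time

import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed port; something like "COM3" on Windows
BAUD = 115200
POLL_HZ = 5             # GRBL docs recommend no more than ~5 Hz

def poll_position():
    with serial.Serial(PORT, BAUD, timeout=1) as grbl:
        time.sleep(2)               # let GRBL finish its reset banner
        grbl.reset_input_buffer()
        while True:
            grbl.write(b"?")        # real-time status query, no newline needed
            line = grbl.readline().decode(errors="ignore").strip()
            # Typical GRBL 1.1 report: <Idle|MPos:10.000,5.000,0.000|FS:0,0>
            # (may be WPos instead of MPos depending on the $10 setting)
            m = re.search(r"MPos:([-\d.]+),([-\d.]+),([-\d.]+)", line)
            if m:
                x, y, z = (float(v) for v in m.groups())
                print(f"X={x:.3f}  Y={y:.3f}  Z={z:.3f}")
            time.sleep(1 / POLL_HZ)

if __name__ == "__main__":
    poll_position()
```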
Because different controllers have different internal buffer sizes, with some capable of absorbing the entire G-Code program, LightBurn cannot predict where the laser head will be at any time.
Basically, just watch the laser move to know where it is!
Here's a different post explaining the rationale for the absence of a live view.
Big thanks to thelmuth and to everyone. LightBurn has no live indicator, and you have to live with it.
The reason why LB doesn't have that feature is actually twofold.
One is the previously mentioned safety aspect of that feature (or, more precisely, the lack of it), which I for one agree with completely; outside unsupervised automation, that feature doesn't have many uses.
The other is the huge strain a system like that puts on the computational side of things.
If you really, really want/need one for some reason, just install XY(Z) position indicators onto your rails (direct closed loop operation).
Or, if maximum accuracy isn't required (and it usually isn't with "touchless" machining), stepper motors with an encoder or position feedback windings (indirect closed loop operation).
If your controller doesn't support such a feature, you'll obviously need a controller that does, or (simply because LB won't be able to use that data) route the data to a different system to be post-processed.
Marrying the feed from that "different system" to LB might prove a challenge, but in similar cases in the past, the position data was usually overlaid onto a copy of whatever task the (C)NC XY(Z) movement was performing and used only (or mainly) for observation and monitoring purposes.
Without closed loop feedback, the position request @ednisley talked about will only tell you where the head should be, not where it actually is.
For general monitoring that may be sufficient; for controlling the job, it really isn't.
AFAIK, using direct or indirect position feedback is still somewhat rare unless there's a very good reason to do so, even if the system is equipped with it.
When using closed loop operation, the point of diminishing returns approaches very fast, especially with fast-moving, hard-accelerating and decelerating machines like lasers or multi-axis robot arms.
And it obviously only works in systems that run the program line by line, not the whole program from start to end.
The feedback channels can be regarded as additional axes, and adding one will pretty much double the computational requirements if the same performance is required.
So XY motion with closed loop feedback requires roughly 8 times the computational power that plain XY motion does.
If it is in use, there's also the issue of what to do with the position feedback data when it differs from what the position is supposed to be.
It's somewhat hard to adjust the movement data on the fly without the adjustment showing up as a flaw in the product, and halting the process midway because the position is a step off is obviously among the last things anyone wants to do.
Regards,
Sam