Yes, something like that. I do wonder if emulating a LightBurn Bridge device would be a workable route. I suspect it has support for a remote camera, and that would give me a hook to facade the HTTP frame-capture call that my machine supports.
I suspect this would be far more work. Bridge is not documented and is designed to communicate with Ruida controllers, which are not g-code based. I suspect GRBL would be your best bet, as it's well understood, well documented, and well supported. There are precedents for doing g-code over HTTP.
For now, network. To support serial I would have to actually do something (socat, perhaps) with a couple of USB serial ports and a cable, since LightBurn and this app are running on the same machine.
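For reference, the socat trick here would be linking two pseudo-terminals so LightBurn opens one end and this app opens the other, the same effect as `socat pty,raw,echo=0 pty,raw,echo=0`. A minimal Python sketch of that idea (POSIX only; the device names are whatever the OS assigns):

```python
import os
import pty
import select
import threading
import tty

def make_serial_bridge():
    """Create two pseudo-terminal pairs and shuttle bytes between them,
    emulating `socat pty,raw,echo=0 pty,raw,echo=0`. Returns the two
    slave device paths for the applications to open as 'serial ports'."""
    m1, s1 = pty.openpty()
    m2, s2 = pty.openpty()
    for fd in (s1, s2):
        tty.setraw(fd)  # disable echo and newline translation

    def pump():
        # Copy whatever arrives on one master to the other, both directions.
        while True:
            readable, _, _ = select.select([m1, m2], [], [])
            for fd in readable:
                try:
                    data = os.read(fd, 1024)
                except OSError:
                    return
                if not data:
                    return
                os.write(m2 if fd == m1 else m1, data)

    threading.Thread(target=pump, daemon=True).start()
    return os.ttyname(s1), os.ttyname(s2)
```

LightBurn would be pointed at one returned path and the app would open the other; in practice just running socat itself is simpler, this only shows what it does.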
So how does LightBurn handle focusing? For my laser's software, you tell it what thickness of material you are cutting and it adjusts the Z height accordingly. Is this something LB does according to some settings?
If you have Z-axis controls, you can use LightBurn to move to specific Z heights. It can also track material thicknesses and accommodate them, but the initial focusing is done manually or by the controller. Material thickness is defined in the Material Library.
LightBurn also gives you the ability to set a Z offset as well as a Z step per pass of the cut (i.e., if you wanted to step down on each pass).
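For illustration, the step-per-pass idea boils down to repeating the same cut moves with the Z axis lowered each time around. A rough sketch, assuming a GRBL-style g-code dialect and made-up numbers (this is not LightBurn's actual output):

```python
def passes_gcode(cut_moves, passes=3, z_start=0.0, z_step=-1.0):
    """Emit multi-pass g-code, lowering Z by z_step (mm) before each pass,
    which is conceptually what a 'Z step per pass' setting does."""
    lines = []
    for p in range(passes):
        lines.append(f"G0 Z{z_start + p * z_step:.3f}")  # drop to this pass's depth
        lines.extend(cut_moves)                          # repeat the cut path
    return lines
```

So three passes of a single `G1 X10 Y0 F600` move would cut at Z 0, then -1, then -2 mm.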
I was able to run a simple cut last night without the LaserBox software. All I did was push the g-code it generated to the machine, line by line, over the HTTP interface and see what happened. It didn't go as far as exposing what the proprietary g-codes do, but it did help with one of them, which turns on air assist. I also announced my project on the Facebook group for owners of the machine, and there was much enthusiasm. If this turns out well it will certainly sell some copies of LightBurn, so if you are an employee, your efforts answering my questions will have ROI. So, next question…
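(As an aside, the line-by-line push was essentially the loop below. The endpoint URL and payload format are hypothetical placeholders since the machine's real HTTP interface is undocumented:)

```python
import urllib.request

# Hypothetical endpoint; the LaserBox's real HTTP API is undocumented.
GCODE_URL = "http://laserbox.local/cnc/cmd"

def push_gcode(lines, post=None):
    """Push g-code to the machine one line at a time over HTTP,
    skipping blank lines and ';' comments. Returns the lines sent."""
    post = post or (lambda line: urllib.request.urlopen(
        urllib.request.Request(GCODE_URL, data=line.encode(), method="POST")))
    sent = []
    for line in lines:
        line = line.split(";", 1)[0].strip()  # strip comments and whitespace
        if not line:
            continue
        post(line)
        sent.append(line)
    return sent
```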
What does LightBurn do with the camera when one is in use? Use it to place the design relative to the stock so that things line up? Provide a live feed of the cut in progress? What?
I think LightBurn only supports USB cameras, so the one the machine has would not be compatible. However, depending on what kind of frame rate I could get, I could probably make a Raspberry Pi Zero act like a USB cam and make HTTP calls to the machine to capture frames. I have already done something like that, where I made one act like a keyboard.
Just a user but happy to see folks moving the ball forward.
Here’s what the documentation has to say about its uses:
1. Position designs on material
2. Trace simple artwork from the camera image
3. Monitor your laser
Yes, USB cameras. Whether or not your existing one can be used likely depends on how it's exposed to the computer. LightBurn should work with any UVC-compliant camera on Windows.
How is the camera exposed? There may be a way of virtualizing the camera as a UVC device.
The "Monitor your laser" item would suggest a real-time feed. That part could be a challenge. Items 1 and 2 would be doable at a low frame rate.
From what I can tell, it is only exposed through the HTTP call that grabs a single frame, plus perhaps some calls for doing some sort of calibration. So I think I would have to go the Raspberry Pi route and have it emulate a UVC device and act as a proxy on that front. But I think that would be a later-phase thing, as LightBurn can certainly be used without a camera.
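The Pi-side proxy would reduce to repeatedly grabbing a frame over HTTP and handing it to whatever feeds the UVC gadget. A sketch of just the fetch half, with a hypothetical snapshot URL (the machine's real endpoint is undocumented):

```python
import urllib.request

# Hypothetical single-frame snapshot endpoint on the machine.
FRAME_URL = "http://laserbox.local/camera/snapshot"

def grab_frame(fetch=None):
    """Fetch one JPEG frame from the machine's HTTP camera endpoint.
    On the Pi, the returned bytes would be written into the UVC
    gadget's video stream at whatever rate the endpoint allows."""
    fetch = fetch or (lambda: urllib.request.urlopen(FRAME_URL, timeout=5).read())
    data = fetch()
    if not data.startswith(b"\xff\xd8"):  # JPEG start-of-image marker
        raise ValueError("endpoint did not return a JPEG frame")
    return data
```

Looping this at even 1–2 fps would be plenty for positioning and tracing (items 1 and 2 above), if not for live monitoring.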
Yeah, put the laser at a couple percent power so it is visible but doesn't cut, manually jog it to the corner where you think your job should start, and tell LB to frame. That doesn't sound painful.
CO2 lasers do not emit visible light, so that wouldn't work for alignment. Additionally, even if the beam were visible, the tubes require a certain minimum power to lase, which would likely already damage the workpiece.
Frequently these lasers will be paired with a guide diode laser that’s meant only for alignment and produces visible light.