Can LightBurn be used as a GCODE generator?

It is something along the lines of a POST to http://ip/cnc/cmd?cmd=
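For what it's worth, that call could be sketched like this. The host, path, and `cmd` parameter name are guesses taken from the URL pattern above, not a documented API:

```python
# Hedged sketch: push one g-code command over the machine's HTTP
# interface. The endpoint shape (http://ip/cnc/cmd?cmd=...) is taken
# from the URL pattern above and may not match the real firmware.
import urllib.parse
import urllib.request

def send_gcode(host: str, command: str, timeout: float = 5.0) -> str:
    """POST a single g-code line and return the machine's reply body."""
    url = f"http://{host}/cnc/cmd?cmd={urllib.parse.quote(command)}"
    req = urllib.request.Request(url, data=b"", method="POST")
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Usage would be something like `send_gcode("192.168.1.50", "G0 X10 Y10")` against whatever address the machine answers on.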

Yes, something like that. I do wonder if emulating a LightBurn Bridge device would be a route that could work. I suspect it has support for a remote camera, which would give me a hook to put a facade over the HTTP call my machine supports for capturing a frame.

I will take a look.

I suspect this would be far more work. Bridge is not documented and is designed to communicate with Ruida controllers, which are not g-code based. GRBL would likely be your best bet, as it's well understood, well documented, and well supported. There are precedents for doing g-code over HTTP.

Check out LaserGRBL's WebSocket-to-serial bridge:
WiFi with ESP8266 – LaserGRBL

Just getting started and not forwarding any traffic yet. But I have LightBurn convinced enough that there is a GRBL machine there that it sends it g-code.

Nice. Are you mimicking a USB serial device or a network device?

For now, network. To support serial I would have to actually do something (socat, perhaps) with a couple of USB serial ports and a cable, since LightBurn and this app are running on the same machine.

Have you checked out my experiences with this sort of thing:
What sort of Gcode options are there?

I am looking at that now.

So how does LightBurn handle focusing? With my laser's software, you tell it what thickness material you are cutting and it adjusts the Z height accordingly. Is this something LB does according to some settings?

LightBurn is not involved in focusing.

If you have Z-axis controls, you can use LightBurn to move to specific Z heights. It can also track material thicknesses to accommodate them, but the initial focusing is done manually or by the controller. Material thickness is defined in the Material library.

LightBurn also gives you the ability to set a Z offset as well as a Z step per pass of the cut (i.e., if you wanted to step down per pass).
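In g-code terms, that per-pass Z step amounts to repeating the cut path at successively lower Z heights. A hedged illustration (my own sketch, not LightBurn's actual output):

```python
# Hedged illustration (not LightBurn's actual output): what "Z step per
# pass" amounts to in g-code terms -- repeat the cut path, lowering Z by
# a fixed step before each pass after the first.
def passes_with_z_step(path_lines, num_passes, z_start, z_step):
    """Yield g-code lines for num_passes copies of path_lines,
    stepping Z down by z_step between passes."""
    z = z_start
    for _ in range(num_passes):
        yield f"G0 Z{z:.3f}"  # move to this pass's cut height
        yield from path_lines
        z -= z_step           # step down for the next pass

# Example: a square cut in 2 passes, stepping down 1 mm per pass.
square = ["G1 X10 Y0", "G1 X10 Y10", "G1 X0 Y10", "G1 X0 Y0"]
gcode = list(passes_with_z_step(square, num_passes=2, z_start=0.0, z_step=1.0))
```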

I was able to run a simple cut last night without the LaserBox software. All I did was push, line by line, the gcode generated by it to the machine over the http interface to see what happened. It didn't go as far as exposing what the proprietary g-codes do, but it did help with one that turns on air assist. Also, I announced my project on the Facebook group for owners of the machine and there was much enthusiasm. If this turns out well it will certainly sell some copies of LightBurn, so if you are an employee, your efforts answering my questions will have ROI. So, next question…
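The line-by-line push can be sketched roughly like this. The endpoint shape is again a guess from the URL pattern earlier in the thread, and a real sender would need flow control and error handling:

```python
# Hedged sketch of the line-by-line push: send each meaningful g-code
# line over the (assumed) HTTP interface and wait for the reply before
# sending the next. A real sender needs flow control and error handling.
import urllib.parse
import urllib.request

def stream_gcode(host, lines, timeout=10.0):
    """Send each non-blank, non-comment line; return the replies."""
    replies = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith(";"):
            continue  # skip blanks and ;-comments
        url = f"http://{host}/cnc/cmd?cmd={urllib.parse.quote(line)}"
        req = urllib.request.Request(url, data=b"", method="POST")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            replies.append(resp.read().decode())  # block until acknowledged
    return replies
```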

What does LightBurn do with the camera when one is in use? Use it to place the design relative to the stock so that things line up? Provide a live feed of the cut in progress? What?

I think LightBurn only supports USB cameras, so the one the machine has would not be compatible. However, depending on what kind of frame rate I could get, I could probably make one of those RaspberryPi Nanos act like a USB cam and make HTTP calls to the machine to capture frames. I have already done something like that, where I made it act like a keyboard.

Just a user but happy to see folks moving the ball forward.

Here’s what the documentation has to say about its uses:

  • Position designs on material
  • Trace simple artwork from the camera image
  • Monitor your laser

Yes, USB cameras. Whether or not your existing one can be used likely depends on how it's exposed to the computer. LightBurn should work with any UVC-compliant camera on Windows.

How is the camera exposed? There may be a way of virtualizing the camera as a UVC device.

The "Monitor your laser" use suggests a real-time feed. That part could be a challenge. The first two uses would be doable with a low frame rate.

From what I can tell, it is only exposed through the HTTP call that grabs a single frame, plus perhaps some calls for doing some sort of calibration. So I think I would have to go the RaspberryPi route and have it emulate a UVC device and act as a proxy on that front. But I think that would be a later-phase thing, as LightBurn can certainly be used without a camera.
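As a first step on that front, the single-frame grab can at least be polled at a low rate from the proxy side. A sketch assuming a hypothetical frame URL (the real endpoint name is unknown to me):

```python
# Hedged sketch: poll a (hypothetical) single-frame HTTP endpoint at a
# low rate -- the frame source a UVC-gadget proxy on the Pi would feed
# from. The URL shape is an assumption, not the machine's documented API.
import time
import urllib.request

def poll_frames(frame_url, handle_frame, interval=1.0, max_frames=None):
    """Fetch frame bytes from frame_url every `interval` seconds and
    pass them to handle_frame; stop after max_frames if given."""
    count = 0
    while max_frames is None or count < max_frames:
        with urllib.request.urlopen(frame_url, timeout=5) as resp:
            handle_frame(resp.read())
        count += 1
        if max_frames is None or count < max_frames:
            time.sleep(interval)
```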

A camera is a nice-to-have for sure, as long as you have another way of aligning your workpiece (e.g. a visible-light laser).

Yeah, put the laser at a couple percent power so it is visible but doesn't cut, manually jog it to the corner where you think your job should start, and tell LB to frame. That doesn't sound painful.

Is your laser not a CO2 laser? Or does it also include a diode laser meant for engraving?

It is a CO2. I am making some assumptions about how manual work placement would go. I take it I am missing something.

CO2 lasers do not generate visible light, so one wouldn't function for alignment. Additionally, even if the light were visible, the tubes require a certain minimum power to lase, which would likely already damage the workpiece.

Frequently these lasers will be paired with a guide diode laser that’s meant only for alignment and produces visible light.

Ahh. Interesting. So maybe a 3D printed point finder or just manual guess and check would do.

Or, hell, just adding a LightBurn-supported camera would do. They are not all that pricey.

Not having a guide laser will make more precise alignment impossible. For example, print and cut will basically be a no-go.

A well-calibrated camera will get you to sub-millimeter precision for rough alignment.
