Can LightBurn be used as a GCODE generator?

I am looking at that now.

So how does LightBurn handle focusing? For my laser's software you tell it what thickness of material you are cutting and it adjusts the Z height accordingly. Is this something LB does according to some settings?

LightBurn is not involved in focusing.

If you have Z-axis controls you can use LightBurn to move to specific Z heights. It can also track material thicknesses to accommodate them, but the initial focusing is done manually or by the controller. Material thickness is defined in the Material library.

LightBurn also gives you the ability to set a Z offset as well as a Z step per pass of the cut (i.e. if you wanted to step down per pass).
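For illustration only, here is a rough sketch of the arithmetic involved; the variable names and the emitted G-code are my assumptions, not LightBurn's actual output (that depends on your device profile):

```python
# Hypothetical illustration of how a Z offset plus a Z step per pass
# combine across passes. Names and G-code are assumptions, not what
# LightBurn actually emits.
focus_height = 10.0      # mm: Z where the material surface is in focus
z_offset = -1.0          # mm: start the cut slightly below the surface
z_step_per_pass = -0.5   # mm: step down on each subsequent pass
num_passes = 3

for p in range(num_passes):
    z = focus_height + z_offset + p * z_step_per_pass
    print(f"G0 Z{z:.3f}  ; pass {p + 1}")
```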

I was able to run a simple cut last night without the LaserBox software. All I did was push, line by line, the G-code it generated to the machine over the HTTP interface and see what happened. It didn't go as far as exposing what the proprietary G-codes do, but it did help identify one that turns on air assist. I also announced my project on the Facebook group for owners of the machine and there was much enthusiasm. If this turns out well it will certainly sell some copies of LightBurn, so if you are an employee, your efforts answering my questions will have ROI.
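Roughly, the push loop looked like this; the machine address and endpoint are placeholders for whatever the LaserBox HTTP interface actually exposes:

```python
# Minimal sketch of pushing generated G-code to the machine one line
# at a time over HTTP. MACHINE and the /command endpoint are
# placeholders, not the real LaserBox API.
import requests

MACHINE = "http://192.168.1.50:8080"  # hypothetical machine address

with open("job.gc") as f:
    for line in f:
        line = line.strip()
        if not line or line.startswith(";"):
            continue  # skip blanks and comments
        r = requests.post(f"{MACHINE}/command", data={"code": line})
        r.raise_for_status()
```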

So, next question: what does LightBurn do with the camera when one is in use? Use it to place the design relative to the stock so that things line up? Provide a live feed of the cut in progress? What?

I think LightBurn only supports USB cameras, so the one the machine has would not be compatible. However, depending on the achievable frame rate, I could probably make one of those Raspberry Pi Zeros act like a USB cam and make HTTP calls to the machine to capture frames. I have already done something like that where I made one act like a keyboard.

Just a user but happy to see folks moving the ball forward.

Here’s what the documentation has to say about its uses:

  • Position designs on material
  • Trace simple artwork from the camera image
  • Monitor your laser

Yes, USB cameras. Whether your existing one can be used likely depends on how it's exposed to the computer. LightBurn should work with any UVC-compliant camera on Windows.

How is the camera exposed? There may be a way of virtualizing the camera as a UVC device.

The “Monitor your laser” use would suggest a real-time feed. That part could be a challenge. Items 1 and 2 would be doable with a low frame rate.

From what I can tell it is only exposed through the HTTP call that grabs a single frame, and perhaps some calls for doing some sort of calibration. So I think I would have to go the Raspberry Pi route and have it emulate a UVC device and act as a proxy on that front. But I think that would be a later-phase thing, as LightBurn can certainly be used without a camera.
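As a sketch of that later-phase idea, something like this could proxy the single-frame HTTP capture into a virtual camera; the frame URL is a placeholder, and pyvirtualcam needs a loopback driver installed (e.g. v4l2loopback on Linux):

```python
# Sketch: poll the machine's single-frame HTTP capture and republish
# it as a virtual camera that UVC-aware software can open. FRAME_URL
# is a placeholder for the machine's real capture endpoint.
import cv2
import numpy as np
import pyvirtualcam
import requests

FRAME_URL = "http://192.168.1.50:8080/camera/frame"  # hypothetical

with pyvirtualcam.Camera(width=1280, height=720, fps=5) as cam:
    while True:
        r = requests.get(FRAME_URL, timeout=5)
        img = cv2.imdecode(np.frombuffer(r.content, np.uint8),
                           cv2.IMREAD_COLOR)            # JPEG -> BGR array
        img = cv2.resize(img, (cam.width, cam.height))
        cam.send(cv2.cvtColor(img, cv2.COLOR_BGR2RGB))  # send() expects RGB
        cam.sleep_until_next_frame()
```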

Camera is a nice-to-have for sure, as long as you have another way of aligning your workpiece (e.g. a visible-light laser).

Yeah, put the laser at a couple percent power so it is visible but doesn't cut, manually jog it to the corner where you think your job should start, and tell LB to frame. That doesn't sound painful.

Is your laser not a CO2 laser? Or does it also include a diode laser meant for engraving?

It is a CO2. I am making some assumptions about how manual work placement would go. I take it I am missing something.

CO2 lasers do not generate visible light, so the beam wouldn't work for alignment. Additionally, even if the light were visible, the tubes require a certain minimum power to lase, which would likely already damage the workpiece.

Frequently these lasers will be paired with a guide diode laser that’s meant only for alignment and produces visible light.

Ahh. Interesting. So maybe a 3D-printed point finder, or just manual guess-and-check, would do.

Or, hell, just adding a LightBurn-supported camera would do. They are not all that pricey.

Not having a guide laser will make more precise alignment impossible. For example, Print and Cut will basically be a no-go.

A well-calibrated camera will get you to sub-millimeter precision for rough alignment.

Excellent news. I took a G-code file generated by LightBurn (with a custom header/footer configured for the machine), added negative signs, added a manual command to set focus height, and sent it to the machine line by line, and it worked! So it looks like my job is as simple as negating all X/Y/Z coordinates and I am good to go.
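The sign flip is mechanical enough to script. Here is a minimal sketch, with a deliberately naive regex that only handles the usual `X12.34 Y-5.6` style coordinate words:

```python
# Minimal sketch: negate every X/Y/Z coordinate in a LightBurn-
# generated G-code file. The regex is naive and would also touch
# comments containing axis-like words.
import re

def negate_axes(line: str) -> str:
    def flip(m: re.Match) -> str:
        axis, value = m.group(1), m.group(2)
        return f"{axis}{-float(value):g}"
    return re.sub(r"([XYZ])(-?\d+(?:\.\d+)?)", flip, line)

with open("job.gc") as src, open("job_negated.gc", "w") as dst:
    for line in src:
        dst.write(negate_axes(line))
```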

Nice. A pain… but nice. Are you still moving forward with creating an IP proxy service?

Yes. We will see how far it takes me. Should be easy enough.

So I have my proxy in place. I can use the LB Move window to move the machine by entering coordinates and hitting “Go”. As I have not implemented the `?` GRBL status command yet, jogging with the arrow buttons doesn't work, as it appears LB requires that for positional feedback.

I did hit the play button for a simple square cut not too far from origin. All of the G-code blasted over to the machine in about a second and a half, and it seemed to process some of the commands out of order. Air assist on is in the header and air assist off is in the footer, and it was on for a fraction of a second, after which the cutting moves happened. I presume a real GRBL device doesn't respond to a command until it has finished processing it, whereas this HTTP interface responds immediately. I am going to have to figure out a way to query the machine for status as a mechanism for throttling command speed. I would really like to get my hands on some sniffed GRBL traffic that had timing info in it. Any idea where I might find that?
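The throttle I have in mind would look roughly like this; the status endpoint, its response fields, and the idle state are all guesses at what the machine might expose:

```python
# Sketch of throttling: after each command, poll a (hypothetical)
# status endpoint until the machine reports idle, then send the next
# line. A real GRBL link would instead wait for the per-line "ok".
import time
import requests

MACHINE = "http://192.168.1.50:8080"  # hypothetical

def wait_until_idle(poll_interval: float = 0.05) -> None:
    while True:
        status = requests.get(f"{MACHINE}/status").json()
        if status.get("state") == "idle":  # assumed response field
            return
        time.sleep(poll_interval)

def send_throttled(lines):
    for line in lines:
        requests.post(f"{MACHINE}/command", data={"code": line})
        wait_until_idle()
```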

That’s awesome that you’ve got some basic communication going.

Conceptually it could be getting out of order in 3 places:

  1. out of LB
  2. out of your proxy
  3. during processing in your laser

Have you confirmed that 1 and 2 are okay?

I’m not certain about the queuing in GRBL. I do know that the `?` status request is non-blocking in the sense that it can be sent at any time. However, I’ve never heard of a scenario where commands are processed out of order. I’m assuming the HTTP requests are asynchronous. Is there a way to group multiple commands into a transaction to guarantee order? This won’t be practical unless you can guarantee order.

What type of thing are you looking for? Like USB packet sniffing? One thing you might want to look at is the GRBL emulator in LaserGRBL. It functions as a virtual GRBL device. That might give you more direct insight. There may be debug information available or else you could potentially inspect the runtime memory of the emulator.

If there’s something specific you want with a sniffer I could run that for you if you let me know what you’re after.

Yes. I have synchronous HTTP calls in the mix. It has to be on the laser side.

You had mentioned that before. That is definitely a tool I should look at, and one I should be able to sniff.