I’m in the process of calibrating my K40 machine with a new Ruida RDC6445G controller. My tube is 40W and should have a maximum current of 18mA. I’m using the analog output of the controller (L-AN1) on the LPSU IN1 pin to control the laser power, and it generally works great - except that at 100%, the tube draws 25mA, which is far too much current. (The same happens when I use LPWM1, only the ghetto ancient-Chinese-secret LPSU screeches in protest at lower power settings, so that’s a non-starter.)
Through a couple of deterministic tests I found that 65% is right about the sweet spot where the tube draws 18mA. That works out to roughly 3.6V from the controller’s analog output - well short of the full-scale reading of about 4.9V at 100% power (well, 99%, because thanks China, the controller won’t let me set it any higher, even if I enter 100% in the controller laser limit settings).
Now, if I limit the tube to 65% power through the controller laser settings, that’s interpreted as a ceiling on maximum power, rather than an upper limit against which a requested power number is scaled. So if LightBurn asks the laser for 100% on a layer, the controller will “helpfully” clamp that layer down to 65%. My valid power settings are now 0% to 65%, and I want to expand that back to 0% to 100% - which tells me the maximum-power clamp setting is definitely not what I want. There’s also a ‘laser attenuation %’ setting in the controller hardware setup that I tried initially, but it appeared to have no effect on, well, anything.
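To make the distinction concrete, here’s the difference between what the controller does (a clamp) and what I actually want (a linear rescale), sketched in Python. The 0.65 factor is from my own tube measurements - yours will differ:

```python
# Scale vs. clamp: the 65% figure is specific to my tube/LPSU pairing.
TUBE_SAFE_MAX = 0.65  # fraction of full-scale output that yields 18mA on my tube

def scale_power(requested_pct: float) -> float:
    """What I want: map the full 0-100% request range onto 0-65% actual output."""
    requested_pct = max(0.0, min(100.0, requested_pct))
    return requested_pct * TUBE_SAFE_MAX

def clamp_power(requested_pct: float) -> float:
    """What the controller's max-power setting does instead: a hard ceiling."""
    return min(requested_pct, TUBE_SAFE_MAX * 100)

# With scaling, a 50% layer lands at 32.5% actual; with the clamp it stays
# at 50%, and only requests above 65% are affected at all.
print(scale_power(50))    # 32.5
print(clamp_power(50))    # 50.0
print(scale_power(100))   # 65.0
print(clamp_power(100))   # 65.0
```

Both behaviors cap out at 65%, but the clamp throws away the top third of my usable resolution instead of spreading 0-100% across the safe range.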
Has anyone seen a “maximum output scaler” setting, or something I can use for this, in the controller? I haven’t found anything that works the way I want it to, and since every laser tube is a little different, I have a hard time believing there isn’t a way to change this in firmware. What am I missing here? Am I barking up the wrong tree, and this is actually how it’s supposed to work?
If I can’t find a workable solution to this in firmware, my ultra-ghetto fallback will be to take the controller’s analog output (L-AN1) and feed it through a trimmer pot acting as a voltage divider, so that 100% produces the voltage I’m interested in. I’m just worried that the divider’s added resistance, together with whatever capacitance sits on the IN1 input, would slow the signal down to the point where it’s no longer accurate (or worse, no longer linear).
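The divider idea is easy enough to sanity-check on paper. Here’s a rough calculation (Python); the measured voltages are mine, but the 10k pot value and the input capacitance are pure guesses, since I haven’t measured IN1’s actual input impedance:

```python
# Sanity-check the trimmer-pot divider: what ratio do I need, and is the
# resulting RC time constant anywhere near slow enough to matter?
V_FULL_SCALE = 4.9   # volts out of L-AN1 at max power (measured)
V_TARGET     = 3.6   # volts that gives ~18mA on my tube (measured)

ratio = V_TARGET / V_FULL_SCALE
print(f"divider ratio needed: {ratio:.3f}")   # ~0.735

# Model the trimmer as a two-resistor divider. Its output (Thevenin)
# resistance is the parallel combination of the two halves.
R_POT = 10_000.0                  # assumed 10k trimmer
r_top = R_POT * (1 - ratio)
r_bot = R_POT * ratio
r_thevenin = r_top * r_bot / (r_top + r_bot)   # ~1.9k ohms

# Guess a few hundred pF for wiring plus the IN1 input (assumption).
C_IN = 470e-12
tau = r_thevenin * C_IN
print(f"Thevenin resistance: {r_thevenin:.0f} ohms")
print(f"RC time constant: {tau * 1e6:.2f} us")   # well under 1 us

# L-AN1 is a slowly varying analog level, not a fast PWM edge, so a
# sub-microsecond time constant shouldn't distort anything. Linearity is
# also safe: a resistive divider stays linear as long as IN1's input
# impedance is much larger than the divider's ~2k output resistance.
```

So if those assumptions are anywhere close, the speed worry is a non-issue; the thing to actually verify is that IN1’s input impedance is high enough not to load the divider and shift the ratio.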