We are adding some DMX garden lighting to a Crestron project. Usually we create DMX 'shows' using a desk or software controller, save them into a playback controller, and trigger the playback unit from the Crestron system. We do it this way because previously, when we programmed Crestron to control DMX directly (via their RS232-connected bridge unit), we found that colour fades, even on just one RGB LED strip, ate up processing power, and we ran the code in a separate program slot on the processor to minimise the impact on the main program.
Crestron, however, now have an sACN module in their database that can send DMX values to an sACN-to-DMX bridge over Ethernet, which begs the question: why not program the show on the Crestron processor and cut out the middleman?
The reason the last attempt at this was so processor-hungry (even for just 3 channels) seemed to be that for every intensity increment of the fade, DMX values for all three channels were being transmitted over RS232 and onto the DMX bus, and I'm not sure whether that was necessary.
My question about the DMX protocol is therefore: to make a fitting change colour, is it necessary to transmit a DMX value for every intensity increment on each of the RGB channels, or is there a 'fade time' command within the DMX protocol that can be issued to offload the work to the fitting, rather than sending all the incremental values?
I know that when programming a show on the desk I can allocate a slider to 'fade time', but I don't know whether that just means the desk is sending out all the value increments, spaced according to the fade-time slider value, or whether it is sending the fittings a target colour value plus a fade value for the colour transition.
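To illustrate what I mean by "sending out all the value increments", here's a rough sketch of what I suspect the desk (or a Crestron program) has to do during a fade. The ~44 Hz refresh rate and the start/target colours are just hypothetical example values:

```python
# Sketch of the incremental approach: interpolate each RGB channel
# once per DMX refresh and transmit a full frame every time.
# Refresh rate and colours are illustrative assumptions.

def fade_frames(start, target, fade_secs, refresh_hz=44):
    """Yield one (r, g, b) tuple per DMX refresh over the fade duration."""
    steps = max(1, int(fade_secs * refresh_hz))
    for i in range(1, steps + 1):
        # Linear interpolation from start towards target on each channel
        yield tuple(
            round(s + (t - s) * i / steps) for s, t in zip(start, target)
        )

frames = list(fade_frames((255, 0, 0), (0, 0, 255), fade_secs=2))
# A 2 s fade at ~44 Hz works out at ~88 full frames on the wire,
# i.e. every increment gets transmitted by the controller.
print(len(frames), frames[-1])
```

If the protocol had a per-fitting fade command, all of that per-refresh work (and bus traffic) could in principle be replaced by a single "go to this colour over this time" message, which is the crux of my question.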
Can any of the collective advise?
TIA
Nick
What do you mean you wanted it on the other wall - couldn't you have mentioned this when we prewired?