So did you give up and just use LOR? I saw you post that you wanted to stick with all LOR software if you could or were you just trying to keep the LOR wolves off your back...lol.
I've been testing and the only time I can get a crash is when I define a ton of channels and then turn off Output to Lights. It appears that if the interface can't handle the amount of data, it stacks up in the output buffer of the serial driver, and we weren't giving it long enough to transmit all the pending data when we closed the port. I'm trying to figure out a good way to keep it from stacking up, but that will mean skipping frames when there's too much data.
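One way to do that (just a sketch, the names are mine and not actual xLights code): before writing a frame, check how many bytes are still queued in the serial driver's output buffer, and drop the frame if the port can't drain the backlog before the next frame is due.

```python
FRAME_MS = 50          # sequence timing
BAUD = 512000          # controller baud rate
BITS_PER_BYTE = 10     # 8 data bits + start + stop on the wire

def drain_time_ms(pending_bytes: int, baud: int = BAUD) -> float:
    """Time the driver needs to flush what's already queued."""
    return pending_bytes * BITS_PER_BYTE * 1000.0 / baud

def should_skip_frame(pending_bytes: int, frame_ms: int = FRAME_MS) -> bool:
    """Skip this frame if the queued data alone eats the whole frame slot."""
    return drain_time_ms(pending_bytes) >= frame_ms

# With pyserial you'd feed this serial.Serial.out_waiting:
#   if not should_skip_frame(port.out_waiting):
#       port.write(frame_bytes)
```

At 512K baud that threshold works out to 2,560 queued bytes per 50ms frame; anything beyond that means the output can never catch up.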
I did some testing with a Pixie16 defined at 300 channels per port, which is 4,800 channels total. With a Butterfly effect it averaged 13,250 bytes per frame. Then I switched to a Bars effect and it was only 320 bytes, followed by 3 frames of 0 bytes. That's what I mean about their protocol: it's highly dependent on how many different colors are on the channels. An effect like Bars, with lots of channels set to the same color, is more efficient than the E1.31 protocol, but use a Butterfly and it's about 3 times worse than E1.31.
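Their actual encoding isn't published, so this is just a toy run-length encoder to illustrate the effect: frame size tracks the number of color runs in the data, not the channel count.

```python
def rle_size(channels: list[int]) -> int:
    """Bytes needed if each run of equal values costs a (count, value) pair."""
    runs = 1
    for prev, cur in zip(channels, channels[1:]):
        if cur != prev:
            runs += 1
    return runs * 2

# Bars-style frame: two big same-color blocks across 4,800 channels
bars = [255] * 2400 + [0] * 2400
# Butterfly-style frame: nearly every adjacent channel differs
butterfly = [i % 256 for i in range(4800)]

print(rle_size(bars))       # 4 bytes (2 runs)
print(rle_size(butterfly))  # 9600 bytes (4800 runs)
```

Same channel count, wildly different frame sizes, which matches the 320-byte Bars frames versus 13,250-byte Butterfly frames.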
So at a 50ms sequence timing, 13,250 bytes per frame works out to about 5.1 seconds to transmit 1 second's worth of data at 512Kbaud. Depending on the effect, you might be limited to around 1,000 channels at 512Kbaud, but you could run more channels with simple effects that use a lot of the same color. Now LOR might be more efficient with that new Enhanced protocol, where they apparently tried to overcome these shortcomings, but it's not like they want to share their scheme to help us out. They want to lock you into their hardware and software, and it's working for some people. If they were purely a hardware vendor they would publish that protocol, because it would be in their best interest for everyone using their hardware to have a good experience no matter which software they use.
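Here's that arithmetic spelled out, assuming 10 bits on the wire per byte (8 data + start + stop):

```python
BAUD = 512000
BYTES_PER_SEC = BAUD / 10            # 51,200 bytes/s on the wire
FRAMES_PER_SEC = 1000 / 50           # 50ms sequence timing -> 20 fps

frame_bytes = 13250                  # measured Butterfly frame
data_per_sec = frame_bytes * FRAMES_PER_SEC        # 265,000 bytes
seconds_to_send = data_per_sec / BYTES_PER_SEC
print(seconds_to_send)               # ~5.18 s to transmit 1 s of data

# Biggest frame that fits in one 50ms slot:
max_frame = BYTES_PER_SEC / FRAMES_PER_SEC         # 2,560 bytes
# Butterfly used about 2.76 bytes/channel (13,250 / 4,800), so:
max_channels = max_frame / (13250 / 4800)
print(max_channels)                  # ~927 channels, hence "about 1000"
```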