« on: September 30, 2018, 07:12:27 PM »
It is not really about driving lots of pixels; it is about accepting mediocre. I want stale bread, a gristly steak, flat beer. What we have works, but it is just OK. If I went out and bought a monitor, would I accept a 20 or 40 Hz refresh rate? If you put a scope on the Ethernet port of a Pi or a computer, you will see jitter on the Ethernet data. Why would we not move to a better output device with exceptionally low jitter, accurate timing, and support for higher refresh rates, like our monitors have? Say I have 120 pixels on my roof line: a curtain effect sweeping across them in one second would turn on 6 lights at a time at 20 Hz, and 3 at a time at 40 Hz. If I cut the sweep time in half for a fast beat, the steps only get coarser. Why wouldn't I want an output that supports 60/80/100 Hz refresh rates?
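To make the arithmetic concrete, here is a minimal sketch (using the post's numbers: 120 pixels, a one-second sweep) showing how many pixels the curtain edge has to jump each frame at different refresh rates. The function name and constants are my own for illustration:

```python
# Sketch: refresh rate vs. step granularity for a 1-second curtain
# sweep across a 120-pixel roof line (numbers from the post above).
PIXELS = 120
SWEEP_SECONDS = 1.0

def pixels_per_frame(refresh_hz: float) -> float:
    """Pixels the curtain edge must advance each frame to finish the sweep."""
    frames_in_sweep = refresh_hz * SWEEP_SECONDS
    return PIXELS / frames_in_sweep

for hz in (20, 40, 60, 100):
    print(f"{hz:>3} Hz -> {pixels_per_frame(hz):.1f} pixels per frame")
# 20 Hz jumps 6 pixels at a time, 40 Hz jumps 3; only at ~100 Hz
# does the sweep approach single-pixel smoothness. Halving the sweep
# time (for a fast beat) doubles every one of these step sizes.
```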
If you look at professional software like Madrix, it supports video output. Why would we not move toward the same thing the professionals use?