…the latency (the time between when the camera sees something and the output renderer displays it) is TERRIBLE (about 15 seconds). The quality is great, but that's pretty much irrelevant if you are interested in a "real time" display.
After poking around, I see that the reason is that I'm choosing to use HLS. Basically, "You can't get there from here."
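A quick back-of-envelope sketch of why HLS imposes this floor: HLS is a segmented protocol, and players typically buffer a few segments' worth of media before starting playback, so latency scales with segment duration. The numbers below are assumptions for illustration, not measurements from my setup.

```python
# Rough glass-to-glass latency estimate for classic (non-low-latency) HLS.
# Buffer depth and overhead values are assumptions, not measured figures.

def estimated_hls_latency(segment_duration_s: float,
                          buffered_segments: int = 3,
                          overhead_s: float = 1.0) -> float:
    """Players commonly hold about three segments before playing,
    so latency is roughly (segments buffered * segment duration),
    plus encode/packaging/network overhead."""
    return segment_duration_s * buffered_segments + overhead_s

# With 6-second segments (a common HLS default), three buffered
# segments alone put you near the ~15-second delay I'm seeing:
print(estimated_hls_latency(6.0))  # 19.0
```

Shortening the segments helps, but only down to a point; getting to true real time means a different protocol entirely.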
Some latency is expected (indeed, several of the devices I test with have built-in latency), but I'd like to do better than "This is what happened fifteen seconds ago."
I’ll need to don my thinking cap, and figure out how to proceed.
I will have to explore choices for display on Apple devices. The main problem I have is that Apple devices seem to support only a few fairly basic streaming protocols.
The current structure is great as a workbench, so I’ll just keep using it as my “breadboard,” while I figure out how to get the results that I need.
Here’s the tag for Phase One Complete.
The next work will be an extension of Phase One. Let’s call it “Phase One Point Five.”
[Screenshots: Server Not Running · Server Running · The Prefs Screen · View From the Acti Camera · View From the AXIS Camera · View From the FLIR "Eyeball" OEM Camera]