
Preface

This entry is part 1 of 7 in the series The Streaming Chronicles

It all started with a bug.

I am in the process of writing a library that will act as an ONVIF driver for Apple operating systems, and I had gotten to the point of writing a test harness for tvOS.

My test harnesses use the excellent VLCKit to display streams from devices. This is a library that is designed to be integrated into software projects, giving them the ability to display video of almost any format.

It’s an excellent resource, but it is quite “heavy” in its resource requirements, and it has licensing issues. The licensing issues aren’t that big a deal for me, but the “heavy” part is, so I knew that I’d need to find a different streaming library before trying to do anything that even smelled commercial; especially for mobile devices.


Week “Zero”: The Basics

This entry is part 2 of 7 in the series The Streaming Chronicles

This week saw some basic feasibility work, and establishment of some “infrastructure” to support the project, going forward.

As I have stated before, I prefer to use a “make it up as I go along” project planning methodology. The less structure, the better. I consider overdeveloped structure to be “concrete galoshes.”

With that in mind, the first thing that I did was to create my “napkin sketch” of the project.


Week One: Setting Up Phase 1

This entry is part 3 of 7 in the series The Streaming Chronicles

Now that I know (pretty much) where I’m going, I can start to set things up.

Here’s the repo.

Here’s the GitHub Pages for the project.

Here’s the tag for Week 1.

I will be writing the server in Swift. It’s the current “must have” language for Apple development, and I’ve been using it for the last five years. It’s a great language, and I’m happy to be working in it.


Week 1.2: Phase One “Complete,” But…

This entry is part 4 of 7 in the series The Streaming Chronicles

…the latency (the time between when the camera sees something, and the output renderer displays it) is TERRIBLE (about 15 seconds). The quality is great, but that’s pretty much irrelevant, if you are interested in a “real time” display.

After poking around, I can see that the reason is really that I’m choosing to use HLS. Basically, “You can’t get there from here.”

Some latency is expected (indeed, several of the devices I test with have built-in latency), but I’d like to be able to get better than “This is what happened fifteen seconds ago.”
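
To be clear about why (and this is a general property of HLS, not something specific to my setup): HLS chops the video into short segments and hands the player a playlist, and most players won’t start playback until they’ve buffered a few segments (usually about three target durations). With typical six-second segments, that’s fifteen to twenty seconds before the first frame shows up, no matter how fast the encoder is. If ffmpeg is doing the segmenting, the relevant knobs are the segment duration and the playlist size; something like this hypothetical invocation (not my actual command):

    ffmpeg -rtsp_transport tcp -i rtsp://camera.example/stream -c copy \
           -f hls -hls_time 2 -hls_list_size 3 -hls_flags delete_segments stream.m3u8

Shorter segments claw some of that back, but you’re still fundamentally watching “a few segments ago,” which is why this approach can’t get anywhere near real time.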


Week 1.5: Making An ffmpeg “Breadboard”

This entry is part 5 of 7 in the series The Streaming Chronicles

I am just getting back to the project, after taking a couple of weeks to write up a new shared resource, rewrite an old app, and prepare all of my apps for the next version of the Apple operating systems (LGV, RVS). Now, I’m back to the grind.


Before we start, here’s the tag for the work so far (Week 1.5).

ffmpeg is a deep rabbithole. I can tell that I’ll need to spend a lot of time reading up on it.

Because of this, I’ve decided to “retool” the RVS_MediaServer into a “breadboard,” which is an old EE term for an experimental prototyping device; usually easily reconfigurable, and not especially pretty or robust.

So I’ve added a couple of items to the persistent preferences (one of the reasons that I took the time to refactor the preferences into a standalone project was that I had an inkling I’d need a robust supporting infrastructure for this kind of thing).


Week 2: Breadboard 1.0 Done

This entry is part 6 of 7 in the series The Streaming Chronicles

As I mentioned earlier this week, I decided to set up something that is more of an “ffmpeg test harness” than a “shipping” server project.

I wanted to have an app that would allow me to do all the command-line stuff that you can do with ffmpeg, but also allow direct, intravenous access to the guts of a Swift project, so I could monkey with a bunch of stuff that might not be so easily done (or possible at all) with Bash.
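
The core of that is just launching ffmpeg as a subprocess and grabbing its console output. Here’s a minimal sketch of the general idea, using Foundation’s Process API; the executable path and the way the output is consumed are placeholders, not what the app actually does:

    import Foundation

    /// Minimal sketch: run ffmpeg as a subprocess, and capture its console output.
    /// The executable path is a placeholder; ffmpeg writes its log to stderr.
    func runFFmpeg(arguments: [String]) throws -> Process {
        let process = Process()
        process.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
        process.arguments = arguments

        let errPipe = Pipe()
        process.standardError = errPipe
        errPipe.fileHandleForReading.readabilityHandler = { handle in
            if let text = String(data: handle.availableData, encoding: .utf8), !text.isEmpty {
                print(text)    // in the real app, this would feed a console display
            }
        }

        try process.run()
        return process
    }

From there, every ffmpeg experiment is just a different argument array, and anything that can’t be expressed as arguments can be done on the Swift side of the fence.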

The RVS_PersistentPrefs stuff was a big success. Because of it, it was a breeze to add a number of persistent states to the app to access special tasks; namely, establishing a “special access” mode, and a console display in the server screen.
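
I’m not going to reproduce the RVS_PersistentPrefs API here, but for a rough idea of the kind of states involved, here’s a hypothetical stand-in using plain UserDefaults (the key names are made up for illustration):

    import Foundation

    /// Hypothetical stand-in for the breadboard's persistent states.
    /// The real app uses RVS_PersistentPrefs; the key names here are made up.
    struct BreadboardPrefs {
        private static let defaults = UserDefaults.standard

        /// Enables the "special access" mode.
        static var specialAccessMode: Bool {
            get { defaults.bool(forKey: "specialAccessMode") }
            set { defaults.set(newValue, forKey: "specialAccessMode") }
        }

        /// Shows the console display in the server screen.
        static var showConsole: Bool {
            get { defaults.bool(forKey: "showConsole") }
            set { defaults.set(newValue, forKey: "showConsole") }
        }
    }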

Here’s the tag for this week’s work.


Week 3: Basic VLC Live Video Streaming Added

This entry is part 7 of 7 in the series The Streaming Chronicles

Here is the link to this week’s tag.

Abstract

After spending way too much time looking around, I have come to the conclusion that the only viable alternative for real-time streaming playback, at the programmer’s level, is VLCKit. It’s a monster, it doesn’t seem to play so well with others, and it’s really, really complex, but there doesn’t seem to be anything that even comes close; especially for mobile.

Video is hard. I really have to hand it to the VLC people. What they made, works.

Now, you can be forgiven for thinking that I said that I wasn’t going to use VLCKit.

That’s because it’s exactly what I said. We’re allowed to change our minds.

The main work ahead is “tuning.” This is a massive job, and people have made entire, lucrative careers out of the task. They don’t write software. They tune. They play with ffmpeg and VLCKit options and arguments until they have a performant video processing/conversion/streaming engine that delivers the requisite quality.
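
To give a sense of what that looks like at the code level, here’s a minimal VLCKit playback sketch (assuming the MobileVLCKit flavor on iOS; the URL and the caching value are placeholders, not my settings). Most of the tuning happens in options like the network caching that gets handed to the media object:

    import UIKit
    import MobileVLCKit    // assuming the iOS flavor of VLCKit

    /// Minimal sketch of live RTSP playback with VLCKit.
    /// The URL and the option value are placeholders.
    final class PlayerViewController: UIViewController {
        private let player = VLCMediaPlayer()

        override func viewDidLoad() {
            super.viewDidLoad()
            let media = VLCMedia(url: URL(string: "rtsp://camera.example/stream")!)
            media.addOption(":network-caching=300")    // buffering, in milliseconds: one of many "tuning" knobs
            player.media = media
            player.drawable = view                     // render into this view controller's view
            player.play()
        }
    }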

I’m going to be trying something similar, but I’ll be doing it nowhere near as well.
