Standards, Sockets and Secret Sauce

Intel are certainly keen on the standards that support their business (standard instruction sets) but less keen on the standards that would invite more direct competition (standard sockets for their processors). I am sure that Intel are happy to promote their own standards, but perhaps a little less enthusiastic about ARM’s or AMD’s.

Barrett goes on:

“Imagine what it would be like if there were a wide variety of films, standards, and sizes associated with cameras as opposed to a worldwide, international standard, and what it would mean to people if they couldn’t take their cameras from one country to another because film had a different specification, a different size or a different format.”

Well Craig, I don’t have to imagine, because I’ve been battling with image formats (and, to be honest, making money out of that battle) all my working life. Twenty years ago we were able to drive the adoption of our image manipulation software by implementing any darned image format that a customer needed, and some of them were pretty weird.

One of the fundamental reasons we ended up with so many different formats was that different manufacturers needed to do very different things with the video and audio media (oh, no, don’t get me started on why we have completely separate formats, workflows and workstations for video and audio…). Some wanted to manipulate images, others analyse movement, others still wanted to change the sequencing of shots, while some wanted to compress the media for transmission.

Years ago a multi-national company in my industry decided that it would be a really good idea to promote a standard for passing media between systems from different manufacturers. So they created a standard, promoted it, got folks (including my company) to sign up, and we all moved forward into a bright and exciting future.

Except we didn’t. There were a whole bunch of reasons why that standard didn’t stick, but two really stand out, and they point to a far bigger issue.

The first ‘gotcha’, and the most evident, was that the standard was very definitely written to work well with the original company’s products. Of course it was, wouldn’t it be? So when our products needed data in a very different form, it proved difficult even to explain why. For example, from a huge stream of media we needed to extract a small piece from the middle (a single video frame of Red, Green and Blue pixels), edit it and write it back. Their model for doing this was to read the whole stream and re-create it. Super slow, super inefficient and super selfish.
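To make the contrast concrete, here is a rough sketch in Python. Everything in it is assumed for illustration: an uncompressed stream of fixed-size RGB frames in a hypothetical file, nothing like the real formats involved. The first function edits one frame in place; the second is roughly what the standard’s read-everything-and-re-create model amounted to.

```python
# Assumed layout for illustration: 8-bit RGB, fixed-size frames, no wrapper.
FRAME_W, FRAME_H, BYTES_PER_PIXEL = 1920, 1080, 3
FRAME_SIZE = FRAME_W * FRAME_H * BYTES_PER_PIXEL

def edit_single_frame(path: str, frame_index: int) -> None:
    """Seek to one frame, modify it, write it back, leaving the rest alone."""
    offset = frame_index * FRAME_SIZE
    with open(path, "r+b") as stream:
        stream.seek(offset)
        frame = bytearray(stream.read(FRAME_SIZE))
        for i, sample in enumerate(frame):   # trivial 'edit': invert each sample
            frame[i] = 255 - sample
        stream.seek(offset)
        stream.write(frame)

def edit_frame_by_recreating_stream(src: str, dst: str, frame_index: int) -> None:
    """The standard's model: read the entire stream and rewrite every frame,
    just to change one of them."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        index = 0
        while chunk := fin.read(FRAME_SIZE):
            if index == frame_index:
                chunk = bytes(255 - b for b in chunk)
            fout.write(chunk)
            index += 1
```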

The second ‘gotcha’ was that the executives of this multi-national figured out that controlling a standard could give them a competitive edge. How would it be, they mused, if they could use the standard a little bit better than everyone else? What if they had some ‘secret sauce’ in the standard? That would be really cool; they could look like they were being altruistic, whilst at the same time screwing the competition. Win-win! Or not. Call that a standard? I call it double standards.

In response, others entered the fray with their own ‘standards’, so rather than simplifying the problem, we had actually made it worse.

Since then the situation has gone from bad to diabolical.

We used to worry about how the pixels of an image were formatted, how they were packed together and what each byte of information meant in terms of colour. To cope with these differences, people came up with image file formats which allowed for a range of different pixel encodings by adding extra information. That made the problem of interpreting one of these new ‘meta formats’ tough, and typically manufacturers just implemented what they wanted, or what their customers demanded, and threw the rest away. Workflows became broken and everyone got frustrated.
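As a hedged illustration (the header fields and format names below are invented, not taken from any real specification), here is roughly what reading one of those ‘meta formats’ looks like in practice: a header tells you how the pixels are encoded, and a reader supports the handful of encodings it cares about and rejects the rest.

```python
import struct

def decode_pixel(header: dict, raw: bytes) -> tuple[int, int, int]:
    """Turn the raw bytes of one pixel into an (R, G, B) tuple,
    for the few encodings this particular reader bothers to support."""
    fmt = header["pixel_format"]
    if fmt == "RGB8":                      # 3 bytes: R, then G, then B
        return raw[0], raw[1], raw[2]
    if fmt == "BGR8":                      # same bytes, opposite order
        return raw[2], raw[1], raw[0]
    if fmt == "RGB16LE":                   # 6 bytes: little-endian 16-bit samples
        r, g, b = struct.unpack("<HHH", raw)
        return r >> 8, g >> 8, b >> 8      # crush back down to 8-bit
    # ...and every other encoding gets 'thrown away', breaking someone's workflow
    raise ValueError(f"Unsupported pixel format: {fmt}")

print(decode_pixel({"pixel_format": "BGR8"}, b"\x10\x80\xf0"))   # (240, 128, 16)
```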

We like to think that engineers respond to complexity by seeking to simplify things, coming up with clever solutions which reduce the number of variants and make things simpler and quicker. So, in response to having a bunch of different image formats hidden in a bunch of different file formats, the engineers came up with another solution: the ‘wrapper’. The wrapper is a consistent way of describing lots of the information about the images; data about data, or metadata. Of course, there are several competing variants of these wrappers. So now we have wrappers hiding file formats, which are themselves hiding the image information. Phew, time for a breather!
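To show what that layering means for anyone writing software, here is a minimal sketch; the class and field names are invented for illustration and don’t correspond to any real wrapper specification.

```python
from dataclasses import dataclass

@dataclass
class Essence:                 # innermost layer: the actual image samples
    pixel_format: str
    pixels: bytes

@dataclass
class FileFormat:              # middle layer: one vendor's container format
    codec: str
    essence: Essence

@dataclass
class Wrapper:                 # outer layer: 'data about data'
    metadata: dict
    payload: FileFormat

def extract_pixels(wrapped: Wrapper) -> bytes:
    """Peel off every layer just to reach the samples underneath."""
    return wrapped.payload.essence.pixels

clip = Wrapper(
    metadata={"title": "News at Ten", "frame_rate": "25p"},
    payload=FileFormat(codec="uncompressed", essence=Essence("RGB8", b"\x00" * 12)),
)
print(len(extract_pixels(clip)))   # 12 bytes of pixel data, three layers down
```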

Guess what? We still struggle to pass image information from one system to another. It is hard to believe, I know, but some of your favourite television programmes are created on collections of systems (cameras, editing systems, asset management systems, playout servers, etc.) that fundamentally won’t talk to each other without some specialist digital glue. It is as if every piece of equipment required a unique power voltage and connector. After 20 years that’s a pretty sorry state of affairs.

But don’t worry, help is just around the corner. We now have a bunch of projects seeking to establish ‘framework’ standards which define how these systems communicate with each other. Yep, adding yet another layer of ‘standard’ complexity is really going to fix the problem, don’t you think?