Inside Vidpresso

The Secret Vidpresso Master Plan (just between you and me)


A few years back, someone sent me a link to Elon Musk's post explaining the Tesla Motors master plan. Reading it now is especially fascinating because most of his plan has happened. Tesla released the Roadster. It was enough of a hit to let them release the Model S. Now they're aiming for an even more affordable mass-market model. All explained in plain English in 2006.

Even more striking, it was clear Elon had a plan, and he wasn't afraid to share it publicly, despite the fact that it'd be years before his plans would come to fruition.

Now, rather than being afraid to share our vision, I've come to fully realize the value of sharing your master plan with the world. Last week, I was lucky enough to present at the NAB Futures conference. The goal of the conference is to give overburdened executive types a peek at what the future holds for broadcast.

So rather than just sell the current version of the Vidpresso dream, I outlined my vision for the future.1 The crowd was very, very excited about the future of Vidpresso, even more so than about the current product.

Until now, our informal marketing strategy was to pretend that the future vision the company was founded on simply didn't exist when talking to customers. We thought it would scare them, or something. Well, we were wrong.

As you're probably aware, Vidpresso's current product helps broadcasters use social media in their show with one click. You might also be aware that we don't require broadcasters to buy any additional proprietary hardware in order to accomplish this. But our real goal, if we're being honest, is to help anyone create the best quality broadcast, regardless of their budget.

We think anyone, from the video game enthusiast, to the local news channel, to Occupy Wall Street, to CNN, should be able to create broadcast-style shows, and we think they should be able to do so without millions of dollars or a team of people to help.

So how does our current product, which is affordable (or even cheap) for current broadcasters but prohibitively expensive for non-broadcasters, get us to our goal?

Let's walk through the story, starting with the past, and working forward. It'll all make sense.


I love broadcast TV. Specifically live TV. There's an adrenaline rush that starts when the show starts and doesn't end until you fade to black. Beyond the production side, I love that through TV you can experience emotions and understand a person by seeing them, which helps you evaluate their credibility the same way you would in person. I love that.

I started working in broadcast doing behind-the-scenes work in local news. I've done most jobs behind the camera (graphics, producing, editing) and know how it all works together. I left local news to chase my other passion, technology, working for outlets like Engadget and CNET. There, I tried to create quickly produced, repeatable, broadcast quality shows on a shoestring budget. Turns out a single person has a hard time duplicating the work of a whole team with a million-plus-dollar budget.

But why was it so hard? The technology I was using could often create broadcast quality experiences, and it didn't cost $50,000. Why is TV so expensive? What parts could we remove to make it less expensive, without losing any of the quality? And why hasn't that happened yet?

The Epiphany

Back in the early days of computing, individual computers were very expensive, so rather than buy multiple machines, organizations would spend a lot of money on one piece of infrastructure shared throughout the entire organization: a centralized "server," connected by long wires to "dumb terminals," which had no logic of their own and simply relayed keypresses from the user to the computer.

As I thought about it, I realized video production sort of works like an old-school computer mainframe. Basically, broadcasters spend a bunch of money buying infrastructure to route signals around, composite them together, and eventually deliver something that the "dumb terminals", aka TV sets, can consume.

Broadcast technology essentially crystallized in the late '70s and early '80s, I'd argue. Back then, broadcasters had huge budgets and had to do a lot of things by hand. They'd print out the teleprompter ahead of time. They used analog videotape to edit content. And every piece of the broadcast infrastructure existed, essentially, to move signal around.

As computers came around, they started to replace analog things with digital things. Teleprompters became digital. Tapes became files. But all the replaced parts were essentially incremental improvements over previous technologies. The fundamental paradigm of sending signals around never changed, and is still in use today.

Because broadcasters had lots of money, innovation in the 70s and 80s swirled around them, much like it swirls around consumer technology today. Numerous companies emerged to help broadcasters spend less and get more, but the scale was more in line with other enterprise computer systems. That never changed either. That means each piece of broadcast hardware is still expected to cost $50,000 on the low end, and up to hundreds of thousands of dollars on the high end.

But what did change? Broadcasters' outsized profits are being squeezed on all fronts. Other media have emerged as competitors; Netflix et al. are beating down the door. Oh, and broadcast consolidation has meant more people doing more work with the same technology.

So today, broadcasters have racks full of equipment, each piece costing a huge amount, and in order to be redundant, you have to spend 2x on each one.

Cloud Computing for Broadcast

Ten years ago, a web software company wouldn't have been that different from a broadcaster. They'd buy beefy servers, host them themselves, and if one went down, they'd need an exact replica on hand. But more recently, we've started to discover the benefits of cloud computing principles.

I'm not going to explain the whole revolution, but in short, if you have a $100k budget, would you rather buy 10 specialized servers, or 100 commodity servers? Cloud computing says you'd rather buy 100 commodity servers, because you can get similar performance characteristics for most tasks, and you spread the risk of failure. Additionally, those servers can be virtualized, so each individual hardware server can look like 4 different servers, making more work possible with less.
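
To put rough numbers on that trade-off, here's the arithmetic as a tiny sketch. The prices are purely illustrative, not quotes for real hardware:

```typescript
// Illustrative numbers only: not real hardware prices, just the shape of the trade-off.
const budget = 100_000;

const specializedServerCost = 10_000;
const specializedServers = budget / specializedServerCost;   // 10 big, "must never fail" boxes

const commodityServerCost = 1_000;
const commodityServers = budget / commodityServerCost;       // 100 cheap, expendable boxes

const vmsPerBox = 4;                                         // virtualization multiplies them again
const virtualServers = commodityServers * vmsPerBox;         // 400 virtual servers

console.log({ specializedServers, commodityServers, virtualServers });
```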

So what if we started to bring some of these same principles to broadcast? What would it take?

For starters, the most important principle is reliable quality. If you can't rely on something working repeatably, you're not going to use it in a show. You want every frame of output to look the way you expect, and you won't accept any compromises.

To achieve that, current broadcast hardware vendors had to rely on building their own rendering logic for custom hardware. That means the bulk of their development efforts went into the most crucial part of the system: the drawing logic. If you can't reliably draw a picture on the screen, you have nothing. So everyone in broadcast went to work building their own rendering engine. And, since this is 'hard tech' and involves lots of man-hours, nobody shared their work. Each hardware device has its own output card and its own rendering logic to draw pixels on that card, and collectively we've wasted (I estimate) millions of man-hours solving the same problem, over and over.

What if we had a shared system for rendering pixels? What if every vendor didn't have to implement their own font engine, their own 3d logic, their own video playback system? We probably would have seen vast improvements to the user experience of broadcast software, which typically is fair to poor.

It happened while we weren't watching.

Good news: Someone has already started work on this. And, even better news: It's done by the three biggest names in technology, and one of the biggest names in broadcast.

That's right, Apple, Google, Microsoft and Adobe have all been collaborating on building this rendering engine. And, over the last 3 years, it's finally added some crucial features that broadcasters need.2

So why haven't you heard of it before? Don't you think this would be huge, breaking news that everyone in the broadcasting industry would be aware of?

Well, you haven't heard of it because nobody realizes it yet. The rendering engine has a name: The web browser.

That's right: Google Chrome, Microsoft Internet Explorer (v11 or later), Mozilla Firefox, and Apple Safari are all more capable of rendering broadcast quality content than any piece of hardware in your rack. And they run on literally every computer in the world. And Adobe has been leading the charge for the standards broadcasters need most.

Since all these engines are based on W3C standards, anyone who thinks they can do a better job of implementing them is free to try. And that's why the biggest tech companies in the world will continue to leapfrog broadcast hardware vendors in rendering quality.
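
To make that concrete, here's a minimal sketch of a broadcast-style lower third built with nothing but DOM APIs. The markup, fonts, colors, and timings are placeholders I made up for illustration, not Vidpresso's actual implementation; the point is that typography, gradients, and GPU-accelerated animation all come for free from the rendering engine every computer already ships with:

```typescript
// A minimal lower-third sketch using only DOM APIs.
// The styling, text, and timings are placeholders, not any real product's markup.

function showLowerThird(name: string, title: string): void {
  const el = document.createElement("div");
  el.style.cssText = [
    "position: fixed",
    "left: 5%",
    "bottom: 10%",
    "padding: 12px 24px",
    "font-family: 'Helvetica Neue', sans-serif",
    "color: white",
    "background: linear-gradient(90deg, rgba(0,40,120,0.95), rgba(0,40,120,0))",
    "transform: translateX(-150%)",          // start off screen to the left
    "transition: transform 400ms ease-out",  // the browser animates this on the GPU
  ].join(";");

  el.innerHTML = `
    <div style="font-size:28px;font-weight:bold">${name}</div>
    <div style="font-size:18px;opacity:0.85">${title}</div>
  `;

  document.body.appendChild(el);
  el.getBoundingClientRect();                // force a layout so the transition actually runs
  el.style.transform = "translateX(0)";      // slide in
}

showLowerThird("Jane Doe", "Reporting live");
```

Swap the CSS transition for a keyframe animation, a canvas, or a WebGL scene and the same principle holds.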

Enter Vidpresso

This is where our company starts. We eschew traditional broadcast hardware whenever possible. The hardware our company requires to get on the air?

A simple scan converter. Oh, and a computer.

So to use our solution, a station need only spend approximately $1,500. And most stations already have this equipment anyway, because they use it for other purposes, so for them the cost is $0.

So today, if you want to use social media on the air, you don't have to go through a crazy requisition process, sign any long-term contracts, or spend a bunch of time or money to get on the air... you just have to go to our site, sign up for a trial, and if it works, implement it in your infrastructure.

Near future

So, assuming that we're going to be successful and people will use and like our system, what's next for our company?

Well, once we have a good number of clients using our social product, we can start replacing other parts of broadcast with browsers. We don't know what we're going to tackle next, but we see the potential to replace the following (there's a rough ticker sketch just after this list):

  • Tickers
  • Character Generators
  • Video playback
  • Ad playout
  • Switchers
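
As a taste, here's roughly what a browser-based ticker could look like: a few lines of code standing in for a dedicated hardware box. The headlines, speed, and styling are made up for illustration:

```typescript
// A crude crawling-ticker sketch: text scrolls right to left and loops forever.
// Headlines, speed, and styling are placeholders.

const headlines = ["Local team wins big", "Storm expected tonight", "Markets close higher"];

const ticker = document.createElement("div");
ticker.style.cssText =
  "position:fixed;bottom:0;left:0;right:0;overflow:hidden;" +
  "background:#111;color:#fff;font:20px/1.8 sans-serif;white-space:nowrap";

const strip = document.createElement("span");
strip.textContent = headlines.join("   •   ");
strip.style.display = "inline-block";
ticker.appendChild(strip);
document.body.appendChild(ticker);

let x = window.innerWidth;                              // start just off the right edge
const pixelsPerFrame = 2;

function crawl(): void {
  x -= pixelsPerFrame;
  if (x < -strip.offsetWidth) x = window.innerWidth;    // wrap around once fully off screen
  strip.style.transform = `translateX(${x}px)`;
  requestAnimationFrame(crawl);                         // one update per displayed frame
}

requestAnimationFrame(crawl);
```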

We think there's plenty of room for other people to try this too. We think most broadcast hardware vendors would look at a solution like ours and be scared because a) they'd have to cannibalize their own businesses, since a proprietary hardware solution should probably cost 10x as much as a browser-based one, and b) they look at our solution as a toy, something 'serious broadcasters' would never consider.

Honestly, while we'd love to be the people to drive this future, we know we can't do it alone. If more people adopt these principles, the entire industry is going to be better off. Archaic protocols like MOS aren't encouraging, though; we think a wave of startups will be the only way to displace the old style of thinking with newer, less expensive thinking.

The last step

OK, so assume the preceding step actually happens, and broadcasters start using browsers to broadcast all of their content. This is where it gets fun.

Let's go back to the mainframe analogy. It hinges on one crucial assumption: the client can only play back signal; it has no other logic.

It's clear that assumption is going to become more and more incorrect. We're already seeing phones, tablets, set-top boxes, and smart TVs that have some logic, mainly around switching between different streams.

But each of these boxes is going to get smarter and smarter, and eventually (we think) they'll essentially just be web browsers.3

So what if, instead of broadcasting the browser's rendered output over the air, we just had broadcasters create the data, and had the client devices actually do the work of compositing and signal processing?
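
Here's a rough sketch of what that might look like on the client side, assuming some data channel carries graphics as data rather than as baked-in pixels. The stream URL, WebSocket endpoint, and message shape are all hypothetical; the point is that the viewer's own device does the compositing:

```typescript
// Client-side compositing sketch: the broadcaster sends data, the viewer's device draws it.
// The stream URL, WebSocket endpoint, and message shape below are hypothetical.

interface GraphicUpdate {
  kind: "lower-third" | "score";
  text: string;
}

const video = document.createElement("video");
video.src = "https://example.com/live-stream.m3u8";     // placeholder stream location
video.autoplay = true;
video.muted = true;                                      // autoplay usually requires muted
document.body.appendChild(video);

const overlay = document.createElement("div");
overlay.style.cssText =
  "position:absolute;left:5%;bottom:10%;color:#fff;font:24px sans-serif";
document.body.appendChild(overlay);

// Graphics arrive as data; the client decides how (and whether) to render them.
const socket = new WebSocket("wss://example.com/graphics");
socket.onmessage = (event) => {
  const update: GraphicUpdate = JSON.parse(event.data);
  if (update.kind === "lower-third") {
    overlay.textContent = update.text;                   // compositing happens on the device
  }
};
```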

That would mean we'd eliminate all the expense for broadcasters. And, we'd enable new possibilities like true interactive TV, or responsive video.4


That's the future we're fighting for. A future where the clients do all the work, and the broadcasters are tasked with creating the best content, and that's it. No budgets, no racks of equipment, just content.

The quick bullet point version:

  1. Today, broadcasters spend lots of money on single-use equipment.
  2. Our current product, Vidpresso, helps broadcasters get social media on the air without buying any extra equipment.
  3. We think this same approach could be applied generally across all broadcast hardware.
  4. Eventually broadcast hardware could go away completely, as clients get smart enough to do the work of the broadcast hardware locally.

We call this dynamic video. We want to be the ones to lead you into this future.

  1. You can check out the presentation I gave, but it'll be kind of weird without me explaining the slides.

  2. Frame syncing! Every frame drawn is actually synced to the monitor's vblank, meaning there's no potential for tearing anymore. Also, OpenGL ES has been implemented (as WebGL), and we have a common 3D transform system for 2D content as well! Oh, and to top it off: we now have a filter system where we can separate key from fill. Broadcast-style straight alpha is not only possible, but has been implemented. (There's a rough key/fill sketch after these notes.)

  3. They might be specialized, a la HbbTV or something like that, but we think they'll probably be regular browsers at their core.

  4. Blog posts coming.
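
Here's the key/fill idea from note 2 as a rough sketch: draw one scene with real alpha, then derive two outputs from it on every display-synced frame, the fill (the colors over black) and the key (a grayscale matte built from the alpha channel). The resolution and the demo graphic are placeholders:

```typescript
// Key/fill sketch: render one graphic with real alpha, then derive two outputs,
// fill (the colors over black) and key (a grayscale matte from the alpha channel).
// The resolution and the demo graphic are placeholders.

const W = 1280, H = 720;

function makeCanvas(): CanvasRenderingContext2D {
  const c = document.createElement("canvas");
  c.width = W;
  c.height = H;
  document.body.appendChild(c);
  return c.getContext("2d")!;
}

const scene = makeCanvas();  // the graphic, drawn with real alpha
const fill = makeCanvas();   // fill output
const key = makeCanvas();    // key output

function draw(time: number): void {
  // A demo graphic with partial transparency, animated so you can see updates.
  scene.clearRect(0, 0, W, H);
  scene.fillStyle = "rgba(255, 200, 0, 0.6)";
  scene.fillRect(100, H - 200, 400 + 100 * Math.sin(time / 500), 120);

  // Fill: the scene composited over black.
  fill.fillStyle = "black";
  fill.fillRect(0, 0, W, H);
  fill.drawImage(scene.canvas, 0, 0);

  // Key: copy the alpha channel into a white-on-black matte.
  const src = scene.getImageData(0, 0, W, H);
  const matte = key.createImageData(W, H);
  for (let i = 0; i < src.data.length; i += 4) {
    const a = src.data[i + 3];
    matte.data[i] = matte.data[i + 1] = matte.data[i + 2] = a;
    matte.data[i + 3] = 255;
  }
  key.putImageData(matte, 0, 0);

  requestAnimationFrame(draw);  // each draw lands on the display's refresh (the vblank sync from note 2)
}

requestAnimationFrame(draw);
```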
