Cloud computing gets a bad rap because it can’t replace corporate data centers for mission-critical apps. But new computing paradigms never do that: it is the new capabilities they enable that drive adoption. Case in point: transcoding.
Why?
Anyone who shoots video soon discovers that changing from, say, AVCHD to an editing-friendly codec and then to H.264 for distribution takes a lot of compute cycles. Conversion from one codec to another is called transcoding. It is the price we pay for high-quality compressed content.
Compression and format conversion are necessary because highly compressed video – the kind most camcorders shoot – isn’t easy to edit. And the stuff that’s easy to edit has large files that chew up bandwidth and storage.
So we transcode. Add to that the number of target formats we use – ranging from iPhones to Flash to SD and 1080p – and transcoding is a major CPU cycle sink.
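To make that concrete, here is a minimal sketch of what a single transcode looks like – the filenames and quality settings are illustrative, and it assumes the open source ffmpeg tool is installed:

```python
import subprocess

# Transcode an AVCHD clip (.mts) to H.264 in an MP4 container.
# Filenames and settings are illustrative; assumes ffmpeg is on the PATH.
subprocess.run([
    "ffmpeg",
    "-i", "clip.mts",           # AVCHD source straight off the camcorder
    "-c:v", "libx264",          # software H.264 encoder
    "-crf", "20",               # constant quality; lower = better and bigger
    "-c:a", "aac", "-b:a", "160k",
    "clip.mp4",
], check=True)
```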
Fortunately, transcoding can be a highly parallel operation: a frame – or a series of frames – can be split among multiple cores and CPUs, as sketched below.
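Here is a hedged sketch of that idea, again assuming ffmpeg: cut the source into chunks, transcode each chunk on its own core, then stitch the results back together. The 60-second chunk size and codec settings are illustrative.

```python
import glob
import subprocess
from concurrent.futures import ProcessPoolExecutor

def split(src):
    """Cut the source into ~60 s segments without re-encoding.

    The segment muxer cuts at keyframes, so the pieces stay decodable.
    """
    subprocess.run([
        "ffmpeg", "-i", src, "-c", "copy", "-map", "0",
        "-f", "segment", "-segment_time", "60", "-segment_format", "mpegts",
        "-reset_timestamps", "1", "seg_%03d.mts",
    ], check=True)
    return sorted(glob.glob("seg_*.mts"))

def transcode(seg):
    """Transcode one segment to H.264 – each call can run on its own core."""
    out = seg.replace(".mts", ".mp4")
    subprocess.run(["ffmpeg", "-i", seg, "-c:v", "libx264", "-crf", "20",
                    "-c:a", "aac", out], check=True)
    return out

def concat(parts, dst):
    """Stitch the transcoded segments back together without re-encoding."""
    with open("parts.txt", "w") as f:
        f.writelines(f"file '{p}'\n" for p in parts)
    subprocess.run(["ffmpeg", "-f", "concat", "-safe", "0", "-i", "parts.txt",
                    "-c", "copy", dst], check=True)

if __name__ == "__main__":
    segments = split("clip.mts")
    with ProcessPoolExecutor() as pool:  # one ffmpeg worker per CPU core
        parts = list(pool.map(transcode, segments))
    concat(parts, "clip.mp4")
```

The same split/transcode/stitch pattern that fills the cores in one box also scales out to thousands of machines – which is exactly what makes the job cloud-friendly.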
Where?
Where can you find a lot of CPUs for a quick job? Right, the cloud. Which is why there are a number of online services that front-end Amazon Web Services to provide transcoding.
I spoke with Jon Dahl, CEO of startup Zencoder, to learn more.
Zencoder
Zencoder is a transcoding service provider that uses Amazon as its cloud provider. The Zencoder team had developed transcoding infrastructure for several startups before deciding to build a general-purpose service.
While they use open source software in their stack – as do most transcoding providers – their major value-add is in a high-performance scalable interface. Handling 100,000 concurrent transcodes is non-trivial.
They also look out for problems common in transcoding such as audio/video getting out of sync and aspect ratio distortion. They can transcode 1080p faster than real time. And they’ve licensed the proprietary formats as well.
Amazon offers Linux servers as a service (EC2) and a file service (S3). S3 objects are limited to 5 GB, but that isn’t a problem for Zencoder: customers can specify their own input and output locations, bypassing Amazon storage entirely.
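From a customer’s point of view, submitting a job looks roughly like the sketch below. The endpoint, field names and auth header are invented for illustration – this is not Zencoder’s actual API – but it shows how a job can point at input and output locations the customer controls:

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical job submission to a cloud transcoding service. The endpoint,
# field names and auth header are invented for illustration -- NOT Zencoder's
# actual API. The point: input and output are just URLs the customer controls,
# so the service's own storage never enters the picture.
job = {
    "input": "http://example.com/masters/clip.mts",  # pulled from your server
    "outputs": [{
        "format": "mp4",
        "video_codec": "h264",
        "width": 1920,
        "url": "ftp://example.com/encoded/clip-1080p.mp4",  # pushed back to you
    }],
}

resp = requests.post(
    "https://api.example-transcoder.com/v1/jobs",
    json=job,
    headers={"X-Api-Key": "YOUR_API_KEY"},
)
resp.raise_for_status()
print(resp.json())  # typically a job id you can poll for progress
```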
Also, they don’t transcode Apple ProRes – Final Cut Pro’s preferred editing format – today. But they do handle QuickTime movies.
The StorageMojo take
So the glass house doesn’t want to outsource its infrastructure to the cloud. Who cares? They’re the last to adopt new technology anyway.
It is apps like transcoding that will drive the business. In five years much – perhaps most – transcoding will be cloud-based.
Before the digital video craze of the last five years there wasn’t much demand for transcoding. But today, with HD video smartphones, millions of people are producing videos that they want to share and save.
Your smartphone doesn’t have the cycles to do it, but the cloud does. Expect transcoding vendors to add new features, such as noise reduction or sharpening.
Business units are discovering the power of short videos to inform, train, persuade and excite. All at a fraction of the cost of 4-color brochures.
The outlook for storage vendors is mixed. Yes, much more storage will be sold – but cost-conscious cloud managers will be buying it. And as more new services develop on the cloud, consumers will be as hazy about “local” and “cloud” as they are about “memory” and “disk” today. Branding nightmare, but that’s where those petabytes will be.
Courteous comments welcome, of course.
The problem with the cloud is that the pipe between my PC and their cloud is too small to upload my entire video library.
Yes, it sounds great, but most studios want tight quality control over every aspect. These people see flaws that most of us can’t. Cloud transcoding might fit, but not until its quality matches what the studios can produce in-house. Studio files are also huge – hundreds of GB – which is hard to upload over common DSL or broadband; they need faster lines. Besides, HTTP is not very efficient for that type of transfer. SFTP is much better, but not as cloud-friendly as HTTP.
For everybody else, I see a market. Especially with laptops and tablets growing in popularity much faster than desktops, I can see it happening. Their files are also much smaller (typically clips of less than two minutes).
Cable companies are coming out with new set-top boxes that will let consumers upload their smartphone videos. Once the video is in the set-top box, they will offer transcoding for various devices, with storage provided in the box itself or in the cloud for streaming or transcoding.
The next target will be video asset management: how will home users maintain several hundred transcoded video files? We already have the problem today with photos. Next it will be videos, at 20x or more the capacity.
Do we anticipate storage growth…you bet!
Sooner or later we will have to find a way to store these files at much lower cost and in an environmentally friendly way. We will face serious provisioning problems with the rare metals that go into today’s storage devices; the Earth does not have infinite reserves of them.
Using x86 hardware for transcoding in the cloud is a travesty. Low-cost hardware transcoders use one-hundredth (no, that’s not a typo) of the power of an 8-core Xeon transcoding 1080p video. Let’s not encourage people to do the wrong thing just because a business model supports it.
Robin,
Nice description of one (of many) killer apps for the cloud. Combined with a CDN, you upload your source video once and the cloud takes it from there, serving it in the various distribution formats. No need to be bound to YouTube, and it fits many more business models.
Visiotech,
I’m not sure there’s a SOHO/SMB market for cloud transcoding.
For $100 you can get a USB transcoder like the Elgato Turbo.264 HD. Depending on your target format, you can transcode much faster than real time.
Much cheaper than paying for a faster Internet connection, and then paying for the cloud transcoding.
We’re looking at USB transcoders, despite having a 45 Mbps Internet connection. That connection is saturated much of the working day.
Rocky,
USB transcoders have quality issues that bother some users. There is a wide review spread, from 1 star to 5. The Elgato appears to max out at 10 Mbps, which may be part of the problem.
Maybe higher-data-rate USB 3.0 versions will fix that – or maybe “good enough” will be good enough for most people. But they don’t appear to be there yet.
Robin
The encoder used in the Elgato is over four and a half years old. There are newer generations of silicon providing higher resolutions and better throughput. I think most consumers want a USB attached device for simplicity, which is a shame since they could get much better throughput over PCIe.
Robin: great writeup and analysis!
Tinkthank and Rocky: hardware-based encoding does a better job than software encoding at guaranteeing transcoding speed (e.g. guaranteed real-time encoding), so it is great for live stream encoding. But today, hardware transcoder quality is significantly worse than the best software encoders. That’s a big deal for anyone who is concerned about their storage or bandwidth costs. And software-based encoding on new commodity hardware is extremely fast – as fast as many hardware-based transcoders (e.g. real-time 1080p).
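To illustrate the trade-off Jon describes: a software encoder such as x264 exposes an explicit speed-versus-compression dial. The sketch below – filenames invented, ffmpeg with libx264 assumed – times the same encode at three presets:

```python
import subprocess
import time

# Same source, same quality target (CRF); only the preset changes.
# Filenames are illustrative; assumes ffmpeg with libx264 is installed.
for preset in ("ultrafast", "medium", "slower"):
    start = time.monotonic()
    subprocess.run([
        "ffmpeg", "-y", "-i", "clip.mts",
        "-c:v", "libx264", "-preset", preset, "-crf", "20",
        "-an",  # drop audio so the comparison is video-only
        f"clip-{preset}.mp4",
    ], check=True)
    print(f"{preset}: {time.monotonic() - start:.1f}s")
```

Slower presets spend more cycles searching for better compression, so they produce smaller files at the same visual quality – the storage and bandwidth argument Jon is making.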
If this changes – if hardware encoders improve dramatically – then that doesn’t necessarily change the need for SaaS transcoding. A cloud provider could run hardware-based encoding in the cloud easily enough.
Visiotech: you’re absolutely right. Bandwidth is definitely a limitation for some businesses, and not a limiting factor for others. As bandwidth gets cheaper and faster, cloud processing will make sense for more and more of the market.
Hey Robin,
There is certainly a need and a market for this sort of encoding service, from home users looking to convert their native Sony/Flip/Panasonic content up to SMB-sized post-production facilities. Here in Cardiff, local government and academia have clubbed together to provide a clustered farm service for use by the local creative community for processor- and cash-hungry tasks such as rendering. In that case the data can be physically shipped; the problem for a cloud service – as has been mentioned a few times in the comments – is that video needs big fat pipes (http://www.matrixstore.net/2009/09/21/big-fat-pipes-to-the-cloud/).
Most individuals, SMBs and even some larger enterprises do not have access to the comms resources, either through lack of local infrastructure or, more commonly, the cost of paying for the pipe.
Solve the comms problem and cloud-based video offerings will prosper.
Big studios are dealing with several issues here:
1) The video they want to transcode is of a significantly higher bitrate (read: it exceeds the bitrate of Blu-ray by several orders of magnitude) and size than is practical for the cloud.
2) The files that support these high bitrates are, by definition, extremely large (in the tens to hundreds of GB or more).
3) The codecs that many studios are using to store their high-bitrate, large video files are not supported by open source or low-end transcoders.
4) The quality requirements (as noted by another poster) far exceed anything that is currently supportable in this model.
A previous poster mentioned that this requires lots of bandwidth. They weren’t kidding: think along the lines of a 1 Gbps dedicated pipe being insufficient if you’re a studio, and going to a higher-bandwidth pipe becomes cost-prohibitive.
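To put rough numbers on that – the 100 GB file size and link speeds below are illustrative, and protocol overhead is ignored, so real transfers are slower:

```python
# Back-of-envelope upload times for a studio-sized master file.
# 100 GB file; link speeds in megabits per second.
file_bits = 100 * 8 * 10**9  # 100 GB expressed in bits

for name, mbps in [("DSL (5 Mbps up)", 5),
                   ("Business (45 Mbps)", 45),
                   ("Dedicated (1 Gbps)", 1000)]:
    hours = file_bits / (mbps * 10**6) / 3600
    print(f"{name}: {hours:.1f} hours")

# Output:
#   DSL (5 Mbps up): 44.4 hours
#   Business (45 Mbps): 4.9 hours
#   Dedicated (1 Gbps): 0.2 hours
```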
All of that leads you to the idea that this isn’t suitable for studios, which leads to the question: is there enough of a market from people who aren’t studios to sustain these startups in the long run? I agree that the concept of easily spinning up CPU cycles for transcodes is attractive. Unfortunately, the cost of bandwidth, storage and capabilities leads me to believe that this isn’t sustainable.