Saturday, July 19, 2014

MPEG transport stream corruption; effect on pictures

I'm running training at the MET's video forensics lab soon; part of it is explaining how DCT-based compression works, and particularly the effects of corruption on long-GOP MPEG transport streams as delivered in DVB muxes. One of the illustration videos is below.
Three clips, each with a momentary corruption to the data stream; in each case you can see how the decoder can't reconstruct a proper picture until the next I-frame. The second half of the clip shows me stepping through frame by frame with the frame type (I, B or P) indicated top-left - you'll need to make it full screen to see that, as the marker is quite small.
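If you fancy pulling the frame types out of a recording yourself, one way is to ask ffprobe for the picture type of every frame. A quick Python sketch (assuming you have FFmpeg's ffprobe on the path; "clip.ts" is just a placeholder filename):

#!/usr/bin/env python3
# Sketch: list the picture type (I, P or B) of every frame in a clip,
# using ffprobe from FFmpeg. "clip.ts" is a placeholder filename.
import subprocess

def frame_types(path):
    """Return a list like ['I', 'B', 'B', 'P', ...] for the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-select_streams", "v:0",
         "-show_entries", "frame=pict_type",
         "-of", "csv=p=0",
         path],
        capture_output=True, text=True, check=True).stdout
    return [line.strip() for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    types = frame_types("clip.ts")
    print("".join(types))                      # e.g. IBBPBBPBBPBB...I
    if types.count("I") > 1:
        print("GOP length (frames):", types.index("I", 1))

On a typical DVB service you'd expect to see an I-frame roughly every half-second's worth of frames, with B and P frames filling the gaps.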

Wednesday, July 16, 2014

The challenges of modern picture quality analysis.

This is a recent article I wrote for a trade magazine:

Engineers have sought to quantify the quality of the video signal since the birth of television. Since all aspects of the TV picture are represented first by voltages (analogue) and then by numbers in a bit-stream (digital), you have to make measurements to really know anything about the quality of your TV pictures.
“When you can measure what you are speaking about, and express it in numbers, you can know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind: it may be the beginning of knowledge but you have scarcely, in your thoughts, advanced to the stage of science.” - William Thomson, Lord Kelvin.
In the dim and distant days of monochrome telly the only things the TV engineer had to worry about were the blacks and the whites. Common consent had us placing the blacks (dark areas of the picture) as a low signal (at zero volts) and the whites (bright parts of the picture) up at 0.7 of a volt. In addition we allocated the 0.3v below black to the synchronising pulses – the electronic equivalent of the sprocket holes in film; a mechanism that allows the receiving equipment to know when new lines and frames of video are starting, so that the picture is “locked” and not free-running (“try adjusting the vertical-hold!”). Once all those things are well defined, Mr. Sony’s cameras work nicely with Mr. Grass Valley’s vision mixer and the engineer at the broadcast centre can adjust the incoming signal from the OB truck so that it looks right on the waveform monitor, and hence the pictures will match what left the truck. Dark shadows and bright clouds will look like what the camera operator saw.


Fig.1 – monochrome TV signal, two lines.
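If the numbers help, here's a toy Python/NumPy fragment that builds one idealised line to those levels (sync at -0.3V, blanking/black at 0V, peak white at 0.7V). The 64µs and 4.7µs timings are the nominal 625-line figures; everything here is purely illustrative.

# Toy sketch of one idealised line of monochrome video (nominal 625-line
# timings, front porch omitted for simplicity): sync at -0.3 V,
# blanking/black at 0 V, peak white at 0.7 V.
import numpy as np

SAMPLE_RATE = 13.5e6                         # samples/s, used purely for convenience
LINE_US, SYNC_US, BACK_PORCH_US = 64.0, 4.7, 5.7

def one_line(picture):
    """picture: array of luma values 0..1 for the active part of the line."""
    n = lambda us: int(round(us * 1e-6 * SAMPLE_RATE))
    line = np.zeros(n(LINE_US))              # blanking level = 0 V
    line[:n(SYNC_US)] = -0.3                 # line sync pulse
    active_start = n(SYNC_US + BACK_PORCH_US)
    active = np.interp(np.linspace(0, 1, len(line) - active_start),
                       np.linspace(0, 1, len(picture)), picture)
    line[active_start:] = 0.7 * active       # 0..1 picture mapped to 0..0.7 V
    return line

# Example: a simple ramp from black to peak white across the line.
ramp = one_line(np.linspace(0.0, 1.0, 720))
print(f"min {ramp.min():.2f} V, max {ramp.max():.2f} V")   # -0.30 V ... 0.70 V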

So far so good; but people wanted colour TV, and so all of a sudden the way colour is encoded needed to be considered. With colour comes grading and the “look” of pictures, and colourists need to see different representations of the colour parts of the signal for artistic reasons. Engineers need to ensure that the colour content of the picture is constrained to the legal gamut of colours that the transmission system can handle; nobody wants things to change colour as they get to air! Tektronix have always been the gold standard for TV test and measurement, and to this day if you ask an engineer or colourist what kind of test equipment they’d like it’s going to be a Tek.


Fig.2 – colour TV signal, several types of display
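On the gamut point: the essence of the check is to take each Y'CbCr sample back to R'G'B' and flag anything that lands outside the 0-1 range. A rough Python sketch using the Rec.601 coefficients and normalised (offset-free) values - a real test set does rather more than this:

# Rough sketch of an RGB gamut-legality check on Y'CbCr video samples,
# using the Rec.601 conversion. Anything outside 0..1 after conversion is
# "illegal" in the sense that an RGB display/transmission chain can't carry it.
import numpy as np

def ycbcr_to_rgb_601(y, cb, cr):
    """y in 0..1, cb/cr in -0.5..+0.5 (already normalised, no offsets)."""
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.stack([r, g, b], axis=-1)

def illegal_fraction(y, cb, cr, tol=1e-3):
    """Fraction of samples outside 0..1 (small tolerance for rounding)."""
    rgb = ycbcr_to_rgb_601(y, cb, cr)
    bad = (rgb < 0.0 - tol) | (rgb > 1.0 + tol)
    return bad.any(axis=-1).mean()

# Example: 100% saturated yellow sits right on the gamut boundary;
# exaggerate its chroma and the converted RGB goes out of range.
yellow = (np.array([0.886]), np.array([-0.5]), np.array([0.0813]))
print(illegal_fraction(*yellow))                                       # 0.0 (legal)
print(illegal_fraction(yellow[0], yellow[1] * 1.2, yellow[2] * 1.2))   # 1.0 (illegal)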

As we moved from analogue to digital working in the 1990s, and then from standard definition to higher resolutions in the noughties, the principle of looking at the lines and fields of the TV signal remained: we assumed that if one frame got through the system with minimal/acceptable levels of distortion then all subsequent frames would too; and as we know, the illusion of television is that many frames make a moving sequence.
However – with the introduction of “long GOP” (Group Of Pictures) video compression in the 90s it became apparent that we don’t treat every frame of video the same. On a compressed video link there are I-frames (the ones from which the entire picture can be re-built) and other, more complex beasts called B-frames and P-frames; these serve to convey the differences (by not sending every frame complete, but merely the differences from previous and subsequent frames, we achieve video data rate reduction, AKA compression). You’ve no doubt seen the on-air fault where some parts of a picture seem to have become “stuck” in the previous scene while other parts of the picture behave normally. Then, suddenly, the picture rights itself. What you have witnessed is a corrupt I-frame; all your set-top box can do now is show the changes as they arrive, and you don’t get a re-built complete frame until the next I-frame arrives (typically half a second later). This is just one kind of “temporal” fault.


Fig.3 – MPEG multiplex fault
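To see why a single corrupt frame poisons everything up to the next I-frame, here's a deliberately stripped-down Python model - no DCT, no motion compensation, no B-frames, so nothing like real MPEG, but the dependency structure is the same:

# Toy model of difference coding: the first frame of each GOP is stored whole
# ("I"), every other frame is stored only as the difference from its
# predecessor ("P"). Corrupting one stored frame ruins every decoded frame
# until the next "I" arrives.
import numpy as np

GOP = 12  # frames per group of pictures

def encode(frames):
    coded = []
    for i, f in enumerate(frames):
        if i % GOP == 0:
            coded.append(("I", f.copy()))             # whole picture
        else:
            coded.append(("P", f - frames[i - 1]))    # difference only
    return coded

def decode(coded):
    out, prev = [], None
    for kind, data in coded:
        prev = data.copy() if kind == "I" else prev + data
        out.append(prev)
    return out

rng = np.random.default_rng(0)
frames = [rng.random((4, 4)) for _ in range(36)]      # three GOPs of "video"
coded = encode(frames)
coded[2] = (coded[2][0], np.zeros((4, 4)))            # corrupt one P-frame

errors = [np.abs(d - f).max() for d, f in zip(decode(coded), frames)]
for i, e in enumerate(errors):
    print(f"frame {i:2d} {'I' if i % GOP == 0 else 'P'} error {e:.3f}")
# Frames 0-1 decode cleanly, frames 2-11 are wrong, frame 12 (the next I) recovers.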

So, now we have to consider things happening in time as well as the pixels, lines and fields of video. The colour bars you’re looking at coming across the satellite link may look splendid, but perhaps the link is only able to convey pictures that change minimally between frames, and as soon as moving video arrives it looks awful. Perhaps the fields of video have been reversed (a more common technical fault than you’d expect) and you’ll only see that on moving pictures; and it’s not nice!

Test signals have always served to “stress” the system they are being run through; the traditional colour bars have their colours set at the extreme ends of what the system can handle – you never see that amount of saturated yellow in pictures coming out of a TV camera! We want our test signal to show faults; if the pictures look good it should be because the OB link or editing workstation is capable of carrying the worst-case pictures.
So, now we need a test signal that is not just the same frame of video repeated endlessly; we need a test that changes over time, serves as a challenge to compression encoders, and is constructed in such a way as to have predictable picture effects that highlight when your production chain is sub-par. Ideally we could see these drop-offs on a picture monitor rather than on a £10k Tektronix test set. Once we have an animated test signal we can test not just for the degradation of compression but also for those field-cadence problems. We can also test for lip-sync errors, exaggerating the effect of audio being late or early with respect to the video; more importantly, all of these tests can be done by an operator rather than an expensive engineer. We’d also like the sequence to be constructed such that faults are visible on a 24” video monitor from the other side of a busy machine room or studio gallery.

Fig.4 – SRI Visualizer - a still frame; it moves normally!

  • Lip-sync: Measure and quantify the synchronization offset between audio and video – a single frame of sync is easily missed on camera pictures.
  • Bit depth: Detect 10-bit to 8-bit truncation – in a modern facility a mix of eight and ten bit video is a fact of life, but no client wants unnecessary loss of dynamic range.
  • Compression fidelity: Measure and quantify compression levels in real time; again, real camera pictures often make this effect hard to spot.
  • Colour matrix mismatch: Determine high-definition (709) and standard-definition (601) colour space conversion errors - see the sketch after this list. These colour shifts are subtle until the director is shouting about that shade of red he wants!
  • Chroma subsampling: Determine the chroma subsampling being used (4:2:2, 4:2:0, 4:1:1…)
  • Chroma upsampling: Reveal how missing chroma samples are interpolated
  • Field dominance: Definitively determine field order reversal.
  • Chroma motion error: Demonstrate incorrect processing of chroma in interlaced 4:2:0 systems
  • Subjective image fidelity: Perform a rapid check of system integrity
  • Colour conversion accuracy: Verify colour-space conversions
  • Display gamma: Measure monitor gamma quickly
  • Black clipping: Reveal black clipping and accurately set monitor black level
  • White clipping: Determine if highlights are being blown out
  • Noise: See how an encoder handles noise
  • Skipped frames: Detect repeated and dropped frames
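As promised above, here's a small Python sketch of what the colour matrix mismatch item means: encode an R'G'B' value to Y'CbCr with the Rec.709 luma coefficients, then decode it with the Rec.601 ones (the wrong matrix). Greys survive; saturated colours come back visibly shifted. This is purely my own illustration of the fault, not how the Visualizer detects it.

# Sketch of a 601/709 colour matrix mismatch: encode with Rec.709
# coefficients, decode with Rec.601. Neutral greys are unaffected;
# saturated colours come back wrong.
import numpy as np

def rgb_to_ycbcr(rgb, kr, kb):
    r, g, b = rgb
    kg = 1.0 - kr - kb
    y = kr * r + kg * g + kb * b
    cb = (b - y) / (2.0 * (1.0 - kb))
    cr = (r - y) / (2.0 * (1.0 - kr))
    return np.array([y, cb, cr])

def ycbcr_to_rgb(ycc, kr, kb):
    y, cb, cr = ycc
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return np.array([r, g, b])

REC709 = (0.2126, 0.0722)   # (Kr, Kb)
REC601 = (0.299, 0.114)

for name, rgb in [("grey", (0.5, 0.5, 0.5)), ("red", (0.8, 0.1, 0.1))]:
    ycc = rgb_to_ycbcr(np.array(rgb), *REC709)   # encoded as HD (709)
    wrong = ycbcr_to_rgb(ycc, *REC601)           # decoded as SD (601): mismatch
    print(f"{name}: in {rgb} -> out {np.round(wrong, 3)}")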
The SRI Visualizer is available as a hardware product (the TG-100, which also includes equally innovative audio tests) that can be installed into a machine room to augment or replace existing SPG-type test signal generators. You can also purchase and download it as several kinds of video file, which can prove very useful in file-based workflows: inject the sequence at the start of the post-production workflow and confirm all is well at the very last stage. An hour spent paying attention at the start of a new production will pay dividends by highlighting exactly where any picture faults creep in. Did that colour shift occur during editing, grading, VFX or when the show was transcoded for distribution? Without a test system like the Visualizer these problems are hard to track down.

Tuesday, July 08, 2014

Temporal-based video tests; The SRI Visualizer

We've recently taken on SRI as a supplier and I'm very excited about their test system.


You can read the article I've written for a couple of industry magazines here.
SRI International

Thursday, July 03, 2014

Someone who has some actual experience of file-based deliverables!

There are an awful lot of people touting themselves as "DPP consultants" and "file-based technologists" in the UK TV industry at the moment. For the most part they are riding the gravy train of the DPP roadshow and the fact that come October all the big broadcasters will expect file-based delivery AND material QC'ed to the DPP specification. The majority of these folks have not delivered a single minute of material for terrestrial broadcast, but the one chap who really does know his stuff is my good pal Simon Brett (recently moved from National Geographic/Fox UK to NBC-Universal).


Here he is at a recent evening event we laid on; if you want to know some actual details (what bit of software to use to handle your metadata, etc.) then Simon is your man. He's quite an engaging speaker (and he bigs me up for colourimetry!)

Tuesday, July 01, 2014

Near-synchronous video over UDP/IP with Sony's NXL-IP55


Just about the only thing that caught my eye at the recent Beyond HD Masters day at BAFTA was Sony's IP Live system. This is a single product called the NXL-IP55 which puts four 1080i 4:2:2 signals over a gigabit connection - so modest 6:1 compression with a well-defined single field of latency. The camera channels can go either way (so three source cameras and a return preview monitor, for example) and embedded audio plus tally and camera head (colour) and lens control are included. It's quite expensive ($10k per end) but it's the only video-over-IP device I've seen so far that is suitable for live production.
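For anyone wondering where that 6:1 comes from, here's my own back-of-envelope (not a Sony figure): four nominally 1.485 Gbit/s HD-SDI channels into a single gigabit Ethernet link, ignoring IP overhead.

# Back-of-envelope for the NXL-IP55's ~6:1 figure (rough arithmetic only,
# not a Sony specification).
HD_SDI_GBPS = 1.485          # 1080i 4:2:2 10-bit serial rate, roughly
CHANNELS = 4
LINK_GBPS = 1.0              # gigabit Ethernet, ignoring IP/UDP overhead

ratio = (HD_SDI_GBPS * CHANNELS) / LINK_GBPS
print(f"approx. compression needed: {ratio:.1f}:1")   # ~5.9:1, i.e. about 6:1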


http://www.sony.co.uk/pro/press/pr-sony-nab-av-over-ip-interface