A new version of OpenELEC is out - it's reckoned to be the best build of XBMC for the Pi (very stable and lightweight). The two reasons I'd like to use the Pi (well, aside from it being a cheap AirPlay replacement) are that it can play back my MPEG2 off-air recordings and serve as a very competent BBC iPlayer.
Licensing for MPEG2 playback - it's a very modest couple of quid to enable hardware MPEG2 decoding (MPEG4 and H.264 are there already);
the root p/w for it is openelec
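For reference, enabling the codec is just a one-line addition to config.txt - a rough sketch assuming the standard OpenELEC layout (the 0x00000000 key below is a placeholder; use the decode_MPG2 licence issued against your board's serial number):

```shell
# SSH in as root (password "openelec"); on OpenELEC the boot partition
# holding config.txt is mounted read-only at /flash, so remount it first.
mount -o remount,rw /flash
# Placeholder key - substitute the licence bought for your Pi's serial number
echo "decode_MPG2=0x00000000" >> /flash/config.txt
mount -o remount,ro /flash
reboot
```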
With that done it's trivial to grab the iPlayer app and install it - so far it's played very nicely!
This video was shown in one of the sessions; the speaker was Professor Philip Willis of the University of Bath's Computer Science department.
It shows various pieces of video that have been converted to a contour/vector representation: instead of using pixels in a raster to represent video, they use contours (which also have shading associated with them) and vectors (which dictate how the contours are moving). This is not an effort to compress the data load - Prof. Willis was at pains to point out that they have made no attempt to optimise or do any bit-rate reduction calculations on the data - rather it is a way of representing high-resolution video in a pixel-free manner. This might provide a useful transport/mezzanine format for moving 4k and 8k television around, rendering the pictures at the resolution of the target display device.
The upshot of this is that rendering at a higher resolution than the material was shot at shows none of the aliasing you'd expect from pixel-based video. Although you can't get more detail than was there originally, the codec fails gracefully, so the images are not unpleasant to look at (unlike the low-res YouTube clip above!).
Prof. Willis gave a tantalizing little extra in the Q&A session - he implied that they are looking to give the contours/vectors a time-based element so that they move not only in X-Y space but along the t-axis, such that the pixel-free video now becomes frame-free! You could render the resulting data as easily at 1920x1080 @60 frames/sec as at 720x576 @50 fields/sec without any aliasing in the spatial dimensions OR temporally; say goodbye to standards conversion!
The original paper is a bit heavyweight, but if you're happy with vector maths it is understandable.
Today you can buy equipment that works at the TV "4k" resolution, which is also referred to as "quad-HD" because it has twice the number of pixels horizontally and twice the number of active lines of 1080-line HD: 3,840 x 2,160. Blackmagic have already implemented what they call 6G-SDI - i.e. 4 x 1.5Gbit/sec, carrying 3840x2160 @30FPS (max) with 4:2:2 colour sampling. If you want 50 or 60P at 4:2:2 you'd need 12G, and should you want to go to 4:4:4 RGB at 12 bit then you're looking at >20G!
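As a sanity check on those link rates, here's a quick back-of-envelope calculation (active picture only - real SDI adds blanking and ancillary data, which is why the nominal link rates run higher than these figures):

```python
def raw_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Uncompressed active-picture payload in Gbit/sec."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# 4:2:2 averages two samples per pixel (Y on every pixel, Cb/Cr on alternate pixels)
print(raw_gbps(3840, 2160, 30, 10, 2))   # ~4.98  -> fits on 6G-SDI
print(raw_gbps(3840, 2160, 60, 10, 2))   # ~9.95  -> needs 12G
print(raw_gbps(3840, 2160, 60, 12, 3))   # ~17.9  -> 4:4:4 RGB 12-bit, >20G once overhead is added
```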
Whilst a coax interface still (just!) works at 6G (and I'd point you towards some research I did in 2009) it seems like single-mode fibre is the only sensible interface that we'll have for synchronous video as 4K starts to be used for live production.
Richard Salmon from the BBC showed that, with the huge amount of resolution 4k brings, the human brain recoils if there isn't enough temporal resolution to make moving images look as good as static images. Imagine a rapid pan across the crowd at a football stadium: at sub-100 frames per second you don't see enough detail in the picture (each pixel is smeared so as to make it look like a much lower-resolution image), but when the camera stops the pan you suddenly notice the huge amount of detail. That difference between static and dynamic resolution can, in extreme cases, cause nausea. With this in mind it seems that the standard for live TV will be 4:2:2 colour encoding at 120 FPS - anyone for 24Gbit/sec video?! HDMI v1.4 currently only supports sub-8 Gigabits/sec. So it seems we're going to have to wait for cable standards to catch up, and when they do it'll probably be 9/125µ fibre.
Take this to 8k (the second half of the proposed UHD TV standard) and we're looking at 96Gbit/sec! Even current standard fibre struggles with that. So the other interesting technology that may well form the mezzanine format for moving video over cables and networks is pixel-free video - but that's for another blog post!
I had a day at BAFTA listening to various speakers from the industry talking about 4K (quad-HD, UltraHD, etc etc) and the surrounding standards.
ITU Rec.2020 is the document that covers 4k TV (in fact it defines two resolutions - 3840 x 2160 and 7680 x 4320 - which I'll refer to as 4k and 8k television; these aren't the same as the 4096- and 8192-pixel-wide resolutions used in digital film).
The colour space is monstrous! The 2020 triangle is even bigger than the P3 colour space (as defined by the DCI) - take that, Mr. Dolby! It'll be a while before ANY display device can faithfully reproduce that gamut. Thankfully we stay with D65 for white (well, 6504K to be strictly correct - Planck's constant was re-calculated in the 70s) and the primaries are;
red: 0.708, 0.292
green: 0.170, 0.797
blue: 0.131, 0.046
white: 0.3127, 0.3290
The new luma transfer function is Y' = 0.2627 R' + 0.6780 G' + 0.0593 B', and for the first time ever in television provision has been made for constant luminance. There is an almost philosophical argument by Charles Poynton and others that constant luminance is the way to go: essentially, the gamma response should be applied only to the derived luminance rather than to the three colour components. I suppose your feeling on that comes down to whether you think gamma is correcting for the camera response (that's what I was always taught at the Beeb in analogue SD days) OR whether gamma is a tool to give better dynamic range in the dark areas of the picture. I expect that constant luminance (proper Y, as opposed to Y' / "luma") makes most sense for 12-bit video (where you have so much more dynamic range anyway), while pre-corrected R'G'B' remains the practical choice for 10- and 8-bit 4k.
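One nice property of those coefficients is that they sum to exactly one, so peak white maps to peak luma - a trivial sketch of the non-constant-luminance form:

```python
# Rec.2020 luma coefficients (non-constant-luminance form)
KR, KG, KB = 0.2627, 0.6780, 0.0593

def luma(r, g, b):
    """Y' from gamma-corrected R', G', B' components in the range [0, 1]."""
    return KR * r + KG * g + KB * b

print(KR + KG + KB)         # 1.0 (to within float rounding) - coefficients sum to unity
print(luma(1.0, 1.0, 1.0))  # peak white gives Y' of 1.0
```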
Frame rates are defined up to 120FPS with no interlaced framerates - unfortunately the non-integer rates (23.98, 29.97, 59.94) are still hanging around like a bad smell! The Beeb's Richard Salmon showed a very convincing argument for >100FPS for sports footage: essentially, as you have more resolution, the difference between detail in static pictures and moving scenes becomes objectionable. The problem is that HDMI 1.4 currently only supports a maximum of 30 FPS at 4k, and so we're waiting for HDMI 2.0.
Amulet are a great company - I believe their KVM-over-IP extender to be MUCH better than Avocent, AdderLink and ThinkLogical's offerings, but, like everyone else, once you packetise DVI and USB and send it over a network you are going to have to deal with the funny effects of devices and drivers that play fast and loose with the specifications. This is from their engineering team;
The problem with the Wacom tablet and Red Hat Linux has been traced to a bug in the Red Hat Linux Wacom driver. The driver has a very obvious bug whereby inward data is declared as outward. The Wacom tablet simply ignores this when it is connected directly - although this works, it is actually because the tablet's USB stack is sloppy. The rack card, however, cannot ignore anything (otherwise we wouldn't be able to operate at all), so it believes what the Red Hat driver is saying and waits for an event which then never happens.
As detailed below in our analysis of the Red Hat and mainline Linux drivers, this issue has been resolved in the latest Linux build, with the fix implementing exactly what we would expect to see. This has not yet found its way into Red Hat, although it may do so later this year, again as per the details below.
To enable use of the Wacom tablet on the version of Red Hat you are currently using (5.3), we have added a workaround to the rack card firmware. It appears to work but will require testing in the lab over the next couple of weeks before we can release it. The workaround has to circumvent many checks in the code that are designed to catch driver bugs just like this. As a result the workaround is very specific to Wacom devices and this exact bug, so we can't guarantee it will work if anything changes.
Red Hat 6.3 (Amulet Hotkey test version)
Red Hat version - Red Hat 6.3 (Santiago)
Kernel version - 2.6.32-279.el6.x86_64
RPM package - 2.6.32-279.el6.x86_64
Wacom driver source - wacom_sys.c
Driver bug - Incorrect bitmask causing incorrect data direction expectation in USB driver.
Code copied below shows the addition of the correct bit mask in the latest Linux driver
Code in current driver - USB_REQ_GET_REPORT, USB_TYPE_CLASS | USB_RECIP_INTERFACE
Code in latest Linux v3.3 driver - USB_REQ_GET_REPORT, USB_DIR_IN | USB_TYPE_CLASS | USB_RECIP_INTERFACE
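To see what that one flag changes, here's a small sketch using the bit values from the USB spec (as defined in the kernel's ch9.h) - GET_REPORT is a device-to-host transfer, so bmRequestType needs the direction bit set:

```python
# USB bmRequestType bit-fields (values from the USB 2.0 spec / Linux ch9.h)
USB_DIR_IN          = 0x80       # direction bit: device-to-host
USB_TYPE_CLASS      = 0x01 << 5  # class-specific request
USB_RECIP_INTERFACE = 0x01       # recipient: interface

buggy = USB_TYPE_CLASS | USB_RECIP_INTERFACE               # 0x21 - reads as host-to-device
fixed = USB_DIR_IN | USB_TYPE_CLASS | USB_RECIP_INTERFACE  # 0xa1 - correct for GET_REPORT
print(hex(buggy), hex(fixed))  # 0x21 0xa1
```

A forgiving USB stack (like the tablet's own) tolerates the wrong direction bit; a strict endpoint honours it and ends up waiting for a transfer that never comes, which matches the symptom Amulet describe.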
Red Hat 5.3 (Client version)
Although version numbers differ from the Amulet Hotkey test version, the Wacom driver is the same and has the same bug present
Latest Red Hat release (6.4)
RPM package - kernel-2.6.32-358.el6
Wacom driver source - wacom_sys.c
Driver bug - Still present as per RPM package 2.6.32-279.el6.x86_64
Main Linux code base
Version - v3.2.45
Driver bug - Present
Version - v3.3
Driver bug - Resolved
Future Red Hat
Wikipedia gives the time frame for the next major RHEL release (7.x) as Q3/Q4 2013
We can't be sure whether this will incorporate the fix to the Wacom driver; however, Wikipedia states "Red Hat Enterprise Linux 7 (?), 2013-6+ Will be based on Fedora 18,[17] which as of March 2013 uses Linux kernel 3.8.[18]". This would indicate that the upcoming release of Red Hat 7 will fix the issue, as it will use a newer kernel, v3.8
Pretty much every portable Apple computer I've owned has had a hard drive fail, and the optical drives all go too (normally a week outside of a year old!). Although they aren't hard to fix, it does pay to have a walkthrough, and this is where iFixit come in - they have guides for most machines and they carry the parts.
Before you start pulling current-gen Mac laptops apart make sure you have the necessary screwdrivers. Mine required the following (all in watchmaker sizes): Pozidriv, Pentalobe, Torx and Tri-lobe.