HDMI has been around since late 2002, with updates to the specification following at least annually, and sometimes more rapidly, ever since. The most recent was HDMI 1.3, released almost a year ago, and even that has received two minor updates since then.
HDMI: the HDTV favourite
HDMI's raison d'être is to channel uncompressed digital video and audio from source to display, along with control data to allow, say, your TV to turn on as soon as you press play on your Blu-ray Disc player. Information flows the other way too, letting the TV tell the player what resolutions it's capable of showing and saving the user from having to set it up manually.
And not just pictures: HDMI does sound too. The basic HDMI spec has bandwidth for eight-channel, 192kHz uncompressed audio, along with a variety of compressed audio formats like DTS and Dolby Digital. On the video side, it can handle standard- and high-definition pictures at all the standard resolutions.
The various specification updates have largely added the ability to channel other CE formats - DVD Audio in HDMI 1.1, for example, and Super Audio CD in HDMI 1.2. But version 1.2 began the process of beefing up HDMI's suitability for computers.
Most HDMI-equipped devices on the market currently use HDMI 1.1 or 1.2. However, HDMI 1.3 was a big step forward, boosting the connection's bandwidth to enable it to host much larger screen resolutions than HD TV's maximum 1080p - a crucial move if the connector format's to be used to link even not-so-high-end graphics cards to big monitors. It was a bold statement of intent: HDMI's supporters want the standard to be embraced by the computing world.
DVI's Dual Link mode provides bandwidth-boosting technology to drive screen resolutions of 2560×1600 and beyond. Until HDMI 1.3 increased its signalling clock speed from a peak of 165MHz to 340MHz, the CE format couldn't deliver the picture data fast enough to show images of that size. Now it can. That said, the higher-bandwidth mode, dubbed 'Type B' to the original HDMI's 'Type A', has a different connector. An HDMI 1.3 device can support Type A and Type B connectors; an HDMI 1.2 device can only support Type A. But not all HDMI 1.3 equipment has Type B ports.
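To put those clock speeds in context, here's a rough back-of-the-envelope check (our figures, not part of the spec text above) of whether 2560×1600 fits in a single TMDS link. The blanking numbers are the CVT reduced-blanking timings commonly quoted for this mode; treat them as assumptions:

```python
# Does 2560x1600 @ 60Hz fit in a single TMDS link?
# Total raster sizes below are the commonly quoted CVT reduced-blanking
# figures for this mode - an assumption, not taken from the article.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """TMDS pixel clock in MHz for a given total raster and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 2560x1600 @ 60Hz, CVT-RB total raster roughly 2720 x 1646
clock = pixel_clock_mhz(2720, 1646, 60)

print(f"required pixel clock: {clock:.0f} MHz")        # ~269 MHz
print("fits 165MHz single link (HDMI 1.2)?", clock <= 165)
print("fits 340MHz link (HDMI 1.3)?", clock <= 340)
```

The result, around 269MHz, lands neatly between the old 165MHz ceiling and HDMI 1.3's 340MHz - which is exactly why the clock boost matters for big monitors.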
HDMI also provides for more complex methods of modelling colours. Previous versions allowed colours to be represented as 8-bit values in either the RGB (Red, Green, Blue) or YCbCr (luminance, chrominance blue, chrominance red) component colour modes. HDMI 1.3 can also handle 10-, 12- and 16-bit colour, radically increasing the number of colours that an image being sent to the display can contain. All this paves the way for better-looking TV pictures, but it's really about delivering higher quality images for folk who really need them: computer-using graphics and video professionals.
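"Radically increasing" is no exaggeration - the palette grows as a cube of the per-channel depth. A quick illustrative calculation (our arithmetic, not the spec's):

```python
# How many distinct colours each per-channel bit depth allows in RGB mode.
def colours(bits_per_channel):
    """Distinct RGB triples at a given per-channel bit depth."""
    return (2 ** bits_per_channel) ** 3

for bits in (8, 10, 12, 16):
    print(f"{bits}-bit per channel: {colours(bits):.3e} colours")
```

That's the familiar 16.7 million colours at 8 bits, rising to over a billion at 10 bits and some 2.8×10¹⁴ at 16 bits per channel.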
I'd love to agree with you, but...
De Zeurkous - I approve in principle of a SCSI/1394-based protocol, but the re-tooling to produce this from the existing displays is significant. Likewise, I approve of a subset of displays running with display commands (in the manner of X terminals), but it gets difficult to do this when the whole display is being updated - short of lossy video compression, the amount of work required to produce an arbitrary image on the screen (e.g. during the playing of a game) easily exceeds the memory requirements of sending the image in its raw format. For a point-to-point protocol, there's little benefit in reducing the bandwidth some of the time at the cost of complicating the protocol; nothing will be using the spare bandwidth. Obviously the situation is different if the display is streamed over a shared network.
Something supporting higher total resolutions/colour depth and multiple displays, better connection distances, better connectors and a more flexible protocol *would* be a good thing, but the trick is to achieve some of these without making any of the others worse than we've got already - and *enough* better (or future-proof) that the consumer both has a benefit to upgrading and evidence that they won't need to do so again immediately after. If this isn't the case, the benefits of a new connector don't outweigh the costs of switching. I don't say that we should stay with HDMI (or DVI) forever, just that we should wait until the replacement is worthwhile - and I don't think that's true of DisplayPort. I'll reserve judgement on a 1394-video connector (not the existing compressed scheme) until someone comes up with a detailed proposal, but - much though I like the idea of a more elegant standard - I won't advocate it unless there's actually an end-user benefit. Matching the capabilities of DVI with a "cleaner" protocol isn't enough.
A J - I sympathise (especially about the patent rant), but I really think a digital connection is a good idea. There are real problems with running a CRT through a switch or longish cable at much over UXGA resolution, and VGA inputs on LCDs have to do a lot of work to convert the signal back to digital (before, admittedly, making it analogue again). SCART isn't universal (at least in the US), and I have to admit that - nice though it is on a TV - it's a bit bulky for the back of a PC (I've no idea what its signal quality tolerances are). I think it's too early to throw away analogue completely, and certainly too early to throw away a signal that can be converted to analogue easily (as DVI-D and HDMI can), but keeping digital from the frame buffer to the display does seem more practical. There are reasons for going with YCrCb (better use of the bandwidth), even if it requires some conversion at both ends, but for computer displays I suspect we're mostly talking RGB for *any* standard these days.
A fibre-optic standard would be nicer, and I'd hoped that DisplayPort might go down this route (or UDI, for that matter), but having copper *and* fibre is the worst of both worlds. We're back to fibre being useful only for the minority with the need for longer cable runs, just like dual-link DVI is useful only for the minority with high resolution/colour displays, and not making it the default will result in incompatibility, confusion, and unnecessarily high prices. Again.
It appears that the display industry is too busy trying to work its way through the standards messes that it makes for itself to learn not to do it again. I'd like to think that the consumers could put their foot down at some point, but I suspect they'll be too confused by now to have a chance - all that's happening is that everyone's holding off buying *anything*.
Why they really don't like fibre
The reason the industry doesn't like fibre optics is that there's no way to monetise it.
Once upon a time, people invented things. But those who didn't have the patience to keep trying things in the hope of finding something that worked got jealous, and took over the system. Modern business practice is to find something that people already do for free or low cost, and then work out how to charge them money -- or more money -- for it. Usually by waving a bogus patent claim in people's faces. This works in the USA, thanks in no small part to the American system of allowing lawyers to demand payment whilst a case is still ongoing. (Another method which sometimes works well is by insisting that a popular commodity be paid for in US dollars, in order to skim a small amount off every transaction, and invading any country that threatens to start selling it by the Euro instead.)
Now, those pesky laws of nature say you can couple a signal into a piece of fibre-optic without any expensive proprietary connectors: all you have to do is cut the end cleanly with a single chop from a very sharp kitchen knife, and hold it in place with Rizla papers and Blu-tack. This actually works surprisingly well over short distances and for equipment which is generally considered furniture and so not moved about much. Send SCSI commands serially over fibre-optics, and you've suddenly got an open standard that nobody can make money from.
And therein lies the problem; because, without government intervention, such an open standard is never going to be popular with the big established players. They want their own proprietary standards (so you can't just use a brand X recorder with a brand Y TV), or at least a common proprietary standard that saves them from having to compete on merit by allowing them to close ranks and keep young upstarts out of the game.
All of which is ignoring the fact that we have *already* had for years a royalty-free standard connector that supports RGB+Csync or Composite Video (with graceful degradation if only one device is RGB-capable), stereo audio (plenty good enough if you're using TV speakers; if you want multi-channel, you really should be using a dedicated amplifier with its own fibre-optic input) and data communication. RGB is what CRTs and LCDs use natively, and so provides better picture quality than either SVHS or YPrPb. It ought to be possible just to extend the SCART standard to deal with higher sync rates, using the data channel to indicate what the display supports and falling back to 15kHz in the worst case.
SCART (with higher scan rates) and VGA should also be reasonably compatible: the only problems are in the different signal levels, impedances and sync formats (Csync vs. separate Hsync and Vsync), but expect a single-chip solution to emerge as soon as there is a need for it. Yes, SCART is analogue; but since the signal from the SCART socket goes, as near as d**n it is to swearing, straight to the CRT and speakers, that oughtn't to be a problem.
If we really do need digital signals (for, say, recording from a receiver without an integrated HDD -- as if anyone will make them that way in future -- or transferring from a fixed device to a mobile one in a way consistent with the fair dealing provisions of copyright law), fibre-optic is the logical way to go.
RE: Why we should care
While I pretty much agree with your rant, I'd like to point a few things out:
1) As I indicated, IEEE1394 is still in active development. A boost from the video industry joining the fray would probably speed up that development substantially, easily leading to the transfer rates needed.
2) Framebuffers are standard, inexpensive equipment; then again, why shouldn't we interpret the image chunks as draw commands on low-end displays?
3) If we implement the SDC-over-1394 solution, we won't need format adaptors for much longer; simple SCSI PHY converters would be enough for backward compatibility.
As for the final paragraph of your rant, I disagree. There should be one more standard that is not only _A_ standard, but _THE_ standard, as well. That's what really matters in both the medium and the long term.
`` that said, I'm a cynic (can you tell?)''
Duh -- I recognize a fellow in that art when presented with one :)
Happy birthday ?
In my domain VGA is still king, and, if my memory serves me well, it was launched 20 years ago next week. 20 years is not bad for a de facto standard. And it was invented in the UK.
Why we should care
I'm not suggesting that we can't gain from a digital standard - there's a clear advantage to using a digital connection over an analogue one, I'm just pointing out that single-link DVI (or even dual link) wasn't universally better than the VGA technology available at the time, and that this harmed its market penetration. I'm not against progress, just against the introduction of standards that take backwards steps because the high end of the old standard was considered irrelevant - future-proofing has a way of redefining the "high end". As an example, UDI appears to max out at 36 bits per pixel (although I'm not sure I'm reading an authoritative source); dual-link DVI can handle 48bpp, 8 bits per channel per link. The Canon 1DMk3 has a 14-bit image sensor; if this technology gets into video cameras, UDI requires downsampling where DVI wouldn't.
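The bits-per-pixel arithmetic in that example is easy to check. A minimal sketch (the figures are the ones quoted above - UDI at 36bpp, dual-link DVI at 48bpp; the helper function is our own illustration):

```python
# Does an RGB pixel at a given sensor bit depth fit a link's bits-per-pixel
# budget?  Link bpp figures are those quoted in the comment above:
# UDI ~36bpp, dual-link DVI 48bpp (8 bits per channel per link, 2 links).
def fits(sensor_bits_per_channel, link_bpp, channels=3):
    """True if an RGB pixel at this sensor depth fits within link_bpp."""
    return sensor_bits_per_channel * channels <= link_bpp

# A 14-bit sensor (as in the Canon 1DMk3) needs 3 x 14 = 42bpp for RGB:
print("14-bit RGB over UDI (36bpp):", fits(14, 36))            # False -> must downsample
print("14-bit RGB over dual-link DVI (48bpp):", fits(14, 48))  # True
```

A 14-bit sensor needs 42bpp for RGB, which overflows a 36bpp link but sits comfortably inside 48bpp - hence the downsampling concern.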
The DVI digital signal is a direct equivalent of the analogue one (regardless of whether an analogue monitor would cope with the signal, in the case of reduced blanking). This both makes it easy to combine analogue and digital output (the graphics card can throw the same pixel data at a common output component and have the data transmitted in both forms, rather than needing to scan the frame buffer twice) and makes it relatively easy to convert between video connectors. While I'm a fan of the SCSI protocol (and various networking protocols that have proper error correction), even once sufficient bandwidth has been routed to the display (3.2Gbit/s is quite a way under single-link DVI) there's still a need for a frame buffer to reconstruct the image; not a big problem in a monitor (remember to triple-buffer everything) but expensive in a display format adaptor. As I've said of DisplayPort, would it be a nicer spec? Yes, from my point of view. Would it gain us enough to be worth replacing the existing video stream approach? Personally, I doubt it.
It's true that neither UDI nor DisplayPort has any current market penetration; the concern is that, once they do, one of six things happens:
1) The new monitor you want is DisplayPort only, and you have to upgrade your graphics card/the new graphics card you want is DisplayPort only, and you have to upgrade your monitor, otherwise they won't talk to each other.
2) Devices gain yet another port on the back, which adds to the cost and space but essentially gains us nothing.
3) People use an unnecessarily expensive DVI/HDMI/DisplayPort adaptor.
4) The ports are multi-mode, which limits them to single-link support (AFAICT).
5) Graphics cards start being able to run all the standards down a set of pins, and we end up with an octopus dangling off the back of the computer (see some VIVO solutions).
6) Other connectors start to go missing. As someone with lots of CRTs, this lacks appeal.
It does not appear to be the case that DisplayPort gains us anything (except being marginally easier to plug in without seeing the socket, allegedly); in return, it possibly forces an upgrade cycle and definitely causes unnecessary incompatibility and confusion. And this is if all devices talk to each other perfectly (because that worked so well with HDMI and DVI). It seems unlikely that a new connector will make anything cheap, because there'll be years of backwards compatibility requiring *both* connectors.
Other than a few companies, I don't see who DisplayPort helps. Having re-read a presentation on the subject, the bandwidth appears to be slightly greater than HDMI 1.3 single-link, slightly *more* greater than dual-link DVI (assuming dual-link DVI is 330MPix/s, which is not actually a limit but the expected minimum because, for dual-link, one link should be capable of at least 165MPix/s), but substantially less than HDMI 1.3 down a type B connector. It is not, for example, enough to drive a WQUXGA T221 at full refresh on its own (whereas a dual-link + single-link DVI connector *is*), and the "next smallest" common display size (the WQXGA 30" panels and QSXGA medical panels) are well-catered for by existing DVI. Even if increasing bandwidth is not a significant aim for DisplayPort, it seems that a greater step should be taken in this direction.
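Those bandwidth comparisons can be sanity-checked with the figures I believe to be the published ones (treat them as assumptions): raw link rates, with the 8b/10b line-coding overhead that TMDS and DisplayPort both carry removed:

```python
# Effective payload bandwidth of the three links discussed above.
# Raw-rate assumptions: dual-link DVI = 2 links x 165MHz x 3 channels x
# 10 bits/clock; HDMI 1.3 Type A = 340MHz clock; DisplayPort = 4 lanes
# at 2.7 Gbit/s each.  All three use 8b/10b coding (80% payload).
def effective_gbps(raw_gbps):
    """Payload rate after 8b/10b line coding (80% of raw)."""
    return raw_gbps * 0.8

links = {
    "dual-link DVI":         2 * 165e6 * 3 * 10 / 1e9,
    "HDMI 1.3 (Type A)":         340e6 * 3 * 10 / 1e9,
    "DisplayPort (4 lanes)":     4 * 2.7,
}
for name, raw in links.items():
    print(f"{name}: {effective_gbps(raw):.2f} Gbit/s payload")
```

The ordering matches the comment: DisplayPort (8.64 Gbit/s) sits slightly above HDMI 1.3 single-link (8.16) and rather further above dual-link DVI (7.92) - modest gains, which is rather the point being made.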
All this assumes that four DisplayPort lanes are available. Although cables have to support this (kudos for avoiding the "it's a dual-link DVI cable" "so why are there pins missing?" debacle) there's no requirement that devices themselves do. I'll be interested to see how long after devices get the "DisplayPort" tick box it takes for the number of lanes to be labelled, because I bet - just as dual-link DVI took a while to appear - the first batch of products will be castrated to the common range of monitors. One lane can do 1080i; two lanes can do 1080p, and I'll be pleasantly surprised if four lanes are considered necessary just for the happy minority with decent resolution displays; that said, I'm a cynic (can you tell?)
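The lane arithmetic behind that prediction is straightforward. A rough estimate (my own, counting active pixels only and ignoring blanking; 2.16 Gbit/s payload per lane assumes 2.7 Gbit/s raw less 8b/10b overhead):

```python
import math

# Minimum DisplayPort lanes for an uncompressed 24bpp stream,
# counting active pixels only (blanking ignored - an approximation).
def lanes_needed(width, height, fps, bpp=24, lane_payload_gbps=2.16):
    """Whole lanes needed to carry width x height at fps frames/second."""
    stream_gbps = width * height * fps * bpp / 1e9
    return math.ceil(stream_gbps / lane_payload_gbps)

# 1080i delivers 30 full frames' worth of pixels per second; 1080p, 60.
print("1080i:", lanes_needed(1920, 1080, 30), "lane(s)")  # 1
print("1080p:", lanes_needed(1920, 1080, 60), "lane(s)")  # 2
```

One lane indeed covers 1080i and two cover 1080p, which is why a cynic expects early devices to stop there rather than wire up all four lanes.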
On the plus side, (new) larger monitors are, I believe, obliged to have HDCP support in the US now. I suspect that higher resolution displays will be limited to HDCP over DVI for a while (when nVidia don't break dual-link support), but the HDMI type B connector might turn up eventually, you never know. (For so long as most protected content fits in a single link, if you don't mind your 30" screen running at 1280x800 and you didn't want to view in a window then single link support might not get replaced.)
I can't see how adding another standard to the mess improves matters. If the industry would concentrate on making an existing standard dominant, and maybe improving it in a backwards-compatible way, I'd have more sympathy.
Turns out I had more ranting left in me. :-)