Pretty much any TFT display can show a 720p HD movie - 1280 x 720 resolution - and you don’t need to pay more than £150 for a 22in 1920 x 1080 display that can handle full HD. Assuming, of course, you're not simply hooking it up to your HD TV. Most mini PCs have an HDMI port for this, or a DVI connector which can feed a telly's own HDMI port via an adaptor. VGA is best avoided if you can.
Expansion will be limited, but you should be able to upgrade key components, like the hard drive
Sources for HD movies are Blu-ray Disc - which requires a suitable optical drive - and downloads, both of which also demand a dual-core CPU and a graphics chip with a hardware movie decoder. Watch out for machines with Intel integrated graphics cores, especially those in Atom-equipped PCs: they are generally not up to smooth full HD playback. Nvidia's Ion platform allows Atom-based mini PCs to handle HD with ease - and to make a decent stab at casual 3D gaming.
Proper gaming is a different matter as you need a combination of processor and graphics that is beyond the capabilities of many mini PCs. A digital graphics output is vital and we favour the neatness of HDMI over DVI.
Any of the current crop of integrated graphics chips from AMD/ATI, Intel and Nvidia will deliver superb standard-definition movie playback with the bare minimum of CPU load.
"Having a dual-core processor makes ..."
"Having a dual-core processor makes a big difference to the performance available to you"
You use those words as the caption to a Windows performance graph which shows four CPUs. It also shows current CPU utilisation at 0%, average utilisation well under 20% with one very brief spike to around 60% on two of the CPUs, while the fourth CPU might as well not exist. Do you think this is representative of typical SFF/entertainment PC usage?
I've largely lost interest in the GHz wars, but I'd have thought a single faster core would generally be a better buy than multiple slower cores at the same price and/or the same total power dissipation. Most applications are still basically single-threaded, so they gain little from the extra cores but may well benefit from the extra GHz. Your graph shows this: the fourth CPU barely wakes up at all.
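The single-threaded point can be made concrete with Amdahl's law. This is a rough sketch, not a measurement of any particular machine, and the 20% parallelisable fraction is an assumed figure for illustration:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only parallel_fraction
    of the workload can make use of extra cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# A mostly single-threaded app (assume 20% parallelisable) on four cores:
print(amdahl_speedup(0.2, 4))  # ~1.18x - three extra cores barely help
```

By contrast, a single core clocked 30% higher speeds up the serial 80% of that workload as well, which is the whole argument for GHz over core count in this class of machine.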
In a small form factor entertainment PC, where fans are an undesirable complication, this consideration applies even more than in standard desktop PCs.
Still, if you want to blithely repeat the industry's multi-core hype, feel free.
Objection your honour
"VGA is best avoided if you can".
Not quite so. A very large fraction of the TVs out there have a very limited tolerance for what they will accept on their HDMI port. So if you use the PC's HDMI output or a DVI-to-HDMI adaptor, you have a very high likelihood of running into overscan/underscan issues where the PC picture cannot be fitted onto the screen correctly. This is especially true of 1366 x 768 'HD ready' sets: I have yet to see one that accepts 720p over DVI-to-HDMI (or PC HDMI) and works properly.
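To see why pixel-perfect output is so elusive on those panels: 1280 x 720 doesn't divide evenly into 1366 x 768 in the first place, and any overscan crop makes the scaling factor messier still. A back-of-envelope sketch - the 2.5%-per-edge overscan figure is an assumption for illustration, not any set's spec:

```python
def panel_scale(src_w, src_h, panel_w, panel_h, overscan_pct=0.0):
    """Effective scale factors after the TV crops overscan_pct from each
    edge of the source and stretches what's left across the panel."""
    visible_w = src_w * (1 - 2 * overscan_pct / 100)
    visible_h = src_h * (1 - 2 * overscan_pct / 100)
    return panel_w / visible_w, panel_h / visible_h

# 720p PC output on an 'HD ready' 1366 x 768 panel, assumed 2.5% overscan
sx, sy = panel_scale(1280, 720, 1366, 768, overscan_pct=2.5)
print(sx, sy)  # non-integer factors: every PC pixel gets resampled
```

Non-integer scale factors mean the TV's scaler resamples everything, which is why fine desktop text looks soft or fringed on these sets.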
As for using VGA for full HD, the problem is usually not VGA itself but the cables - and, in the case of Radeon boards, the on-board connectors and internal 'to bracket' cables. Very few cables on the market cope with the frequencies full HD requires, which results in a significant amount of ghosting. Get a cable that is good in that range and VGA is fine.
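For a sense of the analogue bandwidth involved: the standard CEA-861 timing for 1920 x 1080 at 60Hz uses a 2200 x 1125 total raster (active pixels plus blanking), which a VGA cable must carry as a clean analogue signal. A quick sketch of the arithmetic:

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total raster size (including blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# CEA-861 1080p60 timing: 2200 x 1125 total raster at 60Hz
print(pixel_clock_mhz(2200, 1125, 60))  # 148.5 (MHz)
```

A 148.5MHz pixel clock is well beyond what a cheap, poorly shielded VGA lead was ever built for, hence the ghosting.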
Overall, it remains a trial-and-error game, and VGA is the 'old faithful' fallback which - subject to good cabling - can save the day when HDMI/DVI does not work properly.