Image Dissectors



Perceived Quality

Simon Pitt | Television | Sunday 26th September 2010

The other day I came across this light-hearted review in The Metro of the first episode of Mel and Sue's Great British Bake Off.
This is lick-the-screen TV for committed cake-heads. You can keep your fancypants MasterChef tarting about with shaved scallops and braised kneecap of gerbil, there's nothing like a faceful of cake to get the juices flowing.

Forget the worthy but dull academic interludes explaining why cake hits the human sweetspot, just indulge in HD close-ups of ganache layers oozing off the screen.
Pedant that I am, my first thought on reading this was: BBC Two isn't simulcast in HD, so you can't "indulge in HD close-ups". Before I go on, I'm going to cut Keith Watson a break, because he was obviously using the term HD casually to mean "high quality footage", rather than actually referring to the video standard of 1,080 vertical pixels by 1,920 horizontal pixels, scanned progressively.

However, his comments remind me of something I read in The Telegraph a while back:
But it revealed that although 30 per cent of those questioned thought they could watch high definition or Blu-ray discs at home, just 58 per cent of that group had connected the necessary player or box.
The figures here are rather strange: I don't fully understand why 70% of people with HD televisions think they can't watch HD television at home. The main point, though, is that of the group who think they can watch HD, nearly half haven't connected the necessary player or box, yet presumably believe they are watching HD television.
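
To make the arithmetic explicit, here is a rough sketch in Python of what those two percentages imply, taking the Telegraph's figures at face value:

    # Rough arithmetic on the Telegraph's figures, taken at face value.
    thought_they_could_watch_hd = 0.30   # share of those questioned
    had_connected_properly = 0.58        # share of that group

    not_connected = 1 - had_connected_properly
    print(f"{not_connected:.0%} of the 'HD-ready' group aren't actually connected")
    print(f"That's {thought_they_could_watch_hd * not_connected:.1%} of everyone questioned")
    # Prints: 42% of the 'HD-ready' group aren't actually connected
    #         That's 12.6% of everyone questioned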

A brief search found this confused individual:
Can i take HD pictures with a camera that has a max res. of 3648 x 2736? [...] i wasnt going for size, but really high quality like high def. i like taking high quality pictures that are full of detail, like a high def movie.
Now, comparing stills to video is a bit tricky. One of the problems is that still cameras define their quality in megapixels (i.e. the horizontal pixel count multiplied by the vertical pixel count, e.g. 4 MP, 10 MP), while video quotes its quality as a vertical pixel count plus a scan type (e.g. 1080p, 720p).

However, it's not difficult to convert from one to the other. In megapixels, an HD frame is 1,920 x 1,080 = 2,073,600 pixels, or about 2.1 megapixels. The camera is 3,648 x 2,736 = 9,980,928 pixels, or about 10 megapixels.
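
As a quick sketch of the conversion (in Python, purely for illustration, using the standard 1,920 x 1,080 HD frame):

    def megapixels(width, height):
        """Convert a pixel resolution into megapixels (millions of pixels)."""
        return width * height / 1_000_000

    print(f"{megapixels(1920, 1080):.1f} MP")  # a full HD video frame: ~2.1 MP
    print(f"{megapixels(3648, 2736):.1f} MP")  # the camera in question: ~10.0 MP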

To put this into perspective, my last phone, which I bought in 2006, takes still pictures at 2.0 megapixels.

If you did take pictures "like a high def movie", you would almost certainly be disappointed with the results: at around 2.1 megapixels, they would have barely a fifth of the detail that camera already captures.

The point here isn't that consumers don't understand technical jargon; that's fair enough. I mean, these things are complicated. The point is that consumers seem unable to tell whether what they are watching is high quality or not.

It is this large-scale failure to understand technology that inspired this XKCD comic:

[XKCD comic: mocking the non-geeks]

Largely, consumers have formed two equally ignorant views on HD: either that it is barely different from standard definition, or that it is the pinnacle of picture quality. Both of these are demonstrably wrong. The truth is that HD is a lot better than standard definition, but compared with the quality you can get from a still image, it isn't technically that impressive; you can now buy digital still cameras of over 15 megapixels.

Six years or so ago, when I was at school, I had a weekend job in Currys. On the camera training course, we were told to tell customers that the quality of a 3 megapixel camera was indistinguishable from that of an analogue camera. Go into a branch of Currys now, and I'm fairly sure they'll tell you that you need to look at 12 megapixels or above, even for casual holiday snaps.

Show the average consumer two 6 x 4 prints of the same photograph, one taken at 3 MP and one at 14 MP, and they probably won't be able to tell the difference. Show them the two cameras, though, and explain the difference in specification, and they will almost certainly leave convinced that 14 MP pictures are much, much better.
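
A rough back-of-the-envelope calculation shows why, assuming a typical photographic print resolution of around 300 dots per inch:

    # How many pixels can a 6 x 4 inch print actually show?
    # Assumes a typical print resolution of roughly 300 dpi.
    print_width_in, print_height_in, dpi = 6, 4, 300

    pixels_needed = (print_width_in * dpi) * (print_height_in * dpi)
    print(f"{pixels_needed / 1_000_000:.1f} MP")  # ~2.2 MP: both a 3 MP and a 14 MP
                                                  # image comfortably exceed this

At that print size, anything beyond about 2.2 megapixels is detail the paper simply cannot reproduce, which is why the two prints look the same.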

What this overlooks completely is that the megapixel count only tells you how finely the picture is recorded, not how good the image passing through the lens is. If you're using a cheap lens (as you're likely to be in a budget camera), it doesn't matter how many megapixels you record the picture at; it's not going to look any better.

But consumers continue to buy HD televisions, plug them into their VHS players and bask in the 'glorious HD'. They buy 22-inch widescreen monitors and set the resolution to 800 x 600.

The problem is, consumers are being peddled a lie. They're told they need a camera of a certain quality, a hard drive of a certain size, a processor of a certain speed, but how are they going to know what that means? They have nothing to calibrate that figure against. All they can do is compare it to the megapixel counts on other cameras, compare the prices, and make a decision on which one to buy based on how much they want to spend.

Unless you are someone who is actively interested in a particular piece of technology, as a consumer these days, you are purchasing blind, and then you sit happily basking in the perceived quality of your latest purchase.

SP


