Are dedicated TV sets becoming obsolete?

PostPosted: December 21st, 2014, 6:28 am
by LongRunner
Well, I don't think anyone actually makes a good TV set anymore, to be honest. The colours on most (all?) modern TV sets are deliberately over-saturated to varying degrees (to attract the masses); on many, the brightness is essentially fixed (moving the control labelled "brightness" either artificially darkens the picture or blends it into white); and they don't seem to offer any functionality you can't get with a TV tuner for your PC. (I don't have one of my own yet, but I have considered getting one in the future, provided Linux drivers will be available for it.) Let's not forget that they either omit headphone outputs or, if they do have them, use low-quality circuitry that results in high noise; any half-decent PC audio codec will put them to shame. They do have some advantage in convenience, but to me that's minor compared to their current downsides.

Of course, we can't forget that much of the programming on TV has also gone downhill — although some sensible stuff remains. But that's not the main topic here.

(Protip: Most new desktop PC monitors ship with the contrast control set to 70 by default; again, this is about marketability and nothing else. But at least you can still reduce it to 50 to get a realistic image. And while you're at it, adjust the brightness to a comfortable level; the mid-point is generally uncomfortably bright. Of course, you do have to choose wisely to get a good one, but aside from one dead green subpixel, I'm pretty satisfied with the image on my current Dell monitor, after initial adjustments of course. What I would really like from a monitor is higher pixel density, say a pixel pitch of 0.1 mm, but the industry is still being held back by the yet-to-end stream of software that was never written with graphical scalability in mind. :()

Re: Are dedicated TV sets becoming obsolete?

PostPosted: February 28th, 2015, 1:41 am
by LongRunner
OK, I admit it — not all contrast controls have the same effect.

What I mentioned above was true for the LG monitor I used before. With my current Dell monitor it works a bit differently:
  • 0 to 74: Appears to work much like the range of 0 to 50 did on the LG. (Lowering the setting darkens the image. Although there's a difference: The image was still visible on the LG with the control at 0. On the Dell, only the OSD remains visible with the control at 0 — the image becomes pitch-black.)
  • 75 to 100: Pure white (hex #FFFFFF/decimal 255, 255, 255) becomes…whiter. Turning it higher turns progressively darker shades of grey into this "super-white", until at 100 everything from #E6E6E6 (decimal 230, 230, 230) up is converted.
Why do they have to make this stuff so confusing? :s And isn't it only analog stuff that needs contrast controls in the first place?
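For what it's worth, the "super-white" clipping described above can be sketched as a simple transfer curve. This is only a guess at the behaviour from the two data points observed (at 75, only #FFFFFF is full white; at 100, everything from #E6E6E6 up is), assuming the threshold moves linearly in between; it's not the monitor's actual firmware logic:

```python
def white_clip_threshold(contrast):
    """Hypothetical model: the lowest grey level (0-255) that gets
    displayed as full white at a given contrast setting (75-100).

    Observed: at 75 only 255 (#FFFFFF) is full white; at 100,
    everything from 230 (#E6E6E6) up is. Linear in between (assumed).
    """
    if contrast <= 75:
        return 255  # below 75, nothing gets clipped to white
    # Interpolate between (75, 255) and (100, 230)
    return round(255 - (contrast - 75) * (255 - 230) / 25)

print(white_clip_threshold(75))   # 255
print(white_clip_threshold(100))  # 230
```

So by this (assumed) model, each step above 75 drags one more shade of light grey up into pure white, which matches what I see on screen.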

Of course, if you go into the menu options, you can choose from an array of gimmicky "presets", which are about as useful as the equaliser presets provided in audio managers — which is to say, don't ask me why they throw them in. (Complete with that "edge enhancement" rubbish in the "Movie" mode.)

Re: Are dedicated TV sets becoming obsolete?

PostPosted: March 7th, 2015, 3:11 pm
by Behemot
I don't have a TV so I don't care. It's mostly full of propaganda or other crap here (and most likely anywhere on the planet), so it is just a waste of time and money. I suggest selling it to the morons who waste their lives with the telly, and doing something better.