Some TVs may also try to rescale the inputs, or enhance/process the image in ways intended to improve perceived video quality: increasing the frame rate of lower-frame-rate sources (hence the 120 Hz and 240 Hz TVs that attempt to make 24, 30, and 60 FPS sources look smoother), or deinterlacing 1080i ATSC sources. Some of this image processing may not work well in specific monitor use cases.

I have had generally good results using a TV as an HTPC monitor. The only issues I've run into over the years are 1.) a 1080p Sony TV with a VGA input that could not handle 1920x1080 (using HDMI worked) and 2.) a 720p Toshiba that could not show the BIOS screen of the attached computer (I think this was either an unsupported-resolution issue, or a timing issue where the TV couldn't wake up fast enough from its 'signal lost' message to display a brand-new signal input). YMMV.

VPNs: there is a race going on between streaming services, who want to block VPNs, and VPN services, whose customers want to be able to watch streams (whether in or out of their regions). Some VPN customers buy the service because they do not trust their ISP not to do things like sell browsing histories. I think ISPs are getting caught in the middle, perhaps when they have IP ranges near or inside ranges that IP-reputation companies suspect are used by VPN services. I'd guess the problem is more likely to affect smaller ISPs, and not the Comcast/Cox/Charter/Spectrum/CenturyLinks of the world. There are also 'distributed VPN' services that let people share their connections with others.

We are also seeing fragmentation in the cable/streaming service space, similar to what happened in the cable/Dish Network/DirecTV wars. Add it all up, and some customers may throw up their hands in annoyance at the various platforms and revert to other means of obtaining the content they seek.
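To put numbers on the frame-rate point, here's a rough sketch of the cadence math in Python (illustrative only; real sets interpolate new frames rather than just repeating them):

# Hypothetical sketch: how many times each source frame is repeated
# to fill a panel's refresh rate.
def cadence(source_fps, panel_hz):
    base = panel_hz // source_fps    # minimum repeats per source frame
    extra = panel_hz % source_fps    # source frames (per second) shown one extra time
    if extra:
        return "%d of every %d frames x%d, rest x%d (uneven -> judder)" % (
            extra, source_fps, base + 1, base)
    return "every frame x%d (even cadence)" % base

print(cadence(24, 60))    # 3:2 pulldown on a 60 Hz panel: uneven
print(cadence(24, 120))   # clean 5:5 on a 120 Hz panel
print(cadence(30, 120))   # clean 4:4
print(cadence(60, 120))   # clean 2:2

120 divides evenly by 24, 30, and 60, which is presumably why that refresh rate caught on in the first place.

As for the IP-reputation guess: the blocking is presumably as blunt as prefix matching, so an ISP whose addresses land inside a suspected range gets caught along with the VPN. A made-up illustration (the prefixes are RFC 5737 documentation ranges, not real VPN blocks):

import ipaddress

# Hypothetical reputation list; documentation prefixes, not real data.
suspect_ranges = [ipaddress.ip_network("198.51.100.0/24")]

def looks_like_vpn(addr):
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in suspect_ranges)

print(looks_like_vpn("198.51.100.77"))  # True -- an innocent ISP customer in the block is caught too
print(looks_like_vpn("203.0.113.5"))    # False

On Wed, Sep 1, 2021, 15:13 Owen DeLong via NANOG <nanog@nanog.org> wrote: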
On Sep 1, 2021, at 11:25, bzs@theworld.com wrote:
Every time I've read a thread about using TVs for monitors, several people who'd tried would say don't do it. I think the gist was that the image processors in the TVs would fuzz text or something like that; it was usable, but they were unhappy with their attempts, and it was tiring on the eyes.
That was definitely true of 480-line TVs and older 1080p units, but modern sets are almost designed to be monitors first and everything else second.
Maybe that's changed, or maybe people happy with this don't do a lot of text? Or maybe there are settings involved they weren't aware of, or some TVs (beyond superficial specs like 4K vs 720p) are better for this than others, so some will say they're happy and others not so much?
There are some tradeoffs… For example, sitting at normal computer-monitor distance from a 44” 4K screen, you can damn near see the individual pixels, and that can make text look fuzzy, especially if your GPU or OS is stupid enough to use a technique called anti-aliasing on text (which is the most probable source of the fuzziness in your originally quoted complaint).
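A quick back-of-envelope check on that (assuming square pixels; the sizes are the ones from this thread):

import math

# Pixel density: diagonal pixel count over diagonal inches.
def ppi(h_px, v_px, diag_in):
    return math.hypot(h_px, v_px) / diag_in

print(round(ppi(3840, 2160, 44)))  # ~100 ppi -- coarse enough to see pixels at desk distance
print(round(ppi(3840, 2160, 27)))  # ~163 ppi -- the 27" 4K alternative

At ~100 ppi, anti-aliased glyph edges span visibly gray pixels instead of blending.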
Older TVs would try to smooth some aspects of the analog signal by anti-aliasing pixels that fell on the edge of a change in the color signal. (The extent of this action was what the “Sharpness” knob controlled back in the analog days.)
Turning off this capability (Sharpness at the leftmost or lowest setting) would often improve things greatly.
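A toy 1-D model of what that knob does to an edge (simple convolution kernels standing in for the analog circuitry, so this is a sketch, not how any particular set works):

edge = [0, 0, 0, 10, 10, 10]   # dark-to-light transition, e.g. the side of a glyph

def convolve(sig, k):
    # centered convolution with edge padding
    pad = len(k) // 2
    s = [sig[0]] * pad + sig + [sig[-1]] * pad
    return [sum(s[i + j] * k[j] for j in range(len(k))) for i in range(len(sig))]

print(convolve(edge, [0.25, 0.5, 0.25]))   # sharpness down: [0.0, 0.0, 2.5, 7.5, 10.0, 10.0]
print(convolve(edge, [-0.5, 2.0, -0.5]))   # sharpness up:   [0.0, 0.0, -5.0, 15.0, 10.0, 10.0]

Smoothing rounds the edge off; sharpening overshoots it on both sides, which is exactly the halo that makes small text tiring to read.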
Or maybe the unhappy ones were all trolls/sockpuppets from companies manufacturing/selling $500+ 24" **GAMING** monitors.
Possible, but unlikely.
Owen
On September 1, 2021 at 09:48 nanog@nanog.org (Owen DeLong via NANOG) wrote:
On Aug 31, 2021, at 18:01, Michael Thomas <mike@mtcc.com> wrote:
On 8/31/21 4:40 PM, Owen DeLong via NANOG wrote:
On the other hand, the last time I went looking for a 27” monitor, I
ended up buying a 44” smart television because it was a cheaper HDMI 4K monitor than the 27” alternatives that weren’t televisions. (It also ended up being cheaper than the 27” televisions, which didn’t do 4K, only 1080p, but I digress.)
Back when 4k just came out and they were really expensive, I found a
"TV" by an obscure brand called Seiki which was super cheap. It was a 39" model. It's just a monitor to me, but I have gotten really used to its size and not needing two different monitors (and the gfx card to support it). What's distressing is that I was looking at what would happen if I needed to replace it and there is this gigantic gap where there are 30" monitors (= expensive) and 50" TV's which are relatively cheap. The problem is that 40" is sort of Goldielocks with 4k where 50" is way too big and 30" is too small. Thankfully it's going on 10 years old and still working fine.
Costco stocks several 44” 4K TV models (like the one I got) that are relatively cheap. It’s a little larger than your 40” goldilocks, but I think still within range.
Owen
--
-Barry Shein

Software Tool & Die    | bzs@TheWorld.com             | http://www.TheWorld.com
Purveyors to the Trade | Voice: +1 617-STD-WRLD       | 800-THE-WRLD
The World: Since 1989  | A Public Information Utility | *oo*