HDMI TV versus computer monitor -- Advice needed

I have been using a 3-monitor setup. I have two “HD” monitors in the 27" range. They are my main monitors for Cubase (and Dorico and anything else of a musical nature – I have a completely different computer for word processing, email, video editing, and other non-musical stuff.) The third monitor is a smaller one that I use in portrait mode to display miscellaneous windows.

My vision is not great, and my eyes get tired working with the small menus and GUI objects that you have to deal with on these HD displays, but I love having a lot of material in my field of view. About 6 months ago, I started running my smaller monitor at 125% (this is a Windows 10 system). I kept the two big monitors at 100%. That all worked OK. Cubase 10.5 (and all the other apps except Band-in-a-Box, which is always screwy) worked fine. In particular, at 10.5, all the VSTs displayed correctly. This was an acceptable setup, although the feature size on the two big monitors was a little small for me.

With Cubase 11, all hell broke loose. Now, with this 125/100/100 setup, many of the VSTs don’t display correctly. They don’t fit into their windows: either the app is way too small (SpectraLayers) or the window is too small and not stretchable (most of the other non-Steinberg VSTs). So I have to do something. My short-term options are to go 100/100/100 (making my small monitor very hard to read) or 125/125/125 (making less information available in the field of view). My immediate solution is probably to go with 125/125/125. But unless Steinberg fixes this Cubase 11 problem (and I haven’t seen any communication that they even acknowledge it), I think I need to go a different direction with my monitors.

I have seen some folks here recommend using TVs instead of computer monitors because you can typically get larger “glass” for fewer pixels. While you may want the maximum number of pixels when editing photos and videos, I really need the GUI features to be bigger when working on the DAW and I really don’t want to replace my video card to support 4K screens.

So with that preamble, I’m asking whether anybody is using TVs as their DAW monitors. They are quite cheap these days. Here’s a 32" HDMI screen for under $200 that supports 1920x1080. That would seem to be an easy way to achieve the effect of 125% scaling while keeping the same amount of information on screen that I have today:

Samsung 32" TV

Is there any downside to this? Are TVs harder to read (blurry) at a closer distance than computer monitors? Will they work OK with computer video cards (my card supports HDMI and 1920x1080)?

I ended up getting an LG 32" monitor (not a TV) that is native 1080p. This is pretty readable for me at 100% scaling. That was about $170 at Costco. It doesn’t solve the problem of Cubase 11 mishandling the Windows scaling factor, of course. That is all related to the scaling set on the first monitor. If the first monitor is set to anything but 100%, Cubase 11 screws up the window dimensions, no matter which monitor the window ends up on. (On further testing, Cubase 11 messes up if the scaling is other than 100% on ANY monitor.)
I decided I could live with interpolating the pixels downward on that first monitor, since it is really just for utility stuff and not my main focus. I set it to 100% scaling and changed the resolution from its native 1050x1680 down to 900x1440. That basically creates the same effect as scaling to 125%, so it is a workable way around the Cubase problem.
Likewise, I set my 3rd monitor (to my right) from its native 1050x1680 down to 900x1440. I usually have the MixConsole there. This definitely makes it more legible, but I lose a lot of real estate. I’ll probably look to replace that with a much larger monitor. That one runs into a DisplayPort on the video card, so I have some options in larger monitors.
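For anyone curious about the arithmetic behind the downscaling trick, here is a small sketch (Python; the resolutions are the ones from my setup, and the helper name is just mine):

```python
# Running a panel below its native resolution makes the monitor stretch
# fewer rendered pixels across the same glass, so every UI element is
# magnified by (native / selected) along each axis.

def effective_scale(native_px: int, selected_px: int) -> float:
    """Apparent magnification along one axis when running below native resolution."""
    return native_px / selected_px

# Portrait 1680x1050 panel (1050x1680 in portrait) dropped to 900x1440:
horizontal = effective_scale(1050, 900)
vertical = effective_scale(1680, 1440)

print(f"{horizontal:.3f}x horizontally, {vertical:.3f}x vertically")  # ~1.167x each
```

Both axes come out to about 1.17x, which is in the same ballpark as a 125% Windows scaling setting, so the “basically the same effect as 125%” description holds up.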

For those who might stumble across this thread with similar questions, it seems to me the main differentiator between a “TV” and a “computer monitor” is the refresh rate. That LG monitor will handle refresh rates up to 75 Hz; my video card will do up to 60 Hz. TVs don’t seem to normally publish a refresh rate, so I assume it is slower, which might lead to eye fatigue.

A lot of TVs do claim high refresh rates, but not on an external input. The mouse pointer is the worst thing on a low refresh rate. TVs tend to be around 30 Hz or so, which can be OK, but not for a close-up main monitor.

I have a 32” 4K monitor, which is about as small as you can go with the scaling at 100%. Some will find this still too small and will scale to 125% or 150%; if you end up doing that, there is no point in having a 4K monitor. I think I would prefer around 40”, but I’m fine with 32” with my glasses on.


Thanks for that input. As I mentioned above, Cubase 11 has serious problems if you use the Windows 10 scaling factor. So far, I am liking my new setup: 100% scaling on all monitors, but a lower-than-native resolution on 2 of them (forcing interpolation). I might not like that interpolation if I were doing a lot of work with text, but on this computer it is all music stuff.

I am thinking about getting a 42" 4K TV for my rightmost monitor. That is mostly for my MixConsole, so it isn’t really mouse-intense – I don’t ever edit waves on that screen. I do use Melodyne there, but those actions are not usually fast.

I’ve been using a 2 screen setup for about a year now, both running at their respective native resolution at 60Hz driven by an NVIDIA GeForce RTX 2060 SUPER.

  • Samsung 40" TV at 3840x2160 via HDMI
  • ASUS 27" computer monitor at 2560x1440 via DisplayPort

At those resolutions and screen size, the pixel sizes are pretty much the same, and I’m running them as a continuous single Windows 10 desktop.
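The “pixel sizes are pretty much the same” claim checks out with a quick pixels-per-inch calculation (a sketch; the diagonal sizes are the nominal ones from the list above):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch, from resolution and nominal diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

tv = ppi(3840, 2160, 40.0)        # 40" 4K TV      -> ~110 PPI
monitor = ppi(2560, 1440, 27.0)   # 27" 1440p      -> ~109 PPI

print(f"TV: {tv:.1f} PPI, monitor: {monitor:.1f} PPI")
```

The two densities differ by barely 1%, which is why the combined desktop feels like one continuous surface rather than a jump in pixel size at the bezel.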

Things to note:

  • A 40" 4K TV on the desk is as legible at native resolution as a 27" 1440p monitor - same pixel size, just more space. More space than most ultrawide computer monitors - and much less expensive.
  • I couldn’t make the NVIDIA card choose HDMI as the first port when something is plugged into a DisplayPort output, so the computer bootup always happens on the (in my case) smaller DisplayPort-connected monitor until Windows comes alive.
  • HDMI on graphics cards apparently never sends the on/off (CEC) command to the TV the way a cable box would, so I have to turn the TV on and off manually.

Things to watch out for:

  • Don’t ever run at less than 60Hz - it makes working with the mouse unbearable (source: me and everyone I’ve read or watched).
  • DisplayPort-to-HDMI cables that support 60Hz at 4K are hard to come by. This may be a killer issue for Mac owners considering the 4K TV option.

As I understand it, there are distinct generations of HDMI. HDMI 1.x is what most of us are familiar with, and it can support up to about 1920x1200 @ 60 Hz.
I believe DisplayPort came online more or less as a successor to early HDMI. The information I see about DisplayPort seems rather complicated, with many different operating modes, but it looks like any recent DP output should be able to support 4K at 60 Hz.
It seems to me there was a period when display adapters commonly included both HDMI and DP (as well as VGA); I guess it wasn’t clear which standard would prevail. Nowadays, however, most consumer-grade monitors and TVs support only HDMI, with HDMI 2.1 supporting 8K and beyond at 60 Hz.
In other words, it seems like DisplayPort is basically dead, at least as far as consumer-grade gear is concerned. What does that mean for those of us with display adapters that have older HDMI and DP connectors? I guess one can always replace the display adapter with one that supports multiple HDMI 2.1 ports.
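A rough bandwidth estimate shows why 4K at 60 Hz needs a newer link than early HDMI could provide (back-of-envelope Python; it ignores blanking intervals and link encoding overhead, so real-world requirements are somewhat higher):

```python
def video_bandwidth_gbps(width: int, height: int, hz: int,
                         bits_per_px: int = 24) -> float:
    """Uncompressed pixel data rate in Gbit/s (no blanking or encoding overhead)."""
    return width * height * hz * bits_per_px / 1e9

print(f"1080p60: {video_bandwidth_gbps(1920, 1080, 60):.1f} Gbit/s")  # ~3.0
print(f"4K60:    {video_bandwidth_gbps(3840, 2160, 60):.1f} Gbit/s")  # ~11.9
```

Roughly 12 Gbit/s of raw pixel data for 4K60 is well beyond what HDMI 1.x links carry (on the order of 5 Gbit/s), which is why 4K at 60 Hz calls for HDMI 2.0 or a recent DisplayPort version.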
A product like this converter can let a DP output drive a 4K HDMI display, but only if your adapter already supports DP 1.4, which only reached the market a couple of years ago.

Any other solutions? Can anybody recommend a display adapter that will support at least 3 HDMI 2.0 or 2.1 devices?


This display adapter provides 2 x HDMI 2.0 plus 2 x DP 1.4 ports. That may be about as close as one can come for under $500.


Higher end NVIDIA cards are in short supply these days - the one you listed is sold out at NewEgg in the US and Canada. Same with my 2060 Super, which lists under USD 500 as well - but good luck getting one at that price. :scream: Damn crypto miners! :angry: – I was lucky to get mine in April at regular retail. :sweat_smile:


Well, if they are sold out, maybe that means this is a solution many people like. I guess that’s something. I’m not in a super big hurry.

Crypto (Bitcoin and such) miners are buying high-end NVIDIA cards for their high-speed GPU computation and don’t really care about the video outputs. So in this case (and for a while going forward), scarcity is not necessarily a good barometer for our needs. – However, I generally trust the verdict of gaming-enthusiast websites and YouTube channels - especially for higher-end video cards, but also for other performance-oriented parts like SSDs, motherboards, etc.
