
Choosing the best card for the absolute best display.

bahmi

New Member
Joined
Feb 3, 2018
Messages
4 (0.00/day)
I always wanted the best looking card and usually bought Matrox. I found the VHS series from nVidia to be outstanding, but I want to move up in speed and usable power, so what card priced under 150 could I expect to have adequate speed but surpassing picture quality? I do not game.
 
Which display did you get?
 
System specs, and what is it used for? Why do you need a dGPU (discrete) instead of an iGP (integrated)? Honestly, the days of there being any difference in visual quality between AMD and Nvidia are pretty much gone, and if you don't game or use your GPU for anything apart from display, there is no point in wasting up to 150 on a GPU when integrated will suit your needs perfectly well. If you don't have an iGP, then any low-end GPU that supports your display outputs (HDMI/DVI/DP etc.) will do.
 
If you're talking 4K on an HD monitor, dedicated graphics has an advantage over some integrated graphics.
Again, we need to know what you're talking about. Or if you're talking encoding, that's another issue.
 
As stated, I tended to gravitate toward Matrox cards due to their visual magnificence. I find sharper, clearer cards are easier on my old, tired eyes.
 
As stated, I tended to gravitate toward Matrox cards due to their visual magnificence. I find sharper, clearer cards are easier on my old, tired eyes.
That's not giving us details on what specifically you're looking for.

Higher resolution? Refresh rates? The picture quality is more on the monitor, not the card.
 
That's not giving us details on what specifically you're looking for.

Higher resolution? Refresh rates? The picture quality is more on the monitor, not the card.

Yes, gone are the days when one brand of GPU had better visual quality. It may seem that way because they all ship with slightly different visual settings when you first install them, but all of this can be tweaked with monitor settings or GPU driver settings.
 
Yes, gone are the days when one brand of GPU had better visual quality. It may seem that way because they all ship with slightly different visual settings when you first install them, but all of this can be tweaked with monitor settings or GPU driver settings.

Especially outside of games: when not gaming, the visual quality of all the cards on the market is identical.
 
Waste of time
 
I always wanted the best looking card and usually bought Matrox. I found the VHS series from nVidia to be outstanding, but I want to move up in speed and usable power, so what card priced under 150 could I expect to have adequate speed but surpassing picture quality? I do not game.
The problem is that every Matrox card I've seen had analog output (D-sub/VGA).

If you are going digital, then any card will outshine VGA all the way: color precision, refresh rate, and whatever.
I only have experience with my RX 470, so not sure about Nvidia though.
 
Swore up and down those Matrox cards had the contrast, color reproduction, and necessary speed to make viewing an absolute pleasure. Mebbe I was wrong all the time, dadgum.
 
Swore up and down those Matrox cards had the contrast, color reproduction, and necessary speed to make viewing an absolute pleasure. Mebbe I was wrong all the time, dadgum.
WHEN? Matrox has been around for decades.

Speed is irrelevant; if everything is running at refresh rate, everything is equal.
 
Swore up and down those Matrox cards had the contrast, color reproduction, and necessary speed to make viewing an absolute pleasure. Mebbe I was wrong all the time, dadgum.

There are calibration settings in the graphics card's control panel that let you adjust a lot of things that affect how the image looks. They are basically enhancing what the OS is outputting, so by default it doesn't matter if you have an Intel, an AMD, an nVidia, or a Matrox; the image they output looks pretty much the same.

But, for example, nVidia has a setting called Digital Vibrance that I like to bump up slightly. It makes all the colors, well, more vibrant, and everything a little less dull. They also offer adjustments for Brightness, Contrast, Gamma, and Hue, but by default they are all set to not change the image that Windows is generating.
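To illustrate: a vibrance-style control is essentially a saturation boost applied per pixel. Here is a minimal sketch in Python's stdlib `colorsys`; this is my own illustration of the idea, not NVIDIA's actual algorithm.

```python
import colorsys

def boost_saturation(rgb, factor=1.2):
    """Scale a pixel's saturation, clamping to the valid range.

    rgb: (r, g, b) with components in 0..255.
    factor: 1.0 = no change; >1.0 = more vibrant.
    """
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(1.0, s * factor)  # clamp: saturation cannot exceed 1
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
    return tuple(round(c * 255) for c in (r2, g2, b2))

# A muted red becomes noticeably more saturated:
print(boost_saturation((180, 120, 120), 1.5))  # -> (180, 90, 90)
```

Note that greys (zero saturation) pass through unchanged, which is why a mild boost mostly affects already-colorful areas.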
 
There are calibration settings in the graphics card's control panel that let you adjust a lot of things that affect how the image looks. They are basically enhancing what the OS is outputting, so by default it doesn't matter if you have an Intel, an AMD, an nVidia, or a Matrox; the image they output looks pretty much the same.

But, for example, nVidia has a setting called Digital Vibrance that I like to bump up slightly. It makes all the colors, well, more vibrant, and everything a little less dull. They also offer adjustments for Brightness, Contrast, Gamma, and Hue, but by default they are all set to not change the image that Windows is generating.
all adjustments are conceptually 'wrong': the quality is reduced, colors/values get crushed

practically... it depends on how bad the monitor is, but in the end you'd better go through calibration imagery to confirm that software adjustments did not cause side effects
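The "crushing" is easy to demonstrate numerically: a naive contrast boost pushes values past the displayable range, where they clip, and formerly distinct shades become identical. A minimal sketch (my own illustration of the effect, not any driver's actual code):

```python
def apply_contrast(value, factor=1.5, pivot=128):
    """Naive contrast adjustment around a mid-grey pivot, clipped to 0..255."""
    out = pivot + (value - pivot) * factor
    return max(0, min(255, round(out)))

# Three distinct dark shades...
shades = [5, 15, 60]
crushed = [apply_contrast(v) for v in shades]
print(crushed)  # -> [0, 0, 26]: the first two collapse to black, detail is gone
```

This is exactly why a dark scene can lose shadow detail after a casual contrast tweak.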
 
all adjustments are conceptually 'wrong': the quality is reduced, colors/values get crushed

practically... it depends on how bad the monitor is, but in the end you'd better go through calibration imagery to confirm that software adjustments did not cause side effects


Agreed. Without a proper calibration tool, anything you change is really just adjusting to what you "feel" looks right, which is in no way accurate. But it doesn't really matter unless you are doing production work that requires strict color accuracy. And if you are doing that kind of work, you better have a calibration tool, because even just changing out the monitor is going to f up the color accuracy.

But if I'm just sitting at the computer using it all day, I'm going to adjust things so they look good to me. That's all that really matters.
 
ColorMunki or Spyder5, a couple of versions of each. I have used both; I favor the Spyder, but the ColorMunki is good too. As noted above, a good monitor is key.
 
Agreed. Without a proper calibration tool, anything you change is really just adjusting to what you "feel" looks right, which is in no way accurate. But it doesn't really matter unless you are doing production work that requires strict color accuracy. And if you are doing that kind of work, you better have a calibration tool, because even just changing out the monitor is going to f up the color accuracy.

But if I'm just sitting at the computer using it all day, I'm going to adjust things so they look good to me. That's all that really matters.
but... most content everyone consumes has been professionally designed, you're not supposed to alter it

a dark scary game may still have various shades that might disappear when tweaking, clouds may look less cloudy if they lose their faint differences

what i mean by calibration imagery is more casual: not to actually properly calibrate the monitor, but to just adjust the contrast or a few other settings so nothing gets crushed. something simple like http://www.lagom.nl/lcd-test/ (pg2, pg7, pg8), where you spend only a minute to make sure as many color, dark, & light blocks as possible are visible. plus, when using hdmi, double check that your OS/driver black/white output is 0-255 & that the monitor is set to the same 0-255, rather than washed out 16-235

i do a quick check of this on any & every monitor i ever use, no matter how bad the monitor is, just to see if i can get to a vague 'as good as it gets' setting

edit: also need to disable all of the stupid effects that monitor companies put in, like splendid bs, movie mode garbage, like wtf... they add a random tint & make the screen darker in most movie modes, how is that a good idea!? movies are even more professionally colored than games!
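The 0-255 vs 16-235 mismatch mentioned above is worth spelling out: limited ("TV") range maps black to code 16 and white to 235, so when one side outputs limited range and the other expects full range, blacks turn grey and whites turn dim. A sketch of the standard expansion for 8-bit values (my own illustration of the math):

```python
def limited_to_full(value):
    """Expand limited-range (16-235) 8-bit video levels to full range (0-255)."""
    out = (value - 16) * 255 / (235 - 16)  # 219 usable steps stretched to 255
    return max(0, min(255, round(out)))

print(limited_to_full(16))   # -> 0    (black stays black)
print(limited_to_full(235))  # -> 255  (white stays white)
print(limited_to_full(128))  # -> 130  (mid-grey shifts slightly)
```

When this expansion is skipped, 16 is displayed as a dark grey instead of black, which is the washed-out look described above.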
 
VHS series from Nvidia? What year is this where this made up product could possibly exist?
 
VHS series from Nvidia? What year is this where this made up product could possibly exist?

It's 2018 and we're talking about Matrox and VHS. Mind if I call troll on this one?

"I found the VHS series from nVidia to be outstanding, but I want to move up in speed and usable power"

That's a whole lot of WTF in one sentence. Is this from an alternate dimension or what? You want to play your video at double speed?

[Image: VHS video tape]
 
As stated, I tended to gravitate toward Matrox cards due to their visual magnificence. I find sharper, clearer cards are easier on my old, tired eyes.

A rare person: a Matrox troll-fanperson
:) never thought I would get to write that :)

as a side note, when did you last have your eyes tested?
 
but... most content everyone consumes has been professionally designed, you're not supposed to alter it

a dark scary game may still have various shades that might disappear when tweaking, clouds may look less cloudy if they lose their faint differences

That isn't how it works. The content all starts out the same, and by default the video card will output the exact same image, but the problem is that every monitor is different. So what you see on your screen is not the same as what I see on mine, even if we are both looking at the exact same image. Even two monitors of the exact same model number can vary enough to look different.

what i mean by calibration imagery is more casual: not to actually properly calibrate the monitor, but to just adjust the contrast or a few other settings so nothing gets crushed. something simple like http://www.lagom.nl/lcd-test/ (pg2, pg7, pg8), where you spend only a minute to make sure as many color, dark, & light blocks as possible are visible. plus, when using hdmi, double check that your OS/driver black/white output is 0-255 & that the monitor is set to the same 0-255, rather than washed out 16-235

i do a quick check of this on any & every monitor i ever use, no matter how bad the monitor is, just to see if i can get to a vague 'as good as it gets' setting

I understand that, but what I'm saying is when you are doing that, you are using your judgement and taste to determine what looks "right" to you. What looks right to you may not look right to me. Doing calibration this way is in no way actual calibration or anything close to it.
 
I feel like I have gone back in time 20 years. Matrox made its name initially in the mid 90s with the Millennium, which was capable of higher resolution / sharper displays. Later on in the CAD market (Matrox Impression add-on) they made another splash with 3D acceleration. Subsequently they tried to leverage that success with a marketing campaign based upon their cards being "professional grade"... with results that were short lived, and cards from 3DFX and Diamond rose to prominence. The next card was the Mystique (oft dubbed the "Mistake") in that it had very poor 3D image quality and paled next to the Voodoo cards which were all the rage. Any perception of Matrox cards being superior or magnificent in any way in this day and age is merely a remnant of reading those articles in the mid to late 90s.

The last significant Matrox card in the PC market was the G200 (8MB) in the 90s; they continued making it until just into the new millennium. The G250 went by relatively unnoticed. These cards saw some initial success... but their mindshare with the public was short lived. From then on, Matrox's offerings were never able to compete in performance or graphics with the likes of nVidia and ATI. At that point Matrox focused on the industrial / commercial market. The last time I used one was in 2013, where I powered four large 1080p screens for a power plant monitoring system's overhead display. The selection had nothing to do with the card's performance or visual acuity... it's just that it offered the ability to connect four screens in a 7680 x 1080 configuration.

As for buying a card today: for one, your image is dependent on the capabilities of the display. The best screens available today are 165 Hz, 1440p IPS and run around $750. Unfortunately, the card (a GTX 1070) needed to drive them in a gaming environment runs over $900 today. At your budget of 150 (150 what? I'll assume US dollars), you are looking at a GTX 1050 ($150 - $250 depending on model) or the AMD equivalent (RX 560). Either should satisfy for movie viewing and light gaming.

As for not doing adjustments on any new monitor / card... no, they are not delivered set up perfectly. No two monitors arrive at your door looking the same. That's why there are sites that offer ICC profiles that will get you very close to where you need to be for any particular model. There may be multiple profiles for different refresh rates, if you are using MBR technology, or even if there's a preference for Display Pro, Spyder, X-rite i1 Pro, Lacie or ColorMunki.

http://www.tftcentral.co.uk/articles/icc_profiles.htm
 
can't you guys spot a troll?? lol, Pretty obvious...
 
A rare person: a Matrox troll-fanperson
:) never thought I would get to write that :)

as a side note, when did you last have your eyes tested?
First jughead I've seen on tech sites. Congrats.

I feel like I have gone back in time 20 years. Matrox made its name initially in the mid 90s with the Millennium, which was capable of higher resolution / sharper displays. Later on in the CAD market (Matrox Impression add-on) they made another splash with 3D acceleration. Subsequently they tried to leverage that success with a marketing campaign based upon their cards being "professional grade"... with results that were short lived, and cards from 3DFX and Diamond rose to prominence. The next card was the Mystique (oft dubbed the "Mistake") in that it had very poor 3D image quality and paled next to the Voodoo cards which were all the rage. Any perception of Matrox cards being superior or magnificent in any way in this day and age is merely a remnant of reading those articles in the mid to late 90s.

The last significant Matrox card in the PC market was the G200 (8MB) in the 90s; they continued making it until just into the new millennium. The G250 went by relatively unnoticed. These cards saw some initial success... but their mindshare with the public was short lived. From then on, Matrox's offerings were never able to compete in performance or graphics with the likes of nVidia and ATI. At that point Matrox focused on the industrial / commercial market. The last time I used one was in 2013, where I powered four large 1080p screens for a power plant monitoring system's overhead display. The selection had nothing to do with the card's performance or visual acuity... it's just that it offered the ability to connect four screens in a 7680 x 1080 configuration.

As for buying a card today: for one, your image is dependent on the capabilities of the display. The best screens available today are 165 Hz, 1440p IPS and run around $750. Unfortunately, the card (a GTX 1070) needed to drive them in a gaming environment runs over $900 today. At your budget of 150 (150 what? I'll assume US dollars), you are looking at a GTX 1050 ($150 - $250 depending on model) or the AMD equivalent (RX 560). Either should satisfy for movie viewing and light gaming.

As for not doing adjustments on any new monitor / card... no, they are not delivered set up perfectly. No two monitors arrive at your door looking the same. That's why there are sites that offer ICC profiles that will get you very close to where you need to be for any particular model. There may be multiple profiles for different refresh rates, if you are using MBR technology, or even if there's a preference for Display Pro, Spyder, X-rite i1 Pro, Lacie or ColorMunki.

http://www.tftcentral.co.uk/articles/icc_profiles.htm
This is a great reply. Thanks.
 