raydamilo

ATI Catalyst detection problem

Hi Atomic team. I'm experiencing a detection problem. I have a BenQ G2020HDA 20" widescreen and an AMD Radeon HD 6570 1GB 128-bit. The problem is that CCC (Catalyst Control Center) detects my monitor as a CRT instead of an LCD, and that causes a lot of image quality problems. I'm seeing details (icons, fonts, window edges etc.) exactly as if I were looking at a '98 CRT monitor.

And that's not all. My maximum resolution is 1600x900, which is what's set now, but that resolution should run at a 60Hz refresh rate; I had to set it to 75Hz to make it display properly (the maximum refresh rate on my monitor is 76Hz, so no worries there). If it's set to 60Hz at 1600x900, the image runs off the edges of my monitor (too wide), without any warning that it's out of range. Pretty odd behaviour, despite what the specifications say: the video card supports resolutions way higher than 1600x900, and my monitor's maximum resolution is 1600x900, which I'd expect to be the recommended one as well. It's a 20" monitor, let's get serious. Oh, and in Windows' screen resolution settings I get no "recommended" option.

It's pretty odd what's happening here. Things were fine when I had an nVidia video card (512MB, 256-bit, I don't remember the series) and Windows XP installed. Now I've upgraded to an AMD Radeon HD 6570 and Windows 7 Ultimate, and things got worse. Where is the "upgrading" in that? Answer: nowhere. Anyway, I think the core of all the problems is there. Why does CCC see my monitor as a CRT? And how can I solve this problem?

 

Thanks for taking the time to read my thread, and thanks in advance to the whole Atomic team. You're doing a good job. I've had some PC problems before and your forum made me see the light; not only did I solve those problems, I understood why and where they started. Anyway, I was praying not to end up on your forum with a new thread :P but it looks like I couldn't find an existing thread that matches my problem, so I took my heart in my teeth and said: OK, my time as a reader has come to an end! :P

 

Really, guys, please help me out with this one. Thanks again.

The monitor only has a D-Sub (VGA) input, so nothing can detect it properly.

You will probably find the monitor only supports 60Hz at its native resolution and anything higher is at a lower resolution, and this may be creating the drop in image quality, but maybe not.

Try it at the native resolution at 60Hz. Both CCC and the monitor should have settings to scale the image so it fits the display correctly, which is just something that has to be done when using VGA. (There's a rough sketch at the end of this post for checking which modes the driver actually exposes.)

Being a low-quality display, it will probably never be really sharp.
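
Here's the sketch I mentioned: a rough, untested example (assuming Windows and Python 3, standard library only) that asks Windows which display modes it actually exposes for the primary display, so you can see whether 1600x900 at 60Hz is even on the list. The 1600x900 filter is only there because that's this monitor's native resolution.

import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Full-size DEVMODEW layout (display arm of the union); we only read a few fields.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)

# EnumDisplaySettingsW(NULL, i, &mode) walks the mode list of the primary display.
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    if mode.dmPelsWidth == 1600 and mode.dmPelsHeight == 900:
        print(f"1600x900 @ {mode.dmDisplayFrequency} Hz, {mode.dmBitsPerPel}-bit")
    i += 1

If a proper 60Hz mode isn't in that list at 1600x900, the driver simply isn't offering it, which would point back at the detection problem rather than at the panel itself.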

Edited by Dasa

^ If that's the case, then that's the problem. If the screen doesn't display properly, you'll need to screw around with the monitor's settings, I'm guessing.

OK, I've adjusted some of my monitor's settings, but it's not good enough; it's just acceptable. I'd still like to know what to do so that CCC sees my monitor for what it really is: an LCD, not a CRT.

"OK, I've adjusted some of my monitor's settings, but it's not good enough; it's just acceptable. I'd still like to know what to do so that CCC sees my monitor for what it really is: an LCD, not a CRT."

In what way is it not good enough?

For any GPU to see your monitor for what it is, it has to be connected with a newer display cable, which your cheap-ass monitor doesn't support.

 

D-Sub/VGA is the old connector used for CRTs.

DVI/HDMI/DisplayPort are the new ones that allow proper detection and automatic setup.

 

This popped up in Google, and although I haven't read it, it does have all the pictures, and I'll assume the info isn't bad:

http://homeservershow.com/vga-dvi-hdmi-dis...-explained.html
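
For what it's worth, whatever EDID the PC has managed to read from the monitor gets cached in the registry, so you can at least check whether anything came across the VGA cable. A rough, untested sketch (assumes Windows and Python 3, standard library only; it just walks the usual HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY key):

import winreg

DISPLAY_KEY = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def edid_blocks():
    # Walk DISPLAY\<model>\<instance>\Device Parameters and yield any stored EDID value.
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_KEY) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            model = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, model) as model_key:
                for j in range(winreg.QueryInfoKey(model_key)[0]):
                    instance = winreg.EnumKey(model_key, j)
                    try:
                        with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                            yield model, instance, edid
                    except OSError:
                        pass  # no EDID stored for this instance

for model, instance, edid in edid_blocks():
    header_ok = edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00"
    print(f"{model} ({instance}): {len(edid)} bytes, valid EDID header: {header_ok}")

If nothing shows up under the BenQ's entry, the PC never received an EDID over that cable, which would explain both the CRT guess and the missing "recommended" resolution.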

Edited by Dasa

D-Sub is an analogue input/output and, as such, your card won't be able to detect the monitor over it. DVI, HDMI and DisplayPort are all digital and, as such, the monitor can be detected and can even install its own drivers etc. for automatic setup.

 

If you want your monitor to be detected, you need a new monitor that uses a digital input.

 

So, in conclusion, it's not CCC that's failing to detect your monitor; it's your monitor failing to put a digital signal across, hence not being detected properly.

 

Literally the only way to remedy that is to buy a monitor with a digital input.

 

This part of the article Dasa linked should also help make sense of why you think it looks a little blurry:

 

When you use a VGA connector, the pixels are not individually lit and the lit pixels are affected by other pixels. For those of you with two monitors, one on VGA and one on DVI, you will notice the DVI looks slightly more crisp.

You will also need to change the vertical and horizontal size within the monitor's settings to ensure the image fits onto the screen correctly.

 

*EDIT* Looking a little further into the article, it seems the monitor type can be detected over D-Sub after all, but it depends on the type of cable and most likely on the monitor too.

 

There is also the Display Data Channel (DDC or DDC2), carried on dedicated pins, which provides information that tells your PC what type of monitor it is. Modern VGA cables also carry DDC/DDC2.
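
If an EDID block for the monitor does turn up (for example with the registry sketch earlier in the thread), byte 20 of it says whether the display declares a digital or an analogue input, and an analogue declaration is one plausible reason a driver files a panel under CRT. A small, untested helper along those lines (Python 3; describe_input is just a made-up name for illustration):

def describe_input(edid: bytes) -> str:
    # Byte 20 of the EDID base block is the "video input definition":
    # bit 7 set means digital input, clear means analogue.
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        return "not a valid EDID base block"
    kind = "digital" if edid[20] & 0x80 else "analogue"
    # Bytes 21-22 are the maximum image size in cm, a handy sanity check.
    return f"{kind} input, reported size {edid[21]}x{edid[22]} cm"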

Edited by NukeJockey
