I have a dual monitor set-up and I recently upgraded to Windows Vista. I was having no problems until a few days ago, when my second monitor just stopped working. I don't think it's a problem with either the monitor or the cable, because I see images on both monitors while my system is starting up; it's only when Windows loads that I see nothing and the monitor displays "No Signal".
I have tried a few things to fix this problem, but to no avail. This is what I have attempted:
- Re-downloading and installing the latest Vista drivers for my video card
- Uninstalling some of the recent updates from Windows Update
- Switching the cables on my video card (to see if I could get the other monitor working... it did nothing)
Does anyone have any ideas on what I could do to fix this?
I have an ATI Radeon X1650 Pro, a Dell E172FPb monitor (this one is working fine) and a ViewSonic V10859 (the problematic one).
It's the Fn problem!!! (Function key) ;) Read on to fix it...
Had the same problem when adding a monitor to my XPS.
1.) Attach the connector lead of the monitor to your laptop
2.) So you don't see the display? :D
3.) The laptop has a Function key (the Fn key next to the Windows key) on the keyboard. Press and hold it, then hit F5 or F8, or whichever F key has the monitor icon on it (it is F5 on Toshiba and F8 on Dell).
4.) This should fix it.
5.) Sometimes too high a resolution on an extended screen can also cause a blank display. Just reduce the resolution of your new (additional) monitor to 800×600 and that should fix it.
6.) If these steps still don't fix it, then your next bet is reinstalling your display drivers.
Hello! It seems I've come across most of the problems described in this thread. As there doesn't seem to be a universal solution for handling secondary display problems I might as well suggest the best I have in my arsenal (starting with the basic ones):
1. So you've updated the drivers for both your monitor and graphics card (naturally), and in case you have a notebook, you modified them with a mod tool, right? I only mention the last part because the drivers on ATI's download page aren't for notebooks/laptops, which is why you normally have to go to your notebook manufacturer whenever you have a problem with your graphics card. I'm sure you are already aware of this.
2. As you can actually see the system information on startup, on BOTH screens, there obviously isn't anything wrong with the monitors or the graphics card; instead, something is wrong with the settings. I've experienced this before and got it to work. So DON'T throw anything away! :)
3. You have an ATI card and Catalyst Control Center. I imagine that you, in CCC/Display Manager, can see your secondary display icon, which you have then cloned. I understood this correctly, right? Okay... When I had this problem, it was due to a refresh rate/resolution incompatibility issue with my two monitors. Now, my secondary monitor was an HDTV (Philips), but the logic seems to be the same.
Here's what I did to make it work:
Step 1) In Display Options there is a drop-down from which I selected something like "show only refresh rates/resolutions that work on all displays". Mine is in Swedish, so I'm just guessing what it's called in English. It shouldn't be hard to find. Then click Apply.
Step 2) In Display Manager (still in CCC), I have a button called Force. Upon clicking it I receive four different alternatives, among which one is called 'HDTV'. Choosing that gives me a number of resolution/refresh rate options. Pick one - maybe some resolution at 60 Hz if available. Now it could actually work.
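For what it's worth, the two steps above amount to restricting the mode list to modes that every attached display supports. Here is a minimal sketch of that logic in Python; the mode lists are made up for illustration (real ones come from each monitor's reported capabilities), not anything CCC actually exposes:

```python
# Hypothetical mode lists; a mode is (width, height, refresh_rate_hz).
primary_modes = {(1920, 1080, 60), (1280, 1024, 75), (1024, 768, 60), (800, 600, 60)}
secondary_modes = {(1280, 720, 60), (1024, 768, 60), (800, 600, 60)}

# "Show only modes that work on all displays" is just the set intersection.
common_modes = primary_modes & secondary_modes

# Cloning then has to pick one mode from this set,
# e.g. the largest area available at 60 Hz.
best = max((m for m in common_modes if m[2] == 60), key=lambda m: m[0] * m[1])
print(best)  # -> (1024, 768, 60)
```

This is also why forcing a conservative mode (some resolution at 60 Hz) often brings a blank secondary display back: it lands you inside that common set.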
I recently had this issue myself: one day both monitors were working fine, then one stopped. In the meantime I upgraded my whole system, including the graphics card, and still could not get it going. I'm now running an ASUS 1 GB Radeon 4870. After buggering around in Vista's display options to no avail, I had a look in the Catalyst Control Center and soon found the options I needed to tweak, i.e. extended desktop as suggested above. So instead of playing around with the Vista options, try looking in the control panel for your graphics card and see if the option you need is there.
This will obviously only help those who still have both displays showing up in the device manager but no longer running in dual mode.
I have just got a Dell XPS 420 and have a similar problem.
I have an LCD as the primary monitor on the ATI 2600 card and an LCD TV on S-Video. It seems to work fine for a time, but then when I switch it on in the morning I can see the cursor arrow on the TV, so I know the system has loaded, but my other monitor's lamp is flashing and the screen is dark. Generally I click around and press Enter and finally the login window appears, but it is a hassle and not a solution.
I have the latest ATI drivers installed.
Is there a key command that will take me to login without the mouse?
I tried having the monitors cloned to show the same image so I can log in, but the PC defaults to the lower resolution of the TV, and the ATI driver does not offer this option.
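A note on why the clone defaults to the TV's resolution: a cloned desktop runs a single mode on both outputs, so it is capped by whichever display supports less. A tiny sketch, with assumed maximum resolutions (the S-Video figure is a guess for illustration):

```python
# Hypothetical maxima; the clone mode can't exceed either display's limit.
monitor_max = (1680, 1050)   # primary LCD (assumed)
tv_max = (800, 600)          # LCD TV over S-Video (assumed)

# A cloned desktop shows one mode on both outputs, so each dimension
# is capped by the weaker display.
clone_mode = (min(monitor_max[0], tv_max[0]), min(monitor_max[1], tv_max[1]))
print(clone_mode)  # -> (800, 600)
```

So as long as the TV stays in the clone group, the desktop will be stuck at the TV's resolution; extending instead of cloning lets each display run its own mode.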
I have the same problem and am also using:
Vista, a dual-DVI card, a ViewSonic monitor and a Toshiba LCD, with a 5 m cable to the ViewSonic and a 5 m cable to the LCD.
Before, all was fine until I changed to the 19" ViewSonic. I tried my 19" monitor on a different PC and all is good! But the strange thing is that when I plug my old 17" ViewSonic back in, all is good too!!
OK, so I changed the VGA card from the old one (an ASUS ATI EX700 Pro) to a new ASUS 9600GT, and still the same problem. Maybe I should get a new cable, but then how is it working fine with my old monitor?!
It's driving me crazy, I can't think of a solution...
Just right-click on your desktop, then click Properties, then Settings. Click on the number 2, but make sure your leads are connected to that monitor. Then tick the box "Extend my desktop onto this monitor". It should now be working. In addition, there is a great free download called UltraMon that lets you easily move stuff from one monitor to another and set up your own hotkeys; follow the link http://www.realtimesoft.com/ultramon/download.asp
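Incidentally, what a tool that moves windows between monitors is doing under the hood is just coordinate arithmetic on the combined virtual desktop. A rough sketch, assuming a hypothetical side-by-side layout (the origins here are made up, not anything UltraMon documents):

```python
# Hypothetical side-by-side layout: monitor 1 at x=0, monitor 2 at x=1280.
MON1_ORIGIN_X, MON2_ORIGIN_X = 0, 1280

def move_to_other_monitor(x, y):
    """Shift a window's top-left corner into the other monitor's area,
    keeping its position relative to that monitor's origin."""
    if x >= MON2_ORIGIN_X:
        return (x - MON2_ORIGIN_X + MON1_ORIGIN_X, y)
    return (x - MON1_ORIGIN_X + MON2_ORIGIN_X, y)

print(move_to_other_monitor(100, 50))   # -> (1380, 50)
print(move_to_other_monitor(1380, 50))  # -> (100, 50)
```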
I fixed it by going to Control Panel/Personalization/Connect to a projector or other external display,
then clicking in the connect display window and selecting the first or last choice.
I had it on the second one and could only extend the monitors, but now it's back to normal. I don't know why it changed itself.
OK, my dual monitor setup often stops working when Windows does an automatic update.
I'll turn on my computer in the morning, and one of the monitors will be dead.
Usually I can fix it by right-clicking on the desktop, going to the display settings, and playing around in there for a while.
This morning, no dice.
This worked for me...
(btw, I tried most of the other stuff on this board and none of it worked).
(btw...I have a Dell desktop I got 2 years ago with Windows Vista).
In playing around in the display settings, I realized that I had a Radeon video card.
The drivers and cards are made by AMD.
Their official driver download site is http://support.amd.com/us/gpudownload/Pages/index.aspx
It will ask you what version of windows you have (for me it was Vista home edition 32 bit).
My card is a Radeon X1300 PRO.
As I was downloading the drivers, my monitors got fixed...
I've been running a Dell XPS 420 since March '08, with a 24" Dell monitor as the main one and a 22" Dell monitor with a webcam as the secondary, without issue. About a week ago my second monitor just stopped working. I was really busy at the time, so I didn't pay a lot of attention to the problem (I don't know if it started after a patch, etc.). So yesterday I got some time and started looking.
There is nothing wrong with the second monitor: the built-in mic works, the built-in webcam works, but it will not 'see' the signal from the Dell desktop. I upgraded the video card, uninstalled and reinstalled it, and none of that fixed the problem.
If you try to extend the desktop to the second monitor, nothing happens; click Apply and the extend checkbox becomes unchecked. I tried swapping the monitor cables, but that didn't resolve the issue either. I'm pretty well stuck here. I didn't change anything on the computer myself (I live alone), and I can't get it working again.
The second monitor's power button lights up, I can switch between the three input modes without a problem, and as I said, the mic and webcam built into it work just fine.
Hi, I had a very similar problem, BUT I HAVE A SOLUTION. I have just updated to a Toshiba Satellite laptop and connected it to a second screen. At the beginning both screens worked just fine, and then the main monitor stopped working and only displayed the Vista wallpaper. I found out that I had "extended" the desktop from my laptop screen (Monitor 1) to my main screen (Monitor 2) through the display properties, and I could drag items across etc.; it was basically a physical extension of my desktop. No matter what I tried afterwards in Properties, nothing worked.
As I did not want this, but to work mainly from my big monitor (Monitor 2), I was told I had to make Monitor 2 the main monitor.
By pressing the "Fn" button on the bottom row of my laptop keyboard at the same time as the F5 button on the top, I got a menu at the top of the screen where I could choose to make Monitor 2 the main monitor. And the problem was solved!
If you can find those 2 buttons to press (or something similar), you should be able to get a menu on the top of your laptop screen where you can change your monitor, and this should really fix the problem.
Let me know whether you guys had any success!
Unfortunately I am not a computer geek, so explaining this is a bit hard for me ;)
I have been having a similar problem. First of all, almost every time the computer reboots I have to go into the control panel, click on the NVIDIA icon and tell it that I want "the same on both displays". I think it's under "setting up two displays".
I was trying to get the highest resolution on my flat panel TV and finally found a 1080p setting - I have an older flat panel monitor and a great new 34 inch Sanyo widescreen.
The problem is that if I use "the same on both displays" with the 1080p setting, the old monitor tells me that the input resolution is too high.
If I use "configure monitors separately" instead of "the same on both displays", I get an empty desktop background on the high definition!
Also, even with "the same on both displays", the screen doesn't FIT on the TV (it is hooked up with an HDMI cable). I am close to having it in REALLY high definition through the computer.
Now, the guide that I downloaded for the Media Center is two hours off - so guess what I did - I set the time on the computer back two hours! HA HA! It works great!
But I sure could use some help configuring the monitors separately?!?!?
If you're using NVIDIA, go to the control panel and add a resolution: the one you were previously using. Windows forces resolutions that don't work. I don't know about ATI, but it should have something similar for making custom resolutions, I hope. Mine was showing 16-bit color instead of 32 and would stay blank; I changed that and it worked fine. It's most likely not a hardware issue if the monitors work somewhere else.