As we speak, I am recovering from computer withdrawal. You see, two days ago, everything in our house decided to die. Two sinks burst and I'm down one computer. Luckily, I got to hook up the Ubuntu box to the internet, making it the only useful computer in the house. Therefore, my parents are forced to use Linux. And the Wii still works.
But anyway...
Here's the conundrum for you people to think about:
I installed a new hard drive into the computer. I hooked the computer back up. I turned the computer on, and the monitor received no signal. I tested it in various ways and concluded that the monitor was not the problem. Also, the rest of the computer clearly worked. I took it to the computer store around the corner and was told that the video card had probably just given out. So, I grabbed a new one. I slapped it in, hooked it up, and turned it on. No picture. I tried a different monitor. Still no picture. I brought it back to the computer store. He hooked it up to his monitor. Nada. He removed the memory, turned on the computer, and got the usual three beeps. I have this guy completely baffled. Anyone have any ideas what the problem could possibly be?
Comments
Is this computer actually starting up? Do the fans turn on? Do they stay on? Can you hear the hard drive doing its usual business during boot? My bet is the power supply or motherboard is shot.
What happened to the old hard drive? Can you put it back in?
Is this a brand-name computer? Is there a small light on the side of the power supply that turns on when it has juice?
Whatever the case, if you can get through POST, or at least further into it, you should be able to get more info. If swapping the video card again still doesn't work, then Scott's almost certainly right, and the last thing to try before ditching the board is the power supply.
The power supply clearly works and the motherboard is confirmed to be, at the very least, partially working.
How about a part number for the power supply?
Both monitors worked. I tried them both with my Ubuntu box. The cable looked like this (on the right):
The input fit the cable.
And the monitor is one of these:
Scott, how am I implying analog and digital? Analog is the one that doesn't lose information over the connection, right? It either gets there, or it doesn't. And digital can lose information over the course of the transfer and still output a signal, correct? Perhaps I mean that my monitor uses analog rather than my monitor is analog? And perhaps I mean the video card outputs in digital? I'm quite confused now. It's not like the computer doesn't work or anything; I'd just like to understand what happened and whether the guy lied to me. My mom got impatient waiting for the store around the corner to fix it because they were so backed up, so we brought it to a guy I had never heard of, and this is what he told me.
Old CRT monitors are analogue by nature, and the first LCDs that became available only had analogue input. You should check out the GeekNights episode on monitors. Many LCD monitors have both analogue and digital inputs, like my own. The first time I used it, I had problems configuring the screen. It took a few hours of trying, failing, and eventually reading the manual before I understood how to select which input to use.
The cable in the picture is an RGB cable, which carries analogue signals. If your graphics card has digital output, it's the DVI standard. You cannot connect an RGB cable to a DVI plug. However, there are DVI-to-RGB converters available that are so small I guess it is possible to overlook one stuck to the graphics card or to one end of the RGB cable.
I think this whole ordeal smells a bit fishy, but I guess there is a possibility that the core of the problem is just a series of misunderstandings.