01-01-2009 01:32 AM
I got an HP A6200N home computer and bought a GeForce 8500GT for it. I installed it and it went smoothly. When I tried playing some games on it, I could run them faster than I could have with the old video card. The thing that tangles my head a little right now is that my power supply is a 250W, AND I can still run the 8500GT. The box for the GeForce 8500GT says the requirement is a 350W power supply.
Could someone tell me the reason for this? That would be helpful.
01-01-2009 10:25 PM
OK, thank you for the info. I expected something like that might happen eventually.
01-02-2009 11:57 AM - edited 01-02-2009 12:02 PM
A battery? Maybe you meant power supply. Get a wattage rating that will keep your video card happy (minimum 350 watts).
For the moment, why don't you attach a multimeter to your PSU's (Power Supply Unit) 12 V rail output (the black and yellow wires) and check its voltage while you are playing games or doing other intensive work on your PC? A 10% tolerance is about as much as your PSU can handle; any lower than that and you're drawing too much juice from your PSU, and it's possible you'll break it.
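To make that concrete, here is a minimal sketch of what a 10% band around the nominal 12 V rail works out to. The function name and the sample readings are my own, just for illustration, not from any tool:

```python
# Minimal sketch (assumed values): check whether a measured +12 V rail
# reading falls within a given tolerance band.

NOMINAL_12V = 12.0   # nominal +12 V rail voltage
TOLERANCE = 0.10     # the 10% tolerance suggested above

def rail_ok(measured_volts, nominal=NOMINAL_12V, tolerance=TOLERANCE):
    """Return True if the reading is within +/- tolerance of nominal."""
    low = nominal * (1 - tolerance)    # 10.8 V for a 10% band
    high = nominal * (1 + tolerance)   # 13.2 V for a 10% band
    return low <= measured_volts <= high

# Example readings taken under gaming load (made up for illustration)
print(rail_ok(11.4))  # True  - within the 10.8-13.2 V band
print(rail_ok(10.5))  # False - rail sagging, PSU is being overloaded
```

So anything between about 10.8 V and 13.2 V would pass; a reading that keeps sagging below 10.8 V under load means the PSU is overloaded.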
PSUs are cheap nowadays; just make sure to buy one that will fit properly inside your case.
Edit: Forgot to add, here is a link to calculate how much power you really need:
http://www.antec.outervision.com/
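For anyone curious what a calculator like that is doing under the hood, here is a rough sketch. Every wattage figure below is a ballpark assumption of mine for a system of this era, not output from the Antec calculator:

```python
# Rough sketch of what a PSU wattage calculator does: sum estimated
# per-component draws, then add headroom so the PSU never runs near
# 100% load. All figures are ballpark assumptions, not measurements.

ESTIMATED_DRAW_WATTS = {
    "cpu": 65,             # typical desktop CPU of the era
    "geforce_8500gt": 40,  # low-end card with a modest draw
    "motherboard": 30,
    "ram_2_sticks": 10,
    "hard_drive": 10,
    "optical_drive": 15,
    "fans_and_usb": 15,
}

def recommended_psu_watts(draws, headroom=0.3):
    """Total estimated draw plus headroom for safety and efficiency."""
    return sum(draws.values()) * (1 + headroom)

total = sum(ESTIMATED_DRAW_WATTS.values())
print(f"Estimated load: {total} W")                                  # 185 W
print(f"Suggested PSU:  {recommended_psu_watts(ESTIMATED_DRAW_WATTS):.0f} W")  # ~241 W
```

Notice how a modest system like this can total well under 250 W, which is likely why the card runs at all; the 350 W figure on the box builds in headroom for beefier systems and PSU aging.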
01-03-2009 01:06 AM
Not just HP; all manufacturers do that. They usually put in the bare minimum power supply needed to power what comes out of the box, not aftermarket products.
So you could get the Dynex power supply that has 400 watts; that's probably the best one to get right now.