It depends entirely on your type of monitor and power supply, but you can figure on an average of about 100 watts per amp (that's a generalization, of course). If you have, for instance, a 350 watt power supply, that means it can put out 350 watts of power for the computer to use, but that rating refers to the low-voltage (DC) side of the supply, not what it pulls from the wall.

As an example, I have a CRT monitor that draws 2.5 amps, but my power supply doesn't have a rating tag, so I'd approximate its draw at 1 to 2 amps.

People tend to confuse the rated output watts with the actual draw from the AC line, but other variables come into play: power supply efficiency and power factor correction (PFC), though the basic conversion still follows Ohm's law.
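To make the amps-to-watts arithmetic concrete, here's a minimal sketch. The 120 V mains figure, the 80% efficiency, and the load numbers are just assumed example values, not anything off a spec sheet.

```python
# Rough sketch: converting between AC amps and watts, and estimating
# what a PC pulls from the wall. All numbers are assumed example values.

MAINS_VOLTS = 120          # assume North American mains; ~230 V elsewhere

def amps_to_watts(amps, volts=MAINS_VOLTS):
    """Apparent draw in watts for a given AC current (ignores power factor)."""
    return amps * volts

def wall_draw_watts(dc_load_watts, efficiency=0.80):
    """AC input power needed to deliver a given DC load at a given PSU efficiency."""
    return dc_load_watts / efficiency

# The CRT monitor example: 2.5 A at the wall
print(amps_to_watts(2.5))          # ~300 W apparent draw

# A PC whose components use ~150 W of DC power, fed by an 80%-efficient supply:
print(wall_draw_watts(150, 0.80))  # ~188 W pulled from the AC outlet
```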
With most devices you can look at the label to see how much energy they use, but that doesn't work so well with computers because the label gives the theoretical maximum, not the typical amount used. A computer whose label or power supply says 300 watts might only use about 70 watts when it's actually running.
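One way to see why the label overstates things is to add up what the parts typically draw. The component figures below are purely illustrative assumptions, not measurements, but they show how a "300 watt" machine can idle along at roughly 70 watts.

```python
# Illustrative only: rough component draws for a modest desktop,
# compared with the PSU's nameplate rating. All values are assumptions.

psu_rating_watts = 300

typical_draw = {
    "CPU (idle/light use)": 25,
    "Motherboard + RAM":    20,
    "Hard drive":            8,
    "Fans":                  5,
    "Integrated graphics":  12,
}

actual = sum(typical_draw.values())
print(f"Nameplate: {psu_rating_watts} W, typical draw: {actual} W")
# -> Nameplate: 300 W, typical draw: 70 W
```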