TV Electricity Usage | Most Cited Study | 107+ Tests

Last updated: February 11, 2024.

How much electricity does a TV use? Find out, below, from the most cited TV electricity usage study. Results are based on actual TV energy usage tests, not simply TV power ratings.

This article provides an overview of key results from our highly cited TV electricity usage study, updated for 2024. 

TV wattage, amperage, and running costs results are covered in more detail separately.

The results of our TV energy usage study have been cited widely, from scientific papers and news publications to industry-leading company websites.

This article compiles the findings of all the electrical units measured.

Key takeaways:

  • TV power ratings can reach over 350W. However, their actual energy usage is less;
  • On average, a modern TV uses 58.6W in On mode, but 117W is most common;
  • TVs most commonly use 0.5W in Standby mode;
  • TV amperage is typically 1A to 3A, but the actual amp draw is lower;
  • On average, a TV draws 0.49A, but 0.98A is most common;
  • In Standby mode, TVs draw 0.01A on average;
  • The average American can expect their TV to use 302.5 Wh (0.3025 kWh) of electricity per day – 293 Wh (0.293 kWh) in On mode, and 9.5 Wh (0.0095 kWh) in Standby mode; and
  • A TV costs $0.0015 to $0.0176 for every hour of watch time ($0.000075 to $0.00045 per hour in Standby mode).

Continue reading to get more detailed insights, and to see examples of where these study results were cited.

[Image: a TV (viewed from the back), 2 multimeters, and an energy monitor.]

Most cited TV electricity usage study

Major publishers and industry-leading companies have already referenced the results of this study.

This is likely because online sources of TV electricity consumption previously relied on TV power ratings alone for their estimates, which results in considerably overestimated energy usage figures.

This comprehensive study, based on actual power consumption tests, was conducted to deliver a more accurate assessment of TV electricity usage.

These study results have been cited by / in:

  • Yahoo News, MSN;
  • Metro, The Evening Standard, The Sun, Ladbible;
  • Association for Computing Machinery (NY, United States);
  • International Journal of Advanced Business Studies;
  • Unbound Solar (a leading US solar solutions provider);
  • Power Sonic (a globally recognized battery solutions provider);
  • Heart Radio; and
  • Bluetti (solar generator brand).

The results also appear, unfortunately without citation, on websites from brands such as Jackery (solar generator brand), Anker (part of Anker Innovations with Soundcore, EUFY & NEBULA), Now Power (Texas utility provider), Payless Power, Nature’s Generator, and Gadgets Now by The Times Of India.

Eco Cost Savings: the authoritative source for power consumption and sustainability insights.

Let’s jump into the high-level results.

How much electricity does a TV use? (Tested)

On average, modern TVs use 58.6W but 117W is most common (in Standby mode, 0.5W is most common). Annually, TVs consume 106.9kWh of electricity on average, which costs $16.04 (assuming $0.15 per kWh – typical in the US).
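
As a rough check on those annual figures, here is a minimal sketch (in Python) of the underlying arithmetic. It assumes the average On-mode draw of 58.6W, roughly 5 hours of viewing per day (the assumption used later in this article), and $0.15 per kWh; the 106.9 kWh annual figure appears to cover On mode only.

```python
# Back-of-the-envelope check of the annual figures above (On mode only).
AVG_ON_WATTS = 58.6        # average measured On-mode draw
VIEWING_HOURS_PER_DAY = 5  # "nearly 5 hours per day", rounded to 5
PRICE_PER_KWH = 0.15       # typical US rate assumed in this article

daily_wh = AVG_ON_WATTS * VIEWING_HOURS_PER_DAY  # 293 Wh per day
annual_kwh = daily_wh * 365 / 1000               # ~106.9 kWh per year
annual_cost = annual_kwh * PRICE_PER_KWH         # ~$16.04 per year

print(f"{daily_wh:.0f} Wh/day, {annual_kwh:.1f} kWh/year, ${annual_cost:.2f}/year")
```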

Most online resources use TV power ratings to estimate electricity usage, not actual power consumption tests. This results in significantly overestimated figures.

TV power ratings refer to the manufacturer’s maximum expected power draw under normal operating conditions.

TVs don’t continually operate at their max power draw.

The power rating of modern TVs can reach over 250W, depending on the size, and older models like plasma TVs can reach more than 350W.

For insight into the power rating of older TVs, I completed a power rating study over a decade ago for a leading utility company. Here are some results for TVs that were considered old even at the time:

  • A 15 inch LCD TV: 40W power rating (3W on Standby mode);
  • A 28 inch CRT TV: 110W power rating (3W on Standby mode); and
  • A 42 inch Plasma flat screen TV: 350W power rating (18W on Standby mode).

Continue reading for more details on TV power rating.

These results are dated now. So let’s get back to modern TVs.

The table below summarizes the test results of the TV electricity usage study, by size category. The actual wattage used (in both On and Standby modes), actual amp draw, and cost per hour and per day (On mode only, assuming $0.15 per kWh) are listed for each TV size. The results are the average figures for modern TVs.

TV size     | Watts used | Amps used | Cost per hour | Cost per day (24 hrs) | Watts used on standby
19 inch TV  | 16.5W      | 0.14A     | $0.0025       | $0.06                 | 0.5W
24 inch TV  | 19.8W      | 0.16A     | $0.0030       | $0.07                 | 0.8W
32 inch TV  | 28W        | 0.23A     | $0.0042       | $0.10                 | 0.7W
40 inch TV  | 34.1W      | 0.28A     | $0.0051       | $0.12                 | 0.5W
43 inch TV  | 47.8W      | 0.4A      | $0.0072       | $0.17                 | 0.9W
50 inch TV  | 70.5W      | 0.59A     | $0.0106       | $0.25                 | 2.1W
55 inch TV  | 77W        | 0.64A     | $0.0115       | $0.28                 | 1.4W
65 inch TV  | 94.7W      | 0.79A     | $0.0142       | $0.34                 | 1.1W
70 inch TV  | 109.1W     | 0.91A     | $0.0164       | $0.39                 | 0.5W
75 inch TV  | 114.5W     | 0.95A     | $0.0172       | $0.41                 | 2.6W

This table shows actual TV electricity usage results based on power consumption tests – not power ratings.
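
The cost columns in the table follow directly from the measured wattage. Here is a minimal sketch (the running_cost helper is my own illustration, assuming the same $0.15 per kWh rate) reproducing the 55 inch row:

```python
# Reproduce the cost columns of the table from the measured On-mode wattage.
PRICE_PER_KWH = 0.15  # rate assumed throughout this article

def running_cost(watts: float, hours: float = 1.0) -> float:
    """Cost of running a load of `watts` for `hours` at PRICE_PER_KWH."""
    return watts / 1000 * hours * PRICE_PER_KWH

# 55 inch TV from the table: 77W measured in On mode
hourly = running_cost(77)      # ~$0.0115 per hour, as in the table
daily = running_cost(77, 24)   # ~$0.28 per 24 hours, as in the table
print(f"${hourly:.5f}/hour, ${daily:.4f}/day")
```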

Let’s briefly cover power ratings, before summarizing the study results for each electrical unit measured.

TV power rating (max watts)

TV power rating, also referred to as the manufacturer’s listed wattage, is the maximum number of watts that a TV is expected to use under normal operating conditions.

Modern TV power ratings can reach over 250W in some cases (primarily due to size), and older models (particularly old plasma TVs) can reach over 350W.

A TV’s power rating can be found on the back of the unit, in its manual, and on its AC adapter in many cases.

TV power rating is a crucial consideration when designing a circuit, or identifying causes of nuisance tripping.

However, as TVs don’t continually operate at their power rating, it’s best to use actual electricity usage figures when calculating TV energy usage, and running costs.

Old TV power rating examples have been listed above. But next, let’s take a look at TV electricity usage test results.

How many watts does a TV use? (Tested)

Modern TVs use 58.6 watts on average, which is 58.6 watt-hours (Wh) or 0.0586 kilowatt-hours (kWh) of electricity per hour.

TVs draw less power in Standby mode – 0.5 Wh (0.0005 kWh) per hour is most common.

The average American can expect their TV to use 302.5 Wh (0.3025 kWh) of electricity per day. That’s 293 Wh (0.293 kWh) in On mode, and 9.5 Wh (0.0095 kWh) in Standby mode.

This assumes your TV consumes the average wattage while On and the most common wattage in Standby mode, that you watch TV as much as the average American (nearly 5 hours per day), and that you leave your TV on standby for the remaining 19 hours.
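
Under those assumptions, the 302.5 Wh daily figure can be reproduced with a quick calculation (a minimal sketch; the "nearly 5 hours" of viewing is rounded to 5 here):

```python
# Daily TV energy use for the average American viewer, per the assumptions above.
ON_WATTS = 58.6                # average measured On-mode draw
STANDBY_WATTS = 0.5            # most common measured Standby draw
ON_HOURS = 5                   # "nearly 5 hours" of viewing, rounded
STANDBY_HOURS = 24 - ON_HOURS  # remaining 19 hours on standby

on_wh = on_energy = ON_WATTS * ON_HOURS              # 293 Wh
standby_wh = STANDBY_WATTS * STANDBY_HOURS           # 9.5 Wh
total_wh = on_wh + standby_wh                        # 302.5 Wh (0.3025 kWh)

print(on_wh, standby_wh, total_wh)
```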

See how to find out how much energy a TV uses, or get the complete TV wattage study results, here.

But let’s briefly take a look, in a bit more detail, at how many watts a TV uses in Standby mode.

TV standby power consumption (Max & tested)

TV standby power consumption typically ranges from 0.5W to 3W – 1.3W is the average, but 0.5W is most common.

Manufacturer listed TV standby power ratings are typically 3W or less.

It’s most common for modern TVs to have a standby power rating of less than or equal to 1W. But their actual power draw is commonly 0.5W.

If a TV was left in Standby mode for a full day, it’d most commonly consume 12 Wh (0.012 kWh) of electricity. 

This works out to 4,380 Wh (4.38 kWh) for a full year, which is relatively low when compared to older TVs.
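
As a quick sketch of where those standby figures come from (assuming the most common 0.5W draw and a TV left on standby around the clock):

```python
# Standby consumption if a TV sat in Standby mode 24 hours a day, all year.
STANDBY_WATTS = 0.5  # most common measured standby draw

daily_wh = STANDBY_WATTS * 24       # 12 Wh per day (0.012 kWh)
annual_kwh = daily_wh * 365 / 1000  # 4.38 kWh per year

print(daily_wh, annual_kwh)
```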

Standby power consumption, also referred to as vampire power consumption, is typically wasted electricity.

As a result, regulations are now in place in some countries requiring TV standby power draw to be limited to 1W.

This limitation has already resulted in lower TV standby power ratings, and less wasted electricity when compared to older TVs.

TV amperage (max draw)

TV amperage refers to the manufacturer’s listed amp draw, which is the max current expected under normal operating conditions. TV amperage is typically 1A to 3A.

In On mode, TV amperage increases with screen size and resolution.

However, the listed TV amperage is higher than the actual amp draw.

So let’s take a brief look at this next before we look at amp draw in Standby mode.

How many amps does a TV use? (Tested)

The amount of amps a modern TV uses ranges from 0.08A to 0.98A. 0.49A is the average TV amp draw, but 0.98A is the most common. As expected, these TV amp draw test results are lower than the manufacturer’s listed TV amperage.

TV amp-hour (Ah) results are particularly important when it comes to determining battery capacity requirements.

Per hour, TVs draw 0.49 Ah (based on the average TV amp draw).

This means that a battery with a capacity of 0.49 Ah will run the average TV for 1 hour (assuming no inefficiencies).

Per day, the average TV draws 11.76 Ah in On mode.
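
Here is a minimal sketch of how those amp-hour figures translate into battery sizing. It assumes the average 0.49A draw, continuous On-mode use, a battery rated at the TV’s supply voltage, and no inefficiencies; the runtime_hours helper is my own illustration.

```python
# Rough battery-capacity estimate from the average measured amp draw.
AVG_AMPS = 0.49  # average measured On-mode amp draw

ah_per_hour = AVG_AMPS * 1     # 0.49 Ah for one hour of viewing
ah_per_day_on = AVG_AMPS * 24  # 11.76 Ah for a full day in On mode

def runtime_hours(battery_ah: float, draw_amps: float = AVG_AMPS) -> float:
    """Hours of runtime, ignoring battery/inverter inefficiencies and
    assuming the battery is rated at the TV's supply voltage."""
    return battery_ah / draw_amps

print(ah_per_hour, ah_per_day_on)
print(runtime_hours(0.49))  # 1.0 hour, matching the example above
```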

For more detailed TV amp study insights, check out: How many amps does a TV use? [107+ tested].

Pro tip / safety note for those tempted to measure how many amps their TV uses: It’s dangerous to test amp draw using a multimeter. I’m an experienced electrician, and I’ve tested many appliances. I don’t recommend using a multimeter to test TV amp draw. I’ve listed why, and safer methods here: How to measure TV amp draw.

TVs draw considerably less current when in Standby mode.

How much, exactly? Let’s take a look.

TV amp draw on standby (tested)

On standby, modern TVs draw between 0.0042A and 0.025A, with 0.01A being the average TV amp draw. These figures are based on power consumption test results.

In terms of current flow over time, TVs draw 0.24 Ah per day, on average.
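
For reference, a quick sketch of that daily standby figure, and how it compares with a full day of On-mode use (assuming the average draws quoted above):

```python
# Daily standby amp-hours vs. a full day of On-mode use.
STANDBY_AMPS = 0.01  # average measured standby draw
ON_AMPS = 0.49       # average measured On-mode draw

standby_ah_per_day = STANDBY_AMPS * 24  # 0.24 Ah per day
on_ah_per_day = ON_AMPS * 24            # 11.76 Ah per day

# Standby works out to roughly 2% of a full On-mode day.
print(standby_ah_per_day, standby_ah_per_day / on_ah_per_day)
```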

Using the manufacturer’s listed standby amperage will overestimate the actual draw, but this may still be appropriate in some cases (e.g. assessing the maximum potential draw).

In general, TV amp draw increases with TV size, but it’s not a linear increase. For more details, see how many amps a TV uses in Standby mode, by screen size.

Now that we’ve covered TV electricity usage, we can work out the running costs.

How much does it cost to run a TV?

It costs $0.0088 to run a TV per hour, on average. 

Modern TVs cost $0.0015 to $0.0176 for every hour of watch time.

In Standby mode, the cost to run a TV drops to between $0.000075 and $0.00045 per hour.

These costs assume a kWh price of $0.15 (typical in the US), and are based on TV electricity usage test results of over 107 TVs.
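
These hourly costs follow the same watts-to-cost arithmetic used throughout the article. A minimal sketch verifying the average On-mode and standby figures (at the assumed $0.15 per kWh; the cost_per_hour helper is my own illustration):

```python
# Hourly running cost from measured wattage, at the assumed electricity rate.
PRICE_PER_KWH = 0.15

def cost_per_hour(watts: float) -> float:
    return watts / 1000 * PRICE_PER_KWH

print(cost_per_hour(58.6))  # ~$0.0088/hour for the average TV in On mode
print(cost_per_hour(0.5))   # $0.000075/hour at the most common standby draw
print(cost_per_hour(3.0))   # $0.00045/hour at the upper end of standby draw
```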

TV running costs increase along with TV size. To see the cost by TV size, and to get 8 cost saving tips, check out: Cost To Run A TV Revealed [8 Cost Saving Tips + Calculator].

Also included are the running costs by resolution.

Don’t miss…

For additional and more detailed TV electricity usage insights, be sure to check out:

  1. TV Wattage – Most Efficient TVs Revealed [With Data];
  2. How Many Amps Does A TV Use? [107+ Tested, Incl. Standby Amps]; and
  3. Cost To Run A TV Revealed [8 Cost Saving Tips + Calculator].

And don’t miss: Turn Any TV Into A Solar Powered TV: The Easy 5 Step Solution.

This article goes into more detail about Wh, Ah and how you can work them out for any TV.

Plus, don’t miss this opportunity to quickly, easily and substantially reduce your electricity bills and carbon footprint – get the 6 Quick Wins Cheat Sheet: