Right off the bat, let me say that this question is strictly theoretical. The question is below, in bold. But first, some background.
I'm looking at the gas/oil heating cost comparison calculator at nwnaturalcompare.com.
The set of numbers I'm using is as follows:
- zip code: 97211
- electric provider: PGE
- square footage: 1800
- year built: 1970
- windows: average
- number of people: 4
- number of stories: 2
- current heating: oil - forced air - standard efficiency (89%)
- cooling: none
- compare to: natural gas furnace
The results I'm getting show an estimated usage of 613 gallons of oil (with the current oil furnace), 471 therms with a new 93% natural gas furnace, and 548 therms with a new 80% gas furnace. See attached screenshot.
With the new gas furnace, the overall heat load is about 43.8 million Btu per year, according to the calculator. (That's 471 therms × 100,000 Btu/therm × 93% efficiency, or 548 therms × 100,000 Btu/therm × 80% efficiency.)
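Here's that arithmetic written out as a quick Python sketch, just to show the math I'm doing - these are my own assumptions about what the tool means by "heat load", not its actual internal method:

BTU_PER_THERM = 100_000  # assumed energy content of natural gas per therm

def delivered_heat_btu(therms, efficiency):
    # Heat delivered to the house = fuel energy burned x furnace efficiency
    return therms * BTU_PER_THERM * efficiency

print(delivered_heat_btu(471, 0.93))  # roughly 43.8 million Btu
print(delivered_heat_btu(548, 0.80))  # roughly 43.8 million Btu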
For the old oil furnace, if we use that same heat load of 43.8 million Btu and assume there are 138,000 Btu of energy in a gallon of heating oil, I think that would mean about 317 gallons of oil at 100% efficiency. Of course, oil furnaces don't operate at 100% efficiency, but to get to the 613 gallons that the comparison tool is coming up with, I believe the oil furnace would need to be operating at about 52% efficiency. (317 / 613 ≈ 0.52)
(If I go back and select a high-efficiency 90% oil furnace as the current equipment, the oil usage goes down to 571 gallons - an implied efficiency of only 56%. 317 / 571 ≈ 0.56)
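And here's the same back-calculation for the oil side, again just my own sketch of the math with the 43.8 million Btu heat load and 138,000 Btu/gallon assumed above:

BTU_PER_GALLON = 138_000   # assumed energy content of heating oil per gallon
HEAT_LOAD_BTU = 43.8e6     # heat load backed out of the gas-furnace numbers above

gallons_at_100_pct = HEAT_LOAD_BTU / BTU_PER_GALLON   # about 317 gallons
for estimated_gallons in (613, 571):                  # the tool's two oil estimates
    implied_efficiency = gallons_at_100_pct / estimated_gallons
    print(f"{estimated_gallons} gal -> implied efficiency {implied_efficiency:.0%}")
# 613 gal -> about 52%, 571 gal -> about 56%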
So here's my question: can anyone point out to me whether my thinking/math is incorrect in figuring out what efficiency the calculator is assuming for the current oil furnace?
Ignore the dollar figures that are being generated - I'm interested in fuel usage, rather than overall cost (which fluctuates based on gas/oil prices).
Thank you in advance!
Attachment: screenshot of the calculator results.