Blackouts in PJM are a growing risk without changes reflecting new realities of winter peak demands

The lessons learned and the second-guessing prompted by the polar vortexes that walloped the Eastern U.S. twice in January taught us that the rules by which peak power generators meet demand on the Mid-Atlantic grid during the winter months need to be recast, and quickly.

Peak demands, at least in the PJM power grid (see map below), are no longer just an issue during summer months. Public health and safety and the U.S. economy are increasingly at risk if the rules that govern power supply are not updated to reflect structural changes and environmental constraints, especially with respect to the increasing reliance on natural gas for generation.

The buzz continues over how close utilities in PJM came to rolling blackouts during the two polar vortexes. The near misses were the result of a five-pronged challenge that will grow more serious if regulators and industry don’t respond before another record-breaking cold snap:

  1. The shift from larger, coal-fired generators to less-expensive natural gas-fueled power plants is reducing available winter generating capacity when those plants rely on interruptible gas delivery and lack adequate back-up fuel;
  2. Gas-fired generators are not required to hold firm gas supply contracts;
  3. Caps on the spot price of power in the PJM wholesale market discouraged additional power plants, whatever their fuel, from helping meet the surge in demand for electricity;
  4. What additional natural gas was available in areas further east couldn’t reach some power plants because of insufficient pipeline capacity; and
  5. Some natural gas pipeline networks froze, further constraining supplies.

That, in essence, is the diagnosis by analysts Judah Rose and his colleagues at ICF International, a consulting firm based in Fairfax, VA. They assessed how PJM grid operators saw the crunch coming and issued the requisite calls for large power users to curtail their loads. But the convergence of these five factors pushed the eastern PJM grid to the brink.

PJM set a winter peak demand record on January 7 of 141,312 megawatts (MW) of electricity used. That smashed the previous winter peak of 136,675 MW, set in February 2007. The five biggest demands – ever – on PJM’s grid, and eight of the 10 largest, occurred between January 7 and January 30.

“The grid successfully avoided disaster, but did we need to skate so close to the edge?” asked Rose, along with colleagues Dave Gerhardt, John Karp, Frank Brock, Trisagni Sakya and Mohit Shrestha.

The maximum price generators were permitted to offer into the PJM market was $1,000 per megawatt-hour (MWh). That proved to be far short of the incentive needed under PJM’s unique market-based rules. The price cap in the Electric Reliability Council of Texas power grid, by comparison, is $5,000 per MWh.

The polar vortexes that reached their peaks on January 7 and then again around January 22 and January 30 dropped temperatures below zero in the upper Midwest and near zero in the Mid-Atlantic states. At 8 a.m. and then again at 7 p.m. local time, the eastern portion of the PJM grid flirted with rolling blackouts.

Because much of the blackout risk stretched from New Jersey down through Baltimore and the metropolitan Washington, DC area, including Northern Virginia, one can imagine what the political fallout would have been. It might have rivaled the California power crisis of 2000-01 and the questions it raised about competitive power markets.

The PJM grid includes the states and portions of other states in blue. The power blackout risks in January were most pronounced in New Jersey, eastern Pennsylvania, Maryland, Delaware, Washington, DC and Virginia.

With the benefit of hindsight, the most fundamental – and fixable – problem appears to be that the cost to generate electricity significantly exceeded the cap limiting what the competitive PJM wholesale power market was permitted to offer.  Rose and colleagues wrote that this “dangerously and inadvertently created incentives not to perform, and thus, endangered grid reliability.”

Some power plants that could have ramped up their generators chose not to do so, even as other generators were forced to run while losing money. The Federal Energy Regulatory Commission (FERC) is expected to alleviate this disparity by providing “make-whole” payments to some generators. More on this later.

“There are huge potential implications for consumers and commercial entities, especially those not hedged and caught unaware,” the ICF team concluded in a series of three white papers entitled “Polar Vortex Energy Pricing Implications—Commercial Opportunities and System Reliability.”

Perhaps the most far-reaching dynamic is the growing reliance on natural gas to generate electricity during severe cold spells. PJM, along with grid operators south and west of its Mid-Atlantic footprint, has experienced and navigated numerous summer peaks in demand. But existing plans do not similarly address winter peaks. Blackout risks will remain during winter months until PJM and regulators update their response plans and mechanisms.

Contracted demand for natural gas for heating means those supplies go first to the gas utilities serving homes and commercial facilities, which have agreed up front on price and volume. Even with the steady increase in natural gas supplies from hydraulic fracturing – much of it from under PJM’s power grid in Pennsylvania’s Marcellus Shale formation – the expected retirement of more coal-fired power plants sets the stage for more near-perfect ‘storms’ of too much demand and not enough supply in the summer as well as the winter. Depending on how utilities and independent generators retire coal-fired and nuclear power plants, the capacity shortfall could get worse before it gets better.

“Careful policy and planning changes are needed to accommodate the new power gas relationship,” ICF states. “PJM needs to appropriately plan for both peaks – summer and winter – as well as provide appropriate pricing signals to generators perhaps by tying generator offer caps to gas prices rather than leaving them administratively set at $1,000 per MWh.”  Also needed: “the right balance between automatic market mechanisms and administrative action.”

Industrial facilities willing to scale back their demand for electricity (aka “interruptible load”) represent more than half of the expected reserves during certain peak demand periods in PJM. Virtually all of those arrangements are effective in the summer months, not the winter. PJM called on interruptible load wherever it could the night of January 6 and then again on January 24. On those dates, the wholesale spot (“real-time”) price for power to Pepco in central Maryland and the District of Columbia was $2,200 per MWh, more than twice the capped price. Across the Potomac River, the spot price to Dominion Virginia Power (also in eastern PJM) was even higher, at $2,600 per MWh.

To illustrate what it would have cost a temporary generator to respond, consider this scenario offered by ICF:

When natural gas prices hit $123 per million BTUs during one of the vortexes, the fuel portion of the operating cost of a marginal gas-fired peaking plant would have been $2,706 per MWh. But the generator could only bid the $1,000 per MWh maximum to cover all of its costs. Running a 1,000 MW plant under those conditions would have cost its operator $1,706,000 every hour, or roughly $287 million over a week.
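The arithmetic behind ICF’s scenario is simple enough to check. A minimal sketch, assuming a hypothetical 1,000 MW plant running nonstop (a plant size the hourly-loss figure implies, not one ICF states):

```python
# Back-of-the-envelope check of the ICF scenario; the 1,000 MW plant size
# is an assumption inferred from the hourly-loss figure, not stated by ICF.

GAS_PRICE = 123.0     # $/MMBtu, spot gas price cited during the vortex
FUEL_COST = 2_706.0   # $/MWh, fuel portion of a marginal peaker's cost per ICF
OFFER_CAP = 1_000.0   # $/MWh, PJM's maximum permitted offer at the time
CAPACITY_MW = 1_000   # hypothetical plant size

heat_rate = FUEL_COST / GAS_PRICE             # implied heat rate, MMBtu per MWh
loss_per_mwh = FUEL_COST - OFFER_CAP          # $1,706 lost on every MWh sold
loss_per_hour = loss_per_mwh * CAPACITY_MW    # $1,706,000 every hour
loss_per_week = loss_per_hour * 24 * 7        # ~$287 million over one week

print(f"implied heat rate: {heat_rate:.0f} MMBtu/MWh")
print(f"loss: ${loss_per_hour:,.0f}/hour, ~${loss_per_week / 1e6:.0f} million/week")
```

The implied heat rate of about 22 MMBtu/MWh is what you would expect of an inefficient unit at the very margin of the supply stack, which is exactly the plant a price cap is supposed to coax online.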

Maybe a request by PJM for a waiver on the price cap to reimburse plant operators for the shortfall would have satisfied some chief financial officers, but not many, if any. For the requests PJM did make for plants obligated to participate, ICF reported that PJM “consulted with counsel and believes the anticipated retroactive waiver request will likely be accepted by the FERC.” Seems to me the reliability of the power grid deserves a much more certain response.

“It is very dangerous to rely on ex post cost recovery and legal opinions of PJM regarding FERC regulations that have not been determined,” Rose stated via email.

Repeated requests to PJM’s media professionals for comment by a PJM executive yielded one substantive email response, which did not contradict or challenge ICF’s conclusions. PJM would not make an executive available for an interview.

The grid operator serving the New England states – ISO New England – where temperatures were even colder than in PJM, weathered both polar vortexes mainly because it relied far less on interruptible loads to balance supply and demand for power. That is likely to change, according to the ICF analysts, because “only 27% of (its) contracted interruptible load was actually in place.” The upshot: ISO New England could face similar risks if it doesn’t heed what happened to PJM in January.

This graph illustrates how the risk for winter power blackouts in PJM grows more than 20-fold in 2015 and 2016 (vs. 2014-15): 11,769 MW of generating capacity are scheduled for retirement in those years, while only 3,800 MW are to be added. CREDIT: PJM Interconnection

Two more problems involved natural gas pipelines. One was inadequate pipeline capacity to deliver much-needed natural gas to the power plants willing to power up. Natural gas utilities needed to draw on their maximum supplies to meet demand from regulated residential and commercial customers. That left slim pickings for everybody else. So even if a power plant wanted to jump in, there wasn’t enough natural gas to go around.

The second problem: certain natural gas pipelines did not hold up amid the frigid temperatures, as mechanical systems froze. So even where natural gas was available, certain pipeline networks, especially in Pennsylvania, were not capable of transmitting it; these pipelines had not adequately winterized their facilities. ICF stated that PJM reported 36,000 MW of generation — equivalent to 20 percent of its installed capacity — was unavailable at one point due to forced outages.

“The critical critical change in the resource mix is more gas coupled with a set of policies that do not incentive (sic) adequately firm gas supply and interruptible load which might not be actually interruptible,” Rose added via email.

Let’s face it, regulators won’t let U.S. utilities die; they SHOULD update rules to better serve customers by allocating costs to those who benefit from them

A growing tally of utilities and governments around the world are imposing or boosting fixed charges for grid-supplied electricity to make up for the decline in traditional, rate-based revenue. In the U.S. this decline has been driven primarily by ever-improving energy efficiency and the end of year-over-year increases in electricity usage. Now the demand curve looks set to stay flat or even head south due to distributed generation, particularly rooftop solar.

Think about it: fixed charges effectively penalize customers by slapping them with a fee that has little, if anything, to do with how much of a product they use. If regulators, energy utilities and policy experts today were designing rules for the first time, would anyone want to reward increased energy consumption that puts a heavier burden on, and grows the risks to, civil society, the environment and local economies? I think we can all agree the answer would be no.

The overarching issue here is not new. Debates have raged for decades over how to enable utilities to earn money from energy efficiency programs. A growing number of states, led by California and Washington state, have made significant strides, and there are many lessons to share. I’m on a constant lookout for fresh ideas that deal with the realities of how energy utilities are regulated and with the reliance on the status quo by consumer advocates and much of an industry still operating on a 100+ year-old business model.

From the earliest confrontations that reached full stride in 2013 in Arizona and California and are emerging in Nevada and Minnesota, we can see how rooftop solar and other forms of distributed generation, microgrids and active energy management are upping the ante. The challenge, and opportunity, to craft rules that can stop any utility ‘death spiral’ (though I don’t find that scenario credible) while enabling users to control their energy costs AND clean up our air and water is staring us straight in the face. It’s only going to get more challenging – and more compelling – with each passing year. Solar is getting more economical. And new technologies and smarter applications all but demand that rules catch up with modern-day realities.

The knee-jerk reaction to impose fixed charges is a cop-out. The biggest problems with fixed charges, even modest levies in the $10- to $20-per-month range, are that they:

1) Set a terrible precedent, making it easier for utilities to ask for and get higher charges in the future;

2) Dilute the incentive to conserve energy, especially when supplies are stretched due to abnormally hot or cold weather; and

3) Ignore how technologies are rapidly enabling energy efficiency and offering many consumers an even greater ability to control their consumption and generate their own power.

To be sure, there are costs that utilities need to be compensated for under any rate structure: cleaner generation portfolios, infrastructure improvements and defenses against cyber security threats. But these can be dealt with prudently while maintaining incentives to use less energy and reduce dependence on the grid. The means to think proactively are all around us if regulators and stakeholders commit to it. There are several examples to draw from to illuminate what’s working and what’s not.

At the winter meetings of the National Association of Regulatory Utility Commissioners (NARUC) earlier this month in Washington, DC, the Committee on Consumer Affairs searched for “real solutions to real challenges.” They’re grappling with questions such as, “Will consumers dictate what services they are provided in the future?” Fixed charges were not overtly on this agenda, although distributed generation is considered a “critical issue” a NARUC committee is studying.

Thus far, about half the U.S. states have decoupled gas utility revenues from sales volumes in ways that don’t rely too heavily on fixed charges, but only 16 states have adopted electric decoupling. See adjacent map. The onus is on the remaining states – especially in the electricity sector – to follow suit.


In the interest of time and this space, I’ll highlight what I think is the most compelling and efficient decoupling rate order to date which concluded last summer at the Washington Utilities and Transportation Commission and applies to Puget Sound Energy (PSE). The rate order supports ambitious but doable energy efficiency measures that could lead to significant reductions not just in pollution but also consumer bills. PSE serves more than 1 million electric customers and almost 750,000 natural gas customers.

Ralph Cavanagh, co-director of the energy program at the Natural Resources Defense Council and a widely acknowledged expert on decoupling who worked on the order, called it “a victory for everyone who cares about reliable, affordable and clean energy service in the PSE territory.” He said it “establishes an important rate model for the rest of the state, region, and nation on how to end longstanding conflicts of interest between utility shareholders and customers over energy efficiency – our cheapest and cleanest way to do more with the same amount of energy.”

Forging the consensus that led to this rate order took conversations spanning three decades; a concerted effort began in earnest in October 2012. In addition to PSE, the Commission’s staff and the NRDC, it involved the Northwest Energy Coalition and Earthjustice: no slouches in reaching a compromise. So this was no overnight success. The state’s goal of achieving “all cost-effective energy efficiency” now has a roadmap. The process is something the other half of U.S. utilities not under decoupling can learn from.

Going forward, the new rate structure for PSE will involve small upward or downward annual rate adjustments not to exceed 3 percent. It will ensure that PSE and its shareholders neither gain nor lose from changes in its retail sales. This modest adjustment is a centerpiece of most other decoupling mechanisms.
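To see how a true-up of this kind works in principle, here is a minimal sketch of a revenue-per-sales decoupling adjustment with a 3 percent cap. The function name and all the numbers are hypothetical illustrations; the actual PSE mechanism is defined in the Commission’s order, not here:

```python
# Hypothetical sketch of a decoupling true-up: a per-kWh surcharge (or credit)
# that moves actual revenue back toward the authorized level, capped at +/-3%
# of the current rate. Numbers below are invented for illustration.

CAP_FRACTION = 0.03  # annual adjustment may not exceed 3% of rates

def decoupling_adjustment(authorized_revenue, actual_revenue,
                          forecast_sales_kwh, current_rate):
    """Return the $/kWh adjustment, positive for a surcharge, negative
    for a credit, limited to +/-3% of the current per-kWh rate."""
    shortfall = authorized_revenue - actual_revenue   # under-collection if > 0
    adjustment = shortfall / forecast_sales_kwh
    cap = CAP_FRACTION * current_rate
    return max(-cap, min(cap, adjustment))

# Illustration: a mild winter cuts sales, so the utility under-collects $6M
# against $200M authorized; spread over 2 billion kWh at a 10-cent rate.
adj = decoupling_adjustment(authorized_revenue=200e6, actual_revenue=194e6,
                            forecast_sales_kwh=2.0e9, current_rate=0.10)
print(f"adjustment: {adj * 100:.2f} cents/kWh")
```

The point of the cap is visible in the code: shareholders are made whole for sales swings, but ratepayers never see more than a small annual change either way.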

What makes the PSE order (starting on p. 36 of the formal order) stand out is how it applies to virtually all sales of electricity and natural gas. It also accelerates the progress the state was making on energy efficiency with rebates for high-efficiency appliances, insulation and lighting. At the same time, PSE is to increase low-income bill assistance funding by $1.5 million.

Some of the most creative thinking on advancing decoupling comes from other experts such as Ron Binz, who for a while last year was nominated to be the next chair of the U.S. Federal Energy Regulatory Commission until conservative opposition compelled him to withdraw. After an illuminating seminar on the future of energy utilities February 4 at the Brookings Institution in Washington moderated by Binz, he offered an interesting alternative to fixed charges in concert with decoupling.

“I’m very suspicious of straight or fixed charges. Time-of-use rates won’t produce any windfalls in either direction if done properly,” Binz said. Rather than apply time-of-use rates to all customers, he proposed applying them only to the largest 20% of residential customers, and doing so on a revenue-neutral basis.

“That’s a politically acceptable group to do that with,” Binz asserted.  “About 40% of them will have bills that will be lower (than they otherwise would be) and 60 percent of them will have bills that are higher. You can mitigate how high the highest ones go. Most importantly, you can provide a foothold for smart grid benefits and create a market for companies to serve that market.”  Binz goes into some detail here.

A few participants at NARUC’s session Sunday, February 9 called for a relatively simple approach that could fit just about any rate structure because it’s already happening with mobile phone and bundled cable-internet-telephone packages. It includes three elements: 1) fixed charges to cover investments all parties benefit from; 2) energy charges to cover the cost of fuels; and 3) demand charges to account for each customer’s peak use of capacity.

David Sparby, CEO of Northern States Power, which operates in the first state with a formal process to value rooftop solar, told the NARUC Committee on Consumer Affairs: “Unlike the ’80s, we don’t have the luxury of time. We don’t have the ability to test solutions. Just drilling down on the costs puts us ahead.”

Janet Besser, Vice President of Policy and Government Relations for the New England Clean Energy Council, agreed. She said efforts to get it exactly right are “an impossible task.” Instead, she said, the focus should be on getting customers into the “categories of cost” that serve them.

Philip Dion, Senior Vice President of Public Policy and Customer Solutions at Tucson Electric Power, where fixed charges have been a hot topic throughout Arizona, cautioned meeting participants: “You’re going to have to adapt and change the way you do things. You have to enable people to make smart choices.” He urged utilities and their stakeholders to develop an integrated distributed energy plan.

Please share your ideas and what you think, or have read, that can help bring the other half of the nation’s utilities into the modern era of decoupled rates that assign costs where they belong and maybe even motivate ratepayers to make smarter energy choices.

How About a Wiki for Clean Energy to Share Best Practices for Improving Energy Efficiency, Boosting Renewables and Reducing Emissions?

There is no shortage of ideas on how businesses, governments and households in the U.S. and other industrialized countries can become more energy efficient; the same applies to growing cleaner supplies of energy while lowering harmful greenhouse gas emissions, and doing so in ways that create sustainable jobs.

There are so many ideas, in fact, how do policymakers, engaged business leaders, informed citizens, stakeholders and the media make sense of them all? Where and how can someone track what’s been proposed? How about ideas that have been adopted and the impact they are having? What lessons might we take away from laws that aren’t working as intended?

During a press briefing last month in Washington, DC designed to begin discussing 200 such ideas, it dawned on me THIS is what every industrialized country deserves: a searchable clearinghouse of laws, policies and public proposals to scale up efficiency, produce energy that is cleaner, more cost-effective and safer while tallying the new jobs they create.

With a healthy array of non-profits, academics and myriad foundations stirring the policy pot with their own ideas, who out there is willing to create such a clearinghouse?

This, of course, would be one huge undertaking. But I think it’s doable in a way that can draw from all energy / environmental / economic points of view. It might even enable critics of policies to contribute their thoughts and invite questions that deserve to be answered.

The 200 ideas came from the Center for a New Energy Economy at Colorado State University. Founded by former Colorado Gov. Bill Ritter, the Center released an extensive menu of options that do not require Congressional action. Given the stalemate there, the Center’s Powering Forward: Presidential and Executive Agency Actions to Drive Clean Energy in America now online could be an extremely helpful head start.

More than 100 leaders from private industry, utilities, academia, non-profit organizations, think tanks and elsewhere contributed their ideas with the promise their identities would be kept secret. For a handful of them, that did not matter. We’re talking about leading thinkers such as Moray Dewhurst, Vice Chairman and Chief Financial Officer of NextEra Energy, a large utility holding company and major renewable energy developer; Dennis Beal, Vice President – Global Vehicles at FedEx; and energy consultant Susan Tierney, who served as Deputy Energy Secretary under President Clinton. Among those who helped Ritter were Heather Zichal, President Obama’s former Deputy Assistant on Energy and Climate Change, and Dan Esty, Commissioner of Connecticut’s Department of Energy and Environmental Protection.

Among the recommendations, Ritter and colleagues urged the President to:

• Direct the Environmental Protection Agency to explain to states how they can be credited for reducing greenhouse gas emissions from existing fossil-fuel power plants with early adoption of new energy efficiency and renewable energy measures.
• Request that the IRS use its existing authority to issue rulings and interpretations of the tax code that increase incentives for private investors to capitalize clean energy technologies. The idea here, Ritter said, is to make the tax code “more fair by offering clean energy the same investment tools and tax benefits now given to fossil fuels.”
• More clearly define the President’s criteria for what he’s called “responsible” natural gas production. This would require that oil and gas companies use best available production practices on federal lands. States could then require these practices to be used within their borders.
• Compare full life-cycle benefits and costs of each energy resource as White House energy programs are implemented. A report could distinguish carbon-rich and low-carbon resources consistent with the President’s goals for minimizing greenhouse gas emissions most responsible for climate change.

You can read the Center’s full report here. See the report cover, at right.

Starting in the U.S., how about if we combine ideas from this Powering Forward collaboration with the 70 or so ideas put forth in the President’s Climate Action Plan? Next, we could call on the American Council for an Energy-Efficient Economy (ACEEE) to weigh in with its best ideas to further incentivize energy efficiency. The analysts there have done an enviable job of tracking and rating efficiency initiatives in all 50 states and the District of Columbia.

The faculty, students and staff managing the DSIRE database at North Carolina State University in Raleigh could pitch in with laws and other policies in place to develop sources of renewable energy in states throughout the U.S. An organization such as Resources for the Future, which has weighed in recommending more even-handed ways to regulate hydraulic fracturing of shale natural gas, could begin by reflecting on the policies in the states such as Texas, Pennsylvania and others that are producing the most natural gas with the fewest safety mishaps while controlling methane emissions.

These and countless other ideas could be submitted using a spreadsheet, web form, or some other template designed to organize the policies, laws and ideas consistently into a public database.

Reactions from the few energy policy experts I shared this with ranged from head-shaking disbelief (over why I think this is even possible) to encouragement from the likes of Ritter, Presidential Climate Action Project Executive Director Bill Becker and Michael Northrup, who directs sustainability programs for the Rockefeller Brothers Fund. Each of them saw the value but also implied the herculean effort it would entail. But they did not say no.

We gotta start somewhere. Who’s up to it?

Since I put this call out via The Energy Collective in my late January “Game Changers” column, I’ve had an interesting exchange with Evan Juska of The Climate Group, based in London, about its aspirations along similar lines at The Clean Revolution. But nobody, apparently, is close to achieving this. The Clean Energy Solutions Center is notable for its global reach, as is the Renewable Energy Policy Advice Network (REPAN), a collaborative effort between the Clean Energy Solutions Center and the International Renewable Energy Agency (IRENA).

How much is a homeowner’s rooftop solar system worth to civil society?

How much value should owners of rooftop solar systems receive credit for in reducing their use — and the environmental impact — of utility-generated electricity?  That is THE question Minnesota is tackling, the first state ever to do so.

It is one of the most closely-watched state energy initiatives in the U.S. because of the impacts it could have on the solar industry and tradition-bound electric utilities that see solar as a threat to their 100+ year-old business model.

Minnesota’s Department of Commerce was expected to submit a methodology for figuring the value of solar to the Minnesota Public Utilities Commission by January 31, 2014, but the “initial” comment period was extended through Thursday, Feb. 13, 2014, with replies to those comments due on February 20, 2014. The Commission is scheduled to take this up on March 12, 2014.

Should a ruling and a rate emerge from the Commission later this year, it is expected only to be a voluntary alternative that utilities could choose if they don’t want to credit homeowners through net metering for the surplus power their solar systems push back out on to the grid. Either way, a rate case could provide a more practical path forward for valuing rooftop solar systems while providing a real-world example for other states to do something similar.

Because the Minnesota Legislature launched this initiative while requiring investor-owned utilities in the state to supply at least 1.5 percent of their energy output from solar power by 2020, the push for cleaner energy in “The Land of 10,000 Lakes” appears to be strengthening; I’m guessing that green ethic will be reflected in the Commission’s decision and it will withstand the predictable onslaught of critics trying to defeat it.

The Legislature is requiring that any formula include the projected damage to the climate that can be attributed to power plant carbon emissions. Because carbon-free solar power reduces that damage, solar owners could receive a different, but not additional, benefit from their utility. Either way, stakeholders on all sides are watching the proceedings very closely.

So how DOES a state calculate the value of a rooftop solar system?

Clean Power Research, a consulting firm, drafted the methodology for the Department of Commerce. The firm based it on a computer modeling of the “social cost of carbon” developed under appointees of President Barack Obama at the Environmental Protection Agency (EPA).  EPA acknowledges, as several critics of the effort have asserted, that it raises questions about climate science and environmental economics, along with a host of ethical considerations.

The value of solar calculation is supposed to determine the pro-rated costs that a solar system spares the utility from incurring. These are known as “avoided costs.”  The biggest avoided costs are the fuel purchases displaced by the solar generated electricity. Avoided costs also include the cost of maintaining and, if necessary, adding to the high-voltage lines that make up its transmission network;  the distribution lines that deliver grid power into homes and businesses; and upgrading meters to help utilities and their ratepayers get smarter about their usage.

No matter which way you approach it, the value of solar will be a subjective calculation. And therein lies the rub. If there is no actual cost (e.g. a carbon tax), utilities will be quick to argue there is no cost to avoid. Furthermore, the final number depends greatly on how to discount future damage and convert it into current-day dollars (aka “present value”).

That’s precisely the point made by Brian Draxten, manager of resource planning for Otter Tail Power in Fergus Falls, Minn., who argued that because the utility currently pays no money, it’s only a cost in theory. “Until there is money we’re avoiding, the whole thing should be zero,” he told Energy & Environment Publishing earlier this month.

The draft formula drawn up by Clean Power Research results in a solar value of 12.6 cents per kilowatt hour. About half of that amount comes from costs avoided for coal, natural gas and other fuels burned to run generators in Minnesota. For the damage to the climate avoided, the formula assigns a value of just under 3 cents per kilowatt hour.  See table for the complete breakdown.
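The structure of the calculation, a levelized sum of avoided costs, can be sketched using only the rounded figures above. Note the “other avoided costs” line is simply the remainder, not a number from the Clean Power Research filing, and the actual draft breaks the total into many more components:

```python
# Illustrative breakdown of the draft Minnesota value-of-solar figure, using
# only the rounded numbers cited in the text; the real Clean Power Research
# calculation has more components and different precision.

components_cents_per_kwh = {
    "avoided fuel and related costs": 6.3,  # "about half" of the 12.6-cent total
    "avoided climate damage": 2.9,          # "just under 3 cents"
    "other avoided costs": 3.4,             # remainder: capacity, wires, etc.
}

# The value of solar is the sum of the per-kWh avoided costs.
total = sum(components_cents_per_kwh.values())
print(f"value of solar = {total:.1f} cents per kWh")
```

The debate described above is really a fight over which rows belong in that dictionary and how each one is priced, not over the summation itself.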

Sample Value of Solar – Levelized Calculation Chart for Minnesota by Clean Power Research, 2013


The chart reports a Levelized Value for each component, broken out into three columns: Load Match (No Losses), Loss Savings and PV Value. The components are: Avoided Fuel Cost; Avoided Plant O&M – Fixed; Avoided Plant O&M – Variable; Avoided Generation Capacity Cost; Avoided Reserve Capacity Cost; Avoided Transmission Capacity Cost; Avoided Distribution Capacity Cost; Avoided Environmental Cost; Avoided Voltage Control Cost; and Solar Integration Cost. (na: not applied)




Minnesota may be the first state to value solar power in the context of climate change. (Track comments and future developments here.) But Austin Energy, the municipal utility in the Texas capital city, established a value that this year is set at 10.7 cents per kilowatt hour. (It was 12.8 cents in 2012 and 2013.) That’s about 1 cent per kilowatt hour higher than the standard rate customers pay for power there, thanks to the city’s growing reliance on relatively low-cost natural gas. Even so, the differential has triggered a wide range of supportive and critical responses, as I predict the Minnesota effort will.

A controversial 2013 report for the industry’s trade association about disruptive challenges facing investor-owned utilities calculates that the average residential customer pays an electric bill of $110 per month, $60 of which it asserts covers fixed costs. Solar advocates challenge that number, arguing the amount is far smaller when all the benefits of on-site generation are accounted for. While there is no groundswell yet pushing to value rooftop solar, the very attempts to value solar are setting the stage for a substantive discussion that is almost certain to gain momentum.

Karl Rábago, who served as the Vice President of Distributed Energy Services at Austin Energy and now consults about cleaner and smarter energy, urged stakeholders to keep an eye on how the California utility commission follows through on the state’s new AB 327 law and next steps in Georgia Power’s Advanced Solar Power Initiative.

This generally is how substantive changes in utility rates take root, get refined and, over several years, get adopted as more utility commissioners weigh both the rationale and the data behind such valuations. The upshot: five years from now, we’ll look back on what Austin has done and what Minnesota is trying to do as the game-changing sequence of new policies that will give solar power its due.

Actions speaking louder than advocacy: Microsoft programming for profit with carbon neutrality

The number of companies taking tangible steps to prepare for and prosper in a lower-carbon economy is growing — even absent a national carbon tax or cap-and-trade regime.  Their actions speak much louder than the hyperbole that many advocates are using to argue for and against paying for carbon emissions.

One of the most instructive examples of what companies can do to hold themselves accountable comes from Microsoft, which uses a lot of energy running its data centers and sending professionals around the world on airplanes, emitting a lot of greenhouse gases in the process.

It’s nothing novel for a company to say it’s striving to become carbon neutral. Detailing and sharing a way to get there with an internal carbon fee is. Microsoft’s move to do just that offers a compelling example to corporations, governments, academic institutions and non-profit organizations around the world that for their own sustainability, they should – no, they need to – do the same.

At least 28 other global companies – including a handful of oil and gas companies — are headed in the same direction. Let’s hope this is a substantive and growing mind-shift that will propel thousands more organizations to account for their climate impact. By doing so strategically, such forward-looking companies can outperform their rivals over the long haul and help their communities, markets and civil society adapt to climate change.


Internal Carbon Prices Estimated by or Shared with CDP, Inc.

[Table: price per ton in U.S. dollars for ExxonMobil Corp., Royal Dutch Shell, Ameren Corp., Xcel Energy Inc., Walt Disney Co., Devon Energy Corp., Google Inc. and Microsoft Corp. The prices themselves did not survive formatting. *external estimate]

If a carbon fee or allowance of some sort becomes law somewhere in the world, these companies will have a big leg up on their competition.  Their systems, buildings and culture will already be programmed for profit.

How Microsoft stepped up to the plate can be found in its Carbon Fee Playbook. I’ll dive in with a quick summary. I’ll leave a complete walk-through to you using the links embedded throughout. Note: all mentions of “carbon” refer to all greenhouse gases.

There are three primary, simple components to the Playbook: 1) an organizational carbon reduction policy; 2) a price on carbon; and 3) a carbon fee fund investment strategy. The price on carbon is set by calculating what it will cost to execute the company’s carbon reduction strategy.

With that, there is a five-step process to help organizations implement it:

  1. Calculate your carbon impact.
  2. Establish how you’ll reduce carbon emissions; then decide how you’ll invest the funds raised by your internal carbon fee.
  3. Determine internal carbon price.
  4. Gain executive approval, establish governance procedures and set feedback loops.
  5. Administer the fee, communicate results and adapt company actions to increase its impact.

The internal cost of energy at Microsoft includes the price it pays each of its utilities for electricity and/or natural gas, as well as the price it pays to offset the carbon emissions associated with that usage. For business air travel, the cost includes the price per airline ticket – broken out by class – and the price it pays to offset the carbon emissions associated with each flight, using algorithms from American Express and the U.S. Environmental Protection Agency. It established targets for reducing air travel and is striving to achieve them by increasing its use of collaboration technologies.

The carbon fee is divided up among the business units responsible for the resource consumption. Microsoft is quick to assert there is no “grandfathering” as is found in cap-and-trade schemes.  By doing so, the fee helps to educate the business units on their carbon impact and serves as a reminder of ways they should try to innovate and become more energy efficient.
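A minimal sketch of that mechanism in Python. The carbon price, the business-unit names and the tonnage figures are all hypothetical, since Microsoft does not disclose its internal price:

```python
CARBON_PRICE = 6.50  # USD per metric ton CO2e -- hypothetical figure

# Hypothetical emissions (mtCO2e) attributed to each business unit
emissions_by_unit = {
    "datacenters": 40_000,
    "offices_and_labs": 15_000,
    "air_travel": 25_000,
}

# No grandfathering: every unit pays the fee on every ton it emits.
fees = {unit: tons * CARBON_PRICE for unit, tons in emissions_by_unit.items()}

# The collected fees feed the carbon fee investment fund.
total_fund = sum(fees.values())
print(fees["air_travel"], total_fund)  # 162500.0 520000.0
```

The point of charging each unit directly, rather than absorbing the cost centrally, is the feedback loop: a unit that cuts its tonnage sees its fee fall the next cycle.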

Josh Henretig, Microsoft’s Director of Environmental Sustainability, said the company’s biggest challenge has been to shift responsibility for reducing carbon emissions from the company’s six-person sustainability team to managers in charge of every operation that uses energy or otherwise emits carbon.  “We’re changing the culture of accountability throughout the company,” he said in an interview.

The company’s evolution from mostly a software products company toward more of a devices and services enterprise has expanded the challenge, especially as it develops more cloud-based services requiring data centers throughout North America and the world.  Increasingly the impact of the company’s ever-growing supply chain is figuring into the mix.

Following the Playbook, Henretig said Microsoft first achieved carbon neutrality in its fiscal 2013 year, which ended last July, raising about $10 million in the process. Its expanding carbon footprint means the company won’t be able to take its eyes off that distinction, and may fall short in future years.

Microsoft set targets for reducing energy consumption in its data centers, labs and offices. A self-developed, enterprise-wide energy management program deploys seven different building management systems to help office engineers get a grip on ways to reduce their energy consumption. The company operates in 118 buildings housing about 30,000 pieces of energy-using equipment (not including laptops) encompassing about 15 million square feet of space. The result: 500 million data points every day. At last count, Microsoft has 100,500 employees worldwide.

To get greener, Microsoft says it is signing long-term renewable power purchase agreements wherever it makes sense. Where it cannot buy power from wind, solar and/or hydro systems to directly supply its data centers, financial managers are purchasing market-based renewable energy certificates (RECs) along with carbon offsets. These give the company credit for buying renewable energy and reducing carbon emissions in locations beyond its network of facilities, because those purchases still benefit the environment and supply the regional power grid, which Microsoft and its neighbors share.

To help ensure access to power for its large data center servers near San Antonio, Microsoft purchased RECs in signing a 20-year wholesale power agreement for electricity from a 110 megawatt wind farm near Fort Worth – the Keechi Wind Project – set to open in 2015. Because Microsoft can only buy electricity from its regulated retail provider, CPS Energy of San Antonio, the wind turbines will supply electricity to the Texas grid, which CPS Energy can draw from.

“We’re definitely looking at this as a first of (its) kind, but it fits into our overall desire to have more control over our energy supply,” Brian Janous, Microsoft’s director of energy strategy, told the San Antonio Express-News in November.

On a parallel path, program leaders are striving to achieve targets for generating less waste and using less water.

Mindy Lubber, President and CEO of Ceres, says this strategy’s simplicity is what makes it transferable. “It can be adapted easily to fit other corporations, nonprofit groups and government agencies,” she said. “The basic formula is universal (carbon emissions multiplied by carbon price equals carbon fee); it’s simply a matter of tweaking the model to fit an organization’s structure, financial processes, and individual goals.”

The foundational building block for reducing all greenhouse gases is a carbon emissions inventory. This enables the company, the Playbook states, to “institutionalize a process for collecting, calculating and maintaining carbon data.”

Microsoft created seven major sections of information to track emissions data:

  1. Organizational information, including in-house project managers;
  2. Boundary descriptions;
  3. Quantification methodologies and emission factors;
  4. Data sources, collection processes and quality assurance;
  5. Establishing a base year for structural and methodology changes;
  6. Management tools, including individual roles and responsibilities, training and data maintenance;
  7. Auditing, verification and corrective actions needed.

Carbon emissions from operations are measured in metric tons of carbon dioxide-equivalent (mtCO2e). To quantify its emissions, Microsoft multiplies the organizational activities and use of resources by specific emissions factors. The resources include electricity consumption in kilowatt hours and commercial air travel in passenger miles by class of travel and the amount of space each employee uses.
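A toy version of that activity-based quantification; the emission factors and activity quantities here are hypothetical placeholders, not Microsoft’s figures (real inventories use region-specific grid factors and published air-travel factors):

```python
# Hypothetical emission factors, in mtCO2e per unit of activity
FACTORS = {
    "electricity_kwh": 0.0005,    # per kWh of electricity
    "air_economy_mile": 0.0002,   # per passenger mile, economy class
    "air_business_mile": 0.0004,  # per passenger mile, business class
}

def inventory(activities):
    """Multiply each activity quantity by its emission factor and sum,
    yielding total emissions in mtCO2e."""
    return sum(qty * FACTORS[kind] for kind, qty in activities.items())

total_mtco2e = inventory({
    "electricity_kwh": 1_000_000,
    "air_economy_mile": 500_000,
    "air_business_mile": 100_000,
})
print(total_mtco2e)  # 640 mtCO2e for this hypothetical year
```

Tracking travel by class matters because a business-class seat occupies more of the aircraft, so it carries a larger per-mile factor than an economy seat.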

To monitor and report on an organization’s emissions inventory and provide up-to-date access to data across the organization, Microsoft deploys software developed by Australia-based Envizi (formerly CarbonSystems) on the company’s Windows Azure cloud platform.

With its access to low-priced electricity at its Redmond, Washington headquarters, supplied in part by hydro-electric generating stations operated by Puget Sound Energy, some data crunchers estimate Microsoft’s carbon price to be between $6 and $7 per ton. Henretig would not disclose what its internal price is. Sharing lessons learned only goes so far.

With LED bulb options growing, how smart are you about more efficient lighting?

The tipping point in the adoption of light-emitting diode (LED) bulbs in households around the globe is rapidly approaching as 40-watt and 60-watt incandescents are phased out in the U.S. and their equivalent LED lamp bulbs drop below the closely-watched $10 (U.S.) price point.

The race in many countries to meet the growing interest in LEDs is seeing increasingly lower prices as manufacturers active in the U.S. team with retailers such as Home Depot, Staples, Walmart and Ikea to maximize their market share. That’s a game-changing shift in the residential market, which has been slower to take shape than many analysts predicted. The second half of 2013 in the U.S. demonstrated that consumers there are waking up to the advantages of LEDs.

With their roughly 20,000-hour rated lifetimes (based on 3 hours per day) and 3+ year limited warranties, LEDs can boost household, commercial and industrial energy efficiency, helping users save on power bills, making it easier to limit regional power plant emissions and preserving the integrity of power grids. Projected energy savings are 80+ percent, meaning many consumers can recoup their purchase price within 5-7 years compared to traditional incandescent bulbs. This, of course, depends on how much they use them. Commercial and industrial payback periods can be as low as 6 months to 2 years. You can find a quick Energy Star LED primer here.
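For readers who want to sanity-check those payback claims, here is the simple arithmetic; the bulb wattages, price and electricity rate are illustrative assumptions, and heavier daily use shortens the payback:

```python
def payback_years(bulb_price, old_watts, led_watts, hours_per_day, cents_per_kwh):
    """Years to recoup an LED's purchase price from electricity savings alone."""
    kwh_saved_per_year = (old_watts - led_watts) / 1000 * hours_per_day * 365
    annual_savings = kwh_saved_per_year * cents_per_kwh / 100  # in dollars
    return bulb_price / annual_savings

# A hypothetical $9.97 60-watt-equivalent LED drawing 10 W, run one
# hour per day, at an assumed 12 cents/kWh:
years = payback_years(9.97, 60, 10, 1, 12)
print(round(years, 1))  # 4.6 years
```

Run the same bulb three hours a day and the payback drops to roughly a year and a half, which is why high-use fixtures are the place to start.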

The increasing options with LED lighting come as more countries around the world proceed with laws or regulations phasing out the manufacture, importation and/or sale of traditional incandescent and fluorescent bulbs. Compact fluorescent bulbs (CFLs) have made inroads for the energy they can save. But they’ve also created confusion because some of them take time to warm up, are not dimmable, give off harsh light, contain mercury and/or cannot be recycled. Not so ‘green’ and convenient after all.

How long will it take LEDs to win the day among most households? Consumers who’ve been relying on incandescents seem to be catching on. Switching from halogen bulbs is a tougher sell due to their lower prices (vs. LEDs) and notable energy savings without the baggage of CFLs.


Here is how LEDs are projected to become a bigger share of global household lighting sales, 2012–2021. CREDIT: IHS, Inc.

IHS Inc. research analysts Stephanie Pruitt and Stewart Shinkwin forecast that sales of LED lamp bulbs (shapes A17–A21) will grow significantly each year until they are the most-shipped technology of those shapes in 2019. They predict global sales of all types of LEDs will grow from about $3.6 billion in 2013 to slightly more than $7 billion in 2016.

Durham, NC-based Cree, Inc. effectively launched a consumer price war over LEDs in the U.S. this past April. With their TV ads leading the way, one competitor after another is responding. Because LEDs have such long useful lives, each consumer purchase today is one less purchase in the market for a long time. The starting gun has sounded.

After hovering just under $20 for years, Cree’s 40- and 60-watt equivalent bulbs with dimmable “soft white” lighting could be found at Home Depot this month at or close to $5.97. Its 65-watt equivalent, soft white, indoor flood bulb is almost sure to drop toward $10. The same seems in store for the company’s just-announced 75-watt equivalent soft white bulb, which is due to reach stores in the U.S. in February. All these prices are without utility rebates.


So many choices: This aisle dedicated to household light bulbs at a Home Depot in northern Virginia is about 100 feet long. CREDIT: Jim Pierobon

At this writing, the race for market share in the U.S. and other countries appears to be primarily between Cree and Philips. The “Ecosmart” brand, assembled in China, can often be found close by on store shelves. Osram Sylvania, SunSun Lighting and Switch are making strides, while others are holding back when it comes to competing on price. Analysts said that’s because profit margins for household lighting are so slim – and are likely to get slimmer until sales and manufacturing volumes pick up significantly. Private label offerings by Ikea and Walmart (by GE) stand to gain traction as well.

“Nobody expected to see the $10 price point this quickly. It was almost a shock,” said LUX Research analyst Pallavi Madakasira.

Said IHS’ Pruitt and Shinkwin: “This increased competition has led to quicker technological advances and decreasing prices as more companies attempt to become major suppliers of different LED lamp and luminaire technologies. It has also forced the major lamp and luminaire companies to increase their investments in these technologies.”

LED bulbs in various forms and for myriad applications have been available for several years. But price points above $20 for most common household uses had failed to impress consumers. The start of 2014 promises to up the ante in the U.S. as the phase out of standard 40-watt and 60-watt bulbs runs its course.

Carr Lanphier, an equity analyst with Morningstar who follows Cree, said he’s expecting a “downward trajectory” of bulb prices driven not just by competition between Cree and Philips but also by more Chinese manufacturers, heavily subsidized by their central government, entering the market. Samsung, Toshiba and 3M are expected to become forces in the global market, too. Together, they are sure to keep downward pressure on prices as more LED bulbs of different types and wattages show up on store shelves.

With larger homes, more lights and more disposable income to invest in energy savings, the U.S. and Western Europe may be the most lucrative residential playground, for now. But China, with its pressing need to limit harmful emissions by controlling demand, has the most to gain air-quality-wise with LEDs, Morningstar’s Lanphier said.

What apparently gives Cree and Philips an edge on the rest of the industry, at least for now, is how their bulbs reduce the heat normally generated by light bulbs. The transistor materials and engineering needed to accomplish that amount to almost one-third of the cost of making LEDs, according to Madakasira. Each company has deployed proprietary technology reducing that component to about 10% of its total cost. Other companies either cannot yet or aren’t as willing to make such an investment while margins in the residential market continue to shrink.

Mike Watson, Cree’s Vice President of Product Strategy, said in an email that the tower in its lamp bulbs “acts as both a heat sink and mimics the tungsten filament found in the center (of) traditional incandescent bulbs by lifting the light-emitting diodes to the center of the bulb. By keeping the same form factor, we’re able to save in material and manufacturing costs, which translate to lower costs for consumers.”

Watching Cree’s TV commercials, there’s little doubt that the public company wants to be the leading brand, at least in the U.S. “The campaign is as much for the company as it is for their bulbs,” said Lanphier, of Morningstar.

Cree’s Watson said the response to the campaign has been “enormously successful. It has increased awareness of Cree, increased consideration of Cree bulbs (and) increased advocacy.” Watson asserted that Cree outsells all other LED bulbs combined at Home Depot stores in the U.S. (See typical lighting aisle display in accompanying photo.)

A quick coast-to-coast survey of rebates available in the U.S. indicated that many utilities offer them, mostly for commercial retrofits and some new commercial construction. A partial list of the types of commercial rebates in the U.S. is here. A link to rebates by utility in the U.S. is here.

Watson said utilities in California, Washington state, Massachusetts, Rhode Island and Arizona are offering rebates on household bulbs with more expected soon.

For you LED diehards out there willing to explore exactly what you get for the price, here’s a list of features to compare different bulbs:

  • Energy used, in watts, with the incandescent watt-equivalent;
  • Lumens (brightness);
  • Color quality equal to natural light, aka “Color Rendering Index” (the higher the better, e.g. at least 80);
  • Rated life (hours, typically 22,000 or more);
  • Correlated color temperature (“warm” or “soft” is approximately 2,700 K, for Kelvin; “day light” is approximately 5,000 K);
  • Beam spread: spot or omni-directional;
  • Dimmable.

One distinguishing feature of some LEDs in the U.S. is whether they comply with California’s voluntary light quality standard and whether their light quality is close to that of an equivalent incandescent bulb. The Color Rendering Index is a useful measure here.

Another distinction in the U.S. is whether an LED bulb is Energy Star qualified. According to the U.S. Department of Energy, LED lighting products must pass a variety of tests to prove that:

  • Brightness is equal to or greater than existing lighting technologies (incandescent or fluorescent) and light is well distributed over the area lighted by the fixture;
  • Light output remains constant over time, only decreasing towards the end of the rated lifetime (at least 35,000 hours or 12 years based on use of 8 hours per day);
  • Excellent color quality: the shade of white light appears clear and consistent over time;
  • Efficiency is as good as or better than fluorescent lighting;
  • Light comes on instantly when turned on;
  • No flicker when dimmed; and
  • No power draw when the bulb is turned off.

Ah, but comparing most or all of these features online or even in a store will be difficult. Not all manufacturers and retailers promote the same features.  Good luck. Wherever you are, share here what experiences you’re having using LED lighting in your home.

With lower filibuster threshold, new Obama appointees on D.C. Circuit Court portend stricter emissions limits

The U.S. Senate’s confirmation Tuesday of Patricia Millett to the D.C. Circuit Court of Appeals, along with two other confirmations to that court expected by year’s end, increases the odds that stricter regulations on U.S. power plant emissions, which President Obama has not been able to achieve through legislation, will stand up in court.

The ascension to the “D.C. Circuit,” as the court is often called, of Millett, Nina Pillard and Robert L. Wilkins (see accompanying photo) means key air quality rules set by the U.S. Environmental Protection Agency (EPA) are more likely to withstand a rear-guard assault by utilities and heavy industry.


U.S. Circuit Court of Appeals nominees (from left) Robert Wilkins and Cornelia Pillard, along with newly-confirmed Patricia Millett, are expected to defend President Obama’s push for stricter emission limits on future and existing U.S. power plants. CREDIT: WhiteHouse.gov

The shift came in the Senate’s Nov. 20 vote to change its rules, reducing from 60 to a simple majority the votes needed to end the storied practice of filibustering the administration’s nominees to executive branch positions and to the federal courts, including the D.C. Circuit.

It’s in the D.C. Circuit where utilities, energy and industrial companies are challenging new rules taking shape at the EPA to implement Obama’s much-anticipated limits on emissions of carbon dioxide. The D.C. Circuit has the responsibility of directly reviewing the rule-makings of federal “independent” agencies, such as the EPA, and very seldom does a case need a prior hearing by a U.S. District Court.

If the Senate also confirms Pillard and Wilkins for the remaining two open slots on the D.C. Circuit, such rules would very likely be upheld upon appeal. In that case, Obama would achieve an important win by regulatory fiat.

Prior to this week, Obama had succeeded in filling only one of four vacancies on the 11-seat court. The eight full-time judges have been split evenly between those presumed to agree with EPA rules and those opposed. As the two other vacancies are filled, the scales are expected to tip consistently in favor of those agreeing with the EPA, according to several court observers.

Officially, the latest Obama climate plan is striving to reduce U.S. carbon-dioxide emissions 17 percent below 2005 levels by 2020. Thanks largely to cheap natural gas from shale unleashed by hydraulic fracturing, U.S. energy-related carbon dioxide emissions have actually declined about 12 percent below their 2005 level. That’s without any new and tougher rules from the EPA. But much of that decline also occurred during the deep recession. As the economy recovers, CO2 emissions are resuming their upward trajectory.

Lobbyists and legal scholars on both sides of this debate are re-forging strategies to defeat — or support — EPA’s existing rules and whatever new rules the EPA produces.  This battle has become the defining energy / environmental policy contest for the next three years, and perhaps beyond because of the stalemate in Congress.

It was a 2007 decision by the Supreme Court that set the stage for this showdown. The EPA, the Court affirmed, is required under the Clean Air Act to regulate carbon dioxide if it poses a threat to public health and welfare. The EPA has been using that authority to propose carbon standards for coal- and natural gas-fired power plants not yet built. But Obama’s climate team is ready to push hard with updated regulations in 2014, to be enforced starting in 2015, which would also focus on existing power plants.


Here is how U.S. energy-related CO2 emissions have increased, decreased and are expected to resume their upward trajectory. CREDIT: U.S. EIA

By most measures, existing power plants are responsible for 40 percent of the nation’s CO2 emissions. That piece of the Obama strategy could speed up the closure of power plants; along the way it could pave the way for renewable sources of electricity and shine a brighter light on reducing demand through energy efficiency initiatives. It could also further accelerate the growth of natural gas-fired generation because of its lower emissions relative to burning coal.

In his long-anticipated climate speech this past June, Obama said his administration’s plan to tackle emissions from existing plants calls for the EPA “to build on state leadership, provide flexibility, and take advantage of a wide range of energy sources and technologies including many actions in this plan.” Here is how the EPA is receiving inputs that could frame the new regulations.

To be sure, there are numerous rule-making mechanics that could help – or hinder – the policies the EPA opts to pursue. Success or failure of an appeal at the D.C. Circuit may rest on how much legal risk the EPA wants to assume. The more aggressive the limits on emissions, the bigger the risk that an appeal will succeed; a more modest target might not draw as potent an appeal effort.

Fast Fix: The public list of companies preparing for a carbon tax keeps growing, but . . .


. . . the prospect that the U.S. and other governments will focus their nations’ economic priorities on a cleaner energy future and a healthier environment with a carbon tax or cap-and-trade program seems as bleak as ever. But for how much longer?

If you know of others, help me keep track here of companies incorporating a carbon price into their strategic planning by commenting on this post or emailing me at:

Based on this week’s report in The New York Times, this year’s carbon pricing report by CDP (formerly known as the Carbon Disclosure Project) and other sources, this list of 29 mostly U.S.-based companies includes, but is not limited to:

  1. Ameren
  2. American Electric Power
  3. Apache
  4. BP
  5. Chevron
  6. CMS Energy
  7. ConAgra Foods
  8. ConocoPhillips
  9. Cummins
  10. Delta Air Lines
  11. Delphi Automotive
  12. Devon Energy
  13. Duke Energy
  14. DuPont
  15. Entergy
  16. ExxonMobil
  17. General Electric
  18. Google
  19. Hess
  20. Integrys Energy
  21. Jabil Circuit
  22. Microsoft
  23. PG&E
  24. Shell
  25. Total
  26. Xcel Energy
  27. Wal-Mart
  28. Walt Disney
  29. Wells Fargo

Mapping Whole Communities and Homes with Thermal Images to Reduce Heat Loss

What if you could determine how much heat your home is losing, how much it’s costing you in higher energy bills and what insulation and other improvements would fix it? Oh, and you could accomplish all this without an energy audit, free of charge and from a mobile device, no less?

That’s the vision of Geoffrey Hay, an Associate Professor of Geo-Information Science at the University of Calgary, and a team of seven students. Together they earned top honors at the Massachusetts Institute of Technology’s Climate CoLab annual Crowds & Climate conference Nov. 6-8 for an app they’re calling Heat Energy Assessment Technologies, or HEAT.

Part of a HEAT screen capture showing a score. CREDIT: University of Calgary

Their mission is to show “what urban energy efficiency looks like, where it is located, what it costs and what to do about it.” From its initial development, HEAT’s vision has been to “empower the ‘urban energy efficiency movement’ by providing free, accurate and regularly updated waste heat solutions for the world.”

Pretty heady stuff, yes. But it appears doable. And because it’s scalable and could be programmed for warm-weather climes to identify cool air escaping homes, the sky just might be the limit. (Sorry about the unintended pun.)

In their run-up to the MIT conference, Hay asserted, “We believe that if people could see the invisible waste heat they generate and if they know how much it cost (financially and to the environment), they would want to take action. We want to show them how.”

Now anybody can hire a fully-equipped home energy auditor with a thermal imaging camera to identify where a home is losing heat, or cooled air. But doing it on this scale and using the results to motivate thousands of individuals, community leaders and energy conscious policy makers?  That is a huge leap of faith.

“The biggest obstacle in the way is the lack of interest,” Hay is quick to acknowledge. “How do you engage in something you cannot see?”

“Individuals are motivated much more by their perceptions of what other people do and find acceptable than they are by other factors such as the opportunity to save money or conserve resources, contrary to even their own perceptions of motivation,” Hay said.

Hay began developing the software about four years ago after he and his partner moved into a new house only to find it was unexpectedly cold inside. This despite a smart thermostat, triple-pane, low-E windows, a very efficient furnace and R50+ attic insulation. So why so cold? Hay wanted to know where and how the heat was escaping.

“Wouldn’t it be great,” Hay told a TEDx Calgary audience this past summer, “if I could pull out my mobile device, click on Google Maps, tap on my house, and automatically bring up a thermal image” that would answer those questions, at no charge?

Hay assembled a team of seven principal student engineers and software developers. They launched into their research and secured about $1 million in funding. They started collaborating – aka crowd-sourcing — with six professors at American, Austrian and Canadian universities. That led to, among other things, invitations for, and the delivery of, numerous peer-reviewed papers and conference presentations.

Hay said the abstract of one of the papers has been downloaded more than 1,500 times, and the papers have drawn 3,400 reviews.

The HEAT team joined with Calgary-based ITRES, a high-tech imaging company which developed a camera capable of taking thermal images of wide swaths of real estate from the sky. Flying between 11 p.m. and 4 a.m., the “thermal airborne broadband imager,” or TABI for short, can capture thermal images within 5/100ths of a degree Celsius.

ITRES in-flight imaging

Calgary-based ITRES fly-over imaging can capture the heat loss from homes within 5/100ths of a degree Celsius. CREDIT: ITRES

“We can fly the city of Calgary in about four and one-half hours without having to stop for gas,” Hay said.

Back in the lab, the team overlays geographic information system building data available from the city, including when homes and other structures were built, to create HEAT scores ranging from 0 to 100; the lower the score, the more efficient the home.
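The article doesn’t spell out the scoring formula, but a hypothetical min-max scaling gives the flavor of how raw heat-loss readings could become 0-100 scores; the addresses and readings below are invented:

```python
def heat_scores(heat_loss):
    """Map each home's measured heat loss onto 0-100, where a lower score
    means a more efficient home. (Hypothetical min-max scaling; the actual
    HEAT methodology is not described in the source.)"""
    lo, hi = min(heat_loss.values()), max(heat_loss.values())
    return {home: round(100 * (loss - lo) / (hi - lo))
            for home, loss in heat_loss.items()}

# Hypothetical relative heat-loss readings from a nighttime fly-over
scores = heat_scores({"12 Elm St": 4.2, "14 Elm St": 9.8, "16 Elm St": 6.1})
print(scores)  # 12 Elm St scores 0 (tightest), 14 Elm St scores 100 (leakiest)
```

A relative scale like this is what makes neighborly comparison possible: the leakiest home in the sample anchors 100 regardless of its absolute heat loss.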

But for all the science, Hay and his colleagues are betting on human nature. Quoting an often-cited mantra credited to Alex Steffen and the behavioral demand response movement,  “nobody wants to be the outlying energy hog.”

There may be another catch: privacy. What right does any person have to know how much heat is escaping someone’s home?

Phase I of the HEAT project assessed 368 houses in a Calgary neighborhood in 2011; Phase II expanded that to 33,000 homes in 2012; the team scaled up the application earlier this year in Phase III to evaluate up to 300,000 homes.

“Though we had already completed a significant amount of research,” Hay explained, the proposal to the Climate CoLab competition “pushed me and my team to think beyond our initial three-phase pilot project.” Now their research has been branded to include planned upgrades reflecting the geographical reach and time spans to be covered: “MyHEAT Calgary” by the end of this year, “MyHEAT Multi-Year” in 2014 and “MyHEAT Canada” planned for 2015.

If all goes according to plan, Hay envisions implementing MyHEAT in many other cities around the globe, focusing initially on those with populations in excess of 1 million.

Will enough forward-thinking municipal leaders cooperate? What have they got to lose?

True to the ever-expanding possibilities of crowd-sourcing, the MyHEAT team built in a system that lets users report what their roofs are made of. These and other ideas are fertilizing more extensions to the app. One is a collaboration with commercial LEED application developers and certified home energy auditors to help ensure the integrity of the MyHEAT scores.

Among the logical next steps is work underway with the Alberta Real-Estate Board and the Calgary Real-Estate Board to develop and evaluate scores for their respective Multiple Listing Service (MLS). This would equip home owners and their agents with data on how well insulated their structures are. Home shoppers and their agents could soon have another means of measuring a home’s energy efficiency.

Think of other uses beyond REALTORS® and energy-conscious home owners, especially in the coldest and hottest climates: service providers offering efficiency solutions, homebuilders wanting to verify the quality of their homes, leaders of similar communities challenging each other leading up to Earth Day.

Hay says several other innovations are in the works. You can keep track of them at .

Fast Fix: Opower’s 5 Universal Truths About Utility Customers – Do You Agree?

Customer Service Week hosted a discussion this morning with Opower‘s VP-Analytics Nancy Hersch and Customer Insights Senior Manager Lauren Llewellyn about five truths about utility customers that the company is finding in its quantitative and qualitative research.

Since early 2012, Opower has extended this research beyond North America and now works with more than 90 utilities around the globe. Opower claims it has helped utilities cumulatively save more than 3 terawatt-hours of electricity.

Of the five that follow, which ones hit the mark? And what’s missing?

The five are:

  1. Utilities are not meeting customer expectations. There is a large gap between customers’ expectations and what utilities are delivering through their billing and customer service operations.
  2. Sure, everyone would like lower bills, but they’re also looking for concrete steps they can take to get there.
  3. While many customers may not like their utilities, they DO look to them first for guidance on how to save energy and lower their bills.
  4. Customers value insights about how they use energy in their homes. Many also would like to see recommendations on reduction tactics delivered through the channels they choose followed up by reports about the progress they’re making, or not.
  5. Just about everyone is at least curious about how their usage measures up to other homes, especially those with similar housing types in their neighborhoods.