From Air-Conditioning to Urban Planning, Defaults and Standards Create Dysfunction by Design

Mar 21, 2016


How can we stop outdated assumptions from forcing us to waste?


Photo Credit: Baloncici/Shutterstock

Personal heaters are a summer survival tool for many office workers chilled to the bone by hyperactive ventilation systems — an act of self-defense against an epidemic of overcooling that wastes energy and confounds comfort not only in offices but also in large shops, schools and other buildings. An audit of U.S. government buildings found that more than three-fifths of occupants felt too cold in the summer. The most likely culprit behind this big chill? Engineering conventions: slavish adherence to unfounded and outdated rules of thumb that leave air-conditioning systems misprogrammed.

Frozen spaces are but the chilly tip of an iceberg of waste. Many of the technologies, practices and systems that we interact with every day are shaped by default settings in the form of “established practice,” professional standards and design codes, and many of them deliver dysfunction and overconsumption by design. Zoning rules preclude the construction of affordable microhomes. An outmoded presumption of universal automobile ownership begets wide streets and “gargantuan” gaps between intersections that hog-tie urban planners working to increase density. Default settings dial in waste in all manner of electronic devices, keeping video game boxes on perpetual standby and pre-programming irrigation controllers to drown gardens in excess water. And many such controllers revert to their wasteful factory defaults every time there is a power outage, undermining any conservation impulses consumers might have had.

The flip side, however, is that confronting problematic design defaults can open some surprising opportunities to improve living, save money, reduce pollution and conserve precious resources from energy to water to open space — sometimes all at the same time.

Breaking through decades of inertia to overcome embedded defaults will require that engineers, design professionals and policy-makers admit they may have been wrong or even irrational in the past and become activists for reexamining our assumptions about the way things need to be. And it will require patience: As reformers such as Jeff Speck, a Boston-based city planner and architect and author of Walkable City, have documented, dysfunctional default settings can take decades to set right.

A/C Overdrive

Just finding the defaults can take many years. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), for example, identified the problem of overcooling in the 1990s as “consistent and systematic.” Yet two decades later, a senior ASHRAE expert, Richard de Dear at Australia’s University of Sydney, reported that the society’s interior comfort specialists remained “puzzled” by it.

This chilling mystery has inspired an array of often-contradictory theories. In his 2012 essay “Did someone set the air on Arctic?”, de Dear cites boilerplate leases in which building owners guarantee their tenants spaces far cooler than ASHRAE’s thermal comfort standards recommend. Last year, researchers in the Netherlands made headlines with a report blaming those same standards for driving thermostats down to suit men’s higher metabolic rates — an alleged gender bias that ASHRAE quickly disputed.

Hui Zhang, a thermal comfort expert at the University of California, Berkeley’s Center for the Built Environment, says recent research supports a disarmingly simple explanation: default settings that cause ventilation systems to blow too much air. “The cause of overcooling is over-ventilation,” she says.

The root of the problem is a technological artifact left over from the pneumatic controls that were once the go-to method for modulating airflow to each zone in a building. Pneumatic control boxes offered just one setting for each zone’s minimum airflow, whether the system was heating, cooling or just delivering fresh air. Since heating typically required 30 percent or more of a system’s maximum flow, engineers came to view that figure as a natural minimum.

Pneumatics were standard equipment by the 1970s. But by the 1990s they were being edged towards the technology scrap heap by electronic controls capable of specifying different airflows for heating, cooling and ventilation.
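To picture the difference, consider the minimal control-logic sketch below. It is purely illustrative: the function names, the 10 percent floor and the zone numbers are hypothetical rather than drawn from any real controller or standard. It simply contrasts a single 30 percent minimum, which keeps air moving even when a zone needs almost none, with mode-specific minimums that let a digital controller throttle down.

# A purely illustrative sketch, not real controller code: how a legacy
# single-minimum rule compares with mode-specific minimums for one
# variable-air-volume zone. All names and numbers are hypothetical.

def legacy_airflow(demand_fraction, max_cfm):
    """Pneumatic-era logic: one fixed minimum (30 percent of max) in every mode."""
    return max(demand_fraction * max_cfm, 0.30 * max_cfm)

def dual_minimum_airflow(demand_fraction, max_cfm, mode):
    """Digital-era logic: a low floor for ventilation and cooling, a higher one only for heating."""
    minimums = {
        "ventilation": 0.10 * max_cfm,  # enough fresh air for occupants
        "cooling": 0.10 * max_cfm,      # can throttle far down when loads are light
        "heating": 0.30 * max_cfm,      # warm supply air needs more flow to mix well
    }
    return max(demand_fraction * max_cfm, minimums[mode])

if __name__ == "__main__":
    max_cfm = 1000.0   # hypothetical zone design airflow, in cubic feet per minute
    light_load = 0.05  # a sparsely occupied, efficiently lit zone
    print(legacy_airflow(light_load, max_cfm))                   # 300.0
    print(dual_minimum_airflow(light_load, max_cfm, "cooling"))  # 100.0

Under the old rule the example zone never drops below 300 cubic feet per minute; with mode-specific minimums the same zone can settle at 100.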

Nevertheless, the rule of thumb endures — airflow equipment is still sold with labels advising against lower flows — keeping flows in most buildings whistling above 30 percent of capacity at all times.

Zhang and her colleagues at Berkeley and at nearby mechanical engineering firm Taylor Engineering suspected that the old pneumatic rule of thumb and other accepted wisdom on ventilation were causing the indoor equivalent of wind chill. Telecommuting and the installation of energy-efficient devices and lighting were reducing the heat inside buildings. Yet ventilation systems could not be turned down below an increasingly excessive 30 percent of capacity.

In 2006 Taylor Engineering scored a first victory against the 30-percent default by demonstrating that most digital controls could reliably regulate flows below 10 percent of system capacity. Cutting flows could cut energy use, and California legislators acted fast to lock in the benefits. Their 2008 amendments to the state’s building code effectively forced builders to let flows drop to as little as 20 percent of system capacity.

Skeptics within ASHRAE, whose standards guide other states’ (and countries’) building codes, still questioned whether occupant comfort would be assured under low air flows. “There was a lot of skepticism and controversy,” recalls Taylor Engineering principal Jeff Stein.

In 2012, collaborative work by researchers at Taylor and Berkeley, partly funded by ASHRAE, satisfied even the skeptics. They tested six buildings at the Sunnyvale, California, campus of Internet giant Yahoo. When they reprogrammed the ventilation systems, allowing airflows to drop from 30 percent of capacity to 10 percent, Yahoo staff reported 47 percent lower dissatisfaction with air quality.

And energy savings were substantial. Electricity use dropped by 13.5 percent. The buildings also burned 12.2 percent less natural gas because less overchilling reduced the amount of air that had to be reheated in the buildings’ most Arctic zones.

Dan Int-Hout, chief engineer at Texas-based Krueger-HVAC and an ASHRAE director, says this “incredible news” from the Yahoo project has begun to convince engineers to allow lower flow rates. Still, he predicts that fully overcoming the inertia in practice “could easily take 20 years.”

The hang-up is engineers’ aversion to risk. Better to overcool buildings in the summer and let individual occupants sort out their own needs rather than get sued for under-delivering. “It’s easier for people to bring in a personal heater if they’re cold than to bring in a personal air conditioner,” says Stein.

Off-Street Autopilot

Default practices inspire waste and resist change in transportation, too. And transportation planning exhibits a similar pattern of breakthrough research and slow-motion response.

Consider off-street parking requirements.

University of California, Los Angeles planning professor Donald Shoup has been driving the research on parking for the past four decades, and he jokes that he chose the least illustrious field an academic could tackle. Yet his results are profound. Shoup has documented devastating impacts of minimum parking requirements on everything from delayed redevelopment and festering inequality to transportation inefficiencies and greenhouse gas emissions.

Still, the response from the planning profession has largely been silence. “There isn’t a single urban planner who has denied these impacts. Nobody has said ‘this guy is wrong.’ They just carry on because it’s so much easier to change nothing,” Shoup says. “Parking is on automatic pilot.”

Parking rules have been wildly successful at assuring a place for cars, as Shoup points out: “Parking is now the single biggest land use in almost every city.”

But research by Shoup and others shows that life gets better without the mandates. In 2010, Michael Manville, then a UCLA postdoctoral researcher (now an urban planning professor at Cornell University), documented rapid redevelopment in Los Angeles after the city waived its two-car-per-unit parking requirements for developers converting historic office buildings into housing. Between 1999 and 2008 the exemption fostered at least 7,300 new housing units in downtown L.A., where only 4,300 had been added in the three prior decades.

Despite such research, parking requirements remain nearly universal, sustained by a planning profession wedded to outdated, car-centric standards, according to Shoup, Speck and other urban reformers. Speck’s book illustrates the intransigence of planning engineers by quoting one of them, Charles Marohn. As Marohn puts it in his apologetic 2010 essay, “Confessions of a Recovering Engineer”: “[W]e can’t recommend standards that are not in the manual.”

Often, says Speck, the inertia is most powerful at the county or state level. “Almost every community I have served in the past two decades has had its best efforts at making safer streets at least partially thwarted by a state or county department of transportation who literally owns the roads,” he says.

As with overcooling, there are signs of change in urban planning. Some cities are realizing that mandating parking is a bad idea. Shoup points to Buffalo, which is poised to approve a planning overhaul that would make it the first U.S. city with no requirements for off-street parking whatsoever.

Other cities, such as San Francisco, are replacing some of their minimum parking requirements with maximums. “As a minimum, L.A. requires 50 times more parking space downtown than San Francisco requires as a maximum,” says Shoup.

Gaming Design

Flipping defaults 180 degrees from minimums to maximums may be more than just a step in the right direction. It could point the way toward a broad strategy for ditching dysfunctional defaults. Imagine if ventilation engineers started designing a building’s AC systems by assuming 5 to 10 percent minimum air flows — rather than the highest level allowed by code — or if city planners added parking requirements only where research supported them.

Civil engineering professors Tripp Shealy at Virginia Tech and Leidy Klotz at Clemson University say flipping the defaults in design software for everything from buildings to roadways is a good place to start. Many such tools start each project with default parameters that tend to reinforce the status quo.

For the past three years, Shealy and Klotz have been experimenting with another default flip: making “greener” choices the default option in project rating systems such as LEED, which scores building projects on various measures of sustainability (such as water consumption, support for bicycling and use of recycled materials). They tested a sustainability-by-default approach by reprogramming a software package called Envision, created at Harvard University’s Graduate School of Design as a civil engineering analog to LEED. Envision rates the sustainability of infrastructure projects such as civic transport redesigns and land-use plans.

Like architects using LEED, Envision users start with zero points and work their way up as they add merit-worthy features to their projects. Shealy and Klotz reprogrammed Envision’s scoring process, starting users with a perfect score and then docking points as the users removed or scaled back desirable features. Their hunch was that users presented with a case study (redeveloping a rural Alabama town devastated by tornadoes) would produce higher-scoring (i.e., more sustainable) plans with the new defaults. The idea was based on pioneering decision science research by Nobel Prize-winning psychologist Daniel Kahneman (author of the best-selling book Thinking, Fast and Slow) showing that loss aversion is a more potent motivator than reward seeking.
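The mechanics of the flip are simple, as the purely illustrative sketch below shows; the credit names and point values are invented for the example, not taken from Envision or LEED. The arithmetic yields the same final score either way. Only the framing changes, so that dropping a feature registers as a loss rather than a forgone gain.

# A purely illustrative sketch of the two scoring framings; credit names
# and point values are hypothetical, not actual Envision or LEED credits.

CREDITS = {"stormwater_reuse": 4, "bike_access": 2, "recycled_materials": 3}
MAX_SCORE = sum(CREDITS.values())

def gain_framed_score(chosen):
    """The usual default: start at zero and earn points for each feature added."""
    return sum(points for name, points in CREDITS.items() if name in chosen)

def loss_framed_score(chosen):
    """The flipped default: start with a perfect score and give up points for each feature dropped."""
    lost = sum(points for name, points in CREDITS.items() if name not in chosen)
    return MAX_SCORE - lost

if __name__ == "__main__":
    design = {"bike_access"}           # a plan that keeps only one credit
    print(gain_framed_score(design))   # 2, presented as points earned
    print(loss_framed_score(design))   # 2, presented as 7 points surrendered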

A trial run with civil engineering students and practicing civil engineers bore that out. Participants who started with full points for sustainability delivered projects that scored about 20 percent better, on average, than members of a control group who worked up from zero.

Flipping the initial default score from zero to full points “drastically increased what engineers thought was possible, and their motivation to try to achieve those points,” says Shealy.

“We’re helping people make decisions that are in their best interest,” observes Klotz, since designers use decision tools such as LEED and Envision as a means of achieving the most sustainable project.

Shealy and Klotz say their sustainability-by-default approach may become the standard setting for the next version of Envision, which is due out in 2018. It is one of several signs that defaults are breaking down, along with Buffalo’s abandonment of off-street parking rules (which appears to be headed for a city council vote this year) and a trend towards greener factory settings for video games and other consumer electronics.

Designing by default may never be stoppable, but it looks as though it can be a lot more sustainable.
