US Army Corps of Engineers Deploys Complex Math

Posted December 13th, 2011 by Linda Bell
Today, we’re pleased to have a guest blog from Lindsey Christensen, Marketing Project Manager at PTC, which delivers Product Lifecycle Management and design software solutions.
Most people don’t think about the complexity behind the electricity that’s supplied to their home or work. We flick a switch. The lights go on or off. Simple, right? Well, not quite. As covered in the November 7th Forbes article “The High-Stakes Math Behind the West’s Greatest River”, enormous amounts of data and complex calculations go into meeting that demand for power. Harold Opitz, hydrologist in charge of the National Weather Service’s Northwest River Forecast Center, told Forbes, “I can never have too much data.” That’s because if Opitz doesn’t have enough data and his calculations aren’t accurate, it could mean lights-out for millions of American households.
The Northwest River Forecast Center is one organization in a larger group, headed by the US Army Corps of Engineers, that helps manage over 100 large dams and hundreds of smaller installations along the mighty Columbia River. Together these structures serve a number of functions, but the chief one is hydropower generation to feed our electricity demand. In fact, the Grand Coulee Dam is North America’s largest power plant, not only supplying irrigation to 600,000 acres in the Pacific Northwest, but also generating nearly 7,000 megawatts of electricity at full capacity. To put this in perspective, just one megawatt can power 5,000 computers.
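To make that perspective concrete, here is a back-of-the-envelope calculation using the two figures above (treating 5,000 computers per megawatt as the rough rule of thumb it is, not an engineering constant):

```python
# Rough arithmetic from the figures in the post; computers-per-megawatt
# is a loose rule of thumb, not a precise engineering constant.
full_capacity_mw = 7_000    # Grand Coulee's approximate full capacity
computers_per_mw = 5_000    # rule of thumb cited above

computers_powered = full_capacity_mw * computers_per_mw
print(f"{computers_powered:,} computers")  # 35,000,000 computers
```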
Effective operation of these giant dams depends on precise forecasting of weather, river, and dam behavior. As reported by Forbes: “As large as the dams are, their margins of error are miniscule and operating them takes unerring foresight and subtle management: let too much water fill reservoirs and a rainstorm might flood Portland; keep the reservoirs too empty and you’ll parch farmers. Send too much water over a dam’s spillway and you’ll suffocate fish with dissolved gases; send too much through its turbines and you’ll overload the electrical grid.”
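The balancing act Forbes describes amounts to keeping an operating point inside several hard limits at once. A minimal sketch of that idea in Python follows; the function name and thresholds are illustrative placeholders, not actual Corps limits:

```python
def check_operation(reservoir_level, spillway_flow, turbine_flow,
                    max_level, min_level, max_spill, max_turbine):
    """Return the list of constraints a proposed operating point violates.

    All thresholds here are hypothetical placeholders, not real dam limits.
    """
    violations = []
    if reservoir_level > max_level:
        violations.append("flood risk: reservoir too full")
    if reservoir_level < min_level:
        violations.append("irrigation risk: reservoir too empty")
    if spillway_flow > max_spill:
        violations.append("fish risk: dissolved-gas limit exceeded")
    if turbine_flow > max_turbine:
        violations.append("grid risk: generation exceeds demand")
    return violations
```

A real reservoir-operation model couples constraints like these over time and across many dams; this only shows the shape of a single feasibility check.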
Calculating the impact of natural and man-made factors on the Columbia River’s 27 major dams has become its own science, as engineers measure the pulse and elevation of the water at various locations along the river, the number of fish that migrate through, how much electricity will be demanded, snowmelt from the Rocky Mountains, wind, and more. It has been an evolutionary process for the organizations involved.
It all begins with a daily report from the River Forecast Center to the Army Corps of Engineers, which includes both short-term and long-term outlooks. The Corps then takes this information and applies a refined statistical model based largely on historical data. The results from these models are fed back to the River Forecast Center, and an operation plan is defined.
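One way to picture the statistical step is a forecast that is pulled toward the long-run historical record. The sketch below is purely illustrative, not the Corps’ actual model; the function name and weighting are assumptions:

```python
def blended_outlook(forecast_inflow, historical_inflows, weight=0.7):
    """Blend a fresh short-term forecast with the long-run historical mean.

    `weight` controls how much trust goes to the new forecast versus
    history; the default here is an arbitrary placeholder, not a
    calibrated parameter.
    """
    historical_mean = sum(historical_inflows) / len(historical_inflows)
    return weight * forecast_inflow + (1 - weight) * historical_mean
```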
Forbes notes that in years past the Corps relied more on an oral tradition for decision-making around dam operation. Think lab notebooks or engineering journals. Today, from the Hydrologic Engineering Center, the Corps uses sophisticated mathematical analysis and calculation software for more accuracy, better analysis, and more collaboration. The software, called HEC-ResSim, was developed in-house. Engineers can apply 70 years’ worth of stored historical data, such as rainfall, temperature, and water levels, to projected scenarios and evaluate the outcomes in ways they never could before.
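The scenario evaluation described above can be sketched as replaying each stored historical year through a candidate release rule and comparing the outcomes. This is a toy illustration of the idea only; it is not the HEC-ResSim API, and every name in it is hypothetical:

```python
# Toy illustration of scenario replay against historical data.
# Not the HEC-ResSim API; all names here are hypothetical.
def evaluate_scenarios(historical_years, release_rule):
    """historical_years: dict mapping year -> list of daily inflows.
    release_rule: function(inflow) -> water released that day.
    Returns net storage change per historical year under the rule."""
    outcomes = {}
    for year, inflows in historical_years.items():
        storage = 0.0
        for inflow in inflows:
            storage += inflow - release_rule(inflow)
        outcomes[year] = storage
    return outcomes
```

Replaying a proposed rule against 70 distinct historical years yields 70 outcomes to compare side by side, which is the kind of what-if analysis the article describes.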
All organizations seek accurate forecasting and secure management of IP and knowledge. The right engineering calculation software enables engineers to easily solve, document, share and re-use calculations and design work. It’s used when knowledge capture, data reuse, and design verification are too important for an Excel spreadsheet. The result can be faster time-to-market, higher product quality, easier compliance, and much more.
Do you have complex engineering projects that span across organizations? What calculation framework do you use? How is this analysis documented, exchanged, and stored?