Opening the Flow of Water Data

Four decades ago, water managers and utility employees didn’t have much in the way of data. Water metering to measure homes’ and buildings’ water use was neither required nor common. Water quality monitoring and sampling to safeguard drinking water and environmental health happened along just a small percentage of river miles in Colorado. It’s an era that now seems ancient.

Today Coloradans are digitally connected to everything and to each other. In this new time of big data and nearly incomprehensibly large datasets, every digital process produces and tracks data, while the web-enabled devices around us—from cell phones to smart thermostats—measure and transmit that information.

When it comes to water, water providers, engineers, scientists, boaters and anglers can now collect and access all types of water data, whether to inform conservation plans and river management, or to figure out when to hit the whitewater or expect a big hatch. Water managers make decisions using massive computing power, the internet, software and applications, global positioning systems (GPS), geographic information systems (GIS) and digital maps, and seemingly boundless real-time and long-term datasets. From streamflows to flood risks to water use and water quality within communities and even households, data is now a powerful driver of decision making and water resources management at all levels.

“All of these aspects of data management have been developing in the past 40 years to get us where we can use the data in a confident way,” says Jeff Tejral, Denver Water’s manager of conservation.

As the digital landscape continues to evolve, more people expect more available and accessible data, including more water data, and the tools and context to understand it. And many local, state and federal government agencies, along with public utilities, have made a point of publishing and sharing data and information for use by others. This is “open data”—datasets that are accessible, free or practically free, machine-readable (meaning software programs can recognize or process the data or text), and available for unrestricted use. Open water data is creating some daunting challenges for certain agencies as they cope with navigating a changing culture, controlling the quality of shared data, and interpreting masses of information—but it also brings important benefits.

Opening toward transparency

“Open data equals good government,” says Steve Malers, founder of Open Water Foundation, a Fort Collins, Colorado-based nonprofit open-source tech firm that specializes in water management applications. By publishing the data they collect, agencies can demonstrate accountability and transparency and show they are wisely managing resources and meeting reporting requirements, says Malers. Plus, open water data platforms minimize agencies’ tendencies to “silo,” or to store but not share, their records with other agencies and the public. As a result, anyone can view, analyze, manipulate, and make decisions using the same data. Opening data can even help reveal errors or biases in data or policy.

Eagle River Watershed Council sampling the Eagle River

Eagle River Watershed Council’s Kate Burchenal and Timm Paxson collect water samples from the Eagle River, downstream of the Eagle Mine, to test for water quality indicators and concentrations of various metals. This data, publicly accessible through the Colorado Data Sharing Network, helps the watershed council assess stream health and informs the U.S. Environmental Protection Agency’s Superfund cleanup of the mine. Photo Courtesy of the Eagle River Watershed Council

Not to mention, it can make financial sense: Organizations once stored records on physical discs and computers that staff would tediously search through upon receiving records requests, in order to manually print and share copies. Now, agencies can publish records and reports online where anyone can find, search, download, and use the information. It’s more user friendly for citizens and agency staff, while organizations can reduce the costs of handling those records requests.

The concept of open data isn’t new, but the internet and digital technology have astronomically increased the availability of data. In addition to the federal government’s site, agencies such as the U.S. Geological Survey with its Science Data Catalog host open data portals or web services—interfaces that connect computers and datasets—that include hundreds of water quantity and quality datasets. The Colorado Information Marketplace (CIM), for example, is a state-level open-data website that provides a platform for agencies to share data and enables users to easily search datasets from dozens of government departments.
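A web service of the kind these portals offer is simply a URL that a program can request and a machine-readable reply it can parse. The sketch below, in Python, shows both halves using the shape of the USGS daily-values water service; the site number, dates, and the hardcoded sample reply are illustrative stand-ins, not a live query.

```python
import json
from urllib.parse import urlencode

# Build a request URL for the USGS daily-values web service.
# The site number (09380000) and parameter code (00060, discharge
# in cubic feet per second) are illustrative examples.
BASE_URL = "https://waterservices.usgs.gov/nwis/dv/"
params = {
    "format": "json",
    "sites": "09380000",
    "parameterCd": "00060",
    "startDT": "2017-01-01",
    "endDT": "2017-01-07",
}
request_url = BASE_URL + "?" + urlencode(params)

# "Machine-readable" means software can pull values out directly.
# This hardcoded snippet mimics the general shape of a JSON reply
# from such a service, heavily trimmed for illustration.
sample_response = """
{"value": {"timeSeries": [{"values": [{"value": [
    {"dateTime": "2017-01-01T00:00:00", "value": "11200"},
    {"dateTime": "2017-01-02T00:00:00", "value": "11500"}
]}]}]}}
"""
data = json.loads(sample_response)
readings = data["value"]["timeSeries"][0]["values"][0]["value"]
flows = {r["dateTime"][:10]: float(r["value"]) for r in readings}
```

Because the reply is structured data rather than a printed report, the same few lines of parsing work for any site or date range the URL requests.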

Since the 1980s Colorado’s Division of Water Resources (DWR) has shared more than two dozen datasets through its HydroBase, blazing a path with its open data—that same data is now also shared via the CIM. Those datasets include spreadsheets and maps of well applications and permits, water rights transactions, groundwater levels, river calls, diversion records, and surface-water gauge measurements for streams and reservoirs all over Colorado. DWR also captures time-series data from the U.S. Geological Survey, National Oceanic and Atmospheric Administration, Colorado Department of Local Affairs, and other partners to integrate a variety of data with its own. That makes DWR “one of the biggest and most advanced users” of the state’s open-data site, says Jon Gottsegen, Colorado’s chief data officer who oversees the CIM, adding that the division also automates and frequently updates its records.

“The data belongs to the citizens of Colorado and DWR has always shared it openly,” says John Rodgers, HydroBase coordinator. Although technology has advanced, and DWR continues to add new stream gauges, the agency has been collecting and sharing many of the same datasets for decades while working to increase accessibility. The latest effort: DWR’s data-services site, Colorado’s Decision Support Systems (CDSS), which enables users to retrieve data, create digital maps, run water models, and more, will be renovated in 2017 to increase functionality.

Not surprisingly, the great migration to open data hasn’t been easy for everyone. Some utility managers remain reluctant to openly publish data, citing concerns spurred by past experiences when shared data led to negative attention or was misinterpreted or “weaponized” by another organization or a media report. In other cases, water managers worry that errors or discrepancies in datasets mean they’re not worth sharing. On top of that, digitizing records and maintaining datasets is time consuming.

“Over the years, a culture of not being overly public has evolved in water utilities and management agencies,” says Kathy King, a principal with Boulder-based Redstone Strategy Group working with the Water Funder Initiative, a foundation-driven program to identify priorities and solutions to Western water challenges. “That runs counter to this idea that data are an asset and something you can use to tell your story and communicate with the public.”

In speaking with water managers, researchers, industry leaders, and environmentalists for the Water Funder Initiative-sponsored Aspen Dialogue Series, King says many leaders recognize that agencies and networks now collect lots of data and even make it available, but that doesn’t always mean datasets are usable or readily accessible by others. “It’s not so much that the data doesn’t exist,” King says, “but that it’s isolated and not shared and integrated in a way that they can always be transformed into information for decision making.”

“Organizations tend to publish data in a form that is consistent with their mission and [regulatory] requirements,” adds Malers. “This means connections to other datasets may not be obvious and does not make it easy for third parties to understand or use the data.” This disjointedness can also complicate cooperative research or management, such as Colorado’s regional or statewide river planning initiatives that span jurisdictions, when entities use different data parameters or categories.

Data governance and quality control

Gottsegen and the CIM are now focused on ways to manage and regulate data sharing and to establish standards to increase data quality control and accessibility, otherwise known as “data governance.” Colorado’s Government Data Advisory Board, chaired by Gottsegen, engages state agency representatives to study and determine practices and protocols for data compatibility and interoperability, meaning datasets can be readily used and read by others. The federal Open Water Data Initiative represents a similar effort to coordinate data exchange among federal agencies. “Often, the organizational problems are a lot stickier than the technology,” Gottsegen says. “It is really about getting people together to say, ‘These are what the needs are and here’s how we’re going to solve that problem,’ and to think a little differently.”

Advisory panels like the state’s can partly address that challenge by pushing agencies to publish more, and more usable, data that includes accompanying metadata, or files that describe datasets’ creators, origins and purpose, and that explain their various fields or columns. Metadata enables users to more quickly and effectively determine data quality and to make sense of the datasets they are attached to. It also reduces the risks of misinterpretation. However, some records managers worry that sharing metadata can also inadvertently give users access to sensitive or confidential information.
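To make the idea of metadata concrete, here is a minimal sketch in Python of a metadata record attached to one row of a streamflow dataset. The dataset title, field names, and station abbreviation are hypothetical illustrations, not actual DWR or CIM records, and real metadata standards carry far more detail.

```python
# A minimal, hypothetical metadata record: who made the dataset,
# why, and what each field (column) means.
metadata = {
    "title": "Surface-water gauge measurements",
    "creator": "Colorado Division of Water Resources",
    "purpose": "Daily streamflow records for gauged streams",
    "fields": {
        "station_id": "Gauge station abbreviation",
        "date": "Measurement date (YYYY-MM-DD)",
        "flow_cfs": "Mean daily flow in cubic feet per second",
    },
}

# One raw data row; the station abbreviation is made up.
row = {"station_id": "PLACHECO", "date": "2016-06-01", "flow_cfs": 412.0}

def describe(row, metadata):
    """Pair each value in a row with its metadata description,
    so a user needn't guess what a column or unit means."""
    return {
        f"{key} ({metadata['fields'][key]})": value
        for key, value in row.items()
    }

annotated = describe(row, metadata)
```

With the descriptions attached, a number like 412.0 arrives labeled with its meaning and units instead of as a bare value in an unexplained column.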

The state data advisory board is also working to resolve tricky data management questions, such as determining which dataset is authoritative when two or more agencies submit similar but conflicting records, or when a record is considered obsolete and ready to be removed from the site in order to narrow down search results.

“What we really need to do is to improve the effectiveness and efficiency of the queries done through the site,” Gottsegen says. That doesn’t mean a user can already find and utilize every Colorado water record through the CIM, but Gottsegen says improving data governance should lead to greater usage. “We need to get to the point where we’ve got data staged and managed well so people can do good analytics.”

Steve Malers in his Fort Collins Office

Steve Malers, founder of the Open Water Foundation, works from his Fort Collins office to provide open source software and improve open data sharing and visualization across Colorado. Photo by Paula Gillen

Malers and the Open Water Foundation are among the software and app developers already putting data to use. For example, the Open Water Foundation developed a web-accessible tool for the Colorado Water Conservation Board that uses National Snow and Ice Data Center historical and current snowpack measurements from river basins across Colorado to estimate the liquid water supply contained within mountain snowpack. The system, still in production, will enable water providers to easily monitor snowpack conditions and inform their supply decisions. In terms of the data package, Malers says, “We are trying to set a standard for collaboration and transparency in data, process and tools.”

A recently released report from the Aspen Dialogue Series and Water Funder Initiative concludes that sharing and opening datasets should also be a priority—with an understanding that agreements on data quality and interoperability are developing through data-governance programs. “We want to encourage data producers and water agencies to put data out there without too much concern for the quality, recognizing that quality can improve over time and that sunshine is the best disinfectant,” says King. “By getting it out there and getting it used, we can get a better sense of where the needs are and also where data quality really matters.”

Visualization and interpretation

That attention toward the needs of a wide range of potential data users—from realtors to researchers, anglers and oil and gas companies—also underscores an overarching task for agencies and data entrepreneurs. The days of sparse water data and frustrating records searches have given way to a time of data overload. Malers, King, Tejral and others say collecting and sharing data isn’t enough; organizations now face the work of turning data into information that most people can make sense of. That comes with another challenge: funding platforms that allow people to use and interpret data.

Many agencies have a government mandate to collect and share water data, but they lack the funding and sometimes the expertise to create usable data platforms. That leaves a niche for nonprofits like the Open Water Foundation and private enterprises, such as consulting firms, which develop customized data viewers. Water Sage, one such business, offers a massive portal that integrates water and land data and displays it through maps, graphs or other visuals for easy access and decision making.

“We saw the need for there to be someone out there reliably aggregating data and making it accessible,” says Spencer Williams, Water Sage’s business development manager. “We’re focused on ways to make that data discoverable and usable for technical and non-technical users.” Water Sage automates data retrieval and aggregation, culling from state agency data sources, and delivers it through its platform to subscribers, including water resources managers, conservancy districts, land management companies, and others. A subscription costs $6,000 per year. While agencies like DWR provide the pure data and some interpretation, they often are not able to invest in building the customized, value-added platforms that individuals and entities across the state seek.

Heather Dutton, manager of the San Luis Valley Water Conservancy District, says her district signed up for Water Sage’s services to help manage its Rio Grande augmentation program, which ensures regional well pumping doesn’t harm other users’ water rights. Rather than searching county assessors’ records, state water datasets, and Google Maps to compile information, Water Sage does the work for Dutton. “It’s data that is publicly available,” Dutton says, “but we really appreciate that Water Sage puts all that data in one snazzy, useful format.”

“This idea of having data is really important, but we still live in the Information Age, not the Data Age,” adds Tejral. “Data alone overwhelms people.” So, if a utility, agency or independent software developer can compile data and also synthesize it with digital maps and other datasets, from water use to assessor’s records, “you are building a story of what’s going on and putting context around it for a customer and decision makers,” Tejral says. “That’s really important.”
