NCSA Brown Dog



The Super Mutt of Software

Brown Dog's goal is to prototype a highly distributed and extensible science-driven Data Transformation Service (DTS). As a component of a national research cyberinfrastructure, Brown Dog aims to make past and present research data more accessible and more useful to scientists while also enabling novel science and scholarship on top of such data.

Rather than attempting to construct a single piece of software that magically understands all data, Brown Dog leverages and coordinates every possible source of automatable help already in existence (e.g. software, tools, libraries, and even other services) in a robust and provenance-preserving manner to create a service possessing the union of their capabilities that can deal with as much of this data as possible. Brown Dog, a “super mutt” of software, serves as a low-level data infrastructure coordinating software capabilities with a user’s data needs to facilitate data reuse and, through that, enable a new era of science and applications at large. The broader impact of this work is in its potential to serve not just the scientific community but the general public as a “DNS for data”, transforming data on the fly into more accessible forms through a distributed and extensible collection of data manipulation tools, moving civilization toward an era where a user’s access to data is not limited by a file’s format or un-curated collections.


Towards a Data Cyberinfrastructure

Brown Dog is part of the DataNet/DIBBs program funded by NSF beginning in 2008. DataNet was conceived to address the increasingly digital and data-intensive nature of science and engineering research and education. Digital data are not only the output of research but provide input to new hypotheses, enabling new scientific insights and driving innovation. Therein lies one of the major challenges of this scientific generation: how to develop the new methods, management structures and technologies to manage the diversity, size, and complexity of current and future data sets and data streams. DataNet addresses that challenge by creating a set of exemplar national and global data research infrastructure organizations (dubbed DataNet Partners) that provide unique opportunities to communities of researchers to advance science and/or engineering research and learning.

Brown Dog is, more specifically, part of a follow-on effort called DIBBs (Data Infrastructure Building Blocks), focused on building software cyberinfrastructure to support current and foreseen scientific data needs: software that many people and communities can use. All of the DIBBs projects are meant to provide complementary services, each building on the others’ capabilities.

More about the DataNet and DIBBs programs


DataNet & DIBBs

NSF Program: DIBBs; Software: DTS, Clowder, Polyglot

NSF Program: DIBBs; Software: SkyServer; Data: Sloan Digital Sky Survey

NSF Program: DIBBs; Software: HUBzero

NSF Program: DIBBs; Software: SLASH2

NSF Program: DataNet; Software: DataONE; Data: Biology and Environmental

NSF Program: DataNet; Software: ACR, Virtual Archive; Data: Social and Environmental

Software: iRODS; Data: Ocean Observatory, Hydrology, Genome, Social Science, Education

Data: Census/Survey, Remote Sensing, Climate

The Team


Kenton McHenry, Ph.D.


Principal Research Scientist, NCSA


Jong Lee, Ph.D.


Principal Research Scientist, NCSA



Michael Dietze, Ph.D.


Associate Professor of Biology, Boston University



Barbara Minsker, Ph.D.


Professor of Civil & Environmental Engineering, SMU


Praveen Kumar, Ph.D.


Professor of Civil & Environmental Engineering, UIUC


Art Schmidt, Ph.D.

Research Assistant Professor of Civil & Environmental Engineering, UIUC


Luigi Marini

Lead Research Programmer, NCSA


Jay Alameda

Senior Technical Program Manager, NCSA

Chris Navarro

Senior Research Programmer, NCSA


Shannon Bradley

Associate Project Manager, NCSA


Jerome McDonough,  Ph.D.

Associate Professor of Library & Information Science, UIUC


Bill Sullivan, Ph.D.

Professor of Landscape Architecture, UIUC


Richard Marciano, Ph.D.

Professor of Information Studies, Director Digital Curation Innovation Center, UMD


Rob Kooper

Lead Research Programmer, NCSA


Greg Jansen

Research Software Architect, University of Maryland


Benjamin Galewsky

Research Programmer, NCSA


Sandeep Satheesan

Research Programmer, NCSA


Marcus Slavenas

Research Programmer, NCSA


Bing Zhang

Research Programmer, NCSA


Inna Zharnitsky

Programmer, NCSA

Josh Manthooth

Graduate Student, Biology, Boston University


Betsy Cowdery

Graduate Student, Biology, Boston University


Dongkook Woo

Graduate Student, Civil & Environmental Engineering, UIUC


Qina Yan

Graduate Student, Civil & Environmental Engineering, UIUC


Kunxuan Wang

Graduate Student, Civil & Environmental Engineering, UIUC


Ankit Rai

Graduate Student, Civil & Environmental Engineering, UIUC


Sun Young Park

Graduate Student, Civil & Environmental Engineering, UIUC


Pongsakorn (Tum) Suppakittpaisarn

Graduate Student, Landscape Architecture, UIUC


Wenqi Ji

Graduate Student, Landscape Architecture, UIUC


Xiangrong Jiang

Graduate Student, Landscape Architecture, UIUC


Dongying Li

Graduate Student, Landscape Architecture, UIUC


Advisory Board


David Forsyth, Ph.D.

Professor, Computer Science, UIUC


Jysoo Lee, Ph.D.

Director, Korea Institute of Science and Technology Information (KISTI)



Norma Kenyon, Ph.D.

Professor of Surgery, Medicine, Microbiology and Immunology, and Biomedical Engineering, University of Miami



Tschangho Kim, Ph.D.

Professor of Civil, Environmental, and Infrastructure Engineering, George Mason University


Brian Wee, Ph.D.

Chief of Strategic Alliances, NEON


Yan Zhou

Research Programmer, NCSA


Bringing Long-Tail Data Into the Light

Much of the data generated by science, social science, and the humanities is smaller, unstructured, un-curated and thus not easily shared. Taken together, however, this “long-tail” data, both past and present, represents a vast amount of research data with the potential to greatly impact future research in many areas of study.

The unstructured, un-curated nature of this data, however, means that once the data is gathered and the research published, the data often never sees the light of day again. In addition, contemporary science relies on digital data and software that evolves and disappears quickly as underlying technology changes. Thus we are entering a period where scientific results are no longer easily reproducible.  Since reproducibility is foundational to scientific discovery, development of a method for easily accessing legacy data and software is essential to maintaining the viability of large bodies of research.

Example Use Cases

The success of Brown Dog depends, in part, on the data and use cases against which we build and test the system. Groups within the EarthCube communities have identified the inaccessibility of long-tail data as a problem that must be addressed. Developers and researchers from some of these communities will work hand-in-hand to explore three compelling scientific use cases that span geoscience, engineering, biology, and social science.


Research Data Management and the Clowder Supported Communities

Clowder is an open source, web-based research data management system designed to support multiple research domains and the diverse data types utilized across those domains. With features supporting auto-curation, analysis, publication, and customizability, Clowder has been used to support research data needs in areas such as biology, geoscience, material science, crop science, civil engineering, social science, cultural heritage and the digital humanities, medicine, education, and industry. Brown Dog simplifies the addition of new extractors to Clowder, supports building up these community extractors used to analyze and curate data within a Tools Catalog, enhances extractor deployment and scalability to support large numbers of queries, adds support for a data-centric paradigm shift, and provides infrastructure for extractor execution across the Clowder-supported communities.
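The extractor model described above can be illustrated with a minimal sketch: extractors register the file types they handle, and a registry routes each uploaded file to every matching extractor, collecting derived metadata. All class and method names here are illustrative stand-ins, not the actual Clowder/pyclowder API.

```python
# Minimal sketch of the extractor pattern: each extractor declares the
# MIME types it handles and derives metadata from a file's content.
# Names are hypothetical, not the real Clowder/pyclowder interface.

class WordCountExtractor:
    """Toy extractor: derives metadata (a word count) from text files."""
    supported_types = {"text/plain"}

    def process(self, filename, content):
        return {"extractor": "wordcount", "file": filename,
                "word_count": len(content.split())}

class ExtractorRegistry:
    """Routes a file to every registered extractor supporting its type."""
    def __init__(self):
        self.extractors = []

    def register(self, extractor):
        self.extractors.append(extractor)

    def extract(self, filename, mime_type, content):
        metadata = []
        for e in self.extractors:
            if mime_type in e.supported_types:
                metadata.append(e.process(filename, content))
        return metadata

registry = ExtractorRegistry()
registry.register(WordCountExtractor())
results = registry.extract("notes.txt", "text/plain", "brown dog super mutt")
print(results[0]["word_count"])  # 4
```

Adding a new community extractor then amounts to writing one small class and registering it, which is the kind of contribution workflow Brown Dog aims to simplify.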


Long-Tail Data in Ecology and Global Change Biology

Michael Dietze, Boston University

Data on the abundance, species composition, and size structure of vegetation is critically important for a wide array of sub-disciplines in ecology, conservation, natural resource management, and global change biology. However, addressing many of the pressing questions in these disciplines will require that terrestrial biosphere and hydrologic models be able to assimilate the large amount of long-tail data that exists but is largely inaccessible. The Brown Dog team, in cooperation with these researchers, will facilitate the capture of a huge body of smaller research-oriented data sets collected over many decades, such as historical vegetation data embedded in Public Land Survey records dating back to 1785. Data such as this will be used as initial conditions for models, to make sense of other large data sets, and for model calibration and validation. Overall, Brown Dog supports the PEcAn ecological modeling community in its data transformation needs, linking needed datasets to community-based ecological models.


Designing Green Infrastructure Considering Storm Water and Human Requirements

Barbara Minsker, UIUC

William Sullivan, UIUC

Arthur Schmidt, UIUC

This case study involves developing novel green infrastructure design criteria and models that integrate requirements for storm water management with ecosystem and human health and well-being. Data accessibility and availability are a major challenge in addressing the scientific and social problems associated with the design of green spaces. This study will focus on identified areas of the Green Healthy Neighborhood Planning region within the City of Chicago where existing local sewer performance is most deficient and where changes in impervious area through green infrastructure would benefit underserved neighborhoods. Brown Dog will be used to extract long-tail experimental data on human landscape preferences and health impacts. This data will be used to develop a human health impacts model that will then be linked with a terrestrial biosphere model and a storm water model using Brown Dog technology.


Development and Application for Critical Zone Studies

Praveen Kumar, UIUC

The Critical Zone (CZ) is the “skin” of the earth extending from the treetops to the bedrock. It is created by life processes working at scales from microbes to biomes, and it supports all terrestrial living systems. Its upper part is the biomantle, where terrestrial biota live, reproduce, use and expend energy, and where their wastes and remains accumulate and decompose. It encompasses the soil, which acts as a geomembrane through which water and solutes, energy, gases, solids, and organisms interact with the atmosphere, biosphere, hydrosphere, and lithosphere. A variety of drivers affect this biodynamic zone, ranging from climate and deforestation to agriculture, grazing, and human development. Understanding and predicting these effects is central to managing and sustaining vital ecosystem services such as soil fertility, water purification, and production of food resources and, at larger scales, global carbon cycling and carbon sequestration.

The CZ provides a unifying framework for integrating terrestrial surface and near-surface environments, and reflects an intricate web of biological and chemical processes and human impacts occurring at vastly different temporal and spatial scales. The nature of these data creates significant challenges for inter-disciplinary studies of the CZ because integrating the variety and number of data products and models has been a barrier. On the other hand, CZ data provides an excellent opportunity for defining, testing, and implementing Brown Dog technologies through support for the Critical Zone Observatory community. In this context “unstructured” data is viewed broadly as comprising a collection of heterogeneous data with formats that reflect temporal and disciplinary legacies, data from emerging low-cost, open-hardware-based sensors and embedded sensor networks that lack well-defined metadata and sensor characteristics, as well as data that are available as maps, images, and text.


General Public Use Case

In the same way the Internet has opened up information sharing for people around the world, the broader impact of Brown Dog will be to make the ever-growing stores of data on the web as easy to search and access as a webpage is now.

Brown Dog’s DTS will allow users to seamlessly sift through and access data that would otherwise be difficult to navigate and/or unreadable on their client devices.  Similar to an Internet gateway or Domain Name Service (DNS), the DTS configuration would be entered into a user’s machine settings and forgotten thereafter. From then on, with support from a variety of clients as well as browsers, data requests over HTTP would first be examined by the DTS to determine if the native file format is readable on the client device. If not, the DTS would be called in the background to convert the file into the best possible format readable by the client machine.  Alternatively, the user would have the option of specifying the desired format themselves, instead of the DTS doing it automatically.
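The decision the DTS makes in the background can be sketched as a small piece of logic: given the format of the requested file, the formats the client can read, and the conversions the available tools provide, determine whether conversion is needed and find a chain of converters. The converter table below is invented for illustration; it stands in for whatever capabilities the deployed conversion tools actually register.

```python
# Sketch of the DTS conversion decision: breadth-first search over a
# graph of converter capabilities to find the shortest chain from the
# file's native format to one the client can read.
from collections import deque

def conversion_path(source_fmt, readable_fmts, converters):
    """Return a list of formats from source to a readable one,
    or None if no converter chain exists.
    converters maps a format to the formats it can be converted into."""
    if source_fmt in readable_fmts:
        return [source_fmt]          # already readable, no conversion needed
    seen = {source_fmt}
    queue = deque([[source_fmt]])
    while queue:
        path = queue.popleft()
        for nxt in converters.get(path[-1], []):
            if nxt in seen:
                continue
            if nxt in readable_fmts:
                return path + [nxt]  # shortest chain to a readable format
            seen.add(nxt)
            queue.append(path + [nxt])
    return None

# Hypothetical converter capabilities, e.g. gathered from registered tools.
converters = {"xlsx": ["csv"], "csv": ["json"], "dwg": ["dxf"], "dxf": ["svg"]}
print(conversion_path("dwg", {"svg", "png"}, converters))  # ['dwg', 'dxf', 'svg']
print(conversion_path("csv", {"csv"}, converters))         # ['csv']
```

Chaining converters this way is what lets a union of many narrow tools cover far more format pairs than any single tool supports on its own.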

Further, the DTS will allow users to search collections of data using an existing file to discover other similar files. Again, once the machine and browser settings are configured, a search field can be appended to the browser where example files can be dropped in by the user. Doing this triggers the DTS to search the contents of all the files under a given URL for files similar to the one provided by the user. For example, while browsing an online image collection, a user could drop an image of three people into the search field, and the DTS would return all images in the collection that also contain three people. The DTS will also perform general indexing of the data and extract and append metadata to files and collections, enabling users to gain some sense of the type of data they are encountering.
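The search-by-example idea reduces to extracting a comparable signature from every file and ranking files by similarity to the query's signature. As a toy stand-in for the domain-specific extractors a real deployment would use (e.g. detecting the number of people in an image), the sketch below uses a word-frequency histogram and cosine similarity over small text files; all names and data are illustrative.

```python
# Sketch of search-by-example: extract a signature from each file and
# rank stored files by similarity to a query file. The word-histogram
# signature is purely illustrative of the general mechanism.
from collections import Counter
from math import sqrt

def signature(text):
    """Toy feature extraction: a word-frequency histogram."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query_text, collection):
    """Rank files in the collection by similarity to the example file."""
    q = signature(query_text)
    return [name for name, _ in
            sorted(collection.items(),
                   key=lambda kv: cosine(q, signature(kv[1])),
                   reverse=True)]

collection = {
    "a.txt": "land survey records from 1785",
    "b.txt": "storm water model output",
    "c.txt": "public land survey vegetation data",
}
print(search("land survey data", collection)[0])  # 'c.txt'
```

Swapping in richer extractors (image features, sensor statistics, parsed metadata) changes only the `signature` function; the indexing and ranking machinery stays the same, which is the point of a pluggable extractor infrastructure.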

Overall, the DTS will greatly expand general access and understanding of data on the web.


Data Transformation Service

How It Works

Infographic illustrating how a person will access and use the Brown Dog technologies: DAP and DTS.


A Science-Driven Brown Dog


University of Illinois at Urbana-Champaign

Boston University

University of Maryland

Southern Methodist University


This material is based upon work supported by the National Science Foundation under Grant No. ACI-1261582.

Learn more about this award



Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.