NetCDF Climate and Forecast Metadata Convention
- 1 Standard name
- 2 Authority responsible for the standard
- 3 Statutory requirements for supporting the standard
- 4 Significant applications (within and outside NOAA) currently using the standard
- 5 Purpose/application
- 6 Data type(s) to which the standard would apply.
- 7 Existing software tools that support the implementation of the standard.
- 8 Detailed description/definition of the standard.
- 9 Questions
- 10 Achieving CF-Compliance using NcML on local or remote files
- 11 CF Conventions and ISO Metadata
NetCDF Climate and Forecast Metadata Convention
Authority responsible for the standard
Statutory requirements for supporting the standard
Significant applications (within and outside NOAA) currently using the standard
The Climate and Forecast (CF) conventions are metadata conventions for earth science data, intended to promote the processing and sharing of files created with the NetCDF Application Programmer Interface.
Data type(s) to which the standard would apply.
Existing software tools that support the implementation of the standard.
Detailed description/definition of the standard.
- What could be an elegant subset of netCDF and CF features to enable correct understanding of GSICS data by available scientific processing and display systems? This data broadly falls into two categories: satellite swath radiances and radiance correction coefficients. By "elegant" I mean "sufficient but not overwhelming to implement".
- CF standard names for satellite observations. Barely any applicable names exist in the current official list. What is the best approach here: flood the CF standard with new names, or maintain a parallel standard that follows CF guidelines, see which of the new names prove to be the most used and useful, and then move those into the CF standard?
- What is a sane metadata approach for capturing processing steps and input data in a netCDF file? The final GSICS product will be radiance correction coefficients that convert observations from one satellite instrument to those of another, reference instrument. Users of such corrections will want to know what went into making those coefficients. Or more simply: How much stuff can be stuffed into the 'history' global attribute?
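One common pattern for the last question is to record one timestamped line per processing step in 'history' (as the CF conventions recommend) and keep anything beyond tool names, versions, and key parameters in separate attributes or external lineage metadata. A CDL sketch, with made-up tool names and parameters purely for illustration:

```
netcdf gsics_coefficients {
// global attributes:
    // One timestamped line per processing step; tool names and
    // options below are hypothetical examples, not real GSICS software.
    :history = "2011-03-01T09:15Z inter_calibrate v0.3 --ref iasi --mon seviri\n2011-03-01T10:02Z regress_coeffs v0.3 --window 14d" ;
}
```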
The CF Conventions are designed to promote the processing and sharing of files created with the netCDF Application Programmer Interface. They provide a definitive description of what the data in each variable represents, and of the spatial and temporal properties of the data.
Achieving CF-Compliance using NcML on local or remote files
Gridded data that is not CF-compliant can often be standardized through NcML, the NetCDF Markup Language, which can create virtual datasets that point to the original non-compliant files but add or change attributes to achieve CF compliance. Applications that understand NcML (e.g. any application built on the NetCDF-Java library) can then load the NcML file, which behaves just as if it were a CF-compliant NetCDF file. In addition, the NcML can be used in a THREDDS Data Server catalog, which then serves the CF-compliant dataset via TDS services, currently OPeNDAP, WCS, WMS and the NetCDF Subset Service.
Here are some real NcML examples, with annotation:
- Add lon and lat coordinate variables to a uniformly spaced grid. The grid being "fixed" is from GDAL, which does not produce CF-compliant NetCDF.
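A minimal NcML sketch of that first example might look like the following; the file name, grid origin, and spacing are placeholders, since the real values come from the non-compliant file's own georeferencing:

```xml
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"
        location="gdal_output.nc">
  <!-- Declare lon/lat coordinate variables over the existing dimensions,
       with the units CF requires for geographic coordinates -->
  <variable name="lon" shape="lon" type="double">
    <attribute name="units" value="degrees_east"/>
    <attribute name="standard_name" value="longitude"/>
    <values start="-180.0" increment="0.25"/>
  </variable>
  <variable name="lat" shape="lat" type="double">
    <attribute name="units" value="degrees_north"/>
    <attribute name="standard_name" value="latitude"/>
    <values start="-90.0" increment="0.25"/>
  </variable>
  <!-- Declare the convention so CF-aware applications recognize the file -->
  <attribute name="Conventions" value="CF-1.6"/>
</netcdf>
```

Because NcML is a virtual overlay, the original GDAL output is never modified; the coordinate variables exist only in the view that NcML-aware applications see.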
CF Conventions and ISO Metadata
The CF Conventions focus on "Use Metadata" and should not be confused with more comprehensive metadata standards like ISO 19115. The conventions include several attributes that are intended to provide information about where the data came from and what has been done to it, mainly for the benefit of human readers. These attributes have clear crosswalks to the ISO Standard:
|CF Attribute|CF Definition|ISO Element|
|---|---|---|
|title|A succinct description of what is in the dataset.|Citation title.|
|institution|Specifies where the original data was produced.|Citation organization (role=originator).|
|source|The method of production of the original data. If it was model-generated, source should name the model and its version, as specifically as could be useful. If it is observational, source should characterize it (e.g., "surface observation" or "radiosonde").|See ISO Lineage.|
|history|Provides an audit trail for modifications to the original data. Well-behaved generic netCDF filters will automatically append their name and the parameters with which they were invoked to the global history attribute of an input netCDF file. We recommend that each line begin with a timestamp indicating the date and time of day that the program was executed.|See ISO Lineage.|
|references|Published or web-based references that describe the data or methods used to produce it.|Citation.|
|comment|Miscellaneous information about the data or methods used to produce it.|Could be included in many places.|
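A CDL sketch of how these six attributes might appear as global attributes in a netCDF header; every value here is illustrative, not drawn from a real dataset:

```
netcdf example_dataset {
// global attributes:
    :title = "Monthly mean sea surface temperature" ;
    :institution = "Example Oceanographic Institute" ;
    :source = "surface observation" ;
    :history = "2011-03-02T14:07Z created by example_ingest v1.0" ;
    :references = "http://example.org/sst-methods" ;
    :comment = "Values are provisional." ;
}
```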