Create the SeriesCatalog table
Usage
create_series_catalog(
  L0_flat = NULL,
  Sources = NULL,
  Methods = NULL,
  Variables = NULL,
  Sites = NULL,
  QualityControlLevels = NULL,
  DataValues = NULL
)
Arguments
- L0_flat
(tbl_df, tbl, data.frame) The fully joined source L0 dataset, in "flat" format (see details).
- Sources
(tbl_df, tbl, data.frame) The Sources table.
- Methods
(tbl_df, tbl, data.frame) The Methods table.
- Variables
(tbl_df, tbl, data.frame) The Variables table.
- Sites
(tbl_df, tbl, data.frame) The Sites table.
- QualityControlLevels
(tbl_df, tbl, data.frame) The QualityControlLevels table.
- DataValues
(tbl_df, tbl, data.frame) The DataValues table.
Details
This function appends columns to the L0_flat table and returns the augmented table. Alternatively, as in the example below, the SeriesCatalog can be built from the six other required tables without L0_flat.
The "flat" format refers to the fully joined source L0 dataset in "wide" form, with the exception of the core observation variables, which remain in "long" form (i.e., they use the variable_name, value, and unit columns of the observation table). This "flat" format is the "widest" an L1 hymetDP dataset can be consistently spread, due to the frequent occurrence of L0 source datasets with more than one core observation variable.
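The wide/long distinction can be sketched with a toy tibble (the values and the site/datetime column names here are illustrative, not prescribed by hymetDP):

```r
library(tibble)

# "Flat" L0 data: descriptive columns (site, datetime) are wide -- one column
# each -- while the core observations stay long: one row per variable per
# reading, described by the variable_name, value, and unit columns.
flat <- tribble(
  ~SiteCode, ~LocalDateTime,        ~variable_name, ~value, ~unit,
  "1",       "2002-01-11 23:45:00", "Discharge",      12.5, "liters per second",
  "1",       "2002-01-11 23:45:00", "Temperature",     0.4, "degree celsius"
)

# Spreading variable_name any wider would force one column per core variable,
# which cannot be done consistently when source datasets carry several of them.
```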
See also
Other create required tables:
create_data_values(), create_methods(), create_quality_control(), create_sites(), create_sources(), create_variables()
Examples
flat <- hymet_L0_flat
Sources <- hymetDP::create_sources(
  L0_flat = flat,
  SourceCode = "SourceCode",
  Organization = "Organization",
  SourceDescription = "SourceDescription",
  SourceLink = "SourceLink",
  ContactName = "ContactName",
  Phone = "Phone",
  Email = "Email",
  Address = "Address",
  City = "City",
  State = "State",
  ZipCode = "ZipCode",
  Citation = "Citation")

Methods <- hymetDP::create_methods(
  L0_flat = flat,
  MethodCode = "MethodCode",
  MethodDescription = "MethodDescription")

Variables <- hymetDP::create_variables(
  L0_flat = flat,
  VariableCode = "VariableCode",
  VariableName = "VariableName",
  VariableUnitsName = "VariableUnitsName",
  SampleMedium = "SampleMedium",
  ValueType = "ValueType",
  IsRegular = "IsRegular",
  TimeSupport = "TimeSupport",
  TimeUnitsName = "TimeUnitsName",
  DataType = "DataType",
  GeneralCategory = "GeneralCategory",
  NoDataValue = "NoDataValue")

Sites <- hymetDP::create_sites(
  L0_flat = flat,
  SiteCode = "SiteCode",
  SiteName = "SiteName",
  Latitude = "Latitude",
  Longitude = "Longitude",
  LatLongDatumSRSName = NULL,
  Elevation_m = NULL,
  VerticalDatum = NULL,
  LocalX = NULL,
  LocalY = NULL,
  LocalProjectionSRSName = NULL,
  PosAccuracy_m = NULL,
  State = NULL,
  County = NULL,
  Comments = NULL,
  SiteType = "SiteType")

QualityControlLevels <- hymetDP::create_quality_control(
  L0_flat = flat,
  QualityControlLevelCode = "QualityControlLevelCode",
  Definition = "Definition",
  Explanation = "Explanation")

DataValues <- hymetDP::create_data_values(
  L0_flat = flat,
  ValueID = "ValueID",
  DataValue = "DataValue",
  ValueAccuracy = NULL,
  LocalDateTime = "LocalDateTime",
  UTCOffset = "UTCOffset",
  DateTimeUTC = "DateTimeUTC",
  SiteCode = "SiteCode",
  VariableCode = "VariableCode",
  OffsetValue = NULL,
  OffsetTypeCode = NULL,
  CensorCode = NULL,
  QualifierCode = NULL,
  MethodCode = "MethodCode",
  QualityControlLevelCode = "QualityControlLevelCode",
  SourceCode = "SourceCode",
  NoDataValue = "NoDataValue")

SeriesCatalog <- hymetDP::create_series_catalog(
  Sources = Sources,
  Methods = Methods,
  Variables = Variables,
  Sites = Sites,
  QualityControlLevels = QualityControlLevels,
  DataValues = DataValues)
SeriesCatalog
#> SeriesID SiteCode
#> 1 1 1
#> 2 2 1
#> 3 3 1
#> SiteName
#> 1 USGS site 9; coordinates taken from 1996-97 GPS measurements at center of weir Parent Stream: Andersen Creek Provenance : GPS96-97.DOCID: andrsn_h1
#> 2 USGS site 9; coordinates taken from 1996-97 GPS measurements at center of weir Parent Stream: Andersen Creek Provenance : GPS96-97.DOCID: andrsn_h1
#> 3 USGS site 9; coordinates taken from 1996-97 GPS measurements at center of weir Parent Stream: Andersen Creek Provenance : GPS96-97.DOCID: andrsn_h1
#> VariableCode VariableName VariableUnitsName SampleMedium
#> 1 1 Discharge liters per second Surface water
#> 2 2 Temperature degree celsius Surface water
#> 3 3 Specific conductance microsiemens per centimeter Surface water
#> ValueType TimeSupport TimeUnitsName DataType GeneralCategory
#> 1 Derived Value 15 minute Continuous Hydrology
#> 2 Field Observation 15 minute Continuous Hydrology
#> 3 Field Observation 15 minute Continuous Hydrology
#> MethodCode
#> 1 1
#> 2 1
#> 3 1
#> MethodDescription
#> 1 Campbell CR10 dataloggers were used to record stream stage, water temperature, and conductivity in a network of stream gages. Stage is monitored with pressure transducers; PSS-1 and PS-2 models form Paroscientific Corporation, and Accubars from Sutron Corporation. The pressure transducers measure the backpressure in orifice lines set into or above controls in the stream channel. In addition, some of the sites monitor water temperature and conductivity with either USGS minimonitor probes, or Campbell temperature/conductivity probes. Ratings are developed for the stage/discharge relationship at each site by measuring streamflow with current meters or portable flumes, according to standard USGS methods. Datum corrections to the stage are determined by periodically surveying the elevation of the orifice line to the control and nearby reference marks. Calibrations for the temperature and conductivity are assessed by measuring these parameters with portable field meters while simultaneously noting the readings from the gage probes. Data is downloaded into Campbell storage modules, and retrieved into pcs. From there, the data is sent to a USGS computer, where time discrepancies are resolved, and the data is loaded into ADAPS, a database system developed in the USGS for maintaining and processing water data. A determination for each site as to when the stream was flowing and when it was not is made. For water temperature and conductivity, bad data is deleted. Variable shifts are determined based on field calibration measurements, and other indicators. The shifts are applied to the remaining good data inside of ADAPS. The data is pulled out of ADAPS, and reformatted for input into ORACLE. Cases of water temperature below reasonable values are set to lower limits. A quality code is assigned to every value. The resulting data is uploaded into the ORACLE and the McMurdo database. Before 2011, For stage/discharge, bad data is deleted. 
Survey data is reviewed to compute weir elevations an datum corrections. A rating curve is developed graphically, based on available data, and entered into ADAPS. All applicable shifts and datum corrections are entered into ADAPS. All corrections and ratings are run against the good stage data to compute the discharge at each recording interval. The data is pulled out of ADAPS, and reformatted for input into ORACLE. A quality code is assigned to every value. The resulting data is uploaded into ORACLE and the McMurdo database. ADAPS deprecated in favor of Aquarius software in 2012. Similar procedure is used in Aquarius to convert and curate the data. Metadata was enhanced in 2013 and 2014 by Inigo San Gil. In March 2021, data from the 2016-17 field season was replaced to correct a previously published error, in which discharge was reported in cubicFeetPerSecond (CFS) instead of litersPerSecond (l/s).
#> 2 Campbell CR10 dataloggers were used to record stream stage, water temperature, and conductivity in a network of stream gages. Stage is monitored with pressure transducers; PSS-1 and PS-2 models form Paroscientific Corporation, and Accubars from Sutron Corporation. The pressure transducers measure the backpressure in orifice lines set into or above controls in the stream channel. In addition, some of the sites monitor water temperature and conductivity with either USGS minimonitor probes, or Campbell temperature/conductivity probes. Ratings are developed for the stage/discharge relationship at each site by measuring streamflow with current meters or portable flumes, according to standard USGS methods. Datum corrections to the stage are determined by periodically surveying the elevation of the orifice line to the control and nearby reference marks. Calibrations for the temperature and conductivity are assessed by measuring these parameters with portable field meters while simultaneously noting the readings from the gage probes. Data is downloaded into Campbell storage modules, and retrieved into pcs. From there, the data is sent to a USGS computer, where time discrepancies are resolved, and the data is loaded into ADAPS, a database system developed in the USGS for maintaining and processing water data. A determination for each site as to when the stream was flowing and when it was not is made. For water temperature and conductivity, bad data is deleted. Variable shifts are determined based on field calibration measurements, and other indicators. The shifts are applied to the remaining good data inside of ADAPS. The data is pulled out of ADAPS, and reformatted for input into ORACLE. Cases of water temperature below reasonable values are set to lower limits. A quality code is assigned to every value. The resulting data is uploaded into the ORACLE and the McMurdo database. Before 2011, For stage/discharge, bad data is deleted. 
Survey data is reviewed to compute weir elevations an datum corrections. A rating curve is developed graphically, based on available data, and entered into ADAPS. All applicable shifts and datum corrections are entered into ADAPS. All corrections and ratings are run against the good stage data to compute the discharge at each recording interval. The data is pulled out of ADAPS, and reformatted for input into ORACLE. A quality code is assigned to every value. The resulting data is uploaded into ORACLE and the McMurdo database. ADAPS deprecated in favor of Aquarius software in 2012. Similar procedure is used in Aquarius to convert and curate the data. Metadata was enhanced in 2013 and 2014 by Inigo San Gil. In March 2021, data from the 2016-17 field season was replaced to correct a previously published error, in which discharge was reported in cubicFeetPerSecond (CFS) instead of litersPerSecond (l/s).
#> 3 Campbell CR10 dataloggers were used to record stream stage, water temperature, and conductivity in a network of stream gages. Stage is monitored with pressure transducers; PSS-1 and PS-2 models form Paroscientific Corporation, and Accubars from Sutron Corporation. The pressure transducers measure the backpressure in orifice lines set into or above controls in the stream channel. In addition, some of the sites monitor water temperature and conductivity with either USGS minimonitor probes, or Campbell temperature/conductivity probes. Ratings are developed for the stage/discharge relationship at each site by measuring streamflow with current meters or portable flumes, according to standard USGS methods. Datum corrections to the stage are determined by periodically surveying the elevation of the orifice line to the control and nearby reference marks. Calibrations for the temperature and conductivity are assessed by measuring these parameters with portable field meters while simultaneously noting the readings from the gage probes. Data is downloaded into Campbell storage modules, and retrieved into pcs. From there, the data is sent to a USGS computer, where time discrepancies are resolved, and the data is loaded into ADAPS, a database system developed in the USGS for maintaining and processing water data. A determination for each site as to when the stream was flowing and when it was not is made. For water temperature and conductivity, bad data is deleted. Variable shifts are determined based on field calibration measurements, and other indicators. The shifts are applied to the remaining good data inside of ADAPS. The data is pulled out of ADAPS, and reformatted for input into ORACLE. Cases of water temperature below reasonable values are set to lower limits. A quality code is assigned to every value. The resulting data is uploaded into the ORACLE and the McMurdo database. Before 2011, For stage/discharge, bad data is deleted. 
Survey data is reviewed to compute weir elevations an datum corrections. A rating curve is developed graphically, based on available data, and entered into ADAPS. All applicable shifts and datum corrections are entered into ADAPS. All corrections and ratings are run against the good stage data to compute the discharge at each recording interval. The data is pulled out of ADAPS, and reformatted for input into ORACLE. A quality code is assigned to every value. The resulting data is uploaded into ORACLE and the McMurdo database. ADAPS deprecated in favor of Aquarius software in 2012. Similar procedure is used in Aquarius to convert and curate the data. Metadata was enhanced in 2013 and 2014 by Inigo San Gil. In March 2021, data from the 2016-17 field season was replaced to correct a previously published error, in which discharge was reported in cubicFeetPerSecond (CFS) instead of litersPerSecond (l/s).
#> SourceCode Organization
#> 1 1 McMurdo Dry Valleys LTER
#> 2 1 McMurdo Dry Valleys LTER
#> 3 1 McMurdo Dry Valleys LTER
#> SourceDescription
#> 1 As part of the Long Term Ecological Research (LTER) project in the McMurdo Dry Valleys of Antarctica, a systematic sampling program has been undertaken to monitor the glacial meltwater streams in that region. This package contains data pertaining to continuous monitored water quality and quantity parameters measured with automatic recording devices on streams in this region. Specifically, this metadata record describes the hydrology data set for the McMurdo Dry Valleys' Andersen Creek at the H1 streamgage, located in the Hoare Basin of Taylor Valley. Measurements commenced during the 1993-94 season and are ongoing. This dataset extends through the first half of the 2019-20 field season.
#> 2 As part of the Long Term Ecological Research (LTER) project in the McMurdo Dry Valleys of Antarctica, a systematic sampling program has been undertaken to monitor the glacial meltwater streams in that region. This package contains data pertaining to continuous monitored water quality and quantity parameters measured with automatic recording devices on streams in this region. Specifically, this metadata record describes the hydrology data set for the McMurdo Dry Valleys' Andersen Creek at the H1 streamgage, located in the Hoare Basin of Taylor Valley. Measurements commenced during the 1993-94 season and are ongoing. This dataset extends through the first half of the 2019-20 field season.
#> 3 As part of the Long Term Ecological Research (LTER) project in the McMurdo Dry Valleys of Antarctica, a systematic sampling program has been undertaken to monitor the glacial meltwater streams in that region. This package contains data pertaining to continuous monitored water quality and quantity parameters measured with automatic recording devices on streams in this region. Specifically, this metadata record describes the hydrology data set for the McMurdo Dry Valleys' Andersen Creek at the H1 streamgage, located in the Hoare Basin of Taylor Valley. Measurements commenced during the 1993-94 season and are ongoing. This dataset extends through the first half of the 2019-20 field season.
#> Citation
#> 1 Gooseff, M. and D. McKnight. 2021. Seasonal high-frequency measurements of discharge, water temperature, and specific conductivity from Andersen Creek at H1, McMurdo Dry Valleys, Antarctica (1993-2020, ongoing) ver 11. Environmental Data Initiative. https://doi.org/10.6073/pasta/7dd79fd8d81b421f8b64d446babbab65. (Accessed 2022-04-11).
#> 2 Gooseff, M. and D. McKnight. 2021. Seasonal high-frequency measurements of discharge, water temperature, and specific conductivity from Andersen Creek at H1, McMurdo Dry Valleys, Antarctica (1993-2020, ongoing) ver 11. Environmental Data Initiative. https://doi.org/10.6073/pasta/7dd79fd8d81b421f8b64d446babbab65. (Accessed 2022-04-11).
#> 3 Gooseff, M. and D. McKnight. 2021. Seasonal high-frequency measurements of discharge, water temperature, and specific conductivity from Andersen Creek at H1, McMurdo Dry Valleys, Antarctica (1993-2020, ongoing) ver 11. Environmental Data Initiative. https://doi.org/10.6073/pasta/7dd79fd8d81b421f8b64d446babbab65. (Accessed 2022-04-11).
#> QualityControlLevelCode BeginDateTime EndDateTime
#> 1 1 2002-01-11 23:45:00 2003-12-26 09:30:00
#> 2 1 2002-01-11 23:45:00 2003-12-26 09:30:00
#> 3 1 2002-01-11 23:45:00 2003-12-26 09:30:00
#> BeginDateTimeUTC EndDateTimeUTC ValueCount
#> 1 2002-01-11 10:45:00 2003-12-25 20:30:00 10001
#> 2 2002-01-11 10:45:00 2003-12-25 20:30:00 10001
#> 3 2002-01-11 10:45:00 2003-12-25 20:30:00 10001