Update frequency: unknown. Showing records 1-10 of 2109.
  • The Met Office's Numerical Atmospheric-dispersion Modelling Environment (NAME) was used at the University of Leicester to produce atmospheric dispersion footprints centred on Beijing for use by the projects under the Atmospheric Pollution & Human Health in a Chinese Megacity (APHH) programme. These footprints are created by model runs in which thousands of particles are released from the chosen location and are tracked backwards in time.
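
    A toy illustration of the backward-tracking idea may help: particles released at the receptor are stepped backwards in time, and their residence on a grid is accumulated into a footprint. Everything here (winds, grid, particle and step counts) is a made-up stand-in for illustration, not a NAME configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Release particles at Beijing and step them backwards in time under an
    # assumed mean wind plus random turbulent displacements (toy model only).
    n_particles, n_steps = 5000, 48          # hypothetical counts
    lons = np.full(n_particles, 116.4)       # release longitude (Beijing)
    lats = np.full(n_particles, 39.9)        # release latitude
    u_mean, v_mean, sigma = 0.05, 0.02, 0.1  # assumed wind/turbulence (deg/step)

    # Footprint grid: 1-degree cells over lat 0-60N, lon 70-150E (hypothetical)
    grid = np.zeros((60, 80))
    for _ in range(n_steps):
        lons -= u_mean + sigma * rng.standard_normal(n_particles)
        lats -= v_mean + sigma * rng.standard_normal(n_particles)
        i = np.clip(lats.astype(int), 0, 59)
        j = np.clip((lons - 70).astype(int), 0, 79)
        np.add.at(grid, (i, j), 1.0)         # accumulate particle residence

    footprint = grid / (n_particles * n_steps)  # normalised residence footprint
    ```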

  • This reference only dataset contains Sentinel-1 data that has been modified to provide a Normalised Radar Backscatter analysis-ready dataset over Plymouth. Two months of data are provided for each area in the CARD4L v3.2.2 standard format. The data is designed to be used with the ESA SNAP toolbox. 'UK Analysis-Ready Data (ARD) tests in support of the Committee on Earth Observation Satellites (CEOS) standards' is a project run by the Group on Earth Observations (GEO)/CEOS office; its purpose was to demonstrate the UK's ability to produce ARD to the specified CEOS Analysis Ready Data for Land (CARD4L) standards. The GEO/CEOS office is hosted by NCEO and funded by the UK Space Agency, DEFRA and NERC.

  • This dataset contains atmospheric data from the Weather Research and Forecasting (WRF) model, version 3.8. The model domain covers the Dudh Koshi Valley, and the model was run for July 2013. These data were used to create and examine the effectiveness of a new debris-covered glacier representation in the WRF model. There are eight NetCDF files containing the data: the run with the default glacier landmask (WRF_DudhKoshiHimalayas_201306_CleanIceGlaciers.nc); the run with the new representation of debris-covered glaciers (WRF_DudhKoshiHimalayas_201306_DebrisCoverGlaciers.nc); and six sensitivity tests varying albedo, emissivity and roughness length (WRF_DudhKoshiHimalayas_201306_DebrisCoverGlaciers_albedoHIGH.nc, etc.).
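
    A minimal sketch of how the paired runs might be compared with xarray. The file names come from the description above, but the choice of variable ("TSK", the surface skin temperature in standard WRF output) and the "Time" dimension name are assumptions to check against the actual file headers.

    ```python
    import xarray as xr

    # Open the default clean-ice run and the new debris-covered glacier run
    clean = xr.open_dataset("WRF_DudhKoshiHimalayas_201306_CleanIceGlaciers.nc")
    debris = xr.open_dataset("WRF_DudhKoshiHimalayas_201306_DebrisCoverGlaciers.nc")

    # Time-mean surface (skin) temperature difference due to the debris cover
    diff = (debris["TSK"] - clean["TSK"]).mean(dim="Time")
    print(diff)
    ```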

  • This dataset contains coupled physical-biogeochemical ocean simulation outputs from the second generation Canadian Earth System Model (CanESM2) using the 1 degree NEMO-HadOCC model. The model output contains 3D dissolved inorganic carbon (DIC), alkalinity, temperature and salinity fields at annually averaged frequency, and monthly averaged surface ocean CO2 fugacities and fluxes. Job IDs included in this dataset (CanESM2 surface fluxes; the first started on the 18th, the second on the 21st, and the other two on the 19th): RCP85: u-ao419; RCP26: u-ao519; constant atmospheric CO2, RCP85: u-ao529; constant atmospheric CO2, RCP26: u-ao531 (reduced walltime for NEMO to test). This data was collected in support of CURBCO2: Carbon Uptake Revisited - Biases Corrected using Ocean Observations, a Natural Environment Research Council (NERC) funded project (NERC Grant NE/P015042/1). The overarching aim of this project was to provide UK and international governments with the best possible impartial information from which to plan how best to work towards the global warming targets (the 'Paris Agreement') set at the Paris Climate Conference in December 2015.
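
    A hedged sketch of one way to use the monthly surface flux output, reducing it to an area-weighted global annual mean. The file name and the variable and dimension names ("co2_flux", "area", "y"/"x") are assumptions; check the real NetCDF headers (e.g. with ncdump -h) before use.

    ```python
    import xarray as xr

    ds = xr.open_dataset("nemo_hadocc_u-ao419_surface_co2.nc")  # hypothetical name

    # Area-weighted global mean of the monthly air-sea CO2 flux...
    weights = ds["area"] / ds["area"].sum()
    global_mean = (ds["co2_flux"] * weights).sum(dim=("y", "x"))

    # ...then averaged into annual values
    annual_mean = global_mean.groupby("time.year").mean()
    print(annual_mean)
    ```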

  • Part of the European Space Agency's (ESA) Greenhouse Gases (GHG) Climate Change Initiative (CCI) project and the Climate Research Data Package Number 3 (CRDP#3), the XCH4 GOS SRPR (Proxy) product comprises level 2, column-averaged dry-air mole fractions (mixing ratios) of methane (CH4). The product has been produced using data acquired from the Thermal and Near Infrared Sensor for Carbon Observations (TANSO-FTS) NIR and SWIR spectra, onboard the Japanese Greenhouse gases Observing Satellite (GOSAT). This proxy version of the product has been generated using the RemoTeC SRPR algorithm, which is jointly developed at SRON and KIT. This has been designated an 'alternative' GHG CCI algorithm, and a separate product has also been generated by applying the baseline GHG CCI proxy algorithm (the University of Leicester OCPR algorithm). Users who are unsure whether to use the baseline or the alternative product are advised to use the OCPR product generated with the baseline algorithm. For more information on the differences between the baseline and alternative algorithms, please see the GHG-CCI data products webpage. The data product is stored as a single NetCDF file per day. Retrieval results are provided for the individual GOSAT spatial footprints; no averaging has been applied. As well as the key product, the product file contains information relevant to the use of the data, such as the vertical layering and averaging kernels. The parameters retrieved simultaneously with XCH4 (e.g. surface albedo) are also included, in addition to retrieval diagnostics such as quality of fit and retrieval errors. For further details on the product, including the RemoTeC algorithm and the TANSO-FTS instrument, please see the associated product user guide (PUG) or the Algorithm Theoretical Basis Documents in the documentation section. The GHG-CCI team encourage all users of their products to register with them to receive information on any updates or issues regarding the data products and to receive notification of new product releases. To register, please use the following link: http://www.iup.uni-bremen.de/sciamachy/NIR_NADIR_WFM_DOAS/CRDP_REG/
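
    Since retrievals are reported per sounding with quality diagnostics, a typical first step is filtering on the quality flag. A minimal sketch, assuming hypothetical file and variable names ("xch4", "xch4_quality_flag"); the PUG gives the definitive names and flag convention.

    ```python
    import xarray as xr

    # One daily L2 file of XCH4 soundings (file name is illustrative)
    ds = xr.open_dataset("ESACCI-GHG-L2-CH4-GOSAT-SRPR-20100101-fv1.nc")

    # Keep only soundings flagged as good (assumed convention: 0 = good)
    good = ds.where(ds["xch4_quality_flag"] == 0, drop=True)
    print(float(good["xch4"].mean()), "daily mean of quality-filtered XCH4")
    ```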

  • Data used in Climate Change 2001, the Third Assessment Report (TAR) of the United Nations Intergovernmental Panel on Climate Change (IPCC). Simulations of global climate models were run by various climate modelling groups coordinated by the World Climate Research Programme (WCRP) on behalf of the IPCC. Climatology data were calculated from global climate model simulations of experiments representative of Special Report on Emissions Scenarios (SRES) scenarios: A1F, A1T, A1a, A2a, A2b, A2c, B1a, B2b. The climatologies are 30-year averages, with climate anomalies expressed relative to the period 1961-1990. The monthly climatology data cover the period 1961-2100. The climatologies are global in scope and are provided on latitude-longitude grids.
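
    The anomaly convention above amounts to subtracting the 1961-1990 baseline climatology from a scenario climatology, grid cell by grid cell. A minimal sketch with hypothetical file and variable names:

    ```python
    import xarray as xr

    # 30-year climatologies: the 1961-1990 baseline and an SRES A2a scenario
    baseline = xr.open_dataset("tar_climatology_1961-1990.nc")["tas"]
    scenario = xr.open_dataset("tar_climatology_A2a_2071-2100.nc")["tas"]

    # Anomaly relative to 1961-1990, per grid cell and calendar month
    anomaly = scenario - baseline
    print(float(anomaly.mean()), "global-mean anomaly")
    ```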

  • Cloud properties derived from the AVHRR instrument on NOAA-18 by the ESA Cloud CCI project. The L3U dataset consists of cloud properties from L2 data granules remapped to a global spatial grid of 0.1 degree in latitude and longitude, without combining any observations from overlapping orbits; only sampling is performed. Common notations for this processing level are also L2b and L2G. Data is provided with a temporal resolution of 1 day. This dataset is version 1.0 data from Phase 1 of the CCI project.
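
    The "sampling only" remapping can be pictured as dropping each L2 pixel into its containing 0.1-degree cell, with later samples simply overwriting earlier ones rather than being averaged. A purely illustrative sketch (not the CCI processing code):

    ```python
    import numpy as np

    NLAT, NLON = 1800, 3600                       # 0.1-degree global grid

    def remap_sample(lats, lons, values, grid):
        """Assign each L2 value to its 0.1-degree cell; no averaging."""
        i = np.clip(((lats + 90.0) / 0.1).astype(int), 0, NLAT - 1)
        j = np.clip(((lons + 180.0) / 0.1).astype(int), 0, NLON - 1)
        grid[i, j] = values                       # later samples overwrite earlier
        return grid

    grid = np.full((NLAT, NLON), np.nan)
    # Synthetic L2 pixels for illustration
    lats = np.array([51.52, 51.48, -10.05])
    lons = np.array([-0.13, -0.10, 120.33])
    vals = np.array([0.72, 0.65, 0.31])
    grid = remap_sample(lats, lons, vals, grid)
    ```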

  • A geographic information system (GIS) heat flow and temperature model of East Africa, created by extracting data from open sources into a series of shapefiles and rasters. These contain information on geothermal sites, hot spring locations, a digital elevation model, surface temperature, geothermal gradients, thermal conductivities, heat flow data, major faults, surface geology, crustal basement, the electrification grid system and population density across East Africa. The data is stored in the World Geodetic System (WGS) 1984 geographic coordinate system.
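
    Because every layer shares the WGS 1984 geographic coordinate system (EPSG:4326), the shapefiles and rasters can be overlaid without reprojection. A minimal sketch with a hypothetical layer name:

    ```python
    import geopandas as gpd

    # Load one of the vector layers (file name is illustrative)
    hot_springs = gpd.read_file("hot_spring_locations.shp")

    # All layers are in WGS 1984 geographic coordinates (EPSG:4326)
    assert hot_springs.crs.to_epsg() == 4326
    print(hot_springs.head())
    ```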

  • The 20th Century Reanalysis dataset provides 6-hourly analyses on a global grid from 1870 to present, produced from a series of 56-member ensemble runs. These data are from each of the 56 ensemble members from the run covering 2001 to present. They were produced on a 2 degree latitude-longitude (180x91) global grid and include data both at the surface and on pressure levels. The dataset authors request that the following acknowledgment be included in all papers using the dataset: 'Support for the Twentieth Century Reanalysis Project dataset is provided by the U.S. Department of Energy, Office of Science Innovative and Novel Computational Impact on Theory and Experiment (DOE INCITE - http://www.doeleadershipcomputing.org/incite-program/ ) program, and Office of Biological and Environmental Research (BER - http://science.energy.gov/ber/ ), and by the National Oceanic and Atmospheric Administration Climate Program Office (http://www.climate.noaa.gov/).'
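
    With all 56 members in hand, a common reduction is the ensemble mean (best estimate) and spread (an uncertainty proxy). A sketch assuming a hypothetical file name, a hypothetical variable name ("prmsl") and an ensemble dimension called "member":

    ```python
    import xarray as xr

    ds = xr.open_dataset("20cr_prmsl_2001.nc")   # hypothetical file name

    ens_mean = ds["prmsl"].mean(dim="member")    # ensemble-mean field
    ens_sd = ds["prmsl"].std(dim="member")       # member spread as uncertainty proxy
    print(ens_mean.shape, ens_sd.shape)          # fields on the 2-degree 180x91 grid
    ```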

  • This dataset contains a series of 99 limited-area models (LAMs) nested within the Met Office global model, generated with the Met Office Unified Model (MetUM) deployed on the xce, xcf and xcs machines in Exeter. The model data were generated using a nesting suite (u-bw210) that runs an N512 global forecast with 99 embedded limited-area models, each using a convection-permitting grid-length of 1.5 km. The LAMs are each 360x360 grid points. The outer region is deemed a spin-up region and is ignored; the central 240x240 points are then coarse-grained onto a 45 km scale using 30x30 horizontal averaging to produce an 8x8=64 grid of spatially averaged data. Each file contains data from only one of these 64 subdomains, but from every one of the 99 regions around the globe. The nesting simulations are free-running within each LAM, but the driving model is re-initialised every 00Z using operational atmospheric analyses. All 99 regions are wholly over the sea. The central lat/lon for each of the 99 regions are: (80,-150), (70,0), (60,-35), (60,-15), (50,-160), (50,-140), (50,-45), (50,-25), (50,-149), (50,170), (40,-160), (40,-140), (40,-65), (40,-45), (40,-25), (40,150), (40,170), (30,-170), (30,-150), (30,-130), (29,-65), (30,-45), (30,-25), (30,145), (30,170), (20,-170), (20,-145), (21,-115), (20,-55), (20,-30), (20,65), (20,135), (20,170), (10,-170), (10,-140), (10,-120), (10,-100), (10,-50), (10,-30), (10,60), (10,88), (10,145), (10,160), (0,-160), (0,-130), (0,-100), (0,-30), (0,-15), (0,0), (0,50), (0,70), (0,88), (0,160), (-10,-170), (-10,-140), (-10,-120), (-10,-90), (-10,-30), (-10,-15), (-10,5), (-10,60), (-10,88), (-10,170), (-20,-160), (-20,-130), (-20,-100), (-20,-30), (-20,0), (-20,55), (-20,80), (-20,105), (-30,-160), (-30,-130), (-30,-100), (-30,-40), (-30,-15), (-30,10), (-30,60), (-30,88), (-40,-160), (-40,-130), (-40,-100), (-40,-50), (-40,0), (-40,50), (-40,100), (-50,-150), (-50,-90), (-50,-30), (-50,30), (-50,88), (-50,150), (-60,-140), (-60,-70), (-60,0), (-60,70), (-60,140), (-70,-160), (-70,-40). The data has near-global coverage, but via this series of small domains. Training data is available for six months: Jan, Mar, Apr, Jul, Oct and Dec 2016. Test data is available for Jun 2017.
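
    The coarse-graining step described above reduces to a block average: take the central 240x240 points, view them as an 8x8 array of 30x30 blocks, and average each block. A sketch on a synthetic field:

    ```python
    import numpy as np

    field = np.random.rand(360, 360)     # one LAM field on the 1.5 km grid (synthetic)

    central = field[60:300, 60:300]      # drop the 60-point spin-up rim on each side
    coarse = central.reshape(8, 30, 8, 30).mean(axis=(1, 3))
    print(coarse.shape)                  # (8, 8): the 64 spatially averaged subdomains
    ```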