Keyword

Oceans

1602 record(s)
 
  • The dataset comprises chlorophyll-a concentrations from water samples taken during RRS James Clark Ross cruise JR291, from 12/11/2013 to 19/12/2013. The cruise sailed from Stanley, Falklands, and returned to the same port. Samples were taken during transit to Signy Island (South Orkneys), and then up through the Scotia Sea to BAS survey sites P2 and P3, as well as near South Georgia and in the Western Core Box survey area to the north of the island of South Georgia. 170 samples were collected from the ship’s uncontaminated underway supply, with an intake at approximately 6.5 m depth, every two hours during transit periods. 74 samples were collected, using a rosette sampler, from the upper 1000 m during CTD (conductivity, temperature and depth probe) deployments. Each 300 ml sample was filtered through a 0.8 μm pore size, 25 mm diameter MPF300 filter, rinsed with Milli-Q water, placed in an Eppendorf tube and stored at -20°C for later analysis. Samples were extracted in 90% acetone for 22-24 hours at 4°C and measured on a Trilogy Turner Designs 7200 laboratory fluorometer calibrated with a pure chlorophyll-a standard (Sigma, UK) and set up following the method of Welschmeyer (1994). Data have not been adjusted for blanks. The data set is from the annual Western Core Box cruise run by the British Antarctic Survey (BAS). Data were collected to support the PhD of Anna Belcher and to provide seasonal context for the cruise in terms of primary production in the surface ocean. Chlorophyll samples were collected by Elena Ceballos-Romero (University of Sevilla), Frédéric Le Moigne (NOC) and Anna Belcher (NOC), and analysed at the National Oceanography Centre in Southampton by Anna Belcher.

  • This dataset includes post-processed model output from an Amundsen Sea regional configuration of MITgcm. The model has ocean, sea ice and ice shelf components. The model is forced by ERA5 for the years 1979-2021, with the exception of 1996, for which ERA-Interim is used. The data are used for figures in the research paper "Wind-driven coastal polynya variability drives decadal ice-shelf melt variability in the Amundsen Sea" by Michael Haigh, Paul Holland and Thomas Caton Harrison. Creation of the dataset was funded by the NERC project "Drivers of Oceanic Change in the Amundsen Sea", NE/T012803/1.

  • The GEBCO_2019 Grid is a global continuous terrain model for ocean and land with a spatial resolution of 15 arc-seconds. The grid uses as a ‘base’ Version 1 of the SRTM15_plus data set (Sandwell et al.). This data set is a fusion of land topography with measured and estimated seafloor topography, largely based on version 11 of SRTM30_plus. Included on top of this base grid are gridded bathymetric data sets developed by the four Regional Centers of The Nippon Foundation-GEBCO Seabed 2030 Project, and from a number of international and national data repositories and regional mapping initiatives. The GEBCO_2019 Grid represents all data within the 2019 compilation. The compilation of the GEBCO_2019 Grid was carried out at the Seabed 2030 Global Center, hosted at the National Oceanography Centre, UK, with the aim of producing a seamless global terrain model. The majority of the compilation was done using the 'remove-restore' procedure (Smith and Sandwell, 1997; Becker, Sandwell and Smith, 2009; Hell and Jakobsson, 2011). This is a two-stage process: the difference between the new data and the ‘base’ grid is computed, and the gridded difference is then added back to the existing ‘base’ grid. The aim is to achieve a smooth transition between the 'new' and 'base' data sets with minimum perturbation of the existing base data set. The data sets supplied in the form of complete grids (primarily areas north of 60N and south of 50S) were included using feather-blending techniques from the GlobalMapper software. The GEBCO_2019 Grid has been developed through the Nippon Foundation-GEBCO Seabed 2030 Project, a collaborative project between the Nippon Foundation of Japan and the General Bathymetric Chart of the Oceans (GEBCO). It aims to bring together all available bathymetric data to produce the definitive map of the world ocean floor by 2030 and make it available to all.
Funded by the Nippon Foundation, the four Seabed 2030 Regional Centers are: Southern Ocean - hosted at the Alfred Wegener Institute, Germany; South and West Pacific Ocean - hosted at the National Institute of Water and Atmospheric Research, New Zealand; Atlantic and Indian Oceans - hosted at the Lamont-Doherty Earth Observatory, Columbia University, USA; and Arctic and North Pacific Oceans - hosted at Stockholm University, Sweden, and the Center for Coastal and Ocean Mapping at the University of New Hampshire, USA.
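The two-stage 'remove-restore' procedure described above can be sketched in a few lines of Python. This is a minimal illustration under simplified assumptions (nearest-cell assignment of residuals and a crude box blur in place of a proper gridding algorithm), not the Seabed 2030 implementation:

```python
import numpy as np

def remove_restore(base, new_rows, new_cols, new_vals, blur=1):
    """Merge sparse 'new' soundings into a 'base' grid via remove-restore.

    Stage 1 ('remove'): compute residuals between new values and the
    base grid at the same cells. Stage 2: grid the residuals (here a
    simple averaging per cell plus an optional box blur, standing in
    for a real gridding algorithm). Stage 3 ('restore'): add the
    gridded residual field back onto the base grid.
    """
    diff_grid = np.zeros_like(base, dtype=float)
    weight = np.zeros_like(base, dtype=float)
    for r, c, v in zip(new_rows, new_cols, new_vals):
        diff_grid[r, c] += v - base[r, c]   # stage 1: residual vs base
        weight[r, c] += 1.0
    # average where several soundings fall in the same cell
    np.divide(diff_grid, weight, out=diff_grid, where=weight > 0)
    # stage 2: smooth the residual field so the transition back to the
    # base grid is gradual (minimum perturbation away from new data)
    for _ in range(blur):
        diff_grid = (diff_grid
                     + np.roll(diff_grid, 1, 0) + np.roll(diff_grid, -1, 0)
                     + np.roll(diff_grid, 1, 1) + np.roll(diff_grid, -1, 1)) / 5.0
    return base + diff_grid                  # stage 3: restore
```

Cells with no nearby new data keep exactly their base values, which is the "minimum perturbation" property the entry describes.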

  • The CreamT project converted the prototype WireWall wave overtopping field measurement system into a ruggedised monitoring system between August 2020 and August 2023. The system was deployed for up to a year in two high-energy coastal environments on the south-west coast of the UK (Dawlish and Penzance). The system was designed to have a 3-month maintenance interval and was programmed to measure overtopping conditions for 3 hours either side of predicted high tide. The wave-by-wave overtopping data were telemetered to the British Oceanographic Data Centre (BODC) every 10 minutes. At the time of the project, the coastal structures at these sites comprised a vertical sea wall with a small return lip or curve at the top. Both sea walls were fronted by a beach. During the project period the Dawlish beach levels exposed a concrete toe at the base of the wall. In Penzance, the beach covered the sea wall toe and was higher at the south-west monitoring location. The system was designed at the National Oceanography Centre (NOC) and had previously been validated in HR Wallingford’s flume facility and field-tested with Sefton Council (https://www.channelcoast.org/northwest/). During CreamT, three different system configurations were deployed: full WireWall systems, each with an array of six capacitance sensors; smaller WireWand systems, with two capacitance sensors mounted on a single pole to detect overtopping at hazard hotspots; and a WaveWell, using a single sensor on the face of the sea wall. Six datasets are available from the CreamT project.
These contain delayed-mode data from: 1) a WireWall deployed at the crest of the sea wall in Dawlish; 2) a WireWand deployed at the wall just seaward of the railway line in Dawlish; 3) a WireWand deployed at the fence just inland of the railway line in Dawlish; 4) a WaveWell deployed on the face of the sea wall in Dawlish; 5) a WireWall deployed at the crest of the sea wall in Penzance near the Queen’s Hotel; and 6) a WireWall deployed at the crest of the sea wall in Penzance near the Lidl store at Wherrytown. The datasets in Dawlish provide information about the inland distribution of overtopping, and the two datasets in Penzance provide information about the alongshore variability in overtopping hazard. These data can be used alongside the regional monitoring data available from the Southwest Regional Monitoring Programme to investigate the drivers of wave overtopping. All these data can be visualised in a hazard dashboard developed by BODC and hosted on JASMIN, https://coastalhazards.app.noc.ac.uk/. The project was delivered by the National Oceanography Centre in collaboration with BODC and the University of Plymouth under NERC Grant References NE/V002538/1 and NE/V002589/1. Project partners were Network Rail, the Southwest Regional Monitoring Programme, the Environment Agency and the Channel Coastal Observatory.

  • This dataset provides yearly estimates of near-global (65N-65S) ocean heat content and thermosteric sea level, depth-integrated over the upper 700 m of the ocean, for 1970-2023. The yearly values are presented with three-year smoothing and one-sigma error estimates. The dataset builds upon and updates the methodology established in Domingues et al. (2008, Nature), incorporating temperature measurements from ocean observation systems and applying corrections for instrumental biases and sampling irregularities. To estimate ocean heat content for the upper 700 m and the associated thermosteric sea level, ocean temperature profiles were used from the ENACT/ENSEMBLES version 3 (EN3) data set (1970-2004) and Argo/Ifremer profiling floats (2000-2023, updated January 2024). Empirical Orthogonal Functions (EOFs) were used to model the variability of the time-varying sea level and were calculated from 23 years (1993-2015) of satellite altimeter data sourced from the Commonwealth Scientific and Industrial Research Organisation (CSIRO) (TOPEX/Poseidon, Jason-1, Jason-2 and Jason-3).
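The EOF decomposition mentioned above is conventionally computed from a singular value decomposition of the anomaly matrix (time x space): the right singular vectors are the spatial EOF patterns and the scaled left singular vectors are the principal-component time series. A generic numpy sketch on synthetic data (not the CSIRO altimeter product or the paper's exact processing):

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic stand-in for altimeter sea-level maps:
# n_time snapshots of n_space grid points
n_time, n_space = 120, 300
field = rng.standard_normal((n_time, n_space))

# remove the time mean at each grid point to form anomalies
anom = field - field.mean(axis=0)

# SVD: rows of vt are the EOFs (spatial patterns);
# u * s gives the principal-component time series
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt            # shape (n_modes, n_space)
pcs = u * s          # shape (n_time, n_modes)

# fraction of total variance explained by each mode
var_frac = s**2 / np.sum(s**2)

# reconstruct the field from the leading k modes only
k = 10
recon = pcs[:, :k] @ eofs[:k, :]
```

Projecting sparse in-situ profiles onto a small number of leading EOFs derived from well-sampled altimetry is what allows the method to interpolate across data-poor regions and decades.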

  • A set of historical tide gauge sea level records from Santander (northern Spain) has been recovered from logbooks stored at the Spanish National Geographical Institute (IGN). Sea level measurements have been digitised, quality-controlled and merged into a consistent sea level time series. Vertical references among instrument benchmarks have been derived from high-precision vertical levelling surveys. The observations were recorded as daily averages and come from three different instruments at two locations in Santander: first a float gauge and later syphon gauges. The historical sea level record in Santander consists of a daily time series spanning the period 1876-1924; it is further connected to the modern tide gauge station nearby, ensuring datum continuity up to the present. The scarcity of long-term sea-level observations, as well as their uneven geographical distribution, is a major challenge for climate studies that address, for example, the quantification of mean sea-level rise at centennial time scales, the accurate assessment of sea-level acceleration, or long-term changes in sea-level extremes that are vital for coastal risk assessments. This dataset represents an additional effort in sea-level data archaeology and aims at preserving historical scientific heritage that has until now been stored in old archives in non-electronic format. The research was partially funded by the Spanish Ministry of Science, Innovation and Universities. A further two series were rescued from Alicante under the same initiative.

  • To understand seasonal climatic variability in the North East Atlantic, a fortnightly resolution marine climate record spanning 1353-2006 was constructed for shallow inshore waters on the west coast of Scotland using red coralline algae. The data are available in an Excel file as mean winter and summer temperatures with 95% confidence intervals for each year from 1353 to 2006. SCUBA was used to collect a 46 cm core from a coralline algal (Lithothamnion glaciale) deposit in Loch Sween, Scotland. The core was frozen and sectioned longitudinally and into 2 cm horizons. Coralline algae from each horizon were sectioned along the length of each thallus. Mg, Ca and Sr were quantified along each thallus using electron microprobe analysis. For the live-collected surface specimens, this process enabled absolute dates to be assigned to each year's growth band present within the coralline algae. Five thalli down-core were selected for radiocarbon rangefinder dating at the Scottish Universities Environmental Research Centre. Live thalli and the five rangefinder thalli were used as anchor points in the construction of a combined chronology, which was fine-tuned using dendrochronological techniques. Twenty-seven Mg/Ca time series (including anchors) were available, each from an individual thallus. The work was funded by the Natural Environment Research Council and the Royal Society of Edinburgh.

  • The data set consists of digital bathymetric contours taken from the International Bathymetric Chart of the Mediterranean (IBCM) chart series. Most of the IBCM sheets depict contours at depths of 0 m (coastline), 20 m, 50 m, 100 m and 200 m, and at 200 m intervals thereafter, although the actual contours displayed vary slightly from sheet to sheet. The data set is included in the GEBCO Digital Atlas (GDA). Through the GDA software interface the IBCM bathymetric contours can be exported in ASCII or shapefile format. The 10 sheets of the IBCM chart series are on a Mercator projection at a scale of 1:1 million (at 38 N). The Black Sea is included at a scale of 1:2 million. The IBCM (1st Edition) chart series was published by the Head Department of Navigation and Oceanography of the USSR Ministry of Defence, St. Petersburg, under the auspices of the Intergovernmental Oceanographic Commission (IOC) of UNESCO in 1981. The bathymetric contours and coastlines from the IBCM sheets were digitised. Error checking and quality control work on the data set was carried out at BODC. The digital data set was first made available in 1988.

  • This data set consists of a bathymetric grid derived from multibeam bathymetry data from cruise JC071. The bathymetric grid was created by gridding the cleaned raw multibeam data from JC071 at 1/64 arc-minute intervals using the 'nearneighbor' gridding algorithm from the Generic Mapping Tools (GMT) software system. The data set covers an approximate one-degree square with the minimum and maximum longitude and latitude co-ordinates 17.016667W-16.216667W and 48.78333N-49.28333N, located in the Northeast Atlantic Ocean. The data were collected from 7th-8th May 2012 using an EM120 multibeam echo-sounder. The cruise was part of the Porcupine Abyssal Plain (PAP) sustained ocean observation project; the bathymetry data were collected on an opportunistic basis during the cruise. The cruise was operated by the National Oceanography Centre (NOC), with equipment operated by National Marine Facilities Sea Systems. The bathymetric grid was created by BODC as a contribution to the EMODnet High Resolution Seafloor Mapping (HRSM) project.
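The nearest-neighbour gridding idea used above (assigning each grid node a value from nearby soundings within a search radius) can be illustrated with a brute-force Python sketch. This is a simplified assumption-laden stand-in: GMT's nearneighbor module additionally applies sector-based weighted averaging, which is omitted here:

```python
import numpy as np

def grid_nearest(lon, lat, depth, lon_nodes, lat_nodes, radius):
    """Assign each grid node the depth of its nearest sounding,
    provided that sounding lies within 'radius' (in degrees here,
    for simplicity); nodes with no data nearby stay NaN, leaving
    visible gaps rather than fabricated values."""
    lon = np.asarray(lon, dtype=float)
    lat = np.asarray(lat, dtype=float)
    depth = np.asarray(depth, dtype=float)
    grid = np.full((lat_nodes.size, lon_nodes.size), np.nan)
    for i, gy in enumerate(lat_nodes):
        for j, gx in enumerate(lon_nodes):
            # brute-force squared distances from node to all soundings
            d2 = (lon - gx) ** 2 + (lat - gy) ** 2
            k = np.argmin(d2)
            if d2[k] <= radius ** 2:
                grid[i, j] = depth[k]
    return grid
```

Leaving unconstrained nodes as NaN is the key property for bathymetric products: the grid honours the multibeam coverage instead of extrapolating into unsurveyed water.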

  • The dataset consists of 2580 TIFF images of tide gauge charts from Bowling, River Clyde. The images were taken from annual bound volumes of tide gauge charts (~1 page per week, 52 pages per volume). A typical volume measures 37 x 34 x 3.5 cm and pages are single-sided. The ledgers for Bowling begin in 1888 and end in 1952, but under this project only the charts up to 04/01/1939 were photographed. The trace on the original charts was generated by a float tide gauge: the float inside a stilling well was connected by a wire run over pulleys to a pen that moved up and down as the tide rose and fell. The images were generated by a commercial scanning organisation (TownsWeb Archiving Ltd) using a planetary overhead book scanner. In July 2016, The Peel Group Ltd (Glasgow) approached BODC to donate its tidal archive due to office redevelopment. The archive consists of ledgers of tide gauge charts (345 annual bound volumes) and handwritten ledgers (91 bound books) from several locations along the Clyde, with the earliest record beginning in 1841 from Glasgow Harbour. Later that year BODC received a grant from the Marine Environmental Data and Information Network (MEDIN) to photograph a selection of the ledgers. MEDIN released these funds to support small data archiving projects that increase access to industry marine data. Ledgers also exist for Broomielaw, Dalmuir, Gourock, Govan Wharf, Greenock, Partick Wharf Glasgow, Queen's Dock Entrance Glasgow and Rothesay Dock. Most begin in the late 19th century and run to the mid-20th century. It is hoped that these will be digitised in the future, subject to funding.