The general aim of setting up a central database on benthos and plankton was to integrate long-, medium- and short-term datasets on marine biodiversity. Such a database makes it possible to analyse species assemblages and their changes on spatial and temporal scales across Europe. Data collation lasted from early 2007 until August 2008, during which 67 datasets were collected covering three divergent habitats (rocky shores, soft bottoms and the pelagic environment). The database contains a total of 4,525 distinct taxa, 17,117 unique sampling locations and over 45,500 collected samples, representing almost 542,000 distribution records. Geographically, the database covers the North Sea (221,452 distribution records) and the North-East Atlantic (98,796 distribution records), as well as the Baltic Sea, the Arctic and the Mediterranean. Data from 1858 to 2008 are presented in the database, with the longest time series coming from the Baltic Sea soft-bottom benthos. Each delivered dataset was subjected to quality control procedures, especially at the level of taxonomy. The standardisation procedure enables pan-European analyses without the risk of taxonomic artefacts arising from differences in identification skill. A case study on rocky shore and pelagic data from different geographical regions shows a general overestimation of biodiversity when using data before quality control compared with the same estimations after quality control. These results demonstrate that the contribution of a misspelled name, or the use of an obsolete synonym, is comparable to the introduction of a rare species, with adverse effects on subsequent diversity calculations. The quality-checked data source is now ready for testing geographical and temporal hypotheses on a large scale. © Springer Science+Business Media B.V. 2010.
- Data acquisition
- Quality control
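The taxonomic standardisation step described above can be sketched as a simple name-mapping pass: raw taxon names are resolved against a list of accepted names before any diversity metric is computed, so that misspellings and obsolete synonyms do not inflate richness like spurious rare species. The lookup table, species names and function names below are hypothetical illustrations, not part of the actual database or its tools.

```python
# Hedged sketch of taxonomic quality control: map misspelled or
# synonymous raw names onto accepted names, then compare species
# richness before and after standardisation. All names below are
# illustrative assumptions, not entries from the real database.

ACCEPTED = {
    # raw name           -> accepted name
    "Mytilus edulis":      "Mytilus edulis",
    "Mytilus eduliss":     "Mytilus edulis",   # misspelling
    "Asterias rubens":     "Asterias rubens",
    "Asterias vulgaris":   "Asterias rubens",  # obsolete synonym
}

def standardise(records):
    """Replace each raw taxon name with its accepted name; unknown names pass through."""
    return [ACCEPTED.get(name, name) for name in records]

def richness(records):
    """Species richness: number of distinct taxon names in the sample."""
    return len(set(records))

raw = ["Mytilus edulis", "Mytilus eduliss",
       "Asterias rubens", "Asterias vulgaris"]

print(richness(raw))               # 4 distinct raw names before QC
print(richness(standardise(raw)))  # 2 accepted taxa after QC
```

This mirrors the effect reported in the case study: without standardisation, each spelling variant counts as an extra "species", overestimating biodiversity.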