
The GBIF France new portal : Atlas of Living France (ALF)

Authors: Marie-Elise Lecoq, Anne-Sophie Archambeau, Eric Chenin, Fabien Cavière, Sophie Pamerlon, Régine Vignes-Lebbe

GBIF France, 43 rue Buffon, 75005 Paris - gbif(at)gbif.fr

With more than 500 million records documenting species occurrences shared and accessible through GBIF.org, the Global Biodiversity Information Facility (GBIF) is a major actor in biodiversity data discovery. The launch of the new GBIF website in November 2013 saw an improvement in the reliability, access and fitness for use of the published data. With this amount of data, it is not possible to offer as much detail as in national portals, especially in terms of map visualization, field search, data quality, etc.

GBIF France [1] launched its new portal [2], entirely based on Atlas of Living Australia [3] modules, at the end of May 2015, after three months of configuration and installation. We have been able to adapt this generic infrastructure to our local needs (e.g., links to different French structures such as the Service du Patrimoine Naturel for species pages) and requests (e.g., adding a map to the result page of an occurrence search), developing our own design by integrating the portal into our national website, and translating the ALA portal into French.

Process of publishing data into ALF

PHP scripts (a) use the GBIF API (b) to insert information on institutions, collections, data resources and contacts into the MySQL instance (d), using the UUIDs provided by GBIF. We use the Client URL Request Library (cURL) to call the GBIF web services, which return JSON files (c).
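As an illustration of steps (a)-(d), a minimal PHP sketch of that flow could look as follows: it fetches one dataset record from the public GBIF API with cURL and writes a few fields into a local MySQL table. The UUID value, the database credentials and the data_resource table/columns are placeholders assumed for the example, not the actual ALF schema.

```php
<?php
// (a) Minimal sketch of the registry-harvesting script described above.
// The UUID, MySQL credentials and data_resource table are placeholders.

$uuid = 'REPLACE-WITH-GBIF-DATASET-UUID';

// (b) Call the GBIF API with cURL...
$ch = curl_init('https://api.gbif.org/v1/dataset/' . $uuid);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

// (c) ...which returns a JSON document describing the data resource.
$dataset = json_decode($response, true);

// (d) Store a few fields in the MySQL metadata registry (assumed schema).
$pdo = new PDO('mysql:host=localhost;dbname=collectory;charset=utf8', 'alf_user', 'secret');
$stmt = $pdo->prepare(
    'INSERT INTO data_resource (gbif_uuid, title, description) VALUES (:uuid, :title, :descr)'
);
$stmt->execute(array(
    ':uuid'  => $uuid,
    ':title' => $dataset['title'],
    ':descr' => isset($dataset['description']) ? $dataset['description'] : '',
));
```

The same pattern applies to institutions, collections and contacts, which the scripts harvest in the same way.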

[Figure: ALF publishing workflow — GBIF.org feeds the ALF system (PHP scripts, JSON, MySQL, DwC-A), which serves the collection, institution, contact and data resource pages and the occurrence search engine.]

The metadata registry is stored on a MySQL instance.

We download archives from the GBIF web services (2) through the loading tool (1) provided by the software. It stores the Darwin Core Archive (DwC-A) (3) locally and creates, on the MySQL instance (4), a metadata record for each resource.
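A hedged sketch of what that download step does is shown below, using the GBIF dataset endpoint service to locate the published DwC-A. The local archive path and the archive_path column are assumptions for the example; in ALF the work is done by the loading tool shipped with the ALA software.

```php
<?php
// Sketch of steps (1)-(4) described above. Paths and schema are illustrative.

$uuid = 'REPLACE-WITH-GBIF-DATASET-UUID';

// (2) Ask the GBIF web services where the Darwin Core Archive is published.
$json = file_get_contents('https://api.gbif.org/v1/dataset/' . $uuid . '/endpoint');
$endpoints = json_decode($json, true);

foreach ($endpoints as $endpoint) {
    if ($endpoint['type'] === 'DWC_ARCHIVE') {
        // (3) Store the DwC-A locally...
        $target = '/data/alf/archives/' . $uuid . '.zip';
        file_put_contents($target, fopen($endpoint['url'], 'r'));

        // (4) ...and record it in the MySQL metadata registry (assumed schema).
        $pdo = new PDO('mysql:host=localhost;dbname=collectory;charset=utf8', 'alf_user', 'secret');
        $stmt = $pdo->prepare('UPDATE data_resource SET archive_path = :path WHERE gbif_uuid = :uuid');
        $stmt->execute(array(':path' => $target, ':uuid' => $uuid));
        break;
    }
}
```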


The Biocache command line tool is a Java executable with a Scala implementation. The entire process takes four steps: loading, processing, sampling and indexing data into Apache Cassandra (a NoSQL database).
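The four steps can be chained from a small driver script, as in the sketch below. The sub-command names and the data resource identifier are assumptions that simply mirror the step list above and should be checked against the installed biocache CLI.

```php
<?php
// Sketch of scripting the four Biocache steps listed above.
// 'load/process/sample/index' and 'dr123' are assumed placeholders;
// verify the exact sub-commands against the installed biocache tool.

$dataResourceUid = 'dr123';   // hypothetical ALF data resource identifier

foreach (array('load', 'process', 'sample', 'index') as $step) {
    $cmd = 'biocache ' . $step . ' ' . escapeshellarg($dataResourceUid);
    echo "Running: $cmd\n";
    passthru($cmd, $exitCode);
    if ($exitCode !== 0) {
        fwrite(STDERR, "Step '$step' failed with exit code $exitCode\n");
        exit($exitCode);
    }
}
```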


References:
[1] www.gbif.fr
[2] www.gbif.fr, then click on « Consulter »
[3] http://www.ala.org.au/
