
Page 1: SMOS Ocean Salinity Validation Overview

SMOS Validation and Retrieval Team Workshop, ESRIN, November 2010

SMOS Ocean Salinity Validation Overview


J. Font, J. Boutin, N. Reul, P. Spurgeon, J. Ballabrera, A. Chuprin, C. Gabarró, J. Gourrion, C. Hénocq, S. Lavender, N. Martin, J. Martínez, M. McCulloch, I. Meirold-Mautner, C. Mugerin, F. Petitcolin, M. Portabella,

R. Sabia, M. Talone, J. Tenerelli, A. Turiel, J.L. Vergely, P. Waldteufel, X. Yin, S. Zine, S. Delwart (The SMOS L2 OS Team)

ICM-CSIC/SMOS-BEC Barcelona, LOCEAN Paris, IFREMER Brest, ARGANS Plymouth, ACRI-ST Sophia Antipolis, ESA-ESTEC Noordwijk

[email protected]

Page 2: SMOS Ocean Salinity Validation Overview


SMOS SSS mission requirements

• The multiangular measurements of any point on the Earth’s surface provided by the SMOS interferometric radiometer MIRAS at each satellite overpass are aimed at:

– Determining sea surface salinity with an accuracy of the order of 0.1 practical salinity units, 100 – 200 km spatial resolution and 10 – 30 days temporal resolution


Overall SMOS scientific goal

To provide global coverage of Sea Surface Salinity fields, with repetition rate and accuracy adequate for oceanographic, climatological and hydrological studies, and to increase the present knowledge on:
• Large-scale ocean circulation
• Quantitative estimation of water-cycle exchange rates
• Occurrence of natural catastrophic events
• Management of water resources
• Role of the ocean in the climate system

Page 3: SMOS Ocean Salinity Validation Overview


The sensitivity of Tb to SSS is small, especially at low temperatures (a few K over the full salinity range, versus roughly 100 K of dynamic range for soil moisture).

Retrieving salinity with SMOS is therefore a challenge that requires very good instrument performance and very demanding data processing; the models and algorithms implemented are now being improved.

Polarisation and viewing geometry provide a wider range of Tb values.

[Figure: SSS and Tb values]
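To make the smallness of this signal concrete, here is a back-of-the-envelope sketch (not from the slides; the sensitivity and noise values are order-of-magnitude assumptions) of the Tb change produced by a 0.1 psu salinity change and of how many independent looks would be needed to resolve it against typical per-look radiometric noise.

```python
import math

# Assumed, indicative sensitivities of L-band Tb to SSS (K per psu);
# warm water gives a stronger signal than cold water.
SENSITIVITY_K_PER_PSU = {"warm (SST ~ 25 degC)": 0.5, "cold (SST ~ 5 degC)": 0.25}

TARGET_SSS_ACCURACY_PSU = 0.1   # accuracy quoted in the mission requirements
RADIOMETRIC_NOISE_K = 2.5       # assumed per-look noise, order of magnitude only

for regime, sens in SENSITIVITY_K_PER_PSU.items():
    signal_k = sens * TARGET_SSS_ACCURACY_PSU                    # Tb change for 0.1 psu
    n_looks = math.ceil((RADIOMETRIC_NOISE_K / signal_k) ** 2)   # noise/sqrt(N) ~ signal
    print(f"{regime}: 0.1 psu -> {signal_k:.3f} K; "
          f"~{n_looks} independent looks needed against {RADIOMETRIC_NOISE_K} K noise")
```

The numbers are illustrative only, but they show why both the multi-angular views and the level-3 spatio-temporal averaging discussed later are needed.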

Page 4: SMOS Ocean Salinity Validation Overview


SMOS Sea Surface Salinity retrieval

• Retrieval from brightness temperature through a convergence loop

• Comparing model (guessed SSS) with measurements at all incidence angles

• Sea surface emissivity model (top 1 cm) including roughness effects

• Other contributions from external sources and atmospheric effects

• Need for auxiliary data to describe environmental conditions (ECMWF)


Tb,p = Tb,p^flat(θ, SST, SSS) + Tb,p^rough(θ, φ, wind waves, swell, other wave characteristics, foam coverage, foam emissivity, rain)
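As an illustration of this convergence loop (a minimal sketch, not the L2 OS implementation), the code below iteratively adjusts a guessed SSS and wind speed until a forward model reproduces the multi-angular Tb measurements in a weighted least-squares sense. The forward model, its coefficients and the prior uncertainties are invented placeholders for the real flat-sea and roughness emissivity models.

```python
import numpy as np
from scipy.optimize import least_squares

def toy_forward_tb(sss, wind, theta_deg):
    """Crude stand-in for Tb_flat(theta, SST, SSS) + Tb_rough(theta, wind, ...).
    All coefficients are invented for illustration only."""
    theta = np.radians(theta_deg)
    tb_flat = 110.0 + 15.0 * np.sin(theta) ** 2 - (0.3 + 0.4 * np.cos(theta) ** 2) * (sss - 35.0)
    tb_rough = (0.25 - 0.1 * np.sin(theta) ** 2) * wind
    return tb_flat + tb_rough

# Synthetic multi-angular "measurements" for a fresh (low-SSS) scene
angles = np.linspace(5.0, 55.0, 20)
rng = np.random.default_rng(0)
tb_meas = toy_forward_tb(32.0, 7.0, angles) + rng.normal(0.0, 1.0, angles.size)

# First-guess values and assumed uncertainties (loose SSS prior, tighter wind prior)
sss_prior, wind_prior = 35.0, 8.0
sigma_tb, sigma_sss, sigma_wind = 1.0, 100.0, 1.5

def residuals(x):
    """Weighted misfits: modelled vs measured Tb at all angles, plus prior terms."""
    sss, wind = x
    r_tb = (toy_forward_tb(sss, wind, angles) - tb_meas) / sigma_tb
    r_prior = [(sss - sss_prior) / sigma_sss, (wind - wind_prior) / sigma_wind]
    return np.concatenate([r_tb, r_prior])

result = least_squares(residuals, x0=[sss_prior, wind_prior])  # iterative convergence loop
print("retrieved SSS %.2f psu, wind %.2f m/s" % tuple(result.x))
```

Using all incidence angles in a single cost function is what makes the inversion possible at all, since one look alone carries far too little salinity information.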

[Diagram: ocean and atmosphere contributions to the measured Tb]

Page 5: SMOS Ocean Salinity Validation Overview


SMOS SSS retrieval expectations

• L2 accuracy estimated during algorithm validation:
– in the 1–2 psu range
– a function of distance to the sub-satellite track
– dependent on environmental variables (uncertainty is larger in cold waters)

• Mission requirements expected to be reachable through spatio-temporal averaging (level 3); see the sketch after the links below

• CATDS http://www.cesbio.ups-tlse.fr/fr/smos/smos_catds.html

• CP34 http://www.cp34-smos.icm.csic.es/.
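A quick worked example of the averaging argument, with assumed numbers taken from the ranges quoted above (treating the level-2 errors as roughly independent):

```python
import math

sigma_l2_psu = 1.5   # a typical single-retrieval error within the 1-2 psu range above
target_psu = 0.1     # mission requirement at 100-200 km / 10-30 days

# With roughly independent errors, sigma_L3 ~ sigma_L2 / sqrt(N)
n_needed = math.ceil((sigma_l2_psu / target_psu) ** 2)
print(f"~{n_needed} independent L2 retrievals needed per space-time cell")

# A 100-200 km cell revisited over 10-30 days accumulates hundreds of L2 grid
# points (order-of-magnitude assumption), so the requirement is plausible at
# level 3 provided the remaining errors are noise-like rather than biases.
```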

Page 6: SMOS Ocean Salinity Validation Overview


The SMOS L2 OS processor


Retrieved SSS along a SMOS ascending orbit in the Pacific Ocean. Unfiltered (left) and filtered (right, removing flagged data) values.

Developed by the SMOS L2 OS team and implemented by ACRI-ST (France) and Argans Ltd. (UK)

Latest version 317, delivered late November 2010; it includes the first modifications using models fitted to SMOS data.
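The filtering mentioned in the figure caption above amounts to masking retrievals whose quality flags are raised. Below is a minimal bitmask-filtering sketch; the flag names and bit positions are hypothetical and do not reproduce the actual L2 OS product definition.

```python
import numpy as np

# Hypothetical flag bits (NOT the real L2 OS flag table)
FLAG_NO_CONVERGENCE = 1 << 0
FLAG_SUSPECT_RFI = 1 << 1
FLAG_LAND_CONTAMINATION = 1 << 2
FLAG_HIGH_WIND = 1 << 3

def filter_sss(sss, flags,
               reject=FLAG_NO_CONVERGENCE | FLAG_SUSPECT_RFI | FLAG_LAND_CONTAMINATION):
    """Return SSS with NaN wherever any rejected flag bit is set."""
    sss = np.asarray(sss, dtype=float).copy()
    sss[(np.asarray(flags) & reject) != 0] = np.nan
    return sss

# Example: three retrievals, the second one flagged for suspected RFI
print(filter_sss([35.2, 28.9, 34.7], [0, FLAG_SUSPECT_RFI, FLAG_HIGH_WIND]))
# -> [35.2  nan 34.7]
```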

Page 7: SMOS Ocean Salinity Validation Overview


L2OS processor performance

Good at detecting strong SSS gradients: the Amazon plume

SMOS descending orbits, March 2010


by N. Reul, J. Tenerelli, IFREMER & CLS, Brest

Page 8: SMOS Ocean Salinity Validation Overview


5 day average SSS, 30 August – 3 September 2010

SSS maps built from L2OS processor

by P. Spurgeon & A. Chuprin, Argans Ltd., Plymouth

Looks good, but not meeting requirements yet

SSS climatology


Page 9: SMOS Ocean Salinity Validation Overview


Global SMOS L2 error: 1.5 psu

SMOS L2 SSS is too salty in the mean by ~0.45 psu

Global standard deviation of the error: ~1.5 psu

Histogram of the differences between in situ SSS and SMOS L2 SSS

Top: comparison between in situ SSS and SMOS L2 SSS. Bottom: median differences per ±0.5 psu bin of in situ SSS.

Reul et al., IFREMER

Comparison to ARGO SSS
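A sketch of the statistics behind this comparison: given collocated in situ (Argo) and SMOS L2 salinities, compute the mean offset, the global standard deviation, and the median difference per bin of in situ SSS. The function and the synthetic data below are illustrative only; the numbers used to generate them (0.45 psu offset, 1.5 psu spread) are simply the values reported on this slide.

```python
import numpy as np

def argo_vs_smos_stats(sss_insitu, sss_smos, bin_width=1.0):
    """Bias, std and per-bin median of (in situ - SMOS) differences,
    with bins of in situ SSS that are bin_width wide (+/- bin_width/2 around the centre)."""
    sss_insitu = np.asarray(sss_insitu, float)
    diff = sss_insitu - np.asarray(sss_smos, float)
    centres = np.round(sss_insitu / bin_width) * bin_width
    medians = {float(c): float(np.median(diff[centres == c])) for c in np.unique(centres)}
    return diff.mean(), diff.std(), medians

# Synthetic collocations reproducing the reported behaviour (SMOS ~0.45 psu too salty)
rng = np.random.default_rng(1)
truth = rng.uniform(32.0, 37.0, 5000)
smos = truth + 0.45 + rng.normal(0.0, 1.5, truth.size)
bias, std, medians = argo_vs_smos_stats(truth, smos)
print(f"mean(in situ - SMOS) = {bias:.2f} psu, std = {std:.2f} psu")
```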

Page 10: SMOS Ocean Salinity Validation Overview


Main problems in SSS retrieval

• Bias in the comparison of measured and modeled Tb

– Spatial pattern persistent along an orbit and between different orbits
– Similar when using different ocean emissivity models: mainly related to instrument and image reconstruction imperfections
– Removal techniques being tested: the Ocean Target Transformation (OTT, the mean residual bias over a homogeneous ocean area), now implemented in the L2OP
– Analysing how often and by what method to build the OTT
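A minimal sketch of the OTT idea described in the last bullet: average the measured-minus-modelled Tb over many snapshots of a homogeneous, well-modelled ocean area, as a function of position in the field of view, and subtract that mean residual map from subsequent measurements. Array shapes and names are assumptions; the operational OTT in the L2OP has more detail (per polarisation, quality screening, update strategy).

```python
import numpy as np

def build_ott(tb_meas_snapshots, tb_model_snapshots):
    """Mean residual bias per field-of-view cell over a homogeneous ocean area.
    Inputs have shape (n_snapshots, n_xi, n_eta); NaN outside the usable zone."""
    return np.nanmean(tb_meas_snapshots - tb_model_snapshots, axis=0)

def apply_ott(tb_meas, ott):
    """Remove the systematic field-of-view-dependent bias from a measured snapshot."""
    return tb_meas - ott

# Tiny synthetic example: a fixed spatial bias pattern plus noise, 50 snapshots
rng = np.random.default_rng(2)
true_bias = np.linspace(-1.0, 1.0, 32)[None, :] * np.ones((32, 1))   # (xi, eta) pattern
model = 110.0 + rng.normal(0.0, 0.5, (50, 32, 32))
meas = model + true_bias + rng.normal(0.0, 2.0, (50, 32, 32))
ott = build_ott(meas, model)
print("rms error of the estimated bias map: %.2f K"
      % float(np.sqrt(np.mean((ott - true_bias) ** 2))))
```

How many snapshots to accumulate, and how often to refresh the map, is exactly the open question raised in the last bullet above.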

Page 11: SMOS Ocean Salinity Validation Overview


• Impact of land on Tb bias patterns

Main problems in SSS retrieval

by J. Tenerelli & N. Reul

Page 12: SMOS Ocean Salinity Validation Overview


• Contamination from radio-frequency interference (RFI)

Main problems in SSS retrieval


by R. Oliva, ESA

Point sources or large areas affected by extended sources

Page 13: SMOS Ocean Salinity Validation Overview


L1 Tb temporal drift for each pass type, evaluated with the vicarious metric:

Spatially averaged ΔTb = <Tb_meas − Tb_mod>

The potential combination of:

– instrumental drifts (NIR AB, thermal correlations between Tbs and TP7, …),

– seasonal geophysical variability in the Tb contaminations (sun glint, direct sun, galaxy, sea-ice extent, Faraday rotation, …),

– forward model and auxiliary data errors (roughness, galactic glints, …)

results in clearly observable temporal drifts in the "data − model ΔTb" averages, as reported since April. Since mid-August:

– the descending-pass X-pol data have been increasing and started to diverge from their ascending-pass X-pol counterparts, which have decreased continuously;

– for TY a similar decreasing trend is observed for both passes.

=> Drifts are polarisation and pass dependent.

Reul et al., IFREMER
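The vicarious metric above lends itself to a simple monitoring sketch: average Tb_meas − Tb_mod over a clean open-ocean reference area for each half-orbit, then track the time series separately per pass direction and polarisation. The table layout and column names below are illustrative assumptions.

```python
import pandas as pd

def drift_series(df):
    """Monthly mean of the spatially averaged dtb = <Tb_meas - Tb_mod>,
    split by pass direction and polarisation (assumed columns: date, pass_dir, pol, dtb)."""
    df = df.copy()
    df["month"] = pd.to_datetime(df["date"]).dt.to_period("M")
    return df.groupby(["month", "pass_dir", "pol"])["dtb"].mean().unstack(["pass_dir", "pol"])

# Toy example; a real run would read one dtb value per half-orbit from file
toy = pd.DataFrame({
    "date": ["2010-04-15", "2010-08-20", "2010-10-20", "2010-10-21"],
    "pass_dir": ["ASC", "DESC", "ASC", "DESC"],
    "pol": ["X", "X", "X", "X"],
    "dtb": [0.1, 0.3, -0.4, 0.8],   # made-up numbers
})
print(drift_series(toy))
```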

Page 14: SMOS Ocean Salinity Validation Overview


• Asymmetry between ascending and descending passes

– Different land contamination impact
– Different Sun position with respect to the spacecraft
– Different galactic noise scattering

Main problems in SSS retrieval

by J. Boutin, LOCEAN, Paris

[Maps: SMOS SSS ascending, August 2010; SMOS SSS descending, August 2010]


Page 15: SMOS Ocean Salinity Validation Overview


• Roughness correction models to be improved
– Three options implemented in the SMOS L2OS processor
– Models fail at high winds

Main problems in SSS retrieval

Weekly global salinity map, July 2010: weighted averaging + discarding flagged data, plus difference with climatology

by M. Talone & R. Sabia, SMOS-BEC, Barcelona
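A sketch of the level-3 step described in the map caption above: discard flagged retrievals, then average the remaining L2 SSS onto a regular grid. The inverse-variance weighting, the grid resolution and the variable names are assumptions, not the actual CP34/CATDS implementation.

```python
import numpy as np

def weekly_map(lon, lat, sss, sigma, flagged, res_deg=1.0):
    """Inverse-variance weighted average of unflagged L2 SSS onto a res_deg global grid."""
    keep = ~np.asarray(flagged, bool)
    lon, lat, sss, sigma = (np.asarray(a, float)[keep] for a in (lon, lat, sss, sigma))
    nx, ny = int(360 / res_deg), int(180 / res_deg)
    ix = np.clip(((lon + 180.0) / res_deg).astype(int), 0, nx - 1)
    iy = np.clip(((lat + 90.0) / res_deg).astype(int), 0, ny - 1)
    w = 1.0 / sigma ** 2
    num, den = np.zeros((ny, nx)), np.zeros((ny, nx))
    np.add.at(num, (iy, ix), w * sss)   # accumulate weighted salinities per cell
    np.add.at(den, (iy, ix), w)         # accumulate weights per cell
    with np.errstate(invalid="ignore", divide="ignore"):
        return num / den                # NaN where a cell received no data

# Example: three retrievals in the same 1-degree cell, one of them flagged
grid = weekly_map([10.2, 10.4, 10.3], [-5.1, -5.2, -5.3],
                  [35.0, 35.4, 20.0], [0.8, 1.2, 0.8], [False, False, True])
print(np.nanmean(grid))   # weighted mean of the two unflagged values
```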


Page 16: SMOS Ocean Salinity Validation Overview


New empirical roughness models

SMOS SSS anomalies with respect to climatology, using one of the models (Model 3) implemented in the L2OS processor (top) and a new simplified roughness model built by fitting to the data (bottom)

by J. Gourrion, SMOS-BEC, Barcelona
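The "fit to data" approach can be illustrated with a deliberately simplified sketch: regress the residual between measured Tb and the flat-sea model against wind speed to obtain an empirical roughness correction. The linear form and the synthetic numbers are assumptions; the actual SMOS-BEC model is more elaborate.

```python
import numpy as np

def fit_empirical_roughness(tb_residual, wind_speed):
    """Least-squares fit of Tb_meas - Tb_flat ~ a + b * U10 (deliberately simple form)."""
    A = np.column_stack([np.ones_like(wind_speed), wind_speed])
    (a, b), *_ = np.linalg.lstsq(A, tb_residual, rcond=None)
    return a, b

# Synthetic residuals with an assumed slope of ~0.25 K per (m/s) plus noise
rng = np.random.default_rng(3)
u10 = rng.uniform(2.0, 15.0, 2000)
resid = 0.5 + 0.25 * u10 + rng.normal(0.0, 1.0, u10.size)
a, b = fit_empirical_roughness(resid, u10)
print(f"empirical roughness correction: Tb_rough ~ {a:.2f} + {b:.2f} * U10  (K)")
```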


Page 17: SMOS Ocean Salinity Validation Overview


SMOS SSS – ARGO collocations show relatively good performance of the theoretical roughness correction models (defined before launch) in the 3–12 m/s wind-speed range; further correction is needed at high wind speeds.

Using ARGO for model improvement

by J. Boutin, X. Yin, LOCEAN, Paris

S. Pacific: N = 13424; 3–12 m/s: 0.33 ± 1.34
ITCZ: N = 15350; 3–12 m/s: −0.14 ± 0.67

[Figure: SSSsmos − SSSargo (psu) versus wind speed (m/s)]

Existing models are going to be improved in the next version of the L2 OS processor by refining the theoretical models and/or adjusting empirical parameters to fit the data.


S. Indian: N = 2981; 3–12 m/s: 0.17 ± 0.80

Page 18: SMOS Ocean Salinity Validation Overview


Differences with Argo buoy observations (1060 profiles). One week of data, March 2010. Simplified SSS linear retrieval, X and Y polarisations separately.

0.3 / 1.8

0.1 / 1.5

Preliminary SMOS SSS validation

by J. Gourrion, SMOS-BEC, Barcelona

Page 19: SMOS Ocean Salinity Validation Overview


SMOS SSS validation

• Still solving issues at L1 and L2:
– receiver drift (modelling of the physical temperature)
– land-sea contamination
– RFI detection and mitigation
– galactic noise scattering
– Faraday correction, …
– roughness and foam effect correction

• Comparing with in situ data is now helping to improve the forward models

• Diagnostic sites for validation

• Further improvements expected in the L2 processor:
– new version every 6 months
– reprocessing every 12 months


Page 20: SMOS Ocean Salinity Validation Overview


Diagnostic sites

• Selected areas of special interest for product performance analysis


(now a problem due to RFI)

(now a problem due to coast vicinity)

Page 21: SMOS Ocean Salinity Validation Overview

COST Action network

Coordinate pan-European teams to define common protocols to produce high-level salinity maps and related products, and broaden expertise in their use for operational applications.

Benefits

• Unify dispersed knowledge and efforts

• Define standardized data processing protocols

• Address problems with high societal/industrial impacts

SMOS-MODE: SMOS Mission Oceanographic Data Exploitation

Schedule

• Sep.09: Preliminary proposal

• Nov.09: Selected for full-proposal

• Mar.10: Selected for hearing session

• Jun.10: Approved!

• Jan.11: Kick-off meeting

Features

• Workgroups meetings

• Workshops

• STSM – Short Term Scientific Missions
• Training school
• Keynote speakers

R. Tonboe, S. Scory, C. Gommenginger, A. Turiel, A. Ostrovskii, A. Martins, J. Johannessen, A. Drago, P.M. Poulain, B. Ward, G. Korres, N. Reul, M. Hallikainen, G. Zodiatis, D. Stammer

Interested groups: contact your national delegates or join as a new member country


Page 22: SMOS Ocean Salinity Validation Overview


Thank you for your attention!