Project Task #642

Project WP #641: WP5 - Supporting Blue Assessment: VREs Development [Months: 1-30]

T5.1 Stock Assessment VRE [Months: 1-30]

Added by Franco Zoppi about 4 years ago. Updated 7 months ago.

Status: Closed
Start date: Oct 09, 2015
Priority: Immediate
Due date: Jan 31, 2018
Assignee: Julien Barde
% Done: 99%
Sprint: WP05
Lead beneficiary: 7 - IRD
Participants: 3 - ENG, 5 - FAO, 6 - ICES
Milestones:
Duration: 604

Description

Task Leader: IRD; Participants: FAO; ICES; ENG
This task aims to facilitate collaborative stock assessment in which a role-based workflow produces stock assessments in several clear steps, from data collation and harmonization through model-driven analysis to dissemination. Planned solutions will be oriented to serving both resource managers in need of harmonized stock-related information and scientists seeking a powerful infrastructure for new analytical approaches. The task will rely on (a) existing community products and infrastructure resources (statistical manager, software and datasets), and (b) offerings of other VREs (models and datasets). It will extend these with additional algorithms / models and new datasets. The resulting VRE will greatly facilitate the production of stock assessments, and the use of these assessments in science and commerce (trade).
This task carries out the following activities:
• Analysis of the detailed requirements of the VRE stakeholders and planning for their implementation;
• Selection of candidate models to add to the analytical module, e.g. Catch only based models, Ecological models;
• Identification of user roles and responsibilities in data harmonization efforts;
• Identification of data sets and sources to provide models and algorithms with the required inputs, and their integration into the infrastructure;
• Integration of models and tools in the infrastructure so that they: (i) become compliant with the infrastructure needs and requirements; (ii) are aligned with user needs in both functional and non-functional areas; (iii) exploit the infrastructure resources for maximum scalability and productivity.
In particular, the following components will be developed and integrated:
• A service to systematize and harmonize the collection and publication of data (reference data, stock and fisheries data, their definitions and boundaries) from Fisheries organizations. This service should promote the use of reference formats (namely, FishFrame or FLUX), i.e. formats that are required in several EU fisheries reporting frameworks, yet currently difficult to produce.
• A Data Analytics Environment (built on BlueBRIDGE analytics facilities) that (a) supports seamless and integrated access to and usage of survey, geospatial and biological data of interest, (b) offers a flexible programming environment with access to, and control over the Blue Commons computing resources.
• A set of facilities to compute and publish indicators, such as facilities using traditional stock models or ecological assessment models to compute catch levels, abundance, fishing mortality and MSY.
• A set of facilities promoting collaborative working practices and Open Science practices that incorporate mechanisms to capture model output and share them with co-workers, to include them in reports, to visualize trends and the spatial context of model output, and to stage results as input for other models and analytic tasks.
The following Resources will be integrated and reused in order to provide the required capacity:
• State-of-the-art algorithms and models for stock assessment, e.g. IRD-related computational products including EwE, Ecospace, Atlantis and OSMOSE; ICES assessment models; FAO CatchMSY; and generic surplus production models, statistical catch-at-age models, and virtual population analysis models;
• Capture data from FAO, IRD and other fisheries organisations; capture data from projects and individual scientists; datasets from collaborating scientists, such as DATRAS from ICES; and private data;
• Environmental data required for the ecological models as OGC-compliant datasets, e.g. made available through Blue Environment delivering ESA and MyOcean data, World Ocean Atlas data, ocean color data, and other model input datasets.


Subtasks

Project Activity #952: Stock assessment output Annotation [Closed] Anton Ellenbroek
Project Activity #1674: VRE Stock Assessment Workplan [Closed] Anton Ellenbroek
VRE #1677: Ecopath VRE [Removed] Jeroen Steenbeek
Task #1777: Ecopath taxonomy dataformats [Rejected] Jeroen Steenbeek
VRE #1678: FAO Stock Assessment VRE [Released] Anton Ellenbroek
Task #3158: Generate graphs and figures from Tabman [Closed] Enrico Anello
Task #3781: Connect OpenCPU to WPS [Closed] Enrico Anello
Task #4844: OpenCPU Library to connect to e-infra WPS [Closed] Enrico Anello
Project Activity #4613: FAO Stock Assessment VRE First Release [Closed] Anton Ellenbroek
Project Activity #9670: Add RShiny based module to support SDG14.4.1 [Closed] _InfraScience Systems Engineer
Project Activity #9671: Add RSchiny CMSY versions [Closed] Enrico Anello
Project Activity #9672: Add Rshiny based ELEFAN related models [Closed] Enrico Anello
D4Science Infrastructure - VM Creation #10361: Install a new shinyproxy server [Closed] _InfraScience Systems Engineer
Project Activity #1679: ICES Stock Assessment VRE [Closed] Scott Large
Task #1768: Management Strategy Evaluation (MSE) for western horse ma... [Closed] Scott Large
Task #1778: Toolbox for Data Limited Stocks [Closed] Scott Large
Task #5822: Simple minimal example of Shiny App with ShinyProxy [Closed] Scott Large
Project Activity #5823: Able to deploy docker containers [Closed] _InfraScience Systems Engineer
Project Activity #7493: Linking work - space with VREs via R [Closed] Scott Large
Task #7491: Length-based indicator application [Closed] Scott Large
Task #7492: Use LBI application as shinyproxy use-case [Closed] Scott Large
Task #9542: DLMtoolkit - FLR data converter [Rejected] Nathan Vaughan
Task #4620: Ensemble model [Closed] Scott Large
Task #5517: Mixfish Optimizer [Closed] Scott Large
D4Science Infrastructure - Task #5818: Identify log-jams in use-case source code [Closed] Scott Large
Project Activity #10634: VSURF [Closed] Scott Large
VRE #1779: IRD BFT Assessment [Removed] Julien Barde
Project Activity #5135: WECAFC-FIRMS [Closed] Anton Ellenbroek
VRE #5136: WECAFC-FIRMS: Add TabMan and Dataminer [Released] Anton Ellenbroek

Task #11043: Enable advanced DB management features on WECAFC Firms VRE [Rejected] Pasquale Pagano
Project Activity #11214: WECAFC-FIRMS data / map dissemination tools [Resolved] Emmanuel Blondel
Project Activity #11215: WECAFC-FIRMS - Set-up R Data processing components [Closed] Emmanuel Blondel
Project Activity #11216: WECAFC-FIRMS - Set-up SDI components [Closed] Roberto Cirillo
D4Science Infrastructure - VM Creation #11298: WECAFC-FIRMS dedicated GeoNetwork [Closed] Emmanuel Blondel
D4Science Infrastructure - VM Creation #11299: WECAFC-FIRMS dedicated GeoServer [Closed] Emmanuel Blondel
Project Activity #11367: Downgrade WECAFC-FIRMS Geonetwork to 3.0.4 [Closed] _InfraScience Systems Engineer
Project Activity #11406: WECAFC-FIRMS - Geoserver logs are not accessible [Closed] _InfraScience Systems Engineer
Support #11217: WECAFC-FIRMS - Need RDB database connection parameters [Closed] Kostas Kakaletris
Project Activity #11286: WECAFC-FIRMS - Enable PostGIS spatial extension [Closed] Kostas Kakaletris
Project Task #11287: WECAFC-FIRMS - Compute & Provide reference spatial shapef... [Closed] Emmanuel Blondel
Project Task #11312: Make country datasets available in pre-production DB [Closed] Yann Laurent
Project Activity #11315: WECAFC-FIRMS - R data/metadata business process [Closed] Emmanuel Blondel
Project Activity #11317: WECAFC-FIRMS - Enable PostGIS spatial extension in produc... [Closed] Kostas Kakaletris
Project Task #11399: WECAFC-FIRMS - Refactoring of JS code for easy reuse [Closed] Emmanuel Blondel
Project Activity #11424: Enable CORS on WECAFC-FIRMS GeoServer [Closed] _InfraScience Systems Engineer
Task #5238: WECAFC Stock assessment support [Closed] Anton Ellenbroek
Task #5810: SS3 Stock Synthesis V3 [Closed] Enrico Anello
Task #8500: DLM Toolkit for WECAFC cases [Closed] Anton Ellenbroek
Task #8501: Validate with CMSY for WECAFC [Closed] Nathan Vaughan
Task #8502: Apply CMSY to US Caribbean stocks- yellowtail, hogfish, ... [Closed] Nathan Vaughan
Task #8503: CMSY / DLMtool Integration [Closed] Enrico Anello
Task #8504: Apply CMSY to WECAFC specific stock - Test with one country [Closed] Nathan Vaughan
Task #8505: Apply CMSY to Flying fish [Closed] Anton Ellenbroek
Task #9430: Modification of CMSY [Closed] Nathan Vaughan
Task #9869: DLMtoolkit - Shiny interface [Closed] Nathan Vaughan

VRE #6098: RDB for Fisheries Data Management (WECAFC) [Released] Anton Ellenbroek
Task #9987: Regional Data Collection and Validation System implementa... [Closed] Vassilis Floros
Task #9988: Regional Data Collection and Validation Service implement... [Closed] Vassilis Floros
Task #10038: Implement a web method that lists the available schemas t... [Closed] Vassilis Floros
Task #10039: Implement a web method that the excel plugin will post th... [Closed] Vassilis Floros
Task #10041: DSD/SDMX handling and constraint rule management function... [Closed] Konstantinos Apostolopoulos
Task #10042: Regional data collection and validation service - gCube i... [Closed] Vassilis Floros
Task #10644: User management and authorization in RDB [Closed] Vassilis Floros
Task #10647: Regional Data Collection and Validation Service - Tabman ... [Closed] Vassilis Floros
Task #10648: RDB submission data based on user attributes [Closed] Vassilis Floros
Task #9989: Regional Data Collection and Validation Portlet implement... [Closed] Vassilis Floros
Task #10034: Data Model (DSD) importing functionality [Closed] Vassilis Floros
Task #10035: Data model constraint rules management [Closed] Vassilis Floros
Task #10036: Data collection cycle management [Closed] Vassilis Floros
Task #10037: Data submission management page [Closed] Vassilis Floros
Task #9991: Regional Data Collection and Validation Excel Add-In impl... [Closed] Konstantinos Apostolopoulos
Task #10043: Authentication and VRE selection on excel plugin [Closed] Konstantinos Apostolopoulos
Task #10044: Fetch the available data models from the regional data co... [Closed] Konstantinos Apostolopoulos
Task #10045: Fetch the data model and constraints from the regional da... [Closed] Konstantinos Apostolopoulos
Task #10046: Implement data constraint rules in the excel plugin [Closed] Konstantinos Apostolopoulos
Task #10048: Post the submission data to the regional data collection ... [Closed] Konstantinos Apostolopoulos
Task #10480: Provide a list with RDB Data Submission Metadata [Closed] Yann Laurent

VRE #6870: FLR for JRC [Rejected] Anton Ellenbroek
Project Activity #2344: Ichtyop model [Closed] Gianpaolo Coro
Task #3563: Import OSCAR NetCDF in the D4Science Thredds service [Closed] Paolo Fabriani
Project Activity #6760: Support transfer of NetCDF to Threeds Catalog organised i... [Closed] Fabio Sinibaldi
Project Activity #6761: Run Thredds on a SmartGears enabled container [Closed] Gianpaolo Coro
gCube - Feature #8361: Thredds migration tool [Closed] Fabio Sinibaldi
D4Science Infrastructure - Task #8926: Unbound content length for thredds [Closed] _InfraScience Systems Engineer
Task #4621: Python for ensemble model (#4620) [Closed] Paolo Fabriani
Task #4643: Improve SDMX capabilities [Closed] Ciro Formisano
Task #7121: functional requirements needed [Closed] Aymen Charef
Task #4644: Add ApiaId from WoRMS to species code-lists [Closed] Ciro Formisano
Task #5527: Issue with the FLAssess package installation [Closed] Scott Large
Task #5557: automatize the process of publication of algorithms on da... [Closed] Paolo Fabriani
gCube - Task #7928: Normalize ACL for HProxy [Rejected] Andrea Dell'Amico
Task #8467: Staging dataminer towards DataMiner Pool Manager Service [Closed] _InfraScience Systems Engineer
D4Science Infrastructure - VM Creation #8472: New VM dataminer needed [Closed] _InfraScience Systems Engineer
D4Science Infrastructure - Support #8498: Adding SSH keys to RPrototypingLab VRE [Closed] _InfraScience Systems Engineer
gCube - Task #9273: Updating dataminer1-devnext.d4science.org at latest version [Closed] Lucio Lelii
gCube - Task #9607: DataMiner PoolManager - update documentation [Closed] Nunzio Andrea Galante
Project Activity #6348: Integrate FLR library of stock assessment services [Closed] Andrea Dell'Amico
Task #7358: SDMX related activities [Closed] Ciro Formisano
Task #7359: SDMX Data Structure Definition exporter/importer [Closed] Ciro Formisano
Task #7535: Define versions of Tabman templates [Closed] Lucio Lelii
Task #7360: Expose tabular data in SDMX format [Closed] Ciro Formisano
Task #8717: SDMX and RDB [Closed] Aymen Charef
Project Activity #9944: new Tabman Operation [Closed] Ciro Formisano
Task #9434: Improvements on SDMX export service interface [Closed] Lucio Lelii


Related issues

Related to D4Science Infrastructure - Task #117: Execute FAO MSY on the complete FAO Dataset Closed Jun 11, 2015
Related to D4Science Infrastructure - Task #287: Integrate IRD-IFREMER stock assessment processes Closed Jun 23, 2015
Related to D4Science Infrastructure - Task #208: Prepare and process Indian Ocean Tuna Commission catch st... Closed Jun 03, 2015
Related to BlueBRIDGE - Project Activity #863: Activity of the WKLife V Meeting Closed Oct 05, 2015 Oct 09, 2015
Related to D4Science Infrastructure - Task #1368: Install and configure "Mono" to run EwE stock assessment ... Closed Nov 16, 2015
Related to BlueBRIDGE - Project Activity #1441: Integrate .Net algorithms on the BlueBRIDGE computing pla... Rejected Sep 06, 2016 Sep 22, 2016
Related to BlueBRIDGE - Project Activity #1825: WP5 - T5.1 Blue Assessment - Latex / KnitR / markdown Aut... Rejected Dec 15, 2015
Related to BlueBRIDGE - Project Activity #1903: Integrate IRD VPA Workflow Closed Apr 06, 2016
Related to BlueBRIDGE - Project Activity #1983: Publish NetCDF files in the e-Infrastructure Closed Jan 14, 2016
Related to BlueBRIDGE - Task #3787: Integrate OpenCPU with the e-Infrastructure Closed Apr 26, 2016 May 23, 2016

History

#1 Updated by Franco Zoppi about 4 years ago

  • Due date set to Feb 28, 2018

#2 Updated by Franco Zoppi about 4 years ago

  • Status changed from New to In Progress

#3 Updated by Franco Zoppi about 4 years ago

  • Subject changed from T5.1 Stock Assessment VRE to T5.1 Stock Assessment VRE [Months: 1-30]

#4 Updated by Franco Zoppi about 4 years ago

  • Start date changed from Sep 11, 2015 to Sep 01, 2015

#5 Updated by Gianpaolo Coro almost 4 years ago

  • Related to Task #117: Execute FAO MSY on the complete FAO Dataset added

#6 Updated by Gianpaolo Coro almost 4 years ago

  • Related to Task #287: Integrate IRD-IFREMER stock assessment processes added

#7 Updated by Gianpaolo Coro almost 4 years ago

  • Related to Task #208: Prepare and process Indian Ocean Tuna Commission catch statistics added

#8 Updated by Julien Barde almost 4 years ago

September 2015 Activity Report
At the last ICCAT event, held in Madrid from September 3 to September 25 and gathering the participants in the Northern Atlantic Bluefin Tuna stock assessment working group, BlueBRIDGE activities related to stock assessment were presented and discussed. Meanwhile, these activities were discussed with all partners during the kick-off meeting of BlueBRIDGE in Pisa (14th-17th September).

Within Bluebridge project, a collaboration with this working group of 40 people (researchers, NGOs, decision makers) is planned to set up a Virtual Research Environment (VRE) to fit some needs of this group. To take a big step forward, the participants of this group expect to get the following services from a VRE:

  • online access to the data and code needed to run the whole stock assessment workflow,
  • the ability to browse, discuss and comment on the results of a run in a friendly and interactive Web environment,
  • keeping track of each run in order to enable a posteriori re-execution (metadata of the workflow); indeed, each year the first step consists of comparing the results of year n with those of year n+1,
  • customization of each step of the workflow.

According to ICCAT feedback, a first demo will be set up in March or June (for the "Data preparation meeting", which might be held in Sète). Depending on the results and opportunities, a similar approach might be demonstrated with other species (yellowfin tuna, swordfish...) and other tuna commissions (IOTC).
IRD has hired an engineer to work on these activities (starting on the first of November).

Below is a summary of activities related to the Ecopath with Ecosim tuna cases (to be updated soon):
At the last Ecopath with Ecosim (EwE) training session, held in Brazil from September 14th to September 18th (Universidade Federal do Rio Grande do Norte, Natal) with 20 students from across Brazil, Portugal, and Spain, opportunities related to BlueBRIDGE project activities were presented and discussed. Meanwhile, these activities were discussed with all partners during the kick-off meeting of BlueBRIDGE in Pisa.

Within the BlueBRIDGE project a collaboration with the EwE development team is planned to connect the EwE approach to the BlueBRIDGE infrastructure services (WP3), and to set up a Virtual Research Environment (VRE) which will be used in future EwE training sessions. According to the ongoing discussions, the EwE use case expects to get the following services from the infrastructure:
• facilitated access to the various data and input parameters required by EwE (currently collected by hand from FishBase, OBIS, SeaLifeBase, GBIF, AquaMaps, FAO, WoRMS): biological parameters, species distributions, functional responses, time series of spatial environmental data, and fisheries (socioeconomic) data,
• online parametrization and execution of the entire EwE workflow,
• the ability to keep track of past EwE runs by storing workflow metadata, enabling a posteriori re-execution of these runs,
• operating EwE online.
Activities will start as soon as IRD can hire the engineer in charge of the specifications and the EwE-related developments.

#9 Updated by Anton Ellenbroek almost 4 years ago

  • Assignee set to Anton Ellenbroek

#10 Updated by Anton Ellenbroek almost 4 years ago

  • Assignee changed from Anton Ellenbroek to Julien Barde

#11 Updated by Anton Ellenbroek almost 4 years ago

  • Start date set to Sep 01, 2015

due to changes in a related task

#12 Updated by Anton Ellenbroek almost 4 years ago

  • % Done changed from 90 to 10

#13 Updated by Gianpaolo Coro almost 4 years ago

#14 Updated by Gianpaolo Coro almost 4 years ago

  • Related to Task #1368: Install and configure "Mono" to run EwE stock assessment algorithms added

#15 Updated by Anton Ellenbroek almost 4 years ago

October 2015 report

IRD describes the related WP5 activities here:
*ICCAT https://support.d4science.org/projects/bluebridge/wiki/BFT_Assessment
*EwE https://support.d4science.org/projects/bluebridge/wiki/Ecopath_and_Ecosim

Specific activities in October ...

FAO has not yet recruited a developer, and is negotiating the VRE content and algorithms/models

#16 Updated by Gianpaolo Coro almost 4 years ago

#17 Updated by Anton Ellenbroek almost 4 years ago

November 2015 report

ICES listed two algorithms / models for integration

IRD describes the related WP5 activities here:
*ICCAT https://support.d4science.org/projects/bluebridge/wiki/BFT_Assessment
*EwE https://support.d4science.org/projects/bluebridge/wiki/Ecopath_and_Ecosim

FAO
* identified a developer, and
* continues negotiating the VRE content and algorithms/models and their use cases

#18 Updated by Anton Ellenbroek almost 4 years ago

  • Due date changed from Dec 04, 2015 to Dec 03, 2015

due to changes in a related task

#19 Updated by Pasquale Pagano almost 4 years ago

  • Related to Project Activity #1825: WP5 - T5.1 Blue Assessment - Latex / KnitR / markdown Automated Reports for Bluefin Tuna Stock Assessment Workflow added

#20 Updated by Gianpaolo Coro over 3 years ago

#21 Updated by Gianpaolo Coro over 3 years ago

#22 Updated by Anton Ellenbroek over 3 years ago

December 2015 report

ICES
*discussed integration and sharing of algorithms / models internally
*discussed stock assessment algorithms suitable for a global audience with FAO
*initiated the description of the use case for single stock assessment: https://support.d4science.org/projects/bluebridge/wiki/ICES_single_stock_assessment

IRD
Continued the WP5 activities described here:
*ICCAT https://support.d4science.org/projects/bluebridge/wiki/BFT_Assessment
*EwE https://support.d4science.org/projects/bluebridge/wiki/Ecopath_and_Ecosim

FAO
* review of legacy apps (FLOD, GRADE, BiOnym, SDMX, etc.)
* continues negotiating the VRE content and algorithms/models and their use cases internally

ENG
* Started with IRD integration of EwE / MONO https://support.d4science.org/issues/1461

#23 Updated by Anton Ellenbroek over 3 years ago

January 2016 Effort report

IRD
Continued the WP5 activities described here:

ICES

FAO

  • participation in TCom and presentation of the WP5 FAO plans

  • continues negotiating the VRE content and algorithms/models and their use cases internally

ENG

#24 Updated by Anton Ellenbroek over 3 years ago

February 2016 Effort report

IRD
Continued the WP5 activities described here:

  • ICCAT https://support.d4science.org/projects/bluebridge/wiki/BFT_Assessment:

    • Identify extensions to support stock assessment VRE: ticket #2111
    • looking for solutions to fix the Fortran compilation for Ubuntu 12.04 with the ifort compiler instead of gfortran
    • a first version of a WPS to execute the whole pre-parameterized workflow has been provided (a sketch of such a request issued from R follows this list),
    • the Ichthyop model has been successfully tested on the infrastructure; a description of the NetCDF files to be made available (via OPeNDAP or stored on the infrastructure) has been provided to drive the model,
    • visualization services with Shiny or other tools (rCharts...) have been integrated in the first WPS provided,
    • data dissemination services with NetCDF have been tested by generating NetCDF within the ICCAT workflow. These files will later be made available by the raster data publisher
    • data storage services by deploying a mirror of the IRD Sardara database, which has been filled and updated with new tuna RFMO datasets,
  • EwE VRE development (#1677) continued with Ecopath taxonomy dataformats support #1777

    • EwE Ticket #1461: further integrating EwE with the Statistical Manager to enable remote execution of EwE runs on the BB infrastructure. This involved meetings, specifications, and writing and deploying test tools.
    • Participation in TCom
    • WPS process to download infrastructure GIS products as ESRI GRID files: #2229
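
Since WPS execution recurs throughout this task (DataMiner, OpenCPU and the later widget work all sit on top of it), a minimal sketch of what such an Execute request can look like from R is given below. The endpoint URL, process identifier, input names and the token parameter name are placeholders / assumptions, not the actual project values.

    # Hedged sketch: invoke a WPS process via the standard WPS 1.0.0 KVP interface.
    library(httr)

    wps_url <- "https://dataminer.example.org/wps/WebProcessingService"   # hypothetical endpoint
    resp <- GET(wps_url, query = list(
      service       = "WPS",
      version       = "1.0.0",
      request       = "Execute",
      Identifier    = "ICCAT_BFT_E_WORKFLOW",        # hypothetical process identifier
      DataInputs    = "year=2014;stock=BFT-E",       # hypothetical inputs, KVP-encoded
      `gcube-token` = Sys.getenv("GCUBE_TOKEN")      # auth parameter name is an assumption
    ))
    stop_for_status(resp)
    cat(content(resp, as = "text", encoding = "UTF-8"))  # ExecuteResponse XML with links to the outputs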

ICES

  • Discussions with internal and external lead experts about documentation, implementation plan, and steps needed to finalize their code on the infrastructure. The need for a shared stock assessment analysis with FAO and other organizations was captured in a draft ToR.

FAO

  • Preparation for TWG-1 and EAB; slides, meeting content and programme.

  • continues negotiating the VRE content and algorithms/models and their use cases internally; negotiated contracts and ToR for Stock assessment consultant.

ENG

  • Integration of the Ichthyop model - discussion, analysis and design of the integration approach (#2344)
  • EwE integration released for inclusion in gCube 3.10.1

#25 Updated by Anton Ellenbroek over 3 years ago

March 2016 Effort report

IRD
In relation with #1903:
* we validated the underlying codes of ICCAT VPA by executing them on the following server: "statistical-manager.d.d4science.org",
* we learnt how to use the Statistical Algorithms Importer to deploy the codes previously validated and provided some feedback
(https://support.d4science.org/issues/2981)
* we deployed our first codes on the SAI and tested the result on the statistical manager of the devvre https://dev.d4science.org/group/devvre/statistical-manager. The whole ICCAT workflow has been deployed and tested
* improvement of the automated report (for the ICCAT Eastern Bluefin Tuna working group report)
* some tests have been done to publish both html
(https://support.d4science.org/issues/2520#change-12597) and netCDF which are outputs of ICCAT use case
* work to enable ichthyop model has been updated (https://support.d4science.org/issues/2344#change-13506), codes have been tested successfully on the infra. We are waiting for OSCAR netCDF files to be made available on the infra to try the execution of the model with a local (instead of OPeNDAP) data access.
* management of ESRI GRID ASCII files to feed EwE models
https://support.d4science.org/issues/2229#change-12466
* Ecopath and Ecosim: linking EwE and FishBase, updating EwE and the global ocean model, debugging EwE.
* Ecopath and Ecosim: discussions about how to link Ecopath with FishBase and Sealifebase

ICES

  • Continued identification of overall workplan

FAO

  • Hosted TWG-1 and EAB; prepared relevant WP5 materials for slides, meeting content and programme.

  • Following the TWG, liaised with external parties (WECAFC in particular) to negotiate VRE content and algorithms/models and their use cases internally; negotiated contracts and ToR for Stock assessment consultant. First consultant started 01-04-16

ENG

  • Support to integration activities; no new activity initiated.

#26 Updated by Gianpaolo Coro over 3 years ago

  • Related to Task #3787: Integrate OpenCPU with the e-Infrastructure added

#27 Updated by Anton Ellenbroek over 3 years ago

April 2016 Effort report

IRD
* publication of netCDF files on the infra (#3138) from WPS (through R code): improvements required to publish non-grid data. Still facing issues getting the whole file published (needed for #1983 & #2344, #3563).
* training to make proper use of Statistical Algorithm importer (SAI) and IRD feedback (#3081, #1903),
* deployment of Ichthyop model (#2344) by packaging R & Java codes with SAI and validation with statman.
* deployment of ICCAT BFT-E codes (#1983) with SAI and validation with statman.
* publication of html files on the infra (#2520) from statman
* Discussion about annotation of charts, discussion about identified tools and possible options (#3153) to make use of such services on the infra (#3153, #3787).
* requirement for providing RStudio in the e-Infrastructure VREs (#2516 & VRE #1779: IRD BFT Assessment)
* Ecopath taxonomy dataformats (#1777)

ICES

  • Participated to training in Pisa,
  • Discussed data poor models to propose for integration with FAO and WECAFC
  • Initiated the integration work on the first model (CEFAS Horse Mackerel)

FAO

  • Commenced using OpenCPU; the first case is Statlant data for CCAMLR statistical reports (#1678, #3158, #3787)
  • Discussed data poor models to propose for integration with FAO and WECAFC

ENG

  • Import script for OSCAR NetCDF developed and tested in development infrastructure.

#28 Updated by Anton Ellenbroek over 3 years ago

May 2016 Effort report

IRD

  • General: make (a better) use of the Statistical Algorithms Importer to publish R codes by ourselves, sending feedback in order to help setting up best practices and guidelines and updating some parts of our R codes accordingly,
  • General: make use of the algorithms (within Statman or R codes) to publish netCDF files or HTML pages on either Thredds or Apache servers (Task #2520)
  • General: Annotation of charts, discussion about identified tools and possible options. Discussing opportunities with FAO about using OPENCPU (Project Activity #3153)
  • Ichthyop model: writing a summary of the Ichthyop use case on the infra,
  • Ichthyop model: replicating OSCAR data on the infrastructure (Task #3563), trying to find a solution to give access to the whole series of OSCAR images as expected by Ichthyop (by packaging all yearly netCDF files within a single one)
  • Ichthyop model: need to learn how to parallelize the code (Project Activity #2344) once data access will be OK (see #3563)
  • Ichthyop model: enable a WPS (Task #3986) interface for the Ichthyop model
  • Ichthyop model: publish Non-GRID netCDF files and NCML files (Task #3138)
  • ICCAT BFT-E: validating the different algorithms within the development VRE (SAI + Statistical Manager)
  • ICCAT BFT-E: build an example of data embedding request to WPS (Task #3987) => gathering algorithms deployed with SAI under a single item dedicated to ICCAT
  • TUNA ATLAS: update of Tuna Atlas / Sardara database (Support #1899) by deploying a new version (new model + new data)
  • TUNA ATLAS: preparing R codes to enable online data access within the Statistical Manager (and later in the production environment: Tuna Atlas VRE or StockAssessment)
  • TUNA ATLAS: starting a discussion to figure out how we will be able to generate metadata from Tuna Atlas and populate geonetwork
  • TUNA ATLAS: generate graphs and figures from Tabman (Task #3158)
  • EwE model: Ecopath taxonomy dataformats (Task #1777): Support dataformats related to Ecopath. EwE models can store taxonomic data to complement model information via the following classifications or trait fields. Acronyms used are SAUP (Sea Around Us Project), FB (FishBase), SLB (SeaLifeBase), WoRMS (World register of marine species)

ICES

  • Discussed data poor models to propose for integration with FAO and WECAFC
  • Initiated the integration work on the first model (CEFAS Horse Mackerel)

FAO

  • Continued developing OpenCPU; first case are Statlant data for CCAMLR statistical reports (#1678, #3158, #3787)
  • Discussed data poor models to propose for integration with FAO, ICES and WECAFC

ENG

  • OSCAR NetCDF files imported in the development infrastructure (#3563)

#29 Updated by Anton Ellenbroek about 3 years ago

June 2016 Effort report

IRD

  • General: make (a better) use of the Statistical Algorithms Importer to publish R codes by ourselves, sending feedback in order to help setting up best practices and guidelines and updating some parts of our R codes accordingly,
  • General: make use of the algorithms (within Statman or R codes) to publish netCDF files or HTML pages on either Thredds or Apache servers (Task #2520)
  • General: Annotation of charts, discussion about identified tools and possible options. Discussing opportunities with FAO about using OPENCPU (Project Activity #3153)
  • Ichthyop model: writing a summary of the Ichthyop use case on the infra,
  • Ichthyop model: replicating OSCAR data on the infrastructure (Task #3563), trying to find a solution to give access to the whole series of OSCAR images as expected by Ichthyop (by packaging all yearly netCDF files within a single one)
  • Ichthyop model: need to learn how to parallelize the code (Project Activity #2344) once data access will be OK (see #3563)
  • Ichthyop model: enable a WPS (Task #3986) interface for the Ichthyop model
  • Ichthyop model: publish Non-GRID netCDF files and NCML files (Task #3138)
  • ICCAT BFT-E: validating the different algorithms within the development VRE (SAI + Statistical Manager)
  • ICCAT BFT-E: build an example of data embedding request to WPS (Task #3987) => gathering algorithms deployed with SAI under a single item dedicated to ICCAT
  • TUNA ATLAS: update of Tuna Atlas / Sardara database (Support #1899) by deploying a new version (new model + new data)
  • TUNA ATLAS: preparing R codes to enable online data access within the Statistical Manager (and later in the production environment: Tuna Atlas VRE or StockAssessment) Tickets: ....
  • TUNA ATLAS: Prepared presentation to FAO - ABNJ global project
  • TUNA ATLAS: generate graphs and figures from Tabman (Task #3158)
  • EwE model: Ecopath taxonomy dataformats (Task #1777): Support dataformats related to Ecopath.

ICES

  • Tickets (...)

FAO

  • Continued developing OpenCPU; cases Statlant data for CCAMLR statistical reports (#1678, #3158, #3787)
  • Discussed data poor models to propose for integration with FAO, ICES and WECAFC
  • Drafted documents for consultant to support stock assessment

ENG

  • Completed the merge of the yearly (1993-2015) OSCAR files into a single one and imported it into the Thredds server in the development infrastructure (#3563); a sketch of the merge idea in R follows this list
  • Phone calls and discussions about how the SDMX modules for time series could be extended to better support Stock Assessment and other VREs
  • Requirements analysis of the extension of the SDMX module of Tabman to support time series
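
Purely to illustrate the merge step above, here is a rough sketch with the ncdf4 package; the OSCAR variable and dimension names ("u", "lon", "lat", "time") are assumptions, and the real files (and the actual procedure used) may well differ.

    # Hedged sketch: concatenate yearly NetCDF files along the time axis and rewrite them as one file.
    library(ncdf4)

    files <- sprintf("oscar_%d.nc", 1993:2015)

    # Read the coordinate axes once from the first file.
    nc1   <- nc_open(files[1])
    lon   <- ncvar_get(nc1, "lon")
    lat   <- ncvar_get(nc1, "lat")
    tunit <- ncatt_get(nc1, "time", "units")$value
    nc_close(nc1)

    # Collect the time values and data slabs of every yearly file.
    times <- c()
    slabs <- list()
    for (f in files) {
      nc    <- nc_open(f)
      times <- c(times, ncvar_get(nc, "time"))
      slabs[[f]] <- ncvar_get(nc, "u")              # lon x lat x time array
      nc_close(nc)
    }
    u_all <- array(unlist(slabs), dim = c(length(lon), length(lat), length(times)))

    # Write a single file with an unlimited (record) time dimension.
    dlon <- ncdim_def("lon", "degrees_east", lon)
    dlat <- ncdim_def("lat", "degrees_north", lat)
    dtim <- ncdim_def("time", tunit, times, unlim = TRUE)
    v_u  <- ncvar_def("u", "m s-1", list(dlon, dlat, dtim), missval = 1e30)
    out  <- nc_create("oscar_1993_2015.nc", v_u)
    ncvar_put(out, v_u, u_all)
    nc_close(out)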

#30 Updated by Pasquale Pagano about 3 years ago

  • Due date changed from Jul 29, 2016 to Oct 01, 2016

due to changes in a related task

#31 Updated by Anton Ellenbroek about 3 years ago

July 2016 Effort report

IRD

  • IRD
    • #4530 / #4281 / #4818: general feedback about VREs and related applications
    • #4511: integrating R codes with OPenCPU or WPS within external web pages: interacting with Enrico Anello to deliver showcases and users specifications
    • Ichthyop model:
      • #2344: parallelization of the Ichthyop model to enable multiple simulations in a row, working on deployment with SAI / Dataminer
      • #3563 / #4802: make use of OSCAR files within the infra
      • #4195: publication in the production environment
    • Sardara database:
      • #1899: update of the database
      • Prepared presentation to FAO - ABNJ global project
    • ICCAT BFT-E:
      • #4627: integration of codes (step 1 / step 2 / step 3 / step 4) in the production environment

ICES

  • Tickets (...)

FAO

  • Continued developing OpenCPU; cases Statlant data for CCAMLR statistical reports (#1678, #3158, #3787)
  • Discussed data poor models to propose for integration with FAO, ICES and WECAFC
  • Drafted documents for consultant to support stock assessment

ENG

  • Tabman SDMX Module for time series: requirements analysis completed and technical analysis started: first draft of the solution defined (#4643)
  • Integration of the Ensemble model python script in the Statistical Manager through a Java wrapper (#4621)
    • study of the model and analysis of the script.
    • identification of major issues in the script preventing reuse of the algorithm with different input parameters. In fact, a significant part of the input was hardcoded instead of being read from input files/parameters. The identified issues were reported to the author.
    • modification of the script to allow a 'quicker' execution, for demo purposes (execution with the 'hardcoded' input requires ~7 days).
    • release and publication in the development infrastructure.

#32 Updated by Anton Ellenbroek about 3 years ago

August 2016 Effort report

IRD

  • IRD
    • #4530 / #4281 / #4818: general feedback about VREs and related applications. make (a better) use of the Statistical Algorithms Importer to publish R codes by ourselves, sending feedback in order to help setting up best practices and guidelines and updating some parts of our R codes accordingly,
    • #4846 / #4894: specifications for 2 VREs (ICCAT and Rstudio)
    • #4511: integrating R codes with OPenCPU or WPS within external web pages: interacting with Enrico Anello to deliver showcases and users specifications
    • Ichthyop model:
      • #2344: parallelization of the Ichthyop model to enable multiple simulations in a row, working on deployment with SAI / Dataminer
      • #3563 / #4802: make use of OSCAR files within the infra
      • #4195: publication in the production environment, updates and new algorithms in the workflow
    • Sardara database:
      • #4863: deploying algorithms to exploit the database (plots)
    • ICCAT BFT-E:
      • #4627: integration of codes (step 1 / step 2 / step 3 / step 4) in the production environment. Fixing bugs with deployments (R configurations, OS packages..). Deploying a first parallelized version

ICES

  • Tickets (...)

FAO

  • Continued developing OpenCPU; Sardara on-line Geospatial data viewer (#1678, #3158, #3787)
  • Activated consultants to work on Tuna Atlas Master data, Support to community programmers, and Schema's
  • Contracted consultant to support stock assessment; WECAFC DLM R models (DLM tool; ss3 etc)

ENG

  • Tabman SDMX Module for timeseries: continued technical analysis and refinement of the first draft solution. First test implementations (#4643)

#33 Updated by Anton Ellenbroek almost 3 years ago

September 2016 Effort report

IRD

  • General: issues with downloading URLs from RStudio being very slow (#4918, #4863)
  • ICCAT BFT-E VRE: asking for help with the parallelization of the Fortran ICCAT step 3 algorithm (#4957)
  • General: general feedback on issues when using applications like SAI and dataminer (#4818)
  • Tuna Atlas and ICCAT BFT-E: testing current development of WPS Widget (#5027) to set up web pages

ICES

  1. MSE for horse mackerel – working on generalization to allow the algorithm to determine parameters from input files… but, there have been issues with versions of R packages that have been used in the original code. Expect to close in OCT, then initialize a ticket for CNR to help load it as a VRE.
  2. Mixed Fisheries assessment model – In OCT working group will be meeting at ICES; BB to work with the developers to get a general version running on the infrastructure.
  3. Ensemble model – Robert Thorpe participated in the ASC workshop; planning on generalizing some things in his model to speed up integration possibility.

A comment: in general, stock assessment code is written for a one-off purpose and is not really focused on generalization

FAO

  • Continued developing OpenCPU; on-line fisheries data viewer (#3158, #3787): http://vps282167.ovh.net/mediawiki/index.php/BlueBridgeJsOpenCpuLayers
  • Activated consultants to work on Tuna Atlas Master data, Support to community programmers, and Schema's
  • Contracted consultant to support stock assessment; WECAFC DLM R models (DLM tool; ss3 etc)

ENG

  • Tabman SDMX Module for timeseries: continued implementations according to SDMX standard and SDMX source API documentation(#4643)
  • Enhancements to EwE: implementation of notifications to users and refactoring (#4924)
  • Preliminary analysis of new models: Stock Synthesis (SS3) and Surplus Production in Continuous Time (SPiCT)

#34 Updated by Julien Barde almost 3 years ago

M14 October 2016 Effort report

IRD

  • Summary of technical issues / feedback for participation to TCom3 (#4957)
  • ICCAT BFT-E VRE:
  • Tuna Atlas: consolidating recent algorithms deployed in September and related Web Forms using OpenCPU (#5490)
  • General: fixing bugs for the integration of the RShiny package (#4894, #5466), asking for ShareLaTeX integration (#4894), RStudio and virtual workspace connection (#4918), getting a better implementation of the WPS standard (#5544, #5545), misc (#5599)

ENG

  • Automating the process of publication of algorithms on dataminers (both published via SAI and not). Enhancements of the functionalities to keep dataminer clusters updated with the set of registered algorithms. Current solution has been discussed and analyzed. Alternative technologies have been explored. Design of the new service has been finalized. Implementation started. (#5557)
  • Completed the first prototype of SDMX data source and discussed during the TCOM. According to the outcome of that discussion the prototype will be modified and completed by making it compliant with Tabman that will contain the tables to be exposed in SDMX format. The technical work was started during last week of October (#4643)

FAO

  • Continued developing OpenCPU; On-line fisheries data viewer #4844 (and #1678, #3158, #3787)
  • Consultants started review of DSD editing tools and reference data Schema in use by FAO
  • Consultant review of SS3 to support stock assessment in WECAFC, and DLM tools;
  • SAI/Rstudio/Dataminer - troubleshooting
  • Web-app integration - troubleshooting

#35 Updated by Anton Ellenbroek almost 3 years ago

  • Due date changed from Feb 01, 2017 to Nov 04, 2016

due to changes in a related task

#36 Updated by Anton Ellenbroek almost 3 years ago

M15 November 2016 Effort report

IRD

ICES

  • Integration

FAO

  • Started with Regional Database for the WECAFC Region (data load) #1674
  • Continued developing OpenCPU; On-line fisheries data viewer #4844 (and #1678, #3158, #3787)
  • Consultants started review of DSD editing tools and reference data Schema in use by FAO
  • Consultant review of SS3 to support stock assessment in WECAFC, and DLM tools;

#37 Updated by Anton Ellenbroek almost 3 years ago

  • Due date changed from Nov 04, 2016 to May 31, 2017

due to changes in a related task

#38 Updated by Anton Ellenbroek over 2 years ago

M16 December 2016 Effort report

IRD

  • Started with preparation of D5.3
  • ICCAT BFT-E VRE: improving the RShiny application (bugs resolved). Waiting for the infrastructure to implement docker to host shiny applications (#5822 & #5823)
  • ICCAT BFT-E VRE: improvement of automated report (detailed report and executive summary version) by setting functions for each plot of the documents.
  • Tuna Atlas VRE: discussing conventions (related to RDA working group) for data structure of Tuna Atlas data and implementing some of them to export Sardara database data into NetCDF files (or NCML), testing different methods and libraries.
  • ICCAT BFT-E & Tuna Atlas VRE: presenting the VREs to IATTC and to Global Fishing Watch (at Google's)

ENG

  • Definition of integration plan for SDMX operation library and the Portlet (#5870)
  • minor fixes to SDMX data source, waiting for integration (#4643)
  • review preparation

ICES

FAO

  • Prepared presentations for the LME Meeting in Paris, and for the event and review in Brussels
  • Loaded datasets for Regional Database for the WECAFC Region (data load) #1674
  • Continued developing OpenCPU; On-line fisheries data viewer #4844 (and #1678, #3158, #3787)
  • Consultants review of DSD editing tools and reference data Schema in use by FAO;
  • Consultant testing of SS3 to support stock assessment in WECAFC, and DLM tools;

#39 Updated by Anton Ellenbroek over 2 years ago

M17 January 2017 Effort report

D5.3 Preparation (All partners)

IRD

  • Preparation for BlueBridge event and review in Brussels
  • Specifications to prepare some work with IOTC and SS3 (billfish and tropical Tuna)
  • RDA working group: writing guidelines / conventions and adapting implementations to turn VRE data (ICCAT BFT-E, Tuna Atlas, Ichthyop) into standardized data structures and (meta)data formats (NetCDF)
  • Tuna Atlas VRE: planning the next steps and discussing possible collaboration to enrich the VRE with FAO, preparing the content for the public Web page (to be reused as a template)
  • Specifications for a new VRE: discussing the specifications for a new VRE dedicated to Ichthyop model reusing algorithms deployed in Stock Assessment VRE and new ones.

ENG

  • Definition of the details of the integration between SDMX exporter and the Portal (#6548)
  • Implementation of the candidate solution of the "dynamic HTTP content" issue (#5422) and definition of the testbed (#6548)

ICES

  • No report available until 06 - Feb due to absence of staff

FAO

  • Prepared and delivered presentations for BB event and BB review in Brussels
  • Reviewed datasets for the Regional Database for the WECAFC Region (data load) #1674, #6098
  • Consultants review of BiOnym tools related to Ices2Asfis mapping for use by ICES and FAO; #6502 #6508
  • Consultant identification to support stock assessment with SS3 (#5810), FLR (#6348), DLM tools.

#40 Updated by Anton Ellenbroek over 2 years ago

M18 February 2017 Effort report

D5.3 Preparation (All partners) - Delay - FAO is redefining workplans

IRD

  • RDA: writing a document defining data structures for some examples of Fisheries datasets

  • FAO_Tuna_ATLAS VRE: update the 'TunaAtlasToNetCDF.R' script to generate NetCDF files in an automated way from the Sardara database:

    • complying with the CF Conventions,
    • additional global attributes,
    • automated listing of values for non-quantitative variables and coordinate variables (dimensions, e.g. flag_values, flag_meanings, flag_masks, ...) in the "units" attribute. See script here: https://goo.gl/lewZpf
    • Creating a generic algorithm to transform any table of data into a NetCDF file (still ongoing). Validation will be done by testing this algorithm with new queries (other variables and dimensions of the Sardara database). A sketch of the CF flag-attribute idea appears after this list.
  • Ichthyop_Model VRE:

    • identification of new algorithms to be deployed in the new VRE: comparison and simulation of trajectories (3 algorithms so far)
    • creation of the "SPATIAL_DENSITY_DISTRIBUTION" algorithm to map the spatial distribution of data (number of observations). This algorithm is generic and can be used for trajectory data other than the ones used for the study (FADs vs Drifters). Link: https://goo.gl/NnrHg7
    • creation of the "SCATTERPLOT_DIAGRAM" algorithm, which represents 2 variables with scatter plots (clouds of points) to emphasize the level of correlation between these variables. This algorithm is generic and can be used for any table of data. Link: https://goo.gl/cGK6HD
    • creation of the "SPATIAL_DISTRIBUTION_OF_CORRELATION" algorithm, which maps the spatial distribution of correlations between 2 variables by spatial area (e.g. Longhurst provinces or a grid of X degrees). This algorithm is generic and can be used for any table of data. Link: https://goo.gl/CM7rDa
    • automated report: using the VRE for the collaborative editing of an article. Support for users to validate methods and compile the code on the infra.
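
For readers unfamiliar with the CF flag convention mentioned above, a minimal sketch with the ncdf4 package is shown below; the variable, its codes and their meanings are purely illustrative and not the actual Sardara content.

    # Hedged sketch: write CF-style flag attributes for a coded (non-quantitative) variable.
    library(ncdf4)

    dtim  <- ncdim_def("time", "days since 1950-01-01", 0:11, unlim = TRUE)
    vflag <- ncvar_def("fishing_mode", "", list(dtim), missval = -1, prec = "integer")
    nc    <- nc_create("tuna_atlas_flags_example.nc", vflag)

    ncvar_put(nc, vflag, sample(0:2, 12, replace = TRUE))      # illustrative coded values
    ncatt_put(nc, "fishing_mode", "flag_values",   0:2)
    ncatt_put(nc, "fishing_mode", "flag_meanings", "unknown free_school log_school")
    ncatt_put(nc, 0, "Conventions", "CF-1.6")                  # varid = 0 writes a global attribute
    nc_close(nc)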

ENG

  • Implementation of the Tabman Operation module for exporting SDMX Data Structures (#4643 and #7121) along with related Codelists on Fusion Registry. The component was released with gCube 4.3.0 (#7338)
  • Implementation of the portlet based solution enabling secure access to unsecure content through the Portal (#5422). The first version, supporting authentication, was released with gCube 4.3.0 (#7357)
  • Implementation of a service, aka DataMiner Manager Pool, able to automatize the process of publication of SAI algorithms on dataminer nodes (#5557)

ICES

Accomplishments:
Task #7491: ICES effort for February was devoted to developing Shiny applications that were also used for the ICES MSY training course (in Anna’s WP). We created a length-based indicator application that takes raw length frequency data and transforms it into output suitable for determining ICES proxy MSY reference points. The code is ready for deployment in a Docker image and would be useful as a test case for ShinyProxy (task #7492).
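
For context, ShinyProxy launches each application from its own Docker container, so such an image essentially only needs R, the shiny package and an app file. A minimal sketch of such an app is shown below; its content is purely illustrative and far simpler than the actual LBI application.

    # Hedged sketch: a self-contained Shiny app of the kind a ShinyProxy container can serve.
    library(shiny)

    ui <- fluidPage(
      titlePanel("Length frequency demo"),
      sliderInput("n", "Number of sampled lengths", min = 50, max = 5000, value = 500),
      plotOutput("hist")
    )

    server <- function(input, output, session) {
      output$hist <- renderPlot({
        lengths <- rlnorm(input$n, meanlog = 3.5, sdlog = 0.25)   # simulated length data (cm)
        hist(lengths, breaks = 30, main = "Simulated length frequency", xlab = "Length (cm)")
      })
    }

    shinyApp(ui, server)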

Task #5818: Closed

Deviations:
I think further effort on tasks #1768, #4620, and #5517 might not be useful: they are not really suited for deployment on the e-infrastructure; modifying the “log-jam” functions would likely be more efficient using C++ or TMB rather than simply parallelizing inefficient code; and they might have limited broad-scale application to the rest of the community.

Plans:
Further effort should be devoted to linking the e-infrastructure with Docker images / ShinyProxy such that data can be moved between the resources and so that the e-infrastructure is more than a simple server for Shiny apps. I think Gabriele has some thoughts on how this might be done.

Further effort should be devoted to linking the workspace, RStudio, and VREs, as described in task #7493.

FAO

  • Prepared and delivered p

#41 Updated by Anton Ellenbroek over 2 years ago

M19 March 2017 Effort report

D5.3 Preparation (All partners) - Delay - FAO is redefining workplans

IRD

  • Specifications and algorithms for Ichthyop_Model VRE (for now on StockAssessment VRE): #7449
    • Update the following algorithms deployed on SAI: "SPATIAL_DENSITY_DISTRIBUTION", "SCATTERPLOT_DIAGRAM", "SPATIAL_DISTRIBUTION_OF_CORRELATION".
    • New algorithms:
    • comparisons of trajectories: "smooth_scatterplot_Uvspeed"
    • prepare deployment as WPS
    • implement algorithms to compare large datasets of Drifter and FAD trajectories in the Atlantic Ocean
    • transferring a ShareLaTeX project to edit an article about trajectory comparisons using services of the infra (workspace, R online, ShareLaTeX, RStudioLab VRE): #7437, #7627, #7862, #7612
  • Starting new work on SS3 with IOTC & Ifremer (#7450):
    • compiling SS3 online (on the BlueBRIDGE project infrastructure): learning the basics of dealing with a VRE and the related RStudio
  • Preparing material for RDA: online documentation and presentations.
  • Participation in the 4th TCom.
  • Participation in the Regional Database workshop
  • First test with a Geoserver instance for the Tuna Atlas (creating a WMS/WFS): #7451
  • Code to read, in RStudio, files which are stored in the virtual workspace: #7495, #6820, #7113, #7609
  • metadata: #7455
  • misc: #7341, #7299, #7477, #7463, #7592, #7591, #7651

ENG

  • Continued the implementation of the Smartgear DataMiner Manager Pool service, able to automatize the process of publication of SAI algorithms on dataminer nodes (#5557)(#7928)
  • Proxy for unsecure services completed (#5422) and released (#7931)
  • Support and test for the new version of Fusion Registry deployed on the infrastructure (#4737)
  • Improvements of SDMX (#7358) exporter that now supports templates (#7359) and Fusion Registry 8
  • Followed the workshop on new Regional Database
  • Took part in the 4th TCOM presenting the latest achievements

ICES

Accomplishments:
Task #7491: ICES effort for February was devoted to developing shiny applications that were also used for the ICES MSY training course (in Anna’s WP). We created a Length-based indicator application that takes raw length frequency data and transforms them into output suitable for determining ICES proxy MSY reference points. The code is ready for deployment in a docker image and would be useful as a test case for shinyproxy (task #7492).

FAO

#42 Updated by Julien Barde over 2 years ago

M20 April 2017 Effort report

IRD

  • Algorithms for Ichthyop_Model VRE (for now on StockAssessment VRE): #7449
    • Create a WPS version of the "SMOOTH_SCATTERPLOT_DIAGRAM" algorithm (and deployment on SAI / Dataminer of PrototypingLab VRE in the left menu item "Ichtyop Model"): a generic algorithm to make an indirect comparison of in situ observations with remote sensing data (through a data table displayed as a scatter plot).
    • Create a new generic algorithm "Enrichment_From_Netcdf" to enrich observations (locations given by points for now) with data harvested from remote servers (Thredds / OPeNDAP): satellite products or model outputs. Parallelization of the code. Creation of a WPS version (deployment on SAI / Dataminer of PrototypingLab VRE in the left menu item "Ichtyop Model"). A sketch of the enrichment idea appears after this list.
    • Create a new generic algorithm "Search_Closest_Obs_FadsDrifters". Parallelization of the code. Ensure genericity of the code (not sticking only to the use case of Drifters and FADs). Execution of this algorithm to identify the pairs of FADs & Drifters which are close in time and space in the Indian Ocean from 2008 to 2014, and likewise for the Atlantic (for a publication).
  • ICCAT_BFT-E VRE:
    • update bluefin tuna stock assessment algorithms on ICCAT_BFT-E VRE:
    • make use of the new "uploadFileWS" and "downloadFileWS" functions
    • modification of the STEP 1 algorithm output (STEP 1 parallelized): it now returns the whole output files instead of a list of links for each iteration.
  • SS3 with IOTC & Ifremer (#7450):
    • explore and better understand SS3 (SS3 user guide) and its parametrization (for the latest version: SS3.30): description of the inputs and outputs in terms of structure, data type, and conditional requirements.
    • write accordingly an R script that will automatically produce the four files that are required to launch the model,
    • sharing about the project with the IOTC stock assessment officer to get sample code: the consultant's SS3 scripts used in the last assessment of yellowfin and bigeye tuna (2015).
    • updating the scripts provided by IOTC from SS3.24 to SS3.30.
    • configuring sharelatex project: #7437
  • Tuna Atlas VRE: update public Web page: #8061
  • Attending RDA: #604, discussing improvements for online documentation and presentations.
  • About metadata:
    • trying codes (R codes of Emmanuel Blondel) and discussing methods to populate the infrastructure catalog #7455
    • update public Web page: #8061
    • Discussing the use of GN for the Tuna Atlas VRE #7896
    • Discussing the use of Thredds for the VRE #5679
  • misc: #7495, #7609, #7591, #7995, #7592, #7989, #6820, #8031, #8131, #8259
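
One of the items above, "Enrichment_From_Netcdf", boils down to reading a remote NetCDF variable at observation points. A rough sketch of that idea with ncdf4 is shown below; the OPeNDAP URL, the variable name "u" and the dimension names "lon"/"lat" are assumptions, and remote access requires a NetCDF library built with OPeNDAP (DAP) support.

    # Hedged sketch: nearest-neighbour enrichment of point observations from a remote (OPeNDAP) NetCDF.
    library(ncdf4)

    obs <- data.frame(lon = c(-20.5, 2.1), lat = c(10.2, -5.4))   # illustrative observation points

    nc  <- nc_open("https://thredds.example.org/thredds/dodsC/oscar/oscar_vel.nc")  # hypothetical URL
    lon <- ncvar_get(nc, "lon")
    lat <- ncvar_get(nc, "lat")

    # Read one grid cell per observation rather than downloading the full array.
    obs$u <- mapply(function(x, y) {
      i <- which.min(abs(lon - x))
      j <- which.min(abs(lat - y))
      ncvar_get(nc, "u", start = c(i, j, 1), count = c(1, 1, 1))
    }, obs$lon, obs$lat)

    nc_close(nc)
    print(obs)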

ENG

  • SDMX Module implementation
    • implementation (#7359), test (#8033) and release (#7986, #7992, #7876) of the functionality to export Tabman templates as SDMX data structures
  • Secure access to unsecure content
    • proxy solution for static content releases (#7931)
    • analysis and first test implementation of the capability to support dynamic content, starting from javascript (#8284)
  • Finalized and released the first version of the Smartgear DataMiner Manager Pool service #5557 #8208

ICES

  • Collaborated with FAO on recruitment of developer and preparation of work-plan
  • Identifying DLM methods ICES needs consultant to develop.
  • Initiated in-house review of FIRMS links to ICES web services
  • Deep in ICES advice season so resources are limited

FAO

  • #7865; After species mapping experiments from ASFIS to WoRMS made with Aymen we requested CNR to update the Worms list for Bionym use.
  • #7882; To check if the online Rstudio in the e-infra can connect to the user's workspace
  • #8217; To share some ideas with the community on how to build a driver to mount the user's workspace in linux systems. related to #7882
  • #8272; To add a new page in the Tuna Atlas VRE for letting them run a model using the widget I built
  • #8318; To update the WPS Client R library installed in OpenCpu that has been changed to authenticate itself to the WPS just with the token instead of username/token
  • RDB => development of backend service; FAO proposed to use ENG effort with SDMX DSD and validation
  • RDB => FAO considering how to recruit front end developer shared for RDB and other projects (F3 = FCube)
  • Stock assessment scientist => an RShiny developer; want to add CMSY to the package but ..
    • ICES will follow this activity closely, and recruit the consultant after FAO

#43 Updated by Anton Ellenbroek over 2 years ago

  • Due date changed from Jul 31, 2017 to May 31, 2017

due to changes in a related task

#44 Updated by Anton Ellenbroek over 2 years ago

  • Due date changed from May 31, 2017 to Jul 31, 2017

due to changes in a related task

#45 Updated by Anton Ellenbroek over 2 years ago

M21 May 2017 Effort report

IRD

  • Tuna Atlas VRE:
    • new version of the algorithm transforming Tuna Atlas data into NetCDF, ‘TunaAtlasToNetCDF_V2’: update the part which manages the time dimension to configure multiple time steps (yearly, monthly, daily, ...)
    • using R codes of Emmanuel Blondel to write metadata for some datasets
    • using R codes of Emmanuel Blondel to publish metadata in Geonetwork
    • using R codes of Emmanuel Blondel to publish datasets as WMS/WFS using Geoserver instance.
    • Some Tuna Atlas datasets stored in the virtual workspace as CSV and NetCDF files (waiting to be displayed in Thredds)
  • Ichthyop VRE:
    • update of the algorithm ‘ENRICHMENT_FROM_NETCDF.R’ to fix bugs (users feedback).
    • creation of a new script ‘Search_Closest_Obs’ which takes three input parameters: two tables containing observations (located with points) and a spatio-temporal ‘buffer’. The code will then find the closest (in space and time) observations in the two tables by calculating, for each pair of points, the spatial and temporal distance (a sketch of this matching idea appears after this list).
    • update OSCAR NetCDF (Task #8713)
  • Eastern Bluefin Tuna 2017 Stock Assessment
    • preparing new data to be processed in 2017
    • update the ICCAT algorithm for the new assessment: 1/ mapping old and new files provided by ICCAT, 2/ updating chunks of code for the workflow to be more generic (working for multiple stock assessments).
    • ongoing retrospective analysis (STEP 1) on the infrastructure (both RStudio and Dataminer).
  • SS3:
    • Successfully tested the latest version of the SS executable (SS3.3) on the infra, using a simple code (one simulation takes about 2.5 minutes).
    • Using example codes given by IOTC, and a simulation compliant with the consultant's methodology, one real-case simulation was tested. This simulation takes about 20 minutes on a laptop and 1.2 hours on the infra.
    • Calculated expected resource use for different scenarios of use during a real stock assessment, assuming a run time of 20 minutes (summary at the TCom).
    • Contacted the NOAA SS3 team to confirm that we can deploy SS3 for our purposes (a group of people registered as members of the VRE). The initial response is positive, and awaiting final confirmation.
    • Continuing to recompile IOTC example codes and splitting these into functions.
  • RDA call: preparing examples for metadata and data access
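
A minimal sketch of the Search_Closest_Obs idea mentioned above (not the deployed script; column names, the geosphere dependency and the distance weighting are assumptions):

    # For each point of table 'a', find the closest observation of table 'b'
    # within a spatio-temporal buffer; spatial and temporal distances are
    # normalised by the buffer sizes before ranking candidates.
    closest_obs <- function(a, b, max_km = 50, max_days = 3) {
      # a, b: data.frames with columns id, lon, lat, date (class Date)
      res <- lapply(seq_len(nrow(a)), function(i) {
        d_km   <- geosphere::distHaversine(as.matrix(a[i, c("lon", "lat")]),
                                           as.matrix(b[, c("lon", "lat")])) / 1000
        d_days <- abs(as.numeric(a$date[i] - b$date, units = "days"))
        ok <- d_km <= max_km & d_days <= max_days
        if (!any(ok)) return(NULL)
        score <- d_km / max_km + d_days / max_days   # combined spatio-temporal distance
        j <- which(ok)[which.min(score[ok])]
        data.frame(a_id = a$id[i], b_id = b$id[j], dist_km = d_km[j], dt_days = d_days[j])
      })
      do.call(rbind, res)
    }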

ENG

  • SDMX Module implementation (#7358)
    • Definition of new priorities about the modules to be implemented
    • Implementation of the module generating Excel files according to an SDMX data structure started (#8717); an illustrative R sketch of the idea follows after this list
  • Secure access to unsecure content
    • proxy service supporting dynamic content and javascript ready to be released on gCube 4.6.0 (#8284)
  • According to new CNR guidelines, started the implementation of the second version of the Smartgear DataMiner Manager Pool service #5557 #8208
  • Ongoing wrapping of the OSCAR Dataset merger as a scheduled job for the Smart Executor (#8713)
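
ENG's module is part of the gCube/Tabman stack; purely as a hedged illustration of the same idea (reading an SDMX dataset and exporting it to Excel) in the R ecosystem used elsewhere in this task, with a placeholder URL:

    library(rsdmx)     # SDMX-ML parsing
    library(openxlsx)  # Excel writing

    # read an SDMX data message (placeholder URL) and flatten it to a data.frame
    sdmx <- readSDMX("https://registry.example.org/rest/data/AGENCY,DATAFLOW_ID/all")
    df   <- as.data.frame(sdmx)

    # one worksheet with the flattened observations
    write.xlsx(df, file = "sdmx_export.xlsx")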

ICES

  • Collaborated with FAO on recruitment of developer and preparation of work-plan
  • Identifying the DLM methods ICES needs the consultant to develop.
  • Initiated in-house review of FIRMS links to ICES web services
  • Deep in ICES advice season so resources are limited

FAO

  • #7865; After the ASFIS-to-WoRMS species mapping experiments made with Aymen, we requested CNR to update the WoRMS list used by Bionym.
  • #7882; To check if the online RStudio in the e-infra can connect to the user's workspace
  • #8217; To share some ideas with the community on how to build a driver to mount the user's workspace on Linux systems (related to #7882)
  • #8272; To add a new page in the Tuna Atlas VRE to let users run a model using the widget I built
  • #8318; To update the WPS Client R library installed in OpenCpu, which has been changed to authenticate itself to the WPS with the token alone instead of username/token
  • RDB => development of backend service; FAO proposed to use ENG effort with SDMX DSD and validation
  • RDB => FAO considering how to recruit a front-end developer shared between RDB and other projects (F3 = FCube)
  • Stock assessment scientist => an RShiny developer who wants to add CMSY to the package, but ..
    • ICES will follow this activity closely, and recruit the consultant after FAO

#46 Updated by Anton Ellenbroek about 2 years ago

M22 June 2017 Effort report

IRD

  • TCom preparation: https://docs.google.com/presentation/d/1BnfzyjtCpxaipBYoz1l-ldB9BNbaGv94ALD9jN8MYi8/edit#slide=id.g1f7496a4c8_0_8
  • Tuna Atlas VRE:
    • validation of CSW harvesting (from GN2 of the VRE to GN3) #8937 and of CKAN (#8874, #8879), #5679
    • enrichment of codes to create better metadata and new metadata sheets (for codelist and nominal catch) in close collaboration with FAO
    • metadata for processes: #5559
    • update of WMS/WFS (new ones for codelist and nominal catch) #9018
    • R package with assistance of FAO #9018
  • Ichthyop VRE:
    • finalizing processes for a publication on Fish Aggregating Devices
    • OSCAR data update: #8713
  • Eastern Bluefin Tuna 2017 Stock Assessment
    • preparing new data to be processed in 2017, blocking issues: in #919 and #8996
    • processing new data provided by ICCAT and Ifremer.
    • expecting help for shiny dockerization: #8902, #9012
  • SS3:
    • description of the different possible scenarios and related model parameterizations; calculation of execution times on the infrastructure (a back-of-the-envelope sketch follows after this list).
    • execution of past experiments / Stock assessment with different compiled versions
    • interactions with NOAA SS3 team.
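
A back-of-the-envelope sketch of the execution-time calculation (the run counts and worker number below are assumptions, not the figures reported at the TCom; only the ~20 minutes per run comes from the earlier tests):

    minutes_per_run <- 20                                             # observed single-run time
    scenarios <- c(retrospective = 30, grid = 180, full_grid = 500)   # hypothetical run counts
    workers   <- 10                                                   # assumed parallel workers

    cpu_hours  <- scenarios * minutes_per_run / 60
    wall_hours <- cpu_hours / workers
    data.frame(runs = scenarios, cpu_hours = cpu_hours, wall_clock_hours = wall_hours)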

ENG

  • New scheduling for SDMX Module implementation (#7358) agreed during the TCOM
  • Continuing the implementation of the module generating excel files according to an SDMX data structure (#8717)
  • Excel file successfully generated and saved in the Workspace; a first implementation of the graphical command provided by CNR (#8181)
  • Implementation of a new Tabman Resource related to SDMX objects; it will support the graphical user interface by providing users with more information about the registry where data are exported (#9056)
  • New components including the features mentioned above are ready to be released (#8994, #8995, #8998,#8999)
  • The Implementation of the new version of the Smartgear DataMiner Manager Pool service has been finalized. The Integration test with SAI is ongoing (#5557)

ICES

  • C

FAO

  • Presented progress at the CWP and FIRMS meetings, and presented the ASFIS-to-WoRMS species mapping experiments to ICES.
  • RDB => development of backend service; ENG effort with SDMX DSD and validation
  • RDB => FAO recruited front-end developer
  • Stock assessment scientist => an R and RShiny developer, working on adding CMSY to the DLM Toolkit package
  • Assisted in implementing the first example shinyproxy application on the infrastructure (LBI indicator tool #7492 #5822 #8902).
  • Presented progress on the integration of CMSY into the DLM Toolkit to TCom 5 (#5238 #8500 #8501 #8503).
  • Benchmarked processing requirements of the CMSY model and identified issues preventing DLM Toolkit integration (#8501 #8503).
  • Identified potential methods of optimizing CMSY algorithm runtime through code vectorization and adaptive Monte-Carlo search (#8501 #8503).

#47 Updated by Anton Ellenbroek about 2 years ago

M23 July 2017 Effort report

IRD

ENG

  • Smartgear DataMiner Manager Pool service implementation (#5557):
    • new features added
    • improved the core functionalities
    • bugs fixed
    • Concerning the Staging Phase: performed the overall end-to-end integration test among the service, SAI Importer and the CronJob
    • Concerning the Release Phase: development finalized and local test carried out (integration test to be performed)
    • Expected release in gCube 4.6.1
  • SDMX related activities (#7358)
    • first version of excel generator implemented (#8717) and released (#9389)
    • minor fixes and improvements (also with respect to the previously planned release with gCube 4.6.0) on the functionalities supporting SDMX (#8995, #8998, #8994, #9001, #9902, #8999), released with gCube 4.6.1
    • meeting with FAO in which the status of the activities was evaluated and the next steps concerning the Excel generator were planned

ICES

  • pp

FAO
  • Produced modified CMSY code using vectorization that significantly reduced the runtime of CMSY (#8501 #8503); a minimal sketch of the vectorization idea is given below.
  • Identified an issue of species-specific extreme runtimes for CMSY (#8501 #8503).
  • Determined the cause of the extreme runtimes to be incompatible prior bounds on depletion (#8501 #8503).
  • Produced a new CMSY implementation that uses revised parameter sampling logic and automatically adaptive prior bounds to decrease runtime and avoid species-specific model failure scenarios (#8501 #8503).
  • Working with CNR to benchmark the new code against an expanded example species set and to implement the new code in DataMiner (#8501 #8503).
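
A minimal sketch of the vectorization idea (hypothetical catch series and priors; not the FAO/CNR CMSY code): all Monte-Carlo (r, K, depletion) samples are propagated through the Schaefer production equation together, so the only remaining loop is over years:

    set.seed(1)
    catch <- c(100, 120, 150, 170, 160, 140, 130, 120)   # hypothetical catch series (t)
    n <- 1e5                                              # number of Monte-Carlo samples
    r <- runif(n, 0.1, 0.8)                               # intrinsic growth rate prior
    K <- runif(n, 5 * max(catch), 20 * max(catch))        # carrying capacity prior
    b <- runif(n, 0.5, 0.9) * K                           # prior on initial depletion

    viable <- rep(TRUE, n)
    for (ct in catch) {                                   # loop over years only
      b <- b + r * b * (1 - b / K) - ct                   # one vectorised update for all samples
      viable <- viable & b > 0                            # drop collapsed trajectories
    }
    mean(viable)   # share of (r, K) pairs compatible with the catch history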

#48 Updated by Anton Ellenbroek about 2 years ago

M24 August 2017 Effort report

IRD

  • Tuna Atlas VRE
    • improving the (meta)data workflow
  • Stock Assessment / Ichthyop VRE:
    • finalizing an article comparing in situ and remote sensing data (OSCAR sea surface currents compared to Fish Aggregating Devices trajectories)
  • IOTC SS3 VRE:
    • replicating past workflow (stock assessment)
    • planning new steps of the workflow (shiny apps and automated reports)
    • wrote the R script to transform (restructure) the data outputs of SS3 (R data) into NetCDF files (a minimal ncdf4 sketch is given after this list)
    • submitted a paper for the IOTC billfish WPB15 meeting in San Sebastian (10/9-14/9), entitled "An online tool to easily run stock assessment models, using SS3 and SWO as an example", with authors: Anne-Elise Nieblas, Sylvain Bonhommeau, Taha Imzilen, Dan Fu, Fabio Fiorellato, and Julien Barde. Sylvain Bonhommeau will present the paper.
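
A minimal ncdf4 sketch of the restructuring step mentioned above (variable names, units and attributes are assumptions, not the actual script):

    library(ncdf4)

    years <- 1990:2016
    ssb   <- runif(length(years), 1e4, 5e4)        # placeholder for an SS3 output series

    dim_time <- ncdim_def("time", "years since 1990-01-01", years - 1990)
    var_ssb  <- ncvar_def("spawning_biomass", "tonnes", list(dim_time), missval = -9999)

    nc <- nc_create("ss3_outputs.nc", list(var_ssb))
    ncvar_put(nc, var_ssb, ssb)
    ncatt_put(nc, 0, "title", "SS3 outputs (example)")   # global metadata elements
    ncatt_put(nc, 0, "species", "Xiphias gladius")
    nc_close(nc)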

FAO / WECAFC

ENG

  • SDMX related activities (#7358)

    • implementation of the new features agreed during FAO-ENG meeting started (#8717)
  • The integration test of the new version of the Smartgear DataMiner Manager Pool service with SAI, the CRON job and the portlet has been finalized, both for the Staging and Release phases (#5557). Some modifications have been carried out and bugs fixed.

ICES

#49 Updated by Anton Ellenbroek about 2 years ago

  • Due date changed from Dec 01, 2017 to Dec 31, 2017

due to changes in a related task

#50 Updated by Anton Ellenbroek almost 2 years ago

M25 September 2017 Effort report

IRD

  • Improve and enrich descriptions of Tuna Atlas metadata workflow to present a generic method for RDA
  • Update the code which writes NetCDF files to access Tuna Atlas datasets (improving execution time) => only done for the "nominal catch" case so far; working on the other cases (catch, effort), trying methods other than the R rasterize function (see the sketch after this list)
  • tickets: #9734, #9698, #9135
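
A sketch of one alternative to the R rasterize function for gridding point data (column names and resolution are assumptions): compute each point's cell centre directly from its coordinates and aggregate with a grouped sum:

    grid_catch <- function(df, res = 5) {
      # df: data.frame with columns lon, lat, catch; res: cell size in degrees
      df$cell_lon <- floor(df$lon / res) * res + res / 2   # cell-centre longitude
      df$cell_lat <- floor(df$lat / res) * res + res / 2   # cell-centre latitude
      aggregate(catch ~ cell_lon + cell_lat, data = df, FUN = sum)
    }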

FAO

  • Stock assessment
  • Developing a how-to guide for the development of Shinyproxy docker images. #8902
  • SDMX related activities (#7358)

FAO / WECAFC

  • Testing existing DLMtoolkit shiny application to identify further development requirements. #9869

ENG

  • SDMX related activities (#7358)

    • Releases of the old features already planned for gCube 4.6 and released with 4.6.1 (#9389)
    • Implementation of the excel exporter completed (#8717) according to the features defined in July.
    • Definition of the next steps concerning the features needed by FAO for Tabman/RDB
  • Features concerning the Smartgear DataMiner Manager Pool service developed/tested/released. (#9559)(#8651)(#9641)(#9637)(#8245)(#9722)(#9562)(#9724)(#9737)(#9850)

    • Supporting CNR to deploy in preproduction and production environments

ICES

  • Implemented BFT-E and ICCAT shiny apps on the infrastructure. #9012

#51 Updated by Dimitris Katris almost 2 years ago

  • Due date changed from Dec 15, 2017 to Dec 31, 2017

due to changes in a related task

#52 Updated by Anton Ellenbroek almost 2 years ago

M26 October 2017 Effort report

IRD

  • Thredds configuration: working on the configuration of Thredds to set up catalogs for VREs
  • IOTC SS3 VRE:
    • keep on developing a shiny application and dynamic report for IOTC / SS3 outputs
    • creating a workflow to extract metadata from NetCDF files using Thredds / OPeNDAP and R codes (a minimal sketch is given after this list)
  • ICCAT BFT-E VRE: update metadata of NetCDF (2014 stock assessment outputs), same (ongoing work) for 2017, and dynamic report (with Sharelatex)
  • Ichthyop model:
    • "Enrichment_From_Netcdf" algorithm: can now use more than one product (natively OSCAR)
    • "Enrichment_From_Netcdf" algorithm: parallelization (ongoing)
    • "SCATTERPLOT_DIAGRAM" algorithm: possibility to create scatterplots diagram from 2 variables (to be specified as input parameters)
  • Tickets: #9826
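
A minimal sketch of the metadata-extraction step mentioned above (placeholder URL; assumes ncdf4 is built with OPeNDAP support):

    library(ncdf4)

    url <- "https://thredds.example.org/thredds/dodsC/some_dataset.nc"  # placeholder
    nc  <- nc_open(url)

    global_atts <- ncatt_get(nc, 0)   # global attributes (title, institution, ...)
    variables   <- names(nc$var)      # variable names
    dimensions  <- names(nc$dim)      # dimension names (time, lat, lon, ...)

    nc_close(nc)
    str(global_atts)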

FAO

  • Developing a mean-length-based mortality estimator attachment for the Stock assessment shiny app (a hedged sketch of such an estimator is given after this list)
  • Developing a how-to guide for the development of Shinyproxy docker images. #8902
  • Implementing new fast CMSY package within data-miner
  • SDMX related activities (#7358)
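
A hedged sketch of the kind of estimator such an attachment typically wraps; the Beverton-Holt mean-length estimator of total mortality, Z = K (Linf - Lmean) / (Lmean - Lc), is assumed here (Lc is the length at full selectivity), not confirmed from the source:

    mean_length_Z <- function(lengths, Linf, K, Lc) {
      Lmean <- mean(lengths[lengths >= Lc])     # mean length of fully selected fish
      K * (Linf - Lmean) / (Lmean - Lc)
    }
    # e.g. mean_length_Z(lengths = rnorm(500, 45, 8), Linf = 80, K = 0.2, Lc = 30)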

FAO / WECAFC

  • Testing existing DLMtoolkit shiny application to identify further development requirements. #9869
  • Developing preliminary training plans for introduction workshop to VRE and data limited methods

FAO / ICES

  • Developing data translator between DLMtoolkit and FLR package data objects.

ENG

  • SDMX related activities (#7358)

    • New excel exporter module of Tabman released (#100018 and #10030).
    • Definition of the next steps concerning the features needed by FAO for Tabman/RDB
    • Implementation of SDMX Data Source started (#7360)
  • Bug fixing on the Data Miner Pool Manager (#10077 and #10076)

  • Improvements on Data Miner Pool Manager concerning the Production Environment (#10088)

#53 Updated by Anton Ellenbroek almost 2 years ago

M27 November 2017 Effort report

IRD

  • IOTC SS3 VRE:

    • SS3 NetCDF outputs have been enriched and tested through OPeNDAP access (e.g. files on Thredds),
    • first version of a shiny application for IOTC / SS3 outputs has been dockerized and deployed (https://shinyproxy.d4science.org/app/IOTC_SS3) but the online version still needs to be updated,
    • first version of the workflow to extract metadata from NetCDF files using Thredds / OPeNDAP and R codes has been successfully tested.
  • ICCAT_BFT-E:

    • harmonization of headers of NetCDF files (names of variables and related attributes)
    • this will enable reuse of the ICCAT shiny apps to visualize SS3 outputs
  • Ichthyop model algorithms (available here: https://goo.gl/bkaxgZ, described here) have been updated or created:

    • parallelization (ongoing) of the Enrichment_From_Netcdf algorithm: this process takes a spatio-temporal dataset and an OPeNDAP dataset URL as parameters and tries to enrich the input dataset with the OPeNDAP data. The input dataset must currently be a CSV file containing point geometries only. Tested with SST data.
    • SCATTERPLOT_DIAGRAM: mathematical diagram using Cartesian coordinates to display values for typically two variables for a set of data
    • new SMOOTH_SCATTERPLOT_DIAGRAM algorithm deployed: mathematical diagram using Cartesian coordinates to display values for typically two variables for a set of data
    • new SPATIAL_DENSITY_DISTRIBUTION algorithm deployed: generates a plot of the spatial density distribution of a variable in a dataframe
    • new SPATIAL_DISTRIBUTION_OF_CORRELATION algorithm deployed: generates a plot of the spatial distribution of correlation for two variables in a dataframe
  • Tickets: #10340, #10210

FAO / Rome

  • Implementing new fast CMSY package within data-miner
  • SDMX related activities (#7358)

FAO / WECAFC

  • Developing mean-length based mortality estimator attachment for Stock assessment shiny app. #5238
  • Testing existing DLMtoolkit shiny application to identify further development requirements. #9869
  • Developing preliminary training plans for introduction workshop to VRE and data limited methods. #5238
  • Running comparative testing of CMSY code versions. #8503
  • Identified statistical difference between CMSY-Legacy and CMSY-FAST. #8503
  • Bug checking differences between CMSY-Legacy and CMSY-Vectorized. #8503
  • Implemented IOTC-SS3 diagnostic plots as docker image on https://shinyproxy.d4science.org/ #7450

ENG

  • SDMX related activities (#7358)

    • New version of the SDMX exporter released (#10030): the release includes the improvements implemented in the previous months
    • Implementation of SDMX Data Source continued: a first prototype released and evaluated by some partners (#7360)
  • New version of the dataminer pool manager including IS-based configuration released (#10267)

#54 Updated by Julien Barde over 1 year ago

M28 December 2017 Effort report

IRD

  • IOTC SS3 VRE:

    • parallelization of the code that runs SS3
    • continuing to build on SS3 metadata (Outputs2NetCDF_light.R code, https://goo.gl/EKKg5Y): including species names and a file input for generic metadata elements,
    • mapping the shared outputs between VPA and SS3
  • Tuna Atlas VRE:

    • set of scripts to calculate indicators related to FADs and VMS/AIS trajectories (https://goo.gl/CAiCtf). These codes calculate 3 different types of indicators / variables which can feed the Tuna Atlas (a minimal sketch is given after this list):
    • number of objects (FADs, vessels) within a given polygon (e.g. square of a grid) for a given period of time
    • total length (and a normalized version of the same variable) of the trajectory segments of all objects (FADs, vessels) within a given polygon (e.g. square of a grid) for a given period of time
    • total area explored (and a normalized version of the same variable) by the different objects (FADs, vessels) within a given polygon (e.g. square of a grid) for a given period of time (achieved by creating a buffer over the polylines of the trajectories)
    • related metadata is written by the codes
    • adapting the code used to transform the Tuna Atlas data into NetCDF files in order to reuse it for FADs/VMS/AIS data.
  • RStudio lab VRE: trying other options to compile a publication with Sharelatex (#9826)
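
A minimal sf-based sketch of the three indicator types listed above (not the IRD scripts; object names, the 500 m buffer and a single time slice are assumptions; the per-period grouping is omitted):

    library(sf)
    library(dplyr)
    # 'traj': LINESTRING trajectory segments with a column object_id;
    # 'grid': polygons with a column cell_id; both in a projected CRS (metres).

    # 1. number of distinct objects (FADs, vessels) per grid cell
    counts <- st_join(grid, traj) %>%
      st_drop_geometry() %>%
      group_by(cell_id) %>%
      summarise(n_objects = n_distinct(object_id, na.rm = TRUE))

    # 2. total trajectory length per grid cell (segments clipped to each cell)
    segs <- st_intersection(traj, grid)
    segs$len_m <- as.numeric(st_length(segs))
    lengths <- segs %>% st_drop_geometry() %>%
      group_by(cell_id) %>% summarise(total_length_m = sum(len_m))

    # 3. total area explored per grid cell: buffer the polylines, clip, sum areas
    buf <- st_intersection(st_buffer(traj, dist = 500), grid)
    buf$area_m2 <- as.numeric(st_area(buf))
    explored <- buf %>% st_drop_geometry() %>%
      group_by(cell_id) %>% summarise(explored_m2 = sum(area_m2))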

FAO / WECAFC

  • Developing preliminary training plans for introduction workshop to VRE and data limited methods. #5238
  • Completed comparative testing of CMSY code versions. #8503
  • Found and corrected bugs between CMSY-Legacy and CMSY-Vectorized. Confirmed CMSY-Legacy and CMSY-Vectorized results are not significantly different. #8503

#55 Updated by Anton Ellenbroek over 1 year ago

M28 December 2017 Effort report

IRD

  • IOTC SS3 VRE:

    • SS3 NetCDF
    • shiny application for IOTC / SS3 outputs has been dockerized and deployed .....
    • the workflow to extract metadata from NetCDF files using Thredds / OPeNDAP and R Codes have been ....
  • ICCAT_BFT-E:

    • ??harmonization of headers of NetCDF files (names of variables and related attributes)
    • ??this will enable to reuse ICCAT shiny apps to visualize SS3 outputs
  • ??Ichthyop model algorithms (available here : https://goo.gl/bkaxgZ, described here) have been updated or created :

    • ??parallelization (ongoing) of [Enrichment_From_Netcdf]
  • Tickets: #10340, #10210

FAO / Rome

  • Implementing new fast CMSY package within data-miner and RShiny, added new methods to RShiny
  • SDMX related activities (#7358)

FAO / WECAFC

  • Developing mean-length based mortality estimator attachment for Stock assessment shiny app. #5238
  • Testing existing DLMtoolkit shiny application to identify further development requirements. #9869
  • Developing preliminary training plans for introduction workshop to VRE and data limited methods. #5238
  • Running comparative testing of CMSY code versions. #8503
  • Identified statistical difference between CMSY-Legacy and CMSY-FAST. #8503

ENG

  • SDMX related activities (#7358)
    • First version of the SDMX Data Source (#7360) released (#10807, #10808, #10809)
    • Modifications on the SDMX Exporter module required to support the SDMX Data Source implemented and released (#10806)
    • Bug fixing on the excel export module (#10798)
  • Dataminer pool manager 2.3.0 released (#10566): it logs dependency errors

#56 Updated by Anton Ellenbroek over 1 year ago

  • Due date changed from Dec 31, 2017 to Jan 31, 2018

due to changes in a related task

#57 Updated by Anton Ellenbroek over 1 year ago

M29 January 2018 Effort report

IRD

  • Stock Assessment / Ichthyop VRE:

  • IOTC SS3 VRE:

    • update of scripts to manage a new SS3 version (from ss3.24ab to ss3.24z)
    • adapting NOAA greater amberjack and Atlantic bluefin tuna code to VRE ss3write* scripts and editing the shiny app to account for a new species
  • Tuna Atlas VRE:

    • finalization of the workflow (R & SQL) for the setup and update of the Tuna atlas DB and successful test of this workflow (i.e. filling of the new tuna atlas DB)
    • adaptation and optimization of R codes to manage the aggregation of FAD or VMS trajectories either with grids (rasterization) or with irregular polygons (e.g. EEZ).
    • calculation of multiple indicators from FAD and VMS/AIS high-resolution data
    • creation of VMS map from different indicators
    • enrichment of metadata produced when calculating indicators
    • (ongoing) New version of R code used to transform Tuna Atlas data into NetCDF (including trajectories of FADs and fishing vessels): specifications of new data structures and metadata elements to be used as inputs in order to get a single function. Samples of outputs can be seen here: https://goo.gl/QtS6LZ
  • Metadata workshop Montpellier:

    • enrichment and validation of a metadata workflow from a Thredds catalog
    • enrichment and validation of a metadata workflow from a google spreadsheet

FAO / Rome

FAO / WECAFC / ICES
  • Completing shiny app dockerization and BlueBRIDGE hosting documentation.
  • Completed comparative testing of CMSY versions. #8503
  • Developing shiny interface for indicator estimation using ICES DATRAS and SAG databases.

ENG

  • SDMX related activities (#7358)
    • Pre-production tests and bug fixing of the SDMX Data Source service
    • Definition of an autonomous operation to export SDMX data structures and data in excel format (#8717)

#58 Updated by Julien Barde over 1 year ago

M30 February 2018 Effort report

IRD

  • Prepared participation in events organized by FAO in Rome on 19-22 March with tuna RFMOs and other fisheries bodies.

  • Tuna Atlas VRE:

    • Creation of the global datasets of fishing efforts (from the effort data coming from the 5 tuna RFMOs)
    • Improvements on the metadata (particularly description and titles) of the tuna atlas data
    • execution of R codes with AIS data: pre-processing added to clean points,
    • improvement of R codes to facilitate their reuse after the end of the project (detailed descriptions, creation of functions to make the code more readable); documentation / summary to facilitate reuse of the work later on
    • creation of R codes to enable comparisons between AIS and VMS data with plots (once processed with maps)
    • update (latest version) of the 'How_to_transforme_data_to_NetCDF_files.R' function used to create NetCDF files from Sardara data and from (raw) trajectory data (link: https://goo.gl/VdMWFX),
    • examples of outputs (NetCDF files) created by executing the 'How_to_transforme_data_to_NetCDF_files.R' function from SARDARA data inputs (link: https://goo.gl/ugc17V), FAD (link: https://goo.gl/eUuEvu), VMS (link: https://goo.gl/fuXHHG) and AIS (link: https://goo.gl/1ux9w1).
  • French Tropical Tuna Fisheries VRE:

    • Testing of the R scripts written by Chloé Dalleau for the extraction of the data from the French tropical tuna fisheries databases managed by IRD
    • Validation of the setup of the new French tropical tuna atlas VRE
  • IOTC SS3 VRE:

    • Added BFT to VRE & shiny
    • Updated GAJ on VRE & shiny
    • Updated Shiny app
    • Parallelized ss3 write codes (ss324z_grid_foreach.R)
    • Completed the SKJ grid (180 runs)
    • Cleaned and organized final SS3 VRE codes
  • RStudio Lab / Stock assessment VRE:

    • new archive to be deployed on ZENODO (file "FADsDRIFTERS_ARTICLE_PIO.zip", link:https://goo.gl/mvZZZv)
    • verification of scripts deployed on DataMiner and update of 'ENRICHMENT_FROM_NETCDF' (chunk of code related to time management, OSTIA SST added as new satellite product to be used for enrichment)
    • parallelization of 'ENRICHMENT_FROM_NETCDF' (ongoing): a problem was identified with the netcdf4 library, which does not work inside a parallelized iteration (since it relies on C); a new attempt uses RNetCDF (see the sketch below).
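
A sketch of one common workaround for the parallelization problem noted above (this is an assumption about a possible fix, not the IRD solution, which is being tried with RNetCDF): each worker opens and closes its own connection instead of sharing a single netcdf4 handle; the URL, the input 'points_df' and the enrichment step itself are placeholders:

    library(foreach)
    library(doParallel)

    registerDoParallel(cores = 4)
    chunks <- split(seq_len(nrow(points_df)), cut(seq_len(nrow(points_df)), 4))

    enriched <- foreach(idx = chunks, .combine = rbind, .packages = "ncdf4") %dopar% {
      nc  <- ncdf4::nc_open("https://opendap.example.org/sst.nc")   # opened inside the worker
      out <- points_df[idx, ]
      # ... look up the nearest value in space/time for each point (omitted) ...
      ncdf4::nc_close(nc)
      out
    }
    stopImplicitCluster()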

FAO

  • Tuna Atlas
    • Continued development with IRD of generic Tuna Atlas features

  • FIRMS
    • Continued work to strengthen the FIRMS partnership collaboration towards uptake of BlueBRIDGE services for the Tuna Atlas, RDB data collection and harmonization, and GRSF.
    • An additional VRE for CWP statistical reference data management and dissemination was validated

  • SDG 14.4.1
    • Continued development of VRE-based RShiny services
    • Continued liaison with FAO on development of the transversal eLearning course.

#59 Updated by Pasquale Pagano over 1 year ago

  • Status changed from In Progress to Closed
