Project Activity #5828

Project WP #665: WP7 - Supporting Blue Environment: VREs Development [Months: 7-30]

Project Task #666: T7.1 Aquaculture Atlas Generation VRE [Months: 7-30]

Project Activity #3189: 7.1.6 Vizualisation of data (EO + aquaculture features) with EODA portal

Evaluate feasibility of publishing a Spatialite layer in GeoServer with existing tools

Added by Emmanuel Blondel about 3 years ago. Updated over 2 years ago.

Status: Closed
Priority: Normal
Assignee: Manuel Goacolou
Sprint: WP07
Participants: 5 - FAO
Milestones:
Start date: Dec 02, 2016
Due date:
% Done: 100%
Duration:

Description

As discussed with CLS last week, I will evaluate how they can programmatically publish their Spatialite file to GeoServer using the existing tools. The outcome of this activity will include a feasibility study, possible steps to proceed, and recommendations.

Material:

  • GeoServer 2.1.2 (used in D4Science)
  • geoserver-manager (as blue common tool dependency or latest version)

aaps_publish_layers_spatialite.py (7.62 KB) Emmanuel Blondel, Feb 17, 2017 07:19 PM

aaps_publish_layers_spatialite2.py - Publication script based on GeoServer REST API only (4.39 KB) Emmanuel Blondel, Mar 17, 2017 07:26 PM


Subtasks

Task #6127: Deploy SG container with data transfer service to Geoserv... Closed Roberto Cirillo


Related issues

Related to BlueBRIDGE - Project Activity #5794: Publish new Geoserver workspace "aaps" in support of 'Aqu... Closed Nov 17, 2016
Related to BlueBRIDGE - Support #6065: Geoserver - current forced rewriting rule to https preven... Closed Nov 28, 2016
Related to BlueBRIDGE - Support #6851: Need further information to exploit Home Library REST API Closed Feb 02, 2017
Related to D4Science Infrastructure - Task #7100: Assistance in using HomeLibrary REST Interface Closed Feb 17, 2017
Related to BlueBRIDGE - Support #7155: How to add AquacultureAtlasGeneration scope with AAPS Geo... Rejected Feb 17, 2017
Related to BlueBRIDGE - Task #7169: Install AAPS production GeoServer with spatialite plugin ... Closed Feb 20, 2017
Related to BlueBRIDGE - Task #7538: Deploy patch (JAR) in AAPS Production Geoserver Closed Mar 15, 2017
Related to BlueBRIDGE - Task #7590: Create & publish ISO metadata for CLS outputs Closed Mar 20, 2017
Blocks BlueBRIDGE - Task #7579: Publish all available CLS outputs as WMS/WFS for use in A... Closed Mar 17, 2017

History

#1 Updated by Emmanuel Blondel about 3 years ago

  • % Done changed from 0 to 10
  • Status changed from New to In Progress

I've been looking for the GeoServer SpatiaLite plugin for GeoServer version 2.1.2. I didn't find any release repository offering this extension, and the nightly-builds repository for this version is apparently no longer accessible.

In addition, several issues have been reported with the SpatiaLite extension and the REST API (even for versions newer than 2.1.2, which I have to say is quite old):
http://osgeo-org.1560.x6.nabble.com/Upload-spatialite-to-GeoServer-using-REST-API-td5062528.html
https://sourceforge.net/p/geoserver/mailman/message/34904719/

As @fabio.sinibaldi@isti.cnr.it suggests here: https://support.d4science.org/issues/5794#note-5, I would opt for:
* setting up an updated, separate dev GeoServer instance (latest release 2.10.0)
* enabling the SpatiaLite plugin (available from the nightly builds: http://ares.boundlessgeo.com/geoserver/2.10.x/community-latest/geoserver-2.10-SNAPSHOT-spatialite-plugin.zip ) following the documentation: http://docs.geoserver.org/2.10.x/en/user/community/spatialite/index.html

The documentation also indicates that the SpatiaLite extension is supported by the REST API.

@fabio.sinibaldi@isti.cnr.it can you confirm CNR can support this new runtime resource?

@mgoacolou@cls.fr , @nlongepe@cls.fr , let me know if this is OK for you. We can then test REST API publication of a SpatiaLite file, but not with the blue common Java libraries. Can you clarify how you want to use the REST API? See http://docs.geoserver.org/2.10.x/en/user/rest/examples/index.html#rest-configuration-examples. It depends on how you want to use this (I suppose binding the publication to the data production flow) and on your workflow language preference.
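
For illustration, the kind of REST call being discussed could be scripted as follows; this is a minimal sketch using the Python requests library, with the workspace, datastore, credentials and file path as placeholders (the endpoint mirrors the curl example that appears later in this thread):

```
# Minimal sketch of publishing a local SpatiaLite file via the GeoServer REST API.
# Assumptions: GeoServer 2.10 with the SpatiaLite plugin, an existing "aaps"
# workspace, placeholder credentials; requests is an assumed dependency.
import requests

GEOSERVER_REST = "https://geoserver1-spatial-dev.d4science.org/geoserver/rest"
AUTH = ("admin", "geoserver")  # placeholder credentials

def publish_spatialite(db_path, workspace="aaps", datastore="test"):
    """PUT a local SpatiaLite file into a GeoServer datastore."""
    url = f"{GEOSERVER_REST}/workspaces/{workspace}/datastores/{datastore}/file.spatialite"
    with open(db_path, "rb") as f:
        resp = requests.put(url, data=f, auth=AUTH,
                            headers={"Content-type": "application/x-sqlite3"})
    resp.raise_for_status()  # 201 Created is the expected success code
    return resp.status_code

# Example: publish_spatialite("test.db")
```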

#2 Updated by Emmanuel Blondel about 3 years ago

  • Status changed from In Progress to Feedback

#3 Updated by Emmanuel Blondel about 3 years ago

  • Related to Project Activity #5794: Publish new Geoserver workspace "aaps" in support of 'AquacultureAtlasGeneration' VRE added

#4 Updated by Pasquale Pagano about 3 years ago

  • Assignee changed from Emmanuel Blondel to Fabio Sinibaldi

@fabio.sinibaldi@isti.cnr.it, there is a request to provide a dedicated GeoServer instance, v2.10, equipped with the SpatiaLite extension. Can you create the sub-tickets for the sysadmins to create the server, the certificate, ... and coordinate these activities?

#5 Updated by Fabio Sinibaldi about 3 years ago

I confirm we can provide the requested software, we just need to analyze/verify the required effort and steps needed to manage it.

I've just created the required main task ticket #5878 with subtasks to track initial activities.
I will add more subtasks as needed.

#6 Updated by Fabio Sinibaldi about 3 years ago

  • % Done changed from 10 to 40

The new GeoServer with the SpatiaLite plugin is available for testing in the development infrastructure, as reported in #5880.
Can you please test and validate this instance? After that we can deploy a similar one in the production infrastructure.

#7 Updated by Emmanuel Blondel about 3 years ago

Thanks @fabio.sinibaldi@isti.cnr.it and @andrea.dellamico@isti.cnr.it , much appreciated. I was able to access the GeoServer web administration page, and SpatiaLite is listed among the available vector datastores.

In order to test the SpatiaLite datastore & publication (manual and then automated), we need remote access to the GeoServer data dir where the SpatiaLite files will be stored. Could we have SFTP access to this remote dir? Thanks in advance.

#8 Updated by Andrea Dell'Amico about 3 years ago

Emmanuel Blondel wrote:

Thanks @fabio.sinibaldi@isti.cnr.it and @andrea.dellamico@isti.cnr.it , much appreciated. I was able to access the GeoServer web administration page, and SpatiaLite is listed among the available vector datastores.

In order to test the SpatiaLite datastore & publication (manual and then automated), we need remote access to the GeoServer data dir where the SpatiaLite files will be stored. Could we have SFTP access to this remote dir? Thanks in advance.

I just authorized your ssh key for the gcube user. You can write inside /srv/geoserver_data (I don't know if it's been already configured as the geoserver data directory).
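
As a side note, once the key is authorized, a manual test upload into that directory can be scripted; here is a minimal sketch using the paramiko library (an assumed dependency, not one of the project tools), with the host, user and target directory taken from this thread and the key path as a placeholder:

```
# Minimal sketch of an SFTP upload into the GeoServer data area, assuming
# key-based access for the gcube user; paramiko is an assumed dependency.
import os
import paramiko

HOST = "geoserver1-spatial-dev.d4science.org"
USER = "gcube"
KEY_PATH = os.path.expanduser("~/.ssh/id_rsa")   # placeholder: the authorized private key
REMOTE_DIR = "/srv/geoserver_data"               # writable directory mentioned above

def sftp_upload(local_file, remote_name):
    """Copy a local file into the GeoServer data area over SFTP."""
    key = paramiko.RSAKey.from_private_key_file(KEY_PATH)
    transport = paramiko.Transport((HOST, 22))
    transport.connect(username=USER, pkey=key)
    try:
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.put(local_file, f"{REMOTE_DIR}/{remote_name}")
        sftp.close()
    finally:
        transport.close()

# Example: sftp_upload("test.db", "test.db")
```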

#9 Updated by Emmanuel Blondel about 3 years ago

Thanks, I tried to connect through SFTP / SSH (Pageant), as for the other instances (dataminer, dev portal, etc.), but this one doesn't let me in and asks for a password.
The information I used is:

  • host: geoserver1-spatial-dev.d4science.org
  • port: 22
  • user: gcube

@andrea.dellamico@isti.cnr.it Could you please double-check that I have the SSH rights to access it, or let me know if something above is wrong?
Thanks in advance

#10 Updated by Emmanuel Blondel about 3 years ago

  • Priority changed from Normal to High

@andrea.dellamico@isti.cnr.it Could you give feedback on the above, check, and let me know if I'm doing something wrong when accessing this server dir? (BTW, I'm doing exactly the same on other servers and I can access them... possibly the SSH key you used is not the right one?)

#11 Updated by Andrea Dell'Amico about 3 years ago

Emmanuel Blondel wrote:

@andrea.dellamico@isti.cnr.it Could you give feedback on the above, check, and let me know if I'm doing something wrong when accessing this server dir? (BTW, I'm doing exactly the same on other servers and I can access them... possibly the SSH key you used is not the right one?)

Sorry, I missed your previous comment. On which hosts can I find a valid SSH key of yours? The one you put in your profile is incomplete; I tried to fix it, but not in the right way, it seems.

#12 Updated by Emmanuel Blondel about 3 years ago

For example, I can successfully access dataminer1-proto.d4science.org

#13 Updated by Andrea Dell'Amico about 3 years ago

OK, the SSH key that I got from your portal profile is completely different from the one that I found on dataminer1-proto. I fixed geoserver1-spatial-dev.d4science.org, can you try again?

#14 Updated by Emmanuel Blondel about 3 years ago

Thanks, I can now access it.

#15 Updated by Emmanuel Blondel about 3 years ago

  • Related to Support #6065: Geoserver - current forced rewriting rule to https prevents from using GeoServer layer preview added

#16 Updated by Emmanuel Blondel about 3 years ago

  • % Done changed from 40 to 60
  • Assignee changed from Fabio Sinibaldi to Emmanuel Blondel

#17 Updated by Emmanuel Blondel about 3 years ago

@andrea.dellamico@isti.cnr.it I'm facing an issue with uploading the data to the server using curl:

curl -v -u admin:geoserver -XPUT -H "Content-type: application/x-sqlite3" --data-binary @test.db  "http://geoserver1-spatial-dev.d4science.org/geoserver/rest/workspaces/aaps/datastores/test/file.spatialite"

I receive this output:

*   Trying 146.48.123.19...
* Connected to geoserver1-spatial-dev.d4science.org (146.48.123.19) port 80 (#0)

* Server auth using Basic with user 'admin'
> PUT /geoserver/rest/workspaces/aaps/datastores/test/file.spatialite HTTP/1.1
> Authorization: Basic YWRtaW46Z2Vvc2VydmVy
> User-Agent: curl/7.40.0
> Host: geoserver1-spatial-dev.d4science.org
> Accept: */*
> Content-type: spatialite;Expect: 100-Continue;
> Content-Length: 10203136
> Expect: 100-continue
>
* Done waiting for 100-continue
< HTTP/1.1 413 Request Entity Too Large
< Server: nginx
< Date: Mon, 28 Nov 2016 21:35:29 GMT
< Content-Type: text/html
< Content-Length: 192
< Connection: close
<
<html>
<head><title>413 Request Entity Too Large</title></head>
<body bgcolor="white">
<center><h1>413 Request Entity Too Large</h1></center>
<hr><center>nginx</center>
</body>
</html>
* Closing connection 0

Error 413 Request Entity Too Large.

Could it be due to a size limitation on the server? Could we set a value for client_max_body_size in nginx.conf? FYI, the file I'm trying to upload is a 10 MB db file. If needed, I also have the possibility to zip the data and upload it as application/zip; I've tried, but it seems the data is still too large.

Thanks in advance for your support
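
For reference, the zip-and-upload alternative mentioned above could look like the sketch below; it assumes the GeoServer endpoint accepts application/zip for SpatiaLite data, which this thread does not confirm (the attempt reportedly still hit the size limit):

```
# Sketch of the zip-and-upload alternative mentioned above. Assumption: the
# GeoServer endpoint accepts application/zip for SpatiaLite data, which this
# thread does not confirm; the attempt reportedly still hit the nginx size limit.
import io
import zipfile
import requests

GEOSERVER_REST = "https://geoserver1-spatial-dev.d4science.org/geoserver/rest"
AUTH = ("admin", "geoserver")  # placeholder credentials

def upload_zipped(db_path, workspace="aaps", datastore="test"):
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(db_path, arcname="test.db")  # compress the SpatiaLite DB in memory
    url = f"{GEOSERVER_REST}/workspaces/{workspace}/datastores/{datastore}/file.spatialite"
    resp = requests.put(url, data=buf.getvalue(), auth=AUTH,
                        headers={"Content-type": "application/zip"})
    return resp.status_code

# Example: print(upload_zipped("test.db"))
```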

#18 Updated by Andrea Dell'Amico about 3 years ago

The client_max_body_size was set, but only in the https section. There's a redirect, but the body size is evaluated before hitting the redirection rule. I've set the property in the http section too.

#19 Updated by Emmanuel Blondel about 3 years ago

It still doesn't work. The file is ~9.9 MB; when I test the upload, the uploaded file is ~0.7 MB.

I've checked /etc/nginx/nginx.conf but it is unchanged. The sources of information I've checked seem to indicate that it's not enough to add it to /etc/nginx/sites-available/geoserver1-spatial-dev.d4science.org, and that attention is needed when SSL is enabled (our case). See for example what is suggested at http://stackoverflow.com/a/35794955

I've uploaded the test.db file to the workspace (you can download it at https://goo.gl/H4lDjo); I would be grateful if you could test this curl request:

curl -v -u admin:geoserver -XPUT -H "Content-type: application/x-sqlite3" --data-binary @test.db  "https://geoserver1-spatial-dev.d4science.org/geoserver/rest/workspaces/aaps/datastores/test/file.spatialite"

Thanks in advance

#20 Updated by Emmanuel Blondel about 3 years ago

  • Assignee changed from Emmanuel Blondel to Andrea Dell'Amico

#21 Updated by Emmanuel Blondel about 3 years ago

Hi @andrea.dellamico@isti.cnr.it , let me know when you can work on this, since this server misconfiguration is currently blocking the task. I'm waiting for this to be solved in order to deliver a Python script to CLS that performs the automated publication. Thanks a lot in advance for your assistance.

#22 Updated by Pasquale Pagano about 3 years ago

@fabio.sinibaldi@isti.cnr.it, can the data transfer be used instead of curl?

#23 Updated by Andrea Dell'Amico about 3 years ago

Emmanuel Blondel wrote:

curl -v -u admin:geoserver -XPUT -H "Content-type: application/x-sqlite3" --data-binary @test.db "https://geoserver1-spatial-dev.d4science.org/geoserver/rest/workspaces/aaps/datastores/test/file.spatialite"

But nginx does not fail with errors anymore. I'm going to try those stackoverflow suggestions, but they seem like a shot in the dark. The same configuration running on that geoserver works on all the other infrastructure services, btw.

#24 Updated by Andrea Dell'Amico about 3 years ago

Andrea Dell'Amico wrote:

Emmanuel Blondel wrote:

curl -v -u admin:geoserver -XPUT -H "Content-type: application/x-sqlite3" --data-binary @test.db "https://geoserver1-spatial-dev.d4science.org/geoserver/rest/workspaces/aaps/datastores/test/file.spatialite"

But nginx does not fail with errors anymore. I'm going to try those stackoverflow suggestions, but they seem like a shot in the dark. The same configuration running on that geoserver works on all the other infrastructure services, btw.

And nothing changed. I'm going to increase the error log level to see if we are missing something.

#25 Updated by Fabio Sinibaldi about 3 years ago

Pasquale Pagano wrote:

@fabio.sinibaldi@isti.cnr.it, can the data transfer be used instead of curl?

Theoretically yes, but to invoke the service you should either use the Java client library or make direct HTTP calls with curl or similar, I think.

#26 Updated by Andrea Dell'Amico about 3 years ago

Andrea Dell'Amico wrote:

Andrea Dell'Amico wrote:

Emmanuel Blondel wrote:

curl -v -u admin:geoserver -XPUT -H "Content-type: application/x-sqlite3" --data-binary @test.db "https://geoserver1-spatial-dev.d4science.org/geoserver/rest/workspaces/aaps/datastores/test/file.spatialite"

But nginx does not fail with errors anymore. I'm going to try those stackoverflow suggestions, but they seem like a shot in the dark. The same configuration running on that geoserver works on all the other infrastructure services, btw.

And nothing changed. I'm going to increase the error log level to see if we are missing something.

Nope, there's nothing on the nginx side that breaks the file.
Is there any configuration limit in the GeoServer app or in that plugin? Tomcat doesn't put any limit on POST uploads.

I also see that if I repeat the upload from curl, the destination file isn't overwritten.

#27 Updated by Emmanuel Blondel about 3 years ago

Thanks for your tests

My comments:

  • there is no (at least no known and documented) configuration limit for PUTting files through the GeoServer REST API. Indeed, I've tested with shapefiles and it works (201 response), while with SpatiaLite I get a 202. I'm going to investigate further.
  • about your latest comment: that is the normal behavior, the file is not overwritten by default, see http://docs.geoserver.org/stable/en/user/rest/api/datastores.html#update
  • @fabio.sinibaldi@isti.cnr.it Java is a priori not envisaged here. Could you give me an indication of the command / HTTP call to use data-transfer with curl? I will test it as an alternative. Thanks in advance.

#28 Updated by Emmanuel Blondel about 3 years ago

BTW, @andrea.dellamico@isti.cnr.it I still see that client_max_body_size 1000M; has not been set in /etc/nginx/nginx.conf. Is that normal?

#29 Updated by Andrea Dell'Amico about 3 years ago

Emmanuel Blondel wrote:

BTW, @andrea.dellamico@isti.cnr.it I still see that client_max_body_size 1000M; has not been set in /etc/nginx/nginx.conf. Is that normal?

I put it in there and then removed it. It made no difference, as expected.

#30 Updated by Emmanuel Blondel about 3 years ago

OK, thanks. A bug reported in GeoServer for H2 could be the same for SpatiaLite (see https://osgeo-org.atlassian.net/browse/GEOS-5869). I'm liaising with the GeoServer developers to understand this.

In the meantime, I will need to test an alternative to push the SpatiaLite file there; I'm waiting for Fabio to give me some indications on how to use the data-transfer service through curl.

#31 Updated by Emmanuel Blondel about 3 years ago

https://geoserver1-spatial-dev.d4science.org/geoserver/ does not respond anymore

#32 Updated by Andrea Dell'Amico about 3 years ago

Emmanuel Blondel wrote:

https://geoserver1-spatial-dev.d4science.org/geoserver/ does not respond anymore

? It's working, I just tested from outside our network.

#33 Updated by Emmanuel Blondel about 3 years ago

The services work, but not the GeoServer web admin https://geoserver1-spatial-dev.d4science.org/geoserver/web. Can you access this and log in?
Here it's redirected to localhost:9000/geoserver ....

#34 Updated by Andrea Dell'Amico about 3 years ago

Emmanuel Blondel wrote:

The services work, but not the GeoServer web admin https://geoserver1-spatial-dev.d4science.org/geoserver/web. Can you access this and log in?
Here it's redirected to localhost:9000/geoserver ....

Ah OK, it was a piece of configuration left over from one of the nginx tests. It works now.

#35 Updated by Emmanuel Blondel about 3 years ago

  • Assignee changed from Andrea Dell'Amico to Fabio Sinibaldi

@fabio.sinibaldi@isti.cnr.it Can you give more details about what you said in https://support.d4science.org/issues/5828#note-25? Does the data-transfer facility come with a REST web service to upload data? And would it be applicable here?

In the meantime, I've contacted the GeoServer developers to see if a patch can be applied.

#36 Updated by Fabio Sinibaldi about 3 years ago

Hi Emmanuel,
here are some curl calls to Data Transfer Service. Its current implementation wasn't designed to be used with direct calls, so it's a bit verbose.
You can instruct the service to download from a URL source to a specific subfolder of the persistence directory, and monitor its progress. You'll find more information (even if java-oriented) at
https://wiki.gcube-system.org/gcube/How_to_use_Data_Transfer_2

To submit a download request you can use the following curl example, but you need to declare/change:
* a public url in the <source> tag, from which the service will download the resource
* the remote <subFolder> under which the service should store the resource
* the <destinationFileName> of the downloaded resource under that folder
* optionally the clash policies (or use the ones in this example) in the <onExistingFileName> and <onExistingSubFolder> tags
* your gcube token as the HTTP header "gcube-token"

curl -X POST -d '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<transferRequest><id></id><httpDownloadSettings><source>http://goo.gl/oLP7zG</source><options><range><min>80</min><max>80</max></range></options></httpDownloadSettings><destinationSettings><persistenceId>DEFAULT</persistenceId><subFolder>bla/bla/bllaaa</subFolder><destinationFileName>outputFile</destinationFileName><createSubfolders>true</createSubfolders><onExistingFileName>ADD_SUFFIX</onExistingFileName><onExistingSubFolder>APPEND</onExistingSubFolder></destinationSettings><pluginInvocations/></transferRequest>' --header "gcube-token: XXXXXX" --header "Accept: application/xml" --header "Content-Type: application/xml" http://geoserver1-spatial-dev.d4science.org/data-transfer-service/gcube/service/Requests

You'll receive a response like the following, from which you need to get the ticket id (under TransferTicket/id/text()) in order to monitor the download progress.
In the TransferTicket you'll find useful information:
* <status> : the status of the download
* <destinationFileName> : the final absolute path of the destination file

<?xml version="1.0" encoding="UTF-8" standalone="yes"?><transferTicket><id>225adbf0-e967-41ea-bb7e-96b4901cd12d</id><httpDownloadSettings><source>http://goo.gl/oLP7zG</source><options><range><min>80</min><max>80</max></range></options></httpDownloadSettings><destinationSettings><persistenceId>DEFAULT</persistenceId><subFolder>bla/bla/bllaaa</subFolder><destinationFileName>outputFile</destinationFileName><createSubfolders>true</createSubfolders><onExistingFileName>ADD_SUFFIX</onExistingFileName><onExistingSubFolder>APPEND</onExistingSubFolder></destinationSettings><pluginInvocations/><status>TRANSFERRING</status><transferredBytes>0</transferredBytes><percent>0.0</percent><averageTransferSpeed>0</averageTransferSpeed><submissionTime><value>2016-12-02T12:29:08.980+01:00</value></submissionTime><destinationFileName>/home/gcube/SmartGears/state/DTService/bla/bla/bllaaa/outputFile(1)</destinationFileName><message>Opening connection</message></transferTicket>

To monitor the download progress, use the following example to get an updated TransferTicket, changing:
* the gcube-token header
* the transfer ticket id at the end of the called URI

curl --header "gcube-token: XXXX" --header "Accept: application/xml" http://geoserver1-spatial-dev.d4science.org/data-transfer-service/gcube/service/TransferStatus/e5c22773-4b44-48d3-ad33-74e892e835a4

However, I don't think I have access to the machine, but the service does not seem to be deployed there.
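
For convenience, the two calls above can also be scripted in Python with the requests library; this is a sketch based only on the curl examples in this comment, with the token, source URL, subfolder and file name as placeholders:

```
# Sketch of the two Data Transfer Service calls shown above, using requests.
# Endpoint paths and XML payload are taken from the curl examples (including the
# <options> block, copied verbatim); token, source URL, subfolder and file name
# are placeholders to adapt.
import requests
import xml.etree.ElementTree as ET

BASE = "http://geoserver1-spatial-dev.d4science.org/data-transfer-service/gcube/service"
GCUBE_TOKEN = "XXXXXX"  # placeholder gcube token

REQUEST_TEMPLATE = (
    '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>'
    '<transferRequest><id></id>'
    '<httpDownloadSettings><source>{source}</source>'
    '<options><range><min>80</min><max>80</max></range></options>'
    '</httpDownloadSettings>'
    '<destinationSettings><persistenceId>DEFAULT</persistenceId>'
    '<subFolder>{subfolder}</subFolder>'
    '<destinationFileName>{filename}</destinationFileName>'
    '<createSubfolders>true</createSubfolders>'
    '<onExistingFileName>ADD_SUFFIX</onExistingFileName>'
    '<onExistingSubFolder>APPEND</onExistingSubFolder></destinationSettings>'
    '<pluginInvocations/></transferRequest>'
)

def submit_transfer(source_url, subfolder, filename):
    """POST a transfer request and return the ticket id to monitor."""
    body = REQUEST_TEMPLATE.format(source=source_url, subfolder=subfolder, filename=filename)
    resp = requests.post(f"{BASE}/Requests", data=body,
                         headers={"gcube-token": GCUBE_TOKEN,
                                  "Accept": "application/xml",
                                  "Content-Type": "application/xml"})
    resp.raise_for_status()
    return ET.fromstring(resp.content).findtext("id")

def transfer_status(ticket_id):
    """GET the updated TransferTicket and return (status, destination file name)."""
    resp = requests.get(f"{BASE}/TransferStatus/{ticket_id}",
                        headers={"gcube-token": GCUBE_TOKEN, "Accept": "application/xml"})
    resp.raise_for_status()
    ticket = ET.fromstring(resp.content)
    return ticket.findtext("status"), ticket.findtext("destinationFileName")
```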

#37 Updated by Emmanuel Blondel about 3 years ago

@fabio.sinibaldi@isti.cnr.it I could perform some requests successfully using curl. However, 2 points are not clear:

  • data is always uploaded under the /home/gcube/SmartGears/state/data-transfer-service/ base path. I also had a look at the Capabilities request, and this is what I get:
  <availablePersistenceIds>
        <availablePersistenceIds>/data-transfer-service</availablePersistenceIds>
        <availablePersistenceIds>/whn-manager</availablePersistenceIds>
    </availablePersistenceIds>

My question: how can I specify an absolute path on the server in the XML data-transfer request body? In our case: /srv/geoserver_spatialite/data

Thanks in advance for your feedback

#38 Updated by Fabio Sinibaldi about 3 years ago

@emmanuel.blondel@fao.org Accessing/writing absolute paths is beyond the current data-transfer capabilities. It has been designed this way for security reasons. We could discuss whether we want to change this restriction; however, one solution might be to create a symbolic link from under the SmartGears persistence folder to the desired destination.
The only issue with this approach might be that SmartGears dynamically creates its persistence folder ("state") if it is not present. If we manually create this symbolic link, then it has to be re-created every time the node state is cleared.

One solution to this might be to add some configuration options to SmartGears, in order to make it create the desired symbolic links along with the persistence folders of deployed services. This would also let SmartGears handle these links with persistence IDs, so they can be managed just like other persistence locations (i.e. by ID). This might bind SmartGears' code to the OS, so I kindly ask @lucio.lelii@isti.cnr.it to give us feedback on this approach.

#39 Updated by Fabio Sinibaldi about 3 years ago

@emmanuel.blondel@fao.org Could you please provide the source link you used for the transfer request in order for us to investigate on the "truncated file" issue?

#40 Updated by Andrea Dell'Amico about 3 years ago

Fabio Sinibaldi wrote:

@emmanuel.blondel@fao.org Accessing/writing absolute paths is beyond the current data-transfer capabilities. It has been designed this way for security reasons. We could discuss whether we want to change this restriction; however, one solution might be to create a symbolic link from under the SmartGears persistence folder to the desired destination.
The only issue with this approach might be that SmartGears dynamically creates its persistence folder ("state") if it is not present. If we manually create this symbolic link, then it has to be re-created every time the node state is cleared.

One solution to this might be to add some configuration options to SmartGears, in order to make it create the desired symbolic links along with the persistence folders of deployed services. This would also let SmartGears handle these links with persistence IDs, so they can be managed just like other persistence locations (i.e. by ID). This might bind SmartGears' code to the OS, so I kindly ask @lucio.lelii@isti.cnr.it to give us feedback on this approach.

Well, the ability to specify an external directory at configuration time would be more than welcome. The state directory is a very unfortunate place for this, btw.

#41 Updated by Emmanuel Blondel about 3 years ago

Thanks to both of you for your feedback.

@fabio.sinibaldi@isti.cnr.it see this example that can be used to test the truncation issue: https://goo.gl/H4lDjo

I confirm that we would need the capacity to transfer to an absolute path. Please let me know if this would be feasible, and within which timeframe.

For CLS (@nlongepe@cls.fr @mgoacolou@cls.fr) & Anton (@anton.ellenbroek@fao.org ), quick status report:

  • no news from the GeoServer dev community about the GeoServer REST API bug I pointed out (the upload reaches the target dir, but the file is truncated)
  • ongoing tests with the gCube data-transfer (so far the only alternative identified to upload a SpatiaLite file)
  • 2 limitations of the latter: (1) truncation of the file (similar to the GeoServer REST API), currently being inspected by CNR; (2) no capacity to specify the absolute path of the target dir

I hope that one approach (GeoServer REST API) or the other (data-transfer) will be unlocked soon, most probably the data-transfer one. @fabio.sinibaldi@isti.cnr.it and @andrea.dellamico@isti.cnr.it let us know when the 2 above-mentioned limitations can be tackled.

Thanks in advance

#42 Updated by Fabio Sinibaldi about 3 years ago

@emmanuel.blondel@fao.org It seems that the problem is the shortened url. Using https://goo.gl/H4lDjo , the saved content is :

<HTML>
<HEAD>
<TITLE>Moved Permanently</TITLE>
</HEAD>
<BODY BGCOLOR="#FFFFFF" TEXT="#000000">
<H1>Moved Permanently</H1>
The document has moved <A HREF="http://data.d4science.org/NmR3UXd0dHF4RDRtN2ovZkNaVkVMSFR4dWM2THJrdjZHbWJQNStIS0N6Yz0">here</A>.
</BODY>
</HTML>
^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^$

If you use the full uri http://data.d4science.org/NmR3UXd0dHF4RDRtN2ovZkNaVkVMSFR4dWM2THJrdjZHbWJQNStIS0N6Yz0 the file seems to be correctly transferred. I'm going to look for a solution to this; in the meantime you can use non-shortened URLs.

PS: I have a feeling that the same issue MIGHT be the cause of the GeoServer "truncated file" problem.

#43 Updated by Pasquale Pagano about 3 years ago

We already had an issue with the Google APIs in the past. Google changed the way its resolver resolves shortened URLs, and clearly they made the change without any notification. If I remember correctly, @francesco.mangiacrapa@isti.cnr.it identified this issue well in the past.
I suggest using the infrastructure URIs as they are, without shortened URIs, as Fabio is suggesting.

#44 Updated by Fabio Sinibaldi about 3 years ago

After further analysis, it seems that the problem is that shortened URLs are now https, and this doesn't allow Java connections to follow the redirect. In fact, you can change https://goo.gl/H4lDjo to http://goo.gl/H4lDjo to have data-transfer correctly handle the file.

I don't know when Google migrated to https for shortened URLs, nor do I know whether this manual fix (https -> http) will keep working in the future, so I continue to suggest using infrastructure URIs.
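
A possible client-side workaround, sketched here under the assumption that the caller can reach the shortened URL itself, is to resolve it to its final infrastructure URI before submitting it to the data-transfer service:

```
# Sketch: resolve a (possibly shortened) URL client-side so that the
# data-transfer service never has to follow an https redirect itself.
# requests is an assumed dependency.
import requests

def resolve_url(url):
    """Follow redirects and return the final URL."""
    resp = requests.get(url, allow_redirects=True, stream=True)
    try:
        resp.raise_for_status()
        return resp.url
    finally:
        resp.close()

# Example: resolve_url("https://goo.gl/H4lDjo") should return the full
# http://data.d4science.org/... URI, which can then be used as the <source>.
```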

#45 Updated by Emmanuel Blondel about 3 years ago

I couldn't test because of a problem with the token. The service now indicates my token is invalid (I used the same one as before, i.e. the token from the NextNext test VRE). I will try again early this week.

Can you please indicate if/when we can expect the data-transfer service to support absolute paths? If not, we will not be able to use data-transfer for this use case either, and I'm still waiting for the GeoServer team's feedback about the REST API limitation for SpatiaLite (the best outcome would still be a patch for the latter).

#46 Updated by Fabio Sinibaldi almost 3 years ago

Hi Emmanuel,
first of all sorry for the delay.

We checked the installation and configuration of the authentication layer, and the issue with your token should be fixed now. Could you try again?

About absolute paths: passing them to data-transfer ties the caller too much to the installation configuration of the node, so we are designing a solution that will instead use something like:
persistence-id = "GeoserverSpatialite"
[OPTIONAL] subPath = some/sub/directory/if/needed

In this way the caller won't need to know the specific absolute path on that particular node, which might change depending on configuration and technology.
I'll get back to you on this as soon as we have something in place for you to test.

#47 Updated by Fabio Sinibaldi almost 3 years ago

Hi Emmanuel,
as you can see from the Capabilities report, the persistence location "/geoserver" is now available and it's configured to correspond to "/srv/geoserver_spatialite/data". Let us know if this solution works for you.

#48 Updated by Emmanuel Blondel almost 3 years ago

Great, thanks Fabio, I will test as soon as I can and report here in case of issues.

#49 Updated by Emmanuel Blondel almost 3 years ago

Fabio, can you clarify whether the data-transfer allows uploading a local file (as the GeoServer REST API does), or whether it is limited to resources already published on the web? Thanks in advance.

#50 Updated by Emmanuel Blondel almost 3 years ago

Fabio, I've tested the new Data Transfer. Although we can now push a file into the target GeoServer data dir, the file is still truncated. This, together with the above limitation (if you confirm we can't use it with a local file), is blocking. Looking forward to your clarifications about using a local file, and possibly to seeing why the file is being truncated in the same way as with the GeoServer REST API.

#51 Updated by Fabio Sinibaldi almost 3 years ago

Hi Emmanuel, here's my feedback:
- transfer of local files: it is supported by the Java library data-transfer-library, but not by direct HTTP calls to the data-transfer service.
- truncated size: I thought we had already solved this issue. As I reported before the Christmas break, there's an issue when dealing with shortened URLs (and it's most probably the same issue you encountered in GeoServer). Are you using shortened or full URIs?

#52 Updated by Emmanuel Blondel almost 3 years ago

Apologies Fabio, I forgot the trick about long URLs. Indeed it works when I don't use the shortened URLs; the upload is OK. About the transfer of local files, I think that's the blocking point, as the piece of automated publication flow I'm writing is supposed to be appended to the production one, hence the SpatiaLite file will be a local file. Is that something you could easily support (similar to what the GeoServer REST API provides, successfully for shapefiles, unfortunately not for SpatiaLite)?

Again, thanks for your patience and support. As of now, the data-transfer appears to be the only solution that could allow us to programmatically publish SpatiaLite DB files (I've re-contacted the GeoServer guys, no answer...). Let me know your thoughts about the transfer of local files.

I take the opportunity to apologise to CLS for the delay, as I know they wished to have this up and running in December.

#53 Updated by Emmanuel Blondel almost 3 years ago

  • File aaps_publish_layers.py added

Dear all, here is a report of my thoughts. Please @fabio.sinibaldi@isti.cnr.it @nlongepe@cls.fr comment or ask your questions. We can soon have a Skype call to discuss the way to move forward with the AAPS WMS layers, with or without SpatiaLite (at least temporarily):

Although in principle I agree with CLS that SpatiaLite is a better choice than ESRI shapefile (which has limitations, including those inherited from the DBF format), the investigation done indicates very poor support of SpatiaLite in GeoServer (despite the documentation indicating it is supported).

  • The GeoServer REST API doesn't allow uploading a SpatiaLite file; there is a bug. In spite of my emails to GeoServer team members (even expressing willingness to spend time on my side to develop a patch, and requesting indications, because they are aware of this bug), I didn't receive any feedback. So it seems that SpatiaLite is definitely not a priority for them.
  • I've tried to investigate further and test with a local GeoServer instance, but I found other issues (e.g. GeoServer doesn't work on 64-bit Windows machines)

The path was then to substitute the GeoServer REST API upload operation with the Data Transfer. I would say it is still the feasible alternative, but there are still missing pieces in order to have a fully working SpatiaLite WMS layer publication flow for AAPS:

  • So far I've tested the Data-Transfer call based on a SpatiaLite file available on the workspace (URL). It's clear that the requirement is to upload it after its production, so we would need the capacity to upload it from a LOCAL file (as with the GeoServer REST API when it works, e.g. with shapefiles), and not an already web-available resource.
  • As stated by Fabio, long URLs have to be used instead of shortened URLs, otherwise the uploaded SpatiaLite file is truncated.
  • The bad news is that, although not truncated with long URLs, the file seems corrupted (when: download from the workspace? upload to GeoServer?), so GeoServer is not able to find any SpatiaLite tables and it's impossible to configure the layer. @fabio.sinibaldi@isti.cnr.it any idea?

The message that Geoserver reports is:

Unable to obtain connection: [SQLITE_CORRUPT] The database disk image is malformed (database disk image is malformed)

As the SpatiaLite production script was in Python, I've drafted a Python script (see attached). For now the featureType publication is not working (because of the above file corruption); once that is solved, the script has to be extended with the final layer creation (to publish the featureType). Note that part of the script relies on the Python gsconfig module https://github.com/boundlessgeo/gsconfig/ but, because it is very limited, the script uses it when possible and otherwise falls back to ad hoc code (a rough illustration of this mixed approach is sketched after this comment).

Generally speaking, if you want to proceed with programmatic publication of resources in GeoServer, you will have much better support for the GeoServer REST API interface in Java and R.

@nlongepe@cls.fr Let me know when you would be available to discuss this activity live.

Looking forward to your feedback,
Emmanuel
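
To illustrate the mixed approach mentioned above (gsconfig where it suffices, ad hoc REST calls otherwise), here is a minimal sketch; it is not the attached script, and the namespace URI, datastore names and credentials are placeholders:

```
# Illustrative sketch of the "gsconfig when possible, ad hoc REST otherwise" approach.
# Not the attached aaps_publish_layers.py; names, namespace URI and credentials are
# placeholders.
import requests
from geoserver.catalog import Catalog  # provided by the gsconfig package

REST_URL = "https://geoserver1-spatial-dev.d4science.org/geoserver/rest"
AUTH = ("admin", "geoserver")  # placeholder credentials

cat = Catalog(REST_URL, AUTH[0], AUTH[1])

# 1. Workspace handling is covered by gsconfig.
ws = cat.get_workspace("aaps")
if ws is None:
    ws = cat.create_workspace("aaps", "http://www.fao.org/aaps")  # placeholder namespace URI

# 2. SpatiaLite file upload is not covered by gsconfig, so fall back to a raw REST call
#    (this is the call the thread reports as currently broken upstream).
def upload_spatialite(db_path, datastore):
    url = f"{REST_URL}/workspaces/aaps/datastores/{datastore}/file.spatialite"
    with open(db_path, "rb") as f:
        resp = requests.put(url, data=f, auth=AUTH,
                            headers={"Content-type": "application/x-sqlite3"})
    return resp.status_code

# Example: upload_spatialite("test.db", "test")
```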

#54 Updated by Fabio Sinibaldi almost 3 years ago

Hi Emmanuel, thanks a lot for your feedback.
You're right: there was a bug in the data-transfer-service. I fixed the bug and tested the file integrity using the sha1sum command, and everything's OK now.
The fix is already deployed on the node, so you should be able to test it.

About the local file upload, we need to work on it, but I think we could implement it.

#55 Updated by Emmanuel Blondel almost 3 years ago

  • File deleted (aaps_publish_layers.py)

#56 Updated by Emmanuel Blondel almost 3 years ago

  • File aaps_publish_layers.py added

Hi Fabio, great, I confirm it works now. I've updated the Python script. Note that I performed a basic publication (basic layer, default style). For advanced publication (custom style, etc.), we need to extend the script.
@fabio.sinibaldi@isti.cnr.it how/when would it be possible to extend the Data-Transfer to support local files?

@nlongepe@cls.fr @mgoacolou@cls.fr See these GeoServer outputs after running the script:
- WMS (here default map preview, to let you see that WMS GetFeatureInfo is also operational): https://geoserver1-spatial-dev.d4science.org/geoserver/aaps/wms?service=WMS&version=1.1.0&request=GetMap&layers=aaps:mylayer&styles=&bbox=19.903981,35.479604,27.775127,40.554855&width=768&height=495&srs=EPSG:4326&format=application/openlayers
- WFS: https://geoserver1-spatial-dev.d4science.org/geoserver/aaps/ows?service=WFS&version=1.0.0&request=GetFeature&typeName=aaps:mylayer

Let me know when we can discuss the workflow in general, and whether the use of Python is a mandatory prerequisite. If not, I would suggest switching to R, where support for the GeoServer REST API is cleaner.

Looking forward to your feedback,

#57 Updated by Nicolas Longépé almost 3 years ago

Thank you very much for this nice progress !

@emmanuel.blondel@fao.org and @mgoacolou@cls.fr : would you be available for a phone call on Wednesday 8 February? This week will be very complicated for us ....

Thanks

#58 Updated by Emmanuel Blondel almost 3 years ago

Yes, I'm available; send me a call invitation for your preferred time slot.

#59 Updated by Fabio Sinibaldi almost 3 years ago

Hi @emmanuel.blondel@fao.org, we need to plan the required tasks for the development of this feature. However, we'll try to release it in the upcoming gCube 4.3.0.

In the meantime, I think you could proceed with your use case by adding an extra step: upload your local file to the HomeLibrary via its REST interface. Then you can call data transfer with the public URL obtained from HL.
You can find the related documentation here :

#60 Updated by Emmanuel Blondel almost 3 years ago

Thanks Fabio for this suggestion. I was not aware of the Home Library REST API; I'm going to have a look and report here, with updates to the Python script if it works.

PS: In the meantime, please note that I'm still trying to prod the GeoServer community into finding a patch for the GeoServer REST API SpatiaLite upload facility (an issue which is not specific to SpatiaLite, but common to all formats other than zipped ESRI Shapefile), although we cannot expect something in the short term.

#61 Updated by Emmanuel Blondel almost 3 years ago

  • Related to Support #6851: Need further information to exploit Home Library REST API added

#62 Updated by Emmanuel Blondel almost 3 years ago

Fabio, I'm going to do a series of tests with the HL REST interface. In the first set of tests I did, it was not working. I will report potential issues in separate tickets.

In the meantime, could you let us know when gCube 4.3.0 is planned (date/month), together with its support for local file transfer via the REST interface? Thanks in advance.

#63 Updated by Fabio Sinibaldi almost 3 years ago

Emmanuel,
since your use case is already supported by the combined use of HL + Data Transfer, at this point I seriously doubt we're going to release a local upload for the Data Transfer REST interface in 4.3.0.
The development phase of 4.3.0 is ending soon and we need to perform tests and validation before upgrading the production environment.

#64 Updated by Emmanuel Blondel almost 3 years ago

Well, this is a pity, unless the HL REST interface turns out to work (for now I've tested the different methods without success, following the available doc..). So far the use case is not supported by this combination; I will only be able to say that it is once the HL REST interface actually works. That would mean (1) uploading the data somewhere on the web, then (2) moving it to another place, and then (3) deleting the first uploaded data. 3 steps instead of 1. Why should data transfer be bound to data already on the web??

#65 Updated by Fabio Sinibaldi almost 3 years ago

Saying that the current implementation of HL has some issues is one thing; saying that the library doesn't support the use case is another. I think we are in the first situation here. So please report your issues with HL and we'll work on them.

About the data transfer service, what I'm saying is not that we won't cover this use case (local upload via REST), but that it was not previously planned. So we will implement this feature, but since it's not blocking (because you can work without it), it does not have maximum priority and thus probably won't be ready for release 4.3.
Please note that I'm saying "probably" because, as I previously stated, we will try to deliver it in time for 4.3.

#66 Updated by Emmanuel Blondel almost 3 years ago

  • Related to Task #7100: Assistance in using HomeLibrary REST Interface added

#67 Updated by Emmanuel Blondel almost 3 years ago

  • Assignee changed from Fabio Sinibaldi to Emmanuel Blondel
  • Status changed from Feedback to In Progress

#68 Updated by Emmanuel Blondel almost 3 years ago

  • Status changed from In Progress to Paused

Waiting for hints on the HomeLibrary REST interface or news from the GeoServer dev team. Meanwhile, a Python script will be provided for shapefile publication (as requested today by CLS).

#69 Updated by Emmanuel Blondel almost 3 years ago

  • Status changed from Paused to In Progress

#70 Updated by Emmanuel Blondel almost 3 years ago

  • Related to Support #7155: How to add AquacultureAtlasGeneration scope with AAPS Geoserver added

#71 Updated by Emmanuel Blondel almost 3 years ago

  • Status changed from In Progress to Resolved

Dear Nicolas & Manuel, I've finally managed to get a working Python script that (laboriously) bypasses the limitation introduced by the old GeoServer REST API bug with DB files, and the fact that DataTransfer only allows transferring web resources (not local resources).

The script does the following (a rough structural outline in Python is sketched right after this comment):

  • Create Geoserver workspace if needed (Geoserver REST API)
  • Create Geoserver Spatialite datastore if needed - only for new data (Geoserver REST API)
  • Upload local Spatialite DB file to i-Marine workspace (gCube HomeLibrary REST API)
  • Transfer uploaded Spatialite DB file to Geoserver data directory (gcube DataTransfer REST API)
  • Publish Spatialite FeatureType to GeoServer catalog, enabling WMS and WFS (Geoserver REST API)
  • Delete uploaded Spatialite DB file from i-Marine workspace (gCube HomeLibrary REST API)

But I still have one issue: I have to use 2 different gCube security tokens to invoke the HomeLibrary and DataTransfer REST APIs (not sure whether it is because of an unregistered scope or because of the 2 different DEV/PROD environments), see #7155. As soon as this is solved by the colleagues involved, I want to retest the full script. Early next week, I think, you should be able to test it.

Best regards
Emmanuel
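
For readers of this ticket, a rough structural outline of that flow in Python follows. It is not the attached script: the HomeLibrary calls are left as hypothetical stubs because their endpoints are not documented here, the DataTransfer payload simply mirrors the example from comment #36, step 2 (datastore creation) is omitted because its connection parameters are plugin-specific, and tokens, names and credentials are placeholders.

```
# Rough outline of the publication flow listed above (not the attached script).
# upload_to_workspace() and delete_from_workspace() are hypothetical stubs standing
# in for the gCube HomeLibrary REST calls, whose endpoints are not shown in this
# ticket; all tokens, names and credentials are placeholders.
import requests

GEOSERVER_REST = "https://geoserver1-spatial-dev.d4science.org/geoserver/rest"
GS_AUTH = ("admin", "geoserver")  # placeholder GeoServer credentials
DT_BASE = "http://geoserver1-spatial-dev.d4science.org/data-transfer-service/gcube/service"
GCUBE_TOKEN = "XXXXXX"            # placeholder gCube token

def ensure_workspace(ws):
    """Step 1: create the GeoServer workspace if needed (GeoServer REST API)."""
    if requests.get(f"{GEOSERVER_REST}/workspaces/{ws}", auth=GS_AUTH).status_code == 404:
        requests.post(f"{GEOSERVER_REST}/workspaces", auth=GS_AUTH,
                      headers={"Content-type": "text/xml"},
                      data=f"<workspace><name>{ws}</name></workspace>").raise_for_status()

def upload_to_workspace(local_file):
    """Step 3 (hypothetical stub): push the file to the i-Marine workspace via the
    HomeLibrary REST API and return its public URL."""
    raise NotImplementedError("HomeLibrary REST call not documented in this ticket")

def transfer_to_geoserver(public_url, filename):
    """Step 4: ask the DataTransfer service to copy the file into the GeoServer
    data dir, using the '/geoserver' persistence id reported in comment #47.
    The XML payload mirrors the example in comment #36."""
    body = ('<?xml version="1.0" encoding="UTF-8" standalone="yes"?>'
            f'<transferRequest><id></id><httpDownloadSettings><source>{public_url}</source>'
            '<options><range><min>80</min><max>80</max></range></options>'
            '</httpDownloadSettings><destinationSettings>'
            '<persistenceId>/geoserver</persistenceId><subFolder></subFolder>'
            f'<destinationFileName>{filename}</destinationFileName>'
            '<createSubfolders>true</createSubfolders>'
            '<onExistingFileName>ADD_SUFFIX</onExistingFileName>'
            '<onExistingSubFolder>APPEND</onExistingSubFolder></destinationSettings>'
            '<pluginInvocations/></transferRequest>')
    requests.post(f"{DT_BASE}/Requests", data=body,
                  headers={"gcube-token": GCUBE_TOKEN,
                           "Accept": "application/xml",
                           "Content-Type": "application/xml"}).raise_for_status()

def publish_featuretype(ws, datastore, table):
    """Step 5: publish the SpatiaLite table as a featureType (enables WMS/WFS)."""
    requests.post(f"{GEOSERVER_REST}/workspaces/{ws}/datastores/{datastore}/featuretypes",
                  auth=GS_AUTH, headers={"Content-type": "text/xml"},
                  data=f"<featureType><name>{table}</name></featureType>").raise_for_status()

def delete_from_workspace(public_url):
    """Step 6 (hypothetical stub): remove the temporary file from the i-Marine workspace."""
    raise NotImplementedError("HomeLibrary REST call not documented in this ticket")
```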

#72 Updated by Emmanuel Blondel almost 3 years ago

  • File deleted (aaps_publish_layers.py)

#73 Updated by Emmanuel Blondel almost 3 years ago

Here's the updated script as it should work (once the gCube token issue is solved).

#74 Updated by Emmanuel Blondel almost 3 years ago

  • Related to Task #7169: Install AAPS production GeoServer with spatialite plugin & DataTransfer service added

#75 Updated by Emmanuel Blondel almost 3 years ago

  • Assignee changed from Emmanuel Blondel to Manuel Goacolou
  • Status changed from Resolved to Feedback

#76 Updated by Emmanuel Blondel over 2 years ago

  • Related to Task #7538: Deploy patch (JAR) in AAPS Production Geoserver added

#77 Updated by Emmanuel Blondel over 2 years ago

Follow-up:

  • The patch provided by GeoSolutions is operational (see #7538 for details)
  • See the attached Python script to proceed with the publication (for the production GeoServer, please wait until #7538 is solved)

At this stage, I'm done with my contribution: providing you with the business logic (in Python) to publish SpatiaLite files as WMS/WFS. I need your feedback ASAP on this ticket (assigned to @mgoacolou@cls.fr ), so I can definitively close it.

Next:
1. Publish all CLS outputs:
- You are free to decide how you want to structure your layers. Note that you may create different workspaces (having different workspaces gives you separate instances of the WMS/WFS services, hence separate GetCapabilities lists of layers). Using GeoServer SpatiaLite, for each new SpatiaLite DB file you will need to configure a dedicated datastore.
- The script provided handles simple layers with the default style. If you want to customize this, you will need to define the layer style rules and customize the script to specify the custom style (a minimal sketch of assigning a custom style via the REST API follows after this list).

2. Geo-visualization: I'm going to create a ticket in order to push on this (note it doesn't mean that I have to take the lead on it; I'm not assigned to this activity, so if you expect my technical support you should liaise and agree with @anton.ellenbroek@fao.org and probably the other partners involved in this WP). I will only suggest possible scenarios to go forward (more or less straightforward depending on the case, and I have to say this is closely related to the publishing language you use: Python). This ticket should later become a master task or activity, with appropriate sub-tasks created.
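
As noted under point 1, here is a minimal sketch of assigning a pre-existing custom style to a published layer via the documented GeoServer REST /layers endpoint; layer and style names are placeholders, and the SLD itself must already exist in the catalog:

```
# Minimal sketch of setting the default style of a published layer.
# Names and credentials are placeholders; the style (SLD) must already exist.
import requests

GEOSERVER_REST = "https://geoserver1-spatial-dev.d4science.org/geoserver/rest"
AUTH = ("admin", "geoserver")  # placeholder credentials

def set_default_style(workspace, layer, style_name):
    url = f"{GEOSERVER_REST}/layers/{workspace}:{layer}"
    body = f"<layer><defaultStyle><name>{style_name}</name></defaultStyle></layer>"
    resp = requests.put(url, data=body, auth=AUTH,
                        headers={"Content-type": "text/xml"})
    resp.raise_for_status()

# Example: set_default_style("aaps", "mylayer", "my_custom_style")
```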

#78 Updated by Emmanuel Blondel over 2 years ago

  • Blocks Task #7579: Publish all available CLS outputs as WMS/WFS for use in AAPS VRE added

#79 Updated by Emmanuel Blondel over 2 years ago

  • Related to Task #7590: Create & publish ISO metadata for CLS outputs added

#80 Updated by Manuel Goacolou over 2 years ago

I'm now working on testing the script aaps_publish_layers_spatialite2.py

#81 Updated by Manuel Goacolou over 2 years ago

The script is OK.
I tested it on test and prod. The data are now available.

#82 Updated by Emmanuel Blondel over 2 years ago

  • Status changed from Feedback to Closed
