This project is a use case of Sentinel-1 data preparation for further quantitative Earth observation analysis. The processing takes a list of satellite images as input and outputs them merged into an animated GIF. Automated deployment of the application to clouds is performed with the Nuvla service, which is based on SlipStream and operated by SixSq.
The processing of the satellite images is distributed using the MapReduce model, with input and output files stored in an object store located in the cloud. The implementation aims to minimize execution time: the number of mappers deployed depends on the number of scenes to process, which keeps the duration of the map phase roughly constant. This is important preparation for the next phase of the project, which is to enable SLA-bound creation of Earth Observation products. For example, time-to-product-creation could be an SLA that the service must enforce, independently of the number of scenes to process or the number of concurrent requests received.
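The map/reduce flow described above can be sketched in a few lines of Python. Note that `process_scene` and `make_gif` are hypothetical stand-ins for the real SAR processing chain and the GIF-merging step:

```python
from multiprocessing import Pool

def process_scene(scene_id):
    # Map phase: in the real deployment, each mapper runs the SAR
    # processing chain on one scene. Here a placeholder frame name
    # stands in for the rendered image (hypothetical).
    return "frame_%s.png" % scene_id

def make_gif(frames):
    # Reduce phase: the real reducer merges the rendered frames into
    # an animated GIF; here we only record the merge order.
    return "gif[" + ",".join(frames) + "]"

if __name__ == "__main__":
    scenes = ["S1A_001", "S1A_002", "S1A_003"]
    # One mapper per scene keeps the map phase duration roughly
    # constant regardless of how many scenes are submitted.
    with Pool(len(scenes)) as pool:
        frames = pool.map(process_scene, scenes)
    print(make_gif(frames))
```

`Pool.map` preserves input order, so the frames of the resulting GIF stay in the same order as the scene list.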
Using Nuvla, the system is portable across clouds: users can choose any cloud to perform the processing, as long as they have the corresponding credentials in their Nuvla account.
Finally, this approach is clean in terms of resource consumption, in the sense that resources are only required while scenes are being processed. Once the processing is completed, the resources are terminated, stopping any associated cloud costs. Only the data left in the object store continues to incur costs, and since object storage is the cheapest way to store data in cloud services, this cost is reduced to a minimum. Furthermore, if the output product is deleted from the object store after delivery to the end user, this cost would be eliminated altogether.
In order to successfully execute the application, you should have:

- An account on Nuvla. Follow this link where you'll find how to create the account.
- Cloud credentials added in your Nuvla user profile.
- Python >=2.6 and <3, and the Python package manager `pip` installed (usually via `sudo easy_install pip`).
- SlipStream Client installed: `pip install slipstream-client`.
- Clone this repository:

  ```
  $ git clone https://github.com/SimonNtz/SAR_app.git
  ```

- Add the list of products to the input file:

  ```
  $ cd SAR_app/run/
  $ # edit product_list.cfg
  ```

- Set the environment variables:

  ```
  $ export SLIPSTREAM_USERNAME=<nuv.la username>
  $ export SLIPSTREAM_PASSWORD=<nuv.la password>
  ```

  and run the SAR processor on Nuvla with

  ```
  $ ./SAR_run.sh <cloud>
  ```

  where `<cloud>` is the connector instance name as defined in Nuvla and for which you have defined your cloud credentials (see section 2 of Prerequisites above).

- The command prints out the deployment URL, which you can open in your browser to follow the progress of the deployment. When the deployment is done, the link to the result of the computation becomes available as the run-time parameter `ss:url.service` in the deployment's Global section. The command also follows the progress of the deployment, detects when it has finished, and recovers and prints the link to the result of the computation.
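When scripting deployments, the environment variables and the launch command can be wired together from Python. The connector name and credentials below are placeholders, not real values:

```python
import os

def sar_command(cloud, username, password, script="./SAR_run.sh"):
    # Build the command and the environment SAR_run.sh expects
    # (variable names taken from the steps above).
    env = dict(os.environ,
               SLIPSTREAM_USERNAME=username,
               SLIPSTREAM_PASSWORD=password)
    return [script, cloud], env

if __name__ == "__main__":
    # Hypothetical connector name and credentials; pass cmd and env
    # to subprocess.call(cmd, env=env) to actually launch the run.
    cmd, env = sar_command("exoscale-ch-gva", "jane", "secret")
    print(" ".join(cmd))
```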
The SAR_app scripts form the application's base; the map and reduce functions, however, are located in another GitHub repository, by default SAR_proc. During deployment, that repository is cloned locally using an application parameter containing its URL. The intent behind isolating the SAR processor is to let users customize it with less effort. When running the client script, the URL of a GitHub repository respecting the SAR_proc requirements can be passed as an input parameter:

```
$ ./SAR_run.sh <cloud> <https://github.com/YOUR_USERNAME/SAR_proc>
```
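A minimal sketch of that deployment-time step, assuming the processor URL arrives as a plain string parameter (the default URL shown is an assumption based on the repository name):

```python
DEFAULT_REPO = "https://github.com/SimonNtz/SAR_proc"  # assumed default

def clone_command(repo_url=DEFAULT_REPO, dest="SAR_proc"):
    # The deployment clones the processor repository named by the
    # application parameter; run the returned command with, e.g.,
    # subprocess.check_call(cmd).
    return ["git", "clone", repo_url, dest]

if __name__ == "__main__":
    print(" ".join(clone_command()))
```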
The European Space Agency (ESA) provides Earth observation data captured by its Sentinel-1 satellite fleet. Constantly growing, this dataset now holds great potential for exploitation in a wide spectrum of applications.
The processing of the satellite images is done using the SNAP toolbox and its Sentinel-1 module, S1tbx. The computation is distributed over multiple nodes within a cloud cluster, and the overall execution is divided into two steps following the MapReduce model. The implementation aims to minimize execution time.
NOTE: work in progress, not fully optimized yet.
The full image processing is done by calling multiple functions of the S1tbx. Here are the ones we used:

- Subsetting (crop image to the ROI)
- Calibration (radiometric, outputting beta-nought)
- Speckle-Filter (speckle noise reduction)
- Terrain correction (foreshortening and layover)
- Linear to dB pixel conversion
- Conversion to PNG format
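As a sketch, the chain above could be driven through SNAP's `gpt` command-line tool. The operator names follow S1tbx conventions, but the exact flags and the final PNG export tool are assumptions to verify against `gpt -h <Operator>`:

```python
# Sketch of the S1tbx chain from the list above, expressed as `gpt`
# invocations. Operator names follow S1tbx conventions; the flags and
# the pconvert PNG export are assumptions, not verified commands.
STEPS = [
    "Subset",              # crop to the region of interest
    "Calibration",         # radiometric calibration (beta-nought)
    "Speckle-Filter",      # speckle noise reduction
    "Terrain-Correction",  # foreshortening/layover correction
    "LinearToFromdB",      # linear to dB pixel conversion
]

def build_commands(scene):
    cmds, src = [], scene
    for i, op in enumerate(STEPS):
        dst = "step%d_%s.dim" % (i, op)
        cmds.append(["gpt", op, "-Ssource=%s" % src, "-t", dst])
        src = dst  # each step consumes the previous step's output
    # Final conversion to PNG (SNAP's pconvert; flags assumed).
    cmds.append(["pconvert", "-f", "png", src])
    return cmds

if __name__ == "__main__":
    for cmd in build_commands("S1A_scene.zip"):
        print(" ".join(cmd))
```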