OCO2, launched on July 2, 2014, is the first satellite dedicated to studying carbon dioxide. OCO3 is its sister instrument: it was built from the OCO2 flight spare, so it has similar instrument sensitivity and performance characteristics. OCO3 flies as a payload on the International Space Station (ISS), which means two of the same instruments are currently flying. Because they fly in different orbits (a polar orbit for OCO2 versus the precessing orbit of the ISS for OCO3), this is a fantastic opportunity for science: it permits NASA to study CO2 over different areas of the globe.
- Users are encouraged to revisit this section after completing the tutorials and attempt to execute the provided script on Google Colab.
- Execute the Google Colab script to access OCO data from the OPeNDAP server via PyDAP. The carbon footprint vertices can then be used to visualize the data within a specified geographic region.
- (Ctrl + Click) Colab Script
- Accessing the OCO2/OCO3 Datasets
- Data Pre-processing
- Data Visualization
- Setup environment using Docker
- Demonstration
- In this tutorial, we will guide you through the process of using a Python script. We'll assume that you have some basic programming knowledge.
- To download OCO2/OCO3 data files from the Earth Data Search website, you can follow these steps: First, access the Earth Data Search website and navigate to the OCO2/OCO3 dataset of interest. Then, specify your search criteria such as time range, geographical area, and data product type. You may need to log in or create an account if prompted. Finally, follow the on-screen instructions to complete the download process, and once the files are downloaded, they can be accessed and utilized for further analysis or visualization.
- Before proceeding with this tutorial, users can visit the following hyperlinks. The EarthData platform provides a variety of data application tools, each designed for a distinct function: search, data handling, subsetting and filtering, and data visualization and analysis.
- (Ctrl + Click) Data Tools provided by EarthData
- (Ctrl + Click) EarthDataSearch: FAQ
- The Earthdata Developer Portal serves as a hub for application developers interested in building software applications that search, access, and explore Earth science data hosted by EOSDIS. A search sketch follows the link below.
- (Ctrl + Click) earthdataAPI
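As an illustration of programmatic search, here is a hedged sketch that queries NASA's Common Metadata Repository (CMR) granule-search endpoint for OCO2 Lite granules. The collection short name, bounding box, and date range are placeholder assumptions; consult the Earthdata API documentation for the exact values your use case needs.

```python
# Hedged sketch: search for OCO2 Lite granules through the CMR API.
# The short_name, temporal range, and bounding box below are assumptions
# for illustration -- verify them against the Earthdata API documentation.
import requests

params = {
    "short_name": "OCO2_L2_Lite_FP",                        # assumed OCO2 Lite collection name
    "temporal": "2020-01-01T00:00:00Z,2020-01-31T23:59:59Z",
    "bounding_box": "-125,32,-114,42",                      # lon_min,lat_min,lon_max,lat_max
    "page_size": 10,
}
resp = requests.get("https://cmr.earthdata.nasa.gov/search/granules.json", params=params)
resp.raise_for_status()

# Print the titles of the matching granules
for entry in resp.json()["feed"]["entry"]:
    print(entry["title"])
```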
- The example below demonstrates API access to OPeNDAP using the pydap tool (client/server software). The Google Colab notebook above demonstrates a similar workflow.
- Eg: with Python's pydap we can request access to the OCO2 datasets using credentials from an EarthData login: (Ctrl + Click) using openDap
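A minimal sketch of that pydap workflow is shown below. The dataset URL is a placeholder, and the variable names (xco2, latitude, longitude) assume an OCO2 Lite granule served over OPeNDAP; substitute the endpoint of the granule you located on Earthdata Search.

```python
# Minimal sketch: open an OCO2 granule over OPeNDAP with pydap,
# authenticating through the NASA Earthdata Login (URS) service.
from pydap.client import open_url
from pydap.cas.urs import setup_session

username = "your_earthdata_username"   # replace with your EarthData credentials
password = "your_earthdata_password"
# Placeholder URL -- substitute the OPeNDAP endpoint of the granule you need
dataset_url = "https://oco2.gesdisc.eosdis.nasa.gov/opendap/path/to/granule.nc4"

session = setup_session(username, password, check_url=dataset_url)
dataset = open_url(dataset_url, session=session)

# List the available variables, then pull the column-averaged CO2 retrieval
print(list(dataset.keys()))
xco2 = dataset["xco2"][:]        # assumed variable names, as in OCO2 Lite files
lat = dataset["latitude"][:]
lon = dataset["longitude"][:]
```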
- The primary data format for OCO2/OCO3 is netCDF4, which lets users generate visualizations directly with software tools such as Panoply, QGIS, and ArcGIS.
- For easier data manipulation and analysis, netCDF files can also be converted into CSV format. Additionally, to improve scalability and conserve storage space, files can be saved in the Parquet format (a minimal conversion sketch follows the links below).
- Converting netCDF files to CSV format (Ctrl + Click) netCDF->csv
- Converting netCDF files to parquet format (Ctrl + Click) netCDF->parquet
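The hedged sketch below flattens a single netCDF granule to CSV and Parquet with netCDF4 and pandas; the file name and variable names are placeholders, and writing Parquet assumes pyarrow (or fastparquet) is installed.

```python
# Minimal sketch: flatten an OCO2 Lite netCDF granule into CSV and Parquet.
import netCDF4
import pandas as pd

nc = netCDF4.Dataset("oco2_LtCO2_example.nc4")       # placeholder file name
df = pd.DataFrame({
    "time": nc.variables["time"][:],                  # assumed variable names,
    "latitude": nc.variables["latitude"][:],          # as found in OCO2 Lite files
    "longitude": nc.variables["longitude"][:],
    "xco2": nc.variables["xco2"][:],
})
nc.close()

df.to_csv("oco2_example.csv", index=False)            # human-readable, larger on disk
df.to_parquet("oco2_example.parquet", index=False)    # columnar and compressed (needs pyarrow)
```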
- Other GIS software tools:
- (Ctrl + Click) Arset Tutorial 2022
- The ARSET training script can be used to visualize OCO2/OCO3 data from a single-day file, showcasing its practical application.
- Libraries:
- netCDF, pandas, numpy, matplotlib, pydap, plotly, Basemap, seaborn
- dask // to perform data aggregation and data pre-processing
- We can use the Dask library to read multiple files, preprocess them, and visualize the data (see the sketch after the link below).
- (Ctrl + Click) Data_Visualization Multiple files
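For instance, the hedged sketch below uses Dask to aggregate many daily files that were previously converted to Parquet (see the conversion sketch above) and plots a daily mean; the directory path and the date column are assumptions about how the files were preprocessed.

```python
# Minimal sketch: lazily read many Parquet files with Dask, subset a region,
# aggregate, and plot the result with matplotlib.
import dask.dataframe as dd
import matplotlib.pyplot as plt

ddf = dd.read_parquet("oco2_parquet/*.parquet")        # placeholder directory of daily files

# Subset to a region of interest (roughly California) before aggregating;
# nothing is computed until .compute() is called.
region = ddf[ddf.latitude.between(32, 42) & ddf.longitude.between(-125, -114)]
daily_mean = region.groupby("date")["xco2"].mean().compute()   # "date" column assumed

daily_mean.sort_index().plot(marker="o")
plt.ylabel("XCO2 (ppm)")
plt.title("Daily mean XCO2 over California (example)")
plt.show()
```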
- Docker-Based Environment Setup: To facilitate the installation of the necessary packages and libraries, the deployment of Docker containers is recommended. This approach streamlines the environment configuration process and ensures consistent dependencies.
- Containerization as Docker Image: For further versatility, the environment can be containerized into a Docker image. This Docker image can subsequently be deployed as a web application or a Jupyter server on Amazon Web Services (AWS). Comprehensive guidelines on these deployment processes can be found within the repository documentation, offering in-depth insights into the steps required for successful implementation.
- These procedures improve the efficiency and reproducibility of the environment setup and provide flexibility in deploying the system as a web application or Jupyter server on AWS infrastructure; see the repository documentation for comprehensive guidance.
- (Ctrl + Click) Setting-up-Environment
The principal aim of this project is to present data points representing XCO2 and illustrate their temporal evolution, thereby offering insight into the variability of the atmospheric carbon cycle for a specified year. In this context, the project utilizes the "vertices" attributes of carbon footprints to generate polygonal shapes, facilitating the visualization of carbon footprint patterns.
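As a hedged illustration of that idea, the sketch below draws each sounding footprint as a polygon built from its four vertex coordinates and colors it by XCO2; the variable names (vertex_latitude, vertex_longitude, xco2) and the file name are assumptions based on the OCO2 Lite files and should be verified against the granule you use.

```python
# Minimal sketch: draw OCO2 sounding footprints as polygons from the
# per-sounding vertex coordinates and color them by XCO2.
import netCDF4
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.collections import PolyCollection

nc = netCDF4.Dataset("oco2_LtCO2_example.nc4")        # placeholder file name
vlat = nc.variables["vertex_latitude"][:]              # assumed shape: (n_soundings, 4)
vlon = nc.variables["vertex_longitude"][:]
xco2 = nc.variables["xco2"][:]
nc.close()

# One closed polygon per sounding footprint: columns are (lon, lat)
polygons = [np.column_stack([vlon[i], vlat[i]]) for i in range(len(xco2))]

fig, ax = plt.subplots()
coll = PolyCollection(polygons, array=xco2, cmap="viridis", edgecolor="none")
ax.add_collection(coll)
ax.autoscale()
fig.colorbar(coll, ax=ax, label="XCO2 (ppm)")
plt.show()
```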
The OCO2 data represent the variation of XCO2 across different years. The visualization is built with R code located inside the ARSET/ directory and covers the California region. NOTE: OCO2 launched in July 2014, so for that year the plots are only shown for the later months.
- For any questions, please raise an issue in this repository. Thanks!