feat: Developed the Streamlit page to upload CSV files
alanceloth committed Feb 3, 2024
1 parent 3a1d760 commit 7c7a2e9
Showing 1 changed file with 6 additions and 1 deletion.
7 changes: 6 additions & 1 deletion README.md
@@ -15,7 +15,12 @@ Following this, the project leverages dbt (data build tool) to retrieve data fro
- :white_check_mark: Add the contract as a submodule to the main repo.
- :white_check_mark: Configure the CI/CD using the GitHub Actions.
- :white_check_mark: Test the Workflow.
- [ ] **Create the Streamlit page to upload CSV files**: Develop a Streamlit page to streamline the process of uploading CSV files.
- :white_check_mark: **Add Sentry as the Observability tool**: Add Sentry to the project
- :warning: **Create the Streamlit page to upload CSV files**: Develop a Streamlit page to streamline the process of uploading CSV files.
- :white_check_mark: Create the Streamlit frontend to upload files
- :white_check_mark: Create the backend to process the uploaded CSV files and check that the schema is correct using the pydantic contract
- :white_check_mark: Create the app.py to execute the application
- :warning: Test the upload
- [ ] **Transform the CSV files into Parquet files**: Implement the necessary procedures to transform CSV files into Parquet files.
- [ ] **Save the Parquet files into AWS S3 Bucket**: Set up mechanisms to save the Parquet files into the designated AWS S3 Bucket.
- [ ] **Establish the dbt project**: Initiate the creation of the dbt project for seamless data management.
