Development
One of the overarching goals of the TNO project is to ensure the long-term maintainability and incremental enhanceability of the solution. To support this, Docker containers will be used to allow developers to generate identical or similar local environments, which will speed up development and testing.
VS Code supports the use of Development Containers, which further enables containerizing the development process. This gives developers identical development environments and ensures there are no missing dependencies or variations that cause failures in external environments.
You should no longer need to worry about "it works on my computer, not sure why it doesn't work in yours".
Refer to the repo docs for more information.
Through the use of Docker, most required components of the solution can be run locally to build and test each feature. A few features have cloud dependencies that require an Azure subscription to specific services: Storage, Cognitive Services, and Video Analyzer.
Some of these dependencies may change over the course of the project.
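As a hedged illustration only (the actual compose configuration, service names, and any helper scripts in the repo may differ), starting the local environment could look something like this:

```bash
# Hypothetical example: the compose file layout and service names are assumptions.
# Start the core infrastructure (auth, database, search, messaging).
docker-compose up -d keycloak postgres elasticsearch kafka

# Start the API and web applications once the infrastructure is healthy.
docker-compose up -d api-editor app-editor

# Tail the logs of a single service while developing.
docker-compose logs -f api-editor
```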
The source code is maintained as open source on GitHub. All required dependencies are included in the source code as a mono-repo. The source code is organized so that the many different parts and technologies are grouped with related components.
| Folder | Description |
| --- | --- |
| /.github | Configuration for GitHub and GitHub Actions |
| /.vscode | Configuration for VS Code |
| /libs | Various dependency libraries required by the other projects |
| /libs/java/dal | Data Access Layer to connect to the data sources |
| /libs/java/core | Common library package for the Java projects |
| /libs/npm/core | Common library package for the React web applications |
| /services | Backend Kafka Producers and Consumers (and other services) |
| /services/rss | Backend Kafka Producer for ingesting RSS syndication feeds |
| /services/atom | Backend Kafka Producer for ingesting ATOM syndication feeds |
| /api | Backend RESTful APIs and associated projects |
| /api/editor | Backend RESTful API for the editor app |
| /api/subscriber | Backend RESTful API for the subscriber app |
| /app | Frontend web applications |
| /app/editor | Web application for editors |
| /app/subscriber | Web application for subscribers |
| /auth | Authentication services such as Keycloak |
| /db | Data storage services, such as PostgreSQL, Elasticsearch, Kafka, Azurite |
| /docs | Documentation related specifically to the code base; everything else should go into the wiki |
| /network | Network services such as Nginx |
| /openshift | Infrastructure as code; everything to set up OpenShift |
| /test | Global testing tools for the solution; all other tests will be within each project |
| /tools | Tools used in the solution for development |
Common components and libraries that are shared between different parts of the solution should be separated and placed in the /libs folder.
These libraries should be published to their respective language package managers (Maven, npm, NuGet, Docker, ...) so that they can be imported as dependencies into multiple projects.
An example of this is the DAL used to connect to the various data sources.
It will use Hibernate and give the Java Spring APIs access to the data.
The APIs will remain lightweight and only serve their specific purpose of listening for requests and providing responses.
All the actual work will be performed by these commonly shared libraries, which apply the business rules and data access.
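As a rough sketch of what publishing and consuming these shared packages could look like from a developer's machine (the commands, paths, and package names below are illustrative assumptions, not the project's actual build setup):

```bash
# Hypothetical commands; actual Maven coordinates and npm package names may differ.
# Install the Java DAL into the local Maven repository so an API project can depend on it.
mvn -f libs/java/dal/pom.xml clean install

# Add the shared npm core package to one of the React apps (assuming it is published to a registry).
cd app/editor
npm install @tno/core
```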
Examples of common components:
- core: Common functions, features, components, configuration
- dal.db: Data Access Layer to PostgreSQL database
- dal.es: Data Access Layer to Elasticsearch
- dal.kafka: Data Access Layer to Kafka
- dal.azure.blob: Data Access Layer to Azure Blob Storage
- dal.azure.cognitiveservices: Data Access Layer to Azure Cognitive Services
- dal.azure.videoanalyzer: Data Access Layer to Azure Video Analyzer
- bl: Business Layer which will reference the various DALs and enforce business rules
This architecture, while more complex than placing all the code in a single project, will help enforce SOLID principles.
The source control workflow strategy is fairly simple and follows common industry standards.
Developers will fork the repository into their own GitHub account.
Developers should create a separate branch by copying the `dev` branch for their work that is related to a Story.
All changes will be submitted to the `bcgov/dev` branch through a Pull Request.
A Pull Request should be made once a branch has been rebased to ensure the new commits are appended to the end (this will reduce conflicts).
Additionally, the branch's commits should be squashed into a single commit before the PR is created.
During review of the Pull Request, any new changes should not be rebased so that each commit can be compared and reviewed appropriately.
Each Pull Request will require at least one review and approval from a code owner.
When the Pull Request is ready to be merged into the `bcgov/dev` branch, it should be squash merged so that it becomes a single commit.
When a release is ready to be handed off to UAT, the `dev` branch should be merged into the `master` branch.
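From the command line, this workflow might look roughly like the following; the remote names (`origin` for your fork, `upstream` for the bcgov repository) and the branch name are conventions assumed for the example, not prescribed by the project:

```bash
# Hypothetical remote and branch names, for illustration only.
git clone https://github.com/{your-account}/{repo}.git
cd {repo}
git remote add upstream https://github.com/bcgov/{repo}.git

# Create a working branch from the latest dev branch for your Story.
git fetch upstream dev
git checkout -b my-story upstream/dev

# Commit your work, push it to your fork, then open a Pull Request against bcgov/dev.
git push -u origin my-story
```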
For those new to Git and rebasing, here is a short tutorial. We recommend you use VS Code as the Git editor as it has a nice UI to perform this task.
The following steps will rebase your branch on top of the selected branch.
- Fetch the latest commits for the branch you want to rebase onto: `git fetch {remote} {source branch}`
- Checkout the branch you want to rebase (if you haven't already): `git checkout {destination branch}`
- Rebase the current branch on top of the other branch: `git rebase {remote}/{source branch}`
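For example, assuming the bcgov repository is configured as the `upstream` remote and your Story branch is called `my-story` (both names are placeholders):

```bash
# Rebase a local feature branch onto the latest upstream dev branch.
git fetch upstream dev
git checkout my-story
git rebase upstream/dev
```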
Before creating a PR you should squash all your intermediate commits into a single commit. To do this, first rebase (see above), then perform another rebase with the option to edit the commits.
- Rebase with intent to edit: `git rebase -i {remote}/{source branch}`
- If you are using the VS Code editor, it will display a UI that lets you squash each commit down into your single commit. Note that you do not want to squash all the way down to the original source branch commit; you only want to squash your own commits.
- Once you have selected which commits to squash, the VS Code editor will display a text file containing all the commit messages. Update the message appropriately, save the file, and close it. Your local branch will now be squashed.
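For illustration, the todo list that opens during `git rebase -i` might look something like this after marking the later commits to be squashed (the hashes and messages are made up):

```bash
# Example git-rebase-todo contents; only your own commits are squashed.
pick   1a2b3c4 Add RSS ingest service skeleton
squash 5d6e7f8 Fix configuration typo
squash 9a0b1c2 Address review feedback
# After saving and closing, a second editor opens to combine the commit messages.
```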
BrowserStack can be used to test your local changes on an iOS/Android device. A free account can be used for this testing.
- Download BrowserStack Local
- When the download finishes ensure it is running on your local machine
- Navigate to BrowserStack Live
- Select your mobile device of choice, and browser
- After the device loads, navigate to the local application (localhost:40080 by default)
- At first you will be redirected to a screen indicating an invalid redirect URI; this is because BrowserStack does not currently support localhost and uses bs-local.com instead
- To solve this, navigate to Keycloak locally (http://localhost:50001/ by default), then go to Clients > TNO-App > Valid Redirect URIs and add an entry for bs-local.com:40080
- Refresh the browser on the mobile device and it should load as expected
- Debugging can be done with the inspector tools located on the left panel