The Atlas Checks framework and standalone application are tools to enable quality assurance of Atlas data files. For more information on the Atlas mapping file format, please see the Atlas project on GitHub.
Please see the contributing guidelines!
To run Atlas Checks, the following is required:
To start working with Atlas Checks, follow the steps below:
- Clone the Atlas Checks project: `git clone https://github.com/osmlab/atlas-checks.git`
- Switch to the newly created directory: `cd atlas-checks`
- Execute `./gradlew run`
This command will build and run Atlas Checks with all the default options against sample Atlases of Belize downloaded from here. GeoJSON output containing all the results found during the run will be produced in `atlas-checks/build/examples/data/output`. For more information on running Atlas Checks as a standalone application, click here.
See the configuration docs for more information about the configuration files used to control how the Atlas Checks application runs.
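Check configuration is typically expressed as JSON keyed by check name. As a minimal, hypothetical sketch (the check name, parameter keys, and values below are invented for illustration; the real keys each check accepts are listed in the configuration docs):

```json
{
  "ExampleCheck": {
    "enabled": true,
    "length.maximum.meters": 100.0
  }
}
```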
Atlas Checks has been developed to take advantage of distributed computing by running the checks in Spark. For more information on Spark, see spark.apache.org. Running Atlas Checks locally already executes within a local Spark environment on your machine, so moving to a Spark cluster is simply a matter of updating the configuration. For more information, see Running Atlas Checks in a Spark Cluster.
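As a generic illustration only (this is not the Atlas Checks Spark job itself, and the class, application name, and master URLs are placeholders), the difference between a local Spark run and a cluster run usually comes down to the master URL handed to the Spark configuration:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public final class SparkMasterSketch
{
    public static void main(final String[] args)
    {
        // "local[*]" runs Spark in-process on this machine, which is what a local
        // Atlas Checks run effectively does. Pointing the master at a cluster
        // (for example "spark://host:7077" or "yarn") distributes the same job.
        final SparkConf sparkConfiguration = new SparkConf()
                .setAppName("atlas-checks-sketch")
                .setMaster("local[*]");
        try (JavaSparkContext context = new JavaSparkContext(sparkConfiguration))
        {
            // A real job would build and process its RDDs here.
        }
    }
}
```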
See the Development docs for more information about developing new Atlas Checks and the best practices to follow.
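To give a feel for what a new check looks like, here is a minimal sketch following the BaseCheck pattern from the Development docs: a constructor that accepts the configuration, a validCheckForObject filter, and a flag method that returns a CheckFlag when a problem is found. The check name, the Edge/name-tag rule, and the instruction text are illustrative, and the exact package paths should be confirmed against the Development docs:

```java
import java.util.Optional;

import org.openstreetmap.atlas.checks.base.BaseCheck;
import org.openstreetmap.atlas.checks.flag.CheckFlag;
import org.openstreetmap.atlas.geography.atlas.items.AtlasObject;
import org.openstreetmap.atlas.geography.atlas.items.Edge;
import org.openstreetmap.atlas.utilities.configuration.Configuration;

/**
 * Illustrative check that flags Edges with no name tag.
 */
public class ExampleUnnamedEdgeCheck extends BaseCheck<Long>
{
    private static final long serialVersionUID = 1L;

    public ExampleUnnamedEdgeCheck(final Configuration configuration)
    {
        super(configuration);
    }

    @Override
    public boolean validCheckForObject(final AtlasObject object)
    {
        // Only inspect Edges (navigable ways); everything else is skipped.
        return object instanceof Edge;
    }

    @Override
    protected Optional<CheckFlag> flag(final AtlasObject object)
    {
        final Edge edge = (Edge) object;
        // Hypothetical rule: flag any Edge that has no name tag.
        if (!edge.getTag("name").isPresent())
        {
            return Optional.of(this.createFlag(edge,
                    "This Edge has no name tag; please verify and add one if appropriate."));
        }
        return Optional.empty();
    }
}
```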
The Available checks document lists all checks in tables organized by check type, with a description and a link to documentation for each check.