Example environments: Oxford RobotCar dataset and Nordland Railway dataset.
We provide the CityLearn framework proposed in "CityLearn: Diverse Real-World Environments for Sample-Efficient Navigation Policy Learning", accepted for publication at the IEEE International Conference on Robotics and Automation (ICRA 2020). A preprint is available at https://arxiv.org/abs/1910.04335.
Project page: https://mchancan.github.io/projects/CityLearn
CityLearn is an interactive, open framework for training and testing navigation algorithms in real-world environments with extreme visual appearance changes, including day-to-night and summer-to-winter transitions. We leverage publicly available datasets, often used in visual place recognition and autonomous vehicle research, consisting of multiple traversals across different seasons, times of day, or weather conditions. CityLearn is also designed to test the generalization capabilities of navigation algorithms, including reinforcement learning agents.
CityLearn is developed on top of the Unity ML-Agents toolkit, which can run on Mac OS X, Windows, or Linux.
Dependencies (an optional environment-setup sketch follows this list):
- Python 3.6
- Unity game engine
- Unity ML-Agents toolkit
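Since CityLearn targets Python 3.6, you may want to keep the ML-Agents installation below isolated in its own environment. This is only an optional sketch; the environment name `citylearn-env` is arbitrary and any other environment manager works as well.

```sh
# Optional: create and activate an isolated Python 3.6 environment.
python3.6 -m venv citylearn-env
source citylearn-env/bin/activate
```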
- Download and install Unity 2017.4.36 for Windows or Mac, or through Unity Hub for Linux.
- Download and install Unity ML-Agents v0.8.1. Using other ML-Agents releases may require substantial changes, as CityLearn was developed with v0.8.1. Once downloaded, install it for development by running the following from the `ml-agents-0.8.1` repository's root directory:

  ```sh
  pip3 install -e ./ml-agents-envs
  pip3 install -e ./ml-agents
  ```
- Clone this repository into a temporary folder:

  ```sh
  git clone https://github.com/mchancan/citylearn.git
  ```
- Put the `CityLearn` directory inside the `UnitySDK/Assets/ML-Agents/Examples/` folder in your `ml-agents-0.8.1` directory.
- Add the files provided in the `config` directory to the `config/` folder in your `ml-agents-0.8.1` directory (these are used when launching training; see the sketch after this list).
- Download the driving datasets of your choice. For some datasets, you may need to extract frames from the downloaded videos, e.g. using Avconv on Ubuntu (see the example after this list):
- Oxford RobotCar: https://robotcar-dataset.robots.ox.ac.uk/
- Berkeley DeepDrive: https://bdd-data.berkeley.edu/
- Cityscapes: https://www.cityscapes-dataset.com/
- KITTI: http://www.cvlibs.net/datasets/kitti/index.php
- Nordland Railway: https://nrkbeta.no/2013/01/15/nordlandsbanen-minute-by-minute-season-by-season/
- Multi-lane Road (videos): https://wiki.qut.edu.au/display/raq/2014+Multi-Lane+Road+Sideways-Camera+Datasets
- Gold Coast Drive (video): https://wiki.qut.edu.au/display/raq/Datasets
- UQ St Lucia: https://wiki.qut.edu.au/display/raq/UQ+St+Lucia
- St Lucia Multiple Times of Day (videos): https://wiki.qut.edu.au/display/raq/St+Lucia+Multiple+Times+of+Day
- Alderley (video+frames): https://wiki.qut.edu.au/pages/viewpage.action?pageId=181178395
- The code provided in the `CityLearn` directory can be used directly on this subset of the Nordland dataset, but you can easily use any other driving dataset.
- Corresponding 64-d feature vectors of the Nordland subset are provided in the `features` directory. For extracting these features, we used this NetVLAD implementation.
- We also provide pre-trained agents in the `CityLearn` directory for testing. You may need to double-check the paths to both the images of the Nordland subset and the 64-d `features` on your computer, and then deploy the pre-trained `CityLearnBrain_NV64_Nordland.nn` agent using the `CityLearnDeploy` scene.
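As an example of the frame-extraction step mentioned in the dataset item above, the following Avconv invocation dumps one frame per second from a downloaded traversal video. This is only a sketch: the input file name, the output pattern, and the sampling rate are placeholders to adapt to the dataset you use.

```sh
# Hypothetical example: extract one frame per second from a downloaded video.
# Adjust the input name, the -r rate, and the output pattern as needed.
mkdir -p frames
avconv -i nordland_summer.mp4 -r 1 frames/%06d.png
```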
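Once the scenes, configuration files, and images are in place, training follows the standard ML-Agents v0.8.1 workflow. The sketch below assumes a trainer configuration file named `trainer_config.yaml` in the `config/` folder and an arbitrary run ID; substitute the configuration provided with CityLearn.

```sh
# Run from the ml-agents-0.8.1 root directory; mlagents-learn will prompt you
# to press Play in the Unity Editor to start training.
mlagents-learn config/trainer_config.yaml --run-id=citylearn_run01 --train
```

Trained models are exported as `.nn` files, which can then be assigned to the corresponding brain asset in the Unity Editor for inference (as with the provided `CityLearnBrain_NV64_Nordland.nn`).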
CityLearn itself is released under the MIT License (refer to the LICENSE file for details) for academic purposes. For commercial usage, please contact us via mchancanl@uni.pe.
If you find this project useful for your research, please use the following BibTeX entry.
@INPROCEEDINGS{chancan2020citylearn,
author={M. {Chanc\'an} and M. {Milford}},
booktitle={2020 IEEE International Conference on Robotics and Automation (ICRA)},
title={CityLearn: Diverse Real-World Environments for Sample-Efficient Navigation Policy Learning},
year={2020},
volume={},
number={},
pages={1697-1704}
}