
CityLearn: Diverse Real-World Environments for Sample-Efficient Navigation Policy Learning

Example environments: the Oxford RobotCar dataset and the Nordland Railway dataset.

We provide the CityLearn framework proposed in CityLearn: Diverse Real-World Environments for Sample-Efficient Navigation Policy Learning, accepted for publication at the IEEE International Conference on Robotics and Automation (ICRA 2020), https://doi.org/10.1109/ICRA40945.2020.9197336. A preprint is available at https://arxiv.org/abs/1910.04335.

Project page: https://mchancan.github.io/projects/CityLearn

About CityLearn

CityLearn is an interactive open framework for training and testing navigation algorithms in real-world environments with extreme visual appearance changes, including day-to-night and summer-to-winter transitions. We leverage publicly available datasets, often used in visual place recognition and autonomous vehicle research, consisting of multiple traversals across different seasons, times of day, and weather conditions. CityLearn is also designed to test the generalization capabilities of navigation algorithms, including reinforcement learning agents.

Requirements

CityLearn is developed on top of the Unity ML-Agents toolkit, which can run on Mac OS X, Windows, or Linux.

Some dependencies:

  • Python 3.6
  • Unity game engine
  • Unity ML-Agents toolkit

Configuring CityLearn

  1. Download and install Unity 2017.4.36 (available from the Unity download archive) for Windows or Mac, or through UnityHub for Linux.

  2. Download and install Unity ML-Agents v0.8.1. Using other ML-Agents releases may require substantial changes, as CityLearn was developed with v0.8.1. Once downloaded, install the two Python packages in editable mode by running the following from the ml-agents-0.8.1 repository's root directory:

     pip3 install -e ./ml-agents-envs
     pip3 install -e ./ml-agents
    
  3. Clone this repository into a temporary folder:

     git clone https://github.com/mchancan/citylearn.git
    
  • Move the CityLearn directory into the UnitySDK/Assets/ML-Agents/Examples/ folder of your ml-agents-0.8.1 directory.
  • Add the files provided in the config directory to the config/ folder of your ml-agents-0.8.1 directory.
  4. Download the driving datasets of your choice. For some datasets, you may need to extract frames from the downloaded videos, e.g. using avconv on Ubuntu, as in the sketch below.
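A minimal frame-extraction sketch with avconv, assuming a downloaded video named video.mp4 (a placeholder name, not a file shipped with this repository) and sampling one frame per second into a frames/ directory:

     mkdir -p frames
     avconv -i video.mp4 -r 1 -qscale 2 frames/image-%05d.jpg

Adjust the -r rate to control how densely the traversal is sampled.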

Training your own agent

  • The code provided in the CityLearn directory can be used directly with this subset of the Nordland dataset, but you can easily use any other driving dataset; an example training command is shown after this list.
  • The corresponding 64-d feature vectors for the Nordland subset are provided in the features directory. To extract these features, we used this NetVLAD implementation.
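Training then follows the standard ML-Agents v0.8.1 workflow via mlagents-learn. A minimal sketch, run from the ml-agents-0.8.1 root directory, assuming a trainer configuration file named citylearn_config.yaml (a hypothetical name; use the actual file copied into config/ above), and pressing Play in the Unity editor when prompted:

     mlagents-learn config/citylearn_config.yaml --run-id=citylearn-nordland --train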

Run a demo using a pre-trained model!

  • We also provide pre-trained agents in the CityLearn directory for testing. You may need to double-check the paths to both the Nordland subset images and the 64-d features on your machine, then deploy the pre-trained CityLearnBrain_NV64_Nordland.nn agent using the CityLearnDeploy scene; a Python-side alternative is sketched below.
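Instead of the embedded .nn model, the environment can also be driven from Python through the ML-Agents v0.8.1 low-level API. A minimal sketch, assuming the CityLearn scene is open in the Unity editor and the agent's brain uses discrete action branches; the random actions are a stand-in for your own policy:

     import numpy as np
     from mlagents.envs import UnityEnvironment

     # file_name=None connects to the scene running in the Unity editor.
     env = UnityEnvironment(file_name=None)
     brain_name = env.external_brain_names[0]
     branches = env.brains[brain_name].vector_action_space_size

     info = env.reset(train_mode=False)[brain_name]
     for _ in range(100):
         n_agents = len(info.agents)
         # Sample one random action per agent and per discrete branch.
         action = np.column_stack([np.random.randint(0, b, n_agents) for b in branches])
         info = env.step({brain_name: action})[brain_name]
     env.close()

This is a sketch under the assumptions above, not the repository's own deployment path, which runs entirely inside Unity.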

License

CityLearn itself is released under the MIT License (refer to the LICENSE file for details) for academic purposes. For commercial use, please contact us at mchancanl@uni.pe.

Citation

If you find this project useful for your research, please use the following BibTeX entry.

@INPROCEEDINGS{chancan2020citylearn,
	author={M. {Chanc\'an} and M. {Milford}},
	booktitle={2020 IEEE International Conference on Robotics and Automation (ICRA)},
	title={CityLearn: Diverse Real-World Environments for Sample-Efficient Navigation Policy Learning},
	year={2020},
	pages={1697-1704}
}
