Point your camera at things to learn how to say them in a different language.
A native Android app built with React Native, made as part of my process of learning React and React Native.
Inspired by Thing Translator by dmotz.
Concepts detected in the captured image are extracted and translated into the chosen language.
It uses:
- Clarifai's concept recognition for images
- Yandex's language translation
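The recognize-then-translate flow can be sketched as below. The function names and stubbed responses are assumptions for illustration only, not the app's actual code: the real app sends the captured image to Clarifai's predict API and the resulting concept labels to Yandex's translate API.

```javascript
// Sketch of the two-step pipeline (hypothetical helpers, not the app's code).

// Stand-in for Clarifai concept recognition: the real app would POST the
// captured image to Clarifai and read concept names from the response.
async function recognizeConcepts(imageBase64) {
  return ['coffee', 'cup']; // stubbed concepts for illustration
}

// Stand-in for Yandex translation: the real app would call the translate
// API with the concept text and a target language code (e.g. 'es').
async function translateConcepts(concepts, targetLang) {
  const fakeDictionary = { coffee: 'café', cup: 'taza' }; // stubbed 'es' result
  return concepts.map((c) => fakeDictionary[c] ?? c);
}

// Compose the two steps: image in, translated concept labels out.
async function whatTheThing(imageBase64, targetLang) {
  const concepts = await recognizeConcepts(imageBase64);
  return translateConcepts(concepts, targetLang);
}

whatTheThing('<base64 image>', 'es').then((words) =>
  console.log(words.join(', '))
);
```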
what_the_thing_1.1.0-release.apk
Get the signed release APK for Android from the link above, or follow the steps below to build it yourself.
Have a look at the React Native docs to set up a development environment for React Native and Android.
```sh
# Install the React Native CLI
$ sudo npm install -g react-native-cli

# Clone the repository
$ git clone https://github.com/vigzmv/what_the_thing.git
$ cd what_the_thing
```
Get your free API keys from Clarifai and Yandex and place them in ./apiKeys.json.
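For reference, a plausible shape for ./apiKeys.json is shown below. The exact field names are an assumption; check the source for the keys the app actually reads.

```json
{
  "clarifai": "YOUR_CLARIFAI_API_KEY",
  "yandex": "YOUR_YANDEX_API_KEY"
}
```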
```sh
# Install required packages
$ yarn install || npm install

# Install the application on a running Android emulator or a connected Android device
$ react-native run-android

# Run the server/bundler
$ react-native start

# Done!
```
For a build that doesn't require the development server:
```sh
# Bundle the debug build
$ react-native bundle --dev false --platform android \
    --entry-file index.android.js \
    --bundle-output ./android/app/build/intermediates/assets/debug/index.android.bundle \
    --assets-dest ./android/app/build/intermediates/res/merged/debug

# Create the debug build
$ cd android && ./gradlew assembleDebug

# The generated APK will be located at android/app/build/outputs/apk. Install it with:
$ adb install app/build/outputs/apk/app-debug.apk
```
A signed APK that can be installed directly on a device is linked at the top of this README.
- Fork the repository
- Commit your changes
- Submit a pull request