
BADA G2: The Social Robot Accessible to the Deaf

Project Team: 사과가 쿵! (Dropped Apple!)

Members:

유호연(Hoyeon Yu), 김민우(Minwoo Kim), 배종학(Jonghak Hae), 이현우(Hyunwoo Lee), 최수진(Soojin Choi), 황지원(Jiwon Hwang)

Advisor:

Prof. 한재권(Jeakweon Han)


The Development of a Social Robot Accessible to the Deaf

HRI'21: ACM/IEEE International Conference on Human-Robot Interaction
Session: Student Design Competition

Abstract

Bada is a social robot that can interact with deaf individuals. Its appearance resembles that of a robotic vacuum cleaner, and its signaling of abnormal circumstances at home is modeled after the behavior of hearing dogs. Bada effectively reduces the loss of information during delivery by relaying messages in multiple ways, including a web service, text messages, visual representations, and a haptic interface. We developed Bada's interaction process through several tests. Its behavior, interface, and interaction model can contribute meaningfully to robotic accessibility technology.

Workflow


Specification

Hardware

Design


Architecture


Software

Interfaces

Web & Mobile


Robot Display


How to run

Requirements

Install

  • rplidar
  • laserfilter
  • realsense
  • move_base
  • robot_localization package
    • `sudo apt-get install libgeographic-dev`
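
These are standard ROS dependencies. The snippet below is a minimal install sketch assuming ROS Noetic on Ubuntu 20.04; the apt package names (e.g. `ros-noetic-rplidar-ros`) are our assumption and may differ from the versions the team used, so build from source where needed.

```bash
# Minimal sketch, assuming ROS Noetic on Ubuntu 20.04.
# The apt package names are assumptions; adjust the distro suffix
# (e.g. melodic) or build the drivers from source if your setup differs.
sudo apt-get update
sudo apt-get install -y \
  ros-noetic-rplidar-ros \
  ros-noetic-laser-filters \
  ros-noetic-realsense2-camera \
  ros-noetic-move-base \
  ros-noetic-robot-localization \
  libgeographic-dev
```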

Launch

1. Object detection

   ```bash
   # (on coral_ws/devel)
   source ./setup.bash
   roslaunch coral_usb edgetpu_object_detector.launch
   ```

2. Bringup

   ```bash
   # (on catkin_ws)
   roslaunch bada_g2_bringup bada_g2_robot.launch
   ```

3. Audio

   ```bash
   source catkin_ws/venv/bin/activate
   roslaunch bada_audio bada_audio.launch
   ```

4. Navigation

   ```bash
   # (on catkin_ws)
   roslaunch bada_g2_2dnav amcl_navigation.launch
   ```

5. Core

   ```bash
   # (on catkin_ws)
   rosrun bada_g2_core bada_g2_core_node
   ```

6. Web bridge

   ```bash
   # (on BADA_G2_web)
   npm run watch
   rosrun rosbridge_server rosbridge_websocket
   ```
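
Each of the six steps above needs its own terminal. As a convenience, the sketch below starts them in separate tmux windows; the `start_all.sh` name and the home-directory workspace paths (`~/coral_ws`, `~/catkin_ws`, `~/BADA_G2_web`) are assumptions, not part of this repository, so adjust them to your own checkout.

```bash
#!/usr/bin/env bash
# start_all.sh -- hypothetical convenience script (not part of this repo).
# Starts each BADA G2 component in its own tmux window.
# Adjust the workspace paths below to match your machine.
set -e

SESSION=bada_g2

tmux new-session -d -s "$SESSION" -n detect \
  "bash -c 'source ~/coral_ws/devel/setup.bash && roslaunch coral_usb edgetpu_object_detector.launch'"
tmux new-window -t "$SESSION" -n bringup \
  "bash -c 'source ~/catkin_ws/devel/setup.bash && roslaunch bada_g2_bringup bada_g2_robot.launch'"
tmux new-window -t "$SESSION" -n audio \
  "bash -c 'source ~/catkin_ws/venv/bin/activate && roslaunch bada_audio bada_audio.launch'"
tmux new-window -t "$SESSION" -n nav \
  "bash -c 'source ~/catkin_ws/devel/setup.bash && roslaunch bada_g2_2dnav amcl_navigation.launch'"
tmux new-window -t "$SESSION" -n core \
  "bash -c 'source ~/catkin_ws/devel/setup.bash && rosrun bada_g2_core bada_g2_core_node'"
tmux new-window -t "$SESSION" -n web \
  "bash -c 'cd ~/BADA_G2_web && npm run watch'"
tmux new-window -t "$SESSION" -n bridge \
  "bash -c 'source ~/catkin_ws/devel/setup.bash && rosrun rosbridge_server rosbridge_websocket'"

tmux attach -t "$SESSION"
```

Detach with `Ctrl-b d`; the nodes keep running inside the tmux session.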

Additional Resources