A signal processing perspective on human gait: Decoupling walking oscillations and gestures
  • Adrien Gregorj, Zeynep Yücel, Akito Monden, Masahiro Shiomi
  • The 4th International Conference on Interactive Collaborative Robotics (ICR 2019), pp. 75–85, Istanbul, Turkey, Aug. 2019. (Oral)

Abstract

This study focuses on gesture recognition in mobile interaction settings, i.e. when the interacting partners are walking. This kind of interaction requires a particular coordination, e.g. staying in the field of view of the partner, avoiding obstacles without disrupting the group composition, and sustaining joint attention during motion. Various studies in the literature have shown that gestures are closely involved in achieving such goals.

Thus, a mobile robot moving in a group with human pedestrians has to identify such gestures to sustain group coordination. However, decoupling gestures from the inherent walking oscillations is a significant challenge for the robot. To that end, we adopt a signal processing approach to detect arm gestures performed by human-human pedestrian pairs in video data recorded in uncontrolled settings. Namely, we exploit the fact that gait induces an inherent oscillatory motion at the upper limbs, independent of the view angle or the distance of the pedestrian to the camera, and we identify arm gestures as disturbances on these oscillations. In doing so, we use a simple pitch detection method from speech processing and assume that data exhibiting a low-frequency periodicity is free of gestures. In testing on this uncontrolled video data set, we achieve a detection rate of 0.80.
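To make the idea concrete, below is a minimal Python sketch of the kind of autocorrelation-based periodicity test the abstract describes, in the spirit of simple speech pitch detectors. It assumes a 1D arm trajectory (e.g. a tracked wrist coordinate) sampled at frame rate fs; the gait-frequency band, window length, hop size and threshold are illustrative assumptions, not the parameters used in the paper.

    import numpy as np

    def periodicity_strength(x, fs, fmin=0.5, fmax=3.0):
        # Normalized autocorrelation peak within the expected gait band
        # (fmin..fmax Hz), a simple pitch-style periodicity measure.
        x = x - np.mean(x)
        ac = np.correlate(x, x, mode="full")[len(x) - 1:]
        if ac[0] <= 0:          # flat (zero-energy) window
            return 0.0
        ac = ac / ac[0]         # ac[0] == 1 after normalization
        lo = int(fs / fmax)     # shortest lag of interest
        hi = min(int(fs / fmin), len(ac) - 1)
        if lo >= hi:
            return 0.0
        return float(np.max(ac[lo:hi]))

    def flag_gesture_windows(arm_signal, fs, win_s=2.0, hop_s=0.5, thresh=0.6):
        # Slide a window over the 1D arm trajectory; windows whose
        # gait-band periodicity falls below `thresh` lack the walking
        # oscillation and are flagged as gesture candidates.
        win, hop = int(win_s * fs), int(hop_s * fs)
        flags = []
        for start in range(0, len(arm_signal) - win + 1, hop):
            s = periodicity_strength(arm_signal[start:start + win], fs)
            flags.append((start / fs, s < thresh))
        return flags

For instance, with a 30 fps tracker one might call flag_gesture_windows(wrist_y, fs=30): a steady gait yields a strong autocorrelation peak near the arm-swing period, while a waving or pointing gesture disrupts that periodicity and drives the measure below the threshold.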

Download Links: Springer | University's Repository (TBA)

  • Last updated: 2020/05/25 19:08