Research

Our research spans the following areas, centered on eye tracking and vibration sensing.

  • Eye・Visual
  • Bone・Tactile
  • Robotics

  • Estimating Focused Object using Corneal Surface Image


Researchers have been exploring the use of eye tracking in head-mounted camera systems. Typical methods require detailed calibration in advance, but long periods of use disturb the calibration between the eye and the scene camera. In addition, even when a portable eye tracker estimates the point-of-regard, the focused object itself may remain unknown. We therefore propose a novel method for estimating the object a user is focused upon, in which an eye camera captures the reflection on the corneal surface. Eye and environment information can be extracted from the corneal surface image simultaneously. We use inverse ray tracing to rectify the reflected image and the scale-invariant feature transform (SIFT) to identify the object at which the point-of-regard is located. Unwarped images can also be generated continuously from corneal surface images. We consider that the proposed method could be applied to a guidance system, and we confirmed the feasibility of this application in experiments estimating the focused object and the point-of-regard.
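As a rough illustration of the matching stage only, the sketch below uses OpenCV's SIFT to match an already-unwarped corneal-reflection image against a candidate object image. The file names and the 0.75 ratio threshold are placeholders, not values from the study.

```python
import cv2

# Placeholder inputs: an unwarped corneal-reflection image and one
# candidate object image (hypothetical file names).
corneal = cv2.imread("unwarped_corneal.png", cv2.IMREAD_GRAYSCALE)
candidate = cv2.imread("candidate_object.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(corneal, None)
kp2, des2 = sift.detectAndCompute(candidate, None)

# Lowe's ratio test keeps only distinctive matches.
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = []
for pair in matches:
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

# A simple decision rule: the candidate with the most good matches
# near the point-of-regard is taken as the focused object.
print(f"{len(good)} good SIFT matches")
```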

  • Estimating 3D Point-of-Gaze and Focused Object

Unlike conventional portable eye-tracking methods that estimate the position of the mounted camera using 2D image coordinates, the techniques proposed here presents richer information about person’s gaze when moving over a wide area. They also include visualizing scanpaths when the user with a head-mounted device makes natural head movements. We employ a Visual SLAM technique to estimate the head pose and extract environmental information. When the person’s head moves, the proposed method obtains a 3D point-of-regard. Furthermore, scanpaths can be appropriately overlaid on image sequences to support quantitative analysis. Additionally, a 3D environment is employed to detect objects of focus and to visualize an attention map.
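The sketch below shows one way a 3D point-of-regard could be obtained under this setup: the gaze ray is expressed in world coordinates using the head pose from Visual SLAM, and the SLAM map point closest to the ray is returned. The function and its inputs are hypothetical simplifications, not the paper's exact pipeline.

```python
import numpy as np

def point_of_regard_3d(head_R, head_t, gaze_dir_head, cloud):
    """Return the map point closest to the gaze ray.

    head_R, head_t : head pose from Visual SLAM (3x3 rotation, 3-vector)
    gaze_dir_head  : unit gaze direction in the head frame (eye tracker)
    cloud          : (N, 3) environment points from the SLAM map
    """
    d = head_R @ gaze_dir_head
    d = d / np.linalg.norm(d)
    v = cloud - head_t
    t = np.clip(v @ d, 0.0, None)          # keep points in front of the viewer
    dist = np.linalg.norm(v - np.outer(t, d), axis=1)
    return cloud[np.argmin(dist)]

# Toy usage with a synthetic point cloud and an identity head pose.
cloud = np.random.default_rng(0).uniform(-2, 2, size=(500, 3))
print(point_of_regard_3d(np.eye(3), np.zeros(3), np.array([0.0, 0.0, 1.0]), cloud))
```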

  • Tracking Iris and Pupil Simultaneously using RGB-IR Camera

We propose a novel eye-tracking method that uses a multispectral (RGB-IR) camera to simultaneously track the pupil and recognize the iris. Our hybrid approach combines existing methods so as to compensate for the weaknesses each exhibits when used alone. Significantly, our method takes movements of the eye's center of rotation into consideration, whereas most earlier studies treated it as a static point. Additionally, our method allows the pupil diameter to be measured quantitatively with just a single camera. To confirm its effectiveness, we conducted two experiments in which we estimated the area and shape of the iris, the point-of-gaze, and the size of the pupil. We observed that the proposed method outperforms previous methods, particularly when the eye is turned to the extreme inner corner of its socket.
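As an illustration of the pupil-tracking half of such a hybrid pipeline, the sketch below shows a common dark-pupil baseline on the IR channel (threshold, largest contour, ellipse fit). It is not the full method, and the threshold value is a placeholder.

```python
import cv2
import numpy as np

def fit_pupil_ellipse(ir_frame, thresh=40):
    """Dark-pupil detection on the IR channel of an RGB-IR frame."""
    _, mask = cv2.threshold(ir_frame, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    if len(pupil) < 5:                     # fitEllipse needs at least 5 points
        return None
    (cx, cy), (w, h), _angle = cv2.fitEllipse(pupil)
    return (cx, cy), (w + h) / 2.0         # center and mean diameter in pixels

# Toy usage on a synthetic frame with a dark circular "pupil".
ir = np.full((240, 320), 255, np.uint8)
cv2.circle(ir, (160, 120), 30, 0, -1)
print(fit_pupil_ellipse(ir))
```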

  • Active Bone-Conducted Sound Sensing for Estimating Joint Angle

We propose a wearable sensor system that measures the joint angles of the elbow and fingers using actively emitted vibration. The novelty of this research is the use of active sensing to measure a joint angle: vibration and sound are emitted into a bone, and a microphone receives the propagated signal.
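A minimal sketch of the active-sensing loop, assuming a bone-conduction transducer and a contact microphone are reachable through the default audio path of the `sounddevice` library. The excitation frequency and the two-point amplitude calibration below are placeholder values, not parameters from the study.

```python
import numpy as np
import sounddevice as sd   # assumed audio path to transducer and microphone

FS = 44100

def measure_rms(freq=1000.0, dur=0.5):
    """Emit a sine through the transducer and record the vibration
    that propagated along the bone with a contact microphone."""
    t = np.arange(int(FS * dur)) / FS
    excitation = (0.2 * np.sin(2 * np.pi * freq * t)).astype(np.float32)
    rec = sd.playrec(excitation, samplerate=FS, channels=1)
    sd.wait()
    return float(np.sqrt(np.mean(rec ** 2)))

# Hypothetical two-point calibration: RMS recorded at two known angles.
RMS_AT_0, RMS_AT_90 = 0.012, 0.004   # placeholder values

def estimate_angle(rms):
    """Linearly interpolate the joint angle from the received amplitude."""
    frac = (rms - RMS_AT_0) / (RMS_AT_90 - RMS_AT_0)
    return float(np.clip(frac, 0.0, 1.0) * 90.0)
```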

  • Haptic-enabled Active Bone-Conducted Sound Sensing

We propose active bone-conducted sound sensing that estimates the joint angle of a finger while simultaneously serving as a haptic interface. For angle estimation, an unnoticeable vibration is input to the finger; a perceptible vibration is additionally input to provide haptic feedback. The joint angle is estimated by switching the estimation model depending on whether haptic feedback is active, and the average estimation error is within about seven degrees.
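The model-switching idea fits in a few lines. In this sketch, the two estimators are assumed to be regressors trained beforehand on vibration features recorded with and without the perceptible haptic vibration; the names are hypothetical.

```python
def estimate_joint_angle(features, haptics_on, model_silent, model_haptic):
    """Select the estimation model according to the current haptic state,
    since the perceptible feedback vibration changes the received signal."""
    model = model_haptic if haptics_on else model_silent
    return float(model.predict([features])[0])
```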

  • Hand Pose Estimation based on Active Bone-Conducted Sound Sensing

Estimating hand poses is essential for intuitive user interfaces. In virtual reality, an infrared (IR) camera is used for hand tracking, allowing direct manipulation with the hands. Wearable devices have also attracted attention because of their portability. We previously developed a wearable method that estimates a joint angle from the amplitude of a propagated vibration; however, the joints whose angles can be estimated this way are limited. Our proposed method therefore determines the hand pose from active bone-conducted sound sensing. We employ the power spectral density as a feature, enabling the hand pose to be classified with a support vector machine, as sketched below. We confirmed the recognition accuracy and the feasibility of the proposed method through evaluation experiments.
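A minimal sketch of this classification stage, using Welch's method for the power spectral density and an RBF-kernel SVM from scikit-learn. The recordings and pose labels are synthetic placeholders standing in for real sensor data.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

FS = 44100

def psd_features(signal, nperseg=1024):
    """Power spectral density of the received vibration as a feature vector."""
    _, pxx = welch(signal, fs=FS, nperseg=nperseg)
    return 10 * np.log10(pxx + 1e-12)    # log scale stabilizes the features

# Synthetic placeholders: 20 vibration clips and 4 hand-pose classes.
rng = np.random.default_rng(0)
recordings = rng.standard_normal((20, FS))
labels = rng.integers(0, 4, size=20)

X = np.array([psd_features(s) for s in recordings])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:1]))                # classify one clip
```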

  • Estimating Contact Force of a Fingertip

This study proposes a method for estimating the contact force of the fingertip by actively inputting vibrations. The use of active bone-conducted sound sensing had previously been limited to estimating the joint angles of the elbow and fingers; here we apply it to estimating fingertip contact force. Unlike related work, no device needs to be mounted on the fingertip, and tactile feedback can be provided using tangible vibrations.
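As a rough sketch only: assuming, for illustration, that firmer fingertip contact monotonically damps the vibration power received in the excitation band, the force can be read off a calibration curve. The band limits and calibration values below are invented placeholders, not measurements from the study.

```python
import numpy as np
from scipy.signal import welch

FS = 44100

def band_power(signal, lo=800.0, hi=1200.0):
    """Received power inside the excitation band (assumed to fall as the
    fingertip presses harder and damps the transmitted vibration)."""
    f, pxx = welch(signal, fs=FS, nperseg=2048)
    band = (f >= lo) & (f <= hi)
    return float(np.sum(pxx[band]) * (f[1] - f[0]))

# Hypothetical calibration measured at known forces (placeholder values).
FORCES_N = np.array([0.0, 1.0, 2.0, 4.0])
POWERS = np.array([3.2e-4, 2.1e-4, 1.4e-4, 0.7e-4])

def estimate_force(signal):
    # np.interp needs increasing x, so reverse the decreasing power axis.
    return float(np.interp(band_power(signal), POWERS[::-1], FORCES_N[::-1]))
```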

  • Grip Force Estimation based on Active Bone-Conducted Sound Sensing

We propose a method for determining grip force based on active bone-conducted sound sensing, a form of active acoustic sensing. In our previous studies, we estimated the joint angle, hand pose, and contact force by emitting a vibration into the body. To expand active bone-conducted sound sensing to a further application, we estimate grip force with a wrist-worn device. The grip force is determined using the power spectral density as features and gradient boosted regression trees (GBRT). In evaluation experiments, the average error of the estimated grip force was around 15 N, and we confirmed that grip strength could be determined with high accuracy.
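A minimal sketch of this regression stage with scikit-learn's gradient boosted regression trees on PSD features. The wrist recordings and force labels are synthetic placeholders standing in for real calibration data.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import GradientBoostingRegressor

FS = 44100

def psd_features(signal, nperseg=1024):
    """Log power spectral density of the received vibration."""
    _, pxx = welch(signal, fs=FS, nperseg=nperseg)
    return 10 * np.log10(pxx + 1e-12)

# Synthetic placeholders: 30 wrist recordings with known grip forces [N].
rng = np.random.default_rng(0)
clips = rng.standard_normal((30, FS // 2))
grip_force_n = rng.uniform(0, 200, size=30)

X = np.array([psd_features(c) for c in clips])
gbrt = GradientBoostingRegressor().fit(X, grip_force_n)
print(f"predicted grip force: {gbrt.predict(X[:1])[0]:.1f} N")
```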

Coming soon…