Research

Current Research


Corneal imaging

  • Estimating Focused Object using Corneal Surface Image


Researchers are considering the use of eye tracking in head-mounted camera systems. Typical methods require detailed calibration in advance, but long periods of use degrade the calibration between the eye camera and the scene camera. In addition, even if the point-of-regard is estimated with a portable eye tracker, the focused object itself may not be identified. We therefore propose a novel method for estimating the object on which a user is focused, where an eye camera captures the reflection on the corneal surface. Eye and environment information can be extracted from the corneal surface image simultaneously. We use inverse ray tracing to rectify the reflected image and the scale-invariant feature transform (SIFT) to identify the object at which the point-of-regard is located. Unwarped images can also be generated continuously from corneal surface images. We consider that the proposed method could be applied to a guidance system, and we confirmed the feasibility of this application in experiments estimating the focused object and the point-of-regard.
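
As an illustration of the matching step, here is a minimal Python sketch that matches an unwarped corneal image against candidate object images with SIFT and Lowe's ratio test (OpenCV; the function name and thresholds are our own assumptions, not taken from the paper):

    import cv2

    def match_focused_object(unwarped_img, candidate_imgs, ratio=0.75):
        """Return the index of the candidate object image that best matches
        the unwarped corneal reflection, via SIFT features and a ratio test."""
        sift = cv2.SIFT_create()
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        _, des_q = sift.detectAndCompute(unwarped_img, None)
        best_idx, best_count = -1, 0
        for i, cand in enumerate(candidate_imgs):
            _, des_c = sift.detectAndCompute(cand, None)
            if des_q is None or des_c is None:
                continue
            pairs = matcher.knnMatch(des_q, des_c, k=2)
            # Lowe's ratio test keeps only distinctive correspondences.
            good = [p for p in pairs if len(p) == 2
                    and p[0].distance < ratio * p[1].distance]
            if len(good) > best_count:
                best_idx, best_count = i, len(good)
        return best_idx, best_count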

  • Remote Corneal Imaging by Integrating a 3D Face Model and an Eyeball Model


In corneal imaging, a 3D eyeball model is essential for generating an undistorted image, so the relationship between the eye and the eye camera is conventionally fixed with a head-mounted device. Remote corneal imaging has several potential applications, such as surveillance systems and driver monitoring. We therefore integrated a 3D eyeball model with a 3D face model to enable remote corneal imaging. Evaluation experiments confirmed its feasibility: the center of the eyeball can be estimated from face tracking, and corneal imaging can thus function as continuous remote eye tracking.
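
A minimal sketch of the geometric idea, assuming face tracking yields a head pose (R, t) and the eyeball center is a fixed offset in the face model's coordinate frame (the offset values below are hypothetical placeholders):

    import numpy as np

    def eyeball_center_world(R, t, offset_face=np.array([0.03, 0.0, -0.013])):
        """Transform a fixed eyeball-center offset, defined in the 3D face
        model's frame (meters, hypothetical values), into the camera frame
        using the head pose (R: 3x3 rotation, t: translation) from face
        tracking, e.g., as recovered by solving PnP on facial landmarks."""
        return R @ offset_face + t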

  • Tracking Iris and Pupil Simultaneously using RGB-IR Camera


We propose a novel eye-tracking method that uses a multispectral camera to simultaneously track the pupil and recognize the iris. Our hybrid approach combines existing methods so as to compensate for the weaknesses each exhibits when used alone. Significantly, our method takes movements of the eye's center of rotation into consideration, which most earlier studies treated as a static point. It also allows the pupil diameter to be measured quantitatively with just a single camera. To confirm the effectiveness of the method, we conducted two experiments estimating the area and shape of the iris, the point-of-gaze, and the size of the pupil. We observed that the proposed method outperforms previous methods, particularly when the eye is moved to the extreme inner corner of its socket.
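
The following sketch illustrates one plausible per-frame step under our own assumptions: dark-pupil detection in the IR channel and circular iris localization in the RGB channel. All thresholds and the Hough-based iris detector are stand-ins, not the paper's actual pipeline:

    import cv2

    def track_pupil_and_iris(rgb, ir):
        """Illustrative per-frame processing for an RGB-IR camera."""
        # Pupil: appears dark under off-axis IR illumination, so threshold
        # the IR image and fit an ellipse to the largest dark blob.
        _, mask = cv2.threshold(ir, 40, 255, cv2.THRESH_BINARY_INV)
        cnts, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
        pupil = None
        if cnts:
            c = max(cnts, key=cv2.contourArea)
            if len(c) >= 5:
                pupil = cv2.fitEllipse(c)  # center and axes -> diameter
        # Iris: one simple option is a circular edge search in the RGB image.
        gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
        circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5,
                                   minDist=100, param1=100, param2=30,
                                   minRadius=20, maxRadius=80)
        iris = circles[0, 0] if circles is not None else None
        return pupil, iris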

Real World

  • Estimating 3D Point-of-regard and Visualizing Gaze Trajectories under Natural Head Movements


The portability of eye-tracking systems encourages the development of techniques for estimating the 3D point-of-regard. Unlike conventional methods, which estimate the position in the 2D image coordinates of the mounted camera, such a technique can represent richer gaze information for a person moving through a larger area. We propose a method for estimating the 3D point-of-regard, together with a technique for visualizing gaze trajectories, under natural head movements with a head-mounted device. We employ visual SLAM to estimate the head pose and extract environmental information. Even when the head moves dynamically, the proposed method can obtain the 3D point-of-regard, and gaze trajectories are appropriately overlaid on the scene camera image.
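
As a simplified stand-in for the estimation step, this sketch picks the visual-SLAM map point closest to the gaze ray as the 3D point-of-regard (the actual method may intersect or triangulate differently):

    import numpy as np

    def point_of_regard_3d(eye_origin, gaze_dir, map_points):
        """Select the SLAM map point (N x 3 array) nearest to the gaze ray
        cast from eye_origin along gaze_dir, both in world coordinates."""
        d = gaze_dir / np.linalg.norm(gaze_dir)
        v = map_points - eye_origin          # vectors to each map point
        t = np.clip(v @ d, 0.0, None)        # ray parameter, no points behind
        dist = np.linalg.norm(v - np.outer(t, d), axis=1)
        return map_points[np.argmin(dist)]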

  • Estimating Point-of-Gaze using Smooth Pursuit Eye Movements without Implicit and Explicit User-Calibration

Detecting the point-of-gaze in the real world is a challenging problem in eye-tracking applications. Conventionally, the point-of-gaze is estimated using geometric constraints, and user calibration is required. In addition, the distances to focused targets in the real world are large and variable. A calibration-free approach without geometric constraints is therefore needed. Recent studies have investigated smooth pursuit eye movements (smooth pursuits) for human-computer interaction, and we consider that smooth pursuits can also be employed in eye tracking. We therefore developed a method for estimating the point-of-gaze using smooth pursuits, without any implicit or explicit user calibration. Interest points are extracted from the scene image, and the point-of-gaze is detected as the point whose motion is most strongly correlated with the eye movements. A comparative experiment in a real environment demonstrated the feasibility of the proposed method.
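
A minimal sketch of the pursuit-correlation idea: correlate the eye trajectory with each tracked interest point's trajectory over a time window and select the best-matching point (windowing and rejection thresholds omitted; names are our own):

    import numpy as np

    def best_correlated_point(eye_xy, point_tracks):
        """eye_xy: (T, 2) eye positions; point_tracks: (N, T, 2) tracked
        interest points. Returns the index and score of the point whose
        motion correlates best with the eye movements."""
        def corr(a, b):
            a, b = a - a.mean(), b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return (a * b).sum() / denom if denom > 0 else 0.0
        # Require both axes to correlate, as in Pursuits-style selection.
        scores = [min(corr(eye_xy[:, 0], p[:, 0]),
                      corr(eye_xy[:, 1], p[:, 1])) for p in point_tracks]
        return int(np.argmax(scores)), max(scores)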


Polarization

  • Cross-Ratio Based Gaze Estimation using Polarization Camera System


In cross-ratio based eye tracking, near-infrared light is typically emitted, with at least four LEDs located at the corners of the display to detect the screen plane. However, prolonged exposure to near-infrared light can fatigue the user. In this study, we therefore attempted to extract the screen area correctly without emitting near-infrared light. Displays contain an internal polarizing filter, so the visibility of the screen can be controlled by the polarization direction of an external polarizing filter. We propose cross-ratio based gaze estimation using a polarization camera system we developed, which captures two images of different polarization angles simultaneously. We confirmed that the point-of-gaze can be estimated from the screen reflection detected by computing the difference between the two images, without near-infrared emission.
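
A minimal sketch of the screen-extraction step, assuming the camera provides two registered grayscale images at different polarization angles; the threshold value is an assumption:

    import cv2
    import numpy as np

    def screen_reflection_mask(img_0deg, img_90deg, thresh=30):
        """The linearly polarized display light changes intensity strongly
        between the two polarization channels, while unpolarized ambient
        light barely does; a simple difference isolates the screen area."""
        diff = cv2.absdiff(img_0deg, img_90deg)
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        # Small morphological opening removes speckle noise.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                                np.ones((3, 3), np.uint8))
        return mask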

  • Screen corner detection using polarization camera for cross-ratio based gaze estimation


Eye tracking, which measures the line of sight, is expected to advance as an intuitive and rapid input method for user interfaces. The cross-ratio based method, which calculates the point-of-gaze using homography matrices, has attracted attention because it does not require hardware calibration of the geometric relationship between the eye camera and the screen. However, it requires near-infrared (NIR) light-emitting diodes (LEDs) attached around the display to detect the screen corners. Removing this requirement would allow cross-ratio based gaze estimation to spread more easily. We therefore propose using a polarization camera to detect the screen area reflected on the corneal surface. Because the light radiated from the display is linearly polarized by its internal polarization filter, the reflection of the display is easily detected in the polarized image. With the proposed method, the screen corners can be determined without NIR LEDs, and the point-of-gaze can be estimated from the corners detected on the corneal surface. We investigated the accuracy of the estimated point-of-gaze under various illumination and display conditions. Because the proposed method requires no infrared light sources at the display corners, cross-ratio based gaze estimation could be adopted widely in commercial products.
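
For illustration, a homography-based variant of the cross-ratio idea, assuming the four reflected screen corners and the pupil center have already been detected on the cornea (simplified; the full cross-ratio method involves additional correction):

    import cv2
    import numpy as np

    def gaze_from_corners(corneal_corners, screen_w, screen_h, pupil_center):
        """Map the pupil center through the homography between the four
        screen corners reflected on the cornea (in image pixels, ordered
        TL, TR, BR, BL) and the physical screen corners."""
        screen = np.float32([[0, 0], [screen_w, 0],
                             [screen_w, screen_h], [0, screen_h]])
        H, _ = cv2.findHomography(np.float32(corneal_corners), screen)
        p = cv2.perspectiveTransform(np.float32([[pupil_center]]), H)
        return p[0, 0]  # estimated point-of-gaze in screen coordinates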

  • Polarized Near-Infrared Light Emission for Eye Gaze Estimation

The number of near-infrared light-emitting diodes (LEDs) used in eye tracking is increasing to improve accuracy and robustness, and when multiple light sources are applied, it is necessary to determine the identifier (ID) of each LED. We therefore propose polarized near-infrared light emission for eye gaze estimation. We succeeded in determining the IDs of the LEDs using polarization information. In addition, we remove glints from the cornea to detect the pupil center correctly. We confirmed the effectiveness of polarized near-infrared light emission through evaluation experiments.
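
A hypothetical sketch of the ID rule, assuming two LEDs polarized at orthogonal angles and a camera that captures both polarization channels; the decision rule below is our own illustration, not the paper's:

    def glint_led_id(glint_int_0deg, glint_int_90deg, eps=1e-6):
        """Each LED carries its own polarization angle, so the ratio of a
        glint's intensity between the two polarization channels indicates
        which LED produced it (two orthogonal angles assumed here)."""
        ratio = glint_int_0deg / (glint_int_90deg + eps)
        return 0 if ratio > 1.0 else 1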

High-speed

  • Eye Gaze Estimation using Imperceptible Marker Presented on High-Speed Display

Advanced eye-tracking methods require a dedicated display equipped with near-infrared light-emitting diodes (LEDs). This requirement hinders the widespread adoption of such methods, and some glints may go undetected when a large display is used. To avoid these problems, we propose eye gaze estimation using imperceptible markers presented on a commercially available high-speed display. The marker reference points reflected on the cornea are extracted instead of glints, and the point-of-gaze is estimated using the cross-ratio method.
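
An illustrative modulation scheme (not necessarily the paper's exact one): show the marker with opposite signs in consecutive frames of the high-speed display so the eye temporally averages it out, and recover it by frame differencing with a synchronized camera:

    import numpy as np

    def embed_marker_pair(frame, marker, amp=4):
        """frame: uint8 image; marker: array in {-1, 0, 1}. At high refresh
        rates the two opposite-signed frames average to the original image,
        making the marker imperceptible to the viewer."""
        f = frame.astype(np.int16)
        a = np.clip(f + amp * marker, 0, 255).astype(np.uint8)
        b = np.clip(f - amp * marker, 0, 255).astype(np.uint8)
        return a, b

    def decode_marker(cam_a, cam_b):
        """A camera synchronized to the display recovers the marker
        pattern by differencing the two captured frames."""
        return cam_a.astype(np.int16) - cam_b.astype(np.int16)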

Tactile Sensing

  • Vision-based tactile sensing using multiple contact images generated by RFTIR

Current vision-based tactile sensors have several limitations, such as their size and measurable surface. We therefore propose a novel vision-based tactile sensor based on re-propagated frustrated total internal reflection (FTIR). Part of the FTIR generated by a contact is re-propagated through the medium, and the FTIR is observed from the side of the medium. We validate the physical principle of observing multiple contact images through simulations. In addition, we developed a prototype system that estimates the contact position from these observations using regression algorithms. Finally, we performed several experiments to confirm the feasibility of contact estimation based on the re-propagated FTIR.
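
A minimal sketch of the learning-based contact estimation, using flattened, downsampled side-view images and an off-the-shelf regressor; the actual features and regression algorithm may differ:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def train_contact_regressor(images, positions):
        """images: list of equal-sized side-view FTIR frames;
        positions: (N, 2) ground-truth contact coordinates. Returns a
        fitted regressor mapping image appearance to contact position."""
        X = np.stack([im.ravel() for im in images]).astype(np.float32)
        model = RandomForestRegressor(n_estimators=100)
        model.fit(X, positions)
        return model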

Past Research

Acoustic Sensing

  • Active Bone-Conducted Sound Sensing for Estimating Joint Angle


We propose a wearable sensor system that measures the joint angles of the elbow and fingers using actively emitted vibration. The novelty of this research is the use of active sensing to measure a joint angle: vibration and sound are emitted into a bone, and a microphone receives the propagated signal.
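
A minimal sketch of the estimation idea, assuming the received amplitude varies monotonically with joint angle and a short per-user calibration provides reference pairs (function and variable names are our own):

    import numpy as np

    def joint_angle_from_amplitude(mic_signal, calib_angles, calib_amps):
        """Compute the RMS amplitude of the propagated vibration and
        interpolate it on a per-user calibration curve of
        (amplitude, angle) pairs measured beforehand."""
        rms = np.sqrt(np.mean(np.square(mic_signal)))
        order = np.argsort(calib_amps)  # np.interp needs ascending x
        return np.interp(rms, np.asarray(calib_amps)[order],
                         np.asarray(calib_angles)[order])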

  • Haptic-enabled Active Bone-Conducted Sound Sensing


We propose active bone-conducted sound sensing for estimating the joint angle of a finger while simultaneously serving as a haptic interface. For estimating the joint angle, an unnoticeable vibration is input to the finger; a perceptible vibration is additionally input to provide haptic feedback. The joint angle is estimated by switching the estimation model depending on whether haptic feedback is active, and the average estimation error is within about seven degrees.
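
A small sketch of the model-switching logic described above; feature extraction and the training of the two models are assumed to happen elsewhere:

    def estimate_angle(features, haptic_on, model_plain, model_haptic):
        """Switch between two pre-trained estimators depending on whether
        the perceptible haptic vibration is currently being input, since
        the added vibration changes the received signal."""
        model = model_haptic if haptic_on else model_plain
        return model.predict([features])[0]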

  • Hand Pose Estimation based on Active Bone-Conducted Sound Sensing


Estimating hand poses is essential for intuitive user interfaces. In virtual reality, an infrared (IR) camera is used for hand tracking, enabling direct manipulation with the hands; wearable devices have also attracted attention because of their portability. We previously developed a wearable method that estimates a joint angle from the amplitude of vibration, but the estimation was limited to particular joints. Our proposed method therefore determines the hand pose based on active bone-conducted sound sensing. We employ the power spectral density as the feature, enabling the hand pose to be classified with a support vector machine. We confirmed the recognition accuracy and feasibility of the proposed method through evaluation experiments.
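
A minimal sketch of this pipeline with assumed parameters (sampling rate, window length): power spectral density features via Welch's method, classified by an SVM:

    import numpy as np
    from scipy.signal import welch
    from sklearn.svm import SVC

    def psd_features(signal, fs=44100, nperseg=1024):
        """PSD of the received bone-conducted sound, on a log scale for
        numerical stability (fs and nperseg are assumptions)."""
        _, pxx = welch(signal, fs=fs, nperseg=nperseg)
        return 10 * np.log10(pxx + 1e-12)

    def train_pose_classifier(signals, pose_labels):
        """Fit an SVM on labeled recordings, one hand pose per class."""
        X = np.stack([psd_features(s) for s in signals])
        clf = SVC(kernel="rbf")
        clf.fit(X, pose_labels)
        return clf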

  • Estimating contact force of fingertip


This study proposes a method for estimating the contact force of the fingertip by actively inputting vibrations. The use of active bone-conducted sound sensing had been limited to estimating the joint angles of the elbow and fingers; we applied it to estimating the fingertip contact force. Unlike related work, the device need not be mounted on the fingertip, and tactile feedback is enabled using tangible vibrations.

  • Grip Force Estimation based on Active Bone-Conducted Sound Sensing


We propose a method for determining grip force based on active bone-conducted sound sensing, a form of active acoustic sensing. In our previous studies, we estimated joint angle, hand pose, and contact force by emitting a vibration into the body. To expand the applications of active bone-conducted sound sensing, we estimated grip force with a wrist-type device. The grip force was determined using the power spectral density as features and gradient boosted regression trees (GBRT). In evaluation experiments, the average error of the estimated grip force was around 15 N, and we confirmed that grip strength can be determined with high accuracy.
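
A sketch of the regression step under the same assumed PSD features as above; the GBRT hyperparameters are placeholders, not the paper's values:

    import numpy as np
    from scipy.signal import welch
    from sklearn.ensemble import GradientBoostingRegressor

    def train_grip_force_model(signals, forces_newton, fs=44100):
        """Fit a GBRT regressor from log-PSD features of the received
        vibration to grip force in newtons."""
        X = np.stack([10 * np.log10(welch(s, fs=fs, nperseg=1024)[1] + 1e-12)
                      for s in signals])
        model = GradientBoostingRegressor(n_estimators=200,
                                          learning_rate=0.05)
        model.fit(X, forces_newton)
        return model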

  • Contact Point Estimation using Pneumatic System Noise


Active acoustic sensing is widely used in various fields, including shape estimation of soft pneumatic actuators. Pneumatic systems frequently use air tubes, so it is essential to detect failures along the air path. Although acoustic sensing has been used to detect contact and identify the contact position along a tube, it had not been applied to pneumatic systems. We devised an acoustic sensing method for the air tubes of a pneumatic system. Because pneumatic system noise propagates through the air tube, we employ this noise itself instead of the conventional approach of emitting sound or vibration from an additional oscillator.
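
An illustrative sketch, assuming recordings of the propagated system noise labeled by discrete contact positions; the k-NN classifier and parameters are stand-ins for the actual estimator:

    import numpy as np
    from scipy.signal import welch
    from sklearn.neighbors import KNeighborsClassifier

    def train_contact_position_model(recordings, position_labels, fs=16000):
        """Contact along the tube changes how the pneumatic system noise
        propagates, so classify the recorded noise spectrum into one of
        the labeled contact positions."""
        X = np.stack([welch(r, fs=fs, nperseg=512)[1] for r in recordings])
        clf = KNeighborsClassifier(n_neighbors=3)
        clf.fit(np.log10(X + 1e-12), position_labels)
        return clf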