An improved 3D shape haptic rendering algorithm for finger mounted vibrotactile device

Wu Juan Han Xiao Yang Huaining

(State Key Laboratory of Bioelectronics, Southeast University, Nanjing 210096, China) (School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China)

Abstract: To improve the sense of reality in perception, an improved 3D shape haptic rendering algorithm is put forward based on a finger mounted vibrotactile device. The principle is that interactive information and shape information are conveyed to users when they touch virtual objects on mobile terminals, by attaching vibrotactile feedback to a fingertip. The extraction of shape characteristics, the interactive information and the mapping of shape information to vibration stimulation are the key parts of the proposed algorithm for realizing realistic tactile rendering. The contact status of the interaction process, the height information and the local gradient of the touch point are regarded as shape information and are used to control the vibration intensity, rhythm and distribution of the vibrators. With different contact statuses and shape information, the vibration pattern can be adjusted in time to imitate the outlines of virtual objects. Finally, the effectiveness of the algorithm is verified by shape perception experiments. The results show that the improved algorithm is effective for 3D shape haptic rendering.

Keywords: finger mounted vibrotactile device; 3D shape haptic rendering algorithm; contact status; local gradient

Nowadays, mobile terminals are widely used in daily life. With mobile terminals, people can use gesture language on two-dimensional planes to achieve a better interactive experience[1-2]. Nevertheless, this kind of interaction depends heavily on vision, which is not friendly to visually impaired people. To address this problem, tactile feedback technology is applied to mobile terminals. In this way, visually impaired people are able to perceive virtual objects or read books on a touch screen with tactile feedback[3]. Moreover, the sense of reality in perception is improved for users interacting with terminals equipped with this technology. For example, users are able to feel the texture of products such as clothes when shopping online[4]. Therefore, it is valuable to design suitable tactile interfaces on mobile terminals for different purposes.

One of the popular displays on a tactile interface is shape rendering. In relevant studies and experiments on devices for shape rendering, it has been found that users are sensitive to shape features such as the height and gradient of an object's surface. Wijntjes et al.[5] showed that the height information of an object played a major role in shape perception through experiments with a height-adjustable surface. Gordon and Morison[6] found that the curvature perception of an object was mainly related to the gradient of the object surface, and they also discovered the gradient threshold of human curvature perception. In addition, Minsky et al.[7] put forward the concept of the local gradient, defined as the dot product of the movement direction vector and the gradient vector when users explore the surface of an object. Their experiments proved that a person could perceive shape through the local gradient information. Pont et al.[8] found from their experimental results that it was easy to recognize a shape when combining the surface height information with the gradient.

For shape rendering, several approaches to displaying shape exist, such as force feedback equipment[9-11], dynamic pin arrays[12-13] and so on. Recent studies showed that vibrotactile stimuli are fundamental in tactile feedback technology. The TeCIP Institute in Italy developed a set of wearable fingertip feedback devices to reproduce three-dimensional shapes. The device used a mechanical structure driven by servo motors to recreate the concave and convex directions of the virtual object surface, while the tactility of the surface was stimulated by a ring motor. Devices based on vibrotactile feedback are portable and convenient for interaction with mobile terminals and can generate rich vibration patterns to convey more information about a shape. Pre-experiments on vibration properties showed that users are sensitive to the intensity and rhythm of vibration. Based on a similar finger mounted device, a shape haptic rendering algorithm was designed by Zhong et al.[14] In that algorithm, the amplitude and direction of the lateral force at the touch point, regarded as shape information, were used to control the vibration intensity and the distribution of the fingertip vibrators. However, for lack of sufficient shape characteristic information, that algorithm did not perform well on all experimental shapes.

Based on the previous work on shape haptic rendering, an improved shape tactile rendering algorithm for a finger mounted vibrotactile device is proposed in this paper. The device is built by placing three small piezoelectric actuators around a finger and attaching a force sensor. It communicates with a tablet through a Wi-Fi module. In the proposed algorithm, features including the height information and the local gradient are all considered as shape characteristic information. The sensor attached to the finger device is used to extract surface information such as shape data. Then, the features are mapped, respectively, to the intensity, rhythm and distribution of the vibrotactile stimulus. Finally, experiments are designed to evaluate the performance of the proposed algorithm.

1 Design of the Fingertip Device

The fingertip device used in this paper is shown in Fig.1. The distributed vibrotactile stimulus is produced by adjusting the vibration intensity and rhythm of piezoelectric actuators attached to a plastic finger mount. Compared with the previous finger mounted device[14], the number of vibrators is reduced from four to three in the current device, as in the dashed part of Fig.2(a). In the former experiments, users were easily confused by the vibrations of closely spaced actuators. Therefore, to simulate different touching statuses and display shape outlines clearly, the vibrators are attached in three directions inside the finger mount. The structure of the interactive module with the attached vibrators is shown in Fig.2(a), and the vibration distribution on the fingertip is shown in Fig.2(b). The whole experimental system consists of two modules: the interactive module and the control module. The interactive module is the interactive unit with three piezoelectric actuators mounted on the finger. Fig.2(c) shows a subject's finger wearing the finger mounted device while interacting with a tablet. The control module contains five units: the Wi-Fi communication unit, the host MCU unit, the active force processing unit, the piezoelectric actuator driver unit and the power supply unit. They collaborate to communicate with the mobile terminal, generate the driving signals for the actuators and supply power for the whole system.

Fig.1 Wearable finger device communicating with the tablet

(a) (b)

(c)
Fig.2 The interactive module of the whole experimental system. (a) Structure of the interactive module; (b) The vibration distribution on the fingertip; (c) A subject’s finger works with the fingertip device

2 Improved Shape Haptic Rendering Algorithm

In this section, an improved shape rendering algorithm is proposed. Based on a similar wearable finger device, Zhong et al.[14] put forward a lateral-force-based algorithm for shape rendering. The lateral-force-based illusion was first defined, and a virtual shape outline could be displayed by rendering the corresponding lateral force above it, as shown in Fig.3. Zhong et al.[14] regarded the amplitude and direction of the lateral force at the touch point as the shape feature, which was then mapped to the vibration intensity and the distribution of the vibrators. Because the extracted shape feature information was not sufficient to fully display the objects' shapes (such as the height information and the details of the objects' outlines), and because the vibration patterns were limited, Zhong et al.[14] found that the algorithm did not work well in displaying inclined and flat surfaces.

Fig.3 Lateral-force-based haptic illusion

In relevant studies and experiments on tactile interaction and shape rendering, the contact status and the shape are the most essential tactile characteristics, and shape perception is derived from their combination. To improve the algorithm, the shape gradient along the movement direction has to be detected first. This main characteristic parameter is then used to control the vibration distribution and patterns so as to map the perception to the vibration stimulus. The framework of the algorithm is shown in Fig.4.

Fig.4 Procedure of the shape rendering algorithm

In Fig.4, the image height distribution and the touch position are regarded as the inputs of the algorithm. The image height distribution is processed to obtain the height information, which is used to calculate the driving voltage that determines the intensity of the vibrotactile stimulus. From the touch position, the local gradient can be calculated to reflect the characteristics of the surface curvature. To strengthen the realism of virtual surface perception, the local gradient is used to generate the rhythm of the vibrotactile stimulus. From the gradient, the distribution of the vibrotactile stimulus can also be obtained.

The following are the key steps of the shape rendering algorithm.

2.1 Local gradient calculation

When the finger slides over virtual objects on the touch screen, the touch position and finger movement are detected by the touch screen of the mobile terminal. The gradient of the virtual object's height is defined as

∇h(x, y) = (∂h(x, y)/∂x, ∂h(x, y)/∂y)    (1)

where h(x, y) is the height information of the image, which is usually calculated in pre-processing.
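As an illustrative sketch (not the authors' implementation), the height gradient field of Eq.(1) can be computed from a grey-level height map with central differences; the assumption here is that the grey level of the displayed image directly encodes the surface height:

```python
import numpy as np

def height_gradient(h):
    """Central-difference gradient of a 2-D height map h[y, x] (Eq. (1)).
    np.gradient returns the axis-0 (y) derivative first, then axis-1 (x)."""
    dh_dy, dh_dx = np.gradient(h.astype(float))
    return dh_dx, dh_dy

# Toy inclined surface: height rises linearly from left to right,
# so dh/dx = 1 and dh/dy = 0 everywhere.
h = np.tile(np.arange(5, dtype=float), (5, 1))
gx, gy = height_gradient(h)
```

In practice this field would be precomputed once per image, so only a lookup is needed at each touch sample.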

The unit direction vector of the finger movement is defined as

V_i = (P_i - P_{i-1}) / |P_i - P_{i-1}|    (2)

where P_i is the finger position vector at time t_i. When the touch screen of the mobile terminal has a low resolution, the average value of V_i over several samples is usually used.
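A minimal sketch of Eq.(2), assuming touch positions arrive as (x, y) pixel coordinates from successive screen samples; the zero guard for a stationary finger is an added assumption, not from the paper:

```python
import numpy as np

def movement_direction(p_prev, p_curr, eps=1e-9):
    """Unit direction vector V_i of the finger movement (Eq. (2)).
    Returns the zero vector when the finger has not moved."""
    d = np.asarray(p_curr, dtype=float) - np.asarray(p_prev, dtype=float)
    n = np.linalg.norm(d)
    return d / n if n > eps else np.zeros_like(d)

# 3-4-5 right triangle: displacement (3, 4) normalizes to (0.6, 0.8)
v = movement_direction((100, 200), (103, 204))
```

On a low-resolution screen, averaging several consecutive direction vectors before normalizing, as the text suggests, would smooth the jitter in V_i.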

The local gradient is defined as the dot product of the image height gradient and the unit vector of the finger movement direction. It is calculated as

f = |∇h| × |V_i| × cos θ_i    (3)

where f is the local gradient and θ_i is the angle between the two vectors.
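Since V_i is a unit vector, Eq.(3) reduces to the directional derivative of the height along the movement. A hedged sketch (illustration only, not the authors' code):

```python
import numpy as np

def local_gradient(grad_h, v):
    """Local gradient f = |grad h| |V_i| cos(theta_i) (Eq. (3)).
    With V_i a unit vector this equals the dot product grad_h . V_i,
    i.e. the rate of height change along the finger movement."""
    return float(np.dot(grad_h, v))

# Surface rising only in x, finger moving along (0.6, 0.8):
# only the x-component of the motion picks up the slope.
f = local_gradient(np.array([1.0, 0.0]), np.array([0.6, 0.8]))
```

A positive f means the surface rises along the sweep, a negative f that it falls; this sign drives the vibrator selection in Section 2.2.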

2.2 Distribution of vibrotactile stimuli

When contacting real objects, the fingertip deforms under stress in three directions: down, left and right. The perception of the local surface orientation is related to both the finger deformation and the image gradient. The actuators around the fingertip are used to express the image height information. The vibration position for each object shape is shown in Tab.1.

Tab.1 Vibration position of the object shape

Object shape                                      Vibration position
Inclined surface (declining from left to right)   Left
Flat surface                                      Down
Inclined surface (declining from right to left)   Right

2.3 Intension and rhythm of vibrotactile stimulus mapping

The height information and the gradient are mapped to the vibration intensity and rhythm, respectively, for displaying the shape. The height is the main feature of an object's outline and can be regarded as the spatial amplitude of the image. The local gradient is the rate of height change along the sliding direction. Similarly, the vibration intensity is the intuitive perception of vibration and is changed by the vibration frequency, while the rhythm reflects the rate of change of the vibration frequency in time. Based on these similarities, the improved algorithm maps the height to the vibration intensity and the gradient to the vibration rhythm. Both relationships are directly proportional, which can be expressed as

R = k1 × |f|
D = k2 × |h|    (4)

(4)

where R is the vibration rhythm; D is the vibration intensity; k1 is the ratio of the maximum vibration rhythm to the maximum local gradient; and k2 is the ratio of the maximum vibration intensity to the maximum touch position height. From tactile perception experiments studying the influence of the modulating period on the vibration rhythm, the relationship between the vibration rhythm and the period is

R = -0.5T + 105    (5)

whereTis the modulating period. Thus, the vibration rhythm can be obtained.
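The mapping of Eqs.(4) and (5) can be sketched as follows. The calibration maxima (h_max, f_max, r_max, d_max) are assumed device constants not given in the paper, and the scale factors are chosen so that R saturates at r_max and D at d_max:

```python
def vibration_commands(h, f, h_max, f_max, r_max, d_max):
    """Map the touch-point height h and local gradient f to the vibration
    rhythm R and intensity D (Eq. (4)), then recover the modulating
    period T by inverting Eq. (5): R = -0.5*T + 105."""
    k1 = r_max / f_max            # rhythm scale of Eq. (4)
    k2 = d_max / h_max            # intensity scale of Eq. (4)
    rhythm = k1 * abs(f)
    intensity = k2 * abs(h)
    period = (105.0 - rhythm) / 0.5   # inverse of Eq. (5)
    return rhythm, intensity, period

# Example with assumed maxima: f_max = 2, h_max = 4, r_max = 100, d_max = 10
r, d, T = vibration_commands(h=2.0, f=1.0, h_max=4.0, f_max=2.0,
                             r_max=100.0, d_max=10.0)
```

Note that steeper local gradients yield a larger R and hence, by Eq.(5), a shorter modulating period, i.e. faster pulsing.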

3 Performance of the Finger Mounted Device

To evaluate the performance of the improved shape rendering algorithm, the experiments of shape perception were designed.

3.1 Participants

Eighteen subjects (10 males and 8 females), aged from 23 to 28 years, are all right-handed. All of the subjects use the right hand to perceive the different shapes.

3.2 Virtual experimental shapes

In this experiment, four basic shape units are chosen for perception: an edge, a flat surface, an inclined surface, and quadric surfaces including a convex surface and a concave surface. The vibration patterns for the experimental shapes, calculated by the improved algorithm, are shown in Fig.5. Motors L, D and R represent the motors attached to the interactive module in the three directions: left, down and right, respectively. When the subject's finger with the device slides on the tablet, the subject feels the simulated virtual object through the vibrotactile stimulus. The vibration change mainly occurs while exploring in the horizontal direction, as shown in Fig.5. For example, in the process of touching the convex quadric surface, when the subject slides horizontally across the convex picture on the screen from left to right during t_1 to t_5, the voltage waves of the three motors change as shown in the left part of Fig.5(d). However, if the subject slides vertically in the central part at time t_3, only the down motor vibrates, as in the motor status at t_3. Meanwhile, when the subject perceives the surface from left to right during t_1 to t_5, the order of the vibrating motors changes: from t_1 to t_5, motor R, motor D and motor L vibrate in order, which imitates the real contact process. In reality, when the finger slides from left to right over a convex surface, the right side of the finger usually contacts the surface first, then the bottom of the finger contacts the peak of the surface, and after that the left side of the finger contacts the rest of the surface. The order of the vibrating motors depends on the direction of the local gradient at the finger contact point, and the motors vibrate in different rhythms as the object's curvature changes, calculated by Eq.(4). The vibration intensity also changes with the image height according to Eq.(4). The vibration patterns of the other shapes are likewise shown in Fig.5.
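The R-then-D-then-L firing order described above can be reproduced with a small sketch (an illustration under the Tab.1 sign convention, not the authors' driver code): sweep left to right over a 1-D height profile and record each change of active motor.

```python
import numpy as np

def motor_sequence(profile, thresh=0.05):
    """Order in which the L/D/R motors fire during a left-to-right sweep
    over a 1-D height profile, using the sign of the local slope as in
    Tab. 1 (rising -> R, flat -> D, falling -> L). thresh is an assumed
    dead zone for treating the surface as flat."""
    slopes = np.gradient(np.asarray(profile, dtype=float))
    seq = []
    for s in slopes:
        pos = "R" if s > thresh else ("L" if s < -thresh else "D")
        if not seq or seq[-1] != pos:   # record only the transitions
            seq.append(pos)
    return seq

# Convex bump: rising flank, peak, falling flank
order = motor_sequence([0, 1, 2, 3, 2, 1, 0])
```

For the convex profile the sequence is R, D, L, mirroring the contact order of a real finger sliding over a bump; a concave profile would give the reverse.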

(a)

(b)

(c)

(d)

(e)
Fig.5 The voltage waves of vibrators, two-dimensional pictures and three-dimensional graphics. (a) Edge; (b) Square; (c) Inclined angle; (d) Convex surface; (e) Concave surface

3.3 Experimental procedure

Before the formal experiment, the subjects need to become familiar with the finger device and with the vibrotactile stimuli corresponding to the specific shapes. They select the grey-level image of the perceived shape on the touch screen and slide the finger wearing the interactive module over the shape perception zone displayed on the screen. The experiments consist of two parts.

In Experiment 1, the vibrations for the five experimental shapes generated by two algorithms (Algorithm 1 is the previous one by Zhong et al.[14] and Algorithm 2 is the improved one in this paper) are presented to the subjects by the finger mounted device. There are 30 shape vibration patterns in total, shown to the subjects in random order. There are five trials, one for each shape. In each trial, the vibration pattern generated by each of the two algorithms for that shape appears three times. The subjects need to choose the better one. They can repeat the perception, and the number of repetitions is recorded.

In Experiment 2, no visual images are shown to the subjects in the shape perception zone when their fingers slide on the screen. They have to match the "shape" they feel to a grey-level image. In each trial, the vibration pattern for each shape appears three times, so there are 15 shape vibrations in total, shown to the subjects in random order. The subjects perform the same matching for both Algorithm 2 and Algorithm 1.

3.4 Results and analysis

The comparison results of the two algorithms in Experiment 1 are shown in Tab.2, which reflects the subjective assessment of the two algorithms. Among the 18 subjects, more people consistently feel more comfortable with Algorithm 2 than with Algorithm 1. In the perception of the edge and the flat surface, the number of repetitions changes with the complexity of the comparison and the shape outline. For the edge and the flat surface, both algorithms perform well because of the simple vibration patterns corresponding to their shape outlines. However, in the perception of the inclined surface, the convex surface and the concave surface, Algorithm 2 performs much better than Algorithm 1. That is to say, the improved algorithm can extract more shape features from the inclined and quadric surfaces and greatly enhances the sense of reality.

Tab.2 Comparison results of two algorithms

Virtual shapes   Algorithm     Repeated times   Number of subjects
Fig.5(a)         Algorithm 1   2.35             7
                 Algorithm 2   2.23             11
Fig.5(b)         Algorithm 1   2.47             8
                 Algorithm 2   2.30             10
Fig.5(c)         Algorithm 1   1.75             3
                 Algorithm 2   1.81             15
Fig.5(d)         Algorithm 1   2.05             5
                 Algorithm 2   1.45             10
Fig.5(e)         Algorithm 1   2.40             6
                 Algorithm 2   2.00             12

The accuracy results of Experiment 2 are shown in Fig.6. With Algorithm 1, the overall average accuracy is 89.4%, while with Algorithm 2 it is 94%. As Fig.6 shows, the subjects perform worse on the flat surface and the inclined surface with Algorithm 1. The reason is that the subjects mainly recognize shapes by the vibration position, owing to the lack of sufficient shape feature information in Algorithm 1, especially for the inclined surface. With Algorithm 2, the subjects perform much better and can clearly recognize the flat surface and the inclined surface, since the improved algorithm is based on information extracted from the touch position and the finger gesture, and the interactive information is considered while the finger is touching an object.

Fig.6 The accuracy of two algorithms

With both Algorithm 1 and Algorithm 2, the subjects perform well on the quadric surfaces, but the accuracy with Algorithm 2 is still slightly higher. The subjects report that they identify the quadric surfaces mainly by the vibration rhythm and position, and the inclined surface more by the vibration position and intensity. Their numbers of repetitions in Tab.2 also differ. Therefore, the improved algorithm has only a limited influence on recognizing the quadric surfaces; it influences the accuracy of the flat surface and the inclined surface more. However, the accuracies for the flat surface and the inclined surface are still lower than the average accuracy of 94%. The major reason is that the subjects are more sensitive to the vibration rhythm and position than to the intensity. Besides, during the experiments, all subjects reported similar impressions: 1) The finger gesture is of little help for shape perception; they perceived shapes from the vibration rhythm, intensity and distribution. 2) Although the subjects are able to distinguish the virtual shapes by the vibrotactile stimuli, the sense of reality is not greatly improved for the edge and the flat surface.

4 Conclusions

1) According to the reaction time and accuracy of the shape perception experiments, it is shown that the subjects can distinguish more different shapes than before.

2) Not only is the number of repetitions decreased, but the accuracy of shape identification is also obviously improved. It is also found that the subjects are more sensitive to the vibration rhythm and position than to the vibration intensity.

3) Therefore, the improved algorithm performs well in shape recognition and improves the realism of shape perception. In the future, the sense of reality and multimode haptic rendering are worthy research directions based on the current work.

References

[1] Sra M, Schmandt C. Expanding social mobile games beyond the device screen[J]. Personal and Ubiquitous Computing, 2015, 19(3/4): 495-508. DOI:10.1007/s00779-015-0845-0.

[2] Lohr M. Apps versus demonstration experiments: Improvement of quality of physics teaching in secondary education by the use of tablets[C]//IEEE International Conference on Interactive Mobile Communication Technologies and Learning. Thessaloniki, Greece, 2015: 226-231. DOI:10.1109/IMCTL.2014.7011137.

[3] Jayant C, Acuario C, Johnson W, et al. V-braille: Haptic braille perception using a touch-screen and vibration on mobile phones[C]//International ACM SIGACCESS Conference on Computers and Accessibility. Orlando, FL, USA, 2010: 295-296. DOI:10.1145/1878803.1878878.

[4] Muniandy M, Ee W K. User's perception on the application of haptics in mobile e-commerce[C]//IEEE International Conference on Research and Innovation in Information Systems. Kuala Lumpur, Malaysia, 2014: 91-96. DOI:10.1109/ICRIIS.2013.6716691.

[5] Wijntjes M W A, Sato A, Hayward V, et al. Local surface orientation dominates haptic curvature discrimination[J]. IEEE Transactions on Haptics, 2009, 2(2): 94-102. DOI:10.1109/toh.2009.1.

[6] Gordon I E, Morison V. The haptic perception of curvature[J]. Perception & Psychophysics, 1982, 31(5): 446-450. DOI:10.3758/bf03204854.

[7] Minsky M, Ming O Y, Steele O, et al. Feeling and seeing: Issues in force display[J]. ACM SIGGRAPH Computer Graphics, 1990, 24(2): 235-241. DOI:10.1145/91394.91451.

[8] Pont S C, Kappers A M L, Koenderink J J. Similar mechanisms underlie curvature comparison by static and dynamic touch[J]. Perception & Psychophysics, 1999, 61(5): 874-894. DOI:10.3758/bf03206903.

[9] Mullenbach J, Shultz C, Piper A M, et al. Surface haptic interactions with a TPad tablet[C]//Proceedings of the Adjunct Publication of the 26th Annual ACM Symposium on User Interface Software and Technology. St Andrews, UK, 2013: 7-8. DOI:10.1145/2508468.2514929.

[10] Pacchierotti C, Salvietti G, Hussain I, et al. The hRing: A wearable haptic device to avoid occlusions in hand tracking[C]//IEEE Haptics Symposium. Philadelphia, USA, 2016: 134-139. DOI:10.1109/HAPTICS.2016.7463167.

[11] Karlin S. Tactus technology[J]. IEEE Spectrum, 2013, 50(4): 23. DOI:10.1109/mspec.2013.6481691.

[12] Yang T H, Kim S Y, Kim C H, et al. Development of a miniature pin-array tactile module using elastic and electromagnetic force for mobile devices[C]//World Haptics 2009—Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. Salt Lake City, UT, USA, 2009: 13-17. DOI:10.1109/WHC.2009.4810818.

[13] Sarakoglou I, Garcia-Hernandez N, Tsagarakis N G, et al. A high performance tactile feedback display and its integration in teleoperation[J]. IEEE Transactions on Haptics, 2012, 5(3): 252-263. DOI:10.1109/toh.2012.20.

[14] Zhong X J, Wu J, Han X, et al. Mobile terminals haptic interface: A vibro-tactile finger device for 3D shape rendering[C]//Proceedings of the 10th International Conference on Intelligent Robotics and Applications. Wuhan, China, 2017: 361-372. DOI:10.1007/978-3-319-65289-4_35.


DOI: 10.3969/j.issn.1003-7985.2018.03.006

Received: 2018-01-23. Revised: 2018-04-25.

Biography: Wu Juan (1978—), female, doctor, professor, juanwuseu@seu.edu.cn.

Foundation items: The National Natural Science Foundation of China (No. 61473088), Six Talent Peaks Project in Jiangsu Province.

Citation: Wu Juan, Han Xiao, Yang Huaining. An improved 3D shape haptic rendering algorithm for finger mounted vibrotactile device[J]. Journal of Southeast University (English Edition), 2018, 34(3): 317-322. DOI:10.3969/j.issn.1003-7985.2018.03.006.

CLC number: TH70