CN100487568C - Enhanced real natural interactive helmet with sight line follow-up function - Google Patents

Enhanced real natural interactive helmet with sight line follow-up function

Info

Publication number
CN100487568C
CN100487568C (application numbers CNB2007100229788A, CN200710022978A)
Authority
CN
China
Prior art keywords
helmet
semi
optical filter
light path
infrared
Prior art date
Legal status
Expired - Fee Related
Application number
CNB2007100229788A
Other languages
Chinese (zh)
Other versions
CN101067716A (en)
Inventor
左洪福
赵新灿
徐兴民
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics
Priority to CNB2007100229788A
Publication of CN101067716A
Application granted
Publication of CN100487568C
Expired - Fee Related
Anticipated expiration


Abstract

The invention is an augmented reality natural interaction helmet with a gaze tracking function, comprising a scene camera, earphone and microphone, electromagnetic position tracker transmitter, interpupillary distance adjusting knob, helmet vertical adjusting knob, helmet horizontal adjusting knob, helmet liner, and helmet light path box. Compared with the prior art, it adds a gaze tracking function, provides a better three-dimensional visual effect, and offers a dual-channel natural interaction capability, and can be applied in technical fields such as equipment maintenance and medical diagnosis.

Description

Augmented reality natural interactive helmet with eye tracking function
Technical field
The present invention obtains the user's region of interest in a scene by tracking the user's gaze direction, identifies the object in the target region, and uses augmented reality technology to enhance the information of the region the user is currently working on. It belongs to the technical fields of gaze tracking, virtual reconstruction, and augmented reality.
Background technology
An augmented reality system uses additional graphics or text to dynamically enhance the real-world scene around the user, who expects to move as freely in the enhanced information space as in the real world [1]. At present, whether researchers at home and abroad adopt registration methods based on orientation tracking devices, on computer vision, or on combined vision and orientation tracking, they all track only the user's head orientation and use changes in head pose to superimpose the virtual information [2]. In special applications such as maintenance of complex electromechanical equipment, military aircraft navigation, weapon aiming, and medical surgery, the visual channel is the most important information interface between humans and the external environment. A person active in the enhanced information space not only moves the head but also rotates the eyes; if the head pose is taken as an approximation of the gaze direction and eye movement is ignored, the registration error of the virtual information can reach 20 degrees. An augmented reality system must therefore detect the user's head pose and gaze direction in real time, track the changes of the user's line of sight, determine from this information the mapping position in real-space coordinates of the virtual information to be added, and present the virtual information at the correct position. At present, the video see-through augmented reality helmet of the US company NVIS is equipped with an eye tracking function, and the company ISCAN is carrying out similar research. Integrating an eye tracking function into an optical see-through helmet is affected by ambient lighting and increases the hardware processing difficulty, and has therefore been studied less.
Summary of the invention
In view of the above state of the prior art, the objective of the invention is to propose a helmet of simple structure that has an eye tracking function and a dual-channel natural interaction capability.
An augmented reality system with an eye tracking function should track the user's gaze direction and obtain the user's region of interest in the scene while, at the same time, exhibiting the defining characteristics of an augmented reality system, namely three-dimensional registration, real-time interaction, and virtual-real fusion, driven by the changes of the gaze point reported by the gaze tracking system. An augmented reality system based on eye tracking must first track in real time the position of the user's gaze point in the real scene; then, using prior scene knowledge stored in a database together with the scene image obtained by the scene camera, it determines the user's region of interest and the specific target from the true coordinates of the gaze point in the scene, and thereby obtains the object to be enhanced, for which enhancement information must be displayed; finally, it queries the virtual object database with the currently identified target information, obtains the virtual enhancement information that matches the real scene, and applies augmented reality three-dimensional registration and virtual-real fusion techniques to enhance the information of the target in the user's current region of interest. Such a system is not a simple addition of the two functions. The integrated system must satisfy the following principles: 1. the performance of neither function may be degraded; 2. the user's attention must not be disturbed and no additional burden may be placed on the user.
The augmented reality natural interaction helmet with eye tracking function of the present invention comprises a scene camera, earphone and microphone, electromagnetic position tracker transmitter, interpupillary distance adjusting knob, helmet vertical adjustment knob, helmet horizontal adjustment knob, helmet liner, and helmet light path box. The scene camera is installed at the front of the helmet; the earphone and microphone are mounted by a bracket on the left or right inner wall of the helmet; the electromagnetic position tracker transmitter is installed on the inner wall at the rear of the helmet; and the helmet liner is connected to the inner wall of the helmet. The helmet light path box is placed at the front of the helmet and receives the real scene transmitted by the scene camera and the virtual information generated by the computer. The interpupillary distance adjusting knobs are installed on the helmet at the left and right sides of the helmet front and are connected to the helmet light path box. The helmet vertical adjustment knob and the helmet horizontal adjustment knob are installed in the vertical and horizontal directions of the helmet respectively, and both are connected to the helmet liner. The light path inside the helmet light path box comprises a miniature display, an infrared high-reflectance filter, a semi-transparent semi-reflective combiner lens, an infrared high-transmittance filter, an eye camera, and an infrared light source. The infrared light source is installed on the helmet light path box close to the semi-transparent semi-reflective combiner lens, which is a square lens composed of two triangular lenses. The infrared high-reflectance filter is placed at a 45° oblique angle above the semi-transparent semi-reflective combiner lens. The miniature display is likewise placed horizontally at a 45° oblique angle to the infrared high-reflectance filter. The infrared high-transmittance filter is placed in front of the lens of the eye camera, and the eye camera is placed horizontally at the position where the semi-transparent semi-reflective combiner lens and the infrared high-reflectance filter form the 45° oblique angle.
Compared with the prior art, the present invention has the following features: on the basis of an optical see-through helmet it adds an eye tracking function; the gaze direction is used to select the target in the region of interest, the computer performs three-dimensional digital reconstruction, and the binocular imaging system with left and right displays lets the user see the three-dimensional stereoscopic effect of the virtual object fused with the real scene. By adding eye tracking and voice functions, the augmented reality helmet gains a dual-channel natural interaction capability; through this dual-channel natural interaction, augmented reality technology becomes much more widely usable under "hands-busy" conditions such as equipment maintenance and medical diagnosis.
Description of drawings
Fig. 1 is a schematic structural diagram of the augmented reality natural interaction helmet with eye tracking function
Fig. 2 is a schematic diagram of the light path inside the helmet light path box
Reference labels of Fig. 1 and Fig. 2:
1 scene camera, 2 earphone and microphone, 3 electromagnetic position tracker transmitter, 4 interpupillary distance adjusting knob, 5 helmet vertical adjustment knob, 6 helmet horizontal adjustment knob, 7 helmet liner, 8 helmet light path box, 9 miniature display, 10 infrared high-reflectance filter, 11 semi-transparent semi-reflective combiner lens, 12 infrared high-transmittance filter, 13 eye camera, 14 infrared light source.
Fig. 3 is the working principle diagram of the augmented reality natural interaction helmet with eye tracking function
Embodiment
The augmented reality natural interaction helmet with eye tracking function shown in Fig. 1 and Fig. 2 comprises the scene camera 1, earphone and microphone 2, electromagnetic position tracker transmitter 3, interpupillary distance adjusting knob 4, helmet vertical adjustment knob 5, helmet horizontal adjustment knob 6, and helmet liner 7. The basic light path comprises the miniature display 9, infrared high-reflectance filter 10, semi-transparent semi-reflective combiner lens 11, infrared high-transmittance filter 12, eye camera 13, and infrared light source 14, and mainly performs eye tracking of the user and display of the virtual objects.
In Fig. 1, scene video camera 1 is finished the natural scene collection, and earphone and microphone 2 are finished interactive voice, and electromagnetic type position tracker transmitter 3 adopts electromagnetic type to follow the tracks of, and is built on the helmet, finishes head position and follows the tracks of.When different users of service wears this helmet, because everyone head size is different with interocular distance, can be by the size of helmet vertical adjustment knob 5 and helmet horizontal adjustment knob 6 adjusting helmet liners 7, distance by interocular distance adjusting knob 4 is regulated between the light path makes virtual image display effect the best.
The internal helmet light path is housed in the helmet light path box and consists of two independent sets of light path equipment, one for each of the user's eyes. The distance between the two sets can be adjusted with the interpupillary distance adjusting knob 4. Fig. 2 is the light path diagram of the integrated system. In the gaze tracking light path, infrared light emitted by the infrared light source 14 placed obliquely above the eye illuminates the eye; the infrared light reflected by the eye passes through the semi-transparent semi-reflective combiner lens 11 in front of it, is reflected by the infrared high-reflectance filter 10 into the horizontal direction, passes through the infrared high-transmittance filter 12, and is imaged by the infrared eye camera 13. In the optical see-through helmet light path, the light of the virtual information shown on the miniature display 9 passes through the lens and falls on the semi-transparent semi-reflective combiner lens 11; the light reflected from this combiner is received by the eye, so that the eye perceives the virtual information; together with the scene light, the user perceives the real scene and the virtual information simultaneously and obtains the enhancement effect on the real scene.
The working principle of the present invention is described below with reference to Fig. 3.
An augmented reality system based on eye tracking must first track in real time the position of the user's gaze point in the real scene; then, using prior scene knowledge stored in a database together with the scene image obtained by the scene camera, it determines the user's region of interest and the specific target from the true coordinates of the gaze point in the scene, and thereby obtains the object to be enhanced, for which enhancement information must be displayed; finally, it queries the virtual object database with the currently identified target information, obtains the virtual enhancement information that matches the real scene, and applies augmented reality three-dimensional registration and virtual-real fusion techniques to enhance the information of the target in the user's current region of interest. The eye tracking part tracks the user's gaze point and corresponds to the solution of the gaze point $F_o$; the virtual enhancement information display part enhances the user's region of interest and corresponds to the solution of the registration transform $\tau_{H \leftarrow O}$. The principle is shown in Fig. 3.
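As an illustration only, the per-frame loop just described can be sketched in Python; every name in this sketch (the stub databases, compute_gaze_point, render_overlay, and the dummy data) is a hypothetical placeholder introduced for this example and is not part of the patent.

```python
import numpy as np

# Hypothetical stand-ins for the components named in the text.
scene_db = {"pump_valve": {"region": (0.2, 0.4, 0.3, 0.5)}}             # prior scene knowledge
virtual_db = {"pump_valve": "overlay: torque the valve bolts to 40 Nm"}  # virtual enhancement info

def compute_gaze_point(head_pose: np.ndarray, eye_image) -> tuple:
    """Placeholder for the gaze-point computation of formula (6)."""
    return (0.25, 0.45)                                                  # screen coordinates (X, Y)

def lookup_target(gaze_xy, scene_image):
    """Find which known object's region of interest contains the gaze point."""
    for name, info in scene_db.items():
        x0, x1, y0, y1 = info["region"]
        if x0 <= gaze_xy[0] <= x1 and y0 <= gaze_xy[1] <= y1:
            return name
    return None

def render_overlay(overlay_text, registration):
    """Placeholder for 3D registration and virtual-real fusion on the display."""
    print("display:", overlay_text)

# One frame of the loop: gaze point -> region of interest -> virtual info -> registration.
head_pose = np.eye(4)                          # tau_{B<-S} from the position tracker
gaze_xy = compute_gaze_point(head_pose, eye_image=None)
target = lookup_target(gaze_xy, scene_image=None)
if target is not None:
    render_overlay(virtual_db[target], registration=np.eye(4))
```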
Eye tracking: calculation of the gaze point $F_o$
Let the gaze point have coordinates $F_e = (x_e, y_e, z_e, 1)^T$ in the eye coordinate system E and coordinates $F_o = (x_o, y_o, z_o, 1)^T$ in the screen coordinate system O. From the relations shown in Fig. 2,
$$F_o = \tau_{O \leftarrow B} \cdot \tau_{B \leftarrow S} \cdot \tau_{S \leftarrow R} \cdot \tau_{R \leftarrow E} \cdot F_e \qquad (1)$$
where $\tau_{B \leftarrow S}$ is obtained in real time from the position tracker; $\tau_{R \leftarrow E}$ can be obtained from the head-fixed gaze tracking system; $\tau_{O \leftarrow B}$ and $\tau_{S \leftarrow R}$ are obtained by the system calibration technique.
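A minimal numerical sketch of formula (1), composing the four homogeneous transforms with NumPy; the matrices below are arbitrary placeholders, not calibrated values from the patent.

```python
import numpy as np

def transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Placeholder transforms standing in for tau_{O<-B}, tau_{B<-S}, tau_{S<-R}, tau_{R<-E}.
T_OB = transform(np.eye(3), np.array([0.0, 0.0, 0.5]))    # screen <- tracker base, from system calibration
T_BS = transform(np.eye(3), np.array([0.1, 0.0, 0.0]))    # live reading from the position tracker
T_SR = transform(np.eye(3), np.array([0.0, 0.05, 0.0]))   # sensor <- principal-view frame, from system calibration
T_RE = np.eye(4)                                           # eye rotation, identity in this example

F_e = np.array([0.4, 0.0, 0.0, 1.0])     # gaze point in eye coordinates (on the x axis of E)
F_o = T_OB @ T_BS @ T_SR @ T_RE @ F_e     # formula (1): gaze point in screen coordinates
print(F_o)
```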
Calibration of $\tau_{O \leftarrow B}$ and $\tau_{S \leftarrow R}$
Describing $\tau_{O \leftarrow B}$ and $\tau_{S \leftarrow R}$ with Euler angles and translation vectors reduces the parameters to 12, of which the first 6 describe $\tau_{O \leftarrow B}$ and the last 6 describe $\tau_{S \leftarrow R}$. During calibration the user keeps fixating the central dot of the see-through screen in front of the eyes, so the eye coordinate system E is fixed relative to the position tracker sensor coordinate system S, the eye coordinate system E coincides with the eye principal-view coordinate system R, and the gaze point lies on the x axis of E. In formula (2), $(x_o, y_o, 0, 1)^T$ is the screen coordinate of the gaze point, $(x_e, 0, 0, 1)^T$ is its eye coordinate, and $I$ is the identity matrix, reflecting the fact that during calibration E coincides with R.
$$\begin{pmatrix} x_o \\ y_o \\ 0 \\ 1 \end{pmatrix} = \tau_{O \leftarrow B} \cdot \tau_{B \leftarrow S} \cdot \tau_{S \leftarrow R} \cdot I \cdot \begin{pmatrix} x_e \\ 0 \\ 0 \\ 1 \end{pmatrix} \qquad (2)$$
Expanding formula (2), $x_e$ can be eliminated and the relation rewritten as
$$g_{xy} = F(P, M_{B \leftarrow S}) \qquad (3)$$
where $g_{xy}$ is the two-dimensional screen coordinate computed by the gaze point algorithm, $P$ is the set of 12 model parameters, $M_{B \leftarrow S}$ is the calibrated position tracker reading, and $F$ is the gaze point computing function. Define the average gaze point error $D_{mean}$ as
$$D_{mean} = \sum_{i=1}^{n} \| a_i - b_i \|, \qquad a_i \in g_{xy}, \; b_i \in G_{xy} \qquad (4)$$
where $G_{xy}$ are the actual gaze point screen coordinates collected during the calibration procedure and $n$ is the total number of gaze points. By (2) and (3), the calibration of $\tau_{O \leftarrow B}$ and $\tau_{S \leftarrow R}$ is an iterative optimization procedure that repeatedly adjusts the parameters in $P$ so that $D_{mean}$ is minimized.
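A sketch of this iterative calibration, assuming an 'xyz' Euler-angle convention, a Nelder-Mead optimizer, and synthetic data, none of which are prescribed by the patent.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def pose_from_params(p6):
    """6 parameters (3 Euler angles, 3 translations) -> 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler('xyz', p6[:3]).as_matrix()
    T[:3, 3] = p6[3:]
    return T

def gaze_screen_xy(T_OE):
    """Screen-plane intersection of the x axis of E (the form that appears later as formula (6))."""
    r1, r4, r7 = T_OE[0, 0], T_OE[1, 0], T_OE[2, 0]
    tx, ty, tz = T_OE[0, 3], T_OE[1, 3], T_OE[2, 3]
    r7 = r7 if abs(r7) > 1e-9 else 1e-9          # numerical guard only
    return np.array([-r1 * tz / r7 + tx, -r4 * tz / r7 + ty])

def gaze_error(P, tracker_readings, true_points):
    """D_mean of formula (4), up to the constant factor 1/n; tau_{R<-E} = I during calibration."""
    T_OB, T_SR = pose_from_params(P[:6]), pose_from_params(P[6:])
    return np.mean([np.linalg.norm(gaze_screen_xy(T_OB @ M_BS @ T_SR) - g)
                    for M_BS, g in zip(tracker_readings, true_points)])

# Synthetic calibration samples: tracker readings M_{B<-S} and true screen gaze points G_xy.
rng = np.random.default_rng(0)
tracker_readings = [pose_from_params(rng.normal(scale=0.1, size=6)) for _ in range(10)]
true_points = [rng.uniform(-0.2, 0.2, size=2) for _ in range(10)]

result = minimize(gaze_error, x0=np.full(12, 0.01),
                  args=(tracker_readings, true_points), method='Nelder-Mead',
                  options={'maxiter': 2000})
print(result.fun)   # minimized average gaze-point error
```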
Calibration of $\tau_{R \leftarrow E}$
Because the origins of the eye coordinate system E and the eye principal-view coordinate system R are both at the eyeball rotation center, $\tau_{R \leftarrow E}$ contains only a rotation component. This calibration step requires the user's head to stay still and the x axis of R to remain approximately perpendicular to the screen plane. Suppose the screen coordinate of the principal-view gaze point during calibration is $C(x_c, y_c)$, the distance from the eyeball rotation center to the screen is $D$, the reading of the position tracker is $M_{B \leftarrow S}$, and the screen coordinate computed by the gaze tracking system when the eye fixates an arbitrary screen point is $P(x_g, y_g)$. The transformation from the eye principal-view coordinate system R to the eye coordinate system E when fixating $P(x_g, y_g)$ can be decomposed as a rotation by angle $\beta$ about the z axis of R to an intermediate coordinate system, followed by a rotation by angle $\alpha$ about the y axis of the intermediate coordinate system to reach E; $\tau_{R \leftarrow E}$ is then obtained from $\alpha$ and $\beta$:
$$\beta = \arctan\left( |x_c - x_g| / D \right)$$
$$\alpha = \arctan\left( |y_g - y_c| \Big/ \sqrt{D^2 + |x_g - x_c|^2} \right) \qquad (5)$$
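A direct transcription of formula (5) into code, together with the rotation composition described in the text; the axis sign conventions are an assumption of this sketch.

```python
import numpy as np

def tau_R_E(x_c, y_c, x_g, y_g, D):
    """Rotation-only tau_{R<-E}: Rz(beta) about the z axis of R, then Ry(alpha) about the intermediate y axis."""
    beta = np.arctan(abs(x_c - x_g) / D)                                    # formula (5)
    alpha = np.arctan(abs(y_g - y_c) / np.sqrt(D**2 + abs(x_g - x_c)**2))   # formula (5)
    Rz = np.array([[np.cos(beta), -np.sin(beta), 0.0],
                   [np.sin(beta),  np.cos(beta), 0.0],
                   [0.0,           0.0,          1.0]])
    Ry = np.array([[ np.cos(alpha), 0.0, np.sin(alpha)],
                   [ 0.0,           1.0, 0.0          ],
                   [-np.sin(alpha), 0.0, np.cos(alpha)]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry
    return T

# Example: principal-view point at the screen center, gaze 5 cm right and 3 cm up, screen 60 cm away.
print(tau_R_E(x_c=0.0, y_c=0.0, x_g=0.05, y_g=0.03, D=0.6))
```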
Gaze point calculation
After the system calibration is complete, the coordinate $(X, Y, 0, 1)^T$ of the user's gaze point in the screen coordinate system can be computed from formula (6):
$$X = -\frac{r_1 t_z}{r_7} + t_x, \qquad Y = -\frac{r_4 t_z}{r_7} + t_y \qquad (6)$$
where $r_1$, $r_4$, $r_7$, $t_x$, $t_y$, $t_z$ are elements of the composite transformation matrix $\tau_{O \leftarrow E} = \tau_{O \leftarrow B} \cdot \tau_{B \leftarrow S} \cdot \tau_{S \leftarrow R} \cdot \tau_{R \leftarrow E}$.
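As a consistency check (a reconstruction added here, not text from the patent), formula (6) follows from substituting a point on the x axis of E into $\tau_{O \leftarrow E}$ and requiring the image to lie on the screen plane $z = 0$:
$$\tau_{O \leftarrow E} \begin{pmatrix} x_e \\ 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} r_1 x_e + t_x \\ r_4 x_e + t_y \\ r_7 x_e + t_z \\ 1 \end{pmatrix}, \qquad r_7 x_e + t_z = 0 \;\Rightarrow\; x_e = -\frac{t_z}{r_7},$$
so the screen-plane intersection gives $X = -r_1 t_z / r_7 + t_x$ and $Y = -r_4 t_z / r_7 + t_y$, where $r_1, r_4, r_7$ are the first-column elements of the rotation part of $\tau_{O \leftarrow E}$ and $(t_x, t_y, t_z)$ its translation.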
Calculation of the virtual information registration transform $\tau_{H \leftarrow O}$
To place the virtual enhancement information accurately in the scene, the transformation between the current scene and the optical see-through helmet display must be obtained, i.e. $\tau_{H \leftarrow O}$, the relative transformation between the world coordinate system and the optical see-through helmet display, must be solved:
$$\tau_{H \leftarrow O} = \tau_{H \leftarrow S} \cdot \tau_{B \leftarrow S}^{-1} \cdot \tau_{O \leftarrow B}^{-1} \qquad (7)$$
where $\tau_{O \leftarrow B}$ can be obtained from the calibration of the gaze tracking system and $\tau_{B \leftarrow S}$ can be computed from the position tracker reading.
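Formula (7) is a direct composition of already-known transforms; a minimal NumPy sketch with placeholder matrices (assumed here to be 4x4 homogeneous transforms) is:

```python
import numpy as np

T_HS = np.eye(4); T_HS[:3, 3] = [0.00, 0.02, -0.03]   # tau_{H<-S}: from the SPAAM display calibration
T_BS = np.eye(4); T_BS[:3, 3] = [0.10, 0.00,  0.00]   # tau_{B<-S}: live position tracker reading
T_OB = np.eye(4); T_OB[:3, 3] = [0.00, 0.00,  0.50]   # tau_{O<-B}: from the gaze tracking calibration

# Formula (7): registration transform from the world (screen) system to the helmet display.
T_HO = T_HS @ np.linalg.inv(T_BS) @ np.linalg.inv(T_OB)
print(T_HO)
```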
The calibration steps for $\tau_{H \leftarrow S}$ are:
Wear the optical see-through helmet display, so that the image of the surrounding real scene is formed directly on the user's retina. The optical see-through helmet display is calibrated with the SPAAM single-point alignment algorithm: the human eye together with the optical see-through helmet display is regarded as a virtual camera and calibrated with the pinhole camera model. The basic steps are as follows:
1. Place a real marker point at an arbitrary position and record the world coordinates $(x_w, y_w, z_w, 1)^T$ of this point;
2. The computer generates $n$ ($n \geq 6$) virtual marker points, displayed in an equally spaced arrangement on the optical see-through helmet display; record the coordinates $(x_i, y_i, 1)^T$ of these $n$ points in the helmet-mounted display coordinate system. The user moves the head so that each virtual marker point seen by the eye coincides exactly with the real marker point, and the position tracker reading is recorded once for every aligned marker point;
3. Using $\tau_{O \leftarrow B}$, calibrated by the gaze tracking system, and $\tau_{B \leftarrow S}$, read in real time from the position tracker, compute the coordinates $(X_{Mi}, Y_{Mi}, Z_{Mi}, 1)^T$ of the real marker point in the position tracker receiver coordinate system;
4. Substitute $(x_i, y_i, 1)^T$ and $(X_{Mi}, Y_{Mi}, Z_{Mi}, 1)^T$ into formula (8) and solve for $\tau_{H \leftarrow S}$ with the SVD decomposition method:
$$s \begin{pmatrix} x_i \\ y_i \\ 1 \end{pmatrix} = \tau_{H \leftarrow S} \begin{pmatrix} X_{Mi} \\ Y_{Mi} \\ Z_{Mi} \\ 1 \end{pmatrix} \qquad (8)$$
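Step 4 amounts to the standard direct linear transform (DLT) estimate of a 3x4 virtual-camera matrix from 2D-3D correspondences, solved with SVD; the sketch below is a generic DLT implementation under that assumption (the synthetic data and the normalization convention are not from the patent).

```python
import numpy as np

def solve_projection_dlt(points_2d, points_3d):
    """Estimate the 3x4 matrix G with s*(x, y, 1)^T = G*(X, Y, Z, 1)^T via SVD (formula (8))."""
    rows = []
    for (x, y), (X, Y, Z) in zip(points_2d, points_3d):
        P = [X, Y, Z, 1.0]
        rows.append(P + [0.0] * 4 + [-x * p for p in P])
        rows.append([0.0] * 4 + P + [-y * p for p in P])
    A = np.asarray(rows)                 # 2n x 12 system, needs n >= 6 correspondences
    _, _, Vt = np.linalg.svd(A)
    G = Vt[-1].reshape(3, 4)             # right singular vector of the smallest singular value
    return G / G[2, 3]                   # fix the arbitrary overall scale

# Synthetic check: project known 3D points with a known G, then recover it.
G_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0,  5.0],
                   [0.0,   0.0,   1.0,  1.0]])
pts3d = np.array([[0.1, 0.2, 1.5], [-0.2, 0.1, 2.0], [0.3, -0.1, 1.8],
                  [0.0, 0.0, 2.5], [0.2, 0.3, 1.2], [-0.1, -0.2, 2.2]])
proj = (G_true @ np.c_[pts3d, np.ones(6)].T).T
pts2d = proj[:, :2] / proj[:, 2:3]
print(solve_projection_dlt(pts2d, pts3d))
```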

Claims (1)

1. An augmented reality natural interaction helmet with an eye tracking function, comprising a scene camera (1), earphone and microphone (2), an electromagnetic position tracker transmitter (3), an interpupillary distance adjusting knob (4), a helmet vertical adjustment knob (5), a helmet horizontal adjustment knob (6), a helmet liner (7), and a helmet light path box (8), wherein the scene camera (1) is installed at the front of the helmet, the earphone and microphone (2) are mounted by a bracket on the left or right inner wall of the helmet, the electromagnetic position tracker transmitter (3) is installed on the inner wall at the rear of the helmet, the helmet liner (7) is connected to the inner wall of the helmet, the helmet light path box (8) is placed at the front of the helmet and receives the real scene transmitted by the scene camera (1) and the virtual information generated by the computer, the interpupillary distance adjusting knob (4) is installed on the helmet at the left or right side of the helmet front and is connected to the helmet light path box (8), the helmet vertical adjustment knob (5) and the helmet horizontal adjustment knob (6) are installed in the vertical and horizontal directions of the helmet respectively and are both connected to the helmet liner (7); the light path in the helmet light path box (8) comprises a miniature display (9), an infrared high-reflectance filter (10), a semi-transparent semi-reflective combiner lens (11), an infrared high-transmittance filter (12), an eye camera (13), and an infrared light source (14), wherein the infrared light source (14) is installed on the helmet light path box (8) close to the semi-transparent semi-reflective combiner lens (11), the semi-transparent semi-reflective combiner lens (11) is a square lens composed of two triangular lenses, the infrared high-reflectance filter (10) is placed at a 45° oblique angle above the semi-transparent semi-reflective combiner lens (11), the miniature display (9) is likewise placed horizontally at a 45° oblique angle above the infrared high-reflectance filter (10), the infrared high-transmittance filter (12) is placed in front of the lens of the eye camera (13), and the eye camera (13) is placed horizontally at the position where the semi-transparent semi-reflective combiner lens (11) and the infrared high-reflectance filter form the 45° oblique angle.
CNB2007100229788A 2007-05-29 2007-05-29 Enhanced real natural interactive helmet with sight line follow-up function Expired - Fee Related CN100487568C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2007100229788A CN100487568C (en) 2007-05-29 2007-05-29 Enhanced real natural interactive helmet with sight line follow-up function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2007100229788A CN100487568C (en) 2007-05-29 2007-05-29 Enhanced real natural interactive helmet with sight line follow-up function

Publications (2)

Publication Number Publication Date
CN101067716A CN101067716A (en) 2007-11-07
CN100487568C true CN100487568C (en) 2009-05-13

Family

ID=38880305

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2007100229788A Expired - Fee Related CN100487568C (en) 2007-05-29 2007-05-29 Enhanced real natural interactive helmet with sight line follow-up function

Country Status (1)

Country Link
CN (1) CN100487568C (en)

Families Citing this family (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2774257C (en) * 2008-09-30 2021-04-27 Universite De Montreal Method and device for assessing, training and improving perceptual-cognitive abilities of individuals
DE102009049073A1 (en) 2009-10-12 2011-04-21 Metaio Gmbh Method for presenting virtual information in a view of a real environment
KR101729023B1 (en) 2010-10-05 2017-04-21 엘지전자 주식회사 Mobile terminal and operation control method thereof
FR2970576B1 (en) * 2011-01-19 2013-02-08 Matchic Labs METHOD FOR DETERMINING THE DIRECTION OF THE LOOK AND DEVICE FOR IMPLEMENTING IT
JP6144681B2 (en) * 2011-08-30 2017-06-07 マイクロソフト テクノロジー ライセンシング,エルエルシー Head mounted display with iris scan profiling function
CN102572217B (en) * 2011-12-29 2014-08-20 华为技术有限公司 Visual-attention-based multimedia processing method and device
CN103150013A (en) * 2012-12-20 2013-06-12 天津三星光电子有限公司 Mobile terminal
EP2985651B1 (en) * 2013-04-11 2020-07-15 Sony Corporation Image display device and display device
US9524580B2 (en) * 2014-01-06 2016-12-20 Oculus Vr, Llc Calibration of virtual reality systems
CN104089606B (en) * 2014-06-30 2016-08-17 天津大学 A kind of free space eye tracking measuring method
CN104317055B (en) * 2014-10-31 2017-01-25 成都理想境界科技有限公司 Head-mounted type device used in cooperation with mobile terminal
WO2016082062A1 (en) * 2014-11-24 2016-06-02 潘有程 3d display helmet having wireless transmission function
CN104484523B (en) * 2014-12-12 2017-12-08 西安交通大学 A kind of augmented reality induction maintenance system realizes apparatus and method for
CN104581126A (en) * 2014-12-16 2015-04-29 青岛歌尔声学科技有限公司 Image display processing method and processing device for head-mounted display device
CN105828021A (en) * 2015-01-05 2016-08-03 沈阳新松机器人自动化股份有限公司 Specialized robot image acquisition control method and system based on augmented reality technology
WO2016115870A1 (en) * 2015-01-21 2016-07-28 成都理想境界科技有限公司 Binocular ar head-mounted display device and information displaying method therefor
CN104570356A (en) * 2015-02-03 2015-04-29 深圳市安华光电技术有限公司 Single image source binocular near-to-eye display device
EP3265866B1 (en) * 2015-03-05 2022-12-28 Magic Leap, Inc. Systems and methods for augmented reality
CN104731338B (en) * 2015-03-31 2017-11-14 深圳市虚拟现实科技有限公司 One kind is based on enclosed enhancing virtual reality system and method
CN104865705A (en) * 2015-05-04 2015-08-26 上海交通大学 Reinforced realistic headwear equipment based intelligent mobile equipment
CN105068252A (en) * 2015-09-07 2015-11-18 东南大学 Multi-parameter adjustable binocular augmented reality experimental device
CN105686835A (en) * 2016-01-18 2016-06-22 张江杰 Eyesight visualization device
FR3049722B1 (en) 2016-04-01 2018-03-30 Thales VISUALIZATION SYSTEM FOR A PONTET HEAD FOR AN AIRCRAFT COMPATIBLE WITH AN AUDIO HELMET
CN105955456B (en) 2016-04-15 2018-09-04 深圳超多维科技有限公司 The method, apparatus and intelligent wearable device that virtual reality is merged with augmented reality
CN106127858B (en) * 2016-06-24 2020-06-23 联想(北京)有限公司 Information processing method and electronic equipment
CN107765842A (en) * 2016-08-23 2018-03-06 深圳市掌网科技股份有限公司 A kind of augmented reality method and system
CN108572450B (en) * 2017-03-09 2021-01-29 宏碁股份有限公司 Head-mounted display, visual field correction method thereof and mixed reality display system
CN106959516A (en) * 2017-05-02 2017-07-18 广州蜃境信息科技有限公司 One kind is based on shuangping san augmented reality glasses
CN107657235A (en) * 2017-09-28 2018-02-02 北京小米移动软件有限公司 Recognition methods and device based on augmented reality
CN108196676B (en) * 2018-01-02 2021-04-13 联想(北京)有限公司 Tracking identification method and system
TWI702548B (en) * 2018-04-23 2020-08-21 財團法人工業技術研究院 Controlling system and controlling method for virtual display
CN108765498B (en) * 2018-05-30 2019-08-23 百度在线网络技术(北京)有限公司 Monocular vision tracking, device and storage medium
US10916064B2 (en) * 2018-07-23 2021-02-09 Magic Leap, Inc. Method and system for resolving hemisphere ambiguity using a position vector
JP2020053874A (en) * 2018-09-27 2020-04-02 セイコーエプソン株式会社 Head-mounted display device and cover member
CN109615664B (en) * 2018-12-12 2020-06-30 亮风台(上海)信息科技有限公司 Calibration method and device for optical perspective augmented reality display
CN110333775A (en) * 2019-05-16 2019-10-15 上海精密计量测试研究所 A kind of Space Equipment augmented reality maintenance guiding system and method
CN110147770A (en) * 2019-05-23 2019-08-20 北京七鑫易维信息技术有限公司 A kind of gaze data restoring method and system
CN110572632A (en) * 2019-08-15 2019-12-13 中国人民解放军军事科学院国防科技创新研究院 Augmented reality display system, helmet and method based on sight tracking
CN111044258B (en) * 2019-11-28 2021-09-28 扬州莱达光电技术有限公司 Simulation testing device in helmet of helmet display
CN114356482B (en) * 2021-12-30 2023-12-12 业成科技(成都)有限公司 Method for interaction with human-computer interface by using line-of-sight drop point
CN114494594B (en) * 2022-01-18 2023-11-28 中国人民解放军63919部队 Deep learning-based astronaut operation equipment state identification method
CN115284076A (en) * 2022-07-21 2022-11-04 太原重工股份有限公司 Workpiece positioning datum alignment method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5347400A (en) * 1993-05-06 1994-09-13 Ken Hunter Optical system for virtual reality helmet
US5572749A (en) * 1994-06-28 1996-11-12 The Walt Disney Company Helmet mounting device and system
CN1161087A (en) * 1994-08-10 1997-10-01 实质(Ip)有限公司 Head mounted display optics
US6120461A (en) * 1999-08-09 2000-09-19 The United States Of America As Represented By The Secretary Of The Army Apparatus for tracking the human eye with a retinal scanning display, and method thereof
CN1584791A (en) * 2004-06-11 2005-02-23 上海大学 Man-machine interactive method and apparatus
CN1648840A (en) * 2005-01-27 2005-08-03 北京理工大学 Head carried stereo vision hand gesture identifying device
CN101077232A (en) * 2007-06-07 2007-11-28 南京航空航天大学 Human-computer interaction helmet for type computer

Also Published As

Publication number Publication date
CN101067716A (en) 2007-11-07

Similar Documents

Publication Publication Date Title
CN100487568C (en) Enhanced real natural interactive helmet with sight line follow-up function
US10983354B2 (en) Focus adjusting multiplanar head mounted display
US9984507B2 (en) Eye tracking for mitigating vergence and accommodation conflicts
US10241569B2 (en) Focus adjustment method for a virtual reality headset
US10025060B2 (en) Focus adjusting virtual reality headset
Kellner et al. Geometric calibration of head-mounted displays and its effects on distance estimation
US11854171B2 (en) Compensation for deformation in head mounted display systems
US5446834A (en) Method and apparatus for high resolution virtual reality systems using head tracked display
IL308780A (en) Enhanced pose determination for display device
CN108351515A (en) Eyes optical projection system and method
US20130128012A1 (en) Simulated head mounted display system and method
US6972733B2 (en) Method and apparatus for eye tracking in a vehicle
JP6369005B2 (en) Head-mounted display device and method for controlling head-mounted display device
CN205195880U (en) Watch equipment and watch system
Bohme et al. Remote eye tracking: State of the art and directions for future development
CN104808340A (en) Head-mounted display device and control method thereof
WO2022205769A1 (en) Virtual reality system foveated rendering method and system based on single eyeball tracking
US11743447B2 (en) Gaze tracking apparatus and systems
Cutolo et al. The role of camera convergence in stereoscopic video see-through augmented reality displays
CN108235778A (en) Calibration method and device based on cloud computing, electronic equipment and computer program product
US11119571B2 (en) Method and device for displaying virtual image
CN205103763U (en) Augmented reality system
CN109031667B (en) Virtual reality glasses image display area transverse boundary positioning method
Wu et al. Depth-disparity calibration for augmented reality on binocular optical see-through displays
JP2001218231A (en) Device and method for displaying stereoscopic image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20090513

Termination date: 20160529