WO2015057507A1 - System and method for reconfigurable projected augmented/virtual reality appliance - Google Patents

System and method for reconfigurable projected augmented/virtual reality appliance

Info

Publication number
WO2015057507A1
WO2015057507A1 (PCT/US2014/060020)
Authority
WO
WIPO (PCT)
Prior art keywords
images
headset
projector
head mounted
projectors
Prior art date
Application number
PCT/US2014/060020
Other languages
French (fr)
Inventor
Jeri J. ELLSWORTH
Original Assignee
Ellsworth Jeri J
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 14/267,325 (published as US20140340424A1)
Application filed by Ellsworth Jeri J
Priority to JP2016524419A (published as JP2016536635A)
Priority to MX2016004537A
Priority to EP14853675.8A (published as EP3058417A4)
Priority to KR1020167012649A (published as KR20160075571A)
Priority to CA2926687A (published as CA2926687A1)
Priority to CN201480064107.0A (published as CN105765444A)
Publication of WO2015057507A1

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 - Optical elements other than lenses
    • G02B 5/30 - Polarising elements
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/0018 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for preventing ghost images
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/017 - Head mounted
    • G02B 27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 - Optical elements other than lenses
    • G02B 5/20 - Filters
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0101 - Head-up displays characterised by optical features
    • G02B 2027/0118 - Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0101 - Head-up displays characterised by optical features
    • G02B 2027/0138 - Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/0179 - Display position adjusting means not related to the information to be displayed
    • G02B 2027/0187 - Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye


Abstract

A system comprising a head mounted display with sight line tracking is presented with an attachment for reconfiguration from projected augmented reality applications to those using closed virtual reality as well as mixed modes.

Description

SYSTEM AND METHOD FOR RECONFIGURABLE PROJECTED AUGMENTED/VIRTUAL REALITY APPLIANCE
RELATED APPLICATIONS
The present application claims the benefit of provisional patent application No. 61/855,536, filed on May 17, 2013, entitled "Stereo 3D augmented reality display using retro-reflective screens and per eye filtering" by Jeri J. Ellsworth, and No. 61/961,446, filed on October 15, 2013, titled "Reconfigurable Head Mounted Display System", also by Jeri J. Ellsworth, the entire contents of which are fully incorporated by reference herein.
U.S. PATENT DOCUMENTS:
3,614,314
4,349,815
4,657,512
4,799,765
5,003,300
5,151,722
5,162,828
5,189,452
5,210,626
5,436,765
5,467,104
5,572,229
5,581,271 ,606,458,621,572,661,603,677,795,726,670,742,263,742,264,064,749,091,546,147,805,421,047,490,095,522,474,532,116,535,182,552,854,594,085,611,384,611,385,747,611,814,442,825,987,847,336,926,429,963,379,031,067,088,516,118,212,200,536,242,527,253,960 ,262,919,355,795,391,574,420,751,446,943,450,188,450,310,495,836,499,217,505,207,538,950,542,209,567,385,646,537,724,441,791,809,804,507,839,575,843,403,791,483,936,519,944,616,982,959,004,769,179,604,189,263,194,325,237,626,300,159,310,763,328,360 8,376,548
8,378,924
8,388,146
8,433,172
8,434,674
8,441,734
8,467,133
8,472,120
8,477,425
8,482,859
8,487,837
8,488,246
8,494,212
8,553,334
8,576,143
8,576,276
8,582,209
8,587,612
8,625,200
8,632,216
8,634,139
8,643,951
U.S. PATENT APPLICATIONS
2002/0041446
2004/0150884
2007/0285752
2010/0309097
2011/0037951
2011/0075357
2012/0106191
2012/0327116
2013/0042296
2013/0196757
2013/0300637
OTHER PUBLICATIONS
"Augmented Reality Through Wearable Computing" Thad Starner, Steve Mann, Bradley Rhodes, Jeffrey Levine, 1997
"Computer Vision-Based Gesture Recognition for an Augmented Reality Interface" Moritz Storring, Thomas B. Moeslund, Yong Liu, and Erik Granum, In 4th IASTED International Conference on Visualization, Imaging, and Image Processing, Sep. 2004
"Constellation: a wide -range wireless motion-tracking system for augmented reality and virtual set applications" Eric Foxlin, Michael Harrington, George Pfeifer, Proceedings of the 25th annual conference on Computer graphics and interactive techniques
"Displays: Fundamentals and Applications" Rolf R. Hainich and Oliver Bimber, CRC Press 2011, ISBN 978-1-56881-439-1
"Finger tracking for interaction in augmented environments," Dorfmuller-Ulhaas, K.; Schmalstieg, D.; Augmented Reality, 2001. Proceedings. IEEE and ACM International Symposium on, pp. 55-64, 2001
"The perceptive workbench: Computer- vision-based gesture tracking, object tracking, and 3D reconstruction for augmented desks" Thad Starner, Bastian Leibe, David Minnen, Tracy Westyn, Amy Hurst and Justin Weeks, Machine Vision and Applications, 2003, vol. 14, No. l, pp. 59-71 "Tracking of User Position and Orientation by Stereo Measurement of Infrared Markers and Orientation Sensing," Masaki Maeda, Takefumi Ogawa, Kisyoshi Kiyokawa, Haruo Takemura, iswc, pp. 77-84, Eighth IEEE International Symposium on Wearable
Computers, Oct. 31 -Nov. 3, 2004
"Wearable Virtual Tablet: Fingertip Drawing on a Portable Plane-object using an Active- Infrared Camera" Norimichi Ukita and Masatsuga Kidode, 2004, retrieved from the internet May 11, 2011
Field of the Invention:
This invention relates to the fields of virtual reality, augmented reality, board games and video games. More specifically, this system allows multiple modes of operation from a reconfigurable head mounted display: projected images to surfaces, near to eye display, and near to eye display with a world image combiner for graphics overlay.
Description of the Related Art:
There are many examples of fixed optics head mounted display headsets, which typically consist of a display or plurality of displays and relay optics that deliver computer generated graphics to the eyes of users. Additional fixed optics may be included that combine light from the real world and allow graphics to be overlaid on what the user views in the real world. Subsystems are often associated with these displays to track the sight line of the user so as to provide information that drives the rendering of a CGI scene for viewing in stereo vision, simulating 3D vision.
SUMMARY
The invention comprises a headset or glasses that contain a display or plurality of displays with a primary mode of operation, such as projected imaging, a sight line tracking subsystem, and an attachment for relaying the image directly to the eyes of the user and/or world image combining optics. The sight line tracking system provides the information needed to render a stereoscopic view of a computer generated scene such as is used in first person point of view based video games or simulations.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1. - A typical outward projected image headset, which comprises two projection display systems and apertures for light returning to the user from surfaces in the world, together with a camera for tracking a marker.
Figure 2. - A wired connection system for the headset in Fig. 1.
Figure 3. - A front view of the headset in Fig. 1, showing eye alignment with projectors.
Figure 4. - An alternate headset that relies on anisotropic reflectance.
Figure 5. - An alternate headset that uses a single projector.
Figure 6. - An active "marker" pad for use in sight line tracking.
Figure 7a. - Optical paths from and back to the headset of Fig. 1.
Figure 7b. - Optical paths from tracking marker illuminators to the headset of Fig. 1.
Figure 8a. - Optical path for "clip on" reconfiguration to closed virtual reality mode of operation.
Figure 8b. - Operation of hinged "flip up" to switch modes.
Figure 8c. - Front "transparent" view of "clip on" apparatus in closed position.
Figure 8d. - Single side application of "clip on" apparatus.
Figure 9. - Alternate "clip on" reconfiguration for mixed real/virtual mode.
Figure 10. - Alternate "clip on" reconfiguration with cameras for "electronic see through" mixed real/virtual mode.
DETAILED DESCRIPTION
The system of the present invention comprises glasses, or a headset, that contain a display or projection system (Figs. 1-5) and a line of sight tracking system (Figs. 6-7), as well as a mechanically attachable relay system (Figs. 8-10) to change the mode of operation from projected to near to eye viewing.
A glasses embodiment is shown in Fig. 1, in which a frame 101 supports a pair of image projectors 102 and 104, a tracking camera or cameras 103, and viewing lenses 105 and 106. A compartment 107 is shown that may hold power cells and driver electronics as well as wireless electronic communication devices. Alternately, Fig. 2 shows an embodiment with wired connections 201 to a circuit box 202 that may include connections for a computer/cell phone interface 203, such as HDMI, and/or connections for other peripherals 204, such as USB. The circuit box 202 may also include power cells.
The viewing lenses 105 and 106 in Fig. 1 provide means, in conjunction with the projectors 102 and 104, to reject light that originates from the projector on the opposite side of the frame. Said means may be through selective orthogonal polarization (planar or circular), or time division multiplexed active shutters, or spectral filtering by emitter physics or software selected colors or passive filtering, or other such means known in the art.
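As a rough illustration of the time division multiplexed variant, the following sketch alternates left/right frames while opening only the matching eye's shutter. The Shutter class and the show_frame callable are hypothetical stand-ins for driver calls that the specification does not name; only the timing logic is shown.

```python
# Minimal sketch of time-division-multiplexed crosstalk rejection.
# The Shutter class and `show_frame` callable are hypothetical stand-ins
# for real driver calls; only the per-frame timing logic is illustrated.
import time

FRAME_PERIOD = 1.0 / 120.0   # e.g. a 120 Hz projector gives 60 Hz per eye


class Shutter:
    """Stand-in for an LC shutter in one viewing lens."""
    def __init__(self, side):
        self.side = side
        self.is_open = False

    def set_open(self, state):
        self.is_open = state     # a real shutter driver call would go here


def run_time_multiplexed(show_frame, left_shutter, right_shutter, frames):
    """Project left/right frames in alternate time slots, opening only the
    shutter on the matching side so the opposite eye rejects that frame."""
    for i, frame in enumerate(frames):
        left_turn = (i % 2 == 0)
        left_shutter.set_open(left_turn)
        right_shutter.set_open(not left_turn)
        show_frame(frame)        # hypothetical projector call
        time.sleep(FRAME_PERIOD)
```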
As shown in Fig. 7a, depicting the projected augmented reality mode, the system relies on a retroreflective material 701 to return the majority of light 702 emitted by the projectors 102 and 104 in path 703 to the area overlapping the viewing lenses 105 and 106. Prior art (e.g. Stanton US6,535,182) has taught systems in which projectors have been placed to the sides adjacent the hinges of the frame, but this carries the disadvantage that when the frames are made large enough to fit over the user's existing eyewear, the off-axis distance of the projectors from the user's eyes reduces the brightness of the returned image while trying to achieve low crosstalk of unwanted images from opposite sides. Prior art (e.g. Fisher US5,572,229 and Fergason US5,606,458) has also taught the use of beamsplitters in front of the user's eyes to direct the projected light coaxial with the user sight line, which adds unwanted forward weight and extension of the frame structure. Fig. 3 shows the preferred alignment of the embodiment of Fig. 1, such that the projectors are positioned closely above the centers of each of the user's eyes, without the need for beamsplitters. It should be noted that the projectors could as well be mounted below the eyes, centered on these same center lines, and that the retroreflective material may be partially transparent such that the user can see objects placed behind it.
In an alternate embodiment, the alignment shown in Fig. 3 may be used in conjunction with an anisotropic retroreflective screen such that the pattern of returned brightness of the projected images falls off more rapidly in the horizontal direction than in the vertical direction. Anisotropic retroreflectors may be fabricated based on slightly ellipsoidal reflecting spheres that have been aligned by axis, or holographic films on mirror surfaces, or other means known in the retroreflector fabrication art and in the art of
autostereoscopic screens. This form of spatial isolation of left/right images is shown in Fig. 4, where the glasses frame 401 is open without filtering viewing lenses, but rather, relies on the anisotropic bright viewing return region 402 to limit the light crossing over to the opposite eye.
An alternate embodiment using a single projector is shown in Fig. 5, where the projector 502 sends alternate frames sequentially, and the filtering viewing lenses 505 and 506 selectively pass the left and right images to the corresponding eyes. As above, the single projector 502 may coordinate with the viewing lenses by switching polarization orthogonality (while using either planar or circular polarization), or time multiplexing by means of active shutters in the viewing lenses, or by projecting restricted colors for the left/right sides, to be passed by spectral filters at the viewing lenses. In order to facilitate the presentation of either virtual or advanced forms of augmented reality, it is necessary to calculate the sight line of the user. For the purposes of this specification, the sight line is taken to be the line originating between the eyes of the user and extending forward parallel to the central projection lines of the projectors 102 and 104, which are mounted so as to be parallel to each other.
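For illustration, the sketch below shows one conventional way a tracked sight line could drive the stereoscopic rendering described above: the two eye positions are offset from the sight-line origin by half an assumed interpupillary distance, and a look-at view matrix is built for each eye. The 63 mm spacing and the helper names are assumptions, not taken from the specification.

```python
# Sketch: turning a tracked sight line into left/right view matrices.
# The 63 mm interpupillary distance and the look-at construction are
# conventional assumptions, not taken from the specification.
import numpy as np

IPD = 0.063  # metres, assumed average interpupillary distance


def look_at(eye, target, up):
    """Standard right-handed look-at view matrix."""
    f = target - eye
    f = f / np.linalg.norm(f)
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)
    u = np.cross(r, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view


def stereo_views(origin, forward, up):
    """Build per-eye view matrices from the tracked sight line
    (origin between the eyes, forward along the projector axes)."""
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    left_eye = origin - right * (IPD / 2)
    right_eye = origin + right * (IPD / 2)
    return (look_at(left_eye, left_eye + forward, up),
            look_at(right_eye, right_eye + forward, up))
```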
The sight line tracking subsystem comprises the headset camera or plurality of cameras, 103, mounted with its central field of view line parallel to the central projection lines of 102 and 104, and a "marker" or plurality of markers that may take the form of a "pad" as shown in Figure 6. In the current embodiment, this pad or plate 601 comprises a set of five infrared light emitting diodes in which the four outer units 602-605 are in constant output mode while the offset inner diode 606 is modulated using an identifying code pattern. The power supply and modulation circuits for the emitters may be embedded in the material of the pad (not shown), or the emitters may be supplied by wire from elsewhere. The marker may also have a front surface comprising retroreflective material so as to be part of the surface returning projected images to the headset. A plurality of marker pads may be used in a given arrangement, with different codes broadcast by the modulated IR source so as to be particularly identified by the headset firmware or software. Equivalent marker configurations will be apparent to designers skilled in the art.
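The specification does not define the identifying code pattern, so the sketch below assumes simple on/off keying sampled once per camera frame: the brightness of the offset diode 606 is thresholded into bits and matched against a table of known pad codes. The 8-bit codes and the table itself are hypothetical.

```python
# Sketch: recovering a marker pad's identification code from the modulated
# inner LED. On/off keying sampled once per camera frame is an assumption;
# the specification only states that the diode carries an identifying code.
import numpy as np

KNOWN_CODES = {            # hypothetical 8-bit codes assigned to marker pads
    (1, 0, 1, 1, 0, 0, 1, 0): "pad_A",
    (1, 1, 0, 0, 1, 0, 1, 1): "pad_B",
}


def decode_marker_id(brightness_samples, bits_per_code=8):
    """Threshold the per-frame brightness of the offset diode into bits and
    look the bit pattern up in the table of known marker codes."""
    samples = np.asarray(brightness_samples, dtype=float)
    threshold = 0.5 * (samples.max() + samples.min())
    bits = tuple(int(s > threshold) for s in samples[:bits_per_code])
    return KNOWN_CODES.get(bits)   # None if the pattern is not recognised
```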
Figure 7a shows the typical optical paths from the projectors on the headset to a retroreflective surface 701 mounted to a frame 705. The nature of the retroreflective surface is such that the angle presented to the user is not critical and the surface may have bends, curves or flat sections. Figure 7b shows the optical paths 705 of light originating from a marker pattern 704 of illuminators that are tracked by the camera (103 in Fig. 1) so as to provide geometric data that can be mathematically processed to calculate the user line of sight with respect to the fixed surface. In this figure the marker of Fig. 6 has been embedded into the surface 701 such that openings are provided for the IR illumination, or alternately, the surface may be transparent to IR with a marker pad behind it. For the purposes of this specification the term "retroreflector" should be taken as any surface, transparent through opaque, that returns a significant amount of projected light directly back in the direction of the projector.
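One conventional way to turn the imaged positions of the four constant LEDs into the headset pose, and hence the sight line relative to the fixed surface, is a perspective-n-point solve. The sketch below uses OpenCV's solvePnP with an assumed 10 cm square LED layout and assumed camera intrinsics; the specification does not prescribe this algorithm or these values.

```python
# Sketch: estimating headset pose (and hence sight line) relative to the
# marker pad from the imaged positions of the four constant LEDs.
# The 10 cm square LED layout and the camera intrinsics are assumptions.
import numpy as np
import cv2

# Assumed marker geometry: four outer LEDs on a 10 cm square, pad plane z = 0.
MARKER_POINTS = np.array([[-0.05, -0.05, 0.0],
                          [ 0.05, -0.05, 0.0],
                          [ 0.05,  0.05, 0.0],
                          [-0.05,  0.05, 0.0]], dtype=np.float64)

CAMERA_MATRIX = np.array([[800.0,   0.0, 320.0],     # assumed intrinsics
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
DIST_COEFFS = np.zeros(5)                            # assume no distortion


def sight_line_from_leds(image_points):
    """image_points: 4x2 pixel coordinates of the constant LEDs, in the same
    order as MARKER_POINTS. Returns the camera origin and forward direction
    expressed in the marker pad's coordinate frame."""
    ok, rvec, tvec = cv2.solvePnP(MARKER_POINTS,
                                  np.asarray(image_points, dtype=np.float64),
                                  CAMERA_MATRIX, DIST_COEFFS)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)
    cam_origin = (-rot.T @ tvec).ravel()          # camera position, pad frame
    forward = rot.T @ np.array([0.0, 0.0, 1.0])   # camera +z axis, pad frame
    return cam_origin, forward
```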
The headset in Fig. 1 may be converted from projected mode to an enclosed near to eye virtual reality display by means of a "clip on" optical relay system attachment that redirects the output of the projectors to an image forming path steered directly to each of the corresponding user eyes. A cutaway diagram of the optical path of one side of the attachment is shown in Fig. 8a. In said diagram, the enclosure 801 is held in place by a clamping means to projector housing 102 on the headset frame 101 with hinge mechanism 805. The enclosure 801 contains means (not shown) to hold in place an arrangement of optical elements that steer the images generated by the projectors so as to be presented coaxial to the eyes of the user, and collimated to generate a visible image. In the shown embodiment the image from projector 102 is directed downward by mirror 802 and then forward by beamsplitter 803 and then reflected by shaped mirror 804 that provides a collimated image of correct polarization to go back through beamsplitter 803 and headset viewing lens 105. Diffractive, reflective or refractive optical elements may be placed in the optical system to change image properties. While this optical path has been described for this embodiment, many examples exist of near eye optical relay means used in the art of head mounted display, and those skilled in the art may design any number of alternate paths for this attachment.
Figure 8b shows the attachment as "flipped up" by means of hinge 805 such that the user may switch modes without completely removing the attachment. It is anticipated that the headset will have means (not shown) to electrically or optically detect the presence and position of the attachment such that the firmware and software associated with the system may make image corrections (such as inversion) necessary to support the mode in use. It is also anticipated that mechanical means (not shown) will be included such that the user can "flip down" the attachment from the raised position with a quick nodding head movement so as to switch to enclosed virtual reality mode without removing hands from keyboards, game controllers or other equipment. Fig. 8c shows a front view of the attachment clamped to the projectors, in the engaged position covering the face of the headset. This is drawn in x-ray style to show the headset behind it, but it should be considered as opaque. Those skilled in the art may design many other enclosures and means of attachment, such as by means of magnets or snaps or hook and loop fasteners etc., but in all designs, the fixture must not cover the camera 103, or restrict its field of view. Also nothing in this description precludes an implementation of half of the attachment, shown in Figure 8d, such as would be used for augmented reality applications feeding closed images or information to only a single eye.
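A minimal sketch of the anticipated mode handling follows. The attachment sensor is represented by a boolean, and a vertical flip stands in for whatever correction a particular optical relay would require; both are assumptions, since the text only states that corrections such as inversion are applied when the attachment is detected.

```python
# Sketch: applying per-mode image corrections when the clip-on attachment
# is detected. `attachment_engaged` stands in for the (unspecified)
# electrical or optical sensor; the vertical flip is an assumed correction.
import numpy as np

MODE_PROJECTED = "projected_ar"
MODE_NEAR_EYE = "near_eye_vr"


def current_mode(attachment_engaged):
    """Map the attachment sensor state to an operating mode."""
    return MODE_NEAR_EYE if attachment_engaged else MODE_PROJECTED


def correct_frame(frame, mode):
    """Apply the image correction needed for the active mode.
    frame: HxWx3 image array destined for one projector."""
    if mode == MODE_NEAR_EYE:
        return np.flipud(frame)   # e.g. invert for the relayed optical path
    return frame                  # projected mode: pass through unchanged
```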
Also, it would be clear to someone skilled in the art of optical relay that an equivalent attachment can be designed for the single projector embodiment disclosed in Fig. 5. Such an embodiment might involve a beamsplitter or active beam switch that relays images laterally to each eye prior to entering a system analogous to that shown in Fig. 8a.
Alternately, an optical relay may send the output of the projector to both eyes, where the unwanted frames are rejected by timed shutters or polarizing filters or spectral filters or other optical means.
In some augmented reality applications it is desirable to mix the images generated by the computer graphics system with the actual images of the real world. In order to achieve this end, the attachment may embody a means to provide a path for light to enter from the outside world, as shown in Fig. 9. In this embodiment, the enclosure is fitted with an opening and a forward facing lens or lens system 901, to gather external light and pass it through filtering means 902 and semi reflective mirror 804 before joining the coaxial optical path described above in Fig. 8a. Optical properties of the projection or external path, such as field of view, anamorphic correction and color correction, can be modified by attachments with refractive, diffractive and reflective optical elements. The filtering means 902 may include polarizers, or electronic shutters, or spectral filters, or other means of masking or blocking parts of the image gathered by lens or lens system 901. Electronic means for control of said optical operations are not shown but are known to those skilled in the art. Alternately, a "see through" mode can be achieved by attaching one or more cameras 1001 to the front of the enclosure as shown in Fig. 10. In this embodiment the images of the external world are relayed electronically (not shown) to graphical mixing firmware and software (also not shown) which control the masking and substitution or overlaying of CGI images, as is well known in the art. The embodiment of Fig. 10 is particularly useful when combined with image processing software such as has been developed to track finger movements and gestures by means of images returned by video cameras.
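As a rough illustration of the electronic see-through mixing, the sketch below composites a CGI frame over the forward camera image using a per-pixel mask. The mask-based alpha blend is a standard technique and only an assumed realisation of the masking, substitution and overlay the text describes.

```python
# Sketch: mask-based mixing of CGI over the forward camera image for the
# "electronic see through" mode. The per-pixel alpha mask is an assumed
# realisation of the masking/substitution described in the text.
import numpy as np


def composite(camera_frame, cgi_frame, cgi_mask):
    """camera_frame, cgi_frame: HxWx3 uint8 arrays of the same size.
    cgi_mask: HxW array in [0, 1]; 1 means the CGI pixel fully replaces
    the real-world pixel, 0 keeps the camera image."""
    cam = camera_frame.astype(np.float32)
    cgi = cgi_frame.astype(np.float32)
    alpha = cgi_mask.astype(np.float32)[..., None]   # broadcast over RGB
    mixed = alpha * cgi + (1.0 - alpha) * cam
    return mixed.astype(np.uint8)
```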
CONCLUSION
An illustrative embodiment has been described by way of example herein. Those skilled in the art will understand, however, that changes and modifications may be made to this embodiment without departing from the true scope and spirit of the elements, products, and methods to which the embodiment is directed, which is defined by my claims.


Claims

CLAIMS
I claim:
1. A head mounted display comprising:
a headset or glasses frame supporting one or more image projectors;
said projectors mounted closely above or below the vertical pupil center line;
one or more retroreflective surfaces;
said surfaces returning projected images to said headset;
a filtering means to reduce the brightness of unwanted images originating from said projectors mounted on opposite sides of said headset.
2. The head mounted display of claim 1, wherein said filtering means comprises:
a first polarizing filter applied to a first projector;
a second polarizing filter applied to a second projector with polarization orientation of said second filter orthogonal to that of said first polarizing filter;
a first viewing lens with polarizing filter;
said first viewing lens on the same side of said headset as said first projector;
said first polarizing filter on said first viewing lens arranged so as to reject reflected images passed through said second polarizing filter on said second projector;
a second viewing lens with polarizing filter;
said second viewing lens on the same side of said headset as said second projector;
said second polarizing filter on said second viewing lens arranged so as to reject reflected images passed through said first polarizing filter on said first projector.
3. The head mounted display of claim 2, wherein the polarization type of the light projected on each side is planar.
4. The head mounted display of claim 2, wherein the polarization type of said light projected on each side is circular.
5. The head mounted display of claim 1, wherein said filtering means comprises:
a first spectral filter applied to a first projector;
a second spectral filter applied to a second projector;
said second spectral filter passing parts of the visible spectrum disjoint from said first spectral filter;
a first viewing lens with spectral filter;
said first viewing lens on the same side of said headset as said first projector;
said first spectral filter at said first viewing lens arranged so as to reject reflected images passed through said second spectral filter at said second projector;
a second viewing lens with spectral filter;
said second viewing lens on the same side of said headset as said second projector;
said second spectral filter at said second viewing lens arranged so as to reject reflected images passed through said first spectral filter on said first projector.
6. The head mounted display of claim 5, wherein said spectral filtering of said projectors is by means of color selection in the encoding of the pixels of the images projected or by means of the emission spectrum of the physical illuminators employed.
7. The head mounted display of claim 1, wherein said filtering means comprises:
first and second said image projectors having alternate time slots for image projection;
first and second viewing lenses with attached or internal transparency switching means;
said switching means coordinated with said projection time slots so as to block images originating from opposite side projectors.
8. The head mounted display of claim 1, wherein said filtering means comprises:
an anisotropic retroreflective surface having long axis of anisotropy in the vertical orientation;
a vertical alignment of said projectors over or under the central position of the eye positions of said headset;
said anisotropic retroreflective surface having a reflective brightness pattern sufficiently narrow in the horizontal dimension so as to isolate reflected images.
9. A system comprising:
the head mounted display of claim 1;
one or more cameras mounted on said headset for receiving optical signals from a geometric array of optical emitters;
said emitters mounted in conjunction with said retroreflective surface wherein one of said emitters in said array sends a coded identification pattern.
10. The system of claim 9, wherein said emitters project infrared light.
11. The system of claim 9, further incorporating means to calculate the user sight line with regard to the said array of emitters from said optical signals.
12. A system comprising:
a projection augmented reality headset or glasses;
a removable attachment;
said attachment mountable to one side or both sides of the front of said headset;
said attachment incorporating means to reconfigure operation of said headset to that of a near eye display system.
13. The attachment of claim 12, further incorporating:
lenses for receiving images from the real world in front of the user;
means for mixing said images with images from said projectors.
14. The attachment of claim 13 further incorporating:
means to mask spatial portions of said images from the real world prior to mixing with images from said projectors.
15. The attachment of claim 12 further incorporating:
a hinged mounting means;
said means facilitating the switching of operation mode of said headset by rotating said attachment in and out of the path of image projection by said headset.
16. The attachment of claim 15 further incorporating:
an electrical or optical sensor;
said sensor incorporating means for providing information to the system firmware or software as to the presence and/or position of said attachment.
17. A method for displaying augmented reality comprising the steps:
projecting images from one or more head mounted image projectors;
reflecting said images back to said headset by means of one or more retroreflective surfaces;
filtering said reflected images by means selected from the set of time sequencing, polarization, spectral usage or spatial brightness pattern;
passing filtered images to selected user eyes.
18. A method for displaying virtual reality comprising the steps:
providing a head mounted projected augmented reality appliance;
attaching an optical apparatus to said appliance;
said apparatus redirecting projected images into near eye mode.
19. A method for switching the mode of operation of a head mounted display comprising the steps:
nodding of head causing the lowering of an optics apparatus without use of hands;
said apparatus redirecting projected images into near eye mode.
20. A method for tracking the sight line of a head mounted display comprising the steps:
imaging an asymmetric pattern of five or more infrared light emitting diodes with one or more high resolution electronic cameras;
fixing said pattern of emitters in position with respect to a world position;
modulating a diode among said emitters having unique position in the pattern with a unique identification code number;
encoding said modulation so as to enable demodulation by image processing of the signal from said imaging cameras;
extracting said unique identification code number from said demodulation;
looking up stored reference size and shape information related to said identification number;
solving for sight line coordinates by analyzing the image from said imaging cameras against the stored size and shape of said reference pattern.
PCT/US2014/060020 2013-10-15 2014-10-10 System and method for reconfigurable projected augmented/virtual reality appliance WO2015057507A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2016524419A JP2016536635A (en) 2013-10-15 2014-10-10 System and method for reconfigurable projection augmented reality / virtual reality appliance
MX2016004537A MX2016004537A (en) 2013-10-15 2014-10-10 System and method for reconfigurable projected augmented/virtual reality appliance.
EP14853675.8A EP3058417A4 (en) 2013-10-15 2014-10-10 System and method for reconfigurable projected augmented/virtual reality appliance
KR1020167012649A KR20160075571A (en) 2013-10-15 2014-10-10 System and method for reconfigurable projected augmented/virtual reality appliance
CA2926687A CA2926687A1 (en) 2013-10-15 2014-10-10 System and method for reconfigurable projected augmented/virtual reality appliance
CN201480064107.0A CN105765444A (en) 2013-10-15 2014-10-10 System and method for reconfigurable projected augmented/virtual reality appliance

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361961446P 2013-10-15 2013-10-15
US61/961,446 2013-10-15
US14/267,325 US20140340424A1 (en) 2013-05-17 2014-05-01 System and method for reconfigurable projected augmented/virtual reality appliance
US14/267,325 2014-05-01

Publications (1)

Publication Number Publication Date
WO2015057507A1 true WO2015057507A1 (en) 2015-04-23

Family

ID=52828565

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/060020 WO2015057507A1 (en) 2013-10-15 2014-10-10 System and method for reconfigurable projected augmented/virtual reality appliance

Country Status (7)

Country Link
EP (1) EP3058417A4 (en)
JP (1) JP2016536635A (en)
KR (1) KR20160075571A (en)
CN (1) CN105765444A (en)
CA (1) CA2926687A1 (en)
MX (1) MX2016004537A (en)
WO (1) WO2015057507A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180073166A (en) 2016-12-22 2018-07-02 엘지디스플레이 주식회사 Augmented reality device
CN107065185A (en) * 2017-03-30 2017-08-18 联想(北京)有限公司 A kind of control method and electronic equipment
US10771773B2 (en) * 2017-05-11 2020-09-08 Htc Corporation Head-mounted display devices and adaptive masking methods thereof
KR102461253B1 (en) * 2017-07-24 2022-10-31 삼성전자주식회사 Projection display apparatus including eye tracker
KR101969797B1 (en) * 2017-08-16 2019-08-21 김석배 Image realization apparatus using potable projector
KR20190085368A (en) 2018-01-10 2019-07-18 삼성전자주식회사 Folding-type wearable electronic device with optical transfering member transfer to transparent member from projector
CN110967166B (en) * 2018-09-28 2022-07-01 舜宇光学(浙江)研究院有限公司 Detection method, detection device and detection system of near-eye display optical system
CN109542240B (en) * 2019-02-01 2020-07-10 京东方科技集团股份有限公司 Eyeball tracking device and method
WO2021040076A1 (en) * 2019-08-27 2021-03-04 엘지전자 주식회사 Electronic device
TWI727725B (en) * 2020-03-27 2021-05-11 台灣骨王生技股份有限公司 Surgical navigation system and its imaging method
KR20220068431A (en) * 2020-11-19 2022-05-26 삼성전자주식회사 Augmented reality wearable electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4340274A (en) * 1979-01-11 1982-07-20 Redifon Simulation Limited Visual display apparatus
US6356392B1 (en) * 1996-10-08 2002-03-12 The Microoptical Corporation Compact image display system for eyeglasses or other head-borne frames
US20080062297A1 (en) * 2006-09-08 2008-03-13 Sony Corporation Image capturing and displaying apparatus and image capturing and displaying method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2625845B1 (en) * 2010-10-04 2021-03-03 Gerard Dirk Smits System and method for 3-d projection and enhancements for interactivity


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
HOLLISTER.: "How two Valve engineers walked away with the company's augmented reality glasses.", THE VERGE., 18 May 2013 (2013-05-18), pages 1 - 8, XP055334092, Retrieved from the Internet <URL:http://www.theverge.com/2013/5/18/4343382/technical-illusions-valve-augmented-reality-glasses-jeri-ellsworth-rick-johnson> [retrieved on 20141223] *
See also references of EP3058417A4 *

Also Published As

Publication number Publication date
JP2016536635A (en) 2016-11-24
CN105765444A (en) 2016-07-13
EP3058417A1 (en) 2016-08-24
MX2016004537A (en) 2017-05-10
EP3058417A4 (en) 2017-12-06
CA2926687A1 (en) 2015-04-23
KR20160075571A (en) 2016-06-29

Similar Documents

Publication Publication Date Title
US20140340424A1 (en) System and method for reconfigurable projected augmented/virtual reality appliance
WO2015057507A1 (en) System and method for reconfigurable projected augmented/virtual reality appliance
US10409073B2 (en) Virtual reality attachment for a head mounted display
CN109491087B (en) Modular detachable wearable device for AR/VR/MR
Azuma A survey of augmented reality
CN110476105A (en) Uniformity improves and the Waveguide display of the cross-coupling reduction between color
KR101883090B1 (en) Head mounted display
US10061137B2 (en) Smart head-mounted projection system
EP3714318B1 (en) Position tracking system for head-mounted displays that includes sensor integrated circuits
KR20220099580A (en) Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking
US9298255B2 (en) Transmissive display apparatus and operation input method
US10890771B2 (en) Display system with video see-through
US10990062B2 (en) Display system
TW201802642A (en) System f for decting line of sight
Itoh et al. Beaming displays
CN108427194A (en) A kind of display methods and equipment based on augmented reality
Mulder et al. A modular system for collaborative desktop vr/ar with a shared workspace
US10802281B2 (en) Periodic lenses systems for augmented reality
AU2014334682A1 (en) System and method for reconfigurable projected augmented/virtual reality appliance
US20200166752A1 (en) Display for use in display apparatus
CN109565584A (en) The dynamic of three-dimensional digit content is reconfigured quasi-
CN108696740A (en) A kind of live broadcasting method and equipment based on augmented reality
Calabrò et al. Wearable augmented reality optical see through displays based on integral imaging
US11860371B1 (en) Eyewear with eye-tracking reflective element
US20220342222A1 (en) Eyewear having a projector with heat sink shields

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14853675; Country of ref document: EP; Kind code of ref document: A1)
ENP  Entry into the national phase (Ref document number: 2926687; Country of ref document: CA)
REEP Request for entry into the european phase (Ref document number: 2014853675; Country of ref document: EP)
WWE  Wipo information: entry into national phase (Ref document number: 2014853675; Country of ref document: EP / Ref document number: MX/A/2016/004537; Country of ref document: MX)
ENP  Entry into the national phase (Ref document number: 2016524419; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP  Entry into the national phase (Ref document number: 2014334682; Country of ref document: AU; Date of ref document: 20141010; Kind code of ref document: A)
ENP  Entry into the national phase (Ref document number: 20167012649; Country of ref document: KR; Kind code of ref document: A)
REG  Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112016008106; Country of ref document: BR)
REG  Reference to national code (Ref country code: BR; Ref legal event code: B01E; Ref document number: 112016008106; Country of ref document: BR)
ENPW Started to enter national phase and was withdrawn or failed for other reasons (Ref document number: 112016008106; Country of ref document: BR)