US20070137052A1 - Using viewing-angle-sensitive visual tags to determine angular orientation and/or location - Google Patents

Using viewing-angle-sensitive visual tags to determine angular orientation and/or location

Info

Publication number
US20070137052A1
Authority
US
United States
Prior art keywords
angle
visual tag
observation point
physical location
tag
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/312,826
Other versions
US7228634B1
Inventor
James Reich
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Palo Alto Research Center Inc
Original Assignee
Palo Alto Research Center Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Palo Alto Research Center Inc filed Critical Palo Alto Research Center Inc
Priority to US11/312,826
Assigned to PALO ALTO RESEARCH CENTER INCORPORATED (assignment of assignors' interest). Assignor: REICH, JAMES E.
Application granted
Publication of US7228634B1
Publication of US20070137052A1
Legal status: Expired - Fee Related
Anticipated expiration

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 1/00 - Measuring angles
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 3/00 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S 3/78 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received, using electromagnetic waves other than radio waves
    • G01S 3/782 - Systems for determining direction or deviation from predetermined direction
    • G01S 3/783 - Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems
    • G01S 3/784 - Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived from static detectors or detector systems using a mosaic of detectors

Abstract

One embodiment of the present invention provides a system that uses a visual tag to determine an angle. During operation, the system observes the visual tag from an observation point, wherein the visual tag includes an angle-sensitive image which changes in appearance when observed from different angles. Next, the system uses the appearance of the angle-sensitive image (as observed from the observation point) to determine the angle between the visual tag and the observation point. In a variation on this embodiment, the system uses the determined angle along with supplemental information to determine the physical location of the observation point.

Description

    BACKGROUND
  • 1. Field of the Invention
  • The present invention relates to techniques for using viewing-angle-sensitive visual tags to determine the angular orientation and/or physical location of an object.
  • 2. Related Art
  • Recent technological developments have made Global Positioning System (GPS) receivers significantly cheaper and more portable. This has led to a proliferation of devices that use such GPS receivers to determine physical location. For example, such devices include automobile navigation systems and GPS-enabled running watches.
  • Unfortunately, because of attenuation problems for GPS signals within buildings, these GPS-enabled devices are typically ineffective at determining physical locations inside buildings. Physical locations inside a building can be determined by placing radio-frequency "beacon nodes" throughout the building. However, such beacon nodes are relatively complicated and expensive to deploy. Furthermore, these beacon nodes typically include batteries, which creates maintenance problems because the batteries need to be replaced at regular intervals.
  • It is also possible to use existing WiFi access points to determine the physical locations of objects inside a building based on the attenuation characteristics of WiFi signals within the building. However, determining locations in this way is not very accurate and requires potentially significant amounts of calibration effort. Furthermore, if furniture, access points, or even people move within a building, the system may require recalibration to effectively determine locations.
  • Another shortcoming of GPS systems is that they are not able to determine the angular orientations of objects. Such angular-orientation information can be useful in determining, for example, which direction a camera is pointing, or which direction an object is facing.
  • Hence, what is needed is a method and an apparatus for determining physical locations and/or angular orientations of objects without the limitations of the above-described techniques.
  • SUMMARY
  • One embodiment of the present invention provides a system that uses a visual tag to determine an angle. During operation, the system observes the visual tag from an observation point, wherein the visual tag includes an angle-sensitive image which changes in appearance when observed from different angles. Next, the system uses the appearance of the angle-sensitive image (as observed from the observation point) to determine the angle between the visual tag and the observation point.
  • In a variation on this embodiment, the system uses the determined angle along with supplemental information to determine the physical location of the observation point.
  • In a further variation, the system obtains the supplemental information by observing one or more additional visual tags from the observation point. The system then uses the appearance of angle-sensitive images in the additional visual tag(s) to determine an "additional angle" (or angles) between the additional visual tag(s) and the observation point. The system then uses this additional angle while determining the physical location of the observation point. One additional observation allows measurement of position in two dimensions, while a third observation set at 90 degrees to the other two allows measurement of three-dimensional position.
  • In a variation on this embodiment, the visual tag includes visible location information which indicates the physical location of the visual tag.
  • In a further variation, determining the physical location of the observation point involves using the visible location information and the appearance of the angle-sensitive image, along with the supplemental information (which partially constrains the location of the observation point) to determine the physical location of the observation point without requiring calibration or communication with an outside source.
  • In a variation on this embodiment, the visual tag includes a visible “tag identifier.” This allows a database lookup to be performed based on the tag identifier to return information associated with the tag.
  • In a variation on this embodiment, the angle-sensitive image includes a lenticular lens, which includes an array of optical elements (lenticules) which are configured so that when the lens is viewed from different angles, different areas under the lens are magnified.
  • In a variation on this embodiment, the angle-sensitive image includes a hologram, which presents differing images when viewed from different angles.
  • In a variation on this embodiment, the visual tag is affixed to a rotatable object. In this embodiment, the system uses the determined angle between the observation point and the visual tag to determine an angle of rotation for the rotatable object.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 illustrates a viewing-angle-sensitive visual tag in accordance with an embodiment of the present invention.
  • FIG. 2 illustrates how observations of two viewing-angle-sensitive visual tags are used to determine the physical location of an object in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates how observations of a single viewing-angle-sensitive visual tag are used to determine the physical location of an object in accordance with an embodiment of the present invention.
  • FIG. 4 illustrates how observations of a single viewing-angle-sensitive visual tag are used to determine the angular orientation of a rotatable object in accordance with an embodiment of the present invention.
  • FIG. 5 presents a flow chart illustrating the process of determining the physical location of an object by observing two viewing-angle-sensitive visual tags in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • The following description is presented to enable any person skilled in the art to make and use the invention, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention. Thus, the present invention is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
  • The data structures and code described in this detailed description are typically stored on a computer-readable storage medium, which may be any device or medium that can store code and/or data for use by a computer system. This includes, but is not limited to, magnetic and optical storage devices, such as disk drives, magnetic tape, CDs (compact discs) and DVDs (digital versatile discs or digital video discs).
  • Determining Physical Location and Angular Orientation
  • As is described above, one embodiment of the present invention uses observations of viewing-angle-sensitive visual tags to determine the physical location of a viewer (located at an observation point) relative to the viewing-angle-sensitive visual tags.
  • In a closely related problem, knowing the direction from a fixed location to a viewer (or the direction from an object to a viewer in the object's frame of reference) is difficult without first determining the orientation of the viewer in an absolute frame of reference. Determining the bearing from a fixed tag to the viewer currently involves calibrating a camera's optical system to determine the mapping from pixel positions to angles. Furthermore, determining this bearing in a global coordinate system requires that the camera's orientation be known in the global coordinate system. Determining the camera's orientation requires accurate real-time orientation measurements, which are often difficult to obtain. Furthermore, this technique requires that the camera be in communication with some type of remote data store which provides the location information for a given tag.
  • As is described below, one embodiment of the present invention uses viewing-angle-sensitive visual tags to efficiently solve the above-described problems.
  • Angle-Sensitive Visual Tag
  • FIG. 1 illustrates a viewing-angle-sensitive visual tag 102 in accordance with an embodiment of the present invention. By using a hologram or a lenticular lens system, visual tag 102 can be fabricated with an appearance that varies according to the direction to the viewer in the frame of reference of the visual tag. For example, the visual tag 102 illustrated in FIG. 1 has an identifiably different appearance (summarized as an ID code) for each angular range in a set of angular ranges. Thus, by determining which ID code a tag is displaying, the tag's angular orientation relative to the viewer, or equivalently the bearing from the tag to the viewer in the tag's frame of reference, can be determined.
  • Note that at the time of installation of a tag, it may be easier to map the (static) tag's location in a global coordinate system and then encode that result in the information displayed on the tag, thereby obviating the need for communications and camera position measurements to determine the tag's location.
  • A visual tag that includes a hologram or an image behind a lenticular lens array allows different discrete images to appear to a viewer, depending on the relative angle between the viewer and the tag. Normally, these technologies are used to display three-dimensional images. However, instead of displaying three-dimensional images, one embodiment of the present invention uses these technologies to display a discrete code at each angle. This code may simply be an identifier which is used to look up the identity, position, and angular orientation of the tag. Alternatively, the code can directly encode the tag's identity, physical location, and angular orientation in a common coordinate system, as well as the angle from the tag to the viewer.
  • For example, FIG. 1 illustrates how visual tag 102 displays a different pattern from each of three different viewing angles. This pattern encodes: the tag's identifier (Tag #1), the tag's location (10,10) and the angle from the tag to the viewer (−30°).
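  • To make this encoding concrete, the short Python sketch below shows one hypothetical reading of the FIG. 1 pattern. The TagReading structure and its field names are illustrative assumptions, not a format defined by this disclosure.

        from dataclasses import dataclass

        @dataclass
        class TagReading:
            """Payload assumed to be decodable from one displayed pattern."""
            tag_id: int          # the tag's identifier, e.g. Tag #1
            location: tuple      # the tag's position in a common coordinate system
            bearing_deg: float   # angle from the tag to the viewer, in the tag's frame

        # The pattern visible from -30 degrees in FIG. 1 would decode to:
        reading = TagReading(tag_id=1, location=(10, 10), bearing_deg=-30.0)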
  • One embodiment of the present invention uses tags which are sensitive to multiple wavelengths of light, including wavelengths not visible to the human eye, to allow for "invisible" tagging.
  • Location Determination from Two Angle-Sensitive Visual Tags
  • FIG. 2 illustrates how observations of two viewing-angle-sensitive visual tags 204 and 206 are used to determine the physical location of an object 202 in accordance with an embodiment of the present invention. In this example, the object 202 (which includes a visual sensor such as a camera) observes two visual tags 204 and 206. The appearance of visual tag 204 (as observed from object 202) allows the system to determine the angle 208 from object 202 to visual tag 204 in the frame of reference of visual tag 204. Similarly, the appearance of visual tag 206 allows the system to determine the angle 210 from object 202 to visual tag 206 in the frame of reference of visual tag 206.
  • Using well-known triangulation techniques, the angles 208 and 210, along with location and angular-orientation information for visual tags 204 and 206, can be used to determine the physical location of object 202. This process is described in more detail below with reference to the flow chart in FIG. 5.
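  • As a concrete illustration of this step, the Python sketch below intersects the two tag-to-viewer rays in the plane. It assumes each tag's position and angular orientation in the global frame are already known (read directly from the tag or looked up by identifier); the function names are invented for the example.

        import math

        def ray_intersection(p1, theta1, p2, theta2):
            """Intersect two 2-D rays, each anchored at a tag position and
            pointing along a global bearing (in radians) toward the viewer."""
            d1 = (math.cos(theta1), math.sin(theta1))
            d2 = (math.cos(theta2), math.sin(theta2))
            denom = d1[0] * d2[1] - d1[1] * d2[0]    # cross(d1, d2)
            if abs(denom) < 1e-9:
                raise ValueError("rays are parallel: no unique fix")
            vx, vy = p2[0] - p1[0], p2[1] - p1[1]
            t1 = (vx * d2[1] - vy * d2[0]) / denom   # cross(p2 - p1, d2) / cross(d1, d2)
            return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

        def locate_viewer(tag1_pos, tag1_orient, bearing1,
                          tag2_pos, tag2_orient, bearing2):
            """Each tag reports the bearing to the viewer in its own frame;
            adding the tag's global orientation converts it to a global ray."""
            return ray_intersection(tag1_pos, tag1_orient + bearing1,
                                    tag2_pos, tag2_orient + bearing2)

        # Example: two tags 10 units apart, each seeing the viewer at 45 degrees
        # in its own frame; the intersection is at (5.0, 5.0).
        print(locate_viewer((0.0, 0.0), 0.0, math.radians(45),
                            (10.0, 0.0), math.radians(90), math.radians(45)))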
  • Location Determination from a Single Angle-Sensitive Visual Tag
  • FIG. 3 illustrates how observations of a single viewing-angle-sensitive visual tag 304 are used to determine the physical location of an object 302 in accordance with an embodiment of the present invention. In this example, the object 302 is constrained to move along the fixed path represented by the dashed arrows in FIG. 3. For example, the fixed path might be a walkway through a building.
  • In this example, a visual sensor within object 302 observes a single visual tag 304. The appearance of visual tag 304 (as observed from object 302) allows the system to determine an angle 306 from object 302 to visual tag 304 in the frame of reference of visual tag 304.
  • Using well-known triangulation techniques, the physical location of object 302 is determined by considering: the location of the path indicated by the dashed arrows, the angle 306, and the location and angular-orientation information for visual tag 304.
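  • A minimal sketch of this single-tag case, reusing ray_intersection from the sketch above and assuming the constrained path is a straight segment, so that the viewer sits where the tag-to-viewer ray crosses that segment:

        import math

        def locate_on_path(path_start, path_end, tag_pos, tag_orient, bearing):
            """Cross the tag-to-viewer ray with a straight path segment. A real
            walkway would be piecewise linear, with each segment tried in turn."""
            ax, ay = path_start
            bx, by = path_end
            # Treat the path as a second "ray" anchored at its start point; the
            # result should then be checked to lie between path_start and path_end.
            seg_theta = math.atan2(by - ay, bx - ax)
            return ray_intersection(tag_pos, tag_orient + bearing,
                                    path_start, seg_theta)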
  • Angular Orientation Determination
  • FIG. 4 illustrates how observations of a single viewing-angle-sensitive visual tag 404 are used to determine the angular orientation of a rotatable object in accordance with an embodiment of the present invention. In this example, a fixed camera 406 observes a visual tag 404, which is attached to a rotating object 402. The rotating object 402 can be, for example, a camera, a directional microphone, or a door.
  • The appearance of visual tag 404 (as observed from camera 406) is used by the system to determine the angle 408 from camera 406 to visual tag 404 in the frame of reference of visual tag 404. This angle 408 can then be used to determine the angular orientation of rotatable object 402.
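  • In this configuration the computation reduces to a subtraction: the global bearing from the tag to the fixed camera is fixed by the two known positions, while the tag reports that same bearing in its own rotating frame. A minimal sketch under those assumptions:

        import math

        def object_orientation(camera_pos, object_pos, observed_bearing):
            """observed_bearing is the tag-frame angle (radians) read from the
            tag's angle-sensitive image; the result is the rotating object's
            global angular orientation."""
            theta = math.atan2(camera_pos[1] - object_pos[1],
                               camera_pos[0] - object_pos[0])
            return theta - observed_bearing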
  • Process of Determining Location from Two Angle-Sensitive Visual Tags
  • FIG. 5 presents a flow chart illustrating the process of determining the physical location of an object by observing two viewing-angle-sensitive visual tags in accordance with an embodiment of the present invention. At the beginning of this process, the system observes a "first visual tag" from an observation point (step 502). Next, the system uses the appearance of the first visual tag (as observed from the observation point) to determine a "first angle" from the observation point to the first visual tag in the frame of reference of the first visual tag (step 504). The system also determines the location of the first visual tag from information contained within the first visual tag (step 506). As mentioned above, this can involve obtaining the location information directly from the pattern presented by the first visual tag, or, alternatively, obtaining a tag identifier from the pattern presented by the first visual tag and then using this tag identifier to look up location and orientation information for the first visual tag in a remote data store.
  • The system also observes a “second visual tag” from the observation point (step 508), and similarly uses the appearance of the second visual tag (as observed from the observation point) to determine a “second angle” from the observation point to the second visual tag in the frame of reference of the second visual tag (step 510). The system also determines a location of the second visual tag from information contained within the second visual tag using the above-described techniques (step 512).
  • Finally, the system uses any one of a number of well-known triangulation techniques to determine the physical location of the observation point. In doing so, the system takes into account: the first angle, the location and orientation of the first visual tag, the second angle, and the location and orientation of the second visual tag (step 514).
  • Note that a third visual tag can be used to determine the physical location of the object in three dimensions. More specifically, the system can observe the third visual tag from the observation point, wherein the observation of the third visual tag is orthogonal to the observations of the first visual tag and the second visual tag. The system can then use the appearance of an angle-sensitive image in the third visual tag to determine a “third angle” between the third visual tag and the observation point. The system can then determine the physical location of the observation point in three dimensions by using well-known triangulation techniques based on the first angle, the second angle and the third angle.
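  • One way to realize the orthogonal third observation (an assumed geometry, since the disclosure leaves it general) is a third tag whose angle-sensitive axis is vertical, so its displayed code encodes an elevation angle; the viewer's height then follows from the horizontal distance to the 2-D fix:

        import math

        def locate_3d(xy_fix, tag3_pos, tag3_tilt, elevation_bearing):
            """xy_fix is the 2-D fix from the first two tags; tag3_pos is the
            third tag's (x, y, z) position. elevation_bearing is the tag-frame
            elevation angle read from the third tag, and tag3_tilt is the tag's
            known mounting tilt (both in radians)."""
            x, y = xy_fix
            horiz = math.hypot(x - tag3_pos[0], y - tag3_pos[1])
            elevation = tag3_tilt + elevation_bearing   # global elevation angle
            z = tag3_pos[2] + horiz * math.tan(elevation)
            return (x, y, z)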
  • The foregoing descriptions of embodiments of the present invention have been presented only for purposes of illustration and description. They are not intended to be exhaustive or to limit the present invention to the forms disclosed. Accordingly, many modifications and variations will be apparent to practitioners skilled in the art. Additionally, the above disclosure is not intended to limit the present invention. The scope of the present invention is defined by the appended claims.

Claims (29)

1. A method for using a visual tag that includes an angle-sensitive image to determine an angle, the method comprising:
observing the visual tag which includes the angle-sensitive image, wherein the visual tag is observed from an observation point;
wherein the angle-sensitive image changes in appearance when observed from different angles; and
using the appearance of the angle-sensitive image as observed from the observation point to determine a first angle between the visual tag and the observation point.
2. The method of claim 1, further comprising using the first angle along with supplemental information to determine a physical location of the observation point.
3. The method of claim 2,
wherein the method further comprises obtaining the supplemental information by observing a second visual tag from the observation point, and using the appearance of an angle-sensitive image in the second visual tag to determine a "second angle" between the second visual tag and the observation point; and
wherein determining the physical location of the observation point involves using the first angle and the second angle to determine the physical location of the observation point in two dimensions.
4. The method of claim 3,
wherein obtaining the supplemental information also involves,
observing a third visual tag from the observation point,
wherein the observation of the third visual tag is orthogonal to the observations of the first visual tag and the second visual tag, and
using the appearance of an angle-sensitive image in the third visual tag to determine a “third angle” between the third visual tag and the observation point; and
wherein determining the physical location of the observation point involves using the first angle, the second angle and the third angle to determine the physical location of the observation point in three dimensions.
5. The method of claim 1, wherein the visual tag includes visible location information which indicates the physical location of the visual tag.
6. The method of claim 5, wherein determining the physical location of the observation point involves using the visible location information and the appearance of the angle-sensitive image, along with the supplemental information, which partially constrains the physical location of the observation point, to determine the physical location of the observation point without requiring calibration or communication with an outside source.
7. The method of claim 1, wherein the visual tag includes a visible "tag identifier," which can be used to look up information associated with the visual tag.
8. The method of claim 1, wherein the angle-sensitive image includes a lenticular lens, which includes an array of optical elements (lenticules) which are configured so that when the lens is viewed from different angles, different areas under the lens are magnified.
9. The method of claim 1, wherein the angle-sensitive image includes a hologram, which presents differing images when viewed from different angles.
10. The method of claim 1,
wherein the visual tag is affixed to a rotatable object; and
wherein the method further comprises using the first angle between the observation point and the visual tag to determine an angle of rotation for the rotatable object.
11. A computer-readable storage medium storing instructions that when executed by a computer cause the computer to perform a method for using a visual tag that includes an angle-sensitive image to determine an angle, the method comprising:
observing the visual tag which includes the angle-sensitive image, wherein the visual tag is observed from an observation point;
wherein the angle-sensitive image changes in appearance when observed from different angles; and
using the appearance of the angle-sensitive image as observed from the observation point to determine a first angle between the visual tag and the observation point.
12. The computer-readable storage medium of claim 11, wherein the method further comprises using the first angle along with supplemental information to determine a physical location of the observation point.
13. The computer-readable storage medium of claim 12,
wherein the method further comprises obtaining the supplemental information by observing a second visual tag from the observation point, and using the appearance of an angle-sensitive image in the second visual tag to determine a "second angle" between the second visual tag and the observation point; and
wherein determining the physical location of the observation point involves using the first angle and the second angle to determine the physical location of the observation point in two dimensions.
14. The computer-readable storage medium of claim 13,
wherein obtaining the supplemental information also involves,
observing a third visual tag from the observation point,
wherein the observation of the third visual tag is orthogonal to the observations of the first visual tag and the second visual tag, and
using the appearance of an angle-sensitive image in the third visual tag to determine a “third angle” between the third visual tag and the observation point; and
wherein determining the physical location of the observation point involves using the first angle, the second angle and the third angle to determine the physical location of the observation point in three dimensions.
15. The computer-readable storage medium of claim 11, wherein the visual tag includes visible location information which indicates the physical location of the visual tag.
16. The computer-readable storage medium of claim 15, wherein determining the physical location of the observation point involves using the visible location information and the appearance of the angle-sensitive image, along with the supplemental information, which partially constrains the physical location of the observation point, to determine the physical location of the observation point without requiring calibration or communication with an outside source.
17. The computer-readable storage medium of claim 11, wherein the visual tag includes a visible "tag identifier," which can be used to look up information associated with the visual tag.
18. The computer-readable storage medium of claim 11, wherein the angle-sensitive image includes a lenticular lens, which includes an array of optical elements (lenticules) which are configured so that when the lens is viewed from different angles, different areas under the lens are magnified.
19. The computer-readable storage medium of claim 11, wherein the angle-sensitive image includes a hologram, which presents differing images when viewed from different angles.
20. The computer-readable storage medium of claim 11,
wherein the visual tag is affixed to a rotatable object; and
wherein the method further comprises using the first angle between the observation point and the visual tag to determine an angle of rotation for the rotatable object.
21. An apparatus that uses a visual tag that includes an angle-sensitive image to determine an angle, comprising:
an observation mechanism configured to observe the visual tag which includes the angle-sensitive image, wherein the visual tag is observed from an observation point;
wherein the angle-sensitive image changes in appearance when observed from different angles; and
a processing mechanism configured to use the appearance of the angle-sensitive image as observed from the observation point to determine a first angle between the visual tag and the observation point.
22. The apparatus of claim 21, wherein the processing mechanism is additionally configured to use the first angle along with supplemental information to determine a physical location of the observation point.
23. The apparatus of claim 22,
wherein the observation mechanism is additionally configured to obtain the supplemental information by observing a second visual tag from the observation point; and
wherein the processing mechanism is additionally configured to use the appearance of an angle-sensitive image in the second visual tag to determine a “second angle” between the second visual tag and the observation point; and
wherein the processing mechanism is configured to use the first angle and the second angle to determine the physical location of the observation point in two dimensions.
24. The apparatus of claim 23,
wherein the observation mechanism is additionally configured to obtain the supplemental information by observing a third visual tag from the observation point, wherein the observation of the third visual tag is orthogonal to the observations of the first visual tag and the second visual tag, and
wherein the processing mechanism is additionally configured to use the appearance of an angle-sensitive image in the third visual tag to determine a “third angle” between the third visual tag and the observation point; and
wherein the processing mechanism is configured to use the first angle, the second angle and the third angle to determine the physical location of the observation point in three dimensions.
25. The apparatus of claim 21, wherein the visual tag includes visible location information which indicates the physical location of the visual tag.
26. The apparatus of claim 25, wherein while determining the physical location of the observation point, the processing mechanism is configured to use the visible location information and the appearance of the angle-sensitive image, along with the supplemental information, which partially constrains the physical location of the observation point, to determine the physical location of the observation point without requiring calibration or communication with an outside source.
27. The apparatus of claim 21, wherein the visual tag includes a visible "tag identifier," which can be used to look up information associated with the visual tag.
28. The apparatus of claim 21, wherein the angle-sensitive image includes a lenticular lens, which includes an array of optical elements (lenticules) which are configured so that when the lens is viewed from different angles, different areas under the lens are magnified.
29. The apparatus of claim 21, wherein the angle-sensitive image includes a hologram, which presents differing images when viewed from different angles.
US11/312,826 2005-12-19 2005-12-19 Using viewing-angle-sensitive visual tags to determine angular orientation and/or location Expired - Fee Related US7228634B1

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/312,826 US7228634B1 2005-12-19 2005-12-19 Using viewing-angle-sensitive visual tags to determine angular orientation and/or location

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/312,826 US7228634B1 2005-12-19 2005-12-19 Using viewing-angle-sensitive visual tags to determine angular orientation and/or location

Publications (2)

Publication Number Publication Date
US7228634B1 2007-06-12
US20070137052A1 2007-06-21

Family

ID=38120394

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/312,826 Expired - Fee Related US7228634B1 2005-12-19 2005-12-19 Using viewing-angle-sensitive visual tags to determine angular orientation and/or location

Country Status (1)

Country Link
US (1): US7228634B1

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10313751B2 (en) 2016-09-29 2019-06-04 International Business Machines Corporation Digital display viewer based on location
CN110749311B (en) * 2019-09-12 2021-08-31 浙江大华技术股份有限公司 Positioning method, positioning device and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3914742A (en) * 1973-06-25 1975-10-21 Inst Produktudvikling Apparatus for use in optical reading machines for transforming a two-dimensional line pattern into opto-electronically detectable images
US5289264A (en) * 1991-09-26 1994-02-22 Hans Steinbichler Method and apparatus for ascertaining the absolute coordinates of an object
US5732473A (en) * 1996-01-23 1998-03-31 Gagnon; David R. Holographic sundial
US20040035012A1 (en) * 2002-08-26 2004-02-26 Moehnke Stephanie J. Measuring device having symbols viewable in multiple orientations
US6785972B2 (en) * 2001-07-11 2004-09-07 Varda Goldberg Method and system for recording a viewing point
US6819409B1 (en) * 1999-04-08 2004-11-16 Ovd Kinegram Ag System for reading an information strip containing optically coded information
US20050124870A1 (en) * 2003-08-22 2005-06-09 Jan Lipson Measuring analytes from an electromagnetic spectrum using a wavelength router
US6961174B1 (en) * 1999-08-05 2005-11-01 Dr. Johannes Heidenhain Gmbh Reflectometer and method for manufacturing a reflectometer
US7107693B2 (en) * 2005-01-14 2006-09-19 Illinois Institute Of Technology Apparatus and method for precise angular positioning
US20060209292A1 (en) * 2004-09-14 2006-09-21 Dowski Edward R Jr Low height imaging system and associated methods
US7123354B2 (en) * 2001-04-03 2006-10-17 Dr. Johannes Heidenhain Gmbh Optical position measuring device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010021367A1 (en) * 2010-05-25 2011-12-01 Deutsches Zentrum für Luft- und Raumfahrt e.V. Optical angle measuring device for use in optical adjusting unit of optical angle detecting system for measuring angle between surface to be detected and reference plane, has lenticular film made of transparent material
DE102010021367B4 (en) * 2010-05-25 2012-03-08 Deutsches Zentrum für Luft- und Raumfahrt e.V. Optical angle measuring device
EP3920090A1 (en) * 2020-06-03 2021-12-08 Pepperl+Fuchs SE Method and device for determining the angle between an object and a detector
DE102020206893A1 (en) 2020-06-03 2021-12-09 Pepperl+Fuchs Se Method and device for determining an angular position between an object and a detector

Also Published As

Publication number Publication date
US7228634B1 2007-06-12

Similar Documents

Publication Publication Date Title
US11694407B2 (en) Method of displaying virtual information in a view of a real environment
EP3149698B1 (en) Method and system for image georegistration
US11163997B2 (en) Methods and apparatus for venue based augmented reality
US10242454B2 (en) System for depth data filtering based on amplitude energy values
US9401050B2 (en) Recalibration of a flexible mixed reality device
US7088389B2 (en) System for displaying information in specific region
AU2011211601B2 (en) Method for providing information on object within view of terminal device, terminal device for same and computer-readable recording medium
CA3055316C (en) Target detection and mapping
US8629904B2 (en) Arrangement for presenting information on a display
US20120044264A1 (en) Apparatus and method for providing augmented reality
WO2011071948A2 (en) System and method for determining geo-location(s) in images
US20150199848A1 (en) Portable device for tracking user gaze to provide augmented reality display
US20110183684A1 (en) Mobile communication terminal and method
US20210217210A1 (en) Augmented reality system and method of displaying an augmented reality image
CN105606076B (en) Geodetic Measuring System
NO20120982A1 (en) Device, system and method for identifying objects in a digital image, as well as transponder device
US7228634B1 (en) Using viewing-angle-sensitive visual tags to determine angular orientation and/or location
TWM580186U (en) 360 degree surround orientation and position sensing object information acquisition system
CN112055034B (en) Interaction method and system based on optical communication device
CN111162840B (en) Method and system for setting virtual objects around optical communication device
US20220084258A1 (en) Interaction method based on optical communication apparatus, and electronic device
WO2023138747A1 (en) Method for a configuration of a camera, camera arrangement, computer program and storage medium
US20140316905A1 (en) System and method for three-dimensional advertising
CN114663491A (en) Method and system for providing information to a user in a scene

Legal Events

Date Code Title Description
AS Assignment

Owner name: PALO ALTO RESEARCH CENTER INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REICH, JAMES E.;REEL/FRAME:017360/0429

Effective date: 20051212

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20190612