US20020057270A1 - Virtual reality method - Google Patents

Virtual reality method

Info

Publication number
US20020057270A1
US20020057270A1 (application Ser. No. 10/035,532)
Authority
US
United States
Prior art keywords
image
pointer
direction signal
matrix
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/035,532
Inventor
Tsao Hao-Yu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Assigned to ACER INCORPORATED reassignment ACER INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HAO-YU, TSAO
Publication of US20020057270A1 publication Critical patent/US20020057270A1/en
Assigned to WISTRON CORP. reassignment WISTRON CORP. 1/2 RIGHT, TITLE & INTEREST Assignors: ACER INC.
Abandoned legal-status Critical Current

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05G - CONTROL DEVICES OR SYSTEMS INSOFAR AS CHARACTERISED BY MECHANICAL FEATURES ONLY
    • G05G1/00 - Controlling members, e.g. knobs or handles; Assemblies or arrangements thereof; Indicating position of controlling members
    • G05G1/02 - Controlling members for hand actuation by linear movement, e.g. push buttons
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Position Input By Displaying (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A virtual reality method first provides a plurality of images, which are connected in series as an image sequence. Then, a pointer to a target-image in the image sequence is provided. When a direction signal is received, the pointer is moved to the adjacent image next to the target-image if the direction signal indicates a first direction, and to the adjacent image previous to the target-image if the direction signal indicates a second direction.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a virtual reality (VR) method, and particularly to a virtual reality method in which users can build up a virtual reality system by directly adopting the images arranged in a matrix form. [0002]
  • 2. Description of the Related Art [0003]
  • Conventionally, there are two methods to build up a virtual reality system. The first method is to photograph an object or the environment using a panoramic camera. The second method is to establish the three-dimensional (3D) digital information of an object or the environment so as to achieve the effect of virtual reality. [0004]
  • The first method requires a panoramic camera, related software (or plug-ins) for producing and playing the photos, and technical personnel trained to operate the camera and software. However, the panoramic camera and the software are expensive, and users often lack the time to learn the required skills. As a result, using the first method to build up a virtual reality system is unrealistic for general users. [0005]
  • On the other hand, establishing the 3D digital information of an object or the environment requires familiarity with a software tool, such as AutoCAD. For an art designer or marketing personnel, the time and money needed to learn such a tool are also unrealistic. [0006]
  • SUMMARY OF THE INVENTION
  • It is therefore an object of the present invention to provide a virtual reality method in which users can build up a virtual reality system by adopting images arranged in a matrix form. In addition, the present invention also helps art designers or marketing personnel to design an interactive VR object or VR character by simple concept and operation. [0007]
  • To achieve the above object, the present invention provides a virtual reality method that operates over different horizontal angles. First, a plurality of images is provided, and these images are connected in series as an image sequence. Then, a pointer to a target-image in the image sequence is provided, wherein the target-image is one of the images in the image sequence. Finally, a direction signal is received; the pointer is moved to the adjacent image next to the target-image when the direction signal indicates a first direction, and to the adjacent image previous to the target-image when the direction signal indicates a second direction. [0008]
  • Furthermore, the present invention also provides a virtual reality method that operates over different horizontal angles and different overlooking angles. First, a plurality of images is provided, and these images are arranged into a matrix. Then, a pointer to a target-image in the matrix is provided, wherein the target-image is one of the images in the matrix. Finally, a direction signal is received: the pointer points to the adjacent image next to the target-image when the direction signal indicates a first direction; to the adjacent image previous to the target-image when the direction signal indicates a second direction; to the adjacent image above the target-image when the direction signal indicates a third direction; and to the adjacent image below the target-image when the direction signal indicates a fourth direction. [0009]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The aforementioned objects, features and advantages of this invention will become apparent by referring to the following detailed description of the preferred embodiment with reference to the accompanying drawings, wherein: [0010]
  • FIG. 1 is a flow chart illustrating the operation of a virtual reality method according to the first embodiment of the present invention; [0011]
  • FIG. 2 is a schematic diagram showing an example of photographing the images of an object from different horizontal angles as disclosed in the first embodiment; [0012]
  • FIG. 3 is a flow chart illustrating the operation of a virtual reality method according to the second embodiment of the present invention; [0013]
  • FIG. 4 is a schematic diagram showing an example of photographing the images of an object from different horizontal angles and different overlooking angles as disclosed in the second embodiment; and [0014]
  • FIG. 5 is a schematic diagram showing an example of the image matrix in the second embodiment. [0015]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • Referring to the accompanying figures, the preferred embodiments of the present invention are described as follows. [0016]
  • [First Embodiment][0017]
  • FIG. 1 illustrates the operation of a virtual reality method according to the first embodiment of the present invention, and FIG. 2 shows an example of photographing the images of an object from different horizontal angles in the first embodiment. Referring to FIGS. 1 and 2, the first embodiment of the present invention is described as follows. [0018]
  • The first embodiment of the present invention represents a virtual reality method that operates over different horizontal angles. First, in step S100, a plurality of images is provided, and these images are connected in series as an image sequence. These images are photos of an object taken at different positions on a circle having a fixed radius, and there is a predetermined angle difference between one image and its adjacent image in the image sequence. [0019]
  • For example, referring to FIG. 2, the images described above are sixteen photos of an object 20 shot by a camera 30 at the different positions (1 to 16) on a circle, and the predetermined angle difference between two adjacent positions is a 24-degree horizontal angle. The sixteen images are then connected in series as an image sequence. [0020]
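As an editorial illustration (not part of the patent text), the capture geometry above can be sketched in Python. Note that the patent specifies a 24-degree step, although sixteen evenly spaced positions on a full circle would actually be 360/16 = 22.5 degrees apart; the function name and the even spacing used here are assumptions.

```python
import math

def camera_positions(radius, n_shots=16):
    """Compute (x, y) camera positions evenly spaced on a circle
    around an object at the origin, one position per photo.

    Assumption: positions are evenly spaced, giving a step of
    360/16 = 22.5 degrees for sixteen shots (the patent states
    24 degrees, which would not close a full circle).
    """
    step = 360.0 / n_shots  # degrees between adjacent positions
    positions = []
    for i in range(n_shots):
        angle = math.radians(i * step)
        positions.append((radius * math.cos(angle),
                          radius * math.sin(angle)))
    return positions
```

Photographing the object once from each returned position yields the image sequence described above.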
  • In step S105, a pointer to a target-image in the image sequence is provided, wherein the target-image is one of the images in the image sequence. In step S110, a direction signal is received. Then, in step S115, it is determined whether the direction signal is a right signal. [0021]
  • If the direction signal is a right signal (namely, the first direction), then in step S120 it is determined whether the pointer is pointing to the last image of the image sequence. If it is, then in step S125 the pointer is altered to point to the first image of the image sequence; if it is not, then in step S130 the pointer is altered to point to the adjacent image next to the target-image in the image sequence. [0022]
  • For example, if the direction signal is a right signal and the pointer is pointing to the third image of the sixteen images in FIG. 2, then the pointer is altered to point to the fourth image. If the direction signal is a right signal and the pointer is pointing to the sixteenth image of the sixteen images in FIG. 2, then the pointer is altered to point to the first image. [0023]
  • In step S155, the image indicated by the pointer is displayed, and the process returns to step S110 to receive another direction signal. [0024]
  • In addition, if the direction signal is not a right signal, then in step S135 it is determined whether the direction signal is a left signal. [0025]
  • If the direction signal is a left signal (namely, the second direction), then in step S140 it is determined whether the pointer is pointing to the first image of the image sequence. If it is, then in step S145 the pointer is altered to point to the last image of the image sequence; if it is not, then in step S150 the pointer is altered to point to the adjacent image previous to the target-image in the image sequence. [0026]
  • For example, if the direction signal is a left signal and the pointer is pointing to the third image of the sixteen images in FIG. 2, then the pointer is altered to point to the second image. If the direction signal is a left signal and the pointer is pointing to the first image of the sixteen images in FIG. 2, then the pointer is altered to point to the sixteenth image. [0027]
  • Then, in step S155, the image pointed to by the pointer is displayed, and the process returns to step S110 to receive another direction signal. [0028]
  • As well, if the direction signal is not a left signal, the process goes directly to step S110 to receive another direction signal. [0029]
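The pointer-update loop of steps S110 through S155 can be condensed into a short sketch (an illustrative Python rendering by the editor, not code from the patent; zero-based indexing is assumed, so "the third image" is index 2):

```python
def navigate_sequence(pointer, direction, n_images):
    """One pointer update for the first embodiment (steps S115-S150).

    'right' (the first direction) advances to the next image,
    wrapping from the last image back to the first (S120-S130);
    'left' (the second direction) retreats to the previous image,
    wrapping from the first image to the last (S140-S150).
    Any other signal leaves the pointer unchanged.
    """
    if direction == "right":
        return 0 if pointer == n_images - 1 else pointer + 1
    if direction == "left":
        return n_images - 1 if pointer == 0 else pointer - 1
    return pointer
```

With sixteen images, moving right from index 15 (the sixteenth image) yields index 0 (the first image), matching the wraparound example in the text.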
  • [Second Embodiment][0030]
  • FIG. 3 illustrates the operation of a virtual reality method according to the second embodiment of the present invention, and FIG. 4 shows an example of photographing the images of an object from different horizontal angles and different overlooking angles. Referring to FIGS. 3 and 4, the second embodiment of the present invention is described in detail as follows. [0031]
  • The second embodiment of the present invention represents a virtual reality method that operates over different horizontal angles and different overlooking angles. First, in step S200, a plurality of images is provided, and these images are arranged into a matrix. [0032]
  • These images are photos of an object taken at different positions on a virtual spherical surface. The images in the same row of the matrix are photographed from the same overlooking angle but different horizontal angles, and there is a predetermined horizontal angle difference between one image and its adjacent image in any row. In addition, the images in the same column of the matrix are photographed from the same horizontal angle but different overlooking angles, and there is a predetermined overlooking angle difference between one image and its adjacent image in any column. [0033]
  • For example, referring to FIG. 4, the images described above are photos of an object 20 shot by a camera 30 at different positions on a virtual spherical surface. At position B, which corresponds to one predetermined overlooking angle, sixteen photos of the object 20 can be shot by the camera 30 at the different positions (1 to 16) on a circle, with a predetermined horizontal angle difference of 24 degrees between two adjacent positions. The same applies to groups A, B, C, . . . , and F. Therefore, there are 96 (6×16) images in this example, and these 96 images are arranged into a matrix 40, as shown in FIG. 5. [0034]
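The 6-by-16 arrangement just described can be sketched as follows (an illustrative Python fragment by the editor; the row-major ordering of the input list is an assumption, as is the function name):

```python
def build_image_matrix(images, n_rows=6, n_cols=16):
    """Arrange captured images into the matrix of FIG. 5.

    Rows correspond to overlooking angles (groups A..F) and
    columns to horizontal positions (1..16). `images` is assumed
    to be ordered row by row: all of group A first, then group B,
    and so on, 96 images in total for the default sizes.
    """
    if len(images) != n_rows * n_cols:
        raise ValueError("expected %d images" % (n_rows * n_cols))
    # Slice the flat list into n_rows rows of n_cols images each.
    return [images[r * n_cols:(r + 1) * n_cols] for r in range(n_rows)]
```

Any placeholder objects (file paths, decoded frames) can stand in for the images; only the ordering matters for the navigation steps that follow.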
  • Then, in step S205, a pointer to a target-image in the matrix 40 is provided, wherein the target-image is one of the images in the matrix 40. In step S210, a direction signal is received. Then, in step S215, it is determined whether the direction signal is a right signal. [0035]
  • If the direction signal is a right signal (namely, the first direction), then in step S220 it is determined whether the pointer is pointing to an image in the last column of the matrix 40. If it is, then in step S225 the pointer is altered to point to the image in the first column of the same row of the matrix 40; if it is not, then in step S230 the pointer is altered to point to the adjacent image next to the target-image in the same row. [0036]
  • For example, if the direction signal is a right signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, the pointer is altered to point to the fourth image in row B. If the direction signal is a right signal and the pointer is pointing to the sixteenth image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the first image in row B. [0037]
  • In step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal. [0038]
  • In addition, if the direction signal is not a right signal, then in step S235 it is determined whether the direction signal is a left signal. [0039]
  • If the direction signal is a left signal (namely, the second direction), then in step S240 it is determined whether the pointer is pointing to an image in the first column of the matrix 40. If it is, then in step S245 the pointer is altered to point to the image in the last column of the same row of the matrix 40; if it is not, then in step S250 the pointer is altered to point to the adjacent image previous to the target-image in the same row. [0040]
  • For example, if the direction signal is a left signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the second image in row B. If the direction signal is a left signal and the pointer is pointing to the first image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the sixteenth image in row B. [0041]
  • Then, in step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal. [0042]
  • Furthermore, if the direction signal is not a left signal, then in step S255 it is determined whether the direction signal is an up signal. [0043]
  • If the direction signal is an up signal (namely, the third direction), then in step S260 it is determined whether the pointer is pointing to an image in the first row of the matrix 40. If it is, then in step S265 the pointer remains pointing to the same image in the first row of the matrix 40, since there is no row above it; if it is not, then in step S270 the pointer is altered to point to the adjacent image above the target-image in the same column. [0044]
  • For example, if the direction signal is an up signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the third image in row A. If the direction signal is an up signal and the pointer is pointing to the third image of the sixteen images in row A of the matrix 40 in FIG. 5, then the pointer remains pointing to the third image in row A, since row A is the first row. [0045]
  • Then, in step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal. [0046]
  • In addition, if the direction signal is not an up signal, then in step S275 it is determined whether the direction signal is a down signal. [0047]
  • If the direction signal is a down signal (namely, the fourth direction), then in step S280 it is determined whether the pointer is pointing to an image in the last row of the matrix 40. If it is, then in step S285 the pointer remains pointing to the same image in the last row of the matrix 40, since there is no row below it; if it is not, then in step S290 the pointer is altered to point to the adjacent image below the target-image in the same column. [0048]
  • For example, if the direction signal is a down signal and the pointer is pointing to the third image of the sixteen images in row B of the matrix 40 in FIG. 5, then the pointer is altered to point to the third image in row C. If the direction signal is a down signal and the pointer is pointing to the third image of the sixteen images in row F of the matrix 40 in FIG. 5, then the pointer remains pointing to the third image in row F, since row F is the last row. [0049]
  • Then, in step S295, the image pointed to by the pointer is displayed, and the process returns to step S210 to receive another direction signal. [0050]
  • As well, if the direction signal is not a down signal, the process goes directly to step S210 to receive another direction signal. [0051]
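The four-way pointer update of steps S215 through S290 can be summarized in one sketch (an illustrative Python rendering by the editor, not code from the patent; zero-based row/column indices are assumed). Note the asymmetry the steps describe: left/right wrap around, matching the closed circle of camera positions, while up/down stop at the first and last rows, matching the bounded range of overlooking angles.

```python
def navigate_matrix(row, col, direction, n_rows=6, n_cols=16):
    """One pointer update for the second embodiment (steps S215-S290).

    Horizontal moves wrap around within the row (S220-S250);
    vertical moves clamp at the top and bottom rows (S260-S290).
    Any other signal leaves the pointer unchanged.
    """
    if direction == "right":
        col = 0 if col == n_cols - 1 else col + 1
    elif direction == "left":
        col = n_cols - 1 if col == 0 else col - 1
    elif direction == "up":
        row = row - 1 if row > 0 else row      # clamp at first row
    elif direction == "down":
        row = row + 1 if row < n_rows - 1 else row  # clamp at last row
    return row, col
```

Displaying `matrix[row][col]` after each update reproduces the display step S295.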
  • As a result, users can photograph images of an object or the environment and use a simple image-editing tool, such as MS Paint, to connect or arrange the images into an image sequence or a matrix. Users can then utilize the image sequence or the matrix with the present invention to build up a virtual reality system. In addition, designers or marketing personnel can use the virtual reality method of the present invention to create interactive VR objects or VR characters with reduced learning time and expense. [0052]
  • Although the present invention has been described in its preferred embodiment, it is not intended to limit the invention to the precise embodiment disclosed herein. Those who are skilled in this technology can still make various alterations and modifications without departing from the scope and spirit of this invention. Therefore, the scope of the present invention shall be defined and protected by the following claims and their equivalents. [0053]

Claims (20)

What is claimed is:
1. A virtual reality method, comprising the steps of:
providing a plurality of images and connecting the images in series as an image sequence;
providing a pointer pointed to a target-image in the image sequence, wherein the target-image is one of the images in the image sequence;
receiving a direction signal;
determining the direction signal;
altering the pointer to point to an adjacent image next to the target-image in the image sequence when the direction signal is a first direction signal; and
altering the pointer to point to an adjacent image previous to the target-image in the image sequence when the direction signal is a second direction signal.
2. The method as claimed in claim 1 further comprising:
determining whether the pointer is pointing to the last image of the image sequence; and
altering the pointer to point to the first image of the image sequence when the direction signal is the first direction signal and the pointer is pointing to the last image of the image sequence.
3. The method as claimed in claim 1 further comprising:
determining whether the pointer is pointing to the first image of the image sequence; and
altering the pointer to point to the last image of the image sequence when the direction signal is the second direction signal and the pointer is pointing to the first image of the image sequence.
4. The method as claimed in claim 1 further comprising displaying the image pointed to by the pointer.
5. The method as claimed in claim 2 wherein the first direction signal is a right signal.
6. The method as claimed in claim 3 wherein the second direction signal is a left signal.
7. The method as claimed in claim 1 wherein the images are the photos of an object at different positions on a circle having a fixed radius, and there is a predetermined angle difference between one image and its adjacent image in the image sequence.
8. The method as claimed in claim 7 wherein the predetermined angle difference is a 24 degree horizontal angle.
9. A virtual reality method, comprising the steps of:
providing a plurality of images and arranging the images into a matrix;
providing a pointer pointed to a target-image in the matrix, wherein the target-image is one of the images in the matrix;
receiving a direction signal;
determining the direction signal;
altering the pointer to point to an adjacent image next to the target-image in the matrix when the direction signal is a first direction signal;
altering the pointer to point to an adjacent image previous to the target-image in the matrix when the direction signal is a second direction signal;
altering the pointer to point to an adjacent image above the target-image in the matrix when the direction signal is a third direction signal; and
altering the pointer to point to an adjacent image below the target-image in the matrix when the direction signal is a fourth direction signal.
10. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the last column of the matrix; and
altering the pointer to point to the image in the first column of the matrix when the direction signal is the first direction signal and the pointer is pointing to the image in the last column of the matrix.
11. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the first column of the matrix; and
altering the pointer to point to the image in the last column of the matrix when the direction signal is the second direction signal and the pointer is pointing to the image in the first column of the matrix.
12. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the first row of the matrix; and
altering the pointer to point to the image in the first row of the matrix when the direction signal is the third direction signal and the pointer is pointing to the image in the first row of the matrix.
13. The method as claimed in claim 9 further comprising:
determining whether the pointer is pointing to the image in the last row of the matrix; and
altering the pointer to point to the image in the last row of the matrix when the direction signal is the fourth direction signal and the pointer is pointing to the image in the last row of the matrix.
14. The method as claimed in claim 9 further comprising displaying the image pointed to by the pointer.
15. The method as claimed in claim 10 wherein the first direction signal is a right signal.
16. The method as claimed in claim 11 wherein the second direction signal is a left signal.
17. The method as claimed in claim 12 wherein the third direction signal is an up signal.
18. The method as claimed in claim 13 wherein the fourth direction signal is a down signal.
19. The method as claimed in claim 9 wherein the images are the photos of an object at different positions on a virtual spherical surface, and the images in the same row of the matrix represent the images photographed from the same overlooking angle but different horizontal angles, and there is a predetermined horizontal angle difference between one image and its adjacent image in one row, and the images in the same column of the matrix represent the images photographed from the same horizontal angle but different overlooking angles, and there is a predetermined overlooking angle difference between one image and its adjacent image in one column.
20. The method as claimed in claim 19 wherein the predetermined horizontal angle difference is a 24 degree horizontal angle.
US10/035,532 2000-03-14 2001-11-06 Virtual reality method Abandoned US20020057270A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW89204033 2000-03-14
TW089204033U TW516685U (en) 2000-03-14 2000-03-14 Touch-type pointing rod

Publications (1)

Publication Number Publication Date
US20020057270A1 true US20020057270A1 (en) 2002-05-16

Family

ID=21665241

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/793,461 Expired - Fee Related US6515652B2 (en) 2000-03-14 2001-02-26 Tactile pointing stick
US10/035,532 Abandoned US20020057270A1 (en) 2000-03-14 2001-11-06 Virtual reality method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/793,461 Expired - Fee Related US6515652B2 (en) 2000-03-14 2001-02-26 Tactile pointing stick

Country Status (2)

Country Link
US (2) US6515652B2 (en)
TW (1) TW516685U (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090153481A1 (en) * 2007-12-12 2009-06-18 Gunther Adam M Data output device having a plurality of key stick devices configured for reading out data to a user and method thereof
US20090239665A1 (en) * 2007-12-31 2009-09-24 Michael Minuto Brandable thumbstick cover for game controllers
JP4968694B2 (en) * 2008-06-19 2012-07-04 富士通コンポーネント株式会社 Input device
WO2017180726A1 (en) * 2016-04-12 2017-10-19 Mark Slotta Control stick cap with retention features

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6014142A (en) * 1995-11-13 2000-01-11 Platinum Technology Ip, Inc. Apparatus and method for three dimensional manipulation of point of view and object
US6195122B1 (en) * 1995-01-31 2001-02-27 Robert Vincent Spatial referenced photography
US6329963B1 (en) * 1996-06-05 2001-12-11 Cyberlogic, Inc. Three-dimensional display system: apparatus and method
US6597380B1 (en) * 1998-03-16 2003-07-22 Nec Corporation In-space viewpoint control device for use in information visualization system
US6636246B1 (en) * 2000-03-17 2003-10-21 Vizible.Com Inc. Three dimensional spatial user interface
US6668082B1 (en) * 1997-08-05 2003-12-23 Canon Kabushiki Kaisha Image processing apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5889508A (en) * 1996-09-26 1999-03-30 Slotta; Mark R. Cushion for keyboard cursor control stick
US6271834B1 (en) * 1998-05-29 2001-08-07 International Business Machines Corporation Integrated pointing device having tactile feedback
TW526446B (en) * 1999-08-27 2003-04-01 Darfon Electronics Corp Pointing stick device capable of sensing adapter pressure acutely

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100283796A1 (en) * 2004-03-03 2010-11-11 Gary Kramer System for Delivering and Enabling Interactivity with Images
US7956872B2 (en) * 2004-03-03 2011-06-07 Virtual Iris Studios, Inc. System for delivering and enabling interactivity with images
US9087413B2 (en) 2004-03-03 2015-07-21 Engram Networking Llc System for delivering and enabling interactivity with images
US7298378B1 (en) 2004-12-13 2007-11-20 Hagenbuch Andrew M Virtual reality universe realized as a distributed location network

Also Published As

Publication number Publication date
US20010022576A1 (en) 2001-09-20
TW516685U (en) 2003-01-01
US6515652B2 (en) 2003-02-04

Legal Events

Date Code Title Description
AS Assignment

Owner name: ACER INCORPORATED, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAO-YU, TSAO;REEL/FRAME:012451/0048

Effective date: 20011026

AS Assignment

Owner name: WISTRON CORP., TAIWAN

Free format text: 1/2 RIGHT, TITLE & INTEREST;ASSIGNOR:ACER INC.;REEL/FRAME:013254/0417

Effective date: 20021106

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION