WO2000031690A1 - Method and device for creating and modifying digital 3d models - Google Patents
- Publication number
- WO2000031690A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control
- control points
- user
- representation
- real time
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
Definitions
- the present invention relates to a device and a method for interactively creating and modifying digital three-dimensional (3D) objects in real time.
- NURBS Non-Uniform Rational B-Spline
- VR virtual reality
- VE virtual environments
- SE simulated environments
- an object of the present invention is to provide an improved and more easily manageable method and device for creating and modifying digital three-dimensional objects.
- FIG. 1 is a schematic view of an embodiment of a device according to the invention
- Fig. 2 is a schematic flow chart of a method according to an embodiment of the invention.
- Fig. 3 is an example of a screen image when using the invention.
- a device is schematically shown in Fig. 1.
- the device comprises a memory 1 for storing data describing a digital three-dimensional object in real time.
- the storage takes place at least in a mathematical functional representation with suitably positioned control points, but preferably also in a polygon representation.
- the storage may take place in a prior-art CAD format or the like, but preferably use is made of the Cosmo Binary (CSB) format which comprises the mathematical functional representation as well as the polygon representation.
- the control points are generated in a suitable manner by the mathematical representation, and preferably the control points are defined according to the Non-Uniform Rational B-Spline (NURBS) system. However, some other system for splines-based parametric surfaces may also be used.
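The control points of a NURBS-style representation weight the surface through spline basis functions. As an illustration of the underlying mathematics (not the patent's own implementation), the sketch below evaluates a point on a non-rational B-spline curve with the Cox-de Boor recursion; the degree, knot vector, and control points are illustrative assumptions.

```python
# Sketch: evaluating a point on a B-spline curve via the Cox-de Boor
# recursion. All values below are illustrative, not from the patent.

def basis(i, p, u, knots):
    """Cox-de Boor recursion for the i-th B-spline basis of degree p at u.

    Uses the half-open convention [knots[i], knots[i+1]); the exact end of
    the knot range would need special handling, omitted here.
    """
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * basis(i + 1, p - 1, u, knots))
    return left + right

def curve_point(u, degree, control_points, knots):
    """Evaluate the curve as a basis-weighted sum of the control points."""
    x = y = 0.0
    for i, (cx, cy) in enumerate(control_points):
        b = basis(i, degree, u, knots)
        x += b * cx
        y += b * cy
    return x, y

# Quadratic B-spline, four control points, clamped knot vector.
pts = [(0.0, 0.0), (1.0, 2.0), (3.0, 2.0), (4.0, 0.0)]
knots = [0, 0, 0, 1, 2, 2, 2]
print(curve_point(1.0, 2, pts, knots))  # (2.0, 2.0)
```

Because the knot vector is clamped, the curve interpolates the first control point at u = 0, which is one reason clamped knots are common in interactive modeling.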
- the device comprises a tessellation means 2 for conversion in real time from the mathematical functional representation to the polygon representation of the object.
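The tessellation step converts the smooth parametric description into triangles that graphics hardware can draw. A minimal sketch of this idea, assuming a uniform (u, v) sampling grid and a toy paraboloid patch in place of a real NURBS surface evaluator:

```python
# Sketch: tessellating a parametric surface on a uniform (u, v) grid into a
# triangle list. The paraboloid surface is an illustrative stand-in for a
# NURBS surface evaluator.

def tessellate_grid(surface, nu, nv):
    """Sample surface(u, v) on an nu x nv grid; return vertices and triangles."""
    verts = []
    for i in range(nu):
        for j in range(nv):
            u, v = i / (nu - 1), j / (nv - 1)
            verts.append(surface(u, v))
    tris = []
    for i in range(nu - 1):
        for j in range(nv - 1):
            a = i * nv + j       # indices of the four corners of one grid cell
            b = a + 1
            c = a + nv
            d = c + 1
            tris.append((a, c, b))   # each quad is split into two triangles
            tris.append((b, c, d))
    return verts, tris

def paraboloid(u, v):
    return (u, v, u * u + v * v)

verts, tris = tessellate_grid(paraboloid, 4, 4)
print(len(verts), len(tris))  # 16 18
```

A fixed grid like this trades accuracy for speed; the adaptive, tolerance-driven variant the description mentions later refines only where the surface bends.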
- a presentation means 3 is included for a preferably three-dimensional presentation of the polygon representation of the object in real time to a user 4.
- the presentation means may consist of a so-called HMD (head-mounted display), i.e. a display which is arranged on the user's head with a display in front of each eye to provide a stereo representation, i.e. a 3D presentation.
- HMD Head-mounted display
- this system provides a good sense of presence and realism but, on the other hand, it can only be used by one user and may be uncomfortable to use for long periods of time.
- use can instead be made of one or more large screens and at least one projector for multichannel (stereo) projection.
- Several screens provide a better sense of presence and realism.
- use is made of a screen on the wall in front of the user and an associated screen on the floor.
- screens formed as a semicircle and the like are used.
- additional and more complicated screens require more computer capacity and more advanced graphics processing.
- stereo glasses or shutter glasses, i.e. glasses which at a high frequency alternate between showing an image to the right eye and to the left eye.
- Such glasses are supplied, for instance, by the company StereoGraphics Corp.
- the device further comprises at least one control means 5 with the aid of which the user 4 can control the presentation as well as the digital object.
- the control means preferably comprises sensors to detect gestures and movements of at least one of the hands of a user, and preferably use is made of a glove-based interaction system. Such systems are already known in the VR field.
- Fakespace, Inc., for instance, supplies a system called "PinchGlove interaction system", in which gestures, i.e. finger movements, are identified by electric terminals which are arranged at the fingertips of the glove and which generate different signals depending on which fingers are put together.
- Virtual Technologies supplies a system called "CyberGlove®", which instead comprises sensors that detect the flexing of the fingers and the like.
- the control means preferably comprises a movement tracker system which detects movements of the user's body, and in particular movements of his head and his hands.
- a number of such tracker systems are commercially available, for instance the Ascension "Flock of Birds" tracker system and the Polhemus "3SPACE FASTRAK" system.
- Such tracker systems can, for instance, be magnetic, mechanical, optical or acoustic and comprise receivers (or transmitters) which are arranged on suitable surfaces on the user's body. Movements in these receivers are detected in relation to a fixed co-ordinate system.
- magnetic receivers are used which are arranged on the user, and any action on a surrounding field is detected by means of a transmitter and a detector.
- movements are detected in six dimensions (three translational movements and three rotational movements), and the result is transferred, preferably wirelessly, to a receiving unit 6.
- the control means is preferably of such type that it is possible to "move around" in the virtual environment by means of gestures of the glove in combination with a relative movement of the hand. Moreover, it is possible to change the sight direction in the virtual environment.
- the control means may also preferably be used to select portions or parts of the presented object. The selection can be carried out by means of a "line" which projects from the hand position and which is visible in the simulation. It is thus possible to select a surface by pointing at it with such a line. Alternatively, surfaces may also be selected by being close to them or in some other suitable manner. Such a selected surface can advantageously be marked by being highlighted or the like, and it can then be activated, i.e. brought into a state of change.
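Selecting a surface with a "line" projecting from the hand position is, in graphics terms, a ray-picking test. A sketch of one standard way to do it (the Möller-Trumbore ray-triangle intersection, named here as a swapped-in technique, not as the patent's own method; the hand pose and triangle are illustrative values):

```python
# Sketch: ray picking against a tessellated surface via the Moller-Trumbore
# ray-triangle intersection. Values below are illustrative assumptions.

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the ray parameter t of the hit point, or None on a miss."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    def dot(a, b): return sum(x * y for x, y in zip(a, b))

    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:            # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:      # outside the triangle in barycentric u
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:  # outside in barycentric v
        return None
    t = f * dot(e2, q)
    return t if t > eps else None

# A ray from a hand at the origin, pointing along +z, hits a triangle
# lying in the plane z = 5.
t = ray_triangle((0.2, 0.2, 0.0), (0.0, 0.0, 1.0),
                 (0.0, 0.0, 5.0), (1.0, 0.0, 5.0), (0.0, 1.0, 5.0))
print(t)  # 5.0
```

In practice the test would be run against every triangle of the tessellated object, keeping the hit with the smallest t, and the owning surface would then be highlighted.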
- Fig. 3 shows an example of a screen image, in which a surface 31 is activated, the control points 32 being visible. For such an activated surface, it is possible to move or add control points with the aid of the control means.
- When this is done, information about the manipulation is transmitted from the receiving unit 6 to a data-converting unit 7, in which the detected manipulation is converted into a new position for the control point, the mathematical functional representation of the object in the memory being updated. Subsequently, the polygon representation is also updated in real time by the tessellation means, whereupon the presentation image is updated.
- the user has the impression that the tessellated image of the object, as well as the control points which are currently visible, change directly as a consequence of the manipulation by the control means.
- the previous version of the object, i.e. the one before the updating, is however preferably stored in the memory, or at least the information about the performed changes, which makes it possible to recall the last command.
- the device preferably also comprises an input means, such as a keyboard and a display, a 3D interface or the like, via which the user can make various settings.
- scaling factors which are to be used, such as the exchange between a movement of the hand and the corresponding movement taking place in the image, the scaling factor which is to be used when importing an object from the outside, the size of the object in the image, the format in which the current object is to be stored, etc.
- whether the points around the control point manipulated by the control means should also be manipulated, and in that case how many or how remote points are to be manipulated, as well as the degree, i.e. the weighting, of the manipulation. It may, for instance, be suitable that directly adjoining points are moved 50 % of the distance which the manipulated control point has been moved, and that the control points in the layer outside the directly adjoining ones are moved 20 %. Moreover, it may be determined whether the adjacent control points to be moved should only be points located on the same surface, or whether points on adjacent surfaces should also be moved. In the latter case, the system may also be arranged such that control points of adjacent surfaces are moved so that there will be no play or gaps between the adjoining surfaces.
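The weighted propagation described above can be sketched in a few lines. The 50 % and 20 % weights come from the text; the grid size, ring-distance metric, and function names are illustrative assumptions.

```python
# Sketch: dragging a control point and propagating the move to neighbors
# with a falloff weight (full move, 50 % for adjoining points, 20 % for the
# next layer, as in the text). Grid size and names are assumptions.

WEIGHTS = {0: 1.0, 1: 0.5, 2: 0.2}  # ring distance -> fraction of the move

def move_control_point(grid, i, j, delta):
    """Move grid[i][j] by delta and drag neighboring points with a falloff.

    `grid` is a 2D list of (x, y, z) control points on one surface.
    """
    for r, row in enumerate(grid):
        for c, (x, y, z) in enumerate(row):
            ring = max(abs(r - i), abs(c - j))  # Chebyshev ring distance
            w = WEIGHTS.get(ring, 0.0)
            if w:
                row[c] = (x + w * delta[0], y + w * delta[1], z + w * delta[2])

# A flat 5x5 control grid; pull the center point up by one unit.
grid = [[(float(c), float(r), 0.0) for c in range(5)] for r in range(5)]
move_control_point(grid, 2, 2, (0.0, 0.0, 1.0))
print(grid[2][2][2], grid[2][1][2], grid[2][0][2])  # 1.0 0.5 0.2
```

Extending the weight table across surface boundaries would give the gap-free behavior for adjoining surfaces mentioned at the end of the paragraph.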
- the desired degree of resolution of the tessellated image, i.e. the maximum chordal deviation tolerance.
- the maximum chordal deviation during forming may, for instance, be set at 1 mm, whereas in other cases it is set at 0.1 or 0.05 mm. Furthermore, it is possible, during forming, to show only the portion of the object that is in the state of change, and optionally the closest portions. This also facilitates the real-time simulation and reduces the need for computer capacity. However, if and to what extent this should be done is preferably also decided by the user via the user interface.
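A chordal deviation tolerance drives adaptive refinement: a segment is subdivided until the straight chord stays within the tolerance of the true curve. A minimal sketch in one parameter (a curve rather than a full surface, and a unit-circle test curve as an assumption):

```python
# Sketch: tolerance-driven adaptive sampling. A segment is split until the
# midpoint of the chord lies within `tol` of the true curve point, mirroring
# a looser tolerance while forming (e.g. 1 mm) and a tighter one otherwise
# (e.g. 0.1 mm), as in the text. The test curve is an assumption.
import math

def adaptive_sample(curve, t0, t1, tol):
    """Return parameter values whose chords stay within tol of the curve."""
    p0, p1 = curve(t0), curve(t1)
    tm = 0.5 * (t0 + t1)
    pm = curve(tm)
    chord_mid = tuple(0.5 * (a + b) for a, b in zip(p0, p1))
    if math.dist(pm, chord_mid) <= tol:
        return [t0, t1]
    left = adaptive_sample(curve, t0, tm, tol)
    right = adaptive_sample(curve, tm, t1, tol)
    return left[:-1] + right  # merge, dropping the duplicated midpoint

circle = lambda t: (math.cos(t), math.sin(t))
coarse = adaptive_sample(circle, 0.0, math.pi / 2, tol=0.01)
fine = adaptive_sample(circle, 0.0, math.pi / 2, tol=0.0001)
print(len(coarse), len(fine))  # the tighter tolerance yields more samples
```

This is why loosening the tolerance during forming reduces the polygon count and the computer capacity needed for real-time updates.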
- it should be possible to turn the object over a marked surface to see it in profile, i.e. in section, to rotate it, etc. Further, it is preferably possible to bring the object into a state of creation with the aid of the control means, in which case additional control points can be added, for instance, by sweeping movements of the hands. In this way, new objects can be created, new surfaces can be added to an existing structure etc., in a manner that is similar to the work with physical prototypes.
- the tessellation means as well as the data-converting means can be provided in a commercially available computer.
- the computer should have hardware sufficiently powerful to allow observation of both the left-hand and the right-hand side of the object in real time. Use may, for instance, be made of a Silicon Graphics Onyx2.
- the device should comprise the possibility of activating/deactivating what is referred to as "environment mapping", i.e. a predefined image of the environment reflected in real time on the surfaces of the object. Consequently, the object will be more realistically reproduced in the virtual environment.
- Fig. 2 schematically shows an embodiment of a method according to the invention.
- This method comprises the following steps.
- in step S1, a tessellation of the object takes place in real time, i.e. a conversion from the mathematical representation of the object into a polygon representation.
- in step S2, at least the polygon representation, and possibly also all or portions of the mathematical functional representation in the form of control points, are presented to the user.
- in step S3, it is detected whether some portion of the object is activated and brought into the state of change. If this is not the case, the process reverts to step S2; otherwise a lower resolution level is set in step S4.
- This step is, however, optional and not necessary to the invention.
- a setting may also be made so that only selected portions of the object are shown. Subsequently, the user's changes of the control points are detected, after which the mathematical representation of the object is accordingly updated in step S5. After that, the polygon representation is also updated, and the process reverts to step S1.
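The loop through steps S1-S5 can be sketched with placeholder stubs standing in for tessellation, presentation, and input detection; every function and variable name here is an illustrative assumption, not from the patent.

```python
# Sketch: the S1-S5 modeling loop with stub functions. All names are
# illustrative assumptions.

def tessellate(model):
    """S1 stand-in: convert the mathematical representation to polygons."""
    return [("poly", p) for p in model["control_points"]]

def present(polygons, control_points):
    """S2 stand-in: real-time (stereo) presentation to the user."""
    pass

def set_resolution(level):
    """S4 stand-in: lower the chordal tolerance while a surface is formed."""
    pass

def modeling_loop(model, detect_activation, iterations):
    for _ in range(iterations):
        polygons = tessellate(model)                  # S1: math rep -> polygons
        present(polygons, model["control_points"])    # S2: show to the user
        change = detect_activation()                  # S3: surface activated?
        if change is not None:
            set_resolution("low")                     # S4 (optional)
            i, new_pos = change
            model["control_points"][i] = new_pos      # S5: update math rep
    return model

# One manipulation event, then an idle frame.
events = [(1, (0.0, 1.0, 0.0)), None]
model = {"control_points": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]}
modeling_loop(model, lambda: events.pop(0), 2)
print(model["control_points"][1])  # (0.0, 1.0, 0.0)
```

Because the polygon representation is regenerated from the mathematical one on every pass, the two representations stored in the memory can never drift apart.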
- the design tool according to the invention is more similar to traditional designing with physical prototypes and is therefore perfectly suitable as a replacement or supplement in such work.
- the invention may, for instance, advantageously be used in designing in the car industry to avoid the often unwieldy and complicated physical prototypes currently used.
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP99972765A EP1131792A1 (en) | 1998-11-20 | 1999-11-19 | Method and device for creating and modifying digital 3d models |
AU14385/00A AU1438500A (en) | 1998-11-20 | 1999-11-19 | Method and device for creating and modifying digital 3d models |
SE0002373A SE0002373L (en) | 1998-11-20 | 2000-06-26 | Method and apparatus for creating and modifying digital 3D models |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE9803996-9 | 1998-11-20 | ||
SE9803996A SE9803996D0 (en) | 1998-11-20 | 1998-11-20 | Method and apparatus for creating and modifying digital 3D models |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000031690A1 true WO2000031690A1 (en) | 2000-06-02 |
Family
ID=20413373
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SE1999/002145 WO2000031690A1 (en) | 1998-11-20 | 1999-11-19 | Method and device for creating and modifying digital 3d models |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP1131792A1 (en) |
AU (1) | AU1438500A (en) |
SE (1) | SE9803996D0 (en) |
WO (1) | WO2000031690A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7245299B2 (en) * | 2003-05-12 | 2007-07-17 | Adrian Sfarti | Bicubic surface real-time tesselation unit |
USRE42534E1 (en) | 2000-07-28 | 2011-07-12 | Adrian Sfarti | Bicubic surface real-time tesselation unit |
IT201700066749A1 (en) * | 2017-06-15 | 2018-12-15 | Gianluigi Palka | METHOD OF PROCESSING IMAGES OF STRUCTURAL AND / OR ARCHITECTURAL ELEMENTS AND / OR URBAN AND / OR GEOGRAPHICAL ELEMENTS |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108984169B (en) * | 2017-06-01 | 2022-05-03 | 刘开元 | Cross-platform multi-element integrated development system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0526881A2 (en) * | 1991-08-06 | 1993-02-10 | Canon Kabushiki Kaisha | Three-dimensional model processing method, and apparatus therefor |
WO1995011482A1 (en) * | 1993-10-21 | 1995-04-27 | Taligent, Inc. | Object-oriented surface manipulation system |
US5442168A (en) * | 1991-10-15 | 1995-08-15 | Interactive Light, Inc. | Dynamically-activated optical instrument for producing control signals having a self-calibration means |
US5537529A (en) * | 1993-04-22 | 1996-07-16 | Apple Computer, Inc. | Apparatus and method for creating versions of computer models and creating communications incorporating created versions therefrom |
JPH09179062A (en) * | 1995-12-25 | 1997-07-11 | Canon Inc | Computer system |
WO1997046975A1 (en) * | 1996-06-04 | 1997-12-11 | Muncey Grant J | Techniques for creating and modifying 3d models and correlating such models with 2d pictures |
-
1998
- 1998-11-20 SE SE9803996A patent/SE9803996D0/en unknown
-
1999
- 1999-11-19 EP EP99972765A patent/EP1131792A1/en not_active Withdrawn
- 1999-11-19 AU AU14385/00A patent/AU1438500A/en not_active Abandoned
- 1999-11-19 WO PCT/SE1999/002145 patent/WO2000031690A1/en not_active Application Discontinuation
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0526881A2 (en) * | 1991-08-06 | 1993-02-10 | Canon Kabushiki Kaisha | Three-dimensional model processing method, and apparatus therefor |
US5442168A (en) * | 1991-10-15 | 1995-08-15 | Interactive Light, Inc. | Dynamically-activated optical instrument for producing control signals having a self-calibration means |
US5537529A (en) * | 1993-04-22 | 1996-07-16 | Apple Computer, Inc. | Apparatus and method for creating versions of computer models and creating communications incorporating created versions therefrom |
WO1995011482A1 (en) * | 1993-10-21 | 1995-04-27 | Taligent, Inc. | Object-oriented surface manipulation system |
JPH09179062A (en) * | 1995-12-25 | 1997-07-11 | Canon Inc | Computer system |
WO1997046975A1 (en) * | 1996-06-04 | 1997-12-11 | Muncey Grant J | Techniques for creating and modifying 3d models and correlating such models with 2d pictures |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7532213B2 (en) | 2000-07-28 | 2009-05-12 | Adrian Sfarti | Bicubic surface real time tesselation unit |
USRE42534E1 (en) | 2000-07-28 | 2011-07-12 | Adrian Sfarti | Bicubic surface real-time tesselation unit |
US7245299B2 (en) * | 2003-05-12 | 2007-07-17 | Adrian Sfarti | Bicubic surface real-time tesselation unit |
IT201700066749A1 (en) * | 2017-06-15 | 2018-12-15 | Gianluigi Palka | METHOD OF PROCESSING IMAGES OF STRUCTURAL AND / OR ARCHITECTURAL ELEMENTS AND / OR URBAN AND / OR GEOGRAPHICAL ELEMENTS |
Also Published As
Publication number | Publication date |
---|---|
SE9803996D0 (en) | 1998-11-20 |
AU1438500A (en) | 2000-06-13 |
EP1131792A1 (en) | 2001-09-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5973678A (en) | Method and system for manipulating a three-dimensional object utilizing a force feedback interface | |
Dai | Virtual reality for industrial applications | |
Weimer et al. | A synthetic visual environment with hand gesturing and voice input | |
KR102249577B1 (en) | Hud object design and method | |
US6842175B1 (en) | Tools for interacting with virtual environments | |
US6091410A (en) | Avatar pointing mode | |
Gregory et al. | intouch: Interactive multiresolution modeling and 3d painting with a haptic interface | |
Kim et al. | A haptic-rendering technique based on hybrid surface representation | |
US6084587A (en) | Method and apparatus for generating and interfacing with a haptic virtual reality environment | |
US20020133264A1 (en) | Virtual reality system for creation of design models and generation of numerically controlled machining trajectories | |
Liang et al. | Geometric modeling using six degrees of freedom input devices | |
Bordegoni et al. | Haptic and sound interface for shape rendering | |
CN109697002B (en) | Method, related equipment and system for editing object in virtual reality | |
Marner et al. | Large scale spatial augmented reality for design and prototyping | |
KR20100067155A (en) | Apparatus and method for providing realistic contents through augmented book | |
JP2004362218A (en) | Three-dimensional object operating method | |
JP2001216015A (en) | Operation teaching device for robot | |
Angster | VEDAM: virtual environments for design and manufacturing | |
WO2000031690A1 (en) | Method and device for creating and modifying digital 3d models | |
Fiorentino et al. | Surface design in virtual reality as industrial application | |
Springer et al. | State-of-the-art virtual reality hardware for computer-aided design | |
Schkolne et al. | Surface drawing. | |
WO2000065461A1 (en) | Tools for interacting with virtual environments | |
JP4769942B2 (en) | 3D design support system and 3D design support method | |
CA2496773A1 (en) | Interaction with a three-dimensional computer model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref country code: AU Ref document number: 2000 14385 Kind code of ref document: A Format of ref document f/p: F |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AL AM AT AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ CZ DE DE DK DK DM EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1999972765 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 09831831 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 1999972765 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1999972765 Country of ref document: EP |