WO2000031690A1 - Method and device for creating and modifying digital 3d models - Google Patents

Method and device for creating and modifying digital 3d models Download PDF

Info

Publication number
WO2000031690A1
WO2000031690A1 · PCT/SE1999/002145
Authority
WO
WIPO (PCT)
Prior art keywords
control
control points
user
representation
real time
Application number
PCT/SE1999/002145
Other languages
French (fr)
Inventor
Patrik Larking
Original Assignee
Opticore Ab
Application filed by Opticore Ab filed Critical Opticore Ab
Priority to EP99972765A priority Critical patent/EP1131792A1/en
Priority to AU14385/00A priority patent/AU1438500A/en
Publication of WO2000031690A1 publication Critical patent/WO2000031690A1/en
Priority to SE0002373A priority patent/SE0002373L/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20Finite element generation, e.g. wire-frame surface description, tesselation


Abstract

The present invention relates to a device and a method for interactively creating and modifying digital three-dimensional (3D) objects in real time, said objects comprising a mathematical functional representation with suitably positioned control points. The inventive method comprises the steps of producing a polygon representation of the object in real time on the basis of the mathematical functional representation, presenting the polygon representation of the object and optionally or always at least a portion of a control point hull containing the control points in real time to a user with the aid of a presentation means, changing in a user-controlled manner the object with the aid of control means by moving or adding control points, changing the stored mathematical functional representation of the object on the basis of said change, and subsequently updating the polygon representation in real time.

Description

METHOD AND DEVICE FOR CREATING AND MODIFYING DIGITAL 3D MODELS
Technical Field
The present invention relates to a device and a method for interactively creating and modifying digital three-dimensional (3D) objects in real time.
Background
In design and construction work, use is often made of digital three-dimensional objects as models. During this work, it must be possible, on the one hand, to create the objects and, on the other, to change and modify them. Normally, the work with the objects is carried out by means of construction tools, such as commercially available CAD programs. The objects are then stored in the form of a mathematical functional representation, in which all surfaces are defined in the form of various mathematical functions. These functions themselves contain what are referred to as control points, which together form a control point hull and which are shown on a display when working with the object. A common system for defining these control points is NURBS (Non-Uniform Rational B-Spline), but there are also other systems which define splines-based parametric surfaces. When the object is to be changed, the user normally enters a table and changes the values of the control points and/or the mathematical functions, after which the object is redrawn on the screen.
However, this work is not completely satisfactory in all respects. It is fairly time-consuming and technically awkward to work with the object. In particular, this type of work is not very convenient for artists and designers who are used to working with real physical objects, for instance, of clay. Therefore, a tool is needed to make it possible to manipulate three-dimensional digital objects in a manner that is similar to manipulating real physical objects. Moreover, with such a tool the need for physical prototypes would decrease in many fields, such as in the car industry, which would result in more efficient and less expensive designing work.
Furthermore, it is known to imitate a real physical reality in digital environments by real-time simulation, more commonly referred to as "virtual reality" (VR), "virtual environments" (VE) or "simulated environments" (SE). In such systems, a three-dimensional screen image is shown to the user, said image being updated in real time on the basis of a detection of body movements performed by the user, for instance, hand movements. In such a real-time simulation, the shown objects are normally stored as tessellated surfaces, i.e. surfaces which have been covered by polygons, such as triangles.
Object of the Invention
Therefore, an object of the present invention is to provide an improved and more easily manageable method and device for creating and modifying digital three-dimensional objects.
This object is achieved by means of a method and a device according to the appended claims.
Brief Description of the Drawings
In the accompanying drawings:
Fig. 1 is a schematic view of an embodiment of a device according to the invention; Fig. 2 is a schematic flow chart of a method according to an embodiment of the invention; and
Fig. 3 is an example of a screen image when using the invention.
Description of Preferred Embodiments
A device according to the invention is schematically shown in Fig. 1. The device comprises a memory 1 for storing data describing a digital three-dimensional object in real time. The storage takes place at least in a mathematical functional representation with suitably positioned control points, but preferably also in a polygon representation. Thus, the storage may take place in a prior-art CAD format or the like, but preferably use is made of the Cosmo Binary (CSB) format which comprises the mathematical functional representation as well as the polygon representation. The control points are generated in a suitable manner by the mathematical representation, and preferably the control points are defined according to the Non-Uniform Rational B-Spline (NURBS) system. However, some other system for splines-based parametric surfaces may also be used. Moreover, the device comprises a tessellation means 2 for conversion in real time from the mathematical functional representation to the polygon representation of the object.
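To make the two stored representations concrete, the following is a minimal, illustrative sketch (not the patented implementation) of a NURBS-style parametric surface evaluated from its control points, and of a tessellation means that converts it into the polygon representation. Function names and the sampling strategy are assumptions for illustration; a real system would use a CAD kernel.

```python
def basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] > knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
            * basis(i, p - 1, u, knots)
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
            * basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, v, ctrl, weights, ku, kv, p, q):
    """Evaluate one surface point from the mathematical representation."""
    num, den = [0.0, 0.0, 0.0], 0.0
    for i in range(len(ctrl)):
        for j in range(len(ctrl[0])):
            b = basis(i, p, u, ku) * basis(j, q, v, kv) * weights[i][j]
            den += b
            for k in range(3):
                num[k] += b * ctrl[i][j][k]
    return tuple(c / den for c in num)

def tessellate(ctrl, weights, ku, kv, p, q, n):
    """Sample an n-by-n parameter grid and return (vertices, triangles):
    the polygon representation produced by the tessellation means."""
    eps = 1e-9  # keep parameters inside the half-open basis support
    verts = [nurbs_point(min(a / n, 1 - eps), min(b / n, 1 - eps),
                         ctrl, weights, ku, kv, p, q)
             for a in range(n + 1) for b in range(n + 1)]
    tris = []
    for a in range(n):
        for b in range(n):
            k = a * (n + 1) + b
            tris.append((k, k + 1, k + n + 2))
            tris.append((k, k + n + 2, k + n + 1))
    return verts, tris
```

With all weights equal to one this reduces to an ordinary B-spline surface; the knot vectors are assumed normalized to the interval [0, 1].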
Further, a presentation means 3 is included for a preferably three-dimensional presentation of the polygon representation of the object in real time to a user 4. The presentation means may consist of a so-called HMD (head-mounted display), i.e. a display which is arranged on the user's head with a display in front of each eye to provide a stereo representation, i.e. 3D presentation.
The advantage of this system is that it provides a good feeling of presence and realism but, on the other hand, it can only be used by one user and it may be uncomfortable to use for long periods of time. As an alternative, use can instead be made of one or more large screens and at least one projector for multichannel (stereo) projection. Several screens provide a better feeling of presence and realism. Preferably, use is made of a screen on the wall in front of the user and an associated screen on the floor. There are, however, also systems in which screens formed as a semicircle and the like are used. However, additional and more complicated screens require more computer capacity and more advanced graphics processing. When large screens are used, the user must wear so-called stereo glasses or shutter glasses, i.e. glasses which at a high frequency alternate between showing an image to the right eye and to the left eye. Such glasses are supplied, for instance, by the company StereoGraphics Corp.
If large screens are used, several users can observe the object at the same time. A drawback of this solution is that the screens and the projection equipment are unwieldy. It is, however, also possible to use smaller screens, in which case the work at the screen is more similar to ordinary terminal work.
The device further comprises at least one control means 5 with the aid of which the user 4 can control the presentation as well as the digital object. The control means preferably comprises sensors to detect gestures and movements of at least one of the hands of a user, and preferably use is made of a glove-based interaction system. Such systems are already known in the VR field.
Fakespace, Inc., for instance, supply a system called "PinchGlove interaction system", in which gestures, i.e. finger movements, are identified by electric terminals which are arranged at the finger-tips of the glove and which generate different signals depending on which fingers are put together. Moreover, Virtual Technologies supply a system called "CyberGlove®" which instead comprises sensors which detect the flexing of fingers and the like. Further, the control means preferably comprises a movement tracker system which detects movements of the user's body, and in particular movements of his head and his hands. A number of such tracker systems are commercially available, for instance, the Ascension "Flock of Birds" tracker system and the Polhemus "3SPACE FASTRAK" system.
Such tracker systems can, for instance, be magnetic, mechanical, optical or acoustic and comprise receivers (or transmitters) which are arranged on suitable surfaces on the user's body. Movements of these receivers are detected in relation to a fixed co-ordinate system. Preferably, magnetic receivers are used which are arranged on the user, and any action on a surrounding field is detected by means of a transmitter and a detector. Preferably, movements are detected in six dimensions (three translational movements and three rotational movements), and the result is transferred, preferably wirelessly, to a receiving unit 6.
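A six-dimensional tracker sample can be packaged as a homogeneous transform. The sketch below is illustrative only: the Z-Y-X (yaw-pitch-roll) rotation order is an assumption, and actual trackers document their own conventions.

```python
import math

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Convert one six-dimensional tracker sample (three translations,
    three rotations in radians) into a 4x4 homogeneous transform.
    Rotation order assumed: R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr, x],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr, y],
        [-sp,     cp * sr,                cp * cr,                z],
        [0.0,     0.0,                    0.0,                    1.0],
    ]
```

Such a matrix can then be applied directly to the virtual hand or viewpoint in the simulation.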
The control means is preferably of such a type that it is possible to "move around" in the virtual environment by means of gestures of the glove in combination with a relative movement of the hand. Moreover, it is possible to change the sight direction in the virtual environment. The control means may also preferably be used to select portions or parts of the presented object. The selection can be carried out by means of a "line" which projects from the hand position and which is visible in the simulation. It is thus possible to select a surface by pointing at the surface by means of such a line projecting from the hand position. Alternatively, it is, however, also possible to select surfaces by being close to them or in some other suitable manner. Such a selected surface can advantageously be marked by being highlighted or the like, and it can then be activated, i.e. converted into a state of change by means of a predetermined finger movement or the like. In a similar manner, it is, of course, possible to deactivate a selection by marking the surface once again. In an activated surface, the control points are shown. All such control points of the object together form a control point hull, and thus the activated surface constitutes a portion of, or possibly the entire, control point hull. Fig. 3 shows an example of a screen image, in which a surface 31 is activated, the control points 32 being visible. For such an activated surface, it is possible to move or add control points with the aid of the control means. When this is done, information about this is transmitted from the receiving unit 6 to a data-converting unit 7, in which the detected manipulation is converted into a new position for the control point, the mathematical functional representation of the object in the memory being updated. Subsequently, also the polygon representation is updated in real time by the tessellation means, whereupon the presentation image is updated.
As a result, the user has the impression that the tessellated image of the object, as well as the control points which are currently visible, change directly as a consequence of the manipulation by the control means. The previous version of the object, i.e. the one before the updating, is however preferably stored in the memory, or at least the information about the performed changes, which makes it possible to recall the last command.
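The update path and the stored previous version can be sketched as follows. This is a hypothetical, simplified model: a detected manipulation becomes a new control-point position in the mathematical representation, the tessellation means rebuilds the polygon representation, and the old value is kept so that the last command can be recalled. All names are illustrative.

```python
class ModelState:
    def __init__(self, control_points, tessellate):
        self.control_points = list(control_points)   # mathematical representation
        self._tessellate = tessellate                # stand-in for the tessellation means
        self._history = []                           # previous versions, for recall
        self.mesh = tessellate(self.control_points)  # polygon representation

    def move_control_point(self, index, new_position):
        # Store the previous value so the change can be undone.
        self._history.append((index, self.control_points[index]))
        self.control_points[index] = new_position
        # Re-tessellate so the presented image follows the manipulation.
        self.mesh = self._tessellate(self.control_points)

    def undo_last(self):
        """Recall the last command by restoring the stored value."""
        if self._history:
            index, old = self._history.pop()
            self.control_points[index] = old
            self.mesh = self._tessellate(self.control_points)
```

In a real device the tessellation callback would regenerate the triangle mesh; here any function of the control points will do.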
Moreover, it is preferably possible to control a number of functional parameters via an input means, such as a keyboard and a display, a 3D interface or the like. It should, for instance, be possible to set different scaling factors which are to be used, such as the ratio between a movement of the hand and the corresponding movement taking place in the image, the scaling factor which is to be used when importing an object from the outside, the size of the object in the image, the format in which the current object is to be stored, etc. Preferably, it is also possible to zoom out of and into the image to facilitate the marking of smaller and larger surfaces. Moreover, it should be possible to determine whether the points around the control point manipulated by the control means should also be manipulated, and in that case how many or how remote points are to be manipulated as well as the degree, i.e. the weighting, of the manipulation. It may, for instance, be suitable that directly adjoining points should be moved 50 % of the distance which the manipulated control point has been moved and that the control points in the layer outside the directly adjoining ones are moved 20 %. Moreover, it may be determined whether the adjacent control points that are to be moved should only be the points which are located on the same surface, or whether points on adjacent surfaces should also be moved. In the latter case, the system may also be arranged such that control points of adjacent surfaces are moved so that there will be no play or gaps between the adjoining surfaces. It should also be possible to set the desired degree of resolution of the tessellated image, i.e. the maximum chordal deviation tolerance. Preferably, use can also be made of two different degrees of resolution: a high resolution which is used when no portion of the object is in the state of change, i.e. activated, and a low resolution which is used when portions of, or the entire, object are in the state of change.
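The neighbour-weighting rule described above can be sketched for one surface as follows, using the example figures from the text (100 % for the manipulated point, 50 % for directly adjoining points, 20 % for the next layer); in practice these weights would be user-set parameters, and the ring-distance measure is an assumption for illustration.

```python
def propagate_move(grid, i0, j0, delta, weights=(1.0, 0.5, 0.2)):
    """Move control point (i0, j0) of a 2-D grid of (x, y, z) control
    points by `delta`, dragging neighbours by a decreasing fraction of
    the same displacement. Ring distance is measured in grid indices."""
    for i in range(len(grid)):
        for j in range(len(grid[0])):
            ring = max(abs(i - i0), abs(j - j0))
            if ring < len(weights):
                w = weights[ring]
                grid[i][j] = tuple(c + w * d
                                   for c, d in zip(grid[i][j], delta))
```

Extending the same idea across surface boundaries, as the text suggests, would keep adjoining surfaces free of gaps.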
As a result, the need for computer capacity is reduced, and the tessellation is easier and smoother during the change, at the same time as an inferior resolution may be accepted in most cases in such a situation, whereas the image will be better when the object is not subjected to any change or actuation. The maximum chordal deviation during forming may, for instance, be set at 1 mm, whereas in other cases it is set at 0.1 or 0.05 mm. Furthermore, it is possible, during forming, to show only the portion of the object that is in the state of change, and optionally the closest portions. This also facilitates the real-time simulation and reduces the need for computer capacity. However, if and to what extent this should be done is preferably also decided by the user via the user interface.
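One illustrative reading of the chordal deviation tolerance, sketched for a parametric curve under assumed helper names: keep doubling the number of segments until the curve midpoint of every span lies within the tolerance of its chord. A coarse tolerance (e.g. 1 mm while forming) then needs far fewer polygons than a fine one (0.05 mm).

```python
import math

def midpoint_deviation(f, t0, t1):
    """Distance from the curve point at the midpoint parameter to the
    chord between f(t0) and f(t1) -- a simple chordal-deviation probe."""
    ax, ay = f(t0)
    bx, by = f(t1)
    px, py = f((t0 + t1) / 2)
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    return abs(dx * (ay - py) - dy * (ax - px)) / length

def segments_for_tolerance(f, tol, n=1):
    """Smallest power-of-two segment count whose worst chordal
    deviation is within `tol` (parameter domain assumed [0, 1])."""
    while True:
        worst = max(midpoint_deviation(f, k / n, (k + 1) / n)
                    for k in range(n))
        if worst <= tol:
            return n
        n *= 2
```

For a quarter circle of unit radius, loosening the tolerance from 0.001 to 0.05 cuts the segment count severalfold, which is the effect exploited while a surface is in the state of change.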
In addition, it should be possible to turn over a marked surface to see it in profile, i.e. in section, rotate it etc. Further, it is preferably possible to bring the object into a state of creation with the aid of the control means, in which case additional control points can be added, for instance, by sweeping movements of the hands. In this way, new objects can be created, new surfaces can be added to an existing structure etc., in a manner that is similar to the work with physical prototypes. The tessellation means as well as the data-converting means can be provided in a commercially available computer. The computer should have computer hardware sufficiently powerful to allow observation of both the left-hand and the right-hand side of the object in real time. Use may, for instance, be made of a Silicon Graphics Onyx2. Furthermore, the device should comprise the possibility of activating/deactivating what is referred to as "environment mapping", i.e. a predefined image of the environment reflected in real time on the surfaces of the object. Consequently, the object will be more realistically reproduced in the virtual environment.
Fig. 2 schematically shows an embodiment of a method according to the invention. This method comprises the following steps. First, in step S1 a tessellation of the object takes place in real time, i.e. a conversion from the mathematical representation of the object into a polygon representation. Subsequently, in step S2 at least the polygon representation, and possibly also the entire or portions of the mathematical functional representation in the form of control points, are presented to the user. Then, in step S3 it is detected whether some portion of the object is activated and brought into the state of change. If this is not the case, the process reverts to step S2, but otherwise a lower resolution level is set in step S4. This step is, however, optional and not necessary to the invention. A setting may also be made so that only selected portions of the object are shown. Subsequently, the user's changes of the control points are detected, after which the mathematical representation of the object is accordingly updated in step S5. After that, the polygon representation is also updated, and the process reverts to step S1. The design tool according to the invention is more similar to traditional designing with physical prototypes and is therefore perfectly suitable as a replacement for or supplement to such work. The invention may, for instance, advantageously be used in designing in the car industry to avoid the often unwieldy and complicated physical prototypes currently used.
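The flow chart of Fig. 2 can be rendered as a schematic loop. The stub interfaces below (a model with `tessellate`, `present` and `apply` methods, and a stream of detected events) are hypothetical stand-ins for the tracker, display and tessellation means described above.

```python
def run(model, frames):
    """One schematic pass through steps S1-S5 per frame event.
    `frames` yields None when nothing is activated, else a change."""
    mesh = model.tessellate(high_resolution=True)       # step S1
    for event in frames:
        model.present(mesh)                             # step S2
        if event is None:                               # step S3: nothing activated
            continue
        mesh = model.tessellate(high_resolution=False)  # step S4 (optional)
        model.apply(event)                              # step S5: update representation
        mesh = model.tessellate(high_resolution=True)   # back to step S1
    return mesh
```

Any object providing those three methods can drive the loop, which keeps the sketch independent of the actual hardware.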
The invention has now been described by means of embodiments. Many variants of the invention are, however, feasible. It is, for instance, possible to use the invention when working at an ordinary computer workstation, in which case a mouse or the like may be used as control means. It is also possible to use other virtual environments than those mentioned above. It goes without saying that it is also possible to constantly present the control point hull to the user, instead of making it optional, as stated above. Furthermore, the polygon representation preferably comprises triangles, but other polygons may, of course, also be used. These variants as well as similar ones must be considered to be covered by the invention as defined in the appended claims.

Claims

1. A device for interactively creating and modifying digital three-dimensional (3D) objects in real time, which comprise a mathematical functional representation with suitably positioned control points, said device comprising a memory for storing data of the object, a tessellation means for producing a polygon representation of the object in real time on the basis of the mathematical functional representation, a presentation means for preferably three-dimensional presentation of the polygon representation of the object in real time to a user and optionally or always at least a portion of a control point hull containing the control points, and control means for user-controlled change of the object by moving or adding control points, the control means being connected to data-converting means for changing the stored mathematical functional representation of the object as manipulated by the control means, the polygon representation subsequently being updated in real time.
2. A device as claimed in claim 1, wherein the mathematical functional representation takes place according to the Non-Uniform Rational B-Spline (NURBS) system or some other system for splines-based parametric surfaces.
3. A device as claimed in claim 1 or 2, wherein the control means comprises sensors for detecting gestures and movements of at least one of the user's hands, and preferably comprises a glove-based interaction system.
4. A device as claimed in claim 3, wherein the presentation means comprises head-mounted display (HMD) means to be arranged on the user's head.
5. A device as claimed in claim 3, wherein the presentation means comprises at least one large screen and at least one projector for multichannel (stereo) projection as well as stereo glasses to be worn by the user.
6. A device as claimed in any one of the preceding claims, wherein the control means comprises first means for activating and deactivating, respectively, a portion of the object, an activated portion of the object being brought into a state of change and the corresponding portion of the control hull being shown as marked on the presentation means.
7. A device as claimed in claim 6, wherein the tessellation means is adapted to use a lower resolution, that is a higher degree of chordal deviation, when a portion of the object is in the state of change than otherwise.
8. A device as claimed in claim 6 or 7, wherein the presentation means, when a part of the object is in the state of change, is adjusted to show only a limited portion of the object, said portion containing at least the activated part of the object.
9. A device as claimed in any one of the preceding claims, wherein the storage of the object in the memory is carried out in a format which comprises the mathematical functional representation as well as the polygon representation.
10. A device as claimed in any one of the preceding claims, wherein the data-converting means, when a control point is manipulated, is adjusted to provide also a corresponding movement of adjacent control points, which of said control points are to be manipulated as well as the extent to which the manipulation is to take place being determined by the data-converting means on the basis of predetermined parameter values, preferably set by the user.
11. A device as claimed in any one of the preceding claims, wherein the control means comprises a means for optionally bringing the object into a state of creation, in which additional control points are capable of being added to the object.
12. A method for interactively creating and modifying digitally reproduced three-dimensional (3D) objects in real time, which comprise a mathematical functional representation with suitably positioned control points, said method comprising the steps of producing a polygon representation of the object in real time on the basis of the mathematical functional representation, presenting the polygon representation of the object and optionally or always at least a portion of a control point hull containing the control points in real time to a user with the aid of a presentation means, changing in a user-controlled manner the object with the aid of control means by moving or adding control points, changing the stored mathematical functional representation of the object on the basis of said change, and subsequently updating the polygon representation in real time.
13. A method as claimed in claim 12, wherein the change of the object comprises the steps of first activating a portion of the object, the activated portion of the object being brought into a state of change and the corresponding portion of the control hull being shown as marked on the presentation means, and subsequently changing the set of the thus-shown control points.
14. A method as claimed in claim 13, wherein the tessellation allows a lower resolution, that is a higher degree of chordal deviation, when a portion of the object is in the state of change than otherwise.
15. A method as claimed in claim 13, wherein only a limited portion of the object is shown, when a part of the object is in the state of change, said portion containing at least the activated part of the object.
16. A method as claimed in any one of claims 12-15, wherein a manipulation of a control point also results in a corresponding movement of adjacent control points, which of said control points are to be manipulated as well as the extent to which the manipulation is to take place being determined on the basis of parameter values set by the user.
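Claims 7 and 14 tie tessellation resolution to chordal deviation: while a portion of the object is in the state of change, a larger deviation tolerance gives a coarser, cheaper mesh. As a hypothetical two-dimensional illustration of that trade-off (the subdivision scheme, function names, and the circular-arc example are illustrative choices, not taken from the patent), a parametric curve can be subdivided until its midpoint lies within a tolerance of the chord between the endpoints:

```python
import math

def chord_deviation(p0, p1, pm):
    # Distance from the curve midpoint pm to the chord p0-p1.
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return math.hypot(pm[0] - p0[0], pm[1] - p0[1])
    return abs(dx * (pm[1] - p0[1]) - dy * (pm[0] - p0[0])) / length

def adaptive_sample(f, t0, t1, tol, depth=0, max_depth=12):
    # Recursively split [t0, t1] until the curve's parameter midpoint
    # lies within `tol` of the chord; returns the breakpoint parameters.
    tm = 0.5 * (t0 + t1)
    if depth >= max_depth or chord_deviation(f(t0), f(t1), f(tm)) <= tol:
        return [t0, t1]
    left = adaptive_sample(f, t0, tm, tol, depth + 1, max_depth)
    right = adaptive_sample(f, tm, t1, tol, depth + 1, max_depth)
    return left[:-1] + right  # drop the shared midpoint
```

On a quarter circle, a viewing tolerance of 1e-3 produces 32 segments, while an editing tolerance of 5e-2 produces only 4 — the lower resolution the claim allows while a portion of the object is being changed.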
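Claims 10 and 16 state only that moving one control point also moves adjacent control points, to an extent determined by user-set parameter values. A minimal sketch of one way such behaviour could look — with a user-set radius and a smooth cosine falloff, both of which are my choices rather than the patent's — follows:

```python
import math

def move_with_falloff(points, picked, delta, radius, strength=1.0):
    # Move the picked control point by `delta`, and drag nearby control
    # points along with a cosine falloff that reaches zero at `radius`.
    # `radius` and `strength` stand in for the user-set parameter values.
    px, py, pz = points[picked]
    out = []
    for (x, y, z) in points:
        d = math.dist((x, y, z), (px, py, pz))
        if d >= radius:
            out.append((x, y, z))  # outside the influence region: untouched
            continue
        w = strength * (0.5 + 0.5 * math.cos(math.pi * d / radius))
        out.append((x + w * delta[0], y + w * delta[1], z + w * delta[2]))
    return out
```

The picked point (distance zero) receives the full displacement; neighbours receive a smoothly decreasing share, which keeps the modified surface fair without the user having to drag every control point individually.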
PCT/SE1999/002145 1998-11-20 1999-11-19 Method and device for creating and modifying digital 3d models WO2000031690A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP99972765A EP1131792A1 (en) 1998-11-20 1999-11-19 Method and device for creating and modifying digital 3d models
AU14385/00A AU1438500A (en) 1998-11-20 1999-11-19 Method and device for creating and modifying digital 3d models
SE0002373A SE0002373L (en) 1998-11-20 2000-06-26 Method and apparatus for creating and modifying digital 3D models

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE9803996-9 1998-11-20
SE9803996A SE9803996D0 (en) 1998-11-20 1998-11-20 Method and apparatus for creating and modifying digital 3D models

Publications (1)

Publication Number Publication Date
WO2000031690A1 true WO2000031690A1 (en) 2000-06-02

Family

ID=20413373

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE1999/002145 WO2000031690A1 (en) 1998-11-20 1999-11-19 Method and device for creating and modifying digital 3d models

Country Status (4)

Country Link
EP (1) EP1131792A1 (en)
AU (1) AU1438500A (en)
SE (1) SE9803996D0 (en)
WO (1) WO2000031690A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108984169B (en) * 2017-06-01 2022-05-03 刘开元 Cross-platform multi-element integrated development system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0526881A2 (en) * 1991-08-06 1993-02-10 Canon Kabushiki Kaisha Three-dimensional model processing method, and apparatus therefor
WO1995011482A1 (en) * 1993-10-21 1995-04-27 Taligent, Inc. Object-oriented surface manipulation system
US5442168A (en) * 1991-10-15 1995-08-15 Interactive Light, Inc. Dynamically-activated optical instrument for producing control signals having a self-calibration means
US5537529A (en) * 1993-04-22 1996-07-16 Apple Computer, Inc. Apparatus and method for creating versions of computer models and creating communications incorporating created versions therefrom
JPH09179062A (en) * 1995-12-25 1997-07-11 Canon Inc Computer system
WO1997046975A1 (en) * 1996-06-04 1997-12-11 Muncey Grant J Techniques for creating and modifying 3d models and correlating such models with 2d pictures

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7532213B2 (en) 2000-07-28 2009-05-12 Adrian Sfarti Bicubic surface real time tesselation unit
USRE42534E1 (en) 2000-07-28 2011-07-12 Adrian Sfarti Bicubic surface real-time tesselation unit
US7245299B2 (en) * 2003-05-12 2007-07-17 Adrian Sfarti Bicubic surface real-time tesselation unit
IT201700066749A1 (en) * 2017-06-15 2018-12-15 Gianluigi Palka METHOD OF PROCESSING IMAGES OF STRUCTURAL AND / OR ARCHITECTURAL ELEMENTS AND / OR URBAN AND / OR GEOGRAPHICAL ELEMENTS

Also Published As

Publication number Publication date
SE9803996D0 (en) 1998-11-20
AU1438500A (en) 2000-06-13
EP1131792A1 (en) 2001-09-12

Similar Documents

Publication Publication Date Title
US5973678A (en) Method and system for manipulating a three-dimensional object utilizing a force feedback interface
Dai Virtual reality for industrial applications
Weimer et al. A synthetic visual environment with hand gesturing and voice input
KR102249577B1 (en) Hud object design and method
US6842175B1 (en) Tools for interacting with virtual environments
US6091410A (en) Avatar pointing mode
Gregory et al. intouch: Interactive multiresolution modeling and 3d painting with a haptic interface
Kim et al. A haptic-rendering technique based on hybrid surface representation
US6084587A (en) Method and apparatus for generating and interfacing with a haptic virtual reality environment
US20020133264A1 (en) Virtual reality system for creation of design models and generation of numerically controlled machining trajectories
Liang et al. Geometric modeling using six degrees of freedom input devices
Bordegoni et al. Haptic and sound interface for shape rendering
CN109697002B (en) Method, related equipment and system for editing object in virtual reality
Marner et al. Large scale spatial augmented reality for design and prototyping
KR20100067155A (en) Apparatus and method for providing realistic contents through augmented book
JP2004362218A (en) Three-dimensional object operating method
JP2001216015A (en) Operation teaching device for robot
Angster VEDAM: virtual environments for design and manufacturing
WO2000031690A1 (en) Method and device for creating and modifying digital 3d models
Fiorentino et al. Surface design in virtual reality as industrial application
Springer et al. State-of-the-art virtual reality hardware for computer-aided design
Schkolne et al. Surface drawing.
WO2000065461A1 (en) Tools for interacting with virtual environments
JP4769942B2 (en) 3D design support system and 3D design support method
CA2496773A1 (en) Interaction with a three-dimensional computer model

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref country code: AU

Ref document number: 2000 14385

Kind code of ref document: A

Format of ref document f/p: F

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ CZ DE DE DK DK DM EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1999972765

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 09831831

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 1999972765

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWW Wipo information: withdrawn in national office

Ref document number: 1999972765

Country of ref document: EP