US20050083349A1 - Physically interacting with a processor-based display - Google Patents

Physically interacting with a processor-based display Download PDF

Info

Publication number
US20050083349A1
US20050083349A1
Authority
US
United States
Prior art keywords
display screen
image
processor
based system
article
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/005,694
Inventor
David Koizumi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/005,694
Publication of US20050083349A1
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03542Light pens for emitting or receiving light


Abstract

A physical element may be caused to appear to interact with an image displayed on a computer display screen. The position of the element with respect to the display screen may be determined automatically. The user can then manipulate the element to cause an image, which may appear to be connected to the element, to be altered. Therefore, the user gets the impression that the element is capable of interacting with and altering an image displayed on the display screen.

Description

    BACKGROUND
  • This invention relates generally to processor-based systems.
  • A processor-based system may include a display having a display screen. The display screen may display images generated by the processor-based system. Generally, there is no way to interact with those images in any extensive fashion.
  • For example, touch screens are available which enable the user to touch the display screen and thereby select an icon displayed on the screen. However, this operation requires a specialized display screen. The display screen must include a sensor which detects the presence of the user's finger and correlates that presence to the selection of an appropriate icon. Thus, the interaction that is possible is a direct result of the special configuration and construction of the display screen itself. Such interaction is not possible with an arbitrary display screen. Moreover, additional expense may be incurred in providing a display screen which is also sensitive to touch.
  • With the advent of three dimensional graphics, relatively life-like images may be produced in computer displays. Ideally, the user would like to interact with those graphics. Currently, electronic interaction is possible through keyboards and other input devices.
  • Thus, there is a need for a way to physically interact with the images displayed on a computer display screen.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of one embodiment of the present invention;
  • FIG. 2 is a perspective view of the embodiment shown in FIG. 1, with the user interacting with an image displayed on a display screen;
  • FIG. 3 is a block diagram in accordance with one embodiment of the present invention; and
  • FIG. 4 is a flow chart for software in accordance with one embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Referring to FIG. 1, an element 10 enables a user whose hand is indicated at A to interact with images being displayed on a computer display screen. In one embodiment, the element 10 includes a handle 12 that fits in the palm of the user's hand and presents a trigger 24 for operation by the user's index finger. Transversely connected to the handle 12, a telescoping shaft may include a proximal portion 14 and a distal portion 16.
  • The shaft portions 14 and 16 may be splined to prevent relative rotation. In one embodiment, the portions 14 and 16 are spring biased to extend apart unless constrained.
  • A sensor housing 18 may be coupled to the distal portion 16. The sensor housing 18 may include detectors 22 that may detect the positions of spring biased light pens 20. The spring biased light pens 20 telescope in and out of the housing 18 in the direction of the arrows. As the pens 20 extend in and out of the housing 18, the detectors 22 detect the position of each pen 20 relative to the housing 18.
  • Thus, turning to FIG. 2, the element 10 is shown in position pressed against the display screen 28 of a computer display 26. The display 26 may be any type of computer display including a computer monitor.
  • In this case, the pens 20 are pressed against the screen 28. Because three equally spaced pens 20 are utilized, the angular orientation of the element 10 with respect to the display screen 28 may be determined from the extension of each pen 20 with respect to the housing 18, as sketched below. However, in alternative embodiments, the element orientation detectors and the light pens may be separate devices.
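  • The underlying geometry is compact enough to sketch. The Python fragment below is a minimal, hedged illustration; the 120-degree pen spacing, the pen radius, and every name in it are assumptions for illustration rather than details taken from the patent. The three pen tips lie on the screen plane, so the plane's normal, and hence the element's tilt, follows from their positions.

```python
import numpy as np

def element_orientation(extensions, radius):
    """Estimate the element's tilt relative to the screen from the
    spring-biased extensions of three pens spaced 120 degrees apart
    around the housing axis, each at distance `radius` from that axis.

    Returns (tilt, azimuth) in radians; tilt is zero when the element
    is pressed squarely against the screen.
    """
    angles = np.radians([0.0, 120.0, 240.0])
    # Pen tips in housing coordinates; the screen plane passes through them.
    tips = np.stack([radius * np.cos(angles),
                     radius * np.sin(angles),
                     np.asarray(extensions, dtype=float)], axis=1)
    # Normal of the plane through the three tip points.
    n = np.cross(tips[1] - tips[0], tips[2] - tips[0])
    n /= np.linalg.norm(n)
    if n[2] < 0.0:
        n = -n                         # orient the normal toward the housing
    tilt = np.arccos(n[2])             # angle between housing axis and screen normal
    azimuth = np.arctan2(n[1], n[0])   # direction in which the element leans
    return tilt, azimuth
```

  Equal extensions give a zero tilt, corresponding to the element being held square against the screen.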
  • In some embodiments, the display screen 28, which may be glass, may be covered by another surface which may be flat and transparent. However, other shapes and transparencies may be used.
  • In the illustrated embodiment, the element 10 interacts with an image 30 displayed on the display screen 28. In one embodiment, the image 30 may be a scissors-type gripper. The gripper image 30 may grip a second image 32 such as a test tube. The user may press against the screen 28 to cause the images 30 and 32 to appear to extend “deeper” into the display screen 28. In actuality, the images 30 and 32 are altered to create this effect under computer control.
  • Thus, the exact position of the pens 20 on the display screen 28 may be determined in a fashion described hereinafter. The angular orientation and extension of the portions 14 and 16 may also be determined. As a result, the orientation of the element 10 with respect to the image 30 may be determined. This information may be utilized to allow the element 10 to seemingly interact with and actually alter the image 30. For example, as the user presses the element against the screen 28 against the spring bias between the portions 14 and 16, the image 30 may appear to extend further into the screen 28, as if the image 30 were actually part of the physical element 10. This allows the user, whose hand is indicated at A, to apparently physically interact with images 30, 32 displayed on the display screen 28.
  • Referring to FIG. 3, the element 10 may be coupled to a processor-based system 39 through a video pass through box 38 in one embodiment. The video pass through box 38 may receive video control signals from the processor-based system 39 headed for the display 26. Thus, the pass through box 38 may receive the vertical and horizontal sync signals that the system 39 may generate to control the display 26 in one embodiment of the present invention. In addition, the pass through box 38 receives the detector 22 signals from the element 10.
  • The pass through box 38 may be of the type conventionally utilized with light pens to determine a location on a display screen selected by a light pen. An example of such a box is the PXL-2000 USB External Interface available from FastPoint Technologies, Inc., Stanton, Calif. However, other techniques for identifying the location of the light pens 20 on the display screen 28 may also be used.
  • The pass through box 38 may receive signals from the light pens 20a, 20b and 20c. Each light pen 20 may detect light signals generated by one or more pixels making up the display screen 28. The optical transducers 34 convert those light signals into electrical signals and provide them to the video pass through box 38. In one embodiment, the signals pass through the video pass through box 38 into a serial port, such as a Universal Serial Bus hub 50, coupled to the processor-based system 39.
  • The detectors 22a-c that detect the extension of the pens 20 with respect to the housing 18 may also generate signals. In one embodiment, the detectors 22 may be rheostats that generate a signal indicative of the extent of spring biased extension of the pens 20 from the housing 18. Those analog signals may be converted to digital signals by the analog-to-digital converters 36. The digital signals may also pass through the pass through box 38 and the hub 50 to the processor-based system 39.
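  • As a hedged sketch of that conversion (the patent gives no electrical details, so the full-scale count and travel below are illustrative assumptions), the digitized rheostat reading maps linearly onto pen travel:

```python
def pen_extension_mm(adc_counts, adc_max=1023, full_travel_mm=15.0):
    """Map a digitized rheostat reading onto pen extension, assuming the
    rheostat sweeps linearly over the pen's full travel and the ADC maps
    that sweep onto 0..adc_max. Both constants are illustrative.
    """
    return full_travel_mm * (adc_counts / adc_max)
```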
  • A detector 22d may be associated with the portions 14 and 16 to determine their relative extension. Thus, the detector 22d determines the relative positions of the portions 14 and 16, which may be the result of the user pressing the element 10 into the screen 28 or releasing that force. In one embodiment, all of the detectors 22 may be rheostats.
  • Finally, user operation of the trigger 24 may generate signals. Each time the user operates the trigger 24, the extent of trigger deflection and its duration may be encoded into an analog signal. That analog signal may be converted into a digital signal by the analog-to-digital converter 36e. This digital signal, like the other signals discussed above, is passed through the video pass through box 38 and the hub 50 to the processor-based system 39. Taken together, these readings amount to one sample of the element's state per cycle, as sketched below.
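  • The record below is one way to picture everything the pass through box forwards in a single sample. It is purely illustrative; the patent does not define field names, units, or a wire format.

```python
from dataclasses import dataclass

@dataclass
class ElementSample:
    """One digitized reading of the element 10, as forwarded through the
    pass through box 38. All field names and units are assumptions.
    """
    pen_extensions: tuple[float, float, float]  # detectors 22a-c via ADCs 36, mm
    shaft_extension: float                      # detector 22d, shaft travel, mm
    trigger_deflection: float                   # trigger 24 position, 0..1, via ADC 36e
    trigger_duration_ms: float                  # how long the trigger has been held
    pen_flash_times: tuple[float, ...]          # light pen flash timestamps, seconds
```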
  • The processor-based system 39 may include a processor 44 coupled to system memory 46. The processor 44 may be coupled to a bus 42 which in turn is coupled to a graphics interface 40 in one embodiment. Signals generated by the processor 44 may be passed through the graphics interface 40 and the video pass through box 38 to the video display 26.
  • A bridge 48 may also be coupled to the bus 42. In one embodiment, the bridge 48 may be coupled to the hub 50 as well as a storage device 52 which may be a hard disk drive. The storage device 52 may store software 54 for controlling the interaction between the element 10 and the display screen 28.
  • The video pass through box 38 may receive the graphics signals intended for display on the video display 26. Thus, the box 38 may receive the vertical and horizontal sync signals as well. By extracting those vertical and horizontal sync signals, and comparing their timing to the timing of signals received from the light pens 20, the system 39 can determine the location of the light pens 20 on the display screen 28. In particular, the light pens 20 receive a flash of light when a particular underlying pixel of the display screen 28 is activated. The pixels may be sequentially activated in response to vertical and horizontal sync signals generated by the system 39. Thus, the time from the vertical and horizontal sync signals to the light flash is indicative of the position of the pens 20 on the screen 28.
  • In one embodiment, a vertical sync signal is generated to start each new frame. A horizontal sync signal is generated at the beginning of each line. Thus, by knowing when a light signal is received by a light pen 20 relative to when a corresponding vertical sync signal and horizontal sync signal was detected, the system 39 may determine the vertical and horizontal coordinates of each light pen 20, as sketched below. The pass through box 38 may do the initial analysis to determine the pen 20 position or may simply forward the raw information to the system 39 for analysis.
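  • A minimal sketch of that timing arithmetic follows; the blanking offsets and the timing constants are assumptions that depend entirely on the video mode, since the patent states only the principle.

```python
def pen_coordinates(dt_since_vsync, line_period_s, pixel_period_s,
                    lines_blanked=0, pixels_blanked=0):
    """Recover the (row, column) of a light pen flash from its delay after
    the vertical sync pulse: the raster advances one line per line_period_s
    and one pixel per pixel_period_s, with blanking intervals subtracted.
    """
    row = int(dt_since_vsync // line_period_s) - lines_blanked
    within_line = dt_since_vsync % line_period_s
    col = int(within_line // pixel_period_s) - pixels_blanked
    return row, col
```

  For a nominal 640x480 raster at 60 Hz with a 25.175 MHz pixel clock, the line period is about 31.8 microseconds and the pixel period about 39.7 nanoseconds, so the column measurement demands far finer timing resolution than the row measurement.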
  • The software 54, shown in FIG. 4, may begin, in one embodiment, by determining whether light pen data has been received, as indicated in diamond 56. If so, that data may be correlated to the vertical and horizontal sync signals as indicated in block 58. The frame and screen coordinates for each particular received light pen signal may then be determined as indicated in block 60.
  • Next, a check at diamond 62 determines whether detector 22 data was received. If so, the angle of the element 10 with respect to the display screen 28 may be calculated. In addition, the distance of the handle 12 from the display screen is also calculated, as indicated in block 64, using the shaft data from the portions 14 and 16.
  • A check at diamond 66 determines whether trigger activation has occurred. If so, an image such as the gripper image 30 may be altered. For example, the gripper image 30 may appear to “open”. For each unit of trigger activation in terms of time, a corresponding operation may be created virtually on the display screen 28 in one embodiment.
  • Based on the change in relative position between the portions 14 and 16, relative motion of the handle 12 with respect to the display screen, or rotation of the handle 12 relative to the display screen, the orientations of images 30 and 32 may be recalculated. The signals that generate the images 30 and 32 may be received and the revised signals may be transmitted to the display 26 for display, as indicated in block 70. One cycle of this flow is sketched below.
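  • Pulling the pieces together, a single pass of the FIG. 4 flow might look like the following hedged sketch, reusing ElementSample and element_orientation from the earlier fragments. The DisplayState fields, thresholds, and scaling are all illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class DisplayState:
    pen_coords: tuple = ()           # screen coordinates from blocks 58-60
    tilt_rad: float = 0.0            # element angle from block 64
    handle_distance_mm: float = 0.0  # handle-to-screen distance from block 64
    gripper_open: float = 0.0        # 0 = closed, 1 = fully open (image 30)
    image_depth: float = 0.0         # how far images 30/32 appear "into" the screen

def process_sample(sample, state, full_travel_mm=25.0, pen_radius_mm=20.0):
    """One pass of the FIG. 4 flow: diamonds 56, 62 and 66 map onto the
    three conditionals below; block 70 (redrawing images 30 and 32 from
    the updated state) is left to a renderer.
    """
    if sample.pen_flash_times:                     # diamond 56
        # Blocks 58-60: correlate flashes with sync timing; see
        # pen_coordinates above for the per-pen arithmetic.
        state.pen_coords = sample.pen_flash_times  # placeholder coordinates

    if sample.pen_extensions:                      # diamond 62
        tilt, _ = element_orientation(sample.pen_extensions, pen_radius_mm)
        state.tilt_rad = tilt                      # block 64
        state.handle_distance_mm = sample.shaft_extension
        # Pressing the element compresses the shaft; map that to depth.
        state.image_depth = 1.0 - sample.shaft_extension / full_travel_mm

    if sample.trigger_deflection > 0.05:           # diamond 66
        state.gripper_open = min(1.0, sample.trigger_deflection)

    return state
```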
  • The images 30 and 32 may be caused to move inwardly, as if they were coupled to the element 10, by pressing the element 10 harder against the screen 28. This action is detected by the detector 22d. Similarly, the element 10 may be rotated or angularly adjusted with respect to the screen, causing a corresponding change in position of the images 30 and 32. This action is detected by the detectors 22a-c. Similarly, operation of the trigger 24 may cause a preprogrammed change in one or both images 30 and 32.
  • In each case, three dimensional manipulation of the element 10 may result in a corresponding three dimensional alteration of an image 30 or 32. As a result, it may seem that the element 10 is physically linked to an image 30 or 32.
  • While the present invention has been described with respect to a limited number of embodiments, those skilled in the art will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations as fall within the true spirit and scope of this present invention.

Claims (20)

1. A method comprising:
enabling the position of an element on a display screen to be determined automatically; and
in response to physical manipulation of said element in three dimensions, enabling an image displayed on said display screen to be automatically altered in three dimensions.
2. The method of claim 1 including enabling the extension of said element to be determined automatically.
3. The method of claim 1 including providing a plurality of light pens on the end of said element such that said light pens may be pressed against the display screen.
4. The method of claim 3 including enabling the physical extension of said light pens from said element to be determined.
5. The method of claim 1 including enabling the angular rotation of said element to be determined.
6. The method of claim 1 including enabling the user to provide an input signal to indicate that an image displayed on said display screen should be modified in a preprogrammed fashion.
7. The method of claim 1 including enabling said image on said display screen to be altered automatically to appear to extend further into said display screen in response to said element being pressed harder against said display screen.
8. The method of claim 1 including creating the appearance that said image is physically connected to said element by enabling said image to be altered in a fashion that it would be altered in response to three dimensional manipulation of said element if the image were physically connected to said element.
9. The method of claim 1 including enabling said image to appear to grasp another image in response to manipulation of said element.
10. The method of claim 1 including enabling an image contacted by said element to appear to be altered in three dimensions in response to a corresponding manipulation of said element in three dimensions.
11. An article comprising a medium storing instructions that enable a processor-based system to:
determine the position of an element on a display screen; and
in response to physical manipulation of said element in three dimensions, alter an image displayed on the display screen.
12. The article of claim 11 further storing instructions that enable the processor-based system to determine the extension of said element.
13. The article of claim 11 further storing instructions that enable the processor-based system to determine the physical extension of a light pen from said element.
14. The article of claim 11 further storing instructions that enable the processor-based system to determine the angular rotation of said element.
15. The article of claim 11 further storing instructions that enable the processor-based system to recognize an input signal to indicate an image displayed on said display screen should be modified in a preprogrammed fashion.
16. The article of claim 11 further storing instructions that enable the processor-based system to alter an image so that the image appears to extend further into said display screen in response to said element being pressed against said display screen.
17. The article of claim 11 further storing instructions that enable the processor-based system to create the appearance that said image is physically connected to said element by altering said image in a fashion that it would be altered in response to three dimensional manipulation of said element if the image were physically connected to said element.
18. The article of claim 11 further storing instructions that enable the processor-based system to detect when said element is proximate to an image displayed on a display screen, and to alter said image in three dimensions in response to a corresponding manipulation of said element in three dimensions.
19. The article of claim 11 further storing instructions that enable the processor-based system to detect the position, with respect to said display screen, of a light pen associated with said element.
20-30. (canceled).
US11/005,694 2000-11-17 2004-12-07 Physically interacting with a processor-based display Abandoned US20050083349A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/005,694 US20050083349A1 (en) 2000-11-17 2004-12-07 Physically interacting with a processor-based display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/716,152 US6839049B1 (en) 2000-11-17 2000-11-17 Physically interacting with a processor-based display
US11/005,694 US20050083349A1 (en) 2000-11-17 2004-12-07 Physically interacting with a processor-based display

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/716,152 Division US6839049B1 (en) 2000-11-17 2000-11-17 Physically interacting with a processor-based display

Publications (1)

Publication Number Publication Date
US20050083349A1 true US20050083349A1 (en) 2005-04-21

Family

ID=33541655

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/716,152 Expired - Lifetime US6839049B1 (en) 2000-11-17 2000-11-17 Physically interacting with a processor-based display
US11/005,694 Abandoned US20050083349A1 (en) 2000-11-17 2004-12-07 Physically interacting with a processor-based display

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/716,152 Expired - Lifetime US6839049B1 (en) 2000-11-17 2000-11-17 Physically interacting with a processor-based display

Country Status (1)

Country Link
US (2) US6839049B1 (en)

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7251128B2 (en) * 2004-09-30 2007-07-31 Intel Corporation Adjustable portable computer
US8708822B2 (en) * 2005-09-01 2014-04-29 Nintendo Co., Ltd. Information processing system and program
JP4907129B2 (en) * 2005-09-01 2012-03-28 任天堂株式会社 Information processing system and program
WO2009146359A1 (en) 2008-05-28 2009-12-03 Illinois Tool Works Inc. Welding training system
US9101994B2 (en) 2011-08-10 2015-08-11 Illinois Tool Works Inc. System and device for welding training
US9573215B2 (en) 2012-02-10 2017-02-21 Illinois Tool Works Inc. Sound-based weld travel speed sensing system and method
US9368045B2 (en) 2012-11-09 2016-06-14 Illinois Tool Works Inc. System and device for welding training
US9583014B2 (en) 2012-11-09 2017-02-28 Illinois Tool Works Inc. System and device for welding training
US9583023B2 (en) 2013-03-15 2017-02-28 Illinois Tool Works Inc. Welding torch for a welding training system
US9672757B2 (en) 2013-03-15 2017-06-06 Illinois Tool Works Inc. Multi-mode software and method for a welding training system
US9728103B2 (en) 2013-03-15 2017-08-08 Illinois Tool Works Inc. Data storage and analysis for a welding training system
US9713852B2 (en) 2013-03-15 2017-07-25 Illinois Tool Works Inc. Welding training systems and devices
US9666100B2 (en) 2013-03-15 2017-05-30 Illinois Tool Works Inc. Calibration devices for a welding training system
US11090753B2 (en) 2013-06-21 2021-08-17 Illinois Tool Works Inc. System and method for determining weld travel speed
US10056010B2 (en) 2013-12-03 2018-08-21 Illinois Tool Works Inc. Systems and methods for a weld training system
US9724788B2 (en) 2014-01-07 2017-08-08 Illinois Tool Works Inc. Electrical assemblies for a welding system
US10170019B2 (en) 2014-01-07 2019-01-01 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US10105782B2 (en) 2014-01-07 2018-10-23 Illinois Tool Works Inc. Feedback from a welding torch of a welding system
US9589481B2 (en) 2014-01-07 2017-03-07 Illinois Tool Works Inc. Welding software for detection and control of devices and for analysis of data
US9757819B2 (en) 2014-01-07 2017-09-12 Illinois Tool Works Inc. Calibration tool and method for a welding system
US9751149B2 (en) 2014-01-07 2017-09-05 Illinois Tool Works Inc. Welding stand for a welding system
US9862049B2 (en) 2014-06-27 2018-01-09 Illinois Tool Works Inc. System and method of welding system operator identification
US10307853B2 (en) 2014-06-27 2019-06-04 Illinois Tool Works Inc. System and method for managing welding data
US9937578B2 (en) 2014-06-27 2018-04-10 Illinois Tool Works Inc. System and method for remote welding training
US10665128B2 (en) 2014-06-27 2020-05-26 Illinois Tool Works Inc. System and method of monitoring welding information
US9724787B2 (en) 2014-08-07 2017-08-08 Illinois Tool Works Inc. System and method of monitoring a welding environment
US11014183B2 (en) 2014-08-07 2021-05-25 Illinois Tool Works Inc. System and method of marking a welding workpiece
US9875665B2 (en) 2014-08-18 2018-01-23 Illinois Tool Works Inc. Weld training system and method
US11247289B2 (en) 2014-10-16 2022-02-15 Illinois Tool Works Inc. Remote power supply parameter adjustment
US10239147B2 (en) 2014-10-16 2019-03-26 Illinois Tool Works Inc. Sensor-based power controls for a welding system
US10490098B2 (en) 2014-11-05 2019-11-26 Illinois Tool Works Inc. System and method of recording multi-run data
US10204406B2 (en) 2014-11-05 2019-02-12 Illinois Tool Works Inc. System and method of controlling welding system camera exposure and marker illumination
US10210773B2 (en) 2014-11-05 2019-02-19 Illinois Tool Works Inc. System and method for welding torch display
US10373304B2 (en) 2014-11-05 2019-08-06 Illinois Tool Works Inc. System and method of arranging welding device markers
US10402959B2 (en) 2014-11-05 2019-09-03 Illinois Tool Works Inc. System and method of active torch marker control
US10417934B2 (en) 2014-11-05 2019-09-17 Illinois Tool Works Inc. System and method of reviewing weld data
US10427239B2 (en) 2015-04-02 2019-10-01 Illinois Tool Works Inc. Systems and methods for tracking weld training arc parameters
US10373517B2 (en) 2015-08-12 2019-08-06 Illinois Tool Works Inc. Simulation stick welding electrode holder systems and methods
US10593230B2 (en) 2015-08-12 2020-03-17 Illinois Tool Works Inc. Stick welding electrode holder systems and methods
US10657839B2 (en) 2015-08-12 2020-05-19 Illinois Tool Works Inc. Stick welding electrode holders with real-time feedback features
US10438505B2 (en) 2015-08-12 2019-10-08 Illinois Tool Works Welding training system interface
US11288978B2 (en) 2019-07-22 2022-03-29 Illinois Tool Works Inc. Gas tungsten arc welding training systems
US11776423B2 (en) 2019-07-22 2023-10-03 Illinois Tool Works Inc. Connection boxes for gas tungsten arc welding training systems

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734370A (en) * 1995-02-13 1998-03-31 Skodlar; Rafael Computer control device
US6377249B1 (en) * 1997-11-12 2002-04-23 Excel Tech Electronic light pen system
US6597347B1 (en) * 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4931018A (en) * 1987-12-21 1990-06-05 Lenco, Inc. Device for training welders
US5253068A (en) * 1992-01-31 1993-10-12 Crook Michael W Gun shaped remote control unit for a television
US5623582A (en) * 1994-07-14 1997-04-22 Immersion Human Interface Corporation Computer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects
US5600348A (en) * 1994-08-19 1997-02-04 Ftg Data Systems Adjustable tip light pen
JPH09152307A (en) * 1995-12-01 1997-06-10 Sega Enterp Ltd Apparatus and method for detection of coordinates, and game apparatus
US6146278A (en) * 1997-01-10 2000-11-14 Konami Co., Ltd. Shooting video game machine
IL120186A (en) * 1997-02-09 2000-06-01 Raviv Roni Display pointing device and method
US6097376A (en) * 1998-05-11 2000-08-01 Rothschild; Omri Light pen system for use with a CRT scanning display

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6597347B1 (en) * 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US5734370A (en) * 1995-02-13 1998-03-31 Skodlar; Rafael Computer control device
US6377249B1 (en) * 1997-11-12 2002-04-23 Excel Tech Electronic light pen system

Also Published As

Publication number Publication date
US6839049B1 (en) 2005-01-04

Similar Documents

Publication Publication Date Title
US6839049B1 (en) Physically interacting with a processor-based display
US8022928B2 (en) Free-space pointing and handwriting
US6326950B1 (en) Pointing device using two linear sensors and fingerprints to generate displacement signals
US5528263A (en) Interactive projected video image display system
EP0953934B1 (en) Pen like computer pointing device
US6351257B1 (en) Pointing device which uses an image picture to generate pointing signals
EP2325727B1 (en) Drawing, writing and pointing device for human-computer interaction
US7271795B2 (en) Intuitive mobile device interface to virtual spaces
US6654001B1 (en) Hand-movement-sensing input device
US20100103136A1 (en) Image display device, image display method, and program product
US6417837B1 (en) Coordinate input device
US6690357B1 (en) Input device using scanning sensors
US8754910B2 (en) Mouse having pan, zoom, and scroll controls
US20100103103A1 (en) Method And Device for Input Of Information Using Visible Touch Sensors
JP2002091689A (en) Four axes optical mouse
US20060256077A1 (en) Inertial sensing input apparatus
JP2006179000A (en) Mouse input device with secondary input device
US20150242179A1 (en) Augmented peripheral content using mobile device
WO2000070551A1 (en) Stylus pen for writing on the monitor
CN109460160B (en) Display control device, pointer display method, and non-transitory recording medium
US11334178B1 (en) Systems and methods for bimanual control of virtual objects
WO1996024923A1 (en) Calibration system for touch screens
TWI766509B (en) Display apparatus and control method of projected on-screen-display interface
EP1775656A1 (en) Inertial sensing input apparatus
JP6495519B2 (en) Image display device, display control program, and display control method

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION