US20050270496A1 - Projector with a device for measuring angle of inclination - Google Patents
- Publication number
- US20050270496A1 (application US 11/137,564)
- Authority
- US
- United States
- Legal status: Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/26—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/132—Overhead projectors, i.e. capable of projecting hand-writing or drawing during action
Definitions
- the present invention relates to a projector having a device for measuring an angle of inclination between the projection optical axis of a projection device and a projection surface, and a method for correcting a trapezoidal distortion.
- Japanese Patent Laid-open Publication No. 281597/97 discloses a process of adjusting the angle of a liquid crystal display unit.
- the process has the step of detecting the installation angle of a liquid crystal projector and the step of detecting the distance between the liquid crystal projector and the projection surface onto which an image is projected. According to the disclosed processes, the angle of the liquid crystal display unit needs to be mechanically adjusted.
- the distance measuring device is provided independently of the projector.
- the projected images are not utilized. Further, the measured distance and the actual distance to the screen do not always coincide if the screen is surrounded by a frame that projects forward toward the projector, or if the screen is set back away from the projector.
- Japanese Patent Laid-open Publication No. 169211/2001 discloses a process of correcting a distortion by projecting a beam spot onto a curved screen from an angle-controllable laser pointer.
- a spot image is generated and projected onto a screen by the projector, and the beam spot and the projected spot image are captured by a camera for measuring their positions.
- the spot image is moved until it overlaps with the beam spot, then the coordinate of the pixel of the spot image in the frame memory is replaced with the coordinate of the beam spot and stored in a coordinate transform parameter memory.
- the inclination of the projector in the vertical direction, which easily causes a distortion, is detected by a gravity sensor when a screen is set vertically.
- the distortion is corrected depending on the detected inclination. See the specification etc. of Japanese Patent Laid-open Publication No. 5278/2003 for details.
- the projector cannot correct the distortion correctly if the screen is not set vertically or if the screen is inclined in the horizontal plane with respect to the projection optical axis of the projector.
- an image can be projected without distortion from the projector onto the screen by means of conventional techniques such as the conversion of the coordinates of a frame memory of the projector, etc.
- a projector includes a projection device, a cross line positional information acquiring device for acquiring positional information of a cross line between a projection surface of the projection device and a plane crossing the projection surface, an inclination detecting device for calculating an angle of inclination between a projection optical axis of the projection device and the projection surface based on the positional information acquired by the cross line positional information acquiring device, and an image distortion correcting device for correcting a trapezoidal distortion of an input image supplied to the projection device, based on the angle of inclination calculated by the inclination detecting device.
- a projector includes a projection device, a cross line positional information acquiring device for acquiring positional information of a cross line between a projection surface of the projection device and a plane crossing the projection surface, a vertical angle-of-inclination acquiring device for detecting an angle of inclination of a projection optical axis of the projection device in a vertical plane, an inclination detecting device for calculating an angle of inclination between the projection optical axis and the projection surface in a horizontal plane based on the positional information acquired by the cross line positional information acquiring device, and an image distortion correcting device for correcting a trapezoidal distortion of an input image supplied to the projection device, based on the angle of inclination in the horizontal plane calculated by the inclination detecting device and based on the angle of inclination in the vertical plane calculated by the vertical angle-of-inclination acquiring device.
- a method of correcting a trapezoidal distortion of a projector includes the steps of acquiring positional information of a cross line between a projection surface of a projection device and a plane crossing the projection surface, calculating an angle of inclination between a projection optical axis of the projection device and the projection surface based on the acquired positional information, and correcting a trapezoidal distortion of an input image supplied to the projection device, based on the calculated angle of inclination.
- the present invention is advantageous in that a trapezoidal distortion of an image can be corrected with a simple mechanism, because horizontal and vertical angles of inclination of the projection surface with respect to the projection optical axis of the projector can be calculated, based on the positional information of a cross line between a basically vertical wall surface serving as the projection surface and a plane crossing the wall surface.
- if the projector further includes a vertical inclination sensor, then only the horizontal angle of inclination needs to be acquired from the positional information of the cross line. It is sufficient in this case that only a cross line between a front wall and a ceiling appears in the imaging range of the image sensor, a condition that is highly likely to be met. The present invention is therefore applicable to a wider range of uses, allowing for easy correction of a trapezoidal distortion due to horizontal and vertical inclinations of the projector.
- FIG. 1 is the block diagram of a projector having a trapezoidal distortion correcting device according to a first embodiment of the present invention
- FIGS. 2A through 2C are front, side elevational, and plan views, respectively, of the projector having the trapezoidal distortion correcting device according to the first embodiment
- FIGS. 3A and 3B are horizontal and vertical cross-sectional views, respectively, showing the projected image range and the image sensor imaging range when a projector is set such that the imaging range of the image sensor includes walls, a ceiling, and a floor adjacent to the front wall, as well as the front wall;
- FIG. 4 is a view showing an image captured by the imaging device of the image sensor when a projector is set as shown in FIGS. 3A and 3B ;
- FIGS. 5A and 5B are horizontal and vertical cross-sectional views, respectively, showing a projected image range and an image sensor imaging range, when a projector is set such that an image is projected along a projection optical axis of the projector that is inclined with respect to a front wall in the horizontal plane;
- FIG. 6 is a view showing an image captured by the imaging device of the image sensor when the projector is set as shown in FIGS. 5A and 5B ;
- FIG. 7 is a view showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 4 ;
- FIGS. 8A and 8B are views showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 6 ;
- FIGS. 9A and 9B are plan and side elevational views, respectively, showing the relationship between the actual cross line between the front wall and the ceiling shown in FIGS. 5A and 5B and the image of the cross line in the image that is captured by the imaging device of the image sensor;
- FIG. 10 is a view showing an image captured by the imaging device of the image sensor when the projector is set such that the main body of the projector is inclined only vertically with respect to the front wall;
- FIGS. 11A and 11B are views showing an image captured by the imaging device of the image sensor when the projector is set such that the projector is inclined horizontally and vertically with respect to the front wall;
- FIGS. 12A and 12B are views showing the relationship between the projector and the front wall, when the front wall faces the projector head-on and when it is inclined with respect to the projector;
- FIG. 13 is the block diagram of a projector having a trapezoidal distortion correcting device according to the second embodiment of the present invention.
- FIG. 14 is the flowchart of a process for correcting a trapezoidal distortion according to the second embodiment
- FIG. 15 is the block diagram of the projector having the trapezoidal distortion correcting device according to the third embodiment.
- FIGS. 16A through 16C are views showing an arrangement and operation of the laser positioning device.
- FIGS. 17A and 17B are views that illustrate a process for acquiring a cross line between a front wall serving as a projection surface and a ceiling that is joined to an upper edge of the front wall.
- a projector having a trapezoidal distortion correcting device will be described below with reference to FIGS. 1 and 2A through 2C.
- projector 10 has projection device 20 having projection lens 21 and display unit 22, image controller 23 for controlling an image generated by display unit 22, trapezoidal distortion correcting device 30, and central processing unit 60 for controlling the entire operation of projector 10.
- Trapezoidal distortion correcting device 30 calculates an angle of inclination between a front wall serving as projection surface 70 and projector 10 , and corrects a distortion of an image that is inputted to trapezoidal distortion correcting device 30 .
- Image controller 23 controls an image of display unit 22 , based on an output signal from trapezoidal distortion correcting device 30 , thereby correcting the distortion on the image displayed on projection surface 70 .
- the image distortion is automatically corrected according to a predetermined process by central processing unit 60 .
- trapezoidal distortion correcting device 30 has image sensor 50, image capturer 31, inclination detector 32, image input unit 41, and image distortion correcting circuit 33.
- Image sensor 50 has imaging lens 51 and imaging device 53 .
- Imaging lens 51 is disposed on a front surface of projector 10 and has an optical axis in a predetermined direction and a predetermined imaging range.
- Imaging device 53 is disposed perpendicularly to the optical axis of imaging lens 51 . Imaging device 53 detects light that passes through imaging lens 51 and outputs desired positional information of an image represented by the detected light.
- Imaging device 53 has an imaging surface covering the imaging range of imaging lens 51 .
- Imaging device 53 is a two-dimensional imaging device, such as an image pickup tube or a CCD (Charge-Coupled Device), that outputs an image as a collection of pixels.
- Image capturer 31 captures an image from imaging device 53 as image information.
- Inclination detector 32 analyzes positional information of the captured image and calculates the angle of inclination between the front wall and projector 10 .
- Image distortion correcting circuit 33 corrects a trapezoidal distortion of the image that is supplied to image input unit 41 , based on the angle of inclination calculated by inclination detector 32 .
- Image input unit 41 is supplied with video information that represents an image that is projected by projection device 20 , and supplies an output signal to image controller 23 .
- Projector 10 utilizes the positional information of a horizontal cross line between the front wall surface serving as projection surface 70 and a ceiling or a floor which crosses the front wall surface, and/or a vertical cross line between the front wall surface serving as projection surface 70 and a side wall surface which crosses the front wall surface. Specifically, a horizontal and/or vertical angle of inclination between projection optical axis 27 of projection device 20 of projector 10 and projection surface 70 is calculated, based on the positional information on the cross line acquired by imaging device 53 of image sensor 50 .
- the positional information of a cross line may be acquired by various processes; two processes are available for acquiring it from an image generated by imaging device 53. According to the first process, the positional information of a cross line is acquired as a luminance-change line in a captured image, which represents all of the reflected light emitted from the reflecting surfaces, including projection surface 70, in front of projector 10 and passed through imaging lens 51 to imaging device 53. In this process, the cross line needs to be included in the imaging range of image sensor 50. Since the imaging means is a digital camera, the cross line is usually recognized as a collection of luminance-change spots, and the positional information can be obtained by analyzing the pixels if there is a sufficient luminance change.
- the cross line in an image generated by imaging device 53 can usually be detected as a change in the luminance of the reflected light. Filtering or other appropriate processing to image data may be used in order to acquire a clear boundary line.
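A rough sketch of this first process: scan each pixel column of a grayscale frame for the first large vertical luminance change. The 8x8 frame and the threshold value below are hypothetical, chosen only to illustrate the idea.

```python
def find_boundary_rows(image, threshold=40):
    """For each pixel column, return the first row index at which the
    vertical luminance change exceeds the threshold (None if there is
    none): a crude per-column estimate of the wall/ceiling boundary."""
    height, width = len(image), len(image[0])
    boundary = []
    for x in range(width):
        hit = None
        for y in range(1, height):
            if abs(image[y][x] - image[y - 1][x]) > threshold:
                hit = y
                break
        boundary.append(hit)
    return boundary

# Toy 8x8 grayscale frame: a brighter "ceiling" (200) above row 3 and a
# darker "wall" (80) below it.
frame = [[200] * 8 for _ in range(3)] + [[80] * 8 for _ in range(5)]
print(find_boundary_rows(frame))  # [3, 3, 3, 3, 3, 3, 3, 3]
```

Fitting a straight line through the returned column/row pairs would then yield the cross line as a luminance-change line.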
- the second process is applicable if the projection range of projection device 20 can be expanded sufficiently to cover a ceiling, a floor, or side walls.
- two or more test patterns each consisting of a vertical or a horizontal straight line are projected from projection device 20 onto front surfaces including projection surface 70 , and bent points of the test patterns appearing on the captured image of reflected light are acquired.
- a cross line between a flat surface serving as the projection surface and a surface crossing the flat surface is calculated as a straight line joining the bent points.
- each test pattern appears in the captured image as two straight lines joined at a point lying on the cross line between the ceiling and the wall surface.
- each of the two crossing straight lines is determined, as a straight-line equation, from the coordinates of two detection points on it.
- the bent point is determined as the cross point between the two fitted straight lines, and the cross line can then be determined, as a straight-line equation, from the coordinates of the two bent points obtained in this way.
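The line-fitting steps above can be sketched as follows. The detected point coordinates are hypothetical; in practice they would come from the captured image of the projected test patterns.

```python
def line_through(p, q):
    """Coefficients (a, b, c) of the line a*x + b*y = c through points p and q."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def intersect(l1, l2):
    """Intersection point of two lines given as (a, b, c) triples."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("lines are parallel")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hypothetical detected points on each vertical test stripe: two on its wall
# portion and two on its ceiling portion. The fitted lines meet at the
# stripe's bent point.
bent1 = intersect(line_through((0.0, 0.0), (0.0, 1.0)),
                  line_through((0.5, 2.5), (1.0, 3.0)))
bent2 = intersect(line_through((2.0, 0.0), (2.0, 1.0)),
                  line_through((2.5, 2.5), (3.0, 3.0)))
# The wall/ceiling cross line is the straight line through the two bent points.
cross_line = line_through(bent1, bent2)
```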
- a process of calculating an angle of inclination between the projection optical axis and the projection surface in the trapezoidal distortion correcting device of the projector according to the first embodiment of the present invention will be described below. It is assumed that projection optical axis 27 of projection lens 21 and the optical axis of imaging lens 51 lie parallel to each other. If they do not, an angle of inclination between projection optical axis 27 and projection surface 70 can still be calculated, based on the known relationship between the two optical axes.
- FIGS. 3A and 3B are horizontal and vertical cross-sectional views, respectively, showing the projected image range and the image sensor imaging range when a projector is set such that the imaging range of the image sensor includes walls, a ceiling, and a floor adjacent to the front wall, as well as the front wall.
- FIG. 4 is a view showing an image captured by the imaging device of the image sensor when a projector is set as shown in FIGS. 3A and 3B .
- Projected image range 71 is a range in which an image projected from projection lens 21 is displayed
- image sensor imaging range 72 is a range in which an image is captured by image sensor 50 .
- Projector 10 is usually set so as to project an image onto front wall 61 such that the image is displayed in the substantially central area of front wall 61 in the horizontal direction, and is designed to project the image at a slightly upward angle in the vertical direction. Therefore, the projected image is directed slightly upward with respect to the front-to-back horizontal axis of projector 10.
- the imaging range of image sensor 50 is wider than the projected image range of projector 10. If projector 10 is set as shown in FIGS. 3A, 3B, and 4, then front wall 61, right side wall 62, left side wall 63, ceiling 64, and floor 65 are included in the imaging range of image sensor 50.
- FIGS. 5A and 5B are horizontal and vertical cross-sectional views, respectively, showing a projected image range and an image sensor imaging range, when a projector is set such that an image is projected along a projection optical axis of the projector that is inclined with respect to a front wall in the horizontal plane.
- FIG. 6 is a view showing an image captured by the imaging device of the image sensor when the projector is set as shown in FIGS. 5A and 5B .
- imaging device 53 of image sensor 50 captures an image as shown in FIG. 6 .
- cross lines between front wall 61 and ceiling 64 and between front wall 61 and floor 65 are captured as inclined cross lines, unlike those in the captured image shown in FIG. 4 .
- Inclination detector 32 of trapezoidal distortion correcting device 30 detects these cross lines from the image that is generated by imaging device 53 of image sensor 50 and captured by image capturer 31 , according to the process described above, generates parameters for correcting an image distortion, and outputs the generated parameters to image distortion correcting circuit 33 .
- FIG. 7 is a view showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 4 .
- FIGS. 8A and 8B are views showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 6 .
- FIG. 8A shows the captured image
- FIG. 8B shows horizontal and vertical reference lines to determine the positional information of the cross line in the captured image.
- the horizontal and vertical reference lines are provided as hypothetical lines and defined with respect to the origin that is established at the center of the captured image.
- the cross line between the front wall and the ceiling that is detected is shown as a bold line.
- the angle of inclination between front wall 61 and the main body of projector 10 can be determined by calculating, from the positional information recognized by image sensor 50, the angle of inclination of cross line 66b between image 61b of the front wall and image 64b of the ceiling in the captured image in FIGS. 8A and 8B.
- front wall 61 extends vertically and ceiling 64 extends horizontally; they cross perpendicularly to each other, and the cross line formed where their edges meet extends horizontally. If projector 10 is inclined only in a horizontal plane, then cross line 66b is detected from the image captured by image sensor 50 according to the process described above, as shown in FIG. 8A.
- left vertical reference line V1 and right vertical reference line V2 are provided at given intervals on the left and right sides of center C of the captured image, respectively.
- cross line 66b crosses left vertical reference line V1 and right vertical reference line V2 at cross points a0 and a1, respectively, and central vertical reference line V0, which passes through center C, crosses cross line 66b at a cross point through which first horizontal reference line H1 passes.
- the positional information of cross points a0 and a1 can be represented by coordinates (x, y) in a two-dimensional coordinate system that has center C as its origin.
- the section of cross line 66b between cross points a0 and a1 is shown as a bold line.
- cross line 67b between image 61b of the front wall and image 62b of the right side wall is captured as a vertical line in the image by imaging device 53 of image sensor 50.
- the line segment between cross points a0 and a1 does not extend horizontally in the image that is captured by image sensor 50, though the actual cross line between front wall 61 and ceiling 64 extends horizontally.
- FIGS. 9A and 9B are plan and side elevational views, respectively, showing the relationship between the actual cross line between the front wall and the ceiling shown in FIGS. 5A and 5B , and the image of the cross line in the image that is captured by the imaging device of the image sensor.
- Broken line V in FIGS. 9A and 9B represents hypothetical plane V for explaining the imaging surface of imaging device 53 of image sensor 50 .
- Hypothetical plane V extends perpendicularly to optical axis 55 that passes center C of the imaging surface of imaging device 53 and center 52 of imaging lens 51 .
- Hypothetical plane V is displayed at a reduced scale on the imaging surface of imaging device 53 which extends parallel to hypothetical plane V.
- line segment a0-a1 of cross line 66b is inclined to hypothetical plane V.
- actual point a1 is captured as point a1′ on hypothetical plane V, as shown in FIG. 9A.
- the angle of inclination θ of front wall 61 with respect to a plane perpendicular to optical axis 55 of image sensor 50, which is the quantity ultimately to be determined, can be calculated trigonometrically from the positional information of cross points a0 and a1.
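One way to see how θ can fall out of the two cross points is a standard pinhole-camera derivation; this is an illustrative model, not necessarily the patent's exact formulation. For a wall yawed by θ, a horizontal ceiling line at height H and axial distance Z0 images as the straight line y = f·H/Z0 − (H·tanθ/Z0)·x, so θ = atan(−m·f/b), where m and b are the imaged line's slope and intercept and f is the focal length in pixel units.

```python
import math

def wall_yaw_deg(a0, a1, f):
    """Horizontal inclination of the wall from two image points a0, a1 on
    the imaged wall/ceiling cross line (pinhole model, origin at image
    center C, f = focal length in pixel units)."""
    (x0, y0), (x1, y1) = a0, a1
    m = (y1 - y0) / (x1 - x0)   # slope of the imaged cross line
    b = y0 - m * x0             # its height at the central vertical line
    return math.degrees(math.atan(-m * f / b))

# Synthetic example (f = 500 px, H/Z0 = 0.5): two points read off the imaged
# cross line recover the 30-degree yaw used to generate them.
theta = wall_yaw_deg((-100.0, 278.8675), (100.0, 221.1325), f=500.0)  # theta ≈ 30.0
```

A head-on wall gives a horizontal imaged cross line (m = 0) and therefore zero yaw, consistent with the captured image of FIG. 4.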
- the horizontal angle of inclination may be determined by referring to a table which was prepared in advance and represents the relationship between the horizontal angle of inclination between optical axis 27 and projection surface 70 , and the variables which are obtained from the positional information.
- the above trigonometric values may be stored as a data table in a memory, and the stored data may be read from the memory.
- a table may be stored in and read from a memory which represents the relationship between points a0, a1, expressed by (x, y) coordinates such as (xa0, ya0), (xa1, ya1) on the image sensor, and angle data.
- the relationship between points a0, a1 and any other data required for the final image conversion may also be stored in the memory. This process greatly reduces the amount of calculation to be performed by central processing unit 60.
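Such a precomputed table might look like the sketch below; the quantization step and table range are illustrative choices, not values from the patent.

```python
import math

# Hypothetical precomputed table: cross-line slope (quantized to 0.01) ->
# inclination angle in degrees, built once so that the projector's CPU can
# avoid evaluating atan at run time.
STEP = 0.01
ANGLE_TABLE = {round(i * STEP, 2): math.degrees(math.atan(i * STEP))
               for i in range(-100, 101)}

def lookup_angle(slope):
    """Nearest-entry table lookup in place of a trigonometric calculation."""
    clamped = max(-1.0, min(1.0, slope))
    return ANGLE_TABLE[round(clamped, 2)]
```

The memory cost is a few hundred entries, traded against per-frame trigonometry on the embedded processor.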
- the lens of an image sensor usually has a distortion. It is often necessary, in the present embodiment as well, to take such a lens distortion into account in order to calculate the positional information accurately.
- the positional information of a point obtained by the image sensor may be represented as (x0, y0) and converted into (x0′, y0′) prior to the above calculation, by referring to a distortion correcting table. In this way the trapezoidal distortion can be corrected with the lens distortion already taken into account, and no separate lens-distortion correction step is necessary.
- Image distortion correcting circuit 33 generates corrective parameters for correcting the trapezoidal distortion of the image, using the angle of inclination identified by the above process, according to a known process. These parameters are applied to the input image, and its trapezoidal distortion can automatically be corrected.
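As a simplified illustration of what such corrective parameters could amount to (a toy geometric model, not the patent's actual parameter-generation process): on a wall yawed by θ, the farther edge of the projected image lands at a greater distance and therefore appears larger, so it must be pre-shrunk in proportion. The sign convention and field of view below are assumptions.

```python
import math

def edge_prescale(yaw_deg, half_fov_deg=15.0):
    """Relative vertical scale factors for the left and right edges of the
    input image so that both edges appear equal in height on a wall yawed
    by yaw_deg (half_fov_deg = half the horizontal projection angle; both
    values are illustrative)."""
    th = math.radians(yaw_deg)
    ph = math.radians(half_fov_deg)
    # distance (up to a common factor) from the lens to the wall along the
    # rays through the left and right image edges
    d_left = math.cos(th) / math.cos(-ph - th)
    d_right = math.cos(th) / math.cos(ph - th)
    # a farther edge projects larger, so it is shrunk proportionally
    return 1.0 / d_left, 1.0 / d_right

s_left, s_right = edge_prescale(20.0)  # wall yawed 20 degrees to the left
```

With zero yaw the two scale factors coincide, i.e., no keystone correction is applied.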
- FIG. 10 is a view showing an image captured by the imaging device of the image sensor when the projector is set such that the main body of the projector is inclined only vertically with respect to the front wall.
- the angle of inclination of cross line 67 c between image 61 c of the front wall and image 62 c of the right side wall can be calculated in the same manner as described above.
- the angle of inclination of the main body of projector 10 can be then calculated with respect to front wall 61 in the vertical direction, which allows projector 10 to automatically correct a trapezoidal distortion of the projected image.
- FIGS. 11A and 11B are views showing an image captured by the imaging device of the image sensor when the projector is set such that the projector is inclined horizontally and vertically with respect to the front wall.
- FIG. 11A shows the captured image
- FIG. 11B shows the highlighted cross lines.
- Cross line 66d and cross line 67d join at cross point e0.
- Cross line 66d has left end point e1 in the image.
- Cross line 67d and the limit line of the imaging range cross each other at cross point e2.
- Central vertical reference line V0 and central horizontal reference line H0 pass through image center C, which is the origin.
- Right vertical reference line V3 and second horizontal reference line H2 pass through cross point e0.
- cross line 66d between image 61d of the front wall and image 64d of the ceiling, and cross line 67d between image 61d of the front wall and image 62d of the right side wall are extracted first. Then the x, y, and z coordinates of cross point e0, end point e1, and cross point e2 are calculated.
- Line segments e0-e1 and e0-e2 can be expressed explicitly by the coordinates (x, y, z) of cross point e0, the angle θh formed between line segment e0-e1 and the horizontal line, the angle θv formed between line segment e0-e2 and the vertical line, and the angle θc formed between these line segments.
- FIGS. 12A and 12B are views showing the relationship between the projector and the front wall, when the front wall faces the projector head-on and is inclined with respect to the projector.
- the projector is assumed to be fixed, and the front wall is assumed to rotate with respect to the fixed projector.
- FIG. 12B is a plan view showing line segment S representing a vertical wall that rotates in the imaging range of the image sensor.
- Line segment S rotates about pivot point Cr0, i.e., moves in the back-and-forth direction when viewed from image sensor 50.
- reference point d0 positioned at the end of line segment S moves toward point d1 or point d2.
- the movement of reference point d0 can be detected within the range represented by angle α in the image that is captured by the image sensor.
- the angle α is defined by point d2, where reference point d0 overlaps with pivot point Cr0 when viewed from image sensor 50 (this situation normally does not occur, since it means that the wall has rotated to an angular position where it is recognized as overlapping image sensor 50), and by point d1, where a hypothetical line drawn from image sensor 50 forms a right angle with line segment S (the right side of the wall is recognized as being turned toward image sensor 50, when viewed from image sensor 50).
- the reference point which corresponds to the image of a cross line moves with the rotation of hypothetical plane V.
- the angle of inclination of projector 10 with respect to the front wall can be determined by rotating hypothetical plane V about the x-axis and the y-axis to transform the coordinates of cross points e0, e1, e2, and by finding the rotational angle at which line segment e0-e1 and line segment e0-e2 lie horizontally and vertically, respectively.
- the inclination of the front wall can be calculated by identifying e0 (x, y, z), θh, θv, and θc in the image, shown in FIG. 11B, that is captured by image sensor 50. Since the rotational angle of the image sensor can be identified together with these parameters, the angle of inclination of the main body of projector 10 with respect to the vertical direction can also be detected. Inclination detector 32 of trapezoidal distortion correcting device 30 calculates the parameters of the projection surface, such as its position and shape, based on the detected angle of inclination of the main body of projector 10 with respect to the wall surface. The calculated parameters are applied to image distortion correcting circuit 33, and projection device 20 automatically projects an appropriately shaped image, i.e., a distortion-free image, onto the wall surface.
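The rotate-and-reproject search can be sketched with a brute-force grid over whole-degree angles; this is illustrative only, and a real implementation would solve for the rotation analytically or with a proper optimizer. The point coordinates and focal length are hypothetical.

```python
import math

def rotate(p, ax, ay):
    """Rotate point p about the x-axis by ax, then the y-axis by ay (radians)."""
    x, y, z = p
    y, z = y * math.cos(ax) - z * math.sin(ax), y * math.sin(ax) + z * math.cos(ax)
    x, z = x * math.cos(ay) + z * math.sin(ay), -x * math.sin(ay) + z * math.cos(ay)
    return x, y, z

def project(p, f=1.0):
    """Pinhole reprojection of a rotated point onto the hypothetical plane."""
    x, y, z = p
    return f * x / z, f * y / z

def find_inclination(e0, e1, e2, f=1.0):
    """Search (in whole degrees) for the rotation of hypothetical plane V
    that makes segment e0-e1 horizontal and segment e0-e2 vertical after
    reprojection; e0, e1, e2 are image points, each lifted to a ray (x, y, f)."""
    rays = [(x, y, f) for (x, y) in (e0, e1, e2)]
    best, best_err = (0, 0), float("inf")
    for ax_deg in range(-45, 46):
        for ay_deg in range(-45, 46):
            ax, ay = math.radians(ax_deg), math.radians(ay_deg)
            q0, q1, q2 = [project(rotate(r, ax, ay), f) for r in rays]
            err = abs(q1[1] - q0[1]) + abs(q2[0] - q0[0])  # alignment residuals
            if err < best_err:
                best, best_err = (ax_deg, ay_deg), err
    return best

# If the cross lines are already horizontal/vertical in the image, no
# rotation is needed.
print(find_inclination((0.2, 0.3), (-0.3, 0.3), (0.2, -0.2)))  # (0, 0)
```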
- projectors should preferably be installed such that a projected image is displayed only onto a front wall, not onto a ceiling or side walls. Since the position of the cross line is known together with the detected angle of inclination, the projected image can be reduced so that it falls on the front wall only.
- the angle of inclination between the projector and the front wall is acquired by using the cross line between the front wall and a surface that is connected to the front wall.
- this angle may also be acquired by using a contact line between the front wall and, for example, a horizontal desk in contact with the front wall.
- FIG. 13 is a block diagram of a projector having a trapezoidal distortion correcting device according to a second embodiment of the present invention.
- Trapezoidal distortion correcting device 30 has an inclination sensor (G sensor) that uses a conventional acceleration detector of the type used, for example, for centering a machine when it is installed.
- the inclination sensor is a vertical inclination sensor 54 for accurately measuring an angle of inclination with respect to the gravitational direction and for outputting the measured angle of inclination as numerical data.
- Other details of the projector according to the second embodiment are identical to those of the projector according to the first embodiment; parts identical to those of the first embodiment are denoted by identical reference numerals and will not be described in detail below.
- a trapezoidal distortion of an image is corrected based only on the information obtained from an image that is captured by image sensor 50 .
- a horizontal cross line such as cross line 66 between front wall 61 and ceiling 64
- a vertical cross line such as cross line 67 between front wall 61 and right side wall 62
- images are usually projected onto an upper area of a wall
- cross line 66 between front wall 61 and ceiling 64 is usually included in image sensor imaging range 72 .
- cross lines 67 between front wall 61 and side walls 62 , 63 are often excluded in image sensor imaging range 72 .
- an angle of inclination in the vertical direction is detected by vertical inclination sensor 54 , and is inputted to inclination detector 32 .
- Image capturer 31 captures the image from imaging device 53 , and supplies inclination detector 32 with the positional information of cross line 66 between front wall 61 and ceiling 64 .
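The extraction of the cross-line positional information can be pictured with a toy luminance scan. The following Python sketch is illustrative only (the patent does not prescribe an algorithm; the nested-list image format and the threshold value are assumptions): it finds, in each image column, the row with the largest brightness jump, which approximates the wall-to-ceiling luminance boundary.

```python
def cross_line_rows(img, threshold=30):
    """For each column of a grayscale image (nested lists of brightness
    values), return the row index just above the largest vertical
    brightness jump, or None if no jump exceeds the threshold. A crude
    stand-in for locating the wall/ceiling luminance boundary."""
    rows, cols = len(img), len(img[0])
    result = []
    for c in range(cols):
        jump, row = max((abs(img[r + 1][c] - img[r][c]), r)
                        for r in range(rows - 1))
        result.append(row if jump >= threshold else None)
    return result
```

Fitting a straight line to the returned row indices would then give the positional information of the cross line in the captured image.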
- Inclination detector 32 calculates an angle of inclination in the horizontal direction based on the positional information.
- Inclination detector 32 outputs the calculated angle of inclination in the horizontal direction, together with the angle of inclination in the vertical direction that is detected by vertical inclination sensor 54 , to image distortion correcting circuit 33 .
- Image distortion correcting circuit 33 generates LSI control parameters, corrects trapezoidal distortions in the vertical and horizontal directions of the image that is input from image input unit 41 , and outputs corrected image data to image controller 23 .
- vertical inclination sensor 54 is an acceleration sensor or a gravitational sensor utilizing gravity.
- vertical inclination sensor 54 may be a device for detecting the tilt angle of a tilting mechanism of the main body of projector 10 .
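As an illustration of how such a sensor reading becomes numerical angle data (the axis names and sign convention are assumptions of this sketch, not part of the patent), a static two-axis accelerometer measurement can be converted to a pitch angle with atan2:

```python
import math

def vertical_tilt_deg(a_body, a_axis):
    """Pitch of the projection optical axis from a static two-axis
    accelerometer reading: a_body is the gravity component along the
    projector's vertical body axis, a_axis the component along the
    optical axis. Axis names and signs are illustrative assumptions."""
    return math.degrees(math.atan2(a_axis, a_body))
```

When the projector is level, gravity lies entirely along the body axis, so the computed angle is zero.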
- This process includes a process of identifying an angle of inclination in the horizontal direction based on the positional information of a cross line between a wall surface and a ceiling in an image sensor imaging range, acquiring an angle of inclination in the vertical direction from a vertical inclination sensor, and correcting an output image on a display unit in the projector.
- image capturer 31 acquires image information from imaging device 53 of image sensor 50 in step S 1 .
- inclination detector 32 acquires cross line 66 b between image 61 b of the front wall and image 64 b of the ceiling from the image information in step S 2 , acquires cross points a 0 , a 1 of left and right reference lines V 1 , V 2 and cross line 66 b in the image in step S 3 . Inclination detector 32 then assigns coordinates to cross points a 0 , a 1 in step S 4 .
- inclination detector 32 calculates the distance between the two cross points in a direction parallel to the optical axis, based on the distance between the two cross points in the vertical direction, the distance in the vertical direction between optical axis 55 and a limit line of the image sensor imaging range, and vertical angle ⁇ 0 of the image sensor imaging range in step S 5 .
- inclination detector 32 calculates an angle of inclination in the horizontal direction of the projector, based on the distance between the two cross points in the direction parallel to the optical axis, the distance between the two cross points in the horizontal direction, and horizontal angle ⁇ 0 of the image sensor imaging range in step S 6 , acquires an angle of inclination in the vertical direction of the projector from vertical inclination sensor 54 in step S 7 , and outputs the acquired and calculated angle data to image distortion correcting circuit 33 .
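Steps S 5 and S 6 can be sketched under a simple pinhole-camera model (illustrative Python; the function name, the coordinate conventions, and the focal length f expressed in pixel units are assumptions of this sketch). A point on the ceiling imaged at (x, y) lies at depth z = f·h/y and lateral offset X = x·h/y, where the unknown ceiling height h above the optical axis cancels when the angle of the recovered cross line is taken:

```python
import math

def horizontal_tilt_deg(a0, a1, f):
    """a0, a1: (x, y) image coordinates of the cross points of the
    cross line with the left and right reference lines, with y measured
    upward from the optical axis; f: focal length in the same units.
    Recovers the cross line in space (up to the common scale h) and
    returns its horizontal angle relative to the image plane."""
    (x0, y0), (x1, y1) = a0, a1
    z0, z1 = f / y0, f / y1      # depths along the optical axis, up to h
    X0, X1 = x0 / y0, x1 / y1    # lateral offsets, same common factor h
    return math.degrees(math.atan2(z1 - z0, X1 - X0))
```

When the projector faces the wall head-on, both cross points image at the same height, the recovered depths coincide, and the angle is zero.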
- Image distortion correcting circuit 33 generates LSI control parameters in step S 8 , and controls the projector image processing LSI circuit in step S 9 .
- Input image 24 is then corrected, and display unit 22 generates output image 25 in step S 10 .
- When output image 25 is projected onto projection surface 70 , an image similar to input image 24 is displayed on projection surface 70 .
- FIG. 15 is a block diagram of a projector having a trapezoidal distortion correcting device according to a third embodiment of the present invention.
- In the first and second embodiments described above, image sensor 50 and image capturer 31 acquire positional information of a cross line between a plane which serves as the projection surface and a plane which crosses that plane.
- In the third embodiment, laser positioning device 150 and position acquirer 131 are used to acquire positional information of the cross line.
- Laser positioning device 150 points laser beams at desired positions by means of laser pointer 151 that is capable of projecting a laser beam within a predetermined range including an image that is projected by the projector.
- position acquirer 131 acquires positional information of the cross line in a hypothetical image, and outputs the acquired positional information to inclination detector 32 .
- inclination detector 32 processes the supplied positional information to calculate an angle of inclination between projection optical axis 27 of projection device 20 and projection surface 70 .
- the projector according to the third embodiment is identical in arrangement and operation to the projector according to the first embodiment, except that laser positioning device 150 and position acquirer 131 are provided instead of image sensor 50 and image capturer 31 .
- Various processes are available to acquire positional information of a cross line by means of laser positioning device 150 .
- a typical process will be described below as an example.
- a laser positioning device is used that is capable of controlling the direction of the laser beam, and determining the position of the laser beam in a hypothetical image by a signal that is input when the laser beam overlaps with a cross line.
- FIGS. 16A through 16C are views showing an arrangement and operation of the laser positioning device.
- FIG. 16A shows the manner in which the laser pointer operates.
- FIG. 16B shows the manner in which the movement of the laser pointer is limited.
- FIG. 16C shows the manner in which the laser positioning device is installed.
- Laser positioning device 150 has laser pointer 151 that is movable about pivot point 152 in the vertical and horizontal directions, as shown in FIG. 16A .
- Laser pointer 151 has a tubular member that passes through plate 154 having an H-shaped aperture defined therein, as shown in FIG. 16B . Therefore, a laser beam is projected from laser pointer 151 in the directions along the H-shaped aperture.
- Pivot point 152 is combined with a displacement acquirer (not shown) for measuring displacements (angles) in the horizontal and vertical directions.
- Laser pointer 151 is manually moved.
- Laser positioning device 150 is installed near projection lens 21 of projector 10 , as shown in FIG. 16C .
- FIGS. 17A and 17B are views that illustrate a process for acquiring a cross line between a front wall serving as a projection surface and a ceiling that is joined to an upper edge of the front wall.
- FIG. 17A shows a vertical cross-sectional view of a room
- FIG. 17B shows the situation in which two points a 0 , a 1 are acquired in a hypothetical image in the position acquirer.
- Laser pointer 151 is moved vertically and horizontally, as shown in FIG. 17A , to point the laser beam at a cross line between walls.
- a button provided on laser positioning device 150 is pressed in order to identify the position, and laser positioning device 150 outputs the position of the laser beam to position acquirer 131 as positional information in a hypothetical image.
- an angle of inclination of projection optical axis 27 of projector 10 can be acquired with respect to projection surface 70 , by the process according to the first embodiment described above with reference to FIGS. 8A, 8B and 9A, 9B .
Abstract
Description
- 1. Field of the Invention
- The present invention relates to a projector having a device for measuring an angle of inclination between the projection optical axis of a projection device and a projection surface, and a method for correcting a trapezoidal distortion.
- 2. Description of the Related Art
- As projectors have become smaller in size and higher in performance due to rapid advances in solid-state pixel display device technology, projectors for projecting images are being widely used and have become more attractive for use as large-size display apparatuses to replace home display-type TV sets.
- However, since a projector, unlike a display-type TV set, projects an image onto a projection surface such as a screen or a wall, the displayed image tends to be distorted, depending on the relative relationship between the projection optical axis of the projector and the projection surface. The specification etc. of Japanese Patent Laid-open Publication No. 281597/97 discloses a process of adjusting the angle of a liquid crystal display unit. The process has the step of detecting the installation angle of a liquid crystal projector and the step of detecting the distance between the liquid crystal projector and the projection surface onto which an image is projected. According to the disclosed processes, the angle of the liquid crystal display unit needs to be mechanically adjusted.
- There has also been disclosed a projector apparatus that detects the distance from the projection lens to a screen, calculates the angle of inclination, then automatically corrects the distortion based on the detected distance. See the specification etc. of Japanese Patent Laid-open Publication No. 355740/92, No. 81593/2000, and No. 122617/2000 for details.
- In these prior art systems or processes in which the distance to the screen is detected, however, the distance measuring device is provided independently of the projector. The projected images are not utilized. Further, the measured distance and the actual distance to the screen do not always coincide, if the screen is surrounded by a frame projecting forwardly toward the projector, or the screen is set back away from the projector.
- The specification etc. of Japanese Patent Laid-open Publication No. 169211/2001 discloses a process of correcting a distortion by projecting a beam spot onto a curved screen from an angle-controllable laser pointer. According to the disclosed process, a spot image is generated and projected onto a screen by the projector, and the beam spot and the projected spot image are captured by a camera for measuring their positions. The spot image is moved until it overlaps with the beam spot, then the coordinate of the pixel of the spot image in the frame memory is replaced with the coordinate of the beam spot and stored in a coordinate transform parameter memory. Although this process is excellent for acquiring an angle accurately, it needs a means to control the different angles (positions) of the laser pointer, and also requires a complicated system configuration, since it is necessary to employ a laser pointer and a digital camera having a two-dimensional array of imaging elements.
- According to another disclosed projector that is sold on the market, the inclination of the projector in the vertical direction, which easily causes a distortion, is detected by a gravity sensor when a screen is set vertically. The distortion is corrected depending on the detected inclination. See the specification etc. of Japanese Patent Laid-open Publication No. 5278/2003 for details. However, the projector cannot correct the distortion properly if the screen is not set vertically or the screen is inclined in the horizontal plane with respect to the projection optical axis of the projector.
- Once the angles of inclination of the optical axis of the projector apparatus are detected with respect to the screen in the vertical and horizontal directions, an image can be projected without distortion from the projector onto the screen by means of conventional techniques, such as the conversion of the coordinates of a frame memory of the projector, etc.
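The frame-memory coordinate conversion mentioned here amounts to remapping each pixel coordinate through a projective (homography) transform. A minimal sketch (illustrative only; the 3x3 matrix form is an assumption of this illustration, as the cited publications do not specify one):

```python
def apply_homography(H, x, y):
    """Map pixel (x, y) through a 3x3 projective transform H, given as
    row-major nested lists. This is the kind of per-pixel coordinate
    conversion a frame memory applies to pre-distort the image so that
    it appears rectangular on an inclined projection surface."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

With the identity matrix the mapping leaves coordinates unchanged; the bottom row of H is what introduces the trapezoidal (perspective) component.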
- It is an object of the present invention to provide a projector having a trapezoidal distortion correcting device that is capable of measuring, with a simple mechanism, angles of inclination in horizontal and vertical directions of a projection surface with respect to the projection optical axis of a projector, for the purpose of correcting distortions of images projected onto the projection surface by the projector.
- According to an aspect of the present invention, a projector includes a projection device, a cross line positional information acquiring device for acquiring positional information of a cross line between a projection surface of the projection device and a plane crossing the projection surface, an inclination detecting device for calculating an angle of inclination between a projection optical axis of the projection device and the projection surface based on the positional information acquired by the cross line positional information acquiring device, and an image distortion correcting device for correcting a trapezoidal distortion of an input image supplied to the projection device, based on the angle of inclination calculated by the inclination detecting device.
- According to another aspect of the present invention, a projector includes a projection device, a cross line positional information acquiring device for acquiring positional information of a cross line between a projection surface of the projection device and a plane crossing the projection surface, a vertical angle-of-inclination acquiring device for detecting an angle of inclination of a projection optical axis of the projection device in a vertical plane, an inclination detecting device for calculating an angle of inclination between the projection optical axis and the projection surface in a horizontal plane based on the positional information acquired by the cross line positional information acquiring device, and an image distortion correcting device for correcting a trapezoidal distortion of an input image supplied to the projection device, based on the angle of inclination in the horizontal plane calculated by the inclination detecting device and based on the angle of inclination in the vertical plane calculated by the vertical angle-of-inclination acquiring device.
- According to yet another aspect of the present invention, a method of correcting a trapezoidal distortion of a projector includes the steps of acquiring positional information of a cross line between a projection surface of a projection device and a plane crossing the projection surface, calculating an angle of inclination between a projection optical axis of the projection device and the projection surface based on the acquired positional information, and correcting a trapezoidal distortion of an input image supplied to the projection device, based on the calculated angle of inclination.
- The present invention is advantageous in that a trapezoidal distortion of an image can be corrected with a simple mechanism, because horizontal and vertical angles of inclination of the projection surface with respect to the projection optical axis of the projector can be calculated, based on the positional information of a cross line between a basically vertical wall surface serving as the projection surface and a plane crossing the wall surface.
- If the projector further includes a vertical inclination sensor, then only the horizontal angle of inclination needs to be acquired based on the positional information of the cross line. It is sufficient, in this case, that only a cross line between a front wall and a ceiling appears in the imaging range of the image sensor, a condition which is highly likely to occur. Therefore, the present invention is applicable to a wider range of uses, allowing for easy correction of a trapezoidal distortion due to horizontal and vertical inclinations of the projector.
- The above and other objects, features, and advantages of the present invention will become apparent from the following description with reference to the accompanying drawings which illustrate examples of the present invention.
-
FIG. 1 is a block diagram of a projector having a trapezoidal distortion correcting device according to a first embodiment of the present invention; -
FIGS. 2A through 2C are front, side elevational, and plan views, respectively, of the projector having the trapezoidal distortion correcting device according to the first embodiment; -
FIGS. 3A and 3B are horizontal and vertical cross-sectional views, respectively, showing the projected image range and the image sensor imaging range when a projector is set such that the imaging range of the image sensor includes walls, a ceiling, and a floor adjacent to the front wall, as well as the front wall; -
FIG. 4 is a view showing an image captured by the imaging device of the image sensor when a projector is set as shown in FIGS. 3A and 3B ; -
FIGS. 5A and 5B are horizontal and vertical cross-sectional views, respectively, showing a projected image range and an image sensor imaging range, when a projector is set such that an image is projected along a projection optical axis of the projector that is inclined with respect to a front wall in the horizontal plane; -
FIG. 6 is a view showing an image captured by the imaging device of the image sensor when the projector is set as shown in FIGS. 5A and 5B ; -
FIG. 7 is a view showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 4 ; -
FIGS. 8A and 8B are views showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 6 ; -
FIGS. 9A and 9B are plan and side elevational views, respectively, showing the relationship between the actual cross line between the front wall and the ceiling shown in FIGS. 5A and 5B and the image of the cross line in the image that is captured by the imaging device of the image sensor; -
FIG. 10 is a view showing an image captured by the imaging device of the image sensor when the projector is set such that the main body of the projector is inclined only vertically with respect to the front wall; -
FIGS. 11A and 11B are views showing an image captured by the imaging device of the image sensor when the projector is set such that the projector is inclined horizontally and vertically with respect to the front wall; -
FIGS. 12A and 12B are views showing the relationship between the projector and the front wall, when the front wall faces the projector head-on and inclined with respect to the projector; -
FIG. 13 is a block diagram of a projector having a trapezoidal distortion correcting device according to a second embodiment of the present invention; -
FIG. 14 is a flowchart of a process for correcting a trapezoidal distortion according to the second embodiment; -
FIG. 15 is a block diagram of a projector having a trapezoidal distortion correcting device according to a third embodiment; -
FIGS. 16A through 16C are views showing an arrangement and operation of the laser positioning device; and -
FIGS. 17A and 17B are views that illustrate a process for acquiring a cross line between a front wall serving as a projection surface and a ceiling that is joined to an upper edge of the front wall. - A projector having a trapezoidal distortion correcting device according to a first embodiment of the present invention will be described below with reference to
FIGS. 1 and 2A through 2C. - As shown in
FIGS. 1 and 2A through 2C, projector 10 has projection device 20 having projection lens 21 and display unit 22, image controller 23 for controlling an image generated by display unit 22, trapezoidal distortion correcting device 30, and central processing unit 60 for controlling the entire operation of projector 10. Trapezoidal distortion correcting device 30 calculates an angle of inclination between a front wall serving as projection surface 70 and projector 10, and corrects a distortion of an image that is inputted to trapezoidal distortion correcting device 30. Image controller 23 controls an image of display unit 22, based on an output signal from trapezoidal distortion correcting device 30, thereby correcting the distortion on the image displayed on projection surface 70. The image distortion is automatically corrected according to a predetermined process by central processing unit 60. - According to a first embodiment of the present invention, trapezoidal
distortion correcting device 30 has image sensor 50, image capturer 31, inclination detector 32, image input unit 41, and image distortion correcting circuit 33. Image sensor 50 has imaging lens 51 and imaging device 53. Imaging lens 51 is disposed on a front surface of projector 10 and has an optical axis in a predetermined direction and a predetermined imaging range. Imaging device 53 is disposed perpendicularly to the optical axis of imaging lens 51. Imaging device 53 detects light that passes through imaging lens 51 and outputs desired positional information of an image represented by the detected light. Imaging device 53 has an imaging surface covering the imaging range of imaging lens 51. Imaging device 53 has a two-dimensional solid-state imaging device such as an image pickup tube or a CCD (Charge-Coupled Device) for outputting an image as a collection of pixels. Image capturer 31 captures an image from imaging device 53 as image information. Inclination detector 32 analyzes positional information of the captured image and calculates the angle of inclination between the front wall and projector 10. Image distortion correcting circuit 33 corrects a trapezoidal distortion of the image that is supplied to image input unit 41, based on the angle of inclination calculated by inclination detector 32. Image input unit 41 is supplied with video information that represents an image that is projected by projection device 20, and supplies an output signal to image controller 23. -
Projector 10 according to the first embodiment of the present invention utilizes the positional information of a horizontal cross line between the front wall surface serving as projection surface 70 and a ceiling or a floor which crosses the front wall surface, and/or a vertical cross line between the front wall surface serving as projection surface 70 and a side wall surface which crosses the front wall surface. Specifically, a horizontal and/or vertical angle of inclination between projection optical axis 27 of projection device 20 of projector 10 and projection surface 70 is calculated, based on the positional information on the cross line acquired by imaging device 53 of image sensor 50. - The positional information of a cross line may be acquired by various processes. There are two processes available for acquiring the positional information of a cross line in an image by imaging
device 53. According to the first process, the positional information of a cross line is acquired as a luminance change line in a captured image which represents the entire reflected light emitted from the reflecting surfaces, including projection surface 70, in front of projector 10, through imaging lens 51 to imaging device 53. In this process, the cross line needs to be included in the imaging range of image sensor 50. Since the imaging means is a digital camera, and the cross line is usually recognized as a collection of luminance change spots, the positional information can be obtained by analyzing the pixels if there is a certain amount of luminance change. Since the angle formed between the front wall surface serving as projection surface 70 and the optical axis of the imaging lens, and the angle formed between a side wall surface, a ceiling, or a floor which crosses the front wall surface serving as projection surface 70 and the optical axis of the imaging lens, differ greatly, the cross line in an image generated by imaging device 53 can usually be detected as a change in the luminance of the reflected light. Filtering or other appropriate processing of the image data may be used in order to acquire a clear boundary line. - The second process is applicable if the projection range of
projection device 20 can be expanded sufficiently to cover a ceiling, a floor, or side walls. According to the second process, two or more test patterns, each consisting of a vertical or a horizontal straight line, are projected from projection device 20 onto front surfaces including projection surface 70, and bent points of the test patterns appearing on the captured image of reflected light are acquired. A cross line between a flat surface serving as the projection surface and a surface crossing the flat surface is calculated as a straight line joining the bent points. - For example, when a plurality of vertical test patterns are projected onto the wall surface serving as
projection surface 70 and the ceiling crossing the wall surface, since the wall surfaces are inclined with respect to projection optical axis 27, images of bent test patterns are generated on the ceiling and the wall surface. Reflected light of the image of bent test patterns passes through imaging lens 51 of projector 10, and is applied to the imaging surface of imaging device 53 as test pattern images. Each of the test pattern images consists of two straight lines which are joined with each other at a cross point positioned at the cross line between the ceiling and the wall surface. Each of the two crossing straight lines is determined by a straight line equation using the coordinates of two detection points on that line. The bent point is determined as the cross point between the determined straight lines, and the cross line can be determined from the coordinates of the two obtained cross points according to a straight line equation. This process is applicable to a projector which has a solid-state image display device. - A process of calculating an angle of inclination between the projection optical axis and the projection surface in the trapezoidal distortion correcting device of the projector according to the first embodiment of the present invention will be described below. It is assumed that projection
optical axis 27 of projection lens 21 and the optical axis of imaging lens 51 lie parallel to each other. If projection optical axis 27 of projection lens 21 and the optical axis of imaging lens 51 do not lie parallel to each other, then an angle of inclination between projection optical axis 27 of projection lens 21 and projection surface 70 can be calculated, based on the known relationship between projection optical axis 27 of projection lens 21 and the optical axis of imaging lens 51. -
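The bent test-pattern construction described earlier reduces to elementary line intersection: fit a straight line through two sample points on each segment of a bent pattern, intersect the two lines to obtain the bent point, and join the bent points of two patterns to obtain the cross line. A Python sketch of this construction (the function names and the two-point line fit are assumptions of the illustration, not part of the patent):

```python
def line_through(p, q):
    """Coefficients (a, b, c) of the line a*x + b*y = c through p and q."""
    a, b = q[1] - p[1], p[0] - q[0]
    return a, b, a * p[0] + b * p[1]

def intersect(l1, l2):
    """Intersection point of two lines given in (a, b, c) form."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    d = a1 * b2 - a2 * b1          # zero would mean parallel lines
    return ((c1 * b2 - c2 * b1) / d, (a1 * c2 - a2 * c1) / d)

def bent_point(wall_pts, ceiling_pts):
    """Bent point of one test pattern: intersection of the line through
    two samples on the wall segment with the line through two samples
    on the ceiling segment. The cross line is then the line through the
    bent points of two such patterns."""
    return intersect(line_through(*wall_pts), line_through(*ceiling_pts))
```

Applying `bent_point` to two different test patterns and passing the two results to `line_through` yields the straight-line equation of the cross line.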
FIGS. 3A and 3B are horizontal and vertical cross-sectional views, respectively, showing the projected image range and the image sensor imaging range when a projector is set such that the imaging range of the image sensor includes walls, a ceiling, and a floor adjacent to the front wall, as well as the front wall. FIG. 4 is a view showing an image captured by the imaging device of the image sensor when a projector is set as shown in FIGS. 3A and 3B. - Projected
image range 71 is a range in which an image projected from projection lens 21 is displayed, and image sensor imaging range 72 is a range in which an image is captured by image sensor 50. Projector 10 is usually set so as to project an image onto front wall 61 such that the image is displayed in the substantially central area of front wall 61 in the horizontal direction, and is designed to project an image with a slightly upward angle in the vertical direction. Therefore, the projected image is slightly directed upwardly with regard to the horizontal line of projector 10 in the front-to-back direction. The imaging range of image sensor 50 is wider than the projected image range of projector 10. If projector 10 is set as shown in FIGS. 3A, 3B, and 4, then front wall 61, right side wall 62, left side wall 63, ceiling 64, and floor 65 are included in the imaging range of image sensor 50. - Projectors are often set in an oblique direction in the horizontal plane, with the projection optical axis inclined to a front wall. This is due to the function which corrects a trapezoidal distortion of a projected image not only in the vertical direction, but also in the horizontal direction.
FIGS. 5A and 5B are horizontal and vertical cross-sectional views, respectively, showing a projected image range and an image sensor imaging range, when a projector is set such that an image is projected along a projection optical axis of the projector that is inclined with respect to a front wall in the horizontal plane. FIG. 6 is a view showing an image captured by the imaging device of the image sensor when the projector is set as shown in FIGS. 5A and 5B.
- If an image is projected from
projector 10 which is inclined only in the horizontal direction as shown inFIGS. 5A and 5B , then imagingdevice 53 ofimage sensor 50 captures an image as shown inFIG. 6 . InFIG. 6 , cross lines betweenfront wall 61 andceiling 64 and betweenfront wall 61 andfloor 65 are captured as inclined cross lines, unlike those in the captured image shown inFIG. 4 . -
Inclination detector 32 of trapezoidal distortion correcting device 30 detects these cross lines from the image that is generated by imaging device 53 of image sensor 50 and captured by image capturer 31, according to the process described above, generates parameters for correcting an image distortion, and outputs the generated parameters to image distortion correcting circuit 33. - Operation of
inclination detector 32 will be described below. FIG. 7 is a view showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 4. FIGS. 8A and 8B are views showing a cross line between the front wall and the ceiling that is detected from the captured image shown in FIG. 6. FIG. 8A shows the captured image, and FIG. 8B shows horizontal and vertical reference lines to determine the positional information of the cross line in the captured image. The horizontal and vertical reference lines are provided as hypothetical lines and defined with respect to the origin that is established at the center of the captured image. The cross line between the front wall and the ceiling that is detected is shown as a bold line.
FIGS. 7, 8A, and 8B, if projector 10 projects an image onto front wall 61 with its main body inclined in a horizontal plane, then image sensor 50, disposed near projection lens 21, is also inclined in a horizontal plane with respect to front wall 61, and, as a result, the cross line is recognized as an inclined cross line in the image. - If the main body of
projector 10 is inclined only in a horizontal plane with respect to front wall 61, then the angle of inclination between front wall 61 and the main body of projector 10 can be determined by calculating the angle of inclination, using the positional information recognized by image sensor 50, of cross line 66b between image 61b of the front wall and image 64b of the ceiling in the captured image in FIGS. 8A and 8B. - A process of identifying the angle of inclination between
front wall 61 and the main body of projector 10 will be described below. It is assumed that front wall 61 extends vertically, ceiling 64 extends horizontally, front wall 61 and ceiling 64 cross each other perpendicularly, and the cross line generated by the connecting edges of front wall 61 and ceiling 64 extends perpendicularly to the vertical direction. If projector 10 is inclined only in a horizontal plane, then cross line 66b is detected from the image that is captured by image sensor 50 according to the process described above, as shown in FIG. 8A. - As shown in
FIG. 8B, left vertical reference line V1 and right vertical reference line V2 are provided at given intervals on the left and right sides of center C of the captured image, respectively. Assume that cross line 66b crosses left vertical reference line V1 and right vertical reference line V2 at cross points a0, a1, respectively, and that central vertical reference line V0, which passes through center C, crosses cross line 66b at a cross point through which first horizontal reference line H1 passes. The positional information of cross points a0, a1 can be represented by coordinates x, y in a two-dimensional coordinate system that has center C as its origin. In FIG. 8B, the section of cross line 66b between cross points a0, a1 is shown as a bold line. - If projection
optical axis 27 of projector 10 is inclined only in a horizontal plane with respect to front wall 61, then cross line 67b between image 61b of the front wall and image 62b of the right side wall is captured as a vertical line in the image by imaging device 53 of image sensor 50. The line segment between cross points a0, a1 of cross line 66b, however, does not extend horizontally in the image that is captured by image sensor 50, even though the actual cross line between front wall 61 and ceiling 64 extends horizontally. -
FIGS. 9A and 9B are plan and side elevational views, respectively, showing the relationship between the actual cross line between the front wall and the ceiling shown in FIGS. 5A and 5B, and the image of the cross line in the image that is captured by the imaging device of the image sensor. Broken line V in FIGS. 9A and 9B represents hypothetical plane V for explaining the imaging surface of imaging device 53 of image sensor 50. Hypothetical plane V extends perpendicularly to optical axis 55, which passes through center C of the imaging surface of imaging device 53 and center 52 of imaging lens 51. Hypothetical plane V is displayed at a reduced scale on the imaging surface of imaging device 53, which extends parallel to hypothetical plane V. - If hypothetical plane V passes cross point a0 in
FIGS. 9A and 9B, then the two cross points a0, a1 of cross line 66b with left vertical reference line V1 and right vertical reference line V2 are captured as a0 and a1′ on hypothetical plane V. Now assume that:
- (1) the distance between cross point a1 and hypothetical plane V is represented by L,
- (2) the distance between hypothetical plane V and imaging lens center 52, parallel to optical axis 55, is represented by L0,
- (3) the distance between cross points a1, a1′ in a direction perpendicular to optical axis 55 in a horizontal plane is represented by WC,
- (4) the horizontal distance between cross points a0, a1′ on hypothetical plane V is represented by WT,
- (5) the angle formed between optical axis 55 and a line interconnecting imaging lens center 52 and cross point a1 in the horizontal plane is represented by β,
- (6) the angle formed between optical axis 55 and a limit line of image sensor imaging range 72 in the horizontal plane is represented by β0,
- (7) the angle formed between cross line 66b and hypothetical plane V is represented by φ,
- (8) the vertical distance between the horizontal plane including optical axis 55 and cross point a0 is represented by Ha0,
- (9) the vertical distance between the horizontal plane including optical axis 55 and cross point a1′ is represented by Ha1,
- (10) the vertical distance between the horizontal plane including optical axis 55 and a cross point between the limit line of image sensor imaging range 72 and hypothetical plane V is represented by H0,
- (11) the angle formed between optical axis 55 and a line interconnecting imaging lens center 52 and cross point a1 in a vertical plane is represented by θ, and
- (12) the angle formed between optical axis 55 and the limit line of image sensor imaging range 72 is represented by θ0.
- Actual cross points a0, a1 are arranged horizontally. However, since
image sensor 50 has three-dimensional radial-shaped imaging range 72, when line segment a0-a1 of cross line 66b is observed in the lateral direction as shown in FIG. 9B, cross point a1 is captured as if it were rotated by angle θ within the entire angle (θ0 × 2) of the imaging range of image sensor 50. In other words, because the two-dimensional image is captured by image sensor 50 as a collection of points projected onto hypothetical plane V in FIG. 9B, cross point a1 is captured as if it were projected onto point a1′ on hypothetical plane V. - Distance L between cross point a1 and hypothetical plane V is determined by the following equations:
tan θ0 = H0 / L0, hence L0 = H0 / tan θ0
tan θ = Ha1 / L0, hence L0 = Ha1 / tan θ
H0 / tan θ0 = Ha1 / tan θ, hence tan θ = tan θ0 × Ha1 / H0
- Because tan θ0 is known, and Ha1 and H0 can be determined from the image that is generated by imaging
device 53, tan θ, and hence θ, can be calculated. Distance L can then be determined by the following equation for the triangle formed by a0, a1, a1′:
L = (Ha0 − Ha1) / tan θ
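The derivation above can be collected into a short numerical sketch (illustrative only, not part of the patent; the function name and sample values are hypothetical):

```python
import math

def depth_and_vertical_angle(theta0_deg, H0, Ha0, Ha1):
    """Recover angle theta and distance L from quantities read off the
    captured image: tan(theta) = tan(theta0) * Ha1 / H0, and
    L = (Ha0 - Ha1) / tan(theta)."""
    tan_theta = math.tan(math.radians(theta0_deg)) * Ha1 / H0
    theta_deg = math.degrees(math.atan(tan_theta))
    L = (Ha0 - Ha1) / tan_theta
    return theta_deg, L
```

For example, with θ0 = 30°, H0 = 100, Ha0 = 60, and Ha1 = 50 (arbitrary units), the sketch yields θ ≈ 16.1° and L ≈ 34.6.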
- Referring to FIG. 9A, line segment a0-a1 of cross line 66b is inclined to hypothetical plane V. As described above with reference to FIG. 9B, since the image captured by image sensor 50 has three-dimensional radial-shaped imaging range 72, actual point a1 is captured as point a1′ on hypothetical plane V, as shown in FIG. 9A. The angle of inclination φ of front wall 61 with respect to a plane perpendicular to optical axis 55 of image sensor 50, which is the quantity finally to be determined, can be found by the following equations. First, tan β can be determined by the following equations, since angle β0 of image sensor imaging range 72 is known, and WT and W0 are determined from the image that is captured by imaging device 53 and shown in FIGS. 8A and 8B:
tan β0 = W0 / (2 L0), hence L0 = W0 / (2 tan β0)
tan β = WT / (2 L0), hence L0 = WT / (2 tan β)
W0 / (2 tan β0) = WT / (2 tan β), hence tan β = tan β0 × WT / W0
Next, angle φ of inclination can be determined by the following equations, because WC and WT are determined from the image that is captured by imaging device 53:
L = WC / tan β
φ = tan⁻¹{L / (WC + WT)}
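A corresponding sketch for φ, following the equations above (hypothetical function name; WC is recovered from the relation L = WC / tan β, and L is the distance obtained earlier):

```python
import math

def horizontal_inclination_deg(beta0_deg, W0, WT, L):
    """Angle phi between cross line 66b and hypothetical plane V:
    tan(beta) = tan(beta0) * WT / W0,
    WC = L * tan(beta)   (from L = WC / tan(beta)),
    phi = atan(L / (WC + WT))."""
    tan_beta = math.tan(math.radians(beta0_deg)) * WT / W0
    WC = L * tan_beta
    return math.degrees(math.atan(L / (WC + WT)))
```

Continuing the earlier sample values (β0 = 30°, W0 = 100, WT = 80, L ≈ 34.64), the sketch yields φ of roughly 20°.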
- Because the above calculations involve trigonometric functions, which impose a very heavy load on central processing unit 60 incorporated in the projector, the horizontal angle of inclination may be determined by referring to a table, prepared in advance, that represents the relationship between the horizontal angle of inclination between optical axis 27 and projection surface 70 and the variables obtained from the positional information. Alternatively, the above calculations may be processed by a high-performance CPU. Since tan θ/tan θ0 is approximately equal to θ/θ0, distance L may also be determined by the following approximation without any significant error:
θ = θ0 × Ha1 / H0
- According to another process for reducing the amount of calculations to be performed by
central processing unit 60, the above trigonometric values may be stored as a data table in a memory, and the stored data may be read from the memory. Alternatively, a table may be stored in and read from a memory that represents the relationship between points a0, a1, expressed by (x, y) coordinates such as (xa0, ya0), (xa1, ya1) on the image sensor, and angle data. The relationship between points a0, a1 and any other data that are required for the final image conversion may also be stored in the memory. This process greatly reduces the amount of calculations to be performed by central processing unit 60. - The lens of an image sensor usually has a distortion. It is often necessary, in the present embodiment as well, to take such a lens distortion into account in order to calculate the positional information accurately. In this case, the positional information of a point obtained by the image sensor may be represented as positional information (x0, y0), and the positional information (x0, y0) may be converted into positional information (x0′, y0′) prior to the above calculation, by referring to a distortion correcting table. In this way, the calculations take the distortions of the image sensor lens into account, and it is not necessary to correct the lens distortions in a separate process.
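The table-lookup idea can be sketched as follows (the table resolution and the fixed value of θ0 are hypothetical, purely for illustration):

```python
import math

# Hypothetical precomputed table: for a fixed, known half-angle THETA0 of the
# imaging range, map the quantized image ratio Ha1/H0 directly to theta in
# degrees, so no trigonometric function is evaluated at run time.
THETA0_DEG = 30.0
STEP = 0.01  # table resolution for the ratio Ha1/H0 in [0, 1]
THETA_TABLE = [
    math.degrees(math.atan(math.tan(math.radians(THETA0_DEG)) * i * STEP))
    for i in range(int(1 / STEP) + 1)
]

def theta_from_table(ha1, h0):
    """Run-time lookup: quantize the ratio Ha1/H0 and index the table."""
    idx = min(round((ha1 / h0) / STEP), len(THETA_TABLE) - 1)
    return THETA_TABLE[idx]
```

At run time only a division, a rounding, and an array access remain, which matches the motivation of offloading trigonometry from a modest embedded CPU.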
- Image
distortion correcting circuit 33 generates corrective parameters for correcting the trapezoidal distortion of the image, using the angle of inclination identified by the above process, according to a known process. These parameters are applied to the input image, and its trapezoidal distortion can be corrected automatically. - If the main body of
projector 10 is inclined only vertically with respect to front wall 61, then an image is captured by image sensor 50 as shown in FIG. 10. FIG. 10 is a view showing an image captured by the imaging device of the image sensor when the projector is set such that the main body of the projector is inclined only vertically with respect to the front wall. In this case, as in the case in which the projector is inclined only in the horizontal plane with respect to the front wall, the angle of inclination of cross line 67c between image 61c of the front wall and image 62c of the right side wall can be calculated in the same manner as described above. The angle of inclination of the main body of projector 10 with respect to front wall 61 in the vertical direction can then be calculated, which allows projector 10 to automatically correct a trapezoidal distortion of the projected image. - A process for calculating an angle of inclination when the main body of
projector 10 is inclined both horizontally and vertically with respect to front wall 61 will be described below. In this case, the image shown in FIGS. 11A and 11B is captured by the image sensor. FIGS. 11A and 11B are views showing an image captured by the imaging device of the image sensor when the projector is inclined horizontally and vertically with respect to the front wall. FIG. 11A shows the captured image, and FIG. 11B shows the highlighted cross lines. Cross line 66d and cross line 67d join at cross point e0. Cross line 66d has left end point e1 in the image. Cross line 67d and the limit line of the imaging range cross each other at cross point e2. Central vertical reference line V0 and central horizontal reference line H0 pass through image center C, which is the origin. Right vertical reference line V3 and second horizontal reference line H2 pass through cross point e0. In this process, cross line 66d between image 61d of the front wall and image 64d of the ceiling, and cross line 67d between image 61d of the front wall and image 62d of the right side wall, are extracted first. Then the x, y, and z coordinates of cross point e0, end point e1, and cross point e2 are calculated. -
-
- FIGS. 12A and 12B are views showing the relationship between the projector and the front wall, when the front wall faces the projector head-on and when it is inclined with respect to the projector. In these figures, the projector is assumed to be fixed, and the front wall is assumed to rotate with respect to the fixed projector. FIG. 12B is a plan view showing line segment S, representing a vertical wall, that rotates in the imaging range of the image sensor. Line segment S rotates about pivot point Cr0, i.e., moves in the back-and-forth direction when viewed from image sensor 50. As line segment S rotates about pivot point Cr0, reference point d0 positioned at the end of line segment S moves toward point d1 or point d2. - The movement of reference point d0 can be detected within the range represented by angle ψ in the image that is captured by the image sensor. The angle ψ is defined by point d2, where reference point d0 overlaps with pivot point Cr0 when viewed from image sensor 50 (this situation normally does not occur, since it means that the wall has rotated to an angular position where the wall is recognized as overlapping image sensor 50), and by point d1, where a hypothetical line drawn from
image sensor 50 forms a right angle with line segment S (the right side of the wall is recognized as being turned toward image sensor 50, when viewed from image sensor 50). In any case, the reference point, which corresponds to the image of a cross line, moves with the rotation of hypothetical plane V. If the angle of inclination of projector 10 is 0 degrees with respect to the front wall, then the cross lines are horizontal or vertical as shown in FIGS. 4A and 4B. Thus, the angle of inclination of projector 10 with respect to the front wall can be determined by rotating hypothetical plane V about the x-axis and the y-axis to transform the coordinates of cross points e0, e1, e2, and by finding a rotational angle at which line segment e0-e1 and line segment e0-e2 lie horizontally and vertically, respectively.
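The rotation search described above can be sketched as a brute-force grid search (illustrative only; the step size and search span are hypothetical, and a real implementation would use the closed-form relations or the lookup table discussed earlier):

```python
import math

def rotate(p, ax, ay):
    """Tilt a point about the x-axis by ax, then about the y-axis by ay (radians)."""
    x, y, z = p
    y, z = y * math.cos(ax) - z * math.sin(ax), y * math.sin(ax) + z * math.cos(ax)
    x, z = x * math.cos(ay) + z * math.sin(ay), -x * math.sin(ay) + z * math.cos(ay)
    return x, y, z

def unrotate(p, ax, ay):
    """Inverse of rotate(): undo the y-axis tilt first, then the x-axis tilt."""
    x, y, z = p
    x, z = x * math.cos(ay) - z * math.sin(ay), x * math.sin(ay) + z * math.cos(ay)
    y, z = y * math.cos(ax) + z * math.sin(ax), -y * math.sin(ax) + z * math.cos(ax)
    return x, y, z

def find_inclination(e0, e1, e2, step_deg=0.5, span_deg=45.0):
    """Grid-search the tilt (ax, ay), in degrees, at which the unrotated
    segment e0-e1 lies horizontally and e0-e2 lies vertically."""
    best_err, best = float("inf"), (0.0, 0.0)
    n = int(span_deg / step_deg)
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            ax, ay = math.radians(i * step_deg), math.radians(j * step_deg)
            p0, p1, p2 = (unrotate(p, ax, ay) for p in (e0, e1, e2))
            err = abs(p1[1] - p0[1]) + abs(p2[0] - p0[0])
            if err < best_err:
                best_err, best = err, (i * step_deg, j * step_deg)
    return best
```

At the returned angles the transformed cross lines are (to within the grid resolution) horizontal and vertical, which is exactly the stopping condition stated in the text.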
- Alternatively, the inclination of the front wall can be calculated by identifying e0 (x, y, z), φh, φv, and φc in the image shown in FIG. 11B that is captured by image sensor 50. Since the rotational angle of the image sensor can be identified together with these parameters, the angle of inclination of the main body of projector 10 with respect to the vertical direction can also be detected. Inclination detector 32 of trapezoidal distortion correcting device 30 calculates the parameters of the projection surface, such as its position and shape, based on the detected angle of inclination of the main body of projector 10 with respect to the wall surface. The calculated parameters are applied to image distortion correcting circuit 33, and projection device 20 automatically projects an appropriately shaped image, i.e., a distortion-free image, onto the wall surface. - Usually, projectors should preferably be installed such that a projected image is displayed not on a ceiling or side walls, but on a front wall only. Since the positional information of the cross line is obtained from the detected angle of inclination, it is possible to shrink the projected image so that it is displayed on the front wall only.
- In the above embodiment, the angle of inclination between the projector and the front wall is acquired by using the cross line between the front wall and a surface that is connected to the front wall. However, this angle may also be acquired by using a contact line between the front wall and, for example, a horizontal desk in contact with the front wall.
- A process for calculating an angle of inclination between the projection optical axis and the projection surface in a trapezoidal distortion correcting device of a projector according to a second embodiment of the present invention will be described below.
FIG. 13 is a block diagram of a projector having a trapezoidal distortion correcting device according to the second embodiment of the present invention. Trapezoidal distortion correcting device 30 has an inclination sensor (G sensor) that uses a conventional acceleration detector of the type used, for example, for centering a machine when it is installed. The inclination sensor is vertical inclination sensor 54, which accurately measures an angle of inclination with respect to the gravitational direction and outputs the measured angle of inclination as numerical data. Other details of the projector according to the second embodiment are identical to the projector according to the first embodiment; those parts identical to the first embodiment are denoted by identical reference numerals and will not be described in detail below. - According to the first embodiment, a trapezoidal distortion of an image is corrected based only on the information obtained from an image that is captured by
image sensor 50. In order to acquire the angles of inclination in the horizontal and vertical directions of projector 10 with respect to front wall 61, it is necessary that a horizontal cross line, such as cross line 66 between front wall 61 and ceiling 64, and a vertical cross line, such as cross line 67 between front wall 61 and right side wall 62, be included in the image sensor imaging range. Since images are usually projected onto an upper area of a wall, cross line 66 between front wall 61 and ceiling 64 is usually included in image sensor imaging range 72. However, because images are generally projected toward the central upper area of a wall, cross lines 67 between front wall 61 and the side walls are often not included in image sensor imaging range 72. - If an image distortion is corrected based only on cross line 66 between
front wall 61 and ceiling 64, then the main body of projector 10 needs to be placed horizontally, because the image distortion is not corrected with respect to a vertical inclination. Consequently, if there is a vertical inclination, then the image distortion is not corrected properly. - According to the second embodiment, an angle of inclination in the vertical direction is detected by
vertical inclination sensor 54 and is input to inclination detector 32. Image capturer 31 captures the image from imaging device 53 and supplies inclination detector 32 with the positional information of cross line 66 between front wall 61 and ceiling 64. Inclination detector 32 calculates an angle of inclination in the horizontal direction based on the positional information. Inclination detector 32 outputs the calculated angle of inclination in the horizontal direction, together with the angle of inclination in the vertical direction that is detected by vertical inclination sensor 54, to image distortion correcting circuit 33. Image distortion correcting circuit 33 generates LSI control parameters, corrects trapezoidal distortions in the vertical and horizontal directions of the image that is input from image input unit 41, and outputs the corrected image data to image controller 23. - In the illustrated embodiment,
vertical inclination sensor 54 is an acceleration sensor or a gravitational sensor utilizing gravity. However, vertical inclination sensor 54 may instead be a device for detecting the tilt angle of a tilting mechanism of the main body of projector 10. - A process for correcting a trapezoidal distortion according to the second embodiment will be described below with reference to
FIG. 14. This process includes identifying an angle of inclination in the horizontal direction based on the positional information of a cross line between a wall surface and a ceiling in an image sensor imaging range, acquiring an angle of inclination in the vertical direction from a vertical inclination sensor, and correcting an output image on a display unit in the projector. First, image capturer 31 acquires image information from imaging device 53 of image sensor 50 in step S1. Then, inclination detector 32 acquires cross line 66b between image 61b of the front wall and image 64b of the ceiling from the image information in step S2, and acquires cross points a0, a1 of left and right reference lines V1, V2 and cross line 66b in the image in step S3. Inclination detector 32 then assigns coordinates to cross points a0, a1 in step S4. Thereafter, inclination detector 32 calculates the distance between the two cross points in a direction parallel to the optical axis, based on the distance between the two cross points in the vertical direction, the distance in the vertical direction between optical axis 55 and a limit line of the image sensor imaging range, and vertical angle θ0 of the image sensor imaging range, in step S5. Next, inclination detector 32 calculates an angle of inclination in the horizontal direction of the projector, based on the distance between the two cross points in the direction parallel to the optical axis, the distance between the two cross points in the horizontal direction, and horizontal angle β0 of the image sensor imaging range, in step S6, acquires an angle of inclination in the vertical direction of the projector from vertical inclination sensor 54 in step S7, and outputs the acquired and calculated angle data to image distortion correcting circuit 33.
Image distortion correcting circuit 33 generates LSI control parameters in step S8 and controls the projector image processing LSI circuit in step S9. Input image 24 is then corrected, and display unit 22 generates output image 25. When output image 25 is projected onto projection surface 70, an image similar to input image 24 is displayed on projection surface 70.
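Steps S4 through S7 can be sketched as follows (hypothetical function name and coordinate conventions; it simply strings together the relations derived in the first embodiment and pairs the result with the sensor reading):

```python
import math

def correction_angles(a0, a1, theta0_deg, beta0_deg, H0, W0, vertical_tilt_deg):
    """Steps S4-S7 in code: a0, a1 are image coordinates of the cross points
    (origin at image center, both assumed above the optical-axis plane, so
    their y-coordinates play the roles of Ha0 and Ha1).  Returns the
    horizontal inclination computed from the cross line and the vertical
    inclination reported by the tilt sensor."""
    Ha0, Ha1 = a0[1], a1[1]
    # S5: distance L between the cross points parallel to the optical axis
    tan_theta = math.tan(math.radians(theta0_deg)) * Ha1 / H0
    L = (Ha0 - Ha1) / tan_theta
    # S6: horizontal inclination from the horizontal extent WT of the segment
    WT = abs(a1[0] - a0[0])
    tan_beta = math.tan(math.radians(beta0_deg)) * WT / W0
    WC = L * tan_beta
    phi_h = math.degrees(math.atan(L / (WC + WT)))
    # S7: the vertical inclination comes directly from the sensor
    return phi_h, vertical_tilt_deg
```

Note how the vertical component never touches the image data: this is precisely the division of labor that lets the second embodiment work when only the ceiling cross line is visible.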
- A process for calculating an angle of inclination between the projection optical axis and the projection surface in a trapezoidal distortion correcting device of a projector according to a third embodiment of the present invention will be described below. FIG. 15 is a block diagram of the projector having the trapezoidal distortion correcting device according to the third embodiment of the present invention. - In the first and second embodiments,
image sensor 50 and image capturer 31 acquire positional information of a cross line between a plane that serves as the projection surface and a plane that crosses it. According to the third embodiment, laser positioning device 150 and position acquirer 131 are used to acquire the positional information of the cross line. Laser positioning device 150 points laser beams at desired positions by means of laser pointer 151, which is capable of projecting a laser beam within a predetermined range including an image that is projected by the projector. When two or more points are selected at desired positions on a cross line between a plane that serves as the projection surface and another plane, position acquirer 131 acquires positional information of the cross line in a hypothetical image and outputs the acquired positional information to inclination detector 32. In the same manner as in the first and second embodiments, inclination detector 32 processes the supplied positional information to calculate an angle of inclination between projection optical axis 27 of projection device 20 and projection surface 70. - The projector according to the third embodiment is identical in arrangement and operation to the projector according to the first embodiment, except that
laser positioning device 150 and position acquirer 131 are provided instead of image sensor 50 and image capturer 31. Those parts identical to the first embodiment are denoted by identical reference numerals and will not be described in detail below. - Various processes are available for acquiring positional information of a cross line by means of
laser positioning device 150. A typical process will be described below as an example. In this process, a laser positioning device is used that is capable of controlling the direction of the laser beam, and determining the position of the laser beam in a hypothetical image by a signal that is input when the laser beam overlaps with a cross line. -
FIGS. 16A through 16C are views showing an arrangement and operation of the laser positioning device. FIG. 16A shows the manner in which the laser pointer operates. FIG. 16B shows the manner in which the movement of the laser pointer is limited. FIG. 16C shows the manner in which the laser positioning device is installed. Laser positioning device 150 has laser pointer 151, which is movable about pivot point 152 in the vertical and horizontal directions, as shown in FIG. 16A. Laser pointer 151 has a tubular member that passes through plate 154, which has an H-shaped aperture defined therein, as shown in FIG. 16B. Therefore, a laser beam is projected from laser pointer 151 in the directions along the H-shaped aperture. Pivot point 152 is combined with a displacement acquirer (not shown) for measuring displacements (angles) in the horizontal and vertical directions. Laser pointer 151 is moved manually. Laser positioning device 150 is installed near projection lens 21 of projector 10, as shown in FIG. 16C. -
FIGS. 17A and 17B are views that illustrate a process for acquiring a cross line between a front wall serving as a projection surface and a ceiling that is joined to an upper edge of the front wall. FIG. 17A shows a vertical cross-sectional view of a room, and FIG. 17B shows the situation in which two points a0, a1 are acquired in a hypothetical image in the position acquirer. Laser pointer 151 is moved vertically and horizontally, as shown in FIG. 17A, to point the laser beam at a cross line between walls. When the laser beam points to the cross line, a button provided on laser positioning device 150 is pressed in order to identify the position, and laser positioning device 150 outputs the position of the laser beam as positional information into a hypothetical image in position acquirer 131. When two points a0, a1 are acquired in this way in the hypothetical image in position acquirer 131, as shown in FIG. 17B, an angle of inclination of projection optical axis 27 of projector 10 with respect to projection surface 70 can be acquired by the process according to the first embodiment, described above with reference to FIGS. 8A, 8B and 9A, 9B. - While preferred embodiments of the present invention have been described using specific terms, such description is for illustrative purposes only, and it is to be understood that changes and variations may be made without departing from the spirit or scope of the following claims.
Claims (18)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004161494A JP3960390B2 (en) | 2004-05-31 | 2004-05-31 | Projector with trapezoidal distortion correction device |
JP2004-161494 | 2004-05-31 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20050270496A1 true US20050270496A1 (en) | 2005-12-08 |
US7452084B2 US7452084B2 (en) | 2008-11-18 |
Family
ID=34937090
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/137,564 Active 2026-05-17 US7452084B2 (en) | 2004-05-31 | 2005-05-26 | Projector with a device for measuring angle of inclination |
Country Status (4)
Country | Link |
---|---|
US (1) | US7452084B2 (en) |
EP (1) | EP1602894A1 (en) |
JP (1) | JP3960390B2 (en) |
CN (1) | CN100501559C (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070058136A1 (en) * | 2005-09-12 | 2007-03-15 | Casio Computer Co., Ltd. | Projecting apparatus and method and recording medium recording the projecting method |
US20070182936A1 (en) * | 2006-02-08 | 2007-08-09 | Canon Kabushiki Kaisha | Projection display apparatus |
US20090015730A1 (en) * | 2006-02-10 | 2009-01-15 | Kazuya Arakawa | Image projecting method and projector |
US20100014778A1 (en) * | 2008-07-18 | 2010-01-21 | Seiko Epson Corporation | Image correcting apparatus, image correcting method, projector and projection system |
US20100315601A1 (en) * | 2009-06-11 | 2010-12-16 | Seiko Epson Corporation | Projector and trapezoidal distortion correcting method |
US20110107222A1 (en) * | 2009-10-29 | 2011-05-05 | Yoshiharu Uchida | Presentation system and display device for use in the presentation system |
EP2570891A1 (en) * | 2011-09-15 | 2013-03-20 | Funai Electric Co., Ltd. | Projector |
US20140247358A1 (en) * | 2011-11-24 | 2014-09-04 | Aisin Seiki Kabushiki Kaisha | Image generation device for monitoring surroundings of vehicle |
US8911096B2 (en) | 2008-12-10 | 2014-12-16 | Nikon Corporation | Projection apparatus for projecting and processing an image |
US20150049117A1 (en) * | 2012-02-16 | 2015-02-19 | Seiko Epson Corporation | Projector and method of controlling projector |
DE102006030194B4 (en) * | 2006-06-30 | 2015-07-02 | Airbus Operations Gmbh | Integrated projector in a seat for visual display of information |
US20150323669A1 (en) * | 2014-05-08 | 2015-11-12 | Hitachi-Lg Data Storage Korea, Inc. | Apparatus for detecting distances in two directions |
US9749605B2 (en) * | 2015-06-30 | 2017-08-29 | Coretronic Corporation | Projection apparatus |
WO2018110774A1 (en) * | 2016-12-16 | 2018-06-21 | Cj Cgv Co., Ltd. | Method of automatically correcting projection area based on image photographed by photographing device and system therefor |
DE102010023108B4 (en) | 2009-06-04 | 2019-12-05 | Sypro Optics Gmbh | Projector with automatic focusing and imaging process |
CN111948802A (en) * | 2019-05-16 | 2020-11-17 | 精工爱普生株式会社 | Optical device, control method of optical device, and image display apparatus |
CN113547512A (en) * | 2021-08-04 | 2021-10-26 | 长春电子科技学院 | Intelligent detection manipulator for machining clamp body |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5010202B2 (en) * | 2006-07-31 | 2012-08-29 | Necディスプレイソリューションズ株式会社 | Projector and projection image adjustment method |
JP5439733B2 (en) * | 2008-03-31 | 2014-03-12 | リコーイメージング株式会社 | Imaging device |
JP5266953B2 (en) * | 2008-08-19 | 2013-08-21 | セイコーエプソン株式会社 | Projection display apparatus and display method |
JP5446753B2 (en) * | 2008-12-10 | 2014-03-19 | 株式会社ニコン | Projection device |
WO2010067688A1 (en) * | 2008-12-10 | 2010-06-17 | 株式会社ニコン | Projection device |
JP5736535B2 (en) * | 2009-07-31 | 2015-06-17 | パナソニックIpマネジメント株式会社 | Projection-type image display device and image adjustment method |
KR101087870B1 (en) * | 2009-09-02 | 2011-11-30 | 채광묵 | Transmitting Apparatus and Receiving Apparatus for Remote Position Indication |
TWI439788B (en) * | 2010-01-04 | 2014-06-01 | Ind Tech Res Inst | System and method for projection correction |
US8506090B2 (en) | 2010-03-22 | 2013-08-13 | Microvision, Inc. | Projection system with image orientation correction and corresponding method |
JP5625490B2 (en) * | 2010-05-25 | 2014-11-19 | セイコーエプソン株式会社 | Projector, projection state adjustment method, and projection state adjustment program |
JP5671901B2 (en) * | 2010-09-15 | 2015-02-18 | セイコーエプソン株式会社 | Projection type display device and control method thereof |
CN102271237A (en) * | 2011-02-25 | 2011-12-07 | 鸿富锦精密工业(深圳)有限公司 | Projection device and method thereof for correcting trapezoidal distortion |
JP5924042B2 (en) * | 2012-03-14 | 2016-05-25 | セイコーエプソン株式会社 | Projector and projector control method |
JP6172495B2 (en) | 2012-12-28 | 2017-08-02 | 株式会社リコー | Calibration apparatus, apparatus, projector, three-dimensional scanner, calibration method, method, program, and storage medium |
FI125799B (en) * | 2013-10-11 | 2016-02-29 | Outotec Finland Oy | Method and arrangement for preparing cast anodes for use in the electrolytic refining of metal |
CN104952035A (en) * | 2015-06-12 | 2015-09-30 | Lenovo (Beijing) Ltd. | Information processing method and electronic equipment |
CN106331666B (en) * | 2015-07-03 | 2020-02-07 | ZTE Corp. | Projection terminal trapezoidal correction method and device, and projection terminal |
CN105203050A (en) * | 2015-10-08 | 2015-12-30 | Yangzhong Zhongke Weikang Intelligent Technology Co., Ltd. | Method for detecting included angle between tracking reflector and cross shaft of laser tracker |
CN106612422B (en) * | 2015-12-31 | 2018-08-28 | Beijing Yishu Technology Co., Ltd. | Projection correction method and device |
CN105954961A (en) * | 2016-05-06 | 2016-09-21 | Lenovo (Beijing) Ltd. | Information processing method and projection equipment |
CN106797456B (en) * | 2016-12-30 | 2019-02-01 | Cloudminds (Shenzhen) Robotics Systems Co., Ltd. | Projection image correction method, correction device, and robot |
CN109212874B (en) * | 2017-07-05 | 2021-04-30 | Chengdu Idealsee Technology Co., Ltd. | Scanning projection equipment |
CN109660703B (en) * | 2017-10-12 | 2021-10-26 | TDK Taiwan Corp. | Method for correcting optical mechanism |
CN107835399A (en) * | 2017-10-31 | 2018-03-23 | Weifang Goertek Electronics Co., Ltd. | Projection correction method, apparatus, and projector device |
CN108050991B (en) * | 2017-11-16 | 2020-09-11 | Yangtze Memory Technologies Co., Ltd. | Method for measuring sidewall inclination angle based on scanning electron microscope |
US11259013B2 (en) * | 2018-09-10 | 2022-02-22 | Mitsubishi Electric Corporation | Camera installation assistance device and method, and installation angle calculation method, and program and recording medium |
JP7283904B2 (en) * | 2019-01-18 | 2023-05-30 | Topcon Corp. | Measuring device and control method for measuring device |
CN110809141A (en) * | 2019-09-29 | 2020-02-18 | Shenzhen Huole Technology Development Co., Ltd. | Trapezoidal correction method and device, projector, and storage medium |
CN113452971B (en) * | 2020-03-25 | 2023-01-03 | Qisda (Suzhou) Co., Ltd. | Automatic horizontal trapezoidal correction method for projection device |
CN112104851B (en) * | 2020-09-15 | 2022-04-08 | Chengdu XGIMI Technology Co., Ltd. | Detection method, device, and detection system for picture correction |
CN112902876B (en) * | 2021-01-14 | 2022-08-26 | Northwestern Polytechnical University | Method for measuring weld deflection of a spin-formed curved-surface member of a tailor-welded blank |
JP7318669B2 (en) * | 2021-01-27 | 2023-08-01 | Seiko Epson Corp. | Display method and display system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6520646B2 (en) * | 1999-03-03 | 2003-02-18 | 3M Innovative Properties Company | Integrated front projection system with distortion correction and associated method |
US7150536B2 (en) * | 2003-08-08 | 2006-12-19 | Casio Computer Co., Ltd. | Projector and projection image correction method thereof |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2536110Y2 (en) | 1990-01-05 | 1997-05-21 | Fujitsu Ten Ltd. | Sound field configuration device |
JPH06253241A (en) | 1993-02-26 | 1994-09-09 | Matsushita Electric Ind Co Ltd | Projection distortion correction method for projection type display device |
JP3519393B2 (en) | 2001-12-26 | 2004-04-12 | Toshiba Corp. | Projection display device |
JP3879560B2 (en) | 2002-03-27 | 2007-02-14 | Seiko Epson Corp. | Projection-type image display device |
JP2004093275A (en) | 2002-08-30 | 2004-03-25 | Seiko Precision Inc. | Angle detection device and projector having the same |
JP3914938B2 (en) | 2004-04-30 | 2007-05-16 | NEC Viewtechnology, Ltd. | Projector keystone distortion correction device and projector including the keystone distortion correction device |
- 2004-05-31 JP JP2004161494A patent/JP3960390B2/en not_active Expired - Fee Related
- 2005-05-26 US US11/137,564 patent/US7452084B2/en active Active
- 2005-05-31 CN CNB2005100747379A patent/CN100501559C/en not_active Expired - Fee Related
- 2005-05-31 EP EP05011722A patent/EP1602894A1/en not_active Withdrawn
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070058136A1 (en) * | 2005-09-12 | 2007-03-15 | Casio Computer Co., Ltd. | Projecting apparatus and method and recording medium recording the projecting method |
US20070182936A1 (en) * | 2006-02-08 | 2007-08-09 | Canon Kabushiki Kaisha | Projection display apparatus |
US20090015730A1 (en) * | 2006-02-10 | 2009-01-15 | Kazuya Arakawa | Image projecting method and projector |
DE102006030194B4 (en) * | 2006-06-30 | 2015-07-02 | Airbus Operations Gmbh | Integrated projector in a seat for visual display of information |
US20100014778A1 (en) * | 2008-07-18 | 2010-01-21 | Seiko Epson Corporation | Image correcting apparatus, image correcting method, projector and projection system |
US8911096B2 (en) | 2008-12-10 | 2014-12-16 | Nikon Corporation | Projection apparatus for projecting and processing an image |
DE102010023108B4 (en) | 2009-06-04 | 2019-12-05 | Sypro Optics Gmbh | Projector with automatic focusing and imaging process |
US8337023B2 (en) * | 2009-06-11 | 2012-12-25 | Seiko Epson Corporation | Projector and trapezoidal distortion correcting method |
US20100315601A1 (en) * | 2009-06-11 | 2010-12-16 | Seiko Epson Corporation | Projector and trapezoidal distortion correcting method |
US10373356B2 (en) | 2009-10-29 | 2019-08-06 | Maxell, Ltd. | Presentation system and display apparatus for use in the presentation system |
US11625876B2 (en) | 2009-10-29 | 2023-04-11 | Maxell, Ltd. | Presentation system and display device for use in the presentation system |
US20110107222A1 (en) * | 2009-10-29 | 2011-05-05 | Yoshiharu Uchida | Presentation system and display device for use in the presentation system |
EP2317761A3 (en) * | 2009-10-29 | 2013-03-06 | Hitachi Consumer Electronics Co. Ltd. | Presentation system and display device for use in the presentation system |
US10950023B2 (en) | 2009-10-29 | 2021-03-16 | Maxell, Ltd. | Presentation system and display device for use in the presentation system |
US9098195B2 (en) | 2009-10-29 | 2015-08-04 | Hitachi Maxell, Ltd. | Presentation system and display device for use in the presentation system |
EP2570891A1 (en) * | 2011-09-15 | 2013-03-20 | Funai Electric Co., Ltd. | Projector |
US10007853B2 (en) * | 2011-11-24 | 2018-06-26 | Aisin Seiki Kabushiki Kaisha | Image generation device for monitoring surroundings of vehicle |
US20140247358A1 (en) * | 2011-11-24 | 2014-09-04 | Aisin Seiki Kabushiki Kaisha | Image generation device for monitoring surroundings of vehicle |
US20150049117A1 (en) * | 2012-02-16 | 2015-02-19 | Seiko Epson Corporation | Projector and method of controlling projector |
US20150323669A1 (en) * | 2014-05-08 | 2015-11-12 | Hitachi-Lg Data Storage Korea, Inc. | Apparatus for detecting distances in two directions |
US9739874B2 (en) * | 2014-05-08 | 2017-08-22 | Hitachi-Lg Data Storage Korea, Inc. | Apparatus for detecting distances in two directions |
US9749605B2 (en) * | 2015-06-30 | 2017-08-29 | Coretronic Corporation | Projection apparatus |
US10051250B2 (en) | 2016-12-16 | 2018-08-14 | Cj Cgv Co., Ltd. | Method of automatically correcting projection area based on image photographed by photographing device and system therefor |
WO2018110774A1 (en) * | 2016-12-16 | 2018-06-21 | Cj Cgv Co., Ltd. | Method of automatically correcting projection area based on image photographed by photographing device and system therefor |
CN111948802A (en) * | 2019-05-16 | 2020-11-17 | 精工爱普生株式会社 | Optical device, control method of optical device, and image display apparatus |
CN113547512A (en) * | 2021-08-04 | 2021-10-26 | 长春电子科技学院 | Intelligent detection manipulator for machining clamp body |
Also Published As
Publication number | Publication date |
---|---|
CN1704837A (en) | 2005-12-07 |
JP2005347790A (en) | 2005-12-15 |
US7452084B2 (en) | 2008-11-18 |
CN100501559C (en) | 2009-06-17 |
JP3960390B2 (en) | 2007-08-15 |
EP1602894A1 (en) | 2005-12-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7452084B2 (en) | Projector with a device for measuring angle of inclination | |
EP1517550B1 (en) | Projector with tilt angle measuring device | |
US7139424B2 (en) | Stereoscopic image characteristics examination system | |
US20050179875A1 (en) | Projector with a plurality of cameras | |
JP5401940B2 (en) | Projection optical system zoom ratio measurement method, projection image correction method using the zoom ratio measurement method, and projector for executing the correction method | |
US20100053569A1 (en) | Projection display apparatus and display method | |
CN109949728B (en) | Detection apparatus for display panel | |
US7204595B2 (en) | Projector and method of correcting projected image distortion | |
KR20020066219A (en) | Imaging system, program used for controlling image data in same system, method for correcting distortion of captured image in same system, and recording medium storing procedures for same method | |
JP3742085B2 (en) | Projector having tilt angle measuring device | |
JP3741136B2 (en) | Obstacle adaptive projection display | |
JP3926311B2 (en) | Projector having tilt angle measuring device | |
JP2005331585A (en) | Projector having device for measuring distance and tilt angle | |
JP2899553B2 (en) | Position adjustment method for solid-state imaging device | |
JP3730982B2 (en) | projector | |
JP3914938B2 (en) | Projector keystone distortion correction device and projector including the keystone distortion correction device | |
JP3742086B2 (en) | Projector having tilt angle measuring device | |
JP2005024618A (en) | Projector having tilt angle measuring instrument | |
JP4535769B2 (en) | Projector with tilt angle measuring device | |
JP4652219B2 (en) | Projector screen | |
JP2020067511A (en) | Camera system, control method and program of the same | |
JP2004363856A (en) | Projection type display device | |
JP3730979B2 (en) | Projector having tilt angle measuring device | |
JP3757224B2 (en) | Projector having tilt angle measuring device | |
JP4339087B2 (en) | Projector with automatic trapezoidal distortion correction means |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NEC VIEWTECHNOLOGY, LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MOCHIZUKI, KAZUO;REEL/FRAME:016910/0730 Effective date: 20050506 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
FPAY | Fee payment |
Year of fee payment: 4 |
AS | Assignment |
Owner name: NEC DISPLAY SOLUTIONS, LTD., JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:NEC VIEWTECHNOLOGY, LTD.;REEL/FRAME:035509/0782 Effective date: 20070401 |
FPAY | Fee payment |
Year of fee payment: 8 |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |
AS | Assignment |
Owner name: SHARP NEC DISPLAY SOLUTIONS, LTD., JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:NEC DISPLAY SOLUTIONS, LTD.;REEL/FRAME:055256/0755 Effective date: 20201101 |