|Publication number||US5020114 A|
|Application number||US 07/233,128|
|Publication date||28 May 1991|
|Filing date||17 Aug 1988|
|Priority date||17 Aug 1987|
|Also published as||EP0304042A2, EP0304042A3|
|Inventors||Arisa Fujioka, Takashi Kondo|
|Original Assignee||Kabushiki Kaisha Toshiba|
S = S*e · (I/I*e) (6)
1. Field of the Invention
The present invention relates to an object discriminating apparatus, used to discriminate vehicles, pedestrians, and the like in, e.g., automatic steering of vehicles, and to a method therefor.
2. Description of the Related Art
In modern society, vehicles have become an indispensable means of transportation. However, many traffic accidents are caused by careless driving. For this reason, automatic steering of vehicles has recently been studied as a way of preventing traffic accidents.
The study of automatic steering of vehicles branches into various subjects. One important subject is the detection and discrimination of obstacles, wherein an obstacle is detected from a photographic image and its type is discriminated. As a conventional method of detecting an obstacle from a photographic image, a method is known in which an object crossing road boundaries on an image is determined to be an obstacle (32nd (first semiannual conference, 1986) National Conference Preliminary Reports 6N-8, pp. 1943-1944, Information Processing Society). According to another known method, a vehicle traveling ahead is detected from horizontal edges in an image, on the basis of the fact that the image of a vehicle includes a large number of horizontal edges compared with the background (Journal of the Information Processing Society (Japan), vol. 27, No. 7, pp. 663-690).
In both of the above-described methods, however, an obstacle is merely detected; its type cannot be discriminated.
It is an object of the present invention to provide an object discriminating apparatus, which can detect an object from a photographic image and discriminate the type of the object, and a method therefor.
In order to achieve the above object, there is provided an object discriminating apparatus comprising: image photographing means for photographing an image; object detecting means for detecting an object from the photographic image photographed by the image photographing means; area computing means for computing a projection area of the object detected by the object detecting means in an arbitrary plane of a 3-dimensional space; and object discriminating means for discriminating the object on the basis of the projection area computed by the area computing means.
According to the present invention, a method of discriminating an object comprises the steps of: photographing an image; detecting an object from the photographic image, computing a projection area of the detected object in an arbitrary plane of a 3-dimensional space, and discriminating the object on the basis of the computed projection area.
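The claimed steps can be outlined in code. The following is a minimal end-to-end sketch (photograph, detect, compute projection area, discriminate); every function body and name here is an illustrative stand-in, not the patent's implementation.

```python
# Minimal sketch of the claimed method. All function bodies are illustrative
# stand-ins, not the patent's implementation.

def detect_object(image):
    # Stand-in detector: pixels brighter than a stored road-density threshold.
    return [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v > 128]

def projection_area(pixels, scale):
    # Stand-in for the perspective computation of equations (5)-(9):
    # scale the pixel count into an area in an arbitrary plane.
    return len(pixels) * scale

def discriminate(image, scale, threshold):
    pixels = detect_object(image)
    if not pixels:
        return None          # no detection signal
    area = projection_area(pixels, scale)
    return "vehicle" if area > threshold else "pedestrian"

image = [[0, 200, 210], [0, 190, 205], [0, 0, 0]]
print(discriminate(image, scale=1.5, threshold=5.0))  # prints "vehicle"
```

The key point of the method survives even in this sketch: classification happens on a distance-corrected area, not on a raw pixel count.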
According to the object discriminating apparatus and the method therefor of the present invention, the projection area of an object in an arbitrary plane of a 3-dimensional space is computed from a photographic image. Therefore, even if a distance between a photographing position and an obstacle is changed, an actual area can always be calculated, and hence discrimination of an object can be reliably performed.
FIG. 1 is a block diagram showing an object discriminating apparatus according to an embodiment of the present invention;
FIG. 2 is a view illustrating a photographic image obtained by an image unit;
FIGS. 3A and 3B are flow charts showing processing of an object detecting section, an area computing section, and an object discriminating section in FIG. 1;
FIG. 4 is a view for explaining a method of computing the projection area of an object in an arbitrary plane of a 3-dimensional space using a photographic image;
FIG. 5 is a view illustrating coordinate points A', B', C', and D' in an arbitrary plane of a 3-dimensional space when they are respectively set to be ω1 (u1, v1), ω2 (u2, v2), ω3 (u3, v3), and ω4 (u4, v4); and
FIG. 6 is a view illustrating the projection areas of a vehicle and a pedestrian in an arbitrary plane of a 3-dimensional space.
FIG. 1 is a block diagram showing an object discriminating apparatus according to an embodiment of the present invention. Referring to FIG. 1, central control unit (CCU) 1 comprises object detecting section 3, area computing section 5, and object discriminating section 7. Imaging unit 17 is connected to CCU 1. Imaging unit 17 forms an image of a target visual field and outputs an image signal. Imaging unit 17 can be constituted by, e.g., an off-the-shelf industrial television (ITV) camera of the automatic focusing type. In addition, read-only memory (ROM) 9, which stores the program corresponding to the flow charts shown in FIGS. 3A and 3B, and random access memory (RAM) 11, which serves as a work area for various calculations, are connected to CCU 1. CCU 1 can be constituted by, e.g., a microcomputer. CCU 1, ROM 9, and RAM 11 can be integrated into a single-chip microcomputer.
FIG. 2 shows an example of image 23 photographed by imaging unit 17. In this embodiment, image 23 is constituted by vehicle 19 and pedestrian 21.
A case wherein vehicle 19 and pedestrian 21 are discriminated in image 23 by using the object discriminating apparatus of the present invention will be described below with reference to the flow charts in FIGS. 3A and 3B.
In step 31 in FIG. 3A, an image photographed by imaging unit 17 is read. In step 33, it is determined whether a detection signal is output from object detecting section 3. If NO, the flow returns to step 31. If YES in step 33, CCU 1 computes the number S*e of pixels in the image area of the detected object in step 35.
Since vehicle 19 and pedestrian 21, as photographed by imaging unit 17, have been transformed from 3-dimensional space data into 2-dimensional plane data, the 2-dimensional plane data must be perspectively transformed back into 3-dimensional space data.
If x* - y* is a coordinate system prior to the transformation, and x - y - z is a coordinate system after the transformation, transformation from 2-dimensional plane data into 3-dimensional space data can be represented by: ##EQU1## where x* - y* - z* - ω* is a homogeneous coordinate system.
Equations (1) and (2) are the transformation equations when a viewpoint is located at the position z = h on the z-axis of a right-handed coordinate system. The position z = h represents the focal position of the ITV camera. Since the viewpoint is located at the origin in an eye coordinate system (xe - ye - ze), the transformation equations in the eye coordinate system can be represented as follows, where x*e - y*e is the coordinate system prior to the transformation and x*e - y*e - z*e - ω*e is a homogeneous coordinate system: ##EQU2## Therefore, the projection areas of vehicle 19 and pedestrian 21 in an arbitrary plane of a 3-dimensional space can be obtained using equations (5) and (6) in the following procedure. Note that plane P is perpendicular both to a plane including the optical axis of the camera shown in FIG. 6 and to the plane (the ground) with which the object is in contact.
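The transformation equations (1)-(6) appear only as image placeholders (##EQU1##, ##EQU2##) in this text, so their exact form is unavailable here. The sketch below uses the standard pinhole model implied by the description: the viewpoint sits at the origin of the eye coordinate system, and the image plane lies at the focal distance. The focal value and both function names are assumptions for illustration.

```python
# Assumed pinhole-camera sketch of the perspective transformation; the
# patent's exact equations (1)-(6) are elided in this text.

F = 1.0  # assumed focal distance of the ITV camera

def project(p):
    """Perspective projection of an eye-coordinate point p = (x, y, z), z > 0.
    In homogeneous form: (x, y, z, 1) -> (F*x, F*y, F*z, z), then divide by w."""
    x, y, z = p
    return (F * x / z, F * y / z)

def back_project(q, z):
    """Recover the 3-D point for image point q = (x*, y*) at a known depth z,
    e.g. the depth at which the viewing ray meets a chosen plane."""
    xs, ys = q
    return (xs * z / F, ys * z / F, z)

q = project((2.0, -1.0, 4.0))
print(q)                       # prints (0.5, -0.25)
print(back_project(q, 4.0))    # prints (2.0, -1.0, 4.0)
```

The homogeneous fourth coordinate is what lets the division by depth be written as a linear matrix operation followed by one normalization step.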
First, in step 37, the coordinate system (x*e - y*e) of photographic image 23 is set as shown in FIG. 4, and the minimum and maximum values of x*e and y*e for vehicle 19 and pedestrian 21 are obtained. In step 39, point A' satisfying (a1, a2 - 1) = (x*e1, y*e1), point B' satisfying (b1 + 1, b2) = (x*e2, y*e2), point C' satisfying (c1, c2 + 1) = (x*e3, y*e3), and point D' satisfying (d1 - 1, d2) = (x*e4, y*e4) are obtained. In addition, in step 41, the area I*e of rectangle A'B'C'D' is computed. Area I*e of the rectangle having vertexes A', B', C', and D' can be given by: ##EQU3##
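Steps 37-41 amount to taking the extremes of the object's pixel coordinates, pushing each extreme out by one pixel (as for points A'-D'), and computing the area of the enclosing rectangle. A sketch under that reading, with an illustrative pixel list:

```python
def bounding_rectangle_area(pixels):
    """Area I*e of the rectangle enclosing the object's pixels, each extreme
    pushed out by one pixel as for points A', B', C', D' in the text."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    x_min, x_max = min(xs) - 1, max(xs) + 1
    y_min, y_max = min(ys) - 1, max(ys) + 1
    return (x_max - x_min) * (y_max - y_min)

pixels = [(2, 3), (5, 3), (2, 7), (5, 7)]   # illustrative object pixels
print(bounding_rectangle_area(pixels))      # prints 30
```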
Subsequently, points A', B', C', and D' on photographic image 23 are transformed to obtain their respective coordinates in an arbitrary plane of a 3-dimensional space. More specifically, in step 43, points A', B', C', and D' on the image are transformed into the corresponding points ω1 (u1, v1), ω2 (u2, v2), ω3 (u3, v3), and ω4 (u4, v4) in the arbitrary plane shown in FIG. 5. In addition, CCU 1 calculates the area I of rectangle ω1 ω2 ω3 ω4 in the arbitrary plane. Area I of rectangle ω1 ω2 ω3 ω4 can be calculated by the following equation: ##EQU4## In step 47, CCU 1 calculates the projection area of the object on the arbitrary plane by the following equation:
S = S*e · (I/I*e) (9)
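The area equations (7) and (8) are elided as image placeholders (##EQU3##, ##EQU4##) in this text. For the transformed quadrilateral ω1 ω2 ω3 ω4, whose corners need not remain axis-aligned after the perspective transformation, the shoelace formula is one standard way to compute the area I; treating it as the patent's own formula would be an assumption.

```python
def quad_area(w1, w2, w3, w4):
    """Area of quadrilateral w1 w2 w3 w4 (corners taken in order) via the
    shoelace formula; a stand-in for the elided equation (8)."""
    pts = [w1, w2, w3, w4]
    s = 0.0
    # Sum the cross products of consecutive corner pairs, wrapping around.
    for (u1, v1), (u2, v2) in zip(pts, pts[1:] + pts[:1]):
        s += u1 * v2 - u2 * v1
    return abs(s) / 2.0

print(quad_area((0, 0), (1, 0), (1, 1), (0, 1)))  # unit square: prints 1.0
```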
In step 49, CCU 1 discriminates the object on the basis of the projection area on the arbitrary plane. This processing is performed, for example, by storing a standard projection area for each type of vehicle and a standard projection area for pedestrians in RAM 11; CCU 1 then discriminates an object by comparing the calculated projection area with the projection areas read out from RAM 11. Discrimination of an object may also be performed by determining whether the projection area on an arbitrary plane of a 3-dimensional space exceeds a given threshold value. With such threshold comparison, the type of an obstacle can be discriminated even when only a vehicle or only a pedestrian appears in the photographic image. In the above-described embodiment, an object is detected by storing the density value of the road in advance and using it as a threshold value. However, an object may be detected by other known techniques, e.g., by detecting horizontal edges in an image.
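The scaling of equation (9) and the comparison of step 49 can be sketched as follows; the stored standard areas and all names are illustrative assumptions, not values from the patent.

```python
def projection_area(s_star, i_plane, i_star):
    # Equation (9): S = S*e * (I / I*e)
    return s_star * (i_plane / i_star)

# Assumed standard projection areas (m^2) that would be stored in RAM 11.
STANDARD_AREAS = {"vehicle": 6.0, "pedestrian": 0.8}

def classify(s_star, i_plane, i_star):
    s = projection_area(s_star, i_plane, i_star)
    # Choose the stored type whose standard area is closest to the computed S.
    return min(STANDARD_AREAS, key=lambda k: abs(STANDARD_AREAS[k] - s))

print(classify(3000, 0.002, 1.0))  # S = 6.0, prints "vehicle"
print(classify(400, 0.002, 1.0))   # S = 0.8, prints "pedestrian"
```

Because S is corrected for distance, the same stored standards apply whether the object is near or far from the camera, which is the point of the invention.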
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4007440 *||28 Jan 1976||8 Feb 1977||Agency Of Industrial Science & Technology||Apparatus for recognition of approximate shape of an article|
|US4646354 *||25 Oct 1984||24 Feb 1987||Mitsubishi Denki Kabushiki Kaisha||Area measuring apparatus using television|
|US4752964 *||9 Apr 1985||21 Jun 1988||Kawasaki Jukogyo Kabushiki Kaisha||Method and apparatus for producing three-dimensional shape|
|US4965844 *||31 Mar 1986||23 Oct 1990||Sony Corporation||Method and system for image transformation|
|1||Article from Principles of Interactive Computer Graphics; Entitled: "Three Dimensional Transformations and Perspective"; pp. 333-365.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US5153721 *||3 Jun 1991||6 Oct 1992||Olympus Optical Co., Ltd.||Method and apparatus for measuring an object by correlating displaced and simulated object images|
|US5483168 *||1 Mar 1993||9 Jan 1996||The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration||Optical potential field mapping system|
|US5519784 *||16 May 1995||21 May 1996||Vermeulen; Pieter J. E.||Apparatus for classifying movement of objects along a passage by type and direction employing time domain patterns|
|US5877801 *||5 Jun 1997||2 Mar 1999||Interactive Pictures Corporation||System for omnidirectional image viewing at a remote location without the transmission of control signals to select viewing parameters|
|US5903319 *||17 Jan 1995||11 May 1999||Interactive Pictures Corporation||Method for eliminating temporal and spacial distortion from interlaced video signals|
|US6157385 *||17 May 1999||5 Dec 2000||Oxaal; Ford||Method of and apparatus for performing perspective transformation of visible stimuli|
|US6201574||5 Feb 1997||13 Mar 2001||Interactive Pictures Corporation||Motionless camera orientation system distortion correcting sensing element|
|US6222683||31 Jul 2000||24 Apr 2001||Be Here Corporation||Panoramic imaging arrangement|
|US6243131||17 Jan 1995||5 Jun 2001||Interactive Pictures Corporation||Method for directly scanning a rectilinear imaging element using a non-linear scan|
|US6252603||8 Jul 1999||26 Jun 2001||Ford Oxaal||Processes for generating spherical image data sets and products made thereby|
|US6271853||8 Jul 1999||7 Aug 2001||Ford Oxaal||Method for generating and interactively viewing spherical image data|
|US6323862||8 Jul 1999||27 Nov 2001||Ford Oxaal||Apparatus for generating and interactively viewing spherical image data and memory thereof|
|US6331869||7 Aug 1998||18 Dec 2001||Be Here Corporation||Method and apparatus for electronically distributing motion panoramic images|
|US6337708||21 Apr 2000||8 Jan 2002||Be Here Corporation||Method and apparatus for electronically distributing motion panoramic images|
|US6341044||19 Oct 1998||22 Jan 2002||Be Here Corporation||Panoramic imaging arrangement|
|US6356296||8 May 1997||12 Mar 2002||Behere Corporation||Method and apparatus for implementing a panoptic camera system|
|US6369818||25 Nov 1998||9 Apr 2002||Be Here Corporation||Method, apparatus and computer program product for generating perspective corrected data from warped information|
|US6373642||20 Aug 1998||16 Apr 2002||Be Here Corporation||Panoramic imaging arrangement|
|US6392687||4 Aug 2000||21 May 2002||Be Here Corporation||Method and apparatus for implementing a panoptic camera system|
|US6426774||13 Jul 2000||30 Jul 2002||Be Here Corporation||Panoramic camera|
|US6466254||7 Jun 2000||15 Oct 2002||Be Here Corporation||Method and apparatus for electronically distributing motion panoramic images|
|US6480229||17 Jul 2000||12 Nov 2002||Be Here Corporation||Panoramic camera|
|US6493032||12 Nov 1999||10 Dec 2002||Be Here Corporation||Imaging arrangement which allows for capturing an image of a view at different resolutions|
|US6515696||25 Apr 2000||4 Feb 2003||Be Here Corporation||Method and apparatus for presenting images from a remote location|
|US6542184||8 Mar 2000||1 Apr 2003||Edward Driscoll, Jr.||Methods, apparatus, and program products for presenting panoramic images of a remote location|
|US6583815||14 Aug 2000||24 Jun 2003||Be Here Corporation||Method and apparatus for presenting images from a remote location|
|US6593969||8 Mar 2000||15 Jul 2003||Be Here Corporation||Preparing a panoramic image for presentation|
|US6731284||2 Oct 2000||4 May 2004||Ford Oxaal||Method of and apparatus for performing perspective transformation of visible stimuli|
|US6906620 *||28 Aug 2003||14 Jun 2005||Kabushiki Kaisha Toshiba||Obstacle detection device and method therefor|
|US6924832||11 Sep 2000||2 Aug 2005||Be Here Corporation||Method, apparatus & computer program product for tracking objects in a warped video image|
|US6947611 *||9 May 2002||20 Sep 2005||Sharp Kabushiki Kaisha||System, method and program for perspective projection image creation, and recording medium storing the same program|
|US7132933 *||23 Mar 2005||7 Nov 2006||Kabushiki Kaisha Toshiba||Obstacle detection device and method therefor|
|US7242425||17 Apr 2003||10 Jul 2007||Be Here, Corporation||Panoramic camera|
|US7486324||17 Apr 2003||3 Feb 2009||B.H. Image Co. Llc||Presenting panoramic images with geometric transformation|
|US7684624||3 Mar 2003||23 Mar 2010||Smart Technologies Ulc||System and method for capturing images of a target area on which information is recorded|
|US7714936||2 Jul 1997||11 May 2010||Sony Corporation||Omniview motionless camera orientation system|
|US8103057||22 Feb 2010||24 Jan 2012||Smart Technologies Ulc||System and method for capturing images of a target area on which information is recorded|
|US8509523||1 Nov 2011||13 Aug 2013||Tk Holdings, Inc.||Method of identifying an object in a visual scene|
|US8594370||26 Jul 2005||26 Nov 2013||Automotive Systems Laboratory, Inc.||Vulnerable road user protection system|
|US8818042||18 Nov 2013||26 Aug 2014||Magna Electronics Inc.||Driver assistance system for vehicle|
|US8842176||15 Jan 2010||23 Sep 2014||Donnelly Corporation||Automatic vehicle exterior light control|
|US8917169||2 Dec 2013||23 Dec 2014||Magna Electronics Inc.||Vehicular vision system|
|US8977008||8 Jul 2013||10 Mar 2015||Donnelly Corporation||Driver assistance system for vehicle|
|US8993951||16 Jul 2013||31 Mar 2015||Magna Electronics Inc.||Driver assistance system for a vehicle|
|US9008369||25 Aug 2014||14 Apr 2015||Magna Electronics Inc.||Vision system for vehicle|
|US9171217||3 Mar 2014||27 Oct 2015||Magna Electronics Inc.||Vision system for vehicle|
|US9191634||3 Apr 2015||17 Nov 2015||Magna Electronics Inc.||Vision system for vehicle|
|US9330321||22 Oct 2013||3 May 2016||Tk Holdings, Inc.||Method of processing an image of a visual scene|
|US9428192||16 Nov 2015||30 Aug 2016||Magna Electronics Inc.||Vision system for vehicle|
|US9436880||13 Jan 2014||6 Sep 2016||Magna Electronics Inc.||Vehicle vision system|
|US9440535||27 Jan 2014||13 Sep 2016||Magna Electronics Inc.||Vision system for vehicle|
|US9555803||16 May 2016||31 Jan 2017||Magna Electronics Inc.||Driver assistance system for vehicle|
|US9609289||29 Aug 2016||28 Mar 2017||Magna Electronics Inc.||Vision system for vehicle|
|US9643605||26 Oct 2015||9 May 2017||Magna Electronics Inc.||Vision system for vehicle|
|US9736435||20 Mar 2017||15 Aug 2017||Magna Electronics Inc.||Vision system for vehicle|
|US20020147991 *||10 Apr 2001||10 Oct 2002||Furlan John L. W.||Transmission of panoramic video via existing video infrastructure|
|US20020181803 *||9 May 2002||5 Dec 2002||Kenichi Kawakami||System, method and program for perspective projection image creation, and recording medium storing the same program|
|US20030193606 *||17 Apr 2003||16 Oct 2003||Be Here Corporation||Panoramic camera|
|US20030193607 *||17 Apr 2003||16 Oct 2003||Be Here Corporation||Panoramic camera|
|US20040096082 *||28 Aug 2003||20 May 2004||Hiroaki Nakai||Obstacle detection device and method therefor|
|US20040175042 *||3 Mar 2003||9 Sep 2004||Wallace Kroeker||System and method for capturing images of a target area on which information is recorded|
|US20050169530 *||23 Mar 2005||4 Aug 2005||Kabushiki Kaisha Toshiba||Obstacle detection device and method therefor|
|US20070152157 *||6 Nov 2006||5 Jul 2007||Raydon Corporation||Simulation arena entity tracking system|
|US20080306708 *||3 Jun 2008||11 Dec 2008||Raydon Corporation||System and method for orientation and location calibration for image sensors|
|US20090010495 *||26 Jul 2005||8 Jan 2009||Automotive Systems Laboratory, Inc.||Vulnerable Road User Protection System|
|US20100149349 *||22 Feb 2010||17 Jun 2010||Smart Technologies Ulc||System and method for capturing images of a target area on which information is recorded|
|USRE44087||27 Jan 2011||19 Mar 2013||B.H. Image Co. Llc||Presenting panoramic images with geometric transformation|
|WO2017125907A1 *||22 Dec 2016||27 Jul 2017||Agt International Gmbh||A method and system for object detection|
|U.S. Classification||382/285, 382/286, 356/628|
|International Classification||G06T7/60, G06K9/42, G06K9/00, G06K9/52, G06T1/00|
|Cooperative Classification||G06K9/00791, G06T7/60, G06K9/42, G06T2207/30261, G06K9/52, G06T2207/20224|
|European Classification||G06K9/52, G06T7/60, G06K9/42, G06K9/00V6|
|6 Mar 1991||AS||Assignment|
Owner name: KABUSHIKI KAISHA TOSHIBA, 72 HORIKAWA-CHO, SAIWAI-
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST.;ASSIGNORS:FUJIOKA, ARISA;KONDO, TAKASHI;REEL/FRAME:005620/0713
Effective date: 19880809
|26 Sep 1994||FPAY||Fee payment|
Year of fee payment: 4
|22 Dec 1998||REMI||Maintenance fee reminder mailed|
|30 May 1999||LAPS||Lapse for failure to pay maintenance fees|
|27 Jul 1999||FP||Expired due to failure to pay maintenance fee|
Effective date: 19990528