WO2016179798A1 - A system and a computer-implemented method for calibrating at least one sensor - Google Patents

A system and a computer-implemented method for calibrating at least one sensor

Info

Publication number
WO2016179798A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
relative position
moving
imaging apparatus
component
Prior art date
Application number
PCT/CN2015/078769
Other languages
French (fr)
Inventor
Carsten Isert
Tao Xu
Guoyang XIE
Original Assignee
Bayerische Motoren Werke Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke Aktiengesellschaft filed Critical Bayerische Motoren Werke Aktiengesellschaft
Priority to PCT/CN2015/078769 priority Critical patent/WO2016179798A1/en
Publication of WO2016179798A1 publication Critical patent/WO2016179798A1/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00: Testing of vehicles

Definitions

  • the system for calibrating at least one sensor 200 as shown in Fig. 2 may further comprise a second moving apparatus 108 as shown in Fig. 2, which attaches to the component with a planar pattern 101 so as to be capable of moving the component 101.
  • the calibration system may implement fully automated calibration process, which makes the calibration more accurate, robust and efficient.
  • At least one of the first moving apparatus 103 and the second moving apparatus 108 may be mounted on a guiding apparatus 109 as shown in Fig. 2.
  • the guiding apparatus 109 may be one or more guide rails which can guide the first moving apparatus 103 and/or the second moving apparatus 108 to move, for example, around the driving equipment 106, with higher accuracy than manual operation. In case of using the guiding apparatus 109, it is unnecessary for the driving equipment (specifically, the sensors 105 therein) to move. Of course, the mounting apparatus 107 and the guiding apparatus 109 may also coexist in the system 200 as shown in Fig. 2.
  • the calibration system may implement the calibration of multi-sensors (such as laser scanners) provided in a vehicle 106 (such as a car) , and moreover, the calibration process is fully automated, which makes the calibration more accurate, robust and efficient.
  • a HAD car as an example of the driving equipment 106 may be equipped with several laser scanners as an example of the sensor 105, being installed on the front, back, and both sides of the car.
  • the sensor(s) 105 may be installed inside or outside the car 106; in Fig. 3, the one illustrative laser scanner 105 appears to be installed outside the car 106.
  • the system 200 can be a part of the car assembly line. During the production of a car, when the car finishes assembly of the sensors, it can then move to the calibration system 200 for performing the laser scanner calibration.
  • the calibration system 200 may contain: two robot arms as examples of the first moving apparatus 103 and the second moving apparatus 108, respectively; one external camera as an example of the imaging apparatus 102, to which one of the robot arms (the first moving apparatus 103) attaches; one checkerboard plane as an example of the component with a planar pattern 101; a mounting base as an example of the mounting apparatus 107, on which the car is mounted or disposed; and guide rails as an example of the guiding apparatus 109, which can guide the robot arms to move.
  • the mounting base 107 can rotate along the vertical axis so that each laser scanner 105 around the car can be calibrated.
  • the mounting base 107 is used to mount the car and fix it, so that the car will not move relative to the mounting base during the calibration process.
  • although Fig. 3 only illustrates the guide rails 109 associated with the robot arm of the checkerboard 101, the guide rails 109 associated with the robot arm of the external camera 102 may be similar to them.
  • the external camera 102 is mounted on one robot arm 103, and the checkerboard 101 is mounted on the other robot arm 108. Since the robot arms can move the position and rotation of the checkerboard and the camera, the calibration process can be performed automatically.
  • the robot arms may also be mounted on the guiding rails 109 (guiding apparatus) , so that the checkerboard and the camera can be freely moved with high accuracy.
  • the processing apparatus 104 may be provided in the driving equipment 106, for example connecting to the sensors so as to easily obtain the light scanning data from the sensors, or may be provided outside the driving equipment 106. In case of being provided outside the driving equipment 106, the information required by the processing apparatus may be provided to it via user input or via any one of, or a combination of, various communication means.
  • in step S110, in response to each of the at least one sensor 105 individually irradiating light beams to a component with a planar pattern 101, the relative position between the sensor 105 and the imaging apparatus 102 is obtained, based on light scanning data from the sensor and the respective images of the component 101 shot by the imaging apparatus 102.
  • the light scanning data may be laser scanning data directly read from the laser scanner, which can reflect the angle of the laser beams emitted by the laser scanner with respect to the laser scanner and also the distance between the laser scanner and the component to which the laser beams are irradiated.
  • the light scanning data is not limited to this, as long as the light irradiating positions on the component with a planar pattern can be obtained; then, by using the relative position between the sensor irradiating the lights and the component with a planar pattern, the positions in the images of the component shot by the imaging apparatus which correspond to the light irradiating positions on the component can be obtained (see, for example, Q. Zhang and R. Pless, "Extrinsic calibration of a camera and laser range finder (improves camera calibration)", in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2004).
  • in step S120, the relative position between the laser and the other end of a first moving apparatus, with one end attached to the imaging apparatus and the other end being able to be fixed, is obtained, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
  • each sensor 105 may be calibrated automatically, without needing to manually move the imaging apparatus 102 as in the prior art, and simultaneously, the automatic calibration process makes the calibration more accurate, robust and efficient than the manual calibration process in the prior art. A simplified sketch of the plane constraint behind step S110 is given below.
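The following is a minimal, illustrative sketch of the kind of point-to-plane constraint used in camera-laser extrinsic calibration of the sort cited above (Zhang and Pless). The residual function, the placeholder data and all variable names are assumptions for illustration only, not the patent's actual implementation.

```python
import numpy as np

def point_to_plane_residuals(T_cam_laser, laser_points, plane_normal_cam, plane_d_cam):
    """Signed distances of laser hits (given in the laser frame) to the checkerboard
    plane n.x + d = 0 expressed in the camera frame, after applying a candidate
    laser-to-camera transform T_cam_laser (4x4 homogeneous matrix)."""
    pts_h = np.hstack([laser_points, np.ones((len(laser_points), 1))])  # N x 4
    pts_cam = (T_cam_laser @ pts_h.T).T[:, :3]
    return pts_cam @ plane_normal_cam + plane_d_cam

# Placeholder data: laser points that hit the checkerboard (laser frame), and the
# checkerboard plane recovered from the camera image of the planar pattern.
laser_points = np.array([[1.0, 0.2, 0.0], [1.0, -0.1, 0.0]])
plane_normal_cam = np.array([0.0, 0.0, 1.0])
plane_d_cam = -1.5

residuals = point_to_plane_residuals(np.eye(4), laser_points, plane_normal_cam, plane_d_cam)
# A nonlinear least-squares solver would adjust T_cam_laser (over several checkerboard
# poses) so that these residuals are driven toward zero.
```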
  • Fig. 5 illustrates another computer-implemented method for calibrating at least one laser according to other exemplary embodiments of the present invention.
  • each of the at least one sensor 105 is capable of individually irradiating laser beams to the component with a planar pattern 101.
  • the calibration method according to exemplary embodiments of the present invention is more flexible, because the relating components can be arbitrarily moved as necessary.
  • the moving step S105 may comprise driving equipment moving step S101, where a driving equipment 106 in which the at least one sensor 105 is mounted or disposed may be moved by using a mounting apparatus 107 on which the driving equipment 106 is mounted, so that each of the at least one sensor is capable of individually irradiating light beams to the component with a planar pattern.
  • the calibration method according to this exemplary embodiment of the present invention may be used for massive industrial production and can be integrated into a current vehicle (such as car etc. ) assembly line, so as to be used to calibrate the sensors after they are mounted onto the vehicle and before leaving the factory.
  • the driving equipment or vehicle 106 can also be moveable through the mounting apparatus 107 so that each of the sensors 105 provided in the driving equipment may be adjusted to align with the component 101 automatically for irradiating lights thereto, which enhances the automation of the calibration process and thus makes the calibration more accurate, robust and efficient.
  • the calibration system according to this exemplary embodiment of the present invention may be used for massive industrial production and can be integrated into a current vehicle (such as car etc. ) assembly line, so as to be used to calibrate the sensors after they are mounted onto the vehicle and before leaving the factory.
  • the moving step S105 may comprise component moving step S102, where the component with a planar pattern is moved by using a second moving apparatus, so as to be capable of individually receiving light beams from each of the at least one sensor.
  • the calibration method according to this exemplary embodiment of the present invention may be more flexible, accurate, robust and efficient.
  • the calibration system according to these embodiments of the present invention does not require a large empty area for providing external markings, for example.
  • the moving step S105 may comprise a moving-apparatus moving step S103, where at least one of the first moving apparatus and the second moving apparatus may be moved by using a guiding apparatus 109, such as guiding rails.
  • steps S101, S102 and S103 may be arbitrarily combined or be used individually, and the present invention does not make any limitation to this.
  • the relative position between each sensor and the mounting apparatus may be obtained, based on the relative position between each sensor and the other end of the first moving apparatus and the relative position between the other end of the first moving apparatus and the mounting apparatus.
  • the relative position between the lasers may be obtained, based on the relative position between each laser and the other end of the first moving apparatus or based on the relative position between each laser and the mounting apparatus.
  • the aspects of the present invention may obtain at least one of the following advantages: 1) multi-sensors (such as laser scanners) on a car can be calibrated; 2) the calibration process is fully automated, which makes the calibration more accurate, robust and efficient; 3) the calibration system is suitable for a massive industrial production and can be integrated into current vehicle (such as car etc. ) assembly line; 4) the calibration system does not require a large empty area for multi laser scanner calibration.
  • Step 1: The car is transported through the assembly line and locked on the mounting base.
  • Step 2: The second robot arm (R1) that holds the checkerboard will move, so that one of the car's laser scanner beams intersects with the checkerboard. It is better to move the checkerboard so that more than one laser scanner beam intersects the checkerboard at the same time. Then, record the laser scanning data, which can reflect the angle of the laser beams emitted by the laser scanner with respect to the laser scanner and also the distance between the laser scanner and the checkerboard.
  • Step 3: The first robot arm (R0) moves the camera, so that the camera can see the checkerboard. Then, using traditional camera calibration methods (such as Jean-Yves Bouguet, Camera Calibration Toolbox for Matlab, 2003), we can get the relative transformation (position and rotation) of the checkerboard with respect to (wrt) the camera, cT_b. Since the transformation of the end-effector of the robot arm R0 wrt the base of this robot, r0T_c, can be calculated using forward kinematics of the robot arm, we can then get the transformation of the checkerboard wrt the base of R0 as r0T_b = r0T_c * cT_b.
  • Step 4: Rotate the checkerboard to different angles and repeat Steps 2 to 3 to collect different sets of laser scanning data and checkerboard transformations wrt the camera.
  • Step 5: Using the rule that all laser beam points that intersect the checkerboard in each data set should be coplanar, we can find the optimized laser scanner transformation wrt the camera, cT_l0. Also, using the robot arm forward kinematics, we can calculate the transformation of the laser scanner wrt the base of R0 as r0T_l0 = r0T_c * cT_l0.
  • Step 6: Rotate the mounting base by an angle θ_i so that the beams of another laser scanner (the i-th laser scanner, l_i) strike the checkerboard, and repeat Steps 2 to 5 to get the transformation of the i-th laser scanner wrt the R0 base, r0T'_li. Since the transformation of the R0 base wrt the mounting base, BT_r0, is predefined, we can then get the transformation of the i-th laser scanner wrt the R0 base before rotating the mounting base, r0T_li, by composing BT_r0, the rotation by θ_i, and r0T'_li.
  • Step 7: Repeat Step 6 until all the laser sensors have been iterated. Then the transformation of every laser scanner i wrt the R0 base can be obtained. Taking the first laser scanner as the reference laser scanner, the transformation of the i-th laser scanner wrt the first laser scanner l_0 can be calculated as l0T_li = (r0T_l0)^(-1) * r0T_li. At this step, all the laser scanners are calibrated with respect to the reference laser scanner. A numerical sketch of this transformation chaining is given below.
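The sketch below walks through the transformation chaining of Steps 3 to 7 with 4x4 homogeneous matrices. It is only an illustration under stated assumptions: the placeholder matrices, the helper names (rot_z, T_r0_c, and so on) and, in particular, the exact composition used to undo the mounting-base rotation in Step 6 (assumed to be a rotation about the base's vertical axis) are not taken from the patent.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous rotation about the vertical (z) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def inv(T):
    """Inverse of a rigid 4x4 homogeneous transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Step 3: checkerboard wrt camera (camera calibration) and camera wrt the R0 base
# (forward kinematics of the first robot arm); identity placeholders here.
T_c_b = np.eye(4)        # checkerboard in the camera frame
T_r0_c = np.eye(4)       # camera in the R0-base frame
T_r0_b = T_r0_c @ T_c_b  # checkerboard in the R0-base frame

# Step 5: optimized laser scanner wrt the camera (coplanarity constraint),
# then laser scanner wrt the R0 base.
T_c_l0 = np.eye(4)
T_r0_l0 = T_r0_c @ T_c_l0

# Step 6: the i-th scanner is measured after the mounting base has been rotated
# by theta_i; T_B_r0 (R0 base in the mounting-base frame) is predefined.
# The composition below undoes that rotation to express the scanner in the
# R0-base frame of the unrotated configuration (the sign convention is an assumption).
theta_i = np.deg2rad(90.0)      # placeholder rotation angle
T_B_r0 = np.eye(4)              # predefined installation transform
T_r0_li_meas = np.eye(4)        # measured i-th scanner pose after rotation
T_r0_li = inv(T_B_r0) @ rot_z(-theta_i) @ T_B_r0 @ T_r0_li_meas

# Step 7: express every scanner relative to the reference scanner l0.
T_l0_li = inv(T_r0_l0) @ T_r0_li
```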
  • an apparatus 700 may comprise: a memory with computer executable instructions stored therein; and a processor, coupled to the memory and configured to: in response to each of the at least one laser individually irradiating laser beams to a component with a planar pattern, obtain the relative position between the laser and the imaging apparatus, based on laser scanning data from the laser and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the laser and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
  • a non-transitory machine readable storage medium in which instructions causing a machine to perform the following operations are stored: in response to each of the at least one laser individually irradiating laser beams to a component with a planar pattern, obtain the relative position between the laser and the imaging apparatus, based on laser scanning data from the laser and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the laser and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
  • the computing device 700 may be any machine configured to perform processing and/or calculations, may be but is not limited to a work station, a server, a desktop computer, a laptop computer, a tablet computer, a personal data assistant, a smart phone, an on-vehicle computer or any combination thereof.
  • the aforementioned system 200 for calibrating at least one sensor may be wholly or at least partially implemented by the computing device 700 or a similar device or system.
  • the computing device 700 may comprise elements that are connected with or in communication with a bus 702, possibly via one or more interfaces.
  • the computing device 700 may comprise the bus 702, and one or more processors 704, one or more input devices 706 and one or more output devices 708.
  • the one or more processors 704 may be any kinds of processors, and may comprise but are not limited to one or more general-purpose processors and/or one or more special-purpose processors (such as special processing chips) .
  • the input devices 706 may be any kinds of devices that can input information to the computing device, and may comprise but are not limited to a mouse, a keyboard, a touch screen, a microphone and/or a remote control.
  • the output devices 708 may be any kinds of devices that can present information, and may comprise but are not limited to display, a speaker, a video/audio output terminal, a vibrator and/or a printer.
  • the computing device 700 may also comprise or be connected with non-transitory storage devices 710 which may be any storage devices that are non-transitory and can implement data stores, and may comprise but are not limited to a disk drive, an optical storage device, a solid-state storage, a floppy disk, a flexible disk, hard disk, a magnetic tape or any other magnetic medium, a compact disc or any other optical medium, a ROM (Read Only Memory) , a RAM (Random Access Memory) , a cache memory and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions and/or code.
  • the non-transitory storage devices 710 may be detachable from an interface.
  • the non-transitory storage devices 710 may have data/instructions/code for implementing the methods and steps which are described above.
  • the computing device 700 may also comprise a communication device 712.
  • the communication device 712 may be any kind of device or system that can enable communication with external apparatuses and/or with a network, and may comprise but is not limited to a modem, a network card, an infrared communication device, a wireless communication device and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities and/or the like.
  • the computing device 700 When the computing device 700 is used as an on-vehicle device, it may also be connected to external device, for example, a GPS receiver, sensors for sensing different environmental data such as a laser scanner, an acceleration sensor, a wheel speed sensor, a gyroscope and so on. In this way, the computing device 700 may, for example, receive location data and sensor data indicating the travelling situation of the vehicle.
  • the computing device 700 may also be connected with other facilities, such as an engine system, a wiper, an anti-lock braking system or the like.
  • non-transitory storage device 710 may have map information and software elements so that the processor 704 may perform route guidance processing.
  • the output device 708 may comprise a display for displaying the map, the location mark of the vehicle and also images indicating the travelling situation of the vehicle.
  • the output device 708 may also comprise a speaker or an interface with an ear phone for audio guidance.
  • the bus 702 may include but is not limited to Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. Particularly, for an on-vehicle device, the bus 702 may also include a Controller Area Network (CAN) bus or other architectures designed for application on an automobile.
  • the computing device 700 may also comprise a working memory 714, which may be any kind of working memory that may store instructions and/or data useful for the working of the processor 704, and may comprise but is not limited to a random access memory and/or a read-only memory device.
  • Software elements may be located in the working memory 714, including but not limited to an operating system 716, one or more application programs 718, drivers and/or other data and codes. Instructions for performing the methods and steps described above may be comprised in the one or more application programs 718, and the processing apparatus 104 of the aforementioned calibration system may be implemented by the processor 704 reading and executing the instructions of the one or more application programs 718. More specifically, the processing apparatus 104 may, for example, be implemented by the processor 704 when executing an application 718 having instructions to perform the steps S110 and S120, and, where applicable, the moving steps S101 to S103.
  • the executable codes or source codes of the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as the storage device (s) 710 described above, and may be read into the working memory 714 possibly with compilation and/or installation.
  • the executable codes or source codes of the instructions of the software elements may also be downloaded from a remote location.
  • the present disclosure may be implemented by software with necessary hardware, or by hardware, firmware and the like. Based on such understanding, the embodiments of the present disclosure may be embodied in part in a software form.
  • the computer software may be stored in a readable storage medium such as a floppy disk, a hard disk, an optical disk or a flash memory of the computer.
  • the computer software comprises a series of instructions to make the computer (e.g., a personal computer, a service station or a network terminal) execute the method or a part thereof according to respective embodiment of the present disclosure.

Abstract

A system and a computer-implemented method for calibrating at least one sensor (100) are provided. The system for calibrating at least one sensor comprises: a component with a planar pattern (101) capable of receiving light beams from a sensor; an imaging apparatus (102) capable of shooting the component; a first moving apparatus (103) with one end being attached to the imaging apparatus (102) and the other end being able to be fixed so that the imaging apparatus (102) can move; and a processing apparatus (104) which is configured to, in response to each of the at least one sensor (100) individually irradiating light beams to said component, obtain the relative position between the sensor (100) and the imaging apparatus (102), based on light scanning data from the sensor (100) and the respective images of the component shot by the imaging apparatus (102), and obtain the relative position between the sensor (100) and the other end of the first moving apparatus (103), based on the relative position between the other end of the first moving apparatus (103) and the imaging apparatus (102).

Description

A SYSTEM AND A COMPUTER-IMPLEMENTED METHOD FOR CALIBRATING AT LEAST ONE SENSOR
FIELD OF THE INVENTION
The present disclosure relates in general to the field of the advanced driving assistant system (ADAS) or highly automatic driving (HAD) , and more particularly, to a system and a computer-implemented method for calibrating at least one sensor mounted in a car.
BACKGROUND OF THE INVENTION
With the development of the advanced driving assistant system or highly automatic driving vehicle (such as a car, an airplane, or the like) , more and more sensors, including cameras, speedometers, IMUs, laser scanners, etc., are added to cars so as to obtain road information, extract lanes, and detect obstacles, which guarantees the vehicle’s safe driving along the lane. Calibration of these sensors has become more and more important in order to improve the accuracy and robustness of these ADAS or HAD systems.
Among these sensors, calibration of cameras, speedometers or IMUs has already been investigated and practical methods have been used in massive production, since these sensors have already been used in consumer cars for many years. However, laser scanners, as a newly introduced sensor for HAD cars, have not been well researched. In a HAD car system, usually more than one laser scanner would be used, and how to calibrate the relative posture of these laser scanners to each other is still to be investigated.
Furthermore, currently, most HAD car systems are under research in many car companies. Only a few prototypes may be developed, and the effort and automation of calibrating these laser scanners have not received much attention. However, with the fast development of HAD technology, massive production is to be expected, and efficient and automatic sensor calibration will become significantly important then.
Now, nearly no methods have been developed for multi-laser calibration on cars. There are only a few related methods on camera-laser calibration, or camera calibration. Specifically, these methods just focus on camera-to-laser or camera-only calibrations, and few methods have been developed for laser-to-laser calibration and on-car laser calibration. Also, many of these methods require manual operation, and few of them are for fully automatic calibration, which is very important for massive industrial production. In addition, some of these methods require a large empty ground area and lots of manual work, which is not feasible for massive production.
Further, in the existing camera-laser calibration methods, the system to be calibrated has to include a camera which is already mounted in the system for capturing images after calibration, and the field of view of the camera shall have enough overlap with the laser scanners so that they can share certain information for calibration. However, on most HAD cars, this requirement may not be fulfilled, because the on-car cameras are designed for looking at specific objects and there is no need to have an overlapping field of view with the laser scanner, and the resolution of these on-car cameras may not be high enough for laser calibration.
SUMMARY OF THE INVENTION
The present invention aims to solve the problems as described above. An object of the present invention is to provide a system and a computer-implemented method for calibrating at least one sensor so as to address any of the above problems.
In accordance with a first exemplary embodiment of the present disclosure, a system for calibrating at least one sensor capable of emitting lights is provided, characterized in comprising: a component with a planar pattern capable of receiving light beams from a sensor; an imaging apparatus capable of shooting the component; a first moving apparatus with one end being attached to the imaging apparatus and the other end being able to be fixed so that the imaging apparatus can move; a processing apparatus which is configured to, in response to each of the at least one sensor individually irradiating light beams to said component, obtain the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the sensor and the other end of the first moving apparatus, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
In an example of the present embodiment, the at least one sensor is mounted in a driving equipment, and the system for calibrating at least one sensor may further comprise a mounting apparatus on which the driving equipment is mounted so that the driving equipment can be fixed or move.
In another example of the present embodiment, the processing apparatus may be further configured to obtain the relative position between each sensor and the mounting apparatus, based on the relative position between each sensor and the other end of the first  moving apparatus and the relative position between the other end of the first moving apparatus and the mounting apparatus.
In another example of the present embodiment, the processing apparatus may be further configured to obtain the relative position between the sensors, based on the relative position between each sensor and the other end of the first moving apparatus or based on the relative position between each sensor and the mounting apparatus.
In another example of the present embodiment, the system for calibrating at least one sensor may further comprise a second moving apparatus, attaching to the component so as to be capable of moving it.
In another example of the present embodiment, at least one of the first moving apparatus and the second moving apparatus may be mounted on a guiding apparatus.
In accordance with a second exemplary embodiment of the present disclosure, a computer-implemented method for calibrating at least one sensor capable of emitting lights is provided, characterized in comprising the following steps: in response to each of the at least one sensor individually irradiating light beams to a component with a planar pattern, obtaining the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and obtaining the relative position between the sensor and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
In an example of the present embodiment, the method may further comprise the step of: moving a driving equipment in which the at least one sensor is mounted by using a mounting apparatus on which the driving equipment is mounted, so that each of the at least one sensor is capable of individually irradiating light beams to a component.
In another example of the present embodiment, the method may further comprise the step of: obtaining the relative position between each sensor and the mounting apparatus, based on the relative position between each sensor and the other end of the first moving apparatus and the relative position between the other end of the first moving apparatus and the mounting apparatus.
In another example of the present embodiment, the method may further comprise the step of: obtaining the relative position between the sensors, based on the relative position between each sensor and the other end of the first moving apparatus or based on the relative position between each sensor and the mounting apparatus.
In another example of the present embodiment, the method may further comprise the step of: moving the component by using a second moving apparatus, so as to be capable of individually receiving light beams from each of the at least one sensor.
In another example of the present embodiment, the method may further comprise the step of: moving at least one of the first moving apparatus and the second moving apparatus by using a guiding apparatus.
In accordance with a third exemplary embodiment of the present disclosure, an apparatus is provided, characterized in comprising: a memory with computer executable instructions stored therein; and a processor, coupled to the memory and configured to: in response to each of the at least one sensor individually irradiating light beams to a component with a planar pattern, obtain the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the sensor and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
In accordance with a fourth exemplary embodiment of the present disclosure, a non-transitory machine readable storage medium is provided, in which instructions causing a machine to perform the following operations are stored: in response to each of the at least one sensor individually irradiating light beams to a component with a planar pattern, obtain the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the sensor and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
Further scope of applicability of the present disclosure will become apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the present disclosure, are given by way of illustration only, since various changes and modifications within the spirit and scope of the present disclosure will become apparent to those skilled in the art from the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects and advantages of the present disclosure will become apparent from the following detailed description of exemplary embodiments taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the present disclosure. Note that the drawings are not necessarily drawn to scale.
Fig. 1 illustrates a block diagram of a system for calibrating at least one sensor in accordance with an exemplary embodiment of the present disclosure.
Fig. 2 illustrates a block diagram of another system for calibrating at least one sensor in accordance with other exemplary embodiments of the present disclosure.
Fig. 3 illustrates an exemplary detailed example of actual arrangement of the system for calibrating at least one sensor 200.
Fig. 4 illustrates a computer-implemented method for calibrating at least one laser according to an exemplary embodiment of the present invention.
Fig. 5 illustrates another computer-implemented method for calibrating at least one laser according to other exemplary embodiments of the present invention.
Fig. 6 illustratively explains the moving step S105 according to other exemplary embodiments of the present invention.
Fig. 7 illustrates a general hardware environment wherein the present disclosure is applicable in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the following detailed description, numerous specific details are set forth to provide a thorough understanding of the described exemplary embodiments. It will be apparent, however, to one skilled in the art that the described embodiments can be practiced without some or all of these specific details. In other exemplary embodiments, well known structures or process steps have not been described in detail in order to avoid unnecessarily obscuring the concept of the present disclosure.
It is to be noted that the term “vehicle” (also referred to as “driving equipment” hereafter) used through the specification refers to such a vehicle wherein one or more sensors may be provided (be mounted or be disposed) . For example, such a vehicle comprises but is not limited to a car, a truck, a bus, an airplane, a ship, or the like.
Next, embodiments of the present invention will be described in detail below with reference to the drawings. Here, please note that similar reference numerals are used to indicate similar components, and detailed description thereof will be omitted for conciseness.
Referring first to Fig. 1, there is shown a block diagram of a novel system 100  for calibrating at least one sensor 105 in accordance with an exemplary embodiment of the present disclosure. The system for calibrating at least one sensor 100 may comprise: a component with a planar pattern 101, which is configured to be capable of receiving light beams from a sensor; an imaging apparatus 102, which is configured to be capable of shooting the component; a first moving apparatus 103, one end of which is attached to the imaging apparatus 102 and the other end is able to be fixed so that the imaging apparatus 102 can move; a processing apparatus 104, which is configured to, in response to each of the at least one sensor individually irradiating light beams to said component, obtain the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the sensor and the other end of the first moving apparatus, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
Here, for example, at least one sensor 105 may be laser scanners, which may emit invisible laser beams. However, other sensors may also be used in embodiments of the present invention. For example, laser scanners which may emit visible laser beams, or other sensors which may emit invisible lights or visible lights may also be used in embodiments of the present invention.
The component with a planar pattern 101 may be configured to be capable of receiving light beams from a sensor. For example, it may be a checkerboard. Here, a planar pattern may represent such a pattern which has a regular arrangement, such as a checkerboard pattern with white and black blocks regularly arranged. The present invention does not impose any limitation on the component 101, as long as it can receive light beams.
The imaging apparatus 102 is used to photograph the images of the component with a planar pattern 101. For example, the imaging apparatus 102 may be an image capturing device, such as static camera, video camera, or the like.
The first moving apparatus 103 is used to move the imaging apparatus 102, one end of which is attached to the imaging apparatus 102 and the other end is able to be fixed. For example, the first moving apparatus 103 may be a robot arm, a moving stage, or the like.
The processing apparatus 104 is used to obtain the position relationships among the components of the system. Specifically, each of the at least one sensor 105 irradiates light beams to the component with a planar pattern 101, respectively, and then light scanning data from this sensor 105 may be obtained, for example, either directly from the sensor 105 (for example, from the readings of a laser scanner) or in other manners (for example, from other  detecting means) , and based on the light scanning data from the sensor 105 and the corresponding images of the component shot by the imaging apparatus, the relative position between the sensor 105 and the imaging apparatus 102 may be obtained. In the case where the sensor is a laser scanner, for example, the light scanning data may be laser scanning data directly read from the laser scanner, which can reflect the angle of the laser beams emitted by the laser scanner with respect to the laser scanner and also the distance between the laser scanner and the component to which the laser beams are irradiated.
Further, based on the relative position between the other end of the first moving apparatus 103 and the imaging apparatus 102, the relative position between the sensor and the other end of the first moving apparatus 103 is obtained. The detailed description regarding how to obtain the position relationships among the components of the system will be explained later by taking an example.
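As a minimal sketch of this chaining, assuming each relative position is expressed as a 4x4 homogeneous transformation matrix and using purely illustrative variable names and placeholder values (none of which come from the disclosure), the computation may look as follows.

```python
import numpy as np

# T_cam_sensor: pose of the sensor 105 in the imaging-apparatus (camera) frame,
# estimated from the light scanning data and the images of the planar pattern 101.
# T_armend_cam: pose of the camera in the frame of the fixed (other) end of the
# first moving apparatus 103, known from the moving apparatus itself
# (e.g. robot-arm forward kinematics).
T_cam_sensor = np.eye(4)   # placeholder value
T_armend_cam = np.eye(4)   # placeholder value

# Relative position between the sensor and the other end of the first moving apparatus:
T_armend_sensor = T_armend_cam @ T_cam_sensor
```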
Furthermore, the processing apparatus 104 may be a controller, a processor, a microprocessor, or the like, or any combination thereof.
Through the above embodiment of the present invention, each sensor 105 may be calibrated automatically, without the need to manually move the imaging apparatus 102 as in the prior art; at the same time, the automatic calibration process makes the calibration more accurate, robust and efficient than a manual calibration process of the prior art.
Fig. 2 illustrates a block diagram of another system for calibrating at least one sensor in accordance with other exemplary embodiments of the present disclosure.
According to another exemplary embodiment of the present invention, the at least one sensor 105 may be mounted in a driving equipment 106 as shown in Fig. 2, such as a car, a bus, a truck, an airplane, a ship, or the like. Here, the number of the sensors 105 may be one or more. For example, if mounted in a car, the number of the sensors may be four, and they may be provided at the front side of the vehicle, at the rear side of the vehicle, at the left side of the vehicle, and at the right side of the vehicle, respectively. In fact, more or fewer sensors may be provided in the driving equipment 106 as necessary. The sensors 105 provided in the driving equipment 106 may obtain road information, extract lanes, and/or detect obstacles, etc., so as to help guarantee the vehicle's safety along the lane.
In this manner, the calibration system according to this exemplary embodiment of the present invention may be used for massive industrial production and can be integrated into a current vehicle (such as a car) assembly line, so as to calibrate the sensors after they have been mounted onto the vehicle and before the vehicle leaves the factory.
In addition, the system for calibrating at least one sensor 105 may further comprise a mounting apparatus 107 as shown in Fig. 2. The driving equipment 106 may be mounted on the mounting apparatus 107, so that the driving equipment 106 can be fixed or can move. Here, the mounting apparatus 107 may be a mounting base for a car, another mounting platform, or the like.
In this manner, during calibration, the driving equipment or vehicle 106 can also be moved by means of the mounting apparatus 107 so that each of the sensors 105 provided in the driving equipment may be automatically adjusted to align with the component 101 for irradiating light thereto, which enhances the automation of the calibration process and thus makes the calibration more accurate, robust and efficient.
In addition, the calibration system according to these embodiments of the present invention does not require a large empty area for providing external markings, for example.
According to another exemplary embodiment of the present invention, the processing apparatus 104 may be used to obtain the relative position between each sensor and the mounting apparatus, based on the relative position between each sensor and the other end of the first moving apparatus and the relative position between the other end of the first moving apparatus and the mounting apparatus.
In this manner, the calibration system according to this exemplary embodiment of the present invention may be more accurate, robust and efficient, because the mounting apparatus 107 is directly associated with the driving equipment 106.
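As a non-limiting sketch of how such a chain of relative positions could be composed, assuming purely for illustration that each relative position is represented as a 4x4 homogeneous transformation matrix (a representation the disclosure does not prescribe), the following may be considered; the variable names are hypothetical.

```python
import numpy as np

def invert(T_ab):
    """Invert a rigid-body transform (frame b in frame a -> frame a in frame b)."""
    R, t = T_ab[:3, :3], T_ab[:3, 3]
    T_ba = np.eye(4)
    T_ba[:3, :3] = R.T
    T_ba[:3, 3] = -R.T @ t
    return T_ba

# Hypothetical known/measured transforms (identity used as a placeholder):
T_mount_armbase = np.eye(4)   # other end of the first moving apparatus in the mounting-apparatus frame
T_armbase_sensor = np.eye(4)  # sensor in the frame of the other end of the first moving apparatus

# Relative position between the sensor and the mounting apparatus, obtained by chaining
T_mount_sensor = T_mount_armbase @ T_armbase_sensor
```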
According to another exemplary embodiment of the present invention, the processing apparatus may be used to obtain the relative position between the sensors, based on the relative position between each sensor and the other end of the first moving apparatus or based on the relative position between each sensor and the mounting apparatus.
In this manner, the calibration system according to this exemplary embodiment of the present invention may implement the calibration of multi-sensors (such as laser scanners) provided in a vehicle 106 (such as a car) , and moreover, the calibration process is fully automated, which makes the calibration more accurate, robust and efficient.
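Correspondingly, a minimal sketch (again assuming 4x4 homogeneous matrices, which the disclosure does not prescribe) of obtaining the relative position between two sensors from their poses in a common reference frame:

```python
import numpy as np

def relative_pose(T_ref_a, T_ref_b):
    """Pose of sensor b expressed in the frame of sensor a, given both poses
    in a common reference frame (e.g. the arm base or the mounting apparatus)."""
    return np.linalg.inv(T_ref_a) @ T_ref_b
```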
According to another exemplary embodiment of the present invention, the system 200 for calibrating at least one sensor as shown in Fig. 2 may further comprise a second moving apparatus 108 as shown in Fig. 2, which is attached to the component with a planar pattern 101 so as to be capable of moving the component 101.
In this manner, the calibration system according to this exemplary embodiment of the present invention may implement fully automated calibration process, which makes the calibration more accurate, robust and efficient.
According to another exemplary embodiment of the present invention, at least one of the first moving apparatus 103 and the second moving apparatus 108 may be mounted on a guiding apparatus 109 as shown in Fig. 2.
Here, the guiding apparatus 109 may be one or more guide rails which can guide the first moving apparatus 103 and/or the second moving apparatus 108 to move, for example, around the driving equipment 106, with higher accuracy than manual operation. When the guiding apparatus 109 is used, it is unnecessary for the driving equipment 106 (specifically, the sensors 105 therein) to move. Of course, the mounting apparatus 107 and the guiding apparatus 109 may also coexist in the system 200 as shown in Fig. 2.
In this manner, the calibration system according to this exemplary embodiment of the present invention may implement the calibration of multi-sensors (such as laser scanners) provided in a vehicle 106 (such as a car) , and moreover, the calibration process is fully automated, which makes the calibration more accurate, robust and efficient.
Here, it is to be noted that not all of these components in Fig. 2 are necessary for solving the technical problem of the present invention, and they may be removed or added depending on actual situation, or actual requirements such as the required accuracy etc.
For ease of understanding the concepts of the present invention, a detailed exemplary arrangement of the system 200 for calibrating at least one sensor will be described with reference to Fig. 3.
As shown in Fig. 3, a HAD car, as an example of the driving equipment 106, may be equipped with several laser scanners, as examples of the sensors 105, installed on the front, back, and both sides of the car. Here, the sensor (s) 105 may be installed in or outside the car 106, although the one illustrative laser scanner 105 appears to be installed outside the car 106. The system 200 can be a part of the car assembly line. During the production of a car, when assembly of the sensors is finished, the car can then move to the calibration system 200 for performing the laser scanner calibration.
The calibration system 200 may contain: two robot arms as examples of the first moving apparatus 103 and the second moving apparatus 108, respectively; one external camera as an example of the imaging apparatus 102, to which one of the robot arms (the first moving apparatus 103) is attached; one checkerboard plane as an example of the component with a planar pattern 101; a mounting base as an example of the mounting apparatus 107, on which the car is mounted or disposed; and guide rails as an example of the guiding apparatus 109, which can guide the robot arms to move. The mounting base 107 can rotate about the vertical axis so that each laser scanner 105 around the car can be calibrated. The mounting base 107 is used to mount the car and fix it, so that the car will not move relative to the mounting base during the calibration process. In addition, although Fig. 3 only illustrates the guide rails 109 associated with the robot arm of the checkerboard 101, the guide rails 109 associated with the robot arm of the external camera 102 may be similar to them.
The external camera 102 is mounted on one robot arm 103, and the checkerboard 101 is mounted on the other robot arm 108. Since the robot arms can change the position and orientation of the checkerboard and the camera, the calibration process can be performed automatically.
In addition, the robot arms may also be mounted on the guiding rails 109 (guiding apparatus) , so that the checkerboard and the camera can be freely moved with high accuracy.
Here, it is to be noted that the processing apparatus 104 may be provided in the driving equipment 106, for example connected to the sensors so as to easily obtain the light scanning data from the sensors, or may be provided outside the driving equipment 106. When provided outside the driving equipment 106, the information required by the processing apparatus may be provided to it via user input or via any one or combination of various communication means.
Furthermore, a computer-implemented method 10 for calibrating at least one sensor according to an exemplary embodiment of the present invention will be described below with reference to Fig. 4.
As shown in Fig. 4, at step S110, in response to each of the at least one sensor 105 individually irradiating light beams to a component with a planar pattern 101, the relative position between the sensor 105 and the imaging apparatus 102 is obtained, based on light scanning data from the sensor and the respective images of the component 101 shot by the imaging apparatus 102.
Here, as described above, in the case where the sensor is a laser scanner, for example, the light scanning data may be laser scanning data read directly from the laser scanner, which reflects the angle of the laser beams emitted by the laser scanner with respect to the laser scanner and also the distance between the laser scanner and the component to which the laser beams are irradiated. Of course, the light scanning data is not limited to this, as long as the light irradiating positions on the component with a planar pattern can be obtained, so that, by using the relative position between the sensor irradiating the light and the component, the positions in the images of the component shot by the imaging apparatus which correspond to the light irradiating positions on the component can be obtained (see, for example, Q. Zhang and R. Pless, "Extrinsic calibration of a camera and laser range finder (improves camera calibration)", in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2004).
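For illustration, a heavily simplified sketch of such a plane-constraint estimation, in the spirit of the cited approach, is given below; the parameterization, solver, and input format are assumptions of this example and are not taken from the cited paper or from the present disclosure.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def plane_residuals(x, laser_points_per_pose, plane_normals, plane_offsets):
    """Signed distances of the transformed laser points to the checkerboard plane
    estimated in the camera frame for each checkerboard pose.

    x = [rx, ry, rz, tx, ty, tz] encodes the assumed laser-to-camera transform
    as a rotation vector plus a translation."""
    R = Rotation.from_rotvec(x[:3]).as_matrix()
    t = x[3:]
    residuals = []
    for pts, n, d in zip(laser_points_per_pose, plane_normals, plane_offsets):
        cam_pts = pts @ R.T + t            # laser points expressed in the camera frame
        residuals.append(cam_pts @ n - d)  # vanishes if the points lie on the plane n.X = d
    return np.concatenate(residuals)

def calibrate_laser_to_camera(laser_points_per_pose, plane_normals, plane_offsets):
    """Estimate the laser-to-camera rotation and translation from several poses."""
    x0 = np.zeros(6)  # hypothetical initial guess
    sol = least_squares(plane_residuals, x0,
                        args=(laser_points_per_pose, plane_normals, plane_offsets))
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```

Here each entry of laser_points_per_pose would be an array of 3D laser points (in the scanner frame) that hit the planar component, and each plane is described in the camera frame by its normal and offset obtained from a standard checkerboard calibration.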
Then, at step S120, the relative position between the sensor and the other end of a first moving apparatus, with one end attached to the imaging apparatus and the other end being able to be fixed, is obtained, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
Through the above embodiment of the present invention, each sensor 105 may be calibrated automatically, without the need to manually move the imaging apparatus 102 as in the prior art; at the same time, the automatic calibration process makes the calibration more accurate, robust and efficient than a manual calibration process of the prior art.
Fig. 5 illustrates another computer-implemented method for calibrating at least one sensor according to other exemplary embodiments of the present invention.
As shown in Fig. 5, at moving step S105, through the movement of one or more components such as the mounting apparatus 107, the second moving apparatus 108, and/or the guiding apparatus 109, each of the at least one sensor 105 is enabled to individually irradiate light beams to the component with a planar pattern 101.
In this manner, the calibration method according to exemplary embodiments of the present invention is more flexible, because the relevant components can be moved as necessary.
More specifically, as shown in Fig. 6, according to another exemplary embodiment of the present invention, the moving step S105 may comprise driving equipment moving step S101, where a driving equipment 106 in which the at least one sensor 105 is mounted or disposed may be moved by using a mounting apparatus 107 on which the driving equipment 106 is mounted, so that each of the at least one sensor is capable of individually irradiating light beams to the component with a planar pattern.
In this manner, the calibration method according to this exemplary embodiment of the present invention may be used for massive industrial production and can be integrated into a current vehicle (such as a car) assembly line, so as to calibrate the sensors after they have been mounted onto the vehicle and before the vehicle leaves the factory.
In addition, during calibration, the driving equipment or vehicle 106 can also be moved by means of the mounting apparatus 107 so that each of the sensors 105 provided in the driving equipment may be automatically adjusted to align with the component 101 for irradiating light thereto, which enhances the automation of the calibration process and thus makes the calibration more accurate, robust and efficient.
More specifically, as shown in Fig. 6, according to another exemplary embodiment of the present invention, the moving step S105 may comprise component moving step S102, where the component with a planar pattern is moved by using a second moving apparatus, so as to be capable of individually receiving light beams from each of the at least one sensor.
In this manner, the calibration method according to this exemplary embodiment of the present invention may be more flexible, accurate, robust and efficient.
In addition, the calibration system according to these embodiments of the present invention does not require a large empty area for providing external markings, for example.
More specifically, as shown in Fig. 6, according to another exemplary embodiment of the present invention, the moving step S105 may comprise a guiding step S103, where at least one of the first moving apparatus and the second moving apparatus may be moved by using a guiding apparatus 109 such as guide rails.
Here, it is to be noted that, the steps S101, S102 and S103 may be arbitrarily combined or be used individually, and the present invention does not make any limitation to this.
Then, as shown in Fig. 5, according to another exemplary embodiment of the present invention, at the third obtaining step S130, the relative position between each sensor and the mounting apparatus may be obtained, based on the relative position between each sensor and the other end of the first moving apparatus and the relative position between the other end of the first moving apparatus and the mounting apparatus.
Then, as shown in Fig. 5, according to another exemplary embodiment of the present invention, at the fourth obtaining step S140, the relative position between the sensors may be obtained, based on the relative position between each sensor and the other end of the first moving apparatus or based on the relative position between each sensor and the mounting apparatus.
At this point, each individual sensor as well as the relative positions between the sensors can be calibrated.
In view of the above, the aspects of the present invention may obtain at least one of the following advantages: 1) multiple sensors (such as laser scanners) on a car can be calibrated; 2) the calibration process is fully automated, which makes the calibration more accurate, robust and efficient; 3) the calibration system is suitable for massive industrial production and can be integrated into a current vehicle (such as a car) assembly line; 4) the calibration system does not require a large empty area for multi-laser-scanner calibration.
Next, an exemplary calibration process among the sensors will be described below for ease of understanding by those skilled in the art. However, those skilled in the art shall appreciate that the following example of the calibration process among the sensors is only illustrative, and the present invention is not limited thereto. As for the calibration process for a single sensor, the above description has clearly set it forth, and it is thus omitted here for conciseness.
Step 1. The car is transported through the assembly line and locked on the mounting base.
Step 2. The second robot arm (R1) that holds the checkerboard moves so that the beams of one of the car's laser scanners intersect with the checkerboard. It is preferable to move the checkerboard so that more than one laser scanner beam intersects the checkerboard at the same time. Then, the laser scanning data is recorded, which reflects the angle of the laser beams emitted by the laser scanner with respect to the laser scanner and also the distance between the laser scanner and the checkerboard.
Step 3. The first robot arm (R0) moves the camera so that the camera can see the checkerboard. Then, using traditional camera calibration methods (such as Jean-Yves Bouguet, Camera Calibration Toolbox for Matlab, 2003), we can get the relative transformation (position and rotation) of the checkerboard with respect to (wrt) the camera, ^cT_b. Since the transformation of the end-effector of the robot arm R0 wrt the base of this robot, ^{r0}T_c, can be calculated using the forward kinematics of the robot arm, we can then get the transformation of the checkerboard wrt the base of R0 as ^{r0}T_b = ^{r0}T_c · ^cT_b.
Step 4. Rotate the checkerboard to different angles and repeat Steps 2 to 3 to collect different sets of laser scanning data and checkerboard transformations wrt the camera.
Step 5. By the rule that all laser beam points intersecting the checkerboard in each data set should be coplanar, we can find the optimized laser scanner transformation wrt the camera, ^cT_{l0}. Also, using the robot arm forward kinematics, we can calculate the transformation of the laser scanner wrt the base of R0, ^{r0}T_{l0}.
Step 6. Rotate the mounting base by an angle θ_i so that the beams of another laser scanner (the i-th laser scanner l_i) strike the checkerboard, and repeat Steps 2 to 5 to get the transformation of the i-th laser scanner wrt the R0 base, ^{r0}T′_{li}. Since the transformation of the R0 base wrt the mounting base, ^BT_{r0}, is predefined, we can then get the transformation of the i-th laser scanner wrt the R0 base before rotating the mounting base as
^{r0}T_{li} = (^BT_{r0})^{-1} · R(θ_i)^{-1} · ^BT_{r0} · ^{r0}T′_{li},
where R(θ_i) denotes the 4×4 homogeneous rotation of the mounting base by the angle θ_i about its vertical axis.
Step 7. Repeat Step 6 until all the laser scanners have been iterated. Then the transformation of every laser scanner l_i wrt the R0 base is obtained. Taking the first laser scanner as the reference laser scanner, the transformation of the i-th laser scanner wrt the first laser scanner l_0 can be calculated as
^{l0}T_{li} = (^{r0}T_{l0})^{-1} · ^{r0}T_{li}.
At this step, all the laser scanners are calibrated with respect to the reference laser scanner.
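A minimal numerical sketch of Steps 6 and 7, under the same homogeneous-matrix assumption, is given below; rot_z(θ) models the rotation of the mounting base about its vertical axis, and the function and variable names are hypothetical, introduced only for this example.

```python
import numpy as np

def rot_z(theta):
    """4x4 homogeneous rotation about the vertical axis of the mounting base."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def scanner_wrt_r0_before_rotation(T_B_r0, T_r0_li_after, theta_i):
    """Map the i-th scanner pose measured after rotating the mounting base by
    theta_i back to the un-rotated configuration, expressed in the R0 base frame."""
    T_r0_B = np.linalg.inv(T_B_r0)
    return T_r0_B @ rot_z(-theta_i) @ T_B_r0 @ T_r0_li_after

def scanner_wrt_reference(T_r0_l0, T_r0_li):
    """Pose of the i-th scanner relative to the reference scanner l0."""
    return np.linalg.inv(T_r0_l0) @ T_r0_li
```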
Furthermore, an apparatus 700 according to an exemplary embodiment of the present invention may comprise: a memory with computer executable instructions stored therein; and a processor, coupled to the memory and configured to: in response to each of the at least one laser individually irradiating laser beams to a component with a planar pattern, obtain the relative position between the laser and the imaging apparatus, based on laser scanning data from the laser and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the laser and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
In addition, it is provided a non-transitory machine readable storage medium according to an exemplary embodiment of the present invention, in which instructions causing a machine to perform the following operations are stored: in response to each of the at least one laser individually irradiating laser beams to a component with a planar pattern, obtain the relative position between the laser and the imaging apparatus, based on laser scanning data from the laser and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the laser and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
Next, a computing device 700, which is an example of a hardware device that may be applied to the aspects of the present disclosure, will now be described with reference to Fig. 7. The computing device 700 may be any machine configured to perform processing and/or calculations, and may be, but is not limited to, a work station, a server, a desktop computer, a laptop computer, a tablet computer, a personal data assistant, a smart phone, an on-vehicle computer or any combination thereof. The aforementioned processing apparatus 104 may be wholly or at least partially implemented by the computing device 700 or a similar device or system.
The computing device 700 may comprise elements that are connected with or in communication with a bus 702, possibly via one or more interfaces. For example, the computing device 700 may comprise the bus 702, one or more processors 704, one or more input devices 706 and one or more output devices 708. The one or more processors 704 may be any kind of processor, and may comprise but are not limited to one or more general-purpose processors and/or one or more special-purpose processors (such as special processing chips). The input devices 706 may be any kind of device that can input information to the computing device, and may comprise but are not limited to a mouse, a keyboard, a touch screen, a microphone and/or a remote control. The output devices 708 may be any kind of device that can present information, and may comprise but are not limited to a display, a speaker, a video/audio output terminal, a vibrator and/or a printer. The computing device 700 may also comprise or be connected with non-transitory storage devices 710, which may be any storage devices that are non-transitory and can implement data stores, and may comprise but are not limited to a disk drive, an optical storage device, a solid-state storage, a floppy disk, a flexible disk, a hard disk, a magnetic tape or any other magnetic medium, a compact disc or any other optical medium, a ROM (Read Only Memory), a RAM (Random Access Memory), a cache memory and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions and/or code. The non-transitory storage devices 710 may be detachable from an interface. The non-transitory storage devices 710 may have data/instructions/code for implementing the methods and steps described above. The computing device 700 may also comprise a communication device 712. The communication device 712 may be any kind of device or system that can enable communication with external apparatuses and/or with a network, and may comprise but are not limited to a modem, a network card, an infrared communication device, a wireless communication device and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities and/or the like. The communications mentioned above, by which information required by the processing apparatus 104 may be provided, may, for example, be implemented by the communication device 712.
When the computing device 700 is used as an on-vehicle device, it may also be connected to external devices, for example, a GPS receiver, or sensors for sensing different environmental data, such as a laser scanner, an acceleration sensor, a wheel speed sensor, a gyroscope and so on. In this way, the computing device 700 may, for example, receive location data and sensor data indicating the travelling situation of the vehicle. When the computing device 700 is used as an on-vehicle device, it may also be connected to other facilities (such as an engine system, a wiper, an anti-lock braking system or the like) for controlling the travelling and operation of the vehicle.
In addition, the non-transitory storage device 710 may have map information and software elements so that the processor 704 may perform route guidance processing. In addition, the output devices 708 may comprise a display for displaying the map, the location mark of the vehicle and also images indicating the travelling situation of the vehicle. The output devices 708 may also comprise a speaker or an interface with an earphone for audio guidance.
The bus 702 may include but is not limited to Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. Particularly, for an on-vehicle device, the bus 702 may also include a Controller Area Network (CAN) bus or other architectures designed for application on an automobile.
The computing device 700 may also comprise a working memory 714, which may be any kind of working memory that may store instructions and/or data useful for the working of the processor 704, and may comprise but is not limited to a random access memory and/or a read-only memory device.
Software elements may be located in the working memory 714, including but not limited to an operating system 716, one or more application programs 718, drivers and/or other data and code. Instructions for performing the methods and steps described above may be comprised in the one or more application programs 718, and the aforementioned processing apparatus 104 may be implemented by the processor 704 reading and executing the instructions of the one or more application programs 718. More specifically, the processing apparatus 104 may, for example, be implemented by the processor 704 when executing an application 718 having instructions to perform the obtaining steps S110 to S140 described above. The executable codes or source codes of the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as the storage device (s) 710 described above, and may be read into the working memory 714, possibly with compilation and/or installation. The executable codes or source codes of the instructions of the software elements may also be downloaded from a remote location.
Those skilled in the art may clearly know from the above embodiments that the present disclosure may be implemented by software with necessary hardware, or by hardware, firmware and the like. Based on such understanding, the embodiments of the present disclosure may be embodied in part in a software form. The computer software may be stored in a readable storage medium such as a floppy disk, a hard disk, an optical disk or a flash memory of the computer. The computer software comprises a series of instructions to make the computer (e.g., a personal computer, a service station or a network terminal) execute the method or a part thereof according to respective embodiment of the present disclosure.
The present disclosure being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure, and all such modifications as would be obvious to those skilled in the art are intended to be included within the scope of the following claims.

Claims (14)

  1. A system for calibrating at least one sensor capable of emitting lights, characterized in comprising:
    a component with a planar pattern capable of receiving light beams from a sensor;
    an imaging apparatus capable of shooting the component;
    a first moving apparatus with one end being attached to the imaging apparatus and the other end being able to be fixed so that the imaging apparatus can move;
    a processing apparatus which is configured to, in response to each of the at least one sensor individually irradiating light beams to said component, obtain the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and obtain the relative position between the sensor and the other end of the first moving apparatus, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
  2. The system as claimed in claim 1, wherein the at least one sensor is mounted in a driving equipment, and the system further comprising:
    a mounting apparatus on which the driving equipment is mounted so that the driving equipment can be fixed or move.
  3. The system as claimed in claim 2, wherein the processing apparatus is further configured to obtain the relative position between each sensor and the mounting apparatus, based on the relative position between each sensor and the other end of the first moving apparatus and the relative position between the other end of the first moving apparatus and the mounting apparatus.
  4. The system as claimed in claim 1 or 3, wherein the processing apparatus is further configured to obtain the relative position between the sensors, based on the relative position between each sensor and the other end of the first moving apparatus or based on the relative position between each sensor and the mounting apparatus.
  5. The system as claimed in claim 1, further comprising:
    a second moving apparatus, attaching to the component so as to be capable of moving it.
  6. The system as claimed in claim 5, wherein at least one of the first moving apparatus and the second moving apparatus is mounted on a guiding apparatus.
  7. A computer-implemented method for calibrating at least one sensor capable of emitting lights, characterized in comprising the following steps:
    in response to each of the at least one sensor individually irradiating light beams to a component with a planar pattern, obtaining the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and
    obtaining the relative position between the sensor and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
  8. The computer-implemented method as claimed in claim 7, further comprising the step of:
    moving a driving equipment in which the at least one sensor is mounted by using a mounting apparatus on which the driving equipment is mounted, so that each of the at least one sensor is capable of individually irradiating light beams to the component with a planar pattern.
  9. The computer-implemented method as claimed in claim 8, further comprising the step of:
    obtaining the relative position between each sensor and the mounting apparatus, based on the relative position between each sensor and the other end of the first moving apparatus and the relative position between the other end of the first moving apparatus and the mounting apparatus.
  10. The computer-implemented method as claimed in claim 7 or 9, further comprising the step of:
    obtaining the relative position between the sensors, based on the relative position between each sensor and the other end of the first moving apparatus or based on the relative position between each sensor and the mounting apparatus.
  11. The computer-implemented method as claimed in claim 7, further comprising the step  of:
    moving the component by using a second moving apparatus, so as to be capable of individually receiving light beams from each of the at least one sensor.
  12. The computer-implemented method as claimed in claim 11, further comprising the step of:
    guiding at least one of the first moving apparatus and the second moving apparatus by using a guiding apparatus.
  13. An apparatus, characterized in comprising:
    a memory with computer executable instructions stored therein; and
    a processor, coupled to the memory and configured to:
    in response to each of the at least one sensor individually irradiating light beams to a component with a planar pattern, obtain the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and
    obtain the relative position between the sensor and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
  14. A non-transitory machine readable storage medium, in which instructions causing a machine to perform the following operations are stored:
    in response to each of the at least one sensor individually irradiating light beams to a component with a planar pattern, obtain the relative position between the sensor and the imaging apparatus, based on light scanning data from the sensor and the respective images of the component shot by the imaging apparatus, and
    obtain the relative position between the sensor and the other end of a first moving apparatus with one end attached to the imaging apparatus and the other end being able to be fixed, based on the relative position between the other end of the first moving apparatus and the imaging apparatus.
PCT/CN2015/078769 2015-05-12 2015-05-12 A system and a computer-implemented method for calibrating at least one senser WO2016179798A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/078769 WO2016179798A1 (en) 2015-05-12 2015-05-12 A system and a computer-implemented method for calibrating at least one senser

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/078769 WO2016179798A1 (en) 2015-05-12 2015-05-12 A system and a computer-implemented method for calibrating at least one senser

Publications (1)

Publication Number Publication Date
WO2016179798A1 true WO2016179798A1 (en) 2016-11-17

Family

ID=57248737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/078769 WO2016179798A1 (en) 2015-05-12 2015-05-12 A system and a computer-implemented method for calibrating at least one senser

Country Status (1)

Country Link
WO (1) WO2016179798A1 (en)

