CN100430686C - Information processing method and device therefor

Information processing method and device therefor

Info

Publication number
CN100430686C
CN100430686C · CNB2005100680684A · CN200510068068A
Authority
CN
China
Prior art keywords
mark
information
image
orientation
camera head
Prior art date
Legal status
Active
Application number
CNB2005100680684A
Other languages
Chinese (zh)
Other versions
CN1865841A (en)
Inventor
穴吹真秀 (Masahide Anabuki)
佐藤清秀 (Kiyohide Satoh)
荒谷真一 (Shinichi Aratani)
小竹大辅 (Daisuke Kotake)
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc
Publication of CN1865841A
Application granted
Publication of CN100430686C

Abstract

The present invention relates to an information processing apparatus for obtaining placement information, relative to an imaging device, of markers that are provided on the imaging device and need to be calibrated. The apparatus comprises a first image obtaining unit for obtaining a first image captured by the imaging device; a second image obtaining unit for obtaining a second image of the imaging device captured from a bird's-eye view position; a first detecting unit for detecting, from the first image, image coordinate information of reference markers arranged in the scene; a second detecting unit for detecting, from the second image, image coordinate information of the markers to be calibrated; and a calibration information calculating unit that obtains the calibration information using the image coordinate information detected by the first detecting unit and the second detecting unit.

Description

Information processing method and apparatus
Technical field
The present invention relates to calibrating placement information, relative to an imaging device, of markers provided on the imaging device.
Background art
In recent years, active research has been carried out on mixed reality, whose aim is to seamlessly merge real space and virtual space. A typical image display device that presents mixed reality is realized by a video see-through method, in which an image of virtual space (virtual objects, text annotations, or similar information rendered with computer graphics) generated according to the position and orientation of an imaging device (described later) is superimposed on an image of real space captured by, for example, a video camera.
Such image display devices are expected to be applied to fields different from conventional virtual reality, for example surgical aids that superimpose the state of the inside of a patient's body onto the body surface, or mixed reality games in which a player fights virtual enemies in real space.
The most important problem in all of these applications is how to realize accurate registration between real space and virtual space, and many methods have been attempted to date. The registration problem in mixed reality reduces to the problem of obtaining the position and orientation of the imaging device in the scene (that is, in the world coordinate system).
A commonly adopted solution is to place or set markers in the scene and use the coordinates of the markers in the world coordinate system together with the coordinates of the markers' projections in the image captured by the imaging device, thereby obtaining the position and orientation of the imaging device in the scene. Methods for calculating the position and orientation of an imaging device from the world coordinates of markers in the scene and the image coordinates of their projections have long been proposed and used in the field of photogrammetry.
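For orientation, the following sketch illustrates this class of pose-from-markers calculations (it is not the method claimed in this patent), using OpenCV's solvePnP; the marker world coordinates, detected image points, and camera intrinsics are all made-up values.

```python
# Hedged illustration: camera pose from markers with known world coordinates.
import numpy as np
import cv2

# Known 3D marker positions in the world coordinate system (made-up, in mm).
world_points = np.array([[0, 0, 0], [200, 0, 0],
                         [200, 200, 0], [0, 200, 0]], dtype=np.float64)
# Detected image coordinates of the markers' projections (made-up, in pixels).
image_points = np.array([[320, 240], [520, 235],
                         [515, 430], [325, 435]], dtype=np.float64)
# Camera intrinsics, assumed already calibrated.
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(world_points, image_points, K, None)
R, _ = cv2.Rodrigues(rvec)          # rotation taking world coords to camera coords

# Camera-to-world homogeneous matrix, the M_WC form used later in this patent.
M_WC = np.eye(4)
M_WC[:3, :3] = R.T
M_WC[:3, 3] = (-R.T @ tvec).ravel()
print(M_WC)
```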
A technique has also been realized that obtains the position and orientation of an object by setting a plurality of markers on the object to be measured, photographing the object with an external bird's-eye view camera, and detecting the image coordinates of the projections of the markers in the captured bird's-eye image (see R. M. Haralick, C. Lee, K. Ottenberg, and M. Nolle: "Review and analysis of solutions of the three point perspective pose estimation problem", Int'l. J. Computer Vision, Vol. 13, No. 3, pp. 331-356, 1994; and D. G. Lowe: "Fitting parameterized three-dimensional models to images", IEEE Transactions on PAMI, Vol. 13, No. 5, pp. 441-450, 1991).
In addition, the present inventors have proposed methods for measuring the position and orientation of an imaging device in U.S. Patent Application Publication No. 2004/0176925 (filed with the USPTO on January 8, 2004), including: a method of calculating the position and orientation of the imaging device to be measured by detecting the projections of markers in the scene from an image captured by that imaging device; and a method of calculating the position and orientation of the imaging device by detecting, from a bird's-eye image captured from a bird's-eye viewpoint overlooking the object to be measured, the projections of markers set on the imaging device itself.
However, with the method of calculating the position and orientation of an object by detecting the image coordinates of marker projections in a bird's-eye image, the relative positional relationship between the plurality of markers set on the object to be measured must be known. If the relative positional relationship between the markers is not known — for example, if markers whose features can each be described as a single point, such as colored spherical markers or circular markers whose centroid positions serve as the feature of the projected image, are placed on the object to be measured — then the three-dimensional position of each marker in the coordinate system of the object to be measured must be measured in advance. On the other hand, if the relative positional relationship between the markers is known — for example, if such markers are placed in advance on a positioning jig — then the position of each marker in the coordinate system of that jig (a coordinate system used in this way to describe the relative positions of markers is hereinafter called the "marker coordinate system") must have been measured in advance, and when the jig carrying the markers is placed on the object to be measured, the position and orientation of the marker coordinate system in the coordinate system of the object to be measured must also be known. Note that if a marker is described by a plurality of features — for example, a marker with a known shape such as a square or a triangle whose vertices serve as the features of the projected image — then that marker can be interpreted as a group of a plurality of markers and is regarded as a case in which "the relative positional relationship between the markers is known". However, accurate methods for calibrating these are generally not available; in general, there is no alternative to using inaccurate positions, or positions and orientations, measured by hand as the known values. Therefore, the above method of detecting the position and orientation of an object by measuring the image coordinates of marker projections in a bird's-eye image still leaves room for improvement, because in practice the position and orientation of the object can be measured only with low accuracy.
Summary of the invention
The object of the present invention is to solve the above problems, so that placement information of markers on an object can be obtained easily and accurately. To this end, the present invention is as follows.
According to a first aspect of the invention, the present invention relates to an information processing method for calculating the position, relative to an object, of a marker set on the object, comprising: an object position and orientation obtaining step of obtaining position and orientation information of the object; a first image input step of inputting a first image of the object taken from a bird's-eye viewpoint by a bird's-eye imaging unit; a detection step of detecting the marker from the first image; and a marker position and orientation calculation step of calculating the position of the marker relative to the object using the position and orientation of the object and information on the image coordinates of the detected marker.
Other features and advantages of the present invention will become clear from the following description of embodiments taken in conjunction with the accompanying drawings, in which the same reference characters denote the same or similar parts throughout.
Description of drawings
Fig. 1 is a schematic diagram showing the configuration of the marker calibration apparatus according to the first embodiment of the present invention.
Fig. 2 is a flowchart showing the procedure of the marker calibration method according to the first embodiment.
Fig. 3 is a schematic diagram showing the configuration of the marker calibration apparatus according to the second embodiment of the present invention.
Fig. 4 is a flowchart showing the procedure of the marker calibration method according to the second embodiment.
Fig. 5 is a schematic diagram showing the configuration of the marker calibration apparatus according to the third embodiment.
Fig. 6 is a flowchart showing the procedure of the marker calibration method according to the third embodiment.
Fig. 7 is a schematic diagram showing the configuration of the marker calibration apparatus according to a modification of the present invention.
Fig. 8 is a schematic diagram showing an illustrative configuration of the marker calibration apparatus according to the fourth embodiment.
Fig. 9 is a flowchart showing the procedure of the marker calibration method according to the fourth embodiment.
Embodiment
Specific embodiments of the present invention are described below in detail with reference to the accompanying drawings.
First embodiment
The marker calibration apparatus according to the present embodiment obtains, for each marker set on an imaging device, the position of the marker relative to the imaging device. The marker calibration apparatus and marker calibration method according to the present embodiment are described below.
Fig. 1 is a schematic diagram of an example configuration of the marker calibration apparatus 100 according to the first embodiment of the present invention. As shown in Fig. 1, the marker calibration apparatus 100 comprises a subjective-view marker detection unit 110, a position and orientation computing unit 120, a bird's-eye marker detection unit 150, an instruction unit 160, a data management unit 170, and a calibration information computing unit 180, and is connected to the imaging device 130 to be calibrated.
The following description assumes that the user of the marker calibration apparatus 100 according to the present embodiment sets one or more markers P_k to be calibrated (k = 1 to K_2; hereinafter abbreviated as "bird's-eye markers"). The position of each bird's-eye marker in the subjective-view camera coordinate system (a coordinate system in which a point on the imaging device 130 is defined as the origin and three orthogonal axes are defined as the X, Y, and Z axes) is unknown, and these unknown positions in the subjective-view camera coordinate system are the information to be calibrated by the marker calibration apparatus according to the present embodiment — that is, the output of the marker calibration by the marker calibration apparatus according to the present embodiment.
A plurality of subjective-view markers Q_k (k = 1 to K_1), whose positions in the world coordinate system (a coordinate system in which a point in real space is defined as the origin and three orthogonal axes are defined as the X, Y, and Z axes) are known, are placed at a plurality of points in real space as markers to be captured by the imaging device 130 (hereinafter called subjective-view markers, or reference markers). Preferably, the subjective-view markers Q_k are placed so that at least three or more markers are observed by the imaging device 130 when marker calibration data are acquired. In the situation shown in Fig. 1, four subjective-view markers Q_1, Q_2, Q_3, and Q_4 are placed, of which three markers, Q_1, Q_3, and Q_4, are within the field of view of the imaging device 130.
The subjective-view markers Q_k may be, for example, circular markers each having a different color, or feature points each having a different texture feature. Square markers, each a square region of a certain size, may also be used. In fact, markers of any type may be used as long as the image coordinates of their projections in a captured image can be detected and the individual markers can be identified.
The image output from the imaging device 130 (hereinafter called the "subjective-view image") is input to the subjective-view marker detection unit 110. The subjective-view marker detection unit 110 receives the subjective-view image from the imaging device 130 and detects the image coordinates of the subjective-view markers Q_k captured in the input image. For example, if each subjective-view marker Q_k is a marker of a different color, a region corresponding to each marker color is detected in the subjective-view image, and its centroid position is taken as the detected coordinates of that marker. If each subjective-view marker Q_k is a feature point with a different texture feature, the position of each marker is detected by template matching on the subjective-view image using a template image of the marker held in advance as known information. If square markers are used as reference markers, the image is binarized and then labeled, and regions formed by four straight lines and having a certain area or more are detected as marker candidates. Further, false detections are eliminated by judging whether a specific pattern exists in each candidate region, and the direction and identifier of the marker are obtained. A square marker detected in this way can be regarded as four markers constituted by its four vertices.
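As a minimal sketch of the color-region detection just described (assuming OpenCV; the HSV bounds are made-up values), a marker of one color can be found by thresholding and taking the centroid of the largest connected region:

```python
import numpy as np
import cv2

def detect_color_marker(image_bgr, hsv_lo, hsv_hi):
    """Centroid of the largest region matching one marker color, or None."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, hsv_lo, hsv_hi)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    if n < 2:                         # label 0 is the background
        return None
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return centroids[largest]         # (u, v): the marker's detected coordinates

# Example call with hypothetical bounds for a red marker:
# uv = detect_color_marker(frame, np.array([0, 120, 120]), np.array([10, 255, 255]))
```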
Furthermore, the subjective-view marker detection unit 110 outputs the image coordinates u^{Qkn} of each detected subjective-view marker Q_kn and its identifier k_n to the position and orientation computing unit 120. Here, n (n = 1 to N) is an index for each detected marker, and N denotes the total number of detected markers. For example, in the case of Fig. 1, N = 3, the identifiers are k_1 = 1, k_2 = 3, and k_3 = 4, and the corresponding image coordinates u^{Qk1}, u^{Qk2}, and u^{Qk3} are output. In the present embodiment, the subjective-view marker detection unit 110 performs the marker detection processing continuously, triggered by input of an image from the imaging device 130, but an arrangement may be made in which the processing is performed in response to a request from the position and orientation computing unit 120 (using the subjective-view image input at that point).
The position and orientation computing unit 120 calculates the position and orientation of the imaging device from the relation between the image coordinates u^{Qkn} of each detected subjective-view marker Q_kn and its world coordinates x_W^{Qkn} held in advance as known information. Methods for calculating the position and orientation of an imaging device from the world coordinates and image coordinates of subjective-view markers are well known (see R. M. Haralick, C. Lee, K. Ottenberg, and M. Nolle: "Review and analysis of solutions of the three point perspective pose estimation problem", Int'l. J. Computer Vision, Vol. 13, No. 3, pp. 331-356, 1994; and D. G. Lowe: "Fitting parameterized three-dimensional models to images", IEEE Transactions on PAMI, Vol. 13, No. 5, pp. 441-450, 1991). For example, if the subjective-view markers are placed on a single plane, detection of four or more markers allows the position and orientation of the imaging device to be obtained by two-dimensional homography calculations. Methods that detect the position and orientation of the imaging device using six or more markers not on a single plane are also known. In addition, a method may be used that optimizes the estimates of the position and orientation by minimizing the error between the theoretical values of the marker image coordinates calculated from those estimates and the detected values, performing iterative calculation using image Jacobians, for example by the Gauss-Newton method. The position and orientation of the imaging device 130 in the world coordinate system calculated in this way are output to the data management unit 170 in response to requests from the data management unit 170. Although in the present embodiment the position and orientation computing unit 120 continuously executes the position and orientation calculation, triggered by data input from the subjective-view marker detection unit 110, an arrangement may be made in which the processing is performed in response to a request from the data management unit 170 (using the data input at that point). In the following description, the position and orientation of the imaging device 130 are understood to be expressed by a homogeneous coordinate matrix M_WC (a modeling transformation matrix, i.e., a matrix that converts coordinates in the subjective-view camera coordinate system into coordinates in the world coordinate system).
The bird's-eye view camera 140 is fixed and placed at a position from which it can photograph the imaging device 130 when marker calibration data are acquired. The position and orientation of the bird's-eye view camera 140 in world coordinates are held in advance in the calibration information computing unit 180 as known values.
Hereinafter, the term "bird's-eye view camera" denotes a camera that observes the object from a third-person viewpoint; the position of the camera is not limited to a literally overhead position.
The bird's-eye marker detection unit 150 receives the image captured by the bird's-eye view camera 140 (the bird's-eye image), detects the image coordinates of the bird's-eye markers P_k captured in the image by processing similar to that of the subjective-view marker detection unit 110, and outputs the detected image coordinates u^{Pkm} and their identifiers k_m to the data management unit 170 in response to requests from the data management unit 170. Here, m (m = 1 to M) is an index for each detected marker, and M denotes the total number of detected markers. For example, in the situation shown in Fig. 1, M = 2, the identifiers are k_1 = 1 and k_2 = 2, and the corresponding image coordinates u^{Pk1} and u^{Pk2} are output. In the present embodiment, the bird's-eye marker detection unit 150 performs the marker detection processing continuously, triggered by input of a bird's-eye image, but an arrangement may be made in which the processing is performed in response to a request from the data management unit 170 (using the bird's-eye image input at that point).
If the operator (not shown) inputs a data acquisition command, the instruction unit 160 sends a "data acquisition" instruction to the data management unit 170; if a calibration information calculation command is input, it sends a "calibration information calculation" instruction to the calibration information computing unit 180. Commands can be input to the instruction unit 160 by, for example, pressing keys assigned to particular commands on a keyboard, or via a GUI displayed on a display.
When the data management unit 170 receives a "data acquisition" instruction from the instruction unit 160, it receives the position and orientation of the imaging device 130 in the world coordinate system from the position and orientation computing unit 120, receives the image coordinates of the bird's-eye markers and their identifiers from the bird's-eye marker detection unit 150, and adds the data, as a set of "position and orientation of the imaging device 130 plus image coordinates of a bird's-eye marker", to the data table prepared for each bird's-eye marker and retains it. Here, the position and orientation of the imaging device 130 input from the position and orientation computing unit 120 are exactly the data obtained when the image in which the bird's-eye marker image coordinates were detected — as input from the bird's-eye marker detection unit 150 — was captured. The data management unit 170 outputs the generated data table of each bird's-eye marker to the calibration information computing unit 180 in response to requests from the calibration information computing unit 180.
When the calibration information computing unit 180 receives a "calibration information calculation" instruction from the instruction unit 160, it receives the data tables from the data management unit 170, performs calibration processing based on them, and outputs the resulting calibration information (that is, the position of each bird's-eye marker in the subjective-view camera coordinate system).
Fig. 2 is a flowchart of the processing performed when the calibration apparatus according to the present embodiment obtains calibration information. Program code according to this flowchart is stored in memory in the apparatus according to the present embodiment, such as random access memory (RAM), read-only memory (ROM), or similar memory (not shown), and is read out and executed by a central processing unit (CPU).
First, at step S2010, the instruction unit 160 judges whether the operator has input a data acquisition command. The operator inputs the data acquisition command when the imaging device 130 is placed at a position for acquiring marker calibration data. If the judgment result is that no data acquisition command has been input (NO in step S2010), the instruction unit 160 advances to step S2050. If a data acquisition command has been input (YES in step S2010), the instruction unit 160 advances to step S2020.
At step S2020, the data management unit 170 receives from the position and orientation computing unit 120 the position and orientation M_WC of the imaging device 130 in the world coordinate system.
At step S2030, the data management unit 170 receives from the bird's-eye marker detection unit 150 the image coordinates u^{Pkm} of the bird's-eye markers P_km detected by the bird's-eye marker detection unit 150 and their identifiers k_m. By the processing performed in steps S2020 and S2030, the image coordinates of the bird's-eye markers when the position and orientation of the imaging device 130 are M_WC are obtained. Note that the information input from the bird's-eye marker detection unit 150 need not be information on all the bird's-eye markers; information on the markers detected in the bird's-eye image at that point in time is sufficient.
Next, at step S2040, the data management unit 170 adds the input data pairs to the data table L^{Pk} of each detected bird's-eye marker P_km. Specifically, with M_WC input from the position and orientation computing unit 120 denoted as M_WCi and u^{Pkm} input from the bird's-eye marker detection unit 150 denoted as u_i^{Pk}, the data pair [M_WCi, u_i^{Pk}] is registered as the i-th data on P_k in the data table L^{Pk} for the bird's-eye marker P_k, the data pairs being sorted by the identifier k_m input from the bird's-eye marker detection unit 150. Note that i (i = 1 to I^{Pk}) is an index for each data pair registered in the data table L^{Pk}, and I^{Pk} denotes the total number of data pairs registered for the bird's-eye marker P_k. Data are thus acquired.
At step S2050, it is judged whether the data tables obtained so far by the data management unit 170 — all of them, or the data table of at least one bird's-eye marker — contain enough information to calculate the calibration information. If at least one data table (or all data tables) is found not to satisfy the condition, the process returns to step S2010 and waits for input of a data acquisition command. On the other hand, if at least one data table (or all data tables) is found to satisfy the calibration information calculation condition, the process advances to step S2060. An example of the condition for the data table of a certain bird's-eye marker P_k to satisfy is that two or more different data pairs [M_WCi, u_i^{Pk}] have been obtained in the data table L^{Pk}. As will be described later, two equations are obtained from each data pair (see expression (3)), so once two or more data pairs have been obtained, the position of the bird's-eye marker in the subjective-view camera coordinate system, which comprises three parameters, can be obtained from four or more equations. However, the greater the diversity of the input data, the more accurate the resulting calibration information, so an arrangement may be made in which the condition requests more data.
Next, at step S2060, it is judged whether the operator has input a calibration information calculation command. If a calibration information calculation command has been input, the processing advances to step S2070; if not, the process returns to step S2010 and waits for input of a data acquisition command.
The calibration information to be obtained by the calibration information computing unit 180 — the position of the bird's-eye marker in the subjective-view camera coordinate system — is a vector of three values, [x_C y_C z_C]^T. In the following description, this unknown parameter is described as a state vector s^{Pk} = [x_C^{Pk} y_C^{Pk} z_C^{Pk}]^T. The description below concerns the processing for a certain bird's-eye marker P_k; it should be understood that the processing is performed for every bird's-eye marker whose data table satisfies the calibration information calculation condition.
At step S2070, the calibration information computing unit 180 assigns a suitable initial value (for example, [0 0 0]^T) to the state vector s^{Pk}.
At step S2080, the calibration information computing unit 180 calculates, for all i, the theoretical values u_i^{Pk'} = [u_xi^{Pk'}, u_yi^{Pk'}] of the bird's-eye image coordinates of the bird's-eye marker P_k, according to each data pair [M_WCi, u_i^{Pk}] (i = 1 to I^{Pk}) in the data table L^{Pk}. Here, the theoretical value of the bird's-eye image coordinates of a bird's-eye marker means the position (coordinates) in the bird's-eye image at which the marker P_k should be visible, given its position in the subjective-view camera coordinate system. u_i^{Pk'} is calculated by the function in expression (1).
$$\mathbf{u}_i^{P_k\prime} = F_i\left(\mathbf{s}^{P_k}\right) \tag{1}$$
In expression (1), s^{Pk} denotes the position of the bird's-eye marker P_k in the subjective-view camera coordinate system.
Specifically, the function F_i(·) is constituted by expressions (2) and (3).
$$\mathbf{x}_{Bi}^{P_k} = \begin{bmatrix} x_{Bi}^{P_k} & y_{Bi}^{P_k} & z_{Bi}^{P_k} & 1 \end{bmatrix}^T = M_{WB}^{-1} \cdot M_{WC_i} \cdot \begin{bmatrix} x_C^{P_k} & y_C^{P_k} & z_C^{P_k} & 1 \end{bmatrix}^T \tag{2}$$
Expression (2) obtains, from s^{Pk}, the position vector x_Bi^{Pk} of the bird's-eye marker P_k in the bird's-eye camera coordinate system at the point where the i-th data pair was obtained (that is, at the point where the position and orientation of the imaging device 130 are M_WCi).
$$\mathbf{u}_i^{P_k\prime} = \begin{bmatrix} u_{xi}^{P_k\prime} & u_{yi}^{P_k\prime} \end{bmatrix}^T = \begin{bmatrix} -f_x^B \dfrac{x_{Bi}^{P_k}}{z_{Bi}^{P_k}} & -f_y^B \dfrac{y_{Bi}^{P_k}}{z_{Bi}^{P_k}} \end{bmatrix}^T \tag{3}$$
Expression (3) obtains, from x_Bi^{Pk}, the coordinates u_i^{Pk'} of the bird's-eye marker P_k in the bird's-eye camera image. Here, f_x^B and f_y^B denote the focal lengths of the bird's-eye view camera 140 in the X-axis and Y-axis directions, respectively, and are understood to be known values held in advance. M_WB is a transformation matrix that converts coordinates in the bird's-eye camera coordinate system into world coordinates; it is calculated in advance from the position and orientation of the bird's-eye view camera 140 in the world coordinate system and held as a known value.
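Expressions (2) and (3) transcribe directly into code. A minimal numpy sketch, assuming M_WC_i and M_WB are given as 4 x 4 homogeneous matrices and the focal lengths are known:

```python
import numpy as np

def predict_uv(s, M_WC_i, M_WB, fx_B, fy_B):
    """Theoretical bird's-eye image coordinates u_i' for marker position s."""
    x_C = np.append(s, 1.0)                      # [x_C, y_C, z_C, 1]^T
    x_B = np.linalg.inv(M_WB) @ M_WC_i @ x_C     # expression (2)
    return np.array([-fx_B * x_B[0] / x_B[2],    # expression (3)
                     -fy_B * x_B[1] / x_B[2]])
```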
At step S2090, the calibration information computing unit 180 uses expression (4) to calculate, for all i, the error Δu_i^{Pk} between the actual image coordinates u_i^{Pk} of the bird's-eye marker P_k included in each data pair of the data table L^{Pk} and the corresponding theoretical image coordinates u_i^{Pk'}.
$$\Delta \mathbf{u}_i^{P_k} = \mathbf{u}_i^{P_k} - \mathbf{u}_i^{P_k\prime} \tag{4}$$
At step S2100, the calibration information computing unit 180 calculates, for all i, the image Jacobian J_{u_i s}^{Pk} (= ∂u_i^{Pk}/∂s^{Pk}) with respect to the state vector s^{Pk} — that is, a 2-row × 3-column Jacobian matrix whose components are the partial derivatives of the function F_i in expression (1) with respect to the components of the state vector s^{Pk}. Specifically, the 2-row × 3-column Jacobian matrix J_{u_i x_Bi}^{Pk} (= ∂u_i^{Pk}/∂x_Bi^{Pk}), whose components are the partial derivatives of the right-hand side of expression (3) with respect to the components of the position vector x_Bi^{Pk}, and the 3-row × 3-column Jacobian matrix J_{x_Bi s}^{Pk} (= ∂x_Bi^{Pk}/∂s^{Pk}), whose components are the partial derivatives of the right-hand side of expression (2) with respect to the components of the state vector s^{Pk}, are calculated, and J_{u_i s}^{Pk} is calculated using expression (5).
$$J_{u_i s}^{P_k} = J_{u_i x_{Bi}}^{P_k} \cdot J_{x_{Bi} s}^{P_k} \tag{5}$$
At step S2110, the calibration information computing unit 180 calculates the correction value Δs^{Pk} from the errors Δu_i^{Pk} and the Jacobian matrices J_{u_i s}^{Pk} calculated above for each i. Specifically, a 2I^{Pk}-dimensional error vector U, stacking the errors Δu_i^{Pk} vertically for all i, is created as shown in expression (6),
$$U = \begin{bmatrix} \Delta \mathbf{u}_1^{P_k} \\ \Delta \mathbf{u}_2^{P_k} \\ \vdots \\ \Delta \mathbf{u}_{I^{P_k}}^{P_k} \end{bmatrix} \tag{6}$$
and a 2I^{Pk}-row × 3-column matrix Φ, stacking the Jacobian matrices J_{u_i s}^{Pk} vertically, is created as shown in expression (7).
$$\Phi = \begin{bmatrix} J_{u_1 s}^{P_k} \\ J_{u_2 s}^{P_k} \\ \vdots \\ J_{u_{I^{P_k}} s}^{P_k} \end{bmatrix} \tag{7}$$
Then Δs^{Pk} is calculated by expression (8) using the pseudo-inverse matrix Φ^+ of Φ.
$$\Delta \mathbf{s}^{P_k} = \Phi^{+} U \tag{8}$$
Here, Δs^{Pk} is a three-dimensional vector, so Δs^{Pk} can be obtained as long as 2I^{Pk} is 3 or greater, that is, as long as I^{Pk} is 2 or greater. Note that Φ^+ can be obtained by Φ^+ = (Φ^T Φ)^{-1} Φ^T or by other methods.
At step S2120, the calibration information computing unit 180 corrects the position vector s^{Pk} of the bird's-eye marker P_k in the subjective-view camera coordinate system according to expression (9), using the correction value Δs^{Pk} calculated in step S2110, and takes the obtained value as the new s^{Pk}.
$$\mathbf{s}^{P_k} + \Delta \mathbf{s}^{P_k} \rightarrow \mathbf{s}^{P_k} \tag{9}$$
At step S2130, the calibration information computing unit 180 judges whether the calculation has converged, using some criterion such as whether the error vector U is smaller than a predetermined threshold or whether the correction value Δs^{Pk} is smaller than a predetermined threshold. If the judgment result is that convergence has not been obtained, the processing from step S2080 onward is repeated using the corrected state vector.
When the judgment result in step S2130 is that the calculation has converged, at step S2140 the calibration information computing unit 180 outputs the obtained state vector s^{Pk} as a location parameter indicating the position of the bird's-eye marker P_k in the subjective-view camera coordinate system.
Finally, at step S2150, it is judged whether to end the calibration processing. If the operator instructs the marker calibration apparatus 100 to end the calibration processing, the processing is terminated; if the operator instructs it to continue the calibration processing (that is, to recalibrate), the flow returns to step S2010 and waits for input of a data acquisition command.
In this way, the position, relative to the imaging device, of a marker provided on the imaging device can be obtained easily and accurately.
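Putting steps S2070 through S2130 together, a compact sketch of the iteration for one bird's-eye marker might look as follows. For brevity the image Jacobian of expression (5) is approximated by finite differences rather than derived analytically, and all names are illustrative rather than taken from the patent.

```python
import numpy as np

def predict_uv(s, M_WC_i, M_WB_inv, fx_B, fy_B):
    x_B = M_WB_inv @ M_WC_i @ np.append(s, 1.0)      # expression (2)
    return np.array([-fx_B * x_B[0] / x_B[2],        # expression (3)
                     -fy_B * x_B[1] / x_B[2]])

def calibrate_marker(data, M_WB, fx_B, fy_B, iters=50, tol=1e-6):
    """data: the table L^{Pk}, a list of (M_WC_i, u_i) pairs (u_i: 2-vector)."""
    M_WB_inv = np.linalg.inv(M_WB)
    s = np.zeros(3)                                  # step S2070: initial value
    for _ in range(iters):
        U, Phi = [], []
        for M_WC_i, u_i in data:
            u_pred = predict_uv(s, M_WC_i, M_WB_inv, fx_B, fy_B)
            U.append(np.asarray(u_i) - u_pred)       # expression (4)
            J = np.zeros((2, 3))                     # numerical image Jacobian
            for c in range(3):
                ds = np.zeros(3)
                ds[c] = 1e-4
                J[:, c] = (predict_uv(s + ds, M_WC_i, M_WB_inv, fx_B, fy_B)
                           - u_pred) / 1e-4
            Phi.append(J)
        U = np.concatenate(U)                        # expression (6)
        Phi = np.vstack(Phi)                         # expression (7)
        delta = np.linalg.pinv(Phi) @ U              # expression (8)
        s = s + delta                                # expression (9), step S2120
        if np.linalg.norm(delta) < tol:              # step S2130: convergence test
            break
    return s          # step S2140: marker position in subjective-view camera coords
```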
Modification 1-1
Although the present embodiment calculates the correction value of the state vector by the iterative method using expressions (5) through (8), the correction value need not always be obtained in this way. For example, it may be obtained using the LM method (Levenberg-Marquardt method), an iterative solution method for nonlinear equations; a robust estimation method such as M-estimation may be combined with it; or any other numerical calculation method may be used without departing from the essence of the present invention. In the present embodiment, in steps S2070 to S2130, an initial value is given for the marker position to be obtained and the optimal value is obtained by iterative calculation using image Jacobians, but the marker position can also be obtained by a simpler calculation method. Expanding expression (3) yields the relations shown in expressions (10) and (11).
$$\left(u_{xi}^{P_k} r_{31i}^{BC} + f_x^B r_{11i}^{BC}\right) x^{P_k} + \left(u_{xi}^{P_k} r_{32i}^{BC} + f_x^B r_{12i}^{BC}\right) y^{P_k} + \left(u_{xi}^{P_k} r_{33i}^{BC} + f_x^B r_{13i}^{BC}\right) z^{P_k} = -u_{xi}^{P_k} z_i^{BC} - f_x^B x_i^{BC} \tag{10}$$

$$\left(u_{yi}^{P_k} r_{31i}^{BC} + f_y^B r_{21i}^{BC}\right) x^{P_k} + \left(u_{yi}^{P_k} r_{32i}^{BC} + f_y^B r_{22i}^{BC}\right) y^{P_k} + \left(u_{yi}^{P_k} r_{33i}^{BC} + f_y^B r_{23i}^{BC}\right) z^{P_k} = -u_{yi}^{P_k} z_i^{BC} - f_y^B y_i^{BC} \tag{11}$$
These relations can be used to obtain s^{Pk} directly from the data table L^{Pk} = [M_WCi, u_i^{Pk}] (i = 1 to I^{Pk}), as shown in expression (12).
$$
\begin{bmatrix} x^{P_k} \\ y^{P_k} \\ z^{P_k} \end{bmatrix}
=
\begin{bmatrix}
u_{x1}^{P_k} r_{311}^{BC} + f_x^B r_{111}^{BC} & u_{x1}^{P_k} r_{321}^{BC} + f_x^B r_{121}^{BC} & u_{x1}^{P_k} r_{331}^{BC} + f_x^B r_{131}^{BC} \\
u_{y1}^{P_k} r_{311}^{BC} + f_y^B r_{211}^{BC} & u_{y1}^{P_k} r_{321}^{BC} + f_y^B r_{221}^{BC} & u_{y1}^{P_k} r_{331}^{BC} + f_y^B r_{231}^{BC} \\
\vdots & \vdots & \vdots \\
u_{xI^{P_k}}^{P_k} r_{31I^{P_k}}^{BC} + f_x^B r_{11I^{P_k}}^{BC} & u_{xI^{P_k}}^{P_k} r_{32I^{P_k}}^{BC} + f_x^B r_{12I^{P_k}}^{BC} & u_{xI^{P_k}}^{P_k} r_{33I^{P_k}}^{BC} + f_x^B r_{13I^{P_k}}^{BC} \\
u_{yI^{P_k}}^{P_k} r_{31I^{P_k}}^{BC} + f_y^B r_{21I^{P_k}}^{BC} & u_{yI^{P_k}}^{P_k} r_{32I^{P_k}}^{BC} + f_y^B r_{22I^{P_k}}^{BC} & u_{yI^{P_k}}^{P_k} r_{33I^{P_k}}^{BC} + f_y^B r_{23I^{P_k}}^{BC}
\end{bmatrix}^{+}
\begin{bmatrix}
-u_{x1}^{P_k} z_1^{BC} - f_x^B x_1^{BC} \\
-u_{y1}^{P_k} z_1^{BC} - f_y^B y_1^{BC} \\
\vdots \\
-u_{xI^{P_k}}^{P_k} z_{I^{P_k}}^{BC} - f_x^B x_{I^{P_k}}^{BC} \\
-u_{yI^{P_k}}^{P_k} z_{I^{P_k}}^{BC} - f_y^B y_{I^{P_k}}^{BC}
\end{bmatrix}
\tag{12}
$$
where

$$
\begin{bmatrix}
r_{11i}^{BC} & r_{12i}^{BC} & r_{13i}^{BC} & x_i^{BC} \\
r_{21i}^{BC} & r_{22i}^{BC} & r_{23i}^{BC} & y_i^{BC} \\
r_{31i}^{BC} & r_{32i}^{BC} & r_{33i}^{BC} & z_i^{BC} \\
0 & 0 & 0 & 1
\end{bmatrix}
= M_{WB}^{-1} \cdot M_{WC_i}
\tag{13}
$$
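A sketch of this non-iterative solution, under the same assumptions as before (M_WB and each M_WC_i given as 4 x 4 homogeneous matrices): build the matrix of expression (13) for each data pair, stack the rows given by expressions (10) and (11), and solve the resulting 2I^{Pk} x 3 system by least squares.

```python
import numpy as np

def solve_marker_position(data, M_WB, fx_B, fy_B):
    """data: the table L^{Pk}, a list of (M_WC_i, (ux_i, uy_i)) pairs."""
    M_WB_inv = np.linalg.inv(M_WB)
    A, b = [], []
    for M_WC_i, (ux, uy) in data:
        T = M_WB_inv @ M_WC_i                 # expression (13)
        r, t = T[:3, :3], T[:3, 3]
        A.append(ux * r[2] + fx_B * r[0])     # coefficients of expression (10)
        b.append(-ux * t[2] - fx_B * t[0])
        A.append(uy * r[2] + fy_B * r[1])     # coefficients of expression (11)
        b.append(-uy * t[2] - fy_B * t[1])
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol                                # [x^{Pk}, y^{Pk}, z^{Pk}]
```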
Modification 1-2
In the present embodiment, the position and orientation computing unit 120 has been described as calculating the position and orientation of the imaging device 130 from the correspondence between the image coordinates u^{Qkn} of each subjective-view marker Q_kn detected in the image captured by the imaging device 130 and the world coordinates x_W^{Qkn} of the marker held in advance as known information; however, the image features used for obtaining the position and orientation of the imaging device need not be point features as described. For example, the subjective-view marker detection unit 110 and the position and orientation computing unit 120 may be realized by a technique for obtaining the position and orientation of an imaging device using line features, as described in A. I. Comport, E. Marchand, F. Chaumette: "A real-time tracker for markerless augmented reality", Proc. Int'l Symp. on Mixed and Augmented Reality, pp. 36-45, 2004, or by a technique using geometric features such as ellipses; a combination of these techniques may also be used. In fact, any camera position and orientation estimation method using any image features can be adopted.
A hybrid method may also be used that combines two kinds of information: image features detected in the image captured by the imaging device 130, using the world coordinates of markers held in advance as known information; and measured values of a sensor mounted on the imaging device 130, for example a 6-DOF sensor such as a magnetic position and orientation sensor, or a 3-DOF sensor such as a gyro sensor. See Uchiyama, Yamamoto, Tamura: "A hybrid positioning technique for mixed reality: Concomitant use of 6-degree-of-freedom sensor and vision technique", Japan Virtual Reality Academic Journal, Vol. 8, No. 1, pp. 119-125, 2003; and also Fujii, Kanbara, Iwasa, Takemura, Yokoya: "Positioning by stereo camera with concomitant use of gyro sensor for expanded reality", Institute of Electronics, Information and Communication Engineers, PRMU 99-192 (Technical Report of IEICE, Vol. 99, No. 574, pp. 1-8), 1999.
Modification 1-3
In the embodiment described so far, the object is an imaging device, in which case its position and orientation in the world coordinate system can be measured from the image information captured by the imaging device itself. If the object is an arbitrary object, a 6-DOF sensor such as a magnetic sensor can be used to measure its position and orientation. Fig. 7 shows this situation.
According to this modification of the present embodiment shown in Fig. 7, a marker calibration apparatus 700 is used to calibrate, for each marker provided on an object 710, the position of the marker relative to the object 710 — that is, in the object coordinate system — using a 6-DOF sensor 720 mounted on the object 710 and the bird's-eye view camera 140. Examples of the 6-DOF sensor 720 include magnetic position and orientation sensors (for example, FASTRAK made by Polhemus of the U.S., and Flock of Birds made by Ascension Technology Corporation of the U.S.), ultrasonic sensors, and similar sensors, but the present invention is not limited to these; a 6-DOF sensor of any type can be used. The marker calibration apparatus 700 has a position and orientation computing unit 730 in place of the subjective-view marker detection unit 110 and the position and orientation computing unit 120 shown in Fig. 1. The position and orientation computing unit 730 receives position and orientation measurements from the 6-DOF sensor 720 (the position and orientation of the sensor itself in the sensor coordinate system), calculates from these measurements the position and orientation of the object 710 in the world coordinate system using known calibration information, and outputs them to the data management unit 170 in response to requests from the data management unit 170. In other respects, the marker calibration apparatus 700 is identical to the marker calibration apparatus 100.
Modification 1-4
In the present embodiment described here, the camera that photographs the object is a bird's-eye view camera placed at a fixed position, but the bird's-eye view camera need not be fixed as long as its position and orientation in the world coordinate system can be measured.
For example, an arrangement may be made in which a 6-DOF sensor, for example a magnetic position and orientation sensor (for example, FASTRAK made by Polhemus of the U.S., or Flock of Birds made by Ascension Technology Corporation of the U.S.), an optical sensor (for example, Optotrak made by NDI of Canada, or the Vicon Tracker made by Vicon Motion Systems Inc. of the U.K.), an ultrasonic sensor, or a similar sensor, is mounted on the bird's-eye view camera, and the position and orientation of the bird's-eye view camera in the world coordinate system are obtained from the sensor measurements. The position and orientation of the bird's-eye view camera may also be obtained from the image coordinates u^{Qkn} of the subjective-view markers Q_kn captured by the bird's-eye view camera, with the world coordinates x_W^{Qkn} of the markers held in advance as known information. In addition, an arrangement may be made in which the above 6-DOF sensor or a 3-DOF sensor is mounted on the bird's-eye view camera, and the position and orientation of the bird's-eye view camera are obtained by a hybrid method from the bird's-eye image and the sensor measurements.
In this case, the position and orientation of the bird's-eye view camera measured by the above method are acquired simultaneously with the position and orientation of the imaging device 130, and are stored in the data table at step S2040. Then, at step S2080, the value calculated from the position and orientation held in the data table is used as M_WB in expression (2) (representing the position and orientation of the bird's-eye view camera in world coordinates), rather than a fixed value held in advance as a known value.
Second embodiment
In the present embodiment, if a plurality of markers provided on an imaging device have a marker coordinate system and the coordinates of each marker in the marker coordinate system are known, the marker calibration apparatus obtains the position and orientation of the marker coordinate system relative to the imaging device, or the position of each marker relative to the imaging device. The marker calibration apparatus and marker calibration method according to the present embodiment are described below.
Fig. 3 is a schematic diagram of an illustrative configuration of the marker calibration apparatus according to the present embodiment. Note that parts identical to those shown in Fig. 1 are denoted by the same reference numerals and symbols, and their detailed description is not repeated here. As shown in Fig. 3, a marker calibration apparatus 300 comprises the subjective-view marker detection unit 110, the position and orientation computing unit 120, the bird's-eye view camera 140, the bird's-eye marker detection unit 150, the instruction unit 160, a data management unit 370, and a calibration information computing unit 380, and is connected to the imaging device 130 to be calibrated.
The following description assumes that the user of the marker calibration apparatus 300 according to the present embodiment sets one or more markers P_k to be calibrated (k = 1 to K_2; hereinafter abbreviated as "bird's-eye markers", as in the first embodiment). The positions of the bird's-eye markers in the subjective-view camera coordinate system are unknown, as in the first embodiment, but in the present embodiment the relative positional relationship between the markers is known. One example of this situation is an arrangement in which a group of four bird's-eye markers P_k (k = 1, 2, 3, 4) constitutes a single square marker R_1 whose size is known. Fig. 3 shows this situation. Here, a point on the square marker R_1 is defined as the origin, three orthogonal axes are defined as the X, Y, and Z axes, and the coordinates of each bird's-eye marker P_k in this marker coordinate system are known. Therefore, once the position and orientation of the marker coordinate system in the subjective-view camera coordinate system are found (that is, the transformation matrix that converts coordinates in the marker coordinate system into the subjective-view camera coordinate system), the unknown position of each bird's-eye marker P_k in the subjective-view camera coordinate system can be calculated. The information calibrated by the marker calibration apparatus according to the present embodiment is therefore the position and orientation of the marker coordinate system in the subjective-view camera coordinate system.
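As a small illustration of this geometry (the corner layout and the side length are assumptions for the example, not values prescribed by the patent), the four corners of a square marker can be written in the marker coordinate system and carried into the subjective-view camera coordinate system once M_CM is known:

```python
import numpy as np

d = 80.0                                  # assumed side length of R_1, in mm
# Corners P_1..P_4 in the marker coordinate system (columns, homogeneous).
corners_M = np.array([[-d/2, -d/2, 0, 1],
                      [ d/2, -d/2, 0, 1],
                      [ d/2,  d/2, 0, 1],
                      [-d/2,  d/2, 0, 1]]).T

def corners_in_camera(M_CM):
    """Corner positions in the subjective-view camera coordinate system."""
    return (M_CM @ corners_M)[:3].T       # one row per corner, homogeneous row dropped
```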
As in the first embodiment, a plurality of subjective-view markers Q_k whose positions in the world coordinate system are known are placed at a plurality of points in real space as markers to be captured by the imaging device 130.
As in the first embodiment, the subjective-view marker detection unit 110 receives subjective-view images from the imaging device 130, detects the image coordinates of the subjective-view markers Q_k captured in the input image, and outputs the image coordinates u^{Qkn} of each detected subjective-view marker Q_kn and its identifier k_n to the position and orientation computing unit 120.
As in the first embodiment, the position and orientation computing unit 120 calculates the position and orientation of the imaging device 130 from the correspondence between the image coordinates u^{Qkn} of each detected subjective-view marker Q_kn and the world coordinates x_W^{Qkn} held as known information. This calculation is as described in the first embodiment. The position and orientation of the imaging device 130 in the world coordinate system calculated in this way are output to the data management unit 370 in response to requests from the data management unit 370.
As in the first embodiment, the bird's-eye view camera 140 is fixed and placed at a position from which it can photograph the imaging device 130 when marker calibration data are acquired. The position and orientation of the bird's-eye view camera 140 in world coordinates are held in advance in the calibration information computing unit 380 as known values.
As in the first embodiment, the bird's-eye marker detection unit 150 receives the image captured by the bird's-eye view camera 140, detects the image coordinates of the bird's-eye markers P_k in the image by processing similar to that of the subjective-view marker detection unit 110, and outputs the detected image coordinates u^{Pkm} and their identifiers k_m to the data management unit 370 in response to requests from the data management unit 370. In the example shown in Fig. 3, a square marker is used as the marker, so the square marker detection processing described in the first embodiment is performed.
As in the first embodiment, if the operator (not shown) inputs a data acquisition command, the instruction unit 160 sends a "data acquisition" instruction to the data management unit 370; if a calibration information calculation command is input, it sends a "calibration information calculation" instruction to the calibration information computing unit 380.
When the data management unit 370 receives a "data acquisition" instruction from the instruction unit 160, it receives the position and orientation of the imaging device 130 in the world coordinate system from the position and orientation computing unit 120, receives the image coordinates of the bird's-eye markers and their identifiers from the bird's-eye marker detection unit 150, and adds the data, as "position and orientation of the imaging device 130 plus image coordinates of a bird's-eye marker", to the data table and retains it. Here, the position and orientation of the imaging device 130 input from the position and orientation computing unit 120 are the data obtained when the image in which the bird's-eye marker image coordinates were detected — as input from the bird's-eye marker detection unit 150 — was captured. The data management unit 370 outputs the generated data table to the calibration information computing unit 380 in response to requests from the calibration information computing unit 380.
When the calibration information computing unit 380 receives a "calibration information calculation" instruction from the instruction unit 160, it receives the data table from the data management unit 370, performs calibration processing based on it, and outputs the resulting calibration information (that is, the position and orientation of the marker coordinate system in the subjective-view camera coordinate system, from which the position of each bird's-eye marker in that coordinate system follows).
Fig. 4 is a flowchart of the processing performed when the calibration apparatus according to the present embodiment obtains calibration information. Note that parts identical to those in Fig. 2 are denoted by the same reference numerals and symbols, and their detailed description is not repeated here. Program code according to this flowchart is stored in memory in the apparatus according to the present embodiment, for example RAM, ROM, or similar memory (not shown), and is read out and executed by the CPU (not shown).
First, at step S2010, the instruction unit 160 judges whether the operator has input a data acquisition command. If the judgment result in step S2010 is that a data acquisition command has been input, steps S2020, S2030, and S4040 are executed before step S4050. On the other hand, if the judgment result in step S2010 is that no data acquisition command has been input, step S4050 is executed directly. The processing in steps S2010, S2020, and S2030 is the same as in the first embodiment described above. After step S2030 is completed, the flow advances to the processing in step S4040.
At step S4040, the data management unit 370 adds the input data set D_j to the data table L for each detected bird's-eye marker P_km. Specifically, with M_WC input from the position and orientation computing unit 120 denoted as M_WCj, the identifier k_m input from the bird's-eye marker detection unit 150 denoted as k_j, and u^{Pkm} input from the bird's-eye marker detection unit 150 denoted as u_j^{Pk}, the data set [M_WCj, u_j^{Pk}, k_j] is registered in the data table L as the j-th data on P_k. Note that j (j = 1 to J) is an index for each data set registered in the data table L, and J denotes the total number of data sets registered. Data are thus acquired.
At step S4050, it is judged whether the data table obtained by the data management unit 370 — covering all bird's-eye markers, or at least one bird's-eye marker — contains enough information to calculate the calibration information. If the condition is not satisfied, the flow returns to step S2010 and waits for input of a data acquisition command. On the other hand, if the calibration information calculation condition is satisfied, the flow advances to step S2060. An example of the condition for the data table to satisfy is that the data table L includes data on three or more different bird's-eye markers P_k. However, the greater the diversity of the input data, the higher the accuracy of the resulting calibration information, so an arrangement may be made in which the condition requests more data.
Next, at step S2060, it is judged whether the operator has input a calibration information calculation command. If a calibration information calculation command has been input, the processing advances to step S4070; if not, the flow returns to step S2010 and waits for input of a data acquisition command.
The calibration information to be obtained by the calibration information computing unit 380 — the position and orientation of the marker coordinate system in the subjective-view camera coordinate system — is a vector of six values, [x y z ξ ψ ζ]^T. In the following description, this unknown parameter is described as a state vector s = [x y z ξ ψ ζ]^T.
At step S4070, the calibration information computing unit 380 gives the state vector s a suitable initial value. For the initial value, the operator may manually input a rough value using the instruction unit 160. Alternatively, an arrangement may be made in which the detection coordinates of a plurality of bird's-eye markers input at a certain time (that is, detected from the same bird's-eye image) are extracted from the table L and used to calculate, by a known method, the position and orientation of the marker coordinate system in the bird's-eye camera coordinate system; from the position and orientation thus obtained, a transformation matrix M_BM for converting coordinates in the marker coordinate system into the bird's-eye camera coordinate system is derived; and, using the position and orientation M_WC of the imaging device input and held in the table L at the same time, a transformation matrix M_CM representing the position and orientation of the marker coordinate system in the subjective-view camera coordinate system is obtained from expression (14). The position and orientation represented by this matrix are used as the initial value of s.
$$M_{CM} = M_{WC}^{-1} \cdot M_{WB} \cdot M_{BM} \tag{14}$$
where M_WB is a transformation matrix for converting coordinates in the bird's-eye camera coordinate system into world coordinates; it is calculated in advance from the position and orientation of the bird's-eye view camera 140 in the world coordinate system and held as a known value.
Known techniques for calculating the position and orientation of the marker coordinate system in the bird's-eye camera coordinate system include the following: if the markers are placed on a single plane, detection of four or more markers allows the position and orientation to be calculated by two-dimensional homography calculations; there is also a technique using six or more markers not on a single plane; and there is a technique for obtaining the optimal solution by iterative calculation, for example by the Gauss-Newton method, using such a solution as the initial value.
At step S4080, the calibration information computing unit 380 calculates, for all j, the theoretical values u_j^{Pkj'} = [u_xj^{Pkj'}, u_yj^{Pkj'}] of the bird's-eye image coordinates of the bird's-eye marker P_kj, according to each data set D_j = [M_WCj, u_j^{Pkj}, k_j] (j = 1 to J) in the data table L. Here, the theoretical value of the bird's-eye image coordinates of a bird's-eye marker means the position (coordinates) in the bird's-eye image at which the marker P_kj — whose position vector x_M^{Pkj} in the marker coordinate system is known — should be visible when the position and orientation of the marker coordinate system in the subjective-view camera coordinate system are s. u_j^{Pkj'} is calculated according to the function shown in expression (15).
u j P k j ′ = F j ( s ) - - - ( 15 )
The position of function representation mark coordinate system in first person photography video camera coordinate system in the expression formula 15.
Particularly, function F j() is provided with as shown in expression formula 16.
x Bj P k j = x B j P k j y B j P k j z B j P k j 1 T = M WB - 1 · M WC j · M CM ( s ) · x M P k j - - - ( 16 )
Function shown in the expression formula 16 is used to obtain that (that is, position and the orientation at camera head 130 is M obtaining the point of j data set from s WCjPoint on) look down visual angle mark P kPosition side amount x in looking down visual angle photograph coordinate Bj PkjExpression formula 17 is used for from x Bj PkjObtain to look down and look down visual angle mark P in the photography video camera image of visual angle KjCoordinate u j Pkj'.
u j P k j ′ = ux j P k j ′ uy j P k j ′ T = - f x B x B j P k j z B j P k j - f y B y B j P k j z B j P k j T - - - ( 17 )
Here be noted that f B xAnd f B yLook down the focal length of visual angle photography video camera 140 on expression X-direction and the Y direction, be construed as given value in advance.M CM(s) be model conversion matrix (being used for), define with expression formula 18 with the coordinate conversion of the mark coordinate system coordinate in the first person photography video camera coordinate system.
M CM ( s ) = ξ 2 θ 2 ( 1 - cos θ ) + cos θ ξψ θ 2 ( 1 - cos θ ) - ζ θ sin θ ξζ θ 2 ( 1 - cos θ ) + ψ θ sin θ x ψξ θ 2 ( 1 - cos θ ) + ζ θ sin θ ψ 2 θ 2 ( 1 - cos θ ) + cos θ ψζ θ 2 ( 1 - cos θ ) - ξ θ sin θ y ζξ θ 2 ( 1 - cos θ ) - ψ θ sin θ ζψ θ 2 ( 1 - cos θ ) + ξ θ sin θ ζ 2 θ 2 ( 1 - cos θ ) + cos θ z 0 0 0 1 - - - ( 18 )
Wherein
θ = ξ 2 + ψ 2 + ζ 2 - - - ( 19 )
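As a concrete illustration, a sketch of the model transformation matrix of expressions (18)-(19) and the projection of expressions (16)-(17) might look as follows in NumPy; the helper names are hypothetical, and the Rodrigues formula used below expands to exactly the entries written out in expression (18).

```python
import numpy as np

def m_cm(s):
    """4x4 matrix of expression (18) from s = [x y z xi psi zeta]."""
    x, y, z = s[:3]
    w = np.asarray(s[3:6], dtype=float)     # rotation vector [xi psi zeta]
    theta = np.linalg.norm(w)               # expression (19)
    M = np.eye(4)
    if theta > 1e-12:
        k = w / theta                       # unit rotation axis
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])  # cross-product matrix of the axis
        # Rodrigues formula; expands to the entries written out in (18)
        M[:3, :3] = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K
    M[:3, 3] = (x, y, z)
    return M

def project_birdseye(s, M_WB, M_WCj, x_M, fxB, fyB):
    """Theoretical bird's-eye-view image coordinates u' of one marker."""
    xB = np.linalg.inv(M_WB) @ M_WCj @ m_cm(s) @ np.append(x_M, 1.0)  # (16)
    return np.array([-fxB * xB[0] / xB[2], -fyB * xB[1] / xB[2]])     # (17)
```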
At step S4090, for all j, the calibration information computing unit 380 uses expression (20) to compute the error $\Delta u_j^{P_{k_j}}$ between the actual image coordinates $u_j^{P_{k_j}}$ of the bird's-eye-view marker $P_{k_j}$ included in each data set in the data list L and the corresponding theoretical image coordinates $u_j^{P_{k_j}\prime}$.
$\Delta u_j^{P_{k_j}} = u_j^{P_{k_j}} - u_j^{P_{k_j}\prime}$    (20)
At step S4100, for all j, the calibration information computing unit 380 computes the image Jacobian matrix $J_{u_j s}^{P_{k_j}} (= \partial u_j^{P_{k_j}} / \partial s)$ with respect to the state vector s (that is, a 2-row × 6-column Jacobian matrix whose components are the partial derivatives of the function $F_j(\cdot)$ of expression (15) with respect to the components of the state vector s). Specifically, a 2-row × 3-column Jacobian matrix $J_{u_j x_B}^{P_{k_j}} (= \partial u_j^{P_{k_j}} / \partial x_{B_j}^{P_{k_j}})$, whose components are the partial derivatives of the right side of expression (17) with respect to the components of the position vector $x_{B_j}^{P_{k_j}}$, and a 3-row × 6-column Jacobian matrix $J_{x_B s}^{P_{k_j}} (= \partial x_{B_j}^{P_{k_j}} / \partial s)$, whose components are the partial derivatives of the right side of expression (16) with respect to the components of the state vector s, are computed, and $J_{u_j s}^{P_{k_j}}$ is computed by expression (21).
$J_{u_j s}^{P_{k_j}} = J_{u_j x_B}^{P_{k_j}} \cdot J_{x_B s}^{P_{k_j}}$    (21)
At step S4110, the calibration information computing unit 380 computes the correction value $\Delta s$ from the errors $\Delta u_j^{P_{k_j}}$ and the Jacobian matrices $J_{u_j s}^{P_{k_j}}$ computed above for all j. Specifically, a 2J-dimensional error vector is created by vertically stacking the errors $\Delta u_j^{P_{k_j}}$ for all j, as shown in expression (22).
$U = \begin{bmatrix} \Delta u_1^{P_{k_1}} \\ \Delta u_2^{P_{k_2}} \\ \vdots \\ \Delta u_J^{P_{k_J}} \end{bmatrix}$    (22)
Likewise, a 2J-row × 6-column matrix is created by vertically stacking the Jacobian matrices $J_{u_j s}^{P_{k_j}}$, as shown in expression (23).
$\Phi = \begin{bmatrix} J_{u_1 s}^{P_{k_1}} \\ J_{u_2 s}^{P_{k_2}} \\ \vdots \\ J_{u_J s}^{P_{k_J}} \end{bmatrix}$    (23)
$\Delta s$ is computed using the pseudo-inverse matrix $\Phi^+$ of $\Phi$, as shown in expression (24).
$\Delta s = \Phi^+ U$    (24)
$\Delta s$ is a six-dimensional vector, so $\Delta s$ can be obtained as long as 2J is 6 or greater, that is, as long as J is 3 or greater. Note that $\Phi^+$ can be obtained by $\Phi^+ = (\Phi^T \Phi)^{-1} \Phi^T$ or by other methods.
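A minimal NumPy sketch of the stacking and correction of expressions (22)-(24), where `errors` and `jacobians` are hypothetical lists holding the per-data-set values of expressions (20) and (21):

```python
import numpy as np

def correction(errors, jacobians):
    """Correction value of expression (24) from per-data-set errors and Jacobians."""
    U = np.concatenate(errors)     # 2J-dimensional error vector, (22)
    Phi = np.vstack(jacobians)     # 2J x 6 stacked Jacobian, (23)
    # pinv is the Moore-Penrose pseudo-inverse; with full column rank it
    # equals (Phi^T Phi)^-1 Phi^T, as noted above
    return np.linalg.pinv(Phi) @ U
```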
At step S4120, the calibration information computing unit 380 uses the correction value $\Delta s$ computed at step S4110 to correct, by expression (25), the state vector s representing the position and orientation of the mark coordinate system in the first-person camera coordinate system, and takes the obtained value as the new s.
$s + \Delta s \rightarrow s$    (25)
At step S4130, the calibration information computing unit 380 judges whether the computation has converged, using some criterion, for example whether the error vector U is smaller than a predetermined threshold, or whether the correction value $\Delta s$ is smaller than a predetermined threshold. If the judgment is that convergence has not been obtained, the processing from step S4080 is repeated using the corrected state vector s.
When the judgment at step S4130 is that the computation has converged, at step S4140 the calibration information computing unit 380 outputs the obtained state vector s as the position and orientation of the mark coordinate system in the first-person camera coordinate system. The output format at this time may be s itself, or an arrangement may be made wherein the position component of s is represented as a three-valued vector and the orientation component as Euler angles or a rotation matrix, or it may be the coordinate transformation matrix $M_{CM}$ generated from s.
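Putting steps S4080 through S4140 together, the whole computation reduces to a short iteration. The following sketch assumes hypothetical helpers `residual` and `jacobian` computing expressions (20) and (21) for one data set, together with the `correction` function of the previous sketch:

```python
import numpy as np

def calibrate(s0, data_list, max_iter=100, tol=1e-6):
    """Iterate steps S4080-S4130 until the correction becomes negligible."""
    s = np.asarray(s0, dtype=float)
    for _ in range(max_iter):
        errors = [residual(s, d) for d in data_list]     # S4080-S4090, (20)
        jacobians = [jacobian(s, d) for d in data_list]  # S4100, (21)
        ds = correction(errors, jacobians)               # S4110, (22)-(24)
        s = s + ds                                       # S4120, (25)
        if np.linalg.norm(ds) < tol:                     # S4130 convergence test
            break
    return s                                             # S4140: output result
```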
Finally, at step S2150, judgment is made regarding whether or not to end the calibration processing. If the operator instructs the marker calibration device 300 to end the calibration processing, the processing ends; if the operator instructs that the calibration processing is to continue (i.e., recalibration), the flow returns to step S2010 and awaits input of a data acquisition command.
Thus, the placement of the markers provided on the imaging device with respect to the imaging device can be obtained easily and accurately.
<Modification 2-1>
Though the present embodiment has been described as outputting the position and orientation of the mark coordinate system in the first-person camera coordinate system as the calibration information, an arrangement may be made wherein the position of each bird's-eye-view marker $P_k$ in the first-person camera coordinate system is also computed and output as calibration information. In this case, the position of each bird's-eye-view marker $P_k$ in the first-person camera coordinate system can be obtained as the product of the coordinate transformation matrix $M_{CM}$ obtained from s and the known position $x_M^{P_k}$ of the marker in the mark coordinate system.
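A sketch of this modification, reusing the hypothetical m_cm helper from the earlier sketch:

```python
import numpy as np

def marker_positions_in_camera(s, marks_in_mark_coords):
    """Position of each marker in the first-person camera coordinate system."""
    M_CM = m_cm(s)  # hypothetical helper from the earlier sketch
    return [(M_CM @ np.append(x_M, 1.0))[:3] for x_M in marks_in_mark_coords]
```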
<Modification 2-2>
Though the present embodiment has been described such that the position and orientation computing unit 120 computes the position and orientation of the imaging device 130 from the correlation between the image coordinates $u^{Q_{k_n}}$ of each first-person marker $Q_{k_n}$ detected in the image taken by the imaging device 130 and their world coordinates $x_W^{Q_{k_n}}$ held in advance as known information, as with <Modification 1-2> of the first embodiment, other information may be used for detecting the position and orientation from the image, and other sensors may be used as well.
<Modification 2-3>
Though the present embodiment computes the correction value of the state vector with the steepest-descent-type method shown in expression (24), the correction value does not necessarily have to be computed by the steepest descent method. For example, it may be obtained with the LM method (Levenberg-Marquardt method), a known iterative method for solving nonlinear equations; a robust estimation technique such as M-estimation, a statistical method, may be combined therewith; or any other numerical computation method may be applied, without departing from the essence of the present invention.
Also, though it has been described in the present embodiment that a suitable initial value is given for the position and orientation of the mark coordinate system in the first-person camera coordinate system, and the optimal value of s for all the input data is then obtained by iterative computation using image Jacobians, the position and orientation of the mark coordinate system in the first-person camera coordinate system may be obtained with a simpler computation method. For example, following the procedure described regarding step S4070, the position and orientation of the mark coordinate system in the first-person camera coordinate system may be obtained using only the coordinates of the multiple markers detected on a single bird's-eye-view image, and output as the calibration information.
<Modification 2-4>
Furthermore, though the present embodiment has been described using square markers as the bird's-eye-view markers, the type of marker is not restricted, as long as the position of each marker in the mark coordinate system is known within the marker group used. For example, a group of multiple circular markers such as those used in the first embodiment may be employed, or multiple types of markers may coexist. Moreover, even in the case of multiple mark coordinate systems, calibration can be performed by similar processing, either by carrying out the above-described processing for each mark coordinate system separately, or by carrying out the above-described processing for the mark coordinate systems concurrently.
<Modification 2-5>
In the present embodiment described above, the object is an imaging device, in which case the image information captured by the imaging device itself can be used for measuring the position and orientation of the object in the world coordinate system. If the object is an arbitrary object, a 6DOF sensor such as a magnetic sensor can be used for measuring its position and orientation. The configuration in this case is the same as the configuration in <Modification 1-3> of the first embodiment shown in Fig. 7. Examples of 6DOF sensors include magnetic position-and-orientation sensors (e.g., FASTRAK manufactured by Polhemus of the U.S., and Flock of Birds manufactured by Ascension Technology Corporation of the U.S.), ultrasonic sensors, and the like, but the present invention is not restricted to these; any other 6DOF sensor may be used.
<Modification 2-6>
In the present embodiment described here, the camera photographing the object is a bird's-eye-view camera set at a fixed position; however, if the position and orientation of the camera can be measured in the world coordinate system, the bird's-eye-view camera does not need to be fixed, the same as with <Modification 1-4> of the first embodiment.
In this case, the position and orientation of the bird's-eye-view camera measured by such a method are acquired at the same time as the position and orientation of the imaging device at step S2020, and stored in the data list at step S4040. Then, at step S4080, the value computed from the position and orientation held in the data list is used as $M_{WB}$ (representing the position and orientation of the bird's-eye-view camera in world coordinates) in expression (16), rather than a fixed value held in advance as a known value.
Third Embodiment
In the present embodiment, as with the second embodiment, if the multiple markers provided on the imaging device have a single mark coordinate system, and the coordinates of each marker in the mark coordinate system are known, the marker calibration device obtains the position and orientation of the mark coordinate system with respect to the imaging device, or the position of each marker with respect to the imaging device. The marker calibration device and marker calibration method according to the present embodiment are described below.
Fig. 5 is a schematic diagram illustrating the configuration of the marker calibration device according to the present embodiment. Note that parts which are the same as those shown in Fig. 3 are denoted with the same reference numerals and symbols, and detailed description thereof is not repeated here. As shown in Fig. 5, the marker calibration device 500 comprises a first-person marker detecting unit 110, a bird's-eye-view camera 140, a bird's-eye-view marker detecting unit 150, an instruction unit 160, a position and orientation computing unit 520, a data management unit 570, and a calibration information computing unit 580, and is connected to the imaging device 130 to be measured. The following description should be understood such that the user of the marker calibration device according to the present embodiment sets one or more markers $P_k$ to be measured (where k = 1 to $K_2$), the same as in the second embodiment. The present embodiment is also the same as the second embodiment in that first-person markers $Q_k$ are set, and in the operations of the first-person marker detecting unit 110, the bird's-eye-view camera 140, the bird's-eye-view marker detecting unit 150, and the instruction unit 160; accordingly, detailed description thereof will not be repeated. The present embodiment differs from the second embodiment in that the first-person marker detecting unit 110 outputs its detection results to the data management unit 570 in response to requests from the data management unit 570, and the bird's-eye-view marker detecting unit 150 outputs its detection results to the position and orientation computing unit 520. Also, while the second embodiment uses the detection results of the first-person markers to obtain the position and orientation of the imaging device 130, meaning that all markers required for computing the position and orientation of the imaging device 130 must be detected from a single first-person image, in the present embodiment only one or more markers need to be detected from a single first-person image.
The position and orientation computing unit 520 inputs, from the bird's-eye-view marker detecting unit 150, the image coordinates $u^{P_{k_m}}$ and identifiers of the bird's-eye-view markers $P_{k_m}$, and computes the position and orientation of the mark coordinate system in the bird's-eye-view camera coordinate system from the correlation with the three-dimensional coordinates $x_M^{P_{k_m}}$ of each marker in the mark coordinate system held in advance as known information. This computation method is the same as the processing described for step S4070 in the second embodiment. The computed position and orientation are output to the data management unit 570 in response to requests from the data management unit 570. In the following description, the position and orientation of the mark coordinate system in the bird's-eye-view camera coordinate system are obtained as a 4 × 4 homogeneous coordinate matrix $M_{BM}$ (a matrix for converting coordinates in the mark coordinate system into coordinates in the bird's-eye-view camera coordinate system).
When the data management unit 570 receives a "data acquisition" instruction from the instruction unit 160, the image coordinates of the first-person markers and their identifiers are input from the first-person marker detecting unit 110, the position and orientation of the mark coordinate system in the bird's-eye-view camera coordinate system are input from the position and orientation computing unit 520, and a data set [position and orientation of the mark coordinate system in the bird's-eye-view camera coordinate system; image coordinates of the first-person marker; identifier of the first-person marker] is created for each input first-person marker, added to a single data list, and held. The data management unit 570 outputs the generated data list to the calibration information computing unit 580 in response to requests from the calibration information computing unit 580.
When the calibration information computing unit 580 receives a "calibration information computation" instruction from the instruction unit 160, the data list is input from the data management unit 570, calibration processing is performed based thereupon, and the calibration information obtained as the result thereof (that is, the position and orientation of the mark coordinate system in the first-person camera coordinate system) is output.
Fig. 6 is a flowchart illustrating the processing performed when the calibration device according to the present embodiment obtains calibration information. Program code following this flowchart is stored in memory within the device according to the present embodiment, e.g., in RAM, ROM, or similar memory (not shown), and is read out and executed by a CPU (not shown).
At step S6010, the instruction unit 160 judges whether or not the operator has input a data acquisition command. The operator inputs the data acquisition command with the imaging device 130 placed at a position for acquiring marker calibration data. If a data acquisition command has been input, the instruction unit 160 advances to step S6020; if a data acquisition command has not been input, the instruction unit 160 advances to step S6050.
At step S6020, the data management unit 570 inputs, from the position and orientation computing unit 520, the position and orientation $M_{BM}$ of the mark coordinate system in the bird's-eye-view camera coordinate system.
At step S6030, the data management unit 570 inputs, from the first-person marker detecting unit 110, the image coordinates $u^{Q_{k_n}}$ and identifier $k_n$ of each first-person marker $Q_{k_n}$ detected by the first-person marker detecting unit 110.
At step S6040, the data management unit 570 adds each input first-person marker $Q_{k_n}$ to the data list L as data $D_j$. Specifically, with the $M_{BM}$ input from the position and orientation computing unit 520 as $M_{BM_j}$, the $k_n$ input from the first-person marker detecting unit 110 as $k_j$, and the $u^{Q_{k_n}}$ input from the first-person marker detecting unit 110 as $u_j^{Q_{k_j}}$, the data set $D_j = [M_{BM_j}, u_j^{Q_{k_j}}, k_j]$ is registered in the data list L as the j-th data. Note that j (j = 1 to J) is an index for each data set registered in the data list, and J represents the total number of registered data sets. Data is thus acquired.
At step S6050, judgment is made with the data list obtained by the data management unit 570 regarding whether or not the data list suffices for computing calibration information. If the data list is found not to satisfy the conditions, the flow returns to step S6010 and awaits input of a data acquisition command. On the other hand, if the data list is found to satisfy the calibration information computation conditions, the flow advances to step S6060. An example of a condition for the data list to satisfy for calibration information computation is that the data list L include data regarding three or more different first-person markers $Q_k$. However, the greater the diversity of the input data, the higher the accuracy of the resulting calibration information, so an arrangement may be made wherein the conditions are set so as to request a greater amount of data.
Next, at step S6060, judgment is made regarding whether or not the operator has input a calibration information computation command. If a calibration information computation command has been input, the processing advances to step S6070; if a calibration information computation command has not been input, the flow returns to step S6010 and awaits input of a data acquisition command.
The calibration information to be obtained by the calibration information computing unit 580, namely the position and orientation of the mark coordinate system in the first-person camera coordinate system, is handled as a six-valued vector $[x\ y\ z\ \xi\ \psi\ \zeta]^T$. In the following description, this unknown parameter is described as the state vector $s = [x\ y\ z\ \xi\ \psi\ \zeta]^T$.
At step S6070, as with the second embodiment, the calibration information computing unit 580 gives the state vector s a suitable initial value. For the initial value, the operator manually inputs a typical value using, for example, the instruction unit 160.
At step S6080, from each data set $D_j$ (where j = 1, 2, through J) in the data list L, the calibration information computing unit 580 computes for all j the theoretical value $u_j^{Q_{k_j}\prime} = [u_{xj}^{Q_{k_j}\prime}\ u_{yj}^{Q_{k_j}\prime}]$ of the first-person image coordinates of each first-person marker $Q_{k_j}$. Here, the theoretical value of the first-person image coordinates of a first-person marker means the position (coordinates) in the first-person image at which the first-person marker $Q_{k_j}$, whose position $x_W^{Q_{k_j}}$ in the world coordinate system is known, should be visible when the position and orientation of the mark coordinate system in the bird's-eye-view camera coordinate system is $M_{BM_j}$ and the position and orientation of the mark coordinate system in the first-person camera coordinate system is s. $u_j^{Q_{k_j}\prime}$ is computed by the function shown in expression (26).
$u_j^{Q_{k_j}\prime} = F_j(s)$    (26)
Specifically, the function $F_j(\cdot)$ is configured as shown in expression (27).
$x_{C_j}^{Q_{k_j}} = \begin{bmatrix} x_{C_j}^{Q_{k_j}} & y_{C_j}^{Q_{k_j}} & z_{C_j}^{Q_{k_j}} & 1 \end{bmatrix}^T = M_{CM}(s) \cdot M_{BM_j}^{-1} \cdot M_{WB}^{-1} \cdot x_W^{Q_{k_j}}$    (27)
Expression (27) obtains from s the position $x_{C_j}^{Q_{k_j}}$ in first-person camera coordinates at the point where the j-th data set was obtained (that is, at the point where the position and orientation of the mark coordinate system in the bird's-eye-view camera coordinate system was $M_{BM_j}$). Expression (28) is used to obtain from $x_{C_j}^{Q_{k_j}}$ the coordinates $u_j^{Q_{k_j}\prime}$ of the first-person marker $Q_{k_j}$ on the first-person camera image.
$u_j^{Q_{k_j}\prime} = \begin{bmatrix} u_{xj}^{Q_{k_j}\prime} & u_{yj}^{Q_{k_j}\prime} \end{bmatrix}^T = \begin{bmatrix} -f_x^C \dfrac{x_{C_j}^{Q_{k_j}}}{z_{C_j}^{Q_{k_j}}} & -f_y^C \dfrac{y_{C_j}^{Q_{k_j}}}{z_{C_j}^{Q_{k_j}}} \end{bmatrix}^T$    (28)
Note that $f_x^C$ and $f_y^C$ denote the focal lengths of the imaging device 130 in the X-axis and Y-axis directions, respectively, and are understood to be known values held in advance. $M_{CM}(s)$ is the modeling transformation matrix determined by s (a matrix for converting coordinates in the mark coordinate system into coordinates in the first-person camera coordinate system), defined as in expression (18). Note that the matrix product $(M_{CM}(s) \cdot M_{BM_j}^{-1} \cdot M_{WB}^{-1})$ on the right side of expression (27) represents the inverse $(M_{WC}^{-1})$ of the position and orientation of the imaging device in the world coordinate system, determined by s and $M_{BM_j}$.
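A sketch of expressions (27)-(28), again reusing the hypothetical m_cm helper from the earlier sketch; the matrices $M_{WB}$ and $M_{BM_j}$, the world point, and the focal lengths are inputs:

```python
import numpy as np

def project_first_person(s, M_WB, M_BMj, x_W, fxC, fyC):
    """Theoretical first-person image coordinates of a marker at world point x_W."""
    xC = m_cm(s) @ np.linalg.inv(M_BMj) @ np.linalg.inv(M_WB) @ np.append(x_W, 1.0)  # (27)
    return np.array([-fxC * xC[0] / xC[2], -fyC * xC[1] / xC[2]])                    # (28)
```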
At step S6090, for all j, the calibration information computing unit 580 uses expression (29) to compute the error $\Delta u_j^{Q_{k_j}}$ between the actual image coordinates $u_j^{Q_{k_j}}$ of the first-person marker $Q_{k_j}$ included in each data set in the data list L and the corresponding theoretical image coordinates $u_j^{Q_{k_j}\prime}$.
$\Delta u_j^{Q_{k_j}} = u_j^{Q_{k_j}} - u_j^{Q_{k_j}\prime}$    (29)
At step S6100, for all j, the calibration information computing unit 580 computes the image Jacobian matrix $J_{u_j s}^{Q_{k_j}} (= \partial u_j^{Q_{k_j}} / \partial s)$ with respect to the state vector s (that is, a 2-row × 6-column Jacobian matrix whose components are the partial derivatives of the function $F_j(\cdot)$ of expression (26) with respect to the components of the state vector s). Specifically, a 2-row × 3-column Jacobian matrix $J_{u_j x_C}^{Q_{k_j}} (= \partial u_j^{Q_{k_j}} / \partial x_{C_j}^{Q_{k_j}})$, whose components are the partial derivatives of the right side of expression (28) with respect to the components of the position vector $x_{C_j}^{Q_{k_j}}$, and a 3-row × 6-column Jacobian matrix $J_{x_C s}^{Q_{k_j}} (= \partial x_{C_j}^{Q_{k_j}} / \partial s)$, whose components are the partial derivatives of the right side of expression (27) with respect to the components of the state vector s, are computed, and $J_{u_j s}^{Q_{k_j}}$ is computed using expression (30).
$J_{u_j s}^{Q_{k_j}} = J_{u_j x_C}^{Q_{k_j}} \cdot J_{x_C s}^{Q_{k_j}}$    (30)
At step S6110, the calibration information computing unit 580 computes the correction value $\Delta s$ from the errors $\Delta u_j^{Q_{k_j}}$ and the Jacobian matrices $J_{u_j s}^{Q_{k_j}}$ computed above for all j. Specifically, an error vector U is created as a 2J-dimensional vector by vertically stacking the errors $\Delta u_j^{Q_{k_j}}$ for all j, a matrix $\Phi$ of 2J rows × 6 columns is created by vertically stacking the Jacobian matrices $J_{u_j s}^{Q_{k_j}}$, and $\Delta s$ is computed with expression (24) using the pseudo-inverse matrix $\Phi^+$ of $\Phi$. Now, $\Delta s$ is a six-dimensional vector, so $\Delta s$ can be obtained as long as 2J is 6 or greater, that is, as long as J is 3 or greater. Note that $\Phi^+$ can be obtained by $\Phi^+ = (\Phi^T \Phi)^{-1} \Phi^T$ or by other methods.
At step S6120, the calibration information computing unit 580 uses the correction value $\Delta s$ computed at step S6110 to correct, by expression (25), the state vector s representing the position and orientation of the mark coordinate system in the first-person camera coordinate system, and takes the obtained value as the new s.
$s + \Delta s \rightarrow s$    (25)
At step S6130, the calibration information computing unit 580 judges whether the computation has converged, using some criterion, for example whether the error vector U is smaller than a predetermined threshold, or whether the correction value $\Delta s$ is smaller than a predetermined threshold. If convergence has not been obtained, the corrected state vector s is used to repeat the processing from step S6080.
When the judgment at step S6130 is that the computation has converged, the calibration information computing unit 580 outputs, at step S6140, the obtained state vector s as the parameter representing the position and orientation of the mark coordinate system in the first-person camera coordinate system. The output format at this time may be s itself, or an arrangement may be made wherein the position component of s is represented as a three-valued vector and the orientation component as Euler angles or a rotation matrix, or it may be the coordinate transformation matrix $M_{CM}$ generated from s.
Finally, at step S6150, judgment is made regarding whether or not to end the calibration processing. If the operator instructs the marker calibration device 500 to end the calibration processing, the processing ends; if the operator instructs that the calibration processing is to continue (i.e., recalibration), the flow returns to step S6010 and awaits input of a data acquisition command.
Thus, the position, or position and orientation, of the markers provided on the imaging device with respect to the imaging device can be obtained easily and accurately.
<Modification 3-1>
Now, though the present embodiment has been described as outputting the position and orientation of the mark coordinate system in the first-person camera coordinate system as calibration information, an arrangement may be made wherein the position of each bird's-eye-view marker $P_k$ in the first-person camera coordinate system is computed and output as calibration information.
<Modification 3-2>
Though the present embodiment computes the correction value of the state vector with the steepest-descent-type method shown in expression (24), the correction value does not necessarily have to be computed by the steepest descent method. For example, it may be obtained with the LM method (Levenberg-Marquardt method), a known iterative method for solving nonlinear equations; a robust estimation technique such as M-estimation, a statistical method, may be combined therewith; or any other numerical computation method may be applied, without departing from the essence of the present invention.
<Modification 3-3>
Furthermore, though square markers are used as the bird's-eye-view markers in the present embodiment, the type of marker is not restricted, as long as the marker group is such that the position of each marker in the mark coordinate system is known. For example, a group of multiple circular markers such as those used in the first embodiment may be employed, or multiple types of markers may coexist. Moreover, even in the case of multiple mark coordinate systems, similar calibration can be performed, either by carrying out the above-described processing for each mark coordinate system separately, or by carrying out the above-described processing for the mark coordinate systems concurrently.
<Modification 3-4>
Though the present embodiment uses, as the first-person markers, markers which generate two-dimensional coordinates on an image, line features or other geometric features may be used as the reference features for evaluation. For example, if line features are used, an arrangement may be made wherein the correction value is computed under the same framework as the above embodiments by the following procedure: using the distance from the line to the origin as the reference for error evaluation, an error vector U is created from the errors Δd between the detected values d from the image and the estimated values d' from the state vector s, and a matrix $\Phi$ is created from 1-row × 6-column Jacobian matrices $J_{ds} (= \partial d / \partial s)$, whose components are the partial derivatives of the calculation expression of d' with respect to the components of the state vector s. Note that the calculation expression of d' is disclosed in D. G. Lowe: Fitting parameterized three-dimensional models to images, IEEE Transactions on PAMI, Vol. 13, No. 5, pp. 441-450, 1991, and in Fujii, Kanbara, Iwasa, Takemura, Yokoya: Positioning by stereo camera with concomitant use of gyro sensor for expanded reality, Institute of Electronics, Information and Communication Engineers PRMU 99-192 (Technical Report of IEICE, Vol. 99, No. 574, pp. 1-8), 1999, and the position and orientation of the imaging device are readily obtained, since they are obtained as a function of s (the product of the three matrices on the right side of expression (27)). Such features can be used in a coexisting manner by stacking the errors and image Jacobians obtained from line features, point features, and other markers together.
Fourth Embodiment
In the marker calibration device according to the present embodiment, calibration of multiple markers provided on an object is performed by photographing the object with a movable imaging device, using 6DOF position-and-orientation sensors. In the first and second embodiments, only the positions of the markers, or the positions and orientations of the markers, were treated as unknown parameters; in the present embodiment, the position and orientation of the camera with respect to the object are also treated as unknown parameters, in addition to the positions, or positions and orientations, of the markers.
In the present embodiment, a marker is made up of three or more points not on the same straight line. The relative positions of the multiple points making up a marker are assumed to be known; that is, the positions of the points making up each marker are known in the mark coordinate system defined for that marker. In the present embodiment, if multiple markers are provided on the object, the position and orientation of the mark coordinate system of each marker with respect to the coordinate system defined with the object as a reference, that is, the object coordinate system, are obtained as the calibration information.
Fig. 8 is a schematic diagram illustrating an exemplary configuration of the marker calibration device 1100 according to the present embodiment. As shown in Fig. 8, the marker calibration device 1100 comprises a marker detecting unit 1020, a position and orientation computing unit 1030, an instruction unit 1050, a data management unit 1040, and a calibration information computing unit 1060, and is connected to the object 1000 on which the markers are provided.
A 6DOF sensor A2 is mounted on the object 1000, so as to measure the position and orientation of the object with respect to the world coordinate system. The 6DOF sensor A2 is connected to the position and orientation computing unit 1030. Likewise, a 6DOF sensor A1 is mounted on the imaging unit 1010, so as to measure the position and orientation of the camera making up the imaging unit 1010 with respect to the world coordinate system. The 6DOF sensor A1 is connected to the position and orientation computing unit 1030.
Now, multiple markers according to the above definition are provided on the object 1000. The markers on the object 1000 are denoted by $P_k$ (where k = 1 to $K_O$), with $K_O$ being the number of markers on the object 1000 to be calibrated. Each marker $P_k$ is made up of points $p_{ki}$ (k = 1 to $K_O$, i = 1 to $N_k$), where $N_k$ represents the total number of points making up the marker $P_k$.
As described above, the positions of the points making up the markers on the object are known in the respective mark coordinate systems, but the positions and orientations thereof in the object coordinate system (a coordinate system wherein one point on the object is defined as the origin, and three orthogonal axes are defined as the X-axis, Y-axis, and Z-axis, respectively) are unknown. Accordingly, three or more reference markers, not on the same straight line and with known positions in the object coordinate system, are provided on the object as points defining the object coordinate system. All of the reference markers need to be observed by the imaging unit 1010 acquiring the marker calibration data within the same image as the other markers to be calibrated. The markers $P_k$ may be of any form, as long as the image coordinates of their projected images on the photographed image are detectable, and as long as the markers are distinguishable from one another and the points making up each marker are distinguishable from one another.
The imaging unit 1010 photographs the markers provided on the object from various positions and orientations, and the photographed images are input to the marker detecting unit 1020.
The marker detecting unit 1020 inputs the photographed images from the imaging unit 1010, and detects the image coordinates of the points making up the markers $P_k$ captured in the input images. The marker detecting unit 1020 further outputs the image coordinates $u^{P_{k_n}i}$ of the points $p_{k_n i}$ making up the detected markers $P_{k_n}$, along with their identifiers, to the data management unit 1040. Here, n (n = 1 to M) is an index for each detected marker, with M representing the total number of detected markers. In the example shown in Fig. 8, square markers having the identifiers 1, 2, and 3 are photographed, so M = 3, the identifiers are $k_1 = 1$, $k_2 = 2$, $k_3 = 3$, and the corresponding image coordinates $u^{P_{k_1}i}$, $u^{P_{k_2}i}$, and $u^{P_{k_3}i}$ (where i = 1, 2, 3, 4) are output.
As with the first embodiment, the position and orientation computing unit 1030 computes the position and orientation of the camera in the object coordinate system from the positions and orientations of the camera and the object in the world coordinate system, obtained from the 6DOF sensor A1 mounted on the camera and the 6DOF sensor A2 mounted on the object. The position and orientation of the camera in the object coordinate system thus computed are output to the data management unit 1040 in response to requests from the data management unit 1040.
Also, the position and orientation computing unit 1030 internally represents the position and orientation of the camera in the object coordinate system as three-valued vectors $[x\ y\ z]^T$ and $[\xi\ \psi\ \zeta]^T$, respectively. There are various methods for representing orientation with a three-valued vector; here, a representation is used wherein the magnitude of the vector defines the rotation angle and the direction of the vector defines the direction of the rotation axis. The position and orientation together are expressed as the six-dimensional vector $[x\ y\ z\ \xi\ \psi\ \zeta]^T$.
If the operator (not shown) inputs a data acquisition command, the instruction unit 1050 transmits a "data acquisition" instruction to the data management unit 1040; if a calibration information computation command is input, the instruction unit 1050 transmits a "calibration information computation" instruction to the calibration information computing unit 1060.
When the data management unit 1040 receives the "data acquisition" instruction from the instruction unit 1050, the position and orientation of the camera in the object coordinate system are input from the position and orientation computing unit 1030, the image coordinates of the markers and their identifiers are input from the marker detecting unit 1020, and data sets [position and orientation of the camera in the object coordinate system; image coordinates of the marker; identifier of the marker] are added to a single data list and held. Now, the position and orientation of the camera in the object coordinate system input from the position and orientation computing unit 1030 are the data obtained at the time of taking the image from which the image coordinates of the markers input from the marker detecting unit 1020 were detected. The data management unit 1040 outputs the generated data list to the calibration information computing unit 1060 in response to requests from the calibration information computing unit 1060.
When the calibration information computing unit 1060 receives the "calibration information computation" instruction from the instruction unit 1050, the data list is input from the data management unit 1040, calibration processing is performed based thereupon, and the calibration information obtained as the result thereof (that is, the position and orientation of the mark coordinate systems in the object coordinate system) is output.
Fig. 9 is a flowchart illustrating the processing performed when the calibration device according to the present embodiment obtains calibration information. Program code following this flowchart is stored in memory within the device according to the present embodiment, e.g., in RAM, ROM, or similar memory (not shown), and is read out and executed by a CPU (not shown).
At step S9010, the instruction unit 1050 judges whether or not the operator has input a data acquisition command. The operator inputs the data acquisition command with the object placed in position, so as to acquire calibration data. If a data acquisition command has been input, the instruction unit 1050 advances to step S9020.
At step S9020, the data management unit 1040 inputs, from the position and orientation computing unit 1030, the position and orientation $t_{OC} = [x_{OC}\ y_{OC}\ z_{OC}\ \xi_{OC}\ \psi_{OC}\ \zeta_{OC}]^T$ of the camera in the object coordinate system. The position and orientation computing unit 1030 continuously computes $t_{OC}$: using the transformation matrix $M_{WC}(s_{A1})$ representing the position and orientation of the camera in the world coordinate system, obtained from the measurement result $s_{A1} = [x_{A1}\ y_{A1}\ z_{A1}\ \xi_{A1}\ \psi_{A1}\ \zeta_{A1}]^T$ of the 6DOF sensor A1 mounted on the camera, and the transformation matrix $M_{WO}(s_{A2})$ representing the position and orientation of the object in the world coordinate system, obtained from the measurement result $s_{A2} = [x_{A2}\ y_{A2}\ z_{A2}\ \xi_{A2}\ \psi_{A2}\ \zeta_{A2}]^T$ of the 6DOF sensor A2 mounted on the object, the transformation matrix $M_{OC}$ representing the position and orientation of the camera in the object coordinate system is expressed by expression (31).
$M_{OC} = M_{WO}^{-1}(s_{A2}) \cdot M_{WC}(s_{A1})$    (31)
The translation component and rotation component of the matrix $M_{OC}$ obtained from expression (31) are separated, generating the position and orientation $t_{OC} = [x_{OC}\ y_{OC}\ z_{OC}\ \xi_{OC}\ \psi_{OC}\ \zeta_{OC}]^T$ of the imaging device in the object coordinate system.
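The computation of expression (31) and the separation into $t_{OC}$ can be sketched as follows, assuming a hypothetical pose_matrix helper that converts a 6DOF sensor reading into a 4 × 4 matrix (the same Rodrigues construction as in expression (18)); the rotation vector is recovered here by the standard angle-axis log map:

```python
import numpy as np

def camera_in_object(s_A1, s_A2):
    """t_OC from the two 6DOF sensor readings, per expression (31)."""
    M_WC = pose_matrix(s_A1)           # camera in world, from sensor A1 (hypothetical helper)
    M_WO = pose_matrix(s_A2)           # object in world, from sensor A2
    M_OC = np.linalg.inv(M_WO) @ M_WC  # expression (31)
    t = M_OC[:3, 3]                    # translation component
    R = M_OC[:3, :3]                   # rotation component
    # angle-axis log map: magnitude = rotation angle, direction = rotation axis
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        w = np.zeros(3)
    else:
        w = theta / (2.0 * np.sin(theta)) * np.array(
            [R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return np.concatenate([t, w])      # t_OC = [x y z xi psi zeta]
```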
At step S9030, the data management unit 1040 inputs, from the marker detecting unit 1020, the image coordinate group $u^{P_{k_n}i}$ of the points making up the markers $P_{k_n}$ detected by the marker detecting unit 1020, along with the identifiers $k_n$. The marker detecting unit 1020 continuously performs marker detection processing on the input images, so the processing of this step can obtain the image coordinates of the markers for the case where the position and orientation of the imaging device are $t_{OC}$. Note that the information input from the marker detecting unit 1020 does not have to be information regarding all of the markers; information regarding the markers detected on the image at that point in time suffices.
Next, at step S9040, the data management unit 1040 adds the input data sets $D_L$ to the data list DL for all detected markers $P_{k_n}$. Specifically, with the $t_{OC}$ input from the position and orientation computing unit 1030 as $t_{OC_j} = [x_{OC_j}\ y_{OC_j}\ z_{OC_j}\ \xi_{OC_j}\ \psi_{OC_j}\ \zeta_{OC_j}]^T$, the identifier $k_n$ input from the marker detecting unit 1020 as $k_{nj}$, and the $u^{P_{k_n}i}$ also input from the marker detecting unit 1020 as $u_L^{P_{k_{nj}}i}$, the data set $[t_{OC_j}, u_L^{P_{k_{nj}}i}, k_{nj}]$ is registered in the data list DL as the L-th data. Note that j (j = 1 to $N_J$) is an index for each photographed image, L (L = 1 to $N_L$) is an index for each data set registered in the data list DL, $N_J$ represents the total number of photographed images, and $N_L$ represents the total number of registered data sets. Data is thus acquired.
At step S9050, judgment is made with the data list obtained by the data management unit 1040 regarding whether or not the data list suffices for computing calibration information. If the conditions are not satisfied, the flow returns to step S9010 and awaits input of a data acquisition command. On the other hand, if the calibration information computation conditions are satisfied, the flow advances to step S9060. An example of a condition for the data list to satisfy for calibration information computation is that the data list DL include data regarding three or more different reference markers. However, the greater the diversity of the input data, the higher the accuracy of the resulting calibration information, so an arrangement may be made wherein the conditions are set so as to request a greater amount of data.
Next, at step S9060, judgment is made regarding whether or not the operator has input a calibration information computation command. If a calibration information computation command has been input, the processing advances to step S9070; if a calibration information computation command has not been input, the flow returns to step S9010 and awaits input of a data acquisition command.
The calibration information to be obtained by the calibration information computing unit 1060, namely the position and orientation of each marker in the object coordinate system, is handled as a six-valued vector, the same as in the second embodiment. In the following description, this unknown parameter is described as the state vector $s_m = [x_m\ y_m\ z_m\ \xi_m\ \psi_m\ \zeta_m]^T$. In the present embodiment, $t_{OC_j}$ is also described as a state vector $s_{C_j} = [x_{OC_j}\ y_{OC_j}\ z_{OC_j}\ \xi_{OC_j}\ \psi_{OC_j}\ \zeta_{OC_j}]^T$. At step S9070, the calibration information computing unit 1060 gives the state vectors $s_m$ and $s_{C_j}$ suitable initial values. For the initial value of $s_{C_j}$, the $t_{OC_j}$ obtained from the sensor output is employed. For the initial value of $s_m$, the operator can manually input a typical value using the instruction unit 1050. Using the transformation matrix $M_{OC}$ from the camera coordinate system to the object coordinate system and the transformation matrix $M_{MO}$ from the object coordinate system to the mark coordinate system of a marker mounted on the object, the transformation matrix $M_{MC}$ from the camera coordinate system to the mark coordinate system is obtained by expression (32).
$M_{MC} = M_{MO} \cdot M_{OC}$    (32)
At step S9080, for all L, the calibration information computing unit 1060 computes, from each data set $D_L = [t_{OC_j}, u_L^{P_{k_{nj}}i}, k_{nj}]$ (where j = 1, 2 to $N_J$ and L = 1, 2 to $N_L$) in the data list DL and the state vectors $s_m$ and $s_{C_j}$, the theoretical value $u_L^{P_{k_{nj}}i\prime} = [u_{xL}^{P_{k_{nj}}i\prime}\ u_{yL}^{P_{k_{nj}}i\prime}]$ of the image coordinates of the points of the marker $P_{k_{nj}}$. Here, the theoretical value of the image coordinates of a marker means the position (coordinate) data in the image at which the marker $P_{k_{nj}}$, whose point position vectors $h_M^{P_{k_{nj}}i}$ in the mark coordinate system are known, should be visible when the position and orientation of the mark coordinate system in the object coordinate system is $s_m$. $u_L^{P_{k_{nj}}i\prime}$ is computed by the function shown in expression (33).
$u_L^{P_{k_{nj}}i\prime} = F_L(s)$    (33)
Here, the state vector $s = [s_m\ s_{C_j}]$ combines the position and orientation of the mark coordinate system in the object coordinate system with the position and orientation of the camera in the object coordinate system.
Specifically, the function is configured as shown in expression (34).
$x_{C_L}^{P_{k_{nj}}i} = \begin{bmatrix} x_{C_L}^{P_{k_{nj}}i} & y_{C_L}^{P_{k_{nj}}i} & z_{C_L}^{P_{k_{nj}}i} & 1 \end{bmatrix}^T = M_{OC}^{-1}(s_{C_j}) \cdot M_{OM}(s_m^{P_{k_{nj}}}) \cdot h_M^{P_{k_{nj}}i}$    (34)
Expression (34) is used to obtain from $s_m$ and $s_{C_j}$ the position vector $x_{C_L}^{P_{k_{nj}}i}$ in camera coordinates of the point of the marker $P_{k_{nj}}$, at the point where the L-th data set was obtained (that is, at the point where the position and orientation of the imaging unit 1010 were $t_{OC_j}$).
$u_L^{P_{k_{nj}}i\prime} = \begin{bmatrix} u_{xL}^{P_{k_{nj}}i\prime} & u_{yL}^{P_{k_{nj}}i\prime} \end{bmatrix}^T = \begin{bmatrix} -f_x \dfrac{x_{C_L}^{P_{k_{nj}}i}}{z_{C_L}^{P_{k_{nj}}i}} & -f_y \dfrac{y_{C_L}^{P_{k_{nj}}i}}{z_{C_L}^{P_{k_{nj}}i}} \end{bmatrix}^T$    (35)
Expression (35) is used to obtain from $x_{C_L}^{P_{k_{nj}}i}$ the coordinates $u_L^{P_{k_{nj}}i\prime}$ of the point of the marker $P_{k_{nj}}$ on the image. Note that $f_x$ and $f_y$ denote the focal lengths of the camera in the X-axis and Y-axis directions, respectively, and are understood to be known values held in advance. $M_{OM}(s_m^{P_{k_{nj}}})$ is the modeling transformation matrix determined by $s_m^{P_{k_{nj}}}$ (a matrix for converting coordinates in the mark coordinate system into coordinates in the object coordinate system), and $M_{OC}^{-1}(s_{C_j})$ is the transformation matrix determined by $s_{C_j}$ (a matrix for converting coordinates in the object coordinate system into coordinates in the camera coordinate system). $M_{OM}(s_m)$ is defined by expression (36).
$M_{OM}(s_m) = \begin{bmatrix} \frac{\xi_m^2}{\theta^2}(1-\cos\theta)+\cos\theta & \frac{\xi_m\psi_m}{\theta^2}(1-\cos\theta)-\frac{\zeta_m}{\theta}\sin\theta & \frac{\xi_m\zeta_m}{\theta^2}(1-\cos\theta)+\frac{\psi_m}{\theta}\sin\theta & x_m \\ \frac{\psi_m\xi_m}{\theta^2}(1-\cos\theta)+\frac{\zeta_m}{\theta}\sin\theta & \frac{\psi_m^2}{\theta^2}(1-\cos\theta)+\cos\theta & \frac{\psi_m\zeta_m}{\theta^2}(1-\cos\theta)-\frac{\xi_m}{\theta}\sin\theta & y_m \\ \frac{\zeta_m\xi_m}{\theta^2}(1-\cos\theta)-\frac{\psi_m}{\theta}\sin\theta & \frac{\zeta_m\psi_m}{\theta^2}(1-\cos\theta)+\frac{\xi_m}{\theta}\sin\theta & \frac{\zeta_m^2}{\theta^2}(1-\cos\theta)+\cos\theta & z_m \\ 0 & 0 & 0 & 1 \end{bmatrix}$    (36)
where
$\theta = \sqrt{\xi_m^2 + \psi_m^2 + \zeta_m^2}$    (37)
and
$M_{OC}^{-1}(s_{C_j}) = \begin{bmatrix} \frac{\xi_{A1}^2}{\theta^2}(1-\cos\theta)+\cos\theta & \frac{\xi_{A1}\psi_{A1}}{\theta^2}(1-\cos\theta)-\frac{\zeta_{A1}}{\theta}\sin\theta & \frac{\xi_{A1}\zeta_{A1}}{\theta^2}(1-\cos\theta)+\frac{\psi_{A1}}{\theta}\sin\theta & x_{A1} \\ \frac{\psi_{A1}\xi_{A1}}{\theta^2}(1-\cos\theta)+\frac{\zeta_{A1}}{\theta}\sin\theta & \frac{\psi_{A1}^2}{\theta^2}(1-\cos\theta)+\cos\theta & \frac{\psi_{A1}\zeta_{A1}}{\theta^2}(1-\cos\theta)-\frac{\xi_{A1}}{\theta}\sin\theta & y_{A1} \\ \frac{\zeta_{A1}\xi_{A1}}{\theta^2}(1-\cos\theta)-\frac{\psi_{A1}}{\theta}\sin\theta & \frac{\zeta_{A1}\psi_{A1}}{\theta^2}(1-\cos\theta)+\frac{\xi_{A1}}{\theta}\sin\theta & \frac{\zeta_{A1}^2}{\theta^2}(1-\cos\theta)+\cos\theta & z_{A1} \\ 0 & 0 & 0 & 1 \end{bmatrix}$    (38)
where
$\theta = \sqrt{\xi_{A1}^2 + \psi_{A1}^2 + \zeta_{A1}^2}$    (39)
At step S9090, for all L, the calibration information computing unit 1060 uses expression (40) to compute the error $\Delta u_L^{P_{k_{nj}}i}$ between the actual image coordinates $u_L^{P_{k_{nj}}i}$ of the marker $P_{k_{nj}}$ included in each data set in the data list DL and the corresponding theoretical image coordinates $u_L^{P_{k_{nj}}i\prime}$.
$\Delta u_L^{P_{k_{nj}}i} = u_L^{P_{k_{nj}}i} - u_L^{P_{k_{nj}}i\prime}$    (40)
At step S9100, for all L, the calibration information computing unit 1060 computes the image Jacobian matrix $J_{u_L s}^{P_{k_{nj}}i} (= \partial u_L^{P_{k_{nj}}i} / \partial s)$ with respect to the state vector $s = [s_m\ s_{C_j}]$ (that is, Jacobian matrices whose components are the partial derivatives of the function $F_L(\cdot)$ of expression (33) with respect to the components of the state vectors $s_m$ and $s_{C_j}$, which, stacked over all L, form a $(2 \times N_L)$-row × $(6 \times K_O + 6 \times N_J)$-column Jacobian matrix). Here, $N_L$ represents the total number of detected markers, $K_O$ is the total number of markers to be calibrated, and $N_J$ is the number of photographed images.
At step S9110, the calibration information computing unit 1060 computes the correction value $\Delta s$ of s from the errors $\Delta u_L^{P_{k_{nj}}i}$ and the Jacobian matrices $J_{u_L s}^{P_{k_{nj}}i}$ computed above for all L. Specifically, a $2N_L$-dimensional error vector is created by vertically stacking the errors $\Delta u_L^{P_{k_{nj}}i}$ for all L, as shown in expression (41).
$U = \begin{bmatrix} \Delta u_1 \\ \vdots \\ \Delta u_L^{P_{k_{nj}}i} \\ \vdots \\ \Delta u_{N_L} \end{bmatrix}$    (41)
Likewise, a $(2 \times N_L)$-row × $(6 \times K_O + 6 \times N_J)$-column matrix is created by vertically stacking the Jacobian matrices $J_{u_L s}^{P_{k_{nj}}i}$, as shown in expression (42).
$\Phi = \begin{bmatrix} J_{u_1 s} \\ \vdots \\ J_{u_L s}^{P_{k_{nj}}i} \\ \vdots \\ J_{u_{N_L} s} \end{bmatrix}$    (42)
$\Delta s$ is computed using the pseudo-inverse matrix $\Phi^+$ of $\Phi$, as shown in expression (43).
$\Delta s = \Phi^+ U$    (43)
Now, $\Delta s$ is a $(6 \times K_O + 6 \times N_J)$-dimensional vector, so $\Delta s$ can be obtained as long as $2N_L$ is equal to or greater than $(6 \times K_O + 6 \times N_J)$. Note that $\Phi^+$ can be obtained by $\Phi^+ = (\Phi^T \Phi)^{-1} \Phi^T$ or by other methods.
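The structure of the stacked system of expressions (41)-(43) can be sketched as follows; each data set contributes one 2-row block whose nonzero columns fall in the slot of the marker state $s_m$ it observes and the slot of the camera state $s_{C_j}$ of its image (the field names are hypothetical). The reference markers described earlier are what keep this joint system well-posed.

```python
import numpy as np

def build_and_solve(data_sets, n_marks, n_images):
    """Stack U and Phi per expressions (41)-(42) and solve (43)."""
    cols = 6 * n_marks + 6 * n_images
    U = np.zeros(2 * len(data_sets))
    Phi = np.zeros((2 * len(data_sets), cols))
    for L, d in enumerate(data_sets):
        U[2 * L:2 * L + 2] = d.error                                # (41)
        Phi[2 * L:2 * L + 2, 6 * d.mark:6 * d.mark + 6] = d.J_mark  # d(u)/d(s_m)
        off = 6 * n_marks + 6 * d.image
        Phi[2 * L:2 * L + 2, off:off + 6] = d.J_cam                 # d(u)/d(s_Cj)
    return np.linalg.pinv(Phi) @ U                                  # delta s, (43)
```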
At step S9120, the calibration information computing unit 1060 uses the correction value $\Delta s$ computed at step S9110 to update the state vector s by expression (25) (repeated here as expression (44)), and takes the obtained value as the new s. Here, the state vector $s = [s_m\ s_{C_j}]$ is made up of the state vector $s_m$ of the position and orientation of the marker in the object coordinate system and the state vectors $s_{C_j}$ of the position and orientation of the camera in the object coordinate system.
$s + \Delta s \rightarrow s$    (44)
At step S9130, the calibration information computing unit 1060 judges whether the computation has converged, using some criterion, for example whether the error vector U is smaller than a predetermined threshold, or whether the correction value $\Delta s$ is smaller than a predetermined threshold. If the judgment is that convergence has not been obtained, the corrected state vector s is used to repeat the processing from step S9080.
If the judgment at step S9130 is that the computation has converged, then at step S9140 the calibration information computing unit 1060 outputs the $s_m$ portion of the obtained state vector s as the calibration information, that is, the position and orientation of the mark coordinate system in the object coordinate system. The output format at this time may be $s_m$ itself, or an arrangement may be made wherein the position component of $s_m$ is represented as a three-valued vector and the orientation component as Euler angles or a 3 × 3 rotation matrix, or it may be the coordinate transformation matrix $M_{OM}$ generated from $s_m$.
Finally, at step S9150, judgment is made regarding whether or not to end the calibration processing. If the operator instructs the marker calibration device 1100 to end the calibration processing, the processing ends; if the operator instructs that the calibration processing is to continue (i.e., recalibration), the flow returns to step S9010 and awaits input of a data acquisition command.
Thus, the positions, or positions and orientations, of the markers provided on the object with respect to the object (that is, in the object coordinate system) can be obtained easily and accurately.
<Modification 4-1>
Now, though the imaging unit 1010 has been described as freely movable, it may be fixed, as with the bird's-eye-view cameras in the first and second embodiments. In this case, the 6DOF sensor A1 is unnecessary, and the position and orientation computing unit 1030 computes the position and orientation of the imaging unit 1010 in the object coordinate system from the position and orientation of the imaging unit 1010 in the world coordinate system held beforehand as known values and the position and orientation of the object in the world coordinate system obtained from the 6DOF sensor A2. In the first and second embodiments, the position and orientation of the bird's-eye-view camera needed to be obtained accurately in advance as known values; in the present embodiment, however, the position and orientation of the camera in the object coordinate system output from the position and orientation computing unit 1030 are themselves corrected, which has the advantage that inputting a rough position and orientation suffices.
Other Embodiments
The present invention can be realized by supplying a storage medium (or recording medium) storing program code of software for executing the functions of the above embodiments to a system or device, with a computer (CPU or MPU (micro-processing unit)) of the system or device reading out and executing the program code stored in the storage medium. In this case, the program code itself read out from the storage medium realizes the functions of the above embodiments. The invention is not restricted to the case of the computer executing the read-out program code so as to realize the functions of the above embodiments; the invention also encompasses cases wherein an operating system or the like running on the computer performs part or all of the actual processing, so as to realize the functions of the above embodiments.
Furthermore, the present invention encompasses arrangements wherein the functions of the above embodiments are realized by the program code read out from the recording medium being written to memory included in a function expansion card inserted into the computer or a function expansion unit connected to the computer, following which a CPU or the like provided to the function expansion card or function expansion unit performs part or all of the actual processing, so as to realize the functions of the above embodiments.
In the event that the present invention is applied to the above storage medium, program code corresponding to the flowcharts described earlier is stored in the storage medium.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. On the contrary, the invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (20)

1. An information processing method for computing the position, with an object as a reference, of a marker provided on the object, the information processing method comprising:
an object position and orientation information acquisition step, of acquiring position and orientation information of the object in a predetermined coordinate system;
a first image input step, of inputting a first image of the object taken from a bird's-eye view by a bird's-eye-view imaging unit;
a detecting step, of detecting the marker from the first image; and
a marker position computing step, of computing the position of the marker with the object as a reference, using the position and orientation information of the object and image coordinate information regarding the detected marker.
2. The information processing method according to claim 1, wherein the object includes a first-person imaging unit;
and wherein the object position and orientation information acquisition step further comprises:
acquiring a second image taken with the first-person imaging unit;
extracting markers situated other than on the object from the second image; and
acquiring the position and orientation of the first-person imaging unit according to image coordinate information regarding the markers extracted from the second image and marker position information held beforehand.
3. The information processing method according to claim 1, wherein relative positions among a plurality of markers are known.
4. The information processing method according to claim 1, wherein the position of the marker in a mark coordinate system is known;
and wherein, in the marker position computing step, a conversion parameter for converting between the mark coordinate system and the coordinate system of the position and orientation information of the object is computed.
5. The information processing method according to claim 1, wherein the relative position and orientation between the object and the bird's-eye-view imaging unit are unknown parameters.
6. The information processing method according to claim 5, further comprising computing of the position and orientation of the bird's-eye-view imaging unit with respect to the object.
7. The information processing method according to claim 1, wherein the bird's-eye-view imaging unit is fixed.
8. The information processing method according to claim 1, wherein the mark position calculation step further comprises:
an estimation step of estimating image coordinate information of the mark in the image from an estimated value of the position of the mark relative to the object; and
a correction step of correcting the estimated value according to the error between the image coordinate information of the mark detected in said detection step and the estimated image coordinate information of the mark.
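The estimation and correction steps of claim 8 amount, in effect, to an iterative reprojection-error minimization. The sketch below is one possible reading rather than the patented implementation; the pinhole model, the fixed bird's-eye camera pose, the use of several frames, and the Gauss-Newton update with a numerical Jacobian are all assumptions made for illustration:

```python
import numpy as np

def project(K, R_cw, t_cw, p_w):
    """Pinhole projection of a world point into the bird's-eye camera."""
    p_c = R_cw @ p_w + t_cw                      # world -> camera
    uvw = K @ p_c
    return uvw[:2] / uvw[2]

def refine_mark_in_object(K, R_cw, t_cw, observations, p_init,
                          iters=10, eps=1e-6):
    """Estimate-then-correct loop for the mark's position in the object
    frame.  observations: list of (R_wo, t_wo, uv_detected) giving the
    object's pose and the detected image coordinate in each frame;
    K, R_cw, t_cw: intrinsics and fixed pose of the bird's-eye camera."""
    p = np.asarray(p_init, dtype=float).copy()
    for _ in range(iters):
        res, jac = [], []
        for R_wo, t_wo, uv in observations:
            def reproject(q):
                return project(K, R_cw, t_cw, R_wo @ q + t_wo)
            uv_est = reproject(p)                # estimation step
            res.append(np.asarray(uv) - uv_est)  # image-coordinate error
            J = np.zeros((2, 3))
            for k in range(3):                   # numerical Jacobian
                dq = np.zeros(3); dq[k] = eps
                J[:, k] = (reproject(p + dq) - uv_est) / eps
            jac.append(J)
        # correction step: least-squares update that reduces the error
        p += np.linalg.lstsq(np.vstack(jac), np.concatenate(res),
                             rcond=None)[0]
    return p
```

A single view constrains the mark only along its viewing ray; with two or more sufficiently different object poses the update becomes well conditioned.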
9. The information processing method according to claim 1, wherein the mark position calculation step calculates the orientation of the mark in addition to the position of the mark with the object as a reference.
10. The information processing method according to claim 9, wherein the mark position calculation step further comprises:
an estimation step of estimating image coordinate information of the mark in the image from estimated values of the position and orientation of the mark relative to the object; and
a correction step of correcting the estimated values according to the error between the image coordinate information of the mark detected in said detection step and the estimated image coordinate information of the mark.
11. An information processing method for obtaining placement information, relative to a first image pickup device, of a mark to be calibrated that is provided on the first image pickup device, the method comprising:
acquiring a first image captured by the first image pickup device;
acquiring a second image of the first image pickup device captured by a second image pickup device from a bird's-eye view position;
detecting, from the first image, image coordinate information of reference marks placed in a scene;
detecting, from the second image, image coordinate information of the mark to be calibrated; and
obtaining calibration information using the detected image coordinate information of the reference marks placed in the scene and the detected image coordinate information of the mark to be calibrated.
12. The information processing method according to claim 11, wherein the bird's-eye view position is a fixed position in the scene.
13. The information processing method according to claim 11, wherein the position and orientation of the first image pickup device are calculated using the image coordinate information of the reference marks; and
wherein the calibration information for the first image pickup device is calculated from the position and orientation information and the image coordinate information of the mark to be calibrated.
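The first half of claim 13, computing the first image pickup device's position and orientation from reference marks with known scene coordinates, is in effect a perspective-n-point problem. A minimal sketch assuming OpenCV is available for the solver and the first camera's intrinsic matrix K is known (the function name and data layout are illustrative assumptions):

```python
import numpy as np
import cv2  # assumption: OpenCV supplies the PnP solver

def first_camera_pose(ref_points_world, ref_points_image, K):
    """Position and orientation of the first image pickup device from
    reference marks placed in the scene; needs at least four
    non-collinear marks.

    ref_points_world -- (N, 3) known mark positions in the scene
    ref_points_image -- (N, 2) their detected image coordinates
    K                -- 3x3 intrinsic matrix of the first camera
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(ref_points_world, dtype=np.float64),
        np.asarray(ref_points_image, dtype=np.float64),
        np.asarray(K, dtype=np.float64), None)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R_cw, _ = cv2.Rodrigues(rvec)      # world -> camera rotation
    return R_cw, tvec.reshape(3)       # p_cam = R_cw @ p_world + tvec
```

The second half of the claim then fixes the placement of the mark to be calibrated from this pose together with the bird's-eye detection, typically by the iterative correction of claim 14, which has the same structure as the loop sketched under claim 8.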
14. The information processing method according to claim 13, wherein image coordinate information of the mark to be calibrated in the second image is estimated from estimated values of the position and orientation of the first image pickup device and of the calibration information; and
wherein the estimated value of the calibration information is corrected so as to reduce the error between the detected image coordinate information of the mark to be calibrated and the estimated image coordinate information of the mark to be calibrated.
15. The information processing method according to claim 11, wherein the calibration information gives position and orientation information, relative to the first image pickup device, of the mark to be calibrated provided on the first image pickup device;
wherein position and orientation information of the first image pickup device is calculated from the image coordinate information of the mark to be calibrated and a current estimated value of the calibration information; and
wherein the calibration information for the first image pickup device is calculated from the position and orientation information and the image coordinate information of the reference marks.
16. The information processing method according to claim 11, wherein the calibration information gives position and orientation information, relative to the first image pickup device, of the mark to be calibrated provided on the first image pickup device;
wherein image coordinate information of the reference marks in the first image is estimated from the image coordinate information of the mark to be calibrated and a current estimated value of the calibration information; and
wherein the estimated value of the calibration information is corrected so as to reduce the error between the detected image coordinate information of the reference marks and the estimated image coordinate information of the reference marks.
17. The information processing method according to claim 11, wherein the image coordinate information of the mark to be calibrated is the image coordinates of at least one point determined by the mark to be calibrated.
18. An information processing apparatus for calculating the position, relative to an object, of a mark provided on the object, the apparatus comprising:
an object position and orientation acquisition unit for acquiring position and orientation information of the object in a predetermined coordinate system;
an image input unit for inputting an image of the object captured from a bird's-eye view;
a detection unit for detecting the mark from the image; and
a mark position calculation unit for calculating the position of the mark, with the object as a reference, using the position and orientation of the object and image coordinate information of the detected mark.
19. The information processing apparatus according to claim 18, wherein the mark position calculation unit calculates the orientation of the mark in addition to the position of the mark with the object as a reference.
20. An information processing apparatus for obtaining placement information, relative to a first image pickup device, of a mark to be calibrated that is provided on the first image pickup device, the apparatus comprising:
a first image acquisition unit for acquiring a first image captured by the first image pickup device;
a second image acquisition unit for acquiring a second image of the first image pickup device captured by a second image pickup device from a bird's-eye view position;
a first detection unit for detecting, from the first image, image coordinate information of reference marks placed in a scene;
a second detection unit for detecting, from the second image, image coordinate information of the mark to be calibrated; and
a calculation unit for obtaining calibration information using the detected image coordinate information of the reference marks placed in the scene and the detected image coordinate information of the mark to be calibrated.
CNB2005100680684A 2004-05-14 2005-05-16 Information processing method and device therefor Active CN100430686C (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP144784/2004 2004-05-14
JP144895/2004 2004-05-14
JP2004144895 2004-05-14
JP320637/2004 2004-11-04
JP065356/2005 2005-03-09

Publications (2)

Publication Number Publication Date
CN1865841A (en) 2006-11-22
CN100430686C (en) 2008-11-05

Family

ID=37424935

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2005100680684A Active CN100430686C (en) 2004-05-14 2005-05-16 Information processing method and device therefor

Country Status (1)

Country Link
CN (1) CN100430686C (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4960754B2 (en) * 2007-04-25 2012-06-27 キヤノン株式会社 Information processing apparatus and information processing method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064749A (en) * 1996-08-02 2000-05-16 Hirota; Gentaro Hybrid tracking for augmented reality using both camera motion detection and landmark tracking
US5978521A (en) * 1997-09-25 1999-11-02 Cognex Corporation Machine vision methods using feedback to determine calibration locations of multiple cameras that image a common object
WO2001071662A2 (en) * 2000-03-20 2001-09-27 Cognitens, Ltd. System and method for globally aligning individual views based on non-accurate repeatable positioning devices
CN1275022A (en) * 2000-06-23 2000-11-29 成都索贝数码科技股份有限公司 Detection method and system for inner and outer video camera parameters of virtual studio
TW580580B (en) * 2002-05-28 2004-03-21 Chung Shan Inst Of Science Method and apparatus for determining the spatial position and angular orientation of an object

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Comport, Andrew I., et al., "A Real-Time Tracker for Markerless Augmented Reality," Proceedings of the Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. *
Satoh, Kiyohide, Shinji Uchiyama, et al., "Robust Vision-Based Registration Utilizing Bird's-Eye View with User's View," Proceedings of the Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102265112A (en) * 2009-09-04 2011-11-30 松下电器产业株式会社 Device for collecting position calibration information, method for collecting position calibration information, and program for collecting position calibration information
CN102265112B (en) * 2009-09-04 2014-07-02 松下电器产业株式会社 Device for collecting position calibration information and method for collecting position calibration information

Also Published As

Publication number Publication date
CN1865841A (en) 2006-11-22

Similar Documents

Publication Publication Date Title
EP1596333B1 (en) Calibration for mixed reality
CN1847789B (en) Method and apparatus for measuring position and orientation
JP4976756B2 (en) Information processing method and apparatus
JP4914038B2 (en) Information processing method and apparatus
JP4914039B2 (en) Information processing method and apparatus
CN100489833C (en) Position posture measuring method, position posture measuring device
US9378559B2 (en) System and method for motion estimation
CN107990899A (en) A kind of localization method and system based on SLAM
US7630555B2 (en) Position and orientation measuring method and apparatus
CN109522935A (en) The method that the calibration result of a kind of pair of two CCD camera measure system is evaluated
CN103134425B (en) Information processor and information processing method
CN107481276A (en) The automatic identifying method of mark point sequence in a kind of 3 d medical images
JP4956456B2 (en) Image processing apparatus and image processing method
CN105659107B (en) For determining the method and ultrasonic equipment of the posture of object
CN110065075A (en) A kind of spatial cell robot external status cognitive method of view-based access control model
JP2005326275A (en) Information processing method and device
CN105631161B (en) A kind of determination method and apparatus that actual situation model is overlapped
JP4533193B2 (en) Information processing apparatus and information processing method
CN107329593A (en) A kind of VR handles localization method and device
CN106461414B (en) A kind of the posture relationship calculation method and smart machine of smart machine
CN100430686C (en) Information processing method and device therefor
JP4566786B2 (en) Position and orientation measurement method and information processing apparatus
JP4533090B2 (en) Index calibration apparatus and information processing method
JP5726024B2 (en) Information processing method and apparatus
CN108088426A (en) One kind shoots with video-corder observed object locating measurement method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant